WorldWideScience

Sample records for distributed mammogram analysis

  1. MAMMOGRAMS ANALYSIS USING SVM CLASSIFIER IN COMBINED TRANSFORMS DOMAIN

    Directory of Open Access Journals (Sweden)

    B.N. Prathibha

    2011-02-01

    Breast cancer is a primary cause of mortality and morbidity in women. Reports reveal that the earlier abnormalities are detected, the better the chances of survival. Digital mammograms are one of the most effective means for detecting possible breast anomalies at early stages. Digital mammograms supported with Computer Aided Diagnostic (CAD) systems help radiologists make reliable decisions. The proposed CAD system extracts wavelet features and spectral features for the better classification of mammograms. A Support Vector Machine (SVM) classifier is used to analyze 206 mammogram images from the MIAS database with respect to the severity of abnormality, i.e., benign and malignant. The proposed system gives 93.14% accuracy for discrimination between normal and malignant, 87.25% accuracy for normal and benign, and 89.22% accuracy for benign and malignant samples. The study reveals that features extracted in a hybrid transform domain combined with an SVM classifier form a promising tool for the analysis of mammograms.
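
    As a rough illustration of the kind of pipeline described above (transform-domain texture features feeding an SVM), the following Python sketch extracts simple wavelet-energy and spectral features and cross-validates an SVM. The inputs `rois` (2-D mammogram patches of at least roughly 64 x 64 pixels) and `labels` (benign = 0, malignant = 1) are assumed to exist; the feature definitions and SVM settings are illustrative stand-ins, not the authors' exact combined-transform features.

```python
# Minimal sketch: wavelet-energy + spectral features with an SVM classifier,
# in the spirit of the abstract above. `rois` and `labels` are assumed inputs.
import numpy as np
import pywt                                        # PyWavelets
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def wavelet_energy_features(roi, wavelet="db4", level=2):
    """Energy of each wavelet sub-band, a common texture descriptor."""
    coeffs = pywt.wavedec2(roi.astype(float), wavelet, level=level)
    feats = [np.mean(coeffs[0] ** 2)]              # approximation energy
    for detail in coeffs[1:]:                      # (cH, cV, cD) per level
        feats.extend(np.mean(d ** 2) for d in detail)
    return np.array(feats)

def spectral_features(roi):
    """Crude spectral descriptor: log-power in a few radial frequency bands."""
    spec = np.abs(np.fft.fftshift(np.fft.fft2(roi.astype(float)))) ** 2
    h, w = spec.shape
    yy, xx = np.indices(spec.shape)
    r = np.hypot(yy - h / 2, xx - w / 2)
    return np.array([np.log1p(spec[(r >= lo) & (r < hi)].mean())
                     for lo, hi in [(0, 8), (8, 24), (24, 64)]])

X = np.array([np.concatenate([wavelet_energy_features(r), spectral_features(r)])
              for r in rois])
y = np.array(labels)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0))
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```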

  2. Application of texture analysis method for mammogram density classification

    Science.gov (United States)

    Nithya, R.; Santhi, B.

    2017-07-01

    Mammographic density is considered a major risk factor for developing breast cancer. This paper proposes an automated approach to classify breast tissue types in digital mammograms. The main objective of the proposed Computer-Aided Diagnosis (CAD) system is to investigate various feature extraction methods and classifiers to improve the diagnostic accuracy of mammogram density classification. Texture analysis methods are used to extract the features from the mammogram. Texture features are extracted by using the histogram, Gray Level Co-Occurrence Matrix (GLCM), Gray Level Run Length Matrix (GLRLM), Gray Level Difference Matrix (GLDM), Local Binary Pattern (LBP), entropy, Discrete Wavelet Transform (DWT), Wavelet Packet Transform (WPT), Gabor transform and trace transform. These extracted features are selected using Analysis of Variance (ANOVA). The features selected by ANOVA are fed into the classifiers to characterize the mammogram into two-class (fatty/dense) and three-class (fatty/glandular/dense) breast density categories. This work has been carried out using the mini-Mammographic Image Analysis Society (MIAS) database. Five classifiers are employed, namely, Artificial Neural Network (ANN), Linear Discriminant Analysis (LDA), Naive Bayes (NB), K-Nearest Neighbor (KNN), and Support Vector Machine (SVM). Experimental results show that ANN provides better performance than the LDA, NB, KNN and SVM classifiers. The proposed methodology achieved 97.5% accuracy for three-class and 99.37% for two-class density classification.
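
    One branch of the pipeline above (GLCM texture features, ANOVA feature selection, ANN classification) could look roughly like the sketch below. `images` (8-bit breast regions) and `density_class` (fatty/glandular/dense labels) are assumed inputs; only one of the ten feature families listed in the abstract is shown, and the network size is illustrative.

```python
# Sketch of one branch of the pipeline described above: GLCM texture features,
# ANOVA-based feature selection, and an ANN (MLP) classifier.
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

def glcm_features(img, distances=(1, 3), angles=(0, np.pi / 4, np.pi / 2)):
    glcm = graycomatrix(img, distances=distances, angles=angles,
                        levels=256, symmetric=True, normed=True)
    props = ("contrast", "homogeneity", "energy", "correlation")
    return np.concatenate([graycoprops(glcm, p).ravel() for p in props])

X = np.array([glcm_features(im) for im in images])
y = np.array(density_class)                    # e.g. fatty / glandular / dense

model = make_pipeline(StandardScaler(),
                      SelectKBest(f_classif, k=10),   # ANOVA feature selection
                      MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000))
print("CV accuracy:", cross_val_score(model, X, y, cv=5).mean())
```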

  3. Anatomic breast coordinate system for mammogram analysis

    DEFF Research Database (Denmark)

    Karemore, Gopal; Brandt, S.; Karssemeijer, N.

    2011-01-01

    Purpose: Many researchers have investigated measures other than density in the mammogram, such as measures based on texture, to improve breast cancer risk assessment. However, parenchymal texture characteristics are highly dependent on the orientation of vasculature structure and fibrous tissue ... Breast tissue position was represented by geodesic distance (s) from the nipple and a parametric angle, as shown in figure 1. The scoring technique called MTR (mammographic texture resemblance marker) used this breast coordinate system to extract Gaussian derivative features. The features extracted using the (x,y) and the curvilinear coordinate ... methodologies, as seen from table 2 in the given temporal study. Conclusion: The curvilinear anatomical breast coordinate system facilitated computerized analysis of mammograms. The proposed coordinate system slightly improved the risk segregation by mammographic texture resemblance and minimized the geometrical ...

  4. Use of prior mammograms in the transition to digital mammography: A performance and cost analysis

    International Nuclear Information System (INIS)

    Taylor-Phillips, S.; Wallis, M.G.; Duncan, A.; Gale, A.G.

    2012-01-01

    Breast screening in Europe is gradually changing from film to digital imaging and reporting of cases. In the transition period, prior mammograms (from the preceding screening round) are films, which potentially causes difficulties when comparing them with current digital mammograms. To examine this, breast screening performance was measured at a digital mammography workstation with prior mammograms displayed in different formats, and the associated costs were calculated. 160 selected difficult cases (41% malignant) were read by eight UK qualified mammography readers in three conditions: with film prior mammograms; with digitised prior mammograms; or without prior mammograms. Lesion location and probability of malignancy were recorded, alongside a decision of whether to recall each case for further tests. JAFROC analysis showed a difference between conditions (p = .006); performance with prior mammograms in either film or digitised format was superior to that without prior mammograms (p < .05). There was no difference in performance when the prior mammograms were presented in film or digitised form. The number of benign or normal cases recalled was 26% higher without prior mammograms than with digitised or film prior mammograms (p < .05). This would correspond to an increase in recall rate at the study hospital from 4.3% to 5.5% with no associated increase in cancer detection rate. The cost of this increase was estimated to be £11,581 (€13,666) per 10,000 women screened, which is higher than the cost of digitised (£11,114/€13,115) or film display (£6451/€7612) of the prior mammograms. It is recommended that, where available, prior mammograms be used in the transition to digital breast screening.

  5. Computerized Analysis and Detection of Missed Cancer in Screening Mammogram

    National Research Council Canada - National Science Library

    Li, Lihua

    2005-01-01

    The aim of this project is to explore an innovative CAD strategy for improving early detection of breast cancer in screening mammograms by focusing on computerized analysis and detection of cancers missed by radiologists...

  6. Computerized Analysis and Detection of Missed Cancer in Screening Mammogram

    National Research Council Canada - National Science Library

    Li, Lihua

    2004-01-01

    The aim of this project is to explore an innovative CAD strategy for improving early detection of breast cancer in screening mammograms by focusing on computerized analysis and detection of cancers missed by radiologists...

  7. TU-F-18C-09: Mammogram Surveillance Using Texture Analysis for Breast Cancer Patients After Radiation Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Kuo, H; Tome, W; Fox, J; Hong, L; Yaparpalvi, R; Mehta, K; Bodner, W; Kalnicki, S [Montefiore Medical Center/Albert Einstein College of Medicine, Bronx, New York (United States); Huang, Y [Memorial Sloan-Kettering Cancer Center, Great Neck, NY (United States)

    2014-06-15

    Purpose: To study the feasibility of applying a cancer risk model established from treated patients to predict the risk of recurrence on follow-up mammography after radiation therapy, for both the ipsilateral and the contralateral breast. Methods: An extensive set of textural feature functions was applied to a set of 196 mammograms from 50 patients. 56 mammograms from 28 patients were used as the training set, 44 mammograms from 22 patients were used as the test set, and the rest were used for prediction. Feature functions include Histogram, Gradient, Co-Occurrence Matrix, Run-Length Matrix and Wavelet Energy. An optimum subset of the feature functions was selected by the Fisher coefficient (FO) or mutual information (MI) (up to the top 10 features), or by a method combining FO, MI and principal components (FMP) (up to the top 30 features). One-Nearest Neighbor (1-NN), Linear Discriminant Analysis (LDA) and Nonlinear Discriminant Analysis (NDA) were utilized to build a risk model of breast cancer from the training set of mammograms at the time of diagnosis. The risk model was then used to predict the risk of recurrence from mammograms taken one year and three years after RT. Results: FMP with NDA had the best classification power in separating the training mammograms with lesions from those without lesions. The FMP with NDA model achieved a true positive (TP) rate of 82%, compared to 45.5% when using FO with 1-NN. The best false positive (FP) rates were 0% and 3.6% in the contralateral breast at 1 and 3 years after RT, and 10.9% in the ipsilateral breast at 3 years after RT. Conclusion: Texture analysis offers a high-dimensional feature space for differentiating breast tissue in mammograms. Using NDA to separate mammograms with lesions from those without can achieve a rather high TP rate and low FP rate in mammographic surveillance of patients treated with conservative surgery combined with RT.
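
    The feature-ranking step described above can be approximated as follows, assuming a precomputed texture feature matrix `X` (one row per mammogram ROI) and lesion labels `y`. Mutual information ranks the features and two of the listed classifiers (1-NN and LDA) are compared; the paper's nonlinear discriminant analysis has no direct scikit-learn counterpart and is omitted here.

```python
# Sketch of mutual-information feature ranking followed by 1-NN and LDA,
# under the assumption that `X` (ROIs x texture features) and `y` exist.
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.neighbors import KNeighborsClassifier
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

mi = mutual_info_classif(X, y, random_state=0)
top10 = np.argsort(mi)[::-1][:10]              # keep the 10 most informative
X_sel = X[:, top10]

for name, clf in [("1-NN", KNeighborsClassifier(n_neighbors=1)),
                  ("LDA", LinearDiscriminantAnalysis())]:
    acc = cross_val_score(clf, X_sel, y, cv=5).mean()
    print(f"{name}: cross-validated accuracy = {acc:.3f}")
```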

  8. Field nonuniformity correction for quantitative analysis of digitized mammograms

    International Nuclear Information System (INIS)

    Pawluczyk, Olga; Yaffe, Martin J.

    2001-01-01

    Several factors, including the heel effect, variation in distance from the x-ray source to points in the image and path obliquity contribute to the signal nonuniformity of mammograms. To best use digitized mammograms for quantitative image analysis, these field nonuniformities must be corrected. An empirically based correction method, which uses a bowl-shaped calibration phantom, has been developed. Due to the annular spherical shape of the phantom, its attenuation is constant over the entire image. Remaining nonuniformities are due only to the heel and inverse square effects as well as the variable path through the beam filter, compression plate and image receptor. In logarithmic space, a normalized image of the phantom can be added to mammograms to correct for these effects. Then, an analytical correction for path obliquity in the breast can be applied to the images. It was found that the correction causes the errors associated with field nonuniformity to be reduced from 14% to 2% for a 4 cm block of material corresponding to a combination of 50% fibroglandular and 50% fatty breast tissue. A repeatability study has been conducted to show that in regions as far as 20 cm away from the chest wall, variations due to imaging conditions and phantom alignment contribute to <2% of overall corrected signal.
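
    The core of the correction, additive removal of field nonuniformity in logarithmic space using the phantom image, can be sketched numerically as below. `mammo` and `phantom` are assumed to be linear-exposure images acquired under matched conditions; the subsequent analytical path-obliquity correction is not included.

```python
# Minimal numerical sketch of a log-space field-nonuniformity correction:
# multiplicative effects (heel effect, inverse-square falloff, filter path)
# become additive in log space and can be removed with a calibration image
# of a uniform-attenuation phantom. `mammo` and `phantom` are assumed inputs.
import numpy as np

def logspace_flatfield(mammo, phantom, eps=1e-6):
    log_m = np.log(mammo + eps)
    log_p = np.log(phantom + eps)
    # Normalized phantom image: deviation of each pixel from a reference
    # value (here, the image mean) captures only the field nonuniformity.
    correction = log_p.mean() - log_p
    return log_m + correction                  # corrected log-signal

# corrected = logspace_flatfield(mammogram_array, phantom_array)
```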

  9. Mammogram CAD, hybrid registration and iconic analysis

    Science.gov (United States)

    Boucher, A.; Cloppet, F.; Vincent, N.

    2013-03-01

    This paper aims to develop a computer aided diagnosis (CAD) system based on a two-step methodology to register and analyze pairs of temporal mammograms. The concept of a "medical file", including all the previous medical information on a patient, enables joint analysis of different acquisitions taken at different times, and the detection of significant modifications. The developed registration method aims to superimpose the different anatomical structures of the breast as accurately as possible. The registration is designed to remove deformations introduced by the acquisition process while preserving those due to breast changes indicative of malignancy. To reach this goal, a reference image is computed from control points based on anatomical features that are extracted automatically. The second image of the pair is then realigned to the reference image, using a coarse-to-fine approach guided by expert knowledge that allows both rigid and non-rigid transforms. The joint analysis detects the evolution between two images representing the same scene. To achieve this, it is important to know the registration error limits in order to adapt the observation scale. The approach used in this paper is based on a sparse image representation: decomposed into regular patterns, the images are analyzed from a new angle. The evolution detection problem has many practical applications, especially in medical imaging. The CAD is evaluated using the recall and precision of differences in mammograms.

  10. Evaluation of mammogram compression efficiency

    International Nuclear Information System (INIS)

    Przelaskowski, A.; Surowski, P.; Kukula, A.

    2005-01-01

    Lossy image coding significantly improves performance over lossless methods, but a reliable control of diagnostic accuracy regarding compressed images is necessary. The acceptable range of compression ratios must be safe with respect to as many objective criteria as possible. This study evaluates the compression efficiency of digital mammograms in both a numerically lossless (reversible) and a lossy (irreversible) manner. Effective compression methods and concepts were examined to increase archiving and telediagnosis performance. Lossless compression, as the primary applicable tool for medical applications, was verified on a set of 131 mammograms. Moreover, nine radiologists participated in the evaluation of lossy compression of mammograms. Subjective rating of diagnostically important features produced a set of mean rates given for each test image. The lesion detection test resulted in binary decision data analyzed statistically. The radiologists rated and interpreted malignant and benign lesions, representative pathology symptoms, and other structures susceptible to compression distortions contained in 22 original and 62 reconstructed mammograms. Test mammograms were collected in two radiology centers over three years and then selected according to diagnostic content suitable for an evaluation of compression effects. Lossless compression efficiency of the tested coders varied, but CALIC, JPEG-LS, and SPIHT performed the best. The evaluation of lossy compression effects affecting detection ability was based on ROC-like analysis. Assuming a two-sided significance level of p=0.05, the null hypothesis that lower bit rate reconstructions are as useful for diagnosis as the originals was rejected in sensitivity tests with 0.04 bpp mammograms. However, verification of the same hypothesis with 0.1 bpp reconstructions suggested their acceptance. Moreover, the 1 bpp reconstructions were rated very similarly to the original mammograms in the diagnostic quality evaluation test, but the

  11. Detection of microcalcifications in television-enhanced nonmagnified screen-film mammograms compared with matching magnification unenhanced mammograms

    International Nuclear Information System (INIS)

    Kimme-Smith, C.; Gormley, L.S.; Gold, R.H.; Bassett, L.W.

    1988-01-01

    The object of this investigation was to determine which imaging method was associated with greater accuracy in the interpretation of breast microcalcifications: 1.5X to 2.0X magnification with a microfocal spot or Damon DETECT-enhanced mammograms. The authors' test series consisted of matched pairs of images of 31 breasts, each containing a cluster of microcalcifications within a biopsy-proved benign (N = 21) or malignant (N = 10) lesion. Three experienced mammographers and three senior radiology residents with 2 weeks of training in mammography interpreted the calcifications. On the basis of receiver operating characteristic analysis, the authors conclude that (1) inexperienced mammographers should not use television image enhancement alone to evaluate microcalcifications and (2) television-enhanced mammograms are not a substitute for microfocal spot magnification mammograms

  12. An anatomically oriented breast coordinate system for mammogram analysis

    DEFF Research Database (Denmark)

    Brandt, Sami; Karemore, Gopal; Karssemeijer, Nico

    2011-01-01

    ... and the shape of the breast boundary, because these are the most robust features, independent of breast size and shape. On the basis of these landmarks, we have constructed a nonlinear mapping between the parameter frame and the breast region in the mammogram. This mapping makes it possible to identify the corresponding positions and orientations among all of the ML or MLO mammograms, which facilitates an implicit use of registration, i.e., no explicit image warping is needed. We additionally show how the coordinate transform can be used to extract Gaussian derivative features so that the feature positions and orientations are registered and extracted without non-linearly deforming the images. We use the proposed breast coordinate transform in a cross-sectional breast cancer risk assessment study of 490 women, in which we attempt to learn breast cancer risk factors from mammograms that were taken prior to ...

  13. Detection of architectural distortion in prior screening mammograms using Gabor filters, phase portraits, fractal dimension, and texture analysis

    International Nuclear Information System (INIS)

    Rangayyan, Rangaraj M.; Prajna, Shormistha; Ayres, Fabio J.; Desautels, J.E.L.

    2008-01-01

    Mammography is a widely used screening tool for the early detection of breast cancer. One of the commonly missed signs of breast cancer is architectural distortion. The purpose of this study is to explore the application of fractal analysis and texture measures for the detection of architectural distortion in screening mammograms taken prior to the detection of breast cancer. A method based on Gabor filters and phase portrait analysis was used to detect initial candidates for sites of architectural distortion. A total of 386 regions of interest (ROIs) were automatically obtained from 14 "prior mammograms", including 21 ROIs related to architectural distortion. From the corresponding set of 14 "detection mammograms", 398 ROIs were obtained, including 18 related to breast cancer. For each ROI, the fractal dimension and Haralick's texture features were computed. The fractal dimension of the ROIs was calculated using the circular average power spectrum technique. The average fractal dimension of the normal (false-positive) ROIs was significantly higher than that of the ROIs with architectural distortion (p = 0.006). For the "prior mammograms", the best receiver operating characteristics (ROC) performance achieved, in terms of the area under the ROC curve, was 0.80 with a Bayesian classifier using four features including fractal dimension, entropy, sum entropy, and inverse difference moment. Analysis of the performance of the methods with free-response receiver operating characteristics indicated a sensitivity of 0.79 at 8.4 false positives per image in the detection of sites of architectural distortion in the "prior mammograms". Fractal dimension offers a promising way to detect the presence of architectural distortion in prior mammograms. (orig.)
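
    The fractal-dimension estimate via the circular average power spectrum can be sketched as follows, under the usual fractional-Brownian-surface assumption that the radially averaged spectrum falls off as a power law with exponent beta and FD = (8 - beta)/2 for a 2-D image. `roi` is an assumed grayscale patch; this is a generic implementation, not the authors' code.

```python
# Fractal dimension from the radially (circularly) averaged power spectrum.
import numpy as np

def fractal_dimension_psd(roi):
    f = np.fft.fftshift(np.fft.fft2(roi - roi.mean()))
    psd = np.abs(f) ** 2
    h, w = psd.shape
    yy, xx = np.indices(psd.shape)
    r = np.hypot(yy - h // 2, xx - w // 2).astype(int)
    # Circular average of the power spectrum for each integer radius.
    radial = np.bincount(r.ravel(), weights=psd.ravel()) / np.bincount(r.ravel())
    freqs = np.arange(1, min(h, w) // 2)       # skip DC, stay below Nyquist
    slope, _ = np.polyfit(np.log(freqs), np.log(radial[freqs]), 1)
    beta = -slope                              # P(f) ~ f**(-beta)
    return (8.0 - beta) / 2.0                  # FD of a 2-D fBm-like surface
```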

  14. Improved Screening Mammogram Workflow by Maximizing PACS Streamlining Capabilities in an Academic Breast Center.

    Science.gov (United States)

    Pham, Ramya; Forsberg, Daniel; Plecha, Donna

    2017-04-01

    The aim of this study was to perform an operational improvement project targeted at the breast imaging reading workflow of mammography examinations at an academic medical center with its associated breast centers and satellite sites. Through careful analysis of the current workflow, two major issues were identified: stockpiling of paperwork and multiple worklists. Both issues were considered to cause significant delays to the start of interpreting screening mammograms. Four workflow changes were suggested (scanning of paperwork, worklist consolidation, use of chat functionality, and tracking of case distribution among trainees) and implemented in July 2015. Timestamp data was collected 2 months before (May-Jun) and after (Aug-Sep) the implemented changes. Generalized linear models were used to analyze the data. The results showed significant improvements for the interpretation of screening mammograms: the average time elapsed to open a case was reduced from 70 to 28 min (a 60 % decrease), while the workflow for diagnostic mammograms remained by and large unaltered even with an increased volume of mammography examinations (a 31 % increase, from 4344 examinations in May-Jun to 5678 examinations in Aug-Sep). In conclusion, targeted efforts to improve the breast imaging reading workflow for screening mammograms in a teaching environment provided significant performance improvements without affecting the workflow of diagnostic mammograms.

  15. Use of prior mammograms in the classification of benign and malignant masses

    International Nuclear Information System (INIS)

    Varela, Celia; Karssemeijer, Nico; Hendriks, Jan H.C.L.; Holland, Roland

    2005-01-01

    The purpose of this study was to determine the importance of using prior mammograms for classification of benign and malignant masses. Five radiologists and one resident classified mass lesions in 198 mammograms obtained from a population-based screening program. Cases were interpreted twice, once without and once with comparison of previous mammograms, in a sequential reading order using soft copy image display. The radiologists' performances in classifying benign and malignant masses without and with previous mammograms were evaluated with receiver operating characteristic (ROC) analysis. The statistical significance of the difference in performances was calculated using analysis of variance. The use of prior mammograms improved the classification performance of all participants in the study. The mean area under the ROC curve of the readers increased from 0.763 to 0.796. This difference in performance was statistically significant (P = 0.008)

  16. Normal mammogram detection based on local probability difference transforms and support vector machines

    International Nuclear Information System (INIS)

    Chiracharit, W.; Kumhom, P.; Chamnongthai, K.; Sun, Y.; Delp, E.J.; Babbs, C.F

    2007-01-01

    Automatic detection of normal mammograms, as a "first look" for breast cancer, is a new approach to computer-aided diagnosis. This approach may be limited, however, by two main causes. The first problem is the presence of poorly separable "crossed-distributions" in which the correct classification depends upon the value of each feature. The second problem is overlap of the feature distributions that are extracted from digitized mammograms of normal and abnormal patients. Here we introduce a new Support Vector Machine (SVM) based method utilizing the proposed uncrossing mapping and Local Probability Difference (LPD). Crossed-distribution feature pairs are identified and mapped into new features that can be separated by a zero-hyperplane on the new axis. The probability density functions of the features of normal and abnormal mammograms are then sampled, and the local probability difference functions are estimated to enhance the features. From 1,000 ground-truth-known mammograms, 250 normal and 250 abnormal cases, including spiculated lesions, circumscribed masses or microcalcifications, are used for training a support vector machine. The classification results tested with another 250 normal and 250 abnormal sets show improved testing performance with 90% sensitivity and 89% specificity. (author)

  17. Analysis and comparison of breast density according to age on mammogram between Korean and Western women

    International Nuclear Information System (INIS)

    Kim, Seung Hyung; Kim, Mi Hye; Oh, Ki Keun

    2000-01-01

    To compare changes in breast parenchymal density among diverse age groups in asymptomatic Korean women with those of Western women, and to evaluate the effect of different patterns of breast parenchymal density on the sensitivity of screening mammography in Korean women. We analyzed the distribution of breast parenchymal density among diverse age groups in 823 asymptomatic Korean women aged 30-64 who underwent screening mammography between January and December 1998. On the basis of ACR BI-RADS breast composition, four density patterns were designated: patterns 1 and 2 corresponded to fatty mammograms, and patterns 3 and 4 to dense mammograms. We compared the results with those for Western women. In Korean women, the frequency of dense mammograms was 88.1% (30-34 years old), 91.1% (35-39), 78.3% (40-44), 61.1% (45-49), 30.1% (50-54), 21.1% (55-59), and 7.0% (60-64). Korean women in their 40s thus showed a higher frequency of dense mammograms, but this frequency decreased abruptly between the ages of 40 and 54. In Western women, however, there was little difference between 40- and 54-year-olds: the figures were 47.2% (40-44 years), 44.8% (45-49), and 44.4% (50-54). Because the frequency of dense mammograms changes little between Western women in their forties and those in their fifties, mammographic sensitivity differs only slightly between these two age groups. Because the frequency of dense mammograms is much greater among Korean women in their forties than among Western women of the same age, and decreases abruptly thereafter, the mammographic sensitivity in Korean women appears to be lower in their forties than in their fifties. It is therefore thought that mammography combined with ultrasonography may increase screening sensitivity among Korean women under 50, who have a relatively higher incidence of breast cancer in the younger age groups than do Western women. (author)

  18. Detecting mammographically occult cancer in women with dense breasts using Radon Cumulative Distribution Transform: a preliminary analysis

    Science.gov (United States)

    Lee, Juhun; Nishikawa, Robert M.; Rohde, Gustavo K.

    2018-02-01

    We propose using novel imaging biomarkers for detecting mammographically-occult (MO) cancer in women with dense breast tissue. MO cancer indicates visually occluded, or very subtle, cancer that radiologists fail to recognize as a sign of cancer. We used the Radon Cumulative Distribution Transform (RCDT) as a novel image transformation to project the difference between left and right mammograms into a space that increases the detectability of occult cancer. We used a dataset of 617 screening full-field digital mammograms (FFDMs) of 238 women with dense breast tissue. Among the 238 women, 173 were normal with 2-4 consecutive screening mammograms, 552 normal mammograms in total, and the remaining 65 women had an MO cancer with a negative screening mammogram. We used Principal Component Analysis (PCA) to find representative patterns in normal mammograms in the RCDT space. We projected all mammograms onto the space constructed by the first 30 eigenvectors of the RCDT of normal cases. Under 10-fold cross-validation, we conducted quantitative feature analysis to classify normal mammograms and mammograms with MO cancer. We used receiver operating characteristic (ROC) analysis to evaluate the classifier's output using the area under the ROC curve (AUC) as the figure of merit. Four eigenvectors were selected via a feature selection method. The mean and standard deviation of the AUC of the trained classifier on the test set were 0.74 and 0.08, respectively. In conclusion, we utilized imaging biomarkers to highlight differences between left and right mammograms to detect MO cancer using a novel imaging transformation.
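
    The PCA-of-normals detection step can be sketched as below, assuming the RCDT-space left-right difference images have already been computed and flattened into row vectors (`rcdt_normal` for the normal cases, `rcdt_all` and labels `y` for the full labelled set). Logistic regression is a stand-in for the classifier, which is not named in the abstract.

```python
# Fit PCA on normal cases only, project every case, and cross-validate a
# classifier with AUC as the figure of merit. Inputs are assumed to exist.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

pca = PCA(n_components=30).fit(rcdt_normal)    # eigenvectors of normals
X = pca.transform(rcdt_all)                    # project every case

clf = LogisticRegression(max_iter=1000)        # stand-in classifier
auc = cross_val_score(clf, X, y, cv=10, scoring="roc_auc")
print("AUC: %.2f +/- %.2f" % (auc.mean(), auc.std()))
```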

  19. Clustering microcalcifications techniques in digital mammograms

    Science.gov (United States)

    Díaz, Claudia. C.; Bosco, Paolo; Cerello, Piergiorgio

    2008-11-01

    Breast cancer has become a serious public health problem around the world. However, this pathology can be treated effectively if it is detected at an early stage. This task falls to radiologists, who must read a large number of mammograms per day for screening or diagnostic purposes; human factors can therefore affect the diagnosis. Computer Aided Detection is an automatic approach that can help specialists detect possible signs of malignancy in mammograms. Microcalcifications play an important role in early detection, so we focused on their study. The two mammographic features indicating that microcalcifications are probably malignant are small size and clustered distribution. We worked with density-based techniques for automatic clustering and applied them to a mammography CAD prototype developed at INFN-Turin, Italy. An improvement in performance is achieved when analyzing images from the Perugia-Assisi Hospital in Italy.
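
    The grouping of individual microcalcification detections into clusters by a density criterion (small mutual distances, minimum cluster size) can be illustrated with DBSCAN. This is a generic sketch, not the INFN prototype's code; `candidates` is an assumed array of detection coordinates in millimetres.

```python
# Illustrative density-based grouping of microcalcification detections.
import numpy as np
from sklearn.cluster import DBSCAN

def cluster_microcalcifications(candidates, max_gap_mm=5.0, min_size=3):
    """A cluster = at least `min_size` detections within `max_gap_mm` reach."""
    labels = DBSCAN(eps=max_gap_mm, min_samples=min_size).fit_predict(candidates)
    return [candidates[labels == k] for k in set(labels) if k != -1]

# clusters = cluster_microcalcifications(candidates)
# print(len(clusters), "suspicious clusters found")
```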

  20. Should previous mammograms be digitised in the transition to digital mammography?

    International Nuclear Information System (INIS)

    Taylor-Phillips, S.; Gale, A.G.; Wallis, M.G.

    2009-01-01

    Breast screening specificity is improved if previous mammograms are available, which presents a challenge when converting to digital mammography. Two display options were investigated: mounting previous film mammograms on a multiviewer adjacent to the workstation, or digitising them for soft copy display. Eight qualified screen readers were videotaped undertaking routine screen reading for two 45-min sessions in each scenario. Analysis of gross eye and head movements showed that when digitised, previous mammograms were examined a greater number of times per case (p=0.03), due to a combination of being used in 19% more cases (p=0.04) and where used, looked at a greater number of times (28% increase, p=0.04). Digitising previous mammograms reduced both the average time taken per case by 18% (p=0.04) and the participants' perceptions of workload (p < 0.05). Digitising previous analogue mammograms may be advantageous, in particular in increasing their level of use. (orig.)

  1. Computerized classification of mass lesions in digital mammograms

    International Nuclear Information System (INIS)

    Giger, M.L.; Doi, K.; Yin, F.F.; Schmidt, R.A.; Vyborny, C.J.

    1989-01-01

    Subjective classification of masses on mammograms is a difficult task. On average, about 25% of masses referred for surgical biopsy are actually malignant. The authors are developing, as an aid to radiologists, a computerized scheme for the classification of lesions in mammograms to reduce the false-negative and false-positive diagnoses of malignancies. The classification scheme involves the extraction of border information from the mammographic lesion in order to quantify the degree of spiculation, which is related to the possibility of malignancy. Clinical film mammograms are digitized with an optical drum scanner (0.1-mm pixel size) for analysis on a Micro VAX 3500 computer. Border information (fluctuations) is obtained from the difference between the lesion border and its smoothed border. Using the rms variation of the frequency content of these fluctuations, approximately 85% of the cancerous lesions were correctly classified as malignant, while 15% of benign lesions were misclassified, in a preliminary study.
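
    A simplified variant of the border-fluctuation idea can be sketched as follows: the lesion border is expressed as a radial-distance signature, a smoothed border is subtracted, and the RMS of the residual serves as a spiculation index (the paper additionally analyses the frequency content of the fluctuations). `border` is an assumed ordered array of boundary points.

```python
# Spiculation index from border fluctuations (simplified sketch).
import numpy as np

def spiculation_index(border, smooth_len=21):
    centroid = border.mean(axis=0)
    d = np.hypot(*(border - centroid).T)           # radial distance signature
    kernel = np.ones(smooth_len) / smooth_len
    # Circular smoothing so the closed contour has no edge artefacts.
    smoothed = np.convolve(np.r_[d[-smooth_len:], d, d[:smooth_len]],
                           kernel, mode="same")[smooth_len:-smooth_len]
    fluctuation = d - smoothed                     # border minus smoothed border
    return np.sqrt(np.mean(fluctuation ** 2))      # RMS of the fluctuations
```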

  2. Computerized detection of mass lesions in digital mammograms

    International Nuclear Information System (INIS)

    Yin, F.F.; Giger, M.L.; Doi, K.; Metz, C.E.; Vyborny, C.J.; Schmidt, R.A.

    1989-01-01

    Early detection of breast cancer from the periodic screening of asymptomatic women could reduce breast cancer mortality by at least 40%. The authors are developing a computerized scheme for the detection of mass lesions in digital mammograms as an aid to radiologists in such high volume screening programs. Based on left-right architectural symmetry and gray-level histogram analysis, bilateral subtraction of left and right breast images is performed. False-positive detections included in the bilateral-difference images are reduced with various image feature-extraction techniques. The database involves clinical film mammograms digitized by a TV camera and analyzed on a Micro-VAX workstation. Among five different bilateral subtraction techniques investigated, a nonlinear approach provided superior lesion enhancement. Feature-extraction techniques substantially reduced the remaining false-positives. Preliminary results, for 32 pairs of clinical mammograms, yielded a true-positive rate of approximately 95% with a false-positive rate of about 2 per image.
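
    The bilateral-subtraction idea can be illustrated with the toy sketch below: the right-breast image is mirrored, crudely intensity-normalised, and subtracted from the left one so that asymmetric densities stand out. The gray-level histogram analysis and the feature-based false-positive reduction of the actual scheme are omitted, and the two views are assumed to have the same pixel dimensions.

```python
# Toy bilateral subtraction of left and right breast images.
import numpy as np

def bilateral_difference(left, right):
    mirrored = np.fliplr(right.astype(float))
    # Simple mean matching stands in for gray-level histogram analysis.
    mirrored *= left.mean() / max(mirrored.mean(), 1e-6)
    diff = left.astype(float) - mirrored
    return np.clip(diff, 0, None)      # keep densities present only on the left
```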

  3. Mammogram segmentation using maximal cell strength updation in cellular automata.

    Science.gov (United States)

    Anitha, J; Peter, J Dinesh

    2015-08-01

    Breast cancer is the most frequently diagnosed type of cancer among women. Mammography is one of the most effective tools for early detection of breast cancer. Various computer-aided systems have been introduced to detect breast cancer from mammogram images. In a computer-aided diagnosis system, detection and segmentation of breast masses from the background tissues is an important issue. In this paper, an automatic segmentation method is proposed to identify and segment the suspicious mass regions of mammograms using a modified transition rule named maximal cell strength updation in cellular automata (CA). In coarse-level segmentation, the proposed method performs an adaptive global thresholding based on histogram peak analysis to obtain the rough region of interest. An automatic seed point selection is proposed using the gray-level co-occurrence matrix-based sum average feature in the coarse segmented image. Finally, the method utilizes CA with the identified initial seed point and the modified transition rule to segment the mass region. The proposed approach is evaluated over a dataset of 70 mammograms with masses from the mini-MIAS database. Experimental results show that the proposed approach yields promising results for segmenting the mass region in the mammograms, with a sensitivity of 92.25% and an accuracy of 93.48%.
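
    The coarse-level step, an adaptive global threshold derived from the histogram peak of the breast region, might look like the sketch below; the cellular-automata refinement itself is not reproduced. `img` is an assumed 8-bit mammogram with background pixels equal to zero, and the `offset` rule is an illustrative choice, not the paper's.

```python
# Coarse mass-candidate mask from a histogram-peak-based global threshold.
import numpy as np
from scipy import ndimage

def coarse_mass_mask(img, offset=0.2):
    breast = img[img > 0]
    hist, bins = np.histogram(breast, bins=256, range=(0, 255))
    peak = bins[np.argmax(hist)]                   # dominant tissue intensity
    thr = peak + offset * (breast.max() - peak)    # threshold above the peak
    mask = img > thr
    # Keep the largest bright connected component as the rough ROI.
    lbl, n = ndimage.label(mask)
    if n == 0:
        return mask
    sizes = ndimage.sum(mask, lbl, index=range(1, n + 1))
    return lbl == (1 + int(np.argmax(sizes)))
```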

  4. Computerized image analysis: estimation of breast density on mammograms

    Science.gov (United States)

    Zhou, Chuan; Chan, Heang-Ping; Petrick, Nicholas; Sahiner, Berkman; Helvie, Mark A.; Roubidoux, Marilyn A.; Hadjiiski, Lubomir M.; Goodsitt, Mitchell M.

    2000-06-01

    An automated image analysis tool is being developed for estimation of mammographic breast density, which may be useful for risk estimation or for monitoring breast density change in a prevention or intervention program. A mammogram is digitized using a laser scanner and the resolution is reduced to a pixel size of 0.8 mm X 0.8 mm. Breast density analysis is performed in three stages. First, the breast region is segmented from the surrounding background by an automated breast boundary-tracking algorithm. Second, an adaptive dynamic range compression technique is applied to the breast image to reduce the range of the gray level distribution in the low frequency background and to enhance the differences in the characteristic features of the gray level histogram for breasts of different densities. Third, rule-based classification is used to classify the breast images into several classes according to the characteristic features of their gray level histogram. For each image, a gray level threshold is automatically determined to segment the dense tissue from the breast region. The area of segmented dense tissue as a percentage of the breast area is then estimated. In this preliminary study, we analyzed the interobserver variation of breast density estimation by two experienced radiologists using BI-RADS lexicon. The radiologists' visually estimated percent breast densities were compared with the computer's calculation. The results demonstrate the feasibility of estimating mammographic breast density using computer vision techniques and its potential to improve the accuracy and reproducibility in comparison with the subjective visual assessment by radiologists.
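
    The percent-density computation described above reduces to segmenting the breast, thresholding the dense tissue, and taking an area ratio. A simplified sketch is given below, with a fixed Otsu threshold standing in for the paper's rule-based, histogram-driven threshold selection; `img` is an assumed low-resolution digitized mammogram and `background_level` an illustrative parameter.

```python
# Simplified percent mammographic density estimate.
import numpy as np
from skimage.filters import threshold_otsu

def percent_density(img, background_level=10):
    breast_mask = img > background_level          # crude breast segmentation
    dense_thr = threshold_otsu(img[breast_mask])  # stand-in threshold choice
    dense_mask = breast_mask & (img > dense_thr)
    return 100.0 * dense_mask.sum() / breast_mask.sum()
```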

  5. Decision support system for breast cancer detection using mammograms.

    Science.gov (United States)

    Ganesan, Karthikeyan; Acharya, Rajendra U; Chua, Chua K; Min, Lim C; Mathew, Betty; Thomas, Abraham K

    2013-07-01

    Mammograms are by far one of the most preferred methods of screening for breast cancer. Early detection of breast cancer can improve survival rates to a greater extent. Although the analysis and diagnosis of breast cancer are done by experienced radiologists, there is always the possibility of human error. Interobserver and intraobserver errors occur frequently in the analysis of medical images, given the high variability between every patient. Also, the sensitivity of mammographic screening varies with image quality and expertise of the radiologist. So, there is no golden standard for the screening process. To offset this variability and to standardize the diagnostic procedures, efforts are being made to develop automated techniques for diagnosis and grading of breast cancer images. This article presents a classification pipeline to improve the accuracy of differentiation between normal, benign, and malignant mammograms. Several features based on higher-order spectra, local binary pattern, Laws' texture energy, and discrete wavelet transform were extracted from mammograms. Feature selection techniques based on sequential forward, backward, plus-l-takeaway-r, individual, and branch-and-bound selections using the Mahalanobis distance criterion were used to rank the features and find classification accuracies for combination of several features based on the ranking. Six classifiers were used, namely, decision tree classifier, fisher classifier, linear discriminant classifier, nearest mean classifier, Parzen classifier, and support vector machine classifier. We evaluated our proposed methodology with 300 mammograms obtained from the Digital Database for Screening Mammography and 300 mammograms from the Singapore Anti-Tuberculosis Association CommHealth database. Sensitivity, specificity, and accuracy values were used to compare the performances of the classifiers. Our results show that the decision tree classifier demonstrated an excellent performance compared to

  6. Quantification and normalization of x-ray mammograms

    International Nuclear Information System (INIS)

    Tromans, Christopher E; Cocker, Mary R; Brady, Michael

    2012-01-01

    The analysis of (x-ray) mammograms remains qualitative, relying on the judgement of clinicians. We present a novel method to compute a quantitative, normalized measure of tissue radiodensity traversed by the primary beam incident on each pixel of a mammogram, a measure we term the standard attenuation rate (SAR). SAR enables: the estimation of breast density which is linked to cancer risk; direct comparison between images; the full potential of computer aided diagnosis to be utilized; and a basis for digital breast tomosynthesis reconstruction. It does this by removing the effects of the imaging conditions under which the mammogram is acquired. First, the x-ray spectrum incident upon the breast is calculated, and from this, the energy exiting the breast is calculated. The contribution of scattered radiation is calculated and subtracted. The SAR measure is the scaling factor that must be applied to the reference material in order to match the primary attenuation of the breast. Specifically, this is the scaled reference material attenuation which when traversed by an identical beam to that traversing the breast, and when subsequently detected, results in the primary component of the pixel intensity observed in the breast image. We present results using two tissue equivalent phantoms, as well as a sensitivity analysis to detector response changes over time and possible errors in compressed thickness measurement. (paper)

  7. Diagnostic image quality of mammograms in German outpatient medical care

    International Nuclear Information System (INIS)

    Pfandzelter, R.; Wuelfing, U.; Boedeker, B.

    2010-01-01

    Purpose: A total of 79 115 mammograms from statutory health insurance (SHI) physicians within German outpatient care were evaluated with respect to the diagnostic image quality. Materials and Methods: Mammograms were randomly selected between 2006 and 2008 by the regional Associations of Statutory Health Insurance Physicians and submitted to regional boards of experts for external evaluation. The mammogram quality was evaluated using a 3-point scale (adequate, borderline, failure) and documented using a nationally standardized protocol. Results: 87.6 % of the mammograms were classified as adequate, 11.0 % as borderline and 1.4 % as failure. Mediolateral oblique mammograms (mlo) had worse ratings than craniocaudal mammograms (cc). Main reasons for classifying the mammograms as borderline or failure were 'inframammary fold not adequately visualized' (mlo), 'pectoral muscle not in the correct angle or not to the level with the nipple' (mlo), 'the nipple not in profile' (mlo, cc) and 'breast not completely or not adequately visualized' (cc). Conclusion: The results show a good overall quality of mammograms in German outpatient medical care. Failures can be associated predominantly with incorrect positioning of the breast. More precisely defined quality criteria using objective measures are recommended, especially for craniocaudal mammograms (cc). (orig.)

  8. Quantitative assessment of breast density from mammograms

    International Nuclear Information System (INIS)

    Jamal, N.; Ng, K.H.

    2004-01-01

    Full text: It is known that breast density is increasingly used as a risk factor for breast cancer. This study was undertaken to develop and validate a semi-automated computer technique for the quantitative assessment of breast density from digitised mammograms. A computer technique had been developed using MATLAB (Version 6.1) based GUI applications. This semi-automated image analysis tool consists of gradient correction, segmentation of the breast region from the background, segmentation of the fibroglandular and adipose regions within the breast area, and calculation of breast density. The density is defined as the percentage of fibroglandular tissue area divided by the total breast area in the mammogram. This technique was clinically validated with 122 normal mammograms; these were subjectively evaluated and classified according to the five parenchyma patterns of Tabar's scheme (Class I-V) by a consultant radiologist. There was a statistically significant correlation between the computer technique and the subjective classification (r² = 0.84, p < 0.05). 71.3% of the subjectively classified mammograms were correctly classified using the computer technique. We have developed a computer technique for the quantitative assessment of breast density and validated its accuracy for computerized classification based on Tabar's scheme. This quantitative tool is useful for the evaluation of a large dataset of mammograms to predict breast cancer risk based on density. Furthermore it has the potential to provide an early marker for success or failure in chemoprevention studies such as hormonal replacement therapy. Copyright (2004) Australasian College of Physical Scientists and Engineers in Medicine

  9. Association between Radiologists' Experience and Accuracy in Interpreting Screening Mammograms

    Directory of Open Access Journals (Sweden)

    Maristany Maria-Teresa

    2008-04-01

    Background: Radiologists have been observed to differ, sometimes substantially, both in their interpretations of mammograms and in their recommendations for follow-up. The aim of this study was to determine how factors related to radiologists' experience affect the accuracy of mammogram readings. Methods: We selected a random sample of screening mammograms from a population-based breast cancer screening program. The sample was composed of 30 women with histopathologically-confirmed breast cancer and 170 women without breast cancer after a 2-year follow-up (the proportion of cancers was oversampled). These 200 mammograms were read by 21 radiologists routinely interpreting mammograms, with different amounts of experience, and by seven readers who did not routinely interpret mammograms. All readers were blinded to the results of the screening. A positive assessment was considered when a BI-RADS III, 0, IV or V was reported (additional evaluation required). Diagnostic accuracy was calculated through sensitivity and specificity. Results: Average specificity was higher in radiologists routinely interpreting mammograms than in those who did not (66% vs 56%). Conclusion: Among radiologists who read routinely, volume is not associated with better performance when interpreting screening mammograms, although specificity decreased in radiologists not routinely reading mammograms. Follow-up of cases for which further workup is recommended might reduce variability in mammogram readings and improve the quality of breast cancer screening programs.

  10. Detection of masses in mammograms by analysis of gradient vector convergence using sector filter

    International Nuclear Information System (INIS)

    Fakhari, Y.; Karimian, A.; Mohammadbeigi, M.

    2012-01-01

    Although mammography is the main diagnostic method for breast cancer, the interpretation of mammograms is a difficult task that depends on the experience and skill of the radiologist. Computer Aided Detection (CADe) systems have been proposed to help radiologists in the interpretation of mammograms. In this paper a novel filter, called the sector filter, is proposed to detect masses. This filter works based on the analysis of the convergence of gradient vectors toward the center of the filter. Using this filter, rounded convex regions, which are more likely to pertain to a mass, can be detected in a gray-scale image. After applying this filter to the images at two scales and taking their linear combination, suspicious points were selected by a specific process. After implementation of the proposed method, promising results were achieved. The performance of the proposed method was competitive with, or in some cases even better than, that of other methods suggested in the literature. (authors)
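
    A generic gradient-convergence measure of this kind can be sketched as follows: at every pixel, gradients in a surrounding disc are compared with the unit vectors pointing back toward that pixel, so bright rounded (convex) regions produce high responses. This is not the authors' sector filter, only an illustration of the underlying idea; the radius is an illustrative parameter.

```python
# Gradient-convergence map: high where surrounding gradients point inward.
import numpy as np
from scipy.ndimage import sobel, correlate

def gradient_convergence(img, radius=15):
    gy = sobel(img.astype(float), axis=0)
    gx = sobel(img.astype(float), axis=1)
    yy, xx = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    r = np.hypot(yy, xx)
    inside = (r > 0) & (r <= radius)
    # Unit vectors pointing from each neighbourhood position toward the centre.
    ky = np.where(inside, -yy / np.maximum(r, 1e-9), 0.0)
    kx = np.where(inside, -xx / np.maximum(r, 1e-9), 0.0)
    return correlate(gx, kx) + correlate(gy, ky)   # convergence response
```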

  11. Interobserver variability in interpretation of mammogram

    International Nuclear Information System (INIS)

    Lee, Kyung Jae; Lee, Hae Kyung; Lee, Won Chul; Hwang, In Young; Park, Young Gyu; Jung, Sang Seol; Kim, Hoon Kyo; Kim, Mi Hye; Kim, Hak Hee

    2004-01-01

    The purpose of this study was to evaluate the performance of radiologists in mammographic screening and to analyze interobserver agreement in the interpretation of mammograms. 50 women were selected as subjects from the patients who were screened with mammograms at two university hospitals. The images were analyzed by five radiologists working independently and without any knowledge of the final diagnosis. The interobserver variation was analyzed by using the kappa statistic. There was moderate agreement for the findings of parenchymal pattern (k=0.44; 95% CI 0.39-0.49), calcification type (k=0.66; 95% CI 0.60-0.72) and calcification distribution (k=0.43; 95% CI 0.38-0.48). The mean kappa values ranged from 0.42 to 0.66 for the mass findings. The mean kappa value for the final conclusion was 0.44 (95% CI 0.38-0.51). In general, moderate agreement was evident for all the categories that were evaluated. The overall agreement was moderate, but there was wide variability in some findings. To improve accuracy and reduce variability among physicians in interpretation, proper training of radiologists and standardization of criteria are essential for breast screening.
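
    For reference, a pairwise agreement statistic like those quoted above can be computed directly; the sketch below uses Cohen's kappa on hypothetical category labels from two readers (the study itself pools agreement across five readers).

```python
# Cohen's kappa between two readers on the same set of mammograms
# (hypothetical labels for illustration only).
from sklearn.metrics import cohen_kappa_score

reader_a = ["benign", "malignant", "benign", "normal", "malignant", "benign"]
reader_b = ["benign", "malignant", "normal", "normal", "malignant", "malignant"]
print("kappa =", round(cohen_kappa_score(reader_a, reader_b), 2))
```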

  12. Intelligent detection of microcalcification from digitized mammograms

    Indian Academy of Sciences (India)

    Permanent link: https://www.ias.ac.in/article/fulltext/sadh/036/01/0125-0139. Keywords: breast cancer; digital mammogram; artificial neural networks; microcalcification detection. Abstract: This paper reports the design and implementation of an intelligent system for the detection of microcalcifications from digital mammograms.

  13. INDIAM--an e-learning system for the interpretation of mammograms.

    Science.gov (United States)

    Guliato, Denise; Bôaventura, Ricardo S; Maia, Marcelo A; Rangayyan, Rangaraj M; Simedo, Mariângela S; Macedo, Túlio A A

    2009-08-01

    We propose the design of a teaching system named Interpretation and Diagnosis of Mammograms (INDIAM) for training students in the interpretation of mammograms and diagnosis of breast cancer. The proposed system integrates an illustrated tutorial on radiology of the breast, that is, mammography, which uses education techniques to guide the user (doctors, students, or researchers) through various concepts related to the diagnosis of breast cancer. The user can obtain informative text about specific subjects, access a library of bibliographic references, and retrieve cases from a mammographic database that are similar to a query case on hand. The information of each case stored in the mammographic database includes the radiological findings, the clinical history, the lifestyle of the patient, and complementary exams. The breast cancer tutorial is linked to a module that simulates the analysis and diagnosis of a mammogram. The tutorial incorporates tools for helping the user to evaluate his or her knowledge about a specific subject by using the education system or by simulating a diagnosis with appropriate feedback in case of error. The system also makes available digital image processing tools that allow the user to draw the contour of a lesion, the contour of the breast, or identify a cluster of calcifications in a given mammogram. The contours provided by the user are submitted to the system for evaluation. The teaching system is integrated with AMDI-An Indexed Atlas of Digital Mammograms-that includes case studies, e-learning, and research systems. All the resources are accessible via the Web.

  14. Review and analysis of mammograms of the Servicio de Radiologia, Hospital Calderon Guardia from January 2013 to May 2013

    International Nuclear Information System (INIS)

    Lawrence Villalobos, Andrea; Solis Vargas, Carlos

    2013-01-01

    An analysis of 400 mammograms from the Radiology Service of the Calderon Guardia Hospital is carried out, relating the statistical findings on BI-RADS categorization, hereditary family factors, age groups and personal pathological history of the patients treated in the period from January 2013 to May 2013. Calcifications are identified as the most frequent pathological findings in the mammograms analyzed. Ultrasound is identified as the most frequently used complementary method to mammography. Mammography is still the screening study for the early detection of breast cancer. The support of specialists in radiological medical imaging is recommended to implement early reporting at the second level of care.

  15. False Negative Mammogram of Breast Cancer : Analysis of Mammographic and Sonographic Findings and Correlation with Clinical Findings

    International Nuclear Information System (INIS)

    Lee, Kil Jun; Lee, Ji Yeon; Han, Sung Nim; Jeong, Seong Ki; Tae, Seok; Shin, Kyoung Ja; Lee, Sang Chun

    1995-01-01

    Recent mammographic equipment is of good quality and yields high diagnostic accuracy for the detection of breast cancer. However, a negative mammogram does not necessarily rule out breast cancer. We therefore reviewed the causes of false negative mammography in confirmed breast cancer in order to improve diagnostic accuracy and to guide an adequate clinical approach. We reviewed 19 cases of confirmed breast cancer which showed false negative mammography with positive sonographic findings. Retrospective analysis was done by correlating the patient's age, sonographic findings and mass size, mammographic breast pattern and cause of the false negative mammogram, and clinical symptoms. Among the 5 patients below 35 years of age, the mass was not visible due to dense breasts in 4 cases and due to small size in 1 case. Of the 14 patients over 35 years of age, 11 had normal mammographic findings, 4 had dense breasts, and 7 had a small sized mass; the remaining 3 cases showed asymmetric density in 2 and architectural distortion in 1 case. All showed a mass lesion on sonography: an ill-defined malignant appearance in 14, a well-defined malignant appearance in 2, and a well-defined benign appearance in 3 cases. A negative mammogram should be correlated with sonography in cases of dense breasts, patients below 35 years of age with a palpable mass, and patients at risk for breast cancer.

  16. Mammogram synthesis using a 3D simulation. I. Breast tissue model and image acquisition simulation

    International Nuclear Information System (INIS)

    Bakic, Predrag R.; Albert, Michael; Brzakovic, Dragana; Maidment, Andrew D. A.

    2002-01-01

    A method is proposed for generating synthetic mammograms based upon simulations of breast tissue and the mammographic imaging process. A computer breast model has been designed with a realistic distribution of large and medium scale tissue structures. Parameters controlling the size and placement of simulated structures (adipose compartments and ducts) provide a method for consistently modeling images of the same simulated breast with modified position or acquisition parameters. The mammographic imaging process is simulated using a compression model and a model of the x-ray image acquisition process. The compression model estimates breast deformation using tissue elasticity parameters found in the literature and clinical force values. The synthetic mammograms were generated by a mammogram acquisition model using a monoenergetic parallel beam approximation applied to the synthetically compressed breast phantom

  17. Global detection approach for clustered microcalcifications in mammograms using a deep learning network.

    Science.gov (United States)

    Wang, Juan; Nishikawa, Robert M; Yang, Yongyi

    2017-04-01

    In computerized detection of clustered microcalcifications (MCs) from mammograms, the traditional approach is to apply a pattern detector to locate the presence of individual MCs, which are subsequently grouped into clusters. Such an approach is often susceptible to the occurrence of false positives (FPs) caused by local image patterns that resemble MCs. We investigate the feasibility of a direct detection approach to determining whether an image region contains clustered MCs or not. Toward this goal, we develop a deep convolutional neural network (CNN) as the classifier model, whose input consists of a large image window ([Formula: see text] in size). The multiple layers in the CNN classifier are trained to automatically extract image features relevant to MCs at different spatial scales. In the experiments, we demonstrated this approach on a dataset consisting of both screen-film mammograms and full-field digital mammograms. We evaluated the detection performance both on classifying image regions of clustered MCs using receiver operating characteristic (ROC) analysis and on detecting clustered MCs from full mammograms by free-response receiver operating characteristic analysis. For comparison, we also considered a recently developed MC detector with FP suppression. In classifying image regions of clustered MCs, the CNN classifier achieved 0.971 in the area under the ROC curve, compared to 0.944 for the MC detector. In detecting clustered MCs from full mammograms, at 90% sensitivity, the CNN classifier obtained an FP rate of 0.69 clusters/image, compared to 1.17 clusters/image by the MC detector. These results indicate that using global image features can be more effective in discriminating clustered MCs from FPs caused by various sources, such as linear structures, thereby providing a more accurate detection of clustered MCs on mammograms.
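
    A patch-level CNN classifier of the kind described, taking a large grayscale window and returning the probability that it contains a cluster of MCs, can be sketched in PyTorch as below; the layer sizes and the 96 x 96 example window are illustrative, not the paper's architecture or input size.

```python
# Minimal PyTorch sketch of a window-level classifier for clustered MCs.
import torch
import torch.nn as nn

class MCClusterNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),               # works for any window size
        )
        self.classifier = nn.Linear(64, 1)

    def forward(self, x):                          # x: (batch, 1, H, W)
        z = self.features(x).flatten(1)
        return torch.sigmoid(self.classifier(z))  # P(window contains a cluster)

# Example forward pass on a random 96 x 96 window:
# prob = MCClusterNet()(torch.rand(1, 1, 96, 96))
```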

  18. Consensus double reading of mammograms in private practice

    International Nuclear Information System (INIS)

    Pacher, B.; Tscherney, R.; Litmann-Rowenta, B.; Liskutin, J.; Mazewski, I.; Leitner, H.; Tscholakoff, D.

    2004-01-01

    Purpose: To evaluate retrospectively the results of consensus double reading of mammograms in a private practice for a period of 1.5 years (November 2001 to March 2003). Materials and Method: Two independent experts with dedicated training read all mammograms on a weekly basis. All mammograms including sonographic examinations were evaluated independently and categorized using the BI-RADS classification. The achieved consensus included a possible recommendation for recall or therapy. A total of 3936 mammograms and 1912 sonography studies were evaluated. All cases with BI-RADS 4 and 5 categories were compared with the histologic results. For a period of three months, the acceptance of double reading including a delay of the final report by one week was tested with a questionnaire and informed consent sheet. Results: BI-RADS categories 4 and 5 were found in 57 cases, with 41 consensus results by two independent readers and 26 carcinomas verified by histology. No consensus could be reached in 16 patients, of which 10 had a final histologic result, with 5 benign lesions and 5 carcinomas of less than 1 cm in diameter. Clinical symptoms or alterations were absent in all patients. The 5 carcinomas were discovered by the double reading procedure. The result of the questionnaire (695 questionnaires) showed a refusal rate of 0.7%, with only 5 women refusing the opportunity of double reading their mammograms. Conclusion: Double reading of mammograms by independent experts is feasible, shows a measurable increase in quality and is accepted by almost all women. (orig.)

  19. Risks of mammograms

    International Nuclear Information System (INIS)

    Swartz, H.M.; Reichling, B.A.

    1977-01-01

    In summary, the following practical guidelines for mammography are offered: 1. Any woman, regardless of age, with signs or symptoms that indicate breast cancer should have a mammogram. 2. A woman who has a high risk for breast cancer (e.g., strong family history, no pregnancy before 30 years of age, or a previous breast cancer) should receive periodic screening examinations, including mammography. 3. Periodic screening for asymptomatic women over the age of 50 is indicated. 4. The value of periodic screening for asymptomatic women who are not considered to be at high risk and are under the age of 50 years is not established. Such screening should be carried out only when useful data can be collected on the benefits and risks of this procedure. 5. For any individual woman, the risk of inducing breast cancer by mammography is very low. 6. Mammograms should be made only with modern equipment and techniques designed to provide optimum information with minimal dose

  20. Double versus single reading of mammograms in a breast cancer screening programme: a cost-consequence analysis.

    Science.gov (United States)

    Posso, Margarita C; Puig, Teresa; Quintana, Ma Jesus; Solà-Roca, Judit; Bonfill, Xavier

    2016-09-01

    To assess the costs and health-related outcomes of double versus single reading of digital mammograms in a breast cancer screening programme. Based on data from 57,157 digital screening mammograms from women aged 50-69 years, we compared costs, false-positive results, positive predictive value and cancer detection rate using four reading strategies: double reading with and without consensus and arbitration, and single reading with first reader only and second reader only. Four highly trained radiologists read the mammograms. Double reading with consensus and arbitration was 15% (Euro 334,341) more expensive than single reading with first reader only. False-positive results were more frequent at double reading with consensus and arbitration than at single reading with first reader only (4.5% and 4.2%, respectively); the positive predictive value and the cancer detection rate were similar for both reading strategies (4.6 and 4.2 per 1000 screens; p = 0.283). Our results suggest that changing to single reading of mammograms could produce savings in breast cancer screening. Single reading could reduce the frequency of false-positive results without changing the cancer detection rate. These results are not conclusive and cannot be generalized to other contexts with less trained radiologists. • Double reading of digital mammograms is more expensive than single reading. • Compared to single reading, double reading yields a higher proportion of false-positive results. • The cancer detection rate was similar for double and single readings. • Single reading may be a cost-effective strategy in breast cancer screening programmes.

  1. Breast composition measurements using retrospective standard mammogram form (SMF)

    International Nuclear Information System (INIS)

    Highnam, R; Pan, X; Warren, R; Jeffreys, M; Smith, G Davey; Brady, M

    2006-01-01

    The standard mammogram form (SMF) representation of an x-ray mammogram is a standardized, quantitative representation of the breast from which the volume of non-fat tissue and breast density can be easily estimated, both of which are of significant interest in determining breast cancer risk. Previous theoretical analysis of SMF had suggested that a complete and substantial set of calibration data (such as mAs and kVp) would be needed to generate realistic breast composition measures, and yet there are many interesting trials that have retrospectively collected images with no calibration data. The main contribution of this paper is to revisit our previous theoretical analysis of SMF with respect to errors in the calibration data and to show how and why that theoretical analysis did not match the results from the practical implementations of SMF. In particular, we show how, by estimating breast thickness for every image, we are effectively compensating for any errors in the calibration data. To illustrate our findings, the current implementation of SMF (version 2.2β) was run over 4028 digitized film-screen mammograms taken from six sites over the years 1988-2002, with and without using the known calibration data. Results show that the SMF implementation running without any calibration data at all generates results which display a strong relationship with those obtained when running with a complete set of calibration data and, most importantly, with an expert's visual assessment of breast composition using established techniques. SMF shows considerable promise in being of major use in large epidemiological studies related to breast cancer which require the automated analysis of large numbers of films from many years previously, where little or no calibration data is available.

  2. Availability and accessibility of subsidized mammogram screening program in peninsular Malaysia: A preliminary study using travel impedance approach.

    Science.gov (United States)

    Mahmud, Aidalina; Aljunid, Syed Mohamed

    2018-01-01

    Access to healthcare is essential in the pursuit of universal health coverage. Components of access are availability, accessibility (spatial and non-spatial), affordability and acceptability. Measuring spatial accessibility is a common approach to evaluating access to health care. This study aimed to determine the availability and spatial accessibility of subsidised mammogram screening in Peninsular Malaysia. Availability was determined from the number and distribution of facilities. Spatial accessibility was determined using the travel impedance approach to represent revealed access, as opposed to the potential access measured by other spatial measurement methods. The driving distance of return trips from the respondent's residence to the facilities was determined using a mapping application. The travel expenditure was estimated by multiplying the total travel distance by a standardised travel allowance rate, plus parking fees. Respondents in this study were 344 breast cancer patients who received treatment at 4 referral hospitals between 2015 and 2016. In terms of availability, there were at least 6 major entities which provided subsidised mammogram programs. Facilities providing mammograms under these programs were located more densely in the central and west coast regions of the Peninsula. The ratio of mammogram facilities to the target population of women aged 40-74 years ranged between 1:10,000 and 1:80,000. In terms of accessibility, among the 3.6% of respondents who had undergone mammogram screening, the mean travel distance was 53.4 km (SD = 34.5, range 8-112 km) and the mean travel expenditure was RM 38.97 (SD = 24.00, range RM7.60-78.40). Among those who did not go for mammogram screening, the estimated travel distance and expenditure had a skewed distribution, with a median travel distance of 22.0 km (IQR 12.0, 42.0, range 2.0-340.0) and a median travel cost of RM 17.40 (IQR 10.40, 30.00, range 3.40-240.00). Higher travel impedance was noted among those who

  3. Automated detection of microcalcification clusters in mammograms

    Science.gov (United States)

    Karale, Vikrant A.; Mukhopadhyay, Sudipta; Singh, Tulika; Khandelwal, Niranjan; Sadhu, Anup

    2017-03-01

    Mammography is the most efficient modality for detection of breast cancer at an early stage. Microcalcifications are tiny bright spots in mammograms and can often be missed by the radiologist during diagnosis. The presence of microcalcification clusters in mammograms can act as an early sign of breast cancer. This paper presents a completely automated computer-aided detection (CAD) system for detection of microcalcification clusters in mammograms. Unsharp masking is used as a preprocessing step to enhance the contrast between microcalcifications and the background. The preprocessed image is thresholded and various shape- and intensity-based features are extracted. A support vector machine (SVM) classifier is used to reduce the false positives while preserving the true microcalcification clusters. The proposed technique is applied to two different databases, i.e., DDSM and a private database, and shows good sensitivity with a moderate number of false positives (FPs) per image on both databases.
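
    A rough sketch of that pipeline (unsharp masking, thresholding, shape/intensity features, SVM-based false-positive reduction) is given below. The Gaussian width, the use of Otsu's threshold and the particular region features are assumptions made for illustration, not the parameters reported in the paper.

        import numpy as np
        from scipy import ndimage
        from skimage import filters, measure
        from sklearn.svm import SVC

        def candidate_features(mammogram):
            """Enhance, threshold and describe candidate bright objects."""
            mammogram = mammogram.astype(float)
            blurred = ndimage.gaussian_filter(mammogram, sigma=3)
            enhanced = mammogram + 1.5 * (mammogram - blurred)    # unsharp masking
            mask = enhanced > filters.threshold_otsu(enhanced)    # global threshold
            labels = measure.label(mask)
            feats = []
            for region in measure.regionprops(labels, intensity_image=mammogram):
                feats.append([region.area,
                              region.eccentricity,
                              region.mean_intensity,
                              region.max_intensity])
            return np.array(feats)

        # A trained SVM then keeps true microcalcifications and rejects FPs:
        # clf = SVC(kernel="rbf").fit(X_train, y_train)
        # keep = clf.predict(candidate_features(new_mammogram))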

  4. Detecting microcalcifications in digital mammogram using wavelets

    International Nuclear Information System (INIS)

    Yang Jucheng; Park Dongsun

    2004-01-01

    delegates of different kinds of the wavelet families. Among them, the Biorthogonal wavelet is bi-orthogonal but not orthogonal, and it is symmetric, while the other three wavelets are not only bi-orthogonal but also orthogonal and are nearly symmetric. These different characteristics affect their detection results. We first decompose the mammogram using the db4, bior3.7, coif3 and sym2 wavelets respectively; for each wavelet family, the image is decomposed into 4 levels (the first level being the original image). The detection of microcalcifications is accomplished by setting the wavelet coefficients of the upper-left sub-band to zero in order to suppress the image background information before the reconstruction of the image. The reconstructed mammogram is expected to contain only high-frequency components, which include the microcalcifications. After the wavelet transform process, the third step is to locate microcalcifications through a thresholding operation. The labeling operation with a threshold changes each reconstructed image into a binary image. The threshold is determined through a series of simulation studies. The final step of the proposed detection algorithm is post-processing to eliminate tiny isolated points using binary morphological closing and opening operators. The digital mammogram database used in this work is the MIAS (Mammographic Image Analysis Society) database. The images in this database were scanned with a Joyce-Loebl microdensitometer SCANDIG-3, which has a linear response in the optical density range 0-3.2. Each pixel is 8 bits deep, at a resolution of 50 µm x 50 µm, and regions of microcalcifications have been marked by an experienced diagnostician. Twenty-five images (twelve with benign and thirteen with malignant microcalcifications) were selected for this experiment. The performance of the proposed algorithm is evaluated by a free-response receiver operating characteristic (FROC) in terms of true-positive (TP) fraction for a given number of
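
    The decompose / suppress / reconstruct / threshold steps described above can be sketched with PyWavelets as follows. The wavelet names come from the abstract; the decomposition depth, the threshold rule and the morphological cleanup parameters are assumptions used only to make the sketch runnable.

        import numpy as np
        import pywt
        from scipy import ndimage

        def detect_microcalcifications(image, wavelet="db4", levels=3, k=3.0):
            coeffs = pywt.wavedec2(image, wavelet, level=levels)
            coeffs[0] = np.zeros_like(coeffs[0])        # zero the low-frequency (background) subband
            highpass = pywt.waverec2(coeffs, wavelet)
            highpass = highpass[:image.shape[0], :image.shape[1]]
            binary = highpass > k * highpass.std()      # thresholding step
            binary = ndimage.binary_closing(binary)     # post-processing: remove tiny
            binary = ndimage.binary_opening(binary)     # isolated points
            return binary

        # e.g. compare the four wavelets mentioned in the abstract:
        # for w in ("db4", "bior3.7", "coif3", "sym2"):
        #     mask = detect_microcalcifications(mammogram, wavelet=w)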

  5. Automated detection of microcalcification clusters in digital mammograms based on wavelet domain hidden Markov tree modeling

    International Nuclear Information System (INIS)

    Regentova, E.; Zhang, L.; Veni, G.; Zheng, J.

    2007-01-01

    A system is designed for detecting microcalcification clusters (MCC) in digital mammograms. The system is intended for computer-aided diagnostic prompting. Further discrimination of MCC as benign or malignant is assumed to be performed by radiologists. Processing of mammograms is based on statistical modeling by means of wavelet-domain hidden Markov trees (WHMT). Segmentation is performed by weighted likelihood evaluation, followed by classification based on spatial filters for single microcalcification (MC) and MC cluster detection. The analysis is carried out on FROC curves for 40 mammograms from the mini-MIAS database and for 100 mammograms with 50 cancerous and 50 benign cases from the DDSM database. The designed system is capable of detecting 100% of the true positive cases in these sets. The rate of false positives is 2.9 per case for the mini-MIAS dataset and 0.01 for the DDSM images. (orig.)

  6. Early detection of breast cancer mass lesions by mammogram segmentation images based on texture features

    International Nuclear Information System (INIS)

    Mahmood, F.H.

    2012-01-01

    Mammography is at present one of the available methods for early detection of masses or abnormalities related to breast cancer, such as calcifications. The challenge lies in early and accurate detection to overcome the development of breast cancer, which affects more and more women throughout the world. Breast cancer is often diagnosed at advanced stages with the help of digital mammogram images. Masses appear in a mammogram as fine, granular clusters, which are often difficult to identify in a raw mammogram. The incidence of breast cancer in women has increased significantly in recent years. This paper proposes a computer-aided diagnostic system for the extraction of features such as mass lesions in mammograms for early detection of breast cancer. The proposed technique is based on a four-step procedure: (a) preprocessing of the image, (b) specification of regions of interest (ROI), (c) a supervised segmentation method comprising two stages performed using the minimum distance (MD) criterion, and (d) feature extraction based on gray-level co-occurrence matrices (GLCM) for the identification of mass lesions. The method suggested for the detection of mass lesions from mammogram image segmentation and analysis was tested on several images taken from Al-llwiya Hospital in Baghdad, Iraq. The proposed technique shows good results.
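
    The GLCM feature-extraction step (d) can be sketched as below for a single region of interest; the gray-level quantisation, pixel offsets and the particular Haralick-style properties are illustrative assumptions rather than the paper's exact settings.

        import numpy as np
        from skimage.feature import graycomatrix, graycoprops

        def glcm_features(roi, levels=64):
            """Texture features of a mammogram ROI from its gray-level co-occurrence matrix."""
            q = np.floor(roi.astype(float) / (roi.max() + 1e-9) * (levels - 1)).astype(np.uint8)
            glcm = graycomatrix(q,
                                distances=[1, 2],
                                angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                                levels=levels, symmetric=True, normed=True)
            props = ("contrast", "homogeneity", "energy", "correlation")
            return np.hstack([graycoprops(glcm, p).ravel() for p in props])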

  7. Mammogram retrieval through machine learning within BI-RADS standards.

    Science.gov (United States)

    Wei, Chia-Hung; Li, Yue; Huang, Pai Jung

    2011-08-01

    A content-based mammogram retrieval system can support the usual comparisons made on images by physicians, answering similarity queries over images stored in the database. The importance of searching for similar mammograms lies in the fact that physicians usually try to recall similar cases by seeking images that are pathologically similar to a given image. This paper presents a content-based mammogram retrieval system, which employs a query example to search for similar mammograms in the database. In this system the mammographic lesions are interpreted based on their medical characteristics specified in the Breast Imaging Reporting and Data System (BI-RADS) standards. A hierarchical similarity measurement scheme based on a distance weighting function is proposed to model the user's perception and maximize the effectiveness of each feature in a mammographic descriptor. A machine learning approach based on support vector machines and the user's relevance feedback is also proposed to analyze the user's information need in order to retrieve target images more accurately. Experimental results demonstrate that the proposed machine learning approach with a Radial Basis Function (RBF) kernel achieves the best performance among all tested approaches. Furthermore, the results also show that the proposed learning approach can improve retrieval performance when applied to retrieve mammograms with similar mass and calcification lesions, respectively. Copyright © 2011 Elsevier Inc. All rights reserved.

  8. The visibility of cancer on previous mammograms in retrospective review

    International Nuclear Information System (INIS)

    Saarenmaa, I.; Salminen, T.; Geiger, U.; Heikkinen, P.; Hyvarinen, S.; Isola, J.; Kataja, V.; Kokko, M.-L.; Kokko, R.; Kumpulainen, E.; Karkkainen, A.; Pakkanen, J.; Peltonen, P.; Piironen, A.; Salo, A.; Talviala, M.-L.; Hakama, M.

    2001-01-01

    AIM: To study how many tumours were visible in retrospect on mammograms originally reported as normal or benign in patients coming to surgery with proven breast cancer. The effect of making the pre-operative mammogram available was also assessed. MATERIALS AND METHODS: Three hundred and twenty initial mammograms of consecutive new breast cancer cases were analysed by a group of radiologists in the knowledge that all patients were later diagnosed with breast cancer. The films were read twice, first without and then with the later (pre-operative) mammograms available. The parenchymal density in the location of the tumour was classified as fatty, mixed or dense, and the tumours were classified as visible or not visible. The reasons for the invisibility of the tumour in the earlier examination were analysed. RESULTS: Fourteen per cent (45) of cancers were retrospectively visible in earlier mammograms without the pre-operative mammograms having been shown, and 29% (95) when pre-operative mammograms were shown. Breast parenchymal density decreased with age and the visibility of tumours increased with age. When considered simultaneously, the effect of age (over 55 vs under 55) was greater (OR = 2.9) than the effect of density (fatty vs others) (OR = 1.5). The most common reasons for non-detection were that the lesion was overlooked (55%), diagnosed as benign (33%) or was visible only in one projection (26%). Growing density was the most common (37%) feature of those lesions originally overlooked or regarded as benign. CONCLUSIONS: Tumours are commonly visible in retrospect, but few of them exhibit specific signs of cancer, and they are recognized only if they grow or otherwise change. It is not possible to differentiate most of them from normal parenchymal densities.

  9. Hospitalized women's willingness to pay for an inpatient screening mammogram.

    Science.gov (United States)

    Khaliq, Waseem; Harris, Ché Matthew; Landis, Regina; Bridges, John F P; Wright, Scott M

    2014-01-01

    Lower rates for breast cancer screening persist among low income and uninsured women. Although Medicare and many other insurance plans would pay for screening mammograms done during hospital stays, breast cancer screening has not been part of usual hospital care. This study explores the mean amount of money that hospitalized women were willing to contribute towards the cost of a screening mammogram. Of the 193 enrolled patients, 72% were willing to pay a mean of $83.41 (95% CI, $71.51-$95.31) in advance towards inpatient screening mammogram costs. The study's findings suggest that hospitalized women value the prospect of screening mammography during the hospitalization. It may be wise policy to offer mammograms to nonadherent hospitalized women, especially those who are at high risk for developing breast cancer. © 2014 Annals of Family Medicine, Inc.

  10. Rapid point-of-care breath test for biomarkers of breast cancer and abnormal mammograms.

    Directory of Open Access Journals (Sweden)

    Michael Phillips

    Full Text Available BACKGROUND: Previous studies have reported volatile organic compounds (VOCs) in breath as biomarkers of breast cancer and abnormal mammograms, apparently resulting from increased oxidative stress and cytochrome p450 induction. We evaluated a six-minute point-of-care breath test for VOC biomarkers in women screened for breast cancer at centers in the USA and the Netherlands. METHODS: 244 women had a screening mammogram (93/37 normal/abnormal) or a breast biopsy (cancer/no cancer 35/79). A mobile point-of-care system collected and concentrated breath and air VOCs for analysis with gas chromatography and surface acoustic wave detection. Chromatograms were segmented into a time series of alveolar gradients (breath minus room air). Segmental alveolar gradients were ranked as candidate biomarkers by C-statistic value (area under curve [AUC] of the receiver operating characteristic [ROC] curve). Multivariate predictive algorithms were constructed employing significant biomarkers identified with multiple Monte Carlo simulations and cross-validated with a leave-one-out (LOO) procedure. RESULTS: Performance of breath biomarker algorithms was determined in three groups: breast cancer on biopsy versus normal screening mammograms (81.8% sensitivity, 70.0% specificity, accuracy [C-statistic value] 79%, 73% on LOO, negative predictive value 99.9%); normal versus abnormal screening mammograms (86.5% sensitivity, 66.7% specificity, accuracy 83%, 62% on LOO); and cancer versus no cancer on breast biopsy (75.8% sensitivity, 74.0% specificity, accuracy 78%, 67% on LOO). CONCLUSIONS: A pilot study of a six-minute point-of-care breath test for volatile biomarkers accurately identified women with breast cancer and with abnormal mammograms. Breath testing could potentially reduce the number of needless mammograms without loss of diagnostic sensitivity.

  11. Automatic classification for mammogram backgrounds based on bi-rads complexity definition and on a multi content analysis framework

    Science.gov (United States)

    Wu, Jie; Besnehard, Quentin; Marchessoux, Cédric

    2011-03-01

    Clinical studies for the validation of new medical imaging devices require hundreds of images. An important step in creating and tuning the study protocol is the classification of images into "difficult" and "easy" cases. This consists of classifying the image based on features like the complexity of the background and the visibility of the disease (lesions). Therefore, an automatic medical background classification tool for mammograms would help in such clinical studies. This classification tool is based on a multi-content analysis framework (MCA) which was first developed to recognize the image content of computer screenshots. With the implementation of new texture features and a defined breast density scale, the MCA framework is able to automatically classify digital mammograms with satisfying accuracy. The BI-RADS (Breast Imaging Reporting and Data System) density scale is used for grouping the mammograms; it standardizes mammography reporting terminology and the assessment and recommendation categories. Selected features are input into a decision-tree classification scheme in the MCA framework, the so-called "weak classifier" (any classifier with a global error rate below 50%). With the AdaBoost iteration algorithm, these "weak classifiers" are combined into a "strong classifier" (a classifier with a low global error rate) for classifying one category. The classification results for one "strong classifier" show good accuracy with high true-positive rates. For the four categories, the results are: TP = 90.38%, TN = 67.88%, FP = 32.12% and FN = 9.62%.
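
    The weak-classifier / AdaBoost idea summarised above corresponds to standard boosting of shallow decision trees; a minimal scikit-learn sketch is shown here, with random placeholder features and labels standing in for the texture features and for membership of one BI-RADS density category.

        import numpy as np
        from sklearn.ensemble import AdaBoostClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        X = rng.random((200, 12))            # texture/density features per mammogram (placeholder)
        y = rng.integers(0, 2, 200)          # 1 = belongs to this density category (placeholder)

        # AdaBoost combines many weak learners (by default depth-1 decision stumps,
        # each only slightly better than chance) into one strong classifier.
        strong = AdaBoostClassifier(n_estimators=100)
        print(cross_val_score(strong, X, y, cv=5).mean())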

  12. Multiplexed wavelet transform technique for detection of microcalcification in digitized mammograms.

    Science.gov (United States)

    Mini, M G; Devassia, V P; Thomas, Tessamma

    2004-12-01

    Wavelet transform (WT) is a potential tool for the detection of microcalcifications, an early sign of breast cancer. This article describes the implementation and evaluates the performance of two novel WT-based schemes for the automatic detection of clustered microcalcifications in digitized mammograms. Employing a one-dimensional WT technique that utilizes the pseudo-periodicity property of image sequences, the proposed algorithms achieve high detection efficiency and low processing memory requirements. The detection is achieved from the parent-child relationship between the zero-crossings [Marr-Hildreth (M-H) detector] /local extrema (Canny detector) of the WT coefficients at different levels of decomposition. The detected pixels are weighted before the inverse transform is computed, and they are segmented by simple global gray level thresholding. Both detectors produce 95% detection sensitivity, even though there are more false positives for the M-H detector. The M-H detector preserves the shape information and provides better detection sensitivity for mammograms containing widely distributed calcifications.

  13. Three-Class Mammogram Classification Based on Descriptive CNN Features

    Directory of Open Access Journals (Sweden)

    M. Mohsin Jadoon

    2017-01-01

    Full Text Available In this paper, a novel classification technique for a large data set of mammograms using a deep learning method is proposed. The proposed model targets a three-class classification study (normal, malignant, and benign cases). In our model we present two methods, namely convolutional neural network-discrete wavelet (CNN-DW) and convolutional neural network-curvelet transform (CNN-CT). An augmented data set is generated by using mammogram patches. To enhance the contrast of mammogram images, the data set is filtered by contrast limited adaptive histogram equalization (CLAHE). In the CNN-DW method, enhanced mammogram images are decomposed into four subbands by means of the two-dimensional discrete wavelet transform (2D-DWT), while in the second method the discrete curvelet transform (DCT) is used. In both methods, dense scale invariant features (DSIFT) are extracted for all subbands. An input data matrix containing these subband features of all the mammogram patches is created and processed as input to a convolutional neural network (CNN). A softmax layer and a support vector machine (SVM) layer are used to train the CNN for classification. The proposed methods have been compared with existing methods in terms of accuracy rate, error rate, and various validation assessment measures. CNN-DW and CNN-CT achieved accuracy rates of 81.83% and 83.74%, respectively. Simulation results clearly validate the significance and impact of our proposed model as compared to other well-known existing techniques.

  14. Computerized image analysis: Texture-field orientation method for pectoral muscle identification on MLO-view mammograms

    International Nuclear Information System (INIS)

    Zhou Chuan; Wei Jun; Chan, Heang-Ping; Paramagul, Chintana; Hadjiiski, Lubomir M.; Sahiner, Berkman; Douglas, Julie A.

    2010-01-01

    Purpose: To develop a new texture-field orientation (TFO) method that combines a priori knowledge, local and global information for the automated identification of pectoral muscle on mammograms. Methods: The authors designed a gradient-based directional kernel (GDK) filter to enhance the linear texture structures, and a gradient-based texture analysis to extract a texture orientation image that represented the dominant texture orientation at each pixel. The texture orientation image was enhanced by a second GDK filter for ridge point extraction. The extracted ridge points were validated and the ridges that were less likely to lie on the pectoral boundary were removed automatically. A shortest-path finding method was used to generate a probability image that represented the likelihood that each remaining ridge point lay on the true pectoral boundary. Finally, the pectoral boundary was tracked by searching for the ridge points with the highest probability lying on the pectoral boundary. A data set of 130 MLO-view digitized film mammograms (DFMs) from 65 patients was used to train the TFO algorithm. An independent data set of 637 MLO-view DFMs from 562 patients was used to evaluate its performance. Another independent data set of 92 MLO-view full field digital mammograms (FFDMs) from 92 patients was used to assess the adaptability of the TFO algorithm to FFDMs. The pectoral boundary detection accuracy of the TFO method was quantified by comparison with an experienced radiologist's manually drawn pectoral boundary using three performance metrics: The percent overlap area (POA), the Hausdorff distance (Hdist), and the average distance (AvgDist). Results: The mean and standard deviation of POA, Hdist, and AvgDist were 95.0±3.6%, 3.45±2.16 mm, and 1.12±0.82 mm, respectively. For the POA measure, 91.5%, 97.3%, and 98.9% of the computer detected pectoral muscles had POA larger than 90%, 85%, and 80%, respectively. For the distance measures, 85.4% and 98.0% of the

  15. Effect of JPEG2000 mammogram compression on microcalcifications segmentation

    International Nuclear Information System (INIS)

    Georgiev, V.; Arikidis, N.; Karahaliou, A.; Skiadopoulos, S.; Costaridou, L.

    2012-01-01

    The purpose of this study is to investigate the effect of mammographic image compression on the automated segmentation of individual microcalcifications. The dataset consisted of individual microcalcifications of 105 clusters originating from mammograms of the Digital Database for Screening Mammography. A JPEG2000 wavelet-based compression algorithm was used for compressing mammograms at 7 compression ratios (CRs): 10:1, 20:1, 30:1, 40:1, 50:1, 70:1 and 100:1. A gradient-based active contours segmentation algorithm was employed for segmentation of microcalcifications as depicted on original and compressed mammograms. The performance of the microcalcification segmentation algorithm on original and compressed mammograms was evaluated by means of the area overlap measure (AOM) and distance differentiation metrics (d_mean and d_max), by comparing automatically derived microcalcification borders to those manually defined by an expert radiologist. The AOM monotonically decreased as CR increased, while the d_mean and d_max metrics monotonically increased with CR. The performance of the segmentation algorithm on original mammograms was (mean±standard deviation): AOM = 0.91±0.08, d_mean = 0.06±0.05 and d_max = 0.45±0.20, while on 40:1 compressed images the algorithm's performance was: AOM = 0.69±0.15, d_mean = 0.23±0.13 and d_max = 0.92±0.39. Mammographic image compression deteriorates the performance of the segmentation algorithm, influencing the quantification of individual microcalcification morphological properties and subsequently affecting computer aided diagnosis of microcalcification clusters. (authors)
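
    A minimal sketch of the two evaluation metrics named above is given below, with AOM computed as the intersection-over-union of the automatic and manual masks and the distance metrics taken as the mean and maximum distance from automatic border pixels to the nearest manual border pixel; the paper's exact definitions and the 0.05 mm pixel size used here are assumptions.

        import numpy as np
        from scipy import ndimage

        def area_overlap(auto_mask, manual_mask):
            auto_mask, manual_mask = auto_mask.astype(bool), manual_mask.astype(bool)
            inter = np.logical_and(auto_mask, manual_mask).sum()
            union = np.logical_or(auto_mask, manual_mask).sum()
            return inter / union

        def border_distances(auto_mask, manual_mask, pixel_mm=0.05):
            auto_mask, manual_mask = auto_mask.astype(bool), manual_mask.astype(bool)
            manual_border = manual_mask ^ ndimage.binary_erosion(manual_mask)
            auto_border = auto_mask ^ ndimage.binary_erosion(auto_mask)
            dist_mm = ndimage.distance_transform_edt(~manual_border) * pixel_mm
            d = dist_mm[auto_border]
            return d.mean(), d.max()        # d_mean, d_max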

  16. Automatic correspondence detection in mammogram and breast tomosynthesis images

    Science.gov (United States)

    Ehrhardt, Jan; Krüger, Julia; Bischof, Arpad; Barkhausen, Jörg; Handels, Heinz

    2012-02-01

    Two-dimensional mammography is the major imaging modality in breast cancer detection. A disadvantage of mammography is the projective nature of this imaging technique. Tomosynthesis is an attractive modality with the potential to combine the high contrast and high resolution of digital mammography with the advantages of 3D imaging. In order to facilitate diagnostics and treatment in the current clinical work-flow, correspondences between tomosynthesis images and previous mammographic exams of the same women have to be determined. In this paper, we propose a method to detect correspondences in 2D mammograms and 3D tomosynthesis images automatically. In general, this 2D/3D correspondence problem is ill-posed, because a point in the 2D mammogram corresponds to a line in the 3D tomosynthesis image. The goal of our method is to detect the "most probable" 3D position in the tomosynthesis images corresponding to a selected point in the 2D mammogram. We present two alternative approaches to solve this 2D/3D correspondence problem: a 2D/3D registration method and a 2D/2D mapping between mammogram and tomosynthesis projection images with a following back projection. The advantages and limitations of both approaches are discussed and the performance of the methods is evaluated qualitatively and quantitatively using a software phantom and clinical breast image data. Although the proposed 2D/3D registration method can compensate for moderate breast deformations caused by different breast compressions, this approach is not suitable for clinical tomosynthesis data due to the limited resolution and blurring effects perpendicular to the direction of projection. The quantitative results show that the proposed 2D/2D mapping method is capable of detecting corresponding positions in mammograms and tomosynthesis images automatically for 61 out of 65 landmarks. The proposed method can facilitate diagnosis, visual inspection and comparison of 2D mammograms and 3D tomosynthesis images for

  17. DETECTION OF MICROCALCIFICATION IN DIGITAL MAMMOGRAMS USING ONE DIMENSIONAL WAVELET TRANSFORM

    Directory of Open Access Journals (Sweden)

    T. Balakumaran

    2010-11-01

    Full Text Available Mammography is the most efficient method for early detection of breast cancer. Clusters of microcalcifications are an early sign of breast cancer, and their detection is key to improving prognosis. Microcalcifications appear in mammogram images as tiny localized granular points, which are often difficult to detect by the naked eye because of their small size. Automatic and accurate detection of microcalcifications has therefore received much attention from radiologists and physicians. An efficient route to automatic detection of clustered microcalcifications in digitized mammograms is the use of Computer Aided Diagnosis (CAD) systems. This paper presents a one-dimensional wavelet-based multiscale products scheme for microcalcification detection in mammogram images. The detection of microcalcifications is achieved by decomposing each line of the mammogram by a 1D wavelet transform into different frequency sub-bands, suppressing the low-frequency subband, and finally reconstructing the mammogram from the subbands containing only significant high-frequency features. The significant features are obtained by multiscale products. Preliminary results indicate that the proposed scheme is better at suppressing the background and detecting microcalcification clusters than other wavelet decomposition methods.

  18. Fibrocystic change of breast : relation with parenchymal pattern on mammogram and fibroadenoma

    International Nuclear Information System (INIS)

    Lee, Ki Yeol; Cha, In Ho; Kang, Eun Young; Kim, Jung Hyuk

    1996-01-01

    To determine the relationship between fibrocystic change, parenchymal pattern on mammogram, and fibroadenoma. Mammograms of 135 patients with histologically diagnosed fibrocystic disease after excisional biopsy were retrospectively analyzed and correlated with the pathologic specimens. Classification of the parenchymal pattern was based on Wolfe's method. On mammogram, we observed an abnormality in 88 of the 135 cases; the 135 cases comprised 70 of DY, 30 of P2, 20 of P1 and 15 of N1 according to Wolfe's parenchymal patterns. Among the 88 abnormal cases we observed 37 cases of mass with clear boundaries, five cases of mass with unclear boundaries, 22 with clustered microcalcifications, six with macrocalcifications and 18 with asymmetric dense breast. Histologic examination revealed a varying composition of stromal fibrosis, epithelial hyperplasia, cyst formation, apocrine metaplasia, etc. Histologically, fibroadenomatoid change in 18 cases appeared as a radiopaque mass on mammogram; the mass was well-defined in all except three of these cases. Fibrocystic disease was prevalent in Wolfe's P2 and DY patterns (about 80%). About 40% of fibrocystic changes appearing as a well-defined mass on mammogram showed fibroadenomatoid change histologically and were difficult to differentiate from fibroadenoma. Fibrocystic disease should therefore be included in the differential diagnosis of a well-defined mass on mammogram

  19. Effect on sensitivity and specificity of mammography screening with or without comparison of old mammograms

    International Nuclear Information System (INIS)

    Thurfjell, M.G.; Vitak, B.; Azavedo, E.; Svane, G.; Thurfjell, E.

    2000-01-01

    In order to evaluate the effect of old mammograms on the specificity and sensitivity of radiologists in mammography screening, one hundred and fifty sets of screening mammograms were examined by 3 experienced screeners twice: once without and once in comparison with older mammograms. The films came from a population-based screening done during the first half of 1994 and comprised all 35 cancers detected during screening in 1994, 12/24 interval cancers, 14/34 cancers detected in the following screening and 89 normal mammograms. Without old mammograms, the screeners detected an average of 40.3 cancers (range 37-42), with a specificity of 87% (85-88%). With old mammograms, the screeners detected 37.7 cancers (range 34-42) with a specificity of 96% (94-99%). The change in detection rate was not significant. However, the increase in specificity was significant for each screener. Mammography screening with old mammograms available for comparison decreased the false-positive recall rate. The effect on sensitivity, however, was unclear

  20. Association between mammogram density and background parenchymal enhancement of breast MRI

    Science.gov (United States)

    Aghaei, Faranak; Danala, Gopichandh; Wang, Yunzhi; Zarafshani, Ali; Qian, Wei; Liu, Hong; Zheng, Bin

    2018-02-01

    Breast density has been widely considered an important risk factor for breast cancer. The purpose of this study is to examine the association between mammogram density results and background parenchymal enhancement (BPE) of breast MRI. A dataset of breast MR images was acquired from 65 high-risk women. Based on mammography density (BIRADS) results, the dataset was divided into two groups of low and high breast density cases. The low-density group comprised 15 cases with mammographic density BIRADS 1 and 2, while the high-density group comprised 50 cases rated by radiologists as mammographic density BIRADS 3 and 4. A computer-aided detection (CAD) scheme was applied to segment and register the breast regions depicted on sequential images of the breast MRI scans. The CAD scheme computed 20 global BPE features from the two breast regions combined, from the left and right breast regions separately, and from the bilateral difference between the left and right breast regions. An image feature selection method, namely the CFS method, was applied to remove the most redundant features and select optimal features from the initial feature pool. Then, a logistic regression classifier was built using the optimal features to predict the mammogram density from the BPE features. Using a leave-one-case-out validation method, the classifier yields an accuracy of 82% and an area under the ROC curve of AUC = 0.81+/-0.09. Also, a box-plot based analysis shows a negative association between mammogram density results and BPE features in the MRI images. This study demonstrated a negative association between mammogram density and BPE of breast MRI images.
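
    A rough sketch of that classification step is shown below; scikit-learn's SelectKBest stands in for the CFS feature selector used in the study, and the feature matrix and labels are random placeholders for the 20 BPE features and the two density groups.

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.feature_selection import SelectKBest, f_classif
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import LeaveOneOut, cross_val_score

        rng = np.random.default_rng(0)
        X = rng.random((65, 20))             # 20 global BPE features per case (placeholder)
        y = rng.integers(0, 2, 65)           # 0 = low density, 1 = high density (placeholder)

        model = make_pipeline(SelectKBest(f_classif, k=8),
                              LogisticRegression(max_iter=1000))
        acc = cross_val_score(model, X, y, cv=LeaveOneOut()).mean()   # leave-one-case-out accuracy
        print(round(acc, 2))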

  1. Fibrocystic change of breast : relation with parenchymal pattern on mammogram and fibroadenoma

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Ki Yeol; Cha, In Ho; Kang, Eun Young; Kim, Jung Hyuk [Korea Univ. College of Medicine, Seoul (Korea, Republic of)

    1996-10-01

    To determine the relationship between fibrocystic change, parenchymal pattern on mammogram, and fibroadenoma. Mammograms of 135 patients with histologically diagnosed fibrocystic disease after excisional biopsy were retrospectively analyzed and correlated with the pathologic specimens. Classification of the parenchymal pattern was based on Wolfe's method. On mammogram, we observed an abnormality in 88 of the 135 cases; the 135 cases comprised 70 of DY, 30 of P2, 20 of P1 and 15 of N1 according to Wolfe's parenchymal patterns. Among the 88 abnormal cases we observed 37 cases of mass with clear boundaries, five cases of mass with unclear boundaries, 22 with clustered microcalcifications, six with macrocalcifications and 18 with asymmetric dense breast. Histologic examination revealed a varying composition of stromal fibrosis, epithelial hyperplasia, cyst formation, apocrine metaplasia, etc. Histologically, fibroadenomatoid change in 18 cases appeared as a radiopaque mass on mammogram; the mass was well-defined in all except three of these cases. Fibrocystic disease was prevalent in Wolfe's P2 and DY patterns (about 80%). About 40% of fibrocystic changes appearing as a well-defined mass on mammogram showed fibroadenomatoid change histologically and were difficult to differentiate from fibroadenoma. Fibrocystic disease should therefore be included in the differential diagnosis of a well-defined mass on mammogram.

  2. Search for lesions in mammograms: Statistical characterization of observer responses

    International Nuclear Information System (INIS)

    Bochud, Francois O.; Abbey, Craig K.; Eckstein, Miguel P.

    2004-01-01

    We investigate human performance for visually detecting simulated microcalcifications and tumors embedded in x-ray mammograms as a function of signal contrast and the number of possible signal locations. Our results show that performance degradation with an increasing number of locations is well approximated by signal detection theory (SDT) with the usual Gaussian assumption. However, more stringent statistical analysis finds a departure from Gaussian assumptions for the detection of microcalcifications. We investigated whether these departures from the SDT Gaussian model could be accounted for by an increase in human internal response correlations arising from the image-pixel correlations present in 1/f spectrum backgrounds and/or observer internal response distributions that departed from the Gaussian assumption. Results were consistent with a departure from the Gaussian response distributions and suggested that the human observer internal responses were more compact than the Gaussian distribution. Finally, we conducted a free search experiment where the signal could appear anywhere within the image. Results show that human performance in a multiple-alternative forced-choice experiment can be used to predict performance in the clinically realistic free search experiment when the investigator takes into account the search area and the observers' inherent spatial imprecision to localize the targets
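
    The Gaussian signal detection theory prediction referred to above has a standard closed form: with detectability d' and M possible locations, the probability of a correct decision is the integral of phi(x - d') * Phi(x)**(M-1) over x. The numerical sketch below only illustrates how accuracy degrades as the number of locations grows; it is the textbook M-alternative forced-choice formula, not the authors' own fitting procedure.

        import numpy as np
        from scipy.stats import norm
        from scipy.integrate import quad

        def mafc_percent_correct(d_prime, n_locations):
            """P(correct) for an M-alternative forced choice under Gaussian SDT."""
            integrand = lambda x: norm.pdf(x - d_prime) * norm.cdf(x) ** (n_locations - 1)
            pc, _ = quad(integrand, -np.inf, np.inf)
            return pc

        for m in (2, 4, 8, 16):
            print(m, round(mafc_percent_correct(1.5, m), 3))   # accuracy drops as M grows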

  3. Cancer on a mammogram is not memorable: readers remember their recalls and not cancers

    International Nuclear Information System (INIS)

    Pitman, Alexander G.; Kok, Phebe; Zentner, Lucila

    2012-01-01

    To determine if presence of cancer on a mammogram makes that mammogram more memorable. A total of 100 mammograms (25 cancers) were grouped into 5 sets of 20 cases. Set pairs were presented in five reads to eight radiologist readers. Readers were asked to 'clear' or 'call back' cases, and at post-baseline reads to indicate whether each case was 'new' or 'old ' (remembered from prior read). Two sets were presented only at baseline, to calculate each reader's false recollection rate. For cases presented more than once ('old' cases, 100 presentations) readers could have 'correct memory' or 'memory loss'. Memory performance was defined as odds ratio of correct memory to memory loss. Multivariate logistic data regression analysis identified predictors of memory performance from: reader, set, time since last read, presence of cancer, and whether the case was called back at the last read. Memory performance differed markedly between readers and reader identity was a highly significant predictor of memory performance. Presence of cancer was not a significant predictor of memory performance (odds ratio 0.77, 95% CI: 0.49–1.21). Whether the case was called back at the last read was a highly significant predictor (odds ratio 4.22, 95% CI: 2.70–6.61) for the model incorporating reader variability, and also the model without reader variability (odds ratio 2.67, 95% CI: 1.74–4.08). The only statistically significant predictor of radiologist memory for a mammogram was whether the radiologist 'called it back' at a prior reading round. Presence of cancer on a mammogram did not make it memorable.

  4. Locally adaptive decision in detection of clustered microcalcifications in mammograms

    Science.gov (United States)

    Sainz de Cea, María V.; Nishikawa, Robert M.; Yang, Yongyi

    2018-02-01

    In computer-aided detection or diagnosis of clustered microcalcifications (MCs) in mammograms, the performance often suffers not only from the presence of false positives (FPs) among the detected individual MCs but also from large variability in detection accuracy among different cases. To address this issue, we investigate a locally adaptive decision scheme in MC detection by exploiting the noise characteristics in a lesion area. Instead of developing a new MC detector, we propose a decision scheme on how to best decide whether a detected object in the detector output is an MC or not. We formulate the individual MCs as statistical outliers compared to the many noisy detections in a lesion area so as to account for the local image characteristics. To identify the MCs, we first consider a parametric method for outlier detection, the Mahalanobis distance detector, which is based on a multi-dimensional Gaussian distribution on the noisy detections. We also consider a non-parametric method which is based on a stochastic neighbor graph model of the detected objects. We demonstrated the proposed decision approach with two existing MC detectors on a set of 188 full-field digital mammograms (95 cases). The results, evaluated using free-response receiver operating characteristic (FROC) analysis, showed a significant improvement in detection accuracy by the proposed outlier decision approach over traditional thresholding (the partial area under the FROC curve increased from 3.95 to 4.25, with a statistically significant p-value), together with fewer FPs at a given sensitivity level. The proposed adaptive decision approach could not only reduce the number of FPs in detected MCs but also improve case-to-case consistency in detection.
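
    The parametric (Mahalanobis-distance) decision rule described above can be sketched as follows: the noisy detections in a lesion area are modelled with a multivariate Gaussian, and detected objects whose feature vectors are statistical outliers with respect to that model are kept as likely MCs. The feature set and the chi-square cutoff are assumptions for illustration.

        import numpy as np
        from scipy.stats import chi2

        def flag_outliers(features, alpha=0.01):
            """features: (n_detections, n_features) array for one lesion area."""
            mu = features.mean(axis=0)
            cov = np.cov(features, rowvar=False)
            inv_cov = np.linalg.pinv(cov)
            diff = features - mu
            d2 = np.einsum("ij,jk,ik->i", diff, inv_cov, diff)   # squared Mahalanobis distances
            cutoff = chi2.ppf(1 - alpha, df=features.shape[1])
            return d2 > cutoff          # True = outlier w.r.t. the noise, i.e. a likely MC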

  5. Improving work-up of the abnormal mammogram through organized assessment: results from the ontario breast screening program.

    Science.gov (United States)

    Quan, May Lynn; Shumak, Rene S; Majpruz, Vicky; Holloway, Claire M D; O'Malley, Frances P; Chiarelli, Anna M

    2012-03-01

    Women with an abnormal screening mammogram should ideally undergo an organized assessment to attain a timely diagnosis. This study evaluated outcomes of women undergoing work-up after abnormal mammogram through a formal breast assessment affiliate (BAA) program with explicit care pathways compared with usual care (UC) using developed quality indicators for screening mammography programs. Between January 1 and December 31, 2007, a total of 320,635 women underwent a screening mammogram through the Ontario Breast Screening Program (OBSP), of whom 25,543 had an abnormal result requiring further assessment. Established indicators assessing timeliness, appropriateness of follow-up, and biopsy rates were compared between women who were assessed through either a BAA or UC using χ² analysis. Work-up of the abnormal mammogram for patients screened through a BAA resulted in a greater proportion of women attaining a definitive diagnosis within the recommended time interval when a histologic diagnosis was required. In addition, use of other quality measures including specimen radiography for both core biopsies and surgical specimens and preoperative core needle biopsy was greater in BAA facilities. These findings support future efforts to increase the number of BAAs within the OBSP, because the pathways and reporting methods associated with them result in improvements in our ability to provide timely and appropriate care for women requiring work-up of an abnormal mammogram.

  6. Exposure parameters of mammograms with and without mass lesions from a South African breast care centre

    International Nuclear Information System (INIS)

    Acho, Sussan N.; Boonzaier, Willem P. E.; Nel, Ina F.

    2017-01-01

    In South African breast care centres, full-field digital mammography units provide breast imaging services to symptomatic and asymptomatic women simultaneously. This study evaluated the technical exposure parameters of 800 mammograms, of which 100 had obvious mass lesions in the fibro-glandular tissue. The average breast compression force of mammograms with mass lesions in the fibro-glandular tissue was 18.4% less than the average breast compression force of mammograms without mass lesions. The average mean glandular dose (MGD), tube potential (kVp) and compressed breast thickness (CBT) values were 2.14 mGy, 30.5 kVp and 63.9 mm, respectively, for mammograms with mass lesions, and 1.45 mGy, 29.6 kVp and 56.9 mm, respectively, for mammograms without mass lesions. Overall, the average MGD and mean CBT of mammograms with mass lesions were significantly higher compared to those without mass lesions (p < 0.05), although there was no significant difference in their tube potentials (p > 0.05). (authors)

  7. Clinical Image Evaluation of Film Mammograms in Korea: Comparison with the ACR Standard

    International Nuclear Information System (INIS)

    Gwak, Yeon Joo; Kim, Hye Jung; Kwak, Jin Young; Son, Eun Ju; Ko, Kyung Hee; Lee, Jin Hwa; Lim, Hyo Soon; Lee, You Jin; Park, Ji Won; Shin, Kyung Min; Jang, Yun-Jin

    2013-01-01

    The goal of this study is to compare the overall quality of film mammograms taken according to the Korean standards with the American College of Radiology (ACR) standard for clinical image evaluation and to identify means of improving mammography quality in Korea. Four hundred and sixty-eight sets of film mammograms were evaluated with respect to the Korean and ACR standards for clinical image evaluation. The pass and failure rates of mammograms were compared by medical facility type. Average scores in each category of the two standards were evaluated. Receiver operating characteristic curve analysis was used to identify an optimal Korean standard pass mark by taking the ACR standard as the reference standard. 93.6% (438/468) of mammograms passed the Korean standard, whereas only 80.1% (375/468) passed the ACR standard (p < 0.001). Non-radiologic private clinics had the lowest pass rate (88.1%: Korean standard, 71.8%: ACR standard) and the lowest total score (76.0) by the Korean standard. Average scores for positioning were lowest (19.3/29 by the Korean standard and 3.7/5 by the ACR standard). A cutoff score of 77.0 for the Korean standard was found to correspond to a pass level when the ACR standard was applied. We suggest that tighter regulations, such as raising the Korean pass mark, subtracting more for severe deficiencies, or considering a very low score in even a single category as a failure, are needed to improve the quality of mammography in Korea.

  8. Performance of computer-aided detection in false-negative screening mammograms of breast cancers

    International Nuclear Information System (INIS)

    Han, Boo Kyung; Kim, Ji Young; Shin, Jung Hee; Choe, Yeon Hyeon

    2004-01-01

    To analyze retrospectively the abnormalities visible on the false-negative screening mammograms of patients with breast cancer and to determine the performance of computer-aided detection (CAD) in the detection of cancers. Of 108 consecutive cases of breast cancer diagnosed over a period of 6 years for which previous screening mammograms were available, 32 had retrospectively visible abnormalities (at the locations where cancer later developed) on previous mammograms originally reported as negative. These 32 patients ranged in age from 38 to 72 years (mean 52 years). We analyzed their previous mammographic findings and assessed the ability of CAD to mark cancers in previous mammograms, according to the clinical presentation, the type of abnormality and the mammographic parenchymal density. In these 32 previous mammograms of breast cancers (20 asymptomatic, 12 symptomatic), the retrospectively visible abnormalities were identified as densities in 22, calcifications in 8, and densities with calcifications in 2. CAD marked abnormalities in 20 (63%) of the 32 cancers with false-negative screening mammograms; 14 (70%) of the 20 subsequent screening-detected cancers, 5 (50%) of the 10 interval cancers, and 1 (50%) of the 2 cancers palpable after the screening interval. CAD marked 12 (50%) of the 24 densities and 9 (90%) of the 10 calcifications. CAD marked abnormalities in 7 (50%) of the 14 predominantly fatty breasts and 13 (72%) of the 18 dense breasts. CAD-assisted diagnosis could potentially decrease the number of false-negative mammograms caused by the failure to recognize the cancer in the screening program, although its usefulness in the prevention of interval cancers appears to be limited.

  9. Vitamin D intake, month the mammogram was taken and mammographic density in Norwegian women aged 50-69.

    Directory of Open Access Journals (Sweden)

    Merete Ellingjord-Dale

    Full Text Available The role of vitamin D in breast cancer etiology is unclear. There is some, but inconsistent, evidence that vitamin D is associated with both breast cancer risk and mammographic density (MD. We evaluated the associations of MD with month the mammogram was taken, and with vitamin D intake, in a population of women from Norway--a country with limited sunlight exposure for a large part of the year.3114 women aged 50-69, who participated in the Norwegian Breast Cancer Screening Program (NBCSP in 2004 or 2006/07, completed risk factor and food frequency (FFQ questionnaires. Dietary and total (dietary plus supplements vitamin D, calcium and energy intakes were estimated by the FFQ. Month when the mammogram was taken was recorded on the mammogram. Percent MD was assessed using a computer assisted method (Madena, University of Southern California after digitization of the films. Linear regression models were used to investigate percent MD associations with month the mammogram was taken, and vitamin D and calcium intakes, adjusting for age, body mass index (BMI, study year, estrogen and progestin therapy (EPT, education, parity, calcium intakes and energy intakes.There was no statistical significant association between the month the mammogram was taken and percent MD. Overall, there was no association between percent MD and quartiles of total or dietary vitamin D intakes, or of calcium intake. However, analysis restricted to women aged <55 years revealed a suggestive inverse association between total vitamin D intake and percent MD (p for trend = 0.03.Overall, we found no strong evidence that month the mammogram was taken was associated with percent MD. We found no inverse association between vitamin D intake and percent MD overall, but observed a suggestive inverse association between dietary vitamin D and MD for women less than 55 years old.

  10. Computer aided monitoring of breast abnormalities in X-ray mammograms

    OpenAIRE

    Selvan, Arul; Saatchi, Reza; Ferris, Christine

    2011-01-01

    X-ray mammography is regarded as the most effective tool for the detection and diagnosis of breast cancer, but the interpretation of mammograms is a difficult and error-prone task. Computer-aided detection (CADe) systems address the problem that radiologists often miss signs of cancers that are retrospectively visible in mammograms. Furthermore, computer-aided diagnosis (CADx) systems assist the radiologist in the classification of mammographic lesions as benign or malignant [1]. This p...

  11. Can Australian radiographers assess screening mammograms accurately? Biennial follow-up from a four year prospective study and lesion analysis

    International Nuclear Information System (INIS)

    Moran, S.; Warren-Forward, H.

    2016-01-01

    Introduction: Globally, the role of the radiographer is changing; some countries have developed advanced roles with specific scopes of practice. Other countries, like Australia, are in the process of this change. This paper demonstrates the abilities of Australian radiographers in mammogram screen reading, highlighting some of their specific difficulties with different lesion types. Method: Six experienced radiographers participated in a prospective study, screen reading 2000 mammograms each between 2010 and 2011. This paper looks at the results of those same women at biennial re-screen. Analysis of the results included validation of normal results by negative follow-up screens and new cancers at biennial review; there is also analysis on the types of lesions detected and missed. Results: After biennial review, three cancers in 2013/2014 had been marked as abnormal by one radiographer two years prior, which increased her sensitivity from 64% to 85%. Sensitivity for the radiologists decreased from the assumed 100% to 95%. Radiographers appeared to be skilled in detection of calcifications and architectural distortions but had difficulty with non-specific densities. Conclusion: This study demonstrates the potential for Australian radiographers to enhance the accuracy of screen reading programs. - Highlights: • Radiographers have the potential to increase breast cancer detection rates. • Radiographers appear to be skilled at detecting calcifications. • Lesions commonly overlooked by radiographers could be targeted for training.

  12. Breast Cancer Detection with Gabor Features from Digital Mammograms

    Directory of Open Access Journals (Sweden)

    Yufeng Zheng

    2010-01-01

    Full Text Available A new breast cancer detection algorithm, named the “Gabor Cancer Detection” (GCD) algorithm and utilizing Gabor features, is proposed. Three major steps are involved in the GCD algorithm: preprocessing, segmentation (generating alarm segments), and classification (reducing false alarms). In preprocessing, a digital mammogram is down-sampled, quantized, denoised and enhanced. Nonlinear diffusion is used for noise suppression. In segmentation, a band-pass filter is formed by rotating a 1-D Gaussian filter (off center) in frequency space, termed a “Circular Gaussian Filter” (CGF). A CGF can be uniquely characterized by specifying a central frequency and a frequency band. A mass or calcification is a space-occupying lesion and usually appears as a bright region on a mammogram. The alarm segments (suspected masses/calcifications) can be extracted using a threshold that is decided adaptively from the histogram analysis of the CGF-filtered mammogram. In classification, a Gabor filter bank is formed with five bands by four orientations (horizontal, vertical, 45 and 135 degrees) in the Fourier frequency domain. For each mammographic image, twenty Gabor-filtered images are produced. A set of edge histogram descriptors (EHD) is then extracted from the 20 Gabor images for classification. An EHD signature is computed over the four orientations of the Gabor images in each band, and the five EHD signatures are then joined together to form an EHD feature vector of 20 dimensions. With the EHD features, the fuzzy C-means clustering technique and a k-nearest neighbor (KNN) classifier are used to reduce the number of false alarms. Experimental results on the DDSM database (University of South Florida) show the promise of the GCD algorithm in breast cancer detection, which achieved a TP (true positive) rate of 90% at FPI (false positives per image) = 1.21 in mass detection, and TP = 93% at FPI = 1.19 in calcification detection.
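
    The Gabor filter bank at the heart of the classification step (five frequency bands by four orientations, giving 20 filtered images per mammogram) can be sketched as below; the specific frequencies are assumed values, and the EHD, fuzzy C-means and KNN stages are omitted.

        import numpy as np
        from skimage.filters import gabor

        def gabor_bank(image):
            frequencies = (0.05, 0.10, 0.20, 0.30, 0.40)              # 5 bands (assumed values)
            orientations = (0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)   # 0, 45, 90, 135 degrees
            responses = []
            for f in frequencies:
                for theta in orientations:
                    real, imag = gabor(image, frequency=f, theta=theta)
                    responses.append(np.hypot(real, imag))            # magnitude response
            return responses     # 20 Gabor-filtered images, as in the abstract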

  13. The classification of normal screening mammograms

    Science.gov (United States)

    Ang, Zoey Z. Y.; Rawashdeh, Mohammad A.; Heard, Robert; Brennan, Patrick C.; Lee, Warwick; Lewis, Sarah J.

    2016-03-01

    Rationale and objectives: To understand how breast screen readers classify the difficulty of normal screening mammograms using a common lexicon describing normal appearances. Cases were also assessed on their suitability for a single-reader strategy. Materials and Methods: 15 breast readers were asked to interpret a test set of 29 normal screening mammogram cases and classify them by rating the difficulty of the case on a five-point Likert scale, identifying the salient features and assessing their suitability for single reading. Using the False Positive Fractions from a previous study, the 29 cases were classified into 10 "low", 10 "medium" and nine "high" difficulties. Data were analyzed with descriptive statistics. Spearman's correlation was used to test the strength of association between the difficulty of the cases and the readers' recommendation for a single reading strategy. Results: The ratings from readers in this study corresponded to the known difficulty level of cases for the 'low' and 'high' difficulty cases. Uniform ductal pattern and density, symmetrical mammographic features and the absence of micro-calcifications were the main reasons associated with 'low' difficulty cases. The 'high' difficulty cases were described as having 'dense breasts'. There was a statistically significant negative correlation between the difficulty of the cases and readers' recommendation for single reading (r = -0.475, P = 0.009). Conclusion: The findings demonstrated potential relationships between certain mammographic features and the difficulty for readers to classify mammograms as 'normal'. The standard Australian practice of double reading was deemed more suitable for most cases. There was an inverse moderate association between the difficulty of the cases and the recommendations for single reading.

  14. Iso-precision scaling of digitized mammograms to facilitate image analysis

    International Nuclear Information System (INIS)

    Karssmeijer, N.; van Erning, L.

    1991-01-01

    This paper reports on a 12-bit CCD camera equipped with a linear sensor of 4096 photodiodes which is used to digitize conventional mammographic films. An iso-precision conversion of the pixel values is performed to transform the image data to a scale on which the image noise is equal at each level. For this purpose, film noise and digitization noise have been determined as a function of optical density and pixel size. It appears that only at high optical densities is digitization noise comparable to or larger than film noise. The quantization error caused by compression of images recorded with 12 bits per pixel to 8-bit images by an iso-precision conversion has been calculated as a function of the number of quantization levels. For mammograms digitized in a 4096² matrix the additional error caused by such a scale transform is only about 1.5 percent. An iso-precision scale transform can be advantageous when automated procedures for quantitative image analysis are developed. Especially when detection of signals in noise is the aim, a constant noise level over the whole pixel value range is very convenient. This is demonstrated by applying local thresholding to detect small microcalcifications. Results are compared to those obtained by using logarithmic or linearized scales.
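
    A minimal sketch of the iso-precision idea is given below, under an assumed noise model: pixel values are remapped so that equal steps on the output scale correspond to roughly equal noise units, and the 12-bit range is then requantized to 8 bits. The paper derives the noise from measured film and digitization noise; the sigma(v) model here is only a placeholder.

```python
import numpy as np

def iso_precision_lut(levels_in=4096, levels_out=256, sigma=None):
    """Build a lookup table mapping 12-bit values to 8 bits such that each
    output step spans roughly a constant multiple of the local noise sigma.
    sigma(v) is a hypothetical noise model; the original work measured it
    from film and digitizer noise as a function of optical density."""
    v = np.arange(levels_in, dtype=float)
    if sigma is None:
        sigma = 1.0 + 0.02 * np.sqrt(v)            # placeholder noise model
    # cumulative "precision coordinate": equal increments = equal noise units
    t = np.cumsum(1.0 / sigma)
    t = (t - t[0]) / (t[-1] - t[0])
    return np.round(t * (levels_out - 1)).astype(np.uint8)

if __name__ == "__main__":
    lut = iso_precision_lut()
    raw = np.random.default_rng(1).integers(0, 4096, size=(4, 4))
    print(lut[raw])    # 8-bit iso-precision image
```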

  15. A Single Sided Edge Marking Method for Detecting Pectoral Muscle in Digital Mammograms

    Directory of Open Access Journals (Sweden)

    G. Toz

    2018-02-01

    Full Text Available In the computer-assisted diagnosis of breast cancer, the removal of pectoral muscle from mammograms is very important. In this study, a new method, called the Single-Sided Edge Marking (SSEM) technique, is proposed for the identification of the pectoral muscle border in mammograms. 60 mammograms from the INbreast database were used to test the proposed method. The results obtained were compared for false positive rate, false negative rate, and sensitivity using the ground truth values pre-determined by radiologists for the same images. Accordingly, it has been shown that the proposed method can detect the pectoral muscle border with an average sensitivity of 95.6%.

  16. Improved Classification of Mammograms Following Idealized Training

    Science.gov (United States)

    Hornsby, Adam N.; Love, Bradley C.

    2014-01-01

    People often make decisions by stochastically retrieving a small set of relevant memories. This limited retrieval implies that human performance can be improved by training on idealized category distributions (Giguère & Love, 2013). Here, we evaluate whether the benefits of idealized training extend to categorization of real-world stimuli, namely classifying mammograms as normal or tumorous. Participants in the idealized condition were trained exclusively on items that, according to a norming study, were relatively unambiguous. Participants in the actual condition were trained on a representative range of items. Despite being exclusively trained on easy items, idealized-condition participants were more accurate than those in the actual condition when tested on a range of item types. However, idealized participants experienced difficulties when test items were very dissimilar from training cases. The benefits of idealization, attributable to reducing noise arising from cognitive limitations in memory retrieval, suggest ways to improve real-world decision making. PMID:24955325

  17. Improved Classification of Mammograms Following Idealized Training.

    Science.gov (United States)

    Hornsby, Adam N; Love, Bradley C

    2014-06-01

    People often make decisions by stochastically retrieving a small set of relevant memories. This limited retrieval implies that human performance can be improved by training on idealized category distributions (Giguère & Love, 2013). Here, we evaluate whether the benefits of idealized training extend to categorization of real-world stimuli, namely classifying mammograms as normal or tumorous. Participants in the idealized condition were trained exclusively on items that, according to a norming study, were relatively unambiguous. Participants in the actual condition were trained on a representative range of items. Despite being exclusively trained on easy items, idealized-condition participants were more accurate than those in the actual condition when tested on a range of item types. However, idealized participants experienced difficulties when test items were very dissimilar from training cases. The benefits of idealization, attributable to reducing noise arising from cognitive limitations in memory retrieval, suggest ways to improve real-world decision making.

  18. Dynamic multiple thresholding breast boundary detection algorithm for mammograms

    International Nuclear Information System (INIS)

    Wu, Yi-Ta; Zhou Chuan; Chan, Heang-Ping; Paramagul, Chintana; Hadjiiski, Lubomir M.; Daly, Caroline Plowden; Douglas, Julie A.; Zhang Yiheng; Sahiner, Berkman; Shi Jiazheng; Wei Jun

    2010-01-01

    Purpose: Automated detection of the breast boundary is one of the fundamental steps for computer-aided analysis of mammograms. In this study, the authors developed a new dynamic multiple thresholding based breast boundary (MTBB) detection method for digitized mammograms. Methods: A large data set of 716 screen-film mammograms (442 CC view and 274 MLO view) obtained from consecutive cases of an Institutional Review Board approved project was used. An experienced breast radiologist manually traced the breast boundary on each digitized image using a graphical interface to provide a reference standard. The initial breast boundary (MTBB-Initial) was obtained by dynamically adapting the threshold to the gray level range in local regions of the breast periphery. The initial breast boundary was then refined by using gradient information from horizontal and vertical Sobel filtering to obtain the final breast boundary (MTBB-Final). The accuracy of the breast boundary detection algorithm was evaluated by comparison with the reference standard using three performance metrics: the Hausdorff distance (HDist), the average minimum Euclidean distance (AMinDist), and the area overlap measure (AOM). Results: In comparison with the authors' previously developed gradient-based breast boundary (GBB) algorithm, it was found that 68%, 85%, and 94% of images had HDist errors less than 6 pixels (4.8 mm) for GBB, MTBB-Initial, and MTBB-Final, respectively. 89%, 90%, and 96% of images had AMinDist errors less than 1.5 pixels (1.2 mm) for GBB, MTBB-Initial, and MTBB-Final, respectively. 96%, 98%, and 99% of images had AOM values larger than 0.9 for GBB, MTBB-Initial, and MTBB-Final, respectively. The improvement by the MTBB-Final method was statistically significant for all the evaluation measures by the Wilcoxon signed rank test (p<0.0001). Conclusions: The MTBB approach that combined dynamic multiple thresholding and gradient information provided better performance than the gradient-based breast boundary (GBB) detection algorithm.

  19. ANALISA PERBANDINGAN METODE SEGMENTASI CITRA PADA CITRA MAMMOGRAM (Comparative Analysis of Image Segmentation Methods on Mammogram Images)

    Directory of Open Access Journals (Sweden)

    Toni Arifin

    2016-09-01

    Full Text Available Abstract Cancer is a disease with a high prevalence in the world; as many as 8.2 million people have died of cancer. A common cancer occurring in women is breast cancer, a malignancy derived from glandular cells, gland ducts and the supporting tissues of the breast. There are many ways of detecting breast cancer, one of which is mammography, which examines the human breast using low-dose X-rays. The resulting mammogram images can be analysed with image processing, so that observation does not take a long time and observer error can be reduced. One step of image processing is image segmentation, an important stage in image analysis, and a suitable segmentation method is therefore needed. This study compares two segmentation methods for mammogram images, the Watershed method and the Otsu method, and assesses image quality by calculating the signal-to-noise ratio and the running time of each method. The results show a signal-to-noise ratio of 7.475 dB for the Watershed method and 6.197 dB for the Otsu method, so the Watershed method performs better than the Otsu method; in terms of running time, the Watershed method (0.016 seconds) is also faster than the Otsu method.
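
    A comparison of this kind can be reproduced in outline with scikit-image, as in the sketch below. It is not the study's code: the test image, the seed-selection rule for the marker-controlled watershed, and the simple foreground/background SNR definition are all assumptions made for illustration.

```python
import time
import numpy as np
from skimage import data, filters, segmentation

def simple_snr(image, mask):
    """Crude SNR in dB: mean of the segmented foreground over the standard
    deviation of the background (one of several possible definitions; the
    study does not spell out its formula)."""
    fg, bg = image[mask], image[~mask]
    return 20.0 * np.log10(fg.mean() / (bg.std() + 1e-9))

image = data.camera().astype(float)            # stand-in for a mammogram

t0 = time.perf_counter()
otsu_mask = image > filters.threshold_otsu(image)
t_otsu = time.perf_counter() - t0

t0 = time.perf_counter()
markers = np.zeros_like(image, dtype=int)      # marker-controlled watershed
markers[image < np.percentile(image, 10)] = 1  # background seeds
markers[image > np.percentile(image, 90)] = 2  # foreground seeds
ws_mask = segmentation.watershed(filters.sobel(image), markers) == 2
t_ws = time.perf_counter() - t0

print(f"Otsu      SNR {simple_snr(image, otsu_mask):5.2f} dB in {t_otsu:.4f} s")
print(f"Watershed SNR {simple_snr(image, ws_mask):5.2f} dB in {t_ws:.4f} s")
```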

  20. Flu Shots, Mammogram, and the Perception of Probabilities

    NARCIS (Netherlands)

    Carman, K.G.; Kooreman, P.

    2010-01-01

    We study individuals’ decisions to decline or accept preventive health care interventions such as flu shots and mammograms. In particular, we analyze the role of perceptions of the effectiveness of the intervention, by eliciting individuals' subjective probabilities of sickness and survival, with

  1. Computerized detection of masses on mammograms by entropy maximization thresholding

    International Nuclear Information System (INIS)

    Kom, Guillaume; Tiedeu, Alain; Feudjio, Cyrille; Ngundam, J.

    2010-03-01

    In many cases, masses in X-ray mammograms are subtle and their detection can benefit from an automated system serving as a diagnostic aid. To this end, the authors propose in this paper a new computer-aided mass detection method for breast cancer diagnosis. The first step focuses on wavelet filter enhancement, which removes the bright background due to dense breast tissue and some film artifacts while preserving features and patterns related to the masses. In the second step, the enhanced image is processed by Entropy Maximization Thresholding (EMT) to obtain segmented masses. An efficiency of 98.181% is achieved by analyzing a database of 84 mammograms previously marked by radiologists and digitized at a pixel size of 343 μm x 343 μm. The segmentation results, in terms of size of detected masses, give a relative error on mass area of less than 8%. The performance of the proposed method has also been evaluated by means of receiver operating characteristic (ROC) analysis. This yielded areas under the ROC curve (Az) of 0.9224 and 0.9295 with and without the enhancement step, respectively. Furthermore, we observe that the EMT yields excellent segmentation results compared to those found in the literature. (author)
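
    Entropy maximization thresholding in the Kapur sense can be written in a few lines; the sketch below is a generic implementation of that idea rather than the authors' EMT code, and it is run on a synthetic image with an artificial bright "mass".

```python
import numpy as np

def max_entropy_threshold(image, bins=256):
    """Kapur-style maximum-entropy threshold: choose t maximizing the sum of
    the entropies of the background and foreground histograms."""
    hist, edges = np.histogram(image, bins=bins)
    p = hist.astype(float) / hist.sum()
    cdf = np.cumsum(p)
    best_t, best_h = edges[1], -np.inf
    for t in range(1, bins - 1):
        w0, w1 = cdf[t], 1.0 - cdf[t]
        if w0 < 1e-12 or w1 < 1e-12:
            continue
        p0, p1 = p[: t + 1] / w0, p[t + 1 :] / w1
        h0 = -np.sum(p0[p0 > 0] * np.log(p0[p0 > 0]))
        h1 = -np.sum(p1[p1 > 0] * np.log(p1[p1 > 0]))
        if h0 + h1 > best_h:
            best_h, best_t = h0 + h1, edges[t + 1]
    return best_t

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    img = rng.normal(0.3, 0.05, (128, 128))
    img[40:60, 40:60] += 0.4                    # synthetic "mass"
    t = max_entropy_threshold(img)
    print(f"threshold = {t:.3f}, mass pixels = {(img > t).sum()}")
```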

  2. Study of the Effects of Total Modulation Transfer Function Changes on Observer Performance Using Clinical Mammograms.

    Science.gov (United States)

    Bencomo, Jose Antonio Fagundez

    The main goal of this study was to relate physical changes in image quality, measured by the Modulation Transfer Function (MTF), to diagnostic accuracy. One hundred and fifty Kodak Min-R screen/film combination conventional craniocaudal mammograms obtained with the Pfizer Microfocus Mammographic system were selected from the files of the Department of Radiology at M.D. Anderson Hospital and Tumor Institute. The mammograms included 88 cases with a variety of benign diagnoses and 62 cases with a variety of malignant biopsy diagnoses. The average age of the patient population was 55 years. 70 cases presented calcifications, with 30 cases having calcifications smaller than 0.5 mm. 46 cases presented irregularly bordered masses larger than 1 cm. 30 cases presented smooth-bordered masses, with 20 larger than 1 cm. Four separate copies of the original images were made, each having a different change in the MTF, using a defocusing technique whereby copies of the original were obtained by light exposure through different thicknesses (spacing) of transparent film base. The mammograms were randomized and evaluated by three experienced mammographers for the degree of visibility of various anatomical breast structures and pathological lesions (masses and calcifications), subjective image quality, and mammographic interpretation. 3,000 separate evaluations were analyzed by several statistical techniques, including receiver operating characteristic curve analysis, the McNemar test for differences between proportions, and the Landis et al. method of agreement (weighted kappa) for ordinal categorical data. Results from the statistical analysis show: (1) There were no statistically significant differences in the diagnostic accuracy of the observers when diagnosing from mammograms with the same MTF. (2) There were no statistically significant differences in diagnostic accuracy for each observer when diagnosing from mammograms with the different MTFs used in the study. (3) There statistical

  3. Automatic breast cancer risk assessment from digital mammograms

    DEFF Research Database (Denmark)

    Karemore, Gopal Raghunath; Brandt, Sami; Karssemeijer, N

    Purpose: Textural characteristics of the breast tissue structure on mammogram have been shown to improve breast cancer risk assessment in several large studies. Currently, however, the texture is not used to assess risk in standard clinical procedures or involved in general breast cancer risk ass...

  4. Computer aided system for segmentation and visualization of microcalcifications in digital mammograms

    International Nuclear Information System (INIS)

    Reljin, B.; Reljin, I.; Milosevic, Z.; Stojic, T.

    2009-01-01

    Two methods for segmentation and visualization of microcalcifications in digital or digitized mammograms are described. The first method is based on modern mathematical morphology, while the second one uses a multifractal approach. In the first method, by using an appropriate combination of morphological operations, high local contrast enhancement, followed by significant suppression of background tissue irrespective of its radiological density, is obtained. Through an iterative procedure, this method strongly emphasizes only small bright details, i.e. possible microcalcifications. In the multifractal approach, corresponding multifractal 'images' are created from the initial mammogram image, from which a radiologist has the freedom to change the level of segmentation. An appropriate user-friendly computer-aided visualization (CAV) system with the two methods embedded has been realized. The interactive approach enables the physician to control the level and the quality of segmentation. The suggested methods were tested on mammograms from the MIAS database as a gold standard, and on images from clinical practice, using digitized films and digital images from a full-field digital mammography unit. (authors)
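
    One simple morphological combination that emphasizes small bright details while suppressing the background is the white top-hat; the sketch below illustrates that idea and is not the authors' exact operator sequence. The structuring-element radius and the percentile kept are arbitrary, a synthetic gradient image with point-like spikes stands in for a mammogram, and a recent scikit-image (with the footprint parameter) is assumed.

```python
import numpy as np
from skimage.morphology import white_tophat, disk

def enhance_microcalcifications(image, radius=5, keep_percentile=99.5):
    """White top-hat keeps only bright details smaller than the structuring
    element, which suppresses the slowly varying background tissue; an
    iterative scheme could repeat this with shrinking radii."""
    residue = white_tophat(image, footprint=disk(radius))
    mask = residue > np.percentile(residue, keep_percentile)
    return residue, mask

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    background = np.outer(np.linspace(0.2, 0.8, 256), np.ones(256))  # dense-tissue gradient
    image = background + rng.normal(0, 0.01, (256, 256))
    for y, x in [(50, 60), (51, 180), (200, 120)]:
        image[y, x] += 0.3                        # point-like "calcifications"
    _, mask = enhance_microcalcifications(image)
    print("candidate calcification pixels:", int(mask.sum()))
```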

  5. A retrospective study of the performance of radiographers in interpreting screening mammograms

    International Nuclear Information System (INIS)

    Moran, S.; Warren-Forward, H.

    2011-01-01

    Purpose: This paper provides data on the continued success of radiographers in reviewing mammograms with similar accuracy to screen readers. Method: The participants consisted of 7 radiographers and 2 current official screen readers. Two hundred and fifty sets of mammograms from 2003 were used in this study. Each participant reviewed each set of mammograms as a Rescreen or Recall. Patient outcomes were assessed by following up the results of any histology or pathology tests in 2003 or the 2005/2006 screening results. Results: The screen readers' sensitivities ranged from 79% to 93% and their specificities ranged from 82% to 84%. The radiographer values ranged from 57% to 97% and 63% to 80%, respectively. Conclusion: The sensitivity and specificity values attained by some radiographers were equivalent to those of both screen readers. Accuracy rates of the radiographers suggest that screen reading by selected and appropriately trained radiographers should be achievable in Australia.

  6. Mobile Versus Fixed Facility: Latinas' Attitudes and Preferences for Obtaining a Mammogram.

    Science.gov (United States)

    Scheel, John R; Tillack, Allison A; Mercer, Lauren; Coronado, Gloria D; Beresford, Shirley A A; Molina, Yamile; Thompson, Beti

    2018-01-01

    Mobile mammographic services have been proposed as a way to reduce Latinas' disproportionate late-stage presentation compared with white women by increasing their access to mammography. The aims of this study were to assess why Latinas may not use mobile mammographic services and to explore their preferences after using these services. Using a mixed-methods approach, a secondary analysis was conducted of baseline survey data (n = 538) from a randomized controlled trial to improve screening mammography rates among Latinas in Washington. Descriptive statistics and bivariate regression were used to characterize mammography location preferences and to test for associations with sociodemographic indices, health care access, and perceived breast cancer risk and beliefs. On the basis of these findings, a qualitative study (n = 18) was used to explore changes in perceptions after using mobile mammographic services. More Latinas preferred obtaining a mammogram at a fixed facility (52.3% [n = 276]) compared with having no preference (46.3% [n = 249]) and preferring mobile mammographic services (1.7% [n = 9]). Concerns about privacy and comfort (15.6% [n = 84]) and about general quality (10.6% [n = 57]) were common reasons for preferring a fixed facility. Those with no history of mammography preferred a fixed facility. Perceptions of mobile mammographic services improved after women obtained a mammogram through them. Although most Latinas preferred obtaining a mammogram at a fixed facility, positive experiences with mobile mammography services changed their attitudes toward them. These findings highlight the need to include community education when using mobile mammographic services to increase screening mammography rates in underserved communities. Copyright © 2017 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  7. Estimating average glandular dose by measuring glandular rate in mammograms

    International Nuclear Information System (INIS)

    Goto, Sachiko; Azuma, Yoshiharu; Sumimoto, Tetsuhiro; Eiho, Shigeru

    2003-01-01

    The glandular rate of the breast was objectively measured in order to calculate individual patient exposure dose (average glandular dose) in mammography. By employing image processing techniques and breast-equivalent phantoms with various glandular rate values, a conversion curve for pixel value to glandular rate can be determined by a neural network. Accordingly, the pixel values in clinical mammograms can be converted to the glandular rate value for each pixel. The individual average glandular dose can therefore be calculated using the individual glandular rates on the basis of the dosimetry method employed for quality control in mammography. In the present study, a data set of 100 craniocaudal mammograms from 50 patients was used to evaluate our method. The average glandular rate and average glandular dose of the data set were 41.2% and 1.79 mGy, respectively. The error in calculating the individual glandular rate can be estimated to be less than ±3%. When the calculation error of the glandular rate is taken into consideration, the error in the individual average glandular dose can be estimated to be 13% or less. We feel that our method for determining the glandular rate from mammograms is useful for minimizing subjectivity in the evaluation of patient breast composition. (author)
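
    The chain pixel value → glandular rate → average glandular dose can be sketched as below. The conversion table and the dose conversion factor g are invented placeholders: in the paper the conversion curve comes from breast-equivalent phantoms and a neural network, and the dose factors come from the standard dosimetry method used for quality control.

```python
import numpy as np

def pixel_to_glandular_rate(pixel_values, conversion):
    """Map pixel values to per-pixel glandular rate (%) via an interpolated
    conversion curve; here a made-up monotone table stands in for the
    phantom-calibrated, neural-network-derived curve of the paper."""
    xp, fp = conversion
    return np.interp(pixel_values, xp, fp)

def average_glandular_dose(glandular_rate_percent, incident_air_kerma_mGy, g_factor):
    """AGD = incident air kerma times a glandularity-dependent conversion
    factor; the g_factor passed in below is purely illustrative."""
    mean_glandularity = glandular_rate_percent.mean() / 100.0
    return incident_air_kerma_mGy * g_factor(mean_glandularity)

if __name__ == "__main__":
    conversion = (np.array([0, 1023, 2047, 4095]), np.array([0.0, 30.0, 60.0, 100.0]))
    rng = np.random.default_rng(4)
    mammo = rng.integers(800, 2500, size=(512, 512))
    rate = pixel_to_glandular_rate(mammo, conversion)

    def g(frac):
        # hypothetical conversion factor: dose factor falls as glandularity rises
        return 0.20 - 0.05 * frac

    agd = average_glandular_dose(rate, incident_air_kerma_mGy=10.0, g_factor=g)
    print(f"mean glandular rate {rate.mean():.1f}% -> AGD {agd:.2f} mGy")
```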

  8. Joint two-view information for computerized detection of microcalcifications on mammograms

    International Nuclear Information System (INIS)

    Sahiner, Berkman; Chan, H.-P.; Hadjiiski, Lubomir M.; Helvie, Mark A.; Paramagul, Chinatana; Ge Jun; Wei Jun; Zhou Chuan

    2006-01-01

    We are developing new techniques to improve the accuracy of computerized microcalcification detection by using the joint two-view information on craniocaudal (CC) and mediolateral-oblique (MLO) views. After cluster candidates were detected using a single-view detection technique, candidates on CC and MLO views were paired using their radial distances from the nipple. Candidate pairs were classified with a similarity classifier that used the joint information from both views. Each cluster candidate was also characterized by its single-view features. The outputs of the similarity classifier and the single-view classifier were fused and the cluster candidate was classified as a true microcalcification cluster or a false-positive (FP) using the fused two-view information. A data set of 116 pairs of mammograms containing microcalcification clusters and 203 pairs of normal images from the University of South Florida (USF) public database was used for training the two-view detection algorithm. The trained method was tested on an independent test set of 167 pairs of mammograms, which contained 71 normal pairs and 96 pairs with microcalcification clusters collected at the University of Michigan (UM). The similarity classifier had a very low FP rate for the test set at low and medium levels of sensitivity. However, the highest mammogram-based sensitivity that could be reached by the similarity classifier was 69%. The single-view classifier had a higher FP rate compared to the similarity classifier, but it could reach a maximum mammogram-based sensitivity of 93%. The fusion method combined the scores of these two classifiers so that the number of FPs was substantially reduced at relatively low and medium sensitivities, and a relatively high maximum sensitivity was maintained. For the malignant microcalcification clusters, at a mammogram-based sensitivity of 80%, the FP rates were 0.18 and 0.35 with the two-view fusion and single-view detection methods, respectively. When the
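
    The pairing step, matching CC and MLO candidates by their radial distance from the nipple, can be illustrated with a small helper like the one below; the coordinates and the tolerance are hypothetical, and the real system pairs cluster candidates produced by the single-view detector rather than hand-entered points.

```python
import numpy as np

def pair_candidates(cc_candidates, mlo_candidates, cc_nipple, mlo_nipple, tol_mm=15.0):
    """Pair cluster candidates from the CC and MLO views whose radial
    distances from the nipple agree within a tolerance (tolerance value
    chosen arbitrarily for this sketch)."""
    def radial(points, nipple):
        return np.linalg.norm(np.asarray(points) - np.asarray(nipple), axis=1)

    r_cc, r_mlo = radial(cc_candidates, cc_nipple), radial(mlo_candidates, mlo_nipple)
    pairs = []
    for i, rc in enumerate(r_cc):
        j = int(np.argmin(np.abs(r_mlo - rc)))        # closest MLO candidate in radius
        if abs(r_mlo[j] - rc) <= tol_mm:
            pairs.append((i, j, abs(r_mlo[j] - rc)))
    return pairs

if __name__ == "__main__":
    cc = [(40.0, 55.0), (90.0, 20.0)]       # candidate centroids (mm), CC view
    mlo = [(35.0, 60.0), (120.0, 80.0)]     # candidate centroids (mm), MLO view
    print(pair_candidates(cc, mlo, cc_nipple=(0.0, 30.0), mlo_nipple=(0.0, 40.0)))
```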

  9. Radiation scars on mammograms

    International Nuclear Information System (INIS)

    Otto, H.; Breining, H.; Knappschafts-Krankenhaus Essen

    1985-01-01

    Six patients with radiation scars are described. In each case the diagnosis was confirmed histologically; in five cases corresponding mammograms were available. The histological appearances of radiation scars are described and the radiological features are presented. These lesions can be diagnosed mammographically in vivo. Macroscopically, differentiation from a scirrhous carcinoma is not possible and therefore a radiation scar must always be excised; this also leads to definitive cure. On mammographic screening the incidence is 0.5 to 0.9 per thousand. The significance of radiation scars depends on the fact that they are pre-cancerous and therefore their detection is equivalent to the early diagnosis of a carcinoma, with the possibility of a complete cure. (orig.) [de]

  10. Volumetric quantification of the effect of aging and hormone replacement therapy on breast composition from digital mammograms

    International Nuclear Information System (INIS)

    Hammann-Kloss, J.S.; Bick, U.; Fallenberg, E.; Engelken, F.

    2014-01-01

    Objective: To assess the physiological changes in breast composition with aging using volumetric breast composition measurement from digital mammograms and to assess the effect of hormone replacement therapy (HRT). Methods: A total of 764 consecutive mammograms of 208 non-HRT-using women and 508 mammograms of 134 HRT-using women were analyzed using volumetric breast composition assessment software (Quantra™, Hologic Inc.). Fibroglandular tissue volume (FTV), breast volume (BV), and percent density (PD) were measured. For statistical analysis, women were divided into a premenopausal (<46 years), a perimenopausal (46–55 years), and a postmenopausal (>55 years) age group. More detailed graphical analysis was performed using smaller age brackets. Women using HRT were compared to age-matched controls not using HRT. Results: Women in the postmenopausal age group had a significantly lower FTV and PD and a significantly higher BV than women in the premenopausal age group (FTV: 77 vs. 120 cm³, respectively; PD: 16% vs. 28%, respectively; BV: 478 vs. 406 cm³, respectively; p < 0.01 for all). Median FTV was nearly stable in consecutive mammograms in the premenopausal and postmenopausal age groups, but declined at a rate of 3.9% per year in the perimenopausal period. Median PD was constant in the premenopausal and postmenopausal age groups and declined at a rate of 0.57% per year in the perimenopausal age group. BV continuously increased with age. Women using HRT throughout the study had a 5% higher PD than women not using HRT (22% vs. 17%, respectively; p < 0.001). Conclusions: Accurate knowledge of normal changes in breast composition is of particular interest nowadays due to the importance of breast density for breast cancer risk evaluation. FTV and PD change significantly during the perimenopausal period but remain relatively constant before and thereafter. Median total breast volume consistently increases with age and further contributes to changes in breast
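
    Percent density as reported by such volumetric tools is simply the fibroglandular volume over the breast volume; the small check below uses the medians quoted above (the quoted PD values are group medians rather than ratios of medians, so the agreement is only approximate).

```python
def percent_density(fibroglandular_cm3, breast_cm3):
    """PD (%) = 100 * FTV / BV, the quantity reported by volumetric
    breast composition software."""
    return 100.0 * fibroglandular_cm3 / breast_cm3

# Cross-check against the medians quoted above: 120/406 cm^3 (premenopausal)
# and 77/478 cm^3 (postmenopausal) give roughly 30% and 16%, close to the
# reported 28% and 16%.
print(round(percent_density(120, 406)), round(percent_density(77, 478)))
```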

  11. Reproducibility of computer-aided detection system in digital mammograms

    International Nuclear Information System (INIS)

    Kim, Seung Ja; Cho, Nariya; Cha, Joo Hee; Chung, Hye Kyung; Lee, Sin Ho; Cho, Kyung Soo; Kim, Sun Mi; Moon, Woo Kyung

    2005-01-01

    To evaluate the reproducibility of a computer-aided detection (CAD) system for digital mammograms. We applied the CAD system (ImageChecker M1000-DM, version 3.1; R2 Technology) to full-field digital mammograms. These mammograms were taken twice at an interval of 10-45 days (mean: 25 days) for 34 preoperative patients (breast cancer n=27, benign disease n=7, age range: 20-66 years, mean age: 47.9 years). On the mammograms, lesions were visible in 19 patients and were depicted as 15 masses and 12 calcification clusters. We analyzed the sensitivity, the false positive rate (FPR) and the reproducibility of the CAD marks. The broader sensitivities of the CAD system were 80% (12 of 15) and 67% (10 of 15) for masses, and those for calcification clusters were 100% (12 of 12). The strict sensitivities were 50% (15 of 30) and 50% (15 of 30) for masses and 92% (22 of 24) and 79% (19 of 24) for the clusters. The FPR for the masses was 0.21-0.22/image, the FPR for the clusters was 0.03-0.04/image, and the total FPR was 0.24-0.26/image. Among 132 mammography images, the proportion of identical images regardless of the existence of CAD marks was 59% (78 of 132), and the proportion of identical images with CAD marks was 22% (15 of 69). The reproducibility of the CAD marks for the true positive masses was 67% (12 of 18) and that for the true positive clusters was 71% (17 of 24). The reproducibility of CAD marks for the false positive masses was 8% (4 of 53), and for the false positive clusters 14% (1 of 7). The reproducibility of the total mass marks was 23% (16 of 71), and the reproducibility of the total cluster marks was 58% (18 of 31). The CAD system showed higher sensitivity and reproducibility of CAD marks for the calcification clusters, which are related to breast cancer. Yet the overall reproducibility of CAD marks was low; therefore, the CAD system must be applied with this limitation in mind.

  12. A similarity measure method combining location feature for mammogram retrieval.

    Science.gov (United States)

    Wang, Zhiqiong; Xin, Junchang; Huang, Yukun; Li, Chen; Xu, Ling; Li, Yang; Zhang, Hao; Gu, Huizi; Qian, Wei

    2018-05-28

    Breast cancer, the most common malignancy among women, has a high mortality rate in clinical practice. Early detection, diagnosis and treatment can greatly reduce breast cancer mortality. Mammogram retrieval can help doctors find early breast lesions effectively, and a reasonable feature set for image similarity measurement improves the accuracy of mammogram retrieval. This paper proposes a similarity measure method combining a location feature for mammogram retrieval. Firstly, the images are pre-processed, the regions of interest are detected and the lesions are segmented in order to obtain the center point and radius of each lesion. Then, the Coherent Point Drift method is used for image registration with a pre-defined standard image. The center point and radius of the lesions after registration are obtained and the standard location feature of the image is constructed. This standard location feature is used to compute the location similarity between each image pair, from the query image to each dataset image in the database. Next, the content features of the image are extracted, including the Histogram of Oriented Gradients, the Edge Direction Histogram, the Local Binary Pattern and the Gray Level Histogram, and the image pair content similarity is calculated using the Earth Mover's Distance. Finally, the location similarity and content similarity are fused to form the image fusion similarity, and the specified number of most similar images can be returned according to it. In the experiment, 440 mammograms from Chinese women in Northeast China are used as the database. When fusing 40% lesion location feature similarity and 60% content feature similarity, the results have obvious advantages: precision is 0.83, recall is 0.76, the comprehensive indicator is 0.79, satisfaction is 96.0%, the mean is 4.2 and the variance is 17.7. The results show that the precision and recall of this
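
    The final fusion step, 40% location similarity plus 60% content similarity, can be sketched as follows. The similarity functions themselves are simplified stand-ins (an exponential decay for location, a histogram intersection instead of the Earth Mover's Distance for content); only the weighting scheme is taken from the abstract.

```python
import numpy as np

def location_similarity(center_a, radius_a, center_b, radius_b):
    """Toy location similarity in [0, 1]: decays with the distance between
    lesion centres and the difference in radii after registration to the
    standard image (not the paper's exact formula)."""
    d = np.linalg.norm(np.asarray(center_a) - np.asarray(center_b))
    return float(np.exp(-d / 50.0) * np.exp(-abs(radius_a - radius_b) / 20.0))

def content_similarity(hist_a, hist_b):
    """Histogram-intersection stand-in for the EMD-based content similarity
    over HOG/EDH/LBP/GLH features described in the abstract."""
    a, b = np.asarray(hist_a, float), np.asarray(hist_b, float)
    return float(np.minimum(a / a.sum(), b / b.sum()).sum())

def fused_similarity(query, candidate, w_loc=0.4, w_content=0.6):
    """40% location + 60% content, the weighting reported to work best."""
    return (w_loc * location_similarity(query["center"], query["radius"],
                                        candidate["center"], candidate["radius"])
            + w_content * content_similarity(query["hist"], candidate["hist"]))

if __name__ == "__main__":
    q = {"center": (100, 120), "radius": 12, "hist": [4, 8, 6, 2]}
    db = [{"center": (105, 118), "radius": 11, "hist": [5, 7, 6, 2]},
          {"center": (240, 60), "radius": 30, "hist": [1, 1, 10, 8]}]
    ranked = sorted(db, key=lambda c: fused_similarity(q, c), reverse=True)
    print([round(fused_similarity(q, c), 3) for c in ranked])
```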

  13. Studies on computer-aided diagnosis systems for chest radiographs and mammograms (in Japanese)

    International Nuclear Information System (INIS)

    Hara, Takeshi

    2001-01-01

    This thesis describes computer-aided diagnosis (CAD) systems for chest radiographs and mammograms. Preprocessing and image processing methods for each CAD system include dynamic range compression and region segmentation techniques. A new pattern recognition technique combines genetic algorithms with template matching methods to detect lung nodules. A genetic algorithm was employed to select the optimal shape of simulated nodular shadows to be compared with real lesions on digitized chest images. Detection performance was evaluated using 332 chest radiographs from the database of the Japanese Society of Radiological Technology. Our average true-positive rate was 72.8% with an average of 11 false-positive findings per image. A new detection method using high-resolution digital images with 0.05 mm sampling is also proposed for the mammogram CAD system to detect very small microcalcifications. An automated classification method uses feature extraction based on fractal dimension analysis of masses. Using over 200 cases to evaluate the detection of mammographic masses and calcifications, the detection rates of masses and microcalcifications were 87% and 96%, with 1.5 and 1.8 false-positive findings per image, respectively. For the classification of benign versus malignant lesions, the Az values, defined as the areas under the ROC curves derived from the classification schemes for masses and microcalcifications, were 0.84 and 0.89, respectively. To demonstrate the practicality of these CAD systems in a computer-network environment, we propose using the mammogram CAD system via the Internet and WWW. A common gateway interface and server-client approach for the CAD system via the Internet will permit display of the CAD results on ordinary computers.

  14. Bayesian Maximum Entropy Based Algorithm for Digital X-ray Mammogram Processing

    Directory of Open Access Journals (Sweden)

    Radu Mutihac

    2009-06-01

    Full Text Available Basics of Bayesian statistics in inverse problems using the maximum entropy principle are summarized in connection with the restoration of positive, additive images from various types of data such as X-ray digital mammograms. An efficient iterative algorithm for image restoration from large data sets, based on the conjugate gradient method and Lagrange multipliers in nonlinear optimization of a specific potential function, was developed. The point spread function of the imaging system was determined by numerical simulations of inhomogeneous breast-like tissue with microcalcification inclusions of various opacities. The processed digital and digitized mammograms proved superior to their raw counterparts in terms of contrast, resolution, noise, and visibility of details.

  15. What Is a Mammogram and When Should I Get One?

    Science.gov (United States)

    Patient information page from the CDC explaining what a mammogram is and when women should get one; available in English and Spanish.

  16. New Embedded Denotes Fuzzy C-Mean Application for Breast Cancer Density Segmentation in Digital Mammograms

    Science.gov (United States)

    Othman, Khairulnizam; Ahmad, Afandi

    2016-11-01

    In this research we explore the application of a new normalized-denotes technique within advanced fast fuzzy c-means to the problem of segmenting different breast tissue regions in mammograms. The goal of the segmentation algorithm is to determine whether the new denotes fuzzy c-means algorithm can separate the different densities of the different breast patterns. The density segmentation is applied with multiple selected seed labels to provide hard constraints, where the seed labels are user defined. The new denotes fuzzy c-means approach has been explored on images of various imaging modalities, but not yet on large-format digital mammograms. Therefore, this project mainly focuses on using the normalized-denotes technique within fuzzy c-means to perform segmentation that increases the visibility of different breast densities in mammography images. Segmentation of the mammogram into different mammographic densities is useful for risk assessment and quantitative evaluation of density changes. Our proposed methodology for segmenting mammograms into different density-based categories has been tested on the MIAS and Trueta databases.
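
    A plain fuzzy c-means on pixel intensities, shown below, is the core update scheme that such density-segmentation methods build on; the seed-label (hard constraint) extension described in the abstract is not implemented here, and the three intensity clusters are synthetic.

```python
import numpy as np

def fuzzy_c_means(values, n_clusters=3, m=2.0, n_iter=50, seed=0):
    """Standard fuzzy c-means on 1-D pixel intensities (not the 'denotes'
    variant above, which adds user-defined seed labels on top of this)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(values, float).ravel()[None, :]          # shape (1, N)
    u = rng.random((n_clusters, x.shape[1]))
    u /= u.sum(axis=0, keepdims=True)
    for _ in range(n_iter):
        um = u ** m
        centers = (um @ x.T).ravel() / um.sum(axis=1)       # shape (C,)
        dist = np.abs(x - centers[:, None]) + 1e-12         # shape (C, N)
        u = 1.0 / (dist ** (2 / (m - 1)))
        u /= u.sum(axis=0, keepdims=True)
    return centers, u

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    fatty = rng.normal(0.25, 0.03, 3000)       # synthetic intensity clusters
    glandular = rng.normal(0.55, 0.04, 3000)
    dense = rng.normal(0.80, 0.03, 3000)
    centers, memberships = fuzzy_c_means(np.concatenate([fatty, glandular, dense]))
    print("cluster centres:", np.sort(np.round(centers, 2)))
```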

  17. Using x-ray mammograms to assist in microwave breast image interpretation.

    Science.gov (United States)

    Curtis, Charlotte; Frayne, Richard; Fear, Elise

    2012-01-01

    Current clinical breast imaging modalities include ultrasound, magnetic resonance (MR) imaging, and the ubiquitous X-ray mammography. Microwave imaging, which takes advantage of differing electromagnetic properties to obtain image contrast, shows potential as a complementary imaging technique. As an emerging modality, interpretation of 3D microwave images poses a significant challenge. MR images are often used to assist in this task, and X-ray mammograms are readily available. However, X-ray mammograms provide 2D images of a breast under compression, resulting in significant geometric distortion. This paper presents a method to estimate the 3D shape of the breast and locations of regions of interest from standard clinical mammograms. The technique was developed using MR images as the reference 3D shape with the future intention of using microwave images. Twelve breast shapes were estimated and compared to ground truth MR images, resulting in a skin surface estimation accurate to within an average Euclidean distance of 10 mm. The 3D locations of regions of interest were estimated to be within the same clinical area of the breast as corresponding regions seen on MR imaging. These results encourage investigation into the use of mammography as a source of information to assist with microwave image interpretation as well as validation of microwave imaging techniques.

  18. Volumetric breast density estimation from full-field digital mammograms.

    NARCIS (Netherlands)

    Engeland, S. van; Snoeren, P.R.; Huisman, H.J.; Boetes, C.; Karssemeijer, N.

    2006-01-01

    A method is presented for estimation of dense breast tissue volume from mammograms obtained with full-field digital mammography (FFDM). The thickness of dense tissue mapping to a pixel is determined by using a physical model of image acquisition. This model is based on the assumption that the breast

  19. Using X-Ray Mammograms to Assist in Microwave Breast Image Interpretation

    Directory of Open Access Journals (Sweden)

    Charlotte Curtis

    2012-01-01

    Full Text Available Current clinical breast imaging modalities include ultrasound, magnetic resonance (MR) imaging, and the ubiquitous X-ray mammography. Microwave imaging, which takes advantage of differing electromagnetic properties to obtain image contrast, shows potential as a complementary imaging technique. As an emerging modality, interpretation of 3D microwave images poses a significant challenge. MR images are often used to assist in this task, and X-ray mammograms are readily available. However, X-ray mammograms provide 2D images of a breast under compression, resulting in significant geometric distortion. This paper presents a method to estimate the 3D shape of the breast and locations of regions of interest from standard clinical mammograms. The technique was developed using MR images as the reference 3D shape with the future intention of using microwave images. Twelve breast shapes were estimated and compared to ground truth MR images, resulting in a skin surface estimation accurate to within an average Euclidean distance of 10 mm. The 3D locations of regions of interest were estimated to be within the same clinical area of the breast as corresponding regions seen on MR imaging. These results encourage investigation into the use of mammography as a source of information to assist with microwave image interpretation as well as validation of microwave imaging techniques.

  20. Quantitative assessment of breast density from digitized mammograms into Tabar's patterns

    International Nuclear Information System (INIS)

    Jamal, N; Ng, K-H; Looi, L-M; McLean, D; Zulfiqar, A; Tan, S-P; Liew, W-F; Shantini, A; Ranganathan, S

    2006-01-01

    We describe a semi-automated technique for the quantitative assessment of breast density from digitized mammograms in comparison with patterns suggested by Tabar. It was developed using the MATLAB-based graphical user interface applications. It is based on an interactive thresholding method, after a short automated method that shows the fibroglandular tissue area, breast area and breast density each time new thresholds are placed on the image. The breast density is taken as a percentage of the fibroglandular tissue to the breast tissue areas. It was tested in four different ways, namely by examining: (i) correlation of the quantitative assessment results with subjective classification, (ii) classification performance using the quantitative assessment technique, (iii) interobserver agreement and (iv) intraobserver agreement. The results of the quantitative assessment correlated well (r² = 0.92) with the subjective Tabar patterns classified by the radiologist (correctly classified 83% of digitized mammograms). The average kappa coefficient for the agreement between the readers was 0.63. This indicated moderate agreement between the three observers in classifying breast density using the quantitative assessment technique. The kappa coefficient of 0.75 for intraobserver agreement reflected good agreement between two sets of readings. The technique may be useful as a supplement to the radiologist's assessment in classifying mammograms into Tabar's pattern associated with breast cancer risk.

  1. Patient understanding of the revised USPSTF screening mammogram guidelines: need for development of patient decision aids

    Directory of Open Access Journals (Sweden)

    Allen Summer V

    2012-10-01

    Full Text Available Abstract Background The purpose of the study was to examine patients’ understanding of the revised screening mammogram guidelines released by the United States Preventive Services Task Force (USPSTF) in 2009 addressing age at initiation and frequency of screening mammography. Methods Patients from the Departments of Family Medicine, Internal Medicine, and Obstetrics and Gynecology (n = 150) at a tertiary care medical center in the United States completed a survey regarding their understanding of the revised USPSTF guidelines following their release, within four to six months of their scheduled mammogram (March 2010 to May 2010). Results Of the patients surveyed, 97/147 (67%) indicated increased confusion regarding the age and frequency of screening mammography, 61/148 (41%) reported increased anxiety about mammograms, and 58/146 (40%) reported anxiety about their own health status following the release of the revised screening guidelines. Most of the patients surveyed, 111/148 (75%), did not expect to change their timing or frequency of screening mammograms in the future. Conclusion Results from this survey suggested increased confusion and possibly an increase in patients’ anxiety related to screening mammography and their own health status following the release of the revised USPSTF screening mammogram guidelines to the public and subsequent media portrayal of the revised guidelines. Although the study did not specifically address causality for these findings, the results highlight the need for improvements in the communication of guidelines to patients and the public. Development of shared decision-making tools and outcomes should be considered to address the communication challenge.

  2. Monte Carlo Modelling of Mammograms : Development and Validation

    International Nuclear Information System (INIS)

    Spyrou, G.; Panayiotakis, G.; Bakas, A.; Tzanakos, G.

    1998-01-01

    A software package using Monte Carlo methods has been developed for the simulation of x-ray mammography. A simplified geometry of the mammographic apparatus has been considered along with the software phantom of compressed breast. This phantom may contain inhomogeneities of various compositions and sizes at any point. Using this model one can produce simulated mammograms. Results that demonstrate the validity of this simulation are presented. (authors)
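
    A toy version of such a simulation, reduced to primary-photon attenuation through a software phantom with one inclusion, might look like the sketch below; the attenuation coefficients, thickness and photon counts are invented, and scatter, spectra and detector effects are ignored.

```python
import numpy as np

def simulate_projection(mu_map, thickness_cm, photons_per_pixel=2000, seed=0):
    """Toy Monte Carlo projection: for each detector pixel, count photons that
    survive attenuation through the tissue column above it (survival
    probability exp(-mu * t)); placeholder physics only."""
    rng = np.random.default_rng(seed)
    p_survive = np.exp(-mu_map * thickness_cm)
    return rng.binomial(photons_per_pixel, p_survive)

if __name__ == "__main__":
    mu = np.full((64, 64), 0.50)          # breast-tissue-like mu (1/cm), assumed
    mu[28:36, 28:36] = 0.85               # denser inclusion (simulated lesion)
    image = simulate_projection(mu, thickness_cm=4.5)
    print("mean counts, background vs inclusion:",
          int(image[:10, :10].mean()), int(image[28:36, 28:36].mean()))
```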

  3. Using autoencoders for mammogram compression.

    Science.gov (United States)

    Tan, Chun Chet; Eswaran, Chikkannan

    2011-02-01

    This paper presents the results obtained for medical image compression using autoencoder neural networks. Since mammograms (medical images) are usually large, training of autoencoders becomes extremely tedious and difficult if the whole image is used for training. We show in this paper that the autoencoders can be trained successfully by using image patches instead of the whole image. The compression performances of different types of autoencoders are compared based on two parameters, namely mean square error and structural similarity index. It is found from the experimental results that the autoencoder which does not use Restricted Boltzmann Machine pre-training yields better results than those which use this pre-training method.
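
    The patch-based training idea can be illustrated with a linear autoencoder (equivalent to PCA), which avoids any deep-learning dependency; this is only a stand-in for the neural autoencoders compared in the paper, and the random image, patch size and code dimension are arbitrary.

```python
import numpy as np

def extract_patches(image, size=8):
    """Split an image into non-overlapping size x size patches (the training
    trick noted above: learn on patches rather than the whole mammogram)."""
    h, w = (s - s % size for s in image.shape)
    img = image[:h, :w]
    patches = (img.reshape(h // size, size, w // size, size)
                  .swapaxes(1, 2).reshape(-1, size * size))
    return patches, (h, w)

def linear_autoencoder(patches, code_dim=16):
    """A linear autoencoder (PCA via SVD) as a minimal stand-in for the
    neural autoencoders compared in the paper."""
    mean = patches.mean(axis=0)
    _, _, vt = np.linalg.svd(patches - mean, full_matrices=False)
    basis = vt[:code_dim]                       # encoder/decoder weights
    codes = (patches - mean) @ basis.T          # compressed representation
    return codes @ basis + mean                 # reconstruction

def mse(a, b):
    return float(np.mean((a - b) ** 2))

if __name__ == "__main__":
    rng = np.random.default_rng(6)
    image = rng.normal(0.5, 0.1, (128, 128))    # random stand-in for a mammogram
    size = 8
    patches, (h, w) = extract_patches(image, size)
    recon_patches = linear_autoencoder(patches, code_dim=16)
    recon = (recon_patches.reshape(h // size, w // size, size, size)
                          .swapaxes(1, 2).reshape(h, w))
    print(f"{len(patches)} patches, reconstruction MSE {mse(image[:h, :w], recon):.5f}")
    # SSIM, the second metric in the paper, could be added via
    # skimage.metrics.structural_similarity if scikit-image is available.
```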

  4. Computer-Aided Detection in Digital Mammography: False-Positive Marks and Their Reproducibility in Negative Mammograms

    International Nuclear Information System (INIS)

    Kim, Seung Ja; Moon, Woo Kyung; Cho, Nariya; Chang, Jung Min; Seong, Min Hyun

    2009-01-01

    Background: There are relatively few studies reporting the frequency of false-positive computer-aided detection (CAD) marks and their reproducibility in normal cases. Purpose: To evaluate retrospectively the false-positive mark rate of a CAD system and the reproducibility of false-positive marks in two sets of negative digital mammograms. Material and Methods: Two sets of negative digital mammograms were obtained in 360 women (mean age 57 years, range 30-76 years) with an approximate interval of 1 year (mean time 343.7 days), and a CAD system was applied. False-positive CAD marks and the reproducibility were determined. Results: Of the 360 patients, 252 (70.0%) and 240 (66.7%) patients had 1-7 CAD marks on the initial and second mammograms, respectively. The false-positive CAD mark rate was 1.5 (1.1 for masses and 0.4 for calcifications) and 1.4 (1.0 for masses and 0.4 for calcifications) per examination in the initial and second mammograms, respectively. The reproducibility of the false-positive CAD marks was 12.0% for both mass (81/680) and microcalcification (33/278) marks. Conclusion: False-positive CAD marks were seen in approximately 70% of normal cases. However, the reproducibility was very low. Radiologists must be familiar with the findings of false-positive CAD marks, since they are very common and can increase the recall rate in screening

  5. Parameter estimation in stochastic mammogram model by heuristic optimization techniques.

    NARCIS (Netherlands)

    Selvan, S.E.; Xavier, C.C.; Karssemeijer, N.; Sequeira, J.; Cherian, R.A.; Dhala, B.Y.

    2006-01-01

    The appearance of disproportionately large amounts of high-density breast parenchyma in mammograms has been found to be a strong indicator of the risk of developing breast cancer. Hence, the breast density model is popular for risk estimation or for monitoring breast density change in prevention or

  6. Monte Carlo Modelling of Mammograms : Development and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Spyrou, G; Panayiotakis, G [University of Patras, School of Medicine, Medical Physics Department, 265 00 Patras (Greece); Bakas, A [Technological Educational Institution of Athens, Department of Radiography, 122 10 Athens (Greece); Tzanakos, G [University of Athens, Department of Physics, Division of Nuclear and Particle Physics, 157 71 Athens (Greece)

    1999-12-31

    A software package using Monte Carlo methods has been developed for the simulation of x-ray mammography. A simplified geometry of the mammographic apparatus has been considered along with the software phantom of compressed breast. This phantom may contain inhomogeneities of various compositions and sizes at any point. Using this model one can produce simulated mammograms. Results that demonstrate the validity of this simulation are presented. (authors) 16 refs, 4 figs

  7. Effects of Different Compression Techniques on Diagnostic Accuracies of Breast Masses on Digitized Mammograms

    International Nuclear Information System (INIS)

    Zhigang Liang; Xiangying Du; Jiabin Liu; Yanhui Yang; Dongdong Rong; Xinyu Yao; Kuncheng Li

    2008-01-01

    Background: The JPEG 2000 compression technique has recently been introduced into the medical imaging field. It is critical to understand the effects of this technique on the detection of breast masses on digitized images by human observers. Purpose: To evaluate whether lossless and lossy techniques affect the diagnostic results of malignant and benign breast masses on digitized mammograms. Material and Methods: A total of 90 screen-film mammograms including craniocaudal and lateral views obtained from 45 patients were selected by two non-observing radiologists. Of these, 22 cases were benign lesions and 23 cases were malignant. The mammographic films were digitized by a laser film digitizer, and compressed to three levels (lossless and lossy 20:1 and 40:1) using the JPEG 2000 wavelet-based image compression algorithm. Four radiologists with 10-12 years' experience in mammography interpreted the original and compressed images. The time interval was 3 weeks for each reading session. A five-point malignancy scale was used, with a score of 1 corresponding to definitely not a malignant mass, a score of 2 referring to not a malignant mass, a score of 3 meaning possibly a malignant mass, a score of 4 being probably a malignant mass, and a score of 5 interpreted as definitely a malignant mass. The radiologists' performance was evaluated using receiver operating characteristic analysis. Results: The average Az values for all radiologists decreased from 0.8933 for the original uncompressed images to 0.8299 for the images compressed at 40:1. This difference was not statistically significant. The detection accuracy of the original images was better than that of the compressed images, and the Az values decreased with increasing compression ratio. Conclusion: Digitized mammograms compressed at 40:1 could be used to substitute original images in the diagnosis of breast cancer

  8. The Effect of Breast Implants on Mammogram Outcomes.

    Science.gov (United States)

    Kam, Kelli; Lee, Esther; Pairawan, Seyed; Anderson, Kendra; Cora, Cherie; Bae, Won; Senthil, Maheswari; Solomon, Naveenraj; Lum, Sharon

    2015-10-01

    Breast cancer detection in women with implants has been questioned. We sought to evaluate the impact of breast implants on mammographic outcomes. A retrospective review of women undergoing mammography between March 1 and October 30, 2013 was performed. Demographic characteristics and mammogram results were compared between women with and without breast implants. Overall, 4.8 per cent of the 1863 women identified during the study period had breast implants. Median age was 59 years (26-93). Women with implants were younger (53.9 vs 59.2 years) and more likely to have dense breast tissue (72.1% vs 56.4%, P = 0.004) than those without. There were no statistically significant differences with regard to Breast Imaging Reporting and Data System 0 score (13.3% with implants vs 21.4% without), call-back exams (18.9% with vs 24.1% without), time to resolution of abnormal imaging (58.6 days with vs 43.3 without), or cancer detection rate (0% with implants vs 1.0% without). Because implants did not significantly affect mammogram results, women with implants should be reassured that mammography remains useful in detecting cancer. However, future research is required to determine whether lower call-back rates and longer time to resolution of imaging findings contribute to delays in diagnosis in patients with implants.

  9. Ethnic differences in social support after initial receipt of an abnormal mammogram.

    Science.gov (United States)

    Molina, Yamile; Hohl, Sarah D; Nguyen, Michelle; Hempstead, Bridgette H; Weatherby, Shauna Rae; Dunbar, Claire; Beresford, Shirley A A; Ceballos, Rachel M

    2016-10-01

    We examine access to and type of social support after initial receipt of an abnormal mammogram across non-Latina White (NLW), African American, and Latina women. This cross-sectional study used a mixed method design, with quantitative and qualitative measures. Women were recruited through 2 community advocates and 3 breast-health-related care organizations. With regard to access, African American women were less likely to access social support relative to NLW counterparts. Similar nonsignificant differences were found for Latinas. Women did not discuss results with family and friends to avoid burdening social networks and negative reactions. Networks' geographic constraints and medical mistrust influenced Latina and African American women's decisions to discuss results. With regard to type of social support, women reported emotional support across ethnicity. Latina and African American women reported more instrumental support, whereas NLW women reported more informational support in the context of their well-being. There are shared and culturally unique aspects of women's experiences with social support after initially receiving an abnormal mammogram. Latina and African American women may particularly benefit from informational support from health care professionals. Communitywide efforts to mitigate mistrust and encourage active communication about cancer may improve ethnic disparities in emotional well-being and diagnostic resolution during initial receipt of an abnormal mammogram. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  10. Staging of breast cancer and the advanced applications of digital mammogram: what the physician needs to know?

    Science.gov (United States)

    Helal, Maha H; Mansour, Sahar M; Zaglol, Mai; Salaleldin, Lamia A; Nada, Omniya M; Haggag, Marwa A

    2017-03-01

    To study the role of advanced applications of digital mammography, whether contrast-enhanced spectral mammography (CESM) or digital breast tomosynthesis (DBT), in the "T" staging of histologically proven breast cancer before planning treatment. In this prospective analysis, we evaluated 98 proven malignant breast masses regarding their size, multiplicity and the presence of associated clusters of microcalcifications. Evaluation methods included digital mammography (DM), 3D tomosynthesis and CESM. Traditional DM was performed first; then, within a 10-14-day interval, breast tomosynthesis and contrast-based mammography were performed for the involved breast only. Views at tomosynthesis were acquired in a "step-and-shoot" tube motion mode to produce multiple (11-15) low-dose images, and in the contrast-enhanced study, low-energy (22-33 kVp) and high-energy (44-49 kVp) exposures were taken after the i.v. injection of the contrast agent. Operative data were the gold standard reference. Breast tomosynthesis showed the highest accuracy in size assessment (n = 69, 70.4%), compared with contrast-enhanced (n = 49, 50%) and regular mammography (n = 59, 60.2%). Contrast-enhanced mammography showed the poorest performance in assessing calcifications, yet it was the most sensitive in the detection of multiplicity (92.3%), followed by tomosynthesis (77%) and regular mammography (53.8%). The combined analysis of the three modalities provided an accuracy of 74% in the "T" staging of breast cancer. The combined application of tomosynthesis and contrast-enhanced digital mammography enhanced the performance of traditional DM and presented an informative method for the staging of breast cancer. Advances in knowledge: Staging and management planning of breast cancer can differ according to tumour size, multiplicity and the presence of microcalcifications. DBT shows sharp outlines of the tumour with no overlapping tissue and spots microcalcifications. Contrast

  11. Resolution effects on the morphology of calcifications in digital mammograms

    Energy Technology Data Exchange (ETDEWEB)

    Kallergi, Maria; He, Li; Gavrielides, Marios; Heine, John; Clarke, Laurence P [Department of Radiology, College of Medicine, and H. Lee Moffitt Cancer Center and Research Institute at the University of South Florida, 12901 Bruce B. Downs Blvd., Box 17, Tampa, FL 33612 (United States)

    1999-12-31

    The development of computer-assisted diagnosis (CAD) techniques and direct digital mammography systems has generated significant interest in the issue of the effect of image resolution on the detection and classification (benign vs malignant) of mammographic abnormalities. CAD in particular seems to depend heavily on image resolution, either due to the inherent algorithm design and optimization, which is almost always resolution dependent, or due to the differences in image content at the various resolutions. This twofold dependence makes it even more difficult to answer the question of what minimum resolution is required for successful detection and/or classification of a specific mammographic abnormality, such as calcifications. One may begin by evaluating the losses in the mammograms as the films are digitized with different pixel sizes and depths. In this paper we attempted to measure these losses for the case of calcifications at four different spatial resolutions through a simulation model and a classification scheme that is based only on morphological features. The results showed that a 60 μm pixel size and 12 bits per pixel should at least be used if the morphology and distribution of the calcifications are essential components in the CAD algorithm design. These conclusions were tested with the use of a wavelet-based algorithm for the segmentation of simulated mammographic calcifications at various resolutions. The evaluation of the segmentation through shape analysis and classification supported the initial conclusion. (authors) 14 refs., 1 tab.

  12. Towards an in-plane methodology to track breast lesions using mammograms and patient-specific finite-element simulations

    Science.gov (United States)

    Lapuebla-Ferri, Andrés; Cegoñino-Banzo, José; Jiménez-Mocholí, Antonio-José; Pérez del Palomar, Amaya

    2017-11-01

    In breast cancer screening or diagnosis, it is usual to combine different images in order to locate a lesion as accurately as possible. These images are generated using a single or several imaging techniques. As x-ray-based mammography is widely used, a breast lesion is located in the same plane of the image (mammogram), but tracking it across mammograms corresponding to different views is a challenging task for medical physicians. Accordingly, simulation tools and methodologies that use patient-specific numerical models can facilitate the task of fusing information from different images. Additionally, these tools need to be as straightforward as possible to facilitate their translation to the clinical area. This paper presents a patient-specific, finite-element-based and semi-automated simulation methodology to track breast lesions across mammograms. A realistic three-dimensional computer model of a patient’s breast was generated from magnetic resonance imaging to simulate mammographic compressions in cranio-caudal (CC, head-to-toe) and medio-lateral oblique (MLO, shoulder-to-opposite hip) directions. For each compression being simulated, a virtual mammogram was obtained and subsequently superimposed onto the corresponding real mammogram, using the nipple as a common feature. Two-dimensional rigid-body transformations were applied, and the error distance measured between the centroids of the tumors previously located on each image was 3.84 mm and 2.41 mm for CC and MLO compression, respectively. Considering that the scope of this work is to conceive a methodology translatable to clinical practice, the results indicate that it could be helpful in supporting the tracking of breast lesions.
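
    The superimposition step, a 2D rigid-body transform applied after taking the nipple as the shared landmark, can be sketched in a few lines of NumPy. The coordinates below are hypothetical and the transform is reduced to a pure translation for brevity; this is not the paper's finite-element pipeline, only the final alignment-and-error measurement.

```python
import numpy as np

def rigid_transform_2d(points, theta_rad, translation):
    """Apply a 2D rigid-body (rotation + translation) transform to Nx2 points."""
    c, s = np.cos(theta_rad), np.sin(theta_rad)
    R = np.array([[c, -s], [s, c]])
    return points @ R.T + np.asarray(translation)

# Hypothetical landmark coordinates (mm): the nipple shared between the virtual
# and the real mammogram is used here to estimate a simple translation.
nipple_virtual = np.array([60.0, 42.0])
nipple_real = np.array([63.5, 40.0])
tumour_virtual = np.array([38.0, 55.0])
tumour_real = np.array([41.2, 52.9])

# Align the virtual mammogram onto the real one via the nipple (no rotation here).
t = nipple_real - nipple_virtual
tumour_mapped = rigid_transform_2d(tumour_virtual[None, :], 0.0, t)[0]

# Error distance between tumour centroids after superimposition (cf. 3.84/2.41 mm).
print("centroid error (mm):", np.linalg.norm(tumour_mapped - tumour_real))
```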

  13. Quantitative assessment of breast density from digitized mammograms into Tabar's patterns

    Energy Technology Data Exchange (ETDEWEB)

    Jamal, N [Medical Technology Division, Malaysian Institute for Nuclear Technology Research (MINT) 43000 Kajang (Malaysia); Ng, K-H [Department of Radiology, University of Malaya, 50603 Kuala Lumpur (Malaysia); Looi, L-M [Department of Pathology, University of Malaya, 50603 Kuala Lumpur (Malaysia); McLean, D [Medical Physics Department, Westmead Hospital, Sydney, NSW 2145 (Australia); Zulfiqar, A [Department of Radiology, Hospital Universiti Kebangsaan Malaysia, 56000 Kuala Lumpur (Malaysia); Tan, S-P [Department of Radiology, Hospital Universiti Kebangsaan Malaysia, 56000 Kuala Lumpur (Malaysia); Liew, W-F [Department of Radiology, Hospital Universiti Kebangsaan Malaysia, 56000 Kuala Lumpur (Malaysia); Shantini, A [Department of Radiology, Kuala Lumpur Hospital, 50586 Kuala Lumpur (Malaysia); Ranganathan, S [Department of Radiology, University of Malaya, 50603 Kuala Lumpur (Malaysia)

    2006-11-21

    We describe a semi-automated technique for the quantitative assessment of breast density from digitized mammograms in comparison with the patterns suggested by Tabar. It was developed using MATLAB-based graphical user interface applications. It is based on an interactive thresholding method, preceded by a short automated step, and displays the fibroglandular tissue area, breast area and breast density each time new thresholds are placed on the image. The breast density is taken as the percentage of the fibroglandular tissue area relative to the breast tissue area. It was tested in four different ways, namely by examining: (i) correlation of the quantitative assessment results with subjective classification, (ii) classification performance using the quantitative assessment technique, (iii) interobserver agreement and (iv) intraobserver agreement. The results of the quantitative assessment correlated well (r² = 0.92) with the subjective Tabar patterns classified by the radiologist (83% of digitized mammograms correctly classified). The average kappa coefficient for the agreement between the readers was 0.63, indicating moderate agreement between the three observers in classifying breast density using the quantitative assessment technique. The kappa coefficient of 0.75 for intraobserver agreement reflected good agreement between two sets of readings. The technique may be useful as a supplement to the radiologist's assessment in classifying mammograms into Tabar's patterns associated with breast cancer risk.
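
    A minimal sketch of the underlying percent-density computation (fibroglandular area divided by breast area), assuming simple global thresholds in place of the interactive, GUI-driven thresholding the record describes; the toy image and threshold values are placeholders.

```python
import numpy as np

def percent_density(image, breast_thresh, dense_thresh):
    """Percent mammographic density: fibroglandular area / breast area * 100.

    `breast_thresh` separates breast tissue from background; `dense_thresh`
    (chosen interactively in the described method) marks fibroglandular tissue.
    """
    breast = image > breast_thresh
    dense = image > dense_thresh
    return 100.0 * dense.sum() / max(breast.sum(), 1)

# Toy 8-bit mammogram-like array; the thresholds are illustrative only.
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(512, 512))
print(f"breast density: {percent_density(img, 20, 180):.1f}%")
```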

  14. Mammogram classification scheme using 2D-discrete wavelet and local binary pattern for detection of breast cancer

    Science.gov (United States)

    Adi Putra, Januar

    2018-04-01

    In this paper, we propose a new mammogram classification scheme to classify breast tissue as normal or abnormal. A feature matrix is generated by applying the Local Binary Pattern to all the detail coefficients of the 2D-DWT of the region of interest (ROI) of a mammogram. Feature selection is then performed to reduce the dimensionality of the data and discard irrelevant features; here the F-test and T-test are applied to the extracted feature set to select the relevant features. The best features are fed to a Neural Network classifier for classification. In this research we use the MIAS and DDSM databases. In addition to the suggested scheme, competing schemes are also simulated for comparative analysis. The proposed scheme performs better with respect to accuracy, specificity and sensitivity. In our experiments, the proposed scheme achieves an accuracy of up to 92.71%, while the lowest accuracy obtained is 77.08%.
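
    A hedged sketch of the feature pipeline just described, 2D-DWT detail sub-bands, LBP histograms, F-test feature selection and a neural-network classifier, using PyWavelets, scikit-image and scikit-learn on random stand-in data; the wavelet, the LBP parameters and the network size are illustrative choices, not the paper's settings.

```python
import numpy as np
import pywt
from skimage.feature import local_binary_pattern
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.neural_network import MLPClassifier

def dwt_lbp_features(roi, P=8, R=1.0):
    """LBP histograms of the 2D-DWT detail sub-bands (cH, cV, cD) of an ROI."""
    feats = []
    _, details = pywt.dwt2(roi.astype(float), 'haar')
    for band in details:
        # Quantize each sub-band to 8 bits before LBP to avoid float-image issues.
        band8 = np.uint8(255 * (band - band.min()) / (np.ptp(band) + 1e-9))
        lbp = local_binary_pattern(band8, P, R, method='uniform')
        hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2), density=True)
        feats.extend(hist)
    return np.array(feats)

# Random patches stand in for MIAS/DDSM ROIs; labels are random placeholders.
rng = np.random.default_rng(1)
X = np.stack([dwt_lbp_features(rng.random((64, 64))) for _ in range(40)])
y = rng.integers(0, 2, size=40)

selected = SelectKBest(f_classif, k=10).fit(X, y)     # F-test based selection
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
clf.fit(selected.transform(X), y)
print("training accuracy:", clf.score(selected.transform(X), y))
```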

  15. Reading screening mammograms – Attitudes among radiologists and radiographers about skill mix

    International Nuclear Information System (INIS)

    Johansen, Lena Westphal; Brodersen, John

    2011-01-01

    Introduction: Because of a shortage of personnel for the Danish mammography screening programme, the aim of this study was to investigate the attitudes of radiologists and radiographers towards a future implementation of radiographers reading screening mammograms. Materials and methods: Seven combined phenomenological and hermeneutical interviews with radiographers and radiologists were performed. Stratified selection was used for sampling of informants. The interviews were analysed against theory about quality, organization and profession. Results: Quality-related possibilities: radiographers do routinely measure performance quality, radiographers obtain sufficient reading qualifications, and skill mix improves quality. Quality-related obstacles: radiologists do not routinely measure performance quality. Organization-related possibilities: shortage of radiologists, positive attitudes of managers, and improved working relations. Organization-related obstacles: shortage of radiographers and negative attitudes of managers. Profession-related possibilities: positive experience with skill mix. Profession-related obstacles: worries about negative consequences for the training of radiologists, and resistance against handing over tasks to another profession. Conclusion: Attitudes towards radiographers reading screening mammograms are attached to quality-related, organisational or professional perspectives. Radiographers are capable of learning to read mammograms at a sufficient performance level, but routine measurement of performance quality is essential. Resistance against skill mix may be caused by an emotionally conditioned fear of losing demarcations. The main motive for skill mix is improvement of the utilization of resources. No evidence was found regarding the organisational and financial consequences of skill mix. Despite this, all radiologists and radiographers experienced with skill mix were strong advocates of reading radiographers.

  16. Ameliorating mammograms by using novel image processing algorithms

    Science.gov (United States)

    Pillai, A.; Kwartowitz, D.

    2014-03-01

    Mammography is one of the most important tools for the early detection of breast cancer, typically through the detection of characteristic masses and/or microcalcifications. Digital mammography has become commonplace in recent years. High quality mammogram images are large in size, providing high-resolution data. Estimates of the false negative rate for cancers in mammography are approximately 10%-30%. This may be due to observation error, but more frequently it is because the cancer is hidden by other dense tissue in the breast and, even after retrospective review of the mammogram, cannot be seen. In this study, we report on the results of novel image processing algorithms that enhance the images, providing decision support to reading physicians. Techniques such as Butterworth high-pass filtering and Gabor filters are applied to enhance the images, followed by segmentation of the region of interest (ROI). Subsequently, textural features are extracted from the ROI and used to classify the ROIs as either masses or non-masses. Among the statistical methods most used for the characterization of texture, the co-occurrence matrix makes it possible to determine the frequency of appearance of two pixels separated by a given distance, at a given angle from the horizontal. This matrix contains a very large amount of complex information; therefore, it is not used directly but through measurements known as texture indices, such as average, variance, energy, contrast, correlation, normalized correlation and entropy.
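
    The co-occurrence-matrix indices listed above can be computed with scikit-image as sketched below; the distance, angle and the toy ROI are assumptions for illustration, and entropy is derived directly from the normalized matrix since it is not one of the built-in properties.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_indices(roi, distance=1, angle=0.0, levels=256):
    """Texture indices from a gray-level co-occurrence matrix at one (distance, angle)."""
    glcm = graycomatrix(roi, [distance], [angle], levels=levels,
                        symmetric=True, normed=True)
    p = glcm[:, :, 0, 0]
    entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))
    return {
        'energy': graycoprops(glcm, 'energy')[0, 0],
        'contrast': graycoprops(glcm, 'contrast')[0, 0],
        'correlation': graycoprops(glcm, 'correlation')[0, 0],
        'entropy': entropy,
    }

# Toy 8-bit ROI standing in for a segmented mammogram region.
rng = np.random.default_rng(2)
roi = rng.integers(0, 256, size=(128, 128), dtype=np.uint8)
print(glcm_indices(roi))
```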

  17. Mammographer personality traits – elements of the optimal mammogram experience

    Directory of Open Access Journals (Sweden)

    Amanda Louw

    2014-11-01

    Objectives: The aim of this study was to investigate some of the factors that influence patients' perceptions. Patients' perceptions of, and preferences regarding, mammographers' personality traits are discussed in this article. Method: In this descriptive, exploratory study, a non-probability convenience sampling method was used to collect questionnaire data from 274 mammogram patients at four clinical training centres in Gauteng. Respondents had to rate the importance of 24 mammographer personality traits. Validity, reliability, credibility and ethical considerations were taken into account. Results: Of all the questionnaires, 91% were returned. The data were interpreted using descriptive statistics and factor analysis, and four factors were identified from the personality scale. Conclusion: Patients appear to judge mammographers by the confidence they inspire, the care they provide, how safe they make patients feel, and how well they communicate. Since the mammographer-patient relationship strongly influences patients' impressions of mammograms, these four factors can be regarded as fundamental elements of an optimal mammogram examination.

  18. Noise equalization for detection of microcalcification clusters in direct digital mammogram images.

    NARCIS (Netherlands)

    McLoughlin, K.J.; Bones, P.J.; Karssemeijer, N.

    2004-01-01

    Equalizing image noise is shown to be an important step in the automatic detection of microcalcifications in digital mammography. This study extends a well established film-screen noise equalization scheme developed by Veldkamp et al. for application to full-field digital mammogram (FFDM) images. A

  19. Assessment of a novel mass detection algorithm in mammograms

    Directory of Open Access Journals (Sweden)

    Ehsan Kozegar

    2013-01-01

    Settings and Design: The proposed mass detector consists of two major steps. In the first step, several suspicious regions are extracted from the mammograms using an adaptive thresholding technique. In the second step, false positives originating from the previous stage are reduced by a machine learning approach. Materials and Methods: All modules of the mass detector were assessed on the mini-MIAS database. In addition, the algorithm was tested on the INBreast database for further validation. Results: According to FROC analysis, our mass detection algorithm outperforms other competing methods. Conclusions: We should not insist on sensitivity alone in the segmentation phase: if the FP rate were ignored and the goal were only higher sensitivity, the learning algorithm would be biased toward false positives and sensitivity would decrease dramatically in the false-positive reduction phase. Therefore, mass detection should be treated as a cost-sensitive problem, because misclassification costs are not the same in this type of problem.
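
    A minimal sketch of the first (candidate-extraction) step, assuming a local adaptive threshold from scikit-image followed by an area filter; the block size, offset and minimum area are placeholders, and the second-stage false-positive classifier is omitted.

```python
import numpy as np
from skimage.filters import threshold_local
from skimage.measure import label, regionprops

def suspicious_regions(image, block_size=51, offset=-10, min_area=200):
    """Step 1 of a two-stage mass detector: adaptive thresholding proposes
    candidate regions; a learned classifier would then prune false positives."""
    local_thresh = threshold_local(image, block_size=block_size, offset=offset)
    mask = image > local_thresh
    return [r for r in regionprops(label(mask)) if r.area >= min_area]

rng = np.random.default_rng(3)
img = rng.random((256, 256)) * 255          # toy stand-in for a mini-MIAS image
print(len(suspicious_regions(img)), "candidate regions")
```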

  20. Computer-aided detection system performance on current and previous digital mammograms in patients with contralateral metachronous breast cancer

    International Nuclear Information System (INIS)

    Kim, Seung Ja; Moon, Woo Kyung; Cho, Nariya; Chang, Jung Min

    2012-01-01

    Background: The computer-aided detection (CAD) system is widely used for screening mammography. The performance of the CAD system for contralateral breast cancer has not been reported for women with a history of breast cancer. Purpose: To retrospectively evaluate the performance of a CAD system on current and previous mammograms in patients with contralateral metachronous breast cancer. Material and Methods: During a 3-year period, 4945 postoperative patients had follow-up examinations, from whom we selected 55 women with contralateral breast cancers. Among them, 38 had visible malignant signs on the current mammograms. We analyzed the sensitivity and false-positive marks of the system on the current and previous mammograms according to lesion type and breast density. Results: The total visible lesion components on the current mammograms included 27 masses and 14 calcifications in 38 patients. The case-based sensitivity for all lesion types was 63.2% (24/38) with false-positive marks of 0.71 per patient. The lesion-based sensitivity for masses and calcifications was 59.3% (16/27) and 71.4% (10/14), respectively. The lesion-based sensitivity for masses in fatty and dense breasts was 68.8% (11/16) and 45.5% (5/11), respectively. The lesion-based sensitivity for calcifications in fatty and dense breasts was 100.0% (3/3) and 63.6% (7/11), respectively. The total visible lesion components on the previous mammograms included 13 masses and three calcifications in 16 patients, and the sensitivity for all lesion types was 31.3% (5/16) with false-positive marks of 0.81 per patient. On these mammograms, the sensitivity for masses and calcifications was 30.8% (4/13) and 33.3% (1/3), respectively. The sensitivity in fatty and dense breasts was 28.6% (2/7) and 33.3% (3/9), respectively. Conclusion: In the women with a history of breast cancer, the sensitivity of the CAD system in visible contralateral breast cancer was lower than in most previous reports using the same CAD

  1. Computer-aided diagnosis scheme for histological classification of clustered microcalcifications on magnification mammograms

    International Nuclear Information System (INIS)

    Nakayama, Ryohei; Uchiyama, Yoshikazu; Watanabe, Ryoji; Katsuragawa, Shigehiko; Namba, Kiyoshi; Doi, Kunio

    2004-01-01

    The histological classification of clustered microcalcifications on mammograms can be difficult and thus often requires biopsy or follow-up. Our purpose in this study was to develop a computer-aided diagnosis scheme for identifying the histological classification of clustered microcalcifications on magnification mammograms in order to assist the radiologists' interpretation as a 'second opinion'. Our database consisted of 58 magnification mammograms, which included 35 malignant clustered microcalcifications (9 invasive carcinomas, 12 noninvasive carcinomas of the comedo type, and 14 noninvasive carcinomas of the noncomedo type) and 23 benign clustered microcalcifications (17 mastopathies and 6 fibroadenomas). The histological classifications of all clustered microcalcifications were proved by pathologic diagnosis. The clustered microcalcifications were first segmented by use of a novel filter bank and a thresholding technique. Five objective features of clustered microcalcifications were determined by taking into account subjective features that experienced radiologists commonly use to identify possible histological classifications. The Bayes decision rule with the five objective features was employed to distinguish between the five histological classifications. The classification accuracies for distinguishing between the three malignant histological classifications were 77.8% (7/9) for invasive carcinoma, 75.0% (9/12) for noninvasive carcinoma of the comedo type, and 92.9% (13/14) for noninvasive carcinoma of the noncomedo type. The classification accuracies for distinguishing between the two benign histological classifications were 94.1% (16/17) for mastopathy and 100.0% (6/6) for fibroadenoma. This computerized method would be useful in assisting radiologists in their assessments of clustered microcalcifications.
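
    A Bayes decision rule over a handful of features can be sketched with a Gaussian naive Bayes classifier, which assigns each cluster to the class with the largest posterior probability; the random five-feature data and the naive (independence) assumption are stand-ins, not the paper's actual feature model.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

# Toy stand-in: 5 objective features per cluster, 5 histological classes
# (invasive, comedo, non-comedo, mastopathy, fibroadenoma).
rng = np.random.default_rng(4)
X = rng.normal(size=(58, 5))
y = rng.integers(0, 5, size=58)

# A Gaussian Bayes decision rule picks argmax_c p(class c | features);
# GaussianNB is one simple, assumption-laden instance of that rule.
clf = GaussianNB().fit(X, y)
print("per-class posteriors for one cluster:", clf.predict_proba(X[:1]).round(3))
```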

  2. ELM BASED CAD SYSTEM TO CLASSIFY MAMMOGRAMS BY THE COMBINATION OF CLBP AND CONTOURLET

    Directory of Open Access Journals (Sweden)

    S Venkatalakshmi

    2017-05-01

    Breast cancer is a serious threat to women worldwide. Mammography is a promising screening tool that can reveal abnormalities. However, physicians find it difficult to detect the affected regions, as microcalcifications are very small. Hence it would be helpful if a CAD system could accompany the physician in detecting malicious regions. Taking this as a challenge, this paper presents a CAD system for mammogram classification which proves to be accurate and reliable. The entire work is decomposed into four stages, and the outcome of each stage is passed as the input of the following stage. Initially, the mammogram is pre-processed with an adaptive median filter and segmented by GHFCM. Features are extracted by combining the texture feature descriptors Completed Local Binary Pattern (CLBP) and contourlet to form the feature sets. In the training phase, an Extreme Learning Machine (ELM) is trained with the feature sets. During the testing phase, the ELM classifies a mammogram as normal, benign or malignant. The performance of the proposed approach is analysed by varying the classifier, the feature extractors and the parameters of the feature extractor. The experimental analysis shows that the proposed work outperforms analogous techniques in terms of accuracy, sensitivity and specificity.
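
    The Extreme Learning Machine itself is compact enough to sketch directly: a fixed random hidden layer followed by a closed-form least-squares fit of the output weights. The toy features standing in for CLBP + contourlet descriptors, the hidden-layer size and the tanh activation are assumptions made for this illustration.

```python
import numpy as np

class TinyELM:
    """Minimal Extreme Learning Machine: random hidden layer, least-squares output."""
    def __init__(self, n_hidden=64, seed=0):
        self.n_hidden, self.rng = n_hidden, np.random.default_rng(seed)

    def fit(self, X, Y):
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)          # hidden-layer activations
        self.beta = np.linalg.pinv(H) @ Y         # closed-form output weights
        return self

    def predict(self, X):
        return (np.tanh(X @ self.W + self.b) @ self.beta).argmax(axis=1)

# Toy feature sets standing in for CLBP + contourlet descriptors, 3 classes
# (normal / benign / malignant); labels are one-hot encoded for the LS fit.
rng = np.random.default_rng(5)
X, y = rng.normal(size=(90, 40)), rng.integers(0, 3, size=90)
Y = np.eye(3)[y]
print("training accuracy:", (TinyELM().fit(X, Y).predict(X) == y).mean())
```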

  3. Automatic and consistent registration framework for temporal pairs of mammograms in application to breast cancer risk assessment due to hormone replacement therapy (HRT)

    DEFF Research Database (Denmark)

    Karemore, Gopal Raghunath; Carreras, I. Arganda; Nielsen, Mads

    2009-01-01

     Purpose: Mammographic density is a strong risk factor for breast cancer. However, whether changes in mammographic density due to HRT are associated with risk remains unclear. The aim of this study is to provide a framework for accurate interval change analysis in temporal pairs of mammograms of ...

  4. Computer aided detection of clusters of microcalcifications on full field digital mammograms

    International Nuclear Information System (INIS)

    Ge Jun; Sahiner, Berkman; Hadjiiski, Lubomir M.; Chan, H.-P.; Wei Jun; Helvie, Mark A.; Zhou Chuan

    2006-01-01

    We are developing a computer-aided detection (CAD) system to identify microcalcification clusters (MCCs) automatically on full field digital mammograms (FFDMs). The CAD system includes six stages: preprocessing; image enhancement; segmentation of microcalcification candidates; false positive (FP) reduction for individual microcalcifications; regional clustering; and FP reduction for clustered microcalcifications. At the stage of FP reduction for individual microcalcifications, a truncated sum-of-squares error function was used to improve the efficiency and robustness of the training of an artificial neural network in our CAD system for FFDMs. At the stage of FP reduction for clustered microcalcifications, morphological features and features derived from the artificial neural network outputs were extracted from each cluster. Stepwise linear discriminant analysis (LDA) was used to select the features. An LDA classifier was then used to differentiate clustered microcalcifications from FPs. A data set of 96 cases with 192 images was collected at the University of Michigan. This data set contained 96 MCCs, of which 28 clusters were proven by biopsy to be malignant and 68 were proven to be benign. The data set was separated into two independent data sets for training and testing of the CAD system in a cross-validation scheme. When one data set was used to train and validate the convolution neural network (CNN) in our CAD system, the other data set was used to evaluate the detection performance. With the use of a truncated error metric, the training of CNN could be accelerated and the classification performance was improved. The CNN in combination with an LDA classifier could substantially reduce FPs with a small tradeoff in sensitivity. By using the free-response receiver operating characteristic methodology, it was found that our CAD system can achieve a cluster-based sensitivity of 70, 80, and 90 % at 0.21, 0.61, and 1.49 FPs/image, respectively. For case

  5. Reduction of false positives in the detection of architectural distortion in mammograms by using a geometrically constrained phase portrait model

    International Nuclear Information System (INIS)

    Ayres, Fabio J.; Rangayyan, Rangaraj M.

    2007-01-01

    Objective One of the commonly missed signs of breast cancer is architectural distortion. We have developed techniques for the detection of architectural distortion in mammograms, based on the analysis of oriented texture through the application of Gabor filters and a linear phase portrait model. In this paper, we propose constraining the shape of the general phase portrait model as a means to reduce the false-positive rate in the detection of architectural distortion. Material and methods The methods were tested with one set of 19 cases of architectural distortion and 41 normal mammograms, and with another set of 37 cases of architectural distortion. Results Sensitivity rates of 84% with 4.5 false positives per image and 81% with 10 false positives per image were obtained for the two sets of images. Conclusion The adoption of a constrained phase portrait model with a symmetric matrix and the incorporation of its condition number in the analysis resulted in a reduction in the false-positive rate in the detection of architectural distortion. The proposed techniques, dedicated for the detection and localization of architectural distortion, should lead to efficient detection of early signs of breast cancer. (orig.)
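
    A sketch of the two ingredients named above, an oriented-texture (Gabor filter bank) orientation field and the condition number of a symmetric matrix, using scikit-image and NumPy; the filter frequency, the toy stripe pattern and the example matrix are placeholders, not the authors' phase-portrait estimation.

```python
import numpy as np
from skimage.filters import gabor

def orientation_field(image, frequency=0.1, n_orientations=8):
    """Per-pixel dominant orientation from a small Gabor filter bank, the first
    step of the oriented-texture analysis described for architectural distortion."""
    thetas = np.linspace(0, np.pi, n_orientations, endpoint=False)
    responses = np.stack([np.hypot(*gabor(image, frequency, theta=t))
                          for t in thetas])
    return thetas[np.argmax(responses, axis=0)]

# Toy diagonally striped texture standing in for a mammographic ROI.
yy, xx = np.mgrid[:64, :64]
roi = np.sin(0.6 * (xx + yy))
field = orientation_field(roi)
print("median orientation (deg):", np.degrees(np.median(field)).round(1))

# The condition number of a (hypothetical) symmetric phase-portrait matrix A is
# the kind of constraint feature the paper adds to reduce false positives.
A = np.array([[1.0, 0.3], [0.3, 0.8]])
print("condition number of A:", round(np.linalg.cond(A), 2))
```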

  6. The relationship of psychosocial factors to mammograms, physical activity, and fruit and vegetable consumption among sisters of breast cancer patients

    Directory of Open Access Journals (Sweden)

    Hartman SJ

    2011-08-01

    Full Text Available Sheri J Hartman1, Shira I Dunsiger1, Paul B Jacobsen21Centers for Behavioral and Preventive Medicine, The Miriam Hospital and W Alpert Medical School of Brown University, Providence, RI; 2Department of Health Outcomes and Behavior, H Lee Moffitt Cancer Center and Research Institute, Tampa, FL, USAAbstract: This study examined the relationship of psychosocial factors to health-promoting behaviors in sisters of breast cancer patients. One hundred and twenty sisters of breast cancer patients completed questionnaires assessing response efficacy of mammography screenings, physical activity, and fruit and vegetable consumption on decreasing breast cancer risk, breast cancer worry, involvement in their sister’s cancer care, mammography screenings, physical activity, and fruit and vegetable consumption. Results indicate that greater perceived effectiveness for mammograms was associated with a 67% increase in odds of yearly mammograms. Greater involvement in the patient’s care was associated with a 7% decrease in odds of yearly mammograms. Greater perceived effectiveness for physical activity was significantly related to greater physical activity. There was a trend for greater perceived effectiveness for fruits and vegetables to be associated with consuming more fruits and vegetables. Breast cancer worry was not significantly associated with the outcomes. While perceived effectiveness for a specific health behavior in reducing breast cancer risk was consistently related to engaging in that health behavior, women reported significantly lower perceived effectiveness for physical activity and fruits and vegetables than for mammograms. Making women aware of the health benefits of these behaviors may be important in promoting changes.Keywords: breast cancer risk, mammograms, physical activity, diet, perceived effectiveness

  7. Detecting microcalcifications in mammograms by using SVM method for the diagnostics of breast cancer

    Science.gov (United States)

    Wan, Baikun; Wang, Ruiping; Qi, Hongzhi; Cao, Xuchen

    2005-01-01

    Support vector machine (SVM) is a relatively new statistical learning method. Compared with classical machine learning methods, the SVM learning principle is to minimize the structural risk instead of the empirical risk minimized by classical methods, and it gives better generalization performance. Because the SVM training problem is a convex quadratic optimization problem, a local optimal solution is necessarily the global optimum. In this paper an SVM algorithm is applied to detect microcalcifications (MCCs) in mammograms for the diagnosis of breast cancer, which has not been reported previously. It was tested on 10 mammograms, and the results show that the algorithm achieves a higher true-positive rate than an artificial neural network (ANN) based on empirical risk minimization, and that it is valuable for further study and application in clinical engineering.
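
    A hedged sketch of the classification step using scikit-learn's SVC; the random feature vectors standing in for per-candidate MCC descriptors and the RBF kernel settings are illustrative only, not the parameters used in the study.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Toy stand-in for per-ROI feature vectors (e.g. contrast/shape descriptors of
# candidate microcalcifications) with binary MCC / non-MCC labels.
rng = np.random.default_rng(6)
X = rng.normal(size=(200, 12))
y = rng.integers(0, 2, size=200)

# Structural-risk-minimizing classifier: an RBF-kernel SVM; C and gamma would
# normally be tuned, the values here are placeholders.
svm = SVC(kernel='rbf', C=10.0, gamma='scale')
print("cross-validated accuracy:", cross_val_score(svm, X, y, cv=5).mean())
```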

  8. Area and volumetric density estimation in processed full-field digital mammograms for risk assessment of breast cancer.

    Directory of Open Access Journals (Sweden)

    Abbas Cheddad

    INTRODUCTION: Mammographic density, the white radiopaque part of a mammogram, is a marker of breast cancer risk and mammographic sensitivity. There are several means of measuring mammographic density, among which are area-based and volumetric-based approaches. Current volumetric methods use only unprocessed, raw mammograms, which is a problematic restriction since such raw mammograms are normally not stored. We describe fully automated methods for measuring both area and volumetric mammographic density from processed images. METHODS: The data set used in this study comprises raw and processed images of the same view from 1462 women. We developed two algorithms for processed images, an automated area-based approach (CASAM-Area) and a volumetric-based approach (CASAM-Vol). The latter method was based on training a random forest prediction model with image statistical features as predictors, against a volumetric measure, Volpara, for corresponding raw images. We contrast the three methods, CASAM-Area, CASAM-Vol and Volpara, directly and in terms of association with breast cancer risk and a known genetic variant for mammographic density and breast cancer, rs10995190 in the gene ZNF365. Associations with breast cancer risk were evaluated using images from 47 breast cancer cases and 1011 control subjects. The genetic association analysis was based on 1011 control subjects. RESULTS: All three measures of mammographic density were associated with breast cancer risk and rs10995190 (p0.10 for risk, p>0.03 for rs10995190. CONCLUSIONS: Our results show that it is possible to obtain reliable automated measures of volumetric and area mammographic density from processed digital images. Area and volumetric measures of density on processed digital images performed similarly in terms of risk and genetic association.

  9. A combined approach for the enhancement and segmentation of mammograms using modified fuzzy C-means method in wavelet domain

    OpenAIRE

    Srivastava, Subodh; Sharma, Neeraj; Singh, S. K.; Srivastava, R.

    2014-01-01

    In this paper, a combined approach for enhancement and segmentation of mammograms is proposed. In the preprocessing stage, a contrast limited adaptive histogram equalization (CLAHE) method is applied to obtain better-contrast mammograms. After this, the proposed combined methods are applied. In the first step of the proposed approach, a two-dimensional (2D) discrete wavelet transform (DWT) is applied to all the input images. In the second step, a proposed nonlinear complex diffusion based uns...

  10. Simultaneous detection and classification of breast masses in digital mammograms via a deep learning YOLO-based CAD system.

    Science.gov (United States)

    Al-Masni, Mohammed A; Al-Antari, Mugahed A; Park, Jeong-Min; Gi, Geon; Kim, Tae-Yeon; Rivera, Patricio; Valarezo, Edwin; Choi, Mun-Taek; Han, Seung-Moo; Kim, Tae-Seong

    2018-04-01

    Automatic detection and classification of the masses in mammograms are still a big challenge and play a crucial role to assist radiologists for accurate diagnosis. In this paper, we propose a novel Computer-Aided Diagnosis (CAD) system based on one of the regional deep learning techniques, a ROI-based Convolutional Neural Network (CNN) which is called You Only Look Once (YOLO). Although most previous studies only deal with classification of masses, our proposed YOLO-based CAD system can handle detection and classification simultaneously in one framework. The proposed CAD system contains four main stages: preprocessing of mammograms, feature extraction utilizing deep convolutional networks, mass detection with confidence, and finally mass classification using Fully Connected Neural Networks (FC-NNs). In this study, we utilized original 600 mammograms from Digital Database for Screening Mammography (DDSM) and their augmented mammograms of 2,400 with the information of the masses and their types in training and testing our CAD. The trained YOLO-based CAD system detects the masses and then classifies their types into benign or malignant. Our results with five-fold cross validation tests show that the proposed CAD system detects the mass location with an overall accuracy of 99.7%. The system also distinguishes between benign and malignant lesions with an overall accuracy of 97%. Our proposed system even works on some challenging breast cancer cases where the masses exist over the pectoral muscles or dense regions. Copyright © 2018 Elsevier B.V. All rights reserved.

  11. Regions of micro-calcifications clusters detection based on new features from imbalance data in mammograms

    Science.gov (United States)

    Wang, Keju; Dong, Min; Yang, Zhen; Guo, Yanan; Ma, Yide

    2017-02-01

    Breast cancer is the most common cancer among women. Micro-calcification clusters on X-ray mammograms are among the most important abnormalities, and their detection is effective for early cancer detection. The Surrounding Region Dependence Method (SRDM), a statistical texture analysis method, is applied for detecting Regions of Interest (ROIs) containing microcalcifications. Inspired by the SRDM, we present a method that extracts gray-level and other features that are effective for predicting positive and negative regions of micro-calcification clusters in mammograms. By constructing a set of artificial images containing only micro-calcifications, we locate the suspicious calcification pixels of an SRDM matrix in the original image map. Features are extracted from these pixels for the imbalanced data, and then the repeated random subsampling method and a Random Forest (RF) classifier are used for classification. The True Positive (TP) rate and the False Positive (FP) rate reflect the quality of the result. The TP rate is 90% and the FP rate is 88.8% when the threshold q is 10. We draw the Receiver Operating Characteristic (ROC) curve, and the Area Under the ROC Curve (AUC) value reaches 0.9224. The experiment indicates that our method is effective. A novel micro-calcification cluster region detection method is developed, based on new features for imbalanced data in mammography, which may help improve the accuracy of computer-aided diagnosis of breast cancer.
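
    The imbalance-handling strategy, repeated random subsampling of the majority class followed by a Random Forest, can be sketched as below; the toy features, the class ratio and the in-sample AUC (used only for brevity) are assumptions, not the paper's SRDM features or evaluation protocol.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score

def subsample_fit_auc(X, y, n_rounds=10, seed=0):
    """Repeated random subsampling of the majority class, one way to handle the
    imbalance between MCC and non-MCC regions, scored here by in-sample ROC AUC."""
    rng = np.random.default_rng(seed)
    pos, neg = np.where(y == 1)[0], np.where(y == 0)[0]
    aucs = []
    for _ in range(n_rounds):
        keep = np.concatenate([pos, rng.choice(neg, size=len(pos), replace=False)])
        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        clf.fit(X[keep], y[keep])
        aucs.append(roc_auc_score(y, clf.predict_proba(X)[:, 1]))
    return float(np.mean(aucs))

rng = np.random.default_rng(7)
X = rng.normal(size=(300, 10))
y = (rng.random(300) < 0.1).astype(int)       # roughly 10% positives: imbalanced
print("mean AUC over subsampling rounds:", subsample_fit_auc(X, y))
```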

  12. Attitudes of women in their forties toward the 2009 USPSTF mammogram guidelines: a randomized trial on the effects of media exposure.

    Science.gov (United States)

    Davidson, AuTumn S; Liao, Xun; Magee, B Dale

    2011-07-01

    The objective of the study was to assess women's attitudes toward the 2009 US Preventive Services Task Force mammography screening guideline changes and to evaluate the role of media in shaping opinions. The study design comprised two hundred forty-nine women, aged 39-49 years, presenting for annual examinations, who were randomized to read 1 of 2 articles and then completed a survey. Eighty-eight percent overestimated the lifetime breast cancer (BrCa) risk. Eighty-nine percent wanted yearly mammograms in their 40s. Eighty-six percent felt the changes were unsafe, and even if the changes were doctor recommended, 84% would not delay screening until age 50 years. Those with a friend/relative with BrCa were more likely to want annual mammography in their forties (92% vs 77%, P = .001) and to feel the changes were unsafe (91% vs 69%, P ≤ .0001). Participants with previous false-positive mammograms were less likely to accept a doctor-recommended screening delay until age 50 years (8% vs 21%, P = .01). Women overestimate BrCa risk. Skepticism of the new mammogram guidelines exists and is increased by exposure to negative media. Those with prior false-positive mammograms are less likely to accept changes. Copyright © 2011 Mosby, Inc. All rights reserved.

  13. A new approach to develop computer-aided detection schemes of digital mammograms

    Science.gov (United States)

    Tan, Maxine; Qian, Wei; Pu, Jiantao; Liu, Hong; Zheng, Bin

    2015-06-01

    The purpose of this study is to develop a new global mammographic image feature analysis based computer-aided detection (CAD) scheme and evaluate its performance in detecting positive screening mammography examinations. A dataset that includes images acquired from 1896 full-field digital mammography (FFDM) screening examinations was used in this study. Among them, 812 cases were positive for cancer and 1084 were negative or benign. After segmenting the breast area, a computerized scheme was applied to compute 92 global mammographic tissue density based features on each of four mammograms of the craniocaudal (CC) and mediolateral oblique (MLO) views. After adding three existing popular risk factors (woman’s age, subjectively rated mammographic density, and family breast cancer history) into the initial feature pool, we applied a sequential forward floating selection feature selection algorithm to select relevant features from the bilateral CC and MLO view images separately. The selected CC and MLO view image features were used to train two artificial neural networks (ANNs). The results were then fused by a third ANN to build a two-stage classifier to predict the likelihood of the FFDM screening examination being positive. CAD performance was tested using a ten-fold cross-validation method. The computed area under the receiver operating characteristic curve was AUC = 0.779   ±   0.025 and the odds ratio monotonically increased from 1 to 31.55 as CAD-generated detection scores increased. The study demonstrated that this new global image feature based CAD scheme had a relatively higher discriminatory power to cue the FFDM examinations with high risk of being positive, which may provide a new CAD-cueing method to assist radiologists in reading and interpreting screening mammograms.
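
    A minimal sketch of the two-stage classifier layout described above, one ANN per view fused by a third ANN, using scikit-learn MLPs on random stand-in features; feature selection, the ten-fold cross-validation and the clinical risk factors are omitted, and training the fusion network on the same data is a shortcut taken only for this illustration.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Toy stand-in for view-wise global mammographic density features per exam.
rng = np.random.default_rng(8)
X_cc, X_mlo = rng.normal(size=(300, 20)), rng.normal(size=(300, 20))
y = rng.integers(0, 2, size=300)              # positive vs negative examination

# Stage 1: one ANN per view; stage 2: a third ANN fuses the two view scores.
ann_cc = MLPClassifier(hidden_layer_sizes=(10,), max_iter=1000, random_state=0).fit(X_cc, y)
ann_mlo = MLPClassifier(hidden_layer_sizes=(10,), max_iter=1000, random_state=0).fit(X_mlo, y)
scores = np.column_stack([ann_cc.predict_proba(X_cc)[:, 1],
                          ann_mlo.predict_proba(X_mlo)[:, 1]])
fusion = MLPClassifier(hidden_layer_sizes=(4,), max_iter=1000, random_state=0).fit(scores, y)
print("fused positive-exam scores:", fusion.predict_proba(scores)[:3, 1].round(3))
```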

  14. Validation of a method for measuring the volumetric breast density from digital mammograms

    International Nuclear Information System (INIS)

    Alonzo-Proulx, O; Shen, S Z; Yaffe, M J; Packard, N; Boone, J M; Al-Mayah, A; Brock, K K

    2010-01-01

    The purpose of this study was to evaluate the performance of an algorithm used to measure the volumetric breast density (VBD) from digital mammograms. The algorithm is based on the calibration of the detector signal versus the thickness and composition of breast-equivalent phantoms. The baseline error in the density from the algorithm was found to be 1.25 ± 2.3% VBD units (PVBD) when tested against a set of calibration phantoms, of thicknesses 3-8 cm, with compositions equivalent to fibroglandular content (breast density) between 0% and 100% and under x-ray beams between 26 kVp and 32 kVp with a Rh/Rh anode/filter. The algorithm was also tested against images from a dedicated breast computed tomography (CT) scanner acquired on 26 volunteers. The CT images were segmented into regions representing adipose, fibroglandular and skin tissues, and then deformed using a finite-element algorithm to simulate the effects of compression in mammography. The mean volume, VBD and thickness of the compressed breast for these deformed images were respectively 558 cm³, 23.6% and 62 mm. The displaced CT images were then used to generate simulated digital mammograms, considering the effects of the polychromatic x-ray spectrum, the primary and scattered energy transmitted through the breast, the anti-scatter grid and the detector efficiency. The simulated mammograms were analyzed with the VBD algorithm and compared with the deformed CT volumes. With the Rh/Rh anode filter, the root mean square difference between the VBD from CT and from the algorithm was 2.6 PVBD, and a linear regression between the two gave a slope of 0.992 with an intercept of -1.4 PVBD and a correlation with R² = 0.963. The results with the Mo/Mo and Mo/Rh anode/filter were similar.

  15. Validation of a method for measuring the volumetric breast density from digital mammograms

    Energy Technology Data Exchange (ETDEWEB)

    Alonzo-Proulx, O; Shen, S Z; Yaffe, M J [Sunnybrook Health Sciences Centre, University of Toronto, Toronto, Ontario M4N 3M5 (Canada); Packard, N; Boone, J M [UC Davis Medical Center, University of California-Davis, Sacramento, CA 95817 (United States); Al-Mayah, A; Brock, K K, E-mail: oliviera@sri.utoronto.c [University Health Network, University of Toronto, Toronto, Ontario M5G 2M9 (Canada)

    2010-06-07

    The purpose of this study was to evaluate the performance of an algorithm used to measure the volumetric breast density (VBD) from digital mammograms. The algorithm is based on the calibration of the detector signal versus the thickness and composition of breast-equivalent phantoms. The baseline error in the density from the algorithm was found to be 1.25 ± 2.3% VBD units (PVBD) when tested against a set of calibration phantoms, of thicknesses 3-8 cm, with compositions equivalent to fibroglandular content (breast density) between 0% and 100% and under x-ray beams between 26 kVp and 32 kVp with a Rh/Rh anode/filter. The algorithm was also tested against images from a dedicated breast computed tomography (CT) scanner acquired on 26 volunteers. The CT images were segmented into regions representing adipose, fibroglandular and skin tissues, and then deformed using a finite-element algorithm to simulate the effects of compression in mammography. The mean volume, VBD and thickness of the compressed breast for these deformed images were respectively 558 cm³, 23.6% and 62 mm. The displaced CT images were then used to generate simulated digital mammograms, considering the effects of the polychromatic x-ray spectrum, the primary and scattered energy transmitted through the breast, the anti-scatter grid and the detector efficiency. The simulated mammograms were analyzed with the VBD algorithm and compared with the deformed CT volumes. With the Rh/Rh anode filter, the root mean square difference between the VBD from CT and from the algorithm was 2.6 PVBD, and a linear regression between the two gave a slope of 0.992 with an intercept of -1.4 PVBD and a correlation with R² = 0.963. The results with the Mo/Mo and Mo/Rh anode/filter were similar.

  16. iPixel: a visual content-based and semantic search engine for retrieving digitized mammograms by using collective intelligence.

    Science.gov (United States)

    Alor-Hernández, Giner; Pérez-Gallardo, Yuliana; Posada-Gómez, Rubén; Cortes-Robles, Guillermo; Rodríguez-González, Alejandro; Aguilar-Laserre, Alberto A

    2012-09-01

    Nowadays, traditional search engines such as Google, Yahoo and Bing facilitate the retrieval of information in the format of images, but the results are not always useful for the users. This is mainly due to two problems: (1) the semantic keywords are not taken into consideration and (2) it is not always possible to establish a query using the image features. This issue has been covered in different domains in order to develop content-based image retrieval (CBIR) systems. The expert community has focussed their attention on the healthcare domain, where a lot of visual information for medical analysis is available. This paper provides a solution called iPixel Visual Search Engine, which involves semantics and content issues in order to search for digitized mammograms. iPixel offers the possibility of retrieving mammogram features using collective intelligence and implementing a CBIR algorithm. Our proposal compares not only features with similar semantic meaning, but also visual features. In this sense, the comparisons are made in different ways: by the number of regions per image, by maximum and minimum size of regions per image and by average intensity level of each region. iPixel Visual Search Engine supports the medical community in differential diagnoses related to the diseases of the breast. The iPixel Visual Search Engine has been validated by experts in the healthcare domain, such as radiologists, in addition to experts in digital image analysis.

  17. Psychological distress, social withdrawal, and coping following receipt of an abnormal mammogram among different ethnicities: a mediation model.

    Science.gov (United States)

    Molina, Yamile; Beresford, Shirley A A; Espinoza, Noah; Thompson, Beti

    2014-09-01

    To explore ethnic differences in psychological distress and social withdrawal after receiving an abnormal mammogram result and to assess if coping strategies mediate ethnic differences. Descriptive correlational. Two urban mobile mammography units and a rural community hospital in the state of Washington. 41 Latina and 41 non-Latina Caucasian (NLC) women who had received an abnormal mammogram result. Women completed standard sociodemographic questions, Impact of Event Scale-Revised, the social dimension of the Psychological Consequences Questionnaire, and the Brief COPE. Ethnicity, psychological distress, social withdrawal, and coping. Latinas experienced greater psychological distress and social withdrawal compared to NLC counterparts. Denial as a coping strategy mediated ethnic differences in psychological distress. Religious coping mediated ethnic differences in social withdrawal. Larger population-based studies are necessary to understand how ethnic differences in coping strategies can influence psychological outcomes. This is an important finding that warrants additional study among women who are and are not diagnosed with breast cancer following an abnormal mammogram. Nurses may be able to work with Latina patients to diminish denial coping and consequent distress. Nurses may be particularly effective, given cultural values concerning strong interpersonal relationships and respect for authority figures.

  18. SU-E-I-59: Investigation of the Usefulness of a Standard Deviation and Mammary Gland Density as Indexes for Mammogram Classification.

    Science.gov (United States)

    Takarabe, S; Yabuuchi, H; Morishita, J

    2012-06-01

    To investigate the usefulness of the standard deviation of pixel values in the whole mammary gland region and the percentage of the high-density mammary gland region relative to the whole mammary gland region as features for classifying mammograms into four categories based on the ACR BI-RADS breast composition. We used 36 digital mediolateral oblique view mammograms (18 patients) approved by our IRB. These images were classified into the four breast composition categories by an experienced breast radiologist, and the results of this classification were regarded as the gold standard. First, the whole mammary gland region in a breast was divided into two regions, a high-density mammary gland region and a low/iso-density mammary gland region, by using a threshold value obtained from the pixel values corresponding to the pectoral muscle region. The percentage of the high-density mammary gland region relative to the whole mammary gland region was then calculated. In addition, as a new method, the standard deviation of the pixel values in the whole mammary gland region was calculated as an index of the intermingling of mammary glands and fat. Finally, all mammograms were classified by using the combination of the high-density percentage and the standard deviation of each image. The agreement rate between our proposed method and the gold standard was 86% (31/36). This result indicates that our method has the potential to classify mammograms. The combination of the standard deviation of pixel values in the whole mammary gland region and the percentage of the high-density mammary gland region relative to the whole mammary gland region provides useful features for classifying mammograms based on the ACR BI-RADS breast composition. © 2012 American Association of Physicists in Medicine.
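
    The two features described, the standard deviation of the gland-region pixel values and the percentage of "high-density" pixels above a pectoral-muscle-derived threshold, can be sketched as follows; the use of the pectoral median as the threshold and the 12-bit toy pixel samples are assumptions made for illustration.

```python
import numpy as np

def composition_features(gland_pixels, pectoral_pixels):
    """Two features for BI-RADS composition classification: the standard
    deviation of the mammary-gland pixels and the percentage of 'high-density'
    gland pixels, thresholded at a value derived from the pectoral muscle."""
    thresh = np.percentile(pectoral_pixels, 50)      # illustrative choice
    high_density_pct = 100.0 * np.mean(gland_pixels > thresh)
    return float(np.std(gland_pixels)), high_density_pct

rng = np.random.default_rng(9)
gland = rng.integers(0, 4096, size=5000)             # toy 12-bit pixel samples
pectoral = rng.integers(1500, 3000, size=1000)
print(composition_features(gland, pectoral))
```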

  19. Segmentation of the Breast Region in Digital Mammograms and Detection of Masses

    OpenAIRE

    Armen Sahakyan; Hakop Sarukhanyan

    2012-01-01

    Mammography is the most effective procedure for early diagnosis of breast cancer. Finding an accurate and efficient breast region segmentation technique still remains a challenging problem in digital mammography. In this paper we explore an automated technique for mammogram segmentation. The proposed algorithm uses a morphological preprocessing algorithm to remove digitization noise and separate the background region from the breast profile region for further edge detection an...

  20. An abnormal screening mammogram causes more anxiety than a palpable lump in benign breast disease

    NARCIS (Netherlands)

    Keyzer-Dekker, C. M. G.; van Esch, L.; de Vries, J.; Ernst, Marloes; Nieuwenhuijzen, G. A. P.; Roukema, J. A.; van der Steeg, A. F. W.

    Being recalled for further diagnostic procedures after an abnormal screening mammogram (ASM) can evoke a high state anxiety with lowered quality of life (QoL). We examined whether these adverse psychological consequences are found in all women with benign breast disease (BBD) or are particular to

  1. Diagnostic accuracy of commercial system for computer-assisted detection (CADx) as an adjunct to interpretation of mammograms

    International Nuclear Information System (INIS)

    Menna, Sabatino; Di Virgilio, Maria Rosaria; Burke, Paolo; Frigerio, Alfonso; Boglione, Elisa; Ciccarelli, Grazia; Di Filippo, Sabato; Garretti, Licia

    2005-01-01

    Purpose. To evaluate the diagnostic accuracy of the commercial computer-aided detection (CADx) system for the reading of mammograms. Materials and methods. The study assessed the Second Look system developed and marketed by CADx Medical Systems, Montreal, Canada. Diagnostic sensitivity was evaluated by means of a retrospective study of 98 consecutive cancers detected at screening by double independent reading. The specificity and the positive predictive value (PPV) for cancer of the CADx system were prospectively evaluated on a second group of 560 consecutive mammograms of asymptomatic women not included in a screening program. The radiologist who was present during the test assessed the abnormal mammographic findings by one or more of the following diagnostic procedures: physical examination, additional mammographic detail views with or without magnification, ultrasonography, ultrasound- or mammography-guided fine needle aspiration cytology, and core biopsy. The exams first underwent conventional reading and then a second reading carried out with the aid of the CADx system. Results. The overall diagnostic sensitivity of the CADx system on the 98 screening cancers was 81.6%; in particular it was 89.3% for calcifications, 83.9% for masses and only 37.5% for architectural distortion. The CADx markings per mammogram were 4.7 on average. Identification of invasive carcinoma was independent of tumour size. In the second group of 560 mammograms, the CADx system marked all cases identified as positive by conventional reading and confirmed by biopsy (7/7), but did not permit the detection of any additional cancer. The CADx markings per exam were 4.2 on average, the specificity was 13.7% and the PPV was 0.55%, versus a 13.7% recall rate for conventional reading. CADx reading led to a 1.96% (11/560) increase in the number of women requiring further diagnostic investigation. Conclusions. The results of our study show that the diagnostic sensitivity of the CADx system is lower

  2. Computerized detection of masses on mammograms: A comparative study of two algorithms

    International Nuclear Information System (INIS)

    Tiedeu, A.; Kom, G.; Kom, M.

    2007-02-01

    In this paper, we implement and carry out the comparison of two methods of computer-aided-detection of masses on mammograms. The two algorithms basically consist of 3 steps each: segmentation, binarization and noise suppression but using different techniques for each step. A database of 60 images was used to compare the performance of the two algorithms in terms of general detection efficiency, conservation of size and shape of detected masses. (author)

  3. Is there a correlation between the presence of a spiculated mass on mammogram and luminal a subtype breast cancer?

    International Nuclear Information System (INIS)

    Liu, Song; Wu, Xiao Dong; Xu, Wen Jian; Lin, Qing; Liu, Xue Jun; Li, Ying

    2016-01-01

    To determine whether the appearance of a spiculated mass on a mammogram is associated with luminal A subtype breast cancer, and to identify the factors that may influence the presence or absence of a spiculated mass. Three hundred seventeen (317) patients who underwent image-guided or surgical biopsy between December 2014 and April 2015 were included in the study. Radiologists conducted retrospective assessments of the presence of spiculated masses according to the criteria of the Breast Imaging Reporting and Data System. We used combinations of estrogen receptor (ER), progesterone receptor (PR), human epithelial growth factor receptor 2 (HER2), and Ki67 as surrogate markers to identify molecular subtypes of breast cancer. The Pearson chi-square test was employed to measure the statistical significance of correlations. Furthermore, we built a bivariate logistic regression model to quantify the relative contribution of the factors that may influence the presence or absence of a spiculated mass. Seventy-one percent (71%) of the spiculated masses were classified as luminal A. Masses classified as luminal A were 10.3 times more likely to present as a spiculated mass on a mammogram than all other subtypes. Patients with a low Ki67 index (< 14%) and negative HER2 status were more likely than others to present with a spiculated mass on their mammograms (p < 0.001). Hormone receptor status (ER and PR), pathology grade and overall breast composition were all associated with the presence of a spiculated mass, but contributed less than Ki67 and HER2. We observed an association between the luminal A subtype of invasive breast cancer and the presence of a spiculated mass on a mammogram. It is hypothesized that lower Ki67 index and HER2 negativity may be the most significant factors in the presence of a spiculated mass.

  4. Is there a correlation between the presence of a spiculated mass on mammogram and luminal a subtype breast cancer?

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Song; Wu, Xiao Dong; Xu, Wen Jian; Lin, Qing; Liu, Xue Jun; Li, Ying [The Affiliated Hospital of Qingdao University, Qingdao (China)

    2016-11-15

    To determine whether the appearance of a spiculated mass on a mammogram is associated with luminal A subtype breast cancer, and to identify the factors that may influence the presence or absence of a spiculated mass. Three hundred seventeen (317) patients who underwent image-guided or surgical biopsy between December 2014 and April 2015 were included in the study. Radiologists conducted retrospective assessments of the presence of spiculated masses according to the criteria of the Breast Imaging Reporting and Data System. We used combinations of estrogen receptor (ER), progesterone receptor (PR), human epithelial growth factor receptor 2 (HER2), and Ki67 as surrogate markers to identify molecular subtypes of breast cancer. The Pearson chi-square test was employed to measure the statistical significance of correlations. Furthermore, we built a bivariate logistic regression model to quantify the relative contribution of the factors that may influence the presence or absence of a spiculated mass. Seventy-one percent (71%) of the spiculated masses were classified as luminal A. Masses classified as luminal A were 10.3 times more likely to present as a spiculated mass on a mammogram than all other subtypes. Patients with a low Ki67 index (< 14%) and negative HER2 status were more likely than others to present with a spiculated mass on their mammograms (p < 0.001). Hormone receptor status (ER and PR), pathology grade and overall breast composition were all associated with the presence of a spiculated mass, but contributed less than Ki67 and HER2. We observed an association between the luminal A subtype of invasive breast cancer and the presence of a spiculated mass on a mammogram. It is hypothesized that lower Ki67 index and HER2 negativity may be the most significant factors in the presence of a spiculated mass.

  5. A Probabilistic Approach for Breast Boundary Extraction in Mammograms

    Directory of Open Access Journals (Sweden)

    Hamed Habibi Aghdam

    2013-01-01

    The extraction of the breast boundary is crucial for further analysis of a mammogram. Methods to extract the breast boundary can be classified into two categories: methods based on image processing techniques and those based on models. The former use image transformation techniques such as thresholding, morphological operations, and region growing. In the second category, the boundary is extracted using more advanced techniques, such as the active contour model. The problem with thresholding methods is that it is hard to find the optimal threshold value automatically from histogram information. On the other hand, active contour models require a starting point close to the actual boundary to be able to extract the boundary successfully. In this paper, we propose a probabilistic approach to address the aforementioned problems. In our approach we use local binary patterns to describe the texture around each pixel. In addition, the smoothness of the boundary is handled by using a new probability model. Experimental results show that the proposed method achieves 38% and 50% improvements over the results obtained by the active contour model and threshold-based methods, respectively, and increases the stability of the boundary extraction process by up to 86%.

  6. Can Australian radiographers assess screening mammograms accurately? First stage results from a four year prospective study

    International Nuclear Information System (INIS)

    Moran, S.; Warren-Forward, H.

    2016-01-01

    Introduction: Globally, the role of the radiographer is changing; some countries have developed advanced roles with specific scopes of practice. Other countries, like Australia, are in the process of this change. The aim of this research is to assess the diagnostic outcomes reported by the radiographers and compare them to those reported by current screen readers. Method: Six experienced radiographers were invited to participate in a prospective study conducted between 2010 and 2011. They were required to read 2000 mammograms each. Their results were compared with those of the radiologists. Statistical analysis of the results included overall cancer detection rates, recall rates, levels of agreement, kappa, sensitivity, specificity, accuracy, positive predictive value and negative predictive value. Results: A total of 9348 women were included in the study. The percentage of cancers detected by the radiographers ranged from 53% to 100% of the cancers detected by the radiologists. Radiologist recall rate ranged between 3.4% and 5.5% and the radiographers' range was 2.9%–9.8%. Level of agreement of the radiographers with the radiologists ranged from 90 to 96%. Conclusion: The potential for accuracy in screen reading by Australian radiographers is supported by the results of this study. Implementation of formal training is likely to result in an increase in the diagnostic accuracy of radiographers. - Highlights: • Radiographers prospectively read 2000 screening mammograms each. • These results support potential for accuracy in screen reading by radiographers. • Will advanced practice be introduced within BreastScreen Australia?.
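
    The agreement statistics reported in reader studies of this kind (kappa, sensitivity, specificity) can be computed as sketched below with scikit-learn; the recall decisions are hypothetical and far smaller in number than the study's 9348 screens.

```python
from sklearn.metrics import cohen_kappa_score, confusion_matrix

# Hypothetical recall decisions (1 = recall) for the same 12 screens, read by a
# radiographer and a radiologist, used to compute agreement statistics.
radiographer = [1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0]
radiologist  = [1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1]

# Treating the radiologist reading as the reference standard for this sketch.
tn, fp, fn, tp = confusion_matrix(radiologist, radiographer).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(f"kappa={cohen_kappa_score(radiologist, radiographer):.2f}, "
      f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}")
```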

  7. Automated registration of diagnostic to prediagnostic x-ray mammograms: Evaluation and comparison to radiologists' accuracy

    International Nuclear Information System (INIS)

    Pinto Pereira, Snehal M.; Hipwell, John H.; McCormack, Valerie A.; Tanner, Christine; Moss, Sue M.; Wilkinson, Louise S.; Khoo, Lisanne A. L.; Pagliari, Catriona; Skippage, Pippa L.; Kliger, Carole J.; Hawkes, David J.; Santos Silva, Isabel M. dos

    2010-01-01

    Purpose: To compare and evaluate intensity-based registration methods for computation of serial x-ray mammogram correspondence. Methods: X-ray mammograms were simulated from MRIs of 20 women using finite element methods for modeling breast compressions and employing a MRI/x-ray appearance change model. The parameter configurations of three registration methods, affine, fluid, and free-form deformation (FFD), were optimized for registering x-ray mammograms on these simulated images. Five mammography film readers independently identified landmarks (tumor, nipple, and usually two other normal features) on pairs of diagnostic and corresponding prediagnostic digitized images from 52 breast cancer cases. Landmarks were independently reidentified by each reader. Target registration errors were calculated to compare the three registration methods using the reader landmarks as a gold standard. Data were analyzed using multilevel methods. Results: Between-reader variability varied with landmark (p<0.01) and screen (p=0.03), with between-reader mean distance (mm) in point location on the diagnostic/prediagnostic images of 2.50 (95% CI 1.95, 3.15)/2.84 (2.24, 3.55) for nipples and 4.26 (3.43, 5.24)/4.76 (3.85, 5.84) for tumors. Registration accuracy was sensitive to the type of landmark and the amount of breast density. For dense breasts (≥40%), the affine and fluid methods outperformed FFD. For breasts with lower density, the affine registration surpassed both fluid and FFD. Mean accuracy (mm) of the affine registration varied between 3.16 (95% CI 2.56, 3.90) for nipple points in breasts with density 20%-39% and 5.73 (4.80, 6.84) for tumor points in breasts with density <20%. Conclusions: Affine registration accuracy was comparable to that between independent film readers. More advanced two-dimensional nonrigid registration algorithms were incapable of increasing the accuracy of image alignment when compared to affine registration.
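
    For readers unfamiliar with the evaluation measure used above, this is a brief sketch of computing a target registration error (TRE) from corresponding landmarks; the affine transform and landmark coordinates below are made-up stand-ins for an optimized registration and reader annotations.

    ```python
    # Sketch: TRE between landmarks on the prediagnostic image (mapped through a
    # registration) and the matching landmarks on the diagnostic image.
    import numpy as np

    def apply_affine(points, A, t):
        """Map Nx2 points through x' = A @ x + t."""
        return points @ A.T + t

    def target_registration_error(moving_landmarks, fixed_landmarks, A, t):
        mapped = apply_affine(moving_landmarks, A, t)
        return np.linalg.norm(mapped - fixed_landmarks, axis=1)   # error per landmark (e.g., mm)

    # toy landmarks (tumour, nipple, two normal features) in mm coordinates
    prediag = np.array([[30.0, 42.0], [55.0, 10.0], [20.0, 70.0], [60.0, 60.0]])
    diag = np.array([[31.5, 44.0], [54.0, 12.5], [22.0, 69.0], [61.0, 62.5]])
    A = np.array([[1.01, 0.02], [-0.01, 0.99]])       # hypothetical affine parameters
    t = np.array([0.8, 1.2])
    print(target_registration_error(prediag, diag, A, t))
    ```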

  8. Characteristics of Quoit filter, a digital filter developed for the extraction of circumscribed shadows, and its applications to mammograms

    International Nuclear Information System (INIS)

    Isobe, Yoshiaki; Ohkubo, Natsumi; Yamamoto, Shinji; Toriwaki, Jun-ichiro; Kobatake, Hidefumi.

    1993-01-01

    This paper presents a newly developed filter, called the Quoit filter, which detects circumscribed shadows (concentric, circular, isolated image patterns), such as typical cancer regions. The Quoit filter is based on mathematical morphology and has the following interesting properties. (1) The output of this filter can be expressed analytically when the input image is assumed to be a concentric circular model, so the output is predictable for typical inputs. (2) The filter can selectively reconstruct the original isolated models mentioned in (1) when it is applied sequentially twice. The filter was tested on the detection of cancer regions in X-ray mammograms, and for 12 cancer mammograms it achieved a true-positive cancer detection rate of 100%. (author)

  9. Hough transform for clustered microcalcifications detection in full-field digital mammograms

    Science.gov (United States)

    Fanizzi, A.; Basile, T. M. A.; Losurdo, L.; Amoroso, N.; Bellotti, R.; Bottigli, U.; Dentamaro, R.; Didonna, V.; Fausto, A.; Massafra, R.; Moschetta, M.; Tamborra, P.; Tangaro, S.; La Forgia, D.

    2017-09-01

    Many screening programs use mammography as the principal diagnostic tool for detecting breast cancer at a very early stage. Despite the efficacy of mammograms in highlighting breast diseases, the detection of some lesions remains difficult for radiologists. In particular, the extremely minute and elongated salt-like particles of microcalcifications are sometimes no larger than 0.1 mm and represent approximately half of all cancers detected by means of mammograms. Hence the need for automatic tools able to support radiologists in their work. Here, we propose a computer assisted diagnostic tool to support radiologists in identifying microcalcifications in full (native) digital mammographic images. The proposed CAD system consists of a pre-processing step, which improves contrast and reduces noise by applying the Sobel edge detection algorithm and a Gaussian filter, followed by a microcalcification detection step performed by exploiting the circular Hough transform. The procedure's performance was tested on 200 images from the Breast Cancer Digital Repository (BCDR), a publicly available database. The automatically detected clusters of microcalcifications were evaluated by skilled radiologists, who assessed the validity of the correctly identified regions of interest as well as the system error in case of missed clustered microcalcifications. The system performance was evaluated in terms of sensitivity and the false-positives-per-image (FPi) rate, and was comparable to state-of-the-art approaches. The proposed model was able to accurately predict the microcalcification clusters, obtaining performances (sensitivity = 91.78% and FPi rate = 3.99) that compare favorably to other state-of-the-art approaches.
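
    A compact sketch of the pipeline described (Gaussian smoothing and Sobel edges followed by a circular Hough transform), written with scikit-image. The radii, sigma, and the edge threshold are assumed values, not the parameters used in the paper.

    ```python
    # Sketch of the pre-processing + circular Hough detection pipeline described above.
    import numpy as np
    from skimage.filters import sobel, gaussian
    from skimage.transform import hough_circle, hough_circle_peaks

    def detect_microcalcifications(image, radii=np.arange(1, 6), sigma=1.0, n_peaks=50):
        smoothed = gaussian(image, sigma=sigma)              # noise reduction
        edges = sobel(smoothed)                              # edge map fed to the Hough transform
        strong_edges = edges > edges.mean() + 2 * edges.std()   # assumed threshold
        hough_res = hough_circle(strong_edges, radii)
        accums, cx, cy, found_radii = hough_circle_peaks(
            hough_res, radii, total_num_peaks=n_peaks)
        return np.column_stack([cy, cx, found_radii, accums])   # candidate circles

    # toy usage on a synthetic image containing a few bright salt-like spots
    rng = np.random.default_rng(0)
    img = rng.normal(0.2, 0.01, (256, 256))
    for r, c in [(40, 60), (120, 130), (200, 50)]:
        img[r - 2:r + 3, c - 2:c + 3] += 0.5
    candidates = detect_microcalcifications(img)
    ```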

  10. Reading screening mammograms - Attitudes among radiologists and radiographers about skill mix

    DEFF Research Database (Denmark)

    Johansen, Lena Westphal; Brodersen, John

    2011-01-01

    INTRODUCTION: Because of a shortage of personnel for the Danish mammography screening programme, the aim of this study was to investigate the attitudes of radiologists and radiographers towards a future implementation of radiographers reading screening mammograms. MATERIALS AND METHODS: Seven...... of managers, and improved working relations. Organization related obstacles: shortage of radiographers and negative attitudes of managers. Professional related possibilities: positive experience with skill mix. Professional related obstacles: worries about negative consequences for the training...... and financial consequences of skill mix. Despite this, all radiologists and radiographers experienced with skill mix were strong advocates for reading radiographers....

  11. Similarity estimation for reference image retrieval in mammograms using convolutional neural network

    Science.gov (United States)

    Muramatsu, Chisako; Higuchi, Shunichi; Morita, Takako; Oiwa, Mikinao; Fujita, Hiroshi

    2018-02-01

    Periodic breast cancer screening with mammography is considered effective in decreasing breast cancer mortality. For screening programs to be successful, an intelligent image analysis system may support radiologists' efficient image interpretation. In our previous studies, we have investigated image retrieval schemes for diagnostic references of breast lesions on mammograms and ultrasound images. Using a machine learning method, reliable similarity measures that agree with radiologists' similarity were determined and relevant images could be retrieved. However, our previous method includes a feature extraction step, in which hand-crafted features were determined based on manual outlines of the masses. Obtaining the manual outlines of masses is not practical in clinical practice, and such data would be operator-dependent. In this study, we investigated a similarity estimation scheme using a convolutional neural network (CNN) to skip this procedure and to determine data-driven similarity scores. By using the CNN as a feature extractor, with the extracted features employed in the determination of similarity measures by a conventional 3-layered neural network, the determined similarity measures correlated well with the subjective ratings, and the precision of retrieving diagnostically relevant images was comparable with that of the conventional method using hand-crafted features. By using the CNN to determine the similarity measure directly, the result was also comparable. By optimizing the network parameters, results may be further improved. The proposed method has potential usefulness for determining similarity measures without precise lesion outlines for the retrieval of similar mass images on mammograms.
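
    A small PyTorch sketch of the general idea of using a CNN as a feature extractor and ranking reference images by feature similarity. The ResNet-18 backbone and cosine similarity are assumptions made for illustration; they are not the network or the learned similarity measure from the paper.

    ```python
    # Sketch: CNN features + cosine similarity for ranking reference mass images.
    import torch
    import torch.nn.functional as F
    from torchvision import models

    backbone = models.resnet18(weights=None)        # pretrained weights could be loaded instead
    feature_extractor = torch.nn.Sequential(*list(backbone.children())[:-1])  # drop the FC layer
    feature_extractor.eval()

    def extract_features(batch):                    # batch: (N, 3, 224, 224) tensor of ROIs
        with torch.no_grad():
            return feature_extractor(batch).flatten(1)   # (N, 512) feature vectors

    query = torch.rand(1, 3, 224, 224)              # toy query mass ROI
    references = torch.rand(20, 3, 224, 224)        # toy reference library
    q, r = extract_features(query), extract_features(references)
    similarity = F.cosine_similarity(q, r)          # one score per reference image
    top5 = similarity.topk(5).indices               # indices of the most similar references
    ```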

  12. An automatic system to discriminate malignant from benign massive lesions on mammograms

    International Nuclear Information System (INIS)

    Retico, A.; Delogu, P.; Fantacci, M.E.; Kasae, P.

    2006-01-01

    Mammography is widely recognized as the most reliable technique for early detection of breast cancers. Automated or semi-automated computerized classification schemes can be very useful in assisting radiologists with a second opinion about the visual diagnosis of breast lesions, thus leading to a reduction in the number of unnecessary biopsies. We present a computer-aided diagnosis (CADi) system for the characterization of massive lesions in mammograms, whose aim is to distinguish malignant from benign masses. The CADi system we realized is based on a three-stage algorithm: (a) a segmentation technique extracts the contours of the massive lesion from the image; (b) 16 features based on size and shape of the lesion are computed; (c) a neural classifier merges the features into an estimated likelihood of malignancy. A data set of 226 massive lesions (109 malignant and 117 benign) has been used in this study. The system performance has been evaluated in terms of receiver-operating characteristic (ROC) analysis, obtaining Az = 0.80 ± 0.04 as the estimated area under the ROC curve.
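
    A short sketch of the evaluation step only: turning estimated likelihoods of malignancy into an ROC area (Az) with scikit-learn. The scores below are toy values, not the classifier outputs from the paper.

    ```python
    # Sketch of the ROC evaluation: likelihoods of malignancy scored against labels.
    import numpy as np
    from sklearn.metrics import roc_auc_score, roc_curve

    rng = np.random.default_rng(0)
    labels = np.r_[np.ones(109), np.zeros(117)]                           # 109 malignant, 117 benign
    scores = np.clip(labels * 0.25 + rng.normal(0.45, 0.18, 226), 0, 1)   # toy likelihoods

    az = roc_auc_score(labels, scores)                                    # area under the ROC curve
    fpr, tpr, _ = roc_curve(labels, scores)
    print(f"Az = {az:.2f}")
    ```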

  13. Automated registration of diagnostic to prediagnostic x-ray mammograms: Evaluation and comparison to radiologists' accuracy

    Energy Technology Data Exchange (ETDEWEB)

    Pinto Pereira, Snehal M.; Hipwell, John H.; McCormack, Valerie A.; Tanner, Christine; Moss, Sue M.; Wilkinson, Louise S.; Khoo, Lisanne A. L.; Pagliari, Catriona; Skippage, Pippa L.; Kliger, Carole J.; Hawkes, David J.; Santos Silva, Isabel M. dos [Cancer Research UK Epidemiology and Genetics Group, London School of Hygiene and Tropical Medicine, Keppel Street, London WC1E 7HT (United Kingdom); Centre for Medical Image Computing, University College London, London WC1E 6BT (United Kingdom); Lifestyle and Cancer Group, International Agency for Research on Cancer, 150 cours Albert Thomas, Lyon 69008 (France); Centre for Medical Image Computing, University College London, London WC1E 6BT (United Kingdom); Cancer Screening Evaluation Unit, Institute of Cancer Research, Surrey SM2 5NG (United Kingdom); St. George' s Healthcare NHS Trust and South West London Breast Screening Service, London SW17 0QT (United Kingdom); Centre for Medical Image Computing, University College London, London WC1E 6BT (United Kingdom); Cancer Research UK Epidemiology and Genetics Group, London School of Hygiene and Tropical Medicine, Keppel Street, London WC1E 7HT (United Kingdom)

    2010-09-15

    Purpose: To compare and evaluate intensity-based registration methods for computation of serial x-ray mammogram correspondence. Methods: X-ray mammograms were simulated from MRIs of 20 women using finite element methods for modeling breast compressions and employing a MRI/x-ray appearance change model. The parameter configurations of three registration methods, affine, fluid, and free-form deformation (FFD), were optimized for registering x-ray mammograms on these simulated images. Five mammography film readers independently identified landmarks (tumor, nipple, and usually two other normal features) on pairs of diagnostic and corresponding prediagnostic digitized images from 52 breast cancer cases. Landmarks were independently reidentified by each reader. Target registration errors were calculated to compare the three registration methods using the reader landmarks as a gold standard. Data were analyzed using multilevel methods. Results: Between-reader variability varied with landmark (p<0.01) and screen (p=0.03), with between-reader mean distance (mm) in point location on the diagnostic/prediagnostic images of 2.50 (95% CI 1.95, 3.15)/2.84 (2.24, 3.55) for nipples and 4.26 (3.43, 5.24)/4.76 (3.85, 5.84) for tumors. Registration accuracy was sensitive to the type of landmark and the amount of breast density. For dense breasts (≥40%), the affine and fluid methods outperformed FFD. For breasts with lower density, the affine registration surpassed both fluid and FFD. Mean accuracy (mm) of the affine registration varied between 3.16 (95% CI 2.56, 3.90) for nipple points in breasts with density 20%-39% and 5.73 (4.80, 6.84) for tumor points in breasts with density <20%. Conclusions: Affine registration accuracy was comparable to that between independent film readers. More advanced two-dimensional nonrigid registration algorithms were incapable of increasing the accuracy of image alignment when compared to affine registration.

  14. Avoidable surgical consultations in women with a positive screening mammogram: Experience from a southern region of the Dutch breast screening programme

    International Nuclear Information System (INIS)

    Schreutelkamp, J.L.; Kwee, R.M.; Booij, M. de; Adriaensen, M.E.A.P.M.

    2014-01-01

    Introduction: According to current Dutch guidelines, all women with a positive screening mammogram are referred for a full hospital assessment, which includes surgical consultation and radiological assessment. Surgical consultation may be unnecessary for many patients. Our objective was to determine how often surgical consultations can be avoided by radiological pre-assessment. Materials and methods: All women with a positive screening mammogram, referred to our radiology department between 2002 and 2007, were included (n = 1014). The percentage of women that were downstaged to BI-RADS category 1 or 2 by radiological pre-assessment was calculated. The negative predictive value (NPV) for malignancy was estimated from the in-hospital follow-up, which was available up to September 2012. Results: 423 of 1014 women (42%) were downstaged to BI-RADS category 1 or 2 by radiological pre-assessment. During follow-up, 8 of these 423 women (2%) developed a malignancy in the same breast. At least 6 of these malignancies were located at a different location than the original screening findings which led to the initial referral. The estimated NPV for malignancy was 99.5% (95%CI, 98.3–99.9). Conclusion: By referring women with a positive screening mammogram to the radiology department for pre-assessment, a surgical consultation was avoided in 42%, with an estimated NPV of 99.5% for malignancy.

  15. Abnormality detection of mammograms by discriminative dictionary learning on DSIFT descriptors.

    Science.gov (United States)

    Tavakoli, Nasrin; Karimi, Maryam; Nejati, Mansour; Karimi, Nader; Reza Soroushmehr, S M; Samavi, Shadrokh; Najarian, Kayvan

    2017-07-01

    Detection and classification of breast lesions using mammographic images is one of the most difficult tasks in medical image processing. A number of learning and non-learning methods have been proposed for detecting and classifying these lesions. However, the accuracy of the detection/classification still needs improvement. In this paper we propose a powerful classification method based on sparse learning to diagnose breast cancer in mammograms. For this purpose, a supervised discriminative dictionary learning approach is applied on dense scale invariant feature transform (DSIFT) features. A linear classifier is also simultaneously learned with the dictionary, which can effectively classify the sparse representations. Our experimental results show the superior performance of our method compared to existing approaches.
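
    A rough sketch of sparse-coding classification in the spirit of the paper, with two stated substitutions: scikit-learn's unsupervised MiniBatchDictionaryLearning stands in for the supervised discriminative dictionary, and the DSIFT descriptors are assumed to be already pooled into one vector per image (toy random data below).

    ```python
    # Rough sketch: dictionary learning + linear classifier on sparse codes.
    import numpy as np
    from sklearn.decomposition import MiniBatchDictionaryLearning
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    X = rng.random((200, 128))                 # toy "DSIFT" descriptors, one vector per image
    y = rng.integers(0, 2, 200)                # benign / malignant labels (toy)

    dico = MiniBatchDictionaryLearning(n_components=64, alpha=1.0,
                                       transform_algorithm="omp",
                                       transform_n_nonzero_coefs=5, random_state=0)
    codes = dico.fit(X).transform(X)           # sparse representations of the images
    clf = LogisticRegression(max_iter=1000).fit(codes, y)   # linear classifier on the codes
    ```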

  16. A deep learning approach for the analysis of masses in mammograms with minimal user intervention.

    Science.gov (United States)

    Dhungel, Neeraj; Carneiro, Gustavo; Bradley, Andrew P

    2017-04-01

    We present an integrated methodology for detecting, segmenting and classifying breast masses from mammograms with minimal user intervention. This is a long-standing problem due to the low signal-to-noise ratio in the visualisation of breast masses, combined with their large variability in terms of shape, size, appearance and location. We break the problem down into three stages: mass detection, mass segmentation, and mass classification. For the detection, we propose a cascade of deep learning methods to select hypotheses that are refined based on Bayesian optimisation. For the segmentation, we propose the use of deep structured output learning that is subsequently refined by a level set method. Finally, for the classification, we propose the use of a deep learning classifier, which is pre-trained with a regression to hand-crafted feature values and fine-tuned based on the annotations of the breast mass classification dataset. We test our proposed system on the publicly available INbreast dataset and compare the results with the current state-of-the-art methodologies. This evaluation shows that our system detects 90% of masses at 1 false positive per image, has a segmentation accuracy of around 0.85 (Dice index) on the correctly detected masses, and overall classifies masses as malignant or benign with sensitivity (Se) of 0.98 and specificity (Sp) of 0.7. Copyright © 2017 Elsevier B.V. All rights reserved.
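
    For reference, a short sketch of the evaluation measures quoted above: the Dice index for segmentation overlap and the sensitivity/specificity pair for the malignant-vs-benign decision. The masks and labels are toy arrays.

    ```python
    # Sketch of the Dice index and sensitivity/specificity on toy data.
    import numpy as np

    def dice(pred_mask, true_mask):
        pred, true = pred_mask.astype(bool), true_mask.astype(bool)
        inter = np.logical_and(pred, true).sum()
        return 2.0 * inter / (pred.sum() + true.sum())

    # toy masks: two overlapping squares
    a = np.zeros((64, 64), bool); a[10:40, 10:40] = True
    b = np.zeros((64, 64), bool); b[15:45, 15:45] = True
    print(f"Dice = {dice(a, b):.2f}")

    def sensitivity_specificity(y_true, y_pred):
        y_true, y_pred = np.asarray(y_true, bool), np.asarray(y_pred, bool)
        tp = np.sum(y_true & y_pred); fn = np.sum(y_true & ~y_pred)
        tn = np.sum(~y_true & ~y_pred); fp = np.sum(~y_true & y_pred)
        return tp / (tp + fn), tn / (tn + fp)
    ```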

  17. Parameter optimization of a computer-aided diagnosis scheme for the segmentation of microcalcification clusters in mammograms

    International Nuclear Information System (INIS)

    Gavrielides, Marios A.; Lo, Joseph Y.; Floyd, Carey E. Jr.

    2002-01-01

    Our purpose in this study is to develop a parameter optimization technique for the segmentation of suspicious microcalcification clusters in digitized mammograms. In previous work, a computer-aided diagnosis (CAD) scheme was developed that used local histogram analysis of overlapping subimages and a fuzzy rule-based classifier to segment individual microcalcifications, and clustering analysis for reducing the number of false positive clusters. The performance of this previous CAD scheme depended on a large number of parameters such as the intervals used to calculate fuzzy membership values and on the combination of membership values used by each decision rule. These parameters were optimized empirically based on the performance of the algorithm on the training set. In order to overcome the limitations of manual training and rule generation, the segmentation algorithm was modified in order to incorporate automatic parameter optimization. For the segmentation of individual microcalcifications, the new algorithm used a neural network with fuzzy-scaled inputs. The fuzzy-scaled inputs were created by processing the histogram features with a family of membership functions, the parameters of which were automatically extracted from the distribution of the feature values. The neural network was trained to classify feature vectors as either positive or negative. Individual microcalcifications were segmented from positive subimages. After clustering, another neural network was trained to eliminate false positive clusters. A database of 98 images provided training and testing sets to optimize the parameters and evaluate the CAD scheme, respectively. The performance of the algorithm was evaluated with a FROC analysis. At a sensitivity rate of 93.2%, there was an average of 0.8 false positive clusters per image. The results are very comparable with those taken using our previously published rule-based method. However, the new algorithm is more suited to generalize its

  18. Analysis of contrast and absorbed doses in mammography

    International Nuclear Information System (INIS)

    Augusto, F.M.; Ghilardi Netto, T.; Subtil, L.J.; Silva, R. da

    2001-01-01

    Breast cancer is one of the leading causes of mortality among women worldwide. Mammography is the most efficient method for detecting some breast cancers before they become clinically apparent. The quality of an imaging system is determined by its ability to detect soft-tissue masses, cysts or tumors, as well as calcifications, and this detection is directly related to the contrast obtained in the images. The objective of this work is to develop a method for analyzing the contrast in mammograms, verifying the doses associated with these mammograms and comparing them with national and international reference levels. (author)

  19. Genetic Fuzzy System (GFS) based wavelet co-occurrence feature selection in mammogram classification for breast cancer diagnosis

    Directory of Open Access Journals (Sweden)

    Meenakshi M. Pawar

    2016-09-01

    Full Text Available Breast cancer is a significant health problem diagnosed mostly in women worldwide. Early detection of breast cancer, performed with the help of digital mammography, can therefore reduce the mortality rate. This paper presents a wrapper-based feature selection approach for wavelet co-occurrence features (WCF) using a Genetic Fuzzy System (GFS) in the mammogram classification problem. The performance of the GFS algorithm is demonstrated using the mini-MIAS database. WCF features are obtained from the detail wavelet coefficients at each level of decomposition of the mammogram image. At the first level of decomposition, 18 features are applied to the GFS algorithm, which selects 5 features with an average classification success rate of 39.64%. Subsequently, at the second level it selects 9 features from 36 features, and the classification success rate improves to 56.75%. At the third level, 16 features are selected from 54 features, and the average success rate improves to 64.98%. Lastly, at the fourth level, 72 features are applied to the GFS, which selects 16 features, thereby increasing the average success rate to 89.47%. Hence, the GFS algorithm is an effective way of obtaining an optimal feature set for breast cancer diagnosis.
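
    A sketch of extracting co-occurrence features from the wavelet detail sub-bands level by level, using PyWavelets and scikit-image. The wavelet choice, quantization, and GLCM settings are assumptions, and the genetic-fuzzy selection step itself is not reproduced here.

    ```python
    # Sketch: wavelet co-occurrence features (WCF) from detail sub-bands per level.
    import numpy as np
    import pywt
    from skimage.feature import graycomatrix, graycoprops   # 'greycomatrix' in older skimage

    def wavelet_cooccurrence_features(image, wavelet="db4", levels=4,
                                      props=("contrast", "energy", "homogeneity")):
        coeffs = pywt.wavedec2(image, wavelet, level=levels)
        features = []
        for level_details in coeffs[1:]:                     # (cH, cV, cD) per level
            for band in level_details:
                q = np.uint8(255 * (band - band.min()) / (np.ptp(band) + 1e-9))  # 8-bit quantization
                glcm = graycomatrix(q, distances=[1], angles=[0], levels=256,
                                    symmetric=True, normed=True)
                features.extend(graycoprops(glcm, p)[0, 0] for p in props)
        return np.asarray(features)

    rng = np.random.default_rng(0)
    feats = wavelet_cooccurrence_features(rng.random((256, 256)))   # toy image stand-in
    ```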

  20. Analysis of the effect of spatial resolution on texture features in the classification of breast masses in mammograms

    International Nuclear Information System (INIS)

    Rangayyan, R.M.; Nguyen, T.M.; Ayres, F.J.; Nandi, A.K.

    2007-01-01

    The present study investigates the effect of spatial resolution on co-occurrence matrix-based texture features in discriminating breast lesions as benign masses or malignant tumors. The highest classification result, in terms of the area under the receiver operating characteristics (ROC) curve, of Az = 0.74, was obtained at the spatial resolution of 800 μm using all 14 of Haralick's texture features computed using the margins, or ribbons, of the breast masses as seen on mammograms. Furthermore, our study indicates that texture features computed using the ribbons resulted in higher classification accuracy than the same texture features computed using the corresponding regions of interest within the mass boundaries drawn by an expert radiologist. Classification experiments using each single texture feature showed that the texture feature F8, sum entropy, gives consistently high classification results with an average Az of 0.64 across all levels of resolution. At certain levels of resolution, the texture features F5, F9, and F11 individually gave the highest classification result with Az = 0.70. (orig.)

  1. Study on Compression Induced Contrast in X-ray Mammograms Using Breast Mimicking Phantoms

    Directory of Open Access Journals (Sweden)

    A. B. M. Aowlad Hossain

    2015-09-01

    Full Text Available X-ray mammography is commonly used to scan for cancers or tumors in the breast using low-dose x-rays, but mammograms suffer from a low-contrast problem. The breast is compressed in mammography to reduce x-ray scattering effects. As tumors are stiffer than normal tissues, they undergo smaller deformation under compression; therefore, the image intensity in the tumor region may change less than in the background tissues. In this study, we try to find compression-induced contrast from multiple mammographic images of tumorous breast phantoms taken under different compressions. This work extends our previous simulation study with an experiment and further analysis. We used FEM models for a synthetic phantom and constructed a physical phantom from agar and n-propanol for the simulation and experiment. The x-ray images of the deformed phantoms were obtained under three compression steps, and a non-rigid registration technique was applied to register these images. The image intensity changes at the tumor are noticeably smaller than those in the surrounding tissue, which induces a detectable contrast. Adding this compression-induced contrast to the simulated and experimental images improved their original contrast by a factor of about 1.4.

  2. Automatic detection of anomalies in screening mammograms

    Science.gov (United States)

    2013-01-01

    Background Diagnostic performance in breast screening programs may be influenced by the prior probability of disease. Since breast cancer incidence is roughly half a percent in the general population, there is a large probability that the screening exam will be normal. That factor may contribute to false negatives. Screening programs typically exhibit about 83% sensitivity and 91% specificity. This investigation was undertaken to determine if a system could be developed to pre-sort screening images into normal and suspicious bins based on their likelihood of containing disease. Wavelets were investigated as a method to parse the image data, potentially removing confounding information. The development of a classification system based on features extracted from wavelet transformed mammograms is reported. Methods In the multi-step procedure, images were processed using 2D discrete wavelet transforms to create a set of maps at different size scales. Next, statistical features were computed from each map, and a subset of these features was the input for a concerted-effort set of naïve Bayesian classifiers. The classifier network was constructed to calculate the probability that the parent mammography image contained an abnormality. The abnormalities were not identified, nor were they regionalized. The algorithm was tested on two publicly available databases: the Digital Database for Screening Mammography (DDSM) and the Mammographic Image Analysis Society's database (MIAS). These databases contain radiologist-verified images and feature common abnormalities including: spiculations, masses, geometric deformations and fibroid tissues. Results The classifier-network designs tested achieved sensitivities and specificities sufficient to be potentially useful in a clinical setting. This first series of tests identified networks with 100% sensitivity and up to 79% specificity for abnormalities. This performance significantly exceeds the mean sensitivity reported in the literature.
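
    A condensed sketch of the pipeline described: multi-scale 2D DWT maps, simple statistics per map, and a naive Bayes classifier producing the probability that a mammogram is suspicious. The chosen wavelet and the particular statistics are assumptions, and the toy arrays stand in for real mammograms.

    ```python
    # Sketch: DWT-based statistical features feeding a naive Bayes classifier.
    import numpy as np
    import pywt
    from sklearn.naive_bayes import GaussianNB

    def dwt_statistics(image, wavelet="haar", levels=3):
        feats = []
        coeffs = pywt.wavedec2(image, wavelet, level=levels)
        for maps in coeffs[1:]:                       # detail maps at each size scale
            for m in maps:
                feats += [m.mean(), m.std(), np.abs(m).mean(), (m ** 2).mean()]
        return np.asarray(feats)

    rng = np.random.default_rng(0)
    images = rng.random((40, 128, 128))               # toy stand-ins for mammograms
    labels = rng.integers(0, 2, 40)                   # 1 = contains an abnormality (toy)
    X = np.vstack([dwt_statistics(im) for im in images])
    clf = GaussianNB().fit(X, labels)
    p_abnormal = clf.predict_proba(X)[:, 1]           # probability the image is suspicious
    ```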

  3. Diagnostic abilities of three CAD methods for assessing microcalcifications in mammograms and an aspect of equivocal cases decisions by radiologists

    International Nuclear Information System (INIS)

    Hung, W.T.; Nguyen, H.T.; Thornton, B.S.; Rickard, M.T.; Blinowska, A.

    2003-01-01

    Radiologists use an 'Overall impression' rating to assess a suspicious region on a mammogram. The value ranges from 1 to 5. They will definitely send a patient for biopsy if the rating is 4 or 5. They will send the patient for core biopsy when a rating of 3 (indeterminate) is given. We have developed three methods to aid diagnosis of cases with microcalcifications. The first two methods, namely, Bayesian and multiple logistic regression (with a special 'cutting score' technique), utilise six parameter ratings which minimise subjectivity in characterising the microcalcifications. The third method uses three parameters (age of patient, uniformity of size of microcalcification and their distribution) in a multiple stepwise regression. For both training set and test set, all three methods are as good as the two radiologists in terms of percentages of correct classification. Therefore, all three proposed methods potentially can be used as second readers. Copyright (2003) Australasian College of Physical Scientists and Engineers in Medicine

  4. Block-based wavelet transform coding of mammograms with region-adaptive quantization

    Science.gov (United States)

    Moon, Nam Su; Song, Jun S.; Kwon, Musik; Kim, JongHyo; Lee, ChoongWoong

    1998-06-01

    To achieve both a high compression ratio and information preservation, it is efficient to combine segmentation with a lossy compression scheme. Microcalcification in mammograms is one of the most significant signs of early-stage breast cancer. Therefore, in coding, detection and segmentation of microcalcifications enable us to preserve them well by allocating more bits to them than to other regions. Segmentation of microcalcification is performed both in the spatial domain and in the wavelet transform domain. A peak-error-controllable quantization step, designed off-line, is suitable for medical image compression. For region-adaptive quantization, block-based wavelet transform coding is adopted and different peak-error-constrained quantizers are applied to blocks according to the segmentation result. In terms of preservation of microcalcification, the proposed coding scheme shows better performance than JPEG.

  5. Low energy mammogram obtained in contrast-enhanced digital mammography (CEDM) is comparable to routine full-field digital mammography (FFDM)

    Energy Technology Data Exchange (ETDEWEB)

    Francescone, Mark A., E-mail: maf2184@columbia.edu [Columbia University Medical Center, ColumbiaDoctors Midtown, 51 West 51st Street, Suite 300, New York, NY 10019 (United States); Jochelson, Maxine S., E-mail: jochelsm@mskcc.org [Memorial Sloan-Kettering Cancer Center, 1275 York Avenue, New York, NY 10065 (United States); Dershaw, D. David, E-mail: dershawd@mskcc.org [Memorial Sloan-Kettering Cancer Center, 1275 York Avenue, New York, NY 10065 (United States); Sung, Janice S., E-mail: sungj@mskcc.org [Memorial Sloan-Kettering Cancer Center, 1275 York Avenue, New York, NY 10065 (United States); Hughes, Mary C., E-mail: hughesm@mskcc.org [Memorial Sloan-Kettering Cancer Center, 1275 York Avenue, New York, NY 10065 (United States); Zheng, Junting, E-mail: zhengj@mskcc.org [Memorial Sloan-Kettering Cancer Center, 1275 York Avenue, New York, NY 10065 (United States); Moskowitz, Chaya, E-mail: moskowc1@mskcc.org [Memorial Sloan-Kettering Cancer Center, 1275 York Avenue, New York, NY 10065 (United States); Morris, Elizabeth A., E-mail: morrise@mskcc.org [Memorial Sloan-Kettering Cancer Center, 1275 York Avenue, New York, NY 10065 (United States)

    2014-08-15

    Purpose: Contrast enhanced digital mammography (CEDM) uses low energy and high energy exposures to produce a subtracted contrast image. It is currently performed with a standard full-field digital mammogram (FFDM). The purpose is to determine if the low energy image performed after intravenous iodine injection can replace the standard FFDM. Methods and Materials: In an IRB-approved, HIPAA-compliant study, low-energy CEDM images of 170 breasts in 88 women (ages 26–75; mean 50.3) undergoing evaluation for elevated risk or newly diagnosed breast cancer were compared to standard digital mammograms performed within 6 months. Technical parameters including posterior nipple line (PNL) distance, compression thickness, and compression force on the MLO projection were compared. Mammographic findings were compared qualitatively and quantitatively. Mixed linear regression using the generalized estimating equation (GEE) method was performed. Intraclass correlation coefficients (ICC) with 95% confidence interval (95%CI) were estimated to assess agreement. Results: No statistical difference was found in the technical parameters of compression thickness, PNL distance, and compression force (p-values: 0.767, 0.947, 0.089). No difference was found in the measured size of mammographic findings (p-values 0.982–0.988). Grouped calcifications had a mean size/extent of 2.1 cm (SD 0.6) in the low-energy contrast images, and a mean size/extent of 2.2 cm (SD 0.6) in the standard digital mammogram images. Masses had a mean size of 1.8 cm (SD 0.2) in both groups. Calcifications were equally visible on both CEDM and FFDM. Conclusion: Low energy CEDM images are equivalent to standard FFDM despite the presence of intravenous iodinated contrast. Low energy CEDM images may be used for interpretation in place of the FFDM, thereby reducing patient dose.

  6. Low energy mammogram obtained in contrast-enhanced digital mammography (CEDM) is comparable to routine full-field digital mammography (FFDM)

    International Nuclear Information System (INIS)

    Francescone, Mark A.; Jochelson, Maxine S.; Dershaw, D. David; Sung, Janice S.; Hughes, Mary C.; Zheng, Junting; Moskowitz, Chaya; Morris, Elizabeth A.

    2014-01-01

    Purpose: Contrast enhanced digital mammography (CEDM) uses low energy and high energy exposures to produce a subtracted contrast image. It is currently performed with a standard full-field digital mammogram (FFDM). The purpose is to determine if the low energy image performed after intravenous iodine injection can replace the standard FFDM. Methods and Materials: In an IRB-approved, HIPAA-compliant study, low-energy CEDM images of 170 breasts in 88 women (ages 26–75; mean 50.3) undergoing evaluation for elevated risk or newly diagnosed breast cancer were compared to standard digital mammograms performed within 6 months. Technical parameters including posterior nipple line (PNL) distance, compression thickness, and compression force on the MLO projection were compared. Mammographic findings were compared qualitatively and quantitatively. Mixed linear regression using the generalized estimating equation (GEE) method was performed. Intraclass correlation coefficients (ICC) with 95% confidence interval (95%CI) were estimated to assess agreement. Results: No statistical difference was found in the technical parameters of compression thickness, PNL distance, and compression force (p-values: 0.767, 0.947, 0.089). No difference was found in the measured size of mammographic findings (p-values 0.982–0.988). Grouped calcifications had a mean size/extent of 2.1 cm (SD 0.6) in the low-energy contrast images, and a mean size/extent of 2.2 cm (SD 0.6) in the standard digital mammogram images. Masses had a mean size of 1.8 cm (SD 0.2) in both groups. Calcifications were equally visible on both CEDM and FFDM. Conclusion: Low energy CEDM images are equivalent to standard FFDM despite the presence of intravenous iodinated contrast. Low energy CEDM images may be used for interpretation in place of the FFDM, thereby reducing patient dose.

  7. Disparities in abnormal mammogram follow-up time for Asian women compared to non-Hispanic Whites and between Asian ethnic groups

    Science.gov (United States)

    Nguyen, KH; Pasick, RJ; Stewart, SL; Kerlikowske, K; Karliner, LS

    2017-01-01

    Background Delays in abnormal mammogram follow-up contribute to poor outcomes. We examined abnormal screening mammogram follow-up differences for non-Hispanic White (NHW) and Asian women. Methods Prospective cohort of NHW and Asian women with a Breast Imaging Reporting and Data System abnormal result of 0 or 3+ in the San Francisco Mammography Registry between 2000–2010. We performed Kaplan-Meier estimation for median days to follow-up with a diagnostic radiologic test, and compared the proportion with follow-up at 30, 60 and 90 days, and no follow-up at one year, for Asians overall (and Asian ethnic groups) and NHWs. We additionally assessed the relationship between race/ethnicity and time to follow-up with adjusted Cox proportional hazards models. Results Among Asian women, Vietnamese and Filipinas had the longest, and Japanese the shortest, median follow-up time (32, 28, 19 days, respectively) compared to NHWs (15 days). The proportion of women receiving follow-up at 30 days was lower for Asians vs NHWs (57% vs 77%, p < 0.001) and for all Asian ethnic groups except Japanese. Asians had a reduced hazard of follow-up compared with NHWs (aHR 0.70, 95% CI 0.69–0.72). Asians also had a higher rate than NHWs of no follow-up (15% vs 10%; p < 0.001); among Asian ethnic groups, Filipinas had the highest percentage of women with no follow-up (18.1%). Conclusion Asian, particularly Filipina and Vietnamese, women were less likely than NHWs to receive timely follow-up after an abnormal screening mammogram. Research should disaggregate Asian ethnicity to better understand and address barriers to effective cancer prevention. PMID:28603859

  8. Distributed Analysis in CMS

    CERN Document Server

    Fanfani, Alessandra; Sanches, Jose Afonso; Andreeva, Julia; Bagliesi, Giusepppe; Bauerdick, Lothar; Belforte, Stefano; Bittencourt Sampaio, Patricia; Bloom, Ken; Blumenfeld, Barry; Bonacorsi, Daniele; Brew, Chris; Calloni, Marco; Cesini, Daniele; Cinquilli, Mattia; Codispoti, Giuseppe; D'Hondt, Jorgen; Dong, Liang; Dongiovanni, Danilo; Donvito, Giacinto; Dykstra, David; Edelmann, Erik; Egeland, Ricky; Elmer, Peter; Eulisse, Giulio; Evans, Dave; Fanzago, Federica; Farina, Fabio; Feichtinger, Derek; Fisk, Ian; Flix, Josep; Grandi, Claudio; Guo, Yuyi; Happonen, Kalle; Hernandez, Jose M; Huang, Chih-Hao; Kang, Kejing; Karavakis, Edward; Kasemann, Matthias; Kavka, Carlos; Khan, Akram; Kim, Bockjoo; Klem, Jukka; Koivumaki, Jesper; Kress, Thomas; Kreuzer, Peter; Kurca, Tibor; Kuznetsov, Valentin; Lacaprara, Stefano; Lassila-Perini, Kati; Letts, James; Linden, Tomas; Lueking, Lee; Maes, Joris; Magini, Nicolo; Maier, Gerhild; McBride, Patricia; Metson, Simon; Miccio, Vincenzo; Padhi, Sanjay; Pi, Haifeng; Riahi, Hassen; Riley, Daniel; Rossman, Paul; Saiz, Pablo; Sartirana, Andrea; Sciaba, Andrea; Sekhri, Vijay; Spiga, Daniele; Tuura, Lassi; Vaandering, Eric; Vanelderen, Lukas; Van Mulders, Petra; Vedaee, Aresh; Villella, Ilaria; Wicklund, Eric; Wildish, Tony; Wissing, Christoph; Wurthwein, Frank

    2009-01-01

    The CMS experiment expects to manage several Pbytes of data each year during the LHC programme, distributing them over many computing sites around the world and enabling data access at those centers for analysis. CMS has identified the distributed sites as the primary location for physics analysis, to support a wide community with thousands of potential users. This represents an unprecedented experimental challenge in terms of the scale of distributed computing resources and the number of users. An overview of the computing architecture, the software tools and the distributed infrastructure is reported. Summaries of the experience in establishing efficient and scalable operations in preparation for CMS distributed analysis are presented, followed by the user experience in their current analysis activities.

  9. Quantifying the effect of colorization enhancement on mammogram images

    Science.gov (United States)

    Wojnicki, Paul J.; Uyeda, Elizabeth; Micheli-Tzanakou, Evangelia

    2002-04-01

    Current methods of radiological display provide only grayscale images of mammograms. The limitation of the image space to grayscale provides only luminance differences and textures as cues for object recognition within the image. However, color can be an important and significant cue in the detection of shapes and objects. Increasing detection ability allows the radiologist to interpret the images in more detail, improving object recognition and diagnostic accuracy. Color detection experiments using our stimulus system have demonstrated that an observer can only detect an average of 140 levels of grayscale. An optimally colorized image can allow a user to distinguish 250 - 1000 different levels, hence increasing potential image feature detection by 2-7 times. By implementing a colorization map, which follows the luminance map of the original grayscale images, the luminance profile is preserved and color is isolated as the enhancement mechanism. The effect of this enhancement mechanism on the shape, frequency composition and statistical characteristics of the Visual Evoked Potential (VEP) is analyzed and presented. Thus, the effectiveness of the image colorization is measured quantitatively using the Visual Evoked Potential (VEP).

  10. Computerized nipple identification for multiple image analysis in computer-aided diagnosis

    International Nuclear Information System (INIS)

    Zhou Chuan; Chan Heangping; Paramagul, Chintana; Roubidoux, Marilyn A.; Sahiner, Berkman; Hadjiiski, Lubomir M.; Petrick, Nicholas

    2004-01-01

    Correlation of information from multiple-view mammograms (e.g., MLO and CC views, bilateral views, or current and prior mammograms) can improve the performance of breast cancer diagnosis by radiologists or by computer. The nipple is a reliable and stable landmark on mammograms for the registration of multiple mammograms. However, accurate identification of nipple location on mammograms is challenging because of the variations in image quality and in the nipple projections, resulting in some nipples being nearly invisible on the mammograms. In this study, we developed a computerized method to automatically identify the nipple location on digitized mammograms. First, the breast boundary was obtained using a gradient-based boundary tracking algorithm, and then the gray level profiles along the inside and outside of the boundary were identified. A geometric convergence analysis was used to limit the nipple search to a region of the breast boundary. A two-stage nipple detection method was developed to identify the nipple location using the gray level information around the nipple, the geometric characteristics of nipple shapes, and the texture features of glandular tissue or ducts which converge toward the nipple. At the first stage, a rule-based method was designed to identify the nipple location by detecting significant changes of intensity along the gray level profiles inside and outside the breast boundary and the changes in the boundary direction. At the second stage, a texture orientation-field analysis was developed to estimate the nipple location based on the convergence of the texture pattern of glandular tissue or ducts towards the nipple. The nipple location was finally determined from the detected nipple candidates by a rule-based confidence analysis. In this study, 377 and 367 randomly selected digitized mammograms were used for training and testing the nipple detection algorithm, respectively. Two experienced radiologists identified the nipple locations

  11. Distributed analysis at LHCb

    International Nuclear Information System (INIS)

    Williams, Mike; Egede, Ulrik; Paterson, Stuart

    2011-01-01

    The distributed analysis experience to date at LHCb has been positive: job success rates are high and wait times for high-priority jobs are low. LHCb users access the grid using the GANGA job-management package, while the LHCb virtual organization manages its resources using the DIRAC package. This clear division of labor has benefitted LHCb and its users greatly; it is a major reason why distributed analysis at LHCb has been so successful. The newly formed LHCb distributed analysis support team has also proved to be a success.

  12. SU-E-I-58: Objective Models of Breast Shape Undergoing Mammography and Tomosynthesis Using Principal Component Analysis.

    Science.gov (United States)

    Feng, Ssj; Sechopoulos, I

    2012-06-01

    To develop an objective model of the shape of the compressed breast undergoing mammographic or tomosynthesis acquisition. Automated thresholding and edge detection were performed on 984 anonymized digital mammograms (492 craniocaudal (CC) view mammograms and 492 mediolateral oblique (MLO) view mammograms) to extract the edge of each breast. Principal Component Analysis (PCA) was performed on these edge vectors to identify a limited set of parameters and eigenvectors that describe the breast shape. These parameters and eigenvectors comprise a model that can be used to describe the breast shapes present in acquired mammograms and to generate realistic models of breasts undergoing acquisition. Sample breast shapes were then generated from this model and evaluated. The mammograms in the database were previously acquired for a separate study and authorized for use in further research. The PCA successfully identified two principal components and their corresponding eigenvectors, forming the basis for the breast shape model. The simulated breast shapes generated from the model are reasonable approximations of clinically acquired mammograms. Using PCA, we have obtained models of the compressed breast undergoing mammographic or tomosynthesis acquisition based on objective analysis of a large image database. Up to now, the breast in the CC view has been approximated as a semi-circular tube, while there has been no objectively obtained model for the MLO view breast shape. Such models can be used for various breast imaging research applications, such as x-ray scatter estimation and correction, dosimetry estimates, and computer-aided detection and diagnosis. © 2012 American Association of Physicists in Medicine.
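
    A brief sketch of the PCA idea above: fit principal components to breast edge vectors and sample new shapes from the leading components. The half-ellipse edge vectors below are synthetic placeholders for edges extracted from real mammograms.

    ```python
    # Sketch of a PCA breast-shape model built from (synthetic) edge vectors.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    n_shapes, n_points = 492, 100
    t = np.linspace(0, np.pi, n_points)
    # toy CC-view edges: half-ellipses with random radii, flattened to (x1..xn, y1..yn)
    a = 40 + 10 * rng.random(n_shapes)[:, None]
    b = 60 + 15 * rng.random(n_shapes)[:, None]
    edges = np.hstack([a * np.cos(t), b * np.sin(t)])

    pca = PCA(n_components=2).fit(edges)               # two leading shape components
    weights = rng.normal(0, 1, (5, 2)) * np.sqrt(pca.explained_variance_)
    simulated_edges = pca.inverse_transform(weights)   # five simulated breast outlines
    ```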

  13. Primary breast osteosarcoma mimicking calcified fibroadenoma on screening digital breast tomosynthesis mammogram

    Directory of Open Access Journals (Sweden)

    Debbie Lee Bennett, MD

    2017-12-01

    Full Text Available Primary breast osteosarcoma is a rare malignancy, with mostly case reports in the literature. The appearance of breast osteosarcoma on digital breast tomosynthesis imaging has not yet been described. A 69-year-old woman presented for routine screening mammography and was found to have a calcified mass in her right breast. The pattern of calcification appeared “sunburst” on digital breast tomosynthesis images. This mass was larger than on the previous year's mammogram, at which time it had been interpreted as a benign calcified fibroadenoma. The subsequent workup demonstrated the mass to be a primary breast osteosarcoma. The patient's workup and treatment are detailed in this case. Primary breast osteosarcoma, although rare, should be included as a diagnostic consideration for breast masses with a sunburst pattern of calcifications, particularly when the mammographic appearance has changed.

  14. Multi-task transfer learning deep convolutional neural network: application to computer-aided diagnosis of breast cancer on mammograms

    Science.gov (United States)

    Samala, Ravi K.; Chan, Heang-Ping; Hadjiiski, Lubomir M.; Helvie, Mark A.; Cha, Kenny H.; Richter, Caleb D.

    2017-12-01

    Transfer learning in deep convolutional neural networks (DCNNs) is an important step in its application to medical imaging tasks. We propose a multi-task transfer learning DCNN with the aim of translating the ‘knowledge’ learned from non-medical images to medical diagnostic tasks through supervised training and increasing the generalization capabilities of DCNNs by simultaneously learning auxiliary tasks. We studied this approach in an important application: classification of malignant and benign breast masses. With Institutional Review Board (IRB) approval, digitized screen-film mammograms (SFMs) and digital mammograms (DMs) were collected from our patient files and additional SFMs were obtained from the Digital Database for Screening Mammography. The data set consisted of 2242 views with 2454 masses (1057 malignant, 1397 benign). In single-task transfer learning, the DCNN was trained and tested on SFMs. In multi-task transfer learning, SFMs and DMs were used to train the DCNN, which was then tested on SFMs. N-fold cross-validation with the training set was used for training and parameter optimization. On the independent test set, the multi-task transfer learning DCNN was found to have significantly (p  =  0.007) higher performance compared to the single-task transfer learning DCNN. This study demonstrates that multi-task transfer learning may be an effective approach for training DCNN in medical imaging applications when training samples from a single modality are limited.
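
    A compact PyTorch sketch of the multi-task idea: one shared backbone with a separate classification head per task (e.g., SFM masses and DM masses), trained jointly. The ResNet-18 backbone, head sizes, and the equal loss weighting are assumptions for illustration, not the authors' architecture.

    ```python
    # Sketch: shared backbone + per-task heads trained jointly (multi-task learning).
    import torch
    import torch.nn as nn
    from torchvision import models

    class MultiTaskNet(nn.Module):
        def __init__(self, n_tasks=2):
            super().__init__()
            backbone = models.resnet18(weights=None)      # ImageNet weights could be loaded here
            self.features = nn.Sequential(*list(backbone.children())[:-1])
            self.heads = nn.ModuleList([nn.Linear(512, 2) for _ in range(n_tasks)])

        def forward(self, x, task):
            feats = self.features(x).flatten(1)
            return self.heads[task](feats)

    model = MultiTaskNet()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
    criterion = nn.CrossEntropyLoss()

    # one toy joint training step: a batch from each task contributes to the loss
    x_sfm, y_sfm = torch.rand(4, 3, 224, 224), torch.randint(0, 2, (4,))
    x_dm, y_dm = torch.rand(4, 3, 224, 224), torch.randint(0, 2, (4,))
    loss = criterion(model(x_sfm, 0), y_sfm) + criterion(model(x_dm, 1), y_dm)
    optimizer.zero_grad(); loss.backward(); optimizer.step()
    ```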

  15. Case base classification on digital mammograms: improving the performance of case base classifier

    Science.gov (United States)

    Raman, Valliappan; Then, H. H.; Sumari, Putra; Venkatesa Mohan, N.

    2011-10-01

    Breast cancer continues to be a significant public health problem in the world. Early detection is the key to improving breast cancer prognosis. The aim of the research presented here is twofold. The first stage involves machine learning techniques, which segment masses and extract features from digital mammograms. The second stage is a problem-solving approach that classifies the mass with a performance-based case-based classifier. In this paper we build a case-based classifier in order to diagnose mammographic images. We explain the different methods and behaviors that have been added to the classifier to improve its performance. An initial performance-based classifier with bagging is proposed in the paper, implemented, and shown to improve specificity and sensitivity.

  16. Modeling sequential context effects in diagnostic interpretation of screening mammograms.

    Science.gov (United States)

    Alamudun, Folami; Paulus, Paige; Yoon, Hong-Jun; Tourassi, Georgia

    2018-07-01

    Prior research has shown that physicians' medical decisions can be influenced by sequential context, particularly in cases where successive stimuli exhibit similar characteristics when analyzing medical images. This type of systematic error is known to psychophysicists as sequential context effect as it indicates that judgments are influenced by features of and decisions about the preceding case in the sequence of examined cases, rather than being based solely on the peculiarities unique to the present case. We determine if radiologists experience some form of context bias, using screening mammography as the use case. To this end, we explore correlations between previous perceptual behavior and diagnostic decisions and current decisions. We hypothesize that a radiologist's visual search pattern and diagnostic decisions in previous cases are predictive of the radiologist's current diagnostic decisions. To test our hypothesis, we tasked 10 radiologists of varied experience to conduct blind reviews of 100 four-view screening mammograms. Eye-tracking data and diagnostic decisions were collected from each radiologist under conditions mimicking clinical practice. Perceptual behavior was quantified using the fractal dimension of gaze scanpath, which was computed using the Minkowski-Bouligand box-counting method. To test the effect of previous behavior and decisions, we conducted a multifactor fixed-effects ANOVA. Further, to examine the predictive value of previous perceptual behavior and decisions, we trained and evaluated a predictive model for radiologists' current diagnostic decisions. ANOVA tests showed that previous visual behavior, characterized by fractal analysis, previous diagnostic decisions, and image characteristics of previous cases are significant predictors of current diagnostic decisions. Additionally, predictive modeling of diagnostic decisions showed an overall improvement in prediction error when the model is trained on additional information about
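
    A small numpy sketch of the Minkowski-Bouligand box-counting estimate of fractal dimension mentioned above, applied to a binarized gaze scanpath. The box sizes and the toy scanpath are assumptions made for illustration.

    ```python
    # Sketch: box-counting (Minkowski-Bouligand) fractal dimension of a scanpath image.
    import numpy as np

    def box_counting_dimension(binary_image, box_sizes=(2, 4, 8, 16, 32, 64)):
        counts = []
        for s in box_sizes:
            h, w = binary_image.shape
            hh, ww = h - h % s, w - w % s               # crop so the grid tiles evenly
            blocks = binary_image[:hh, :ww].reshape(hh // s, s, ww // s, s)
            occupied = blocks.any(axis=(1, 3)).sum()    # boxes containing part of the path
            counts.append(occupied)
        # slope of log(count) vs log(1/size) gives the fractal dimension estimate
        coeffs = np.polyfit(np.log(1.0 / np.asarray(box_sizes)), np.log(counts), 1)
        return coeffs[0]

    # toy scanpath: a jittery diagonal trace rasterized onto a 512x512 grid
    rng = np.random.default_rng(0)
    img = np.zeros((512, 512), bool)
    xs = np.clip(np.arange(512) + rng.integers(-20, 20, 512), 0, 511)
    img[np.arange(512), xs] = True
    print(f"estimated fractal dimension: {box_counting_dimension(img):.2f}")
    ```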

  17. Volumetric breast density estimation from full-field digital mammograms.

    Science.gov (United States)

    van Engeland, Saskia; Snoeren, Peter R; Huisman, Henkjan; Boetes, Carla; Karssemeijer, Nico

    2006-03-01

    A method is presented for estimation of dense breast tissue volume from mammograms obtained with full-field digital mammography (FFDM). The thickness of dense tissue mapping to a pixel is determined by using a physical model of image acquisition. This model is based on the assumption that the breast is composed of two types of tissue, fat and parenchyma. Effective linear attenuation coefficients of these tissues are derived from empirical data as a function of tube voltage (kVp), anode material, filtration, and compressed breast thickness. By employing these, tissue composition at a given pixel is computed after performing breast thickness compensation, using a reference value for fatty tissue determined by the maximum pixel value in the breast tissue projection. Validation has been performed using 22 FFDM cases acquired with a GE Senographe 2000D by comparing the volume estimates with volumes obtained by semi-automatic segmentation of breast magnetic resonance imaging (MRI) data. The correlation between MRI and mammography volumes was 0.94 on a per image basis and 0.97 on a per patient basis. Using the dense tissue volumes from MRI data as the gold standard, the average relative error of the volume estimates was 13.6%.
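
    A toy sketch of the two-tissue idea described above: assuming a detector response proportional to transmitted intensity, the dense-tissue thickness at a pixel follows from the log ratio of the fatty reference value to the pixel value, divided by the difference of the effective attenuation coefficients. The coefficients, thicknesses, and the log-linear assumption below are illustrative placeholders, not calibrated values.

    ```python
    # Toy sketch of dense-tissue thickness/volume from a two-tissue attenuation model.
    import numpy as np

    def dense_tissue_volume(intensity, mu_fat, mu_dense, breast_thickness_mm, pixel_area_mm2):
        i_fat = intensity.max()                       # reference: the most fatty pixel
        t_dense = np.log(i_fat / intensity) / (mu_dense - mu_fat)   # mm of dense tissue per pixel
        t_dense = np.clip(t_dense, 0, breast_thickness_mm)
        return t_dense.sum() * pixel_area_mm2 / 1000.0              # volume in cm^3

    rng = np.random.default_rng(0)
    image = 0.2 + 0.8 * rng.random((500, 400))        # toy transmitted-intensity image
    volume_cm3 = dense_tissue_volume(image, mu_fat=0.045, mu_dense=0.080,
                                     breast_thickness_mm=55.0, pixel_area_mm2=0.01)
    ```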

  18. Grid Databases for Shared Image Analysis in the MammoGrid Project

    CERN Document Server

    Amendolia, S R; Hauer, T; Manset, D; McClatchey, R; Odeh, M; Reading, T; Rogulin, D; Schottlander, D; Solomonides, T

    2004-01-01

    The MammoGrid project aims to prove that Grid infrastructures can be used for collaborative clinical analysis of database-resident but geographically distributed medical images. This requires: a) the provision of a clinician-facing front-end workstation and b) the ability to service real-world clinician queries across a distributed and federated database. The MammoGrid project will prove the viability of the Grid by harnessing its power to enable radiologists from geographically dispersed hospitals to share standardized mammograms, to compare diagnoses (with and without computer aided detection of tumours) and to perform sophisticated epidemiological studies across national boundaries. This paper outlines the approach taken in MammoGrid to seamlessly connect radiologist workstations across a Grid using an "information infrastructure" and a DICOM-compliant object model residing in multiple distributed data stores in Italy and the UK

  19. Disparities in abnormal mammogram follow-up time for Asian women compared with non-Hispanic white women and between Asian ethnic groups.

    Science.gov (United States)

    Nguyen, Kim H; Pasick, Rena J; Stewart, Susan L; Kerlikowske, Karla; Karliner, Leah S

    2017-09-15

    Delays in abnormal mammogram follow-up contribute to poor outcomes. In the current study, the authors examined differences in abnormal screening mammogram follow-up between non-Hispanic white (NHW) and Asian women. The authors used a prospective cohort of NHW and Asian women with a Breast Imaging, Reporting and Data System (BI-RADS) abnormal result of category 0 or 3-plus in the San Francisco Mammography Registry between 2000 and 2010. Kaplan-Meier estimation for the median number of days to follow-up with a diagnostic radiologic test was performed, and the authors compared the percentage of women with follow-up at 30 days, 60 days, and 90 days and no follow-up at 1 year for Asian women overall (and Asian ethnic groups) and NHW women. In addition, the authors assessed the relationship between race/ethnicity and time to follow-up with adjusted Cox proportional hazards models. Among Asian women, Vietnamese and Filipina women had the longest, and Japanese women the shortest, median follow-up (32 days, 28 days, and 19 days, respectively) compared with NHW women (15 days). The percentage of women receiving follow-up at 30 days was lower for Asians versus NHWs (57% vs 77%; P < .001) and for all Asian ethnic groups except Japanese. Asian women had a reduced hazard of follow-up compared with NHW women (adjusted hazard ratio, 0.70; 95% confidence interval, 0.69-0.72). Asian women also had a higher rate of receiving no follow-up compared with NHW women (15% vs 10%; P < .001); among Asian ethnic groups, Filipinas were found to have the highest percentage of women with no follow-up (18.1%). Asian women, particularly Filipina and Vietnamese women, were less likely than NHW women to receive timely follow-up after an abnormal screening mammogram. Research should disaggregate Asian ethnicity to better understand and address barriers to effective cancer prevention. Cancer 2017;123:3468-75. © 2017 American Cancer Society.

  20. Three-dimensional reconstruction of clustered microcalcifications from two digitized mammograms

    Science.gov (United States)

    Stotzka, Rainer; Mueller, Tim O.; Epper, Wolfgang; Gemmeke, Hartmut

    1998-06-01

    X-ray mammography is one of the most significant diagnostic methods in the early detection of breast cancer. Usually two X-ray images from different angles are taken of each mamma to make even overlapping structures visible. X-ray mammography has a very high spatial resolution and can show microcalcifications of 50-200 microns in size. Clusters of microcalcifications are one of the most important, and often the only, indicator of malignant tumors. These calcifications are in some cases extremely difficult to detect. Computer-assisted diagnosis of digitized mammograms may improve the detection and interpretation of microcalcifications and lead to more reliable diagnostic findings. We built a low-cost mammography workstation to detect and classify clusters of microcalcifications and tissue densities automatically. New in this approach is the estimation of the 3D formation of segmented microcalcifications and its visualization, which puts additional diagnostic information at the radiologists' disposal. The main problem in using only two or three projections for reconstruction is the large loss of volume information. Therefore the arrangement of a cluster is estimated using only the positions of segmented microcalcifications. The arrangement of microcalcifications is visualized for the physician by rotation.

  1. Mapping 3D breast lesions from full-field digital mammograms using subject-specific finite element models

    Science.gov (United States)

    García, E.; Oliver, A.; Diaz, O.; Diez, Y.; Gubern-Mérida, A.; Martí, R.; Martí, J.

    2017-03-01

    Patient-specific finite element (FE) models of the breast have received increasing attention due to their potential for fusing images from different modalities. During the Magnetic Resonance Imaging (MRI) to X-ray mammography registration procedure, the FE model is compressed, mimicking the mammographic acquisition. Subsequently, suspicious lesions in the MRI volume can be projected into the 2D mammographic space. However, most registration algorithms do not provide the reverse information, preventing recovery of the 3D geometrical information of lesions localized in the mammograms. In this work we introduce a fast method to localize the 3D position of the lesion within the MRI, using both cranio-caudal (CC) and medio-lateral oblique (MLO) mammographic projections and indexing the tetrahedral elements of the biomechanical model by means of a uniform grid. For each marked lesion in the Full-Field Digital Mammogram (FFDM), the X-ray path from source to the marker is calculated. Barycentric coordinates are computed in the tetrahedra traversed by the ray. The list of elements and coordinates allows localizing two curves within the MRI, and the closest point between both curves is taken as the 3D position of the lesion. The registration errors obtained in the mammographic space are 9.89 +/- 3.72 mm in the CC projection and 8.04 +/- 4.68 mm in the MLO projection, and the error in the 3D MRI space is 10.29 +/- 3.99 mm. The uniform grid is computed in between 0.1 and 0.7 seconds. The average time spent to compute the 3D location of a lesion is about 8 ms.
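
    The final geometric step described here, taking the closest point between the two back-projected X-ray paths as the 3D lesion position, reduces to the classic closest-point-between-two-rays computation. The sketch below is not the authors' code; the ray origins and directions are hypothetical placeholders, whereas in the paper the paths are traced through the compressed finite-element model and mapped back via barycentric coordinates.

```python
# Minimal sketch: 3D lesion position as the midpoint of the shortest segment
# joining the CC-view ray and the MLO-view ray (synthetic geometry).
import numpy as np

def closest_point_between_rays(p1, d1, p2, d2):
    """Return (midpoint of the shortest segment, gap length) for two 3D rays."""
    d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
    w0 = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-12:          # nearly parallel rays
        t, s = 0.0, e               # project p1 onto the second ray
    else:
        t = (b * e - c * d) / denom
        s = (a * e - b * d) / denom
    q1, q2 = p1 + t * d1, p2 + s * d2
    return (q1 + q2) / 2.0, np.linalg.norm(q1 - q2)

# Hypothetical source positions and directions toward the marked lesion in each view
cc_src,  cc_dir  = np.array([0.0, 0.0, 600.0]),     np.array([0.05, 0.02, -1.0])
mlo_src, mlo_dir = np.array([300.0, -300.0, 500.0]), np.array([-0.45, 0.52, -0.85])

lesion_3d, gap = closest_point_between_rays(cc_src, cc_dir, mlo_src, mlo_dir)
print(lesion_3d, gap)
```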

  2. Image quality of a wet laser printer versus a paper printer for full-field digital mammograms.

    Science.gov (United States)

    Schueller, Gerd; Kaindl, Elisabeth; Matzek, Wolfgang K; Semturs, Friedrich; Schueller-Weidekamm, Claudia; Helbich, Thomas H

    2006-01-01

    The purpose of our study was to compare the image quality of a wet laser printer with that of a paper printer for full-field digital mammography (FFDM). For both a wet laser printer and a paper printer connected to an FFDM system, image quality parameters (luminance density, dynamic range) were evaluated using a standardized printer test image. The detectability of standardized objects on a phantom was also evaluated. Furthermore, 640 mammograms of 80 patients with different breast tissue composition patterns were imaged with both printers. Subjective image quality parameters (brightness, contrast, and detection of details of anatomic structures, that is, skin, subcutis, musculature, glandular tissue, and fat), the detectability of breast lesions (masses, calcifications), and the diagnostic performance according to the BI-RADS classification were evaluated. Both the luminance density and the dynamic range were superior for the wet laser printer. More standardized objects were visible on the phantom imaged with the wet laser printer than with the paper printer (13/16 vs 11/16). Each subjective image quality parameter of the mammograms from the wet laser printer was rated superior to those of the paper printer. Significantly more breast lesions were detected on the wet laser printer images than on the paper printer images (masses, 13 vs 10; calcifications, 65 vs 48). On the paper printer images, BI-RADS 4 and 5 categories were underestimated for 10 (43.5%) of 23 patients. For FFDM, images obtained from a wet laser printer show superior objective and subjective image quality compared with a paper printer. As a consequence, the paper printer should not be used for FFDM.

  3. Role of technetium-99m sestamibi scintimammography and contrast-enhanced magnetic resonance imaging for the evaluation of indeterminate mammograms

    International Nuclear Information System (INIS)

    Tiling, R.; Moser, R.; Meyer, G.; Tatsch, K.; Hahn, K.; Khalkhali, I.; Sommer, H.; Willemsen, F.; Pfluger, T.

    1997-01-01

    This study evaluated and compared technetium-99m sestamibi scintimammography (SMM) and breast magnetic resonance imaging (MRI) results in patients with indeterminate mammograms to determine whether either technique can improve the sensitivity and specificity for the diagnosis of breast carcinoma. From 123 consecutive patients who underwent physical examination, mammography, SMM, and histopathologic confirmation, a subgroup of 82 patients presenting with indeterminate mammograms was studied. Sixty-eight patients underwent contrast-enhanced MRI. SMM results were scored on the basis of the intensity and pattern of sestamibi uptake. MRI images were scored on the basis of signal intensity increase after administration of contrast material as well as the enhancement pattern and speed of gadolinium uptake. The results obtained with the two techniques were compared and related to the final histopathologic diagnoses. Considering indeterminate findings as positive, the sensitivity of SMM was 79% and the specificity, 70%. MRI displayed a sensitivity of 84% and a specificity of 49%. When indeterminate results were considered negative, the sensitivity and specificity of SMM were 62% and 83%, respectively. MRI revealed a sensitivity and specificity of 56% and 79%, respectively. The calculated sensitivities and specificities demonstrate the diagnostic limitations of both SMM and MRI in the evaluation of patients with indeterminate mammographic findings. Due to the higher specificity, SMM may be the preferred modality in the evaluation of selected patients with breast abnormalities. (orig.)

  4. Development of a Computer-aided Diagnosis System for Early Detection of Masses Using Retrospectively Detected Cancers on Prior Mammograms

    Science.gov (United States)

    2009-06-01

    design a classifier for the differentiation of the abnormal from the normal structures. In this project, we have also investigated the ... were trained: one with the current mammograms and the other with the prior mammograms. A feed-forward backpropagation artificial neural network ... Sahiner, B., Helvie, M. A., Petrick, N., Roubidoux, M. A., Wilson, T. E., Adler, D. D., Paramagul, C., Newman, J. S. and Gopal, S. S.

  5. Multiscale wavelet representations for mammographic feature analysis

    Science.gov (United States)

    Laine, Andrew F.; Song, Shuwu

    1992-12-01

    This paper introduces a novel approach for accomplishing mammographic feature analysis through multiresolution representations. We show that efficient (nonredundant) representations may be identified from digital mammography and used to enhance specific mammographic features within a continuum of scale space. The multiresolution decomposition of wavelet transforms provides a natural hierarchy in which to embed an interactive paradigm for accomplishing scale-space feature analysis. Choosing wavelets (or analyzing functions) that are simultaneously localized in both space and frequency results in a powerful methodology for image analysis. Multiresolution and orientation selectivity, known biological mechanisms in primate vision, are ingrained in wavelet representations and inspire the techniques presented in this paper. Our approach includes local analysis of complete multiscale representations. Mammograms are reconstructed from wavelet coefficients enhanced by linear, exponential and constant weight functions localized in scale space. By improving the visualization of breast pathology we can improve the chances of early detection of breast cancers (improve quality) while requiring less time to evaluate mammograms for most patients (lower costs).
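
    The reconstruction-with-weighted-coefficients idea can be sketched in a few lines with PyWavelets. This is a minimal illustration in the spirit of the abstract, not the paper's implementation; the wavelet choice ("db4"), the number of levels, and the linear weight schedule are assumptions.

```python
# Sketch: 2D wavelet decomposition of a mammogram, per-level (linear) weighting
# of the detail coefficients, and reconstruction from the modified coefficients.
import numpy as np
import pywt

def enhance(image, wavelet="db4", levels=3, gain=2.0):
    coeffs = pywt.wavedec2(image, wavelet, level=levels)
    enhanced = [coeffs[0]]                        # keep the coarse approximation unchanged
    for i, (cH, cV, cD) in enumerate(coeffs[1:], start=1):
        w = 1.0 + (gain - 1.0) * i / levels       # linear weight; finer detail scales get a larger boost
        enhanced.append((cH * w, cV * w, cD * w))
    return pywt.waverec2(enhanced, wavelet)

image = np.random.rand(256, 256)                  # placeholder for a digitized mammogram
out = enhance(image)
```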

  6. The clinical significance of normal mammograms and normal sonograms in patients with palpable abnormalities of the breast

    International Nuclear Information System (INIS)

    Lee, Jin Hwa; Yoon, Seong Kuk; Choi, Sun Seob; Nam, Kyung Jin; Cho, Se Heon; Kim, Dae Cheol; Kim, Jung Il; Kim, Eun Kyung

    2006-01-01

    We wanted to evaluate the clinical significance of normal mammograms and normal sonograms in patients with palpable abnormalities of the breast. From April 2003 to February 2005, 107 patients with 113 palpable abnormalities who had combined normal sonographic and normal mammographic findings were retrospectively studied. The evaluated parameters included the age of the patients, the clinical referrals, the distribution of the locations of the palpable abnormalities, whether there was a past surgical history, the mammographic densities and the sonographic echo patterns (purely hyperechoic fibrous tissue, mixed fibroglandular breast tissue, predominantly isoechoic glandular tissue and isoechoic subcutaneous fat tissue) at the sites of clinical concern, whether there was a change in imaging and/or the physical examination results at follow-up, and whether there were biopsy results. This study period was chosen to allow a follow-up period of at least 12 months. The patients' ages ranged from 22 to 66 years (mean age: 48.8 years), and 62 (58%) of the 107 patients were between 41 and 50 years old. The most common location of the palpable abnormalities was the upper outer portion of the breast (45%), and most of the mammographic densities were dense patterns (BI-RADS Type 3 or 4: 91%). Our cases showed a similar distribution for all the types of sonographic echo patterns. Twenty-three patients underwent biopsy; all the biopsy specimens were benign. For the 84 patients with 90 palpable abnormalities who were followed, there was no interval development of breast cancer in the areas of clinical concern. Our results suggest that we can follow up and prevent unnecessary biopsies in women with palpable abnormalities when both the mammography and ultrasonography show normal tissue, but this study was limited by its small sample size. Therefore, a larger study will be needed to better define the negative predictive value of combined normal sonographic and mammographic findings.

  7. The ATLAS distributed analysis system

    International Nuclear Information System (INIS)

    Legger, F

    2014-01-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of Grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high and steadily improving; Grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters provides user support and communicates user problems to the sites. Both the user support techniques and the direct feedback of users have been effective in improving the success rate and user experience when utilizing the distributed computing environment. In this contribution a description of the main components, activities and achievements of ATLAS distributed analysis is given. Several future improvements being undertaken will be described.

  8. The ATLAS distributed analysis system

    Science.gov (United States)

    Legger, F.; Atlas Collaboration

    2014-06-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of Grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high and steadily improving; Grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters provides user support and communicates user problems to the sites. Both the user support techniques and the direct feedback of users have been effective in improving the success rate and user experience when utilizing the distributed computing environment. In this contribution a description of the main components, activities and achievements of ATLAS distributed analysis is given. Several future improvements being undertaken will be described.

  9. Evaluation of hybrids algorithms for mass detection in digitalized mammograms

    International Nuclear Information System (INIS)

    Cordero, Jose; Garzon Reyes, Johnson

    2011-01-01

    Breast cancer remains a significant public health problem, and early detection of lesions can increase the success of medical treatments. Mammography is an image modality effective for the early diagnosis of abnormalities, in which an image of the mammary gland is obtained with low-radiation X-rays; this allows detection of a tumor or circumscribed mass two to three years before it becomes clinically palpable, and it is the only method that has so far achieved a reduction in mortality from breast cancer. In this paper three hybrid algorithms for circumscribed mass detection on digitized mammograms are evaluated. The first stage corresponds to a review of the enhancement and segmentation techniques used in the processing of the mammographic images. Shape filtering was then applied to the resulting regions. The surviving regions were processed by means of a Bayesian filter, and the feature vector for the classifier was constructed from a few measurements. The implemented algorithms were then evaluated with ROC curves on a test set of 40 images: 20 normal images and 20 images with circumscribed lesions. Finally, the advantages and disadvantages of each algorithm in the correct detection of a lesion are discussed.
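
    The ROC-based evaluation step on a 40-image test set can be reproduced schematically as below. This is not the authors' code: the per-image detection scores are synthetic placeholders standing in for the output of one of the hybrid algorithms.

```python
# Sketch: ROC curve and area (Az) for a 40-image test set
# (20 normal, 20 with circumscribed lesions), using synthetic scores.
import numpy as np
from sklearn.metrics import roc_curve, auc

rng = np.random.default_rng(0)
y_true = np.array([0] * 20 + [1] * 20)                 # 0 = normal, 1 = lesion present
scores = np.concatenate([rng.normal(0.3, 0.15, 20),    # hypothetical algorithm outputs
                         rng.normal(0.7, 0.15, 20)])

fpr, tpr, thresholds = roc_curve(y_true, scores)
print("Az =", auc(fpr, tpr))
```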

  10. Distributed analysis with PROOF in ATLAS collaboration

    International Nuclear Information System (INIS)

    Panitkin, S Y; Ernst, M; Ito, H; Maeno, T; Majewski, S; Rind, O; Tarrade, F; Wenaus, T; Ye, S; Benjamin, D; Montoya, G Carillo; Guan, W; Mellado, B; Xu, N; Cranmer, K; Shibata, A

    2010-01-01

    The Parallel ROOT Facility (PROOF) is a distributed analysis system which exploits the inherent event-level parallelism of high energy physics data. PROOF can be configured to work with centralized storage systems, but it is especially effective together with distributed local storage systems, like Xrootd, when data are distributed over computing nodes. It works efficiently on different types of hardware and scales well from a multi-core laptop to large computing farms. From that point of view it is well suited for both large central analysis facilities and Tier 3 type analysis farms. PROOF can be used in interactive or batch-like regimes. The interactive regime allows the user to work with typically distributed data from the ROOT command prompt and get real-time feedback on analysis progress and intermediate results. We will discuss our experience with PROOF in the context of ATLAS Collaboration distributed analysis. In particular we will discuss PROOF performance in various analysis scenarios and in multi-user, multi-session environments. We will also describe PROOF integration with the ATLAS distributed data management system and prospects of running PROOF on geographically distributed analysis farms.

  11. Distributed analysis with PROOF in ATLAS collaboration

    Energy Technology Data Exchange (ETDEWEB)

    Panitkin, S Y; Ernst, M; Ito, H; Maeno, T; Majewski, S; Rind, O; Tarrade, F; Wenaus, T; Ye, S [Brookhaven National Laboratory, Upton, NY 11973 (United States); Benjamin, D [Duke University, Durham, NC 27708 (United States); Montoya, G Carillo; Guan, W; Mellado, B; Xu, N [University of Wisconsin-Madison, Madison, WI 53706 (United States); Cranmer, K; Shibata, A [New York University, New York, NY 10003 (United States)

    2010-04-01

    The Parallel ROOT Facility (PROOF) is a distributed analysis system which exploits the inherent event-level parallelism of high energy physics data. PROOF can be configured to work with centralized storage systems, but it is especially effective together with distributed local storage systems, like Xrootd, when data are distributed over computing nodes. It works efficiently on different types of hardware and scales well from a multi-core laptop to large computing farms. From that point of view it is well suited for both large central analysis facilities and Tier 3 type analysis farms. PROOF can be used in interactive or batch-like regimes. The interactive regime allows the user to work with typically distributed data from the ROOT command prompt and get real-time feedback on analysis progress and intermediate results. We will discuss our experience with PROOF in the context of ATLAS Collaboration distributed analysis. In particular we will discuss PROOF performance in various analysis scenarios and in multi-user, multi-session environments. We will also describe PROOF integration with the ATLAS distributed data management system and prospects of running PROOF on geographically distributed analysis farms.

  12. Analysis of laser-printed spatial resolution for mammographic microcalcification detection

    International Nuclear Information System (INIS)

    Smathers, R.L.; Kowarski, D.

    1987-01-01

    The detectability of microcalcifications in mammograms was compared between Kodak Min-R screen-film mammograms and digitized laser-printed films. Pulverized bone specks were used as the phantoms to produce the original mammograms. The mammograms were then digitized to a spatial resolution of 2,048 x 2,048 with 4,096 gray levels and laser-printed at spatial resolutions of 512 x 512, 1,024 x 1,024, and 2,048 x 2,048 with 256 gray levels. The number of bone specks was determined on a region-by-region basis. The 512 x 512 resolution laser-printed images were nondiagnostic, the 1,024 x 1,024 images were better, and the 2,048 x 2,048 images were quite comparable to the original screen-film mammograms.

  13. An SVM classifier to separate false signals from microcalcifications in digital mammograms

    Energy Technology Data Exchange (ETDEWEB)

    Bazzani, Armando; Bollini, Dante; Brancaccio, Rosa; Campanini, Renato; Riccardi, Alessandro; Romani, Davide [Department of Physics, University of Bologna (Italy); INFN, Bologna (Italy); Lanconelli, Nico [Department of Physics, University of Bologna, and INFN, Bologna (Italy). E-mail: nico.lanconelli@bo.infn.it; Bevilacqua, Alessandro [Department of Electronics, Computer Science and Systems, University of Bologna, and INFN, Bologna (Italy)

    2001-06-01

    In this paper we investigate the feasibility of using an SVM (support vector machine) classifier in our automatic system for the detection of clustered microcalcifications in digital mammograms. SVM is a technique for pattern recognition which relies on statistical learning theory. It minimizes a function of two terms: the number of misclassified vectors of the training set and a term related to the classifier's generalization capability. We compare the SVM classifier with an MLP (multi-layer perceptron) in the false-positive reduction phase of our detection scheme: a detected signal is considered either a microcalcification or a false signal, according to the values of a set of its features. The SVM classifier gives slightly better results than the MLP one (Az value of 0.963 against 0.958) in the presence of a large amount of training data; the improvement becomes much more evident (Az value of 0.952 against 0.918) for training sets of reduced size. Finally, the SVM classifier is much easier to configure than the MLP one. (author)
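
    A comparison of this kind (SVM vs MLP on the false-positive reduction task, scored by Az) can be mocked up with scikit-learn. This is an illustrative sketch, not the authors' system: the feature vectors and labels are synthetic stand-ins, and the classifier hyperparameters are assumptions.

```python
# Sketch: compare an SVM and an MLP for classifying detected signals as
# microcalcification vs false signal, reporting cross-validated ROC AUC (Az).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=600, n_features=12, weights=[0.7, 0.3], random_state=0)

svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, probability=True))
mlp = make_pipeline(StandardScaler(), MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0))

for name, clf in [("SVM", svm), ("MLP", mlp)]:
    az = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
    print(f"{name}: Az = {az.mean():.3f} +/- {az.std():.3f}")
```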

  14. The ATLAS Distributed Analysis System

    CERN Document Server

    Legger, F; The ATLAS collaboration; Pacheco Pages, A; Stradling, A

    2013-01-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high and steadily improving; grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters ...

  15. The ATLAS Distributed Analysis System

    CERN Document Server

    Legger, F; The ATLAS collaboration

    2014-01-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During the first run of the LHC, the ATLAS Distributed Analysis (DA) service has operated stably and scaled as planned. More than 1600 users submitted jobs in 2012, with 2 million or more analysis jobs per week, peaking at about a million jobs per day. The system dynamically distributes popular data to expedite processing and maximally utilize resources. The reliability of the DA service is high and steadily improving; grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters ...

  16. Toward the breast screening balance sheet: cumulative risk of false positives for annual versus biennial mammograms commencing at age 40 or 50.

    Science.gov (United States)

    Winch, Caleb J; Sherman, Kerry A; Boyages, John

    2015-01-01

    This study aimed to: (1) estimate the cumulative risk of recall from breast screening where no cancer is detected (a harm) in Australia; (2) compare women screened annually versus biennially, commencing at age 40 versus 50; and (3) compare with international findings. At the no-cost metropolitan program studied, women attended biennial screening, but were offered annual screening if regarded as at elevated risk for breast cancer. The cumulative risk of at least one recall was estimated using discrete-time survival analysis. Cancer detection statistics were computed. In total, 801,636 mammograms were undertaken in 231,824 women. Over 10 years, the cumulative risk of recall was 13.3% (95% CI 12.7-13.8) for those screened biennially, and 19.9% (CI 16.6-23.2) for those screened annually from age 50-51. The cumulative risk of a complex false positive involving a biopsy was 3.1% (CI 2.9-3.4) and 5.0% (CI 3.4-6.6), respectively. From age 40-41, the risk of recall was 15.1% (CI 14.3-16.0) and 22.5% (CI 17.9-27.1) for biennial and annual screening, respectively. Corresponding rates of complex false positives were 3.3% (CI 2.9-3.8) and 6.3% (CI 3.4-9.1). Over 10 mammograms, invasive cancer was detected in 3.4% (CI 3.3-3.5) and ductal carcinoma in situ in 0.7% (CI 0.6-0.7) of women, with a non-significant trend toward a larger proportion of Tis and T1N0 cancers in women screened annually (74.5%) versus biennially (70.1%) (χ² = 2.77, p = 0.10). Cancer detection was comparable to international findings. Recall risk was equal to European estimates for women screened from 50 and lower for screening from 40. Recall risk was half of United States rates across start-age and rescreening-interval categories. Future benefit/harm balance sheets may be useful for communicating these findings to women.
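
    The discrete-time survival idea behind a cumulative recall risk can be written out directly: the probability of at least one recall over k screening rounds is 1 - prod(1 - h_t), where h_t is the per-round hazard of a recall with no cancer detected. The sketch below is not the study's code, and the hazard values are made-up illustrative numbers.

```python
# Sketch: cumulative probability of at least one false-positive recall over
# a sequence of screening rounds, given per-round recall hazards.
import numpy as np

def cumulative_recall_risk(hazards):
    """Probability of at least one recall over the given rounds."""
    hazards = np.asarray(hazards, dtype=float)
    return 1.0 - np.prod(1.0 - hazards)

# Hypothetical per-round recall hazards for 5 biennial screens over ~10 years
biennial_hazards = [0.045, 0.028, 0.025, 0.023, 0.022]
print(f"10-year cumulative recall risk: {cumulative_recall_risk(biennial_hazards):.1%}")
```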

  17. Distributed computing and nuclear reactor analysis

    International Nuclear Information System (INIS)

    Brown, F.B.; Derstine, K.L.; Blomquist, R.N.

    1994-01-01

    Large-scale scientific and engineering calculations for nuclear reactor analysis can now be carried out effectively in a distributed computing environment, at costs far lower than for traditional mainframes. The distributed computing environment must include support for traditional system services, such as a queuing system for batch work, reliable filesystem backups, and parallel processing capabilities for large jobs. All ANL computer codes for reactor analysis have been adapted successfully to a distributed system based on workstations and X-terminals. Distributed parallel processing has been demonstrated to be effective for long-running Monte Carlo calculations

  18. ATLAS Distributed Analysis Tools

    CERN Document Server

    Gonzalez de la Hoz, Santiago; Liko, Dietrich

    2008-01-01

    The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale. Up to 10000 jobs were processed in one day. The experience obtained operating the system on several grid flavours was essential to perform a user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase data were registered in the LHC File Catalog (LFC) and replicated in external sites. For the main test, few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS management computing board decided to integrate the collaboration efforts in distributed analysis in only one project, GANGA. The goal is to test the reconstruction and analysis software in a large-scale data production using grid flavours at several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting a...

  19. Distributed analysis challenges in ATLAS

    Energy Technology Data Exchange (ETDEWEB)

    Duckeck, Guenter; Legger, Federica; Mitterer, Christoph Anton; Walker, Rodney [Ludwig-Maximilians-Universitaet Muenchen (Germany)

    2016-07-01

    The ATLAS computing model has undergone massive changes to meet the high luminosity challenge of the second run of the Large Hadron Collider (LHC) at CERN. The production system and distributed data management have been redesigned, a new data format and event model for analysis have been introduced, and common reduction and derivation frameworks have been developed. We report on the impact these changes have on the distributed analysis system, study the various patterns of grid usage for user analysis, focusing on the differences between the first and the second LHC runs, and measure the performance of user jobs.

  20. Clinical images evaluation of mammograms: a national survey

    International Nuclear Information System (INIS)

    Moon, Woo Kyung; Kim, Tae Jung; Cha, Joo Hee

    2003-01-01

    The goal of this study was to survey the overall quality of mammographic images in Korea. A total of 598 mammographic images collected from 257 hospitals nationwide were reviewed in terms of eight image quality categories, namely positioning, compression, contrast, exposure, sharpness, noise, artifacts, and examination identification, and rated on a five-point scale (1 = severe deficiency, 2 = major deficiency, 3 = minor deficiency, 4 = good, 5 = best). Failure was defined as the occurrence of more than four major deficiencies or one severe deficiency (score of 1 or 2). The results were compared among hospitals of varying kinds, and common problems in clinical image quality were identified. Two hundred and seventeen mammographic images (36.3%) failed the evaluation. Poor images were found, in descending order of frequency, at The Society for Medical Examination (33/69, 47.8%), non-radiology clinics (42/88, 47.7%), general hospitals (92/216, 42.6%), radiology clinics (39/102, 38.2%), and university hospitals (11/123, 8.9%) (p<0.01, chi-square test). Among the 598 images, serious problems were related to positioning in 23.7% of instances (n=142) (p<0.01, chi-square test), examination identification in 5.7% (n=34), exposure in 5.4% (n=32), contrast in 4.2% (n=25), sharpness in 2.7% (n=16), compression in 2.5% (n=15), artifacts in 2.5% (n=15), and noise in 0.3% (n=2). This study showed that in Korea, 36.3% of the mammograms examined in this sample had important image-related defects that might have led to serious errors in patient management. The failure rate was significantly higher at non-radiology clinics and The Society for Medical Examination than at university hospitals.

  1. Quantitative comparison of clustered microcalcifications in for-presentation and for-processing mammograms in full-field digital mammography.

    Science.gov (United States)

    Wang, Juan; Nishikawa, Robert M; Yang, Yongyi

    2017-07-01

    Mammograms acquired with full-field digital mammography (FFDM) systems are provided in both "for-processing" and "for-presentation" image formats. For-presentation images are traditionally intended for visual assessment by the radiologists. In this study, we investigate the feasibility of using for-presentation images in computerized analysis and diagnosis of microcalcification (MC) lesions. We make use of a set of 188 matched mammogram image pairs of MC lesions from 95 cases (biopsy proven), in which both for-presentation and for-processing images are provided for each lesion. We then analyze and characterize the MC lesions from for-presentation images and compare them with their counterparts in for-processing images. Specifically, we consider three important aspects in computer-aided diagnosis (CAD) of MC lesions. First, we quantify each MC lesion with a set of 10 image features of clustered MCs and 12 textural features of the lesion area. Second, we assess the detectability of individual MCs in each lesion from the for-presentation images by a commonly used difference-of-Gaussians (DoG) detector. Finally, we study the diagnostic accuracy in discriminating between benign and malignant MC lesions from the for-presentation images by a pretrained support vector machine (SVM) classifier. To accommodate the underlying background suppression and image enhancement in for-presentation images, a normalization procedure is applied. The quantitative image features of MC lesions from for-presentation images are highly consistent with those from for-processing images. The values of Pearson's correlation coefficient between features from the two formats range from 0.824 to 0.961 for the 10 MC image features, and from 0.871 to 0.963 for the 12 textural features. In detection of individual MCs, the FROC curve from for-presentation images is similar to that from for-processing images. In particular, at a sensitivity level of 80%, the average number of false-positives (FPs) per image region is 9
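
    Two of the steps mentioned here, a difference-of-Gaussians (DoG) detector for microcalcification-like spots and Pearson correlation between features measured on the two image formats, can be sketched as follows. This is not the study's implementation; the sigmas, the threshold, and the feature arrays are illustrative assumptions.

```python
# Sketch: DoG candidate detection plus Pearson correlation of one feature
# across matched for-presentation and for-processing lesions (synthetic data).
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.stats import pearsonr

def dog_candidates(image, sigma_small=1.0, sigma_large=3.0, threshold=0.02):
    """Return a binary map of bright, small-scale structures (MC candidates)."""
    dog = gaussian_filter(image, sigma_small) - gaussian_filter(image, sigma_large)
    return dog > threshold

region = np.random.rand(128, 128) * 0.05                  # placeholder image region
candidates = dog_candidates(region)

# Correlation of a lesion feature across the two image formats (188 lesion pairs, synthetic)
feature_for_processing   = np.random.rand(188)
feature_for_presentation = 0.9 * feature_for_processing + 0.1 * np.random.rand(188)
r, p = pearsonr(feature_for_processing, feature_for_presentation)
print(f"Pearson r = {r:.3f}")
```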

  2. Probabilistic analysis in normal operation of distribution system with distributed generation

    DEFF Research Database (Denmark)

    Villafafila-Robles, R.; Sumper, A.; Bak-Jensen, B.

    2011-01-01

    Nowadays, the incorporation of high levels of small-scale non-dispatchable distributed generation is leading to the transition from the traditional 'vertical' power system structure to a 'horizontally-operated' power system, where the distribution networks contain both stochastic generation and load. This fact increases the number of stochastic inputs, and dependence structures between them need to be considered. Deterministic analysis is not enough to cope with these issues and a new approach is needed. Probabilistic analysis provides a better approach. Moreover, as distribution systems...

  3. A combined approach for the enhancement and segmentation of mammograms using modified fuzzy C-means method in wavelet domain.

    Science.gov (United States)

    Srivastava, Subodh; Sharma, Neeraj; Singh, S K; Srivastava, R

    2014-07-01

    In this paper, a combined approach for enhancement and segmentation of mammograms is proposed. In the preprocessing stage, a contrast limited adaptive histogram equalization (CLAHE) method is applied to obtain better-contrast mammograms. After this, the proposed combined methods are applied. In the first step of the proposed approach, a two-dimensional (2D) discrete wavelet transform (DWT) is applied to all the input images. In the second step, a proposed nonlinear complex diffusion based unsharp masking and crispening method is applied to the approximation coefficients of the wavelet-transformed images to further highlight abnormalities such as micro-calcifications and tumours and to reduce the false positives (FPs). Thirdly, a modified fuzzy c-means (FCM) segmentation method is applied to the output of the second step. In the modified FCM method, mutual information is proposed as a similarity measure in place of the conventional Euclidean-distance-based dissimilarity measure for FCM segmentation. Finally, the inverse 2D-DWT is applied. The efficacy of the proposed unsharp masking and crispening method for image enhancement is evaluated in terms of signal-to-noise ratio (SNR), and that of the proposed segmentation method is evaluated in terms of random index (RI), global consistency error (GCE), and variation of information (VoI). The performance of the proposed segmentation approach is compared with other commonly used segmentation approaches such as Otsu's thresholding, texture-based segmentation, k-means clustering, FCM clustering, and simple thresholding. From the obtained results, it is observed that the proposed segmentation approach performs better and takes less processing time in comparison to the standard FCM and other segmentation methods considered.
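
    The enhancement chain (CLAHE, 2D DWT, unsharp masking of the approximation band, inverse 2D DWT) can be condensed into a short sketch. This is not the paper's code: the nonlinear complex-diffusion step is replaced here by a plain Gaussian unsharp mask for brevity, and the wavelet, amount, and sigma values are assumptions.

```python
# Sketch: CLAHE -> single-level 2D DWT -> unsharp masking / crispening of the
# approximation coefficients -> inverse 2D DWT (synthetic placeholder image).
import numpy as np
import pywt
from scipy.ndimage import gaussian_filter
from skimage import exposure

def enhance_mammogram(img, wavelet="haar", amount=1.5, sigma=2.0):
    img = exposure.equalize_adapthist(img)                 # CLAHE contrast enhancement
    cA, (cH, cV, cD) = pywt.dwt2(img, wavelet)             # 2D discrete wavelet transform
    blurred = gaussian_filter(cA, sigma)
    cA_sharp = cA + amount * (cA - blurred)                # unsharp masking of approximation band
    return pywt.idwt2((cA_sharp, (cH, cV, cD)), wavelet)   # inverse 2D DWT

mammo = np.random.rand(256, 256)                           # placeholder image in [0, 1]
enhanced = enhance_mammogram(mammo)
```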

  4. Cluster analysis for determining distribution center location

    Science.gov (United States)

    Lestari Widaningrum, Dyah; Andika, Aditya; Murphiyanto, Richard Dimas Julian

    2017-12-01

    Determining the location of distribution facilities is highly important for surviving the high level of competition in today's business world. Companies can operate multiple distribution centers to mitigate supply chain risk. Thus, new problems arise, namely how many facilities should be provided and where. This study examines a fast-food restaurant brand located in Greater Jakarta. This brand is included in the category of the top five fast-food restaurant chains based on retail sales. There were three stages in this study: compiling spatial data, cluster analysis, and network analysis. The cluster analysis results are used to consider the location of an additional distribution center. The network analysis results show a more efficient process, reflected in a shorter distance in the distribution process.
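
    A cluster analysis of this kind is often run as k-means over outlet coordinates, with the cluster centroids treated as candidate distribution-center sites. The sketch below is illustrative only, not the study's code; the coordinates and the number of clusters are made-up placeholders.

```python
# Sketch: cluster restaurant outlet coordinates and report cluster centroids
# as candidate distribution-center locations (synthetic lat/lon data).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
outlets = rng.uniform(low=[-6.4, 106.6], high=[-6.1, 107.0], size=(60, 2))  # lat, lon

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(outlets)
for i, centroid in enumerate(kmeans.cluster_centers_):
    n = int(np.sum(kmeans.labels_ == i))
    print(f"Candidate DC {i}: lat/lon = {centroid.round(3)}, serves {n} outlets")
```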

  5. Comparative Distributions of Hazard Modeling Analysis

    Directory of Open Access Journals (Sweden)

    Rana Abdul Wajid

    2006-07-01

    In this paper we present a comparison among the distributions used in hazard analysis. A simulation technique has been used to study the behavior of hazard distribution modules. The fundamentals of hazard issues are discussed using failure criteria. We present the flexibility of the hazard modeling distribution, which approaches different distributions.

  6. 4D co-registration of X-ray and MR-mammograms: initial clinical results and potential incremental diagnostic value.

    Science.gov (United States)

    Dietzel, Matthias; Hopp, Torsten; Ruiter, Nicole V; Kaiser, Clemens G; Kaiser, Werner A; Baltzer, Pascal A

    2015-01-01

    4D co-registration of X-ray- and MR-mammograms (XM and MM) is a new method of image fusion. The present study aims to evaluate its clinical feasibility, radiological accuracy, and potential clinical value. XM and MM of 25 patients were co-registered. Results were evaluated by a blinded reader. Precision of the 4D co-registration was "very good" (mean-score [ms]=7), and lesions were "easier to delineate" (ms=5). In 88.8%, "relevant additional diagnostic information" was present, accounting for a more "confident diagnosis" in 76% (ms=5). 4D co-registration is feasible, accurate, and of potential clinical value. Copyright © 2015 Elsevier Inc. All rights reserved.

  7. Managing Pan-European mammography images and data using a service oriented architecture

    CERN Document Server

    Amendolia, S R; McClatchey, R; Rogulin, D; Solomonides, T

    2004-01-01

    Medical conditions such as breast cancer, and mammograms as images, are extremely complex, with many degrees of variability across the population. An effective solution for the management of disparate mammogram data sources that provides sufficient statistics for complex epidemiological study is a federation of autonomous multi-centre sites which transcends national boundaries. Grid-based technologies are emerging as open-source, standards-based solutions for managing and collaborating on distributed resources. In the light of these new computing solutions, the MammoGrid project, as one example of a HealthGrid, is developing a Grid-aware medical application which manages a European-wide database of mammograms. The MammoGrid solution utilizes grid technologies to seamlessly integrate distributed data sets and is investigating the potential of the Grid to support effective co-working among mammogram analysts throughout the EU.

  8. Analysis of a mammography teaching program based on an affordance design model.

    Science.gov (United States)

    Luo, Ping; Eikman, Edward A; Kealy, William; Qian, Wei

    2006-12-01

    The wide use of computer technology in education, particularly in mammogram reading, calls for evaluation of e-learning. The existing media-comparative studies, learner attitude evaluations, and performance tests are problematic. Based on an affordance design model, this study examined an existing e-learning program on mammogram reading. The selection criteria included content relatedness, representativeness, e-learning orientation, image quality, program completeness, and accessibility. A case study was conducted to examine the affordance features, functions, and presentations of the selected software. Data collection and analysis methods included interviews, protocol-based document analysis, and usability tests and inspection; some statistics were also calculated. The examination of PBE identified a number of tools designed and programmed into this educational software. The learner can use these tools in the process of optimizing displays, scanning images, comparing different projections, marking regions of interest, constructing a descriptive report, assessing one's learning outcomes, and comparing one's decisions with the experts' decisions. Further, PBE provides some resources for the learner to construct one's knowledge and skills, including a categorized image library, a term-searching function, and some teaching links. In addition, users found it easy to navigate and carry out tasks. The users also reacted positively toward PBE's navigation system, instructional aids, layout, pace and flow of information, graphics, and other presentation design. The software provides learners with cognitive tools, supporting their perceptual problem-solving processes and extending their capabilities. Learners can internalize the mental models in mammogram reading through multiple perceptual triangulations, sensitization of related features, semantic description of mammogram findings, and expert-guided semantic report construction. The design of these cognitive tools and the

  9. Distribution system modeling and analysis

    CERN Document Server

    Kersting, William H

    2001-01-01

    For decades, distribution engineers did not have the sophisticated tools developed for analyzing transmission systems-often they had only their instincts. Things have changed, and we now have computer programs that allow engineers to simulate, analyze, and optimize distribution systems. Powerful as these programs are, however, without a real understanding of the operating characteristics of a distribution system, engineers using the programs can easily make serious errors in their designs and operating procedures. Distribution System Modeling and Analysis helps prevent those errors. It gives readers a basic understanding of the modeling and operating characteristics of the major components of a distribution system. One by one, the author develops and analyzes each component as a stand-alone element, then puts them all together to analyze a distribution system comprising the various shunt and series devices for power-flow and short-circuit studies. He includes the derivation of all models and includes many num...

  10. Empirical analysis for Distributed Energy Resources' impact on future distribution network

    DEFF Research Database (Denmark)

    Han, Xue; Sandels, Claes; Zhu, Kun

    2012-01-01

    There has been a large body of statements claiming that the large-scale deployment of Distributed Energy Resources (DERs) will eventually reshape future distribution grid operation in various ways. Thus, it is interesting to introduce a platform to interpret to what extent power system operation will be altered. In this paper, quantitative results in terms of how the future distribution grid will be changed by the deployment of distributed generation, active demand and electric vehicles are presented. The analysis is based on the conditions for both a radial and a meshed distribution network. The input parameters are based on the current and envisioned DER deployment scenarios proposed for Sweden.

  11. Digitized mammograms

    International Nuclear Information System (INIS)

    Bruneton, J.N.; Balu-Maestro, C.; Rogopoulos, A.; Chauvel, C.; Geoffray, A.

    1988-01-01

    Two observers conducted a blind evaluation of 100 mammography files, including 47 malignant cases. Films were read both before and after image digitization at 50 μm and 100 μm with the FilmDRSII. Digitization permitted better analysis of the normal anatomic structures and moderately improved diagnostic sensitivity. Searches for microcalcifications before and after digitization at 100 μm and 50 μm showed better analysis of anatomic structures after digitization (especially for solitary microcalcifications). The diagnostic benefit, with discovery of clustered microcalcifications, was more limited (one case at 100 μm, nine cases at 50 μm). Recognition of microcalcifications was clearly improved in dense breasts, which can benefit from reinterpretation after digitization at 50 μm rather than 100 μm.

  12. Computer-aided detection system applied to full-field digital mammograms

    International Nuclear Information System (INIS)

    Vega Bolivar, Alfonso; Sanchez Gomez, Sonia; Merino, Paula; Alonso-Bartolome, Pilar; Ortega Garcia, Estrella; Munoz Cacho, Pedro; Hoffmeister, Jeffrey W.

    2010-01-01

    Background: Although mammography remains the mainstay for breast cancer screening, it is an imperfect examination with a sensitivity of 75-92% for breast cancer. Computer-aided detection (CAD) has been developed to improve mammographic detection of breast cancer. Purpose: To retrospectively estimate CAD sensitivity and false-positive rate with full-field digital mammograms (FFDMs). Material and Methods: CAD was used to evaluate 151 cases of ductal carcinoma in situ (DCIS) (n=48) and invasive breast cancer (n=103) detected with FFDM. Retrospectively, CAD sensitivity was estimated based on breast density, mammographic presentation, histopathology type, and lesion size. CAD false-positive rate was estimated with screening FFDMs from 200 women. Results: CAD detected 93% (141/151) of cancer cases: 97% (28/29) in fatty breasts, 94% (81/86) in breasts containing scattered fibroglandular densities, 90% (28/31) in heterogeneously dense breasts, and 80% (4/5) in extremely dense breasts. CAD detected 98% (54/55) of cancers manifesting as calcifications, 89% (74/83) as masses, and 100% (13/13) as mixed masses and calcifications. CAD detected 92% (73/79) of invasive ductal carcinomas, 89% (8/9) of invasive lobular carcinomas, 93% (14/15) of other invasive carcinomas, and 96% (46/48) of DCIS. CAD sensitivity for cancers 1-10 mm was 87% (47/54); 11-20 mm, 99% (70/71); 21-30 mm, 86% (12/14); and larger than 30 mm, 100% (12/12). The CAD false-positive rate was 2.5 marks per case. Conclusion: CAD with FFDM showed a high sensitivity in identifying cancers manifesting as calcifications or masses. CAD sensitivity was maintained in small lesions (1-20 mm) and invasive lobular carcinomas, which have lower mammographic sensitivity

  13. Computer-aided detection system applied to full-field digital mammograms

    Energy Technology Data Exchange (ETDEWEB)

    Vega Bolivar, Alfonso; Sanchez Gomez, Sonia; Merino, Paula; Alonso-Bartolome, Pilar; Ortega Garcia, Estrella (Dept. of Radiology, Univ. Marques of Valdecilla Hospital, Santander (Spain)), e-mail: avegab@telefonica.net; Munoz Cacho, Pedro (Dept. of Statistics, Univ. Marques of Valdecilla Hospital, Santander (Spain)); Hoffmeister, Jeffrey W. (iCAD, Inc., Nashua, NH (United States))

    2010-12-15

    Background: Although mammography remains the mainstay for breast cancer screening, it is an imperfect examination with a sensitivity of 75-92% for breast cancer. Computer-aided detection (CAD) has been developed to improve mammographic detection of breast cancer. Purpose: To retrospectively estimate CAD sensitivity and false-positive rate with full-field digital mammograms (FFDMs). Material and Methods: CAD was used to evaluate 151 cases of ductal carcinoma in situ (DCIS) (n=48) and invasive breast cancer (n=103) detected with FFDM. Retrospectively, CAD sensitivity was estimated based on breast density, mammographic presentation, histopathology type, and lesion size. CAD false-positive rate was estimated with screening FFDMs from 200 women. Results: CAD detected 93% (141/151) of cancer cases: 97% (28/29) in fatty breasts, 94% (81/86) in breasts containing scattered fibroglandular densities, 90% (28/31) in heterogeneously dense breasts, and 80% (4/5) in extremely dense breasts. CAD detected 98% (54/55) of cancers manifesting as calcifications, 89% (74/83) as masses, and 100% (13/13) as mixed masses and calcifications. CAD detected 92% (73/79) of invasive ductal carcinomas, 89% (8/9) of invasive lobular carcinomas, 93% (14/15) of other invasive carcinomas, and 96% (46/48) of DCIS. CAD sensitivity for cancers 1-10 mm was 87% (47/54); 11-20 mm, 99% (70/71); 21-30 mm, 86% (12/14); and larger than 30 mm, 100% (12/12). The CAD false-positive rate was 2.5 marks per case. Conclusion: CAD with FFDM showed a high sensitivity in identifying cancers manifesting as calcifications or masses. CAD sensitivity was maintained in small lesions (1-20 mm) and invasive lobular carcinomas, which have lower mammographic sensitivity

  14. Relationship between arterial vascular calcifications seen on screening mammograms and biochemical markers of endothelial injury

    Energy Technology Data Exchange (ETDEWEB)

    Pidal, Diego [Unidad de Investigacion del, Hospital de Jove, Gijon (Spain)], E-mail: dpidal@hotmail.com; Sanchez Vidal, M Teresa [Servicio de Medicina Interna, Hospital de Jove (Spain)], E-mail: medicinainterna@hospitaldejove.com; Rodriguez, Juan Carlos [Unidad de Investigacion del, Hospital de Jove, Gijon (Spain); Servicio de Cirugia General, Hospital de Jove (Spain); Instituto Universitario de Oncologia del Principado de Asturias, Oviedo (Spain)], E-mail: investigacion@hospitaldejove.com; Corte, M Daniela [Unidad de Investigacion del, Hospital de Jove, Gijon (Spain); Instituto Universitario de Oncologia del Principado de Asturias, Oviedo (Spain)], E-mail: mdanielac@hotmail.com; Pravia, Paz [Servicio de Radiodiagnostico, Hospital de Jove (Spain)], E-mail: radiologia@hospitaldejove.com; Guinea, Oscar [Servicio de Radiodiagnostico, Hospital de Jove (Spain)], E-mail: oscarfguinea@seram.org; Pidal, Ivan [Unidad de Investigacion del, Hospital de Jove, Gijon (Spain)], E-mail: ivanpida@hotmail.com; Bongera, Miguel [Unidad de Investigacion del, Hospital de Jove, Gijon (Spain)], E-mail: mbchoppy@hotmail.com; Escribano, Damaso [Servicio de Medicina Interna, Hospital de Jove (Spain)], E-mail: medicinainterna@hospitaldejove.com; Gonzalez, Luis O. [Unidad de Investigacion del, Hospital de Jove, Gijon (Spain)], E-mail: lovidiog@telefonica.net; Diez, M Cruz [Servicio de Cirugia General, Hospital de Jove (Spain)], E-mail: cirugiageneral@hospitaldejove.com; Venta, Rafael [Servicio de Analisis Clinicos, Hospital de San Agustin, Aviles (Spain); Departamento de Bioquimica y Biologia Molecular, Universidad de Oviedo (Spain)], E-mail: rafael.venta@sespa.princast.es; Vizoso, Francisco J. [Unidad de Investigacion del, Hospital de Jove, Gijon (Spain); Servicio de Cirugia General, Hospital de Jove (Spain); Instituto Universitario de Oncologia del Principado de Asturias, Oviedo (Spain)], E-mail: fjvizoso@telefonica.net

    2009-01-15

    To assess whether breast arterial calcifications (BAC) are associated with altered serum markers of cardiovascular risk, mammograms and records from 1759 women (age range: 45-65 years) screened for breast cancer were reviewed. One hundred and forty-seven (8.36%) women showed BAC. A total of 136 women with BAC and controls (mean age: 57 and 55 years, respectively) accepted entering the study. There were no significant differences in serum levels of urea, glucose, uric acid, creatinine, total cholesterol, HDL-C, LDL-C, folic acid, vitamin B12, TSH or cysteine between the two groups of patients. However, women with BAC showed higher serum levels of triglycerides (p = 0.006), homocysteine (p = 0.002) and hs-CRP (p = 0.003) than women without BAC. Likewise, we found a significantly higher percentage of cases with an elevated LDL-C/HDL-C ratio (coronary risk index >2) amongst women with BAC than in women without BAC (56.7% and 38.2%, respectively; p = 0.04). Our results indicate that the finding of BAC identifies women with altered serum markers of cardiovascular risk.

  15. Relationship between arterial vascular calcifications seen on screening mammograms and biochemical markers of endothelial injury

    International Nuclear Information System (INIS)

    Pidal, Diego; Sanchez Vidal, M Teresa; Rodriguez, Juan Carlos; Corte, M Daniela; Pravia, Paz; Guinea, Oscar; Pidal, Ivan; Bongera, Miguel; Escribano, Damaso; Gonzalez, Luis O.; Diez, M Cruz; Venta, Rafael; Vizoso, Francisco J.

    2009-01-01

    To assess whether breast arterial calcifications (BAC) are associated with altered serum markers of cardiovascular risk, mammograms and records from 1759 women (age range: 45-65 years) screened for breast cancer were reviewed. One hundred and forty-seven (8.36%) women showed BAC. A total of 136 women with BAC and controls (mean age: 57 and 55 years, respectively) accepted entering the study. There were no significant differences in serum levels of urea, glucose, uric acid, creatinine, total cholesterol, HDL-C, LDL-C, folic acid, vitamin B12, TSH or cysteine between the two groups of patients. However, women with BAC showed higher serum levels of triglycerides (p = 0.006), homocysteine (p = 0.002) and hs-CRP (p = 0.003) than women without BAC. Likewise, we found a significantly higher percentage of cases with an elevated LDL-C/HDL-C ratio (coronary risk index >2) amongst women with BAC than in women without BAC (56.7% and 38.2%, respectively; p = 0.04). Our results indicate that the finding of BAC identifies women with altered serum markers of cardiovascular risk.

  16. Evaluating radiographers' diagnostic accuracy in screen-reading mammograms: what constitutes a quality study?

    International Nuclear Information System (INIS)

    Debono, Josephine C; Poulos, Ann E

    2015-01-01

    The aim of this study was first to evaluate the quality of studies investigating the diagnostic accuracy of radiographers as mammogram screen-readers, and then to develop an adapted tool for determining the quality of screen-reading studies. A literature search was used to identify relevant studies, and a quality evaluation tool was constructed by combining the criteria for quality of Whiting, Rutjes, Dinnes et al. and of Brealey and Westwood. This constructed tool was then applied to the studies and subsequently adapted specifically for use in evaluating the quality of studies investigating the diagnostic accuracy of screen-readers. Eleven studies were identified and the constructed tool applied to evaluate quality. This evaluation resulted in the identification of quality issues with the studies, such as potential for bias, applicability of results, study conduct, reporting of the study, and observer characteristics. An assessment of the applicability and relevance of the tool for this area of research resulted in adaptations to the criteria and the development of a tool specifically for evaluating diagnostic accuracy in screen-reading. This tool, with further refinement and rigorous validation, can make a significant contribution to promoting well-designed studies in this important area of research and practice.

  17. Application of support vector machines to breast cancer screening using mammogram and history data

    Science.gov (United States)

    Land, Walker H., Jr.; Akanda, Anab; Lo, Joseph Y.; Anderson, Francis; Bryden, Margaret

    2002-05-01

    Support Vector Machines (SVMs) are a new and radically different type of classifier and learning machine that uses a hypothesis space of linear functions in a high-dimensional feature space. This relatively new paradigm, based on Statistical Learning Theory (SLT) and Structural Risk Minimization (SRM), has many advantages when compared to traditional neural networks, which are based on Empirical Risk Minimization (ERM). Unlike neural networks, SVM training always finds a global minimum. Furthermore, SVMs have an inherent ability to solve pattern classification problems without incorporating any problem-domain knowledge. In this study, the SVM was employed as a pattern classifier operating on mammography data used for breast cancer detection. The main focus was to formulate the best learning machine configurations for optimum specificity and positive predictive value at very high sensitivities. Using a mammogram database of 500 biopsy-proven samples, the best performing SVM, on average, was able to achieve (under statistical 5-fold cross-validation) a specificity of 45.0% and a positive predictive value (PPV) of 50.1% at 100% sensitivity. At 97% sensitivity, a specificity of 55.8% and a PPV of 55.2% were obtained.
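
    The evaluation scheme described here (5-fold cross-validation, reporting specificity and PPV at the operating point that keeps sensitivity at 100%) can be sketched as below. This is not the original study's code: the 500 "biopsy-proven" samples are replaced by synthetic features, and the kernel and threshold rule are assumptions.

```python
# Sketch: per-fold, choose the lowest score among malignant cases as the
# threshold (=> 100% sensitivity on that fold), then report specificity and PPV.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import StratifiedKFold
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, n_features=15, weights=[0.5, 0.5], random_state=0)
spec, ppv = [], []

for train, test in StratifiedKFold(n_splits=5, shuffle=True, random_state=0).split(X, y):
    model = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True)).fit(X[train], y[train])
    scores = model.predict_proba(X[test])[:, 1]
    thr = scores[y[test] == 1].min()              # keep every malignant case above threshold
    pred = scores >= thr
    tn = np.sum(~pred & (y[test] == 0)); fp = np.sum(pred & (y[test] == 0))
    tp = np.sum(pred & (y[test] == 1))
    spec.append(tn / (tn + fp)); ppv.append(tp / (tp + fp))

print(f"Specificity at 100% sensitivity: {np.mean(spec):.1%}, PPV: {np.mean(ppv):.1%}")
```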

  18. The ATLAS distributed analysis system

    OpenAIRE

    Legger, F.

    2014-01-01

    In the LHC operations era, analysis of the multi-petabyte ATLAS data sample by globally distributed physicists is a challenging task. To attain the required scale the ATLAS Computing Model was designed around the concept of grid computing, realized in the Worldwide LHC Computing Grid (WLCG), the largest distributed computational resource existing in the sciences. The ATLAS experiment currently stores over 140 PB of data and runs about 140,000 concurrent jobs continuously at WLCG sites. During...

  19. Inequity of healthcare utilization on mammography examination and Pap smear screening in Thailand: Analysis of a population-based household survey.

    Directory of Open Access Journals (Sweden)

    Sukanya Chongthawonsatid

    Full Text Available Healthcare in Thailand is not equally distributed, and not all people can equally access healthcare resources even if they are covered by health insurance. To examine factors associated with the utilization of mammography examination for breast cancer and Pap smear screening for cervical cancer, data from the national reproductive health survey conducted by the National Statistical Office of Thailand in 2009 was examined. The survey was carried out on 15,074,126 women aged 30-59 years. The results showed that the wealthier respondents had more mammograms than did the lower-income groups. The concentration index was 0.144. The data on Pap smears for cervical cancer also showed that the wealthier respondents were more likely to have had a Pap smear than their lower-income counterparts. The concentration index was 0.054. Determinants of mammography examination were education, followed by health welfare and wealth index, whereas the determinants of Pap smear screening were wealth index, followed by health welfare and education. The government should support greater education for women because education was associated with socioeconomic status and wealth. There should be an increase in the number of screening campaigns, mobile clinics, and low-cost mammograms and continued support for accessibility to mammograms, especially in rural areas and low-income communities.
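    The concentration index quoted above is commonly computed with the "convenient covariance" formula C = 2·cov(h, r)/μ, where r is the fractional rank of individuals ordered by wealth and h is the health variable. The sketch below is a minimal rendering of that formula on synthetic data and is not derived from the Thai survey itself.

```python
import numpy as np

def concentration_index(outcome, wealth):
    """Wealth-related concentration index: C = 2 * cov(h, r) / mean(h),
    where r is the fractional rank of individuals ordered by wealth."""
    order = np.argsort(wealth)
    h = np.asarray(outcome, dtype=float)[order]
    n = h.size
    r = (np.arange(1, n + 1) - 0.5) / n          # fractional rank in [0, 1]
    return 2.0 * np.cov(h, r, bias=True)[0, 1] / h.mean()

# Synthetic example: screening uptake rises with wealth -> positive (pro-rich) index.
rng = np.random.default_rng(42)
wealth = rng.lognormal(mean=0.0, sigma=1.0, size=10_000)
p_screen = 0.2 + 0.5 * (np.argsort(np.argsort(wealth)) / wealth.size)   # pro-rich gradient
had_mammogram = rng.random(wealth.size) < p_screen
print(f"concentration index: {concentration_index(had_mammogram, wealth):.3f}")
```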

  20. Analysis about correlation between the shape and histopathological locations of mammographic microcalcifications

    International Nuclear Information System (INIS)

    Kim, Wha Young; Cho, Young Ah; Choi, Hye Young; Sung, Soon Hee; Bacek, Seung Yeon

    1998-01-01

    To analyze the location of microcalcifications present on pathologic specimens and the relationship between the shape of clustered microcalcifications seen on mammograms and the location of these microcalcifications on the pathologic specimen. In 84 female patients aged 25-68, we analysed the location of microcalcifications seen on pathologic specimens. In 65 cases, the shape of these microcalcifications was correlated with their location. These shapes, as seen on mammograms, were classified as granular, linear, or branching; the location of microcalcifications was defined as intraductal, stromal, lobular, or a mixture of the three. To determine the difference, if any, between pathologic diagnosis and pathological location and shape as seen on mammograms, statistical analysis using the chi-square test was performed. Among the 84 cases, 51 were benign and 33 were malignant. In both types of disease, in 45% and 58% of cases, respectively, microcalcifications were located intraductally. There was no statistically significant difference between pathologic diagnosis and pathologic location (p = 0.191); analysis of the relationship between the shape of microcalcifications and pathological location similarly revealed no statistically significant difference (p > 0.05). In four of the 33 cases of malignant disease (12%), there was microcalcification not only of the tumor itself but also of the adjacent non-tumorous region. Regardless of whether the disease was benign or malignant, microcalcifications were most commonly intraductal. The relationship between the shape and location of microcalcifications seen on pathologic specimens demonstrated no statistical significance.
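    A minimal sketch of the chi-square contingency test described above, using scipy; the table of counts is purely illustrative and does not reproduce the study's data.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Illustrative contingency table (counts are made up, not the study's data):
# rows = mammographic shape of clustered microcalcifications,
# columns = histopathological location.
shapes = ["granular", "linear", "branching"]
locations = ["intraductal", "stromal", "lobular", "mixed"]
table = np.array([
    [18, 7, 5, 6],   # granular
    [ 9, 3, 2, 4],   # linear
    [ 6, 2, 1, 2],   # branching
])

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
# A p-value above 0.05 would be consistent with the study's finding of no
# significant association between shape and histopathological location.
```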

  1. Registration and analysis for images couple : application to mammograms

    OpenAIRE

    Boucher, Arnaud

    2014-01-01

    Advisor: Nicole Vincent. Date and location of PhD thesis defense: 10 January 2013, University of Paris Descartes. In this thesis, the problem addressed is the development of a computer-aided diagnosis (CAD) system based on the joint analysis of several images, and therefore on the comparison of these medical images. The particularity of our approach is to look for evolutions or aberrant new tissues within a given set, rather than attempting to characterize, with a strong a priori, the type of ti...

  2. Probabilistic analysis of flaw distribution on structure under cyclic load

    International Nuclear Information System (INIS)

    Kwak, Sang Log; Choi, Young Hwan; Kim, Hho Jung

    2003-01-01

    Flaw geometries, applied stress, and material properties are the major input variables for fracture mechanics analysis. A probabilistic approach can be applied to account for uncertainties in these input variables, but probabilistic analysis requires many assumptions due to the lack of initial flaw distribution data. In this study, correlations are examined between initial flaw distributions and in-service flaw distributions on structures under cyclic load. For the analysis, LEFM theories and Monte Carlo simulation are applied. Results show that in-service flaw distributions are determined by the initial flaw distributions rather than by the fatigue crack growth rate, so the initial flaw distribution can be derived from in-service flaw distributions.
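    The following is a hedged sketch of the kind of Monte Carlo propagation described: an assumed lognormal initial flaw-depth distribution is grown with a Paris-type LEFM crack-growth law, after which the initial and in-service distributions can be compared. All parameter values (Paris constants, stress range, cycle count, critical depth) are illustrative and are not taken from the paper.

```python
import numpy as np

# Monte Carlo sketch: propagate an assumed initial flaw-depth distribution through
# Paris-law fatigue crack growth, da/dN = C * (dK)^m, with dK ~ Y * dS * sqrt(pi * a).
rng = np.random.default_rng(7)

n_samples = 100_000
a0 = rng.lognormal(mean=np.log(0.5), sigma=0.5, size=n_samples)   # initial depth [mm]

C, m = 1e-11, 3.0        # Paris-law constants (units consistent with mm, MPa*sqrt(mm))
Y, dS = 1.12, 40.0       # geometry factor and cyclic stress range [MPa]
cycles, dN = 200_000, 1_000
a_crit = 20.0            # treat flaws deeper than this as failed / through-wall [mm]

a = a0.copy()
for _ in range(cycles // dN):
    dK = Y * dS * np.sqrt(np.pi * a)
    a = np.minimum(a + C * dK**m * dN, a_crit)     # explicit integration in cycle blocks

print("initial flaw depth: mean %.3f mm, 95th pct %.3f mm" % (a0.mean(), np.percentile(a0, 95)))
print("in-service depth:   mean %.3f mm, 95th pct %.3f mm" % (a.mean(), np.percentile(a, 95)))
```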

  3. Computer-aided classification of breast masses using contrast-enhanced digital mammograms

    Science.gov (United States)

    Danala, Gopichandh; Aghaei, Faranak; Heidari, Morteza; Wu, Teresa; Patel, Bhavika; Zheng, Bin

    2018-02-01

    By taking advantage of both mammography and breast MRI, contrast-enhanced digital mammography (CEDM) has emerged as a promising new imaging modality to improve the efficacy of breast cancer screening and diagnosis. The primary objective of this study was to develop and evaluate a new computer-aided detection and diagnosis (CAD) scheme for CEDM images to classify between malignant and benign breast masses. A CEDM dataset consisting of 111 patients (33 benign and 78 malignant) was retrospectively assembled. Each case includes two types of images, namely low-energy (LE) and dual-energy subtracted (DES) images. First, the CAD scheme applied a hybrid segmentation method to automatically segment masses depicted on LE and DES images separately. Optimal segmentation results from DES images were also mapped to LE images and vice versa. Next, a set of 109 quantitative image features related to mass shape and density heterogeneity was initially computed. Last, four multilayer perceptron-based machine learning classifiers, integrated with a correlation-based feature subset evaluator and a leave-one-case-out cross-validation method, were built to classify mass regions depicted on LE and DES images, respectively. When the CAD scheme was applied to the original segmentation of DES and LE images, the areas under the ROC curves were 0.7585±0.0526 and 0.7534±0.0470, respectively. After optimal segmentation mapping from DES to LE images, the AUC value of the CAD scheme increased significantly to 0.8477±0.0376. Because overlapping dense breast tissue on lesions is suppressed in DES images, segmentation accuracy was significantly improved as compared to regular mammograms, and the study demonstrated that computer-aided classification of breast masses using CEDM images yielded higher performance.
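    A minimal sketch, in the spirit of the classifier stage described above, of a multilayer perceptron evaluated with leave-one-case-out cross-validation and an ROC AUC. The 111-case feature matrix is synthetic, and the correlation-based feature selection step is omitted for brevity.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import LeaveOneOut
from sklearn.metrics import roc_auc_score

# Synthetic stand-in for the 111-case feature matrix (109 shape/heterogeneity features).
rng = np.random.default_rng(3)
X = rng.normal(size=(111, 109))
y = np.r_[np.zeros(33, int), np.ones(78, int)]          # 33 benign, 78 malignant
X[y == 1, :10] += 0.8                                   # inject some class separation

scores = np.empty(len(y))
for train, test in LeaveOneOut().split(X):
    model = make_pipeline(
        StandardScaler(),
        MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
    )
    model.fit(X[train], y[train])
    scores[test] = model.predict_proba(X[test])[:, 1]   # one held-out case per fold

print(f"leave-one-case-out AUC: {roc_auc_score(y, scores):.3f}")
```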

  4. Entropy-based derivation of generalized distributions for hydrometeorological frequency analysis

    Science.gov (United States)

    Chen, Lu; Singh, Vijay P.

    2018-02-01

    Frequency analysis of hydrometeorological and hydrological extremes is needed for the design of hydraulic and civil infrastructure facilities as well as for water resources management. A multitude of distributions have been employed for frequency analysis of these extremes; however, no single distribution has been accepted as a global standard. Employing the entropy theory, this study derived five generalized distributions for frequency analysis that used different kinds of information encoded as constraints. These distributions were the generalized gamma (GG), the generalized beta distribution of the second kind (GB2), and the Halphen type A (Hal-A), Halphen type B (Hal-B) and Halphen type inverse B (Hal-IB) distributions, among which the GG and GB2 distributions were previously derived by Papalexiou and Koutsoyiannis (2012), while the Halphen family is first derived using entropy theory in this paper. The entropy theory allows the parameters of the distributions to be estimated in terms of the constraints used for their derivation. The distributions were tested using extreme daily and hourly rainfall data. Results show that the root mean square error (RMSE) values were very small, which indicates that the five generalized distributions fit the extreme rainfall data well. Among them, according to the Akaike information criterion (AIC) values, the GB2 and the Halphen family generally gave a better fit. Therefore, these generalized distributions are among the best choices for frequency analysis, and the entropy-based derivation provides a new way to perform frequency analysis of hydrometeorological extremes.
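    As a small illustration of fitting one of the families discussed (the generalized gamma) and scoring the fit with the RMSE between empirical and fitted quantiles plus the AIC, the sketch below uses scipy on a synthetic sample of rainfall maxima; it is not the authors' entropy-based estimation procedure.

```python
import numpy as np
from scipy import stats

# Synthetic annual-maximum daily rainfall sample [mm]; stands in for observed extremes.
rng = np.random.default_rng(11)
rain = stats.gamma.rvs(a=3.0, scale=25.0, size=60, random_state=rng)

# Fit the generalized gamma (GG) distribution by maximum likelihood.
a, c, loc, scale = stats.gengamma.fit(rain, floc=0)     # fix location at zero

# Goodness of fit: RMSE between empirical and fitted quantiles, plus AIC.
probs = (np.arange(1, rain.size + 1) - 0.5) / rain.size
emp_q = np.sort(rain)
fit_q = stats.gengamma.ppf(probs, a, c, loc=loc, scale=scale)
rmse = np.sqrt(np.mean((emp_q - fit_q) ** 2))

loglik = np.sum(stats.gengamma.logpdf(rain, a, c, loc=loc, scale=scale))
aic = 2 * 3 - 2 * loglik                                # 3 free parameters (loc fixed)

print(f"GG fit: a={a:.2f}, c={c:.2f}, scale={scale:.1f}, RMSE={rmse:.2f} mm, AIC={aic:.1f}")
```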

  5. A Comparison of Distribution Free and Non-Distribution Free Factor Analysis Methods

    Science.gov (United States)

    Ritter, Nicola L.

    2012-01-01

    Many researchers recognize that factor analysis can be conducted on both correlation matrices and variance-covariance matrices. Although most researchers extract factors from non-distribution free or parametric methods, researchers can also extract factors from distribution free or non-parametric methods. The nature of the data dictates the method…

  6. Generalized Analysis of a Distribution Separation Method

    Directory of Open Access Journals (Sweden)

    Peng Zhang

    2016-04-01

    Full Text Available Separating two probability distributions from a mixture model that is made up of a combination of the two is essential to a wide range of applications. For example, in information retrieval (IR), there often exists a mixture distribution consisting of a relevance distribution that we need to estimate and an irrelevance distribution that we hope to get rid of. Recently, a distribution separation method (DSM) was proposed to approximate the relevance distribution by separating a seed irrelevance distribution from the mixture distribution. It was successfully applied to an IR task, namely pseudo-relevance feedback (PRF), where the query expansion model is often a mixture term distribution. Although initially developed in the context of IR, DSM is indeed a general mathematical formulation for probability distribution separation. Thus, it is important to further generalize its basic analysis and to explore its connections to other related methods. In this article, we first extend DSM's theoretical analysis, which was originally based on the Pearson correlation coefficient, to entropy-related measures, including the KL-divergence (Kullback–Leibler divergence), the symmetrized KL-divergence and the JS-divergence (Jensen–Shannon divergence). Second, we investigate the distribution separation idea in a well-known method, namely the mixture model feedback (MMF) approach. We prove that MMF also complies with the linear combination assumption, and then DSM's linear separation algorithm can largely simplify the EM algorithm in MMF. These theoretical analyses, as well as further empirical evaluation results, demonstrate the advantages of our DSM approach.
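    Below is a toy rendering of the linear-combination idea behind DSM, under the simplifying assumption that the observed mixture equals lambda times the seed irrelevance distribution plus (1 - lambda) times the relevance distribution, with lambda chosen as the value whose separated distribution is least correlated with the seed. This is a simplified reading for illustration only, not the paper's exact algorithm.

```python
import numpy as np

def separate(mixture, seed, lambdas=np.linspace(0.01, 0.95, 95)):
    """Toy distribution separation: assume mixture = lam*seed + (1-lam)*relevance.
    Scan lam and keep the separated distribution least correlated with the seed."""
    best = None
    for lam in lambdas:
        rel = (mixture - lam * seed) / (1.0 - lam)
        rel = np.clip(rel, 0.0, None)
        rel = rel / rel.sum()                        # renormalize to a distribution
        corr = abs(np.corrcoef(rel, seed)[0, 1])
        if best is None or corr < best[0]:
            best = (corr, lam, rel)
    return best[1], best[2]

# Toy term distributions over a 20-word vocabulary.
rng = np.random.default_rng(5)
relevance = rng.dirichlet(np.ones(20) * 0.5)
seed_irrel = rng.dirichlet(np.ones(20) * 0.5)
true_lam = 0.4
mixture = true_lam * seed_irrel + (1 - true_lam) * relevance

lam_hat, rel_hat = separate(mixture, seed_irrel)
print(f"estimated lambda: {lam_hat:.2f} (true {true_lam})")
print(f"L1 error of recovered relevance distribution: {np.abs(rel_hat - relevance).sum():.3f}")
```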

  7. Human observer detection experiments with mammograms and power-law noise

    International Nuclear Information System (INIS)

    Burgess, Arthur E.; Jacobson, Francine L.; Judy, Philip F.

    2001-01-01

    We determined contrast thresholds for lesion detection as a function of lesion size in both mammograms and filtered noise backgrounds with the same average power spectrum, P(f) = B/f³. Experiments were done using hybrid images with digital images of tumors added to digitized normal backgrounds, displayed on a monochrome monitor. Four tumors were extracted from digitized specimen radiographs. The lesion sizes were varied by digital rescaling to cover the range from 0.5 to 16 mm. Amplitudes were varied to determine the value required for 92% correct detection in two-alternative forced-choice (2AFC) and 90% for search experiments. Three observers participated, two physicists and a radiologist. The 2AFC mammographic results demonstrated a novel contrast-detail (CD) diagram with threshold amplitudes that increased steadily (with slope of 0.3) with increasing size for lesions larger than 1 mm. The slopes for prewhitening model observers were about 0.4. Human efficiency relative to these models was as high as 90%. The CD diagram slopes for the 2AFC experiments with filtered noise were 0.44 for humans and 0.5 for models. Human efficiency relative to the ideal observer was about 40%. The difference in efficiencies for the two types of backgrounds indicates that breast structure cannot be considered to be pure random noise for 2AFC experiments. Instead, 2AFC human detection with mammographic backgrounds is limited by a combination of noise and deterministic masking effects. The search experiments also gave thresholds that increased with lesion size. However, there was no difference in human results for mammographic and filtered noise backgrounds, suggesting that breast structure can be considered to be pure random noise for this task. Our conclusion is that, in spite of the fact that mammographic backgrounds have nonstationary statistics, models based on statistical decision theory can still be applied successfully to estimate human performance.
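    A short sketch of how a filtered-noise background with power spectrum P(f) = B/f³ can be generated by shaping white Gaussian noise in the Fourier domain, which is one common way to produce the kind of backgrounds described above (the study's exact generation procedure may differ).

```python
import numpy as np

def power_law_background(size=512, beta=3.0, rng=None):
    """Generate a 2D random field whose power spectrum falls off as 1/f**beta,
    by shaping white Gaussian noise in the Fourier domain."""
    rng = np.random.default_rng(rng)
    white = rng.normal(size=(size, size))
    fx = np.fft.fftfreq(size)
    fy = np.fft.fftfreq(size)
    f = np.sqrt(fx[None, :] ** 2 + fy[:, None] ** 2)
    f[0, 0] = np.inf                       # suppress the DC term
    amplitude = f ** (-beta / 2.0)         # amplitude filter = sqrt(power spectrum)
    field = np.fft.ifft2(np.fft.fft2(white) * amplitude).real
    return (field - field.mean()) / field.std()

background = power_law_background(size=512, beta=3.0, rng=1)
print(background.shape, round(background.std(), 2))
```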

  8. Translating the 2-dimensional mammogram into a 3-dimensional breast: Identifying factors that influence the movement of pre-operatively placed wire.

    Science.gov (United States)

    Park, Ko Un; Nathanson, David

    2017-08-01

    Pre-operative measurements from the skin to a wire-localized breast lesion can differ from operating room measurements. This study was designed to measure the discrepancies and study factors that may contribute to wire movement. Prospective data were collected on patients who underwent wire localization lumpectomy. Clip and hook location, breast size, density, and direction of wire placement were the main focus of the analysis. Wire movement was more likely with longer distance from skin to hook or clip, larger breast size (especially if "fatty"), longer time between wire placement and surgery start time, and medial wire placement in larger breast. Age, body mass index, presence of mass, malignant diagnosis, tumor grade, and clip distance to the chest wall were not associated with wire movement. A longer distance from skin to hook correlated with larger specimen volume. Translation of the lesion location from a 2-dimensional mammogram into 3-dimensional breasts is sometimes discrepant because of movement of the localizing wire. Breast size, distance of skin to clip or hook, and wire exit site in larger breasts have a significant impact on wire movement. This information may guide the surgeon's skin incision and extent of excision. © 2017 Wiley Periodicals, Inc.

  9. Distributed Algorithms for Time Optimal Reachability Analysis

    DEFF Research Database (Denmark)

    Zhang, Zhengkui; Nielsen, Brian; Larsen, Kim Guldstrand

    2016-01-01

    Time optimal reachability analysis is a novel model-based technique for solving scheduling and planning problems. After modeling them as reachability problems using timed automata, a real-time model checker can compute the fastest trace to the goal states, which constitutes a time optimal schedule. We propose distributed computing to accelerate time optimal reachability analysis. We develop five distributed state exploration algorithms and implement them in Uppaal, enabling it to exploit the compute resources of a dedicated model-checking cluster. We experimentally evaluate the implemented algorithms with four models in terms of their ability to compute near- or proven-optimal solutions, their scalability, time and memory consumption, and communication overhead. Our results show that the distributed algorithms work much faster than sequential algorithms and have good speedup in general.

  10. Modelling and analysis of distributed simulation protocols with distributed graph transformation

    OpenAIRE

    Lara, Juan de; Taentzer, Gabriele

    2005-01-01

    J. de Lara and G. Taentzer, "Modelling and analysis of distributed simulation protocols with distributed graph transformation...

  11. Women with physical disability and the mammogram: An observational study to identify barriers and facilitators

    International Nuclear Information System (INIS)

    Poulos, Ann; Balandin, Susan; Llewellyn, Gwynnyth; McCarthy, Louella; Dark, Leigha

    2011-01-01

    Purpose: To identify barriers and facilitators experienced by women with physical disability having a mammogram. Method: Direct observation of the mammography procedure for women with a range of physical disability at screening facilities of BreastScreen NSW Australia. Results: A volunteer sample of 13 women with varying degrees of physical disability participated in the study. The outcomes suggested that many barriers for women with physical disability can be ameliorated by environmental adaptations and guidelines for both radiographers and women. Some women however cannot be screened successfully, or can be screened only with a level of trauma and/or pain which militates against their continuation within the screening program. This study has identified physical limitations which preclude a successful outcome, those which increase the discomfort/pain of the procedure and aspects of the procedure which can be improved to minimise the experience of discomfort/pain. Conclusion: From the outcomes of the study the development of a decision tool is indicated as a method of providing information for women with physical disability and their doctors as to the likelihood of a successful outcome to participation in mammography screening.

  12. Women with physical disability and the mammogram: An observational study to identify barriers and facilitators

    Energy Technology Data Exchange (ETDEWEB)

    Poulos, Ann, E-mail: ann.poulos@sydney.edu.a [University of Sydney, Faculty of Health Sciences, Discipline of Medical Radiation Sciences, PO Box 170, Lidcombe, NSW 1825 (Australia); Balandin, Susan [University of Sydney, Faculty of Health Sciences, Discipline of Speech Pathology, PO Box 170, Lidcombe, NSW 1825 (Australia); Avdeling for helse- og sosialfag, Hogskolen i Molde, Postboks 2110, 6402 Molde (Norway); Llewellyn, Gwynnyth; McCarthy, Louella [University of Sydney, Faculty of Health Sciences, Discipline of Occupational Therapy, PO Box 170, Lidcombe, NSW 1825 (Australia); Dark, Leigha [University of Sydney, Faculty of Health Sciences, Discipline of Speech Pathology, PO Box 170, Lidcombe, NSW 1825 (Australia)

    2011-02-15

    Purpose: To identify barriers and facilitators experienced by women with physical disability having a mammogram. Method: Direct observation of the mammography procedure for women with a range of physical disability at screening facilities of BreastScreen NSW Australia. Results: A volunteer sample of 13 women with varying degrees of physical disability participated in the study. The outcomes suggested that many barriers for women with physical disability can be ameliorated by environmental adaptations and guidelines for both radiographers and women. Some women however cannot be screened successfully, or can be screened only with a level of trauma and/or pain which militates against their continuation within the screening program. This study has identified physical limitations which preclude a successful outcome, those which increase the discomfort/pain of the procedure and aspects of the procedure which can be improved to minimise the experience of discomfort/pain. Conclusion: From the outcomes of the study the development of a decision tool is indicated as a method of providing information for women with physical disability and their doctors as to the likelihood of a successful outcome to participation in mammography screening.

  13. Transient stability analysis of a distribution network with distributed generators

    NARCIS (Netherlands)

    Xyngi, I.; Ishchenko, A.; Popov, M.; Sluis, van der L.

    2009-01-01

    This letter describes the transient stability analysis of a 10-kV distribution network with wind generators, microturbines, and CHP plants. The network, modeled in Matlab/Simulink, takes into account detailed dynamic models of the generators. Fault simulations at various locations are performed.

  14. Response Time Analysis of Distributed Web Systems Using QPNs

    Directory of Open Access Journals (Sweden)

    Tomasz Rak

    2015-01-01

    Full Text Available A performance model is used for studying distributed Web systems. Performance evaluation is done by obtaining load test measurements. The Queueing Petri Nets formalism supports modeling and performance analysis of distributed World Wide Web environments. The proposed distributed Web system modeling and design methodology has been applied in the evaluation of several system architectures under different external loads. Furthermore, performance analysis is carried out to determine the system response time.

  15. Distribution of lod scores in oligogenic linkage analysis.

    Science.gov (United States)

    Williams, J T; North, K E; Martin, L J; Comuzzie, A G; Göring, H H; Blangero, J

    2001-01-01

    In variance component oligogenic linkage analysis, the estimate of the residual additive genetic variance can be bounded at zero when estimating the effect of the ith quantitative trait locus. Using quantitative trait Q1 from the Genetic Analysis Workshop 12 simulated general population data, we compare the observed lod scores from oligogenic linkage analysis with the empirical lod score distribution under a null model of no linkage. We find that zero residual additive genetic variance in the null model alters the usual distribution of the likelihood-ratio statistic.

  16. A novel approach for detection and classification of mammographic microcalcifications using wavelet analysis and extreme learning machine.

    Science.gov (United States)

    Malar, E; Kandaswamy, A; Chakravarthy, D; Giri Dharan, A

    2012-09-01

    The objective of this paper is to demonstrate the effectiveness of wavelet-based tissue texture analysis for microcalcification detection in digitized mammograms using an Extreme Learning Machine (ELM). Microcalcifications are tiny deposits of calcium in the breast tissue and are potential indicators for the early detection of breast cancer. The dense nature of the breast tissue and the poor contrast of the mammogram image hamper the identification of microcalcifications. Hence, a new approach to discriminate microcalcifications from normal tissue is developed using wavelet features and is compared with feature vectors extracted using Gray Level Spatial Dependence Matrix (GLSDM) and Gabor filter based techniques. A total of 120 Regions of Interest (ROIs) extracted from 55 mammogram images of the mini-MIAS database, including normal and microcalcification images, are used in the current research. The network is trained with the above-mentioned features, and the results show that ELM produces relatively better classification accuracy (94%) with a significant reduction in training time compared with other classifiers such as the Bayes net classifier, the naive Bayes classifier, and the Support Vector Machine. ELM also avoids problems like local minima, improper learning rate, and overfitting. Copyright © 2012 Elsevier Ltd. All rights reserved.
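    A minimal numpy implementation of the Extreme Learning Machine idea referred to above: random hidden-layer weights, sigmoid activations, and output weights obtained in a single least-squares step, which is what gives ELM its short training time. The input features here are random stand-ins for the wavelet texture features.

```python
import numpy as np

class ELM:
    """Minimal Extreme Learning Machine: random hidden layer + least-squares output."""
    def __init__(self, n_hidden=50, seed=0):
        self.n_hidden, self.rng = n_hidden, np.random.default_rng(seed)

    def fit(self, X, y):
        n_features = X.shape[1]
        self.W = self.rng.normal(size=(n_features, self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = self._hidden(X)
        self.beta = np.linalg.pinv(H) @ y            # single analytic solve, no iteration
        return self

    def _hidden(self, X):
        return 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))   # sigmoid activations

    def predict(self, X):
        return (self._hidden(X) @ self.beta > 0.5).astype(int)

# Stand-in for wavelet texture features of 120 ROIs (microcalcification vs normal).
rng = np.random.default_rng(4)
X = rng.normal(size=(120, 24))
y = rng.integers(0, 2, size=120)
X[y == 1] += 0.7                                     # inject some class separation

split = 90
elm = ELM(n_hidden=50).fit(X[:split], y[:split])
acc = (elm.predict(X[split:]) == y[split:]).mean()
print(f"hold-out accuracy: {acc:.2f}")
```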

  17. Spatial Distribution Characteristics of Healthcare Facilities in Nanjing: Network Point Pattern Analysis and Correlation Analysis

    Directory of Open Access Journals (Sweden)

    Jianhua Ni

    2016-08-01

    Full Text Available The spatial distribution of urban service facilities is largely constrained by the road network. In this study, network point pattern analysis and correlation analysis were used to analyze the relationship between the road network and the healthcare facility distribution. The weighted network kernel density estimation method proposed in this study identifies significant differences between the areas outside and inside the Ming city wall. The results of the network K-function analysis show that private hospitals are more evenly distributed than public hospitals, and that pharmacy stores tend to cluster around hospitals along the road network. After performing correlation analysis between the different categories of hospitals and street centrality, we find that the distribution of these hospitals correlates highly with street centrality, and that the correlations are higher for private and small hospitals than for public and large hospitals. The comprehensive analysis results could help assess the reasonableness of the existing urban healthcare facility distribution and optimize the location of new healthcare facilities.
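    A small sketch of the correlation step on synthetic data: Pearson and Spearman correlation between a street-centrality measure and facility counts per segment. The variables are placeholders, not the Nanjing data.

```python
import numpy as np
from scipy import stats

# Synthetic stand-ins: one row per street segment.
rng = np.random.default_rng(8)
centrality = rng.gamma(shape=2.0, scale=1.0, size=400)        # e.g. betweenness centrality
hospital_count = rng.poisson(lam=0.2 + 0.4 * centrality)      # counts rise with centrality

pearson_r, pearson_p = stats.pearsonr(centrality, hospital_count)
spearman_r, spearman_p = stats.spearmanr(centrality, hospital_count)
print(f"Pearson r = {pearson_r:.2f} (p = {pearson_p:.1e}); "
      f"Spearman rho = {spearman_r:.2f} (p = {spearman_p:.1e})")
```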

  18. Volumetric Mammogram Assessment: A Helpful Tool in the Treatment of Breast Asymmetries.

    Science.gov (United States)

    Zimman, Oscar A; Butto, Carlos D; Rostagno, Román; Rostagno, Camila

    2017-12-01

    The surgical approach to breast asymmetry depends on several factors, including the surgeon's experience, the anatomy of the patient, and several methods that may help to choose a technique and define the size of the implant or the amount of breast tissue to be excised. The aim of this study is to assist in the evaluation of breast volumes with the Quantra™ software application, intended for use with Hologic™ digital mammography systems. Twenty-eight women were studied preoperatively with full-field digital mammography (FFDM) using the Quantra™ software application on Hologic™ digital mammography systems. The case diagnoses were as follows: breast hypertrophy, ptosis, hypoplasia, and reconstruction, and the surgeries included breast reduction, mastopexy, mastopexy and breast reduction, mastoplasty and breast augmentation, breast augmentation, and immediate or delayed breast reconstruction. Patients were evaluated from 6 to 18 months after surgery. Volumetric mammogram studies help to decide the amount of tissue to be excised, the size of the implants, and the combination of both. The results of this study were evaluated by surgeons and patients and found to be highly satisfactory. The use of full-field digital mammography with adequate software should be considered as another tool to assist in making decisions regarding the correction of breast asymmetries. Level of Evidence IV: This journal requires that authors assign a level of evidence to each article. For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the online Instructions to Authors at www.springer.com/00266.

  19. About normal distribution on SO(3) group in texture analysis

    Science.gov (United States)

    Savyolova, T. I.; Filatov, S. V.

    2017-12-01

    This article studies and compares different normal distributions (NDs) on the SO(3) group, which are used in texture analysis. These NDs are the Fisher normal distribution (FND), the Bunge normal distribution (BND), the central normal distribution (CND) and the wrapped normal distribution (WND). All of the previously mentioned NDs are central functions on the SO(3) group. The CND is a subcase of the normal CLT-motivated distributions on SO(3) (CLT here is Parthasarathy's central limit theorem). The WND is motivated by the CLT in R³ and mapped to the SO(3) group. A Monte Carlo method for modeling normally distributed values was studied for both the CND and the WND. All of the NDs mentioned above are used for modeling different components of the crystallite orientation distribution function in texture analysis.
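    A hedged sketch of the Monte Carlo construction usually associated with the wrapped normal on SO(3): rotation vectors are drawn from an isotropic normal in R³ and mapped to rotations through the exponential map (scipy's Rotation.from_rotvec). Sampling for the CND is not shown, and the parameterization here may differ from the article's.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def sample_wrapped_normal_so3(sigma, n, seed=0):
    """Monte Carlo sampling of a wrapped-normal-like distribution on SO(3):
    draw rotation vectors from N(0, sigma^2 I) in R^3 and map them to
    rotations via the exponential map."""
    rng = np.random.default_rng(seed)
    rotvecs = rng.normal(scale=sigma, size=(n, 3))
    return Rotation.from_rotvec(rotvecs)

rotations = sample_wrapped_normal_so3(sigma=0.2, n=10_000)
angles = rotations.magnitude()           # rotation angles (misorientation from identity)
print(f"mean rotation angle: {np.degrees(angles.mean()):.1f} deg")
```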

  20. Mass-Like Fibrocystic Disease of the Breast : Characteristic Findings on Mammogram and Sonogram

    International Nuclear Information System (INIS)

    KIm, Hyeon Hee; Yun, Duk Hee; Hwang, Ho Kyumg; Kim, Jang Min; Kim, Young Sun; Lee, Jung Hee

    1995-01-01

    This study was performed to evaluate the mammographic and sonographic features of mass-like fibrocystic disease of the breast in order to differentiate it from other breast masses. We retrospectively analyzed the characteristic mammographic (16 cases) and sonographic (39 cases) findings of histopathologically proven mass-like fibrocystic disease of the breast in 39 patients. Of the 16 patients with mammograms, mass-like fibrocystic disease of the breast was round in shape in 12 cases and of high density in 14 cases. The margin of the mass was well marginated in 8 cases and poorly marginated in 8 cases. Calcification within the mass was not detected in 13 cases. In the 39 patients with sonograms, mass-like fibrocystic disease of the breast was mostly ovoid in shape (24 cases), hypoechoic (23 cases), of homogeneous internal echo (36 cases), well defined (28 cases), and with equivocal posterior shadowing (26 cases). The T/AP ratio of the mass was not less than 1.5 in 29 cases. Bilateral edge-shadowing of the mass was not noted in 24 cases. The characteristic findings of mass-like fibrocystic disease of the breast are a round, high-density, well-defined mass on mammogram and an ovoid, well-marginated mass with homogeneous internal echo on sonogram, which are similar to those of other benign lesions. Mass-like fibrocystic disease, which is a frequent cause of breast lumps, should be included in the differential diagnosis of a breast mass with benign mammographic and/or sonographic findings.

  1. Mass-Like Fibrocystic Disease of the Breast : Characteristic Findings on Mammogram and Sonogram

    Energy Technology Data Exchange (ETDEWEB)

    KIm, Hyeon Hee; Yun, Duk Hee; Hwang, Ho Kyumg; Kim, Jang Min; Kim, Young Sun; Lee, Jung Hee [Kwang Myung Sung Ae Hospital, Gwangmyeong (Korea, Republic of)

    1995-12-15

    This study was performed to evaluate the mammographic and sonographic features of mass-like fibrocystic disease of the breast in order to differentiate it from other breast masses. We retrospectively analyzed the characteristic mammographic (16 cases) and sonographic (39 cases) findings of histopathologically proven mass-like fibrocystic disease of the breast in 39 patients. Of the 16 patients with mammograms, mass-like fibrocystic disease of the breast was round in shape in 12 cases and of high density in 14 cases. The margin of the mass was well marginated in 8 cases and poorly marginated in 8 cases. Calcification within the mass was not detected in 13 cases. In the 39 patients with sonograms, mass-like fibrocystic disease of the breast was mostly ovoid in shape (24 cases), hypoechoic (23 cases), of homogeneous internal echo (36 cases), well defined (28 cases), and with equivocal posterior shadowing (26 cases). The T/AP ratio of the mass was not less than 1.5 in 29 cases. Bilateral edge-shadowing of the mass was not noted in 24 cases. The characteristic findings of mass-like fibrocystic disease of the breast are a round, high-density, well-defined mass on mammogram and an ovoid, well-marginated mass with homogeneous internal echo on sonogram, which are similar to those of other benign lesions. Mass-like fibrocystic disease, which is a frequent cause of breast lumps, should be included in the differential diagnosis of a breast mass with benign mammographic and/or sonographic findings.

  2. Distributed Data Analysis in ATLAS

    CERN Document Server

    Nilsson, P; The ATLAS collaboration

    2012-01-01

    Data analysis using grid resources is one of the fundamental challenges to be addressed before the start of LHC data taking. The ATLAS detector will produce petabytes of data per year, and roughly one thousand users will need to run physics analyses on this data. Appropriate user interfaces and helper applications have been made available to ensure that the grid resources can be used without requiring expertise in grid technology. These tools enlarge the number of grid users from a few production administrators to potentially all participating physicists. ATLAS makes use of three grid infrastructures for the distributed analysis: the EGEE sites, the Open Science Grid, and NorduGrid. These grids are managed by the gLite workload management system, the PanDA workload management system, and ARC middleware; many sites can be accessed via both the gLite WMS and PanDA. Users can choose between two front-end tools to access the distributed resources. Ganga is a tool co-developed with LHCb to provide a common interfa...

  3. Analysis of framelets for breast cancer diagnosis.

    Science.gov (United States)

    Thivya, K S; Sakthivel, P; Venkata Sai, P M

    2016-01-01

    Breast cancer is the second most threatening tumor among women. An effective way of reducing its impact is early detection, which improves the diagnostic process. Digital mammography plays a significant role in screening for breast carcinoma at an early stage. Even so, it is very difficult for radiologists to identify abnormalities accurately in routine screening; Computer Aided Diagnosis (CAD) systems support more precise screening by predicting the type of abnormality. The two most important indicators of breast malignancy are microcalcifications and masses. In this study, the framelet transform, a multiresolution analysis, is investigated for the classification of these two indicators. Statistical and co-occurrence features are extracted from the framelet-decomposed mammograms at different resolution levels, and a support vector machine is employed for classification with k-fold cross-validation. The system achieves 94.82% and 100% accuracy in normal/abnormal classification (stage I) and benign/malignant classification (stage II) for the mass classification system, and 98.57% and 100% for the microcalcification system, when using the MIAS database.

  4. Investigation of psychophysical similarity measures for selection of similar images in the diagnosis of clustered microcalcifications on mammograms

    International Nuclear Information System (INIS)

    Muramatsu, Chisako; Li Qiang; Schmidt, Robert; Shiraishi, Junji; Doi, Kunio

    2008-01-01

    The presentation of images with lesions of known pathology that are similar to an unknown lesion may be helpful to radiologists in the diagnosis of challenging cases for improving the diagnostic accuracy and also for reducing variation among different radiologists. The authors have been developing a computerized scheme for automatically selecting similar images with clustered microcalcifications on mammograms from a large database. For similar images to be useful, they must be similar from the point of view of the diagnosing radiologists. In order to select such images, subjective similarity ratings were obtained for a number of pairs of clustered microcalcifications by breast radiologists for establishment of a ''gold standard'' of image similarity, and the gold standard was employed for determination and evaluation of the selection of similar images. The images used in this study were obtained from the Digital Database for Screening Mammography developed by the University of South Florida. The subjective similarity ratings for 300 pairs of images with clustered microcalcifications were determined by ten breast radiologists. The authors determined a number of image features which represent the characteristics of clustered microcalcifications that radiologists would use in their diagnosis. For determination of objective similarity measures, an artificial neural network (ANN) was employed. The ANN was trained with the average subjective similarity ratings as teacher and selected image features as input data. The ANN was trained to learn the relationship between the image features and the radiologists' similarity ratings; therefore, once the training was completed, the ANN was able to determine the similarity, called a psychophysical similarity measure, which was expected to be close to radiologists' impressions, for an unknown pair of clustered microcalcifications. By use of a leave-one-out test method, the best combination of features was selected. The correlation

  5. Investigation of psychophysical similarity measures for selection of similar images in the diagnosis of clustered microcalcifications on mammograms

    Energy Technology Data Exchange (ETDEWEB)

    Muramatsu, Chisako; Li Qiang; Schmidt, Robert; Shiraishi, Junji; Doi, Kunio [Department of Radiology, University of Chicago, 5841 South Maryland Avenue, Chicago, Illinois 60637 (United States) and Department of Intelligent Image Information, Gifu University, 1-1 Yanagido, Gifu (Japan); Department of Radiology, Duke Advanced Imaging Labs, Duke University, 2424 Erwin Road, Suite 302, Durham, North Carolina 27705 (United States); Department of Radiology, University of Chicago, 5841 South Maryland Avenue, Chicago, Illinois 60637 (United States)

    2008-12-15

    The presentation of images with lesions of known pathology that are similar to an unknown lesion may be helpful to radiologists in the diagnosis of challenging cases for improving the diagnostic accuracy and also for reducing variation among different radiologists. The authors have been developing a computerized scheme for automatically selecting similar images with clustered microcalcifications on mammograms from a large database. For similar images to be useful, they must be similar from the point of view of the diagnosing radiologists. In order to select such images, subjective similarity ratings were obtained for a number of pairs of clustered microcalcifications by breast radiologists for establishment of a ''gold standard'' of image similarity, and the gold standard was employed for determination and evaluation of the selection of similar images. The images used in this study were obtained from the Digital Database for Screening Mammography developed by the University of South Florida. The subjective similarity ratings for 300 pairs of images with clustered microcalcifications were determined by ten breast radiologists. The authors determined a number of image features which represent the characteristics of clustered microcalcifications that radiologists would use in their diagnosis. For determination of objective similarity measures, an artificial neural network (ANN) was employed. The ANN was trained with the average subjective similarity ratings as teacher and selected image features as input data. The ANN was trained to learn the relationship between the image features and the radiologists' similarity ratings; therefore, once the training was completed, the ANN was able to determine the similarity, called a psychophysical similarity measure, which was expected to be close to radiologists' impressions, for an unknown pair of clustered microcalcifications. By use of a leave-one-out test method, the best combination of features

  6. A method to test the reproducibility and to improve performance of computer-aided detection schemes for digitized mammograms

    International Nuclear Information System (INIS)

    Zheng Bin; Gur, David; Good, Walter F.; Hardesty, Lara A.

    2004-01-01

    The purpose of this study is to develop a new method for assessment of the reproducibility of computer-aided detection (CAD) schemes for digitized mammograms and to evaluate the possibility of using the implemented approach for improving CAD performance. Two thousand digitized mammograms (representing 500 cases) with 300 depicted verified masses were selected in the study. Series of images were generated for each digitized image by resampling after a series of slight image rotations. A CAD scheme developed in our laboratory was applied to all images to detect suspicious mass regions. We evaluated the reproducibility of the scheme using the detection sensitivity and false-positive rates for the original and resampled images. We also explored the possibility of improving CAD performance using three methods of combining results from the original and resampled images, including simple grouping, averaging output scores, and averaging output scores after grouping. The CAD scheme generated a detection score (from 0 to 1) for each identified suspicious region. A region with a detection score >0.5 was considered as positive. The CAD scheme detected 238 masses (79.3% case-based sensitivity) and identified 1093 false-positive regions (average 0.55 per image) in the original image dataset. In eleven repeated tests using original and ten sets of rotated and resampled images, the scheme detected a maximum of 271 masses and identified as many as 2359 false-positive regions. Two hundred and eighteen masses (80.4%) and 618 false-positive regions (26.2%) were detected in all 11 sets of images. Combining detection results improved reproducibility and the overall CAD performance. In the range of an average false-positive detection rate between 0.5 and 1 per image, the sensitivity of the scheme could be increased approximately 5% after averaging the scores of the regions detected in at least four images. At low false-positive rate (e.g., ≤average 0.3 per image), the grouping method

  7. A novel featureless approach to mass detection in digital mammograms based on support vector machines

    Energy Technology Data Exchange (ETDEWEB)

    Campanini, Renato [Department of Physics, University of Bologna, and INFN, Bologna (Italy); Dongiovanni, Danilo [Department of Physics, University of Bologna, and INFN, Bologna (Italy); Iampieri, Emiro [Department of Physics, University of Bologna, and INFN, Bologna (Italy); Lanconelli, Nico [Department of Physics, University of Bologna, and INFN, Bologna (Italy); Masotti, Matteo [Department of Physics, University of Bologna, and INFN, Bologna (Italy); Palermo, Giuseppe [Department of Physics, University of Bologna, and INFN, Bologna (Italy); Riccardi, Alessandro [Department of Physics, University of Bologna, and INFN, Bologna (Italy); Roffilli, Matteo [Department of Computer Science, University of Bologna, Bologna (Italy)

    2004-03-21

    In this work, we present a novel approach to mass detection in digital mammograms. The great variability of the appearance of masses is the main obstacle to building a mass detection method. It is indeed demanding to characterize all the varieties of masses with a reduced set of features. Hence, in our approach we have chosen not to extract any features for the detection of the regions of interest; instead, we exploit all the information available in the image. A multiresolution overcomplete wavelet representation is computed in order to encode the image with redundancy of information. The vectors of the resulting very large space are then provided to a first support vector machine (SVM) classifier. The detection task is considered here as a two-class pattern recognition problem: crops are classified as suspect or not by this SVM classifier. False candidates are eliminated with a second cascaded SVM. To further reduce the number of false positives, an ensemble of experts is applied: the final suspect regions are obtained by using a voting strategy. The sensitivity of the presented system is nearly 80% with a false-positive rate of 1.1 marks per image, estimated on images from the USF DDSM database.

  8. Measurement based scenario analysis of short-range distribution system planning

    DEFF Research Database (Denmark)

    Chen, Peiyuan; Bak-Jensen, Birgitte; Chen, Zhe

    2009-01-01

    This paper focuses on short-range distribution system planning using a probabilistic approach. Empirical probabilistic distributions of load demand and distributed generation are derived from historical measurement data and incorporated into the system planning. Simulations with various feasible scenarios are performed based on a local distribution system at Støvring in Denmark. Simulation results provide more accurate and insightful information for the decision-maker when using the probabilistic analysis than when using the worst-case analysis, so that better planning can be achieved.
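    A minimal sketch of the probabilistic idea described above: build empirical distributions from historical measurements, draw scenario samples from them, and compare a probabilistic planning quantile with the worst-case value. The load and wind series are synthetic, and their mutual correlation is ignored.

```python
import numpy as np

rng = np.random.default_rng(12)

# Stand-ins for one year of hourly measurements at a distribution feeder.
load_mw = np.clip(rng.normal(loc=4.0, scale=1.2, size=8760), 0.5, None)
wind_mw = np.clip(rng.weibull(2.0, size=8760) * 1.5, 0.0, 3.0)

def sample_empirical(measurements, n, rng):
    """Draw scenario values from the empirical distribution of the measurements
    by inverse-CDF sampling over the sorted observations."""
    sorted_vals = np.sort(measurements)
    probs = (np.arange(sorted_vals.size) + 0.5) / sorted_vals.size
    return np.interp(rng.random(n), probs, sorted_vals)

# Scenario sampling (load and wind treated as independent in this sketch).
n_scenarios = 1_000
net_load = sample_empirical(load_mw, n_scenarios, rng) - sample_empirical(wind_mw, n_scenarios, rng)
print(f"P95 net load: {np.percentile(net_load, 95):.2f} MW "
      f"(peak measured load: {load_mw.max():.2f} MW)")
```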

  9. LHCb Distributed Data Analysis on the Computing Grid

    CERN Document Server

    Paterson, S; Parkes, C

    2006-01-01

    LHCb is one of the four Large Hadron Collider (LHC) experiments based at CERN, the European Organisation for Nuclear Research. The LHC experiments will start taking an unprecedented amount of data when they come online in 2007. Since no single institute has the compute resources to handle this data, resources must be pooled to form the Grid. Where the Internet has made it possible to share information stored on computers across the world, Grid computing aims to provide access to computing power and storage capacity on geographically distributed systems. LHCb software applications must work seamlessly on the Grid allowing users to efficiently access distributed compute resources. It is essential to the success of the LHCb experiment that physicists can access data from the detector, stored in many heterogeneous systems, to perform distributed data analysis. This thesis describes the work performed to enable distributed data analysis for the LHCb experiment on the LHC Computing Grid.

  10. Analysis of Oriented Texture with Application to the Detection of Architectural Distortion in Mammograms

    CERN Document Server

    Ayres, Fabio; Desautels, JE Leo

    2011-01-01

    The presence of oriented features in images often conveys important information about the scene or the objects contained; the analysis of oriented patterns is an important task in the general framework of image understanding. As in many other applications of computer vision, the general framework for the understanding of oriented features in images can be divided into low- and high-level analysis. In the context of the study of oriented features, low-level analysis includes the detection of oriented features in images; a measure of the local magnitude and orientation of oriented features over
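    As an example of low-level oriented-feature analysis, the sketch below estimates a per-pixel dominant orientation with a Gabor filter bank (a common choice for this task, not necessarily the method developed in the book), using scikit-image.

```python
import numpy as np
from skimage.filters import gabor

def dominant_orientation_map(image, frequency=0.1, n_orientations=12):
    """Per-pixel dominant orientation: the Gabor filter angle (in [0, pi))
    whose response magnitude is largest at each pixel."""
    thetas = np.linspace(0, np.pi, n_orientations, endpoint=False)
    responses = []
    for theta in thetas:
        real, imag = gabor(image, frequency=frequency, theta=theta)
        responses.append(np.hypot(real, imag))       # magnitude of the complex response
    responses = np.stack(responses)                  # shape: (n_orientations, H, W)
    return thetas[np.argmax(responses, axis=0)]

# Synthetic oriented texture: a sinusoidal grating oriented at 30 degrees.
yy, xx = np.mgrid[0:128, 0:128]
grating = np.sin(2 * np.pi * 0.1 * (xx * np.cos(np.pi / 6) + yy * np.sin(np.pi / 6)))

orientation = dominant_orientation_map(grating)
print(f"median estimated orientation: {np.degrees(np.median(orientation)):.1f} deg (grating at 30 deg)")
```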

  11. HammerCloud: A Stress Testing System for Distributed Analysis

    CERN Document Server

    van der Ster, Daniel C; Ubeda Garcia, Mario; Paladin, Massimo

    2011-01-01

    Distributed analysis of LHC data is an I/O-intensive activity which places large demands on the internal network, storage, and local disks at remote computing facilities. Commissioning and maintaining a site to provide an efficient distributed analysis service is therefore a challenge which can be aided by tools to help evaluate a variety of infrastructure designs and configurations. HammerCloud (HC) is one such tool; it is a stress-testing service which is used by central operations teams, regional coordinators, and local site admins to (a) submit an arbitrary number of analysis jobs to a number of sites, (b) maintain at steady state a predefined number of jobs running at the sites under test, (c) produce web-based reports summarizing the efficiency and performance of the sites under test, and (d) present a web interface for historical test results to both evaluate progress and compare sites. HC was built around the distributed analysis framework Ganga, exploiting its API for grid job management. HC has been ...

  12. Three-Phase Harmonic Analysis Method for Unbalanced Distribution Systems

    Directory of Open Access Journals (Sweden)

    Jen-Hao Teng

    2014-01-01

    Full Text Available Due to the unbalanced features of distribution systems, a three-phase harmonic analysis method is essential to accurately analyze the harmonic impact on distribution systems. Moreover, harmonic analysis is the basic tool for harmonic filter design and harmonic resonance mitigation; therefore, the computational performance should also be efficient. An accurate and efficient three-phase harmonic analysis method for unbalanced distribution systems is proposed in this paper. The variations of bus voltages, bus current injections and branch currents affected by harmonic current injections can be analyzed by two relationship matrices developed from the topological characteristics of distribution systems. Some useful formulas are then derived to solve the three-phase harmonic propagation problem. After the harmonic propagation for each harmonic order is calculated, the total harmonic distortion (THD) for bus voltages can be calculated accordingly. The proposed method has better computational performance, since the time-consuming full admittance matrix inverse employed by the commonly-used harmonic analysis methods is not necessary in the solution procedure. In addition, the proposed method can provide novel viewpoints in calculating the branch currents and bus voltages under harmonic pollution, which are vital for harmonic filter design. Test results demonstrate the effectiveness and efficiency of the proposed method.
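    The THD reported by such a method is computed from the per-harmonic bus-voltage magnitudes; a minimal example with illustrative values follows.

```python
import numpy as np

def total_harmonic_distortion(v_harmonics):
    """THD of a bus voltage: RMS of the harmonic components (order >= 2)
    relative to the fundamental, expressed in percent."""
    v = np.asarray(v_harmonics, dtype=float)
    return 100.0 * np.sqrt(np.sum(v[1:] ** 2)) / v[0]

# Illustrative per-phase harmonic voltage magnitudes [p.u.]: fundamental, 5th, 7th, 11th, 13th.
phase_a = [1.000, 0.031, 0.022, 0.009, 0.006]
print(f"THD (phase a): {total_harmonic_distortion(phase_a):.2f}%")
```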

  13. Fusion of dynamic contrast-enhanced magnetic resonance mammography at 3.0 T with X-ray mammograms: Pilot study evaluation using dedicated semi-automatic registration software

    Energy Technology Data Exchange (ETDEWEB)

    Dietzel, Matthias, E-mail: dietzelmatthias2@hotmail.com [Institute of Diagnostic and Interventional Radiology, Friedrich-Schiller-University Jena, Erlanger Allee 101, D-07740 Jena (Germany); Hopp, Torsten; Ruiter, Nicole [Karlsruhe Institute of Technology (KIT), Institute for Data Processing and Electronics, Postfach 3640, D-76021 Karlsruhe (Germany); Zoubi, Ramy [Institute of Diagnostic and Interventional Radiology, Friedrich-Schiller-University Jena, Erlanger Allee 101, D-07740 Jena (Germany); Runnebaum, Ingo B. [Clinic of Gynecology and Obstetrics, Friedrich-Schiller-University Jena, Bachstrasse 18, D-07743 Jena (Germany); Kaiser, Werner A. [Institute of Diagnostic and Interventional Radiology, Friedrich-Schiller-University Jena, Erlanger Allee 101, D-07740 Jena (Germany); Medical School, University of Harvard, 25 Shattuck Street, Boston, MA 02115 (United States); Baltzer, Pascal A.T. [Institute of Diagnostic and Interventional Radiology, Friedrich-Schiller-University Jena, Erlanger Allee 101, D-07740 Jena (Germany)

    2011-08-15

    Rationale and objectives: To evaluate the semi-automatic image registration accuracy of X-ray-mammography (XR-M) with high-resolution high-field (3.0 T) MR-mammography (MR-M) in an initial pilot study. Material and methods: MR-M was acquired on a high-field clinical scanner at 3.0 T (T1-weighted 3D VIBE ± Gd). XR-M was obtained with state-of-the-art full-field digital systems. Seven patients with clearly delineable mass lesions >10 mm both in XR-M and MR-M were enrolled (exclusion criteria: previous breast surgery; surgical intervention between XR-M and MR-M). XR-M and MR-M were matched using a dedicated image-registration algorithm allowing semi-automatic non-linear deformation of MR-M based on finite-element modeling. To identify registration errors (RE) a virtual craniocaudal 2D mammogram was calculated by the software from MR-M (with and w/o Gadodiamide/Gd) and matched with corresponding XR-M. To quantify REs the geometric center of the lesions in the virtual vs. conventional mammogram were subtracted. The robustness of registration was quantified by registration of X-MRs to both MR-Ms with and w/o Gadodiamide. Results: Image registration was performed successfully for all patients. Overall RE was 8.2 mm (1 min after Gd; confidence interval/CI: 2.0-14.4 mm, standard deviation/SD: 6.7 mm) vs. 8.9 mm (no Gd; CI: 4.0-13.9 mm, SD: 5.4 mm). The mean difference between pre- vs. post-contrast was 0.7 mm (SD: 1.9 mm). Conclusion: Image registration of high-field 3.0 T MR-mammography with X-ray-mammography is feasible. For this study applying a high-resolution protocol at 3.0 T, the registration was robust and the overall registration error was sufficient for clinical application.

  14. Fusion of dynamic contrast-enhanced magnetic resonance mammography at 3.0 T with X-ray mammograms: Pilot study evaluation using dedicated semi-automatic registration software

    International Nuclear Information System (INIS)

    Dietzel, Matthias; Hopp, Torsten; Ruiter, Nicole; Zoubi, Ramy; Runnebaum, Ingo B.; Kaiser, Werner A.; Baltzer, Pascal A.T.

    2011-01-01

    Rationale and objectives: To evaluate the semi-automatic image registration accuracy of X-ray-mammography (XR-M) with high-resolution high-field (3.0 T) MR-mammography (MR-M) in an initial pilot study. Material and methods: MR-M was acquired on a high-field clinical scanner at 3.0 T (T1-weighted 3D VIBE ± Gd). XR-M was obtained with state-of-the-art full-field digital systems. Seven patients with clearly delineable mass lesions >10 mm both in XR-M and MR-M were enrolled (exclusion criteria: previous breast surgery; surgical intervention between XR-M and MR-M). XR-M and MR-M were matched using a dedicated image-registration algorithm allowing semi-automatic non-linear deformation of MR-M based on finite-element modeling. To identify registration errors (RE) a virtual craniocaudal 2D mammogram was calculated by the software from MR-M (with and w/o Gadodiamide/Gd) and matched with corresponding XR-M. To quantify REs the geometric center of the lesions in the virtual vs. conventional mammogram were subtracted. The robustness of registration was quantified by registration of X-MRs to both MR-Ms with and w/o Gadodiamide. Results: Image registration was performed successfully for all patients. Overall RE was 8.2 mm (1 min after Gd; confidence interval/CI: 2.0-14.4 mm, standard deviation/SD: 6.7 mm) vs. 8.9 mm (no Gd; CI: 4.0-13.9 mm, SD: 5.4 mm). The mean difference between pre- vs. post-contrast was 0.7 mm (SD: 1.9 mm). Conclusion: Image registration of high-field 3.0 T MR-mammography with X-ray-mammography is feasible. For this study applying a high-resolution protocol at 3.0 T, the registration was robust and the overall registration error was sufficient for clinical application.

  15. A planning and analysis framework for evaluating distributed generation and utility strategies

    International Nuclear Information System (INIS)

    Ault, Graham W.

    2000-01-01

    The number of smaller-scale distributed power generation units connected to the distribution networks of electricity utilities in the UK and elsewhere has grown significantly in recent years. Numerous economic and political drivers have stimulated this growth and continue to provide the environment for future growth in distributed generation. The simple fact that distributed generation is independent of the distribution utility complicates planning and operational tasks for the distribution network. The uncertainty relating to the number, location and type of distributed generating units to be connected to the distribution network in the future makes distribution planning a particularly difficult activity. This thesis concerns the problem of distribution network and business planning in the era of distributed generation. A distributed generation strategic analysis framework is proposed to provide the analytical capability and the planning and decision-making framework needed to enable distribution utilities to deal effectively with the challenges and opportunities presented to them by distributed generation. The distributed generation strategic analysis framework is based on the best features of modern planning and decision-making methodologies and facilitates scenario-based analysis across many utility strategic options and uncertainties. Case studies are presented and assessed to clearly illustrate the potential benefits of such an approach to distributed generation planning in the UK electricity supply industry. (author)

  16. A new parameter enhancing breast cancer detection in computer-aided diagnosis of X-ray mammograms

    International Nuclear Information System (INIS)

    Tanki, Nobuyoshi; Murase, Kenya; Nagao, Michinobu

    2006-01-01

    The purpose of this study was to introduce a new parameter which enhances breast cancer detection using X-ray mammography. We used the database of X-ray mammograms generated by the Japan Society of Radiological Technology. The new parameter called 'quasi-fractal dimension (Q-FD)' was calculated from the relationship between the cutoff values for the maximum image intensity in the lesion set at 21 levels from 20% to 100% at equal intervals and the number of pixels with an intensity exceeding the cutoff value. In addition to Q-FD, the image features such as curvature (C) and eccentricity (E) were extracted. The conventional fractal dimension (C-FD) was also calculated using the box-counting method. We used artificial neural networks (ANNs) as a classification method. When using C, E, C-FD and age as inputs in ANNs and taking the number of neurons in the hidden layer as 50, we found the area under the receiver operating characteristic curve (A_Z) was 0.87±0.07 in the task differentiating between benign and malignant masses. When Q-FD was added to inputs in addition to the above parameters, the A_Z value was significantly improved to become 0.93±0.09. These results suggested that Q-FD is effective for discriminating between benign and malignant masses. (author)
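    The abstract specifies Q-FD only as far as thresholding the lesion at 21 equally spaced cutoffs (20% to 100% of the maximum intensity) and counting supra-threshold pixels. The following Python sketch illustrates that idea on a synthetic blob; summarising the cutoff/count relationship by a log-log slope is an assumption made here for illustration, not a detail taken from the paper.

```python
import numpy as np

def quasi_fractal_dimension(roi):
    """Q-FD-like measure: threshold the lesion ROI at 21 levels from 20% to
    100% of its maximum intensity, count pixels exceeding each cutoff, and
    summarise the cutoff/count relationship by a log-log slope (the slope-based
    summary is an illustrative assumption, not the paper's exact definition)."""
    roi = np.asarray(roi, dtype=float)
    cutoffs = np.linspace(0.2, 1.0, 21) * roi.max()
    counts = np.array([(roi > c).sum() for c in cutoffs])
    mask = counts > 0                  # drop cutoffs with no pixels (avoids log(0))
    slope, _ = np.polyfit(np.log(cutoffs[mask]), np.log(counts[mask]), 1)
    return -slope

# Synthetic "lesion": a smooth Gaussian blob
y, x = np.mgrid[0:64, 0:64]
blob = np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / 200.0)
print(quasi_fractal_dimension(blob))
```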

  17. Comparison Between Digital and Synthetic 2D Mammograms in Breast Density Interpretation.

    Science.gov (United States)

    Alshafeiy, Taghreed I; Wadih, Antoine; Nicholson, Brandi T; Rochman, Carrie M; Peppard, Heather R; Patrie, James T; Harvey, Jennifer A

    2017-07-01

    The purpose of this study was to compare assessments of breast density on synthetic 2D images as compared with digital 2D mammograms. This retrospective study included consecutive women undergoing screening with digital 2D mammography and tomosynthesis during May 2015 with a negative or benign outcome. In separate reading sessions, three radiologists with 5-25 years of clinical experience and 1 year of experience with synthetic 2D mammography read digital 2D and synthetic 2D images and assigned breast density categories according to the 5th edition of BI-RADS. Inter- and intrareader agreement was assessed for each BI-RADS density assessment and combined dense and nondense categories using percent agreement and Cohen kappa coefficient for consensus and all reads. A total of 309 patients met study inclusion criteria. Agreement between consensus BI-RADS density categories assigned for digital and synthetic 2D mammography was 80.3% (95% CI, 75.4-84.5%) with κ = 0.73 (95% CI, 0.66-0.79). For combined dense and nondense categories, agreement reached 91.9% (95% CI, 88.2-94.7%). For consensus readings, similar numbers of patients were shifted between nondense and dense categories (11 and 14, respectively) with the synthetic 2D compared with digital 2D mammography. Interreader differences were apparent; assignment to dense categories was greater with digital 2D mammography for reader 1 (odds ratio [OR], 1.26; p = 0.002), the same for reader 2 (OR, 0.91; p = 0.262), and greater with synthetic 2D mammography for reader 3 (OR, 0.86; p = 0.033). Overall, synthetic 2D mammography is comparable with digital 2D mammography in assessment of breast density, though there is some variability by reader. Practices can readily adopt synthetic 2D mammography without concern that it will affect density assessment and subsequent recommendations for supplemental screening.
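    For readers who want to reproduce agreement statistics of this kind, the sketch below computes percent agreement and Cohen's kappa with scikit-learn on hypothetical per-patient BI-RADS density reads; it does not use the study data.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

# Hypothetical BI-RADS density categories (a-d) assigned to the same patients
digital_2d   = np.array(["a", "b", "b", "c", "c", "d", "b", "c"])
synthetic_2d = np.array(["a", "b", "c", "c", "c", "d", "b", "b"])

percent_agreement = np.mean(digital_2d == synthetic_2d) * 100
kappa = cohen_kappa_score(digital_2d, synthetic_2d)
print(f"agreement: {percent_agreement:.1f}%, kappa: {kappa:.2f}")
```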

  18. Comparative analysis through probability distributions of a data set

    Science.gov (United States)

    Cristea, Gabriel; Constantinescu, Dan Mihai

    2018-02-01

    In practice, probability distributions are applied in such diverse fields as risk analysis, reliability engineering, chemical engineering, hydrology, image processing, physics, market research, business and economic research, customer support, medicine, sociology, demography, etc. This article highlights important aspects of fitting probability distributions to data and applying the analysis results to make informed decisions. There are a number of statistical methods available which can help us to select the best fitting model. Some of the graphs display both input data and fitted distributions at the same time, as probability density and cumulative distribution. The goodness-of-fit tests can be used to determine whether a certain distribution is a good fit. The main idea is to measure the "distance" between the data and the tested distribution, and compare that distance to some threshold values. Calculating the goodness-of-fit statistics also enables us to order the fitted distributions according to how well they fit the data. This particular feature is very helpful for comparing the fitted models. The paper presents a comparison of the most commonly used goodness-of-fit tests: Kolmogorov-Smirnov, Anderson-Darling, and Chi-Squared. A large set of data is analyzed and conclusions are drawn by visualizing the data, comparing multiple fitted distributions and selecting the best model. These graphs should be viewed as an addition to the goodness-of-fit tests.
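    As a minimal illustration of the tests named above, the sketch below fits several candidate distributions to a simulated sample with SciPy and compares Kolmogorov-Smirnov statistics; the Anderson-Darling test is shown for the normal case (SciPy supports it directly only for a few distributions), and a chi-squared test would additionally require binning the data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
data = rng.gamma(shape=2.0, scale=3.0, size=500)     # sample with an "unknown" distribution

candidates = {"gamma": stats.gamma, "lognorm": stats.lognorm, "norm": stats.norm}
for name, dist in candidates.items():
    params = dist.fit(data)                          # maximum-likelihood fit
    ks_stat, ks_p = stats.kstest(data, name, args=params)
    print(f"{name:8s} KS={ks_stat:.3f} (p={ks_p:.3f})")

print("Anderson-Darling (normal):", stats.anderson(data, dist="norm").statistic)
```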

  19. Distributed bearing fault diagnosis based on vibration analysis

    Science.gov (United States)

    Dolenc, Boštjan; Boškoski, Pavle; Juričić, Đani

    2016-01-01

    Distributed bearing faults appear under various circumstances, for example due to electroerosion or the progression of localized faults. Bearings with distributed faults tend to generate more complex vibration patterns than those with localized faults. Despite the frequent occurrence of such faults, their diagnosis has attracted limited attention. This paper examines a method for the diagnosis of distributed bearing faults employing vibration analysis. The vibrational patterns generated are modeled by incorporating the geometrical imperfections of the bearing components. Comparing envelope spectra of vibration signals shows that one can distinguish between localized and distributed faults. Furthermore, a diagnostic procedure for the detection of distributed faults is proposed. This is evaluated on several bearings with naturally born distributed faults, which are compared with fault-free bearings and bearings with localized faults. It is shown experimentally that features extracted from vibrations in fault-free, localized and distributed fault conditions form clearly separable clusters, thus enabling diagnosis.
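    A common implementation of the envelope-spectrum step mentioned above is Hilbert-transform demodulation; the hedged sketch below applies it to a synthetic amplitude-modulated vibration signal (all frequencies and amplitudes are illustrative, not taken from the paper).

```python
import numpy as np
from scipy.signal import hilbert

fs = 20_000                                           # sampling rate [Hz]
t = np.arange(0, 1.0, 1 / fs)
carrier = np.sin(2 * np.pi * 3_000 * t)               # structural resonance
modulation = 1 + 0.5 * np.sin(2 * np.pi * 120 * t)    # fault-related modulation
vibration = modulation * carrier + 0.1 * np.random.randn(t.size)

envelope = np.abs(hilbert(vibration))                 # demodulate
spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(envelope.size, 1 / fs)
print("dominant envelope frequency:", freqs[spectrum.argmax()], "Hz")
```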

  20. System analysis and planning of a gas distribution network

    Energy Technology Data Exchange (ETDEWEB)

    Salas, Edwin F.M.; Farias, Helio Monteiro [AUTOMIND, Rio de Janeiro, RJ (Brazil); Costa, Carla V.R. [Universidade Salvador (UNIFACS), BA (Brazil)

    2009-07-01

    The increase in demand by gas consumers requires that projects or improvements in gas distribution networks be made carefully and safely to ensure a continuous, efficient and economical supply. Gas distribution companies must ensure that the networks and equipment involved are defined and designed at the appropriate time to attend to the demands of the market. To do that, a gas distribution network analysis and planning tool should use distribution and transmission network models for the current situation and the future changes to be implemented. These models are used to evaluate project options and help in making appropriate decisions in order to minimize the capital investment in new components or simple changes in operational procedures. Gas demands are increasing and it is important that gas distributors design new distribution systems to accommodate this growth, considering the financial constraints of the company, as well as local legislation and regulation. In this study, some steps in developing a flexible system that attends to those needs are described. The analysis of distribution requires geographically referenced data for the models as well as accurate connectivity and equipment attributes. GIS systems are often used as a central repository that holds the majority of this information. GIS systems are constantly updated as distribution network equipment is modified. The distribution network model built from this system ensures that the model represents the current network condition. The benefits of this architecture drastically reduce the creation and maintenance cost of the network models, because network component data are conveniently made available to populate the network model. This architecture ensures that the models continually reflect the reality of the distribution network. (author)

  1. Distributed analysis in ATLAS using GANGA

    International Nuclear Information System (INIS)

    Elmsheuser, Johannes; Brochu, Frederic; Egede, Ulrik; Reece, Will; Williams, Michael; Gaidioz, Benjamin; Maier, Andrew; Moscicki, Jakub; Vanderster, Daniel; Lee, Hurng-Chun; Pajchel, Katarina; Samset, Bjorn; Slater, Mark; Soroko, Alexander; Cowan, Greig

    2010-01-01

    Distributed data analysis using Grid resources is one of the fundamental applications in high energy physics to be addressed and realized before the start of LHC data taking. The demands on resource management are very high. In every experiment up to a thousand physicists will be submitting analysis jobs to the Grid. Appropriate user interfaces and helper applications have to be made available to assure that all users can use the Grid without expertise in Grid technology. These tools enlarge the number of Grid users from a few production administrators to potentially all participating physicists. The GANGA job management system (http://cern.ch/ganga), developed as a common project between the ATLAS and LHCb experiments, provides and integrates these kinds of tools. GANGA provides a simple and consistent way of preparing, organizing and executing analysis tasks within the experiment analysis framework, implemented through a plug-in system. It allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid, hiding Grid technicalities. We will be reporting on the plug-ins and our experiences of distributed data analysis using GANGA within the ATLAS experiment. Support for all Grids presently used by ATLAS, namely the LCG/EGEE, NDGF/NorduGrid, and OSG/PanDA, is provided. The integration of, and interaction with, the ATLAS data management system DQ2 in GANGA is a key functionality. Intelligent job brokering is set up by using the job splitting mechanism together with data-set and file location knowledge. The brokering is aided by an automated system that regularly processes test analysis jobs at all ATLAS DQ2 supported sites. Large numbers of analysis jobs can be sent to the locations of data following the ATLAS computing model. Among other things, GANGA supports user analysis of reconstructed data and small-scale production of Monte Carlo data.

  2. Modelling Framework and the Quantitative Analysis of Distributed Energy Resources in Future Distribution Networks

    DEFF Research Database (Denmark)

    Han, Xue; Sandels, Claes; Zhu, Kun

    2013-01-01

    There has been a large body of statements claiming that the large-scale deployment of Distributed Energy Resources (DERs) could eventually reshape the future distribution grid operation in numerous ways. Thus, it is necessary to introduce a framework to measure to what extent the power system......, comprising distributed generation, active demand and electric vehicles. Subsequently, quantitative analysis was made on the basis of the current and envisioned DER deployment scenarios proposed for Sweden. Simulations are performed in two typical distribution network models for four seasons. The simulation...... results show that in general the DER deployment brings in the possibilities to reduce the power losses and voltage drops by compensating power from the local generation and optimizing the local load profiles....

  3. Retrospective analysis of 'gamma distribution' based IMRT QA criteria

    International Nuclear Information System (INIS)

    Wen, C.; Chappell, R.A.

    2010-01-01

    Full text: IMRT has been implemented in clinical practice at Royal Hobart Hospital (RHH) since mid-2006 for treating patients with Head and Neck (H and N) or prostate tumours. A local quality assurance (QA) acceptance criterion based on the 'gamma distribution' for approving IMRT plans was developed and implemented in early 2007. A retrospective analysis of this criterion over 194 clinical cases is presented. The RHH IMRT criterion was established with the assumption that the gamma distribution obtained through inter-comparison of 2D dose maps between planned and delivered dose was governed by a positive-half normal distribution. A commercial system, MapCheck, was used for 2D dose map comparison with a built-in gamma analysis tool. A gamma distribution histogram was generated and recorded for all cases. By retrospectively analysing those distributions using a curve-fitting technique, a statistical gamma distribution can be obtained and evaluated. This analytical result can be used for future IMRT planning and treatment delivery. The analyses indicate that the gamma distribution obtained through MapCheck lies well under the fitted normal distribution, particularly for prostate cases. The applied pass/fail criteria are not overly sensitive in identifying 'false fails' but can be further tightened for smaller fields, while for the larger fields found in both H and N and prostate cases the criteria were correctly applied. Non-uniform distribution of detectors in MapCheck and the experience level of planners are two major factors contributing to variation in the gamma distribution among clinical cases. This criterion, derived from clinical statistics, is superior to and more accurate than a single-valued criterion for the IMRT QA acceptance procedure. (author)
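    The curve-fitting idea described above can be sketched as follows: treat the per-detector gamma-index values from a 2D dose comparison as draws from a positive-half normal distribution and estimate its scale. The gamma values below are simulated, not clinical data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
gamma_values = np.abs(rng.normal(0.0, 0.35, size=1000))   # simulated gamma indices

loc, scale = stats.halfnorm.fit(gamma_values, floc=0.0)   # half-normal scale estimate
pass_rate = np.mean(gamma_values <= 1.0) * 100            # conventional gamma <= 1 criterion
print(f"fitted scale: {scale:.2f}, gamma<=1 pass rate: {pass_rate:.1f}%")
```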

  4. An integrated economic and distributional analysis of energy policies

    Energy Technology Data Exchange (ETDEWEB)

    Labandeira, Xavier [Facultade de CC. Economicas, University of Vigo, 36310 Vigo (Spain); Labeaga, Jose M. [Instituto de Estudios Fiscales, Avda. Cardenal Herrera Oria 378, 28035 Madrid (Spain); Rodriguez, Miguel [Facultade de CC. Empresariais e Turismo, University of Vigo, 32004 Ourense (Spain)

    2009-12-15

    Most public policies, particularly those in the energy sphere, have not only efficiency but also distributional effects. However, there is a trade-off between modelling approaches suitable for calculating those impacts on the economy. For the former most of the studies have been conducted with general equilibrium models, whereas partial equilibrium models represent the main approach for distributional analysis. This paper proposes a methodology to simultaneously carry out an analysis of the distributional and efficiency consequences of changes in energy taxation. In order to do so, we have integrated a microeconomic household demand model and a computable general equilibrium model for the Spanish economy. We illustrate the advantages of this approach by simulating a revenue-neutral reform in Spanish indirect taxation, with a large increase of energy taxes that serve an environmental purpose. The results show that the reforms bring about significant efficiency and distributional effects, in some cases counterintuitive, and demonstrate the academic and social utility of this approximation. (author)

  5. An integrated economic and distributional analysis of energy policies

    International Nuclear Information System (INIS)

    Labandeira, Xavier; Labeaga, Jose M.; Rodriguez, Miguel

    2009-01-01

    Most public policies, particularly those in the energy sphere, have not only efficiency but also distributional effects. However, there is a trade-off between modelling approaches suitable for calculating those impacts on the economy. For the former most of the studies have been conducted with general equilibrium models, whereas partial equilibrium models represent the main approach for distributional analysis. This paper proposes a methodology to simultaneously carry out an analysis of the distributional and efficiency consequences of changes in energy taxation. In order to do so, we have integrated a microeconomic household demand model and a computable general equilibrium model for the Spanish economy. We illustrate the advantages of this approach by simulating a revenue-neutral reform in Spanish indirect taxation, with a large increase of energy taxes that serve an environmental purpose. The results show that the reforms bring about significant efficiency and distributional effects, in some cases counterintuitive, and demonstrate the academic and social utility of this approximation. (author)

  6. Effect of the Availability of Prior Full-Field Digital Mammography and Digital Breast Tomosynthesis Images on the Interpretation of Mammograms

    Science.gov (United States)

    Catullo, Victor J.; Chough, Denise M.; Ganott, Marie A.; Kelly, Amy E.; Shinde, Dilip D.; Sumkin, Jules H.; Wallace, Luisa P.; Bandos, Andriy I.; Gur, David

    2015-01-01

    Purpose To assess the effect of and interaction between the availability of prior images and digital breast tomosynthesis (DBT) images in decisions to recall women during mammogram interpretation. Materials and Methods Verbal informed consent was obtained for this HIPAA-compliant institutional review board–approved protocol. Eight radiologists twice independently interpreted deidentified mammograms obtained in 153 women (age range, 37–83 years; mean age, 53.7 years ± 9.3 [standard deviation]) in a mode-by-reader-by-case balanced, fully crossed study. Each study consisted of current and prior full-field digital mammography (FFDM) images and DBT images that were acquired in our facility between June 2009 and January 2013. For one reading, sequential ratings were provided by using (a) current FFDM images only, (b) current FFDM and DBT images, and (c) current FFDM, DBT, and prior FFDM images. The other reading consisted of (a) current FFDM images only, (b) current and prior FFDM images, and (c) current FFDM, prior FFDM, and DBT images. Fifty verified cancer cases, 60 negative and benign cases (clinically not recalled), and 43 benign cases (clinically recalled) were included. Recall recommendations and the interaction between the effect of prior FFDM and DBT images were assessed by using a generalized linear model accounting for case and reader variability. Results Average recall rates in noncancer cases were significantly reduced with the addition of prior FFDM images by 34% (145 of 421) and 32% (106 of 333) without and with DBT images, respectively (P < .001). However, this recall reduction was achieved at the cost of a corresponding 7% (23 of 345) and 4% (14 of 353) reduction in sensitivity (P = .006). In contrast, availability of DBT images resulted in a smaller reduction in recall rates (false-positive interpretations) of 19% (76 of 409) and 26% (71 of 276) without and with prior FFDM images, respectively (P = .001). Availability of DBT images resulted in 4% (15 of

  7. First Experiences with LHC Grid Computing and Distributed Analysis

    CERN Document Server

    Fisk, Ian

    2010-01-01

    In this presentation the experiences of the LHC experiments using grid computing were presented with a focus on experience with distributed analysis. After many years of development, preparation, exercises, and validation the LHC (Large Hadron Collider) experiments are in operations. The computing infrastructure has been heavily utilized in the first 6 months of data collection. The general experience of exploiting the grid infrastructure for organized processing and preparation is described, as well as the successes employing the infrastructure for distributed analysis. At the end the expected evolution and future plans are outlined.

  8. GIS-based poverty and population distribution analysis in China

    Science.gov (United States)

    Cui, Jing; Wang, Yingjie; Yan, Hong

    2009-07-01

    Geographically, poverty status is not only related to socio-economic factors but also strongly affected by the geographical environment. In this paper, a GIS-based poverty and population distribution analysis method is introduced to reveal regional differences. More than 100,000 poor villages and 592 national key poor counties are chosen for the analysis. The results show that poverty distribution tends to concentrate in most of west China and the mountainous rural areas of mid China. Furthermore, the fifth census data are overlaid on those poor areas in order to capture the internal diversity of their socio-economic characteristics. By overlaying poverty-related socio-economic parameters such as sex ratio, illiteracy, education level, percentage of ethnic minorities and family composition, the findings show that poverty distribution is strongly correlated with high illiteracy rates, a high percentage of ethnic minorities, and larger families.

  9. Size distribution of magnetic iron oxide nanoparticles using Warren-Averbach XRD analysis

    Science.gov (United States)

    Mahadevan, S.; Behera, S. P.; Gnanaprakash, G.; Jayakumar, T.; Philip, J.; Rao, B. P. C.

    2012-07-01

    We use the Fourier transform based Warren-Averbach (WA) analysis to separate the contributions of X-ray diffraction (XRD) profile broadening due to crystallite size and microstrain for magnetic iron oxide nanoparticles. The profile shape of the column length distribution, obtained from WA analysis, is used to analyze the shape of the magnetic iron oxide nanoparticles. From the column length distribution, the crystallite size and its distribution are estimated for these nanoparticles which are compared with size distribution obtained from dynamic light scattering measurements. The crystallite size and size distribution of crystallites obtained from WA analysis are explained based on the experimental parameters employed in preparation of these magnetic iron oxide nanoparticles. The variation of volume weighted diameter (Dv, from WA analysis) with saturation magnetization (Ms) fits well to a core shell model wherein it is known that Ms=Mbulk(1-6g/Dv) with Mbulk as bulk magnetization of iron oxide and g as magnetic shell disorder thickness.
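    Rearranging the quoted core-shell relation Ms = Mbulk(1 - 6g/Dv) gives the disordered-shell thickness g directly; the numbers in the sketch below are illustrative, not the paper's measurements.

```python
def shell_thickness(Ms, Mbulk, Dv):
    """Return g from Ms = Mbulk * (1 - 6*g/Dv); units follow the inputs."""
    return (1.0 - Ms / Mbulk) * Dv / 6.0

# Illustrative values only: Ms = 60 emu/g, Mbulk = 90 emu/g, Dv = 10 nm
print(shell_thickness(Ms=60.0, Mbulk=90.0, Dv=10.0))   # ~0.56 nm shell
```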

  10. Mammographic feature enhancement by multiscale analysis

    International Nuclear Information System (INIS)

    Laine, A.F.; Schuler, S.; Fan, J.; Huda, W.

    1994-01-01

    This paper introduces a novel approach for accomplishing mammographic feature analysis by overcomplete multiresolution representations. The authors show that efficient representations may be identified within a continuum of scale-space and used to enhance features of importance to mammography. Methods of contrast enhancement are described based on three overcomplete multiscale representations: (1) the dyadic wavelet transform (separable), (2) the φ-transform (nonseparable, nonorthogonal), and (3) the hexagonal wavelet transform (nonseparable). Multiscale edges identified within distinct levels of transform space provide local support for image enhancement. Mammograms are reconstructed from wavelet coefficients modified at one or more levels by local and global nonlinear operators. In each case, edges and gain parameters are identified adaptively by a measure of energy within each level of scale-space. The authors show quantitatively that transform coefficients, modified by adaptive nonlinear operators, can make unseen or barely seen mammographic features more obvious without requiring additional radiation. The results are compared with traditional image enhancement techniques by measuring the local contrast of known mammographic features. The authors demonstrate that features extracted from multiresolution representations can provide an adaptive mechanism for accomplishing local contrast enhancement. By improving the visualization of breast pathology, they can improve the chances of early detection while requiring less time to evaluate mammograms for most patients.
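    The sketch below, using PyWavelets, shows multiscale contrast enhancement in the spirit of the dyadic-wavelet approach described above: decompose, amplify detail coefficients, reconstruct. The uniform gain is an illustrative stand-in for the authors' adaptive, energy-based gain selection.

```python
import numpy as np
import pywt

def enhance(image, wavelet="db2", levels=3, gain=2.0):
    coeffs = pywt.wavedec2(image, wavelet, level=levels)
    enhanced = [coeffs[0]]                        # keep the approximation band
    for detail in coeffs[1:]:                     # (cH, cV, cD) per level
        enhanced.append(tuple(gain * band for band in detail))
    out = pywt.waverec2(enhanced, wavelet)
    return out[:image.shape[0], :image.shape[1]]  # crop possible boundary padding

image = np.random.rand(256, 256)                  # stand-in for a mammogram ROI
print(enhance(image).shape)
```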

  11. Effect of dose reduction on the detection of mammographic lesions: A mathematical observer model analysis

    International Nuclear Information System (INIS)

    Chawla, Amarpreet S.; Samei, Ehsan; Saunders, Robert; Abbey, Craig; Delong, David

    2007-01-01

    The effect of reduction in the dose levels normally used in mammographic screening procedures on the detection of breast lesions was analyzed. Four types of breast lesions were simulated and inserted into clinically acquired digital mammograms. Dose reductions of 50% and 75% relative to the original clinically relevant exposure levels were simulated by adding corresponding simulated noise to the original mammograms. The mammograms were converted into luminance values corresponding to those displayed on a clinical soft-copy display station and subsequently analyzed by Laguerre-Gauss and Gabor channelized Hotelling observer models for differences in detectability performance with reduction in radiation dose. Performance was measured under a signal-known-exactly-but-variable detection task paradigm in terms of receiver operating characteristic (ROC) curves and the area under the ROC curves. The results suggested that luminance mapping of digital mammograms affects the performance of model observers. Reduction in dose levels by 50% lowered the detectability of masses with borderline statistical significance. Dose reduction did not have a statistically significant effect on the detection of microcalcifications. The model results indicate that there is room for optimization of the dose level in mammographic screening procedures.
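    The abstract does not spell out the noise model; one common way to emulate a lower-dose acquisition from an existing image is binomial thinning of (approximate) photon counts, sketched below purely as an illustration of the idea, not as the authors' exact method.

```python
import numpy as np

def simulate_reduced_dose(counts, fraction, rng=None):
    """Binomially thin photon counts to 'fraction' of the dose, then rescale so
    the mean signal level is preserved while the relative noise increases."""
    rng = rng or np.random.default_rng()
    thinned = rng.binomial(counts.astype(np.int64), fraction)
    return thinned / fraction

full_dose = np.random.default_rng(2).poisson(lam=5000, size=(128, 128))
half_dose = simulate_reduced_dose(full_dose, 0.5)
print(full_dose.std(), half_dose.std())           # noise rises as dose drops
```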

  12. Uncertainty analysis for secondary energy distributions

    International Nuclear Information System (INIS)

    Gerstl, S.A.W.

    1978-01-01

    In many transport calculations the integral design parameter of interest (response) is determined mainly by secondary particles such as gamma rays from (n,γ) reactions or secondary neutrons from inelastic scattering events or (n,2n) reactions. Standard sensitivity analysis usually allows one to calculate the sensitivities to the production cross sections of such secondaries, but an extended formalism is needed to also obtain the sensitivities to the energy distribution of the generated secondary particles. For a 30-group standard cross-section set, 84% of all non-zero table positions pertain to the description of secondary energy distributions (SED's) and only 16% to the actual reaction cross sections. Therefore, any sensitivity/uncertainty analysis which does not consider the effects of SED's is incomplete and neglects most of the input data. This paper describes how sensitivity profiles for SED's are obtained and used to estimate the uncertainty of an integral response due to uncertainties in these SED's. The detailed theory is documented elsewhere and implemented in the LASL sensitivity code SENSIT. SED sensitivity profiles have proven particularly valuable in cross-section uncertainty analyses for fusion reactors. Even when the production cross sections for secondary neutrons were assumed to be without error, the uncertainties in the energy distribution of these secondaries produced appreciable uncertainties in the calculated tritium breeding rate. However, complete error files for SED's are presently nonexistent. Therefore, methods will be described that allow rough error estimates due to estimated SED uncertainties based on integral SED sensitivities.

  13. Evaluation of information-theoretic similarity measures for content-based retrieval and detection of masses in mammograms

    International Nuclear Information System (INIS)

    Tourassi, Georgia D.; Harrawood, Brian; Singh, Swatee; Lo, Joseph Y.; Floyd, Carey E.

    2007-01-01

    The purpose of this study was to evaluate image similarity measures employed in an information-theoretic computer-assisted detection (IT-CAD) scheme. The scheme was developed for content-based retrieval and detection of masses in screening mammograms. The study is aimed toward an interactive clinical paradigm where physicians query the proposed IT-CAD scheme on mammographic locations that are either visually suspicious or indicated as suspicious by other cuing CAD systems. The IT-CAD scheme provides an evidence-based, second opinion for query mammographic locations using a knowledge database of mass and normal cases. In this study, eight entropy-based similarity measures were compared with respect to retrieval precision and detection accuracy using a database of 1820 mammographic regions of interest. The IT-CAD scheme was then validated on a separate database for false positive reduction of progressively more challenging visual cues generated by an existing, in-house mass detection system. The study showed that the image similarity measures fall into one of two categories; one category is better suited to the retrieval of semantically similar cases while the second is more effective with knowledge-based decisions regarding the presence of a true mass in the query location. In addition, the IT-CAD scheme yielded a substantial reduction in false-positive detections while maintaining high detection rate for malignant masses
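    As one concrete example of an entropy-based similarity measure of the kind evaluated above, the sketch below estimates the mutual information between two regions of interest from their joint grey-level histogram; the ROIs are random stand-ins, not mammographic data.

```python
import numpy as np

def mutual_information(roi_a, roi_b, bins=32):
    joint, _, _ = np.histogram2d(roi_a.ravel(), roi_b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0                                   # avoid log(0)
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(3)
a = rng.random((64, 64))
b = 0.7 * a + 0.3 * rng.random((64, 64))           # partially correlated ROI
print(mutual_information(a, b), mutual_information(a, rng.random((64, 64))))
```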

  14. Location and Size Planning of Distributed Photovoltaic Generation in Distribution network System Based on K-means Clustering Analysis

    Science.gov (United States)

    Lu, Siqi; Wang, Xiaorong; Wu, Junyong

    2018-01-01

    The paper presents a method to generate the planning scenarios, which is based on K-means clustering analysis algorithm driven by data, for the location and size planning of distributed photovoltaic (PV) units in the network. Taken the power losses of the network, the installation and maintenance costs of distributed PV, the profit of distributed PV and the voltage offset as objectives and the locations and sizes of distributed PV as decision variables, Pareto optimal front is obtained through the self-adaptive genetic algorithm (GA) and solutions are ranked by a method called technique for order preference by similarity to an ideal solution (TOPSIS). Finally, select the planning schemes at the top of the ranking list based on different planning emphasis after the analysis in detail. The proposed method is applied to a 10-kV distribution network in Gansu Province, China and the results are discussed.
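    A minimal sketch of the data-driven scenario-generation step, assuming hourly load and irradiance samples as inputs: K-means cluster centres act as representative planning scenarios, weighted by cluster size. The data below are synthetic, not the Gansu case study.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)
load = rng.normal(1.0, 0.2, size=8760)                      # p.u. feeder load, hourly
irradiance = np.clip(rng.normal(0.4, 0.3, size=8760), 0, 1)  # normalised PV irradiance
samples = np.column_stack([load, irradiance])

km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(samples)
weights = np.bincount(km.labels_) / len(samples)
for centre, w in zip(km.cluster_centers_, weights):
    print(f"scenario: load={centre[0]:.2f} p.u., irradiance={centre[1]:.2f}, weight={w:.2f}")
```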

  15. THE MAMMOGRAPHIC CALCIFICATIONS IN BREAST CANCER

    Institute of Scientific and Technical Information of China (English)

    Tang Ruiying; Liu Jingxian; Gaowen

    1998-01-01

    Objective: This study was performed to examine the relationship between mammographic calcifications and breast cancer. Methods: All of the 184 patients with breast diseases underwent mammography before either an open biopsy or a mastectomy. The presence, morphology, and distribution of calcifications visualized on mammograms for breast cancer were compared with the controls who remained cancer free. Statistical comparisons were made by using the χ2 test. Results: Of the 184 patients with breast diseases, 93 malignant and 91 benign lesions were histologically confirmed. Calcifications were visualized on mammograms in 60 (64%) of 93 breast cancers and 26 (28%) of 91 non-cancer cases. The estimated odds ratio (OR) of breast cancer was 4.5 in women with calcifications seen on mammograms, compared with those having none (P<0.01). Of the 60 breast carcinomas having mammographic calcifications, 28 (47%) were infiltrating ductal carcinomas. There were only 8 (24%) cases with infiltrating ductal cancers in the group without calcifications seen on the mammograms (P<0.05). Conclusion: Our finding suggests that mammographic calcification appears to be a risk factor for breast cancer. Granular and linear cast-type calcifications provide clues to the presence of breast cancer, especially when carcinomas are seen on mammograms without associated masses.
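    The reported odds ratio can be checked directly from the counts quoted above (calcifications in 60 of 93 cancers versus 26 of 91 benign cases); the sketch below recomputes it, together with a chi-squared test, using SciPy.

```python
from scipy.stats import chi2_contingency

table = [[60, 93 - 60],    # cancers: with / without calcifications
         [26, 91 - 26]]    # benign:  with / without calcifications

odds_ratio = (table[0][0] * table[1][1]) / (table[0][1] * table[1][0])
chi2, p, _, _ = chi2_contingency(table)
print(f"odds ratio ~ {odds_ratio:.1f}, chi2 = {chi2:.1f}, p = {p:.4f}")   # OR ~ 4.5
```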

  16. Fissure formation in coke. 3: Coke size distribution and statistical analysis

    Energy Technology Data Exchange (ETDEWEB)

    D.R. Jenkins; D.E. Shaw; M.R. Mahoney [CSIRO, North Ryde, NSW (Australia). Mathematical and Information Sciences

    2010-07-15

    A model of coke stabilization, based on a fundamental model of fissuring during carbonisation is used to demonstrate the applicability of the fissuring model to actual coke size distributions. The results indicate that the degree of stabilization is important in determining the size distribution. A modified form of the Weibull distribution is shown to provide a better representation of the whole coke size distribution compared to the Rosin-Rammler distribution, which is generally only fitted to the lump coke. A statistical analysis of a large number of experiments in a pilot scale coke oven shows reasonably good prediction of the coke mean size, based on parameters related to blend rank, amount of low rank coal, fluidity and ash. However, the prediction of measures of the spread of the size distribution is more problematic. The fissuring model, the size distribution representation and the statistical analysis together provide a comprehensive capability for understanding and predicting the mean size and distribution of coke lumps produced during carbonisation. 12 refs., 16 figs., 4 tabs.

  17. Distributed analysis environment for HEP and interdisciplinary applications

    CERN Document Server

    Moscicki, J T

    2003-01-01

    Huge data volumes of Large Hadron Collider experiments require parallel end-user analysis on clusters of hundreds of machines. While the focus of end-user High-Energy Physics analysis is on ntuples, the master-worker model of parallel processing may be also used in other contexts such as detector simulation. The aim of DIANE R&D project (http://cern.ch/it-proj-diane) currently held by CERN IT/API group is to create a generic, component-based framework for distributed, parallel data processing in master-worker model. Pre-compiled user analysis code is loaded dynamically at runtime in component libraries and called back when appropriate. Such application-oriented framework must be flexible enough to integrate with the emerging GRID technologies as they become available in the time to come. Therefore, common services such as environment reconstruction, code distribution, load balancing and authentication are designed and implemented as pluggable modules. This approach allows to easily replace them with modul...

  18. Textural Classification of Mammographic Parenchymal Patterns with the SONNET Selforganizing Neural Network

    Directory of Open Access Journals (Sweden)

    Daniel Howard

    2008-01-01

    Full Text Available In nationwide mammography screening, thousands of mammography examinations must be processed. Each consists of two standard views of each breast, and each mammogram must be visually examined by an experienced radiologist to assess it for any anomalies. The ability to detect an anomaly in mammographic texture is important to successful outcomes in mammography screening and, in this study, a large number of mammograms were digitized with a highly accurate scanner; and textural features were derived from the mammograms as input data to a SONNET selforganizing neural network. The paper discusses how SONNET was used to produce a taxonomic organization of the mammography archive in an unsupervised manner. This process is subject to certain choices of SONNET parameters, in these numerical experiments using the craniocaudal view, and typically produced O(10) mammogram classes (for example, 39), by analysis of features from O(10^3) mammogram images. The mammogram taxonomy captured typical subtleties to discriminate mammograms, and it is submitted that this may be exploited to aid the detection of mammographic anomalies, for example, by acting as a preprocessing stage to simplify the task for a computational detection scheme, or by ordering mammography examinations by mammogram taxonomic class prior to screening in order to encourage more successful visual examination during screening. The resulting taxonomy may help train screening radiologists and conceivably help to settle legal cases concerning a mammography screening examination because the taxonomy can reveal the frequency of mammographic patterns in a population.

  19. Objective Bayesian Analysis of Skew- t Distributions

    KAUST Repository

    BRANCO, MARCIA D'ELIA

    2012-02-27

    We study the Jeffreys prior and its properties for the shape parameter of univariate skew-t distributions with linear and nonlinear Student's t skewing functions. In both cases, we show that the resulting priors for the shape parameter are symmetric around zero and proper. Moreover, we propose a Student's t approximation of the Jeffreys prior that makes an objective Bayesian analysis easy to perform. We carry out a Monte Carlo simulation study that demonstrates an overall better behaviour of the maximum a posteriori estimator compared with the maximum likelihood estimator. We also compare the frequentist coverage of the credible intervals based on the Jeffreys prior and its approximation and show that they are similar. We further discuss location-scale models under scale mixtures of skew-normal distributions and show some conditions for the existence of the posterior distribution and its moments. Finally, we present three numerical examples to illustrate the implications of our results on inference for skew-t distributions. © 2012 Board of the Foundation of the Scandinavian Journal of Statistics.

  20. Weibull distribution in reliability data analysis in nuclear power plant

    International Nuclear Information System (INIS)

    Ma Yingfei; Zhang Zhijian; Zhang Min; Zheng Gangyang

    2015-01-01

    Reliability is an important issue affecting each stage of the life cycle, ranging from birth to death of a product or a system. Reliability engineering includes equipment failure data processing, quantitative assessment of system reliability and maintenance, etc. Reliability data refers to the variety of data that describe the reliability of a system or component during its operation. These data may be in the form of numbers, graphics, symbols, texts and curves. Quantitative reliability assessment is the task of reliability data analysis. It provides information related to preventing, detecting, and correcting defects in the reliability design. Reliability data analysis proceeds through the various stages of the product life cycle and reliability activities. Reliability data of Systems, Structures and Components (SSCs) in Nuclear Power Plants is a key factor in probabilistic safety assessment (PSA), reliability-centered maintenance and life cycle management. The Weibull distribution is widely used in reliability engineering, failure analysis and industrial engineering to represent manufacturing and delivery times. It is commonly used to model time to failure, time to repair and material strength. In this paper, an improved Weibull distribution is introduced to analyze the reliability data of the SSCs in Nuclear Power Plants. An example is given in the paper to present the result of the new method. The Weibull distribution fits the reliability data of mechanical equipment in nuclear power plants very well, and it is a widely used mathematical model for reliability analysis. The commonly used forms are the two-parameter and three-parameter Weibull distributions. Through comparison and analysis, the three-parameter Weibull distribution fits the data better: it reflects the reliability characteristics of the equipment and is closer to the actual situation. (author)
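    The two- versus three-parameter comparison discussed above can be sketched with SciPy's weibull_min on simulated time-to-failure data: fixing the location at zero gives the two-parameter model, leaving it free gives the three-parameter model. All numbers are illustrative, not plant data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
ttf = stats.weibull_min.rvs(c=1.8, loc=200.0, scale=1500.0, size=300, random_state=rng)

shape2, loc2, scale2 = stats.weibull_min.fit(ttf, floc=0.0)   # two-parameter fit (loc fixed at 0)
shape3, loc3, scale3 = stats.weibull_min.fit(ttf)             # three-parameter fit (loc free)
print(f"2-par: shape={shape2:.2f}, scale={scale2:.0f}")
print(f"3-par: shape={shape3:.2f}, loc={loc3:.0f}, scale={scale3:.0f}")
```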

  1. Analysis of Faraday Mirror in Auto-Compensating Quantum Key Distribution

    International Nuclear Information System (INIS)

    Wei Ke-Jin; Ma Hai-Qiang; Li Rui-Xue; Zhu Wu; Liu Hong-Wei; Zhang Yong; Jiao Rong-Zhen

    2015-01-01

    The ‘plug and play’ quantum key distribution system is the most stable and the earliest commercial system in the quantum communication field. The Jones matrix and Jones calculus are widely used in the analysis of this system and of the improved version, which is called the auto-compensating quantum key distribution system. Unfortunately, existing analysis has two drawbacks: only the auto-compensating process is analyzed, and existing analyses do not fully consider the laser phase affected by a Faraday mirror (FM). In this work, we present a detailed analysis, by Jones calculus, of the output of a light pulse transmitted in a plug and play quantum key distribution system that contains only an FM. A similar analysis is made of a home-made auto-compensating system which contains two FMs to compensate for environmental effects. More importantly, we show that theoretical and experimental results differ in the plug and play interferometric setup due to the fact that the conventional Jones matrix of an FM neglects an additional phase π on the alternative polarization direction. To resolve the above problem, we give a new Jones matrix for an FM according to the coordinate rotation. This new Jones matrix not only resolves the above contradiction in the plug and play interferometric setup, but is also suitable for the previous analyses of auto-compensating quantum key distribution. (paper)

  2. Assessing breast cancer masking risk in full field digital mammography with automated texture analysis

    DEFF Research Database (Denmark)

    Kallenberg, Michiel Gijsbertus J; Lillholm, Martin; Diao, Pengfei

    2015-01-01

    Purpose: The goal of this work is to develop a method to assess the risk of breast cancer masking, based on image characteristics beyond breast density. Method: From the Dutch breast cancer screening program we collected 285 screen detected cancers, and 109 cancers that were screen negative...... and subsequently appeared as interval cancers. To obtain mammograms without cancerous tissue, we took the contralateral mammograms. We developed a novel machine learning based method called convolutional sparse autoencoder to characterize mammographic texture. The method was trained and tested on raw mammograms...... to determine cancer detection status in a five-fold cross validation. To assess the interaction of the texture scores with breast density, Volpara Density Grade was determined for each image. Results: We grouped women into low (VDG 1/2) versus high (VDG 3/4) dense, and low (Quartile 1/2) versus high (Q 3...

  3. User-friendly Tool for Power Flow Analysis and Distributed Generation Optimisation in Radial Distribution Networks

    Directory of Open Access Journals (Sweden)

    M. F. Akorede

    2017-06-01

    Full Text Available The intent of power distribution companies (DISCOs is to deliver electric power to their customers in an efficient and reliable manner – with minimal energy loss cost. One major way to minimise power loss on a given power system is to install distributed generation (DG units on the distribution networks. However, to maximise benefits, it is highly crucial for a DISCO to ensure that these DG units are of optimal size and sited in the best locations on the network. This paper gives an overview of a software package developed in this study, called Power System Analysis and DG Optimisation Tool (PFADOT. The main purpose of the graphical user interface-based package is to guide a DISCO in finding the optimal size and location for DG placement in radial distribution networks. The package, which is also suitable for load flow analysis, employs the GUI feature of MATLAB. Three objective functions are formulated into a single optimisation problem and solved with fuzzy genetic algorithm to simultaneously obtain DG optimal size and location. The accuracy and reliability of the developed tool was validated using several radial test systems, and the results obtained are evaluated against the existing similar package cited in the literature, which are impressive and computationally efficient.

  4. Analysis of Statistical Distributions Used for Modeling Reliability and Failure Rate of Temperature Alarm Circuit

    International Nuclear Information System (INIS)

    EI-Shanshoury, G.I.

    2011-01-01

    Several statistical distributions are used to model various reliability and maintainability parameters. The applied distribution depends on the nature of the data being analyzed. This paper deals with the analysis of some statistical distributions used in reliability in order to determine the best-fitting distribution. The calculations rely on circuit quantity parameters obtained by using the Relex 2009 computer program. The statistical analysis of ten different distributions indicated that the Weibull distribution gives the best fit for modeling the reliability of the Temperature Alarm Circuit (TAC) data set. However, the Exponential distribution is found to be the best fit for modeling the failure rate.

  5. A Modeling Framework for Schedulability Analysis of Distributed Avionics Systems

    DEFF Research Database (Denmark)

    Han, Pujie; Zhai, Zhengjun; Nielsen, Brian

    2018-01-01

    This paper presents a modeling framework for schedulability analysis of distributed integrated modular avionics (DIMA) systems that consist of spatially distributed ARINC-653 modules connected by a unified AFDX network. We model a DIMA system as a set of stopwatch automata (SWA) in UPPAAL...

  6. Statistical Analysis Of Failure Strength Of Material Using Weibull Distribution

    International Nuclear Information System (INIS)

    Entin Hartini; Mike Susmikanti; Antonius Sitompul

    2008-01-01

    In evaluating the strength of ceramic and glass materials, a statistical approach is necessary. The strength of ceramic and glass depends on the measure and size distribution of flaws in these materials. The distribution of strength for a ductile material is narrow and close to a Gaussian distribution, while the strength of brittle materials such as ceramic and glass follows a Weibull distribution. The Weibull distribution is an indicator of the failure of material strength resulting from a distribution of flaw sizes. In this paper, the cumulative probability of material strength to failure, the cumulative probability of failure versus fracture stress and the cumulative probability of reliability of the material were calculated. Statistical criteria calculations supporting the strength analysis of silicon nitride material were done using MATLAB. (author)

  7. Thermographic Analysis of Stress Distribution in Welded Joints

    Directory of Open Access Journals (Sweden)

    Domazet Ž.

    2010-06-01

    Full Text Available The fatigue life prediction of welded joints based on S-N curves in conjunction with nominal stresses generally is not reliable. Stress distribution in the welded area affected by geometrical inhomogeneity, irregular welded surface and weld toe radius is quite complex, so the local (structural) stress concept is accepted in recent papers. The aim of this paper is to determine the stress distribution in plate type aluminum welded joints, to analyze the reliability of TSA (Thermal Stress Analysis) in this kind of investigations, and to obtain numerical values for stress concentration factors for practical use. Stress distribution in aluminum butt and fillet welded joints is determined by using the three different methods: strain gauges measurement, thermal stress analysis and FEM. Obtained results show good agreement - the TSA mutually confirmed the FEM model and stresses measured by strain gauges. According to obtained results, it may be stated that TSA, as a relatively new measurement technique, may in the future become a standard tool for the experimental investigation of stress concentration and fatigue in welded joints that can help to develop more accurate numerical tools for fatigue life prediction.

  8. Thermographic Analysis of Stress Distribution in Welded Joints

    Science.gov (United States)

    Piršić, T.; Krstulović Opara, L.; Domazet, Ž.

    2010-06-01

    The fatigue life prediction of welded joints based on S-N curves in conjunction with nominal stresses generally is not reliable. Stress distribution in welded area affected by geometrical inhomogeneity, irregular welded surface and weld toe radius is quite complex, so the local (structural) stress concept is accepted in recent papers. The aim of this paper is to determine the stress distribution in plate type aluminum welded joints, to analyze the reliability of TSA (Thermal Stress Analysis) in this kind of investigations, and to obtain numerical values for stress concentration factors for practical use. Stress distribution in aluminum butt and fillet welded joints is determined by using the three different methods: strain gauges measurement, thermal stress analysis and FEM. Obtained results show good agreement - the TSA mutually confirmed the FEM model and stresses measured by strain gauges. According to obtained results, it may be stated that TSA, as a relatively new measurement technique may in the future become a standard tool for the experimental investigation of stress concentration and fatigue in welded joints that can help to develop more accurate numerical tools for fatigue life prediction.

  9. Study of Solid State Drives performance in PROOF distributed analysis system

    Science.gov (United States)

    Panitkin, S. Y.; Ernst, M.; Petkus, R.; Rind, O.; Wenaus, T.

    2010-04-01

    The Solid State Drive (SSD) is a promising storage technology for High Energy Physics parallel analysis farms. Its combination of low random access time and relatively high read speed is very well suited for situations where multiple jobs concurrently access data located on the same drive. It also has lower energy consumption and higher vibration tolerance than the Hard Disk Drive (HDD), which makes it an attractive choice in many applications ranging from personal laptops to large analysis farms. The Parallel ROOT Facility - PROOF is a distributed analysis system which allows one to exploit the inherent event-level parallelism of high energy physics data. PROOF is especially efficient together with distributed local storage systems like Xrootd, when data are distributed over computing nodes. In such an architecture the local disk subsystem I/O performance becomes a critical factor, especially when computing nodes use multi-core CPUs. We will discuss our experience with SSDs in a PROOF environment. We will compare the performance of HDD with SSD in I/O intensive analysis scenarios. In particular we will discuss PROOF system performance scaling with the number of simultaneously running analysis jobs.

  10. Web Based Distributed Coastal Image Analysis System, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This project develops Web based distributed image analysis system processing the Moderate Resolution Imaging Spectroradiometer (MODIS) data to provide decision...

  11. Scaling analysis of meteorite shower mass distributions

    DEFF Research Database (Denmark)

    Oddershede, Lene; Meibom, A.; Bohr, Jakob

    1998-01-01

    Meteorite showers are the remains of extraterrestrial objects which are captured by the gravitational field of the Earth. We have analyzed the mass distribution of fragments from 16 meteorite showers for scaling. The distributions exhibit distinct scaling behavior over several orders of magnitude......; the observed scaling exponents vary from shower to shower. Half of the analyzed showers show a single scaling region while the other half show multiple scaling regimes. Such an analysis can provide knowledge about the fragmentation process and about the original meteoroid. We also suggest comparing...... the observed scaling exponents to exponents observed in laboratory experiments and discuss the possibility that one can derive insight into the original shapes of the meteoroids....
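    A scaling analysis of this kind is often performed on the cumulative mass distribution: plot the number of fragments heavier than m against m on log-log axes and read the exponent off the slope. The sketch below does this for synthetic, Pareto-like fragment masses; it illustrates the technique, not the authors' pipeline.

```python
import numpy as np

rng = np.random.default_rng(6)
masses = (1 - rng.random(500)) ** (-1 / 1.5)       # synthetic fragment masses, true exponent 1.5

m_sorted = np.sort(masses)
n_heavier = np.arange(len(m_sorted), 0, -1)        # N(>=m) for each observed mass
slope, _ = np.polyfit(np.log(m_sorted), np.log(n_heavier), 1)
print("estimated scaling exponent:", -slope)       # should come out near 1.5
```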

  12. Performance optimisations for distributed analysis in ALICE

    CERN Document Server

    Betev, L; Gheata, M; Grigoras, C; Hristov, P

    2014-01-01

    Performance is a critical issue in a production system accommodating hundreds of analysis users. Compared to a local session, distributed analysis is exposed to services and network latencies, remote data access and heterogeneous computing infrastructure, creating a more complex performance and efficiency optimization matrix. During the last 2 years, ALICE analysis shifted from a fast development phase to more mature and stable code. At the same time, the frameworks and tools for deployment, monitoring and management of large productions have evolved considerably too. The ALICE Grid production system is currently used by a fair share of organized and individual user analysis, consuming up to 30% of the available resources and ranging from fully I/O-bound analysis code to CPU-intensive correlations or resonances studies. While the intrinsic analysis performance is unlikely to improve by a large factor during the LHC long shutdown (LS1), the overall efficiency of the system has still to be improved by a...

  13. Parametric distribution approach for flow availability in small hydro potential analysis

    Science.gov (United States)

    Abdullah, Samizee; Basri, Mohd Juhari Mat; Jamaluddin, Zahrul Zamri; Azrulhisham, Engku Ahmad; Othman, Jamel

    2016-10-01

    The small hydro system is one of the important sources of renewable energy and has been recognized worldwide as a clean energy source. Small hydropower generation, which uses the potential energy in flowing water to produce electricity, is often questionable due to the inconsistent and intermittent power generated. Potential analysis of a small hydro system, which is mainly dependent on the availability of water, requires knowledge of the water flow or stream flow distribution. This paper presents the possibility of applying the Pearson system to approximate the stream flow availability distribution in a small hydro system. By considering the stochastic nature of stream flow, the Pearson parametric distribution approximation was computed based on the significant characteristic of the Pearson system: the direct relationship between the distribution and its first four statistical moments. The advantage of applying the statistical moments in small hydro potential analysis is the ability to analyze the varying shapes of the stream flow distribution.

  14. Distributed Analysis Experience using Ganga on an ATLAS Tier2 infrastructure

    International Nuclear Information System (INIS)

    Fassi, F.; Cabrera, S.; Vives, R.; Fernandez, A.; Gonzalez de la Hoz, S.; Sanchez, J.; March, L.; Salt, J.; Kaci, M.; Lamas, A.; Amoros, G.

    2007-01-01

    The ATLAS detector will explore the high-energy frontier of Particle Physics collecting the proton-proton collisions delivered by the LHC (Large Hadron Collider). Starting in spring 2008, the LHC will produce more than 10 petabytes of data per year. The tiered hierarchy adopted for the LHC computing model is: Tier-0 (CERN) and Tier-1 and Tier-2 centres distributed around the world. The ATLAS Distributed Analysis (DA) system has the goal of enabling physicists to perform Grid-based analysis on distributed data using distributed computing resources. The IFIC Tier-2 facility is participating in several aspects of DA. In support of the ATLAS DA activities a prototype is being tested, deployed and integrated. The analysis data processing applications are based on the Athena framework. GANGA, developed by the LHCb and ATLAS experiments, allows simple switching between testing on a local batch system and large-scale processing on the Grid, hiding Grid complexities. GANGA provides physicists with an integrated environment for job preparation, bookkeeping and archiving, and job splitting and merging. The experience with the deployment, configuration and operation of the DA prototype will be presented. Experience gained using the DA system and GANGA in top physics analysis will be described. (Author)

  15. HammerCloud: A Stress Testing System for Distributed Analysis

    International Nuclear Information System (INIS)

    Ster, Daniel C van der; García, Mario Úbeda; Paladin, Massimo; Elmsheuser, Johannes

    2011-01-01

    Distributed analysis of LHC data is an I/O-intensive activity which places large demands on the internal network, storage, and local disks at remote computing facilities. Commissioning and maintaining a site to provide an efficient distributed analysis service is therefore a challenge which can be aided by tools to help evaluate a variety of infrastructure designs and configurations. HammerCloud is one such tool; it is a stress testing service which is used by central operations teams, regional coordinators, and local site admins to (a) submit arbitrary number of analysis jobs to a number of sites, (b) maintain at a steady-state a predefined number of jobs running at the sites under test, (c) produce web-based reports summarizing the efficiency and performance of the sites under test, and (d) present a web-interface for historical test results to both evaluate progress and compare sites. HammerCloud was built around the distributed analysis framework Ganga, exploiting its API for grid job management. HammerCloud has been employed by the ATLAS experiment for continuous testing of many sites worldwide, and also during large scale computing challenges such as STEP'09 and UAT'09, where the scale of the tests exceeded 10,000 concurrently running and 1,000,000 total jobs over multi-day periods. In addition, HammerCloud is being adopted by the CMS experiment; the plugin structure of HammerCloud allows the execution of CMS jobs using their official tool (CRAB).

  16. Reliability of self-reported diagnostic radiation history in BRCA1/2 mutation carriers

    International Nuclear Information System (INIS)

    Pijpe, Anouk; Manders, Peggy; Mulder, Renee L.; Leeuwen, Flora E. van; Rookus, Matti A.

    2010-01-01

    We assessed the reliability of self-reported diagnostic radiation history in BRCA1/2 mutation carriers with and without breast cancer. Within the framework of the HEBON study, 401 BRCA1/2 mutation carriers completed a baseline (1999-2004) and a follow-up questionnaire (2006-2007). Test-retest reliability of self-reported exposure to chest X-rays, fluoroscopies and mammograms was assessed for the entire study population and by case status. The overall proportion agreement on reporting ever/never exposure was good (> 75%), while the corresponding kappa coefficients were between 0.40 and 0.75, indicating at least moderate reliability beyond chance. Reliability of the number of exposures was also good (> 75%). Proportion agreement on reported age at first mammogram was low (40%) for exact consistency and moderate (60%) for consistency ± 1 year. Reliability of age at first mammogram was higher for cases than for unaffected carriers (P < 0.001), but this difference disappeared when excluding diagnostic mammograms (P = 0.60). In unaffected carriers the proportion agreement on age at last mammogram was 50%. In general, the direction of disagreement on all items was equally distributed. More consistent reporting was mainly determined by a younger age at questionnaire completion. In conclusion, inconsistent self-report of diagnostic radiation by BRCA1/2 mutation carriers was mainly non-differential by disease status.
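
    The agreement statistics reported above (proportion agreement and Cohen's kappa) can be reproduced for a single item roughly as follows; the baseline and follow-up answers below are hypothetical.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical ever/never answers for one exposure item, baseline vs. follow-up.
baseline  = ["ever", "never", "ever", "ever", "never", "ever", "never", "ever"]
follow_up = ["ever", "never", "never", "ever", "never", "ever", "ever", "ever"]

# Proportion agreement: share of respondents giving the same answer twice.
agreement = sum(b == f for b, f in zip(baseline, follow_up)) / len(baseline)
# Cohen's kappa: agreement corrected for chance.
kappa = cohen_kappa_score(baseline, follow_up)

print(f"proportion agreement = {agreement:.2f}, kappa = {kappa:.2f}")
```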

  17. The cost of electricity distribution in Italy: a quantitative analysis

    International Nuclear Information System (INIS)

    Scarpa, C.

    1998-01-01

    This paper presents a quantitative analysis of the cost of medium- and low-tension electricity distribution in Italy. An econometric analysis of the cost function is proposed, on the basis of data on 147 zones of the dominant firm, ENEL. Data are available only for 1996, which restricted the study to a cross-sectional OLS analysis. The econometric estimate shows the existence of significant scale economies that the current organisational structure does not exploit. On this basis it is also possible to assess to what extent exogenous cost drivers affect costs. The role of the numerous exogenous factors considered appears, however, quite limited. The area of the distribution zone and an indicator of quality are the only elements that appear significant from an economic viewpoint.

  18. User-friendly Tool for Power Flow Analysis and Distributed ...

    African Journals Online (AJOL)

    Akorede

    AKOREDE et al: TOOL FOR POWER FLOW ANALYSIS AND DISTRIBUTED GENERATION OPTIMISATION. 23 ... greenhouse gas emissions and the current deregulation of electric energy ..... Visual composition and temporal behaviour of GUI.

  19. Grids to aid breast cancer diagnosis and research

    CERN Multimedia

    2005-01-01

    The Mammo Grid project is studying the commercial possibilities for its distributed computing environment that employs existing Grid technologies for the creation of a European database of mammogram data (1 page)

  20. Molecular Isotopic Distribution Analysis (MIDAs) with adjustable mass accuracy.

    Science.gov (United States)

    Alves, Gelio; Ogurtsov, Aleksey Y; Yu, Yi-Kuo

    2014-01-01

    In this paper, we present Molecular Isotopic Distribution Analysis (MIDAs), a new software tool designed to compute molecular isotopic distributions with adjustable accuracies. MIDAs offers two algorithms, one polynomial-based and one Fourier-transform-based, both of which compute molecular isotopic distributions accurately and efficiently. The polynomial-based algorithm contains few novel aspects, whereas the Fourier-transform-based algorithm consists mainly of improvements to other existing Fourier-transform-based algorithms. We have benchmarked the performance of the two algorithms implemented in MIDAs with that of eight software packages (BRAIN, Emass, Mercury, Mercury5, NeutronCluster, Qmass, JFC, IC) using a consensus set of benchmark molecules. Under the proposed evaluation criteria, MIDAs's algorithms, JFC, and Emass compute with comparable accuracy the coarse-grained (low-resolution) isotopic distributions and are more accurate than the other software packages. For fine-grained isotopic distributions, we compared IC, MIDAs's polynomial algorithm, and MIDAs's Fourier transform algorithm. Among the three, IC and MIDAs's polynomial algorithm compute isotopic distributions that better resemble their corresponding exact fine-grained (high-resolution) isotopic distributions. MIDAs can be accessed freely through a user-friendly web-interface at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/midas/index.html.
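
    As a minimal illustration of the polynomial (convolution) approach to coarse-grained isotopic distributions at unit mass resolution, the distribution of a molecule can be obtained by repeatedly convolving its elements' isotope-abundance vectors. The abundances below are rounded textbook values, not MIDAs's internal tables, and the example is a sketch rather than the tool's actual algorithm.

```python
import numpy as np

# index = nominal mass offset from the monoisotopic peak; values = natural abundances (approximate)
ISOTOPES = {
    "C": [0.9893, 0.0107],
    "H": [0.999885, 0.000115],
    "N": [0.99636, 0.00364],
    "O": [0.99757, 0.00038, 0.00205],
    "S": [0.9499, 0.0075, 0.0425, 0.0, 0.0001],
}

def isotopic_distribution(formula: dict, keep: int = 10) -> np.ndarray:
    """Coarse-grained isotopic distribution by repeated convolution, truncated to `keep` peaks."""
    dist = np.array([1.0])
    for element, count in formula.items():
        for _ in range(count):
            dist = np.convolve(dist, ISOTOPES[element])[:keep]   # drop the negligible high-mass tail
    return dist / dist.sum()

# Example: glycine, C2H5NO2
print(isotopic_distribution({"C": 2, "H": 5, "N": 1, "O": 2}))
```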

  1. Sensitivity Analysis of Dynamic Tariff Method for Congestion Management in Distribution Networks

    DEFF Research Database (Denmark)

    Huang, Shaojun; Wu, Qiuwei; Liu, Zhaoxi

    2015-01-01

    The dynamic tariff (DT) method is designed for the distribution system operator (DSO) to alleviate congestions that might occur in a distribution network with high penetration of distributed energy resources (DERs). Sensitivity analysis of the DT method is crucial because of its decentralized...... control manner. The sensitivity analysis can obtain the changes of the optimal energy planning, and thereby of the line loading profiles, over infinitely small changes of parameters by differentiating the KKT conditions of the convex quadratic program on which the DT method is formed. Three case...

  2. Mammography image quality and evidence based practice: Analysis of the demonstration of the inframammary angle in the digital setting.

    Science.gov (United States)

    Spuur, Kelly; Webb, Jodi; Poulos, Ann; Nielsen, Sharon; Robinson, Wayne

    2018-03-01

    The aim of this study is to determine the clinical rates of demonstration of the inframammary angle (IMA) on the mediolateral oblique (MLO) view of the breast on digital mammograms and to compare the outcomes with current accreditation standards for compliance. Relationships between the IMA, age, the posterior nipple line (PNL) and compressed breast thickness were identified, and the study outcomes were validated using appropriate analyses of inter-reader and inter-rater reliability and variability. Differences in left versus right data were also investigated. A quantitative retrospective study of 2270 randomly selected paired digital mammograms performed by BreastScreen NSW was undertaken. Data were collected by direct measurement and visual analysis. Intra-class correlation analyses were used to evaluate inter- and intra-rater reliability. The IMA was demonstrated on 52.4% of individual and 42.6% of paired mammograms. A linear relationship was found between the PNL and age; the PNL was predicted to increase by 0.48 mm for every one-year increment in age. The odds of demonstrating the IMA reduced by 2% for every one-year increase in age (p-value = 0.001), were 0.4% higher for every 1 mm increase in PNL (p-value = 0.001) and were 1.6% lower for every 1 mm increase in compressed breast thickness. Inter-rater reliability was assessed for the PNL, while there was 100% agreement for the demonstration of the IMA. Analysis of the demonstration of the IMA indicates clinically achievable rates (42.6%) well below those required for compliance (50%-75%) with known worldwide accreditation standards for screening mammography. These standards should be aligned with the reported evidence base. Visualisation of the IMA is impacted negatively by increasing age and compressed breast thickness but positively by breast size (PNL).

  3. Analysis of rainfall distribution in Kelantan river basin, Malaysia

    Science.gov (United States)

    Che Ros, Faizah; Tosaka, Hiroyuki

    2018-03-01

    Using rain gauge data alone as input carries great uncertainty in runoff estimation, especially when the area is large and rainfall is measured and recorded at irregularly spaced gauging stations. Spatial interpolation is therefore the key to obtaining a continuous and orderly rainfall distribution at ungauged points, which serves as input to the rainfall-runoff process in distributed and semi-distributed numerical modelling. It is crucial to study and predict the behaviour of rainfall and river runoff to reduce flood damage in the affected areas along the Kelantan river, so a good knowledge of the rainfall distribution is essential in early flood prediction studies. Forty-six rainfall stations and their daily time series were used to interpolate gridded rainfall surfaces using the inverse-distance weighting (IDW) and inverse-distance and elevation weighting (IDEW) methods, as well as the average rainfall distribution. A sensitivity analysis of the distance and elevation parameters was conducted to examine the variation produced. The accuracy of the interpolated datasets was examined using cross-validation assessment.
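
    For illustration, a minimal IDW interpolation of the kind referred to above can be written as follows; the station coordinates, rainfall depths and power parameter are hypothetical.

```python
import numpy as np

def idw(xy_stations: np.ndarray, values: np.ndarray, xy_target: np.ndarray, power: float = 2.0) -> float:
    """Inverse-distance-weighted rainfall estimate at an ungauged point."""
    d = np.linalg.norm(xy_stations - xy_target, axis=1)
    if np.any(d == 0):                 # target coincides with a station
        return float(values[d == 0][0])
    w = 1.0 / d**power                 # closer stations get larger weights
    return float(np.sum(w * values) / np.sum(w))

stations = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])  # station coordinates (km)
rain_mm = np.array([12.0, 20.0, 8.0, 15.0])                                # daily rainfall depths (mm)
print(idw(stations, rain_mm, np.array([3.0, 4.0])))
```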

  4. Silicon Bipolar Distributed Oscillator Design and Analysis | Aku ...

    African Journals Online (AJOL)

    The design and analysis of a high-frequency silicon bipolar oscillator using a common-emitter (CE) configuration with distributed output is carried out. The general condition for oscillation and the resulting analytical expressions for the frequency of oscillation were reviewed. Transmission line design was carried out using Butterworth LC ...

  5. On process capability and system availability analysis of the inverse Rayleigh distribution

    Directory of Open Access Journals (Sweden)

    Sajid Ali

    2015-04-01

    Full Text Available In this article, process capability and system availability analysis is discussed for the inverse Rayleigh lifetime distribution. A Bayesian approach with a conjugate gamma prior is adopted for the analysis. Different types of loss functions are considered to find Bayes estimates of the process capability and system availability. A simulation study is conducted to compare the different loss functions.
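
    Assuming the inverse Rayleigh density f(x; theta) = (2*theta/x^3) * exp(-theta/x^2), the likelihood is proportional to theta^n * exp(-theta * sum(1/x_i^2)), so a Gamma(a, b) prior on theta (rate parameterization) yields a Gamma(a + n, b + sum(1/x_i^2)) posterior. The sketch below carries out this update with hypothetical data and hyperparameters; it illustrates the conjugacy rather than reproducing the paper's analysis.

```python
import numpy as np

lifetimes = np.array([1.2, 0.8, 1.5, 2.1, 0.9, 1.7])  # hypothetical failure times
a, b = 2.0, 1.0                                        # hypothetical gamma prior hyperparameters (shape, rate)

n = lifetimes.size
t = np.sum(1.0 / lifetimes**2)                         # sufficient statistic for the inverse Rayleigh

a_post, b_post = a + n, b + t                          # conjugate update
theta_hat = a_post / b_post                            # Bayes estimate under squared-error loss (posterior mean)

print(f"posterior: Gamma({a_post:.1f}, {b_post:.3f}); posterior mean of theta = {theta_hat:.3f}")
```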

  6. AND LANDSCAPE-ECOLOGICAL ANALYSIS OF ITS DISTRIBUTION

    OpenAIRE

    S. M. Musaeva

    2012-01-01

    The article is devoted to the study of the helminth fauna of the striped lizard in the Lankaran natural region. A landscape-ecological analysis of the distribution of the helminth fauna is provided. As a result of studies of 99 individuals of the striped lizard, a total of 14 helminth species were found, including 1 trematode species, 1 cestode species, 3 acanthocephalan species and 9 nematode species.

  7. AND LANDSCAPE-ECOLOGICAL ANALYSIS OF ITS DISTRIBUTION

    Directory of Open Access Journals (Sweden)

    S. M. Musaeva

    2012-01-01

    Full Text Available The article is devoted to the study of the helminth fauna of the striped lizard in the Lankaran natural region. A landscape-ecological analysis of the distribution of the helminth fauna is provided. As a result of studies of 99 individuals of the striped lizard, a total of 14 helminth species were found, including 1 trematode species, 1 cestode species, 3 acanthocephalan species and 9 nematode species.

  8. Breast ultrasonographic and histopathological characteristics without any mammographic abnormalities

    International Nuclear Information System (INIS)

    Tamaki, Kentaro; Kamada, Yoshihiko; Uehara, Kano; Tamaki, Nobumitsu; Ishida, Takanori; Miyashita, Minoru; Amari, Masakazu; Ohuchi, Noriaki; Sasano, Hironobu

    2012-01-01

    We evaluated ultrasonographic findings and the corresponding histopathological characteristics of breast cancer patients with Breast Imaging Reporting and Data System (BI-RADS) category 1 mammogram. We retrospectively reviewed the ultrasonographic findings and the corresponding histopathological features of 45 breast cancer patients with BI-RADS category 1 mammogram and 537 controls with mammographic abnormalities. We evaluated the ultrasonographic findings including mass shape, periphery, internal and posterior echo pattern, interruption of mammary borders and the distribution of low-echoic lesions, and the corresponding histopathological characteristics including histological classification, hormone receptor and human epidermal growth factor receptor 2 status of invasive ductal carcinoma and ductal carcinoma in situ, histological grade, mitotic counts and lymphovascular invasion in individual cases of BI-RADS category 1 mammograms and compared with those of the control group. The ultrasonographic characteristics of the BI-RADS category 1 group were characterized by a higher ratio of round shape (P<0.001), non-spiculated periphery (P=0.021), non-interruption of mammary borders (P<0.001) and non-attenuation (P=0.011) compared with the control group. A total of 52.6% of low-echoic lesions were associated with spotted distribution in the BI-RADS 1 group, whereas 25.8% of low-echoic lesions were associated with spotted distribution in the control group (P=0.012). As for histopathological characteristics, there was a statistically higher ratio of triple-negative subtype (P=0.021), and this particular tendency was detected in histological grade 3 in the BI-RADS category 1 group (P=0.094). We evaluated ultrasonographic findings and the corresponding histopathological characteristics for BI-RADS category 1 mammograms and noted significant differences among these findings in this study. Evaluation of these ultrasonographic and histopathological characteristics may provide

  9. Residual stress distribution analysis of heat treated APS TBC using image based modelling.

    Science.gov (United States)

    Li, Chun; Zhang, Xun; Chen, Ying; Carr, James; Jacques, Simon; Behnsen, Julia; di Michiel, Marco; Xiao, Ping; Cernik, Robert

    2017-08-01

    We carried out a residual stress distribution analysis in an APS TBC throughout the depth of the coatings. The samples were heat treated at 1150 °C for 190 h, and the data analysis used image-based modelling built on real 3D images measured by computed tomography (CT). The stress distribution in several 2D slices from the 3D model is included in this paper, as well as the stress distribution along several paths shown on the slices. Our analysis can explain the occurrence of the "jump" features near the interface between the top coat and the bond coat. These features in the residual stress distribution trend were measured (as a function of depth) by high-energy synchrotron XRD (as shown in our related research article entitled 'Understanding the Residual Stress Distribution through the Thickness of Atmosphere Plasma Sprayed (APS) Thermal Barrier Coatings (TBCs) by high energy Synchrotron XRD; Digital Image Correlation (DIC) and Image Based Modelling') (Li et al., 2017) [1].

  10. Analysis and Comparison of Typical Models within Distribution Network Design

    DEFF Research Database (Denmark)

    Jørgensen, Hans Jacob; Larsen, Allan; Madsen, Oli B.G.

    Efficient and cost effective transportation and logistics plays a vital role in the supply chains of the modern world’s manufacturers. Global distribution of goods is a very complicated matter as it involves many different distinct planning problems. The focus of this presentation is to demonstrate...... a number of important issues which have been identified when addressing the Distribution Network Design problem from a modelling angle. More specifically, we present an analysis of the research which has been performed in utilizing operational research in developing and optimising distribution systems....

  11. An improved algorithm for connectivity analysis of distribution networks

    International Nuclear Information System (INIS)

    Kansal, M.L.; Devi, Sunita

    2007-01-01

    In the present paper, an efficient algorithm for connectivity analysis of moderately sized distribution networks has been suggested. The algorithm is based on the generation of all possible minimal system cutsets. It is efficient because it identifies only the necessary and sufficient system failure conditions in n-out-of-n type distribution networks. The proposed algorithm is demonstrated with the help of saturated and unsaturated distribution networks. The computational efficiency of the algorithm is justified by comparing the computational effort with that of the previously suggested appended spanning tree (AST) algorithm. The proposed technique has the added advantage that it can be utilized to generate system inequalities, which is useful in reliability estimation of capacitated networks.

  12. Risk analysis for a local gas distribution network

    International Nuclear Information System (INIS)

    Peters, J.W.

    1991-01-01

    Cost control and service reliability are popular topics when discussing strategic issues facing local distribution companies (LDCs) in the 1990s. The ability to provide secure and uninterrupted gas service is crucial for growth and company image, both with the public and with regulatory agencies. At the same time, the industry is facing unprecedented competition from alternate fuels, and cost control is essential for maintaining a competitive edge in the market. On the surface, it would appear that cost control and service reliability are contradictory terms. Improvement in service reliability should cost something, or does it? Risk analysis can provide the answer from a distribution design perspective. From a gas distribution engineer's perspective, projects such as loops, backfeeds and even valve placement are designed to reduce, minimize and/or eliminate potential customer outages. These projects improve service reliability by acting as backups should a failure occur on a component of the distribution network. These contingency projects are cost-effective, but their long-term benefit or true value is under question. Their purpose is to maintain supply to an area of the distribution network in the event of a failure somewhere else. Two phrases, potential customer outages and in the event of failure, identify uncertainty

  13. Spectral analysis of full field digital mammography data

    International Nuclear Information System (INIS)

    Heine, John J.; Velthuizen, Robert P.

    2002-01-01

    The spectral content of mammograms acquired with a full field digital mammography (FFDM) system is analyzed. Fourier methods are used to show that the FFDM image power spectra obey an inverse power law; in an average sense, the images may be considered as 1/f fields. Two data representations are analyzed and compared: (1) the raw data, and (2) the logarithm of the raw data. Two methods are employed to analyze the power spectra: (1) a technique based on integrating the Fourier plane with octave ring sectioning, developed previously, and (2) an approach based on integrating the Fourier plane using rings of constant width, developed for this work. Both methods allow theoretical modeling. Numerical analysis indicates that the effects due to the transformation influence the power spectra measurements in a statistically significant manner in the high frequency range. However, this effect has little influence on the inverse power law estimation for a given image, regardless of the data representation or the theoretical analysis approach. The analysis is presented from two points of view: (1) each image is treated independently, with the results presented as distributions, and (2) for a given representation, the entire image collection is treated as an ensemble, with the results presented as expected values. In general, the constant ring width analysis forms the foundation of a spectral comparison method for finding spectral differences, in an image distribution sense, after applying a nonlinear transformation to the data. The work also shows that power law estimation may be influenced by the presence of noise in the higher frequency range, which is consistent with the known attributes of the detector efficiency. The spectral modeling and inverse power law determinations obtained here are in agreement with those obtained from the analysis of digitized film-screen images presented previously. The form of the power spectrum for a given image is approximately 1/f^2.
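
    A rough sketch of the constant-ring-width idea: compute the 2-D power spectrum, average it over rings of constant radial width, and fit an inverse power law P(f) ~ 1/f^beta in log-log space. The test image below is synthetic white noise (expected beta close to 0) standing in for FFDM data; the binning and fitting details are assumptions of this example.

```python
import numpy as np

rng = np.random.default_rng(1)
img = rng.normal(size=(512, 512))                     # placeholder for a mammogram region

F = np.fft.fftshift(np.fft.fft2(img))
power = np.abs(F) ** 2

ny, nx = img.shape
y, x = np.indices((ny, nx))
r = np.hypot(x - nx // 2, y - ny // 2).astype(int)    # radial frequency index (constant ring width = 1 bin)

# ring-averaged (radially averaged) power spectrum
radial_power = np.bincount(r.ravel(), weights=power.ravel()) / np.bincount(r.ravel())

f = np.arange(1, min(nx, ny) // 2)                    # skip DC, stay inside the Nyquist ring
slope, intercept = np.polyfit(np.log(f), np.log(radial_power[f]), 1)
print(f"estimated spectral exponent beta = {-slope:.2f}")  # ~0 for white noise, ~2 for 1/f^2 fields
```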

  14. A Script Analysis of the Distribution of Counterfeit Alcohol Across Two European Jurisdictions

    OpenAIRE

    Lord, Nicholas; Spencer, Jonathan; Bellotti, Elisa; Benson, Katie

    2017-01-01

    This article presents a script analysis of the distribution of counterfeit alcohols across two European jurisdictions. Based on an analysis of case file data from a European regulator and interviews with investigators, the article deconstructs the organisation of the distribution of the alcohol across jurisdictions into five scenes (collection, logistics, delivery, disposal, proceeds/finance) and analyses the actual (or likely permutations of) behaviours within each scene. The analysis also i...

  15. Distributed analysis environment for HEP and interdisciplinary applications

    International Nuclear Information System (INIS)

    Moscicki, J.T.

    2003-01-01

    The huge data volumes of the Large Hadron Collider experiments require parallel end-user analysis on clusters of hundreds of machines. While the focus of end-user High-Energy Physics analysis is on ntuples, the master-worker model of parallel processing may also be used in other contexts such as detector simulation. The aim of the DIANE R and D project (http://cern.ch/it-proj-diane), currently held by the CERN IT/API group, is to create a generic, component-based framework for distributed, parallel data processing in the master-worker model. Pre-compiled user analysis code is loaded dynamically at runtime into component libraries and called back when appropriate. Such an application-oriented framework must be flexible enough to integrate with the emerging Grid technologies as they become available. Therefore, common services such as environment reconstruction, code distribution, load balancing and authentication are designed and implemented as pluggable modules. This approach allows them to be easily replaced with modules implemented with newer technology as necessary. The paper gives an overview of the DIANE architecture and explains the main design choices. Selected examples of diverse applications from a variety of domains applicable to DIANE are presented, as well as preliminary benchmarking results.

  16. Location Analysis of Freight Distribution Terminal of Jakarta City, Indonesia

    Directory of Open Access Journals (Sweden)

    Nahry Nahry

    2016-03-01

    Full Text Available Currently Jakarta has two freight terminals, namely Pulo Gebang and Tanah Merdeka. However, both terminals function only as parking areas and have not yet been utilized properly, e.g. for consolidation. Goods consolidation, which is usually performed in a distribution terminal, may reduce the amount of freight flow within the city. This paper aims to determine the best location for a distribution terminal in Jakarta among those two terminals and two additional alternative sites, namely Lodan and Rawa Buaya. The study starts with the identification of the important factors that affect location selection, carried out by Likert analysis of questionnaires distributed to logistics firms. The best location is determined by applying Overlay Analysis using ArcGIS 9.2. Four grid maps are produced to represent accessibility, cost, time, and environment as the important location factors. The result shows that the ranking, from best to worst, is Lodan, Tanah Merdeka, Pulo Gebang, and Rawa Buaya.

  17. Radiographic information theory: correction for x-ray spectral distribution

    International Nuclear Information System (INIS)

    Brodie, I.; Gutcheck, R.A.

    1983-01-01

    A more complete computational method is developed to account for the effect of the spectral distribution of the incident x-ray fluence on the minimum exposure required to record a specified information set in a diagnostic radiograph. It is shown that an earlier, less rigorous, but simpler computational technique does not introduce serious errors provided that both a good estimate of the mean energy per photon can be made and the detector does not contain an absorption edge in the spectral range. Also shown is that to a first approximation, it is immaterial whether the detecting surface counts the number of photons incident from each pixel or measures the energy incident on each pixel. A previous result is confirmed that, for mammography, the present methods of processing data from the detector utilize only a few percent of the incident information, suggesting that techniques can be developed for obtaining mammograms at substantially lower doses than those presently used. When used with film-screen combinations, x-ray tubes with tungsten anodes should require substantially lower exposures than devices using molybdenum anodes, when both are operated at their optimal voltage

  18. Distributed resistance model for the analysis of wire-wrapped rod bundles

    International Nuclear Information System (INIS)

    Ha, K. S.; Jung, H. Y.; Kwon, Y. M.; Jang, W. P.; Lee, Y. B.

    2003-01-01

    A partial flow blockage within a fuel assembly in a liquid metal reactor (LMR) may result in localized boiling or a failure of the fuel cladding. Thus, a precise analysis of this phenomenon is required for a safe LMR design. The MATRA-LMR code developed by KAERI models the flow distribution in an assembly using the wire forcing function to account for the effects of wire-wrap spacers, which is important for the analysis of flow blockage. However, the wire forcing function is not capable of handling the case in which a flow blockage occurs. This model was therefore replaced with the distributed resistance model, and a validation calculation was carried out against the FFM 2A experiment.

  19. Reliability Worth Analysis of Distribution Systems Using Cascade Correlation Neural Networks

    DEFF Research Database (Denmark)

    Heidari, Alireza; Agelidis, Vassilios; Pou, Josep

    2018-01-01

    Reliability worth analysis is of great importance in the area of distribution network planning and operation. The reliability worth's precision can be affected greatly by the customer interruption cost model used. The choice of the cost models can change system and load point reliability indices....... In this study, a cascade correlation neural network is adopted to further develop two cost models comprising a probabilistic distribution model and an average or aggregate model. A contingency-based analytical technique is adopted to conduct the reliability worth analysis. Furthermore, the possible effects...

  20. Precision Statistical Analysis of Images Based on Brightness Distribution

    Directory of Open Access Journals (Sweden)

    Muzhir Shaban Al-Ani

    2017-07-01

    Full Text Available Studying the content of images is an important topic through which reasonable and accurate image analysis can be achieved. Image analysis has recently become a vital field because of the huge number of images transferred via transmission media in our daily life; these media, crowded with images, have brought image analysis to the fore as a research area. In this paper, the implemented system passes through several steps to compute the statistical measures of standard deviation and mean for both color and grey images, and the final step of the proposed method compares the results obtained in the different cases of the test phase. The statistical parameters are implemented to characterize the content of an image and its texture. Standard deviation, mean and correlation values are used to study the intensity distribution of the tested images. Reasonable results are obtained for both standard deviation and mean values through the implementation of the system. The major issue addressed in this work is the brightness distribution, studied via statistical measures applied under different types of lighting.
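
    The statistics named above (mean, standard deviation, channel correlation) amount to a few lines of NumPy; the sketch below uses a synthetic image in place of the paper's test images.

```python
import numpy as np

rng = np.random.default_rng(42)
rgb = rng.integers(0, 256, size=(64, 64, 3)).astype(float)   # placeholder colour image
grey = rgb.mean(axis=2)                                      # simple grey-level conversion

print("grey mean/std:", grey.mean(), grey.std())
for i, name in enumerate("RGB"):
    print(f"{name} mean/std:", rgb[..., i].mean(), rgb[..., i].std())

# correlation between the red and green channel intensities
corr_rg = np.corrcoef(rgb[..., 0].ravel(), rgb[..., 1].ravel())[0, 1]
print("R-G correlation:", corr_rg)
```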

  1. Energy system analysis of fuel cells and distributed generation

    DEFF Research Database (Denmark)

    Mathiesen, Brian Vad; Lund, Henrik

    2007-01-01

    This chapter introduces Energy System Analysis methodologies and tools, which can be used for identifying the best application of different Fuel Cell (FC) technologies to different regional or national energy systems. The main point is that the benefits of using FC technologies indeed depend...... on the energy system in which they are used. Consequently, coherent energy systems analyses of specific and complete energy systems must be conducted in order to evaluate the benefits of FC technologies and in order to be able to compare alternative solutions. In relation to distributed generation, FC...... technologies are very often connected to the use of hydrogen, which has to be provided e.g. from electrolysers. Decentralised and distributed generation has the possibility of improving the overall energy efficiency and flexibility of energy systems. Therefore, energy system analysis tools and methodologies...

  2. Making distributed ALICE analysis simple using the GRID plug-in

    International Nuclear Information System (INIS)

    Gheata, A; Gheata, M

    2012-01-01

    We have developed an interface within the ALICE analysis framework that allows transparent usage of the experiment's distributed resources. This analysis plug-in makes it possible to configure back-end-specific parameters from a single interface and to run the same custom user analysis, with no change, in many computing environments, from local workstations to PROOF clusters or GRID resources. The tool is now used extensively in the ALICE collaboration for both end-user analysis and large-scale productions.

  3. Distribution System Reliability Analysis for Smart Grid Applications

    Science.gov (United States)

    Aljohani, Tawfiq Masad

    Reliability of power systems is a key aspect in modern power system planning, design, and operation. The ascendance of the smart grid concept has provided high hopes of developing an intelligent network that is capable of being a self-healing grid, offering the ability to overcome the interruption problems that face the utility and cost it tens of millions in repair and loss. To address these reliability concerns, the power utilities and interested parties have spent an extensive amount of time and effort analyzing and studying the reliability of the generation and transmission sectors of the power grid. Only recently has attention shifted to improving the reliability of the distribution network, the connection joint between the power providers and the consumers where most electricity problems occur. In this work, we examine the effect of smart grid applications in improving the reliability of power distribution networks. The test system used in this thesis is the IEEE 34 node test feeder, released in 2003 by the Distribution System Analysis Subcommittee of the IEEE Power Engineering Society. The objective is to analyze the feeder for the optimal placement of automatic switching devices and to quantify their proper installation based on the performance of the distribution system. The measures are the changes in the system reliability indices, including SAIDI, SAIFI, and EUE. The goal is to design and simulate the effect of the installation of Distributed Generators (DGs) on the utility's distribution system and to measure the potential improvement of its reliability. The software used in this work is DISREL, an intelligent power distribution software package developed by General Reliability Co.
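
    The reliability indices mentioned above follow directly from outage records; a minimal sketch, with a hypothetical feeder and event list, is shown below.

```python
# Hypothetical feeder serving 1000 customers, with three interruption events in one year.
customers_served = 1000
events = [
    # (customers interrupted, outage duration in hours, average unserved load in kW)
    (120, 2.0, 180.0),
    (300, 0.5, 450.0),
    (60, 4.0, 90.0),
]

saifi = sum(n for n, _, _ in events) / customers_served      # interruptions per customer per year
saidi = sum(n * d for n, d, _ in events) / customers_served  # interruption hours per customer per year
eue = sum(d * p for _, d, p in events)                       # expected unserved energy (kWh)

print(f"SAIFI = {saifi:.3f}, SAIDI = {saidi:.3f} h, EUE = {eue:.1f} kWh")
```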

  4. Comparing Distributions of Environmental Outcomes for Regulatory Environmental Justice Analysis

    Directory of Open Access Journals (Sweden)

    Glenn Sheriff

    2011-05-01

    Full Text Available Economists have long been interested in measuring distributional impacts of policy interventions. As environmental justice (EJ emerged as an ethical issue in the 1970s, the academic literature has provided statistical analyses of the incidence and causes of various environmental outcomes as they relate to race, income, and other demographic variables. In the context of regulatory impacts, however, there is a lack of consensus regarding what information is relevant for EJ analysis, and how best to present it. This paper helps frame the discussion by suggesting a set of questions fundamental to regulatory EJ analysis, reviewing past approaches to quantifying distributional equity, and discussing the potential for adapting existing tools to the regulatory context.

  5. Analysis and Comparison of Typical Models within Distribution Network Design

    DEFF Research Database (Denmark)

    Jørgensen, Hans Jacob; Larsen, Allan; Madsen, Oli B.G.

    This paper investigates the characteristics of typical optimisation models within Distribution Network Design. During the paper fourteen models known from the literature will be thoroughly analysed. Through this analysis a schematic approach to categorisation of distribution network design models...... for educational purposes. Furthermore, the paper can be seen as a practical introduction to network design modelling as well as being a manual or recipe for constructing such a model.

  6. DISCRN: A Distributed Storytelling Framework for Intelligence Analysis.

    Science.gov (United States)

    Shukla, Manu; Dos Santos, Raimundo; Chen, Feng; Lu, Chang-Tien

    2017-09-01

    Storytelling connects entities (people, organizations) using their observed relationships to establish meaningful storylines. This can be extended to spatiotemporal storytelling that incorporates locations, time, and graph computations to enhance coherence and meaning. But when performed sequentially these computations become a bottleneck because the massive number of entities makes the space and time complexity untenable. This article presents DISCRN, or distributed spatiotemporal ConceptSearch-based storytelling, a distributed framework for performing spatiotemporal storytelling. The framework extracts entities from microblogs and event data, and links these entities using a novel ConceptSearch to derive storylines in a distributed fashion utilizing the key-value pair paradigm. Performing these operations at scale allows deeper and broader analysis of storylines. The novel parallelization techniques speed up the generation and filtering of storylines on massive datasets. Experiments with microblog posts such as Twitter data and Global Database of Events, Language, and Tone events show the efficiency of the techniques in DISCRN.

  7. Combining Static Analysis and Runtime Checking in Security Aspects for Distributed Tuple Spaces

    DEFF Research Database (Denmark)

    Yang, Fan; Aotani, Tomoyuki; Masuhara, Hidehiko

    2011-01-01

    Enforcing security policies to distributed systems is difficult, in particular, to a system containing untrusted components. We designed AspectKE*, an aspect-oriented programming language based on distributed tuple spaces to tackle this issue. One of the key features in AspectKE* is the program...... analysis predicates and functions that provide information on future behavior of a program. With a dual value evaluation mechanism that handles results of static analysis and runtime values at the same time, those functions and predicates enable the users to specify security policies in a uniform manner....... Our two-staged implementation strategy gathers fundamental static analysis information at load-time, so as to avoid performing all analysis at runtime. We built a compiler for AspectKE*, and successfully implemented security aspects for a distributed chat system and an electronic healthcare record...

  8. Vibrational Energy Distribution Analysis (VEDA): Scopes and limitations

    Science.gov (United States)

    Jamróz, Michał H.

    2013-10-01

    The principle of operation of the VEDA program, written by the author for Potential Energy Distribution (PED) analysis of theoretical vibrational spectra, is described. Nowadays, PED analysis is an indispensable tool in any serious analysis of vibrational spectra. To perform a PED analysis it is necessary to define 3N-6 linearly independent local mode coordinates. Already for 20-atom molecules this is a difficult task. The VEDA program reads the input data automatically from the Gaussian program output files. Then, VEDA automatically proposes an introductory set of local mode coordinates. Next, more adequate coordinates are proposed by the program and optimized to obtain maximal elements of each column (internal coordinate) of the PED matrix (the EPM parameter). The possibility of automatic optimization of PED contributions is a unique feature of the VEDA program, absent from any other program performing PED analysis.

  9. Survival Function Analysis of Planet Size Distribution

    OpenAIRE

    Zeng, Li; Jacobsen, Stein B.; Sasselov, Dimitar D.; Vanderburg, Andrew

    2018-01-01

    Applying survival function analysis to the planet radius distribution of the Kepler exoplanet candidates, we have identified two natural divisions of planet radius at 4 Earth radii and 10 Earth radii. These divisions place constraints on planet formation and interior structure models. The division at 4 Earth radii separates small exoplanets from the large exoplanets above it. When combined with the recently-discovered radius gap at 2 Earth radii, it supports the treatment of planets 2-4 Earth rad...

  10. Just fracking: a distributive environmental justice analysis of unconventional gas development in Pennsylvania, USA

    Science.gov (United States)

    Clough, Emily; Bell, Derek

    2016-02-01

    This letter presents a distributive environmental justice analysis of unconventional gas development in the area of Pennsylvania lying over the Marcellus Shale, the largest shale gas formation in play in the United States. The extraction of shale gas using unconventional wells, which are hydraulically fractured (fracking), has increased dramatically since 2005. As the number of wells has grown, so have concerns about the potential public health effects on nearby communities. These concerns make shale gas development an environmental justice issue. This letter examines whether the hazards associated with proximity to wells and the economic benefits of shale gas production are fairly distributed. We distinguish two types of distributive environmental justice: traditional and benefit sharing. We ask the traditional question: are there a disproportionate number of minority or low-income residents in areas near to unconventional wells in Pennsylvania? However, we extend this analysis in two ways: we examine income distribution and level of education; and we compare before and after shale gas development. This contributes to discussions of benefit sharing by showing how the income distribution of the population has changed. We use a binary dasymetric technique to remap the data from the 2000 US Census and the 2009-2013 American Communities Survey and combine that data with a buffer containment analysis of unconventional wells to compare the characteristics of the population living nearer to unconventional wells with those further away before and after shale gas development. Our analysis indicates that there is no evidence of traditional distributive environmental injustice: there is not a disproportionate number of minority or low-income residents in areas near to unconventional wells. However, our analysis is consistent with the claim that there is benefit sharing distributive environmental injustice: the income distribution of the population nearer to shale gas wells

  11. Powerlaw: a Python package for analysis of heavy-tailed distributions.

    Science.gov (United States)

    Alstott, Jeff; Bullmore, Ed; Plenz, Dietmar

    2014-01-01

    Power laws are theoretically interesting probability distributions that are also frequently used to describe empirical data. In recent years, effective statistical methods for fitting power laws have been developed, but appropriate use of these techniques requires significant programming and statistical insight. In order to greatly decrease the barriers to using good statistical methods for fitting power law distributions, we developed the powerlaw Python package. This software package provides easy commands for basic fitting and statistical analysis of distributions. Notably, it also seeks to support a variety of user needs by being exhaustive in the options available to the user. The source code is publicly available and easily extensible.

  12. Powerlaw: a Python package for analysis of heavy-tailed distributions.

    Directory of Open Access Journals (Sweden)

    Jeff Alstott

    Full Text Available Power laws are theoretically interesting probability distributions that are also frequently used to describe empirical data. In recent years, effective statistical methods for fitting power laws have been developed, but appropriate use of these techniques requires significant programming and statistical insight. In order to greatly decrease the barriers to using good statistical methods for fitting power law distributions, we developed the powerlaw Python package. This software package provides easy commands for basic fitting and statistical analysis of distributions. Notably, it also seeks to support a variety of user needs by being exhaustive in the options available to the user. The source code is publicly available and easily extensible.
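
    Basic usage of the powerlaw package described above might look roughly as follows; the synthetic heavy-tailed data stand in for empirical observations, and the call names follow the package's documented interface (worth checking against the installed version).

```python
import numpy as np
import powerlaw

# Synthetic Pareto-like sample with xmin = 1 and exponent alpha = 2.5 (inverse-transform sampling).
u = np.random.default_rng(3).random(10000)
data = (1.0 - u) ** (-1.0 / 1.5)

fit = powerlaw.Fit(data)
print("alpha =", fit.power_law.alpha, "xmin =", fit.power_law.xmin)

# likelihood-ratio comparison of candidate distributions
R, p = fit.distribution_compare("power_law", "lognormal")
print("loglikelihood ratio:", R, "p-value:", p)
```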

  13. Automatic analysis of attack data from distributed honeypot network

    Science.gov (United States)

    Safarik, Jakub; Voznak, MIroslav; Rezac, Filip; Partila, Pavol; Tomala, Karel

    2013-05-01

    There are many ways of getting real data about malicious activity in a network. One of them relies on masquerading monitoring servers as production ones. These servers are called honeypots, and data about attacks on them bring us valuable information about actual attacks and the techniques used by hackers. The article describes a distributed topology of honeypots, which was developed with a strong orientation towards monitoring of IP telephony traffic. IP telephony servers can be easily exposed to various types of attacks, and without protection this situation can lead to loss of money and other unpleasant consequences. Using a distributed topology with honeypots placed in different geographical locations and networks provides more valuable and independent results. With an automatic system for gathering information from all honeypots, it is possible to work with all the information at one centralized point. Communication between the honeypots and the centralized data store uses secure SSH tunnels, and the server communicates only with authorized honeypots. The centralized server also automatically analyses the data from each honeypot. The results of this analysis, along with other statistical data about malicious activity, are easily accessible through a built-in web server. All statistical and analysis reports serve as the information basis for an algorithm which classifies the different types of VoIP attacks used. The web interface then provides a tool for quick comparison and evaluation of actual attacks in all monitored networks. The article describes both the honeypot nodes in the distributed architecture, which monitor suspicious activity, and the methods and algorithms used on the server side for the analysis of the gathered data.

  14. Neighborhood Structural Similarity Mapping for the Classification of Masses in Mammograms.

    Science.gov (United States)

    Rabidas, Rinku; Midya, Abhishek; Chakraborty, Jayasree

    2018-05-01

    In this paper, two novel feature extraction methods, using neighborhood structural similarity (NSS), are proposed for the characterization of mammographic masses as benign or malignant. Since the gray-level distribution of pixels differs between benign and malignant masses, with more regular and homogeneous patterns visible in benign masses than in malignant ones, the proposed method exploits the similarity between neighboring regions of masses by designing two new features, namely NSS-I and NSS-II, which capture global similarity at different scales. Complementary to these global features, uniform local binary patterns are computed to enhance the classification efficiency by combining them with the proposed features. The performance of the features is evaluated using images from the mini-mammographic image analysis society (mini-MIAS) and digital database for screening mammography (DDSM) databases, where a tenfold cross-validation technique is incorporated with Fisher linear discriminant analysis, after selecting the optimal set of features using a stepwise logistic regression method. The best area under the receiver operating characteristic curve is 0.98 with the mini-MIAS database and 0.93 with the DDSM database.
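
    As an illustration of the complementary texture features mentioned above, the sketch below computes a uniform LBP histogram for a synthetic region of interest with scikit-image; the parameters (P = 8, R = 1) are illustrative and not those of the paper.

```python
import numpy as np
from skimage.feature import local_binary_pattern

rng = np.random.default_rng(7)
roi = rng.random((128, 128))                      # placeholder for a mass region of interest

P, R = 8, 1
lbp = local_binary_pattern(roi, P, R, method="uniform")

# histogram of uniform patterns as a feature vector (P + 2 bins for the 'uniform' method)
hist, _ = np.histogram(lbp, bins=np.arange(P + 3), density=True)
print(hist)
```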

  15. Using Micro-Synchrophasor Data for Advanced Distribution Grid Planning and Operations Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, Emma [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Kiliccote, Sila [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); McParland, Charles [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Roberts, Ciaran [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2014-07-01

    This report reviews the potential for distribution-grid phase-angle data that will be available from new micro-synchrophasors (µPMUs) to be utilized in existing distribution-grid planning and operations analysis. These data could augment the current diagnostic capabilities of grid analysis software, used in both planning and operations for applications such as fault location, and provide data for more accurate modeling of the distribution system. µPMUs are new distribution-grid sensors that will advance measurement and diagnostic capabilities and provide improved visibility of the distribution grid, enabling analysis of the grid's increasingly complex loads, which include features such as large volumes of distributed generation. Large volumes of DG lead to concerns about the continued reliable operation of the grid, due to changing power flow characteristics and active generation with its own protection and control capabilities. Using µPMU data on changes in voltage phase angle between two points, in conjunction with new and existing distribution-grid planning and operational tools, is expected to enable model validation, state estimation, fault location, and renewable resource/load characterization. Our findings include: data measurement is outstripping the processing capabilities of planning and operational tools; not every tool can visualize a voltage phase-angle measurement to the degree of accuracy measured by advanced sensors, and the degree of accuracy in measurement required for the distribution grid is not defined; solving methods cannot handle the high volumes of data generated by modern sensors, so new models and solving methods (such as graph trace analysis) are needed; and standardization of sensor-data communications platforms in planning and applications tools would allow integration of different vendors' sensors and advanced measurement devices. In addition, data from advanced sources such as µPMUs could be used to validate models to improve

  16. Distributed analysis using GANGA on the EGEE/LCG infrastructure

    International Nuclear Information System (INIS)

    Elmsheuser, J; Brochu, F; Harrison, K; Egede, U; Gaidioz, B; Liko, D; Maier, A; Moscicki, J; Muraru, A; Lee, H-C; Romanovsky, V; Soroko, A; Tan, C L

    2008-01-01

    Distributed data analysis using Grid resources is one of the fundamental applications in high energy physics to be addressed and realized before the start of LHC data taking. The need to facilitate access to the resources is very high. In every experiment up to a thousand physicists will be submitting analysis jobs to the Grid. Appropriate user interfaces and helper applications have to be made available to ensure that all users can use the Grid without much expertise in Grid technology. These tools enlarge the number of Grid users from a few production administrators to potentially all participating physicists. The GANGA job management system (http://cern.ch/ganga), developed as a common project between the ATLAS and LHCb experiments, provides and integrates these kinds of tools. GANGA provides a simple and consistent way of preparing, organizing and executing analysis tasks within the experiment analysis framework, implemented through a plug-in system. It allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid, hiding Grid technicalities. We report on the plug-ins and our experience of distributed data analysis using GANGA within the ATLAS experiment and the EGEE/LCG infrastructure. The integration of the ATLAS data management system DQ2 into GANGA is a key functionality. In combination with the job splitting mechanism, large numbers of jobs can be sent to the locations of the data, following the ATLAS computing model. GANGA supports tasks of user analysis with reconstructed data and small-scale production of Monte Carlo data.

  17. Comparing the use and interpretation of PGMI scoring to assess the technical quality of screening mammograms in the UK and Norway

    International Nuclear Information System (INIS)

    Boyce, M.; Gullien, R.; Parashar, D.; Taylor, K.

    2015-01-01

    Objectives: To compare the PGMI systems used in the UK and Norway and determine levels of agreement in their interpretation for radiographers within and between centres, informing further research towards developing a more quantitative, uniform system. Methods: Mammograms from 112 women consecutively screened in the UK and Norway were anonymised, numbered and enriched to include all four PGMI categories. Cases were scored by five mammographers from each centre using the local PGMI. Sets were exchanged and the process repeated. The distribution of categories was recorded and faults documented for images scored less than perfect. These were compared within and between centres and agreement analysed using the non-weighted kappa statistic. Results: Norway uses 38 assessment criteria, the UK uses 15. Best agreement was between Norway raters scoring MLO views from both the UK (RMLO k = 0.57, LMLO k = 0.49) and Norway (RMLO k = 0.48, LMLO k = 0.47). Least agreement was between UK raters scoring CC views from both the UK (RCC k = 0.007, LCC k = 0.01) and Norway (RCC k = −0.04, LCC k = −0.003). There were no other apparent trends in inter-rater assessment. The most frequent faults in both test sets were on MLO views. Two out of the three most common faults were the same for UK and Norway raters. Conclusions: Use of PGMI varied between centres in both the number and the interpretation of the criteria employed. We identified the most common mammographic faults, highlighting possible training needs. We suggest further work to provide a consensus list of visual criteria with accurate descriptors for each classification category. A validated way of applying them could help to standardise the process. - Highlights: • No previously published work compares PGMI use between different countries. • Variation in the number of assessment criteria used and their interpretation. • Best agreement was Norway raters scoring MLO views from both centres (moderate). • Least agreement was UK raters scoring CC views from both

  18. Water hammer analysis in a water distribution system

    Directory of Open Access Journals (Sweden)

    John Twyman

    2017-04-01

    Full Text Available The solution of water hammer in a water distribution system (WDS) is shown by applying three hybrid methods (HM) based on Box's scheme, McCormack's method and the diffusive scheme. Each HM formulation is reviewed, together with its relative advantages and disadvantages. The analyzed WDS has pipes with different lengths, diameters and wave speeds, the Courant number being different in each pipe according to the adopted discretization. The HM results are compared with the results obtained by the Method of Characteristics (MOC). Regarding numerical attenuation, the second-order schemes based on Box and McCormack are more conservative from a numerical point of view, which makes their application recommendable in the analysis of water hammer in water distribution systems.
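
    For reference, a minimal single-pipe MOC computation of the kind used as the comparison baseline above might look as follows (upstream reservoir, instantaneous downstream valve closure, Courant number equal to one); the pipe data are illustrative only and none of the paper's network details are reproduced.

```python
import numpy as np

g = 9.81
L, a, D, f = 600.0, 1200.0, 0.5, 0.018        # pipe length (m), wave speed (m/s), diameter (m), friction factor
H0, Q0 = 150.0, 0.477                          # reservoir head (m), initial steady flow (m^3/s)
N = 20                                         # number of reaches

A = np.pi * D**2 / 4
dx, dt = L / N, (L / N) / a                    # Courant number = 1
B = a / (g * A)                                # characteristic impedance term
R = f * dx / (2 * g * D * A**2)                # friction coefficient per reach

H = H0 - R * Q0 * abs(Q0) * np.arange(N + 1)   # steady-state heads along the pipe
Q = np.full(N + 1, Q0)

for _ in range(round(4 * L / a / dt)):         # simulate four pipe periods after valve closure
    Hp, Qp = H.copy(), Q.copy()
    CP = H[:-1] + B * Q[:-1] - R * Q[:-1] * np.abs(Q[:-1])   # C+ from upstream neighbours
    CM = H[1:] - B * Q[1:] + R * Q[1:] * np.abs(Q[1:])       # C- from downstream neighbours
    Hp[1:-1] = 0.5 * (CP[:-1] + CM[1:])
    Qp[1:-1] = (CP[:-1] - CM[1:]) / (2 * B)
    Hp[0], Qp[0] = H0, (H0 - CM[0]) / B        # upstream boundary: constant-head reservoir
    Qp[-1], Hp[-1] = 0.0, CP[-1]               # downstream boundary: valve fully closed (Q = 0)
    H, Q = Hp, Qp

print(f"head at the valve after 4L/a seconds: {H[-1]:.1f} m")
```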

  19. Improving mass candidate detection in mammograms via feature maxima propagation and local feature selection.

    Science.gov (United States)

    Melendez, Jaime; Sánchez, Clara I; van Ginneken, Bram; Karssemeijer, Nico

    2014-08-01

    Mass candidate detection is a crucial component of multistep computer-aided detection (CAD) systems. It is usually performed by combining several local features by means of a classifier. When these features are processed on a per-image-location basis (e.g., for each pixel), mismatching problems may arise while constructing feature vectors for classification, which is especially true when the behavior expected from the evaluated features is a peaked response due to the presence of a mass. In this study, two of these problems, consisting of maxima misalignment and differences of maxima spread, are identified and two solutions are proposed. The first proposed method, feature maxima propagation, reproduces feature maxima through their neighboring locations. The second method, local feature selection, combines different subsets of features for different feature vectors associated with image locations. Both methods are applied independently and together. The proposed methods are included in a mammogram-based CAD system intended for mass detection in screening. Experiments are carried out with a database of 382 digital cases. Sensitivity is assessed at two sets of operating points. The first one is the interval of 3.5-15 false positives per image (FPs/image), which is typical for mass candidate detection. The second one is 1 FP/image, which allows to estimate the quality of the mass candidate detector's output for use in subsequent steps of the CAD system. The best results are obtained when the proposed methods are applied together. In that case, the mean sensitivity in the interval of 3.5-15 FPs/image significantly increases from 0.926 to 0.958 (p < 0.0002). At the lower rate of 1 FP/image, the mean sensitivity improves from 0.628 to 0.734 (p < 0.0002). Given the improved detection performance, the authors believe that the strategies proposed in this paper can render mass candidate detection approaches based on image location classification more robust to feature
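
    A rough sketch of the first idea (propagating feature maxima to neighbouring locations) using a grey-scale maximum filter; this illustrates the general concept under that assumption and is not the authors' exact formulation or parameter choice.

```python
import numpy as np
from scipy.ndimage import maximum_filter

rng = np.random.default_rng(5)
feature_map = rng.random((256, 256))            # placeholder likelihood map for one local feature

# replace each pixel by the maximum response within a (2r+1) x (2r+1) neighbourhood,
# so peaked responses from different features overlap before per-pixel classification
r = 4
propagated = maximum_filter(feature_map, size=2 * r + 1)

# feature vectors for classification would now be built from the `propagated` maps
print(feature_map.max() == propagated.max())    # maxima are preserved, only spatially spread
```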

  20. DIRAC - The Distributed MC Production and Analysis for LHCb

    CERN Document Server

    Tsaregorodtsev, A

    2004-01-01

    DIRAC is the LHCb distributed computing grid infrastructure for MC production and analysis. Its architecture is based on a set of distributed collaborating services. The service decomposition broadly follows the ARDA project proposal, allowing for the possibility of interchanging the EGEE/ARDA and DIRAC components in the future. Some components developed outside the DIRAC project are already in use as services, for example the File Catalog developed by the AliEn project. An overview of the DIRAC architecture will be given, in particular the recent developments to support user analysis. The main design choices will be presented. One of the main design goals of DIRAC is the simplicity of installation, configuring and operation of various services. This allows all the DIRAC resources to be easily managed by a single Production Manager. The modular design of the DIRAC components allows its functionality to be easily extended to include new computing and storage elements or to handle new tasks. The DIRAC system al...

  1. Space station electrical power distribution analysis using a load flow approach

    Science.gov (United States)

    Emanuel, Ervin M.

    1987-01-01

    The space station's electrical power system will evolve and grow in a manner much like that of present terrestrial electrical power utilities. The initial baseline reference configuration will contain more than 50 nodes or busses, inverters, transformers, overcurrent protection devices, distribution lines, solar arrays, and/or solar dynamic power generating sources. The system is designed to manage and distribute 75 kW of power, single phase or three phase at 20 kHz, grow to a level of 300 kW steady state, and be capable of operating at a peak of 450 kW for 5 to 10 min. In order to plan far into the future and keep pace with load growth, a load flow power system analysis approach must be developed and utilized. This method is a well-known energy assessment and management tool that is widely used throughout the electrical power utility industry. The results of a comprehensive evaluation and assessment of an Electrical Distribution System Analysis Program (EDSA) are discussed. Its potential use as an analysis and design tool for the 20 kHz space station electrical power system is addressed.

  2. Hyperdense dots mimicking microcalcifications : Mammographic findings

    International Nuclear Information System (INIS)

    Kim, Nam Hyeon; Park, Jeong Mi; Goo, Hyun Woo; Bang, Sun Woo

    1996-01-01

    To differentiate fine hyperdense dots mimicking microcalcifications from true microcalcifications on mammography. Mammograms showing hyperdense dots in ten patients (mean age, 59 years) were evaluated. Two radiologists were asked to differentiate with the naked eye the hyperdense dots seen on ten mammograms and proven microcalcifications seen on ten mammograms. Densitometry was also performed for all lesions and the contrast index was calculated. The shape and distribution of the hyperdense dots were evaluated and enquiries were made regarding any history of breast disease and corresponding treatment. Biopsies were performed for two patients with hyperdense dots. Two radiologists made correct diagnoses in 19/20 cases (95%). The contrast index was 0.10-0.88 (mean 0.58) for hyperdense dots and 0.02-0.45 (mean 0.17) for true microcalcifications. The hyperdense dots were finer and homogeneously rounder than the microcalcifications. Distribution of the hyperdense dots was more superficial in subcutaneous fat (seven cases) and subareolar area (six cases). All ten patients with hyperdense dots had a history of mastitis and abscesses and had been treated by open drainage (six cases) and/or folk remedy (four cases). In eight patients, herb patches had been attached. Biopsies of hyperdense dots did not show any microcalcification or evidence of malignancy. These hyperdense dots were seen mainly in older patients. Their characteristic density, shape, distribution and clinical history make differential diagnosis from true microcalcifications easy and could reduce unnecessary diagnostic procedures such as surgical biopsy

  3. Hyperdense dots mimicking microcalcifications : Mammographic findings

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Nam Hyeon; Park, Jeong Mi; Goo, Hyun Woo; Bang, Sun Woo [Asan Medical Center, University of Ulsan College of Medicine, Seoul (Korea, Republic of)

    1996-12-01

    To differentiate fine hyperdense dots mimicking microcalcifications from true microcalcifications on mammography. Mammograms showing hyperdense dots in ten patients (mean age, 59 years) were evaluated. Two radiologists were asked to differentiate with the naked eye the hyperdense dots seen on ten mammograms and proven microcalcifications seen on ten mammograms. Densitometry was also performed for all lesions and the contrast index was calculated. The shape and distribution of the hyperdense dots were evaluated and enquiries were made regarding any history of breast disease and corresponding treatment. Biopsies were performed for two patients with hyperdense dots. Two radiologists made correct diagnoses in 19/20 cases (95%). The contrast index was 0.10-0.88 (mean 0.58) for hyperdense dots and 0.02-0.45 (mean 0.17) for true microcalcifications. The hyperdense dots were finer and homogeneously rounder than the microcalcifications. Distribution of the hyperdense dots was more superficial in subcutaneous fat (seven cases) and subareolar area (six cases). All ten patients with hyperdense dots had a history of mastitis and abscesses and had been treated by open drainage (six cases) and/or folk remedy (four cases). In eight patients, herb patches had been attached. Biopsies of hyperdense dots did not show any microcalcification or evidence of malignancy. These hyperdense dots were seen mainly in older patients. Their characteristic density, shape, distribution and clinical history make differential diagnosis from true microcalcifications easy and could reduce unnecessary diagnostic procedures such as surgical biopsy.

  4. Estimation of monthly solar radiation distribution for solar energy system analysis

    International Nuclear Information System (INIS)

    Coskun, C.; Oktay, Z.; Dincer, I.

    2011-01-01

    The concept of probability density frequency, which is successfully used for analyses of wind speed and outdoor temperature distributions, is now modified and proposed for estimating solar radiation distributions for design and analysis of solar energy systems. In this study, global solar radiation distribution is comprehensively analyzed for photovoltaic (PV) panel and thermal collector systems. In this regard, a case study is conducted with actual global solar irradiation data of the last 15 years recorded by the Turkish State Meteorological Service. It is found that intensity of global solar irradiance greatly affects energy and exergy efficiencies and hence the performance of collectors. -- Research highlights: → The first study to apply global solar radiation distribution in solar energy system analyses. → The first study showing global solar radiation distribution as a parameter of the solar irradiance intensity. → Time probability intensity frequency and probability power distribution do not have similar distribution patterns for each month. → There is no relation between the distribution of annual time lapse and solar energy and the intensity of solar irradiance.

  5. Power distribution, the environment, and public health. A state-level analysis

    International Nuclear Information System (INIS)

    Boyce, James K.; Klemer, Andrew R.; Templet, Paul H.; Willis, Cleve E.

    1999-01-01

    This paper examines relationships among power distribution, the environment, and public health by means of a cross-sectional analysis of the 50 US states. A measure of inter-state variations in power distribution is derived from data on voter participation, tax fairness, Medicaid access, and educational attainment. We develop and estimate a recursive model linking the distribution of power to environmental policy, environmental stress, and public health. The results support the hypothesis that greater power inequality leads to weaker environmental policies, which in turn lead to greater environmental degradation and to adverse public health outcomes

  6. Power distribution, the environment, and public health. A state-level analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boyce, James K. [Department of Economics, University of Massachusetts, Amherst, MA 01003 (United States); Klemer, Andrew R. [Department of Biology, University of Minnesota, Duluth, MN (United States); Templet, Paul H. [Institute of Environmental Studies, Louisiana State University, Baton Rouge, LA (United States); Willis, Cleve E. [Department of Resource Economics, University of Massachusetts, Amherst, MA 01003 (United States)

    1999-04-15

    This paper examines relationships among power distribution, the environment, and public health by means of a cross-sectional analysis of the 50 US states. A measure of inter-state variations in power distribution is derived from data on voter participation, tax fairness, Medicaid access, and educational attainment. We develop and estimate a recursive model linking the distribution of power to environmental policy, environmental stress, and public health. The results support the hypothesis that greater power inequality leads to weaker environmental policies, which in turn lead to greater environmental degradation and to adverse public health outcomes.

  8. Analysis of the international distribution of per capita CO2 emissions using the polarization concept

    International Nuclear Information System (INIS)

    Duro, Juan Antonio; Padilla, Emilio

    2008-01-01

    The concept of polarization is linked to the extent that a given distribution leads to the formation of homogeneous groups with opposing interests. This concept, which is basically different from the traditional one of inequality, is related to the level of inherent potential conflict in a distribution. The polarization approach has been widely applied in the analysis of income distribution. The extension of this approach to the analysis of international distribution of CO2 emissions is quite useful as it gives a potent informative instrument for characterizing the state and evolution of the international distribution of emissions and its possible political consequences in terms of tensions and the probability of achieving agreements. In this paper we analyze the international distribution of per capita CO2 emissions between 1971 and 2001 through the adaptation of the polarization concept and measures. We find that the most interesting grouped description derived from the analysis is one with two groups, which broadly coincide with Annex B and non-Annex B countries of the Kyoto Protocol, which shows the power of polarization analysis for explaining the generation of groups in the real world. The analysis also shows a significant reduction in international polarization in per capita CO2 emissions between 1971 and 1995, but not much change since 1995, which might indicate that the polarized distribution of emissions is still one of the important factors leading to difficulties in achieving agreements for reducing global emissions. (author)
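
    The abstract does not state which polarization measure was adapted; as an illustration only, the sketch below computes the widely used Esteban-Ray polarization index for a hypothetical two-group description of per capita emissions. The population shares and emission levels are made up, and the usual normalization constant is omitted.

```python
import numpy as np

def esteban_ray_polarization(pop_shares, y, alpha=1.0):
    """Esteban-Ray style polarization index over groups with population
    shares pop_shares and per capita emission levels y."""
    pi = np.asarray(pop_shares, dtype=float)
    y = np.asarray(y, dtype=float)
    return sum(pi[i] ** (1.0 + alpha) * pi[j] * abs(y[i] - y[j])
               for i in range(len(y)) for j in range(len(y)))

# Hypothetical two-group description (population shares and per capita CO2
# emissions in tonnes) -- illustrative numbers only, not the paper's data.
shares = [0.2, 0.8]      # e.g. Annex B vs non-Annex B population shares
emissions = [11.0, 2.5]
print(esteban_ray_polarization(shares, emissions, alpha=1.3))
```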

  9. Neuronal model with distributed delay: analysis and simulation study for gamma distribution memory kernel.

    Science.gov (United States)

    Karmeshu; Gupta, Varun; Kadambari, K V

    2011-06-01

    A single neuronal model incorporating distributed delay (memory) is proposed. The stochastic model has been formulated as a Stochastic Integro-Differential Equation (SIDE) which results in the underlying process being non-Markovian. A detailed analysis of the model when the distributed delay kernel has exponential form (weak delay) has been carried out. The selection of exponential kernel has enabled the transformation of the non-Markovian model to a Markovian model in an extended state space. For the study of First Passage Time (FPT) with exponential delay kernel, the model has been transformed to a system of coupled Stochastic Differential Equations (SDEs) in two-dimensional state space. Simulation studies of the SDEs provide insight into the effect of weak delay kernel on the Inter-Spike Interval (ISI) distribution. A measure based on Jensen-Shannon divergence is proposed which can be used to make a choice between two competing models viz. distributed delay model vis-à-vis LIF model. An interesting feature of the model is that the behavior of the coefficient of variation CV_ISI(t) of the ISI distribution with respect to memory kernel time constant parameter η reveals that the neuron can switch from a bursting state to a non-bursting state as the noise intensity parameter changes. The membrane potential exhibits decaying auto-correlation structure with or without damped oscillatory behavior depending on the choice of parameters. This behavior is in agreement with empirically observed pattern of spike count in a fixed time window. The power spectral density derived from the auto-correlation function is found to exhibit single and double peaks. The model is also examined for the case of strong delay with memory kernel having the form of Gamma distribution. In contrast to fast decay of damped oscillations of the ISI distribution for the model with weak delay kernel, the decay of damped oscillations is found to be slower for the model with strong delay kernel.
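
    As a rough illustration of the extended-state-space idea, the following Euler-Maruyama sketch simulates a generic two-dimensional coupled SDE with a slow auxiliary (memory) variable and collects inter-spike intervals. The drift terms, threshold and parameter values are invented for the sketch and are not the paper's equations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Euler-Maruyama simulation of a generic coupled two-dimensional SDE,
#   dV = (-V/tau_m + u + mu) dt + sigma dW,    du = ((-u + V) / eta) dt,
# meant only to illustrate the extended-state-space idea; parameters and
# the spike/reset rule below are illustrative, not the paper's model.
tau_m, eta, mu, sigma, thr = 10.0, 5.0, 0.8, 0.5, 1.0
dt, n_steps = 0.01, 200_000

V, u, t_last, isis = 0.0, 0.0, 0.0, []
for k in range(n_steps):
    dW = rng.normal(0.0, np.sqrt(dt))
    V += (-V / tau_m + u + mu) * dt + sigma * dW
    u += ((-u + V) / eta) * dt
    if V >= thr:                       # spike: record the ISI and reset
        isis.append(k * dt - t_last)
        t_last, V, u = k * dt, 0.0, 0.0

isis = np.array(isis)
print(f"{len(isis)} spikes, CV of ISI = {isis.std() / isis.mean():.2f}")
```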

  10. Current, voltage and temperature distribution modeling of light-emitting diodes based on electrical and thermal circuit analysis

    International Nuclear Information System (INIS)

    Yun, J; Shim, J-I; Shin, D-S

    2013-01-01

    We demonstrate a modeling method based on the three-dimensional electrical and thermal circuit analysis to extract current, voltage and temperature distributions of light-emitting diodes (LEDs). In our model, the electrical circuit analysis is performed first to extract the current and voltage distributions in the LED. Utilizing the result obtained from the electrical circuit analysis as distributed heat sources, the thermal circuit is set up by using the duality between Fourier's law and Ohm's law. From the analysis of the thermal circuit, the temperature distribution at each epitaxial film is successfully obtained. Comparisons of experimental and simulation results are made by employing an InGaN/GaN multiple-quantum-well blue LED. Validity of the electrical circuit analysis is confirmed by comparing the light distribution at the surface. Since the temperature distribution at each epitaxial film cannot be obtained experimentally, the apparent temperature distribution is compared at the surface of the LED chip. Also, experimentally obtained average junction temperature is compared with the value calculated from the modeling, yielding a very good agreement. The analysis method based on the circuit modeling has an advantage of taking distributed heat sources as inputs, which is essential for high-power devices with significant self-heating. (paper)

  11. Lognormal Distribution of Cellular Uptake of Radioactivity: Statistical Analysis of α-Particle Track Autoradiography

    Science.gov (United States)

    Neti, Prasad V.S.V.; Howell, Roger W.

    2010-01-01

    Recently, the distribution of radioactivity among a population of cells labeled with 210Po was shown to be well described by a log-normal (LN) distribution function (J Nucl Med. 2006;47:1049–1058) with the aid of autoradiography. To ascertain the influence of Poisson statistics on the interpretation of the autoradiographic data, the present work reports on a detailed statistical analysis of these earlier data. Methods: The measured distributions of α-particle tracks per cell were subjected to statistical tests with Poisson, LN, and Poisson-lognormal (P-LN) models. Results: The LN distribution function best describes the distribution of radioactivity among cell populations exposed to 0.52 and 3.8 kBq/mL of 210Po-citrate. When cells were exposed to 67 kBq/mL, the P-LN distribution function gave a better fit; however, the underlying activity distribution remained log-normal. Conclusion: The present analysis generally provides further support for the use of LN distributions to describe the cellular uptake of radioactivity. Care should be exercised when analyzing autoradiographic data on activity distributions to ensure that Poisson processes do not distort the underlying LN distribution. PMID:18483086
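
    A minimal sketch of the kind of comparison described above, fitting lognormal and Poisson models to simulated per-cell track counts (lognormal activity with Poisson counting on top) and comparing the fits with a Kolmogorov-Smirnov style statistic. The data and parameters are placeholders, not the published measurements.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Simulated per-cell alpha-track counts: cellular activity drawn from a
# lognormal distribution, with Poisson counting statistics on top
# (an illustrative stand-in for the autoradiography data).
activity = rng.lognormal(mean=1.5, sigma=0.8, size=2000)
tracks = rng.poisson(activity)

# Fit a lognormal to the non-zero counts and compare with a Poisson fit
# via simple Kolmogorov-Smirnov statistics.
nz = tracks[tracks > 0].astype(float)
shape, loc, scale = stats.lognorm.fit(nz, floc=0)
ks_ln = stats.kstest(nz, 'lognorm', args=(shape, loc, scale)).statistic

lam = tracks.mean()
grid = np.arange(tracks.max() + 1)
pois_cdf = stats.poisson.cdf(grid, lam)
emp_cdf = np.searchsorted(np.sort(tracks), grid, side='right') / tracks.size
ks_pois = np.abs(emp_cdf - pois_cdf).max()

print(f"KS vs lognormal: {ks_ln:.3f}, KS vs Poisson: {ks_pois:.3f}")
```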

  12. Benefits analysis of Soft Open Points for electrical distribution network operation

    International Nuclear Information System (INIS)

    Cao, Wanyu; Wu, Jianzhong; Jenkins, Nick; Wang, Chengshan; Green, Timothy

    2016-01-01

    Highlights: • An analysis framework was developed to quantify the operational benefits. • The framework considers both network reconfiguration and SOP control. • Benefits were analyzed through both quantitative and sensitivity analysis. - Abstract: Soft Open Points (SOPs) are power electronic devices installed in place of normally-open points in electrical power distribution networks. They are able to provide active power flow control, reactive power compensation and voltage regulation under normal network operating conditions, as well as fast fault isolation and supply restoration under abnormal conditions. A steady state analysis framework was developed to quantify the operational benefits of a distribution network with SOPs under normal network operating conditions. A generic power injection model was developed and used to determine the optimal SOP operation using an improved Powell’s Direct Set method. Physical limits and power losses of the SOP device (based on back to back voltage-source converters) were considered in the model. Distribution network reconfiguration algorithms, with and without SOPs, were developed and used to identify the benefits of using SOPs. Test results on a 33-bus distribution network compared the benefits of using SOPs, traditional network reconfiguration and the combination of both. The results showed that using only one SOP achieved a similar improvement in network operation compared to the case of using network reconfiguration with all branches equipped with remotely controlled switches. A combination of SOP control and network reconfiguration provided the optimal network operation.
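
    The paper reports using an improved Powell's direct set method to determine the optimal SOP operation; the sketch below only shows the general idea with SciPy's stock Powell optimizer minimizing a toy loss-plus-penalty objective over SOP setpoints. The objective, converter rating and setpoint variables are invented placeholders, not the paper's power injection model.

```python
import numpy as np
from scipy.optimize import minimize

# Toy objective standing in for network losses as a function of the SOP
# setpoints x = [P_transfer, Q_side1, Q_side2]; the quadratic form and the
# converter rating below are invented for illustration only.
S_RATED = 1.0  # per-unit rating of each back-to-back converter

def network_losses(x):
    P, Q1, Q2 = x
    losses = 0.04 * (P - 0.3) ** 2 + 0.03 * (Q1 + 0.2) ** 2 + 0.03 * (Q2 - 0.1) ** 2
    # Penalise violations of the converter capacity limits on both sides.
    penalty = 0.0
    for s in (np.hypot(P, Q1), np.hypot(P, Q2)):
        penalty += 1e3 * max(0.0, s - S_RATED) ** 2
    return losses + penalty

res = minimize(network_losses, x0=np.zeros(3), method='Powell')
print("optimal SOP setpoints:", np.round(res.x, 3), " objective:", round(res.fun, 5))
```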

  13. A MODEL OF HETEROGENEOUS DISTRIBUTED SYSTEM FOR FOREIGN EXCHANGE PORTFOLIO ANALYSIS

    Directory of Open Access Journals (Sweden)

    Dragutin Kermek

    2006-06-01

    Full Text Available The paper investigates the design of heterogeneous distributed system for foreign exchange portfolio analysis. The proposed model includes few separated and dislocated but connected parts through distributed mechanisms. Making system distributed brings new perspectives to performance busting where software based load balancer gets very important role. Desired system should spread over multiple, heterogeneous platforms in order to fulfil open platform goal. Building such a model incorporates different patterns from GOF design patterns, business patterns, J2EE patterns, integration patterns, enterprise patterns, distributed design patterns to Web services patterns. The authors try to find as much as possible appropriate patterns for planned tasks in order to capture best modelling and programming practices.

  14. Computer-aided mass detection in mammography: False positive reduction via gray-scale invariant ranklet texture features

    International Nuclear Information System (INIS)

    Masotti, Matteo; Lanconelli, Nico; Campanini, Renato

    2009-01-01

    In this work, gray-scale invariant ranklet texture features are proposed for false positive reduction (FPR) in computer-aided detection (CAD) of breast masses. Two main considerations are at the basis of this proposal. First, false positive (FP) marks surviving our previous CAD system seem to be characterized by specific texture properties that can be used to discriminate them from masses. Second, our previous CAD system achieves invariance to linear/nonlinear monotonic gray-scale transformations by encoding regions of interest into ranklet images through the ranklet transform, an image transformation similar to the wavelet transform, yet dealing with pixels' ranks rather than with their gray-scale values. Therefore, the new FPR approach proposed herein defines a set of texture features which are calculated directly from the ranklet images corresponding to the regions of interest surviving our previous CAD system, hence, ranklet texture features; then, a support vector machine (SVM) classifier is used for discrimination. As a result of this approach, texture-based information is used to discriminate FP marks surviving our previous CAD system; at the same time, invariance to linear/nonlinear monotonic gray-scale transformations of the new CAD system is guaranteed, as ranklet texture features are calculated from ranklet images that have this property themselves by construction. To emphasize the gray-scale invariance of both the previous and new CAD systems, training and testing are carried out without any in-between parameters' adjustment on mammograms having different gray-scale dynamics; in particular, training is carried out on analog digitized mammograms taken from a publicly available digital database, whereas testing is performed on full-field digital mammograms taken from an in-house database. Free-response receiver operating characteristic (FROC) curve analysis of the two CAD systems demonstrates that the new approach achieves a higher reduction of FP marks
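
    The ranklet transform itself is not reproduced here; the sketch below only illustrates the final discrimination step, an SVM trained on precomputed texture feature vectors for candidate regions. The feature matrix and labels are random placeholders rather than ranklet texture features.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(2)

# Placeholder feature matrix: one row of texture features per candidate
# region (in the paper these would be computed from ranklet images);
# label 1 = true mass, 0 = false-positive mark.
X = rng.normal(size=(400, 24))
y = (X[:, :4].sum(axis=1) + 0.5 * rng.normal(size=400) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel='rbf', C=1.0))
clf.fit(X_tr, y_tr)
print("held-out accuracy:", round(clf.score(X_te, y_te), 3))
```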

  15. A novel image toggle tool for comparison of serial mammograms: automatic density normalization and alignment-development of the tool and initial experience.

    Science.gov (United States)

    Honda, Satoshi; Tsunoda, Hiroko; Fukuda, Wataru; Saida, Yukihisa

    2014-12-01

    The purpose is to develop a new image toggle tool with automatic density normalization (ADN) and automatic alignment (AA) for comparing serial digital mammograms (DMGs). We developed an ADN and AA process to compare the images of serial DMGs. In image density normalization, a linear interpolation was applied by taking two points of high- and low-brightness areas. The alignment was calculated by determining the point of the greatest correlation while shifting the alignment between the current and prior images. These processes were performed on a PC with a 3.20-GHz Xeon processor and 8 GB of main memory. We selected 12 suspected breast cancer patients who had undergone screening DMGs in the past. Automatic processing was retrospectively performed on these images. Two radiologists subjectively evaluated them. The process of the developed algorithm took approximately 1 s per image. In our preliminary experience, two of the images could not be aligned properly. When they were aligned, image toggling allowed differences between examinations to be detected easily. We developed a new tool to facilitate comparative reading of DMGs on a mammography viewing system. Using this tool for toggling comparisons might improve the interpretation efficiency of serial DMGs.
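
    A minimal NumPy sketch of the two operations described, assuming placeholder images: a linear gray-level mapping fixed by one low- and one high-brightness reference point, and an alignment found by brute-force search for the integer shift with the greatest correlation. The tool's actual implementation is not given in the abstract, so everything below is illustrative.

```python
import numpy as np

def normalize_density(img, lo_cur, hi_cur, lo_ref, hi_ref):
    """Linear interpolation mapping two reference gray levels of the current
    image (lo_cur, hi_cur) onto those of the prior image (lo_ref, hi_ref)."""
    scale = (hi_ref - lo_ref) / float(hi_cur - lo_cur)
    return (img - lo_cur) * scale + lo_ref

def best_shift(current, prior, max_shift=20):
    """Brute-force search for the (dy, dx) shift maximizing the correlation
    between the overlapping parts of the two images."""
    best, best_corr = (0, 0), -np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            a = current[max(dy, 0):current.shape[0] + min(dy, 0),
                        max(dx, 0):current.shape[1] + min(dx, 0)]
            b = prior[max(-dy, 0):prior.shape[0] + min(-dy, 0),
                      max(-dx, 0):prior.shape[1] + min(-dx, 0)]
            corr = np.corrcoef(a.ravel(), b.ravel())[0, 1]
            if corr > best_corr:
                best, best_corr = (dy, dx), corr
    return best, best_corr

# Toy example: the "current" image is a shifted, rescaled copy of the prior.
rng = np.random.default_rng(3)
prior = rng.random((128, 128))
current = np.roll(prior, (5, -3), axis=(0, 1)) * 1.4 + 10.0
current = normalize_density(current, current.min(), current.max(),
                            prior.min(), prior.max())
print(best_shift(current, prior))   # recovers the (5, -3) shift
```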

  16. Modeling and Analysis of Shape with Applications in Computer-aided Diagnosis of Breast Cancer

    CERN Document Server

    Guliato, Denise

    2011-01-01

    Malignant tumors due to breast cancer and masses due to benign disease appear in mammograms with different shape characteristics: the former usually have rough, spiculated, or microlobulated contours, whereas the latter commonly have smooth, round, oval, or macrolobulated contours. Features that characterize shape roughness and complexity can assist in distinguishing between malignant tumors and benign masses. In spite of the established importance of shape factors in the analysis of breast tumors and masses, difficulties exist in obtaining accurate and artifact-free boundaries of the related

  17. Analysis of Jingdong Mall Logistics Distribution Model

    Science.gov (United States)

    Shao, Kang; Cheng, Feng

    In recent years, the development of electronic commerce in our country has accelerated. The role of logistics has been highlighted, and more and more electronic commerce enterprises are beginning to realize the importance of logistics to the success or failure of the enterprise. In this paper, the author takes Jingdong Mall as an example, performs a SWOT analysis of the current situation of its self-built logistics system, identifies the problems existing in the current Jingdong Mall logistics distribution, and gives appropriate recommendations.

  18. Projection methods for the analysis of molecular-frame photoelectron angular distributions

    International Nuclear Information System (INIS)

    Grum-Grzhimailo, A.N.; Lucchese, R.R.; Liu, X.-J.; Pruemper, G.; Morishita, Y.; Saito, N.; Ueda, K.

    2007-01-01

    A projection method is developed for extracting the nondipole contribution from the molecular frame photoelectron angular distributions of linear molecules. A corresponding convenient parametric form for the angular distributions is derived. The analysis was performed for the N 1s photoionization of the NO molecule a few eV above the ionization threshold. No detectable nondipole contribution was found for the photon energy of 412 eV

  19. CMS distributed analysis infrastructure and operations: experience with the first LHC data

    International Nuclear Information System (INIS)

    Vaandering, E W

    2011-01-01

    The CMS distributed analysis infrastructure represents a heterogeneous pool of resources distributed across several continents. The resources are harnessed using gLite and glidein-based workload management systems (WMS). We provide the operational experience of the analysis workflows using CRAB-based servers interfaced with the underlying WMS. The automated interaction of the server with the WMS provides a successful analysis workflow. We present the operational experience as well as methods used in CMS to analyze the LHC data. The interaction with the CMS Run-registry for Run and luminosity block selections via CRAB is discussed. The variations of different workflows during the LHC data-taking period and the lessons drawn from this experience are also outlined.

  20. Distributed activation energy model for kinetic analysis of multi-stage hydropyrolysis of coal

    Energy Technology Data Exchange (ETDEWEB)

    Liu, X.; Li, W.; Wang, N.; Li, B. [Chinese Academy of Sciences, Taiyuan (China). Inst. of Coal Chemistry

    2003-07-01

    Based on a new analysis of the distributed activation energy model, a bicentral distribution model was introduced for the analysis of multi-stage hydropyrolysis of coal. Hydropyrolysis under linear temperature programming, with and without a holding stage, was mathematically described and the corresponding kinetic expressions were derived. Based on these kinetics, the hydropyrolysis (HyPr) and multi-stage hydropyrolysis (MHyPr) of Xundian brown coal were simulated. The results show that both the Mo catalyst and 2-stage holding can lower the apparent activation energy of hydropyrolysis and narrow the activation energy distribution. Besides, there exists an optimum Mo loading of 0.2% for HyPr of Xundian lignite. 10 refs.
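
    As an illustration, the sketch below interprets the "bicentral distribution" as a two-component Gaussian mixture of activation energies and evaluates the standard distributed activation energy model expression for conversion under linear heating. The kinetic parameters, mixture weights and heating rate are invented placeholders, not the paper's values.

```python
import numpy as np
from scipy.integrate import trapezoid

R = 8.314            # gas constant, J/(mol K)
A = 1.0e13           # pre-exponential factor, 1/s (illustrative)
beta = 10.0 / 60.0   # heating rate, K/s (10 K/min)

# "Bicentral" activation-energy distribution modeled as a two-component
# Gaussian mixture; means, widths and weights are invented placeholders.
E = np.linspace(150e3, 350e3, 400)
def gauss(E, mu, sig):
    return np.exp(-0.5 * ((E - mu) / sig) ** 2) / (sig * np.sqrt(2.0 * np.pi))
f_E = 0.6 * gauss(E, 220e3, 15e3) + 0.4 * gauss(E, 280e3, 20e3)

# Standard DAEM conversion under linear heating from T0:
#   1 - X(T) = integral f(E) exp(-(A/beta) * integral_T0^T exp(-E/(R*T')) dT') dE
T0 = 400.0
T = np.linspace(T0, 1200.0, 200)
unconverted = np.empty_like(T)
for k, Tk in enumerate(T):
    Tgrid = np.linspace(T0, Tk, 200)
    inner = trapezoid(np.exp(-E[:, None] / (R * Tgrid[None, :])), Tgrid, axis=1)
    unconverted[k] = trapezoid(f_E * np.exp(-(A / beta) * inner), E)

for Tk, u in zip(T[::40], unconverted[::40]):
    print(f"T = {Tk:6.0f} K   conversion X = {1.0 - u:.3f}")
```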

  1. Challenges in reducing the computational time of QSTS simulations for distribution system analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Deboever, Jeremiah [Georgia Inst. of Technology, Atlanta, GA (United States); Zhang, Xiaochen [Georgia Inst. of Technology, Atlanta, GA (United States); Reno, Matthew J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Broderick, Robert Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grijalva, Santiago [Georgia Inst. of Technology, Atlanta, GA (United States); Therrien, Francis [CME International T&D, St. Bruno, QC (Canada)

    2017-06-01

    The rapid increase in penetration of distributed energy resources on the electric power distribution system has created a need for more comprehensive interconnection modelling and impact analysis. Unlike conventional scenario-based studies, quasi-static time-series (QSTS) simulations can realistically model time-dependent voltage controllers and the diversity of potential impacts that can occur at different times of year. However, to accurately model a distribution system with all its controllable devices, a yearlong simulation at 1-second resolution is often required, which could take conventional computers a computational time of 10 to 120 hours when an actual unbalanced distribution feeder is modeled. This computational burden is a clear limitation to the adoption of QSTS simulations in interconnection studies and for determining optimal control solutions for utility operations. Our ongoing research to improve the speed of QSTS simulation has revealed many unique aspects of distribution system modelling and sequential power flow analysis that make fast QSTS a very difficult problem to solve. In this report, the most relevant challenges in reducing the computational time of QSTS simulations are presented: number of power flows to solve, circuit complexity, time dependence between time steps, multiple valid power flow solutions, controllable element interactions, and extensive accurate simulation analysis.

  2. Tense Usage Analysis in Verb Distribution in Brazilian Portuguese.

    Science.gov (United States)

    Hoge, Henry W., Comp.

    This section of a four-part research project investigating the syntax of Brazilian Portuguese presents data concerning tense usage in verb distribution. The data are derived from the analysis of selected literary samples from representative and contemporary writers. The selection of authors and tabulation of data are also described. Materials…

  3. Evaluating Domestic Hot Water Distribution System Options with Validated Analysis Models

    Energy Technology Data Exchange (ETDEWEB)

    Weitzel, E. [Alliance for Residential Building Innovation, Davis, CA (United States); Hoeschele, E. [Alliance for Residential Building Innovation, Davis, CA (United States)

    2014-09-01

    A developing body of work is forming that collects data on domestic hot water consumption, water use behaviors, and energy efficiency of various distribution systems. A full distribution system model was developed in the Transient System Simulation Tool (TRNSYS), validated using field monitoring data, and then exercised in a number of climates to understand climate impact on performance. In this study, the Building America team built upon previous analysis modeling work to evaluate differing distribution systems and the sensitivities of water heating energy and water use efficiency to variations of climate, load, distribution type, insulation and compact plumbing practices. Overall, 124 different TRNSYS models were simulated. The results of this work are useful in informing future development of water heating best practices guides as well as more accurate (and simulation time efficient) distribution models for annual whole house simulation programs.

  4. Distributed Leadership in Drainage Basin Management: A Critical Analysis of ‘River Chief Policy’ from a Distributed Leadership Perspective

    Science.gov (United States)

    Zhang, Liuyi

    2018-02-01

    Water resources management has been more significant than ever since an official document stipulated ‘three red lines’ to strictly control water usage and water pollution, accelerating the promotion of the ‘River Chief Policy’ throughout China. The policy introduces creative approaches that enable people from different administrative levels to participate, and distributes power to increase drainage basin management efficiency. Its execution resembles features of distributed leadership theory, a widely acknowledged western leadership theory with an innovative perspective and vision suited to the modern world. This paper analyses the policy from a distributed leadership perspective using Taylor’s critical policy analysis framework.

  5. "Thanks for Letting Us All Share Your Mammogram Experience Virtually": Developing a Web-Based Hub for Breast Cancer Screening.

    Science.gov (United States)

    Galpin, Adam; Meredith, Joanne; Ure, Cathy; Robinson, Leslie

    2017-10-27

    The decision around whether to attend breast cancer screening can often involve making sense of confusing and contradictory information on its risks and benefits. The Word of Mouth Mammogram e-Network (WoMMeN) project was established to create a Web-based resource to support decision making regarding breast cancer screening. This paper presents data from our user-centered approach in engaging stakeholders (both health professionals and service users) in the design of this Web-based resource. Our novel approach involved creating a user design group within Facebook to allow them access to ongoing discussion between researchers, radiographers, and existing and potential service users. This study had two objectives. The first was to examine the utility of an online user design group for generating insight for the creation of Web-based health resources. We sought to explore the advantages and limitations of this approach. The second objective was to analyze what women want from a Web-based resource for breast cancer screening. We recruited a user design group on Facebook and conducted a survey within the group, asking questions about design considerations for a Web-based breast cancer screening hub. Although the membership of the Facebook group varied over time, there were 71 members in the Facebook group at the end point of analysis. We next conducted a framework analysis on 70 threads from Facebook and a thematic analysis on the 23 survey responses. We focused additionally on how the themes were discussed by the different stakeholders within the context of the design group. Two major themes were found across both the Facebook discussion and the survey data: (1) the power of information and (2) the hub as a place for communication and support. Information was considered as empowering but also recognized as threatening. Communication and the sharing of experiences were deemed important, but there was also recognition of potential miscommunication within online

  6. Teaching atlas of mammography. 2. rev. ed.

    International Nuclear Information System (INIS)

    Tabar, L.; Dean, P.B.

    1985-01-01

    The purpose of this Atlas is to teach radiologists how to analyze mammograms and arrive at the correct diagnosis through proper evaluation of the findings. The illustrated cases cover practically the entire spectrum of breast abnormalities. They are based upon referred patient material as well as 80,000 mammographic screening examinations. There are two basic steps in the interpretation of mammograms: perception and analysis. Since the greatest benefit of mammography lies in the detection of breast carcinoma in its earliest possible stages, every mammogram must be systematically surveyed for the subtle hints of malignancy. Perception is taught in this Atlas by describing a method for systematic viewing. The reader is then provided with a series of mammograms with obscure lesions to encourage practice with this method. With the help of a coordinate system, the lesions can be precisely located. Practice in perception continues throughout the Atlas. After detecting an abnormality on the mammogram, the diagnosis can be reached through a careful analysis of the X-ray signs. Additional projections, coned-down compression and magnification views provide further help in this analytic workup. Rather than starting with the diagnosis and demonstrating typical findings, the approach of this Atlas is to teach the reader how to analyze the image and reach the correct diagnosis through proper evaluation of the X-ray signs. Prerequisites for the perception and evaluation of the X-ray signs are optimum technique, knowledge of anatomy and understanding of the pathological processes leading to the mammographic appearances. (orig.)

  7. Application of adaptive boosting to EP-derived multilayer feed-forward neural networks (MLFN) to improve benign/malignant breast cancer classification

    Science.gov (United States)

    Land, Walker H., Jr.; Masters, Timothy D.; Lo, Joseph Y.; McKee, Dan

    2001-07-01

    A new neural network technology was developed for improving the benign/malignant diagnosis of breast cancer using mammogram findings. A new paradigm, Adaptive Boosting (AB), uses a markedly different theory in solving Computational Intelligence (CI) problems. AB, a new machine learning paradigm, focuses on finding weak learning algorithm(s) that initially need to provide slightly better than random performance (i.e., approximately 55%) when processing a mammogram training set. Then, by successive development of additional architectures (using the mammogram training set), the adaptive boosting process improves the performance of the basic Evolutionary Programming derived neural network architectures. The results of these several EP-derived hybrid architectures are then intelligently combined and tested using a similar validation mammogram data set. Optimization focused on improving specificity and positive predictive value at very high sensitivities, where an analysis of the performance of the hybrid would be most meaningful. Using the DUKE mammogram database of 500 biopsy-proven samples, on average this hybrid was able to achieve (under statistical 5-fold cross-validation) a specificity of 48.3% and a positive predictive value (PPV) of 51.8% while maintaining 100% sensitivity. At 97% sensitivity, a specificity of 56.6% and a PPV of 55.8% were obtained.
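
    A hedged sketch of the boosting idea only: here stock decision stumps stand in for the EP-derived neural network weak learners, and the data are synthetic feature vectors rather than the DUKE mammogram findings.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)

# Placeholder feature vectors for mammogram findings with benign (0) /
# malignant (1) labels; AdaBoostClassifier's default weak learner is a
# depth-1 decision stump, standing in here for the EP-derived networks.
X = rng.normal(size=(500, 10))
y = (X[:, 0] + 0.8 * X[:, 1] ** 2 + 0.5 * rng.normal(size=500) > 1.0).astype(int)

boosted = AdaBoostClassifier(n_estimators=100, learning_rate=0.5)
scores = cross_val_score(boosted, X, y, cv=5)   # 5-fold cross-validation
print("mean CV accuracy:", round(scores.mean(), 3))
```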

  8. Development of neural network for analysis of local power distributions in BWR fuel bundles

    International Nuclear Information System (INIS)

    Tanabe, Akira; Yamamoto, Toru; Shinfuku, Kimihiro; Nakamae, Takuji.

    1993-01-01

    A neural network model has been developed to learn the local power distributions in a BWR fuel bundle. A two-layer neural network with a total of 128 elements is used for this model. The neural network learns 33 cases of local power peaking factors of fuel rods with given enrichment distributions as the teacher signals, which were calculated by a fuel bundle nuclear analysis code based on precise physical models. This neural network model reproduced the teacher signals to within 1% error. It is also able to calculate the local power distributions to within several percent error for enrichment distributions different from the teacher signals when the average enrichment is close to 2%. This neural network is simple, and the computing speed of this model is 300 times faster than that of the precise nuclear analysis code. The model was applied to search for an enrichment distribution that meets a target local power distribution in a fuel bundle, and an enrichment distribution with a flat power shape was obtained within a short computing time. (author)
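
    A small stand-in for the described network, assuming synthetic data: a one-hidden-layer regressor maps a rod-wise enrichment distribution to rod-wise power peaking factors. The "teacher signals" here come from a made-up smooth mapping, whereas the paper generated them with a bundle nuclear analysis code.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(5)

# Synthetic stand-in for the training set: each sample is a rod-wise
# enrichment distribution (flattened 8x8 bundle); the target is a rod-wise
# local power peaking factor produced by an invented smooth mapping.
n_rods = 64
enrichments = 2.0 + 0.8 * rng.random((33, n_rods))
power = enrichments / enrichments.mean(axis=1, keepdims=True)
power += 0.02 * rng.normal(size=power.shape)

# A small network with a single hidden layer, as in the abstract.
net = MLPRegressor(hidden_layer_sizes=(64,), activation='tanh',
                   max_iter=5000, random_state=0)
net.fit(enrichments, power)

test = 2.0 + 0.8 * rng.random((1, n_rods))
pred = net.predict(test)
print("max predicted peaking factor:", round(float(pred.max()), 3))
```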

  9. Pseudodifferential Analysis, Automorphic Distributions in the Plane and Modular Forms

    CERN Document Server

    Unterberger, Andre

    2011-01-01

    Pseudodifferential analysis, introduced in this book in a way adapted to the needs of number theorists, relates automorphic function theory in the hyperbolic half-plane I to automorphic distribution theory in the plane. Spectral-theoretic questions are discussed in one or the other environment: in the latter one, the problem of decomposing automorphic functions in I according to the spectral decomposition of the modular Laplacian gives way to the simpler one of decomposing automorphic distributions in R2 into homogeneous components. The Poincare summation process, which consists in building au

  10. Decomposition and Projection Methods for Distributed Robustness Analysis of Interconnected Uncertain Systems

    DEFF Research Database (Denmark)

    Pakazad, Sina Khoshfetrat; Hansson, Anders; Andersen, Martin Skovgaard

    2013-01-01

    We consider a class of convex feasibility problems where the constraints that describe the feasible set are loosely coupled. These problems arise in robust stability analysis of large, weakly interconnected uncertain systems. To facilitate distributed implementation of robust stability analysis o...

  11. CMS distributed data analysis with CRAB3

    Science.gov (United States)

    Mascheroni, M.; Balcas, J.; Belforte, S.; Bockelman, B. P.; Hernandez, J. M.; Ciangottini, D.; Konstantinov, P. B.; Silva, J. M. D.; Ali, M. A. B. M.; Melo, A. M.; Riahi, H.; Tanasijczuk, A. J.; Yusli, M. N. B.; Wolf, M.; Woodard, A. E.; Vaandering, E.

    2015-12-01

    The CMS Remote Analysis Builder (CRAB) is a distributed workflow management tool which facilitates analysis tasks by isolating users from the technical details of the Grid infrastructure. Throughout LHC Run 1, CRAB has been successfully employed by an average of 350 distinct users each week executing about 200,000 jobs per day. CRAB has been significantly upgraded in order to face the new challenges posed by LHC Run 2. Components of the new system include 1) a lightweight client, 2) a central primary server which communicates with the clients through a REST interface, 3) secondary servers which manage user analysis tasks and submit jobs to the CMS resource provisioning system, and 4) a central service to asynchronously move user data from temporary storage in the execution site to the desired storage location. The new system improves the robustness, scalability and sustainability of the service. Here we provide an overview of the new system, operation, and user support, report on its current status, and identify lessons learned from the commissioning phase and production roll-out.

  12. Projection methods for the analysis of molecular-frame photoelectron angular distributions

    International Nuclear Information System (INIS)

    Lucchese, R.R.; Montuoro, R.; Grum-Grzhimailo, A.N.; Liu, X.-J.; Pruemper, G.; Morishita, Y.; Saito, N.; Ueda, K.

    2007-01-01

    The analysis of the molecular-frame photoelectron angular distributions (MFPADs) is discussed within the dipole approximation. The general expressions are reviewed and strategies for extracting the maximum amount of information from different types of experimental measurements are considered. The analysis of the N 1s photoionization of NO is given to illustrate the method

  13. First experience and adaptation of existing tools to ATLAS distributed analysis

    International Nuclear Information System (INIS)

    De La Hoz, S.G.; Ruiz, L.M.; Liko, D.

    2008-01-01

    The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale in ATLAS. Up to 10000 jobs were processed on about 100 sites in one day. The experience obtained operating the system on several grid flavours was essential for performing user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase, data were registered in the LHC file catalog (LFC) and replicated to external sites. For the main test, a few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS management computing board decided to integrate the collaboration efforts in distributed analysis in only one project, GANGA. The goal is to test the reconstruction and analysis software in a large-scale data production using grid flavours at several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the grid; it provides job splitting and merging, and includes automated job monitoring and output retrieval. (orig.)

  14. Performance optimisations for distributed analysis in ALICE

    International Nuclear Information System (INIS)

    Betev, L; Gheata, A; Grigoras, C; Hristov, P; Gheata, M

    2014-01-01

    Performance is a critical issue in a production system accommodating hundreds of analysis users. Compared to a local session, distributed analysis is exposed to services and network latencies, remote data access and heterogeneous computing infrastructure, creating a more complex performance and efficiency optimization matrix. During the last 2 years, ALICE analysis shifted from a fast development phase to more mature and stable code. At the same time, the frameworks and tools for deployment, monitoring and management of large productions have evolved considerably too. The ALICE Grid production system is currently used by a fair share of organized and individual user analysis, consuming up to 30% of the available resources and ranging from fully I/O-bound analysis code to CPU-intensive correlations or resonances studies. While the intrinsic analysis performance is unlikely to improve by a large factor during the LHC long shutdown (LS1), the overall efficiency of the system has still to be improved by an important factor to satisfy the analysis needs. We have instrumented all analysis jobs with "sensors" collecting comprehensive monitoring information on the job running conditions and performance in order to identify bottlenecks in the data processing flow. These data are collected by the MonALISA-based ALICE Grid monitoring system and are used to steer and improve the job submission and management policy, to identify operational problems in real time and to perform automatic corrective actions. In parallel with an upgrade of our production system we are aiming for low-level improvements related to data format, data management and merging of results to allow for a better performing ALICE analysis

  15. Detection and Evaluation of Early Breast Cancer via Magnetic Resonance Imaging: Studies of Mouse Models and Clinical Implementation

    Science.gov (United States)

    2009-03-01

    Medellin D, Mohsin SK, Hilsenbeck SG, Lamph WW, Gottardis MM, Shirley MA, Kuhn JG et al: 9-cis-Retinoic acid suppresses mammary tumorigenesis in C3...asymptomatic women who were imaged with a 3D bilateral dynamic MR sequence. Breast density was classified independently by one reader on digital x...plane. Digital mammograms acquired on GE Senograph 2000D. Image Analysis: Breast density was classified on mammograms by one reader according to BI

  16. Improvements of an objective model of compressed breasts undergoing mammography: Generation and characterization of breast shapes.

    Science.gov (United States)

    Rodríguez-Ruiz, Alejandro; Feng, Steve Si Jia; van Zelst, Jan; Vreemann, Suzan; Mann, Jessica Rice; D'Orsi, Carl Joseph; Sechopoulos, Ioannis

    2017-06-01

    To develop a set of accurate 2D models of compressed breasts undergoing mammography or breast tomosynthesis, based on objective analysis, to accurately characterize mammograms with few linearly independent parameters, and to generate novel clinically realistic paired cranio-caudal (CC) and medio-lateral oblique (MLO) views of the breast. We seek to improve on an existing model of compressed breasts by overcoming detector size bias, removing the nipple and non-mammary tissue, pairing the CC and MLO views from a single breast, and incorporating the pectoralis major muscle contour into the model. The outer breast shapes in 931 paired CC and MLO mammograms were automatically detected with an in-house developed segmentation algorithm. From these shapes three generic models (CC-only, MLO-only, and joint CC/MLO) with linearly independent components were constructed via principal component analysis (PCA). The ability of the models to represent mammograms not used for PCA was tested via leave-one-out cross-validation, by measuring the average distance error (ADE). The individual models based on six components were found to depict breast shapes with accuracy (mean ADE-CC = 0.81 mm, ADE-MLO = 1.64 mm, ADE-Pectoralis = 1.61 mm), outperforming the joint CC/MLO model (P ≤ 0.001). The joint model based on 12 principal components contains 99.5% of the total variance of the data, and can be used to generate new clinically realistic paired CC and MLO breast shapes. This is achieved by generating random sets of 12 principal components, following the Gaussian distributions of the histograms of each component, which were obtained from the component values determined from the images in the mammography database used. Our joint CC/MLO model can successfully generate paired CC and MLO view shapes of the same simulated breast, while the individual models can be used to represent with high accuracy clinical acquired mammograms with a small set of parameters. This is the first
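
    A sketch of the two modeling steps described, assuming placeholder landmark data: PCA on flattened breast-contour coordinates, followed by generation of a new shape by sampling each component score from a Gaussian matched to the training scores. The contours below are synthetic, not segmented mammograms.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(6)

# Placeholder training data: each breast outline is 50 (x, y) landmark
# points flattened into a 100-dimensional shape vector. Variation comes
# from a handful of made-up deformation modes plus small noise (real
# shapes would come from the segmented CC/MLO contours instead).
n_shapes, n_landmarks = 300, 50
mean_outline = np.column_stack([np.linspace(0.0, 1.0, n_landmarks),
                                np.sin(np.linspace(0.0, np.pi, n_landmarks))])
modes = rng.normal(size=(5, n_landmarks * 2))
weights = 0.05 * rng.normal(size=(n_shapes, 5))
X = (mean_outline.ravel()[None]
     + weights @ modes
     + 0.01 * rng.normal(size=(n_shapes, n_landmarks * 2)))

# Shape model with a small number of linearly independent components.
pca = PCA(n_components=12).fit(X)
print("variance captured:", round(pca.explained_variance_ratio_.sum(), 3))

# Generate a synthetic shape by sampling each component score from a
# Gaussian matched to the scores observed in the training set.
scores = pca.transform(X)
new_scores = rng.normal(loc=scores.mean(axis=0), scale=scores.std(axis=0))
new_shape = pca.inverse_transform(new_scores[None, :])[0].reshape(n_landmarks, 2)
print("first landmark of the generated outline:", np.round(new_shape[0], 3))
```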

  17. Real-Time Analysis and Forecasting of Multisite River Flow Using a Distributed Hydrological Model

    Directory of Open Access Journals (Sweden)

    Mingdong Sun

    2014-01-01

    Full Text Available A spatial distributed hydrological forecasting system was developed to promote the analysis of river flow dynamic state in a large basin. The research presented the real-time analysis and forecasting of multisite river flow in the Nakdong River Basin using a distributed hydrological model with radar rainfall forecast data. A real-time calibration algorithm of hydrological distributed model was proposed to investigate the particular relationship between the water storage and basin discharge. Demonstrate the approach of simulating multisite river flow using a distributed hydrological model couple with real-time calibration and forecasting of multisite river flow with radar rainfall forecasts data. The hydrographs and results exhibit that calibrated flow simulations are very approximate to the flow observation at all sites and the accuracy of forecasting flow is gradually decreased with lead times extending from 1 hr to 3 hrs. The flow forecasts are lower than the flow observation which is likely caused by the low estimation of radar rainfall forecasts. The research has well demonstrated that the distributed hydrological model is readily applicable for multisite real-time river flow analysis and forecasting in a large basin.

  18. Scheduling mammograms for asymptomatic women

    International Nuclear Information System (INIS)

    Gohagan, J.K.; Darby, W.P.; Spitznagel, E.L.; Tome, A.E.

    1988-01-01

    A decision theoretic model was used to investigate the relative importance of risk level, radiation hazard, mammographic accuracy, and cost in mammographic screening decisions. The model uses woman-specific medical and family history facts and clinic-specific information regarding mammographic accuracy and practice to profile both the woman and the clinic, and to formulate periodic screening recommendations. Model parameters were varied extensively to investigate the sensitivity of screening schedules to input values. Multivariate risk was estimated within the program using published data from the Breast Cancer Detection Demonstration Project 5-year follow-up study. Radiation hazard estimates were developed from published radiation physics and radioepidemiologic risk data. Benchmark values for mammographic sensitivity and specificity under screening conditions were calculated from Breast Cancer Detection Demonstration Project data. Procedural costs used in the analysis were varied around values reflecting conditions at the Washington University Medical Center. Mortality advantages of early versus late breast cancer detection were accounted for using Health Insurance Plan of New York case survival rates. Results are compared with published screening policies to provide insight into implicit assumptions behind those policies. This analysis emphasizes the importance of accounting for variations in clinical accuracy under screening circumstances, in costs, in radiation exposure, and in woman-specific risk when recommending mammographic screening

  19. A new measure of uncertainty importance based on distributional sensitivity analysis for PSA

    International Nuclear Information System (INIS)

    Han, Seok Jung; Tak, Nam Il; Chun, Moon Hyun

    1996-01-01

    The main objective of the present study is to propose a new measure of uncertainty importance based on distributional sensitivity analysis. The new measure is developed to utilize a metric distance obtained from cumulative distribution functions (cdfs). The measure is evaluated for two cases: one is a cdf given by a known analytical distribution and the other is given by an empirical distribution generated by a crude Monte Carlo simulation. To study its applicability, the present measure has been applied to two different cases. The results are compared with those of three existing methods. The present approach is a useful measure of uncertainty importance which is based on cdfs. This method is simple and allows uncertainty importance to be calculated easily without any complex process. On the basis of the results obtained in the present work, the present method is recommended as a tool for the analysis of uncertainty importance
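
    The sketch below illustrates one way a CDF-based importance measure of this kind can be computed, assuming a toy model and the Kolmogorov (sup) distance between the base output CDF and the output CDF obtained with one input fixed at its median; the specific model, distance and fixing rule are illustrative choices, not the paper's formulation.

```python
import numpy as np

rng = np.random.default_rng(7)

def model(x1, x2, x3):
    # Toy response standing in for a PSA model output.
    return x1 + 0.5 * x2 ** 2 + 0.1 * x3

def cdf_distance(a, b, grid):
    """Sup (Kolmogorov) distance between two empirical CDFs on a common grid."""
    Fa = np.searchsorted(np.sort(a), grid, side='right') / a.size
    Fb = np.searchsorted(np.sort(b), grid, side='right') / b.size
    return np.abs(Fa - Fb).max()

n = 20_000
inputs = {'x1': rng.normal(1.0, 0.5, n),
          'x2': rng.normal(0.0, 1.0, n),
          'x3': rng.uniform(0.0, 5.0, n)}
base = model(**inputs)
grid = np.linspace(base.min(), base.max(), 500)

# Uncertainty importance of each input: distance between the base output CDF
# and the output CDF obtained when that input is fixed at its median.
for name in inputs:
    fixed = dict(inputs)
    fixed[name] = np.full(n, np.median(inputs[name]))
    print(name, round(cdf_distance(base, model(**fixed), grid), 3))
```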

  20. Breast Density and Breast Cancer Incidence in the Lebanese Population: Results from a Retrospective Multicenter Study

    Directory of Open Access Journals (Sweden)

    Christine Salem

    2017-01-01

    Purpose. To study the distribution of breast mammogram density in Lebanese women and correlate it with breast cancer (BC) incidence. Methods. Data from 1,049 women who had screening or diagnostic mammography were retrospectively reviewed. Age, menopausal status, contraceptives or hormonal replacement therapy (HRT), parity, breastfeeding, history of BC, breast mammogram density, and final BI-RADS assessment were collected. Breast density was analyzed in each age category and compared according to factors that could influence breast density and BC incidence. Results. 120 (11.4%) patients had BC personal history with radiation and/or chemotherapy; 66 patients were postmenopausal under HRT. Mean age was 52.58±11.90 years. 76.4% of the patients (30–39 years) had dense breasts. Parity, age, and menopausal status were correlated to breast density whereas breastfeeding and personal/family history of BC and HRT were not. In multivariate analysis, it was shown that the risk of breast cancer significantly increases 3.3% with age (P=0.005), 2.5 times in case of menopause (P=0.004), and 1.4 times when breast density increases (P=0.014). Conclusion. Breast density distribution in Lebanon is similar to that of western societies. Similarly to other studies, it was shown that high breast density was statistically related to breast cancer, especially in older and menopausal women.

  1. Simplified distributed parameters BWR dynamic model for transient and stability analysis

    International Nuclear Information System (INIS)

    Espinosa-Paredes, Gilberto; Nunez-Carrera, Alejandro; Vazquez-Rodriguez, Alejandro

    2006-01-01

    This paper describes a simplified model to perform transient and linear stability analysis for a typical boiling water reactor (BWR). The simplified transient model was based on lumped- and distributed-parameter approximations, and includes the vessel dome and downcomer, recirculation loops, neutron processes, fuel pin temperature distribution, lower and upper plenums, reactor core, and pressure and level controls. The stability was determined by studying the linearized versions of the equations representing the BWR system in the frequency domain. Numerical examples are used to illustrate the wide applicability of the simplified BWR model. We conclude that this simplified model properly describes the dynamics of a BWR and can be used for safety analysis or as a first approach in the design of an advanced BWR

  2. Mammograms

    Science.gov (United States)

    ... several new technologies to detect breast tumors. This research ranges from methods being developed in research labs to those that ...

  3. Mammograms

    Science.gov (United States)

  4. Spherical Harmonic Analysis of Particle Velocity Distribution Function: Comparison of Moments and Anisotropies using Cluster Data

    Science.gov (United States)

    Gurgiolo, Chris; Vinas, Adolfo F.

    2009-01-01

    This paper presents a spherical harmonic analysis of the plasma velocity distribution function using high-angular, energy, and time resolution Cluster data obtained from the PEACE spectrometer instrument to demonstrate how this analysis models the particle distribution function and its moments and anisotropies. The results show that spherical harmonic analysis produced a robust physical representation model of the velocity distribution function, resolving the main features of the measured distributions. From the spherical harmonic analysis, a minimum set of nine spectral coefficients was obtained from which the moment (up to the heat flux), anisotropy, and asymmetry calculations of the velocity distribution function were obtained. The spherical harmonic method provides a potentially effective "compression" technique that can be easily carried out onboard a spacecraft to determine the moments and anisotropies of the particle velocity distribution function for any species. These calculations were implemented using three different approaches, namely, the standard traditional integration, the spherical harmonic (SPH) spectral coefficients integration, and the singular value decomposition (SVD) on the spherical harmonic methods. A comparison among the various methods shows that both SPH and SVD approaches provide remarkable agreement with the standard moment integration method.
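
    A simplified sketch of the expansion step, assuming a synthetic angular distribution on a single energy channel and assuming the nine spectral coefficients correspond to the complex harmonics with l <= 2: the coefficients are obtained by least squares against SciPy's spherical harmonics. The grid, distribution and noise level are placeholders, not PEACE data.

```python
import numpy as np
from scipy.special import sph_harm

rng = np.random.default_rng(8)

# Synthetic angular distribution f(theta, phi) sampled on a coarse look-
# direction grid: an isotropic part plus a field-aligned anisotropy along z.
theta = np.linspace(0.0, 2.0 * np.pi, 32, endpoint=False)   # azimuthal angle
phi = np.linspace(0.05, np.pi - 0.05, 12)                    # polar angle
TH, PH = np.meshgrid(theta, phi)
f = 1.0 + 0.4 * np.cos(PH) ** 2 + 0.02 * rng.normal(size=PH.shape)

# Least-squares fit of the nine complex coefficients a_lm for l <= 2:
#   f(theta, phi) ~ sum_lm a_lm Y_lm(theta, phi)
lm = [(l, m) for l in range(3) for m in range(-l, l + 1)]
A = np.column_stack([sph_harm(m, l, TH.ravel(), PH.ravel()) for l, m in lm])
coeffs, *_ = np.linalg.lstsq(A, f.ravel().astype(complex), rcond=None)

for (l, m), c in zip(lm, coeffs):
    print(f"a({l},{m:+d}) = {c.real:+.3f}{c.imag:+.3f}j")
```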

  5. Size distribution measurements and chemical analysis of aerosol components

    Energy Technology Data Exchange (ETDEWEB)

    Pakkanen, T.A.

    1995-12-31

    The principal aims of this work were to improve existing methods for size distribution measurements and to draw conclusions about atmospheric and in-stack aerosol chemistry and physics by utilizing the measured size distributions of various aerosol components. Sample dissolution with dilute nitric acid in an ultrasonic bath and subsequent graphite furnace atomic absorption spectrometric analysis were found to result in low blank values and good recoveries for several elements in atmospheric fine particle size fractions below 2 µm of equivalent aerodynamic particle diameter (EAD). Furthermore, it turned out that a substantial portion of the analytes associated with insoluble material could be recovered, since suspensions were formed. The size distribution measurements of in-stack combustion aerosols indicated bimodal size distributions for most of the components measured. The existence of the fine particle mode suggests that a substantial fraction of the elements with bimodal size distributions may vaporize and nucleate during the combustion process. In southern Norway, size distributions of atmospheric aerosol components usually exhibited one or two fine particle modes and one or two coarse particle modes. Atmospheric relative humidity values higher than 80% resulted in a significant increase of the mass median diameters of the droplet mode. Important local and/or regional sources of As, Br, I, K, Mn, Pb, Sb, Si and Zn were found to exist in southern Norway. The existence of these sources was reflected in the corresponding size distributions and was utilized in the development of a source identification method based on size distribution data. On the Finnish south coast, atmospheric coarse particle nitrate was found to be formed mostly through an atmospheric reaction of nitric acid with existing coarse particle sea salt, although reactions and/or adsorption of nitric acid with soil-derived particles also occurred. Chloride was depleted when acidic species reacted ...

  6. Uncertainty Visualization Using Copula-Based Analysis in Mixed Distribution Models.

    Science.gov (United States)

    Hazarika, Subhashis; Biswas, Ayan; Shen, Han-Wei

    2018-01-01

    Distributions are often used to model uncertainty in many scientific datasets. To preserve the correlation among the spatially sampled grid locations in a dataset, various standard multivariate distribution models have been proposed in the visualization literature. These models treat each grid location as a univariate random variable that models the uncertainty at that location. Standard multivariate distributions (both parametric and nonparametric) assume that all the univariate marginals are of the same type/family of distribution. In reality, however, different grid locations show different statistical behavior, which may not be modeled best by the same type of distribution. In this paper, we propose a new multivariate uncertainty modeling strategy to address the needs of uncertainty modeling in scientific datasets. Our proposed method is based on a statistically sound multivariate technique called copula, which makes it possible to separate the process of estimating the univariate marginals from the process of modeling dependency, unlike the standard multivariate distributions. The modeling flexibility offered by our proposed method makes it possible to design distribution fields that can have different types of distribution (Gaussian, histogram, KDE, etc.) at the grid locations, while maintaining the correlation structure at the same time. Depending on the results of various standard statistical tests, we can choose an optimal distribution representation at each location, resulting in more cost-efficient modeling without significantly sacrificing analysis quality. To demonstrate the efficacy of our proposed modeling strategy, we extract and visualize uncertain features such as isocontours and vortices in various real-world datasets. We also study various modeling criteria to help users in the task of univariate model selection.
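
    The separation of marginals from dependence that the paper relies on can be sketched with a Gaussian copula in a few lines; the example below uses synthetic data at three grid locations with deliberately different marginals, and every array name, distribution choice and parameter is an assumption made for illustration only.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Toy ensemble: n realizations at 3 grid locations with different marginals
    # (Gaussian, lognormal, bimodal) but a common dependence structure.
    n = 2000
    z = rng.multivariate_normal(mean=np.zeros(3),
                                cov=[[1.0, 0.7, 0.3],
                                     [0.7, 1.0, 0.5],
                                     [0.3, 0.5, 1.0]], size=n)
    u_true = stats.norm.cdf(z)
    samples = np.column_stack([
        stats.norm.ppf(u_true[:, 0], loc=5.0, scale=2.0),            # Gaussian marginal
        stats.lognorm.ppf(u_true[:, 1], s=0.6),                       # lognormal marginal
        np.where(u_true[:, 2] < 0.5,                                  # crude bimodal marginal
                 stats.norm.ppf(np.clip(2*u_true[:, 2], 1e-9, 1-1e-9), -2, 0.5),
                 stats.norm.ppf(np.clip(2*u_true[:, 2]-1, 1e-9, 1-1e-9), 2, 0.5)),
    ])

    # Gaussian copula fit: separate the marginals from the dependence structure.
    # 1) Empirical marginal CDF (rank transform) at each location.
    ranks = stats.rankdata(samples, axis=0)
    u = ranks / (n + 1.0)
    # 2) Map to standard-normal scores and estimate the copula correlation matrix.
    g = stats.norm.ppf(u)
    R = np.corrcoef(g, rowvar=False)

    # 3) Draw new correlated uniforms from the copula and push them back through the
    #    empirical marginal quantiles, preserving each location's own distribution type.
    z_new = rng.multivariate_normal(np.zeros(3), R, size=n)
    u_new = stats.norm.cdf(z_new)
    resampled = np.column_stack([np.quantile(samples[:, j], u_new[:, j]) for j in range(3)])

    print("fitted copula correlation:\n", np.round(R, 2))
    print("rank correlation of resampled field:\n", np.round(stats.spearmanr(resampled)[0], 2))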

  7. Data synthesis and display programs for wave distribution function analysis

    Science.gov (United States)

    Storey, L. R. O.; Yeh, K. J.

    1992-01-01

    At the National Space Science Data Center (NSSDC), software was written to synthesize and display artificial data for use in developing the methodology of wave distribution function analysis. The software comprises two separate interactive programs, one for data synthesis and the other for data display.

  8. Dist-Orc: A Rewriting-based Distributed Implementation of Orc with Formal Analysis

    Directory of Open Access Journals (Sweden)

    José Meseguer

    2010-09-01

    Full Text Available Orc is a theory of orchestration of services that allows structured programming of distributed and timed computations. Several formal semantics have been proposed for Orc, including a rewriting logic semantics developed by the authors. Orc also has a fully fledged implementation in Java with functional programming features. However, as with descriptions of most distributed languages, there exists a fairly substantial gap between Orc's formal semantics and its implementation, in that: (i) programs in Orc are not easily deployable in a distributed implementation just by using Orc's formal semantics, and (ii) they are not readily formally analyzable at the level of a distributed Orc implementation. In this work, we overcome problems (i) and (ii) for Orc. Specifically, we describe an implementation technique based on rewriting logic and Maude that narrows this gap considerably. The enabling feature of this technique is Maude's support for external objects through TCP sockets. We describe how sockets are used to implement Orc site calls and returns, and to provide real-time timing information to Orc expressions and sites. We then show how Orc programs in the resulting distributed implementation can be formally analyzed at a reasonable level of abstraction by defining an abstract model of time and the socket communication infrastructure, and discuss the assumptions under which the analysis can be deemed correct. Finally, the distributed implementation and the formal analysis methodology are illustrated with a case study.

  9. A Preliminary Study on Sensitivity and Uncertainty Analysis with Statistic Method: Uncertainty Analysis with Cross Section Sampling from Lognormal Distribution

    Energy Technology Data Exchange (ETDEWEB)

    Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung [Hanyang Univ., Seoul (Korea, Republic of); Noh, Jae Man [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

    The uncertainty evaluation with the statistical method is performed by repeating the transport calculation while sampling the directly perturbed nuclear data. Hence, a reliable uncertainty result can be obtained by analyzing the results of numerous transport calculations. One known problem in uncertainty analysis with the statistical approach is that sampling the cross sections from a normal (Gaussian) distribution with a relatively large standard deviation leads to sampling errors, such as the sampling of negative cross sections. Some correction methods have been noted; however, these methods can distort the distribution of the sampled cross sections. In this study, a method for sampling the nuclear data from a lognormal distribution is proposed. Criticality calculations with the sampled nuclear data are then performed, and the results are compared with those from the normal distribution conventionally used in previous studies. The statistical sampling method based on the lognormal distribution was proposed to increase the sampling accuracy without negative-sampling errors, and a stochastic cross section sampling and writing program was developed. For the sensitivity and uncertainty analysis, the cross section sampling was pursued with both the normal and the lognormal distribution. The uncertainties caused by the covariance of the (n,γ) cross sections were evaluated by solving the GODIVA problem. The results show that the sampling method with the lognormal distribution can efficiently solve the negative-sampling problem referred to in previous studies. It is expected that this study will contribute to increasing the accuracy of sampling-based uncertainty analysis.
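
    As a minimal numerical illustration of the sampling idea (not the authors' code), the sketch below draws perturbed cross sections from a lognormal distribution whose mean and relative standard deviation match an assumed nominal value and uncertainty, so that negative samples cannot occur, and contrasts it with normal sampling.

    import numpy as np

    rng = np.random.default_rng(42)

    sigma_nominal = 1.2      # nominal cross section (barns) -- assumed value
    rel_std = 0.35           # relative standard deviation from covariance data -- assumed

    # Lognormal parameters chosen so the sampled mean and standard deviation
    # reproduce the nominal value and its absolute uncertainty.
    s = np.sqrt(np.log(1.0 + rel_std ** 2))          # log-space std
    mu = np.log(sigma_nominal) - 0.5 * s ** 2        # log-space mean

    samples_logn = rng.lognormal(mean=mu, sigma=s, size=100_000)
    samples_norm = rng.normal(sigma_nominal, rel_std * sigma_nominal, size=100_000)

    print(f"lognormal: mean={samples_logn.mean():.3f}, "
          f"std={samples_logn.std():.3f}, negatives={np.sum(samples_logn <= 0)}")
    print(f"normal   : mean={samples_norm.mean():.3f}, "
          f"std={samples_norm.std():.3f}, negatives={np.sum(samples_norm <= 0)}")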

  10. A Preliminary Study on Sensitivity and Uncertainty Analysis with Statistic Method: Uncertainty Analysis with Cross Section Sampling from Lognormal Distribution

    International Nuclear Information System (INIS)

    Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung; Noh, Jae Man

    2013-01-01

    The uncertainty evaluation with the statistical method is performed by repeating the transport calculation while sampling the directly perturbed nuclear data. Hence, a reliable uncertainty result can be obtained by analyzing the results of numerous transport calculations. One known problem in uncertainty analysis with the statistical approach is that sampling the cross sections from a normal (Gaussian) distribution with a relatively large standard deviation leads to sampling errors, such as the sampling of negative cross sections. Some correction methods have been noted; however, these methods can distort the distribution of the sampled cross sections. In this study, a method for sampling the nuclear data from a lognormal distribution is proposed. Criticality calculations with the sampled nuclear data are then performed, and the results are compared with those from the normal distribution conventionally used in previous studies. The statistical sampling method based on the lognormal distribution was proposed to increase the sampling accuracy without negative-sampling errors, and a stochastic cross section sampling and writing program was developed. For the sensitivity and uncertainty analysis, the cross section sampling was pursued with both the normal and the lognormal distribution. The uncertainties caused by the covariance of the (n,γ) cross sections were evaluated by solving the GODIVA problem. The results show that the sampling method with the lognormal distribution can efficiently solve the negative-sampling problem referred to in previous studies. It is expected that this study will contribute to increasing the accuracy of sampling-based uncertainty analysis.

  11. Localized foreign body granulomas of the breast : clinical and mammographic findings

    International Nuclear Information System (INIS)

    Choi, Dong Il; Han, Boo Kyung; Choe, Yeon Hyeon; Park, Jeong Mi; Yang, Jung Hyun; Nam, Seok Jin

    1998-01-01

    The purpose of this study is to evaluate the clinical and radiographic findings of localized foreign body (FB) granulomas on mammograms. The study involved 13 patients with localized FB granulomas on mammograms; their history of mammoplasty or other plastic procedures was obtained by telephone interview. Two radiologists analyzed the location and morphology of the FB granulomas and the presence of associated linear densities or parenchymal distortion on mammograms. Four patients underwent ultrasonography. No patient had a history of mammoplasty. All 13, however, had a history of a plastic procedure three to 22 (average, 12) years previously: in nine patients, foreign materials including liquid silicone and oils such as paraffin had been injected into the anterior neck area. The resulting FB granulomas were distributed bilaterally in nine patients; they were noted in the upper inner portions of the breasts, with densities suggesting fibrosis. There was no calcification or parenchymal distortion, though in three cases the masses were palpable. Ultrasonography revealed several anechoic nodules with posterior enhancement in the subcutaneous fatty layers, and in one case 0.2 cc of an oily droplet was aspirated under ultrasonographic guidance. Localized FB granulomas of the breast can be caused by the migration of foreign material from cervicofacial areas. Mammography showed a characteristic distribution in the upper inner portions, and the findings were similar to those of mild interstitial mammoplasty. (author). 13 refs., 2 figs

  12. Development of a web service for analysis in a distributed network.

    Science.gov (United States)

    Jiang, Xiaoqian; Wu, Yuan; Marsolo, Keith; Ohno-Machado, Lucila

    2014-01-01

    We describe functional specifications and practicalities in the software development process for a web service that allows the construction of the multivariate logistic regression model, Grid Logistic Regression (GLORE), by aggregating partial estimates from distributed sites, with no exchange of patient-level data. We recently developed and published a web service for model construction and data analysis in a distributed environment. This recent paper provided an overview of the system that is useful for users, but included very few details that are relevant for biomedical informatics developers or network security personnel who may be interested in implementing this or similar systems. We focus here on how the system was conceived and implemented. We followed a two-stage development approach by first implementing the backbone system and incrementally improving the user experience through interactions with potential users during the development. Our system went through various stages such as concept proof, algorithm validation, user interface development, and system testing. We used the Zoho Project management system to track tasks and milestones. We leveraged Google Code and Apache Subversion to share code among team members, and developed an applet-servlet architecture to support the cross platform deployment. During the development process, we encountered challenges such as Information Technology (IT) infrastructure gaps and limited team experience in user-interface design. We figured out solutions as well as enabling factors to support the translation of an innovative privacy-preserving, distributed modeling technology into a working prototype. Using GLORE (a distributed model that we developed earlier) as a pilot example, we demonstrated the feasibility of building and integrating distributed modeling technology into a usable framework that can support privacy-preserving, distributed data analysis among researchers at geographically dispersed institutes.
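
    The core idea, sites sharing only aggregate statistics rather than patient-level rows, can be sketched as a federated Newton-Raphson logistic regression in which each site contributes its local gradient and Hessian; this is a generic illustration on synthetic data, not the GLORE web-service code or its API.

    import numpy as np

    rng = np.random.default_rng(1)

    def local_stats(X, y, beta):
        """Per-site contribution: gradient and Hessian of the logistic log-likelihood.
        Only these aggregates (a p-vector and a p x p matrix) leave the site."""
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        grad = X.T @ (y - p)
        hess = -(X * (p * (1.0 - p))[:, None]).T @ X
        return grad, hess

    # Three "sites" with synthetic patient-level data that is never pooled directly.
    true_beta = np.array([-0.5, 1.2, -0.8])
    sites = []
    for n in (300, 500, 400):
        X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
        y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ true_beta)))
        sites.append((X, y))

    # Central coordinator: Newton-Raphson on the summed gradients and Hessians.
    beta = np.zeros(3)
    for _ in range(25):
        grads, hesss = zip(*(local_stats(X, y, beta) for X, y in sites))
        step = np.linalg.solve(sum(hesss), -sum(grads))
        beta = beta + step
        if np.max(np.abs(step)) < 1e-8:
            break

    print("true coefficients  :", true_beta)
    print("federated estimate :", np.round(beta, 3))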

  13. Development of a Web Service for Analysis in a Distributed Network

    Science.gov (United States)

    Jiang, Xiaoqian; Wu, Yuan; Marsolo, Keith; Ohno-Machado, Lucila

    2014-01-01

    Objective: We describe functional specifications and practicalities in the software development process for a web service that allows the construction of the multivariate logistic regression model, Grid Logistic Regression (GLORE), by aggregating partial estimates from distributed sites, with no exchange of patient-level data. Background: We recently developed and published a web service for model construction and data analysis in a distributed environment. This recent paper provided an overview of the system that is useful for users, but included very few details that are relevant for biomedical informatics developers or network security personnel who may be interested in implementing this or similar systems. We focus here on how the system was conceived and implemented. Methods: We followed a two-stage development approach by first implementing the backbone system and incrementally improving the user experience through interactions with potential users during the development. Our system went through various stages such as concept proof, algorithm validation, user interface development, and system testing. We used the Zoho Project management system to track tasks and milestones. We leveraged Google Code and Apache Subversion to share code among team members, and developed an applet-servlet architecture to support the cross platform deployment. Discussion: During the development process, we encountered challenges such as Information Technology (IT) infrastructure gaps and limited team experience in user-interface design. We figured out solutions as well as enabling factors to support the translation of an innovative privacy-preserving, distributed modeling technology into a working prototype. Conclusion: Using GLORE (a distributed model that we developed earlier) as a pilot example, we demonstrated the feasibility of building and integrating distributed modeling technology into a usable framework that can support privacy-preserving, distributed data analysis among researchers at geographically dispersed institutes.

  14. AspectKE*: Security Aspects with Program Analysis for Distributed Systems

    DEFF Research Database (Denmark)

    2010-01-01

    AspectKE* is the first distributed AOP language based on a tuple space system. It is designed to enforce security policies on applications containing untrusted processes. One of its key features is high-level predicates that extract the results of static program analysis. These predicates provide ...

  15. Data intensive high energy physics analysis in a distributed cloud

    International Nuclear Information System (INIS)

    Charbonneau, A; Impey, R; Podaima, W; Agarwal, A; Anderson, M; Armstrong, P; Fransham, K; Gable, I; Harris, D; Leavett-Brown, C; Paterson, M; Sobie, R J; Vliet, M

    2012-01-01

    We show that distributed Infrastructure-as-a-Service (IaaS) compute clouds can be effectively used for the analysis of high energy physics data. We have designed a distributed cloud system that works with any application using large input data sets requiring a high throughput computing environment. The system uses IaaS-enabled science and commercial clusters in Canada and the United States. We describe the process in which a user prepares an analysis virtual machine (VM) and submits batch jobs to a central scheduler. The system boots the user-specific VM on one of the IaaS clouds, runs the jobs and returns the output to the user. The user application accesses a central database for calibration data during the execution of the application. Similarly, the data is located in a central location and streamed by the running application. The system can easily run one hundred simultaneous jobs in an efficient manner and should scale to many hundreds and possibly thousands of user jobs.

  16. Data intensive high energy physics analysis in a distributed cloud

    Science.gov (United States)

    Charbonneau, A.; Agarwal, A.; Anderson, M.; Armstrong, P.; Fransham, K.; Gable, I.; Harris, D.; Impey, R.; Leavett-Brown, C.; Paterson, M.; Podaima, W.; Sobie, R. J.; Vliet, M.

    2012-02-01

    We show that distributed Infrastructure-as-a-Service (IaaS) compute clouds can be effectively used for the analysis of high energy physics data. We have designed a distributed cloud system that works with any application using large input data sets requiring a high throughput computing environment. The system uses IaaS-enabled science and commercial clusters in Canada and the United States. We describe the process in which a user prepares an analysis virtual machine (VM) and submits batch jobs to a central scheduler. The system boots the user-specific VM on one of the IaaS clouds, runs the jobs and returns the output to the user. The user application accesses a central database for calibration data during the execution of the application. Similarly, the data is located in a central location and streamed by the running application. The system can easily run one hundred simultaneous jobs in an efficient manner and should scale to many hundreds and possibly thousands of user jobs.

  17. Nonlinear analysis of field distribution in electric motor with periodicity conditions

    Energy Technology Data Exchange (ETDEWEB)

    Stabrowski, M M; Sikora, J

    1981-01-01

    Numerical analysis of the electromagnetic field distribution in a linear-motion tubular electric motor has been performed with the aid of the finite element method. Two Fortran programs for the solution of DBBF and BF large linear symmetric equation systems have been developed for the purposes of this analysis. A new iterative algorithm, taking into account iron nonlinearity and periodicity conditions, has been introduced. Final results of the analysis, in the form of induction diagrams and the motor driving force, are directly useful for motor designers.

  18. Implementation of force distribution analysis for molecular dynamics simulations

    Directory of Open Access Journals (Sweden)

    Seifert Christian

    2011-04-01

    Full Text Available Background The way mechanical stress is distributed inside and propagated by proteins and other biopolymers largely defines their function. Yet, determining the network of interactions propagating internal strain remains a challenge for both experiment and theory. Based on molecular dynamics simulations, we developed force distribution analysis (FDA), a method that allows visualizing strain propagation in macromolecules. Results To be immediately applicable to a wide range of systems, FDA was implemented as an extension to Gromacs, a commonly used package for molecular simulations. The FDA code comes with an easy-to-use command line interface and can be applied directly to every system built using Gromacs. We provide an additional R package with functions for advanced statistical analysis and presentation of the FDA data. Conclusions Using FDA, we were able to explain the origin of mechanical robustness in immunoglobulin domains and silk fibers. By elucidating the propagation of internal strain upon ligand binding, we previously also successfully revealed the functionality of a stiff allosteric protein. FDA thus has the potential to be a valuable tool in the investigation and rational design of mechanical properties in proteins and nano-materials.

  19. Distributed mobility management - framework & analysis

    NARCIS (Netherlands)

    Liebsch, M.; Seite, P.; Karagiannis, Georgios

    2013-01-01

    Mobile operators consider the distribution of mobility anchors to enable offloading some traffic from their core network. The Distributed Mobility Management (DMM) Working Group is investigating the impact of decentralized mobility management on existing protocol solutions, while taking into account ...

  20. Analysis of distribution systems with a high penetration of distributed generation

    DEFF Research Database (Denmark)

    Lund, Torsten

    Since the mid eighties, a large number of wind turbines and distributed combined heat and power plants (CHPs) have been connected to the Danish power system. Especially in the Western part, comprising Jutland and Funen, the penetration is high compared to the load demand. In some periods the wind power alone can cover the entire load demand. The objective of the work is to investigate the influence of wind power and distributed combined heat and power production on the operation of the distribution systems. Where other projects have focused on the modeling and control of the generators and prime movers, the focus of this project is on the operation of an entire distribution system with several wind farms and CHPs. Firstly, the subject of allocation of power system losses in a distribution system with distributed generation is treated. A new approach to loss allocation based on current injections ...

  1. A framework for establishing the technical efficiency of Electricity Distribution Counties (EDCs) using Data Envelopment Analysis

    International Nuclear Information System (INIS)

    Mullarkey, Shane; Caulfield, Brian; McCormack, Sarah; Basu, Biswajit

    2015-01-01

    Highlights: • Six models are employed to establish the technical efficiency of Electricity Distribution Counties. • A diagnostic parameter is incorporated to account for differences across Electricity Distribution Counties. • The amalgamation of Electricity Distribution Counties leads to improved efficiency in the production of energy. - Abstract: European energy market liberalization has entailed the restructuring of electricity power markets by unbundling electricity generation, transmission, distribution and supply activities and by introducing competition into electricity generation. Under these new electricity market regimes, it is important to have an evaluation tool capable of examining the impacts of these market changes. The adoption of Data Envelopment Analysis as a form of benchmarking for electricity distribution regulation is one method to conduct this analysis. This paper applies a Data Envelopment Analysis framework to the electricity distribution network in Ireland to explore the merits of the approach and to determine the technical efficiency and the potential scope for efficiency improvements through reorganization and amalgamation of the distribution network. The results presented show that overall grid efficiency is improved through this restructuring. A diagnostic parameter is defined and pursued to account for aberrations across Electricity Distribution Counties, as opposed to the traditionally employed environmental variables. The adoption of this diagnostic parameter leads to a more intuitive understanding of Electricity Distribution Counties.
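
    A minimal input-oriented, constant-returns-to-scale DEA model (the classic CCR formulation, solved as a linear program) illustrates how such technical efficiency scores are obtained; the five hypothetical distribution counties, their inputs and outputs, and the model choice are assumptions, not the paper's specification.

    import numpy as np
    from scipy.optimize import linprog

    # Toy data: rows = Electricity Distribution Counties (DMUs); inputs could be, e.g.,
    # network length and operating cost, outputs energy delivered and customers served.
    X = np.array([[120.0, 45.0],   # inputs
                  [ 90.0, 50.0],
                  [150.0, 60.0],
                  [ 70.0, 30.0],
                  [110.0, 55.0]])
    Y = np.array([[300.0, 40.0],   # outputs
                  [280.0, 38.0],
                  [330.0, 50.0],
                  [250.0, 33.0],
                  [260.0, 36.0]])
    n, m = X.shape
    _, s = Y.shape

    def ccr_efficiency(o):
        """Input-oriented CCR efficiency of DMU o: minimize theta such that a composite
        unit built from the lambdas uses at most theta*inputs_o and produces at least outputs_o."""
        c = np.concatenate(([1.0], np.zeros(n)))            # variables: [theta, lambda_1..lambda_n]
        A_ub, b_ub = [], []
        for i in range(m):                                  # sum_j lam_j * x_ij - theta * x_io <= 0
            A_ub.append(np.concatenate(([-X[o, i]], X[:, i])))
            b_ub.append(0.0)
        for r in range(s):                                  # -sum_j lam_j * y_rj <= -y_ro
            A_ub.append(np.concatenate(([0.0], -Y[:, r])))
            b_ub.append(-Y[o, r])
        bounds = [(None, None)] + [(0.0, None)] * n
        res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub, bounds=bounds, method="highs")
        return res.x[0]

    for o in range(n):
        print(f"DMU {o}: technical efficiency = {ccr_efficiency(o):.3f}")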

  2. Node-based analysis of species distributions

    DEFF Research Database (Denmark)

    Borregaard, Michael Krabbe; Rahbek, Carsten; Fjeldså, Jon

    2014-01-01

    overrepresentation score (SOS) and the geographic node divergence (GND) score, which together combine ecological and evolutionary patterns into a single framework and avoid many of the problems that characterize community phylogenetic methods in current use. This approach goes through each node in the phylogeny ... with case studies on two groups with well-described biogeographical histories: a local-scale community data set of hummingbirds in the North Andes, and a large-scale data set of the distribution of all species of New World flycatchers. The node-based analysis of these two groups generates a set of intuitively interpretable patterns that are consistent with current biogeographical knowledge. Importantly, the results are statistically tractable, opening many possibilities for their use in analyses of evolutionary, historical and spatial patterns of species diversity. The method is implemented ...

  3. Growing axons analysis by using Granulometric Size Distribution

    International Nuclear Information System (INIS)

    Gonzalez, Mariela A; Ballarin, Virginia L; Rapacioli, Melina; CelIn, A R; Sanchez, V; Flores, V

    2011-01-01

    Neurite growth (neuritogenesis) in vitro is a common methodology in the field of developmental neurobiology. Morphological analyses of growing neurites are usually difficult because their thinness and low contrast usually prevent clear observation of their shape, number, length and spatial orientation. This paper presents the use of the granulometric size distribution to automatically obtain information about the shape, size and spatial orientation of growing axons in tissue cultures. The results presented here show that the granulometric size distribution is a very useful morphological tool, since it allows the automatic detection of growing axons and the precise characterization of a relevant parameter indicative of the spatial orientation of axonal growth, namely the angle of deviation of the growing direction. The developed algorithms automatically quantify this orientation, facilitating the analysis of these images, which is important given the large number of images that need to be processed in this type of study.
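
    Granulometry by morphological openings of increasing size underlies this kind of measurement; the sketch below computes a granulometric size distribution (pattern spectrum) for a synthetic blob image with scikit-image, as a generic illustration rather than the authors' pipeline.

    import numpy as np
    from skimage.draw import disk as draw_disk
    from skimage.morphology import opening, disk

    # Synthetic binary image with bright blobs of two characteristic radii (assumed data).
    img = np.zeros((256, 256), dtype=float)
    rng = np.random.default_rng(7)
    for r in (4, 4, 4, 4, 10, 10):
        rr, cc = draw_disk((rng.integers(20, 236), rng.integers(20, 236)), r, shape=img.shape)
        img[rr, cc] = 1.0

    # Granulometry: total image "mass" remaining after opening with disks of growing radius.
    radii = np.arange(1, 16)
    volumes = np.array([opening(img, disk(r)).sum() for r in radii])

    # Pattern spectrum: the (negative) derivative of the granulometric curve; peaks mark
    # the dominant object sizes, here expected near radii 4 and 10.
    pattern_spectrum = -np.diff(volumes, prepend=img.sum())
    for r, p in zip(radii, pattern_spectrum):
        print(f"radius {r:2d}: removed mass {p:8.1f}")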

  4. Modelling and analysis of solar cell efficiency distributions

    Science.gov (United States)

    Wasmer, Sven; Greulich, Johannes

    2017-08-01

    We present an approach to model the distribution of solar cell efficiencies achieved in production lines based on numerical simulations, metamodeling and Monte Carlo simulations. We validate our methodology using the example of an industrially feasible p-type multicrystalline silicon “passivated emitter and rear cell” process. Applying the metamodel, we investigate the impact of each input parameter on the distribution of cell efficiencies in a variance-based sensitivity analysis, identifying the parameters and processes that need to be improved and controlled most accurately. We show that if these could be optimized, the mean cell efficiency of the examined process would increase from 17.62% ± 0.41% to 18.48% ± 0.09%. As the method relies on advanced characterization and simulation techniques, we furthermore introduce a simplification that enhances applicability by requiring only two common measurements of finished cells. The presented approaches can be especially helpful for ramping up production, but can also be applied to enhance established manufacturing.

  5. Development of a site analysis tool for distributed wind projects

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, Shawn [The Cadmus Group, Inc., Waltham MA (United States)

    2012-02-28

    The Cadmus Group, Inc., in collaboration with the National Renewable Energy Laboratory (NREL) and Encraft, was awarded a grant from the Department of Energy (DOE) to develop a site analysis tool for distributed wind technologies. As the principal investigator for this project, Mr. Shawn Shaw was responsible for overall project management, direction, and technical approach. The product resulting from this project is the Distributed Wind Site Analysis Tool (DSAT), a software tool for analyzing proposed sites for distributed wind technology (DWT) systems. This user-friendly tool supports the long-term growth and stability of the DWT market by providing reliable, realistic estimates of site and system energy output and feasibility. DSAT, which is accessible online and requires no purchase or download of software, is available in two account types. Standard: this free account allows the user to analyze a limited number of sites and to produce a system performance report for each. Professional: for a small annual fee, users can analyze an unlimited number of sites, produce system performance reports, and generate other customizable reports containing key information such as visual influence and wind resources. The tool’s interactive maps allow users to create site models that incorporate the obstructions and terrain types present. Users can generate site reports immediately after entering the requisite site information. Ideally, this tool also educates users regarding good site selection and effective evaluation practices.

  6. Fractal analysis of the dark matter and gas distributions in the Mare-Nostrum universe

    International Nuclear Information System (INIS)

    Gaite, José

    2010-01-01

    We develop a method of multifractal analysis of N-body cosmological simulations that improves on the customary counts-in-cells method by taking special care of the effects of discreteness and large-scale homogeneity. The analysis of the Mare-Nostrum simulation with our method provides strong evidence of self-similar multifractal distributions of dark matter and gas, with a halo mass function that is of Press-Schechter type but has a power-law exponent -2, as corresponds to a multifractal. Furthermore, our analysis shows that the dark matter and gas distributions are indistinguishable as multifractals. To determine if there is any gas biasing, we calculate the cross-correlation coefficient, with negative but inconclusive results. Hence, we develop an effective Bayesian analysis connected with information theory, which clearly demonstrates that the gas is biased over a long range of scales, up to the scale of homogeneity. However, entropic measures related to the Bayesian analysis show that this gas bias is small (in a precise sense) and is such that the fractal singularities of both distributions coincide and are identical. We conclude that this common multifractal cosmic web structure is determined by the dynamics and is independent of the initial conditions.
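
    A counts-in-cells estimate of the generalized dimensions D_q is the computational core of such a multifractal analysis; the sketch below generates points from a toy 2-D multiplicative cascade (an assumption standing in for simulation particles) and estimates D_q from the scaling of the partition function, without the discreteness corrections the paper develops.

    import numpy as np

    rng = np.random.default_rng(3)

    def cascade_points(n_points, n_levels, weights):
        """Sample points from a 2-D multiplicative binomial cascade measure (toy multifractal)."""
        cell = np.zeros((n_points, 2))
        size = 1.0
        for _ in range(n_levels):
            size /= 2.0
            quad = rng.choice(4, size=n_points, p=weights)   # pick a quadrant with the cascade weights
            cell[:, 0] += (quad % 2) * size
            cell[:, 1] += (quad // 2) * size
        return cell + rng.random((n_points, 2)) * size

    weights = [0.4, 0.3, 0.2, 0.1]
    pts = cascade_points(200_000, 10, weights)

    def partition_function(points, q, eps):
        """Counts-in-cells estimate of sum_i p_i^q at cell size eps."""
        idx = np.floor(points / eps).astype(np.int64)
        keys = idx[:, 0] * 10_000_000 + idx[:, 1]
        counts = np.unique(keys, return_counts=True)[1]
        p = counts / counts.sum()
        return np.sum(p ** q)

    # Generalized dimensions from the scaling sum_i p_i^q ~ eps^{(q-1) D_q}.
    eps_list = 2.0 ** -np.arange(3, 8)
    for q in (0.0, 2.0, 3.0):
        logZ = [np.log(partition_function(pts, q, e)) for e in eps_list]
        slope = np.polyfit(np.log(eps_list), logZ, 1)[0]
        exact = -np.log2(np.sum(np.array(weights) ** q)) / (q - 1)   # analytic cascade value
        print(f"q={q:.0f}: D_q estimate = {slope / (q - 1):.2f}  (cascade value {exact:.2f})")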

  7. Local breast density assessment using reacquired mammographic images.

    Science.gov (United States)

    García, Eloy; Diaz, Oliver; Martí, Robert; Diez, Yago; Gubern-Mérida, Albert; Sentís, Melcior; Martí, Joan; Oliver, Arnau

    2017-08-01

    The aim of this paper is to evaluate the spatial glandular volumetric tissue distribution as well as the density measures provided by Volpara™ using a dataset composed of repeated pairs of mammograms, where each pair was acquired in a short time frame and in a slightly changed position of the breast. We conducted a retrospective analysis of 99 pairs of repeatedly acquired full-field digital mammograms from 99 different patients. The commercial software Volpara™ Density Maps (Volpara Solutions, Wellington, New Zealand) was used to estimate both the global and the local glandular tissue distribution in each image. The global measures provided by Volpara™, such as breast volume, volume of glandular tissue, and volumetric breast density, were compared between the two acquisitions. The evaluation of the local glandular information was performed using histogram similarity metrics, such as intersection and correlation, and local measures, such as statistics from the difference image and local gradient correlation measures. Global measures showed a high correlation (breast volume R=0.99, volume of glandular tissue R=0.94, and volumetric breast density R=0.96) regardless of the anode/filter material. Similarly, the histogram intersection and correlation metrics showed that, for each pair, the images share a high degree of information. Regarding the local distribution of glandular tissue, small changes in the angle of view do not yield significant differences in the glandular pattern, whilst changes in the breast thickness between the two acquisitions affect the spatial parenchymal distribution. This study indicates that Volpara™ Density Maps is reliable in estimating the local glandular tissue distribution and can be used for its assessment and follow-up. Volpara™ Density Maps is robust to small variations of the acquisition angle and to the beam energy, although divergences arise due to different breast compression conditions.
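
    The histogram intersection and correlation metrics used above are straightforward to compute from two local density maps; the sketch below applies them to synthetic maps that mimic a repeated acquisition (the array names, sizes and noise level are assumptions, and this is not Volpara™ output).

    import numpy as np

    rng = np.random.default_rng(5)

    # Two synthetic "density maps" of the same breast, the second slightly perturbed to
    # mimic a repeated acquisition in a slightly different position (assumed data).
    density_a = np.clip(rng.gamma(shape=2.0, scale=0.08, size=(200, 150)), 0, 1)
    density_b = np.clip(density_a + rng.normal(0, 0.02, size=density_a.shape), 0, 1)

    bins = np.linspace(0.0, 1.0, 51)
    h_a, _ = np.histogram(density_a, bins=bins)
    h_b, _ = np.histogram(density_b, bins=bins)
    h_a = h_a / h_a.sum()
    h_b = h_b / h_b.sum()

    # Histogram intersection: 1.0 means identical normalized histograms.
    intersection = np.minimum(h_a, h_b).sum()
    # Histogram correlation (Pearson correlation of the two binned distributions).
    correlation = np.corrcoef(h_a, h_b)[0, 1]
    # A simple local measure: statistics of the per-pixel difference image.
    diff = density_b - density_a

    print(f"intersection = {intersection:.3f}, correlation = {correlation:.3f}")
    print(f"difference image: mean = {diff.mean():+.4f}, std = {diff.std():.4f}")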

  8. Quantitative analysis of tritium distribution in austenitic stainless steels welds

    International Nuclear Information System (INIS)

    Roustila, A.; Kuromoto, N.; Brass, A.M.; Chene, J.

    1994-01-01

    Tritium autoradiography was used to study the tritium distribution in laser and arc (TIG) weldments performed on tritiated AISI 316 samples. Quantitative values of the local tritium concentration were obtained from the microdensitometric analysis of the autoradiographs. This procedure was used to map the tritium concentration in the samples before and after laser and TIG treatments. The effect of the detritiation conditions and of welding on the tritium distribution in the material is extensively characterized. The results illustrate the interest of the technique for predicting a possible embrittlement of the material associated with a local enhancement of the tritium concentration and the presence of helium 3 generated by tritium decay. ((orig.))

  9. Detecting Distributed Network Traffic Anomaly with Network-Wide Correlation Analysis

    Directory of Open Access Journals (Sweden)

    Yang Dan

    2008-12-01

    Full Text Available A distributed network traffic anomaly is an abnormal traffic behavior that involves many links of a network and is caused by the same source (e.g., a DDoS attack or worm propagation). The anomaly transiting on a single link might be unnoticeable and hard to detect, while the anomalous aggregation across many links can be prevailing and does more harm to the network. Exploiting the similar features that a distributed traffic anomaly exhibits on many links, this paper proposes a network-wide detection method based on anomalous correlation analysis of the instantaneous parameters of traffic signals. In our method, the instantaneous parameters of the traffic signals are first computed, and their network-wide anomalous space is then extracted via traffic prediction. Finally, an anomaly is detected by a global correlation coefficient of the anomalous space. Our evaluation using Abilene traffic traces demonstrates the excellent performance of this approach for distributed traffic anomaly detection.
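
    The detection pipeline, per-link prediction residuals followed by a global correlation score, can be sketched as follows on synthetic link traffic; the moving-average predictor, window length and threshold are simplifying assumptions rather than the paper's exact parameter extraction.

    import numpy as np

    rng = np.random.default_rng(11)

    T, L = 600, 8                     # time bins and number of monitored links
    t = np.arange(T)
    # Baseline diurnal-like traffic per link plus independent noise (synthetic data).
    traffic = (100 + 30 * np.sin(2 * np.pi * t / 144))[None, :] * rng.uniform(0.5, 1.5, (L, 1))
    traffic += rng.normal(0, 5, (L, T))
    # Inject a distributed anomaly: a small simultaneous surge on *all* links.
    traffic[:, 400:430] += 20

    # 1) Per-link prediction by a moving average; the residuals form the "anomalous space".
    win = 24
    kernel = np.ones(win) / win
    pred = np.vstack([np.convolve(x, kernel, mode="same") for x in traffic])
    resid = traffic - pred

    # 2) Global correlation score: mean pairwise correlation of residuals in a sliding window.
    def global_corr(residuals):
        c = np.corrcoef(residuals)
        iu = np.triu_indices_from(c, k=1)
        return c[iu].mean()

    window = 30
    scores = np.array([global_corr(resid[:, i:i + window]) for i in range(T - window)])
    threshold = scores[:300].mean() + 3 * scores[:300].std()   # assumed detection threshold
    alarms = np.where(scores > threshold)[0]
    print("windows flagged as network-wide anomalies start at bins:", alarms[:10], "...")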

  10. Benefits of the quality assured double and arbitration reading of mammograms in the early diagnosis of breast cancer in symptomatic women

    International Nuclear Information System (INIS)

    Waldmann, Annika; Katalinic, Alexander; Kapsimalakou, Smaragda; Grande-Nagel, Isabell; Barkhausen, Joerg; Vogt, Florian M.; Stoeckelhuber, Beate M.; Fischer, Dorothea

    2012-01-01

    The aim was to assess the benefits of double and arbitration reading with regard to tumour detection rates, the percentage of in situ tumours, and the number of patients needed to be sent for expert reading (number needed to treat; NNT) to find one additional tumour. QuaMaDi is a quality assured breast cancer diagnosis programme with two-view mammography (craniocaudal, mediolateral oblique), routine ultrasound imaging in case of breast density ACR 3 or 4, and independent double reading of all images. A consecutive sample of symptomatic women, i.e. women at risk for breast cancer, women aged 70 and above, and/or women with preceding BI-RADS III findings, was analysed. 28,558 mammograms were performed (mean age of the women: 57.3 [standard deviation: 12.3] years). Discordant findings were present in 3,837 double readings and were sent for arbitration reading. After histopathological assessment, 52 carcinomas were found (32% of them in situ). These carcinomas accounted for 1.8 tumours per 1,000 examinations in the total cohort and increased the tumour detection rate up to 16.4/1,000. The NNT in discordant cases was 74. Double and arbitration reading appears to be a useful tool to ensure the quality of early detection of breast lesions in symptomatic women during indication-based, standardised mammography. • Quality assured breast cancer diagnosis is feasible outside organised screening structures. (orig.)

  11. AMIC: an expandable integrated analog front-end for light distribution moments analysis

    OpenAIRE

    SPAGGIARI, MICHELE; Herrero Bosch, Vicente; Lerche, Christoph Werner; Aliaga Varea, Ramón José; Monzó Ferrer, José María; Gadea Gironés, Rafael

    2011-01-01

    In this article we introduce AMIC (Analog Moments Integrated Circuit), a novel analog Application Specific Integrated Circuit (ASIC) front-end for Positron Emission Tomography (PET) applications. Its working principle is based on mathematical analysis of light distribution through moments calculation. Each moment provides useful information about light distribution, such as energy, position, depth of interaction, skewness (deformation due to border effect) etc. A current buffer delivers a cop...

  12. Aeroelastic Analysis of a Distributed Electric Propulsion Wing

    Science.gov (United States)

    Massey, Steven J.; Stanford, Bret K.; Wieseman, Carol D.; Heeg, Jennifer

    2017-01-01

    An aeroelastic analysis of a prototype distributed electric propulsion wing is presented. Results using MSC Nastran (Registered Trademark) doublet lattice aerodynamics are compared to those based on FUN3D Reynolds Averaged Navier- Stokes aerodynamics. Four levels of grid refinement were examined for the FUN3D solutions and solutions were seen to be well converged. It was found that no oscillatory instability existed, only that of divergence, which occurred in the first bending mode at a dynamic pressure of over three times the flutter clearance condition.

  13. Analysis of previous screening examinations for patients with breast cancer

    International Nuclear Information System (INIS)

    Lee, Eun Hye; Cha, Joo Hee; Han, Dae Hee; Choi, Young Ho; Hwang, Ki Tae; Ryu, Dae Sik; Kwak, Jin Ho; Moon, Woo Kyung

    2007-01-01

    We wanted to improve the quality of subsequent screening by reviewing the previous screening examinations of breast cancer patients. Twenty-four breast cancer patients who had undergone previous screening were enrolled. All 24 had mammograms and 15 patients also had sonograms. We reviewed the screening examinations retrospectively according to the BI-RADS criteria and categorized the results into false negative, true negative, true positive and occult cancers. We also categorized the causes of false negative cancers into misperception, misinterpretation and technical factors and then analyzed the attributing factors. Review of the previous screening revealed 66.7% (16/24) false negative, 25.0% (6/24) true negative, and 8.3% (2/24) true positive cancers. False negative cancers were caused by the mammogram in 56.3% (9/16) and by the sonogram in 43.7% (7/16). For the false negative cases, all misperceptions were related to mammograms and were attributed to dense breasts, lesions located at the edge of the glandular tissue or of the image, and findings seen on only one view. Almost all misinterpretations were related to sonograms and were attributed to loose application of the final assessment. To improve the quality of breast screening, it is essential to overcome the main causes of false negative examinations, namely misperception and misinterpretation. We need systematic education and strict application of the final assessment categories of BI-RADS. For effective communication among physicians, it is also necessary to properly educate them about BI-RADS.

  14. Leontief Input-Output Method for The Fresh Milk Distribution Linkage Analysis

    Directory of Open Access Journals (Sweden)

    Riski Nur Istiqomah

    2016-11-01

    Full Text Available This research discusses linkage analysis and identifies the key sector in fresh milk distribution using the Leontief input-output method, one of the applications of mathematics in economics. The current fresh milk distribution system comprises dairy farmers → collectors → fresh milk processing industries → processed milk distributors → consumers. The distribution is then analysed with the collectors' activity merged with the fresh milk processing industry. The data used are primary and secondary data collected in June 2016 in Kecamatan Jabung, Kabupaten Malang. The collected data are analysed using the Leontief input-output matrix and the Python (PyIO 2.1) software. The result is that the merging of the collectors' and the fresh milk processing industry's activities shows high indices of forward and backward linkages. This shows that the merged activity is the key sector, which has an important role in developing all activities in the fresh milk distribution.
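
    The backward and forward linkage indices used to identify the key sector come directly from the Leontief inverse; the sketch below shows the standard Rasmussen-style calculation on a made-up three-sector coefficient matrix (the numbers and sector names are assumptions, not the study's data).

    import numpy as np

    # Technical (input) coefficient matrix A for three toy sectors:
    # 0 = dairy farming, 1 = collection + processing (merged), 2 = distribution.
    A = np.array([[0.10, 0.30, 0.05],
                  [0.20, 0.15, 0.25],
                  [0.05, 0.10, 0.10]])

    # Leontief inverse L = (I - A)^-1: total (direct + indirect) output required per
    # unit of final demand.
    L = np.linalg.inv(np.eye(3) - A)

    # Rasmussen-style linkage indices: backward = normalized column sums of L,
    # forward = normalized row sums; values above 1 mark a "key sector" on that dimension.
    col_sums = L.sum(axis=0)
    row_sums = L.sum(axis=1)
    backward = col_sums / col_sums.mean()
    forward = row_sums / row_sums.mean()

    for name, b, f in zip(["farming", "collect+process", "distribution"], backward, forward):
        flag = "key sector" if (b > 1 and f > 1) else ""
        print(f"{name:15s} backward={b:.2f} forward={f:.2f} {flag}")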

  15. Designing Sustainable Systems for Urban Freight Distribution through techniques of Multicriteria Decision Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Muerza, V.; Larrode, E.; Moreno- Jimenez, J.M.

    2016-07-01

    This paper focuses on the analysis and selection of the parameters that have a major influence on the optimization of an urban freight distribution system based on sustainable means of transport, such as electric vehicles. In addition, a procedure is studied to identify the alternatives that may exist for establishing the best urban freight distribution system, suited to the scenario considered, using the most appropriate means of transportation available. To do this, the Analytic Hierarchy Process, one of the tools of multicriteria decision analysis, has been used. In order to establish adequate planning of an urban freight distribution system using electric vehicles, three hypotheses are necessary: (i) the strategic planning of the distribution process must be established by defining the relative importance of the strategic objectives of the distribution of goods in the urban environment, in economic, technical, social and environmental terms; (ii) the operational planning that allows the achievement of the strategic objectives with the most optimized allocation of the available resources must be established; and (iii) the optimal architecture of the vehicle that best suits the operating conditions in which it will work and ensures optimum energy efficiency in operation must be determined. (Author)
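
    The Analytic Hierarchy Process reduces to deriving priority weights from pairwise comparison matrices and checking their consistency; the sketch below performs that calculation for an assumed three-criterion comparison matrix (economic, technical, socio-environmental), not the paper's actual judgments.

    import numpy as np

    # Pairwise comparison matrix (Saaty 1-9 scale) for three assumed criteria:
    # economic, technical, socio-environmental. A[i, j] = importance of i relative to j.
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    # Priority vector = principal right eigenvector of A, normalized to sum to 1.
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w = w / w.sum()

    # Consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI (RI = 0.58 for n = 3).
    n = A.shape[0]
    lambda_max = eigvals.real[k]
    CI = (lambda_max - n) / (n - 1)
    CR = CI / 0.58
    print("criteria weights (economic, technical, socio-environmental):", np.round(w, 3))
    print(f"lambda_max = {lambda_max:.3f}, CI = {CI:.3f}, CR = {CR:.3f} (CR < 0.10 is acceptable)")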

  16. The MammoGrid Project Grids Architecture

    CERN Document Server

    McClatchey, Richard; Hauer, Tamas; Estrella, Florida; Saiz, Pablo; Rogulin, Dmitri; Buncic, Predrag; Clatchey, Richard Mc; Buncic, Predrag; Manset, David; Hauer, Tamas; Estrella, Florida; Saiz, Pablo; Rogulin, Dmitri

    2003-01-01

    The aim of the recently EU-funded MammoGrid project is, in the light of emerging Grid technology, to develop a European-wide database of mammograms that will be used to develop a set of important healthcare applications and investigate the potential of this Grid to support effective co-working between healthcare professionals throughout the EU. The MammoGrid consortium intends to use a Grid model to enable distributed computing that spans national borders. This Grid infrastructure will be used for deploying novel algorithms as software directly developed or enhanced within the project. Using the MammoGrid clinicians will be able to harness the use of massive amounts of medical image data to perform epidemiological studies, advanced image processing, radiographic education and ultimately, tele-diagnosis over communities of medical "virtual organisations". This is achieved through the use of Grid-compliant services [1] for managing (versions of) massively distributed files of mammograms, for handling the distri...

  17. The Analysis of Tree Species Distribution Information Extraction and Landscape Pattern Based on Remote Sensing Images

    Directory of Open Access Journals (Sweden)

    Yi Zeng

    2017-08-01

    Full Text Available The forest ecosystem is the largest terrestrial vegetation type and plays an irreplaceable role of unique value. At the landscape scale, research on forest landscape patterns has become a current hot spot, and within it the study of forest canopy structure is very important: canopy structure determines the process and strength of forest energy flow, which in turn influences, to some extent, the adjustment of the ecosystem to climate and species diversity. The extraction of the factors influencing canopy structure and the analysis of the vegetation distribution pattern are therefore especially important. To address these problems, remote sensing technology, which is superior to other technical means because of its fine timeliness and large-scale monitoring capability, is applied in this study. Taking Lingkong Mountain as the study area, the paper uses remote sensing images to analyze the forest distribution pattern and obtain the spatial characteristics of the canopy structure distribution, and DEM data are used as the basic data for extracting the factors influencing canopy structure. The tree distribution pattern is further analyzed using terrain parameters, spatial analysis tools and quantitative simulation of surface processes. The Hydrological Analysis tool is used to build a distributed hydrological model, and the corresponding algorithms are applied to determine surface water flow paths, the river network and basin boundaries. Results show that the distribution of the dominant tree species forms patches at the landscape scale and exhibits spatial heterogeneity that is closely related to terrain factors. After overlay analysis of aspect, slope and the forest distribution pattern, the most suitable areas for stand growth and the better living conditions are obtained.

  18. Breast Cancer Outreach for Underserved Women: A Randomized Trial and Cost-Effectiveness Analysis

    National Research Council Canada - National Science Library

    Pasick, Rena

    2001-01-01

    ...+ with no mammogram past two years, and to provide their names to project staff. Women were then called by part-time staff who offered education, motivation and assistance in obtaining screening...

  19. Distributional patterns of Cecropia (Cecropiaceae): a panbiogeographic analysis

    Directory of Open Access Journals (Sweden)

    Franco Rosselli Pilar

    1997-06-01

    Full Text Available A panbiogeographic analysis of the distributional patterns of 60 species of Cecropia was carried out. Based on the distributional ranges of 36 species, we found eight generalized tracks for Cecropia, whereas the distributional patterns of the other 24 species were uninformative for the analysis. The major concentration of species of Cecropia is in the Neotropical Andean region, where there are three generalized tracks and two nodes. The northern Andes in Colombia and Ecuador are richer than the central Andes in Perú; they contain two generalized tracks, one to the west and another to the east, each formed by the individual tracks of eight species. There are four generalized tracks outside the Andean region: two in the Amazonian region, in Guayana-Pará and in Manaus, one in Roraima, one in Serra do Mar in the Atlantic forest of Brazil, and one in Central America. Speciation in Cecropia may be related to the first Andean uplift.

  20. An analysis software of tritium distribution in food and environmental water in China

    International Nuclear Information System (INIS)

    Li Wenhong; Xu Cuihua; Ren Tianshan; Deng Guilong

    2006-01-01

    Objective: The purpose of developing this analysis software for the distribution of tritium in food and environmental water is to collect tritium monitoring data, to analyze the data automatically, statistically and graphically, and to make the data easy to study and share. Methods: Based on the data obtained previously, the analysis software was written using VC++ .NET as the development tool. The software first transfers the data from EXCEL into a database, and it supports appending data, so operators can easily add new monitoring data. Results: Once the monitoring data saved as EXCEL files by the original researchers have been turned into a database, they can be accessed easily. The software provides a tool for analyzing the distribution of tritium. Conclusion: This software is a first attempt at analyzing data on tritium levels in food and environmental water in China. Data retrieval, searching and analysis become easy and direct with the software. (authors)

  1. Analysis of thrips distribution: application of spatial statistics and Kriging

    Science.gov (United States)

    John Aleong; Bruce L. Parker; Margaret Skinner; Diantha Howard

    1991-01-01

    Kriging is a statistical technique that provides predictions for spatially and temporally correlated data. Observations of thrips distribution and density in Vermont soils are made in both space and time. Traditional statistical analysis of such data assumes that the counts taken over space and time are independent, which is not necessarily true. Therefore, to analyze...
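
    Since kriging is closely related to Gaussian process regression, a compact way to sketch the idea is with scikit-learn's GaussianProcessRegressor on made-up spatial count data; the coordinates, counts and kernel settings below are assumptions, not the study's data or variogram model.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(2)

    # Synthetic sampling locations (x, y in meters) and count-like observations with a
    # spatial trend plus noise.
    coords = rng.uniform(0, 100, size=(40, 2))
    counts = (5 + 0.1 * coords[:, 0] + 3 * np.sin(coords[:, 1] / 15)
              + rng.normal(0, 1.0, size=40))

    # Ordinary-kriging-like model: an RBF spatial kernel plus a nugget (white noise) term.
    kernel = 1.0 * RBF(length_scale=20.0) + WhiteKernel(noise_level=1.0)
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(coords, counts)

    # Predict density (and its uncertainty) on a regular grid, as kriging would.
    gx, gy = np.meshgrid(np.linspace(0, 100, 5), np.linspace(0, 100, 5))
    grid = np.column_stack([gx.ravel(), gy.ravel()])
    mean, std = gp.predict(grid, return_std=True)
    print("predicted counts on grid:\n", np.round(mean.reshape(5, 5), 1))
    print("prediction std (kriging variance analogue):\n", np.round(std.reshape(5, 5), 2))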

  2. Distributed Data Analysis in the ATLAS Experiment: Challenges and Solutions

    International Nuclear Information System (INIS)

    Elmsheuser, Johannes; Van der Ster, Daniel

    2012-01-01

    The ATLAS experiment at the LHC at CERN is recording and simulating several tens of petabytes of data per year. To analyse these data the ATLAS experiment has developed and operates a mature and stable distributed analysis (DA) service on the Worldwide LHC Computing Grid. The service is actively used: more than 1400 users have submitted jobs in the year 2011 and a total of more than 1 million jobs run every week. Users are provided with a suite of tools to submit Athena, ROOT or generic jobs to the Grid, and the PanDA workload management system is responsible for their execution. The reliability of the DA service is high and steadily improving; Grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters provides user support and communicates user problems to the sites. This paper will review the state of the DA tools and services, summarize the past year of distributed analysis activity, and present the directions for future improvements to the system.

  3. Statistical analysis of the spatial distribution of galaxies and clusters

    International Nuclear Information System (INIS)

    Cappi, Alberto

    1993-01-01

    This thesis deals with the analysis of the distribution of galaxies and clusters, describing some observational problems and statistical results. The first chapter gives a theoretical introduction, aiming to describe the framework of the formation of structures and tracing the history of the Universe from the Planck time, t_P = 10^-43 s, with a temperature corresponding to 10^19 GeV, to the present epoch. The most usual statistical tools and models of the galaxy distribution, with their advantages and limitations, are described in chapter two. A study of the main observed properties of galaxy clustering, together with a detailed statistical analysis of the effects of selecting galaxies according to apparent magnitude or diameter, is reported in chapter three. Chapter four delineates some properties of groups of galaxies, explaining the reasons for discrepant results on group distributions. Chapter five is a study of the distribution of galaxy clusters with different statistical tools, such as correlations, percolation, the void probability function and counts in cells; the same scale-invariant behaviour as for galaxies is found. Chapter six describes our finding that rich galaxy clusters too belong to the fundamental plane of elliptical galaxies, and gives a discussion of its possible implications. Finally, chapter seven reviews the possibilities offered by multi-slit and multi-fibre spectrographs, and I present some observational work on nearby and distant galaxy clusters. In particular, I show the opportunities offered by ongoing surveys of galaxies coupled with multi-object fibre spectrographs, focusing on the ESO Key Programme "A galaxy redshift survey in the south galactic pole region", to which I collaborate, and on MEFOS, a multi-fibre instrument with automatic positioning. Published papers related to the work described in this thesis are reported in the last appendix. (author) [fr]

  4. Characterizing single-molecule FRET dynamics with probability distribution analysis.

    Science.gov (United States)

    Santoso, Yusdi; Torella, Joseph P; Kapanidis, Achillefs N

    2010-07-12

    Probability distribution analysis (PDA) is a recently developed statistical tool for predicting the shapes of single-molecule fluorescence resonance energy transfer (smFRET) histograms, which allows the identification of single or multiple static molecular species within a single histogram. We used a generalized PDA method to predict the shapes of FRET histograms for molecules interconverting dynamically between multiple states. This method is tested on a series of model systems, including both static DNA fragments and dynamic DNA hairpins. By fitting the shape of this expected distribution to experimental data, the timescale of hairpin conformational fluctuations can be recovered, in good agreement with earlier published results obtained using different techniques. This method is also applied to studying the conformational fluctuations in the unliganded Klenow fragment (KF) of Escherichia coli DNA polymerase I, which allows both confirmation of the consistency of a simple, two-state kinetic model with the observed smFRET distribution of unliganded KF and extraction of a millisecond fluctuation timescale, in good agreement with rates reported elsewhere. We expect this method to be useful in extracting rates from processes exhibiting dynamic FRET, and in hypothesis-testing models of conformational dynamics against experimental data.
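
    The essence of dynamic PDA is that, for an assumed kinetic scheme, one predicts the shot-noise-limited shape of the FRET histogram and compares it to the measured one; the sketch below is a stripped-down Monte Carlo version of that idea (two states interconverting within each observation bin, binomial photon statistics), with rates, efficiencies and photon counts that are purely illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(4)

    # Assumed two-state model: FRET efficiencies and interconversion rates (per ms).
    E1, E2 = 0.85, 0.30
    k12, k21 = 0.8, 1.2          # state 1 -> 2 and state 2 -> 1 rates
    T_bin = 1.0                   # observation bin length (ms)
    N_photons = 50                # photons detected per bin (fixed for simplicity)
    n_bins = 50_000

    def time_in_state1(T):
        """Simulate one bin of a two-state Markov chain (Gillespie-style) and return
        the fraction of time spent in state 1."""
        state = 1 if rng.random() < k21 / (k12 + k21) else 2   # equilibrium start
        t, t1 = 0.0, 0.0
        while t < T:
            rate = k12 if state == 1 else k21
            dwell = min(rng.exponential(1.0 / rate), T - t)
            if state == 1:
                t1 += dwell
            t += dwell
            state = 3 - state
        return t1 / T

    # Predicted (simulated) FRET histogram including shot noise: acceptor photons are
    # binomially distributed around the time-averaged efficiency of each bin.
    f1 = np.array([time_in_state1(T_bin) for _ in range(n_bins)])
    E_mean = f1 * E1 + (1.0 - f1) * E2
    n_acceptor = rng.binomial(N_photons, E_mean)
    E_apparent = n_acceptor / N_photons

    hist, edges = np.histogram(E_apparent, bins=25, range=(0, 1), density=True)
    for lo, hi, h in zip(edges[:-1], edges[1:], hist):
        print(f"E in [{lo:.2f}, {hi:.2f}): {h:.2f}")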

  5. The Environmental Scenario Generator (ESG): a distributed environmental data archive analysis tool

    Directory of Open Access Journals (Sweden)

    E A Kihn

    2006-01-01

    Full Text Available The Environmental Scenario Generator (ESG) is a network distributed software system designed to allow a user to interact with archives of environmental data for the purpose of scenario extraction, data analysis and integration with existing models that require environmental input. The ESG uses fuzzy-logic based search tools to allow a user to look for specific environmental scenarios in vast archives by specifying the search in human linguistic terms. For example, the user can specify a scenario such as a "cloud free week" or "high winds and low pressure" and then search relevant archives available across the network to get a list of matching events. The ESG hooks to existing archives of data by providing a simple communication framework and an efficient data model for exchanging data. Once data has been delivered by the distributed archives in the ESG data model, it can easily be accessed by the visualization, integration and analysis components to meet specific user requests. The ESG implementation provides a framework which can be taken as a pattern applicable to other distributed archive systems.
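
    A minimal sketch of the fuzzy-logic search idea, translating a linguistic query such as "high winds and low pressure" into membership scores used to rank archived records. The membership break-points, field names and toy archive below are assumptions for illustration, not part of the ESG implementation.

```python
import numpy as np

def trapezoid(x, a, b, c, d):
    """Trapezoidal fuzzy membership: 0 below a, ramps to 1 on [b, c], 0 above d."""
    x = np.asarray(x, dtype=float)
    rising = np.clip((x - a) / (b - a), 0.0, 1.0)
    falling = np.clip((d - x) / (d - c), 0.0, 1.0)
    return np.minimum(rising, falling)

# Hypothetical linguistic terms for a "high winds and low pressure" scenario
def high_wind(speed_ms):      # 'high' winds above ~10 m/s, fully high above 15 m/s
    return trapezoid(speed_ms, 10, 15, 1e9, 2e9)

def low_pressure(hpa):        # 'low' pressure below ~1000 hPa, fully low below 990 hPa
    return trapezoid(hpa, -2e9, -1e9, 990, 1000)

# Toy daily archive: wind speed (m/s) and surface pressure (hPa)
wind = np.array([4.0, 12.0, 18.0, 16.0, 7.0])
pressure = np.array([1015.0, 995.0, 988.0, 1002.0, 992.0])

# Fuzzy AND = minimum of the two memberships; rank days by match score
score = np.minimum(high_wind(wind), low_pressure(pressure))
for day in np.argsort(score)[::-1]:
    print(f"day {day}: wind={wind[day]:.1f} m/s, p={pressure[day]:.1f} hPa, score={score[day]:.2f}")
```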

  6. A quantitative analysis of the causes of the global climate change research distribution

    DEFF Research Database (Denmark)

    Pasgaard, Maya; Strange, Niels

    2013-01-01

    investigates whether the need for knowledge on climate changes in the most vulnerable regions of the world is met by the supply of knowledge measured by scientific research publications from the last decade. A quantitative analysis of more than 15,000 scientific publications from 197 countries investigates...... the poorer, fragile and more vulnerable regions of the world. A quantitative keywords analysis of all publications shows that different knowledge domains and research themes dominate across regions, reflecting the divergent global concerns in relation to climate change. In general, research on climate change...... the distribution of climate change research and the potential causes of this distribution. More than 13 explanatory variables representing vulnerability, geographic, demographic, economic and institutional indicators are included in the analysis. The results show that the supply of climate change knowledge...

  7. Principal Component Analysis for Normal-Distribution-Valued Symbolic Data.

    Science.gov (United States)

    Wang, Huiwen; Chen, Meiling; Shi, Xiaojun; Li, Nan

    2016-02-01

    This paper puts forward a new approach to principal component analysis (PCA) for normal-distribution-valued symbolic data, which has a vast potential of applications in the economic and management fields. We derive a full set of numerical characteristics and the variance-covariance structure for such data, which forms the foundation for our analytical PCA approach. Our approach is able to use all of the variance information in the original data, unlike the prevailing representative-type approaches in the literature, which only use centers, vertices, etc. The paper also provides an accurate approach to constructing the observations in a PC space based on the linear additivity property of the normal distribution. The effectiveness of the proposed method is illustrated by simulated numerical experiments. Finally, our method is applied to explain the puzzle of the risk-return tradeoff in China's stock market.

  8. An approach to prospective consequential life cycle assessment and net energy analysis of distributed electricity generation

    International Nuclear Information System (INIS)

    Jones, Christopher; Gilbert, Paul; Raugei, Marco; Mander, Sarah; Leccisi, Enrica

    2017-01-01

    Increasing distributed renewable electricity generation is one of a number of technology pathways available to policy makers to meet environmental and other sustainability goals. Determining the efficacy of such a pathway for a national electricity system implies evaluating whole system change in future scenarios. Life cycle assessment (LCA) and net energy analysis (NEA) are two methodologies suitable for prospective and consequential analysis of energy performance and associated impacts. This paper discusses the benefits and limitations of prospective and consequential LCA and NEA analysis of distributed generation. It concludes that a combined LCA and NEA approach is a valuable tool for decision makers if a number of recommendations are addressed. Static and dynamic temporal allocation are both needed for a fair comparison of distributed renewables with thermal power stations to account for their different impact profiles over time. The trade-offs between comprehensiveness and uncertainty in consequential analysis should be acknowledged, with system boundary expansion and system simulation models limited to those clearly justified by the research goal. The results of this approach are explorative, rather than for accounting purposes; this interpretive remit, and the assumptions in scenarios and system models on which results are contingent, must be clear to end users. - Highlights: • A common LCA and NEA framework for prospective, consequential analysis is discussed. • Approach to combined LCA and NEA of distributed generation scenarios is proposed. • Static and dynamic temporal allocation needed to assess distributed generation uptake.

  9. Job optimization in ATLAS TAG-based distributed analysis

    Science.gov (United States)

    Mambelli, M.; Cranshaw, J.; Gardner, R.; Maeno, T.; Malon, D.; Novak, M.

    2010-04-01

    The ATLAS experiment is projected to collect over one billion events/year during the first few years of operation. The efficient selection of events for various physics analyses across all appropriate samples presents a significant technical challenge. ATLAS computing infrastructure leverages the Grid to tackle the analysis across large samples by organizing data into a hierarchical structure and exploiting distributed computing to churn through the computations. This includes events at different stages of processing: RAW, ESD (Event Summary Data), AOD (Analysis Object Data), DPD (Derived Physics Data). Event Level Metadata Tags (TAGs) contain information about each event stored using multiple technologies accessible by POOL and various web services. This allows users to apply selection cuts on quantities of interest across the entire sample to compile a subset of events that are appropriate for their analysis. This paper describes new methods for organizing jobs using the TAGs criteria to analyze ATLAS data. It further compares different access patterns to the event data and explores ways to partition the workload for event selection and analysis. Here analysis is defined as a broader set of event processing tasks including event selection and reduction operations ("skimming", "slimming" and "thinning") as well as DPD making. Specifically it compares analysis with direct access to the events (AOD and ESD data) to access mediated by different TAG-based event selections. We then compare different ways of splitting the processing to maximize performance.

  10. Data envelopment analysis with uncertain data: An application for Iranian electricity distribution companies

    International Nuclear Information System (INIS)

    Sadjadi, S.J.; Omrani, H.

    2008-01-01

    This paper presents a Data Envelopment Analysis (DEA) model with uncertain data for the performance assessment of electricity distribution companies. During the past two decades, DEA has been widely used for benchmarking electricity distribution companies. However, among the many existing DEA approaches there is no study in which uncertainty in the data is allowed and, at the same time, the distribution of the random data is permitted to be unknown. The method proposed in this paper develops a new DEA model that accounts for uncertainty in the output parameters. The method is based on the adaptation of recently developed robust optimization approaches proposed by Ben-Tal and Nemirovski [2000. Robust solutions of linear programming problems contaminated with uncertain data. Mathematical Programming 88, 411-421] and Bertsimas et al. [2004. Robust linear optimization under general norms. Operations Research Letters 32, 510-516]. The results are compared with an existing parametric Stochastic Frontier Analysis (SFA) using data from 38 electricity distribution companies in Iran to show the effects of the data uncertainties on the performance of DEA outputs. The results indicate that the robust DEA approach can be a relatively more reliable method for efficiency estimation and ranking strategies.
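
    For readers unfamiliar with DEA, the deterministic baseline that the robust formulation builds on can be sketched as an input-oriented CCR envelopment linear program. The company data below are invented, and the robust counterpart of the Ben-Tal and Nemirovski approach is deliberately not implemented here.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """Input-oriented CCR efficiency of DMU j0.
    X: (m inputs x n DMUs), Y: (s outputs x n DMUs).
    Decision variables: [theta, lambda_1, ..., lambda_n]."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.zeros(n + 1)
    c[0] = 1.0                                  # minimise theta
    # inputs:  sum_j lambda_j x_ij - theta * x_i,j0 <= 0
    A_in = np.hstack([-X[:, [j0]], X])
    b_in = np.zeros(m)
    # outputs: -sum_j lambda_j y_rj <= -y_r,j0  (produce at least the outputs of j0)
    A_out = np.hstack([np.zeros((s, 1)), -Y])
    b_out = -Y[:, j0]
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([b_in, b_out]),
                  bounds=[(0, None)] * (n + 1),
                  method="highs")
    return res.x[0]

# Toy data: 2 inputs (network length, staff) and 1 output (energy delivered) for 5 companies
X = np.array([[120.0, 90.0, 150.0, 80.0, 110.0],
              [40.0, 35.0, 60.0, 25.0, 45.0]])
Y = np.array([[300.0, 280.0, 330.0, 240.0, 310.0]])

for j in range(X.shape[1]):
    print(f"company {j}: efficiency = {ccr_efficiency(X, Y, j):.3f}")
```

    A score of 1 marks a company on the efficient frontier; the robust variant would then shrink or enlarge the constraint set to guard against perturbations of the output data.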

  11. RELIABILITY ANALYSIS OF POWER DISTRIBUTION SYSTEMS

    Directory of Open Access Journals (Sweden)

    Popescu V.S.

    2012-04-01

    Full Text Available Power distribution systems are basic parts of power systems, and the reliability of these systems is at present a key issue for power engineering development and requires special attention. Operation of distribution systems is accompanied by a number of random factors that produce a large number of unplanned interruptions. Research has shown that the predominant factors that have a significant influence on the reliability of distribution systems are: weather conditions (39.7%), defects in equipment (25%) and unknown random factors (20.1%). The article studies the influence of this random behavior and presents reliability estimations for predominantly rural electrical distribution systems.

  12. Mathematical Model and Stability Analysis of Inverter-Based Distributed Generator

    Directory of Open Access Journals (Sweden)

    Alireza Khadem Abbasi

    2013-01-01

    Full Text Available This paper presents a mathematical (small-signal) model of an electronically interfaced distributed generator (DG) by considering the effect of voltage and frequency variations of the prime source. Dynamic equations are found by linearization about an operating point. In this study, the dynamics of the DC part of the interface are included in the model. The stability analysis shows that, with proper selection of system parameters, the system is stable during steady-state and dynamic situations, and oscillatory modes are well damped. The proposed model is useful for studying the stability of a standalone DG or a microgrid.

  13. Vibration analysis of continuous maglev guideways with a moving distributed load model

    International Nuclear Information System (INIS)

    Teng, N G; Qiao, B P

    2008-01-01

    A model of moving distributed load with a constant speed is established for vertical vibration analysis of a continuous guideway in maglev transportation system. The guideway is considered as a continuous structural system and the action of maglev vehicles on guideways is considered as a moving distributed load. Vibration of the continuous guideways used in Shanghai maglev line is analyzed with this model. The factors that affect the vibration of the guideways, such as speeds, guideway spans, frequency and damping, are discussed.

  14. Inverse analysis of non-uniform temperature distributions using multispectral pyrometry

    Science.gov (United States)

    Fu, Tairan; Duan, Minghao; Tian, Jibin; Shi, Congling

    2016-05-01

    Optical diagnostics can be used to obtain sub-pixel temperature information in remote sensing. A multispectral pyrometry method was developed using multiple spectral radiation intensities to deduce the temperature area distribution in the measurement region. The method transforms a spot multispectral pyrometer with a fixed field of view into a pyrometer with enhanced spatial resolution that can give sub-pixel temperature information from a "one pixel" measurement region. A temperature area fraction function was defined to represent the spatial temperature distribution in the measurement region. The method is illustrated by simulations of a multispectral pyrometer with a spectral range of 8.0-13.0 μm measuring a non-isothermal region with a temperature range of 500-800 K in the spot pyrometer field of view. The inverse algorithm for the sub-pixel temperature distribution (temperature area fractions) in the "one pixel" measurement region verifies this multispectral pyrometry method. The results show that an improved Levenberg-Marquardt algorithm is effective for this ill-posed inverse problem with relative errors in the temperature area fractions of (-3%, 3%) for most of the temperatures. The analysis provides a valuable reference for the use of spot multispectral pyrometers for sub-pixel temperature distributions in remote sensing measurements.
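
    A minimal sketch of the forward model and a simple inversion for the temperature area fractions, assuming a Planck radiator and a fixed grid of temperature classes. The wavelengths, class grid, noise level and the non-negative least-squares solver are illustrative stand-ins for the paper's improved Levenberg-Marquardt algorithm.

```python
import numpy as np
from scipy.optimize import nnls

C2 = 1.4388e-2  # second radiation constant, m*K

def planck(lam_m, T):
    """Planck spectral radiance (arbitrary units) at wavelength lam_m [m], temperature T [K]."""
    return 1.0 / (lam_m**5 * (np.exp(C2 / (lam_m * T)) - 1.0))

# Spectral channels (8-13 um) and candidate temperature classes spanning 500-800 K
lam = np.linspace(8e-6, 13e-6, 12)
T_classes = np.arange(500.0, 801.0, 50.0)

# "True" sub-pixel temperature area fractions (hypothetical) and a synthetic measurement
f_true = np.array([0.0, 0.1, 0.4, 0.3, 0.15, 0.05, 0.0])
A = np.column_stack([planck(lam, T) for T in T_classes])   # (channels x classes)
measured = A @ f_true
measured *= 1.0 + 0.01 * np.random.default_rng(1).normal(size=measured.size)  # 1% noise

# Non-negative least squares as a stand-in for the paper's Levenberg-Marquardt inversion,
# followed by normalisation so the area fractions sum to one.
f_est, _ = nnls(A, measured)
f_est /= f_est.sum()
for T, f in zip(T_classes, f_est):
    print(f"T = {T:5.0f} K : area fraction = {f:.3f}")
```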

  15. Statistical analysis of rockfall volume distributions: Implications for rockfall dynamics

    Science.gov (United States)

    Dussauge, Carine; Grasso, Jean-Robert; Helmstetter, AgnèS.

    2003-06-01

    We analyze the volume distribution of natural rockfalls on different geological settings (i.e., calcareous cliffs in the French Alps, Grenoble area, and granite Yosemite cliffs, California Sierra) and different volume ranges (i.e., regional and worldwide catalogs). Contrary to previous studies that included several types of landslides, we restrict our analysis to rockfall sources which originated on subvertical cliffs. For the three data sets, we find that the rockfall volumes follow a power law distribution with a similar exponent value, within error bars. This power law distribution was also proposed for rockfall volumes that occurred along road cuts. All these results argue for a recurrent power law distribution of rockfall volumes on subvertical cliffs, for a large range of rockfall sizes (10^2-10^10 m^3), regardless of the geological settings and of the preexisting geometry of fracture patterns that are drastically different on the three studied areas. The power law distribution for rockfall volumes could emerge from two types of processes. First, the observed power law distribution of rockfall volumes is similar to the one reported for both fragmentation experiments and fragmentation models. This argues for the geometry of rock mass fragment sizes to possibly control the rockfall volumes. This way neither cascade nor avalanche processes would influence the rockfall volume distribution. Second, without any requirement of scale-invariant quenched heterogeneity patterns, the rock mass dynamics can arise from avalanche processes driven by fluctuations of the rock mass properties, e.g., cohesion or friction angle. This model may also explain the power law distribution reported for landslides involving unconsolidated materials. We find that the exponent value of rockfall volumes on subvertical cliffs, 0.5 ± 0.2, is significantly smaller than the 1.2 ± 0.3 value reported for mixed landslide types. This change of exponents can be driven by the material strength, which
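
    The exponent of such a power law volume distribution is commonly estimated with the maximum-likelihood (Hill) estimator. The sketch below applies it to a synthetic catalogue and is not the authors' analysis of the Grenoble, Yosemite or worldwide data.

```python
import numpy as np

def powerlaw_exponent(volumes, v_min):
    """Maximum-likelihood (Hill) estimate of the exponent b in N(>V) ~ V^-b,
    using all events with volume >= v_min."""
    v = np.asarray(volumes, dtype=float)
    v = v[v >= v_min]
    b = v.size / np.sum(np.log(v / v_min))
    return b, b / np.sqrt(v.size)   # estimate and approximate standard error

# Synthetic catalogue drawn from a Pareto distribution with b = 0.5 (illustrative only)
rng = np.random.default_rng(42)
v_min = 1e2                                      # m^3
volumes = v_min * (1.0 - rng.random(500)) ** (-1.0 / 0.5)

b_hat, b_err = powerlaw_exponent(volumes, v_min)
print(f"b = {b_hat:.2f} +/- {b_err:.2f}")
```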

  16. Performance Analysis of Radial Distribution Systems with UPQC and D-STATCOM

    Science.gov (United States)

    Gupta, Atma Ram; Kumar, Ashwani

    2017-08-01

    This paper presents an effective method for finding optimum location of unified power quality conditioner (UPQC) and distributed static compensator (D-STATCOM) in radial distribution system. The bus having the minimum losses is selected as the candidate bus for UPQC placement and the optimal location of D-STATCOM is found by power loss index (PLI) method. The PLI values of all the buses are calculated and the bus having the highest PLI value is the most favorable bus and thus selected as candidate bus for D-STATCOM placement. The main contributions of this paper are: (i) finding optimum location of UPQC in radial distribution system (RDS) based on minimum power loss; (ii) finding the optimal size of UPQC which offers minimum losses; (iii) calculation of annual energy saving using UPQC and D-STATCOM; (iv) cost analysis with and without UPQC and D-STATCOM placement; and (v) comparison of results with and without UPQC and D-STATCOM placement in RDS. The algorithm is tested on IEEE 33-bus and 69-bus radial distribution systems by using MATLAB software.
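
    A minimal sketch of the PLI ranking step, assuming the per-bus loss reductions have already been obtained from load-flow runs. The bus numbers and loss figures below are invented and do not correspond to the IEEE 33-bus or 69-bus results.

```python
import numpy as np

def power_loss_index(loss_reduction):
    """Normalise per-bus loss reductions to [0, 1]; the bus with the highest
    PLI is the candidate location for D-STATCOM placement."""
    lr = np.asarray(loss_reduction, dtype=float)
    return (lr - lr.min()) / (lr.max() - lr.min())

# Hypothetical loss reduction (kW) obtained from repeated load-flow runs with
# compensation placed at each candidate bus (values invented for illustration).
buses = np.array([5, 9, 14, 18, 24, 30])
loss_reduction = np.array([12.4, 30.1, 41.7, 25.3, 58.9, 47.2])

pli = power_loss_index(loss_reduction)
best = buses[np.argmax(pli)]
for b, p in zip(buses, pli):
    print(f"bus {b:2d}: PLI = {p:.2f}")
print("candidate bus for D-STATCOM:", best)
```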

  17. Automated local bright feature image analysis of nuclear protein distribution identifies changes in tissue phenotype

    International Nuclear Information System (INIS)

    Knowles, David; Sudar, Damir; Bator, Carol; Bissell, Mina

    2006-01-01

    The organization of nuclear proteins is linked to cell and tissue phenotypes. When cells arrest proliferation, undergo apoptosis, or differentiate, the distribution of nuclear proteins changes. Conversely, forced alteration of the distribution of nuclear proteins modifies cell phenotype. Immunostaining and fluorescence microscopy have been critical for such findings. However, there is an increasing need for quantitative analysis of nuclear protein distribution to decipher epigenetic relationships between nuclear structure and cell phenotype, and to unravel the mechanisms linking nuclear structure and function. We have developed imaging methods to quantify the distribution of fluorescently-stained nuclear protein NuMA in different mammary phenotypes obtained using three-dimensional cell culture. Automated image segmentation of DAPI-stained nuclei was generated to isolate thousands of nuclei from three-dimensional confocal images. Prominent features of fluorescently-stained NuMA were detected using a novel local bright feature analysis technique, and their normalized spatial density calculated as a function of the distance from the nuclear perimeter to its center. The results revealed marked changes in the distribution of the density of NuMA bright features as non-neoplastic cells underwent phenotypically normal acinar morphogenesis. In contrast, we did not detect any reorganization of NuMA during the formation of tumor nodules by malignant cells. Importantly, the analysis also discriminated proliferating non-neoplastic cells from proliferating malignant cells, suggesting that these imaging methods are capable of identifying alterations linked not only to the proliferation status but also to the malignant character of cells. We believe that this quantitative analysis will have additional applications for classifying normal and pathological tissues

  18. Sensitivity analysis and parameter estimation for distributed hydrological modeling: potential of variational methods

    Directory of Open Access Journals (Sweden)

    W. Castaings

    2009-04-01

    Full Text Available Variational methods are widely used for the analysis and control of computationally intensive spatially distributed systems. In particular, the adjoint state method enables a very efficient calculation of the derivatives of an objective function (response function to be analysed or cost function to be optimised) with respect to model inputs.

    In this contribution, it is shown that the potential of variational methods for distributed catchment scale hydrology should be considered. A distributed flash flood model, coupling kinematic wave overland flow and Green Ampt infiltration, is applied to a small catchment of the Thoré basin and used as a relatively simple (synthetic observations) but didactic application case.

    It is shown that forward and adjoint sensitivity analysis provide a local but extensive insight on the relation between the assigned model parameters and the simulated hydrological response. Spatially distributed parameter sensitivities can be obtained for a very modest calculation effort (~6 times the computing time of a single model run), and the singular value decomposition (SVD) of the Jacobian matrix provides an interesting perspective for the analysis of the rainfall-runoff relation.

    For the estimation of model parameters, adjoint-based derivatives were found exceedingly efficient in driving a bound-constrained quasi-Newton algorithm. The reference parameter set is retrieved independently from the optimization initial condition when the very common dimension reduction strategy (i.e. scalar multipliers) is adopted.

    Furthermore, the sensitivity analysis results suggest that most of the variability in this high-dimensional parameter space can be captured with a few orthogonal directions. A parametrization based on the SVD leading singular vectors was found very promising but should be combined with another regularization strategy in order to prevent overfitting.
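
    The dimension-reduction idea, keeping only the leading right singular vectors of the sensitivity (Jacobian) matrix as a reduced parametrization, can be sketched as follows. The Jacobian here is synthetic; in the study it would come from the adjoint of the flash flood model.

```python
import numpy as np

# Hypothetical sensitivity (Jacobian) matrix J of the simulated hydrograph with respect
# to spatially distributed parameters: J[t, p] = d(discharge at time t) / d(parameter p).
rng = np.random.default_rng(0)
n_times, n_params = 200, 500
# Build a Jacobian whose information is concentrated in a few directions (rank ~5 plus noise)
U = rng.normal(size=(n_times, 5))
V = rng.normal(size=(5, n_params))
J = U @ V + 0.01 * rng.normal(size=(n_times, n_params))

# SVD of the Jacobian: the leading right singular vectors span the directions in parameter
# space that the observations actually constrain.
_, s, Vt = np.linalg.svd(J, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
k = int(np.searchsorted(energy, 0.99) + 1)
print(f"{k} singular directions capture 99% of the sensitivity")

# Reduced parametrisation: express a distributed parameter increment as a combination
# of the k leading singular vectors, delta_theta = Vt[:k].T @ alpha (alpha has length k).
alpha = rng.normal(size=k)
delta_theta = Vt[:k].T @ alpha
print("distributed increment shape:", delta_theta.shape)
```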

  19. Integrating risk analysis and multi-criteria decision support under uncertainty in electricity distribution system asset management

    International Nuclear Information System (INIS)

    Catrinu, M.D.; Nordgard, D.E.

    2011-01-01

    Asset managers in electricity distribution companies generally recognize the need and the challenge of adding structure and a higher degree of formal analysis into the increasingly complex asset management decisions. This implies improving the present asset management practice by making the best use of the available data and expert knowledge and by adopting new methods for risk analysis and decision support, as well as better ways to document the decisions made. This paper discusses methods for integrating risk analysis and multi-criteria decision support under uncertainty in electricity distribution system asset management. The focus is on how to include the different company objectives and risk analyses into a structured decision framework when deciding how to handle the physical assets of the electricity distribution network. This paper presents an illustrative example of decision support for maintenance and reinvestment strategies based on expert knowledge, simplified risk analyses and multi-criteria decision analysis under uncertainty.

  20. Application of «Sensor signal analysis network» complex for distributed, time synchronized analysis of electromagnetic radiation

    Science.gov (United States)

    Mochalov, Vladimir; Mochalova, Anastasia

    2017-10-01

    The paper considers a developing software-hardware complex «Sensor signal analysis network» for distributed and time-synchronized analysis of electromagnetic radiation. The areas of application and the main features of the complex are described. An example of application of the complex to monitor natural electromagnetic radiation sources is considered based on the data recorded in the VLF range. A generalized functional scheme of stream analysis of signals by a complex functional node is suggested and its application for stream detection of atmospherics, whistlers and tweaks is considered.

  1. Geographic distribution of hospital beds throughout China: a county-level econometric analysis.

    Science.gov (United States)

    Pan, Jay; Shallcross, David

    2016-11-08

    Geographical distribution of healthcare resources is an important dimension of healthcare access. Little work has been published on healthcare resource allocation patterns in China, despite public equity concerns. Using national data from 2043 counties, this paper investigates the geographic distribution of hospital beds at the county level in China. We performed Gini coefficient analysis to measure inequalities and ordinary least squares regression with fixed provincial effects and additional spatial specifications to assess key determinants. We found that provinces in west China have the least equitable resource distribution. We also found that the distribution of hospital beds is highly spatially clustered. Finally, we found that both county-level savings and government revenue show a strong positive relationship with county level hospital bed density. We argue for more widespread use of disaggregated, geographical data in health policy-making in China to support the rational allocation of healthcare resources, thus promoting efficiency and equity.
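
    A minimal sketch of the Gini coefficient computation used to quantify inequality in the geographic distribution of hospital beds; the per-county figures below are invented, not the Chinese county data.

```python
import numpy as np

def gini(x):
    """Gini coefficient of a non-negative distribution (0 = perfect equality)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    total = x.sum()
    # Standard formula based on the ranked (ascending) values
    return (2.0 * np.sum(np.arange(1, n + 1) * x) - (n + 1) * total) / (n * total)

# Hypothetical hospital beds per 1000 population for a handful of counties
beds_per_1000 = np.array([1.2, 2.5, 3.1, 0.8, 4.0, 2.2, 5.6, 1.0])
print(f"Gini coefficient: {gini(beds_per_1000):.3f}")
```

    Values near 0 indicate an even allocation across counties, while values approaching 1 indicate that beds are concentrated in a few counties.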

  2. Analysis of the space, time and energy distribution of Vrancea earthquakes

    International Nuclear Information System (INIS)

    Radulian, M.; Popa, M.

    1995-01-01

    Statistical analysis of fractal properties of space, time and energy distributions of Vrancea intermediate-depth earthquakes is performed on a homogeneous and complete data set. All events with magnitudes M_L > 2.5 which occurred from 1974 to 1992 are considered. The 19-year time interval includes the major earthquakes of March 4, 1977, August 26, 1986 and May 30, 1990. The subducted plate, lying between 60 km and 180 km depth, is divided into four active zones with characteristic seismic activities. The correlations between the parameters defining the seismic activities in these zones are studied. The predictive properties of the parameters related to the stress distribution on the fault are analysed. The significant anomalies in time and size distributions of earthquakes are emphasized. The correlations between spatial distribution (fractal dimension), the frequency-magnitude distribution (b slope value) and the high-frequency energy radiated by the source (fall off of the displacement spectra) are studied both at the scale of the whole seismogenic volume and the scale of a specific active zone. The results of this study for the Vrancea earthquakes bring evidence in favour of the seismic source model with hierarchical inhomogeneities (Frankel, 1991) (Author) 8 Figs., 2 Tabs., 5 Refs
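
    The b slope value of the frequency-magnitude distribution is typically estimated with Aki's maximum-likelihood formula. The sketch below applies it to a synthetic catalogue above M_L = 2.5 and is not the Vrancea analysis itself.

```python
import numpy as np

def b_value(magnitudes, m_c, dm=0.0):
    """Aki maximum-likelihood estimate of the Gutenberg-Richter b value for events
    with M >= m_c; dm is the magnitude binning width (0 for continuous magnitudes)."""
    m = np.asarray(magnitudes, dtype=float)
    m = m[m >= m_c]
    b = np.log10(np.e) / (m.mean() - (m_c - dm / 2.0))
    return b, b / np.sqrt(m.size)   # estimate and approximate standard error

# Synthetic catalogue (illustrative only): magnitudes above M_L = 2.5 drawn from a
# Gutenberg-Richter law with b = 1
rng = np.random.default_rng(7)
mags = 2.5 + rng.exponential(scale=np.log10(np.e) / 1.0, size=2000)

b, err = b_value(mags, m_c=2.5)
print(f"b = {b:.2f} +/- {err:.2f}")
```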

  3. Vibration analysis of continuous maglev guideways with a moving distributed load model

    Energy Technology Data Exchange (ETDEWEB)

    Teng, N G; Qiao, B P [Department of Civil Engineering, Shanghai Jiao Tong University, 800 Dongchuan Road, Shanghai, 200240 (China)

    2008-02-15

    A model of moving distributed load with a constant speed is established for vertical vibration analysis of a continuous guideway in maglev transportation system. The guideway is considered as a continuous structural system and the action of maglev vehicles on guideways is considered as a moving distributed load. Vibration of the continuous guideways used in Shanghai maglev line is analyzed with this model. The factors that affect the vibration of the guideways, such as speeds, guideway spans, frequency and damping, are discussed.

  4. Power Distribution Analysis For Electrical Usage In Province Area Using Olap (Online Analytical Processing)

    Directory of Open Access Journals (Sweden)

    Samsinar Riza

    2018-01-01

    Full Text Available The distribution network is the part of the power grid closest to the customers of electric service providers such as PT. PLN. The dispatching center of power grid companies is also the data center of the power grid, where a great amount of operating information is gathered. The valuable information contained in these data means a lot for power grid operating management. The technique of data warehousing with online analytical processing has been used to manage and analyze this great volume of data. The specific methods for the online analytical information system resulting from data warehouse processing with OLAP are chart and query reporting. The information in the form of chart reporting consists of the load distribution chart based on recurring time periods, the distribution chart by area, the substation region chart and the electric load usage chart. The results of the OLAP process show the development of the electric load distribution, as well as the analysis of information on electric power consumption load, and become an alternative in presenting information related to peak load.

  5. Power Distribution Analysis For Electrical Usage In Province Area Using Olap (Online Analytical Processing)

    Science.gov (United States)

    Samsinar, Riza; Suseno, Jatmiko Endro; Widodo, Catur Edi

    2018-02-01

    The distribution network is the part of the power grid closest to the customers of electric service providers such as PT. PLN. The dispatching center of power grid companies is also the data center of the power grid, where a great amount of operating information is gathered. The valuable information contained in these data means a lot for power grid operating management. The technique of data warehousing with online analytical processing has been used to manage and analyze this great volume of data. The specific methods for the online analytical information system resulting from data warehouse processing with OLAP are chart and query reporting. The information in the form of chart reporting consists of the load distribution chart based on recurring time periods, the distribution chart by area, the substation region chart and the electric load usage chart. The results of the OLAP process show the development of the electric load distribution, as well as the analysis of information on electric power consumption load, and become an alternative in presenting information related to peak load.

  6. Stability analysis and reconstruction of wave distribution functions in warm plasmas

    International Nuclear Information System (INIS)

    Oscarsson, T.E.

    1989-05-01

    The purpose of this thesis is first to describe stability analysis and reconstruction of the wave distribution function (WDF) separately, and then to show how the two approaches can be combined in an investigation of satellite data. To demonstrate the type of stability investigation that is often used in space physics we study instabilities below the local proton gyrofrequency which are caused by anisotropic proton distributions. Arbitrary angles between the wavevector and the background magnetic field are considered, and effects of warm plasma on the wave propagation properties are included. We also comment briefly on an often-used scheme for classifying instabilities. In our discussion on WDF analysis we develop a completely new and general method for reconstructing the WDF. Our scheme can be used to reconstruct the distribution function of waves in warm as well as cold plasma. Doppler effects introduced by satellite motion are included, and the reconstructions can be performed over a broad frequency range simultaneously. The applicability of our new WDF reconstruction method is studied in model problems and in an application to observations made by the Swedish satellite Viking. In the application to Viking data we combine stability and WDF analyses in a unique way that promises to become an important tool in future studies of wave-particle interactions in space plasmas. (author)

  7. Evaluation of Distribution Analysis Software for DER Applications

    Energy Technology Data Exchange (ETDEWEB)

    Staunton, RH

    2003-01-23

    unstoppable. In response, energy providers will be forced to both fully acknowledge the trend and plan for accommodating DER [3]. With bureaucratic barriers [4], lack of time/resources, tariffs, etc. still seen in certain regions of the country, changes still need to be made. Given continued technical advances in DER, the time is fast approaching when the industry, nation-wide, must not only accept DER freely but also provide or review in-depth technical assessments of how DER should be integrated into and managed throughout the distribution system. Characterization studies are needed to fully understand how both the utility system and DER devices themselves will respond to all reasonable events (e.g., grid disturbances, faults, rapid growth, diverse and multiple DER systems, large reactive loads). Some of this work has already begun as it relates to operation and control of DER [5] and microturbine performance characterization [6,7]. One of the most urgently needed tools that can provide these types of analyses is a distribution network analysis program in combination with models for various DER. Together, they can be used for (1) analyzing DER placement in distribution networks and (2) helping to ensure that adequate transmission reliability is maintained. Surveys of the market show products that represent a partial match to these needs; specifically, software that has been developed to plan electrical distribution systems and analyze reliability (in a near total absence of DER). The first part of this study (Sections 2 and 3 of the report) looks at a number of these software programs and provides both summary descriptions and comparisons. The second part of this study (Section 4 of the report) considers the suitability of these analysis tools for DER studies. It considers steady state modeling and assessment work performed by ORNL using one commercially available tool on feeder data provided by a southern utility. Appendix A provides a technical report on the results of

  8. Development of pair distribution function analysis

    International Nuclear Information System (INIS)

    Vondreele, R.; Billinge, S.; Kwei, G.; Lawson, A.

    1996-01-01

    This is the final report of a 3-year LDRD project at LANL. It has become more and more evident that structural coherence in the CuO2 planes of high-Tc superconducting materials over some intermediate length scale (nm range) is important to superconductivity. In recent years, the pair distribution function (PDF) analysis of powder diffraction data has been developed for extracting structural information on these length scales. This project sought to expand and develop this technique, use it to analyze neutron powder diffraction data, and apply it to problems. In particular, interest is in the area of high-Tc superconductors, although we planned to extend the study to the closely related perovskite ferroelectric materials and other materials where the local structure affects the properties and where detailed knowledge of the local and intermediate-range structure is important. In addition, we planned to carry out single crystal experiments to look for diffuse scattering. This information augments the information from the PDF

  9. Performance analysis of distributed beamforming in a spectrum sharing system

    KAUST Repository

    Yang, Liang; Alouini, Mohamed-Slim; Qaraqe, Khalid A.

    2012-01-01

    In this paper, we consider a distributed beamforming scheme (DBF) in a spectrum sharing system where multiple secondary users share the spectrum with the licensed primary users under an interference temperature constraint. We assume that DBF is applied at the secondary users. We first consider optimal beamforming and compare it with the user selection scheme in terms of the outage probability and bit-error rate performance. Since perfect feedback is difficult to obtain, we then investigate a limited feedback DBF scheme and develop an outage probability analysis for a random vector quantization (RVQ) design algorithm. Numerical results are provided to illustrate our mathematical formalism and verify our analysis. © 2012 IEEE.

  10. Performance analysis of distributed beamforming in a spectrum sharing system

    KAUST Repository

    Yang, Liang

    2012-09-01

    In this paper, we consider a distributed beamforming scheme (DBF) in a spectrum sharing system where multiple secondary users share the spectrum with the licensed primary users under an interference temperature constraint. We assume that DBF is applied at the secondary users. We first consider optimal beamforming and compare it with the user selection scheme in terms of the outage probability and bit-error rate performance. Since perfect feedback is difficult to obtain, we then investigate a limited feedback DBF scheme and develop an outage probability analysis for a random vector quantization (RVQ) design algorithm. Numerical results are provided to illustrate our mathematical formalism and verify our analysis. © 2012 IEEE.

  11. Pair distribution function analysis applied to decahedral gold nanoparticles

    International Nuclear Information System (INIS)

    Nakotte, H; Silkwood, C; Kiefer, B; Karpov, D; Fohtung, E; Page, K; Wang, H-W; Olds, D; Manna, S; Fullerton, E E

    2017-01-01

    The five-fold symmetry of face-centered cubic (fcc) derived nanoparticles is inconsistent with the translational symmetry of a Bravais lattice and generally explained by multiple twinning of a tetrahedral subunit about a (joint) symmetry axis, with or without structural modification to the fcc motif. Unlike in bulk materials, five-fold twinning in cubic nanoparticles is common and strongly affects their structural, chemical, and electronic properties. To test and verify theoretical approaches, it is therefore pertinent that the local structural features of such materials can be fully characterized. The small size of nanoparticles severely limits the application of traditional analysis techniques, such as Bragg diffraction. A complete description of the atomic arrangement in nanoparticles therefore requires a departure from the concept of translational symmetry, and prevents fully evaluating all the structural features experimentally. We describe how recent advances in instrumentation, together with the increasing power of computing, are shaping the development of alternative analysis methods of scattering data for nanostructures. We present the application of Debye scattering and pair distribution function (PDF) analysis towards modeling of the total scattering data for the example of decahedral gold nanoparticles. PDF measurements provide a statistical description of the pair correlations of atoms within a material, allowing one to evaluate the probability of finding two atoms within a given distance. We explored the sensitivity of existing synchrotron x-ray PDF instruments for distinguishing four different simple models for our gold nanoparticles: a multiply twinned fcc decahedron with either a single gap or multiple distributed gaps, a relaxed body-centered orthorhombic (bco) decahedron, and a hybrid decahedron. The data simulations of the models were then compared with experimental data from synchrotron x-ray total scattering. We present our experimentally
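
    The Debye scattering equation mentioned above can be sketched directly for a small cluster of identical atoms. The coordinates below are a tiny hypothetical fcc-like fragment, not a full decahedral nanoparticle model, and the atomic form factors are set to unity.

```python
import numpy as np

def debye_intensity(positions, q_values, f=1.0):
    """Debye scattering equation for a cluster of identical atoms:
    I(Q) = sum_ij f^2 sin(Q r_ij) / (Q r_ij), with the i = j terms contributing f^2."""
    r = np.linalg.norm(positions[:, None, :] - positions[None, :, :], axis=-1)
    iu = np.triu_indices(len(positions), k=1)
    rij = r[iu]                                    # unique interatomic distances
    q = np.asarray(q_values, dtype=float)[:, None]
    # np.sinc(x) = sin(pi x) / (pi x), so sin(Qr)/(Qr) = sinc(Qr/pi); it also handles Qr -> 0
    cross = 2.0 * np.sum(np.sinc(q * rij / np.pi), axis=1)
    return f**2 * (len(positions) + cross)

# Tiny hypothetical fcc-like gold fragment (coordinates in Angstrom, illustrative only)
a = 4.08
positions = a * np.array([[0, 0, 0], [0.5, 0.5, 0], [0.5, 0, 0.5], [0, 0.5, 0.5],
                          [1, 0, 0], [1, 0.5, 0.5], [0.5, 1, 0.5], [0.5, 0.5, 1]])
q = np.linspace(1.0, 10.0, 50)                     # scattering vector, 1/Angstrom
intensity = debye_intensity(positions, q)
print(intensity[:5].round(2))
```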

  12. Temporal assessment of radiomic features on clinical mammography in a high-risk population

    Science.gov (United States)

    Mendel, Kayla R.; Li, Hui; Lan, Li; Chan, Chun-Wai; King, Lauren M.; Tayob, Nabihah; Whitman, Gary; El-Zein, Randa; Bedrosian, Isabelle; Giger, Maryellen L.

    2018-02-01

    Extraction of high-dimensional quantitative data from medical images has become necessary in disease risk assessment, diagnostics and prognostics. Radiomic workflows for mammography typically involve a single medical image for each patient although medical images may exist for multiple imaging exams, especially in screening protocols. Our study takes advantage of the availability of mammograms acquired over multiple years for the prediction of cancer onset. This study included 841 images from 328 patients who developed subsequent mammographic abnormalities, which were confirmed as either cancer (n=173) or non-cancer (n=155) through diagnostic core needle biopsy. Quantitative radiomic analysis was conducted on antecedent FFDMs acquired a year or more prior to diagnostic biopsy. Analysis was limited to the breast contralateral to that in which the abnormality arose. Novel metrics were used to identify robust radiomic features. The most robust features were evaluated in the task of predicting future malignancies on a subset of 72 subjects (23 cancer cases and 49 non-cancer controls) with mammograms over multiple years. Using linear discriminant analysis, the robust radiomic features were merged into predictive signatures by: (i) using features from only the most recent contralateral mammogram, (ii) change in feature values between mammograms, and (iii) ratio of feature values over time, yielding AUCs of 0.57 (SE=0.07), 0.63 (SE=0.06), and 0.66 (SE=0.06), respectively. The AUCs for temporal radiomics (ratio) statistically differed from chance, suggesting that changes in radiomics over time may be critical for risk assessment. Overall, we found that our two-stage process of robustness assessment followed by performance evaluation served well in our investigation on the role of temporal radiomics in risk assessment.
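
    A minimal sketch of the third signature type (ratios of feature values over time merged with linear discriminant analysis and scored by AUC) using simulated features; the feature values, drift model and cross-validation scheme are assumptions, not the study's data or protocol.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)

# Hypothetical robust radiomic features measured on the earliest and most recent
# contralateral mammogram of each subject (values are simulated, not the study data).
n_cases, n_controls, n_feat = 23, 49, 4
n = n_cases + n_controls
y = np.r_[np.ones(n_cases), np.zeros(n_controls)]
early = rng.lognormal(mean=0.0, sigma=0.3, size=(n, n_feat))
drift = 1.0 + 0.15 * y[:, None] + 0.05 * rng.normal(size=(n, n_feat))  # cases drift more
recent = early * drift

# Temporal radiomic signature: ratio of feature values over time
X_ratio = recent / early

# Merge the features with linear discriminant analysis and estimate the AUC
scores = cross_val_predict(LinearDiscriminantAnalysis(), X_ratio, y,
                           cv=5, method="decision_function")
print(f"AUC (ratio features): {roc_auc_score(y, scores):.2f}")
```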

  13. Analysis of the influences of grid-connected PV power system on distribution grids

    Directory of Open Access Journals (Sweden)

    Dumitru Popandron

    2013-12-01

    Full Text Available This paper presents the analysis of producing an electric power of 2.8 MW using a solar photovoltaic plant. The PV plant will be grid-connected to the distribution network. The study focuses on the influence of connecting a photovoltaic system to the grid, using modern software for analysis, modeling and simulation in power systems.

  14. Evolution of the ATLAS PanDA Production and Distributed Analysis System

    International Nuclear Information System (INIS)

    Maeno, T; Wenaus, T; Fine, V; Potekhin, M; Panitkin, S; De, K; Nilsson, P; Stradling, A; Walker, R; Compostella, G

    2012-01-01

    The PanDA (Production and Distributed Analysis) system has been developed to meet ATLAS production and analysis requirements for a data-driven workload management system capable of operating at LHC data processing scale. PanDA has performed well with high reliability and robustness during the two years of LHC data-taking, while being actively evolved to meet the rapidly changing requirements for analysis use cases. We will present an overview of system evolution including automatic rebrokerage and reattempt for analysis jobs, adaptation for the CernVM File System, support for the multi-cloud model through which Tier-2 sites act as members of multiple clouds, pledged resource management and preferential brokerage, and monitoring improvements. We will also describe results from the analysis of two years of PanDA usage statistics, current issues, and plans for the future.

  15. Radiologists' preferences for digital mammographic display. The International Digital Mammography Development Group.

    Science.gov (United States)

    Pisano, E D; Cole, E B; Major, S; Zong, S; Hemminger, B M; Muller, K E; Johnston, R E; Walsh, R; Conant, E; Fajardo, L L; Feig, S A; Nishikawa, R M; Yaffe, M J; Williams, M B; Aylward, S R

    2000-09-01

    To determine the preferences of radiologists among eight different image processing algorithms applied to digital mammograms obtained for screening and diagnostic imaging tasks. Twenty-eight images representing histologically proved masses or calcifications were obtained by using three clinically available digital mammographic units. Images were processed and printed on film by using manual intensity windowing, histogram-based intensity windowing, mixture model intensity windowing, peripheral equalization, multiscale image contrast amplification (MUSICA), contrast-limited adaptive histogram equalization, Trex processing, and unsharp masking. Twelve radiologists compared the processed digital images with screen-film mammograms obtained in the same patient for breast cancer screening and breast lesion diagnosis. For the screening task, screen-film mammograms were preferred to all digital presentations, but the acceptability of images processed with Trex and MUSICA algorithms were not significantly different. All printed digital images were preferred to screen-film radiographs in the diagnosis of masses; mammograms processed with unsharp masking were significantly preferred. For the diagnosis of calcifications, no processed digital mammogram was preferred to screen-film mammograms. When digital mammograms were preferred to screen-film mammograms, radiologists selected different digital processing algorithms for each of three mammographic reading tasks and for different lesion types. Soft-copy display will eventually allow radiologists to select among these options more easily.

  16. Small disturbances stability analysis applied in a radial distribution system with distributed generation units; Analise de estabilidade a pequenos disturbios aplicada em um sistema de distribuicao radial com unidades de geracao distribuida

    Energy Technology Data Exchange (ETDEWEB)

    Dorca, Daniel Azevedo; Camacho, Jose Roberto [Universidade Federal de Uberlandia (UFU), MG (Brazil). Curso de Mestrado em Engenharia Eletrica

    2008-07-01

    This work investigates the small-disturbance stability of a 30-bus radial distribution system with distributed generation units. The study is carried out through time-domain simulations and through eigenvalue analysis and participation factors. The eigenvalue analysis shows that it is possible to predict a possible system instability in the face of a disturbance. The development of this work was motivated by the increasing number of distributed generation units in distribution networks. (author)

  17. Decision analysis for the cost effectiveness of Sestamibi Scintimammography in minimizing unnecessary biopsies

    International Nuclear Information System (INIS)

    Allen, M.W.; Hendi, P.; Schwimmer, J.; Gambhir, S.S.; Bassett, L.

    2000-01-01

    The purpose of this study was to assess if breast cancer screening using sestamibi scintimammography (SSMM) in conjunction with mammography (MM) is cost effective in avoiding biopsies in healthy patients. Quantitative decision tree sensitivity analysis was used to compare the conventional MM alone strategy (strategy A) with two decision strategies for screening with SSMM; SSMM after an indeterminate mammogram (strategy B) or SSMM after both a positive and an indeterminate mammogram (strategy C). Cost effectiveness was measured by calculating the expected cost per patient and the average life expectancy per patient for baseline values as well as over a range of values for all of the variables of each strategy. Based on Medicare reimbursement values, strategies B and C showed a cost savings of $9 and $20 per patient respectively as compared to strategy A. This translates into respective savings of $189 and $420 million per year assuming 21 million females undergo screening each year. Strategies B and C did however have a loss of mean life expectancy of 0.000178 and 0.000222 years respectively as compared to strategy A due to interval progression of breast cancer in a small number of women. Strategies B and C significantly lowered the number of biopsies performed on healthy patients in the screening population by 750,063 and 1,557,915 biopsies respectively as compared to strategy A. These results quantitatively verify the potential utility of using SSMM in avoiding unnecessary biopsies
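
    The expected-cost comparison behind such a decision tree reduces to weighting branch costs by their probabilities. The sketch below uses hypothetical probabilities and costs, not the Medicare reimbursement values or disease prevalences of the study.

```python
# Minimal sketch of the expected-cost calculation behind a screening decision tree.
# All probabilities and costs below are hypothetical placeholders, not the values
# used in the study.

p_indeterminate = 0.08        # mammogram read as indeterminate
p_positive = 0.02             # mammogram read as positive
p_ssmm_negative = 0.70        # SSMM negative given an indeterminate mammogram

cost_mm = 85.0                # screening mammogram
cost_ssmm = 300.0             # sestamibi scintimammography
cost_biopsy = 1500.0          # breast biopsy

# Strategy A: biopsy every indeterminate or positive mammogram
expected_cost_A = cost_mm + (p_indeterminate + p_positive) * cost_biopsy

# Strategy B: SSMM after an indeterminate mammogram; biopsy only if SSMM is positive
expected_cost_B = (cost_mm
                   + p_indeterminate * (cost_ssmm + (1 - p_ssmm_negative) * cost_biopsy)
                   + p_positive * cost_biopsy)

print(f"expected cost per patient, strategy A: ${expected_cost_A:.2f}")
print(f"expected cost per patient, strategy B: ${expected_cost_B:.2f}")
```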

  18. A subchannel and CFD analysis of void distribution for the BWR fuel bundle test benchmark

    International Nuclear Information System (INIS)

    In, Wang-Kee; Hwang, Dae-Hyun; Jeong, Jae Jun

    2013-01-01

    Highlights: ► We analyzed subchannel void distributions using subchannel, system and CFD codes. ► The mean error and standard deviation at steady states were compared. ► The deviation of the CFD simulation was greater than those of the others. ► The large deviation of the CFD prediction is due to interface model uncertainties. -- Abstract: The subchannel grade and microscopic void distributions in the NUPEC (Nuclear Power Engineering Corporation) BFBT (BWR Full-Size Fine-Mesh Bundle Tests) facility have been evaluated with a subchannel analysis code MATRA, a system code MARS and a CFD code CFX-10. Sixteen test series from five different test bundles were selected for the analysis of the steady-state subchannel void distributions. Four test cases for a high burn-up 8 × 8 fuel bundle with a single water rod were simulated using CFX-10 for the microscopic void distribution benchmark. Two transient cases, a turbine trip without a bypass as a typical power transient and a re-circulation pump trip as a flow transient, were also chosen for this analysis. It was found that the steady-state void distributions calculated by both the MATRA and MARS codes coincided well with the measured data in the range of thermodynamic qualities from 5 to 25%. The results of the transient calculations were also similar to each other and very reasonable. The CFD simulation reproduced the overall radial void distribution trend which produces less vapor in the central part of the bundle and more vapor in the periphery. However, the predicted variation of the void distribution inside the subchannels is small, while the measured one is large showing a very high concentration in the center of the subchannels. The variations of the void distribution between the center of the subchannels and the subchannel gap are estimated to be about 5–10% for the CFD prediction and more than 20% for the experiment

  19. Combining Different Tools for EEG Analysis to Study the Distributed Character of Language Processing

    Directory of Open Access Journals (Sweden)

    Armando Freitas da Rocha

    2015-01-01

    Full Text Available Recent studies on language processing indicate that language cognition is better understood if assumed to be supported by a distributed intelligent processing system enrolling neurons located all over the cortex, in contrast to reductionism that proposes to localize cognitive functions to specific cortical structures. Here, brain activity was recorded using electroencephalogram while volunteers were listening or reading small texts and had to select pictures that translate the meaning of these texts. Several techniques for EEG analysis were used to show this distributed character of neuronal enrollment associated with the comprehension of oral and written descriptive texts. Low Resolution Tomography identified the many different sets (s_i) of neurons activated in several distinct cortical areas by text understanding. Linear correlation was used to calculate the information H(e_i) provided by each electrode of the 10/20 system about the identified s_i. H(e_i) Principal Component Analysis (PCA) was used to study the temporal and spatial activation of these sources s_i. This analysis evidenced 4 different patterns of H(e_i) covariation that are generated by neurons located at different cortical locations. These results clearly show that the distributed character of language processing is evidenced by combining available EEG technologies.

  20. Achievements of the ATLAS Distributed Analysis during the first run period

    CERN Document Server

    Farida, Fassi; The ATLAS collaboration

    2013-01-01

    Summary: In the LHC operations era, analyzing the large data volumes becomes a challenging task for the physicists distributed around the world. The Computing Model of the ATLAS experiment at the LHC at CERN was designed around the concepts of grid computing. Large data volumes from the detectors and simulations require a large number of CPUs and storage space for data processing. To cope with these challenges a global network known as the Worldwide LHC Computing Grid (WLCG) was built. This is the most sophisticated data taking and analysis system ever built. Since the start of data-taking, the ATLAS Distributed Analysis (ADA) service has been running stably with the huge amount of data. The reliability of the ADA service is high and steadily improving; grid sites are continually validated against a set of standard tests, and a dedicated team of expert shifters provides user support and communicates user problems to the sites. The ATLAS Grid Computing Model is reviewed in this talk. Emphasis is given to the ADA system. Description: The ce...

  1. Objective models of compressed breast shapes undergoing mammography

    Science.gov (United States)

    Feng, Steve Si Jia; Patel, Bhavika; Sechopoulos, Ioannis

    2013-01-01

    Purpose: To develop models of compressed breasts undergoing mammography based on objective analysis, that are capable of accurately representing breast shapes in acquired clinical images and generating new, clinically realistic shapes. Methods: An automated edge detection algorithm was used to catalogue the breast shapes of clinically acquired cranio-caudal (CC) and medio-lateral oblique (MLO) view mammograms from a large database of digital mammography images. Principal component analysis (PCA) was performed on these shapes to reduce the information contained within the shapes to a small number of linearly independent variables. The breast shape models, one of each view, were developed from the identified principal components, and their ability to reproduce the shape of breasts from an independent set of mammograms not used in the PCA, was assessed both visually and quantitatively by calculating the average distance error (ADE). Results: The PCA breast shape models of the CC and MLO mammographic views based on six principal components, in which 99.2% and 98.0%, respectively, of the total variance of the dataset is contained, were found to be able to reproduce breast shapes with strong fidelity (CC view mean ADE = 0.90 mm, MLO view mean ADE = 1.43 mm) and to generate new clinically realistic shapes. The PCA models based on fewer principal components were also successful, but to a lesser degree, as the two-component model exhibited a mean ADE = 2.99 mm for the CC view, and a mean ADE = 4.63 mm for the MLO view. The four-component models exhibited a mean ADE = 1.47 mm for the CC view and a mean ADE = 2.14 mm for the MLO view. Paired t-tests of the ADE values of each image between models showed that these differences were statistically significant (max p-value = 0.0247). Visual examination of modeled breast shapes confirmed these results. Histograms of the PCA parameters associated with the six principal components were fitted with Gaussian distributions. The six
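
    A minimal sketch of the modelling pipeline (PCA on flattened contour coordinates, reconstruction of a held-out shape, and a simple point-to-point average distance error) using synthetic contours; the contour generator and the ADE variant below are assumptions, not the paper's edge-detection output or exact metric.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)

# Hypothetical training set of breast contours: each shape is 100 (x, y) edge points,
# flattened to a 200-dimensional vector (real contours would come from edge detection).
n_shapes, n_points = 60, 100
t = np.linspace(0, np.pi, n_points)
base = np.c_[np.sin(t), 1.0 - np.cos(t)]                     # a smooth reference contour
shapes = np.stack([(base * (1.0 + 0.1 * rng.normal(size=2))).ravel()
                   + 0.02 * rng.normal(size=2 * n_points) for _ in range(n_shapes)])

# Shape model: keep the first 6 principal components, as in the paper
pca = PCA(n_components=6).fit(shapes)
print("explained variance:", pca.explained_variance_ratio_.sum().round(3))

def average_distance_error(shape_vec, model):
    """Project a contour onto the PCA model, reconstruct it, and return the mean
    point-to-point distance (a simple stand-in for the paper's ADE metric)."""
    recon = model.inverse_transform(model.transform(shape_vec[None, :]))[0]
    d = np.linalg.norm(shape_vec.reshape(-1, 2) - recon.reshape(-1, 2), axis=1)
    return d.mean()

test_shape = (base * 1.05).ravel() + 0.02 * rng.normal(size=2 * n_points)
print(f"ADE of held-out contour: {average_distance_error(test_shape, pca):.4f} (arbitrary units)")
```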

  2. Objective models of compressed breast shapes undergoing mammography

    Energy Technology Data Exchange (ETDEWEB)

    Feng, Steve Si Jia [Department of Biomedical Engineering, Georgia Institute of Technology and Emory University and Department of Radiology and Imaging Sciences, Emory University, 1701 Uppergate Drive Northeast, Suite 5018, Atlanta, Georgia 30322 (United States); Patel, Bhavika [Department of Radiology and Imaging Sciences, Emory University, 1701 Uppergate Drive Northeast, Suite 5018, Atlanta, Georgia 30322 (United States); Sechopoulos, Ioannis [Departments of Radiology and Imaging Sciences, Hematology and Medical Oncology and Winship Cancer Institute, Emory University, 1701 Uppergate Drive Northeast, Suite 5018, Atlanta, Georgia 30322 (United States)

    2013-03-15

    Purpose: To develop models of compressed breasts undergoing mammography based on objective analysis, that are capable of accurately representing breast shapes in acquired clinical images and generating new, clinically realistic shapes. Methods: An automated edge detection algorithm was used to catalogue the breast shapes of clinically acquired cranio-caudal (CC) and medio-lateral oblique (MLO) view mammograms from a large database of digital mammography images. Principal component analysis (PCA) was performed on these shapes to reduce the information contained within the shapes to a small number of linearly independent variables. The breast shape models, one of each view, were developed from the identified principal components, and their ability to reproduce the shape of breasts from an independent set of mammograms not used in the PCA, was assessed both visually and quantitatively by calculating the average distance error (ADE). Results: The PCA breast shape models of the CC and MLO mammographic views based on six principal components, in which 99.2% and 98.0%, respectively, of the total variance of the dataset is contained, were found to be able to reproduce breast shapes with strong fidelity (CC view mean ADE = 0.90 mm, MLO view mean ADE = 1.43 mm) and to generate new clinically realistic shapes. The PCA models based on fewer principal components were also successful, but to a lesser degree, as the two-component model exhibited a mean ADE = 2.99 mm for the CC view, and a mean ADE = 4.63 mm for the MLO view. The four-component models exhibited a mean ADE = 1.47 mm for the CC view and a mean ADE = 2.14 mm for the MLO view. Paired t-tests of the ADE values of each image between models showed that these differences were statistically significant (max p-value = 0.0247). Visual examination of modeled breast shapes confirmed these results. Histograms of the PCA parameters associated with the six principal components were fitted with Gaussian distributions. The six

  3. Objective models of compressed breast shapes undergoing mammography

    International Nuclear Information System (INIS)

    Feng, Steve Si Jia; Patel, Bhavika; Sechopoulos, Ioannis

    2013-01-01

    Purpose: To develop models of compressed breasts undergoing mammography based on objective analysis, that are capable of accurately representing breast shapes in acquired clinical images and generating new, clinically realistic shapes. Methods: An automated edge detection algorithm was used to catalogue the breast shapes of clinically acquired cranio-caudal (CC) and medio-lateral oblique (MLO) view mammograms from a large database of digital mammography images. Principal component analysis (PCA) was performed on these shapes to reduce the information contained within the shapes to a small number of linearly independent variables. The breast shape models, one of each view, were developed from the identified principal components, and their ability to reproduce the shape of breasts from an independent set of mammograms not used in the PCA, was assessed both visually and quantitatively by calculating the average distance error (ADE). Results: The PCA breast shape models of the CC and MLO mammographic views based on six principal components, in which 99.2% and 98.0%, respectively, of the total variance of the dataset is contained, were found to be able to reproduce breast shapes with strong fidelity (CC view mean ADE = 0.90 mm, MLO view mean ADE = 1.43 mm) and to generate new clinically realistic shapes. The PCA models based on fewer principal components were also successful, but to a lesser degree, as the two-component model exhibited a mean ADE = 2.99 mm for the CC view, and a mean ADE = 4.63 mm for the MLO view. The four-component models exhibited a mean ADE = 1.47 mm for the CC view and a mean ADE = 2.14 mm for the MLO view. Paired t-tests of the ADE values of each image between models showed that these differences were statistically significant (max p-value = 0.0247). Visual examination of modeled breast shapes confirmed these results. Histograms of the PCA parameters associated with the six principal components were fitted with Gaussian distributions. The six

  4. Analysis of techniques for measurement of the size distribution of solid particles

    Directory of Open Access Journals (Sweden)

    F. O. Arouca

    2005-03-01

    Full Text Available Determination of the size distribution of solid particles is fundamental for analysis of the performance of several pieces of equipment used for solid-fluid separation. The main objective of this work is to compare the results obtained with two traditional methods for determination of the size grade distribution of powdery solids: the gamma-ray attenuation technique (GRAT) and the LADEQ test tube technique. The effect of draining the suspension in the two techniques used was also analyzed. The GRAT can supply the particle size distribution of solids through the monitoring of solid concentration in experiments on batch settling of diluted suspensions. The results show that use of the peristaltic pump in the GRAT and the LADEQ methods produced a significant difference between the values obtained for the parameters of the particle size model.

  5. Mammographic Image Analysis of Breast Using Neural Network

    Directory of Open Access Journals (Sweden)

    Lesa MAMBWE

    2015-07-01

    Full Text Available This paper discusses the various stages of detecting tumours in breast mammogram images. A Neural Network algorithm is applied to obtain the complete classification of the tumour as normal or abnormal. The most important step in obtaining the classification is feature extraction, which extracts a few discriminative features such as first-order statistical intensities and gradients. Image pre-processing is essential prior to image segmentation in order to obtain accurate segmentation, so that mass detection can be carried out. The processes involved in achieving the three techniques mentioned above include global equalization transformation, denoising, binarization, breast orientation determination and pectoral muscle suppression. The presented feature difference matrices are created from five features extracted from a suspicious region of interest (ROI). The Grey Level Co-occurrence Matrix (GLCM) aids in obtaining statistical features such as correlation, energy, entropy and homogeneity. The other statistical features obtained are area, moment, variance, entropy and standard deviation. The Neural Network technique then identifies the abnormal mammograms.
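
    As a concrete illustration of the GLCM step described above, the following hedged sketch uses scikit-image to compute correlation, energy and homogeneity, and derives entropy directly from the normalized co-occurrence matrix; the ROI is a random placeholder rather than a real mammogram patch (older scikit-image releases spell the functions greycomatrix/greycoprops).

        # Hedged sketch of GLCM texture features for a mammogram ROI using scikit-image.
        import numpy as np
        from skimage.feature import graycomatrix, graycoprops

        rng = np.random.default_rng(1)
        roi = rng.integers(0, 256, size=(128, 128), dtype=np.uint8)  # placeholder ROI

        glcm = graycomatrix(roi, distances=[1], angles=[0, np.pi / 2],
                            levels=256, symmetric=True, normed=True)

        features = {prop: graycoprops(glcm, prop).mean()
                    for prop in ("correlation", "energy", "homogeneity")}

        # GLCM entropy is not provided by graycoprops, so compute it from the matrix itself.
        p = glcm[glcm > 0]
        features["entropy"] = float(-np.sum(p * np.log2(p)))

        print(features)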

  6. Rank-Ordered Multifractal Analysis (ROMA) of probability distributions in fluid turbulence

    Directory of Open Access Journals (Sweden)

    C. C. Wu

    2011-04-01

    Full Text Available Rank-Ordered Multifractal Analysis (ROMA) was introduced by Chang and Wu (2008) to describe the multifractal characteristic of intermittent events. The procedure provides a natural connection between the rank-ordered spectrum and the idea of one-parameter scaling for monofractals. This technique has successfully been applied to MHD turbulence simulations and turbulence data observed in various space plasmas. In this paper, the technique is applied to the probability distributions in the inertial range of the turbulent fluid flow, as given in the vast Johns Hopkins University (JHU) turbulence database. In addition, a new way of finding the continuous ROMA spectrum and the scaled probability distribution function (PDF) simultaneously is introduced.

  7. Analysis Of Educational Services Distribution-Based Geographic Information System (GIS)

    Directory of Open Access Journals (Sweden)

    Waleed Lagrab

    2015-03-01

    Full Text Available This study analyzes the spatial distribution of kindergarten facilities in the study area based on Geographic Information Systems (GIS), in order to test the efficiency of GIS technology for redistributing the existing kindergartens, choosing the best locations in the future, and applying standard criteria for selecting suitable kindergarten sites. To achieve this goal, data and information were collected via interviews and comprehensive statistics on the education facilities in the Mukalla districts in Yemen, which contributed to building a geographic database for the study area. The kindergarten spatial patterns were then analyzed in terms of proximity to each other and to other land uses in the surrounding area, such as streets, highways and factories. The concentration, dispersion, clustering and distribution direction of the kindergartens were also measured, and the study showed the effectiveness of GIS for spatial data analysis. One of the most important findings is that most of the kindergartens established in Mukalla city did not take into account the criteria set by the authorities. Furthermore, almost every district suffers from a shortage in the number of kindergartens, and the pattern of distribution of those kindergartens is dominated by spatial dispersion.

  8. Evaluating Domestic Hot Water Distribution System Options With Validated Analysis Models

    Energy Technology Data Exchange (ETDEWEB)

    Weitzel, E.; Hoeschele, M.

    2014-09-01

    A growing body of work collects data on domestic hot water consumption, water use behaviors, and the energy efficiency of various distribution systems. A full distribution system developed in TRNSYS has been validated using field monitoring data and then exercised in a number of climates to understand climate impact on performance. This study builds upon previous analysis and modelling work to evaluate differing distribution systems and the sensitivities of water heating energy and water use efficiency to variations of climate, load, distribution type, insulation and compact plumbing practices. Overall, 124 different TRNSYS models were simulated. Of the configurations evaluated, distribution losses account for 13-29% of the total water heating energy use, and water use efficiency ranges from 11% to 22%. The base case, an uninsulated trunk and branch system, sees the most improvement in energy consumption by insulating and locating the water heater central to all fixtures. Demand recirculation systems are not projected to provide significant energy savings and in some cases increase energy consumption. Water use is most efficient with demand recirculation systems, followed by the insulated trunk and branch system with a central water heater. Compact plumbing practices and insulation have the most impact on energy consumption (2-6% for insulation and 3-4% per 10 gallons of enclosed volume reduced). The results of this work are useful in informing future development of water heating best practices guides as well as more accurate (and simulation time efficient) distribution models for annual whole house simulation programs.

  9. Threatened Plants in China’s Sanjiang Plain: Hotspot Distributions and Gap Analysis

    Directory of Open Access Journals (Sweden)

    Baojia Du

    2018-01-01

    Full Text Available Global biodiversity is markedly decreasing in response to climate change and human disturbance. Sanjiang Plain is recognized as a biodiversity hotspot in China due to its high forest and wetland coverage, but species are being lost at an unprecedented rate, induced by anthropogenic activities. Identifying hotspot distributions and conservation gaps of threatened species is of particular significance for enhancing the conservation of biodiversity. Specifically, we integrated the principles and methods of spatial hotspot inspection, geographic information system (GIS) technology and spatial autocorrelation analysis along with fieldwork to determine the spatial distribution patterns and unprotected hotspots of vulnerable and endangered plants in Sanjiang Plain. A gap analysis of the conservation status of vulnerable and endangered plants was conducted. Our results indicate that six nationally-protected plants were not observed in nature reserves or were without any protection, while the protection rates were <10% for 10 other nationally-protected plants. Protected areas (PAs) cover <5% of the distribution areas for 31 threatened plant species, while only five species are covered by national nature reserves (NNRs) within >50% of the distribution areas. We found 30 hotspots with vulnerable and endangered plants in the study area, but the area covered by NNRs is very limited. Most of the hotspots were located in areas with a high-high aggregation of plant species. Therefore, it is necessary to expand the area of existing nature reserves, establish miniature protection plots and create new PAs and ecological corridors to link the existing PAs. Our findings can contribute to the design of a PA network for botanical conservation.

  10. Computerized analysis of mammographic parenchymal patterns for assessing breast cancer risk: Effect of ROI size and location

    International Nuclear Information System (INIS)

    Li Hui; Giger, Maryellen L.; Huo Zhimin; Olopade, Olufunmilayo I.; Lan Li; Weber, Barbara L.; Bonta, Ioana

    2004-01-01

    The long-term goal of our research is to develop computerized radiographic markers for assessing breast density and parenchymal patterns that may be used together with clinical measures for determining the risk of breast cancer and assessing the response to preventive treatment. In our earlier studies, we found that women at high risk tended to have dense breasts with mammographic patterns that were coarse and low in contrast. With our method, computerized texture analysis is performed on a region of interest (ROI) within the mammographic image. In our current study, we investigate the effect of ROI size and ROI location on the computerized texture features obtained from 90 subjects (30 BRCA1/BRCA2 gene-mutation carriers and 60 age-matched women deemed to be at low risk for breast cancer). Mammograms were digitized at 0.1 mm pixel size and various ROI sizes were extracted from different breast regions in the craniocaudal (CC) view. Seventeen features, which characterize the density and texture of the parenchymal patterns, were extracted from the ROIs on these digitized mammograms. Stepwise feature selection and linear discriminant analysis were applied to identify features that differentiate between the low-risk women and the BRCA1/BRCA2 gene-mutation carriers. ROC analysis was used to assess the performance of the features in the task of distinguishing between these two groups. Our results show that there was a statistically significant decrease in the performance of the computerized texture features, as the ROI location was varied from the central region behind the nipple. However, we failed to show a statistically significant decrease in the performance of the computerized texture features with decreasing ROI size for the range studied
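
    The classification step described above (discriminant analysis of texture features evaluated with ROC analysis) can be sketched as follows. The features, labels and cross-validation setup are synthetic stand-ins, and the stepwise feature selection is omitted for brevity; this is not the study's code.

        # Illustrative sketch: linear discriminant analysis on texture features with ROC evaluation.
        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_predict
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(2)
        X = rng.normal(size=(90, 17))            # 90 subjects, 17 texture features (placeholders)
        y = np.r_[np.ones(30), np.zeros(60)]     # 30 mutation carriers, 60 low-risk women

        lda = LinearDiscriminantAnalysis()
        scores = cross_val_predict(lda, X, y, cv=5, method="decision_function")
        print("AUC:", roc_auc_score(y, scores))  # area under the ROC curve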

  11. Study on effects of mixing vane grids on coolant temperature distribution by subchannel analysis

    Energy Technology Data Exchange (ETDEWEB)

    Mao, H.; Yang, B.W.; Han, B. [Xi'an Jiaotong Univ., Shaanxi (China). Science and Technology Center for Advanced Nuclear Fuel Research

    2016-07-15

    Mixing vane grids (MVG) have a great influence on the coolant temperature field in the rod bundle. The MVG can enhance convective heat transfer between the fuel rod wall and the coolant, and promote inter-subchannel mixing at the same time. Regarding the influence of the MVG on convective heat transfer enhancement, many experiments have been performed and several correlations have been developed based on the experimental data. However, the inter-subchannel mixing promoted by the MVG is not well estimated in subchannel analysis, because information on the mixing vanes is entirely missing in most subchannel codes. This paper analyzes the influence of mixing vanes on the coolant temperature distribution using the improved MVG model in subchannel analysis. The coolant temperature distributions with the MVG are analyzed, and the results show that mixing vanes lead to a more uniform temperature distribution. The performance of split vane grids under different power conditions is evaluated. The results are compared with those of spacer grids without mixing vanes, and some conclusions are drawn.

  12. Study on effects of mixing vane grids on coolant temperature distribution by subchannel analysis

    International Nuclear Information System (INIS)

    Mao, H.; Yang, B.W.; Han, B.

    2016-01-01

    Mixing vane grids (MVG) have a great influence on the coolant temperature field in the rod bundle. The MVG can enhance convective heat transfer between the fuel rod wall and the coolant, and promote inter-subchannel mixing at the same time. Regarding the influence of the MVG on convective heat transfer enhancement, many experiments have been performed and several correlations have been developed based on the experimental data. However, the inter-subchannel mixing promoted by the MVG is not well estimated in subchannel analysis, because information on the mixing vanes is entirely missing in most subchannel codes. This paper analyzes the influence of mixing vanes on the coolant temperature distribution using the improved MVG model in subchannel analysis. The coolant temperature distributions with the MVG are analyzed, and the results show that mixing vanes lead to a more uniform temperature distribution. The performance of split vane grids under different power conditions is evaluated. The results are compared with those of spacer grids without mixing vanes, and some conclusions are drawn.

  13. New Approaches for Very Short-term Steady-State Analysis of An Electrical Distribution System with Wind Farms

    Directory of Open Access Journals (Sweden)

    Antonio Bracale

    2010-04-01

    Full Text Available Distribution networks are undergoing radical changes due to the high level of penetration of dispersed generation. Dispersed generation systems require particular attention due to their incorporation of uncertain energy sources, such as wind farms, and due to the impacts that such sources have on the planning and operation of distribution networks. In particular, the foreseeable, extensive use of wind turbine generator units in the future requires that distribution system engineers properly account for their impacts on the system. Many new technical considerations must be addressed, including protection coordination, steady-state analysis, and power quality issues. This paper deals with the very short-term, steady-state analysis of a distribution system with wind farms, for which the time horizon of interest ranges from one hour to a few hours ahead. Several wind-forecasting methods are presented in order to obtain reliable input data for the steady-state analysis. Both deterministic and probabilistic methods were considered and used in performing deterministic and probabilistic load-flow analyses. Numerical applications on a 17-bus, medium-voltage, electrical distribution system with various wind farms connected at different busbars are presented and discussed.

  14. Do cultural factors predict mammography behaviour among Korean immigrants in the USA?

    Science.gov (United States)

    Lee, Hanju; Kim, Jiyun; Han, Hae-Ra

    2009-12-01

    This paper is a report of a study of the correlates of mammogram use among Korean American women. Despite the increasing incidence of and mortality from breast cancer, Asian women in the United States of America report consistently low rates of mammography screening. A number of health beliefs and sociodemographic characteristics have been associated with mammogram participation among these women. However, studies systematically investigating cultural factors in relation to mammogram experience have been scarce. We measured screening-related health beliefs, modesty and use of Eastern medicine in 100 Korean American women in 2006. Hierarchical logistic regression was used to examine the unique contribution of the study variables, after accounting for sociodemographic characteristics. Only 51% reported past mammogram use. Korean American women who had previously had mammograms were statistically significantly older and had higher perceived benefit scores than those who had not. Perceived benefits (odds ratio = 6.3, 95% confidence interval = 2.12, 18.76) and breast cancer susceptibility (odds ratio = 3.18, 95% confidence interval = 1.06, 9.59) were statistically significant correlates of mammography experience, whereas cultural factors did not correlate. Post hoc analysis showed that, for women with some or good English skills, cultural factors were statistically significantly correlated with health beliefs and breast cancer knowledge. The findings support culturally tailored interventions with more targeted outreach and healthcare system navigation assistance for promoting mammography screening in Korean American women. Further research is needed to unravel the interplay between acculturation, cultural factors and health beliefs related to cancer screening behaviours of Korean American women.
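
    A logistic-regression step of the kind reported above can be sketched with statsmodels; the data frame below is a synthetic placeholder, only one block of predictors is fitted (a full hierarchical analysis would add blocks sequentially and compare nested models), and all variable names are illustrative assumptions.

        # Sketch of a logistic regression yielding odds ratios and 95% confidence intervals.
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(3)
        df = pd.DataFrame({
            "age": rng.normal(55, 8, 100),
            "perceived_benefits": rng.normal(0, 1, 100),
            "susceptibility": rng.normal(0, 1, 100),
        })
        logit_p = -0.5 + 0.8 * df["perceived_benefits"] + 0.5 * df["susceptibility"]
        df["had_mammogram"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

        X = sm.add_constant(df[["age", "perceived_benefits", "susceptibility"]])
        fit = sm.Logit(df["had_mammogram"], X).fit(disp=False)

        odds_ratios = np.exp(fit.params)     # exponentiated coefficients = odds ratios
        ci = np.exp(fit.conf_int())          # 95% confidence intervals on the odds-ratio scale
        print(pd.concat([odds_ratios, ci], axis=1))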

  15. Duvernay shale lithofacies distribution analysis in the West Canadian Sedimentary Basin

    Science.gov (United States)

    Zhu, Houqin; Kong, Xiangwen; Long, Huashan; Huai, Yinchao

    2018-02-01

    In the West Canadian Sedimentary Basin (WCSB), the Duvernay shale is considered to contribute most of the Canadian shale gas reserves and production. According to global shale gas exploration and development practice, reservoir properties and well completion quality are the two key factors determining shale gas economics, and both factors depend strongly on shale lithofacies. On the basis of inorganic mineralogy theory, all available thin section, X-ray diffraction, scanning electron microscope (SEM) and energy dispersive spectrometer (EDS) data were used to assist the lithofacies analysis. Gamma ray (GR), acoustic (AC), bulk density (RHOB), neutron porosity (NPHI) and photoelectric absorption cross-section index (PE) logs were selected for log response analysis of the various minerals. A representative reservoir equation was created, constrained by quantitative core analysis results, and the matrix mineral percentages of quartz, carbonate, feldspar and pyrite were calculated to classify shale lithofacies. Considering the horizontal continuity of seismic data, a rock physics model was built, and acoustic impedance integrated with core data and log data was used to predict the horizontal distribution of the different lithofacies. The results indicate that: (1) nine lithofacies can be categorized in the Duvernay shale, and (2) the horizontal distribution of the different lithofacies is quite diversified; siliceous shale mainly occurs in the Simonette area, calcareous shale is prone to develop in the vicinity of reefs, while calcareous-siliceous shale dominates in the Willesdon Green area.

  16. Phenotype Clustering of Breast Epithelial Cells in Confocal Images based on Nuclear Protein Distribution Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Long, Fuhui; Peng, Hanchuan; Sudar, Damir; Levievre, Sophie A.; Knowles, David W.

    2006-09-05

    Background: The distribution of the chromatin-associated proteins plays a key role in directing nuclear function. Previously, we developed an image-based method to quantify the nuclear distributions of proteins and showed that these distributions depended on the phenotype of human mammary epithelial cells. Here we describe a method that creates a hierarchical tree of the given cell phenotypes and calculates the statistical significance between them, based on the clustering analysis of nuclear protein distributions. Results: Nuclear distributions of nuclear mitotic apparatus protein were previously obtained for non-neoplastic S1 and malignant T4-2 human mammary epithelial cells cultured for up to 12 days. Cell phenotype was defined as S1 or T4-2 and the number of days in culture. A probabilistic ensemble approach was used to define a set of consensus clusters from the results of multiple traditional cluster analysis techniques applied to the nuclear distribution data. Cluster histograms were constructed to show how cells in any one phenotype were distributed across the consensus clusters. Grouping various phenotypes allowed us to build phenotype trees and calculate the statistical difference between each group. The results showed that non-neoplastic S1 cells could be distinguished from malignant T4-2 cells with 94.19 percent accuracy; that proliferating S1 cells could be distinguished from differentiated S1 cells with 92.86 percent accuracy; and showed no significant difference between the various phenotypes of T4-2 cells corresponding to increasing tumor sizes. Conclusion: This work presents a cluster analysis method that can identify significant cell phenotypes, based on the nuclear distribution of specific proteins, with high accuracy.

  17. Incremental Role of Mammography in the Evaluation of Gynecomastia in Men Who Have Undergone Chest CT.

    Science.gov (United States)

    Sonnenblick, Emily B; Salvatore, Mary; Szabo, Janet; Lee, Karen A; Margolies, Laurie R

    2016-08-01

    The purpose of this study was to determine whether additional breast imaging is clinically valuable in the evaluation of patients with gynecomastia incidentally observed on CT of the chest. In a retrospective analysis, 62 men were identified who had a mammographic diagnosis of gynecomastia and had also undergone CT within 8 months (median, 2 months). We compared the imaging findings of both modalities and correlated them with the clinical outcome. Gynecomastia was statistically significantly larger on mammograms than on CT images; however, there was a high level of concordance in morphologic features and distribution of gynecomastia between mammography and CT. In only one case was gynecomastia evident on mammographic but not CT images, owing to cachexia. Two of the 62 men had ductal carcinoma, which was obscured by gynecomastia. Both of these patients had symptoms suggesting malignancy. The appearance of gynecomastia on CT scans and mammograms was highly correlated. Mammography performed within 8 months of CT is unlikely to reveal cancer unless there is a suspicious clinical finding or a breast mass eccentric to the nipple. Men with clinical symptoms of gynecomastia do not need additional imaging with mammography to confirm the diagnosis if they have undergone recent cross-sectional imaging.

  18. Assessing breast cancer masking risk with automated texture analysis in full field digital mammography

    DEFF Research Database (Denmark)

    Kallenberg, Michiel Gijsbertus J; Lillholm, Martin; Diao, Pengfei

    2015-01-01

    PURPOSE The goal of this work is to develop a method to assess the risk of breast cancer masking, based on image characteristics beyond breast density. METHOD AND MATERIALS From the Dutch breast cancer screening program we collected 285 screen detected cancers, and 109 cancers that were screen...... negative and subsequently appeared as interval cancers. To obtain mammograms without cancerous tissue, we took the contralateral mammograms. We developed a novel machine learning based method called convolutional sparse autoencoder to characterize mammographic texture. The reason for focusing...... status in a five-fold cross validation. To assess the interaction of the texture scores with breast density, Volpara Density Grade (VDG) was determined for each image using Volpara, Matakina Technology, New Zealand. RESULTS We grouped women into low (VDG 1/2) versus high (VDG 3/4) dense, and low...

  19. Reliability analysis of water distribution systems under uncertainty

    International Nuclear Information System (INIS)

    Kansal, M.L.; Kumar, Arun; Sharma, P.B.

    1995-01-01

    In most of the developing countries, the Water Distribution Networks (WDN) are of intermittent type because of the shortage of safe drinking water. Failure of one or more pipelines in such cases will cause not only a fall in one or more nodal heads but also poor connectivity of the source with the various demand nodes of the system. Most previous works have used a two-step algorithm based on a pathset or cutset approach for connectivity analysis. The computations become more cumbersome when the connectivity of all demand nodes, taken together with that of the supply, is analyzed. In the present paper, network connectivity based on the concept of an Appended Spanning Tree (AST) is suggested to compute global network connectivity, which is defined as the probability of the source node being connected with all the demand nodes simultaneously. The concept of AST has distinct advantages as it attacks the problem directly rather than in an indirect way as most of the studies so far have done. Since the water distribution system is a repairable one, a general expression for pipeline availability using the failure/repair rates is considered. Furthermore, the sensitivity of the global reliability estimates to likely errors in the estimation of the failure/repair rates of the various pipelines is also studied.

  20. Harmonic Analysis of Electric Vehicle Loadings on Distribution System

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Yijun A [University of Southern California, Department of Electrical Engineering; Xu, Yunshan [University of Southern California, Department of Electrical Engineering; Chen, Zimin [University of Southern California, Department of Electrical Engineering; Peng, Fei [University of Southern California, Department of Electrical Engineering; Beshir, Mohammed [University of Southern California, Department of Electrical Engineering

    2014-12-01

    With the increasing number of Electric Vehicles (EVs), the power system is facing huge challenges from the high penetration rates of EV charging stations. Therefore, a technical study of the impact of EV charging on the distribution system is required. This paper uses PSCAD software and aims to analyze the Total Harmonic Distortion (THD) introduced by EV charging stations in power systems. The paper starts by choosing the IEEE 34 node test feeder as the distribution system, building an EV level-two charging battery model, and setting up four different testing scenarios: overhead transmission line and underground cable, industrial area, transformer, and photovoltaic (PV) system. Statistical methods are then used to analyze the characteristics of THD in the plug-in transient, plug-out transient and steady-state charging conditions associated with these four scenarios. Finally, the factors influencing THD in the different scenarios are identified, and the results lead to constructive suggestions for both EV charging station construction and customers' charging habits.
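
    The THD calculation at the core of such a study can be illustrated on a synthetic charger current waveform; the sampling rate, fundamental frequency and harmonic amplitudes below are arbitrary assumptions, and the PSCAD models themselves are not reproduced.

        # Sketch of a THD calculation: ratio of the RMS of the harmonics to the fundamental.
        import numpy as np

        fs, f0 = 10_000, 60.0                       # sampling rate (Hz), fundamental (Hz)
        t = np.arange(0, 1.0, 1 / fs)
        # fundamental plus 3rd and 5th harmonics, as a stand-in for a charger current
        i = (10 * np.sin(2 * np.pi * f0 * t)
             + 1.5 * np.sin(2 * np.pi * 3 * f0 * t)
             + 0.8 * np.sin(2 * np.pi * 5 * f0 * t))

        spectrum = np.abs(np.fft.rfft(i)) / len(i) * 2
        freqs = np.fft.rfftfreq(len(i), 1 / fs)
        fundamental = spectrum[np.argmin(np.abs(freqs - f0))]
        harmonics = [spectrum[np.argmin(np.abs(freqs - k * f0))] for k in range(2, 40)]

        thd = np.sqrt(np.sum(np.square(harmonics))) / fundamental
        print(f"THD = {100 * thd:.1f} %")           # 17% for these amplitudes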

  1. Mammographic screening practices among Chinese-Australian women.

    Science.gov (United States)

    Kwok, Cannas; Fethney, Judith; White, Kate

    2012-03-01

    To report mammographic screening practice among Chinese-Australian women, and to examine the relationship between demographic characteristics, acculturation factors (English proficiency and length of stay in Australia), cultural beliefs, and having a mammogram as recommended. Cross-sectional and descriptive. The study was conducted in 2009 in Sydney, Australia. Of 988 Chinese-Australian women over 18 years of age invited to participate in the study, 785 (79%) completed and returned the questionnaire. Of these women, 320 (40.8%) were in the target age range of 50 to 69 years. The Chinese Breast Cancer Screening Beliefs Questionnaire (CBCSB) was used as the data collection instrument. Analysis included descriptive statistics, bivariate analysis using chi-square and t tests, and logistic regression. Of the 320 women in the targeted age range of 50 to 69 years, 238 (74.4%) had a mammogram as recommended biannually. Being married or in a de facto relationship, being in the 60 to 69 age group, and speaking Cantonese at home were positively associated with women's mammographic screening practice. However, no statistically significant association between acculturation factors and having a mammogram as recommended was found. In terms of CBCSB scores, women who had mammograms as recommended had more positive attitudes toward health checkups and perceived fewer barriers to mammographic screening. Effort should be focused on specific subgroups of Chinese-Australian women in order to fully understand the barriers involved in participating in mammographic screening. Nurses can use the findings from the present study to design culturally sensitive breast cancer screening programs to encourage women's participation in mammography. © 2011 Sigma Theta Tau International.

  2. A density distribution algorithm for bone incorporating local orthotropy, modal analysis and theories of cellular solids.

    Science.gov (United States)

    Impelluso, Thomas J

    2003-06-01

    An algorithm for bone remodeling is presented which allows for both a redistribution of density and a continuous change of principal material directions for the orthotropic material properties of bone. It employs a modal analysis to add density for growth and a local effective strain based analysis to redistribute density. General re-distribution functions are presented. The model utilizes theories of cellular solids to relate density and strength. The code predicts the same general density distributions and local orthotropy as observed in reality.
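
    A generic strain-driven density update of the kind the abstract describes might look like the sketch below; the update rule, constants and the cellular-solids power-law exponent are illustrative assumptions, not the published algorithm, and the modal-analysis growth term is omitted.

        # Generic sketch of strain-driven bone density remodeling (illustrative only).
        import numpy as np

        def remodel(density, strain_energy_density, reference=0.02, rate=0.5,
                    rho_min=0.05, rho_max=1.8, dt=1.0):
            """One remodeling step: add density where the local stimulus exceeds the
            reference level, remove it where the stimulus falls short."""
            stimulus = strain_energy_density / density      # stimulus per unit mass
            density = density + dt * rate * (stimulus - reference)
            return np.clip(density, rho_min, rho_max)

        def youngs_modulus(density, E0=15e3, exponent=2.0):
            """Cellular-solids style power law relating density to stiffness (MPa);
            E0 and the exponent are assumed values."""
            return E0 * (density / 1.8) ** exponent

        rho = np.full(10, 0.8)                              # element densities (g/cm^3)
        sed = np.linspace(0.005, 0.05, 10)                  # strain energy densities (placeholders)
        for _ in range(50):
            rho = remodel(rho, sed)
        print(rho, youngs_modulus(rho))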

  3. Componential distribution analysis of food using near infrared ray image

    Science.gov (United States)

    Yamauchi, Hiroki; Kato, Kunihito; Yamamoto, Kazuhiko; Ogawa, Noriko; Ohba, Kimie

    2008-11-01

    The components of food related to "deliciousness" are usually evaluated by componential analysis, which determines the content and type of components in the food. However, componential analysis cannot analyze the measurements in detail, and the measurement is time consuming. We propose a method to measure the two-dimensional distribution of a component in food using a near infrared (IR) image. The advantage of our method is that it can visualize otherwise invisible components. Many food components have characteristic absorption and reflection of light in the IR range. The component content is measured using subtraction between two wavelengths of near IR light. In this paper, we describe a method to measure food components using near IR image processing, and we show an application that visualizes the saccharose in a pumpkin.
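
    The two-wavelength subtraction idea can be illustrated with a small sketch: a component that absorbs at one near-IR band but not at a nearby reference band appears in the normalized difference image. The band images, wavelengths and threshold below are placeholders, not measured data.

        # Sketch of two-wavelength near-IR subtraction imaging (illustrative data only).
        import numpy as np

        rng = np.random.default_rng(4)
        reference_band = rng.uniform(0.6, 0.9, size=(64, 64))      # e.g. a weakly absorbing band
        component = np.zeros((64, 64))
        component[20:40, 20:40] = 0.3                               # sugar-rich region (assumed)
        absorption_band = reference_band - component + rng.normal(0, 0.01, (64, 64))

        # Normalise each band, then subtract: high values mark the absorbing component.
        diff = (reference_band / reference_band.mean()
                - absorption_band / absorption_band.mean())
        concentration_map = np.clip(diff, 0, None)
        print("estimated component area (pixels):", int((concentration_map > 0.1).sum()))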

  4. Analysis of spatial distribution and marketing area of boutique hotels in Shanghai

    Directory of Open Access Journals (Sweden)

    Chu Xueqin

    2017-08-01

    Full Text Available Based on data collected from Google Earth and Baidu Map, we input the coordinates of the addresses of boutique hotels and five-star hotels in Shanghai, scenic spots above 3A in the urban area of Shanghai, and historical relics protection units in Shanghai, among others, into ArcGIS, which locates these units accurately. We measured distance data with the spatial analysis tools. A distribution map and a marketing area analysis of the boutique hotels were produced.

  5. Analysis of the distribution of temperature fields in the braked railway wheel

    Directory of Open Access Journals (Sweden)

    Suchánek Andrej

    2018-01-01

    Full Text Available The article deals with the determination of equivalent (reduced) stress in a braked railway wheel, based on thermal transient analysis of virtual models, which influences the characteristics of the railway wheel. Structural analysis was performed by means of the ANSYS Multiphysics program package. The thermal transient analysis determines the temperature fields that result from braking by a brake block. The applied heat flux represents the heat generated by the friction of the brake block; it is applied to a quarter model of the wheel to speed up the calculation. The analysis simulates two braking processes with subsequent cooling. The distribution of the equivalent stress was determined at selected points in the railway wheel cross section. The input parameters were taken from the thermal transient analysis, and the equivalent stresses result from the thermal load.

  6. The GIS and analysis of earthquake damage distribution of the 1303 Hongtong M=8 earthquake

    Science.gov (United States)

    Gao, Meng-Tan; Jin, Xue-Shen; An, Wei-Ping; Lü, Xiao-Jian

    2004-07-01

    The geographic information system of the 1303 Hongtong M=8 earthquake has been established. Using the spatial analysis functions of GIS, the spatial distribution characteristics of the damage and the isoseismals of the earthquake are studied. By comparison with the standard earthquake intensity attenuation relationship, an abnormal damage distribution of the earthquake is found, and the relationship of this abnormal distribution with tectonics, site conditions and basins is analyzed. The influence on ground motion of the earthquake source and of the underground structures near the source is also studied. The implications of the abnormal damage distribution for seismic zonation, earthquake-resistant design, earthquake prediction and earthquake emergency response are discussed.

  7. Field distribution analysis in deflecting structures

    Energy Technology Data Exchange (ETDEWEB)

    Paramonov, V.V. [Joint Inst. for Nuclear Research, Moscow (Russian Federation)

    2013-02-15

    Deflecting structures are now used mainly for bunch rotation in emittance exchange concepts, for bunch diagnostics and to increase the luminosity. The bunch rotation is a transformation of a particle distribution in the six-dimensional phase space. Together with the expected transformations, deflecting structures introduce distortions due to particularities - aberrations - in the deflecting field distribution. The distributions of deflecting fields are considered with respect to non-linear additions, which lead to emittance deterioration during a transformation. The deflecting field is treated as a combination of the hybrid waves HE1 and HM1. The criteria for selection and formation of deflecting structures with a minimized level of aberrations are formulated and applied to known structures. The results of the study are confirmed by comparison with results of numerical simulations.

  8. Field distribution analysis in deflecting structures

    International Nuclear Information System (INIS)

    Paramonov, V.V.

    2013-02-01

    Deflecting structures are now used mainly for bunch rotation in emittance exchange concepts, for bunch diagnostics and to increase the luminosity. The bunch rotation is a transformation of a particle distribution in the six-dimensional phase space. Together with the expected transformations, deflecting structures introduce distortions due to particularities - aberrations - in the deflecting field distribution. The distributions of deflecting fields are considered with respect to non-linear additions, which lead to emittance deterioration during a transformation. The deflecting field is treated as a combination of the hybrid waves HE1 and HM1. The criteria for selection and formation of deflecting structures with a minimized level of aberrations are formulated and applied to known structures. The results of the study are confirmed by comparison with results of numerical simulations.

  9. Detrended analysis of shower track distribution in nucleus-nucleus interactions at CERN SPS energy

    International Nuclear Information System (INIS)

    Mali, P.; Manna, S.K.; Haldar, P.K.; Mukhopadhyay, A.; Singh, G.

    2017-01-01

    We have studied the charged particle density fluctuations in 16O+Ag(Br) and 32S+Ag(Br) interactions at 200A GeV incident energy in the laboratory frame by using detrended methods. These methods can extract (multi)fractal properties of the underlying distributions after filtering out the average trend of the associated fluctuations. Multifractal parameters obtained from the data analysis are systematically compared with event samples generated by the Ultra-relativistic Quantum Molecular Dynamics (UrQMD) model, where the Bose–Einstein correlation (BEC) effect is mimicked via a charge reassignment algorithm implemented as an afterburner. Both the experimental and the simulated data are subjected to two different statistical techniques, namely the multifractal detrended fluctuation analysis (MFDFA) and the multifractal detrended moving average (MFDMA) analysis. The results indicate that, for both interactions considered, the pseudorapidity distributions of the shower tracks are multifractal in nature. Qualitatively, both methods of analysis and both interactions considered result in similar behavior of the multifractal parameters. We do, however, notice significant quantitative differences in certain cases.
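
    A minimal MFDFA sketch is given below to make the procedure concrete: build the profile, detrend fixed-size segments with a polynomial, form the q-th order fluctuation function and take the log-log slope as the generalized Hurst exponent h(q). The input here is white noise (so h(q) stays near 0.5), not shower-track pseudorapidity data, and the scale and q ranges are arbitrary choices.

        # Minimal multifractal detrended fluctuation analysis (MFDFA) sketch.
        import numpy as np

        def mfdfa(signal, scales, q_values, order=1):
            profile = np.cumsum(signal - np.mean(signal))
            h = []
            for q in q_values:
                log_f = []
                for s in scales:
                    n_seg = len(profile) // s
                    segs = profile[: n_seg * s].reshape(n_seg, s)
                    x = np.arange(s)
                    # variance of each segment around its polynomial trend
                    var = np.array([np.var(seg - np.polyval(np.polyfit(x, seg, order), x))
                                    for seg in segs])
                    if q == 0:
                        f = np.exp(0.5 * np.mean(np.log(var)))
                    else:
                        f = np.mean(var ** (q / 2.0)) ** (1.0 / q)
                    log_f.append(np.log(f))
                h.append(np.polyfit(np.log(scales), log_f, 1)[0])   # slope = h(q)
            return np.array(h)

        rng = np.random.default_rng(5)
        series = rng.normal(size=4096)                               # white-noise placeholder
        scales = np.array([16, 32, 64, 128, 256])
        q_values = np.array([-4, -2, 0, 2, 4], dtype=float)
        print(mfdfa(series, scales, q_values))                       # roughly 0.5 for all q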

  10. LDV measurement, flow visualization and numerical analysis of flow distribution in a close-coupled catalytic converter

    International Nuclear Information System (INIS)

    Kim, Duk Sang; Cho, Yong Seok

    2004-01-01

    Results from an experimental study of flow distribution in a close-coupled catalytic converter (CCC) are presented. The experiments were carried out under steady and transient flow conditions with a flow measurement system specially designed for this study. A pitot tube was used to measure the flow distribution at the exit of the first monolith. The flow distribution of the CCC was also measured by an LDV system and by flow visualization, and results from numerical analysis are also presented. The experimental results showed that the flow uniformity index decreases as the flow Reynolds number increases. Under steady flow conditions, the flow through each exhaust pipe produced flow concentrations in a specific region of the CCC inlet. The transient test results showed that the flows through the exhaust pipes, following the engine firing order, interacted with each other so that the flow distribution became more uniform. The results of the numerical analysis agreed qualitatively with the experimental results and helped explain the flow in the entry region of the CCC.
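
    The record does not state which uniformity index was used; one commonly quoted definition for catalyst inlet flow is sketched below under that assumption, where a value of 1 corresponds to perfectly uniform flow.

        # One commonly used flow uniformity index (an assumption, not necessarily the
        # definition used in the study above):
        #   gamma = 1 - (1 / (2 n)) * sum(|v_i - v_mean|) / v_mean
        import numpy as np

        def uniformity_index(velocities):
            v = np.asarray(velocities, dtype=float)
            return 1.0 - np.abs(v - v.mean()).sum() / (2.0 * v.size * v.mean())

        uniform = np.full(100, 5.0)                                   # m/s over 100 sampling points
        skewed = np.concatenate([np.full(50, 8.0), np.full(50, 2.0)])
        print(uniformity_index(uniform), uniformity_index(skewed))    # 1.0 vs 0.7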

  11. Frequency distribution analysis of the long-lived beta-activity of air dust

    International Nuclear Information System (INIS)

    Bunzl, K.; Hoetzl, H.; Winkler, R.

    1977-01-01

    In order to compare the average annual beta activities of air dust, a frequency distribution analysis of the data was carried out to select a representative quantity for the average value of each data group. It was found that the data to be analysed were consistent with a log-normal frequency distribution, and therefore the median of the beta activity of each year was calculated, as the representative average, as the antilog of the arithmetic mean of the logarithms, log x, of the analytical values x. The 95% confidence limits were also obtained. The quantities thus calculated are summarized in tabular form. (U.K.)
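
    The representative-average calculation described above reduces to a few lines: take logarithms, average them, and transform back, with confidence limits from the t distribution of the log values. The synthetic weekly activities below are placeholders for the measured data.

        # Median of log-normally distributed activities as the antilog of the mean of
        # the logarithms, with approximate 95% confidence limits.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(6)
        activity = rng.lognormal(mean=-1.0, sigma=0.6, size=52)   # placeholder weekly values

        log_x = np.log10(activity)
        n = log_x.size
        m, s = log_x.mean(), log_x.std(ddof=1)
        t = stats.t.ppf(0.975, df=n - 1)

        median = 10 ** m                                  # antilog of the mean of the logs
        ci = (10 ** (m - t * s / np.sqrt(n)), 10 ** (m + t * s / np.sqrt(n)))
        print(median, ci)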

  12. COVAL, Compound Probability Distribution for Function of Probability Distribution

    International Nuclear Information System (INIS)

    Astolfi, M.; Elbaz, J.

    1979-01-01

    1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions
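
    The class of problem COVAL addresses (the probability distribution of a function of random variables, for example the margin between strength and load) can be illustrated with plain Monte Carlo sampling; note that this only illustrates the problem, not COVAL's numerical transformation method, and the chosen distributions are arbitrary.

        # Monte Carlo illustration of propagating distributions through a function.
        import numpy as np

        rng = np.random.default_rng(7)
        n = 1_000_000
        load = rng.normal(100.0, 15.0, n)                 # random load (e.g. kN)
        strength = rng.lognormal(np.log(160.0), 0.10, n)  # random strength

        margin = strength - load                          # function of the random variables
        failure_probability = np.mean(margin < 0)
        print("P(failure) approx.:", failure_probability)
        print("5th/95th percentiles of the margin:", np.percentile(margin, [5, 95]))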

  13. Sensitivity analysis of power depression and axial power factor effect on fuel pin to temperature and related properties distribution

    International Nuclear Information System (INIS)

    Suwardi, S.

    2001-01-01

    The presented paper is a preliminary step to evaluate the effect of radial and axial distribution of power generation on thermal analysis of whole fuel pin model with large L/D ratio. The model takes into account both radial and axial distribution of power generation due to power depression and core geometry, temperature and microstructure dependent on thermal conductivity. The microstructure distribution and the gap conductance for typical steady-state situation are given for the sensitivity analysis. The temperature and thermal conductivity distribution along the radial and axial directions obtained by different power distribution is used to indicate the sensitivity of power depression and power factor on thermal aspect. The evaluation is made for one step of incremental time and steady state approach is used. The analysis has been performed using a finite element-finite difference model. The result for typical reactor fuel shows that the sensitivity is too important to be omitted in thermal model

  14. Advanced Hydroinformatic Techniques for the Simulation and Analysis of Water Supply and Distribution Systems

    OpenAIRE

    Herrera, Manuel; Meniconi, Silvia; Alvisi, Stefano; Izquierdo, Joaquin

    2018-01-01

    This document is intended to be a presentation of the Special Issue “Advanced Hydroinformatic Techniques for the Simulation and Analysis of Water Supply and Distribution Systems”. The final aim of this Special Issue is to propose a suitable framework supporting insightful hydraulic mechanisms to aid the decision-making processes of water utility managers and practitioners. Its 18 peer-reviewed articles present as varied topics as: water distribution system design, optimization of network perf...

  15. A two-component generalized extreme value distribution for precipitation frequency analysis

    Czech Academy of Sciences Publication Activity Database

    Rulfová, Zuzana; Buishand, A.; Roth, M.; Kyselý, Jan

    2016-01-01

    Vol. 534, March (2016), pp. 659-668 ISSN 0022-1694 R&D Projects: GA ČR(CZ) GA14-18675S Institutional support: RVO:68378289 Keywords: precipitation extremes * two-component extreme value distribution * regional frequency analysis * convective precipitation * stratiform precipitation * Central Europe Subject RIV: DG - Atmosphere Sciences, Meteorology Impact factor: 3.483, year: 2016 http://www.sciencedirect.com/science/article/pii/S0022169416000500
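
    Schematically, a two-component extreme value analysis can be sketched by fitting separate GEV distributions to annual maxima of the two precipitation types and combining their CDFs under an independence assumption; this is a simplified illustration with synthetic data, not the method of the cited paper.

        # GEV fits to two precipitation components and a combined annual-maximum CDF.
        import numpy as np
        from scipy import stats

        conv_maxima = stats.genextreme.rvs(-0.1, loc=30, scale=8, size=60, random_state=1)
        strat_maxima = stats.genextreme.rvs(0.1, loc=20, scale=5, size=60, random_state=2)

        conv_fit = stats.genextreme.fit(conv_maxima)      # (shape, loc, scale)
        strat_fit = stats.genextreme.fit(strat_maxima)

        # If the components are independent, the annual maximum exceeds x only when
        # at least one component does, so the combined CDF is the product of the CDFs.
        x0 = 60.0                                         # example threshold (mm/day, assumed)
        p_annual = stats.genextreme.cdf(x0, *conv_fit) * stats.genextreme.cdf(x0, *strat_fit)
        print("return period of", x0, "mm/day:", 1.0 / (1.0 - p_annual), "years")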

  16. Evaluation of an improved algorithm for producing realistic 3D breast software phantoms: Application for mammography

    International Nuclear Information System (INIS)

    Bliznakova, K.; Suryanarayanan, S.; Karellas, A.; Pallikarakis, N.

    2010-01-01

    Purpose: This work presents an improved algorithm for the generation of 3D breast software phantoms and its evaluation for mammography. Methods: The improved methodology has evolved from a previously presented 3D noncompressed breast modeling method used for the creation of breast models of different size, shape, and composition. The breast phantom is composed of breast surface, duct system and terminal ductal lobular units, Cooper's ligaments, lymphatic and blood vessel systems, pectoral muscle, skin, 3D mammographic background texture, and breast abnormalities. The key improvement is the development of a new algorithm for 3D mammographic texture generation. Simulated images of the enhanced 3D breast model without lesions were produced by simulating mammographic image acquisition and were evaluated subjectively and quantitatively. For evaluation purposes, a database with regions of interest taken from simulated and real mammograms was created. Four experienced radiologists participated in a visual subjective evaluation trial, in which they judged the quality of the simulated mammograms produced with the new algorithm compared to mammograms obtained with the old modeling approach. In addition, extensive quantitative evaluation included power spectral analysis and calculation of the fractal dimension, skewness, and kurtosis of simulated and real mammograms from the database. Results: The results from the subjective evaluation strongly suggest that the new methodology for mammographic breast texture creates improved breast models compared to the old approach. Parameters calculated on the simulated images, such as the β exponent deduced from the power-law spectral analysis and the fractal dimension, are similar to those calculated on real mammograms. The results for the kurtosis and skewness are also in good agreement with those calculated from clinical images. Comparison with similar calculations published in the literature showed good agreement in the majority of cases. Conclusions: The
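
    The power-law spectral analysis mentioned above can be sketched as a radially averaged 2D power spectrum followed by a log-log fit of P(f) ~ 1/f^beta; the input below is white noise (beta near 0), whereas values around 3 are typically reported for real mammographic texture, and the frequency range of the fit is an arbitrary choice.

        # Radially averaged power spectrum and beta exponent of a texture image.
        import numpy as np

        rng = np.random.default_rng(9)
        img = rng.normal(size=(256, 256))                       # placeholder texture region

        power = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
        cy, cx = np.array(power.shape) // 2
        y, x = np.indices(power.shape)
        radius = np.hypot(y - cy, x - cx).astype(int)

        # radial average of the power spectrum
        radial_power = (np.bincount(radius.ravel(), weights=power.ravel())
                        / np.bincount(radius.ravel()))

        freqs = np.arange(1, 120)                               # skip DC, stay inside Nyquist
        beta = -np.polyfit(np.log(freqs), np.log(radial_power[freqs]), 1)[0]
        print("beta exponent:", beta)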

  17. Depth distribution analysis of martensitic transformations in Xe implanted austenitic stainless steel

    International Nuclear Information System (INIS)

    Johnson, E.; Johansen, A.; Sarholt-Kristensen, L.; Chechenin, N.G.; Grabaek, L.; Bohr, J.

    1988-01-01

    In this work we present results from a depth distribution analysis of the martensitic phase change occurring in Xe implanted single crystals of austenitic stainless steel. Analysis was done by 'in situ' RBS/channeling analysis, X-ray diffraction and cross-section transmission electron microscopy (XTEM) of the implanted surface. It is found that the martensitic transformation of the surface layer occurs for fluences above 1x10^20 m^-2. The thickness of the transformed layer increases with fluence to ≅ 150 nm at 1x10^21 m^-2, which far exceeds the range plus straggling of the implanted Xe as calculated by the TRIM computer simulation code. Simulations using the MARLOWE code indicate that the thickness of the transformed layer coincides with the range of the small fraction of ions channeled under random implantation conditions. Using cross sectional TEM on the Xe implanted crystals, the depth distribution of gas inclusions and defects can be directly observed. Using X-ray diffraction on implanted single crystals, the solid epitaxial nature of the Xe inclusions, induced prior to the martensitic transformation, was established. The lattice constant obtained from the broad diffraction peak indicates that the pressure in the inclusions is ≅ 5 GPa. (orig./BHO)

  18. A New Wind Turbine Generating System Model for Balanced and Unbalanced Distribution Systems Load Flow Analysis

    Directory of Open Access Journals (Sweden)

    Ahmet Koksoy

    2018-03-01

    Full Text Available Wind turbine generating systems (WTGSs), which are conventionally connected to high voltage transmission networks, have frequently been employed as distributed generation units in today’s distribution networks. In practice, the distribution networks always have unbalanced bus voltages and line currents due to uneven distribution of single or double phase loads over three phases and asymmetry of the lines, etc. Accordingly, in this study, for the load flow analysis of the distribution networks, Conventional Fixed speed Induction Generator (CFIG) based WTGS, one of the most widely used WTGS types, is modelled under unbalanced voltage conditions. The Developed model has active and reactive power expressions in terms of induction machine impedance parameters, terminal voltages and input power. The validity of the Developed model is confirmed with the experimental results obtained in a test system. The results of the slip calculation based phase-domain model (SCP Model), which was previously proposed in the literature for CFIG based WTGSs under unbalanced voltages, are also given for the comparison. Finally, the Developed model and the SCP model are implemented in the load flow analysis of the IEEE 34 bus test system with the CFIG based WTGSs and unbalanced loads. Thus, it is clearly pointed out that the results of the load flow analysis implemented with both models are very close to each other, and the Developed model is computationally more efficient than the SCP model.

  19. Evaluation of a post-analysis method for cumulative dose distribution in stereotactic body radiotherapy

    International Nuclear Information System (INIS)

    Imae, Toshikazu; Takenaka, Shigeharu; Saotome, Naoya

    2016-01-01

    The purpose of this study was to evaluate a post-analysis method for the cumulative dose distribution in stereotactic body radiotherapy (SBRT) using volumetric modulated arc therapy (VMAT). VMAT is capable of providing respiratory signals derived from projection images and machine parameters based on machine logs during VMAT delivery. Dose distributions were reconstructed from the respiratory signals and machine parameters for conditions in which the respiratory signals were used without division, or divided into 4 and 10 phases. The dose distribution of each respiratory phase was calculated on the planned four-dimensional CT (4DCT). Summation of the dose distributions was carried out using deformable image registration (DIR), and the cumulative dose distributions were compared with those of the corresponding plans. Without division, dose differences between the cumulative distribution and the plan were not significant. When the respiratory signals were divided, dose differences were observed, with overdose in the cranial region and underdose in the caudal region of the planning target volume (PTV). Differences between 4 and 10 phases were not significant. The present method was feasible for evaluating the cumulative dose distribution in VMAT-SBRT using 4DCT and DIR. (author)

  20. Determinants of the distribution and concentration of biogas production in Germany. A spatial econometric analysis

    International Nuclear Information System (INIS)

    Scholz, Lukas

    2015-01-01

    The biogas production in Germany is characterized by a heterogeneous distribution and the formation of regional centers. In the present study the determinants of the spatial distribution and concentration are analyzed with methods of spatial statistics and spatial econometrics. In addition to the consideration of "classic" site factors of agricultural production, the analysis here focuses on the possible relevance of agglomeration effects. The results of the work contribute to a better understanding of the regional distribution and concentration of the biogas production in Germany.