WorldWideScience

Sample records for previous objective analysis

  1. Determining root correspondence between previously and newly detected objects

    Science.gov (United States)

    Paglieroni, David W.; Beer, N. Reginald

    2014-06-17

    A system that applies attribute and topology based change detection to networks of objects that were detected on previous scans of a structure, roadway, or area of interest. The attributes capture properties or characteristics of the previously detected objects, such as location, time of detection, size, elongation, orientation, etc. The topology of the network of previously detected objects is maintained in a constellation database that stores attributes of previously detected objects and implicitly captures the geometrical structure of the network. A change detection system detects change by comparing the attributes and topology of new objects detected on the latest scan to the constellation database of previously detected objects.
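
    The attribute-comparison step this record describes can be sketched as a nearest-neighbour match of new detections against the constellation database. The sketch below is a minimal illustration under assumed simplifications (a single location attribute, an arbitrary distance threshold, invented function names); it is not the patented method, which also compares the full attribute set and network topology.

```python
import math

def match_objects(previous, new, max_dist=5.0):
    """Classify newly detected objects against previously detected ones:
    a new object within max_dist of an unmatched previous object is
    'persistent'; otherwise it is reported as a change ('new').
    Previous objects left unmatched are reported as 'missing'."""
    unmatched = set(range(len(previous)))
    changes = {"new": [], "persistent": []}
    for obj in new:
        best, best_d = None, max_dist
        for i in unmatched:
            d = math.dist(obj["loc"], previous[i]["loc"])
            if d < best_d:
                best, best_d = i, d
        if best is None:
            changes["new"].append(obj)          # a change: object not seen before
        else:
            unmatched.discard(best)             # matched; same object persists
            changes["persistent"].append(obj)
    changes["missing"] = [previous[i] for i in sorted(unmatched)]
    return changes
```

    A production system would compare the whole attribute vector (size, elongation, orientation, detection time) and the implicit geometric structure of the network, not location alone.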

  2. Treatment of Previously Treated Facial Capillary Malformations: Results of Single-Center Retrospective Objective 3-Dimensional Analysis of the Efficacy of Large Spot 532 nm Lasers.

    Science.gov (United States)

    Kwiek, Bartłomiej; Ambroziak, Marcin; Osipowicz, Katarzyna; Kowalewski, Cezary; Rożalski, Michał

    2018-06-01

    Current treatment of facial capillary malformations (CM) has limited efficacy. To assess the efficacy of large spot 532 nm lasers for the treatment of previously treated facial CM with the use of 3-dimensional (3D) image analysis. Forty-three white patients aged 6 to 59 years were included in this study. Patients had 3D photography performed before and after treatment with a 532 nm Nd:YAG laser with large spot and contact cooling. Objective analysis of percentage improvement, based on 3D digital assessment of combined color and area improvement (global clearance effect [GCE]), was performed. The median maximal improvement achieved during the treatment (GCE) was 59.1%. The mean number of laser procedures required to achieve this improvement was 6.2 (range 1-16). Improvement of a minimum of 25% (GCE25) was achieved by 88.4% of patients, a minimum of 50% (GCE50) by 61.1%, a minimum of 75% (GCE75) by 25.6%, and a minimum of 90% (GCE90) by 4.6%. Patients previously treated with pulsed dye lasers had a significantly weaker response than those treated with other modalities (GCE 37.3% vs 61.8%, respectively). A large spot 532 nm laser is effective in previously treated patients with facial CM.

  3. Attribute and topology based change detection in a constellation of previously detected objects

    Science.gov (United States)

    Paglieroni, David W.; Beer, Reginald N.

    2016-01-19

    A system that applies attribute and topology based change detection to networks of objects that were detected on previous scans of a structure, roadway, or area of interest. The attributes capture properties or characteristics of the previously detected objects, such as location, time of detection, size, elongation, orientation, etc. The topology of the network of previously detected objects is maintained in a constellation database that stores attributes of previously detected objects and implicitly captures the geometrical structure of the network. A change detection system detects change by comparing the attributes and topology of new objects detected on the latest scan to the constellation database of previously detected objects.

  4. Functional Object Analysis

    DEFF Research Database (Denmark)

    Raket, Lars Lau

    We propose a direction in the field of statistics which we will call functional object analysis. This subfield considers the analysis of functional objects defined on continuous domains. In this setting we will focus on model-based statistics, with a particular emphasis on mixed-effect formulations, where the observed functional signal is assumed to consist of both fixed and random functional effects. This thesis takes the initial steps toward the development of likelihood-based methodology for functional objects. We first consider analysis of functional data defined on high...

  5. Numerical Analysis Objects

    Science.gov (United States)

    Henderson, Michael

    1997-08-01

    The Numerical Analysis Objects project (NAO) is a project in the Mathematics Department of IBM's TJ Watson Research Center. While there are plenty of numerical tools available today, it is not an easy task to combine them into a custom application. NAO is directed at the dual problems of building applications from a set of tools, and creating those tools. There are several "reuse" projects, which focus on the problems of identifying and cataloging tools. NAO is directed at the specific context of scientific computing. Because the type of tools is restricted, problems such as tools with incompatible data structures for input and output, and dissimilar interfaces to tools which solve similar problems can be addressed. The approach we've taken is to define interfaces to those objects used in numerical analysis, such as geometries, functions and operators, and to start collecting (and building) a set of tools which use these interfaces. We have written a class library (a set of abstract classes and implementations) in C++ which demonstrates the approach. Besides the classes, the class library includes "stub" routines which allow the library to be used from C or Fortran, and an interface to a Visual Programming Language. The library has been used to build a simulator for petroleum reservoirs, using a set of tools for discretizing nonlinear differential equations that we have written, and includes "wrapped" versions of packages from the Netlib repository. Documentation can be found on the Web at "http://www.research.ibm.com/nao". I will describe the objects and their interfaces, and give examples ranging from mesh generation to solving differential equations.
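
    The interface-driven design the NAO record describes (tools written against shared interfaces for functions, geometries, and operators) can be illustrated with a minimal sketch. The original library was a C++ class library; the Python below and all class names in it are hypothetical stand-ins, not the actual NAO API.

```python
from abc import ABC, abstractmethod

class Function(ABC):
    """Abstract interface for a scalar function of one variable."""
    @abstractmethod
    def evaluate(self, x: float) -> float: ...

class Operator(ABC):
    """Abstract interface for an operator acting on a Function."""
    @abstractmethod
    def apply(self, f: Function, x: float) -> float: ...

class Polynomial(Function):
    """One concrete Function; coeffs[i] multiplies x**i."""
    def __init__(self, coeffs):
        self.coeffs = coeffs
    def evaluate(self, x):
        return sum(c * x**i for i, c in enumerate(self.coeffs))

class CentralDifference(Operator):
    """One concrete Operator: approximates f'(x). Any tool written
    against the Operator interface can use it without knowing the
    concrete Function type -- the interoperability goal stated above."""
    def __init__(self, h=1e-6):
        self.h = h
    def apply(self, f, x):
        return (f.evaluate(x + self.h) - f.evaluate(x - self.h)) / (2 * self.h)
```

    The point of the design is that a discretizer or solver depends only on the abstract interfaces, so tools with otherwise incompatible data structures can be composed.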

  6. Per Object statistical analysis

    DEFF Research Database (Denmark)

    2008-01-01

    of a specific class in turn, and uses a pair of PPO stages to derive the statistics and then assign them to the objects' Object Variables. It may be that this could all be done in some other, simpler way, but several other ways that were tried did not succeed. The procedure output has been tested against...

  7. An unbinding problem? The disintegration of visible, previously attended objects does not attract attention.

    Science.gov (United States)

    Wolfe, Jeremy M; Oliva, Aude; Butcher, Serena J; Arsenio, Helga C

    2002-01-01

    In seven experiments, observers searched for a scrambled object among normal objects. The critical comparison was between repeated search in which the same set of stimuli remained present in fixed positions in the display for many (>100) trials and unrepeated conditions in which new stimuli were presented on each trial. In repeated search conditions, observers monitored an essentially stable display for the disruption of a clearly visible object. This is an extension of repeated search experiments in which subjects search a fixed set of items for different targets on each trial (Wolfe, Klempen, & Dahlen, 2000) and can be considered as a form of a "change blindness" task. The unrepeated search was very inefficient, showing that a scrambled object does not "pop-out" among intact objects (or vice versa). Interestingly, the repeated search condition was just as inefficient, as if participants had to search for the scrambled target even after extensive experience with the specific change in the specific scene. The results suggest that the attentional processes involved in searching for a target in a novel scene may be very similar to those used to confirm the presence of a target in a familiar scene.

  8. Objective-oriented financial analysis introduction

    Directory of Open Access Journals (Sweden)

    Dessislava Kostova-Pickett

    2018-02-01

    The practice of financial analysis has been immeasurably strengthened in recent years thanks to the ongoing evolution of computerized approaches in the form of spreadsheets and computer-based financial models of different types. These devices have not only relieved the analyst's computational burden, but also opened up a wide range of analyses and sensitivity studies of alternatives that were previously not possible. The main potential for object-oriented financial analysis consists in enormously expanding the analyst's capabilities through an online knowledge and information interface that has not yet been achieved through existing methods and software packages.

  9. Analysis of previous screening examinations for patients with breast cancer

    International Nuclear Information System (INIS)

    Lee, Eun Hye; Cha, Joo Hee; Han, Dae Hee; Choi, Young Ho; Hwang, Ki Tae; Ryu, Dae Sik; Kwak, Jin Ho; Moon, Woo Kyung

    2007-01-01

    We wanted to improve the quality of subsequent screening by reviewing the previous screening examinations of breast cancer patients. Twenty-four breast cancer patients who had undergone previous screening were enrolled. All 24 had mammograms and 15 also had sonograms. We retrospectively reviewed the screening examinations according to the BI-RADS criteria and categorized the results into false negative, true negative, true positive and occult cancers. We also categorized the causes of false negative cancers into misperception, misinterpretation and technical factors, and then analyzed the contributing factors. Review of the previous screening revealed 66.7% (16/24) false negative, 25.0% (6/24) true negative, and 8.3% (2/24) true positive cancers. False negative cancers were caused by the mammogram in 56.3% (9/16) and by the sonogram in 43.7% (7/16) of cases. For the false negative cases, all misperceptions were related to mammograms and were attributed to dense breasts, lesions located at the edge of the glandular tissue or the image, and findings seen on one view only. Almost all misinterpretations were related to sonograms and were attributed to loose application of the final assessment. To improve the quality of breast screening, it is essential to overcome the main causes of false negative examinations, including misperception and misinterpretation. We need systematic education and strict application of the final assessment categories of BI-RADS. For effective communication among physicians, it is also necessary to properly educate them about BI-RADS.

  10. Proteomics Analysis Reveals Previously Uncharacterized Virulence Factors in Vibrio proteolyticus

    Directory of Open Access Journals (Sweden)

    Ann Ray

    2016-07-01

    Members of the genus Vibrio include many pathogens of humans and marine animals that share genetic information via horizontal gene transfer. Hence, the Vibrio pan-genome carries the potential to establish new pathogenic strains by sharing virulence determinants, many of which have yet to be characterized. Here, we investigated the virulence properties of Vibrio proteolyticus, a Gram-negative marine bacterium previously identified as part of the Vibrio consortium isolated from diseased corals. We found that V. proteolyticus causes actin cytoskeleton rearrangements followed by cell lysis in HeLa cells in a contact-independent manner. In search of the responsible virulence factor, we determined the V. proteolyticus secretome. This proteomics approach revealed various putative virulence factors, including active type VI secretion systems and effectors with virulence toxin domains; however, these type VI secretion systems were not responsible for the observed cytotoxic effects. Further examination of the V. proteolyticus secretome led us to hypothesize and subsequently demonstrate that a secreted hemolysin, belonging to a previously uncharacterized clan of the leukocidin superfamily, was the toxin responsible for the V. proteolyticus-mediated cytotoxicity in both HeLa cells and macrophages. Clearly, there remains an armory of yet-to-be-discovered virulence factors in the Vibrio pan-genome that will undoubtedly provide a wealth of knowledge on how a pathogen can manipulate host cells.

  11. Sacrococcygeal pilonidal disease: analysis of previously proposed risk factors

    Directory of Open Access Journals (Sweden)

    Ali Harlak

    2010-01-01

    PURPOSE: Sacrococcygeal pilonidal disease is a source of one of the most common surgical problems among young adults. While male gender, obesity, occupations requiring sitting, deep natal clefts, excessive body hair, poor body hygiene and excessive sweating are described as the main risk factors for this disease, most of these need to be verified with a clinical trial. The present study aimed to evaluate the value and effect of these factors on pilonidal disease. METHOD: Previously proposed main risk factors were evaluated in a prospective case control study that included 587 patients with pilonidal disease and 2,780 healthy control patients. RESULTS: Stiffness of body hair, number of baths and time spent seated per day were the three most predictive risk factors. Adjusted odds ratios were 9.23, 6.33 and 4.03, respectively (p<0.001). With an adjusted odds ratio of 1.3 (p<0.001), body mass index was another risk factor. Family history was not statistically different between the groups and there was no specific occupation associated with the disease. CONCLUSIONS: Hairy people who sit down for more than six hours a day and who take a bath two or fewer times per week have a 219-fold increased risk of sacrococcygeal pilonidal disease compared with those without these risk factors. For people with a great deal of hair, there is a greater need to clean the intergluteal sulcus. People who engage in work that requires sitting for long periods of time should choose more comfortable seats and should also try to stand whenever possible.

  12. Radionuclides in Bayer process residues: previous analysis for radiological protection

    International Nuclear Information System (INIS)

    Cuccia, Valeria; Rocha, Zildete; Oliveira, Arno H. de

    2011-01-01

    Naturally occurring radionuclides are present in many natural resources. Human activities may enhance concentrations of radionuclides and/or the potential for exposure to naturally occurring radioactive material (NORM). Industrial residues containing radionuclides have been receiving considerable global attention because of the large amounts of NORM-containing wastes and the potential long-term risks of long-lived radionuclides. As part of this global concern, this work focuses on the characterization of radioactivity in the main residues of the Bayer process for alumina production: red mud and sand samples. Usually, the residues of the Bayer process are collectively named red mud; however, in the industry where the samples were collected, there is an additional separation of the residues into sand and red mud. The analytical techniques used were gamma spectrometry (HPGe detector) and neutron activation analysis. The concentrations of radionuclides are higher in the red mud than in the sand. These solid residues show enhanced activity concentrations when compared to bauxite. Further uses of the residues as building material must be evaluated more thoroughly from the radiological point of view, owing to their potential to enhance radiological exposure, especially through radon emission. (author)

  13. The association between subjective memory complaint and objective cognitive function in older people with previous major depression.

    Science.gov (United States)

    Chu, Chung-Shiang; Sun, I-Wen; Begum, Aysha; Liu, Shen-Ing; Chang, Ching-Jui; Chiu, Wei-Che; Chen, Chin-Hsin; Tang, Hwang-Shen; Yang, Chia-Li; Lin, Ying-Chin; Chiu, Chih-Chiang; Stewart, Robert

    2017-01-01

    The goal of this study is to investigate associations between subjective memory complaint and objective cognitive performance in older people with previous major depression-a high-risk sample for cognitive impairment and later dementia. A cross-sectional study was carried out in people aged 60 or over with previous major depression but not fulfilling current major depression criteria according to DSM-IV-TR. People with dementia or Mini-Mental State Examination score less than 17 were excluded. Subjective memory complaint was defined on the basis of a score ≧4 on the subscale of Geriatric Mental State schedule, a maximum score of 8. Older people aged 60 or over without any psychiatric diagnosis were enrolled as healthy controls. Cognitive function was evaluated using a series of cognitive tests assessing verbal memory, attention/speed, visuospatial function, verbal fluency, and cognitive flexibility in all participants. One hundred and thirteen older people with previous major depression and forty-six healthy controls were enrolled. Subjective memory complaint was present in more than half of the participants with depression history (55.8%). Among those with major depression history, subjective memory complaint was associated with lower total immediate recall and delayed verbal recall scores after adjustment. The associations between subjective memory complaint and worse memory performance were stronger in participants with lower depressive symptoms (Hamilton Depression Rating Scale score<7). The results suggest subjective memory complaint may be a valid appraisal of memory performance in older people with previous major depression and consideration should be given to more proactive assessment and follow-up in these clinical samples.

  14. The association between subjective memory complaint and objective cognitive function in older people with previous major depression.

    Directory of Open Access Journals (Sweden)

    Chung-Shiang Chu

    The goal of this study is to investigate associations between subjective memory complaint and objective cognitive performance in older people with previous major depression-a high-risk sample for cognitive impairment and later dementia. A cross-sectional study was carried out in people aged 60 or over with previous major depression but not fulfilling current major depression criteria according to DSM-IV-TR. People with dementia or Mini-Mental State Examination score less than 17 were excluded. Subjective memory complaint was defined on the basis of a score ≧4 on the subscale of Geriatric Mental State schedule, a maximum score of 8. Older people aged 60 or over without any psychiatric diagnosis were enrolled as healthy controls. Cognitive function was evaluated using a series of cognitive tests assessing verbal memory, attention/speed, visuospatial function, verbal fluency, and cognitive flexibility in all participants. One hundred and thirteen older people with previous major depression and forty-six healthy controls were enrolled. Subjective memory complaint was present in more than half of the participants with depression history (55.8%). Among those with major depression history, subjective memory complaint was associated with lower total immediate recall and delayed verbal recall scores after adjustment. The associations between subjective memory complaint and worse memory performance were stronger in participants with lower depressive symptoms (Hamilton Depression Rating Scale score<7). The results suggest subjective memory complaint may be a valid appraisal of memory performance in older people with previous major depression and consideration should be given to more proactive assessment and follow-up in these clinical samples.

  15. Object-sensitive Type Analysis of PHP

    NARCIS (Netherlands)

    Van der Hoek, Henk Erik; Hage, J

    2015-01-01

    In this paper we develop an object-sensitive type analysis for PHP, based on an extension of the notion of monotone frameworks to deal with the dynamic aspects of PHP, and following the framework of Smaragdakis et al. for object-sensitive analysis. We consider a number of instantiations of the

  16. Object-oriented analysis and design

    CERN Document Server

    Deacon, John

    2005-01-01

    John Deacon’s in-depth, highly pragmatic approach to object-oriented analysis and design demonstrates how to lay the foundations for developing the best possible software. Students will learn how to ensure that analysis and design remain focused and productive. By working through the book, they will gain a solid working knowledge of best practices in software development.

  17. Neutron activation analysis of limestone objects

    International Nuclear Information System (INIS)

    Meyers, P.; Van Zelst, L.

    1977-01-01

    The elemental composition of samples from limestone objects was determined by neutron activation analysis to investigate whether this technique can be used to distinguish between objects made of limestone from different sources. Samples weighing between 0.2 and 2 grams were obtained by drilling from a series of ancient Egyptian and medieval Spanish objects. Analysis was performed on aliquots varying in weight from 40 to 100 milligrams. The following elements were determined quantitatively: Na, K, Rb, Cs, Ba, Sc, La, Ce, Sm, Eu, Hf, Th, Ta, Cr, Mn, Fe, Co and Zn. The data on Egyptian limestones indicate that, because of the inhomogeneous nature of the stone, 0.2-2 gram samples may not be representative of an entire object. Nevertheless, multivariate statistical methods produced a clear distinction between objects originating from the Luxor area (ancient Thebes) and objects found north of Luxor. The Spanish limestone studied appeared to be more homogeneous. Samples from stylistically related objects have similar elemental compositions, while relatively large differences were observed between objects having no relationship other than the common provenance of medieval Spain. (orig.)

  18. Automated analysis of objective-prism spectra

    International Nuclear Information System (INIS)

    Hewett, P.C.; Irwin, M.J.; Bunclark, P.; Bridgeland, M.T.; Kibblewhite, E.J.; Smith, M.G.

    1985-01-01

    A fully automated system for the location, measurement and analysis of large numbers of low-resolution objective-prism spectra is described. The system is based on the APM facility at the University of Cambridge, and allows processing of objective-prism, grens or grism data. Particular emphasis is placed on techniques to obtain the maximum signal-to-noise ratio from the data, both in the initial spectral estimation procedure and for subsequent feature identification. Comparison of a high-quality visual catalogue of faint quasar candidates with an equivalent automated sample demonstrates the ability of the APM system to identify all the visually selected quasar candidates. In addition, a large population of new, faint (m_J ≈ 20) candidates is identified. (author)

  19. Objective analysis of toolmarks in forensics

    Energy Technology Data Exchange (ETDEWEB)

    Grieve, Taylor N. [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    Since the 1993 court case of Daubert v. Merrell Dow Pharmaceuticals, Inc. the subjective nature of toolmark comparison has been questioned by attorneys and law enforcement agencies alike. This has led to an increased drive to establish objective comparison techniques with known error rates, much like those that DNA analysis is able to provide. This push has created research in which the 3-D surface profile of two different marks are characterized and the marks’ cross-sections are run through a comparative statistical algorithm to acquire a value that is intended to indicate the likelihood of a match between the marks. The aforementioned algorithm has been developed and extensively tested through comparison of evenly striated marks made by screwdrivers. However, this algorithm has yet to be applied to quasi-striated marks such as those made by the shear edge of slip-joint pliers. The results of this algorithm’s application to the surface of copper wire will be presented. Objective mark comparison also extends to comparison of toolmarks made by firearms. In an effort to create objective comparisons, microstamping of firing pins and breech faces has been introduced. This process involves placing unique alphanumeric identifiers surrounded by a radial code on the surface of firing pins, which transfer to the cartridge’s primer upon firing. Three different guns equipped with microstamped firing pins were used to fire 3000 cartridges. These cartridges are evaluated based on the clarity of their alphanumeric transfers and the clarity of the radial code surrounding the alphanumerics.
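
    The record does not specify the comparative statistical algorithm applied to the mark cross-sections. A common building block for such comparisons is the normalized (Pearson) correlation of two aligned surface profiles, sketched below as an illustrative stand-in rather than the algorithm actually used in this work.

```python
def normalized_cross_correlation(a, b):
    """Pearson correlation of two equal-length surface profiles.
    Values near +1 indicate strong similarity between the marks;
    a real toolmark system would also search over relative shifts
    and convert the score into a match likelihood with known error rates."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    return num / (da * db)
```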

  20. Intelligence, previous convictions and interrogative suggestibility: a path analysis of alleged false-confession cases.

    Science.gov (United States)

    Sharrock, R; Gudjonsson, G H

    1993-05-01

    The main purpose of this study was to investigate the relationship between interrogative suggestibility and previous convictions among 108 defendants in criminal trials, using a path analysis technique. It was hypothesized that previous convictions, which may provide defendants with interrogative experiences, would correlate negatively with 'shift' as measured by the Gudjonsson Suggestibility Scale (Gudjonsson, 1984a), after intelligence and memory had been controlled for. The hypothesis was partially confirmed and the theoretical and practical implications of the findings are discussed.

  1. Total hip arthroplasty after a previous pelvic osteotomy: A systematic review and meta-analysis.

    Science.gov (United States)

    Shigemura, T; Yamamoto, Y; Murata, Y; Sato, T; Tsuchiya, R; Wada, Y

    2018-06-01

    There are several reports regarding total hip arthroplasty (THA) after a previous pelvic osteotomy (PO). However, to our knowledge, no formal systematic review and meta-analysis has been published to summarize the clinical results of THA after a previous PO. Therefore, we conducted a systematic review and meta-analysis of the results of THA after a previous PO, focusing on the following question: does a previous PO affect the results of subsequent THA, such as clinical outcomes, operative time, operative blood loss, and radiological parameters? Using PubMed, Web of Science, and Cochrane Library, we searched for relevant original papers. The pooling of data was performed using RevMan software (version 5.3, Cochrane Collaboration, Oxford, UK). A p-value <0.05 was considered statistically significant. When I²>50%, significant heterogeneity was assumed and a random-effects model was applied for the meta-analysis; a fixed-effects model was applied in the absence of significant heterogeneity. Eleven studies were included in this meta-analysis. The pooled results indicated that there was no significant difference in postoperative Merle D'Aubigne-Postel score (I²=0%, SMD=-0.15, 95% CI: -0.36 to 0.06, p=0.17), postoperative Harris hip score (I²=60%, SMD=-0.23, 95% CI: -0.50 to 0.05, p=0.10), operative time (I²=86%, SMD=0.37, 95% CI: -0.09 to 0.82, p=0.11), operative blood loss (I²=82%, SMD=0.23, 95% CI: -0.17 to 0.63, p=0.25), or cup abduction angle (I²=43%, SMD=-0.08, 95% CI: -0.25 to 0.09, p=0.38) between THA with and without a previous PO. However, the cup anteversion angle of THA with a previous PO was significantly smaller than that without a previous PO (I²=77%, SMD=-0.63, 95% CI: -1.13 to -0.13, p=0.01). A previous PO did not affect the results of subsequent THA, except for cup anteversion. Because of the low-quality evidence currently available, high-quality randomized controlled trials are required.
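
    The model-selection rule this abstract applies (a random-effects model when I² exceeds 50%, a fixed-effects model otherwise) can be sketched with inverse-variance pooling and the DerSimonian-Laird between-study variance estimator. This is a generic textbook illustration, not the RevMan implementation; inputs are per-study effect sizes (e.g. SMDs) and their variances.

```python
def pooled_effect(effects, variances, i2_threshold=50.0):
    """Inverse-variance meta-analysis with the I^2-based model rule:
    compute Cochran's Q and I^2; above the threshold, re-weight with a
    DerSimonian-Laird random-effects model."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    k = len(effects)
    i2 = max(0.0, (q - (k - 1)) / q) * 100.0 if q > 0 else 0.0
    if i2 <= i2_threshold:
        return fixed, i2, "fixed"
    # DerSimonian-Laird between-study variance tau^2
    tau2 = max(0.0, (q - (k - 1)) / (sum(w) - sum(wi ** 2 for wi in w) / sum(w)))
    wr = [1.0 / (v + tau2) for v in variances]
    random_eff = sum(wi * e for wi, e in zip(wr, effects)) / sum(wr)
    return random_eff, i2, "random"
```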

  2. Object-Oriented Analysis, Structured Analysis, and Jackson System Development

    NARCIS (Netherlands)

    Van Assche, F.; Wieringa, Roelf J.; Moulin, B.; Rolland, C

    1991-01-01

    Conceptual modeling is the activity of producing a conceptual model of an actual or desired version of a universe of discourse (UoD). In this paper, two methods of conceptual modeling are compared, structured analysis (SA) and object-oriented analysis (OOA). This is done by transforming a model

  3. Objective Bayesian Analysis of Skew- t Distributions

    KAUST Repository

    BRANCO, MARCIA D'ELIA

    2012-02-27

    We study the Jeffreys prior and its properties for the shape parameter of univariate skew-t distributions with linear and nonlinear Student's t skewing functions. In both cases, we show that the resulting priors for the shape parameter are symmetric around zero and proper. Moreover, we propose a Student's t approximation of the Jeffreys prior that makes an objective Bayesian analysis easy to perform. We carry out a Monte Carlo simulation study that demonstrates an overall better behaviour of the maximum a posteriori estimator compared with the maximum likelihood estimator. We also compare the frequentist coverage of the credible intervals based on the Jeffreys prior and its approximation and show that they are similar. We further discuss location-scale models under scale mixtures of skew-normal distributions and show some conditions for the existence of the posterior distribution and its moments. Finally, we present three numerical examples to illustrate the implications of our results on inference for skew-t distributions. © 2012 Board of the Foundation of the Scandinavian Journal of Statistics.

  4. Origin of choriocarcinoma in previous molar pregnancy proved by DNA analysis

    International Nuclear Information System (INIS)

    Vojtassak, J.; Repiska, V.; Konecna, B.; Zajac, V.; Korbel, M.; Danihel, L.

    1996-01-01

    A 17-year-old woman had a remarkable reproductive history within a short time period (seven months): molar pregnancy in December 1993, choriocarcinoma in January 1994 and induced abortion in June 1994. DNA analysis proved the origin of the choriocarcinoma in the previous molar pregnancy. (author)

  5. Upon Further Review: V. An Examination of Previous Lightcurve Analysis from the Palmer Divide Observatory

    Science.gov (United States)

    Warner, Brian D.

    2011-01-01

    Updated results are given for nine asteroids previously reported from the Palmer Divide Observatory (PDO). The original images were re-measured to obtain new data sets using the latest version of MPO Canopus photometry software, analysis tools, and revised techniques for linking multiple observing runs covering several days to several weeks. Results that were previously not reported or were moderately different were found for 1659 Punkajarju, 1719 Jens, 1987 Kaplan, 2105 Gudy, 2961 Katsurahama, 3285 Ruth Wolfe, 3447 Burckhalter, 7816 Hanoi, and (34817) 2000 SE116. This is one in a series of papers that will examine results obtained during the initial years of the asteroid lightcurve program at PDO.

  6. Do emotional intelligence and previous caring experience influence student nurse performance? A comparative analysis.

    Science.gov (United States)

    Stenhouse, Rosie; Snowden, Austyn; Young, Jenny; Carver, Fiona; Carver, Hannah; Brown, Norrie

    2016-08-01

    Reports of poor nursing care have focused attention on values based selection of candidates onto nursing programmes. Values based selection lacks clarity and valid measures. Previous caring experience might lead to better care. Emotional intelligence (EI) might be associated with performance, is conceptualised and measurable. To examine the impact of 1) previous caring experience, 2) emotional intelligence 3) social connection scores on performance and retention in a cohort of first year nursing and midwifery students in Scotland. A longitudinal, quasi experimental design. Adult and mental health nursing, and midwifery programmes in a Scottish University. Adult, mental health and midwifery students (n=598) completed the Trait Emotional Intelligence Questionnaire-short form and Schutte's Emotional Intelligence Scale on entry to their programmes at a Scottish University, alongside demographic and previous caring experience data. Social connection was calculated from a subset of questions identified within the TEIQue-SF in a prior factor and Rasch analysis. Student performance was calculated as the mean mark across the year. Withdrawal data were gathered. 598 students completed baseline measures. 315 students declared previous caring experience, 277 not. An independent-samples t-test identified that those without previous caring experience scored higher on performance (57.33±11.38) than those with previous caring experience (54.87±11.19), a statistically significant difference of 2.47 (95% CI, 0.54 to 4.38), t(533)=2.52, p=.012. Emotional intelligence scores were not associated with performance. Social connection scores for those withdrawing (mean rank=249) and those remaining (mean rank=304.75) were statistically significantly different, U=15,300, z=-2.61, p$_amp_$lt;0.009. Previous caring experience led to worse performance in this cohort. Emotional intelligence was not a useful indicator of performance. 
Lower scores on the social connection factor were associated
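
    The two group comparisons reported above can be sketched with SciPy. The data below are simulated from the reported group means and standard deviations (illustrative draws, not the study's raw marks), so the exact statistics will differ from the published values.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Illustrative marks simulated from the reported summaries:
# without previous caring experience (57.33 +/- 11.38) vs.
# with previous caring experience (54.87 +/- 11.19).
no_experience = rng.normal(loc=57.33, scale=11.38, size=277)
with_experience = rng.normal(loc=54.87, scale=11.19, size=315)

# Independent-samples t-test, as used for the performance comparison.
t_stat, p_value = stats.ttest_ind(no_experience, with_experience)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

# Mann-Whitney U test, as used for the withdrawal vs. retention ranks.
u_stat, u_p = stats.mannwhitneyu(no_experience, with_experience)
print(f"U = {u_stat:.0f}, p = {u_p:.4f}")
```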

  7. Previously unidentified changes in renal cell carcinoma gene expression identified by parametric analysis of microarray data

    International Nuclear Information System (INIS)

    Lenburg, Marc E; Liou, Louis S; Gerry, Norman P; Frampton, Garrett M; Cohen, Herbert T; Christman, Michael F

    2003-01-01

    Renal cell carcinoma is a common malignancy that often presents as metastatic disease for which there are no effective treatments. To gain insights into the mechanism of renal cell carcinogenesis, a number of genome-wide expression profiling studies have been performed. Surprisingly, there is very poor agreement among these studies as to which genes are differentially regulated. To better understand this lack of agreement, we profiled renal cell tumor gene expression using genome-wide microarrays (45,000 probe sets) and compared our analysis to previous microarray studies. We hybridized total RNA isolated from renal cell tumors and adjacent normal tissue to Affymetrix U133A and U133B arrays. We removed samples with technical defects and removed probe sets that failed to exhibit sequence-specific hybridization in any of the samples. We detected differential gene expression in the resulting dataset with parametric methods and identified keywords that are overrepresented in the differentially expressed genes with the Fisher exact test. We identify 1,234 genes that are more than three-fold changed in renal tumors by t-test, 800 of which have not been previously reported to be altered in renal cell tumors. Of the only 37 genes that have been identified as differentially expressed in three or more of five previous microarray studies of renal tumor gene expression, our analysis finds 33 of these genes (89%). A key to the sensitivity and power of our analysis is filtering out defective samples and genes that are not reliably detected. The widespread use of sample-wise voting schemes for detecting differential expression that do not control for false positives likely accounts for the poor overlap among previous studies. Among the many genes we identified using parametric methods that were not previously reported as being differentially expressed in renal cell tumors are several oncogenes and tumor suppressor genes that likely play important roles in renal cell
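
    A minimal sketch of this detection scheme, combining a per-gene t-test with a three-fold-change filter and a Fisher exact test for keyword overrepresentation. The expression matrix below is simulated (1,000 hypothetical probe sets, 5 tumor and 5 normal samples), not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated expression matrix: rows = probe sets, columns = samples.
tumor = rng.lognormal(mean=5.0, sigma=0.3, size=(1000, 5))
normal = rng.lognormal(mean=5.0, sigma=0.3, size=(1000, 5))
tumor[:50] *= 4.0  # simulate 50 probe sets up-regulated ~4-fold in tumors

# Parametric detection: per-gene t-test plus a three-fold-change filter,
# mirroring the "more than three-fold changed ... by t-test" criterion.
t_stat, p_val = stats.ttest_ind(tumor, normal, axis=1)
fold = tumor.mean(axis=1) / normal.mean(axis=1)
significant = (p_val < 0.05) & ((fold > 3) | (fold < 1 / 3))
print(f"{significant.sum()} probe sets called differentially expressed")

# Keyword overrepresentation: 2x2 table of (has keyword?) x (differential?),
# tested with Fisher's exact test (counts made up for illustration).
table = [[30, 70], [100, 800]]
odds_ratio, fisher_p = stats.fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {fisher_p:.3g}")
```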

  8. Service for victims of crime VDS info and victims’ support: Analysis of the previous work

    Directory of Open Access Journals (Sweden)

    Ćopić Sanja M.

    2004-01-01

    Full Text Available The first victim support service in our country, VDS info and victims' support, started its work in April 2003 within the Victimology Society of Serbia. The service is aimed at victims of crime (women and men), primarily victims of violent crime, but also of some forms of property crime (such as burglary). The aim of the Service is to offer victims of crime information on their rights and the ways to realize them, to provide emotional support, and to refer them to other institutions/organizations depending on the particular victim's needs. Coordinators and volunteers who have passed the appropriate training are responsible for that. Bearing that in mind, this paper gives a brief overview of the Service itself, its organization and way of working, followed by an analysis of the results of its previous work.

  9. Analysis of Product Buying Decision on Lazada E-commerce based on Previous Buyers’ Comments

    Directory of Open Access Journals (Sweden)

    Neil Aldrin

    2017-06-01

    Full Text Available The aims of the present research are: 1) to know that product buying decisions possibly occur, 2) to know how product buying decisions occur among Lazada e-commerce's customers, and 3) to know how previous buyers' comments can increase product buying decisions on Lazada e-commerce. This research utilizes a qualitative method: it examines other studies and synthesises their assumptions and discussion results so that further analysis can widen ideas and opinions. The research results show that a product with many ratings and reviews will trigger other buyers to purchase that product. The conclusion is that a product buying decision may occur because there are several stages before making the decision: problem recognition, identifying needs, collecting information, evaluating alternatives, and post-purchase evaluation. In those stages, buying decisions on Lazada e-commerce are supported by price, promotion, service, and brand.

  10. Identification and pathway analysis of microRNAs with no previous involvement in breast cancer.

    Directory of Open Access Journals (Sweden)

    Sandra Romero-Cordoba

    Full Text Available microRNA expression signatures can differentiate normal and breast cancer tissues and can define specific clinico-pathological phenotypes in breast tumors. In order to further evaluate the microRNA expression profile in breast cancer, we analyzed the expression of 667 microRNAs in 29 tumors and 21 adjacent normal tissues using TaqMan Low-Density Arrays. 130 miRNAs showed significant differential expression (adjusted P value < 0.05, fold change > 2) in breast tumors compared to the normal adjacent tissue. Importantly, the role of 43 of these microRNAs has not been previously reported in breast cancer, including several evolutionarily conserved microRNA*, showing expression rates similar to those of their corresponding leading strands. The expression of 14 microRNAs was replicated in an independent set of 55 tumors. Bioinformatic analysis of mRNA targets of the altered miRNAs identified oncogenes like ERBB2, YY1, several MAP kinases, and known tumor suppressors like FOXA1 and SMAD4. Pathway analysis identified that several biological processes which are important in breast carcinogenesis are affected by the altered microRNA expression, including signaling through MAP kinase and TP53 pathways, as well as biological processes like cell death and communication, focal adhesion and ERBB2-ERBB3 signaling. Our data identified the altered expression of several microRNAs whose aberrant expression might have an important impact on cancer-related cellular pathways and whose role in breast cancer has not been previously described.
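
    The adjusted P values quoted above imply a multiple-testing correction; the Benjamini-Hochberg procedure is the usual choice for array data, though this excerpt does not name the exact method used. A minimal sketch:

```python
import numpy as np

def benjamini_hochberg(p_values):
    """Benjamini-Hochberg FDR-adjusted p-values for a set of raw p-values."""
    p = np.asarray(p_values, dtype=float)
    n = p.size
    order = np.argsort(p)
    scaled = p[order] * n / np.arange(1, n + 1)
    # Enforce monotonicity, working back from the largest p-value.
    adjusted = np.minimum.accumulate(scaled[::-1])[::-1]
    out = np.empty(n)
    out[order] = np.clip(adjusted, 0.0, 1.0)
    return out

raw = [0.001, 0.008, 0.039, 0.041, 0.042, 0.060, 0.074, 0.205, 0.212, 0.216]
adj = benjamini_hochberg(raw)
print(np.round(adj, 3))
```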

  11. Objectivity

    CERN Document Server

    Daston, Lorraine

    2010-01-01

    Objectivity has a history, and it is full of surprises. In Objectivity, Lorraine Daston and Peter Galison chart the emergence of objectivity in the mid-nineteenth-century sciences--and show how the concept differs from its alternatives, truth-to-nature and trained judgment. This is a story of lofty epistemic ideals fused with workaday practices in the making of scientific images. From the eighteenth through the early twenty-first centuries, the images that reveal the deepest commitments of the empirical sciences--from anatomy to crystallography--are those featured in scientific atlases, the compendia that teach practitioners what is worth looking at and how to look at it. Galison and Daston use atlas images to uncover a hidden history of scientific objectivity and its rivals. Whether an atlas maker idealizes an image to capture the essentials in the name of truth-to-nature or refuses to erase even the most incidental detail in the name of objectivity or highlights patterns in the name of trained judgment is a...

  12. A strategic analysis of Business Objects' portal application

    OpenAIRE

    Kristinsson, Olafur Oskar

    2007-01-01

    Business Objects is the leading software firm producing business intelligence software. Business intelligence is a growing market. Small to medium businesses are increasingly looking at business intelligence. Business Objects' flagship product in the enterprise market is Business Objects XI and for medium-size companies it has Crystal Decisions. Portals are the front end for the two products. InfoView, Business Objects portal application, lacks a long-term strategy. This analysis evaluates...

  13. Analysis of previous perceptual and motor experience in breaststroke kick learning

    Directory of Open Access Journals (Sweden)

    Ried Bettina

    2015-12-01

    Full Text Available One of the variables that influence motor learning is the learner's previous experience, which may provide perceptual and motor elements to be transferred to a novel motor skill. For swimming skills, several motor experiences may prove effective. Purpose. The aim was to analyse the influence of previous experience of playing in water, swimming lessons, and music or dance lessons on learning the breaststroke kick. Methods. The study involved 39 Physical Education students possessing basic swimming skills, but not the breaststroke, who performed 400 acquisition trials followed by 50 retention and 50 transfer trials, during which the stroke index as well as rhythmic and spatial configuration indices were mapped, and answered a yes/no questionnaire regarding previous experience. Data were analysed by ANOVA (p = 0.05) and by effect size (Cohen's d, with d ≥ 0.8 indicating a large effect). Results. The whole sample improved their stroke index and spatial configuration index, but not their rhythmic configuration index. Although differences between groups were not significant, two conditions showed large practical effects on learning: experience of playing in water during childhood showed major, practically relevant positive effects, while lacking experience in all three fields hampered the learning process. Conclusions. The results point towards a diverse impact of previous experience with rhythmic activities, swimming lessons, and especially playing in water during childhood, on learning the breaststroke kick.
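
    The effect-size criterion used above (Cohen's d ≥ 0.8 counting as large) can be computed from two groups' scores; a minimal sketch with made-up values, not the study's data:

```python
import numpy as np

def cohens_d(group_a, group_b):
    """Cohen's d using the pooled standard deviation."""
    a = np.asarray(group_a, dtype=float)
    b = np.asarray(group_b, dtype=float)
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)

# Hypothetical stroke-index scores for two experience groups.
experienced = [1.9, 2.1, 2.4, 2.2, 2.0, 2.3]
novice = [1.5, 1.6, 1.8, 1.4, 1.7, 1.6]
d = cohens_d(experienced, novice)
print(f"d = {d:.2f}")  # d >= 0.8 would count as a large practical effect
```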

  14. Use of objective analysis to estimate winter temperature and ...

    Indian Academy of Sciences (India)

    In the complex terrain of the Himalaya, nonavailability of snow and meteorological data for the remote locations ... Precipitation intensity; spatial interpolation; objective analysis. J. Earth Syst. ... This technique needs a historical database and is unable ...

  15. Retrospective analysis on malignant calcification previously misdiagnosed as benign on screening mammography

    International Nuclear Information System (INIS)

    Ha, Su Min; Cha, Joo Hee; Kim, Hak Hee; Shin, Hee Jung; Chae, Eun Young; Choi, Woo Jung

    2017-01-01

    The purpose of our study was to investigate the morphology and distribution of calcifications initially interpreted as benign or probably benign, but proven to be malignant by subsequent stereotactic biopsy, and to identify the reason for misinterpretation or underestimation at the initial diagnosis. Out of 567 women who underwent stereotactic biopsy for calcifications at our hospital between January 2012 and December 2014, 167 women were diagnosed with malignancy. Forty-six of these 167 women had previous mammography assessed as benign or probably benign which was changed to suspicious malignancy on follow-up mammography. Of these 46 women, three women with biopsy-proven benign calcifications at the site of subsequent cancer were excluded, and 43 patients were finally included. The calcifications (morphology, distribution, extent, associated findings) in the previous and follow-up mammography examinations were analyzed according to the Breast Imaging Reporting and Data System (BI-RADS) lexicon and assessment category. We classified the patients into two groups: 1) group A patients who were still retrospectively re-categorized as less than or equal to BI-RADS 3 and 2) group B patients who were re-categorized as equal to or higher than BI-RADS 4a and whose results should have prompted previous diagnostic assessment. In the follow-up mammography examinations, change in calcification morphology (n = 27, 63%) was the most frequent cause of assessment change. The most frequent previous mammographic findings of malignant calcification were amorphous morphology (n = 26, 60%) and grouped distribution (n = 36, 84%). The most frequent calcification findings at reassessment were amorphous morphology (n = 4, 9%), fine pleomorphic calcification (n = 30, 70%), grouped distribution (n = 23, 53%), and segmental calcification (n = 12, 28%). There were 33 (77%) patients in group A, and 10 patients (23%) in group B. 
Amorphous morphology and grouped distribution were the most frequent


  17. Geographic Object-Based Image Analysis: Towards a new paradigm

    NARCIS (Netherlands)

    Blaschke, T.; Hay, G.J.; Kelly, M.; Lang, S.; Hofmann, P.; Addink, E.A.|info:eu-repo/dai/nl/224281216; Queiroz Feitosa, R.; van der Meer, F.D.|info:eu-repo/dai/nl/138940908; van der Werff, H.M.A.; van Coillie, F.; Tiede, A.

    2014-01-01

    The amount of scientific literature on (Geographic) Object-based Image Analysis – GEOBIA has been and still is sharply increasing. These approaches to analysing imagery have antecedents in earlier research on image segmentation and use GIS-like spatial analysis within classification and feature

  18. From Pixels to Geographic Objects in Remote Sensing Image Analysis

    NARCIS (Netherlands)

    Addink, E.A.; Van Coillie, Frieke M.B.; Jong, Steven M. de

    Traditional image analysis methods are mostly pixel-based and use the spectral differences of landscape elements at the Earth surface to classify these elements or to extract element properties from the Earth Observation image. Geographic object-based image analysis (GEOBIA) has received

  19. Rhabdomyosarcoma Arising in a Previously Irradiated Field: An Analysis of 43 Patients

    Energy Technology Data Exchange (ETDEWEB)

    Dang, Nguyen D. [Department of Radiation Oncology, Baylor College of Medicine, Houston, Texas (United States); Teh, Bin S. [Department of Radiation Oncology, The Methodist Hospital and Methodist Hospital Research Institute, Houston, Texas (United States); Paulino, Arnold C., E-mail: apaulino@tmhs.org [Department of Radiation Oncology, The Methodist Hospital and Methodist Hospital Research Institute, Houston, Texas (United States)

    2013-03-01

    Patients with soft tissue sarcomas that arise from previously irradiated fields have traditionally been reported to have a poor prognosis. In this report, we examined the characteristics and outcomes of patients who developed a rhabdomyosarcoma in a previously irradiated field (RMS-RIF); we hypothesize that these patients should have a better outcome compared to other postradiation soft tissue sarcomas as these tumors are chemosensitive and radiosensitive. A PubMed search of the literature from 1961-2010 yielded 33 studies with data for patients with RMS-RIF. The study included 43 patients with a median age of 6.5 years at the time of radiation therapy (RT) for the initial tumor. The median RT dose was 48 Gy. The median latency period, the time from RT to development of RMS-RIF, was 8 years. The 3-year overall survival for RMS-RIF was 42%. The 3-year overall survival was 66% for patients receiving chemotherapy and local treatment (surgery and/or RT) compared to 29% for those who had systemic treatment only or local treatment only (P=.049). Other factors associated with increased 3-year overall survival included retinoblastoma initial diagnosis (P<.001), age ≤18 years at diagnosis of RMS-RIF (P=.003), favorable site (P=.008), and stage 1 disease (P=.002). Age at time of RMS-RIF, retinoblastoma initial tumor, favorable site, stage 1 disease, and use of both systemic and local treatment were found to be favorable prognostic factors for 3-year overall survival.

  20. A Comparative Investigation of the Previous and New Secondary History Curriculum: The Issues of the Definition of the Aims and Objectives and the Selection of Curriculum Content

    Science.gov (United States)

    Dinc, Erkan

    2011-01-01

    Discussions on history teaching in Turkey indicate that the previous versions of the history curriculum and the pedagogy of history in the country bear many problems and deficiencies. The problems of Turkish history curriculum mainly arise from the perspectives it takes and the selection of its content. Since 2003, there have been extensive…

  1. Analysis Of Navy Hornet Squadron Mishap Costs With Regard To Previously Flown Flight Hours

    Science.gov (United States)

    2017-06-01

    American Psychological Association (APA) manual guidelines for reporting statistics have recommended reporting exact p-values and avoiding arbitrary ... Naval Postgraduate School, Monterey, California, thesis (advisor: Kenneth Doerr). Approved for public release; distribution is unlimited.

  2. Orthodontic bracket bonding without previous adhesive priming: A meta-regression analysis.

    Science.gov (United States)

    Altmann, Aline Segatto Pires; Degrazia, Felipe Weidenbach; Celeste, Roger Keller; Leitune, Vicente Castelo Branco; Samuel, Susana Maria Werner; Collares, Fabrício Mezzomo

    2016-05-01

    To determine the consensus among studies on whether adhesive resin application improves the bond strength of orthodontic brackets, and the association of methodological variables with the bond strength outcome. In vitro studies were selected to answer whether adhesive resin application increases the immediate shear bond strength of metal orthodontic brackets bonded with a photo-cured orthodontic adhesive. Studies included were those comparing a group having adhesive resin to a group without adhesive resin, with the primary outcome measured as shear bond strength in MPa. A systematic electronic search was performed in the PubMed and Scopus databases. Nine studies were included in the analysis. Based on the pooled data and due to high heterogeneity among studies (I² = 93.3%), a meta-regression analysis was conducted. The analysis demonstrated that five experimental conditions explained 86.1% of the heterogeneity and four of them significantly affected in vitro shear bond testing. The shear bond strength of metal brackets bonded with adhesive resin was not significantly different from that of brackets bonded without it. The adhesive resin application can therefore be set aside during metal bracket bonding to enamel, regardless of the type of orthodontic adhesive used.
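
    The I² figure above measures how much of the variation across studies reflects heterogeneity rather than chance. A minimal sketch of how I² follows from Cochran's Q under inverse-variance weighting (the effect sizes and variances below are invented for illustration):

```python
import numpy as np

def i_squared(effects, variances):
    """Cochran's Q and the derived I^2 statistic (inverse-variance weights)."""
    effects = np.asarray(effects, dtype=float)
    weights = 1.0 / np.asarray(variances, dtype=float)
    pooled = np.sum(weights * effects) / np.sum(weights)
    q = np.sum(weights * (effects - pooled) ** 2)
    df = len(effects) - 1
    return 100.0 * max(0.0, (q - df) / q) if q > 0 else 0.0

# Hypothetical between-group differences in shear bond strength (MPa)
# and their variances for nine studies.
effects = [1.2, -0.5, 3.4, 0.1, 2.8, -1.0, 4.2, 0.6, 2.0]
variances = [0.2, 0.3, 0.25, 0.2, 0.3, 0.35, 0.25, 0.2, 0.3]
print(f"I^2 = {i_squared(effects, variances):.1f}%")
```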

  3. Data analysis in an Object Request Broker environment

    International Nuclear Information System (INIS)

    Malon, D.M.; May, E.N.; Grossman, R.L.; Day, C.T.; Quarrie, D.R.

    1995-01-01

    Computing for the Next Millennium will require software interoperability in heterogeneous, increasingly object-oriented environments. The Common Object Request Broker Architecture (CORBA) is a software industry effort, under the aegis of the Object Management Group (OMG), to standardize mechanisms for software interaction among disparate applications written in a variety of languages and running on a variety of distributed platforms. In this paper, we describe some of the design and performance implications for software that must function in such a brokered environment in a standards-compliant way. We illustrate these implications with a physics data analysis example as a case study.

  4. Ten years of Object-Oriented analysis on H1

    International Nuclear Information System (INIS)

    Laycock, Paul

    2012-01-01

    Over a decade ago, the H1 Collaboration decided to embrace the object-oriented paradigm and completely redesign its data analysis model and data storage format. The event data model, based on the ROOT framework, consists of three layers - tracks and calorimeter clusters, identified particles and finally event summary data - with a singleton class providing unified access. This original solution was then augmented with a fourth layer containing user-defined objects. This contribution will summarise the history of the solutions used, from modifications to the original design, to the evolution of the high-level end-user analysis object framework which is used by H1 today. Several important issues are addressed - the portability of expert knowledge to increase the efficiency of data analysis, the flexibility of the framework to incorporate new analyses, the performance and ease of use, and lessons learned for future projects.

  5. Possible Detection of Perchlorates by the Sample Analysis at Mars (SAM) Instrument: Comparison with Previous Missions

    Science.gov (United States)

    Navarro-Gonzalez, Rafael; Sutter, Brad; Archer, Doug; Ming, Doug; Eigenbrode, Jennifer; Franz, Heather; Glavin, Daniel; McAdam, Amy; Stern, Jennifer; McKay, Christopher; et al.

    2013-01-01

    The first chemical analysis of soluble salts in the soil was carried out by the Phoenix Lander in the Martian Arctic [1]. Surprisingly, chlorine was present as magnesium or calcium perchlorate at 0.4 to 0.6 percent. Additional support for the identification of perchlorate came from the evolved gas analysis, which detected the release of molecular oxygen at 350-550 °C [1]. When Mars-like soils from the Atacama Desert were spiked with magnesium perchlorate (1 percent) and heated using the Viking GC-MS protocol, nearly all the organics were combusted but a small amount was chlorinated, forming chloromethane and dichloromethane [2]. These chlorohydrocarbons were detected by the Viking GC-MS experiments when the Martian soil was analyzed, but they were considered to be terrestrial contaminants [3]. Reinterpretation of the Viking results suggests ... The Sample Analysis at Mars (SAM) instrument on board the Mars Science Laboratory (MSL) ran four samples from an aeolian bedform named Rocknest. The samples analyzed were portioned from the fifth scoop at this location. The samples were heated to 835 °C at 35 °C/min with a He flow. The SAM QMS detected a major oxygen release (300-500 °C) [5], coupled with the release of chlorinated hydrocarbons (chloromethane, dichloromethane, trichloromethane, and chloromethylpropene) detected both by SAM QMS and GC-MS, derived from known Earth organic contaminants in the instrument [6]. Calcium perchlorate appears to be the best candidate for the evolved O2 in the Rocknest samples at this time, but other Cl species (e.g., chlorates) are possible and must be evaluated. The potential detection of perchlorates in Rocknest material adds weight to the argument that both Viking Landers measured signatures of perchlorates. Even if the source of the organic carbon detected is still unknown, the chlorine source was likely Martian. Two mechanisms have been hypothesized for the formation of soil perchlorate: (1) atmospheric oxidation of chlorine; and (2) UV photooxidation of

  6. Proteomics analysis in frozen horse mackerel previously high-pressure processed.

    Science.gov (United States)

    Pazos, Manuel; Méndez, Lucía; Vázquez, Manuel; Aubourg, Santiago P

    2015-10-15

    The effect of high-pressure processing (HPP) (150, 300 and 450 MPa for 0, 2.5 and 5 min) on total sodium dodecyl sulphate (SDS)-soluble and sarcoplasmic proteins in frozen (-10 °C for 3 months) horse mackerel (Trachurus trachurus) was evaluated. Proteomics tools based on image analysis of SDS-PAGE protein gels and protein identification by tandem mass spectrometry (MS/MS) were applied. Although the total SDS-soluble fraction indicated no important changes induced by HPP, the processing modified the 1-D SDS-PAGE sarcoplasmic patterns in a pressure-dependent manner and exerted a selective effect on particular proteins depending on processing conditions. Thus, application of the highest pressure (450 MPa) provoked a significant degradation of phosphoglycerate mutase 2, glycogen phosphorylase muscle form, pyruvate kinase muscle isozyme, beta-enolase, triosephosphate isomerase and phosphoglucomutase-1. Conversely, protein bands assigned to tropomyosin alpha-1 chain, fast myotomal muscle troponin T and parvalbumin beta 2 increased in intensity after 450-MPa processing. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Optiwave Refractive Analysis may not work well in patients with previous history of radial keratotomy

    Directory of Open Access Journals (Sweden)

    Fuxiang Zhang

    2018-06-01

    Full Text Available Purpose: To report a case of significant hyperopic outcome (both eyes) following Optiwave Refractive Analysis (ORA) intraocular lens (IOL) power recommendation in a cataract patient with a history of 8-cut radial keratotomy (RK) in each eye. Observations: It is hypothesized that increased intraocular pressure (IOP) from phacoemulsification could make the RK cuts swell and change corneal shape intraoperatively. In this unique scenario, the corneal curvature readings from ORA could be quite different from preoperative readings or from stabilized postoperative corneal measurements. The change in corneal curvature could also affect the anterior chamber depth and axial length readings, skewing multiple parameters on which ORA bases its recommendations for IOL power. Conclusions and importance: ORA has been widely used among cataract surgeons on patients with a history of RK, but its validation, unlike that for laser-assisted in-situ keratomileusis (LASIK) and photorefractive keratectomy (PRK), has yet to be established by peer-reviewed studies. Surgeons should be cautious when using ORA on RK patients. Keywords: Intraoperative aberrometry, ORA, RK, IOL power

  8. Comparative analysis of imaging configurations and objectives for Fourier microscopy.

    Science.gov (United States)

    Kurvits, Jonathan A; Jiang, Mingming; Zia, Rashid

    2015-11-01

    Fourier microscopy is becoming an increasingly important tool for the analysis of optical nanostructures and quantum emitters. However, achieving quantitative Fourier space measurements requires a thorough understanding of the impact of aberrations introduced by optical microscopes that have been optimized for conventional real-space imaging. Here we present a detailed framework for analyzing the performance of microscope objectives for several common Fourier imaging configurations. To this end, we model objectives from Nikon, Olympus, and Zeiss using parameters that were inferred from patent literature and confirmed, where possible, by physical disassembly. We then examine the aberrations most relevant to Fourier microscopy, including the alignment tolerances of apodization factors for different objective classes, the effect of magnification on the modulation transfer function, and vignetting-induced reductions of the effective numerical aperture for wide-field measurements. Based on this analysis, we identify an optimal objective class and imaging configuration for Fourier microscopy. In addition, the Zemax files for the objectives and setups used in this analysis have been made publicly available as a resource for future studies.

  9. Which diabetic patients should receive podiatry care? An objective analysis.

    Science.gov (United States)

    McGill, M; Molyneaux, L; Yue, D K

    2005-08-01

    Diabetes is the leading cause of lower limb amputation in Australia. However, due to limited resources, it is not feasible for everyone with diabetes to access podiatry care, and some objective guidelines on who should receive podiatry are required. A total of 250 patients with neuropathy (vibration perception threshold (VPT) > 30; Biothesiometer, Biomedical Instruments, Newbury, Ohio, USA) ... podiatry care (mean of estimates from 10 reports), the NNT to prevent one foot ulcer per year was: neuropathy (VPT > 30) alone, NNT = 45; + cannot feel monofilament, NNT = 18; + previous ulcer/amputation, NNT = 7. Provision of podiatry care to diabetic patients should not be only economically based, but should also be directed to those with reduced sensation, especially where there is a previous history of ulceration or amputation.
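
    The NNT values above follow from the absolute risk reduction; a minimal sketch (the ulcer rates are hypothetical, chosen only to illustrate the arithmetic, not taken from the study):

```python
import math

def nnt(risk_without_care, risk_with_care):
    """Number needed to treat: 1 / absolute risk reduction, rounded up."""
    arr = risk_without_care - risk_with_care
    if arr <= 0:
        raise ValueError("no risk reduction, NNT undefined")
    return math.ceil(1.0 / arr)

# Hypothetical annual foot-ulcer risks with and without podiatry care.
print(nnt(0.16, 0.02))    # high-risk patients -> small NNT
print(nnt(0.030, 0.008))  # lower-risk patients -> much larger NNT
```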

  10. Frame sequences analysis technique of linear objects movement

    Science.gov (United States)

    Oshchepkova, V. Y.; Berg, I. A.; Shchepkin, D. V.; Kopylova, G. V.

    2017-12-01

    Obtaining data by noninvasive methods is often needed in many fields of science and engineering. This is achieved through video recording at various frame rates and light spectra. In doing so, quantitative analysis of the movement of the objects being studied becomes an important component of the research. This work discusses the analysis of motion of linear objects on the two-dimensional plane. The complexity of this problem increases when the frame contains numerous objects whose images may overlap. This study uses a sequence containing 30 frames at a resolution of 62 × 62 pixels and a frame rate of 2 Hz. It was required to determine the average velocity of the objects' motion. This velocity was found as an average velocity for 8-12 objects with an error of 15%. After processing, dependencies of the average velocity vs. control parameters were found. The processing was performed in the software environment GMimPro with subsequent approximation of the data obtained using the Hill equation.
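
    The average-velocity computation described above reduces to frame-to-frame displacements divided by the frame interval; a minimal sketch (the tracked positions and the pixel calibration are hypothetical):

```python
import numpy as np

FRAME_RATE_HZ = 2.0   # frame rate used in the study
PIXEL_SIZE_UM = 0.1   # assumed spatial calibration, micrometres per pixel

# Hypothetical tracked x-positions (pixels) of one linear object over 6 frames.
positions_px = np.array([3.0, 5.1, 7.2, 9.0, 11.2, 13.1])

# Per-frame displacement divided by the frame interval gives velocity.
dt = 1.0 / FRAME_RATE_HZ
velocities = np.diff(positions_px) * PIXEL_SIZE_UM / dt  # um/s
mean_velocity = velocities.mean()
print(f"mean velocity = {mean_velocity:.3f} um/s")
```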


  12. Forest Rent as an Object of Economic Analysis

    Directory of Open Access Journals (Sweden)

    Lisichko Andriyana M.

    2018-01-01

Full Text Available The article is aimed at researching the concept of forest rent as an object of economic analysis. The essence of the concept of «forest rent» has been researched. It has been defined that forest rent is the object of management of the forest complex of Ukraine as a whole and of forest enterprises in particular. Rent for special use of forest resources is an object of interest on the part of both the State and the corporate sector, because its value depends on the cost of timber for industry and households. Works of scholars on the classification of rents were studied. It has been determined that the rent for special use of forest resources is a special kind of natural rent. The structure of constituents in the system of rent relations in the forest sector has been defined in accordance with the provisions of the Tax Code of Ukraine.

  13. Fast grasping of unknown objects using principal component analysis

    Science.gov (United States)

    Lei, Qujiang; Chen, Guangming; Wisse, Martijn

    2017-09-01

Fast grasping of unknown objects has a crucial impact on the efficiency of robot manipulation, especially in unfamiliar environments. In order to accelerate the grasping of unknown objects, principal component analysis is utilized to direct the grasping process. In particular, a single-view partial point cloud is constructed and grasp candidates are allocated along the principal axis. Force-balance optimization is employed to analyze possible graspable areas. The obtained graspable area with the minimal resultant force is the best zone for the final grasping execution. It is shown that an unknown object can be grasped more quickly provided that the principal axis is determined from the single-view partial point cloud. To cope with grasp uncertainty, robot motion is used to obtain a new viewpoint. Virtual exploration and experimental tests are carried out to verify this fast grasping algorithm. Both simulation and experimental tests demonstrated excellent performance in grasping a series of unknown objects. To minimize the grasping uncertainty, the robot hardware with two 3D cameras can be utilized to complete the partial point cloud. As a result, grasping reliability is highly enhanced. Therefore, this research demonstrates practical significance for increasing grasping speed and thus robot efficiency in unpredictable environments.
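The core PCA step the abstract relies on, finding the principal axis of a partial point cloud, can be sketched in a few lines. This is a generic illustration on a synthetic elongated cloud, not the authors' implementation; the cloud and its dimensions are hypothetical.

```python
import numpy as np

# Hypothetical single-view partial point cloud of an elongated object:
# points spread along x with small lateral spread in y and z.
rng = np.random.default_rng(1)
points = np.column_stack([
    rng.uniform(-0.1, 0.1, 500) + np.linspace(0.0, 0.3, 500),
    rng.normal(0.0, 0.01, 500),
    rng.normal(0.0, 0.01, 500),
])

# PCA: the eigenvector of the covariance matrix with the largest
# eigenvalue is the principal axis along which grasp candidates
# would be allocated.
centered = points - points.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(centered.T))
principal_axis = eigvecs[:, np.argmax(eigvals)]
```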

  14. Effect of Previous Abdominal Surgery on Laparoscopic Liver Resection: Analysis of Feasibility and Risk Factors for Conversion.

    Science.gov (United States)

    Cipriani, Federica; Ratti, Francesca; Fiorentini, Guido; Catena, Marco; Paganelli, Michele; Aldrighetti, Luca

    2018-03-28

Previous abdominal surgery has traditionally been considered an additional element of difficulty for later laparoscopic procedures. The aim of the study is to analyze the effect of previous surgery on the feasibility and safety of laparoscopic liver resection (LLR), and its role as a risk factor for conversion. After matching, 349 LLR in patients with previous abdominal surgery (PS group) were compared with 349 LLR in patients with a virgin abdomen (NPS group). Subgroup analysis included 161 patients with previous upper abdominal surgery (UPS subgroup). Feasibility and safety were evaluated in terms of conversion rate, reasons for conversion and outcomes, and risk factors for conversion were assessed via uni/multivariable analysis. The conversion rate was 9.4%, and higher for PS patients compared with NPS patients (13.7% versus 5.1%, P = .021). Difficult adhesiolysis was the commonest reason for conversion in the PS group (5.7%). However, operative time (P = .840), blood loss (P = .270), transfusion (P = .650), morbidity rate (P = .578), hospital stay (P = .780), and R1 rate (P = .130) were comparable between the PS and NPS groups. Subgroup analysis confirmed higher conversion rates for UPS patients (23%) compared with both NPS (P = .015) and PS patients (P = .041). Previous surgery emerged as an independent risk factor for conversion (P = .033), alongside postero-superior location and major hepatectomy. LLR is feasible in cases of previous surgery, proved to be safe, and maintains the benefits of LLR carried out in standard settings. However, a history of surgery should be considered a risk factor for conversion.

  15. Scout: orbit analysis and hazard assessment for NEOCP objects

    Science.gov (United States)

    Farnocchia, Davide; Chesley, Steven R.; Chamberlin, Alan B.

    2016-10-01

It typically takes a few days for a newly discovered asteroid to be officially recognized as a real object. During this time, the tentative discovery is published on the Minor Planet Center's Near-Earth Object Confirmation Page (NEOCP) until additional observations confirm that the object is a real asteroid rather than an observational artifact or an artificial object. Also, NEOCP objects could have a limited observability window and yet be scientifically interesting, e.g., radar and lightcurve targets, mini-moons (temporary Earth captures), mission-accessible targets, close approachers or even impactors. For instance, the only two asteroids discovered before an impact, 2008 TC3 and 2014 AA, both reached the Earth less than a day after discovery. For these reasons we developed Scout, an automated system that provides an orbital and hazard assessment for NEOCP objects within minutes after the observations are available. Scout's rapid analysis increases the chances of securing the trajectory of interesting NEOCP objects before the ephemeris uncertainty grows too large or the observing geometry becomes unfavorable. The generally short observation arcs, perhaps only a few hours or even less, lead to severe degeneracies in the orbit estimation process. To overcome these degeneracies Scout relies on systematic ranging, a technique that derives possible orbits by scanning a grid in the poorly constrained space of topocentric range and range rate, while the plane-of-sky position and motion are directly tied to the recorded observations. This scan allows us to derive a distribution of the possible orbits and in turn identify the NEOCP objects of most interest to prioritize follow-up efforts. In particular, Scout ranks objects according to the likelihood of an impact, estimates the close approach distance, the Earth-relative minimum orbit intersection distance and v-infinity, and computes scores to identify objects more likely to be an NEO, a km-sized NEO, a Potentially
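The grid scan at the heart of systematic ranging can be illustrated with a heavily simplified toy model. This is not Scout's algorithm: it assumes a circular Earth orbit, a fixed line of sight, and ignores the plane-of-sky motion constraint entirely, merely scanning a (range, range-rate) grid and classifying each hypothesis as bound or unbound via the heliocentric specific orbital energy. All numbers are illustrative.

```python
import numpy as np

MU_SUN = 4.0 * np.pi**2                      # GM_sun in AU^3 / yr^2
R_EARTH = np.array([1.0, 0.0, 0.0])          # heliocentric Earth position (AU)
V_EARTH = np.array([0.0, 2.0 * np.pi, 0.0])  # circular orbital velocity (AU/yr)
LOS = np.array([1.0, 0.0, 0.0])              # unit line of sight to the object

def is_bound(rho, rho_dot):
    """For an assumed topocentric (range, range-rate), build a heliocentric
    state and test the specific orbital energy; negative means bound."""
    r = R_EARTH + rho * LOS
    v = V_EARTH + rho_dot * LOS
    energy = 0.5 * (v @ v) - MU_SUN / np.linalg.norm(r)
    return energy < 0.0

# Scan a coarse grid in the poorly constrained (range, range-rate) space.
ranges = np.linspace(0.01, 3.0, 30)          # AU
range_rates = np.linspace(-15.0, 15.0, 31)   # AU/yr
bound_fraction = np.mean([[is_bound(r, rd) for rd in range_rates]
                          for r in ranges])
```

A real systematic-ranging implementation would weight each grid point by its fit to the recorded astrometry and propagate the surviving orbits to score impact likelihood.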

  16. Some new mathematical methods for variational objective analysis

    Science.gov (United States)

    Wahba, Grace; Johnson, Donald R.

    1994-01-01

    Numerous results were obtained relevant to remote sensing, variational objective analysis, and data assimilation. A list of publications relevant in whole or in part is attached. The principal investigator gave many invited lectures, disseminating the results to the meteorological community as well as the statistical community. A list of invited lectures at meetings is attached, as well as a list of departmental colloquia at various universities and institutes.

  17. Analysis of manufacturing based on object oriented discrete event simulation

    Directory of Open Access Journals (Sweden)

    Eirik Borgen

    1990-01-01

Full Text Available This paper describes SIMMEK, a computer-based tool for performing analysis of manufacturing systems, developed at the Production Engineering Laboratory, NTH-SINTEF. Its main use will be in the analysis of job-shop type manufacturing, but certain facilities make it suitable for FMS as well as production-line manufacturing. This type of simulation is very useful in the analysis of any type of change that occurs in a manufacturing system. These changes may be investments in new machines or equipment, a change in layout, a change in product mix, the use of late shifts, etc. The effects these changes have on, for instance, the throughput, the amount of WIP (work in process), the costs or the net profit can be analysed, and this can be done before the changes are made and without disturbing the real system. Simulation takes into consideration, unlike other tools for analysis of manufacturing systems, uncertainty in arrival rates, process and operation times, and machine availability. It also shows the interaction effects that a job which is late in one machine has on the remaining machines in its route through the layout. It is these effects that cause every production plan not to be fulfilled completely. SIMMEK is based on discrete event simulation, and the modeling environment is object oriented. The object-oriented models are transformed by an object linker into data structures executable by the simulation kernel. The processes of the entity objects, i.e. the products, are broken down into events and put into an event list. The user-friendly graphical modeling environment makes it possible for end users to build models in a quick and reliable way, using terms from manufacturing. Various tests and a check of model logic are helpful functions when testing the validity of the models. Integration with software packages, with business graphics and statistical functions, is convenient in the result presentation phase.
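The event-list mechanism that drives a discrete-event simulator like SIMMEK can be sketched minimally. This is a generic illustration, not SIMMEK's code: one machine, a FIFO queue, and deterministic arrival and service times chosen so the result is easy to check by hand.

```python
import heapq
from collections import deque

def simulate(arrivals, service_time):
    """Single-machine FIFO shop driven by a time-ordered event list."""
    events = [(t, "arrival", job) for job, t in enumerate(arrivals)]
    heapq.heapify(events)
    queue, busy, completions = deque(), False, {}
    while events:
        time, kind, job = heapq.heappop(events)
        if kind == "arrival":
            if busy:
                queue.append(job)          # machine occupied: wait in queue
            else:
                busy = True                # start service immediately
                heapq.heappush(events, (time + service_time, "departure", job))
        else:                              # departure: job finished
            completions[job] = time
            if queue:
                heapq.heappush(events,
                               (time + service_time, "departure", queue.popleft()))
            else:
                busy = False
    return completions

# Three jobs arriving at t = 0, 1, 2 on one machine with 2 time units of service.
done = simulate([0.0, 1.0, 2.0], 2.0)
```

Here jobs complete at t = 2, 4 and 6, showing how a delay on one machine propagates to every job queued behind it.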

  18. Head First Object-Oriented Analysis and Design

    CERN Document Server

    McLaughlin, Brett D; West, David

    2006-01-01

"Head First Object-Oriented Analysis and Design is a refreshing look at the subject of OOAD. What sets this book apart is its focus on learning. The authors have made the content of OOAD accessible, usable for the practitioner." Ivar Jacobson, Ivar Jacobson Consulting "I just finished reading HF OOA&D and I loved it! The thing I liked most about this book was its focus on why we do OOA&D-to write great software!" Kyle Brown, Distinguished Engineer, IBM "Hidden behind the funny pictures and crazy fonts is a serious, intelligent, extremely well-crafted presentation of OO Analysis and Design"

  19. Previous induced abortion among young women seeking abortion-related care in Kenya: a cross-sectional analysis.

    Science.gov (United States)

    Kabiru, Caroline W; Ushie, Boniface A; Mutua, Michael M; Izugbara, Chimaraoke O

    2016-05-14

Unsafe abortion is a leading cause of death among young women aged 10-24 years in sub-Saharan Africa. Although having multiple induced abortions may exacerbate the risk for poor health outcomes, there has been minimal research on young women in this region who have multiple induced abortions. The objective of this study was therefore to assess the prevalence and correlates of reporting a previous induced abortion among young females aged 12-24 years seeking abortion-related care in Kenya. We used data on 1,378 young women aged 12-24 years who presented for abortion-related care in 246 health facilities in a nationwide survey conducted in 2012. Socio-demographic characteristics, reproductive and clinical histories, and physical examination assessment data were collected from women during a one-month data collection period using an abortion case capture form. Nine percent (n = 98) of young women reported a previous induced abortion prior to the index pregnancy for which they were receiving care. Statistically significant differences by previous history of induced abortion were observed for area of residence, religion and occupation at the bivariate level. Urban dwellers and unemployed/other young women were more likely to report a previous induced abortion. A greater proportion of young women reporting a previous induced abortion stated that they were using a contraceptive method at the time of the index pregnancy (47 %) compared with those reporting no previous induced abortion (23 %). Not surprisingly, a greater proportion of young women reporting a previous induced abortion (82 %) reported their index pregnancy as unintended (not wanted at all or mistimed) compared with women reporting no previous induced abortion (64 %). Our study results show that about one in every ten young women seeking abortion-related care in Kenya reports a previous induced abortion. Comprehensive post-abortion care services targeting young women are needed. In particular, post

  20. Geographic Object-Based Image Analysis - Towards a new paradigm.

    Science.gov (United States)

    Blaschke, Thomas; Hay, Geoffrey J; Kelly, Maggi; Lang, Stefan; Hofmann, Peter; Addink, Elisabeth; Queiroz Feitosa, Raul; van der Meer, Freek; van der Werff, Harald; van Coillie, Frieke; Tiede, Dirk

    2014-01-01

The amount of scientific literature on (Geographic) Object-based Image Analysis - GEOBIA has been and still is sharply increasing. These approaches to analysing imagery have antecedents in earlier research on image segmentation and use GIS-like spatial analysis within classification and feature extraction approaches. This article investigates these developments and their implications and asks whether or not this is a new paradigm in remote sensing and Geographic Information Science (GIScience). We first discuss several limitations of prevailing per-pixel methods when applied to high-resolution images. Then we explore the paradigm concept developed by Kuhn (1962) and discuss whether GEOBIA can be regarded as a paradigm according to this definition. We crystallize the core concepts of GEOBIA, including the role of objects, of ontologies and of the multiplicity of scales, and we discuss how these conceptual developments support important methods in remote sensing such as change detection and accuracy assessment. The ramifications of the different theoretical foundations of the 'per-pixel paradigm' and GEOBIA are analysed, as are some of the challenges along this path from pixels, to objects, to geo-intelligence. Based on several paradigm indications as defined by Kuhn and on an analysis of peer-reviewed scientific literature we conclude that GEOBIA is a new and evolving paradigm.

  1. Using Bayesian Population Viability Analysis to Define Relevant Conservation Objectives.

    Directory of Open Access Journals (Sweden)

    Adam W Green

Full Text Available Adaptive management provides a useful framework for managing natural resources in the face of uncertainty. An important component of adaptive management is identifying clear, measurable conservation objectives that reflect the desired outcomes of stakeholders. A common objective is to have a sustainable population, or metapopulation, but it can be difficult to quantify a threshold above which such a population is likely to persist. We performed a Bayesian metapopulation viability analysis (BMPVA) using a dynamic occupancy model to quantify the characteristics of two wood frog (Lithobates sylvatica) metapopulations resulting in sustainable populations, and we demonstrate how the results could be used to define meaningful objectives that serve as the basis of adaptive management. We explored scenarios involving metapopulations with different numbers of patches (pools), using estimates of breeding occurrence and successful metamorphosis from two study areas to estimate the probability of quasi-extinction and calculate the proportion of vernal pools producing metamorphs. Our results suggest that ≥50 pools are required to ensure long-term persistence, with approximately 16% of pools producing metamorphs in stable metapopulations. We demonstrate one way to incorporate the BMPVA results into a utility function that balances the trade-offs between ecological and financial objectives, which can be used in an adaptive management framework to make optimal, transparent decisions. Our approach provides a framework for using a standard method (i.e., PVA) and available information to inform a formal decision process to determine optimal and timely management policies.
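The notion of estimating quasi-extinction probability as a function of the number of pools can be illustrated with a simple stochastic patch-occupancy Monte Carlo. This is not the paper's Bayesian occupancy model; the persistence and colonization probabilities below are hypothetical, and the point is only that the extinction risk drops sharply as pools are added.

```python
import numpy as np

def quasi_extinction_prob(n_pools, years=20, persist=0.7, colonize=0.2,
                          n_reps=2000, seed=42):
    """Monte Carlo estimate of the probability that a patch-occupancy
    metapopulation loses all occupied pools within the time horizon."""
    rng = np.random.default_rng(seed)
    extinct = 0
    for _ in range(n_reps):
        occupied = np.ones(n_pools, dtype=bool)   # start fully occupied
        for _ in range(years):
            if not occupied.any():
                break
            # Each occupied pool persists independently; each empty pool
            # may be colonized while the metapopulation is extant.
            stays = occupied & (rng.random(n_pools) < persist)
            gains = ~occupied & (rng.random(n_pools) < colonize)
            occupied = stays | gains
        extinct += not occupied.any()
    return extinct / n_reps

p_small = quasi_extinction_prob(5)
p_large = quasi_extinction_prob(50)
```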

  2. Mediman: Object oriented programming approach for medical image analysis

    International Nuclear Information System (INIS)

    Coppens, A.; Sibomana, M.; Bol, A.; Michel, C.

    1993-01-01

Mediman is a new image analysis package which has been developed to quantitatively analyze Positron Emission Tomography (PET) data. It is object-oriented, written in C++, and its user interface is based on InterViews, on top of which new classes have been added. Mediman accesses data using external data representation or an import/export mechanism, which avoids data duplication. Multimodality studies are organized in a simple database which includes images, headers, color tables, lists, objects of interest (OOIs) and history files. Stored color-table parameters allow the user to focus directly on the interesting portion of the dynamic range. Lists allow the study to be organized according to modality, acquisition protocol, and temporal and spatial properties. OOIs (points, lines and regions) are stored in absolute 3-D coordinates, allowing correlation with other co-registered imaging modalities such as MRI or SPECT. OOIs have visualization properties and are organized into groups. Quantitative ROI analysis of anatomic images consists of position, distance and volume calculation on selected OOIs. An image calculator is connected to Mediman. Quantitation of metabolic images is performed via profiles, sectorization, time-activity curves and kinetic modeling. Mediman is menu- and mouse-driven; macro-commands can be registered and replayed. Its interface is customizable through a configuration file. The benefits of the object-oriented approach are discussed from a development point of view

  3. EGYPTIAN MUTUAL FUNDS ANALYSIS: HISTORY, PERFORMANCE, OBJECTIVES, RISK AND RETURN

    Directory of Open Access Journals (Sweden)

    Petru STEFEA

    2013-10-01

Full Text Available The present research aims to provide an overview of mutual funds in Egypt. The first mutual funds were established in 1994, and nowadays the total has reached approximately 90 funds. Income funds represent the largest share of the Egyptian mutual funds (40%), followed by growth funds (25%), while private equity funds account for the smallest share (1%). The total population of Egyptian mutual funds examined in the study was 22. Finally, when analyzing the relationship between risk and return, the study showed that the Egyptian mutual funds' objectives have an impact on fund return, total risk and systematic risk, and found an influence of the funds' objectives on the Sharpe and Treynor ratios.
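The two performance measures named in the abstract have standard definitions, which can be sketched directly; the fund figures below are hypothetical, purely to show the arithmetic.

```python
def sharpe_ratio(fund_return, risk_free_rate, return_std):
    """Excess return per unit of total risk (standard deviation)."""
    return (fund_return - risk_free_rate) / return_std

def treynor_ratio(fund_return, risk_free_rate, beta):
    """Excess return per unit of systematic risk (beta)."""
    return (fund_return - risk_free_rate) / beta

# Illustrative fund: 10% annual return, 2% risk-free rate,
# 16% return volatility, beta of 1.6 (hypothetical numbers).
sharpe = sharpe_ratio(0.10, 0.02, 0.16)
treynor = treynor_ratio(0.10, 0.02, 1.6)
```

The Sharpe ratio divides by total volatility while the Treynor ratio divides by beta, which is why a fund's objective (income versus growth) can shift the two rankings differently.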

  4. Object-Based Image Analysis in Wetland Research: A Review

    Directory of Open Access Journals (Sweden)

    Iryna Dronova

    2015-05-01

    Full Text Available The applications of object-based image analysis (OBIA in remote sensing studies of wetlands have been growing over recent decades, addressing tasks from detection and delineation of wetland bodies to comprehensive analyses of within-wetland cover types and their change. Compared to pixel-based approaches, OBIA offers several important benefits to wetland analyses related to smoothing of the local noise, incorporating meaningful non-spectral features for class separation and accounting for landscape hierarchy of wetland ecosystem organization and structure. However, there has been little discussion on whether unique challenges of wetland environments can be uniformly addressed by OBIA across different types of data, spatial scales and research objectives, and to what extent technical and conceptual aspects of this framework may themselves present challenges in a complex wetland setting. This review presents a synthesis of 73 studies that applied OBIA to different types of remote sensing data, spatial scale and research objectives. It summarizes the progress and scope of OBIA uses in wetlands, key benefits of this approach, factors related to accuracy and uncertainty in its applications and the main research needs and directions to expand the OBIA capacity in the future wetland studies. Growing demands for higher-accuracy wetland characterization at both regional and local scales together with advances in very high resolution remote sensing and novel tasks in wetland restoration monitoring will likely continue active exploration of the OBIA potential in these diverse and complex environments.

  5. The suitability of XRF analysis for compositional classification of archaeological ceramic fabric: A comparison with a previous NAA study

    International Nuclear Information System (INIS)

    Padilla, R.; Espen, P. van; Torres, P.P. Godo

    2006-01-01

The main drawbacks of EDXRF techniques, restricting their more frequent use for the specific purpose of compositional analysis of archaeological ceramic fabric, have been insufficient sensitivity to determine some important elements (like Cr and the REE, among others), somewhat worse precision, and the inability to perform standard-less quantitative procedures in the absence of suitable certified reference materials (CRMs) for ceramic fabric. This paper presents the advantages of combining two energy-dispersive X-ray fluorescence methods for fast and non-destructive analysis of ceramic fabric with increased sensitivity: selective polarized excitation using secondary targets (EDPXRF) and radioisotope excitation (R-XRF) using a ²⁴¹Am source. The analytical performance of the methods was evaluated by analyzing several CRMs of sediment type, and the fitness for the purpose of compositional classification was compared with that obtained using Instrumental Neutron Activation Analysis in a previous study of Cuban aborigine pottery

  6. The suitability of XRF analysis for compositional classification of archaeological ceramic fabric: A comparison with a previous NAA study

    Energy Technology Data Exchange (ETDEWEB)

    Padilla, R. [Centro de Aplicaciones Tecnologicas y Desarrollo Nuclear (CEADEN), Laboratorio de Analisis Quimico, Calle 30 no. 502, Playa, Ciudad Habana (Cuba)]. E-mail: roman.padilla@infomed.sld.cu; Espen, P. van [University of Antwerp (Belgium); Torres, P.P. Godo [Centro de Antropologia, Havana (Cuba)

    2006-02-03

The main drawbacks of EDXRF techniques, restricting their more frequent use for the specific purpose of compositional analysis of archaeological ceramic fabric, have been insufficient sensitivity to determine some important elements (like Cr and the REE, among others), somewhat worse precision, and the inability to perform standard-less quantitative procedures in the absence of suitable certified reference materials (CRMs) for ceramic fabric. This paper presents the advantages of combining two energy-dispersive X-ray fluorescence methods for fast and non-destructive analysis of ceramic fabric with increased sensitivity: selective polarized excitation using secondary targets (EDPXRF) and radioisotope excitation (R-XRF) using a ²⁴¹Am source. The analytical performance of the methods was evaluated by analyzing several CRMs of sediment type, and the fitness for the purpose of compositional classification was compared with that obtained using Instrumental Neutron Activation Analysis in a previous study of Cuban aborigine pottery.

  7. Objective image analysis of the meibomian gland area.

    Science.gov (United States)

    Arita, Reiko; Suehiro, Jun; Haraguchi, Tsuyoshi; Shirakawa, Rika; Tokoro, Hideaki; Amano, Shiro

    2014-06-01

To evaluate objectively the meibomian gland area using newly developed software for non-invasive meibography. Eighty eyelids of 42 patients without meibomian gland loss (meiboscore=0), 105 eyelids of 57 patients with loss of less than one-third of the total meibomian gland area (meiboscore=1), 13 eyelids of 11 patients with between one-third and two-thirds loss of meibomian gland area (meiboscore=2) and 20 eyelids of 14 patients with more than two-thirds loss of meibomian gland area (meiboscore=3) were studied. Lid borders were automatically determined. The software evaluated the distribution of the luminance and, by enhancing the contrast and reducing image noise, automatically discriminated the meibomian gland area. The software calculated the ratio of the total meibomian gland area relative to the total analysis area in all subjects. Repeatability of the software was also evaluated. The mean ratio of the meibomian gland area to the total analysis area in the upper/lower eyelids was 51.9±5.7%/54.7±5.4% in subjects with a meiboscore of 0, 47.7±6.0%/51.5±5.4% in those with a meiboscore of 1, 32.0±4.4%/37.2±3.5% in those with a meiboscore of 2 and 16.7±6.4%/19.5±5.8% in subjects with a meiboscore of 3. The meibomian gland area was objectively evaluated using the developed software. This system could be useful for objectively evaluating the effect of treatment on meibomian gland dysfunction.

  8. Virtual learning object and environment: a concept analysis.

    Science.gov (United States)

    Salvador, Pétala Tuani Candido de Oliveira; Bezerril, Manacés Dos Santos; Mariz, Camila Maria Santos; Fernandes, Maria Isabel Domingues; Martins, José Carlos Amado; Santos, Viviane Euzébia Pereira

    2017-01-01

To analyze the concept of virtual learning object and environment according to Rodgers' evolutionary perspective. Descriptive study with a mixed approach, based on the stages proposed by Rodgers in his concept analysis method. Data collection occurred in August 2015 with the search of dissertations and theses in the Bank of Theses of the Coordination for the Improvement of Higher Education Personnel. Quantitative data were analyzed based on simple descriptive statistics and the concepts through lexicographic analysis with support of the IRAMUTEQ software. The sample was made up of 161 studies. The concept of "virtual learning environment" was presented in 99 (61.5%) studies, whereas the concept of "virtual learning object" was presented in only 15 (9.3%) studies. A virtual learning environment includes several different types of virtual learning objects in a common pedagogical context.

  9. Dexamethasone intravitreal implant in previously treated patients with diabetic macular edema : Subgroup analysis of the MEAD study

    OpenAIRE

    Augustin, A.J.; Kuppermann, B.D.; Lanzetta, P.; Loewenstein, A.; Li, X.; Cui, H.; Hashad, Y.; Whitcup, S.M.; Abujamra, S.; Acton, J.; Ali, F.; Antoszyk, A.; Awh, C.C.; Barak, A.; Bartz-Schmidt, K.U.

    2015-01-01

Background Dexamethasone intravitreal implant 0.7 mg (DEX 0.7) was approved for treatment of diabetic macular edema (DME) after demonstration of its efficacy and safety in the MEAD registration trials. We performed subgroup analysis of the MEAD study results to evaluate the efficacy and safety of DEX 0.7 treatment in patients with previously treated DME. Methods Three-year, randomized, sham-controlled phase 3 study in patients with DME, best-corrected visual acuity (BCVA) of 34–68 Early Treatment...

  10. Joint Tensor Feature Analysis For Visual Object Recognition.

    Science.gov (United States)

    Wong, Wai Keung; Lai, Zhihui; Xu, Yong; Wen, Jiajun; Ho, Chu Po

    2015-11-01

Tensor-based object recognition has been widely studied in the past several years. This paper focuses on the issue of joint feature selection from tensor data and proposes a novel method called joint tensor feature analysis (JTFA) for tensor feature extraction and recognition. In order to obtain a set of jointly sparse projections for tensor feature extraction, we define the modified within-class tensor scatter value and the modified between-class tensor scatter value for regression. The k-mode optimization technique and L(2,1)-norm jointly sparse regression are combined to compute the optimal solutions. The convergence analysis, computational complexity analysis and the essence of the proposed method/model are also presented. It is interesting to note that the proposed method is very similar to singular value decomposition on the scatter matrix, but with a sparsity constraint on the right singular value matrix, or to eigen-decomposition on the scatter matrix performed in a sparse manner. Experimental results on some tensor datasets indicate that JTFA outperforms some well-known tensor feature extraction and selection algorithms.
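The L(2,1) norm that drives the joint sparsity in JTFA has a simple definition worth making concrete: the sum of the Euclidean norms of the matrix rows. Penalizing it pushes entire rows of a projection matrix to zero, selecting features jointly across all columns. A minimal numpy sketch (generic, not the authors' solver):

```python
import numpy as np

def l21_norm(W):
    """L(2,1) norm: sum of the Euclidean norms of the rows of W.
    Penalizing it drives whole rows to zero, i.e. row-wise
    (joint) sparsity across all columns of the projection."""
    return np.sqrt((W**2).sum(axis=1)).sum()

W = np.array([[3.0, 4.0],
              [0.0, 0.0],
              [0.0, 1.0]])
value = l21_norm(W)   # row norms are 5, 0 and 1
```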

  11. Analysis and Comparison of Objective Methods for Image Quality Assessment

    Directory of Open Access Journals (Sweden)

    P. S. Babkin

    2014-01-01

Full Text Available The purpose of this work is research on, and modification of, the reference objective methods for image quality assessment. The ultimate goal is to obtain a modification of formal assessments that corresponds more closely to subjective expert estimates (MOS). In considering the formal reference objective methods for image quality assessment we used the results of other authors, who offer comparative analyses of the most effective algorithms. Based on these investigations we chose two of the most successful algorithms, PQS and MSSSIM, for which further analysis was carried out in MATLAB 7.8 (R2009a). The publication focuses on features of the algorithms which are of great importance in practical implementation but are insufficiently covered in publications by other authors. In the implemented modification of the PQS algorithm, the Kirsch edge detector was replaced by the Canny edge detector. Further experiments were carried out according to the method of ITU-R BT.500-13 (01/2012) using monochrome images treated with different types of filters (it should be emphasized that the objective image quality assessment PQS is applicable only to monochrome images). The images were obtained with a thermal imaging surveillance system. The experimental results proved the effectiveness of this modification. In the specialized literature on formal image quality evaluation methods, this type of modification has not been mentioned. The method described in the publication can be applied to various practical implementations of digital image processing. The advisability and effectiveness of using the modified PQS method to assess structural differences between images are shown in the article, and this will be used in solving problems of identification and automatic control.
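PQS and MSSSIM are elaborate metrics, but the family of full-reference objective measures they belong to is easy to illustrate with the simplest member, PSNR. The sketch below is a generic illustration (not from the paper) on a synthetic monochrome image pair.

```python
import numpy as np

def psnr(reference, distorted, max_value=255.0):
    """Peak signal-to-noise ratio between a reference and a distorted
    monochrome image, in dB; higher means closer to the reference."""
    mse = np.mean((reference.astype(float) - distorted.astype(float)) ** 2)
    if mse == 0:
        return float("inf")   # identical images
    return 10.0 * np.log10(max_value**2 / mse)

# Reference image and a copy offset by a constant error of 5 grey levels.
ref = np.full((64, 64), 100, dtype=np.uint8)
dist = ref + 5
quality = psnr(ref, dist)
```

Metrics like MSSSIM improve on PSNR precisely because a purely pixel-wise error correlates poorly with MOS, which is the gap the paper's PQS modification also targets.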

  12. Analysis of Camera Parameters Value in Various Object Distances Calibration

    International Nuclear Information System (INIS)

    Yusoff, Ahmad Razali; Ariff, Mohd Farid Mohd; Idris, Khairulnizam M; Majid, Zulkepli; Setan, Halim; Chong, Albert K

    2014-01-01

In photogrammetric applications, good camera parameters are needed for mapping purposes, for example with an Unmanned Aerial Vehicle (UAV) equipped with a non-metric camera. Simple camera calibration is a common laboratory procedure for obtaining the camera parameter values. In aerial mapping, the interior camera parameter values from close-range camera calibration are used to correct image error. However, the causes and effects of the calibration steps used to obtain accurate mapping need to be analyzed. Therefore, this research contributes an analysis of camera parameters obtained with a portable calibration frame of 1.5 × 1 meter dimension. Object distances of two, three, four, five, and six meters are the research focus. The results are analyzed to find the changes in the image and camera parameter values. Hence, the calibration parameters of a camera are considered to differ depending on the type of calibration parameters and the object distance
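Why object distance matters can be seen from the pinhole projection model: the image size of a fixed target shrinks inversely with distance, so the calibration frame fills very different portions of the image at 2 m and at 6 m. A minimal sketch, where the 1.5 m target echoes the study's frame size but the 50 mm focal length is an assumed, illustrative value:

```python
def image_size_mm(object_size_m, distance_m, focal_length_mm=50.0):
    """Pinhole projection: image size = f * X / Z."""
    return focal_length_mm * object_size_m / distance_m

# A 1.5 m calibration frame viewed at the study's object distances.
sizes = {d: image_size_mm(1.5, d) for d in (2, 3, 4, 5, 6)}
```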

  13. At what price? A cost-effectiveness analysis comparing trial of labour after previous caesarean versus elective repeat caesarean delivery.

    Directory of Open Access Journals (Sweden)

    Christopher G Fawsitt

    Full Text Available BACKGROUND: Elective repeat caesarean delivery (ERCD) rates have been increasing worldwide, thus prompting obstetric discourse on the risks and benefits for the mother and infant. Yet these increasing rates also have major economic implications for the health care system. Given the dearth of information on the cost-effectiveness related to mode of delivery, the aim of this paper was to perform an economic evaluation of the costs and short-term maternal health consequences associated with a trial of labour after one previous caesarean delivery compared with ERCD for low-risk women in Ireland. METHODS: Using a decision analytic model, a cost-effectiveness analysis (CEA) was performed in which the measure of health gain was quality-adjusted life years (QALYs) over a six-week time horizon. A review of international literature was conducted to derive representative estimates of adverse maternal health outcomes following a trial of labour after caesarean (TOLAC) and ERCD. Delivery/procedure costs were derived from primary data collection and combined both "bottom-up" and "top-down" costing estimations. RESULTS: Maternal morbidities emerged in twice as many cases in the TOLAC group as in the ERCD group. However, TOLAC was found to be the most cost-effective method of delivery because it was substantially less expensive than ERCD (€1,835.06 versus €4,039.87 per woman, respectively) and QALYs were modestly higher (0.84 versus 0.70). Our findings were supported by probabilistic sensitivity analysis. CONCLUSIONS: Clinicians need to be well informed of the benefits and risks of TOLAC among low-risk women. Ideally, clinician-patient discourse would address differences in length of hospital stay and postpartum recovery time. While it is premature to advocate a policy of TOLAC across maternity units, the results of the study prompt further analysis and repeat iterations, encouraging future studies to synthesise previous research and new and relevant evidence under a single ...
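    The decision rule behind such a CEA can be sketched with the abstract's own point estimates: when one strategy is both cheaper and yields more QALYs, it dominates and no incremental cost-effectiveness ratio (ICER) needs to be quoted.

```python
def icer(cost_a, qaly_a, cost_b, qaly_b):
    """Incremental cost-effectiveness of strategy A versus B.
    Returns a dominance verdict when the ratio is not meaningful."""
    d_cost = cost_a - cost_b
    d_qaly = qaly_a - qaly_b
    if d_cost < 0 and d_qaly > 0:
        return "A dominates B"       # cheaper AND more effective
    if d_cost > 0 and d_qaly < 0:
        return "B dominates A"
    return d_cost / d_qaly           # extra cost per QALY gained

# Point estimates from the abstract (EUR per woman, QALYs over six weeks)
print(icer(1835.06, 0.84, 4039.87, 0.70))   # TOLAC vs ERCD
```

    With these inputs TOLAC dominates; in the non-dominant case the ICER would be compared against a willingness-to-pay threshold.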

  14. Peruvian Tropical Glacier May Survive Longer Than Previously Thought: Landsat Image Analysis of Nevado Coropuna Ice Cap, Peru

    Science.gov (United States)

    Kochtitzky, W. H.; Edwards, B. R.; Marino, J.; Manrique, N.

    2015-12-01

    Nevado Coropuna is a large volcanic complex in southern Peru (15.56°S, 72.62°W; 6,425 m). The complex is approximately 12 km east-west and 8 km north-south, with elevations from ~4,500 m at the base to over 6,000 m at the highest points. This ice cap is the largest hosted by a volcano in the tropics, and one of the ten biggest ice masses in the tropics. Previous workers have predicted that the Coropuna ice cap will completely melt by 2050. We present a new analysis of historic satellite imagery to test this hypothesis. In this study, ice and snow are classified based on unique spectral signatures, including spectral band thresholds, the Normalized Difference Snow Index, and the Band 4/5 ratio. Landsat scenes (L2, 4, 5, 7, and 8) from 1975 to present, in addition to one SPOT scene (2013), are used. Previous workers used images from June and July, which are peak snow periods in southern Peru, leading to overestimates of ice area. This study uses November and December images, when snow is at an annual minimum. Annual equilibrium line altitudes are calculated for each end-of-year image (November/December). The glaciers of Nevado Coropuna were found to be shrinking at ~0.5 km2/yr, which is ~1/3 the rate previously published. In this study, SPOT (1.5 m resolution) and Landsat 7 ETM scenes from November 23 and 26, 2013, respectively, were used to calibrate the spectral band threshold classification. While this study suggests that the ice cap of Coropuna will persist until 2100 given current rates, water quantity and security remain a concern for Peruvian agriculture. Coropuna is an active volcano, so it poses great risk to surrounding inhabitants from lahars, flooding, and debris avalanches. Our new data suggest that these will continue to be risks late into this century.
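    The Normalized Difference Snow Index mentioned above contrasts a green band with a shortwave-infrared band; the reflectances and the 0.4 cutoff below are illustrative defaults, whereas the study calibrates its thresholds per scene.

```python
def ndsi(green, swir):
    """Normalized Difference Snow Index for one pixel (band reflectances).
    Snow strongly reflects green light but absorbs shortwave infrared."""
    return (green - swir) / (green + swir)

def is_snow(green, swir, threshold=0.4):
    """A commonly used NDSI cutoff; real classifications, including this
    study's, combine it with per-band thresholds and band ratios."""
    return ndsi(green, swir) > threshold

print(is_snow(green=0.80, swir=0.10))   # bright snow/ice pixel -> True
print(is_snow(green=0.25, swir=0.20))   # bare rock pixel       -> False
```

    Running such a per-pixel test over each end-of-year scene, then summing pixel areas, gives the time series of ice extent from which the ~0.5 km2/yr rate is estimated.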

  15. Objective high Resolution Analysis over Complex Terrain with VERA

    Science.gov (United States)

    Mayer, D.; Steinacker, R.; Steiner, A.

    2012-04-01

    VERA (Vienna Enhanced Resolution Analysis) is a model-independent, high-resolution objective analysis of meteorological fields over complex terrain. The system consists of a specially developed quality control procedure and a combination of an interpolation and a downscaling technique. Whereas the so-called VERA-QC is presented at this conference in the contribution titled "VERA-QC, an approved Data Quality Control based on Self-Consistency" by Andrea Steiner, this presentation will focus on the method and characteristics of the VERA interpolation scheme, which makes it possible to compute grid point values of a meteorological field based on irregularly distributed observations and topography-related a priori knowledge. Over complex topography, meteorological fields are in general not smooth. The roughness induced by the topography can be explained physically. Knowledge of this behavior is used to define so-called Fingerprints (e.g. a thermal Fingerprint reproducing heating or cooling over mountainous terrain, or a dynamical Fingerprint reproducing a positive pressure perturbation on the windward side of a ridge) under idealized conditions. If the VERA algorithm recognizes patterns of one or more Fingerprints at a few observation points, the corresponding patterns are used to downscale the meteorological information in a greater surrounding. This technique makes it possible to achieve an analysis with a resolution much higher than that of the observational network. The interpolation of irregularly distributed stations to a regular grid (in space and time) is based on a variational principle applied to first- and second-order spatial and temporal derivatives. Mathematically, this can be formulated as a cost function that is equivalent to the penalty function of a thin plate smoothing spline. After the analysis field has been divided into the Fingerprint components and the unexplained part respectively, the requirement of a smooth distribution is applied to the ...
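    The variational principle can be caricatured in one dimension: the analysis minimises a cost that trades misfit to the observations against a squared second-difference (curvature) penalty, the discrete analogue of a thin-plate smoothing spline. The function below only evaluates such a cost for a candidate gridded field; it is a toy, not the VERA solver, and the weighting is an assumption.

```python
def analysis_cost(values, obs, obs_idx, weight=1.0):
    """Toy 1-D variational cost: squared misfit at observation points
    plus a curvature penalty over the whole grid."""
    misfit = sum((values[i] - o) ** 2 for i, o in zip(obs_idx, obs))
    curvature = sum(
        (values[k - 1] - 2.0 * values[k] + values[k + 1]) ** 2
        for k in range(1, len(values) - 1)
    )
    return misfit + weight * curvature

linear = [0.0, 1.0, 2.0, 3.0, 4.0]                # smooth candidate field
print(analysis_cost(linear, [0.0, 2.0, 4.0], [0, 2, 4]))   # fits and is smooth: 0.0
kinked = [0.0, 1.0, 5.0, 3.0, 4.0]                # spiky candidate field
print(analysis_cost(kinked, [0.0, 2.0, 4.0], [0, 2, 4]) > 0.0)
```

    In VERA the smoothness requirement is applied only to the part of the field not explained by the Fingerprints, so physically expected roughness is not penalised away.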

  16. A randomised clinical trial of intrapartum fetal monitoring with computer analysis and alerts versus previously available monitoring

    Directory of Open Access Journals (Sweden)

    Santos Cristina

    2010-10-01

    Full Text Available Abstract Background Intrapartum fetal hypoxia remains an important cause of death and permanent handicap, and in a significant proportion of cases there is evidence of suboptimal care related to fetal surveillance. Cardiotocographic (CTG) monitoring remains the basis of intrapartum surveillance, but its interpretation by healthcare professionals lacks reproducibility, and the technology has not been shown to improve clinically important outcomes. The addition of fetal electrocardiogram analysis has increased the potential to avoid adverse outcomes, but CTG interpretation remains its main weakness. A program for computerised analysis of intrapartum fetal signals, incorporating real-time alerts for healthcare professionals, has recently been developed. There is a need to determine whether this technology can result in better perinatal outcomes. Methods/design This is a multicentre randomised clinical trial. Inclusion criteria are: women aged ≥ 16 years, able to provide written informed consent, singleton pregnancies ≥ 36 weeks, cephalic presentation, no known major fetal malformations, in labour but excluding active second stage, planned for continuous CTG monitoring, and no known contra-indication for vaginal delivery. Eligible women will be randomised using a computer-generated randomisation sequence to one of two arms: continuous computer analysis of fetal monitoring signals with real-time alerts (intervention arm) or continuous CTG monitoring as previously performed (control arm). Electrocardiographic monitoring and fetal scalp blood sampling will be available in both arms. The primary outcome measure is the incidence of fetal metabolic acidosis (umbilical artery pH < 7.05, BDecf > 12 mmol/L). Secondary outcome measures are: caesarean section and instrumental vaginal delivery rates, use of fetal blood sampling, and 5-minute Apgar score < 7. Discussion This study will provide evidence of the impact of intrapartum monitoring with computer analysis and real-time alerts.

  17. Poka Yoke system based on image analysis and object recognition

    Science.gov (United States)

    Belu, N.; Ionescu, L. M.; Misztal, A.; Mazăre, A.

    2015-11-01

    Poka Yoke is a quality management method aimed at preventing faults from arising during production processes; it deals with "fail-safing" or "mistake-proofing". The Poka Yoke concept was created and developed by Shigeo Shingo for the Toyota Production System. Poka Yoke is used in many fields, especially in monitoring production processes. In many cases, identifying faults in a production process involves a cost higher than the necessary cost of disposal. Usually, Poka Yoke solutions are based on multiple sensors that identify nonconformities, which means the presence of additional equipment (mechanical, electronic) on the production line. As a consequence, coupled with the fact that the method itself is invasive and affects the production process, the cost of diagnostics increases, and the machines by which a Poka Yoke system can be implemented become bulkier and more sophisticated. In this paper we propose a solution for a Poka Yoke system based on image analysis and identification of faults. The solution consists of a module for image acquisition, mid-level processing, and an object recognition module using associative memory (of the Hopfield network type). All are integrated into an embedded system with an AD (Analog to Digital) converter and a Zynq-7000 device (28 nm technology).
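    The associative-memory recall stage can be sketched with a minimal Hopfield network: Hebbian weights store bipolar reference patterns, and iterated threshold updates pull a corrupted probe back towards the nearest stored pattern. The six-element pattern below is invented; the paper's module works on image feature vectors on an FPGA.

```python
def train(patterns):
    """Hebbian weights for a Hopfield associative memory over
    bipolar (+1/-1) patterns; zero self-connections."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j] / len(patterns)
    return w

def recall(w, probe, steps=5):
    """Synchronous sign updates until the state settles."""
    s = list(probe)
    for _ in range(steps):
        s = [1 if sum(w[i][j] * s[j] for j in range(len(s))) >= 0 else -1
             for i in range(len(s))]
    return s

stored = [1, -1, 1, -1, 1, -1]          # invented reference pattern
w = train([stored])
noisy = [1, -1, -1, -1, 1, -1]          # probe with one flipped bit
print(recall(w, noisy) == stored)       # -> True
```

    In the proposed system the recalled pattern identifies which known (conforming or faulty) object the acquired image most resembles.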

  18. The Use of Object-Oriented Analysis Methods in Surety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Craft, Richard L.; Funkhouser, Donald R.; Wyss, Gregory D.

    1999-05-01

    Object-oriented analysis methods have been used in the computer science arena for a number of years to model the behavior of computer-based systems. This report documents how such methods can be applied to surety analysis. By embodying the causality and behavior of a system in a common object-oriented analysis model, surety analysts can make the assumptions that underlie their models explicit and thus better communicate with system designers. Furthermore, given minor extensions to traditional object-oriented analysis methods, it is possible to automatically derive a wide variety of traditional risk and reliability analysis methods from a single common object model. Automatic model extraction helps ensure consistency among analyses and enables the surety analyst to examine a system from a wider variety of viewpoints in a shorter period of time. Thus it provides a deeper understanding of a system's behaviors and surety requirements. This report documents the underlying philosophy behind the common object model representation, the methods by which such common object models can be constructed, and the rules required to interrogate the common object model for derivation of traditional risk and reliability analysis models. The methodology is demonstrated in an extensive example problem.
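    One way to picture deriving a traditional risk analysis from a common object model is to walk an object graph and evaluate it as a fault tree; the node encoding, gate types and probabilities below are hypothetical illustrations, not the report's actual representation or toolchain.

```python
def failure_probability(node):
    """Evaluate a small fault tree extracted from an object model.
    A node is ('basic', p) or ('AND'/'OR', [children]); independence
    of basic events is assumed."""
    kind, payload = node
    if kind == "basic":
        return payload
    child_p = [failure_probability(c) for c in payload]
    if kind == "AND":                 # fails only if all children fail
        prob = 1.0
        for q in child_p:
            prob *= q
        return prob
    prob_ok = 1.0                     # OR: fails if any child fails
    for q in child_p:
        prob_ok *= 1.0 - q
    return 1.0 - prob_ok

# Hypothetical system: redundant pumps (AND) in series with one valve (OR)
tree = ("OR", [("AND", [("basic", 0.1), ("basic", 0.1)]), ("basic", 0.05)])
print(failure_probability(tree))
```

    The point of the common-model approach is that the same object graph could equally be interrogated to emit a reliability block diagram or an event tree, keeping the analyses mutually consistent.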

  19. Statistical analysis of COMPTEL maximum likelihood-ratio distributions: evidence for a signal from previously undetected AGN

    International Nuclear Information System (INIS)

    Williams, O. R.; Bennett, K.; Much, R.; Schoenfelder, V.; Blom, J. J.; Ryan, J.

    1997-01-01

    The maximum likelihood-ratio method is frequently used in COMPTEL analysis to determine the significance of a point source at a given location. In this paper we do not consider whether the likelihood-ratio at a particular location indicates a detection, but rather whether distributions of likelihood-ratios derived from many locations depart from that expected for source-free data. We have constructed distributions of likelihood-ratios by reading values from standard COMPTEL maximum likelihood-ratio maps at positions corresponding to the locations of different categories of AGN. Distributions derived from the locations of Seyfert galaxies are indistinguishable, according to a Kolmogorov-Smirnov test, from those obtained from "random" locations, but differ slightly from those obtained from the locations of flat spectrum radio loud quasars, OVVs, and BL Lac objects. This difference is not due to known COMPTEL sources, since regions near these sources are excluded from the analysis. We suggest that it might arise from a number of sources with fluxes below the COMPTEL detection threshold
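    The comparison of likelihood-ratio distributions rests on the two-sample Kolmogorov-Smirnov statistic, the maximum gap between empirical CDFs; a minimal version (without the p-value machinery, and with invented sample values) looks like this:

```python
def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum absolute
    difference between the two empirical CDFs."""
    a, b = sorted(a), sorted(b)

    def ecdf(sample, x):
        return sum(v <= x for v in sample) / len(sample)

    return max(abs(ecdf(a, x) - ecdf(b, x)) for x in a + b)

# Invented likelihood-ratio values read off a map at two sets of positions
at_agn = [0.2, 1.1, 2.5, 3.0, 4.2]
at_random = [0.1, 0.4, 0.9, 1.3, 1.8]
print(ks_statistic(at_agn, at_random))
```

    A small statistic (relative to its sampling distribution) means the AGN positions look like source-free noise; a large one hints at a population of sub-threshold sources.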

  20. Static analysis of unbounded structures in object-oriented programs

    NARCIS (Netherlands)

    Grabe, Immo

    2012-01-01

    In this thesis we investigate different techniques and formalisms to address complexity introduced by unbounded structures in object-oriented programs. We give a representation of a weakest precondition calculus for abstract object creation in dynamic logic. Based on this calculus we define symbolic

  1. A Comparative Analysis of Structured and Object-Oriented ...

    African Journals Online (AJOL)

    The concepts of structured and object-oriented programming are not new, but both approaches remain very much useful and relevant in today's programming paradigms. In this paper, we distinguish the features of structured programs from those of object-oriented programs. Structured programming is a ...

  2. Impact of previous open renal surgery on the outcomes of subsequent percutaneous nephrolithotomy: a meta-analysis.

    Science.gov (United States)

    Hu, Henglong; Lu, Yuchao; Cui, Lei; Zhang, Jiaqiao; Zhao, Zhenyu; Qin, Baolong; Wang, Yufeng; Wang, Qing; Wang, Shaogang

    2016-04-28

    The aim of this study was to systematically compare the perioperative outcomes of percutaneous nephrolithotomy in patients with or without previous ipsilateral open renal surgery (POS). Systematic searches of the PubMed, Web of Science and Cochrane Library databases were used to identify relevant studies and, following literature screening and data extraction, a meta-analysis was performed. Seventeen retrospective cohort studies involving 4833 procedures (4784 patients) were included. No statistically significant differences were observed between patients with or without POS in terms of supracostal access; single/multiple tracts; metal dilator need; time required to access the collecting system; fluoroscopic duration; demand for analgesics; hospital stay; final stone-free rate; risk of developing certain complications (eg, fever, haemorrhage, haemo/hydro/pneumothorax, blood transfusion, urinary tract infection and sepsis); and risk of total complications. Patients with POS, however, had a greater drop in haemoglobin (weighted mean difference (WMD) 1.78 g/L; 95% CI 1.09 to 2.47). Patients with POS also had a lower initial stone-free rate (RR 0.96; 95% CI 0.92 to 0.99; p=0.007) and more secondary treatment (RR 1.61; 95% CI 1.09 to 2.37; p=0.02). Sensitivity analysis produced comparable results, except for the differences in operative time and initial stone-free rate, which proved statistically insignificant (p=0.16 and 0.69, respectively). Current evidence suggests that percutaneous nephrolithotomy in patients with POS is associated with a significantly greater drop in haemoglobin, higher risk of requiring angiographic embolisation and auxiliary procedures, potentially longer operative time, and lower initial stone-free rate than percutaneous nephrolithotomy in patients without POS. Published by the BMJ Publishing Group Limited.
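    The risk ratios quoted above are conventionally reported with a 95% CI from the log-normal approximation; the sketch below uses invented 2x2 counts, not data from the included studies.

```python
import math

def risk_ratio(events_a, n_a, events_b, n_b):
    """Risk ratio with a 95% CI via the log-normal approximation,
    as used for pooled dichotomous outcomes in a meta-analysis."""
    rr = (events_a / n_a) / (events_b / n_b)
    se_log = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lo = math.exp(math.log(rr) - 1.96 * se_log)
    hi = math.exp(math.log(rr) + 1.96 * se_log)
    return rr, lo, hi

# Invented counts: 30/100 events in the POS group vs 20/100 without POS
print(risk_ratio(30, 100, 20, 100))
```

    A CI that excludes 1 corresponds to the p<0.05 differences reported; in a full meta-analysis the per-study ratios are then pooled with fixed- or random-effects weights.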

  3. NMR-based phytochemical analysis of Vitis vinifera cv Falanghina leaves. Characterization of a previously undescribed biflavonoid with antiproliferative activity.

    Science.gov (United States)

    Tartaglione, Luciana; Gambuti, Angelita; De Cicco, Paola; Ercolano, Giuseppe; Ianaro, Angela; Taglialatela-Scafati, Orazio; Moio, Luigi; Forino, Martino

    2018-03-01

    Vitis vinifera cv Falanghina is an ancient grape variety of Southern Italy. A thorough phytochemical analysis of Falanghina leaves was conducted to investigate their specialised metabolite content. Along with already known molecules, such as caftaric acid, quercetin-3-O-β-d-glucopyranoside, quercetin-3-O-β-d-glucuronide, kaempferol-3-O-β-d-glucopyranoside and kaempferol-3-O-β-d-glucuronide, a previously undescribed biflavonoid was identified. For this last compound, moderate bioactivity against the proliferation of metastatic melanoma cells was discovered. This finding may be of interest to researchers studying human melanoma. The high content of antioxidant glycosylated flavonoids supports the exploitation of grapevine leaves as an inexpensive source of natural products for the food industry and for both pharmaceutical and nutraceutical companies. Additionally, this study offers important insights into the plant's physiology, thus prompting possible technological research on genetic selection based on vine adaptation to specific pedo-climatic environments. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Towards a syntactic analysis of European Portuguese cognate objects

    Directory of Open Access Journals (Sweden)

    Celda Morgado Choupina

    2013-01-01

    Full Text Available The present paper aims at discussing selected syntactic aspects of cognate objects in European Portuguese, along the lines of Distributed Morphology (Haugen, 2009). Cognate objects may be readily discovered in numerous human languages, including European Portuguese (Chovia uma chuva miudinha). It is assumed in papers devoted to their English counterparts that they belong to various subclasses. Indeed, some of them are genuine cognates (to sleep a sleep...) or hyponyms (to dance a jig; Hale & Keyser, 2002). It turns out that in European Portuguese, they can be split into four different categories: (i) genuine cognate objects (chorar um choro...), (ii) similar cognate objects (dançar uma dança), (iii) object hyponyms (dançar um tango) and (iv) prepositional cognate objects (morrer de uma morte...). There are, then, significant differences between the various classes of cognate objects: whereas the genuine ones call imperatively for a restrictive modifier and a definite article, the remaining ones admit them only optionally. It might be concluded, then, that a lexicalist theory set up along the lines of Hale and Keyser is unable to deal successfully with the distributional facts proper to the various classes of cognate constructions in European Portuguese. That is why the present study is conducted more in accordance with the syntactic principles of Distributed Morphology, with a strong impact of hypotheses put forward by Haugen (2009).

  5. Efficacy of lisdexamfetamine dimesylate in children with attention-deficit/hyperactivity disorder previously treated with methylphenidate: a post hoc analysis

    Directory of Open Access Journals (Sweden)

    Jain Rakesh

    2011-11-01

    Full Text Available Abstract Background Attention-deficit/hyperactivity disorder (ADHD) is a common neurobehavioral psychiatric disorder that afflicts children, with a reported prevalence of 2.4% to 19.8% worldwide. Stimulants (methylphenidate [MPH] and amphetamine) are considered first-line ADHD pharmacotherapy. MPH is a catecholamine reuptake inhibitor, whereas amphetamines have additional presynaptic activity. Although MPH and amphetamine can effectively manage ADHD symptoms in most pediatric patients, many still fail to respond optimally to either. After administration, the prodrug stimulant lisdexamfetamine dimesylate (LDX) is converted to l-lysine and therapeutically active d-amphetamine in the blood. The objective of this study was to evaluate the clinical efficacy of LDX in children with ADHD who remained symptomatic (ie, nonremitters; ADHD Rating Scale IV [ADHD-RS-IV] total score > 18) on MPH therapy prior to enrollment in a 4-week placebo-controlled LDX trial, compared with the overall population. Methods In this post hoc analysis of data from a multicenter, randomized, double-blind, forced-dose titration study, we evaluated the clinical efficacy of LDX in children aged 6-12 years with and without prior MPH treatment at screening. ADHD symptoms were assessed using the ADHD-RS-IV scale, the Conners' Parent Rating Scale-Revised short form (CPRS-R), and the Clinical Global Impressions-Improvement (CGI-I) scale at screening, baseline, and endpoint. ADHD-RS-IV total and CPRS-R ADHD Index scores were summarized as mean (SD). Clinical response for the subgroup analysis was defined as a ≥ 30% reduction from baseline in ADHD-RS-IV score and a CGI-I score of 1 or 2. The Dunnett test was used to compare change from baseline in all groups. The number needed to treat to achieve one clinical responder or one symptomatic remitter was calculated as the reciprocal of the difference between the proportions on active treatment and placebo at endpoint. Results Of 290 randomized participants enrolled, 28 ...
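    The number-needed-to-treat calculation described in the Methods is just the reciprocal of the difference in responder proportions; the proportions below are hypothetical, not the trial's results.

```python
import math

def number_needed_to_treat(p_active, p_placebo):
    """NNT: reciprocal of the absolute difference in responder
    proportions, conventionally rounded up to a whole patient."""
    absolute_risk_reduction = p_active - p_placebo
    return math.ceil(1.0 / absolute_risk_reduction)

# Hypothetical responder proportions on active drug vs placebo
print(number_needed_to_treat(0.70, 0.30))   # -> 3
```

    An NNT of 3 would mean that, on average, three patients must be treated for one additional responder compared with placebo.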

  6. FEM analysis of impact of external objects to pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Gracie, Robert; Konuk, Ibrahim [Geological Survey of Canada, Ottawa, ON (Canada)]. E-mail: ikonuk@NRCan.gc.ca; Fredj, Abdelfettah [BMT Fleet Technology Limited, Ottawa, ON (Canada)

    2003-07-01

    One of the most common hazards to pipelines is the impact of external objects. Earth-moving machinery, farm equipment or bullets can dent or fail land pipelines; external objects such as anchors, fishing gear or ice can damage offshore pipelines. This paper develops an FEM model to simulate the impact process and presents investigations using the FEM model to determine the influence of the geometry and velocity of the impacting object, and also to study the influence of the pipe diameter, wall thickness, and concrete thickness along with internal pressure. The FEM model is developed using the LS-DYNA explicit FEM software, utilizing shell and solid elements. The model allows damage to, and removal of, the concrete and corrosion coating elements during impact. Parametric studies are presented relating the dent size to pipe diameter, wall thickness, concrete thickness, internal pipe pressure, and impacting object geometry. The primary objective of this paper is to develop and present the FEM model. The model can be applied to both offshore and land pipeline problems. Some examples are used to illustrate how the model can be applied to real-life problems. A future paper will present more detailed parametric studies. (author)

  7. Voice analysis as an objective state marker in bipolar disorder

    DEFF Research Database (Denmark)

    Faurholt-Jepsen, M.; Busk, Jonas; Frost, M.

    2016-01-01

    Changes in speech have been suggested as sensitive and valid measures of depression and mania in bipolar disorder. The present study aimed at investigating (1) voice features collected during phone calls as objective markers of affective states in bipolar disorder and (2) whether combining voice features with automatically generated objective smartphone data on behavioral activities and electronic self-monitored data would increase classification accuracy. Using smartphones, voice features, automatically generated objective smartphone data on behavioral activities and electronic self-monitored data were collected from 28 outpatients with bipolar disorder in naturalistic settings on a daily basis during a period of 12 weeks. Depressive and manic symptoms were assessed using ... Combining voice features with the automatically generated objective smartphone data and the electronic self-monitored data increased the accuracy, sensitivity and specificity of classification of affective states slightly. Voice features collected in naturalistic settings using smartphones may be used as objective state markers in patients with bipolar disorder.

  8. Software Analysis of Mining Images for Objects Detection

    Directory of Open Access Journals (Sweden)

    Jan Tomecek

    2013-11-01

    Full Text Available This contribution deals with the development of a new module of the robust FOTOMNG system for editing images from a video, or mining images from measurements, for subsequent improvement of the detection of required objects in a 2D image. The generated module allows the creation of a final high-quality picture by combining multiple images containing the searched objects. Input data can be combined according to parameters or based on reference frames. Correction of detected 2D objects is also part of this module. The solution is implemented in the FOTOMNG system, and the finished work has been tested with appropriate frames, which validated core functionality and usability. Tests confirmed the function of each part of the module, its accuracy, and the implications of its integration.

  9. Voice analysis as an objective state marker in bipolar disorder

    DEFF Research Database (Denmark)

    Faurholt-Jepsen, M.; Busk, Jonas; Frost, M.

    2016-01-01

    Voice features were combined with automatically generated objective smartphone data on behavioral activities (for example, number of text messages and phone calls per day) and electronic self-monitored data (mood) on illness activity to test whether this would increase the accuracy as a marker of affective states. Using smartphones, voice features, automatically generated objective smartphone data on behavioral activities and electronic self-monitored data were collected from 28 outpatients with bipolar disorder in naturalistic settings on a daily basis during a period of 12 weeks. Depressive and manic symptoms were assessed using ... Voice features were found to be more accurate, sensitive and specific in the classification of manic or mixed states, with an area under the curve (AUC) = 0.89, compared with an AUC = 0.78 for the classification of depressive states. Combining voice features with automatically generated objective smartphone data on behavioral activities ...
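    The AUC figures quoted here measure how often a randomly chosen positive case outranks a randomly chosen negative one; a minimal rank-based computation (with invented classifier scores, not the study's data) is:

```python
def auc(pos_scores, neg_scores):
    """Area under the ROC curve, computed as the probability that a
    positive example's score exceeds a negative's (ties count half)."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in pos_scores
               for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

# Invented per-day classifier scores for manic vs euthymic days
print(auc([0.9, 0.8, 0.7], [0.2, 0.4, 0.75]))
```

    An AUC of 0.5 is chance-level ranking and 1.0 is perfect separation, so the reported 0.89 for manic/mixed states indicates substantially better-than-chance discrimination.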

  10. Analysis for the high-level waste disposal cost object

    International Nuclear Information System (INIS)

    Kim, S. K.; Lee, J. R.; Choi, J. W.; Han, P. S.

    2003-01-01

    The purpose of this study is to analyse the ratio of cost objects in the disposal cost estimation. According to the results, operating cost is the most significant component of the total cost. There are many differences between disposal costs and product costs in terms of their constituents: while product costs may be classified into direct materials cost, direct manufacturing labor cost, and factory overhead, the disposal cost factors comprise technical factors and non-technical factors

  11. Meanline Analysis of Turbines with Choked Flow in the Object-Oriented Turbomachinery Analysis Code

    Science.gov (United States)

    Hendricks, Eric S.

    2016-01-01

    The Object-Oriented Turbomachinery Analysis Code (OTAC) is a new meanline/streamline turbomachinery modeling tool being developed at NASA GRC. During the development process, a limitation of the code was discovered in relation to the analysis of choked flow in axial turbines. This paper describes the relevant physics for choked flow as well as the changes made to OTAC to enable analysis in this flow regime.

  12. The objective assessment of experts' and novices' suturing skills using an image analysis program.

    Science.gov (United States)

    Frischknecht, Adam C; Kasten, Steven J; Hamstra, Stanley J; Perkins, Noel C; Gillespie, R Brent; Armstrong, Thomas J; Minter, Rebecca M

    2013-02-01

    To objectively assess suturing performance using an image analysis program and to provide validity evidence for this assessment method by comparing experts' and novices' performance. In 2009, the authors used an image analysis program to extract objective variables from digital images of suturing end products obtained during a previous study involving third-year medical students (novices) and surgical faculty and residents (experts). Variables included number of stitches, stitch length, total bite size, travel, stitch orientation, total bite-size-to-travel ratio, and symmetry across the incision ratio. The authors compared all variables between groups to detect significant differences, and two variables (total bite-size-to-travel ratio and symmetry across the incision ratio) to ideal values. Five experts and 15 novices participated. Experts' and novices' performances differed significantly, with large effect sizes (d > 0.8) for total bite size (P = .009, d = 1.5), travel (P = .045, d = 1.1), and total bite-size-to-travel ratio. The algorithm can extract variables from digital images of a running suture and rapidly provide quantitative summative assessment feedback. The significant differences found between groups confirm that this system can discriminate between skill levels. This image analysis program represents a viable training tool for objectively assessing trainees' suturing, a foundational skill for many medical specialties.
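    The effect sizes (Cohen's d) reported above use a pooled standard deviation; the sketch below uses invented measurements rather than the study's data.

```python
def cohens_d(group_a, group_b):
    """Cohen's d with a pooled standard deviation: how many SDs
    separate the expert and novice means on an extracted variable."""
    na, nb = len(group_a), len(group_b)
    ma = sum(group_a) / na
    mb = sum(group_b) / nb
    va = sum((x - ma) ** 2 for x in group_a) / (na - 1)   # sample variances
    vb = sum((x - mb) ** 2 for x in group_b) / (nb - 1)
    pooled_sd = (((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)) ** 0.5
    return (ma - mb) / pooled_sd

# Invented bite-size measurements (mm) for two skill groups
print(cohens_d([2.0, 4.0, 6.0], [1.0, 3.0, 5.0]))   # -> 0.5
```

    By the usual convention d ≈ 0.2 is small, 0.5 medium and 0.8 large, which is why the d = 1.5 and d = 1.1 above count as large expert-novice separations.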

  13. Insurer’s activity as object of economic analysis

    Directory of Open Access Journals (Sweden)

    O.O. Poplavskiy

    2015-12-01

    Full Text Available The article is devoted to the substantiation of the theoretical fundamentals of insurer analysis and the peculiarities of its implementation. Attention is focused on the important role of economic analysis in economic science, confirmed by its active use in research and practice. The author summarizes the classification and principles of analysis of an insurer's activity, supplementing it with principles specific to the insurer's environment, publicity and risk-orientation, which make it possible to take greater account of the peculiarities of insurance relations. The paper specifies the elements of the analysis and its key directions, including the analysis of the insurer's financing, the analysis of insurance operations and the analysis of investment activity, which will allow the effective functioning of a risk management system.

  14. GuidosToolbox: universal digital image object analysis

    Science.gov (United States)

    Peter Vogt; Kurt Riitters

    2017-01-01

    The increased availability of mapped environmental data calls for better tools to analyze the spatial characteristics of, and information contained in, those maps. Publicly available, user-friendly and universal tools are needed to foster the interdisciplinary development and application of methodologies for the extraction of image object information properties contained in ...

  15. Contextual object understanding through geospatial analysis and reasoning (COUGAR)

    Science.gov (United States)

    Douglas, Joel; Antone, Matthew; Coggins, James; Rhodes, Bradley J.; Sobel, Erik; Stolle, Frank; Vinciguerra, Lori; Zandipour, Majid; Zhong, Yu

    2009-05-01

    Military operations in urban areas often require detailed knowledge of the location and identity of commonly occurring objects and spatial features. The ability to rapidly acquire and reason over urban scenes is critically important to such tasks as mission and route planning, visibility prediction, communications simulation, target recognition, and inference of higher-level form and function. Under DARPA's Urban Reasoning and Geospatial ExploitatioN Technology (URGENT) Program, the BAE Systems team has developed a system that combines a suite of complementary feature extraction and matching algorithms with higher-level inference and contextual reasoning to detect, segment, and classify urban entities of interest in a fully automated fashion. Our system operates solely on colored 3D point clouds, and considers object categories with a wide range of specificity (fire hydrants, windows, parking lots), scale (street lights, roads, buildings, forests), and shape (compact shapes, extended regions, terrain). As no single method can recognize the diverse set of categories under consideration, we have integrated multiple state-of-the-art technologies that couple hierarchical associative reasoning with robust computer vision and machine learning techniques. Our solution leverages contextual cues and evidence propagation from features to objects to scenes in order to exploit the combined descriptive power of 3D shape, appearance, and learned inter-object spatial relationships. The result is a set of tools designed to significantly enhance the productivity of analysts in exploiting emerging 3D data sources.

  16. Multi-element analysis of unidentified fallen objects from Tatale in ...

    African Journals Online (AJOL)

    A multi-element analysis has been carried out on two fallen objects, # 01 and # 02, using instrumental neutron activation analysis technique. A total of 17 elements were identified in object # 01 while 21 elements were found in object # 02. The two major elements in object # 01 were Fe and Mg, which together constitute ...

  17. Reduced object related negativity response indicates impaired auditory scene analysis in adults with autistic spectrum disorder

    Directory of Open Access Journals (Sweden)

    Veema Lodhia

    2014-02-01

    Auditory Scene Analysis provides a useful framework for understanding atypical auditory perception in autism. Specifically, a failure to segregate the incoming acoustic energy into distinct auditory objects might explain the aversive reaction autistic individuals have to certain auditory stimuli or environments. Previous research with non-autistic participants has demonstrated the presence of an Object Related Negativity (ORN) in the auditory event-related potential that indexes pre-attentive processes associated with auditory scene analysis. Also evident is a later P400 component that is attention dependent and thought to be related to decision-making about auditory objects. We sought to determine whether there are differences between individuals with and without autism in the levels of processing indexed by these components. Electroencephalography (EEG) was used to measure brain responses from a group of 16 autistic adults, and 16 age- and verbal-IQ-matched typically developing adults. Auditory responses were elicited using lateralized dichotic pitch stimuli in which inter-aural timing differences create the illusory perception of a pitch that is spatially separated from a carrier noise stimulus. As in previous studies, control participants produced an ORN in response to the pitch stimuli. However, this component was significantly reduced in the participants with autism. In contrast, processing differences were not observed between the groups at the attention-dependent level (P400). These findings suggest that autistic individuals have difficulty segregating auditory stimuli into distinct auditory objects, and that this difficulty arises at an early pre-attentive level of processing.

  18. Subjective Analysis and Objective Characterization of Adaptive Bitrate Videos

    DEFF Research Database (Denmark)

    Søgaard, Jacob; Tavakoli, Samira; Brunnström, Kjell

    2016-01-01

    The HTTP Adaptive Streaming (HAS) technology allows video service providers to improve network utilization and thereby increase the end-users’ Quality of Experience (QoE). This has made HAS a widely used approach for audiovisual delivery. There are several previous studies aiming to identify...... the factors influencing subjective QoE of adaptation events. However, adapting the video quality typically lasts on a time scale much longer than what current standardized subjective testing methods are designed for, thus making the full matrix design of the experiment on an event level hard to achieve....... In this study, we investigated the overall subjective QoE of 6-minute-long video sequences containing different sequential adaptation events. This was compared to a data set from our previous work performed to evaluate the individual adaptation events. We could then derive a relationship between the overall...

  19. X-ray analysis of objects of art and archaeology

    International Nuclear Information System (INIS)

    Mantler, M.; Schreiner, M.

    2001-01-01

    Some theoretical aspects and limitations of XRF are discussed, including information depths in layered materials, characterization of inhomogeneous specimens, light element analysis, and radiation damage. Worked examples of applications of XRF and XRD are pigment analysis in delicate Chinese Paper, corrosion of glass, and leaching effects in soil-buried medieval coins. (author)

  20. The MUSIC algorithm for sparse objects: a compressed sensing analysis

    International Nuclear Information System (INIS)

    Fannjiang, Albert C

    2011-01-01

    The multiple signal classification (MUSIC) algorithm, and its extension for imaging sparse extended objects with noisy data, is analyzed by compressed sensing (CS) techniques. A thresholding rule is developed to augment the standard MUSIC algorithm. The notion of the restricted isometry property (RIP) and an upper bound on the restricted isometry constant (RIC) are employed to establish sufficient conditions for exact localization by MUSIC with or without noise. In the noiseless case, the sufficient condition gives an upper bound on the numbers of random sampling and incident directions necessary for exact localization. In the noisy case, the sufficient condition additionally assumes an upper bound for the noise-to-object ratio in terms of the RIC and the dynamic range of objects. This bound points to the super-resolution capability of the MUSIC algorithm. A rigorous comparison of performance between MUSIC and the CS minimization principle, basis pursuit denoising (BPDN), is given. In general, the MUSIC algorithm guarantees recovery, with high probability, of s scatterers with n=O(s²) random sampling and incident directions and sufficiently high frequency. For the favorable imaging geometry where the scatterers are distributed on a transverse plane, MUSIC guarantees recovery, with high probability, of s scatterers with a median frequency and n=O(s) random sampling/incident directions. Moreover, for the problems of spectral estimation and source localization, both BPDN and MUSIC guarantee, with high probability, exact identification of the frequencies of random signals with n=O(s) sampling times. However, in the absence of abundant realizations of signals, BPDN is the preferred method for spectral estimation. Indeed, BPDN can identify the frequencies approximately with just one realization of signals, with the recovery error at worst linearly proportional to the noise level.
Numerical results confirm that BPDN outperforms MUSIC in the well-resolved case while
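    As a rough, self-contained illustration of the MUSIC idea discussed above (for the simplest case of spectral estimation, not the authors' thresholded imaging variant), the following NumPy sketch recovers two sinusoid frequencies from noisy snapshots. All signal parameters are invented for the example.

```python
import numpy as np

def music_spectrum(x, n_sources, grid):
    """MUSIC pseudospectrum for frequency estimation from noisy snapshots."""
    m, r = x.shape
    R = x @ x.conj().T / r                    # sample covariance matrix
    _, v = np.linalg.eigh(R)                  # eigenvectors, ascending eigenvalues
    En = v[:, : m - n_sources]                # noise-subspace basis
    t = np.arange(m)
    A = np.exp(2j * np.pi * np.outer(t, grid))  # steering vectors as columns
    return 1.0 / np.linalg.norm(En.conj().T @ A, axis=0) ** 2

# Two uncorrelated sinusoids at 0.10 and 0.20 cycles/sample in light noise.
rng = np.random.default_rng(0)
m, r = 32, 64
t = np.arange(m)
x = np.stack(
    [np.exp(1j * (2 * np.pi * 0.10 * t + rng.uniform(0, 2 * np.pi)))
     + np.exp(1j * (2 * np.pi * 0.20 * t + rng.uniform(0, 2 * np.pi)))
     + 0.05 * (rng.standard_normal(m) + 1j * rng.standard_normal(m))
     for _ in range(r)], axis=1)

grid = np.linspace(0.0, 0.5, 501)
P = music_spectrum(x, n_sources=2, grid=grid)

# Pick the two largest local maxima of the pseudospectrum.
loc = np.where((P[1:-1] > P[:-2]) & (P[1:-1] > P[2:]))[0] + 1
top2 = np.sort(grid[loc[np.argsort(P[loc])[-2:]]])
```

    The two largest local maxima of the pseudospectrum land near the true frequencies of 0.10 and 0.20 cycles per sample.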

  1. Object-oriented data analysis framework for neutron scattering experiments

    International Nuclear Information System (INIS)

    Suzuki, Jiro; Nakatani, Takeshi; Ohhara, Takashi; Inamura, Yasuhiro; Yonemura, Masao; Morishima, Takahiro; Aoyagi, Tetsuo; Manabe, Atsushi; Otomo, Toshiya

    2009-01-01

    The Materials and Life Science Facility (MLF) of the Japan Proton Accelerator Research Complex (J-PARC) is one of the facilities that provide the highest-intensity pulsed neutron and muon beams. The MLF computing environment design group organizes the computing environments of MLF and its instruments. It is important that the computing environment be provided by the facility side, because meta-data formats, analysis functions, and the data analysis strategy should be shared among the many instruments in MLF. The C++ class library named Manyo-lib is a framework for developing data reduction and analysis software. The framework is composed of the class library for data reduction and analysis operators, network-distributed data processing modules, and data containers. The class library is wrapped by a Python interface created by SWIG. All classes of the framework can be called from the Python language, and Manyo-lib cooperates with the data acquisition and data-visualization components through the MLF-platform, a user interface unified in MLF, which works in Python. Raw data in the event-data format obtained by the data acquisition systems are converted into histogram-format data on Manyo-lib with high performance, and data reduction and analysis are performed with user-application software developed on top of Manyo-lib. We enforce standardization of data containers with Manyo-lib, and many additional fundamental data containers have been designed and developed. Experimental and analysis data in the data containers can be converted into NeXus files. Manyo-lib is the standard framework for developing analysis software in MLF, and prototypes of data-analysis software for each instrument are being developed by the instrument teams.
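    The event-to-histogram reduction step mentioned above can be pictured with a small NumPy sketch. This illustrates the concept only, not the Manyo-lib API; the event fields and bin layout are hypothetical.

```python
import numpy as np

# Hypothetical event list: each detected neutron is (pixel_id, time_of_flight_us).
events = np.array(
    [(3, 1250.0), (3, 1310.0), (7, 980.0), (3, 1255.0), (7, 1600.0)],
    dtype=[("pixel", "i4"), ("tof", "f8")])

# Event-to-histogram reduction: per-pixel time-of-flight histograms with
# 250 us bins over a 0-2000 us frame.
tof_edges = np.arange(0.0, 2000.0 + 250.0, 250.0)
hist = {int(p): np.histogram(events["tof"][events["pixel"] == p],
                             bins=tof_edges)[0]
        for p in np.unique(events["pixel"])}
```

    Downstream analysis code then works with the per-pixel histograms instead of the much larger raw event stream.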

  2. The relationship between emotional intelligence, previous caring experience and mindfulness in student nurses and midwives: a cross sectional analysis.

    Science.gov (United States)

    Snowden, Austyn; Stenhouse, Rosie; Young, Jenny; Carver, Hannah; Carver, Fiona; Brown, Norrie

    2015-01-01

    Emotional Intelligence (EI), previous caring experience and mindfulness training may have a positive impact on nurse education. More evidence is needed to support the use of these variables in nurse recruitment and retention. To explore the relationship between EI, gender, age, programme of study, previous caring experience and mindfulness training. Cross sectional element of a longitudinal study. 938 year-one nursing, midwifery and computing students at two Scottish Higher Education Institutes (HEIs) who entered their programme in September 2013. Participants completed a measure of 'trait' EI: the Trait Emotional Intelligence Questionnaire Short Form (TEIQue-SF); and 'ability' EI: Schutte et al.'s (1998) Emotional Intelligence Scale (SEIS). Demographics, previous caring experience and previous training in mindfulness were recorded. Relationships between variables were tested using non-parametric tests. Emotional intelligence increased with age on both measures of EI [TEIQ-SF H(5)=15.157, p=0.001; SEIS H(5)=11.388, p=0.044]. Females (n=786) scored higher than males (n=149) on both measures [TEIQ-SF, U=44,931, z=-4.509, p…] … emotional intelligence. Mindfulness training was associated with higher 'ability' emotional intelligence. Implications for recruitment, retention and further research are explored. Copyright © 2014. Published by Elsevier Ltd.

  3. Analysis of over 10,000 Cases finds no association between previously reported candidate polymorphisms and ovarian cancer outcome

    DEFF Research Database (Denmark)

    White, Kristin L; Vierkant, Robert A; Fogarty, Zachary C

    2013-01-01

    Ovarian cancer is a leading cause of cancer-related death among women. In an effort to understand contributors to disease outcome, we evaluated single-nucleotide polymorphisms (SNP) previously associated with ovarian cancer recurrence or survival, specifically in angiogenesis, inflammation, mitosis...

  4. The influence of previous subject experience on interactions during peer instruction in an introductory physics course: A mixed methods analysis

    Science.gov (United States)

    Vondruska, Judy A.

    Over the past decade, peer instruction and the introduction of student response systems have provided a means of improving student engagement and achievement in large-lecture settings. While the nature of the student discourse occurring during peer instruction is less understood, existing studies have shown student ideas about the subject, extraneous cues, and confidence level appear to matter in the student-student discourse. Using a mixed methods research design, this study examined the influence of previous subject experience on peer instruction in an introductory, one-semester Survey of Physics course. Quantitative results indicated students in discussion pairs where both had previous subject experience were more likely to answer clicker questions correctly both before and after peer discussion compared to student groups where neither partner had previous subject experience. Students in mixed discussion pairs were not statistically different in correct response rates from the other pairings. There was no statistically significant difference between the experience pairs on unit exam scores or the Peer Instruction Partner Survey. Although there was a statistically significant difference between the pre-MPEX and post-MPEX scores, there was no difference between the members of the various subject experience peer discussion pairs. The qualitative study, conducted after the quantitative study, helped to inform the quantitative results by exploring the nature of the peer interactions through survey questions and a series of focus group discussions. While the majority of participants described a benefit to the use of clickers in the lecture, their experience with their discussion partners varied. Students with previous subject experience tended to describe peer instruction more positively than students who did not have previous subject experience, regardless of the experience level of their partner.
They were also more likely to report favorable levels of comfort with

  5. Multispectral image analysis for object recognition and classification

    Science.gov (United States)

    Viau, C. R.; Payeur, P.; Cretu, A.-M.

    2016-05-01

    Computer and machine vision applications are used in numerous fields to analyze static and dynamic imagery in order to assist or automate decision-making processes. Advancements in sensor technologies now make it possible to capture and visualize imagery at various wavelengths (or bands) of the electromagnetic spectrum. Multispectral imaging has countless applications in various fields including (but not limited to) security, defense, space, medical, manufacturing and archeology. The development of advanced algorithms to process and extract salient information from the imagery is a critical component of the overall system performance. The fundamental objective of this research project was to investigate the benefits of combining imagery from the visual and thermal bands of the electromagnetic spectrum to improve the recognition rates and accuracy of commonly found objects in an office setting. A multispectral dataset (visual and thermal) was captured and features from the visual and thermal images were extracted and used to train support vector machine (SVM) classifiers. The SVM's class prediction ability was evaluated separately on the visual, thermal and multispectral testing datasets.
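    The feature-level fusion evaluated above can be sketched as follows. The data are synthetic, and a nearest-centroid classifier stands in for the SVMs actually used in the paper, to keep the example dependency-free; the scenario (thermal features separating classes that overlap visually) is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def centroid_classify(train_X, train_y, test_X):
    """Nearest-centroid classifier standing in for the paper's SVMs."""
    classes = np.unique(train_y)
    cents = np.stack([train_X[train_y == c].mean(axis=0) for c in classes])
    d = np.linalg.norm(test_X[:, None, :] - cents[None, :, :], axis=2)
    return classes[np.argmin(d, axis=1)]

# Synthetic features for two object classes that overlap in the visual band
# but separate cleanly in the thermal band (e.g. a powered-on device vs. a
# look-alike inert object).
n = 100
vis = rng.normal(0.0, 1.0, (2 * n, 4))                # visual features: uninformative here
thr = np.concatenate([rng.normal(0.0, 0.3, (n, 2)),   # class 0: cool
                      rng.normal(2.0, 0.3, (n, 2))])  # class 1: warm
y = np.concatenate([np.zeros(n, int), np.ones(n, int)])

fused = np.hstack([vis, thr])                         # feature-level fusion
acc_vis = (centroid_classify(vis, y, vis) == y).mean()
acc_fused = (centroid_classify(fused, y, fused) == y).mean()
```

    In this contrived setup the visual features alone classify at roughly chance level, while the fused visual+thermal features classify almost perfectly, which is the kind of gain the study set out to measure.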

  6. Analysis of process parameters in surface grinding using single objective Taguchi and multi-objective grey relational grade

    Directory of Open Access Journals (Sweden)

    Prashant J. Patil

    2016-09-01

    Close tolerance and good surface finish are achieved by means of the grinding process. This study was carried out for multi-objective optimization of MQL grinding process parameters. Water-based Al2O3 and CuO nanofluids of various concentrations were used as lubricant for the MQL system. Grinding experiments were carried out on an instrumented surface grinding machine. Taguchi's method was used for experimentation. Important process parameters that affect the G ratio and surface finish in MQL grinding are depth of cut, type of lubricant, feed rate, grinding wheel speed, coolant flow rate, and nanoparticle size. Grinding performance was evaluated by measuring the G ratio and surface finish. To improve the grinding process, a multi-objective process parameter optimization was performed using Taguchi-based grey relational analysis. Analysis of variance (ANOVA) was used to identify the most significant factors.
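    A minimal sketch of the grey relational grade computation used in such studies, with invented response values and the customary distinguishing coefficient ζ = 0.5 (this is the generic method, not the paper's specific data):

```python
import numpy as np

# Hypothetical responses for four experimental runs:
# column 0 = G ratio (larger-the-better), column 1 = surface roughness Ra in um
# (smaller-the-better).  Values are invented for illustration.
runs = np.array([
    [22.0, 0.42],
    [31.0, 0.35],
    [27.0, 0.50],
    [35.0, 0.28],
])

# Step 1: normalize each response to [0, 1] according to its objective.
g = (runs[:, 0] - runs[:, 0].min()) / (runs[:, 0].max() - runs[:, 0].min())
ra = (runs[:, 1].max() - runs[:, 1]) / (runs[:, 1].max() - runs[:, 1].min())
norm = np.column_stack([g, ra])

# Step 2: grey relational coefficients against the ideal sequence (all ones),
# with distinguishing coefficient zeta = 0.5.
delta = np.abs(1.0 - norm)
zeta = 0.5
coef = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

# Step 3: the grey relational grade is the mean coefficient per run; the run
# with the highest grade is the best compromise over both objectives.
grade = coef.mean(axis=1)
best = int(np.argmax(grade))
```

    Here the last run dominates on both responses, so its grade is exactly 1; in a real design the grades rank competing trade-offs, and ANOVA on the grades identifies the most significant factors.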

  7. Object permanence in cats: Analysis in locomotor space.

    Science.gov (United States)

    Thinus-Blanc, C; Poucet, B; Chapuis, N

    1982-04-01

    Stages IV and V object permanence were studied with 38-40-week-old cats. A constraining apparatus preventing animals from pursuing the bowl containing meat before it was concealed was used. Either the bowl was seen moving and disappeared from view behind a screen (stage IV trials), or after this sequence, it reappeared from behind the first screen and disappeared behind a second screen (stage V trials). In both situations cats performed significantly above chance but the paths taken to reach the food were different according to the stage. In stage V trials, cats expressed a preference for the path leading to the end of the second screen where the food was last seen disappearing. Copyright © 1982. Published by Elsevier B.V.

  8. Heating Development Analysis in Long HTS Objects - Updated Results

    Energy Technology Data Exchange (ETDEWEB)

    Vysotsky, V S; Repnikov, V V; Lobanov, E A; Karapetyan, G H; Sytnikov, V E [All-Russian Scientific R and D Cable Institute, 5, Shosse Entuziastov, 111024, Moscow (Russian Federation)

    2006-06-01

    During a fault in a grid, a large overload current, up to 30-fold, will flow into an HTS superconducting cable installed in the grid, causing its quench and heating. An upgraded model has been used to analyse the heating development in long HTS objects during overloads. The model better represents the real properties of the materials used. New calculations coincide well with experiments and permit determination of the cooling coefficients. The stability limit (thermal runaway current) was determined for different cooling conditions and index n. The overload currents at which the superconductor is heated to 100 K within 250 ms can also be determined. The model may be used for practical evaluation of operational parameters.

  9. Introductory Psychology Textbooks: An Objective Analysis and Update.

    Science.gov (United States)

    Griggs, Richard A.; Jackson, Sherri L.; Christopher, Andrew N.; Marek, Pam

    1999-01-01

    Explores changes in the introductory psychology textbook market through an analysis of edition, author, length, and content coverage of the volumes that comprise the current market. Finds a higher edition average, a decrease in the number of authors, an increase in text pages, and a focus on developmental psychology and sensation/perception. (CMK)

  10. Analysis of 60 706 Exomes Questions the Role of De Novo Variants Previously Implicated in Cardiac Disease

    DEFF Research Database (Denmark)

    Paludan-Müller, Christian; Ahlberg, Gustav; Ghouse, Jonas

    2017-01-01

    BACKGROUND: De novo variants in the exome occur at a rate of 1 per individual per generation, and because of the low reproductive fitness for de novo variants causing severe disease, the likelihood of finding these as standing variations in the general population is low. Therefore, this study...... sought to evaluate the pathogenicity of de novo variants previously associated with cardiac disease based on a large population-representative exome database. METHODS AND RESULTS: We performed a literature search for previous publications on de novo variants associated with severe arrhythmias...... trio studies (>1000 subjects). Of the monogenic variants, 11% (23/211) were present in ExAC, whereas 26% (802/3050) variants believed to increase susceptibility of disease were identified in ExAC. Monogenic de novo variants in ExAC had a total allele count of 109 and with ≈844 expected cases in Ex...

  11. The Army Communications Objectives Measurement System (ACOMS): Survey Analysis Plan

    Science.gov (United States)

    1988-05-01

    Edited by Gregory H. Gaertner (Westat) and Timothy W. Elig (ARI). … such as those of Lavidge and Steiner (1961), McGuire (1969), and Fishbein and Ajzen (1975). Fishbein and Ajzen (1975) and Aaker (1975) present … for college, challenge and personal development, or patriotic service. Corresponding to these beliefs are evaluations of the importance of these …

  12. Robustness of Multiple Objective Decision Analysis Preference Functions

    Science.gov (United States)

    2002-06-01

    “Bayesian Decision Theory and Utilitarian Ethics,” American Economic Review Papers and Proceedings, 68: 223-228 (May 1978). Hartsough, Bruce R. “A… (1983). Morrell, Darryl and Eric Driver. “Bayesian Network Implementation of Levi’s Epistemic Utility Decision Theory,” International Journal of… elicitation efficiency for the decision maker. Subject terms: decision analysis, utility theory, elicitation error, operations research, decision

  13. Objective Bayesian analysis of neutrino masses and hierarchy

    Science.gov (United States)

    Heavens, Alan F.; Sellentin, Elena

    2018-04-01

    Given the precision of current neutrino data, priors still impact noticeably the constraints on neutrino masses and their hierarchy. To avoid our understanding of neutrinos being driven by prior assumptions, we construct a prior that is mathematically minimally informative. Using the constructed uninformative prior, we find that the normal hierarchy is favoured but with inconclusive posterior odds of 5.1:1. Better data is hence needed before the neutrino masses and their hierarchy can be well constrained. We find that the next decade of cosmological data should provide conclusive evidence if the normal hierarchy with negligible minimum mass is correct, and if the uncertainty in the sum of neutrino masses drops below 0.025 eV. On the other hand, if neutrinos obey the inverted hierarchy, achieving strong evidence will be difficult with the same uncertainties. Our uninformative prior was constructed from principles of the Objective Bayesian approach. The prior is called a reference prior and is minimally informative in the specific sense that the information gain after collection of data is maximised. The prior is computed for the combination of neutrino oscillation data and cosmological data and still applies if the data improve.
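    For orientation, the quoted posterior odds of 5.1:1 correspond to a posterior probability of about 0.84 for the normal hierarchy, assuming the two hierarchies exhaust the model space:

```python
# The abstract quotes posterior odds of 5.1:1 in favour of the normal hierarchy.
# If the normal and inverted hierarchies are treated as the only two models,
# those odds convert to a posterior probability as follows.
odds = 5.1
p_normal = odds / (1.0 + odds)
p_inverted = 1.0 - p_normal
# p_normal is about 0.836 -- favoured, but short of conventional "strong
# evidence" thresholds (e.g. odds of 10:1 or more), hence "inconclusive".
```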

  14. At what price? A cost-effectiveness analysis comparing trial of labour after previous Caesarean versus elective repeat Caesarean delivery.

    LENUS (Irish Health Repository)

    Fawsitt, Christopher G

    2013-01-01

    Elective repeat caesarean delivery (ERCD) rates have been increasing worldwide, thus prompting obstetric discourse on the risks and benefits for the mother and infant. Yet, these increasing rates also have major economic implications for the health care system. Given the dearth of information on the cost-effectiveness related to mode of delivery, the aim of this paper was to perform an economic evaluation on the costs and short-term maternal health consequences associated with a trial of labour after one previous caesarean delivery compared with ERCD for low risk women in Ireland.

  15. Hadronic Triggers and trigger-object level analysis at ATLAS

    CERN Document Server

    Zaripovas, Donatas Ramilas; The ATLAS collaboration

    2017-01-01

    Hadronic signatures are critical to the high energy physics analysis program, and are broadly used for both Standard Model measurements and searches for new physics. These signatures include generic quark and gluon jets, as well as jets originating from b-quarks or the decay of massive particles (such as electroweak bosons or top quarks). Additionally missing transverse momentum from non-interacting particles provides an interesting probe in the search for new physics beyond the Standard Model. Developing trigger selections that target these events is a huge challenge at the LHC due to the enormous rates associated with these signatures. This challenge is exacerbated by the amount of pile-up activity, which continues to grow. In order to address these challenges, several new techniques have been developed during the past year in order to significantly improve the potential of the 2017 dataset and overcome the limiting factors to more deeply probing for new physics, such as storage and computing requirements f...

  16. Hadronic triggers and trigger object-level analysis at ATLAS

    CERN Document Server

    Zaripovas, Donatas Ramilas; The ATLAS collaboration

    2017-01-01

    Hadronic signatures are critical to the high energy physics analysis program at the Large Hadron Collider (LHC), and are broadly used for both Standard Model measurements and searches for new physics. These signatures include generic quark and gluon jets, as well as jets originating from b-quarks or the decay of massive particles (such as electroweak bosons or top quarks). Additionally missing transverse momentum from non-interacting particles provides an interesting probe in the search for new physics beyond the Standard Model. Developing trigger selections that target these events is a huge challenge at the LHC due to the enormous event rates associated with these signatures. This challenge is exacerbated by the amount of pile-up activity, which continues to grow. In order to address these challenges, several new techniques have been developed during the past year in order to significantly improve the potential of the 2017 dataset and overcome the limiting factors, such as storage and computing requirements...

  17. a Morphometric Analysis of HYLARANA SIGNATA Group (previously Known as RANA SIGNATA and RANA PICTURATA) of Malaysia

    Science.gov (United States)

    Zainudin, Ramlah; Sazali, Siti Nurlydia

    A study of morphometric variation in the Malaysian Hylarana signata group was conducted to reveal the morphological relationships within the species group. Twenty-seven morphological characters from 18 individuals of H. signata and H. picturata were measured and recorded. The numerical data were analysed using Discriminant Function Analysis in SPSS version 16.0 and UPGMA Cluster Analysis in Minitab version 14.0. The results show complex clustering among the examined species, which might be due to ancient polymorphism of the lineages or cryptic species within the group. Hence, further study should include more representatives in order to fully elucidate the morphological relationships of the H. signata group.

  18. Karnyothrips flavipes, a previously unreported predatory thrips of the coffee berry borer: DNA-based gut content analysis

    Science.gov (United States)

    A new predator of the coffee berry borer, Hypothenemus hampei, was found in the coffee growing area of Kisii in Western Kenya. Field observations, laboratory trials and gut content analysis using molecular tools have confirmed the role of the predatory thrips Karnyothrips flavipes Jones (Phlaeothrip...

  19. Categorical data processing for real estate objects valuation using statistical analysis

    Science.gov (United States)

    Parygin, D. S.; Malikov, V. P.; Golubev, A. V.; Sadovnikova, N. P.; Petrova, T. M.; Finogeev, A. G.

    2018-05-01

    Theoretical and practical approaches to the use of statistical methods for studying various properties of infrastructure objects are analyzed in the paper. Methods of forecasting the value of objects are considered. A method for coding categorical variables describing properties of real estate objects is proposed. The analysis of the results of modeling the price of real estate objects using regression analysis and an algorithm based on a comparative approach is carried out.
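    A minimal sketch of the kind of categorical coding the paper discusses: one-hot (dummy) encoding of a district variable feeding an ordinary least-squares price model. All listings and values are invented for illustration; the paper's own coding scheme and comparative algorithm are not reproduced here.

```python
import numpy as np

# Hypothetical listings: a categorical district label, floor area in m2, and
# price in thousands.  All values are invented for illustration.
districts = np.array(["center", "north", "center", "south", "north", "south"])
area = np.array([48.0, 60.0, 75.0, 52.0, 90.0, 70.0])
price = np.array([120.0, 105.0, 180.0, 80.0, 150.0, 110.0])

# One-hot (dummy) encoding of the categorical variable.
cats = np.unique(districts)                          # ['center' 'north' 'south']
onehot = (districts[:, None] == cats[None, :]).astype(float)

# Design matrix: intercept + area + (k-1) dummy columns (dropping one category
# avoids perfect collinearity with the intercept).
X = np.column_stack([np.ones(len(area)), area, onehot[:, 1:]])
beta, *_ = np.linalg.lstsq(X, price, rcond=None)
pred = X @ beta                                      # fitted prices
```

    The dropped "center" category becomes the baseline; each remaining dummy coefficient is that district's price offset relative to it.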

  20. Multi-objective optimization of GPU3 Stirling engine using third order analysis

    International Nuclear Information System (INIS)

    Toghyani, Somayeh; Kasaeian, Alibakhsh; Hashemabadi, Seyyed Hasan; Salimi, Morteza

    2014-01-01

    Highlights: • A third-order analysis is carried out for optimization of a Stirling engine. • The triple-optimization is done on a GPU3 Stirling engine. • A multi-objective optimization is carried out for a Stirling engine. • The results are compared with a previous experimental work to check the model improvement. • The TOPSIS, Fuzzy, and LINMAP decision-making methods are compared with each other. - Abstract: The Stirling engine is an external combustion engine that uses any external heat source to generate mechanical power and operates on a closed cycle. These engines are good choices for use in power generation systems, because they offer a reasonable theoretical efficiency, closer to the Carnot efficiency than that of other reciprocating thermal engines. Hence, many studies have been conducted on Stirling engines, and third-order thermodynamic analysis is one of them. In this study, multi-objective optimization with four decision variables, including the temperature of the heat source, stroke, mean effective pressure, and engine frequency, was applied in order to increase the efficiency and output power and reduce the pressure drop. Three decision-making procedures were applied to select among the optimal answers. Finally, the applied methods were compared with the results of an experimental work, and good agreement was observed
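    Of the three decision-making procedures named, TOPSIS is compact enough to sketch. The candidate designs, criteria values, and weights below are invented for illustration and are not the paper's GPU3 results.

```python
import numpy as np

def topsis(scores, weights, benefit):
    """Rank alternatives with TOPSIS: closeness to the ideal solution.

    scores: (n_alternatives, n_criteria); benefit: True where larger is better.
    """
    norm = scores / np.linalg.norm(scores, axis=0)   # vector-normalize columns
    v = norm * weights                               # weighted normalized matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    worst = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)        # distance to ideal
    d_neg = np.linalg.norm(v - worst, axis=1)        # distance to anti-ideal
    return d_neg / (d_pos + d_neg)                   # closeness in [0, 1]

# Hypothetical Pareto candidates: columns = (output power W, efficiency,
# pressure drop kPa); the first two are benefits, the last is a cost.
cands = np.array([
    [3500.0, 0.36, 12.0],
    [4200.0, 0.33, 18.0],
    [3900.0, 0.38, 15.0],
])
closeness = topsis(cands,
                   weights=np.array([0.4, 0.4, 0.2]),
                   benefit=np.array([True, True, False]))
best = int(np.argmax(closeness))
```

    With these weights the balanced third candidate wins; LINMAP and fuzzy decision-making would rank the same Pareto set by different distance or membership criteria.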

  1. Adherence to headache treatment and profile of previous health professional seeking among patients with chronic headache: a retrospective analysis.

    Science.gov (United States)

    Krymchantowski, Abouch Valenty; Adriano, Marcus Vinicius; de Góes, Renemilda; Moreira, Pedro Ferreira; da Cunha Jevoux, Carla

    2007-04-26

    Chronic headache is common among patients in neurology clinics. Patients may suffer important economic and social losses because of headaches, which may result in high expectations for treatment outcomes. When their treatment goals are not reached quickly, treatment may be difficult to maintain and patients may consult with numerous health professionals. This retrospective study evaluated the relationship between treatment and the profiles of previous health professionals consulted by patients in a tertiary headache center. The records were reviewed of all patients from a headache center who were seen in initial consultation between January 2000 and June 2003. Data related to patient demographic characteristics (sex and age), headache diagnosis, and the profile (quality and quantity) of previous healthcare consultations exclusively related to headache, were collected. The headache diagnoses were confirmed according to the IHS criteria (1988) and to the Silberstein criteria (1994,1996). Although adherence includes taking the prescribed medicines, discontinuing overused symptomatic medications, and changing behavior, among other things, for this study, adherence was defined as when the patient returned at least 2 times within a 3- to 3.5-month period. Patients were separated into groups depending on the number of different healthcare professionals they had consulted, from none to more than 7. Data from 495 patients were analyzed; 357 were women and 138 were men (ages 6 to 90 years; mean, 41.1 +/- 15.05 years). The headache diagnoses included migraine without aura (43.2%), chronic (transformed) migraine (40%), cluster headache (6.5%), episodic tension-type headache (0.8%), and hemicrania continua (0.4%). The 24.2% of patients who sought care from no more than 1 health professional showed a 59.8% adherence rate; 29% of the total had consulted 7 or more health professionals and showed an adherence rate of 74.3% (P = .0004). In Brazil, the belief is widespread that

  2. Objective analysis of image quality of video image capture systems

    Science.gov (United States)

    Rowberg, Alan H.

    1990-07-01

    As Picture Archiving and Communication System (PACS) technology has matured, video image capture has become a common way of capturing digital images from many modalities. While digital interfaces, such as those which use the ACR/NEMA standard, will become more common in the future, and are preferred because of the accuracy of image transfer, video image capture will be the dominant method in the short term, and may continue to be used for some time because of the low cost and high speed often associated with such devices. Currently, virtually all installed systems use methods of digitizing the video signal that is produced for display on the scanner viewing console itself. A series of digital test images has been developed for display on either a GE CT9800 or a GE Signa MRI scanner. These images have been captured with each of five commercially available image capture systems, and the resultant images digitally transferred on floppy disk to a PC1286 computer containing Optimast image analysis software. Here the images can be displayed in a comparative manner for visual evaluation, in addition to being analyzed statistically. Each of the images has been designed to support certain tests, including noise, accuracy, linearity, gray scale range, stability, slew rate, and pixel alignment. These image capture systems vary widely in these characteristics, in addition to the presence or absence of other artifacts, such as shading and moire pattern. Other accessories such as video distribution amplifiers and noise filters can also add or modify artifacts seen in the captured images, often giving unusual results. Each image is described, together with the tests which were performed using them. One image contains alternating black and white lines, each one pixel wide, after equilibration strips ten pixels wide. While some systems have a slew rate fast enough to track this correctly, others blur it to an average shade of gray, and do not resolve the lines, or give

  3. SOCIAL EXCLUSION AS AN OBJECT OF ECONOMIC ANALYSIS

    Directory of Open Access Journals (Sweden)

    Z. Halushka

    2014-06-01

    Full Text Available The article defines the essence and forms of social exclusion of individual citizens and certain strata of the population as a socioeconomic phenomenon. Theoretical principles and a methodology for assessing social exclusion are analyzed. Characteristic features of social exclusion are identified: low levels of consumption and income of individuals or groups; limited access to public mechanisms for increasing welfare; and a predominantly passive mode of interaction with society. Attention is drawn to the deprivation of individuals of a number of rights and to the limited nature of their access to resource-distributing institutions and to the labor market. Poverty is identified as the main category of social exclusion; the concept of the "circle of poverty" and the mechanisms of its persistence are substantiated. Other manifestations of social exclusion are examined as direct violations of basic human rights: to quality education, to medical services and good health, to an acceptable standard of living, to access to cultural goods, to defense of one's interests, and in general to participation in the economic, social, cultural and political life of the country. Data are cited on the share of excluded households in Ukraine by individual indicators. Analysis of the distribution of households by the number of accumulated indicators of social exclusion made it possible to set the threshold at which social exclusion becomes clearly manifest, at the level of 5 indicators; a second, critical degree of exclusion corresponds to the presence of 7 indicators. At this level there are 37.7% of households in Ukraine, far more than those considered poor by the relative national criterion (24.0%). It is established that the concept of social exclusion shows a "horizontal cut" of the system of social relations and the place of an individual, stratum or group in this system, defined by certain indicators. The necessity of the use of

  4. Diachronic and Synchronic Analysis - the Case of the Indirect Object in Spanish

    DEFF Research Database (Denmark)

    Dam, Lotte; Dam-Jensen, Helle

    2007-01-01

    The article deals with a monograph on the indirect object in Spanish. The book offers a many-faceted analysis of the indirect object, as it, on the one hand, gives a detailed diachronic analysis of what is known as clitic-doubled constructions and, on the other, a synchronic analysis of both...

  5. Allogeneic cell transplant expands bone marrow distribution by colonizing previously abandoned areas: an FDG PET/CT analysis.

    Science.gov (United States)

    Fiz, Francesco; Marini, Cecilia; Campi, Cristina; Massone, Anna Maria; Podestà, Marina; Bottoni, Gianluca; Piva, Roberta; Bongioanni, Francesca; Bacigalupo, Andrea; Piana, Michele; Sambuceti, Gianmario; Frassoni, Francesco

    2015-06-25

    Mechanisms of hematopoietic reconstitution after bone marrow (BM) transplantation remain largely unknown. We applied a computational quantification software application to hybrid 18F-fluorodeoxyglucose positron emission tomography (PET)/computed tomography (CT) images to assess activity and distribution of the hematopoietic system throughout the whole skeleton of recently transplanted patients. Thirty-four patients underwent PET/CT 30 days after either adult stem cell transplantation (allogeneic cell transplantation [ACT]; n = 18) or cord blood transplantation (CBT; n = 16). Our software automatically recognized compact bone volume and trabecular bone volume (IBV) in CT slices. Within IBV, coregistered PET data were extracted to identify the active BM (ABM) from the inactive tissue. Patients were compared with 34 matched controls chosen among a published normalcy database. Whole body ABM increased in ACT and CBT when compared with controls (12.4 ± 3 and 12.8 ± 6.8 vs 8.1 ± 2.6 mL/kg of ideal body weight [IBW], P bones, ABM increased three- and sixfold in CBT and ACT, respectively, compared with controls (0.9 ± 0.9 and 1.7 ± 2.5 vs 0.3 ± 0.3 mL/kg IBW, P transplanted BM into previously abandoned BM sites. © 2015 by The American Society of Hematology.
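    The quantification step, classifying intertrabecular-bone (IBV) voxels as active marrow by a PET uptake threshold and normalizing the volume to ideal body weight, might look roughly like this. A hypothetical simplification; the function name, threshold convention and inputs are assumptions, not the authors' software:

```python
def active_marrow_ml_per_kg(ibv_uptake, voxel_volume_ml, suv_threshold, ibw_kg):
    """Sum the volume of IBV voxels whose PET uptake exceeds a threshold,
    expressed in mL per kg of ideal body weight (IBW)."""
    active_voxels = sum(1 for suv in ibv_uptake if suv > suv_threshold)
    return active_voxels * voxel_volume_ml / ibw_kg
```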

  6. Transcriptomic analysis in a Drosophila model identifies previously implicated and novel pathways in the therapeutic mechanism in neuropsychiatric disorders

    Directory of Open Access Journals (Sweden)

    Priyanka eSingh

    2011-03-01

    Full Text Available We have taken advantage of a newly described Drosophila model to gain insights into the potential mechanism of antiepileptic drugs (AEDs), a group of drugs that are widely used in the treatment of several neurological and psychiatric conditions besides epilepsy. In the recently described Drosophila model that is inspired by pentylenetetrazole (PTZ)-induced kindling epileptogenesis in rodents, chronic PTZ treatment for seven days causes a decreased climbing speed and an altered CNS transcriptome, with the latter mimicking gene expression alterations reported in epileptogenesis. In the model, an increased climbing speed is further observed seven days after withdrawal from chronic PTZ. We used this post-PTZ withdrawal regime to identify potential AED mechanisms. In this regime, treatment with each of the five AEDs tested, namely ethosuximide (ETH), gabapentin (GBP), vigabatrin (VGB), sodium valproate (NaVP) and levetiracetam (LEV), resulted in rescue of the altered climbing behavior. The AEDs also normalized the PTZ withdrawal-induced transcriptomic perturbation in fly heads; whereas AED-untreated flies showed a large number of up- and down-regulated genes enriched in several processes including gene expression and cell communication, the AED-treated flies showed differential expression of only a small number of genes that did not enrich gene expression and cell communication processes. Gene expression and cell communication related upregulated genes in AED-untreated flies overrepresented several pathways: spliceosome, RNA degradation, and ribosome in the former category, and inositol phosphate metabolism, phosphatidylinositol signaling, endocytosis and hedgehog signaling in the latter. The transcriptome remodeling effect of AEDs was overall confirmed by microarray clustering that clearly separated the profiles of AED-treated and untreated flies. Besides being consistent with previously implicated pathways, our results provide evidence for a role of

  7. Does previous failed ESWL have a negative impact on the outcome of ureterorenoscopy? A matched pair analysis.

    Science.gov (United States)

    Philippou, Prodromos; Payne, David; Davenport, Kim; Timoney, Anthony G; Keeley, Francis X

    2013-11-01

    This study aims to evaluate the outcome of ureteroscopy/ureterorenoscopy (URS) as a salvage procedure for stones resistant to extracorporeal shock wave lithotripsy (ESWL). Between January 2009 and January 2012, 313 patients with upper tract lithiasis were treated by URS. Among them, 87 (27.8 %) had undergone URS after prior ESWL failed to achieve stone clearance (Salvage group). These patients were matched with a group of patients who underwent URS as first-line modality (Primary group). Stone-free rates and adjuvant procedures represented the primary points for comparison. Secondary points for comparison included complications, procedure duration, total laser energy used and length of hospitalization. Matching was possible in all cases. Stone clearance rates were 73.6 and 82.8 % for the Salvage and Primary group, respectively. The difference in stone clearance rates between the two groups was not statistically significant (p = 0.186). A total of 11 patients (12.6 %) in the Primary group and 18 patients (20.7 %) in the Salvage group underwent an adjuvant procedure (p = 0.154). No statistically significant differences were noted in terms of complications, procedure duration and length of hospitalization. In the Primary group, the laser energy used for stone fragmentation was higher (p = 0.043). The rate of ureteric stenting at the end of the procedure was higher for the Salvage group (p = 0.030). Previous failed ESWL is not a predictor for unfavorable outcome of URS. Salvage URS is associated, however, with an increased need for ureteric stenting at the end of the procedure.

  8. Hyper-Fractal Analysis: A visual tool for estimating the fractal dimension of 4D objects

    Science.gov (United States)

    Grossu, I. V.; Grossu, I.; Felea, D.; Besliu, C.; Jipa, Al.; Esanu, T.; Bordeianu, C. C.; Stan, E.

    2013-04-01

    This work presents a new version of a Visual Basic 6.0 application for estimating the fractal dimension of images and 3D objects (Grossu et al. (2010) [1]). The program was extended for working with four-dimensional objects stored in comma separated values files. This might be of interest in biomedicine, for analyzing the evolution in time of three-dimensional images.

    New version program summary
    Program title: Hyper-Fractal Analysis (Fractal Analysis v03)
    Catalogue identifier: AEEG_v3_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEEG_v3_0.html
    Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland
    Licensing provisions: Standard CPC license, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 745761
    No. of bytes in distributed program, including test data, etc.: 12544491
    Distribution format: tar.gz
    Programming language: MS Visual Basic 6.0
    Computer: PC
    Operating system: MS Windows 98 or later
    RAM: 100M
    Classification: 14
    Catalogue identifier of previous version: AEEG_v2_0
    Journal reference of previous version: Comput. Phys. Comm. 181 (2010) 831-832
    Does the new version supersede the previous version?: Yes
    Nature of problem: Estimating the fractal dimension of 4D images.
    Solution method: Optimized implementation of the 4D box-counting algorithm.
    Reasons for new version: Inspired by existing applications of 3D fractals in biomedicine [3], we extended the optimized version of the box-counting algorithm [1,2] to the four-dimensional case. This might be of interest in analyzing the evolution in time of 3D images. The box-counting algorithm was extended in order to support 4D objects, stored in comma separated values files. A new form was added for generating 2D, 3D, and 4D test data. The application was tested on 4D objects with known dimension, e.g. the Sierpinski hypertetrahedron gasket, Df=ln(5)/ln(2) (Fig. 1). The algorithm could be extended, with minimum effort, to
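    The box-counting method at the core of the program can be sketched independently of the distributed Visual Basic code. The Python version below (an illustration, with invented names) handles 2D, 3D, and 4D point sets alike, since the box key is just the coordinate tuple divided down to the grid scale:

```python
import math

def box_count_dimension(points, scales):
    """Box-counting estimate of fractal dimension for a set of d-dimensional
    points (coordinate tuples; works unchanged for 2D, 3D or 4D): count the
    occupied grid boxes N(s) at each scale s, then fit the slope of
    log N(s) versus log(1/s) by least squares."""
    xs, ys = [], []
    for s in scales:
        occupied = {tuple(int(c // s) for c in p) for p in points}
        xs.append(math.log(1.0 / s))
        ys.append(math.log(len(occupied)))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den  # the slope is the dimension estimate
```

    A filled planar patch of points yields a slope near 2, a line of points a slope near 1, mirroring the known-dimension checks (e.g. the Sierpinski hypertetrahedron gasket) mentioned in the summary.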

  9. Intermediary object for participative design processes based on the ergonomic work analysis

    DEFF Research Database (Denmark)

    Souza da Conceição, Carolina; Duarte, F.; Broberg, Ole

    2012-01-01

    The objective of this paper is to present and discuss the use of an intermediary object, built from the ergonomic work analysis, in a participative design process. The object was a zoning pattern, developed as a visual representation ‘mapping’ of the interrelations among the functional units of t...

  10. Methodology for object-oriented real-time systems analysis and design: Software engineering

    Science.gov (United States)

    Schoeffler, James D.

    1991-01-01

    Successful application of software engineering methodologies requires an integrated analysis and design life-cycle in which the various phases flow smoothly 'seamlessly' from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structuring of the system so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation when the original specification and perhaps high-level design is non-object oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions which emphasizes data and control flows followed by the abstraction of objects where the operations or methods of the objects correspond to processes in the data flow diagrams and then design in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects) each having its own time-behavior defined by a set of states and state-transition rules and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly-connected models which progress from the object-oriented real-time systems analysis and design system analysis logical models through the physical architectural models and the high-level design stages. The methodology is appropriate to the overall specification including hardware and software modules. In software modules, the systems analysis objects are transformed into software objects.
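    A "real-time systems-analysis object", an entity whose time behavior is defined by a set of states and state-transition rules, can be illustrated with a minimal sketch. The class, the valve example, and the event names are invented for illustration, not taken from the methodology itself:

```python
class AnalysisObject:
    """A concurrent entity modeled by states plus state-transition rules."""

    def __init__(self, name, initial, transitions):
        # transitions: dict mapping (current_state, event) -> next_state
        self.name = name
        self.state = initial
        self.transitions = transitions

    def handle(self, event):
        """Apply a transition rule if one matches; unknown events are ignored."""
        key = (self.state, event)
        if key in self.transitions:
            self.state = self.transitions[key]
        return self.state

# Example: a valve whose time behavior is captured by three transition rules.
valve = AnalysisObject("valve", "closed",
                       {("closed", "open_cmd"): "opening",
                        ("opening", "limit_switch"): "open",
                        ("open", "close_cmd"): "closed"})
```

    In the seamless transition the paper describes, such analysis objects carry over directly into the design models, and ultimately into software objects.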

  11. Infrared spectroscopy with multivariate analysis to interrogate endometrial tissue: a novel and objective diagnostic approach.

    Science.gov (United States)

    Taylor, S E; Cheung, K T; Patel, I I; Trevisan, J; Stringfellow, H F; Ashton, K M; Wood, N J; Keating, P J; Martin-Hirsch, P L; Martin, F L

    2011-03-01

    Endometrial cancer is the most common gynaecological malignancy in the United Kingdom. Diagnosis currently involves subjective expert interpretation of highly processed tissue, primarily using microscopy. Previous work has shown that infrared (IR) spectroscopy can be used to distinguish between benign and malignant cells in a variety of tissue types. Tissue was obtained from 76 patients undergoing hysterectomy, of whom 36 had endometrial cancer. Slivers of endometrial tissue (tumour and tumour-adjacent tissue, if present) were dissected and placed in fixative solution. Before analysis, tissues were thinly sliced, washed, mounted on low-E slides and desiccated; 10 IR spectra were obtained per slice by attenuated total reflection Fourier-transform IR (ATR-FTIR) spectroscopy. The derived data were subjected to principal component analysis followed by linear discriminant analysis. After spectroscopic analysis, tissue sections were haematoxylin and eosin-stained to provide histological verification. Using this approach, it is possible to distinguish benign from malignant endometrial tissue, and various subtypes of both. Cluster vector plots of benign (verified post-spectroscopy to be free of identifiable pathology) vs malignant tissue indicate the importance of the lipid and secondary protein structure (Amide I and Amide II) regions of the spectrum. These findings point towards the possibility of a simple objective test for endometrial cancer using ATR-FTIR spectroscopy. This would facilitate earlier diagnosis and so reduce the morbidity and mortality associated with this disease.
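    The classification stage, dimensionality reduction followed by a linear discriminant, can be illustrated with a deliberately simplified sketch: a diagonal-covariance linear discriminant on small feature vectors, standing in for the paper's PCA-plus-LDA pipeline. All names and data below are invented:

```python
def diagonal_lda(class0, class1):
    """Simplified linear discriminant under a diagonal-covariance assumption:
    w_j = (mean1_j - mean0_j) / pooled_var_j, score(x) = w . (x - midpoint).
    Positive scores favour class 1, negative scores class 0."""
    d = len(class0[0])

    def mean(rows):
        return [sum(r[j] for r in rows) / len(rows) for j in range(d)]

    def var(rows, m):
        return [sum((r[j] - m[j]) ** 2 for r in rows) / max(len(rows) - 1, 1)
                for j in range(d)]

    m0, m1 = mean(class0), mean(class1)
    v0, v1 = var(class0, m0), var(class1, m1)
    # small epsilon guards against zero variance in a feature
    w = [(m1[j] - m0[j]) / (0.5 * (v0[j] + v1[j]) + 1e-9) for j in range(d)]
    mid = [(m0[j] + m1[j]) / 2 for j in range(d)]

    def score(x):
        return sum(w[j] * (x[j] - mid[j]) for j in range(d))

    return score
```

    In practice the features would be the PCA scores of each spectrum rather than raw absorbances, and the full (non-diagonal) pooled covariance would be used.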

  12. Meanline Analysis of Turbines with Choked Flow in the Object-Oriented Turbomachinery Analysis Code

    Science.gov (United States)

    Hendricks, Eric S.

    2016-01-01

    The prediction of turbomachinery performance characteristics is an important part of the conceptual aircraft engine design process. During this phase, the designer must examine the effects of a large number of turbomachinery design parameters to determine their impact on overall engine performance and weight. The lack of detailed design information available in this phase necessitates the use of simpler meanline and streamline methods to determine the turbomachinery geometry characteristics and provide performance estimates prior to more detailed CFD (Computational Fluid Dynamics) analyses. While a number of analysis codes have been developed for this purpose, most are written in outdated software languages and may be difficult or impossible to apply to new, unconventional designs. The Object-Oriented Turbomachinery Analysis Code (OTAC) is currently being developed at NASA Glenn Research Center to provide a flexible meanline and streamline analysis capability in a modern object-oriented language. During the development and validation of OTAC, a limitation was identified in the code's ability to analyze and converge turbines as the flow approached choking. This paper describes a series of changes which can be made to typical OTAC turbine meanline models to enable the assessment of choked flow up to limit load conditions. Results produced with this revised model setup are provided in the form of turbine performance maps and are compared to published maps.

  13. Accelerometry-based gait analysis, an additional objective approach to screen subjects at risk for falling.

    Science.gov (United States)

    Senden, R; Savelberg, H H C M; Grimm, B; Heyligers, I C; Meijer, K

    2012-06-01

    This study investigated whether the Tinetti scale, as a subjective measure for fall risk, is associated with objectively measured gait characteristics. It is studied whether gait parameters are different for groups that are stratified for fall risk using the Tinetti scale. Moreover, the discriminative power of gait parameters to classify elderly according to the Tinetti scale is investigated. Gait of 50 elderly with a Tinetti > 24 and 50 elderly with a Tinetti ≤ 24 was analyzed using acceleration-based gait analysis. Validated algorithms were used to derive spatio-temporal gait parameters, harmonic ratio, inter-stride amplitude variability and root mean square (RMS) from the accelerometer data. Clear differences in gait were found between the groups. All gait parameters correlated with the Tinetti scale (r-range: 0.20-0.73). Only walking speed, step length and RMS showed moderate to strong correlations and high discriminative power to classify elderly according to the Tinetti scale. It is concluded that subtle gait changes that have previously been related to fall risk are not captured by the subjective assessment. It is therefore worthwhile to include objective gait assessment in fall risk screening. Copyright © 2012 Elsevier B.V. All rights reserved.
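    Two of the accelerometer-derived parameters are straightforward to compute. A sketch, assuming a detrended acceleration trace and a list of per-stride amplitudes; the validated algorithms in the study may define these differently:

```python
import math

def rms(signal):
    """Root-mean-square amplitude of a (mean-removed) acceleration trace."""
    mean = sum(signal) / len(signal)
    return math.sqrt(sum((s - mean) ** 2 for s in signal) / len(signal))

def variability(stride_amplitudes):
    """Inter-stride amplitude variability as a coefficient of variation (%):
    sample standard deviation of stride amplitudes relative to their mean."""
    m = sum(stride_amplitudes) / len(stride_amplitudes)
    sd = math.sqrt(sum((a - m) ** 2 for a in stride_amplitudes)
                   / max(len(stride_amplitudes) - 1, 1))
    return 100.0 * sd / m
```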

  14. Object-Based Image Analysis Beyond Remote Sensing - the Human Perspective

    Science.gov (United States)

    Blaschke, T.; Lang, S.; Tiede, D.; Papadakis, M.; Györi, A.

    2016-06-01

    We introduce a prototypical methodological framework for a place-based GIS-RS system for the spatial delineation of place while incorporating spatial analysis and mapping techniques using methods from different fields such as environmental psychology, geography, and computer science. The methodological lynchpin for this to happen - when aiming to delineate place in terms of objects - is object-based image analysis (OBIA).

  15. Foreign object detection and removal to improve automated analysis of chest radiographs

    International Nuclear Information System (INIS)

    Hogeweg, Laurens; Sánchez, Clara I.; Melendez, Jaime; Maduskar, Pragnya; Ginneken, Bram van; Story, Alistair; Hayward, Andrew

    2013-01-01

    Purpose: Chest radiographs commonly contain projections of foreign objects, such as buttons, brassier clips, jewellery, or pacemakers and wires. The presence of these structures can substantially affect the output of computer analysis of these images. An automated method is presented to detect, segment, and remove foreign objects from chest radiographs. Methods: Detection is performed using supervised pixel classification with a kNN classifier, resulting in a probability estimate per pixel to belong to a projected foreign object. Segmentation is performed by grouping and post-processing pixels with a probability above a certain threshold. Next, the objects are replaced by texture inpainting. Results: The method is evaluated in experiments on 257 chest radiographs. The detection at pixel level is evaluated with receiver operating characteristic analysis on pixels within the unobscured lung fields, and an Az value of 0.949 is achieved. Free-response operating characteristic analysis is performed at the object level, and 95.6% of objects are detected with on average 0.25 false positive detections per image. To investigate the effect of removing the detected objects through inpainting, a texture analysis system for tuberculosis detection is applied to images with and without pathology and with and without foreign object removal. Unprocessed, the texture analysis abnormality score of normal images with foreign objects is comparable to those with pathology. After removing foreign objects, the texture score of normal images with and without foreign objects is similar, while abnormal images, whether they contain foreign objects or not, achieve on average higher scores. Conclusions: The authors conclude that removal of foreign objects from chest radiographs is feasible and beneficial for automated image analysis.
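    The detection step, a kNN classifier producing a per-pixel probability of belonging to a foreign object, can be sketched as follows. A toy illustration: the feature choice, the value of k, and the threshold used for the subsequent segmentation are all assumptions:

```python
def knn_object_probability(pixel_features, training, k=5):
    """Per-pixel probability of belonging to a projected foreign object,
    estimated as the fraction of the k nearest training samples (by squared
    Euclidean distance in feature space) that are labelled 'object' (1)."""
    neighbours = sorted(
        (sum((a - b) ** 2 for a, b in zip(pixel_features, feats)), label)
        for feats, label in training
    )[:k]
    return sum(label for _, label in neighbours) / k
```

    Segmentation then groups pixels whose probability exceeds a threshold, after which the grouped regions are replaced by texture inpainting.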

  16. Objective Oriented Design of Architecture for TH System Safety Analysis Code and Verification

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Bub Dong

    2008-03-15

    In this work, an object-oriented design of a generic system analysis code has been attempted, based on previous work at KAERI on a two-phase, three-field pilot code. Input and output design, the TH solver, component models, special TH models, the heat structure solver, general tables, trips and controls, and on-line graphics have all been implemented. All features essential for system analysis have been designed and implemented in the final product, the SYSTF code. The C language was used for implementation in the Visual Studio 2008 IDE (Integrated Development Environment), since it is simpler and lighter-weight than C++. The code has simple and essential models and correlations, special components, special TH models and a heat structure model. The input features, however, make it possible to simulate various scenarios, such as steady state, non-LOCA transients and LOCA accidents. The structural validity has been tested through various verification tests, and it has been shown that the developed code can treat non-LOCA and LOCA simulations. However, more detailed design and implementation of the models are required to establish the physical validity of SYSTF code simulations.

  18. A Method for a Retrospective Analysis of Course Objectives: Have Pursued Objectives in Fact Been Attained? Twente Educational Report Number 7.

    Science.gov (United States)

    Plomp, Tjeerd; van der Meer, Adri

    A method pertaining to the identification and analysis of course objectives is discussed. A framework is developed by which post facto objectives can be determined and students' attainment of the objectives can be assessed. The method can also be used for examining the quality of instruction. Using this method, it is possible to determine…

  19. Introduction to the GEOBIA 2010 special issue: From pixels to geographic objects in remote sensing image analysis

    Science.gov (United States)

    Addink, Elisabeth A.; Van Coillie, Frieke M. B.; De Jong, Steven M.

    2012-04-01

    Traditional image analysis methods are mostly pixel-based and use the spectral differences of landscape elements at the Earth surface to classify these elements or to extract element properties from the Earth Observation image. Geographic object-based image analysis (GEOBIA) has received considerable attention over the past 15 years for analyzing and interpreting remote sensing imagery. In contrast to traditional image analysis, GEOBIA works more like the human eye-brain combination does. The latter uses an object's color (spectral information), size, texture, shape and relation to other image objects to interpret and analyze what we see. GEOBIA starts by segmenting the image, grouping pixels into objects, and next uses a wide range of object properties to classify the objects or to extract object properties from the image. Significant advances and improvements in image analysis and interpretation have been made thanks to GEOBIA. In June 2010 the third conference on GEOBIA took place at Ghent University, after successful previous meetings in Calgary (2008) and Salzburg (2006). This special issue presents a selection of the 2010 conference papers that are worked out as full research papers for JAG. The papers cover GEOBIA applications as well as innovative methods and techniques. The topics range from vegetation mapping, forest parameter estimation, tree crown identification, urban mapping, land cover change, feature selection methods and the effects of image compression on segmentation. From the original 94 conference papers, 26 full research manuscripts were submitted; nine papers were selected and are presented in this special issue. Selection was done on the basis of quality and topic of the studies. The next GEOBIA conference will take place in Rio de Janeiro from 7 to 9 May 2012, where we hope to welcome even more scientists working in the field of GEOBIA.

  20. Object-based image analysis and data mining for building ontology of informal urban settlements

    Science.gov (United States)

    Khelifa, Dejrriri; Mimoun, Malki

    2012-11-01

    During recent decades, unplanned settlements have appeared around the big cities in most developing countries and, as a consequence, numerous problems have emerged. The identification of different kinds of settlements is thus a major concern and challenge for the authorities of many countries. Very High Resolution (VHR) remotely sensed imagery has proved to be a very promising way to detect different kinds of settlements, especially through the use of new object-based image analysis (OBIA). The most important key is in understanding what characteristics make unplanned settlements differ from planned ones: most experts characterize unplanned urban areas by small building sizes at high densities, no orderly road arrangement and a lack of green spaces. Knowledge about different kinds of settlements can be captured as a domain ontology that has the potential to organize knowledge in a formal, understandable and sharable way. In this work we focus on extracting knowledge from VHR images and expert knowledge. We used an object-based strategy, segmenting a VHR image taken over an urban area into regions of homogeneous pixels at an adequate scale level and then computing spectral, spatial and textural attributes for each region to create objects. Genetic-based data mining was applied to generate highly predictive and comprehensible classification rules based on selected samples from the OBIA result. Optimized intervals of relevant attributes are found and linked with land use types to form classification rules. The unplanned areas were separated from the planned ones through analysis of the line segments detected from the input image. Finally, a simple ontology was built based on the previous processing steps. The approach has been tested on VHR images of one of the biggest Algerian cities, which has grown considerably in recent decades.
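    Applying mined interval rules to OBIA regions can be sketched simply. The attribute names, intervals and labels below are invented for illustration; in the paper the rules come out of the genetic-based mining step:

```python
def classify_region(region, rules, default="planned"):
    """Apply interval-based classification rules to an OBIA region.
    region: dict of attribute -> value
    rules: list of (conditions, label), where conditions maps an
           attribute name to an inclusive (low, high) interval.
    The first rule whose every interval contains the region's value wins."""
    for conditions, label in rules:
        if all(lo <= region[attr] <= hi
               for attr, (lo, hi) in conditions.items()):
            return label
    return default
```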

  1. X-ray fluorescence analysis of archaeological finds and art objects: Recognizing gold and gilding

    International Nuclear Information System (INIS)

    Trojek, Tomáš; Hložek, Martin

    2012-01-01

    Many cultural heritage objects were gilded in the past, and nowadays they can be found in archeological excavations or in historical buildings dating back to the Middle Ages, or from the modern period. Old gilded artifacts have been studied using X-ray fluorescence analysis and 2D microanalysis. Several techniques that enable the user to distinguish gold and gilded objects are described and then applied to investigate artifacts. These techniques differ in instrumentation, data analysis and numbers of measurements. The application of Monte Carlo calculation to a quantitative analysis of gilded objects is also introduced. - Highlights: ► Three techniques of gilding identification with XRF analysis are proposed. ► These techniques are applied to gold and gilded art and archeological objects. ► Composition of a substrate material is determined by a Monte Carlo simulation.

  2. Object-oriented analysis and design for information systems Modeling with UML, OCL, IFML

    CERN Document Server

    Wazlawick, Raul Sidnei

    2014-01-01

    Object-Oriented Analysis and Design for Information Systems clearly explains real object-oriented programming in practice. Expert author Raul Sidnei Wazlawick explains concepts such as object responsibility, visibility and the real need for delegation in detail. The object-oriented code generated by using these concepts in a systematic way is concise, organized and reusable. The patterns and solutions presented in this book are based in research and industrial applications. You will come away with clarity regarding processes and use cases and a clear understanding of how to expand a use case.

  3. A meta-analysis of drug resistant tuberculosis in Sub-Saharan Africa: how strongly associated with previous treatment and HIV co-infection?

    Science.gov (United States)

    Berhan, Asres; Berhan, Yifru; Yizengaw, Desalegn

    2013-11-01

    In Sub-Saharan Africa, the fight against tuberculosis (TB) has encountered a great challenge because of the emergence of drug-resistant TB strains and the high prevalence of HIV infection. The aim of this meta-analysis was to determine the association of drug-resistant TB with anti-TB drug treatment history and HIV co-infection. After an electronic literature search in the databases of Medline, HINARI, EMBASE and the Cochrane Library, article selection and data extraction were carried out. HIV co-infection and previous history of TB treatment were used as predictors for the occurrence of any anti-TB drug-resistant or multiple drug-resistant TB (MDR-TB). The risk ratios for each included study and for the pooled sample were computed using the random-effects model. Heterogeneity tests, sensitivity analyses and funnel plots were also done. The pooled analysis showed that the risk of developing drug-resistant TB to at least one anti-TB drug was about 3 times higher in individuals who had a previous history of anti-TB treatment than in new TB cases. The risk of having MDR-TB in previously treated TB cases was more than 5-fold higher than that of new TB cases. Resistance to ethambutol and rifampicin was more than fivefold higher among those previously treated with anti-TB drugs. However, HIV infection was not associated with drug-resistant TB. There was a strong association of previous anti-TB treatment with MDR-TB. Primary treatment warrants special emphasis, and screening for anti-TB drug sensitivity has to be strengthened.
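    Random-effects pooling of risk ratios of the kind described above is commonly done with the DerSimonian-Laird estimator. A compact sketch, assuming each study is summarized by its risk ratio and the variance of its log risk ratio (not the authors' actual software):

```python
import math

def pooled_risk_ratio(studies):
    """DerSimonian-Laird random-effects pooled risk ratio.
    studies: list of (risk_ratio, variance_of_log_risk_ratio) pairs."""
    y = [math.log(rr) for rr, _ in studies]          # log risk ratios
    v = [var for _, var in studies]                  # within-study variances
    w = [1.0 / vi for vi in v]                       # fixed-effect weights
    sw = sum(w)
    y_fixed = sum(wi * yi for wi, yi in zip(w, y)) / sw
    # Cochran's Q and the between-study variance tau^2
    q = sum(wi * (yi - y_fixed) ** 2 for wi, yi in zip(w, y))
    df = len(studies) - 1
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - df) / c) if c > 0 else 0.0
    # random-effects weights incorporate tau^2
    w_re = [1.0 / (vi + tau2) for vi in v]
    y_re = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    return math.exp(y_re)
```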

  4. Exploring the impact of learning objects in middle school mathematics and science classrooms: A formative analysis

    Directory of Open Access Journals (Sweden)

    Robin H. Kay

    2008-12-01

    Full Text Available The current study offers a formative analysis of the impact of learning objects in middle school mathematics and science classrooms. Five reliable and valid measures of effectiveness were used to examine the impact of learning objects from the perspective of 262 students and 8 teachers (14 classrooms) in science or mathematics. The results indicate that teachers typically spend 1-2 hours finding and preparing for learning-object-based lesson plans that focus on the review of previous concepts. Both teachers and students are positive about the learning benefits, quality, and engagement value of learning objects, although teachers are more positive than students. Student performance increased significantly, over 40%, when learning objects were used in conjunction with a variety of teaching strategies. It is reasonable to conclude that learning objects have potential as a teaching tool in a middle school environment.

  5. Automated retroillumination photography analysis for objective assessment of Fuchs Corneal Dystrophy severity

    Science.gov (United States)

    Eghrari, Allen O.; Mumtaz, Aisha A.; Garrett, Brian; Rezaei, Mahsa; Akhavan, Mina S.; Riazuddin, S. Amer; Gottsch, John D.

    2016-01-01

    Purpose Retroillumination photography analysis (RPA) is an objective tool for assessing the number and distribution of guttae in eyes affected with Fuchs Corneal Dystrophy (FCD). Current protocols include manual processing of images; here we assess the validity and interrater reliability of automated analysis across various levels of FCD severity. Methods Retroillumination photographs of 97 FCD-affected corneas were acquired, and total counts of guttae had previously been summed manually. For each cornea, a single image was loaded into ImageJ software. We reduced color variability and subtracted background noise. Reflection of light from each gutta was identified as a local area of maximum intensity and counted automatically. The noise tolerance level was titrated for each cornea by examining a small region of each image with an automated overlay to ensure appropriate coverage of individual guttae. We tested interrater reliability of automated gutta counts across a spectrum of clinical and educational experience. Results A set of 97 retroillumination photographs was analyzed. Clinical severity as measured by a modified Krachmer scale ranged from severity level 1 to 5 in the analyzed corneas. Automated counts by an ophthalmologist correlated strongly with Krachmer grading (R2=0.79) and manual counts (R2=0.88). The intraclass correlation coefficient demonstrated strong agreement, at 0.924 (95% CI, 0.870-0.958) among cases analyzed by three students, and 0.869 (95% CI, 0.797-0.918) among cases for which images were analyzed by an ophthalmologist and two students. Conclusions Automated RPA allows grading of FCD severity with high resolution across a spectrum of disease severity. PMID:27811565
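The automated counting step (background subtraction, then local intensity maxima above a noise tolerance) can be approximated outside ImageJ. This sketch uses a synthetic image and SciPy filters; the image, blob positions, and tolerance value are illustrative assumptions, not the authors' pipeline or photographs.

```python
import numpy as np
from scipy.ndimage import maximum_filter, label

# Synthetic "retroillumination" image: noisy background with three bright
# spots standing in for guttae reflections (illustrative only).
rng = np.random.default_rng(0)
img = rng.normal(10, 1, (100, 100))
for r, c in [(20, 20), (50, 70), (80, 30)]:
    img[r - 1:r + 2, c - 1:c + 2] += 50   # bright 3x3 blobs

def count_bright_maxima(img, noise_tolerance=20):
    # A pixel counts if it equals the maximum of its neighbourhood and
    # exceeds the background (median) by the noise tolerance.
    background = np.median(img)
    peaks = (img == maximum_filter(img, size=5)) & (img > background + noise_tolerance)
    _, n = label(peaks)   # merge adjacent peak pixels into single counts
    return n

print(count_bright_maxima(img))  # counts the 3 synthetic blobs
```

Titrating `noise_tolerance` per image, as the abstract describes, amounts to adjusting the threshold until the overlay of detected peaks matches the visible guttae.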

  6. Simulation analysis of photometric data for attitude estimation of unresolved space objects

    Science.gov (United States)

    Du, Xiaoping; Gou, Ruixin; Liu, Hao; Hu, Heng; Wang, Yang

    2017-10-01

    Acquiring attitude information for unresolved space objects, such as micro-nano satellites and GEO objects, from ground-based optical observations is a challenge for space surveillance. In this paper, a method is proposed to estimate the space object (SO) attitude state through simulation analysis of photometric data in different attitude states. The object shape model was established and the parameters of the BRDF model were determined; the space object photometric model was then established. Furthermore, the photometric data of space objects in different states were analyzed by simulation and the regular characteristics of the photometric curves summarized. The simulation results show that the photometric characteristics are useful for attitude inversion. Thus, a new idea is provided for space object identification.

  7. High-resolution melting (HRM) re-analysis of a polyposis patients cohort reveals previously undetected heterozygous and mosaic APC gene mutations.

    Science.gov (United States)

    Out, Astrid A; van Minderhout, Ivonne J H M; van der Stoep, Nienke; van Bommel, Lysette S R; Kluijt, Irma; Aalfs, Cora; Voorendt, Marsha; Vossen, Rolf H A M; Nielsen, Maartje; Vasen, Hans F A; Morreau, Hans; Devilee, Peter; Tops, Carli M J; Hes, Frederik J

    2015-06-01

    Familial adenomatous polyposis is most frequently caused by pathogenic variants in either the APC gene or the MUTYH gene. The detection rate of pathogenic variants depends on the severity of the phenotype and sensitivity of the screening method, including sensitivity for mosaic variants. For 171 patients with multiple colorectal polyps without previously detectable pathogenic variant, APC was reanalyzed in leukocyte DNA by one uniform technique: high-resolution melting (HRM) analysis. Serial dilution of heterozygous DNA resulted in a lowest detectable allelic fraction of 6% for the majority of variants. HRM analysis and subsequent sequencing detected pathogenic fully heterozygous APC variants in 10 (6%) of the patients and pathogenic mosaic variants in 2 (1%). All these variants were previously missed by various conventional scanning methods. In parallel, HRM APC scanning was applied to DNA isolated from polyp tissue of two additional patients with apparently sporadic polyposis and without detectable pathogenic APC variant in leukocyte DNA. In both patients a pathogenic mosaic APC variant was present in multiple polyps. The detection of pathogenic APC variants in 7% of the patients, including mosaics, illustrates the usefulness of a complete APC gene reanalysis of previously tested patients, by a supplementary scanning method. HRM is a sensitive and fast pre-screening method for reliable detection of heterozygous and mosaic variants, which can be applied to leukocyte and polyp derived DNA.

  8. Extending Track Analysis from Animals in the Lab to Moving Objects Anywhere

    NARCIS (Netherlands)

    Dommelen, W. van; Laar, P.J.L.J. van de; Noldus, L.P.J.J.

    2013-01-01

    In this chapter we compare two application domains in which the tracking of objects and the analysis of their movements are core activities, viz. animal tracking and vessel tracking. More specifically, we investigate whether EthoVision XT, a research tool for video tracking and analysis of the

  9. An Objective Comparison of Applied Behavior Analysis and Organizational Behavior Management Research

    Science.gov (United States)

    Culig, Kathryn M.; Dickinson, Alyce M.; McGee, Heather M.; Austin, John

    2005-01-01

    This paper presents an objective review, analysis, and comparison of empirical studies targeting the behavior of adults published in Journal of Applied Behavior Analysis (JABA) and Journal of Organizational Behavior Management (JOBM) between 1997 and 2001. The purpose of the comparisons was to identify similarities and differences with respect to…

  10. The Analysis of Dimensionality Reduction Techniques in Cryptographic Object Code Classification

    Energy Technology Data Exchange (ETDEWEB)

    Jason L. Wright; Milos Manic

    2010-05-01

    This paper compares the application of three different dimension reduction techniques to the problem of locating cryptography in compiled object code. A simple classifier is used to compare dimension reduction via sorted covariance, principal component analysis, and correlation-based feature subset selection. The analysis concentrates on the classification accuracy as the number of dimensions is increased.
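The experiment's core loop, classification accuracy as a function of retained dimensions, can be sketched with generic tools. The data below are synthetic, and an SVD-based PCA plus a nearest-centroid classifier stand in for the paper's object-code features and simple classifier; all names and parameters here are assumptions.

```python
import numpy as np

# Synthetic two-class data in place of object-code feature vectors: the
# class signal lives in the first few feature directions.
rng = np.random.default_rng(1)
n, d = 200, 50
X = rng.normal(size=(n, d))
y = (rng.random(n) < 0.5).astype(int)
X[y == 1, :5] += 2.0

def pca_project(X, k):
    # Principal directions = top right-singular vectors of the centred data.
    Xc = X - X.mean(axis=0)
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ vt[:k].T

def centroid_accuracy(Z, y):
    # Nearest-centroid classifier, evaluated on the training set (toy setup).
    c0, c1 = Z[y == 0].mean(axis=0), Z[y == 1].mean(axis=0)
    pred = (np.linalg.norm(Z - c1, axis=1) < np.linalg.norm(Z - c0, axis=1)).astype(int)
    return (pred == y).mean()

# Accuracy versus number of retained principal components.
for k in (1, 2, 5, 10, 20):
    print(k, round(centroid_accuracy(pca_project(X, k), y), 3))
```

Accuracy typically saturates once the retained dimensions span the discriminative subspace, which is the kind of curve the paper's comparison examines.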

  11. SEGMENT OF FINANCIAL CORPORATIONS AS AN OBJECT OF FINANCIAL AND STATISTICAL ANALYSIS

    OpenAIRE

    Marat F. Mazitov

    2013-01-01

    The article is devoted to the study of specific features of the formation and change of economic assets of financial corporations as an object of management and financial analysis. The author identifies the features and gives a classification of institutional units belonging to the sector of financial corporations from the viewpoint of assessment and financial analysis of the flows reflecting changes in their assets.

  12. Determination of the elemental composition of copper and bronze objects by neutron activation analysis

    International Nuclear Information System (INIS)

    Hoelttae, P.; Rosenberg, R.J.

    1987-01-01

    A method for the elemental analysis of copper and bronze objects is described. Na, Co, Ni, Cu, Zn, As, Ag, Sn, Sb, W, Ir and Au are determined through instrumental neutron activation analysis. Mg, Al, V, Ti and Mn are determined after chemical separation using anionic exchange. The detection limits for a number of other elements are also given. Results for NBS standard reference materials are presented and the results are compared with the recommended values. The agreement is good. The results of the analysis of five ancient bronze and two copper objects are also presented. (author) 3 refs.; 4 tabs

  13. Determination of the elemental composition of copper and bronze objects by neutron activation analysis

    International Nuclear Information System (INIS)

    Hoelttae, P.; Rosenberg, R.J.

    1986-01-01

    A method for the elemental analysis of copper and bronze objects is described. Na, Co, Ni, Cu, Zn, As, Ag, Sn, Sb, W, Ir and Au are determined through instrumental neutron activation analysis. Mg, Al, V, Ti and Mn are determined after chemical separation using anionic exchange. The detection limits for a number of other elements are also given. Results for NBS standard reference materials are presented and the results compared with the recommended values. The agreement is good. The results of the analysis of five ancient bronze and two copper objects are presented. (author)

  14. Determining characteristics of artificial near-Earth objects using observability analysis

    Science.gov (United States)

    Friedman, Alex M.; Frueh, Carolin

    2018-03-01

    Observability analysis is a method for determining whether a chosen state of a system can be determined from the output or measurements. Knowledge of state information availability resulting from observability analysis leads to improved sensor tasking for observation of orbital debris and better control of active spacecraft. This research performs numerical observability analysis of artificial near-Earth objects. Analysis of linearization methods and state transition matrices is performed to determine the viability of applying linear observability methods to the nonlinear orbit problem. Furthermore, pre-whitening is implemented to reformulate classical observability analysis. In addition, the state in observability analysis is typically composed of position and velocity; however, including object characteristics beyond position and velocity can be crucial for precise orbit propagation. For example, solar radiation pressure has a significant impact on the orbit of high area-to-mass ratio objects in geosynchronous orbit. Therefore, determining the time required for solar radiation pressure parameters to become observable is important for understanding debris objects. In order to compare observability analysis results with and without measurement noise and an extended state, quantitative measures of observability are investigated and implemented.
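For a linear (or linearized) system, the basic test the abstract builds on is the rank of the stacked observability matrix. The sketch below applies it to a toy double-integrator with position-only measurements, a deliberately simplified stand-in for linearized orbit dynamics; the matrices are assumptions, not the paper's model.

```python
import numpy as np

def observability_matrix(A, C):
    """Stack C, CA, CA^2, ..., CA^(n-1); full column rank n => observable."""
    n = A.shape[0]
    blocks, M = [], C.copy()
    for _ in range(n):
        blocks.append(M)
        M = M @ A
    return np.vstack(blocks)

# Toy discrete-time position-velocity dynamics (dt = 1) with only position
# measured -- analogous to recovering unmeasured states from a tracking history.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
C = np.array([[1.0, 0.0]])   # measure position only

O = observability_matrix(A, C)
print(np.linalg.matrix_rank(O))  # rank 2: velocity is recoverable from position history
```

Extending the state with extra parameters (e.g., a solar radiation pressure coefficient) enlarges `A` and typically requires more measurement epochs before the matrix reaches full rank, which is the question of observability onset the abstract raises.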

  15. A functional analysis of photo-object matching skills of severely retarded adolescents.

    Science.gov (United States)

    Dixon, L S

    1981-01-01

    Matching-to-sample procedures were used to assess picture representation skills of severely retarded, nonverbal adolescents. Identity matching within the classes of objects and life-size, full-color photos of the objects was first used to assess visual discrimination, a necessary condition for picture representation. Picture representation was then assessed through photo-object matching tasks. Five students demonstrated visual discrimination (identity matching) within the two classes of photos and the objects. Only one student demonstrated photo-object matching. The results of the four students who failed to demonstrate photo-object matching suggested that physical properties of photos (flat, rectangular) and depth dimensions of objects may exert more control over matching than the similarities of the objects and images within the photos. An analysis of figure-ground variables was conducted to provide an empirical basis for program development in the use of pictures. In one series of tests, rectangular shape and background were removed by cutting out the figures in the photos. The edge shape of the photo and the edge shape of the image were then identical. The results suggest that photo-object matching may be facilitated by using cut-out figures rather than the complete rectangular photo.

  16. Installing and Executing Information Object Analysis, Intent, Dissemination, and Enhancement (IOAIDE) and Its Dependencies

    Science.gov (United States)

    2017-02-01

    Information Object Analysis, Intent, Dissemination, and Enhancement (IOAIDE) is a novel information framework developed for rapid prototyping. It supports dynamic plugin of analysis modules, for either research or analysis tasks. The framework integrates multiple image processing components. The report covers system requirements and installing the software for IOAIDE: loading ARL software, loading ARL applications, loading the DSPro software, and updating Java.

  17. The Making of Paranormal Belief: History, Discourse Analysis and the Object of Belief

    OpenAIRE

    White, Lewis

    2013-01-01

    The present study comprises a discursive analysis of a cognitive phenomenon, paranormal beliefs. A discursive psychological approach to belief highlights that an important component of the cognitivist work has been how the object of paranormal belief has been defined in formal study. Using discourse analysis, as developed as a method in the history of psychology, this problem is explored through analysis of published scales. The findings highlight three rhetorical themes that are deployed in ...

  18. Using Epistemic Network Analysis to understand core topics as planned learning objectives

    DEFF Research Database (Denmark)

    Allsopp, Benjamin Brink; Dreyøe, Jonas; Misfeldt, Morten

    Epistemic Network Analysis is a tool developed by the epistemic games group at the University of Wisconsin-Madison for tracking the relations between concepts in students' discourse (Shaffer 2017). In our current work we are applying this tool to learning objectives in teachers' digital preparation. The Danish mathematics curriculum is organised in six competencies and three topics. In the recently implemented learning platforms, teachers choose which of the mathematical competencies serve as objectives for a specific lesson or teaching sequence. Hence learning objectives for lessons and teaching sequences define a network of competencies, where two competencies are closely related if they are often part of the same learning objective or teaching sequence. We are currently using Epistemic Network Analysis to study these networks. In the poster we will include examples of different networks...

  19. An Analysis on Usage Preferences of Learning Objects and Learning Object Repositories among Pre-Service Teachers

    Science.gov (United States)

    Yeni, Sabiha; Ozdener, Nesrin

    2014-01-01

    The purpose of the study is to investigate how pre-service teachers benefit from learning objects repositories while preparing course content. Qualitative and quantitative data collection methods were used in a mixed methods approach. This study was carried out with 74 teachers from the Faculty of Education. In the first phase of the study,…

  20. GPR Detection of Buried Symmetrically Shaped Mine-like Objects using Selective Independent Component Analysis

    DEFF Research Database (Denmark)

    Karlsen, Brian; Sørensen, Helge Bjarup Dissing; Larsen, Jan

    2003-01-01

    This paper addresses the detection of mine-like objects in stepped-frequency ground penetrating radar (SF-GPR) data as a function of object size, object content, and burial depth. The detection approach is based on Selective Independent Component Analysis (SICA). SICA provides an automatic ranking of components, which enables the suppression of clutter, hence extraction of components carrying mine information. The goal of the investigation is to evaluate various time and frequency domain ICA approaches based on SICA. Performance comparison is based on a series of mine-like objects ranging from small-scale anti-personnel (AP) mines to large-scale anti-tank (AT) mines. Large-scale SF-GPR measurements on this series of mine-like objects buried in soil were performed. The SF-GPR data was acquired using a wideband monostatic bow-tie antenna operating in the frequency range 750...
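The idea of ranking independent components and keeping only target-related ones can be illustrated generically. The sketch below separates a synthetic sparse "reflection" from a smooth "clutter" source with scikit-learn's FastICA and ranks components by kurtosis; the kurtosis criterion and all data here are assumptions for illustration, not necessarily the SICA criterion or data used in the paper.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Two toy sources: smooth background "clutter" and a sparse target pulse
# (illustrative stand-ins for GPR clutter and a mine reflection).
rng = np.random.default_rng(3)
t = np.linspace(0, 1, 1000)
clutter = np.sin(2 * np.pi * 5 * t)
pulse = np.exp(-((t - 0.5) ** 2) / 2e-4)
S_true = np.c_[clutter, pulse]

# Observed traces are random mixtures of the two sources plus noise.
Amix = rng.normal(size=(2, 2)) + 2 * np.eye(2)
X = S_true @ Amix.T + 0.01 * rng.normal(size=(1000, 2))

S_est = FastICA(n_components=2, random_state=0).fit_transform(X)

# Selective ranking: the target component is spiky, so it has the highest
# kurtosis; keeping it (and dropping the rest) suppresses the clutter.
kurt = ((S_est - S_est.mean(0)) ** 4).mean(0) / (S_est.var(0) ** 2)
target = S_est[:, int(np.argmax(kurt))]
print(round(abs(np.corrcoef(target, pulse)[0, 1]), 3))  # correlation with the true pulse
```

The recovered component is defined only up to sign and scale, which is why the correlation is taken in absolute value.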

  1. 3D object-oriented image analysis in 3D geophysical modelling

    DEFF Research Database (Denmark)

    Fadel, I.; van der Meijde, M.; Kerle, N.

    2015-01-01

    Non-uniqueness of satellite gravity interpretation has traditionally been reduced by using a priori information from seismic tomography models. This reduction in the non-uniqueness has been based on velocity-density conversion formulas or user interpretation of the 3D subsurface structures (objects) based on the seismic tomography models and then forward modelling these objects. However, this form of object-based approach has been done without a standardized methodology on how to extract the subsurface structures from the 3D models. In this research, a 3D object-oriented image analysis (3D OOA) approach was implemented to extract the 3D subsurface structures from geophysical data. The approach was applied on a 3D shear wave seismic tomography model of the central part of the East African Rift System. Subsequently, the extracted 3D objects from the tomography model were reconstructed in the 3D...

  2. Feasibility analysis of CNP 1000 computerized I and C system design objectives

    International Nuclear Information System (INIS)

    Zhang Mingguang; Xu Jijun; Zhang Qinshen

    2000-01-01

    The author states the design objectives of the computerized I and C (CIC) system and advanced main control room (AMCR) that could and should be achieved in CNP 1000, based on national 1E computer production technology, including software and hardware, and current instrumentation and control design techniques for nuclear power plants. A feasibility analysis of the design objectives, and the reasons for and necessity of the design research projects, are described. The objectives of design research on CIC and AMCR, as well as the self-design proficiency expected after the design research, are given.

  3. Aerodynamic multi-objective integrated optimization based on principal component analysis

    Directory of Open Access Journals (Sweden)

    Jiangtao HUANG

    2017-08-01

    Full Text Available Based on an improved multi-objective particle swarm optimization (MOPSO) algorithm with principal component analysis (PCA) methodology, an efficient high-dimension multi-objective optimization method is proposed, which aims to improve the convergence of the Pareto front in multi-objective optimization design. The mathematical efficiency, the physical reasonableness, and the reliability in dealing with redundant objectives of PCA are verified by the typical DTLZ5 test function and multi-objective correlation analysis of a supercritical airfoil, and the proposed method is integrated into an aircraft multi-disciplinary design (AMDEsign) platform, which contains aerodynamics, stealth, and structure weight analysis and optimization modules. The proposed method is then used for the multi-point integrated aerodynamic optimization of a wide-body passenger aircraft, in which the redundant objectives identified by PCA are transformed into optimization constraints, and several design methods are compared. The design results illustrate that the strategy used in this paper is sufficient and the multi-point design requirements of the passenger aircraft are reached. The visualization level of the non-dominant Pareto set is improved by effectively reducing the dimension without losing the primary features of the problem.
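The PCA step for identifying redundant objectives can be illustrated on a toy objective matrix: a near-zero principal-component variance signals that one objective is (almost) a linear combination of others and can be treated as a constraint. The data below are synthetic and the procedure is a generic sketch, not the paper's MOPSO integration.

```python
import numpy as np

# Toy sampled objective matrix: three objectives where f2 is nearly a
# linear function of f0 (a redundant objective); f1 is independent.
rng = np.random.default_rng(4)
f0 = rng.random(200)
f1 = rng.random(200)
f2 = 2 * f0 + 0.01 * rng.normal(size=200)
F = np.c_[f0, f1, f2]

# PCA on standardized objectives: a near-zero eigenvalue reveals that the
# objective set has lower effective dimension than its nominal count.
Z = (F - F.mean(0)) / F.std(0)
eigvals = np.linalg.eigvalsh(np.cov(Z.T))[::-1]   # descending
explained = eigvals / eigvals.sum()
print(np.round(explained, 3))  # last component carries almost no variance
```

In the optimization setting, the objectives loading on the discarded low-variance components are the candidates to demote from objectives to constraints.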

  4. Chemical-genetic profile analysis in yeast suggests that a previously uncharacterized open reading frame, YBR261C, affects protein synthesis

    Directory of Open Access Journals (Sweden)

    Eroukova Veronika

    2008-12-01

    Full Text Available Abstract Background Functional genomics has received considerable attention in the post-genomic era, as it aims to identify function(s) for different genes. One way to study gene function is to investigate the alterations in the responses of deletion mutants to different stimuli. Here we investigate the genetic profile of the yeast non-essential gene deletion array (yGDA, ~4700 strains) for increased sensitivity to paromomycin, which targets the process of protein synthesis. Results As expected, our analysis indicated that the majority of deletion strains (134) with increased sensitivity to paromomycin are involved in protein biosynthesis. The remaining strains can be divided into smaller functional categories: metabolism (45), cellular component biogenesis and organization (28), DNA maintenance (21), transport (20), others (38) and unknown (39). These may represent minor cellular target sites (side-effects) for paromomycin. They may also represent novel links to protein synthesis. One of these strains carries a deletion for a previously uncharacterized ORF, YBR261C, that we term TAE1 for Translation Associated Element 1. Our focused follow-up experiments indicated that deletion of TAE1 alters the ribosomal profile of the mutant cells. Also, the gene deletion strain for TAE1 has defects in both translation efficiency and fidelity. Miniaturized synthetic genetic array analysis further indicates that TAE1 genetically interacts with 16 ribosomal protein genes. Phenotypic suppression analysis using TAE1 overexpression also links TAE1 to protein synthesis. Conclusion We show that a previously uncharacterized ORF, YBR261C, affects the process of protein synthesis and reaffirm that large-scale genetic profile analysis can be a useful tool to study novel gene function(s).

  5. Chemical-genetic profile analysis in yeast suggests that a previously uncharacterized open reading frame, YBR261C, affects protein synthesis.

    Science.gov (United States)

    Alamgir, Md; Eroukova, Veronika; Jessulat, Matthew; Xu, Jianhua; Golshani, Ashkan

    2008-12-03

    Functional genomics has received considerable attention in the post-genomic era, as it aims to identify function(s) for different genes. One way to study gene function is to investigate the alterations in the responses of deletion mutants to different stimuli. Here we investigate the genetic profile of yeast non-essential gene deletion array (yGDA, approximately 4700 strains) for increased sensitivity to paromomycin, which targets the process of protein synthesis. As expected, our analysis indicated that the majority of deletion strains (134) with increased sensitivity to paromomycin, are involved in protein biosynthesis. The remaining strains can be divided into smaller functional categories: metabolism (45), cellular component biogenesis and organization (28), DNA maintenance (21), transport (20), others (38) and unknown (39). These may represent minor cellular target sites (side-effects) for paromomycin. They may also represent novel links to protein synthesis. One of these strains carries a deletion for a previously uncharacterized ORF, YBR261C, that we term TAE1 for Translation Associated Element 1. Our focused follow-up experiments indicated that deletion of TAE1 alters the ribosomal profile of the mutant cells. Also, gene deletion strain for TAE1 has defects in both translation efficiency and fidelity. Miniaturized synthetic genetic array analysis further indicates that TAE1 genetically interacts with 16 ribosomal protein genes. Phenotypic suppression analysis using TAE1 overexpression also links TAE1 to protein synthesis. We show that a previously uncharacterized ORF, YBR261C, affects the process of protein synthesis and reaffirm that large-scale genetic profile analysis can be a useful tool to study novel gene function(s).

  6. Error analysis of motion correction method for laser scanning of moving objects

    Science.gov (United States)

    Goel, S.; Lohani, B.

    2014-05-01

    The limitation of conventional laser scanning methods is that the objects being scanned must be static. The need to scan moving objects has resulted in the development of new methods capable of generating correct 3D geometry of moving objects. Limited literature is available showing the development of the few methods capable of catering to the problem of object motion during scanning. All the existing methods utilize their own models or sensors. Studies on error modelling or analysis of any of the motion correction methods are lacking in the literature. In this paper, we develop the error budget and present the analysis of one such 'motion correction' method. This method assumes availability of position and orientation information of the moving object, which in general can be obtained by installing a POS system on board or by use of tracking devices. It then uses this information along with the laser scanner data to apply corrections to the laser data, resulting in correct geometry despite the object being mobile during scanning. The major applications of this method lie in the shipping industry, to scan ships either moving or parked in the sea, and to scan other objects such as hot air balloons or aerostats. It is to be noted that the other 'motion correction' methods explained in the literature cannot be applied to scan the objects mentioned here, making the chosen method quite unique. This paper presents some interesting insights into the functioning of the 'motion correction' method as well as a detailed account of the behavior and variation of the error due to different sensor components, alone and in combination with each other. The analysis can be used to obtain insights into optimal utilization of the available components for achieving the best results.
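The correction idea, mapping each scanned point back into the object's body frame using the pose at its scan instant, can be sketched as a rigid-body transform. The poses, points, and function interface below are toy assumptions for illustration, not taken from the paper.

```python
import numpy as np

def yaw_matrix(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def motion_correct(points, poses):
    """Map each scanned world-frame point into the object's body frame.

    points : (n, 3) points in the world frame, one per scan instant
    poses  : list of (R, t) -- object rotation/translation at that instant
    A world point p relates to a body point b by p = R @ b + t,
    so b = R.T @ (p - t).
    """
    return np.array([R.T @ (p - t) for p, (R, t) in zip(points, poses)])

# A rigid object translating and yawing while three surface points are
# scanned at successive instants (poses assumed known, e.g. from a POS unit).
body = np.eye(3)   # true body-frame geometry of the scanned points
poses = [(yaw_matrix(0.1 * k), np.array([0.5 * k, 0.0, 0.0])) for k in range(3)]
world = np.array([R @ b + t for b, (R, t) in zip(body, poses)])

print(np.allclose(motion_correct(world, poses), body))  # True: geometry recovered
```

Errors in the pose stream (the POS position and orientation) propagate through `R.T @ (p - t)` into the corrected geometry, which is exactly the sensitivity the paper's error budget quantifies.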

  7. Topological situational analysis and synthesis of strategies of object management in the conditions of conflict, uncertainty of behaviour and varible amount of the observed objects

    Directory of Open Access Journals (Sweden)

    Віктор Володимирович Семко

    2016-09-01

    Full Text Available The conflict of interacting objects in the observation space is considered as an integral phenomenon, with a certain variety of types of connections between its elements, objects, systems, and environment, brought into a single theoretical conception that comprehensively and deeply determines the real features of the object of research. The methodology of system-structural analysis is used to study the conflict as a whole phenomenon, and system-functional analysis to determine all of its basic interconnections with the environment.

  8. Visual Field Preferences of Object Analysis for Grasping with One Hand

    Directory of Open Access Journals (Sweden)

    Ada eLe

    2014-10-01

    Full Text Available When we grasp an object using one hand, the opposite hemisphere predominantly guides the motor control of grasp movements (Davare et al. 2007; Rice et al. 2007). However, it is unclear whether visual object analysis for grasp control relies more on inputs (a) from the contralateral than the ipsilateral visual field, (b) from one dominant visual field regardless of the grasping hand, or (c) from both visual fields equally. For bimanual grasping of a single object we have recently demonstrated a visual field preference for the left visual field (Le and Niemeier 2013a, 2013b), consistent with a general right-hemisphere dominance for sensorimotor control of bimanual grasps (Le et al., 2013). But visual field differences have never been tested for unimanual grasping. Therefore, here we asked right-handed participants to fixate to the left or right of an object and then grasp the object with either their right or left hand using a precision grip. We found that participants grasping with their right hand performed better with objects in the right visual field: maximum grip apertures (MGAs) were more closely matched to the object width and were smaller than for objects in the left visual field. In contrast, when people grasped with their left hand, preferences switched to the left visual field. What is more, MGA scaling showed greater visual field differences compared to right-hand grasping. Our data suggest that visual object analysis for unimanual grasping shows a preference for visual information from the ipsilateral visual field, and that the left hemisphere is better equipped to control grasps in both visual fields.

  9. Laparoscopy After Previous Laparotomy

    Directory of Open Access Journals (Sweden)

    Zulfo Godinjak

    2006-11-01

    Full Text Available Following abdominal surgery, extensive adhesions often occur and can cause difficulties during laparoscopic operations. However, previous laparotomy is not considered to be a contraindication for laparoscopy. The aim of this study is to show that insertion of a Veres needle in the region of the umbilicus is a safe method for creating a pneumoperitoneum for laparoscopic operations after previous laparotomy. In the last three years, we have performed 144 laparoscopic operations in patients who previously underwent one or two laparotomies. Pathology of the digestive system, genital organs, Cesarean section, or abdominal war injuries were the most common causes of previous laparotomy. During those operations, and while entering the abdominal cavity, we did not experience any complications, while in 7 patients we performed conversion to laparotomy following diagnostic laparoscopy. In all patients, insertion of the Veres needle and trocar in the umbilical region was performed, namely the technique of closed laparoscopy. In no patient were adhesions found in the region of the umbilicus, and no abdominal organs were injured.

  10. Ultrasound elastography of the lower uterine segment in women with a previous cesarean section: Comparison of in-/ex-vivo elastography versus tensile-stress-strain-rupture analysis.

    Science.gov (United States)

    Seliger, Gregor; Chaoui, Katharina; Lautenschläger, Christine; Jenderka, Klaus-Vitold; Kunze, Christian; Hiller, Grit Gesine Ruth; Tchirikov, Michael

    2018-06-01

    The purpose of this study was to assess whether the biomechanical properties of the lower uterine segment (LUS) in women with a previous cesarean section (CS) can be determined by ultrasound (US) elastography. The first aim was to establish an ex-vivo LUS tensile-stress-strain-rupture (break point) analysis with the possibility of simultaneously using US elastography. The second aim was to investigate the relationship between measurements of LUS stiffness using US elastography in-/ex-vivo and the results of tensile-stress-strain-rupture analysis, and to compare different US elastography LUS stiffness measurement methods ex-vivo. An explorative experimental, in-/ex-vivo US study of women with previous CS was conducted. LUS elasticity was measured by point shear wave elastography (pSWE) and bidimensional shear wave elastography (2D-SWE), first in-vivo during the preoperative examination within 24 h before repeat CS (including resection of the thinnest part of the LUS, the uterine scar area, during CS), and second within 1 h after the operation during the ex-vivo experiment, followed by tensile-stress-strain-rupture analysis. Pearson's correlation coefficient and scatter plots, Bland-Altman plots, and paired t-tests were used. Thirty-three women were included in the study (n = 1412 elastography measurements). The feasibility of ex-vivo assessment of the LUS by quantitative US elastography using pSWE and 2D-SWE to detect LUS stiffness was demonstrated. The strongest correlation with tensile-stress-strain analysis was found for the US elastography examination carried out with 2D-SWE (0.78, p …). The break point, as a surrogate marker for the risk of rupture of the LUS after CS, is linearly dependent on the thickness of the LUS in the scar area (coefficient of correlation: 0.79, p …), even at less stroke/strain than would be expected from their thickness. This study confirms that US elastography can help in determining viscoelastic properties of the LUS in women with a previous CS.

  11. Molecular analysis of clinical isolates previously diagnosed as Mycobacterium intracellulare reveals incidental findings of "Mycobacterium indicus pranii" genotypes in human lung infection.

    Science.gov (United States)

    Kim, Su-Young; Park, Hye Yun; Jeong, Byeong-Ho; Jeon, Kyeongman; Huh, Hee Jae; Ki, Chang-Seok; Lee, Nam Yong; Han, Seung-Jung; Shin, Sung Jae; Koh, Won-Jung

    2015-09-30

    Mycobacterium intracellulare is a major cause of Mycobacterium avium complex lung disease in many countries. Molecular studies have revealed several new mycobacterial species that are closely related to M. intracellulare. The aim of this study was to re-identify and characterize, at the molecular level, clinical isolates from patients previously diagnosed with M. intracellulare lung disease. Mycobacterial isolates from 77 patients initially diagnosed with M. intracellulare lung disease were re-analyzed by multi-locus sequencing and insertion-sequence patterns. Among the 77 isolates, 74 (96%) were designated as M. intracellulare based on multigene sequence-based analysis. Interestingly, the three remaining strains (4%) were re-identified as "Mycobacterium indicus pranii" according to distinct molecular phylogenetic positions in rpoB and hsp65 sequence-based typing. In hsp65 sequevar analysis, code 13 was found in the majority of cases and three unreported codes were identified. In 16S-23S rRNA internal transcribed spacer (ITS) sequevar analysis, all isolates of both species were classified within the Min-A ITS sequevar. Interestingly, four of the M. intracellulare isolates harbored IS1311, an M. avium-specific element. Two of the three patients infected with "M. indicus pranii" had persistently positive sputum cultures after antibiotic therapy, indicating the clinical relevance of this study. This analysis highlights the importance of precise identification of clinical isolates genetically close to Mycobacterium species, and suggests that greater attention should be paid to nontuberculous mycobacterial lung disease caused by "M. indicus pranii".

  12. Previous ISD Program Review.

    Science.gov (United States)

    1981-03-01

    report. The detail required for such a review would be unwieldy and would consume inordinate amounts of time. The result of the document review will...attempts have been made at writing specific behavioral objectives (SBOs). These, however, have proven to be inadequate in that they are not stated in... behavioral terms (e.g., "will understand," "will have a knowledge of," etc.). C. Development of CRO/CRTs? In nearly all cases, ISD teams are just...

  13. Exergoeconomic multi objective optimization and sensitivity analysis of a regenerative Brayton cycle

    International Nuclear Information System (INIS)

    Naserian, Mohammad Mahdi; Farahat, Said; Sarhaddi, Faramarz

    2016-01-01

    Highlights: • Finite time exergoeconomic multi objective optimization of a Brayton cycle. • Comparing the exergoeconomic and the ecological function optimization results. • Inserting the cost of fluid streams concept into finite-time thermodynamics. • Exergoeconomic sensitivity analysis of a regenerative Brayton cycle. • Suggesting the cycle performance curve drawing and utilization. - Abstract: In this study, the optimal performance of a regenerative Brayton cycle is sought through power maximization and then exergoeconomic optimization, using finite-time thermodynamic concepts and finite-size components. Optimizations are performed using a genetic algorithm. In order to account for the finite-time and finite-size concepts in the current problem, a dimensionless mass-flow parameter deploying time variations is used. The decision variables at the optimum of the multi-objective exergoeconomic optimization are compared to those at the maximum power state. The multi-objective exergoeconomic optimization results in better performance than the maximum power state: the system at the multi-objective optimum yields 71% of the maximum power, but with exergy destruction equal to only 24% of that produced at the maximum power state and a total cost rate 67% lower. In order to assess the impact of variations of the decision variables on the objective functions, a sensitivity analysis is conducted. Finally, drawing the cycle performance curve according to the exergoeconomic multi-objective optimization results, and its utilization, are suggested.
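    The selection step behind such a multi-objective optimization can be illustrated with a Pareto-dominance filter. This is a minimal sketch, not code from the study: the (power, cost-rate) candidate tuples are hypothetical, and a real implementation would sit inside a genetic algorithm loop.

    ```python
    # Sketch: extracting the Pareto-optimal set from candidate Brayton-cycle
    # designs evaluated on two objectives: maximize power, minimize total
    # cost rate. The design tuples below are illustrative placeholders.

    def dominates(a, b):
        """True if design a dominates b: power no worse, cost no worse,
        and strictly better in at least one objective."""
        power_a, cost_a = a
        power_b, cost_b = b
        return (power_a >= power_b and cost_a <= cost_b
                and (power_a > power_b or cost_a < cost_b))

    def pareto_front(designs):
        """Keep only designs not dominated by any other design."""
        return [d for d in designs
                if not any(dominates(e, d) for e in designs if e is not d)]

    # (power, cost-rate) pairs -- hypothetical candidates
    candidates = [(100.0, 10.0), (71.0, 3.3), (90.0, 8.0), (60.0, 5.0)]
    front = pareto_front(candidates)
    ```

    The surviving set is the Pareto front; the exergoeconomic optimum reported in the abstract (71% of maximum power at a 67% lower cost rate) is one point selected from such a front.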

  14. Worst-case execution time analysis-driven object cache design

    DEFF Research Database (Denmark)

    Huber, Benedikt; Puffitsch, Wolfgang; Schoeberl, Martin

    2012-01-01

    Hard real‐time systems need a time‐predictable computing platform to enable static worst‐case execution time (WCET) analysis. All performance‐enhancing features need to be WCET analyzable. However, standard data caches containing heap‐allocated data are very hard to analyze statically. In this paper we explore a new object cache design, which is driven by the capabilities of static WCET analysis. Simulations of standard benchmarks estimating the expected average case performance usually drive computer architecture design. The design decisions derived from this methodology do not necessarily result in a WCET analysis‐friendly design. Aiming for a time‐predictable design, we therefore propose to employ WCET analysis techniques for the design space exploration of processor architectures. We evaluated different object cache configurations using static analysis techniques. The number of field...

  15. Automated quantification and sizing of unbranched filamentous cyanobacteria by model based object oriented image analysis

    OpenAIRE

    Zeder, M; Van den Wyngaert, S; Köster, O; Felder, K M; Pernthaler, J

    2010-01-01

    Quantification and sizing of filamentous cyanobacteria in environmental samples or cultures are time-consuming and are often performed by manual or semiautomated microscopic analysis. Automation of conventional image analysis is difficult because filaments may exhibit great variations in length and patchy autofluorescence. Moreover, individual filaments frequently cross each other in microscopic preparations, as deduced by modeling. This paper describes a novel approach based on model-based, object-oriented image analysis.

  16. ART OF METALLOGRAPHY: POSSIBILITIES OF DARK-FIELD MICROSCOPY APPLICATION FOR COLORED OBJECTS STRUCTURE ANALYSIS

    Directory of Open Access Journals (Sweden)

    A. G. Anisovich

    2015-01-01

    Full Text Available The application of dark-field microscopy to the study of colored objects in materials technology was investigated. The capability of analyzing corrosion damage and determining the thickness of metal coatings was demonstrated. The capability of analyzing «reflection» in the dark field was also tested in the study of non-metallic materials, namely orthopedic implants and fireclay refractory. An example of defect detection in a carbon coating is presented.

  17. Featureous: infrastructure for feature-centric analysis of object-oriented software

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2010-01-01

    The decentralized nature of collaborations between objects in object-oriented software makes it difficult to understand how user-observable program features are implemented and how their implementations relate to each other. It is worthwhile to improve this situation, since feature-centric program understanding and modification are essential during software evolution and maintenance. In this paper, we present an infrastructure built on top of the NetBeans IDE called Featureous that allows for rapid construction of tools for feature-centric analysis of object-oriented software. Our infrastructure encompasses a lightweight feature location mechanism, a number of analytical views and an API allowing for addition of third-party extensions. To form a common conceptual framework for future feature-centric extensions, we propose to structure feature-centric analysis along three dimensions: perspective...

  18. Context-based object-of-interest detection for a generic traffic surveillance analysis system

    NARCIS (Netherlands)

    Bao, X.; Javanbakhti, S.; Zinger, S.; Wijnhoven, R.G.J.; With, de P.H.N.

    2014-01-01

    We present a new traffic surveillance video analysis system, focusing on building a framework with robust and generic techniques, based on both scene understanding and moving object-of-interest detection. Since traffic surveillance is widely applied, we want to design a single system that can be...

  19. Analysis of Various Multi-Objective Optimization Evolutionary Algorithms for Monte Carlo Treatment Planning System

    CERN Document Server

    Tydrichova, Magdalena

    2017-01-01

    In this project, various available multi-objective optimization evolutionary algorithms were compared with respect to their performance and the distribution of their solutions. The main goal was to select the most suitable algorithms for applications in cancer hadron therapy planning. For our purposes, complex testing and analysis software was developed, and many conclusions and hypotheses have been drawn for further research.

  20. Current evidence on hospital antimicrobial stewardship objectives: a systematic review and meta-analysis

    NARCIS (Netherlands)

    Schuts, Emelie C.; Hulscher, Marlies E. J. L.; Mouton, Johan W.; Verduin, Cees M.; Stuart, James W. T. Cohen; Overdiek, Hans W. P. M.; van der Linden, Paul D.; Natsch, Stephanie; Hertogh, Cees M. P. M.; Wolfs, Tom F. W.; Schouten, Jeroen A.; Kullberg, Bart Jan; Prins, Jan M.

    2016-01-01

    Antimicrobial stewardship is advocated to improve the quality of antimicrobial use. We did a systematic review and meta-analysis to assess whether antimicrobial stewardship objectives had any effects in hospitals and long-term care facilities on four predefined patients' outcomes: clinical outcomes,...

  3. OBJECTIVE EVALUATION OF HYPERACTIVATED MOTILITY IN RAT SPERMATOZA USING COMPUTER-ASSISTED SPERM ANALYSIS (CASA)

    Science.gov (United States)

    Objective evaluation of hyperactivated motility in rat spermatozoa using computer-assisted sperm analysis. Cancel AM, Lobdell D, Mendola P, Perreault SD. Toxicology Program, University of North Carolina, Chapel Hill, NC 27599, USA. The aim of this study was t...

  4. Analysis of micro computed tomography images; a look inside historic enamelled metal objects

    Science.gov (United States)

    van der Linden, Veerle; van de Casteele, Elke; Thomas, Mienke Simon; de Vos, Annemie; Janssen, Elsje; Janssens, Koen

    2010-02-01

    In this study the usefulness of micro-computed tomography (µ-CT) for the in-depth analysis of enamelled metal objects was tested. Usually, investigations of enamelled metal artefacts are restricted to non-destructive surface analysis or to analysis of cross sections after destructive sampling. Radiography, a commonly used technique in the field of cultural heritage studies, is limited to providing two-dimensional information about a three-dimensional object (Lang and Middleton, Radiography of Cultural Material, pp. 60-61, Elsevier-Butterworth-Heinemann, Amsterdam-Stoneham-London, 2005). Obtaining virtual slices and information about the internal structure of these objects was made possible by CT analysis. With this technique the underlying metalwork was studied without removing the decorative enamel layer. Moreover, visible defects such as cracks were measured in both width and depth, and as-yet-invisible defects and weaker areas were visualised. All these features are of great interest to restorers and conservators, as they allow a view inside these objects without so much as touching them.

  5. Analysis of current research addressing complementary use of life-cycle assessment and risk assessment for engineered nanomaterials: have lessons been learned from previous experience with chemicals?

    International Nuclear Information System (INIS)

    Grieger, Khara D.; Laurent, Alexis; Miseljic, Mirko; Christensen, Frans; Baun, Anders; Olsen, Stig I.

    2012-01-01

    While it is generally agreed that successful strategies to address the health and environmental impacts of engineered nanomaterials (NM) should consider the well-established frameworks for conducting life-cycle assessment (LCA) and risk assessment (RA), scientific research and specific guidance on how to apply these methods in practice are still very much under development. This paper evaluates how research efforts have applied LCA and RA together for NM, particularly reflecting on previous experiences with applying these methods to chemicals. Through a literature review and a separate analysis of research focused on applying LCA and RA together for NM, it appears that current research efforts have taken into account some key "lessons learned" from previous experience with chemicals, while many key challenges remain for practically applying these methods to NM. We identified two main approaches for using these methods together for NM: "LC-based RA" (traditional RA applied in a life-cycle perspective) and "RA-complemented LCA" (conventional LCA supplemented by RA in specific life-cycle steps). The latter is the only identified approach to date that genuinely combines LC- and RA-based methods for NM, as the former is rather a continuation of normal RA according to standard assessment procedures (e.g., REACH). Both approaches, along with recommendations for using LCA and RA together for NM, are similar to those made previously for chemicals; thus, there does not appear to be much NM-specific progress. We identified one issue in particular that may be specific to NM when applying LCA and RA at this time: the need to establish proper dose metrics within both methods.

  6. Potential protective effect of lactation against incidence of type 2 diabetes mellitus in women with previous gestational diabetes mellitus: A systematic review and meta-analysis.

    Science.gov (United States)

    Tanase-Nakao, Kanako; Arata, Naoko; Kawasaki, Maki; Yasuhi, Ichiro; Sone, Hirohito; Mori, Rintaro; Ota, Erika

    2017-05-01

    Lactation may protect women with previous gestational diabetes mellitus (GDM) from developing type 2 diabetes mellitus, but the results of existing studies are inconsistent, ranging from null to beneficial. We aimed to conduct a systematic review to gather the available evidence. The MEDLINE, CINAHL, PubMed, and EMBASE databases were searched on December 15, 2015, without restriction of language or publication year. A manual search was also conducted. We included observational studies (cross-sectional, case-control, and cohort studies) with information on lactation and type 2 diabetes mellitus incidence among women with previous GDM. We excluded case studies without control data. Data synthesis was conducted by random-effects meta-analysis. Fourteen reports of 9 studies were included. Overall risk of bias assessed using RoBANS ranged from low to unclear. Longer lactation for more than 4 to 12 weeks postpartum was associated with a reduced risk of type 2 diabetes mellitus compared with shorter lactation (OR 0.77, 95% CI 0.01-55.86; OR 0.56, 95% CI 0.35-0.89; OR 0.22, 95% CI 0.13-0.36; type 2 diabetes mellitus evaluation time ... 5 y, respectively). Exclusive lactation for more than 6 to 9 weeks postpartum also carried a lower risk of type 2 diabetes mellitus compared with exclusive formula feeding (OR 0.42, 95% CI 0.22-0.81). The findings support the evidence that longer and exclusive lactation may be beneficial for type 2 diabetes mellitus prevention in women with previous GDM. However, the evidence relies only on observational studies. Therefore, further studies are required to address the true causal effect. © 2017 The Authors. Diabetes/Metabolism Research and Reviews Published by John Wiley & Sons Ltd.
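    The pooling step behind such a random-effects meta-analysis can be sketched with the DerSimonian-Laird estimator. This is an illustrative sketch only: it reuses two odds ratios quoted in the abstract (0.56 and 0.22, with their CIs) as inputs, and the helper function name is ours, not from the review.

    ```python
    import math

    # Sketch of random-effects pooling of odds ratios (DerSimonian-Laird).
    # Each study enters as (OR, lower 95% CI, upper 95% CI).

    def pool_random_effects(studies):
        """Return the pooled OR from (or_, lo, hi) triples."""
        logs = [math.log(or_) for or_, lo, hi in studies]
        # Standard errors recovered from the 95% CI width on the log scale
        ses = [(math.log(hi) - math.log(lo)) / (2 * 1.96)
               for _, lo, hi in studies]
        w = [1 / se**2 for se in ses]                  # fixed-effect weights
        ybar = sum(wi * yi for wi, yi in zip(w, logs)) / sum(w)
        q = sum(wi * (yi - ybar)**2 for wi, yi in zip(w, logs))
        df = len(studies) - 1
        c = sum(w) - sum(wi**2 for wi in w) / sum(w)
        tau2 = max(0.0, (q - df) / c)                  # between-study variance
        w_re = [1 / (se**2 + tau2) for se in ses]      # random-effects weights
        pooled = sum(wi * yi for wi, yi in zip(w_re, logs)) / sum(w_re)
        return math.exp(pooled)

    pooled_or = pool_random_effects([(0.56, 0.35, 0.89), (0.22, 0.13, 0.36)])
    ```

    With these two strata the pooled estimate lands between the two input ORs, as expected for an inverse-variance weighted average on the log scale.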

  7. SU-E-I-58: Objective Models of Breast Shape Undergoing Mammography and Tomosynthesis Using Principal Component Analysis.

    Science.gov (United States)

    Feng, Ssj; Sechopoulos, I

    2012-06-01

    To develop an objective model of the shape of the compressed breast undergoing mammographic or tomosynthesis acquisition. Automated thresholding and edge detection were performed on 984 anonymized digital mammograms (492 craniocaudal (CC) view and 492 mediolateral oblique (MLO) view mammograms) to extract the edge of each breast. Principal component analysis (PCA) was performed on these edge vectors to identify a limited set of parameters and eigenvectors. These parameters and eigenvectors comprise a model that can be used to describe the breast shapes present in acquired mammograms and to generate realistic models of breasts undergoing acquisition. Sample breast shapes were then generated from this model and evaluated. The mammograms in the database were previously acquired for a separate study and authorized for use in further research. The PCA successfully identified two principal components and their corresponding eigenvectors, forming the basis for the breast shape model. The simulated breast shapes generated from the model are reasonable approximations of clinically acquired mammograms. Using PCA, we have obtained models of the compressed breast undergoing mammographic or tomosynthesis acquisition, based on objective analysis of a large image database. Up to now, the breast in the CC view has been approximated as a semicircular tube, while there has been no objectively obtained model for the MLO-view breast shape. Such models can be used for various breast imaging research applications, such as x-ray scatter estimation and correction, dosimetry estimates, and computer-aided detection and diagnosis. © 2012 American Association of Physicists in Medicine.
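    The core PCA computation described above (edge vectors to a mean shape, eigenvectors, and a few scores per mammogram) can be sketched as follows. The semi-elliptical contours are synthetic stand-ins for the 984 extracted breast edges; the two-component choice mirrors the abstract, everything else is assumed.

    ```python
    import numpy as np

    # Each "edge" is flattened into a fixed-length vector of contour
    # coordinates; PCA then yields a mean shape plus eigenvectors, and each
    # shape is summarized by a small number of component scores.

    rng = np.random.default_rng(0)
    n_shapes, n_points = 50, 40
    t = np.linspace(0, np.pi, n_points)
    base = np.stack([np.cos(t), np.sin(t)], axis=1)        # (40, 2) half-ellipse
    # Synthetic edges varying in overall size and flattening
    shapes = np.array([
        (rng.uniform(0.8, 1.2) * base * [1.0, rng.uniform(0.5, 1.0)]).ravel()
        for _ in range(n_shapes)
    ])                                                     # (50, 80)

    mean_shape = shapes.mean(axis=0)
    centered = shapes - mean_shape
    # Eigen-decomposition of the covariance via SVD of the centered data
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    explained = s**2 / np.sum(s**2)       # variance fraction per component
    scores = centered @ vt[:2].T          # 2 principal-component scores/shape
    reconstructed = mean_shape + scores @ vt[:2]
    ```

    Because the synthetic shapes vary along exactly two modes (overall size and flattening), two components reconstruct them essentially perfectly; real edge data would need the explained-variance ratios to justify the cut-off.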

  8. Feasibility study for objective oriented design of system thermal hydraulic analysis program

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Jeong, Jae Jun; Hwang, Moon Kyu

    2008-01-01

    System safety analysis codes such as RELAP5, TRAC, and CATHARE have been developed in Fortran over the past few decades. Refactoring of these conventional codes has also been performed to improve code readability and maintenance. However, the programming paradigm in software technology has shifted toward object-oriented programming (OOP), which is based on several techniques including encapsulation, modularity, polymorphism, and inheritance. In this work, an object-oriented program for a system safety analysis code has been attempted utilizing a modernized C language. The analysis, design, implementation, and verification steps for OOP system code development are described with some implementation examples. The system code SYSTF, based on a three-fluid thermal hydraulic solver, has been developed with an OOP design. Verification of feasibility is performed with simple fundamental problems and plant models. (author)

  9. Complexity analysis of dual-channel game model with different managers' business objectives

    Science.gov (United States)

    Li, Ting; Ma, Junhai

    2015-01-01

    This paper considers a dual-channel game model with bounded rationality, using the theory of bifurcations of dynamical systems. The business objectives of the retailers are assumed to be different, which is closer to reality than in previous studies. We study the local stable region of the Nash equilibrium point and find that business objectives can expand the stable region and play an important role in price strategy. One interesting finding is that fiercer competition tends to stabilize the Nash equilibrium. Simulation shows the complex behavior of the two-dimensional dynamic system; we find period-doubling bifurcation and chaos phenomena. We measure the performance of the model in different periods using the index of average profit. The results show that unstable behavior in an economic system is often an unfavorable outcome. This paper therefore discusses the application of an adaptive adjustment mechanism when the model exhibits chaotic behavior, which allows the retailers to eliminate the negative effects.

  10. Evaluating fuzzy operators of an object-based image analysis for detecting landslides and their changes

    Science.gov (United States)

    Feizizadeh, Bakhtiar; Blaschke, Thomas; Tiede, Dirk; Moghaddam, Mohammad Hossein Rezaei

    2017-09-01

    This article presents a method of object-based image analysis (OBIA) for landslide delineation and landslide-related change detection from multi-temporal satellite images. It uses both spatial and spectral information on landslides, through spectral analysis, shape analysis, textural measurements using a gray-level co-occurrence matrix (GLCM), and fuzzy-logic membership functions. Following an initial segmentation step, particular combinations of various information layers were investigated to generate objects. This was achieved by applying multi-resolution segmentation to IRS-1D, SPOT-5, and ALOS satellite imagery in sequential steps of feature selection and object classification, using slope and flow-direction derivatives from a digital elevation model together with topographically oriented gray-level co-occurrence matrices. Fuzzy membership values were calculated for 11 different membership functions using 20 landslide objects from a landslide training dataset. Six fuzzy operators were used for the final classification and the accuracies of the resulting landslide maps were compared. A fuzzy synthetic evaluation (FSE) approach was adapted for validation of the results and for an accuracy assessment using the landslide inventory database. The FSE approach revealed that the AND operator performed best, with an accuracy of 93.87% for 2005 and 94.74% for 2011, closely followed by the arithmetic MEAN operator, while the OR and AND(*) operators yielded relatively low accuracies. An object-based change detection was then applied to monitor landslide-related changes that occurred in northern Iran between 2005 and 2011. Knowledge rules to detect possible landslide-related changes were developed by evaluating all possible landslide-related objects for both time steps.
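    Fuzzy operators of the kind compared in such OBIA studies combine per-criterion membership values into one score per object. A minimal sketch, with hypothetical membership values for a single candidate object; the criteria and the 0.5 threshold are our assumptions, not the paper's:

    ```python
    import math

    # Combining fuzzy membership values with four common operators:
    # AND (minimum), OR (maximum), arithmetic MEAN, and AND(*) (product).
    # Values below are illustrative memberships of one candidate object
    # under three criteria (e.g., spectral, shape, slope).

    def fuzzy_and(ms):
        return min(ms)

    def fuzzy_or(ms):
        return max(ms)

    def fuzzy_mean(ms):
        return sum(ms) / len(ms)

    def fuzzy_prod(ms):                      # AND(*): algebraic product
        return math.prod(ms)

    memberships = [0.9, 0.7, 0.8]
    combined = {
        "AND": fuzzy_and(memberships),
        "OR": fuzzy_or(memberships),
        "MEAN": fuzzy_mean(memberships),
        "AND(*)": fuzzy_prod(memberships),
    }
    # A final crisp classification thresholds the combined membership
    is_landslide = combined["AND"] >= 0.5
    ```

    The AND (minimum) operator is the most conservative of the four, which is consistent with it producing the most accurate, least over-inclusive landslide maps in the study.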

  11. Classification of Land Use on Sand-Dune Topography by Object-Based Analysis, Digital Photogrammetry, and GIS Analysis in the Horqin Sandy Land, China

    Directory of Open Access Journals (Sweden)

    Takafumi Miyasaka

    2016-07-01

    Full Text Available Previous field research on the Horqin Sandy Land (China), which has suffered from severe desertification during recent decades, revealed how land use on a sand-dune topography affects both land degradation and restoration. This study aimed to depict the spatial distribution of local land use in order to shed more light on previous field findings regarding policies on a broader scale. We performed the following analyses with Panchromatic Remote-sensing Instrument for Stereo Mapping (PRISM) and Advanced Visible and Near Infrared Radiometer type 2 (AVNIR-2) images of the Advanced Land Observing Satellite (ALOS): (1) object-based classification to discriminate preliminary land-use types that were only approximately differentiated by ordinary pixel-based analysis with spectral information; (2) digital photogrammetry to generate a digital surface model (DSM) with adequately high accuracy to represent the undulating sand-dune topography; (3) geographic information system (GIS) analysis to classify major topographic types with the DSM; and (4) overlay of the two classification results to depict the local land-use types. The overall accuracies of the object-based and GIS-based classifications were high, at 93% (kappa statistic: 0.84) and 89% (kappa statistic: 0.81), respectively. The resultant local land-use map represents areas covered in previous field studies, showing where and how land degradation and restoration are likely to occur. This research can contribute to future environmental surveys, models, and policies in the study area.

  12. Art, historical and cultural heritage objects studied with different non-destructive analysis

    International Nuclear Information System (INIS)

    Rizzutto, Marcia A.; Tabacniks, Manfredo H.; Added, Nemitala; Campos, Pedro H.O.V.; Curado, Jessica F.; Kajiya, Elizabeth A.M.

    2012-01-01

    Full text: Since 2003, the analysis of art, historical, and cultural heritage objects has been performed at the Laboratorio de Analise de Materiais of the Instituto de Fisica of the Universidade de Sao Paulo (LAMFI-USP). Initially the studies were restricted to non-destructive methods using ion beams to characterize the chemical elements present in the objects. Recently, new analytical techniques and procedures have been incorporated for better characterization of the objects, and the examinations were expanded to other non-destructive analytical techniques such as portable X-ray fluorescence (XRF), digitized radiography, high-resolution photography with visible and UV (ultraviolet) light, and reflectography in the infrared region. These non-destructive analytical techniques, systematically applied to the objects, are improving our understanding of them by revealing their main components, their conservation status, and the creative process of the artist; in easel paintings in particular they allow new discoveries. The external-beam setup of the LAMFI laboratory is configured to allow different simultaneous analyses by PIXE/PIGE (particle induced X-ray emission / particle induced gamma-ray emission), RBS (Rutherford backscattering), and IBL (ion beam luminescence), and to expand the archaeometric results using ion beams. PIXE and XRF analyses are important to characterize the elements present in the objects, pigments, and other materials. Digitized radiography has provided important information about the internal structure of the objects, the manufacturing process, and internal particles; in the case of easel paintings it can reveal features of the artist's creative process, showing hidden images and the first paintings done by the artist in the background. Some Brazilian paintings studied by IR imaging revealed underlying drawings, which allowed us to discover the process of creation...

  14. Object Classification Based on Analysis of Spectral Characteristics of Seismic Signal Envelopes

    Science.gov (United States)

    Morozov, Yu. V.; Spektor, A. A.

    2017-11-01

    A method is proposed for classifying moving objects by their seismic effect on the ground surface, based on statistical analysis of the envelopes of received signals. The values of the components of the amplitude spectrum of the envelopes, obtained by applying Hilbert and Fourier transforms, are used as classification criteria. Examples illustrating the statistical properties of the spectra and the operation of the seismic classifier are given for an ensemble of objects of four classes (person, group of people, large animal, vehicle). It is shown that the computational procedures for processing the seismic signals are quite simple and can therefore be used in real-time systems with modest requirements for computational resources.
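    The feature extraction described above (envelope via the Hilbert transform, then the amplitude spectrum of the envelope) can be sketched in a few lines of NumPy. This is a minimal illustration of the idea, not the authors' implementation; the function name and parameters are ours.

    ```python
    import numpy as np

    def envelope_spectrum(signal, n_features=50):
        """Feature vector for envelope-based classification: amplitude
        spectrum of the signal envelope (analytic signal via FFT)."""
        n = len(signal)
        # Build the analytic signal: zero negative frequencies, double positive ones.
        spec = np.fft.fft(signal)
        h = np.zeros(n)
        h[0] = 1.0
        if n % 2 == 0:
            h[n // 2] = 1.0
            h[1:n // 2] = 2.0
        else:
            h[1:(n + 1) // 2] = 2.0
        analytic = np.fft.ifft(spec * h)
        envelope = np.abs(analytic)               # instantaneous amplitude
        # Amplitude spectrum of the mean-removed envelope, normalized to [0, 1].
        env_spec = np.abs(np.fft.rfft(envelope - envelope.mean()))
        return env_spec[:n_features] / (env_spec.max() + 1e-12)
    ```

    For an amplitude-modulated test signal (20 Hz carrier, 2 Hz modulation, 100 Hz sampling over 10 s) the dominant envelope component falls in the 2 Hz bin, as expected.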

  15. Context based Coding of Binary Shapes by Object Boundary Straightness Analysis

    DEFF Research Database (Denmark)

    Aghito, Shankar Manuel; Forchhammer, Søren

    2004-01-01

    A new lossless compression scheme for bilevel images targeted at binary shapes of image and video objects is presented. The scheme is based on a local analysis of the digital straightness of the causal part of the object boundary, which is used in the context definition for arithmetic encoding. Tested on individual images of binary shapes and on binary layers of digital maps, the algorithm outperforms PWC, JBIG and MPEG-4 CAE. On the binary shapes the code lengths are reduced by 21%, 25%, and 42%, respectively. On the maps the reductions are 34%, 32%, and 59%, respectively. The algorithm is also...

  16. Plantar pressure in diabetic peripheral neuropathy patients with active foot ulceration, previous ulceration and no history of ulceration: a meta-analysis of observational studies.

    Science.gov (United States)

    Fernando, Malindu Eranga; Crowther, Robert George; Pappas, Elise; Lazzarini, Peter Anthony; Cunningham, Margaret; Sangla, Kunwarjit Singh; Buttner, Petra; Golledge, Jonathan

    2014-01-01

    Elevated dynamic plantar pressures are a consistent finding in diabetes patients with peripheral neuropathy, with implications for plantar foot ulceration. This meta-analysis aimed to compare the plantar pressures of diabetes patients with peripheral neuropathy alone and those whose neuropathy was accompanied by active or previous foot ulcers. Published articles were identified from Medline via OVID, CINAHL, SCOPUS, INFORMIT, Cochrane Central, EMBASE via OVID and Web of Science via ISI Web of Knowledge bibliographic databases. Observational studies reporting barefoot dynamic plantar pressure in adults with diabetic peripheral neuropathy, where at least one group had a history of plantar foot ulcers, were included. Interventional studies, shod plantar pressure studies and studies not published in English were excluded. Overall mean peak plantar pressure (MPP) and pressure time integral (PTI) were the primary outcomes. The six secondary outcomes were MPP and PTI at the rear foot, mid foot and fore foot. The protocol of the meta-analysis was registered with PROSPERO (registration number CRD42013004310). Eight observational studies were included. Overall MPP and PTI were greater in diabetic peripheral neuropathy patients with foot ulceration than in those without ulceration (standardised mean difference 0.551, 95% CI 0.290-0.811). Plantar pressures thus appear elevated in diabetic peripheral neuropathy patients with a history of foot ulceration compared to those with diabetic neuropathy without a history of ulceration. More homogeneous data are needed to confirm these findings.

  17. Evolutionary Analysis Predicts Sensitive Positions of MMP20 and Validates Newly- and Previously-Identified MMP20 Mutations Causing Amelogenesis Imperfecta.

    Science.gov (United States)

    Gasse, Barbara; Prasad, Megana; Delgado, Sidney; Huckert, Mathilde; Kawczynski, Marzena; Garret-Bernardin, Annelyse; Lopez-Cazaux, Serena; Bailleul-Forestier, Isabelle; Manière, Marie-Cécile; Stoetzel, Corinne; Bloch-Zupan, Agnès; Sire, Jean-Yves

    2017-01-01

    Amelogenesis imperfecta (AI) designates a group of genetic diseases characterized by a large range of enamel disorders causing important social and health problems. These defects can result from mutations in enamel matrix proteins or protease encoding genes. A range of mutations in the enamel cleavage enzyme matrix metalloproteinase-20 gene (MMP20) produce enamel defects of varying severity. To address how various alterations produce a range of AI phenotypes, we performed a targeted analysis to find MMP20 mutations in French patients diagnosed with non-syndromic AI. Genomic DNA was isolated from saliva and MMP20 exons and exon-intron boundaries sequenced. We identified several homozygous or heterozygous mutations, putatively involved in the AI phenotypes. To validate missense mutations and predict sensitive positions in the MMP20 sequence, we evolutionarily compared 75 sequences extracted from the public databases using the Datamonkey webserver. These sequences were representative of mammalian lineages, covering more than 150 million years of evolution. This analysis allowed us to find 324 sensitive positions (out of the 483 MMP20 residues), pinpoint functionally important domains, and build an evolutionary chart of important conserved MMP20 regions. This is an efficient tool to identify new- and previously-identified mutations. We thus identified six functional MMP20 mutations in unrelated families, finding two novel mutated sites. The genotypes and phenotypes of these six mutations are described and compared. To date, 13 MMP20 mutations causing AI have been reported, making these genotypes and associated hypomature enamel phenotypes the most frequent in AI.

  18. FEATUREOUS: AN INTEGRATED ENVIRONMENT FOR FEATURE-CENTRIC ANALYSIS AND MODIFICATION OF OBJECT-ORIENTED SOFTWARE

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2011-01-01

    The decentralized nature of collaborations between objects in object-oriented software makes it difficult to understand the implementations of user-observable program features and their respective interdependencies. As feature-centric program understanding and modification are essential during software maintenance and evolution, this situation needs to change. In this paper, we present Featureous, an integrated development environment built on top of the NetBeans IDE that facilitates feature-centric analysis of object-oriented software. Our integrated development environment encompasses a lightweight feature location mechanism, a number of reusable analytical views, and necessary APIs for supporting future extensions. The base of the integrated development environment is a conceptual framework comprising three complementary dimensions of comprehension: perspective, abstraction...

  19. The application of the unified modeling language in object-oriented analysis of healthcare information systems.

    Science.gov (United States)

    Aggarwal, Vinod

    2002-10-01

    This paper concerns itself with the beneficial effects of the Unified Modeling Language (UML), a nonproprietary object modeling standard, in specifying, visualizing, constructing, documenting, and communicating the model of a healthcare information system from the user's perspective. The author outlines the process of object-oriented analysis (OOA) using the UML and illustrates this with healthcare examples to demonstrate the practicality of application of the UML by healthcare personnel to real-world information system problems. The UML will accelerate advanced uses of object-orientation such as reuse technology, resulting in significantly higher software productivity. The UML is also applicable in the context of a component paradigm that promises to enhance the capabilities of healthcare information systems and simplify their management and maintenance.

  20. An Analysis of Periodic Components in BL Lac Object S5 0716 +714 with MUSIC Method

    Science.gov (United States)

    Tang, J.

    2012-01-01

    Multiple signal classification (MUSIC) algorithms are introduced for estimating the variation period of BL Lac objects. The principle of the MUSIC spectral analysis method and a theoretical analysis of its frequency resolution using analog signals are included. From the literature, we collected effective observational data of the BL Lac object S5 0716+714 in the V, R, and I bands from 1994 to 2008. The light variation periods of S5 0716+714 were obtained by means of the MUSIC spectral analysis method and the periodogram spectral analysis method. Two major periods exist for all bands: (3.33±0.08) years and (1.24±0.01) years. The period estimates based on the MUSIC spectral analysis method are compared with those based on the periodogram spectral analysis method. MUSIC is a super-resolution algorithm that requires only a small data length and can detect the variation periods of weak signals.
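    A minimal sketch of the MUSIC estimator the abstract refers to, under the standard formulation (sample correlation matrix built from overlapping snapshots, pseudospectrum from the noise subspace). The function name and parameter choices are illustrative, not taken from the paper.

    ```python
    import numpy as np

    def music_pseudospectrum(x, p, m, freqs):
        """MUSIC pseudospectrum of a real 1-D series.

        x: samples; p: number of complex exponentials (2 per real sinusoid);
        m: correlation-matrix order; freqs: normalized frequencies (cycles/sample).
        """
        n = len(x)
        # Data matrix of overlapping length-m snapshots.
        X = np.array([x[i:i + m] for i in range(n - m + 1)])
        R = X.T @ X / X.shape[0]                  # sample correlation matrix
        w, v = np.linalg.eigh(R)                  # eigenvalues in ascending order
        En = v[:, :m - p]                         # noise subspace (smallest m-p)
        k = np.arange(m)
        P = []
        for f in freqs:
            a = np.exp(2j * np.pi * f * k)        # steering vector at frequency f
            P.append(1.0 / np.linalg.norm(En.conj().T @ a) ** 2)
        return np.array(P)
    ```

    The pseudospectrum peaks where the steering vector is orthogonal to the noise subspace, i.e. at the signal frequencies, which is what gives MUSIC its super-resolution behaviour on short records.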

  1. Objective breast symmetry analysis with the breast analyzing tool (BAT): improved tool for clinical trials.

    Science.gov (United States)

    Krois, Wilfried; Romar, Alexander Ken; Wild, Thomas; Dubsky, Peter; Exner, Ruth; Panhofer, Peter; Jakesz, Raimund; Gnant, Michael; Fitzal, Florian

    2017-07-01

    Objective cosmetic analysis is important to evaluate the cosmetic outcome after breast surgery or breast radiotherapy. For this purpose, we aimed to improve our recently developed objective scoring software, the Breast Analyzing Tool (BAT®). A questionnaire about important factors for breast symmetry was handed out to ten experts (surgeons) and eight non-experts (students). Using these factors, the first-generation BAT® software formula was modified, and the breast symmetry index (BSI) of 129 women after breast surgery was calculated by the first author with this new BAT® formula. The resulting BSI values of these 129 breast cancer patients were then correlated with subjective symmetry scores from the 18 observers using the Harris scale. The BSI of ten images was also calculated by five observers other than the first author to assess inter-rater reliability. In a second phase, the new BAT® formula was validated and correlated with subjective scores of an additional 50 women after breast surgery. The inter-rater reliability analysis of the objective evaluation by the BAT® from five individuals showed an ICC of 0.992, with almost no difference between observers. All subjective scores of the 50 patients correlated with the modified BSI score with a high Pearson correlation coefficient of 0.909. The improved BAT® software strengthens the correlation between subjective and objective BSI values, and may become a new standard for trials evaluating breast symmetry.

  2. Nurse-surgeon object transfer: video analysis of communication and situation awareness in the operating theatre.

    Science.gov (United States)

    Korkiakangas, Terhi; Weldon, Sharon-Marie; Bezemer, Jeff; Kneebone, Roger

    2014-09-01

    One of the most central collaborative tasks during surgical operations is the passing of objects, including instruments. Little is known about how nurses and surgeons achieve this. The aim of the present study was to explore what factors affect this routine-like task, resulting in fast or slow transfer of objects. A qualitative video study, informed by an observational ethnographic approach, was conducted in a major teaching hospital in the UK. A total of 20 general surgical operations were observed. In total, approximately 68 h of video data were reviewed. A subsample of 225 min was analysed in detail using interactional video-analysis developed within the social sciences. Two factors affecting object transfer were observed: (1) relative instrument trolley position and (2) alignment. The scrub nurse's instrument trolley position (close to vs. further back from the surgeon) and alignment (gaze direction) impact the communication with the surgeon and, consequently, the speed of object transfer. When the scrub nurse was standing close to the surgeon, and "converged" to follow the surgeon's movements, the transfer occurred more seamlessly and faster (1.0 s). The smoothness of object transfer can be improved by adjusting the scrub nurse's instrument trolley position, enabling better monitoring of the surgeon's bodily conduct and affording early orientation (awareness) to an upcoming request (changing situation). Object transfer is facilitated by the surgeon's embodied practices, which can elicit the nurse's attention to the request and, in response, speed up object transfer. A simple intervention to highlight the significance of these factors could improve communication in the operating theatre. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Objective Audio Quality Assessment Based on Spectro-Temporal Modulation Analysis

    OpenAIRE

    Guo, Ziyuan

    2011-01-01

    Objective audio quality assessment is an interdisciplinary research area that incorporates audiology and machine learning. Although much work has been done on the machine learning aspect, the audiology aspect also deserves investigation. This thesis proposes a non-intrusive audio quality assessment algorithm based on an auditory model that simulates the human auditory system. The auditory model is based on spectro-temporal modulation analysis of the spectrogram, which has been proven to be ...

  4. Parametric analysis of energy quality management for district in China using multi-objective optimization approach

    International Nuclear Information System (INIS)

    Lu, Hai; Yu, Zitao; Alanne, Kari; Xu, Xu; Fan, Liwu; Yu, Han; Zhang, Liang; Martinac, Ivo

    2014-01-01

    Highlights: • A time-effective multi-objective design optimization scheme is proposed. • The scheme aims at exploring a suitable 3E energy system for the specific case. • A realistic case located in China is used for the analysis. • A parametric study is performed to test the effects of different parameters. - Abstract: Due to increasing energy demands and global warming, energy quality management (EQM) for districts has been gaining importance over the last few decades. The evaluation of the optimum energy systems for specific districts is an essential part of EQM. This paper presents a deep analysis of the optimum energy systems for a district sited in China. A multi-objective optimization approach based on a Genetic Algorithm (GA) is proposed for the analysis. The optimization process searches for suitable 3E energy systems (minimum economic cost and environmental burden as well as maximum efficiency). Here, life-cycle CO2 equivalent (LCCO2), life-cycle cost (LCC) and exergy efficiency (EE) are set as the optimization objectives. The optimum energy systems for the Chinese case are then presented. Finally, the effects of different energy parameters are investigated. The results show that the optimum energy systems may vary significantly depending on these parameters.
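    At the core of a GA-based multi-objective scheme like the one described is Pareto dominance over objective vectors (here LCC, LCCO2, and exergy efficiency, with the maximized efficiency negated so all objectives are minimized). A minimal non-dominated-filter sketch, purely illustrative and not the authors' implementation:

    ```python
    def pareto_front(points):
        """Return indices of non-dominated points; all objectives are minimized.
        (A maximized objective such as exergy efficiency is negated first.)"""
        front = []
        for i, p in enumerate(points):
            # p is dominated if some q is no worse in every objective
            # and strictly better in at least one.
            dominated = any(
                all(q[k] <= p[k] for k in range(len(p))) and
                any(q[k] < p[k] for k in range(len(p)))
                for j, q in enumerate(points) if j != i)
            if not dominated:
                front.append(i)
        return front
    ```

    An NSGA-II-style algorithm repeatedly applies this ranking (plus crowding-distance sorting) to drive the population toward the Pareto frontier.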

  5. Geographic Object-Based Image Analysis – Towards a new paradigm

    Science.gov (United States)

    Blaschke, Thomas; Hay, Geoffrey J.; Kelly, Maggi; Lang, Stefan; Hofmann, Peter; Addink, Elisabeth; Queiroz Feitosa, Raul; van der Meer, Freek; van der Werff, Harald; van Coillie, Frieke; Tiede, Dirk

    2014-01-01

    The amount of scientific literature on (Geographic) Object-based Image Analysis – GEOBIA has been and still is sharply increasing. These approaches to analysing imagery have antecedents in earlier research on image segmentation and use GIS-like spatial analysis within classification and feature extraction approaches. This article investigates these developments and their implications, and asks whether or not this is a new paradigm in remote sensing and Geographic Information Science (GIScience). We first discuss several limitations of prevailing per-pixel methods when applied to high resolution images. Then we explore the paradigm concept developed by Kuhn (1962) and discuss whether GEOBIA can be regarded as a paradigm according to this definition. We crystallize core concepts of GEOBIA, including the role of objects, of ontologies, and of the multiplicity of scales, and we discuss how these conceptual developments support important methods in remote sensing such as change detection and accuracy assessment. The ramifications of the different theoretical foundations between the ‘per-pixel paradigm’ and GEOBIA are analysed, as are some of the challenges along this path from pixels, to objects, to geo-intelligence. Based on several paradigm indications as defined by Kuhn and on an analysis of peer-reviewed scientific literature we conclude that GEOBIA is a new and evolving paradigm. PMID:24623958

  6. Fourier analysis of intracranial aneurysms: towards an objective and quantitative evaluation of the shape of aneurysms

    International Nuclear Information System (INIS)

    Rohde, Stefan; Lahmann, Katharina; Nafe, Reinhold; Yan, Bernard; Berkefeld, Joachim; Beck, Juergen; Raabe, Andreas

    2005-01-01

    Shape irregularities of intracranial aneurysms may indicate an increased risk of rupture. To quantify morphological differences, Fourier analysis of the shape of intracranial aneurysms was introduced. We compared the morphology of 45 unruptured (UIA) and 46 ruptured intracranial aneurysms (RIA) in 70 consecutive patients on the basis of 3D-rotational angiography. Fourier analysis, coefficient of roundness and qualitative shape assessment were determined for each aneurysm. Morphometric analysis revealed a significantly smaller coefficient of roundness (P<0.02) and higher values for Fourier amplitudes 2, 3 and 7 (P<0.01) in the RIA group, indicating more complex and irregular morphology in RIA. Qualitative assessment from 3D-reconstructions showed surface irregularities in 78% of RIA and 42% of UIA (P<0.05). Our data have shown significant differences in shape between RIA and UIA, and further development of Fourier analysis may provide an objective factor for the assessment of the risk of rupture. (orig.)
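    Fourier shape analysis of a closed contour can be illustrated with Fourier amplitudes of the complex boundary signal, normalized by the first harmonic so that a perfect circle has vanishing higher harmonics while irregular shapes spread energy into them. This is a generic sketch of the idea, not the authors' exact 3D formulation:

    ```python
    import numpy as np

    def fourier_amplitudes(xy, n_harmonics=8):
        """Normalized Fourier amplitudes of a closed 2-D contour.

        Contour points (N x 2 array, ordered around the boundary) are treated
        as complex numbers; amplitudes are scaled by the first harmonic so the
        result is invariant to size.
        """
        z = xy[:, 0] + 1j * xy[:, 1]
        Z = np.fft.fft(z - z.mean())          # centroid removed
        amps = np.abs(Z)
        return amps[1:n_harmonics + 1] / (amps[1] + 1e-12)
    ```

    For a circle the returned vector is (1, 0, 0, ...); a lobed contour such as r(θ) = 1 + 0.3·cos(3θ) puts measurable energy into a higher harmonic, mirroring how elevated higher-order amplitudes flag irregular aneurysm shapes.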

  7. Evolutionary Analysis Predicts Sensitive Positions of MMP20 and Validates Newly- and Previously-Identified MMP20 Mutations Causing Amelogenesis Imperfecta

    Directory of Open Access Journals (Sweden)

    Barbara Gasse

    2017-06-01

    Full Text Available Amelogenesis imperfecta (AI) designates a group of genetic diseases characterized by a large range of enamel disorders causing important social and health problems. These defects can result from mutations in enamel matrix proteins or protease encoding genes. A range of mutations in the enamel cleavage enzyme matrix metalloproteinase-20 gene (MMP20) produce enamel defects of varying severity. To address how various alterations produce a range of AI phenotypes, we performed a targeted analysis to find MMP20 mutations in French patients diagnosed with non-syndromic AI. Genomic DNA was isolated from saliva and MMP20 exons and exon-intron boundaries sequenced. We identified several homozygous or heterozygous mutations, putatively involved in the AI phenotypes. To validate missense mutations and predict sensitive positions in the MMP20 sequence, we evolutionarily compared 75 sequences extracted from the public databases using the Datamonkey webserver. These sequences were representative of mammalian lineages, covering more than 150 million years of evolution. This analysis allowed us to find 324 sensitive positions (out of the 483 MMP20 residues), pinpoint functionally important domains, and build an evolutionary chart of important conserved MMP20 regions. This is an efficient tool to identify new- and previously-identified mutations. We thus identified six functional MMP20 mutations in unrelated families, finding two novel mutated sites. The genotypes and phenotypes of these six mutations are described and compared. To date, 13 MMP20 mutations causing AI have been reported, making these genotypes and associated hypomature enamel phenotypes the most frequent in AI.

  8. A decision analysis approach for risk management of near-earth objects

    Science.gov (United States)

    Lee, Robert C.; Jones, Thomas D.; Chapman, Clark R.

    2014-10-01

    Risk management of near-Earth objects (NEOs; e.g., asteroids and comets) that can potentially impact Earth is an important issue that took on added urgency with the Chelyabinsk event of February 2013. Thousands of NEOs large enough to cause substantial damage are known to exist, although only a small fraction of these have the potential to impact Earth in the next few centuries. The probability and location of a NEO impact are subject to complex physics and great uncertainty, and consequences can range from minimal to devastating, depending upon the size of the NEO and location of impact. Deflecting a potential NEO impactor would be complex and expensive, and inter-agency and international cooperation would be necessary. Such deflection campaigns may be risky in themselves, and mission failure may result in unintended consequences. The benefits, risks, and costs of different potential NEO risk management strategies have not been compared in a systematic fashion. We present a decision analysis framework addressing this hazard. Decision analysis is the science of informing difficult decisions. It is inherently multi-disciplinary, especially with regard to managing catastrophic risks. Note that risk analysis clarifies the nature and magnitude of risks, whereas decision analysis guides rational risk management. Decision analysis can be used to inform strategic, policy, or resource allocation decisions. First, a problem is defined, including the decision situation and context. Second, objectives are defined, based upon what the different decision-makers and stakeholders (i.e., participants in the decision) value as important. Third, quantitative measures or scales for the objectives are determined. Fourth, alternative choices or strategies are defined. Fifth, the problem is then quantitatively modeled, including probabilistic risk analysis, and the alternatives are ranked in terms of how well they satisfy the objectives. Sixth, sensitivity analyses are performed in

  9. Neural regions supporting lexical processing of objects and actions: A case series analysis

    Directory of Open Access Journals (Sweden)

    Bonnie L Breining

    2014-04-01

    Full Text Available Introduction. Linking semantic representations to lexical items is an important cognitive process for both producing and comprehending language. Past research has suggested that the bilateral anterior temporal lobes are critical for this process (e.g. Patterson, Nestor, & Rogers, 2007. However, the majority of studies focused on object concepts alone, ignoring actions. The few that considered actions suggest that the temporal poles are not critical for their processing (e.g. Kemmerer et al., 2010. In this case series, we investigated the neural substrates of linking object and action concepts to lexical labels by correlating the volume of defined regions of interest with behavioral performance on picture-word verification and picture naming tasks of individuals with primary progressive aphasia (PPA. PPA is a neurodegenerative condition with heterogeneous neuropathological causes, characterized by increasing language deficits for at least two years in the face of relatively intact cognitive function in other domains (Gorno-Tempini et al., 2011. This population displays appropriate heterogeneity of performance and focal atrophy for investigating the neural substrates involved in lexical semantic processing of objects and actions. Method. Twenty-one individuals with PPA participated in behavioral assessment within six months of high resolution anatomical MRI scans. Behavioral assessments consisted of four tasks: picture-word verification and picture naming of objects and actions. Performance on these assessments was correlated with brain volume measured using atlas-based analysis in twenty regions of interest that are commonly atrophied in PPA and implicated in language processing. Results. Impaired performance for all four tasks significantly correlated with atrophy in the right superior temporal pole, left anterior middle temporal gyrus, and left fusiform gyrus. No regions were identified in which volume correlated with performance for both

  10. Intellectual capital: approaches to analysis as an object of the internal environment of an economic entity

    Directory of Open Access Journals (Sweden)

    O. E. Ustinova

    2017-01-01

    Full Text Available Intellectual capital is of strategic importance for a modern company. Managing it effectively, including a stimulating and creative approach to problem solving, helps to increase the competitiveness and development of economic entities. The article considers intellectual capital as an object of analysis of the internal environment and, in the context of the proposed approaches to its study, its impact on the development of the company. Intellectual capital has a special significance and influence on internal processes, since the intellectual component allows each of them to achieve a positive synergetic effect from the interaction of different objects. More specifically, it is proposed to consider intellectual capital in terms of the company's market position, the principles of its activities, the formation of marketing policies, the use of resources, the methods and means of making managerial decisions, and the organizational culture that has formed. For the analysis of the state of the internal environment, the main approaches in which intellectual capital is considered are proposed, among them: methods for analyzing cash flows, the economic efficiency and financial feasibility of a project, analysis of the consolidated financial flow by group of objects, assessment of the potential of the business entity, the technology for choosing an investment policy, and the technology for selecting incentive mechanisms. In this regard, it is advisable to analyze the company's internal environment from the position of how the state of its intellectual capital influences it. A scheme of the interaction between intellectual capital and the objects of internal-environment assessment of the economic entity is offered. The results of this study should be considered as initial data for the further development of the economic evaluation of the influence of intellectual capital on the competitiveness of companies.

  11. Automated analysis of art object surfaces using time-averaged digital speckle pattern interferometry

    Science.gov (United States)

    Lukomski, Michal; Krzemien, Leszek

    2013-05-01

    Technical development and practical evaluation of a laboratory built, out-of-plane digital speckle pattern interferometer (DSPI) are reported. The instrument was used for non-invasive, non-contact detection and characterization of early-stage damage, like fracturing and layer separation, of painted objects of art. A fully automated algorithm was developed for recording and analysis of vibrating objects utilizing continuous-wave laser light. The algorithm uses direct, numerical fitting or Hilbert transformation for an independent, quantitative evaluation of the Bessel function at every point of the investigated surface. The procedure does not require phase modulation and thus can be implemented within any, even the simplest, DSPI apparatus. The proposed deformation analysis is fast and computationally inexpensive. Diagnosis of physical state of the surface of a panel painting attributed to Nicolaus Haberschrack (a late-mediaeval painter active in Krakow) from the collection of the National Museum in Krakow is presented as an example of an in situ application of the developed methodology. It has allowed the effectiveness of the deformation analysis to be evaluated for the surface of a real painting (heterogeneous colour and texture) in a conservation studio where vibration level was considerably higher than in the laboratory. It has been established that the methodology, which offers automatic analysis of the interferometric fringe patterns, has a considerable potential to facilitate and render more precise the condition surveys of works of art.

  12. MRI histogram analysis enables objective and continuous classification of intervertebral disc degeneration.

    Science.gov (United States)

    Waldenberg, Christian; Hebelka, Hanna; Brisby, Helena; Lagerstrand, Kerstin Magdalena

    2018-05-01

    Magnetic resonance imaging (MRI) is the best diagnostic imaging method for low back pain. However, the technique is currently not utilized to its full capacity, often failing to depict painful intervertebral discs (IVDs), potentially due to the rough degeneration classification system used clinically today. MR image histograms, which reflect the IVD heterogeneity, may offer sensitive imaging biomarkers for IVD degeneration classification. This study investigates the feasibility of using histogram analysis as a means of objective and continuous grading of IVD degeneration. Forty-nine IVDs in ten low back pain patients (six males, 25-69 years) were examined with MRI (T2-weighted images and T2-maps). Each IVD was semi-automatically segmented on three mid-sagittal slices. Histogram features of the IVD were extracted from the defined regions of interest and correlated to Pfirrmann grade. Both T2-weighted images and T2-maps displayed similar histogram features. Histograms of well-hydrated IVDs displayed two separate peaks, representing the annulus fibrosus and the nucleus pulposus. Degenerated IVDs displayed decreased peak separation, and the separation was shown to correlate strongly with Pfirrmann grade. Histogram features correlated well with IVD degeneration, suggesting that IVD histogram analysis is a suitable tool for objective and continuous IVD degeneration classification. As histogram analysis revealed IVD heterogeneity, it may become a clinical tool for characterizing regional IVD degeneration effects. To elucidate the usefulness of histogram analysis in patient management, IVD histogram features of asymptomatic and symptomatic individuals need to be compared.
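    The peak-separation idea can be illustrated with a simple histogram-based proxy. This sketch is hypothetical and not the authors' method; it only shows how decreasing separation between two modes (here, annulus- and nucleus-like intensity populations) can be quantified from a histogram:

    ```python
    import numpy as np

    def peak_separation(values, bins=64, min_gap=5):
        """Simple bimodality proxy: normalized distance between the strongest
        histogram bin and the strongest bin at least `min_gap` bins away.
        Low for unimodal data, high for well-separated modes."""
        counts, edges = np.histogram(values, bins=bins)
        # Light smoothing to suppress spurious single-bin maxima.
        counts = np.convolve(counts, np.ones(3) / 3.0, mode="same")
        centers = 0.5 * (edges[:-1] + edges[1:])
        i1 = int(np.argmax(counts))
        masked = counts.copy()
        masked[max(0, i1 - min_gap):i1 + min_gap + 1] = -1.0  # blank out first peak
        i2 = int(np.argmax(masked))
        return abs(centers[i2] - centers[i1]) / (edges[-1] - edges[0])
    ```

    On synthetic data, two well-separated Gaussian modes give a large separation while a single Gaussian gives a small one, mimicking the contrast between well-hydrated and degenerated discs.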

  13. Fast and objective detection and analysis of structures in downhole images

    Science.gov (United States)

    Wedge, Daniel; Holden, Eun-Jung; Dentith, Mike; Spadaccini, Nick

    2017-09-01

    Downhole acoustic and optical televiewer images, and formation microimager (FMI) logs are important datasets for structural and geotechnical analyses in the mineral and petroleum industries. Within these data, dipping planar structures appear as sinusoids, often in incomplete form and in abundance. Their detection is a labour intensive and hence expensive task, and as such is a significant bottleneck in data processing, as companies may have hundreds of kilometres of logs to process each year. We present an image analysis system that combines automated image analysis with an interactive user interface to support the analysis of televiewer images by users with different objectives. Our algorithm rapidly produces repeatable, objective results. We have embedded it in an interactive workflow to complement geologists' intuition and experience in interpreting data, improving efficiency and assisting, rather than replacing, the geologist. The main contributions include a new image quality assessment technique for highlighting image areas most suited to automated structure detection and for detecting boundaries of geological zones, and a novel sinusoid detection algorithm for detecting and selecting sinusoids with given confidence levels. Further tools are provided to perform rapid analysis of, and further detection of, structures, e.g. limited to specific orientations.
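    In an unrolled borehole image, a dipping plane traces a sinusoid in depth-versus-azimuth coordinates, y = a·sin(x) + b·cos(x) + c, so fitting candidate boundary points is a linear least-squares problem. A generic sketch of this fitting step (not the paper's detection algorithm, which additionally assigns confidence levels):

    ```python
    import numpy as np

    def fit_sinusoid(x, y):
        """Least-squares fit of y = a*sin(x) + b*cos(x) + c, the trace a
        dipping plane leaves on an unrolled borehole image
        (x = azimuth in radians, y = depth).
        Returns (amplitude, phase, offset) so that y ≈ amp*sin(x + phase) + c."""
        G = np.column_stack([np.sin(x), np.cos(x), np.ones_like(x)])
        (a, b, c), *_ = np.linalg.lstsq(G, y, rcond=None)
        return np.hypot(a, b), np.arctan2(b, a), c
    ```

    The amplitude relates to the apparent dip of the plane and the phase to its dip direction, which is why sinusoid parameters are the quantities of interest in televiewer interpretation.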

  14. Multi-objective optimization of a cascade refrigeration system: Exergetic, economic, environmental, and inherent safety analysis

    International Nuclear Information System (INIS)

    Eini, Saeed; Shahhosseini, Hamidreza; Delgarm, Navid; Lee, Moonyong; Bahadori, Alireza

    2016-01-01

    Highlights: • A multi-objective optimization is performed for a cascade refrigeration cycle. • The optimization problem considers inherently safe design as well as 3E analysis. • As a measure of inherent safety level, a quantitative risk analysis is utilized. • A CO2/NH3 cascade refrigeration system is compared with a CO2/C3H8 system. - Abstract: Inherently safer design is the new approach to maximizing the overall safety of a process plant. This approach suggests risk reduction strategies to be implemented in the early stages of design. In this paper, a multi-objective optimization was performed considering economic, exergetic, and environmental aspects, besides evaluation of the inherent safety level, of a cascade refrigeration system. The capital costs, the processing costs, and the social cost of CO2 emission were included in the economic objective function. Exergetic efficiency of the plant was considered as the second objective function. As a measure of inherent safety level, Quantitative Risk Assessment (QRA) was performed to calculate the total risk level of the cascade as the third objective function. Two refrigerants (ammonia and propane) were compared for the high-temperature circuit. The optimum solutions from the multi-objective optimization process were given as a Pareto frontier. The NSGA-II algorithm was used to obtain the Pareto optimal frontiers, and three decision-making approaches (TOPSIS, LINMAP, and Shannon's entropy methods) were utilized to select the final optimum point from the available solutions on the Pareto optimal curve. Considering continuous material release from the major equipment in the plant, flash and jet fire scenarios were considered for the CO2/C3H8 cycle and toxic hazards were considered for the CO2/NH3 cycle. The results showed no significant differences between CO2/NH3 and
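
The TOPSIS step — picking one point from the Pareto frontier — can be sketched as follows. The criterion values, weights, and benefit/cost flags in the usage are illustrative assumptions, not the paper's data:

```python
import math

def topsis(matrix, weights, benefit):
    """Rank alternatives (rows) over criteria (columns).
    benefit[j] is True when larger is better (e.g. exergetic efficiency)
    and False when smaller is better (cost, risk)."""
    m, n = len(matrix), len(matrix[0])
    # vector-normalize each column, then apply criterion weights
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m))) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)] for i in range(m)]
    # ideal = best value per criterion, anti-ideal = worst value per criterion
    ideal = [max(v[i][j] for i in range(m)) if benefit[j] else min(v[i][j] for i in range(m))
             for j in range(n)]
    anti = [min(v[i][j] for i in range(m)) if benefit[j] else max(v[i][j] for i in range(m))
            for j in range(n)]
    scores = []
    for i in range(m):
        d_pos = math.sqrt(sum((v[i][j] - ideal[j]) ** 2 for j in range(n)))
        d_neg = math.sqrt(sum((v[i][j] - anti[j]) ** 2 for j in range(n)))
        scores.append(d_neg / (d_pos + d_neg))  # closeness to the ideal point
    return scores
```

The alternative with the highest closeness score is the selected compromise solution.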

  15. Does objective cluster analysis serve as a useful precursor to seasonal precipitation prediction at local scale? Application to western Ethiopia

    Science.gov (United States)

    Zhang, Ying; Moges, Semu; Block, Paul

    2018-01-01

    Prediction of seasonal precipitation can provide actionable information to guide management of various sectoral activities. For instance, it is often translated into hydrological forecasts for better water resources management. However, many studies assume homogeneity in precipitation across an entire study region, which may prove ineffective for operational and local-level decisions, particularly for locations with high spatial variability. This study proposes advancing local-level seasonal precipitation predictions by first conditioning on regional-level predictions, as defined through objective cluster analysis, for western Ethiopia. To our knowledge, this is the first study predicting seasonal precipitation at high resolution in this region, where lives and livelihoods are vulnerable to precipitation variability given the high reliance on rain-fed agriculture and limited water resources infrastructure. The combination of objective cluster analysis, spatially high-resolution prediction of seasonal precipitation, and a modeling structure spanning statistical and dynamical approaches yields clear advances in prediction skill and resolution, as compared with previous studies. The statistical model outperforms both the non-clustered case and the dynamical models for a number of specific clusters in northwestern Ethiopia, with clusters having regional average correlation and ranked probability skill score (RPSS) values of up to 0.5 and 33%, respectively. The general skill (after bias correction) of the two best-performing dynamical models over the entire study region is superior to that of the statistical models, although the dynamical models issue predictions at a lower resolution and the raw predictions require bias correction to achieve comparable skill.
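
The ranked probability skill score reported above compares a probabilistic tercile forecast against a climatological baseline. A minimal sketch (tercile categories and equal climatological odds are assumptions):

```python
def rps(forecast_probs, obs_category):
    """Ranked probability score for one forecast: squared error of
    cumulative probabilities against the observed category (0-indexed)."""
    cum_f, cum_o, s = 0.0, 0.0, 0.0
    for k, p in enumerate(forecast_probs):
        cum_f += p
        cum_o += 1.0 if k == obs_category else 0.0
        s += (cum_f - cum_o) ** 2
    return s

def rpss(forecasts, observations, climatology=(1 / 3, 1 / 3, 1 / 3)):
    """Skill relative to always issuing the climatological tercile odds.
    1 = perfect, 0 = no better than climatology, negative = worse."""
    rps_f = sum(rps(f, o) for f, o in zip(forecasts, observations))
    rps_c = sum(rps(climatology, o) for o in observations)
    return 1.0 - rps_f / rps_c
```

An RPSS of 33% thus means a one-third reduction in ranked probability score relative to climatology.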

  16. Objective voice and speech analysis of persons with chronic hoarseness by prosodic analysis of speech samples.

    Science.gov (United States)

    Haderlein, Tino; Döllinger, Michael; Matoušek, Václav; Nöth, Elmar

    2016-10-01

    Automatic voice assessment is often performed using sustained vowels. In contrast, speech analysis of read-out texts can be applied to voice and speech assessment. Automatic speech recognition and prosodic analysis were used to find regression formulae between automatic and perceptual assessment of four voice and four speech criteria. The regression was trained with 21 men and 62 women (average age 49.2 years) and tested with another set of 24 men and 49 women (48.3 years), all suffering from chronic hoarseness. They read the text 'Der Nordwind und die Sonne' ('The North Wind and the Sun'). Five voice and speech therapists evaluated the data on 5-point Likert scales. Ten prosodic and recognition accuracy measures (features) were identified which describe all the examined criteria. Inter-rater correlation within the expert group was between r = 0.63 for the criterion 'match of breath and sense units' and r = 0.87 for the overall voice quality. Human-machine correlation was between r = 0.40 for the match of breath and sense units and r = 0.82 for intelligibility. The perceptual ratings of different criteria were highly correlated with each other. Likewise, the feature sets modeling the criteria were very similar. The automatic method is suitable for assessing chronic hoarseness in general and for subgroups of functional and organic dysphonia. In its current version, it is almost as reliable as a randomly picked rater from a group of voice and speech therapists.

  17. Non-destructive analysis of museum objects by fibre-optic Raman spectroscopy.

    Science.gov (United States)

    Vandenabeele, Peter; Tate, Jim; Moens, Luc

    2007-02-01

    Raman spectroscopy is a versatile technique that has frequently been applied for the investigation of art objects. By using mobile Raman instrumentation it is possible to investigate the artworks without the need for sampling. This work evaluates the use of a dedicated mobile spectrometer for the investigation of a range of museum objects in museums in Scotland, including antique Egyptian sarcophagi, a panel painting, painted surfaces on paper and textile, and the painted lid and soundboard of an early keyboard instrument. The investigations of these artefacts illustrate some analytical challenges that arise when analysing museum objects, including fluorescing varnish layers, ambient sunlight, large dimensions of artefacts and the need to handle fragile objects with care. Analysis of the musical instrument (the Mar virginals) was undertaken in the exhibition gallery, while on display, which meant that interaction with the public and health and safety issues had to be taken into account. (Figure: experimental set-up for the non-destructive Raman spectroscopic investigation of a textile banner in the National Museums of Scotland.)

  18. Benchmarking the Applicability of Ontology in Geographic Object-Based Image Analysis

    Directory of Open Access Journals (Sweden)

    Sachit Rajbhandari

    2017-11-01

    In Geographic Object-based Image Analysis (GEOBIA), identification of image objects is normally achieved using rule-based classification techniques supported by appropriate domain knowledge. However, GEOBIA currently lacks a systematic method to formalise the domain knowledge required for image object identification. Ontology provides a representation vocabulary for characterising domain-specific classes. This study proposes an ontological framework that conceptualises domain knowledge in order to support the application of rule-based classifications. The proposed ontological framework is tested with a landslide case study. The Web Ontology Language (OWL) is used to construct an ontology in the landslide domain. The segmented image objects with extracted features are incorporated into the ontology as instances. The classification rules are written in Semantic Web Rule Language (SWRL) and executed using a semantic reasoner to assign instances to appropriate landslide classes. Machine learning techniques are used to predict new threshold values for feature attributes in the rules. Our framework is compared with published work on landslide detection where ontology was not used for the image classification. Our results demonstrate that a classification derived from the ontological framework accords with non-ontological methods. This study benchmarks the ontological method, providing an alternative approach for image classification in the case study of landslides.
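
The rule-execution step can be pictured with a toy stand-in for the SWRL rules: each segmented image object carries feature attributes, and the first rule whose (machine-learning-tuned) thresholds it satisfies assigns its class. The features, thresholds, and class names below are invented for illustration:

```python
# Ordered rules: (class label, predicate over an object's feature dict).
# Thresholds stand in for values a learner would tune.
RULES = [
    ("landslide",  lambda o: o["slope"] > 15 and o["ndvi"] < 0.2 and o["brightness"] > 120),
    ("bare_soil",  lambda o: o["ndvi"] < 0.2),
    ("vegetation", lambda o: o["ndvi"] >= 0.2),
]

def classify(obj):
    """Assign a segmented image object to the first matching class."""
    for label, rule in RULES:
        if rule(obj):
            return label
    return "unclassified"
```

A semantic reasoner plays the same role over OWL instances, but with the rules expressed declaratively rather than as code.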

  19. Multi-objective Analysis for a Sequencing Planning of Mixed-model Assembly Line

    Science.gov (United States)

    Shimizu, Yoshiaki; Waki, Toshiya; Yoo, Jae Kyu

    Diversified customer demands are raising the importance of just-in-time and agile manufacturing more than ever before. Accordingly, the introduction of mixed-model assembly lines has become popular as a way to realize small-lot, multi-product production. Since various models are produced on the same assembly line, rational management is of special importance. With this point of view, this study focuses on a sequencing problem for a mixed-model assembly line that includes a paint line as its preceding process. By taking the paint line into account, reducing work-in-process (WIP) inventory between these heterogeneous lines becomes a major concern of the sequencing problem, besides improving production efficiency. We have formulated the sequencing problem as a bi-objective optimization problem that simultaneously prevents various line stoppages and reduces the volume of WIP inventory, and we have proposed a practical method for the multi-objective analysis. For this purpose, we applied the weighting method to derive the Pareto front, and solved the resulting problem with a meta-heuristic method, SA (Simulated Annealing). Through numerical experiments, we verified the validity of the proposed approach and discussed the significance of trade-off analysis between the conflicting objectives.
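
The weighting method plus simulated annealing can be sketched as below: the two objectives are scalarized as F = w*f1 + (1-w)*f2 and minimized over job sequences with swap moves, and sweeping w traces out the Pareto front. The cost functions in the usage are placeholders for the stoppage and WIP objectives, not the paper's formulation:

```python
import math
import random

def anneal_sequence(seq, f1, f2, w, iters=2000, t0=1.0, seed=0):
    """Minimize w*f1(s) + (1-w)*f2(s) over permutations of seq
    using simulated annealing with random pairwise swaps."""
    rng = random.Random(seed)
    cost = lambda s: w * f1(s) + (1 - w) * f2(s)
    cur, cur_c = seq[:], cost(seq)
    best, best_c = cur[:], cur_c
    for i in range(iters):
        t = t0 * (1 - i / iters) + 1e-9        # linear cooling schedule
        a, b = rng.randrange(len(cur)), rng.randrange(len(cur))
        cand = cur[:]
        cand[a], cand[b] = cand[b], cand[a]    # swap move
        c = cost(cand)
        # accept improvements always, worse moves with Boltzmann probability
        if c < cur_c or rng.random() < math.exp((cur_c - c) / t):
            cur, cur_c = cand, c
            if c < best_c:
                best, best_c = cand[:], c
    return best, best_c
```

Running this for several weights w in (0, 1) yields the set of non-dominated sequences used for the trade-off analysis.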

  20. Shape Analysis of Planar Multiply-Connected Objects Using Conformal Welding.

    Science.gov (United States)

    Lok Ming Lui; Wei Zeng; Shing-Tung Yau; Xianfeng Gu

    2014-07-01

    Shape analysis is a central problem in the field of computer vision. In 2D shape analysis, classification and recognition of objects from their observed silhouettes are extremely crucial but difficult. It usually involves an efficient representation of 2D shape space with a metric, so that its mathematical structure can be used for further analysis. Although the study of 2D simply-connected shapes has been the subject of an extensive literature, the analysis of multiply-connected shapes is comparatively less studied. In this work, we propose a representation for general 2D multiply-connected domains with arbitrary topologies using conformal welding. A metric can be defined on the proposed representation space, which gives a metric to measure dissimilarities between objects. The main idea is to map the exterior and interior of the domain conformally to unit disks and circle domains (a unit disk with several inner disks removed), using holomorphic 1-forms. A set of diffeomorphisms of the unit circle S^1 can be obtained, which together with the conformal modules are used to define the shape signature. A shape distance between shape signatures can be defined to measure dissimilarities between shapes. We prove theoretically that the proposed shape signature uniquely determines the multiply-connected objects under suitable normalization. We also introduce a reconstruction algorithm to obtain shapes from their signatures. This completes our framework and allows us to move back and forth between shapes and signatures. With that, a morphing algorithm between shapes can be developed through the interpolation of the Beltrami coefficients associated with the signatures. Experiments have been carried out on shapes extracted from real images. Results demonstrate the efficacy of our proposed algorithm as a stable shape representation scheme.

  1. Interaction between High-Level and Low-Level Image Analysis for Semantic Video Object Extraction

    Directory of Open Access Journals (Sweden)

    Andrea Cavallaro

    2004-06-01

    The task of extracting a semantic video object is split into two subproblems, namely, object segmentation and region segmentation. Object segmentation relies on a priori assumptions, whereas region segmentation is data-driven and can be solved in an automatic manner. These two subproblems are not mutually independent, and they can benefit from interactions with each other. In this paper, a framework for such interaction is formulated. This representation scheme based on region segmentation and semantic segmentation is compatible with the view that image analysis and scene understanding problems can be decomposed into low-level and high-level tasks. Low-level tasks pertain to region-oriented processing, whereas the high-level tasks are closely related to object-level processing. This approach emulates the human visual system: what one “sees” in a scene depends on the scene itself (region segmentation as well as on the cognitive task (semantic segmentation at hand. The higher-level segmentation results in a partition corresponding to semantic video objects. Semantic video objects do not usually have invariant physical properties and the definition depends on the application. Hence, the definition incorporates complex domain-specific knowledge and is not easy to generalize. For the specific implementation used in this paper, motion is used as a clue to semantic information. In this framework, an automatic algorithm is presented for computing the semantic partition based on color change detection. The change detection strategy is designed to be immune to the sensor noise and local illumination variations. The lower-level segmentation identifies the partition corresponding to perceptually uniform regions. These regions are derived by clustering in an N-dimensional feature space, composed of static as well as dynamic image attributes. We propose an interaction mechanism between the semantic and the region partitions which allows to

  2. Quantitative Analysis of Uncertainty in Medical Reporting: Creating a Standardized and Objective Methodology.

    Science.gov (United States)

    Reiner, Bruce I

    2018-04-01

    Uncertainty in text-based medical reports has long been recognized as problematic, frequently resulting in misunderstanding and miscommunication. One strategy for addressing the negative clinical ramifications of report uncertainty would be the creation of a standardized methodology for characterizing and quantifying uncertainty language, which could provide both the report author and reader with context related to the perceived level of diagnostic confidence and accuracy. A number of computerized strategies could be employed in the creation of this analysis including string search, natural language processing and understanding, histogram analysis, topic modeling, and machine learning. The derived uncertainty data offers the potential to objectively analyze report uncertainty in real time and correlate with outcomes analysis for the purpose of context and user-specific decision support at the point of care, where intervention would have the greatest clinical impact.
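
Of the computerized strategies listed, plain string search is the simplest to sketch. The hedge lexicon and its confidence weights below are invented for illustration; a production system would calibrate them against outcome data:

```python
import re

# Hypothetical hedging lexicon with illustrative uncertainty weights.
UNCERTAINTY_TERMS = {
    "possibly": 0.8,
    "may represent": 0.6,
    "suspicious for": 0.5,
    "consistent with": 0.3,
    "diagnostic of": 0.05,
}

def uncertainty_score(report):
    """Mean weight of matched hedge phrases; 0.0 when none are found."""
    text = report.lower()
    hits = [w for phrase, w in UNCERTAINTY_TERMS.items()
            if re.search(r"\b" + re.escape(phrase) + r"\b", text)]
    return sum(hits) / len(hits) if hits else 0.0
```

Such a score could be surfaced to the report reader in real time as a rough proxy for the author's diagnostic confidence.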

  3. Objective and quantitative analysis of daytime sleepiness in physicians after night duties.

    Science.gov (United States)

    Wilhelm, Barbara J; Widmann, Anja; Durst, Wilhelm; Heine, Christian; Otto, Gerhard

    2009-06-01

    Workplace studies often have the disadvantage of lacking objective data less prone to subject bias. The aim of this study was to contribute objective data to the discussion about safety aspects of night shifts in physicians. For this purpose we applied the Pupillographic Sleepiness Test (PST). The PST allows recording and analysis of sleepiness-related pupillary oscillations in darkness for 11 min in the sitting subject. The parameter of evaluation is the Pupillary Unrest Index (PUI; mm/min); for statistical analysis the natural logarithm of this parameter is used (lnPUI). Thirty-four physicians were examined by the PST and subjective scales during the first half of the day. Data taken during a day work period (D) were compared to those taken directly after night duty (N) by a Wilcoxon signed rank test. Night duty caused a mean sleep reduction of 3 h (difference N-D: median 3 h, minimum 0 h, maximum 7 h).

  4. Approaches to defining «financial potential» concept as of economic analysis object

    Directory of Open Access Journals (Sweden)

    O.M. Dzyubenkо

    2017-12-01

    The research analyzes the works of scientists who have studied financial potential as an economic category. By analyzing how these scientists approach the concept of "financial potential", the author identifies six approaches to the interpretation of its essence: the totality of the enterprise's financial resources; the sources of financing for the enterprise's economic activity; the development of the enterprise's economic activity; the enterprise's financial indicators; the system of enterprise financial management; and the characteristics of enterprise efficiency. It is established that financial potential is a multifaceted category that characterizes the financial and economic activity of enterprises. The author's definition of financial potential, in the context of its place among the objects of economic analysis, is proposed. It is established that financial potential is an object of enterprise management and is subject to analytical assessment to establish its state and directions of development.

  5. Spectral analysis of 87-lead body surface signal-averaged ECGs in patients with previous anterior myocardial infarction as a marker of ventricular tachycardia.

    Science.gov (United States)

    Hosoya, Y; Kubota, I; Shibata, T; Yamaki, M; Ikeda, K; Tomoike, H

    1992-06-01

    Few studies have examined the relation between the body surface distribution of high- and low-frequency components within the QRS complex and ventricular tachycardia (VT). Eighty-seven signal-averaged ECGs were obtained from 30 normal subjects (N group) and 30 patients with previous anterior myocardial infarction (MI) with VT (MI-VT[+] group, n = 10) or without VT (MI-VT[-] group, n = 20). The onset and offset of the QRS complex were determined from 87-lead root mean square values computed from the averaged (but not filtered) ECG waveforms. Fast Fourier transform analysis was performed on the signal-averaged ECGs. The resulting Fourier coefficients were attenuated by use of the transfer function, and the inverse transform was then computed for five frequency ranges (0-25, 25-40, 40-80, 80-150, and 150-250 Hz). From the QRS onset to the QRS offset, the time integration of the absolute value of the reconstructed waveforms was calculated for each of the five frequency ranges. The body surface distributions of these areas were expressed as QRS area maps. The maximal values of the QRS area maps were compared among the three groups. In the frequency ranges of 0-25 and 150-250 Hz, there were no significant differences in the maximal values among these three groups. Both MI groups had significantly smaller maximal values of QRS area maps in the frequency ranges of 25-40 and 40-80 Hz compared with the N group. The MI-VT(+) group had significantly smaller maximal values in the frequency ranges of 40-80 and 80-150 Hz than the MI-VT(-) group. These three groups were clearly differentiated by the maximal values of the 40-80-Hz QRS area map. It was suggested that the maximal value of the 40-80-Hz QRS area map was a new marker for VT after anterior MI.
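
The band-limited "QRS area" computation can be sketched for one lead: take the discrete Fourier transform, zero the coefficients outside the band, invert, and integrate the absolute value over the window. A naive O(n^2) DFT keeps the sketch dependency-free; the transfer-function attenuation step from the paper is omitted:

```python
import cmath

def band_area(signal, fs, f_lo, f_hi):
    """Time integral (in signal-units * seconds) of |x(t)| after
    restricting x to the band [f_lo, f_hi] Hz."""
    n = len(signal)
    # forward DFT, O(n^2) for clarity
    coeffs = [sum(signal[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
              for k in range(n)]
    for k in range(n):
        f = min(k, n - k) * fs / n          # frequency of bin k (conjugate-symmetric)
        if not (f_lo <= f <= f_hi):
            coeffs[k] = 0
    # inverse DFT and rectangle-rule integration of the magnitude
    recon = [abs(sum(coeffs[k] * cmath.exp(2j * cmath.pi * k * t / n) for k in range(n)) / n)
             for t in range(n)]
    return sum(recon) / fs
```

Repeating this per lead over the QRS window yields one band-specific area map across the 87 electrodes.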

  6. Cloning and Functional Analysis of cDNAs with Open Reading Frames for 300 Previously Undefined Genes Expressed in CD34+ Hematopoietic Stem/Progenitor Cells

    Science.gov (United States)

    Zhang, Qing-Hua; Ye, Min; Wu, Xin-Yan; Ren, Shuang-Xi; Zhao, Meng; Zhao, Chun-Jun; Fu, Gang; Shen, Yu; Fan, Hui-Yong; Lu, Gang; Zhong, Ming; Xu, Xiang-Ru; Han, Ze-Guang; Zhang, Ji-Wang; Tao, Jiong; Huang, Qiu-Hua; Zhou, Jun; Hu, Geng-Xi; Gu, Jian; Chen, Sai-Juan; Chen, Zhu

    2000-01-01

    Three hundred cDNAs containing putatively entire open reading frames (ORFs) for previously undefined genes were obtained from CD34+ hematopoietic stem/progenitor cells (HSPCs), based on EST cataloging, clone sequencing, in silico cloning, and rapid amplification of cDNA ends (RACE). The cDNA sizes ranged from 360 to 3496 bp and their ORFs coded for peptides of 58–752 amino acids. Public database search indicated that 225 cDNAs exhibited sequence similarities to genes identified across a variety of species. Homology analysis led to the recognition of 50 basic structural motifs/domains among these cDNAs. Genomic exon–intron organization could be established in 243 genes by integration of cDNA data with genome sequence information. Interestingly, a new gene named as HSPC070 on 3p was found to share a sequence of 105bp in 3′ UTR with RAF gene in reversed transcription orientation. Chromosomal localizations were obtained using electronic mapping for 192 genes and with radiation hybrid (RH) for 38 genes. Macroarray technique was applied to screen the gene expression patterns in five hematopoietic cell lines (NB4, HL60, U937, K562, and Jurkat) and a number of genes with differential expression were found. The resource work has provided a wide range of information useful not only for expression genomics and annotation of genomic DNA sequence, but also for further research on the function of genes involved in hematopoietic development and differentiation. [The sequence data described in this paper have been submitted to the GenBank data library under the accession nos. listed in Table 1, pp 1548–1552.] PMID:11042152

  7. Previous radiotherapy and the clinical activity and toxicity of pembrolizumab in the treatment of non-small-cell lung cancer: a secondary analysis of the KEYNOTE-001 phase 1 trial.

    Science.gov (United States)

    Shaverdian, Narek; Lisberg, Aaron E; Bornazyan, Krikor; Veruttipong, Darlene; Goldman, Jonathan W; Formenti, Silvia C; Garon, Edward B; Lee, Percy

    2017-07-01

    Preclinical studies have found radiotherapy enhances antitumour immune responses. We aimed to assess disease control and pulmonary toxicity in patients who previously received radiotherapy for non-small-cell lung cancer (NSCLC) before receiving pembrolizumab. We assessed patients with advanced NSCLC treated on the phase 1 KEYNOTE-001 trial at a single institution (University of California, Los Angeles, CA, USA). Patients were aged 18 years or older, had an Eastern Cooperative Oncology Group performance status of 1 or less, had adequate organ function, and no history of pneumonitis. Patients received pembrolizumab at a dose of either 2 mg/kg of bodyweight or 10 mg/kg every 3 weeks, or 10 mg/kg every 2 weeks, until disease progression, unacceptable toxicity, or other protocol-defined reasons for discontinuation. Disease response and pulmonary toxicity were prospectively assessed by Immune-related Response Criteria and Common Terminology Criteria for Adverse Events version 4.0. The primary objective of the KEYNOTE-001 trial was to assess the safety, side-effect profile, and antitumour activity of pembrolizumab. For our secondary analysis, patients were divided into subgroups to compare patients who previously received radiotherapy with patients who had not. Our primary objective was to determine whether previous radiotherapy affected progression-free survival, overall survival, and pulmonary toxicity in the intention-to-treat population. The KEYNOTE-001 trial was registered with ClinicalTrials.gov, number NCT01295827. Between May 22, 2012, and July 11, 2014, 98 patients were enrolled and received their first cycle of pembrolizumab. One patient was lost to follow-up. 42 (43%) of 97 patients had previously received any radiotherapy for the treatment of NSCLC before the first cycle of pembrolizumab. 38 (39%) of 97 patients received extracranial radiotherapy and 24 (25%) of 97 patients received thoracic radiotherapy. Median follow-up for surviving patients was 32·5

  8. Application of LC-MS to the analysis of dyes in objects of historical interest

    Science.gov (United States)

    Zhang, Xian; Laursen, Richard

    2009-07-01

    High-performance liquid chromatography (HPLC) with photodiode array and mass spectrometric detection permits dyes extracted from objects of historical interest or from natural plant or animal dyestuffs to be characterized on the basis of three orthogonal properties: HPLC retention time, UV-visible spectrum and molecular mass. In the present study, we have focused primarily on yellow dyes, the bulk of which are flavonoid glycosides that would be almost impossible to characterize without mass spectrometric detection. Also critical for this analysis is a method for mild extraction of the dyes from objects (e.g., textiles) without hydrolyzing the glycosidic linkages. This was accomplished using 5% formic acid in methanol, rather than the more traditional 6 M HCl. Mass spectrometry, besides providing the molecular mass of the dye molecule, sometimes yields additional structural data based on fragmentation patterns. In addition, coeluting compounds can often be detected using extracted ion chromatography. The utility of mass spectrometry is illustrated by the analysis of historical specimens of silk that had been dyed yellow with flavonoid glycosides from Sophora japonica (pagoda tree) and curcumins from Curcuma longa (turmeric). In addition, we have used these techniques to identify the dye type, and sometimes the specific dyestuff, in a variety of objects, including a yellow varnish from a 19th century Tibetan altar and 3000-year-old wool mortuary textiles from Xinjiang, China. We are using HPLC with diode array and mass spectrometric detection to create a library of analyzed dyestuffs (>200 so far; mostly plants) to serve as references for identification of dyes in objects of historical interest.

  9. Design of an Object-Oriented Turbomachinery Analysis Code: Initial Results

    Science.gov (United States)

    Jones, Scott M.

    2015-01-01

    Performance prediction of turbomachines is a significant part of aircraft propulsion design. In the conceptual design stage, there is an important need to quantify compressor and turbine aerodynamic performance and develop initial geometry parameters at the 2-D level prior to more extensive Computational Fluid Dynamics (CFD) analyses. The Object-oriented Turbomachinery Analysis Code (OTAC) is being developed to perform 2-D meridional flowthrough analysis of turbomachines using an implicit formulation of the governing equations to solve for the conditions at the exit of each blade row. OTAC is designed to perform meanline or streamline calculations; for streamline analyses simple radial equilibrium is used as a governing equation to solve for spanwise property variations. While the goal for OTAC is to allow simulation of physical effects and architectural features unavailable in other existing codes, it must first prove capable of performing calculations for conventional turbomachines. OTAC is being developed using the interpreted language features available in the Numerical Propulsion System Simulation (NPSS) code described by Claus et al (1991). Using the NPSS framework came with several distinct advantages, including access to the pre-existing NPSS thermodynamic property packages and the NPSS Newton-Raphson solver. The remaining objects necessary for OTAC were written in the NPSS framework interpreted language. These new objects form the core of OTAC and are the BladeRow, BladeSegment, TransitionSection, Expander, Reducer, and OTACstart Elements. The BladeRow and BladeSegment consumed the initial bulk of the development effort and required determining the equations applicable to flow through turbomachinery blade rows given specific assumptions about the nature of that flow. Once these objects were completed, OTAC was tested and found to agree with existing solutions from other codes; these tests included various meanline and streamline comparisons of axial

  10. Multi-object segmentation framework using deformable models for medical imaging analysis.

    Science.gov (United States)

    Namías, Rafael; D'Amato, Juan Pablo; Del Fresno, Mariana; Vénere, Marcelo; Pirró, Nicola; Bellemare, Marc-Emmanuel

    2016-08-01

    Segmenting structures of interest in medical images is an important step in different tasks such as visualization, quantitative analysis, simulation, and image-guided surgery, among several other clinical applications. Numerous segmentation methods have been developed in the past three decades for extraction of anatomical or functional structures on medical imaging. Deformable models, which include the active contour models or snakes, are among the most popular methods for image segmentation combining several desirable features such as inherent connectivity and smoothness. Even though different approaches have been proposed and significant work has been dedicated to the improvement of such algorithms, there are still challenging research directions as the simultaneous extraction of multiple objects and the integration of individual techniques. This paper presents a novel open-source framework called deformable model array (DMA) for the segmentation of multiple and complex structures of interest in different imaging modalities. While most active contour algorithms can extract one region at a time, DMA allows integrating several deformable models to deal with multiple segmentation scenarios. Moreover, it is possible to consider any existing explicit deformable model formulation and even to incorporate new active contour methods, allowing to select a suitable combination in different conditions. The framework also introduces a control module that coordinates the cooperative evolution of the snakes and is able to solve interaction issues toward the segmentation goal. Thus, DMA can implement complex object and multi-object segmentations in both 2D and 3D using the contextual information derived from the model interaction. These are important features for several medical image analysis tasks in which different but related objects need to be simultaneously extracted. Experimental results on both computed tomography and magnetic resonance imaging show that the proposed

  11. Using Item Analysis to Assess Objectively the Quality of the Calgary-Cambridge OSCE Checklist

    Directory of Open Access Journals (Sweden)

    Tyrone Donnon

    2011-06-01

    Background: The purpose of this study was to investigate the use of item analysis to objectively assess the quality of items on the Calgary-Cambridge communications OSCE checklist. Methods: A total of 150 first-year medical students were provided with extensive teaching on the use of the Calgary-Cambridge Guidelines for interviewing patients and participated in a final year-end 20-minute communication OSCE station. Grouped into either the upper half (50%) or lower half (50%) of communication skills performance, discrimination, difficulty, and point-biserial values were calculated for each checklist item. Results: The mean score on the 33-item communication checklist was 24.09 (SD = 4.46) and the internal reliability coefficient was α = 0.77. Although most of the items were found to have moderate (k = 12, 36%) or excellent (k = 10, 30%) discrimination values, 6 (18%) were identified as ‘fair’ and 3 (9%) as ‘poor’. A post-examination review focused on the item analysis findings resulted in an increase in checklist reliability (α = 0.80). Conclusions: Item analysis has been used extensively with MCQ exams. In this study, it was also found to be an objective and practical approach for evaluating the quality of a standardized OSCE checklist.
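
The three statistics used above — difficulty, upper/lower-half discrimination, and the point-biserial correlation — can be computed as follows. The split rule mirrors the study's 50/50 grouping; the sample responses and totals in the usage are invented:

```python
import math

def item_stats(responses, scores):
    """responses: 1/0 correct per examinee for one item;
    scores: total test score per examinee.
    Returns (difficulty, discrimination, point_biserial)."""
    n = len(responses)
    difficulty = sum(responses) / n
    # discrimination: item pass rate of top half minus bottom half by total score
    order = sorted(range(n), key=lambda i: scores[i], reverse=True)
    half = n // 2
    upper = sum(responses[i] for i in order[:half]) / half
    lower = sum(responses[i] for i in order[-half:]) / half
    discrimination = upper - lower
    # point-biserial: correlation of the 1/0 item response with total score
    mean_s = sum(scores) / n
    sd_s = math.sqrt(sum((s - mean_s) ** 2 for s in scores) / n)
    mean_1 = sum(s for r, s in zip(responses, scores) if r) / max(sum(responses), 1)
    p, q = difficulty, 1 - difficulty
    rpb = (mean_1 - mean_s) / sd_s * math.sqrt(p / q) if 0 < p < 1 and sd_s else 0.0
    return difficulty, discrimination, rpb
```

Items with low discrimination or low point-biserial values are the candidates flagged for review.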

  12. Weed mapping in early-season maize fields using object-based analysis of unmanned aerial vehicle (UAV) images.

    Science.gov (United States)

    Peña, José Manuel; Torres-Sánchez, Jorge; de Castro, Ana Isabel; Kelly, Maggi; López-Granados, Francisca

    2013-01-01

    The use of remote imagery captured by unmanned aerial vehicles (UAV) has tremendous potential for designing detailed site-specific weed control treatments in early post-emergence, which has not been possible previously with conventional airborne or satellite images. A robust and entirely automatic object-based image analysis (OBIA) procedure was developed on a series of UAV images, acquired with a six-band multispectral camera (visible and near-infrared range), with the ultimate objective of generating a weed map of an experimental maize field in Spain. The OBIA procedure combines several contextual, hierarchical and object-based features and consists of three consecutive phases: 1) classification of crop rows by application of a dynamic and auto-adaptive classification approach, 2) discrimination of crops and weeds on the basis of their relative positions with reference to the crop rows, and 3) generation of a weed infestation map in a grid structure. The estimation of weed coverage from the image analysis yielded satisfactory results. The relationship of estimated versus observed weed densities had a coefficient of determination of r² = 0.89 and a root mean square error of 0.02. A map of three categories of weed coverage was produced with 86% overall accuracy. In the experimental field, the area free of weeds was 23%, and the area with low weed coverage was 47%, which indicated a high potential for reducing herbicide applications or other weed operations. The OBIA procedure computes multiple data and statistics derived from the classification outputs, which permits calculation of herbicide requirements and estimation of the overall cost of weed management operations in advance.

  13. Mapping of crop calendar events by object-based analysis of MODIS and ASTER images

    Directory of Open Access Journals (Sweden)

    A.I. De Castro

    2014-06-01

    Full Text Available A method to generate crop calendar and phenology-related maps at the parcel level for four major irrigated crops (rice, maize, sunflower and tomato) is shown. The method combines images from the ASTER and MODIS sensors in an object-based image analysis framework, as well as testing three different fitting curves using the TIMESAT software. Average accuracy in the estimation of calendar dates was 85%, ranging from 92% for the estimation of emergence and harvest dates in rice to 69% for the harvest date in tomato.

  14. Analysis of Scattering by Inhomogeneous Dielectric Objects Using Higher-Order Hierarchical MoM

    DEFF Research Database (Denmark)

    Kim, Oleksiy S.; Jørgensen, Erik; Meincke, Peter

    2003-01-01

    An efficient technique for the analysis of electromagnetic scattering by arbitrary shaped inhomogeneous dielectric objects is presented. The technique is based on a higher-order method of moments (MoM) solution of the volume integral equation. This higher-order MoM solution comprises recently...... that the condition number of the resulting MoM matrix is reduced by several orders of magnitude in comparison to existing higher-order hierarchical basis functions and, consequently, an iterative solver can be applied even for high expansion orders. Numerical results demonstrate excellent agreement...

  15. Object-Oriented Programming in the Development of Containment Analysis Code

    International Nuclear Information System (INIS)

    Han, Tae Young; Hong, Soon Joon; Hwang, Su Hyun; Lee, Byung Chul; Byun, Choong Sup

    2009-01-01

    After the mid-1980s, a new programming concept, Object-Oriented Programming (OOP), was introduced and designed, with features such as information hiding, encapsulation, modularity and inheritance. These offered a much more convenient programming paradigm to code developers. The OOP concept was readily developed into programming languages such as C++ in the 1990s and is now widely used in the modern software industry. In this paper, we show that the OOP concept is successfully applicable to the development of a safety analysis code for containment and propose a more explicit and easy OOP design for developers.
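    The OOP features the abstract lists (information hiding, encapsulation, modularity, inheritance) can be sketched for a containment component in a few lines. Python is used here for brevity rather than the paper's language, and the class names and the leak model are purely illustrative, not taken from the actual code:

```python
class Volume:
    """Base class: encapsulates thermodynamic state behind methods
    (information hiding -- callers never touch the fields directly)."""
    def __init__(self, pressure, temperature):
        self._pressure = pressure
        self._temperature = temperature

    @property
    def pressure(self):
        return self._pressure

    def advance(self, dt):
        raise NotImplementedError   # each derived component supplies its physics

class ContainmentCompartment(Volume):
    """Derived class: inherits state handling, overrides the physics."""
    def __init__(self, pressure, temperature, leak_rate):
        super().__init__(pressure, temperature)
        self._leak_rate = leak_rate

    def advance(self, dt):
        # toy model: pressure relaxes toward ambient (1.0e5 Pa) through a leak
        self._pressure += -self._leak_rate * (self._pressure - 1.0e5) * dt

comp = ContainmentCompartment(pressure=3.0e5, temperature=400.0, leak_rate=0.1)
for _ in range(10):
    comp.advance(1.0)
```

    The benefit claimed in the paper is exactly this separation: new component types can be added by inheritance without modifying the solver that calls `advance`.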

  16. VAGINAL PROGESTERONE VERSUS CERVICAL CERCLAGE FOR THE PREVENTION OF PRETERM BIRTH IN WOMEN WITH A SONOGRAPHIC SHORT CERVIX, SINGLETON GESTATION, AND PREVIOUS PRETERM BIRTH: A SYSTEMATIC REVIEW AND INDIRECT COMPARISON META-ANALYSIS

    Science.gov (United States)

    CONDE-AGUDELO, Agustin; ROMERO, Roberto; NICOLAIDES, Kypros; CHAIWORAPONGSA, Tinnakorn; O'BRIEN, John M.; CETINGOZ, Elcin; DA FONSECA, Eduardo; CREASY, George; SOMA-PILLAY, Priya; FUSEY, Shalini; CAM, Cetin; ALFIREVIC, Zarko; HASSAN, Sonia S.

    2012-01-01

    OBJECTIVE No randomized controlled trial has directly compared vaginal progesterone and cervical cerclage for the prevention of preterm birth in women with a sonographic short cervix in the midtrimester, singleton gestation, and previous spontaneous preterm birth. We performed an indirect comparison of vaginal progesterone versus cerclage, using placebo/no cerclage as the common comparator. STUDY DESIGN Adjusted indirect meta-analysis of randomized controlled trials. RESULTS Four studies evaluating vaginal progesterone versus placebo (158 patients) and five evaluating cerclage versus no cerclage (504 patients) were included. Both interventions were associated with a statistically significant reduction in the risk of preterm birth <32 weeks of gestation and of composite perinatal morbidity and mortality compared with placebo/no cerclage. Adjusted indirect meta-analyses did not show statistically significant differences between vaginal progesterone and cerclage in reducing preterm birth or adverse perinatal outcomes. CONCLUSION Based on state-of-the-art methodology for indirect comparisons, vaginal progesterone and cerclage appear equally efficacious in the prevention of preterm birth in women with a sonographic short cervix in the midtrimester, singleton gestation, and previous preterm birth. The selection of the optimal treatment may depend upon adverse events, cost, and patient/clinician preferences. PMID:23157855
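    An adjusted indirect comparison of this kind (the Bucher method) subtracts the two interventions' log effect estimates against the common comparator and combines their standard errors. A sketch with made-up numbers, not the review's data:

```python
import math

def bucher_indirect(rr_a, ci_a, rr_b, ci_b):
    """Adjusted indirect comparison (Bucher method) of interventions A and B,
    each compared against the same common comparator (placebo/no treatment).
    rr_*: pooled relative risk vs the comparator; ci_*: (low, high) 95% CI.
    Returns the indirect RR of A vs B with its 95% CI."""
    log_a, log_b = math.log(rr_a), math.log(rr_b)
    # back out standard errors from the 95% CIs (CI width = 2 * 1.96 * SE on log scale)
    se_a = (math.log(ci_a[1]) - math.log(ci_a[0])) / (2 * 1.96)
    se_b = (math.log(ci_b[1]) - math.log(ci_b[0])) / (2 * 1.96)
    log_ab = log_a - log_b
    se_ab = math.sqrt(se_a ** 2 + se_b ** 2)   # variances add for the difference
    return math.exp(log_ab), (math.exp(log_ab - 1.96 * se_ab),
                              math.exp(log_ab + 1.96 * se_ab))

# invented example: both interventions reduce risk vs comparator,
# but the indirect A-vs-B comparison is not significant (CI spans 1)
rr, ci = bucher_indirect(0.5, (0.25, 1.0), 0.6, (0.36, 1.0))
```

    The widening of the confidence interval (the two SEs add in quadrature) is why indirect comparisons so often fail to separate two individually effective treatments, as in the record above.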

  17. Commercial objectives, technology transfer, and systems analysis for fusion power development

    Science.gov (United States)

    Dean, Stephen O.

    1988-09-01

    Fusion is an inexhaustible source of energy that has the potential for economic commercial applications with excellent safety and environmental characteristics. The primary focus for the fusion energy development program is the generation of central station electricity. Fusion has the potential, however, for many other applications. The fact that a large fraction of the energy released in a DT fusion reaction is carried by high energy neutrons suggests potentially unique applications. In addition, fusion R and D will lead to new products and new markets. Each fusion application must meet certain standards of economic and safety and environmental attractiveness. For this reason, economics on the one hand, and safety and environment and licensing on the other, are the two primary criteria for setting long range commercial fusion objectives. A major function of systems analysis is to evaluate the potential of fusion against these objectives and to help guide the fusion R and D program toward practical applications. The transfer of fusion technology and skills from the national labs and universities to industry is the key to achieving the long range objective of commercial fusion applications.

  18. Commercial objectives, technology transfer, and systems analysis for fusion power development

    Science.gov (United States)

    Dean, Stephen O.

    1988-03-01

    Fusion is an essentially inexhaustible source of energy that has the potential for economically attractive commercial applications with excellent safety and environmental characteristics. The primary focus for the fusion-energy development program is the generation of centralstation electricity. Fusion has the potential, however, for many other applications. The fact that a large fraction of the energy released in a DT fusion reaction is carried by high-energy neutrons suggests potentially unique applications. These include breeding of fissile fuels, production of hydrogen and other chemical products, transmutation or “burning” of various nuclear or chemical wastes, radiation processing of materials, production of radioisotopes, food preservation, medical diagnosis and medical treatment, and space power and space propulsion. In addition, fusion R&D will lead to new products and new markets. Each fusion application must meet certain standards of economic and safety and environmental attractiveness. For this reason, economics on the one hand, and safety and environment and licensing on the other hand, are the two primary criteria for setting long-range commercial fusion objectives. A major function of systems analysis is to evaluate the potential of fusion against these objectives and to help guide the fusion R&D program toward practical applications. The transfer of fusion technology and skills from the national laboratories and universities to industry is the key to achieving the long-range objective of commercial fusion applications.

  19. Statistical motion vector analysis for object tracking in compressed video streams

    Science.gov (United States)

    Leny, Marc; Prêteux, Françoise; Nicholson, Didier

    2008-02-01

    Compressed video is the digital raw material provided by video-surveillance systems and used for archiving and indexing purposes. Multimedia standards therefore have a direct impact on such systems. While MPEG-2 used to be the coding standard, MPEG-4 (part 2) has now replaced it in most installations, and MPEG-4 AVC/H.264 solutions are now being released. Finely analysing such complex and rich MPEG-4 streams is the challenging issue addressed in this paper. The system we designed is based on five modules: low-resolution decoder, motion estimation generator, object motion filtering, low-resolution object segmentation, and cooperative decision. Our contributions concern the statistical analysis of the spatial distribution of the motion vectors, the computation of DCT-based confidence maps, the automatic detection of motion activity in the compressed file, and a rough indexation by dedicated descriptors. The robustness and accuracy of the system are evaluated on a large corpus (hundreds of hours of indoor and outdoor videos with pedestrians and vehicles). Objective benchmarking of the performance is carried out with respect to five metrics that estimate the error contribution of each module, for different implementations. This evaluation establishes that our system analyses up to 200 frames (720x288) per second (2.66 GHz CPU).
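    The first named contribution, statistical analysis of the motion-vector field decoded from the stream, can be illustrated with a crude robust-statistics filter. This is a sketch of the general idea only, not the paper's actual module:

```python
import numpy as np

def moving_object_mask(mv, k=2.0):
    """mv: (H, W, 2) array of block motion vectors decoded from a compressed
    stream. Flags blocks whose vector magnitude departs from the global median
    by more than k median-absolute-deviations -- a crude separation of object
    motion from camera/encoder-noise motion."""
    mag = np.linalg.norm(mv, axis=2)
    med = np.median(mag)
    mad = np.median(np.abs(mag - med)) + 1e-9   # guard against all-zero fields
    return np.abs(mag - med) > k * mad

# toy field: static background with one 2x2 block of moving macroblocks
field = np.zeros((8, 8, 2))
field[2:4, 2:4] = (5.0, 0.0)
mask = moving_object_mask(field)
```

    In a full pipeline such a mask would be refined by the DCT-based confidence maps and the segmentation module before the cooperative decision stage.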

  20. Approach to proliferation risk assessment based on multiple objective analysis framework

    Energy Technology Data Exchange (ETDEWEB)

    Andrianov, A.; Kuptsov, I. [Obninsk Institute for Nuclear Power Engineering of NNRU MEPhI (Russian Federation); Studgorodok 1, Obninsk, Kaluga region, 249030 (Russian Federation)

    2013-07-01

    The approach to the assessment of proliferation risk using the methods of multi-criteria decision making and multi-objective optimization is presented. The approach allows taking into account the specific features of the national nuclear infrastructure and possible proliferation strategies (motivations, intentions, and capabilities). Three examples of applying the approach are shown. First, the approach has been used to evaluate the attractiveness of HEU (highly enriched uranium) production scenarios at a clandestine enrichment facility using centrifuge enrichment technology. Second, the approach has been applied to assess the attractiveness of scenarios for undeclared production of plutonium or HEU by theft of materials circulating in nuclear fuel cycle facilities and thermal reactors. Third, the approach has been used to perform a comparative analysis of the structures of developing nuclear power systems based on different types of nuclear fuel cycles, the analysis being based on indicators of proliferation risk.
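    Multi-objective comparison of options, as used in the record above, ultimately rests on identifying non-dominated alternatives. A minimal sketch (all objectives to be minimised; the option data are invented, not from the study):

```python
def pareto_front(points):
    """Return the non-dominated subset of 'points', with every objective to be
    minimised (e.g. proliferation-risk indicators of candidate fuel cycle options)."""
    def dominates(q, p):
        # q dominates p if q is no worse in every objective and strictly better in one
        return all(qi <= pi for qi, pi in zip(q, p)) and \
               any(qi < pi for qi, pi in zip(q, p))
    return [p for p in points if not any(dominates(q, p) for q in points)]

# toy 2-objective example: (3, 3) and (4, 4) are dominated by (2, 2)
options = [(1, 5), (2, 2), (5, 1), (3, 3), (4, 4)]
front = pareto_front(options)
```

    Multi-criteria decision-making methods then rank or filter the options on this front using stakeholder weights and thresholds.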

  1. A descriptive analysis of quantitative indices for multi-objective block layout

    Directory of Open Access Journals (Sweden)

    Amalia Medina Palomera

    2013-01-01

    Full Text Available Layout generation methods provide alternative solutions whose feasibility and quality must be evaluated. Indices must be used to distinguish among the feasible solutions (involving different criteria) obtained for block layout and to identify a solution's suitability according to the set objectives. This paper provides an accurate and descriptive analysis of the geometric indices used in designing facility layouts (during the block layout phase). The indices studied here have advantages and disadvantages which should be considered by an analyst before attempting to resolve the facility layout problem. New equations are proposed for measuring geometric indices. The analysis revealed redundant indices and showed that a minimum number of indices covering the overall quality criteria may be used when selecting alternative solutions.

  2. Approach to proliferation risk assessment based on multiple objective analysis framework

    International Nuclear Information System (INIS)

    Andrianov, A.; Kuptsov, I.

    2013-01-01

    The approach to the assessment of proliferation risk using the methods of multi-criteria decision making and multi-objective optimization is presented. The approach allows taking into account the specific features of the national nuclear infrastructure and possible proliferation strategies (motivations, intentions, and capabilities). Three examples of applying the approach are shown. First, the approach has been used to evaluate the attractiveness of HEU (highly enriched uranium) production scenarios at a clandestine enrichment facility using centrifuge enrichment technology. Second, the approach has been applied to assess the attractiveness of scenarios for undeclared production of plutonium or HEU by theft of materials circulating in nuclear fuel cycle facilities and thermal reactors. Third, the approach has been used to perform a comparative analysis of the structures of developing nuclear power systems based on different types of nuclear fuel cycles, the analysis being based on indicators of proliferation risk.

  3. Prioritization of buffer areas with multi objective analysis: application in the Basin Creek St. Helena

    International Nuclear Information System (INIS)

    Zuluaga, Julian; Carvajal, Luis Fernando

    2006-01-01

    This paper applies a Multi-Objective Analysis (AMO, using ELECTRE III) with a Geographical Information System (GIS) to establish priorities among buffer zones on the drainage network of the Santa Elena Creek, in the middle-east zone of Medellin. Thirty-eight alternatives (small catchments) are evaluated against seven criteria derived from field work and maps: susceptibility to mass sliding, surface and linear erosion, conflict in land use, and the state of the waterway network with respect to hydrology, geology and human impact. The ELECTRE III method allows priorities among buffer zones to be established for each catchment; the indifference, acceptance, veto, and credibility threshold values, as well as the criteria weighting factors, are very important. The results show that the north zone of the catchment, commune 8, and in particular La Castro creek, is the most affected. The sensitivity analysis shows that the obtained solution is robust and that the anthropic and geologic criteria are paramount.

  4. Efficacy and safety of bevacizumab plus chemotherapy compared to chemotherapy alone in previously untreated advanced or metastatic colorectal cancer: a systematic review and meta-analysis

    International Nuclear Information System (INIS)

    Botrel, Tobias Engel Ayer; Clark, Luciana Gontijo de Oliveira; Paladini, Luciano; Clark, Otávio Augusto C.

    2016-01-01

    Colorectal cancer (CRC) is the fourth most frequently diagnosed cancer and the second leading cause of neoplasm-related death in the United States. Several studies have analyzed the efficacy of bevacizumab combined with different chemotherapy regimens consisting of drugs such as 5-FU, capecitabine, irinotecan and oxaliplatin. This systematic review aims to evaluate the effectiveness and safety of chemotherapy plus bevacizumab versus chemotherapy alone in patients with previously untreated advanced or metastatic colorectal cancer (mCRC). Several databases were searched, including MEDLINE, EMBASE, LILACS, and CENTRAL. The primary endpoints were overall survival and progression-free survival. Data extracted from the studies were combined by using hazard ratios (HR) or risk ratios (RR) with their corresponding 95 % confidence intervals (95 % CI). The final analysis included 9 trials comprising 3,914 patients. Patients who received the combined treatment (chemotherapy + bevacizumab) had higher response rates (RR = 0.89; 95 % CI: 0.82 to 0.96; p = 0.003) with heterogeneity, higher progression-free survival (HR = 0.69; 95 % CI: 0.63 to 0.75; p < 0.00001) and also higher overall survival rates (HR = 0.87; 95 % CI: 0.80 to 0.95; p = 0.002) with moderate heterogeneity. Regarding adverse events and severe toxicities (grade ≥ 3), the group receiving the combined therapy had higher rates of hypertension (RR = 3.56; 95 % CI: 2.58 to 4.92; p < 0.00001), proteinuria (RR = 1.89; 95 % CI: 1.26 to 2.84; p = 0.002), gastrointestinal perforation (RR = 3.63; 95 % CI: 1.31 to 10.09; p = 0.01), any thromboembolic events (RR = 1.44; 95 % CI: 1.20 to 1.73; p = 0.0001), and bleeding (RR = 1.81; 95 % CI: 1.22 to 2.67; p = 0.003). The combination of chemotherapy with bevacizumab increased the response rate, progression-free survival and overall survival of patients with mCRC without prior chemotherapy.
The results of progression-free survival (PFS) and overall survival (OS) were comparatively higher
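    Pooled hazard ratios such as those above are conventionally obtained by inverse-variance weighting on the log scale. A fixed-effect sketch with invented trial numbers (not the review's data):

```python
import math

def pooled_hazard_ratio(hrs, cis):
    """Fixed-effect inverse-variance pooling of per-trial hazard ratios.
    hrs: list of HRs; cis: list of (low, high) 95% CIs.
    Returns the pooled HR and its 95% CI."""
    logs, weights = [], []
    for hr, (lo, hi) in zip(hrs, cis):
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE from CI width on log scale
        weights.append(1.0 / se ** 2)                     # inverse-variance weight
        logs.append(math.log(hr))
    pooled = sum(w * l for w, l in zip(weights, logs)) / sum(weights)
    se_p = math.sqrt(1.0 / sum(weights))
    return math.exp(pooled), (math.exp(pooled - 1.96 * se_p),
                              math.exp(pooled + 1.96 * se_p))

# two invented identical trials: pooling tightens the CI around HR = 0.7
hr, ci = pooled_hazard_ratio([0.7, 0.7], [(0.49, 1.0), (0.49, 1.0)])
```

    A random-effects model would additionally inflate each weight's variance by a between-trial heterogeneity term, which matters for outcomes reported above as heterogeneous.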

  5. Multi-objective experimental design for (13)C-based metabolic flux analysis.

    Science.gov (United States)

    Bouvin, Jeroen; Cajot, Simon; D'Huys, Pieter-Jan; Ampofo-Asiama, Jerry; Anné, Jozef; Van Impe, Jan; Geeraerd, Annemie; Bernaerts, Kristel

    2015-10-01

    (13)C-based metabolic flux analysis is an excellent technique to resolve fluxes in the central carbon metabolism, but costs can be significant when using specialized tracers. This work presents a framework for cost-effective design of (13)C-tracer experiments, illustrated on two different networks. Linear and non-linear optimal input mixtures are computed for networks of Streptomyces lividans and a carcinoma cell line. If only glucose tracers are considered as labeled substrate for a carcinoma cell line or S. lividans, the best parameter estimation accuracy is obtained by mixtures containing high amounts of 1,2-(13)C2 glucose combined with uniformly labeled glucose. Experimental designs are evaluated based on a linear (D-criterion) and a non-linear approach (S-criterion). Both approaches generate almost the same input mixture; however, the linear approach is favored due to its low computational effort. The high amount of 1,2-(13)C2 glucose in the optimal designs coincides with a high experimental cost, which is further increased when labeling is introduced in glutamine and aspartate tracers. Multi-objective optimization gives the possibility to assess experimental quality and cost at the same time and can reveal excellent compromise experiments. For example, the combination of 100% 1,2-(13)C2 glucose with 100% position-one-labeled glutamine and the combination of 100% 1,2-(13)C2 glucose with 100% uniformly labeled glutamine perform equally well for the carcinoma cell line, but the first mixture offers a decrease in cost of $120 per ml-scale cell culture experiment. We demonstrated the validity of a multi-objective linear approach to perform optimal experimental designs for the non-linear problem of (13)C-metabolic flux analysis. Tools and a workflow are provided to perform multi-objective design.
The effortless calculation of the D-criterion can be exploited to perform high-throughput screening of possible (13)C-tracers, while the illustrated benefit of multi-objective
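    The D-criterion referred to above scores a candidate tracer mixture by the determinant of (an approximation to) the Fisher information matrix built from flux sensitivities; larger determinants mean tighter joint confidence regions on the estimated fluxes. A toy sketch with invented sensitivity matrices, not a real labelling model:

```python
import numpy as np

def d_criterion(jacobian):
    """D-optimality score of a candidate tracer experiment: determinant of the
    approximate Fisher information J^T J, where J is the sensitivity matrix of
    measured labelling patterns w.r.t. the free fluxes. Larger is better."""
    info = jacobian.T @ jacobian
    return np.linalg.det(info)

# hypothetical comparison of two input mixtures: the second produces
# nearly collinear sensitivities, hence a much smaller determinant
J_good = np.array([[1.0, 0.1],
                   [0.2, 1.0],
                   [0.5, 0.5]])
J_poor = np.array([[1.0, 0.9],
                   [0.9, 1.0],
                   [0.5, 0.5]])
```

    Because this score is cheap to evaluate, it can be swept over many candidate mixtures, which is the "high-throughput screening" use mentioned in the abstract.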

  6. Robust object tracking techniques for vision-based 3D motion analysis applications

    Science.gov (United States)

    Knyaz, Vladimir A.; Zheltov, Sergey Y.; Vishnyakov, Boris V.

    2016-04-01

    Automated and accurate spatial motion capture of an object is necessary for a wide variety of applications in industry and science, virtual reality and film, medicine and sports. For most applications, the reliability and accuracy of the acquired data, as well as convenience for the user, are the main characteristics defining the quality of a motion capture system. Among the existing systems for 3D data acquisition, based on different physical principles (accelerometry, magnetometry, time-of-flight, vision-based), optical motion capture systems have a set of advantages such as high acquisition speed and the potential for high accuracy and automation based on advanced image processing algorithms. For vision-based motion capture, accurate and robust detection and tracking of object features through the video sequence are the key elements, along with the level of automation of the capture process. To provide high accuracy of the acquired spatial data, the developed vision-based motion capture system "Mosca" is based on photogrammetric principles of 3D measurement and supports high-speed image acquisition in synchronized mode. It includes from 2 to 4 technical vision cameras for capturing video sequences of object motion. The original camera calibration and external orientation procedures provide the basis for highly accurate 3D measurements. A set of algorithms has been developed and tested, both for detecting, identifying and tracking similar targets and for marker-less object motion capture. The evaluation results show high robustness and reliability of the algorithms for various motion analysis tasks in technical and biomechanical applications.

  7. Object selection costs in visual working memory: A diffusion model analysis of the focus of attention.

    Science.gov (United States)

    Sewell, David K; Lilburn, Simon D; Smith, Philip L

    2016-11-01

    A central question in working memory research concerns the degree to which information in working memory is accessible to other cognitive processes (e.g., decision-making). Theories assuming that the focus of attention can only store a single object at a time require the focus to orient to a target representation before further processing can occur. The need to orient the focus of attention implies that single-object accounts typically predict response time costs associated with object selection even when working memory is not full (i.e., memory load is less than 4 items). For other theories that assume storage of multiple items in the focus of attention, predictions depend on specific assumptions about the way resources are allocated among items held in the focus, and how this affects the time course of retrieval of items from the focus. These broad theoretical accounts have been difficult to distinguish because conventional analyses fail to separate components of empirical response times related to decision-making from components related to selection and retrieval processes associated with accessing information in working memory. To better distinguish these response time components from one another, we analyze data from a probed visual working memory task using extensions of the diffusion decision model. Analysis of model parameters revealed that increases in memory load resulted in (a) reductions in the quality of the underlying stimulus representations in a manner consistent with a sample size model of visual working memory capacity and (b) systematic increases in the time needed to selectively access a probed representation in memory. The results are consistent with single-object theories of the focus of attention. The results are also consistent with a subset of theories that assume a multiobject focus of attention in which resource allocation diminishes both the quality and accessibility of the underlying representations. 
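    The diffusion decision model used in this analysis decomposes a response time into noisy evidence accumulation toward one of two boundaries plus a non-decision time (the component in which memory-selection costs would appear). A minimal single-trial simulator; all parameter values are illustrative, not the study's estimates:

```python
import random

def simulate_diffusion(drift, threshold=1.0, noise=1.0, dt=0.001, t0=0.3, seed=None):
    """Simulate one trial of a two-boundary diffusion decision process:
    evidence starts at 0 and accumulates with rate 'drift' plus Gaussian noise
    until it crosses +threshold (correct) or -threshold (error).
    Returns (response_time, correct); t0 is the non-decision time."""
    rng = random.Random(seed)
    x, t = 0.0, 0.0
    sqrt_dt = dt ** 0.5
    while abs(x) < threshold:
        x += drift * dt + noise * sqrt_dt * rng.gauss(0.0, 1.0)
        t += dt
    return t0 + t, x >= threshold

# higher drift (better representation quality) gives faster, more accurate trials;
# a memory-load effect would show up as lower drift and/or longer t0
hi = [simulate_diffusion(3.0, seed=s) for s in range(200)]
lo = [simulate_diffusion(0.5, seed=s) for s in range(200)]
acc_hi = sum(c for _, c in hi) / 200
acc_lo = sum(c for _, c in lo) / 200
rt_hi = sum(t for t, _ in hi) / 200
rt_lo = sum(t for t, _ in lo) / 200
```

    Fitting such a model to observed RT distributions is what lets the study separate representation quality (drift) from selection/access time (non-decision components).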

  8. Simulation of multicomponent light source for optical-electronic system of color analysis objects

    Science.gov (United States)

    Peretiagin, Vladimir S.; Alekhin, Artem A.; Korotaev, Valery V.

    2016-04-01

    Development of lighting technology has made it possible to use LEDs in specialized devices for outdoor, industrial (decorative and accent) and domestic lighting. In addition, LEDs and devices based on them are widely used for solving particular problems. For example, LED devices are widely used for the lighting of vegetables and fruit (for their sorting or growing), textile products (for quality control), minerals (for their sorting), etc. The reasons for the active introduction of LED technology into different systems, including optical-electronic devices and systems, are the large choice of emission colors and LED structures, which define the spatial, power, thermal and other parameters. Furthermore, multi-element, color-adjustable lighting devices can be designed and implemented using LEDs. However, devices based on LEDs require more attention when a particular energy or color distribution must be provided over the whole work area (area of analysis or observation) or surface of the object. This paper proposes a method for the theoretical modeling of such lighting devices. The authors present models of an RGB multicomponent light source applied to an optical-electronic system for the color analysis of mineral objects. The possibility of forming illumination of the work area that is uniform and homogeneous in energy and color is presented for this system. The authors also show how the parameters and characteristics of the optical radiation receiver of the optical-electronic system affect the energy, spatial, spectral and colorimetric properties of a multicomponent light source.

  9. GRAIN-SIZE MEASUREMENTS OF FLUVIAL GRAVEL BARS USING OBJECT-BASED IMAGE ANALYSIS

    Directory of Open Access Journals (Sweden)

    Pedro Castro

    2018-01-01

    Full Text Available Traditional techniques for classifying the average grain size in gravel bars require manual measurement of each grain's diameter. Aiming at higher productivity, more efficient methods have been developed by applying remote sensing techniques and digital image processing. This research proposes an Object-Based Image Analysis methodology to classify gravel bars in fluvial channels. First, the study evaluates the performance of the multiresolution segmentation algorithm (available in the software eCognition Developer) in performing shape recognition. A linear regression model was applied to assess the correlation between the gravels' reference delineation and the gravels recognized by the segmentation algorithm. Furthermore, the supervised classification was validated by comparing the results with field data using the t-statistic test and the kappa index. Afterwards, the grain size distribution in gravel bars along the upper Bananeiras River, Brazil, was mapped. The multiresolution segmentation results did not prove to be consistent across all the samples. Nonetheless, the P01 sample showed an R² = 0.82 for the diameter estimation and R² = 0.45 for the recognition of the elliptical fit. The t-statistic showed no significant difference in the efficiencies of the grain size classifications by the field survey data and the object-based supervised classification (t = 2.133 at a significance level of 0.05). However, the kappa index was 0.54. The analysis of both the segmentation and classification results did not prove to be replicable.
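    The kappa index of 0.54 reported above measures chance-corrected agreement between the field survey labels and the classification. A minimal sketch with toy labels (not the study's data):

```python
def cohen_kappa(a, b):
    """Cohen's kappa: chance-corrected agreement between two categorical
    labelings (e.g. field-survey vs object-based grain-size classes)."""
    n = len(a)
    cats = set(a) | set(b)
    po = sum(x == y for x, y in zip(a, b)) / n                   # observed agreement
    pe = sum((a.count(c) / n) * (b.count(c) / n) for c in cats)  # chance agreement
    return (po - pe) / (1 - pe)

# toy example: 4 of 6 labels agree, but half of that is expected by chance
kappa = cohen_kappa([0, 0, 0, 1, 1, 1], [0, 0, 1, 1, 1, 0])
```

    This is why a kappa of 0.54 counts only as moderate agreement even though raw per-sample accuracy can look considerably higher.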

  10. Transferability of Object-Oriented Image Analysis Methods for Slum Identification

    Directory of Open Access Journals (Sweden)

    Alfred Stein

    2013-08-01

    Full Text Available Updated spatial information on the dynamics of slums can be helpful to measure and evaluate progress of policies. Earlier studies have shown that semi-automatic detection of slums using remote sensing can be challenging considering the large variability in definition and appearance. In this study, we explored the potential of an object-oriented image analysis (OOA method to detect slums, using very high resolution (VHR imagery. This method integrated expert knowledge in the form of a local slum ontology. A set of image-based parameters was identified that was used for differentiating slums from non-slum areas in an OOA environment. The method was implemented on three subsets of the city of Ahmedabad, India. Results show that textural features such as entropy and contrast derived from a grey level co-occurrence matrix (GLCM and the size of image segments are stable parameters for classification of built-up areas and the identification of slums. Relation with classified slum objects, in terms of enclosed by slums and relative border with slums was used to refine classification. The analysis on three different subsets showed final accuracies ranging from 47% to 68%. We conclude that our method produces useful results as it allows including location specific adaptation, whereas generically applicable rulesets for slums are still to be developed.
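    The GLCM-derived entropy and contrast reported above as stable parameters can be computed directly from a quantised image. A compact sketch using a single pixel offset and an illustrative quantisation (real OOA software typically averages several offsets and directions):

```python
import numpy as np

def glcm_features(img, levels=8, dy=0, dx=1):
    """Grey-level co-occurrence matrix for one pixel offset (dy, dx), and the
    two texture features used above to separate slum from non-slum segments:
    entropy and contrast."""
    img = np.asarray(img)
    # quantise intensities into 'levels' grey levels
    q = (img.astype(float) / (img.max() + 1e-9) * (levels - 1)).astype(int)
    glcm = np.zeros((levels, levels))
    h, w = q.shape
    for y in range(h - dy):
        for x in range(w - dx):
            glcm[q[y, x], q[y + dy, x + dx]] += 1
    p = glcm / glcm.sum()                                  # joint probabilities
    entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))
    i, j = np.indices(p.shape)
    contrast = np.sum(p * (i - j) ** 2)
    return entropy, contrast

# a flat patch has zero texture; a checkerboard has maximal local contrast
flat = np.ones((8, 8))
checker = np.indices((8, 8)).sum(axis=0) % 2
e_flat, c_flat = glcm_features(flat)
e_chk, c_chk = glcm_features(checker)
```

    High entropy and contrast flag the heterogeneous, fine-grained roof texture typical of informal settlements, as opposed to the smoother texture of planned built-up areas.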

  11. Objective Oriented Design of System Thermal Hydraulic Analysis Program and Verification of Feasibility

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Jeong, Jae Jun; Hwang, Moon Kyu

    2008-01-01

    System safety analysis codes, such as RELAP5, TRAC, CATHARE, etc., have been developed based on the Fortran language over the past few decades. Refactoring of conventional codes has also been performed to improve code readability and maintenance; the TRACE, RELAP5-3D and MARS codes are examples of these activities, redesigned to have modular structures utilizing Fortran 90 features. However, the programming paradigm in software technology has shifted to object-oriented programming (OOP), which is based on several techniques, including encapsulation, modularity, polymorphism, and inheritance. OOP was not commonly used in mainstream software application development until the early 1990s; many modern programming languages now support it. Although recent Fortran also supports OOP, it is considered to have limited functionality compared to modern software features. In this work, an object-oriented program for a system safety analysis code has been attempted utilizing modern C language features. The advantages of OOP are discussed after verification of the design's feasibility.

  12. ANALYSIS AND PARTICULARITIES OF EXTERNAL FACTORS IMPACT ON ECONOMICAL RESULTS OF STRATEGIC OBJECTS PLANNING DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    V. V. Gromov

    2015-01-01

    Full Text Available Summary. The relevance of the scientific problem described in the article lies in: determining changes in the economic performance and effectiveness of the sectoral components of the service sector under the effects of environmental factors, which allows them to reach planned long-term economic performance; and management decision-making about structural and organizational changes and the implementation of investment projects in the renovation and modernization of fixed capital and the creation of technology, process and product innovations, which is directly connected with the analysis of the impact of such external factors as economic, socio-cultural, legal, political, and innovative ones. The structure of the article is formed on the basis of presenting the impact of specific groups of environmental factors on the competitiveness and economic performance of sectoral components of the service sector, based on the technology of strategic planning; it follows a logical sequence of presentation, establishing causal relationships and the interaction of factors and elements of the studied problems and objects. The particularities of the impact of external factors on the effectiveness of macro-economic entities and sectoral components of the service sector lie in the adequacy of the measures and strategies to counter negative impacts on the economic development of the objects of strategic development. The particularities of status changes and the influence of internal factors on local and sectoral socio-economic systems dictate the need for a share of the available resources and a certain level of efficiency in the use of labor resources and fixed and current assets. The author's contribution to the scientific perspective of this topic is a comprehensive analysis of the impact of the main groups of external factors on the economic activities of service sector development, and the identification of the particularities of the impact of internal factors on the economic and innovative development of strategic planning objects.

  13. AN ANALYSIS OF THE ENVIRONMENTS OF FU ORIONIS OBJECTS WITH HERSCHEL

    Energy Technology Data Exchange (ETDEWEB)

    Green, Joel D.; Evans, Neal J. II; Merello, Manuel [Department of Astronomy, The University of Texas at Austin, 2515 Speedway, Stop C1400, Austin, TX 78712-1205 (United States); Kospal, Agnes [European Space Agency (ESA/ESTEC), Keplerlaan 1, 2200-AG Noordwijk (Netherlands); Herczeg, Gregory [Kavli Institute for Astronomy and Astrophysics, Peking University, Beijing 100871 (China); Quanz, Sascha P. [Institute for Astronomy, ETH Zurich, Wolfgang-Pauli-Strasse 27, CH-8093 Zurich (Switzerland); Henning, Thomas; Bouwman, Jeroen [Max Planck Institute for Astronomy, Koenigstuhl 17, D-69117 Heidelberg (Germany); Van Kempen, Tim A. [Leiden Observatory, Leiden University, P.O. Box 9513, 2300-RA Leiden (Netherlands); Lee, Jeong-Eun [Department of Astronomy and Space Science, Kyung Hee University, Gyeonggi 446-701 (Korea, Republic of); Dunham, Michael M. [Department of Astronomy, Yale University, New Haven, CT (United States); Meeus, Gwendolyn [Departamento de Fisica Teorica, Universidad Autonoma de Madrid, Campus Cantoblanco (Spain); Chen, Jo-hsin [Jet Propulsion Laboratory, Pasadena, CA (United States); Guedel, Manuel; Liebhart, Armin [Department of Astrophysics, University of Vienna (Austria); Skinner, Stephen L., E-mail: joel@astro.as.utexas.edu [Center for Astrophysics and Space Astronomy (CASA), University of Colorado, Boulder, CO 80309-0389 (United States)

    2013-08-01

We present Herschel-HIFI, SPIRE, and PACS 50-670 μm imaging and spectroscopy of six FU Orionis-type objects and candidates (FU Orionis, V1735 Cyg, V1515 Cyg, V1057 Cyg, V1331 Cyg, and HBC 722), ranging in outburst date from 1936 to 2010, from the 'FOOSH' (FU Orionis Objects Surveyed with Herschel) program, as well as ancillary results from the Spitzer Infrared Spectrograph and the Caltech Submillimeter Observatory. In their system properties (L_bol, T_bol, and line emission), we find that FUors are in a variety of evolutionary states. Additionally, some FUors have features of both Class I and II sources: warm continuum consistent with Class II sources, but rotational line emission typical of Class I, far higher than Class II sources of similar mass/luminosity. Combining several classification techniques, we find an evolutionary sequence consistent with previous mid-IR indicators. We detect [O I] in every source at luminosities consistent with Class 0/I protostars, much greater than in Class II disks. We detect transitions of ¹³CO (J_up of 5-8) around two sources (V1735 Cyg and HBC 722) but attribute them to nearby protostars. Of the remaining sources, three (FU Ori, V1515 Cyg, and V1331 Cyg) exhibit only low-lying CO, but one (V1057 Cyg) shows CO up to J = 23 → 22 and evidence for H₂O and OH emission, at strengths typical of protostars rather than T Tauri stars. Rotational temperatures for 'cool' CO components range from 20 to 81 K, for ~10⁵⁰ total CO molecules. We detect [C I] and [N II] primarily as diffuse emission.
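The rotational temperatures quoted for the 'cool' CO components are conventionally derived from a rotation diagram, in which ln(N_u/g_u) falls linearly with upper-level energy and the slope equals -1/T_rot. A minimal sketch of that fit, using synthetic LTE values rather than the FOOSH data:

```python
def rotation_diagram_temperature(E_u, ln_N_over_g):
    """Least-squares slope of ln(N_u/g_u) vs. upper-level energy E_u (in K);
    in a rotation diagram the slope equals -1/T_rot."""
    n = len(E_u)
    mx = sum(E_u) / n
    my = sum(ln_N_over_g) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(E_u, ln_N_over_g))
             / sum((x - mx) ** 2 for x in E_u))
    return -1.0 / slope

# Synthetic LTE level populations at T = 50 K: ln(N_u/g_u) = const - E_u / T
E_u = [16.6, 33.2, 55.3, 83.0, 116.2]        # illustrative upper-level energies (K)
lnNg = [30.0 - e / 50.0 for e in E_u]
print(round(rotation_diagram_temperature(E_u, lnNg), 1))  # → 50.0
```

Fitting real line fluxes additionally requires converting each flux to a column density per statistical weight, which is omitted here.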

  14. Automatic Spiral Analysis for Objective Assessment of Motor Symptoms in Parkinson’s Disease

    Directory of Open Access Journals (Sweden)

    Mevludin Memedi

    2015-09-01

    well as had good test-retest reliability. This study demonstrated the potential of using digital spiral analysis for objective quantification of PD-specific and/or treatment-induced motor symptoms.

  15. Automatic Spiral Analysis for Objective Assessment of Motor Symptoms in Parkinson's Disease.

    Science.gov (United States)

    Memedi, Mevludin; Sadikov, Aleksander; Groznik, Vida; Žabkar, Jure; Možina, Martin; Bergquist, Filip; Johansson, Anders; Haubenberger, Dietrich; Nyholm, Dag

    2015-09-17

    test-retest reliability. This study demonstrated the potential of using digital spiral analysis for objective quantification of PD-specific and/or treatment-induced motor symptoms.

  16. The Establishment of Object Selection Criteria for Effect Analysis of Electromagnetic Pulse (EMP) in Operating Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Ye, Song Hae; Ryu, Hosun; Kim, Minyi; Lee, Euijong [KHNP CRI, Daejeon (Korea, Republic of)

    2016-10-15

An electromagnetic pulse (EMP) can be used as a strategic weapon by inducing damaging voltages and currents that electrical circuits are not designed to withstand. EMPs are lethal to electronic systems. All EMP events have three common components: a source, a coupling path, and a receptor. An EMP can also travel across power grids, destroying electronics as it passes, in less than a second. There have been no research studies on effect analysis for EMP in domestic nuclear power plants and power grids. To ensure the safety of operating nuclear power plants in this environment, effect analysis and safety measures against EMPs are needed. In practice, it is difficult and inefficient to conduct an effect analysis of EMP covering all the equipment and systems in nuclear power plants (NPPs). Therefore, this paper presents the results of establishing object selection criteria for the effect analysis of EMP in operating nuclear power plants, based on a review of previous research in the US and of the safety-related design concepts in domestic NPPs. It is not necessary to ensure the continued operation of the plant in intense multiple-EMP environments. The most probable effect of EMP on a modern nuclear power plant is an unscheduled shutdown. EMP may also cause an extended shutdown through the unnecessary activation of some safety-related systems. In general, EMP can be considered a nuisance to nuclear plants, but it is not considered a serious threat to plant safety. The results of the EMP effect analysis show a low possibility of failure in the tested individual equipment. It was also confirmed that there is no possibility of simultaneous failure of the devices in charge of safe shutdown in the NPP.

  17. Metal loading in Soda Butte Creek upstream of Yellowstone National Park, Montana and Wyoming; a retrospective analysis of previous research; and quantification of metal loading, August 1999

    Science.gov (United States)

    Boughton, G.K.

    2001-01-01

Acid drainage from historic mining activities has affected the water quality and aquatic biota of Soda Butte Creek upstream of Yellowstone National Park. Numerous investigations focusing on metals contamination have been conducted in the Soda Butte Creek basin, but interpretations of how metals contamination is currently impacting Soda Butte Creek differ greatly. A retrospective analysis of previous research on metal loading in Soda Butte Creek was completed to provide summaries of studies pertinent to metal loading in Soda Butte Creek and to identify data gaps warranting further investigation. Identification and quantification of the sources of metal loading to Soda Butte Creek was recognized as a significant data gap. The McLaren Mine tailings impoundment and mill site has long been identified as a source of metals, but its contribution relative to the total metal load entering Yellowstone National Park was unknown. A tracer-injection and synoptic-sampling study was designed to determine metal loads upstream of Yellowstone National Park. A tracer-injection and synoptic-sampling study was conducted on an 8,511-meter reach of Soda Butte Creek from upstream of the McLaren Mine tailings impoundment and mill site downstream to the Yellowstone National Park boundary in August 1999. Synoptic-sampling sites were selected to divide the creek into discrete segments. A lithium bromide tracer was injected continuously into Soda Butte Creek for 24.5 hours. Downstream dilution of the tracer and current-meter measurements were used to calculate the stream discharge. Stream discharge values, combined with constituent concentrations obtained by synoptic sampling, were used to quantify constituent loading in each segment of Soda Butte Creek. Loads were calculated for dissolved calcium, silica, and sulfate, as well as for dissolved and total-recoverable iron, aluminum, and manganese. Loads were not calculated for cadmium, copper, lead, and zinc because these elements were infrequently
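The load quantification described above follows standard tracer-dilution arithmetic: the plateau tracer concentration downstream gives the discharge, and discharge times a synoptically sampled concentration gives the load. A sketch with made-up numbers, not the August 1999 measurements:

```python
def tracer_discharge(q_inj_lps, c_inj_mgL, c_plateau_ugL, c_background_ugL):
    """Stream discharge (L/s) by tracer dilution: Q = q_inj * C_inj / (C_plateau - C_bg)."""
    c_inj_ugL = c_inj_mgL * 1000.0  # mg/L -> ug/L
    return q_inj_lps * c_inj_ugL / (c_plateau_ugL - c_background_ugL)

def constituent_load_kg_per_day(q_lps, conc_ugL):
    """Instantaneous constituent load: (L/s * ug/L) converted to kg/day."""
    return q_lps * conc_ugL * 86400.0 / 1e9

# Illustrative numbers only, not the Soda Butte Creek data:
Q = tracer_discharge(q_inj_lps=0.02, c_inj_mgL=30000.0,
                     c_plateau_ugL=250.0, c_background_ugL=50.0)
iron_load = constituent_load_kg_per_day(Q, conc_ugL=120.0)
print(round(Q, 1), round(iron_load, 1))  # → 3000.0 31.1
```

Differencing the loads of consecutive segments then isolates each segment's contribution, which is how a source such as the McLaren site can be ranked against the total load.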

  18. Restructuring of burnup sensitivity analysis code system by using an object-oriented design approach

    International Nuclear Information System (INIS)

    Kenji, Yokoyama; Makoto, Ishikawa; Masahiro, Tatsumi; Hideaki, Hyoudou

    2005-01-01

A new burnup sensitivity analysis code system was developed with the help of object-oriented techniques and written in the Python language. It was confirmed that these techniques are powerful in supporting complex numerical calculation procedures such as reactor burnup sensitivity analysis. The new burnup sensitivity analysis code system, PSAGEP, was restructured from a complicated old code system and reborn as a user-friendly code system that can calculate the sensitivity coefficients of nuclear characteristics considering the multicycle burnup effect, based on generalized perturbation theory (GPT). A new encapsulation framework for conventional codes written in Fortran was developed. This framework supported restructuring the software architecture of the old code system by hiding implementation details, and allows users of the new code system to easily calculate the burnup sensitivity coefficients. The framework can be applied to other development projects since it is carefully designed to be independent of PSAGEP. Numerical results for the burnup sensitivity coefficients of a typical fast breeder reactor are given with components based on GPT, and the multicycle burnup effects on the sensitivity coefficients are discussed. (authors)
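The encapsulation idea, hiding a legacy Fortran routine behind an object-oriented interface, can be illustrated with a toy facade. Everything below (the class name, the wrapped routine and its formula) is invented for illustration; the abstract does not expose PSAGEP's actual interfaces:

```python
# Toy facade in the spirit of PSAGEP's encapsulation framework (illustrative only).

def legacy_sensitivity(flux, adjoint, perturbation):
    """Stand-in for a wrapped legacy Fortran routine: a first-order
    perturbation-theory estimate from flux and adjoint-flux vectors."""
    return sum(f * a for f, a in zip(flux, adjoint)) * perturbation

class SensitivitySolver:
    """Hides the legacy calling convention behind a small object interface."""
    def __init__(self, flux, adjoint):
        self._flux = flux
        self._adjoint = adjoint

    def coefficient(self, perturbation):
        return legacy_sensitivity(self._flux, self._adjoint, perturbation)

solver = SensitivitySolver(flux=[1.0, 0.8, 0.5], adjoint=[0.9, 1.1, 1.3])
print(round(solver.coefficient(0.01), 4))  # → 0.0243
```

The point of such a wrapper is that the caller never sees the legacy argument list, so the old code can later be replaced without touching user scripts.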

  19. Qualitative content analysis experiences with objective structured clinical examination among Korean nursing students.

    Science.gov (United States)

    Jo, Kae-Hwa; An, Gyeong-Ju

    2014-04-01

The aim of this study was to explore the experiences of Korean nursing students with an objective structured clinical examination (OSCE) assessment regarding the 12 cranial nerves, using qualitative content analysis. Qualitative content analysis was used to explore the subjective experiences of baccalaureate nursing students after taking the OSCE. Convenience sampling was used to select 64 fourth-year nursing students who were interested in taking the OSCE. The participants learned content about the 12 cranial nerve assessment through lectures, demonstrations, and videos before the OSCE. The OSCE consisted of examinations at each of three stations over 2 days. The participants anonymously wrote about their experiences on sheets of paper in an adjacent room immediately after the OSCE. The submitted materials were analyzed via qualitative content analysis. The collected materials were classified into two themes and seven categories. One theme was "awareness of inner capabilities", which included three categories: "inner motivation", "inner confidence", and "creativity". The other theme was "barriers to nursing performance", which included four categories: "deficiency of knowledge", "deficiency of communication skill", "deficiency of attitude toward comfort", and "deficiency of repetitive practice". This study revealed that the participants simultaneously experienced the potential and the deficiencies of their nursing competency after an OSCE session on cranial nerves. The OSCE also provided the opportunity for nursing students to realize nursing care in a holistic manner, contrary to concerns that the OSCE undermines holism. © 2013 The Authors. Japan Journal of Nursing Science © 2013 Japan Academy of Nursing Science.

  20. Multi-objective optimization and grey relational analysis on configurations of organic Rankine cycle

    International Nuclear Information System (INIS)

    Wang, Y.Z.; Zhao, J.; Wang, Y.; An, Q.S.

    2017-01-01

Highlights: • Pareto frontier is an effective way to make comprehensive comparisons of ORCs. • The comprehensive performance, in energy and economic terms, of the basic ORC is the best. • R141b shows the best comprehensive performance in energy and economic terms. - Abstract: Concerning the comprehensive performance of the organic Rankine cycle (ORC), comparisons and optimizations of 3 different ORC configurations (basic, regenerative and extractive ORCs) are investigated in this paper. Medium-temperature geothermal water is used for comparing the influence of configurations, working fluids and operating parameters on different evaluation criteria. Different evaluation and optimization methods, such as exergoeconomic analysis, bi-objective optimization and grey relational analysis, are adopted to identify the ORC with the best comprehensive performance. The results reveal that the basic ORC performs best among these 3 ORCs in terms of comprehensive thermodynamic and economic performance when using R245fa and driven by geothermal water at 150 °C. Furthermore, R141b shows the best comprehensive performance among 14 working fluids based on the Pareto frontier solutions, without considering safety factors. Meanwhile, R141b is the best of all 14 working fluids when all evaluation criteria are weighted equally using grey relational analysis.
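Grey relational analysis, used above to rank working fluids when all criteria are weighted equally, reduces to a few lines: compute each alternative's deviation from an ideal sequence, then average the grey relational coefficients. A generic sketch (the fluid scores are invented, not the paper's data):

```python
def grey_relational_grades(alternatives, ideal, rho=0.5):
    """Grey relational grade of each alternative against the ideal sequence.
    Inputs are criterion scores already normalized to [0, 1];
    rho is the usual distinguishing coefficient."""
    deltas = [[abs(a - i) for a, i in zip(alt, ideal)] for alt in alternatives]
    d_min = min(min(row) for row in deltas)
    d_max = max(max(row) for row in deltas)
    grades = []
    for row in deltas:
        coeffs = [(d_min + rho * d_max) / (d + rho * d_max) for d in row]
        grades.append(sum(coeffs) / len(coeffs))
    return grades

ideal = [1.0, 1.0]                               # best value on each criterion
fluids = [[0.9, 0.8], [0.6, 0.95], [0.7, 0.7]]   # invented scores for 3 fluids
grades = grey_relational_grades(fluids, ideal)
print([round(g, 3) for g in grades])             # highest grade wins the ranking
```

Criterion weights, when not equal, simply replace the plain average of the coefficients with a weighted one.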

  1. Optimization and objective and subjective analysis of thorax image for computerized radiology

    International Nuclear Information System (INIS)

    Velo, Alexandre F.; Miranda, Jose Ricardo A.

    2013-01-01

This research aimed to optimize computed chest radiographic images (in posteroanterior projection, PA). To this end, a homogeneous patient-equivalent phantom was used to calibrate the computed radiography imaging system, in order to obtain a signal-to-noise ratio satisfactory for diagnosis while minimizing the dose received by the patient. The techniques were then applied to an anthropomorphic phantom (RANDO). The images obtained were evaluated by a radiologist, who identified the best image for determining possible pathologies (fracture or pneumonia). The techniques were quantified objectively (detective quantum efficiency, DQE; modulation transfer function, MTF; noise power spectrum, NPS). Comparing the optimized techniques with the clinical routine, it is concluded that all provide doses below reference levels. However, the choice of the best technique for viewing possible pneumonia and/or fracture was determined based on the 3D criterion (Dose, Diagnostic, Dollar) and regarded as the gold standard. This image presented reductions in dose and tube loading of around 70.5% and 80%, respectively, when compared with the clinical routine
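Of the objective metrics mentioned, the noise power spectrum is the easiest to sketch: it is the squared magnitude of the Fourier transform of a mean-subtracted noise trace, scaled by the pixel pitch. A one-dimensional toy version (scaling conventions vary between references; this is one common choice):

```python
import cmath
import random

def nps_1d(noise, pixel_pitch=0.1):
    """One-dimensional noise power spectrum estimate:
    NPS(f_k) = (pitch / N) * |DFT(noise - mean)|^2."""
    n = len(noise)
    mean = sum(noise) / n
    x = [v - mean for v in noise]
    spectrum = []
    for k in range(n):
        X = sum(x[j] * cmath.exp(-2j * cmath.pi * k * j / n) for j in range(n))
        spectrum.append(pixel_pitch / n * abs(X) ** 2)
    return spectrum

random.seed(0)
trace = [random.gauss(100.0, 5.0) for _ in range(64)]  # simulated flat-field noise
nps = nps_1d(trace)
print(round(sum(nps), 3))  # total spectral power, proportional to the trace variance
```

By Parseval's theorem the spectrum sums to the pixel pitch times the sum of squared deviations, which is a convenient sanity check on any NPS implementation.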

  2. Gaming Change: A Many-objective Analysis of Water Supply Portfolios under Uncertainty

    Science.gov (United States)

    Reed, P. M.; Kasprzyk, J.; Characklis, G.; Kirsch, B.

    2008-12-01

    This study explores the uncertainty and tradeoffs associated with up to six conflicting water supply portfolio planning objectives. A ten-year Monte Carlo simulation model is used to evaluate water supply portfolios blending permanent rights, adaptive options contracts, and spot leases for a single city in the Lower Rio Grande Valley. Historical records of reservoir mass balance, lease pricing, and demand serve as the source data for the Monte Carlo simulation. Portfolio planning decisions include the initial volume and annual increases of permanent rights, thresholds for an adaptive options contract, and anticipatory decision rules for purchasing leases and exercising options. Our work distinguishes three cases: (1) permanent rights as the sole source of supply, (2) permanent rights and adaptive options, and (3) a combination of permanent rights, adaptive options, and leases. The problems have been formulated such that cases 1 and 2 are sub-spaces of the six objective formulation used for case 3. Our solution sets provide the tradeoff surfaces between portfolios' expected values for cost, cost variability, reliability, frequency of purchasing permanent rights increases, frequency of using leases, and dropped (or unused) transfers of water. The tradeoff surfaces for the three cases show that options and leases have a dramatic impact on the marginal costs associated with improving the efficiency and reliability of urban water supplies. Moreover, our many-objective analysis permits the discovery of a broad range of high quality portfolio strategies. We differentiate the value of adaptive options versus leases by testing a representative subset of optimal portfolios' abilities to effectively address regional increases in demand during drought periods. These results provide insights into the tradeoffs inherent to a more flexible, portfolio-style approach to urban water resources management, an approach that should become increasingly attractive in an environment of
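The portfolio mechanics can be caricatured in a few lines of Monte Carlo: draw annual demands, apply an anticipatory lease rule, and score expected cost and reliability. All numbers and the decision rule below are invented placeholders, not the Lower Rio Grande Valley model:

```python
import random

def simulate_portfolio(rights, lease_threshold, lease_price, rights_cost,
                       n_years=1000, seed=1):
    """Toy Monte Carlo of a water-supply portfolio mixing permanent rights and
    spot leases. Returns (expected annual cost, reliability)."""
    rng = random.Random(seed)
    total_cost, failures = 0.0, 0
    for _ in range(n_years):
        demand = rng.gauss(90.0, 15.0)      # annual demand, arbitrary units
        supply = rights
        cost = rights * rights_cost         # fixed cost of permanent rights
        target = demand * lease_threshold   # anticipatory lease rule
        if supply < target:
            leased = target - supply
            supply += leased
            cost += leased * lease_price
        if supply < demand:
            failures += 1
        total_cost += cost
    return total_cost / n_years, 1.0 - failures / n_years

# Rights-only portfolio vs. a smaller rights base topped up with leases
cost_r, rel_r = simulate_portfolio(rights=120.0, lease_threshold=0.0,
                                   lease_price=5.0, rights_cost=1.0)
cost_l, rel_l = simulate_portfolio(rights=80.0, lease_threshold=1.1,
                                   lease_price=2.0, rights_cost=1.0)
print(round(rel_r, 3), round(rel_l, 3))
```

Sweeping the decision variables and filtering the (cost, variability, reliability, ...) outcomes for non-dominated points is what produces the tradeoff surfaces described above.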

  3. A REGION-BASED MULTI-SCALE APPROACH FOR OBJECT-BASED IMAGE ANALYSIS

    Directory of Open Access Journals (Sweden)

    T. Kavzoglu

    2016-06-01

Full Text Available Within the last two decades, object-based image analysis (OBIA, which considers objects (i.e. groups of pixels instead of individual pixels, has gained popularity and attracted increasing interest. The most important stage of OBIA is image segmentation, which groups spectrally similar adjacent pixels considering not only spectral features but also spatial and textural features. Although there are several parameters (scale, shape, compactness and band weights to be set by the analyst, the scale parameter stands out as the most important one in the segmentation process. Estimating the optimal scale parameter is crucial to increasing classification accuracy, which depends on image resolution, image object size and the characteristics of the study area. In this study, two scale-selection strategies were implemented in the image segmentation process using a pan-sharpened Quickbird-2 image. The first strategy estimates optimal scale parameters for eight sub-regions. For this purpose, the local variance/rate of change (LV-RoC graphs produced by the ESP-2 tool were analysed to determine fine, moderate and coarse scales for each region. In the second strategy, the image was segmented using the three candidate scale values (fine, moderate, coarse determined from the LV-RoC graph calculated for the whole image. The nearest neighbour classifier was applied in all segmentation experiments, and an equal number of pixels was randomly selected to calculate accuracy metrics (overall accuracy and kappa coefficient. Comparison of region-based and image-based segmentation was carried out on the classified images, and it was found that region-based multi-scale OBIA produced significantly more accurate results than image-based single-scale OBIA. The difference in classification accuracy reached 10% in terms of overall accuracy.
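The LV-RoC reading used with the ESP-2 tool can be mimicked in a few lines: compute the percentage rate of change of local variance between consecutive scale levels and take its local maxima as candidate scales. A simplified sketch with invented variance values:

```python
def candidate_scales(scales, local_variance):
    """Scales at which the rate of change (RoC) of local variance peaks."""
    roc = [100.0 * (local_variance[i] - local_variance[i - 1]) / local_variance[i - 1]
           for i in range(1, len(local_variance))]
    # roc[j] belongs to scales[j + 1]; interior local maxima mark candidates
    return [scales[j + 1] for j in range(1, len(roc) - 1)
            if roc[j] > roc[j - 1] and roc[j] > roc[j + 1]]

scales = [10, 20, 30, 40, 50, 60, 70, 80, 90, 100]
lv = [5.0, 6.0, 8.0, 8.4, 8.8, 10.5, 10.8, 11.0, 11.1, 11.2]  # invented LV curve
print(candidate_scales(scales, lv))  # → [30, 60]
```

In practice the analyst still inspects the graph, since not every RoC peak corresponds to a meaningful object level.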

  4. MULTIPLE OBJECTS

    Directory of Open Access Journals (Sweden)

    A. A. Bosov

    2015-04-01

Full Text Available Purpose. The development of complicated techniques for production and management processes, information systems, computer science, and applied objects of systems theory requires improved mathematical methods and new approaches for research on application systems. The variety and diversity of subject systems make it necessary to develop a model that generalizes classical sets and their development, sets of sets. Multiple objects, unlike sets, are constructed from multiple structures and are represented by structure and content. The aim of this work is the analysis of the multiple structures that generate multiple objects, and the further development of operations on these objects in application systems. Methodology. To achieve the objectives of the research, the structure of a multiple object is represented as a constructive trio consisting of a medium, a signature, and axiomatics. A multiple object is determined by its structure and content, and is represented by a hybrid superposition composed of sets, multi-sets, ordered sets (lists and heterogeneous sets (sequences, tuples. Findings. In this paper we study the properties and characteristics of the components of hybrid multiple objects of complex systems, propose assessments of their complexity, and show the rules of internal and external operations on implementation objects. We introduce relations of arbitrary order over multiple objects, and define the description of functions and mappings on objects of multiple structures. Originality. In this paper we consider the development of the multiple structures that generate multiple objects. Practical value. The transition from abstract to subject multiple structures requires transformation of the system and of the multiple objects. Transformation involves three successive stages: specification (binding to the domain, interpretation (multiple sites and particularization (goals. The proposed approach describes systems based on hybrid sets

  5. Systematic analysis of the heat exchanger arrangement problem using multi-objective genetic optimization

    International Nuclear Information System (INIS)

    Daróczy, László; Janiga, Gábor; Thévenin, Dominique

    2014-01-01

    A two-dimensional cross-flow tube bank heat exchanger arrangement problem with internal laminar flow is considered in this work. The objective is to optimize the arrangement of tubes and find the most favorable geometries, in order to simultaneously maximize the rate of heat exchange while obtaining a minimum pressure loss. A systematic study was performed involving a large number of simulations. The global optimization method NSGA-II was retained. A fully automatized in-house optimization environment was used to solve the problem, including mesh generation and CFD (computational fluid dynamics) simulations. The optimization was performed in parallel on a Linux cluster with a very good speed-up. The main purpose of this article is to illustrate and analyze a heat exchanger arrangement problem in its most general form and to provide a fundamental understanding of the structure of the Pareto front and optimal geometries. The considered conditions are particularly suited for low-power applications, as found in a growing number of practical systems in an effort toward increasing energy efficiency. For such a detailed analysis with more than 140 000 CFD-based evaluations, a design-of-experiment study involving a response surface would not be sufficient. Instead, all evaluations rely on a direct solution using a CFD solver. - Highlights: • Cross-flow tube bank heat exchanger arrangement problem. • A fully automatized multi-objective optimization based on genetic algorithm. • A systematic study involving a large number of CFD (computational fluid dynamics) simulations
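The structure of the Pareto front in such a bi-objective problem (maximize heat exchange, minimize pressure loss) comes from a simple dominance test, which the NSGA-II sorting applies repeatedly. A minimal filter over invented candidate designs:

```python
def pareto_front(designs):
    """Return the non-dominated designs for the bi-objective problem
    (maximize heat-exchange rate q, minimize pressure loss dp)."""
    front = []
    for name, q, dp in designs:
        # a design is dominated if some other design is at least as good on
        # both objectives and strictly better on one
        dominated = any(q2 >= q and dp2 <= dp and (q2 > q or dp2 < dp)
                        for _, q2, dp2 in designs)
        if not dominated:
            front.append(name)
    return front

# Invented candidate tube arrangements: (name, heat rate, pressure loss)
designs = [("A", 100.0, 5.0), ("B", 120.0, 8.0), ("C", 90.0, 6.0), ("D", 110.0, 5.5)]
print(pareto_front(designs))  # → ['A', 'B', 'D']  (C is dominated by A)
```

NSGA-II adds non-dominated sorting into ranked fronts plus crowding-distance selection on top of this test; the CFD solver's role is only to supply the (q, dp) pairs.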

  6. Energy regulation in China: Objective selection, potential assessment and responsibility sharing by partial frontier analysis

    International Nuclear Information System (INIS)

    Xia, X.H.; Chen, Y.B.; Li, J.S.; Tasawar, H.; Alsaedi, A.; Chen, G.Q.

    2014-01-01

    To cope with the excessive growth of energy consumption, the Chinese government has been trying to strengthen the energy regulation system by introducing new initiatives that aim at controlling the total amount of energy consumption. A partial frontier analysis is performed in this paper to make a comparative assessment of the combinations of possible energy conservation objectives, new constraints and regulation strategies. According to the characteristics of the coordination of existing regulation structure and the optimality of regulation strategy, four scenarios are constructed and regional responsibilities are reasonably divided by fully considering the production technology in the economy. The relative importance of output objectives and the total amount controlling is compared and the impacts on the regional economy caused by the changes of regulation strategy are also evaluated for updating regulation policy. - Highlights: • New initiatives to control the total amount of energy consumption are evaluated. • Twenty-four regulation strategies and four scenarios are designed and compared. • Crucial regions for each sector and regional potential are identified. • The national goals of energy abatement are decomposed into regional responsibilities. • The changes of regulation strategy are evaluated for updating regulation policy

  7. Testing and injury potential analysis of rollovers with narrow object impacts.

    Science.gov (United States)

    Meyer, Steven E; Forrest, Stephen; Herbst, Brian; Hayden, Joshua; Orton, Tia; Sances, Anthony; Kumaresan, Srirangam

    2004-01-01

Recent statistics highlight the significant risk of serious and fatal injuries to occupants involved in rollover collisions due to excessive roof crush. The government has reported that in 2002, Sport Utility Vehicle (SUV) rollover-related fatalities increased by 14% to more than 2,400 annually, and 61% of all SUV fatalities involved rollovers [1]. Rollover crashes rely primarily upon the roof structures to maintain occupant survival space. Frequently these crashes occur off the travel lanes of the roadway and, therefore, can include impacts with various types of narrow objects such as light poles, utility poles and/or trees. A test device and methodology is presented which facilitates dynamic, repeatable rollover impact evaluation of complete vehicle roof structures with such narrow objects. These tests allow for the incorporation of Anthropomorphic Test Dummies (ATDs), which can be instrumented to measure accelerations, forces and moments to evaluate injury potential. High-speed video permits detailed analysis of occupant kinematics and evaluation of injury causation. Criteria such as restraint performance, injury potential, survival space and the effect of roof crush associated with various types of design alternatives, countermeasures and impact circumstances can also be evaluated. In addition to the presentation of the methodology, two representative vehicle crash tests are also reported. Results indicated that the reinforced roof structure significantly reduced roof deformation compared to the production roof structure.

  8. An Integrative Object-Based Image Analysis Workflow for Uav Images

    Science.gov (United States)

    Yu, Huai; Yan, Tianheng; Yang, Wen; Zheng, Hong

    2016-06-01

In this work, we propose an integrative framework to process UAV images. The overall process can be viewed as a pipeline consisting of geometric and radiometric corrections, subsequent panoramic mosaicking and hierarchical image segmentation for later Object Based Image Analysis (OBIA). More precisely, we first introduce an efficient image stitching algorithm after the geometric calibration and radiometric correction, which employs fast feature extraction and matching by combining the local difference binary descriptor and locality-sensitive hashing. We then use a Binary Partition Tree (BPT) representation for the large mosaicked panoramic image, which starts with the definition of an initial partition obtained by an over-segmentation algorithm, i.e., simple linear iterative clustering (SLIC). Finally, we build an object-based hierarchical structure by fully considering the spectral and spatial information of the super-pixels and their topological relationships. Moreover, an optimal segmentation is obtained by filtering the complex hierarchies into simpler ones according to criteria such as uniform homogeneity and semantic consistency. Experimental results on processing the post-seismic UAV images of the 2013 Ya'an earthquake demonstrate the effectiveness and efficiency of our proposed method.

  9. AN INTEGRATIVE OBJECT-BASED IMAGE ANALYSIS WORKFLOW FOR UAV IMAGES

    Directory of Open Access Journals (Sweden)

    H. Yu

    2016-06-01

Full Text Available In this work, we propose an integrative framework to process UAV images. The overall process can be viewed as a pipeline consisting of geometric and radiometric corrections, subsequent panoramic mosaicking and hierarchical image segmentation for later Object Based Image Analysis (OBIA. More precisely, we first introduce an efficient image stitching algorithm after the geometric calibration and radiometric correction, which employs fast feature extraction and matching by combining the local difference binary descriptor and locality-sensitive hashing. We then use a Binary Partition Tree (BPT representation for the large mosaicked panoramic image, which starts with the definition of an initial partition obtained by an over-segmentation algorithm, i.e., simple linear iterative clustering (SLIC. Finally, we build an object-based hierarchical structure by fully considering the spectral and spatial information of the super-pixels and their topological relationships. Moreover, an optimal segmentation is obtained by filtering the complex hierarchies into simpler ones according to criteria such as uniform homogeneity and semantic consistency. Experimental results on processing the post-seismic UAV images of the 2013 Ya’an earthquake demonstrate the effectiveness and efficiency of our proposed method.
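Filtering a BPT hierarchy by "uniform homogeneity" boils down to merging neighbouring regions whose statistics are close. A one-dimensional toy version of such greedy merging (not the authors' actual BPT criterion):

```python
def merge_regions(values, threshold):
    """Greedy bottom-up merging of 1-D 'super-pixels' (intensity samples) into
    homogeneous segments: merge adjacent segments whose means differ by less
    than the threshold, until no merge is possible."""
    segments = [[v] for v in values]
    merged = True
    while merged:
        merged = False
        for i in range(len(segments) - 1):
            a, b = segments[i], segments[i + 1]
            if abs(sum(a) / len(a) - sum(b) / len(b)) < threshold:
                segments[i:i + 2] = [a + b]   # fuse the two neighbours
                merged = True
                break
    return segments

print(merge_regions([10, 11, 12, 40, 41, 90], threshold=5))
# → [[10, 11, 12], [40, 41], [90]]
```

A real BPT records every merge as a tree node, so different cut criteria (homogeneity, semantic consistency) can later extract different partitions from the same hierarchy.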

  10. Scanpath-based analysis of objects conspicuity in context of human vision physiology.

    Science.gov (United States)

    Augustyniak, Piotr

    2007-01-01

This paper discusses principal aspects of object conspicuity investigated with the use of an eye tracker and interpreted against the background of human vision physiology. Proper management of object conspicuity is fundamental in several leading-edge applications in the information society, such as advertisement, web design, man-machine interfacing and ergonomics. Although some common rules of human perception have been applied in art for centuries, interest in the human perception process is motivated today by the need to capture and maintain the recipient's attention by putting selected messages in front of the others. Our research uses the visual-task methodology and series of progressively modified natural images. The modified details were characterized by their size, color and position, while the scanpath-derived gaze points confirmed the act of perception or not. The statistical analysis yielded the probability of detail perception and its correlations with the attributes. This probability conforms to established knowledge of retinal anatomy and the physiology of perception, although we used noninvasive methods only.
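The statistical step, estimating the probability of detail perception and correlating it with detail attributes, can be sketched directly; the gaze data below are hypothetical:

```python
def detection_probability(hits):
    """Fraction of visual-task trials in which a gaze point confirmed perception."""
    return sum(hits) / len(hits)

def pearson(xs, ys):
    """Pearson correlation coefficient, written out explicitly."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical data: detail size (degrees of visual angle) vs. per-trial hits
sizes = [0.5, 1.0, 1.5, 2.0, 3.0]
p_detect = [detection_probability(h) for h in (
    [1, 0, 0, 0, 1], [1, 0, 1, 1, 0], [1, 1, 0, 1, 0],
    [1, 1, 1, 1, 0], [1, 1, 1, 1, 1])]
r = pearson(sizes, p_detect)
print(round(r, 2))  # → 0.98
```

A strong positive correlation of this kind is what one expects from retinal acuity limits: larger details are resolvable farther from the fovea.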

  11. Objective and subjective analysis of women's voice with idiopathic Parkinson's disease

    Directory of Open Access Journals (Sweden)

    Riviana Rodrigues das Graças

    2012-07-01

Full Text Available OBJECTIVE: To compare the voice quality of women with idiopathic Parkinson's disease and women without it. METHODS: An evaluation was performed including 19 female patients diagnosed with idiopathic Parkinson's disease, with an average age of 66 years, and 27 women with an average age of 67 years in the control group. The assessment was performed by computerized acoustic analysis and perceptual evaluation. RESULTS: Parkinson's disease patients presented moderately rough and unstable voice quality. The parameters of grade, roughness, and instability had higher scores in Parkinson's disease patients, with statistically significant differences. The acoustic measures jitter and period perturbation quotient (PPQ differed significantly between groups. CONCLUSIONS: Women with idiopathic Parkinson's disease showed more vocal alterations than the control group in both the perceptual and the acoustic evaluations.
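Jitter and the period perturbation quotient reported above are simple functions of the cycle-to-cycle period sequence. A sketch using common textbook definitions (exact formulas vary between analysis packages):

```python
def jitter_percent(periods):
    """Local jitter (%): mean absolute difference of consecutive glottal
    periods, divided by the mean period."""
    diffs = [abs(periods[i + 1] - periods[i]) for i in range(len(periods) - 1)]
    return 100.0 * (sum(diffs) / len(diffs)) / (sum(periods) / len(periods))

def ppq5_percent(periods):
    """Five-point period perturbation quotient (PPQ5, %): mean absolute
    deviation of each period from its 5-point neighbourhood average."""
    n = len(periods)
    devs = [abs(periods[i] - sum(periods[i - 2:i + 3]) / 5.0)
            for i in range(2, n - 2)]
    return 100.0 * (sum(devs) / len(devs)) / (sum(periods) / n)

steady = [5.0] * 10                      # perfectly periodic voice, periods in ms
perturbed = [5.0, 5.2, 4.9, 5.3, 4.8, 5.1, 5.0, 5.25, 4.85, 5.1]
print(jitter_percent(steady), round(jitter_percent(perturbed), 2))
```

Both measures are zero for a perfectly periodic signal and grow with the cycle-to-cycle instability that characterizes rough voice quality.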

  12. Comprehensive benefit analysis of regional water resources based on multi-objective evaluation

    Science.gov (United States)

    Chi, Yixia; Xue, Lianqing; Zhang, Hui

    2018-01-01

The purpose of the comprehensive benefit analysis of water resources is to maximize the comprehensive benefits in terms of society, economy and the ecological environment. To address the defects of the traditional analytic hierarchy process in the evaluation of water resources, this study proposed a comprehensive benefit evaluation index covering social, economic and environmental benefits, from the perspective of the comprehensive benefit of water resources across the social, economic and environmental systems; determined the index weights by an improved fuzzy analytic hierarchy process (AHP); calculated the relative index of comprehensive water resources benefit; and analyzed the comprehensive benefit of water resources in Xiangshui County with a multi-objective evaluation model. Based on the water resources data for Xiangshui County, 20 main comprehensive benefit assessment factors for the 5 districts belonging to Xiangshui County were evaluated. The results showed that the comprehensive benefit of Xiangshui County was 0.7317, indicating that the social economy still has room for further development under the current water resources situation.
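Once the fuzzy-AHP weights and normalized indicators are in hand, the comprehensive benefit index is a weighted sum. A sketch with placeholder scores and weights (the paper's reported value for Xiangshui County was 0.7317):

```python
def comprehensive_benefit(indicators, weights):
    """Composite benefit score from indicators normalized to [0, 1] and
    weights (e.g. from an improved fuzzy AHP) that sum to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(v * w for v, w in zip(indicators, weights))

# Hypothetical sub-system scores: social, economic, environmental
score = comprehensive_benefit([0.8, 0.7, 0.72], [0.3, 0.4, 0.3])
print(round(score, 3))  # → 0.736
```

The real evaluation aggregates in two levels (20 factors into 3 sub-systems, then into one index), but each level is this same weighted sum.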

  13. Analysis of art objects by means of ion beam induced luminescence

    International Nuclear Information System (INIS)

    Quaranta, A; Dran, J C; Salomon, J; Pivin, J C; Vomiero, A; Tonezzer, M; Maggioni, G; Carturan, S; Mea, G Della

    2006-01-01

    The impact of energetic ions on solid samples gives rise to the emission of visible light owing to the electronic excitation of intrinsic defects or extrinsic impurities. The intensity and position of the emission features provide information on the nature of the luminescence centers and on their chemical environments. This makes ion beam induced luminescence (IBIL) a useful complement to other ion beam analyses, like PIXE, in the cultural heritage field for characterizing the composition and the provenance of art objects. In the present paper, IBIL measurements have been performed on inorganic pigments to underline the complementary role played by IBIL in the analysis of artistic works. Blue and red pigments are presented as a case study.

  14. Lightweight object oriented structure analysis: tools for building tools to analyze molecular dynamics simulations.

    Science.gov (United States)

    Romo, Tod D; Leioatts, Nicholas; Grossfield, Alan

    2014-12-15

    LOOS (Lightweight Object Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 140 prebuilt tools, including suites of tools for analyzing simulation convergence, three-dimensional histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only four core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development. © 2014 Wiley Periodicals, Inc.

  15. Shapley value-based multi-objective data envelopment analysis application for assessing academic efficiency of university departments

    Science.gov (United States)

    Abing, Stephen Lloyd N.; Barton, Mercie Grace L.; Dumdum, Michael Gerard M.; Bongo, Miriam F.; Ocampo, Lanndon A.

    2018-02-01

    This paper adopts a modified approach of data envelopment analysis (DEA) to measure the academic efficiency of university departments. In real-world case studies, conventional DEA models often identify too many decision-making units (DMUs) as efficient. This occurs when the number of DMUs under evaluation is not large enough compared to the total number of decision variables. To overcome this limitation and reduce the number of decision variables, the multi-objective data envelopment analysis (MODEA) approach previously presented in the literature is applied. The MODEA approach applies the Shapley value from cooperative game theory to determine the appropriate weights and efficiency score of each category of inputs. To illustrate the performance of the adopted approach, a case study is conducted in a university in the Philippines. The input variables are academic staff, non-academic staff, classrooms, laboratories, research grants, and department expenditures, while the output variables are the numbers of graduates and publications. The results of the case study revealed that all DMUs are inefficient. DMUs with efficiency scores close to the ideal efficiency score may be emulated by the DMUs with the lowest efficiency scores.
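The Shapley value at the heart of the MODEA weighting can be illustrated with a small cooperative game over hypothetical input categories; the worth function below is purely illustrative, not the game defined in the paper:

```python
import math
from itertools import permutations

def shapley_values(players, value):
    """Exact Shapley values: average each player's marginal
    contribution over all orderings of the players (tractable
    for a handful of input categories)."""
    phi = {p: 0.0 for p in players}
    for order in permutations(players):
        coalition = frozenset()
        for p in order:
            phi[p] += value(coalition | {p}) - value(coalition)
            coalition = coalition | {p}
    n_orderings = math.factorial(len(players))
    return {p: total / n_orderings for p, total in phi.items()}

# Hypothetical game: the "worth" of a set of input categories is
# the squared number of categories it contains.
inputs = ['staff', 'rooms', 'grants']
phi = shapley_values(inputs, lambda s: len(s) ** 2)
```

By construction the values are efficient: they sum to the worth of the grand coalition.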

  16. Does objective cluster analysis serve as a useful precursor to seasonal precipitation prediction at local scale? Application to western Ethiopia

    Directory of Open Access Journals (Sweden)

    Y. Zhang

    2018-01-01

    Full Text Available Prediction of seasonal precipitation can provide actionable information to guide management of various sectoral activities. For instance, it is often translated into hydrological forecasts for better water resources management. However, many studies assume homogeneity in precipitation across an entire study region, which may prove ineffective for operational and local-level decisions, particularly for locations with high spatial variability. This study proposes advancing local-level seasonal precipitation predictions by first conditioning on regional-level predictions, as defined through objective cluster analysis, for western Ethiopia. To our knowledge, this is the first study predicting seasonal precipitation at high resolution in this region, where lives and livelihoods are vulnerable to precipitation variability given the high reliance on rain-fed agriculture and limited water resources infrastructure. The combination of objective cluster analysis, spatially high-resolution prediction of seasonal precipitation, and a modeling structure spanning statistical and dynamical approaches makes clear advances in prediction skill and resolution compared with previous studies. The statistical model improves on the non-clustered case and on dynamical models for a number of specific clusters in northwestern Ethiopia, with clusters attaining regional average correlation and ranked probability skill score (RPSS) values of up to 0.5 and 33 %, respectively. The general skill (after bias correction) of the two best-performing dynamical models over the entire study region is superior to that of the statistical models, although the dynamical models issue predictions at a lower resolution and the raw predictions require bias correction to guarantee comparable skill.
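The RPSS reported above has a standard definition: one minus the ratio of the forecast's ranked probability score to that of a climatological reference. A minimal sketch, assuming three forecast categories (e.g. below/near/above normal):

```python
def rps(forecast_probs, obs_category):
    """Ranked probability score for one forecast: sum of squared
    differences between cumulative forecast probabilities and the
    cumulative observation indicator."""
    cum_f, cum_o, score = 0.0, 0.0, 0.0
    for k, p in enumerate(forecast_probs):
        cum_f += p
        cum_o += 1.0 if k == obs_category else 0.0
        score += (cum_f - cum_o) ** 2
    return score

def rpss(forecasts, observations, clim_probs):
    """RPSS = 1 - RPS_forecast / RPS_climatology.
    1 is a perfect forecast; 0 is no better than climatology."""
    f = sum(rps(p, o) for p, o in zip(forecasts, observations))
    c = sum(rps(clim_probs, o) for o in observations)
    return 1.0 - f / c
```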

  17. Infant search and object permanence: a meta-analysis of the A-not-B error.

    Science.gov (United States)

    Wellman, H M; Cross, D; Bartsch, K

    1987-01-01

    Research on Piaget's stage 4 object concept has failed to reveal a clear or consistent pattern of results. Piaget found that 8-12-month-old infants would make perseverative errors; his explanation for this phenomenon was that the infant's concept of the object was contextually dependent on his or her actions. Some studies designed to test Piaget's explanation have replicated his basic finding, yet many have found no preference for the A location or the B location, or an actual preference for the B location. More recently, researchers have attempted to uncover the causes of these results concerning the A-not-B error. Again, however, different studies have yielded different results, and qualitative reviews have failed to yield a consistent explanation for the results of the individual studies. This state of affairs suggests that the phenomenon may simply be too complex to be captured by individual studies varying 1 factor at a time and by reviews based on similar qualitative considerations. Therefore, the current investigation undertook a meta-analysis, a synthesis capturing the quantitative information across the now sizable number of studies. We entered several important factors into the meta-analysis, including the effects of age, the number of A trials, the length of delay between hiding and search, the number of locations, the distances between locations, and the distinctive visual properties of the hiding arrays. Of these, the analysis consistently indicated that age, delay, and number of hiding locations strongly influence infants' search. The pattern of specific findings also yielded new information about infant search. A general characterization of the results is that, at every age, both above-chance and below-chance performance was observed. That is, at each age at least 1 combination of delay and number of locations yielded above-chance A-not-B errors or significant perseverative search. At the same time, at each age at least 1 alternative
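The quantitative synthesis described is, in spirit, an inverse-variance weighted pooling of per-study effects. A minimal fixed-effect sketch (the actual analysis additionally models moderators such as age, delay, and number of locations):

```python
def pooled_effect(effects, variances):
    """Fixed-effect (inverse-variance) meta-analytic pooling:
    returns the weighted mean effect and its variance."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    estimate = sum(w * e for w, e in zip(weights, effects)) / total
    return estimate, 1.0 / total
```

More precise studies (smaller variance) receive larger weights, so they dominate the pooled estimate.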

  18. Quantification and Analysis of Icebergs in a Tidewater Glacier Fjord Using an Object-Based Approach.

    Directory of Open Access Journals (Sweden)

    Robert W McNabb

    Full Text Available Tidewater glaciers are glaciers that terminate in, and calve icebergs into, the ocean. In addition to the influence that tidewater glaciers have on physical and chemical oceanography, floating icebergs serve as habitat for marine animals such as harbor seals (Phoca vitulina richardii). The availability and spatial distribution of glacier ice in fjords is likely a key environmental variable that influences the abundance and distribution of selected marine mammals; however, the amount of ice and its fine-scale characteristics in fjords have not been systematically quantified. Given the predicted changes in glacier habitat, methods are needed that can be broadly applied to quantify changes in available ice habitat in tidewater glacier fjords. We present a case study describing a novel method that uses object-based image analysis (OBIA) to classify floating glacier ice in a tidewater glacier fjord from high-resolution aerial digital imagery. Our objectives were to (i) develop workflows and rule sets to classify high-spatial-resolution airborne imagery of floating glacier ice; (ii) quantify the amount and fine-scale characteristics of floating glacier ice; and (iii) develop processes for automating the object-based analysis of floating glacier ice for a large number of images from a representative survey day during June 2007 in Johns Hopkins Inlet (JHI), a tidewater glacier fjord in Glacier Bay National Park, southeastern Alaska. On 18 June 2007, JHI was comprised of brash ice (mean = 45.2%, SD = 41.5%), water (mean = 52.7%, SD = 42.3%), and icebergs (mean = 2.1%, SD = 1.4%). Average iceberg size per scene was 5.7 m2 (SD = 2.6 m2). We estimate the total area (± uncertainty) of iceberg habitat in the fjord to be 455,400 ± 123,000 m2. The method works well for classifying icebergs across scenes (classification accuracy of 75.6%); the largest classification errors occur in areas

  19. Ethical objections against including life-extension costs in cost-effectiveness analysis: a consistent approach.

    Science.gov (United States)

    Gandjour, Afschin; Müller, Dirk

    2014-10-01

    One of the major ethical concerns regarding cost-effectiveness analysis in health care has been the inclusion of life-extension costs ("it is cheaper to let people die"). For this reason, many analysts have opted to rule out life-extension costs from the analysis. However, surprisingly little has been written in the health economics literature regarding this ethical concern and the resulting practice. The purpose of this work was to present a framework and potential solution for ethical objections against life-extension costs. This work found three levels of ethical concern: (i) with respect to all life-extension costs (disease-related and -unrelated); (ii) with respect to disease-unrelated costs only; and (iii) regarding disease-unrelated costs plus disease-related costs not influenced by the intervention. Excluding all life-extension costs for ethical reasons would require, for reasons of consistency, a simultaneous exclusion of savings from reducing morbidity. At the other extreme, excluding only disease-unrelated life-extension costs for ethical reasons would require, again for reasons of consistency, the exclusion of health gains due to treatment of unrelated diseases. Therefore, addressing ethical concerns regarding the inclusion of life-extension costs necessitates fundamental changes in the calculation of cost effectiveness.
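The practical effect of including or excluding disease-unrelated life-extension costs can be illustrated with a toy incremental cost-effectiveness ratio (ICER) calculation; all numbers below are hypothetical:

```python
def icer(delta_cost, delta_qaly):
    """Incremental cost-effectiveness ratio: extra cost per
    quality-adjusted life-year (QALY) gained."""
    return delta_cost / delta_qaly

# Hypothetical numbers: the intervention adds 2 QALYs at 30,000 of
# intervention-related cost; survivors also incur 5,000 per extra
# life-year of disease-unrelated cost over those 2 extra years.
related = 30_000.0
unrelated = 5_000.0 * 2
gained = 2.0

excluding_unrelated = icer(related, gained)              # 15,000 per QALY
including_unrelated = icer(related + unrelated, gained)  # 20,000 per QALY
```

The choice of convention can move an intervention across a fixed willingness-to-pay threshold, which is why the consistency argument in the abstract matters.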

  20. Objective Analysis of Performance of Activities of Daily Living in People With Central Field Loss.

    Science.gov (United States)

    Pardhan, Shahina; Latham, Keziah; Tabrett, Daryl; Timmis, Matthew A

    2015-11-01

    People with central visual field loss (CFL) adopt various strategies to complete activities of daily living (ADL). Using objective movement analysis, we compared how three ADLs were completed by people with CFL and by age-matched, visually healthy individuals. Fourteen participants with CFL (age 81 ± 10 years) and 10 age-matched, visually healthy participants (age 75 ± 5 years) took part. Three ADLs were assessed: picking up food from a plate, pouring liquid from a bottle, and inserting a key in a lock. Participants with CFL completed each ADL habitually (as they would in their home). Data were compared with those of visually healthy participants, who were asked to complete the tasks as they would normally, but under specified experimental conditions. Movement kinematics were compared using three-dimensional motion analysis (Vicon). Visual functions (distance and near acuities, contrast sensitivity, visual fields) were recorded. All CFL participants were able to complete each ADL. However, participants with CFL demonstrated significantly (P approach. Various kinematic indices correlated significantly with visual function parameters, including visual acuity and midperipheral visual field loss.

  1. Aligning experimental design with bioinformatics analysis to meet discovery research objectives.

    Science.gov (United States)

    Kane, Michael D

    2002-01-01

    The utility of genomic technology and bioinformatic analytical support to provide new and needed insight into the molecular basis of disease, development, and diversity continues to grow as more research model systems and populations are investigated. Yet deriving results that meet a specific set of research objectives requires aligning or coordinating the design of the experiment, the laboratory techniques, and the data analysis. The following paragraphs describe several important interdependent factors that need to be considered to generate high quality data from the microarray platform. These factors include aligning oligonucleotide probe design with the sample labeling strategy if oligonucleotide probes are employed, recognizing that compromises are inherent in different sample procurement methods, normalizing 2-color microarray raw data, and distinguishing the difference between gene clustering and sample clustering. These factors do not represent an exhaustive list of technical variables in microarray-based research, but this list highlights those variables that span both experimental execution and data analysis. Copyright 2001 Wiley-Liss, Inc.

  2. Characterization of analysis activity in the development of object-oriented software. Application to an examination system in nuclear medicine

    International Nuclear Information System (INIS)

    Bayas, Marcos Raul Cordova.

    1995-01-01

    The object-oriented approach, formerly proposed as an alternative to conventional software coding techniques, has expanded its scope to other phases in software development, including the analysis phase. This work discusses basic concepts and major object oriented analysis methods, drawing comparisons with structured analysis, which has been the dominant paradigm in systems analysis. The comparison is based on three interdependent system aspects, that must be specified during the analysis phase: data, control and functionality. The specification of a radioisotope examination archive system is presented as a case study. (author). 45 refs., 87 figs., 1 tab

  3. MAPPING ERODED AREAS ON MOUNTAIN GRASSLAND WITH TERRESTRIAL PHOTOGRAMMETRY AND OBJECT-BASED IMAGE ANALYSIS

    Directory of Open Access Journals (Sweden)

    A. Mayr

    2016-06-01

    Full Text Available In the Alps, as well as in other mountain regions, steep grassland is frequently affected by shallow erosion. Often small landslides or snow movements displace the vegetation together with soil and/or unconsolidated material, resulting in bare earth surface patches within the grass-covered slope. Close-range and remote sensing techniques are promising for both mapping and monitoring these eroded areas. This is essential for a better geomorphological process understanding, to assess past and recent developments, and to plan mitigation measures. Recent developments in image matching techniques make it feasible to produce high-resolution orthophotos and digital elevation models from terrestrial oblique images. In this paper we propose to delineate the boundary of eroded areas for selected scenes of a study area, using close-range photogrammetric data. Striving for an efficient, objective and reproducible workflow for this task, we developed an approach for automated classification of the scenes into the classes grass and eroded. We propose an object-based image analysis (OBIA) workflow which consists of image segmentation and automated threshold selection for classification using the Excess Green Vegetation Index (ExG). The automated workflow was tested with ten different scenes. Compared to a manual classification, grass and eroded areas are classified with an overall accuracy between 90.7% and 95.5%, depending on the scene. The method proved to be insensitive to differences in scene illumination and grass greenness. The proposed workflow reduces user interaction and is transferable to other study areas. We conclude that close-range photogrammetry is a valuable low-cost tool for mapping this type of eroded area in the field with a high level of detail and quality. In the future, the output will be used as ground truth for area-wide mapping of eroded areas in coarser-resolution aerial orthophotos acquired at the same time.
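The ExG-based classification can be sketched as follows. The threshold here is fixed for illustration, whereas the paper's workflow selects it automatically per scene; the pixel values are hypothetical:

```python
def excess_green(r, g, b):
    """Excess Green index ExG = 2g - r - b on chromatic coordinates,
    i.e. channels normalized so r + g + b = 1 per pixel."""
    s = r + g + b
    if s == 0:
        return 0.0
    return (2.0 * g - r - b) / s

def classify(pixels, threshold=0.05):
    """Label each (R, G, B) pixel 'grass' if ExG exceeds the
    threshold, else 'eroded' (bare earth)."""
    return ['grass' if excess_green(*p) > threshold else 'eroded'
            for p in pixels]

# A saturated green pixel vs. a brownish bare-earth pixel.
labels = classify([(40, 180, 50), (120, 90, 60)])
```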

  4. An experimental analysis of design choices of multi-objective ant colony optimization algorithms

    OpenAIRE

    Lopez-Ibanez, Manuel; Stutzle, Thomas

    2012-01-01

    There have been several proposals on how to apply the ant colony optimization (ACO) metaheuristic to multi-objective combinatorial optimization problems (MOCOPs). This paper proposes a new formulation of these multi-objective ant colony optimization (MOACO) algorithms. This formulation is based on adding specific algorithm components for tackling multiple objectives to the basic ACO metaheuristic. Examples of these components are how to represent multiple objectives using pheromone and heuristic information...

  5. The Publication History of the "Journal of Organizational Behavior Management": An Objective Review and Analysis--1998-2009

    Science.gov (United States)

    VanStelle, Sarah E.; Vicars, Sara M.; Harr, Victoria; Miguel, Caio F.; Koerber, Jeana L.; Kazbour, Richard; Austin, John

    2012-01-01

    The purpose of this study was to extend into a third decade previous reviews conducted by Balcazar, Shupert, Daniels, Mawhinney, and Hopkins (1989) and Nolan, Jarema, and Austin (1999) of the "Journal of Organizational Behavior Management" ("JOBM"). Every article published in "JOBM" between 1998 and 2009 was objectively reviewed and analyzed for…

  6. Analysis and optimization with ecological objective function of irreversible single resonance energy selective electron heat engines

    International Nuclear Information System (INIS)

    Zhou, Junle; Chen, Lingen; Ding, Zemin; Sun, Fengrui

    2016-01-01

    The ecological performance of a single resonance energy selective electron (ESE) heat engine with heat leakage is analyzed by applying finite-time thermodynamics. By introducing the Nielsen function and numerical calculations, expressions for power output, efficiency, entropy generation rate and ecological objective function are derived; the relationships between ecological objective function and power output, between ecological objective function and efficiency, and between power output and efficiency are demonstrated; the influences of the system parameters of heat leakage, boundary energy and resonance width on the optimal performance are investigated in detail; and a specific range of boundary energy is given as a compromise to make the ESE heat engine system work in optimal operation regions. Comparing performance characteristics under different optimization objective functions clarifies the significance of selecting the ecological objective function as the design objective: when changing the design objective from maximum power output to maximum ecological objective function, the efficiency improves by 4.56% while the power output drops by only 2.68%; when changing the design objective from maximum efficiency to maximum ecological objective function, the power output improves by 229.13% while the efficiency drops by only 13.53%. - Highlights: • An irreversible single resonance energy selective electron heat engine is studied. • Heat leakage between two reservoirs is considered. • Power output, efficiency and ecological objective function are derived. • Optimal performance comparison for three objective functions is carried out.
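A common form of the ecological objective function in finite-time thermodynamics is the Angulo-Brown criterion E = P − T₀σ, trading power output against dissipation. Assuming that form is the one intended here (the abstract does not state it), a minimal sketch of choosing an operating point by this criterion:

```python
def ecological_function(power, entropy_rate, t_env):
    """Angulo-Brown-style ecological criterion E = P - T0 * sigma:
    power output P (W) minus the environment temperature T0 (K)
    times the entropy generation rate sigma (W/K)."""
    return power - t_env * entropy_rate

# Hypothetical operating points (P in W, sigma in W/K): the point
# maximizing E at T0 = 300 K is the compromise between maximum
# power and maximum efficiency that the abstract describes.
points = [(10.0, 0.02), (12.0, 0.04), (8.0, 0.005)]
best = max(points, key=lambda ps: ecological_function(*ps, 300.0))
```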

  7. Object methods of analysis and design: presentation of U R L ...

    African Journals Online (AJOL)

    Objects have pervaded the world of data processing, and no field has been untouched by their effects. The object approach originated in object-oriented programming, of which the languages Smalltalk and C++ are the best-known representatives. Its application subsequently spread to many fields, such as software engineering, ...

  8. Object-oriented analysis and design of a GEANT based detector simulator

    International Nuclear Information System (INIS)

    Amako, K.; Kanzaki, J.; Sasaki, T.; Takaiwa, Y.; Nakagawa, Y.; Yamagata, T.

    1994-01-01

    The authors give a status report of the project to design a detector simulation program by reengineering GEANT with the object-oriented methodology. They followed the Object Modeling Technique and explain the object model they constructed. Problems with the technique found during their study are also discussed

  9. Analysis of current research addressing complementary use of life-cycle assessment and risk assessment for engineered nanomaterials: have lessons been learned from previous experience with chemicals?

    DEFF Research Database (Denmark)

    Grieger, Khara Deanne; Laurent, Alexis; Miseljic, Mirko

    2012-01-01

    While it is generally agreed that successful strategies to address the health and environmental impacts of engineered nanomaterials (NM) should consider the well-established frameworks for conducting life-cycle assessment (LCA) and risk assessment (RA), scientific research, and specific guidance... of research focused on applying LCA and RA together for NM, it appears that current research efforts have taken into account some key "lessons learned" from previous experience with chemicals, while many key challenges remain for practically applying these methods to NM. We identified two main approaches for using these methods together for NM: "LC-based RA" (traditional RA applied in a life-cycle perspective) and "RA-complemented LCA" (conventional LCA supplemented by RA in specific life-cycle steps). Hence, the latter is the only identified approach which genuinely combines LC- and RA-based methods...

  10. Report on the French objectives of electricity consumption produced from renewable energy sources and on the analysis of their realization

    International Nuclear Information System (INIS)

    2007-01-01

    This report presents the French objectives for domestic consumption of electricity from renewable energy sources over the next ten years, together with an analysis of their realization, taking into account the climatic factors likely to affect the achievement of these objectives. It also discusses the adequacy of the actions with respect to the national commitment on climate change. (A.L.B.)

  11. Patient-specific electric field simulations and acceleration measurements for objective analysis of intraoperative stimulation tests in the thalamus

    Directory of Open Access Journals (Sweden)

    Simone Hemm-Ode

    2016-11-01

    Full Text Available Despite the increasing use of deep brain stimulation (DBS), its fundamental mechanisms of action remain largely unknown. Simulation of electric entities has previously been proposed for chronic DBS combined with subjective symptom evaluations, but not for intraoperative stimulation tests. The present paper introduces a method for the objective exploitation of intraoperative stimulation test data to identify the optimal implant position of the chronic DBS lead, by relating electric field simulations to the patient-specific anatomy and to the clinical effects quantified by accelerometry. To illustrate the feasibility of this approach, it was applied to five patients with essential tremor bilaterally implanted in the ventral intermediate nucleus (VIM). The VIM and its neighboring structures were preoperatively outlined in 3D on white matter attenuated inversion recovery MR images. Quantitative intraoperative clinical assessments were performed using accelerometry. Electric field simulations (n = 272) for intraoperative stimulation test data performed along two trajectories per side were set up using the finite element method for 143 stimulation test positions. The resulting electric field isosurface of 0.2 V/mm was superimposed on the outlined anatomical structures. The percentage of each structure's volume overlapped was calculated and related to the corresponding clinical improvement. The proposed concept has been successfully applied to the five patients. For higher clinical improvements, not only the VIM but also other neighboring structures were covered by the electric field isosurfaces. The percentages of the volumes of the VIM, the nucleus intermediate lateral of the thalamus and the prelemniscal radiations within the prerubral field of Forel increased for clinical improvements higher than 50% compared to improvements lower than 50%. The presented new concept allows a detailed and objective analysis of a high amount of intraoperative data to
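The coverage metric described (percentage of a structure's volume inside the 0.2 V/mm isosurface) can be sketched on aligned boolean voxel masks; this is a simplification of the actual FEM-based pipeline, with hypothetical mask data:

```python
def overlap_percentage(structure_mask, field_mask):
    """Percentage of a structure's voxels that lie inside the
    electric-field isosurface, given two aligned boolean voxel
    masks of equal length (flattened volumes)."""
    inside = sum(1 for s, f in zip(structure_mask, field_mask) if s and f)
    total = sum(1 for s in structure_mask if s)
    return 100.0 * inside / total if total else 0.0

# Toy example: a 4-voxel structure, half covered by the field.
coverage = overlap_percentage([True, True, True, True],
                              [True, True, False, False])
```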

  12. Pigments analysis and gold layer thickness evaluation of polychromy on wood objects by PXRF

    International Nuclear Information System (INIS)

    Blonski, M.S.; Appoloni, C.R.

    2014-01-01

    The X-ray fluorescence technique by energy dispersion (EDXRF), being a multi-elemental and non-destructive technique, has been widely used in the analysis of artworks and in archaeometry. A portable X-ray fluorescence instrument from the Laboratory of Applied Nuclear Physics of the State University of Londrina (LFNA/UEL) was used to measure pigments in golden parts of a Gilding Preparation Standard Plaque, as well as pigments on the Wood Adornment of the High Altar Column of the Side Pulpit of the Immaculate Conception Church Parish, Sao Paulo-SP. The portable X-ray fluorescence system PXRF-LFNA-02 consists of an X-ray tube with Ag anode, a Si-PIN detector (FWHM = 221 eV for the Mn line at 5.9 keV), a standard nuclear electronics chain for X-ray spectrometry, an 8K multichannel analyzer, a notebook, and a mechanical system for positioning the detector and X-ray tube, which allows movements with two degrees of freedom of the excitation-detection system. The excitation-detection times of the measurements were 100 and 500 s, respectively. The elements Ti, Cr, Fe, Cu, Zn and Au were found in the golden area of the Altar Column ornament. In addition, analysis of the ratios of the Kα/Kβ line intensities measured in these areas made it possible to explore the stratigraphy of the pigment layers and to estimate their thickness. - Highlights: • The X-ray fluorescence technique by energy dispersion (EDXRF) and a portable X-ray fluorescence instrument are used for the measurement of pigments. • Analysis of the ratios of the Kα/Kβ line intensities measured in the areas made it possible to explore the stratigraphy of the pigment layers and to estimate their thickness. • The result of the pigment analysis performed on these objects indicates that they are of the twentieth century
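The thickness estimate from Kα/Kβ ratios relies on the two lines, emitted at different energies, being attenuated differently by an overlayer. A minimal sketch under a single homogeneous-layer assumption; the coefficients in the test below are illustrative, not the paper's values:

```python
import math

def layer_thickness(r_measured, r_free, mu_alpha, mu_beta, density):
    """Estimate overlayer thickness t (cm) from the attenuation of
    a substrate element's Ka/Kb intensity ratio, assuming
        R_meas = R_free * exp(-(mu_a - mu_b) * rho * t),
    where mu_a, mu_b are the layer's mass attenuation coefficients
    (cm^2/g) at the Ka and Kb energies and rho its density (g/cm^3)."""
    return math.log(r_free / r_measured) / ((mu_alpha - mu_beta) * density)
```

Since the lower-energy Kα line is usually attenuated more strongly (mu_a > mu_b), the measured ratio falls below the unattenuated one, and the formula returns a positive thickness.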

  13. Bridge Crack Detection Using Multi-Rotary Uav and Object-Base Image Analysis

    Science.gov (United States)

    Rau, J. Y.; Hsiao, K. W.; Jhan, J. P.; Wang, S. H.; Fang, W. C.; Wang, J. L.

    2017-08-01

    Bridges are important infrastructure for human life; thus, bridge safety monitoring and maintenance is an important issue for the government. Conventionally, bridge inspection was conducted by in-situ visual examination. This procedure sometimes requires an under-bridge inspection vehicle or climbing under the bridge personally; its cost and risk are therefore high, and it is labor intensive and time consuming. In particular, its documentation procedure is subjective and lacks 3D spatial information. To cope with these challenges, this paper proposes the use of a multi-rotary UAV equipped with a SONY A7r2 high-resolution digital camera, a 50 mm fixed-focal-length lens, and a 135-degree up-down rotating gimbal. The target bridge contains three spans, in total 60 meters long, 20 meters wide and 8 meters above the water level. In the end, we took about 10,000 images, some of which were acquired hand-held from the ground using a pole 2-8 meters long. The images were processed by Agisoft PhotoscanPro to obtain exterior and interior orientation parameters. A local coordinate system was defined using 12 ground control points measured by a total station. After triangulation and camera self-calibration, the RMS error of the control points is less than 3 cm. A 3D CAD model describing the bridge surface geometry was manually measured in PhotoscanPro. It is composed of planar polygons and is used for searching related UAV images. Additionally, a photorealistic 3D model can be produced for 3D visualization. To detect cracks on the bridge surface, we utilize object-based image analysis (OBIA) to segment the image into objects. We then derive several object features, such as density, area/bounding-box ratio, length/width ratio, and length, and set up a classification rule set to distinguish cracks. Further, we apply semi-global matching (SGM) to obtain 3D crack information and based on image scale we
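The rule-set classification over object features can be sketched as a simple predicate: cracks are long, thin segments that fill little of their bounding box. The thresholds below are illustrative, not the paper's calibrated values:

```python
def is_crack(area, bbox_area, length, width,
             min_elongation=5.0, max_density=0.5):
    """Rule-set classifier in the spirit of OBIA crack detection:
    an object is a crack candidate if it is elongated
    (length/width high) and sparse within its bounding box
    (area/bbox_area low)."""
    density = area / bbox_area if bbox_area else 0.0
    elongation = length / width if width else float('inf')
    return elongation >= min_elongation and density <= max_density
```

Real rule sets typically chain several such feature tests and tune thresholds per image scale.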

  14. BRIDGE CRACK DETECTION USING MULTI-ROTARY UAV AND OBJECT-BASE IMAGE ANALYSIS

    Directory of Open Access Journals (Sweden)

    J. Y. Rau

    2017-08-01

    Full Text Available Bridges are important infrastructure for human life; thus, bridge safety monitoring and maintenance is an important issue for the government. Conventionally, bridge inspection was conducted by in-situ visual examination. This procedure sometimes requires an under-bridge inspection vehicle or climbing under the bridge personally; its cost and risk are therefore high, and it is labor intensive and time consuming. In particular, its documentation procedure is subjective and lacks 3D spatial information. To cope with these challenges, this paper proposes the use of a multi-rotary UAV equipped with a SONY A7r2 high-resolution digital camera, a 50 mm fixed-focal-length lens, and a 135-degree up-down rotating gimbal. The target bridge contains three spans, in total 60 meters long, 20 meters wide and 8 meters above the water level. In the end, we took about 10,000 images, some of which were acquired hand-held from the ground using a pole 2–8 meters long. The images were processed by Agisoft PhotoscanPro to obtain exterior and interior orientation parameters. A local coordinate system was defined using 12 ground control points measured by a total station. After triangulation and camera self-calibration, the RMS error of the control points is less than 3 cm. A 3D CAD model describing the bridge surface geometry was manually measured in PhotoscanPro. It is composed of planar polygons and is used for searching related UAV images. Additionally, a photorealistic 3D model can be produced for 3D visualization. To detect cracks on the bridge surface, we utilize object-based image analysis (OBIA) to segment the image into objects. We then derive several object features, such as density, area/bounding-box ratio, length/width ratio, and length, and set up a classification rule set to distinguish cracks. Further, we apply semi-global matching (SGM) to obtain 3D crack information and based

  15. Previously unknown species of Aspergillus.

    Science.gov (United States)

    Gautier, M; Normand, A-C; Ranque, S

    2016-08-01

    The use of multi-locus DNA sequence analysis has led to the description of previously unknown 'cryptic' Aspergillus species, whereas classical morphology-based identification of Aspergillus remains limited to the section or species-complex level. The current literature highlights two main features concerning these 'cryptic' Aspergillus species. First, the prevalence of such species in clinical samples is relatively high compared with emergent filamentous fungal taxa such as Mucorales, Scedosporium or Fusarium. Second, it is clearly important to identify these species in the clinical laboratory because of the high frequency of antifungal drug-resistant isolates of such Aspergillus species. Matrix-assisted laser desorption/ionization-time of flight mass spectrometry (MALDI-TOF MS) has recently been shown to enable the identification of filamentous fungi with an accuracy similar to that of DNA sequence-based methods. As MALDI-TOF MS is well suited to the routine clinical laboratory workflow, it facilitates the identification of these 'cryptic' Aspergillus species at the routine mycology bench. The rapid establishment of enhanced filamentous fungi identification facilities will lead to a better understanding of the epidemiology and clinical importance of these emerging Aspergillus species. Based on routine MALDI-TOF MS-based identification results, we provide original insights into the key interpretation issues of a positive Aspergillus culture from a clinical sample. Which ubiquitous species that are frequently isolated from air samples are rarely involved in human invasive disease? Can both the species and the type of biological sample indicate Aspergillus carriage, colonization or infection in a patient? Highly accurate routine filamentous fungi identification is central to enhance the understanding of these previously unknown Aspergillus species, with a vital impact on further improved patient care. Copyright © 2016 European Society of Clinical Microbiology and

  16. Sliding thin slab, minimum intensity projection imaging for objective analysis of emphysema

    International Nuclear Information System (INIS)

    Satoh, Shiro; Ohdama, Shinichi; Shibuya, Hitoshi

    2006-01-01

    The aim of this study was to determine whether sliding thin slab, minimum intensity projection (STS-MinIP) imaging is more advantageous than thin-section computed tomography (CT) for detecting and assessing emphysema. Objective quantification of emphysema by STS-MinIP and thin-section CT was defined as the percentage of lung area with attenuation below the threshold in the lung section at the level of the aortic arch, the tracheal carina, and 5 cm below the carina. Quantitative analysis in 100 subjects was performed and compared with pulmonary function test results. The ratio of the low-attenuation area in the lung measured by STS-MinIP was significantly higher than that found by thin-section CT (P<0.01). The difference between STS-MinIP and thin-section CT was statistically evident even for mild emphysema and increased as the low-attenuation area in the lung increased. Moreover, STS-MinIP showed a stronger regression relation with pulmonary function results than did thin-section CT (P<0.01). STS-MinIP can be recommended as a new morphometric method for detecting and assessing the severity of emphysema. (author)
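
The percentage-of-low-attenuation-area metric described above can be sketched in a few lines. The synthetic slice and the threshold value below are assumptions for illustration, not values from the study.

```python
import numpy as np

# Percentage low-attenuation area (%LAA): fraction of lung pixels whose CT
# value falls below a fixed threshold (here assumed to be -950 HU).
rng = np.random.default_rng(0)
lung_slice = rng.normal(-870, 40, size=(64, 64))   # synthetic HU values
lung_slice[:16, :16] = -980                        # a synthetic emphysematous patch
threshold = -950.0

laa_percent = 100.0 * np.mean(lung_slice < threshold)
print(f"%LAA = {laa_percent:.1f}%")
```

The study's comparison amounts to computing this percentage on both the STS-MinIP and thin-section CT reconstructions of the same section.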

  17. UAV-based urban structural damage assessment using object-based image analysis and semantic reasoning

    Science.gov (United States)

    Fernandez Galarreta, J.; Kerle, N.; Gerke, M.

    2015-06-01

    Structural damage assessment is critical after disasters but remains a challenge. Many studies have explored the potential of remote sensing data, but limitations of vertical data persist. Oblique imagery has been identified as more useful, though the multi-angle imagery also adds a new dimension of complexity. This paper addresses damage assessment based on multi-perspective, overlapping, very high resolution oblique images obtained with unmanned aerial vehicles (UAVs). 3-D point-cloud assessment for the entire building is combined with detailed object-based image analysis (OBIA) of façades and roofs. This research focuses not on automatic damage assessment, but on creating a methodology that supports the often ambiguous classification of intermediate damage levels, aiming at producing comprehensive per-building damage scores. We identify completely damaged structures in the 3-D point cloud, and for all other cases provide the OBIA-based damage indicators to be used as auxiliary information by damage analysts. The results demonstrate the usability of the 3-D point-cloud data to identify major damage features. Also the UAV-derived and OBIA-processed oblique images are shown to be a suitable basis for the identification of detailed damage features on façades and roofs. Finally, we also demonstrate the possibility of aggregating the multi-perspective damage information at building level.

  18. A cluster analysis of patterns of objectively measured physical activity in Hong Kong.

    Science.gov (United States)

    Lee, Paul H; Yu, Ying-Ying; McDowell, Ian; Leung, Gabriel M; Lam, T H

    2013-08-01

    The health benefits of exercise are clear. In targeting interventions it would be valuable to know whether characteristic patterns of physical activity (PA) are associated with particular population subgroups. The present study used cluster analysis to identify characteristic hourly PA patterns measured by accelerometer. Cross-sectional design. Objectively measured PA in Hong Kong adults. Four-day accelerometer data were collected during 2009 to 2011 for 1714 participants in Hong Kong (mean age 44.2 years, 45.9% male). Two clusters were identified, one more active than the other. The ‘active cluster’ (n 480) was characterized by a routine PA pattern on weekdays and a more active and varied pattern on weekends; the other, the ‘less active cluster’ (n 1234), by a consistently low PA pattern on both weekdays and weekends with little variation from day to day. Demographic, lifestyle, PA level and health characteristics of the two clusters were compared. They differed in age, sex, smoking, income and level of PA required at work. The odds of having any chronic health conditions were lower for the active group (adjusted OR = 0.62; 95% CI 0.46, 0.84) but the two groups did not differ in terms of specific chronic health conditions or obesity. Implications are drawn for targeting exercise promotion programmes at the population level.
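
Clustering participants by their hourly activity profiles, as in this study, can be sketched with a minimal k-means on 24-dimensional day profiles. The synthetic profiles, noise levels and k = 2 below are assumptions for illustration.

```python
import numpy as np

# Minimal k-means (k = 2) on synthetic hourly activity profiles, separating an
# "active" (evening peak) group from a "less active" (flat, low) group.
rng = np.random.default_rng(1)
hours = np.arange(24)
active = 200 + 150 * np.exp(-((hours - 18) ** 2) / 8.0)   # evening-active profile
inactive = np.full(24, 80.0)                              # flat low-activity profile
X = np.vstack([active + rng.normal(0, 10, (30, 24)),
               inactive + rng.normal(0, 10, (40, 24))])

k = 2
centers = X[[0, len(X) - 1]].copy()          # seed one center from each end
for _ in range(20):
    labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(-1), axis=1)
    centers = np.array([X[labels == j].mean(0) for j in range(k)])

sizes = np.bincount(labels, minlength=k)
print("cluster sizes:", sizes)
```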

  19. Change Analysis and Decision Tree Based Detection Model for Residential Objects across Multiple Scales

    Directory of Open Access Journals (Sweden)

    CHEN Liyan

    2018-03-01

    Full Text Available Change analysis and detection plays an important role in the updating of multi-scale databases. When overlaying an updated larger-scale dataset and a to-be-updated smaller-scale dataset, people usually focus on temporal changes caused by the evolution of spatial entities; little attention is paid to representation changes introduced by map generalization. Using polygonal building data as an example, this study examines the changes from different perspectives, such as the reasons for their occurrence and the forms they take. Based on this knowledge, we employ a decision tree from the field of machine learning to establish a change detection model. The aim of the proposed model is to distinguish temporal changes, which need to be applied as updates to the smaller-scale dataset, from representation changes. The proposed method is validated through tests using real-world building data from Guangzhou city. The experimental results show the overall precision of change detection is more than 90%, which indicates our method is effective in identifying changed objects.
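
The decision-tree idea can be illustrated with the smallest possible tree, a single learned split (a decision stump) that separates temporal from representation changes. The two features and their value ranges below are invented for illustration; the study's real model uses more features and deeper trees.

```python
import numpy as np

def best_stump(X, y):
    """Exhaustively pick the (feature, threshold) split with lowest Gini impurity."""
    def gini(labels):
        if len(labels) == 0:
            return 0.0
        p = np.mean(labels)
        return 2 * p * (1 - p)
    best = (None, None, np.inf)
    for f in range(X.shape[1]):
        for t in np.unique(X[:, f]):
            left, right = y[X[:, f] <= t], y[X[:, f] > t]
            score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
            if score < best[2]:
                best = (f, t, score)
    return best[0], best[1]

rng = np.random.default_rng(2)
# feature 0: area-change ratio; feature 1: shape-similarity index (both assumed)
temporal = np.column_stack([rng.uniform(0.4, 1.0, 50), rng.uniform(0.0, 0.5, 50)])
represent = np.column_stack([rng.uniform(0.0, 0.15, 50), rng.uniform(0.6, 1.0, 50)])
X = np.vstack([temporal, represent])
y = np.array([1] * 50 + [0] * 50)            # 1 = temporal change (real update)

f, t = best_stump(X, y)
pred = (X[:, f] > t).astype(int)
accuracy = max(np.mean(pred == y), np.mean(pred != y))
print(f"split on feature {f} at {t:.3f}, accuracy {accuracy:.2f}")
```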

  20. Cognition and objectively measured sleep duration in children: a systematic review and meta-analysis.

    Science.gov (United States)

    Short, Michelle A; Blunden, Sarah; Rigney, Gabrielle; Matricciani, Lisa; Coussens, Scott; M Reynolds, Chelsea; Galland, Barbara

    2018-06-01

    Sleep recommendations are widely used to guide communities on children's sleep needs. Following recent adjustments to guidelines by the National Sleep Foundation and the subsequent consensus statement by the American Academy of Sleep Medicine, we undertook a systematic literature search to evaluate the current evidence regarding relationships between objectively measured sleep duration and cognitive function in children aged 5 to 13 years. Cognitive function included measures of memory, attention, processing speed, and intelligence. Keyword searches of 7 databases to December 2016 identified 23 studies meeting inclusion criteria out of 137 full articles reviewed, 19 of which were suitable for meta-analysis. A significant effect (r = .06) was found between sleep duration and cognition, suggesting that longer sleep durations were associated with better cognitive functioning. Analyses of different cognitive domains revealed that full/verbal IQ was significantly associated with sleep loss, but memory, fluid IQ, processing speed and attention were not. Comparison of study sleep durations with current sleep recommendations showed that most children studied had sleep durations that were not within the range of recommended sleep. As such, the true effect of sleep loss on cognitive function may be obscured in these samples, as most children were sleep restricted. Future research using more rigorous experimental methodologies is needed to properly elucidate the relationship between sleep duration and cognition in this age group. Copyright © 2018 National Sleep Foundation. Published by Elsevier Inc. All rights reserved.
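
Pooling correlation coefficients such as the r = .06 reported here is conventionally done via Fisher's z transform with inverse-variance weights. The study correlations and sample sizes below are invented for illustration, not the 19 studies in this meta-analysis.

```python
import math

# Fixed-effect pooling of correlations via Fisher's z transform:
# z = atanh(r), weight w = n - 3, pooled z = sum(w*z)/sum(w), back-transform.
studies = [(0.10, 120), (0.04, 300), (0.08, 80), (0.02, 250)]  # (r, n), assumed

num = den = 0.0
for r, n in studies:
    z = 0.5 * math.log((1 + r) / (1 - r))   # Fisher z transform of r
    w = n - 3                               # inverse-variance weight for z
    num += w * z
    den += w
z_pooled = num / den
r_pooled = (math.exp(2 * z_pooled) - 1) / (math.exp(2 * z_pooled) + 1)
print(f"pooled r = {r_pooled:.3f}")
```

A random-effects model would additionally estimate between-study variance before weighting; the fixed-effect version shown is the simplest case.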

  1. Data Quality Objectives for Regulatory Requirements for Hazardous and Radioactive Air Emissions Sampling and Analysis

    International Nuclear Information System (INIS)

    MULKEY, C.H.

    1999-01-01

    This document describes the results of the data quality objective (DQO) process undertaken to define data needs for state and federal requirements associated with toxic, hazardous, and/or radiological air emissions under the jurisdiction of the River Protection Project (RPP). Hereafter, this document is referred to as the Air DQO. The primary drivers for characterization under this DQO are the regulatory requirements pursuant to Washington State regulations that may require sampling and analysis. The federal regulations concerning air emissions are incorporated into the Washington State regulations. Data needs exist for nonradioactive and radioactive waste constituents and characteristics as identified through the DQO process described in this document. The purpose is to identify current data needs for complying with regulatory drivers for the measurement of air emissions from RPP facilities in support of air permitting. These drivers include best management practices, and similar analyses may have more than one regulatory driver. This document should not be used for determining overall compliance with regulations because the regulations are in constant change, and this document may not reflect the latest regulatory requirements. Regulatory requirements are also expected to change as various permits are issued. Data needs require sampling of both radionuclide and nonradionuclide analytes of air emissions from tanks and stored waste containers. The collection of data is to support environmental permitting and compliance, not health and safety issues.

  2. Calculating potential error in sodium MRI with respect to the analysis of small objects.

    Science.gov (United States)

    Stobbe, Robert W; Beaulieu, Christian

    2018-06-01

    To facilitate correct interpretation of sodium MRI measurements, calculation of error with respect to rapid signal decay is introduced and combined with that of spatially correlated noise to assess volume-of-interest (VOI) 23Na signal measurement inaccuracies, particularly for small objects. Noise and signal decay-related error calculations were verified using twisted projection imaging and a specially designed phantom with different sized spheres of constant elevated sodium concentration. As a demonstration, lesion signal measurement variation (5 multiple sclerosis participants) was compared with that predicted from calculation. Both theory and phantom experiment showed that the VOI signal measurement in a large 10-mL, 314-voxel sphere was 20% less than expected on account of point-spread-function smearing when the VOI was drawn to include the full sphere. VOI contraction reduced this error but increased noise-related error. Errors were even greater for smaller spheres (40-60% less than expected for a 0.35-mL, 11-voxel sphere). Image-intensity VOI measurements varied and increased with multiple sclerosis lesion size in a manner similar to that predicted from theory. The correlation suggests large underestimation of the 23Na signal in small lesions. Acquisition-specific measurement error calculation aids 23Na MRI data analysis and highlights the limitations of current low-resolution methodologies. Magn Reson Med 79:2968-2977, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
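
The point-spread-function smearing effect the record describes can be reproduced in a toy 1-D setting: blur a small high-signal object with a PSF and the VOI mean inside the object underestimates the true signal, with worse bias for smaller objects. The profile, PSF width and object sizes below are assumptions, not the paper's acquisition model.

```python
import numpy as np

def voi_mean(object_width, psf_width=5.0, n=256):
    """Mean measured signal inside a unit-intensity object after PSF blurring."""
    x = np.arange(n)
    in_obj = (x > n / 2 - object_width / 2) & (x < n / 2 + object_width / 2)
    profile = in_obj.astype(float)                       # true signal is 1.0 inside
    psf = np.exp(-0.5 * ((x - n / 2) / psf_width) ** 2)  # Gaussian PSF
    psf /= psf.sum()
    blurred = np.convolve(profile, psf, mode="same")
    return blurred[in_obj].mean()

small, large = voi_mean(8), voi_mean(64)
print(f"measured/true signal: small object {small:.2f}, large object {large:.2f}")
```

The small object loses far more of its apparent signal than the large one, mirroring the 40-60% versus 20% underestimation reported for the phantom spheres.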

  3. Application of Object Based Classification and High Resolution Satellite Imagery for Savanna Ecosystem Analysis

    Directory of Open Access Journals (Sweden)

    Jane Southworth

    2010-12-01

    Full Text Available Savanna ecosystems are an important component of dryland regions and yet are exceedingly difficult to study using satellite imagery. Savannas are composed of varying amounts of trees, shrubs and grasses, and traditional classification schemes or vegetation indices typically cannot differentiate across class types. This research utilizes object-based classification (OBC) for a region in Namibia, using IKONOS imagery, to help differentiate tree canopies, and therefore woodland savanna, from shrub or grasslands. The methodology involved the identification and isolation of tree canopies within the imagery; the resulting tree polygon layers had an overall accuracy of 84%. In addition, the results were scaled up to a corresponding Landsat image of the same region, and the OBC results compared to corresponding pixel values of NDVI. The results were not compelling, indicating once more the problems of these traditional image analysis techniques for savanna ecosystems. Overall, the use of OBC holds great promise for this ecosystem and could be utilized more frequently in studies of vegetation structure.

  4. Objective classification of ecological status in marine water bodies using ecotoxicological information and multivariate analysis.

    Science.gov (United States)

    Beiras, Ricardo; Durán, Iria

    2014-12-01

    Some relevant shortcomings have been identified in the current approach for the classification of ecological status in marine water bodies, leading to delays in the fulfillment of the Water Framework Directive objectives. Natural variability makes it difficult to set fixed reference values and boundary values for the Ecological Quality Ratios (EQR) of the biological quality elements. Biological responses to environmental degradation are frequently nonmonotonic in nature, hampering the EQR approach. Community structure traits respond only once ecological damage has already been done and do not provide early warning signals. An alternative methodology for the classification of ecological status is proposed that integrates chemical measurements, ecotoxicological bioassays and community structure traits (species richness and diversity), using multivariate analyses (multidimensional scaling and cluster analysis). This approach does not depend on the arbitrary definition of fixed reference values and EQR boundary values, and it is suitable for integrating nonlinear, sensitive signals of ecological degradation. As a disadvantage, this approach demands the inclusion of sampling sites representing the full range of ecological status in each monitoring campaign. National or international agencies in charge of coastal pollution monitoring have comprehensive data sets available to overcome this limitation.
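
The multivariate step can be sketched with classical multidimensional scaling of a between-site dissimilarity matrix, so that sites with similar chemistry/toxicity/diversity profiles plot close together. The four-site dissimilarity matrix below is invented for illustration.

```python
import numpy as np

# Classical MDS: double-centre the squared dissimilarities, then embed sites
# using the top eigenvectors of the resulting Gram matrix.
D = np.array([[0.0, 0.2, 0.9, 1.0],
              [0.2, 0.0, 0.8, 0.9],
              [0.9, 0.8, 0.0, 0.3],
              [1.0, 0.9, 0.3, 0.0]])      # e.g. reference, good, degraded, bad sites

n = len(D)
J = np.eye(n) - np.ones((n, n)) / n        # centring matrix
B = -0.5 * J @ (D ** 2) @ J                # double-centred Gram matrix
vals, vecs = np.linalg.eigh(B)
order = np.argsort(vals)[::-1]             # eigenvalues, largest first
coords = vecs[:, order[:2]] * np.sqrt(np.maximum(vals[order[:2]], 0))

d01 = np.linalg.norm(coords[0] - coords[1])   # two similar sites
d03 = np.linalg.norm(coords[0] - coords[3])   # reference vs most degraded
print("embedded distances: similar sites", round(d01, 2), "vs dissimilar", round(d03, 2))
```

A hierarchical clustering of the same dissimilarities would then group the embedded sites into status classes without fixed EQR boundaries.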

  5. Evaluation of the US Food and Drug Administration sentinel analysis tools in confirming previously observed drug-outcome associations: The case of clindamycin and Clostridium difficile infection.

    Science.gov (United States)

    Carnahan, Ryan M; Kuntz, Jennifer L; Wang, Shirley V; Fuller, Candace; Gagne, Joshua J; Leonard, Charles E; Hennessy, Sean; Meyer, Tamra; Archdeacon, Patrick; Chen, Chih-Ying; Panozzo, Catherine A; Toh, Sengwee; Katcoff, Hannah; Woodworth, Tiffany; Iyer, Aarthi; Axtman, Sophia; Chrischilles, Elizabeth A

    2018-03-13

    The Food and Drug Administration's Sentinel System developed parameterized, reusable analytic programs for evaluation of medical product safety. Research on outpatient antibiotic exposures and Clostridium difficile infection (CDI) with non-user reference groups led us to expect a higher rate of CDI among outpatient clindamycin users vs penicillin users. We evaluated the ability of the Cohort Identification and Descriptive Analysis and Propensity Score Matching tools to identify a higher rate of CDI among clindamycin users. We matched new users of outpatient dispensings of oral clindamycin or penicillin from 13 Data Partners 1:1 on propensity score and followed them for up to 60 days for development of CDI. We used Cox proportional hazards regression stratified by Data Partner and matched pair to compare CDI incidence. Propensity score models at 3 Data Partners had convergence warnings and a limited range of predicted values. We excluded these Data Partners despite adequate covariate balance after matching. From the 10 Data Partners where these models converged without warnings, we identified 807 919 new clindamycin users and 8 815 441 new penicillin users eligible for the analysis. The stratified analysis of 807 769 matched pairs included 840 events among clindamycin users and 290 among penicillin users (hazard ratio 2.90, 95% confidence interval 2.53, 3.31). This evaluation produced an expected result and identified several potential enhancements to the Propensity Score Matching tool. This study has important limitations. CDI risk may have been related to factors other than the inherent properties of the drugs, such as duration of use or subsequent exposures. Copyright © 2018 John Wiley & Sons, Ltd.
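
The 1:1 matching step can be sketched as greedy nearest-neighbour matching on the propensity score within a caliper. The scores below are synthetic; the real Sentinel tool estimates them with logistic regression on baseline covariates, and the caliper value here is an assumption.

```python
import numpy as np

# Greedy 1:1 propensity-score matching with a caliper: each treated subject is
# matched to the nearest unused comparator whose score lies within the caliper.
rng = np.random.default_rng(3)
ps_treated = rng.uniform(0.2, 0.8, 20)      # clindamycin-like (treated) scores
ps_control = rng.uniform(0.1, 0.9, 200)     # penicillin-like (comparator) scores
caliper = 0.01

available = np.ones(len(ps_control), dtype=bool)
pairs = []
for i, p in enumerate(ps_treated):
    d = np.abs(ps_control - p)
    d[~available] = np.inf                  # each control may be used only once
    j = int(np.argmin(d))
    if d[j] <= caliper:
        pairs.append((i, j))
        available[j] = False

print(f"matched {len(pairs)} of {len(ps_treated)} treated subjects")
```

After matching, outcomes are compared within pairs, which is what the stratified Cox regression in the record does.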

  6. Structural analysis of eight novel and 112 previously reported missense mutations in the interactive FXI mutation database reveals new insight on FXI deficiency.

    Science.gov (United States)

    Saunders, Rebecca E; Shiltagh, Nuha; Gomez, Keith; Mellars, Gillian; Cooper, Carolyn; Perry, David J; Tuddenham, Edward G; Perkins, Stephen J

    2009-08-01

    Factor XI (FXI) functions in blood coagulation. FXI is composed of four apple (Ap) domains and a serine protease (SP) domain. Deficiency of FXI leads to an injury-related bleeding disorder, which is remarkable for the lack of correlation between bleeding symptoms and FXI coagulant activity (FXI:C). The number of mutations previously reported in our interactive web database (http://www.FactorXI.org) is now significantly increased to 183 through our new patient studies and from literature surveys. Eight novel missense mutations give a total of 120 throughout the FXI gene (F11). The most abundant defects in FXI are revealed to be those from low-protein plasma levels (Type I: CRM-) that originate from protein misfolding, rather than from functional defects (Type II: CRM+). A total of 70 Ap missense mutations were analysed using a consensus Ap domain structure generated from the FXI dimer crystal structure. This showed that all parts of the Ap domain were affected. The 47 SP missense mutations were also distributed throughout the SP domain structure. The periphery of the Ap beta-sheet structure is sensitive to structural perturbation caused by residue changes throughout the Ap domain, yet this beta-sheet is crucial for FXI dimer formation. Residues located at the Ap4:Ap4 interface in the dimer are much less directly involved. We conclude that the abundance of Type I defects in FXI results from the sensitivity of Ap domain folding to residue changes within it, and we discuss how structural knowledge of the mutations improves our understanding of FXI deficiencies.

  7. Object-oriented analysis and design of a health care management information system.

    Science.gov (United States)

    Krol, M; Reich, D L

    1999-04-01

    We have created a prototype for a universal object-oriented model of a health care system compatible with the object-oriented approach used in version 3.0 of the HL7 standard for communication messages. A set of three models has been developed: (1) the Object Model describes the hierarchical structure of objects in a system--their identity, relationships, attributes, and operations; (2) the Dynamic Model represents the sequence of operations in time as a collection of state diagrams for object classes in the system; and (3) the Functional Model represents the transformation of data within a system by means of data flow diagrams. Within these models, we have defined major object classes of health care participants and their subclasses, associations, attributes and operators, states, and behavioral scenarios. We have also defined the major processes and subprocesses. The top-down design approach allows use, reuse, and cloning of standard components.

  8. A functional analysis of photo-object matching skills of severely retarded adolescents.

    OpenAIRE

    Dixon, L S

    1981-01-01

    Matching-to-sample procedures were used to assess picture representation skills of severely retarded, nonverbal adolescents. Identity matching within the classes of objects and life-size, full-color photos of the objects was first used to assess visual discrimination, a necessary condition for picture representation. Picture representation was then assessed through photo-object matching tasks. Five students demonstrated visual discrimination (identity matching) within the two classes of photo...

  9. Operational Automatic Remote Sensing Image Understanding Systems: Beyond Geographic Object-Based and Object-Oriented Image Analysis (GEOBIA/GEOOIA. Part 1: Introduction

    Directory of Open Access Journals (Sweden)

    Andrea Baraldi

    2012-09-01

    Full Text Available According to existing literature and despite their commercial success, state-of-the-art two-stage non-iterative geographic object-based image analysis (GEOBIA) systems and three-stage iterative geographic object-oriented image analysis (GEOOIA) systems, where GEOOIA ⊃ GEOBIA, remain affected by a lack of productivity, general consensus and research. To outperform the degree of automation, accuracy, efficiency, robustness, scalability and timeliness of existing GEOBIA/GEOOIA systems in compliance with the Quality Assurance Framework for Earth Observation (QA4EO) guidelines, this methodological work is split into two parts. The present first paper provides a multi-disciplinary Strengths, Weaknesses, Opportunities and Threats (SWOT) analysis of the GEOBIA/GEOOIA approaches that augments similar analyses proposed in recent years. In line with constraints stemming from human vision, this SWOT analysis promotes a shift of learning paradigm in the pre-attentive vision first stage of a remote sensing (RS) image understanding system (RS-IUS), from sub-symbolic statistical model-based (inductive) image segmentation to symbolic physical model-based (deductive) image preliminary classification. Hence, a symbolic deductive pre-attentive vision first stage accomplishes image sub-symbolic segmentation and image symbolic pre-classification simultaneously. In the second part of this work a novel hybrid (combined deductive and inductive) RS-IUS architecture featuring a symbolic deductive pre-attentive vision first stage is proposed and discussed in terms of: (a) computational theory (system design); (b) information/knowledge representation; (c) algorithm design; and (d) implementation. As proof of concept of the symbolic physical model-based pre-attentive vision first stage, the spectral knowledge-based, operational, near real-time Satellite Image Automatic Mapper™ (SIAM™) is selected from existing literature. To the best of these authors' knowledge, this is the first time a

  10. Fludarabine-based versus CHOP-like regimens with or without rituximab in patients with previously untreated indolent lymphoma: a retrospective analysis of safety and efficacy

    Directory of Open Access Journals (Sweden)

    Xu XX

    2013-10-01

    Full Text Available Xiao-xiao Xu,1 Bei Yan,2 Zhen-xing Wang,3 Yong Yu,1 Xiao-xiong Wu,2 Yi-zhuo Zhang1 1Department of Hematology, Tianjin Medical University Cancer Institute and Hospital, Tianjin Key Laboratory of Cancer Prevention and Therapy, Tianjin, 2Department of Hematology, First Affiliated Hospital of Chinese People's Liberation Army General Hospital, Beijing, 3Department of Stomach Oncology, Tianjin Medical University Cancer Institute and Hospital, Key Laboratory of Cancer Prevention and Therapy, Tianjin, People's Republic of China. Abstract: Fludarabine-based regimens and CHOP-like (doxorubicin, cyclophosphamide, vincristine, prednisone) regimens with or without rituximab are the most common treatment modalities for indolent lymphoma. However, there is no clear evidence to date about which chemotherapy regimen should be the proper initial treatment of indolent lymphoma. More recently, the use of fludarabine has raised concerns due to its high number of toxicities, especially hematological toxicity and infectious complications. The present study aimed to retrospectively evaluate both the efficacy and the potential toxicities of the two main regimens (fludarabine-based and CHOP-like) in patients with previously untreated indolent lymphoma. Among a total of 107 patients assessed, 54 patients received fludarabine-based regimens (FLU arm) and 53 received CHOP or CHOPE (doxorubicin, cyclophosphamide, vincristine, prednisone, plus etoposide in CHOPE) regimens (CHOP arm). The results demonstrated that fludarabine-based regimens could induce significantly improved progression-free survival (PFS) compared with CHOP-like regimens. However, the FLU arm showed overall survival, complete response, and overall response rates similar to those of the CHOP arm. Grade 3–4 neutropenia occurred in 42.6% of the FLU arm and 7.5% of the CHOP arm (P 60 years and presentation of grade 3–4 myelosuppression were the independent factors for infection, and the FLU arm had significantly

  11. Error analysis of marker-based object localization using a single-plane XRII

    International Nuclear Information System (INIS)

    Habets, Damiaan F.; Pollmann, Steven I.; Yuan, Xunhua; Peters, Terry M.; Holdsworth, David W.

    2009-01-01

    The role of imaging and image guidance is increasing in surgery and therapy, including treatment planning and follow-up. Fluoroscopy is used for two-dimensional (2D) guidance or localization; however, many procedures would benefit from three-dimensional (3D) guidance or localization. Three-dimensional computed tomography (CT) using a C-arm mounted x-ray image intensifier (XRII) can provide high-quality 3D images; however, patient dose and the required acquisition time restrict the number of 3D images that can be obtained. C-arm based 3D CT is therefore limited in applications for x-ray based image guidance or dynamic evaluations. 2D-3D model-based registration, using a single-plane 2D digital radiographic system, does allow for rapid 3D localization. It is our goal to investigate - over a clinically practical range - the impact of x-ray exposure on the resulting range of 3D localization precision. In this paper it is assumed that the tracked instrument incorporates a rigidly attached 3D object with a known configuration of markers. A 2D image is obtained by a digital fluoroscopic x-ray system and corrected for XRII distortions (±0.035 mm) and mechanical C-arm shift (±0.080 mm). A least-squares projection-Procrustes analysis is then used to calculate the 3D position using the measured 2D marker locations. The effect of x-ray exposure on the precision of 2D marker localization and on 3D object localization was investigated using numerical simulations and x-ray experiments. The results show a nearly linear relationship between 2D marker localization precision and 3D localization precision. However, a significant amplification of error, nonuniformly distributed among the three major axes, occurs, as demonstrated. To obtain a 3D localization error of less than ±1.0 mm for an object with 20 mm marker spacing, the 2D localization precision must be better than ±0.07 mm. This requirement was met for all investigated nominal x-ray exposures at 28 cm FOV, and
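
The core Procrustes step, least-squares estimation of a rigid pose from a known marker configuration, can be sketched in the simpler 3D-to-3D case (Kabsch algorithm); the paper's projection-Procrustes variant additionally handles the 2D projection. Marker geometry, pose and noise level below are assumptions for illustration.

```python
import numpy as np

# Kabsch/Procrustes: recover rotation R and translation t mapping a known
# marker model onto noisy measured marker positions, in the least-squares sense.
rng = np.random.default_rng(4)
markers = rng.uniform(-10, 10, (6, 3))                  # known model (~20 mm span)

theta = np.deg2rad(30)                                  # ground-truth pose
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
t_true = np.array([5.0, -2.0, 3.0])
measured = markers @ R_true.T + t_true + rng.normal(0, 0.01, (6, 3))

# centre both point sets, SVD of the covariance, then re-derive the translation
mc, dc = markers.mean(0), measured.mean(0)
U, _, Vt = np.linalg.svd((markers - mc).T @ (measured - dc))
d = np.sign(np.linalg.det(Vt.T @ U.T))                  # guard against reflection
R_est = Vt.T @ np.diag([1, 1, d]) @ U.T
t_est = dc - R_est @ mc

print("max rotation-matrix error:", np.abs(R_est - R_true).max())
```

Noise in the marker measurements propagates into the estimated pose, which is the error-amplification effect the record quantifies.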

  12. Artificial intelligence applied to the automatic analysis of absorption spectra. Objective measurement of the fine structure constant

    Science.gov (United States)

    Bainbridge, Matthew B.; Webb, John K.

    2017-06-01

    A new and automated method is presented for the analysis of high-resolution absorption spectra. Three established numerical methods are unified into one `artificial intelligence' process: a genetic algorithm (Genetic Voigt Profile FIT, gvpfit); non-linear least-squares with parameter constraints (vpfit); and Bayesian model averaging (BMA). The method has broad application but here we apply it specifically to the problem of measuring the fine structure constant at high redshift. For this we need objectivity and reproducibility. gvpfit is also motivated by the importance of obtaining a large statistical sample of measurements of Δα/α. Interactive analyses are both time consuming and complex and automation makes obtaining a large sample feasible. In contrast to previous methodologies, we use BMA to derive results using a large set of models and show that this procedure is more robust than a human picking a single preferred model since BMA avoids the systematic uncertainties associated with model choice. Numerical simulations provide stringent tests of the whole process and we show using both real and simulated spectra that the unified automated fitting procedure out-performs a human interactive analysis. The method should be invaluable in the context of future instrumentation like ESPRESSO on the VLT and indeed future ELTs. We apply the method to the zabs = 1.8389 absorber towards the zem = 2.145 quasar J110325-264515. The derived constraint of Δα/α = 3.3 ± 2.9 × 10-6 is consistent with no variation and also consistent with the tentative spatial variation reported in Webb et al. and King et al.

  13. The dynamic relationship between current and previous severe hypoglycemic events: a lagged dependent variable analysis among patients with type 2 diabetes who have initiated basal insulin.

    Science.gov (United States)

    Ganz, Michael L; Li, Qian; Wintfeld, Neil S; Lee, Yuan-Chi; Sorli, Christopher; Huang, Joanna C

    2015-01-01

    Past studies have found episodes of severe hypoglycemia (SH) to be serially dependent. Those studies, however, only considered the impact of a single (index) event on future risk; few have analyzed SH risk as it evolves over time in the presence (or absence) of continuing events. The objective of this study was to determine the dynamic risks of SH events conditional on preceding SH events among patients with type 2 diabetes (T2D) who have initiated basal insulin. We used an electronic health records database from the United States that included encounter and laboratory data and clinical notes on T2D patients who initiated basal insulin therapy between 2008 and 2011 and to identify SH events. We used a repeated-measures lagged dependent variable logistic regression model to estimate the impact of SH in one quarter on the risk of SH in the next quarter. We identified 7235 patients with T2D who initiated basal insulin. Patients who experienced ≥1 SH event during any quarter were more likely to have ≥1 SH event during the subsequent quarter than those who did not (predicted probabilities of 7.4% and 1.0%, respectively; p history of SH before starting basal insulin (predicted probabilities of 1.0% and 3.2%, respectively; p history of SH during the titration period (predicted probabilities of 1.1% and 2.8%, respectively; p history of SH events and therefore the value of preventing one SH event may be substantial. These results can inform patient care by providing clinicians with dynamic data on a patient's risk of SH, which in turn can facilitate appropriate adjustment of the risk-benefit ratio for individualized patient care. These results should, however, be interpreted in light of the key limitations of our study: not all SH events may have been captured or coded in the database, data on filled prescriptions were not available, we were unable to adjust for basal insulin dose, and the post-titration follow-up period could have divided into time units other
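
The serial dependence described here can be illustrated by simulating quarterly event indicators whose probability depends on the previous quarter, then recovering the two conditional risks empirically. The probabilities below are invented, loosely echoing the 7.4% versus 1.0% predicted probabilities in the record, and the simulation is not the paper's regression model.

```python
import numpy as np

# Simulate quarterly severe-hypoglycemia (SH) indicators with a lagged effect:
# a higher event probability in the quarter following an event quarter.
rng = np.random.default_rng(5)
p_after_event, p_after_quiet = 0.074, 0.010   # assumed conditional risks

n_patients, n_quarters = 5000, 8
events = np.zeros((n_patients, n_quarters), dtype=int)
events[:, 0] = rng.random(n_patients) < 0.02
for q in range(1, n_quarters):
    p = np.where(events[:, q - 1] == 1, p_after_event, p_after_quiet)
    events[:, q] = rng.random(n_patients) < p

# Empirical risk in quarter q conditional on the quarter q-1 outcome
prev, curr = events[:, :-1].ravel(), events[:, 1:].ravel()
risk_after_event = curr[prev == 1].mean()
risk_after_quiet = curr[prev == 0].mean()
print(f"risk after SH quarter: {risk_after_event:.3f}; after quiet quarter: {risk_after_quiet:.3f}")
```

A lagged dependent variable logistic regression, as in the study, estimates exactly this kind of conditional risk while also adjusting for patient covariates.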

  14. An Exploration and Analysis of the Relationships among Object Oriented Programming, Hypermedia, and Hypertalk.

    Science.gov (United States)

    Milet, Lynn K.; Harvey, Francis A.

    Hypermedia and object-oriented programming systems (OOPs) represent examples of "open" computer environments that allow the user access to parts of the code or operating system. Both systems share fundamental intellectual concepts (objects, messages, methods, classes, and inheritance), so that an understanding of hypermedia can help in…

  15. APA's Learning Objectives for Research Methods and Statistics in Practice: A Multimethod Analysis

    Science.gov (United States)

    Tomcho, Thomas J.; Rice, Diana; Foels, Rob; Folmsbee, Leah; Vladescu, Jason; Lissman, Rachel; Matulewicz, Ryan; Bopp, Kara

    2009-01-01

    Research methods and statistics courses constitute a core undergraduate psychology requirement. We analyzed course syllabi and faculty self-reported coverage of both research methods and statistics course learning objectives to assess the concordance with APA's learning objectives (American Psychological Association, 2007). We obtained a sample of…

  16. Forecast skill score assessment of a relocatable ocean prediction system, using a simplified objective analysis method

    Science.gov (United States)

    Onken, Reiner

    2017-11-01

    A relocatable ocean prediction system (ROPS) was applied to an observational data set collected in June 2014 in the waters to the west of Sardinia (western Mediterranean) in the framework of the REP14-MED experiment. The observational data, comprising more than 6000 temperature and salinity profiles from a fleet of underwater gliders and shipborne probes, were assimilated in the Regional Ocean Modeling System (ROMS), which is the heart of ROPS, and verified against independent observations from ScanFish tows by means of the forecast skill score as defined by Murphy (1993). A simplified objective analysis (OA) method was utilised for assimilation, taking account of only those profiles which were located within a predetermined time window W. As a result of a sensitivity study, the highest skill score was obtained for a correlation length scale C = 12.5 km, W = 24 h, and r = 1, where r is the ratio between the error of the observations and the background error, both for temperature and salinity. Additional ROPS runs showed that (i) the skill score of assimilation runs was mostly higher than the score of a control run without assimilation, (ii) the skill score increased with increasing forecast range, and (iii) the skill score for temperature was higher than the score for salinity in the majority of cases. Furthermore, it is demonstrated that the vast number of observations can be managed by the applied OA method without data reduction, enabling timely operational forecasts even on a commercially available personal computer or a laptop.
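    The Murphy (1993) skill score used for verification is a simple mean-squared-error ratio against a reference forecast; a minimal sketch, with hypothetical temperature values standing in for the ScanFish observations:

```python
import numpy as np

def murphy_skill_score(forecast, observed, reference):
    """Murphy (1993) mean-squared-error skill score:
    SS = 1 - MSE(forecast) / MSE(reference).
    SS = 1 is a perfect forecast, SS = 0 matches the reference,
    and SS < 0 is worse than the reference."""
    forecast, observed, reference = map(np.asarray, (forecast, observed, reference))
    mse_f = np.mean((forecast - observed) ** 2)
    mse_r = np.mean((reference - observed) ** 2)
    return 1.0 - mse_f / mse_r

obs = np.array([13.1, 13.4, 13.8, 14.0])  # hypothetical ScanFish temperatures, deg C
ref = np.full_like(obs, obs.mean())       # reference: e.g. a no-assimilation control
fc  = np.array([13.2, 13.5, 13.7, 13.9])  # hypothetical assimilation-run forecast
score = murphy_skill_score(fc, obs, ref)
```

    In the study's terms, an assimilation run scoring above the control run corresponds to SS > 0 against that reference.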

  17. EXTRACTION OF BENTHIC COVER INFORMATION FROM VIDEO TOWS AND PHOTOGRAPHS USING OBJECT-BASED IMAGE ANALYSIS

    Directory of Open Access Journals (Sweden)

    M. T. L. Estomata

    2012-07-01

    Mapping benthic cover in deep waters accounts for only a very small proportion of studies in the field. The majority of benthic cover mapping makes use of satellite images, and classification is usually carried out only for shallow waters. To map the seafloor in optically deep waters, underwater videos and photos are needed. Some researchers have applied this method to underwater photos but used different classification methods, such as neural networks and rapid classification via down-sampling. In this study, accurate bathymetric data obtained using a multi-beam echo sounder (MBES) were intended to be used as complementary data with the underwater photographs. Due to the absence of a motion reference unit (MRU), which applies corrections to the data gathered by the MBES, the accuracy of the depth data was compromised. Nevertheless, even without accurate bathymetric data, object-based image analysis (OBIA), which used rule sets based on information such as shape, size, area, relative distance, and spectral information, was still applied. Compared to pixel-based classifications, OBIA was able to classify more specific benthic cover types beyond coral and sand, such as rubble and fish. Through the use of rule sets on area, less than or equal to 700 pixels for fish and between 700 and 10,000 pixels for rubble, as well as standard deviation values to distinguish texture, fish and rubble were identified. OBIA produced benthic cover maps with higher overall accuracy, 93.78±0.85%, than pixel-based methods, which had an average accuracy of only 87.30±6.11% (p-value = 0.0001, α = 0.05).
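    The area-based rules quoted above can be sketched as a simple rule set. The standard-deviation (texture) cutoff and the fallback classes below are hypothetical: the paper reports using standard deviation to distinguish texture but not the exact threshold it applied.

```python
def classify_segment(area_px, std_dev, texture_threshold=12.0):
    """Rule-set sketch following the paper's area rules:
    fish: area <= 700 px; rubble: 700 < area <= 10,000 px.
    texture_threshold is a hypothetical cutoff separating smooth
    segments (sand) from textured ones (fish, rubble, coral)."""
    if area_px <= 700 and std_dev > texture_threshold:
        return "fish"
    if 700 < area_px <= 10_000 and std_dev > texture_threshold:
        return "rubble"
    return "sand" if std_dev <= texture_threshold else "coral"

# (area in pixels, spectral standard deviation) for four example segments
segments = [(350, 20.0), (5_000, 18.5), (2_000, 4.0), (40_000, 25.0)]
labels = [classify_segment(a, s) for a, s in segments]
```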

  18. Analysis Of Tourism Object Demand In The Pekanbaru City With Travel Cost Method

    Directory of Open Access Journals (Sweden)

    Eriyati

    2017-11-01

    The tourism sector attracts attention when world oil prices decline. It cannot be denied that, until recently, the largest contribution to Pekanbaru city revenue came from profit-sharing funds from the oil and gas sector. Currently, Pekanbaru's revenue from the oil and gas sector is small because oil prices continue to fall. Because Pekanbaru City lies far from the coast and the mountains, development has focused on artificial attractions such as Alam Mayang, the artificial lake Bandar Kayangan Lembah Sari, the Pekanbaru Mosque, and the tomb of the founder of Pekanbaru city. Many people bring their families to visit these artificial tourist attractions on weekends and holidays. This study aims to determine the factors that affect the demand for, and the economic value of, tourist attractions in Pekanbaru City using the Travel Cost Method. Non-probability sampling selected 100 respondents visiting attractions in Pekanbaru City from a population of 224896 people, with the sample size determined using the Slovin formula; the data were analyzed with a descriptive quantitative method. The results indicate that the factors influencing demand for tourist attractions in Pekanbaru City are income, cost, and distance. The economic value of the tourism objects of Pekanbaru city estimated with the travel cost method is Rp42.679.638.400 per year. This represents the price a person assigns to something at a particular place and time, measured by the time, goods, or money the person is willing to sacrifice to own or use those goods and services.
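    A minimal sketch of the travel cost method's first stage, using made-up zonal data rather than the paper's survey data: fit a linear visit-cost demand curve by least squares, then take the consumer surplus (the triangle under the demand curve up to the choke price) as the per-zone economic value.

```python
import numpy as np

# Hypothetical zonal data: average travel cost vs. visit rate per 1000 population
cost   = np.array([10.0, 20.0, 30.0, 40.0])
visits = np.array([ 8.0,  6.0,  4.0,  2.0])

# First-stage demand curve: visits = a + b * cost (ordinary least squares);
# polyfit returns coefficients highest degree first, so b is the slope
b, a = np.polyfit(cost, visits, 1)

# Consumer surplus per zone for a linear demand curve: area of the
# triangle between the current cost and the choke price -a/b
choke = -a / b
surplus = 0.5 * (choke - cost) * visits
```

    Summing the zonal surpluses (weighted by zone population) gives an aggregate economic value of the site, analogous to the annual figure reported above.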

  19. Data Quality Objectives for Regulatory Requirements for Hazardous and Radioactive Air Emissions Sampling and Analysis

    Energy Technology Data Exchange (ETDEWEB)

    MULKEY, C.H.

    1999-07-06

    This document describes the results of the data quality objective (DQO) process undertaken to define data needs for state and federal requirements associated with toxic, hazardous, and/or radiological air emissions under the jurisdiction of the River Protection Project (RPP). Hereafter, this document is referred to as the Air DQO. The primary drivers for characterization under this DQO are the regulatory requirements pursuant to Washington State regulations that may require sampling and analysis. The federal regulations concerning air emissions are incorporated into the Washington State regulations. Data needs exist for nonradioactive and radioactive waste constituents and characteristics as identified through the DQO process described in this document. The purpose is to identify current data needs for complying with regulatory drivers for the measurement of air emissions from RPP facilities in support of air permitting. These drivers include best management practices; similar analyses may have more than one regulatory driver. This document should not be used for determining overall compliance with regulations because the regulations are in constant change, and this document may not reflect the latest regulatory requirements. Regulatory requirements are also expected to change as various permits are issued. Data needs require samples for both radionuclides and nonradionuclide analytes of air emissions from tanks and stored waste containers. The collection of data is to support environmental permitting and compliance, not for health and safety issues. This document does not address health or safety regulations or requirements (those of the Occupational Safety and Health Administration or the National Institute of Occupational Safety and Health) or continuous emission monitoring systems. This DQO is applicable to all equipment, facilities, and operations under the jurisdiction of RPP that emit or have the potential to emit regulated air pollutants.

  20. Monitoring Urban Tree Cover Using Object-Based Image Analysis and Public Domain Remotely Sensed Data

    Directory of Open Access Journals (Sweden)

    Meghan Halabisky

    2011-10-01

    Full Text Available Urban forest ecosystems provide a range of social and ecological services, but due to the heterogeneity of these canopies their spatial extent is difficult to quantify and monitor. Traditional per-pixel classification methods have been used to map urban canopies, however, such techniques are not generally appropriate for assessing these highly variable landscapes. Landsat imagery has historically been used for per-pixel driven land use/land cover (LULC classifications, but the spatial resolution limits our ability to map small urban features. In such cases, hyperspatial resolution imagery such as aerial or satellite imagery with a resolution of 1 meter or below is preferred. Object-based image analysis (OBIA allows for use of additional variables such as texture, shape, context, and other cognitive information provided by the image analyst to segment and classify image features, and thus, improve classifications. As part of this research we created LULC classifications for a pilot study area in Seattle, WA, USA, using OBIA techniques and freely available public aerial photography. We analyzed the differences in accuracies which can be achieved with OBIA using multispectral and true-color imagery. We also compared our results to a satellite based OBIA LULC and discussed the implications of per-pixel driven vs. OBIA-driven field sampling campaigns. We demonstrated that the OBIA approach can generate good and repeatable LULC classifications suitable for tree cover assessment in urban areas. Another important finding is that spectral content appeared to be more important than spatial detail of hyperspatial data when it comes to an OBIA-driven LULC.

  1. Pricing index-based catastrophe bonds: Part 2: Object-oriented design issues and sensitivity analysis

    Science.gov (United States)

    Unger, André J. A.

    2010-02-01

    This work is the second installment in a two-part series, and focuses on object-oriented programming methods to implement an augmented-state variable approach to aggregate the PCS index and introduce the Bermudan-style call feature into the proposed CAT bond model. The PCS index is aggregated quarterly using a discrete Asian running-sum formulation. The resulting aggregate PCS index augmented-state variable is used to specify the payoff (principal) on the CAT bond based on reinsurance layers. The purpose of the Bermudan-style call option is to allow the reinsurer to minimize their interest rate risk exposure on making fixed coupon payments under prevailing interest rates. A sensitivity analysis is performed to determine the impact of uncertainty in the frequency and magnitude of hurricanes on the price of the CAT bond. Results indicate that while the CAT bond is highly sensitive to the natural variability in the frequency of landfalling hurricanes between El Niño and non-El Niño years, it remains relatively insensitive to uncertainty in the magnitude of damages. In addition, results indicate that the maximum price of the CAT bond is insensitive to whether it is engineered to cover low frequency high magnitude events in a 'high' reinsurance layer relative to high frequency low magnitude events in a 'low' reinsurance layer. Also, while it is possible for the reinsurer to minimize their interest rate risk exposure on the fixed coupon payments, the impact of this risk on the price of the CAT bond appears small relative to the natural variability in the CAT bond price, and consequently catastrophic risk, due to uncertainty in the frequency and magnitude of landfalling hurricanes.
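    The quarterly discrete Asian running-sum aggregation can be sketched as follows. The loss path is simulated and all parameter values are illustrative, not the paper's calibration:

```python
import numpy as np

def quarterly_running_sum(daily_losses, days_per_quarter=91):
    """Discrete Asian-style aggregation sketch: sum simulated PCS losses
    within each quarter, then take the running (cumulative) sum of the
    quarterly totals -- the augmented state variable that the CAT bond
    payoff is checked against its reinsurance layers."""
    daily = np.asarray(daily_losses, dtype=float)
    n_q = len(daily) // days_per_quarter
    quarters = daily[: n_q * days_per_quarter].reshape(n_q, days_per_quarter)
    return np.cumsum(quarters.sum(axis=1))

rng = np.random.default_rng(7)
# Hypothetical one-year loss path: rare jumps on a background of zeros
losses = np.where(rng.random(364) < 0.01, rng.exponential(50.0, 364), 0.0)
index_path = quarterly_running_sum(losses)  # one entry per quarter-end
```

    Because losses are non-negative, the aggregate index is non-decreasing, which is what lets the principal be written down layer by layer as the index crosses attachment points.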

  2. Data Quality Objectives for Regulatory Requirements for Hazardous and Radioactive Air Emissions Sampling and Analysis

    International Nuclear Information System (INIS)

    MULKEY, C.H.

    1999-01-01

    This document describes the results of the data quality objective (DQO) process undertaken to define data needs for state and federal requirements associated with toxic, hazardous, and/or radiological air emissions under the jurisdiction of the River Protection Project (RPP). Hereafter, this document is referred to as the Air DQO. The primary drivers for characterization under this DQO are the regulatory requirements pursuant to Washington State regulations that may require sampling and analysis. The federal regulations concerning air emissions are incorporated into the Washington State regulations. Data needs exist for nonradioactive and radioactive waste constituents and characteristics as identified through the DQO process described in this document. The purpose is to identify current data needs for complying with regulatory drivers for the measurement of air emissions from RPP facilities in support of air permitting. These drivers include best management practices; similar analyses may have more than one regulatory driver. This document should not be used for determining overall compliance with regulations because the regulations are in constant change, and this document may not reflect the latest regulatory requirements. Regulatory requirements are also expected to change as various permits are issued. Data needs require samples for both radionuclides and nonradionuclide analytes of air emissions from tanks and stored waste containers. The collection of data is to support environmental permitting and compliance, not for health and safety issues. This document does not address health or safety regulations or requirements (those of the Occupational Safety and Health Administration or the National Institute of Occupational Safety and Health) or continuous emission monitoring systems. This DQO is applicable to all equipment, facilities, and operations under the jurisdiction of RPP that emit or have the potential to emit regulated air pollutants.

  3. Data Quality Objectives for Regulatory Requirements for Hazardous and Radioactive Air Emissions Sampling and Analysis; FINAL

    International Nuclear Information System (INIS)

    MULKEY, C.H.

    1999-01-01

    This document describes the results of the data quality objective (DQO) process undertaken to define data needs for state and federal requirements associated with toxic, hazardous, and/or radiological air emissions under the jurisdiction of the River Protection Project (RPP). Hereafter, this document is referred to as the Air DQO. The primary drivers for characterization under this DQO are the regulatory requirements pursuant to Washington State regulations that may require sampling and analysis. The federal regulations concerning air emissions are incorporated into the Washington State regulations. Data needs exist for nonradioactive and radioactive waste constituents and characteristics as identified through the DQO process described in this document. The purpose is to identify current data needs for complying with regulatory drivers for the measurement of air emissions from RPP facilities in support of air permitting. These drivers include best management practices; similar analyses may have more than one regulatory driver. This document should not be used for determining overall compliance with regulations because the regulations are in constant change, and this document may not reflect the latest regulatory requirements. Regulatory requirements are also expected to change as various permits are issued. Data needs require samples for both radionuclides and nonradionuclide analytes of air emissions from tanks and stored waste containers. The collection of data is to support environmental permitting and compliance, not for health and safety issues. This document does not address health or safety regulations or requirements (those of the Occupational Safety and Health Administration or the National Institute of Occupational Safety and Health) or continuous emission monitoring systems. This DQO is applicable to all equipment, facilities, and operations under the jurisdiction of RPP that emit or have the potential to emit regulated air pollutants.

  4. Feature extraction and selection for objective gait analysis and fall risk assessment by accelerometry

    Directory of Open Access Journals (Sweden)

    Cremer Gerald

    2011-01-01

    Abstract Background Falls among the elderly are a major concern because of their consequences for elderly people's general health and morale. Moreover, the aging of the population and increasing life expectancy make the prediction of falls more and more important. The analysis presented in this article makes a first step in this direction by providing a way to analyze gait and classify hospitalized elderly patients as fallers or non-fallers. The tool, based on an accelerometer network and signal processing, gives objective information about gait and does not require a dedicated gait laboratory, as optical analysis does. It is also simple enough for a non-expert to use and can therefore be applied widely to a large set of patients. Method A population of 20 hospitalized elderly patients was asked to perform several classical clinical tests evaluating their risk of falling. They were also asked whether they had experienced any fall in the preceding 12 months. Limb accelerations were recorded during the clinical tests with an accelerometer network distributed over the body. A total of 67 features were extracted from the accelerometric signal recorded during a simple 25 m walking test at comfortable speed. A feature selection algorithm was used to select, for several types of classification algorithm, the features able to separate subjects at risk from those not at risk. Results Several classification algorithms were able to discriminate between the two groups of interest: faller and non-faller hospitalized elderly patients. The classification performances of the algorithms were compared. Moreover, a subset of the 67 features was found to differ significantly between the two groups using a t-test. Conclusions This study provides a method to classify a population of hospitalized elderly patients into two groups, at risk of falling or not at risk, based on accelerometric data.
This is a first step toward designing a fall-risk assessment system that could be used to provide
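    The t-test-based feature screening described in the Results can be sketched as follows, with synthetic features in place of the 67 accelerometric features; the cutoff of |t| > 3 is a hypothetical choice, not the study's criterion.

```python
import numpy as np

def welch_t(x, y):
    """Welch's t statistic for two independent samples (no SciPy needed)."""
    nx, ny = len(x), len(y)
    vx, vy = x.var(ddof=1), y.var(ddof=1)
    return (x.mean() - y.mean()) / np.sqrt(vx / nx + vy / ny)

rng = np.random.default_rng(0)
n = 50
# Synthetic gait features: column 0 differs between groups, column 1 is noise
fallers     = np.column_stack([rng.normal(3.0, 1.0, n), rng.normal(0.0, 1.0, n)])
non_fallers = np.column_stack([rng.normal(0.0, 1.0, n), rng.normal(0.0, 1.0, n)])

t_stats = np.array([welch_t(fallers[:, j], non_fallers[:, j])
                    for j in range(fallers.shape[1])])
selected = np.flatnonzero(np.abs(t_stats) > 3.0)  # crude screening cutoff
```

    Features surviving the screen would then be fed to the classifiers compared in the study.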

  5. Analysis of disease-associated objects at the Rat Genome Database

    Science.gov (United States)

    Wang, Shur-Jen; Laulederkind, Stanley J. F.; Hayman, G. T.; Smith, Jennifer R.; Petri, Victoria; Lowry, Timothy F.; Nigam, Rajni; Dwinell, Melinda R.; Worthey, Elizabeth A.; Munzenmaier, Diane H.; Shimoyama, Mary; Jacob, Howard J.

    2013-01-01

    The Rat Genome Database (RGD) is the premier resource for genetic, genomic and phenotype data for the laboratory rat, Rattus norvegicus. In addition to organizing biological data from rats, the RGD team focuses on manual curation of gene–disease associations for rat, human and mouse. In this work, we have analyzed disease-associated strains, quantitative trait loci (QTL) and genes from rats. These disease objects form the basis for seven disease portals. Among disease portals, the cardiovascular disease and obesity/metabolic syndrome portals have the highest number of rat strains and QTL. These two portals share 398 rat QTL, and these shared QTL are highly concentrated on rat chromosomes 1 and 2. For disease-associated genes, we performed gene ontology (GO) enrichment analysis across portals using RatMine enrichment widgets. Fifteen GO terms, five from each GO aspect, were selected to profile enrichment patterns of each portal. Of the selected biological process (BP) terms, ‘regulation of programmed cell death’ was the top enriched term across all disease portals except in the obesity/metabolic syndrome portal where ‘lipid metabolic process’ was the most enriched term. ‘Cytosol’ and ‘nucleus’ were common cellular component (CC) annotations for disease genes, but only the cancer portal genes were highly enriched with ‘nucleus’ annotations. Similar enrichment patterns were observed in a parallel analysis using the DAVID functional annotation tool. The relationship between the preselected 15 GO terms and disease terms was examined reciprocally by retrieving rat genes annotated with these preselected terms. The individual GO term–annotated gene list showed enrichment in physiologically related diseases. For example, the ‘regulation of blood pressure’ genes were enriched with cardiovascular disease annotations, and the ‘lipid metabolic process’ genes with obesity annotations. Furthermore, we were able to enhance enrichment of neurological
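    GO enrichment of the kind performed above with the RatMine widgets and the DAVID tool typically rests on a one-sided hypergeometric (Fisher-type) test; a minimal sketch with toy gene counts, not RGD data:

```python
from math import comb

def enrichment_p(N, K, n, x):
    """One-sided hypergeometric p-value: the probability of drawing >= x
    annotated genes when n genes are sampled from a universe of N genes,
    K of which carry the GO term of interest."""
    denom = comb(N, n)
    return sum(comb(K, k) * comb(N - K, n - k)
               for k in range(x, min(K, n) + 1)) / denom

# Toy numbers: a 20-gene universe, 5 genes annotated with a term,
# and a 5-gene disease-portal list containing 3 of them
p = enrichment_p(N=20, K=5, n=5, x=3)
```

    A small p-value indicates the term is over-represented in the disease gene list relative to chance, which is what "enriched" means in the portal profiles above.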

  6. Concept Maps as Instructional Tools for Improving Learning of Phase Transitions in Object-Oriented Analysis and Design

    Science.gov (United States)

    Shin, Shin-Shing

    2016-01-01

    Students attending object-oriented analysis and design (OOAD) courses typically encounter difficulties transitioning from requirements analysis to logical design and then to physical design. Concept maps have been widely used in studies of user learning. The study reported here, based on the relationship of concept maps to learning theory and…

  7. Medical Assistance in Dying in Canada: An Ethical Analysis of Conscientious and Religious Objections

    Directory of Open Access Journals (Sweden)

    Christie, Timothy

    2016-08-01

    Background: The Supreme Court of Canada (SCC) has ruled that the federal government is required to remove the provisions of the Criminal Code of Canada that prohibit medical assistance in dying (MAID). The SCC has stipulated that individual physicians will not be required to provide MAID should they have a religious or conscientious objection. Therefore, the pending legislative response will have to balance the rights of patients with the rights of physicians, other health care professionals, and objecting institutions. Objective: The objective of this paper is to critically assess, within the Canadian context, the moral probity of individual or institutional objections to MAID made for either religious or conscientious reasons. Methods: Deontological ethics and the Doctrine of Double Effect. Results: The religious or conscientious objector has conflicting duties, i.e., a duty to respect the "right to life" (section 7 of the Charter) and a duty to respect the tenets of his or her religious or conscientious beliefs (protected by section 2 of the Charter). Conclusion: The discussion of religious or conscientious objections to MAID has not explicitly considered the competing duties of the conscientious objector. It has focussed on the fact that a conscientious objection exists and has ignored the normative question of whether the duty to respect one's conscience or religion supersedes the duty to respect the patient's right to life.

  8. Development and application of objective uncertainty measures for nuclear power plant transient analysis [Dissertation 3897]

    Energy Technology Data Exchange (ETDEWEB)

    Vinai, P

    2007-10-15

    For the development, design and licensing of a nuclear power plant (NPP), a sound safety analysis is necessary to study the diverse physical phenomena involved in the system behaviour under operational and transient conditions. Such studies are based on detailed computer simulations. With the progress achieved in computer technology and the greater availability of experimental and plant data, the use of best estimate codes for safety evaluations has gained increasing acceptance. The application of best estimate safety analysis has raised new problems that need to be addressed: it has become more crucial to assess how reliable code predictions are, especially when they need to be compared against safety limits that must not be crossed. It becomes necessary to identify and quantify the various possible sources of uncertainty that affect the reliability of the results. Currently, such uncertainty evaluations are generally based on experts' opinion. In the present research, a novel methodology based on a non-parametric statistical approach has been developed for objective quantification of best-estimate code uncertainties related to the physical models used in the code. The basis is an evaluation of the accuracy of a given physical model achieved by comparing its predictions with experimental data from an appropriate set of separate-effect tests. The differences between measurements and predictions can be considered stochastically distributed, and thus a statistical approach can be employed. The first step was the development of a procedure for investigating the dependence of a given physical model's accuracy on the experimental conditions. Each separate-effect test effectively provides a random sample of discrepancies between measurements and predictions, corresponding to a location in the state space defined by a certain number of independent system variables. As a consequence, the samples of 'errors', achieved from analysis of the entire

  9. Development and application of objective uncertainty measures for nuclear power plant transient analysis

    International Nuclear Information System (INIS)

    Vinai, P.

    2007-10-01

    For the development, design and licensing of a nuclear power plant (NPP), a sound safety analysis is necessary to study the diverse physical phenomena involved in the system behaviour under operational and transient conditions. Such studies are based on detailed computer simulations. With the progress achieved in computer technology and the greater availability of experimental and plant data, the use of best estimate codes for safety evaluations has gained increasing acceptance. The application of best estimate safety analysis has raised new problems that need to be addressed: it has become more crucial to assess how reliable code predictions are, especially when they need to be compared against safety limits that must not be crossed. It becomes necessary to identify and quantify the various possible sources of uncertainty that affect the reliability of the results. Currently, such uncertainty evaluations are generally based on experts' opinion. In the present research, a novel methodology based on a non-parametric statistical approach has been developed for objective quantification of best-estimate code uncertainties related to the physical models used in the code. The basis is an evaluation of the accuracy of a given physical model achieved by comparing its predictions with experimental data from an appropriate set of separate-effect tests. The differences between measurements and predictions can be considered stochastically distributed, and thus a statistical approach can be employed. The first step was the development of a procedure for investigating the dependence of a given physical model's accuracy on the experimental conditions. Each separate-effect test effectively provides a random sample of discrepancies between measurements and predictions, corresponding to a location in the state space defined by a certain number of independent system variables. As a consequence, the samples of 'errors', achieved from analysis of the entire database, are
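    A non-parametric treatment of model accuracy of the kind described above can be sketched by treating the measurement-prediction discrepancies as a random sample and reading an uncertainty band off its empirical percentiles; the data here are synthetic and the variable names hypothetical:

```python
import numpy as np

# Hypothetical separate-effect test data: measured vs. code-predicted values
rng = np.random.default_rng(1)
measured  = rng.normal(500.0, 30.0, 200)           # e.g. a temperature, K
predicted = measured + rng.normal(5.0, 12.0, 200)  # code bias + scatter

errors = predicted - measured  # discrepancy sample for this physical model

# Distribution-free 95% uncertainty band from the empirical error
# percentiles, in place of an expert-opinion band
band = np.percentile(errors, [2.5, 97.5])
coverage = np.mean((errors >= band[0]) & (errors <= band[1]))
```

    Repeating this per region of the state space (the experimental conditions) gives condition-dependent error bands that can be propagated through transient calculations.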

  10. A Case Study on Coloured Petri Nets in Object-oriented Analysis and Design

    DEFF Research Database (Denmark)

    Barros, Joao Paulo; Jørgensen, Jens Bæk

    2005-01-01

    In this paper, we first demonstrate how a coloured Petri nets (CPN) model can be used to capture requirements for a considered example system, an elevator controller. Then, we show how this requirements-level CPN model is transformed into a design-level object-oriented CPN model, which is structurally and conceptually closer to class diagrams and object-oriented programming languages. The CPN models reduce the gap between user-level requirements and the respective implementation, thus simplifying the implementation or code generation. Finally, we discuss the code generation from object-oriented…

  11. Detecting peatland drains with Object Based Image Analysis and Geoeye-1 imagery.

    Science.gov (United States)

    Connolly, J; Holden, N M

    2017-12-01

    Peatlands play an important role in the global carbon cycle. They provide important ecosystem services including carbon sequestration and storage. Drainage disturbs peatland ecosystem services. Mapping drains is difficult and expensive, and their spatial extent is, in many cases, unknown. An object based image analysis (OBIA) was performed on a very high resolution satellite image (Geoeye-1) to extract information about drain location and extent on a blanket peatland in Ireland. Two accuracy assessment methods, the error matrix and the completeness, correctness and quality (CCQ) measures, were used to assess the extracted data across the peatland and at several sub sites. The cost of the OBIA method was compared with manual digitisation and field survey. The drain maps were also used to assess the costs of blocking drains vs. a business-as-usual scenario and to estimate the impact of each on carbon fluxes at the study site. The OBIA method performed well at almost all sites. Almost 500 km of drains were detected within the peatland. In the error matrix method, the overall accuracy (OA) of detecting the drains was 94% and the kappa statistic was 0.66. The OA for all sub-areas, except one, was 95-97%. The CCQ was 85%, 85% and 71%, respectively. The OBIA method was the most cost effective way to map peatland drains and was at least 55% cheaper than either field survey or manual digitisation. The extracted drain maps were used to constrain the study area CO2 flux, which was 19% smaller than the prescribed Peatland Code value for drained peatlands. The OBIA method used in this study showed that it is possible to accurately extract maps of fine scale peatland drains over large areas in a cost effective manner. The development of methods to map the spatial extent of drains is important as they play a critical role in peatland carbon dynamics. The objective of this study was to extract data on the spatial extent of drains on a blanket bog in the west of Ireland. The
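    The error-matrix figures above (overall accuracy and the kappa statistic) come from standard confusion-matrix formulas; a sketch with a hypothetical drain/non-drain matrix, not the paper's counts:

```python
import numpy as np

def accuracy_and_kappa(cm):
    """Overall accuracy and Cohen's kappa from a confusion matrix
    (rows = reference classes, columns = classified classes)."""
    cm = np.asarray(cm, dtype=float)
    total = cm.sum()
    po = np.trace(cm) / total                                 # observed agreement (OA)
    pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / total**2   # chance agreement
    return po, (po - pe) / (1.0 - pe)

# Hypothetical drain / non-drain error matrix
cm = [[50, 10],
      [ 5, 35]]
oa, kappa = accuracy_and_kappa(cm)
```

    Kappa discounts the agreement expected by chance, which is why a high OA (as in the paper's 94%) can coexist with a more modest kappa (0.66) when one class dominates.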

  12. Detecting peatland drains with Object Based Image Analysis and Geoeye-1 imagery

    Directory of Open Access Journals (Sweden)

    J. Connolly

    2017-03-01

    Abstract Background Peatlands play an important role in the global carbon cycle. They provide important ecosystem services including carbon sequestration and storage. Drainage disturbs peatland ecosystem services. Mapping drains is difficult and expensive, and their spatial extent is, in many cases, unknown. An object based image analysis (OBIA) was performed on a very high resolution satellite image (Geoeye-1) to extract information about drain location and extent on a blanket peatland in Ireland. Two accuracy assessment methods, the error matrix and the completeness, correctness and quality (CCQ) measures, were used to assess the extracted data across the peatland and at several sub sites. The cost of the OBIA method was compared with manual digitisation and field survey. The drain maps were also used to assess the costs of blocking drains vs. a business-as-usual scenario and to estimate the impact of each on carbon fluxes at the study site. Results The OBIA method performed well at almost all sites. Almost 500 km of drains were detected within the peatland. In the error matrix method, the overall accuracy (OA) of detecting the drains was 94% and the kappa statistic was 0.66. The OA for all sub-areas, except one, was 95–97%. The CCQ was 85%, 85% and 71%, respectively. The OBIA method was the most cost effective way to map peatland drains and was at least 55% cheaper than either field survey or manual digitisation. The extracted drain maps were used to constrain the study area CO2 flux, which was 19% smaller than the prescribed Peatland Code value for drained peatlands. Conclusions The OBIA method used in this study showed that it is possible to accurately extract maps of fine scale peatland drains over large areas in a cost effective manner. The development of methods to map the spatial extent of drains is important as they play a critical role in peatland carbon dynamics. The objective of this study was to extract data on the spatial extent of

  13. Objective Acoustic-Phonetic Speech Analysis in Patients Treated for Oral or Oropharyngeal Cancer

    NARCIS (Netherlands)

    de Bruijn, Marieke J.; ten Bosch, Louis; Kuik, Dirk J.; Quene, Hugo; Langendijk, Johannes A.; Leemans, C. Rene; Verdonck-de Leeuw, Irma M.

    2009-01-01

    Objective: Speech impairment often occurs in patients after treatment for head and neck cancer. New treatment modalities such as surgical reconstruction or (chemo) radiation techniques aim at sparing anatomical structures that are correlated with speech and swallowing. In randomized trials

  14. Learning Objectives and Testing: An Analysis of Six Principles of Economics Textbooks, Using Bloom's Taxonomy.

    Science.gov (United States)

    Karns, James M. L.; And Others

    1983-01-01

    Significant differences were found between the stated objectives of most college level economics textbooks and the instruments included in the instructor's manuals to measure student achievement. (Author/RM)

  15. Analysis of students’ spatial thinking in geometry: 3D object into 2D representation

    Science.gov (United States)

    Fiantika, F. R.; Maknun, C. L.; Budayasa, I. K.; Lukito, A.

    2018-05-01

    The aim of this study is to find out the spatial thinking process of students in transforming a 3-dimensional (3D) object into a 2-dimensional (2D) representation. Spatial thinking is helpful in using maps, planning routes, designing floor plans, and creating art. Students can engage with geometric ideas by using concrete models and drawing. Spatial thinking in this study is identified through a geometrical problem of transforming a 3-dimensional object into a 2-dimensional object image. The problem was solved by the subjects and analyzed with reference to predetermined spatial thinking indicators. Two representative elementary school subjects were chosen based on mathematical ability and visual learning style. An explorative description through a qualitative approach was used in this study. The results of this study are: 1) the boy and the girl subject produced different representations in their spatial thinking, and 2) each subject had their own way of inventing the fastest way to draw a cube net.

  16. Analysis of double support phase of biped robot and multi-objective ...

    Indian Academy of Sciences (India)

    ing objectives, namely power consumption and dynamic balance margin have been ... in detail to arrive at a complete knowledge of the biped walking systems on .... measured in the anti-clockwise sense with respect to the vertical axis.

  17. DGTD Analysis of Electromagnetic Scattering from Penetrable Conductive Objects with IBC

    KAUST Repository

    Li, Ping; Shi, Yifei; Jiang, Li; Bagci, Hakan

    2015-01-01

    To avoid straightforward volumetric discretization, a discontinuous Galerkin time-domain (DGTD) method integrated with the impedance boundary condition (IBC) is presented in this paper to analyze the scattering from objects with finite conductivity

  18. Scientific analysis of a calcified object from a post-medieval burial in Vienna, Austria.

    Science.gov (United States)

    Binder, Michaela; Berner, Margit; Krause, Heike; Kucera, Matthias; Patzak, Beatrix

    2016-09-01

    Calcifications commonly occur in association with soft tissue inflammation. However, they are not often discussed in the palaeopathological literature, frequently due to problems of identification and diagnosis. We present a calcified object (40×27×27cm) found with a middle-aged male from a post-medieval cemetery in Vienna. It was not recognized during excavation; thus its anatomical location within the body remains unknown. The object was subjected to X-ray, SEM and CT scanning and compared to historic pathological objects held in the collection of the Natural History Museum Vienna. The two of closest resemblance, a thyroid adenoma and a goitre, were subjected to similar analytical techniques for comparison. Despite similarities between all the objects, the structure of the object most closely conforms to a thyroid tumor. Nevertheless, due to the similar pathophysiological pathways and biochemical composition of calcified soft tissue, a secure identification outside of its anatomical context is not possible. The research further highlights the fact that recognition of such objects during excavation is crucial for a more conclusive diagnosis. Historic medical records indicate that they were common and might therefore be expected to occur frequently in cemeteries. Consequently, increasing the dataset of calcifications would also aid in extending the knowledge about diseases in past human populations. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. Subsequent pregnancy outcome after previous foetal death

    NARCIS (Netherlands)

    Nijkamp, J. W.; Korteweg, F. J.; Holm, J. P.; Timmer, A.; Erwich, J. J. H. M.; van Pampus, M. G.

    Objective: A history of foetal death is a risk factor for complications and foetal death in subsequent pregnancies as most previous risk factors remain present and an underlying cause of death may recur. The purpose of this study was to evaluate subsequent pregnancy outcome after foetal death and to

  20. Alcohol harm reduction advertisements: a content analysis of topic, objective, emotional tone, execution and target audience.

    Science.gov (United States)

    Dunstone, Kimberley; Brennan, Emily; Slater, Michael D; Dixon, Helen G; Durkin, Sarah J; Pettigrew, Simone; Wakefield, Melanie A

    2017-04-11

    Public health mass media campaigns may contribute to reducing the health and social burden attributed to alcohol consumption, but little is known about which advertising characteristics have been used, or have been effective, in alcohol harm reduction campaigns to date. As a first step towards encouraging further research to identify the impact of various advertising characteristics, this study aimed to systematically identify and examine the content of alcohol harm reduction advertisements (ads). Ads were identified through an exhaustive internet search of Google, YouTube, Vimeo, and relevant government and health agency websites. Eligible ads were: English language, produced between 2006 and 2014, not primarily focused on drink-driving or alcohol in pregnancy, and not alcohol industry funded. Systematic content analysis of all ads was performed; each ad was double-coded. In total, 110 individual ads from 72 different alcohol harm reduction campaigns were identified, with the main source countries being Australia (40%) and the United Kingdom (26%). The dominant topic for 52% of ads was short-term harms, while 10% addressed long-term harms, 18% addressed underage drinking, 17% communicated a how-to-change message, and 3% advocated for policy change. The behavioural objective of most ads was to motivate audiences to reduce their alcohol consumption (38%) or to behave responsibly and/or not get drunk when drinking (33%). Only 10% of all ads mentioned low-risk drinking guidelines. Eighty-seven percent of ads used a dramatisation execution style and 74% had a negative emotional tone. Ninety percent of ads contained messages or content that appeared to target adults, and 36% specifically targeted young adults. Some message attributes have been employed more frequently than others, suggesting several promising avenues for future audience or population-based research to compare the relative effectiveness of different characteristics of alcohol harm reduction ads. Given

  1. Alcohol harm reduction advertisements: a content analysis of topic, objective, emotional tone, execution and target audience

    Directory of Open Access Journals (Sweden)

    Kimberley Dunstone

    2017-04-01

    Full Text Available Abstract Background Public health mass media campaigns may contribute to reducing the health and social burden attributed to alcohol consumption, but little is known about which advertising characteristics have been used, or have been effective, in alcohol harm reduction campaigns to date. As a first step towards encouraging further research to identify the impact of various advertising characteristics, this study aimed to systematically identify and examine the content of alcohol harm reduction advertisements (ads). Method Ads were identified through an exhaustive internet search of Google, YouTube, Vimeo, and relevant government and health agency websites. Eligible ads were: English language, produced between 2006 and 2014, not primarily focused on drink-driving or alcohol in pregnancy, and not alcohol industry funded. Systematic content analysis of all ads was performed; each ad was double-coded. Results In total, 110 individual ads from 72 different alcohol harm reduction campaigns were identified, with the main source countries being Australia (40%) and the United Kingdom (26%). The dominant topic for 52% of ads was short-term harms, while 10% addressed long-term harms, 18% addressed underage drinking, 17% communicated a how-to-change message, and 3% advocated for policy change. The behavioural objective of most ads was to motivate audiences to reduce their alcohol consumption (38%) or to behave responsibly and/or not get drunk when drinking (33%). Only 10% of all ads mentioned low-risk drinking guidelines. Eighty-seven percent of ads used a dramatisation execution style and 74% had a negative emotional tone. Ninety percent of ads contained messages or content that appeared to target adults, and 36% specifically targeted young adults. Conclusions Some message attributes have been employed more frequently than others, suggesting several promising avenues for future audience or population-based research to compare the relative effectiveness of

  2. Land Cover/Land Use Classification and Change Detection Analysis with Astronaut Photography and Geographic Object-Based Image Analysis

    Science.gov (United States)

    Hollier, Andi B.; Jagge, Amy M.; Stefanov, William L.; Vanderbloemen, Lisa A.

    2017-01-01

    For over fifty years, NASA astronauts have taken exceptional photographs of the Earth from the unique vantage point of low Earth orbit (as well as from lunar orbit and the surface of the Moon). The Crew Earth Observations (CEO) Facility is the NASA ISS payload supporting astronaut photography of the Earth's surface and atmosphere. From aurora to mountain ranges, deltas, and cities, there are over two million images of the Earth's surface dating back to the Mercury missions in the early 1960s. The Gateway to Astronaut Photography of Earth website (eol.jsc.nasa.gov) provides a publicly accessible platform to query and download these images at a variety of spatial resolutions and perform scientific research at no cost to the end user. As a demonstration to the science, application, and education user communities, we examine astronaut photography of the Washington D.C. metropolitan area for three time steps between 1998 and 2016 using Geographic Object-Based Image Analysis (GEOBIA) to classify and quantify land cover/land use and provide a template for future change detection studies with astronaut photography.

  3. Irreversibility analysis for optimization design of plate fin heat exchangers using a multi-objective cuckoo search algorithm

    International Nuclear Information System (INIS)

    Wang, Zhe; Li, Yanzhong

    2015-01-01

    Highlights: • The first application of IMOCS for plate-fin heat exchanger design. • Irreversibility degrees of heat transfer and fluid friction are minimized. • A trade-off between efficiency, total cost and pumping power is achieved. • Both EGM and EDM methods are compared in the optimization of the PFHE. • This study has superiority over other single-objective optimization designs. - Abstract: This paper introduces and applies an improved multi-objective cuckoo search (IMOCS) algorithm, a novel metaheuristic optimization algorithm based on cuckoo breeding behavior, for the multi-objective optimization design of plate-fin heat exchangers (PFHEs). A modified irreversibility degree of the PFHE is separated into heat transfer and fluid friction irreversibility degrees, which are adopted as two initial objective functions to be minimized simultaneously in order to narrow the search scope of the design. Maximization of efficiency and minimization of pumping power and total annual cost are considered as final objective functions. Results obtained from a two-dimensional normalized Pareto-optimal frontier clearly demonstrate the trade-off between heat transfer and fluid friction irreversibility. Moreover, a three-dimensional Pareto-optimal frontier reveals the relationship between efficiency, total annual cost, and pumping power in the PFHE design. Three examples presented here further demonstrate that the presented method is able to obtain optimum solutions with higher accuracy, lower irreversibility, and fewer iterations as compared to previous methods and single-objective design approaches
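
    The Pareto-optimal frontiers mentioned above consist of the non-dominated solutions of the multi-objective problem; a minimal sketch of non-dominated filtering for minimization objectives (a generic illustration, not the IMOCS algorithm itself):

```python
def dominates(a, b):
    """True if objective vector a dominates b (all objectives minimized):
    a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Return the non-dominated subset of a list of objective vectors."""
    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions if t is not s)]

# Hypothetical trade-off points, e.g. (friction irreversibility, pumping power):
pts = [(1.0, 5.0), (2.0, 3.0), (3.0, 4.0), (4.0, 1.0)]
# (3.0, 4.0) is dominated by (2.0, 3.0) and drops off the frontier.
```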

  4. Analysis of Greedy Decision Making for Geographic Routing for Networks of Randomly Moving Objects

    Directory of Open Access Journals (Sweden)

    Amber Israr

    2016-04-01

    Full Text Available Autonomous and self-organizing wireless ad-hoc communication networks for moving objects consist of nodes which use no centralized network infrastructure. Examples of moving object networks are networks of flying objects, networks of vehicles, and networks of moving people or robots. Moving object networks face many critical routing challenges because of dynamic topological changes and asymmetric network links. A suitable and effective routing mechanism helps to extend the deployment of moving nodes. In this paper an attempt has been made to analyze the performance of the Greedy Decision method (a position-aware, distance-based algorithm for geographic routing) for network nodes moving according to the random waypoint mobility model. The widely used GPSR (Greedy Perimeter Stateless Routing) protocol utilizes geographic distance and position-based data of nodes to transmit packets towards destination nodes. In this paper different scenarios have been tested to develop a concrete set of recommendations for optimum deployment of distance-based Greedy Decision Geographic Routing in randomly moving object networks
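
    The greedy forwarding decision at the heart of GPSR-style geographic routing simply forwards to the neighbour geographically closest to the destination; a minimal sketch of that single step (illustrative only; full GPSR additionally switches to perimeter mode at local maxima, which is not shown here):

```python
import math

def greedy_next_hop(current, neighbors, destination):
    """Pick the neighbor strictly closer to the destination than the
    current node; return None if no such neighbor exists (a local
    maximum, where full GPSR would fall back to perimeter routing)."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    closer = [n for n in neighbors
              if dist(n, destination) < dist(current, destination)]
    return min(closer, key=lambda n: dist(n, destination)) if closer else None
```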

  5. Measuring systems of hard to get objects: problems with analysis of measurement results

    Science.gov (United States)

    Gilewska, Grazyna

    2005-02-01

    Limited access to the metrological parameters of measured objects is a problem in many measurements, especially for biological objects, whose parameters are very often determined by indirect measurements. A random component dominates the measurement results when access to the measured object is very limited. Every measuring process is subject to conditions that limit how it can be manipulated (e.g. increasing the number of repeated measurements to decrease the random limiting error). These may be temporal or financial limitations or, in the case of biological objects, a small sample volume, the influence of the measuring tool and observer on the object, or fatigue effects, e.g. in a patient. Taking these difficulties into consideration, the author worked out and verified the practical application of methods for rejecting outlying observations and, subsequently, innovative methods for eliminating measured data with excess variance, in order to decrease the standard deviation of the mean of the measured data given a limited amount of data and an accepted level of confidence. The elaborated methods were verified on measurements of knee-joint space width obtained from radiographs. Measurements were carried out indirectly on digital images of the radiographs. The results confirmed the legitimacy of the elaborated methodology and measurement procedures. Such methodology is especially important when standard scientific approaches do not bring the expected effects.
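
    Outlier rejection of the kind described can be illustrated with a simple iterative sigma-clipping scheme (a generic sketch; the abstract does not specify the paper's actual reduction method, and the threshold k here is arbitrary):

```python
import statistics

def sigma_clip(data, k=2.0, max_iter=10):
    """Iteratively discard observations more than k standard deviations
    from the sample mean, shrinking the spread of a small noisy sample."""
    data = list(data)
    for _ in range(max_iter):
        mean = statistics.mean(data)
        sd = statistics.stdev(data)
        kept = [x for x in data if abs(x - mean) <= k * sd]
        if len(kept) == len(data):  # converged: nothing left to reject
            break
        data = kept
    return data
```

    For very small samples a tighter threshold is needed, since with n observations no single point can deviate from the mean by more than (n-1)/sqrt(n) standard deviations.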

  6. Designing personal grief rituals: An analysis of symbolic objects and actions.

    Science.gov (United States)

    Sas, Corina; Coman, Alina

    2016-10-01

    Personal grief rituals are beneficial in dealing with complicated grief, but challenging to design, as they require symbolic objects and actions meeting clients' emotional needs. The authors report interviews with 10 therapists with expertise in both grief therapy and grief rituals. Findings indicate three types of rituals supporting honoring, letting go, and self-transformation, with the latter being particularly complex. Outcomes also point to a taxonomy of ritual objects for framing and remembering the ritual experience, and for capturing and processing grief. Besides symbolic possessions, the authors identified other types of ritual objects, including transformational and future-oriented ones. Symbolic actions include the creative crafting of ritual objects, respectful handling, disposal, and symbolic play. They conclude with the theoretical implications of these findings and a reflection on their value for tailored, creative co-design of grief rituals. In particular, several implications for designing grief rituals were identified: accounting for the client's needs, selecting (or creating) the most appropriate objects and actions from the identified types, integrating principles of both grief and art/drama therapy, exploring clients' affinity for the ancient elements as a medium of disposal in letting-go rituals, and the value of technology for recording and reflecting on the ritual experience.

  7. Multi-objective game-theory models for conflict analysis in reservoir watershed management.

    Science.gov (United States)

    Lee, Chih-Sheng

    2012-05-01

    This study focuses on the development of a multi-objective game-theory model (MOGM) for balancing economic and environmental concerns in reservoir watershed management and for assistance in decision making. Game theory is used as an alternative tool for analyzing the strategic interaction between economic development (land use and development) and environmental protection (water-quality protection and eutrophication control). A geographic information system is used to concisely illustrate and calculate the areas of various land use types. The MOGM methodology is illustrated in a case study of multi-objective watershed management in the Tseng-Wen reservoir, Taiwan. The innovation and advantages of MOGM can be seen in the results, which balance economic and environmental concerns in watershed management and which can be interpreted easily by decision makers. For comparison, the decision-making process using a conventional multi-objective method to produce many alternatives was found to be more difficult. Copyright © 2012 Elsevier Ltd. All rights reserved.

  8. A novel no-reference objective stereoscopic video quality assessment method based on visual saliency analysis

    Science.gov (United States)

    Yang, Xinyan; Zhao, Wei; Ye, Long; Zhang, Qin

    2017-07-01

    This paper proposes a no-reference objective stereoscopic video quality assessment method, with the motivation of making the results of objective experiments close to those of subjective assessment. We believe that image regions with different degrees of visual salience should not have the same weights when designing an assessment metric. Therefore, we first apply the GBVS algorithm to each frame pair and separate both the left and right viewing images into regions of strong, general and weak saliency. In addition, local feature information such as blockiness, zero-crossing and depth is extracted and combined with a mathematical model to calculate a quality assessment score. Regions with different degrees of salience are assigned different weights in the mathematical model. Experimental results demonstrate the superiority of our method compared with existing state-of-the-art no-reference objective stereoscopic video quality assessment methods.
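
    Saliency-based region weighting of this kind amounts to a weighted average of per-region quality scores, with larger weights for more salient regions; a minimal sketch (the weights and combination here are hypothetical, since the abstract does not give the paper's actual model):

```python
def weighted_quality(region_scores, weights=(0.6, 0.3, 0.1)):
    """Combine per-region quality scores, ordered (strong, general, weak)
    saliency, into a single score using normalized saliency weights."""
    assert len(region_scores) == len(weights)
    total = sum(weights)
    return sum(w * s for w, s in zip(weights, region_scores)) / total
```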

  9. Indicators analysis and objectives for the development sustainable and sustainability environmental

    Directory of Open Access Journals (Sweden)

    Pedro Noboa-Romero

    2016-09-01

    Full Text Available The present article is the product of qualitative, descriptive and analytical research on indicators and objectives aimed at sustainable development. The main objective of this essay is to analyze sustainability indicators: the Human Development Index (IDH), the Sustainable Development Goals (SDGS), the objectives of the Millennium Development Goals (MDGS) and the Multidimensional Poverty Index (IPM), through a review of research and work on these issues, in order to establish the progress and results that have been generated through the use of these indicators in the fields of health, education, technology and the environment. The findings demonstrate that there is inequality between nations and that the current approach is oriented towards short-term development, benefiting exclusively current generations and exhausting natural resources, without a long-term vision for future generations.

  10. Measurement errors in polymerase chain reaction are a confounding factor for a correct interpretation of 5-HTTLPR polymorphism effects on lifelong premature ejaculation: a critical analysis of a previously published meta-analysis of six studies.

    Directory of Open Access Journals (Sweden)

    Paddy K C Janssen

    Full Text Available OBJECTIVE: To analyze a recently published meta-analysis of six studies on 5-HTTLPR polymorphism and lifelong premature ejaculation (PE). METHODS: Calculation of observed and expected genotype frequencies and Hardy-Weinberg equilibrium (HWE) of cases and controls. LL, SL and SS genotype frequencies of patients were subtracted from the genotype frequencies of an ideal population (LL 25%, SL 50%, SS 25%; p = 1 for HWE). Analysis of the PCRs of the six studies and re-analysis of the analysis and odds ratios (ORs) reported in the recently published meta-analysis. RESULTS: Three studies deviated from HWE in patients and one study deviated from HWE in controls. In the three studies in HWE, the mean deviation of genotype frequencies from a theoretical population not deviating from HWE was small: LL (1.7%), SL (-2.3%), SS (0.6%). In the three studies not in HWE, the mean deviation of genotype frequencies was high: LL (-3.3%), SL (-18.5%) and SS (21.8%), with a very low percentage of the SL genotype concurrent with a very high percentage of the SS genotype. The most serious PCR deviations were reported in the three not-in-HWE studies. The three in-HWE studies had normal ORs. In contrast, the three not-in-HWE studies had low ORs. CONCLUSIONS: In the three studies not in HWE and with very low ORs, inadequate PCR analysis and/or inadequate interpretation of its gel electrophoresis resulted in a very low SL and a resulting shift to a very high SS genotype frequency outcome. Consequently, the PCRs of these three studies are not reliable. Failure to note the inadequacy of PCR tests makes such PCRs a confounding factor in the clinical interpretation of genetic studies. Currently, a meta-analysis can only be performed on the three studies in HWE. However, based on the three studies in HWE, with ORs of about 1, there is no indication that in men with lifelong PE the frequency of the LL, SL and SS genotypes deviates from the general male population and/or that the SL or SS genotype is in any way associated with lifelong PE.
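
    The HWE check described above compares observed genotype counts with the expectations p², 2pq, q² derived from the observed allele frequencies; a minimal sketch of that calculation (generic, not the meta-analysis code):

```python
def hwe_expected(n_ll, n_sl, n_ss):
    """Expected LL/SL/SS counts under Hardy-Weinberg equilibrium,
    computed from the observed allele frequencies."""
    n = n_ll + n_sl + n_ss
    p = (2 * n_ll + n_sl) / (2 * n)    # frequency of the L allele
    q = 1 - p                          # frequency of the S allele
    return p * p * n, 2 * p * q * n, q * q * n

def hwe_chi2(n_ll, n_sl, n_ss):
    """Chi-square statistic (1 df) comparing observed genotype counts
    with their Hardy-Weinberg expectations."""
    observed = (n_ll, n_sl, n_ss)
    return sum((o - e) ** 2 / e
               for o, e in zip(observed, hwe_expected(n_ll, n_sl, n_ss)))
```

    For the ideal population in the abstract (LL 25%, SL 50%, SS 25%) the statistic is zero; a sample with a depleted SL and inflated SS count, as in the three deviating studies, yields a large statistic (exceeding the 5% critical value of 3.84 for 1 df).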

  11. Analysis and design of the SI-simulator software system for the VHTR-SI process by using the object-oriented analysis and object-oriented design methodology

    International Nuclear Information System (INIS)

    Chang, Jiwoon; Shin, Youngjoon; Kim, Jihwan; Lee, Kiyoung; Lee, Wonjae; Chang, Jonghwa; Youn, Cheung

    2008-01-01

    The SI-simulator is an application software system that simulates the dynamic behavior of the VHTR-SI process by the use of mathematical models. Object-oriented analysis (OOA) and object-oriented design (OOD) methodologies were employed for the SI-simulator system development. OOA is concerned with developing software engineering requirements and specifications that are expressed as a system's object model (which is composed of a population of interacting objects), as opposed to the traditional data or functional views of systems. OOD techniques are useful for the development of large, complex systems. The OOA/OOD methodology is also usually employed to maximize the reusability and extensibility of a software system. In this paper, we present the design features of the SI-simulator software system obtained by using the OOA and OOD methodologies.

  12. Java programming fundamentals problem solving through object oriented analysis and design

    CERN Document Server

    Nair, Premchand S

    2008-01-01

    While Java texts are plentiful, it's difficult to find one that takes a real-world approach and encourages novice programmers to build on their Java skills through practical exercise. Written by an expert with 19 years of experience teaching computer programming, Java Programming Fundamentals presents object-oriented programming by employing examples taken from everyday life. It provides a foundation in object-oriented design principles and UML notation, describes common pitfalls and good programming practices, and furnishes supplemental links, documents, and programs on its companion website, www.premnair.net

  13. Adobe Boxes: Locating Object Proposals Using Object Adobes.

    Science.gov (United States)

    Fang, Zhiwen; Cao, Zhiguo; Xiao, Yang; Zhu, Lei; Yuan, Junsong

    2016-09-01

    Despite previous efforts on object proposals, the detection rates of existing approaches are still not satisfactory. To address this, we propose Adobe Boxes to efficiently locate potential objects with fewer proposals, by searching for object adobes, the salient object parts that are easy to perceive. Because of the visual difference between an object and its surroundings, an object adobe obtained from a local region has a high probability of being part of an object, which makes it capable of depicting the locative information of the proto-object. Our approach comprises three main procedures. First, coarse object proposals are acquired by employing randomly sampled windows. Then, based on local-contrast analysis, object adobes are identified within the enlarged bounding boxes that correspond to the coarse proposals. The final object proposals are obtained by converging the bounding boxes to tightly surround the object adobes. Meanwhile, our object adobes can also be used to refine the detection rate of most state-of-the-art methods. Extensive experiments on four challenging datasets (PASCAL VOC2007, VOC2010, VOC2012, and ILSVRC2014) demonstrate that the detection rate of our approach generally outperforms the state-of-the-art methods, especially with a relatively small number of proposals. The average time consumed on one image is about 48 ms, which nearly meets the real-time requirement.

  14. Classification and Regression Tree Analysis of Clinical Patterns that Predict Survival in 127 Chinese Patients with Advanced Non-small Cell Lung Cancer Treated by Gefitinib Who Failed to Previous Chemotherapy

    Directory of Open Access Journals (Sweden)

    Ziping WANG

    2011-09-01

    Full Text Available Background and objective It has been proven that gefitinib produces only 10%-20% tumor regression in heavily pretreated, unselected non-small cell lung cancer (NSCLC) patients in the second- and third-line settings. Asian ethnicity, female sex, non-smoking status and adenocarcinoma are favorable factors; however, it is difficult to find a patient satisfying all the above clinical characteristics. The aim of this study is to identify novel predictive factors and to explore the interactions between clinical variables and their impact on the survival of Chinese patients with advanced NSCLC who were heavily treated with gefitinib in the second- or third-line setting. Methods The clinical and follow-up data of 127 advanced NSCLC patients referred to the Cancer Hospital & Institute, Chinese Academy of Medical Sciences from March 2005 to March 2010 were analyzed. Multivariate analysis of progression-free survival (PFS) was performed using recursive partitioning, which is referred to as classification and regression tree (CART) analysis. Results The median PFS of the 127 eligible consecutive advanced NSCLC patients was 8.0 months (95%CI: 5.8-10.2). CART was performed with an initial split on first-line chemotherapy outcomes and a second split on patients' age. Three terminal subgroups were formed. The median PFS of the three subsets ranged from 1.0 month (95%CI: 0.8-1.2) for the subgroup with progressive disease after first-line chemotherapy, to 10 months (95%CI: 7.0-13.0) for patients with a partial response or stable disease in first-line chemotherapy and age <70, and 22.0 months (95%CI: 3.8-40.1) for patients obtaining a partial response or stable disease in first-line chemotherapy at age 70-81. Conclusion Partial response or stable disease in first-line chemotherapy and age ≥ 70 are closely correlated with long-term survival under gefitinib in the second- or third-line setting in advanced NSCLC. CART can be used to identify previously unappreciated patient
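
    Recursive partitioning of the CART kind repeatedly chooses the split that best separates outcome groups, commonly scored by Gini impurity; a minimal sketch of a single best-split search on one numeric variable such as age (a generic illustration, not the study's analysis):

```python
def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def best_split(values, labels):
    """Find the threshold on one numeric feature that minimizes the
    weighted Gini impurity of the two resulting child nodes."""
    best = (None, float("inf"))
    n = len(labels)
    for threshold in sorted(set(values))[:-1]:
        left = [l for v, l in zip(values, labels) if v <= threshold]
        right = [l for v, l in zip(values, labels) if v > threshold]
        score = len(left) / n * gini(left) + len(right) / n * gini(right)
        if score < best[1]:
            best = (threshold, score)
    return best  # (threshold, weighted impurity)
```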

  15. A Component Analysis of the Impact of Evaluative and Objective Feedback on Performance

    Science.gov (United States)

    Johnson, Douglas A.

    2013-01-01

    Despite the frequency with which performance feedback interventions are used in organizational behavior management, component analyses of such feedback are rare. It has been suggested that evaluation of performance and objective details about performance are two necessary components for performance feedback. The present study was designed to help…

  16. A Case Study on Coloured Petri Nets in Object-oriented Analysis and Design

    DEFF Research Database (Denmark)

    Barros, Joao Paulo; Jørgensen, Jens Bæk

    2005-01-01

    In this paper, we first demonstrate how a coloured Petri net (CPN) model can be used to capture requirements for a considered example system, an elevator controller. Then, we show how this requirements-level CPN model is transformed into a design-level object-oriented CPN model, which...

  17. An integrated approach for visual analysis of a multisource moving objects knowledge base

    NARCIS (Netherlands)

    Willems, N.; van Hage, W.R.; de Vries, G.; Janssens, J.H.M.; Malaisé, V.

    2010-01-01

    We present an integrated and multidisciplinary approach for analyzing the behavior of moving objects. The results originate from an ongoing research of four different partners from the Dutch Poseidon project (Embedded Systems Institute (2007)), which aims to develop new methods for Maritime Safety

  18. An Integrated Approach for Visual Analysis of a Multi-Source Moving Objects Knowledge Base

    NARCIS (Netherlands)

    Willems, C.M.E.; van Hage, W.R.; de Vries, G.K.D.; Janssens, J.; Malaisé, V.

    2010-01-01

    We present an integrated and multidisciplinary approach for analyzing the behavior of moving objects. The results originate from an ongoing research of four different partners from the Dutch Poseidon project (Embedded Systems Institute (2007)), which aims to develop new methods for Maritime Safety

  19. An integrated approach for visual analysis of a multi-source moving objects knowledge base

    NARCIS (Netherlands)

    Willems, N.; Hage, van W.R.; Vries, de G.; Janssens, J.H.M.; Malaisé, V.

    2010-01-01

    We present an integrated and multidisciplinary approach for analyzing the behavior of moving objects. The results originate from an ongoing research of four different partners from the Dutch Poseidon project (Embedded Systems Institute (2007)), which aims to develop new methods for Maritime Safety

  20. Optimum analysis of pavement maintenance using multi-objective genetic algorithms

    Directory of Open Access Journals (Sweden)

    Amr A. Elhadidy

    2015-04-01

    Full Text Available Road network expansion in Egypt is considered a vital issue for the development of the country. This is done while upgrading current road networks to increase safety and efficiency. A pavement management system (PMS) is a set of tools or methods that assist decision makers in finding optimum strategies for providing and maintaining pavements in a serviceable condition over a given period of time. A multi-objective optimization problem for pavement maintenance and rehabilitation strategies at the network level is discussed in this paper. A two-objective optimization model considers minimum action costs and maximum condition for the road network in use. In the proposed approach, Markov-chain models are used to predict the performance of road pavement and to calculate the expected decline at different periods of time. A genetic-algorithm-based procedure is developed for solving the multi-objective optimization problem. The model searches for the optimum maintenance actions to be implemented on an appropriate pavement at an adequate time. Based on the computed results, the Pareto optimal solutions of the two objective functions are obtained. From the optimal solutions, represented by cost and condition, a decision maker can easily obtain the information for maintenance and rehabilitation planning with minimum action costs and maximum condition. The developed model has been implemented on a network of roads and showed its ability to derive the optimal solution.
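
    Markov-chain deterioration models of the kind mentioned propagate a probability distribution over pavement condition states through a transition matrix, one period at a time; a minimal sketch (the transition probabilities below are hypothetical and purely illustrative):

```python
def predict_condition(state, transition, years):
    """Propagate a probability distribution over pavement condition
    states through a Markov transition matrix for a number of years."""
    for _ in range(years):
        state = [sum(state[i] * transition[i][j] for i in range(len(state)))
                 for j in range(len(transition[0]))]
    return state

# Hypothetical 3-state do-nothing model: good -> fair -> poor.
T = [[0.8, 0.2, 0.0],
     [0.0, 0.7, 0.3],
     [0.0, 0.0, 1.0]]
```

    Running such a model forward under each candidate maintenance plan gives the expected condition profile that the genetic algorithm trades off against action costs.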

  1. An Analysis of Learning Objectives and Content Coverage in Introductory Psychology Syllabi

    Science.gov (United States)

    Homa, Natalie; Hackathorn, Jana; Brown, Carrie M.; Garczynski, Amy; Solomon, Erin D.; Tennial, Rachel; Sanborn, Ursula A.; Gurung, Regan A. R.

    2013-01-01

    Introductory psychology is one of the most popular undergraduate courses and often serves as the gateway to choosing psychology as an academic major. However, little research has examined the typical structure of introductory psychology courses. The current study examined student learning objectives (SLOs) and course content in introductory…

  2. Object Selection Costs in Visual Working Memory: A Diffusion Model Analysis of the Focus of Attention

    Science.gov (United States)

    Sewell, David K.; Lilburn, Simon D.; Smith, Philip L.

    2016-01-01

    A central question in working memory research concerns the degree to which information in working memory is accessible to other cognitive processes (e.g., decision-making). Theories assuming that the focus of attention can only store a single object at a time require the focus to orient to a target representation before further processing can…

  3. Simple proteomics data analysis in the object-oriented PowerShell.

    Science.gov (United States)

    Mohammed, Yassene; Palmblad, Magnus

    2013-01-01

    Scripting languages such as Perl and Python are appreciated for solving simple, everyday tasks in bioinformatics. A more recent, object-oriented command shell and scripting language, Windows PowerShell, has many attractive features: an object-oriented interactive command line, fluent navigation and manipulation of XML files, ability to consume Web services from the command line, consistent syntax and grammar, rich regular expressions, and advanced output formatting. The key difference between classical command shells and scripting languages, such as bash, and object-oriented ones, such as PowerShell, is that in the latter the result of a command is a structured object with inherited properties and methods rather than a simple stream of characters. Conveniently, PowerShell is included in all new releases of Microsoft Windows and therefore already installed on most computers in classrooms and teaching labs. In this chapter we demonstrate how PowerShell in particular allows easy interaction with mass spectrometry data in XML formats, connection to Web services for tools such as BLAST, and presentation of results as formatted text or graphics. These features make PowerShell much more than "yet another scripting language."
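The object-pipeline idea the abstract contrasts with character streams can be approximated outside PowerShell as well; as a rough Python analogue, here is XML navigation over a tiny invented fragment standing in for a mass-spectrometry file (the element and attribute names are made up for illustration):

```python
import xml.etree.ElementTree as ET

# A tiny invented XML fragment standing in for a mass-spectrometry data file.
doc = """
<run>
  <spectrum id="s1" msLevel="1" peaks="312"/>
  <spectrum id="s2" msLevel="2" peaks="87"/>
  <spectrum id="s3" msLevel="2" peaks="145"/>
</run>
"""

root = ET.fromstring(doc)
# As in a PowerShell pipeline, each step passes structured objects, not text:
ms2 = [s for s in root.iter("spectrum") if s.get("msLevel") == "2"]
total_peaks = sum(int(s.get("peaks")) for s in ms2)
print(len(ms2), total_peaks)  # → 2 232
```

Filtering on element attributes rather than grepping raw text is the "structured object with properties" style the chapter attributes to PowerShell.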

  4. Analysis of Buried Dielectric Objects Using Higher-Order MoM for Volume Integral Equations

    DEFF Research Database (Denmark)

    Kim, Oleksiy S.; Meincke, Peter; Breinbjerg, Olav

    2004-01-01

    A higher-order method of moments (MoM) is applied to solve a volume integral equation for dielectric objects in layered media. In comparison to low-order methods, the higher-order MoM, which is based on higher-order hierarchical Legendre vector basis functions and curvilinear hexahedral elements,...

  5. Analysis of porous media and objects of cultural heritage by mobile NMR

    International Nuclear Information System (INIS)

    Haber, Agnes

    2012-01-01

    Low-field NMR techniques are used to study porous systems, from simple to complex structures, and objects of cultural heritage. It is shown that NMR relaxometry can be used to study fluid dynamics inside a porous system. A simple theoretical model for multi-site relaxation exchange NMR is used to extract exchange kinetic parameters when applied to model porous systems. It provides a first step towards the study of more complex systems with continuous relaxation distributions, such as soil systems or building materials. Moisture migration is observed in the soil systems with the help of 1D and 2D NMR relaxometry methods. In the case of the concrete samples, differences in composition make a significant difference in water uptake ability. The single-sided NMR sensor proves to be a useful tool for on-site measurements. This is particularly important for cultural heritage objects, as most of them cannot be moved out of their environment. Mobile NMR turns out to be a simple but reliable and powerful tool to investigate moisture distributions and pore structures in porous media as well as the conservation state and history of objects of cultural heritage.

  6. Analysis of Optical Variations of BL Lac Object AO 0235+164 Wang ...

    Indian Academy of Sciences (India)

    Statistically meaningful values are obtained for the cross-correlation time lags. The multi-band optical data are collected on the object AO 0235+164, and the time lags among the B, V, R and I bands have been analysed.

  7. Development and Factor Analysis of an Instrument to Measure Preservice Teachers' Perceptions of Learning Objects

    Science.gov (United States)

    Sahin, Sami

    2010-01-01

    The purpose of this study was to develop a questionnaire to measure student teachers' perception of digital learning objects. The participants included 308 voluntary senior students attending courses in a college of education of a public university in Turkey. The items were extracted to their related factors by the principal axis factoring method.…

  8. An analysis of nature and mechanisms of the Lira objects territories' radionuclide contamination

    International Nuclear Information System (INIS)

    Kadyrzhanov, K.K; Tuleushev, A.Zh.; Lukashenko, S.N.; Solodukhin, V.P.; Kazachevskij, I.V.; Reznikov, S.V.

    2001-01-01

    This paper presents the results of a study of the radioactive contamination of the 'Lira' object territories. The data obtained show that the existing radiation situation poses no threat to the personnel working at the deposit and its objects, much less to the inhabitants of the nearest settlements. Nevertheless, radionuclide concentrations in the soils of the examined areas slightly exceed the background values characteristic of this region. Two hypotheses for the observed radionuclide contamination were considered: (1) release to the surface and dispersal over the territory, immediately after the explosions, of the inert gases 137Xe and 90Kr, the genetic precursors of 137Cs and 90Sr, respectively; and (2) a continuous efflux of these radionuclides to the surface from the 'ditch cavities' of the 'Lira' objects through zones of loosening and crack propagation in the Earth's crust. To test these hypotheses, the depth distribution of radionuclides in the soil was examined in the vicinity of the emplacement wells (TK-2 and TK-5), as well as in the bank and riverbed of the Berezovka river. No data confirm the hypothesis of a continuous radionuclide efflux from the 'ditch cavities'; hence the hypothesis that the 'Lira' object territories were contaminated by release of inert gases to the surface is the more plausible one

  9. A multi-level object store and its application to HEP data analysis

    International Nuclear Information System (INIS)

    May, E.; Lifka, D.; Malon, D.; Grossman, R.L.; Qin, X.; Valsamis, D.; Xu, W.

    1994-01-01

    We present the design and demonstration of a scientific data manager consisting of a low-overhead, high-performance object store interfaced to a hierarchical storage system. This was done within the framework of the Mark1 testbeds of the PASS project

  10. An Achievement Degree Analysis Approach to Identifying Learning Problems in Object-Oriented Programming

    Science.gov (United States)

    Allinjawi, Arwa A.; Al-Nuaim, Hana A.; Krause, Paul

    2014-01-01

    Students often face difficulties while learning object-oriented programming (OOP) concepts. Many papers have presented various assessment methods for diagnosing learning problems to improve the teaching of programming in computer science (CS) higher education. The research presented in this article illustrates that although max-min composition is…

  11. Earlier and greater hand pre-shaping in the elderly: a study based on kinematic analysis of reaching movements to grasp objects.

    Science.gov (United States)

    Tamaru, Yoshiki; Naito, Yasuo; Nishikawa, Takashi

    2017-11-01

    Elderly people are less able to manipulate objects skilfully than young adults. Although previous studies have examined age-related deterioration of hand movements with a focus on the phase after grasping objects, the changes in the reaching phase have not been studied thus far. We aimed to examine whether changes in hand shape patterns during the reaching phase of grasping movements differ between young adults and the elderly. Ten healthy elderly adults and 10 healthy young adults were examined using the Simple Test for Evaluating Hand Functions and kinematic analysis of hand pre-shaping in reach-to-grasp tasks. The results were then compared between the two groups. For the kinematic analysis, we measured the time of peak tangential velocity of the wrist and the inter-fingertip distance (the distance between the tips of the thumb and index finger) at different time points. The results showed that the elderly group's performance on the Simple Test for Evaluating Hand Functions was significantly lower than that of the young adult group, irrespective of whether the dominant or non-dominant hand was used, indicating deterioration of hand movement in the elderly. The peak tangential velocity of the wrist in either hand appeared significantly earlier in the elderly group than in the young adult group. The elderly group also showed larger inter-fingertip distances with arch-like fingertip trajectories compared to the young adult group for all object sizes. To perform accurate prehension, elderly people have an earlier peak tangential velocity point than young adults. This allows a longer adjustment time for reaching and grasping movements and reduces errors in object prehension by opening the hand and fingers wider. Elderly individuals gradually modify their strategy based on previous successes and failures during daily living to compensate for their decline in dexterity and operational capabilities. © 2017 Japanese Psychogeriatric Society.
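The peak-tangential-velocity measure used in this study can be computed from a sampled wrist trajectory as sketched below; the trajectory and the 100 Hz sampling rate are synthetic assumptions, not the study's data:

```python
import numpy as np

fs = 100.0                      # assumed sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)   # one 1-second reach

# Synthetic minimum-jerk-like reach: wrist moves smoothly 0.3 m along x.
x = 0.3 * (10 * t**3 - 15 * t**4 + 6 * t**5)
y = np.zeros_like(t)
z = np.zeros_like(t)

# Tangential velocity = magnitude of the 3D velocity vector.
v = np.sqrt(np.gradient(x, 1 / fs) ** 2
            + np.gradient(y, 1 / fs) ** 2
            + np.gradient(z, 1 / fs) ** 2)
t_peak = t[np.argmax(v)]
print(round(t_peak, 2))
```

For the smooth, symmetric reach simulated here the peak falls at the movement midpoint; the study's finding is that in elderly participants this peak occurs relatively earlier in the reach.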

  12. Object and Objective Lost?

    DEFF Research Database (Denmark)

    Lopdrup-Hjorth, Thomas

    2015-01-01

    This paper explores the erosion and problematization of ‘the organization’ as a demarcated entity. Utilizing Foucault's reflections on ‘state-phobia’ as a source of inspiration, I show how an organization-phobia has gained a hold within Organization Theory (OT). By attending to the history...... of this organization-phobia, the paper argues that OT has become increasingly incapable of speaking about its core object. I show how organizations went from being conceptualized as entities of major importance to becoming theoretically deconstructed and associated with all kinds of ills. Through this history......, organizations as distinct entities have been rendered so problematic that they have gradually come to be removed from the center of OT. The costs of this have been rather significant. Besides undermining the grounds that gave OT intellectual credibility and legitimacy to begin with, the organization-phobia...

  13. Effect of B vitamins and lowering homocysteine on cognitive impairment in patients with previous stroke or transient ischemic attack: a prespecified secondary analysis of a randomized, placebo-controlled trial and meta-analysis.

    Science.gov (United States)

    Hankey, Graeme J; Ford, Andrew H; Yi, Qilong; Eikelboom, John W; Lees, Kennedy R; Chen, Christopher; Xavier, Denis; Navarro, Jose C; Ranawaka, Udaya K; Uddin, Wasim; Ricci, Stefano; Gommans, John; Schmidt, Reinhold; Almeida, Osvaldo P; van Bockxmeer, Frank M

    2013-08-01

    High plasma total homocysteine (tHcy) has been associated with cognitive impairment, but lowering tHcy with B vitamins has produced equivocal results. We aimed to determine whether B-vitamin supplementation would reduce tHcy and the incidence of new cognitive impairment among individuals with stroke or transient ischemic attack ≥6 months previously. A total of 8164 patients with stroke or transient ischemic attack were randomly allocated to double-blind treatment with one tablet daily of B vitamins (folic acid, 2 mg; vitamin B6, 25 mg; vitamin B12, 500 μg) or placebo and followed up for 3.4 years (median) in the VITAmins TO Prevent Stroke (VITATOPS) trial. For this prespecified secondary analysis of VITATOPS, the primary outcome was a new diagnosis of cognitive impairment, defined as a Mini-Mental State Examination (MMSE) score <24 assessed ≥6 months after the qualifying stroke; 2608 participants were cognitively unimpaired (MMSE≥24), of whom 2214 participants (1110 B-vitamins versus 1104 placebo) had follow-up MMSEs during 2.8 years (median). At final follow-up, allocation to B-vitamins, compared with placebo, was associated with a reduction in mean tHcy (10.2 μmol/L versus 14.2 μmol/L). Daily supplementation with folic acid, vitamin B6, and vitamin B12 in a self-selected clinical trial cohort of cognitively unimpaired patients with previous stroke or transient ischemic attack thus lowered mean tHcy but had no effect on the incidence of cognitive impairment or cognitive decline, as measured by the MMSE, during a median of 2.8 years. URL: http://www.controlled-trials.com. Unique identifier: ISRCTN74743444; URL: http://www.clinicaltrials.gov. Unique identifier: NCT00097669.

  14. Children’s everyday exposure to food marketing: an objective analysis using wearable cameras

    OpenAIRE

    Signal, L. N.; Stanley, J.; Smith, M.; Barr, M. B.; Chambers, T. J.; Zhou, J.; Duane, A.; Gurrin, C.; Smeaton, A. F.; McKerchar, C.; Pearson, A. L.; Hoek, J.; Jenkin, G. L. S.; Ni Mhurchu, C.

    2017-01-01

    Background Over the past three decades the global prevalence of childhood overweight and obesity has increased by 47%. Marketing of energy-dense nutrient-poor foods and beverages contributes to this worldwide increase. Previous research on food marketing to children largely uses self-report, reporting by parents, or third-party observation of children’s environments, with the focus mostly on single settings and/or media. This paper reports on innovative research, Kids’Cam, in which children w...

  15. Object position and image magnification in dental panoramic radiography: a theoretical analysis.

    Science.gov (United States)

    Devlin, H; Yuan, J

    2013-01-01

    The purpose of our study was to investigate how image magnification and distortion in dental panoramic radiography are influenced by object size and position for a small round object such as a ball bearing used for calibration. Two ball bearings (2.5 mm and 6 mm in diameter) were placed at approximately the same position between the teeth of a plastic skull and radiographed 21 times. The skull was replaced each time. Their images were measured by software using edge detection and ellipse-fitting algorithms. Using a standard definition of magnification, equations were derived to enable an object's magnification to be determined from its position and vice versa knowing the diameter and machine parameters. The average magnification of the 2.5 mm ball bearing was 1.292 (0.0445) horizontally and 1.257 (0.0067) vertically with a mean ratio of 1.028 (0.0322); standard deviations are in parentheses. The figures for the 6 mm ball bearing were 1.286 (0.0068), 1.255 (0.0018) and 1.025 (0.0061), respectively. Derived positions of each ball bearing from magnification were more consistent horizontally than vertically. There was less variation in either direction for the 6 mm ball bearing than the 2.5 mm one. Automatic measurement of image size resulted in less variation in vertical magnification values than horizontal. There are only certain positions in the focal trough that achieve zero distortion. Object location can be determined from its diameter, measured magnification and machine parameters. The 6 mm diameter ball bearing is preferable to the 2.5 mm one for more reliable magnification measurement and position determination.
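Under the standard definition the authors use, magnification is simply measured image size divided by true object size; a minimal sketch with invented ellipse-fit measurements chosen to mirror the reported means:

```python
TRUE_DIAMETER = 6.0  # mm ball bearing, as in the study

def magnification(measured_mm, true_mm=TRUE_DIAMETER):
    """Image magnification = measured image size / true object size."""
    return measured_mm / true_mm

# Illustrative ellipse-fit measurements of the ball's image, in mm:
horizontal = magnification(7.716)   # fitted major (horizontal) axis
vertical = magnification(7.530)     # fitted minor (vertical) axis
ratio = horizontal / vertical       # distortion: 1.0 means none
print(round(horizontal, 3), round(vertical, 3), round(ratio, 3))
```

A horizontal-to-vertical ratio above 1 indicates the image is stretched horizontally relative to vertically, which is the distortion the panoramic geometry introduces away from the ideal focal-trough position.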

  16. Distributed dendritic processing facilitates object detection: a computational analysis on the visual system of the fly.

    Science.gov (United States)

    Hennig, Patrick; Möller, Ralf; Egelhaaf, Martin

    2008-08-28

    Detecting objects is an important task when moving through a natural environment. Flies, for example, may land on salient objects or may avoid collisions with them. The neuronal ensemble of Figure Detection cells (FD-cells) in the visual system of the fly is likely to be involved in controlling these behaviours, as these cells are more sensitive to objects than to extended background structures. Until now, the computations in the presynaptic neuronal network of FD-cells and, in particular, the functional significance of the experimentally established distributed dendritic processing of excitatory and inhibitory inputs have not been understood. We use model simulations to analyse the neuronal computations responsible for the preference of FD-cells for small objects. We employed a new modelling approach which allowed us to account for the spatial spread of electrical signals in the dendrites while avoiding detailed compartmental modelling. The models are based on available physiological and anatomical data. Three models were tested, each implementing an inhibitory neural circuit but differing in the spatial arrangement of the inhibitory interaction. Parameter optimisation with an evolutionary algorithm revealed that only distributed dendritic processing satisfies the constraints arising from electrophysiological experiments. In contrast to a direct dendro-dendritic inhibition of the FD-cell (Direct Distributed Inhibition model), an inhibition of its presynaptic retinotopic elements (Indirect Distributed Inhibition model) requires smaller changes in input resistance in the inhibited neurons during visual stimulation. Distributed dendritic inhibition of retinotopic elements as implemented in our Indirect Distributed Inhibition model is the most plausible wiring scheme for the neuronal circuit of FD-cells. This microcircuit is computationally similar to lateral inhibition between the retinotopic elements. Hence, distributed inhibition might be an alternative explanation of
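The computational similarity to lateral inhibition noted in the abstract can be illustrated with a toy retinotopic array, in which each element is excited by local input and inhibited by its two neighbours; all parameters are invented and this is not the authors' model:

```python
import numpy as np

def fd_response(stimulus, w_inhib=0.6):
    """Toy FD-cell: each retinotopic element's excitation is reduced by
    inhibition pooled from its two neighbours; the rectified remainder is
    summed over the array (circular boundary via np.roll)."""
    inhib = w_inhib * (np.roll(stimulus, 1) + np.roll(stimulus, -1))
    return np.maximum(stimulus - inhib, 0.0).sum()

n = 40
small_object = np.zeros(n)
small_object[19:21] = 1.0        # a 2-element "object"
extended = np.ones(n)            # a full-field "background" pattern

print(fd_response(small_object), fd_response(extended))
```

With sufficiently strong neighbour inhibition, a spatially extended pattern suppresses itself almost completely, while a small object is only partially inhibited, reproducing the small-object preference qualitatively.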

  17. DOCUMENTATION OF HISTORICAL UNDERGROUND OBJECT IN SKORKOV VILLAGE WITH SELECTED MEASURING METHODS, DATA ANALYSIS AND VISUALIZATION

    Directory of Open Access Journals (Sweden)

    A. Dlesk

    2016-06-01

    Full Text Available The author analyzes current methods of 3D documentation of historical tunnels in Skorkov village, which lies on the Jizera river, approximately 30 km from Prague. The area is known as a former military camp from the Thirty Years' War in the 17th century. There is an extensive underground compound with one entrance corridor and two transverse corridors, situated approximately 2 to 5 m under the local development. The object has been partly documented by the geodetic polar method, intersection photogrammetry, image-based modelling and laser scanning. The data have been analyzed and the methods compared. A 3D model of the object has then been created and combined with cadastral data, an orthophoto, historical maps and a digital surface model produced photogrammetrically using a remotely piloted aircraft system. Measurements were then taken with ground-penetrating radar, and the results were analyzed and compared with the real state of the object. All the data have been combined and visualized in one 3D model. Finally, the advantages and disadvantages of the measuring methods used are discussed. The tested methodology has also been used for documentation of other historical objects in this area. This project was created as part of research at the EuroGV s.r.o. company led by Ing. Karel Vach CSc., in cooperation with prof. Dr. Ing. Karel Pavelka from the Czech Technical University in Prague and Miloš Gavenda, the renovator.

  18. Data Quality Objectives for Regulatory Requirements for Dangerous Waste Sampling and Analysis

    International Nuclear Information System (INIS)

    MULKEY, C.H.

    1999-01-01

    This document describes sampling and analytical requirements needed to meet state and federal regulations for dangerous waste (DW). The River Protection Project (RPP) is assigned the task of storage and interim treatment of hazardous waste. Any final treatment or disposal operations, as well as requirements under the land disposal restrictions (LDRs), fall under the jurisdiction of another Hanford organization and are not part of this scope. The requirements for this Data Quality Objective (DQO) Process were developed using the RPP Data Quality Objective Procedure (Banning 1996), which is based on the U.S. Environmental Protection Agency's (EPA) Guidance for the Data Quality Objectives Process (EPA 1994). Hereafter, this document is referred to as the DW DQO. Federal and state laws and regulations pertaining to waste contain requirements that depend upon the composition of the waste stream. These regulatory drivers require that pertinent information be obtained. For many requirements, documented process knowledge of a waste composition can be used instead of analytical data to characterize or designate a waste. When process knowledge alone is used to characterize a waste, it is a best management practice to validate the information with analytical measurements.

  19. Documentation of Historical Underground Object in Skorkov Village with Selected Measuring Methods, Data Analysis and Visualization

    Science.gov (United States)

    Dlesk, A.

    2016-06-01

    The author analyzes current methods of 3D documentation of historical tunnels in Skorkov village, which lies on the Jizera river, approximately 30 km from Prague. The area is known as a former military camp from the Thirty Years' War in the 17th century. There is an extensive underground compound with one entrance corridor and two transverse corridors, situated approximately 2 to 5 m under the local development. The object has been partly documented by the geodetic polar method, intersection photogrammetry, image-based modelling and laser scanning. The data have been analyzed and the methods compared. A 3D model of the object has then been created and combined with cadastral data, an orthophoto, historical maps and a digital surface model produced photogrammetrically using a remotely piloted aircraft system. Measurements were then taken with ground-penetrating radar, and the results were analyzed and compared with the real state of the object. All the data have been combined and visualized in one 3D model. Finally, the advantages and disadvantages of the measuring methods used are discussed. The tested methodology has also been used for documentation of other historical objects in this area. This project was created as part of research at the EuroGV s.r.o. company led by Ing. Karel Vach CSc., in cooperation with prof. Dr. Ing. Karel Pavelka from the Czech Technical University in Prague and Miloš Gavenda, the renovator.

  20. Data Quality Objectives for Regulatory Requirements for Dangerous Waste Sampling and Analysis; FINAL

    International Nuclear Information System (INIS)

    MULKEY, C.H.

    1999-01-01

    This document describes sampling and analytical requirements needed to meet state and federal regulations for dangerous waste (DW). The River Protection Project (RPP) is assigned the task of storage and interim treatment of hazardous waste. Any final treatment or disposal operations, as well as requirements under the land disposal restrictions (LDRs), fall under the jurisdiction of another Hanford organization and are not part of this scope. The requirements for this Data Quality Objective (DQO) Process were developed using the RPP Data Quality Objective Procedure (Banning 1996), which is based on the U.S. Environmental Protection Agency's (EPA) Guidance for the Data Quality Objectives Process (EPA 1994). Hereafter, this document is referred to as the DW DQO. Federal and state laws and regulations pertaining to waste contain requirements that depend upon the composition of the waste stream. These regulatory drivers require that pertinent information be obtained. For many requirements, documented process knowledge of a waste composition can be used instead of analytical data to characterize or designate a waste. When process knowledge alone is used to characterize a waste, it is a best management practice to validate the information with analytical measurements.

  1. Multi-Objective Analysis of a CHP Plant Integrated Microgrid in Pakistan

    Directory of Open Access Journals (Sweden)

    Asad Waqar

    2017-10-01

    Full Text Available In developing countries like Pakistan, the capacity shortage (CS) of electricity is a critical problem. Frequent natural gas (NG) outages compel consumers to use electricity to meet thermal loads, which further increases the electrical load. In this scenario, the authors propose a combined heat & power (CHP) plant as a better option for supplying both electrical and thermal loads simultaneously. A CHP plant-based microgrid comprising a PV array, diesel generators and batteries (operating in grid-connected as well as islanded modes) has been simulated using the HOMER Pro software. Different configurations of distributed generators (DGs) with/without batteries have been evaluated against multiple objectives: minimization of the total net present cost (TNPC), cost of generated energy (COE) and annual greenhouse gas (GHG) emissions, and maximization of annual waste heat recovery (WHR) of thermal units and annual grid sales (GS). These objectives are subject to the constraints of power balance, battery operation within state-of-charge (SOC) limits, generator operation within capacity limits and zero capacity shortage. The simulations have been performed for six cities: Islamabad, Lahore, Karachi, Peshawar, Quetta and Gilgit. The simulation results have been analyzed to find the most suitable city for the CHP plant integrated microgrid.

  2. An object-oriented framework for magnetic-fusion modeling and analysis codes

    International Nuclear Information System (INIS)

    Cohen, R H; Yang, T Y Brian.

    1999-01-01

    The magnetic-fusion energy (MFE) program, like many other scientific and engineering activities, needs to efficiently develop complex modeling codes that combine detailed models of components into an integrated model of a device, as well as a rich supply of legacy code that could provide the component models. There is also growing recognition in many technical fields of the desirability of steerable software: computer programs whose functionality can be changed by the user as they run. This project had as its goals the development of two key pieces of infrastructure needed to combine existing code modules, written mainly in Fortran, into flexible, steerable, object-oriented integrated modeling codes for magnetic-fusion applications. These two pieces are (1) a set of tools to facilitate the interfacing of Fortran code with a steerable object-oriented framework (which we have chosen to base on Python, an object-oriented interpreted language), and (2) a skeleton for the integrated modeling code which defines the relationships between the modules. The first of these activities obviously has immediate applicability to a spectrum of projects; the second is more focused on the MFE application, but may be of value as an example for other applications.
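A minimal sketch of the "steerable" skeleton idea, assuming nothing about the project's actual API (all names below are invented): component models register under a name, and the user can swap one for another from the interactive interpreter while the integrated model is running:

```python
# Minimal steerable-framework sketch: an integrated model whose component
# models can be replaced by the user at run time (invented names throughout).
class ModelRegistry:
    def __init__(self):
        self._models = {}

    def register(self, name, model):
        self._models[name] = model       # replaces any previous model

    def run(self, name, *args):
        return self._models[name](*args)

registry = ModelRegistry()

# Two interchangeable component models (stand-ins for wrapped Fortran code):
registry.register("transport", lambda n_e: 0.5 * n_e)
print(registry.run("transport", 4.0))               # simple model: 2.0

registry.register("transport", lambda n_e: 0.5 * n_e ** 2)  # hot-swap
print(registry.run("transport", 4.0))               # refined model: 8.0
```

Because the dispatch happens through the registry at call time, swapping a registered model changes the behavior of the integrated code without restarting it, which is the essence of steerability the abstract describes.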

  3. Neutron activation analysis capability of natural objects' estimation for Latvian environment

    International Nuclear Information System (INIS)

    Damburg, N.A.; Mednis, I.V.; Taure, I.Ya.; Virtsavs, M.V.

    1989-01-01

    A review of literature data and the NAA techniques developed by the authors for the analysis of environmental samples (aerosols, fly ash, soil, pine needles, natural and technological waters) is presented. The methods are used for the routine analysis of samples from the environs of industrial and power plants in Latvia to investigate and control local pollution with heavy metals, arsenic and halogens.

  4. Global Warming and Geographically Scalar Climatic Objects Exist: An Ontologically Realist and Object-Oriented Analysis of the Daymet TMAX Climate Summaries for North America

    Science.gov (United States)

    Jackson, C. P.

    2017-12-01

    The scientific materialist worldview, what Peter Unger refers to as the Scientiphical worldview, or Scientiphicalism, has been utterly catastrophic for mesoscale objects in general, but, with its closely associated twentieth-century formal logic, this has been especially true for notoriously vague things like climate change, coastlines, mountains and dust storms. That is, any so-called representations or references ultimately suffer the same ontological demise as their referents, no matter how well-defined their boundaries may in fact be. Against this reductionist metaphysics, climatic objects are discretized within three separate ontologically realist systems, Graham Harman's object-oriented philosophy, or ontology (OOO), Markus Gabriel's ontology of fields of sense (OFS) and Tristan Garcia's two systems and new order of time, so as to make an ontological case for any geographically scalar object, beginning with pixels, as well as any notoriously vague thing they are said to represent. Four-month overlapping TMAX seasonals were first developed from the Oak Ridge National Laboratory (ORNL) Daymet climate temperature maximum (TMAX) monthly summaries (1980-2016) for North America and segmented within Trimble's eCognition Developer using the simple and widely familiar quadtree algorithm with a scale parameter of four, in this example. The regression coefficient was then calculated for the resulting 37-year climatic objects and an equally simple classification was applied. The same segmentation and classification was applied to the Daymet annual summaries, as well, for comparison. As was expected, the mean warming and cooling trends are lowest for the annual summary TMAX climatic objects. However, the Fall (SOND) season has the largest and smallest areas of warming and cooling, respectively, and the highest mean trend for warming objects. Conversely, Spring (MAMJ) has the largest and smallest areas undergoing cooling and warming, respectively. Finally, Summer (JJAS
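The per-object trend computation the abstract describes (a regression coefficient over a 37-year stack of seasonal summaries) reduces to an ordinary least-squares slope per segmented object; a toy version on synthetic data (all values invented):

```python
import numpy as np

years = np.arange(1980, 2017)   # 37 annual TMAX summaries, as in Daymet
rng = np.random.default_rng(0)

# Synthetic mean TMAX series for one segmented object:
# a 0.03 °C/yr warming trend plus observational noise.
tmax = 20.0 + 0.03 * (years - 1980) + rng.normal(0.0, 0.1, years.size)

slope = np.polyfit(years, tmax, 1)[0]   # regression coefficient, °C per year
label = "warming" if slope > 0 else "cooling"
print(label, round(slope, 3))
```

In the study this slope is computed per quadtree segment rather than per pixel, and the sign and magnitude of the fitted coefficient drive the warming/cooling classification.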

  5. Multi-objective analysis of the conjunctive use of surface water and groundwater in a multisource water supply system

    Science.gov (United States)

    Vieira, João; da Conceição Cunha, Maria

    2017-04-01

    A multi-objective decision model has been developed to identify the Pareto-optimal set of management alternatives for the conjunctive use of surface water and groundwater of a multisource urban water supply system. A multi-objective evolutionary algorithm, Borg MOEA, is used to solve the multi-objective decision model. The multiple solutions can be shown to stakeholders, allowing them to choose their own solutions depending on their preferences. The multisource urban water supply system studied here depends on surface water and groundwater and is located in the Algarve region, the southernmost province of Portugal, with a typical warm Mediterranean climate. The rainfall is low, intermittent and concentrated in a short winter, followed by a long dry period. A base population of 450 000 inhabitants and visits by more than 13 million tourists per year, mostly in summertime, make water management critical and challenging. Previous studies using single-objective optimization, after aggregating the multiple objectives together, have already concluded that only an integrated and interannual water resources management perspective can be efficient for water resource allocation in this drought-prone region. A simulation model of the multisource urban water supply system, using mathematical functions to represent the water balance in the surface reservoirs, the groundwater flow in the aquifers, and the water transport in the distribution network with explicit representation of water quality, is coupled with Borg MOEA. The multi-objective problem formulation includes five objectives. Two objectives separately evaluate the water quantity and the water quality supplied for urban use over a finite time horizon, one objective calculates the operating costs, and two objectives appraise the state of the two water sources - the storage in the surface reservoir and the piezometric levels in the aquifer - at the end of the time horizon. The decision variables are the volume of withdrawals from
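The Pareto-optimal set that Borg MOEA returns consists of the non-dominated solutions, and the dominance test itself is simple; a sketch with invented two-objective vectors (both objectives to be minimized):

```python
def dominates(a, b):
    """True if solution a is at least as good as b on every (minimized)
    objective and strictly better on at least one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(solutions):
    """Keep only solutions not dominated by any other solution."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o is not s)]

# Invented (operating cost, water-quality deficit) vectors for candidate plans:
candidates = [(3.0, 9.0), (5.0, 4.0), (4.0, 6.0), (6.0, 5.0), (8.0, 2.0)]
print(pareto_front(candidates))
```

Here (6.0, 5.0) is dominated by (5.0, 4.0) and is dropped; the surviving trade-off curve is what the stakeholders choose from according to their preferences.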

  6. Life table analysis of the United States' Year 2000 mortality objectives.

    Science.gov (United States)

    Rockett, I R; Pollard, J H

    1995-06-01

    The US Year 2000 mortality objectives are model standards cast as targeted changes in age-adjusted cause-specific death rates. This research centred on the projected impact of such changes on life expectancy and the mortality toll for each sex. A computer simulation was conducted using single decrement, multiple decrement and cause-elimination life table techniques, together with a decomposition procedure. Male and female life expectancy at birth was projected to increase by 1.71 and 1.51 years, respectively, between the designated 1987 baseline and 2000. The leading beneficiaries would be those aged 65 and older, followed by those aged 45-64, and infants. Declines in coronary heart disease, stroke and injury death rates would most influence the projected life expectancy changes, irrespective of sex. Approximately 782,000 male deaths and 730,000 female deaths would be averted under Year 2000 assumptions. Life expectancy would be a useful summary measure to incorporate into official evaluations of the Year 2000 mortality objectives. Targeting of excess male mortality in the US and other highly industrialized nations is recommended.
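The single-decrement life table computation underlying such projections can be sketched as follows. This is a minimal abridged life table assuming mid-interval deaths and an open-ended last age group; the rates are hypothetical, not the 1987 US baseline.

```python
def life_expectancy(widths, rates):
    """Life expectancy at birth from an abridged single-decrement life
    table: 'widths' are age-interval lengths in years, 'rates' the
    corresponding death rates m_x. Deaths are placed mid-interval and
    the last interval is open-ended (its person-years are l_x / m_x)."""
    l, T = 1.0, 0.0                       # survivors, total person-years
    for n, m in zip(widths[:-1], rates[:-1]):
        q = n * m / (1 + n * m / 2)       # probability of dying in the interval
        d = l * q
        T += n * (l - d) + (n / 2) * d    # person-years lived in the interval
        l -= d
    T += l / rates[-1]                    # open-ended final interval
    return T

# With a constant hazard the table reproduces the exponential result 1/m.
e0 = life_expectancy([1] * 5, [0.2] * 5)  # 5.0 years
```

Projecting the Year 2000 targets amounts to re-running such a table with the targeted cause-specific rates substituted in and differencing the resulting life expectancies.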

  7. Subjective and objective analysis of three water pump systems carried by forest firefighters.

    Science.gov (United States)

    Moser, Daniel J; Graham, Ryan B; Stevenson, Joan M; Costigan, Patrick A

    2014-01-01

    The Mark 3 (M3) water pump is an integral piece of wildfire-fighting equipment. However, it is provided to fire stations without a carrying harness, and the currently used carrying harness is very uncomfortable, especially when carrying the pump a considerable distance through a forest to reach a water source. The purpose of this study was to advise the Ontario Ministry of Natural Resources on the selection of a new M3 load carriage system. Twenty Fire Rangers wore the three systems (Original, Prototype, and Modified) through a circuit of tasks representative of their working environment. Subjective and objective approaches were combined to assess and rank the M3 carriage systems. Subjective visual analogue scale ratings were obtained for ease of loading/unloading, comfort, system stability, and overall performance. Tri-axial accelerometers were mounted on each pump and at the sternum of each participant to determine relative pump-carrier accelerations. Overall, the Prototype was ranked as the best system; it resulted in the lowest relative pump-carrier accelerations on 10 of 15 objective measures and received a first-place ranking on all subjective measures. It was recommended that the Prototype be implemented as the M3 carriage system for fire suppression teams.
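One plausible reading of the relative pump-carrier acceleration measure is sketched below; the authors' exact metric is not stated in the record, so the per-axis RMS difference used here is an assumption.

```python
import math

def rms_relative_acceleration(pump, carrier):
    """Root-mean-square of the sample-wise difference between pump and
    carrier accelerations along one axis. A hypothetical summary of the
    'relative pump-carrier accelerations', not the authors' exact metric."""
    diffs = [p - c for p, c in zip(pump, carrier)]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# A rigid, well-coupled harness drives this toward zero.
loose = rms_relative_acceleration([0.2, 0.5, -0.1], [0.0, 0.1, 0.1])
tight = rms_relative_acceleration([0.2, 0.5, -0.1], [0.2, 0.5, -0.1])  # 0.0
```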

  8. Optimal Waste Load Allocation Using Multi-Objective Optimization and Multi-Criteria Decision Analysis

    Directory of Open Access Journals (Sweden)

    L. Saberi

    2016-10-01

    Introduction: Increasing demand for water, depletion of resources of acceptable quality, and excessive water pollution due to agricultural and industrial development have caused intense social and environmental problems all over the world. Given the environmental importance of rivers, the complexity and extent of pollution factors, and the physical, chemical and biological processes in these systems, optimal waste-load allocation in river systems has received considerable attention in the literature in recent decades. The overall objective of planning and quality management of river systems is to develop and implement a coordinated set of strategies and policies to reduce or allocate the pollution entering rivers so that water quality meets the proposed environmental standards with acceptable reliability. In such problems there are often several decision makers with different utilities, which leads to conflicts. Methods/Materials: In this research, a conflict-resolution framework for optimal waste load allocation in river systems is proposed, considering the total treatment cost and the Biological Oxygen Demand (BOD) violation characteristics. There are two decision makers with conflicting objectives: the coalition of waste load dischargers and the environmentalists. The framework contains an embedded river water quality simulator, which simulates the transport process including reaction kinetics. The trade-off curve between the objectives is obtained using a Multi-Objective Particle Swarm Optimization (MOPSO) algorithm; the objectives are minimization of the total cost of treatment and of the penalties that must be paid by dischargers for violating the water quality standards in terms of the BOD parameter, which is monitored by the environmentalists. Thus, the basic policy of the river's water quality management is formulated in such a way that the decision makers are assured their benefits will be provided as far as possible. By using MOPSO
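River water quality simulators of the kind embedded in this framework are often built on the classical Streeter-Phelps equations; a minimal sketch is given below with generic, uncalibrated coefficients (the paper's own simulator is not reproduced).

```python
import math

def do_deficit(t, L0, D0, kd, ka):
    """Streeter-Phelps dissolved-oxygen deficit D(t) (mg/L) at travel
    time t (days) downstream of a BOD discharge. L0: initial ultimate
    BOD, D0: initial deficit, kd/ka: deoxygenation and reaeration
    rates (1/day)."""
    if math.isclose(kd, ka):              # equal-rate limiting form
        return (kd * t * L0 + D0) * math.exp(-ka * t)
    return (kd * L0 / (ka - kd)) * (math.exp(-kd * t) - math.exp(-ka * t)) \
        + D0 * math.exp(-ka * t)

# Hypothetical reach: 10 mg/L BOD load entering clean upstream water.
sag = do_deficit(1.0, 10.0, 0.0, kd=0.3, ka=0.6)
```

In a waste-load allocation setting, an optimizer such as MOPSO repeatedly evaluates a simulator like this to check whether a proposed set of discharges keeps the deficit within the standard.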

  9. Auditory Scene Analysis and sonified visual images. Does consonance negatively impact on object formation when using complex sonified stimuli?

    Directory of Open Access Journals (Sweden)

    David J Brown

    2015-10-01

    A critical task for the brain is the sensory representation and identification of perceptual objects in the world. When the visual sense is impaired, hearing and touch must take primary roles, and in recent times compensatory techniques have been developed that employ the tactile or auditory system as a substitute for the visual system. Visual-to-auditory sonifications provide a complex, feature-based auditory representation that must be decoded and integrated into an object-based representation by the listener. However, we do not yet know what role the auditory system plays in the object integration stage and whether the principles of auditory scene analysis apply. Here we used coarse sonified images in a two-tone discrimination task to test whether auditory feature-based representations of visual objects would be confounded when their features conflicted with the principles of auditory consonance. We found that listeners (N = 36) performed worse in an object recognition task when the auditory feature-based representation was harmonically consonant. We also found that this conflict was not negated by the provision of congruent audio-visual information. The findings suggest that early auditory processes of harmonic grouping dominate the object formation process, and that the complexity of the signal and additional sensory information have limited effect on this.

  10. Fashion Objects

    DEFF Research Database (Denmark)

    Andersen, Bjørn Schiermer

    2009-01-01

    This article attempts to create a framework for understanding modern fashion phenomena on the basis of Durkheim's sociology of religion. It focuses on Durkheim's conception of the relation between the cult and the sacred object, on his notion of 'exteriorisation', and on his theory of the social symbol, in an attempt to describe the peculiar attraction of the fashion object and its social constitution. However, Durkheim's notions of cult and ritual must undergo profound changes if they are to be used in an analysis of fashion. The article tries to expand the Durkheimian cult, radically enlarging... -- an outline which at the same time indicates the need for transformations of the Durkheimian model on decisive points. Thus, thirdly, it returns to Durkheim and undertakes to develop his concepts in a direction suitable for a sociological theory of fashion. Finally, it discusses the theoretical implications...

  11. Children's exposure to alcohol marketing within supermarkets: An objective analysis using GPS technology and wearable cameras.

    Science.gov (United States)

    Chambers, T; Pearson, A L; Stanley, J; Smith, M; Barr, M; Ni Mhurchu, C; Signal, L

    2017-07-01

    Exposure to alcohol marketing within alcohol retailers has been associated with higher rates of childhood drinking, brand recognition, and marketing recall. This study aimed to objectively measure children's everyday exposure to alcohol marketing within supermarkets. Children aged 11-13 (n = 167) each wore a wearable camera and GPS device for four consecutive days. Micro-spatial analyses were used to examine exposures within supermarkets. In alcohol-retailing supermarkets (n = 30), children encountered alcohol marketing on 85% of their visits (n = 78). Alcohol marketing was frequently placed near everyday goods (bread and milk) or the entrance/exit. Alcohol sales in supermarkets should be banned in order to protect children from alcohol marketing.

  12. Fuzzy multinomial logistic regression analysis: A multi-objective programming approach

    Science.gov (United States)

    Abdalla, Hesham A.; El-Sayed, Amany A.; Hamed, Ramadan

    2017-05-01

    Parameter estimation for multinomial logistic regression is usually based on maximizing the likelihood function. For large, well-balanced datasets, maximum likelihood (ML) estimation is a satisfactory approach. Unfortunately, ML can fail completely, or at least produce poor results in terms of estimated probabilities and confidence intervals of parameters, especially for small datasets. In this study, a new approach based on fuzzy concepts is proposed to estimate the parameters of multinomial logistic regression. The study assumes that the parameters of the multinomial logistic regression are fuzzy. Based on the extension principle stated by Zadeh and on Bárdossy's proposition, a multi-objective programming approach is suggested to estimate these fuzzy parameters. A simulation study is used to evaluate the performance of the new approach versus the maximum likelihood (ML) approach. Results show that the proposed model outperforms ML for small datasets.
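For reference, the ML baseline against which the fuzzy estimator is compared can be sketched as a plain gradient-descent multinomial logit. This is the standard approach only, not the proposed fuzzy multi-objective estimator.

```python
import numpy as np

def fit_multinomial_logit(X, y, classes, lr=0.1, steps=2000):
    """Maximum-likelihood multinomial logit fitted by batch gradient
    descent on the mean negative log-likelihood. X: (n, d) design
    matrix, y: integer class labels in 0..classes-1."""
    n, d = X.shape
    W = np.zeros((d, classes))
    Y = np.eye(classes)[y]                       # one-hot targets
    for _ in range(steps):
        Z = X @ W
        P = np.exp(Z - Z.max(axis=1, keepdims=True))
        P /= P.sum(axis=1, keepdims=True)        # softmax probabilities
        W -= lr * X.T @ (P - Y) / n              # NLL gradient step
    return W

# A tiny, perfectly separable illustration.
X = np.array([[1.0, 0.0], [1.0, 0.0], [0.0, 1.0], [0.0, 1.0]])
y = np.array([0, 0, 1, 1])
W = fit_multinomial_logit(X, y, classes=2)
preds = (X @ W).argmax(axis=1)                   # recovers y
```

On separable or tiny samples like this, the ML weights diverge toward infinity as training continues, which is exactly the small-sample pathology the fuzzy approach is meant to address.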

  13. Instrumental Supporting System for Developing and Analysis of Software-Defined Networks of Mobile Objects

    Directory of Open Access Journals (Sweden)

    V. A. Sokolov

    2015-01-01

    This article describes the organization principles for wireless mesh-networks (software-defined networks) of mobile objects. The emphasis is on obtaining effective routing algorithms for such networks. The mathematical model of the system is the standard transportation network. The key parameter of the routing system is the node reachability coefficient - a function of several basic and additional parameters ("mesh-factors") which characterize the route between two network nodes. Each pair (arc, node) is mapped to a composite parameter which characterizes the "reachability" of the node by the route which begins with this arc. The best ("shortest") route between two nodes is the route with the maximum reachability coefficient. The rules for building and refreshing the routing tables at the network nodes are described. With the announcement from a neighbor, a node obtains information about the connection energy and reliability, the time of receipt of the announcement, the absence of transitional nodes, and the connection capability. On the basis of this information the node applies a penalization (decreasing the reachability coefficient) or a reward (increasing the reachability coefficient) to all routes through this neighbor node. The penalization/reward scheme has several separate aspects: 1. Penalization for the age of the information. 2. Penalization/reward for the reliability of a node. 3. Penalization for the connection energy. 4. Penalization for the present connection capability. A simulator of the wireless mesh-network of mobile objects has been written, based on the suggested heuristic algorithms. The description and characteristics of the simulator are presented in the article, and the peculiarities of its implementation are also examined.
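The penalization/reward update of the reachability coefficient might be sketched as follows; the multiplicative form, the factor values, and the function name are illustrative assumptions, not the authors' exact scheme.

```python
def update_reachability(table, first_hop, dest, advertised, factors):
    """Fold multiplicative penalization/reward factors (assumed scheme:
    values < 1 penalize, > 1 reward) into a neighbor's advertised
    reachability coefficient, and keep the best-known coefficient per
    (first hop, destination) pair. Higher coefficient = better route."""
    coeff = advertised
    for f in factors:                      # e.g. age, reliability, energy
        coeff *= f
    key = (first_hop, dest)
    if coeff > table.get(key, 0.0):
        table[key] = coeff

routes = {}
update_reachability(routes, "B", "D", 0.9, [0.95, 1.05])  # age penalty, reliability reward
update_reachability(routes, "C", "D", 0.8, [1.0])
best_hop = max((k for k in routes if k[1] == "D"), key=routes.get)
```

Route selection then reduces to picking, for each destination, the first hop with the maximum stored coefficient, mirroring the "shortest route = maximum reachability" rule above.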

  14. Genome sequence analysis of five Canadian isolates of strawberry mottle virus reveals extensive intra-species diversity and a longer RNA2 with increased coding capacity compared to a previously characterized European isolate.

    Science.gov (United States)

    Bhagwat, Basdeo; Dickison, Virginia; Ding, Xinlun; Walker, Melanie; Bernardy, Michael; Bouthillier, Michel; Creelman, Alexa; DeYoung, Robyn; Li, Yinzi; Nie, Xianzhou; Wang, Aiming; Xiang, Yu; Sanfaçon, Hélène

    2016-06-01

    In this study, we report the genome sequence of five isolates of strawberry mottle virus (family Secoviridae, order Picornavirales) from strawberry field samples with decline symptoms collected in Eastern Canada. The Canadian isolates differed from the previously characterized European isolate 1134 in that they had a longer RNA2, resulting in a 239-amino-acid extension of the C-terminal region of the polyprotein. Sequence analysis suggests that reassortment and recombination occurred among the isolates. Phylogenetic analysis revealed that the Canadian isolates are diverse, grouping in two separate branches along with isolates from Europe and the Americas.

  15. A cardiovascular life history: a life course analysis of the original Framingham Heart Study cohort.

    NARCIS (Netherlands)

    Peeters, A.; Mamun, A.A.; Willekens, F.J.; Bonneux, L.

    2002-01-01

    dietary behaviour and to further examine the associations of different dietary compositions with selected characteristics. Design: Latent class analysis was applied to data from the recent cross-sectional National Family Health Survey that collected information on the intake frequency of selected

  16. Nonradioactive Dangerous Waste Landfill sampling and analysis plan and data quality objectives process summary report

    International Nuclear Information System (INIS)

    Smith, R.C.

    1997-08-01

    This sampling and analysis plan defines the sampling and analytical activities and associated procedures that will be used to support the Nonradioactive Dangerous Waste Landfill soil-gas investigation. This SAP consists of three sections: this introduction, the field sampling plan, and the quality assurance project plan. The field sampling plan defines the sampling and analytical methodologies to be performed

  17. Mapping of landslides under dense vegetation cover using object - oriented analysis and LiDAR derivatives

    NARCIS (Netherlands)

    Van Den Eeckhout, Miet; Kerle, N.; Hervas, Javier; Supper, Robert; Margottini, C.; Canuti, P.; Sassa, K.

    2013-01-01

    Light Detection and Ranging (LiDAR) and its wide range of derivative products have become a powerful tool in landslide research, particularly for landslide identification and landslide inventory mapping. In contrast to the many studies that use expert-based analysis of LiDAR derivatives to identify

  18. A comparative analysis of pixel- and object-based detection of landslides from very high-resolution images

    Science.gov (United States)

    Keyport, Ren N.; Oommen, Thomas; Martha, Tapas R.; Sajinkumar, K. S.; Gierke, John S.

    2018-02-01

    A comparative analysis of landslides detected by pixel-based and object-oriented analysis (OOA) methods was performed using very high-resolution (VHR) remotely sensed aerial images of San Juan La Laguna, Guatemala, which witnessed widespread devastation during Hurricane Stan in 2005. A 3-band orthophoto of 0.5 m spatial resolution, together with a field-based inventory of 115 landslides, was used for the analysis. A binary reference was assigned, with a value of zero for landslide and unity for non-landslide pixels. The pixel-based analysis was performed using unsupervised classification, which resulted in 11 different trial classes. Detection of landslides using OOA included a two-step K-means clustering to eliminate regions based on brightness, and elimination of false positives using object properties such as rectangular fit, compactness, length/width ratio, mean difference of objects, and slope angle. Both overall accuracy and F-score for the OOA method outperformed the pixel-based unsupervised classification in both the landslide and non-landslide classes. The overall accuracy for OOA and pixel-based unsupervised classification was 96.5% and 94.3%, respectively, whereas the best F-scores for landslide identification were 84.3% and 77.9%, respectively. Results indicate that OOA is able to identify the majority of landslides with few false positives when compared to pixel-based unsupervised classification.
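The accuracy figures quoted above follow from standard confusion-matrix formulas, sketched here with hypothetical counts (not the paper's own tallies):

```python
def accuracy_and_f1(tp, fp, fn, tn):
    """Overall accuracy and F-score of a binary landslide map against
    a reference inventory, from true/false positive/negative counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    overall = (tp + tn) / (tp + fp + fn + tn)
    f1 = 2 * precision * recall / (precision + recall)
    return overall, f1

overall, f1 = accuracy_and_f1(tp=8, fp=2, fn=1, tn=89)
```

Because non-landslide pixels vastly outnumber landslide pixels, overall accuracy is dominated by the majority class; the F-score is the more discriminating of the two, which is why the 84.3% vs. 77.9% gap matters more than the 96.5% vs. 94.3% one.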

  19. Exploring advantages of 4He-PIXE analysis for layered objects in cultural heritage

    International Nuclear Information System (INIS)

    Roehrs, S.; Calligaro, T.; Mathis, F.; Ortega-Feliu, I.; Salomon, J.; Walter, P.

    2006-01-01

    In the field of cultural heritage, 4He particle beams are often used to perform RBS analysis. In most cases the simultaneously produced X-rays are not considered for PIXE analysis. This paper aims to explore the potential of 4He-induced X-ray emission (α-PIXE) using 4, 5 and 6 MeV 4He beams and to compare its performance with that of conventional PIXE with 3 MeV protons. The α-PIXE and α-RBS spectra were collected at the same time in a vacuum chamber. The X-ray yields produced by the 6 MeV 4He beam for K-lines were found to be superior to those of protons for atomic numbers below 25. An additional advantage of α-PIXE is the lower bremsstrahlung background, which leads to an improved peak-to-noise ratio for certain elements.

  20. DECIPHERING THE FINEST IMPRINT OF GLACIAL EROSION: OBJECTIVE ANALYSIS OF STRIAE PATTERNS ON BEDROCK

    Directory of Open Access Journals (Sweden)

    Piet Stroeven

    2011-05-01

    The aim of this study is to compare the efficiency of different mathematical and statistical geometrical methods for characterising the orientation distribution of striae on bedrock, in order to decipher the finest imprint of glacial erosion. The methods involved include automatic image analysis by Fast Fourier Transform (FFT), and experimental investigation by means of Saltikov's directed-secants analysis (rose of intersection densities), applied to digital and analogue images of the striae pattern, respectively. In addition, the experimental data were compared with modelling results based on Underwood's concept of linear systems in a plane. The experimental and modelling approaches, in the framework of stereology, yield consistent results. These results reveal that stereological methods allow a reliable and efficient delineation of different families of glacial striae from a complex record imprinted in bedrock.

  1. Cryptocurrency as an object of economic analysis in insurance companies

    Directory of Open Access Journals (Sweden)

    O.O. Poplavskiy

    2016-12-01

    The article is devoted to topical issues concerning the economic nature of cryptocurrency and the development of theoretical approaches for its analysis in insurance companies. Attention is focused on the risks and opportunities of cryptocurrency in the modern economy and its place in the evolution of money. The author studied international experience with cryptocurrency operations in insurance companies in order to identify development trends in the industry and to determine their potential impact on the domestic insurance sector. The legal status of bitcoin, the most popular cryptocurrency, in different countries of the world is also considered. The author finds that bitcoin is prohibited in states with a low level of insurance density and penetration (such as Bangladesh, Russia and Indonesia) and is legalized in developed countries such as the USA, Japan and Germany. Major indicators for the analysis of cryptocurrency in insurance companies, which can be used for management decisions, are suggested.

  2. Multi-Objective data analysis using Bayesian Inference for MagLIF experiments

    Science.gov (United States)

    Knapp, Patrick; Glinksy, Michael; Evans, Matthew; Gom, Matth; Han, Stephanie; Harding, Eric; Slutz, Steve; Hahn, Kelly; Harvey-Thompson, Adam; Geissel, Matthias; Ampleford, David; Jennings, Christopher; Schmit, Paul; Smith, Ian; Schwarz, Jens; Peterson, Kyle; Jones, Brent; Rochau, Gregory; Sinars, Daniel

    2017-10-01

    The MagLIF concept has recently demonstrated Gbar pressures and confinement of charged fusion products at stagnation. We present a new analysis methodology that allows for integration of multiple diagnostics including nuclear, x-ray imaging, and x-ray power to determine the temperature, pressure, liner areal density, and mix fraction. A simplified hot-spot model is used with a Bayesian inference network to determine the most probable model parameters that describe the observations while simultaneously revealing the principal uncertainties in the analysis. Sandia National Laboratories is a multimission laboratory managed and operated by National Technology and Engineering Solutions of Sandia, LLC., a wholly owned subsidiary of Honeywell International, Inc., for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-NA-0003525.

  3. Applications of RIGAKU Dmax Rapid II micro-X-ray diffractometer in the analysis of archaeological metal objects

    Science.gov (United States)

    Mozgai, Viktória; Szabó, Máté; Bajnóczi, Bernadett; Weiszburg, Tamás G.; Fórizs, István; Mráv, Zsolt; Tóth, Mária

    2017-04-01

    During material analysis of archaeological metal objects, and especially of their inlays or corrosion products, not only the microstructure and chemical composition but also the mineralogical composition must be determined. X-ray powder diffraction (XRD) is a widely used method for determining the mineralogical composition. However, when sampling is not allowed, or only to a limited extent, due to e.g. the high value of the object, conventional XRD analysis can hardly be used. Laboratory micro-XRD instruments provide a good alternative, such as the RIGAKU Dmax Rapid II micro-X-ray diffractometer, a unique combination of a MicroMax-003 third-generation microfocus sealed-tube X-ray generator and a curved image-plate detector. With this instrument it is possible to measure an area as small as 10 µm in diameter on the object. Here we present case studies of the application of the micro-XRD technique to archaeological metal objects. In the first case the niello inlay of a Late Roman silver augur staff was analysed. Because of the high value of the object, which is the only piece of its kind known from the Roman Empire, only non-destructive analyses were allowed. To reconstruct the preparation of the niello, SEM-EDX analysis was performed on the niello inlays to characterise their chemical composition and microstructure. Two types of niello are present: a homogeneous silver sulphide niello (acanthite) and an inhomogeneous silver-copper sulphide niello (exsolution of acanthite and jalpaite, or jalpaite and stromeyerite). The micro-X-ray diffractometer was used to verify the mineralogical composition of the niello inferred from the SEM results. In the second case the corrosion products of a Late Roman copper cauldron of uncertain provenance were examined, since they may hold clues about the burial conditions (pH, Eh, etc.) of the object. A layer-by-layer analysis was performed on cross sections of small metal samples using an electron microprobe and the micro-X-ray diffractometer. The results

  4. Cross-Sectional Analysis of Levels and Patterns of Objectively Measured Sedentary Time in Adolescent Females

    LENUS (Irish Health Repository)

    Harrington, Deirdre M.

    2011-10-28

    Abstract Background: Adolescent females have been highlighted as a particularly sedentary population, and the possible negative effects of a sedentary lifestyle are being uncovered. However, much of the past sedentary research is based on self-report or uses indirect methods to quantify sedentary time. The total time spent sedentary and the possibly intricate sedentary patterns of adolescent females have not been described using an objective and direct measure of body inclination. The objectives of this article are to examine the sedentary levels and patterns of a group of adolescent females using the ActivPAL™ and to highlight possible differences in sedentary levels and patterns across the week and within the school day. A full methodological description of how the data were analyzed is also presented. Methods: One hundred and eleven adolescent females, aged 15-18 yrs, were recruited from urban and rural areas in the Republic of Ireland. Participants wore an ActivPAL physical activity monitor for a 7.5-day period. The ActivPAL directly reports total time spent sitting/lying every 15 seconds, and the accumulation (frequency and duration) of sedentary activity was examined using a customized MATLAB® computer programme. Results: While no significant difference was found in the total time spent sitting/lying over the full 24-hour day between weekdays and weekend days (18.8 vs. 18.9 hours; p = .911), significantly more sedentary bouts of 1 to 5 minutes and 21 to 40 minutes in duration were accumulated on weekdays compared to weekend days (p < .001). The mean length of each sedentary bout was also longer (9.8 vs. 8.8 minutes; p < .001). When school hours (9 am-3 pm) and after-school hours (4 pm-10 pm) were compared, there was no difference in total time spent sedentary (3.9 hours; p = .796), but the pattern of accumulation of the sedentary time differed. There were a greater number of bouts of > 20 minutes duration during school hours than after school hours (4.7 vs. 3
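The bout-accumulation analysis described above (done by the authors in a customized MATLAB programme) can be sketched as a run-length pass over the 15-second epochs; the epoch sequence below is invented for illustration and the function is not the authors' code.

```python
from itertools import groupby

def sedentary_bouts(epochs, epoch_s=15):
    """Durations in minutes of consecutive sedentary runs in a sequence
    of 15-second epochs (1 = sitting/lying, 0 = upright). A sketch of
    the bout-accumulation idea, not the authors' MATLAB implementation."""
    return [sum(1 for _ in run) * epoch_s / 60
            for value, run in groupby(epochs) if value == 1]

# Invented day fragment: a 2-minute bout, a standing break, a 1-minute bout.
bouts = sedentary_bouts([1] * 8 + [0] * 2 + [1] * 4)
mean_bout_len = sum(bouts) / len(bouts)   # 1.5 minutes
```

Binning the resulting durations (1-5 min, 21-40 min, > 20 min, etc.) yields exactly the frequency-by-duration comparisons reported in the Results.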

  5. OBJECTIVE BAYESIAN ANALYSIS OF "ON/OFF" MEASUREMENTS

    Energy Technology Data Exchange (ETDEWEB)

    Casadei, Diego, E-mail: diego.casadei@fhnw.ch [Visiting Scientist, Department of Physics and Astronomy, UCL, Gower Street, London WC1E 6BT (United Kingdom)

    2015-01-01

    In high-energy astrophysics, it is common practice to account for the background overlaid with counts from the source of interest with the help of auxiliary measurements carried out by pointing off-source. In this "on/off" measurement, one knows the number of photons detected while pointing toward the source, the number of photons collected while pointing away from the source, and how to estimate the background counts in the source region from the flux observed in the auxiliary measurements. For very faint sources, the number of photons detected is so low that the approximations that hold asymptotically are not valid. On the other hand, an analytical solution exists for the Bayesian statistical inference, which is valid at both low and high counts. Here we illustrate the objective Bayesian solution based on the reference posterior, compare the result with the approach very recently proposed by Knoetig, and discuss its most delicate points. In addition, we propose to compute the significance of the excess with respect to the background-only expectation with a method that is able to account for any uncertainty on the background and is valid for any photon count. This method is compared to the widely used significance formula of Li and Ma, which is based on asymptotic properties.
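The Li & Ma significance formula that the Bayesian method is compared against is their Equation 17; a direct transcription with illustrative counts:

```python
import math

def li_ma_significance(n_on, n_off, alpha):
    """Significance of an on-source excess per Li & Ma (1983), Eq. 17;
    alpha is the ratio of on-source to off-source exposure."""
    total = n_on + n_off
    term_on = n_on * math.log((1 + alpha) / alpha * (n_on / total))
    term_off = n_off * math.log((1 + alpha) * (n_off / total))
    return math.sqrt(2.0 * (term_on + term_off))

s = li_ma_significance(150, 100, alpha=1.0)   # modest excess, ~3 sigma
```

When `n_on` matches the background expectation (`n_on = alpha * n_off`), both logarithms vanish and the significance is zero; the asymptotic character of the formula is what makes it unreliable in the very-low-count regime discussed above.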

  6. Engineering analysis and literature review of the use of CORBA in distributed object-oriented systems

    Energy Technology Data Exchange (ETDEWEB)

    Holloway, F., LLNL

    1997-06-11

    This note was written based upon a review of many papers, articles, and textbooks at a time when we had little experience with an actual CORBA product. Some of the references conflicted with each other, and we now see that some of the comments made by other authors were misunderstandings or simply wrong. The product that we are currently using (like many others) does not implement all of CORBA - in fact, it is obvious that a great deal more must be added to CORBA to meet all of the goals and expectations summarized herein. With all of its capability and promised advantages, CORBA is a complex package of technologies and products which requires quite a concentrated effort to master. We have developed a Test Package which to date has been used to send and receive data from up to 12 servers on up to 5 computers in our office network, with up to 1000 objects per server. The data is expressed in all available IDL types, including long arrays and unbounded strings. Performance measurements have established solid data upon which we can base estimates and models of the performance of the communication layer of the overall system after further design and prototyping of the frameworks and applications has been completed (see CORBA Test Package, ref. 14).

  7. A Conceptual Model for Delineating Land Management Units (LMUs) Using Geographical Object-Based Image Analysis

    Directory of Open Access Journals (Sweden)

    Deniz Gerçek

    2017-06-01

    Land management and planning is crucial for the present and future use of land and the sustainability of land resources. Physical, biological and cultural characteristics of land can be used to define Land Management Units (LMUs) that aid in decision making for managing land and communicating information between different research and application domains. This study aims to describe the classification of ecologically relevant land units that are suitable for land management, planning and conservation purposes. Relying on the idea of a strong correlation between landform and potential landcover, a conceptual model for creating Land Management Units (LMUs) from topographic data and biophysical information is presented. The proposed method employs a multi-level object-based classification of Digital Terrain Models (DTMs) to derive landform units. The sensitivity of landform units to changes in segmentation scale is examined, and the outcome of the landform classification is evaluated. Landform classes are then aggregated with landcover information to produce ecologically relevant landform/landcover assemblages. These conceptual units, which constitute a framework of connected entities, are finally enriched with available socio-economic information (e.g., land use, ownership, protection status) to generate LMUs. LMUs attached to a geographic database enable the retrieval of information at various levels to support decision making for land management at various scales. The LMUs that are created present a basis for conservation and management in a biodiverse area in the Black Sea region of Turkey.

  8. Preoperative screening: value of previous tests.

    Science.gov (United States)

    Macpherson, D S; Snow, R; Lofgren, R P

    1990-12-15

    To determine the frequency of tests done in the year before elective surgery that might substitute for preoperative screening tests, and to determine the frequency of test results that change from a normal value to a value likely to alter perioperative management. Retrospective cohort analysis of computerized laboratory data (complete blood count, sodium, potassium, and creatinine levels, prothrombin time, and partial thromboplastin time). Urban tertiary care Veterans Affairs Hospital. Consecutive sample of 1109 patients who had elective surgery in 1988. At admission, 7549 preoperative tests were done, 47% of which duplicated tests performed in the previous year. Of 3096 previous results that were normal as defined by the hospital reference range and done closest to, but before, the time of admission (median interval, 2 months), 13 repeat values (0.4%; 95% CI, 0.2% to 0.7%) were outside a range considered acceptable for surgery. Most of the abnormalities were predictable from the patient's history, and most were not noted in the medical record. Of 461 previous tests that were abnormal, 78 repeat values at admission (17%; CI, 13% to 20%) were outside a range considered acceptable for surgery (P < 0.001 for the comparison of the frequency of clinically important abnormalities between patients with normal and abnormal previous results). Physicians evaluating patients preoperatively could safely substitute the previous test results analyzed in this study for preoperative screening tests if the previous tests are normal and no obvious indication for retesting is present.
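The interval quoted for the 13/3096 abnormal-repeat rate is consistent with a standard binomial score interval; a Wilson-interval sketch follows (the authors' exact interval method is not stated, so this is an assumption):

```python
import math

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
    return centre - half, centre + half

lo, hi = wilson_ci(13, 3096)   # roughly 0.2% to 0.7%, matching the quoted CI
```

The Wilson form is preferable to the naive normal approximation here because the proportion (0.4%) is so close to zero.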

  9. A Knowledge-Based System For Analysis, Intervention Planning and Prevention of Defects in Immovable Cultural Heritage Objects and Monuments

    Science.gov (United States)

    Valach, J.; Cacciotti, R.; Kuneš, P.; Čerňanský, M.; Bláha, J.

    2012-04-01

    The paper presents a project aiming to develop a knowledge-based system for the documentation and analysis of defects in cultural heritage objects and monuments. The MONDIS information system concentrates knowledge on damage to immovable structures from various causes, and on the preventive/remedial actions performed to protect or repair them, where possible. The system under construction is intended to provide an understanding of the causal relationships between a defect, the materials, the external load, and the environment of the built object. The foundation of the knowledge-based system will be the systemized and formalized knowledge on defects and their mitigation acquired by analyzing a representative set of cases documented in the past. On the basis of comparability of design, technologies used, materials, and the nature of the external forces and surroundings, the developed software system can indicate the most likely risks of new defects occurring or of existing ones extending. The system will also allow a comparison of an actual failure with similar documented cases and will propose a suitable technical intervention plan. It will provide conservationists, administrators and owners of historical objects with a toolkit for documenting defects in their objects. Advanced artificial intelligence methods will also offer users the accumulated knowledge and help orient them among relevant techniques of preventive intervention and reconstruction based on similarity with their case.

  10. Sensitivity Analysis of Multi-objective Optimization for Solid Waste Management: A Case Study of Dar es Salaam, Tanzania

    Directory of Open Access Journals (Sweden)

    Halidi Lyeme

    2017-11-01

    Full Text Available In this study, a sensitivity analysis of a multi-objective optimization model for solid waste management (SWM) in Dar es Salaam city, Tanzania, is considered. Our objectives were to identify the most sensitive parameters and the effect of other input data on the model output. Five scenarios were considered by varying their associated parameter values. The results showed a decrease in the total cost of the SWM system in all scenarios compared with the baseline solution, in which a single landfill was considered. Furthermore, the analysis shows that the variable cost parameter for the processing facilities is highly sensitive: increasing the variable cost causes a rapid increase in the total cost of the SWM system, and vice versa. Relevant suggestions for decision makers are also discussed.
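
The scenario-style parameter variation the abstract describes can be sketched as a one-at-a-time sensitivity analysis. The cost model and parameter values below are illustrative stand-ins, not the paper's actual SWM model:

```python
# One-at-a-time (OAT) sensitivity sketch: perturb each parameter of a
# hypothetical SWM cost model and record the relative change in total cost.
# The cost model and parameter values are illustrative, not from the paper.

def total_cost(params):
    """Toy total-cost model: transport + fixed + variable processing cost."""
    return (params["transport_cost"] * params["tonnage"]
            + params["fixed_cost"]
            + params["variable_cost"] * params["tonnage"])

def oat_sensitivity(baseline, perturbation=0.10):
    """Relative change in total cost when each parameter is raised by
    `perturbation` (e.g. 10%) while the others stay at baseline."""
    base = total_cost(baseline)
    effects = {}
    for name in baseline:
        scenario = dict(baseline)
        scenario[name] = baseline[name] * (1 + perturbation)
        effects[name] = (total_cost(scenario) - base) / base
    return effects

baseline = {"transport_cost": 12.0, "fixed_cost": 5000.0,
            "variable_cost": 30.0, "tonnage": 400.0}
effects = oat_sensitivity(baseline)
# Parameters ranked by influence on total cost:
for name, eff in sorted(effects.items(), key=lambda kv: -abs(kv[1])):
    print(f"{name}: {eff:+.1%}")
```

Ranking the relative effects is what identifies the "most sensitive" parameters in a scenario analysis of this kind.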

  11. The method for objective evaluation of the intensity of radial bone lesions in rheumatoid arthritis using digital image analysis

    International Nuclear Information System (INIS)

    Zielinski, K.W.; Krekora, K.

    2004-01-01

    The semiquantitative methods used in everyday diagnostic practice for scoring the intensity of bone lesions in rheumatoid arthritis are susceptible to subjective error. The paper describes an original image-analysis algorithm for quantitative and objective evaluation of the intensity of radiological lesions in rheumatoid arthritis. 75 plain radiograms of the hand, from patients diagnosed with rheumatoid arthritis in various stages of bone pathology, were evaluated. The analysis focused on the signs of pathological remodelling of the affected bone, especially in the distal epiphysis of the radial bone. The plain radiograms of the hand were digitally analysed using a modified version of a method formerly used for quantitative assessment of bone trabeculation. The method allowed us to objectively verify various radiogram scoring systems widely used in rheumatological diagnosis. (author)

  12. Application of objective clinical human reliability analysis (OCHRA) in assessment of technical performance in laparoscopic rectal cancer surgery.

    Science.gov (United States)

    Foster, J D; Miskovic, D; Allison, A S; Conti, J A; Ockrim, J; Cooper, E J; Hanna, G B; Francis, N K

    2016-06-01

    Laparoscopic rectal resection is technically challenging, with outcomes dependent upon technical performance. No robust objective assessment tool exists for laparoscopic rectal resection surgery. This study aimed to investigate the application of the objective clinical human reliability analysis (OCHRA) technique for assessing technical performance of laparoscopic rectal surgery and to explore the validity and reliability of this technique. Laparoscopic rectal cancer resection operations were described in the format of a hierarchical task analysis. Potential technical errors were defined. The OCHRA technique was used to identify technical errors enacted in videos of twenty consecutive laparoscopic rectal cancer resection operations from a single site. The procedural task, spatial location, and circumstances of all identified errors were logged. Clinical validity was assessed through correlation with clinical outcomes; reliability was assessed by test-retest. A total of 335 execution errors were identified, with a median of 15 per operation. More errors were observed during pelvic tasks than during abdominal tasks. OCHRA offers an objective means of assessing the technical performance of laparoscopic rectal surgery.

  13. Multi-band morpho-Spectral Component Analysis Deblending Tool (MuSCADeT): Deblending colourful objects

    Science.gov (United States)

    Joseph, R.; Courbin, F.; Starck, J.-L.

    2016-05-01

    We introduce a new algorithm for colour separation and deblending of multi-band astronomical images called MuSCADeT which is based on Morpho-spectral Component Analysis of multi-band images. The MuSCADeT algorithm takes advantage of the sparsity of astronomical objects in morphological dictionaries such as wavelets and their differences in spectral energy distribution (SED) across multi-band observations. This allows us to devise a model independent and automated approach to separate objects with different colours. We show with simulations that we are able to separate highly blended objects and that our algorithm is robust against SED variations of objects across the field of view. To confront our algorithm with real data, we use HST images of the strong lensing galaxy cluster MACS J1149+2223 and we show that MuSCADeT performs better than traditional profile-fitting techniques in deblending the foreground lensing galaxies from background lensed galaxies. Although the main driver for our work is the deblending of strong gravitational lenses, our method is fit to be used for any purpose related to deblending of objects in astronomical images. An example of such an application is the separation of the red and blue stellar populations of a spiral galaxy in the galaxy cluster Abell 2744. We provide a python package along with all simulations and routines used in this paper to contribute to reproducible research efforts. Codes can be found at http://lastro.epfl.ch/page-126973.html

  14. A NEW FRAMEWORK FOR OBJECT-BASED IMAGE ANALYSIS BASED ON SEGMENTATION SCALE SPACE AND RANDOM FOREST CLASSIFIER

    Directory of Open Access Journals (Sweden)

    A. Hadavand

    2015-12-01

    Full Text Available In this paper a new object-based framework is developed for automated scale selection in image segmentation. The quality of image objects has an important impact on further analyses. Because segmentation results depend strongly on the scale parameter, choosing the best value of this parameter for each class is a main challenge in object-based image analysis. We propose a new framework which employs a pixel-based land cover map to estimate the initial scale dedicated to each class. These scales are used to build a segmentation scale space (SSS), a hierarchy of image objects. Optimization of the SSS with respect to NDVI and DSM values in each super-object is used to obtain the best scale in local regions of the image scene. The optimized SSS segmentations are finally classified to produce the final land cover map. A very high resolution aerial image and a digital surface model provided by the ISPRS 2D semantic labelling dataset are used in our experiments. The result of our proposed method is comparable to that of the ESP tool, a well-known method for estimating segmentation scale, and marginally improved the overall classification accuracy from 79% to 80%.
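
The idea of picking a segmentation scale by the attribute homogeneity of its objects (here NDVI; the paper also uses DSM values) can be illustrated with a toy selector. The candidate segmentations and NDVI values below are hypothetical, not output of the paper's framework:

```python
# Illustrative scale selection: among candidate segmentation scales, pick the
# one whose objects minimize the mean within-object standard deviation of an
# attribute (NDVI). Segmentations are hypothetical pixel-index groupings.
import statistics

def mean_within_object_sd(segmentation, ndvi):
    """segmentation: list of objects, each a list of pixel indices."""
    sds = [statistics.pstdev([ndvi[i] for i in obj]) for obj in segmentation]
    return sum(sds) / len(sds)

def best_scale(candidates, ndvi):
    """candidates: dict scale -> segmentation; return the scale whose
    objects are most internally homogeneous."""
    return min(candidates,
               key=lambda s: mean_within_object_sd(candidates[s], ndvi))

ndvi = [0.8, 0.82, 0.81, 0.2, 0.22, 0.21]  # vegetation vs bare-soil pixels
candidates = {
    10: [[0, 1, 2], [3, 4, 5]],   # fine scale separates the two covers
    40: [[0, 1, 2, 3, 4, 5]],     # coarse scale mixes them into one object
}
print(best_scale(candidates, ndvi))
```

The fine scale wins because its objects do not mix the two spectrally distinct covers, which is the intuition behind optimizing scale per class rather than globally.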

  15. Exergoeconomic analysis and multi-objective optimization of an ejector refrigeration cycle powered by an internal combustion (HCCI) engine

    International Nuclear Information System (INIS)

    Sadeghi, Mohsen; Mahmoudi, S.M.S.; Khoshbakhti Saray, R.

    2015-01-01

    Highlights: • An ejector refrigeration system powered by an HCCI engine is proposed. • A new two-dimensional model is developed for the ejector. • Multi-objective optimization is performed for the proposed system. • The Pareto frontier is plotted for the multi-objective optimization. - Abstract: Ejector refrigeration systems powered by low-grade heat sources have been an attractive research subject for many researchers. In the present work the waste heat from the exhaust gases of an HCCI (homogeneous charge compression ignition) engine is utilized to drive the ejector refrigeration system. Considering frictional effects on the ejector wall, a new two-dimensional model is developed for the ejector. Energy, exergy and exergoeconomic analyses were performed for the proposed system using MATLAB software. In addition, taking the exergy efficiency and the product unit cost of the system as objective functions, a multi-objective optimization is performed to find the optimum design variables, namely the generator, condenser and evaporator temperatures. The product unit cost is minimized while the exergy efficiency is maximized using a genetic algorithm. The optimization results are obtained as a set of optimal points, and the Pareto frontier is plotted. The results show that the ejector refrigeration cycle operates at the optimum state, based on exergy efficiency and product unit cost, when the generator, condenser and evaporator work at 94.54 °C, 33.44 °C and 0.03 °C, respectively.
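
A Pareto frontier like the one the authors plot consists of the non-dominated designs. A minimal filter for the two objectives named in the abstract (maximize exergy efficiency, minimize product unit cost), with illustrative candidate designs rather than the paper's data:

```python
# Minimal Pareto-front filter for a two-objective trade-off:
# maximize exergy efficiency, minimize product unit cost.
# Candidate designs below are illustrative numbers, not from the paper.

def pareto_front(designs):
    """Return designs not dominated by any other design. A design dominates
    another if it has higher-or-equal efficiency AND lower-or-equal cost,
    with at least one strict inequality."""
    front = []
    for i, (eff_i, cost_i) in enumerate(designs):
        dominated = any(
            (eff_j >= eff_i and cost_j <= cost_i) and
            (eff_j > eff_i or cost_j < cost_i)
            for j, (eff_j, cost_j) in enumerate(designs) if j != i
        )
        if not dominated:
            front.append((eff_i, cost_i))
    return front

# (exergy efficiency %, product unit cost $/GJ) for hypothetical designs
designs = [(20.0, 9.0), (22.0, 10.5), (18.0, 8.0), (21.0, 12.0), (19.0, 9.5)]
print(pareto_front(designs))
```

A genetic algorithm such as the one used in the study evolves a population toward this non-dominated set instead of filtering a fixed list, but the dominance test is the same.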

  16. Analysis of absorbed dose in cervical spine scanning by computerized tomography using simulator objects

    International Nuclear Information System (INIS)

    Lyra, Maria Henriqueta Freire

    2015-01-01

    Computed tomography (CT) has become an important diagnostic tool following the continued development of multidetector CT (MDCT), which allows faster acquisition of images with better quality than the previous technology. However, there is increased radiation exposure, especially in examinations that require more than one acquisition, such as dynamic exams and enhancement studies intended to discriminate low-contrast soft tissue injury from normal tissue. Cervical spine MDCT examinations are used for the diagnosis of soft tissue and vascular changes, fractures, dysplasia and other diseases with instability, which guide patient treatment and rehabilitation. This study aims to assess the absorbed dose range in the thyroid and other organs during an MDCT scan of the cervical spine, with and without a bismuth thyroid shield. In this experiment a cervical spine MDCT scan was performed on anthropomorphic phantoms, from the occipital bone to the first thoracic vertebra, using a 64-channel and a 16-channel CT scanner. Thermoluminescent dosimeters were used to obtain the absorbed dose in the thyroid, eye lenses, foramen magnum and breasts of the phantom. The results show that the thyroid received the highest dose, 60.0 mGy, in the female phantom, consistent with the incidence of the primary X-ray beam. The absorbed doses showed significant differences among the evaluated organs (p < 0.005), except for the foramen magnum and breasts. With the bismuth thyroid shield applied to the female phantom, the doses in the thyroid and in the eye lenses were reduced by 27% and 52%, respectively. On the male phantom, by contrast, a reduction of 23.3% in the thyroid and an increase of 49.0% in the lenses were measured. (author)

  17. The microstructure of fuel pellets as object of quality characterization on base of FMEA analysis

    International Nuclear Information System (INIS)

    Goncharov, U.V.; Matveev, A.A.; Strucov, A.V.; Loktev, I.I.

    2012-01-01

    It becomes harder to find new effective reserves in nuclear fuel production as experience with its production and operation accumulates; the FMEA method, based on systematic analysis, can help. The state corporation Rosatom, consistently pursuing a policy of economical manufacture, makes every effort to identify the deep dependences between manufacturing conditions, the characteristics of fuel materials and the features of their operational behaviour. This report continues an earlier discussion of an important feature of produced nuclear fuel pellets: the grain size distribution. This distribution governs gas release in the reactor and currently lacks an appropriate method of characterization. Descriptions of the optimal microstructure of fuel pellets with large grain size are available in the literature.

  18. Econometric Analysis on Developing Decision to Promote an Investment Object of Small Business

    Directory of Open Access Journals (Sweden)

    Cristina Gabriela ZAMFIR

    2014-12-01

    Full Text Available Econometric applications should be used for decision making on current economic issues. One of the most important is access to sources of finance, a field vital to the country's economic activity. Accessing a funding source involves feasibility studies that inform the decision to open financing. I therefore decided to apply econometrics in feasibility studies, avoiding advanced software applications and limiting myself to the universally accepted methodology of the World Bank and the calculation functions of an Excel spreadsheet: the success of a feasibility study lies in the correctness and depth of the analysis and processing of the raw data, not in obtaining and operating reputable software.

  19. The secret life of objects. A psycho-social analysis of consumption's imaginary

    Directory of Open Access Journals (Sweden)

    Andrés Almagro González

    2008-02-01

    Full Text Available Advertising in general, and television advertising in particular, is an important communication channel through which values, lifestyles, and even the socially-shared "imaginary" are transmitted. A psycho-social analysis of the imaginary contents in television advertisements is specially called-for, given that contemporary culture, strongly marked by the use of the images, constitutes what some theoreticians call "the audio-visual age". In this article we analyse advertisements from a psycho-social perspective, to identify the way in which advertisements articulate, through images, the social imaginary.

  1. An Object-Based Image Analysis Approach for Detecting Penguin Guano in very High Spatial Resolution Satellite Images

    OpenAIRE

    Chandi Witharana; Heather J. Lynch

    2016-01-01

    The logistical challenges of Antarctic field work and the increasing availability of very high resolution commercial imagery have driven an interest in more efficient search and classification of remotely sensed imagery. This exploratory study employed geographic object-based analysis (GEOBIA) methods to classify guano stains, indicative of chinstrap and Adélie penguin breeding areas, from very high spatial resolution (VHSR) satellite imagery and closely examined the transferability of knowledge...

  2. Nonvitamin-K-antagonist oral anticoagulants versus warfarin in patients with atrial fibrillation and previous stroke or transient ischemic attack: An updated systematic review and meta-analysis of randomized controlled trials.

    Science.gov (United States)

    Ntaios, George; Papavasileiou, Vasileios; Diener, Hans-Christoph; Makaritsis, Konstantinos; Michel, Patrik

    2017-08-01

    Background In a previous systematic review and meta-analysis, we assessed the efficacy and safety of nonvitamin-K antagonist oral anticoagulants versus warfarin in patients with atrial fibrillation and stroke or transient ischemic attack. Since then, new information became available. Aim The aim of the present work was to update the results of the previous systematic review and meta-analysis. Methods We searched PubMed until 24 August 2016 for randomized controlled trials using the following search items: "atrial fibrillation" and "anticoagulation" and "warfarin" and "previous stroke or transient ischemic attack." Eligible studies had to be phase III trials in patients with atrial fibrillation comparing warfarin with nonvitamin-K antagonist oral anticoagulants currently on the market or with the intention to be brought to the market in North America or Europe. The outcomes assessed in the efficacy analysis included stroke or systemic embolism, stroke, ischemic or unknown stroke, disabling or fatal stroke, hemorrhagic stroke, cardiovascular death, death from any cause, and myocardial infarction. The outcomes assessed in the safety analysis included major bleeding, intracranial bleeding, and major gastrointestinal bleeding. We performed fixed effects analyses on intention-to-treat basis. Results Among 183 potentially eligible articles, four were included in the meta-analysis. 
In 20,500 patients, compared to warfarin, nonvitamin-K antagonist oral anticoagulants were associated with a significant reduction of stroke/systemic embolism (relative risk reduction: 13.7%, absolute risk reduction: 0.78%, number needed to treat to prevent one event: 127), hemorrhagic stroke (relative risk reduction: 50.0%, absolute risk reduction: 0.63%, number needed to treat: 157), any stroke (relative risk reduction: 13.1%, absolute risk reduction: 0.7%, number needed to treat: 142), and intracranial hemorrhage (relative risk reduction: 46.1%, absolute risk reduction: 0.88%, number needed
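
The absolute risk reduction (ARR) and number-needed-to-treat (NNT) figures in this abstract are linked by NNT = 1/ARR. The event risks below are illustrative, chosen only to give ARR = 0.78%; the reported NNT of 127 reflects the study's unrounded risks:

```python
# Relation between absolute risk reduction and number needed to treat:
# NNT = 1 / ARR. Event risks below are illustrative, not the trial data.

def arr_nnt(risk_control, risk_treatment):
    arr = risk_control - risk_treatment
    return arr, 1.0 / arr

# e.g. a 5.70% event risk on warfarin vs 4.92% on NOACs -> ARR of 0.78%
arr, nnt = arr_nnt(0.0570, 0.0492)
print(f"ARR = {arr:.2%}, NNT = {nnt:.0f}")
```

Small rounding differences in the reported ARR explain why 1/0.0078 ≈ 128 while the abstract quotes 127.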

  3. Objective approach for analysis of noise source characteristics and acoustic conditions in noisy computerized embroidery workrooms.

    Science.gov (United States)

    Aliabadi, Mohsen; Golmohammadi, Rostam; Mansoorizadeh, Muharram

    2014-03-01

    It is highly important to analyze the acoustic properties of workrooms in order to identify the best noise control measures from the standpoint of noise exposure limits. Because sound pressure depends on the environment, it is not a suitable parameter for determining the share of workroom acoustic characteristics in producing noise pollution. This paper aims to empirically analyze the noise source characteristics and acoustic properties of noisy embroidery workrooms based on specialized parameters. Reverberation time, as the specialized room-acoustic parameter, was measured in 30 workrooms based on ISO 3382-2. The sound power of the embroidery machines was also determined based on ISO 9614-3. Multiple linear regression was employed to predict reverberation time from acoustic features of the workrooms using MATLAB software. The results showed that the measured reverberation times in most of the workrooms were approximately within the ranges recommended by ISO 11690-1. Agreement between reverberation times calculated by the Sabine formula and measured values was relatively poor (R² = 0.39). This may be due to inaccurate estimation of the acoustic influence of furniture and to the formula's preconditions; the calculated value therefore cannot be considered representative of the actual room acoustics. However, the prediction performance of the regression method (root mean square error (RMSE) = 0.23 s, R² = 0.69) is relatively acceptable. Because the sound power of the embroidery machines was relatively high, these sources should receive the highest priority when applying noise controls. Finally, an objective approach to determining the share of workroom acoustic characteristics in producing noise could facilitate the identification of cost-effective noise controls.
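
The Sabine formula the authors compare against is T60 = 0.161·V/A, where V is the room volume in m³ and A the total absorption (sum of surface area times absorption coefficient). A sketch with illustrative room dimensions and absorption coefficients, not the study's workrooms:

```python
# Sabine's reverberation-time formula: T60 = 0.161 * V / A,
# with V the room volume (m^3) and A the total absorption in sabins.
# Room dimensions and absorption coefficients below are illustrative.

def sabine_rt60(volume_m3, surfaces):
    """surfaces: list of (area_m2, absorption_coefficient) pairs."""
    total_absorption = sum(area * alpha for area, alpha in surfaces)
    return 0.161 * volume_m3 / total_absorption

# A 10 m x 8 m x 3 m workroom with hard surfaces and some machinery
volume = 10 * 8 * 3  # 240 m^3
surfaces = [
    (10 * 8, 0.02),                 # floor, hard concrete
    (10 * 8, 0.10),                 # ceiling
    (2 * (10 * 3 + 8 * 3), 0.03),   # walls
    (40.0, 0.30),                   # machines and furniture (rough guess)
]
print(f"T60 = {sabine_rt60(volume, surfaces):.2f} s")
```

The poor fit the authors report (R² = 0.39) illustrates how sensitive this estimate is to the guessed absorption of furniture and machinery.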

  4. DGTD Analysis of Electromagnetic Scattering from Penetrable Conductive Objects with IBC

    KAUST Repository

    Li, Ping

    2015-10-16

    To avoid straightforward volumetric discretization, a discontinuous Galerkin time-domain (DGTD) method integrated with the impedance boundary condition (IBC) is presented in this paper to analyze the scattering from objects with finite conductivity. Two situations are considered: i) the skin depth is smaller than the thickness of the conductive volume; ii) the skin depth is larger than the thickness of a thin conductive sheet. For the first situation, a surface impedance boundary condition (SIBC) is employed, wherein the surface impedance usually exhibits a complex relation with the frequency. To incorporate the SIBC into DGTD, the surface impedance is firstly approximated by rational functions in the Laplace domain using the fast relaxation vector-fitting (FRVF) technique. Via inverse Laplace transform, the time-domain DGTD matrix equations can be obtained conveniently in integral form with respect to time t. For the second situation, a transmission IBC (TIBC) is used to include the transparent effects of the fields. In the TIBC, the tangential magnetic field jump is related with the tangential electric field via the surface conductivity. In this work, a specifically designed DGTD algorithm with TIBC is developed to model the graphene up to the terahertz (THz) band. In order to incorporate the TIBC into DGTD without involving the time-domain convolution, an auxiliary surface polarization current governed by a first order differential equation is introduced over the graphene. For open-region scattering problems, the DGTD algorithm is further hybridized with the time-domain boundary integral (TDBI) method to rigorously truncate the computational domain. To demonstrate the accuracy and applicability of the proposed algorithm, several representative examples are provided.

  5. Objectively measured residential environment and self-reported health: a multilevel analysis of UK census data.

    Directory of Open Access Journals (Sweden)

    Frank Dunstan

    Full Text Available Little is known about the association between health and the quality of the residential environment. What is known is often based on subjective assessments of the environment rather than on measurements by independent observers. The aim of this study, therefore, was to determine the association between self-reported general health and an objectively assessed measure of the residential environment. We studied over 30,000 residents aged 18 or over living in 777 neighbourhoods in south Wales. Built environment quality was measured by independent observers using a validated tool, the Residential Environment Assessment Tool (REAT), at unit postcode level. UK Census data on each resident, which included responses to a question which assessed self-reported general health, was linked to the REAT score. The Census data also contained detailed information on socio-economic and demographic characteristics of all respondents and was also linked to the Welsh Index of Multiple Deprivation. After adjusting for both the individual characteristics and area deprivation, respondents in the areas of poorest neighbourhood quality were more likely to report poor health compared to those living in areas of highest quality (OR 1.36, 95% confidence interval 1.22-1.49). The particular neighbourhood characteristics associated with poor health were physical incivilities and measures of how well the residents maintained their properties. Measures of green space were not associated with self-reported health. This is the first full population study to examine such associations and the results demonstrate the importance for health of the quality of the neighbourhood area in which people live and particularly the way in which residents behave towards their own and their neighbours' property. A better understanding of causal pathways that allows the development of interventions to improve neighbourhood quality would offer significant potential health gains.

  6. Object analysis of bone marrow MR imaging using double echo STIR sequence in hematological diseases

    Energy Technology Data Exchange (ETDEWEB)

    Mizuno, Hitomi [Saitama Medical School, Moroyama (Japan)

    1995-07-01

    The bone marrow of 84 patients with hematological disorders was investigated using a short inversion time inversion recovery (STIR) sequence on a 1.5 Tesla superconducting MRI system. Double echo times of 20 and 100 msec were applied to investigate the signal characteristics of the lesions and to carry out quantitative analysis with receiver operating characteristic (ROC) curves. The hematological diseases included 19 cases of myelodysplastic syndrome (MDS), 18 of multiple myeloma (MM), 18 of chronic myelocytic leukemia (CML), 9 of aplastic anemia (AA), 8 of acute myelocytic leukemia (AML), 3 of chronic lymphocytic leukemia (CLL), 3 of myelofibrosis, and 3 others. Using STIR with double echo times, bone marrow showed high signal intensity (SI) on short TE and low SI on long TE in MDS and CML; high SI on short and long TE in myelofibrosis and CLL; high SI on short TE and high to moderately high SI on long TE in MM; and low SI on short and long TE in AA. Quantitative analysis of 33 patients showed high sensitivity and specificity in AA (81% and 94%, respectively) and moderate sensitivity and high specificity in MM (61%, 88%). CML and MDS were similar, with low sensitivities (40%, 41%) and high specificities (80%, 78%). Differential diagnosis between CML and MDS was difficult using STIR with the double echo time method. (author).
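
Sensitivity and specificity figures like those quoted (e.g. 81% and 94% for AA) are simple count ratios from a reading test. The counts below are illustrative, not the study's data:

```python
# Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP).
# The counts below are illustrative, not taken from the study.

def sensitivity_specificity(tp, fn, tn, fp):
    """Return (sensitivity, specificity) from confusion-matrix counts."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical reading-test counts:
sens, spec = sensitivity_specificity(tp=13, fn=3, tn=15, fp=2)
print(f"sensitivity={sens:.0%} specificity={spec:.0%}")
```

An ROC curve is traced by recomputing these two ratios as the decision threshold on signal intensity is varied.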

  7. THЕ ANALYSIS OF NORMATIVE AND LEGAL SUPPORT OF CONSTRUCTION OF OBJECTS OF AFFORDABLE HOUSING

    Directory of Open Access Journals (Sweden)

    MYKHAILOVA I. O.

    2017-05-01

    Full Text Available Problem statement. Given the acute socio-economic problem of forming a stock of affordable housing, it is necessary to analyze the methods of forming, selecting and substantiating design and organizational-technological decisions for the construction of affordable housing. Purpose of the article. To analyze the methods of forming, selecting and justifying design and organizational-technological decisions for the construction of affordable housing, taking into account scientific works based on aspects of technology, and to compare the factors influencing duration and cost indicators under the operating state mechanisms (programs) implemented to overcome the problems of affordable housing construction. Conclusion. The analysis of government mechanisms and statistics has shown that the programs are working, but unfortunately at a rather low level of effectiveness. State mechanisms should be implemented more appropriately, funded, and made to function effectively, which is projected to ease the acute situation in improving the living conditions of the Ukrainian citizens who need it.

  8. Objective function analysis for electric soundings (VES), transient electromagnetic soundings (TEM) and joint inversion VES/TEM

    Science.gov (United States)

    Bortolozo, Cassiano Antonio; Bokhonok, Oleg; Porsani, Jorge Luís; Monteiro dos Santos, Fernando Acácio; Diogo, Liliana Alcazar; Slob, Evert

    2017-11-01

    Ambiguities are always present in geophysical inversion results, and how they appear is in most cases open to interpretation. It is interesting to investigate ambiguities with regard to the parameters of the models under study. The Residual Function Dispersion Map (RFDM) can be used to differentiate between global ambiguities and local minima in the objective function. We apply RFDM to Vertical Electrical Sounding (VES) and TEM sounding inversion results. Through topographic analysis of the objective function we evaluate the advantages and limitations of electrical sounding data compared with TEM sounding data, and the benefits of joint inversion compared with the individual methods. The RFDM analysis proved to be a very interesting tool for understanding the joint VES/TEM inversion method. The applicability of the RFDM analysis to real data is also explored, to demonstrate both how the objective function of real data behaves and how the approach performs in real cases. With the analysis of the results, it is possible to understand how joint inversion can reduce the ambiguity of the individual methods.
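
The topographic analysis of the objective function can be pictured as evaluating the data misfit over a grid of candidate parameters and inspecting its valleys and minima. The toy exponential forward model below merely stands in for a real VES/TEM response:

```python
# Toy residual-map sketch in the spirit of RFDM: evaluate the misfit of a
# two-parameter model over a grid to expose valleys (ambiguity) and minima
# in the objective function. The forward model is NOT a real VES/TEM solver.
import math

def forward(p1, p2, x):
    """Toy forward model standing in for a sounding response."""
    return p1 * math.exp(-x / p2)

def misfit(p1, p2, xs, data):
    """Sum of squared residuals between model response and data."""
    return sum((forward(p1, p2, x) - d) ** 2 for x, d in zip(xs, data))

# Synthetic "observed" data from true parameters (p1=100, p2=5)
xs = [1, 2, 4, 8, 16]
data = [forward(100.0, 5.0, x) for x in xs]

# Residual map over a coarse grid of candidate parameters
grid = [(p1, p2, misfit(p1, p2, xs, data))
        for p1 in range(50, 151, 25)
        for p2 in [2.0, 3.5, 5.0, 6.5, 8.0]]
best = min(grid, key=lambda t: t[2])
print("best grid point:", best[:2])
```

In a joint inversion, the misfits of the two data sets are summed, which typically narrows the low-misfit valley around the true parameters.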

  9. A combined use of multispectral and SAR images for ship detection and characterization through object based image analysis

    Science.gov (United States)

    Aiello, Martina; Gianinetto, Marco

    2017-10-01

    Marine routes carry a huge portion of commercial and human traffic, so surveillance, security and environmental protection are gaining increasing importance. Because it overcomes the limits imposed by terrestrial means of monitoring, ship detection from satellite has recently prompted renewed interest for continuous monitoring of illegal activities. This paper describes an automatic Object Based Image Analysis (OBIA) approach to detect vessels made of different materials in various sea environments. The combined use of multispectral and SAR images allows regular observation unrestricted by lighting and atmospheric conditions, and complementarity in terms of geographic coverage and geometric detail. The method adopts a region-growing algorithm to segment the image into homogeneous objects, which are then classified through a decision tree algorithm based on spectral and geometrical properties. A spatial analysis then retrieves the vessels' position, length and heading parameters, and a speed range is associated. The image processing chain is optimized by selecting image tiles through a statistical index. Vessel candidates are detected over amplitude SAR images using an adaptive-threshold Constant False Alarm Rate (CFAR) algorithm prior to the object-based analysis. Validation is carried out by comparing the retrieved parameters with the information provided by the Automatic Identification System (AIS), when available, or with manual measurement when AIS data are not available. The estimation of length shows R² = 0.85 and the estimation of heading R² = 0.92, computed as the average of the R² values obtained for both optical and radar images.
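
The adaptive-threshold CFAR step applied to the SAR amplitude images can be sketched in one dimension as cell-averaging CFAR; window sizes and the threshold factor below are illustrative, not the paper's settings:

```python
# Minimal 1-D cell-averaging CFAR: a cell is flagged when its amplitude
# exceeds `factor` times the mean of surrounding training cells, with guard
# cells excluded so the target does not inflate its own threshold.
# Window sizes and the threshold factor are illustrative.

def ca_cfar(signal, guard=2, train=4, factor=3.0):
    """Return indices of cells exceeding the locally adaptive threshold."""
    detections = []
    n = len(signal)
    for i in range(n):
        train_cells = [signal[j]
                       for j in range(i - guard - train, i + guard + train + 1)
                       if 0 <= j < n and abs(j - i) > guard]
        if train_cells and signal[i] > factor * (sum(train_cells) / len(train_cells)):
            detections.append(i)
    return detections

# Flat sea clutter with two bright "vessels" at indices 10 and 25
clutter = [1.0] * 40
clutter[10] = 8.0
clutter[25] = 6.0
print(ca_cfar(clutter))
```

Because the threshold adapts to the local clutter mean, the same detector works over sea states of different brightness, which is why CFAR is the standard pre-screening step before object-based analysis.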

  10. Characteristics of ancient Egyptian glazed ceramic objects from Fatimid and Mamluk periods as revealed by ion beam analysis

    International Nuclear Information System (INIS)

    Sadek, Hamada; Abd El Hady M M

    2012-01-01

    Ion beam analysis (PIXE, μPIXE) has been successfully applied to the analysis of archaeological materials, where it has many advantages. In this work ion beam analysis (IBA) was used to analyse ancient Egyptian glazed ceramics from the 10th to the 16th centuries (Fatimid and Mamluk periods). Glazed ceramic samples from the Al-Fustat excavation store were chosen to represent different colours (green, blue, brown, black, etc.); the colour of a glaze depends on many factors, such as the oxides present in the glaze layer, the fluxes, and the conditions under which the objects were manufactured. IBA allows identification of the elemental composition of the glaze layer, i.e. information about the colorants used in the glaze, which is of great importance because compositional data play a key role in answering questions concerning dating, provenance, technology, use, and the relationship of ancient cultures with their environment.

  11. Children's everyday exposure to food marketing: an objective analysis using wearable cameras.

    Science.gov (United States)

    Signal, L N; Stanley, J; Smith, M; Barr, M B; Chambers, T J; Zhou, J; Duane, A; Gurrin, C; Smeaton, A F; McKerchar, C; Pearson, A L; Hoek, J; Jenkin, G L S; Ni Mhurchu, C

    2017-10-08

    Over the past three decades the global prevalence of childhood overweight and obesity has increased by 47%. Marketing of energy-dense nutrient-poor foods and beverages contributes to this worldwide increase. Previous research on food marketing to children largely uses self-report, reporting by parents, or third-party observation of children's environments, with the focus mostly on single settings and/or media. This paper reports on innovative research, Kids'Cam, in which children wore cameras to examine the frequency and nature of everyday exposure to food marketing across multiple media and settings. Kids'Cam was a cross-sectional study of 168 children (mean age 12.6 years, SD = 0.5) in Wellington, New Zealand. Each child wore a wearable camera on four consecutive days, capturing images automatically every seven seconds. Images were manually coded as either recommended (core) or not recommended (non-core) to be marketed to children by setting, marketing medium, and product category. Images in convenience stores and supermarkets were excluded as marketing examples were considered too numerous to count. On average, children were exposed to non-core food marketing 27.3 times a day (95% CI 24.8, 30.1) across all settings. This was more than twice their average exposure to core food marketing (12.3 per day, 95% CI 8.7, 17.4). Most non-core exposures occurred at home (33%), in public spaces (30%) and at school (19%). Food packaging was the predominant marketing medium (74% and 64% for core and non-core foods) followed by signs (21% and 28% for core and non-core). Sugary drinks, fast food, confectionery and snack foods were the most commonly encountered non-core foods marketed. Rates were calculated using Poisson regression. Children in this study were frequently exposed, across multiple settings, to marketing of non-core foods not recommended to be marketed to children. The study provides further evidence of the need for urgent action to reduce children's exposure to such marketing.
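
    The daily exposure figures above are rates with confidence intervals. As a simple stand-in for the Poisson regression the study used, an exact (Garwood) interval for a per-day Poisson rate can be computed directly; the 109 exposures over 4 camera days below describe a hypothetical child, not study data:

```python
from scipy.stats import chi2

def poisson_rate_ci(events, exposure_days, level=0.95):
    """Exact (Garwood) confidence interval for a Poisson rate per day."""
    a = 1.0 - level
    lower = chi2.ppf(a / 2, 2 * events) / 2 if events > 0 else 0.0
    upper = chi2.ppf(1 - a / 2, 2 * (events + 1)) / 2
    return events / exposure_days, lower / exposure_days, upper / exposure_days

# Hypothetical child: 109 non-core marketing exposures over 4 camera days.
rate, lo, hi = poisson_rate_ci(109, 4)  # rate = 27.25 exposures per day
```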

  12. Teaching tools in Evidence Based Practice: evaluation of reusable learning objects (RLOs for learning about Meta-analysis

    Directory of Open Access Journals (Sweden)

    Wharrad Heather

    2011-05-01

    Background: All healthcare students are taught the principles of evidence based practice on their courses. The ability to understand the procedures used in systematically reviewing evidence reported in studies, such as meta-analysis, is an important element of evidence based practice. Meta-analysis is a difficult statistical concept for healthcare students to understand, yet it is an important technique used in systematic reviews to pool data from studies to look at the combined effectiveness of treatments. In other areas of the healthcare curricula, by supplementing lectures, workbooks and workshops with pedagogically designed, multimedia learning objects (known as reusable learning objects or RLOs), we have shown an improvement in students' perceived understanding of subjects they found difficult. In this study we describe the development and evaluation of two RLOs on meta-analysis. The RLOs supplement associated lectures and aim to improve healthcare students' understanding of meta-analysis. Methods: Following a quality-controlled design process, two RLOs were developed and delivered to two cohorts of students, a Master in Public Health course and a Postgraduate Diploma in Nursing course. Students' understanding of five key concepts of meta-analysis was measured before and after a lecture, and again after RLO use. The RLOs were also evaluated for their educational value, learning support, media attributes and usability using closed and open questions. Results: Students rated their understanding of meta-analysis as improved after a lecture and further improved after completing the RLOs (Wilcoxon paired test). Conclusions: Meta-analysis RLOs that are openly accessible and unrestricted by usernames and passwords provide flexible support for students who find the process of meta-analysis difficult.

  13. Statistical Analysis for Subjective and Objective Evaluations of Dental Drill Sounds.

    Directory of Open Access Journals (Sweden)

    Tomomi Yamada

    The sound produced by a dental air turbine handpiece (dental drill) can markedly influence the sound environment in a dental clinic. Indeed, many patients report that the sound of a dental drill elicits an unpleasant feeling. Although several manufacturers have attempted to reduce the sound pressure levels produced by dental drills during idling based on ISO 14457, the sound emitted by such drills under active drilling conditions may negatively influence the dental clinic sound environment. The physical metrics related to the unpleasant impressions associated with dental drill sounds have not been determined. In the present study, psychological measurements of dental drill sounds were conducted with the aim of facilitating improvement of the sound environment at dental clinics. Specifically, we examined the impressions elicited by the sounds of 12 types of dental drills in idling and drilling conditions using a semantic differential. The analysis revealed that the impressions of dental drill sounds varied considerably between idling and drilling conditions and among the examined drills. This finding suggests that measuring the sound of a dental drill in idling conditions alone may be insufficient for evaluating the effects of the sound. We related the results of the psychological evaluations to measurements of two physical metrics: the equivalent continuous A-weighted sound pressure level (LAeq) and sharpness. Factor analysis indicated that impressions of the dental drill sounds consisted of two factors: "metallic and unpleasant" and "powerful". LAeq had a strong relationship with the "powerful" impression, calculated sharpness was positively related to the "metallic" impression, and the "unpleasant" impression was predicted by the combination of LAeq and calculated sharpness. The present analyses indicate that, in addition to a reduction in sound pressure level, refining the frequency components of dental drill sounds is important for creating a comfortable sound environment at dental clinics.

  14. Pigments analysis and gold layer thickness evaluation of polychromy on wood objects by PXRF.

    Science.gov (United States)

    Blonski, M S; Appoloni, C R

    2014-07-01

    The X-ray fluorescence technique by energy dispersion (EDXRF), being a multi-elemental and non-destructive technique, has been widely used in the analysis of artworks and in archeometry. A portable X-ray fluorescence instrument from the Laboratory of Applied Nuclear Physics of the State University of Londrina (LFNA/UEL) was used for the measurement of pigments in golden parts of a Gilding Preparation Standard Plaque and also for pigment measurements on the Wood Adornment of the High Altar Column of the Side Pulpit of the Immaculate Conception Church Parish, Sao Paulo-SP. The portable X-ray fluorescence system PXRF-LFNA-02 consists of an X-ray tube with an Ag anode, a Si-PIN detector (FWHM = 221 eV for the Mn line at 5.9 keV), a standard nuclear electronics chain for X-ray spectrometry, an 8K multichannel analyzer, a notebook and a mechanical system designed for the positioning of detector and X-ray tube, which allows movements with two degrees of freedom for the excitation-detection system. The excitation-detection time of each measurement was 100 and 500 s, respectively. The presence of the elements Ti, Cr, Fe, Cu, Zn and Au was found in the golden area of the Altar Column ornament. In addition, analysis of the ratios of the intensities of the Kα/Kβ lines measured in these areas made it possible to explore the possibility of measuring the stratigraphy of the pigment layers and to estimate their thickness. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. Towards truly simultaneous PIXE and RBS analysis of layered objects in cultural heritage

    International Nuclear Information System (INIS)

    Pascual-Izarra, Carlos; Barradas, Nuno P.; Reis, Miguel A.; Jeynes, Chris; Menu, Michel; Lavedrine, Bertrand; Ezrati, Jean Jacques; Roehrs, Stefan

    2007-01-01

    RBS and PIXE have long been used in the field of cultural heritage. Although the complementarity of the two techniques has long been acknowledged, its full potential has not yet been realised due to the lack of general purpose software tools for analysing the data from both techniques in a coherent way. In this work we provide an example of how the recent addition of PIXE to the set of techniques supported by the DataFurnace code can significantly change this situation. We present a case in which a non-homogeneous sample (an oxidized metal from a photographic plate - a heliography - made by Niepce in 1827) is analysed using RBS and PIXE in a straightforward and powerful way that can only be achieved with a code that treats both techniques simultaneously as part of one single and coherent analysis. The optimization capabilities of DataFurnace allowed us to obtain the composition profiles for these samples in a very simple way.

  16. Limited vs extended face-lift techniques: objective analysis of intraoperative results.

    Science.gov (United States)

    Litner, Jason A; Adamson, Peter A

    2006-01-01

    To compare the intraoperative outcomes of superficial musculoaponeurotic system plication, imbrication, and deep-plane rhytidectomy techniques. Thirty-two patients undergoing primary deep-plane rhytidectomy participated. Each hemiface in all patients was submitted sequentially to 3 progressively more extensive lifts, while other variables were standardized. Four major outcome measures were studied, including the extent of skin redundancy and the repositioning of soft tissues along the malar, mandibular, and cervical vectors of lift. The amount of skin excess was measured without tension from the free edge to a point over the intertragal incisure, along a plane overlying the jawline. Using a soft tissue caliper, repositioning was examined by measurement of preintervention and immediate postintervention distances from dependent points to fixed anthropometric reference points. The mean skin excesses were 10.4, 12.8, and 19.4 mm for the plication, imbrication, and deep-plane lifts, respectively. The greatest absolute soft tissue repositioning was noted along the jawline, with the least in the midface. Analysis revealed significant differences from baseline and between lift types for each of the studied techniques in each of the variables tested. These data support the use of the deep-plane rhytidectomy technique to achieve a superior intraoperative lift relative to comparator techniques.

  17. Evaluation of thermal comfort in university classrooms through objective approach and subjective preference analysis.

    Science.gov (United States)

    Nico, Maria Anna; Liuzzi, Stefania; Stefanizzi, Pietro

    2015-05-01

    Assessing thermal comfort becomes more relevant when the aim is to maximise learning and productivity, as typically occurs in offices and schools. However, while the Fanger model represents occupants' thermal response well in offices, in schools adaptive mechanisms significantly influence occupants' thermal preference. In this study, an experimental campaign was carried out at the Polytechnic University of Bari during the first days of March, in free-running conditions. First, the questionnaire results were compared under the Fanger model and the adaptive model; second, using a subjective scale, a complete analysis of thermal preference was performed in terms of acceptability, neutrality and preference, with particular focus on the influence of gender. The occupants' ability to control the indoor plant system had a significant impact on thermal sensation and on the acceptability of the thermal environment. Gender was also shown to greatly influence judgement of the thermal environment under cold outdoor conditions. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  18. [Using infrared thermal asymmetry analysis for objective assessment of the lesion of facial nerve function].

    Science.gov (United States)

    Liu, Xu-long; Hong, Wen-xue; Song, Jia-lin; Wu, Zhen-ying

    2012-03-01

    The skin temperature distribution of a healthy human body exhibits contralateral symmetry, and some lesions of facial nerve function are associated with an alteration of this thermal distribution. Since the dissipation of heat through the skin occurs for the most part in the form of infrared radiation, infrared thermography is the method of choice to capture such alterations. This paper presents a new measure of thermal asymmetry, named the effective thermal area ratio, defined as the product of two variables. The first is the mean temperature difference between a specific facial region and its contralateral region. The second is the ratio of the area of the abnormal region to the total area. Using this new method, we performed a controlled trial to assess the facial nerve function of healthy subjects and of patients with Bell's palsy. The results show that the mean specificity and sensitivity of this method are 0.90 and 0.87 respectively, improved by 7% and 26% compared with conventional methods. The Spearman correlation coefficient between the effective thermal area ratio and the degree of facial nerve function is 0.664 on average. Hence, for the diagnosis and assessment of facial nerve function, infrared thermography is a powerful tool, and the effective thermal area ratio is an efficient clinical indicator.
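
    The effective thermal area ratio described above is simply the product of a mean asymmetry and an abnormal-area fraction. A minimal sketch follows; the 0.5 °C abnormality threshold and the toy temperature maps are assumptions for illustration, not values from the paper:

```python
import numpy as np

def effective_thermal_area_ratio(region, contralateral, delta=0.5):
    """Effective thermal area ratio: mean |temperature difference| between a
    facial region and its mirrored counterpart, multiplied by the fraction
    of pixels whose asymmetry exceeds `delta` (a hypothetical threshold)."""
    diff = np.abs(region - contralateral)
    mean_diff = float(diff.mean())               # first factor
    abnormal_fraction = float((diff > delta).mean())  # area ratio
    return mean_diff * abnormal_fraction

# Toy example: one quadrant of the region runs 1.2 deg C warmer.
region = np.full((10, 10), 33.0)
region[:5, :5] += 1.2
contra = np.full((10, 10), 33.0)
score = effective_thermal_area_ratio(region, contra)
```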

  19. Landscape object-based analysis of wetland plant functional types: the effects of spatial scale, vegetation classes and classifier methods

    Science.gov (United States)

    Dronova, I.; Gong, P.; Wang, L.; Clinton, N.; Fu, W.; Qi, S.

    2011-12-01

    Remote sensing-based vegetation classifications representing plant function such as photosynthesis and productivity are challenging in wetlands with complex cover and difficult field access. Recent advances in object-based image analysis (OBIA) and machine-learning algorithms offer new classification tools; however, few comparisons of different algorithms and spatial scales have been discussed to date. We applied OBIA to delineate wetland plant functional types (PFTs) for Poyang Lake, the largest freshwater lake in China and a Ramsar wetland conservation site, from a 30-m Landsat TM scene at the peak of the spring growing season. We targeted major PFTs (C3 grasses, C3 forbs and different types of C4 grasses and aquatic vegetation) that are both key players in the system's biogeochemical cycles and critical providers of waterbird habitat. Classification results were compared among: a) several object segmentation scales (with average object sizes 900-9000 m2); b) several families of statistical classifiers (including Bayesian, Logistic, Neural Network, Decision Trees and Support Vector Machines); and c) two hierarchical levels of vegetation classification, a generalized 3-class set and a more detailed 6-class set. We found that classification benefited from the object-based approach, which allowed including object shape, texture and context descriptors in classification. While a number of classifiers achieved high accuracy at the finest pixel-equivalent segmentation scale, the highest accuracies and best agreement among algorithms occurred at coarser object scales. No single classifier was consistently superior across all scales, although selected algorithms of the Neural Network, Logistic and K-Nearest Neighbors families frequently provided the best discrimination of classes at different scales. The choice of vegetation categories also affected classification accuracy: the 6-class set allowed higher individual class accuracies but lower overall accuracies than the 3-class set.

  20. [Fundamental frequency analysis - a contribution to the objective examination of the speaking and singing voice (author's transl)].

    Science.gov (United States)

    Schultz-Coulon, H J

    1975-07-01

    The applicability of a newly developed fundamental frequency analyzer to diagnosis in phoniatrics is reviewed. During routine voice examination, the analyzer allows a quick and accurate measurement of the fundamental frequency and sound level of the speaking voice, and of vocal range and maximum phonation time. By computing fundamental frequency histograms, the median fundamental frequency and the total pitch range can be better determined and compared. Objective studies of certain technical faculties of the singing voice, which usually are estimated subjectively by the speech therapist, may now be done by means of this analyzer. Several examples demonstrate the differences between correct and incorrect phonation. These studies compare the pitch perturbations during the crescendo and decrescendo of a swell-tone, and show typical traces of staccato, trill and yodel. The study concludes that fundamental frequency analysis is a valuable supplementary method for objective voice examination.
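
    Fundamental frequency measurement of the kind described can be sketched with a simple autocorrelation pitch estimator (illustrative only; the original analyzer's internal method is not specified in this record, and the test signal is synthetic):

```python
import numpy as np

def fundamental_frequency(signal, rate, fmin=60.0, fmax=1000.0):
    """Estimate F0 as the autocorrelation peak within the allowed lag range
    (a minimal sketch of fundamental frequency analysis)."""
    signal = signal - signal.mean()
    # Non-negative-lag autocorrelation of the frame.
    ac = np.correlate(signal, signal, mode="full")[len(signal) - 1 :]
    lo = int(rate / fmax)          # shortest allowed period, in samples
    hi = int(rate / fmin)          # longest allowed period, in samples
    lag = lo + int(np.argmax(ac[lo : hi + 1]))
    return rate / lag

# Synthetic "voice": 220 Hz fundamental with a weaker octave partial.
rate = 8000
t = np.arange(0, 0.5, 1.0 / rate)
voice = np.sin(2 * np.pi * 220 * t) + 0.3 * np.sin(2 * np.pi * 440 * t)
f0 = fundamental_frequency(voice, rate)
```

    Running such an estimator over short frames and histogramming the per-frame F0 values yields the median fundamental frequency and pitch range the abstract discusses.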

  1. Automatic and objective oral cancer diagnosis by Raman spectroscopic detection of keratin with multivariate curve resolution analysis

    Science.gov (United States)

    Chen, Po-Hsiung; Shimada, Rintaro; Yabumoto, Sohshi; Okajima, Hajime; Ando, Masahiro; Chang, Chiou-Tzu; Lee, Li-Tzu; Wong, Yong-Kie; Chiou, Arthur; Hamaguchi, Hiro-O.

    2016-01-01

    We have developed an automatic and objective method for detecting human oral squamous cell carcinoma (OSCC) tissues with Raman microspectroscopy. We measure 196 independent Raman spectra from 196 different points of one oral tissue sample and globally analyze these spectra using Multivariate Curve Resolution (MCR) analysis. Discrimination of OSCC tissues is made automatically and objectively by spectral matching between the MCR-decomposed Raman spectra and the standard Raman spectrum of keratin, a well-established molecular marker of OSCC. We use a total of 24 tissue samples: 10 OSCC and 10 normal tissues from the same 10 patients, plus 3 OSCC and 1 normal tissue from different patients. Following the newly developed protocol presented here, we have been able to detect OSCC tissues with 77 to 92% sensitivity (depending on how positivity is defined) and 100% specificity. The present approach lends itself to a reliable clinical diagnosis of OSCC substantiated by the “molecular fingerprint” of keratin.
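
    The decomposition-plus-matching idea can be sketched with non-negative matrix factorization standing in for MCR, and cosine similarity as the spectral match score. All spectra below are synthetic Gaussian stand-ins, not Raman data, and NMF is an assumption replacing the paper's MCR algorithm:

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(1)

# Synthetic stand-ins: 196 spectra over 400 channels, mixing two hypothetical
# pure components (one playing the role of the keratin marker spectrum).
channels = np.linspace(0, 1, 400)
keratin_ref = np.exp(-((channels - 0.3) ** 2) / 0.002)
other = np.exp(-((channels - 0.7) ** 2) / 0.004)
conc = rng.random((196, 2))
spectra = conc @ np.vstack([keratin_ref, other]) + 0.01 * rng.random((196, 400))

# MCR-like bilinear decomposition with non-negativity constraints.
model = NMF(n_components=2, init="nndsvd", max_iter=500)
scores = model.fit_transform(spectra)   # per-point "concentrations"
components = model.components_          # decomposed spectra

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Spectral matching: does any decomposed spectrum match the keratin reference?
match = max(cosine(c, keratin_ref) for c in components)
```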

  2. Analysis of the efficiency of the linearization techniques for solving multi-objective linear fractional programming problems by goal programming

    Directory of Open Access Journals (Sweden)

    Tunjo Perić

    2017-01-01

    This paper presents and analyzes the applicability of three linearization techniques used for solving multi-objective linear fractional programming problems by the goal programming method. The three linearization techniques are: (1) Taylor's polynomial linearization approximation, (2) the method of variable change, and (3) a modification of the method of variable change proposed in [20]. All three linearization techniques are presented and analyzed in two variants: (a) using the optimal values of the objective functions as the decision makers' aspirations, and (b) with the aspirations given by the decision makers themselves. As criteria for the analysis we use the efficiency of the obtained solutions and the difficulties the analyst encounters in preparing the linearization models. To analyze the applicability of the linearization techniques incorporated in the linear goal programming method, we use an example of a financial structure optimization problem.
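
    For a single linear fractional objective, the method of variable change (the Charnes-Cooper transformation) reduces the problem to an ordinary LP. A minimal sketch on an illustrative problem, assuming a positive denominator on the feasible set (the data below are made up for the example):

```python
import numpy as np
from scipy.optimize import linprog

# Maximize (c.x + alpha) / (d.x + beta) s.t. A x <= b, x >= 0. The change of
# variables y = t*x, t = 1/(d.x + beta) gives the equivalent LP in (y, t):
#   maximize c.y + alpha*t  s.t.  A y - b t <= 0,  d.y + beta*t = 1,  y,t >= 0.
c = np.array([2.0, 1.0]); alpha = 0.0
d = np.array([1.0, 1.0]); beta = 1.0
A = np.array([[1.0, 2.0], [3.0, 1.0]]); b = np.array([8.0, 9.0])

n = len(c)
# Decision vector z = (y_1..y_n, t); linprog minimizes, so negate the objective.
obj = -np.concatenate([c, [alpha]])
A_ub = np.hstack([A, -b[:, None]])
b_ub = np.zeros(len(b))
A_eq = np.concatenate([d, [beta]])[None, :]
b_eq = np.array([1.0])

res = linprog(obj, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * (n + 1))
t = res.x[-1]
x_opt = res.x[:n] / t                       # recover the original variables
ratio = (c @ x_opt + alpha) / (d @ x_opt + beta)
```

    In the multi-objective goal programming setting analyzed in the paper, one such linearization is applied per fractional objective before the goals are combined.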

  3. Identification of Forested Landslides Using LiDar Data, Object-based Image Analysis, and Machine Learning Algorithms

    Directory of Open Access Journals (Sweden)

    Xianju Li

    2015-07-01

    For the identification of forested landslides, most studies focus on knowledge-based and pixel-based analysis (PBA) of LiDAR data, while few have examined (semi-)automated methods and object-based image analysis (OBIA). Moreover, most are focused on soil-covered areas with gentle hillslopes. In bedrock-covered mountains with steep and rugged terrain, landslide identification is so difficult that no research has yet examined whether combining semi-automated methods and OBIA with LiDAR derivatives alone could be more effective. In this study, a semi-automatic object-based landslide identification approach was developed and implemented in a forested area, the Three Gorges of China. Comparisons of OBIA and PBA, of two different machine learning algorithms, and of their respective sensitivity to feature selection (FS) were first investigated. Based on the classification result, the landslide inventory was finally obtained by (1) including holes encircled by the landslide body; (2) removing isolated segments; and (3) delineating closed envelope curves for landslide objects by manual digitizing. The proposed method achieved the following: (1) filter features of surface roughness were applied for the first time in calculating object features, and proved useful; (2) FS improved classification accuracy and reduced the number of features; (3) the random forest algorithm achieved higher accuracy and was less sensitive to FS than a support vector machine; (4) compared to PBA, OBIA was more sensitive to FS, remarkably reduced computing time, and depicted more contiguous terrain segments; (5) based on the classification result, with an overall accuracy of 89.11% ± 0.03%, the obtained inventory map was consistent with the reference landslide inventory map, with a position mismatch value of 9%. The outlined approach should be helpful for forested landslide identification in steep and rugged terrain.
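
    The feature-selection-plus-random-forest combination the abstract reports can be sketched on synthetic object features (the generated data and the k=12 selection are illustrative assumptions, not the study's configuration):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Synthetic object-feature table standing in for the spectral, texture and
# terrain (e.g. surface roughness) descriptors of image objects,
# labelled landslide vs. non-landslide.
X, y = make_classification(n_samples=400, n_features=30, n_informative=8,
                           n_redundant=4, random_state=0)

# Univariate feature selection followed by a random forest, mirroring the
# FS + RF combination the study found most accurate and least FS-sensitive.
clf = make_pipeline(SelectKBest(f_classif, k=12),
                    RandomForestClassifier(n_estimators=200, random_state=0))
acc = cross_val_score(clf, X, y, cv=5).mean()
```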

  4. Using object-based image analysis to conduct high-resolution conifer extraction at regional spatial scales

    Science.gov (United States)

    Coates, Peter S.; Gustafson, K. Benjamin; Roth, Cali L.; Chenaille, Michael P.; Ricca, Mark A.; Mauch, Kimberly; Sanchez-Chopitea, Erika; Kroger, Travis J.; Perry, William M.; Casazza, Michael L.

    2017-08-10

    imagery based on their spectral and spatial signatures. We classified conifers in 6,230 tiles and then tested for errors of omission and commission using confusion matrices. Accuracy ranged from 79.1 to 96.8 percent, with an overall accuracy of 84.3 percent across all mapped areas. An estimated accuracy coefficient (kappa) indicated substantial to nearly perfect agreement, which varied across mapped areas. For this mapping process across the entire mapping extent, four sets of products are available at https://doi.org/10.5066/F7348HVC, including (1) a shapefile representing accuracy results linked to mapping subunits; (2) binary rasters representing conifer presence or absence at a 1 × 1 m resolution; (3) a 30 × 30 m resolution raster representing percentages of conifer canopy cover within each cell from 0 to 100; and (4) 1 × 1 m resolution canopy cover classification rasters derived from a 50-m-radius moving window analysis. The latter two products can be reclassified in a geographic information system (GIS) into user-specified bins to meet different objectives, which include approximations for phases of encroachment. These products complement, and in some cases improve upon, existing conifer maps in the Western United States, and will help facilitate sage-grouse habitat management and sagebrush ecosystem restoration.

  5. DAKOTA : a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Eldred, Michael Scott; Vigil, Dena M.; Dalbey, Keith R.; Bohnhoff, William J.; Adams, Brian M.; Swiler, Laura Painton; Lefantzi, Sophia (Sandia National Laboratories, Livermore, CA); Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Eddy, John P.

    2011-12-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a theoretical manual for selected algorithms implemented within the DAKOTA software. It is not intended as a comprehensive theoretical treatment, since a number of existing texts cover general optimization theory, statistical analysis, and other introductory topics. Rather, this manual is intended to summarize a set of DAKOTA-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of DAKOTA's iterative analysis capabilities.
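
    A sampling-based uncertainty quantification study of the kind DAKOTA automates can be sketched generically with Latin hypercube sampling; the model function and input ranges below are a toy stand-in, not DAKOTA's actual interface:

```python
import numpy as np
from scipy.stats import qmc

# A toy "simulation code": response of a model with two uncertain inputs.
def simulation(x1, x2):
    return x1 ** 2 + 0.5 * np.sin(3 * x2)

# Latin hypercube sampling over the input ranges, standing in for a
# sampling-based uncertainty quantification study.
sampler = qmc.LatinHypercube(d=2, seed=0)
unit = sampler.random(n=256)
samples = qmc.scale(unit, l_bounds=[-1.0, 0.0], u_bounds=[1.0, np.pi])
outputs = simulation(samples[:, 0], samples[:, 1])
mean, std = outputs.mean(), outputs.std()   # output statistics of interest
```

    In a real study the `simulation` call would be replaced by an interface to the external simulation code, which is exactly the coupling DAKOTA's framework provides.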

  6. Analysis of warm season thunderstorms using an object-oriented tracking method based on radar and total lightning data

    Directory of Open Access Journals (Sweden)

    T. Rigo

    2010-09-01

    Monitoring thunderstorm activity is an essential part of operational weather surveillance given their potential hazards, including lightning, hail, heavy rainfall, strong winds or even tornadoes. This study has two main objectives: firstly, the description of a methodology, based on radar and total lightning data, to characterise thunderstorms in real time; secondly, the application of this methodology to 66 thunderstorms that affected Catalonia (NE Spain) in the summer of 2006. An object-oriented tracking procedure is employed, where different observation data types generate four different types of objects (radar 1-km CAPPI reflectivity composites, radar reflectivity volumetric data, cloud-to-ground lightning data and intra-cloud lightning data). In the framework proposed, these objects are the building blocks of a higher level object, the thunderstorm.

    The methodology is demonstrated with a dataset of thunderstorms whose main characteristics, along the complete life cycle of the convective structures (development, maturity and dissipation), are described statistically. The development and dissipation stages present similar durations in most cases examined. On the contrary, the duration of the maturity phase is much more variable and related to the thunderstorm intensity, defined here in terms of lightning flash rate. Most of the activity of IC and CG flashes is registered in the maturity stage. In the development stage few CG flashes are observed (2% to 5%), while in the dissipation phase a few more CG flashes are observed (10% to 15%). Additionally, a selection of thunderstorms is used to examine general life cycle patterns, obtained from the analysis of thunderstorm parameters normalized with respect to thunderstorm total duration and to the maximum value of the variables considered. Among other findings, the study indicates that the normalized duration of the three stages of the thunderstorm life cycle is similar in most thunderstorms.

  7. Analysis of warm season thunderstorms using an object-oriented tracking method based on radar and total lightning data

    Science.gov (United States)

    Rigo, T.; Pineda, N.; Bech, J.

    2010-09-01

    Monitoring thunderstorm activity is an essential part of operational weather surveillance given their potential hazards, including lightning, hail, heavy rainfall, strong winds or even tornadoes. This study has two main objectives: firstly, the description of a methodology, based on radar and total lightning data, to characterise thunderstorms in real time; secondly, the application of this methodology to 66 thunderstorms that affected Catalonia (NE Spain) in the summer of 2006. An object-oriented tracking procedure is employed, where different observation data types generate four different types of objects (radar 1-km CAPPI reflectivity composites, radar reflectivity volumetric data, cloud-to-ground lightning data and intra-cloud lightning data). In the framework proposed, these objects are the building blocks of a higher level object, the thunderstorm. The methodology is demonstrated with a dataset of thunderstorms whose main characteristics, along the complete life cycle of the convective structures (development, maturity and dissipation), are described statistically. The development and dissipation stages present similar durations in most cases examined. On the contrary, the duration of the maturity phase is much more variable and related to the thunderstorm intensity, defined here in terms of lightning flash rate. Most of the activity of IC and CG flashes is registered in the maturity stage. In the development stage few CG flashes are observed (2% to 5%), while in the dissipation phase a few more CG flashes are observed (10% to 15%). Additionally, a selection of thunderstorms is used to examine general life cycle patterns, obtained from the analysis of thunderstorm parameters normalized with respect to thunderstorm total duration and to the maximum value of the variables considered. Among other findings, the study indicates that the normalized duration of the three stages of the thunderstorm life cycle is similar in most thunderstorms.

  8. Automated Glacier Mapping using Object Based Image Analysis. Case Studies from Nepal, the European Alps and Norway

    Science.gov (United States)

    Vatle, S. S.

    2015-12-01

    Frequent and up-to-date glacier outlines are needed for many applications of glaciology: not only glacier area change analysis, but also masks for volume or velocity analysis, the estimation of water resources, and model input data. Remote sensing offers a good option for creating glacier outlines over large areas, but manual correction is frequently necessary, especially in areas containing supraglacial debris. We show three different workflows for mapping clean ice and debris-covered ice within Object Based Image Analysis (OBIA). By working at the object level as opposed to the pixel level, OBIA facilitates using contextual, spatial and hierarchical information when assigning classes, and additionally permits the handling of multiple data sources. Our first example shows mapping of debris-covered ice in the Manaslu Himalaya, Nepal. SAR coherence data is used in combination with optical and topographic data to classify debris-covered ice, obtaining an accuracy of 91%. Our second example uses a high-resolution LiDAR-derived DEM over the Hohe Tauern National Park in Austria. Breaks in surface morphology are used in creating image objects; debris-covered ice is then classified using a combination of spectral, thermal and topographic properties. Lastly, we show a completely automated workflow for mapping glacier ice in Norway. The NDSI and the NIR/SWIR band ratio are used to map clean ice over the entire country, but the thresholds are calculated automatically from a histogram of each image subset. This means that in principle any Landsat scene can be input and the clean ice extracted automatically. Debris-covered ice can be included semi-automatically using contextual and morphological information.
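
    An automatic, per-scene histogram threshold such as the Norwegian workflow describes can be sketched with Otsu's method on an NDSI image (the values below are synthetic, and Otsu is an assumed stand-in since this record does not detail the workflow's actual threshold rule):

```python
import numpy as np

def histogram_threshold(values, bins=256):
    """Otsu's between-class-variance threshold computed from a histogram:
    picks the bin centre that best separates two populations (e.g. clean
    ice vs. everything else in an NDSI or NIR/SWIR ratio image)."""
    hist, edges = np.histogram(values, bins=bins)
    p = hist.astype(float) / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(p)                 # class-0 weight up to each bin
    mu = np.cumsum(p * centers)       # cumulative first moment
    mu_t = mu[-1]
    w1 = 1.0 - w0
    valid = (w0 > 0) & (w1 > 0)
    between = np.zeros_like(w0)
    between[valid] = (mu_t * w0[valid] - mu[valid]) ** 2 / (w0[valid] * w1[valid])
    return centers[np.argmax(between)]

# Synthetic NDSI image: dark non-ice background and a bright ice population.
rng = np.random.default_rng(2)
ndsi = np.concatenate([rng.normal(0.1, 0.05, 5000),   # rock / vegetation
                       rng.normal(0.7, 0.05, 2000)])  # clean ice
t = histogram_threshold(ndsi)
ice_mask = ndsi > t
```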

  9. Comparative analysis of the symptomatology of children with lower urinary tract dysfunction in relation to objective data

    Directory of Open Access Journals (Sweden)

    Ubirajara Barroso Jr

    2006-02-01

    Full Text Available OBJECTIVES: To assess the clinical presentation of children with lower urinary tract dysfunction (LUTD) in relation to objective examination data. MATERIALS AND METHODS: Forty-four children (36 girls and 8 boys) with a mean age of 6.8 years with LUTD were prospectively assessed through a specific questionnaire that analyzed the clinical presentation of those patients. These data were then compared to objective data, such as the micturition diary and uroflowmetry with electromyography. RESULTS: An antecedent of urinary tract infection (UTI) was observed in 31 cases (70.5%), and of those, 24 cases of UTI were accompanied by fever. All children presented micturition urgency. Daily urinary incontinence was observed in 33 cases (75%) and nocturnal enuresis in 23 (52.3%). As for micturition frequency, 15 (34.1%) had normal frequency, 19 (43.2%) presented more than 10 daily micturition episodes and 10 (22.7%) thought they urinated less than 5 times a day. In the uroflowmetry and electromyography examination, 14 (31.8%) experienced lack of coordination during micturition. Of the 10 children with infrequent micturition, 5 confirmed this in their micturition diaries and 2 listed more than 5 micturition episodes per day in the diary. Of the 19 patients presenting pollakiuria, only 5 confirmed this in their micturition diaries, while 7 had less than 10 micturition episodes per day. CONCLUSION: Most children with LUTD presented a previous UTI, and daily incontinence was verified in around 75% of the patients. Complaints of pollakiuria or infrequent micturition are not fully reflected in the micturition diaries, and no parameter in the clinical history offers good sensitivity or specificity for the diagnosis of lack of perineal coordination.

  10. An Objective Screening Method for Major Depressive Disorder Using Logistic Regression Analysis of Heart Rate Variability Data Obtained in a Mental Task Paradigm

    Directory of Open Access Journals (Sweden)

    Guanghao Sun

    2016-11-01

    Full Text Available Background and Objectives: Heart rate variability (HRV) has been intensively studied as a promising biological marker of major depressive disorder (MDD). Our previous study confirmed that autonomic activity and reactivity in depression, revealed by HRV during rest and mental task (MT) conditions, can be used as diagnostic measures and in clinical evaluation. In this study, logistic regression analysis (LRA) was utilized for the classification and prediction of MDD based on HRV data obtained in an MT paradigm. Methods: Power spectral analysis of HRV on R-R intervals before, during, and after an MT (random number generation) was performed in 44 drug-naïve patients with MDD and 47 healthy control subjects at the Department of Psychiatry in Shizuoka Saiseikai General Hospital. Logit scores determined by LRA from HRV indices and heart rates discriminated patients with MDD from healthy subjects. The high frequency (HF) component of HRV and the ratio of the low frequency (LF) component to the HF component (LF/HF) correspond to parasympathetic activity and sympathovagal balance, respectively. Results: The LRA achieved a sensitivity and specificity of 80.0% and 79.0%, respectively, at an optimum cutoff logit score (0.28). Misclassifications occurred only when the logit score was close to the cutoff score. Logit scores also correlated significantly with subjective self-rating depression scale scores (p < 0.05). Conclusion: HRV indices recorded during a mental task may be an objective tool for screening patients with MDD in psychiatric practice. The proposed method appears promising not only for objective and rapid MDD screening, but also for evaluation of its severity.
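    A minimal sketch of the screening rule follows. Only the 0.28 logit cutoff comes from the abstract; the regression weights, intercept, and feature values below are hypothetical placeholders, since the fitted coefficients are not given in the text.

    ```python
    import math

    def logit_score(features, weights, intercept):
        """Linear predictor (logit) of a fitted logistic regression model."""
        return intercept + sum(w * x for w, x in zip(weights, features))

    def screen_mdd(features, weights, intercept, cutoff=0.28):
        """Flag a subject as screen-positive when the logit exceeds the
        cutoff reported in the abstract; also return the logistic
        probability for reference."""
        z = logit_score(features, weights, intercept)
        p = 1.0 / (1.0 + math.exp(-z))
        return z > cutoff, p
    ```

    In the study's terms, the feature vector would hold HRV indices (HF, LF/HF) and heart rates from the rest and task segments; a logit near the cutoff corresponds to the region where the paper reports misclassifications occurred.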

  11. Comparative Analysis of Several Real-Time Systems for Tracking People and/or Moving Objects using GPS

    OpenAIRE

    Radinski, Gligorcho; Mileva, Aleksandra

    2015-01-01

    When we talk about real-time systems for tracking people and/or moving objects using a Global Positioning System (GPS), there are several categories of such systems and ways in which they work. Some use additional hardware to extend the functionality offered, some are free, and some are too complex and cost too much money. This paper aims to provide a clearer picture of several such systems and to show results from a comparative analysis of some popular systems for trac...

  12. Pygrass: An Object Oriented Python Application Programming Interface (API) for Geographic Resources Analysis Support System (GRASS) Geographic Information System (GIS)

    Directory of Open Access Journals (Sweden)

    Marco Ciolli

    2013-03-01

    Full Text Available PyGRASS is an object-oriented Python Application Programming Interface (API) for the Geographic Resources Analysis Support System (GRASS) Geographic Information System (GIS), a powerful open source GIS widely used in academia, commercial settings and governmental agencies. We present the architecture of the PyGRASS library, covering interfaces to GRASS modules, vector and raster data, with a focus on the new capabilities that it provides to GRASS users and developers. Our design concept for the module interface allows the direct linking of inputs and outputs of GRASS modules to create process chains, including compatibility checks, process control and error handling. The module interface was designed to be easily extended to work with remote processing services (Web Processing Service (WPS), Web Service Definition Language (WSDL)/Simple Object Access Protocol (SOAP)). The new object-oriented Python programming API introduces an abstraction layer that opens the possibility of transparently using and accessing the efficient raster and vector functions of GRASS that are implemented in C. The design goal was to provide an easy to use, but powerful, Python interface for users and developers who are not familiar with the C programming language and with the GRASS C API. We demonstrate the capabilities, scalability and performance of PyGRASS with several dedicated tests and benchmarks. We compare and discuss the results of the benchmarks with dedicated C implementations.

  13. Integrating fuzzy object based image analysis and ant colony optimization for road extraction from remotely sensed images

    Science.gov (United States)

    Maboudi, Mehdi; Amini, Jalal; Malihi, Shirin; Hahn, Michael

    2018-04-01

    An updated road network, as a crucial part of the transportation database, plays an important role in various applications. Thus, increasing the automation of road extraction approaches from remote sensing images has been the subject of extensive research. In this paper, we propose an object-based road extraction approach for very high resolution satellite images. Building on object based image analysis, our approach incorporates various spatial, spectral, and textural object descriptors, the capabilities of a fuzzy logic system for handling the uncertainties in road modelling, and the effectiveness and suitability of the ant colony algorithm for optimization of network-related problems. Four VHR optical satellite images acquired by the Worldview-2 and IKONOS satellites are used to evaluate the proposed approach. Evaluation of the extracted road networks shows that the average completeness, correctness, and quality of the results can reach 89%, 93% and 83% respectively, indicating that the proposed approach is applicable for urban road extraction. We also analyzed the sensitivity of our algorithm to different ant colony optimization parameter values. Comparison of the achieved results with the results of four state-of-the-art algorithms and quantification of the robustness of the fuzzy rule set demonstrate that the proposed approach is both efficient and transferable to other comparable images.
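    The completeness, correctness, and quality figures quoted above are standard road-extraction evaluation measures; a minimal sketch of their usual definitions follows, with TP/FP/FN standing for matched, falsely extracted, and missed road length. The example numbers are illustrative only, not the paper's data.

    ```python
    def road_metrics(tp, fp, fn):
        """Standard road-extraction quality measures from matched (TP),
        falsely extracted (FP), and missed reference (FN) road length."""
        completeness = tp / (tp + fn)   # share of reference road recovered
        correctness = tp / (tp + fp)    # share of extracted road that is real
        quality = tp / (tp + fp + fn)   # combined measure
        return completeness, correctness, quality
    ```

    Illustrative lengths of TP=89, FP=7, FN=11 (in arbitrary units) yield values of the same order as the averages reported in the abstract.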

  14. Measurement and Analysis of Olfactory Responses with the Aim of Establishing an Objective Diagnostic Method for Central Olfactory Disorders

    Science.gov (United States)

    Uno, Tominori; Wang, Li-Qun; Miwakeichi, Fumikazu; Tonoike, Mitsuo; Kaneda, Teruo

    In order to establish a new diagnostic method for central olfactory disorders and to identify objective indicators, we measured and analyzed brain activities in the parahippocampal gyrus and uncus, the regions responsible for central olfactory disorders. The relationship between olfactory stimulation and the brain response in these regions can be examined in terms of fitted responses (FR). FR in these regions may be individual indicators of changes in brain olfactory responses. In the present study, in order to measure olfactory responses non-invasively and objectively, an odor oddball task was conducted on four healthy volunteers using functional magnetic resonance imaging (fMRI) and an odorant stimulator based on the blast method. The results showed favorable FR and activation in the parahippocampal gyrus or uncus in all subjects. In some subjects, both the parahippocampal gyrus and uncus were activated. Furthermore, activation was also confirmed in the cingulate gyrus, middle frontal gyrus, precentral gyrus, postcentral gyrus, superior temporal gyrus and insula. The hippocampus and uncus are known to be involved in the olfactory disorders associated with early-stage Alzheimer's disease and other olfactory disorders. In the future, it will be necessary to further develop the present measurement and analysis method to clarify the relationship between central olfactory disorders and brain activities and to establish objective indicators that are useful for diagnosis.

  15. Implementation of Neutronics Analysis Code using the Features of Object Oriented Programming via Fortran90/95

    Energy Technology Data Exchange (ETDEWEB)

    Han, Tae Young; Cho, Beom Jin [KEPCO Nuclear Fuel, Daejeon (Korea, Republic of)

    2011-05-15

    The object-oriented programming (OOP) concept became firmly established during the 1990s and was successfully incorporated into Fortran 90/95. The features of OOP, such as information hiding, encapsulation, modularity and inheritance, lead to code that satisfies the three R's: reusability, reliability and readability. The major OOP concepts, however, apart from modules, are rarely used in neutronics analysis codes even when the code is written in Fortran 90/95. In this work, we show that the OOP concept can be employed to develop the neutronics analysis code ASTRA1D (Advanced Static and Transient Reactor Analyzer for 1-Dimension) via Fortran 90/95, and that it offers a more efficient and reasonable programming approach.

  16. Assessment of pharmacy students' communication competence using the Roter Interaction Analysis System during objective structured clinical examinations.

    Science.gov (United States)

    Kubota, Yoshie; Yano, Yoshitaka; Seki, Susumu; Takada, Kaori; Sakuma, Mio; Morimoto, Takeshi; Akaike, Akinori; Hiraide, Atsushi

    2011-04-11

    To determine the value of using the Roter Interaction Analysis System (RIAS) during objective structured clinical examinations (OSCEs) to assess pharmacy students' communication competence. As pharmacy students completed a clinical OSCE involving an interview with a simulated patient, 3 experts used a global rating scale to assess students' overall performance in the interview, and both the student's and the patient's language was coded using the RIAS. The coders recorded the number of utterances (i.e., units of spoken language) in each RIAS category. Correlations between the raters' scores and the number and types of utterances were examined. There was a significant correlation between students' global rating scores on the OSCE and the number of utterances in the RIAS socio-emotional category but not the RIAS business category. The RIAS proved to be a useful tool for assessing the socio-emotional aspect of students' interview skills.
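    The correlation between raters' global scores and category utterance counts can be computed with a plain Pearson coefficient; a self-contained sketch follows (the example data are illustrative, not the study's).

    ```python
    def pearson_r(xs, ys):
        """Pearson product-moment correlation between two equal-length
        sequences (e.g. global rating scores vs. utterance counts)."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy)
    ```

    In the study's setting, one sequence would hold each student's mean global rating and the other the count of socio-emotional (or business) utterances from the RIAS coding.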

  17. Bite Mark Analysis in Foodstuffs and Inanimate Objects and the Underlying Proofs for Validity and Judicial Acceptance.

    Science.gov (United States)

    Rivera-Mendoza, Fernando; Martín-de-Las-Heras, Stella; Navarro-Cáceres, Pablo; Fonseca, Gabriel M

    2018-03-01

    Even though one of the first bite mark cases was Doyle v. State in 1954 (a bitten cheese case), research has focused on bite marks inflicted on human skin. Because published papers, case reports, and technical notes can constitute precedents relied upon in making legal arguments, and a considerable amount of case law exists in this area, we present a systematic review of bite mark analysis in foodstuffs and inanimate objects and the underlying proofs for its validity and judicial acceptance according to the Daubert rulings. The results showed that these procedures are vulnerable, and it is essential to demand close scrutiny of the known error rates when such evidence is presented at trial. These kinds of bite marks are well documented; however, there has been little research in this field, even though the protocols of analysis and comparison are the responsibility of forensic odontologists. © 2017 American Academy of Forensic Sciences.

  18. Objectivity of two methods of differentiating fibre types and repeatability of measurements by application of the TEMA image analysis system.

    Science.gov (United States)

    Henckel, P; Ducro, B; Oksbjerg, N; Hassing, L

    1998-01-01

    The objectivity of two of the most widely used methods for differentiation of fibre types, i.e. 1) the myosin ATP-ase method (Brooke and Kaiser, 1970a,b) and 2) the combined method, by which the myosin ATP-ase reaction is used to differentiate between fast and slow twitch fibres and NADH-tetrazolium reductase activity is used to identify the subgroups of fast twitch fibres (Ashmore and Doerr, 1970, Peter et al., 1972), was assessed in muscle samples from horses, calves and pigs. We also assessed the objectivity of the alpha-amylase-PAS preparation for the visualisation of capillaries (Andersen, 1975) in these species. To reduce the time costs of histochemical analysis of muscle samples, we developed an interactive image analysis system, which is described here; all analyses were performed on this system. In accordance with several other investigations, differences between the two methods of differentiating fibre types were found only for the relative distribution of the fast-twitch fibre subgroups (p 87%), the impact of differences in pre-requisites (varied degrees of overlap between the fibre types) for performing the differentiation by the combined method raises a question about the reliability of this method. Apparently, no general rules are applicable for comparing results on the distribution of the two subgroups of fast twitch fibres between the two methods. The alpha-amylase-PAS method was found to be a fairly objective method for identifying capillaries in muscles from horses, calves and pigs. However, as capillarity, described in combination with other traits to give an indication of diffusion characteristics, is significantly influenced by the person performing the analysis, it is recommended that the same person perform all the analyses of a project. In addition to the methodological results of this study, we have shown that by application of the TEMA image analysis system, which is more rapid compared with the time-consuming traditional method for evaluation of histochemical

  19. Combining Landform Thematic Layer and Object-Oriented Image Analysis to Map the Surface Features of Mountainous Flood Plain Areas

    Science.gov (United States)

    Chuang, H.-K.; Lin, M.-L.; Huang, W.-C.

    2012-04-01

    Typhoon Morakot in August 2009 brought more than 2,000 mm of cumulative rainfall to southern Taiwan; this extreme rainfall event caused serious damage to the Kaoping River basin. The losses were mostly attributable to landslides along the sides of the river, and shifting of the watercourse even led to the failure of roads and bridges, as well as flooding and levee damage around the villages on the flood banks and terraces. Alluvial fans resulting from debris flows of tributary streams blocked the main watercourse, and a debris dam even formed and collapsed. These disasters have highlighted the importance of identifying and mapping watercourse alterations, the surface features of the flood plain area and artificial structures soon after a catastrophic typhoon event for natural hazard mitigation. Interpretation of remote sensing images is an efficient approach to acquiring spatial information over vast areas, making it suitable for the differentiation of terrain and objects near vast flood plain areas in the short term. An object-oriented image analysis program (Definiens Developer 7.0) and multi-band high resolution satellite images (QuickBird, DigitalGlobe) were utilized to interpret the flood plain features from Liouguei to Baolai in the Kaoping River basin after Typhoon Morakot. Object-oriented image interpretation is the process of using homogenized image blocks as elements instead of pixels, exploiting the shapes, textures and mutual relationships of adjacent elements, together with categorization conditions and rules, for semi-automated interpretation of surface features. Digital terrain models (DTM) are also employed along with the above process to produce layers with specific "landform thematic layers". 
These layers are especially helpful in differentiating, with improved accuracy, categories that are easily confused in spectral analysis, such as landslides and riverbeds, as well as terraces and riverbanks, which are of significant engineering importance in disaster

  20. Explicit area-based accuracy assessment for mangrove tree crown delineation using Geographic Object-Based Image Analysis (GEOBIA)

    Science.gov (United States)

    Kamal, Muhammad; Johansen, Kasper

    2017-10-01

    Effective mangrove management requires spatially explicit information in the form of a mangrove tree crown map as a basis for ecosystem diversity studies and health assessment. Accuracy assessment is an integral part of any mapping activity, measuring the effectiveness of the classification approach. In geographic object-based image analysis (GEOBIA), assessment of the geometric accuracy (shape, symmetry and location) of the image objects created by image segmentation is required. In this study we used an explicit area-based accuracy assessment to measure the degree of similarity between the classification results and reference data from different aspects, including overall quality (OQ), user's accuracy (UA), producer's accuracy (PA) and overall accuracy (OA). We developed a rule set to delineate mangrove tree crowns using a WorldView-2 pan-sharpened image. The reference map was obtained by visual delineation of the mangrove tree crown boundaries from a very high spatial resolution aerial photograph (7.5 cm pixel size). Ten random points, each with a 10 m radius circular buffer, were created for the area-based accuracy assessment. The resulting circular polygons were used to clip both the classified image objects and the reference map for area comparisons. In this case, the area-based accuracy assessment yielded 64% and 68% for OQ and OA, respectively. The overall quality reflects the class-related area accuracy: the area correctly classified as tree crowns was 64% of the total area of tree crowns. The overall accuracy of 68%, on the other hand, was calculated as the percentage of all correctly classified classes (tree crowns and canopy gaps) relative to the total class area (the entire image). Overall, the area-based accuracy assessment was simple to implement and easy to interpret. It also shows explicitly the omission and commission error variations of object boundary delineation with colour-coded polygons.
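    Following the definitions as worded in the abstract (OQ as correctly classified crown area over total reference crown area; OA over all classes), a minimal sketch on unit-cell sets follows. Representing areas as sets of cells is an assumption for illustration; the study works with clipped polygons.

    ```python
    def area_based_accuracy(ref_crown, cls_crown, total_cells):
        """ref_crown / cls_crown: sets of unit cells labelled 'tree crown'
        in the reference map and the classification; total_cells: number
        of cells in the assessment buffer (crowns plus canopy gaps)."""
        correct_crown = len(ref_crown & cls_crown)
        # overall quality: correct crown area over total reference crown area
        oq = correct_crown / len(ref_crown)
        # cells both maps agree are non-crown (canopy gap)
        correct_gap = total_cells - len(ref_crown | cls_crown)
        # overall accuracy: all agreeing cells over the whole buffer
        oa = (correct_crown + correct_gap) / total_cells
        return oq, oa
    ```

    Computed per circular buffer and averaged, these two numbers correspond to the 64% and 68% figures reported above.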

  1. An Analysis of Light Periods of BL Lac Object S5 0716+714 with the MUSIC Algorithm

    Science.gov (United States)

    Tang, Jie

    2012-07-01

    The multiple signal classification (MUSIC) algorithm is introduced for the estimation of the light periods of BL Lac objects. The principle of the MUSIC algorithm is given, together with a test of its spectral resolution using a simulated signal. From the literature, we collected a large number of effective observational data of the BL Lac object S5 0716+714 in the three optical wavebands V, R, and I from 1994 to 2008. The light periods of S5 0716+714 are obtained by means of the MUSIC algorithm and the average periodogram algorithm, respectively. It is found that there are two major periodic components: one is a period of (3.33±0.08) yr, the other a period of (1.24±0.01) yr. The comparison of the periodicity analysis performance of the two algorithms indicates that the MUSIC algorithm requires a shorter sample length and has good spectral resolution and noise robustness, improving the accuracy of periodicity analysis in the case of short sample lengths.
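    A compact sketch of the MUSIC pseudospectrum for a uniformly sampled series follows. This is a simplified stand-in: the photometric data in the paper are unevenly sampled and would need regularisation or interpolation first, and the matrix order and subspace dimension below are illustrative choices.

    ```python
    import numpy as np

    def music_spectrum(x, n_sinusoids, m, freqs):
        """MUSIC pseudospectrum of a real, uniformly sampled signal x.
        n_sinusoids: assumed number of real sinusoids (signal subspace
        dimension 2*n_sinusoids); m: autocorrelation matrix order;
        freqs: normalized frequencies (cycles per sample) to evaluate."""
        x = np.asarray(x, dtype=float) - np.mean(x)
        n = len(x)
        # estimate the m x m autocorrelation matrix from sliding windows
        windows = np.array([x[i:i + m] for i in range(n - m + 1)])
        r = windows.T @ windows / windows.shape[0]
        # eigh returns eigenvalues in ascending order: the eigenvectors
        # paired with the smallest ones span the noise subspace
        _, v = np.linalg.eigh(r)
        noise = v[:, : m - 2 * n_sinusoids]
        p = []
        for f in freqs:
            a = np.exp(2j * np.pi * f * np.arange(m))   # steering vector
            denom = np.sum(np.abs(noise.conj().T @ a) ** 2)
            p.append(1.0 / max(denom, 1e-300))          # guard against 0
        return np.array(p)
    ```

    Peaks of the pseudospectrum mark candidate periods; because the estimate comes from subspace orthogonality rather than raw spectral bins, resolution does not degrade as quickly with short records, which matches the advantage the paper reports.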

  2. The Impact of Objective Mathematical Analysis During Fractional Flow Reserve Measurement. Results from the OMA-FFR Study.

    Science.gov (United States)

    Sciola, Martina I; Morris, Paul D; Gosling, Rebecca; Lawford, Patricia V; Hose, D Rodney; Gunn, Julian P

    2018-02-13

    Fractional flow reserve (FFR), the reference standard for guiding coronary revascularisation, is most commonly acquired during intravenous adenosine infusion. Results may be sensitive to system- and operator-dependent variability in how pressure data are analysed and interpreted. We developed a computational protocol to process the recorded pressure signals in a consistent manner to objectively quantify FFR. We studied the impact upon lesion (re)classification and compared this with the operator-selected FFR obtained during cardiac catheterisation. The algorithm used a moving average and Fourier transformation to identify the Pd/Pa ratio at its nadir (FFRmin) and during the stable hyperaemic period (FFRstable) in <2 s with 100% repeatability, in 163 coronary stenoses (93 patients). The mean operator-selected FFR (FFRCL) was higher than FFRmin and lower than FFRstable (0.779 vs 0.762 vs 0.806, p<0.01). Compared with FFRmin, FFRstable resulted in 16.5% of all lesions being reclassified, all from significant to non-significant (p<0.01). FFRCL classified lesion significance differently from both FFRstable and FFRmin (11.7% and 6.1% of lesions reclassified, respectively; p<0.01). Subtle differences in how pressure data are analysed and interpreted by the operator during adenosine infusion result in significant differences in the classification of physiological lesion significance. An algorithmic analysis may be helpful in standardising FFR analysis, providing an objective and repeatable result.
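    The nadir/stable-period distinction can be sketched as follows. This is a simplified stand-in, not the published algorithm: it replaces the paper's Fourier step with a minimum-spread window criterion, and the window lengths are assumptions.

    ```python
    def moving_average(xs, k):
        """Simple k-sample moving average."""
        return [sum(xs[i:i + k]) / k for i in range(len(xs) - k + 1)]

    def ffr_indices(pd_pa, k=3):
        """From a beat-by-beat Pd/Pa series, return (FFRmin, FFRstable):
        the smoothed nadir, and the mean over the flattest (smallest
        max-min spread) window, a proxy for the stable hyperaemic period."""
        sm = moving_average(pd_pa, k)
        ffr_min = min(sm)
        spreads = [(max(sm[i:i + k]) - min(sm[i:i + k]), i)
                   for i in range(len(sm) - k + 1)]
        _, i0 = min(spreads)            # flattest window wins
        ffr_stable = sum(sm[i0:i0 + k]) / k
        return ffr_min, ffr_stable
    ```

    On a trace that dips transiently before settling, FFRmin sits below FFRstable, reproducing the ordering (FFRmin < FFRCL < FFRstable) reported in the study.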

  3. The role of network theory and object-oriented modeling within a framework for the vulnerability analysis of critical infrastructures

    International Nuclear Information System (INIS)

    Eusgeld, Irene; Kroeger, Wolfgang; Sansavini, Giovanni; Schlaepfer, Markus; Zio, Enrico

    2009-01-01

    A framework for the analysis of the vulnerability of critical infrastructures was previously proposed by some of the authors. The framework basically consists of two successive stages: (i) a screening analysis for identifying the parts of the critical infrastructure most relevant to its vulnerability, and (ii) a detailed modeling of the operational dynamics of the identified parts for gaining insights into the causes and mechanisms responsible for the vulnerability. In this paper, a critical presentation is offered of the results of a set of investigations aimed at evaluating the potential of (i) using network analysis, based on measures of topological interconnection and reliability efficiency, for the screening task; and (ii) using object-oriented modeling as the simulation framework to capture the detailed dynamics of the operational scenarios involving the most vulnerable parts of the critical infrastructure as identified by the preceding network analysis. A case study based on the Swiss high-voltage transmission system is considered. The results are cross-compared and evaluated, and needs for further research are defined.
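    One topological screening measure of the kind mentioned, global efficiency (the mean of inverse shortest-path distances), can be sketched for an unweighted network. The adjacency-list representation is an assumption; the paper's reliability-efficiency measure additionally weights paths by component reliability.

    ```python
    from collections import deque

    def global_efficiency(adj):
        """Global efficiency of an unweighted graph: mean of 1/d(i, j)
        over all ordered node pairs (0 where no path exists).
        adj: dict mapping each node to a list of its neighbours."""
        nodes = list(adj)
        total, pairs = 0.0, 0
        for s in nodes:
            # BFS shortest-path lengths from s
            dist = {s: 0}
            q = deque([s])
            while q:
                u = q.popleft()
                for v in adj[u]:
                    if v not in dist:
                        dist[v] = dist[u] + 1
                        q.append(v)
            for t in nodes:
                if t != s:
                    pairs += 1
                    if t in dist:
                        total += 1.0 / dist[t]
        return total / pairs
    ```

    Re-evaluating efficiency after removing a component ranks how much each part degrades the network, which is the essence of the screening stage.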

  4. Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Adams, Brian M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ebeida, Mohamed Salah [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eldred, Michael S. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jakeman, John Davis [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Swiler, Laura Painton [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Stephens, John Adam [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vigil, Dena M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wildey, Timothy Michael [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bohnhoff, William J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eddy, John P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hu, Kenneth T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dalbey, Keith R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bauman, Lara E [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hough, Patricia Diane [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-05-01

    The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the Dakota software and provides capability overviews and procedures for software execution, as well as a variety of example studies.

  5. Data Rods: High Speed, Time-Series Analysis of Massive Cryospheric Data Sets Using Object-Oriented Database Methods

    Science.gov (United States)

    Liang, Y.; Gallaher, D. W.; Grant, G.; Lv, Q.

    2011-12-01

    Change over time is the central driver of climate change detection. The goal is to diagnose the underlying causes and make projections into the future. In an effort to optimize this process we have developed the Data Rod model, an object-oriented approach that provides the ability to query grid cell changes and their relationships to neighboring grid cells through time. The time series data are organized in time-centric structures called "data rods." A single data rod can be pictured as the multi-spectral data history at one grid cell: a vertical column of data through time. This resolves the long-standing problem of managing time-series data and opens new possibilities for temporal data analysis. This structure enables rapid time-centric analysis at any grid cell across multiple sensors and satellite platforms. Collections of data rods can be spatially and temporally filtered, statistically analyzed, and aggregated for use with pattern matching algorithms. Likewise, individual image pixels can be extracted to generate multi-spectral imagery at any spatial and temporal location. The Data Rods project has created a series of prototype databases to store and analyze massive datasets containing multi-modality remote sensing data. Using object-oriented technology, this method overcomes the operational limitations of traditional relational databases. To demonstrate the speed and efficiency of time-centric analysis using the Data Rods model, we have developed a sea ice detection algorithm. This application determines the concentration of sea ice in a small spatial region across a long temporal window. If performed using traditional analytical techniques, this task would typically require extensive data downloads and spatial filtering. Using Data Rods databases, the exact spatio-temporal data set is immediately available: no extraneous data are downloaded, and all data querying occurs transparently on the server side. Moreover, fundamental statistical
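    The data-rod idea, one time-ordered column of values per grid cell queried by time window, can be sketched in a few lines. This toy in-memory version is an assumption for illustration, standing in for the project's object-oriented database, with the sea-ice concentration query modelled on the application described above.

    ```python
    from collections import defaultdict

    class DataRods:
        """Time-centric store: one 'rod' (time-ordered value list) per
        grid cell, rather than one image per time step."""
        def __init__(self):
            self.rods = defaultdict(list)

        def ingest(self, t, grid):
            """Append one time step; grid maps (row, col) -> value."""
            for cell, value in grid.items():
                self.rods[cell].append((t, value))

        def series(self, cell, t0, t1):
            """The rod at one cell, restricted to a time window."""
            return [v for t, v in self.rods[cell] if t0 <= t <= t1]

        def ice_concentration(self, cells, t0, t1, threshold):
            """Fraction of observations in the window classified as ice."""
            obs = [v for c in cells for v in self.series(c, t0, t1)]
            return sum(v >= threshold for v in obs) / len(obs)
    ```

    Because each query touches only the requested rods and time window, no full-scene data need be scanned, mirroring the server-side filtering advantage described in the abstract.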

  6. Subsequent childbirth after a previous traumatic birth.

    Science.gov (United States)

    Beck, Cheryl Tatano; Watson, Sue

    2010-01-01

    Nine percent of new mothers in the United States who participated in the Listening to Mothers II Postpartum Survey screened positive for meeting the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition criteria for posttraumatic stress disorder after childbirth. Women who have had a traumatic birth experience report fewer subsequent children and a longer length of time before their second baby. Childbirth-related posttraumatic stress disorder impacts couples' physical relationship, communication, conflict, emotions, and bonding with their children. The purpose of this study was to describe the meaning of women's experiences of a subsequent childbirth after a previous traumatic birth. Phenomenology was the research design used. An international sample of 35 women participated in this Internet study. Women were asked, "Please describe in as much detail as you can remember your subsequent pregnancy, labor, and delivery following your previous traumatic birth." Colaizzi's phenomenological data analysis approach was used to analyze the stories of the 35 women. Data analysis yielded four themes: (a) riding the turbulent wave of panic during pregnancy; (b) strategizing: attempts to reclaim their body and complete the journey to motherhood; (c) bringing reverence to the birthing process and empowering women; and (d) still elusive: the longed-for healing birth experience. Subsequent childbirth after a previous birth trauma has the potential to either heal or retraumatize women. During pregnancy, women need permission and encouragement to grieve their prior traumatic births to help remove the burden of their invisible pain.

  7. Second primary cancers in subsites of colon and rectum in patients with previous colorectal cancer

    NARCIS (Netherlands)

    Liu, L.; Lemmens, V.E.; de Hingh, I.H.J.T.; de Vries, E.; Roukema, J.A.; van Leerdam, M.E.; Coebergh, J.W.; Soerjomataram, I.

    Background: Compared with the general population, patients with a previous colorectal cancer are at higher risk for a second colorectal cancer, but detailed risk analysis by subsite is scarce. Objective: Our goal was to investigate the risk of a second cancer in relation to subsite as a basis for

  8. Sensitivity and Uncertainty Analysis for Streamflow Prediction Using Different Objective Functions and Optimization Algorithms: San Joaquin California

    Science.gov (United States)

    Paul, M.; Negahban-Azar, M.

    2017-12-01

    Hydrologic models usually need to be calibrated against observed streamflow at the outlet of a particular drainage area. However, many model parameters cannot be measured in the field and must be fitted, which makes calibration difficult when the number of potentially uncertain parameters is large. This becomes even more challenging when the model covers a large watershed with multiple land uses and varied geophysical characteristics. Sensitivity analysis (SA) can be used as a tool to identify the most sensitive model parameters, i.e., those that most affect calibrated model performance. Many calibration and uncertainty analysis algorithms are available, and each can be run with different objective functions. By incorporating sensitive parameters in streamflow simulation, the effect of a suitable algorithm on model performance can be demonstrated with Soil and Water Assessment Tool (SWAT) modeling. In this study, SWAT was applied to the San Joaquin Watershed in California, covering 19,704 km2, to calibrate daily streamflow. Water stress in this watershed has recently escalated due to intensified climate variability, prolonged drought, and groundwater depletion for agricultural irrigation. It is therefore important to perform a proper uncertainty analysis, given the uncertainties inherent in hydrologic modeling, in order to predict the spatial and temporal variation of hydrologic processes and to evaluate the impacts of different hydrologic variables. The purpose of this study was to evaluate the sensitivity and uncertainty of the calibrated parameters for predicting streamflow. To evaluate the sensitivity of the calibrated parameters, three different optimization algorithms (Sequential Uncertainty Fitting, SUFI-2; Generalized Likelihood Uncertainty Estimation, GLUE; and Parameter Solution, ParaSol) were used with four different objective functions (coefficient of determination
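
    The abstract above compares optimization algorithms run under different objective functions. As a toy illustration (not taken from the paper), two objective functions commonly used in streamflow calibration, the Nash-Sutcliffe efficiency and the coefficient of determination, can be sketched in plain Python with hypothetical flow values:

```python
# Illustrative objective functions for streamflow calibration (hypothetical
# data; the internals of SUFI-2, GLUE and ParaSol differ from this sketch).

def nse(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, <= 0 is no better than the mean."""
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - num / den

def r_squared(observed, simulated):
    """Coefficient of determination (squared Pearson correlation)."""
    n = len(observed)
    mo = sum(observed) / n
    ms = sum(simulated) / n
    cov = sum((o - mo) * (s - ms) for o, s in zip(observed, simulated))
    var_o = sum((o - mo) ** 2 for o in observed)
    var_s = sum((s - ms) ** 2 for s in simulated)
    return cov ** 2 / (var_o * var_s)

# hypothetical daily streamflow values (m3/s)
obs = [12.0, 15.0, 9.0, 20.0, 18.0]
sim = [11.0, 16.0, 10.0, 19.0, 17.0]
print(round(nse(obs, sim), 3), round(r_squared(obs, sim), 3))
```

    A calibration algorithm would adjust model parameters to maximize one of these scores against the observed series.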

  9. Measurement errors in polymerase chain reaction are a confounding factor for a correct interpretation of 5-HTTLPR polymorphism effects on lifelong premature ejaculation: a critical analysis of a previously published meta-analysis of six studies.

    Science.gov (United States)

    Janssen, Paddy K C; Olivier, Berend; Zwinderman, Aeilko H; Waldinger, Marcel D

    2014-01-01

    To analyze a recently published meta-analysis of six studies on 5-HTTLPR polymorphism and lifelong premature ejaculation (PE). Calculation of observed and expected genotype frequencies and Hardy-Weinberg equilibrium (HWE) of cases and controls. LL, SL and SS genotype frequencies of patients were subtracted from the genotype frequencies of an ideal population (LL 25%, SL 50%, SS 25%, p = 1 for HWE). Analysis of the PCRs of the six studies and re-analysis of the analysis and odds ratios (ORs) reported in the recently published meta-analysis. Three studies deviated from HWE in patients and one study deviated from HWE in controls. In the three studies in HWE, the mean deviation of genotype frequencies from a theoretical population not deviating from HWE was small: LL (1.7%), SL (-2.3%), SS (0.6%). In the three studies not in HWE, the mean deviation of genotype frequencies was high: LL (-3.3%), SL (-18.5%) and SS (21.8%), with a very low percentage of the SL genotype concurrent with a very high percentage of the SS genotype. The most serious PCR deviations were reported in the three not-in-HWE studies. The three in-HWE studies had normal ORs. In contrast, the three not-in-HWE studies had low ORs. In the three studies not in HWE and with very low ORs, inadequate PCR analysis and/or inadequate interpretation of its gel electrophoresis resulted in a very low SL frequency and a resulting shift to a very high SS genotype frequency. Consequently, the PCRs of these three studies are not reliable. Failure to note the inadequacy of PCR tests makes such PCRs a confounding factor in the clinical interpretation of genetic studies. Currently, a meta-analysis can only be performed on the three in-HWE studies. However, based on the three in-HWE studies, with ORs of about 1, there is no indication that in men with lifelong PE the frequencies of the LL, SL and SS genotypes deviate from the general male population, or that the SL or SS genotype is in any way associated with lifelong PE.
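
    The Hardy-Weinberg check the authors describe can be sketched as follows: estimate allele frequencies from the observed LL/SL/SS counts, derive the expected genotype counts, and test the deviation with a chi-square statistic (hypothetical counts; critical value 3.84 for 1 d.f. at p = 0.05):

```python
# Hardy-Weinberg equilibrium check for a biallelic (L/S) locus.
# Counts below are hypothetical, for illustration only.

def hwe_chi_square(n_ll, n_sl, n_ss):
    n = n_ll + n_sl + n_ss
    p = (2 * n_ll + n_sl) / (2 * n)      # frequency of the L allele
    q = 1.0 - p                          # frequency of the S allele
    expected = (p * p * n, 2 * p * q * n, q * q * n)
    observed = (n_ll, n_sl, n_ss)
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# An "ideal" population (LL 25%, SL 50%, SS 25%) yields a statistic of 0.
print(hwe_chi_square(25, 50, 25))            # -> 0.0
# An SL deficit with an SS excess, as described above, deviates from HWE.
print(hwe_chi_square(30, 20, 50) > 3.84)     # -> True
```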

  10. Extraction of Terraces on the Loess Plateau from High-Resolution DEMs and Imagery Utilizing Object-Based Image Analysis

    Directory of Open Access Journals (Sweden)

    Hanqing Zhao

    2017-05-01

    Full Text Available Abstract: Terraces are typical artificial landforms on the Loess Plateau, with ecological functions in water and soil conservation, agricultural production, and biodiversity. Recording the spatial distribution of terraces is the basis for monitoring their extent and understanding their ecological effects. Current terrace extraction methods rely mainly on high-resolution imagery, but their accuracy is limited because vegetation cover distorts the appearance of terraces in imagery. High-resolution topographic data reflecting the morphology of true terrace surfaces are therefore needed. Terrace extraction on the Loess Plateau is challenging because of the complex terrain and the diverse vegetation that followed the implementation of "vegetation recovery". This study presents an automatic method of extracting terraces based on 1 m resolution digital elevation models (DEMs), with 0.3 m resolution WorldView-3 imagery as auxiliary information, using object-based image analysis (OBIA). A multi-resolution segmentation method was used in which slope, positive and negative terrain index (PN), accumulative curvature slope (AC), and slope of slope (SOS) were determined as input layers for image segmentation by correlation analysis and the Sheffield entropy method. The main classification features based on DEMs were chosen from the terrain features derived from terrain factors and from texture features derived by gray-level co-occurrence matrix (GLCM) analysis; these features were then selected by importance analysis using classification and regression tree (CART) analysis. Extraction rules based on DEMs were generated from the classification features, with a total classification accuracy of 89.96%. The red and near-infrared bands of the imagery were used to exclude construction land, which is easily confused with small terraces. As a result, the total classification accuracy increased to 94%. The proposed method ensures comprehensive consideration of terrain, texture, shape, and

  11. Object-based analysis of multispectral airborne laser scanner data for land cover classification and map updating

    Science.gov (United States)

    Matikainen, Leena; Karila, Kirsi; Hyyppä, Juha; Litkey, Paula; Puttonen, Eetu; Ahokas, Eero

    2017-06-01

    During the last 20 years, airborne laser scanning (ALS), often combined with passive multispectral information from aerial images, has shown its high feasibility for automated mapping processes. The main benefits have been achieved in the mapping of elevated objects such as buildings and trees. Recently, the first multispectral airborne laser scanners have been launched, and active multispectral information is for the first time available for 3D ALS point clouds from a single sensor. This article discusses the potential of this new technology in map updating, especially in automated object-based land cover classification and change detection in a suburban area. For our study, Optech Titan multispectral ALS data over a suburban area in Finland were acquired. Results from an object-based random forests analysis suggest that the multispectral ALS data are very useful for land cover classification, considering both elevated classes and ground-level classes. The overall accuracy of the land cover classification results with six classes was 96% compared with validation points. The classes under study included building, tree, asphalt, gravel, rocky area and low vegetation. Compared to classification of single-channel data, the main improvements were achieved for ground-level classes. According to feature importance analyses, multispectral intensity features based on several channels were more useful than those based on one channel. Automatic change detection for buildings and roads was also demonstrated by utilising the new multispectral ALS data in combination with old map vectors. In change detection of buildings, an old digital surface model (DSM) based on single-channel ALS data was also used. Overall, our analyses suggest that the new data have high potential for further increasing the automation level in mapping. 
Unlike passive aerial imaging commonly used in mapping, the multispectral ALS technology is independent of external illumination conditions, and there are

  12. Mapping urban impervious surface using object-based image analysis with WorldView-3 satellite imagery

    Science.gov (United States)

    Iabchoon, Sanwit; Wongsai, Sangdao; Chankon, Kanoksuk

    2017-10-01

    Land use and land cover (LULC) data are important for monitoring and assessing environmental change. LULC classification using satellite images is a method widely used at global and local scales. In particular, urban areas, which contain various LULC types, are important components of the urban landscape and ecosystem. This study aims to classify urban LULC using WorldView-3 (WV-3) very high spatial resolution satellite imagery and the object-based image analysis method. A decision ruleset was applied to classify the WV-3 images of Kathu subdistrict, Phuket province, Thailand. The main steps were as follows: (1) the image was ortho-rectified using ground control points and a digital elevation model; (2) multiscale image segmentation was applied to group image pixels into image objects; (3) a decision ruleset for LULC classification was developed using spectral bands, spectral indices, and spatial and contextual information; and (4) accuracy was assessed using testing data selected by statistical random sampling. The results show that seven LULC classes (water, vegetation, open space, road, residential, building, and bare soil) were successfully classified, with an overall classification accuracy of 94.14% and a kappa coefficient of 92.91%.
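
    The accuracy assessment reported above (overall accuracy plus a kappa coefficient) is computed from a confusion matrix of reference versus predicted classes. A minimal sketch with hypothetical counts:

```python
# Overall accuracy and Cohen's kappa from a confusion matrix.
# The matrix below is hypothetical; rows are reference classes,
# columns are predicted classes.

def overall_accuracy(cm):
    total = sum(sum(row) for row in cm)
    correct = sum(cm[i][i] for i in range(len(cm)))
    return correct / total

def kappa(cm):
    n = sum(sum(row) for row in cm)
    po = sum(cm[i][i] for i in range(len(cm))) / n
    # chance agreement from row (reference) and column (predicted) marginals
    pe = sum(sum(cm[i]) * sum(row[i] for row in cm) for i in range(len(cm))) / (n * n)
    return (po - pe) / (1.0 - pe)

cm = [[45, 3, 2],
      [4, 38, 3],
      [1, 2, 52]]
print(round(overall_accuracy(cm), 3), round(kappa(cm), 3))
```

    Kappa discounts the agreement expected by chance, which is why it is usually a little lower than the raw overall accuracy.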

  13. An Object-Based Image Analysis Approach for Detecting Penguin Guano in very High Spatial Resolution Satellite Images

    Directory of Open Access Journals (Sweden)

    Chandi Witharana

    2016-04-01

    Full Text Available The logistical challenges of Antarctic field work and the increasing availability of very high resolution commercial imagery have driven an interest in more efficient search and classification of remotely sensed imagery. This exploratory study employed geographic object-based analysis (GEOBIA) methods to classify guano stains, indicative of chinstrap and Adélie penguin breeding areas, from very high spatial resolution (VHSR) satellite imagery, and closely examined the transferability of knowledge-based GEOBIA rules across different study sites focusing on the same semantic class. We systematically gauged the segmentation quality, classification accuracy, and reproducibility of the fuzzy rules. A master ruleset was developed based on one study site and was re-tasked "without adaptation" and "with adaptation" on candidate image scenes comprising guano stains. Our results suggest that object-based methods incorporating the spectral, textural, spatial, and contextual characteristics of guano are capable of successfully detecting guano stains. Reapplication of the master ruleset on candidate scenes without modification produced inferior classification results, while adapted rules produced comparable or superior results compared to the reference image. This work provides a road map to an operational "image-to-assessment pipeline" that will enable Antarctic wildlife researchers to seamlessly integrate VHSR imagery into on-demand penguin population censuses.

  14. Comparisons of amine solvents for post-combustion CO{sub 2} capture: A multi-objective analysis approach

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Anita S; Eslick, John C; Miller, David C; Kitchin, John R

    2013-10-01

    Amine solvents are of great interest for post-combustion CO{sub 2} capture applications. Although the development of new solvents is predominantly conducted at the laboratory scale, the ability to assess the performance of newly developed solvents at the process scale is crucial to identifying the best solvents for CO{sub 2} capture. In this work we present a methodology to evaluate and objectively compare the process performance of different solvents. We use Aspen Plus, with the electrolyte-NRTL thermodynamic model for the solvent-CO{sub 2} interactions, coupled with multi-objective genetic algorithm optimization to determine the best process design and operating conditions for each solvent. This ensures that the processes used for the comparison are those best suited to each specific solvent. We evaluate and compare the process performance of monoethanolamine (MEA), diethanolamine (DEA), and 2-amino-2-methyl-1-propanol (AMP) in a 90% CO{sub 2} capture process for a 550 MW coal-fired power plant. Our analysis shows that the best process specifications are amine-specific; with those optimized specifications, DEA has the potential to outperform MEA, with a lower energy penalty and lower capital cost investment.

  15. A Python-Based Open Source System for Geographic Object-Based Image Analysis (GEOBIA Utilizing Raster Attribute Tables

    Directory of Open Access Journals (Sweden)

    Daniel Clewley

    2014-06-01

    Full Text Available A modular system for performing Geographic Object-Based Image Analysis (GEOBIA), using entirely open source (General Public License compatible) software, is presented, based around representing objects as raster clumps and storing attributes in a raster attribute table (RAT). The system utilizes a number of libraries developed by the authors: the Remote Sensing and GIS Library (RSGISLib), the Raster I/O Simplification (RIOS) Python library, the KEA image format and the TuiView image viewer. All libraries are accessed through Python, providing a common interface on which to build processing chains. Three examples are presented to demonstrate the capabilities of the system: (1) classification of mangrove extent and change in French Guiana; (2) a generic scheme for classification under the UN-FAO land cover classification system (LCCS) and subsequent translation to habitat categories; and (3) a national-scale segmentation for Australia. The system presented provides similar functionality to existing GEOBIA packages, but is more flexible, due to its modular environment, capable of handling complex classification processes and applying them to larger datasets.

  16. Distributed and hierarchical object-based image analysis for damage assessment: a case study of 2008 Wenchuan earthquake, China

    Directory of Open Access Journals (Sweden)

    Jing Sun

    2016-11-01

    Full Text Available Object-based image analysis (OBIA) is an emerging technique for analyzing remote sensing images based on object properties including spectral, geometric, contextual and texture information. To reduce the computational cost of comprehensive OBIA and make it more feasible in disaster response, we developed a distributed and hierarchical OBIA approach for damage assessment. This study demonstrates a complete classification of Yingxiu town, heavily devastated by the 2008 Wenchuan earthquake, using QuickBird imagery. Two distinct area types, mountainous and urban, were analyzed separately. This approach does not require substantial processing power or large amounts of available memory, because the image of the large disaster-affected area was split into smaller pieces. Two or more computers could be used in parallel to process and analyze these sub-images according to different requirements. The approach is applicable to other cases, and the established ruleset can be adopted in similar study areas. More experiments will be carried out in future studies to prove its feasibility.

  17. Does birth weight influence physical activity in youth? A combined analysis of four studies using objectively measured physical activity

    DEFF Research Database (Denmark)

    Ridgway, Charlotte L; Brage, Søren; Sharp, Stephen J

    2011-01-01

    Animal models suggest growth restriction in utero leads to lower levels of motor activity. Furthermore, individuals with very low birth weight report lower levels of physical activity as adults. The aim of this study was to examine whether birth weight acts as a biological determinant of physical...... activity and sedentary time. This study uses combined analysis of three European cohorts and one from South America (n = 4,170). Birth weight was measured or parentally reported. Height and weight were measured and used to calculate Body Mass Index (BMI). PA was objectively measured using accelerometry...... for ≥3 days, ≥10 hours day. Data was standardized to allow comparisons between different monitors. Total physical activity was assessed as counts per minute (cpm), with time spent above moderate activity (MVPA) >2,000 counts and time spent sedentary (...

  18. Comparative analysis of objective techniques for criteria weighing in two MCDM methods on example of an air conditioner selection

    Directory of Open Access Journals (Sweden)

    Vujičić Momčilo D.

    2017-01-01

    Full Text Available This paper presents a comparative analysis of two different objective techniques for criteria weighting, Entropy and CRITIC, and two MCDM methods, MOORA and SAW, on the example of an air conditioner selection. We used six variants for calculating normalized performance ratings. Results showed that the choice of the best air conditioner was essentially independent of the MCDM method used, regardless of the technique applied to determine the criteria weights. The complete ranking across all combinations of methods and techniques with the different ratio calculation variants showed that the best ranked air conditioner was A7, while the worst were A5 and A9. Significant positive correlations were obtained for almost all pairs of variants in all combinations except MOORA with CRITIC, with SAW combined with Entropy having the highest correlations between variants (p < 0.01).
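
    The entropy technique mentioned above derives criteria weights directly from the dispersion of the decision matrix: criteria whose values vary more across alternatives receive larger weights. A minimal sketch with hypothetical ratings (CRITIC differs in that it also uses standard deviations and inter-criteria correlations):

```python
import math

# Entropy-based objective criteria weighting for an MCDM decision matrix.
# Rows are alternatives, columns are criteria; values are hypothetical.

def entropy_weights(matrix):
    m, n = len(matrix), len(matrix[0])
    col_sums = [sum(row[j] for row in matrix) for j in range(n)]
    # normalize each column so it behaves like a probability distribution
    p = [[row[j] / col_sums[j] for j in range(n)] for row in matrix]
    k = 1.0 / math.log(m)
    e = [-k * sum(p[i][j] * math.log(p[i][j]) for i in range(m) if p[i][j] > 0)
         for j in range(n)]
    d = [1.0 - ej for ej in e]          # degree of diversification per criterion
    return [dj / sum(d) for dj in d]    # weights sum to 1

ratings = [[7.0, 9.0, 250.0],
           [8.0, 6.0, 300.0],
           [6.0, 8.0, 280.0]]
w = entropy_weights(ratings)
print([round(x, 3) for x in w])
```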

  19. Application of Multi-Objective Optimization on the Basis of Ratio Analysis (MOORA Method for Bank Branch Location Selection

    Directory of Open Access Journals (Sweden)

    Ali Gorener

    2013-04-01

    Full Text Available The location selection problem in banking is an important issue for commercial success in a competitive environment. There is a strategic fit between the location selection decision and the overall performance of a new branch. Providing physical service in the requested location, as well as alternative distribution channels to meet profitable client needs, is a current challenge in achieving competitive advantage over rivals in the financial system. In this paper, an integrated model has been developed to support the decision of branch location selection for a new bank branch. The Analytic Hierarchy Process (AHP) technique was used to prioritize the evaluation criteria, and the multi-objective optimization on the basis of ratio analysis (MOORA) method was applied to rank the location alternatives for the bank branch.
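
    The MOORA ratio system used above can be sketched briefly: vector-normalize each criterion, then score each alternative as the weighted sum of benefit ratios minus cost ratios, ranking by score. All data below are hypothetical, and the weights would come from AHP in the paper's integrated model:

```python
import math

# MOORA ratio-system scoring (hypothetical data; weights default to equal).

def moora_scores(matrix, is_benefit, weights=None):
    m, n = len(matrix), len(matrix[0])
    weights = weights or [1.0 / n] * n
    # vector normalization: divide each column by its Euclidean norm
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(n)]
    scores = []
    for row in matrix:
        s = 0.0
        for j, x in enumerate(row):
            ratio = weights[j] * x / norms[j]
            s += ratio if is_benefit[j] else -ratio  # subtract cost criteria
        scores.append(s)
    return scores

# columns: expected profit (benefit), setup cost (cost), foot traffic (benefit)
locations = [[8.0, 120.0, 7.0],
             [6.0, 90.0, 9.0],
             [9.0, 150.0, 6.0]]
scores = moora_scores(locations, is_benefit=[True, False, True])
best = max(range(len(scores)), key=lambda i: scores[i])
print(best, [round(s, 3) for s in scores])
```

    The alternative with the highest score is the preferred branch location under this model.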

  20. The Analysis of Object-Based Change Detection in Mining Area: a Case Study with Pingshuo Coal Mine

    Science.gov (United States)

    Zhang, M.; Zhou, W.; Li, Y.

    2017-09-01

    Accurate information on mining land use and land cover change is crucial for monitoring and environmental change studies. In this paper, a RapidEye remote sensing image (2012) and a SPOT7 remote sensing image (2015) of the Pingshuo mining area were selected to monitor changes using a combination of object-based classification and the change vector analysis method. We also used R for mining land classification of the high-resolution remote sensing imagery, demonstrating the feasibility and flexibility of open source software. The results show that (1) the classification of reclaimed mining land achieved high precision: the overall accuracy and kappa coefficient for the classification of the change region map were 86.67% and 89.44%, respectively. Object-based classification and change vector analysis, which can substantially improve monitoring accuracy, can clearly be used to monitor mining land, especially reclaimed mining land. (2) The vegetation share of the total area decreased from 46% to 40% between 2012 and 2015, with most of the loss converted to arable land; the sum of arable land and vegetation area increased from 51% to 70%. Meanwhile, built-up land increased to a certain degree and part of the water area was converted to arable land, although neither change was pronounced. The results illustrate the transformation of the reclaimed mining area; at the same time, some land is still being converted to mining land, which shows the mine is still operating and that mining land use and land cover remain a dynamic process.
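
    The change vector analysis referenced above compares each location's feature values between two dates; the Euclidean length of the difference vector gives the change magnitude, which is thresholded to flag change. A toy sketch (band values and threshold are hypothetical):

```python
import math

# Toy change vector analysis (CVA): per-pixel change magnitude between
# two dates, thresholded to flag changed pixels. All values hypothetical.

def change_magnitude(pixel_t1, pixel_t2):
    return math.sqrt(sum((b2 - b1) ** 2 for b1, b2 in zip(pixel_t1, pixel_t2)))

def detect_change(image_t1, image_t2, threshold):
    return [change_magnitude(p1, p2) > threshold
            for p1, p2 in zip(image_t1, image_t2)]

# two bands per pixel (e.g. red and NIR reflectance)
t1 = [(0.10, 0.40), (0.20, 0.35), (0.15, 0.50)]
t2 = [(0.11, 0.41), (0.45, 0.10), (0.16, 0.49)]
print(detect_change(t1, t2, threshold=0.1))  # -> [False, True, False]
```

    In an object-based workflow, the same magnitude computation would be applied to segment-level mean values rather than individual pixels.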

  1. Elegant objects

    CERN Document Server

    Bugayenko, Yegor

    2017-01-01

    There are 23 practical recommendations for object-oriented programmers. Most of them are completely against everything you've read in other books. For example, static methods, NULL references, getters, setters, and mutable classes are called evil. Compound variable names, validators, private static literals, configurable objects, inheritance, annotations, MVC, dependency injection containers, reflection, ORM and even algorithms are our enemies.

  2. Objective lens

    Science.gov (United States)

    Olczak, Eugene G. (Inventor)

    2011-01-01

    An objective lens and a method for using same. The objective lens has a first end, a second end, and a plurality of optical elements. The optical elements are positioned between the first end and the second end and are at least substantially symmetric about a plane centered between the first end and the second end.

  3. Copula Regression Analysis of Simultaneously Recorded Frontal Eye Field and Inferotemporal Spiking Activity during Object-Based Working Memory

    Science.gov (United States)

    Hu, Meng; Clark, Kelsey L.; Gong, Xiajing; Noudoost, Behrad; Li, Mingyao; Moore, Tirin

    2015-01-01

    Inferotemporal (IT) neurons are known to exhibit persistent, stimulus-selective activity during the delay period of object-based working memory tasks. Frontal eye field (FEF) neurons show robust, spatially selective delay period activity during memory-guided saccade tasks. We present a copula regression paradigm to examine neural interaction of these two types of signals between areas IT and FEF of the monkey during a working memory task. This paradigm is based on copula models that can account for both marginal distribution over spiking activity of individual neurons within each area and joint distribution over ensemble activity of neurons between areas. Considering the popular GLMs as marginal models, we developed a general and flexible likelihood framework that uses the copula to integrate separate GLMs into a joint regression analysis. Such joint analysis essentially leads to a multivariate analog of the marginal GLM theory and hence efficient model estimation. In addition, we show that Granger causality between spike trains can be readily assessed via the likelihood ratio statistic. The performance of this method is validated by extensive simulations, and compared favorably to the widely used GLMs. When applied to spiking activity of simultaneously recorded FEF and IT neurons during working memory task, we observed significant Granger causality influence from FEF to IT, but not in the opposite direction, suggesting the role of the FEF in the selection and retention of visual information during working memory. The copula model has the potential to provide unique neurophysiological insights about network properties of the brain. PMID:26063909

  4. Seismic-zonation of Port-au-Prince using pixel- and object-based imaging analysis methods on ASTER GDEM

    Science.gov (United States)

    Yong, A.; Hough, S.E.; Cox, B.R.; Rathje, E.M.; Bachhuber, J.; Dulberg, R.; Hulslander, D.; Christiansen, L.; Abrams, M.J.

    2011-01-01

    We report on a preliminary study evaluating the use of semi-automated imaging analysis of a remotely sensed DEM and field geophysical measurements to develop a seismic-zonation map of Port-au-Prince, Haiti. For in situ data, Vs30 values are derived from the MASW technique deployed in and around the city. For satellite imagery, we use an ASTER GDEM of Hispaniola. We apply both pixel- and object-based imaging methods to the ASTER GDEM to explore local topography (absolute elevation values) and classify terrain types such as mountains, alluvial fans and basins/near-shore regions. We assign NEHRP seismic site class ranges based on available Vs30 values. A comparison of results from the imagery-based methods with results from traditional geologic-based approaches reveals good overall correspondence. We conclude that image analysis of remotely sensed data provides reliable first-order site characterization results in the absence of local data and can be useful for refining detailed site maps with sparse local data. © 2011 American Society for Photogrammetry and Remote Sensing.
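
    The NEHRP site-class assignment described above maps a measured Vs30 value (time-averaged shear-wave velocity in the upper 30 m, in m/s) to a letter class using standard velocity boundaries. A minimal sketch (the paper assigns class *ranges* per terrain type; this shows only the point lookup):

```python
# Map a Vs30 value (m/s) to its NEHRP site class using the standard boundaries.

def nehrp_site_class(vs30):
    if vs30 > 1500:
        return "A"   # hard rock
    if vs30 > 760:
        return "B"   # rock
    if vs30 > 360:
        return "C"   # very dense soil / soft rock
    if vs30 > 180:
        return "D"   # stiff soil
    return "E"       # soft soil

print([nehrp_site_class(v) for v in (2000, 900, 500, 250, 150)])
```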

  6. High-resolution tree canopy mapping for New York City using LIDAR and object-based image analysis

    Science.gov (United States)

    MacFaden, Sean W.; O'Neil-Dunne, Jarlath P. M.; Royar, Anna R.; Lu, Jacqueline W. T.; Rundle, Andrew G.

    2012-01-01

    Urban tree canopy is widely believed to have myriad environmental, social, and human-health benefits, but a lack