WorldWideScience

Sample records for method results revealed

  1. Comparative analyses reveal discrepancies among results of commonly used methods for Anopheles gambiaemolecular form identification

    Directory of Open Access Journals (Sweden)

    Pinto João

    2011-08-01

    Full Text Available Abstract Background Anopheles gambiae M and S molecular forms, the major malaria vectors in the Afro-tropical region, are undergoing a process of ecological diversification and adaptive lineage splitting, which is affecting malaria transmission and vector control strategies in West Africa. These two incipient species are defined on the basis of single nucleotide differences in the IGS and ITS regions of multicopy rDNA located on the X-chromosome. A number of PCR and PCR-RFLP approaches based on form-specific SNPs in the IGS region are used for M and S identification. Moreover, a PCR method to detect the M-specific insertion of a short interspersed transposable element (SINE200) has recently been introduced as an alternative identification approach. However, a large-scale comparative analysis of four widely used PCR or PCR-RFLP genotyping methods for M and S identification had never been carried out to evaluate whether they can be used interchangeably, as commonly assumed. Results The genotyping of more than 400 A. gambiae specimens from nine African countries, and the sequencing of the IGS amplicon of 115 of them, highlighted discrepancies among results obtained by the different approaches due to different kinds of biases, which may result in an overestimation of M/S putative hybrids, as follows: (i) incorrect matching of the M- and S-specific primers used in the allele-specific PCR approach; (ii) presence of polymorphisms in the recognition sequence of the restriction enzymes used in the PCR-RFLP approaches; (iii) incomplete cleavage during the restriction reactions; (iv) presence of different copy numbers of M- and S-specific IGS arrays in single individuals in areas of secondary contact between the two forms. Conclusions The results reveal that the PCR and PCR-RFLP approaches most commonly utilized to identify A. gambiae M and S forms are not fully interchangeable as usually assumed, and highlight limits of the actual definition of the two molecular forms, which might

  2. A norm knockout method on indirect reciprocity to reveal indispensable norms

    Science.gov (United States)

    Yamamoto, Hitoshi; Okada, Isamu; Uchida, Satoshi; Sasaki, Tatsuya

    2017-03-01

    Although various norms for reciprocity-based cooperation have been suggested that are evolutionarily stable against invasion from free riders, the process of alternation of norms and the role of diversified norms remain unclear in the evolution of cooperation. We clarify the co-evolutionary dynamics of norms and cooperation in indirect reciprocity and also identify the norms indispensable for the evolution of cooperation. Inspired by the gene knockout method, a genetic engineering technique, we developed the norm knockout method and clarified the norms necessary for the establishment of cooperation. The results of numerical investigations revealed that the majority of norms gradually transitioned to tolerant norms after defectors were eliminated by strict norms. Furthermore, no cooperation emerges when specific norms that are intolerant of defectors are knocked out.

  3. Comparative study of methods on outlying data detection in experimental results

    International Nuclear Information System (INIS)

    Oliveira, P.M.S.; Munita, C.S.; Hazenfratz, R.

    2009-01-01

    The interpretation of experimental results through multivariate statistical methods might reveal the existence of outliers, which is rarely taken into account by analysts. However, their presence can influence the interpretation of the results, generating false conclusions. This paper shows the importance of outlier detection for a database of 89 samples of ceramic fragments analyzed by neutron activation analysis. The results were submitted to five procedures to detect outliers: Mahalanobis distance, cluster analysis, principal component analysis, factor analysis, and standardized residuals. The results showed that although cluster analysis is one of the procedures most used to identify outliers, it can fail by not flagging samples that are easily identified as outliers by the other methods. In general, the statistical procedures for the identification of outliers are little known by analysts. (author)
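    As a minimal illustration of the first procedure listed above, the sketch below flags multivariate outliers by their Mahalanobis distance from the sample centroid, compared against a chi-square cutoff. The element concentrations and the 97.5% threshold are hypothetical placeholders, not values from the study.

```python
import numpy as np
from scipy import stats

# Hypothetical elemental concentrations for ceramic fragments
# (rows = samples, columns = elements); not data from the study.
rng = np.random.default_rng(0)
X = rng.normal(loc=[50.0, 8.0, 120.0], scale=[5.0, 1.0, 15.0], size=(89, 3))
X[3] += [30.0, 5.0, -60.0]  # plant one artificial outlier

mean = X.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
diff = X - mean
# Squared Mahalanobis distance of each sample from the centroid
d2 = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)

# Compare against a chi-square quantile with df = number of variables
cutoff = stats.chi2.ppf(0.975, df=X.shape[1])
outliers = np.where(d2 > cutoff)[0]
print("Flagged samples:", outliers)
```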

  4. Stepwise multiphoton activation fluorescence reveals a new method of melanin detection

    Science.gov (United States)

    Lai, Zhenhua; Kerimo, Josef; Mega, Yair; DiMarzio, Charles A.

    2013-06-01

    The stepwise multiphoton activated fluorescence (SMPAF) of melanin, activated by a continuous-wave near-infrared (NIR) laser, reveals a broad spectrum extending from the visible range to the NIR and has potential application as a low-cost, reliable method of detecting melanin. SMPAF images of melanin in mouse hair and skin are compared with conventional multiphoton fluorescence microscopy and confocal reflectance microscopy (CRM). By combining CRM with SMPAF, we can locate melanin reliably, with the added benefit of eliminating background interference from other components inside mouse hair and skin. The melanin SMPAF signal from the mouse hair is a mixture of a two-photon process and a third-order process. The melanin SMPAF emission spectrum is activated by 1505.9-nm laser light, and the resulting spectrum has a peak at 960 nm. The discovery of the emission peak may lead to a more energy-efficient method of background-free melanin detection with less photo-bleaching.

  5. A Method to Reveal Fine-Grained and Diverse Conceptual Progressions during Learning

    Science.gov (United States)

    Lombard, François; Merminod, Marie; Widmer, Vincent; Schneider, Daniel K.

    2018-01-01

    Empirical data on learners' conceptual progression is required to design curricula and guide students. In this paper, we present the Reference Map Change Coding (RMCC) method for revealing students' progression at a fine-grained level. The method has been developed and tested through the analysis of successive versions of the productions of eight…

  6. PALEOEARTHQUAKES IN THE PRIBAIKALIE: METHODS AND RESULTS OF DATING

    Directory of Open Access Journals (Sweden)

    Oleg P. Smekalin

    2010-01-01

    Full Text Available In the Pribaikalie and adjacent territories, seismogeological studies have been underway for almost half a century and have resulted in the discovery of more than 70 dislocations of seismic or presumably seismic origin. With the commencement of paleoseismic studies, the dating of paleo-earthquakes was focused on as an indicator useful for the long-term prediction of strong earthquakes. V.P. Solonenko [Solonenko, 1977] distinguished five methods for dating paleoseismogenic deformations, i.e. geological, engineering-geological, historico-archeological, dendrochronological and radiocarbon methods. However, the ages of the majority of seismic deformations, which were studied at the initial stage of the development of seismogeology in Siberia, were defined by methods of relative or correlative age determination. Since the 1980s, studies of seismogenic deformation in the Pribaikalie have been widely conducted with trenching. Mass sampling, followed by radiocarbon analyses and the determination of absolute ages of paleo-earthquakes, provided new data on the seismic regimes of the territory and the rates of recent displacements along active faults, and enhanced the validity of methods of relative dating, in particular morphometry. The capacities of the morphometry method have significantly increased with the introduction of laser techniques in surveys and the digital processing of 3D relief models. Comprehensive seismogeological studies conducted in the Pribaikalie revealed 43 paleo-events within 16 seismogenic structures. The absolute ages of 18 paleo-events were defined by the radiocarbon age determination method. Judging by their ages, a number of dislocations were related to historical earthquakes which occurred in the 18th and 19th centuries, yet no reliable data on the epicenters of such events are available. The absolute and relative dating methods allowed us to identify sections in some paleoseismogenic structures by differences in the ages of activation and thus provided new data for

  7. Kinds of access: different methods for report reveal different kinds of metacognitive access

    Science.gov (United States)

    Overgaard, Morten; Sandberg, Kristian

    2012-01-01

    In experimental investigations of consciousness, participants are asked to reflect upon their own experiences by issuing reports about them in different ways. For this reason, a participant needs some access to the content of her own conscious experience in order to report. In such experiments, the reports typically consist of some variety of ratings of confidence or direct descriptions of one's own experiences. Whereas different methods of reporting are typically used interchangeably, recent experiments indicate that different results are obtained with different kinds of reporting. We argue that there is not only a theoretical, but also an empirical difference between different methods of reporting. We hypothesize that differences in the sensitivity of different scales may reveal that different types of access are used to issue direct reports about experiences and metacognitive reports about the classification process. PMID:22492747

  8. Revealing barriers and facilitators to use a new genetic test: comparison of three user involvement methods.

    Science.gov (United States)

    Rhebergen, Martijn D F; Visser, Maaike J; Verberk, Maarten M; Lenderink, Annet F; van Dijk, Frank J H; Kezic, Sanja; Hulshof, Carel T J

    2012-10-01

    We compared three common user involvement methods in revealing barriers and facilitators from intended users that might influence their use of a new genetic test. The study was part of the development of a new genetic test on the susceptibility to hand eczema for nurses. Eighty student nurses participated in five focus groups (n = 33), 15 interviews (n = 15) or questionnaires (n = 32). For each method, data were collected until saturation. We compared the mean number of items and relevant remarks that could influence the use of the genetic test obtained per method, divided by the number of participants in that method. Thematic content analysis was performed using MAXQDA software. The focus groups revealed 30 unique items compared to 29 in the interviews and 21 in the questionnaires. The interviews produced more items and relevant remarks per participant (1.9 and 8.4 pp) than focus groups (0.9 and 4.8 pp) or questionnaires (0.7 and 2.3 pp). All three involvement methods revealed relevant barriers and facilitators to use a new genetic test. Focus groups and interviews revealed substantially more items than questionnaires. Furthermore, this study suggests a preference for the use of interviews because the number of items per participant was higher than for focus groups and questionnaires. This conclusion may be valid for other genetic tests as well.

  9. The 'revealed preferences' theory: Assumptions and conjectures

    International Nuclear Information System (INIS)

    Green, C.H.

    1983-01-01

    As a kind of intuitive psychology, the approaches based on the 'revealed preferences' theory for determining acceptable risks are a useful method for the generation of hypotheses. In view of the fact that reliability engineering develops faster than methods for determining reliability targets, the revealed-preferences approach is a necessary preliminary aid. Some of the assumptions on which the 'revealed preferences' theory is based are identified and analysed and afterwards compared with experimentally obtained results. (orig./DG) [de

  10. Kinds of access: Different methods for report reveal different kinds of metacognitive access

    DEFF Research Database (Denmark)

    Overgaard, Morten; Sandberg, Kristian

    2012-01-01

    that there is not only a theoretical, but also an empirical difference between different methods of reporting. We hypothesize that differences in the sensitivity of different scales may reveal that different types of access are used to issue direct reports about experiences and metacognitive reports about...

  11. Field trip method as an effort to reveal student environmental literacy on biodiversity issue and context

    Science.gov (United States)

    Rijal, M.; Saefudin; Amprasto

    2018-05-01

    The field trip method, through the investigation of local biodiversity cases, can give students educational experiences. This learning activity was an effort to reveal students' environmental literacy on biodiversity. The aims of the study were (1) to describe the activities through which students obtained information about the biodiversity issue and its context during the field trip, (2) to describe the students' findings during the field trip, and (3) to reveal students' environmental literacy based on a pre-test and post-test. The research used a weak-experimental method and involved 34 senior high school students in Bandung, Indonesia. The research instruments for collecting data were an environmental literacy test, observation sheets and student questionnaires. The data were analysed with quantitative descriptive methods. The results show that more than 79% of the students gave a positive view of each field trip activity, i.e. student activity during the work (97%-100%); student activity while gathering information (79%-100%); student activity while exchanging information with friends (82%-100%); and student interest in biodiversity after the field trip activity (85%-100%). Students gained knowledge about the diversity of vertebrate animals and their characteristics, the status and condition of the animals, and the sources of the animals in the cases of animal diversity. The students' environmental literacy tended to be at a moderate level based on the test, while the average scores for attitudes and action were greater than those for the knowledge and cognitive skill components.

  12. Revealed Preference Methods for Studying Bicycle Route Choice—A Systematic Review

    Directory of Open Access Journals (Sweden)

    Ray Pritchard

    2018-03-01

    Full Text Available One fundamental aspect of promoting utilitarian bicycle use involves making modifications to the built environment to improve the safety, efficiency and enjoyability of cycling. Revealed preference data on bicycle route choice can assist greatly in understanding the actual behaviour of a highly heterogeneous group of users, which in turn assists the prioritisation of infrastructure or other built environment initiatives. This systematic review seeks to compare the relative strengths and weaknesses of the empirical approaches for evaluating whole-journey route choices of bicyclists. Two electronic databases were systematically searched for a selection of keywords pertaining to bicycle and route choice. In total, seven families of methods are identified: GPS devices, smartphone applications, crowdsourcing, participant-recalled routes, accompanied journeys, egocentric cameras and virtual reality. The study illustrates a trade-off between the quality of data obtainable and the average number of participants. Future additional methods could include dockless bikeshare, multiple-camera solutions using computer vision and immersive bicycle simulator environments.

  13. RESULTS OF ANALYSIS OF BENCHMARKING METHODS OF INNOVATION SYSTEMS ASSESSMENT IN ACCORDANCE WITH AIMS OF SUSTAINABLE DEVELOPMENT OF SOCIETY

    Directory of Open Access Journals (Sweden)

    A. Vylegzhanina

    2016-01-01

    Full Text Available In this work, we introduce the results of a comparative analysis of international innovation-system rating indexes with respect to their compliance with the purposes of sustainable development. The purpose of this research is to define requirements for benchmarking methods that assess national or regional innovation systems and to compare them, based on the assumption that an innovation system should be aligned with the sustainable development concept. An analysis of the goal sets and concepts which underlie the observed international composite innovation indexes, and a comparison of their metrics and calculation techniques, allowed us to reveal the opportunities and limitations of using these methods within the frame of the sustainable development concept. We formulated targets of innovation development on the basis of the innovation priorities of sustainable socio-economic development. Using a comparative analysis of the indexes against these targets, we identified two methods of assessing innovation systems that are maximally connected with the goals of sustainable development. Nevertheless, today no benchmarking method meets the need of assessing innovation systems in compliance with the sustainable development concept to a sufficient extent. We suggest practical directions for developing methods that assess innovation systems in compliance with the goals of societal sustainable development.

  14. Antarctic Temperature Extremes from MODIS Land Surface Temperatures: New Processing Methods Reveal Data Quality Puzzles

    Science.gov (United States)

    Grant, G.; Gallaher, D. W.

    2017-12-01

    New methods for processing massive remotely sensed datasets are used to evaluate Antarctic land surface temperature (LST) extremes. Data from the MODIS/Terra sensor (Collection 6) provides a twice-daily look at Antarctic LSTs over a 17 year period, at a higher spatiotemporal resolution than past studies. Using a data condensation process that creates databases of anomalous values, our processes create statistical images of Antarctic LSTs. In general, the results find few significant trends in extremes; however, they do reveal a puzzling picture of inconsistent cloud detection and possible systemic errors, perhaps due to viewing geometry. Cloud discrimination shows a distinct jump in clear-sky detections starting in 2011, and LSTs around the South Pole exhibit a circular cooling pattern, which may also be related to cloud contamination. Possible root causes are discussed. Ongoing investigations seek to determine whether the results are a natural phenomenon or, as seems likely, the results of sensor degradation or processing artefacts. If the unusual LST patterns or cloud detection discontinuities are natural, they point to new, interesting processes on the Antarctic continent. If the data artefacts are artificial, MODIS LST users should be alerted to the potential issues.

  15. Sequencing of the Chlamydophila psittaci ompA Gene Reveals a New Genotype, E/B, and the Need for a Rapid Discriminatory Genotyping Method

    Science.gov (United States)

    Geens, Tom; Desplanques, Ann; Van Loock, Marnix; Bönner, Brigitte M.; Kaleta, Erhard F.; Magnino, Simone; Andersen, Arthur A.; Everett, Karin D. E.; Vanrompay, Daisy

    2005-01-01

    Twenty-one avian Chlamydophila psittaci isolates from different European countries were characterized using ompA restriction fragment length polymorphism, ompA sequencing, and major outer membrane protein serotyping. Results reveal the presence of a new genotype, E/B, in several European countries and stress the need for a discriminatory rapid genotyping method. PMID:15872282

  16. Application of Semiempirical Methods to Transition Metal Complexes: Fast Results but Hard-to-Predict Accuracy.

    KAUST Repository

    Minenkov, Yury

    2018-05-22

    A series of semiempirical PM6* and PM7 methods has been tested in reproducing the relative conformational energies of 27 realistic-size complexes of 16 different transition metals (TMs). An analysis of relative energies derived from single-point energy evaluations on density functional theory (DFT) optimized conformers revealed pronounced deviations between the semiempirical and DFT methods, indicating a fundamental difference in their potential energy surfaces (PES). To identify the origin of the deviation, we compared fully optimized PM7 and respective DFT conformers. For many complexes, the differences in PM7 and DFT conformational energies were confirmed, often manifesting themselves in false coordination of some atoms (H, O) to the TMs and in chemical transformations/distortions of the coordination center geometry in the PM7 structures. Although geometry optimization with a fixed coordination center geometry leads to some improvement in conformational energies, the resulting accuracy is still too low to recommend the explored semiempirical methods for out-of-the-box conformational search/sampling: careful testing is always needed.
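    A hedged sketch of the kind of comparison described above: given total energies for the same set of conformers from two methods, compute the energies relative to each method's minimum and summarize the deviation. The numbers below are made-up placeholders, not values from the paper.

```python
import numpy as np

# Hypothetical total energies (kcal/mol) of the same five conformers
# from a semiempirical method and from DFT; not data from the study.
e_pm7 = np.array([0.0, 2.1, 5.7, 1.3, 9.8])
e_dft = np.array([0.0, 3.4, 4.9, 0.2, 7.5])

# Relative conformational energies with respect to each method's minimum
rel_pm7 = e_pm7 - e_pm7.min()
rel_dft = e_dft - e_dft.min()

mad = np.mean(np.abs(rel_pm7 - rel_dft))         # mean absolute deviation
same_min = np.argmin(e_pm7) == np.argmin(e_dft)  # do both agree on the lowest conformer?
print(f"MAD of relative energies: {mad:.2f} kcal/mol; same minimum: {same_min}")
```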

  17. Visual Display of Scientific Studies, Methods, and Results

    Science.gov (United States)

    Saltus, R. W.; Fedi, M.

    2015-12-01

    The need for efficient and effective communication of scientific ideas becomes more urgent each year. A growing number of societal and economic issues are tied to matters of science - e.g., climate change, natural resource availability, and public health. Societal and political debate should be grounded in a general understanding of scientific work in relevant fields. It is difficult for many participants in these debates to access science directly because the formal method for scientific documentation and dissemination is the journal paper, generally written for a highly technical and specialized audience. Journal papers are very effective and important for the documentation of scientific results and are essential to the requirements of science to produce citable and repeatable results. However, journal papers are not effective at providing a quick and intuitive summary useful for public debate. Just as quantitative data are generally best viewed in graphic form, we propose that scientific studies also can benefit from visual summary and display. We explore the use of existing methods for diagramming logical connections and dependencies, such as Venn diagrams, mind maps, flow charts, etc., for rapidly and intuitively communicating the methods and results of scientific studies. We also discuss a method specifically tailored to summarizing scientific papers, which we introduced last year at AGU. Our method diagrams the relative importance and connections between data, methods/models, results/ideas, and implications/importance using a single-page format with connected elements in these four categories. Within each category (e.g., data) the spatial location of individual elements (e.g., seismic, topographic, gravity) indicates relative novelty (e.g., are these new data?) and importance (e.g., how critical are these data to the results of the paper?). The goal is to find ways to rapidly and intuitively share both the results and the process of science, both for communication

  18. The WOMBAT Attack Attribution Method: Some Results

    Science.gov (United States)

    Dacier, Marc; Pham, Van-Hau; Thonnard, Olivier

    In this paper, we present a new attack attribution method that has been developed within the WOMBAT project. We illustrate the method with some real-world results obtained when applying it to almost two years of attack traces collected by low interaction honeypots. This analytical method aims at identifying large scale attack phenomena composed of IP sources that are linked to the same root cause. All malicious sources involved in a same phenomenon constitute what we call a Misbehaving Cloud (MC). The paper offers an overview of the various steps the method goes through to identify these clouds, providing pointers to external references for more detailed information. Four instances of misbehaving clouds are then described in some more depth to demonstrate the meaningfulness of the concept.

  19. Comparison result of inversion of gravity data of a fault by particle swarm optimization and Levenberg-Marquardt methods.

    Science.gov (United States)

    Toushmalani, Reza

    2013-01-01

    The purpose of this study was to compare the performance of two methods for the gravity inversion of a fault. The first method, particle swarm optimization (PSO), is a heuristic global optimization algorithm based on swarm intelligence; it originates from research on the movement behavior of bird flocks and fish schools. The second method, the Levenberg-Marquardt algorithm (LM), is an approximation to the Newton method that is also used for training ANNs. In this paper we first discuss the gravity field of a fault, then describe the PSO and LM algorithms, and present the application of the Levenberg-Marquardt algorithm and a particle swarm algorithm to solving the inverse problem of a fault. Most importantly, the parameters for the algorithms are given for the individual tests. The inverse solution reveals that the fault model parameters agree quite well with the known results. Better agreement was found between the predicted model anomaly and the observed gravity anomaly with the PSO method than with the LM method.
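    The sketch below illustrates the two inversion strategies on a simplified forward model. It assumes a semi-infinite slab (fault) anomaly of the form g(x) = 2*G*drho*t*[pi/2 + arctan((x - x0)/z)] and fits synthetic data once with scipy's Levenberg-Marquardt least-squares solver and once with a bare-bones PSO loop; the model, parameter values and swarm settings are illustrative assumptions, not the paper's setup.

```python
import numpy as np
from scipy.optimize import least_squares

G = 6.674e-11  # gravitational constant, SI units

def fault_anomaly(x, drho, t, z, x0):
    """Simplified gravity anomaly of a semi-infinite slab offset at x0 (assumed model)."""
    return 2.0 * G * drho * t * (np.pi / 2.0 + np.arctan((x - x0) / z))

# Synthetic "observed" data from a known model plus noise (illustrative only)
x = np.linspace(-2000.0, 2000.0, 81)
true_p = (400.0, 100.0, 500.0, 150.0)  # drho [kg/m^3], t [m], z [m], x0 [m]
rng = np.random.default_rng(1)
g_obs = fault_anomaly(x, *true_p) + rng.normal(0.0, 2e-7, x.size)

def residuals(p):
    return fault_anomaly(x, *p) - g_obs

# 1) Local, derivative-based inversion: Levenberg-Marquardt
lm = least_squares(residuals, x0=[300.0, 80.0, 400.0, 0.0], method="lm")

# 2) Global, heuristic inversion: a minimal particle swarm
def misfit(p):
    return np.sum(residuals(p) ** 2)

n_particles, n_iter = 30, 200
lo = np.array([100.0, 10.0, 100.0, -1000.0])
hi = np.array([800.0, 300.0, 1000.0, 1000.0])
pos = rng.uniform(lo, hi, (n_particles, 4))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([misfit(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, 4))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    vals = np.array([misfit(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("LM estimate :", lm.x)
print("PSO estimate:", gbest)
```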

  20. Pathway-based outlier method reveals heterogeneous genomic structure of autism in blood transcriptome.

    Science.gov (United States)

    Campbell, Malcolm G; Kohane, Isaac S; Kong, Sek Won

    2013-09-24

    Decades of research strongly suggest that the genetic etiology of autism spectrum disorders (ASDs) is heterogeneous. However, most published studies focus on group differences between cases and controls. In contrast, we hypothesized that the heterogeneity of the disorder could be characterized by identifying pathways for which individuals are outliers rather than pathways representative of shared group differences of the ASD diagnosis. Two previously published blood gene expression data sets--the Translational Genetics Research Institute (TGen) dataset (70 cases and 60 unrelated controls) and the Simons Simplex Consortium (Simons) dataset (221 probands and 191 unaffected family members)--were analyzed. All individuals of each dataset were projected to biological pathways, and each sample's Mahalanobis distance from a pooled centroid was calculated to compare the number of case and control outliers for each pathway. Analysis of a set of blood gene expression profiles from 70 ASD and 60 unrelated controls revealed three pathways whose outliers were significantly overrepresented in the ASD cases: neuron development including axonogenesis and neurite development (29% of ASD, 3% of control), nitric oxide signaling (29%, 3%), and skeletal development (27%, 3%). Overall, 50% of cases and 8% of controls were outliers in one of these three pathways, which could not be identified using group comparison or gene-level outlier methods. In an independently collected data set consisting of 221 ASD and 191 unaffected family members, outliers in the neurogenesis pathway were heavily biased towards cases (20.8% of ASD, 12.0% of control). Interestingly, neurogenesis outliers were more common among unaffected family members (Simons) than unrelated controls (TGen), but the statistical significance of this effect was marginal (Chi squared P < 0.09). Unlike group difference approaches, our analysis identified the samples within the case and control groups that manifested each expression

  1. Application of Semiempirical Methods to Transition Metal Complexes: Fast Results but Hard-to-Predict Accuracy.

    KAUST Repository

    Minenkov, Yury; Sharapa, Dmitry I.; Cavallo, Luigi

    2018-01-01

    -point energy evaluations on density functional theory (DFT) optimized conformers revealed pronounced deviations between semiempirical and DFT methods indicating fundamental difference in potential energy surfaces (PES). To identify the origin of the deviation

  2. The estimation of the measurement results with using statistical methods

    International Nuclear Information System (INIS)

    Velychko, O. (State Enterprise Ukrmetrteststandard, 4, Metrologichna Str., 03680, Kyiv (Ukraine)); Gordiyenko, T. (State Scientific Institution UkrNDIspirtbioprod, 3, Babushkina Lane, 03190, Kyiv (Ukraine))

    2015-01-01

    A number of international standards and guides describe various statistical methods that are applied for the management, control and improvement of processes, with the purpose of analysing technical measurement results. The analysis of international standards and guides on statistical methods for the estimation of measurement results, with recommendations for their application in laboratories, is described. For this analysis of the standards and guides, cause-and-effect Ishikawa diagrams concerning the application of statistical methods for the estimation of measurement results are constructed.

  3. The estimation of the measurement results with using statistical methods

    Science.gov (United States)

    Velychko, O.; Gordiyenko, T.

    2015-02-01

    A number of international standards and guides describe various statistical methods that are applied for the management, control and improvement of processes, with the purpose of analysing technical measurement results. The analysis of international standards and guides on statistical methods for the estimation of measurement results, with recommendations for their application in laboratories, is described. For this analysis of the standards and guides, cause-and-effect Ishikawa diagrams concerning the application of statistical methods for the estimation of measurement results are constructed.

  4. Non-Destructive Evaluation Method Based On Dynamic Invariant Stress Resultants

    Directory of Open Access Journals (Sweden)

    Zhang Junchi

    2015-01-01

    Full Text Available Most vibration-based damage detection methods rely on changes in frequencies, mode shapes, mode shape curvature, and flexibilities. These methods are limited and typically can only detect the presence and location of damage; current methods can seldom identify the exact severity of damage to structures. This paper presents research on the development of a new non-destructive evaluation method to identify the existence, location, and severity of damage in structural systems. The method utilizes the concept of invariant stress resultants (ISR). The basic concept of ISR is that at any given cross section the resultant internal force distribution in a structural member is not affected by the inflicted damage. The method utilizes dynamic analysis of the structure to simulate direct measurements of acceleration, velocity and displacement simultaneously. The proposed dynamic ISR method is developed and utilized to detect damage through the corresponding changes in mass, damping and stiffness. The objectives of this research are to develop the basic theory of the dynamic ISR method, apply it to specific types of structures, and verify the accuracy of the developed theory. Numerical results demonstrating the application of the method reflect its advanced sensitivity and accuracy in characterizing multiple damage locations.

  5. Comparison of multiple-criteria decision-making methods - results of simulation study

    Directory of Open Access Journals (Sweden)

    Michał Adamczak

    2016-12-01

    Full Text Available Background: Today, both researchers and practitioners have many methods for supporting the decision-making process. Due to the conditions in which supply chains function, the most interesting are multi-criteria methods. The use of sophisticated methods for supporting decisions requires parameterization and the execution of calculations that are often complex, so is it efficient to use sophisticated methods? Methods: The authors of the publication compared two popular multi-criteria decision-making methods: the Weighted Sum Model (WSM) and the Analytic Hierarchy Process (AHP). A simulation study recreated how these two decision-making methods function. Input data for this study were a set of criteria weights and the value of each alternative in terms of each criterion. Results: The iGrafx Process for Six Sigma simulation software recreated how both multiple-criteria decision-making methods (WSM and AHP) function. The result of the simulation was a numerical value defining the preference of each of the alternatives according to the WSM and AHP methods. The alternative producing a result of higher numerical value was considered preferred according to the selected method. In the analysis of the results, the relationship between the values of the parameters and the difference in the results presented by both methods was investigated. Statistical methods, including hypothesis testing, were used for this purpose. Conclusions: The simulation study findings prove that the results obtained with the two multiple-criteria decision-making methods are very similar. Differences occurred more frequently for lower-value parameters in the "value of each alternative" group and for higher-value parameters in the "weight of criteria" group.
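    As a rough illustration of the two methods compared above, the sketch below scores three alternatives against three criteria with a weighted sum, and derives AHP weights from a pairwise comparison matrix via its principal eigenvector. The weights, scores and comparison matrix are invented for the example, not taken from the simulation study.

```python
import numpy as np

# Hypothetical decision problem: 3 alternatives rated on 3 criteria (higher is better)
scores = np.array([[0.7, 0.5, 0.9],
                   [0.6, 0.8, 0.4],
                   [0.9, 0.3, 0.6]])

# --- Weighted Sum Model: criteria weights given directly ---
wsm_weights = np.array([0.5, 0.3, 0.2])
wsm_pref = scores @ wsm_weights

# --- AHP: weights derived from a pairwise comparison matrix of the criteria ---
pairwise = np.array([[1.0, 2.0, 3.0],
                     [1/2, 1.0, 2.0],
                     [1/3, 1/2, 1.0]])
eigvals, eigvecs = np.linalg.eig(pairwise)
principal = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
ahp_weights = principal / principal.sum()
ahp_pref = scores @ ahp_weights

print("WSM preferences:", np.round(wsm_pref, 3), "-> best:", wsm_pref.argmax())
print("AHP preferences:", np.round(ahp_pref, 3), "-> best:", ahp_pref.argmax())
```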

  6. Indirect questioning method reveals hidden support for female genital cutting in South Central Ethiopia.

    Science.gov (United States)

    Gibson, Mhairi A; Gurmu, Eshetu; Cobo, Beatriz; Rueda, María M; Scott, Isabel M

    2018-01-01

    Female genital cutting (FGC) has major implications for women's physical, sexual and psychological health, and eliminating the practice is a key target for public health policy-makers. To date, one of the main barriers to achieving this has been an inability to infer privately-held views on FGC within communities where it is prevalent. As a sensitive (and often illegal) topic, people are anticipated to hide their true support for the practice when questioned directly. Here we use an indirect questioning method (the unmatched count technique) to identify hidden support for FGC in a rural South Central Ethiopian community where the practice is common, but thought to be in decline. Employing a socio-demographic household survey of 1620 Arsi Oromo adults, which incorporated both direct and indirect response (unmatched count) techniques, we compare directly-stated versus privately-held views in support of FGC, and individual variation in responses by age, gender, education and target female (daughters versus daughters-in-law). Both genders express low support for FGC when questioned directly, while indirect methods reveal substantially higher acceptance (of cutting both daughters and daughters-in-law). Educated adults (those who have attended school) are privately more supportive of the practice than they are prepared to admit openly to an interviewer, indicating that education may heighten secrecy rather than decrease support for FGC. Older individuals hold the strongest views in favour of FGC (particularly educated older males), but they are also more inclined to conceal their support for FGC when questioned directly. As these elders represent the most influential members of society, their hidden support for FGC may constitute a pivotal barrier to eliminating the practice in this community. Our results demonstrate the great potential of indirect questioning methods to advance knowledge and inform policy on culturally-sensitive topics like FGC; providing more
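    The unmatched count (list experiment) technique used above can be summarized in a few lines: the control group counts how many of a list of non-sensitive items apply to them, the treatment group receives the same list plus the sensitive item, and the difference in mean counts estimates the prevalence of the sensitive attitude. The counts below are simulated placeholders, not the Ethiopian survey data.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated item counts: control sees 4 neutral items, treatment sees 4 + the sensitive item
control = rng.binomial(4, 0.5, size=800)                     # neutral items only
hidden_support = rng.random(820) < 0.45                      # assumed true prevalence of 45%
treatment = rng.binomial(4, 0.5, size=820) + hidden_support  # plus the sensitive item

# Difference-in-means estimator of the prevalence of the sensitive attitude
estimate = treatment.mean() - control.mean()
se = np.sqrt(treatment.var(ddof=1) / treatment.size + control.var(ddof=1) / control.size)
print(f"Estimated prevalence: {estimate:.2f} (SE {se:.2f})")
```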

  7. Results of clinical approbation of new local treatment method in the complex therapy of inflammatory parodontium diseases

    Directory of Open Access Journals (Sweden)

    Yu. G. Romanova

    2017-08-01

    Full Text Available The treatment and prevention of inflammatory diseases of the parodontium are among the most difficult problems in stomatology today. Purpose of the research: to estimate the clinical efficiency of the combined local application of the developed oral-care agent apigel and low-frequency electromagnetic field magnetotherapy in the treatment of inflammatory diseases of the parodontium. Materials and methods: 46 patients with chronic generalized catarrhal gingivitis and chronic generalized periodontitis of the 1st degree were included in the study. Patients were divided into 2 groups depending on treatment management: basic (n = 23) and control (n = 23). Conventional treatment with the local use of a dental gel with camomile was applied in the control group. Patients of the basic group were treated with the combined local application of apigel and magnetotherapy. Efficiency was estimated with clinical, laboratory, microbiological and functional (ultrasonic Doppler examination) methods. Results: The application of apigel and a pulsating electromagnetic field in the complex treatment of patients with chronic generalized periodontitis caused positive changes in the clinical symptoms and the condition of the parodontal tissues, which was accompanied by a decline of the hygienic and parodontal indexes. Compared with patients who received traditional anti-inflammatory therapy, patients treated with the local application of apigel and magnetotherapy showed a lower incidence of edema. It was revealed that the decrease in pain correlated with the improvement of the hygienic condition of the oral cavity and promoted the prevention of bacterial contamination of damaged mucous membranes. Estimation of the microvascular blood flow by ultrasonic Doppler flowmetry revealed a more rapid normalization of the volumetric and linear systolic blood flow velocities in the parodontal tissues when the new complex local method was used. Conclusions: Effect of the developed local agent in patients

  8. Doppler method leak detection for LMFBR steam generators. Pt. 1. Experimental results of bubble detection using small models

    International Nuclear Information System (INIS)

    Kumagai, Hiromichi

    1999-01-01

    To prevent the expansion of tube damage and to maintain structural integrity in the steam generators (SGs) of fast breeder reactors (FBRs), it is necessary to detect precisely and immediately any leakage of water from the heat transfer tubes. Therefore, an active acoustic method was developed. Previous studies have revealed that in practical steam generators the active acoustic method can detect bubbles of 10 l/s within 10 seconds. To prevent the expansion of damage to neighboring tubes, it is necessary to detect smaller leakages of water from the heat transfer tubes. The Doppler method is designed to detect small leakages and to find the source of the leak before damage spreads to neighboring tubes. To evaluate the relationship between the detection sensitivity of the Doppler method and the bubble volume and bubble size, the structural shapes and bubble flow conditions were investigated experimentally using a small structural model. The results show that the Doppler method can detect the bubbles under bubble flow conditions, and that it is sensitive enough to detect small leakages within a short time. The Doppler method thus has strong potential for the detection of water leakage in SGs. (author)

  9. Methods and results of diuresis renography in infants and children. Methodik und Ergebnisse der Diurese-Nephrographie im Kindesalter

    Energy Technology Data Exchange (ETDEWEB)

    Kleinhans, E. (Klinik fuer Nuklearmedizin, RWTH Aachen (Germany)); Rohrmann, D. (Urologische Klinik, RWTH Aachen (Germany)); Stollbrink, C. (Paediatrische Klinik, RWTH Aachen (Germany)); Mertens, R. (Paediatrische Klinik, RWTH Aachen (Germany)); Jakse, G. (Urologische Klinik, RWTH Aachen (Germany)); Buell, U. (Klinik fuer Nuklearmedizin, RWTH Aachen (Germany))

    1994-02-01

    In infants and children with hydronephrosis, the decision-making process for distinguishing those instances of urinary tract dilatation that require surgical correction from those that do not is based in part on the findings of diuresis renography. Quantitative analysis of the renogram curve pattern is a well-established tool which, in addition, provides comparable results in follow-up studies. However, standardization of the method, including data analysis, does not yet exist. In this study, three parameters obtained by mathematical curve analysis were examined: the clearance half-time of the diuretic response, the clearance within 5 minutes and the clearance within 16 minutes. As a result, the 16-minute clearance revealed superior results in discriminating obstructive impairments of urine drainage from non-obstructive ones. Compared to the clearance half-time, the markedly shorter duration of the examination (16 minutes) is an additional benefit. (orig.)

  10. Life cycle analysis of electricity systems: Methods and results

    International Nuclear Information System (INIS)

    Friedrich, R.; Marheineke, T.

    1996-01-01

    The two methods for full energy chain analysis, process analysis and input/output analysis, are discussed. A combination of these two methods provides the most accurate results. Such a hybrid analysis of the full energy chains of six different power plants is presented and discussed. The results of such analyses depend on the time, site and technique of each process step and therefore have no general validity. For renewable energy systems, the emissions from the generation of a back-up system should be added. (author). 7 figs, 1 fig

  11. A revised method of presenting wavenumber-frequency power spectrum diagrams that reveals the asymmetric nature of tropical large-scale waves

    Energy Technology Data Exchange (ETDEWEB)

    Chao, Winston C. [NASA/Goddard Space Flight Center, Global Modeling and Assimilation Office, Mail Code 610.1, Greenbelt, MD (United States); Yang, Bo; Fu, Xiouhua [University of Hawaii at Manoa, School of Ocean and Earth Science and Technology, International Pacific Research Center, Honolulu, HI (United States)

    2009-11-15

    The popular method of presenting wavenumber-frequency power spectrum diagrams for studying tropical large-scale waves in the literature is shown to give an incomplete presentation of these waves. The so-called "convectively coupled Kelvin (mixed Rossby-gravity) waves" are presented as existing only in the symmetric (anti-symmetric) component of the diagrams. This is obviously not consistent with the published composite/regression studies of "convectively coupled Kelvin waves," which illustrate the asymmetric nature of these waves. The cause of this inconsistency is revealed in this note and a revised method of presenting the power spectrum diagrams is proposed. When this revised method is used, "convectively coupled Kelvin waves" do show anti-symmetric components, and "convectively coupled mixed Rossby-gravity waves (also known as Yanai waves)" do show a hint of symmetric components. These results bolster a published proposal that these waves should be called "chimeric Kelvin waves," "chimeric mixed Rossby-gravity waves," etc. This revised method of presenting power spectrum diagrams offers an additional means of comparing the GCM output with observations by calling attention to the capability of GCMs to correctly simulate the asymmetric characteristics of equatorial waves. (orig.)
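    A minimal sketch of the kind of diagram discussed above: a latitude-longitude-time field is split into components symmetric and antisymmetric about the equator, and each component is Fourier-transformed in longitude and time to build a wavenumber-frequency power spectrum. The synthetic field and grid sizes are placeholders; real analyses add windowing, background removal and latitude averaging choices.

```python
import numpy as np

# Synthetic brightness-temperature-like field: (time, lat, lon), equator at the middle latitude index
nt, nlat, nlon = 256, 17, 144
rng = np.random.default_rng(0)
field = rng.standard_normal((nt, nlat, nlon))

# Split into components symmetric/antisymmetric about the equator
flipped = field[:, ::-1, :]
sym = 0.5 * (field + flipped)
antisym = 0.5 * (field - flipped)

def wk_power(component):
    """Wavenumber-frequency power, averaged over latitude."""
    # FFT over time (axis 0) and longitude (axis 2), then average |.|^2 over latitude
    spec = np.fft.fft(np.fft.fft(component, axis=0), axis=2)
    return (np.abs(spec) ** 2).mean(axis=1)

power_sym = wk_power(sym)
power_antisym = wk_power(antisym)
print(power_sym.shape, power_antisym.shape)  # (frequency, zonal wavenumber) bins
```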

  12. EQUITY SHARES EQUATING THE RESULTS OF FCFF AND FCFE METHODS

    Directory of Open Access Journals (Sweden)

    Bartłomiej Cegłowski

    2012-06-01

    Full Text Available The aim of the article is to present a method of establishing the equity shares in the weighted average cost of capital (WACC), in which the value of loan capital results from the fixed assumptions accepted in the financial plan (for example, a schedule of loan repayment) and the equity is evaluated by means of a discount method. The described method ensures that, regardless of whether cash flows are calculated as FCFF or FCFE, the result of the company valuation will be identical.
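    The circularity the article addresses (equity weights in the WACC depend on the equity value, which depends on the WACC) can be resolved iteratively. The sketch below discounts FCFF at a WACC whose weights are updated from the implied equity value until convergence, and compares the implied equity with a direct FCFE valuation at the cost of equity. All cash flows, rates and the debt value are invented placeholders, and exact equality of the two approaches holds only when the cash flows and financing assumptions are internally consistent.

```python
# Hypothetical inputs (not from the article)
fcff = [120.0, 130.0, 140.0]    # free cash flow to the firm, per year
fcfe = [90.0, 100.0, 112.0]     # free cash flow to equity, per year
debt = 300.0                    # value of loan capital fixed by the financial plan
ke, kd, tax = 0.12, 0.06, 0.19  # cost of equity, cost of debt, tax rate

def present_value(flows, rate):
    return sum(cf / (1.0 + rate) ** (t + 1) for t, cf in enumerate(flows))

# Direct equity value: discount FCFE at the cost of equity
equity_fcfe = present_value(fcfe, ke)

# Iterative FCFF valuation: the equity weight in the WACC depends on the equity value itself
equity = equity_fcfe  # any positive starting guess works
for _ in range(100):
    v = equity + debt
    wacc = (equity / v) * ke + (debt / v) * kd * (1.0 - tax)
    new_equity = present_value(fcff, wacc) - debt
    if abs(new_equity - equity) < 1e-9:
        break
    equity = new_equity

print(f"Equity via FCFE: {equity_fcfe:.2f}")
print(f"Equity via FCFF + iterated WACC: {equity:.2f}")
```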

  13. Steady-state transport equation resolution by particle methods, and numerical results

    International Nuclear Information System (INIS)

    Mercier, B.

    1985-10-01

    A method to solve the steady-state transport equation is given. The principles of the method are presented. The method is studied in two different cases; estimates given by the theory are compared to numerical results. Results obtained in 1-D (spherical geometry) and in 2-D (axisymmetric geometry) are given [fr

  14. A Fuzzy Logic Based Method for Analysing Test Results

    Directory of Open Access Journals (Sweden)

    Le Xuan Vinh

    2017-11-01

    Full Text Available Network operators must perform many tasks to ensure the smooth operation of the network, such as planning, monitoring, etc. Among those tasks, the regular testing of network performance, network errors and troubleshooting is very important. Meaningful test results allow the operators to evaluate network performance, identify any shortcomings and better plan for network upgrades. Due to the diverse and mainly unquantifiable nature of network testing results, there is a need to develop a method for systematically and rigorously analysing these results. In this paper, we present STAM (System Test-result Analysis Method), which employs a bottom-up hierarchical processing approach using fuzzy logic. STAM is capable of combining all test results into a quantitative description of the network performance in terms of network stability, the significance of various network errors and the performance of each function block within the network. The validity of this method has been successfully demonstrated in assisting the testing of a VoIP system at the Research Institute of Post and Telecoms in Vietnam. The paper is organized as follows. The first section gives an overview of fuzzy logic theory, the concepts of which will be used in the development of STAM. The next section describes STAM. The last section, demonstrating STAM's capability, presents a success story in which STAM is successfully applied.
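    STAM itself is not spelled out in the abstract, so the sketch below only illustrates the general idea of a bottom-up fuzzy aggregation: raw test measurements are mapped to membership degrees in poor/acceptable/good sets via triangular membership functions, and the degrees are combined upward into a single quality score. The membership breakpoints, weights and metric names are invented for the example.

```python
def triangular(x, a, b, c):
    """Triangular membership function with feet at a and c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzify(value):
    """Map a normalized metric in [0, 1] (0 = best) to degrees of (poor, acceptable, good)."""
    return (triangular(value, 0.4, 1.0, 1.6),   # poor
            triangular(value, 0.1, 0.5, 0.9),   # acceptable
            triangular(value, -0.6, 0.0, 0.6))  # good

# Hypothetical normalized test results (0 = best, 1 = worst)
metrics = {"packet_loss": 0.15, "jitter": 0.35, "call_setup_failures": 0.05}
weights = {"packet_loss": 0.4, "jitter": 0.3, "call_setup_failures": 0.3}

# Bottom-up aggregation: weighted defuzzified score per metric, then overall
score = 0.0
for name, value in metrics.items():
    poor, acceptable, good = fuzzify(value)
    total = poor + acceptable + good
    # centroid-style defuzzification onto a 0..1 quality scale
    metric_score = (0.0 * poor + 0.5 * acceptable + 1.0 * good) / total
    score += weights[name] * metric_score

print(f"Overall network quality score: {score:.2f}")
```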

  15. Relationships Between Results Of An Internal And External Match Load Determining Method In Male, Singles Badminton Players.

    Science.gov (United States)

    Abdullahi, Yahaya; Coetzee, Ben; Van den Berg, Linda

    2017-07-03

    The purpose of the study was to determine the relationships between the results of internal and external match load determining methods. Twenty-one players who participated in selected badminton championships during the 2014/2015 season served as subjects. The heart rate (HR) values and GPS data of each player were obtained via a fixed Polar HR transmitter belt and a MinimaxX GPS device. Moderate significant Spearman's rank correlations were found between HR and absolute duration (r = 0.43 at a low intensity (LI) and 0.44 at a high intensity (HI)), distance covered (r = 0.42 at a HI) and player load (PL) (r = 0.44 at a HI). Results also revealed an opposite trend for external and internal measures of load, as the average relative HR value was the highest for the HI zone (54.1%), whereas the average values of the relative measures of external load (1.29-9.89%) were the lowest for the HI zone. In conclusion, our findings show that the results of internal and external badminton match load determining methods are more related to each other in the HI zone than in other zones, and that the strength of the relationships depends on the duration of the activities performed, especially in the LI and HI zones. Overall, the trivial to moderate relationships between the results of an internal and external match load determining method in male singles badminton players reaffirm the conclusions of others that these constructs measure distinctly different demands and should therefore be measured concurrently to fully understand the true requirements of badminton match play.

  16. Comparison of Results according to the treatment Method in Maxillary Sinus Carcinoma

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Woong Ki; Jo, Jae Sik; Ahn, Sung Ja; Nam, Taek Keun; Nah, Byung Sik [Chonnam National University College of Medicine, Kwangju (Korea, Republic of); Park, Seung Jin [Gyeongsang National Univ., Jinju (Korea, Republic of)

    1995-03-15

    Purpose: A retrospective analysis was performed to investigate the proper management of maxillary sinus carcinoma. Materials and Methods: The authors analysed 33 patients with squamous cell carcinoma of the maxillary sinus treated at Chonnam University Hospital from January 1986 to December 1992. There were 24 men and 9 women with a median age of 55 years. According to the 1988 AJCC TNM system, one patient with T2, 10 patients with T3 and 22 patients with T4 disease were available, respectively. Cervical lymph node metastases were observed in 5 patients (N1: 4/33, N2b: 1/33). Patients were classified into 3 groups according to the management method. The first group, named 'FAR' (16 patients), consisted of preoperative intra-arterial chemotherapy with 5-fluorouracil (5-FU; mean total dosage 3078 mg) through the superficial temporal artery with concurrent radiation (mean dose delivered 3433 cGy, daily 180-200 cGy) and vitamin A (50,000 IU daily), followed by total maxillectomy and postoperative radiation therapy (mean dose 2351 cGy). The second group, named 'SR' (7 patients), consisted of total maxillectomy followed by postoperative radiation therapy (mean dose 5920 cGy). The third group, named 'R' (6 patients), was treated with radiation alone (mean dose 7164 cGy). The Kaplan-Meier product limit method was used for survival analysis and the Mantel-Cox test was performed to assess the significance of survival differences between two groups. Results: The local recurrence-free survival rate at the end of 2 years was 100%, 5-% and 0% in the FAR, SR and R groups, respectively. The disease-free survival rate at 2 years was 88.9%, 40% and 50% in the FAR, SR and R groups, respectively. There were statistically significant differences between the FAR and SR groups and between the FAR and R groups in their local recurrence-free, disease-free and overall survival rates, but the difference in each survival rate between the SR and R groups was not significant. Conclusion: In this study the FAR group revealed better results than the SR or R group. In the

  17. Comparison of Results according to the treatment Method in Maxillary Sinus Carcinoma

    International Nuclear Information System (INIS)

    Chung, Woong Ki; Jo, Jae Sik; Ahn, Sung Ja; Nam, Taek Keun; Nah, Byung Sik; Park, Seung Jin

    1995-01-01

    Purpose: A retrospective analysis was performed to investigate the proper management of maxillary sinus carcinoma. Materials and Methods: The authors analysed 33 patients with squamous cell carcinoma of the maxillary sinus treated at Chonnam University Hospital from January 1986 to December 1992. There were 24 men and 9 women with a median age of 55 years. According to the 1988 AJCC TNM system, one patient with T2, 10 patients with T3 and 22 patients with T4 disease were available, respectively. Cervical lymph node metastases were observed in 5 patients (N1: 4/33, N2b: 1/33). Patients were classified into 3 groups according to the management method. The first group, named 'FAR' (16 patients), consisted of preoperative intra-arterial chemotherapy with 5-fluorouracil (5-FU; mean total dosage 3078 mg) through the superficial temporal artery with concurrent radiation (mean dose delivered 3433 cGy, daily 180-200 cGy) and vitamin A (50,000 IU daily), followed by total maxillectomy and postoperative radiation therapy (mean dose 2351 cGy). The second group, named 'SR' (7 patients), consisted of total maxillectomy followed by postoperative radiation therapy (mean dose 5920 cGy). The third group, named 'R' (6 patients), was treated with radiation alone (mean dose 7164 cGy). The Kaplan-Meier product limit method was used for survival analysis and the Mantel-Cox test was performed to assess the significance of survival differences between two groups. Results: The local recurrence-free survival rate at the end of 2 years was 100%, 5-% and 0% in the FAR, SR and R groups, respectively. The disease-free survival rate at 2 years was 88.9%, 40% and 50% in the FAR, SR and R groups, respectively. There were statistically significant differences between the FAR and SR groups and between the FAR and R groups in their local recurrence-free, disease-free and overall survival rates, but the difference in each survival rate between the SR and R groups was not significant. Conclusion: In this study the FAR group revealed better results than the SR or R group. In the future prospective randomized
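    The survival comparison described in the two records above relies on Kaplan-Meier product-limit estimates and a Mantel-Cox (log-rank) test. A minimal sketch with the lifelines package is shown below; the follow-up times and event indicators are fabricated for illustration and are not the study's patient data.

```python
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Fabricated follow-up times (months) and event indicators (1 = recurrence/death, 0 = censored)
far_t = [24, 36, 48, 60, 30, 55, 62, 40]
far_e = [0, 0, 1, 0, 0, 0, 0, 1]
sr_t = [10, 14, 22, 30, 18, 26, 34]
sr_e = [1, 1, 1, 0, 1, 0, 1]

# Kaplan-Meier product-limit estimate for one group
kmf = KaplanMeierFitter()
kmf.fit(far_t, event_observed=far_e, label="FAR")
print(kmf.survival_function_.tail())

# Mantel-Cox (log-rank) test between the two treatment groups
result = logrank_test(far_t, sr_t, event_observed_A=far_e, event_observed_B=sr_e)
print(f"log-rank p-value: {result.p_value:.3f}")
```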

  18. A Rapid Colorimetric Method Reveals Fraudulent Substitutions in Sea Urchin Roe Marketed in Sardinia (Italy).

    Science.gov (United States)

    Meloni, Domenico; Spina, Antonio; Satta, Gianluca; Chessa, Vittorio

    2016-06-25

    In recent years, besides the consumption of fresh sea urchin specimens, the demand for minimally-processed roe has grown considerably. This product has made frequent consumption in restaurants possible, and frauds are becoming widespread, with the partial replacement of sea urchin roe by surrogates that are similar in colour. One of the main factors that determines the quality of the roe is its colour, and small differences in the colour scale cannot be easily discerned by consumers. In this study we applied a rapid colorimetric method to reveal the fraudulent partial substitution of semi-solid sea urchin roe with liquid egg yolk. Objective assessment of whiteness (L*), redness (a*), yellowness (b*), hue (h*) and chroma (C*) was carried out with a digital spectrophotometer using the CIE L*a*b* colour measurement system. The colorimetric method highlighted statistically significant differences between sea urchin roe and liquid egg yolk that could be easily discerned quantitatively.
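    In the CIE L*a*b* system used above, the overall colour difference between two samples is commonly summarized by the Euclidean distance Delta E*ab = sqrt(dL*^2 + da*^2 + db*^2) (the CIE76 formula). The sketch below computes it for two made-up readings; the coordinate values are placeholders, not measurements from the study.

```python
import math

def delta_e_ab(lab1, lab2):
    """CIE76 colour difference between two (L*, a*, b*) readings."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

# Hypothetical spectrophotometer readings (L*, a*, b*)
sea_urchin_roe = (52.3, 18.7, 35.1)
liquid_egg_yolk = (61.8, 10.2, 48.6)

dE = delta_e_ab(sea_urchin_roe, liquid_egg_yolk)
print(f"Delta E*ab = {dE:.1f}")  # larger values indicate a more easily perceived difference
```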

  19. A result-driven minimum blocking method for PageRank parallel computing

    Science.gov (United States)

    Tao, Wan; Liu, Tao; Yu, Wei; Huang, Gan

    2017-01-01

    Matrix blocking is a common method for improving the computational efficiency of PageRank, but the blocking rules are hard to determine and the subsequent calculation is complicated. To tackle these problems, we propose a minimum blocking method driven by result needs to accomplish a parallel implementation of the PageRank algorithm. The minimum blocking stores only the elements that are necessary for the result matrix. In return, the subsequent calculation becomes simple and the consumption of I/O transmission is cut down. We ran experiments on several matrices of different data sizes and sparsity degrees. The results show that the proposed method has better computational efficiency than traditional blocking methods.
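    For context, the sketch below is a plain sparse-matrix PageRank power iteration, i.e. the computation whose blocking the paper optimizes; it does not implement the paper's result-driven minimum blocking. The tiny link graph and damping factor are illustrative.

```python
import numpy as np
from scipy.sparse import csr_matrix

# Tiny illustrative link graph: adjacency[i, j] = 1 means page i links to page j
adjacency = csr_matrix(np.array([[0, 1, 1, 0],
                                 [0, 0, 1, 0],
                                 [1, 0, 0, 1],
                                 [0, 0, 1, 0]], dtype=float))

n = adjacency.shape[0]
out_degree = np.asarray(adjacency.sum(axis=1)).ravel()
# Column-stochastic transition matrix (no dangling nodes in this toy graph)
transition = adjacency.multiply(1.0 / out_degree[:, None]).T.tocsr()

damping = 0.85
rank = np.full(n, 1.0 / n)
for _ in range(100):
    new_rank = damping * transition.dot(rank) + (1.0 - damping) / n
    if np.abs(new_rank - rank).sum() < 1e-12:
        break
    rank = new_rank

print(np.round(rank, 4))
```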

  20. The anchors of steel wire ropes, testing methods and their results

    Directory of Open Access Journals (Sweden)

    J. Krešák

    2012-10-01

    Full Text Available The present paper introduces an application of the acoustic and thermographic methods in the defectoscopic testing of immobile steel wire ropes at the most critical point, the anchor. The first measurements and the results obtained by these new defectoscopic methods are shown. In defectoscopic tests at the anchor, the widely used magnetic method gives unreliable results and therefore presents a problem for steel wire defectoscopy. The application of the two new methods in steel wire defectoscopy at the anchor point will enable increased safety measures at the anchors of steel wire ropes in bridge, roof, tower and aerial cable lift constructions.

  1. Mechanics of Nanostructures: Methods and Results

    Science.gov (United States)

    Ruoff, Rod

    2003-03-01

    We continue to develop and use new tools to measure the mechanics and electromechanics of nanostructures. Here we discuss: (a) methods for making nanoclamps and the resulting nanoclamp geometry, chemical composition, type of chemical bonding, and nanoclamp strength (effectiveness as a nanoclamp for the mechanics measurements to be made); (b) the mechanics of carbon nanocoils. We have received carbon nanocoils from colleagues in Japan [1], measured their spring constants, and have observed extensions exceeding 100% relative to the unloaded length, using our scanning electron microscope nanomanipulator tool; (c) several new devices that are essentially MEMS-based, which allow for improved measurements of the mechanics of pseudo-1D and planar nanostructures. [1] Zhang M., Nakayama Y., Pan L., Japanese J. Appl. Phys. 39, L1242-L1244 (2000).

  2. Android Emotions Revealed

    DEFF Research Database (Denmark)

    Vlachos, Evgenios; Schärfe, Henrik

    2012-01-01

    This work presents a method for designing facial interfaces for sociable android robots with respect to the fundamental rules of human affect expression. Extending the work of Paul Ekman towards a robotic direction, we follow the judgment-based approach for evaluating facial expressions to test...... findings are based on the results derived from a number of judgments, and suggest that before programming the facial expressions of a Geminoid, the Original should pass through the proposed procedure. According to our recommendations, the facial expressions of an android should be tested by judges, even...... in which case an android robot like the Geminoid|DK (a duplicate of an Original person) reveals emotions convincingly; when following an empirical perspective, or when following a theoretical one. The methodology includes the processes of acquiring the empirical data, and gathering feedback on them. Our...

  3. Psychophysical "blinding" methods reveal a functional hierarchy of unconscious visual processing.

    Science.gov (United States)

    Breitmeyer, Bruno G

    2015-09-01

    Numerous non-invasive experimental "blinding" methods exist for suppressing the phenomenal awareness of visual stimuli. Not all of these suppressive methods occur at, and thus index, the same level of unconscious visual processing. This suggests that a functional hierarchy of unconscious visual processing can in principle be established. The empirical results of extant studies that have used a number of different methods and additional reasonable theoretical considerations suggest the following tentative hierarchy. At the highest levels in this hierarchy is unconscious processing indexed by object-substitution masking. The functional levels indexed by crowding, the attentional blink (and other attentional blinding methods), backward pattern masking, metacontrast masking, continuous flash suppression, sandwich masking, and single-flash interocular suppression fall at progressively lower levels, while unconscious processing at the lowest levels is indexed by eye-based binocular-rivalry suppression. Although the unconscious processing levels indexed by additional blinding methods are yet to be determined, a tentative placement at lower levels in the hierarchy is also given for unconscious processing indexed by Troxler fading and adaptation-induced blindness, and at higher levels in the hierarchy for that indexed by attentional blinding effects in addition to the level indexed by the attentional blink. The full mapping of levels in the functional hierarchy onto cortical activation sites and levels is yet to be determined. The existence of such a hierarchy bears importantly on the search for, and the distinctions between, neural correlates of conscious and unconscious vision. Copyright © 2015 Elsevier Inc. All rights reserved.

  4. The Use of Data Mining Methods to Predict the Result of Infertility Treatment Using the IVF ET Method

    Directory of Open Access Journals (Sweden)

    Malinowski Paweł

    2014-12-01

    Full Text Available The IVF ET method is a scientifically recognized infertility treatment method. The problem, however, is this method's unsatisfactory efficiency. This calls for a more thorough analysis of the information available in the treatment process, in order to detect the factors that have an effect on the results, as well as to effectively predict the result of treatment. Classical statistical methods have proven to be inadequate in this issue. Only the use of modern methods of data mining gives hope for a more effective analysis of the collected data. This work provides an overview of the new methods used for the analysis of data on infertility treatment, and formulates a proposal for further directions for research into improving the prediction of the result of the treatment process.

  5. Two different hematocrit detection methods: Different methods, different results?

    Directory of Open Access Journals (Sweden)

    Schuepbach Reto A

    2010-03-01

    Full Text Available Abstract Background Little is known about the influence of hematocrit detection methodology on transfusion triggers. Therefore, the aim of the present study was to compare two different hematocrit-assessing methods. In a total of 50 critically ill patients, hematocrit was analyzed using (1) a blood gas analyzer (ABLflex 800) and (2) the central laboratory method (ADVIA® 2120), and the results were compared. Findings Bland-Altman analysis for repeated measurements showed a good correlation with a bias of +1.39% and 2 SD of ± 3.12%. The 24%-hematocrit group showed a correlation of r2 = 0.87. With a kappa of 0.56, 22.7% of the cases would have been transfused differently. In the 28%-hematocrit group, with a similar correlation (r2 = 0.8) and a kappa of 0.58, 21% of the cases would have been transfused differently. Conclusions Despite a good agreement between the two methods used to determine hematocrit in clinical routine, the calculated difference of 1.4% might substantially influence transfusion triggers depending on the employed method.
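    For readers who want to reproduce this kind of agreement analysis, a minimal sketch of a Bland-Altman computation and Cohen's kappa for a transfusion-trigger decision is given below; the data values and the 24% trigger used here are invented for illustration, not taken from the study.

```python
# Minimal Bland-Altman and Cohen's kappa sketch with fabricated example values;
# not the study's data.
import numpy as np

def bland_altman(a, b):
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 2 * sd, bias + 2 * sd)    # bias and ~95% limits

def cohens_kappa(x, y):
    x, y = np.asarray(x), np.asarray(y)
    po = (x == y).mean()                           # observed agreement
    pe = sum((x == k).mean() * (y == k).mean() for k in np.unique(np.r_[x, y]))
    return (po - pe) / (1 - pe)

rng = np.random.default_rng(0)
hct_bga = rng.normal(27, 4, 50)                    # blood gas analyzer values (%)
hct_lab = hct_bga - 1.4 + rng.normal(0, 1.5, 50)   # central laboratory values (%)

bias, limits = bland_altman(hct_bga, hct_lab)
print(f"bias = {bias:+.2f}%, limits of agreement = {limits}")

trigger = 24.0                                     # assumed transfusion trigger
print("kappa =", cohens_kappa(hct_bga < trigger, hct_lab < trigger))
```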

  6. Aircraft Engine Gas Path Diagnostic Methods: Public Benchmarking Results

    Science.gov (United States)

    Simon, Donald L.; Borguet, Sebastien; Leonard, Olivier; Zhang, Xiaodong (Frank)

    2013-01-01

    Recent technology reviews have identified the need for objective assessments of aircraft engine health management (EHM) technologies. To help address this issue, a gas path diagnostic benchmark problem has been created and made publicly available. This software tool, referred to as the Propulsion Diagnostic Method Evaluation Strategy (ProDiMES), has been constructed based on feedback provided by the aircraft EHM community. It provides a standard benchmark problem enabling users to develop, evaluate and compare diagnostic methods. This paper will present an overview of ProDiMES along with a description of four gas path diagnostic methods developed and applied to the problem. These methods, which include analytical and empirical diagnostic techniques, will be described and associated blind-test-case metric results will be presented and compared. Lessons learned along with recommendations for improving the public benchmarking processes will also be presented and discussed.

  7. Estimating the Economic Value of Environmental Amenities of Isfahan Sofeh Highland Park (The Individual Revealed and Expressed Travel Cost Method

    Directory of Open Access Journals (Sweden)

    H. Amirnejad

    2016-03-01

    ... for the revealed and the expressed travel costs and total travel costs. Results and Discussion: The collected data show that the average age of visitors is 31 years. Most of them are young; 66% of visitors are male and the rest are female. Most of the respondents chose the spring season for visiting Sofeh Park. Results of the negative binomial regression estimation showed that age, income, distance and the revealed and total travel costs have a significant effect on the total number of visits in both scenarios. The age and income coefficients are positive; thus, these variables have a direct effect on the number of visits in both scenarios. The distance and travel cost coefficients are negative; therefore, these variables have a reverse effect on the total number of visits in both scenarios. These results confirm the law of demand, which states that the quantity demanded and the price of a commodity are inversely related. Travel cost, as the commodity price in the tourism demand function, is in the first scenario only the revealed travel cost of the trip to the location, and in the second scenario the revealed cost plus the opportunity cost of the trip to the recreational location. Consumer surplus, as the average value of environmental amenities, is calculated by 1/β, where β is the coefficient of the travel cost variable in the tourism demand functions. The average value of environmental amenities per visit in the first and second scenarios is 797 and 1145 thousand Rials, respectively. The obvious difference between recreational values in the two scenarios is due to the opportunity cost. The total recreational value of the Sofeh Highland Park equals the product of the number of annual visits and the average recreational value. Finally, the total value of annual visits to the park in the two scenarios is more than 11952 and 17174 billion Rials, respectively. Conclusion: In this study, the value of environmental amenities of the Sofeh Highland Park was estimated. Notice that ...
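    As an illustration of how such a count-data travel cost model and the consumer surplus per visit can be estimated, a hedged sketch with simulated data is shown below; the variable names, scales, and coefficient values are assumptions, not the study's dataset, and the negative binomial GLM from statsmodels stands in for the estimation procedure described in the abstract.

```python
# Sketch of an individual travel cost model: negative binomial regression of
# trip counts on travel cost and income, with consumer surplus per trip taken
# as 1/|beta_cost|. Simulated data only.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 300
travel_cost = rng.uniform(50, 1200, n)        # assumed scale (thousand Rials)
income = rng.uniform(5, 50, n)
mu = np.exp(1.5 - 0.0015 * travel_cost + 0.02 * income)
visits = rng.poisson(mu)                      # stand-in for observed trip counts

X = sm.add_constant(pd.DataFrame({"travel_cost": travel_cost, "income": income}))
model = sm.GLM(visits, X, family=sm.families.NegativeBinomial()).fit()
print(model.summary())

beta_cost = model.params["travel_cost"]
print("consumer surplus per visit ~", 1.0 / abs(beta_cost))
```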

  8. Results from the Application of Uncertainty Methods in the CSNI Uncertainty Methods Study (UMS)

    International Nuclear Information System (INIS)

    Glaeser, H.

    2008-01-01

    Within licensing procedures there is the incentive to replace the conservative requirements for code application by a 'best estimate' concept supplemented by an uncertainty analysis to account for predictive uncertainties of code results. Methods have been developed to quantify these uncertainties. The Uncertainty Methods Study (UMS) Group, following a mandate from CSNI, has compared five methods for calculating the uncertainty in the predictions of advanced 'best estimate' thermal-hydraulic codes. Most of the methods identify and combine input uncertainties. The major differences between the predictions of the methods came from the choice of uncertain parameters and the quantification of the input uncertainties, i.e. the wideness of the uncertainty ranges. Therefore, suitable experimental and analytical information has to be selected to specify these uncertainty ranges or distributions. After the closure of the Uncertainty Method Study (UMS) and after the report was issued, comparison calculations of experiment LSTF-SB-CL-18 were performed by the University of Pisa using different versions of the RELAP 5 code. It turned out that the version used by two of the participants calculated a 170 K higher peak clad temperature compared with other versions using the same input deck. This may contribute to the differences of the upper limit of the uncertainty ranges.

  9. German precursor study: methods and results

    International Nuclear Information System (INIS)

    Hoertner, H.; Frey, W.; von Linden, J.; Reichart, G.

    1985-01-01

    This study has been prepared by the GRS by contract of the Federal Minister of Interior. The purpose of the study is to show how the application of system-analytic tools and especially of probabilistic methods on the Licensee Event Reports (LERs) and on other operating experience can support a deeper understanding of the safety-related importance of the events reported in reactor operation, the identification of possible weak points, and further conclusions to be drawn from the events. Additionally, the study aimed at a comparison of its results for the severe core damage frequency with those of the German Risk Study as far as this is possible and useful. The German Precursor Study is a plant-specific study. The reference plant is Biblis NPP with its very similar Units A and B, whereby the latter was also the reference plant for the German Risk Study

  10. A Filtering Method to Reveal Crystalline Patterns from Atom Probe Microscopy Desorption Maps

    Science.gov (United States)

    2016-03-26

    A filtering method to reveal the crystallographic information present in Atom Probe Microscopy (APM) data is presented. The method filters atoms based on the time difference between their evaporation and the evaporation of the previous atom. Since this time difference correlates with the location and the local structure of ...

  11. Numerical proceessing of radioimmunoassay results using logit-log transformation method

    International Nuclear Information System (INIS)

    Textoris, R.

    1983-01-01

    The mathematical model and algorithm are described for the numerical processing of the results of a radioimmunoassay by the logit-log transformation method and by linear regression with weight factors. The limiting value of the curve for zero concentration is optimized with regard to the residual sum by iterative repetition of the linear regression. Typical examples of the approximation of calibration curves are presented. The method proved suitable for all RIA sets used hitherto and is well suited for small computers with an internal memory of at least 8 Kbyte. (author)
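    As a rough sketch of the logit-log approach described here, the code below fits logit(B/B0) against log10(concentration) and searches for the zero-concentration bound B0 that minimizes the residual sum. The unweighted least squares, the simple grid search in place of the author's iteration scheme, and the example standards are assumptions for illustration.

```python
# Logit-log calibration sketch for RIA data: fit logit(B/B0) vs log10(conc)
# and pick the B0 (zero-concentration response) that minimizes the residual sum.
# Example numbers are illustrative, not from the paper.
import numpy as np

conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])         # standards (ng/ml)
counts = np.array([9100, 8300, 6500, 4200, 2300, 1100])   # bound counts

def fit_for_b0(b0):
    y = np.clip(counts / b0, 1e-6, 1 - 1e-6)
    logit = np.log(y / (1 - y))
    x = np.log10(conc)
    slope, intercept = np.polyfit(x, logit, 1)
    resid = logit - (slope * x + intercept)
    return (resid ** 2).sum(), slope, intercept

# Crude optimization of B0 over a grid slightly above the largest count.
b0_grid = np.linspace(counts.max() * 1.01, counts.max() * 1.5, 200)
rss, slope, intercept, b0 = min(
    (fit_for_b0(b) + (b,) for b in b0_grid), key=lambda t: t[0])
print(f"B0={b0:.0f}, slope={slope:.3f}, intercept={intercept:.3f}, RSS={rss:.4f}")

def estimate_conc(count):
    """Invert the calibration curve for an unknown sample."""
    y = np.clip(count / b0, 1e-6, 1 - 1e-6)
    return 10 ** ((np.log(y / (1 - y)) - intercept) / slope)

print("unknown at 5000 counts ->", estimate_conc(5000.0), "ng/ml")
```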

  12. Evaluating rehabilitation methods - some practical results from Rum Jungle

    International Nuclear Information System (INIS)

    Ryan, P.

    1987-01-01

    Research and analysis of the following aspects of rehabilitation have been conducted at the Rum Jungle mine site over the past three years: drainage structure stability; rock batter stability; soil fauna; tree growth in compacted soils; rehabilitation costs. The results show that, for future rehabilitation projects adopting refined methods, attention to final construction detail and biospheric influences is most important. The mine site offers a unique opportunity to evaluate the success of a variety of rehabilitation methods to the benefit of the industry in Australia and overseas. It is intended that practical, economic research will continue for some considerable time

  13. Multiple predictor smoothing methods for sensitivity analysis: Example results

    International Nuclear Information System (INIS)

    Storlie, Curtis B.; Helton, Jon C.

    2008-01-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described in the first part of this presentation: (i) locally weighted regression (LOESS), (ii) additive models, (iii) projection pursuit regression, and (iv) recursive partitioning regression. In this, the second and concluding part of the presentation, the indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present
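    To make the idea concrete, a minimal, hedged sketch of a smoothing-based sensitivity measure is shown below: each input is LOESS-smoothed against the model output and the fraction of output variance explained by the smooth serves as a sensitivity index. This mirrors only the simplest, single-predictor LOESS step of the procedures described here, on a made-up test function rather than the performance assessment model.

```python
# One-predictor LOESS sensitivity sketch: variance of the smoothed response
# over total variance, per input. Illustrative only (not the full stepwise
# multi-predictor procedure of the paper).
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

rng = np.random.default_rng(2)
n = 500
x1, x2, x3 = rng.uniform(-1, 1, (3, n))
y = np.sin(np.pi * x1) + 0.3 * x2 ** 2 + 0.05 * rng.normal(size=n)  # x3 is inert

def loess_sensitivity(x, y, frac=0.3):
    smoothed = lowess(y, x, frac=frac, return_sorted=False)
    return smoothed.var() / y.var()

for name, x in [("x1", x1), ("x2", x2), ("x3", x3)]:
    print(name, round(loess_sensitivity(x, y), 3))
```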

  14. Two-step extraction method for lead isotope fractionation to reveal anthropogenic lead pollution.

    Science.gov (United States)

    Katahira, Kenshi; Moriwaki, Hiroshi; Kamura, Kazuo; Yamazaki, Hideo

    2018-05-28

    This study developed a two-step extraction method which elutes the Pb adsorbed on the surface of sediments into the first solution by aqua regia and extracts the Pb absorbed inside particles into the second solution by a mixed acid of nitric acid, hydrofluoric acid and hydrogen peroxide solution. We applied the method to sediments in an enclosed water area and found that the isotope ratios of Pb in the second solution represented those of natural origin. This advantage of the method makes it possible to distinguish Pb of natural origin from Pb of anthropogenic origin on the basis of the isotope ratios. The results showed that the method was useful for discussing the Pb sources and that the anthropogenic Pb in the sediment samples analysed was mainly derived from China because of transboundary air pollution.

  15. Barcoded pyrosequencing reveals that consumption of galactooligosaccharides results in a highly specific bifidogenic response in humans.

    Directory of Open Access Journals (Sweden)

    Lauren M G Davis

    Full Text Available Prebiotics are selectively fermented ingredients that allow specific changes in the gastrointestinal microbiota that confer health benefits to the host. However, knowledge of the effects of prebiotics on the human gut microbiota is incomplete, as most studies have relied on methods that fail to cover the breadth of the bacterial community. The goal of this research was to use high throughput multiplex community sequencing of 16S rDNA tags to gain a community wide perspective of the impact of prebiotic galactooligosaccharide (GOS) on the fecal microbiota of healthy human subjects. Fecal samples from eighteen healthy adults were previously obtained during a feeding trial in which each subject consumed a GOS-containing product for twelve weeks, with four increasing dosages (0, 2.5, 5, and 10 grams of GOS). Multiplex sequencing of the 16S rDNA tags revealed that GOS induced significant compositional alterations in the fecal microbiota, principally by increasing the abundance of organisms within the Actinobacteria. Specifically, several distinct lineages of Bifidobacterium were enriched. Consumption of GOS led to five- to ten-fold increases in bifidobacteria in half of the subjects. Increases in Firmicutes were also observed; however, these changes were detectable in only a few individuals. The enrichment of bifidobacteria was generally at the expense of one group of bacteria, the Bacteroides. The responses to GOS and the magnitude of the response varied between individuals, were reversible, and were in accordance with dosage. The bifidobacteria were the only bacteria that were consistently and significantly enriched by GOS, although this substrate supported the growth of diverse colonic bacteria in mono-culture experiments. These results suggest that GOS can be used to enrich bifidobacteria in the human gastrointestinal tract with remarkable specificity, and that the bifidogenic properties of GOS that occur in vivo are caused by selective fermentation as well as by ...

  16. Convergence results for a class of abstract continuous descent methods

    Directory of Open Access Journals (Sweden)

    Sergiu Aizicovici

    2004-03-01

    Full Text Available We study continuous descent methods for the minimization of Lipschitzian functions defined on a general Banach space. We establish convergence theorems for those methods which are generated by approximate solutions to evolution equations governed by regular vector fields. Since the complement of the set of regular vector fields is σ-porous, we conclude that our results apply to most vector fields in the sense of Baire's categories.

  17. Method of vacuum correlation functions: Results and prospects

    International Nuclear Information System (INIS)

    Badalian, A. M.; Simonov, Yu. A.; Shevchenko, V. I.

    2006-01-01

    Basic results obtained within the QCD method of vacuum correlation functions over the past 20 years in the context of investigations into strong-interaction physics at the Institute of Theoretical and Experimental Physics (ITEP, Moscow) are formulated. Emphasis is placed primarily on the prospects of the general theory developed within QCD by employing both nonperturbative and perturbative methods. On the basis of ab initio arguments, it is shown that the lowest two field correlation functions play a dominant role in QCD dynamics. A quantitative theory of confinement and deconfinement, as well as of the spectra of light and heavy quarkonia, glueballs, and hybrids, is given in terms of these two correlation functions. Perturbation theory in a nonperturbative vacuum (background perturbation theory) plays a significant role, not possessing the drawbacks of conventional perturbation theory and leading to the infrared freezing of the coupling constant αs

  18. A semantics-based method for clustering of Chinese web search results

    Science.gov (United States)

    Zhang, Hui; Wang, Deqing; Wang, Li; Bi, Zhuming; Chen, Yong

    2014-01-01

    Information explosion is a critical challenge to the development of modern information systems. In particular, when the application of an information system is over the Internet, the amount of information on the web has been increasing exponentially and rapidly. Search engines, such as Google and Baidu, are essential tools for people to find information on the Internet. Valuable information, however, is still likely to be submerged in the ocean of search results from those tools. By automatically clustering the results into different groups based on subjects, a search engine with a clustering feature allows users to select the most relevant results quickly. In this paper, we propose an online semantics-based method to cluster Chinese web search results. First, we employ the generalised suffix tree to extract the longest common substrings (LCSs) from search snippets. Second, we use HowNet to calculate the similarities of the words derived from the LCSs, and extract the most representative features by constructing the vocabulary chain. Third, we construct a vector of text features and calculate snippets' semantic similarities. Finally, we improve the Chameleon algorithm to cluster snippets. Extensive experimental results have shown that the proposed algorithm outperforms the suffix tree clustering method and other traditional clustering methods.
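    The exact pipeline (generalised suffix tree, HowNet similarities, improved Chameleon) is not reproduced here. As a simplified stand-in that follows the same overall flow of extracting snippet features, computing pairwise similarities, and clustering, one might sketch the following, using TF-IDF features and average-linkage hierarchical clustering in place of the paper's components; the sample snippets are invented.

```python
# Simplified stand-in for snippet clustering: TF-IDF features + cosine
# distances + average-linkage hierarchical clustering. Not the paper's
# suffix-tree/HowNet/Chameleon pipeline; sample snippets are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

snippets = [
    "apple releases new phone with improved camera",
    "smartphone camera comparison of the latest models",
    "apple pie recipe with cinnamon and butter",
    "easy dessert recipes for the holidays",
]

tfidf = TfidfVectorizer().fit_transform(snippets).toarray()
dist = pdist(tfidf, metric="cosine")                 # pairwise snippet distances
labels = fcluster(linkage(dist, method="average"), t=2, criterion="maxclust")

for label, text in zip(labels, snippets):
    print(label, text)
```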

  19. New method of scoliosis assessment: preliminary results using computerized photogrammetry.

    Science.gov (United States)

    Aroeira, Rozilene Maria Cota; Leal, Jefferson Soares; de Melo Pertence, Antônio Eustáquio

    2011-09-01

    A new method for nonradiographic evaluation of scoliosis was independently compared with the Cobb radiographic method for the quantification of scoliotic curvature. The aim was to develop a protocol for computerized photogrammetry, as a nonradiographic method, for the quantification of scoliosis, and to mathematically relate this proposed method to the Cobb radiographic method. Repeated exposure of children to radiation can be harmful to their health. Nevertheless, no nonradiographic method proposed until now has gained popularity as a routine method for evaluation, mainly due to low correspondence with the Cobb radiographic method. Patients undergoing standing posteroanterior full-length spine radiographs who were willing to participate in this study were submitted to dorsal digital photography in the orthostatic position with special surface markers over the spinous processes, specifically the vertebrae C7 to L5. The radiographic and photographic images were sent separately for independent analysis to two examiners, trained in quantification of scoliosis for the types of images received. The scoliosis curvature angles obtained through computerized photogrammetry (the new method) were compared to those obtained through the Cobb radiographic method. Sixteen individuals were evaluated (14 female and 2 male). All presented idiopathic scoliosis, and they were 21.4 ± 6.1 years of age, 52.9 ± 5.8 kg in weight and 1.63 ± 0.05 m in height, with a body mass index of 19.8 ± 0.2. There was no statistically significant difference between the scoliosis angle measurements obtained in the comparative analysis of both methods, and a mathematical relationship was formulated between the two methods. The preliminary results presented demonstrate equivalence between the two methods. More studies are needed to firmly assess the potential of this new method as a coadjuvant tool in the routine follow-up of scoliosis treatment.

  20. Soil Particle Size Analysis by Laser Diffractometry: Result Comparison with Pipette Method

    Science.gov (United States)

    Šinkovičová, Miroslava; Igaz, Dušan; Kondrlová, Elena; Jarošová, Miriam

    2017-10-01

    Soil texture, as a basic soil physical property, provides basic information on the soil grain size distribution as well as the representation of grain size fractions. Currently, several methods of particle size measurement are available that are based on different physical principles. The pipette method, based on the different sedimentation velocities of particles with different diameters, is considered one of the standard methods for determining the distribution of individual grain size fractions. Following technical advancement, optical methods such as laser diffraction can nowadays also be used to determine the grain size distribution in soil. According to a literature review of domestic as well as international sources related to this topic, it is obvious that the results obtained by laser diffractometry do not correspond with the results obtained by the pipette method. The main aim of this paper was to analyse 132 samples of medium fine soil, taken from the Nitra River catchment in Slovakia from depths of 15-20 cm and 40-45 cm, using the laser analysers ANALYSETTE 22 MicroTec plus (Fritsch GmbH) and Mastersizer 2000 (Malvern Instruments Ltd). The results obtained by laser diffractometry were compared with the pipette method, and regression relationships using linear, exponential, power and polynomial trends were derived. The regressions with the three highest regression coefficients (R2) were further investigated. The closest fit was observed for the polynomial regression. In view of the results obtained, we recommend using the estimate of the representation of the clay fraction when the analysis is done by laser diffractometry. The advantages of the laser diffraction method comprise the short analysis time, the small sample amount needed, its applicability to various grain size fraction and soil type classification systems, and the wide range of determined fractions. Therefore, it is necessary to focus on this issue further to address the ...

  1. Microbial Diversity of Browning Peninsula, Eastern Antarctica Revealed Using Molecular and Cultivation Methods.

    Science.gov (United States)

    Pudasaini, Sarita; Wilson, John; Ji, Mukan; van Dorst, Josie; Snape, Ian; Palmer, Anne S; Burns, Brendan P; Ferrari, Belinda C

    2017-01-01

    Browning Peninsula is an ice-free polar desert situated in the Windmill Islands, Eastern Antarctica. The entire site is described as a barren landscape, comprised of frost boils with soils dominated by microbial life. In this study, we explored the microbial diversity and edaphic drivers of community structure across this site using traditional cultivation methods, a novel approach, the soil substrate membrane system (SSMS), and culture-independent 454-tag pyrosequencing. The measured soil environmental and microphysical factors of chlorine, phosphate, aspect and elevation were found to be significant drivers of the bacterial community, while none of the soil parameters analyzed were significantly correlated to the fungal community. Overall, Browning Peninsula soil harbored a distinctive microbial community in comparison to other Antarctic soils, comprised of a unique bacterial diversity and extremely limited fungal diversity. Tag pyrosequencing data revealed the bacterial community to be dominated by Actinobacteria (36%), followed by Chloroflexi (18%), Cyanobacteria (14%), and Proteobacteria (10%). For fungi, Ascomycota (97%) dominated the soil microbiome, followed by Basidiomycota. As expected, the diversity recovered from culture-based techniques was lower than that detected using tag sequencing. However, in the SSMS enrichments, which mimic the natural conditions for cultivating oligophilic "k-selected" bacteria, a larger proportion of rare bacterial taxa (15%), such as Blastococcus, Devosia, Herbaspirillum, Propionibacterium and Methylocella, and fungal taxa (11%), such as Nigrospora, Exophiala, Hortaea, and Penidiella, were recovered at the genus level. At the phylum level, a comparison of OTUs showed that the SSMS shared 21% of Acidobacteria, 11% of Actinobacteria and 10% of Proteobacteria OTUs with soil. For fungi, the shared OTUs were 4% (Basidiomycota) and <0.5% (Ascomycota). This was the first known attempt to culture microfungi using the SSMS, which resulted in

  2. Processing method and results of meteor shower radar observations

    International Nuclear Information System (INIS)

    Belkovich, O.I.; Suleimanov, N.I.; Tokhtasjev, V.S.

    1987-01-01

    Studies of meteor showers permit the solving of some principal problems of meteor astronomy: to obtain the structure of a stream in cross section and along its orbits; to retrace the evolution of particle orbits of the stream taking into account gravitational and nongravitational forces and to discover the orbital elements of its parent body; to find out the total mass of solid particles ejected from the parent body taking into account physical and chemical evolution of meteor bodies; and to use meteor streams as natural probes for investigation of the average characteristics of the meteor complex in the solar system. A simple and effective method of determining the flux density and mass exponent parameter was worked out. This method and its results are discussed

  3. Trial sequential analysis reveals insufficient information size and potentially false positive results in many meta-analyses

    DEFF Research Database (Denmark)

    Brok, J.; Thorlund, K.; Gluud, C.

    2008-01-01

    OBJECTIVES: To evaluate meta-analyses with trial sequential analysis (TSA). TSA adjusts for random error risk and provides the required number of participants (information size) in a meta-analysis. Meta-analyses not reaching information size are analyzed with trial sequential monitoring boundaries analogous to interim monitoring boundaries in a single trial. STUDY DESIGN AND SETTING: We applied TSA on meta-analyses performed in Cochrane Neonatal reviews. We calculated information sizes and monitoring boundaries with three different anticipated intervention effects of 30% relative risk reduction (TSA ... in 80% (insufficient information size). TSA(15%) and TSA(LBHIS) found that 95% and 91% had absence of evidence. The remaining nonsignificant meta-analyses had evidence of lack of effect. CONCLUSION: TSA reveals insufficient information size and potentially false positive results in many meta-analyses.

  4. Non-invasive genetics outperforms morphological methods in faecal dietary analysis, revealing wild boar as a considerable conservation concern for ground-nesting birds.

    Science.gov (United States)

    Oja, Ragne; Soe, Egle; Valdmann, Harri; Saarma, Urmas

    2017-01-01

    Capercaillie (Tetrao urogallus) and other grouse species represent conservation concerns across Europe due to their negative abundance trends. In addition to habitat deterioration, predation is considered a major factor contributing to population declines. While the role of generalist predators on grouse predation is relatively well known, the impact of the omnivorous wild boar has remained elusive. We hypothesize that wild boar is an important predator of ground-nesting birds, but has been neglected as a bird predator because traditional morphological methods underestimate the proportion of birds in wild boar diet. To distinguish between different mammalian predator species, as well as different grouse prey species, we developed a molecular method based on the analysis of mitochondrial DNA that allows accurate species identification. We collected 109 wild boar faeces at protected capercaillie leks and surrounding areas and analysed bird consumption using genetic methods and classical morphological examination. Genetic analysis revealed that the proportion of birds in wild boar faeces was significantly higher (17.3%; 4.5×) than indicated by morphological examination (3.8%). Moreover, the genetic method allowed considerably more precise taxonomic identification of consumed birds compared to morphological analysis. Our results demonstrate: (i) the value of using genetic approaches in faecal dietary analysis due to their higher sensitivity, and (ii) that wild boar is an important predator of ground-nesting birds, deserving serious consideration in conservation planning for capercaillie and other grouse.

  5. Methods of early revealing, prognosis of further course and complications of pollinosis

    Directory of Open Access Journals (Sweden)

    Chukhrienko N.D.

    2013-10-01

    Full Text Available Under our observation there were 59 patients with pollinosis – 39 females and 20 males aged from 18 to 68 years. All patients were in the phase of disease exacerbation. The general clinical symptoms were rhinitis, conjunctivitis and bronchial spasm. The results showed that the first clinical manifestations appear in persons of young age. Half of the patients had an aggravated allergologic anamnesis. Taking into account that pollinosis is a typical representative of diseases having the mechanism of immunoglobulin E (IgE)-dependent allergic reactions of the first type, the authors studied in detail the level of IgE and its link with other factors. In practically all patients with pollinosis the level of total IgE exceeded the norm. As a result of the studies performed, it was established that a high IgE level, the presence of a phagocytosis defect and a long duration of illness are the criteria which affect disease progression, aggravate the patients' state and reduce the efficacy of treatment. Because the development of bronchial obstruction and the transformation of pollinosis into bronchial asthma is the most topical issue nowadays, the authors studied its link with other factors and findings. It was established that the risk of transformation of pollinosis into pollen bronchial asthma increases in the presence of a high level of total IgE, an aggravated allergologic anamnesis, a decrease of forced expiratory volume (FEV) and a long duration of the disease course. In the course of the investigation it was revealed that the highest efficacy of treatment is noted in patients receiving allergen-specific therapy; this confirms data in the world scientific literature. The best treatment results are observed in pollinosis patients with a family history aggravated not in the parents but in the grandparents.

  6. How the RNA isolation method can affect microRNA microarray results

    DEFF Research Database (Denmark)

    Podolska, Agnieszka; Kaczkowski, Bogumil; Litman, Thomas

    2011-01-01

    The quality of RNA is crucial in gene expression experiments. RNA degradation interferes with the measurement of gene expression, and in this context microRNA quantification can lead to an incorrect estimation. In the present study, two different RNA isolation methods were used to perform microRNA microarray analysis on porcine brain tissue. One method is a phenol-guanidine isothiocyanate-based procedure that permits isolation of total RNA. The second method, miRVana™ microRNA isolation, is column based and recovers the small RNA fraction alone. We found that microarray analyses give different results that depend on the RNA fraction used, in particular because some microRNAs appear very sensitive to the RNA isolation method. We conclude that precautions need to be taken when comparing microarray studies based on RNA isolated with different methods.

  7. Comparison of the analysis result between two laboratories using different methods

    International Nuclear Information System (INIS)

    Sri Murniasih; Agus Taftazani

    2017-01-01

    The analysis results for a volcanic ash sample were compared between two laboratories using different analysis methods. The research aims to improve testing laboratory quality and to foster cooperation with testing laboratories from other countries. Samples were tested at the Center for Accelerator of Science and Technology (CAST)-NAA laboratory using NAA, and at the University of Texas (UT), USA, using the ICP-MS and ENAA methods. Of the 12 target elements, the CAST-NAA laboratory was able to report results for 11 elements. The comparison shows that the results for the K, Mn, Ti and Fe elements from both laboratories agree very well and are close to each other, as shown by the RSD values and correlation coefficients of both laboratories' results. Examination of the differences shows that the results for the Al, Na, K, Fe, V, Mn, Ti, Cr and As elements from the two laboratories are not significantly different. Of the 11 elements reported, only Zn showed significantly different values between the two laboratories. (author)

  8. Multiband discrete ordinates method: formalism and results

    International Nuclear Information System (INIS)

    Luneville, L.

    1998-06-01

    The multigroup discrete ordinates method is a classical way to solve the (Boltzmann) transport equation for neutral particles. Self-shielding effects are not correctly treated due to large variations of cross sections within a group (in the resonance range). To treat the resonance domain, the multiband method is introduced. The main idea is to divide the cross section domain into bands. We obtain the multiband parameters using the moment method; the code CALENDF provides probability tables for these parameters. We present our implementation in an existing discrete ordinates code: SN1D. We study deep penetration benchmarks and show the improvement of the method in the treatment of self-shielding effects. (author)

  9. Testing the usability of the Rapid Impact Assessment Matrix (RIAM) method for comparison of EIA and SEA results

    International Nuclear Information System (INIS)

    Kuitunen, Markku; Jalava, Kimmo; Hirvonen, Kimmo

    2008-01-01

    This study examines how the results of Environmental Impact Assessment (EIA) and Strategic Environmental Assessment (SEA) could be compared using the Rapid Impact Assessment Matrix (RIAM) method. There are many tools and techniques that have been developed for use in impact assessment processes, including scoping, checklists, matrices, qualitative and quantitative models, literature reviews, and decision-support systems. While impact assessment processes have become more technically complicated, it is recognized that approaches including simpler applications of available tools and techniques are also appropriate. The Rapid Impact Assessment Matrix (RIAM) is a tool for organizing, analysing and presenting the results of a holistic EIA. RIAM was originally developed to compare the impact of alternative procedures in a single project. In this study, we used RIAM to compare the environmental and social impact of different projects, plans and programs realized within the same geographical area. RIAM scoring is based on five separate criteria. The RIAM criteria were applied to the impact that was considered to be the most significant in the evaluated cases, and scores were given both on environmental and social impact. Our results revealed that the RIAM method could be used for comparison and ranking of separate and distinct projects, plans, programs and policies, based on their negative or positive impact. Our data included 142 cases from the area of Central Finland that is covered by the Regional Council of Central Finland. This sample consisted of various types of projects, ranging from road construction to education programs that applied for EU funding
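    The scoring formula itself is not spelled out in this abstract. In the commonly cited RIAM formulation (treat the details below as an assumption rather than the authors' exact implementation), the two group-A criteria are multiplied, the three group-B criteria are summed, and the environmental score is the product of the two. A small, hedged sketch with invented cases:

```python
# Hedged sketch of RIAM-style scoring as commonly described in the literature:
# ES = (A1 * A2) * (B1 + B2 + B3). Criterion values and cases are invented.
def riam_score(a1, a2, b1, b2, b3):
    """a1: importance, a2: magnitude; b1-b3: permanence, reversibility, cumulativity."""
    return (a1 * a2) * (b1 + b2 + b3)

cases = {
    "road construction": dict(a1=3, a2=-2, b1=3, b2=3, b3=2),
    "education program": dict(a1=2, a2=1, b1=2, b2=2, b3=2),
}

# Rank cases from most negative to most positive impact.
for name, crit in sorted(cases.items(), key=lambda kv: riam_score(**kv[1])):
    print(f"{name}: ES = {riam_score(**crit)}")
```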

  10. Interpreting Statistical Significance Test Results: A Proposed New "What If" Method.

    Science.gov (United States)

    Kieffer, Kevin M.; Thompson, Bruce

    As the 1994 publication manual of the American Psychological Association emphasized, "p" values are affected by sample size. As a result, it can be helpful to interpret the results of statistical significance tests in a sample size context by conducting so-called "what if" analyses. However, these methods can be inaccurate…
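    To illustrate the kind of "what if" computation meant here, the sketch below re-evaluates a two-sample t test at hypothetical sample sizes while holding the observed effect size fixed. The numbers are invented, and this is only one plausible way to implement such an analysis, not necessarily the authors' proposed procedure.

```python
# "What if" sketch: keep the observed standardized effect size fixed and ask
# what the p value would have been at other sample sizes. Illustrative only.
import numpy as np
from scipy import stats

d = 0.35            # assumed observed Cohen's d (two-group comparison)
for n_per_group in (10, 20, 50, 100, 200):
    t = d * np.sqrt(n_per_group / 2.0)           # approximate t statistic
    df = 2 * n_per_group - 2
    p = 2 * stats.t.sf(abs(t), df)               # two-sided p value
    print(f"n per group = {n_per_group:4d}  ->  t = {t:5.2f}, p = {p:.4f}")
```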

  11. Comparison of two dietary assessment methods by food consumption: results of the German National Nutrition Survey II.

    Science.gov (United States)

    Eisinger-Watzl, Marianne; Straßburg, Andrea; Ramünke, Josa; Krems, Carolin; Heuer, Thorsten; Hoffmann, Ingrid

    2015-04-01

    To further characterise the performance of the diet history method and the 24-h recall method, both in an updated version, a comparison was conducted. The National Nutrition Survey II, representative for Germany, assessed food consumption with both methods. The comparison was conducted in a sample of 9,968 participants aged 14-80. Besides calculating mean differences, the statistical agreement measures encompass Spearman and intraclass correlation coefficients, ranking of participants in quartiles and the Bland-Altman method. Mean consumption of 12 out of 18 food groups was assessed higher with the diet history method. Three of these 12 food groups had a medium to large effect size (e.g., raw vegetables) and seven showed at least a small effect, while there was basically no difference for coffee/tea or ice cream. Intraclass correlations were strong only for beverages (>0.50) and revealed the least correlation for vegetables. The requirement of the diet history method to remember consumption of the past 4 weeks may be a source of inaccuracy, especially for inhomogeneous food groups. Additionally, social desirability gains significance. There is no assessment method without errors, and attention to specific food groups is a critical issue with every method. Altogether, the 24-h recall method applied in the presented study offers advantages in approximating food consumption as compared to the diet history method.

  12. A method for data handling numerical results in parallel OpenFOAM simulations

    International Nuclear Information System (INIS)

    Anton, Alin (Faculty of Automatic Control and Computing, Politehnica University of Timişoara, 2nd Vasile Pârvan Ave., 300223, TM Timişoara, Romania, alin.anton@cs.upt.ro); Muntean, Sebastian (Center for Advanced Research in Engineering Science, Romanian Academy – Timişoara Branch, 24th Mihai Viteazu Ave., 300221, TM Timişoara, Romania)

    2015-01-01

    Parallel computational fluid dynamics simulations produce vast amounts of numerical result data. This paper introduces a method for reducing the size of the data by replaying the interprocessor traffic. The results are recovered only in certain regions of interest configured by the user. A known test case is used for several mesh partitioning scenarios using the OpenFOAM toolkit® [1]. The space savings obtained with classic algorithms remain constant for more than 60 Gb of floating point data. Our method is most efficient on large simulation meshes and is much better suited for compressing large scale simulation results than the regular algorithms

  13. A method for data handling numerical results in parallel OpenFOAM simulations

    Energy Technology Data Exchange (ETDEWEB)

    Anton, Alin [Faculty of Automatic Control and Computing, Politehnica University of Timişoara, 2nd Vasile Pârvan Ave., 300223, TM Timişoara, Romania, alin.anton@cs.upt.ro (Romania); Muntean, Sebastian [Center for Advanced Research in Engineering Science, Romanian Academy – Timişoara Branch, 24th Mihai Viteazu Ave., 300221, TM Timişoara (Romania)

    2015-12-31

    Parallel computational fluid dynamics simulations produce vast amounts of numerical result data. This paper introduces a method for reducing the size of the data by replaying the interprocessor traffic. The results are recovered only in certain regions of interest configured by the user. A known test case is used for several mesh partitioning scenarios using the OpenFOAM toolkit® [1]. The space savings obtained with classic algorithms remain constant for more than 60 Gb of floating point data. Our method is most efficient on large simulation meshes and is much better suited for compressing large scale simulation results than the regular algorithms.

  14. Comparison results on preconditioned SOR-type iterative method for Z-matrices linear systems

    Science.gov (United States)

    Wang, Xue-Zhong; Huang, Ting-Zhu; Fu, Ying-Ding

    2007-09-01

    In this paper, we present some comparison theorems on preconditioned iterative methods for solving Z-matrix linear systems. The comparison results show that the rate of convergence of the Gauss-Seidel-type method is faster than the rate of convergence of the SOR-type iterative method.
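    A small numerical illustration of such a comparison (not the paper's preconditioner, just plain Gauss-Seidel and SOR iterations on an example Z-matrix) might look like this; the matrix, relaxation factors, and tolerance are arbitrary choices for the demonstration.

```python
# Compare Gauss-Seidel (SOR with omega = 1) and SOR iteration counts on a small
# Z-matrix system Ax = b. Illustrative example only.
import numpy as np

def sor(A, b, omega, tol=1e-8, max_iter=10_000):
    n = len(b)
    x = np.zeros(n)
    for k in range(1, max_iter + 1):
        for i in range(n):
            s = A[i, :i] @ x[:i] + A[i, i + 1:] @ x[i + 1:]
            x[i] = (1 - omega) * x[i] + omega * (b[i] - s) / A[i, i]
        if np.linalg.norm(A @ x - b) < tol:
            return x, k
    return x, max_iter

# Tridiagonal Z-matrix (nonpositive off-diagonal entries).
n = 50
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)

for omega in (1.0, 1.5, 1.9):   # omega = 1.0 is plain Gauss-Seidel
    _, iters = sor(A, b, omega)
    print(f"omega = {omega}: {iters} iterations")
```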

  15. Immunoglobulin G (IgG) Fab glycosylation analysis using a new mass spectrometric high-throughput profiling method reveals pregnancy-associated changes.

    Science.gov (United States)

    Bondt, Albert; Rombouts, Yoann; Selman, Maurice H J; Hensbergen, Paul J; Reiding, Karli R; Hazes, Johanna M W; Dolhain, Radboud J E M; Wuhrer, Manfred

    2014-11-01

    The N-linked glycosylation of the constant fragment (Fc) of immunoglobulin G has been shown to change during pathological and physiological events and to strongly influence antibody inflammatory properties. In contrast, little is known about Fab-linked N-glycosylation, carried by ∼ 20% of IgG. Here we present a high-throughput workflow to analyze Fab and Fc glycosylation of polyclonal IgG purified from 5 μl of serum. We were able to detect and quantify 37 different N-glycans by means of MALDI-TOF-MS analysis in reflectron positive mode using a novel linkage-specific derivatization of sialic acid. This method was applied to 174 samples of a pregnancy cohort to reveal Fab glycosylation features and their change with pregnancy. Data analysis revealed marked differences between Fab and Fc glycosylation, especially in the levels of galactosylation and sialylation, incidence of bisecting GlcNAc, and presence of high mannose structures, which were all higher in the Fab portion than the Fc, whereas Fc showed higher levels of fucosylation. Additionally, we observed several changes during pregnancy and after delivery. Fab N-glycan sialylation was increased and bisection was decreased relative to postpartum time points, and nearly complete galactosylation of Fab glycans was observed throughout. Fc glycosylation changes were similar to results described before, with increased galactosylation and sialylation and decreased bisection during pregnancy. We expect that the parallel analysis of IgG Fab and Fc, as set up in this paper, will be important for unraveling roles of these glycans in (auto)immunity, which may be mediated via recognition by human lectins or modulation of antigen binding. © 2014 by The American Society for Biochemistry and Molecular Biology, Inc.

  16. Immunoglobulin G (IgG) Fab Glycosylation Analysis Using a New Mass Spectrometric High-throughput Profiling Method Reveals Pregnancy-associated Changes*

    Science.gov (United States)

    Bondt, Albert; Rombouts, Yoann; Selman, Maurice H. J.; Hensbergen, Paul J.; Reiding, Karli R.; Hazes, Johanna M. W.; Dolhain, Radboud J. E. M.; Wuhrer, Manfred

    2014-01-01

    The N-linked glycosylation of the constant fragment (Fc) of immunoglobulin G has been shown to change during pathological and physiological events and to strongly influence antibody inflammatory properties. In contrast, little is known about Fab-linked N-glycosylation, carried by ∼20% of IgG. Here we present a high-throughput workflow to analyze Fab and Fc glycosylation of polyclonal IgG purified from 5 μl of serum. We were able to detect and quantify 37 different N-glycans by means of MALDI-TOF-MS analysis in reflectron positive mode using a novel linkage-specific derivatization of sialic acid. This method was applied to 174 samples of a pregnancy cohort to reveal Fab glycosylation features and their change with pregnancy. Data analysis revealed marked differences between Fab and Fc glycosylation, especially in the levels of galactosylation and sialylation, incidence of bisecting GlcNAc, and presence of high mannose structures, which were all higher in the Fab portion than the Fc, whereas Fc showed higher levels of fucosylation. Additionally, we observed several changes during pregnancy and after delivery. Fab N-glycan sialylation was increased and bisection was decreased relative to postpartum time points, and nearly complete galactosylation of Fab glycans was observed throughout. Fc glycosylation changes were similar to results described before, with increased galactosylation and sialylation and decreased bisection during pregnancy. We expect that the parallel analysis of IgG Fab and Fc, as set up in this paper, will be important for unraveling roles of these glycans in (auto)immunity, which may be mediated via recognition by human lectins or modulation of antigen binding. PMID:25004930

  17. Radioimmunological determination of plasma progesterone. Methods - Results - Indications

    International Nuclear Information System (INIS)

    Gonon-Estrangin, Chantal.

    1978-10-01

    The aim of this work is to describe the radioimmunological determination of plasma progesterone carried out at the Hormonology Laboratory of the Grenoble University Hospital Centre (Professor E. Chambaz), to compare our results with those of the literature and to present the main clinical indications of this analysis. The measurement method has proved reproducible, specific (the steroid purification stage is unnecessary) and sensitive (detection: 10 picograms of progesterone per tube). In seven normally menstruating women our results agree with published values (in nanograms per millilitre, ng/ml): 0.07 ng/ml to 0.9 ng/ml in the follicular phase, from the start of menstruation until ovulation, then a rapid increase at ovulation with a maximum in the middle of the luteal phase (our values for this maximum range from 7.9 ng/ml to 21.7 ng/ml) and a gradual drop in progesterone secretion until the next menstrual period. In gynecology the radioimmunoassay of plasma progesterone is valuable for diagnostic and therapeutic purposes: - to diagnose the absence of a corpus luteum, - to judge the effectiveness of an ovulation induction treatment [fr

  18. Quantifying the measurement uncertainty of results from environmental analytical methods.

    Science.gov (United States)

    Moser, J; Wegscheider, W; Sperka-Gottlieb, C

    2001-07-01

    The Eurachem-CITAC Guide Quantifying Uncertainty in Analytical Measurement was put into practice in a public laboratory devoted to environmental analytical measurements. In doing so due regard was given to the provisions of ISO 17025 and an attempt was made to base the entire estimation of measurement uncertainty on available data from the literature or from previously performed validation studies. Most environmental analytical procedures laid down in national or international standards are the result of cooperative efforts and put into effect as part of a compromise between all parties involved, public and private, that also encompasses environmental standards and statutory limits. Central to many procedures is the focus on the measurement of environmental effects rather than on individual chemical species. In this situation it is particularly important to understand the measurement process well enough to produce a realistic uncertainty statement. Environmental analytical methods will be examined as far as necessary, but reference will also be made to analytical methods in general and to physical measurement methods where appropriate. This paper describes ways and means of quantifying uncertainty for frequently practised methods of environmental analysis. It will be shown that operationally defined measurands are no obstacle to the estimation process as described in the Eurachem/CITAC Guide if it is accepted that the dominating component of uncertainty comes from the actual practice of the method as a reproducibility standard deviation.

  19. Investigation of error estimation method of observational data and comparison method between numerical and observational results toward V and V of seismic simulation

    International Nuclear Information System (INIS)

    Suzuki, Yoshio; Kawakami, Yoshiaki; Nakajima, Norihiro

    2017-01-01

    The method to estimate errors included in observational data and the method to compare numerical results with observational results are investigated toward the verification and validation (V and V) of a seismic simulation. For the method to estimate errors, 144 publications from the past 5 years (from the year 2010 to 2014) in the structural engineering and earthquake engineering fields, where descriptions of acceleration data are frequent, are surveyed. As a result, it is found that processes to remove components regarded as errors from observational data are used in about 30% of those publications. Errors are caused by the resolution, the linearity, the temperature coefficient for sensitivity, the temperature coefficient for zero shift, the transverse sensitivity, the seismometer property, aliasing, and so on. Those processes can be exploited to estimate errors individually. For the method to compare numerical results with observational results, public materials of the ASME V and V Symposium 2012-2015, their references, and the above 144 publications are surveyed. As a result, it is found that six methods have mainly been proposed in existing research. Evaluating those methods using nine items, the advantages and disadvantages of each method are summarized. No method is yet well established, so it is necessary to apply the existing methods while compensating for their disadvantages and/or to search for a novel method. (author)

  20. Image restoration by the method of convex projections: part 2 applications and numerical results.

    Science.gov (United States)

    Sezan, M I; Stark, H

    1982-01-01

    The image restoration theory discussed in a previous paper by Youla and Webb [1] is applied to a simulated image and the results are compared with the well-known Gerchberg-Papoulis algorithm. The results show that the method of image restoration by projection onto convex sets, by providing a convenient technique for utilizing a priori information, performs significantly better than the Gerchberg-Papoulis method.
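    As a minimal illustration of restoration by projections onto convex sets (POCS), the sketch below alternates a projection enforcing measured low-frequency Fourier data with a projection enforcing a known spatial support and nonnegativity. This is a generic toy version under assumed constraints, not the specific formulation of Youla and Webb or the simulations of the paper.

```python
# Toy POCS restoration: alternate between (1) enforcing known low-frequency
# Fourier samples and (2) enforcing a known spatial support with nonnegativity.
# Generic illustration only.
import numpy as np

rng = np.random.default_rng(3)
N = 64
true = np.zeros((N, N))
true[20:44, 20:44] = rng.uniform(0.5, 1.0, (24, 24))    # object on known support

F_true = np.fft.fft2(true)
mask = np.zeros((N, N), dtype=bool)                     # measured frequencies
mask[:8, :8] = mask[-8:, :8] = mask[:8, -8:] = mask[-8:, -8:] = True

support = true > 0
x = np.zeros((N, N))
for _ in range(200):
    X = np.fft.fft2(x)
    X[mask] = F_true[mask]              # projection onto the data-consistency set
    x = np.real(np.fft.ifft2(X))
    x[~support] = 0.0                   # projection onto the support set
    x = np.clip(x, 0.0, None)           # nonnegativity constraint

print("relative error:", np.linalg.norm(x - true) / np.linalg.norm(true))
```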

  1. A Pragmatic Smoothing Method for Improving the Quality of the Results in Atomic Spectroscopy

    Science.gov (United States)

    Bennun, Leonardo

    2017-07-01

    A new smoothing method for improving the identification and quantification of spectral functions, based on previous knowledge of the signals that are expected to be quantified, is presented. These signals are used as weighting coefficients in the smoothing algorithm. This smoothing method was conceived to be applied in atomic and nuclear spectroscopies, preferably in techniques where net counts are proportional to acquisition time, such as particle induced X-ray emission (PIXE) and other X-ray fluorescence spectroscopic methods. This algorithm, when properly applied, does not distort the form or the intensity of the signal, so it is well suited for all kinds of spectroscopic techniques. The method is extremely effective at reducing high-frequency noise in the signal, much more so than a single rectangular smooth of the same width. As with all smoothing techniques, the proposed method improves the precision of the results, but in this case we also found a systematic improvement in the accuracy of the results. We still have to evaluate the improvement in the quality of the results when this method is applied to real experimental data. We expect a better characterization of the net area quantification of the peaks, and smaller detection and quantification limits. We have applied this method to signals that obey Poisson statistics, but with the same ideas and criteria it could be applied to time series. In the general case, when this algorithm is applied to experimental results, it is also required that the sought characteristic functions, needed for this weighted smoothing method, be obtained from a system with strong stability. If the sought signals are not perfectly clean, this method should be applied carefully.
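    A rough sketch of the general idea, smoothing a noisy spectrum with a kernel built from the expected (known) peak shape instead of a rectangular window, is given below. The Gaussian peak shape, the Poisson noise model, and the kernel width are assumptions for illustration and do not reproduce the author's exact algorithm.

```python
# Weighted smoothing sketch: convolve a Poisson-noisy spectrum with a kernel
# shaped like the expected peak (here a Gaussian), normalized to unit area,
# and compare with a rectangular smooth of the same width. Illustrative only.
import numpy as np

def gaussian_peak(c, mu, sigma, amp):
    return amp * np.exp(-0.5 * ((c - mu) / sigma) ** 2)

rng = np.random.default_rng(4)
channels = np.arange(512)
clean = (20 + gaussian_peak(channels, 200, 4.0, 300)
            + gaussian_peak(channels, 350, 4.0, 120))
noisy = rng.poisson(clean).astype(float)

sigma = 4.0                                    # assumed known detector resolution
half = 12
kernel = np.exp(-0.5 * (np.arange(-half, half + 1) / sigma) ** 2)
weighted = np.convolve(noisy, kernel / kernel.sum(), mode="same")
rect = np.convolve(noisy, np.ones(2 * half + 1) / (2 * half + 1), mode="same")

for name, est in [("weighted", weighted), ("rectangular", rect)]:
    print(name, "RMS error:", np.sqrt(np.mean((est - clean) ** 2)).round(2))
```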

  2. Assessment of Cultivation Method for Energy Beet Based on LCA Method

    OpenAIRE

    Zhang, Chunfeng; Liu, Feng; Zu, Yuangang; Meng, Qingying; Zhu, Baoguo; Wang, Nannan

    2014-01-01

    In order to establish a supply system for energy resources coupled with the environment, the production technology of sugar beets was explored as a biological energy source. A low-humic andosol was used as the experimental soil, the planting method was direct planting, and the cultivation technique was the minimum tillage direct planting method. The controls were conventional tillage transplanting and no tillage direct planting. The results revealed that the energy cost of no tillage and a d...

  3. A method for modeling laterally asymmetric proton beamlets resulting from collimation

    Energy Technology Data Exchange (ETDEWEB)

    Gelover, Edgar; Wang, Dongxu; Flynn, Ryan T.; Hyer, Daniel E. [Department of Radiation Oncology, University of Iowa, 200 Hawkins Drive, Iowa City, Iowa 52242 (United States); Hill, Patrick M. [Department of Human Oncology, University of Wisconsin, 600 Highland Avenue, Madison, Wisconsin 53792 (United States); Gao, Mingcheng; Laub, Steve; Pankuch, Mark [Division of Medical Physics, CDH Proton Center, 4455 Weaver Parkway, Warrenville, Illinois 60555 (United States)

    2015-03-15

    Purpose: To introduce a method to model the 3D dose distribution of laterally asymmetric proton beamlets resulting from collimation. The model enables rapid beamlet calculation for spot scanning (SS) delivery using a novel penumbra-reducing dynamic collimation system (DCS) with two pairs of trimmers oriented perpendicular to each other. Methods: Trimmed beamlet dose distributions in water were simulated with MCNPX and the collimating effects noted in the simulations were validated by experimental measurement. The simulated beamlets were modeled analytically using integral depth dose curves along with an asymmetric Gaussian function to represent fluence in the beam’s eye view (BEV). The BEV parameters consisted of Gaussian standard deviations (sigmas) along each primary axis (σx1, σx2, σy1, σy2) together with the spatial location of the maximum dose (μx, μy). Percent depth dose variation with trimmer position was accounted for with a depth-dependent correction function. Beamlet growth with depth was accounted for by combining the in-air divergence with Hong’s fit of the Highland approximation along each axis in the BEV. Results: The beamlet model showed excellent agreement with the Monte Carlo simulation data used as a benchmark. The overall passing rate for a 3D gamma test with 3%/3 mm passing criteria was 96.1% between the analytical model and Monte Carlo data in an example treatment plan. Conclusions: The analytical model is capable of accurately representing individual asymmetric beamlets resulting from use of the DCS. This method enables integration of the DCS into a treatment planning system to perform dose computation in patient datasets. The method could be generalized for use with any SS collimation system in which blades, leaves, or trimmers are used to laterally sharpen beamlets.
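    To show how such an asymmetric Gaussian fluence might be evaluated in the beam's eye view, a short sketch is given below. It only implements a piecewise (two-sided) Gaussian with independent sigmas on each side of the maximum along x and y; the parameter values are invented, and the depth-dependent corrections and divergence model described in the abstract are omitted.

```python
# Two-sided (asymmetric) Gaussian fluence in the beam's eye view: different
# sigmas on each side of the maximum along x and y. Parameter values invented;
# the paper's depth-dependent corrections are not included.
import numpy as np

def asymmetric_gaussian_fluence(x, y, mu_x, mu_y, sx1, sx2, sy1, sy2):
    sigma_x = np.where(x < mu_x, sx1, sx2)      # left/right sigma along x
    sigma_y = np.where(y < mu_y, sy1, sy2)      # lower/upper sigma along y
    return np.exp(-0.5 * ((x - mu_x) / sigma_x) ** 2
                  - 0.5 * ((y - mu_y) / sigma_y) ** 2)

x, y = np.meshgrid(np.linspace(-20, 20, 201), np.linspace(-20, 20, 201))
fluence = asymmetric_gaussian_fluence(x, y, mu_x=1.0, mu_y=-0.5,
                                      sx1=2.5, sx2=4.0, sy1=3.0, sy2=3.5)
print(fluence.shape, fluence.max())
```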

  4. A method for modeling laterally asymmetric proton beamlets resulting from collimation

    International Nuclear Information System (INIS)

    Gelover, Edgar; Wang, Dongxu; Flynn, Ryan T.; Hyer, Daniel E.; Hill, Patrick M.; Gao, Mingcheng; Laub, Steve; Pankuch, Mark

    2015-01-01

    Purpose: To introduce a method to model the 3D dose distribution of laterally asymmetric proton beamlets resulting from collimation. The model enables rapid beamlet calculation for spot scanning (SS) delivery using a novel penumbra-reducing dynamic collimation system (DCS) with two pairs of trimmers oriented perpendicular to each other. Methods: Trimmed beamlet dose distributions in water were simulated with MCNPX and the collimating effects noted in the simulations were validated by experimental measurement. The simulated beamlets were modeled analytically using integral depth dose curves along with an asymmetric Gaussian function to represent fluence in the beam’s eye view (BEV). The BEV parameters consisted of Gaussian standard deviations (sigmas) along each primary axis (σx1, σx2, σy1, σy2) together with the spatial location of the maximum dose (μx, μy). Percent depth dose variation with trimmer position was accounted for with a depth-dependent correction function. Beamlet growth with depth was accounted for by combining the in-air divergence with Hong’s fit of the Highland approximation along each axis in the BEV. Results: The beamlet model showed excellent agreement with the Monte Carlo simulation data used as a benchmark. The overall passing rate for a 3D gamma test with 3%/3 mm passing criteria was 96.1% between the analytical model and Monte Carlo data in an example treatment plan. Conclusions: The analytical model is capable of accurately representing individual asymmetric beamlets resulting from use of the DCS. This method enables integration of the DCS into a treatment planning system to perform dose computation in patient datasets. The method could be generalized for use with any SS collimation system in which blades, leaves, or trimmers are used to laterally sharpen beamlets

  5. A method for modeling laterally asymmetric proton beamlets resulting from collimation

    Science.gov (United States)

    Gelover, Edgar; Wang, Dongxu; Hill, Patrick M.; Flynn, Ryan T.; Gao, Mingcheng; Laub, Steve; Pankuch, Mark; Hyer, Daniel E.

    2015-01-01

    Purpose: To introduce a method to model the 3D dose distribution of laterally asymmetric proton beamlets resulting from collimation. The model enables rapid beamlet calculation for spot scanning (SS) delivery using a novel penumbra-reducing dynamic collimation system (DCS) with two pairs of trimmers oriented perpendicular to each other. Methods: Trimmed beamlet dose distributions in water were simulated with MCNPX and the collimating effects noted in the simulations were validated by experimental measurement. The simulated beamlets were modeled analytically using integral depth dose curves along with an asymmetric Gaussian function to represent fluence in the beam’s eye view (BEV). The BEV parameters consisted of Gaussian standard deviations (sigmas) along each primary axis (σx1,σx2,σy1,σy2) together with the spatial location of the maximum dose (μx,μy). Percent depth dose variation with trimmer position was accounted for with a depth-dependent correction function. Beamlet growth with depth was accounted for by combining the in-air divergence with Hong’s fit of the Highland approximation along each axis in the BEV. Results: The beamlet model showed excellent agreement with the Monte Carlo simulation data used as a benchmark. The overall passing rate for a 3D gamma test with 3%/3 mm passing criteria was 96.1% between the analytical model and Monte Carlo data in an example treatment plan. Conclusions: The analytical model is capable of accurately representing individual asymmetric beamlets resulting from use of the DCS. This method enables integration of the DCS into a treatment planning system to perform dose computation in patient datasets. The method could be generalized for use with any SS collimation system in which blades, leaves, or trimmers are used to laterally sharpen beamlets. PMID:25735287

  6. [Adverse events management. Methods and results of a development project].

    Science.gov (United States)

    Rabøl, Louise Isager; Jensen, Elisabeth Brøgger; Hellebek, Annemarie H; Pedersen, Beth Lilja

    2006-11-27

    This article describes the methods and results of a project in the Copenhagen Hospital Corporation (H:S) on preventing adverse events. The aim of the project was to raise awareness about patients' safety, test a reporting system for adverse events, develop and test methods of analysis of events and propagate ideas about how to prevent adverse events. H:S developed an action plan and a reporting system for adverse events, founded an organization and developed an educational program on theories and methods of learning from adverse events for both leaders and employees. During the three-year period from 1 January 2002 to 31 December 2004, the H:S staff reported 6011 adverse events. In the same period, the organization completed 92 root cause analyses. More than half of these dealt with events that had been optional to report, the other half events that had been mandatory to report. The number of reports and the front-line staff's attitude towards reporting shows that the H:S succeeded in founding a safety culture. Future work should be centred on developing and testing methods that will prevent adverse events from happening. The objective is to suggest and complete preventive initiatives which will help increase patient safety.

  7. Transparency Trade-offs for a 3-channel Controller Revealed by the Bounded Environment Passivity Method

    OpenAIRE

    Willaert, Bert; Corteville, Brecht; Reynaerts, Dominiek; Van Brussel, Hendrik; Vander Poorten, Emmanuel

    2010-01-01

    In this paper, the Bounded Environment Passivity method [1] is applied to a 3-channel controller. This method enables the design of teleoperation controllers that show passive behaviour for interactions with a bounded range of environments. The resulting tuning guidelines, derived analytically, provide interesting tuning flexibility, which makes it possible to focus on different aspects of transparency. As telesurgery is the motivation behind this work, the focus lies on correctly r...

  8. Trojan Horse Method: Recent Results

    International Nuclear Information System (INIS)

    Pizzone, R. G.; Spitaleri, C.

    2008-01-01

    Owing to the presence of the Coulomb barrier at astrophysically relevant kinetic energies, it is very difficult, or sometimes impossible, to measure astrophysical reaction rates in the laboratory. This is why different indirect techniques are being used along with direct measurements. The THM is a unique indirect technique allowing one to measure astrophysical rearrangement reactions down to astrophysically relevant energies. The basic principle and a review of the main applications of the Trojan Horse Method are presented. The applications aiming at the extraction of the bare S_b(E) astrophysical factor and of the electron screening potentials U_e for several two-body processes are discussed.

  9. Method of Check of Statistical Hypotheses for Revealing of “Fraud” Point of Sale

    Directory of Open Access Journals (Sweden)

    T. M. Bolotskaya

    2011-06-01

    Full Text Available The application of a method for checking statistical hypotheses to "fraud" Point of Sale terminals that work with purchasing cards and are suspected of carrying out unauthorized operations is analyzed. On the basis of the results obtained, an algorithm is developed that allows an assessment of terminal operation in off-line mode.
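    The record does not spell out which test is used; as a loose illustration of the general idea, the sketch below flags a terminal whose rate of suspicious operations is significantly above an assumed network-wide baseline using a one-sided proportion test. The function name, the baseline rate and the counts are all hypothetical.

```python
from math import sqrt
from scipy.stats import norm

def flag_terminal(n_transactions, n_suspicious, baseline_rate, alpha=0.01):
    """One-sided test of H0: the terminal's rate of suspicious operations equals the
    baseline, against H1: it is higher. Returns True if H0 is rejected at level alpha."""
    p_hat = n_suspicious / n_transactions
    se = sqrt(baseline_rate * (1.0 - baseline_rate) / n_transactions)
    z = (p_hat - baseline_rate) / se
    return z > norm.ppf(1.0 - alpha)

# Hypothetical numbers: 14 suspicious operations out of 800 against a 0.5% baseline.
print(flag_terminal(800, 14, baseline_rate=0.005))
```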

  10. LOGICAL CONDITIONS ANALYSIS METHOD FOR DIAGNOSTIC TEST RESULTS DECODING APPLIED TO COMPETENCE ELEMENTS PROFICIENCY

    Directory of Open Access Journals (Sweden)

    V. I. Freyman

    2015-11-01

    Full Text Available Subject of Research. Features of representing education results for competence-based educational programs are analyzed. The importance of decoding and proficiency estimation for elements and components of the discipline parts of competences is shown. The purpose and objectives of the research are formulated. Methods. The paper uses methods of mathematical logic, Boolean algebra, and parametric analysis of the results of complex diagnostic tests that control the proficiency of discipline competence elements. Results. A method of logical conditions analysis is created. It makes it possible to formulate logical conditions for determining the proficiency of each discipline competence element controlled by a complex diagnostic test. The normalized test result is divided into non-overlapping zones, and for each zone a logical condition about the proficiency of the controlled elements is formulated. Summarized characteristics of the test result zones are given. An example of forming logical conditions for a diagnostic test with preset features is provided. Practical Relevance. The proposed method of logical conditions analysis is applied in the algorithm for decoding proficiency test diagnoses for discipline competence elements. It makes it possible to automate the search for elements with insufficient proficiency, and it is also usable for the estimation of education results of a discipline or a component of a competence-based educational program.
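    A minimal sketch of the decoding step (an illustration under assumed zone boundaries, not the paper's actual conditions): a normalized test result is mapped onto non-overlapping zones, each of which carries a logical condition about the proficiency of the controlled competence elements.

```python
def decode_result(score, zones):
    """Map a normalized complex-test result in [0, 1] onto non-overlapping zones, each
    carrying a logical condition about the proficiency of the controlled elements."""
    for low, high, condition in zones:
        if low <= score < high or (high == 1.0 and score == 1.0):
            return condition
    raise ValueError("score outside [0, 1]")

# Hypothetical zones for a complex test controlling two elements E1 and E2.
zones = [
    (0.00, 0.50, {"E1": False, "E2": False}),   # neither element mastered
    (0.50, 0.75, {"E1": True,  "E2": False}),   # only E1 mastered
    (0.75, 1.00, {"E1": True,  "E2": True}),    # both elements mastered
]
print(decode_result(0.8, zones))
```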

  11. Monitoring ambient ozone with a passive measurement technique method, field results and strategy

    NARCIS (Netherlands)

    Scheeren, BA; Adema, EH

    1996-01-01

    A low-cost, accurate and sensitive passive measurement method for ozone has been developed and tested. The method is based on the reaction of ozone with indigo carmine which results in colourless reaction products which are detected spectrophotometrically after exposure. Coated glass filters are

  12. [Results of treatment of milk teeth pulp by modified formocresol method].

    Science.gov (United States)

    Wochna-Sobańska, M

    1989-01-01

    The purpose of the study was to evaluate the results of treatment of molar pulp diseases by the formocresol method from the standpoint of the development of inflammatory complications in periapical tissues, disturbances of physiological root resorption, and disturbances of mineralization of the crowns of the homologous permanent teeth. Milk molars diagnosed with grade II pulpopathy in children aged 3 to 9 years were qualified for treatment. The treatment used formocresol in a modified pulp amputation method according to Buckley, after previous devitalization with parapaste. The status of 143 teeth was examined again 1 to 4 years after completion of treatment. The proportion of positive results was 94% after one year, 90% after two years, 87% after three years and 80% after four years. The cause of premature loss of most teeth was acceleration of root resorption by 18-24 months. No harmful action of formocresol on the buds of the permanent teeth was noted.

  13. [Do different interpretative methods used for evaluation of checkerboard synergy test affect the results?].

    Science.gov (United States)

    Ozseven, Ayşe Gül; Sesli Çetin, Emel; Ozseven, Levent

    2012-07-01

    In recent years, owing to the presence of multi-drug resistant nosocomial bacteria, combination therapies are more frequently applied. Thus there is more need to investigate the in vitro activity of drug combinations against multi-drug resistant bacteria. Checkerboard synergy testing is among the most widely used standard techniques to determine the activity of antibiotic combinations. It is based on microdilution susceptibility testing of antibiotic combinations. Although this test has a standardised procedure, there are many different methods for interpreting the results. In many previous studies carried out with multi-drug resistant bacteria, different rates of synergy have been reported with various antibiotic combinations using the checkerboard technique. These differences might be attributed to different features of the strains. However, different synergy rates detected by the checkerboard method have also been reported in other studies using the same drug combinations and the same types of bacteria. It was thought that these differences in synergy rates might be due to the different methods used for interpreting the synergy test results. In recent years, multi-drug resistant Acinetobacter baumannii has been the most commonly encountered nosocomial pathogen, especially in intensive-care units. For this reason, multi-drug resistant A.baumannii has been the subject of a considerable amount of research about antimicrobial combinations. In the present study, the in vitro activities of combinations frequently preferred in A.baumannii infections, namely imipenem plus ampicillin/sulbactam and meropenem plus ampicillin/sulbactam, were tested by the checkerboard synergy method against 34 multi-drug resistant A.baumannii isolates. Minimum inhibitory concentration (MIC) values for imipenem, meropenem and ampicillin/sulbactam were determined by the broth microdilution method. Subsequently the activity of the two combinations was tested in the dilution range of 4 x MIC and 0.03 x MIC in
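    One widely used way of interpreting a checkerboard well is the fractional inhibitory concentration index (FICI); the sketch below computes it and applies one common set of thresholds. The thresholds themselves differ between interpretation schemes, which is exactly the kind of discrepancy discussed above, and the MIC values in the example are hypothetical.

```python
def fic_index(mic_a_alone, mic_b_alone, mic_a_combo, mic_b_combo):
    """Fractional inhibitory concentration index for one inhibitory well of a checkerboard."""
    return mic_a_combo / mic_a_alone + mic_b_combo / mic_b_alone

def interpret(fici):
    """One commonly used set of thresholds; other interpretation schemes draw the lines differently."""
    if fici <= 0.5:
        return "synergy"
    if fici > 4.0:
        return "antagonism"
    return "no interaction"

# Hypothetical MICs (mg/L) for drug A alone, drug B alone, and each drug in combination.
print(interpret(fic_index(mic_a_alone=32, mic_b_alone=64, mic_a_combo=4, mic_b_combo=8)))
```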

  14. Surface-enhanced Raman scattering spectra revealing the inter-cultivar differences for Chinese ornamental Flos Chrysanthemum: a new promising method for plant taxonomy

    Directory of Open Access Journals (Sweden)

    Heng Zhang

    2017-10-01

    Full Text Available Abstract Background Flos Chrysanthemi, part of Chinese culture throughout a long history, is valuable not only for environmental decoration but also as a medicine and food additive. Because of the large number of breeds and their extensive distribution worldwide, it is burdensome to recognize and classify the numerous cultivars with conventional methods, which still rest on morphological observation and description. As a fingerprint spectrum carrying molecular information, surface-enhanced Raman scattering (SERS) could be a suitable candidate technique to characterize and distinguish inter-cultivar differences at the molecular level. Results SERS spectra were used to analyze the inter-cultivar differences among 26 cultivars of Chinese ornamental Flos Chrysanthemum. The characteristic peak distribution patterns were abstracted from the SERS spectra and varied from cultivar to cultivar. For the bands in the pattern map, the similarities in general showed the cultivars' commonality, while at finer scales the deviations, and especially the particular bands owned by only a few cultivars, revealed their individuality. Since the Raman peaks characterize specific chemical components, the diversity of patterns in fact indicates inter-cultivar differences at the chemical level. Conclusion SERS is feasible for distinguishing the inter-cultivar differences among Flos Chrysanthemum. A Raman spectral library was built from the SERS characteristic peak distribution patterns, and a new method is proposed for Flos Chrysanthemum recognition and taxonomy.

  15. A holographic method for investigating cylindrical symmetry plasmas resulting from electric discharges

    International Nuclear Information System (INIS)

    Rosu, N.; Ralea, M.; Foca, M.; Iova, I.

    1992-01-01

    A new method based on holographic interferometry in real time with reference fringes for diagnosing gas electric discharges in cylindrical symmetry tubes is presented. A method for obtaining and quantitatively investigating interferograms obtained with a video camera is described. By studying the resulting images frame by frame and introducing the measurements into an adequate computer programme one gets a graphical recording of the radial distribution of the charged particle concentration in the plasma in any region of the tube at a given time, as well as their axial distribution. The real time evolution of certain phenomena occurring in the discharge tube can also be determined by this non-destructive method. The method is used for electric discharges in Ar at average pressures in a discharge tube with hollow cathode effect. (Author)

  16. Neurochemistry of Alzheimer's disease and related dementias: Results of metabolic imaging and future application of ligand binding methods

    International Nuclear Information System (INIS)

    Frey, K.A.; Koeppe, R.A.; Kuhl, D.E.

    1991-01-01

    Although Alzheimer's disease (AD) has been recognized for over a decade as a leading cause of cognitive decline in the elderly, its etiology remains unknown. Radiotracer imaging studies have revealed characteristic patterns of abnormal energy metabolism and blood flow in AD. A consistent reduction in cerebral glucose metabolism, determined by positron emission tomography, is observed in the parietal, temporal, and frontal association cortices. It is proposed that this occurs on the basis of diffuse cortical pathology, resulting in disproportionate loss of presynaptic input to higher cortical association areas. Postmortem neurochemical studies consistently indicate a severe depletion of cortical presynaptic cholinergic markers in AD. This is accounted for by loss of cholinergic projection neurons in the basal forebrain. In addition, loss of extrinsic serotonergic innervation of the cortex and losses of intrinsic cortical markers such as somatostatin, substance P, glutamate receptors, and glutamate- and GABA-uptake sites are reported. These observations offer the opportunity for study in vivo with the use of radioligand imaging methods under development. The role of tracer imaging studies in the investigation and diagnosis of dementia is likely to become increasingly central, as metabolic imaging provides evidence of abnormality early in the clinical course. New neurochemical imaging methods will allow direct testing of hypotheses of selective neuronal degeneration, and will assist in design of future studies of AD pathophysiology

  17. Experimental Results and Numerical Simulation of the Target RCS using Gaussian Beam Summation Method

    Directory of Open Access Journals (Sweden)

    Ghanmi Helmi

    2018-05-01

    Full Text Available This paper presents a numerical and experimental study of the Radar Cross Section (RCS) of radar targets using the Gaussian Beam Summation (GBS) method. The GBS method has several advantages over ray methods, mainly concerning the caustic problem. To evaluate the performance of the chosen method, we started the analysis of the RCS using Gaussian Beam Summation (GBS) and Gaussian Beam Launching (GBL), the asymptotic models Physical Optics (PO) and Geometrical Theory of Diffraction (GTD), and the rigorous Method of Moments (MoM). Then, we show the experimental validation of the numerical results using measurements carried out in the anechoic chamber of Lab-STICC at ENSTA Bretagne. The numerical and experimental results of the RCS are studied and given as a function of various parameters: polarization type, target size, number of Gaussian beams and Gaussian beam width.

  18. Analysis of the robustness of network-based disease-gene prioritization methods reveals redundancy in the human interactome and functional diversity of disease-genes.

    Directory of Open Access Journals (Sweden)

    Emre Guney

    Full Text Available Complex biological systems usually pose a trade-off between robustness and fragility, where a small number of perturbations can substantially disrupt the system. Although biological systems are robust against changes in many external and internal conditions, even a single mutation can perturb the system substantially, giving rise to a pathophenotype. Recent advances in identifying and analyzing the sequence variations underlying human disorders help to build a systemic view of the mechanisms behind various disease phenotypes. Network-based disease-gene prioritization methods rank the relevance of genes in a disease under the hypothesis that genes whose proteins interact with each other tend to exhibit similar phenotypes. In this study, we have tested the robustness of several network-based disease-gene prioritization methods with respect to perturbations of the system, using various disease phenotypes from the Online Mendelian Inheritance in Man database. These perturbations have been introduced either in the protein-protein interaction network or in the set of known disease-gene associations. As the network-based disease-gene prioritization methods are based on the connectivity between known disease-gene associations, we have further used these methods to categorize the pathophenotypes with respect to the recoverability of hidden disease-genes. Our results suggest that, in general, disease-genes are connected through multiple paths in the human interactome. Moreover, even when these paths are disturbed, network-based prioritization can reveal hidden disease-gene associations in some pathophenotypes, such as breast cancer, cardiomyopathy, diabetes, leukemia, Parkinson disease and obesity, to a greater extent than in the rest of the pathophenotypes tested in this study. Gene Ontology (GO) analysis highlighted the role of functional diversity for such diseases.
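    The prioritization methods themselves are more elaborate than can be shown here; as a toy proxy of the robustness experiment, the sketch below scores genes by shortest-path proximity to known disease genes on an interactome graph, removes a fraction of edges, and reports the rank of a held-out disease gene. The graph, seeds and parameters are assumptions for illustration.

```python
import random
import networkx as nx

def proximity_scores(graph, seeds):
    """Score each gene by closeness to the known disease genes (seeds):
    the smaller the shortest-path distance to the nearest seed, the higher the score."""
    scores = {}
    for node in graph:
        dists = []
        for seed in seeds:
            if node != seed and nx.has_path(graph, node, seed):
                dists.append(nx.shortest_path_length(graph, node, seed))
        scores[node] = 1.0 / (1.0 + min(dists)) if dists else 0.0
    return scores

def rank_after_perturbation(graph, seeds, hidden_gene, edge_fraction=0.2, seed=0):
    """Remove a fraction of interactome edges and report the rank of a hidden disease gene."""
    rng = random.Random(seed)
    perturbed = graph.copy()
    edges = list(perturbed.edges())
    perturbed.remove_edges_from(rng.sample(edges, int(edge_fraction * len(edges))))
    scores = proximity_scores(perturbed, seeds)
    ranking = sorted(scores, key=scores.get, reverse=True)
    return ranking.index(hidden_gene) + 1

# Hypothetical toy interactome, known disease genes, and one held-out gene.
g = nx.erdos_renyi_graph(100, 0.05, seed=1)
print(rank_after_perturbation(g, seeds=[1, 2, 3], hidden_gene=4, edge_fraction=0.2))
```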

  19. A method that reveals the multi-level ultrametric tree hidden in p -spin-glass-like systems

    International Nuclear Information System (INIS)

    Baviera, R; Virasoro, M A

    2015-01-01

    In the study of disordered models like spin glasses the key object of interest is the rugged energy hypersurface defined in configuration space. The statistical mechanics calculation of the Gibbs–Boltzmann partition function gives the information necessary to understand the equilibrium behavior of the system as a function of the temperature but is not enough if we are interested in the more general aspects of the hypersurface: it does not give us, for instance, the different degrees of ruggedness at different scales. In the context of the replica symmetry breaking (RSB) approach we discuss here a rather simple extension that can provide a much more detailed picture. The attractiveness of the method relies on the fact that it is conceptually transparent and the additional calculations are rather straightforward. We think that this approach reveals an ultrametric organisation with many levels in models like p-spin glasses when we include saddle points. In this first paper we present detailed calculations for the spherical p-spin glass model where we discover that the corresponding decreasing Parisi function q(x) codes this hidden ultrametric organisation. (paper)

  20. Application of the DSA preconditioned GMRES formalism to the method of characteristics - First results

    International Nuclear Information System (INIS)

    Le Tellier, R.; Hebert, A.

    2004-01-01

    The method of characteristics is well known for its slow convergence; consequently, as is often done for SN methods, the Generalized Minimal Residual approach (GMRES) has been investigated for its practical implementation and its high reliability. GMRES is one of the most effective Krylov iterative methods for solving large linear systems. Moreover, the system has been 'left preconditioned' with the Algebraic Collapsing Acceleration (ACA), a variant of the Diffusion Synthetic Acceleration (DSA) based on I. Suslov's former works. This paper presents the first numerical results of these methods in 2D geometries with material discontinuities. Indeed, previous investigations have shown a degraded effectiveness of Diffusion Synthetic Acceleration with this kind of geometry. Results are presented for 9 x 9 Cartesian assemblies in terms of the speed of convergence of the inner (fixed source) iterations of the method of characteristics. They show a significant improvement in the convergence rate. (authors)
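    The ACA/DSA preconditioner itself is specific to transport sweeps, but the GMRES mechanics can be illustrated with a generic sparse system and a simple Jacobi preconditioner standing in for it; everything below (matrix, right-hand side, preconditioner) is an assumption for illustration.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import LinearOperator, gmres

# Hypothetical sparse system standing in for the transport fixed-source problem.
n = 200
A = diags([-0.4, 1.0, -0.4], offsets=[-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

# Jacobi (diagonal) preconditioner used here in place of the ACA/DSA operator.
inv_diag = 1.0 / A.diagonal()
M = LinearOperator((n, n), matvec=lambda v: inv_diag * v)

x, info = gmres(A, b, M=M, restart=30)
print(info, np.linalg.norm(A @ x - b))   # info == 0 means the iteration converged
```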

  1. Methods uncovering usability issues in medication-related alerting functions: results from a systematic review.

    Science.gov (United States)

    Marcilly, Romaric; Vasseur, Francis; Ammenwerth, Elske; Beuscart-Zephir, Marie-Catherine

    2014-01-01

    This paper aims at listing the methods used to evaluate the usability of medication-related alerting functions and at determining what types of usability issues those methods allow one to detect. A sub-analysis of data from this systematic review has been performed. The methods applied in the included papers were collected. Then the included papers were sorted into four types of evaluation: "expert evaluation", "user-testing/simulation", "on-site observation" and "impact studies". The types of usability issues (usability flaws, usage problems and negative outcomes) uncovered by those evaluations were analyzed. Results show that a large set of methods is used. The largest proportion of papers uses "on-site observation" evaluation. This is the only evaluation type for which every kind of usability flaw, usage problem and outcome is detected. It is somewhat surprising that, in a usability systematic review, most of the included papers use a method that is not often presented as a usability method. The results are discussed with regard to the opportunity of providing usability information, collected after the implementation of the technology, during the design process, i.e. before implementation.

  2. Development and application of a new deterministic method for calculating computer model result uncertainties

    International Nuclear Information System (INIS)

    Maerker, R.E.; Worley, B.A.

    1989-01-01

    Interest in research into the field of uncertainty analysis has recently been stimulated as a result of a need in high-level waste repository design assessment for uncertainty information in the form of response complementary cumulative distribution functions (CCDFs) to show compliance with regulatory requirements. The solution to this problem must obviously rely on the analysis of computer code models, which, however, employ parameters that can have large uncertainties. The motivation for the research presented in this paper is a search for a method involving a deterministic uncertainty analysis approach that could serve as an improvement over those methods that make exclusive use of statistical techniques. A deterministic uncertainty analysis (DUA) approach based on the use of first derivative information is the method studied in the present procedure. The present method has been applied to a high-level nuclear waste repository problem involving use of the codes ORIGEN2, SAS, and BRINETEMP in series, and the resulting CDF of a BRINETEMP result of interest is compared with that obtained through a completely statistical analysis
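    The essence of a first-derivative (sensitivity-based) deterministic uncertainty analysis can be written down compactly: the response standard deviation is built from finite-difference estimates of the partial derivatives and the parameter standard deviations. The surrogate model and all numbers below are hypothetical and do not represent the ORIGEN2/SAS/BRINETEMP chain.

```python
import numpy as np

def propagate_uncertainty(model, params, sigmas, rel_step=1e-4):
    """First-order propagation: sigma_y^2 = sum_i (dy/dp_i)^2 * sigma_i^2, with the
    partial derivatives estimated by central finite differences of the model."""
    params = np.asarray(params, dtype=float)
    grads = np.empty_like(params)
    for i, p in enumerate(params):
        h = rel_step * max(abs(p), 1.0)
        up, down = params.copy(), params.copy()
        up[i] += h
        down[i] -= h
        grads[i] = (model(up) - model(down)) / (2.0 * h)
    return float(np.sqrt(np.sum((grads * np.asarray(sigmas, dtype=float)) ** 2)))

# Hypothetical surrogate for a brine-temperature response at t = 10 years: T = a*exp(-b*t) + c.
def brine_temp(p, t=10.0):
    a, b, c = p
    return a * np.exp(-b * t) + c

sigma_T = propagate_uncertainty(brine_temp, params=[80.0, 0.05, 25.0], sigmas=[4.0, 0.005, 1.0])
print(sigma_T)
```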

  3. Application of NUREG-1150 methods and results to accident management

    International Nuclear Information System (INIS)

    Dingman, S.; Sype, T.; Camp, A.; Maloney, K.

    1991-01-01

    The use of NUREG-1150 and similar probabilistic risk assessments in the Nuclear Regulatory Commission (NRC) and industry risk management programs is discussed. Risk management is more comprehensive than the commonly used term accident management. Accident management includes strategies to prevent vessel breach, mitigate radionuclide releases from the reactor coolant system, and mitigate radionuclide releases to the environment. Risk management also addresses prevention of accident initiators, prevention of core damage, and implementation of effective emergency response procedures. The methods and results produced in NUREG-1150 provide a framework within which current risk management strategies can be evaluated, and future risk management programs can be developed and assessed. Examples of the use of the NUREG-1150 framework for identifying and evaluating risk management options are presented. All phases of risk management are discussed, with particular attention given to the early phases of accidents. Plans and methods for evaluating accident management strategies that have been identified in the NRC accident management program are discussed

  4. Application of NUREG-1150 methods and results to accident management

    International Nuclear Information System (INIS)

    Dingman, S.; Sype, T.; Camp, A.; Maloney, K.

    1990-01-01

    The use of NUREG-1150 and similar Probabilistic Risk Assessments in NRC and industry risk management programs is discussed. "Risk management" is more comprehensive than the commonly used term "accident management." Accident management includes strategies to prevent vessel breach, mitigate radionuclide releases from the reactor coolant system, and mitigate radionuclide releases to the environment. Risk management also addresses prevention of accident initiators, prevention of core damage, and implementation of effective emergency response procedures. The methods and results produced in NUREG-1150 provide a framework within which current risk management strategies can be evaluated, and future risk management programs can be developed and assessed. Examples of the use of the NUREG-1150 framework for identifying and evaluating risk management options are presented. All phases of risk management are discussed, with particular attention given to the early phases of accidents. Plans and methods for evaluating accident management strategies that have been identified in the NRC accident management program are discussed. 2 refs., 3 figs

  5. Interval estimation methods of the mean in small sample situation and the results' comparison

    International Nuclear Information System (INIS)

    Wu Changli; Guo Chunying; Jiang Meng; Lin Yuangen

    2009-01-01

    The methods of interval estimation of the sample mean, namely the classical method, the Bootstrap method, the Bayesian Bootstrap method, the Jackknife method and the spread method of the empirical characteristic distribution function, are described. Numerical calculation of the sample mean intervals is carried out for sample sizes of 4, 5 and 6. The results indicate that the Bootstrap method and the Bayesian Bootstrap method are much more appropriate than the others in small-sample situations. (authors)
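    As an illustration of one of the listed approaches, the sketch below computes a percentile Bootstrap confidence interval for the mean of a small sample; the data values and the number of resamples are hypothetical.

```python
import numpy as np

def bootstrap_ci(sample, n_resamples=10_000, alpha=0.05, seed=0):
    """Percentile Bootstrap confidence interval for the mean of a small sample."""
    rng = np.random.default_rng(seed)
    data = np.asarray(sample, dtype=float)
    means = rng.choice(data, size=(n_resamples, data.size), replace=True).mean(axis=1)
    return np.quantile(means, [alpha / 2.0, 1.0 - alpha / 2.0])

# Hypothetical measurement sample of size 5.
print(bootstrap_ci([9.8, 10.4, 10.1, 9.7, 10.6]))
```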

  6. Embodiment of the interpersonal nexus: revealing qualitative research findings on shoulder surgery patients

    Directory of Open Access Journals (Sweden)

    Glass N

    2012-03-01

    Full Text Available Nel Glass, K Robyn Ogle; School of Nursing, Midwifery and Paramedicine, Australian Catholic University, Fitzroy, VIC, Australia. Background: The paper reports on the importance of the interpersonal nexus within qualitative research processes, from a recent research project on patient experiences of shoulder surgery. Our aim is to reveal the importance of qualitative research processes and specifically the role of the interpersonal nexus in generating quality data. Literature related to the importance of human interactions and interpersonal communication processes in health-related research remains limited. Shoulder surgery has been reported to be associated with significant postoperative pain. While shoulder surgery research has investigated various analgesic techniques to determine key efficacy and minimization of adverse side effects, little has been reported from the patient perspective. Methods: Following institutional ethics approval, this project was conducted in two private hospitals in Victoria, Australia, in 2010. The methods included a survey questionnaire, semistructured interviews, and researcher-reflective journaling. Researcher-reflective journaling was utilized to highlight and discuss the interpersonal nexus. Results: This research specifically addresses the importance of the contributions of qualitative methods and processes to understanding patient experiences of analgesic efficacy and shoulder surgery. The results reveal the importance of the established research process and the interwoven interpersonal nexus between the researcher and the research participants. The interpersonal skills of presencing and empathetic engagement are particularly highlighted. Conclusion: The authors attest to the significance of establishing an interpersonal nexus in order to reveal patient experiences of shoulder surgery. Interpersonal emotional engagement is particularly highlighted in data collection, in what may be otherwise understated and overlooked

  7. SPACE CHARGE SIMULATION METHODS INCORPORATED IN SOME MULTI - PARTICLE TRACKING CODES AND THEIR RESULTS COMPARISON

    International Nuclear Information System (INIS)

    BEEBE - WANG, J.; LUCCIO, A.U.; D IMPERIO, N.; MACHIDA, S.

    2002-01-01

    Space charge in high intensity beams is an important issue in accelerator physics. Due to the complexity of the problems, the most effective way of investigating its effect is by computer simulations. In recent years, many space charge simulation methods have been developed and incorporated in various 2D or 3D multi-particle tracking codes. It has become necessary to benchmark these methods against each other, and against experimental results. As part of a global effort, we present our initial comparison of the space charge methods incorporated in the simulation codes ORBIT++, ORBIT and SIMPSONS. In this paper, the methods included in these codes are overviewed. The simulation results are presented and compared. Finally, from this study, the advantages and disadvantages of each method are discussed

  8. SPACE CHARGE SIMULATION METHODS INCORPORATED IN SOME MULTI - PARTICLE TRACKING CODES AND THEIR RESULTS COMPARISON.

    Energy Technology Data Exchange (ETDEWEB)

    BEEBE - WANG,J.; LUCCIO,A.U.; D IMPERIO,N.; MACHIDA,S.

    2002-06-03

    Space charge in high intensity beams is an important issue in accelerator physics. Due to the complexity of the problems, the most effective way of investigating its effect is by computer simulations. In recent years, many space charge simulation methods have been developed and incorporated in various 2D or 3D multi-particle tracking codes. It has become necessary to benchmark these methods against each other, and against experimental results. As part of a global effort, we present our initial comparison of the space charge methods incorporated in the simulation codes ORBIT++, ORBIT and SIMPSONS. In this paper, the methods included in these codes are overviewed. The simulation results are presented and compared. Finally, from this study, the advantages and disadvantages of each method are discussed.

  9. Analytical method and result of radiation exposure for depressurization accident of HTTR

    International Nuclear Information System (INIS)

    Sawa, K.; Shiozawa, S.; Mikami, H.

    1990-01-01

    The Japan Atomic Energy Research Institute (JAERI) is now proceeding with the construction design of the High Temperature Engineering Test Reactor (HTTR). Since the HTTR has some characteristics different from LWRs, the analytical methods of radiation exposure in accidents provided for LWRs cannot be applied directly. This paper describes the analytical method of radiation exposure developed by JAERI for the depressurization accident, which is the severest accident with respect to radiation exposure among the design basis accidents of the HTTR. The result is also described in this paper

  10. Propulsion and launching analysis of variable-mass rockets by analytical methods

    OpenAIRE

    D.D. Ganji; M. Gorji; M. Hatami; A. Hasanpour; N. Khademzadeh

    2013-01-01

    In this study, applications of some analytical methods to the nonlinear equation of the launching of a rocket with variable mass are investigated. The differential transformation method (DTM), homotopy perturbation method (HPM) and least square method (LSM) were applied and their results are compared with the numerical solution. An excellent agreement between the analytical methods and the numerical one is observed in the results, which reveals that the analytical methods are effective and convenient. Also a paramet...
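    For this particular problem the simplest analytical benchmark is easy to state: with a constant burn rate and constant exhaust speed, v(t) = u·ln(m0/m(t)) − g·t. The sketch below (with hypothetical parameters, and neglecting drag) integrates the variable-mass equation of motion numerically and checks it against that closed form; it is an illustration of the comparison idea, not the DTM/HPM/LSM solutions of the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical rocket parameters: initial mass, burn rate, exhaust speed, gravity.
m0, burn_rate, u, g = 500.0, 5.0, 2000.0, 9.81          # kg, kg/s, m/s, m/s^2

def dvdt(t, v):
    m = m0 - burn_rate * t                               # variable mass
    return u * burn_rate / m - g                         # thrust acceleration minus gravity

t_eval = np.linspace(0.0, 60.0, 200)
numeric = solve_ivp(dvdt, (0.0, 60.0), [0.0], t_eval=t_eval, rtol=1e-9).y[0]
analytic = u * np.log(m0 / (m0 - burn_rate * t_eval)) - g * t_eval

print(np.max(np.abs(numeric - analytic)))                # the two solutions should agree closely
```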

  11. Novel personalized pathway-based metabolomics models reveal key metabolic pathways for breast cancer diagnosis

    DEFF Research Database (Denmark)

    Huang, Sijia; Chong, Nicole; Lewis, Nathan

    2016-01-01

    diagnosis. We applied this method to predict breast cancer occurrence, in combination with correlation feature selection (CFS) and classification methods. Results: The resulting all-stage and early-stage diagnosis models are highly accurate in two sets of testing blood samples, with average AUCs (Area Under.......993. Moreover, important metabolic pathways, such as taurine and hypotaurine metabolism and the alanine, aspartate, and glutamate pathway, are revealed as critical biological pathways for early diagnosis of breast cancer. Conclusions: We have successfully developed a new type of pathway-based model to study...... metabolomics data for disease diagnosis. Applying this method to blood-based breast cancer metabolomics data, we have discovered crucial metabolic pathway signatures for breast cancer diagnosis, especially early diagnosis. Further, this modeling approach may be generalized to other omics data types for disease...

  12. Qualitative and quantitative methods for human factor analysis and assessment in NPP. Investigations and results

    International Nuclear Information System (INIS)

    Hristova, R.; Kalchev, B.; Atanasov, D.

    2005-01-01

    We consider here two basic groups of methods for analysis and assessment of the human factor in the NPP area, and give some results from performed analyses as well. The human factor means the human interaction with the design equipment and with the working environment, taking into account human capabilities and limits. Within the frame of the qualitative methods for analysis of the human factor, concepts and structural methods for classifying the information connected with the human factor are considered. Emphasis is given to the HPES method for human factor analysis in NPP. Methods for quantitative assessment of human reliability are also considered. These methods allow assigning probabilities to the elements of the already structured information about human performance. This part includes an overview of classical methods for human reliability assessment (HRA, THERP), and of methods taking into account specific information about human capabilities and limits and about the man-machine interface (CHR, HEART, ATHEANA). Quantitative and qualitative results concerning the influence of the human factor on the occurrence of initiating events at the Kozloduy NPP are presented. (authors)

  13. CT-guided percutaneous neurolysis methods. State of the art and first results

    International Nuclear Information System (INIS)

    Schneider, B.; Richter, G.M.; Roeren, T.; Kauffmann, G.W.

    1996-01-01

    We used 21G or 22G fine needles. All CT-guided percutaneous neurolysis methods require proper blood coagulation. Most common CT scanners are suitable for neurolysis if there is enough room for maintaining sterile conditions. All neurolysis methods involve sterile puncture of the ganglia under local anesthesia, a test block with anesthetic and contrast agent to assess the clinical effect, and the definitive block with a mixture of 96% ethanol and local anesthetic. This allows us to correct the position of the needle if we see improper distribution of the test block or unwanted side effects. Though inflammatory complications of the peritoneum due to puncture are rarely seen, we prefer the dorsal approach whenever possible. Results: Seven of 20 legs showed at least transient clinical improvement after CT-guided lumbar sympathectomies; 13 legs had to be amputated. Results of the methods in the literature differ. For lumbar sympathectomy, improved perfusion is reported in 39-89%, depending on the pre-selection of the patient group. Discussion: It was recently proved that sympathectomy improves perfusion not only of the skin but also of the muscle. The hypothesis of a steal effect of sympathectomy on skin perfusion was disproved. Modern aggressive surgical and interventional treatment often leaves for sympathectomy only those patients whose reserves of collateralization are nearly exhausted. We presume this is the reason for the different results we found in our patient group. For thoracic sympathectomy the clinical result depends very much on the indication. Whereas palmar hyperhidrosis offers nearly 100% success, only 60-70% of patients with disturbance of perfusion have benefited. Results in celiac ganglia block also differ. Patients with carcinoma of the pancreas and other organs of the upper abdomen benefit in 80-100% of all cases, patients with chronic pancreatitis in 60-80%. (orig./VHE)

  14. THE RESULTS OF THE ANALYSIS OF THE STUDENTS’ BODY COMPOSITION BY BIOIMPEDANCE METHOD

    Directory of Open Access Journals (Sweden)

    Dmitry S. Blinov

    2016-06-01

    Full Text Available Introduction. Tissues of the human body can conduct electricity. Liquid media (water, blood, the contents of hollow organs) have a low impedance, i.e. they are good conductors, while the resistance of denser tissues (muscle, nerves, etc.) is significantly higher. Fat and bone tissues have the highest impedance. Bioimpedancemetry is a method which allows determination of the composition of the human body by measuring the electrical resistance (impedance) of its tissues. Relevance. This technique is indispensable to dieticians and fitness trainers. In addition, the results of the study can provide invaluable assistance in the prescription of effective treatment by physicians, gynecologists, orthopedists and other specialists. The bioimpedance method helps to determine the risks of developing type 2 diabetes, atherosclerosis, hypertension, diseases of the musculoskeletal system, disorders of the endocrine system, gall-stone disease, etc. Materials and Methods. The list of body composition parameters assessed by the bioimpedance analysis method included absolute and relative indicators. Depending on the measurement method, the absolute indicators were determined for the whole body. The absolute indicators were: fat and lean body mass, active cell and skeletal muscle mass, total body water, cellular and extracellular fluid. Along with them, relative indicators of body composition (normalized to body weight, lean mass or other variables) were calculated. Results. The comparison of the anthropometric and bioimpedance methods found that height, vital capacity, weight, waist circumference, waist and hip circumference, basal metabolism, body fat mass normalized to height, lean mass, and percentage of skeletal muscle mass in boys and girls with normal and excessive body weight had statistically significant differences. Discussion and Conclusions. In the present study physical development with consideration of body composition in students

  15. Tank 48H Waste Composition and Results of Investigation of Analytical Methods

    Energy Technology Data Exchange (ETDEWEB)

    Walker , D.D. [Westinghouse Savannah River Company, AIKEN, SC (United States)

    1997-04-02

    This report serves two purposes. First, it documents the analytical results of Tank 48H samples taken between April and August 1996. Second, it describes investigations of the precision of the sampling and analytical methods used on the Tank 48H samples.

  16. Tensile strength of concrete under static and intermediate strain rates: Correlated results from different testing methods

    International Nuclear Information System (INIS)

    Wu Shengxing; Chen Xudong; Zhou Jikai

    2012-01-01

    Highlights: ► Tensile strength of concrete increases with increase in strain rate. ► Strain rate sensitivity of tensile strength of concrete depends on test method. ► High stressed volume method can correlate results from various test methods. - Abstract: This paper presents a comparative experiment and analysis of three different methods (direct tension, splitting tension and four-point loading flexural tests) for determination of the tensile strength of concrete under low and intermediate strain rates. In addition, the objective of this investigation is to analyze the suitability of the high stressed volume approach and Weibull effective volume method to the correlation of the results of different tensile tests of concrete. The test results show that the strain rate sensitivity of tensile strength depends on the type of test, splitting tensile strength of concrete is more sensitive to an increase in the strain rate than flexural and direct tensile strength. The high stressed volume method could be used to obtain a tensile strength value of concrete, free from the influence of the characteristics of tests and specimens. However, the Weibull effective volume method is an inadequate method for describing failure of concrete specimens determined by different testing methods.

  17. Development of methods to measure hemoglobin adducts by gel electrophoresis - Preliminary results

    International Nuclear Information System (INIS)

    Sun, J.D.; McBride, S.M.

    1988-01-01

    Chemical adducts formed on blood hemoglobin may be a useful biomarker for assessing human exposures to these compounds. This paper reports preliminary results in the development of methods to measure such adducts that may be generally applicable for a wide variety of chemicals. Male F344/N rats were intraperitoneally injected with 14C-BaP dissolved in corn oil. Twenty-four hours later, the rats were sacrificed. Blood samples were collected and globin was isolated. Globin protein was then cleaved into peptide fragments using cyanogen bromide and the fragments separated using 2-dimensional gel electrophoresis. The results showed that the adducted 14C-globin fragments migrated to different areas of the gel than did unadducted fragments. Further research is being conducted to develop methods that will allow quantitation of separated adducted globin fragments from human blood samples without the use of a radiolabel. (author)

  18. Performance of various mathematical methods for calculation of radioimmunoassay results

    International Nuclear Information System (INIS)

    Sandel, P.; Vogt, W.

    1977-01-01

    Interpolation and regression methods are available for the computer-aided determination of radioimmunological end results. We compared the performance of eight algorithms (weighted and unweighted linear logit-log regression, quadratic logit-log regression, Rodbard's logistic model in the weighted and unweighted form, smoothing spline interpolation with a large and a small smoothing factor, and polygonal interpolation) on the basis of three radioimmunoassays with different reference curve characteristics (digoxin, estriol, human chorionic somatomammotropin = HCS). Great store was set by the accuracy of the approximation at the intermediate points on the curve, i.e. those points that lie midway between two standard concentrations. These concentrations were obtained by weighing and inserted as unknown samples. In the case of digoxin and estriol the polygonal interpolation provided the best results, while the weighted logit-log regression proved superior in the case of HCS. (orig.)
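    As an illustration of the simplest of the listed approaches, the sketch below fits an unweighted linear logit-log reference curve and inverts it to read off an unknown concentration; the standards and bound fractions are hypothetical.

```python
import numpy as np

def fit_logit_log(conc, bound_fraction):
    """Unweighted logit-log fit of a reference curve: logit(B/B0) = a + b * ln(concentration)."""
    y = np.log(bound_fraction / (1.0 - bound_fraction))
    slope, intercept = np.polyfit(np.log(conc), y, 1)
    return intercept, slope

def concentration(intercept, slope, bound_fraction):
    """Invert the fitted curve to obtain the concentration of an unknown sample."""
    y = np.log(bound_fraction / (1.0 - bound_fraction))
    return float(np.exp((y - intercept) / slope))

# Hypothetical standards (concentration units) and normalized bound fractions B/B0.
standards = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
b_over_b0 = np.array([0.82, 0.70, 0.55, 0.40, 0.27])
a, b = fit_logit_log(standards, b_over_b0)
print(concentration(a, b, 0.47))
```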

  19. Influence of Meibomian Gland Expression Methods on Human Lipid Analysis Results.

    Science.gov (United States)

    Kunnen, Carolina M E; Brown, Simon H J; Lazon de la Jara, Percy; Holden, Brien A; Blanksby, Stephen J; Mitchell, Todd W; Papas, Eric B

    2016-01-01

    To compare the lipid composition of human meibum across three different meibum expression techniques. Meibum was collected from five healthy non-contact lens wearers (aged 20-35 years) after cleaning the eyelid margin using three meibum expression methods: cotton buds (CB), meibomian gland evaluator (MGE) and meibomian gland forceps (MGF). Meibum was also collected using cotton buds without cleaning the eyelid margin (CBn). Lipids were analyzed by chip-based, nano-electrospray mass spectrometry (ESI-MS). Comparisons were made using linear mixed models. Tandem MS enabled identification and quantification of over 200 lipid species across ten lipid classes. There were significant differences between collection techniques in the relative quantities of polar lipids obtained (P<.05). The MGE method returned smaller polar lipid quantities than the CB approaches. No significant differences were found between techniques for nonpolar lipids. No significant differences were found between cleaned and non-cleaned eyelids for polar or nonpolar lipids. Meibum expression technique influences the relative amount of phospholipids in the resulting sample. The highest amounts of phospholipids were detected with the CB approaches and the lowest with the MGE technique. Cleaning the eyelid margin prior to expression was not found to affect the lipid composition of the sample. This may be a consequence of the more forceful expression resulting in cell membrane contamination or higher risk of tear lipid contamination as a result of reflex tearing. Copyright © 2016 Elsevier Inc. All rights reserved.

  20. Radiologically revealed spine osteoporosis in male with hypertension and coronary heart disease

    Directory of Open Access Journals (Sweden)

    P A Chizhov

    2005-01-01

    Full Text Available Objective. To study the prevalence and intensity of spine osteoporosis (OP) in men suffering from hypertension (H) and coronary heart disease (CHD). Material and methods. 101 men with H and CHD aged 50 to 78 years (mean age 60.6±0.85 years) and 37 men of a control group without cardiovascular diseases aged 50-66 years (mean age 58.6±0.74 years) were examined. Clinical examination, radiological and radiomorphometric spine examination, and echocardioscopy were performed. Results. OP was revealed in 34.65% of the main group patients, which is 3.2 times more frequent than in the control group (10.8%, p<0.05). OP intensity in men with H and CHD was significantly higher than in healthy people. Vertebral fractures were revealed in 12.87±3.3% of the main group patients and only in 2.7±2.7% of the control group (p<0.05). A dependence of OP development on cardiac history duration and cardiac pathology severity was demonstrated. Conclusion. The results of the study show a significantly higher prevalence of spine OP among men suffering from H and CHD. A long history and severe clinical signs of cardiovascular pathology promote an increase in OP frequency and severity.

  1. Revealing metabolite biomarkers for acupuncture treatment by linear programming based feature selection.

    Science.gov (United States)

    Wang, Yong; Wu, Qiao-Feng; Chen, Chen; Wu, Ling-Yun; Yan, Xian-Zhong; Yu, Shu-Guang; Zhang, Xiang-Sun; Liang, Fan-Rong

    2012-01-01

    Acupuncture has been practiced in China for thousands of years as part of Traditional Chinese Medicine (TCM) and has gradually been accepted in western countries as an alternative or complementary treatment. However, the underlying mechanism of acupuncture, and especially whether there exists any difference between various acupoints, remains largely unknown, which hinders its widespread use. In this study, we develop a novel Linear Programming based Feature Selection method (LPFS) to understand the mechanism of the acupuncture effect at the molecular level by revealing the metabolite biomarkers of acupuncture treatment. Specifically, we generate and investigate high-throughput metabolic profiles of acupuncture treatment at several acupoints in humans. To select the subsets of metabolites that best characterize the acupuncture effect for each meridian point, an optimization model is proposed to identify biomarkers from high-dimensional metabolic data from case and control samples. Importantly, we use the nearest centroid as the prototype to simultaneously minimize the number of selected features and the leave-one-out cross-validation error of the classifier. We compared the performance of LPFS to several state-of-the-art methods, such as SVM recursive feature elimination (SVM-RFE) and the sparse multinomial logistic regression approach (SMLR). We find that our LPFS method tends to reveal a small set of metabolites with small standard deviation and large shifts, which exactly serves our requirement for a good biomarker. Biologically, several metabolite biomarkers for acupuncture treatment are revealed and serve as candidates for further mechanism investigation. Biomarkers derived from five meridian points, Zusanli (ST36), Liangmen (ST21), Juliao (ST3), Yanglingquan (GB34), and Weizhong (BL40), are also compared for their similarity and difference, which provides evidence for the specificity of acupoints. Our result demonstrates that metabolic profiling might be a promising method to

  2. Improved Method of Detection Falsification Results the Digital Image in Conditions of Attacks

    Directory of Open Access Journals (Sweden)

    Kobozeva A.A.

    2016-08-01

    Full Text Available The modern level of information technology development has made unauthorized modification of digital content unprecedentedly easy. At the moment, a very important issue is the effective expert examination of the authenticity of digital images, video and audio, and the development of methods for identification and localization of violations of their integrity when such content is used for purposes other than entertainment. The present paper deals with the improvement of a method for detecting the results of cloning in digital images - one of the falsification tools most frequently used and implemented in all modern graphics editors. The method is intended for detection of the clone area and the pre-image area under additional disturbing influences applied to the image after the cloning operation to "mask" the results, which complicates the search process. The improvement is aimed at reducing the number of "false alarms", i.e. cases when a clone/pre-image area is detected in an original image or when the localization of the identified areas does not correspond to the real clone and pre-image. The proposed improvement, based on the analysis of per-pixel image blocks of different sizes with the least difference from each other, has made efficient functioning of the method possible regardless of the specificity of the analyzed digital image.
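    The clone and its pre-image are, by construction, near-identical blocks at distant positions; a toy block-matching detector built on that observation is sketched below (a generic illustration, not the improved method of the paper). The block size, quantization step and minimum offset are assumed parameters.

```python
import numpy as np

def detect_clones(image, block=8, quant=16, min_offset=10):
    """Toy block-matching detector for copy-move (clone) forgery: overlapping blocks are
    coarsely quantized (a crude robustness to added noise), and identical signatures at
    sufficiently distant positions are reported as clone/pre-image candidates."""
    h, w = image.shape
    seen = {}
    matches = []
    for y in range(h - block + 1):
        for x in range(w - block + 1):
            patch = image[y:y + block, x:x + block]
            key = tuple((patch // quant).ravel())        # quantized block signature
            if key in seen:
                yy, xx = seen[key]
                if abs(y - yy) + abs(x - xx) >= min_offset:
                    matches.append(((yy, xx), (y, x)))
            else:
                seen[key] = (y, x)
    return matches

# Hypothetical usage: a synthetic grayscale image with one cloned 16x16 region.
rng = np.random.default_rng(1)
img = rng.integers(0, 256, size=(64, 64))
img[40:56, 40:56] = img[5:21, 5:21]                      # clone a region elsewhere in the image
print(len(detect_clones(img)))
```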

  3. Comparison of estimation methods for fitting weibull distribution to ...

    African Journals Online (AJOL)

    Comparison of estimation methods for fitting the Weibull distribution to the natural stand of Oluwa Forest Reserve, Ondo State, Nigeria. ... Journal of Research in Forestry, Wildlife and Environment ... The result revealed that the maximum likelihood method was more accurate in fitting the Weibull distribution to the natural stand.
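    For context, the sketch below contrasts a maximum-likelihood fit of a two-parameter Weibull distribution with a simple percentile-based estimate of the shape parameter; the diameter data are simulated stand-ins, and the percentile estimator is only one of several alternatives such a comparison might include.

```python
import numpy as np
from scipy import stats

# Simulated diameter data (cm) standing in for measurements from a natural stand.
rng = np.random.default_rng(42)
diameters = stats.weibull_min.rvs(c=2.3, scale=25.0, size=200, random_state=rng)

# Maximum likelihood fit with the location parameter fixed at zero.
shape_mle, loc, scale_mle = stats.weibull_min.fit(diameters, floc=0)

# Percentile-based shape estimate from two empirical quantiles of the data.
p1, p2 = 0.31, 0.63
q1, q2 = np.quantile(diameters, [p1, p2])
shape_pct = np.log(np.log(1.0 - p2) / np.log(1.0 - p1)) / np.log(q2 / q1)

print(shape_mle, shape_pct)
```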

  4. Solving the discrete KdV equation with homotopy analysis method

    International Nuclear Information System (INIS)

    Zou, L.; Zong, Z.; Wang, Z.; He, L.

    2007-01-01

    In this Letter, we apply the homotopy analysis method to differential-difference equations. We take the discrete KdV equation as an example, and successfully obtain double periodic wave solutions and solitary wave solutions. This illustrates the validity and the great potential of the homotopy analysis method in solving the discrete KdV equation. Comparisons are made between the results of the proposed method and exact solutions. The results reveal that the proposed method is very effective and convenient

  5. In silico and experimental methods revealed highly diverse bacteria with quorum sensing and aromatics biodegradation systems--a potential broad application on bioremediation.

    Science.gov (United States)

    Huang, Yili; Zeng, Yanhua; Yu, Zhiliang; Zhang, Jing; Feng, Hao; Lin, Xiuchun

    2013-11-01

    Phylogenetic overlaps between aromatics-degrading bacteria and acyl-homoserine-lactone (AHL) or autoinducer (AI) based quorum-sensing (QS) bacteria are evident in the literature; however, the diversity of bacteria with both activities had never been finely described. In silico searching in the NCBI genome database revealed that more than 11% of the investigated population harbored both an aromatic ring-hydroxylating-dioxygenase (RHD) gene and an AHL/AI-synthetase gene. These bacteria were distributed in 10 orders, 15 families, 42 genera and 78 species. Horizontal transfers of both genes were common among them. Using an enrichment and culture dependent method, 6 Sphingomonadales and 4 Rhizobiales with phenanthrene- or pyrene-degrading ability and AHL production were isolated from marine, wetland and soil samples. Thin-layer chromatography and gas chromatography-mass spectrometry revealed that these Sphingomonads produced various AHL molecules. This is the first report of highly diverse bacteria that harbor both aromatics-degrading and QS systems. QS regulation may have broad impacts on aromatics biodegradation, and would be a new angle for developing bioremediation technology. Copyright © 2013 Elsevier Ltd. All rights reserved.

  6. Integrating Quantitative and Qualitative Results in Health Science Mixed Methods Research Through Joint Displays

    Science.gov (United States)

    Guetterman, Timothy C.; Fetters, Michael D.; Creswell, John W.

    2015-01-01

    PURPOSE Mixed methods research is becoming an important methodology to investigate complex health-related topics, yet the meaningful integration of qualitative and quantitative data remains elusive and needs further development. A promising innovation to facilitate integration is the use of visual joint displays that bring data together visually to draw out new insights. The purpose of this study was to identify exemplar joint displays by analyzing the various types of joint displays being used in published articles. METHODS We searched for empirical articles that included joint displays in 3 journals that publish state-of-the-art mixed methods research. We analyzed each of 19 identified joint displays to extract the type of display, mixed methods design, purpose, rationale, qualitative and quantitative data sources, integration approaches, and analytic strategies. Our analysis focused on what each display communicated and its representation of mixed methods analysis. RESULTS The most prevalent types of joint displays were statistics-by-themes and side-by-side comparisons. Innovative joint displays connected findings to theoretical frameworks or recommendations. Researchers used joint displays for convergent, explanatory sequential, exploratory sequential, and intervention designs. We identified exemplars for each of these designs by analyzing the inferences gained through using the joint display. Exemplars represented mixed methods integration, presented integrated results, and yielded new insights. CONCLUSIONS Joint displays appear to provide a structure to discuss the integrated analysis and assist both researchers and readers in understanding how mixed methods provides new insights. We encourage researchers to use joint displays to integrate and represent mixed methods analysis and discuss their value. PMID:26553895

  7. MR diffusion histology and micro-tractography reveal mesoscale features of the human cerebellum.

    Science.gov (United States)

    Dell'Acqua, Flavio; Bodi, Istvan; Slater, David; Catani, Marco; Modo, Michel

    2013-12-01

    After 140 years from the discovery of Golgi's black reaction, the study of connectivity of the cerebellum remains a fascinating yet challenging task. Current histological techniques provide powerful methods for unravelling local axonal architecture, but the relatively low volume of data that can be acquired in a reasonable amount of time limits their application to small samples. State-of-the-art in vivo magnetic resonance imaging (MRI) methods, such as diffusion tractography techniques, can reveal trajectories of the major white matter pathways, but their correspondence with underlying anatomy is yet to be established. Hence, a significant gap exists between these two approaches as neither of them can adequately describe the three-dimensional complexity of fibre architecture at the level of the mesoscale (from a few millimetres to micrometres). In this study, we report the application of MR diffusion histology and micro-tractography methods to reveal the combined cytoarchitectural organisation and connectivity of the human cerebellum at a resolution of 100-μm (2 nl/voxel volume). Results show that the diffusion characteristics for each layer of the cerebellar cortex correctly reflect the known cellular composition and its architectural pattern. Micro-tractography also reveals details of the axonal connectivity of individual cerebellar folia and the intra-cortical organisation of the different cerebellar layers. The direct correspondence between MR diffusion histology and micro-tractography with immunohistochemistry indicates that these approaches have the potential to complement traditional histology techniques by providing a non-destructive, quantitative and three-dimensional description of the microstructural organisation of the healthy and pathological tissue.

  8. Application of Statistical Methods to Activation Analytical Results near the Limit of Detection

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Wanscher, B.

    1978-01-01

    Reporting actual numbers instead of upper limits for analytical results at or below the detection limit may produce reliable data when these numbers are subjected to appropriate statistical processing. Particularly in radiometric methods, such as activation analysis, where individual standard deviations of analytical results may be estimated, improved discrimination may be based on the Analysis of Precision. Actual experimental results from a study of the concentrations of arsenic in human skin demonstrate the power of this principle.
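
    The statistical idea behind such processing can be sketched as a consistency (chi-square) test of replicate results against their individually estimated standard deviations. The snippet below is a generic formulation of that idea, not Heydorn's exact Analysis of Precision notation, and the numbers are invented.

```python
# Consistency check for results that each carry their own a-priori standard
# deviation: compare the observed scatter around the weighted mean with the
# stated uncertainties via a chi-square statistic.
import numpy as np
from scipy import stats

def analysis_of_precision(values, sigmas):
    values, sigmas = np.asarray(values, float), np.asarray(sigmas, float)
    w = 1.0 / sigmas**2
    mean_w = np.sum(w * values) / np.sum(w)          # weighted mean
    T = np.sum(((values - mean_w) / sigmas) ** 2)    # test statistic
    dof = len(values) - 1
    p = stats.chi2.sf(T, dof)                        # P(chi-square >= T)
    return mean_w, T, dof, p

# Example: replicate results near the detection limit (arbitrary units),
# some negative, together with their estimated standard deviations.
vals   = [0.8, -0.3, 1.1, 0.4, 0.2]
sigmas = [0.5,  0.6, 0.5, 0.4, 0.5]
m, T, dof, p = analysis_of_precision(vals, sigmas)
print(f"weighted mean = {m:.2f}, T = {T:.2f} on {dof} dof, p = {p:.2f}")
```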

  9. Introduction of e-learning in dental radiology reveals significantly improved results in final examination.

    Science.gov (United States)

    Meckfessel, Sandra; Stühmer, Constantin; Bormann, Kai-Hendrik; Kupka, Thomas; Behrends, Marianne; Matthies, Herbert; Vaske, Bernhard; Stiesch, Meike; Gellrich, Nils-Claudius; Rücker, Martin

    2011-01-01

    Because a traditionally instructed dental radiology lecture course is very time-consuming and labour-intensive, online courseware, including an interactive-learning module, was implemented to support the lectures. The purpose of this study was to evaluate the perceptions of students who have worked with web-based courseware as well as the effect on their results in final examinations. Users (n(3+4)=138) had access to the e-program from any networked computer at any time. Two groups (n(3)=71, n(4)=67) had to pass a final exam after using the e-course. Results were compared with two groups (n(1)=42, n(2)=48) who had studied the same content by attending traditional lectures. In addition a survey of the students was statistically evaluated. Most of the respondents reported a positive attitude towards e-learning and would have appreciated more access to computer-assisted instruction. Two years after initiating the e-course the failure rate in the final examination dropped significantly, from 40% to less than 2%. The very positive response to the e-program and improved test scores demonstrated the effectiveness of our e-course as a learning aid. Interactive modules in step with clinical practice provided learning that is not achieved by traditional teaching methods alone. To what extent staff savings are possible is part of a further study. Copyright © 2010 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  10. Accuracy of the hypothetical sky-polarimetric Viking navigation versus sky conditions: revealing solar elevations and cloudinesses favourable for this navigation method

    Science.gov (United States)

    Száz, Dénes; Farkas, Alexandra; Barta, András; Kretzer, Balázs; Blahó, Miklós; Egri, Ádám; Szabó, Gyula; Horváth, Gábor

    2017-09-01

    According to Thorkild Ramskou's theory proposed in 1967, under overcast and foggy skies, Viking seafarers might have used skylight polarization analysed with special crystals called sunstones to determine the position of the invisible Sun. After finding the occluded Sun with sunstones, its elevation angle had to be measured and its shadow had to be projected onto the horizontal surface of a sun compass. According to Ramskou's theory, these sunstones might have been birefringent calcite or dichroic cordierite or tourmaline crystals working as polarizers. It has frequently been claimed that this method might have been suitable for navigation even in cloudy weather. This hypothesis has been accepted and frequently cited for decades without any experimental support. In this work, we determined the accuracy of this hypothetical sky-polarimetric Viking navigation for 1080 different sky situations characterized by solar elevation θ and cloudiness ρ, the sky polarization patterns of which were measured by full-sky imaging polarimetry. We used the earlier measured uncertainty functions of the navigation steps 1, 2 and 3 for calcite, cordierite and tourmaline sunstone crystals, respectively, and the newly measured uncertainty function of step 4 presented here. As a result, we revealed the meteorological conditions under which Vikings could have used this hypothetical navigation method. We determined the solar elevations at which the navigation uncertainties are minimal at summer solstice and spring equinox for all three sunstone types. On average, calcite sunstone ensures a more accurate sky-polarimetric navigation than tourmaline and cordierite. However, in some special cases (generally at 35° ≤ θ ≤ 40°, 1 okta ≤ ρ ≤ 6 oktas for summer solstice, and at 20° ≤ θ ≤ 25°, 0 okta ≤ ρ ≤ 4 oktas for spring equinox), the use of tourmaline and cordierite results in smaller navigation uncertainties than that of calcite. Generally, under clear or less cloudy

  11. Personality psychology: lexical approaches, assessment methods, and trait concepts reveal only half of the story--why it is time for a paradigm shift.

    Science.gov (United States)

    Uher, Jana

    2013-03-01

    This article develops a comprehensive philosophy-of-science for personality psychology that goes far beyond the scope of the lexical approaches, assessment methods, and trait concepts that currently prevail. One of the field's most important guiding scientific assumptions, the lexical hypothesis, is analysed from meta-theoretical viewpoints to reveal that it explicitly describes two sets of phenomena that must be clearly differentiated: 1) lexical repertoires and the representations that they encode and 2) the kinds of phenomena that are represented. Thus far, personality psychologists largely explored only the former, but have seriously neglected studying the latter. Meta-theoretical analyses of these different kinds of phenomena and their distinct natures, commonalities, differences, and interrelations reveal that personality psychology's focus on lexical approaches, assessment methods, and trait concepts entails a) erroneous meta-theoretical assumptions about what the phenomena being studied actually are, and thus how they can be analysed and interpreted, b) that contemporary personality psychology is largely based on everyday psychological knowledge, and c) a fundamental circularity in the scientific explanations used in trait psychology. These findings seriously challenge the widespread assumptions about the causal and universal status of the phenomena described by prominent personality models. The current state of knowledge about the lexical hypothesis is reviewed, and implications for personality psychology are discussed. Ten desiderata for future research are outlined to overcome the current paradigmatic fixations that are substantially hampering intellectual innovation and progress in the field.

  12. Chemical abundances of fast-rotating massive stars. I. Description of the methods and individual results

    Science.gov (United States)

    Cazorla, Constantin; Morel, Thierry; Nazé, Yaël; Rauw, Gregor; Semaan, Thierry; Daflon, Simone; Oey, M. S.

    2017-07-01

    Aims: Recent observations have challenged our understanding of rotational mixing in massive stars by revealing a population of fast-rotating objects with apparently normal surface nitrogen abundances. However, several questions have arisen because of a number of issues, which have rendered a reinvestigation necessary; these issues include the presence of numerous upper limits for the nitrogen abundance, unknown multiplicity status, and a mix of stars with different physical properties, such as their mass and evolutionary state, which are known to control the amount of rotational mixing. Methods: We have carefully selected a large sample of bright, fast-rotating early-type stars of our Galaxy (40 objects with spectral types between B0.5 and O4). Their high-quality, high-resolution optical spectra were then analysed with the stellar atmosphere modelling codes DETAIL/SURFACE or CMFGEN, depending on the temperature of the target. Several internal and external checks were performed to validate our methods; notably, we compared our results with literature data for some well-known objects, studied the effect of gravity darkening, or confronted the results provided by the two codes for stars amenable to both analyses. Furthermore, we studied the radial velocities of the stars to assess their binarity. Results: This first part of our study presents our methods and provides the derived stellar parameters, He, CNO abundances, and the multiplicity status of every star of the sample. It is the first time that He and CNO abundances of such a large number of Galactic massive fast rotators are determined in a homogeneous way. Based on observations obtained with the Heidelberg Extended Range Optical Spectrograph (HEROS) at the Telescopio Internacional de Guanajuato (TIGRE) with the SOPHIE échelle spectrograph at the Haute-Provence Observatory (OHP; Institut Pytheas; CNRS, France), and with the Magellan Inamori Kyocera Echelle (MIKE) spectrograph at the Magellan II Clay telescope

  13. [Methods in neonatal abstinence syndrome (NAS): results of a nationwide survey in Austria].

    Science.gov (United States)

    Bauchinger, S; Sapetschnig, I; Danda, M; Sommer, C; Resch, B; Urlesberger, B; Raith, W

    2015-08-01

    Neonatal abstinence syndrome (NAS) occurs in neonates whose mothers have taken addictive drugs or were under substitution therapy during pregnancy. Incidence numbers of NAS are on the rise globally, and even in Austria NAS is no longer rare. The aim of our survey was to reveal the status quo of dealing with NAS in Austria. A questionnaire was sent to 20 neonatology departments all over Austria; items included questions on scoring, therapy, breast-feeding and follow-up procedures. The response rate was 95%, and 94.7% of the responding departments had written guidelines concerning NAS. The median number of children treated per year for NAS was 4. The Finnegan scoring system is used in 100% of the responding departments. Morphine is used most often, in opiate abuse (100%) as well as in multiple substance abuse (44.4%). The most frequent forms of morphine preparation are morphine and diluted tincture of opium. Frequency as well as dosage of medication vary broadly. 61.1% of the departments supported breast-feeding; regulations concerned participation in a substitution programme and general contraindications (HIV, HCV, HBV). Our results revealed a large west-east gradient in the number of patients treated per year. NAS is no longer a rare entity in Austria (up to 50 cases per year in Vienna). Our survey showed that most neonatology departments in Austria treat their patients following written guidelines. Although all of them base these guidelines on international recommendations, there is no national consensus. © Georg Thieme Verlag KG Stuttgart · New York.

  14. A homologous mapping method for three-dimensional reconstruction of protein networks reveals disease-associated mutations.

    Science.gov (United States)

    Huang, Sing-Han; Lo, Yu-Shu; Luo, Yong-Chun; Tseng, Yu-Yao; Yang, Jinn-Moon

    2018-03-19

    One of the crucial steps toward understanding the associations among molecular interactions, pathways, and diseases in a cell is to investigate detailed atomic protein-protein interactions (PPIs) in the structural interactome. Despite the availability of large-scale methods for analyzing PPI networks, these methods have often focused on PPI networks built from genome-scale data and/or known experimental PPIs. However, such methods are unable to provide structurally resolved interaction residues and their conservation in PPI networks. Here, we reconstructed a human three-dimensional (3D) structural PPI network (hDiSNet) with detailed atomic binding models and disease-associated mutations by enhancing our PPI families and 3D-domain interologs from 60,618 structural complexes and a complete genome database with 6,352,363 protein sequences across 2274 species. hDiSNet is a scale-free network (γ = 2.05), which consists of 5177 proteins and 19,239 PPIs with 5843 mutations. These 19,239 structurally resolved PPIs not only expanded the number of PPIs compared to the present structural PPI network, but also achieved higher agreement with gene ontology similarities and higher co-expression correlation than the 181,868 experimental PPIs recorded in public databases. Among the 5843 mutations, 1653 and 790 mutations involved in interacting domains and contacting residues, respectively, are highly related to diseases. Our hDiSNet can provide detailed atomic interactions of human diseases and their associated proteins with mutations. Our results show that disease-related mutations are often located at contacting residues forming hydrogen bonds or conserved in the PPI family. In addition, hDiSNet provides insights into the FGFR (EGFR)-MAPK pathway for interpreting the mechanisms of breast cancer and the ErbB signaling pathway in brain cancer. Our results demonstrate that hDiSNet can provide structure-based interaction insights for understanding the mechanisms of disease

  15. Histological Grading of Hepatocellular Carcinomas with Intravoxel Incoherent Motion Diffusion-weighted Imaging: Inconsistent Results Depending on the Fitting Method.

    Science.gov (United States)

    Ichikawa, Shintaro; Motosugi, Utaroh; Hernando, Diego; Morisaka, Hiroyuki; Enomoto, Nobuyuki; Matsuda, Masanori; Onishi, Hiroshi

    2018-04-10

    To compare the abilities of three intravoxel incoherent motion (IVIM) imaging approximation methods to discriminate the histological grade of hepatocellular carcinomas (HCCs). Fifty-eight patients (60 HCCs) underwent IVIM imaging with 11 b-values (0-1000 s/mm²). Slow (D) and fast (D*) diffusion coefficients and the perfusion fraction (f) were calculated for the HCCs using the mean signal intensities in regions of interest drawn by two radiologists. Three approximation methods were used. First, all three parameters were obtained simultaneously using non-linear fitting (method A). Second, D was obtained using linear fitting (b = 500 and 1000), followed by non-linear fitting for D* and f (method B). Third, D was obtained by linear fitting, f was obtained using the regression line intersection and the signal at b = 0, and non-linear fitting was used for D* (method C). A receiver operating characteristic analysis was performed to reveal the abilities of these methods to distinguish poorly-differentiated from well-to-moderately-differentiated HCCs. Inter-reader agreements were assessed using intraclass correlation coefficients (ICCs). The measurements of D, D*, and f in methods B and C (Az-value, 0.658-0.881) had better discrimination abilities than did those in method A (Az-value, 0.527-0.607). The ICCs of D and f were good to excellent (0.639-0.835) with all methods. The ICCs of D* were moderate with methods B (0.580) and C (0.463) and good with method A (0.705). The IVIM parameters may vary depending on the fitting methods, and therefore, further technical refinement may be needed.
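
    A minimal sketch of a segmented IVIM fit in the spirit of method B is given below: D from a linear fit of the log-signal at b = 500 and 1000 s/mm², then f and D* from a non-linear fit of a bi-exponential model. The b-values and signals are synthetic, and the model convention shown is a common one that may differ in detail from the paper's implementation.

```python
# Segmented IVIM fit (method-B style): step 1 estimates D from the two highest
# b-values; step 2 fits f and D* with D held fixed. Synthetic data only.
import numpy as np
from scipy.optimize import curve_fit

b = np.array([0, 10, 20, 40, 80, 150, 300, 500, 700, 850, 1000], float)  # s/mm^2
D_true, Dstar_true, f_true, S0_true = 1.0e-3, 30e-3, 0.15, 1000.0
S = S0_true * (f_true * np.exp(-b * Dstar_true) + (1 - f_true) * np.exp(-b * D_true))
S *= 1 + np.random.default_rng(1).normal(0, 0.005, b.size)               # mild noise

# Step 1: D from b = 500 and 1000 (perfusion assumed negligible at high b).
hi = np.isin(b, [500, 1000])
slope, intercept = np.polyfit(b[hi], np.log(S[hi]), 1)
D = -slope

# Step 2: non-linear fit of f, D* and S0 with D fixed from step 1.
def ivim(b, f, Dstar, S0):
    return S0 * (f * np.exp(-b * Dstar) + (1 - f) * np.exp(-b * D))

(f, Dstar, S0_fit), _ = curve_fit(ivim, b, S, p0=[0.1, 10e-3, S.max()],
                                  bounds=([0, 1e-3, 0], [0.5, 0.5, np.inf]))
print(f"D = {D:.2e}, D* = {Dstar:.2e}, f = {f:.3f}")
```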

  16. Genetic relationships among wild and cultivated accessions of curry leaf plant (Murraya koenigii (L.) Spreng.), as revealed by DNA fingerprinting methods.

    Science.gov (United States)

    Verma, Sushma; Rana, T S

    2013-02-01

    Murraya koenigii (L.) Spreng. (Rutaceae), is an aromatic plant and much valued for its flavor, nutritive and medicinal properties. In this study, three DNA fingerprinting methods viz., random amplification of polymorphic DNA (RAPD), directed amplification of minisatellite DNA (DAMD), and inter-simple sequence repeat (ISSR), were used to unravel the genetic variability and relationships across 92 wild and cultivated M. koenigii accessions. A total of 310, 102, and 184, DNA fragments were amplified using 20 RAPD, 5 DAMD, and 13 ISSR primers, revealing 95.80, 96.07, and 96.73% polymorphism, respectively, across all accessions. The average polymorphic information content value obtained with RAPD, DAMD, and ISSR markers was 0.244, 0.250, and 0.281, respectively. The UPGMA tree, based on Jaccard's similarity coefficient generated from the cumulative (RAPD, DAMD, and ISSR) band data showed two distinct clusters, clearly separating wild and cultivated accessions in the dendrogram. Percentage polymorphism, gene diversity (H), and Shannon information index (I) estimates were higher in cultivated accessions compared to wild accessions. The overall high level of polymorphism and varied range of genetic distances revealed a wide genetic base in M. koenigii accessions. The study suggests that RAPD, DAMD, and ISSR markers are highly useful to unravel the genetic variability in wild and cultivated accessions of M. koenigii.
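
    The marker analysis pipeline described above (binary band data, Jaccard similarity, UPGMA dendrogram) can be sketched with SciPy as follows; the 0/1 matrix is randomly generated as a stand-in for real RAPD/DAMD/ISSR band scores.

```python
# Jaccard distances from binary band-presence data followed by UPGMA
# (average-linkage) clustering, as commonly done for dominant markers.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, dendrogram

rng = np.random.default_rng(7)
n_accessions, n_bands = 12, 120
bands = rng.integers(0, 2, size=(n_accessions, n_bands))      # 1 = band present

jaccard_dist = pdist(bands, metric="jaccard")                  # 1 - Jaccard similarity
upgma = linkage(jaccard_dist, method="average")                # UPGMA = average linkage

tree = dendrogram(upgma, no_plot=True)                         # text output, no plotting
print("leaf order:", tree["ivl"])
```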

  17. Revealing effective classifiers through network comparison

    Science.gov (United States)

    Gallos, Lazaros K.; Fefferman, Nina H.

    2014-11-01

    The ability to compare complex systems can provide new insight into the fundamental nature of the processes captured, in ways that are otherwise inaccessible to observation. Here, we introduce the n-tangle method to directly compare two networks for structural similarity, based on the distribution of edge density in network subgraphs. We demonstrate that this method can efficiently introduce comparative analysis into network science and opens the road for many new applications. For example, we show how the construction of a “phylogenetic tree” across animal taxa according to their social structure can reveal commonalities in the behavioral ecology of the populations, or how students create similar networks according to university size. Our method can be expanded to study many additional properties, such as network classification, changes during time evolution, convergence of growth models, and detection of structural changes during damage.
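
    A rough sketch of the underlying idea - comparing networks through the distribution of edge density in small subgraphs - is given below. The sampling scheme (BFS balls around random seed nodes) and the histogram comparison are simplifying assumptions, not the authors' exact n-tangle estimator.

```python
# Compare two networks by the distribution of edge density in small sampled
# subgraphs; the sampling and comparison choices here are simplifications.
import networkx as nx
import numpy as np

def subgraph_density_profile(G, n=8, samples=500, seed=0):
    rng = np.random.default_rng(seed)
    nodes = list(G.nodes)
    densities = []
    for _ in range(samples):
        start = nodes[rng.integers(len(nodes))]
        ball = list(nx.bfs_tree(G, start, depth_limit=2).nodes)[:n]   # up to n nearby nodes
        if len(ball) < 2:
            continue
        densities.append(nx.density(G.subgraph(ball)))
    return np.histogram(densities, bins=np.linspace(0, 1, 11), density=True)[0]

G1 = nx.erdos_renyi_graph(300, 0.03, seed=1)
G2 = nx.barabasi_albert_graph(300, 4, seed=2)
p1, p2 = subgraph_density_profile(G1), subgraph_density_profile(G2)
print("L1 distance between density profiles:", np.abs(p1 - p2).sum())
```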

  18. Single primer amplification reaction methods reveal exotic and ...

    Indian Academy of Sciences (India)

    Unknown

    mulberry varieties using three different PCR-based single primer amplification ..... the results of a multivariate analysis using Mahalanobis D2 statistic in case of .... Rajan M V, Chaturvedi H K and Sarkar A 1997 Multivariate analysis as an aid ...

  19. Raw material consumption of the European Union--concept, calculation method, and results.

    Science.gov (United States)

    Schoer, Karl; Weinzettel, Jan; Kovanda, Jan; Giegrich, Jürgen; Lauwigi, Christoph

    2012-08-21

    This article presents the concept, calculation method, and first results of the "Raw Material Consumption" (RMC) economy-wide material flow indicator for the European Union (EU). The RMC measures the final domestic consumption of products in terms of raw material equivalents (RME), i.e. raw materials used in the complete production chain of consumed products. We employed the hybrid input-output life cycle assessment method to calculate RMC. We first developed a highly disaggregated environmentally extended mixed unit input output table and then applied life cycle inventory data for imported products without appropriate representation of production within the domestic economy. Lastly, we treated capital formation as intermediate consumption. Our results show that services, often considered as a solution for dematerialization, account for a significant part of EU raw material consumption, which emphasizes the need to focus on the full production chains and dematerialization of services. Comparison of the EU's RMC with its domestic extraction shows that the EU is nearly self-sufficient in biomass and nonmetallic minerals but extremely dependent on direct and indirect imports of fossil energy carriers and metal ores. This implies an export of environmental burden related to extraction and primary processing of these materials to the rest of the world. Our results demonstrate that internalizing capital formation has significant influence on the calculated RMC.
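
    The core of such a raw-material-equivalents calculation is the environmentally extended input-output relation RME = m (I - A)^-1 y. The toy example below uses invented three-sector numbers, not EU data, purely to illustrate the mechanics.

```python
# Toy environmentally extended input-output calculation: raw material embodied
# in final demand = m (I - A)^{-1} y. A, m and y are invented illustrative numbers.
import numpy as np

A = np.array([[0.10, 0.05, 0.02],     # inter-industry requirements per unit output
              [0.20, 0.15, 0.10],
              [0.05, 0.10, 0.05]])
m = np.array([2.0, 0.5, 0.1])         # direct raw-material use per unit output (t/EUR)
y = np.array([100.0, 200.0, 300.0])   # final demand by sector (EUR)

L = np.linalg.inv(np.eye(3) - A)      # Leontief inverse: total output per unit final demand
x = L @ y                             # total output needed to satisfy final demand
rme_total = m @ x                     # raw material equivalents of final consumption
rme_by_sector = m @ L * y             # contribution of each final-demand sector
print(f"total RME = {rme_total:.1f} t")
print("RME by final-demand sector:", np.round(rme_by_sector, 1))
```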

  20. On Calculation Methods and Results for Straight Cylindrical Roller Bearing Deflection, Stiffness, and Stress

    Science.gov (United States)

    Krantz, Timothy L.

    2011-01-01

    The purpose of this study was to assess some calculation methods for quantifying the relationships of bearing geometry, material properties, load, deflection, stiffness, and stress. The scope of the work was limited to two-dimensional modeling of straight cylindrical roller bearings. Preparations for studies of dynamic response of bearings with damaged surfaces motivated this work. Studies were selected to exercise and build confidence in the numerical tools. Three calculation methods were used in this work. Two of the methods were numerical solutions of the Hertz contact approach. The third method used was a combined finite element surface integral method. Example calculations were done for a single roller loaded between an inner and outer raceway for code verification. Next, a bearing with 13 rollers and all-steel construction was used as an example to do additional code verification, including an assessment of the leading order of accuracy of the finite element and surface integral method. Results from that study show that the method is at least first-order accurate. Those results also show that the contact grid refinement has a more significant influence on precision as compared to the finite element grid refinement. To explore the influence of material properties, the 13-roller bearing was modeled as made from Nitinol 60, a material with very different properties from steel and showing some potential for bearing applications. The codes were exercised to compare contact areas and stress levels for steel and Nitinol 60 bearings operating at equivalent power density. As a step toward modeling the dynamic response of bearings having surface damage, static analyses were completed to simulate a bearing with a spall or similar damage.
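
    For orientation, the classical closed-form Hertz line-contact relations (contact half-width and peak pressure) can be evaluated directly. The sketch below uses textbook formulas and illustrative roller dimensions; it is not the paper's numerical Hertz solution or its finite element / surface integral method.

```python
# Classical Hertz line contact for a cylindrical roller on a flat raceway:
# contact half-width b and peak pressure p_max from the load per unit length.
import math

def hertz_line_contact(force_N, length_m, radius_m, E1, nu1, E2, nu2):
    w = force_N / length_m                                  # load per unit length
    E_star = 1.0 / ((1 - nu1**2) / E1 + (1 - nu2**2) / E2)  # effective contact modulus
    b = math.sqrt(4 * w * radius_m / (math.pi * E_star))    # contact half-width
    p_max = 2 * w / (math.pi * b)                           # peak contact pressure
    return b, p_max

# Example: steel roller (R = 8 mm, L = 20 mm) on a steel raceway under 5 kN.
b, p_max = hertz_line_contact(5e3, 0.020, 0.008, 210e9, 0.30, 210e9, 0.30)
print(f"half-width = {b*1e6:.1f} um, p_max = {p_max/1e9:.2f} GPa")
```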

  1. Aircrew Exposure To Cosmic Radiation Evaluated By Means Of Several Methods; Results Obtained In 2006

    International Nuclear Information System (INIS)

    Ploc, Ondrej; Spurny, Frantisek; Jadrnickova, Iva; Turek, Karel

    2008-01-01

    Routine evaluation of aircraft crew exposure to cosmic radiation in the Czech Republic is performed by means of a calculation method. Measurements onboard aircraft serve as a control tool for the routine method, as well as a possibility to compare results obtained by several methods. The following methods were used in 2006: (1) a mobile dosimetry unit (MDU) of type Liulin--a spectrometer of energy deposited in a Si-detector; (2) two types of LET spectrometers based on chemically etched track detectors (TED); (3) two types of thermoluminescent detectors; and (4) two calculation methods. The MDU currently represents one of the most reliable instruments for evaluating aircraft crew exposure to cosmic radiation. It is an active device which measures total energy depositions (Edep) in the semiconductor unit and, after appropriate calibration, is able to give separate estimates for the non-neutron and neutron-like components of H*(10). This contribution consists mostly of results acquired by means of this equipment; measurements with passive detectors and calculations are mentioned for comparison. Reasonably good agreement of all data sets could be stated

  2. Application of a hierarchical enzyme classification method reveals the role of gut microbiome in human metabolism

    Science.gov (United States)

    2015-01-01

    Background Enzymes are known as the molecular machines that drive the metabolism of an organism; hence identification of the full enzyme complement of an organism is essential to build the metabolic blueprint of that species as well as to understand the interplay of multiple species in an ecosystem. Experimental characterization of the enzymatic reactions of all enzymes in a genome is a tedious and expensive task. The problem is more pronounced in the metagenomic samples where even the species are not adequately cultured or characterized. Enzymes encoded by the gut microbiota play an essential role in the host metabolism; thus, warranting the need to accurately identify and annotate the full enzyme complements of species in the genomic and metagenomic projects. To fulfill this need, we develop and apply a method called ECemble, an ensemble approach to identify enzymes and enzyme classes and study the human gut metabolic pathways. Results ECemble method uses an ensemble of machine-learning methods to accurately model and predict enzymes from protein sequences and also identifies the enzyme classes and subclasses at the finest resolution. A tenfold cross-validation result shows accuracy between 97 and 99% at different levels in the hierarchy of enzyme classification, which is superior to comparable methods. We applied ECemble to predict the entire complements of enzymes from ten sequenced proteomes including the human proteome. We also applied this method to predict enzymes encoded by the human gut microbiome from gut metagenomic samples, and to study the role played by the microbe-derived enzymes in the human metabolism. After mapping the known and predicted enzymes to canonical human pathways, we identified 48 pathways that have at least one bacteria-encoded enzyme, which demonstrates the complementary role of gut microbiome in human gut metabolism. These pathways are primarily involved in metabolizing dietary nutrients such as carbohydrates, amino acids, lipids
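
    As a minimal, generic illustration of the "ensemble of machine-learning methods" idea - not the ECemble feature set, model set or hierarchical scheme - the sketch below combines three scikit-learn classifiers by soft voting on amino-acid-composition features computed from random sequences, and evaluates them with tenfold cross-validation.

```python
# Soft-voting ensemble for (fake) top-level enzyme class prediction from
# amino-acid composition; sequences and labels are random placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

AA = "ACDEFGHIKLMNPQRSTVWY"

def aa_composition(seq):
    """Fraction of each amino acid in the sequence (a 20-dimensional feature)."""
    return np.array([seq.count(a) / len(seq) for a in AA])

rng = np.random.default_rng(0)
seqs = ["".join(rng.choice(list(AA), size=200)) for _ in range(300)]
X = np.array([aa_composition(s) for s in seqs])
y = rng.integers(1, 7, size=300)          # placeholder EC top-level classes 1..6

ensemble = VotingClassifier(
    estimators=[("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
                ("lr", LogisticRegression(max_iter=1000)),
                ("svc", SVC(probability=True, random_state=0))],
    voting="soft")
scores = cross_val_score(ensemble, X, y, cv=10)   # tenfold CV, as in the record
print(f"mean CV accuracy = {scores.mean():.2f} (random labels -> chance level)")
```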

  3. Ripple formation in unilamellar-supported lipid bilayer revealed by FRAPP.

    Science.gov (United States)

    Harb, Frédéric; Simon, Anne; Tinland, Bernard

    2013-12-01

    The mechanisms of formation and conditions of the existence of the ripple phase are fundamental thermodynamic questions with practical implications for medicine and pharmaceuticals. We reveal a new case of ripple formation occurring in unilamellar-supported bilayers in water, which results solely from the bilayer/support interaction, without using lipid mixtures or specific ions. This ripple phase is detected by FRAPP using diffusion coefficient measurements as a function of temperature: a diffusivity plateau is observed. It occurs in the same temperature range where ripple phase existence has been observed using other methods. When AFM experiments are performed in the appropriate temperature range the ripple phase is confirmed.

  4. A novel method for RNA extraction from FFPE samples reveals significant differences in biomarker expression between orthotopic and subcutaneous pancreatic cancer patient-derived xenografts.

    Science.gov (United States)

    Hoover, Malachia; Adamian, Yvess; Brown, Mark; Maawy, Ali; Chang, Alexander; Lee, Jacqueline; Gharibi, Armen; Katz, Matthew H; Fleming, Jason; Hoffman, Robert M; Bouvet, Michael; Doebler, Robert; Kelber, Jonathan A

    2017-01-24

    Next-generation sequencing (NGS) can identify and validate new biomarkers of cancer onset, progression and therapy resistance. Substantial archives of formalin-fixed, paraffin-embedded (FFPE) cancer samples from patients represent a rich resource for linking molecular signatures to clinical data. However, performing NGS on FFPE samples is limited by poor RNA purification methods. To address this hurdle, we developed an improved methodology for extracting high-quality RNA from FFPE samples. By briefly integrating a newly-designed micro-homogenizing (mH) tool with commercially available FFPE RNA extraction protocols, RNA recovery is increased by approximately 3-fold while maintaining standard A260/A280 ratios and RNA quality index (RQI) values. Furthermore, we demonstrate that the mH-purified FFPE RNAs are longer and of higher integrity. Previous studies have suggested that pancreatic ductal adenocarcinoma (PDAC) gene expression signatures vary significantly under in vitro versus in vivo and in vivo subcutaneous versus orthotopic conditions. By using our improved mH-based method, we were able to preserve established expression patterns of KRas-dependency genes within these three unique microenvironments. Finally, expression analysis of novel biomarkers in KRas mutant PDAC samples revealed that PEAK1 decreases and MST1R increases by over 100-fold in orthotopic versus subcutaneous microenvironments. Interestingly, however, only PEAK1 levels remain elevated in orthotopically grown KRas wild-type PDAC cells. These results demonstrate the critical nature of the orthotopic tumor microenvironment when evaluating the clinical relevance of new biomarkers in cells or patient-derived samples. Furthermore, this new mH-based FFPE RNA extraction method has the potential to enhance and expand future FFPE-RNA-NGS cancer biomarker studies.

  5. Comparison Of Simulation Results When Using Two Different Methods For Mold Creation In Moldflow Simulation

    Directory of Open Access Journals (Sweden)

    Kaushikbhai C. Parmar

    2017-04-01

    Full Text Available Simulation gives different results when different methods are used for the same model. The Autodesk Moldflow Simulation software provides two different facilities for creating the mold in an injection molding simulation: the mold can be created inside Moldflow, or it can be imported as a CAD file. The aim of this paper is to study the differences in the simulation results (mold temperature, part temperature, deflection in different directions, simulation time and coolant temperature) between these two methods.

  6. MULTICRITERIA METHODS IN PERFORMING COMPANIES’ RESULTS USING ELECTRONIC RECRUITING, CORPORATE COMMUNICATION AND FINANCIAL RATIOS

    Directory of Open Access Journals (Sweden)

    Ivana Bilić

    2011-02-01

    Full Text Available Human resources are one of a company's most important resources and are responsible for creating its competitive advantage. In the search for the most valuable resources, companies use different methods. Lately, one of the growing methods is electronic recruiting, not only as a recruitment tool but also as a means of external communication. Additionally, in the process of corporate communication, companies nowadays use electronic corporate communication as the easiest, cheapest and simplest form of business communication. The aim of this paper is to investigate the relationship between three groups of criteria: the main characteristics of the companies' electronic recruiting, their corporate communication, and selected financial performance indicators. The selected companies were ranked separately by each group of criteria using the multicriteria decision-making method PROMETHEE II. The main idea is to examine whether companies that are the highest performers on one group of criteria obtain similar results on the other groups of criteria.
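
    A minimal numeric sketch of the PROMETHEE II ranking step is given below, using a linear preference function and invented criteria values, weights and thresholds; a real application would use the study's e-recruiting, communication and financial indicators.

```python
# PROMETHEE II sketch: rank alternatives by net outranking flow using a linear
# preference function with threshold p on every criterion. All data invented.
import numpy as np

def promethee_ii(X, weights, p, maximize):
    """X: alternatives x criteria; p: preference thresholds; maximize: bool per criterion."""
    n, k = X.shape
    X = np.where(maximize, X, -X)                 # convert minimization criteria
    phi_plus = np.zeros(n)
    phi_minus = np.zeros(n)
    for a in range(n):
        for b in range(n):
            if a == b:
                continue
            d = X[a] - X[b]
            pref = np.clip(d / p, 0, 1)           # linear preference in [0, 1]
            pi_ab = np.dot(weights, pref)         # weighted preference of a over b
            phi_plus[a] += pi_ab / (n - 1)
            phi_minus[b] += pi_ab / (n - 1)
    return phi_plus - phi_minus                   # net flow; higher = better

X = np.array([[7.0, 3.2, 0.12],                   # e-recruiting score, comm. score, ROA
              [5.5, 4.0, 0.08],
              [8.1, 2.5, 0.05],
              [6.0, 3.8, 0.10]])
weights = np.array([0.4, 0.3, 0.3])
p = np.array([2.0, 1.0, 0.05])
phi = promethee_ii(X, weights, p, maximize=np.array([True, True, True]))
print("ranking (best first):", np.argsort(-phi))
```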

  7. The Trojan Horse method for nuclear astrophysics: Recent results for direct reactions

    International Nuclear Information System (INIS)

    Tumino, A.; Gulino, M.; Spitaleri, C.; Cherubini, S.; Romano, S.; Cognata, M. La; Pizzone, R. G.; Rapisarda, G. G.; Lamia, L.

    2014-01-01

    The Trojan Horse method is a powerful indirect technique to determine the astrophysical factor for binary rearrangement processes A+x→b+B at astrophysical energies by measuring the cross section for the Trojan Horse (TH) reaction A+a→B+b+s in quasi free kinematics. The Trojan Horse Method has been successfully applied to many reactions of astrophysical interest, both direct and resonant. In this paper, we will focus on direct sub-processes. The theory of the THM for direct binary reactions will be shortly presented based on a few-body approach that takes into account the off-energy-shell effects and initial and final state interactions. Examples of recent results will be presented to demonstrate how THM works experimentally

  8. The Trojan Horse method for nuclear astrophysics: Recent results for direct reactions

    Energy Technology Data Exchange (ETDEWEB)

    Tumino, A.; Gulino, M. [Laboratori Nazionali del Sud, Istituto Nazionale di Fisica Nucleare, Catania, Italy and Università degli Studi di Enna Kore, Enna (Italy); Spitaleri, C.; Cherubini, S.; Romano, S. [Laboratori Nazionali del Sud, Istituto Nazionale di Fisica Nucleare, Catania, Italy and Dipartimento di Fisica e Astronomia, Università di Catania, Catania (Italy); Cognata, M. La; Pizzone, R. G.; Rapisarda, G. G. [Laboratori Nazionali del Sud, Istituto Nazionale di Fisica Nucleare, Catania (Italy); Lamia, L. [Dipartimento di Fisica e Astronomia, Università di Catania, Catania (Italy)

    2014-05-09

    The Trojan Horse method is a powerful indirect technique to determine the astrophysical factor for binary rearrangement processes A+x→b+B at astrophysical energies by measuring the cross section for the Trojan Horse (TH) reaction A+a→B+b+s in quasi free kinematics. The Trojan Horse Method has been successfully applied to many reactions of astrophysical interest, both direct and resonant. In this paper, we will focus on direct sub-processes. The theory of the THM for direct binary reactions will be shortly presented based on a few-body approach that takes into account the off-energy-shell effects and initial and final state interactions. Examples of recent results will be presented to demonstrate how THM works experimentally.

  9. Results of an interlaboratory comparison of analytical methods for contaminants of emerging concern in water.

    Science.gov (United States)

    Vanderford, Brett J; Drewes, Jörg E; Eaton, Andrew; Guo, Yingbo C; Haghani, Ali; Hoppe-Jones, Christiane; Schluesener, Michael P; Snyder, Shane A; Ternes, Thomas; Wood, Curtis J

    2014-01-07

    An evaluation of existing analytical methods used to measure contaminants of emerging concern (CECs) was performed through an interlaboratory comparison involving 25 research and commercial laboratories. In total, 52 methods were used in the single-blind study to determine method accuracy and comparability for 22 target compounds, including pharmaceuticals, personal care products, and steroid hormones, all at ng/L levels in surface and drinking water. Method biases varied widely among compounds; caffeine, NP, OP, and triclosan had false positive rates >15%. In addition, some methods reported false positives for 17β-estradiol and 17α-ethynylestradiol in unspiked drinking water and deionized water, respectively, at levels higher than published predicted no-effect concentrations for these compounds in the environment. False negative rates were also generally low; false results were largely attributable to contamination, misinterpretation of background interferences, and/or inappropriate setting of detection/quantification levels for analysis at low ng/L levels. The results of both comparisons were collectively assessed to identify parameters that resulted in the best overall method performance. Liquid chromatography-tandem mass spectrometry coupled with the calibration technique of isotope dilution was able to accurately quantify most compounds with an average bias of <10% for both matrixes. These findings suggest that this method of analysis is suitable at environmentally relevant levels for most of the compounds studied. This work underscores the need for robust, standardized analytical methods for CECs to improve data quality, increase comparability between studies, and help reduce false positive and false negative rates.

  10. Cocaine Hydrochloride Structure in Solution Revealed by Three Chiroptical Methods

    Czech Academy of Sciences Publication Activity Database

    Fagan, P.; Kocourková, L.; Tatarkovič, M.; Králík, F.; Kuchař, M.; Setnička, V.; Bouř, Petr

    2017-01-01

    Vol. 18, No. 16 (2017), pp. 2258-2265 ISSN 1439-4235 R&D Projects: GA ČR(CZ) GA16-05935S; GA MŠk(CZ) LTC17012 Institutional support: RVO:61388963 Keywords: analytical methods * circular dichroism * density functional calculations * Raman spectroscopy * structure elucidation Subject RIV: CF - Physical; Theoretical Chemistry OBOR OECD: Physical chemistry Impact factor: 3.075, year: 2016

  11. A comparison of short-term dispersion estimates resulting from various atmospheric stability classification methods

    International Nuclear Information System (INIS)

    Mitchell, A.E. Jr.

    1982-01-01

    Four methods of classifying atmospheric stability class are applied at four sites to make short-term (1-h) dispersion estimates from a ground-level source based on a model consistent with U.S. Nuclear Regulatory Commission practice. The classification methods include vertical temperature gradient, standard deviation of horizontal wind direction fluctuations (sigma theta), Pasquill-Turner, and modified sigma theta which accounts for meander. Results indicate that modified sigma theta yields reasonable dispersion estimates compared to those produced using methods of vertical temperature gradient and Pasquill-Turner, and can be considered as a potential economic alternative in establishing onsite monitoring programs. (author)
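
    As a small illustration of the sigma-theta approach, the sketch below maps the standard deviation of horizontal wind direction to a Pasquill stability class using commonly cited EPA boundary values; these boundaries, and the absence of the meander modification, are assumptions and may differ from the criteria used in the study.

```python
# Map sigma-theta (degrees) to a Pasquill stability class using typical EPA
# daytime boundary values; the exact criteria of the study may differ.
def sigma_theta_class(sigma_theta_deg):
    bounds = [(22.5, "A"), (17.5, "B"), (12.5, "C"), (7.5, "D"), (3.8, "E")]
    for limit, cls in bounds:
        if sigma_theta_deg >= limit:
            return cls
    return "F"

for st in (25.0, 15.0, 9.0, 2.0):
    print(f"sigma_theta = {st:4.1f} deg -> Pasquill class {sigma_theta_class(st)}")
```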

  12. Results of Lindgren-Turan Operation in Hallux Valgus

    Directory of Open Access Journals (Sweden)

    İstemi YÜCEL,

    2010-05-01

    Full Text Available Purpose: We evaluated the results of the Lindgren-Turan operation in the treatment of hallux valgus. Methods: 24 feet of 18 patients were operated on using the Lindgren-Turan osteotomy. Radiological, functional and pain assessments were applied to all patients. Results: Treatment produced a statistically highly significant difference in the hallux valgus angle and the 1st-2nd intermetatarsal angle (p<0.001). The mean subjective evaluation of the patients was 8.43±0.72. Conclusion: We conclude that the Lindgren-Turan osteotomy, which produced successful results in terms of pain, deformity correction and bone healing and also provided high patient satisfaction, is a reliable technique for the surgical correction of hallux valgus.

  13. Methodics of computing the results of monitoring the exploratory gallery

    Directory of Open Access Journals (Sweden)

    Krúpa Víazoslav

    2000-09-01

    Full Text Available At the building site of the Višňové - Dubná skala motorway tunnel, priority is given to driving an exploration gallery that provides detailed geological, engineering-geological, hydrogeological and geotechnical information. This survey gathers the information needed for the envisaged use of a full-profile tunnel boring machine (TBM) to drive the motorway tunnel. In the part of the exploration gallery driven by the TBM method, comprehensive information about the parameters of the driving process is gathered by a computer monitoring system mounted on the driving machine. The monitoring system is based on the industrial computer PC 104 and records four basic values of the driving process: the electromotor performance of the Voest-Alpine ATB 35HA driving machine, the speed of the driving advance, the rotation speed of the disintegrating head of the TBM and the total head pressure. The pressure force is evaluated from the pressure in the hydraulic cylinders of the machine. From these values, the strength of the rock mass, the angle of internal friction, etc. are calculated; these values characterize the rock mass properties and their changes. To define the effectiveness of the driving process, the specific energy and the working ability of the driving head are used. The article describes the method, developed at the Institute of Geotechnics SAS, for processing the monitoring data gathered for the Voest-Alpine ATB 35H driving machine. It describes the input forms (protocols) of the developed method, created in an EXCEL program, and shows selected samples of the graphical processing of the first monitoring results obtained during the driving of the exploratory gallery for the Višňové - Dubná skala motorway tunnel.

  14. Facial cellulitis revealing choreo-acanthocytosis: A case report ...

    African Journals Online (AJOL)

    We report a 62-year-old man with facial cellulitis revealing choreo-acanthocytosis (ChAc). He had chorea that started 20 years earlier. The orofacial dyskinesia with tongue and cheek biting resulted in facial cellulitis. The peripheral blood smear revealed acanthocytosis of 25%. The overall of chorea, orofacial dyskinetic ...

  15. Testing the ISP method with the PARIO device: Accuracy of results and influence of homogenization technique

    Science.gov (United States)

    Durner, Wolfgang; Huber, Magdalena; Yangxu, Li; Steins, Andi; Pertassek, Thomas; Göttlein, Axel; Iden, Sascha C.; von Unold, Georg

    2017-04-01

    The particle-size distribution (PSD) is one of the main properties of soils. To determine the proportions of the fine fractions silt and clay, sedimentation experiments are used. Most common are the pipette and hydrometer methods. Both need manual sampling at specific times; both are thus time-demanding and rely on experienced operators. Durner et al. (Durner, W., S.C. Iden, and G. von Unold (2017): The integral suspension pressure method (ISP) for precise particle-size analysis by gravitational sedimentation, Water Resources Research, doi:10.1002/2016WR019830) recently developed the integral suspension pressure (ISP) method, which is implemented in the METER Group PARIO™ device. This new method estimates continuous PSDs from sedimentation experiments by recording the temporal evolution of the suspension pressure at a certain measurement depth in a sedimentation cylinder. It requires no manual interaction after the start and thus no specialized training of the lab personnel. The aim of this study was to test the precision and accuracy of the new method with a variety of materials, to answer the following research questions: (1) Are the results obtained by PARIO reliable and stable? (2) Are the results affected by the initial mixing technique used to homogenize the suspension, or by the presence of sand in the experiment? (3) Are the results identical to those obtained with the pipette method as the reference method? The experiments were performed with a pure quartz silt material and four real soil materials. PARIO measurements were done repeatedly on the same samples in a temperature-controlled lab to characterize the repeatability of the measurements. Subsequently, the samples were investigated by the pipette method to validate the results. We found that the statistical error for the silt fraction from replicate and repeated measurements was in the range of 1% for the quartz material to 3% for the soil materials. Since the sand fractions, as in any sedimentation method, must

  16. The review and results of different methods for facial recognition

    Science.gov (United States)

    Le, Yifan

    2017-09-01

    In recent years, facial recognition has drawn much attention due to its wide potential applications. As a unique technology in biometric identification, facial recognition represents a significant improvement since it can be operated without the cooperation of the person under detection. Hence, facial recognition can be applied to defense systems, medical detection, human behavior understanding, etc. Several theories and methods have been established to make progress in facial recognition: (1) a novel two-stage facial landmark localization method with more accurate localization on a specific database; (2) a statistical face frontalization method that outperforms state-of-the-art methods for face landmark localization; (3) a general facial landmark detection algorithm that handles images with severe occlusion and images with large head poses; (4) three methods proposed for face alignment, including a shape-augmented regression method, a pose-indexed multi-view method and a learning-based method that regresses local binary features. The aim of this paper is to analyze previous work on different aspects of facial recognition, focusing on concrete methods and their performance on various databases. In addition, some improvement measures and suggestions for potential applications are put forward.

  17. Conjunctival lymphangioma in a 4-year-old girl revealed tuberous sclerosis complex

    Directory of Open Access Journals (Sweden)

    Freiberg, Florentina Joyce

    2016-09-01

    Full Text Available Background: To present a case of conjunctival lymphangioma in a girl with tuberous sclerosis complex. Methods/results: A 4-year-old girl presented with a relapsing cystic lesion of the bulbar conjunctiva of the right eye, with string-of-pearl-like dilation of lymphatic vessels and right-sided facial swelling with mild pain. Best-corrected vision was not impaired. Examination of the skin revealed three hypomelanotic macules and a lumbar Shagreen patch. Magnetic resonance imaging (MRI) findings displayed minimal enhancement of the buccal fat on the right side. Cranial and orbital MRI showed signal enhancement in right cortical and subcortical areas. Genetic analysis revealed a heterozygous deletion encompassing exons 1 and 2 of the TSC1 gene (tuberous sclerosis complex 1 gene), confirming the diagnosis of tuberous sclerosis complex. Conclusion: In conjunctival lymphangioma, tuberous sclerosis complex should be considered as the primary disease.

  18. Methods and optical fibers that decrease pulse degradation resulting from random chromatic dispersion

    Science.gov (United States)

    Chertkov, Michael; Gabitov, Ildar

    2004-03-02

    The present invention provides methods and optical fibers for periodically pinning an actual (random) accumulated chromatic dispersion of an optical fiber to a predicted accumulated dispersion of the fiber through relatively simple modifications of fiber-optic manufacturing methods or retrofitting of existing fibers. If the pinning occurs with sufficient frequency (at a distance less than or equal to a correlation scale), pulse degradation resulting from random chromatic dispersion is minimized. Alternatively, pinning may occur quasi-periodically, i.e., the pinning distance is distributed between approximately zero and approximately two to three times the correlation scale.

  19. Decision making with consonant belief functions: Discrepancy resulting with the probability transformation method used

    Directory of Open Access Journals (Sweden)

    Cinicioglu Esma Nur

    2014-01-01

    Full Text Available Dempster-Shafer belief function theory can address a wider class of uncertainty than standard probability theory does, a fact that appeals to researchers in the operations research community looking for potential application areas. However, the lack of a decision theory for belief functions gives rise to the need to use probability transformation methods for decision making. For the representation of statistical evidence, the class of consonant belief functions is used, which is not closed under Dempster's rule of combination but is closed under Walley's rule of combination. In this research, it is shown that the outcomes obtained using Dempster's and Walley's rules result in different probability distributions when the pignistic transformation is used. However, when the plausibility transformation is used, they result in the same probability distribution. This result shows that the choice of the combination rule and probability transformation method may have a significant effect on decision making, since it may change which decision alternative is selected. This result is illustrated via an example of missile type identification.
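
    The difference between the two transformations is easy to see numerically. The sketch below applies the pignistic and plausibility transformations to an invented consonant mass function on a three-element frame; the particular masses are illustrative only.

```python
# Pignistic and plausibility probability transformations of a consonant belief
# function (nested focal elements). Invented masses; the point is only that the
# two transformations generally yield different probability distributions.
frame = {"a", "b", "c"}
# Consonant mass function: nested focal elements {a} ⊂ {a,b} ⊂ {a,b,c}.
masses = {frozenset("a"): 0.5, frozenset("ab"): 0.3, frozenset("abc"): 0.2}

def pignistic(masses, frame):
    """BetP(x) = sum over focal sets A containing x of m(A)/|A|."""
    return {x: sum(m / len(A) for A, m in masses.items() if x in A) for x in frame}

def plausibility_transform(masses, frame):
    """Normalize the singleton plausibilities Pl({x}) = sum of m(A) over A containing x."""
    pl = {x: sum(m for A, m in masses.items() if x in A) for x in frame}
    total = sum(pl.values())
    return {x: v / total for x, v in pl.items()}

print("BetP:", {x: round(p, 3) for x, p in sorted(pignistic(masses, frame).items())})
print("Pl_P:", {x: round(p, 3) for x, p in sorted(plausibility_transform(masses, frame).items())})
```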

  20. Gaussian graphical modeling reveals specific lipid correlations in glioblastoma cells

    Science.gov (United States)

    Mueller, Nikola S.; Krumsiek, Jan; Theis, Fabian J.; Böhm, Christian; Meyer-Bäse, Anke

    2011-06-01

    Advances in high-throughput measurements of biological specimens necessitate the development of biologically driven computational techniques. To understand the molecular level of many human diseases, such as cancer, lipid quantifications have been shown to offer an excellent opportunity to reveal disease-specific regulations. The data analysis of the cell lipidome, however, remains a challenging task and cannot be accomplished solely based on intuitive reasoning. We have developed a method to identify a lipid correlation network which is entirely disease-specific. A powerful method to correlate experimentally measured lipid levels across the various samples is a Gaussian Graphical Model (GGM), which is based on partial correlation coefficients. In contrast to regular Pearson correlations, partial correlations aim to identify only direct correlations while eliminating indirect associations. Conventional GGM calculations on the entire dataset cannot, however, provide information on whether a correlation is truly disease-specific with respect to the disease samples and not a correlation of control samples. Thus, we implemented a novel differential GGM approach unraveling only the disease-specific correlations, and applied it to the lipidome of immortal Glioblastoma tumor cells. A large set of lipid species were measured by mass spectrometry in order to evaluate lipid remodeling in response to a combination of cell perturbations inducing programmed cell death, while the other perturbations served solely as biological controls. With the differential GGM, we were able to reveal Glioblastoma-specific lipid correlations to advance biomedical research on novel gene therapies.
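
    The core computational step of a GGM - partial correlations obtained from the inverse covariance (precision) matrix - can be sketched as follows on synthetic data; the paper's differential (disease-versus-control) comparison and any regularization or significance testing are not reproduced.

```python
# Partial correlations from the precision matrix: direct links stay sizeable,
# purely indirect links shrink toward zero. Synthetic "lipid" data only.
import numpy as np

rng = np.random.default_rng(3)
n_samples, n_lipids = 200, 6
X = rng.normal(size=(n_samples, n_lipids))
X[:, 1] += 0.8 * X[:, 0]      # lipid 0 directly drives lipid 1 ...
X[:, 2] += 0.8 * X[:, 0]      # ... and lipid 2; lipids 1 and 2 are only indirectly linked

def partial_correlations(X):
    precision = np.linalg.inv(np.cov(X, rowvar=False))
    d = np.sqrt(np.diag(precision))
    pcor = -precision / np.outer(d, d)
    np.fill_diagonal(pcor, 1.0)
    return pcor

pcor = partial_correlations(X)
print("partial corr(0,1) =", round(pcor[0, 1], 2))   # direct link: remains sizeable
print("partial corr(1,2) =", round(pcor[1, 2], 2))   # indirect link: close to zero
```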

  1. Comparison of sampling methods for the assessment of indoor microbial exposure

    DEFF Research Database (Denmark)

    Frankel, M; Timm, Michael; Hansen, E W

    2012-01-01

    revealed. This study thus facilitates comparison between methods and may therefore be used as a frame of reference when studying the literature or when conducting further studies on indoor microbial exposure. Results also imply that the relatively simple EDC method for the collection of settled dust may...

  2. European external quality control study on the competence of laboratories to recognize rare sequence variants resulting in unusual genotyping results.

    Science.gov (United States)

    Márki-Zay, János; Klein, Christoph L; Gancberg, David; Schimmel, Heinz G; Dux, László

    2009-04-01

    Depending on the method used, rare sequence variants adjacent to the single nucleotide polymorphism (SNP) of interest may cause unusual or erroneous genotyping results. Because such rare variants are known for many genes commonly tested in diagnostic laboratories, we organized a proficiency study to assess their influence on the accuracy of reported laboratory results. Four external quality control materials were processed and sent to 283 laboratories through 3 EQA organizers for analysis of the prothrombin 20210G>A mutation. Two of these quality control materials contained sequence variants introduced by site-directed mutagenesis. One hundred eighty-nine laboratories participated in the study. When samples gave a usual result with the method applied, the error rate was 5.1%. Detailed analysis showed that more than 70% of the failures were reported from only 9 laboratories. Allele-specific amplification-based PCR had a much higher error rate than other methods (18.3% vs 2.9%). The variants 20209C>T and [20175T>G; 20179_20180delAC] resulted in unusual genotyping results in 67 and 85 laboratories, respectively. Eighty-three (54.6%) of these unusual results were not recognized, 32 (21.1%) were attributed to technical issues, and only 37 (24.3%) were recognized as another sequence variant. Our findings revealed that some of the participating laboratories were not able to recognize and correctly interpret unusual genotyping results caused by rare SNPs. Our study indicates that the majority of the failures could be avoided by improved training and careful selection and validation of the methods applied.

  3. Effect of tidal triggering on seismicity in Taiwan revealed by the empirical mode decomposition method

    Directory of Open Access Journals (Sweden)

    H.-J. Chen

    2012-07-01

    Full Text Available The effect of tidal triggering on earthquake occurrence has been controversial for many years. This study considered earthquakes that occurred near Taiwan between 1973 and 2008. Because earthquake data are nonlinear and non-stationary, we applied the empirical mode decomposition (EMD method to analyze the temporal variations in the number of daily earthquakes to investigate the effect of tidal triggering. We compared the results obtained from the non-declustered catalog with those from two kinds of declustered catalogs and discuss the aftershock effect on the EMD-based analysis. We also investigated stacking the data based on in-phase phenomena of theoretical Earth tides with statistical significance tests. Our results show that the effects of tidal triggering, particularly the lunar tidal effect, can be extracted from the raw seismicity data using the approach proposed here. Our results suggest that the lunar tidal force is likely a factor in the triggering of earthquakes.
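
    A minimal sketch of the decomposition step is shown below, assuming the PyEMD package ("EMD-signal" on PyPI) and its EMD().emd(signal) interface; the daily count series is synthetic, with an artificial fortnightly (~14.77-day) modulation added to Poisson noise so that one intrinsic mode function carries a tidal-like period.

```python
# Decompose a synthetic daily earthquake-count series with EMD and estimate a
# crude dominant period per intrinsic mode function (IMF) via zero crossings.
import numpy as np
from PyEMD import EMD   # assumption: the "EMD-signal" package is installed

days = np.arange(0, 3650)                                   # ~10 years of daily counts
rate = 5 + 1.5 * np.sin(2 * np.pi * days / 14.77)           # fortnightly modulation
counts = np.random.default_rng(0).poisson(rate).astype(float)

imfs = EMD().emd(counts)                                    # rows = IMFs (plus residue)

for k, imf in enumerate(imfs):
    x = imf - imf.mean()
    crossings = np.sum(np.diff(np.sign(x)) != 0)
    if crossings:
        print(f"IMF {k}: approx. dominant period = {2 * len(x) / crossings:.1f} days")
```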

  4. Lesion insertion in the projection domain: Methods and initial results

    International Nuclear Information System (INIS)

    Chen, Baiyu; Leng, Shuai; Yu, Lifeng; Yu, Zhicong; Ma, Chi; McCollough, Cynthia

    2015-01-01

    Purpose: To perform task-based image quality assessment in CT, it is desirable to have a large number of realistic patient images with known diagnostic truth. One effective way of achieving this objective is to create hybrid images that combine patient images with inserted lesions. Because conventional hybrid images generated in the image domain fail to reflect the impact of scan and reconstruction parameters on lesion appearance, this study explored a projection-domain approach. Methods: Lesions were segmented from patient images and forward projected to acquire lesion projections. The forward-projection geometry was designed according to a commercial CT scanner and accommodated both axial and helical modes with various focal spot movement patterns. The energy employed by the commercial CT scanner for beam hardening correction was measured and used for the forward projection. The lesion projections were inserted into patient projections decoded from commercial CT projection data. The combined projections were formatted to match those of commercial CT raw data, loaded onto a commercial CT scanner, and reconstructed to create the hybrid images. Two validations were performed. First, to validate the accuracy of the forward-projection geometry, images were reconstructed from the forward projections of a virtual ACR phantom and compared to physically acquired ACR phantom images in terms of CT number accuracy and high-contrast resolution. Second, to validate the realism of the lesion in hybrid images, liver lesions were segmented from patient images and inserted back into the same patients, each at a new location specified by a radiologist. The inserted lesions were compared to the original lesions and visually assessed for realism by two experienced radiologists in a blinded fashion. Results: For the validation of the forward-projection geometry, the images reconstructed from the forward projections of the virtual ACR phantom were consistent with the images physically

  5. Lesion insertion in the projection domain: Methods and initial results

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Baiyu; Leng, Shuai; Yu, Lifeng; Yu, Zhicong; Ma, Chi; McCollough, Cynthia, E-mail: mccollough.cynthia@mayo.edu [Department of Radiology, Mayo Clinic, Rochester, Minnesota 55905 (United States)

    2015-12-15

    Purpose: To perform task-based image quality assessment in CT, it is desirable to have a large number of realistic patient images with known diagnostic truth. One effective way of achieving this objective is to create hybrid images that combine patient images with inserted lesions. Because conventional hybrid images generated in the image domain fail to reflect the impact of scan and reconstruction parameters on lesion appearance, this study explored a projection-domain approach. Methods: Lesions were segmented from patient images and forward projected to acquire lesion projections. The forward-projection geometry was designed according to a commercial CT scanner and accommodated both axial and helical modes with various focal spot movement patterns. The energy employed by the commercial CT scanner for beam hardening correction was measured and used for the forward projection. The lesion projections were inserted into patient projections decoded from commercial CT projection data. The combined projections were formatted to match those of commercial CT raw data, loaded onto a commercial CT scanner, and reconstructed to create the hybrid images. Two validations were performed. First, to validate the accuracy of the forward-projection geometry, images were reconstructed from the forward projections of a virtual ACR phantom and compared to physically acquired ACR phantom images in terms of CT number accuracy and high-contrast resolution. Second, to validate the realism of the lesion in hybrid images, liver lesions were segmented from patient images and inserted back into the same patients, each at a new location specified by a radiologist. The inserted lesions were compared to the original lesions and visually assessed for realism by two experienced radiologists in a blinded fashion. Results: For the validation of the forward-projection geometry, the images reconstructed from the forward projections of the virtual ACR phantom were consistent with the images physically

  6. RESULTS OF THE QUESTIONNAIRE: ANALYSIS METHODS

    CERN Multimedia

    Staff Association

    2014-01-01

    Five-yearly review of employment conditions   Article S V 1.02 of our Staff Rules states that the CERN “Council shall periodically review and determine the financial and social conditions of the members of the personnel. These periodic reviews shall consist of a five-yearly general review of financial and social conditions;” […] “following methods […] specified in § I of Annex A 1”. Then, turning to the relevant part in Annex A 1, we read that “The purpose of the five-yearly review is to ensure that the financial and social conditions offered by the Organization allow it to recruit and retain the staff members required for the execution of its mission from all its Member States. […] these staff members must be of the highest competence and integrity.” And for the menu of such a review we have: “The five-yearly review must include basic salaries and may include any other financial or soc...

  7. A statistical method for testing epidemiological results, as applied to the Hanford worker population

    International Nuclear Information System (INIS)

    Brodsky, A.

    1979-01-01

    Some recent reports of Mancuso, Stewart and Kneale claim findings of radiation-produced cancer in the Hanford worker population. These claims are based on statistical computations that use small differences in accumulated exposures between groups dying of cancer and groups dying of other causes; actual mortality and longevity were not reported. This paper presents a statistical method for evaluation of actual mortality and longevity longitudinally over time, as applied in a primary analysis of the mortality experience of the Hanford worker population. Although available, this method was not utilized in the Mancuso-Stewart-Kneale paper. The author's preliminary longitudinal analysis shows that the gross mortality experience of persons employed at Hanford during the 1943-70 interval did not differ significantly from that of certain controls, when both employees and controls were selected from families with two or more offspring and comparisons were matched by age, sex, race and year of entry into employment. This result is consistent with findings reported by Sanders (Health Phys. vol.35, 521-538, 1978). The method utilizes an approximate chi-square (1 D.F.) statistic for testing population subgroup comparisons, as well as the cumulation of chi-squares (1 D.F.) for testing the overall result of a particular type of comparison. The method is available for computer testing of the Hanford mortality data, and could also be adapted to morbidity or other population studies. (author)
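
    For orientation only, the sketch below shows one generic way to form an approximate chi-square (1 D.F.) statistic for a worker/control subgroup comparison based on deaths and person-years, and to cumulate the statistics over comparisons; the counts are invented and the formulation is not claimed to be Brodsky's exact statistic.

      # Generic illustration (not the paper's exact statistic): per-subgroup
      # chi-square (1 d.f.) comparing worker deaths with the number expected if
      # worker and control death rates were equal, then cumulation over subgroups.
      from scipy.stats import chi2

      # (worker deaths, worker person-years, control deaths, control person-years)
      subgroups = [(12, 4000.0, 15, 4100.0),
                   (20, 5200.0, 18, 5000.0),
                   (7, 2500.0, 9, 2600.0)]

      total, k = 0.0, 0
      for d_w, py_w, d_c, py_c in subgroups:
          expected_w = (d_w + d_c) * py_w / (py_w + py_c)
          expected_c = (d_w + d_c) - expected_w
          stat = (d_w - expected_w) ** 2 * (1.0 / expected_w + 1.0 / expected_c)
          total += stat
          k += 1
          print(f"subgroup chi-square = {stat:.2f}, p = {chi2.sf(stat, df=1):.3f}")

      print(f"cumulated chi-square = {total:.2f} on {k} D.F., p = {chi2.sf(total, df=k):.3f}")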

  8. Multiband discrete ordinates method: formalism and results; Methode multibande aux ordonnees discretes: formalisme et resultats

    Energy Technology Data Exchange (ETDEWEB)

    Luneville, L

    1998-06-01

    The multigroup discrete ordinates method is a classical way to solve the transport (Boltzmann) equation for neutral particles. Self-shielding effects are not correctly treated due to large variations of cross sections within a group (in the resonance range). To treat the resonance domain, the multiband method is introduced. The main idea is to divide the cross-section domain into bands. We obtain the multiband parameters using the moment method; the code CALENDF provides probability tables for these parameters. We present our implementation in an existing discrete ordinates code: SN1D. We study deep penetration benchmarks and show the improvement of the method in the treatment of self-shielding effects. (author) 15 refs.

  9. Application of Nemerow Index Method and Integrated Water Quality Index Method in Water Quality Assessment of Zhangze Reservoir

    Science.gov (United States)

    Zhang, Qian; Feng, Minquan; Hao, Xiaoyan

    2018-03-01

    [Objective] Based on the historical water quality data from the Zhangze Reservoir for the last five years, the water quality was assessed by the integrated water quality identification index method and the Nemerow pollution index method. The results of the different evaluation methods were analyzed and compared, and the characteristics of each method were identified. [Methods] The suitability of the water quality assessment methods was compared and analyzed, based on these results. [Results] The water quality tended to decrease over time, with 2016 being the year with the worst water quality. The sections with the worst water quality were the southern and northern sections. [Conclusion] The results produced by the traditional Nemerow index method fluctuated greatly across the water quality monitoring sections and therefore could not effectively reveal the trend of water quality at each section. The combination of qualitative and quantitative measures in the comprehensive pollution index identification method meant it could evaluate the degree of water pollution as well as determine whether the river water was black and odorous; however, its evaluation results indicated a relatively low level of water pollution. The results from the improved Nemerow index evaluation were better, as the single indicators and evaluation results are in strong agreement; the method is therefore able to objectively reflect the water quality of each monitoring section and is more suitable for the water quality evaluation of the reservoir.
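
    For reference, the form of the Nemerow pollution index most commonly cited in the literature (the record itself does not spell out the formula, so this is an assumption) combines the mean and the maximum of the single-factor indices P_i = C_i/S_i, as in the sketch below.

      # Commonly cited Nemerow composite index (assumed form, for illustration):
      # P = sqrt((P_mean^2 + P_max^2) / 2), with single-factor indices P_i = C_i / S_i.
      import math

      def nemerow_index(concentrations, standards):
          p = [c / s for c, s in zip(concentrations, standards)]
          p_mean = sum(p) / len(p)
          p_max = max(p)
          return math.sqrt((p_mean ** 2 + p_max ** 2) / 2.0)

      # Hypothetical monitoring-section data (mg/L); indicator names and limits are illustrative.
      measured = [22.0, 1.2, 0.25]     # e.g. COD, NH3-N, TP
      limits = [20.0, 1.0, 0.20]       # corresponding standard limits
      print(f"Nemerow index = {nemerow_index(measured, limits):.2f}")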

  10. Learning phacoemulsification. Results of different teaching methods.

    Directory of Open Access Journals (Sweden)

    Hennig Albrecht

    2004-01-01

    Full Text Available We report the learning curves of three eye surgeons converting from sutureless extracapsular cataract extraction to phacoemulsification using different teaching methods. Posterior capsule rupture (PCR) as a per-operative complication and visual outcome of the first 100 operations were analysed. The PCR rate was 4% and 15% in supervised and unsupervised surgery respectively. Likewise, an uncorrected visual acuity of ≥ 6/18 on the first postoperative day was seen in 62 (62%) of patients and in 22 (22%) in supervised and unsupervised surgery respectively.

  11. Structuring scientific works in the “Introduction, Methods, Results and Discussion” format – what a beginner ought to know

    Directory of Open Access Journals (Sweden)

    N. V. Avdeeva

    2016-01-01

    Full Text Available Reference materials on the “Introduction, Methods, Results and Discussion” (IMRAD) structure, a commonly used international format for scientific works, are now available to Russian authors, yet gaps in knowledge of the format persist, especially among beginners. The faults that regularly appear in the structuring of manuscripts prompted the present research, whose aim is to compare the available information about the IMRAD format with the specific difficulties beginning authors face when preparing their works for publication. The main materials studied were Russian- and English-language sources, published mostly in the 2010s, devoted to the problems of structuring works according to this format. In addition, the research considered the results of plagiarism checks carried out at the Russian State Library in 2013-2015 with the software “Automated system of specialized processing of textual documents”. The main research methods were structural and comparative analysis of texts. The research revealed that the available information on the IMRAD structure is inconsistent and often requires careful interpretation and additional explanation. Authors of reference editions differ, as a rule, in how necessary they consider particular compositional elements to be and in how much descriptive detail they require. Moreover, the very structure of a scientific work is presented differently by different authors: most often the structure is understood as the unity of content and form, yet sometimes its description is reduced to surface elements such as language clichés. The analysis of the most common faults in text structuring indicates that authors often do not have a clear idea of how to interpret the various demands, which are so obscurely described

  12. Neuroanatomical heterogeneity of schizophrenia revealed by semi-supervised machine learning methods.

    Science.gov (United States)

    Honnorat, Nicolas; Dong, Aoyan; Meisenzahl-Lechner, Eva; Koutsouleris, Nikolaos; Davatzikos, Christos

    2017-12-20

    Schizophrenia is associated with heterogeneous clinical symptoms and neuroanatomical alterations. In this work, we aim to disentangle the patterns of neuroanatomical alterations underlying a heterogeneous population of patients using a semi-supervised clustering method. We apply this strategy to a cohort of patients with schizophrenia with varying disease duration, and we describe the neuroanatomical, demographic and clinical characteristics of the subtypes discovered. We analyze the neuroanatomical heterogeneity of 157 patients diagnosed with schizophrenia, relative to a control population of 169 subjects, using a machine learning method called CHIMERA. CHIMERA clusters the differences between patients and a demographically-matched population of healthy subjects, rather than clustering patients themselves, thereby specifically assessing disease-related neuroanatomical alterations. Voxel-Based Morphometry was conducted to visualize the neuroanatomical patterns associated with each group. The clinical presentation and the demographics of the groups were then investigated. Three subgroups were identified. The first two differed substantially, in that one involved predominantly temporal-thalamic-peri-Sylvian regions, whereas the other involved predominantly frontal regions and the thalamus. Both subtypes included primarily male patients. The third pattern was a mix of these two and presented milder neuroanatomic alterations and comprised a comparable number of men and women. VBM and statistical analyses suggest that these groups could correspond to different neuroanatomical dimensions of schizophrenia. Our analysis suggests that schizophrenia presents distinct neuroanatomical variants. This variability points to the need for a dimensional neuroanatomical approach using data-driven, mathematically principled multivariate pattern analysis methods, and should be taken into account in clinical studies. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Comparison of microstickies measurement methods. Part II, Results and discussion

    Science.gov (United States)

    Mahendra R. Doshi; Angeles Blanco; Carlos Negro; Concepcion Monte; Gilles M. Dorris; Carlos C. Castro; Axel Hamann; R. Daniel Haynes; Carl Houtman; Karen Scallon; Hans-Joachim Putz; Hans Johansson; R. A. Venditti; K. Copeland; H.-M. Chang

    2003-01-01

    In part I of the article we discussed the sample preparation procedure and described various methods used for the measurement of microstickies. Some of the important features of the different methods are highlighted in Table 1. Temperatures used in the measurement methods range from room temperature in some cases to 45-65 °C in others. Sample size ranges from as low as...

  14. Generalized differential transform method to differential-difference equation

    International Nuclear Information System (INIS)

    Zou Li; Wang Zhen; Zong Zhi

    2009-01-01

    In this Letter, we generalize the differential transform method to solve differential-difference equations for the first time. Two simple but typical examples are used to illustrate the validity and the great potential of the generalized differential transform method (GDTM) in solving differential-difference equations. A Padé technique is also introduced and combined with the GDTM with the aim of extending the convergence region of the presented series solutions. Comparisons are made between the results of the proposed method and exact solutions. We then apply the differential transform method to the discrete KdV equation and the discrete mKdV equation, and successfully obtain solitary wave solutions. The results reveal that the proposed method is very effective and simple. We should point out that the generalized differential transform method is also easily applied to other nonlinear differential-difference equations.
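
    For orientation, the classical one-dimensional differential transform pair on which such generalizations build (stated here only as background; the Letter's discrete generalization differs in detail) is

      \[
        F(k) = \frac{1}{k!}\left[\frac{d^{k} f(x)}{dx^{k}}\right]_{x=x_{0}},
        \qquad
        f(x) = \sum_{k=0}^{\infty} F(k)\,(x-x_{0})^{k}.
      \]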

  15. Project Oriented Immersion Learning: Method and Results

    DEFF Research Database (Denmark)

    Icaza, José I.; Heredia, Yolanda; Borch, Ole M.

    2005-01-01

    A pedagogical approach called “project oriented immersion learning” is presented and tested on a graduate online course. The approach combines the Project Oriented Learning method with immersion learning in a virtual enterprise. Students assumed the role of authors hired by a fictitious publishing house that develops digital products including e-books, tutorials, web sites and so on. The students defined the problem that their product was to solve; chose the type of product and the content; and built the product following a strict project methodology. A wiki server was used as a platform to hold...

  16. Assessment of four different methods for selecting biosurfactant ...

    African Journals Online (AJOL)

    ... and ease of use to screen for biosurfactant production by six extremely halophilic bacteria isolated from saline soil of Chott El Hodna-M'sila (Algeria), which is considered a thalassohaline environment. Results from the screening methods revealed that the CH2 and CH5 strains are potential candidates for biosurfactant production.

  17. COMPARATIVE STUDY ON MILK CASEIN ASSAY METHODS

    Directory of Open Access Journals (Sweden)

    RODICA CĂPRIŢĂ

    2008-05-01

    Full Text Available Casein, the main milk protein, was determined by different assay methods: the gravimetric method, the method based on neutralization of the NaOH excess used to dissolve the casein precipitate, and the method based on titration of the acetic acid used for casein precipitation. The last method is the simplest one, with the fewest steps and also the lowest degree of error. The results of the experiment revealed that casein accounted for 72.6–81.3% of the whole milk protein in experiment 1, 73.6–81.3% in experiment 2 and 74.3–81% in experiment 3.

  18. A MITE-based genotyping method to reveal hundreds of DNA polymorphisms in an animal genome after a few generations of artificial selection

    Directory of Open Access Journals (Sweden)

    Tetreau Guillaume

    2008-10-01

    Full Text Available Abstract Background For most organisms, developing hundreds of genetic markers spanning the whole genome still requires excessive if not unrealistic efforts. In this context, there is an obvious need for methodologies allowing the low-cost, fast and high-throughput genotyping of virtually any species, such as the Diversity Arrays Technology (DArT). One of the crucial steps of the DArT technique is the genome complexity reduction, which allows obtaining a genomic representation characteristic of the studied DNA sample and necessary for subsequent genotyping. In this article, using the mosquito Aedes aegypti as a study model, we describe a new genome complexity reduction method taking advantage of the abundance of miniature inverted repeat transposable elements (MITEs) in the genome of this species. Results Ae. aegypti genomic representations were produced following a two-step procedure: (1) restriction digestion of the genomic DNA and simultaneous ligation of a specific adaptor to compatible ends, and (2) amplification of restriction fragments containing a particular MITE element called Pony using two primers, one annealing to the adaptor sequence and one annealing to a conserved sequence motif of the Pony element. Using this protocol, we constructed a library comprising more than 6,000 DArT clones, of which at least 5.70% were highly reliable polymorphic markers for two closely related mosquito strains separated by only a few generations of artificial selection. Within this dataset, linkage disequilibrium was low, and marker redundancy was evaluated at 2.86% only. Most of the detected genetic variability was observed between the two studied mosquito strains, but individuals of the same strain could still be clearly distinguished. Conclusion The new complexity reduction method was particularly efficient in revealing genetic polymorphisms in Ae. aegypti. Overall, our results testify to the flexibility of the DArT genotyping technique and open new

  19. Standardization of glycohemoglobin results and reference values in whole blood studied in 103 laboratories using 20 methods.

    Science.gov (United States)

    Weykamp, C W; Penders, T J; Miedema, K; Muskiet, F A; van der Slik, W

    1995-01-01

    We investigated the effect of calibration with lyophilized calibrators on whole-blood glycohemoglobin (glyHb) results. One hundred three laboratories, using 20 different methods, determined glyHb in two lyophilized calibrators and two whole-blood samples. For whole-blood samples with low (5%) and high (9%) glyHb percentages, respectively, calibration decreased overall interlaboratory variation (CV) from 16% to 9% and from 11% to 6% and decreased intermethod variation from 14% to 6% and from 12% to 5%. Forty-seven laboratories, using 14 different methods, determined mean glyHb percentages in self-selected groups of 10 nondiabetic volunteers each. With calibration their overall mean (2SD) was 5.0% (0.5%), very close to the 5.0% (0.3%) derived from the reference method used in the Diabetes Control and Complications Trial. In both experiments the Abbott IMx and Vision showed deviating results. We conclude that, irrespective of the analytical method used, calibration enables standardization of glyHb results, reference values, and interpretation criteria.
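
    As a minimal sketch of what calibration against assigned values amounts to (illustrative only; the study's calibration protocol is not described in this record), a two-point linear recalibration maps a method's raw readings onto the values assigned to two calibrators.

      # Minimal two-point linear recalibration sketch; the assigned and measured
      # calibrator values below are invented for illustration.
      def make_recalibration(assigned, measured):
          (a_lo, a_hi), (m_lo, m_hi) = assigned, measured
          slope = (a_hi - a_lo) / (m_hi - m_lo)
          intercept = a_lo - slope * m_lo
          return lambda raw: slope * raw + intercept

      recalibrate = make_recalibration(assigned=(5.0, 9.0), measured=(5.6, 10.1))
      print(f"raw 6.3% glyHb -> calibrated {recalibrate(6.3):.1f}%")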

  20. Methods used by Elsam for monitoring precision and accuracy of analytical results

    Energy Technology Data Exchange (ETDEWEB)

    Hinnerskov Jensen, J [Soenderjyllands Hoejspaendingsvaerk, Faelleskemikerne, Aabenraa (Denmark)

    1996-12-01

    Performing round robins at regular intervals is the primary method used by Elsam for monitoring precision and accuracy of analytical results. The first round robin was started in 1974, and today 5 round robins are running. These are focused on: boiler water and steam, lubricating oils, coal, ion chromatography and dissolved gases in transformer oils. Besides the power plant laboratories in Elsam, the participants are power plant laboratories from the rest of Denmark, industrial and commercial laboratories in Denmark, and finally foreign laboratories. The calculated standard deviations or reproducibilities are compared with acceptable values. These values originate from ISO, ASTM and the like, or from our own experience. Besides providing the laboratories with a tool to check their momentary performance, the round robins are very suitable for evaluating systematic developments on a long-term basis. By splitting up the uncertainty according to methods, sample preparation/analysis, etc., knowledge can be extracted from the round robins for use in many other situations. (au)

  1. Why conventional detection methods fail in identifying the existence of contamination events.

    Science.gov (United States)

    Liu, Shuming; Li, Ruonan; Smith, Kate; Che, Han

    2016-04-15

    Early warning systems are widely used to safeguard water security, but their effectiveness has raised many questions. To understand why conventional detection methods fail to identify contamination events, this study evaluates the performance of three contamination detection methods using data from a real contamination accident and two artificial datasets constructed using a widely applied contamination data construction approach. Results show that the Pearson correlation Euclidean distance (PE) based detection method performs better for real contamination incidents, while the Euclidean distance method (MED) and linear prediction filter (LPF) method are more suitable for detecting sudden spike-like variation. This analysis revealed why the conventional MED and LPF methods failed to identify the existence of contamination events. The analysis also revealed that the widely used contamination data construction approach is misleading. Copyright © 2016 Elsevier Ltd. All rights reserved.
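
    The sketch below illustrates the general idea of distance-based event detection over multivariate water-quality readings (a scaled Euclidean distance from a moving baseline window); it is a generic stand-in, not the MED, PE or LPF implementations evaluated in the study.

      # Generic distance-based detector (illustrative; not the study's MED/PE/LPF code):
      # flag a sample whose scaled Euclidean distance from a recent baseline window
      # exceeds a threshold.
      import numpy as np

      def detect_events(samples, window=50, threshold=3.0):
          samples = np.asarray(samples, dtype=float)
          flags = np.zeros(len(samples), dtype=bool)
          for t in range(window, len(samples)):
              baseline = samples[t - window:t]
              mu = baseline.mean(axis=0)
              sigma = baseline.std(axis=0) + 1e-9
              distance = np.linalg.norm((samples[t] - mu) / sigma)
              flags[t] = distance > threshold
          return flags

      rng = np.random.default_rng(1)
      readings = rng.normal(size=(200, 4))            # e.g. pH, conductivity, turbidity, chlorine
      readings[150:] += [0.0, 2.5, 1.5, 0.0]          # injected shift mimicking an event
      print("first flagged samples:", np.where(detect_events(readings))[0][:5])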

  2. Hydrogen storage in single-walled carbon nanotubes: methods and results

    International Nuclear Information System (INIS)

    Poirier, E.; Chahine, R.; Tessier, A.; Cossement, D.; Lafi, L.; Bose, T.K.

    2004-01-01

    We present high sensitivity gravimetric and volumetric hydrogen sorption measurement systems adapted for in situ conditioning under high temperature and high vacuum. These systems, which allow for precise measurements on small samples and thorough degassing, are used for sorption measurements on carbon nanostructures. We developed one volumetric system for the pressure range 0-1 bar, and two gravimetric systems for 0-1 bar and 0-100 bars. The use of both gravimetric and volumetric methods allows for the cross-checking of the results. The accuracy of the systems has been determined from hydrogen absorption measurements on palladium. The accuracies of the 0-1 bar volumetric and gravimetric systems are about 10 μg and 20 μg respectively. The accuracy of the 0-100 bars gravimetric system is about 20 μg. Hydrogen sorption measurements on single-walled carbon nanotubes (SWNTs) and metal-incorporated SWNTs are presented. (author)

  3. Revealing low-energy part of the beta spectra

    International Nuclear Information System (INIS)

    Selvi, S.; Celiktas, C.

    2002-01-01

    An effective method is proposed to separate electronic noise from beta-particle spectra, revealing the lower-energy part of the spectra. The available methods for reducing the noise problem cut off the noise along with the low-energy part of the beta spectra by using a discriminator. Our setup eliminates this undesirable effect by shifting the noise toward the lowest part of the energy scale, leaving the low-energy part of the spectra undisturbed. We achieved this noise-pulse separation by treating the noise as a pulse, which allowed us to exploit the pulse-shape analyzer equipment used for pulse-shape identification of particles and rejection of defective pulses. To the best of our knowledge, this method of noise separation is a novel approach.

  4. Paleomagnetic intensity of Aso pyroclastic flows: Additional results with LTD-DHT Shaw method, Thellier method with pTRM-tail check

    Science.gov (United States)

    Maruuchi, T.; Shibuya, H.

    2009-12-01

    , and 42 specimens were submitted to Thellier experiments. Twelve specimens from 4 sites passed the same criteria as Aso-2, yielding a mean paleointensity of 43.1±1.4 μT. It again agrees with the value (45.6±1.7 μT) of Takai et al. (2002). The LTD-DHT Shaw method experiment was also applied to 12 specimens from 3 sites, and 4 passed the criteria, giving 38.2±1.7 μT. Although this is a little smaller than the Thellier results, it is much larger than the Sint-800 value at the time of Aso-4. The Aso-1 result in this study is more consistent with the Sint-800 at that time than that of Takai et al. (2002). But for Aso-2 and Aso-4, the new reliable paleointensity results suggest that the discrepancy from the Sint-800 cannot be attributed to experimental problems.

  5. Effect of Chemistry Triangle Oriented Learning Media on Cooperative, Individual and Conventional Method on Chemistry Learning Result

    Science.gov (United States)

    Latisma D, L.; Kurniawan, W.; Seprima, S.; Nirbayani, E. S.; Ellizar, E.; Hardeli, H.

    2018-04-01

    The purpose of this study was to determine which methods work well with Chemistry Triangle-oriented learning media. This quasi-experimental research involved first-grade senior high school students in six schools: two SMAN each in Solok city and in Pasaman, and two SMKN in Pariaman. The sampling technique was Cluster Random Sampling. Data were collected by test and analyzed by one-way ANOVA and the Kruskal-Wallis test, as sketched below. The results showed that in the Solok senior high schools the learning outcomes of students taught by the cooperative method were better than those of students taught by the conventional and individual methods, both for students with high initial ability and for those with low initial ability. The research in the SMK showed that the overall learning outcomes of students taught by the conventional method were better than those of students taught by the cooperative and individual methods. For students with high initial ability, outcomes under the individual method were better than under the cooperative method, and for students with low initial ability there was no difference in learning outcomes among the cooperative, individual and conventional methods. In the Pasaman senior high schools, no significant difference in learning outcomes was found among the three methods.
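
    The two tests named above are standard; a minimal sketch with invented score data for the three methods could look like this (the numbers carry no relation to the study's data).

      # Minimal sketch of the two tests mentioned in the record, on invented scores.
      from scipy.stats import f_oneway, kruskal

      cooperative = [78, 82, 74, 88, 91, 69, 80]
      individual = [70, 65, 72, 60, 77, 68, 71]
      conventional = [75, 73, 79, 81, 70, 76, 74]

      f_stat, p_anova = f_oneway(cooperative, individual, conventional)
      h_stat, p_kw = kruskal(cooperative, individual, conventional)
      print(f"one-way ANOVA:  F = {f_stat:.2f}, p = {p_anova:.3f}")
      print(f"Kruskal-Wallis: H = {h_stat:.2f}, p = {p_kw:.3f}")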

  6. Accuracy of the hypothetical sky-polarimetric Viking navigation versus sky conditions: revealing solar elevations and cloudinesses favourable for this navigation method.

    Science.gov (United States)

    Száz, Dénes; Farkas, Alexandra; Barta, András; Kretzer, Balázs; Blahó, Miklós; Egri, Ádám; Szabó, Gyula; Horváth, Gábor

    2017-09-01

    According to Thorkild Ramskou's theory proposed in 1967, under overcast and foggy skies, Viking seafarers might have used skylight polarization analysed with special crystals called sunstones to determine the position of the invisible Sun. After finding the occluded Sun with sunstones, its elevation angle had to be measured and its shadow had to be projected onto the horizontal surface of a sun compass. According to Ramskou's theory, these sunstones might have been birefringent calcite or dichroic cordierite or tourmaline crystals working as polarizers. It has frequently been claimed that this method might have been suitable for navigation even in cloudy weather. This hypothesis has been accepted and frequently cited for decades without any experimental support. In this work, we determined the accuracy of this hypothetical sky-polarimetric Viking navigation for 1080 different sky situations characterized by solar elevation θ and cloudiness ρ , the sky polarization patterns of which were measured by full-sky imaging polarimetry. We used the earlier measured uncertainty functions of the navigation steps 1, 2 and 3 for calcite, cordierite and tourmaline sunstone crystals, respectively, and the newly measured uncertainty function of step 4 presented here. As a result, we revealed the meteorological conditions under which Vikings could have used this hypothetical navigation method. We determined the solar elevations at which the navigation uncertainties are minimal at summer solstice and spring equinox for all three sunstone types. On average, calcite sunstone ensures a more accurate sky-polarimetric navigation than tourmaline and cordierite. However, in some special cases (generally at 35° ≤  θ  ≤ 40°, 1 okta ≤  ρ  ≤ 6 oktas for summer solstice, and at 20° ≤  θ  ≤ 25°, 0 okta ≤  ρ  ≤ 4 oktas for spring equinox), the use of tourmaline and cordierite results in smaller navigation uncertainties than that of calcite

  7. SOLVING NONLINEAR KLEIN-GORDON EQUATION WITH A QUADRATIC NONLINEAR TERM USING HOMOTOPY ANALYSIS METHOD

    Directory of Open Access Journals (Sweden)

    H. Jafari

    2010-07-01

    Full Text Available In this paper, the nonlinear Klein-Gordon equation with a quadratic term is solved by means of an analytic technique, namely the homotopy analysis method (HAM). Comparisons are made between the Adomian decomposition method (ADM), the exact solution and the homotopy analysis method. The results reveal that the proposed method is very effective and simple.
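
    The record does not state the exact equation; a commonly studied form of the nonlinear Klein-Gordon equation with a quadratic nonlinearity, given here only to fix notation, is

      \[
        u_{tt} + \alpha\,u_{xx} + \beta\,u + \gamma\,u^{2} = f(x,t),
        \qquad u(x,0) = g(x), \quad u_{t}(x,0) = h(x),
      \]

    with constants \alpha, \beta, \gamma and prescribed initial data g and h.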

  8. Enhancing activated-peroxide formulations for porous materials: Test methods and results

    Energy Technology Data Exchange (ETDEWEB)

    Krauter, Paula [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Tucker, Mark D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Tezak, Matthew S. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Boucher, Raymond [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2012-12-01

    During an urban wide-area incident involving the release of a biological warfare agent, the recovery/restoration effort will require extensive resources and will tax the current capabilities of the government and private contractors. In fact, resources may be so limited that decontamination by facility owners/occupants may become necessary, and a simple decontamination process and material should be available for this use. One potential process for use by facility owners/occupants would be a liquid sporicidal decontaminant, such as pH-amended bleach or activated peroxide, and simple application devices. While pH-amended bleach is currently the recommended low-tech decontamination solution, a less corrosive and toxic decontaminant is desirable. The objective of this project is to provide an operational assessment of an alternative to chlorine bleach for low-tech decontamination applications: activated hydrogen peroxide. This report provides the methods and results for the activated-peroxide evaluation experiments. The results suggest that the efficacy of an activated-peroxide decontaminant is similar to pH-amended bleach on many common materials.

  9. Locating previously unknown patterns in data-mining results: a dual data- and knowledge-mining method

    Directory of Open Access Journals (Sweden)

    Knaus William A

    2006-03-01

    Full Text Available Abstract Background Data mining can be utilized to automate analysis of substantial amounts of data produced in many organizations. However, data mining produces large numbers of rules and patterns, many of which are not useful. Existing methods for pruning uninteresting patterns have only begun to automate the knowledge acquisition step (which is required for subjective measures of interestingness), hence leaving a serious bottleneck. In this paper we propose a method for automatically acquiring knowledge to shorten the pattern list by locating the novel and interesting ones. Methods The dual-mining method is based on automatically comparing the strength of patterns mined from a database with the strength of equivalent patterns mined from a relevant knowledgebase. When these two estimates of pattern strength do not match, a high "surprise score" is assigned to the pattern, identifying the pattern as potentially interesting. The surprise score captures the degree of novelty or interestingness of the mined pattern. In addition, we show how to compute p values for each surprise score, thus filtering out noise and attaching statistical significance. Results We have implemented the dual-mining method using scripts written in Perl and R. We applied the method to a large patient database and a biomedical literature citation knowledgebase. The system estimated association scores for 50,000 patterns, composed of disease entities and lab results, by querying the database and the knowledgebase. It then computed the surprise scores by comparing the pairs of association scores. Finally, the system estimated statistical significance of the scores. Conclusion The dual-mining method eliminates more than 90% of patterns with strong associations, thus identifying them as uninteresting. We found that the pruning of patterns using the surprise score matched the biomedical evidence in the 100 cases that were examined by hand. The method automates the acquisition of
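
    A toy sketch of the core comparison (not the authors' Perl/R implementation, and with invented counts): estimate an association score for a disease/lab-result pattern in the database and in the knowledgebase, and score the discrepancy.

      # Toy version of the dual-mining comparison: a pattern is "surprising" when its
      # association strength in the patient database disagrees with the strength of
      # the equivalent pattern in the literature knowledgebase. Counts are invented.
      import math

      def association(co_count, count_a, count_b, total):
          # lift-style score from co-occurrence and marginal counts
          return (co_count * total) / max(count_a * count_b, 1)

      def surprise(db_counts, kb_counts):
          s_db = association(*db_counts)
          s_kb = association(*kb_counts)
          return abs(math.log((s_db + 1e-9) / (s_kb + 1e-9)))  # 0 = agreement, large = surprising

      # (co-occurrences, marginal count of disease, marginal count of lab result, total records)
      database = (120, 400, 900, 50_000)
      knowledgebase = (15, 2_000, 5_000, 9_000_000)
      print(f"surprise score = {surprise(database, knowledgebase):.2f}")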

  10. Some results about the dating of pre hispanic mexican ceramics by the thermoluminescence method

    International Nuclear Information System (INIS)

    Gonzalez M, P.; Mendoza A, D.; Ramirez L, A.; Schaaf, P.

    2004-01-01

    One of the most frequently recurring questions in Archaeometry concerns the age of the studied objects. The first dating methods were based on historical narrations, building styles and manufacture techniques. However, it has been observed that, as a consequence of continuous irradiation from naturally occurring radioisotopes and from cosmic rays, some materials, such as archaeological ceramics, accumulate a certain quantity of energy. These types of material can, in principle, be dated through the analysis of this accumulated energy. In that case, ceramic dating can be realized by thermoluminescence (TL) dating. In this work, results obtained by our research group on the TL dating of ceramics belonging to several archaeological zones such as Edzna (Campeche), Calixtlahuaca and Teotenango (Mexico State) and Hervideros (Durango) are presented. The analysis was realized using the fine-grain mode in a Daybreak model 1100 TL reader system. The radioisotopes that contribute to the annual dose accumulated in the ceramic samples (40K, 238U, 232Th) were determined by means of techniques such as Energy Dispersive X-ray Spectroscopy (EDS) and Neutron Activation Analysis (NAA). Our results agree with results obtained through other methods. (Author) 7 refs., 2 tabs., 5 figs
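
    The age relation underlying TL dating (standard background, not spelled out in the record) divides the accumulated equivalent dose by the total annual dose rate from the listed radioisotopes plus the cosmic-ray contribution:

      \[
        \text{Age} \;=\; \frac{D_{e}}{\dot{D}_{\mathrm{K}} + \dot{D}_{\mathrm{U}} + \dot{D}_{\mathrm{Th}} + \dot{D}_{\text{cosmic}}},
      \]

    where D_e is the equivalent (accumulated) dose recorded by the TL signal and the denominator is the total annual dose rate.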

  11. Improving the accuracy of myocardial perfusion scintigraphy results by machine learning method

    International Nuclear Information System (INIS)

    Groselj, C.; Kukar, M.

    2002-01-01

    Full text: Machine learning (ML), a rapidly growing subfield of artificial intelligence, has over the last decade proven to be a useful tool in many fields of decision making, including some fields of medicine. Its decision accuracy usually exceeds that of humans. The aim was to assess the applicability of ML in interpreting the results of stress myocardial perfusion scintigraphy for CAD diagnosis. The planar stress myocardial perfusion scintigraphy data of 327 patients were re-evaluated in the usual way. By comparing them with the results of coronary angiography, the sensitivity, specificity and accuracy of the investigation were computed. The data were digitized and the decision procedure was repeated by the ML program 'Naive Bayesian classifier'. As ML is able to handle any number of variables simultaneously, all available disease-related data (regarding history, habitus, risk factors, stress results) were added. The sensitivity, specificity and accuracy for scintigraphy were expressed in this way as well. The results of both decision procedures were compared. With the ML method, 19 more patients out of 327 (5.8%) were correctly diagnosed by stress myocardial perfusion scintigraphy. ML could be an important tool for decision making in myocardial perfusion scintigraphy. (author)
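
    A minimal illustration of the kind of classifier named in the record, on synthetic stand-in features (not the authors' data or software; scikit-learn is assumed here):

      # Naive Bayes sketch on synthetic stand-in features (e.g. age, risk-factor count,
      # stress-test score, perfusion-defect score); illustrative only.
      import numpy as np
      from sklearn.naive_bayes import GaussianNB
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(2)
      n = 327
      X = rng.normal(size=(n, 4))
      y = (X[:, 2] + X[:, 3] + rng.normal(scale=0.8, size=n) > 0).astype(int)  # synthetic CAD label

      scores = cross_val_score(GaussianNB(), X, y, cv=5, scoring="accuracy")
      print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")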

  12. Phylogenetic analysis of a spontaneous cocoa bean fermentation metagenome reveals new insights into its bacterial and fungal community diversity.

    Directory of Open Access Journals (Sweden)

    Koen Illeghems

    Full Text Available This is the first report on the phylogenetic analysis of the community diversity of a single spontaneous cocoa bean box fermentation sample through a metagenomic approach involving 454 pyrosequencing. Several sequence-based and composition-based taxonomic profiling tools were used and evaluated to avoid software-dependent results and their outcome was validated by comparison with previously obtained culture-dependent and culture-independent data. Overall, this approach revealed a wider bacterial (mainly γ-Proteobacteria and fungal diversity than previously found. Further, the use of a combination of different classification methods, in a software-independent way, helped to understand the actual composition of the microbial ecosystem under study. In addition, bacteriophage-related sequences were found. The bacterial diversity depended partially on the methods used, as composition-based methods predicted a wider diversity than sequence-based methods, and as classification methods based solely on phylogenetic marker genes predicted a more restricted diversity compared with methods that took all reads into account. The metagenomic sequencing analysis identified Hanseniaspora uvarum, Hanseniaspora opuntiae, Saccharomyces cerevisiae, Lactobacillus fermentum, and Acetobacter pasteurianus as the prevailing species. Also, the presence of occasional members of the cocoa bean fermentation process was revealed (such as Erwinia tasmaniensis, Lactobacillus brevis, Lactobacillus casei, Lactobacillus rhamnosus, Lactococcus lactis, Leuconostoc mesenteroides, and Oenococcus oeni. Furthermore, the sequence reads associated with viral communities were of a restricted diversity, dominated by Myoviridae and Siphoviridae, and reflecting Lactobacillus as the dominant host. To conclude, an accurate overview of all members of a cocoa bean fermentation process sample was revealed, indicating the superiority of metagenomic sequencing over previously used techniques.

  13. Nondestructive methods for the structural evaluation of wood floor systems in historic buildings : preliminary results : [abstract]

    Science.gov (United States)

    Zhiyong Cai; Michael O. Hunt; Robert J. Ross; Lawrence A. Soltis

    1999-01-01

    To date, there is no standard method for evaluating the structural integrity of wood floor systems using nondestructive techniques. Current methods of examination and assessment are often subjective and therefore tend to yield imprecise or variable results. For this reason, estimates of allowable wood floor loads are often conservative. The assignment of conservatively...

  14. The numerical method of inverse Laplace transform for calculation of overvoltages in power transformers and test results

    Directory of Open Access Journals (Sweden)

    Mikulović Jovan Č.

    2014-01-01

    Full Text Available A methodology for the calculation of overvoltages in transformer windings, based on a numerical method of inverse Laplace transformation, is presented. The mathematical model of the transformer windings is described by partial differential equations corresponding to distributed-parameter electrical circuits. The procedure for calculating the overvoltages is applied to windings having either an isolated neutral point, a grounded neutral point, or a neutral point grounded through an impedance. A comparative analysis of the calculation results obtained by the proposed numerical method and by an analytical method for calculating overvoltages in transformer windings is presented. The results computed by the proposed method and the measured voltage distributions, when a voltage surge is applied to a three-phase 30 kVA power transformer, are compared. [Project of the Ministry of Science of the Republic of Serbia, No. TR-33037 and No. TR-33020]
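
    Numerical Laplace inversion of the kind referred to can be tried with standard routines; the sketch below uses mpmath's invertlaplace on a simple image function with a known original (this is not the paper's algorithm or its transformer-winding model).

      # Numerical inverse Laplace transform of a simple image function; the exact
      # original f(t) = exp(-t)*sin(2t) is known, so the error can be checked.
      # Uses mpmath.invertlaplace (Talbot contour); illustrative only.
      import mpmath as mp

      def F(s):
          return 2 / ((s + 1) ** 2 + 4)      # Laplace image of exp(-t) * sin(2 t)

      for t in (0.5, 1.0, 2.0):
          numeric = mp.invertlaplace(F, t, method="talbot")
          exact = mp.exp(-t) * mp.sin(2 * t)
          print(f"t = {t}: numeric = {float(numeric):.6f}, exact = {float(exact):.6f}")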

  15. Demand effects of consumers’ stated and revealed preferences

    OpenAIRE

    Engström, Per; Forsell, Eskil

    2013-01-01

    Knowledge of how consumers react to different quality signals is fundamental for understanding how markets work. We study the online marketplace for Android apps, where we compare the causal effects on demand of two quality-related signals: other consumers' stated and revealed preferences toward an app. Our main result is that consumers are much more responsive to other consumers' revealed preferences than to their stated preferences. A 10 percentile increase in displayed average ra...

  16. Results of the determination of He in Cenozoic aquifers using the GC method.

    Science.gov (United States)

    Kotowski, Tomasz; Najman, Joanna

    2015-04-01

    Applications of the helium (He) method known so far have consisted mainly of 4He measurements using a special mass spectrometer. 4He measurements for groundwater dating purposes can be replaced by total He (3He+4He) concentration measurements, because the content of 3He can be ignored: the concentrations of 3He are very low and 3He/4He ratios do not exceed 1.0·10^-5 in most cases. In this study, the total He concentrations in groundwater were determined using the gas chromatographic (GC) method as an alternative to methods based on spectrometric measurement. He concentrations in groundwater were used for the determination of residence time and groundwater circulation. Additionally, the radiocarbon method was used to determine the value of the external He flux (JHe) in the study area. The low He concentrations obtained, and their small variation within the ca. 65 km long section along which the groundwater flows, indicate a likely short residence time and a strong hydraulic connection between the aquifers. The estimated residence time (ca. 3000 years) depends heavily on the large uncertainty of the He concentration resulting from the low concentrations of He, on the external 4He flux value adopted for calculation purposes, and on the 14C ages used to estimate the external 4He flux. © 2015, National Ground Water Association.

  17. GRS Method for Uncertainty and Sensitivity Evaluation of Code Results and Applications

    International Nuclear Information System (INIS)

    Glaeser, H.

    2008-01-01

    In recent years there has been increasing interest in computational reactor safety analysis in replacing conservative evaluation model calculations by best-estimate calculations supplemented by an uncertainty analysis of the code results. The evaluation of the margin to acceptance criteria, for example the maximum fuel rod clad temperature, should be based on the upper limit of the calculated uncertainty range. Uncertainty analysis is needed if useful conclusions are to be obtained from best-estimate thermal-hydraulic code calculations; otherwise single values of unknown accuracy would be presented for comparison with regulatory acceptance limits. Methods have been developed and presented to quantify the uncertainty of computer code results. The basic techniques proposed by GRS are presented together with applications to a large-break loss-of-coolant accident on a reference reactor as well as to an experiment simulating containment behaviour.
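
    As background (not stated in this record), the GRS approach is usually associated with Wilks' tolerance-limit formula for choosing the number of code runs: for a one-sided 95%/95% statement, the smallest n satisfying

      \[
        1 - \gamma^{\,n} \;\ge\; \beta, \qquad \gamma = \beta = 0.95,
      \]

    is n = 59, which is why 59 (or more) randomly varied code runs are commonly quoted for this type of uncertainty statement.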

  18. Integrating Quantitative and Qualitative Results in Health Science Mixed Methods Research Through Joint Displays.

    Science.gov (United States)

    Guetterman, Timothy C; Fetters, Michael D; Creswell, John W

    2015-11-01

    Mixed methods research is becoming an important methodology to investigate complex health-related topics, yet the meaningful integration of qualitative and quantitative data remains elusive and needs further development. A promising innovation to facilitate integration is the use of visual joint displays that bring data together visually to draw out new insights. The purpose of this study was to identify exemplar joint displays by analyzing the various types of joint displays being used in published articles. We searched for empirical articles that included joint displays in 3 journals that publish state-of-the-art mixed methods research. We analyzed each of 19 identified joint displays to extract the type of display, mixed methods design, purpose, rationale, qualitative and quantitative data sources, integration approaches, and analytic strategies. Our analysis focused on what each display communicated and its representation of mixed methods analysis. The most prevalent types of joint displays were statistics-by-themes and side-by-side comparisons. Innovative joint displays connected findings to theoretical frameworks or recommendations. Researchers used joint displays for convergent, explanatory sequential, exploratory sequential, and intervention designs. We identified exemplars for each of these designs by analyzing the inferences gained through using the joint display. Exemplars represented mixed methods integration, presented integrated results, and yielded new insights. Joint displays appear to provide a structure to discuss the integrated analysis and assist both researchers and readers in understanding how mixed methods provides new insights. We encourage researchers to use joint displays to integrate and represent mixed methods analysis and discuss their value. © 2015 Annals of Family Medicine, Inc.

  19. Estimated H-atom anisotropic displacement parameters: a comparison between different methods and with neutron diffraction results

    DEFF Research Database (Denmark)

    Munshi, Parthapratim; Madsen, Anders Ø; Spackman, Mark A

    2008-01-01

    systems and identify systematic discrepancies for several atom types. A revised and extended library of internal H-atom mean-square displacements is presented for use with Madsen's SHADE web server [J. Appl. Cryst. (2006), 39, 757-758; http://shade.ki.ku.dk], and the improvement over the original SHADE...... in the agreement with neutron results. The SHADE2 library, now incorporated in the SHADE web server, is recommended as a routine procedure for deriving estimates of H-atom ADPs suitable for use in charge-density studies on molecular crystals, and its widespread use should reveal remaining deficiencies and perhaps...... results is substantial, suggesting that this is now the most readily and widely applicable of the three approximate procedures. Using this new library--SHADE2--it is shown that, in line with expectations, a segmented rigid-body description of the heavy atoms yields only a small improvement...

  20. Analytical Evaluation of Beam Deformation Problem Using Approximate Methods

    DEFF Research Database (Denmark)

    Barari, Amin; Kimiaeifar, A.; Domairry, G.

    2010-01-01

    The beam deformation equation has very wide applications in structural engineering. As a differential equation, it has its own problems concerning existence, uniqueness and methods of solution. Often, the original forms of the governing differential equations used in engineering problems are simplified, and this process produces noise in the obtained answers. This paper deals with the solution of the second-order differential equation governing beam deformation using four approximate analytical methods, namely the Perturbation method, the Homotopy Perturbation Method (HPM), the Homotopy Analysis Method (HAM) and the Variational Iteration Method (VIM). The comparisons of the results reveal that these methods are very effective, convenient and quite accurate for systems of non-linear differential equations.

  1. Some new results on correlation-preserving factor scores prediction methods

    NARCIS (Netherlands)

    Ten Berge, J.M.F.; Krijnen, W.P.; Wansbeek, T.J.; Shapiro, A.

    1999-01-01

    Anderson and Rubin and McDonald have proposed a correlation-preserving method of factor scores prediction which minimizes the trace of a residual covariance matrix for variables. Green has proposed a correlation-preserving method which minimizes the trace of a residual covariance matrix for factors.

  2. Results from the FIN-2 formal comparison

    Science.gov (United States)

    Connolly, Paul; Hoose, Corinna; Liu, Xiaohong; Moehler, Ottmar; Cziczo, Daniel; DeMott, Paul

    2017-04-01

    During the Fifth International Ice Nucleation Workshop (FIN-2) at the AIDA Ice Nucleation facility in Karlsruhe, Germany in March 2015, a formal comparison of ice nucleation measurement methods was conducted. During the experiments the samples of ice nucleating particles were not revealed to the instrument scientists, hence this was referred to as a "blind comparison". The two samples used were later revealed to be Arizona Test Dust and an Argentina soil sample. For these two samples seven mobile ice nucleating particle counters sampled directly from the AIDA chamber or from the aerosol preparation chamber at specified temperatures, whereas filter samples were taken for two offline deposition nucleation instruments. Wet suspension methods for determining IN concentrations were also used with 10 different methods employed. For the wet suspension methods experiments were conducted using INPs collected from the air inside the chambers (impinger sampling) and INPs taken from the bulk samples (vial sampling). Direct comparisons of the ice nucleating particle concentrations are reported as well as derived ice nucleation active site densities. The study highlights the difficulties in performing such analyses, but generally indicates that there is reasonable agreement between the wet suspension techniques. It is noted that ice nucleation efficiency derived from the AIDA chamber (quantified using the ice active surface site density approach) is higher than that for the cold stage techniques. This is both true for the Argentina soil sample and, to a lesser extent, for the Arizona Test Dust sample too. Other interesting effects were noted: for the ATD the impinger sampling demonstrated higher INP efficiency at higher temperatures (>255 K) than the vial sampling, but agreed at the lower temperatures (<255K), whereas the opposite was true for the Argentina soil sample. The results are analysed to better understand the performance of the various techniques and to address any

  3. Review of Calibration Methods for Scheimpflug Camera

    Directory of Open Access Journals (Sweden)

    Cong Sun

    2018-01-01

    Full Text Available The Scheimpflug camera offers a wide range of applications in the fields of typical close-range photogrammetry, particle image velocimetry, and digital image correlation, due to the fact that the depth of view of a Scheimpflug camera can be greatly extended according to the Scheimpflug condition. Yet conventional calibration methods are not applicable in this case, because the assumptions used by classical calibration methodologies are no longer valid for cameras under the Scheimpflug condition. Therefore, various methods have been investigated to solve the problem over the last few years. However, no comprehensive review exists that provides an insight into recent calibration methods for Scheimpflug cameras. This paper presents a survey of recent calibration methods for Scheimpflug cameras with perspective lenses, including the general nonparametric imaging model, and analyzes in detail the advantages and drawbacks of the mainstream calibration models with respect to each other. Real-data experiments including calibrations, reconstructions, and measurements are performed to assess the performance of the models. The results reveal that the accuracies of the RMM, PLVM, PCIM, and GNIM are basically equal, while the accuracy of GNIM is slightly lower compared with the other three parametric models. Moreover, the experimental results reveal that the parameters of the tangential distortion are likely coupled with the tilt angle of the sensor in Scheimpflug calibration models. The work of this paper lays the foundation for further research on Scheimpflug cameras.

  4. A kernel-based multivariate feature selection method for microarray data classification.

    Directory of Open Access Journals (Sweden)

    Shiquan Sun

    Full Text Available High dimensionality and small sample sizes, and their inherent risk of overfitting, pose great challenges for constructing efficient classifiers in microarray data classification. Therefore a feature selection technique should be conducted prior to data classification to enhance prediction performance. In general, filter methods can be considered as principal or auxiliary selection mechanisms because of their simplicity, scalability, and low computational complexity. However, a series of trivial examples show that filter methods can result in less accurate performance because they ignore the dependencies between features. Although a few publications have devoted their attention to revealing the relationships among features by multivariate-based methods, these methods describe the relationships among features only linearly, and simple linear combination relationships restrict the improvement in performance. In this paper, we used a kernel method to discover inherent nonlinear correlations among features as well as between features and the target. Moreover, the number of orthogonal components was determined by kernel Fisher's linear discriminant analysis (FLDA) in a self-adaptive manner rather than by manual parameter settings. In order to demonstrate the effectiveness of our method we performed several experiments and compared the results between our method and other competitive multivariate-based feature selectors. In our comparison, we used two classifiers (support vector machine and k-nearest neighbor) on two groups of datasets, namely two-class and multi-class datasets. Experimental results demonstrate that the performance of our method is better than others, especially on three hard-to-classify datasets, namely Wang's Breast Cancer, Gordon's Lung Adenocarcinoma and Pomeroy's Medulloblastoma.
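
    To illustrate the general idea of kernel-based (nonlinear) relevance scoring, the sketch below computes an HSIC-style dependence score per feature with RBF kernels; this is a related generic statistic, not the kernel-FLDA procedure of the paper.

      # Generic HSIC-style dependence score per feature (RBF kernels); illustrative
      # of nonlinear feature-target relevance, not the paper's kernel-FLDA method.
      import numpy as np

      def rbf_kernel(x, gamma=1.0):
          d2 = (x[:, None] - x[None, :]) ** 2
          return np.exp(-gamma * d2)

      def hsic_score(feature, target):
          n = len(feature)
          K = rbf_kernel((feature - feature.mean()) / (feature.std() + 1e-9))
          L = rbf_kernel(target.astype(float))
          H = np.eye(n) - np.ones((n, n)) / n           # centering matrix
          return np.trace(K @ H @ L @ H) / (n - 1) ** 2

      rng = np.random.default_rng(3)
      labels = rng.integers(0, 2, size=100)
      informative = labels + 0.3 * rng.normal(size=100)   # feature related to the class label
      noise = rng.normal(size=100)                        # irrelevant feature
      print(f"informative feature: {hsic_score(informative, labels):.4f}")
      print(f"noise feature:       {hsic_score(noise, labels):.4f}")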

  5. Quantifying viruses and bacteria in wastewater—Results, interpretation methods, and quality control

    Science.gov (United States)

    Francy, Donna S.; Stelzer, Erin A.; Bushon, Rebecca N.; Brady, Amie M.G.; Mailot, Brian E.; Spencer, Susan K.; Borchardt, Mark A.; Elber, Ashley G.; Riddell, Kimberly R.; Gellner, Terry M.

    2011-01-01

    Membrane bioreactors (MBR), used for wastewater treatment in Ohio and elsewhere in the United States, have pore sizes small enough to theoretically reduce concentrations of protozoa and bacteria, but not viruses. Sampling for viruses in wastewater is seldom done and not required. Instead, the bacterial indicators Escherichia coli (E. coli) and fecal coliforms are the required microbial measures of effluents for wastewater-discharge permits. Information is needed on the effectiveness of MBRs in removing human enteric viruses from wastewaters, particularly as compared to conventional wastewater treatment before and after disinfection. A total of 73 regular and 28 quality-control (QC) samples were collected at three MBR and two conventional wastewater plants in Ohio during 23 regular and 3 QC sampling trips in 2008-10. Samples were collected at various stages in the treatment processes and analyzed for bacterial indicators E. coli, fecal coliforms, and enterococci by membrane filtration; somatic and F-specific coliphage by the single agar layer (SAL) method; adenovirus, enterovirus, norovirus GI and GII, rotavirus, and hepatitis A virus by molecular methods; and viruses by cell culture. While addressing the main objective of the study (comparing removal of viruses and bacterial indicators in MBR and conventional plants), it was realized that work was needed to identify data analysis and quantification methods for interpreting enteric virus and QC data. Therefore, methods for quantifying viruses, qualifying results, and applying QC data to interpretations are described in this report. During each regular sampling trip, samples were collected (1) before conventional or MBR treatment (post-preliminary), (2) after secondary or MBR treatment (post-secondary or post-MBR), (3) after tertiary treatment (one conventional plant only), and (4) after disinfection (post-disinfection). Glass-wool fiber filtration was used to concentrate enteric viruses from large volumes, and small

  6. Ti α - ω phase transformation and metastable structure, revealed by the solid-state nudged elastic band method

    Science.gov (United States)

    Zarkevich, Nikolai; Johnson, Duane D.

    Titanium is one of the four most utilized structural metals, and, hence, its structural changes and potential metastable phases under stress are of considerable importance. Using DFT+U combined with the generalized solid-state nudged elastic band (SS-NEB) method, we consider the pressure-driven transformation between the Ti α and ω phases, and find an intermediate metastable body-centered orthorhombic (bco) structure of lower density. We verify its stability, assess the phonons and electronic structure, and compare computational results to experiment. Interestingly, standard density functional theory (DFT) yields the ω phase as the Ti ground state, in contradiction to the observed α phase at low pressure and temperature. We correct this by proper consideration of the strongly correlated d-electrons, and utilize the DFT+U method in the SS-NEB to obtain the relevant transformation pathway and structures. We use methods developed with support by the U.S. Department of Energy (DE-FG02-03ER46026 and DE-AC02-07CH11358). Ames Laboratory is operated for the DOE by Iowa State University under Contract DE-AC02-07CH11358.

  7. Comments on Brodsky's statistical methods for evaluating epidemiological results, and reply by Brodsky, A

    International Nuclear Information System (INIS)

    Frome, E.L.; Khare, M.

    1980-01-01

    Brodsky's paper 'A Statistical Method for Testing Epidemiological Results, as applied to the Hanford Worker Population', (Health Phys., 36, 611-628, 1979) proposed two test statistics for use in comparing the survival experience of a group of employees and controls. This letter states that both of the test statistics were computed using incorrect formulas and concludes that the results obtained using these statistics may also be incorrect. In his reply Brodsky concurs with the comments on the proper formulation of estimates of pooled standard errors in constructing test statistics but believes that the erroneous formulation does not invalidate the major points, results and discussions of his paper. (author)

  8. Study of Tip-loss Using an Inverse 3D Navier-Stokes Method

    DEFF Research Database (Denmark)

    Mikkelsen, Robert; Sørensen, Jens Nørkær; Shen, Wen Zhong

    2003-01-01

    the 3D Navier-Stokes equations combined with the actuator line technique, where blade loading is applied using an inverse method. The numerical simulations show that the method captures the tip correction when compared with the theories of Prandtl and Goldstein; however, the accuracy of the obtained...... results reveals that further refinement is still needed. Keywords: Tip-loss; Actuator line; 3D Navier-Stokes methods....

  9. Radiochemical studies of some preparation methods for phosphorus

    International Nuclear Information System (INIS)

    Loos-Neskovic, C.; Fedoroff, M.

    1983-01-01

    Various methods of radiochemical separation were tested for the determination of phosphorus in metals and alloys by neutron activation analysis. Classical methods of separation revealed some defects when applied to this problem. Methods using liquid extraction gave low yields and were not reproducible. Methods based on precipitation gave better results, but in most cases were not selective enough. Retention on alumina was not possible without preliminary separations. The authors studied a new radiochemical separation based on the extraction of elemental phosphorus in the gaseous phase after reduction at high temperature with carbon. Measurements with radioactive phosphorus showed that the extraction yield is better than 99%. (author)

  10. 3D ultrasound computer tomography: Hardware setup, reconstruction methods and first clinical results

    Science.gov (United States)

    Gemmeke, Hartmut; Hopp, Torsten; Zapf, Michael; Kaiser, Clemens; Ruiter, Nicole V.

    2017-11-01

    A promising candidate for improved imaging of breast cancer is ultrasound computer tomography (USCT). Current experimental USCT systems are still focused in the elevation dimension, resulting in a large slice thickness, limited depth of field, loss of out-of-plane reflections, and a large number of movement steps to acquire a stack of images. A 3D USCT emitting and receiving spherical wave fronts overcomes these limitations. We built an optimized 3D USCT, realizing for the first time the full benefits of a 3D system. The point spread function could be shown to be nearly isotropic in 3D, to have very low spatial variability, and to fit the predicted values. The contrast of the phantom images is very satisfactory in spite of imaging with a sparse aperture. The resolution and imaged details of the reflectivity reconstruction are comparable to a 3 T MRI volume. The simultaneously obtained results of the transmission tomography are important for the obtained resolution. The KIT 3D USCT was then tested in a pilot study on ten patients. The primary goals of the pilot study were to test the USCT device, the data acquisition protocols, the image reconstruction methods and the image fusion techniques in a clinical environment. The study was conducted successfully; the data acquisition could be carried out for all patients with an average imaging time of six minutes per breast. The reconstructions provide promising images. Overlaid volumes of the modalities show qualitative and quantitative information at a glance. This paper gives a summary of the involved techniques, methods, and first results.

  11. Precise charge density studies by maximum entropy method

    CERN Document Server

    Takata, M

    2003-01-01

    Structural information is indispensable for the production, research and development of nanomaterials. Recently, a sophisticated analytical method based on information theory, the Maximum Entropy Method (MEM) using synchrotron radiation powder data, has been successfully applied to determine precise charge densities of metallofullerenes and nanochannel microporous compounds. The results revealed various endohedral natures of metallofullerenes and the one-dimensional array formation of adsorbed gas molecules in nanochannel microporous compounds. The concept of MEM analysis is also described briefly. (author)

  12. Application of a hierarchical enzyme classification method reveals the role of gut microbiome in human metabolism.

    Science.gov (United States)

    Mohammed, Akram; Guda, Chittibabu

    2015-01-01

    Enzymes are known as the molecular machines that drive the metabolism of an organism; hence identification of the full enzyme complement of an organism is essential to build the metabolic blueprint of that species as well as to understand the interplay of multiple species in an ecosystem. Experimental characterization of the enzymatic reactions of all enzymes in a genome is a tedious and expensive task. The problem is more pronounced in the metagenomic samples where even the species are not adequately cultured or characterized. Enzymes encoded by the gut microbiota play an essential role in the host metabolism; thus, warranting the need to accurately identify and annotate the full enzyme complements of species in the genomic and metagenomic projects. To fulfill this need, we develop and apply a method called ECemble, an ensemble approach to identify enzymes and enzyme classes and study the human gut metabolic pathways. ECemble method uses an ensemble of machine-learning methods to accurately model and predict enzymes from protein sequences and also identifies the enzyme classes and subclasses at the finest resolution. A tenfold cross-validation result shows accuracy between 97 and 99% at different levels in the hierarchy of enzyme classification, which is superior to comparable methods. We applied ECemble to predict the entire complements of enzymes from ten sequenced proteomes including the human proteome. We also applied this method to predict enzymes encoded by the human gut microbiome from gut metagenomic samples, and to study the role played by the microbe-derived enzymes in the human metabolism. After mapping the known and predicted enzymes to canonical human pathways, we identified 48 pathways that have at least one bacteria-encoded enzyme, which demonstrates the complementary role of gut microbiome in human gut metabolism. These pathways are primarily involved in metabolizing dietary nutrients such as carbohydrates, amino acids, lipids, cofactors and
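    As a rough illustration of the ensemble idea behind such classifiers (not the ECemble pipeline itself), the following scikit-learn sketch combines several base learners by soft voting and evaluates them with tenfold cross-validation; the synthetic dataset and the choice of estimators are assumptions made only for this example.

```python
# Minimal sketch of an ensemble-of-classifiers idea (soft voting over several
# base learners), in the spirit of ensemble enzyme-class prediction.
# Features and data here are synthetic stand-ins, not the ECemble pipeline.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, n_features=40, n_informative=10,
                           n_classes=3, n_clusters_per_class=1, random_state=0)

ensemble = VotingClassifier(
    estimators=[("svm", SVC(probability=True, random_state=0)),
                ("rf", RandomForestClassifier(random_state=0)),
                ("lr", LogisticRegression(max_iter=1000))],
    voting="soft")                    # average predicted class probabilities

print(cross_val_score(ensemble, X, y, cv=10).mean())  # 10-fold CV accuracy
```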

  13. Midface swelling reveals nasofrontal dermal sinus

    International Nuclear Information System (INIS)

    Houneida, Zaghouani Ben Alaya; Manel, Limeme; Latifa, Harzallah; Habib, Amara; Dejla, Bakir; Chekib, Kraiem

    2012-01-01

    Nasofrontal dermal sinuses are very rare and generally occur in children. This congenital malformation can be revealed by midface swelling, which can be complicated by local infection or neuromeningitis. Such complications make the dermal sinus a life-threatening disease. Two cases of nasofrontal dermal sinuses are reported in this work. The first case is an 11-month-old girl who presented with left orbitonasal soft tissue swelling accompanied by inflammation. Physical examination found fever, left orbitonasal thickening, and a puncture hole letting out pus. Computed tomography revealed microabscesses located at the left orbitonasal soft tissues, a frontal bone defect, and an intracranial cyst. Magnetic resonance imaging showed the transosseous tract between the glabella and the brain and affirmed the epidermoid nature of the intracranial cyst. The second case is a 7-year-old girl who presented with a nasofrontal non-progressive mass that intermittently secreted a yellow liquid through an external orifice located at the glabella. MRI revealed a cystic mass located in the deep layer of the glabellar skin related to an epidermoid cyst with a nasofrontal dermal sinus tract. In both cases, surgical excision was performed, and pathological confirmation was made for the diagnoses of dermal sinuses. The postoperative course was favorable. Through these cases, the authors stress the role of imaging methods in confirming the diagnosis and looking for associated cysts (dermoid and epidermoid) to improve recognition of this rare disease. Knowledge of the typical clinical presentations, imaging manifestations, and most common sites of occurrence of this malformation are needed to formulate a differential diagnosis.

  14. He's homotopy perturbation method for solving systems of Volterra integral equations of the second kind

    International Nuclear Information System (INIS)

    Biazar, J.; Ghazvini, H.

    2009-01-01

    In this paper, He's homotopy perturbation method is applied to solve systems of Volterra integral equations of the second kind. Some examples are presented to illustrate the ability of the method for both linear and non-linear systems. The results reveal that the method is very effective and simple.
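    For a single Volterra equation of the second kind, the homotopy perturbation series reduces to a simple recursion, which the SymPy sketch below illustrates on the textbook example u(x) = x - ∫₀ˣ (x - t) u(t) dt, whose exact solution is sin x. The example equation and the number of terms are choices made for this illustration, not taken from the paper above.

```python
# Minimal sketch: He's homotopy perturbation method for one Volterra integral
# equation of the second kind, u(x) = x - \int_0^x (x - t) u(t) dt.
# The HPM series terms reduce to v_0 = f(x), v_{n+1}(x) = -\int_0^x (x-t) v_n(t) dt.
import sympy as sp

x, t = sp.symbols("x t")
f = x                                   # forcing term f(x)
K = -(x - t)                            # kernel with its sign folded in

terms = [f]
for _ in range(4):                      # compute v_1 .. v_4
    prev = terms[-1].subs(x, t)         # v_n written in the dummy variable t
    terms.append(sp.integrate(K * prev, (t, 0, x)))

approx = sp.expand(sum(terms))
print(approx)                           # x - x**3/6 + x**5/120 - ...  (sin x series)
print(sp.series(sp.sin(x), x, 0, 10))   # compare with the Taylor series of sin x
```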

  15. Integrate life-cycle assessment and risk analysis results, not methods.

    Science.gov (United States)

    Linkov, Igor; Trump, Benjamin D; Wender, Ben A; Seager, Thomas P; Kennedy, Alan J; Keisler, Jeffrey M

    2017-08-04

    Two analytic perspectives on environmental assessment dominate environmental policy and decision-making: risk analysis (RA) and life-cycle assessment (LCA). RA focuses on management of a toxicological hazard in a specific exposure scenario, while LCA seeks a holistic estimation of impacts of thousands of substances across multiple media, including non-toxicological and non-chemically deleterious effects. While recommendations to integrate the two approaches have remained a consistent feature of environmental scholarship for at least 15 years, the current perception is that progress is slow largely because of practical obstacles, such as a lack of data, rather than insurmountable theoretical difficulties. Nonetheless, the emergence of nanotechnology presents a serious challenge to both perspectives. Because the pace of nanomaterial innovation far outstrips acquisition of environmentally relevant data, it is now clear that a further integration of RA and LCA based on dataset completion will remain futile. In fact, the two approaches are suited for different purposes and answer different questions. A more pragmatic approach to providing better guidance to decision-makers is to apply the two methods in parallel, integrating only after obtaining separate results.

  16. Re-Computation of Numerical Results Contained in NACA Report No. 496

    Science.gov (United States)

    Perry, Boyd, III

    2015-01-01

    An extensive examination of NACA Report No. 496 (NACA 496), "General Theory of Aerodynamic Instability and the Mechanism of Flutter," by Theodore Theodorsen, is described. The examination included checking equations and solution methods and re-computing interim quantities and all numerical examples in NACA 496. The checks revealed that NACA 496 contains computational shortcuts (time- and effort-saving devices for engineers of the time) and clever artifices (employed in its solution methods), but, unfortunately, also contains numerous tripping points (aspects of NACA 496 that have the potential to cause confusion) and some errors. The re-computations were performed employing the methods and procedures described in NACA 496, but using modern computational tools. With some exceptions, the magnitudes and trends of the original results were in fair-to-very-good agreement with the re-computed results. The exceptions included what are speculated to be computational errors in the original in some instances and transcription errors in the original in others. Independent flutter calculations were performed and, in all cases, including those where the original and re-computed results differed significantly, were in excellent agreement with the re-computed results. Appendix A contains NACA 496; Appendix B contains a Matlab (Registered) program that performs the re-computation of results; Appendix C presents three alternate solution methods, with examples, for the two-degree-of-freedom solution method of NACA 496; Appendix D contains the three-degree-of-freedom solution method (outlined in NACA 496 but never implemented), with examples.

  17. [131I therapy in hyperthyroidism. Results of treatment from 1960-1974].

    Science.gov (United States)

    Heinze, H G; Schenk, F

    1977-02-01

    488 patients with Graves' disease were treated with 131Iodine between 1960 and 1974. 427 (87.5%) of these patients were reexamined several times (clinically, 131I-uptake, PB127I, T4 (CPB-A), T3-uptake, and since 1973 the TRH-test). The 131I was given as an individually calculated single-dose treatment, using 7 000 -- 10 000 rd before 1965 and 6 000 rd thereafter. Two thirds of the patients became euthyroid after a single 131I dose. In 20% the treatment had to be repeated. These patients evidently show a different biological behaviour of their disease, since multiple treatments revealed a higher rate of failure (33--35%). There is no principal difference between the outcome after 131I therapy and after surgery concerning the rates of failure and relapse (3--4%) and hypothyroidism. The early incidence of hypothyroidism is dose-dependent, as could be shown in patients treated with higher doses before 1965. The reduction of the irradiation dose to 6 000 rd was followed by a drop in hypothyroidism from 18% to 7%. The reasons for the late incidence of hypothyroidism are discussed. The incidence of hypothyroidism was calculated by three different methods (overall incidence, incidence within the observed interval after therapy, and the life-table method). All three methods yielded different results. This has to be taken into account when comparing results after radioiodine as well as after surgery. Radioiodine therapy for hyperthyroidism offers a true alternative to surgery.

  18. Vehicle Speed Determination in Case of Road Accident by Software Method and Comparing of Results with the Mathematical Model

    Directory of Open Access Journals (Sweden)

    Hoxha Gezim

    2017-11-01

    Full Text Available The paper addresses the problem of vehicle speed calculation at road accidents. The PC Crash and Virtual Crash software packages are used to determine the speed. Concrete cases of road accidents are analysed with both methods. The calculation methods and a comparison of the results are presented for analysis. These methods consider several factors, such as: the front part of the vehicle, the technical features of the vehicle, the car angle, the displacement after the crash, road conditions, etc. The results obtained with the PC Crash and Virtual Crash software are shown in tables and graphs and compared with the mathematical methods.

  19. Rheology of transgenic switchgrass reveals practical aspects of biomass processing.

    Science.gov (United States)

    Wan, Guigui; Frazier, Taylor; Jorgensen, Julianne; Zhao, Bingyu; Frazier, Charles E

    2018-01-01

    Mechanical properties of transgenic switchgrass have practical implications for biorefinery technologies. Presented are fundamentals for simple (thermo)mechanical measurements of genetically transformed switchgrass. Experimental basics are provided for the novice, where the intention is to promote collaboration between plant biologists and materials scientists. Stem sections were subjected to two stress modes: (1) torsional oscillation in the linear response region, and (2) unidirectional torsion to failure. Specimens were analyzed while submerged/saturated in ethylene glycol, simulating natural hydration and allowing experimental temperatures above 100 °C for an improved view of the lignin glass transition. Down-regulation of the 4-Coumarate:coenzyme A ligase gene (reduced lignin content and altered monomer composition) generally resulted in less stiff and weaker stems. These observations were associated with a reduction in the temperature and activation energy of the lignin glass transition, but surprisingly with no difference in the breadth and intensity of the tan  δ signal. The results showed promise in further investigations of how rheological methods relate to stem lignin content, composition, and functional properties in the field and in bioprocessing. Measurements such as these are complicated by small specimen size; however, torsional rheometers (relatively common in polymer laboratories) are well suited for this task. As opposed to the expense and complication of relative humidity control, solvent-submersion rheological methods effectively reveal fundamental structure/property relationships in plant tissues. Demonstrated are low-strain linear methods, and also nonlinear yield and failure analysis; the latter is very uncommon for typical rheological equipment.

  20. A Comparison of Result Reliability for Investigation of Milk Composition by Alternative Analytical Methods in Czech Republic

    Directory of Open Access Journals (Sweden)

    Oto Hanuš

    2014-01-01

    Full Text Available The reliability of milk analysis results is important for quality assurance in the foodstuff chain. There are several direct and indirect methods for measuring milk composition: fat (F), protein (P), lactose (L) and solids non fat (SNF) content. The goal was to evaluate some reference and routine milk analytical procedures on the basis of their results. The direct reference analyses were: F, fat content (Röse–Gottlieb method); P, crude protein content (Kjeldahl method); L, lactose (monohydrate, polarimetric method); SNF, solids non fat (gravimetric method). F, P, L and SNF were also determined by various indirect methods: – MIR (infrared (IR) technology with optical filters), 7 instruments in 4 labs; – MIR–FT (IR spectroscopy with Fourier transformation), 10 in 6; – ultrasonic method (UM), 3 in 1; – analysis by the blue and red box (BRB), 1 in 1. Ten reference milk samples were used. The coefficient of determination (R2), the correlation coefficient (r) and the standard deviation of the mean of individual differences (MDsd, for n) were evaluated. All correlations (r; for all indirect and alternative methods and all milk components) were significant (P ≤ 0.001). MIR and MIR–FT (conventional methods) explained a considerably higher proportion of the variability in the reference results than the UM and BRB methods (alternative). All r average values (x minus 1.64 × sd, for a 95% confidence interval) can be used as standards for calibration quality evaluation (MIR, MIR–FT, UM and BRB): – for F 0.997, 0.997, 0.99 and 0.995; – for P 0.986, 0.981, 0.828 and 0.864; – for L 0.968, 0.871, 0.705 and 0.761; – for SNF 0.992, 0.993, 0.911 and 0.872. Similarly MDsd (x plus 1.64 × sd): – for F 0.071, 0.068, 0.132 and 0.101%; – for P 0.051, 0.054, 0.202 and 0.14%; – for L 0.037, 0.074, 0.113 and 0.11%; – for SNF 0.052, 0.068, 0.141 and 0.204.

  1. Task-Related Edge Density (TED)-A New Method for Revealing Dynamic Network Formation in fMRI Data of the Human Brain.

    Science.gov (United States)

    Lohmann, Gabriele; Stelzer, Johannes; Zuber, Verena; Buschmann, Tilo; Margulies, Daniel; Bartels, Andreas; Scheffler, Klaus

    2016-01-01

    The formation of transient networks in response to external stimuli or as a reflection of internal cognitive processes is a hallmark of human brain function. However, its identification in fMRI data of the human brain is notoriously difficult. Here we propose a new method of fMRI data analysis that tackles this problem by considering large-scale, task-related synchronisation networks. Networks consist of nodes and edges connecting them, where nodes correspond to voxels in fMRI data, and the weight of an edge is determined via task-related changes in dynamic synchronisation between their respective time series. Based on these definitions, we developed a new data analysis algorithm that identifies edges that show differing levels of synchrony between two distinct task conditions and that occur in dense packs with similar characteristics. Hence, we call this approach "Task-related Edge Density" (TED). TED proved to be a very strong marker for dynamic network formation that easily lends itself to statistical analysis using large-scale statistical inference. A major advantage of TED compared to other methods is that it does not depend on any specific hemodynamic response model, and it also does not require a presegmentation of the data for dimensionality reduction, as it can handle large networks consisting of tens of thousands of voxels. We applied TED to fMRI data of a fingertapping task and an emotion processing task provided by the Human Connectome Project. TED revealed network-based involvement of a large number of brain areas that evaded detection using traditional GLM-based analysis. We show that our proposed method provides an entirely new window into the immense complexity of human brain function.
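    The core quantity behind TED, a task-related change in dynamic synchrony on a single edge, can be illustrated with a short NumPy sketch on synthetic time series. The window width, signal model, and variable names are assumptions of this sketch, and it omits the edge-density clustering and large-scale statistical inference steps of the actual method.

```python
# Minimal sketch of a TED-style edge weight: for one pair of voxel time
# series, compare sliding-window synchrony (correlation) between two task
# conditions. Data are synthetic; this is not the full TED algorithm.
import numpy as np

def windowed_corr(a, b, width=20):
    """Correlation of a and b in consecutive non-overlapping windows."""
    n = (len(a) // width) * width
    a, b = a[:n].reshape(-1, width), b[:n].reshape(-1, width)
    am, bm = a - a.mean(1, keepdims=True), b - b.mean(1, keepdims=True)
    return (am * bm).sum(1) / np.sqrt((am**2).sum(1) * (bm**2).sum(1))

rng = np.random.default_rng(1)
n = 400
shared = rng.normal(size=n)                      # common signal in condition A
vox1_A = shared + 0.5 * rng.normal(size=n)       # condition A: synchronised pair
vox2_A = shared + 0.5 * rng.normal(size=n)
vox1_B = rng.normal(size=n)                      # condition B: independent pair
vox2_B = rng.normal(size=n)

edge_weight = windowed_corr(vox1_A, vox2_A).mean() - \
              windowed_corr(vox1_B, vox2_B).mean()
print(edge_weight)                               # large positive => task-related edge
```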

  2. Task-Related Edge Density (TED-A New Method for Revealing Dynamic Network Formation in fMRI Data of the Human Brain.

    Directory of Open Access Journals (Sweden)

    Gabriele Lohmann

    Full Text Available The formation of transient networks in response to external stimuli or as a reflection of internal cognitive processes is a hallmark of human brain function. However, its identification in fMRI data of the human brain is notoriously difficult. Here we propose a new method of fMRI data analysis that tackles this problem by considering large-scale, task-related synchronisation networks. Networks consist of nodes and edges connecting them, where nodes correspond to voxels in fMRI data, and the weight of an edge is determined via task-related changes in dynamic synchronisation between their respective time series. Based on these definitions, we developed a new data analysis algorithm that identifies edges that show differing levels of synchrony between two distinct task conditions and that occur in dense packs with similar characteristics. Hence, we call this approach "Task-related Edge Density" (TED). TED proved to be a very strong marker for dynamic network formation that easily lends itself to statistical analysis using large-scale statistical inference. A major advantage of TED compared to other methods is that it does not depend on any specific hemodynamic response model, and it also does not require a presegmentation of the data for dimensionality reduction, as it can handle large networks consisting of tens of thousands of voxels. We applied TED to fMRI data of a fingertapping task and an emotion processing task provided by the Human Connectome Project. TED revealed network-based involvement of a large number of brain areas that evaded detection using traditional GLM-based analysis. We show that our proposed method provides an entirely new window into the immense complexity of human brain function.

  3. The Trojan Horse method for nuclear astrophysics: Recent results on resonance reactions

    Energy Technology Data Exchange (ETDEWEB)

    Cognata, M. La; Pizzone, R. G. [Laboratori Nazionali del Sud, Istituto Nazionale di Fisica Nucleare, Catania (Italy); Spitaleri, C.; Cherubini, S.; Romano, S. [Dipartimento di Fisica e Astronomia, Università di Catania, Catania, Italy and Laboratori Nazionali del Sud, Istituto Nazionale di Fisica Nucleare, Catania (Italy); Gulino, M.; Tumino, A. [Kore University, Enna, Italy and Laboratori Nazionali del Sud, Istituto Nazionale di Fisica Nucleare, Catania (Italy); Lamia, L. [Dipartimento di Fisica e Astronomia, Università di Catania, Catania (Italy)

    2014-05-09

    Nuclear astrophysics aims to measure nuclear-reaction cross sections of astrophysical interest to be included in models used to study stellar evolution and nucleosynthesis. Low energies, < 1 MeV or even < 10 keV, are required, since this is the window where these processes are most effective. Two effects have prevented a satisfactory knowledge of the relevant nuclear processes from being achieved, namely, the Coulomb barrier exponentially suppressing the cross section and the presence of atomic electrons. These difficulties have triggered theoretical and experimental investigations to extend our knowledge down to astrophysical energies. For instance, indirect techniques such as the Trojan Horse Method have been devised, yielding new cutting-edge results. In particular, I will focus on the application of this indirect method to resonance reactions. Resonances might dramatically enhance the astrophysical S(E)-factor, so when they occur right at astrophysical energies their measurement is crucial to pin down the astrophysical scenario. Unknown or unpredicted resonances might introduce large systematic errors in nucleosynthesis models. These considerations apply to low-energy resonances and to sub-threshold resonances as well, as they may produce sizable modifications of the S-factor due to, for instance, destructive interference with another resonance.

  4. The Trojan Horse method for nuclear astrophysics: Recent results on resonance reactions

    International Nuclear Information System (INIS)

    Cognata, M. La; Pizzone, R. G.; Spitaleri, C.; Cherubini, S.; Romano, S.; Gulino, M.; Tumino, A.; Lamia, L.

    2014-01-01

    Nuclear astrophysics aims to measure nuclear-reaction cross sections of astrophysical interest to be included in models used to study stellar evolution and nucleosynthesis. Low energies, < 1 MeV or even < 10 keV, are required, since this is the window where these processes are most effective. Two effects have prevented a satisfactory knowledge of the relevant nuclear processes from being achieved, namely, the Coulomb barrier exponentially suppressing the cross section and the presence of atomic electrons. These difficulties have triggered theoretical and experimental investigations to extend our knowledge down to astrophysical energies. For instance, indirect techniques such as the Trojan Horse Method have been devised, yielding new cutting-edge results. In particular, I will focus on the application of this indirect method to resonance reactions. Resonances might dramatically enhance the astrophysical S(E)-factor, so when they occur right at astrophysical energies their measurement is crucial to pin down the astrophysical scenario. Unknown or unpredicted resonances might introduce large systematic errors in nucleosynthesis models. These considerations apply to low-energy resonances and to sub-threshold resonances as well, as they may produce sizable modifications of the S-factor due to, for instance, destructive interference with another resonance.

  5. Qualitative Analysis Results for Applications of a New Fire Probabilistic Safety Assessment Method to Ulchin Unit 3

    International Nuclear Information System (INIS)

    Kang, Daeil; Kim, Kilyoo; Jang, Seungcheol

    2013-01-01

    The Fire PRA Implementation Guide has been used for performing fire PSAs for NPPs in Korea. Recently, the US NRC and EPRI developed a new fire PSA method, NUREG/CR-6850, to provide state-of-the-art methods, tools, and data for the conduct of a fire PSA for a commercial nuclear power plant (NPP). Due to the limited budget and manpower for the development of the KSRP, hybrid PSA approaches, using NUREG/CR-6850 and the Fire PRA Implementation Guide, will be employed for conducting a fire PSA of Ulchin Unit 3. In this paper, the qualitative analysis results for the application of the new fire PSA method to Ulchin Unit 3 are presented. Compared with the previous industry analysis, the number of fire areas identified for quantification and the amount of equipment selected have increased.

  6. Rapid Presentation of Emotional Expressions Reveals New Emotional Impairments in Tourette’s Syndrome

    Directory of Open Access Journals (Sweden)

    Martial eMermillod

    2013-04-01

    Full Text Available Objective: Based on a variety of empirical evidence obtained within the theoretical framework of embodiment theory, we considered it likely that motor disorders in Tourette's syndrome (TS) would have emotional consequences for TS patients. However, previous research using emotional facial categorization tasks suggests that these consequences are limited to TS patients with obsessive-compulsive behaviors (OCB). Method: These studies used long stimulus presentations, which allowed the participants to categorize the different emotional facial expressions (EFEs) on the basis of a perceptual analysis that might potentially hide a lack of emotional feeling for certain emotions. In order to reduce this perceptual bias, we used a rapid visual presentation procedure. Results: Using this new experimental method, we revealed different and surprising impairments for several EFEs in TS patients compared with matched healthy control participants. Moreover, a spatial frequency analysis of the visual signal processed by the patients suggests that these impairments may be located at a cortical level. Conclusions: The current study indicates that the rapid visual presentation paradigm makes it possible to identify various potential emotional disorders that were not revealed by the standard visual presentation procedures previously reported in the literature. Moreover, the spatial frequency analysis performed in our study suggests that the emotional deficit in TS might lie at the level of temporal cortical areas dedicated to the processing of HSF visual information.

  7. Concurrent growth rate and transcript analyses reveal essential gene stringency in Escherichia coli.

    Directory of Open Access Journals (Sweden)

    Shan Goh

    Full Text Available BACKGROUND: Genes essential for bacterial growth are of particular scientific interest. Many putative essential genes have been identified or predicted in several species; however, little is known about gene expression requirement stringency, which may be an important aspect of bacterial physiology and likely a determining factor in drug target development. METHODOLOGY/PRINCIPAL FINDINGS: Working from the premise that essential genes differ in absolute requirement for growth, we describe silencing of putative essential genes in E. coli to obtain a titration of declining growth rates and transcript levels by using antisense peptide nucleic acids (PNA) and expressed antisense RNA. The relationship between mRNA decline and growth rate decline reflects the degree of essentiality, or stringency, of an essential gene, which is here defined by the minimum transcript level for a 50% reduction in growth rate (MTL50). When applied to four growth essential genes, both RNA silencing methods resulted in MTL50 values that reveal acpP as the most stringently required of the four genes examined, with ftsZ the next most stringently required. The established antibacterial targets murA and fabI were less stringently required. CONCLUSIONS: RNA silencing can reveal stringent requirements for gene expression with respect to growth. This method may be used to validate existing essential genes and to quantify drug target requirement.
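    The MTL50 quantity can be illustrated with a few lines of NumPy: given a titration of relative transcript level against relative growth rate, interpolate the transcript level at which growth is reduced by half. The numbers below are invented purely for illustration and are not data from the study.

```python
# Minimal sketch of the MTL50 idea: interpolate the transcript level at which
# the growth rate drops to 50% of normal, from a silencing titration.
import numpy as np

transcript = np.array([1.00, 0.70, 0.45, 0.30, 0.15, 0.05])  # fraction of normal mRNA
growth     = np.array([1.00, 0.95, 0.85, 0.65, 0.40, 0.10])  # fraction of normal growth

# np.interp needs increasing x, so interpolate growth -> transcript
order = np.argsort(growth)
mtl50 = np.interp(0.5, growth[order], transcript[order])
print(f"MTL50 ~ {mtl50:.2f} of the normal transcript level")
```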

  8. Algorithms for monitoring warfarin use: Results from Delphi Method.

    Science.gov (United States)

    Kano, Eunice Kazue; Borges, Jessica Bassani; Scomparini, Erika Burim; Curi, Ana Paula; Ribeiro, Eliane

    2017-10-01

    Warfarin stands as the most prescribed oral anticoagulant. New oral anticoagulants have been approved recently; however, their use is limited and the reversibility techniques of the anticoagulation effect are little known. Thus, our study's purpose was to develop algorithms for therapeutic monitoring of patients taking warfarin based on the opinion of physicians who prescribe this medicine in their clinical practice. The development of the algorithm was performed in two stages, namely: (i) literature review and (ii) algorithm evaluation by physicians using a Delphi Method. Based on the articles analyzed, two algorithms were developed: "Recommendations for the use of warfarin in anticoagulation therapy" and "Recommendations for the use of warfarin in anticoagulation therapy: dose adjustment and bleeding control." Later, these algorithms were analyzed by 19 medical doctors that responded to the invitation and agreed to participate in the study. Of these, 16 responded to the first round, 11 to the second and eight to the third round. A 70% consensus or higher was reached for most issues and at least 50% for six questions. We were able to develop algorithms to monitor the use of warfarin by physicians using a Delphi Method. The proposed method is inexpensive and involves the participation of specialists, and it has proved adequate for the intended purpose. Further studies are needed to validate these algorithms, enabling them to be used in clinical practice.

  9. Variation in Results of Volume Measurements of Stumps of Lower-Limb Amputees : A Comparison of 4 Methods

    NARCIS (Netherlands)

    de Boer-Wilzing, Vera G.; Bolt, Arjen; Geertzen, Jan H.; Emmelot, Cornelis H.; Baars, Erwin C.; Dijkstra, Pieter U.

    de Boer-Wilzing VG, Bolt A, Geertzen JH, Emmelot CH, Baars EC, Dijkstra PU. Variation in results of volume measurements of stumps of lower-limb amputees: a comparison of 4 methods. Arch Phys Med Rehabil 2011;92:941-6. Objective: To analyze the reliability of 4 methods (water immersion,

  10. Long-term results of forearm lengthening and deformity correction by the Ilizarov method.

    Science.gov (United States)

    Orzechowski, Wiktor; Morasiewicz, Leszek; Krawczyk, Artur; Dragan, Szymon; Czapiński, Jacek

    2002-06-30

    Background. Shortening and deformity of the forearm is most frequently caused by congenital disorders or posttraumatic injury. Given its complex anatomy and biomechanics, the forearm is clearly the most difficult segment for lengthening and deformity correction. Material and methods. We analyzed 16 patients with shortening and deformity of the forearm, treated surgically using the Ilizarov method in our Department from 1989 to 2001. In 9 cases 1-stage surgery was sufficient, while the remaining 7 patients underwent 2-5 stages of treatment. A total of 31 surgical operations were performed. The extent of forearm shortening ranged from 1.5 to 14.5 cm (5-70%). We developed a new fixator based on Schanz half-pins. Results. The length of forearm lengthening per operative stage averaged 2.35 cm. The proportion of lengthening ranged from 6% to 48%, with an average of 18.3%. The mean lengthening index was 48.15 days/cm. The per-patient rate of complications was 88%, compared with 45% per stage of treatment, mostly limited rotational mobility and abnormal consolidation of the regenerated bone. Conclusions. Despite the high complication rate, the Ilizarov method is the method of choice for patients with forearm shortenings and deformities. Treatment is particularly indicated in patients with shortening caused by disproportionate length of the ulnar and forearm bones. Treatment should be managed so as to cause the least possible damage to arm function, even at the cost of limited lengthening. Our new stabilizer based on Schanz half-pins makes it possible to preserve forearm rotation.

  11. Labelling of blood cells with radioactive indium-201: method, results, indications

    International Nuclear Information System (INIS)

    Ducassou, D.; Brendel, A.; Nouel, J.P.

    1978-01-01

    A modification of the method of Thakur et al. for labelling polynuclear cells with the 8-hydroxyquinoline-indium complex, utilising the water-soluble sulfate of the substance, was applied. The labelling procedure gave a yield over 98% with erythrocytes and over 80% with platelets and polynuclear cells, using at least 1 × 10^8 plasma-free cells. The functional capacity of the labelled cells remained unaltered. After injection of double-labelled (111In, 51Cr) red cells, the correlation of values for the red cell volume amounted to r = 0.98 (n = 20); red cell life-span measurements gave comparable results in 5 patients. After injecting labelled platelets, a life-span between 6.5 and 11 days was measured. Scintigraphic visualisation of pulmonary embolism was obtained 30 minutes after injecting labelled platelets. Injection of labelled polynuclear cells allows life-span measurements as well as detection of abscesses. (author)

  12. Three magnetic particles solid phase radioimmunoassay for T4: Comparison of their results with established methods

    International Nuclear Information System (INIS)

    Bashir, T.

    1996-01-01

    The introduction of solid phase separation techniques is an important improvement in radioimmunoassays and immunoradiometric assays. The magnetic particle solid phase method has additional advantages over others, as the separation is rapid and centrifugation is not required. Three types of magnetic particles have been studied in the T4 RIA, and the results have been compared with commercial kits and other established methods. (author). 4 refs, 9 figs, 2 tabs

  13. Live cell imaging reveals marked variability in myoblast proliferation and fate

    Science.gov (United States)

    2013-01-01

    Background During the process of muscle regeneration, activated stem cells termed satellite cells proliferate, and then differentiate to form new myofibers that restore the injured area. Yet not all satellite cells contribute to muscle repair. Some continue to proliferate, others die, and others become quiescent and are available for regeneration following subsequent injury. The mechanisms that regulate the adoption of different cell fates in a muscle cell precursor population remain unclear. Methods We have used live cell imaging and lineage tracing to study cell fate in the C2 myoblast line. Results Analyzing the behavior of individual myoblasts revealed marked variability in both cell cycle duration and viability, but similarities between cells derived from the same parental lineage. As a consequence, lineage sizes and outcomes differed dramatically, and individual lineages made uneven contributions toward the terminally differentiated population. Thus, the cohort of myoblasts undergoing differentiation at the end of an experiment differed dramatically from the lineages present at the beginning. Treatment with IGF-I increased myoblast number by maintaining viability and by stimulating a fraction of cells to complete one additional cell cycle in differentiation medium, and as a consequence reduced the variability of the terminal population compared with controls. Conclusion Our results reveal that heterogeneity of responses to external cues is an intrinsic property of cultured myoblasts that may be explained in part by parental lineage, and demonstrate the power of live cell imaging for understanding how muscle differentiation is regulated. PMID:23638706

  14. A Novel Method Describing the Space Charge Limited Region in a Planar Diode

    Directory of Open Access Journals (Sweden)

    Mitra Ghergherehchi

    2017-11-01

    Full Text Available A novel and rather simple method is presented to describe the physics of the space-charge region in a planar diode. The method treats the problem in the time domain, and as a consequence the transient time behavior can be obtained. Potential distributions and currents obtained using this technique, assuming zero initial velocity for the electrons, are in complete agreement with Child's results. Moreover, applying the method for a non-zero uniform initial electron velocity gives results which are in good agreement with previous works.
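    As a reference point for the zero-initial-velocity case mentioned above, the steady-state space-charge-limited current density in a planar diode is given by the Child-Langmuir law, which the short Python snippet below evaluates; the example voltage and gap are arbitrary values chosen only for illustration.

```python
# Reference check: the steady-state Child-Langmuir current density that the
# zero-initial-velocity transient solution should converge to,
#   J = (4 * eps0 / 9) * sqrt(2 e / m_e) * V**1.5 / d**2.
import math

eps0, e, m_e = 8.8541878128e-12, 1.602176634e-19, 9.1093837015e-31

def child_langmuir(V, d):
    """Space-charge-limited current density (A/m^2) for gap d (m) and voltage V (V)."""
    return (4.0 * eps0 / 9.0) * math.sqrt(2.0 * e / m_e) * V**1.5 / d**2

print(child_langmuir(V=1000.0, d=0.01))   # ~7.4e2 A/m^2 for 1 kV across 1 cm
```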

  15. Vehicle Speed Determination in Case of Road Accident by Software Method and Comparing of Results with the Mathematical Model

    OpenAIRE

    Hoxha Gezim; Shala Ahmet; Likaj Rame

    2017-01-01

    The paper addresses the problem of vehicle speed calculation at road accidents. The PC Crash and Virtual Crash software packages are used to determine the speed. Concrete cases of road accidents are analysed with both methods. The calculation methods and a comparison of the results are presented for analysis. These methods consider several factors, such as: the front part of the vehicle, the technical features of the vehicle, the car angle, the displacement after the crash, road conditions, etc. Expected results with PC Cr...

  16. Comparison of biosurfactant detection methods reveals hydrophobic surfactants and contact-regulated production

    Science.gov (United States)

    Biosurfactants are diverse molecules with numerous biological functions and industrial applications. A variety of environments were examined for biosurfactant-producing bacteria using a versatile new screening method. The utility of an atomized oil assay was assessed for a large number of bacteria...

  17. Short overview of PSA quantification methods, pitfalls on the road from approximate to exact results

    International Nuclear Information System (INIS)

    Banov, Reni; Simic, Zdenko; Sterc, Davor

    2014-01-01

    Over time, Probabilistic Safety Assessment (PSA) models have become an invaluable companion in the identification and understanding of key nuclear power plant (NPP) vulnerabilities. PSA is an effective tool for this purpose, as it assists plant management in targeting resources where the largest benefit for plant safety can be obtained. PSA has quickly become an established technique to numerically quantify risk measures in nuclear power plants. As the complexity of PSA models increases, the feasibility of the different computational approaches changes. The various computational approaches can basically be classified into two major groups: approximate and exact (BDD-based) methods. Recently, modern commercially available PSA tools have started to provide both methods for PSA model quantification. Even though both methods are available in proven PSA tools, they must still be used carefully, since there are many pitfalls which can lead to wrong conclusions and prevent efficient use of the PSA tool. For example, typical pitfalls involve using a higher-precision approximation method and getting a less precise result, or mixing minimal cut sets and prime implicants in the exact computation method. The exact methods are sensitive to the selected computational paths, in which case a simple human-assisted rearrangement may help and even switch the computation from non-feasible to feasible. Further improvements to the exact methods are possible and desirable, which opens space for new research. In this paper we will show how these pitfalls may be detected and how carefully actions must be taken, especially when working with large PSA models. (authors)
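    A toy example makes the approximate-versus-exact distinction concrete: for a top event with two minimal cut sets sharing a basic event, both the rare-event approximation and the min-cut upper bound overestimate the exact (BDD-style) probability. The sketch below uses invented basic-event probabilities purely for illustration.

```python
# Minimal sketch of approximate vs exact quantification for the toy top event
#   TOP = (A and B) or (A and C)
# with independent basic events A, B, C.
pA, pB, pC = 0.1, 0.2, 0.3

p1, p2 = pA * pB, pA * pC                    # minimal cut set probabilities
rare_event = p1 + p2                         # first-order (rare-event) approximation
mcub       = 1 - (1 - p1) * (1 - p2)         # min-cut upper bound
exact      = pA * (1 - (1 - pB) * (1 - pC))  # P(A and (B or C)), as an exact/BDD method gives

print(rare_event, mcub, exact)               # 0.05, 0.0494, 0.044
```

    Both approximations are conservative here because the two cut sets share basic event A; on large models with many shared events the gap, and hence the risk of misleading importance measures, can grow.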

  18. Social network extraction based on Web: 1. Related superficial methods

    Science.gov (United States)

    Khairuddin Matyuso Nasution, Mahyuddin

    2018-01-01

    Often the nature of something affects the methods used to resolve issues related to it. The same holds for methods that extract social networks from the Web, which involve the structured data types differently. This paper reveals several methods of social network extraction from the same source, namely the Web: the basic superficial method, the underlying superficial method, the description superficial method, and the related superficial methods. We derive inequalities between the methods in terms of their complexity, and hence of their computations. In this case, we find that different results from the same tools mark the difference between the more complex and the simpler approaches: extraction of a social network based on co-occurrences is more complex than extraction based on occurrences.

  19. Wide Binaries in TGAS: Search Method and First Results

    Science.gov (United States)

    Andrews, Jeff J.; Chanamé, Julio; Agüeros, Marcel A.

    2018-04-01

    Half of all stars reside in binary systems, many of which have orbital separations in excess of 1000 AU. Such binaries are typically identified in astrometric catalogs by matching the proper motion vectors of close stellar pairs. We present a fully Bayesian method that properly takes into account positions, proper motions, parallaxes, and their correlated uncertainties to identify widely separated stellar binaries. After applying our method to the >2 × 10^6 stars in the Tycho-Gaia astrometric solution from Gaia DR1, we identify over 6000 candidate wide binaries. For those pairs with separations less than 40,000 AU, we determine the contamination rate to be ~5%. This sample has an orbital separation (a) distribution that is roughly flat in log space for separations less than ~5000 AU and follows an a^(-1.6) power law at larger separations.
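    For contrast with the fully Bayesian treatment described above, a common frequentist shortcut for screening co-moving pairs is a chi-square statistic on the proper-motion difference; the sketch below, with made-up measurements, shows that simpler criterion and is not the authors' method.

```python
# Minimal sketch of a common (non-Bayesian) screening criterion for candidate
# wide binaries: chi-square on the proper-motion difference of a pair, using
# the quoted uncertainties. A stand-in for the fully Bayesian treatment above.
import numpy as np

def pm_chi2(pm1, err1, pm2, err2):
    """pm*: (pmra, pmdec) in mas/yr; err*: their 1-sigma uncertainties."""
    diff = np.asarray(pm1) - np.asarray(pm2)
    var = np.asarray(err1) ** 2 + np.asarray(err2) ** 2
    return float((diff ** 2 / var).sum())        # ~chi2 with 2 dof if truly co-moving

# Example pair (invented numbers): a small chi2 suggests consistent proper motions
print(pm_chi2((15.2, -7.1), (0.8, 0.9), (14.7, -6.5), (1.0, 0.7)))
```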

  20. A Literature Study of Matrix Element Influenced to the Result of Analysis Using Absorption Atomic Spectroscopy Method (AAS)

    International Nuclear Information System (INIS)

    Tyas-Djuhariningrum

    2004-01-01

    Gold sample analyses can deviate by more than 10% from the true value because of matrix elements, so the behaviour of the matrix elements needs to be studied in order to reduce this deviation. In rock samples, matrix elements can cause self-quenching, self-absorption and ionization processes, leading to errors in the analytical results. In geochemical processes in rocks, elements of the same group of the periodic system tend to occur together because of their similar characteristics. In Atomic Absorption Spectroscopy analysis, these associated elements can absorb the primary radiation at similar wavelengths, which can cause deviations in the interpretation of the results. The aim of this study is to assess the influence of matrix elements in rock samples and to apply standard methods for reducing the deviation. Quantitatively, the absorbed primary light intensity is proportional to the concentration of the atoms in the sample, so that the relationship between photon intensity and concentration in parts per million (ppm) is linear. Three methods are available for eliminating the influence of matrix elements: the external standard method, the internal standard method, and the standard addition method. The external standard method applies to all matrix elements, the internal standard method eliminates matrix elements with similar characteristics, and the standard addition method eliminates matrix elements in Au and Pt samples. These three standard methods achieve accuracies of about 95-97%. (author)
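    The standard addition method mentioned above has a simple numerical core: fit the measured absorbance against the added analyte concentration and read the unknown concentration off the magnitude of the x-intercept. The NumPy sketch below uses invented calibration points purely for illustration.

```python
# Minimal sketch of the standard-addition calculation: linear fit of
# absorbance vs. spiked concentration, then extrapolate to the x-intercept.
import numpy as np

added = np.array([0.0, 1.0, 2.0, 3.0])          # spike concentration (ppm)
absorbance = np.array([0.120, 0.200, 0.278, 0.362])

slope, intercept = np.polyfit(added, absorbance, 1)
c_unknown = intercept / slope                   # |x-intercept| of the fitted line
print(f"estimated sample concentration ~ {c_unknown:.2f} ppm")
```

    Because the calibration is performed in the sample's own matrix, the matrix influence affects the standards and the unknown equally, which is why the extrapolation compensates for it.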

  1. Cobalt Coordination and Clustering in α-Co(OH)2 Revealed by Synchrotron X-ray Total Scattering

    International Nuclear Information System (INIS)

    Neilson, James R.; Kurzman, Joshua A.; Seshadri, Ram; Morse, Daniel E.

    2010-01-01

    Structures of layered metal hydroxides are not well described by traditional crystallography. Total scattering from a synthesis-controlled subset of these materials, as described here, reveals that different cobalt coordination polyhedra cluster within each layer on short length scales, offering new insights and approaches for understanding the properties of these and related layered materials. Structures related to that of brucite (Mg(OH) 2 ) are ubiquitous in the mineral world and offer a variety of useful functions ranging from catalysis and ion-exchange to sequestration and energy transduction, including applications in batteries. However, it has been difficult to resolve the atomic structure of these layered compounds because interlayer disorder disrupts the long-range periodicity necessary for diffraction-based structure determination. For this reason, traditional unit-cell-based descriptions have remained inaccurate. Here we apply, for the first time to such layered hydroxides, synchrotron X-ray total scattering methods - analyzing both the Bragg and diffuse components - to resolve the intralayer structure of three different α-cobalt hydroxides, revealing the nature and distribution of metal site coordination. The different compounds with incorporated chloride ions have been prepared with kinetic control of hydrolysis to yield different ratios of octahedrally and tetrahedrally coordinated cobalt ions within the layers, as confirmed by total scattering. Real-space analyses indicate local clustering of polyhedra within the layers, manifested in the weighted average of different ordered phases with fixed fractions of tetrahedrally coordinated cobalt sites. These results, hidden from an averaged unit-cell description, reveal new structural characteristics that are essential to understanding the origin of fundamental material properties such as color, anion exchange capacity, and magnetic behavior. Our results also provide further insights into the detailed

  2. Cobalt coordination and clustering in alpha-Co(OH)(2) revealed by synchrotron X-ray total scattering.

    Science.gov (United States)

    Neilson, James R; Kurzman, Joshua A; Seshadri, Ram; Morse, Daniel E

    2010-09-03

    Structures of layered metal hydroxides are not well described by traditional crystallography. Total scattering from a synthesis-controlled subset of these materials, as described here, reveals that different cobalt coordination polyhedra cluster within each layer on short length scales, offering new insights and approaches for understanding the properties of these and related layered materials. Structures related to that of brucite [Mg(OH)(2)] are ubiquitous in the mineral world and offer a variety of useful functions ranging from catalysis and ion-exchange to sequestration and energy transduction, including applications in batteries. However, it has been difficult to resolve the atomic structure of these layered compounds because interlayer disorder disrupts the long-range periodicity necessary for diffraction-based structure determination. For this reason, traditional unit-cell-based descriptions have remained inaccurate. Here we apply, for the first time to such layered hydroxides, synchrotron X-ray total scattering methods-analyzing both the Bragg and diffuse components-to resolve the intralayer structure of three different alpha-cobalt hydroxides, revealing the nature and distribution of metal site coordination. The different compounds with incorporated chloride ions have been prepared with kinetic control of hydrolysis to yield different ratios of octahedrally and tetrahedrally coordinated cobalt ions within the layers, as confirmed by total scattering. Real-space analyses indicate local clustering of polyhedra within the layers, manifested in the weighted average of different ordered phases with fixed fractions of tetrahedrally coordinated cobalt sites. These results, hidden from an averaged unit-cell description, reveal new structural characteristics that are essential to understanding the origin of fundamental material properties such as color, anion exchange capacity, and magnetic behavior. Our results also provide further insights into the detailed

  3. The use of principal component, discriminate and rough sets analysis methods of radiological data

    International Nuclear Information System (INIS)

    Seddeek, M.K.; Kozae, A.M.; Sharshar, T.; Badran, H.M.

    2006-01-01

    In this work, computational methods for finding clusters of multivariate data points were explored using principal component analysis (PCA), discriminant analysis (DA) and rough set analysis (RSA) methods. The variables were the concentrations of four natural isotopes and the texture characteristics of 100 sand samples from the coast of North Sinai, Egypt. Beach and dune sands are the two types of samples included. These methods were used to reduce the dimensionality of multivariate data and as classification and clustering methods. The results showed that the classification of sands in the environment of North Sinai depends upon the radioactivity content of the naturally occurring radioactive materials and not upon the characteristics of the sand. The application of DA enables the creation of a classification rule for sand type, and it revealed that samples with highly negative values of the first score have the highest contamination with black sand. PCA revealed that the radioactivity concentrations alone can be considered to predict the classification of other samples. The results of RSA showed that only one of the concentrations of 238U, 226Ra and 232Th, together with the 40K content, can characterize the clusters together with the characteristics of the sand. Both PCA and RSA lead to the following conclusion: 238U, 226Ra and 232Th behave similarly. RSA revealed that one or two of them may not be considered without affecting the body of knowledge.

  4. The method of lines solution of discrete ordinates method for non-grey media

    International Nuclear Information System (INIS)

    Cayan, Fatma Nihan; Selcuk, Nevin

    2007-01-01

    A radiation code based on method of lines (MOL) solution of discrete ordinates method (DOM) for radiative heat transfer in non-grey absorbing-emitting media was developed by incorporation of a gas spectral radiative property model, namely wide band correlated-k (WBCK) model, which is compatible with MOL solution of DOM. Predictive accuracy of the code was evaluated by applying it to 1-D parallel plate and 2-D axisymmetric cylindrical enclosure problems containing absorbing-emitting medium and benchmarking its predictions against line-by-line solutions available in the literature. Comparisons reveal that MOL solution of DOM with WBCK model produces accurate results for radiative heat fluxes and source terms and can be used with confidence in conjunction with computational fluid dynamics codes based on the same approach
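
    To make the method-of-lines idea concrete, here is a minimal, generic sketch (not the authors' radiation code): a 1-D transport-like toy equation is discretized in space, and the resulting ODE system is handed to a stiff integrator. The equation, grid and parameter values are illustrative stand-ins for the DOM/WBCK model.

    ```python
    # Generic method-of-lines sketch: discretize space, integrate the resulting
    # ODE system with a standard stiff solver. The toy advection-relaxation
    # equation is only a stand-in for the radiative transfer equations.
    import numpy as np
    from scipy.integrate import solve_ivp

    nx = 100
    x = np.linspace(0.0, 1.0, nx)
    dx = x[1] - x[0]
    source = np.ones(nx)                  # hypothetical emission term

    def rhs(t, u):
        dudx = np.empty_like(u)
        dudx[1:] = (u[1:] - u[:-1]) / dx  # first-order upwind difference
        dudx[0] = 0.0                     # fixed inflow boundary value
        return -dudx + (source - u)       # advection plus relaxation toward source

    sol = solve_ivp(rhs, (0.0, 2.0), np.zeros(nx), method="BDF")
    print(sol.y[:, -1].max())             # near-steady profile maximum
    ```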

  5. The Vermont oxford neonatal encephalopathy registry: rationale, methods, and initial results

    Science.gov (United States)

    2012-01-01

    Background In 2006, the Vermont Oxford Network (VON) established the Neonatal Encephalopathy Registry (NER) to characterize infants born with neonatal encephalopathy, describe evaluations and medical treatments, monitor hypothermic therapy (HT) dissemination, define clinical research questions, and identify opportunities for improved care. Methods Eligible infants were ≥ 36 weeks' gestation with seizures, altered consciousness (stupor, coma) during the first 72 hours of life, a 5 minute Apgar score of ≤ 3, or receiving HT. Infants with central nervous system birth defects were excluded. Results From 2006–2010, 95 centers registered 4232 infants. Of those, 59% suffered a seizure, 50% had a 5 minute Apgar score of ≤ 3, 38% received HT, and 18% had stupor/coma documented on neurologic exam. Some infants met more than one eligibility criterion. Only 53% had a cord gas obtained and only 63% had a blood gas obtained within 24 hours of birth, important components for determining HT eligibility. Sixty-four percent received ventilator support, 65% received anticonvulsants, 66% had a head MRI, 23% had a cranial CT, 67% had a full-channel electroencephalogram (EEG) and 33% an amplitude-integrated EEG. Of all infants, 87% survived. Conclusions The VON NER describes the heterogeneous population of infants with NE, the subset that received HT, their patterns of care, and outcomes. The optimal routine care of infants with neonatal encephalopathy is unknown. The registry method is well suited to identify opportunities for improvement in the care of infants affected by NE and study interventions such as HT as they are implemented in clinical practice. PMID:22726296

  6. Gravimetric method for in vitro calibration of skin hydration measurements.

    Science.gov (United States)

    Martinsen, Ørjan G; Grimnes, Sverre; Nilsen, Jon K; Tronstad, Christian; Jang, Wooyoung; Kim, Hongsig; Shin, Kunsoo; Naderi, Majid; Thielmann, Frank

    2008-02-01

    A novel method for in vitro calibration of skin hydration measurements is presented. The method combines gravimetric and electrical measurements and reveals an exponential dependence of measured electrical susceptance on absolute water content in the epidermal stratum corneum. The results also show that absorption of water into the stratum corneum exhibits three different phases with significant differences in absorption time constant. These phases probably correspond to bound, loosely bound, and bulk water.
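
    The exponential dependence reported here is the kind of relation that can be recovered by a least-squares fit of paired gravimetric and electrical data. The sketch below is a generic illustration with synthetic numbers, not the authors' calibration; the model form and parameter values are assumptions.

    ```python
    # Fit B = a * exp(b * w) to paired water-content (w) and susceptance (B)
    # measurements. Data and parameters are synthetic placeholders.
    import numpy as np
    from scipy.optimize import curve_fit

    def model(w, a, b):
        return a * np.exp(b * w)

    w = np.linspace(0.05, 0.5, 10)   # hypothetical water content (g/g)
    B = model(w, 2.0e-6, 8.0) * (1 + 0.05 * np.random.default_rng(1).normal(size=w.size))

    (a_hat, b_hat), _ = curve_fit(model, w, B, p0=(1e-6, 5.0))
    print(f"fitted a={a_hat:.3e}, b={b_hat:.3f}")
    ```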

  7. The implicit effect of texturizing field on the elastic properties of magnetic elastomers revealed by SANS

    Energy Technology Data Exchange (ETDEWEB)

    Balasoiu, M., E-mail: balas@jinr.ru [Joint Institute of Nuclear Research, Dubna (Russian Federation); Horia Hulubei National Institute for Physics and Nuclear Engineering, Bucharest (Romania); Lebedev, V.T. [St.Petersburg Nuclear Physics Institute NRC KI, Gatchina (Russian Federation); Raikher, Yu.L. [Institute of Continuous Media Mechanics, Russian Academy of Sciences, Ural Branch, Perm (Russian Federation); Bica, I.; Bunoiu, M. [West University of Timisoara, Department of Physics (Romania)

    2017-06-01

    The small-angle neutron scattering (SANS) method is used to characterize the structure properties of the polymer matrix of magnetic elastomers (MEs) of the same material content but with different magnetic textures. For that, a series of silicone-rubber elastomers mixed with a ferrofluid and polymerized with/without an external magnetic field were studied. In the specimens of pure rubber and the ME samples synthesized without field, SANS reveals a substantial number of large polymer coils (blobs) which are vertically prolate. The case of MEs polymerized under the magnetic field, which is also vertically directed, is different. SANS data indicate that there the blobs are preferentially elongated in the direction normal to the field. - Highlights: • The SANS method is used to determine the structure of SR elastomers polymerized with ferrofluid with/without an external magnetic field. • In the rubber and ME samples synthesized without field, SANS reveals a substantial number of vertically prolate blobs. • For MEs polymerized in a vertical magnetic field, the blobs turn out to be elongated in the direction normal to the field. • Isotropic and texturized MEs differ by the filler structure and by intrinsic elastic properties of the matrix as well.

  8. Propulsion and launching analysis of variable-mass rockets by analytical methods

    Directory of Open Access Journals (Sweden)

    D.D. Ganji

    2013-09-01

    Full Text Available In this study, applications of some analytical methods to the nonlinear equation of the launching of a rocket with variable mass are investigated. The differential transformation method (DTM), homotopy perturbation method (HPM) and least square method (LSM) were applied, and their results are compared with the numerical solution. Excellent agreement between the analytical methods and the numerical one is observed in the results, which reveals that the analytical methods are effective and convenient. Also a parametric study is performed which includes the effect of exhaust velocity (Ce), burn rate (BR) of fuel and diameter of the cylindrical rocket (d) on the motion of a sample rocket, and contours showing the sensitivity of these parameters are plotted. The main results indicate that the rocket velocity and altitude are increased with increasing Ce and BR and decreased with increasing rocket diameter and drag coefficient.
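
    For readers who want a numerical baseline of the kind such analytical solutions are typically checked against, the following sketch integrates a simple variable-mass rocket model. The equation form, drag law and all parameter values are illustrative assumptions, not the exact model of the paper.

    ```python
    # Numerical reference for a variable-mass rocket:
    # m(t) dv/dt = Ce*BR - m(t)*g - D(v), with m(t) = m0 - BR*t while fuel remains.
    # All parameter values below are hypothetical.
    import numpy as np
    from scipy.integrate import solve_ivp

    Ce, BR = 2000.0, 5.0           # exhaust velocity (m/s), burn rate (kg/s) - assumed
    m0, m_dry, g = 500.0, 200.0, 9.81
    rho, Cd, d = 1.2, 0.5, 0.3     # air density, drag coefficient, rocket diameter
    A = np.pi * d**2 / 4

    def rhs(t, y):
        v, h = y
        m = max(m0 - BR * t, m_dry)
        thrust = Ce * BR if m > m_dry else 0.0
        drag = 0.5 * rho * Cd * A * v * abs(v)
        return [(thrust - drag) / m - g, v]

    sol = solve_ivp(rhs, (0.0, 60.0), [0.0, 0.0], max_step=0.1)
    print(f"velocity at t=60 s: {sol.y[0, -1]:.1f} m/s, altitude: {sol.y[1, -1]:.0f} m")
    ```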

  9. Project Deep Drilling KLX02 - Phase 2. Methods, scope of activities and results. Summary report

    International Nuclear Information System (INIS)

    Ekman, L.

    2001-04-01

    Geoscientific investigations performed by SKB, including those at the Aespoe Hard Rock Laboratory, have so far comprised the bedrock horizon down to about 1000 m. The primary purposes of the c. 1700 m deep, φ76 mm, sub-vertical core borehole KLX02, drilled during the autumn of 1992 at Laxemar, Oskarshamn, were to test the core drilling technique at large depths and with a relatively large diameter and to enable geoscientific investigations beyond 1000 m. Drilling of borehole KLX02 was completed very successfully. Results of the drilling commission and the borehole investigations conducted in conjunction with drilling have been reported earlier. The present report provides a summary of the investigations made during a five-year period after completion of drilling. Results as well as methods applied are described. A variety of geoscientific investigations to depths exceeding 1600 m were successfully performed. However, the investigations were not entirely problem-free. For example, borehole equipment got stuck in the borehole on several occasions. Special investigations, among them a fracture study, were initiated in order to reveal the mechanisms behind this problem. Different explanations seem possible, e.g. breakouts from the borehole wall, which may be a specific problem related to the stress situation in deep boreholes. The investigation approach for borehole KLX02 followed, in general outline, the SKB model for site investigations, where a number of key issues for site characterization are studied. For each of those, a number of geoscientific parameters are investigated and determined. One important aim is to erect a lithological-structural model of the site, which constitutes the basic requirement for modelling mechanical stability, thermal properties, groundwater flow, groundwater chemistry and transport of solutes. The investigations in borehole KLX02 resulted in a thorough lithological-structural characterization of the rock volume near the borehole. In order to

  10. Project Deep Drilling KLX02 - Phase 2. Methods, scope of activities and results. Summary report

    Energy Technology Data Exchange (ETDEWEB)

    Ekman, L. [GEOSIGMA AB/LE Geokonsult AB, Uppsala (Sweden)

    2001-04-01

    Geoscientific investigations performed by SKB, including those at the Aespoe Hard Rock Laboratory, have so far comprised the bedrock horizon down to about 1000 m. The primary purposes of the c. 1700 m deep, φ76 mm, sub-vertical core borehole KLX02, drilled during the autumn of 1992 at Laxemar, Oskarshamn, were to test the core drilling technique at large depths and with a relatively large diameter and to enable geoscientific investigations beyond 1000 m. Drilling of borehole KLX02 was completed very successfully. Results of the drilling commission and the borehole investigations conducted in conjunction with drilling have been reported earlier. The present report provides a summary of the investigations made during a five-year period after completion of drilling. Results as well as methods applied are described. A variety of geoscientific investigations to depths exceeding 1600 m were successfully performed. However, the investigations were not entirely problem-free. For example, borehole equipment got stuck in the borehole on several occasions. Special investigations, among them a fracture study, were initiated in order to reveal the mechanisms behind this problem. Different explanations seem possible, e.g. breakouts from the borehole wall, which may be a specific problem related to the stress situation in deep boreholes. The investigation approach for borehole KLX02 followed, in general outline, the SKB model for site investigations, where a number of key issues for site characterization are studied. For each of those, a number of geoscientific parameters are investigated and determined. One important aim is to erect a lithological-structural model of the site, which constitutes the basic requirement for modelling mechanical stability, thermal properties, groundwater flow, groundwater chemistry and transport of solutes. The investigations in borehole KLX02 resulted in a thorough lithological-structural characterization of the rock volume near the borehole. In order

  11. A nuclear method to authenticate Buddha images

    International Nuclear Information System (INIS)

    Khaweerat, S; Ratanatongchai, W; Channuie, J; Wonglee, S; Picha, R; Promping, J; Silva, K; Liamsuwan, T

    2015-01-01

    The value of Buddha images in Thailand varies dramatically depending on authentication and provenance. In general, people use their individual skills to make the judgement, which frequently leads to obscurity, deception and illegal activities. Here, we propose two non-destructive techniques, neutron radiography (NR) and neutron activation autoradiography (NAAR), to reveal respectively the structural and elemental profiles of small Buddha images. For NR, a thermal neutron flux of 10⁵ n cm⁻² s⁻¹ was applied. NAAR needed a higher neutron flux of 10¹² n cm⁻² s⁻¹ to activate the samples. Results from NR and NAAR revealed unique characteristics of the samples. Similarity of the profile played a key role in the classification of the samples. The results provided visual evidence to enhance the reliability of authenticity approval. The method can be further developed for routine practice, which impacts thousands of customers in Thailand. (paper)

  12. A nuclear method to authenticate Buddha images

    Science.gov (United States)

    Khaweerat, S.; Ratanatongchai, W.; Channuie, J.; Wonglee, S.; Picha, R.; Promping, J.; Silva, K.; Liamsuwan, T.

    2015-05-01

    The value of Buddha images in Thailand varies dramatically depending on authentication and provenance. In general, people use their individual skills to make the judgement, which frequently leads to obscurity, deception and illegal activities. Here, we propose two non-destructive techniques, neutron radiography (NR) and neutron activation autoradiography (NAAR), to reveal respectively the structural and elemental profiles of small Buddha images. For NR, a thermal neutron flux of 10⁵ n cm⁻² s⁻¹ was applied. NAAR needed a higher neutron flux of 10¹² n cm⁻² s⁻¹ to activate the samples. Results from NR and NAAR revealed unique characteristics of the samples. Similarity of the profile played a key role in the classification of the samples. The results provided visual evidence to enhance the reliability of authenticity approval. The method can be further developed for routine practice, which impacts thousands of customers in Thailand.

  13. A Systematic Protein Refolding Screen Method using the DGR Approach Reveals that Time and Secondary TSA are Essential Variables.

    Science.gov (United States)

    Wang, Yuanze; van Oosterwijk, Niels; Ali, Ameena M; Adawy, Alaa; Anindya, Atsarina L; Dömling, Alexander S S; Groves, Matthew R

    2017-08-24

    Refolding of proteins derived from inclusion bodies is very promising as it can provide a reliable source of target proteins of high purity. However, inclusion body-based protein production is often limited by the lack of techniques for the detection of correctly refolded protein. The selection of refolding conditions is therefore mostly achieved using trial-and-error approaches and is a time-consuming process. In this study, we use the latest developments in the differential scanning fluorimetry guided refolding approach as an analytical method to detect correctly refolded protein. We describe a systematic buffer screen that contains a 96-well primary pH-refolding screen in conjunction with a secondary additive screen. Our research demonstrates that this approach could be applied for determining refolding conditions for several proteins. In addition, it revealed which "helper" molecules, such as arginine, and which additives are essential. Four different proteins: HA-RBD, MDM2, IL-17A and PD-L1 were used to validate our refolding approach. Our systematic protocol evaluates the impact of the "helper" molecules, the pH, buffer system and time on the protein refolding process in a high-throughput fashion. Finally, we demonstrate that refolding time and a secondary thermal shift assay buffer screen are critical factors for improving refolding efficiency.

  14. Fractal dynamics in self-evaluation reveal self-concept clarity.

    Science.gov (United States)

    Wong, Alexander E; Vallacher, Robin R; Nowak, Andrzej

    2014-10-01

    The structural account of self-esteem and self-evaluation maintains that they are distinct constructs. Trait self-esteem is stable and is expressed over macro timescales, whereas state self-evaluation is unstable and experienced on micro timescales. We compared predictions based on the structural account with those derived from a dynamical systems perspective on the self, which maintains that self-esteem and self-evaluation are hierarchically related and share basic dynamic properties. Participants recorded a 3-minute narrative about themselves, then used the mouse paradigm (Vallacher, Nowak, Froehlich, & Rockloff, 2002) to track the momentary self-evaluation in their narrative. Multiple methods converged to reveal fractal patterns in the resulting temporal patterns, indicative of nested timescales that link micro and macro self-evaluation and thus supportive of the dynamical account. The fractal dynamics were associated with participants' self-concept clarity, suggesting that the hierarchical relation between macro self-evaluation (self-esteem) and momentary self-evaluation is predicted by the coherence of self-concept organization.
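
    As one common way to quantify fractal (power-law) scaling in such mouse-paradigm time series, the sketch below implements a basic detrended fluctuation analysis. DFA is named here only as a representative technique; the abstract does not specify which methods the authors used, and the synthetic series is an assumption.

    ```python
    # Basic detrended fluctuation analysis (DFA): the slope alpha of
    # log F(n) versus log n indicates fractal scaling in a time series.
    # Generic illustration, not the authors' analysis pipeline.
    import numpy as np

    def dfa(x, scales):
        y = np.cumsum(x - np.mean(x))            # integrated profile
        F = []
        for n in scales:
            nseg = len(y) // n
            segs = y[:nseg * n].reshape(nseg, n)
            t = np.arange(n)
            rms = []
            for seg in segs:
                coef = np.polyfit(t, seg, 1)     # local linear detrending
                rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t)) ** 2)))
            F.append(np.mean(rms))
        return np.polyfit(np.log(scales), np.log(F), 1)[0]

    series = np.cumsum(np.random.default_rng(2).normal(size=4096))  # synthetic example
    print("scaling exponent alpha:", round(dfa(series, [16, 32, 64, 128, 256]), 2))
    ```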

  15. RESULTS OF OUTPATIENT PROGRAM ON EFFECTIVE THERAPY OF REFRACTORY ARTERIAL HYPERTENSION

    Directory of Open Access Journals (Sweden)

    M. M. Batyushin

    2015-12-01

    Full Text Available Aim. To increase the efficacy of antihypertensive therapy in patients with refractory arterial hypertension (HT). Material and methods. Patients with refractory HT were revealed during the first month of the program. The causes of refractory HT were analyzed. Combined antihypertensive therapy was prescribed to reach the target level of blood pressure (BP). This therapy lasted 24 weeks and included an angiotensin converting enzyme (ACE) inhibitor, a thiazide diuretic (indapamide) and a dihydropyridine calcium antagonist (nifedipine XL). Results. 200 patients with refractory HT were revealed. True refractory HT took place in 59.9% of patients and pseudo-refractory HT in 40.1% of patients. Lack of diuretics or of combined antihypertensive therapy was the main reason for insufficient BP control. The proposed 3-drug therapy resulted in a reduction of systolic BP from 190 to 132 mm Hg and of diastolic BP from 104 to 81 mm Hg. The target level of BP was reached in 94% of patients. There were no side effects that demanded stopping therapy. Conclusion. A high incidence of pseudo-refractory HT (40.1%) was revealed. A significant prevalence of renal disturbances, especially chronic interstitial inflammation, could be responsible for refractory HT development. Use of 3-drug therapy (ACE inhibitor, indapamide and nifedipine XL) provides effective control of BP in refractory and pseudo-refractory HT.

  16. RESULTS OF OUTPATIENT PROGRAM ON EFFECTIVE THERAPY OF REFRACTORY ARTERIAL HYPERTENSION

    Directory of Open Access Journals (Sweden)

    M. M. Batyushin

    2007-01-01

    Full Text Available Aim. To increase the efficacy of antihypertensive therapy in patients with refractory arterial hypertension (HT). Material and methods. Patients with refractory HT were revealed during the first month of the program. The causes of refractory HT were analyzed. Combined antihypertensive therapy was prescribed to reach the target level of blood pressure (BP). This therapy lasted 24 weeks and included an angiotensin converting enzyme (ACE) inhibitor, a thiazide diuretic (indapamide) and a dihydropyridine calcium antagonist (nifedipine XL). Results. 200 patients with refractory HT were revealed. True refractory HT took place in 59.9% of patients and pseudo-refractory HT in 40.1% of patients. Lack of diuretics or of combined antihypertensive therapy was the main reason for insufficient BP control. The proposed 3-drug therapy resulted in a reduction of systolic BP from 190 to 132 mm Hg and of diastolic BP from 104 to 81 mm Hg. The target level of BP was reached in 94% of patients. There were no side effects that demanded stopping therapy. Conclusion. A high incidence of pseudo-refractory HT (40.1%) was revealed. A significant prevalence of renal disturbances, especially chronic interstitial inflammation, could be responsible for refractory HT development. Use of 3-drug therapy (ACE inhibitor, indapamide and nifedipine XL) provides effective control of BP in refractory and pseudo-refractory HT.

  17. The Semianalytical Solutions for Stiff Systems of Ordinary Differential Equations by Using Variational Iteration Method and Modified Variational Iteration Method with Comparison to Exact Solutions

    Directory of Open Access Journals (Sweden)

    Mehmet Tarik Atay

    2013-01-01

    Full Text Available The Variational Iteration Method (VIM) and Modified Variational Iteration Method (MVIM) are used to find solutions of systems of stiff ordinary differential equations for both linear and nonlinear problems. Some examples are given to illustrate the accuracy and effectiveness of these methods. We compare our results with exact results. In some studies related to stiff ordinary differential equations, problems were solved by the Adomian Decomposition Method, VIM and the Homotopy Perturbation Method. Comparisons with exact solutions reveal that the Variational Iteration Method (VIM) and the Modified Variational Iteration Method (MVIM) are easier to implement. In fact, these methods are promising methods for various systems of linear and nonlinear stiff ordinary differential equations. Furthermore, VIM, or in some cases MVIM, gives exact solutions in linear cases and very satisfactory solutions when compared to exact solutions for nonlinear cases, depending on the stiffness ratio of the stiff system to be solved.
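
    For reference, the general correction functional on which VIM is built can be written in its standard form from the VIM literature (with L the linear operator, N the nonlinear operator, g the source term, λ a general Lagrange multiplier and ũ_n the restricted variation):

    ```latex
    u_{n+1}(t) = u_n(t) + \int_{0}^{t} \lambda(\tau)\,
      \bigl[ L u_n(\tau) + N \tilde{u}_n(\tau) - g(\tau) \bigr]\, d\tau .
    ```

    Successive iterates u_1, u_2, ... are generated from an initial guess u_0, and in linear problems the iteration can terminate at the exact solution.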

  18. Results and current trends of nuclear methods used in agriculture

    International Nuclear Information System (INIS)

    Horacek, P.

    1983-01-01

    The significance of nuclear methods for agricultural research is evaluated. The number of varieties bred using radiation-induced mutations is increasing. The main importance of radiation mutation breeding consists in obtaining sources of the desired genetic properties for further hybridization. Radiostimulation is conducted with the aim of increasing yields. The irradiation of foods has not increased substantially worldwide. Very important is the irradiation of excrement and sludge, which after such inactivation of pathogenic microorganisms may be used as humus-forming manure or as feed additives. In some countries the method of sexual sterilization is being used successfully for the eradication of insect pests. The application of labelled compounds in the nutrition, physiology and protection of plants and farm animals and in food hygiene makes it possible to acquire new and accurate knowledge very quickly. Radioimmunoassay is a highly promising method in this respect. Labelling compounds with the stable ¹⁵N isotope is used for research on nitrogen metabolism. (M.D.)

  19. Daily radiotoxicological supervision of personnel at the Pierrelatte industrial complex. Methods and results

    International Nuclear Information System (INIS)

    Chalabreysse, Jacques.

    1978-05-01

    A 13-year experience gained from the daily radiotoxicological supervision of personnel at the PIERRELATTE industrial complex is presented. This study is divided into two parts. Part one is theoretical: a bibliographical synthesis of all scattered documents and publications, so that a homogeneous survey of all literature on the subject is available. Part two reviews the experience gained in the professional environment: laboratory measurements and analyses (development of methods and daily applications); mathematical formulae to answer the first questions which arise when an individual is liable to be contaminated; results obtained at PIERRELATTE [fr

  20. Application of NDE methods to green ceramics: initial results

    International Nuclear Information System (INIS)

    Kupperman, D.S.; Karplus, H.B.; Poeppel, R.B.; Ellingson, W.A.; Berger, H.; Robbins, C.; Fuller, E.

    1984-03-01

    This paper describes a preliminary investigation to assess the effectiveness of microradiography, ultrasonic methods, nuclear magnetic resonance, and neutron radiography for the nondestructive evaluation of green (unfired) ceramics. The objective is to obtain useful information on defects, cracking, delaminations, agglomerates, inclusions, regions of high porosity, and anisotropy

  1. The results of STEM education methods in physics at the 11th grade level: Light and visual equipment lesson

    Science.gov (United States)

    Tungsombatsanti, A.; Ponkham, K.; Somtoa, T.

    2018-01-01

    This research aimed: 1) to evaluate the efficiency of the process and of the results (E1/E2) of an innovative instructional lesson plan based on the STEM Education method in physics for secondary students at the 10th grade level, against the 70/70 standard criterion; 2) to study the critical thinking skills of secondary students at the 11th grade level, assessed against an 80 percent criterion; 3) to compare students' learning achievement between pre-test and post-test after being taught with STEM Education; and 4) to evaluate students' satisfaction after STEM Education teaching, using means compared to a 5-point Likert scale. The participants were 40 students from grade 11 at Borabu School, Borabu District, Mahasarakham Province, semester 2, academic year 2016. The tools used in this study consisted of: 1) a STEM Education plan on the force and laws of motion for grade 11 students, 1 scheme with a total of 15 hours; 2) an essay-type test of critical thinking skills with 30 items; 3) an achievement test on light and visual equipment with 30 multiple-choice items of 4 options; and 4) a satisfaction questionnaire with a 5-point rating scale of 16 items. The statistics used in data analysis were percentage, mean, standard deviation, and t-test (dependent). The results showed that: 1) the efficiency of the STEM lesson plan was 71.51/75, higher than the 70/70 standard criterion; 2) 26 students had critical thinking scores higher than the 80 percent criterion; 3) students' learning achievement differed significantly between pre-test and post-test at the .05 level; and 4) the students' level of satisfaction toward learning with the STEM Education plan was at a good level (x̄ = 4.33, S.D. = 0.64).

  2. Four-spacecraft determination of magnetopause orientation, motion and thickness: comparison with results from single-spacecraft methods

    Directory of Open Access Journals (Sweden)

    S. E. Haaland

    2004-04-01

    Full Text Available In this paper, we use Cluster data from one magnetopause event on 5 July 2001 to compare predictions from various methods for determination of the velocity, orientation, and thickness of the magnetopause current layer. We employ established as well as new multi-spacecraft techniques, in which time differences between the crossings by the four spacecraft, along with the duration of each crossing, are used to calculate magnetopause speed, normal vector, and width. The timing is based on data from either the Cluster Magnetic Field Experiment (FGM) or the Electric Field Experiment (EFW) instruments. The multi-spacecraft results are compared with those derived from various single-spacecraft techniques, including minimum-variance analysis of the magnetic field and deHoffmann-Teller, as well as Minimum-Faraday-Residue analysis of plasma velocities and magnetic fields measured during the crossings. In order to improve the overall consistency between multi- and single-spacecraft results, we have also explored the use of hybrid techniques, in which timing information from the four spacecraft is combined with certain limited results from single-spacecraft methods, the remaining results being left for consistency checks. The results show good agreement between magnetopause orientations derived from appropriately chosen single-spacecraft techniques and those obtained from multi-spacecraft timing. The agreement between magnetopause speeds derived from single- and multi-spacecraft methods is quantitatively somewhat less good, but it is evident that the speed can change substantially from one crossing to the next within an event. The magnetopause thickness varied substantially from one crossing to the next within an event. It ranged from 5 to 10 ion gyroradii. The density profile was sharper than the magnetic profile: most of the density change occurred in the earthward half of the magnetopause.

    Key words. Magnetospheric physics (magnetopause, cusp and
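
    The core of the multi-spacecraft timing technique can be reduced to a small linear system: for a planar boundary moving at constant velocity V along its normal n, the crossing-time differences satisfy (r_i − r_1)·(n/V) = t_i − t_1. The sketch below solves that system for four spacecraft; the positions and crossing times are made-up values for illustration only.

    ```python
    # Constant-velocity timing method: solve (r_i - r_1) . m = (t_i - t_1)
    # with m = n / V, then V = 1/|m| and n = m * V. Hypothetical inputs.
    import numpy as np

    r = np.array([[0.0, 0.0, 0.0],
                  [100.0, 20.0, 10.0],
                  [30.0, 120.0, 15.0],
                  [10.0, 25.0, 110.0]])      # spacecraft positions, km
    t = np.array([0.0, 2.1, 1.4, 0.9])       # boundary crossing times, s

    A = r[1:] - r[0]
    b = t[1:] - t[0]
    m, *_ = np.linalg.lstsq(A, b, rcond=None)  # m = n / V
    V = 1.0 / np.linalg.norm(m)                # boundary speed, km/s
    n = m * V                                  # unit normal vector
    print(f"speed = {V:.1f} km/s, normal = {np.round(n, 3)}")
    ```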

  3. Performance evaluation of elemental analysis/isotope ratio mass spectrometry methods for the determination of the D/H ratio in tetramethylurea and other compounds--results of a laboratory inter-comparison.

    Science.gov (United States)

    Bréas, Olivier; Thomas, Freddy; Zeleny, Reinhard; Calderone, Giovanni; Jamin, Eric; Guillou, Claude

    2007-01-01

    Tetramethylurea (TMU) with a certified D/H ratio is the internal standard for Site-specific Natural Isotope Fractionation measured by Nuclear Magnetic Resonance (SNIF-NMR) analysis of wine ethanol for detection of possible adulterations (Commission Regulation 2676/90). A new batch of a TMU certified reference material (CRM) is currently being prepared. Whereas SNIF-NMR has been employed up to now, Elemental Analysis/Isotope Ratio Mass Spectrometry (²H-EA-IRMS) was envisaged as the method of choice for value assignment of the new CRM, as more precise (more repeatable) data might be obtained, resulting in lower uncertainty of the certified value. In order to evaluate the accuracy and intra- and inter-laboratory reproducibility of ²H-EA-IRMS methods, a laboratory inter-comparison was carried out by analysing TMU and other organic compounds, as well as some waters. The results revealed that experienced laboratories are capable of generating robust and well comparable data, which highlights the emerging potential of IRMS in food authenticity testing. However, a systematic bias between IRMS and SNIF-NMR reference data was observed for TMU; this lack of data consistency rules out the ²H-IRMS technique for the characterisation measurement of the new TMU CRM.

  4. Method and equipment for treating waste water resulting from the technological testing processes of NPP equipment

    International Nuclear Information System (INIS)

    Radulescu, M. C.; Valeca, S.; Iorga, C.

    2016-01-01

    Modern methods and technologies, coupled with advanced equipment for treating residual substances resulting from technological processes, are mandatory measures for all industrial facilities. The correct management of the used working agents and of all the wastes resulting from the different technological processes (preparation, use, collection, neutralization, discharge) is intended to reduce, up to complete removal, their potential negative impact on the environment. The high pressure and temperature testing stands at INR intended for functional testing of nuclear components (fuel bundles, fuelling machines, etc.) were included in these measures, since they use oils, chemically treated demineralized water, greases, etc. This paper is focused on the method and equipment used at INR Pitesti for the chemical treatment of demineralized waters, as well as the equipment for collecting, neutralizing and discharging them after use. (authors)

  5. What does patient feedback reveal about the NHS? A mixed methods study of comments posted to the NHS Choices online service

    Science.gov (United States)

    Brookes, Gavin; Baker, Paul

    2017-01-01

    Objective To examine the key themes of positive and negative feedback in patients’ online feedback on NHS (National Health Service) services in England and to understand the specific issues within these themes and how they drive positive and negative evaluation. Design Computer-assisted quantitative and qualitative studies of 228 113 comments (28 971 142 words) of online feedback posted to the NHS Choices website. Comments containing the most frequent positive and negative evaluative words are qualitatively examined to determine the key drivers of positive and negative feedback. Participants Contributors posting comments about the NHS between March 2013 and September 2015. Results Overall, NHS services were evaluated positively approximately three times more often than negatively. The four key areas of focus were: treatment, communication, interpersonal skills and system/organisation. Treatment exhibited the highest proportion of positive evaluative comments (87%), followed by communication (77%), interpersonal skills (44%) and, finally, system/organisation (41%). Qualitative analysis revealed that reference to staff interpersonal skills featured prominently, even in comments relating to treatment and system/organisational issues. Positive feedback was elicited in cases of staff being caring, compassionate and knowing patients’ names, while rudeness, apathy and not listening were frequent drivers of negative feedback. Conclusions Although technical competence constitutes an undoubtedly fundamental aspect of healthcare provision, staff members were much more likely to be evaluated both positively and negatively according to their interpersonal skills. Therefore, the findings reported in this study highlight the salience of such ‘soft’ skills to patients and emphasise the need for these to be focused upon and developed in staff training programmes, as well as ensuring that decisions around NHS funding do not result in demotivated and rushed staff. The
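
    A toy version of the kind of computer-assisted keyword tallying described above might look like the sketch below; the word lists and comments are invented purely for illustration and are not the study's lexicon or corpus.

    ```python
    # Toy tally of positive/negative evaluative words in free-text comments.
    # Word lists and comments are invented examples only.
    from collections import Counter

    positive = {"caring", "compassionate", "excellent", "helpful"}
    negative = {"rude", "apathetic", "ignored", "rushed"}

    comments = [
        "The nurses were caring and helpful throughout.",
        "Reception staff were rude and we felt ignored.",
    ]

    counts = Counter()
    for comment in comments:
        words = {w.strip(".,!?").lower() for w in comment.split()}
        counts["positive"] += len(words & positive)
        counts["negative"] += len(words & negative)
    print(dict(counts))
    ```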

  6. Effectiveness of Various Innovative Learning Methods in Health Science Classrooms: A Meta-Analysis

    Science.gov (United States)

    Kalaian, Sema A.; Kasim, Rafa M.

    2017-01-01

    This study reports the results of a meta-analysis of the available literature on the effectiveness of various forms of innovative small-group learning methods on student achievement in undergraduate college health science classrooms. The results of the analysis revealed that most of the primary studies supported the effectiveness of the…

  7. Human exposure to bisphenol A by biomonitoring: Methods, results and assessment of environmental exposures

    International Nuclear Information System (INIS)

    Dekant, Wolfgang; Voelkel, Wolfgang

    2008-01-01

    Human exposure to bisphenol A is controversially discussed. This review critically assesses methods for biomonitoring of bisphenol A exposures and reported concentrations of bisphenol A in blood and urine of non-occupationally ('environmentally') exposed humans. From the many methods published to assess bisphenol A concentrations in biological media, mass spectrometry-based methods are considered most appropriate due to high sensitivity, selectivity and precision. In human blood, based on the known toxicokinetics of bisphenol A in humans, the expected very low concentrations of bisphenol A due to rapid biotransformation and the very rapid excretion result in severe limitations in the use of reported blood levels of bisphenol A for exposure assessment. Due to the rapid and complete excretion of orally administered bisphenol A, urine samples are considered as the appropriate body fluid for bisphenol A exposure assessment. In urine samples from several cohorts, bisphenol A (as glucuronide) was present in average concentrations in the range of 1-3 μg/L suggesting that daily human exposure to bisphenol A is below 6 μg per person (< 0.1 μg/kg bw/day) for the majority of the population

  8. GePb Alloy Growth Using Layer Inversion Method

    Science.gov (United States)

    Alahmad, Hakimah; Mosleh, Aboozar; Alher, Murtadha; Banihashemian, Seyedeh Fahimeh; Ghetmiri, Seyed Amir; Al-Kabi, Sattar; Du, Wei; Li, Bauhoa; Yu, Shui-Qing; Naseem, Hameed A.

    2018-04-01

    Germanium-lead films have been investigated as a new direct-bandgap group IV alloy. GePb films were deposited on Si via thermal evaporation of Ge and Pb solid sources using the layer inversion metal-induced crystallization method for comparison with the current laser-induced recrystallization method. Material characterization of the films using x-ray diffraction analysis revealed highly oriented crystallinity and Pb incorporation as high as 13.5% before and 5.2% after annealing. Transmission electron microscopy, scanning electron microscopy, and energy-dispersive x-ray mapping of the samples revealed uniform incorporation of elements and complete layer inversion. Optical characterization of the GePb films by Raman spectroscopy and photoluminescence techniques showed that annealing the samples resulted in higher crystalline quality as well as bandgap reduction. The bandgap reduction from 0.67 eV to 0.547 eV observed for the highest-quality material confirms the achievement of a direct-bandgap material.

  9. GePb Alloy Growth Using Layer Inversion Method

    Science.gov (United States)

    Alahmad, Hakimah; Mosleh, Aboozar; Alher, Murtadha; Banihashemian, Seyedeh Fahimeh; Ghetmiri, Seyed Amir; Al-Kabi, Sattar; Du, Wei; Li, Bauhoa; Yu, Shui-Qing; Naseem, Hameed A.

    2018-07-01

    Germanium-lead films have been investigated as a new direct-bandgap group IV alloy. GePb films were deposited on Si via thermal evaporation of Ge and Pb solid sources using the layer inversion metal-induced crystallization method for comparison with the current laser-induced recrystallization method. Material characterization of the films using x-ray diffraction analysis revealed highly oriented crystallinity and Pb incorporation as high as 13.5% before and 5.2% after annealing. Transmission electron microscopy, scanning electron microscopy, and energy-dispersive x-ray mapping of the samples revealed uniform incorporation of elements and complete layer inversion. Optical characterization of the GePb films by Raman spectroscopy and photoluminescence techniques showed that annealing the samples resulted in higher crystalline quality as well as bandgap reduction. The bandgap reduction from 0.67 eV to 0.547 eV observed for the highest-quality material confirms the achievement of a direct-bandgap material.

  10. Integral transform method for solving time fractional systems and fractional heat equation

    Directory of Open Access Journals (Sweden)

    Arman Aghili

    2014-01-01

    Full Text Available In the present paper, a time fractional partial differential equation is considered, where the fractional derivative is defined in the Caputo sense. The Laplace transform method has been applied to obtain an exact solution. The authors solved certain homogeneous and nonhomogeneous time fractional heat equations using the integral transform. The transform method is a powerful tool for solving fractional singular integro-differential equations and PDEs. The results reveal that the transform method is very convenient and effective.
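
    The key property that makes the Laplace transform convenient here is the standard transform of the Caputo derivative; for n − 1 < α ≤ n it reads:

    ```latex
    \mathcal{L}\{{}^{C}D^{\alpha}_{t} u(t)\}(s)
      = s^{\alpha} U(s) - \sum_{k=0}^{n-1} s^{\alpha-1-k}\, u^{(k)}(0),
    \qquad n-1 < \alpha \le n ,
    ```

    so that an initial-value problem in t becomes an algebraic (or ordinary differential) equation in the transform variable s.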

  11. Task-Related Edge Density (TED)—A New Method for Revealing Dynamic Network Formation in fMRI Data of the Human Brain

    Science.gov (United States)

    Lohmann, Gabriele; Stelzer, Johannes; Zuber, Verena; Buschmann, Tilo; Margulies, Daniel; Bartels, Andreas; Scheffler, Klaus

    2016-01-01

    The formation of transient networks in response to external stimuli or as a reflection of internal cognitive processes is a hallmark of human brain function. However, its identification in fMRI data of the human brain is notoriously difficult. Here we propose a new method of fMRI data analysis that tackles this problem by considering large-scale, task-related synchronisation networks. Networks consist of nodes and edges connecting them, where nodes correspond to voxels in fMRI data, and the weight of an edge is determined via task-related changes in dynamic synchronisation between their respective times series. Based on these definitions, we developed a new data analysis algorithm that identifies edges that show differing levels of synchrony between two distinct task conditions and that occur in dense packs with similar characteristics. Hence, we call this approach “Task-related Edge Density” (TED). TED proved to be a very strong marker for dynamic network formation that easily lends itself to statistical analysis using large scale statistical inference. A major advantage of TED compared to other methods is that it does not depend on any specific hemodynamic response model, and it also does not require a presegmentation of the data for dimensionality reduction as it can handle large networks consisting of tens of thousands of voxels. We applied TED to fMRI data of a fingertapping and an emotion processing task provided by the Human Connectome Project. TED revealed network-based involvement of a large number of brain areas that evaded detection using traditional GLM-based analysis. We show that our proposed method provides an entirely new window into the immense complexity of human brain function. PMID:27341204
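
    The following sketch conveys the flavour of such an edge-wise, task-related synchrony comparison on a toy scale (a handful of "voxels"); the sliding-window correlation contrast used here is a simplified stand-in for the TED statistic, whose exact definition is given in the paper, and all data are synthetic.

    ```python
    # Toy edge-wise contrast of dynamic synchrony between two task conditions:
    # for each voxel pair, compare mean windowed correlation in condition A vs B.
    # Simplified stand-in for the TED statistic; synthetic data only.
    import numpy as np

    rng = np.random.default_rng(3)
    n_vox, n_t, win = 6, 200, 20
    cond_a = rng.normal(size=(n_vox, n_t))   # hypothetical time series, condition A
    cond_b = rng.normal(size=(n_vox, n_t))   # hypothetical time series, condition B

    def mean_windowed_corr(x, y, win):
        vals = [np.corrcoef(x[s:s + win], y[s:s + win])[0, 1]
                for s in range(0, len(x) - win + 1, win)]
        return np.mean(vals)

    edges = {}
    for i in range(n_vox):
        for j in range(i + 1, n_vox):
            edges[(i, j)] = (mean_windowed_corr(cond_a[i], cond_a[j], win)
                             - mean_windowed_corr(cond_b[i], cond_b[j], win))
    # Edges with the largest absolute contrast are candidate task-related links.
    print(max(edges.items(), key=lambda kv: abs(kv[1])))
    ```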

  12. Viscous wing theory development. Volume 1: Analysis, method and results

    Science.gov (United States)

    Chow, R. R.; Melnik, R. E.; Marconi, F.; Steinhoff, J.

    1986-01-01

    Viscous transonic flows at large Reynolds numbers over 3-D wings were analyzed using a zonal viscid-inviscid interaction approach. A new numerical AFZ scheme was developed in conjunction with the finite volume formulation for the solution of the inviscid full-potential equation. A special far-field asymptotic boundary condition was developed and a second-order artificial viscosity included for an improved inviscid solution methodology. The integral method was used for the laminar/turbulent boundary layer and 3-D viscous wake calculation. The interaction calculation included the coupling conditions of the source flux due to the wing surface boundary layer, the flux jump due to the viscous wake, and the wake curvature effect. A method was also devised incorporating the 2-D trailing edge strong interaction solution for the normal pressure correction near the trailing edge region. A fully automated computer program was developed to perform the proposed method with one scalar version to be used on an IBM-3081 and two vectorized versions on Cray-1 and Cyber-205 computers.

  13. Method of fabricating nested shells and resulting product

    Science.gov (United States)

    Henderson, Timothy M.; Kool, Lawrence B.

    1982-01-01

    A multiple shell structure and a method of manufacturing such structure wherein a hollow glass microsphere is surface treated in an organosilane solution so as to render the shell outer surface hydrophobic. The surface treated glass shell is then suspended in the oil phase of an oil-aqueous phase dispersion. The oil phase includes an organic film-forming monomer, a polymerization initiator and a blowing agent. A polymeric film forms at each phase boundary of the dispersion and is then expanded in a blowing operation so as to form an outer homogeneously integral monocellular substantially spherical thermoplastic shell encapsulating an inner glass shell of lesser diameter.

  14. Evolution of different reaction methods resulting in the formation of AgI125 for use in brachytherapy sources

    International Nuclear Information System (INIS)

    Souza, C.D.; Peleias Jr, F.S.; Rostelato, M.E.C.M.; Zeituni, C.A.; Benega, M.A.G.; Tiezzi, R.; Mattos, F.R.; Rodrigues, B.T.; Oliveira, T.B.; Feher, A.; Moura, J.A.; Costa, O.L.

    2014-01-01

    Prostate cancer represents about 10% of all cases of cancer in the world. Brachytherapy has been extensively used in the early and intermediate stages of the illness. The radiotherapy method reduces the damage probability to surrounding healthy tissues. The present study compares several deposition methods of iodine-125 on silver substrate (seed core), in order to choose the most suitable one to be implemented at IPEN. Four methods were selected: method 1 (assay based on electrodeposition) which presented efficiency of 65.16%; method 2 (assay based on chemical reactions, developed by David Kubiatowicz) which presented efficiency of 70.80%; method 3 (chemical reaction based on the methodology developed by Dr. Maria Elisa Rostelato) which presented efficiency of 55.80%; new method developed by IPEN with 90.5% efficiency. Based on the results, the new method is the suggested one to be implemented. (authors)

  15. A novel method for unsteady flow field segmentation based on stochastic similarity of direction

    Science.gov (United States)

    Omata, Noriyasu; Shirayama, Susumu

    2018-04-01

    Recent developments in fluid dynamics research have opened up the possibility for the detailed quantitative understanding of unsteady flow fields. However, the visualization techniques currently in use generally provide only qualitative insights. A method for dividing the flow field into physically relevant regions of interest can help researchers quantify unsteady fluid behaviors. Most methods at present compare the trajectories of virtual Lagrangian particles. The time-invariant features of an unsteady flow are also frequently of interest, but the Lagrangian specification only reveals time-variant features. To address these challenges, we propose a novel method for the time-invariant spatial segmentation of an unsteady flow field. This segmentation method does not require Lagrangian particle tracking but instead quantitatively compares the stochastic models of the direction of the flow at each observed point. The proposed method is validated with several clustering tests for 3D flows past a sphere. Results show that the proposed method reveals the time-invariant, physically relevant structures of an unsteady flow.
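
    To illustrate the general idea of segmenting a flow field by the statistics of local flow direction (rather than by particle trajectories), the sketch below builds a direction histogram per grid point and clusters the histograms. The histogram features and k-means step are illustrative choices, not the similarity measure defined in the paper; the velocity data are synthetic.

    ```python
    # Segment grid points of an unsteady 2-D flow by the distribution of local
    # flow direction over time: angle histogram per point, then clustering.
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(4)
    n_pts, n_t = 500, 100
    u = rng.normal(size=(n_pts, n_t))   # hypothetical velocity components
    v = rng.normal(size=(n_pts, n_t))

    angles = np.arctan2(v, u)           # flow direction at each point and time
    bins = np.linspace(-np.pi, np.pi, 13)
    features = np.array([np.histogram(a, bins=bins, density=True)[0] for a in angles])

    labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(features)
    print(np.bincount(labels))          # sizes of the resulting regions
    ```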

  16. Do we need invasive confirmation of cardiac magnetic resonance results?

    Directory of Open Access Journals (Sweden)

    Paweł Siastała

    2017-03-01

    Full Text Available Introduction: Coronary artery revascularization is indicated in patients with documented significant obstruction of coronary blood flow associated with a large area of myocardial ischemia and/or untreatable symptoms. There are a few invasive or noninvasive methods that can provide information about the functional results of coronary artery narrowing. The application of more than one method of ischemia detection in one patient to re-evaluate the indications for revascularization is used in cases of atypical or no symptoms and/or borderline stenosis. Aim: To evaluate whether the results of cardiac magnetic resonance need to be reconfirmed by the invasive functional method. Material and methods: The hospital database revealed 25 consecutive patients with 29 stenoses who underwent cardiac magnetic resonance (CMR) and fractional flow reserve (FFR) between the end of 2010 and the end of 2014. The maximal time interval between CMR and FFR was 6 months. None of the patients experienced any clinical events or underwent procedures on coronary arteries between the studies. Results: According to the analysis, the agreement of CMR perfusion with the FFR method was at the level of 89.7%. Assuming that FFR is the gold standard in assessing the severity of stenoses, the sensitivity of CMR perfusion was 90.9%. The percentage of non-severe lesions which were correctly identified in CMR was 88.9%. Conclusions: The study shows that CMR perfusion is a highly sensitive method to detect hemodynamically significant CAD and exclude non-severe lesions. With FFR as the reference standard, the diagnostic accuracy of MR perfusion to detect ischemic CAD is high.

  17. Dosimetry methods and results for the former residents of Bikini Atoll

    International Nuclear Information System (INIS)

    Greenhouse, N.A.

    1979-01-01

    The US Government utilized Bikini and Enewetak Atolls in the northern Marshall Islands of Micronesia for atmospheric tests of nuclear explosives in the 1940's and 1950's. The original inhabitants of these atolls were relocated prior to the tests. During the early 1970's, a small but growing population of Marshallese people reinhabited Bikini. Environmental and personnel radiological monitoring programs were begun in 1974 to ensure that doses and dose commitments received by Bikini residents remained within US Federal Radiation Council guidelines. Dramatic increases in ¹³⁷Cs body burdens among the inhabitants between April 1977 and 1978 may have played a significant role in the government decision to move the 140 Bikinians in residence off of the atoll in August 1978. The average ¹³⁷Cs body burden for the population was 2.3 μCi in April 1978. Several individuals, however, exceeded the maximum permissible body burden of 3 μCi, and some approached 6 μCi. The resultant total dose commitment was less than 200 mrem for the average resident. The average total dose for the mean residence interval of approx. 4.5 years was about 1 rem. The sources of exposure, the probable cause of the unexpected increase in ¹³⁷Cs body burdens, and the methods for calculating radionuclide intake and resultant doses are discussed. Suggestions are offered as to the implications of the most significant exposure pathways for the future inhabitation of Bikini and Enewetak

  18. Effective methods of protection of the intellectual activity results in infosphere of global telematics networks

    Directory of Open Access Journals (Sweden)

    D. A. Lovtsov

    2016-01-01

    Full Text Available The purpose of this article is to improve the methodology of technological, organizational and legal protection of the results of intellectual activity and the related intellectual rights in the information sphere of global telematics networks (such as «Internet», «Relkom», «Sitek», «Sedab», «Remart», and others). On the basis of an analysis of the peculiarities and possibilities of using different technological, organizational and legal methods and means of protecting information objects, proposals for improving the corresponding organizational and legal safeguards are formulated. The effectiveness of the protection is ensured by a rational combination of the technological, organizational and legal methods and means possible in a particular situation.

  19. Pervasive within-Mitochondrion Single-Nucleotide Variant Heteroplasmy as Revealed by Single-Mitochondrion Sequencing

    Directory of Open Access Journals (Sweden)

    Jacqueline Morris

    2017-12-01

    Full Text Available Summary: A number of mitochondrial diseases arise from single-nucleotide variant (SNV) accumulation in multiple mitochondria. Here, we present a method for identification of variants present at the single-mitochondrion level in individual mouse and human neuronal cells, allowing for extremely high-resolution study of mitochondrial mutation dynamics. We identified extensive heteroplasmy between individual mitochondria, along with three high-confidence variants in mouse and one in human that were present in multiple mitochondria across cells. The pattern of variation revealed by single-mitochondrion data shows surprisingly pervasive levels of heteroplasmy in inbred mice. The distribution of SNV loci suggests inheritance of variants across generations, resulting in Poisson jackpot lines with large SNV load. Comparison of human and mouse variants suggests that the two species might employ distinct modes of somatic segregation. Single-mitochondrion resolution revealed mitochondrial mutational dynamics that we hypothesize to affect risk probabilities for mutations reaching disease thresholds. Morris et al. use independent sequencing of multiple individual mitochondria from mouse and human brain cells to show high pervasiveness of mutations. The mutations are heteroplasmic within single mitochondria and within and between cells. These findings suggest mechanisms by which mutations accumulate over time, resulting in mitochondrial dysfunction and disease. Keywords: single mitochondrion, single cell, human neuron, mouse neuron, single-nucleotide variation

  20. Khater method for nonlinear Sharma Tasso-Olever (STO) equation of fractional order

    Science.gov (United States)

    Bibi, Sadaf; Mohyud-Din, Syed Tauseef; Khan, Umar; Ahmed, Naveed

    In this work, we have implemented a direct method, known as the Khater method, to establish exact solutions of nonlinear partial differential equations of fractional order. The number of solutions provided by this method is greater than that of other traditional methods. Exact solutions of the nonlinear fractional-order Sharma Tasso-Olever (STO) equation are expressed in terms of kink, travelling wave, periodic and solitary wave solutions. The modified Riemann-Liouville derivative and the fractional complex transform have been used for compatibility with the fractional-order sense. Solutions have been graphically simulated for understanding the physical aspects and the importance of the method. A comparative discussion between our established results and the results obtained by existing methods is also presented. Our results clearly reveal that the proposed method is an effective, powerful and straightforward technique to work out new solutions of various types of differential equations of non-integer order in the fields of applied sciences and engineering.
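
    For context, the fractional complex transform used with the modified Riemann-Liouville derivative is commonly written in the literature as a wave variable of the form

    ```latex
    \xi = \frac{k\, x^{\beta}}{\Gamma(1+\beta)} + \frac{c\, t^{\alpha}}{\Gamma(1+\alpha)},
    ```

    which converts the fractional derivatives in x and t into ordinary derivatives with respect to ξ, so that the direct method can be applied to the resulting integer-order ODE. The symbols k and c here are generic wave parameters, not values taken from the paper.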

  1. Feasibility to implement the radioisotopic method of nasal mucociliary transport measurement getting reliable results

    International Nuclear Information System (INIS)

    Troncoso, M.; Opazo, C.; Quilodran, C.; Lizama, V.

    2002-01-01

    Aim: Our goal was to implement the radioisotopic method for measuring the nasal mucociliary velocity of transport (NMVT) in a feasible way, in order to make it easily available as well as to validate the accuracy of the results. Such a method is needed when primary ciliary dyskinesia (PCD) is suspected, a disorder characterized by low NMVT and non-specific chronic respiratory symptoms that needs to be confirmed by electron microscopic cilia biopsy. Methods: We performed one hundred studies from February 2000 until February 2002. Patients aged 2 months to 39 years, mean 9 years. All of them were referred from the Respiratory Disease Department. Ninety had upper or lower respiratory symptoms; ten were healthy controls. The procedure, done by the Nuclear Medicine Technologist, consists of placing a 20 μl drop of 99mTc-MAA (0.1 mCi, 4 MBq) behind the head of the inferior turbinate in one nostril using a frontal light, a nasal speculum and a teflon catheter attached to a tuberculin syringe. The drop movement was acquired in a gamma camera-computer system and the velocity was expressed in mm/min. As the patient must not move during the procedure, sedation has to be used in non-cooperative children. Cases with abnormal NMVT values were referred for nasal biopsy. Patients were classified in three groups: normal controls (NC), PCD confirmed by biopsy (PCDB) and cases with respiratory symptoms without biopsy (RSNB). In all patients with NMVT less than 2.4 mm/min, PCD was confirmed by biopsy. There was a clear-cut separation between normal and abnormal values and, interestingly, even the highest NMVT in PCDB cases was lower than the lowest NMVT in NC. The procedure is not as easy as generally described in the literature, because the operator has to acquire some skill and sedation is needed in some cases. Conclusion: The procedure gives reliable, reproducible and objective results. It is safe, not expensive and quick in cooperative patients. Although, sometimes

  2. Examining Sexual Dysfunction in Non‐Muscle‐Invasive Bladder Cancer: Results of Cross‐Sectional Mixed‐Methods Research

    Directory of Open Access Journals (Sweden)

    Marc A. Kowalkowski, PhD

    2014-08-01

    Conclusions: Survivors' sexual symptoms may result from NMIBC, comorbidities, or both. These results inform literature and practice by raising awareness about the frequency of symptoms and the impact on NMIBC survivors' intimate relationships. Further work is needed to design symptom management education programs to dispel misinformation about contamination post‐treatment and improve quality of life. Kowalkowski MA, Chandrashekar A, Amiel GE, Lerner SP, Wittmann DA, Latini DM, and Goltz HH. Examining sexual dysfunction in non‐muscle‐invasive bladder cancer: Results of cross‐sectional mixed‐methods research. Sex Med 2014;2:141–151.

  3. Radioactive indium labelling of the figured elements of blood. Method, results, applications

    International Nuclear Information System (INIS)

    Ducassou, D.; Nouel, J.P.

    Following the work of Thakur et al., the authors became interested in red corpuscle, leucocyte and platelet labelling with indium 111 or 113m (8-hydroxyquinoline-indium). For easier labelling of the figured elements of blood, the technique described was modified. The chelate is prepared by simple contact, at room temperature, of indium 111 or 113m chloride and water-soluble 8-hydroxyquinoline sulphate, in the presence of 0.2M TRIS buffer. The chosen figured element, suspended in physiological serum, is added directly to the solution obtained, the platelets and leucocytes being separated out beforehand by differential centrifugation. While it gives results similar to those of Thakur et al., the method proposed avoids the chloroform extraction of the radioactive chelate and the use of alcohol, which is liable to impair the platelet aggregation capacity [fr

  4. A method for purifying air containing radioactive substances resulting from the disintegration of radon

    International Nuclear Information System (INIS)

    Stringer, C.W.

    1974-01-01

    The invention relates to the extraction of radioactive isotopes from air. It refers to a method for removing from air the radioactive substances resulting from the disintegration of radon, said method being of the type comprising filtering the air contaminated by the radon daughter products through a filter wetted with water in order to trap said substances in the water. It is characterized in that it comprises the steps of causing the water contaminated by the radon daughter products to flow through a filtering substance containing a non-water-soluble granular substrate, the outer surface of which has been dried and then wetted by a normally liquid hydrocarbon, and of returning the thus-filtered water so that it again wets the air filter and traps further radon daughter products. This can be applied to the purification of the air in uranium mines. [fr]

  5. Eigenspaces of networks reveal the overlapping and hierarchical community structure more precisely

    International Nuclear Information System (INIS)

    Ma, Xiaoke; Gao, Lin; Yong, Xuerong

    2010-01-01

    Identifying community structure is fundamental for revealing the structure–functionality relationship in complex networks, and spectral algorithms have been shown to be powerful for this purpose. In a traditional spectral algorithm, each vertex of a network is embedded into a spectral space by making use of the eigenvectors of the adjacency matrix or Laplacian matrix of the graph. In this paper, a novel spectral approach for revealing the overlapping and hierarchical community structure of complex networks is proposed by using not only the eigenvalues and eigenvectors but also the properties of the eigenspaces of the networks involved. This gives a better characterization of community. We first show that the communicability between a pair of vertices can be rewritten in terms of the eigenspaces of the network. An agglomerative clustering algorithm is then presented to discover the hierarchical communities using the communicability matrix. Finally, the overlapping vertices are discovered with the corresponding eigenspaces, based on the fact that vertices more densely connected amongst one another are more likely to be linked through short cycles. Compared with traditional spectral algorithms, our algorithm can identify both the overlapping and the hierarchical community structure without increasing the time complexity beyond O(n³), where n is the size of the network. Furthermore, our algorithm can also distinguish overlapping vertices from bridges. The method is tested by applying it to some computer-generated and real-world networks. The experimental results indicate that our algorithm can reveal community structure more precisely than the traditional spectral approaches.
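
    As background to the communicability step mentioned above, the following NumPy sketch computes a communicability matrix directly from the eigendecomposition of a symmetric adjacency matrix; the agglomerative clustering and eigenspace-based overlap detection of the full algorithm are not reproduced, and the toy graph is illustrative only.

    ```python
    import numpy as np

    def communicability_matrix(adjacency):
        """Estrada-style communicability G = sum_j exp(lambda_j) * phi_j phi_j^T,
        obtained from the eigendecomposition of a symmetric adjacency matrix."""
        eigvals, eigvecs = np.linalg.eigh(adjacency)            # adjacency must be symmetric
        return eigvecs @ np.diag(np.exp(eigvals)) @ eigvecs.T   # equals expm(adjacency)

    # Toy undirected graph: two triangles joined by one bridge edge (illustrative only).
    A = np.zeros((6, 6))
    for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
        A[i, j] = A[j, i] = 1.0

    G = communicability_matrix(A)
    # Within-community pairs communicate more strongly than the pair across the bridge.
    print(round(G[0, 1], 3), round(G[0, 5], 3))
    ```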

  6. Revealing −1 Programmed Ribosomal Frameshifting Mechanisms by Single-Molecule Techniques and Computational Methods

    Directory of Open Access Journals (Sweden)

    Kai-Chun Chang

    2012-01-01

    Full Text Available Programmed ribosomal frameshifting (PRF) serves as an intrinsic translational regulation mechanism employed by some viruses to control the ratio between structural and enzymatic proteins. Most viral mRNAs which use PRF adopt an H-type pseudoknot to stimulate −1 PRF. The relationship between the thermodynamic stability and the frameshifting efficiency of pseudoknots has not been fully understood. Recently, single-molecule force spectroscopy has revealed that the frequency of −1 PRF correlates with the unwinding forces required for disrupting pseudoknots, and that some of the unwinding work dissipates irreversibly due to the torsional restraint of pseudoknots. Complementary to single-molecule techniques, computational modeling provides insights into global motions of the ribosome, whose structural transitions during frameshifting have not yet been elucidated in atomic detail. Taken together, recent advances in biophysical tools may help to develop antiviral therapies that target the ubiquitous −1 PRF mechanism among viruses.

  7. The effect of different methods and image analyzers on the results of the in vivo comet assay.

    Science.gov (United States)

    Kyoya, Takahiro; Iwamoto, Rika; Shimanura, Yuko; Terada, Megumi; Masuda, Shuichi

    2018-01-01

    The in vivo comet assay is a widely used genotoxicity test that can detect DNA damage in a range of organs. It is included in the Organisation for Economic Co-operation and Development Guidelines for the Testing of Chemicals. However, various protocols are still used for this assay, and several different image analyzers are used routinely to evaluate the results. Here, we verified a protocol that largely contributes to the equivalence of results, and we assessed the effect on the results when slides made from the same sample were analyzed using two different image analyzers (Comet Assay IV vs Comet Analyzer). Standardizing the agarose concentrations and the DNA unwinding and electrophoresis times had a large impact on the equivalence of the results between the different methods used for the in vivo comet assay. In addition, there was some variation in the sensitivity of the two different image analyzers tested; however, this variation was considered to be minor and became negligible when the test conditions were standardized between the two methods. By standardizing the concentrations of low-melting agarose and the DNA unwinding and electrophoresis times between both methods used in the current study, the sensitivity to detect the genotoxicity of a positive control substance in the in vivo comet assay became generally comparable, independently of the image analyzer used. However, there may still be the possibility that conditions other than the three described here could affect the reproducibility of the in vivo comet assay.

  8. Long-term Results of Endovascular Stent Graft Placement of Ureteroarterial Fistula

    Energy Technology Data Exchange (ETDEWEB)

    Okada, Takuya, E-mail: okabone@gmail.com; Yamaguchi, Masato, E-mail: masato03310402@yahoo.co.jp [Kobe University Hospital, Department of Radiology (Japan); Muradi, Akhmadu, E-mail: muradiakhmadu@gmail.com; Nomura, Yoshikatsu, E-mail: y_katsu1027@yahoo.co.jp [Kobe University Hospital, Center for Endovascular Therapy (Japan); Uotani, Kensuke, E-mail: uotani@tenriyorozu.jp [Tenri Hospital, Department of Radiology (Japan); Idoguchi, Koji, E-mail: idoguchi@ares.eonet.ne.jp [Kobe University Hospital, Center for Endovascular Therapy (Japan); Miyamoto, Naokazu, E-mail: naoka_zu@yahoo.co.jp; Kawasaki, Ryota, E-mail: kawaryo1999@yahoo.co.jp [Hyogo Brain and Heart Center at Himeji, Department of Radiology (Japan); Taniguchi, Takanori, E-mail: tan9523929@yahoo.co.jp [Tenri Hospital, Department of Radiology (Japan); Okita, Yutaka, E-mail: yokita@med.kobe-u.ac.jp [Kobe University Hospital, Department of Cardiovascular Surgery (Japan); Sugimoto, Koji, E-mail: kojirad@med.kobe-u.ac.jp [Kobe University Hospital, Department of Radiology (Japan)

    2013-08-01

    Purpose: To evaluate the safety, efficacy, and long-term results of endovascular stent graft placement for ureteroarterial fistula (UAF). Methods: We retrospectively analyzed stent graft placement for UAF performed at our institution from 2004 to 2012. Fistula location was assessed by contrast-enhanced computed tomography (CT) and angiography, and freedom from hematuria recurrence and mortality rates were estimated. Results: Stent graft placement for 11 UAFs was performed (4 men, mean age 72.8 ± 11.6 years). Some risk factors were present, including long-term ureteral stenting in 10 (91 %), pelvic surgery in 8 (73 %), and pelvic radiation in 5 (45 %). Contrast-enhanced CT and/or angiography revealed fistula or encasement of the artery in 6 cases (55 %). In the remaining 5 (45 %), angiography revealed no abnormality, and the suspected fistula site was at the crossing area between the urinary tract and the artery. All procedures were successful. However, one patient died of urosepsis 37 days after the procedure. At a mean follow-up of 548 (range 35-1,386) days, 4 patients (36 %) had recurrent hematuria, and two of them underwent additional treatment with secondary stent graft placement and surgical reconstruction. The hematuria recurrence-free rates at 1 and 2 years were 76.2 and 40.6 %, respectively. The freedom from UAF-related and overall mortality rates at 2 years were 85.7 and 54.9 %, respectively. Conclusion: Endovascular stent graft placement for UAF is a safe and effective method to manage acute events. However, the hematuria recurrence rate remains high. A further study of long-term results in a larger number of patients is necessary.

  9. (Re)interpreting LHC New Physics Search Results : Tools and Methods, 3rd Workshop

    CERN Document Server

    The quest for new physics beyond the SM is arguably the driving topic for LHC Run2. LHC collaborations are pursuing searches for new physics in a vast variety of channels. Although collaborations provide various interpretations for their search results, the full understanding of these results requires a much wider interpretation scope involving all kinds of theoretical models. This is a very active field, with close theory-experiment interaction. In particular, development of dedicated methodologies and tools is crucial for such scale of interpretation. Recently, a Forum was initiated to host discussions among LHC experimentalists and theorists on topics related to the BSM (re)interpretation of LHC data, and especially on the development of relevant interpretation tools and infrastructure: https://twiki.cern.ch/twiki/bin/view/LHCPhysics/InterpretingLHCresults Two meetings were held at CERN, where active discussions and concrete work on (re)interpretation methods and tools took place, with valuable cont...

  10. Investigation on filter method for smoothing spiral phase plate

    Science.gov (United States)

    Zhang, Yuanhang; Wen, Shenglin; Luo, Zijian; Tang, Caixue; Yan, Hao; Yang, Chunlin; Liu, Mincai; Zhang, Qinghua; Wang, Jian

    2018-03-01

    Spiral phase plates (SPPs) for generating vortex hollow beams are highly efficient in various applications. However, it is difficult to obtain an ideal spiral phase plate because of its continuously varying helical phase and its discontinuous phase step. This paper describes the demonstration of a continuous spiral phase plate using filter methods. Numerical simulations indicate that different filter methods, including spatial-domain and frequency-domain filters, have distinct impacts on the surface topography of the SPP and on the optical vortex characteristics. The experimental results reveal that the spatial Gaussian filter method for smoothing the SPP is suitable for the Computer Controlled Optical Surfacing (CCOS) technique and yields good optical properties.
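
    As a rough illustration of spatial Gaussian smoothing applied to an SPP surface, the sketch below filters an ideal spiral height map with SciPy; the wavelength, refractive index, grid size and filter width are invented values, and this is not the authors' CCOS fabrication code.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    # Illustrative parameters (not from the paper): topological charge, wavelength, refractive index.
    m, wavelength, n_refr = 1, 632.8e-9, 1.457
    N, sigma_px = 512, 4.0                    # grid size and Gaussian filter width in pixels

    y, x = np.mgrid[-N // 2:N // 2, -N // 2:N // 2]
    phi = np.mod(np.arctan2(y, x), 2 * np.pi)                    # azimuthal angle in [0, 2*pi)
    height = m * wavelength * phi / (2 * np.pi * (n_refr - 1))   # ideal SPP surface height

    smoothed = gaussian_filter(height, sigma=sigma_px)           # spatial-domain Gaussian smoothing
    step_error = np.abs(smoothed - height).max()                 # largest deviation, at the phase step
    print(f"max deviation after smoothing: {step_error * 1e9:.1f} nm")
    ```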

  11. Approximation for Transient of Nonlinear Circuits Using RHPM and BPES Methods

    Directory of Open Access Journals (Sweden)

    H. Vazquez-Leal

    2013-01-01

    Full Text Available The microelectronics area constantly demands better and improved circuit simulation tools. Therefore, in this paper, the rational homotopy perturbation method and the Boubaker Polynomials Expansion Scheme are applied to a differential equation from a nonlinear circuit. Comparing the results obtained by both techniques revealed that they are effective and convenient.

  12. Core-level photoemission revealing the Mott transition

    International Nuclear Information System (INIS)

    Kim, Hyeong-Do; Noh, Han-Jin; Kim, K.H.; Oh, S.-J.

    2005-01-01

    Ru 3d core-level X-ray photoemission spectra of various ruthenates are examined. They show in general two-peak structures, which can be assigned as the screened and unscreened peaks. The screened peak is absent in a Mott insulator, but develops into a main peak as the correlation strength becomes weak. This spectral behavior is well explained by the dynamical mean-field theory calculation for the single-band Hubbard model with the on-site core-hole potential using the exact diagonalization method. The new mechanism of the core-level photoemission satellite can be utilized to reveal the Mott transition phenomenon in various strongly correlated electron systems

  13. VALUATION METHODS- LITERATURE REVIEW

    OpenAIRE

    Dorisz Talas

    2015-01-01

    This paper is a theoretical overview of the often used valuation methods with the help of which the value of a firm or its equity is calculated. Many experts (including Aswath Damodaran, Guochang Zhang and CA Hozefa Natalwala) classify the methods. The basic models are based on discounted cash flows. The main method uses the free cash flow for valuation, but there are some newer methods that reveal and correct the weaknesses of the traditional models. The valuation of flexibility of managemen...

  14. Ultrasonic Digital Communication System for a Steel Wall Multipath Channel: Methods and Results

    Energy Technology Data Exchange (ETDEWEB)

    Murphy, Timothy L. [Rensselaer Polytechnic Inst., Troy, NY (United States)

    2005-12-01

    As of the development of this thesis, no commercially available products have been identified for the digital communication of instrumented data across a thick (≈6 in.) steel wall using ultrasound. The specific goal of the current research is to investigate the application of methods for digital communication of instrumented data (i.e., temperature, voltage, etc.) across the wall of a steel pressure vessel. The acoustic transmission of data using ultrasonic transducers avoids the need to breach the wall of such a pressure vessel, which could ultimately affect its safety or lifespan, or void the homogeneity of an experiment under test. Actual digital communication paradigms are introduced and implemented for the successful dissemination of data across such a wall utilizing solely an acoustic ultrasonic link. The first, dubbed the "single-hop" configuration, can communicate bursts of digital data one-way across the wall using the Differential Binary Phase-Shift Keying (DBPSK) modulation technique as fast as 500 bps. The second, dubbed the "double-hop" configuration, transmits a carrier into the vessel, modulates it, and retransmits it externally. Using a pulsed carrier with Pulse Amplitude Modulation (PAM), this technique can communicate digital data as fast as 500 bps. Using a CW carrier, Least Mean-Squared (LMS) adaptive interference suppression, and DBPSK, this method can communicate data as fast as 5 kbps. A third technique, dubbed the "reflected-power" configuration, communicates digital data by modulating a pulsed carrier through varying the acoustic impedance at the internal transducer-wall interface. The paradigms of the latter two configurations are believed to be unique. All modulation methods are based on the premise that the wall cannot be breached in any way and can therefore be viably implemented with power delivered wirelessly through the acoustic channel using ultrasound. Methods
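
    For readers unfamiliar with DBPSK, the NumPy sketch below differentially encodes a bit stream onto carrier phase and recovers it non-coherently from symbol-to-symbol phase differences; the sample rate, carrier frequency, bit rate and noise level are made-up values, and the acoustic channel itself is not modeled.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    fs, fc, bitrate = 1_000_000, 100_000, 500        # sample rate, carrier, bit rate (illustrative)
    spb = fs // bitrate                              # samples per bit

    bits = rng.integers(0, 2, 64)
    diff = np.cumsum(bits) % 2                       # differential encoding: phase flips on a '1'
    phase = np.repeat(diff * np.pi, spb)
    t = np.arange(phase.size) / fs
    tx = np.cos(2 * np.pi * fc * t + phase)

    rx = tx + 0.1 * rng.standard_normal(tx.size)     # toy channel: additive noise only

    # Non-coherent DBPSK demodulation: compare the phase of successive symbols.
    analytic = rx * np.exp(-2j * np.pi * fc * t)     # mix down to baseband
    symbols = analytic.reshape(-1, spb).sum(axis=1)  # integrate over each bit period
    decided = (np.abs(np.angle(symbols[1:] * np.conj(symbols[:-1]))) > np.pi / 2).astype(int)

    # The first bit has no reference symbol, so compare against bits[1:].
    print("bit errors:", int(np.sum(decided != bits[1:])))
    ```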

  15. A Comparative Study of Feature Selection and Classification Methods for Gene Expression Data

    KAUST Repository

    Abusamra, Heba

    2013-01-01

    Different experiments have been applied to compare the performance of the classification methods with and without performing feature selection. Results revealed the important role of feature selection in classifying gene expression data. By performing feature selection, the classification accuracy can be significantly boosted by using a small number of genes. The relationship of features selected in different feature selection methods is investigated and the most frequent features selected in each fold among all methods for both datasets are evaluated.
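
    A generic scikit-learn sketch of the workflow described (feature selection inside each cross-validation fold, followed by classification) is shown below; the synthetic data, the univariate selector and the SVM classifier are placeholders rather than the methods benchmarked in the thesis.

    ```python
    from sklearn.datasets import make_classification
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import SVC

    # Synthetic stand-in for a gene expression matrix: many features, few samples.
    X, y = make_classification(n_samples=100, n_features=2000, n_informative=30,
                               n_redundant=0, random_state=0)

    for k in (10, 50, 200, 2000):                    # 2000 = no effective selection
        model = make_pipeline(SelectKBest(f_classif, k=k), SVC(kernel="linear"))
        acc = cross_val_score(model, X, y, cv=5).mean()
        print(f"top-{k:4d} features: mean CV accuracy = {acc:.3f}")
    ```

    Placing the selector inside the pipeline means the informative features are re-selected within each fold, which mirrors the per-fold selection described in the abstract.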

  16. Developing a bone mineral density test result letter to send to patients: a mixed-methods study

    Directory of Open Access Journals (Sweden)

    Edmonds SW

    2014-06-01

    Full Text Available Stephanie W Edmonds,1,2 Samantha L Solimeo,3 Xin Lu,1 Douglas W Roblin,4,8 Kenneth G Saag,5 Peter Cram6,7 1Department of Internal Medicine, 2College of Nursing, University of Iowa, Iowa City, IA, USA; 3Center for Comprehensive Access and Delivery Research and Evaluation, Iowa City Veterans Affairs Health Care System, Iowa City, IA, USA; 4Kaiser Permanente of Atlanta, Atlanta, GA, USA; 5Department of Rheumatology, University of Alabama at Birmingham, Birmingham, AL, USA; 6Faculty of Medicine, University of Toronto, Toronto, ON, Canada; 7University Health Network and Mount Sinai Hospital, Toronto, ON, Canada; 8School of Public Health, Georgia State University, Atlanta, GA, USA Purpose: To use a mixed-methods approach to develop a letter that can be used to notify patients of their bone mineral density (BMD results by mail that may activate patients in their bone-related health care. Patients and methods: A multidisciplinary team developed three versions of a letter for reporting BMD results to patients. Trained interviewers presented these letters in a random order to a convenience sample of adults, aged 50 years and older, at two different health care systems. We conducted structured interviews to examine the respondents’ preferences and comprehension among the various letters. Results: A total of 142 participants completed the interview. A majority of the participants were female (64.1% and white (76.1%. A plurality of the participants identified a specific version of the three letters as both their preferred version (45.2%; P<0.001 and as the easiest to understand (44.6%; P<0.01. A majority of participants preferred that the letters include specific next steps for improving their bone health. Conclusion: Using a mixed-methods approach, we were able to develop and optimize a printed letter for communicating a complex test result (BMD to patients. Our results may offer guidance to clinicians, administrators, and researchers who are

  17. First results of Minimum Fisher Regularisation as unfolding method for JET NE213 liquid scintillator neutron spectrometry

    International Nuclear Information System (INIS)

    Mlynar, Jan; Adams, John M.; Bertalot, Luciano; Conroy, Sean

    2005-01-01

    At JET, the NE213 liquid scintillator is being validated as a diagnostic tool for spectral measurements of neutrons emitted from the plasma. Neutron spectra have to be unfolded from the measured pulse-height spectra, which is an ill-conditioned problem. Therefore, the use of two independent unfolding methods allows for less ambiguity in the interpretation of the data. In parallel with the routine algorithm MAXED, based on the Maximum Entropy method, the Minimum Fisher Regularisation (MFR) method has been introduced at JET. The MFR method, known from two-dimensional tomography applications, has proved to provide a new, transparent tool to validate the JET neutron spectra measured with the NE213 liquid scintillators. In this article, the MFR method as applicable to spectra unfolding is briefly explained. After a mention of MFR tests on phantom spectra, experimental neutron spectra are presented that were obtained by applying MFR to NE213 data in selected JET experiments. The results tend to confirm the MAXED observations.
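
    In the tomography literature from which it is borrowed, Minimum Fisher Regularisation is usually posed as a penalized least-squares problem of the general form below (shown only as background; the exact discretization and smoothing operator used at JET are not given in the abstract):

    ```latex
    \min_{f \ge 0}\; \lVert \mathbf{A}\,f - \mathbf{s} \rVert^{2}
      \;+\; \alpha \int \frac{\lvert f'(E) \rvert^{2}}{f(E)}\, \mathrm{d}E
    ```

    Here f is the unfolded neutron spectrum, A the detector response matrix mapping neutron energy to pulse height, s the measured pulse-height spectrum, and α a regularisation parameter weighting the Fisher-information smoothness penalty.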

  18. A method of estimating conceptus doses resulting from multidetector CT examinations during all stages of gestation

    International Nuclear Information System (INIS)

    Damilakis, John; Tzedakis, Antonis; Perisinakis, Kostas; Papadakis, Antonios E.

    2010-01-01

    Purpose: Current methods for the estimation of conceptus dose from multidetector CT (MDCT) examinations performed on the mother provide dose data for typical protocols with a fixed scan length. However, modified low-dose imaging protocols are frequently used during pregnancy. The purpose of the current study was to develop a method for the estimation of conceptus dose from any MDCT examination of the trunk performed during all stages of gestation. Methods: The Monte Carlo N-Particle (MCNP) radiation transport code was employed in this study to model the Siemens Sensation 16 and Sensation 64 MDCT scanners. Four mathematical phantoms were used, simulating women at 0, 3, 6, and 9 months of gestation. The contribution to the conceptus dose from single simulated scans was obtained at various positions across the phantoms. To investigate the effect of maternal body size and conceptus depth on conceptus dose, phantoms of different sizes were produced by adding layers of adipose tissue around the trunk of the mathematical phantoms. To verify MCNP results, conceptus dose measurements were carried out by means of three physical anthropomorphic phantoms, simulating pregnancy at 0, 3, and 6 months of gestation and thermoluminescence dosimetry (TLD) crystals. Results: The results consist of Monte Carlo-generated normalized conceptus dose coefficients for single scans across the four mathematical phantoms. These coefficients were defined as the conceptus dose contribution from a single scan divided by the CTDI free-in-air measured with identical scanning parameters. Data have been produced to take into account the effect of maternal body size and conceptus position variations on conceptus dose. Conceptus doses measured with TLD crystals showed a difference of up to 19% compared to those estimated by mathematical simulations. Conclusions: Estimation of conceptus doses from MDCT examinations of the trunk performed on pregnant patients during all stages of gestation can be made
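
    The way such normalized coefficients are used reduces to simple arithmetic; the sketch below multiplies made-up coefficients by a made-up free-in-air CTDI and sums the contributions over the scanned positions (the paper's actual coefficients and procedure should be consulted for real estimates).

    ```python
    # Hypothetical illustration only: these values are NOT the paper's coefficients.
    ctdi_free_in_air_mGy = 20.0                 # CTDI measured free-in-air for the chosen technique (mGy)
    ncdc_per_scan = [0.04, 0.12, 0.12, 0.05]    # normalized conceptus dose coefficients, one per scanned section

    # Each coefficient is (conceptus dose from one scan) / (CTDI free-in-air), so the estimate
    # for a scan range covering these positions is the sum of the individual contributions.
    conceptus_dose_mGy = ctdi_free_in_air_mGy * sum(ncdc_per_scan)
    print(f"estimated conceptus dose: {conceptus_dose_mGy:.1f} mGy")
    ```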

  19. Methods of dealing with co-products of biofuels in life-cycle analysis and consequent results within the U.S. context

    International Nuclear Information System (INIS)

    Wang, Michael; Huo Hong; Arora, Salil

    2011-01-01

    Products other than biofuels are produced in biofuel plants. For example, corn ethanol plants produce distillers' grains and solubles. Soybean crushing plants produce soy meal and soy oil, which is used for biodiesel production. Electricity is generated in sugarcane ethanol plants both for internal consumption and export to the electric grid. Future cellulosic ethanol plants could be designed to co-produce electricity with ethanol. It is important to take co-products into account in the life-cycle analysis of biofuels and several methods are available to do so. Although the International Standard Organization's ISO 14040 advocates the system boundary expansion method (also known as the 'displacement method' or the 'substitution method') for life-cycle analyses, application of the method has been limited because of the difficulty in identifying and quantifying potential products to be displaced by biofuel co-products. As a result, some LCA studies and policy-making processes have considered alternative methods. In this paper, we examine the available methods to deal with biofuel co-products, explore the strengths and weaknesses of each method, and present biofuel LCA results with different co-product methods within the U.S. context.
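
    To make the displacement (system boundary expansion) idea concrete, here is a toy calculation, with invented numbers, of how a co-product credit is subtracted from a biofuel's life-cycle emissions:

    ```python
    # Invented illustrative numbers, not results from the paper (gCO2e per MJ of ethanol).
    total_emissions = 90.0                 # all life-cycle emissions of the ethanol pathway
    dgs_yield_kg_per_MJ = 0.015            # distillers' grains co-produced per MJ of ethanol
    displaced_feed_gCO2e_per_kg = 800.0    # emissions of the animal feed the co-product displaces

    credit = dgs_yield_kg_per_MJ * displaced_feed_gCO2e_per_kg
    net_emissions = total_emissions - credit
    print(f"displacement credit: {credit:.1f} gCO2e/MJ -> net: {net_emissions:.1f} gCO2e/MJ")
    ```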

  20. Methods of dealing with co-products of biofuels in life-cycle analysis and consequent results within the U.S. context

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Michael, E-mail: mqwang@anl.gov [Center for Transportation Research, Argonne National Laboratory, Argonne, IL 60439 (United States); Huo Hong [Institute of Energy, Environment, and Economics, Tsinghua University, Beijing, 100084 (China); Arora, Salil [Center for Transportation Research, Argonne National Laboratory, Argonne, IL 60439 (United States)

    2011-10-15

    Products other than biofuels are produced in biofuel plants. For example, corn ethanol plants produce distillers' grains and solubles. Soybean crushing plants produce soy meal and soy oil, which is used for biodiesel production. Electricity is generated in sugarcane ethanol plants both for internal consumption and export to the electric grid. Future cellulosic ethanol plants could be designed to co-produce electricity with ethanol. It is important to take co-products into account in the life-cycle analysis of biofuels and several methods are available to do so. Although the International Standard Organization's ISO 14040 advocates the system boundary expansion method (also known as the 'displacement method' or the 'substitution method') for life-cycle analyses, application of the method has been limited because of the difficulty in identifying and quantifying potential products to be displaced by biofuel co-products. As a result, some LCA studies and policy-making processes have considered alternative methods. In this paper, we examine the available methods to deal with biofuel co-products, explore the strengths and weaknesses of each method, and present biofuel LCA results with different co-product methods within the U.S. context.

  1. Learning Method and Its Influence on Nutrition Study Results Throwing the Ball

    Science.gov (United States)

    Samsudin; Nugraha, Bayu

    2015-01-01

    This study aimed to determine the difference between play-based learning methods and exploratory learning methods with respect to learning outcomes in throwing a ball. In addition, this study also aimed to determine the effect of nutritional status within these two learning methods. This research was conducted at SDN Cipinang Besar Selatan 16 Pagi, East…

  2. Review Results on Wing-Body Interference

    Directory of Open Access Journals (Sweden)

    Frolov Vladimir

    2016-01-01

    Full Text Available The paper presents an overview of results for wing-body interference obtained by the author for various wing-body combinations. The lift-curve slopes of the wing-body combinations are considered. In this paper a discrete vortices method (DVM) and a 2D potential model for the cross-flow around the fuselage are used. Circular and elliptical cross-sections of the fuselage and flat wings of various forms are considered. Calculations showed that the lift-curve slopes of the wing-body combinations may exceed the corresponding value for an isolated wing. This result confirms experimental data obtained earlier by other authors. Within the framework of the mathematical models used, investigations to optimize the wing-body combination were carried out. The present results for the optimization problem allowed the selection of the optimal geometric characteristics of the configuration that maximize the lift-curve slopes of the wing-body combination. It was revealed that the maxima of the lift-curve slopes for the optimal mid-wing configuration with an elliptical cross-section body occur at a sufficiently large relative width of the body (more than 30% of the wing span).
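
    For orientation, the sketch below implements the classic 2D lumped-vortex version of the discrete vortices method for an isolated flat plate (no fuselage), recovering the thin-airfoil lift-curve slope of about 2π per radian; it is a textbook illustration under those simplifying assumptions, not the author's wing-body code.

    ```python
    import numpy as np

    def flat_plate_lift_curve_slope(n_panels=50):
        """2D lumped-vortex discrete vortices method for a flat plate of unit chord.

        One point vortex (circulation positive clockwise) sits at the quarter-chord of each
        panel; the no-penetration condition is imposed at each panel's three-quarter-chord
        control point.
        """
        chord, u_inf = 1.0, 1.0
        dx = chord / n_panels
        x_vortex = (np.arange(n_panels) + 0.25) * dx
        x_control = (np.arange(n_panels) + 0.75) * dx

        # Influence matrix: normal velocity at control point i per unit vortex strength at j.
        A = 1.0 / (2.0 * np.pi * (x_control[:, None] - x_vortex[None, :]))

        alpha = np.deg2rad(1.0)                      # small angle of attack
        gamma = np.linalg.solve(A, np.full(n_panels, u_inf * np.sin(alpha)))

        cl = 2.0 * gamma.sum() / (u_inf * chord)     # Kutta-Joukowski lift coefficient
        return cl / alpha                            # lift-curve slope dCl/dalpha, per radian

    print(f"dCl/dalpha = {flat_plate_lift_curve_slope():.3f}  (thin-airfoil theory: {2 * np.pi:.3f})")
    ```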

  3. 18F-FDG PET Reveals Fronto-temporal Dysfunction in Children with Fever-Induced Refractory Epileptic Encephalopathy

    International Nuclear Information System (INIS)

    Mazzuca, M.; Dulac, O.; Chiron, C.; Jambaque, I.; Hertz-Pannier, L.; Bouilleret, V.; Archambaud, F.; Rodrigo, S.; Dulac, O.; Chiron, C.; Jambaque, I.; Hertz-Pannier, L.; Bouilleret, V.; Archambaud, F.; Rodrigo, S.; Chiron, C.; Hertz-Pannier, L.; Rodrigo, S.; Dulac, O.; Chiron, C.; Caviness, V.

    2011-01-01

    Fever-induced refractory epileptic encephalopathy in school-age children (FIRES) is a recently described epileptic entity whose etiology remains unknown. Brain abnormalities shown by MRI are usually limited to mesial temporal structures and do not account for the catastrophic neuropsychologic findings. Methods: We conducted studies in 8 patients with FIRES, aged 6-13 y, using 18F-FDG PET to disclose possible neocortical dysfunction. Voxel-based analyses of cerebral glucose metabolism were performed using statistical parametric mapping and an age-matched control group. Results: Group analysis revealed a widespread interictal hypometabolic network including the temporo-parietal and orbito-frontal cortices bilaterally. The individual analyses identified hypometabolic areas corresponding to the predominant electroencephalographic foci and to neuropsychologic deficits involving language, behavior, and memory. Conclusion: Despite clinical heterogeneity, 18F-FDG PET reveals a common network dysfunction in patients with sequelae of fever-induced refractory epileptic encephalopathy. (authors)

  4. Memory functions reveal structural properties of gene regulatory networks

    Science.gov (United States)

    Perez-Carrasco, Ruben

    2018-01-01

    Gene regulatory networks (GRNs) control cellular function and decision making during tissue development and homeostasis. Mathematical tools based on dynamical systems theory are often used to model these networks, but the size and complexity of these models mean that their behaviour is not always intuitive and the underlying mechanisms can be difficult to decipher. For this reason, methods that simplify and aid exploration of complex networks are necessary. To this end we develop a broadly applicable form of the Zwanzig-Mori projection. By first converting a thermodynamic state ensemble model of gene regulation into mass action reactions we derive a general method that produces a set of time evolution equations for a subset of components of a network. The influence of the rest of the network, the bulk, is captured by memory functions that describe how the subnetwork reacts to its own past state via components in the bulk. These memory functions provide probes of near-steady state dynamics, revealing information not easily accessible otherwise. We illustrate the method on a simple cross-repressive transcriptional motif to show that memory functions not only simplify the analysis of the subnetwork but also have a natural interpretation. We then apply the approach to a GRN from the vertebrate neural tube, a well characterised developmental transcriptional network composed of four interacting transcription factors. The memory functions reveal the function of specific links within the neural tube network and identify features of the regulatory structure that specifically increase the robustness of the network to initial conditions. Taken together, the study provides evidence that Zwanzig-Mori projections offer powerful and effective tools for simplifying and exploring the behaviour of GRNs. PMID:29470492

  5. Influence of Specimen Preparation and Test Methods on the Flexural Strength Results of Monolithic Zirconia Materials.

    Science.gov (United States)

    Schatz, Christine; Strickstrock, Monika; Roos, Malgorzata; Edelhoff, Daniel; Eichberger, Marlis; Zylla, Isabella-Maria; Stawarczyk, Bogna

    2016-03-09

    The aim of this work was to evaluate the influence of specimen preparation and test method on the flexural strength results of monolithic zirconia. Different monolithic zirconia materials (Ceramill Zolid (Amann Girrbach, Koblach, Austria), Zenostar ZrTranslucent (Wieland Dental, Pforzheim, Germany), and DD Bio zx² (Dental Direkt, Spenge, Germany)) were tested with three different methods: 3-point, 4-point, and biaxial flexural strength. Additionally, different specimen preparation methods were applied: either dry polishing before sintering or wet polishing after sintering. Each subgroup included 40 specimens. The surface roughness was assessed using scanning electron microscopy (SEM) and a profilometer, whereas monoclinic phase transformation was investigated with X-ray diffraction. The data were analyzed using a three-way Analysis of Variance (ANOVA) with respect to the three factors: zirconia, specimen preparation, and test method. One-way ANOVA was conducted for the test method and zirconia factors within the combinations of the two other factors. A 2-parameter Weibull distribution assumption was applied to analyze the reliability under different testing conditions. In general, values measured using the 4-point test method presented the lowest flexural strength values. The flexural strength findings could be ranked by test method, with the 4-point values lowest, followed by the 3-point and then the biaxial flexural strength values; specimens polished after sintering showed higher flexural strength values than those prepared before sintering. The Weibull moduli ranged from 5.1 to 16.5. Specimens polished before sintering showed higher surface roughness values than specimens polished after sintering. In contrast, no strong impact of the polishing procedures on the monoclinic surface layer was observed. No impact of zirconia material on flexural strength was found. The test method and the preparation method significantly influenced the flexural strength values.

  6. Exact Solutions of the Time Fractional BBM-Burger Equation by the Novel (G′/G)-Expansion Method

    Directory of Open Access Journals (Sweden)

    Muhammad Shakeel

    2014-01-01

    Full Text Available The fractional derivatives are used in the modified Riemann-Liouville sense to obtain exact solutions for the BBM-Burger equation of fractional order. This equation can be converted into an ordinary differential equation by using a persistent fractional complex transform, and, as a result, hyperbolic function solutions, trigonometric function solutions, and rational solutions are attained. The performance of the method is reliable and useful, and it gives more general exact solutions with more free parameters than the existing methods. Numerical results coupled with the graphical representation completely reveal the trustworthiness of the method.
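
    For context, the fractional complex transform commonly used together with the Jumarie-type modified Riemann-Liouville derivative takes the form below; this is a standard form from this literature, and the paper's exact wave variable may differ by constants.

    ```latex
    u(x,t) = U(\xi), \qquad
    \xi = k\,x \;-\; \frac{c\,t^{\alpha}}{\Gamma(1+\alpha)}
    ```

    Since D_t^α t^α = Γ(1+α), the fractional derivative D_t^α u becomes -c U′(ξ), so the fractional PDE reduces to an ordinary differential equation in ξ, to which the (G′/G)-expansion ansatz is then applied.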

  7. An Efficient Ensemble Learning Method for Gene Microarray Classification

    Directory of Open Access Journals (Sweden)

    Alireza Osareh

    2013-01-01

    Full Text Available Gene microarray analysis and classification have demonstrated an effective way for the accurate diagnosis of diseases and cancers. However, it has also been revealed that the basic classification techniques have intrinsic drawbacks in achieving accurate gene classification and cancer diagnosis. On the other hand, classifier ensembles have received increasing attention in various applications. Here, we address the gene classification issue using the RotBoost ensemble methodology. This method is a combination of the Rotation Forest and AdaBoost techniques, which in turn preserves both desirable features of an ensemble architecture, that is, accuracy and diversity. To select a concise subset of informative genes, 5 different feature selection algorithms are considered. To assess the efficiency of RotBoost, other non-ensemble/ensemble techniques including Decision Trees, Support Vector Machines, Rotation Forest, AdaBoost, and Bagging are also deployed. Experimental results have revealed that the combination of the fast correlation-based feature selection method with the ICA-based RotBoost ensemble is highly effective for gene classification. In fact, the proposed method can create ensemble classifiers which outperform not only the classifiers produced by conventional machine learning but also the classifiers generated by two widely used conventional ensemble learning methods, that is, Bagging and AdaBoost.
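
    A much-simplified sketch of the RotBoost idea (rotate random disjoint feature subsets with PCA, then boost decision trees on the rotated data) is given below; it omits the per-iteration re-rotation, bootstrap sampling and other details of the published algorithm, and the synthetic data are only a stand-in for gene expression profiles.

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.decomposition import PCA
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=300, n_features=60, n_informative=15, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    def rotate(X_fit, X_apply, n_subsets=4, seed=0):
        """Split features into random disjoint subsets, run PCA on each, and stack the rotated blocks."""
        idx = np.random.default_rng(seed).permutation(X_fit.shape[1])
        blocks_fit, blocks_apply = [], []
        for subset in np.array_split(idx, n_subsets):
            pca = PCA().fit(X_fit[:, subset])
            blocks_fit.append(pca.transform(X_fit[:, subset]))
            blocks_apply.append(pca.transform(X_apply[:, subset]))
        return np.hstack(blocks_fit), np.hstack(blocks_apply)

    R_tr, R_te = rotate(X_tr, X_te)

    # Boosting step: AdaBoost over shallow decision trees trained on the rotated features.
    model = AdaBoostClassifier(DecisionTreeClassifier(max_depth=3), n_estimators=50, random_state=0)
    model.fit(R_tr, y_tr)
    print("test accuracy on rotated features:", round(model.score(R_te, y_te), 3))
    ```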

  8. Comparative evaluations of the results of common X-ray examinations and computerized tomography in patients with exogenous allergic alveolitis

    International Nuclear Information System (INIS)

    Khomenko, A.G.; Dmitrieva, L.I.; Khikkel', Kh.G.; Myuller, S.

    1989-01-01

    A correlative study of the results of x-ray examination using routine methods and computerized tomography (CT) was conducted to specify the roentgenomorphological substrate of changes in patients with exogenous allergic alveolitis. The established complex of routine methods is informative enough to interpret the revealed changes. However, at early stages CT helps to specify semiotics and permits obtaining additional information, particularly on quantitative, i.e. densitometric changes. In diffuse and disseminated pulmonary lesions CT can be used as an additional method

  9. Within- and across-trial dynamics of human EEG reveal cooperative interplay between reinforcement learning and working memory.

    Science.gov (United States)

    Collins, Anne G E; Frank, Michael J

    2018-03-06

    Learning from rewards and punishments is essential to survival and facilitates flexible human behavior. It is widely appreciated that multiple cognitive and reinforcement learning systems contribute to decision-making, but the nature of their interactions is elusive. Here, we leverage methods for extracting trial-by-trial indices of reinforcement learning (RL) and working memory (WM) in human electro-encephalography to reveal single-trial computations beyond that afforded by behavior alone. Neural dynamics confirmed that increases in neural expectation were predictive of reduced neural surprise in the following feedback period, supporting central tenets of RL models. Within- and cross-trial dynamics revealed a cooperative interplay between systems for learning, in which WM contributes expectations to guide RL, despite competition between systems during choice. Together, these results provide a deeper understanding of how multiple neural systems interact for learning and decision-making and facilitate analysis of their disruption in clinical populations.

  10. Three-dimensional Crustal Structure beneath the Tibetan Plateau Revealed by Multi-scale Gravity Analysis

    Science.gov (United States)

    Xu, C.; Luo, Z.; Sun, R.; Li, Q.

    2017-12-01

    The Tibetan Plateau, the largest and highest plateau on Earth, has been uplifted, shortened and thickened by the collision and continuous convergence of the Indian and Eurasian plates since 50 million years ago, the Eocene epoch. A fine three-dimensional crustal structure of the Tibetan Plateau is helpful for understanding its tectonic development. At present, the ordinary method used for revealing crustal structure is the seismic method, which is inhibited by poor seismic station coverage, especially in the central and western plateau, primarily due to the rugged terrain. Fortunately, with the implementation of satellite gravity missions, gravity field models have demonstrated unprecedented global-scale accuracy and spatial resolution, which can subsequently be employed to study the crustal structure of the entire Tibetan Plateau. This study inverts the three-dimensional crustal density and Moho topography of the Tibetan Plateau from gravity data using multi-scale gravity analysis. The inverted results are in agreement with those provided by previous works. Moreover, they reveal rich tectonic development of the Tibetan Plateau: (1) the low-density channel flow can be observed in the inverted crustal density; (2) the Moho depth in the west is deeper than that in the east, and the deepest Moho, approximately 77 km, is located beneath the western Qiangtang Block; (3) a Moho fold, whose directions are in agreement with the surface movement velocities estimated from the Global Positioning System, is clearly present in the Moho topography. This study is supported by the National Natural Science Foundation of China (Grant No. 41504015), the China Postdoctoral Science Foundation (Grant No. 2015M572146), and the Surveying and Mapping Basic Research Programme of the National Administration of Surveying, Mapping and Geoinformation (Grant No. 15-01-08).

  11. Impact of the Great East Japan Earthquake on feeding methods and newborn growth at 1 month postpartum: results from the Fukushima Health Management Survey

    International Nuclear Information System (INIS)

    Kyozuka, Hyo; Yasuda, Shun; Kawamura, Makoto; Nomura, Yasuhisa; Fujimori, Keiya; Goto, Aya; Yasumura, Seiji; Abe, Masafumi

    2016-01-01

    This study examined the effects of three disasters (the Great East Japan Earthquake of March 11, 2011, followed by a tsunami and the Fukushima Daiichi Nuclear Power Plant accident) on feeding methods and growth in infants born after the disasters. Using results from the Fukushima Health Management Survey, Soso District (the affected area where the damaged nuclear power plant is located) and Aizu District (a less-affected area located farthest from the plant) were compared. In this study, newborn and maternal background characteristics were examined, as well as feeding methods, and other factors for newborn growth at the first postpartum examination for 1706 newborns born after the disaster in the affected (n = 836) and less-affected (n = 870) areas. Postpartum examinations took place 1 month after birth. Feeding method trends were examined, and multivariate regression analyses were used to investigate effects on newborn mass gain. There were no significant differences in background characteristics among newborns in these areas. When birth dates were divided into four periods to assess trends, no significant change in the exclusive breastfeeding rate was found, while the exclusive formula-feeding rate was significantly different across time periods in the affected area (p = 0.02). Multivariate analyses revealed no significant independent associations of maternal depression and change in medical facilities (possible disaster effects) with other newborn growth factors in either area. No area differences in newborn growth at the first postpartum examination or in exclusive breastfeeding rates were found during any period. Exclusive formula-feeding rates varied across time periods in the affected, but not in the less-affected area. It is concluded that effective guidance to promote breast-feeding and prevent exclusive use of formula is important for women in post-disaster circumstances. (orig.)

  12. Impact of the Great East Japan Earthquake on feeding methods and newborn growth at 1 month postpartum: results from the Fukushima Health Management Survey

    Energy Technology Data Exchange (ETDEWEB)

    Kyozuka, Hyo; Yasuda, Shun; Kawamura, Makoto; Nomura, Yasuhisa; Fujimori, Keiya [Fukushima Medical University, Department of Obstetrics and Gynecology, School of Medicine, Fukushima (Japan); Goto, Aya; Yasumura, Seiji [Radiation Medical Science Center for the Fukushima Health Management Survey, Fukushima (Japan); Fukushima Medical University, Department of Public Health, School of Medicine, Fukushima (Japan); Abe, Masafumi [Radiation Medical Science Center for the Fukushima Health Management Survey, Fukushima (Japan)

    2016-05-15

    This study examined the effects of three disasters (the Great East Japan Earthquake of March 11, 2011, followed by a tsunami and the Fukushima Daiichi Nuclear Power Plant accident) on feeding methods and growth in infants born after the disasters. Using results from the Fukushima Health Management Survey, Soso District (the affected area where the damaged nuclear power plant is located) and Aizu District (a less-affected area located farthest from the plant) were compared. In this study, newborn and maternal background characteristics were examined, as well as feeding methods, and other factors for newborn growth at the first postpartum examination for 1706 newborns born after the disaster in the affected (n = 836) and less-affected (n = 870) areas. Postpartum examinations took place 1 month after birth. Feeding method trends were examined, and multivariate regression analyses were used to investigate effects on newborn mass gain. There were no significant differences in background characteristics among newborns in these areas. When birth dates were divided into four periods to assess trends, no significant change in the exclusive breastfeeding rate was found, while the exclusive formula-feeding rate was significantly different across time periods in the affected area (p = 0.02). Multivariate analyses revealed no significant independent associations of maternal depression and change in medical facilities (possible disaster effects) with other newborn growth factors in either area. No area differences in newborn growth at the first postpartum examination or in exclusive breastfeeding rates were found during any period. Exclusive formula-feeding rates varied across time periods in the affected, but not in the less-affected area. It is concluded that effective guidance to promote breast-feeding and prevent exclusive use of formula is important for women in post-disaster circumstances. (orig.)

  13. INTERDISCIPLINARITY IN PUBLIC SPACE PARTICIPATIVE PROJECTS: METHODS AND RESULTS IN PRACTICE AND TEACHING

    Directory of Open Access Journals (Sweden)

    Pedro Brandão

    2015-06-01

    • In the development of design practice and studio teaching methods. We shall see in this paper how interdisciplinary approaches correspond to new and complex urban transformations, focusing on the importance of actors' interaction processes, combining professional and non-professional knowledge and theory-practice relations. Therefore, we aim at a deepening of the public space area of knowledge under the growing complexity of urban life. We see it as a base for further development of collaborative projects and their implications for community empowerment and urban governance at the local level. The motivations for this line of work persist in several ongoing research projects, which aim to: - understand public space as a cohesion factor both in urban life and urban form, - manage processes and strategies as elements of urban transformation, - stimulate the understanding of actors' roles in urban design practices, - favour the questioning of emerging aspects of urban space production… The paper presents and analyses processes, methods and results from civic participation projects developed in the neighbourhood of Barò de Viver (Barcelona) and in the District of Marvila (Lisbon). In the first case, a long process initiated in 2004 and partially completed in 2011, neighbours developed the projects "Memory Wall" and "Ciutat d'Asuncion Promenade" as part of identity construction in public space, in collaboration with a team of facilitators from the CrPolis group. In the second case, different participatory processes dating from 2001 and 2003 have resulted in the implementation of a specific identity urban brand and communication system, with an ongoing project of "maps" construction according to the neighbours' perception and representation systems. We may conclude that processes of urban governance require more active participation of citizens in projects regarding the improvement of quality of life. At the same time, the implementation of these processes requires a clear

  14. Inequalities and Duality in Gene Coexpression Networks of HIV-1 Infection Revealed by the Combination of the Double-Connectivity Approach and the Gini's Method

    Directory of Open Access Journals (Sweden)

    Chuang Ma

    2011-01-01

    Full Text Available The symbiosis (Sym) and pathogenesis (Pat) is a duality problem of microbial infection, including HIV/AIDS. Statistical analysis of inequalities and duality in gene coexpression networks (GCNs) of HIV-1 infection may provide novel insights into AIDS. In this study, we focused on the analysis of GCNs of uninfected subjects and HIV-1-infected patients at three different stages of viral infection, based on data deposited in the GEO database of NCBI. The inequalities and duality in these GCNs were analyzed by the combination of the double-connectivity (DC) approach and the Gini's method. DC analysis reveals that there are significant differences between positive and negative connectivity in HIV-1 stage-specific GCNs. The inequality measures of negative connectivity and edge weight change more significantly than those of positive connectivity and edge weight in GCNs from the HIV-1-uninfected to the AIDS stages. With the permutation test method, we identified a set of genes with significant changes in the inequality and duality measures of edge weight. Functional analysis shows that these genes are highly enriched for the immune system, which plays an essential role in the Sym-Pat duality (SPD) of microbial infections. Understanding the SPD problems of HIV-1 infection may provide novel intervention strategies for AIDS.
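
    At its core, the Gini's method applies the standard Gini inequality coefficient to connectivity or edge-weight distributions; a minimal NumPy sketch of that coefficient (not the authors' full double-connectivity pipeline) is shown below, with an invented connectivity list for illustration.

    ```python
    import numpy as np

    def gini(values):
        """Gini coefficient of a 1-D array of non-negative values (0 = perfect equality)."""
        v = np.sort(np.asarray(values, dtype=float))
        n = v.size
        if n == 0 or v.sum() == 0:
            return 0.0
        ranks = np.arange(1, n + 1)
        return (2.0 * np.sum(ranks * v) / (n * v.sum())) - (n + 1.0) / n

    # Illustrative only: node connectivities from a hypothetical coexpression network.
    connectivity = [1, 1, 2, 2, 3, 5, 8, 20]
    print(f"Gini coefficient of connectivity: {gini(connectivity):.3f}")
    ```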

  15. Neutralization method for a hydrofluoric acid release

    International Nuclear Information System (INIS)

    Williams, D.L.; Deacon, L.E.

    1976-01-01

    A laboratory investigation of methods for neutralizing a release at the hydrofluoric acid tank farm at the Portsmouth Gaseous Diffusion Plant has revealed that the best neutralization method incorporates the use of a lime/water slurry. In this method, settling of suspended solids in the liquid is enhanced by the application of sodium dodecyl sulfate, which causes immediate flocculation and settling. Dilution and expulsion of the supernatant liquid above the flocculated solids result in an effluent which meets the one part per million fluoride limit established by the U.S. Environmental Protection Agency. A fluoride specific ion electrode is used to determine fluoride concentration. This method presently is being adapted for use in the hydrofluoric acid tank farm and is being considered for use at the plant's fluorine generation facility. It could be adapted for use in any facility that contains fluoride in aqueous solution

  16. A novel method for assessing elbow pain resulting from epicondylitis

    Science.gov (United States)

    Polkinghorn, Bradley S.

    2002-01-01

    Abstract Objective To describe a novel orthopedic test (Polk's test) which can assist the clinician in differentiating between medial and lateral epicondylitis, 2 of the most common causes of elbow pain. This test has not been previously described in the literature. Clinical Features The testing procedure described in this paper is easy to learn, simple to perform and may provide the clinician with a quick and effective method of differentiating between lateral and medial epicondylitis. The test also helps to elucidate normal activities of daily living that the patient may unknowingly be performing on a repetitive basis that are hindering recovery. The results of this simple test allow the clinician to make immediate lifestyle recommendations to the patient that should improve and hasten the response to subsequent treatment. It may be used in conjunction with other orthopedic testing procedures, as it correlates well with other clinical tests for assessing epicondylitis. Conclusion The use of Polk's Test may help the clinician to diagnostically differentiate between lateral and medial epicondylitis, as well as supply information relative to choosing proper instructions for the patient to follow as part of their treatment program. Further research, performed in an academic setting, should prove helpful in more thoroughly evaluating the merits of this test. In the meantime, clinical experience over the years suggests that the practicing physician should find a great deal of clinical utility in utilizing this simple, yet effective, diagnostic procedure. PMID:19674572

  17. Identifying usability issues for personalization during formative evaluations: a comparisons of three methods

    NARCIS (Netherlands)

    van Velsen, Lex Stefan; van der Geest, Thea; Klaassen, R.F.

    2011-01-01

    A personalized system is one that generates unique output for each individual. As a result, personalization has transformed the interaction between the user and the system, and specific new usability issues have arisen. Methods used for evaluating personalized systems should be able to reveal the

  18. HPLC-MS-Based Metabonomics Reveals Disordered Lipid Metabolism in Patients with Metabolic Syndrome

    Directory of Open Access Journals (Sweden)

    Xinjie Zhao

    2011-12-01

    Full Text Available An ultra-high performance liquid chromatography/quadrupole time-of-flight mass spectrometry-based metabonomics platform was employed to profile the plasma metabolites of patients with metabolic syndrome and of healthy controls. Data analysis revealed many differential metabolites between the two groups, most of which were identified as lipids. Several fatty acids and lysophosphatidylcholines showed higher plasma levels in the patient group, indicating the occurrence of insulin resistance and inflammation. The identified ether phospholipids were decreased in the patient group, reflecting oxidative stress and some metabolic disorders. These identified metabolites can also be used to aid the diagnosis of patients with metabolic syndrome. These results show that metabonomics is a promising and powerful method for studying metabolic syndrome.

  19. NeuCode Proteomics Reveals Bap1 Regulation of Metabolism

    Directory of Open Access Journals (Sweden)

    Joshua M. Baughman

    2016-07-01

    Full Text Available We introduce neutron-encoded (NeuCode amino acid labeling of mice as a strategy for multiplexed proteomic analysis in vivo. Using NeuCode, we characterize an inducible knockout mouse model of Bap1, a tumor suppressor and deubiquitinase whose in vivo roles outside of cancer are not well established. NeuCode proteomics revealed altered metabolic pathways following Bap1 deletion, including profound elevation of cholesterol biosynthetic machinery coincident with reduced expression of gluconeogenic and lipid homeostasis proteins in liver. Bap1 loss increased pancreatitis biomarkers and reduced expression of mitochondrial proteins. These alterations accompany a metabolic remodeling with hypoglycemia, hypercholesterolemia, hepatic lipid loss, and acinar cell degeneration. Liver-specific Bap1 null mice present with fully penetrant perinatal lethality, severe hypoglycemia, and hepatic lipid deficiency. This work reveals Bap1 as a metabolic regulator in liver and pancreas, and it establishes NeuCode as a reliable proteomic method for deciphering in vivo biology.

  20. Comparison between Two Linear Supervised Learning Machines' Methods with Principle Component Based Methods for the Spectrofluorimetric Determination of Agomelatine and Its Degradants.

    Science.gov (United States)

    Elkhoudary, Mahmoud M; Naguib, Ibrahim A; Abdel Salam, Randa A; Hadad, Ghada M

    2017-05-01

    Four accurate, sensitive and reliable stability-indicating chemometric methods were developed for the quantitative determination of Agomelatine (AGM), whether in pure form or in pharmaceutical formulations. Two supervised learning machine methods, linear artificial neural networks preceded by principal component analysis (PC-linANN) and linear support vector regression (linSVR), were compared with two principal component-based methods, principal component regression (PCR) and partial least squares (PLS), for the spectrofluorimetric determination of AGM and its degradants. The results showed the benefits of using linear learning machine methods and the inherent merits of their algorithms in handling overlapped noisy spectral data, especially during the challenging determination of the AGM alkaline and acidic degradants (DG1 and DG2). The root mean squared errors of prediction (RMSEP) of the proposed models in the determination of AGM were 1.68, 1.72, 0.68 and 0.22 for PCR, PLS, SVR and PC-linANN, respectively. The results showed the superiority of supervised learning machine methods over principal component-based methods. Besides, the results suggested that linANN is the method of choice for the determination of components in low amounts with similar overlapped spectra and a narrow linearity range. Comparison between the proposed chemometric models and a reported HPLC method revealed the comparable performance and quantification power of the proposed models.
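
    The sketch below shows how such a PCR / PLS / SVR comparison is typically set up in scikit-learn and how RMSEP is computed; the data are synthetic, the preprocessing of real spectrofluorimetric spectra is omitted, and the ANN model is not included.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression
    from sklearn.metrics import mean_squared_error
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import LinearSVR

    # Synthetic stand-in for emission spectra (200 wavelengths) vs analyte concentration.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(120, 200))
    y = X[:, :10].sum(axis=1) + 0.1 * rng.normal(size=120)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    models = {
        "PCR":    make_pipeline(PCA(n_components=10), LinearRegression()),
        "PLS":    PLSRegression(n_components=10),
        "linSVR": LinearSVR(C=1.0, max_iter=10_000),
    }
    for name, model in models.items():
        model.fit(X_tr, y_tr)
        pred = np.ravel(model.predict(X_te))
        rmsep = np.sqrt(mean_squared_error(y_te, pred))   # root mean squared error of prediction
        print(f"{name:6s} RMSEP = {rmsep:.3f}")
    ```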

  1. Roman sophisticated surface modification methods to manufacture silver counterfeited coins

    Science.gov (United States)

    Ingo, G. M.; Riccucci, C.; Faraldi, F.; Pascucci, M.; Messina, E.; Fierro, G.; Di Carlo, G.

    2017-11-01

    By means of the combined use of X-ray photoelectron spectroscopy (XPS), optical microscopy (OM) and scanning electron microscopy (SEM) coupled with energy-dispersive X-ray spectroscopy (EDS), the surface and subsurface chemical and metallurgical features of silver counterfeited Roman Republican coins are investigated to decipher some aspects of the manufacturing methods and to evaluate the technological ability of the Roman metallurgists to produce thin silver coatings. The results demonstrate that important advances in the technology of thin-layer deposition on metal substrates were attained by the Romans over 2000 years ago. The ancient metallurgists produced counterfeited coins by combining sophisticated micro-plating methods and tailored surface chemical modification based on the mercury-silvering process. The results reveal that the Romans were able to manipulate alloys chemically and metallurgically, systematically and at the micro scale, to produce adherent precious-metal layers with a uniform thickness of up to a few micrometers. The results converge to reveal that the production of forgeries was aimed firstly at saving expensive metals as much as possible, allowing profitable large-scale production at lower cost. The driving forces could have been a lack of precious metals, an unexpected need to circulate coins for trade, and/or a combination of social, political and economic factors that required a change in the money supply. Finally, some information on corrosion products has been obtained that is useful for selecting materials and methods for the conservation of these important witnesses of technology and economy.

  2. [Interactions of DNA bases with individual water molecules. Molecular mechanics and quantum mechanics computation results vs. experimental data].

    Science.gov (United States)

    Gonzalez, E; Lino, J; Deriabina, A; Herrera, J N F; Poltev, V I

    2013-01-01

    To elucidate details of DNA-water interactions, we performed calculations and a systematic search for minima of the interaction energy of systems consisting of one of the DNA bases and one or two water molecules. The results of calculations using two molecular mechanics (MM) force fields and the correlated ab initio quantum mechanics (QM) method MP2/6-31G(d,p) were compared with one another and with experimental data. The calculations demonstrated a qualitative agreement between the geometric characteristics of most of the local energy minima obtained via the different methods. The deepest minima revealed by the MM and QM methods correspond to a water molecule positioned between two neighboring hydrophilic centers of the base and forming hydrogen bonds with both of them. Nevertheless, the relative depth of some minima and the details of the mutual water-base positions in these minima depend on the method used. The analysis revealed that some of the differences between the results of the methods are insignificant, while others are important for the description of DNA hydration. The calculations via MM methods enabled us to reproduce quantitatively all the experimental data on the enthalpies of complex formation of a single water molecule with the set of mono-, di-, and trimethylated bases, as well as on water molecule locations near base hydrophilic atoms in crystals of DNA duplex fragments, while some of these data cannot be rationalized by the QM calculations.

  3. Analysis of risk of nonconformities and applied quality inspection methods in the process of aluminium profiles coating based on FMEA results

    OpenAIRE

    Krzysztof Knop

    2017-01-01

    The article presents the results of a risk analysis of nonconformities of aluminium profiles in the coating process and of the quality inspection methods used for their detection. The risk analysis was based on the results of the FMEA method. The evaluated quality inspection methods were distinguished according to the definition of the term "inspection" in the ISO 9000:2005 standard. The manufacturing process of an aluminium profile is presented in a micro-technological approach. Triple quantification of nonconformities risk b...

  4. On the superconvergence of the SBB method

    International Nuclear Information System (INIS)

    Franca, L.P.

    1988-05-01

    Written in a mixed form, a simple two-point boundary value problem is shown to have superconvergence characteristics under the SBB method. Convergence and accuracy analyses reveal the superior performance of the method compared to the usual Galerkin method. (author) [pt

  5. Finding function: evaluation methods for functional genomic data

    Directory of Open Access Journals (Sweden)

    Barrett Daniel R

    2006-07-01

    Full Text Available Abstract Background Accurate evaluation of the quality of genomic or proteomic data and computational methods is vital to our ability to use them for formulating novel biological hypotheses and directing further experiments. There is currently no standard approach to evaluation in functional genomics. Our analysis of existing approaches shows that they are inconsistent and contain substantial functional biases that render the resulting evaluations misleading both quantitatively and qualitatively. These problems make it essentially impossible to compare computational methods or large-scale experimental datasets and also result in conclusions that generalize poorly in most biological applications. Results We reveal issues with current evaluation methods here and suggest new approaches to evaluation that facilitate accurate and representative characterization of genomic methods and data. Specifically, we describe a functional genomics gold standard based on curation by expert biologists and demonstrate its use as an effective means of evaluation of genomic approaches. Our evaluation framework and gold standard are freely available to the community through our website. Conclusion Proper methods for evaluating genomic data and computational approaches will determine how much we, as a community, are able to learn from the wealth of available data. We propose one possible solution to this problem here but emphasize that this topic warrants broader community discussion.

  6. Energy Conservation Program Evaluation : Practical Methods, Useful Results : Proceedings of the 1987 Conference.

    Energy Technology Data Exchange (ETDEWEB)

    Argonne National Laboratory; International Conference on Energy Conservation Program Evaluation (3rd : 1987 : Chicago, ILL.)

    1987-01-01

    The success of cutting-edge evaluation methodologies depends on our ability to merge, manage, and maintain huge amounts of data. Equally important is presenting results of the subsequent analysis in a meaningful way. These topics are addressed at this session. The considerable amounts of data that have been collected about energy conservation programs are rarely used by other researchers, either because they are not available in computerized form or, if they are, because of the difficulties of interpreting someone else's data, format inconsistencies, incompatibility of computers, lack of documentation, data entry errors, and obtaining data use agreements. Even census, RECS, and AHS data can be best used only by a researcher who is intimately familiar with them. Once the data have been accessed and analyzed, the results need to be put in a format that can be readily understood by others. This is a particularly difficult task when submetered data is the basis of the analysis. Stoops and Gilbride will demonstrate their methods of using off-the-shelf graphics software to illustrate complex hourly data from nonresidential buildings.

  7. Long-term results of vaginal repairs with and without xenograft reinforcement

    DEFF Research Database (Denmark)

    Mouritsen, Lone; Kronschnabl, M.; Lose, G.

    2010-01-01

    INTRODUCTION AND HYPOTHESIS: The aim of this paper is to study if xenograft reinforcement of vaginal repair reduces recurrence of prolapse. METHODS: Results 1-5 years after vaginal repair were studied in 41 cases with xenograft and in 82 matched controls without. Symptoms were evaluated...... as POPQ > -1 plus symptoms revealed recurrence in 3% of cases and 12% controls. None of the recurrence rates was significantly different for cases versus controls. No vaginal erosions were seen. Previous surgery was a significant risk factor with odds ratio 7.3 for another recurrence. CONCLUSIONS...

  8. Further results for crack-edge mappings by ray methods

    International Nuclear Information System (INIS)

    Norris, A.N.; Achenbach, J.D.; Ahlberg, L.; Tittman, B.R.

    1984-01-01

    This chapter discusses further extensions of the local edge mapping method to the pulse-echo case and to configurations of water-immersed specimens and transducers. Crack edges are mapped by the use of arrival times of edge-diffracted signals. Topics considered include local edge mapping in a homogeneous medium, local edge mapping algorithms, local edge mapping through an interface, and edge mapping through an interface using synthetic data. Local edge mapping is iterative, with two or three iterations required for convergence

  9. The distinctive gastric fluid proteome in gastric cancer reveals a multi-biomarker diagnostic profile

    Directory of Open Access Journals (Sweden)

    Eng Alvin KH

    2008-10-01

    Full Text Available Abstract Background Overall gastric cancer survival remains poor, mainly because there are no reliable methods for identifying highly curable early-stage disease. Multi-protein profiling of gastric fluids, obtained from the anatomic site of pathology, could reveal diagnostic proteomic fingerprints. Methods Protein profiles were generated from gastric fluid samples of 19 gastric cancer and 36 benign gastritis patients undergoing elective, clinically indicated gastroscopy, using surface-enhanced laser desorption/ionization time-of-flight mass spectrometry on multiple ProteinChip arrays. Proteomic features were compared by the significance analysis of microarrays algorithm and two-way hierarchical clustering. A second blinded sample set (24 gastric cancers and 29 clinically benign gastritides) was used for validation. Results By significance analysis of microarrays, 60 proteomic features were up-regulated and 46 were down-regulated in gastric cancer samples (p Conclusion This simple and reproducible multimarker proteomic assay could supplement clinical gastroscopic evaluation of symptomatic patients to enhance diagnostic accuracy for gastric cancer and pre-malignant lesions.

  10. Analysis of risk of nonconformities and applied quality inspection methods in the process of aluminium profiles coating based on FMEA results

    Directory of Open Access Journals (Sweden)

    Krzysztof Knop

    2017-10-01

    Full Text Available The article presents the results of a risk analysis of nonconformities of aluminium profiles in the coating process and of the quality inspection methods used for their detection. The risk analysis was based on the results of the FMEA method. The evaluated quality inspection methods were distinguished according to the definition of the term "inspection" in the ISO 9000:2005 standard. The manufacturing process of an aluminium profile is presented in a micro-technological approach. A triple quantification of nonconformity risk based on the FMEA method, using three different approaches, was conducted, followed by an analysis of the nonconformity risks associated with the use of specific quality inspection methods. The last part presents an analysis of the causes of critical nonconformities, proposals for improvement actions reducing the risk of the critical nonconformities, and the critical quality inspection methods applied.

  11. Sodium flow rate measurement method of annular linear induction pumps

    International Nuclear Information System (INIS)

    Araseki, Hideo; Kirillov, Igor R.; Preslitsky, Gennady V.

    2012-01-01

    Highlights: ► We found a new method of flow rate monitoring of electromagnetic pump. ► The method is very simple and does not require a large space. ► The method was verified with an experiment and a numerical analysis. ► The experimental data and the numerical results are in good agreement. - Abstract: The present paper proposes a method for measuring sodium flow rate of annular linear induction pumps. The feature of the method lies in measuring the leaked magnetic field with measuring coils near the stator end on the outlet side and in correlating it with the sodium flow rate. This method is verified through an experiment and a numerical analysis. The data obtained in the experiment reveals that the correlation between the leaked magnetic field and the sodium flow rate is almost linear. The result of the numerical analysis agrees with the experimental data. The present method will be particularly effective to sodium flow rate monitoring of each one of plural annular linear induction pumps arranged in parallel in a vessel which forms a large-scale pump unit.
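
    The abstract reports an almost linear correlation between the leaked magnetic field near the stator end and the sodium flow rate. The sketch below shows the generic calibration step such a finding enables: fit a least-squares line and invert it to estimate flow from a field reading. All numbers and units are invented for illustration and are not data from the paper.

    ```python
    # Hypothetical calibration of flow rate against a leaked-field sensor reading.
    import numpy as np

    flow_rate = np.array([0.0, 50.0, 100.0, 150.0, 200.0])     # e.g. m3/min, invented
    leaked_field = np.array([1.02, 1.48, 1.95, 2.51, 2.97])    # arbitrary sensor units

    # Least-squares line: field = a*flow + b, inverted for on-line flow monitoring.
    a, b = np.polyfit(flow_rate, leaked_field, 1)

    def estimated_flow(field_reading):
        return (field_reading - b) / a

    print(f"calibration: field = {a:.4f}*flow + {b:.3f}")
    print("estimated flow at reading 2.2:", round(estimated_flow(2.2), 1))
    ```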

  12. Changes of forest stands vulnerability to future wind damage resulting from different management methods

    DEFF Research Database (Denmark)

    Panferov, O.; Sogachev, Andrey; Ahrends, B.

    2010-01-01

    The structure of forest stands changes continuously as a result of forest growth and of both natural and anthropogenic disturbances such as windthrow or management activities (planting/cutting of trees). These structural changes can stabilize or destabilize forest stands in terms of their resistance to wind damage. The driving force behind the damage is the climate, but the magnitude and sign of the resulting effect depend on tree species, management method and soil conditions. The projected increase in the frequency of weather extremes in general, and of severe storms in particular, might produce wide-area damage in European forest ecosystems during the 21st century. To assess the possible wind damage and the stabilization/destabilization effects of forest management, a number of numerical experiments are carried out for the region of Solling, Germany. The coupled small-scale process-based model combining Brook90

  13. Self-disclosure on Facebook: How much do we really reveal?

    Directory of Open Access Journals (Sweden)

    Stephanie Day

    Full Text Available This paper investigates the use of the social networking site Facebook to self-disclose and analyses the responses of a small group of Facebook users surveyed about their own willingness to self-disclose. An online survey was used to ask Facebook users about their level of Facebook use, what types of personal information they are willing to reveal and the frequency of these personal revelations. The survey also asked the participants to take a look at their publicly viewable profile and the types of information revealed there. Results indicated that overall, most people tended to be cautious about the types of information they revealed, posted mainly positive statements about themselves and were aware of personal privacy issues.

  14. Biosocial Conservation: Integrating Biological and Ethnographic Methods to Study Human-Primate Interactions.

    Science.gov (United States)

    Setchell, Joanna M; Fairet, Emilie; Shutt, Kathryn; Waters, Siân; Bell, Sandra

    2017-01-01

    Biodiversity conservation is one of the grand challenges facing society. Many people interested in biodiversity conservation have a background in wildlife biology. However, the diverse social, cultural, political, and historical factors that influence the lives of people and wildlife can be investigated fully only by incorporating social science methods, ideally within an interdisciplinary framework. Cultural hierarchies of knowledge and the hegemony of the natural sciences create a barrier to interdisciplinary understandings. Here, we review three different projects that confront this difficulty, integrating biological and ethnographic methods to study conservation problems. The first project involved wildlife foraging on crops around a newly established national park in Gabon. Biological methods revealed the extent of crop loss, the species responsible, and an effect of field isolation, while ethnography revealed institutional and social vulnerability to foraging wildlife. The second project concerned great ape tourism in the Central African Republic. Biological methods revealed that gorilla tourism poses risks to gorillas, while ethnography revealed why people seek close proximity to gorillas. The third project focused on humans and other primates living alongside one another in Morocco. Incorporating shepherds in the coproduction of ecological knowledge about primates built trust and altered attitudes to the primates. These three case studies demonstrate how the integration of biological and social methods can help us to understand the sustainability of human-wildlife interactions, and thus promote coexistence. In each case, an integrated biosocial approach incorporating ethnographic data produced results that would not otherwise have come to light. Research that transcends conventional academic boundaries requires the openness and flexibility to move beyond one's comfort zone to understand and acknowledge the legitimacy of "other" kinds of knowledge. It is

  15. A validation of direct grey Dancoff factors results for cylindrical cells in cluster geometry by the Monte Carlo method

    International Nuclear Information System (INIS)

    Rodrigues, Leticia Jenisch; Bogado, Sergio; Vilhena, Marco T.

    2008-01-01

    The WIMS code is a well known and one of the most used codes to handle nuclear core physics calculations. Recently, the PIJM module of the WIMS code was modified in order to allow the calculation of Grey Dancoff factors, for partially absorbing materials, using the alternative definition in terms of escape and collision probabilities. Grey Dancoff factors for the Canadian CANDU-37 and CANFLEX assemblies were calculated with PIJM at five symmetrically distinct fuel pin positions. The results, obtained via Direct Method, i.e., by direct calculation of escape and collision probabilities, were satisfactory when compared with the ones of literature. On the other hand, the PIJMC module was developed to calculate escape and collision probabilities using Monte Carlo method. Modifications in this module were performed to determine Black Dancoff factors, considering perfectly absorbing fuel rods. In this work, we proceed further in the task of validating the Direct Method by the Monte Carlo approach. To this end, the PIJMC routine is modified to compute Grey Dancoff factors using the cited alternative definition. Results are reported for the mentioned CANDU-37 and CANFLEX assemblies obtained with PIJMC, at the same fuel pin positions as with PIJM. A good agreement is observed between the results from the Monte Carlo and Direct methods

  16. Low-temperature infiltration identified using infrared thermography in patients with subcutaneous edema revealed ultrasonographically: A case report.

    Science.gov (United States)

    Oya, Maiko; Takahashi, Toshiaki; Tanabe, Hidenori; Oe, Makoto; Murayama, Ryoko; Yabunaka, Koichi; Matsui, Yuko; Sanada, Hiromi

    Infiltration is a frequent complication of infusion therapy. We previously demonstrated the usefulness of infrared thermography as an objective method of detecting infiltration in healthy people. However, whether thermography can detect infiltration in clinical settings remains unknown. Therefore, we report two cases where thermography was useful in detecting infiltration at puncture sites. In both cases, tissue changes were verified ultrasonographically. The patients were a 56-year-old male with cholangitis and a 76-year-old female with hepatoma. In both cases, infiltration symptoms such as swelling and erythema occurred one day after the insertion of a peripheral intravenous catheter. Thermographic images from both patients revealed low-temperature areas spreading from the puncture sites; however, these changes were not observed in other patients. The temperature difference between the low-temperature areas and their surrounding skin surface exceeded 1.0°C. Concurrently, ultrasound images revealed that tissues surrounding the vein had a cobblestone appearance, indicating edema. In both patients, subcutaneous tissue changes suggested infiltration and both had low-temperature areas spreading from the puncture sites. Thus, subcutaneous edema may indicate infusion leakage, resulting in a decrease in the temperature of the associated skin surface. These cases suggest that infrared thermography is an effective method of objectively and noninvasively detecting infiltration.

  17. METHODS OF MEASURING THE EFFECTS OF LIGHTNING BY SIMULATING ITS STRIKES WITH THE INTERVAL ASSESSMENT OF THE RESULTS OF MEASUREMENTS

    Directory of Open Access Journals (Sweden)

    P. V. Kriksin

    2017-01-01

    Full Text Available The article presents the results of the development of new methods aimed at a more accurate interval estimate of the experimental values of the voltages that occur on the grounding devices of substations and in the circuits of control cables when lightning strikes lightning rods; this estimate made it possible to increase the accuracy of the results of the study of lightning noise by 28%. The more accurate interval estimate was achieved by developing a measurement model that takes into account, along with the measured values, different measurement errors, and that includes special processing of the measurement results. As a result, the interval containing the true value of the sought voltage is determined with a confidence of 95%. The methods can be applied to the IK-1 and IKP-1 measurement complexes, consisting of an aperiodic pulse generator and a high-frequency pulse generator, respectively, together with selective voltmeters. To evaluate the effectiveness of the developed methods, series of experimental voltage assessments of the grounding devices of ten operating high-voltage substations were carried out in accordance with both the developed methods and traditional techniques. The evaluation results confirmed the possibility of finding the true values of the voltage over a wide range, which ought to be considered in the technical diagnostics of the lightning protection of substations when the measurement results are analysed and measures to reduce the effects of lightning are developed. A comparative analysis of the results of measurements made in accordance with the developed methods and with traditional techniques demonstrated that the true value of the sought voltage may exceed the measured value by 28% on average, which ought to be considered in the further analysis of the lightning protection parameters at the facility and in the development of corrective actions. The developed methods have been
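
    The paper's measurement model also folds in instrument-specific errors; as a generic illustration of the statistical part only, the sketch below computes a 95% interval estimate from repeated (invented) voltage readings using the t-distribution. SciPy is assumed to be available.

    ```python
    # Illustrative 95% interval estimate from repeated measurements; numbers are invented.
    import numpy as np
    from scipy import stats

    readings_kv = np.array([12.4, 11.9, 12.8, 12.1, 12.6, 12.3])   # hypothetical voltages, kV
    mean = readings_kv.mean()
    sem = stats.sem(readings_kv)                                   # standard error of the mean
    low, high = stats.t.interval(0.95, df=len(readings_kv) - 1, loc=mean, scale=sem)
    print(f"95% interval estimate: {low:.2f} .. {high:.2f} kV")
    ```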

  18. Assessing Cost-Effectiveness in Obesity (ACE-Obesity: an overview of the ACE approach, economic methods and cost results

    Directory of Open Access Journals (Sweden)

    Swinburn Boyd

    2009-11-01

    Full Text Available Abstract Background The aim of the ACE-Obesity study was to determine the economic credentials of interventions which aim to prevent unhealthy weight gain in children and adolescents. We have reported elsewhere on the modelled effectiveness of 13 obesity prevention interventions in children. In this paper, we report on the cost results and associated methods together with the innovative approach to priority setting that underpins the ACE-Obesity study. Methods The Assessing Cost Effectiveness (ACE approach combines technical rigour with 'due process' to facilitate evidence-based policy analysis. Technical rigour was achieved through use of standardised evaluation methods, a research team that assembles best available evidence and extensive uncertainty analysis. Cost estimates were based on pathway analysis, with resource usage estimated for the interventions and their 'current practice' comparator, as well as associated cost offsets. Due process was achieved through involvement of stakeholders, consensus decisions informed by briefing papers and 2nd stage filter analysis that captures broader factors that influence policy judgements in addition to cost-effectiveness results. The 2nd stage filters agreed by stakeholders were 'equity', 'strength of the evidence', 'feasibility of implementation', 'acceptability to stakeholders', 'sustainability' and 'potential for side-effects'. Results The intervention costs varied considerably, both in absolute terms (from cost saving [6 interventions] to in excess of AUD50m per annum and when expressed as a 'cost per child' estimate (from Conclusion The use of consistent methods enables valid comparison of potential intervention costs and cost-offsets for each of the interventions. ACE-Obesity informs policy-makers about cost-effectiveness, health impact, affordability and 2nd stage filters for important options for preventing unhealthy weight gain in children. In related articles cost-effectiveness results and

  19. A fractional factorial probabilistic collocation method for uncertainty propagation of hydrologic model parameters in a reduced dimensional space

    Science.gov (United States)

    Wang, S.; Huang, G. H.; Huang, W.; Fan, Y. R.; Li, Z.

    2015-10-01

    In this study, a fractional factorial probabilistic collocation method is proposed to reveal statistical significance of hydrologic model parameters and their multi-level interactions affecting model outputs, facilitating uncertainty propagation in a reduced dimensional space. The proposed methodology is applied to the Xiangxi River watershed in China to demonstrate its validity and applicability, as well as its capability of revealing complex and dynamic parameter interactions. A set of reduced polynomial chaos expansions (PCEs) only with statistically significant terms can be obtained based on the results of factorial analysis of variance (ANOVA), achieving a reduction of uncertainty in hydrologic predictions. The predictive performance of reduced PCEs is verified by comparing against standard PCEs and the Monte Carlo with Latin hypercube sampling (MC-LHS) method in terms of reliability, sharpness, and Nash-Sutcliffe efficiency (NSE). Results reveal that the reduced PCEs are able to capture hydrologic behaviors of the Xiangxi River watershed, and they are efficient functional representations for propagating uncertainties in hydrologic predictions.
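
    For readers unfamiliar with polynomial chaos expansions, the sketch below fits a one-parameter PCE to a toy response by least squares at random collocation points and reads the mean and variance off the coefficients. It is a generic illustration under invented assumptions, not the fractional factorial ANOVA-based term selection of the paper; NumPy is assumed.

    ```python
    # Generic one-dimensional PCE sketch with a probabilists' Hermite basis.
    import numpy as np
    from math import factorial
    from numpy.polynomial.hermite_e import hermevander

    rng = np.random.default_rng(1)
    xi = rng.standard_normal(200)               # samples of a standard-normal parameter
    y = np.exp(0.3 * xi) + 0.1 * xi**2          # toy model response, invented

    degree = 4
    Psi = hermevander(xi, degree)               # Hermite basis matrix, shape (200, 5)
    coeffs, *_ = np.linalg.lstsq(Psi, y, rcond=None)

    # He_k are orthogonal under the standard normal weight with E[He_k^2] = k!,
    # so the PCE mean is c_0 and the variance is sum_{k>=1} c_k^2 * k!.
    variance = sum(coeffs[k] ** 2 * factorial(k) for k in range(1, degree + 1))
    print("PCE mean ~", round(coeffs[0], 4), " PCE variance ~", round(variance, 4))
    ```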

  20. Resource costing for multinational neurologic clinical trials: methods and results.

    Science.gov (United States)

    Schulman, K; Burke, J; Drummond, M; Davies, L; Carlsson, P; Gruger, J; Harris, A; Lucioni, C; Gisbert, R; Llana, T; Tom, E; Bloom, B; Willke, R; Glick, H

    1998-11-01

    We present the results of a multinational resource costing study for a prospective economic evaluation of a new medical technology for treatment of subarachnoid hemorrhage within a clinical trial. The study describes a framework for the collection and analysis of international resource cost data that can contribute to a consistent and accurate intercountry estimation of cost. Of the 15 countries that participated in the clinical trial, we collected cost information in the following seven: Australia, France, Germany, the UK, Italy, Spain, and Sweden. The collection of cost data in these countries was structured through the use of worksheets to provide accurate and efficient cost reporting. We converted total average costs to average variable costs and then aggregated the data to develop study unit costs. When unit costs were unavailable, we developed an index table, based on a market-basket approach, to estimate unit costs. To estimate the cost of a given procedure, the market-basket estimation process required that cost information be available for at least one country. When cost information was unavailable in all countries for a given procedure, we estimated costs using a method based on physician-work and practice-expense resource-based relative value units. Finally, we converted study unit costs to a common currency using purchasing power parity measures. Through this costing exercise we developed a set of unit costs for patient services and per diem hospital services. We conclude by discussing the implications of our costing exercise and suggest guidelines to facilitate more effective multinational costing exercises.

  1. Interval sampling methods and measurement error: a computer simulation.

    Science.gov (United States)

    Wirth, Oliver; Slaven, James; Taylor, Matthew A

    2014-01-01

    A simulation study was conducted to provide a more thorough account of measurement error associated with interval sampling methods. A computer program simulated the application of momentary time sampling, partial-interval recording, and whole-interval recording methods on target events randomly distributed across an observation period. The simulation yielded measures of error for multiple combinations of observation period, interval duration, event duration, and cumulative event duration. The simulations were conducted up to 100 times to yield measures of error variability. Although the present simulation confirmed some previously reported characteristics of interval sampling methods, it also revealed many new findings that pertain to each method's inherent strengths and weaknesses. The analysis and resulting error tables can help guide the selection of the most appropriate sampling method for observation-based behavioral assessments. © Society for the Experimental Analysis of Behavior.
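
    A rough re-creation of the simulation idea (not the authors' program): place random events on a timeline, then score momentary time sampling (MTS), partial-interval recording (PIR) and whole-interval recording (WIR) against the true cumulative event duration. Session length, interval length and event durations are invented.

    ```python
    # Toy interval-sampling simulation at 1-second resolution; all parameters invented.
    import numpy as np

    rng = np.random.default_rng(42)
    session, interval, n_events, event_dur = 600.0, 10.0, 20, 5.0   # seconds
    starts = np.sort(rng.uniform(0, session - event_dur, n_events))
    on = np.zeros(int(session))                                     # occupancy per second
    for s in starts:
        on[int(s):int(s + event_dur)] = 1
    true_pct = on.mean() * 100

    edges = np.arange(0, int(session), int(interval))
    mts = np.mean([on[e + int(interval) - 1] for e in edges]) * 100          # last moment
    pir = np.mean([on[e:e + int(interval)].any() for e in edges]) * 100      # any occurrence
    wir = np.mean([on[e:e + int(interval)].all() for e in edges]) * 100      # full occupancy

    print(f"true {true_pct:.1f}%  MTS {mts:.1f}%  PIR {pir:.1f}%  WIR {wir:.1f}%")
    ```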

  2. Finite elements volumes methods: applications to the Navier-Stokes equations and convergence results

    International Nuclear Information System (INIS)

    Emonot, P.

    1992-01-01

    The first chapter describes the equations modeling incompressible fluid flow and gives a brief presentation of the finite volume method. The second chapter is an introduction to the finite element volume method: the box model is described and a method adapted to Navier-Stokes problems is proposed. The third chapter presents an error analysis of the finite element volume method for the Laplacian problem, together with examples of one-, two- and three-dimensional calculations. The fourth chapter extends the error analysis of the method to the Navier-Stokes problem.

  3. Prevalence of Overweight and Obesity in Portuguese Adolescents: Comparison of Different Anthropometric Methods

    Science.gov (United States)

    Minghelli, Beatriz; Nunes, Carla; Oliveira, Raul

    2013-01-01

    Background: The recommended anthropometric methods to assess weight status include body mass index (BMI), skinfold thickness, and waist circumference. However, these methods have advantages and disadvantages regarding the classification of overweight and obesity in adolescents. Aims: The aim of the study was to analyze the correlation between measurements of BMI, skinfold thickness and waist circumference in assessing overweight and obesity in Portuguese adolescents. Materials and Methods: A sample of 966 students from Portugal was used; 437 (45.2%) were males and 529 (54.8%) were females, aged between 10 and 16 years. The evaluations included BMI calculation and skinfold thickness and waist circumference measurements. Results: This study revealed a high prevalence of overweight and obesity, with values of 31.6%, 61.4%, and 41.1% according to BMI, skinfold thickness, and waist circumference, respectively. The results showed a high level of correlation between BMI and skinfold thickness (P < 0.001, r = 0.712), between BMI and waist circumference (P < 0.001, r = 0.884), and between waist circumference and skinfold thickness (P < 0.001, r = 0.701). Conclusions: This study revealed a high prevalence of overweight and obesity in Portuguese adolescents using three different anthropometric methods, with BMI giving the lowest prevalence values and skinfold thickness the highest. The three anthropometric methods were highly correlated. PMID:24404544
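
    The correlation analysis itself is straightforward; as a hedged illustration, the sketch below computes Pearson r between the three measures on invented example data (the coefficients will not reproduce the reported values). SciPy is assumed.

    ```python
    # Pearson correlations among three invented anthropometric variables.
    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(7)
    bmi = rng.normal(21, 3, 100)
    waist = 2.2 * bmi + rng.normal(0, 4, 100)        # hypothetical relations
    skinfold = 1.5 * bmi + rng.normal(0, 6, 100)

    for name, x, y in [("BMI vs skinfold", bmi, skinfold),
                       ("BMI vs waist", bmi, waist),
                       ("waist vs skinfold", waist, skinfold)]:
        r, p = pearsonr(x, y)
        print(f"{name}: r = {r:.3f}, p = {p:.2g}")
    ```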

  4. Effects of test method and participant musical training on preference ratings of stimuli with different reverberation times.

    Science.gov (United States)

    Lawless, Martin S; Vigeant, Michelle C

    2017-10-01

    Selecting an appropriate listening test design for concert hall research depends on several factors, including listening test method and participant critical-listening experience. Although expert listeners afford more reliable data, their perceptions may not be broadly representative. The present paper contains two studies that examined the validity and reliability of the data obtained from two listening test methods, a successive and a comparative method, and two types of participants, musicians and non-musicians. Participants rated their overall preference of auralizations generated from eight concert hall conditions with a range of reverberation times (0.0-7.2 s). Study 1, with 34 participants, assessed the two methods. The comparative method yielded similar results and reliability as the successive method. Additionally, the comparative method was rated as less difficult and more preferable. For study 2, an additional 37 participants rated the stimuli using the comparative method only. An analysis of variance of the responses from both studies revealed that musicians are better than non-musicians at discerning their preferences across stimuli. This result was confirmed with a k-means clustering analysis on the entire dataset that revealed five preference groups. Four groups exhibited clear preferences to the stimuli, while the fifth group, predominantly comprising non-musicians, demonstrated no clear preference.

  5. The healthy building intervention study: Objectives, methods and results of selected environmental measurements

    Energy Technology Data Exchange (ETDEWEB)

    Fisk, W.J.; Faulkner, D.; Sullivan, D. [and others]

    1998-02-17

    To test proposed methods for reducing SBS symptoms and to learn about the causes of these symptoms, a double-blind controlled intervention study was designed and implemented. This study utilized two different interventions designed to reduce occupants' exposures to airborne particles: (1) high efficiency filters in the building's HVAC systems; and (2) thorough cleaning of carpeted floors and fabric-covered chairs with an unusually powerful vacuum cleaner. The study population was the workers on the second and fourth floors of a large office building with mechanical ventilation, air conditioning, and sealed windows. Interventions were implemented on one floor while the occupants on the other floor served as a control group. For the enhanced-filtration intervention, a multiple crossover design was used (a crossover is a repeat of the experiment with the former experimental group as the control group and vice versa). Demographic and health symptom data were collected via an initial questionnaire on the first study week and health symptom data were obtained each week, for eight additional weeks, via weekly questionnaires. A large number of indoor environmental parameters were measured during the study including air temperatures and humidities, carbon dioxide concentrations, particle concentrations, concentrations of several airborne bioaerosols, and concentrations of several microbiologic compounds within the dust sampled from floors and chairs. This report describes the study methods and summarizes the results of selected environmental measurements.

  6. Time-lapse imagery of Adélie penguins reveals differential winter strategies and breeding site occupation.

    Science.gov (United States)

    Black, Caitlin; Southwell, Colin; Emmerson, Louise; Lunn, Daniel; Hart, Tom

    2018-01-01

    Polar seabirds adopt different over-wintering strategies to survive and build condition during the critical winter period. Penguin species either reside at the colony during the winter months or migrate long distances. Tracking studies and survey methods have revealed differences in winter migration routes among penguin species and colonies, dependent on both biotic and abiotic factors present. However, scan sampling methods are rarely used to reveal non-breeding behaviors during winter and little is known about presence at the colony site over this period. Here we show that Adélie penguins on the Yalour Islands in the Western Antarctic Peninsula (WAP) are present year-round at the colony and undergo a mid-winter peak in abundance during winter. We found a negative relationship between daylight hours and penguin abundance when either open water or compact ice conditions were present, suggesting that penguins return to the breeding colony when visibility is lowest for at-sea foraging and when either extreme low or high levels of sea ice exist offshore. In contrast, Adélie penguins breeding in East Antarctica were not observed at the colonies during winter, suggesting that Adélie penguins undergo differential winter strategies in the marginal ice zone on the WAP compared to those in East Antarctica. These results demonstrate that cameras can successfully monitor wildlife year-round in areas that are largely inaccessible during winter.

  7. Standard and applied material testing methods of austenitic CrNi stainless steels in different nitric acid media - procedures and results

    International Nuclear Information System (INIS)

    Leistikow, S.; Kraft, R.; Schanz, G.

    1989-07-01

    Extended ASTM standard Huey testing has been performed in boiling 14.4 molar (65%) nitric acid at 120 °C during 15 periods (15 x 48 = 720 h total duration) for the quality control of numerous commercial nitric-acid-resistant austenitic CrNi steels. It was shown how sensitively the chosen testing conditions could differentiate between CrNi steels of the same nominal composition as specified for DIN W.Nr. 1.4306 (AISI Type 304 L), but with varying residual element contents. In an attempt to differentiate within this group of steels by application of electrochemical methods, potentiostatic tests at 1250 mV in nitric acid of equal concentration and temperature were able to detect remarkable differences in corrosion behaviour already after one hour. Another approach, more typical of the electrochemical potentials encountered during materials application in nuclear fuel reprocessing plants, gave preference to long-term immersion tests, which were performed in nitric acid of lower concentration and temperature. Reference tests in pure 7 molar nitric acid at 90 °C could reveal only small differences in steel quality, through surface attack, after exposures of 720 h duration. To shorten the test time by increasing the redox potential, chromium(VI) ions were added to the nitric acid. In a solution of 0.5 g Cr(VI)/l at 90 °C, remarkable differences in the corrosion behaviour of the steels - similar to the Huey test results - became measurable by means of gravimetry and metallography already during a short-term exposure of 24-71 h. (orig./MM) [de

  8. Model films of cellulose. I. Method development and initial results

    NARCIS (Netherlands)

    Gunnars, S.; Wågberg, L.; Cohen Stuart, M.A.

    2002-01-01

    This report presents a new method for the preparation of thin cellulose films. NMMO (N-methylmorpholine-N-oxide) was used to dissolve cellulose, and addition of DMSO (dimethyl sulfoxide) was used to control the viscosity of the cellulose solution. A thin layer of the cellulose solution is spin-coated

  9. Doppler method leak detection for LMFBR steam generators. Pt. 3. Investigation of detection sensitivity and method

    International Nuclear Information System (INIS)

    Kumagai, Hiromichi; Kinoshita, Izumi

    2001-01-01

    To prevent the expansion of tube damage and to maintain structural integrity in the steam generators (SGs) of a fast breeder reactor (FBR), it is necessary to detect precisely and immediately any leakage of water from heat transfer tubes. Therefore, the Doppler method was developed. Previous studies have revealed that, in the SG full-sector model that simulates actual SGs, the Doppler method can detect bubbles of 0.4 l/s within a few seconds. However in consideration of the dissolution rate of hydrogen generated by a sodium-water reaction even from a small water leak, it is necessary to detect smaller leakages of water from the heat transfer tubes. The detection sensitivity of the Doppler method and the influence of background noise were experimentally investigated. In-water experiments were performed using the SG model. The results show that the Doppler method can detect bubbles of 0.01 l/s (equivalent to a water leak rate of about 0.01 g/s) within a few seconds and that the background noise has little effect on water leak detection performance. The Doppler method thus has great potential for the detection of water leakage in SGs. (author)

  10. Unsupervised text mining methods for literature analysis: a case study for Thomas Pynchon's V.

    Directory of Open Access Journals (Sweden)

    Christos Iraklis Tsatsoulis

    2013-08-01

    Full Text Available We investigate the use of unsupervised text mining methods for the analysis of prose literature, using Thomas Pynchon's novel 'V.' as a case study. Our results suggest that such methods may be employed to reveal meaningful information regarding the novel's structure. We report results using a wide variety of clustering algorithms, several distinct distance functions, and different visualization techniques. The application of a simple topic model is also demonstrated. We discuss the meaningfulness of our results along with the limitations of our approach, and we suggest some possible paths for further study.
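
    A sketch of the kind of unsupervised pipeline described (TF-IDF features, k-means clustering and a small LDA topic model) applied to a few placeholder text chunks. The chunk texts, cluster count and topic count are invented stand-ins, not the study's corpus or settings; scikit-learn is assumed.

    ```python
    # Toy clustering and topic modelling of short text chunks; content is placeholder.
    from sklearn.feature_extraction.text import TfidfVectorizer, CountVectorizer
    from sklearn.cluster import KMeans
    from sklearn.decomposition import LatentDirichletAllocation

    chunks = [
        "yoyo benny profane street new york",        # hypothetical chapter excerpts
        "stencil father florence venezuela siege",
        "profane alligator sewer hunt shock",
        "stencil vheissu spy colonial intrigue",
    ]

    tfidf = TfidfVectorizer(stop_words="english")
    X = tfidf.fit_transform(chunks)
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
    print("cluster labels per chunk:", labels)

    counts = CountVectorizer(stop_words="english").fit(chunks)
    lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts.transform(chunks))
    terms = counts.get_feature_names_out()
    for k, topic in enumerate(lda.components_):
        print(f"topic {k}:", [terms[i] for i in topic.argsort()[-3:]])
    ```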

  11. Development of isothermal-isobaric replica-permutation method for molecular dynamics and Monte Carlo simulations and its application to reveal temperature and pressure dependence of folded, misfolded, and unfolded states of chignolin

    Science.gov (United States)

    Yamauchi, Masataka; Okumura, Hisashi

    2017-11-01

    We developed a two-dimensional replica-permutation molecular dynamics method in the isothermal-isobaric ensemble. The replica-permutation method is a better alternative to the replica-exchange method and was originally developed in the canonical ensemble. This method employs the Suwa-Todo algorithm, instead of the Metropolis algorithm, to perform permutations of temperatures and pressures among more than two replicas so that the rejection ratio can be minimized. We showed that the isothermal-isobaric replica-permutation method achieves better sampling efficiency than the isothermal-isobaric replica-exchange method and the infinite swapping method. We applied this method to a β-hairpin mini protein, chignolin. In this simulation, we observed not only the folded state but also the misfolded state. We calculated the temperature and pressure dependence of the fractions of the folded, misfolded, and unfolded states. Differences in partial molar enthalpy, internal energy, entropy, partial molar volume, and heat capacity were also determined and agreed well with experimental data. We observed a new phenomenon: misfolded chignolin becomes more stable under high-pressure conditions. We also revealed the mechanism of this stability as follows: the TYR2 and TRP9 side chains cover the hydrogen bonds that form the β-hairpin structure, and these hydrogen bonds are thereby protected from the water molecules that approach the protein as the pressure increases.

  12. Continuous multistep methods for volterra integro-differential ...

    African Journals Online (AJOL)

    A new class of numerical methods for second-order Volterra integro-differential equations is developed. The methods are based on interpolation and collocation, using the shifted Legendre polynomial as basis function together with trapezoidal quadrature rules. The convergence analysis revealed that the methods are consistent ...

  13. EXPLORATORY DATA ANALYSIS AND MULTIVARIATE STRATEGIES FOR REVEALING MULTIVARIATE STRUCTURES IN CLIMATE DATA

    Directory of Open Access Journals (Sweden)

    2016-12-01

    Full Text Available This paper is concerned with data analysis strategy in a complex, multidimensional and dynamic domain. The focus is on the use of data mining techniques to explore the importance of multivariate structures, using climate variables that influence climate change. The techniques involved in a data mining exercise vary according to the data structures. The multivariate analysis strategy considered here involved choosing an appropriate tool to analyze a process. Factor analysis is introduced into the data mining technique in order to reveal the influence of the factors involved and to address multicollinearity among the variables. The temporal nature and multidimensionality of the target variables are captured in the model using multidimensional regression estimates. The strategy of integrating several statistical techniques was employed, using climate variables for Nigeria. An R2 of 0.518 was obtained from the ordinary least squares regression analysis, and the test was not significant at the 5% level of significance. However, the factor analysis regression strategy gave a good fit, with an R2 of 0.811, and the test was significant at the 5% level of significance. Based on this study, model building should go beyond the usual confirmatory data analysis (CDA); rather, it should be complemented with exploratory data analysis (EDA) in order to achieve the desired result.
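
    A minimal sketch of the factor analysis regression strategy, assuming simulated collinear predictors as stand-ins for the climate variables: latent factors are extracted first and the response is then regressed on the factor scores, for comparison with ordinary least squares on the raw variables. scikit-learn is assumed; the R2 values will not reproduce those reported.

    ```python
    # Factor-analysis regression vs. raw OLS on simulated collinear predictors.
    import numpy as np
    from sklearn.decomposition import FactorAnalysis
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(3)
    n = 200
    latent = rng.normal(size=(n, 2))                       # two underlying drivers
    loadings = rng.normal(size=(2, 6))
    X = latent @ loadings + rng.normal(0, 0.3, (n, 6))     # six collinear observed variables
    y = 1.5 * latent[:, 0] - 0.8 * latent[:, 1] + rng.normal(0, 0.5, n)

    r2_ols = LinearRegression().fit(X, y).score(X, y)
    scores = FactorAnalysis(n_components=2, random_state=0).fit_transform(X)
    r2_far = LinearRegression().fit(scores, y).score(scores, y)
    print(f"R2 raw OLS = {r2_ols:.3f}, R2 factor-analysis regression = {r2_far:.3f}")
    ```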

  14. On numerical solution of Burgers' equation by homotopy analysis method

    International Nuclear Information System (INIS)

    Inc, Mustafa

    2008-01-01

    In this Letter, we present the Homotopy Analysis Method (HAM) for obtaining the numerical solution of the one-dimensional nonlinear Burgers' equation. The initial approximation can be freely chosen with possible unknown constants, which can be determined by imposing the boundary and initial conditions. Convergence of the solution and the effectiveness of the method are discussed. A comparison of the HAM results with the Homotopy Perturbation Method (HPM) and the results of [E.N. Aksan, Appl. Math. Comput. 174 (2006) 884; S. Kutluay, A. Esen, Int. J. Comput. Math. 81 (2004) 1433; S. Abbasbandy, M.T. Darvishi, Appl. Math. Comput. 163 (2005) 1265] is made. The results reveal that HAM is very simple and effective. The HAM contains the auxiliary parameter h, which provides a simple way to adjust and control the convergence region of the solution series. The numerical solutions are compared with the known analytical and some numerical solutions.

  15. REVEAL: Software Documentation and Platform Migration

    Science.gov (United States)

    Wilson, Michael A.; Veibell, Victoir T.; Freudinger, Lawrence C.

    2008-01-01

    The Research Environment for Vehicle Embedded Analysis on Linux (REVEAL) is reconfigurable data acquisition software designed for network-distributed test and measurement applications. In development since 2001, it has been successfully demonstrated in support of a number of actual missions within NASA's Suborbital Science Program. Improvements to software configuration control were needed to properly support both an ongoing transition to operational status and continued evolution of REVEAL capabilities. For this reason the project described in this report targets REVEAL software source documentation and deployment of the software on a small set of hardware platforms different from what is currently used in the baseline system implementation. This report specifically describes the actions taken over a ten week period by two undergraduate student interns and serves as a final report for that internship. The topics discussed include: the documentation of REVEAL source code; the migration of REVEAL to other platforms; and an end-to-end field test that successfully validates the efforts.

  16. The method of producing climate change datasets impacts the resulting policy guidance and chance of mal-adaptation

    Directory of Open Access Journals (Sweden)

    Marie Ekström

    2016-12-01

    Full Text Available Impact, adaptation and vulnerability (IAV) research underpins strategies for adaptation to climate change and helps to conceptualise what life may look like in decades to come. Research draws on information from global climate models (GCMs), though typically post-processed into a secondary product with finer resolution through methods of downscaling. Through worked examples set in an Australian context we assess the influence of GCM sub-setting, geographic area sub-setting and downscaling method on the regional change signal. The examples demonstrate that these choices affect the final results differently depending on factors such as application needs, the range of uncertainty of the projected variable, the amplitude of natural variability, and the size of the study region. For heat extremes, the choice of emissions scenario is of prime importance, but for a given scenario the method of preparing data can affect the magnitude of the projection by a factor of two or more, strongly affecting the indicated adaptation decision. For catchment-level runoff projections, the choice of emissions scenario is less dominant. Rather, the method of selecting and producing application-ready datasets is crucial, as demonstrated by results with opposing signs of change, raising the real possibility of mal-adaptive decisions. This work illustrates the potential pitfalls of GCM sub-sampling or the use of a single downscaled product when conducting IAV research. Using the broad range of change from all available model sources, whilst making the application more complex, avoids the larger problem of over-confidence in climate projections and lessens the chance of mal-adaptation.

  17. STANDARDIZATION OF GLYCOHEMOGLOBIN RESULTS AND REFERENCE VALUES IN WHOLE-BLOOD STUDIED IN 103 LABORATORIES USING 20 METHODS

    NARCIS (Netherlands)

    WEYKAMP, CW; PENDERS, TJ; MUSKIET, FAJ; VANDERSLIK, W

    We investigated the effect of calibration with lyophilized calibrators on whole-blood glycohemoglobin (glyHb) results. One hundred three laboratories, using 20 different methods, determined glyHb in two lyophilized calibrators and two whole-blood samples. For whole-blood samples with low (5%) and

  18. Lack of tissue renewal in human adult Achilles tendon is revealed by nuclear bomb C

    DEFF Research Database (Denmark)

    Heinemeier, Katja Maria; Schjerling, Peter; Heinemeier, J.

    2013-01-01

    the 14C bomb-pulse method. This method takes advantage of the dramatic increase in atmospheric levels of 14C, produced by nuclear bomb tests in 1955-1963, which is reflected in all living organisms. Levels of 14C were measured in 28 forensic samples of Achilles tendon core and 4 skeletal muscle samples...... is revealed by nuclear bomb 14C....

  19. A normalization method for combination of laboratory test results from different electronic healthcare databases in a distributed research network.

    Science.gov (United States)

    Yoon, Dukyong; Schuemie, Martijn J; Kim, Ju Han; Kim, Dong Ki; Park, Man Young; Ahn, Eun Kyoung; Jung, Eun-Young; Park, Dong Kyun; Cho, Soo Yeon; Shin, Dahye; Hwang, Yeonsoo; Park, Rae Woong

    2016-03-01

    Distributed research networks (DRNs) afford statistical power by integrating observational data from multiple partners for retrospective studies. However, laboratory test results across care sites are derived using different assays from varying patient populations, making it difficult to simply combine data for analysis. Additionally, existing normalization methods are not suitable for retrospective studies. We normalized laboratory results from different data sources by adjusting for heterogeneous clinico-epidemiologic characteristics of the data and called this the subgroup-adjusted normalization (SAN) method. Subgroup-adjusted normalization renders the means and standard deviations of distributions identical under population structure-adjusted conditions. To evaluate its performance, we compared SAN with existing methods for simulated and real datasets consisting of blood urea nitrogen, serum creatinine, hematocrit, hemoglobin, serum potassium, and total bilirubin. Various clinico-epidemiologic characteristics can be applied together in SAN. For simplicity of comparison, age and gender were used to adjust population heterogeneity in this study. In simulations, SAN had the lowest standardized difference in means (SDM) and Kolmogorov-Smirnov values for all tests (p normalization performed better than normalization using other methods. The SAN method is applicable in a DRN environment and should facilitate analysis of data integrated across DRN partners for retrospective observational studies. Copyright © 2015 John Wiley & Sons, Ltd.
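
    An interpretive sketch of the subgroup-adjusted normalization idea as described in the abstract: z-score each laboratory value within its clinico-epidemiologic subgroup (here age band and gender) and rescale to a common reference mean and SD, so subgroup distributions become comparable across sites. Column names, reference values and data are assumptions for illustration, not the published algorithm; pandas is assumed.

    ```python
    # Illustrative subgroup-wise normalization of a laboratory test column.
    import pandas as pd

    def san_normalize(df, value_col, group_cols, ref_mean, ref_sd):
        """Z-score within each subgroup, then rescale to a shared reference mean/SD."""
        g = df.groupby(group_cols)[value_col]
        z = (df[value_col] - g.transform("mean")) / g.transform("std")
        return z * ref_sd + ref_mean

    site_a = pd.DataFrame({
        "creatinine": [0.9, 0.8, 1.1, 1.2, 1.3, 1.4, 1.0, 1.1],     # invented values
        "age_band":  ["<60", "<60", "<60", "<60", ">=60", ">=60", ">=60", ">=60"],
        "gender":    ["F", "F", "M", "M", "M", "M", "F", "F"],
    })
    site_a["creatinine_san"] = san_normalize(
        site_a, "creatinine", ["age_band", "gender"], ref_mean=1.0, ref_sd=0.2
    )
    print(site_a)
    ```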

  20. Pion emission from the T2K replica target: method, results and application

    CERN Document Server

    Abgrall, N.; Anticic, T.; Antoniou, N.; Argyriades, J.; Baatar, B.; Blondel, A.; Blumer, J.; Bogomilov, M.; Bravar, A.; Brooks, W.; Brzychczyk, J.; Bubak, A.; Bunyatov, S.A.; Busygina, O.; Christakoglou, P.; Chung, P.; Czopowicz, T.; Davis, N.; Debieux, S.; Di Luise, S.; Dominik, W.; Dumarchez, J.; Dynowski, K.; Engel, R.; Ereditato, A.; Esposito, L.S.; Feofilov, G.A.; Fodor, Z.; Ferrero, A.; Fulop, A.; Gazdzicki, M.; Golubeva, M.; Grabez, B.; Grebieszkow, K.; Grzeszczuk, A.; Guber, F.; Haesler, A.; Hakobyan, H.; Hasegawa, T.; Idczak, R.; Igolkin, S.; Ivanov, Y.; Ivashkin, A.; Kadija, K.; Kapoyannis, A.; Katrynska, N.; Kielczewska, D.; Kikola, D.; Kirejczyk, M.; Kisiel, J.; Kiss, T.; Kleinfelder, S.; Kobayashi, T.; Kochebina, O.; Kolesnikov, V.I.; Kolev, D.; Kondratiev, V.P.; Korzenev, A.; Kowalski, S.; Krasnoperov, A.; Kuleshov, S.; Kurepin, A.; Lacey, R.; Larsen, D.; Laszlo, A.; Lyubushkin, V.V.; Mackowiak-Pawlowska, M.; Majka, Z.; Maksiak, B.; Malakhov, A.I.; Maletic, D.; Marchionni, A.; Marcinek, A.; Maris, I.; Marin, V.; Marton, K.; Matulewicz, T.; Matveev, V.; Melkumov, G.L.; Messina, M.; Mrowczynski, St.; Murphy, S.; Nakadaira, T.; Nishikawa, K.; Palczewski, T.; Palla, G.; Panagiotou, A.D.; Paul, T.; Peryt, W.; Petukhov, O.; Planeta, R.; Pluta, J.; Popov, B.A.; Posiadala, M.; Pulawski, S.; Puzovic, J.; Rauch, W.; Ravonel, M.; Renfordt, R.; Robert, A.; Rohrich, D.; Rondio, E.; Rossi, B.; Roth, M.; Rubbia, A.; Rustamov, A.; Rybczynski, M.; Sadovsky, A.; Sakashita, K.; Savic, M.; Sekiguchi, T.; Seyboth, P.; Shibata, M.; Sipos, M.; Skrzypczak, E.; Slodkowski, M.; Staszel, P.; Stefanek, G.; Stepaniak, J.; Strabel, C.; Strobele, H.; Susa, T.; Szuba, M.; Tada, M.; Taranenko, A.; Tereshchenko, V.; Tolyhi, T.; Tsenov, R.; Turko, L.; Ulrich, R.; Unger, M.; Vassiliou, M.; Veberic, D.; Vechernin, V.V.; Vesztergombi, G.; Wilczek, A.; Wlodarczyk, Z.; Wojtaszek-Szwarc, A.; Wyszynski, O.; Zambelli, L.; Zipper, W.; Hartz, M.; Ichikawa, A.K.; Kubo, H.; Marino, A.D.; Matsuoka, K.; Murakami, A.; Nakaya, T.; Suzuki, K.; Yuan, T.; Zimmerman, E.D.

    2013-01-01

    The T2K long-baseline neutrino oscillation experiment in Japan needs precise predictions of the initial neutrino flux. The highest precision can be reached based on detailed measurements of hadron emission from the same target as used by T2K exposed to a proton beam of the same kinetic energy of 30 GeV. The corresponding data were recorded in 2007-2010 by the NA61/SHINE experiment at the CERN SPS using a replica of the T2K graphite target. In this paper details of the experiment, data taking, data analysis method and results from the 2007 pilot run are presented. Furthermore, the application of the NA61/SHINE measurements to the predictions of the T2K initial neutrino flux is described and discussed.

  1. Automatically classifying sentences in full-text biomedical articles into Introduction, Methods, Results and Discussion.

    Science.gov (United States)

    Agarwal, Shashank; Yu, Hong

    2009-12-01

    Biomedical texts can typically be represented by four rhetorical categories: Introduction, Methods, Results and Discussion (IMRAD). Classifying sentences into these categories can benefit many other text-mining tasks. Although many studies have applied different approaches to automatically classifying sentences in MEDLINE abstracts into the IMRAD categories, few have explored the classification of sentences that appear in full-text biomedical articles. We first evaluated whether sentences in full-text biomedical articles could be reliably annotated into the IMRAD format and then explored different approaches to automatically classifying these sentences into the IMRAD categories. Our results show an overall annotation agreement of 82.14% with a Kappa score of 0.756. The best classification system is a multinomial naïve Bayes classifier trained on manually annotated data that achieved 91.95% accuracy and an average F-score of 91.55%, which is significantly higher than baseline systems. A web version of this system is available online at http://wood.ims.uwm.edu/full_text_classifier/.
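
    A minimal sketch of a bag-of-words multinomial naive Bayes sentence classifier for the IMRAD categories, in the spirit of the system described. The training sentences are invented and far too few for real use; scikit-learn is assumed.

    ```python
    # Toy IMRAD sentence classifier: counts + multinomial naive Bayes.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline

    sentences = [
        "Little is known about the role of this gene in disease.",     # Introduction
        "Cells were cultured and lysates analysed by western blot.",   # Methods
        "Expression increased twofold relative to controls.",          # Results
        "These findings suggest a regulatory mechanism.",              # Discussion
    ]
    labels = ["introduction", "methods", "results", "discussion"]

    clf = make_pipeline(CountVectorizer(ngram_range=(1, 2)), MultinomialNB())
    clf.fit(sentences, labels)
    print(clf.predict(["Samples were incubated overnight at 37 degrees."]))
    ```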

  2. Some methods for blindfolded record linkage

    Directory of Open Access Journals (Sweden)

    Christen Peter

    2004-06-01

    Full Text Available Abstract Background The linkage of records which refer to the same entity in separate data collections is a common requirement in public health and biomedical research. Traditionally, record linkage techniques have required that all the identifying data in which links are sought be revealed to at least one party, often a third party. This necessarily invades personal privacy and requires complete trust in the intentions of that party and their ability to maintain security and confidentiality. Dusserre, Quantin, Bouzelat and colleagues have demonstrated that it is possible to use secure one-way hash transformations to carry out follow-up epidemiological studies without any party having to reveal identifying information about any of the subjects – a technique which we refer to as "blindfolded record linkage". A limitation of their method is that only exact comparisons of values are possible, although phonetic encoding of names and other strings can be used to allow for some types of typographical variation and data errors. Methods A method is described which permits the calculation of a general similarity measure, the n-gram score, without having to reveal the data being compared, albeit at some cost in computation and data communication. This method can be combined with public key cryptography and automatic estimation of linkage model parameters to create an overall system for blindfolded record linkage. Results The system described offers good protection against misdeeds or security failures by any one party, but remains vulnerable to collusion between or simultaneous compromise of two or more parties involved in the linkage operation. In order to reduce the likelihood of this, the use of last-minute allocation of tasks to substitutable servers is proposed. Proof-of-concept computer programmes written in the Python programming language are provided to illustrate the similarity comparison protocol. Conclusion Although the protocols described in
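
    A simplified sketch of the blindfolded-comparison idea: each custodian keyed-hashes the bigrams of an identifier with a shared secret so that a third party can compute an n-gram (Dice) similarity on the hashes without seeing the underlying values. Key management, padding and the exact protocol of the paper are omitted; only the core similarity step is shown.

    ```python
    # Keyed-hash bigram comparison; the secret key and names are placeholders.
    import hashlib, hmac

    SECRET = b"shared-linkage-key"          # agreed between data custodians, hypothetical

    def hashed_bigrams(value: str) -> set:
        s = value.lower().strip()
        grams = {s[i:i + 2] for i in range(len(s) - 1)}
        return {hmac.new(SECRET, g.encode(), hashlib.sha256).hexdigest() for g in grams}

    def dice(a: set, b: set) -> float:
        return 2 * len(a & b) / (len(a) + len(b)) if a and b else 0.0

    print(dice(hashed_bigrams("Catherine"), hashed_bigrams("Katherine")))   # high (~0.88)
    print(dice(hashed_bigrams("Catherine"), hashed_bigrams("Smith")))       # low (~0.17)
    ```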

  3. A Review of Spectral Methods for Variable Amplitude Fatigue Prediction and New Results

    Science.gov (United States)

    Larsen, Curtis E.; Irvine, Tom

    2013-01-01

    A comprehensive review of the available methods for estimating fatigue damage from variable amplitude loading is presented. The dependence of fatigue damage accumulation on power spectral density (psd) is investigated for random processes relevant to real structures such as in offshore or aerospace applications. Beginning with the Rayleigh (or narrow band) approximation, attempts at improved approximations or corrections to the Rayleigh approximation are examined by comparison to rainflow analysis of time histories simulated from psd functions representative of simple theoretical and real-world applications. Spectral methods investigated include corrections by Wirsching and Light, Ortiz and Chen, the Dirlik formula, and the Single-Moment method, among other more recently proposed methods. Good agreement is obtained between the spectral methods and the time-domain rainflow identification for most cases, with some limitations. Guidelines are given for using the several spectral methods to increase confidence in the damage estimate.
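
    The baseline for all of these corrections is the Rayleigh (narrow-band) damage estimate built from spectral moments of the psd. A minimal numerical sketch of that baseline follows; the psd shape, S-N parameters and exposure time are illustrative assumptions, not values from the review.

        # Sketch of the Rayleigh (narrow-band) fatigue damage estimate from a one-sided
        # PSD. PSD shape, S-N parameters and exposure time are illustrative assumptions.
        import numpy as np
        from scipy.integrate import trapezoid
        from scipy.special import gamma

        f = np.linspace(0.1, 50.0, 2000)             # frequency axis [Hz]
        G = 1.0 / (1.0 + ((f - 10.0) / 2.0) ** 2)    # illustrative one-sided PSD

        m0 = trapezoid(G, f)                         # spectral moments
        m2 = trapezoid(f ** 2 * G, f)
        nu0 = np.sqrt(m2 / m0)                       # expected zero-upcrossing rate [Hz]

        k, C = 3.0, 1.0e12                           # S-N curve N = C * S**(-k)
        T = 3600.0                                   # exposure time [s]

        # Narrow-band (Rayleigh) expected damage accumulated over T
        D_nb = nu0 * T / C * (np.sqrt(2.0 * m0)) ** k * gamma(1.0 + k / 2.0)
        print(f"narrow-band damage estimate: {D_nb:.3e}")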

  4. Revealing the timing of ocean stratification using remotely sensed ocean fronts

    Science.gov (United States)

    Miller, Peter I.; Loveday, Benjamin R.

    2017-10-01

    Stratification is of critical importance to the circulation, mixing and productivity of the ocean, and is expected to be modified by climate change. Stratification is also understood to affect the surface aggregation of pelagic fish and hence the foraging behaviour and distribution of their predators such as seabirds and cetaceans. Hence it would be prudent to monitor the stratification of the global ocean, though this is currently only possible using in situ sampling, profiling buoys or underwater autonomous vehicles. Earth observation (EO) sensors cannot directly detect stratification, but can observe surface features related to the presence of stratification, for example shelf-sea fronts that separate tidally-mixed water from seasonally stratified water. This paper describes a novel algorithm that accumulates evidence for stratification from a sequence of oceanic front maps, and discusses preliminary results in comparison with in situ data and simulations from 3D hydrodynamic models. In certain regions, this method can reveal the timing of the seasonal onset and breakdown of stratification.

  5. Contamination-free Ge-based graphene as revealed by graphene enhanced secondary ion mass spectrometry (GESIMS)

    Science.gov (United States)

    Michałowski, P. P.; Pasternak, I.; Strupiński, W.

    2018-01-01

    In this study, we demonstrate that graphene grown on Ge does not contain any copper contamination, and identify some of the errors affecting the accuracy of commonly used measurement methods. Indeed, one of these, the secondary ion mass spectrometry (SIMS) technique, reveals copper contamination in Ge-based graphene but does not take into account the effect of the presence of the graphene layer. We have shown that this layer increases negative ionization significantly and thus yields false results, but also that the graphene enhances the intensity of SIMS signals by about two orders of magnitude when compared with a similar graphene-free sample, enabling much better detection limits. This forms the basis of a new measurement procedure, graphene enhanced SIMS (GESIMS) (pending European patent application no. EP 16461554.4), which allows for the precise estimation of the realistic distribution of dopants and contamination in graphene. In addition, we present evidence that the GESIMS effect leads to unexpected mass interferences with double-ionized species, and that these interferences are negligible in samples without graphene. The GESIMS method also shows that graphene transferred from Cu results in increased copper contamination.

  6. On a new iterative method for solving linear systems and comparison results

    Science.gov (United States)

    Jing, Yan-Fei; Huang, Ting-Zhu

    2008-10-01

    In Ujevic [A new iterative method for solving linear systems, Appl. Math. Comput. 179 (2006) 725-730], the author obtained a new iterative method for solving linear systems, which can be considered a modification of the Gauss-Seidel method. In this paper, we show that it is a special case from the point of view of projection techniques, and we establish a different approach that is shown, both theoretically and numerically, to perform at least as well as, and in general better than, Ujevic's method. As the presented numerical examples show, in most cases the convergence rate is more than one and a half times that of Ujevic's method.
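
    For context, the classical Gauss-Seidel iteration that both methods build on can be sketched as follows; this is the generic textbook scheme, not Ujevic's modification or the projection-based variant proposed in the paper.

        # Classical Gauss-Seidel iteration for A x = b (the baseline scheme, not
        # Ujevic's modification or the projection-based variant of the paper).
        import numpy as np

        def gauss_seidel(A, b, tol=1e-10, max_iter=10_000):
            x = np.zeros_like(b, dtype=float)
            for _ in range(max_iter):
                x_old = x.copy()
                for i in range(len(b)):
                    sigma = A[i, :i] @ x[:i] + A[i, i + 1:] @ x_old[i + 1:]
                    x[i] = (b[i] - sigma) / A[i, i]
                if np.linalg.norm(x - x_old, np.inf) < tol:
                    break
            return x

        A = np.array([[4.0, -1.0, 0.0], [-1.0, 4.0, -1.0], [0.0, -1.0, 4.0]])
        b = np.array([15.0, 10.0, 10.0])
        print(gauss_seidel(A, b))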

  7. Assessment of South African uranium resources: methods and results

    International Nuclear Information System (INIS)

    Camisani-Calzolari, F.A.G.M.; De Klerk, W.J.; Van der Merwe, P.J.

    1985-01-01

    This paper deals primarily with the methods used by the Atomic Energy Corporation of South Africa in arriving at the assessment of the South African uranium resources. The Resource Evaluation Group is responsible for this task, which is carried out on a continuous basis. The evaluation is done on a property-by-property basis and relies upon data submitted to the Nuclear Development Corporation of South Africa by the various companies involved in uranium mining and prospecting in South Africa. Resources are classified into Reasonably Assured (RAR), Estimated Additional (EAR) and Speculative (SR) categories as defined by the NEA/IAEA Steering Group on Uranium Resources. Each category is divided into three cost classes, viz. resources exploitable at less than $80/kg uranium, at $80-130/kg uranium and at $130-260/kg uranium. Resources are reported in quantities of uranium metal that could be recovered after mining and metallurgical losses have been taken into consideration. Resources in the RAR and EAR categories exploitable at costs of less than $130/kg uranium are now estimated at 460 000 t uranium, which represents some 14 per cent of WOCA's (World Outside the Centrally Planned Economies Area) resources. The evaluation of a uranium venture is carried out in various steps, of which the most important, in order of implementation, are: geological interpretation, assessment of in situ resources using techniques ranging from manual contouring of values to geostatistics, feasibility studies, and estimation of recoverable resources. Because the choice of an evaluation method is, to some extent, dictated by statistical considerations, frequency distribution curves of the uranium grade variable are illustrated and discussed for characteristic deposits.

  8. Capability of crop water content for revealing variability of winter wheat grain yield and soil moisture under limited irrigation.

    Science.gov (United States)

    Zhang, Chao; Liu, Jiangui; Shang, Jiali; Cai, Huanjie

    2018-08-01

    Winter wheat (Triticum aestivum L.) is a major crop in the Guanzhong Plain, China. Understanding its water status is important for irrigation planning. A few crop water indicators, such as the leaf equivalent water thickness (EWT: g cm⁻²), leaf water content (LWC: %) and canopy water content (CWC: kg m⁻²), have been estimated using remote sensing techniques for a wide range of crops, yet their suitability and utility for revealing winter wheat growth and soil moisture status have not been well studied. To bridge this knowledge gap, field-scale irrigation experiments were conducted over two consecutive years (2014 and 2015) to investigate relationships of crop water content with soil moisture and grain yield, and to assess the performance of four spectral processing methods for retrieving these three crop water indicators. The results revealed that the water indicators were more sensitive to soil moisture variation before the jointing stage. All three water indicators were significantly correlated with soil moisture during the reviving stage, and the correlations were stronger for the leaf water indicators than for the canopy water indicator at the jointing stage. No correlation was observed after the heading stage. All three water indicators showed good capability of revealing grain yield variability at the jointing stage, with R² up to 0.89. CWC had a consistent relationship with grain yield over different growing seasons, but the performances of EWT and LWC were growing-season specific. Partial least squares regression was the most accurate method for estimating LWC (R² = 0.72; RMSE = 3.6%) and showed comparable capability for EWT and CWC. Finally, the work highlights the usefulness of crop water indicators for assessing crop growth, productivity, and soil water status and demonstrates the potential of various spectral processing methods for retrieving crop water content from canopy reflectance spectra. Copyright © 2018 Elsevier B.V. All rights reserved.
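
    The most accurate retrieval above used partial least squares regression on reflectance spectra. A generic scikit-learn sketch of that kind of model is given below; the synthetic spectra, band count and number of components are placeholder assumptions, not the study's field data or settings.

        # Generic PLS-regression sketch: predict a water indicator (e.g. LWC) from
        # reflectance spectra. The random "spectra" and responses are synthetic
        # placeholders, not the field measurements of the study.
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n_samples, n_bands = 120, 200
        X = rng.normal(size=(n_samples, n_bands))        # stand-in reflectance spectra
        y = 0.8 * X[:, 50] + 0.3 * X[:, 120] + rng.normal(scale=0.2, size=n_samples)

        pls = PLSRegression(n_components=5)
        print("cross-validated R^2:", cross_val_score(pls, X, y, cv=5, scoring="r2").mean())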

  9. Improvement of human cell line activation test (h-CLAT) using short-time exposure methods for prevention of false-negative results.

    Science.gov (United States)

    Narita, Kazuto; Ishii, Yuuki; Vo, Phuc Thi Hong; Nakagawa, Fumiko; Ogata, Shinichi; Yamashita, Kunihiko; Kojima, Hajime; Itagaki, Hiroshi

    2018-01-01

    Recently, animal testing has been affected by increasing ethical, social, and political concerns regarding animal welfare. Several in vitro safety tests for evaluating skin sensitization, such as the human cell line activation test (h-CLAT), have been proposed. However, similar to other tests, the h-CLAT has produced false-negative results, including in tests for acid anhydride and water-insoluble chemicals. In a previous study, we demonstrated that the cause of false-negative results from phthalic anhydride was hydrolysis by an aqueous vehicle, with IL-8 release from THP-1 cells, and that short-time exposure to liquid paraffin (LP) dispersion medium could reduce false-negative results from acid anhydrides. In the present study, we modified the h-CLAT by applying this exposure method. We found that the modified h-CLAT is a promising method for reducing false-negative results obtained from acid anhydrides and chemicals with octanol-water partition coefficients (log Kow) greater than 3.5. Based on the outcomes from the present study, a combination of the original and the modified h-CLAT is suggested for reducing false-negative results. Notably, the combination method provided a sensitivity of 95% (overall chemicals) or 93% (chemicals with log Kow > 2.0), and an accuracy of 88% (overall chemicals) or 81% (chemicals with log Kow > 2.0). We found that the combined method is a promising evaluation scheme for reducing false-negative results seen in existing in vitro skin-sensitization tests. In the future, we expect a combination of the original and modified h-CLAT to be applied in a newly developed in vitro test for evaluating skin sensitization.

  10. Experimental single-strain mobilomics reveals events that shape pathogen emergence.

    Science.gov (United States)

    Schoeniger, Joseph S; Hudson, Corey M; Bent, Zachary W; Sinha, Anupama; Williams, Kelly P

    2016-08-19

    Virulence genes on mobile DNAs such as genomic islands (GIs) and plasmids promote bacterial pathogen emergence. Excision is an early step in GI mobilization, producing a circular GI and a deletion site in the chromosome; circular forms are also known for some bacterial insertion sequences (ISs). The recombinant sequence at the junctions of such circles and deletions can be detected sensitively in high-throughput sequencing data, using new computational methods that enable empirical discovery of mobile DNAs. For the rich mobilome of a hospital Klebsiella pneumoniae strain, circularization junctions (CJs) were detected for six GIs and seven IS types. Our methods revealed differential biology of multiple mobile DNAs, imprecision of integrases and transposases, and differential activity among identical IS copies for IS26, ISKpn18 and ISKpn21. Using the resistance of circular dsDNA molecules to exonuclease, internally calibrated with the native plasmids, showed that not all molecules bearing GI CJs were circular. Transpositions were also detected, revealing replicon preference (ISKpn18 prefers a conjugative IncA/C2 plasmid), local action (IS26), regional preferences, selection (against capsule synthesis) and IS polarity inversion. Efficient discovery and global characterization of numerous mobile elements per experiment improves accounting for the new gene combinations that arise in emerging pathogens. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  11. Preparation of red phosphor (Y, Gd)BO3:Eu by soft chemistry methods

    International Nuclear Information System (INIS)

    Cui Xiangzhong; Zhuang Weidong; Yu Zhijian; Xia Tian; Huang Xiaowei; Li Hongwei

    2008-01-01

    Three soft chemistry methods were employed to prepare the red phosphor (Y, Gd)BO3:Eu, namely the coprecipitation-combustion method, the salt-assisted combustion method and the emulsion method. The main factors affecting the particle size, particle distribution and luminescent properties of the product were investigated in detail, and as a result the preparation processes were optimized. The phosphors were characterized by X-ray diffraction (XRD), scanning electron microscopy (SEM), transmission electron microscopy (TEM) and vacuum ultraviolet (VUV) spectra. Results reveal that phosphors with different morphologies, small particle size and high luminescence intensity could be obtained by soft chemistry methods. The difference between the luminescence properties of the phosphors in this work and a commercial rare earth borate phosphor is discussed. Granular phosphor with high luminescence intensity could be prepared by the coprecipitation-combustion method, nanophosphor by the salt-assisted combustion method, and spherical phosphor with a narrow size distribution by the emulsion method.

  12. Intratumor Heterogeneity and Branched Evolution Revealed by Multiregion Sequencing

    DEFF Research Database (Denmark)

    Gerlinger, Marco; Rowan, Andrew J.; Horswell, Stuart

    2012-01-01

    RESULTS: Phylogenetic reconstruction revealed branched evolutionary tumor growth, with 63 to 69% of all somatic mutations not detectable across every tumor region. Intratumor heterogeneity was observed for a mutation within an autoinhibitory domain of the mammalian target of rapamycin (mTOR) kinase, correlating with S6...

  13. Resultant geometric variation of a fixtured workpiece Part I: a simulation

    Directory of Open Access Journals (Sweden)

    Supapan Sangnui Chaiprapat

    2006-01-01

    Full Text Available When a workpiece is fixtured for a machining or inspection operation, the accuracy of the operation is mainly determined by the efficiency of the fixturing method. Variability in manufactured workpieces is practically unavoidable. When such variability is found at contact areas between the workpiece and the fixture, errors in location are expected. These errors will affect the quality of the features to be produced. This paper develops an algorithm to determine the variant final locations of a displaced workpiece given normally distributed errors at contact points. The resultant geometric variation of workpiece location reveals interesting information which is beneficial in tolerance planning.

  14. Implantable central venous chemoport: comparison of results according to approach routes and methods

    International Nuclear Information System (INIS)

    Shin, Byung Suck; Ahn, Moon Sang

    2003-01-01

    To evaluate the results and complications of placement of implantable ports according to approach routes and methods. Between April 2001 and October 2002, a total of 103 implantable chemoports were placed in 95 patients for chemotherapy, using the preconnected type (n=39) and the attachable type (n=64). Puncture sites were the left subclavian vein (n=35), right subclavian vein (n=5), left internal jugular vein (n=9), and right internal jugular vein (n=54). We evaluated the duration of catheterization and the complications according to approach routes and methods. The implantable chemoport was placed successfully in all cases. Duration of catheterization ranged from 8 to 554 days (mean 159, total 17,872 catheter days). Procedure-related complications were transient pulmonary air embolism (n=1), small hematoma (n=1) and malposition with the preconnected type (n=2). Late complications were catheter migration (n=5), catheter malfunction (n=3), occlusion (n=1) and infection (n=11). Among these, 15 chemoports were removed (14.5%). Catheter migration occurred via the subclavian vein in all cases (13%, p=.008). Infection developed in 10.7% of patients (0.61 per 1000 catheter days). There was no catheter-related central vein thrombosis. Implantation of a chemoport is a safe procedure. Choosing the right internal jugular vein rather than the subclavian vein as the puncture site results in fewer complications, and the attachable type of chemoport is more convenient than the preconnected type. Adequate care of the chemoport is essential for long patency.

  15. An Investigation into Novice English Teachers’ Beliefs about Method and Post-method Pedagogy in Turkish EFL Context

    Directory of Open Access Journals (Sweden)

    Mustafa Tekin

    2013-04-01

    Full Text Available This study, which has a qualitative research design, reports on the views and beliefs of eleven novice English as a foreign language (EFL) teachers about English language teaching (ELT) methods, for the purpose of examining their knowledge about and attitudes towards popular methods and post-method pedagogy, as well as towards current discussions in ELT and the effects of these on their reported classroom practices. In this respect, the novice teachers were interviewed by means of the video conferencing feature of Windows Live Messenger (currently Skype) about their views and beliefs related to method vs. post-method discussions as well as their current teaching practices. The results revealed a discrepancy between the participants’ views and their classroom practices. In fact, the majority of the participants reported a negative change in their attitudes towards teaching after they started teaching. The majority of the eleven participants were totally unaware of the post-method discussions. In the final section of the paper, the reasons for these findings are discussed in detail, and further suggestions are made in an attempt to find solutions to some of the problems reported by the participating novice teachers.

  16. Visual Literacy and Science Education: Results of a Qualitative Research Project

    Directory of Open Access Journals (Sweden)

    Regula Fankhauser

    2008-10-01

    Full Text Available In the didactics of science the role of pictures—mainly photographs and diagrams—as learning media and their function in the acquisition of knowledge have been discussed. However, the specific problems of understanding pictures have seldom been reflected on systematically. The aim of the project described in this paper was to address this deficiency. In a first step I refer to theoretical concepts of understanding pictures that were generated within the context of qualitative social research. Next I generate a theoretical model of visual literacy. The focus is on the understanding of pictures used in science education. The model includes aesthetic, epistemological, technical, and pragmatic dimensions. This model was then empirically tested. Thirty-five students were interviewed regarding their reception of scientific pictures. The results reveal that students have difficulties in describing the aesthetic features of pictures. The interviews clarified the epistemological frame theory on which picture understanding is based: most of the students consider the picture as a realistic copy of the object represented. Only a few students showed a more constructivist frame theory. Furthermore, the results revealed no connection between the epistemological theory and the technical knowledge of the students. The discussion of the design and the method of interpretation reflects the results of the study; the students' patterns of picture understanding are surprisingly homogeneous. On the one hand, this could be attributed to the method of content analysis; on the other hand, it could be an effect of the one-sided view taken in the design. I explored only the subjective reception of pictures. Further research must consider other perspectives and focus on the way teachers work with visual material in classroom teaching. URN: urn:nbn:de:0114-fqs090129

  17. Novel spectroscopic methods for determination of Cromolyn sodium and Oxymetazoline hydrochloride in binary mixture

    Science.gov (United States)

    Abdel-Aziz, Omar; El-Kosasy, A. M.; Magdy, N.; El Zahar, N. M.

    2014-10-01

    New accurate, sensitive and selective spectrophotometric and spectrofluorimetric methods were developed and subsequently validated for the determination of Cromolyn sodium (CS) and Oxymetazoline HCl (OXY) in a binary mixture. These methods include the H-point standard addition method (HPSAM) and the area under the curve (AUC) spectrophotometric methods, and a first-derivative synchronous fluorescence spectroscopic (FDSFS) method. For the spectrophotometric methods, absorbances were recorded at 241.5 nm and 274.9 nm for HPSAM, and the wavelength ranges 232.0-254.0 nm and 216.0-229.0 nm were selected for the AUC method, where the concentration was obtained by applying Cramer's rule. For the FDSFS method, the first-derivative synchronous fluorescence signal was measured at 290.0 nm, using Δλ = 145.0 nm. The suggested methods were validated according to International Conference on Harmonisation (ICH) guidelines and the results revealed that they were precise and reproducible. All the obtained results were statistically compared with those of the reported method and there was no significant difference.

  18. Revealing Rembrandt

    Directory of Open Access Journals (Sweden)

    Andrew J Parker

    2014-04-01

    Full Text Available The power and significance of artwork in shaping human cognition is self-evident. The starting point for our empirical investigations is the view that the task of neuroscience is to integrate itself with other forms of knowledge, rather than to seek to supplant them. In our recent work, we examined a particular aspect of the appreciation of artwork using present-day functional magnetic resonance imaging (fMRI). Our results emphasised the continuity between viewing artwork and other human cognitive activities. We also showed that appreciation of a particular aspect of artwork, namely authenticity, depends upon the co-ordinated activity between the brain regions involved in multiple decision making and those responsible for processing visual information. The findings about brain function probably have no specific consequences for understanding how people respond to the art of Rembrandt in comparison with their response to other artworks. However, the use of images of Rembrandt’s portraits, his most intimate and personal works, clearly had a significant impact upon our viewers, even though they were spatially confined to the interior of an MRI scanner at the time of viewing. Neuroscientific studies of humans viewing artwork have the capacity to reveal the diversity of human cognitive responses that may be induced by external advice or context as people view artwork in a variety of frameworks and settings.

  19. New Hybrid Features Selection Method: A Case Study on Websites Phishing

    Directory of Open Access Journals (Sweden)

    Khairan D. Rajab

    2017-01-01

    Full Text Available Phishing is one of the serious web threats that involves mimicking authenticated websites to deceive users in order to obtain their financial information. Phishing has caused financial damage to the different online stakeholders, on the order of hundreds of millions of dollars; hence it is essential to minimize this risk. Classifying websites into “phishy” and legitimate types is a primary task in data mining that security experts and decision makers are hoping to improve, particularly with respect to the detection rate and the reliability of the results. One way to ensure the reliability of the results and to enhance performance is to identify a set of related features early on, so that the data dimensionality is reduced and irrelevant features are discarded. To increase the reliability of preprocessing, this article proposes a new feature selection method that combines the scores of multiple known methods to minimize discrepancies in feature selection results. The proposed method has been applied to the problem of website phishing classification to show its pros and cons in identifying relevant features. Results against a security dataset reveal that the proposed preprocessing method was able to derive new feature datasets which, when mined, generate highly competitive classifiers with respect to detection rate when compared to results obtained from other feature selection methods.
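
    One generic way to combine the scores of several feature-selection methods is to average their per-method rankings and keep the best-ranked features. The sketch below illustrates that idea only; the averaging rule, the chosen score functions and the synthetic data are assumptions, not the article's exact scheme or its phishing dataset.

        # Illustrative rank-combination of several feature-selection scores on a
        # synthetic dataset. The averaging-of-ranks rule is a generic stand-in,
        # not the exact combination formula proposed in the article.
        import numpy as np
        from scipy.stats import rankdata
        from sklearn.datasets import make_classification
        from sklearn.feature_selection import chi2, f_classif, mutual_info_classif
        from sklearn.preprocessing import MinMaxScaler

        X, y = make_classification(n_samples=300, n_features=20, random_state=0)
        X = MinMaxScaler().fit_transform(X)      # chi2 needs non-negative features

        scores = [chi2(X, y)[0], f_classif(X, y)[0],
                  mutual_info_classif(X, y, random_state=0)]
        mean_rank = np.mean([rankdata(-s) for s in scores], axis=0)  # lower = more relevant
        print("top 5 features:", np.argsort(mean_rank)[:5])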

  20. Intuitionistic fuzzy based DEMATEL method for developing green practices and performances in a green supply chain

    DEFF Research Database (Denmark)

    Govindan, Kannan; Khodaverdi, Roohollah; Vafadarnikjoo, Amin

    2015-01-01

    for organizations to enhance their environmental performance and achieve competitive advantages. This study pioneers using the decision-making trial and evaluation laboratory (DEMATEL) method with intuitionistic fuzzy sets to handle the important and causal relationships between GSCM practices and performances...... to evaluate the efficiency of the proposed method. The results reveal "internal management support", "green purchasing" and "ISO 14001 certification" are the most significant GSCM practices. The practical results of this study offer useful insights for managers to become more environmentally responsible...

  1. A novel method for measuring cellular antibody uptake using imaging flow cytometry reveals distinct uptake rates for two different monoclonal antibodies targeting L1.

    Science.gov (United States)

    Hazin, John; Moldenhauer, Gerhard; Altevogt, Peter; Brady, Nathan R

    2015-08-01

    Monoclonal antibodies (mAbs) have emerged as a promising tool for cancer therapy. Differing approaches utilize mAbs to either deliver a drug to the tumor cells or to modulate the host's immune system to mediate tumor kill. The rate at which a therapeutic antibody is internalized by tumor cells is a decisive feature for choosing the appropriate treatment strategy. We herein present a novel method to effectively quantitate antibody uptake of tumor cells by using image-based flow cytometry, which combines image analysis with high throughput of sample numbers and sample size. The use of this method is established by determining the uptake rate of an anti-EpCAM antibody (HEA125) from single-cell measurements of plasma membrane versus internalized antibody, in conjunction with inhibitors of endocytosis. The method is then applied to two mAbs (L1-9.3, L1-OV52.24) targeting the neural cell adhesion molecule L1 (L1CAM) at two different epitopes. Based on median cell population responses, we find that mAb L1-OV52.24 is rapidly internalized by the ovarian carcinoma cell line SKOV3ip while L1 mAb 9.3 is mainly retained at the cell surface. These findings suggest the L1 mAb OV52.24 as a candidate to be further developed for drug delivery to cancer cells, while L1-9.3 may be optimized to tag the tumor cells and stimulate immunogenic cancer cell killing. Furthermore, when analyzing cell-to-cell variability, we observed that cells treated with L1 mAb OV52.24 rapidly transitioned into a subpopulation with high internalization capacity. In summary, this novel high-content method for measuring antibody internalization rate provides a high level of accuracy and sensitivity for cell population measurements and reveals further biologically relevant information when taking into account cellular heterogeneity. Copyright © 2015 Elsevier B.V. All rights reserved.

  2. Effect of sample stratification on dairy GWAS results

    Directory of Open Access Journals (Sweden)

    Ma Li

    2012-10-01

    Full Text Available Abstract Background Artificial insemination and genetic selection are major factors contributing to population stratification in dairy cattle. In this study, we analyzed the effect of sample stratification and the effect of stratification correction on results of a dairy genome-wide association study (GWAS). Three methods for stratification correction were used: the efficient mixed-model association expedited (EMMAX) method accounting for correlation among all individuals, a generalized least squares (GLS) method based on half-sib intraclass correlation, and a principal component analysis (PCA) approach. Results Historical pedigree data revealed that the 1,654 contemporary cows in the GWAS were all related when traced through approximately 10–15 generations of ancestors. Genome and phenotype stratifications had a striking overlap with the half-sib structure. A large elite half-sib family of cows contributed to the detection of favorable alleles that had low frequencies in the general population and high frequencies in the elite cows and contributed to the detection of X chromosome effects. All three methods for stratification correction reduced the number of significant effects. The EMMAX method had the most severe reduction in the number of significant effects, and the PCA method using 20 principal components and GLS had similar significance levels. Removal of the elite cows from the analysis without using stratification correction removed many effects that were also removed by the three methods for stratification correction, indicating that stratification correction could have removed some true effects due to the elite cows. SNP effects with good consensus between different methods and effect size distributions from USDA’s Holstein genomic evaluation included the DGAT1-NIBP region of BTA14 for production traits, a SNP 45 kb upstream from PIGY on BTA6 and two SNPs in NIBP on BTA14 for protein percentage. However, most of these consensus effects had
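
    Of the three corrections compared, the PCA approach is the most straightforward to illustrate: the top principal components of the genotype matrix are included as covariates when each SNP is tested. The sketch below uses simulated genotypes and phenotypes and an ordinary least squares test; these are illustrative assumptions, not the dairy data or the exact model used in the study.

        # Rough sketch of PCA-based stratification correction in single-SNP tests:
        # regress the phenotype on the SNP plus the top principal components.
        # Genotypes and phenotypes are simulated placeholders, not the cattle data.
        import numpy as np
        import statsmodels.api as sm
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(1)
        n, m = 500, 1000
        G = rng.integers(0, 3, size=(n, m)).astype(float)    # 0/1/2 genotype matrix
        y = 0.5 * G[:, 0] + rng.normal(size=n)               # phenotype, one causal SNP

        pcs = PCA(n_components=20).fit_transform(G - G.mean(axis=0))

        snp = 0
        X = sm.add_constant(np.column_stack([G[:, snp], pcs]))  # SNP + 20 PCs as covariates
        fit = sm.OLS(y, X).fit()
        print("SNP p-value adjusted for population structure:", fit.pvalues[1])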

  3. Compartmentation of glycogen metabolism revealed from 13C isotopologue distributions

    Directory of Open Access Journals (Sweden)

    Marin de Mas Igor

    2011-10-01

    Full Text Available Abstract Background Stable isotope tracers are used to assess metabolic flux profiles in living cells. The existing methods of measurement average out the isotopic isomer distribution in metabolites throughout the cell, whereas the knowledge of compartmental organization of analyzed pathways is crucial for the evaluation of true fluxes. That is why we accepted the challenge of creating a software tool that allows deciphering the compartmentation of metabolites based on the analysis of average isotopic isomer distribution. Results The software Isodyn, which simulates the dynamics of isotopic isomer distribution in central metabolic pathways, was supplemented by algorithms facilitating the transition between various analyzed metabolic schemes, and by the tools for model discrimination. It simulated 13C isotope distributions in glucose, lactate, glutamate and glycogen, measured by mass spectrometry after incubation of hepatocytes in the presence of only labeled glucose or glucose and lactate together (with label either in glucose or lactate). The simulations assumed either a single intracellular hexose phosphate pool, or also channeling of hexose phosphates resulting in a different isotopic composition of glycogen. A model discrimination test was applied to check the consistency of both models with experimental data. Metabolic flux profiles, evaluated with the accepted model that assumes channeling, revealed the range of changes in metabolic fluxes in liver cells. Conclusions The analysis of compartmentation of metabolic networks based on the measured 13C distribution was included in Isodyn as a routine procedure. The advantage of this implementation is that, being a part of evaluation of metabolic fluxes, it does not require additional experiments to study metabolic compartmentation. The analysis of experimental data revealed that the distribution of measured 13C-labeled glucose metabolites is inconsistent with the idea of perfect mixing of hexose

  4. Master stability functions reveal diffusion-driven pattern formation in networks

    Science.gov (United States)

    Brechtel, Andreas; Gramlich, Philipp; Ritterskamp, Daniel; Drossel, Barbara; Gross, Thilo

    2018-03-01

    We study diffusion-driven pattern formation in networks of networks, a class of multilayer systems, where different layers have the same topology, but different internal dynamics. Agents are assumed to disperse within a layer by undergoing random walks, while they can be created or destroyed by reactions within or between layers. We show that the stability of homogeneous steady states can be analyzed with a master stability function approach that reveals a deep analogy between pattern formation in networks and pattern formation in continuous space. For illustration, we consider a generalized model of ecological meta-food webs. This fairly complex model describes the dispersal of many different species across a region consisting of a network of individual habitats while subject to realistic, nonlinear predator-prey interactions. In this example, the method reveals the intricate dependence of the dynamics on the spatial structure. The ability of the proposed approach to deal with this fairly complex system highlights it as a promising tool for ecology and other applications.
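
    In this setting the linear stability test reduces to checking, for every Laplacian eigenvalue of the network, whether the local Jacobian shifted by that eigenvalue times the diffusion matrix has an eigenvalue with positive real part. The sketch below runs that generic test on an arbitrary two-species Jacobian, diffusion matrix and random regular graph; these are illustrative assumptions, not the meta-food-web model of the paper.

        # Generic diffusion-driven instability test on a network: for each Laplacian
        # eigenvalue, check whether J - lam * D has an eigenvalue with positive real
        # part. J, D and the graph are arbitrary illustrative choices, not the
        # meta-food-web model of the paper.
        import numpy as np
        import networkx as nx

        G = nx.random_regular_graph(4, 30, seed=0)
        L = nx.laplacian_matrix(G).toarray().astype(float)
        lams = np.linalg.eigvalsh(L)                  # Laplacian eigenvalues (>= 0)

        J = np.array([[1.0, -2.0], [1.0, -1.5]])      # local Jacobian (stable at lam = 0)
        D = np.diag([0.05, 1.0])                      # per-species diffusion constants

        def growth_rate(lam):
            """Largest linear growth rate of the mode with Laplacian eigenvalue lam."""
            return np.linalg.eigvals(J - lam * D).real.max()

        rates = np.array([growth_rate(l) for l in lams])
        print("unstable modes:", int(np.sum(rates > 0)), "of", len(lams))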

  5. REVEAL II: Seasonality and spatial variability of particle and visibility conditions in the Fraser Valley

    DEFF Research Database (Denmark)

    Pryor, S.C.; Barthelmie, R.J.

    2000-01-01

    This paper presents data collected during a year-long field experiment (REVEAL II) in the Fraser Valley, British Columbia. The data are used to provide information regarding ambient visibility conditions and fine particle concentrations in the valley. Although average fine mass measured during RE...... taken at a number of sites during REVEAL II are used to evaluate a simple method for obtaining (classed) quantitative estimates of visual range from this medium without requiring access to specialized instrumentation. (C) 2000 Elsevier Science B.V. All rights reserved....

  6. Peptidomic analysis reveals proteolytic activity of kefir microorganisms on bovine milk proteins

    Science.gov (United States)

    Dallas, David C.; Citerne, Florine; Tian, Tian; Silva, Vitor L. M.; Kalanetra, Karen M.; Frese, Steven A.; Robinson, Randall C.; Mills, David A.; Barile, Daniela

    2015-01-01

    Scope The microorganisms that make up kefir grains are well known for lactose fermentation, but the extent to which they hydrolyze and consume milk proteins remains poorly understood. Peptidomics technologies were used to examine the proteolytic activity of kefir grains on bovine milk proteins. Methods and results Gel electrophoresis revealed substantial digestion of milk proteins by kefir grains, with mass spectrometric analysis showing the release of 609 protein fragments and alteration of the abundance of >1,500 peptides that derived from 27 milk proteins. Kefir contained 25 peptides identified from the literature as having biological activity, including those with antihypertensive, antimicrobial, immunomodulatory, opioid and anti-oxidative functions. 16S rRNA and shotgun metagenomic sequencing identified the principal taxa in the culture as Lactobacillus species. Conclusion The model kefir sample contained thousands of protein fragments released in part by kefir microorganisms and in part by native milk proteases. PMID:26616950

  7. Summary of EPA's risk assessment results from the analysis of alternative methods of low-level waste disposal

    International Nuclear Information System (INIS)

    Bandrowski, M.S.; Hung, C.Y.; Meyer, G.L.; Rogers, V.C.

    1987-01-01

    Evaluation of the potential health risk and individual exposure from a broad number of disposal alternatives is an important part of EPA's program to develop generally applicable environmental standards for the land disposal of low-level radioactive wastes (LLW). The Agency has completed an analysis of the potential population health risks and maximum individual exposures from ten disposal methods under three different hydrogeological and climatic settings. This paper briefly describes the general input and analysis procedures used in the risk assessment for LLW disposal and presents their preliminary results. Some important lessons learned from simulating LLW disposal under a large variety of methods and conditions are identified

  8. Do we need invasive confirmation of cardiac magnetic resonance results?

    Science.gov (United States)

    Siastała, Paweł; Kądziela, Jacek; Małek, Łukasz A; Śpiewak, Mateusz; Lech, Katarzyna; Witkowski, Adam

    2017-01-01

    Coronary artery revascularization is indicated in patients with documented significant obstruction of coronary blood flow associated with a large area of myocardial ischemia and/or untreatable symptoms. There are a few invasive or noninvasive methods that can provide information about the functional results of coronary artery narrowing. More than one method of ischemia detection is applied in the same patient, to re-evaluate the indications for revascularization, in cases of atypical or no symptoms and/or borderline stenosis. The aim of this study was to evaluate whether the results of cardiac magnetic resonance need to be reconfirmed by an invasive functional method. The hospital database revealed 25 consecutive patients with 29 stenoses who underwent cardiac magnetic resonance (CMR) and fractional flow reserve (FFR) between the end of 2010 and the end of 2014. The maximal time interval between CMR and FFR was 6 months. None of the patients experienced any clinical events or underwent procedures on coronary arteries between the studies. According to the analysis, the agreement of CMR perfusion with the FFR method was at the level of 89.7%. Assuming that FFR is the gold standard in assessing the severity of stenoses, the sensitivity of CMR perfusion was 90.9%. The percentage of non-severe lesions which were correctly identified in CMR was 88.9%. The study shows that CMR perfusion is a highly sensitive method to detect hemodynamically significant CAD and exclude non-severe lesions. With FFR as the reference standard, the diagnostic accuracy of MR perfusion to detect ischemic CAD is high.

  9. Adaptation to High Ethanol Reveals Complex Evolutionary Pathways.

    Directory of Open Access Journals (Sweden)

    Karin Voordeckers

    2015-11-01

    Full Text Available Tolerance to high levels of ethanol is an ecologically and industrially relevant phenotype of microbes, but the molecular mechanisms underlying this complex trait remain largely unknown. Here, we use long-term experimental evolution of isogenic yeast populations of different initial ploidy to study adaptation to increasing levels of ethanol. Whole-genome sequencing of more than 30 evolved populations and over 100 adapted clones isolated throughout this two-year evolution experiment revealed how a complex interplay of de novo single nucleotide mutations, copy number variation, ploidy changes, mutator phenotypes, and clonal interference led to a significant increase in ethanol tolerance. Although the specific mutations differ between different evolved lineages, application of a novel computational pipeline, PheNetic, revealed that many mutations target functional modules involved in stress response, cell cycle regulation, DNA repair and respiration. Measuring the fitness effects of selected mutations introduced in non-evolved ethanol-sensitive cells revealed several adaptive mutations that had previously not been implicated in ethanol tolerance, including mutations in PRT1, VPS70 and MEX67. Interestingly, variation in VPS70 was recently identified as a QTL for ethanol tolerance in an industrial bio-ethanol strain. Taken together, our results show how, in contrast to adaptation to some other stresses, adaptation to a continuous complex and severe stress involves interplay of different evolutionary mechanisms. In addition, our study reveals functional modules involved in ethanol resistance and identifies several mutations that could help to improve the ethanol tolerance of industrial yeasts.

  10. Revealing conceptual understanding of international business

    NARCIS (Netherlands)

    Sue Ashley; Dr. Harmen Schaap; Prof.Dr. Elly de Bruijn

    2017-01-01

    This study aims to identify an adequate approach for revealing conceptual understanding in higher professional education. Revealing students’ conceptual understanding is an important step towards developing effective curricula, assessment and aligned teaching strategies to enhance conceptual

  11. Effectiveness of various innovative learning methods in health science classrooms: a meta-analysis.

    Science.gov (United States)

    Kalaian, Sema A; Kasim, Rafa M

    2017-12-01

    This study reports the results of a meta-analysis of the available literature on the effectiveness of various forms of innovative small-group learning methods on student achievement in undergraduate college health science classrooms. The results of the analysis revealed that most of the primary studies supported the effectiveness of the small-group learning methods in improving students' academic achievement, with an overall weighted average effect size of 0.59 in standard deviation units favoring small-group learning methods. The subgroup analysis showed that the various forms of innovative and reform-based small-group learning interventions appeared to be significantly more effective for students in higher levels of college classes (sophomore, junior, and senior levels), students in other countries (non-U.S.) worldwide, students in groups of four or fewer, and students who choose their own group. The random-effects meta-regression results revealed that the effect sizes were influenced significantly by the instructional duration of the primary studies. This means that studies with longer hours of instruction yielded higher effect sizes; on average, every 1-h increase in instruction predicted an increase in effect size of 0.009 standard deviation units, which is considered a small effect. These results may help health science and nursing educators by providing guidance in identifying the conditions under which various forms of innovative small-group learning pedagogies are collectively more effective than traditional lecture-based teaching instruction.
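
    The pooled estimate quoted above (0.59 standard deviation units) is a weighted average of per-study effect sizes. As a generic illustration of that kind of pooling, the sketch below applies simple inverse-variance (fixed-effect) weighting to made-up study values; it is not the meta-analysis model or the data used in the study.

        # Generic inverse-variance (fixed-effect) pooling of standardized mean
        # differences. Effect sizes and variances are made up for illustration;
        # they are not the primary studies from this meta-analysis.
        import numpy as np

        d = np.array([0.45, 0.80, 0.30, 0.62, 0.55])   # per-study effect sizes (SD units)
        v = np.array([0.02, 0.05, 0.03, 0.04, 0.02])   # per-study sampling variances

        w = 1.0 / v                                    # inverse-variance weights
        pooled = np.sum(w * d) / np.sum(w)
        se = np.sqrt(1.0 / np.sum(w))
        print(f"pooled effect size: {pooled:.2f} (95% CI +/- {1.96 * se:.2f})")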

  12. Lesion insertion in the projection domain: Methods and initial results.

    Science.gov (United States)

    Chen, Baiyu; Leng, Shuai; Yu, Lifeng; Yu, Zhicong; Ma, Chi; McCollough, Cynthia

    2015-12-01

    phantom in terms of Hounsfield unit and high-contrast resolution. For the validation of the lesion realism, lesions of various types were successfully inserted, including well circumscribed and invasive lesions, homogeneous and heterogeneous lesions, high-contrast and low-contrast lesions, isolated and vessel-attached lesions, and small and large lesions. The two experienced radiologists who reviewed the original and inserted lesions could not identify the lesions that were inserted. The same lesion, when inserted into the projection domain and reconstructed with different parameters, demonstrated a parameter-dependent appearance. A framework has been developed for projection-domain insertion of lesions into commercial CT images, which can be potentially expanded to all geometries of CT scanners. Compared to conventional image-domain methods, the authors' method reflected the impact of scan and reconstruction parameters on lesion appearance. Compared to prior projection-domain methods, the authors' method has the potential to achieve higher anatomical complexity by employing clinical patient projections and real patient lesions.

  13. Mathematics revealed

    CERN Document Server

    Berman, Elizabeth

    1979-01-01

    Mathematics Revealed focuses on the principles, processes, operations, and exercises in mathematics.The book first offers information on whole numbers, fractions, and decimals and percents. Discussions focus on measuring length, percent, decimals, numbers as products, addition and subtraction of fractions, mixed numbers and ratios, division of fractions, addition, subtraction, multiplication, and division. The text then examines positive and negative numbers and powers and computation. Topics include division and averages, multiplication, ratios, and measurements, scientific notation and estim

  14. Dysconnection topography in schizophrenia revealed with state-space analysis of EEG.

    Science.gov (United States)

    Jalili, Mahdi; Lavoie, Suzie; Deppen, Patricia; Meuli, Reto; Do, Kim Q; Cuénod, Michel; Hasler, Martin; De Feo, Oscar; Knyazeva, Maria G

    2007-10-24

    The dysconnection hypothesis has been proposed to account for pathophysiological mechanisms underlying schizophrenia. Widespread structural changes suggesting abnormal connectivity in schizophrenia have been imaged. A functional counterpart of the structural maps would be the EEG synchronization maps. However, due to the limits of currently used bivariate methods, functional correlates of dysconnection are limited to the isolated measurements of synchronization between preselected pairs of EEG signals. To reveal a whole-head synchronization topography in schizophrenia, we applied a new method of multivariate synchronization analysis called S-estimator to the resting dense-array (128 channels) EEG obtained from 14 patients and 14 controls. This method determines synchronization from the embedding dimension in a state-space domain based on the theoretical consequence of the cooperative behavior of simultaneous time series-the shrinking of the state-space embedding dimension. The S-estimator imaging revealed a specific synchronization landscape in schizophrenia patients. Its main features included bilaterally increased synchronization over temporal brain regions and decreased synchronization over the postcentral/parietal region neighboring the midline. The synchronization topography was stable over the course of several months and correlated with the severity of schizophrenia symptoms. In particular, direct correlations linked positive, negative, and general psychopathological symptoms to the hyper-synchronized temporal clusters over both hemispheres. Along with these correlations, general psychopathological symptoms inversely correlated within the hypo-synchronized postcentral midline region. While being similar to the structural maps of cortical changes in schizophrenia, the S-maps go beyond the topography limits, demonstrating a novel aspect of the abnormalities of functional cooperation: namely, regionally reduced or enhanced connectivity. The new method of

  15. Dysconnection topography in schizophrenia revealed with state-space analysis of EEG.

    Directory of Open Access Journals (Sweden)

    Mahdi Jalili

    2007-10-01

    Full Text Available The dysconnection hypothesis has been proposed to account for pathophysiological mechanisms underlying schizophrenia. Widespread structural changes suggesting abnormal connectivity in schizophrenia have been imaged. A functional counterpart of the structural maps would be the EEG synchronization maps. However, due to the limits of currently used bivariate methods, functional correlates of dysconnection are limited to the isolated measurements of synchronization between preselected pairs of EEG signals. To reveal a whole-head synchronization topography in schizophrenia, we applied a new method of multivariate synchronization analysis called S-estimator to the resting dense-array (128 channels) EEG obtained from 14 patients and 14 controls. This method determines synchronization from the embedding dimension in a state-space domain based on the theoretical consequence of the cooperative behavior of simultaneous time series-the shrinking of the state-space embedding dimension. The S-estimator imaging revealed a specific synchronization landscape in schizophrenia patients. Its main features included bilaterally increased synchronization over temporal brain regions and decreased synchronization over the postcentral/parietal region neighboring the midline. The synchronization topography was stable over the course of several months and correlated with the severity of schizophrenia symptoms. In particular, direct correlations linked positive, negative, and general psychopathological symptoms to the hyper-synchronized temporal clusters over both hemispheres. Along with these correlations, general psychopathological symptoms inversely correlated within the hypo-synchronized postcentral midline region. While being similar to the structural maps of cortical changes in schizophrenia, the S-maps go beyond the topography limits, demonstrating a novel aspect of the abnormalities of functional cooperation: namely, regionally reduced or enhanced connectivity. The new

  16. Air sampling methods to evaluate microbial contamination in operating theatres: results of a comparative study in an orthopaedics department.

    Science.gov (United States)

    Napoli, C; Tafuri, S; Montenegro, L; Cassano, M; Notarnicola, A; Lattarulo, S; Montagna, M T; Moretti, B

    2012-02-01

    To evaluate the level of microbial contamination of air in operating theatres using active [i.e. surface air system (SAS)] and passive [i.e. index of microbial air contamination (IMA) and nitrocellulose membranes positioned near the wound] sampling systems. Sampling was performed between January 2010 and January 2011 in the operating theatre of the orthopaedics department in a university hospital in Southern Italy. During surgery, the mean bacterial loads recorded were 2232.9 colony-forming units (cfu)/m²/h with the IMA method, 123.2 cfu/m³ with the SAS method and 2768.2 cfu/m²/h with the nitrocellulose membranes. Correlation was found between the results of the three methods. Staphylococcus aureus was detected in 12 of 60 operations (20%) with the membranes, five (8.3%) operations with the SAS method, and three operations (5%) with the IMA method. Use of nitrocellulose membranes placed near a wound is a valid method for measuring the microbial contamination of air. This method was more sensitive than the IMA method and was not subject to any calibration bias, unlike active air monitoring systems. Copyright © 2011 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.
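
    The cfu/m²/h figures from the passive (settle-plate) methods are just colony counts normalized by plate area and exposure time. The sketch below shows that conversion; the 9-cm plate diameter, 1-h exposure and colony count are assumed values following the usual IMA convention, not measurements from this study.

        # Conversion behind settle-plate (passive) results reported in cfu/m^2/h.
        # The 9-cm plate diameter, 1-h exposure and colony count are assumed values.
        import math

        cfu_on_plate = 14           # illustrative colony count after incubation
        plate_diameter_m = 0.09     # 9-cm Petri dish (assumed IMA convention)
        exposure_h = 1.0            # exposure time in hours (assumed)

        plate_area_m2 = math.pi * (plate_diameter_m / 2) ** 2
        print(f"{cfu_on_plate / (plate_area_m2 * exposure_h):.0f} cfu/m^2/h")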

  17. Genetic variation in Phoca vitulina (the harbour seal) revealed by DNA fingerprinting and RAPDs

    NARCIS (Netherlands)

    Kappe, A.L.; van de Zande, L.; Vedder, E.J.; Bijlsma, R.; van Delden, Wilke

    Genetic variation in two harbour seal (Phoca vitulina) populations from the Dutch Wadden Sea and Scotland was examined by RAPD analysis and DNA fingerprinting. For comparison a population of grey seals (Halichoerus grypus) was studied. The RAPD method revealed a very low number of polymorphic bands.

  18. Whole Exome Sequencing Reveals Genetic Predisposition in a Large Family with Retinitis Pigmentosa

    Directory of Open Access Journals (Sweden)

    Juan Wu

    2014-01-01

    Full Text Available Next-generation sequencing has become more widely used to reveal genetic defects in monogenic disorders. Retinitis pigmentosa (RP), the leading cause of hereditary blindness worldwide, has been attributed to more than 67 disease-causing genes. Due to the extreme genetic heterogeneity, using general molecular screening alone is inadequate for identifying genetic predispositions in susceptible individuals. In order to identify the underlying mutation rapidly, we utilized next-generation sequencing in a four-generation Chinese family with RP. Two affected patients and an unaffected sibling were subjected to whole exome sequencing. Through bioinformatics analysis and direct sequencing confirmation, we identified a p.R135W substitution in the rhodopsin gene. The mutation was subsequently confirmed to cosegregate with the disease in the family. In this study, our results suggest that whole exome sequencing is a robust method for diagnosing familial hereditary disease.

  19. Effectiveness of various transport synthetic acceleration methods with and without GMRES

    International Nuclear Information System (INIS)

    Chang, J.H.; Adams, M.L.

    2005-01-01

    We explore the effectiveness of three types of transport synthetic acceleration (TSA) methods as stand-alone methods and as pre-conditioners within the GMRES Krylov solver. The three types are β TSA, 'stretched' TSA, and 'stretched and filtered' (SF) TSA. We analyzed the effectiveness of these algorithms using Fourier mode analysis of model two-dimensional problems with periodic boundary conditions, including problems with alternating layers of different materials. The analyses revealed that both β-TSA and stretched TSA can diverge for fairly weak heterogeneities. Performance of SF TSA, even with the optimum filtering parameter, degrades with heterogeneity. However, with GMRES, all TSA methods are convergent. SF TSA with the optimum filtering parameter was the most effective method. Numerical results support our Fourier mode analysis. (authors)
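
    The general pattern of wrapping an approximate low-order solve as a preconditioner inside GMRES can be illustrated with SciPy. In the sketch below an incomplete LU factorization of an arbitrary sparse system stands in for the TSA low-order operator; this is a generic illustration, not the transport discretization or the TSA variants analyzed in the study.

        # Generic pattern: GMRES with a cheap approximate solve as preconditioner.
        # An incomplete LU factorization of an arbitrary sparse matrix stands in for
        # the TSA low-order operator; this is not the transport code of the study.
        import numpy as np
        import scipy.sparse as sp
        import scipy.sparse.linalg as spla

        n = 200
        A = sp.diags([-1.0, 2.5, -1.0], [-1, 0, 1], shape=(n, n), format="csc")
        b = np.ones(n)

        ilu = spla.spilu(A)                                  # cheap approximate inverse
        M = spla.LinearOperator((n, n), matvec=ilu.solve)    # "low-order solve" preconditioner

        x, info = spla.gmres(A, b, M=M)
        print("info:", info, "residual norm:", np.linalg.norm(A @ x - b))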

  20. The application of isotopic dating methods for prospection and exploration of nuclear raw material

    International Nuclear Information System (INIS)

    Komlev, L.V.; Anderson, E.B.

    1977-01-01

    Among the geological and geochemical methods for prospecting and exploring for nuclear raw materials, the isotope-dating methods determine the most important search criterion - the time of ore formation. The elaboration and use of these methods in uranium-ore regions reveal a series of geochemical epochs of uranium and thorium accumulation naturally connected with the history of the geological evolution of the earth's crust. The isotope-dating methods make it possible to establish with confidence the stages of tectono-magmatic activity that resulted in the redistribution and local concentration of uranium. The wide use of isotopic methods is a necessary condition for setting reasonable directions in modern geological exploration [ru]

  1. Time-lapse imagery of Adélie penguins reveals differential winter strategies and breeding site occupation

    Science.gov (United States)

    Southwell, Colin; Emmerson, Louise; Lunn, Daniel

    2018-01-01

    Polar seabirds adopt different over-wintering strategies to survive and build condition during the critical winter period. Penguin species either reside at the colony during the winter months or migrate long distances. Tracking studies and survey methods have revealed differences in winter migration routes among penguin species and colonies, dependent on both biotic and abiotic factors present. However, scan sampling methods are rarely used to reveal non-breeding behaviors during winter and little is known about presence at the colony site over this period. Here we show that Adélie penguins on the Yalour Islands in the Western Antarctic Peninsula (WAP) are present year-round at the colony and show a mid-winter peak in abundance. We found a negative relationship between daylight hours and penguin abundance when either open water or compact ice conditions were present, suggesting that penguins return to the breeding colony when visibility is lowest for at-sea foraging and when either extremely low or high levels of sea ice exist offshore. In contrast, Adélie penguins breeding in East Antarctica were not observed at the colonies during winter, suggesting that Adélie penguins adopt differential winter strategies in the marginal ice zone on the WAP compared to those in East Antarctica. These results demonstrate that cameras can successfully monitor wildlife year-round in areas that are largely inaccessible during winter. PMID:29561876

  2. A method to estimate the ageing of a cooling tower

    International Nuclear Information System (INIS)

    Barnel, Nathalie; Courtois, Alexis; Ilie, Petre-Lazar

    2006-09-01

    This paper deals with cooling tower ageing. Our contribution is a method to determine which part of the strain measured on site we are able to predict by means of simulations. As a result, we map a gap indicator on the structure. Calculations have been performed in three configurations, and comparing the values obtained in the three cases helps to determine which lines of research are worth pursuing. Indeed, the gap indicator reveals that: - THM cannot be considered the main and only ageing mechanism when towers older than 10 years are examined; at least creep has to be taken into account too; - the gap indicator is sensitive to initial hydration conditions, so the drying process before bringing the tower into service should be estimated properly, taking into account the different construction steps; - comparing different thermal conditions reveals that meteorological conditions have a significant influence on the results, so it will be worthwhile to distinguish the sunny and the shaded parts of the tower when measurements are made; - a large part of the values obtained can be explained by construction defects, and a study of this particular problem seems essential. The four items mentioned must be considered as perspectives for improving the present simulation method. (authors)

  3. Comparison of multianalyte proficiency test results by sum of ranking differences, principal component analysis, and hierarchical cluster analysis.

    Science.gov (United States)

    Škrbić, Biljana; Héberger, Károly; Durišić-Mladenović, Nataša

    2013-10-01

    Sum of ranking differences (SRD) was applied for comparing multianalyte results obtained by several analytical methods used in one or in different laboratories, i.e., for ranking the overall performances of the methods (or laboratories) in simultaneous determination of the same set of analytes. The data sets for testing the applicability of SRD contained the results reported during one of the proficiency tests (PTs) organized by the EU Reference Laboratory for Polycyclic Aromatic Hydrocarbons (EU-RL-PAH). In this way, SRD was also tested as a discriminant method alternative to the existing average performance scores used to compare multianalyte PT results. SRD should be used along with the z scores, the most commonly used PT performance statistics. SRD was further developed to handle identical rankings (ties) among laboratories. Two benchmark concentration series were selected as reference: (a) the assigned PAH concentrations (determined precisely beforehand by the EU-RL-PAH) and (b) the averages of all individual PAH concentrations determined by each laboratory. Ranking relative to the assigned values and also to the average (or median) values pointed to the laboratories with the most extreme results and also revealed groups of laboratories with similar overall performances. SRD reveals differences between methods or laboratories even if classical test(s) cannot. The ranking was validated using comparison of ranks by random numbers (a randomization test) and using sevenfold cross-validation, which highlighted the similarities among the (methods used in) laboratories. Principal component analysis and hierarchical cluster analysis justified the findings based on SRD ranking/grouping. If the PAH concentrations are row-scaled (i.e., z scores are analyzed as input for ranking), SRD can still be used for checking the normality of errors. Moreover, cross-validation of SRD on z scores groups the laboratories similarly. The SRD technique is general in nature, i.e., it can
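
    As a rough illustration of the ranking step described above, the sketch below computes SRD scores for a few laboratories against an assigned reference series. It is a minimal sketch of the SRD idea as summarized here: all concentration values and laboratory names are made-up assumptions, and ties are handled with average ranks via SciPy's rankdata.

```python
import numpy as np
from scipy.stats import rankdata

def srd(values, reference):
    """Sum of absolute differences between a laboratory's rank ordering of the
    analytes and the rank ordering of the reference series (ties -> average ranks)."""
    return float(np.abs(rankdata(values) - rankdata(reference)).sum())

# Hypothetical assigned concentrations for five PAHs and four laboratories.
assigned = np.array([1.2, 0.8, 2.5, 3.1, 0.4])
rng = np.random.default_rng(0)
labs = {f"lab{i + 1}": assigned + rng.normal(0.0, 0.2 * (i + 1), size=assigned.size)
        for i in range(4)}

scores = {lab: srd(vals, assigned) for lab, vals in labs.items()}
for lab, score in sorted(scores.items(), key=lambda kv: kv[1]):
    print(f"{lab}: SRD = {score:.1f}")   # smaller SRD = closer to the reference ranking
```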

  4. IN-SITU MEASURING METHOD OF RADON AND THORON DIFFUSION COEFFICIENT IN SOIL

    Directory of Open Access Journals (Sweden)

    V.S. Yakovleva

    2014-06-01

    Full Text Available A simple and valid in-situ method for measuring the effective diffusion coefficient of radon and thoron in soil and other porous materials was designed. Numerical investigation of radon and thoron transport in the upper soil layers revealed that the thoron flux density from the earth's surface does not depend on the soil-gas advective velocity and varies only with changes in the diffusion coefficient. This result shows the advantage of using thoron rather than radon in the suggested method. The new method is also compared with previously developed ones. The method could be helpful for solving problems of radon mass transport in porous media and of gaseous exchange between soil and atmosphere.

  5. Preferential Interactions between ApoE-containing Lipoproteins and Aβ Revealed by a Detection Method that Combines Size Exclusion Chromatography with Non-Reducing Gel-shift

    Science.gov (United States)

    LaDu, Mary Jo; Munson, Gregory W.; Jungbauer, Lisa; Getz, Godfrey S.; Reardon, Catherine A.; Tai, Leon M.; Yu, Chunjiang

    2012-01-01

    The association between apolipoprotein E (apoE) and amyloid-β peptide (Aβ) may significantly impact the function of both proteins, thus affecting the etiology of Alzheimer’s disease (AD). However, apoE/Aβ interactions remain fundamentally defined by the stringency of the detection method. Here we use size exclusion chromatography (SEC) as a non-stringent approach to the detection of apoE/Aβ interactions in solution, specifically apoE and both endogenous and exogenous Aβ from plasma, CSF and astrocyte conditioned media. By SEC analysis, Aβ association with plasma and CNS lipoproteins is apoE-dependent. While endogenous Aβ elutes to specific human plasma lipoproteins distinct from those containing apoE, it is the apoE-containing lipoproteins that absorb excess amounts of exogenous Aβ40. In human CSF, apoE, endogenous Aβ and phospholipid elute in an almost identical profile, as do apoE, exogenous Aβ and phospholipid from astrocyte conditioned media. Combining SEC fractionation with subsequent analysis for SDS-stable apoE/Aβ complex reveals that apoE-containing astrocyte lipoproteins exhibit the most robust interactions with Aβ. Thus, standardization of the methods for detecting apoE/Aβ complex is necessary to determine its functional significance in the neuropathology characteristic of AD. Importantly, a systematic understanding of the role of apoE-containing plasma and CNS lipoproteins in Aβ homeostasis could potentially contribute to identifying a plasma biomarker currently over-looked because it has multiple components. PMID:22138302

  6. Methods for measuring shrinkage

    OpenAIRE

    Chapman, Paul; Templar, Simon

    2006-01-01

    This paper presents findings from research amongst European grocery retailers into their methods for measuring shrinkage. The findings indicate that: there is no dominant method for valuing or stating shrinkage; shrinkage in the supply chain is frequently overlooked; data is essential in pinpointing where and when loss occurs and that many retailers collect data at the stock-keeping unit (SKU) level and do so every 6 months. These findings reveal that it is difficult to benc...

  7. Rainfall assimilation in RAMS by means of the Kuo parameterisation inversion: method and preliminary results

    Science.gov (United States)

    Orlandi, A.; Ortolani, A.; Meneguzzo, F.; Levizzani, V.; Torricella, F.; Turk, F. J.

    2004-03-01

    In order to improve high-resolution forecasts, a specific method for assimilating rainfall rates into the Regional Atmospheric Modelling System model has been developed. It is based on the inversion of the Kuo convective parameterisation scheme. A nudging technique is applied to 'gently' increase with time the weight of the estimated precipitation in the assimilation process. A rough but manageable technique for separating the convective from the stratiform part of the precipitation is explained, which does not require any ancillary measurement. The method is general purpose, but it is tuned for the assimilation of geostationary satellite rainfall estimates. Preliminary results are presented and discussed, both through fully simulated experiments and through experiments assimilating real satellite-based precipitation observations. For every case study, rainfall data are computed with a rapid-update satellite precipitation estimation algorithm based on IR and MW satellite observations. This research was carried out in the framework of the EURAINSAT project (an EC research project co-funded by the Energy, Environment and Sustainable Development Programme within the topic 'Development of generic Earth observation technologies', Contract number EVG1-2000-00030).

  8. Recent adaptive events in human brain revealed by meta-analysis of positively selected genes.

    Directory of Open Access Journals (Sweden)

    Yue Huang

    Full Text Available BACKGROUND AND OBJECTIVES: Analysis of positively-selected genes can help us understand how humans evolved, especially the evolution of highly developed cognitive functions. However, previous works have reached conflicting conclusions regarding whether human neuronal genes are over-represented among genes under positive selection. METHODS AND RESULTS: We divided positively-selected genes into four groups according to the identification approaches, compiling a comprehensive list from 27 previous studies. We showed that genes that are highly expressed in the central nervous system are enriched in recent positive selection events in human history identified by intra-species genomic scans, especially in brain regions related to cognitive functions. This pattern holds when different datasets, parameters and analysis pipelines are used. Functional category enrichment analysis supported these findings, showing that synapse-related functions are enriched in genes under recent positive selection. In contrast, immune-related functions, for instance, are enriched in genes under ancient positive selection revealed by inter-species coding region comparison. We further demonstrated that most of these patterns still hold even after controlling for genomic characteristics that might bias genome-wide identification of positively-selected genes, including gene length, gene density, GC composition, and intensity of negative selection. CONCLUSION: Our rigorous analysis resolved previous conflicting conclusions and revealed recent adaptation of human brain functions.

  9. Advanced methods comparisons of reaction rates in the Purdue Fast Breeder Blanket Facility

    International Nuclear Information System (INIS)

    Hill, R.N.; Ott, K.O.

    1988-01-01

    A review of worldwide results revealed that reaction rates in the blanket region are generally underpredicted with the discrepancy increasing with penetration; however, these results vary widely. Experiments in the large uniform Purdue Fast Breeder Blanket Facility (FBBF) blanket yield an accurate quantification of this discrepancy. Using standard production code methods (diffusion theory with 50 group cross sections), a consistent Calculated/Experimental (C/E) drop-off was observed for various reaction rates. A 50% increase in the calculated results at the outer edge of the blanket is necessary for agreement with experiments. The usefulness of refined group constant generation utilizing specialized weighting spectra and transport theory methods in correcting this discrepancy was analyzed. Refined group constants reduce the discrepancy to half that observed using the standard method. The surprising result was that transport methods had no effect on the blanket deviations; thus, transport theory considerations do not constitute or even contribute to an explanation of the blanket discrepancies. The residual blanket C/E drop-off (about half the standard drop-off) using advanced methods must be caused by some approximations which are applied in all current methods. 27 refs., 3 figs., 1 tab

  10. Biopsy results of Bosniak 2F and 3 cystic lesions

    DEFF Research Database (Denmark)

    Rasmussen, René; Hørlyck, Arne; Nielsen, Tommy Kjærgaard

    be helpful in clinical decisions. Material and Methods: From March 2013 - December 2014 a total of 295 percutaneous ultrasound guided biopsies from 287 patients with a suspected malignant renal lesion were performed at our institution. All cases were reviewed in PACS by (RR) and lesions presenting...... with a cystic change were re-evaluated and re-categorized after the Bosniak classification system. The re-evaluation and re-categorization was performed in consensus by a junior radiologist (RR) and an uro-radiological expert (OG). Results: Biopsies from eighteen Bosniak 2F cystic lesions were pathologically...... analyzed and three (17%) proved to be malignant. Biopsies from seventeen Bosniak 3 cystic lesions were pathologically analyzed and five (29%) were found to be malignant. Conclusion: Our results reveal a considerable malignancy rate among both Bosniak 2F and 3 cystic renal lesions. Biopsy seems...

  11. A Computer-Supported Method to Reveal and Assess Personal Professional Theories in Vocational Education

    Science.gov (United States)

    van den Bogaart, Antoine C. M.; Bilderbeek, Richel J. C.; Schaap, Harmen; Hummel, Hans G. K.; Kirschner, Paul A.

    2016-01-01

    This article introduces a dedicated, computer-supported method to construct and formatively assess open, annotated concept maps of Personal Professional Theories (PPTs). These theories are internalised, personal bodies of formal and practical knowledge, values, norms and convictions that professionals use as a reference to interpret and acquire…

  12. A Survey of Electronic Serials Managers Reveals Diversity in Practice

    Directory of Open Access Journals (Sweden)

    Laura Costello

    2014-09-01

    Full Text Available A Review of: Branscome, B. A. (2013). Management of electronic serials in academic libraries: The results of an online survey. Serials Review, 39(4), 216-226. http://dx.doi.org/10.1016/j.serrev.2013.10.004 Abstract Objective – To examine industry standards for the management of electronic serials and measure the adoption of electronic serials over print. Design – Survey questionnaire. Setting – Email lists aimed at academic librarians working in serials management. Subjects – 195 self-selected subscribers to serials email lists. Methods – The author created a 20-question survey that consisted primarily of closed-ended questions pertaining to the collection demographics, staff, budget, and tools of serials management groups in academic libraries. The survey was conducted via Survey Monkey and examined using the analytical features of the tool. Participants remained anonymous and the survey questions did not ask them to reveal identifiable information about their libraries. Main Results – Collection demographics questions revealed that 78% of surveyed librarians estimated that print-only collections represented 40% or fewer of their serials holdings. The author observed diversity in the factors that influence print to digital transitions in academic libraries. However, 71.5% of participants indicated that publisher technology support like IP authentication was required before adopting digital subscriptions. A lack of standardization also marked serials workflows, department responsibilities, and department titles. The author did not find a correlation between serials budget and the enrollment size of the institution. Participants reported that they used tools from popular serials management vendors like Serials Solutions, Innovative Interfaces, EBSCO, and Ex Libris, but most indicated that they used more than one tool for serials management. Participants specified 52 unique serials management products used in their libraries. Conclusion

  13. Revealing Conceptual Understanding of International Business

    Science.gov (United States)

    Ashley, Sue; Schaap, Harmen; de Bruijn, Elly

    2017-01-01

    This study aims to identify an adequate approach for revealing conceptual understanding in higher professional education. Revealing students' conceptual understanding is an important step towards developing effective curricula, assessment and aligned teaching strategies to enhance conceptual understanding in higher education. Essays and concept…

  14. Method-related estimates of sperm vitality.

    Science.gov (United States)

    Cooper, Trevor G; Hellenkemper, Barbara

    2009-01-01

    Comparison of methods that estimate viability of human spermatozoa by monitoring head membrane permeability revealed that wet preparations (whether using positive or negative phase-contrast microscopy) generated significantly higher percentages of nonviable cells than did air-dried eosin-nigrosin smears. Only with the latter method did the sum of motile (presumed live) and stained (presumed dead) preparations never exceed 100%, making this the method of choice for sperm viability estimates.

  15. Large scale aggregate microarray analysis reveals three distinct molecular subclasses of human preeclampsia.

    Science.gov (United States)

    Leavey, Katherine; Bainbridge, Shannon A; Cox, Brian J

    2015-01-01

    Preeclampsia (PE) is a life-threatening hypertensive pathology of pregnancy affecting 3-5% of all pregnancies. To date, PE has no cure, early detection markers, or effective treatments short of the removal of what is thought to be the causative organ, the placenta, which may necessitate a preterm delivery. Additionally, numerous small placental microarray studies attempting to identify "PE-specific" genes have yielded inconsistent results. We therefore hypothesize that preeclampsia is a multifactorial disease encompassing several pathology subclasses, and that large cohort placental gene expression analysis will reveal these groups. To address our hypothesis, we utilized known bioinformatic methods to aggregate 7 microarray data sets across multiple platforms in order to generate a large data set of 173 patient samples, including 77 with preeclampsia. Unsupervised clustering of these patient samples revealed three distinct molecular subclasses of PE. This included a "canonical" PE subclass demonstrating elevated expression of known PE markers and genes associated with poor oxygenation and increased secretion, as well as two other subclasses potentially representing a poor maternal response to pregnancy and an immunological presentation of preeclampsia. Our analysis sheds new light on the heterogeneity of PE patients, and offers up additional avenues for future investigation. Hopefully, our subclassification of preeclampsia based on molecular diversity will finally lead to the development of robust diagnostics and patient-based treatments for this disorder.

  16. Metagenomics of the Svalbard reindeer rumen microbiome reveals abundance of polysaccharide utilization loci.

    Directory of Open Access Journals (Sweden)

    Phillip B Pope

    Full Text Available Lignocellulosic biomass remains a largely untapped source of renewable energy predominantly due to its recalcitrance and an incomplete understanding of how this is overcome in nature. We present here a compositional and comparative analysis of metagenomic data pertaining to a natural biomass-converting ecosystem adapted to austere arctic nutritional conditions, namely the rumen microbiome of Svalbard reindeer (Rangifer tarandus platyrhynchus). Community analysis showed that deeply-branched cellulolytic lineages affiliated to the Bacteroidetes and Firmicutes are dominant, whilst sequence binning methods facilitated the assemblage of metagenomic sequence for a dominant and novel Bacteroidales clade (SRM-1). Analysis of unassembled metagenomic sequence as well as metabolic reconstruction of SRM-1 revealed the presence of multiple polysaccharide utilization loci-like systems (PULs) as well as members of more than 20 glycoside hydrolase and other carbohydrate-active enzyme families targeting various polysaccharides including cellulose, xylan and pectin. Functional screening of cloned metagenome fragments revealed high cellulolytic activity and an abundance of PULs that are rich in endoglucanases (GH5) but devoid of other common enzymes thought to be involved in cellulose degradation. Combining these results with known and partly re-evaluated metagenomic data strongly indicates that much like the human distal gut, the digestive system of herbivores harbours high numbers of deeply branched and as-yet uncultured members of the Bacteroidetes that depend on PUL-like systems for plant biomass degradation.

  17. Critical state transformation in hard superconductors resulting from thermomagnetic avalanches

    International Nuclear Information System (INIS)

    Chabanenko, V.V.; Kuchuk, E.I.; Rusakov, V.F.; Abaloszewa, I.; Nabialek, A.; Perez-Rodriguez, F.

    2016-01-01

    The results of experimental investigations of magnetic flux dynamics in finite superconductors, obtained using integral and local measurement methods, are presented. The local methods were aimed at clarifying the role of the demagnetizing factor in the dynamic formation of the complex magnetic structure of the critical state of hard superconductors. To understand the reasons for the radical restructuring of the induction, we also analyzed published magneto-optical visualizations of flux dynamics during avalanches. New features in the behavior of the magnetic flux during and after an avalanche were discovered. Two stages of the formation of the induction structures in the avalanche area were established, namely homogeneous and heterogeneous filling with magnetic flux. The mechanism of the inversion of the induction profile was considered, and oscillations in the speed of the magnetic flux front were revealed. The transformation of the critical state near the edge of the sample was analyzed, and the roles of thermal effects and of the demagnetizing factor in the dissipative flux dynamics were shown. The generalized information allowed us, within the framework of the Bean concept, to present a model of the transformation of the critical-state induction and of the superconducting currents of a finite superconductor as a result of flux avalanches for two regimes - screening and trapping of the magnetic flux.

  18. Method, equipment and results of determination of element composition of the Venus rock by the Vega-2 space probe

    International Nuclear Information System (INIS)

    Surkov, Yu.A.; Moskaleva, L.P.; Shcheglov, O.P.

    1985-01-01

    Venus rock composition was determined by the X-ray radiometric method at a site in the northeast of Aphrodite Terra. The experiment was performed on the Vega-2 spacecraft. The composition of the Venus rock proved to be close to that of the anorthosite-norite-troctolite rocks widespread in the lunar highland crust. Descriptions of the method and instrumentation, and the results of determining the composition of rocks at the Vega-2 landing site, are given

  19. Surface analytical methods in nuclear technology

    International Nuclear Information System (INIS)

    Baumgaertner, F.

    1985-06-01

    Application of SEM-EDX, AES and XPS is demonstrated with examples of highly radioactive materials with ionizing dose rates of about 1 Sv near the surface. The samples studied are aerosols from the high-level waste vitrification process, post-precipitation in a pretreated fuel solution, and emulsifying sludge from a solvent extraction process. The chemical composition resolved down to the microscopic level reveals much more information about the history of a sample than is available from integral macro-analysis methods. Elucidation of the chemical composition and body structure at the micrometre level may give insight into the origin and generation processes of the samples under investigation. (orig.)

  20. Empirical quantification of lacustrine groundwater discharge - different methods and their limitations

    Science.gov (United States)

    Meinikmann, K.; Nützmann, G.; Lewandowski, J.

    2015-03-01

    Groundwater discharge into lakes (lacustrine groundwater discharge, LGD) can be an important driver of lake eutrophication. Its quantification is difficult for several reasons, and thus often neglected in water and nutrient budgets of lakes. In the present case several methods were applied to determine the expansion of the subsurface catchment, to reveal areas of main LGD and to identify the variability of LGD intensity. Size and shape of the subsurface catchment served as a prerequisite in order to calculate long-term groundwater recharge and thus the overall amount of LGD. Isotopic composition of near-shore groundwater was investigated to validate the quality of catchment delineation in near-shore areas. Heat as a natural tracer for groundwater-surface water interactions was used to find spatial variations of LGD intensity. Via an analytical solution of the heat transport equation, LGD rates were calculated from temperature profiles of the lake bed. The method has some uncertainties, as can be found from the results of two measurement campaigns in different years. The present study reveals that a combination of several different methods is required for a reliable identification and quantification of LGD and groundwater-borne nutrient loads.
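
    The abstract does not reproduce the analytical solution it applies, so the sketch below uses a commonly cited choice for steady one-dimensional advective-conductive heat transport beneath a lake bed, the Bredehoeft-Papadopulos (1965) solution, fitted to a hypothetical temperature profile; sensor depths, temperatures and sediment properties are all illustrative assumptions, not the study's data.

```python
import numpy as np
from scipy.optimize import curve_fit

L = 0.5                 # m, depth of the lower temperature boundary below the lake bed
rho_c_water = 4.19e6    # J m^-3 K^-1, volumetric heat capacity of water
lam = 1.8               # W m^-1 K^-1, assumed thermal conductivity of saturated sediment

def profile(z, T0, TL, q):
    """Temperature at depth z (m below the lake bed) for a vertical Darcy flux q
    (m/s, positive downward); groundwater discharge corresponds to q < 0 here."""
    beta = rho_c_water * q * L / lam
    return T0 + (TL - T0) * np.expm1(beta * z / L) / np.expm1(beta)

z_obs = np.array([0.05, 0.10, 0.20, 0.30, 0.40])       # m, sensor depths
T_obs = np.array([16.4, 15.7, 14.3, 13.2, 12.5])       # degC, hypothetical profile

(T0, TL, q), _ = curve_fit(profile, z_obs, T_obs, p0=(17.0, 12.0, -1e-6))
print(f"fitted Darcy flux q = {q:.2e} m/s  (|q| = {abs(q) * 86400 * 1000:.1f} mm/day)")
```

    The fitted Darcy flux q is the quantity interpreted as the LGD rate; in this sign convention upward discharge appears as a negative q.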

  1. First characterization of the expiratory flow increase technique: method development and results analysis

    International Nuclear Information System (INIS)

    Maréchal, L; Barthod, C; Jeulin, J C

    2009-01-01

    This study provides an important contribution to the definition of the expiratory flow increase technique (EFIT). So far, no measuring means were suited to assess the manual EFIT performed on infants. The proposed method aims at objectively defining the EFIT based on the quantification of pertinent cognitive parameters used by physiotherapists when practicing. We designed and realized customized instrumented gloves endowed with pressure and displacement sensors, and the associated electronics and software. This new system is specific to the manoeuvre, to the user and innocuous for the patient. Data were collected and analysed on infants with bronchiolitis managed by an expert physiotherapist. The analysis presented is realized on a group of seven subjects (mean age: 6.1 months, SD: 1.1; mean chest circumference: 44.8 cm, SD: 1.9). The results are consistent with the physiotherapist's tactility. In spite of inevitable variability due to measurements on infants, repeatable quantitative data could be reported regarding the manoeuvre characteristics: the magnitudes of displacements do not exceed 10 mm on both hands; the movement of the thoracic hand is more vertical than the movement of the abdominal hand; the maximum applied pressure with the thoracic hand is about twice higher than with the abdominal hand; the thrust of the manual compression lasts (590 ± 62) ms. Inter-operators measurements are in progress in order to generalize these results

  2. Nuclear-physical methods of investigation of an element composition in samples of soils and plants

    International Nuclear Information System (INIS)

    Hushmurodov, Sh.; Botaev, N.

    2002-01-01

    The soil and vegetation covers of the Earth are among the parts of the biosphere most responsive and sensitive to pollution, and proper monitoring of them is of fundamental importance for creating and protecting an optimal environment. Analysis of soils and plants is a necessary and important stage in investigating the migration of microelements in biogeochemical cycles. For this purpose we studied several protected areas of Uzbekistan to assess their level of contamination by heavy metals and to identify typical and territorial features in the accumulation of a number of elements by soils and plants. In order to reduce the influence of systematic errors and to obtain more precise and reliable data, we carried out the element analysis of the samples by several methods: gamma-activation analysis, neutron-activation analysis, X-ray spectral analysis, and X-ray fluorescence analysis. As a result of our investigations we obtained a substantial body of information that can be used in future assessments of environmental conditions. The investigations allowed us to determine the content of about 40 elements and showed that the data obtained by the different nuclear-physical methods are in good agreement. The reproducibility of the results, determined in control measurements, depends on the concentration of the analyzed elements and is equal to 10-35 %. Comparison of the data revealed some features of the element composition of the investigated samples depending on their type and on territorial factors. Our investigations also allowed us to identify a number of regularities in the accumulation of elements in plants and to demonstrate the applicability of nuclear-physical methods to such investigations

  3. Musical Practices and Methods in Music Lessons: A Comparative Study of Estonian and Finnish General Music Education

    Science.gov (United States)

    Sepp, Anu; Ruokonen, Inkeri; Ruismäki, Heikki

    2015-01-01

    This article reveals the results of a comparative study of Estonian and Finnish general music education. The aim was to find out what music teaching practices and approaches/methods were mostly used, what music education perspectives supported those practices. The data were collected using questionnaires and the results of 107 Estonian and 50…

  4. Revisiting diversity: cultural variation reveals the constructed nature of emotion perception.

    Science.gov (United States)

    Gendron, Maria

    2017-10-01

    The extent of cultural variation in emotion perception has long been assumed to be bounded by underlying universality. A growing body of research reveals, however, that evidence of universality in emotion perception is method-bound. Without the assumption of underlying universality, new lines of inquiry become relevant. Accumulating evidence suggests that cultures vary in what cues are relevant to perceptions of emotion. Further, cultural groups vary in their spontaneous inferences; mental state inference does not appear to be the only, or even most routine, mode of perception across cultures. Finally, setting universality assumptions aside requires innovation in the theory and measurement of culture. Recent studies reveal the promise of refinements in psychological approaches to culture. Together, the available evidence is consistent with a view of emotion perceptions as actively constructed by perceivers to fit the social and physical constraints of their cultural worlds. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Functional connectivity analysis of the neural bases of emotion regulation: A comparison of independent component method with density-based k-means clustering method.

    Science.gov (United States)

    Zou, Ling; Guo, Qian; Xu, Yi; Yang, Biao; Jiao, Zhuqing; Xiang, Jianbo

    2016-04-29

    Functional magnetic resonance imaging (fMRI) is an important tool in neuroscience for assessing connectivity and interactions between distant areas of the brain. To find and characterize coherent patterns of brain activity and thereby identify the brain systems involved in the cognitive reappraisal of emotion, both density-based k-means clustering and independent component analysis (ICA) can be applied to characterize the interactions between the brain regions involved. Our results reveal that, compared with the ICA method, the density-based k-means clustering method provides a higher clustering sensitivity and is more sensitive to relatively weak functional connections. The study concludes that, while emotional stimuli are being processed, the most clearly activated areas are mainly distributed in the frontal lobe, the cingulum and near the hypothalamus, and that density-based k-means clustering provides a more reliable method for follow-up studies of brain functional connectivity.
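
    To make the comparison concrete, the sketch below applies scikit-learn's FastICA and a plain KMeans (standing in for the density-based k-means variant used in the study) to hypothetical ROI time series and derives a region grouping from each; the synthetic data and all settings are assumptions, not the authors' pipeline.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
n_timepoints, n_rois = 200, 30
sources = rng.standard_normal((n_timepoints, 3))          # three latent "networks"
mixing = rng.standard_normal((3, n_rois))
X = sources @ mixing + 0.5 * rng.standard_normal((n_timepoints, n_rois))

# ICA view: assign each ROI to the independent component it loads on most strongly.
ica = FastICA(n_components=3, random_state=0).fit(X)
ica_labels = np.abs(ica.mixing_).argmax(axis=1)           # mixing_ has shape (n_rois, 3)

# k-means view: cluster ROIs directly on their standardized time courses.
Xz = (X - X.mean(axis=0)) / X.std(axis=0)
km_labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(Xz.T)

print("ICA grouping:    ", ica_labels)
print("k-means grouping:", km_labels)
```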

  6. Usability Testing as a Method to Refine a Health Sciences Library Website.

    Science.gov (United States)

    Denton, Andrea H; Moody, David A; Bennett, Jason C

    2016-01-01

    User testing, a method of assessing website usability, can be a cost-effective and easily administered process to collect information about a website's effectiveness. A user experience (UX) team at an academic health sciences library has employed user testing for over three years to help refine the library's home page. Test methodology used in-person testers using the "think aloud" method to complete tasks on the home page. Review of test results revealed problem areas of the design and redesign; further testing was effective in refining the page. User testing has proved to be a valuable method to engage users and provide feedback to continually improve the library's home page.

  7. Using means-end chain analysis to reveal consumers' motivation for buying local foods: An exploratory study

    Directory of Open Access Journals (Sweden)

    Poppy Arsil

    2016-11-01

    Full Text Available This article applies and discusses specific aspects of Means-End Chain (MEC) analysis for understanding the motives of Indonesian consumers who purchase local foods. MEC theory is used to measure the attributes, consequences, and values of locally produced products, drawing on specific aspects of the approach, namely the laddering method of administration, the content-analysis procedure, and the construction and interpretation of the Hierarchical Value Map (HVM). The results of the study indicate that the MEC approach is a powerful method for revealing consumers' motivations for local foods when associated with the various cultural groupings identified by the study, in particular between Javanese and non-Javanese consumers. This study offers practical implications and a source of knowledge for future studies and policies in terms of (a) a new approach for understanding the motives behind purchasing local foods for Indonesian consumers, and (b) developing new categories of attributes, consequences and values of local foods.
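
    As a minimal sketch of how laddering data feed a hierarchical value map, the code below counts attribute-consequence and consequence-value links into an implication matrix and keeps those at or above a cut-off level; the ladders, category labels and cut-off are invented for illustration, not data from the study.

```python
from collections import Counter

# Hypothetical ladders: (attribute, consequence, value) elicited in laddering interviews.
ladders = [
    ("freshness", "tastes better", "enjoyment"),
    ("freshness", "healthier meals", "family well-being"),
    ("local origin", "supports farmers", "sense of community"),
    ("local origin", "healthier meals", "family well-being"),
]

# Implication matrix: how often each direct link occurs across all ladders.
implication_matrix = Counter(
    link for ladder in ladders for link in zip(ladder, ladder[1:])
)

# Links at or above a cut-off level become the arrows of the hierarchical value map.
CUTOFF = 2
hvm_links = {link: n for link, n in implication_matrix.items() if n >= CUTOFF}
print(hvm_links)
```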

  8. Application of Multistage Homotopy Perturbation Method to the Chaotic Genesio System

    Directory of Open Access Journals (Sweden)

    M. S. H. Chowdhury

    2012-01-01

    Full Text Available Finding accurate solutions of chaotic systems with existing numerical methods is difficult because of their complex dynamical behaviour. In this paper, the multistage homotopy-perturbation method (MHPM) is applied to the chaotic Genesio system. The MHPM is a simple, reliable modification of the standard homotopy-perturbation method (HPM), in which the HPM is applied as an algorithm over a sequence of intervals to obtain accurate approximate solutions of the Genesio system. Numerical comparisons between the MHPM and the classical fourth-order Runge-Kutta (RK4) solutions are made. The results reveal that the new technique is a promising tool for nonlinear chaotic systems of ordinary differential equations.
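
    For reference, the sketch below implements the classical RK4 baseline mentioned above for the Genesio system x' = y, y' = z, z' = -c x - b y - a z + x^2; the parameter values and initial condition are commonly quoted choices for chaotic behaviour, not necessarily those used in the paper.

```python
import numpy as np

a, b, c = 1.2, 2.92, 6.0          # commonly quoted chaotic parameter set

def genesio(state):
    x, y, z = state
    return np.array([y, z, -c * x - b * y - a * z + x * x])

def rk4_step(f, state, h):
    k1 = f(state)
    k2 = f(state + 0.5 * h * k1)
    k3 = f(state + 0.5 * h * k2)
    k4 = f(state + h * k3)
    return state + (h / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

h, n_steps = 0.01, 5000
state = np.array([0.2, -0.3, 0.1])          # assumed initial condition
trajectory = np.empty((n_steps + 1, 3))
trajectory[0] = state
for i in range(n_steps):
    state = rk4_step(genesio, state, h)
    trajectory[i + 1] = state

print("state at t = 50:", trajectory[-1])
```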

  9. Human-factors methods for assessing and enhancing power-plant maintainability

    International Nuclear Information System (INIS)

    Seminara, J.L.

    1982-05-01

    EPRI Final Report NP-1567, dated February 1981, presented the results of a human factors review of plant maintainability at nine power plants (five nuclear and four fossil). This investigation revealed a wide range of plant and equipment design features that can potentially compromise the effectiveness, safety, and productivity of maintenance personnel. The present study is an extension of the earlier work. It provides those utilities that did not participate in the original study with the methodological tools to conduct a review of maintenance provisions, facilities, and practices. This report describes and provides a self-review checklist; a structured interview; a task analysis approach; methods for reviewing maintenance errors or accidents; and recommended survey techniques for evaluating such factors as noise, illumination, and communications. Application of the human factors methods described in this report should reveal avenues for enhancing existing power plants from the maintainability and availability standpoints. This document may also serve a useful purpose for designers or reviewers of new plant designs or near-operational plants presently being constructed

  10. Symbolic joint entropy reveals the coupling of various brain regions

    Science.gov (United States)

    Ma, Xiaofei; Huang, Xiaolin; Du, Sidan; Liu, Hongxing; Ning, Xinbao

    2018-01-01

    The convergence and divergence of oscillatory behavior in different brain regions are very important for information processing, so measures of coupling or correlation are useful for studying differences in brain activity. In this study, EEG signals were collected from ten subjects under two conditions, i.e. eyes closed and idle with eyes open. We propose a nonlinear algorithm, symbolic joint entropy, to compare the coupling strength among the frontal, temporal, parietal and occipital lobes and between the two states. Instead of decomposing the EEG into different frequency bands (theta, alpha, beta, gamma, etc.), the algorithm investigates coupling across the entire spectrum of brain-wave activity above 4 Hz. The coupling coefficients in the two states are compared for different time-delay steps, and group statistics are presented as well. We find that the coupling coefficient of the eyes-open state with delay is consistently lower than that of the eyes-closed state across the group, except for one subject, whereas the results without delay are not consistent. The differences between the two brain states with non-zero delay therefore reveal the intrinsic inter-region coupling better. We also use the well-known Hénon map data to validate the proposed algorithm. The result shows that the method is robust and has great potential for other physiologic time series.
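
    The abstract does not spell out the symbolization scheme, so the sketch below illustrates the general idea with a simple binary up/down coding: each channel is reduced to a symbol sequence and the Shannon entropy of the joint symbols is computed, optionally with a delay between channels; the synthetic "EEG" signals are assumptions.

```python
import numpy as np
from collections import Counter

def symbolize(x):
    """1 where the signal increases from one sample to the next, else 0."""
    return (np.diff(x) > 0).astype(int)

def joint_entropy(x, y, delay=0):
    """Shannon entropy (bits) of the joint symbol pairs; lower values indicate
    more strongly coupled (more predictable) joint dynamics."""
    sx, sy = symbolize(x), symbolize(y)
    if delay > 0:
        sx, sy = sx[:-delay], sy[delay:]
    counts = Counter(zip(sx, sy))
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(2)
common = rng.standard_normal(2000)
chan_a = common + 0.5 * rng.standard_normal(2000)            # two hypothetical channels
chan_b = np.roll(common, 3) + 0.5 * rng.standard_normal(2000)

for d in (0, 3):
    print(f"delay {d}: joint entropy = {joint_entropy(chan_a, chan_b, d):.3f} bits")
```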

  11. Application of X-ray methods to assess grain vulnerability to damage resulting from multiple loads

    International Nuclear Information System (INIS)

    Zlobecki, A.

    1995-01-01

    The aim of the work is to describe the behaviour of wheat grain under multiple dynamic loads with various multipliers. The experiments were conducted on grain of the Almari variety with moisture contents of 11, 16, 21 and 28%. A special ram stand was used for loading the grain, with an 8 g weight equivalent to an impact energy of 4.6 × 10⁻³ J. The X-ray method was used to assess damage, with an exposure time of 8 minutes and an X-ray tube voltage of 15 kV. The position index was used as the measure of damage. The investigation results were analysed statistically. Based on the results of analysis of variance, regression analysis, the d-Duncan test and the Kolmogorov-Smirnov test, the damage number was shown to depend strongly on the number of impacts over the whole range of moisture contents of the loaded grain. (author)

  12. Evaluation of Hydraulic Parameters Obtained by Different Measurement Methods for Heterogeneous Gravel Soil

    Directory of Open Access Journals (Sweden)

    Chen Zeng

    2012-01-01

    Full Text Available Knowledge of soil hydraulic parameters for the van Genuchten function is important to characterize soil water movement for watershed management. Accurate and rapid prediction of soil water flow in heterogeneous gravel soil has become a hot topic in recent years. However, it is difficult to precisely estimate hydraulic parameters in a heterogeneous soil with rock fragments. In this study, the HYDRUS-2D numerical model was used to evaluate hydraulic parameters for heterogeneous gravel soil that was irregularly embedded with rock fragments in a grape production base. The centrifugal method (CM), tensiometer method (TM) and inverse solution method (ISM) were compared for various parameters in the van Genuchten function. The soil core method (SCM), disc infiltration method (DIM) and inverse solution method (ISM) were also investigated for measuring saturated hydraulic conductivity. Simulation with the DIM approach revealed a problem of overestimating soil water infiltration whereas simulation with the SCM approach revealed a problem of underestimating water movement as compared to actual field observation. The ISM approach produced the best simulation result even though this approach slightly overestimated soil moisture by ignoring the impact of rock fragments. This study provides useful information on the overall evaluation of soil hydraulic parameters attained with different measurement methods for simulating soil water movement and distribution in heterogeneous gravel soil.
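
    For readers unfamiliar with the parameters being compared, the sketch below evaluates the van Genuchten (1980) retention function and the Mualem-van Genuchten conductivity function; the parameter values are generic illustrations, not those estimated for the gravel soil in the study.

```python
import numpy as np

theta_r, theta_s = 0.05, 0.42      # residual and saturated water content (-)
alpha, n = 0.036, 1.56             # shape parameters (1/cm and -)
Ks = 25.0                          # cm/day, saturated hydraulic conductivity
m = 1.0 - 1.0 / n

def theta(h):
    """Water content at pressure head h (cm, negative in the unsaturated zone)."""
    Se = (1.0 + (alpha * np.abs(h)) ** n) ** (-m)
    return theta_r + (theta_s - theta_r) * Se

def K(h):
    """Unsaturated hydraulic conductivity (Mualem pore-connectivity l = 0.5)."""
    Se = (1.0 + (alpha * np.abs(h)) ** n) ** (-m)
    return Ks * Se ** 0.5 * (1.0 - (1.0 - Se ** (1.0 / m)) ** m) ** 2

for h in (-10.0, -100.0, -1000.0):
    print(f"h = {h:7.1f} cm  theta = {theta(h):.3f}  K = {K(h):.3e} cm/day")
```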

  13. A method for calculating Bayesian uncertainties on internal doses resulting from complex occupational exposures

    International Nuclear Information System (INIS)

    Puncher, M.; Birchall, A.; Bull, R. K.

    2012-01-01

    Estimating uncertainties on doses from bioassay data is of interest in epidemiology studies that estimate cancer risk from occupational exposures to radionuclides. Bayesian methods provide a logical framework to calculate these uncertainties. However, occupational exposures often consist of many intakes, and this can make the Bayesian calculation computationally intractable. This paper describes a novel strategy for increasing the computational speed of the calculation by simplifying the intake pattern to a single composite intake, termed the complex intake regime (CIR). In order to assess whether this approximation is accurate and fast enough for practical purposes, the method is implemented by the Weighted Likelihood Monte Carlo Sampling (WeLMoS) method and evaluated by comparing its performance with a Markov Chain Monte Carlo (MCMC) method. The MCMC method gives the full solution (all intakes are independent), but is very computationally intensive to apply routinely. Posterior distributions of model parameter values, intakes and doses are calculated for a representative sample of plutonium workers from the United Kingdom Atomic Energy cohort using the WeLMoS method with the CIR and the MCMC method. The distributions are in good agreement: posterior means and Q0.025 and Q0.975 quantiles are typically within 20 %. Furthermore, the WeLMoS method using the CIR converges quickly: a typical case history takes around 10-20 min on a fast workstation, whereas the MCMC method took around 12 hours. The advantages and disadvantages of the method are discussed. (authors)
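
    The sketch below is a generic importance-sampling illustration of the weighted-likelihood idea (sample intakes from a prior, weight each sample by the likelihood of the bioassay data, summarize the weighted dose distribution); the one-exponential retention model, dose coefficient and measurement data are invented stand-ins, not the WeLMoS/CIR implementation or a real biokinetic model.

```python
import numpy as np

rng = np.random.default_rng(3)

def predicted_bioassay(intake, t_days, half_life=50.0):
    """Toy retention model: a single intake cleared exponentially (hypothetical)."""
    return intake * np.exp(-np.log(2.0) * t_days / half_life)

t_obs = np.array([30.0, 90.0, 180.0])          # days after intake
m_obs = np.array([4.1, 1.9, 0.6])              # hypothetical bioassay results (Bq)
sigma = 0.3 * m_obs                            # assumed measurement uncertainty

# Prior samples of the intake (Bq); dose per unit intake assumed known.
intakes = rng.lognormal(mean=np.log(10.0), sigma=1.0, size=100_000)
dose_per_bq = 1.0e-4                           # mSv per Bq, illustrative only

# Weight each prior sample by the likelihood of the observations.
pred = predicted_bioassay(intakes[:, None], t_obs[None, :])
log_w = -0.5 * (((m_obs - pred) / sigma) ** 2).sum(axis=1)
w = np.exp(log_w - log_w.max())
w /= w.sum()

# Weighted posterior mean and quantiles of the dose.
doses = intakes * dose_per_bq
order = np.argsort(doses)
cdf = np.cumsum(w[order])
q025 = doses[order][min(np.searchsorted(cdf, 0.025), len(cdf) - 1)]
q975 = doses[order][min(np.searchsorted(cdf, 0.975), len(cdf) - 1)]
print(f"posterior mean dose {np.sum(w * doses):.4f} mSv, "
      f"95% interval [{q025:.4f}, {q975:.4f}] mSv")
```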

  14. Revealing metabolomic variations in Cortex Moutan from different root parts using HPLC-MS method.

    Science.gov (United States)

    Xiao, Chaoni; Wu, Man; Chen, Yongyong; Zhang, Yajun; Zhao, Xinfeng; Zheng, Xiaohui

    2015-01-01

    The distribution of metabolites in the different root parts of Cortex Moutan (the root bark of Paeonia suffruticosa Andrews) is not well understood; therefore, scientific evidence is not available for quality assessment of Cortex Moutan. To reveal metabolomic variations in Cortex Moutan in order to gain deeper insights to enable quality control. Metabolomic variations in the different root parts of Cortex Moutan were characterised using high-performance liquid chromatography combined with mass spectrometry (HPLC-MS) and multivariate data analysis. The discriminating metabolites in different root parts were evaluated by the one-way analysis of variance and a fold change parameter. The metabolite profiles of Cortex Moutan were largely dominated by five primary and 41 secondary metabolites. Higher levels of malic acid, gallic acid and mudanoside-B were mainly observed in the second lateral roots, whereas dihydroxyacetophenone, benzoyloxypaeoniflorin, suffruticoside-A, kaempferol dihexoside, mudanpioside E and mudanpioside J accumulated in the first lateral and axial roots. The highest contents of paeonol, galloyloxypaeoniflorin and procyanidin B were detected in the axial roots. Accordingly, metabolite compositions of Cortex Moutan were found to vary among different root parts. The axial roots have higher quality than the lateral roots in Cortex Moutan due to the accumulation of bioactive secondary metabolites associated with plant physiology. These findings provided important scientific evidence for grading Cortex Moutan on the general market. Copyright © 2014 John Wiley & Sons, Ltd.

  15. [Reconsidering children's dreams. A critical review of methods and results in developmental dream research from Freud to contemporary works].

    Science.gov (United States)

    Sándor, Piroska; Bódizs, Róbert

    2014-01-01

    Examining children's dream development is a significant challenge for researchers. Results from studies on children's dreaming may enlighten us on the nature and role of dreaming as well as broaden our knowledge of consciousness and cognitive development. This review summarizes the main questions and historical progress in developmental dream research, with the aim of shedding light on the advantages, disadvantages and effects of different settings and methods on research outcomes. A typical example would be the dreams of 3 to 5 year-olds: they are simple and static, with a relative absence of emotions and active self participation according to laboratory studies; studies using different methodology however found them to be vivid, rich in emotions, with the self as an active participant. Questions about the validity of different methods arise, and are considered within this review. Given that methodological differences can result in highly divergent outcomes, it is strongly recommended for future research to select methodology and treat results more carefully.

  16. [Mastitis revealing Churg-Strauss syndrome].

    Science.gov (United States)

    Dannepond, C; Le Fourn, E; de Muret, A; Ouldamer, L; Carmier, D; Machet, L

    2014-01-01

    Churg-Strauss syndrome often involves the skin, and this may sometimes reveal the disease. A 25-year-old woman was referred to a gynaecologist for inflammation of the right breast with breast discharge. Cytological analysis of the liquid showed numerous inflammatory cells, particularly polymorphonuclear eosinophils and neutrophils. Ultrasound examination of the breast was consistent with galactophoritis. CRP was normal, and hypereosinophilia was seen. The patient was subsequently referred to a dermatology unit. Skin examination revealed inflammation of the entire breast, which was painful, warm and erythematous; the border was oedematous with blisters. Necrotic lesions were also present on the thumbs and knees. Skin biopsy of the breast showed a dermal infiltrate with abundant infiltrate of polymorphonuclear eosinophils, including patchy necrosis and intraepidermal vesicles. Histological examination of a biopsy sample from a thumb revealed eosinophilic granuloma and leukocytoclastic vasculitis. The patient was also presenting asthma, pulmonary infiltrates and mononeuropathy at L3, consistent with Churg-Strauss syndrome. Breast involvement in Churg-Strauss syndrome is very rare (only one other case has been reported). This is the first case in which the breast condition revealed the disease. Cutaneous involvement of the breast is, however, also compatible with Wells' cellulitis. The lesions quickly disappeared with 1mg/kg/d oral prednisolone. Copyright © 2013 Elsevier Masson SAS. All rights reserved.

  17. ACCURATE 3D SCANNING OF DAMAGED ANCIENT GREEK INSCRIPTIONS FOR REVEALING WEATHERED LETTERS

    Directory of Open Access Journals (Sweden)

    A. I. Papadaki

    2015-02-01

    Full Text Available In this paper two non-invasive, non-destructive alternatives to the traditional and invasive technique of squeezes are presented, alongside specially developed processing methods, aiming to help epigraphists reveal and analyse weathered letters in ancient Greek inscriptions carved in masonry or marble. The resulting 3D model would serve as a detailed basis for the epigraphists to try to decipher the inscription. The data were collected using a structured-light scanner. The creation of the final accurate three-dimensional model is a complicated procedure requiring a large computational cost and human effort: it includes the collection of geometric data in limited space and time, the creation of the surface, noise filtering and the merging of individual surfaces. The use of structured-light scanners is time consuming and requires costly hardware and software, so an alternative methodology for collecting 3D data of the inscriptions was also implemented for comparison. Image sequences from varying distances were collected with a calibrated DSLR camera to reconstruct the 3D scene through SfM techniques, in order to evaluate the efficiency and the level of precision and detail of the reconstructed inscriptions. Problems in the acquisition process, as well as difficulties in the alignment step and mesh optimization, are also discussed. A meta-processing framework is proposed and analysed. Finally, the results of the processing and analysis and the different 3D models are critically inspected and evaluated by a specialist in terms of accuracy, quality and detail of the model and its capability of revealing damaged and "hidden" letters.

  18. Accurate 3d Scanning of Damaged Ancient Greek Inscriptions for Revealing Weathered Letters

    Science.gov (United States)

    Papadaki, A. I.; Agrafiotis, P.; Georgopoulos, A.; Prignitz, S.

    2015-02-01

    In this paper two non-invasive, non-destructive alternatives to the traditional and invasive technique of squeezes are presented, alongside specially developed processing methods, aiming to help epigraphists reveal and analyse weathered letters in ancient Greek inscriptions carved in masonry or marble. The resulting 3D model would serve as a detailed basis for the epigraphists to try to decipher the inscription. The data were collected using a structured-light scanner. The creation of the final accurate three-dimensional model is a complicated procedure requiring a large computational cost and human effort: it includes the collection of geometric data in limited space and time, the creation of the surface, noise filtering and the merging of individual surfaces. The use of structured-light scanners is time consuming and requires costly hardware and software, so an alternative methodology for collecting 3D data of the inscriptions was also implemented for comparison. Image sequences from varying distances were collected with a calibrated DSLR camera to reconstruct the 3D scene through SfM techniques, in order to evaluate the efficiency and the level of precision and detail of the reconstructed inscriptions. Problems in the acquisition process, as well as difficulties in the alignment step and mesh optimization, are also discussed. A meta-processing framework is proposed and analysed. Finally, the results of the processing and analysis and the different 3D models are critically inspected and evaluated by a specialist in terms of accuracy, quality and detail of the model and its capability of revealing damaged and "hidden" letters.

  19. The effect of uncertainty of reactor parameters obtained using k0-NAA on result of analysis

    International Nuclear Information System (INIS)

    Sasajima, Fumio

    2006-01-01

    Neutron activation analysis using the k0 method is a useful technique that allows convenient and accurate simultaneous analysis of several elements without the need for comparative reference samples. As is well known, obtaining accurate values of the α-factor and f-factor for the neutron spectrum in the irradiation field is essential for correct analytical results when the k0 method is used. For this reason, based on data obtained from an experiment conducted in the JRR-3 PN-3 system, we evaluated how uncertainty in the measured values of the α-factor and f-factor affects the result of an analysis. The evaluation involved intentionally varying the reactor parameter values and then analysing environmental reference samples (NIST SRM-1632c) by the k0 method to examine the effect of these factors on the concentrations of 19 elements. The evaluation revealed that, under the conditions of this experiment, the effect of this uncertainty on the concentrations of the 19 elements was at most approximately 1%, even assuming that the reactor parameter α had an uncertainty of approximately 200%. (author)

  20. The new fabrication method of standard surface sources

    Energy Technology Data Exchange (ETDEWEB)

    Sato, Yasushi E-mail: yss.sato@aist.go.jp; Hino, Yoshio; Yamada, Takahiro; Matsumoto, Mikio

    2004-04-01

    We developed a new fabrication method for standard surface sources by using an inkjet printer with inks in which a radioactive material is mixed to print on a sheet of paper. Three printed test patterns have been prepared: (1) 100 mmx100 mm uniformity-test patterns, (2) positional-resolution test patterns with different widths and intervals of straight lines, and (3) logarithmic intensity test patterns with different radioactive intensities. The results revealed that the fabricated standard surface sources had high uniformity, high positional resolution, arbitrary shapes and a broad intensity range.

  1. The microbiome of Brazilian mangrove sediments as revealed by metagenomics

    NARCIS (Netherlands)

    Andreote, Fernando Dini; Jiménez Avella, Diego; Chaves, Diego; Dias, Armando Cavalcante Franco; Luvizotto, Danice Mazzer; Dini-Andreote, Francisco; Fasanella, Cristiane Cipola; Lopez, Maryeimy Varon; Baena, Sandra; Taketani, Rodrigo Gouvêa; de Melo, Itamar Soares

    2012-01-01

    Here we embark in a deep metagenomic survey that revealed the taxonomic and potential metabolic pathways aspects of mangrove sediment microbiology. The extraction of DNA from sediment samples and the direct application of pyrosequencing resulted in approximately 215 Mb of data from four distinct

  2. Review of quantum Monte Carlo methods and results for Coulombic systems

    International Nuclear Information System (INIS)

    Ceperley, D.

    1983-01-01

    The various Monte Carlo methods for calculating ground state energies are briefly reviewed. Then a summary of the charged systems that have been studied with Monte Carlo is given. These include the electron gas, small molecules, a metal slab and many-body hydrogen

  3. Teaching english grammar through interactive methods

    OpenAIRE

    Aminova N.

    2016-01-01

    The article is devoted to effective ways of teaching grammar. The relevance of the theme lies in identifying approaches that lead to high progress in foreign language teaching and in developing effective methods that can be helpful for foreign language teachers. Various progressive methods of teaching English grammar are presented in this paper as well.

  4. COMPARISON OF CONSEQUENCE ANALYSIS RESULTS FROM TWO METHODS OF PROCESSING SITE METEOROLOGICAL DATA

    International Nuclear Information System (INIS)

    , D

    2007-01-01

    Consequence analysis to support documented safety analysis requires the use of one or more years of representative meteorological data for atmospheric transport and dispersion calculations. At minimum, the needed meteorological data for most atmospheric transport and dispersion models consist of hourly samples of wind speed and atmospheric stability class. Atmospheric stability is inferred from measured and/or observed meteorological data. Several methods exist to convert measured and observed meteorological data into atmospheric stability class data. In this paper, one year of meteorological data from a western Department of Energy (DOE) site is processed to determine atmospheric stability class using two methods. The method that is prescribed by the U.S. Nuclear Regulatory Commission (NRC) for supporting licensing of nuclear power plants makes use of measurements of vertical temperature difference to determine atmospheric stability. Another method that is preferred by the U.S. Environmental Protection Agency (EPA) relies upon measurements of incoming solar radiation, vertical temperature gradient, and wind speed. Consequences are calculated and compared using the two sets of processed meteorological data from these two methods as input data into the MELCOR Accident Consequence Code System 2 (MACCS2) code
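
    As an illustration of the NRC-style conversion, the sketch below maps a measured vertical temperature difference to a Pasquill stability class using delta-T thresholds of the kind given in Regulatory Guide 1.23; the exact class boundaries should be checked against the current guide before any safety-analysis use.

```python
def stability_class(delta_t_per_100m: float) -> str:
    """Pasquill class A-G from the vertical temperature gradient (degC per 100 m)."""
    bounds = [(-1.9, "A"), (-1.7, "B"), (-1.5, "C"),
              (-0.5, "D"), (1.5, "E"), (4.0, "F")]
    for upper, cls in bounds:
        if delta_t_per_100m <= upper:
            return cls
    return "G"

for dt in (-2.3, -1.0, 0.7, 5.2):
    print(f"dT/dz = {dt:+.1f} degC/100 m -> class {stability_class(dt)}")
```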

  5. Deposition of Ni-CGO composite anodes by electrostatic assisted ultrasonic spray pyrolysis method

    International Nuclear Information System (INIS)

    Chen, J.-C.; Chang, C.-L.; Hsu, C.-S.; Hwang, B.-H.

    2007-01-01

    Deposition of composite films of Ni and Gd-doped ceria was carried out using the electrostatic assisted ultrasonic spray pyrolysis method for the first time. The composite films were highly homogeneous, as revealed by element mapping via energy-dispersive spectrometry. Scanning electron microscope examinations revealed that deposition temperature and electric field strength had a profound influence on the resultant microstructure, while the composition of the precursor solution had little effect. A highly porous cauliflower structure ideal for solid oxide fuel cell anode performance was obtained with a deposition temperature of 450 deg. C under an electric field introduced by an applied voltage of 12 kV. Deposition at a lower temperature of 250 deg. C or a higher applied voltage of 15 kV resulted in denser films with low porosity, while lower applied voltages of 7 or 5 kV resulted in thinner or discontinuous films due to insufficient electrostatic attraction on the aerosol droplets. As revealed by AC impedance measurement, the area specific resistances of the Ni-CGO anode with the porous cauliflower structure were rather low, and a value of 0.09 Ω cm² at 550 deg. C was obtained

  6. Three-dimensional modeling in the electromagnetic/magnetotelluric methods. Accuracy of various finite-element and finite difference methods; Denjiho MT ho ni okeru sanjigen modeling. Shushu no yugen yosoho to sabunho no seido

    Energy Technology Data Exchange (ETDEWEB)

    Sasaki, Y [Kyushu University, Fukuoka (Japan). Faculty of Engineering

    1997-05-27

    To enhance the reliability of electromagnetic/magnetotelluric (MT) surveys, calculation results of finite-element methods (FEMs) and finite difference methods (FDMs) were compared. The accuracy of the individual methods and the convergence of the iterative solutions were examined. As a result of the investigation, it was found that appropriate accuracy can be obtained from the edge FEM and the FDM for the vertical magnetic dipole example, and that the best accuracy among the four methods is obtained from the FDM for the MT survey example. It was revealed that the ICBCG (incomplete Cholesky bi-conjugate gradient) method is an excellent solver for the simultaneous equations in terms of accuracy and calculation time. For the joint FEM, the SOR solutions converged for both examples. It was concluded that the error is caused not by the numerical calculation itself but by neglecting the discontinuity of the electric field. The condition number of the coefficient matrix increased with decreasing frequency, which made the numerical calculation unstable. The constraint would need to be incorporated in some form. 4 refs., 12 figs.
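    As a hedged illustration of the kind of preconditioned iterative solver mentioned above (an incomplete factorization combined with a bi-conjugate-gradient-type iteration), the sketch below solves a sparse complex system with SciPy's BiCGSTAB and an incomplete-LU preconditioner. The tridiagonal matrix is a made-up stand-in for a real FEM/FDM system matrix, and ILU is used here in place of the incomplete Cholesky factorization of the ICBCG method.

```python
# Sketch: incomplete-factorization-preconditioned BiCGSTAB solve of a sparse
# complex system, in the spirit of the ICBCG solver discussed above. The 1-D
# Helmholtz-like matrix below is only a placeholder for an EM/MT system matrix.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import LinearOperator, bicgstab, spilu

n = 2000
k2 = 1e-3 + 1e-3j                                  # assumed i*omega*mu*sigma-like term
main = (2.0 + k2) * np.ones(n, dtype=complex)
off = -np.ones(n - 1, dtype=complex)
A = sp.diags([off, main, off], offsets=[-1, 0, 1], format="csc")
b = np.zeros(n, dtype=complex)
b[0] = 1.0                                         # illustrative source term

ilu = spilu(A, drop_tol=1e-5, fill_factor=10)      # incomplete LU factorization
M = LinearOperator(A.shape, matvec=ilu.solve, dtype=complex)

x, info = bicgstab(A, b, M=M, maxiter=500)
print("info =", info, " residual =", np.linalg.norm(A @ x - b))
```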

  7. The Method for Assessing and Forecasting Value of Knowledge in SMEs – Research Results

    Directory of Open Access Journals (Sweden)

    Justyna Patalas-Maliszewska

    2010-10-01

    Decisions by SMEs regarding knowledge development are made at a strategic level (Haas-Edersheim, 2007). Related to knowledge management are approaches to "measure" knowledge, where the literature distinguishes between qualitative and quantitative methods of valuating intellectual capital. Although there is quite a range of such methods for building an intellectual capital reporting system, none of them is widely recognized. This work presents a method for assessing the effectiveness of investing in human resources, taking existing methods into consideration. The method presented focuses on SMEs (taking into consideration their particular importance for regional development). It consists of four parts: an SME reference model, an indicator matrix to assess investments into knowledge, innovation indicators, and the GMDH algorithm for decision making. The method presented is exemplified by a case study including 10 companies.

  8. Methods of patient warming during abdominal surgery.

    Directory of Open Access Journals (Sweden)

    Li Shao

    BACKGROUND: Keeping abdominal surgery patients warm is common and warming methods are needed in power outages during natural disasters. We aimed to evaluate the efficacy of low-cost, low-power warming methods for maintaining normothermia in abdominal surgery patients. METHODS: Patients (n = 160) scheduled for elective abdominal surgery were included in this prospective clinical study. Five warming methods were applied: heated blood transfusion/fluid infusion vs. unheated; wrapping patients vs. not wrapping; applying moist dressings, heated or not; surgical field rinse heated or not; and applying heating blankets or not. Patients' nasopharyngeal and rectal temperatures were recorded to evaluate warming efficacy. Significant differences were found in the mean temperatures of warmed patients compared to those not warmed. RESULTS: When we compared the temperatures of abdominal surgery patient groups receiving three specific warming methods with the temperatures of control groups not receiving these methods, significant differences were revealed in the temperatures maintained during the surgeries between the warmed groups and controls. DISCUSSION: The value of maintaining normothermia in patients undergoing abdominal surgery under general anesthesia is accepted. Three effective, economical, and practically applicable warming methods are: combined body wrapping and heating blanket; combined body wrapping, heated moist dressings, and heating blanket; and combined body wrapping, heated moist dressings, and warmed surgical rinse fluid, with or without heating blanket. These methods are practically applicable when a low-cost method is needed.

  9. Drug overdose resulting in quadriplegia.

    Science.gov (United States)

    Wang, Teresa S; Grunch, Betsy H; Moreno, Jessica R; Bagley, Carlos A; Gottfried, Oren N

    2012-06-01

    To describe a case of cervical flexion myelopathy resulting from a drug overdose. A 56-year-old male presented to the emergency department unable to move his extremities following a drug overdose. Neurological examination revealed a C6 ASIA A spinal cord injury. CT of his cervical spine revealed no fracture; however, MRI revealed cord edema extending from C3 to C6 as well as posterior paraspinal signal abnormalities suggestive of ligamentous injury. The patient underwent a posterior cervical laminectomy and fusion from C3 to C7. Neurologically, he regained 3/5 bilateral triceps function and 2/5 grip strength; otherwise, he remained ASIA A at 6 months. Our patient suffered a spinal cord injury likely due to existing cervical stenosis; in addition, following an overdose of sedating medications, he likely sat in a flexed neck position for a prolonged period of time, unable to modify his position. This likely resulted in cervical spine vascular and/or neurological compromise producing an irreversible spinal cord injury. Spinal cord injury is a rare finding in patients presenting with drug overdose. The lack of physical exam findings suggestive of trauma may delay prompt diagnosis and treatment, and thus clinicians must have a high index of suspicion when evaluating patients in this setting.

  10. TMI-2 core debris analytical methods and results

    International Nuclear Information System (INIS)

    Akers, D.W.; Cook, B.A.

    1984-01-01

    A series of six grab samples was taken from the debris bed of the TMI-2 core in early September 1983. Five of these samples were sent to the Idaho National Engineering Laboratory for analysis. Presented are the analysis strategy for the samples and some of the data obtained from the early stages of examination of the samples (i.e., particle size analysis, gamma spectrometry results, and fissile/fertile material analysis)

  11. National implementation of the UNECE convention on long-range transboundary air pollution (effects). Pt. 1. Deposition loads: methods, modelling and mapping results, trends

    Energy Technology Data Exchange (ETDEWEB)

    Gauger, Thomas [Federal Agricultural Research Centre, Braunschweig (DE). Inst. of Agroecology (FAL-AOE); Stuttgart Univ. (Germany). Inst. of Navigation; Haenel, Hans-Dieter; Roesemann, Claus [Federal Agricultural Research Centre, Braunschweig (DE). Inst. of Agroecology (FAL-AOE)

    2008-09-15

    The report on the implementation of the UNECE convention on long-range transboundary air pollution, Pt. 1, deposition loads (methods, modelling and mapping results, trends), includes the following chapters: introduction; deposition of air pollutants used as input for critical load exceedance calculations; methods applied for mapping total deposition loads; mapping wet deposition; wet deposition mapping results; mapping dry deposition; dry deposition mapping results; cloud and fog mapping results; total deposition mapping results; modelling the air concentration of acidifying components and heavy metals; and agricultural emissions of acidifying and eutrophying species.

  12. Experimental Results for Direction of Arrival Estimation with a Single Acoustic Vector Sensor in Shallow Water

    Directory of Open Access Journals (Sweden)

    Alper Bereketli

    2015-01-01

    We study the performance of several computationally efficient and simple techniques for estimating the direction of arrival (DOA) of an underwater acoustic source using a single acoustic vector sensor (AVS) in shallow water. An underwater AVS is a compact device, consisting of one hydrophone and three accelerometers in a packaged form, that measures scalar pressure and three-dimensional acceleration simultaneously at a single position. A carefully controlled experimental setup was prepared to test how well-known techniques, namely arctan-based, intensity-based, time domain beamforming, and frequency domain beamforming methods, perform in estimating the DOA of a source in different circumstances. Experimental results reveal that in almost all cases the beamforming techniques perform best. Moreover, the arctan-based method, which is the simplest of all, provides satisfactory results for practical purposes.
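    A minimal sketch of the arctan/intensity idea described above, assuming a synthetic plane-wave signal rather than the authors' tank data: average the products of pressure with the two horizontal particle-velocity channels and take the arctangent of the resulting intensity components.

```python
# Sketch of arctan/intensity-based DOA estimation with a single acoustic
# vector sensor: average p*vx and p*vy over time and take atan2.
# The synthetic signal below is a made-up plane wave plus noise.
import numpy as np

rng = np.random.default_rng(0)
fs, f0, true_az = 10_000.0, 500.0, np.deg2rad(40.0)   # assumed values
t = np.arange(0, 0.5, 1.0 / fs)

p = np.cos(2 * np.pi * f0 * t)                        # pressure channel
vx = np.cos(true_az) * p + 0.1 * rng.standard_normal(t.size)
vy = np.sin(true_az) * p + 0.1 * rng.standard_normal(t.size)

# Time-averaged active intensity components, then arctan-based bearing.
Ix, Iy = np.mean(p * vx), np.mean(p * vy)
est_az = np.arctan2(Iy, Ix)
print(f"estimated azimuth: {np.rad2deg(est_az):.1f} deg (true 40.0 deg)")
```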

  13. Revealing time bunching effect in single-molecule enzyme conformational dynamics.

    Science.gov (United States)

    Lu, H Peter

    2011-04-21

    In this perspective, we focus our discussion on how single-molecule spectroscopy and statistical analysis are able to reveal hidden enzyme properties, taking the study of T4 lysozyme as an example. Protein conformational fluctuations and dynamics play a crucial role in biomolecular functions, such as in enzymatic reactions. Single-molecule spectroscopy is a powerful approach to analyze protein conformational dynamics under physiological conditions, providing dynamic perspectives on a molecular-level understanding of protein structure-function mechanisms. Using single-molecule fluorescence spectroscopy, we have probed T4 lysozyme conformational motions under the hydrolysis reaction of a polysaccharide of E. coli B cell walls by monitoring the fluorescence resonant energy transfer (FRET) between a donor-acceptor probe pair tethered to T4 lysozyme domains involved in open-close hinge-bending motions. Based on the single-molecule spectroscopic results, molecular dynamics simulation, a random walk model analysis, and a novel 2D statistical correlation analysis, we have revealed a time bunching effect in protein conformational motion dynamics that is critical to enzymatic functions. The bunching effect implies that conformational motion times tend to bunch in a finite and narrow time window. We show that convoluted multiple Poisson rate processes give rise to the bunching effect in the enzymatic reaction dynamics. Evidently, the bunching effect is likely common in protein conformational dynamics involved in conformation-gated protein functions. In this perspective, we will also discuss a new approach of 2D regional correlation analysis capable of analyzing fluctuation dynamics of complex multiple correlated and anti-correlated fluctuations under a non-correlated noise background. Using this new method, we are able to map out any defined segments along the fluctuation trajectories and determine whether they are correlated, anti-correlated, or non-correlated; after which, a

  14. Three-dimension visualization of transnasal approach for revealing the metasellar organization

    Directory of Open Access Journals (Sweden)

    Liang XUE

    2012-07-01

    Objective To improve anatomical understanding of the metasellar structures by investigating them via the transnasal approach in a virtual-reality (VR) setting. Methods Twenty-eight patients, with spontaneous subarachnoid hemorrhage but without pathological changes of the nasal cavity and sella turcica, underwent lamellar imaging examination and CT angiography with a Discovery Ultra 16. The data were collected and entered into the Dextroscope in DICOM format. Visualization research was carried out via the transnasal approach in a VR setting. Results The anatomic structures along the transnasal approach could be observed dynamically and spatially. When exposing the lateral border of the cavernous carotid artery, it was important to excise the ethmoid cornu, open the posterior ethmoid sinus and sphenopalatine foramen, control the sphenopalatine artery, properly drill out the pterygoid process, and reveal the pterygoid canal. Conclusion The key points of the transnasal approach to exposing the metasellar structures are to remove the ethmoid cornu, the uncinate process, and the bone of the anterior region of the sphenoidal sinus, and to control the sphenopalatine artery. The cavernous carotid arteries are the most important anatomic structures and should be adequately exposed and preserved.

  15. Polymyalgia Rheumatica Revealing a Lymphoma: A Two-Case Report

    Directory of Open Access Journals (Sweden)

    Frank Verhoeven

    2016-01-01

    Introduction. Polymyalgia rheumatica (PMR) is one of the most common inflammatory rheumatic diseases in the elderly population. The link between cancer and PMR is a matter of debate. Methods. We report two cases of PMR leading to the diagnosis of lymphoma, and the growing interest in PET-CT in this indication. Results. An 84-year-old man with known idiopathic neutropenia presented with inflammatory arthromyalgia of the limb girdle of one month's duration. Blood tests highlighted the presence of a monoclonal B cell clone. Bone marrow examination revealed a marginal zone B cell lymphoma. He was successfully treated with 0.3 mg/kg/d of prednisone, and the response was sustained after 6 months. A 73-year-old man with prostatic neoplasia in remission for 5 years presented with arthromyalgia of the limb girdle of one month's duration. PET-CT revealed bursitis of the hips and shoulders, no prostatic cancer recurrence, and a metabolically active iliac lymphadenopathy whose pathological examination revealed a low-grade follicular lymphoma. He was successfully treated with 0.3 mg/kg/d of prednisone. Conclusion. These observations may imply that lymphoma is sometimes already present when PMR is diagnosed and that PET-CT is a useful tool in the initial assessment of PMR to avoid missing neoplasia.

  16. Basic studies on gastrin-radioimmunoassay and the results of its clinical application. Comparative studies between the double antibody method using Wilson's anti-gastrin serum and a gastrin kit (CIS) method

    Energy Technology Data Exchange (ETDEWEB)

    Yabana, T; Uchiya, T; Kakumoto, Y; Waga, Y; Konta, M [Sapporo Medical Coll. (Japan)

    1975-03-01

    Fundamental and practical problems in carrying out the radioimmunoassay of gastrin were studied by comparing the double antibody method, using guinea pig anti-porcine gastrin serum (Wilson Lab.), with the gastrin kit method (G-K, CIS). The former method was found to have a measurable gastrin concentration range between 60 and 1,000 pg/ml, whereas the range of the latter method was between 25 and 800 pg/ml. The reproducibility of each method was satisfactory. The G-K method was affected more readily by co-existing proteins, whereas interference by other biologically active factors, e.g., CCK/PZ, caerulein, etc., was negligible. While there was a highly significant correlation between the values, the values obtained by the G-K method were generally slightly lower than those obtained by the double antibody method. Fractionation analysis employing gel filtration of blood and tissue immunoreactive gastrin led the authors to observe that the value of big gastrin as determined with the G-K method was lower than that obtained by the double antibody method, and that the difference was especially remarkable for gastrin in blood.

  17. A New Method for a Virtue-Based Responsible Conduct of Research Curriculum: Pilot Test Results.

    Science.gov (United States)

    Berling, Eric; McLeskey, Chet; O'Rourke, Michael; Pennock, Robert T

    2018-02-03

    Drawing on Pennock's theory of scientific virtues, we are developing an alternative curriculum for training scientists in the responsible conduct of research (RCR) that emphasizes internal values rather than externally imposed rules. This approach focuses on the virtuous characteristics of scientists that lead to responsible and exemplary behavior. We have been pilot-testing one element of such a virtue-based approach to RCR training by conducting dialogue sessions, modeled upon the approach developed by Toolbox Dialogue Initiative, that focus on a specific virtue, e.g., curiosity and objectivity. During these structured discussions, small groups of scientists explore the roles they think the focus virtue plays and should play in the practice of science. Preliminary results have shown that participants strongly prefer this virtue-based model over traditional methods of RCR training. While we cannot yet definitively say that participation in these RCR sessions contributes to responsible conduct, these pilot results are encouraging and warrant continued development of this virtue-based approach to RCR training.

  18. Surface-enhanced Raman scattering reveals adsorption of mitoxantrone on plasma membrane of living cells

    International Nuclear Information System (INIS)

    Breuzard, G.; Angiboust, J.-F.; Jeannesson, P.; Manfait, M.; Millot, J.-M.

    2004-01-01

    Surface-enhanced Raman scattering (SERS) spectroscopy was applied to analyze mitoxantrone (MTX) adsorption on the plasma membrane microenvironment of sensitive (HCT-116 S) or BCRP/MXR-type resistant (HCT-116 R) cells. The addition of silver colloid to MTX-treated cells revealed an enhanced Raman scattering of MTX. Addition of extracellular DNA induced a total extinction of MTX Raman intensity for both cell lines, which revealed an adsorption of MTX on plasma membrane. A threefold higher MTX Raman intensity was observed for HCT-116 R, suggesting a tight MTX adsorption in the plasma membrane microenvironment. Fluorescence confocal microscopy confirmed a relative MTX emission around plasma membrane for HCT-116 R. After 30 min at 4 deg. C, a threefold decrease of the MTX Raman scattering was observed for HCT-116 R, contrary to HCT-116 S. Permeation with benzyl alcohol revealed a threefold decrease of membrane MTX adsorption on HCT-116 R, exclusively. This additional MTX adsorption should correspond to the drug bound to an unstable site on the HCT-116 R membrane. This study showed that SERS spectroscopy could be a direct method to reveal drug adsorption to the membrane environment of living cells

  19. The Effect of Pleistocene Climate Fluctuations on Distribution of European Abalone (Haliotis tuberculata), Revealed by Combined Mitochondrial and Nuclear Marker Analyses.

    Science.gov (United States)

    Roussel, Valérie; Van Wormhoudt, Alain

    2017-04-01

    The genetic differentiation among populations of the European abalone Haliotis tuberculata was investigated using different markers to better understand the evolutionary history of and exchanges between populations. Three markers were used: mitochondrial cytochrome oxidase I (COI), the sperm lysin nuclear gene, and eight nuclear microsatellites. These markers present different characteristics concerning mutation rate and inheritance, which provided complementary information about abalone history and gene diversity. Genetic diversity and relationships among subspecies were calculated from a sample of approximately 500 individuals, collected from 17 different locations in the north-eastern Atlantic Ocean, Macaronesia, and the Mediterranean Sea. The COI marker was used to explore the phylogeny of the species with a network analysis and two phylogenetic methods. The analysis revealed 18 major haplotypes grouped into two distinct clades with a pairwise sequence divergence of up to 3.5 %. These clades do not correspond to subspecies but reveal repeated contacts along the Atlantic coast during the Pleistocene interglaciations. The sperm lysin gene analysis separated two different subtaxa: one associated with the Macaronesian islands, and the other with all other populations. Moreover, a small population of the northern subtaxon was isolated in the Adriatic Sea, probably before the separation of the two lineages, and evolved independently. Microsatellites were analyzed by different genetic methods, including a Bayesian clustering method and migration pattern analysis. This revealed genetically distinct microsatellite patterns among populations from the Mediterranean Sea, Brittany and Normandy, Morocco, and the Canary and Balearic islands. Gene flow is asymmetric among the regions; the Azores and the Canary Islands are particularly isolated and have low effective population sizes. Our results support the hypothesis that climate changes since the Pleistocene glaciations have played a major role in the

  20. Engaging with mobile methods

    DEFF Research Database (Denmark)

    Jensen, Martin Trandberg

    2014-01-01

    This chapter showcases how mobile methods are more than calibrated techniques awaiting application by tourism researchers, but productive in the enactment of the mobile (Law and Urry, 2004). Drawing upon recent findings deriving from a PhD course on mobility and mobile methods, it reveals... the conceptual ambiguity of the term 'mobile methods'. In order to explore this ambiguity the chapter provides a number of examples deriving from tourism research, to explore how mobile methods are always entangled in ideologies, predispositions, conventions and practice-realities. Accordingly..., the engagements with methods are acknowledged to be always political and contextual, reminding us to avoid essentialist discussions regarding research methods. Finally, the chapter draws on recent fieldwork to extend developments in mobilities-oriented tourism research, by employing auto-ethnography to call...

  1. The quality of reporting methods and results of cost-effectiveness analyses in Spain: a methodological systematic review.

    Science.gov (United States)

    Catalá-López, Ferrán; Ridao, Manuel; Alonso-Arroyo, Adolfo; García-Altés, Anna; Cameron, Chris; González-Bermejo, Diana; Aleixandre-Benavent, Rafael; Bernal-Delgado, Enrique; Peiró, Salvador; Tabarés-Seisdedos, Rafael; Hutton, Brian

    2016-01-07

    Cost-effectiveness analysis has been recognized as an important tool to determine the efficiency of healthcare interventions and services. There is a need for evaluating the reporting of methods and results of cost-effectiveness analyses and establishing their validity. We describe and examine the reporting characteristics of methods and results of cost-effectiveness analyses conducted in Spain over more than two decades. A methodological systematic review was conducted with the information obtained through an updated literature review in PubMed and complementary databases (e.g. Scopus, ISI Web of Science, the National Health Service Economic Evaluation Database (NHS EED) and Health Technology Assessment (HTA) databases from the Centre for Reviews and Dissemination (CRD), Índice Médico Español (IME), and Índice Bibliográfico Español en Ciencias de la Salud (IBECS)). We identified cost-effectiveness analyses conducted in Spain that used quality-adjusted life years (QALYs) as outcome measures (period 1989-December 2014). Two reviewers independently extracted the data from each paper. The data were analysed descriptively. In total, 223 studies were included. Very few studies (10; 4.5 %) reported working from a protocol. Most studies (200; 89.7 %) were simulation models and included a median of 1000 patients. Only 105 (47.1 %) studies presented an adequate description of the characteristics of the target population. Most study interventions were categorized as therapeutic (189; 84.8 %) and nearly half (111; 49.8 %) considered an active alternative as the comparator. Effectiveness data were derived from a single study in 87 (39.0 %) reports, and only a few (40; 17.9 %) used evidence synthesis-based estimates. Few studies (42; 18.8 %) reported a full description of the methods for QALY calculation. The majority of the studies (147; 65.9 %) reported that the study intervention produced "more costs and more QALYs" than the comparator. Most studies (200; 89.7 %) reported favourable

  2. MLFMA-accelerated Nyström method for ultrasonic scattering - Numerical results and experimental validation

    Science.gov (United States)

    Gurrala, Praveen; Downs, Andrew; Chen, Kun; Song, Jiming; Roberts, Ron

    2018-04-01

    Full wave scattering models for ultrasonic waves are necessary for the accurate prediction of voltage signals received from complex defects/flaws in practical nondestructive evaluation (NDE) measurements. We propose the high-order Nyström method accelerated by the multilevel fast multipole algorithm (MLFMA) as an improvement to the state-of-the-art full-wave scattering models that are based on boundary integral equations. We present numerical results demonstrating improvements in simulation time and memory requirement. Particularly, we demonstrate the need for higher order geometry and field approximation in modeling NDE measurements. Also, we illustrate the importance of full-wave scattering models using experimental pulse-echo data from a spherical inclusion in a solid, which cannot be modeled accurately by approximation-based scattering models such as the Kirchhoff approximation.

  3. A Method of Calculating Motion Error in a Linear Motion Bearing Stage

    Directory of Open Access Journals (Sweden)

    Gyungho Khim

    2015-01-01

    We report a method of calculating the motion error of a linear motion bearing stage. The transfer function method, which exploits reaction forces of individual bearings, is effective for estimating motion errors; however, it requires the rail-form errors. This is not suitable for a linear motion bearing stage because obtaining the rail-form errors is not straightforward. In the method described here, we use the straightness errors of a bearing block to calculate the reaction forces on the bearing block. The reaction forces were compared with those of the transfer function method. Parallelism errors between two rails were considered, and the motion errors of the linear motion bearing stage were measured and compared with the results of the calculations, revealing good agreement.
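    As a rough sketch of how bearing-level errors map into table motion errors (a simple rigid-carriage, equal-stiffness assumption, not the transfer-function formulation or the authors' procedure), a least-squares fit of the straightness errors sampled at the bearing-block positions yields the vertical error and pitch of the table:

```python
# Hedged sketch: elastic-averaging estimate of table motion error from
# bearing-block straightness errors. Assumes a rigid carriage on four
# equally stiff bearings; this is an illustration, not the paper's method.
import numpy as np

# Longitudinal positions of the four bearing blocks relative to table centre (m)
x_bearing = np.array([-0.10, -0.10, 0.10, 0.10])
# Straightness errors of the rail/bearing interface at those positions (um)
e_bearing = np.array([0.8, 0.6, -0.4, -0.2])

# With equal stiffness, static equilibrium minimises sum((z + theta*x - e)^2),
# i.e. a least-squares line fit of e against x.
A = np.column_stack([np.ones_like(x_bearing), x_bearing])
(z, theta), *_ = np.linalg.lstsq(A, e_bearing, rcond=None)

print(f"vertical motion error z = {z:.3f} um, pitch = {theta:.3f} um/m")
```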

  4. A Method of Calculating Motion Error in a Linear Motion Bearing Stage

    Science.gov (United States)

    Khim, Gyungho; Park, Chun Hong; Oh, Jeong Seok

    2015-01-01

    We report a method of calculating the motion error of a linear motion bearing stage. The transfer function method, which exploits reaction forces of individual bearings, is effective for estimating motion errors; however, it requires the rail-form errors. This is not suitable for a linear motion bearing stage because obtaining the rail-form errors is not straightforward. In the method described here, we use the straightness errors of a bearing block to calculate the reaction forces on the bearing block. The reaction forces were compared with those of the transfer function method. Parallelism errors between two rails were considered, and the motion errors of the linear motion bearing stage were measured and compared with the results of the calculations, revealing good agreement. PMID:25705715

  5. Higher order analytical approximate solutions to the nonlinear pendulum by He's homotopy method

    International Nuclear Information System (INIS)

    Belendez, A; Pascual, C; Alvarez, M L; Mendez, D I; Yebra, M S; Hernandez, A

    2009-01-01

    A modified He's homotopy perturbation method is used to calculate the periodic solutions of a nonlinear pendulum. The method has been modified by truncating the infinite series corresponding to the first-order approximate solution and substituting a finite number of terms in the second-order linear differential equation. As can be seen, the modified homotopy perturbation method works very well for high values of the initial amplitude. Excellent agreement of the analytical approximate period with the exact period has been demonstrated not only for small but also for large amplitudes A (the relative error is less than 1% for A < 152 deg.). Comparison of the results obtained using this method with the exact ones reveals that this modified method is very effective and convenient.
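    For context, the exact pendulum period can be written with the complete elliptic integral of the first kind, T = (4/ω0)K(m) with m = sin²(A/2), and the small-amplitude series gives T ≈ T0(1 + A²/16). The sketch below reproduces that kind of amplitude-dependent comparison; it is not He's homotopy expression from the paper.

```python
# Exact pendulum period via the complete elliptic integral, compared with the
# low-order series approximation T ~ T0*(1 + A^2/16). Not the homotopy
# expression from the paper; just the standard benchmark it is measured against.
import numpy as np
from scipy.special import ellipk

omega0 = 1.0                      # sqrt(g/L), arbitrary units
T0 = 2 * np.pi / omega0           # small-amplitude period

for A_deg in (10, 60, 120, 150):
    A = np.deg2rad(A_deg)
    T_exact = 4.0 / omega0 * ellipk(np.sin(A / 2) ** 2)   # ellipk takes m = k^2
    T_series = T0 * (1 + A ** 2 / 16)
    err = abs(T_series - T_exact) / T_exact * 100
    print(f"A = {A_deg:3d} deg: exact {T_exact:.4f}, series {T_series:.4f}, error {err:.2f}%")
```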

  6. New results to BDD truncation method for efficient top event probability calculation

    International Nuclear Information System (INIS)

    Mo, Yuchang; Zhong, Farong; Zhao, Xiangfu; Yang, Quansheng; Cui, Gang

    2012-01-01

    A Binary Decision Diagram (BDD) is a graph-based data structure that allows an exact top event probability (TEP) to be calculated. It has been a very difficult task to develop an efficient BDD algorithm that can solve a large problem, since memory consumption is very high. Recently, in order to solve large reliability problems within limited computational resources, Jung presented an efficient method to maintain a small BDD size by truncating the BDD during the BDD calculation. In this paper, it is first identified that Jung's BDD truncation algorithm can be improved for more practical use. Then, a more efficient truncation algorithm is proposed, which generates a truncated BDD of smaller size and an approximate TEP with smaller truncation error. Empirical results showed that the new algorithm uses slightly less running time and slightly more storage than Jung's algorithm. It was also found that designing a truncation algorithm with ideal features for every possible fault tree is very difficult, if not impossible. The ideal features referred to here are that, as the truncation limit decreases, the size of the truncated BDD converges to the size of the exact BDD and never exceeds it.
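    For reference, the exact TEP calculation that the truncation approximates is a recursive Shannon decomposition over the BDD. The sketch below shows that recursion on a tiny, made-up BDD; the truncation of low-probability branches proposed in the paper is not implemented here.

```python
# Exact top event probability from a (reduced, ordered) BDD via Shannon
# decomposition: P(node) = p(x)*P(high) + (1-p(x))*P(low).
# The BDD below encodes TOP = x1 AND (x2 OR x3) and is only an example.
from functools import lru_cache

# A node is ("var", low_child, high_child); terminal nodes are 0 or 1.
bdd = ("x1",
       0,                                   # x1 = 0 -> TOP does not occur
       ("x2",
        ("x3", 0, 1),                       # x1 = 1, x2 = 0 -> need x3
        1))                                 # x1 = 1, x2 = 1 -> TOP occurs

prob = {"x1": 1e-3, "x2": 5e-2, "x3": 2e-2}  # basic event probabilities

@lru_cache(maxsize=None)
def top_event_probability(node):
    if node in (0, 1):
        return float(node)
    var, low, high = node
    p = prob[var]
    return p * top_event_probability(high) + (1 - p) * top_event_probability(low)

# Exact value: 1e-3 * (0.05 + 0.95*0.02) = 6.9e-5
print(top_event_probability(bdd))
```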

  7. Effects of extraction methods and factors on leaching of metals from recycled concrete aggregates.

    Science.gov (United States)

    Bestgen, Janile O; Cetin, Bora; Tanyu, Burak F

    2016-07-01

    Leaching of metals (calcium (Ca), chromium (Cr), copper (Cu), iron (Fe), and zinc (Zn)) from recycled concrete aggregates (RCAs) was investigated with four different leachate extraction methods (batch water leach tests (WLTs), the toxicity characteristic leaching procedure test (TCLP), the synthetic precipitation leaching procedure test (SPLP), and pH-dependent leach tests). WLTs were also used to perform a parametric study to evaluate factors including (i) reaction time, (ii) atmosphere, (iii) liquid-to-solid (L/S) ratio, and (iv) RCA particle size. The results from the WLTs showed that reaction time and exposure to the atmosphere had an impact on the leaching behavior of the metals. An increase in L/S ratio decreased the effluent pH and all metal concentrations. The particle size of the RCA had an impact on some metals but not all. Comparison of the leached concentrations of metals from select RCA samples obtained with the WLT method to those obtained with the TCLP and SPLP methods revealed significant differences. For the same RCA samples, the highest metal concentrations were obtained with the TCLP method, followed by the WLT and SPLP methods. However, the concentrations of all four metals (Cr, Cu, Fe, and Zn) were below the regulatory limits set by EPA MCLs in all tests, with few exceptions. pH-dependent batch water leach tests revealed that the leaching pattern for Ca was cationic, whereas the other metals showed more amphoteric behavior. The results obtained from the pH-dependent tests were evaluated with geochemical modeling (MINTEQA2) to estimate the governing leaching mechanisms for the different metals. The results indicated that the releases of the elements were solubility-controlled, except for Cr.

  8. Coupling of THALES and FROST using MPI Method

    International Nuclear Information System (INIS)

    Park, Jin Woo; Ryu, Seok Hee; Jung, Chan Do; Jung, Jee Hoon; Um, Kil Sup; Lee, Jae Il

    2013-01-01

    This paper presents the coupling method between THALES and FROST and the simulation results obtained with the coupled code system. In this study, the subchannel analysis code THALES and the transient fuel performance code FROST were coupled using the MPI method as the first stage of the development of a multi-dimensional safety analysis methodology. As part of the validation, a CEA ejection accident was simulated using the coupled THALES-FROST code and the results were compared with the ShinKori 3 and 4 FSAR. The comparison revealed that CHASER, coupled using the MPI method, predicts fuel temperatures and heat flux quantitatively well. Thus it was confirmed that THALES and FROST are properly coupled. In the near future, the multi-dimensional core neutron kinetics code ASTRA will be linked to the THALES-FROST code for detailed three-dimensional CEA ejection analysis. The current safety analysis methodology for a CEA ejection accident, which is based on numerous conservative assumptions and the point kinetics model, yields quite adverse consequences. Thus, KNF is developing a multi-dimensional safety analysis methodology to improve the calculated consequences of the CEA ejection accident. For this purpose, the three-dimensional core neutron kinetics code ASTRA, the subchannel analysis code THALES, and the transient fuel performance analysis code FROST are being coupled using the Message Passing Interface (MPI). As a first step, THALES and FROST have been coupled and tested
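    A minimal sketch of the two-code MPI exchange pattern, assuming mpi4py and made-up field names (this is not the actual THALES-FROST interface): one rank stands in for the subchannel code and the other for the fuel performance code, exchanging a heat flux and a fuel temperature array once per time step.

```python
# Minimal sketch of a two-code MPI coupling loop in the spirit of the
# THALES-FROST coupling: rank 0 plays the subchannel (thermal-hydraulic) code,
# rank 1 the fuel-performance code. Field names and physics are placeholders.
# Run with: mpiexec -n 2 python couple_sketch.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
n_nodes, n_steps = 10, 5

if rank == 0:                                  # "thermal-hydraulics" side
    heat_flux = np.full(n_nodes, 5.0e5)        # W/m^2, illustrative
    for step in range(n_steps):
        comm.send(heat_flux, dest=1, tag=step)         # send wall heat flux
        fuel_temp = comm.recv(source=1, tag=step)      # receive fuel temperature
        # ...update coolant/clad solution using fuel_temp here...
        heat_flux = heat_flux * 1.01                   # placeholder update
    print("TH side finished, mean fuel temp:", fuel_temp.mean())
else:                                          # "fuel performance" side
    for step in range(n_steps):
        heat_flux = comm.recv(source=0, tag=step)
        fuel_temp = 600.0 + 1.0e-3 * heat_flux         # placeholder conduction model
        comm.send(fuel_temp, dest=0, tag=step)
```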

  9. Parental diabetes status reveals association of mitochondrial DNA haplogroup J1 with type 2 diabetes

    Directory of Open Access Journals (Sweden)

    Wainstein Julio

    2009-06-01

    Background Although mitochondrial dysfunction is consistently manifested in patients with Type 2 Diabetes mellitus (T2DM), the association of mitochondrial DNA (mtDNA) sequence variants with T2DM varies among populations. These differences might stem from differing environmental influences among populations. However, other potentially important considerations emanate from the very nature of mitochondrial genetics, namely the notably high degree of partitioning in the distribution of human mtDNA variants among populations, as well as the interaction of mtDNA and nuclear DNA-encoded factors working in concert to govern mitochondrial function. We hypothesized that an association of mtDNA genetic variants with T2DM could be revealed while controlling for the effect of additional inherited factors, reflected in family history information. Methods To test this hypothesis we set out to investigate whether mtDNA genetic variants are differentially associated with T2DM depending on the diabetes status of the parents. To this end, the association of mtDNA genetic backgrounds (haplogroups) with T2DM was assessed in 1055 Jewish patients with and without T2DM parents ('DP' and 'HP', respectively). Results Haplogroup J1 was found to be 2.4-fold under-represented in the 'HP' patients (p = 0.0035). These results are consistent with a previous observation made in Finnish T2DM patients. Moreover, assessing the haplogroup distribution in 'DP' versus 'HP' patients having diabetic siblings revealed that haplogroup J1 was virtually absent in the 'HP' group. Conclusion These results imply the involvement of inherited factors, which modulate the susceptibility of haplogroup J1 to T2DM.

  10. Relation between financial market structure and the real economy: comparison between clustering methods.

    Science.gov (United States)

    Musmeci, Nicoló; Aste, Tomaso; Di Matteo, T

    2015-01-01

    We quantify the amount of information filtered by different hierarchical clustering methods on correlations between stock returns, comparing the clustering structure with the underlying industrial activity classification. We apply, for the first time to financial data, a novel hierarchical clustering approach, the Directed Bubble Hierarchical Tree, and we compare it with other methods including the Linkage and k-medoids. By taking the industrial sector classification of stocks as a benchmark partition, we evaluate how the different methods retrieve this classification. The results show that the Directed Bubble Hierarchical Tree can outperform other methods, being able to retrieve more information with fewer clusters. Moreover, we show that the economic information is hidden at different levels of the hierarchical structures depending on the clustering method. The dynamical analysis on a rolling window also reveals that the different methods show different degrees of sensitivity to events affecting financial markets, like crises. These results can be of interest for all the applications of clustering methods to portfolio optimization and risk hedging [corrected].
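    As a hedged sketch of the general workflow (standard Linkage clustering on synthetic returns, not the Directed Bubble Hierarchical Tree itself): convert the return correlation matrix into the usual distance d = sqrt(2(1 − ρ)), cluster hierarchically, and compare the recovered partition with the sector labels.

```python
# Sketch: hierarchical clustering of stocks from return correlations using the
# standard distance d = sqrt(2*(1 - rho)). Synthetic returns stand in for real
# market data; the DBHT method from the paper is not reproduced here.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(1)
n_stocks, n_days, n_sectors = 30, 500, 3
sector = np.repeat(np.arange(n_sectors), n_stocks // n_sectors)

# Returns with a common sector factor so that a block structure exists.
factors = rng.standard_normal((n_sectors, n_days))
returns = factors[sector] + 1.5 * rng.standard_normal((n_stocks, n_days))

rho = np.corrcoef(returns)
dist = np.sqrt(2.0 * (1.0 - rho))
np.fill_diagonal(dist, 0.0)

Z = linkage(squareform(dist, checks=False), method="average")
labels = fcluster(Z, t=n_sectors, criterion="maxclust")
print("recovered clusters:", labels)
print("true sectors:      ", sector + 1)
```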

  11. Relation between financial market structure and the real economy: comparison between clustering methods.

    Directory of Open Access Journals (Sweden)

    Nicoló Musmeci

    We quantify the amount of information filtered by different hierarchical clustering methods on correlations between stock returns, comparing the clustering structure with the underlying industrial activity classification. We apply, for the first time to financial data, a novel hierarchical clustering approach, the Directed Bubble Hierarchical Tree, and we compare it with other methods including the Linkage and k-medoids. By taking the industrial sector classification of stocks as a benchmark partition, we evaluate how the different methods retrieve this classification. The results show that the Directed Bubble Hierarchical Tree can outperform other methods, being able to retrieve more information with fewer clusters. Moreover, we show that the economic information is hidden at different levels of the hierarchical structures depending on the clustering method. The dynamical analysis on a rolling window also reveals that the different methods show different degrees of sensitivity to events affecting financial markets, like crises. These results can be of interest for all the applications of clustering methods to portfolio optimization and risk hedging [corrected].

  12. Uranium City radiation reduction program: further efforts at remedial measures for houses with block walls, concrete porosity test results, and intercomparison of Kuznetz method and Tsivoglau method

    International Nuclear Information System (INIS)

    Haubrich, E.; Leung, M.K.; Mackie, R.

    1980-01-01

    An attempt was made to reduce the levels of radon in a house in Uranium City by mechanically venting the plenums in the concrete block basement walls, with little success. A table compares the results obtained by measuring the radon WL using the Tsivoglau and the Kuznetz methods

  13. Diamagnetic measurements on ISX-B: method and results

    International Nuclear Information System (INIS)

    Neilson, G.H.

    1983-10-01

    A diamagnetic loop is used on the ISX-B tokamak to measure the change in toroidal magnetic flux, σΦ, caused by finite plasma current and perpendicular pressure. From this measurement, the perpendicular poloidal beta, β_I⊥, is determined. The principal difficulty encountered is in identifying and making corrections for various noise components which appear in the measured flux. These result from coupling between the measuring loops and the toroidal and poloidal field windings, both directly and through currents induced in the vacuum vessel and coils themselves. An analysis of these couplings is made and techniques for correcting them developed. Results from the diamagnetic measurement, employing some of these correction techniques, are presented and compared with other data. The obtained values of β_I⊥ agree with those obtained from the equilibrium magnetic analysis (β_IΔ) in ohmically heated plasmas, indicating no anisotropy. However, with 0.3 to 2.0 MW of tangential neutral beam injection, β_IΔ is consistently greater than β_I⊥ and qualitatively consistent with the formation of an anisotropic ion velocity distribution and with toroidal rotation. Quantitatively, the difference between β_IΔ and β_I⊥ is more than can be accounted for on the basis of the usual classical fast ion calculations and spectroscopic rotation measurements

  14. Application of NDE methods to green ceramics: initial results

    International Nuclear Information System (INIS)

    Kupperman, D.S.; Karplus, H.B.; Poeppel, R.B.; Ellingson, W.A.; Berger, H.; Robbins, C.; Fuller, E.

    1983-01-01

    The effectiveness of microradiography, ultrasonic methods, nuclear magnetic resonance, and neutron radiography was assessed for the nondestructive evaluation of green (unfired) ceramics. The application of microradiography to ceramics is reviewed, and preliminary experiments with a commercial microradiography unit are described. Conventional ultrasonic techniques are difficult to apply to flaw detection in green ceramics because of the high attenuation, fragility, and couplant-absorbing properties of these materials. However, velocity, attenuation, and spectral data were obtained with pressure-coupled transducers and provided useful information related to density variations and the presence of agglomerates. Nuclear magnetic resonance (NMR) imaging techniques and neutron radiography were considered for detection of anomalies in the distribution of porosity. With NMR, areas of high porosity might be detected after the samples are doped with water. In the case of neutron radiography, although imaging the binder distribution throughout the sample may not be feasible because of the low overall concentration of binder, regions of high binder concentration (and thus high porosity) should be detectable

  15. EIT Imaging of admittivities with a D-bar method and spatial prior: experimental results for absolute and difference imaging.

    Science.gov (United States)

    Hamilton, S J

    2017-05-22

    Electrical impedance tomography (EIT) is an emerging imaging modality that uses harmless electrical measurements taken on electrodes at a body's surface to recover information about the internal electrical conductivity and/or permittivity. The image reconstruction task of EIT is a highly nonlinear inverse problem that is sensitive to noise and modeling errors, which makes it challenging. D-bar methods solve the nonlinear problem directly, bypassing the need for detailed and time-intensive forward models, to provide absolute (static) as well as time-difference EIT images. Coupling the D-bar methodology with the inclusion of high-confidence a priori data results in a noise-robust regularized image reconstruction method. In this work, the a priori D-bar method for complex admittivities is demonstrated to be effective on experimental tank data for absolute imaging for the first time. Additionally, the method is adjusted for, and tested on, time-difference imaging scenarios. The ability of the method to be used for conductivity, permittivity, absolute, as well as time-difference imaging provides the user with great flexibility without a high computational cost.

  16. Cerebral Ischemia versus MS in Young Adults Clinical Imaging Diagnosis Difficulties and Recovery Methods

    OpenAIRE

    Any DOCU-AXELERAD; Dan DOCU-AXELERAD

    2012-01-01

    Ischemia in young adults is often the result of non-atherosclerotic vasculopathies, cardiac embolism or clotting disorders. In one third of young adults with ischemic stroke, the etiology remains undetermined. Materials and methods: We present the case of a 42-year-old patient, diagnosed with probable MS and without cardiovascular or metabolic risk factors, who presented to our clinic with decreased strength in the right limbs and recent dysarthria. Results and discussion: The history revealed recurrent episodes of right h...

  17. The place of highly accurate methods by RNAA in metrology

    International Nuclear Information System (INIS)

    Dybczynski, R.; Danko, B.; Polkowska-Motrenko, H.; Samczynski, Z.

    2006-01-01

    With the introduction of physical metrological concepts to chemical analysis, which require that a result be accompanied by an uncertainty statement expressed in terms of SI units, several researchers started to consider ID-MS as the only method fulfilling this requirement. However, recent publications have revealed that in certain cases even expert laboratories using ID-MS and analyzing the same material produced results whose uncertainty statements did not overlap, which theoretically should not happen. This shows that no monopoly is good in science, and it would be desirable to widen the set of methods acknowledged as primary in inorganic trace analysis. Moreover, ID-MS cannot be used for monoisotopic elements. The need to search for other methods of metrological quality similar to that of ID-MS seems obvious. In this paper, our long-term experience in devising highly accurate ('definitive') methods by RNAA for the determination of selected trace elements in biological materials is reviewed. The general idea of definitive methods, based on the combination of neutron activation with the highly selective and quantitative isolation of the indicator radionuclide by column chromatography followed by gamma spectrometric measurement, is recalled and illustrated by examples of the performance of such methods when determining Cd, Co, Mo, etc. It is demonstrated that such methods are able to provide very reliable results with very low levels of uncertainty traceable to SI units

  18. Phase analysis in duplex stainless steel: comparison of EBSD and quantitative metallography methods

    International Nuclear Information System (INIS)

    Michalska, J; Chmiela, B

    2014-01-01

    The purpose of the research was to work out a qualitative and quantitative analysis of phases in DSS in the as-received state and after thermal aging. For qualitative purposes, SEM observations and EDS analyses together with electron backscattered diffraction (EBSD) methods were employed. Quantitative analysis of phases was performed by two methods: EBSD and classical quantitative metallography. A comparison of different etchants for revealing the microstructure and a brief review of sample preparation methods for EBSD studies are presented. Different ways of sample preparation were tested and, based on these results, a detailed methodology for DSS phase analysis was developed, including surface finishing, selective etching methods and image acquisition. The advantages and disadvantages of the applied methods were pointed out, and the accuracy of the phase analysis performed by both methods was compared

  19. Studies on mycobacterium tuberculosis sensitivity test by using the method of rapid radiometry with appendixes of clinical results

    International Nuclear Information System (INIS)

    Yang Yongqing; Jiang Yimin; Lu Wendong; Zhu Rongen

    1987-01-01

    Three standard strains of Mycobacterium tuberculosis (H 37 RV, fully sensitive; SM-R 1000 μg/ml; RFP-R 100 μg/ml) were tested with 10 concentrations of 5 antitubercular agents: INH, SM, PAS, RFP and EB. 114 isolates of Mycobacterium tuberculosis taken from patients were tested with INH, PAS, SM and RFP. The results agreed with those of the standard Lowenstein-Jensen method in 81.7% of cases. 82% of the isolate tests were completed within 5 days. The method may be used in routine clinical work. The liquid media prepared by the authors do not require human serum albumin and are less expensive and readily available

  20. Approach for discrimination and quantification of electroactive species: kinetics difference revealed by higher harmonics of Fourier transformed sinusoidal voltammetry.

    Science.gov (United States)

    Fang, Yishan; Huang, Xinjian; Wang, Lishi

    2015-01-06

    Discrimination and quantification of electroactive species are traditionally realized by a potential difference, which is mainly determined by thermodynamics. However, the resolution of this approach is limited to tens of millivolts. In this paper, we describe an application of Fourier transformed sinusoidal voltammetry (FT-SV) that provides a new approach for the discrimination and quantitative evaluation of electroactive species, especially thermodynamically similar ones. Numerical simulation indicates that differences in electron transfer kinetics between electroactive species can be revealed by the phase angle of the higher-order harmonics of FT-SV, and that the difference can be amplified order by order. Thus, even a very subtle kinetics difference can be amplified to be distinguishable at a certain order of harmonics. This method was verified with structurally similar ferrocene derivatives, which were chosen as the model systems. Although these molecules have very close redox potentials, they could be distinguished at higher-order harmonics. The results demonstrated the feasibility and reliability of the method. It was also implied that the combination of the traditional thermodynamic method and this kinetics method can form a two-dimensionally resolved detection method, and it has the potential to extend the resolution of voltammetric techniques to a new level.
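    A minimal sketch of the harmonic-extraction step (with a toy exponential nonlinearity standing in for the electrode response; this is not the authors' electrochemical model): drive a system with a sinusoidal potential, Fourier-transform the current, and read off the magnitude and phase at the first few harmonics of the excitation frequency.

```python
# Sketch of extracting higher-harmonic magnitude and phase from a response to a
# sinusoidal excitation, as in Fourier-transformed sinusoidal voltammetry.
# The exponential "current" below is a toy nonlinearity, not a real
# electrode-kinetics model.
import numpy as np

fs, f0, duration = 10_000.0, 10.0, 2.0          # Hz, Hz, s (illustrative)
t = np.arange(0, duration, 1.0 / fs)
e_t = 0.05 * np.sin(2 * np.pi * f0 * t)         # sinusoidal potential (V)
i_t = np.exp(e_t / 0.0257) - 1.0                # toy nonlinear current response

spectrum = np.fft.rfft(i_t * np.hanning(t.size))
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)

for n in (1, 2, 3, 4):                          # fundamental and higher harmonics
    k = np.argmin(np.abs(freqs - n * f0))       # FFT bin nearest n*f0
    mag, phase = np.abs(spectrum[k]), np.degrees(np.angle(spectrum[k]))
    print(f"harmonic {n}: |I| = {mag:.3e}, phase = {phase:6.1f} deg")
```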

  1. The solution of a coupled system of nonlinear physical problems using the homotopy analysis method

    International Nuclear Information System (INIS)

    El-Wakil, S A; Abdou, M A

    2010-01-01

    In this article, the homotopy analysis method (HAM) has been applied to solve coupled nonlinear evolution equations in physics. The validity of this method has been successfully demonstrated by applying it to two nonlinear evolution equations, namely coupled nonlinear diffusion reaction equations and the (2+1)-dimensional Nizhnik-Novikov Veselov system. The results obtained by this method show good agreement with the ones obtained by other methods. The proposed method is a powerful and easy to use analytic tool for nonlinear problems and does not need small parameters in the equations. The HAM solutions contain an auxiliary parameter that provides a convenient way of controlling the convergence region of series solutions. The results obtained here reveal that the proposed method is very effective and simple for solving nonlinear evolution equations. The basic ideas of this approach can be widely employed to solve other strongly nonlinear problems.

  2. Clustering Methods with Qualitative Data: a Mixed-Methods Approach for Prevention Research with Small Samples.

    Science.gov (United States)

    Henry, David; Dymnicki, Allison B; Mohatt, Nathaniel; Allen, James; Kelly, James G

    2015-10-01

    Qualitative methods potentially add depth to prevention research but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed-methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed-methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-means clustering, and latent class analysis produced similar levels of accuracy with binary data and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a "real-world" example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities.
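    A hedged sketch of the type of simulation described in the first study, using made-up binary code profiles rather than the authors' data: cluster participants' binary code vectors with K-means and with hierarchical clustering, then score recovery of the known grouping with the adjusted Rand index.

```python
# Sketch: clustering binary "qualitative code" profiles with K-means and with
# hierarchical (average-linkage, Jaccard) clustering. Data are simulated with
# two underlying participant profiles; this mirrors the kind of simulation
# described, not the authors' exact procedure.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

rng = np.random.default_rng(42)
n_per_group, n_codes = 25, 20                   # 50 participants, 20 codes
p_group = np.array([[0.8] * 10 + [0.2] * 10,    # code profile of group 1
                    [0.2] * 10 + [0.8] * 10])   # code profile of group 2
true = np.repeat([0, 1], n_per_group)
X = (rng.random((2 * n_per_group, n_codes)) < p_group[true]).astype(int)

km_labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
hc_labels = fcluster(linkage(pdist(X, metric="jaccard"), method="average"),
                     t=2, criterion="maxclust")

print("K-means ARI:     ", adjusted_rand_score(true, km_labels))
print("hierarchical ARI:", adjusted_rand_score(true, hc_labels))
```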

  3. Clustering Methods with Qualitative Data: A Mixed Methods Approach for Prevention Research with Small Samples

    Science.gov (United States)

    Henry, David; Dymnicki, Allison B.; Mohatt, Nathaniel; Allen, James; Kelly, James G.

    2016-01-01

    Qualitative methods potentially add depth to prevention research, but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data, but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-Means clustering, and latent class analysis produced similar levels of accuracy with binary data, and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a “real-world” example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities. PMID:25946969

  4. Motivation Beliefs of Secondary School Teachers in Canada and Singapore: A Mixed Methods Study

    Science.gov (United States)

    Klassen, Robert M.; Chong, Wan Har; Huan, Vivien S.; Wong, Isabella; Kates, Allison; Hannok, Wanwisa

    2008-01-01

    A mixed methods approach was used to explore secondary teachers' motivation beliefs in Canada and Singapore. Results from Study 1 revealed that socio-economic status (SES) was the strongest predictor of school climate in Canada, and that collective efficacy mediated the effect of SES on school climate in Singapore, but not in Canada. In Study 2,…

  5. A Quantitative Method for Localizing User Interface Problems: The D-TEO Method

    Directory of Open Access Journals (Sweden)

    Juha Lamminen

    2009-01-01

    A large array of evaluation methods has been proposed to identify Website usability problems. In log-based evaluation, information about the performance of users is collected and stored in log files, and used to find problems and deficiencies in Web page designs. Most methods require the programming and modeling of large task models, which are cumbersome processes for evaluators. Also, because much statistical data is collected in log files, recognizing which Web pages require deeper usability analysis is difficult. This paper suggests a novel quantitative method, called the D-TEO, for locating problematic Web pages. This semiautomated method explores the decomposition of the interaction tasks of directed information search into elementary operations, deploying two quantitative usability criteria, search success and search time, to reveal how a user navigates within a web of hypertext.

  6. Application of machine learning methods to histone methylation ChIP-Seq data reveals H4R3me2 globally represses gene expression

    Science.gov (United States)

    2010-01-01

    Background In the last decade, biochemical studies have revealed that epigenetic modifications including histone modifications, histone variants and DNA methylation form a complex network that regulate the state of chromatin and processes that depend on it including transcription and DNA replication. Currently, a large number of these epigenetic modifications are being mapped in a variety of cell lines at different stages of development using high throughput sequencing by members of the ENCODE consortium, the NIH Roadmap Epigenomics Program and the Human Epigenome Project. An extremely promising and underexplored area of research is the application of machine learning methods, which are designed to construct predictive network models, to these large-scale epigenomic data sets. Results Using a ChIP-Seq data set of 20 histone lysine and arginine methylations and histone variant H2A.Z in human CD4+ T-cells, we built predictive models of gene expression as a function of histone modification/variant levels using Multilinear (ML) Regression and Multivariate Adaptive Regression Splines (MARS). Along with extensive crosstalk among the 20 histone methylations, we found H4R3me2 was the most and second most globally repressive histone methylation among the 20 studied in the ML and MARS models, respectively. In support of our finding, a number of experimental studies show that PRMT5-catalyzed symmetric dimethylation of H4R3 is associated with repression of gene expression. This includes a recent study, which demonstrated that H4R3me2 is required for DNMT3A-mediated DNA methylation--a known global repressor of gene expression. Conclusion In stark contrast to univariate analysis of the relationship between H4R3me2 and gene expression levels, our study showed that the regulatory role of some modifications like H4R3me2 is masked by confounding variables, but can be elucidated by multivariate/systems-level approaches. PMID:20653935
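    A minimal sketch of the multilinear-regression component on synthetic data (MARS is omitted, and the negative coefficient on the hypothetical H4R3me2 column is built into the simulation rather than derived from ChIP-Seq data): regress a log-expression proxy on histone-mark levels and inspect the signed coefficients.

```python
# Sketch: multiple linear regression of gene expression on histone-mark levels.
# Data are synthetic; the negative coefficient on "H4R3me2" is built into the
# simulation to mimic the kind of repressive effect reported, not derived from
# real ChIP-Seq data.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(7)
marks = ["H3K4me3", "H3K27me3", "H4R3me2", "H2A.Z"]
n_genes = 5_000

X = rng.lognormal(mean=0.0, sigma=1.0, size=(n_genes, len(marks)))
true_beta = np.array([1.2, -0.6, -1.0, 0.4])             # assumed effect signs
y = X @ true_beta + rng.normal(scale=1.0, size=n_genes)  # log expression proxy

model = LinearRegression().fit(X, y)
for name, coef in zip(marks, model.coef_):
    print(f"{name:>9s}: beta = {coef:+.2f}")
print("R^2 =", round(model.score(X, y), 3))
```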

  7. Merging metagenomics and geochemistry reveals environmental controls on biological diversity and evolution.

    Science.gov (United States)

    Alsop, Eric B; Boyd, Eric S; Raymond, Jason

    2014-05-28

    … facilitated accurate prediction of the ordering of community functional composition along geochemical gradients, despite a lack of geochemical input. The consistency in the results obtained from the application of Markov clustering and multivariate methods to distinct natural systems underscores their utility in predicting the functional potential of microbial communities within a natural system based on system geochemistry alone, allowing geochemical measurements to be used to predict purely biological metrics such as microbial community composition and metabolism.
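
    The abstract above is truncated, but the kind of analysis it describes, recovering the ordering of community functional composition along a geochemical gradient, can be sketched generically as an ordination followed by a rank correlation. This is not the authors' pipeline; the functional profiles and the "gradient" variable below are synthetic placeholders.

```python
import numpy as np
from scipy.stats import spearmanr
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)

# Hypothetical input: rows are metagenome samples, columns are relative
# abundances of protein functional categories, plus one geochemical
# measurement per sample (e.g., temperature along a gradient).
n_samples, n_categories = 12, 200
gradient = np.linspace(20, 90, n_samples)          # e.g., temperature in deg C
signal = np.abs(np.outer(gradient, rng.normal(size=n_categories))) * 0.01
profiles = rng.dirichlet(np.ones(n_categories), size=n_samples) + signal
profiles /= profiles.sum(axis=1, keepdims=True)    # renormalize to relative abundance

# Ordinate the functional profiles without using the geochemistry, then ask
# whether the leading axis recovers the ordering imposed by the gradient.
pc1 = PCA(n_components=1).fit_transform(np.log(profiles + 1e-6)).ravel()
rho, p = spearmanr(pc1, gradient)
print(f"Spearman rho between PC1 and the geochemical gradient: {rho:.2f} (p={p:.3g})")
```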

  8. Puerto Rico Revealed Preference Survey Data 2004

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Revealed preference models provide insights into recreational angler behavior and the economic value of recreational fishing trips. Revealed preference data is...

  9. Differences in quantitative methods for measuring subjective cognitive decline - results from a prospective memory clinic study.

    Science.gov (United States)

    Vogel, Asmus; Salem, Lise Cronberg; Andersen, Birgitte Bo; Waldemar, Gunhild

    2016-09-01

    Cognitive complaints occur frequently in elderly people and may be a risk factor for dementia and cognitive decline. Results from studies on subjective cognitive decline are difficult to compare due to variability in assessment methods, and little is known about how different methods influence reports of cognitive decline. The Subjective Memory Complaints Scale (SMC) and the Memory Complaint Questionnaire (MAC-Q) were applied in 121 mixed memory clinic patients with mild cognitive symptoms (mean MMSE = 26.8, SD 2.7). The scales were applied independently, and raters were blinded to results from the other scale. Scales were not used for diagnostic classification. Cognitive performances and depressive symptoms were also rated. We studied the association between the two measures and investigated the scales' relation to depressive symptoms, age, and cognitive status. SMC and MAC-Q were significantly associated (r = 0.44, N = 121, p = 0.015), and both scales had a wide range of scores. In this mixed cohort of patients, younger age was associated with higher SMC scores. There were no significant correlations between cognitive test performances and the scales measuring subjective decline. Depression scores were significantly correlated with both scales measuring subjective decline. Linear regression models showed that age did not make a significant contribution to the variance in subjective memory beyond that of depressive symptoms. Measures of subjective cognitive decline are not interchangeable when used in memory clinics, and the application of different scales in previous studies is an important factor in why studies show variability in the association between subjective cognitive decline and background data and/or clinical results. Careful consideration should be given to which questions are relevant and valid when operationalizing subjective cognitive decline.
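
    A minimal sketch of the type of analysis reported here, the correlation between the two scales and a regression testing whether age adds explained variance beyond depressive symptoms, is given below. The scores are simulated and the simulated relationships are assumptions chosen only so that the code runs; they do not reproduce the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy.stats import pearsonr

rng = np.random.default_rng(2)

# Synthetic stand-in for the clinic data: two subjective-decline scales,
# a depression score, and age, loosely correlated as in the abstract.
n = 121
depression = rng.normal(10, 4, n)
age = rng.normal(70, 8, n)
smc = 0.6 * depression - 0.1 * (age - 70) + rng.normal(0, 3, n)
mac_q = 0.4 * depression + 0.5 * smc + rng.normal(0, 3, n)
df = pd.DataFrame({"SMC": smc, "MACQ": mac_q, "dep": depression, "age": age})

# Association between the two subjective-decline measures.
r, p = pearsonr(df["SMC"], df["MACQ"])
print(f"SMC vs MAC-Q: r={r:.2f}, p={p:.3g}")

# Does age explain variance in SMC beyond depressive symptoms?
base = sm.OLS(df["SMC"], sm.add_constant(df[["dep"]])).fit()
full = sm.OLS(df["SMC"], sm.add_constant(df[["dep", "age"]])).fit()
print(f"R2 with depression only: {base.rsquared:.3f}; adding age: {full.rsquared:.3f}")
```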

  10. Qualitative approaches to use of the RE-AIM framework: rationale and methods.

    Science.gov (United States)

    Holtrop, Jodi Summers; Rabin, Borsika A; Glasgow, Russell E

    2018-03-13

    There have been over 430 publications using the RE-AIM model for planning and evaluation of health programs and policies, as well as numerous applications of the model in grant proposals and national programs. Full use of the model includes the use of qualitative methods to understand why and how results were obtained on different RE-AIM dimensions; however, recent reviews have revealed that qualitative methods have been used infrequently. Having quantitative and qualitative methods and results iteratively inform each other should enhance understanding and lessons learned. Because there have been few published examples of qualitative approaches and methods using RE-AIM for planning or assessment, and no guidance on how qualitative approaches can inform these processes, we provide guidance on qualitative methods to address the RE-AIM model and its various dimensions. The intended audience is researchers interested in applying RE-AIM or similar implementation models, but the methods discussed should also be relevant to those in community or clinical settings. We present directions for, examples of, and guidance on how qualitative methods can be used to address each of the five RE-AIM dimensions. Formative qualitative methods can be helpful in planning interventions and designing for dissemination. Summative qualitative methods are useful when used in an iterative, mixed methods approach for understanding how and why different patterns of results occur. In summary, qualitative and mixed methods approaches to RE-AIM help in understanding complex situations and results, why and how outcomes were obtained, and contextual factors not easily assessed using quantitative measures.

  11. Assessment of extension agents' use of communication methods ...

    African Journals Online (AJOL)

    Findings from the correlation analysis revealed a significant relationship between linkage and communication methods between institutes (r = -0.377) and between linkage and communication methods among extension agents (r = 0.379). However, the relationship between communication within and between institutes was highly ...

  12. Simulation of neutral gas flow in a tokamak divertor using the Direct Simulation Monte Carlo method

    International Nuclear Information System (INIS)

    Gleason-González, Cristian; Varoutis, Stylianos; Hauer, Volker; Day, Christian

    2014-01-01

    Highlights: • Sub-divertor gas flow calculations in tokamaks by coupling the B2-EIRENE code and the DSMC method. • The results include pressure, temperature, bulk velocity and particle fluxes in the sub-divertor. • A gas recirculation effect towards the plasma chamber through the vertical targets is found. • Comparison between DSMC and the ITERVAC code reveals a very good agreement. - Abstract: This paper presents a new, innovative scientific and engineering approach for describing sub-divertor gas flows of fusion devices by coupling the B2-EIRENE (SOLPS) code and the Direct Simulation Monte Carlo (DSMC) method. The present study exemplifies this with a computational investigation of neutral gas flow in ITER's sub-divertor region. The numerical results include the flow fields and contours of the overall quantities of practical interest, such as the pressure, the temperature and the bulk velocity, assuming helium as the model gas. Moreover, the study unravels a gas recirculation effect located behind the vertical targets, viz. neutral particles flowing towards the plasma chamber. Comparison between calculations performed with the DSMC method and the ITERVAC code reveals a very good agreement along the main sub-divertor ducts.
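
    Neither the B2-EIRENE coupling nor a full DSMC solver can be reproduced from this record. As a heavily reduced stand-in for the kind of test-particle calculation used to characterize sub-divertor ducts, the sketch below estimates the free-molecular transmission probability (Clausing factor) of a cylindrical duct with diffusely reflecting walls; the geometry, particle count and comparison value are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

def cosine_direction(normal, tangent1, tangent2):
    """Sample a direction from the cosine (Lambert) law about `normal`."""
    u, v = rng.random(2)
    ct, st = np.sqrt(u), np.sqrt(1.0 - u)      # cosine-weighted polar angle
    phi = 2.0 * np.pi * v
    return ct * normal + st * (np.cos(phi) * tangent1 + np.sin(phi) * tangent2)

def transmission_probability(L, R, n_particles=50_000):
    """Test-particle estimate of free-molecular transmission through a
    cylindrical duct of length L and radius R with diffuse wall reflection."""
    transmitted = 0
    z_axis = np.array([0.0, 0.0, 1.0])
    for _ in range(n_particles):
        # Enter uniformly over the inlet disc with a cosine-distributed direction.
        r, a = R * np.sqrt(rng.random()), 2.0 * np.pi * rng.random()
        pos = np.array([r * np.cos(a), r * np.sin(a), 0.0])
        d = cosine_direction(z_axis, np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0]))
        while True:
            # Distance to the cylindrical wall (positive root of the quadratic).
            a2 = d[0] ** 2 + d[1] ** 2
            b = 2.0 * (pos[0] * d[0] + pos[1] * d[1])
            c = pos[0] ** 2 + pos[1] ** 2 - R ** 2
            disc = max(b * b - 4.0 * a2 * c, 0.0)   # clamp tiny negative round-off
            t_wall = (-b + np.sqrt(disc)) / (2.0 * a2) if a2 > 0 else np.inf
            # Distance to whichever end plane the particle is heading towards.
            t_end = (L - pos[2]) / d[2] if d[2] > 0 else (-pos[2] / d[2] if d[2] < 0 else np.inf)
            if t_end < t_wall:
                if d[2] > 0:
                    transmitted += 1        # exits the far end; otherwise it returned
                break
            # Hit the wall: re-emit diffusely about the inward surface normal.
            pos = pos + t_wall * d
            normal = np.array([-pos[0], -pos[1], 0.0]) / R
            tangent = np.cross(normal, z_axis)
            d = cosine_direction(normal, z_axis, tangent)
    return transmitted / n_particles

# For a duct length equal to its diameter (L/R = 2) the estimate should land
# near the tabulated Clausing factor of roughly 0.51.
print(transmission_probability(L=2.0, R=1.0))
```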

  13. Application of Reproducing Kernel Method for Solving Nonlinear Fredholm-Volterra Integrodifferential Equations

    Directory of Open Access Journals (Sweden)

    Omar Abu Arqub

    2012-01-01

    Full Text Available This paper investigates the numerical solution of nonlinear Fredholm-Volterra integro-differential equations using the reproducing kernel Hilbert space method. The solution is represented in the form of a series in the reproducing kernel space. In the meantime, the n-term approximate solution is obtained and proved to converge to the exact solution. Furthermore, the proposed method has the advantage that the approximate solution and its derivative can be evaluated at any point in the interval of integration. Numerical examples are included to demonstrate the accuracy and applicability of the presented technique. The results reveal that the method is very effective and simple.
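
    The record gives no detail of the reproducing kernel construction itself, so the sketch below instead uses a deliberately simpler, swapped-in technique, the classical Nyström (quadrature-collocation) method, on a linear Fredholm integral equation of the second kind with a known exact solution. It only illustrates the general task of solving such equations numerically and checking against the exact solution; it is not the paper's RKHS method.

```python
import numpy as np

# Linear Fredholm equation of the second kind: u(x) = x + integral_0^1 x*t*u(t) dt.
# Substituting u(x) = c*x gives c = 1 + c/3, so the exact solution is u(x) = 1.5*x.
n = 20
nodes, weights = np.polynomial.legendre.leggauss(n)
t = 0.5 * (nodes + 1.0)          # map Gauss-Legendre nodes from [-1, 1] to [0, 1]
w = 0.5 * weights

K = np.outer(t, t)               # kernel k(x, t) = x*t evaluated at the nodes
f = t                            # free term f(x) = x
A = np.eye(n) - K * w            # Nystrom linear system (I - K W) u = f
u = np.linalg.solve(A, f)

print("max error vs exact 1.5*x:", np.max(np.abs(u - 1.5 * t)))
```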

  14. Studies of LMFBR: method of analysis and some results

    International Nuclear Information System (INIS)

    Ishiguro, Y.; Dias, A.F.; Nascimento, J.A. do.

    1983-01-01

    Some results of recent studies of LMFBR characteristics are summarized. A two-dimensional model of the LMFBR is taken from a publication and used as the base model for the analysis. Axial structures are added to the base model and a three-dimensional (Δ-Z) calculation has been done. Two-dimensional (Δ and RZ) calculations are compared with the three-dimensional and published results. The eigenvalue, flux and power distributions, breeding characteristics, control rod worth, sodium-void and Doppler reactivities are analysed. Calculations are done with CITATION using six-group cross sections collapsed regionwise by EXPANDA in one-dimensional geometries from the 70-group JFS library. Burnup calculations of a simplified thorium-cycle LMFBR have also been done in RZ geometry. The principal results of the studies are: (1) the JFS library appears adequate for predicting the overall characteristics of an LMFBR, (2) the sodium-void reactivity is negative within 25 cm of the outer boundary of the core, (3) the half-life of Pa-233 must be considered explicitly in burnup analyses, and (4) two-dimensional (RZ and Δ) calculations can be used iteratively to analyze three-dimensional reactor systems. (Author) [pt]

  15. Analysis of Transcriptional Signatures in Response to Listeria monocytogenes Infection Reveals Temporal Changes That Result from Type I Interferon Signaling

    Science.gov (United States)

    Potempa, Krzysztof; Graham, Christine M.; Moreira-Teixeira, Lucia; McNab, Finlay W.; Howes, Ashleigh; Stavropoulos, Evangelos; Pascual, Virginia; Banchereau, Jacques; Chaussabel, Damien; O’Garra, Anne

    2016-01-01

    Analysis of the mouse transcriptional response to Listeria monocytogenes infection reveals that a large set of genes are perturbed in both blood and tissue and that these transcriptional responses are enriched for pathways of the immune response. Further, we identified enrichment for both type I and type II interferon (IFN) signaling molecules in the blood and tissues upon infection. Since type I IFN signaling has been widely reported to impair bacterial clearance, we examined gene expression from blood and tissues of wild-type (WT) and type I IFNαβ receptor-deficient (Ifnar1-/-) mice at the basal level and upon infection with L. monocytogenes. Measurement of the fold-change response upon infection in the absence of type I IFN signaling demonstrated an upregulation of specific genes at day 1 post infection. A less marked reduction of the global gene expression signature in blood or tissues from infected Ifnar1-/- mice as compared to WT mice was observed at days 2 and 3 after infection, with marked reductions in key genes such as Oasg1 and Stat2. Moreover, in-depth analysis identified changes in the expression of key IFN regulatory genes, including Irf9, Irf7, Stat1 and others, in uninfected mice; although these genes were induced to an equivalent degree upon infection, this resulted in significantly lower final gene expression levels upon infection of Ifnar1-/- mice. These data highlight how dysregulation of this network in the steady state and temporally upon infection may determine the outcome of this bacterial infection, and how basal levels of type I IFN-inducible genes may perturb an optimal host immune response to control intracellular bacterial infections such as L. monocytogenes. PMID:26918359
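
    A small sketch of the fold-change bookkeeping described here, induction upon infection within each genotype versus the final infected expression level relative to WT, is shown below. The gene names are reused from the abstract for readability only; the expression values are synthetic placeholders, not the study's measurements.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(4)

# Hypothetical normalized expression values for a few genes in uninfected and
# day-1-infected samples, for WT and Ifnar1-/- (KO) mice.
genes = ["Oasg1", "Stat2", "Irf7", "Irf9", "Stat1"]
data = pd.DataFrame({
    "WT_uninfected":  rng.uniform(5, 20, len(genes)),
    "WT_infected":    rng.uniform(50, 200, len(genes)),
    "KO_uninfected":  rng.uniform(1, 5, len(genes)),
    "KO_infected":    rng.uniform(10, 40, len(genes)),
}, index=genes)

# Fold change upon infection within each genotype (what the study compares),
# plus the final infected expression level relative to WT.
data["log2FC_WT"] = np.log2(data["WT_infected"] / data["WT_uninfected"])
data["log2FC_KO"] = np.log2(data["KO_infected"] / data["KO_uninfected"])
data["KO_vs_WT_infected"] = np.log2(data["KO_infected"] / data["WT_infected"])
print(data.round(2))
```

    With numbers like these, a gene can show a similar fold-change induction in both genotypes while still ending at a much lower absolute level in the knockout, which is the pattern the abstract emphasizes.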

  16. Strengthening of limestone by the impregnation - gamma irradiation method. Results of tests

    International Nuclear Information System (INIS)

    Ramiere, R.; Tassigny, C. de

    1975-04-01

    The method developed by the Centre d'Etudes Nucleaires de Grenoble (France) strengthens the stones by impregnation with a styrene resin/liquid polystyrene mixture followed by polymerization under gamma irradiation. The method is applicable to stones which can be taken into the laboratory for treatment. The increase in strength of 6 different types of French limestone has been quantitatively recorded. The following parameters were studied: the possibility of water migration inside the stones, the improvement of the mechanical properties of the impregnated stone, resistance to freeze-thaw conditions, and artificial ageing of the stones, which causes only minor changes in the appearance of the stone and a negligible decrease in weight. [fr]

  17. Method of eliminating undesirable gaseous products resulting in underground uranium ore leaching

    International Nuclear Information System (INIS)

    Krizek, J.; Dedic, K.; Johann, J.; Haas, F.; Sokola, K.

    1980-01-01

    The method described is characterized by the fact that the gases formed or dissolved are oxidized using a combined oxidation-reduction system consisting of airborne oxygen, oxygen carriers and a strong irreversible oxidant. The oxygen-carrier system consists of a mixture of Fe2+ and Fe3+ cations, or of Cu+ and Cu2+ cations, introduced in solution in the form of iron salts at a concentration of 0.0001 to 0.003 M, or of copper salts at a maximum of 0.0003 M. The irreversible oxidant has a standard redox potential of at least +1.0 V. In addition to eliminating undesirable products, the method allows the yield of the leaching process to be increased. (J.B.)

  18. Molecular Characterization and Genetic Diversity of the Macaw Palm Ex Situ Germplasm Collection Revealed by Microsatellite Markers

    Directory of Open Access Journals (Sweden)

    Fekadu G. Mengistu

    2016-10-01

    Full Text Available Macaw palm (Acrocomia aculeata) is native to tropical forests in South America and highly abundant in Brazil. It is cited as a highly productive oleaginous palm tree with high potential for biodiesel production. The aim of this work was to characterize and study the genetic diversity of A. aculeata ex situ collections from different geographical states in Brazil using microsatellite (Simple Sequence Repeat, SSR) markers. A total of 192 accessions from 10 provenances were analyzed with 10 SSRs, and variations were detected in allelic diversity, polymorphism, and heterozygosity in the collections. Three major groups of accessions were formed using PCoA (principal coordinate analysis), UPGMA (unweighted pair-group method with arithmetic mean), and Tocher clustering. The Mantel test revealed a weak correlation (r = 0.07) between genetic and geographic distances among the provenances, reaffirming the result of the grouping. Reduced average heterozygosity (Ho < 50%) per locus (or provenance) confirmed the predominance of endogamy (inbreeding) in the germplasm collections, as evidenced by a positive inbreeding coefficient (F > 0) per locus (or per provenance). AMOVA (Analysis of Molecular Variance) revealed higher genetic variation within populations (48.2%) than among populations (36.5%). SSRs are useful molecular markers for characterizing A. aculeata germplasm and could facilitate the process of identifying, grouping, and selecting genotypes. The present results could be used to formulate appropriate conservation strategies in the genebank.
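
    The diversity statistics named in this record (observed heterozygosity Ho, expected heterozygosity, and the inbreeding coefficient F) can be illustrated for a single SSR locus as follows. The genotypes are toy values, not the macaw palm data, and a real analysis would repeat this per locus and per provenance.

```python
from collections import Counter

# Toy diploid genotypes at a single SSR locus (allele sizes in bp);
# a real analysis would loop this over all 10 loci and 192 accessions.
genotypes = [(150, 150), (150, 154), (154, 158), (150, 150),
             (158, 158), (150, 154), (150, 150), (154, 154)]

# Observed heterozygosity: fraction of individuals carrying two different alleles.
ho = sum(a != b for a, b in genotypes) / len(genotypes)

# Expected heterozygosity (Nei's gene diversity): He = 1 - sum(p_i^2).
alleles = Counter(a for pair in genotypes for a in pair)
total = sum(alleles.values())
he = 1.0 - sum((n / total) ** 2 for n in alleles.values())

# Wright's inbreeding coefficient: F > 0 indicates a heterozygote deficit.
f = 1.0 - ho / he
print(f"Ho={ho:.3f}  He={he:.3f}  F={f:.3f}")
```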

  19. Reliable and rapid characterization of functional FCN2 gene variants reveals diverse geographical patterns

    Directory of Open Access Journals (Sweden)

    Ojurongbe Olusola

    2012-05-01

    Full Text Available Abstract Background Ficolin-2, coded by the FCN2 gene, is a soluble serum protein and an innate immune recognition element of the complement system. FCN2 gene polymorphisms reveal distinct geographical patterns and are documented to alter serum ficolin levels and modulate disease susceptibility. Methods We employed a real-time PCR method based on Fluorescence Resonance Energy Transfer (FRET) to genotype four functional SNPs, including -986 G > A (#rs3124952), -602 G > A (#rs3124953), -4A > G (#rs17514136) and +6424 G > T (#rs7851696), in the ficolin-2 (FCN2) gene. We characterized the FCN2 variants in individuals representing Brazilian (n = 176), Nigerian (n = 180), Vietnamese (n = 172) and European Caucasian (n = 165) ethnicity. Results We observed that the genotype distributions of three functional SNP variants (-986 G > A, -602 G > A and -4A > G) differ significantly between the populations investigated. Conclusions The observed distribution of the FCN2 functional SNP variants may likely contribute to altered serum ficolin levels, and this may depend on the different disease settings in world populations. To conclude, the use of FRET-based real-time PCR, especially for the FCN2 gene, will benefit a larger scientific community that depends extensively on a rapid, reliable method for FCN2 genotyping.
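
    A minimal sketch of how genotype distributions at one FCN2 promoter SNP could be compared across the four populations is shown below, using a chi-square test of independence. The genotype counts are hypothetical (only the per-population sample sizes match the abstract); the paper's own counts and significance values are not reproduced here.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical genotype counts (AA, AG, GG) for one FCN2 promoter SNP in the
# four study populations; the real counts are in the paper, not reproduced here.
counts = np.array([
    [90, 70, 16],    # Brazil   (n = 176)
    [60, 90, 30],    # Nigeria  (n = 180)
    [100, 60, 12],   # Vietnam  (n = 172)
    [80, 70, 15],    # Europe   (n = 165)
])

chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi2={chi2:.1f}, dof={dof}, p={p:.3g}")
```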

  20. Teaching the Scientific Method in the Social Sciences

    Science.gov (United States)

    Keyes, Grace

    2010-01-01

    Many undergraduates can tell you what the scientific method means, but just a little probing reveals a rather shallow understanding as well as a number of misconceptions about the method. The purpose of this paper is to indicate why such misconceptions occur and to point out some implications and suggestions for teaching the scientific method in…