WorldWideScience

Sample records for methods study results

  1. German precursor study: methods and results

    International Nuclear Information System (INIS)

    Hoertner, H.; Frey, W.; von Linden, J.; Reichart, G.

    1985-01-01

This study was prepared by GRS under contract to the Federal Minister of the Interior. Its purpose is to show how applying system-analytic tools, and especially probabilistic methods, to Licensee Event Reports (LERs) and other operating experience can support a deeper understanding of the safety-related importance of events reported from reactor operation, the identification of possible weak points, and further conclusions to be drawn from the events. Additionally, the study aimed at comparing its results for severe core damage frequency with those of the German Risk Study, as far as this is possible and useful. The German Precursor Study is a plant-specific study. The reference plant is the Biblis NPP with its very similar Units A and B; the latter was also the reference plant for the German Risk Study.

  2. Visual Display of Scientific Studies, Methods, and Results

    Science.gov (United States)

    Saltus, R. W.; Fedi, M.

    2015-12-01

The need for efficient and effective communication of scientific ideas becomes more urgent each year. A growing number of societal and economic issues are tied to matters of science - e.g., climate change, natural resource availability, and public health. Societal and political debate should be grounded in a general understanding of scientific work in relevant fields. It is difficult for many participants in these debates to access science directly because the formal method for scientific documentation and dissemination is the journal paper, generally written for a highly technical and specialized audience. Journal papers are very effective and important for documentation of scientific results and are essential to the requirements of science to produce citable and repeatable results. However, journal papers are not effective at providing a quick and intuitive summary useful for public debate. Just as quantitative data are generally best viewed in graphic form, we propose that scientific studies also can benefit from visual summary and display. We explore the use of existing methods for diagramming logical connections and dependencies, such as Venn diagrams, mind maps, flow charts, etc., for rapidly and intuitively communicating the methods and results of scientific studies. We also discuss a method, specifically tailored to summarizing scientific papers, that we introduced last year at AGU. Our method diagrams the relative importance and connections between data, methods/models, results/ideas, and implications/importance using a single-page format with connected elements in these four categories. Within each category (e.g., data) the spatial location of individual elements (e.g., seismic, topographic, gravity) indicates relative novelty (e.g., are these new data?) and importance (e.g., how critical are these data to the results of the paper?). The goal is to find ways to rapidly and intuitively share both the results and the process of science, both for communication

  3. Comparison of multiple-criteria decision-making methods - results of simulation study

    Directory of Open Access Journals (Sweden)

    Michał Adamczak

    2016-12-01

Background: Today, both researchers and practitioners have many methods for supporting the decision-making process. Due to the conditions in which supply chains function, the most interesting are multi-criteria methods. The use of sophisticated methods for supporting decisions requires parameterization and the execution of calculations that are often complex. So is it efficient to use sophisticated methods? Methods: The authors of the publication compared two popular multi-criteria decision-making methods: the Weighted Sum Model (WSM) and the Analytic Hierarchy Process (AHP). A simulation study reflects these two decision-making methods. Input data for this study was a set of criteria weights and the value of each alternative in terms of each criterion. Results: The iGrafx Process for Six Sigma simulation software recreated how both multiple-criteria decision-making methods (WSM and AHP) function. The result of the simulation was a numerical value defining the preference of each of the alternatives according to the WSM and AHP methods. The alternative producing a result of higher numerical value was considered preferred, according to the selected method. In the analysis of the results, the relationship between the values of the parameters and the difference in the results presented by both methods was investigated. Statistical methods, including hypothesis testing, were used for this purpose. Conclusions: The simulation study findings prove that the results obtained with the use of the two multiple-criteria decision-making methods are very similar. Differences occurred more frequently in lower-value parameters from the "value of each alternative" group and higher-value parameters from the "weight of criteria" group.
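A minimal sketch (not the authors' iGrafx simulation) of how the two compared methods score alternatives; the decision matrix, criteria weights and pairwise comparisons below are hypothetical:

```python
# Illustrative sketch: scoring two alternatives with the Weighted Sum Model
# (WSM) and an AHP-style eigenvector weighting. All numbers are hypothetical.
import numpy as np

# Decision matrix: rows = alternatives, columns = criterion values.
values = np.array([[0.7, 0.5, 0.9],
                   [0.6, 0.8, 0.4]])
weights = np.array([0.5, 0.3, 0.2])          # criteria weights, sum to 1

# WSM: preference = weighted sum of criterion values.
wsm_scores = values @ weights

# AHP: derive weights from a pairwise comparison matrix via the principal
# eigenvector, then aggregate exactly as in WSM.
pairwise = np.array([[1.0, 2.0, 3.0],
                     [0.5, 1.0, 1.5],
                     [1/3, 2/3, 1.0]])
eigvals, eigvecs = np.linalg.eig(pairwise)
w_ahp = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
w_ahp /= w_ahp.sum()
ahp_scores = values @ w_ahp

print("WSM:", wsm_scores, "preferred alternative:", int(np.argmax(wsm_scores)))
print("AHP:", ahp_scores, "preferred alternative:", int(np.argmax(ahp_scores)))
```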

  4. Results from the Application of Uncertainty Methods in the CSNI Uncertainty Methods Study (UMS)

    International Nuclear Information System (INIS)

    Glaeser, H.

    2008-01-01

Within licensing procedures there is the incentive to replace the conservative requirements for code application by a 'best estimate' concept supplemented by an uncertainty analysis to account for predictive uncertainties of code results. Methods have been developed to quantify these uncertainties. The Uncertainty Methods Study (UMS) Group, following a mandate from CSNI, has compared five methods for calculating the uncertainty in the predictions of advanced 'best estimate' thermal-hydraulic codes. Most of the methods identify and combine input uncertainties. The major differences between the predictions of the methods came from the choice of uncertain parameters and the quantification of the input uncertainties, i.e. the width of the uncertainty ranges. Therefore, suitable experimental and analytical information has to be selected to specify these uncertainty ranges or distributions. After the closure of the Uncertainty Methods Study (UMS) and after the report was issued, comparison calculations of experiment LSTF-SB-CL-18 were performed by the University of Pisa using different versions of the RELAP 5 code. It turned out that the version used by two of the participants calculated a 170 K higher peak clad temperature compared with other versions using the same input deck. This may contribute to the differences in the upper limit of the uncertainty ranges.
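For readers unfamiliar with how such input uncertainties are typically combined, the following toy sketch illustrates the general sampling idea in the spirit of the probabilistic (GRS-type) approaches compared in the UMS; it is not any specific participant's method, and the surrogate response function and input ranges are invented for illustration:

```python
# Toy input-uncertainty propagation: sample uncertain inputs, run the "code"
# (here a trivial surrogate for peak clad temperature, PCT), and read off
# sample-based tolerance limits.
import numpy as np

rng = np.random.default_rng(0)
N = 93  # Wilks' formula: 93 runs give two-sided 95%/95% tolerance limits

def surrogate_pct(gap_conductance, rod_power, hte_factor):
    """Hypothetical response: peak clad temperature in K (not a real code)."""
    return 1000 + 80 * (1 - gap_conductance) + 120 * rod_power - 60 * hte_factor

# Assumed uncertainty ranges/distributions for the inputs.
gap = rng.uniform(0.8, 1.2, N)          # relative gap conductance
power = rng.normal(1.0, 0.02, N)        # relative rod power
hte = rng.triangular(0.9, 1.0, 1.1, N)  # heat-transfer enhancement factor

pct = surrogate_pct(gap, power, hte)
print(f"95%/95% tolerance interval for PCT: [{pct.min():.0f} K, {pct.max():.0f} K]")
```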

  5. Comparative study of methods on outlying data detection in experimental results

    International Nuclear Information System (INIS)

    Oliveira, P.M.S.; Munita, C.S.; Hazenfratz, R.

    2009-01-01

The interpretation of experimental results through multivariate statistical methods might reveal the existence of outliers, which is rarely taken into account by analysts. However, their presence can influence the interpretation of results, generating false conclusions. This paper shows the importance of outlier determination for a database of 89 samples of ceramic fragments analyzed by neutron activation analysis. The results were submitted to five procedures to detect outliers: Mahalanobis distance, cluster analysis, principal component analysis, factor analysis, and standardized residuals. The results showed that although cluster analysis is one of the procedures most used to identify outliers, it can fail by not revealing samples that are easily identified as outliers by other methods. In general, the statistical procedures for the identification of outliers are little known by analysts. (author)

  6. Prospective and Retrospective Studies of Substance Abuse Treatment Outcomes: Methods and Results of Four Large-Scale Follow-Up Studies.

    Science.gov (United States)

    Gerstein, Dean R.; Johnson, Robert A.

    This report compares the research methods, provider and patient characteristics, and outcome results from four large-scale followup studies of drug treatment during the 1990s: (1) the California Drug and Alcohol Treatment Assessment (CALDATA); (2) Services Research Outcomes Study (SROS); (3) National Treatment Improvement Evaluation Study (NTIES);…

  7. Learning Method and Its Influence on Nutrition Study Results Throwing the Ball

    Science.gov (United States)

    Samsudin; Nugraha, Bayu

    2015-01-01

This study aimed to determine the difference between playing learning methods and exploratory learning methods with respect to learning outcomes in throwing the ball. In addition, this study also aimed to determine the effect of nutritional status on these two learning methods mentioned above. This research was conducted at SDN Cipinang Besar Selatan 16 Pagi East…

  8. The healthy building intervention study: Objectives, methods and results of selected environmental measurements

    Energy Technology Data Exchange (ETDEWEB)

Fisk, W.J.; Faulkner, D.; Sullivan, D. [and others]

    1998-02-17

To test proposed methods for reducing SBS symptoms and to learn about the causes of these symptoms, a double-blind controlled intervention study was designed and implemented. This study utilized two different interventions designed to reduce occupants' exposures to airborne particles: (1) high efficiency filters in the building's HVAC systems; and (2) thorough cleaning of carpeted floors and fabric-covered chairs with an unusually powerful vacuum cleaner. The study population was the workers on the second and fourth floors of a large office building with mechanical ventilation, air conditioning, and sealed windows. Interventions were implemented on one floor while the occupants on the other floor served as a control group. For the enhanced-filtration intervention, a multiple crossover design was used (a crossover is a repeat of the experiment with the former experimental group as the control group and vice versa). Demographic and health symptom data were collected via an initial questionnaire on the first study week and health symptom data were obtained each week, for eight additional weeks, via weekly questionnaires. A large number of indoor environmental parameters were measured during the study including air temperatures and humidities, carbon dioxide concentrations, particle concentrations, concentrations of several airborne bioaerosols, and concentrations of several microbiologic compounds within the dust sampled from floors and chairs. This report describes the study methods and summarizes the results of selected environmental measurements.

  9. Studies of LMFBR: method of analysis and some results

    International Nuclear Information System (INIS)

    Ishiguro, Y.; Dias, A.F.; Nascimento, J.A. do.

    1983-01-01

Some results of recent studies of LMFBR characteristics are summarized. A two-dimensional model of the LMFBR is taken from a publication and used as the base model for the analysis. Axial structures are added to the base model and a three-dimensional (Δ-Z) calculation has been done. Two-dimensional (Δ and RZ) calculations are compared with the three-dimensional and published results. The eigenvalue, flux and power distributions, breeding characteristics, control rod worth, sodium-void and Doppler reactivities are analysed. Calculations are done by CITATION using six-group cross sections collapsed regionwise by EXPANDA in one-dimensional geometries from the 70-group JFS library. Burnup calculations of a simplified thorium-cycle LMFBR have also been done in RZ geometry. Principal results of the studies are: (1) the JFS library appears adequate for predicting overall characteristics of an LMFBR, (2) the sodium void reactivity is negative within about 25 cm of the outer boundary of the core, (3) the half-life of Pa-233 must be considered explicitly in burnup analyses, and (4) two-dimensional (RZ and Δ) calculations can be used iteratively to analyze three-dimensional reactor systems. (Author) [pt

  10. A Literature Study of Matrix Element Influenced to the Result of Analysis Using Absorption Atomic Spectroscopy Method (AAS)

    International Nuclear Information System (INIS)

    Tyas-Djuhariningrum

    2004-01-01

Gold sample analyses can deviate by more than 10% from the true value because of matrix elements, so the behaviour of the matrix elements needs to be studied in order to reduce the deviation. In rock samples, matrix elements can cause self-quenching, self-absorption and ionization processes, leading to errors in the analysis results. In rock geochemical processes, elements of the same group of the periodic system tend to occur together because of their similar characteristics. In atomic absorption spectroscopy analysis, associated elements can absorb primary energy of a similar wavelength, which can cause deviations in the interpretation of results. The aim of the study is to predict the influence of matrix elements in rock samples and to apply standard methods for reducing the deviation. Quantitatively, the primary light intensity absorbed is proportional to the concentration of atoms in the sample; the relationship between photon intensity and concentration in parts per million (ppm) is linear. The methods for eliminating the influence of matrix elements consist of three approaches: the external standard method, the internal standard method, and the standard addition method. The external standard method applies to all matrix elements, the internal standard method eliminates matrix elements with similar characteristics, and the standard addition method eliminates matrix element effects in Au and Pt samples. The accuracy of these three standard methods is about 95-97%. (author)
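A brief illustration of the standard addition method mentioned above, with hypothetical absorbance readings; the sample concentration is recovered from the x-intercept of the linear fit:

```python
# Standard-addition sketch: fit absorbance vs. added standard concentration,
# then extrapolate the line back to zero absorbance. Values are hypothetical.
import numpy as np

added_ppm = np.array([0.0, 1.0, 2.0, 4.0])        # Au standard added (ppm)
absorbance = np.array([0.11, 0.19, 0.27, 0.43])   # measured AAS absorbance

slope, intercept = np.polyfit(added_ppm, absorbance, 1)
# Sample concentration = |x-intercept| = intercept / slope.
c_sample = intercept / slope
print(f"Estimated Au concentration in sample: {c_sample:.2f} ppm")
```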

  11. Steady-state transport equation resolution by particle methods, and numerical results

    International Nuclear Information System (INIS)

    Mercier, B.

    1985-10-01

A method to solve the steady-state transport equation is given, along with the principles of the method. The method is studied in two different cases; estimates given by the theory are compared with numerical results. Results obtained in 1-D (spherical geometry) and in 2-D (axisymmetric geometry) are given [fr

  12. Differences in quantitative methods for measuring subjective cognitive decline - results from a prospective memory clinic study.

    Science.gov (United States)

    Vogel, Asmus; Salem, Lise Cronberg; Andersen, Birgitte Bo; Waldemar, Gunhild

    2016-09-01

    Cognitive complaints occur frequently in elderly people and may be a risk factor for dementia and cognitive decline. Results from studies on subjective cognitive decline are difficult to compare due to variability in assessment methods, and little is known about how different methods influence reports of cognitive decline. The Subjective Memory Complaints Scale (SMC) and The Memory Complaint Questionnaire (MAC-Q) were applied in 121 mixed memory clinic patients with mild cognitive symptoms (mean MMSE = 26.8, SD 2.7). The scales were applied independently and raters were blinded to results from the other scale. Scales were not used for diagnostic classification. Cognitive performances and depressive symptoms were also rated. We studied the association between the two measures and investigated the scales' relation to depressive symptoms, age, and cognitive status. SMC and MAC-Q were significantly associated (r = 0.44, N = 121, p = 0.015) and both scales had a wide range of scores. In this mixed cohort of patients, younger age was associated with higher SMC scores. There were no significant correlations between cognitive test performances and scales measuring subjective decline. Depression scores were significantly correlated to both scales measuring subjective decline. Linear regression models showed that age did not have a significant contribution to the variance in subjective memory beyond that of depressive symptoms. Measures for subjective cognitive decline are not interchangeable when used in memory clinics and the application of different scales in previous studies is an important factor as to why studies show variability in the association between subjective cognitive decline and background data and/or clinical results. Careful consideration should be taken as to which questions are relevant and have validity when operationalizing subjective cognitive decline.

  13. Basic studies on gastrin-radioimmunoassay and the results of its clinical application. Comparative studies between the double antibody method using Wilson's anti-gastrin serum and a gastrin kit (CIS) method

    Energy Technology Data Exchange (ETDEWEB)

    Yabana, T; Uchiya, T; Kakumoto, Y; Waga, Y; Konta, M [Sapporo Medical Coll. (Japan)

    1975-03-01

    Fundamental and practical problems in carrying out the radioimmunoassay of gastrin were studied by comparing the double antibody method, using guinea pig anti-porcine gastrin serum (Wilson Lab.) with the gastrin kit method (G-K, CIS). The former method was found to have a measurable gastrin concentration range between 60 and 1,000 pg/ml, whereas the range of the latter method was between 25 and 800 pg/ml. The reproducibility of each method was satisfactory. The G-K method was affected more readily by co-existing proteins, whereas the interferences by other biologically active factors, e.g., CCK/PZ, caerulein, etc., were negligible. While there was a highly significant correlation between the values, values obtained by the G-K method were generally slightly lower than the values obtained by the double antibody method. Results of fractionation analysis employing gel filtration of blood and tissue immunoreactive gastrin caused the authors to observe that the value of big gastrin as determined with the G-K method was lower than that obtained by the double antibody method, and that the difference was especially remarkable for gastrin in blood.

  14. No Results? No Problem! Why We Are Publishing Methods of a Landmark Study With Results Still Pending.

    Science.gov (United States)

    Lacy, Brian E; Spiegel, Brennan

    2017-11-01

Colorectal cancer (CRC) is the third most commonly diagnosed cancer in both men and women in the United States, and screening for CRC is a national health-care priority. In this issue, investigators from the CONFIRM study group report on the aims and study design of a large, multicenter, randomized prospective study of whether screening colonoscopy is superior to an annual fecal immunochemical test (FIT). CONFIRM hopes to enroll 50,000 individuals, aged 50-75 years, from 46 Veterans Affairs Medical Centers and monitor them for 10 years. This article is unique in that no results are presented as the study is not yet complete. We have taken this unusual step as we believe the topic of CRC screening is critically important for our readers and that the results of this massive study have the potential to change clinical practice throughout all fields of medicine.

  15. Comparative Study of Daylighting Calculation Methods

    Directory of Open Access Journals (Sweden)

    Mandala Ariani

    2018-01-01

The aim of this study is to assess five daylighting calculation methods commonly used in architectural studies. The methods include hand calculation methods (the SNI/DPMB method and BRE Daylighting Protractors), scale models studied in an artificial sky simulator, and computer programs using the Dialux and Velux lighting software. The test room is conditioned by uniform sky conditions and simple room geometry with variations of room reflectance (black, grey, and white). The analyses compare the results (including daylight factor, illumination, and coefficient of uniformity values) and examine their similarities and differences. The colour variation trials are used to analyse the contribution of the internal reflection factor to the result.
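As a reference for the quantities compared in the study, this small sketch computes the daylight factor and a simple uniformity ratio from hypothetical illuminance values; one common definition of uniformity is assumed, not necessarily the one used by the authors:

```python
# Daylight factor: DF = (indoor illuminance / simultaneous outdoor
# illuminance under an unobstructed overcast sky) * 100.
def daylight_factor(e_indoor_lux: float, e_outdoor_lux: float) -> float:
    return 100.0 * e_indoor_lux / e_outdoor_lux

# Hypothetical measurement points under a uniform (overcast) sky.
points = {"rear wall": 210.0, "room centre": 540.0, "near window": 1750.0}
e_out = 10000.0  # unobstructed horizontal illuminance, lux
for name, e_in in points.items():
    print(f"{name}: DF = {daylight_factor(e_in, e_out):.1f}%")

# Coefficient of uniformity (one common definition): minimum / average.
values = list(points.values())
print("uniformity =", round(min(values) / (sum(values) / len(values)), 2))
```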

  16. Differences in quantitative methods for measuring subjective cognitive decline - results from a prospective memory clinic study

    DEFF Research Database (Denmark)

    Vogel, Asmus; Salem, Lise Cronberg; Andersen, Birgitte Bo

    2016-01-01

    influence reports of cognitive decline. METHODS: The Subjective Memory Complaints Scale (SMC) and The Memory Complaint Questionnaire (MAC-Q) were applied in 121 mixed memory clinic patients with mild cognitive symptoms (mean MMSE = 26.8, SD 2.7). The scales were applied independently and raters were blinded...... decline. Depression scores were significantly correlated to both scales measuring subjective decline. Linear regression models showed that age did not have a significant contribution to the variance in subjective memory beyond that of depressive symptoms. CONCLUSIONS: Measures for subjective cognitive...... decline are not interchangeable when used in memory clinics and the application of different scales in previous studies is an important factor as to why studies show variability in the association between subjective cognitive decline and background data and/or clinical results. Careful consideration...

  17. Convergence results for a class of abstract continuous descent methods

    Directory of Open Access Journals (Sweden)

    Sergiu Aizicovici

    2004-03-01

We study continuous descent methods for the minimization of Lipschitzian functions defined on a general Banach space. We establish convergence theorems for those methods which are generated by approximate solutions to evolution equations governed by regular vector fields. Since the complement of the set of regular vector fields is σ-porous, we conclude that our results apply to most vector fields in the sense of Baire category.

  18. Use of statistical study methods for the analysis of the results of the imitation modeling of radiation transfer

    Science.gov (United States)

    Alekseenko, M. A.; Gendrina, I. Yu.

    2017-11-01

Recently, due to the abundance of various types of observational data in systems for vision through the atmosphere and the need to process them, various methods of statistical research, such as correlation-regression analysis, time-series analysis, and analysis of variance, have become relevant for studying such systems. We have attempted to apply elements of correlation-regression analysis to the study and subsequent prediction of the patterns of radiation transfer in these systems, as well as to the construction of radiation models of the atmosphere. In this paper, we present some results of statistical processing of the results of numerical simulation of the characteristics of vision systems through the atmosphere, obtained with the help of a special software package.

  19. Melanocortin-1 receptor, skin cancer and phenotypic characteristics (M-SKIP) project: study design and methods for pooling results of genetic epidemiological studies

    Directory of Open Access Journals (Sweden)

    Raimondi Sara

    2012-08-01

Background: For complex diseases like cancer, pooled-analysis of individual data represents a powerful tool to investigate the joint contribution of genetic, phenotypic and environmental factors to the development of a disease. Pooled-analysis of epidemiological studies has many advantages over meta-analysis, and preliminary results may be obtained faster and with lower costs than with prospective consortia. Design and methods: Based on our experience with the study design of the Melanocortin-1 receptor (MC1R) gene, SKin cancer and Phenotypic characteristics (M-SKIP) project, we describe the most important steps in planning and conducting a pooled-analysis of genetic epidemiological studies. We then present the statistical analysis plan that we are going to apply, giving particular attention to methods of analysis recently proposed to account for between-study heterogeneity and to explore the joint contribution of genetic, phenotypic and environmental factors in the development of a disease. Within the M-SKIP project, data on 10,959 skin cancer cases and 14,785 controls from 31 international investigators were checked for quality and recoded for standardization. We first proposed to fit the aggregated data with random-effects logistic regression models. However, for the M-SKIP project, a two-stage analysis will be preferred to overcome the problem regarding the availability of different study covariates. The joint contribution of MC1R variants and phenotypic characteristics to skin cancer development will be studied via logic regression modeling. Discussion: Methodological guidelines to correctly design and conduct pooled-analyses are needed to facilitate application of such methods, thus providing a better summary of the actual findings on specific fields.

  20. Melanocortin-1 receptor, skin cancer and phenotypic characteristics (M-SKIP) project: study design and methods for pooling results of genetic epidemiological studies

    Science.gov (United States)

    2012-01-01

    Background For complex diseases like cancer, pooled-analysis of individual data represents a powerful tool to investigate the joint contribution of genetic, phenotypic and environmental factors to the development of a disease. Pooled-analysis of epidemiological studies has many advantages over meta-analysis, and preliminary results may be obtained faster and with lower costs than with prospective consortia. Design and methods Based on our experience with the study design of the Melanocortin-1 receptor (MC1R) gene, SKin cancer and Phenotypic characteristics (M-SKIP) project, we describe the most important steps in planning and conducting a pooled-analysis of genetic epidemiological studies. We then present the statistical analysis plan that we are going to apply, giving particular attention to methods of analysis recently proposed to account for between-study heterogeneity and to explore the joint contribution of genetic, phenotypic and environmental factors in the development of a disease. Within the M-SKIP project, data on 10,959 skin cancer cases and 14,785 controls from 31 international investigators were checked for quality and recoded for standardization. We first proposed to fit the aggregated data with random-effects logistic regression models. However, for the M-SKIP project, a two-stage analysis will be preferred to overcome the problem regarding the availability of different study covariates. The joint contribution of MC1R variants and phenotypic characteristics to skin cancer development will be studied via logic regression modeling. Discussion Methodological guidelines to correctly design and conduct pooled-analyses are needed to facilitate application of such methods, thus providing a better summary of the actual findings on specific fields. PMID:22862891

  1. How the RNA isolation method can affect microRNA microarray results

    DEFF Research Database (Denmark)

    Podolska, Agnieszka; Kaczkowski, Bogumil; Litman, Thomas

    2011-01-01

The quality of RNA is crucial in gene expression experiments. RNA degradation interferes in the measurement of gene expression, and in this context, microRNA quantification can lead to an incorrect estimation. In the present study, two different RNA isolation methods were used to perform microRNA microarray analysis on porcine brain tissue. One method is a phenol-guanidine isothiocyanate-based procedure that permits isolation of total RNA. The second method, miRVana™ microRNA isolation, is column based and recovers the small RNA fraction alone. We found that microarray analyses give different results that depend on the RNA fraction used, in particular because some microRNAs appear very sensitive to the RNA isolation method. We conclude that precautions need to be taken when comparing microarray studies based on RNA isolated with different methods.

  2. The relationship between team climate and interprofessional collaboration: Preliminary results of a mixed methods study.

    Science.gov (United States)

    Agreli, Heloise F; Peduzzi, Marina; Bailey, Christopher

    2017-03-01

Relational and organisational factors are key elements of interprofessional collaboration (IPC) and team climate. Few studies have explored the relationship between IPC and team climate. This article presents a study that aimed to explore IPC in primary healthcare teams and understand how the assessment of team climate may provide insights into IPC. A mixed methods study design was adopted. In Stage 1 of the study, team climate was assessed using the Team Climate Inventory with 159 professionals in 18 interprofessional teams based in São Paulo, Brazil. In Stage 2, data were collected through in-depth interviews with a sample of team members who participated in the first stage of the study. Results from Stage 1 provided an overview of factors relevant to teamwork, which in turn informed our exploration of the relationship between team climate and IPC. Preliminary findings from Stage 2 indicated that teams with a more positive team climate (in particular, greater participative safety) also reported more effective communication and mutual support. In conclusion, team climate provided insights into IPC, especially regarding aspects of communication and interaction in teams. Further research will provide a better understanding of differences and areas of overlap between team climate and IPC. It will potentially contribute to an innovative theoretical approach to exploring interprofessional work in primary care settings.

  3. Standardization of glycohemoglobin results and reference values in whole blood studied in 103 laboratories using 20 methods.

    Science.gov (United States)

    Weykamp, C W; Penders, T J; Miedema, K; Muskiet, F A; van der Slik, W

    1995-01-01

    We investigated the effect of calibration with lyophilized calibrators on whole-blood glycohemoglobin (glyHb) results. One hundred three laboratories, using 20 different methods, determined glyHb in two lyophilized calibrators and two whole-blood samples. For whole-blood samples with low (5%) and high (9%) glyHb percentages, respectively, calibration decreased overall interlaboratory variation (CV) from 16% to 9% and from 11% to 6% and decreased intermethod variation from 14% to 6% and from 12% to 5%. Forty-seven laboratories, using 14 different methods, determined mean glyHb percentages in self-selected groups of 10 nondiabetic volunteers each. With calibration their overall mean (2SD) was 5.0% (0.5%), very close to the 5.0% (0.3%) derived from the reference method used in the Diabetes Control and Complications Trial. In both experiments the Abbott IMx and Vision showed deviating results. We conclude that, irrespective of the analytical method used, calibration enables standardization of glyHb results, reference values, and interpretation criteria.
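A hedged sketch of the calibration idea described above: each laboratory maps its raw readings onto the values assigned to the two lyophilized calibrators, so that results from different methods become comparable; the calibrator readings and assigned values below are hypothetical:

```python
# Two-point linear calibration: convert a lab's raw glyHb reading into a
# standardized percentage using the calibrator-assigned values.
def make_calibration(raw_low, raw_high, assigned_low=5.0, assigned_high=9.0):
    slope = (assigned_high - assigned_low) / (raw_high - raw_low)
    intercept = assigned_low - slope * raw_low
    return lambda raw: slope * raw + intercept

# A lab whose method reads the calibrators at 5.8% and 10.1% (its own bias):
calibrate = make_calibration(raw_low=5.8, raw_high=10.1)
print(f"Raw 6.2% -> standardized {calibrate(6.2):.1f}%")
```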

  4. PALEOEARTHQUAKES IN THE PRIBAIKALIE: METHODS AND RESULTS OF DATING

    Directory of Open Access Journals (Sweden)

    Oleg P. Smekalin

    2010-01-01

In the Pribaikalie and adjacent territories, seismogeological studies have been underway for almost half a century and have resulted in the discovery of more than 70 dislocations of seismic or presumably seismic origin. With the commencement of paleoseismic studies, the dating of paleo-earthquakes was focused on as an indicator useful for long-term prediction of strong earthquakes. V.P. Solonenko [Solonenko, 1977] distinguished five methods for dating paleoseismogenic deformations, i.e. geological, engineering-geological, historico-archeological, dendrochronological and radiocarbon methods. However, the ages of the majority of seismic deformations, which were subject to study at the initial stage of development of seismogeology in Siberia, were defined by methods of relative or correlative age determination. Since the 1980s, studies of seismogenic deformation in the Pribaikalie have been widely conducted with trenching. Mass sampling, followed by radiocarbon analyses and determination of absolute ages of paleo-earthquakes, provided new data on the seismic regime of the territory and the rates of recent displacements along active faults, and enhanced the validity of methods of relative dating, in particular morphometry. The capacity of the morphometry method has significantly increased with the introduction of laser techniques in surveys and digital processing of 3D relief models. Comprehensive seismogeological studies conducted in the Pribaikalie revealed 43 paleo-events within 16 seismogenic structures. Absolute ages of 18 paleo-events were defined by the radiocarbon age determination method. Judging by their ages, a number of dislocations were related to historical earthquakes which occurred in the 18th and 19th centuries, yet reliable data on the epicenters of such events are not available. The absolute and relative dating methods allowed us to identify sections in some paleoseismogenic structures by differences in ages of activation and thus provided new data for

  5. Two different hematocrit detection methods: Different methods, different results?

    Directory of Open Access Journals (Sweden)

    Schuepbach Reto A

    2010-03-01

Background: Less is known about the influence of hematocrit detection methodology on transfusion triggers. Therefore, the aim of the present study was to compare two different hematocrit-assessing methods. In a total of 50 critically ill patients, hematocrit was analyzed using (1) a blood gas analyzer (ABLflex 800) and (2) the central laboratory method (ADVIA® 2120), and the results were compared. Findings: Bland-Altman analysis for repeated measurements showed a good correlation with a bias of +1.39% and 2 SD of ± 3.12%. The 24%-hematocrit group showed a correlation of r2 = 0.87. With a kappa of 0.56, 22.7% of the cases would have been transfused differently. In the 28%-hematocrit group, with a similar correlation (r2 = 0.8) and a kappa of 0.58, 21% of the cases would have been transfused differently. Conclusions: Despite a good agreement between the two methods used to determine hematocrit in clinical routine, the calculated difference of 1.4% might substantially influence transfusion triggers depending on the employed method.
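The Bland-Altman summary quoted above (bias and ±2 SD limits of agreement) can be reproduced in a few lines; the paired hematocrit values here are hypothetical, not the study's data:

```python
# Bland-Altman bias and limits of agreement for two paired measurement methods.
import numpy as np

hct_blood_gas = np.array([24.1, 27.9, 31.2, 22.8, 35.5, 29.0])  # method 1 (%)
hct_lab       = np.array([23.0, 26.2, 29.8, 21.5, 34.0, 27.6])  # method 2 (%)

diff = hct_blood_gas - hct_lab
bias = diff.mean()
sd = diff.std(ddof=1)
print(f"bias = {bias:+.2f}%, "
      f"limits of agreement = [{bias - 2*sd:.2f}%, {bias + 2*sd:.2f}%]")
```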

  6. Active teaching methods, studying responses and learning

    DEFF Research Database (Denmark)

    Christensen, Hans Peter; Vigild, Martin Etchells; Thomsen, Erik Vilain

    2010-01-01

Students’ study strategies when exposed to activating teaching methods are measured, analysed and compared to study strategies in more traditional lecture-based teaching. The resulting learning outcome is discussed.

  7. The Value of Mixed Methods Research: A Mixed Methods Study

    Science.gov (United States)

    McKim, Courtney A.

    2017-01-01

    The purpose of this explanatory mixed methods study was to examine the perceived value of mixed methods research for graduate students. The quantitative phase was an experiment examining the effect of a passage's methodology on students' perceived value. Results indicated students scored the mixed methods passage as more valuable than those who…

  8. The relationship of the local food environment with obesity: A systematic review of methods, study quality and results

    Science.gov (United States)

    Cobb, Laura K; Appel, Lawrence J; Franco, Manuel; Jones-Smith, Jessica C; Nur, Alana; Anderson, Cheryl AM

    2015-01-01

    Objective To examine the relationship between local food environments and obesity and assess the quality of studies reviewed. Methods Systematic keyword searches identified studies from US and Canada that assessed the relationship of obesity to local food environments. We applied a quality metric based on design, exposure and outcome measurement, and analysis. Results We identified 71 studies representing 65 cohorts. Overall, study quality was low; 60 studies were cross-sectional. Associations between food outlet availability and obesity were predominantly null. Among non-null associations, we saw a trend toward inverse associations between supermarket availability and obesity (22 negative, 4 positive, 67 null) and direct associations between fast food and obesity (29 positive, 6 negative, 71 null) in adults. We saw direct associations between fast food availability and obesity in lower income children (12 positive, 7 null). Indices including multiple food outlets were most consistently associated with obesity in adults (18 expected, 1 not expected, 17 null). Limiting to higher quality studies did not affect results. Conclusions Despite the large number of studies, we found limited evidence for associations between local food environments and obesity. The predominantly null associations should be interpreted cautiously due to the low quality of available studies. PMID:26096983

  9. New method of scoliosis assessment: preliminary results using computerized photogrammetry.

    Science.gov (United States)

    Aroeira, Rozilene Maria Cota; Leal, Jefferson Soares; de Melo Pertence, Antônio Eustáquio

    2011-09-01

A new method for nonradiographic evaluation of scoliosis was independently compared with the Cobb radiographic method for the quantification of scoliotic curvature. The objectives were to develop a protocol for computerized photogrammetry, as a nonradiographic method, for the quantification of scoliosis, and to mathematically relate this proposed method to the Cobb radiographic method. Repeated exposure of children to radiation can be harmful to their health. Nevertheless, no nonradiographic method proposed until now has gained popularity as a routine method of evaluation, mainly due to a low correspondence with the Cobb radiographic method. Patients undergoing standing posteroanterior full-length spine radiographs, who were willing to participate in this study, were submitted to dorsal digital photography in the orthostatic position with special surface markers over the spinous processes, specifically from vertebrae C7 to L5. The radiographic and photographic images were sent separately for independent analysis to two examiners, trained in the quantification of scoliosis for the types of images received. The scoliosis curvature angles obtained through computerized photogrammetry (the new method) were compared to those obtained through the Cobb radiographic method. Sixteen individuals were evaluated (14 female and 2 male). All presented idiopathic scoliosis and were 21.4 ± 6.1 years of age, 52.9 ± 5.8 kg in weight and 1.63 ± 0.05 m in height, with a body mass index of 19.8 ± 0.2. There was no statistically significant difference between the scoliosis angle measurements obtained in the comparative analysis of both methods, and a mathematical relationship was formulated between the two methods. The preliminary results presented demonstrate equivalence between the two methods. More studies are needed to firmly assess the potential of this new method as a coadjuvant tool in the routine follow-up of scoliosis treatment.

  10. The WOMBAT Attack Attribution Method: Some Results

    Science.gov (United States)

    Dacier, Marc; Pham, Van-Hau; Thonnard, Olivier

    In this paper, we present a new attack attribution method that has been developed within the WOMBAT project. We illustrate the method with some real-world results obtained when applying it to almost two years of attack traces collected by low interaction honeypots. This analytical method aims at identifying large scale attack phenomena composed of IP sources that are linked to the same root cause. All malicious sources involved in a same phenomenon constitute what we call a Misbehaving Cloud (MC). The paper offers an overview of the various steps the method goes through to identify these clouds, providing pointers to external references for more detailed information. Four instances of misbehaving clouds are then described in some more depth to demonstrate the meaningfulness of the concept.

  11. Results of a study assessing teaching methods of faculty after measuring student learning style preference.

    Science.gov (United States)

    Stirling, Bridget V

    2017-08-01

Learning style preference impacts how well groups of students respond to their curricula. Faculty have many choices in the methods for delivering nursing content, as well as for assessing students. The purpose was to develop knowledge about how faculty delivered curriculum content, and then to consider these findings in the context of the students' learning style preferences. Following an in-service on teaching and learning styles, faculty completed surveys on their methods of teaching and the proportion of time spent teaching with each learning style (visual, aural, read/write and kinesthetic). This study took place at the College of Nursing of a large all-female university in Saudi Arabia. 24 female nursing faculty volunteered to participate in the project. A cross-sectional design was used. Faculty reported teaching mostly with kinesthetic and visual methods, although lecture was also popular (aural). Students preferred kinesthetic and aural learning methods. Read/write was the least preferred by students and the least used method of teaching by faculty. Faculty used visual methods about one third of the time, although they were not preferred by the students. Students' preferred learning style (kinesthetic) was the method most used by faculty. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Correction the Bias of Odds Ratio resulting from the Misclassification of Exposures in the Study of Environmental Risk Factors of Lung Cancer using Bayesian Methods

    Directory of Open Access Journals (Sweden)

    Alireza Abadi

    2015-07-01

Background & Objective: The inability to measure exact exposure in epidemiological studies is a common problem in many studies, especially cross-sectional studies. Depending on the extent of misclassification, results may be affected. Existing methods for solving this problem require a lot of time and money and are not practical for some exposures. Recently, new methods have been proposed for 1:1 matched case-control studies that solve these problems to some extent. In the present study we aimed to extend the existing Bayesian method to adjust for misclassification in matched case-control studies with 1:2 matching. Methods: Here, the standard Dirichlet prior distribution for a multinomial model was extended to allow data on the exposure-disease (OR) parameter to be imported into the model, excluding other parameters. Information existing in the literature about the association between exposure and disease was used as prior information about the OR. In order to correct the misclassification, a sensitivity analysis was performed and the results were obtained under three Bayesian methods. Results: The results of the naive Bayesian model were similar to those of the classic model. The second Bayesian model, which employed prior information about the OR, was heavily affected by this information. The third proposed model provides maximum bias adjustment for the risk of heavy metals, smoking and drug abuse. This model showed that heavy metals are not an important risk factor, although the raw model (classic logistic regression) detected this exposure as a factor influencing the incidence of lung cancer. Sensitivity analysis showed that the third model is robust with regard to different levels of sensitivity and specificity. Conclusion: The present study showed that although for most exposures the results of the second and third models were similar, the proposed model is able to correct the misclassification to some extent.
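For intuition about how exposure misclassification biases the odds ratio, the sketch below applies a simple frequentist matrix-method correction for an assumed sensitivity and specificity; this is only an illustration, not the authors' Bayesian matched-data model, and all counts are hypothetical:

```python
# Matrix-method correction of 2x2 exposure counts for known misclassification.
def correct_exposed(observed_exposed, total, sensitivity, specificity):
    # E[observed exposed] = Se * true_exposed + (1 - Sp) * (total - true_exposed)
    return (observed_exposed - (1 - specificity) * total) / (sensitivity + specificity - 1)

def odds_ratio(a, b, c, d):
    return (a * d) / (b * c)

# Hypothetical observed counts: cases (a exposed, b unexposed), controls (c, d).
a_obs, b_obs, c_obs, d_obs = 60, 40, 45, 55
se, sp = 0.85, 0.90  # assumed sensitivity/specificity of exposure measurement

a_true = correct_exposed(a_obs, a_obs + b_obs, se, sp)
c_true = correct_exposed(c_obs, c_obs + d_obs, se, sp)
print("naive OR:    ", round(odds_ratio(a_obs, b_obs, c_obs, d_obs), 2))
print("corrected OR:", round(odds_ratio(a_true, (a_obs + b_obs) - a_true,
                                        c_true, (c_obs + d_obs) - c_true), 2))
```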

  13. The relationship between team climate and interprofessional collaboration: preliminary results of a mixed methods study

    OpenAIRE

    Bailey, Christopher; Agreli, Heloise F.; Peduzzi, Marina

    2016-01-01

Relational and organisational factors are key elements of interprofessional collaboration (IPC) and team climate. Few studies have explored the relationship between IPC and team climate. This article presents a study that aimed to explore IPC in primary healthcare teams and understand how the assessment of team climate may provide insights into IPC. A mixed methods study design was adopted. In Stage 1 of the study, team climate was assessed using the Team Climate Inventory with 159 profess...

  14. Evaluation and perceived results of moral case deliberation: A mixed methods study

    NARCIS (Netherlands)

    Janssens, R.; van Zadelhoff, E.; van Loo, G.; Widdershoven, G.A.; Molewijk, A.C.

    2015-01-01

    Background: Moral case deliberation is increasingly becoming part of various Dutch healthcare organizations. Although some evaluation studies of moral case deliberation have been carried out, research into the results of moral case deliberation within aged care is scarce. Research questions: How did

  15. Processing method and results of meteor shower radar observations

    International Nuclear Information System (INIS)

    Belkovich, O.I.; Suleimanov, N.I.; Tokhtasjev, V.S.

    1987-01-01

    Studies of meteor showers permit the solving of some principal problems of meteor astronomy: to obtain the structure of a stream in cross section and along its orbits; to retrace the evolution of particle orbits of the stream taking into account gravitational and nongravitational forces and to discover the orbital elements of its parent body; to find out the total mass of solid particles ejected from the parent body taking into account physical and chemical evolution of meteor bodies; and to use meteor streams as natural probes for investigation of the average characteristics of the meteor complex in the solar system. A simple and effective method of determining the flux density and mass exponent parameter was worked out. This method and its results are discussed

  16. Comparative study of the geostatistical ore reserve estimation method over the conventional methods

    International Nuclear Information System (INIS)

    Kim, Y.C.; Knudsen, H.P.

    1975-01-01

    Part I contains a comprehensive treatment of the comparative study of the geostatistical ore reserve estimation method over the conventional methods. The conventional methods chosen for comparison were: (a) the polygon method, (b) the inverse of the distance squared method, and (c) a method similar to (b) but allowing different weights in different directions. Briefly, the overall result from this comparative study is in favor of the use of geostatistics in most cases because the method has lived up to its theoretical claims. A good exposition on the theory of geostatistics, the adopted study procedures, conclusions and recommended future research are given in Part I. Part II of this report contains the results of the second and the third study objectives, which are to assess the potential benefits that can be derived by the introduction of the geostatistical method to the current state-of-the-art in uranium reserve estimation method and to be instrumental in generating the acceptance of the new method by practitioners through illustrative examples, assuming its superiority and practicality. These are given in the form of illustrative examples on the use of geostatistics and the accompanying computer program user's guide
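As a point of reference for one of the conventional estimators named above, here is a minimal inverse-distance-squared block estimate from hypothetical drill-hole samples; the geostatistical alternative (kriging) additionally requires a variogram model and is not shown:

```python
# Inverse-distance-weighted (power = 2) grade estimate at a target location.
import numpy as np

def idw_estimate(sample_xy, sample_grade, target_xy, power=2.0):
    d = np.linalg.norm(sample_xy - target_xy, axis=1)
    if np.any(d == 0):                      # target coincides with a sample
        return float(sample_grade[d == 0][0])
    w = 1.0 / d**power
    return float(np.sum(w * sample_grade) / np.sum(w))

# Hypothetical drill-hole samples: coordinates (m) and U3O8 grade (%).
xy = np.array([[0.0, 0.0], [50.0, 10.0], [20.0, 60.0], [80.0, 70.0]])
grade = np.array([0.12, 0.08, 0.15, 0.05])
print("block estimate:", round(idw_estimate(xy, grade, np.array([30.0, 30.0])), 3), "%")
```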

  17. Multiband discrete ordinates method: formalism and results

    International Nuclear Information System (INIS)

    Luneville, L.

    1998-06-01

    The multigroup discrete ordinates method is a classical way to solve transport equation (Boltzmann) for neutral particles. Self-shielding effects are not correctly treated due to large variations of cross sections in a group (in the resonance range). To treat the resonance domain, the multiband method is introduced. The main idea is to divide the cross section domain into bands. We obtain the multiband parameters using the moment method; the code CALENDF provides probability tables for these parameters. We present our implementation in an existing discrete ordinates code: SN1D. We study deep penetration benchmarks and show the improvement of the method in the treatment of self-shielding effects. (author)

  18. Method for Developing Descriptions of Hard-to-Price Products: Results of the Telecommunications Product Study

    Energy Technology Data Exchange (ETDEWEB)

    Conrad, F.; Tonn, B.

    1999-05-01

This report presents the results of a study to test a new method for developing descriptions of hard-to-price products. The Bureau of Labor Statistics (BLS) is responsible for collecting data to estimate price indices such as the Consumer Price Index (CPI). BLS accomplishes this task by sending field staff to places of business to price actual products. The field staff are given product checklists to help them determine whether products found today are comparable to products priced the previous month. Prices for non-comparable products are not included in the current month's price index calculations. A serious problem facing BLS is developing product checklists for dynamic product areas, new industries, and the service sector. It is difficult to keep checklists up to date and quite often simply to develop checklists for service-industry products. Some estimates suggest that upwards of 50% of US economic activity is not accounted for in the CPI.

  19. Experimental Results and Numerical Simulation of the Target RCS using Gaussian Beam Summation Method

    Directory of Open Access Journals (Sweden)

    Ghanmi Helmi

    2018-05-01

This paper presents a numerical and experimental study of the Radar Cross Section (RCS) of radar targets using the Gaussian Beam Summation (GBS) method. The GBS method has several advantages over the ray method, mainly regarding the caustic problem. To evaluate the performance of the chosen method, we started the analysis of the RCS using Gaussian Beam Summation (GBS) and Gaussian Beam Launching (GBL), the asymptotic models Physical Optics (PO) and Geometrical Theory of Diffraction (GTD), and the rigorous Method of Moments (MoM). Then, we showed the experimental validation of the numerical results using measurements carried out in the anechoic chamber of Lab-STICC at ENSTA Bretagne. The numerical and experimental results of the RCS are studied and given as a function of various parameters: polarization type, target size, number of Gaussian beams and Gaussian beam width.

  20. [Do different interpretative methods used for evaluation of checkerboard synergy test affect the results?].

    Science.gov (United States)

    Ozseven, Ayşe Gül; Sesli Çetin, Emel; Ozseven, Levent

    2012-07-01

In recent years, owing to the presence of multi-drug resistant nosocomial bacteria, combination therapies are more frequently applied. Thus there is more need to investigate the in vitro activity of drug combinations against multi-drug resistant bacteria. Checkerboard synergy testing is among the most widely used standard techniques to determine the activity of antibiotic combinations. It is based on microdilution susceptibility testing of antibiotic combinations. Although this test has a standardised procedure, there are many different methods for interpreting the results. In many previous studies carried out with multi-drug resistant bacteria, different rates of synergy have been reported with various antibiotic combinations using the checkerboard technique. These differences might be attributed to the different features of the strains. However, different synergy rates detected by the checkerboard method have also been reported in other studies using the same drug combinations and the same types of bacteria. It was thought that these differences in synergy rates might be due to the different methods used to interpret the synergy test results. In recent years, multi-drug resistant Acinetobacter baumannii has been the most commonly encountered nosocomial pathogen, especially in intensive-care units. For this reason, multi-drug resistant A.baumannii has been the subject of a considerable amount of research on antimicrobial combinations. In the present study, the in vitro activities of combinations frequently preferred in A.baumannii infections, namely imipenem plus ampicillin/sulbactam and meropenem plus ampicillin/sulbactam, were tested by the checkerboard synergy method against 34 multi-drug resistant A.baumannii isolates. Minimum inhibitory concentration (MIC) values for imipenem, meropenem and ampicillin/sulbactam were determined by the broth microdilution method. Subsequently the activity of the two combinations was tested in the dilution range of 4 x MIC and 0.03 x MIC in
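One widely used, though not the only, interpretive rule for checkerboard results is the fractional inhibitory concentration index (FICI); the sketch below computes it for hypothetical MIC values, bearing in mind that the abstract's point is precisely that competing interpretation rules can change the reported synergy rates:

```python
# FICI = (MIC of A in combination / MIC of A alone)
#      + (MIC of B in combination / MIC of B alone)
def fici(mic_a_alone, mic_b_alone, mic_a_combo, mic_b_combo):
    return mic_a_combo / mic_a_alone + mic_b_combo / mic_b_alone

def classify(index):
    if index <= 0.5:
        return "synergy"
    if index <= 4.0:
        return "no interaction (additive/indifferent)"
    return "antagonism"

# Hypothetical MICs (mg/L) for a carbapenem and ampicillin/sulbactam.
value = fici(mic_a_alone=32, mic_b_alone=64, mic_a_combo=8, mic_b_combo=16)
print(f"FICI = {value:.2f} -> {classify(value)}")
```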

  1. Developing a bone mineral density test result letter to send to patients: a mixed-methods study

    Directory of Open Access Journals (Sweden)

    Edmonds SW

    2014-06-01

Stephanie W Edmonds, Samantha L Solimeo, Xin Lu, Douglas W Roblin, Kenneth G Saag, Peter Cram (University of Iowa; Iowa City Veterans Affairs Health Care System; Kaiser Permanente of Atlanta; University of Alabama at Birmingham; University of Toronto; Georgia State University) Purpose: To use a mixed-methods approach to develop a letter that can be used to notify patients of their bone mineral density (BMD) results by mail and that may activate patients in their bone-related health care. Patients and methods: A multidisciplinary team developed three versions of a letter for reporting BMD results to patients. Trained interviewers presented these letters in a random order to a convenience sample of adults, aged 50 years and older, at two different health care systems. We conducted structured interviews to examine the respondents' preferences and comprehension among the various letters. Results: A total of 142 participants completed the interview. A majority of the participants were female (64.1%) and white (76.1%). A plurality of the participants identified a specific version of the three letters as both their preferred version (45.2%; P<0.001) and as the easiest to understand (44.6%; P<0.01). A majority of participants preferred that the letters include specific next steps for improving their bone health. Conclusion: Using a mixed-methods approach, we were able to develop and optimize a printed letter for communicating a complex test result (BMD) to patients. Our results may offer guidance to clinicians, administrators, and researchers who are

  2. The estimation of the measurement results with using statistical methods

    International Nuclear Information System (INIS)

Velychko, O. (State Enterprise Ukrmetrteststandard, 4, Metrologichna Str., 03680, Kyiv, Ukraine); Gordiyenko, T. (State Scientific Institution UkrNDIspirtbioprod, 3, Babushkina Lane, 03190, Kyiv, Ukraine)

    2015-01-01

A number of international standards and guides describe various statistical methods that can be applied for the management, control and improvement of processes, with the purpose of analysing technical measurement results. An analysis of international standards and guides on statistical methods for the estimation of measurement results, together with recommendations for their application in laboratories, is described. To carry out the analysis of the standards and guides, cause-and-effect Ishikawa diagrams concerning the application of statistical methods for the estimation of measurement results are constructed

  3. The estimation of the measurement results with using statistical methods

    Science.gov (United States)

    Velychko, O.; Gordiyenko, T.

    2015-02-01

A number of international standards and guides describe various statistical methods that can be applied for the management, control and improvement of processes, with the purpose of analysing technical measurement results. An analysis of international standards and guides on statistical methods for the estimation of measurement results, together with recommendations for their application in laboratories, is described. To carry out the analysis of the standards and guides, cause-and-effect Ishikawa diagrams concerning the application of statistical methods for the estimation of measurement results are constructed.

  4. Non-Destructive Evaluation Method Based On Dynamic Invariant Stress Resultants

    Directory of Open Access Journals (Sweden)

    Zhang Junchi

    2015-01-01

    Full Text Available Most of the vibration based damage detection methods are based on changes in frequencies, mode shapes, mode shape curvature, and flexibilities. These methods are limited and typically can only detect the presence and location of damage. Current methods seldom can identify the exact severity of damage to structures. This paper will present research in the development of a new non-destructive evaluation method to identify the existence, location, and severity of damage for structural systems. The method utilizes the concept of invariant stress resultants (ISR. The basic concept of ISR is that at any given cross section the resultant internal force distribution in a structural member is not affected by the inflicted damage. The method utilizes dynamic analysis of the structure to simulate direct measurements of acceleration, velocity and displacement simultaneously. The proposed dynamic ISR method is developed and utilized to detect the damage of corresponding changes in mass, damping and stiffness. The objectives of this research are to develop the basic theory of the dynamic ISR method, apply it to the specific types of structures, and verify the accuracy of the developed theory. Numerical results that demonstrate the application of the method will reflect the advanced sensitivity and accuracy in characterizing multiple damage locations.

  5. The "Interval Walking in Colorectal Cancer" (I-WALK-CRC) study: Design, methods and recruitment results of a randomized controlled feasibility trial.

    Science.gov (United States)

    Banck-Petersen, Anna; Olsen, Cecilie K; Djurhuus, Sissal S; Herrstedt, Anita; Thorsen-Streit, Sarah; Ried-Larsen, Mathias; Østerlind, Kell; Osterkamp, Jens; Krarup, Peter-Martin; Vistisen, Kirsten; Mosgaard, Camilla S; Pedersen, Bente K; Højman, Pernille; Christensen, Jesper F

    2018-03-01

    Low physical activity level is associated with poor prognosis in patients with colorectal cancer (CRC). To increase physical activity, technology-based platforms are emerging and provide intriguing opportunities to prescribe and monitor active lifestyle interventions. The "Interval Walking in Colorectal Cancer" (I-WALK-CRC) study explores the feasibility and efficacy of a home-based interval-walking intervention delivered by a smart-phone application in order to improve the cardio-metabolic health profile among CRC survivors. The aim of the present report is to describe the design, methods and recruitment results of the I-WALK-CRC study. Methods/Results: The I-WALK-CRC study is a randomized controlled trial designed to evaluate the feasibility and efficacy of a home-based interval walking intervention compared to a waiting-list control group for physiological and patient-reported outcomes. Patients who had completed surgery for local stage disease and patients who had completed surgery and any adjuvant chemotherapy for locally advanced stage disease were eligible for inclusion. Between October 1st, 2015, and February 1st, 2017, 136 inquiries were recorded; 83 patients were eligible for enrollment, and 42 patients accepted participation. Age and employment status were associated with participation, as participants were significantly younger (60.5 vs 70.8 years, P CRC survivors was feasible but we aim to improve the recruitment rate in future studies. Further, the study clearly favored younger participants. The I-WALK-CRC study will provide important information regarding the feasibility and efficacy of a home-based walking exercise program in CRC survivors.

  6. Experimental study on rapid embankment construction methods

    International Nuclear Information System (INIS)

    Hirano, Hideaki; Egawa, Kikuji; Hyodo, Kazuya; Kannoto, Yasuo; Sekimoto, Tsuyoshi; Kobayashi, Kokichi.

    1982-01-01

    In the construction of a thermal or nuclear power plant in a coastal area, a shorter embankment construction period has recently come to be called for. This tendency is particularly marked where the construction period is limited by meteorological or sea conditions. To meet this requirement, the authors have been conducting basic experimental studies on two methods for the rapid execution of embankment construction, namely the Steel Plate Cellular Bulkhead Embedding Method and the Ship Hull Caisson Method. This paper presents an outline of the results of the experimental study on these two methods. (author)

  7. Life cycle analysis of electricity systems: Methods and results

    International Nuclear Information System (INIS)

    Friedrich, R.; Marheineke, T.

    1996-01-01

    The two methods for full energy chain analysis, process analysis and input/output analysis, are discussed. A combination of these two methods provides the most accurate results. Such a hybrid analysis of the full energy chains of six different power plants is presented and discussed. The results of such analyses depend on the time, site and technique of each process step and therefore have no general validity. For renewable energy systems the emissions from the generation of a back-up system should be added. (author). 7 figs, 1 fig

  8. The impact of secure messaging on workflow in primary care: Results of a multiple-case, multiple-method study.

    Science.gov (United States)

    Hoonakker, Peter L T; Carayon, Pascale; Cartmill, Randi S

    2017-04-01

    Secure messaging is a relatively new addition to health information technology (IT). Several studies have examined the impact of secure messaging on (clinical) outcomes, but very few studies have examined the impact on workflow in primary care clinics. In this study we examined the impact of secure messaging on the workflow of clinicians, staff and patients. We used a multiple case study design with multiple data collection methods (observation, interviews and survey). Results show that secure messaging has the potential to improve communication and information flow and the organization of work in primary care clinics, partly due to the possibility of asynchronous communication. However, secure messaging can also have a negative effect on communication and increase workload, especially if patients send messages that are not appropriate for the secure messaging medium (for example, messages that are too long, complex, ambiguous, or inappropriate). Results show that clinicians are ambivalent about secure messaging. Secure messaging can add to their workload, especially if there is a high message volume, and currently they are not compensated for these activities. Staff members are, especially compared with clinicians, relatively positive about secure messaging, and patients are overall very satisfied with secure messaging. Finally, clinicians, staff and patients think that secure messaging can have a positive effect on quality of care and patient safety. Secure messaging is a tool that has the potential to improve communication and information flow. However, the potential of secure messaging to improve workflow is dependent on the way it is implemented and used. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. EQUITY SHARES EQUATING THE RESULTS OF FCFF AND FCFE METHODS

    Directory of Open Access Journals (Sweden)

    Bartłomiej Cegłowski

    2012-06-01

    Full Text Available The aim of the article is to present a method of establishing equity shares in the weighted average cost of capital (WACC), in which the value of loan capital results from the fixed assumptions accepted in the financial plan (for example, a schedule of loan repayment) and own equity is evaluated by means of a discount method. With the described method, regardless of whether cash flows are calculated as FCFF or FCFE, the result of the company valuation will be identical.
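    The circularity this kind of method resolves (the equity weight in WACC depends on the equity value, which is itself obtained by discounting at a rate built from WACC) can be illustrated with a small fixed-point iteration. The sketch below is only an illustration under simplified assumptions (a growing FCFF perpetuity, constant debt value, hypothetical inputs); it is not the valuation scheme from the article.

```python
# Illustrative sketch (not the paper's exact procedure): iterate the equity weight
# in WACC until the FCFF valuation is consistent with the equity value it implies.
# All inputs (fcff, debt_value, cost_of_equity, cost_of_debt, tax_rate) are
# hypothetical, and a simple growing perpetuity is assumed.

def equity_value_from_fcff(fcff, debt_value, cost_of_equity, cost_of_debt,
                           tax_rate, growth=0.0, tol=1e-9, max_iter=1000):
    equity = debt_value  # arbitrary starting guess for the equity value
    for _ in range(max_iter):
        total = equity + debt_value
        wacc = (equity / total) * cost_of_equity \
             + (debt_value / total) * cost_of_debt * (1.0 - tax_rate)
        firm_value = fcff * (1.0 + growth) / (wacc - growth)  # growing perpetuity
        new_equity = firm_value - debt_value
        if abs(new_equity - equity) < tol:
            return new_equity
        equity = new_equity
    return equity

if __name__ == "__main__":
    e = equity_value_from_fcff(fcff=100.0, debt_value=400.0,
                               cost_of_equity=0.12, cost_of_debt=0.06,
                               tax_rate=0.19, growth=0.02)
    print(f"Equity value consistent with the WACC weights: {e:,.2f}")
```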

  10. Decay correction methods in dynamic PET studies

    International Nuclear Information System (INIS)

    Chen, K.; Reiman, E.; Lawson, M.

    1995-01-01

    In order to reconstruct positron emission tomography (PET) images in quantitative dynamic studies, the data must be corrected for radioactive decay. One of the two commonly used methods ignores physiological processes, including blood flow, that occur at the same time as radioactive decay; the other makes incorrect use of time-accumulated PET counts. In simulated dynamic PET studies using 11C-acetate and 18F-fluorodeoxyglucose (FDG), these methods are shown to result in biased estimates of the time-activity curve (TAC) and model parameters. New methods described in this article provide significantly improved parameter estimates in dynamic PET studies.
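    For orientation, the sketch below shows the conventional frame-wise decay correction that such methods refine: counts in each frame are scaled for decay within the frame (assuming roughly constant activity over the frame) and for decay from scan start to the frame. The half-life and frame times are hypothetical examples; this is not the corrected method proposed in the article.

```python
import math

# Illustrative sketch: decay-correct measured counts in a dynamic PET frame back
# to the scan-start time. Assumes activity is roughly constant within the frame;
# the half-life and frame times below are hypothetical examples.

def decay_correction_factor(t_start, t_end, half_life):
    """Factor converting counts acquired in [t_start, t_end] (same time unit as
    half_life) into counts referenced to time zero."""
    lam = math.log(2.0) / half_life
    dt = t_end - t_start
    # Correction for decay during the frame (constant true activity assumed) ...
    within_frame = lam * dt / (1.0 - math.exp(-lam * dt))
    # ... combined with correction for decay from time zero to the frame start.
    to_time_zero = math.exp(lam * t_start)
    return within_frame * to_time_zero

if __name__ == "__main__":
    C11_HALF_LIFE = 20.4  # minutes, carbon-11
    frames = [(0, 1), (1, 2), (2, 5), (5, 10), (10, 20)]  # minutes
    for t0, t1 in frames:
        print(f"frame {t0:>2}-{t1:<2} min: factor = "
              f"{decay_correction_factor(t0, t1, C11_HALF_LIFE):.3f}")
```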

  11. Quantifying the measurement uncertainty of results from environmental analytical methods.

    Science.gov (United States)

    Moser, J; Wegscheider, W; Sperka-Gottlieb, C

    2001-07-01

    The Eurachem-CITAC Guide Quantifying Uncertainty in Analytical Measurement was put into practice in a public laboratory devoted to environmental analytical measurements. In doing so due regard was given to the provisions of ISO 17025 and an attempt was made to base the entire estimation of measurement uncertainty on available data from the literature or from previously performed validation studies. Most environmental analytical procedures laid down in national or international standards are the result of cooperative efforts and put into effect as part of a compromise between all parties involved, public and private, that also encompasses environmental standards and statutory limits. Central to many procedures is the focus on the measurement of environmental effects rather than on individual chemical species. In this situation it is particularly important to understand the measurement process well enough to produce a realistic uncertainty statement. Environmental analytical methods will be examined as far as necessary, but reference will also be made to analytical methods in general and to physical measurement methods where appropriate. This paper describes ways and means of quantifying uncertainty for frequently practised methods of environmental analysis. It will be shown that operationally defined measurands are no obstacle to the estimation process as described in the Eurachem/CITAC Guide if it is accepted that the dominating component of uncertainty comes from the actual practice of the method as a reproducibility standard deviation.
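    As a rough illustration of the bottom-up estimation described in the Eurachem/CITAC Guide, the sketch below combines independent standard uncertainty components in quadrature and reports an expanded uncertainty with coverage factor k = 2. The component values are hypothetical stand-ins for data taken from validation studies or the literature.

```python
import math

# Minimal sketch of a GUM/Eurachem-style uncertainty budget: combine standard
# uncertainty components in quadrature (assuming independence) and report an
# expanded uncertainty with a coverage factor k = 2. The components below are
# hypothetical relative standard uncertainties for an environmental measurement.

def combined_standard_uncertainty(components):
    """components: iterable of relative standard uncertainties (dimensionless)."""
    return math.sqrt(sum(u * u for u in components))

if __name__ == "__main__":
    components = {
        "calibration standard": 0.012,
        "recovery / method bias": 0.035,
        "reproducibility (from validation)": 0.060,
        "sample inhomogeneity": 0.020,
    }
    u_c = combined_standard_uncertainty(components.values())
    U = 2.0 * u_c  # expanded uncertainty, k = 2 (approx. 95 % coverage)
    result = 48.2  # hypothetical analyte concentration, e.g. in microgram/L
    print(f"u_c (relative) = {u_c:.3f}, U (relative) = {U:.3f}")
    print(f"Reported result: {result} +/- {result * U:.1f} (k=2)")
```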

  12. STANDARDIZATION OF GLYCOHEMOGLOBIN RESULTS AND REFERENCE VALUES IN WHOLE-BLOOD STUDIED IN 103 LABORATORIES USING 20 METHODS

    NARCIS (Netherlands)

    WEYKAMP, CW; PENDERS, TJ; MUSKIET, FAJ; VANDERSLIK, W

    We investigated the effect of calibration with lyophilized calibrators on whole-blood glycohemoglobin (glyHb) results. One hundred three laboratories, using 20 different methods, determined glyHb in two lyophilized calibrators and two whole-blood samples. For whole-blood samples with low (5%) and

  13. Optimizing Usability Studies by Complementary Evaluation Methods

    NARCIS (Netherlands)

    Schmettow, Martin; Bach, Cedric; Scapin, Dominique

    2014-01-01

    This paper examines combinations of complementary evaluation methods as a strategy for efficient usability problem discovery. A data set from an earlier study is re-analyzed, involving three evaluation methods applied to two virtual environment applications. Results of a mixed-effects logistic

  14. Evaluation and perceived results of moral case deliberation: A mixed methods study.

    Science.gov (United States)

    Janssens, Rien M J P A; van Zadelhoff, Ezra; van Loo, Ger; Widdershoven, Guy A M; Molewijk, Bert A C

    2015-12-01

    Moral case deliberation is increasingly becoming part of various Dutch healthcare organizations. Although some evaluation studies of moral case deliberation have been carried out, research into the results of moral case deliberation within aged care is scarce. How did participants evaluate moral case deliberation? What has moral case deliberation brought to them? What has moral case deliberation contributed to care practice? Should moral case deliberation be further implemented and, if so, how? Quantitative analysis of a questionnaire study among participants of moral case deliberation, both caregivers and team leaders. Qualitative analysis of written answers to open questions, interview study and focus group meetings among caregivers and team leaders. Caregivers and team leaders in a large organization for aged care in the Netherlands. A total of 61 moral case deliberation sessions, carried out on 16 care locations belonging to the organization, were evaluated and perceived results were assessed. Participants gave informed consent and anonymity was guaranteed. In the Netherlands, the law does not prescribe independent ethical review by an Institutional Review Board for this kind of research among healthcare professionals. Moral case deliberation was evaluated positively by the participants. Content and atmosphere of moral case deliberation received high scores, while organizational issues regarding the moral case deliberation sessions scored lower and merit further attention. Respondents indicated that moral case deliberation has the potential to contribute to care practice as relationships among team members improve, more openness is experienced and more understanding for different perspectives is fostered. If moral case deliberation is to be successfully implemented, top-down approaches should go hand in hand with bottom-up approaches. The relevance of moral case deliberation for care practice received wide acknowledgement from the respondents. It can contribute

  15. On Calculation Methods and Results for Straight Cylindrical Roller Bearing Deflection, Stiffness, and Stress

    Science.gov (United States)

    Krantz, Timothy L.

    2011-01-01

    The purpose of this study was to assess some calculation methods for quantifying the relationships of bearing geometry, material properties, load, deflection, stiffness, and stress. The scope of the work was limited to two-dimensional modeling of straight cylindrical roller bearings. Preparations for studies of dynamic response of bearings with damaged surfaces motivated this work. Studies were selected to exercise and build confidence in the numerical tools. Three calculation methods were used in this work. Two of the methods were numerical solutions of the Hertz contact approach. The third method used was a combined finite element surface integral method. Example calculations were done for a single roller loaded between an inner and outer raceway for code verification. Next, a bearing with 13 rollers and all-steel construction was used as an example to do additional code verification, including an assessment of the leading order of accuracy of the finite element and surface integral method. Results from that study show that the method is at least first-order accurate. Those results also show that the contact grid refinement has a more significant influence on precision as compared to the finite element grid refinement. To explore the influence of material properties, the 13-roller bearing was modeled as made from Nitinol 60, a material with very different properties from steel and showing some potential for bearing applications. The codes were exercised to compare contact areas and stress levels for steel and Nitinol 60 bearings operating at equivalent power density. As a step toward modeling the dynamic response of bearings having surface damage, static analyses were completed to simulate a bearing with a spall or similar damage.
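    For context, a closed-form Hertzian line-contact estimate of the kind such numerical solutions are verified against can be written in a few lines. The sketch below uses standard textbook formulas with hypothetical roller geometry, load and material values; it is not the finite element / surface integral code used in the study.

```python
import math

# Hedged sketch of a classical Hertzian line-contact estimate for a roller on a
# raceway. Standard textbook formulas; load, geometry and material values are
# hypothetical.

def hertz_line_contact(load_per_length, r_roller, r_race, e1, nu1, e2, nu2):
    """Return (half_width_m, max_pressure_Pa) for a cylinder-on-cylinder contact."""
    # Effective (contact) modulus and effective radius
    e_star = 1.0 / ((1.0 - nu1 ** 2) / e1 + (1.0 - nu2 ** 2) / e2)
    r_eff = 1.0 / (1.0 / r_roller + 1.0 / r_race)
    half_width = math.sqrt(4.0 * load_per_length * r_eff / (math.pi * e_star))
    p_max = 2.0 * load_per_length / (math.pi * half_width)
    return half_width, p_max

if __name__ == "__main__":
    # Hypothetical steel roller (8 mm radius) on a 30 mm inner-race surface,
    # carrying 100 kN per metre of roller length.
    b, p0 = hertz_line_contact(load_per_length=1.0e5,
                               r_roller=0.008, r_race=0.030,
                               e1=210e9, nu1=0.3, e2=210e9, nu2=0.3)
    print(f"contact half-width: {b*1e6:.1f} micrometre, "
          f"peak pressure: {p0/1e9:.2f} GPa")
```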

  16. A Fuzzy Logic Based Method for Analysing Test Results

    Directory of Open Access Journals (Sweden)

    Le Xuan Vinh

    2017-11-01

    Full Text Available Network operators must perform many tasks to ensure smooth operation of the network, such as planning, monitoring, etc. Among those tasks, regular testing of network performance, network errors and troubleshooting is very important. Meaningful test results allow the operators to evaluate network performance, identify any shortcomings and better plan for network upgrades. Due to the diverse and mainly unquantifiable nature of network testing results, there is a need to develop a method for systematically and rigorously analysing these results. In this paper, we present STAM (System Test-result Analysis Method), which employs a bottom-up hierarchical processing approach using fuzzy logic. STAM is capable of combining all test results into a quantitative description of the network performance in terms of network stability, the significance of various network errors, and the performance of each function block within the network. The validity of this method has been successfully demonstrated in assisting the testing of a VoIP system at the Research Institute of Post and Telecoms in Vietnam. The paper is organized as follows. The first section gives an overview of fuzzy logic theory, the concepts of which are used in the development of STAM. The next section describes STAM. The last section, demonstrating STAM’s capability, presents a success story in which STAM is successfully applied.
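    A toy example of the kind of bottom-up fuzzy aggregation such a method relies on is sketched below: raw test readings are mapped onto membership grades and combined into a block-level score. The membership shapes, metrics and weights are invented for illustration and are not STAM's actual rule base.

```python
# Minimal sketch of bottom-up fuzzy aggregation (not the authors' rule base):
# map raw, partly qualitative test results onto [0, 1] membership grades and
# combine them hierarchically into one block-level score. All breakpoints and
# weights are hypothetical.

def triangular(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    if x == b:
        return 1.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def grade_packet_loss(loss_percent):
    """Degree to which packet loss counts as 'acceptable' (hypothetical shape)."""
    return triangular(loss_percent, a=-1.0, b=0.0, c=2.0)

def grade_mos(mos):
    """Degree to which a voice-quality MOS counts as 'good' (hypothetical shape)."""
    return triangular(mos, a=3.0, b=4.5, c=5.5)

def aggregate(grades_and_weights):
    """Weighted average of membership grades -> one block-level score in [0, 1]."""
    total_w = sum(w for _, w in grades_and_weights)
    return sum(g * w for g, w in grades_and_weights) / total_w

if __name__ == "__main__":
    call_block = aggregate([(grade_mos(4.1), 0.7), (grade_packet_loss(0.5), 0.3)])
    print(f"call-handling block score: {call_block:.2f}")
```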

  17. A holographic method for investigating cylindrical symmetry plasmas resulting from electric discharges

    International Nuclear Information System (INIS)

    Rosu, N.; Ralea, M.; Foca, M.; Iova, I.

    1992-01-01

    A new method based on holographic interferometry in real time with reference fringes for diagnosing gas electric discharges in cylindrical symmetry tubes is presented. A method for obtaining and quantitatively investigating interferograms obtained with a video camera is described. By studying the resulting images frame by frame and introducing the measurements into an adequate computer programme one gets a graphical recording of the radial distribution of the charged particle concentration in the plasma in any region of the tube at a given time, as well as their axial distribution. The real time evolution of certain phenomena occurring in the discharge tube can also be determined by this non-destructive method. The method is used for electric discharges in Ar at average pressures in a discharge tube with hollow cathode effect. (Author)

  18. Activating teaching methods, studying responses and learning

    OpenAIRE

    Christensen, Hans Peter; Vigild, Martin E.; Thomsen, Erik; Szabo, Peter; Horsewell, Andy

    2009-01-01

    Students’ study strategies when exposed to activating teaching methods are measured, analysed and compared to study strategies in more traditional lecture-based teaching. The resulting learning outcome is discussed. Peer Reviewed

  19. Validation of a near infrared microscopy method for the detection of animal products in feedingstuffs: results of a collaborative study.

    Science.gov (United States)

    Boix, A; Fernández Pierna, J A; von Holst, C; Baeten, V

    2012-01-01

    The performance characteristics of a near infrared microscopy (NIRM) method, when applied to the detection of animal products in feedingstuffs, were determined via a collaborative study. The method delivers qualitative results in terms of the presence or absence of animal particles in feed and differentiates animal from vegetable feed ingredients on the basis of the evaluation of near infrared spectra obtained from individual particles present in the sample. The specificity ranged from 86% to 100%. The limit of detection obtained on the analysis of the sediment fraction, prepared as for the European official method, was 0.1% processed animal proteins (PAPs) in feed, since all laboratories correctly identified the positive samples. This limit has to be increased up to 2% for the analysis of samples which are not sedimented. The required sensitivity for official control is therefore achieved in the analysis of the sediment fraction of the samples, where the method can be applied for the detection of the presence of animal meal. Criteria for the classification of samples as being of animal origin when fewer than five spectra are found need to be set up in order to harmonise the approach taken by the laboratories when applying NIRM for the detection of the presence of animal meal in feed.
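    The headline figures in such a collaborative study come from a simple qualitative confusion-matrix calculation, sketched below; the table of laboratory calls is invented for illustration.

```python
# Hedged sketch of how sensitivity and specificity are computed from the
# laboratories' qualitative (presence/absence) calls. The result table below is
# invented for illustration only.

def sensitivity_specificity(results):
    """results: iterable of (truth, call) booleans, True = animal protein present."""
    tp = sum(1 for truth, call in results if truth and call)
    fn = sum(1 for truth, call in results if truth and not call)
    tn = sum(1 for truth, call in results if not truth and not call)
    fp = sum(1 for truth, call in results if not truth and call)
    sensitivity = tp / (tp + fn) if tp + fn else float("nan")
    specificity = tn / (tn + fp) if tn + fp else float("nan")
    return sensitivity, specificity

if __name__ == "__main__":
    # (sample truly contains PAPs?, laboratory reported PAPs?)
    calls = ([(True, True)] * 27 + [(True, False)] * 1 +
             [(False, False)] * 25 + [(False, True)] * 3)
    sens, spec = sensitivity_specificity(calls)
    print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")
```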

  20. Do qualitative methods validate choice experiment-results? A case study on the economic valuation of peatland restoration in Central Kalimantan, Indonesia

    Energy Technology Data Exchange (ETDEWEB)

    Schaafsma, M.; Van Beukering, P.J.H.; Davies, O.; Oskolokaite, I.

    2009-05-15

    This study explores the benefits of combining independent results of qualitative focus group discussions (FGD) with a quantitative choice experiment (CE) in a developing country context. The assessment addresses the compensation needed by local communities in Central Kalimantan to cooperate in peatland restoration programs by using a CE combined with a series of FGD to validate and explain the CE-results. The main conclusion of this study is that a combination of qualitative and quantitative methods is necessary to assess the economic value of ecological services in monetary terms and to better understand the underlying attitudes and motives that drive these outcomes. The FGD not only cross-validate results of the CE, but also help to interpret the differences in preferences of respondents arising from environmental awareness and ecosystem characteristics. The FGD confirms that the CE results provide accurate information for ecosystem valuation. Additional to the advantages of FGD listed in the literature, this study finds that FGD provide the possibility to identify the specific terms and conditions on which respondents will accept land-use change scenarios. The results show that FGD may help to address problems regarding the effects of distribution of costs and benefits over time that neo-classical economic theory poses for the interpretation of economic valuation results in the demand it puts on the rationality of trade-offs and the required calculations.

  1. The Influence of Parenting toward Religious Behavior and Study Result

    Directory of Open Access Journals (Sweden)

    Yulisna Yulisna

    2017-06-01

    Full Text Available The aim of this article is to present the results of research concerning an empirical description of parenting and its influence on religious behavior and students’ study results in the subject of PAI (Pendidikan Agama Islam/Islamic Education). The research used qualitative and quantitative methods. The population of the research is all students and their parents in the fifth grade of elementary school in one group of Pulau Kijang, in Reteh Subdistrict, Indragiri Hilir, Riau. The sampling used the technique of cluster sampling for 80 students and 80 parents. The results of the research show that parenting determines whether students’ religious behavior and PAI study results are high or low. Students who have high and average religious behavior are educated by parents with an authoritative parenting style, while students having low religious behavior are those educated by authoritarian, authoritative, permissive, authoritarian-authoritative combination, and authoritative-permissive combination parenting styles. Meanwhile, students who have high study results are educated by parents with an authoritative parenting style, while students whose study results are average are educated by authoritarian, authoritative, permissive, authoritarian-authoritative combination, and authoritative-permissive combination parenting styles.

  2. [Results of treatment of milk teeth pulp by modified formocresol method].

    Science.gov (United States)

    Wochna-Sobańska, M

    1989-01-01

    The purpose of the study was evaluation of the results of treatment of molar pulp diseases by the formocresol method from the standpoint of the development of inflammatory complications in periapical tissues, disturbances of physiological resorption of roots, disturbances of mineralization of crowns of homologous permanent teeth. For the treatment milk molars were qualified with the diagnosis of grade II pulpopathy in children aged from 3 to 9 years. The treatment was done using formocresol by a modified method of pulp amputation according to Buckley after previous devitalization with parapaste. The status of 143 teeth was examined again 1 to 4 years after completion of treatment. The proportion of positive results after one year was 94%, after two years it was 90%, after three years 87% and after four years 80%. The cause of premature loss of most teeth was root resorption acceleration by 18-24 months. No harmful action of formocresol on the buds of permanent teeth was noted.

  3. A result-driven minimum blocking method for PageRank parallel computing

    Science.gov (United States)

    Tao, Wan; Liu, Tao; Yu, Wei; Huang, Gan

    2017-01-01

    Matrix blocking is a common method for improving the computational efficiency of PageRank, but the blocking rules are hard to determine and the subsequent calculation is complicated. To tackle these problems, we propose a minimum blocking method driven by result needs to accomplish a parallel implementation of the PageRank algorithm. The minimum blocking stores only the elements that are necessary for the result matrix. In return, the subsequent calculation becomes simple and the I/O transmission cost is reduced. We carried out experiments on several matrices of different data sizes and sparsity degrees. The results show that the proposed method has better computational efficiency than traditional blocking methods.
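    For reference, the baseline computation that such blocking schemes accelerate is the PageRank power iteration sketched below; the blocking optimisation itself is not reproduced here, and the toy link graph is invented.

```python
# Simple sparse power-iteration PageRank for reference; the paper's
# "result-driven minimum blocking" optimisation is not reproduced here, this
# sketch only shows the baseline computation that blocking schemes accelerate.

def pagerank(links, damping=0.85, tol=1e-10, max_iter=200):
    """links: dict node -> list of nodes it points to. Returns dict node -> rank."""
    nodes = set(links) | {v for outs in links.values() for v in outs}
    n = len(nodes)
    rank = {u: 1.0 / n for u in nodes}
    for _ in range(max_iter):
        new_rank = {u: (1.0 - damping) / n for u in nodes}
        dangling = sum(rank[u] for u in nodes if not links.get(u))
        for u in nodes:
            outs = links.get(u, [])
            for v in outs:
                new_rank[v] += damping * rank[u] / len(outs)
        # redistribute the rank of dangling nodes uniformly
        for u in nodes:
            new_rank[u] += damping * dangling / n
        if sum(abs(new_rank[u] - rank[u]) for u in nodes) < tol:
            return new_rank
        rank = new_rank
    return rank

if __name__ == "__main__":
    toy_web = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
    for node, r in sorted(pagerank(toy_web).items()):
        print(f"{node}: {r:.4f}")
```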

  4. Results of an interlaboratory comparison of analytical methods for contaminants of emerging concern in water.

    Science.gov (United States)

    Vanderford, Brett J; Drewes, Jörg E; Eaton, Andrew; Guo, Yingbo C; Haghani, Ali; Hoppe-Jones, Christiane; Schluesener, Michael P; Snyder, Shane A; Ternes, Thomas; Wood, Curtis J

    2014-01-07

    An evaluation of existing analytical methods used to measure contaminants of emerging concern (CECs) was performed through an interlaboratory comparison involving 25 research and commercial laboratories. In total, 52 methods were used in the single-blind study to determine method accuracy and comparability for 22 target compounds, including pharmaceuticals, personal care products, and steroid hormones, all at ng/L levels in surface and drinking water. Method biases ranged from caffeine, NP, OP, and triclosan had false positive rates >15%. In addition, some methods reported false positives for 17β-estradiol and 17α-ethynylestradiol in unspiked drinking water and deionized water, respectively, at levels higher than published predicted no-effect concentrations for these compounds in the environment. False negative rates were also generally contamination, misinterpretation of background interferences, and/or inappropriate setting of detection/quantification levels for analysis at low ng/L levels. The results of both comparisons were collectively assessed to identify parameters that resulted in the best overall method performance. Liquid chromatography-tandem mass spectrometry coupled with the calibration technique of isotope dilution were able to accurately quantify most compounds with an average bias of <10% for both matrixes. These findings suggest that this method of analysis is suitable at environmentally relevant levels for most of the compounds studied. This work underscores the need for robust, standardized analytical methods for CECs to improve data quality, increase comparability between studies, and help reduce false positive and false negative rates.

  5. The Influence of Parenting toward Religious Behavior and Study Result

    OpenAIRE

    Yulisna Yulisna; Hadi Arbiyanto; Munawar Rahmat

    2017-01-01

    The aim of this article is to present the results of research concerning empirical description of the parenting and its influences on religious behavior and students’ study results in the subject of PAI (Pendidikan Agama Islam/Islamic Education). The research method used is qualitative and quantitative methods. The population of the research is all students and their parents in the fifth grade of elementary school in one group of Pulau Kijang, in Reteh Subdistrict, Indragiri Hilir, Riau. The ...

  6. SPACE CHARGE SIMULATION METHODS INCORPORATED IN SOME MULTI - PARTICLE TRACKING CODES AND THEIR RESULTS COMPARISON

    International Nuclear Information System (INIS)

    BEEBE - WANG, J.; LUCCIO, A.U.; D IMPERIO, N.; MACHIDA, S.

    2002-01-01

    Space charge in high intensity beams is an important issue in accelerator physics. Due to the complexity of the problems, the most effective way of investigating its effects is by computer simulation. In recent years, many space charge simulation methods have been developed and incorporated into various 2D or 3D multi-particle-tracking codes. It has become necessary to benchmark these methods against each other, and against experimental results. As part of a global effort, we present our initial comparison of the space charge methods incorporated in the simulation codes ORBIT++, ORBIT and SIMPSONS. In this paper, the methods included in these codes are overviewed. The simulation results are presented and compared. Finally, from this study, the advantages and disadvantages of each method are discussed

  7. SPACE CHARGE SIMULATION METHODS INCORPORATED IN SOME MULTI - PARTICLE TRACKING CODES AND THEIR RESULTS COMPARISON.

    Energy Technology Data Exchange (ETDEWEB)

    BEEBE - WANG,J.; LUCCIO,A.U.; D IMPERIO,N.; MACHIDA,S.

    2002-06-03

    Space charge in high intensity beams is an important issue in accelerator physics. Due to the complexity of the problems, the most effective way of investigating its effects is by computer simulation. In recent years, many space charge simulation methods have been developed and incorporated into various 2D or 3D multi-particle-tracking codes. It has become necessary to benchmark these methods against each other, and against experimental results. As part of a global effort, we present our initial comparison of the space charge methods incorporated in the simulation codes ORBIT++, ORBIT and SIMPSONS. In this paper, the methods included in these codes are overviewed. The simulation results are presented and compared. Finally, from this study, the advantages and disadvantages of each method are discussed.

  8. Radioisotopic methods in the study of salivary glands

    International Nuclear Information System (INIS)

    Salvatori, M.; Valenza, V.; Focacci, C.

    1986-01-01

    The results achieved by dynamic and static salivary gland scintigraphy in 272 patients over a ten-year period (January 1976-December 1985) are reported. On the basis of a semi-quantitative assessment of time/activity curves, dynamic studies prove to be the most suitable method for studying functional disorders (phlogosis, facial paralysis, etc.). Harmlessness, easy execution and functional results are the main advantages of radioisotope techniques. Salivary gland scintigraphy has some limits in the study of space occupying lesions (SOL): however, ultrasound, CT and sialography represent the methods of choice in this field of salivary gland pathology

  9. Application of Statistical Methods to Activation Analytical Results near the Limit of Detection

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Wanscher, B.

    1978-01-01

    Reporting actual numbers instead of upper limits for analytical results at or below the detection limit may produce reliable data when these numbers are subjected to appropriate statistical processing. Particularly in radiometric methods, such as activation analysis, where individual standard deviations of analytical results may be estimated, improved discrimination may be based on the Analysis of Precision. Actual experimental results from a study of the concentrations of arsenic in human skin demonstrate the power of this principle.
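    The principle can be illustrated with a small calculation: replicate results near the detection limit, each carrying its own estimated standard deviation, are pooled as a weighted mean and checked for internal consistency. The sketch below is only a generic illustration of this idea with invented numbers, not the authors' full Analysis of Precision procedure.

```python
# Hedged sketch of using individual standard deviations instead of "less than"
# limits: pool replicate results as a weighted mean and check internal
# consistency with a chi-square-like statistic. The arsenic-like numbers are
# invented for illustration.

def weighted_mean(values, sigmas):
    weights = [1.0 / s ** 2 for s in sigmas]
    mean = sum(w * x for w, x in zip(weights, values)) / sum(weights)
    sigma_mean = (1.0 / sum(weights)) ** 0.5
    return mean, sigma_mean

def consistency_statistic(values, sigmas):
    """Compare against a chi-square distribution with n-1 degrees of freedom."""
    mean, _ = weighted_mean(values, sigmas)
    return sum(((x - mean) / s) ** 2 for x, s in zip(values, sigmas))

if __name__ == "__main__":
    # Replicate results near the detection limit, each with its own counting
    # uncertainty (negative values are kept rather than truncated to a limit).
    values = [0.012, -0.004, 0.021, 0.008]   # e.g. microgram/g
    sigmas = [0.009, 0.010, 0.011, 0.008]
    m, sm = weighted_mean(values, sigmas)
    t = consistency_statistic(values, sigmas)
    print(f"weighted mean = {m:.4f} +/- {sm:.4f}, T = {t:.2f} (chi2, 3 d.o.f.)")
```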

  10. Long-term mental wellbeing of adolescents and young adults diagnosed with venous thromboembolism: results from a multistage mixed methods study.

    Science.gov (United States)

    Højen, A A; Sørensen, E E; Dreyer, P S; Søgaard, M; Larsen, T B

    2017-12-01

    Essentials Long-term mental wellbeing of adolescents and young adults with venous thromboembolism is unclear. This multistage mixed methods study was based on Danish nationwide registry data and interviews. Mental wellbeing is negatively impacted in the long-term and uncertainty of recurrence is pivotal. The perceived health threat is more important than disease severity for long-term mental wellbeing. Background Critical and chronic illness in youth can lead to impaired mental wellbeing. Venous thromboembolism (VTE) is a potentially traumatic and life-threatening condition. Nonetheless, the long-term mental wellbeing of adolescents and young adults (AYAS) with VTE is unclear. Objectives To investigate the long-term mental wellbeing of AYAS (aged 13-33 years) diagnosed with VTE. Methods We performed a multistage mixed method study based on data from the Danish nationwide health registries, and semistructured interviews with 12 AYAS diagnosed with VTE. An integrated mixed methods interpretation of the findings was conducted through narrative weaving and joint displays. Results The integrated mixed methods interpretation showed that the mental wellbeing of AYAS with VTE had a chronic perspective, with a persistently higher risk of psychotropic drug purchase among AYAS with a first-time diagnosis of VTE than among sex-matched and age-matched population controls and AYAS with a first-time diagnosis of insulin-dependent diabetes mellitus. Impaired mental wellbeing was largely connected to a fear of recurrence and concomitant uncertainty. Therefore, it was important for the long-term mental wellbeing to navigate uncertainty. The perceived health threat played a more profound role in long-term mental wellbeing than disease severity, as the potential life threat was the pivot which pointed back to the initial VTE and forward to the perception of future health threat and the potential risk of dying of a recurrent event. Conclusion Our findings show that the long

  11. Methods uncovering usability issues in medication-related alerting functions: results from a systematic review.

    Science.gov (United States)

    Marcilly, Romaric; Vasseur, Francis; Ammenwerth, Elske; Beuscart-Zephir, Marie-Catherine

    2014-01-01

    This paper aims at listing the methods used to evaluate the usability of medication-related alerting functions and at establishing what types of usability issues those methods allow to be detected. A sub-analysis of data from this systematic review has been performed. The methods applied in the included papers were collected. Then, the included papers were sorted into four types of evaluation: "expert evaluation", "user-testing/simulation", "on site observation" and "impact studies". The types of usability issues (usability flaws, usage problems and negative outcomes) uncovered by those evaluations were analyzed. Results show that a large set of methods is used. The largest proportion of papers uses "on site observation" evaluation. This is the only evaluation type for which every kind of usability flaw, usage problem and outcome is detected. It is somewhat surprising that, in a usability systematic review, most of the included papers use a method that is not often presented as a usability method. Results are discussed with regard to the opportunity to feed usability information collected after the implementation of a technology back into the design process, i.e. before implementation.

  12. The anchors of steel wire ropes, testing methods and their results

    Directory of Open Access Journals (Sweden)

    J. Krešák

    2012-10-01

    Full Text Available The present paper introduces an application of the acoustic and thermographic method in the defectoscopic testing of immobile steel wire ropes at the most critical point, the anchor. First measurements and their results by these new defectoscopic methods are shown. In defectoscopic tests at the anchor, the widely used magnetic method gives unreliable results, and therefore presents a problem for steel wire defectoscopy. Application of the two new methods in the steel wire defectoscopy at the anchor point will enable increased safety measures at the anchor of steel wire ropes in bridge, roof, tower and aerial cable lift constructions.

  13. Designing A Mixed Methods Study In Primary Care

    Science.gov (United States)

    Creswell, John W.; Fetters, Michael D.; Ivankova, Nataliya V.

    2004-01-01

    BACKGROUND Mixed methods or multimethod research holds potential for rigorous, methodologically sound investigations in primary care. The objective of this study was to use criteria from the literature to evaluate 5 mixed methods studies in primary care and to advance 3 models useful for designing such investigations. METHODS We first identified criteria from the social and behavioral sciences to analyze mixed methods studies in primary care research. We then used the criteria to evaluate 5 mixed methods investigations published in primary care research journals. RESULTS Of the 5 studies analyzed, 3 included a rationale for mixing based on the need to develop a quantitative instrument from qualitative data or to converge information to best understand the research topic. Quantitative data collection involved structured interviews, observational checklists, and chart audits that were analyzed using descriptive and inferential statistical procedures. Qualitative data consisted of semistructured interviews and field observations that were analyzed using coding to develop themes and categories. The studies showed diverse forms of priority: equal priority, qualitative priority, and quantitative priority. Data collection involved quantitative and qualitative data gathered both concurrently and sequentially. The integration of the quantitative and qualitative data in these studies occurred between data analysis from one phase and data collection from a subsequent phase, while analyzing the data, and when reporting the results. DISCUSSION We recommend instrument-building, triangulation, and data transformation models for mixed methods designs as useful frameworks to add rigor to investigations in primary care. We also discuss the limitations of our study and the need for future research. PMID:15053277

  14. Development and application of a new deterministic method for calculating computer model result uncertainties

    International Nuclear Information System (INIS)

    Maerker, R.E.; Worley, B.A.

    1989-01-01

    Interest in research into the field of uncertainty analysis has recently been stimulated as a result of a need in high-level waste repository design assessment for uncertainty information in the form of response complementary cumulative distribution functions (CCDFs) to show compliance with regulatory requirements. The solution to this problem must obviously rely on the analysis of computer code models, which, however, employ parameters that can have large uncertainties. The motivation for the research presented in this paper is a search for a method involving a deterministic uncertainty analysis approach that could serve as an improvement over those methods that make exclusive use of statistical techniques. A deterministic uncertainty analysis (DUA) approach based on the use of first derivative information is the method studied in the present procedure. The present method has been applied to a high-level nuclear waste repository problem involving use of the codes ORIGEN2, SAS, and BRINETEMP in series, and the resulting CDF of a BRINETEMP result of interest is compared with that obtained through a completely statistical analysis
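    As a generic illustration of a first-derivative-based deterministic uncertainty analysis, the sketch below propagates parameter standard deviations through a model using finite-difference sensitivities, approximating the output variance by the sum of squared sensitivity-weighted input variances. The toy model and parameter values are hypothetical stand-ins, not the ORIGEN2/SAS/BRINETEMP chain analysed in the paper.

```python
import math

# Illustrative first-order (derivative-based) uncertainty propagation:
# Var(y) is approximated by sum_i (dy/dx_i)^2 * Var(x_i), with derivatives taken
# by finite differences. The model function and parameter values are hypothetical.

def first_order_uncertainty(model, params, sigmas, rel_step=1e-6):
    """model: f(dict) -> float; params: nominal values; sigmas: standard deviations."""
    y0 = model(params)
    var = 0.0
    for name, x0 in params.items():
        h = rel_step * (abs(x0) if x0 else 1.0)
        perturbed = dict(params, **{name: x0 + h})
        dydx = (model(perturbed) - y0) / h
        var += (dydx * sigmas[name]) ** 2
    return y0, math.sqrt(var)

if __name__ == "__main__":
    # Toy "brine temperature" model: exponential decay toward an ambient value.
    def model(p):
        return p["t_amb"] + p["t0"] * math.exp(-p["k"] * p["time"])

    params = {"t_amb": 25.0, "t0": 60.0, "k": 0.05, "time": 30.0}
    sigmas = {"t_amb": 1.0, "t0": 5.0, "k": 0.005, "time": 0.0}
    y, sy = first_order_uncertainty(model, params, sigmas)
    print(f"response = {y:.2f} +/- {sy:.2f} (1 sigma, first order)")
```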

  15. Air sampling methods to evaluate microbial contamination in operating theatres: results of a comparative study in an orthopaedics department.

    Science.gov (United States)

    Napoli, C; Tafuri, S; Montenegro, L; Cassano, M; Notarnicola, A; Lattarulo, S; Montagna, M T; Moretti, B

    2012-02-01

    To evaluate the level of microbial contamination of air in operating theatres using active [i.e. surface air system (SAS)] and passive [i.e. index of microbial air contamination (IMA) and nitrocellulose membranes positioned near the wound] sampling systems. Sampling was performed between January 2010 and January 2011 in the operating theatre of the orthopaedics department in a university hospital in Southern Italy. During surgery, the mean bacterial loads recorded were 2232.9 colony-forming units (cfu)/m(2)/h with the IMA method, 123.2 cfu/m(3) with the SAS method and 2768.2 cfu/m(2)/h with the nitrocellulose membranes. Correlation was found between the results of the three methods. Staphylococcus aureus was detected in 12 of 60 operations (20%) with the membranes, five (8.3%) operations with the SAS method, and three operations (5%) with the IMA method. Use of nitrocellulose membranes placed near a wound is a valid method for measuring the microbial contamination of air. This method was more sensitive than the IMA method and was not subject to any calibration bias, unlike active air monitoring systems. Copyright © 2011 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.
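    For orientation, passive-sampling counts can be expressed on the cfu/m2/h scale quoted above by dividing the colony count by the plate area and the exposure time, as sketched below; the plate diameter and counts are hypothetical examples, and the exact IMA convention may differ per protocol.

```python
import math

# Hedged sketch: convert settle-plate colony counts to cfu per square metre per
# hour. Plate diameter, exposure time and counts are hypothetical examples.

def settle_plate_to_cfu_m2_h(colonies, plate_diameter_m=0.09, exposure_h=1.0):
    area_m2 = math.pi * (plate_diameter_m / 2.0) ** 2
    return colonies / (area_m2 * exposure_h)

if __name__ == "__main__":
    for cfu in (5, 14, 22):
        print(f"{cfu} cfu/plate/h -> {settle_plate_to_cfu_m2_h(cfu):.0f} cfu/m2/h")
```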

  16. European external quality control study on the competence of laboratories to recognize rare sequence variants resulting in unusual genotyping results.

    Science.gov (United States)

    Márki-Zay, János; Klein, Christoph L; Gancberg, David; Schimmel, Heinz G; Dux, László

    2009-04-01

    Depending on the method used, rare sequence variants adjacent to the single nucleotide polymorphism (SNP) of interest may cause unusual or erroneous genotyping results. Because such rare variants are known for many genes commonly tested in diagnostic laboratories, we organized a proficiency study to assess their influence on the accuracy of reported laboratory results. Four external quality control materials were processed and sent to 283 laboratories through 3 EQA organizers for analysis of the prothrombin 20210G>A mutation. Two of these quality control materials contained sequence variants introduced by site-directed mutagenesis. One hundred eighty-nine laboratories participated in the study. When samples gave a usual result with the method applied, the error rate was 5.1%. Detailed analysis showed that more than 70% of the failures were reported from only 9 laboratories. Allele-specific amplification-based PCR had a much higher error rate than other methods (18.3% vs 2.9%). The variants 20209C>T and [20175T>G; 20179_20180delAC] resulted in unusual genotyping results in 67 and 85 laboratories, respectively. Eighty-three (54.6%) of these unusual results were not recognized, 32 (21.1%) were attributed to technical issues, and only 37 (24.3%) were recognized as another sequence variant. Our findings revealed that some of the participating laboratories were not able to recognize and correctly interpret unusual genotyping results caused by rare SNPs. Our study indicates that the majority of the failures could be avoided by improved training and careful selection and validation of the methods applied.

  17. Personality, Study Methods and Academic Performance

    Science.gov (United States)

    Entwistle, N. J.; Wilson, J. D.

    1970-01-01

    A questionnaire measuring four student personality types--stable introvert, unstable introvert, stable extrovert and unstable extrovert--along with the Eysenck Personality Inventory (Form A) was given to 72 graduate students at Aberdeen University, and the results showed recognizable interaction between study methods, motivation and personality…

  18. THE RESULTS OF THE ANALYSIS OF THE STUDENTS’ BODY COMPOSITION BY BIOIMPEDANCE METHOD

    Directory of Open Access Journals (Sweden)

    Dmitry S. Blinov

    2016-06-01

    Full Text Available Introduction. Tissues of the human body can conduct electricity. Liquid media (water, blood, the contents of hollow organs) have a low impedance, i.e. they are good conductors, while the resistance of denser tissues (muscle, nerves, etc.) is significantly higher. Fat and bone tissues have the highest impedance. Bioimpedancemetry is a method which allows the composition of the human body to be determined by measuring the electrical resistance (impedance) of its tissues. Relevance. This technique is indispensable to dieticians and fitness trainers. In addition, the results of the study can provide invaluable assistance in the selection of effective treatment by physicians, gynecologists, orthopedists and other specialists. The bioimpedance method helps to determine the risks of developing type 2 diabetes, atherosclerosis, hypertension, diseases of the musculoskeletal system, disorders of the endocrine system, gallstone disease, etc. Materials and Methods. The list of body composition parameters assessed by the bioimpedance analysis method included absolute and relative indicators. Depending on the method of measurement, the absolute indicators were determined for the whole body. The absolute indicators were: fat and lean body mass, active cell and skeletal muscle mass, total body water, and cellular and extracellular fluid. Along with them, relative indicators of body composition (normalized to body weight, lean mass or other variables) were calculated. Results. The comparison of the anthropometric and bioimpedance methods found that height, vital capacity, weight, waist circumference, waist and hip circumference, basal metabolism, body fat mass normalized to height, lean mass, and percentage of skeletal muscle mass in boys and girls with normal and excessive body weight had statistically significant differences. Discussion and Conclusions. In the present study physical development with consideration of body composition in students

  19. Mechanics of Nanostructures: Methods and Results

    Science.gov (United States)

    Ruoff, Rod

    2003-03-01

    We continue to develop and use new tools to measure the mechanics and electromechanics of nanostructures. Here we discuss: (a) methods for making nanoclamps and the resulting nanoclamp geometry, chemical composition and type of chemical bonding, and nanoclamp strength (effectiveness as a nanoclamp for the mechanics measurements to be made); (b) mechanics of carbon nanocoils. We have received carbon nanocoils from colleagues in Japan [1], measured their spring constants, and have observed extensions exceeding 100% relative to the unloaded length, using our scanning electron microscope nanomanipulator tool; (c) several new devices that are essentially MEMS-based, that allow for improved measurements of the mechanics of pseudo-1D and planar nanostructures. [1] Zhang M., Nakayama Y., Pan L., Japanese J. Appl. Phys. 39, L1242-L1244 (2000).

  20. Comparison Of Simulation Results When Using Two Different Methods For Mold Creation In Moldflow Simulation

    Directory of Open Access Journals (Sweden)

    Kaushikbhai C. Parmar

    2017-04-01

    Full Text Available Simulation gives different results when different methods are used for the same simulation. Autodesk Moldflow Simulation software provides two different facilities for creating a mold for the simulation of the injection molding process: the mold can be created inside Moldflow, or it can be imported as a CAD file. The aim of this paper is to study the differences in simulation results, such as mold temperature, part temperature, deflection in different directions, simulation time, and coolant temperature, for these two different methods.

  1. The Use of Data Mining Methods to Predict the Result of Infertility Treatment Using the IVF ET Method

    Directory of Open Access Journals (Sweden)

    Malinowski Paweł

    2014-12-01

    Full Text Available The IVF ET method is a scientifically recognized infertility treatment method. The problem, however, is this method’s unsatisfactory efficiency. This calls for a more thorough analysis of the information available in the treatment process, in order to detect the factors that have an effect on the results, as well as to effectively predict the result of treatment. Classical statistical methods have proven to be inadequate for this issue. Only the use of modern data mining methods gives hope for a more effective analysis of the collected data. This work provides an overview of the new methods used for the analysis of data on infertility treatment, and formulates a proposal for further directions for research into increasing the efficiency of predicting the result of the treatment process.
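    As a concrete, hedged illustration of what such a data-mining analysis might look like in practice, the sketch below trains a simple classifier on treatment-cycle records and evaluates its discrimination on a hold-out set. The file name, column names and feature set are hypothetical; the works reviewed in the article may use very different models and variables.

```python
# Hedged sketch of a minimal prediction pipeline: train a classifier on clinical
# features and judge it by discrimination (AUC). The CSV path, column names and
# feature set below are hypothetical placeholders.

import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

def main(path="ivf_cycles.csv"):
    df = pd.read_csv(path)
    features = ["age", "amh", "bmi", "oocytes_retrieved", "embryos_transferred"]
    X, y = df[features], df["clinical_pregnancy"]          # 0/1 outcome
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.25, stratify=y, random_state=42)
    model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"hold-out AUC: {auc:.3f}")

if __name__ == "__main__":
    main()
```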

  2. Aircraft Engine Gas Path Diagnostic Methods: Public Benchmarking Results

    Science.gov (United States)

    Simon, Donald L.; Borguet, Sebastien; Leonard, Olivier; Zhang, Xiaodong (Frank)

    2013-01-01

    Recent technology reviews have identified the need for objective assessments of aircraft engine health management (EHM) technologies. To help address this issue, a gas path diagnostic benchmark problem has been created and made publicly available. This software tool, referred to as the Propulsion Diagnostic Method Evaluation Strategy (ProDiMES), has been constructed based on feedback provided by the aircraft EHM community. It provides a standard benchmark problem enabling users to develop, evaluate and compare diagnostic methods. This paper will present an overview of ProDiMES along with a description of four gas path diagnostic methods developed and applied to the problem. These methods, which include analytical and empirical diagnostic techniques, will be described and associated blind-test-case metric results will be presented and compared. Lessons learned along with recommendations for improving the public benchmarking processes will also be presented and discussed.

  3. Methods of Teaching Reading to EFL Learners: A Case Study

    Science.gov (United States)

    Sanjaya, Dedi; Rahmah; Sinulingga, Johan; Lubis, Azhar Aziz; Yusuf, Muhammad

    2014-01-01

    Methods of teaching reading skill are not the same in different countries. It depends on the condition and situation of the learners. Observing the method of teaching in Malaysia was the purpose of this study and the result of the study shows that there are 5 methods that are applied in classroom activities namely Grammar Translation Method (GTM),…

  4. Studies on mycobacterium tuberculosis sensitivity test by using the method of rapid radiometry with appendixes of clinical results

    International Nuclear Information System (INIS)

    Yang Yongqing; Jiang Yimin; Lu Wendong; Zhu Rongen

    1987-01-01

    Three standard strains of mycobacterium tuberculosis (H37Rv - fully sensitive, SM-R 1000 μg/ml, RFP-R 100 μg/ml) were tested with 10 concentrations of 5 antitubercular agents: INH, SM, PAS, RFP and EB. 114 isolates of mycobacterium tuberculosis taken from patients were tested with INH, PAS, SM and RFP. The results agreed with those of the standard Lowenstein-Jensen method in 81.7% of cases. 82% of the isolate tests were completed within 5 days. The method may be used in routine clinical work. The liquid media prepared by the authors do not require human serum albumin and are less expensive and readily available

  5. Pharmaceutical companies' policies on access to trial data, results, and methods: audit study.

    Science.gov (United States)

    Goldacre, Ben; Lane, Síle; Mahtani, Kamal R; Heneghan, Carl; Onakpoya, Igho; Bushfield, Ian; Smeeth, Liam

    2017-07-26

    Objectives  To identify the policies of major pharmaceutical companies on transparency of trials, to extract structured data detailing each companies' commitments, and to assess concordance with ethical and professional guidance. Design  Structured audit. Setting  Pharmaceutical companies, worldwide. Participants  42 pharmaceutical companies. Main outcome measures  Companies' commitments on sharing summary results, clinical study reports (CSRs), individual patient data (IPD), and trial registration, for prospective and retrospective trials. Results  Policies were highly variable. Of 23 companies eligible from the top 25 companies by revenue, 21 (91%) committed to register all trials and 22 (96%) committed to share summary results; however, policies commonly lacked timelines for disclosure, and trials on unlicensed medicines and off-label uses were only included in six (26%). 17 companies (74%) committed to share the summary results of past trials. The median start date for this commitment was 2005. 22 companies (96%) had a policy on sharing CSRs, mostly on request: two committed to share only synopses and only two policies included unlicensed treatments. 22 companies (96%) had a policy to share IPD; 14 included phase IV trials (one included trials on unlicensed medicines and off-label uses). Policies in the exploratory group of smaller companies made fewer transparency commitments. Two companies fell short of industry body commitments on registration, three on summary results. Examples of contradictory and ambiguous language were documented and summarised by theme. 23/42 companies (55%) responded to feedback; 7/1806 scored policy elements were revised in light of feedback from companies (0.4%). Several companies committed to changing policy; some made changes immediately. Conclusions  The commitments made by companies to transparency of trials were highly variable. Other than journal submission for all trials within 12 months, all elements of best practice

  6. Structural issues affecting mixed methods studies in health research: a qualitative study

    Science.gov (United States)

    2009-01-01

    Background Health researchers undertake studies which combine qualitative and quantitative methods. Little attention has been paid to the structural issues affecting this mixed methods approach. We explored the facilitators and barriers to undertaking mixed methods studies in health research. Methods Face-to-face semi-structured interviews with 20 researchers experienced in mixed methods research in health in the United Kingdom. Results Structural facilitators for undertaking mixed methods studies included a perception that funding bodies promoted this approach, and the multidisciplinary constituency of some university departments. Structural barriers to exploiting the potential of these studies included a lack of education and training in mixed methods research, and a lack of templates for reporting mixed methods articles in peer-reviewed journals. The 'hierarchy of evidence' relating to effectiveness studies in health care research, with the randomised controlled trial as the gold standard, appeared to pervade the health research infrastructure. Thus integration of data and findings from qualitative and quantitative components of mixed methods studies, and dissemination of integrated outputs, tended to occur through serendipity and effort, further highlighting the presence of structural constraints. Researchers are agents who may also support current structures - journal reviewers and editors, and directors of postgraduate training courses - and thus have the ability to improve the structural support for exploiting the potential of mixed methods research. Conclusion The environment for health research in the UK appears to be conducive to mixed methods research but not to exploiting the potential of this approach. Structural change, as well as change in researcher behaviour, will be necessary if researchers are to fully exploit the potential of using mixed methods research. PMID:20003210

  7. Studying the method of linearization of exponential calibration curves

    International Nuclear Information System (INIS)

    Bunzh, Z.A.

    1989-01-01

    The results of study of the method for linearization of exponential calibration curves are given. The calibration technique and comparison of the proposed method with piecewise-linear approximation and power series expansion, are given

  8. Numerical processing of radioimmunoassay results using logit-log transformation method

    International Nuclear Information System (INIS)

    Textoris, R.

    1983-01-01

    The mathematical model and algorithm are described for the numerical processing of the results of a radioimmunoassay by the logit-log transformation method and by linear regression with weight factors. The limiting value of the curve at zero concentration is optimized with respect to the residual sum iteratively, by multiple repeats of the linear regression. Typical examples of the approximation of calibration curves are presented. The method proved suitable for all RIA sets used hitherto and is well suited for small computers with an internal memory of at least 8 Kbyte. (author)
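    A bare-bones version of a logit-log calibration fit is sketched below: bound fractions are transformed to logit(y) = ln(y/(1-y)) and regressed against the logarithm of concentration, and the fitted line is inverted to read off unknowns. For simplicity the sketch is unweighted, unlike the weighted regression described in the abstract, and the counts and standards are invented.

```python
import math

# Minimal sketch of a logit-log calibration fit (not the author's exact
# weighting scheme): transform bound fractions to logit(y) = ln(y/(1-y)),
# regress against log(concentration), then invert to read off unknowns.

def fit_logit_log(concentrations, bound_fractions):
    xs = [math.log(c) for c in concentrations]
    ys = [math.log(y / (1.0 - y)) for y in bound_fractions]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys)) /
             sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return slope, intercept

def concentration_from_fraction(y, slope, intercept):
    return math.exp((math.log(y / (1.0 - y)) - intercept) / slope)

if __name__ == "__main__":
    std_conc = [0.5, 1, 2, 5, 10, 20]             # hypothetical standards, ng/mL
    bound = [0.82, 0.73, 0.61, 0.42, 0.28, 0.16]  # B/B0 corrected for NSB
    a, b = fit_logit_log(std_conc, bound)
    print(f"slope = {a:.3f}, intercept = {b:.3f}")
    print(f"unknown with B/B0 = 0.50 -> "
          f"{concentration_from_fraction(0.50, a, b):.2f} ng/mL")
```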

  9. Initial experience with a group presentation of study results to research participants

    Directory of Open Access Journals (Sweden)

    Bent Stephen

    2008-03-01

    Full Text Available Abstract Background Despite ethical imperatives, informing research participants about the results of the studies in which they take part is not often performed. This is due, in part, to the costs and burdens of communicating with each participant after publication of the results. Methods Following the closeout and publication of a randomized clinical trial of saw palmetto for treatment of symptoms of benign prostatic hyperplasia, patients were invited back to the research center to participate in a group presentation of the study results. Results Approximately 10% of participants attended one of two presentation sessions. Reaction to the experience of the group presentation was very positive among the attendees. Conclusion A group presentation to research participants is an efficient method of communicating study results to those who desire to be informed and was highly valued by those who attended. Prospectively planning for such presentations and greater scheduling flexibility may result in higher attendance rates. Trial Registration Number Clinicaltrials.gov #NCT00037154

  10. Case-control vaccine effectiveness studies: Data collection, analysis and reporting results.

    Science.gov (United States)

    Verani, Jennifer R; Baqui, Abdullah H; Broome, Claire V; Cherian, Thomas; Cohen, Cheryl; Farrar, Jennifer L; Feikin, Daniel R; Groome, Michelle J; Hajjeh, Rana A; Johnson, Hope L; Madhi, Shabir A; Mulholland, Kim; O'Brien, Katherine L; Parashar, Umesh D; Patel, Manish M; Rodrigues, Laura C; Santosham, Mathuram; Scott, J Anthony; Smith, Peter G; Sommerfelt, Halvor; Tate, Jacqueline E; Victor, J Chris; Whitney, Cynthia G; Zaidi, Anita K; Zell, Elizabeth R

    2017-06-05

    The case-control methodology is frequently used to evaluate vaccine effectiveness post-licensure. The results of such studies provide important insight into the level of protection afforded by vaccines in a 'real world' context, and are commonly used to guide vaccine policy decisions. However, the potential for bias and confounding are important limitations to this method, and the results of a poorly conducted or incorrectly interpreted case-control study can mislead policies. In 2012, a group of experts met to review recent experience with case-control studies evaluating vaccine effectiveness; we summarize the recommendations of that group regarding best practices for data collection, analysis, and presentation of the results of case-control vaccine effectiveness studies. Vaccination status is the primary exposure of interest, but can be challenging to assess accurately and with minimal bias. Investigators should understand factors associated with vaccination as well as the availability of documented vaccination status in the study context; case-control studies may not be a valid method for evaluating vaccine effectiveness in settings where many children lack a documented immunization history. To avoid bias, it is essential to use the same methods and effort gathering vaccination data from cases and controls. Variables that may confound the association between illness and vaccination are also important to capture as completely as possible, and where relevant, adjust for in the analysis according to the analytic plan. In presenting results from case-control vaccine effectiveness studies, investigators should describe enrollment among eligible cases and controls as well as the proportion with no documented vaccine history. Emphasis should be placed on confidence intervals, rather than point estimates, of vaccine effectiveness. Case-control studies are a useful approach for evaluating vaccine effectiveness; however careful attention must be paid to the collection
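
    As a rough illustration of how vaccine effectiveness and its confidence interval are derived from case-control data (the counts below are hypothetical, and the Woolf log-normal interval is only one of several possible choices):

    ```python
    import math

    # Hypothetical 2x2 table from a case-control study:
    # vaccinated / unvaccinated counts among cases and controls.
    cases_vacc, cases_unvacc = 40, 60
    ctrls_vacc, ctrls_unvacc = 70, 30

    # Odds ratio and Woolf (log-normal) 95% confidence interval.
    odds_ratio = (cases_vacc * ctrls_unvacc) / (cases_unvacc * ctrls_vacc)
    se_log_or = math.sqrt(1/cases_vacc + 1/cases_unvacc + 1/ctrls_vacc + 1/ctrls_unvacc)
    or_lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
    or_hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

    # Vaccine effectiveness, reported with its confidence interval rather than
    # the point estimate alone.
    ve = (1 - odds_ratio) * 100
    print(f"VE = {ve:.1f}%  (95% CI {100*(1-or_hi):.1f}% to {100*(1-or_lo):.1f}%)")
    ```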

  11. Statistical methods for elimination of guarantee-time bias in cohort studies: a simulation study

    Directory of Open Access Journals (Sweden)

    In Sung Cho

    2017-08-01

    Full Text Available Abstract Background Aspirin has been considered to be beneficial in preventing cardiovascular diseases and cancer. Several pharmaco-epidemiology cohort studies have shown protective effects of aspirin on diseases using various statistical methods, with the Cox regression model being the most commonly used approach. However, there are some inherent limitations to the conventional Cox regression approach such as guarantee-time bias, resulting in an overestimation of the drug effect. To overcome such limitations, alternative approaches, such as the time-dependent Cox model and landmark methods have been proposed. This study aimed to compare the performance of three methods: Cox regression, time-dependent Cox model and landmark method with different landmark times in order to address the problem of guarantee-time bias. Methods Through statistical modeling and simulation studies, the performance of the above three methods were assessed in terms of type I error, bias, power, and mean squared error (MSE. In addition, the three statistical approaches were applied to a real data example from the Korean National Health Insurance Database. Effect of cumulative rosiglitazone dose on the risk of hepatocellular carcinoma was used as an example for illustration. Results In the simulated data, time-dependent Cox regression outperformed the landmark method in terms of bias and mean squared error but the type I error rates were similar. The results from real-data example showed the same patterns as the simulation findings. Conclusions While both time-dependent Cox regression model and landmark analysis are useful in resolving the problem of guarantee-time bias, time-dependent Cox regression is the most appropriate method for analyzing cumulative dose effects in pharmaco-epidemiological studies.
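
    A minimal sketch of a time-dependent Cox analysis of the kind compared in the paper, using the lifelines library on a hypothetical long-format dataset; the column names and values are illustrative only:

    ```python
    import pandas as pd
    from lifelines import CoxTimeVaryingFitter

    # Hypothetical long-format data: one row per interval in which a subject's
    # cumulative exposure is constant. Letting exposure change over follow-up,
    # rather than fixing it at baseline, is what avoids guarantee-time bias.
    df = pd.DataFrame({
        "id":    [1, 1, 2, 2, 3, 4, 4, 5],
        "start": [0, 6, 0, 4, 0, 0, 8, 0],
        "stop":  [6, 24, 4, 18, 12, 8, 30, 20],
        "dose":  [0, 1, 0, 2, 0, 0, 3, 1],   # time-varying cumulative exposure
        "event": [0, 1, 0, 0, 1, 0, 1, 0],   # outcome at the end of each interval
    })

    ctv = CoxTimeVaryingFitter()
    ctv.fit(df, id_col="id", event_col="event", start_col="start", stop_col="stop")
    ctv.print_summary()
    ```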

  12. Evaluating rehabilitation methods - some practical results from Rum Jungle

    International Nuclear Information System (INIS)

    Ryan, P.

    1987-01-01

    Research and analysis of the following aspects of rehabilitation have been conducted at the Rum Jungle mine site over the past three years: drainage structure stability; rock batter stability; soil fauna; tree growth in compacted soils; rehabilitation costs. The results show that, for future rehabilitation projects adopting refined methods, attention to final construction detail and biospheric influences is most important. The mine site offers a unique opportunity to evaluate the success of a variety of rehabilitation methods, to the benefit of the industry in Australia and overseas. It is intended that practical, economic research will continue for some considerable time.

  13. Parallelization methods study of thermal-hydraulics codes

    International Nuclear Information System (INIS)

    Gaudart, Catherine

    2000-01-01

    The variety of parallelization methods and machines leaves programmers with a wide range of choices. In this study we suggest, in an industrial context, some solutions based on the experience acquired with different parallelization methods. The study concerns several scientific codes which simulate a large variety of thermal-hydraulics phenomena. A bibliography on parallelization methods and a first analysis of the codes showed the difficulty of applying our process to the whole set of applications under study. It was therefore necessary to identify and extract a representative part of these applications and parallelization methods, and the linear solver part of the codes emerged as the natural candidate. Several parallelization methods were applied to this particular part. From these developments one can estimate the work required for a non-specialist programmer to parallelize an application, and the impact of the development constraints. The parallelization methods tested are the numerical library PETSc, the parallelizer PAF, the language HPF, the formalism PEI and the communication libraries MPI and PVM. In order to test several methods on different applications while keeping code modifications to a minimum, a tool called SPS (Server of Parallel Solvers) was developed. We describe the various constraints on code optimization in an industrial context, present the solutions provided by the SPS tool, show the development of the linear solver part with the tested parallelization methods, and finally compare the results against the imposed criteria. (author) [fr

  14. Multiple predictor smoothing methods for sensitivity analysis: Example results

    International Nuclear Information System (INIS)

    Storlie, Curtis B.; Helton, Jon C.

    2008-01-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described in the first part of this presentation: (i) locally weighted regression (LOESS), (ii) additive models, (iii) projection pursuit regression, and (iv) recursive partitioning regression. In this, the second and concluding part of the presentation, the indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present
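
    A much-simplified sketch of the idea behind smoothing-based sensitivity analysis: fit a nonparametric LOESS smooth of the output against each input separately and rank inputs by the variance explained. The data, the LOESS span and the single-predictor (rather than stepwise) treatment are assumptions for illustration:

    ```python
    import numpy as np
    from statsmodels.nonparametric.smoothers_lowess import lowess

    rng = np.random.default_rng(0)

    # Hypothetical sampling-based study: three inputs, one nonlinear response.
    n = 500
    x = rng.uniform(-1, 1, size=(n, 3))
    y = np.sin(3 * x[:, 0]) + 0.3 * x[:, 1] ** 2 + 0.1 * rng.normal(size=n)

    # Rank inputs by the variance explained by a LOESS smooth of y against each
    # input taken alone -- a crude nonparametric analogue of stepwise smoothing.
    for j in range(x.shape[1]):
        fitted = lowess(y, x[:, j], frac=0.4, return_sorted=False)
        r2 = 1.0 - np.var(y - fitted) / np.var(y)
        print(f"input {j}: LOESS R^2 = {r2:.2f}")
    ```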

  15. A randomized phase II dose-response exercise trial among colon cancer survivors: Purpose, study design, methods, and recruitment results.

    Science.gov (United States)

    Brown, Justin C; Troxel, Andrea B; Ky, Bonnie; Damjanov, Nevena; Zemel, Babette S; Rickels, Michael R; Rhim, Andrew D; Rustgi, Anil K; Courneya, Kerry S; Schmitz, Kathryn H

    2016-03-01

    Observational studies indicate that higher volumes of physical activity are associated with improved disease outcomes among colon cancer survivors. The aim of this report is to describe the purpose, study design, methods, and recruitment results of the COURAGE trial, a National Cancer Institute (NCI) sponsored, phase II, randomized, dose-response exercise trial among colon cancer survivors. The primary objective of the COURAGE trial is to quantify the feasibility, safety, and physiologic effects of low-dose (150 min·week(-1)) and high-dose (300 min·week(-1)) moderate-intensity aerobic exercise compared to a usual-care control group over six months. The exercise groups are provided with in-home treadmills and heart rate monitors. Between January and July 2015, 1433 letters were mailed using a population-based state cancer registry; 126 colon cancer survivors inquired about participation, and 39 were randomized onto the study protocol. Age was associated with inquiry about study participation; no other demographic, clinical, or geographic characteristics were associated with study inquiry or randomization. The final trial participant was randomized in August 2015. Six month endpoint data collection was completed in February 2016. The recruitment of colon cancer survivors into an exercise trial is feasible. The findings from this trial will inform key design aspects for future phase 2 and phase 3 randomized controlled trials to examine the efficacy of exercise to improve clinical outcomes among colon cancer survivors. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Method of vacuum correlation functions: Results and prospects

    International Nuclear Information System (INIS)

    Badalian, A. M.; Simonov, Yu. A.; Shevchenko, V. I.

    2006-01-01

    Basic results obtained within the QCD method of vacuum correlation functions over the past 20 years in the context of investigations into strong-interaction physics at the Institute of Theoretical and Experimental Physics (ITEP, Moscow) are formulated. Emphasis is placed primarily on the prospects of the general theory developed within QCD by employing both nonperturbative and perturbative methods. On the basis of ab initio arguments, it is shown that the lowest two field correlation functions play a dominant role in QCD dynamics. A quantitative theory of confinement and deconfinement, as well as of the spectra of light and heavy quarkonia, glueballs, and hybrids, is given in terms of these two correlation functions. Perturbation theory in a nonperturbative vacuum (background perturbation theory) plays a significant role, not possessing drawbacks of conventional perturbation theory and leading to the infrared freezing of the coupling constant α_s.

  17. Generalist palliative care in hospital - Cultural and organisational interactions. Results of a mixed-methods study.

    Science.gov (United States)

    Bergenholtz, Heidi; Jarlbaek, Lene; Hølge-Hazelton, Bibi

    2016-06-01

    It can be challenging to provide generalist palliative care in hospitals, owing to difficulties in integrating disease-oriented treatment with palliative care and the influences of cultural and organisational conditions. However, knowledge on the interactions that occur is sparse. To investigate the interactions between organisation and culture as conditions for integrated palliative care in hospital and, if possible, to suggest workable solutions for the provision of generalist palliative care. A convergent parallel mixed-methods design was chosen using two independent studies: a quantitative study, in which three independent datasets were triangulated to study the organisation and evaluation of generalist palliative care, and a qualitative, ethnographic study exploring the culture of generalist palliative nursing care in medical departments. A Danish regional hospital with 29 department managements and one hospital management. Two overall themes emerged: (1) 'generalist palliative care as a priority at the hospital', suggesting contrasting issues regarding prioritisation of palliative care at different organisational levels, and (2) 'knowledge and use of generalist palliative care clinical guideline', suggesting that the guideline had not reached all levels of the organisation. Contrasting issues in the hospital's provision of generalist palliative care at different organisational levels seem to hamper the interactions between organisation and culture - interactions that appear to be necessary for the provision of integrated palliative care in the hospital. The implementation of palliative care is also hindered by the main focus being on disease-oriented treatment, which is reflected at all the organisational levels. © The Author(s) 2015.

  18. Study designs may influence results

    DEFF Research Database (Denmark)

    Johansen, Christoffer; Schüz, Joachim; Andreasen, Anne-Marie Serena

    2017-01-01

    appeared to show an inverse association, whereas nested case-control and cohort studies showed no association. For allergies, the inverse association was observed irrespective of study design. We recommend that the questionnaire-based case-control design be placed lower in the hierarchy of studies...... for establishing cause-and-effect for diseases such as glioma. We suggest that a state-of-the-art case-control study should, as a minimum, be accompanied by extensive validation of the exposure assessment methods and the representativeness of the study sample with regard to the exposures of interest. Otherwise...

  19. Methodological study of volcanic glass dating by fission track method

    International Nuclear Information System (INIS)

    Araya, A.M.O.

    1987-01-01

    After a description of the method and an analysis of the age equation, we show the methodology used to plot the correction curve, together with the results of the study of correction curves and corrected ages. A study of the size-correction method shows that the effect of reactor irradiation on the curve is negligible and that the correction curve is independent of the thermal treatment, although it depends on the chemical treatment and on the sample. Comparing the corrected ages obtained from both correction methods with the ages given by other authors, we conclude that they are in agreement; concerning the plateau method, both the isothermal and the isochronic plateau give the same results. (author) [pt

  20. A comparative study on effective dynamic modeling methods for flexible pipe

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Chang Ho; Hong, Sup; Kim, Hyung Woo [Korea Research Institute of Ships and Ocean Engineering, Daejeon (Korea, Republic of); Kim, Sung Soo [Chungnam National University, Daejeon (Korea, Republic of)

    2015-07-15

    In this paper, in order to select a suitable method for the large-deflection, small-strain problem of pipe systems in the deep seabed mining system, the finite difference method with lumped mass from the field of cable dynamics and the substructure method from the field of flexible multibody dynamics were compared. Due to the difficulty of obtaining experimental results from an actual pipe system in the deep seabed mining system, a thin cantilever beam model with experimental results was employed for the comparative study. Accuracy of the methods was investigated by comparing the experimental results and simulation results from the cantilever beam model with different numbers of elements. Efficiency of the methods was also examined by comparing the operational counts required for solving the equations of motion. Finally, this cantilever beam model, together with the comparative study results, can serve as a benchmark problem for flexible multibody dynamics.

  1. 3D ultrasound computer tomography: Hardware setup, reconstruction methods and first clinical results

    Science.gov (United States)

    Gemmeke, Hartmut; Hopp, Torsten; Zapf, Michael; Kaiser, Clemens; Ruiter, Nicole V.

    2017-11-01

    A promising candidate for improved imaging of breast cancer is ultrasound computer tomography (USCT). Current experimental USCT systems are still focused in elevation dimension resulting in a large slice thickness, limited depth of field, loss of out-of-plane reflections, and a large number of movement steps to acquire a stack of images. 3D USCT emitting and receiving spherical wave fronts overcomes these limitations. We built an optimized 3D USCT, realizing for the first time the full benefits of a 3D system. The point spread function could be shown to be nearly isotropic in 3D, to have very low spatial variability and fit the predicted values. The contrast of the phantom images is very satisfactory in spite of imaging with a sparse aperture. The resolution and imaged details of the reflectivity reconstruction are comparable to a 3 T MRI volume. Important for the obtained resolution are the simultaneously obtained results of the transmission tomography. The KIT 3D USCT was then tested in a pilot study on ten patients. The primary goals of the pilot study were to test the USCT device, the data acquisition protocols, the image reconstruction methods and the image fusion techniques in a clinical environment. The study was conducted successfully; the data acquisition could be carried out for all patients with an average imaging time of six minutes per breast. The reconstructions provide promising images. Overlaid volumes of the modalities show qualitative and quantitative information at a glance. This paper gives a summary of the involved techniques, methods, and first results.

  2. Patient housing barriers to hematopoietic cell transplantation: results from a mixed-methods study of transplant center social workers.

    Science.gov (United States)

    Preussler, Jaime M; Mau, Lih-Wen; Majhail, Navneet S; Bevans, Margaret; Clancy, Emilie; Messner, Carolyn; Parran, Leslie; Pederson, Kate A; Ferguson, Stacy Stickney; Walters, Kent; Murphy, Elizabeth A; Denzen, Ellen M

    2016-03-01

    Hematopoietic cell transplantation (HCT) is performed in select centers in the United States (U.S.), and patients are often required to temporarily relocate to receive care. The purpose of this study was to identify housing barriers impacting access to HCT and potential solutions. A mixed-methods primary study of HCT social workers was conducted to learn about patient housing challenges and solutions in place that help address those barriers. Three telephone focus groups were conducted with adult and pediatric transplant social workers (n = 15). Focus group results informed the design of a national survey. The online survey was e-mailed to a primary social worker contact at 133 adult and pediatric transplant centers in the U.S. Transplant centers were classified based on the patient population cared for by the social worker. The survey response rate was 49%. Among adult programs (n = 45), 93% of centers had patients that had to relocate closer to the transplant center to proceed with HCT. The most common type of housing option offered was discounted hotel rates. Among pediatric programs (n = 20), 90% of centers had patients that had to relocate closer to the transplant center to proceed with HCT. Ronald McDonald House was the most common option available. This study is the first to explore housing challenges faced by patients undergoing HCT in the U.S. from the perspective of social workers and to highlight solutions that centers use. Transplant centers will benefit from this knowledge by learning about options for addressing housing barriers for their patients.

  3. A semantics-based method for clustering of Chinese web search results

    Science.gov (United States)

    Zhang, Hui; Wang, Deqing; Wang, Li; Bi, Zhuming; Chen, Yong

    2014-01-01

    Information explosion is a critical challenge to the development of modern information systems. In particular, when the application of an information system is over the Internet, the amount of information over the web has been increasing exponentially and rapidly. Search engines, such as Google and Baidu, are essential tools for people to find information on the Internet. Valuable information, however, is still likely submerged in the ocean of search results from those tools. By clustering the results into different groups based on subjects automatically, a search engine with the clustering feature allows users to select the most relevant results quickly. In this paper, we propose an online semantics-based method to cluster Chinese web search results. First, we employ the generalised suffix tree to extract the longest common substrings (LCSs) from search snippets. Second, we use the HowNet to calculate the similarities of the words derived from the LCSs, and extract the most representative features by constructing the vocabulary chain. Third, we construct a vector of text features and calculate snippets' semantic similarities. Finally, we improve the Chameleon algorithm to cluster snippets. Extensive experimental results have shown that the proposed algorithm has outperformed the suffix tree clustering method and other traditional clustering methods.
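
    A toy sketch of snippet clustering in the same spirit, with simpler stand-ins for the paper's components: TF-IDF replaces the HowNet-based semantic similarity, plain agglomerative clustering replaces the improved Chameleon algorithm, and the English snippets are hypothetical:

    ```python
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.cluster import AgglomerativeClustering

    # Hypothetical search snippets for an ambiguous query ("jaguar").
    snippets = [
        "jaguar speed of the big cat in the wild",
        "jaguar car dealership new models and prices",
        "big cat conservation and habitat protection",
        "used car prices and dealership reviews",
    ]

    # Represent each snippet as a TF-IDF vector and group them into two clusters.
    vectors = TfidfVectorizer().fit_transform(snippets).toarray()
    labels = AgglomerativeClustering(n_clusters=2).fit_predict(vectors)

    for label, snippet in zip(labels, snippets):
        print(label, snippet)
    ```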

  4. Integrating Quantitative and Qualitative Results in Health Science Mixed Methods Research Through Joint Displays

    Science.gov (United States)

    Guetterman, Timothy C.; Fetters, Michael D.; Creswell, John W.

    2015-01-01

    PURPOSE Mixed methods research is becoming an important methodology to investigate complex health-related topics, yet the meaningful integration of qualitative and quantitative data remains elusive and needs further development. A promising innovation to facilitate integration is the use of visual joint displays that bring data together visually to draw out new insights. The purpose of this study was to identify exemplar joint displays by analyzing the various types of joint displays being used in published articles. METHODS We searched for empirical articles that included joint displays in 3 journals that publish state-of-the-art mixed methods research. We analyzed each of 19 identified joint displays to extract the type of display, mixed methods design, purpose, rationale, qualitative and quantitative data sources, integration approaches, and analytic strategies. Our analysis focused on what each display communicated and its representation of mixed methods analysis. RESULTS The most prevalent types of joint displays were statistics-by-themes and side-by-side comparisons. Innovative joint displays connected findings to theoretical frameworks or recommendations. Researchers used joint displays for convergent, explanatory sequential, exploratory sequential, and intervention designs. We identified exemplars for each of these designs by analyzing the inferences gained through using the joint display. Exemplars represented mixed methods integration, presented integrated results, and yielded new insights. CONCLUSIONS Joint displays appear to provide a structure to discuss the integrated analysis and assist both researchers and readers in understanding how mixed methods provides new insights. We encourage researchers to use joint displays to integrate and represent mixed methods analysis and discuss their value. PMID:26553895

  5. Multiband discrete ordinates method: formalism and results; Methode multibande aux ordonnees discretes: formalisme et resultats

    Energy Technology Data Exchange (ETDEWEB)

    Luneville, L

    1998-06-01

    The multigroup discrete ordinates method is a classical way to solve transport equation (Boltzmann) for neutral particles. Self-shielding effects are not correctly treated due to large variations of cross sections in a group (in the resonance range). To treat the resonance domain, the multiband method is introduced. The main idea is to divide the cross section domain into bands. We obtain the multiband parameters using the moment method; the code CALENDF provides probability tables for these parameters. We present our implementation in an existing discrete ordinates code: SN1D. We study deep penetration benchmarks and show the improvement of the method in the treatment of self-shielding effects. (author) 15 refs.
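
    A schematic illustration of why band (probability-table) parameters help with self-shielding: weighting each band by an approximate flux shows how the effective cross section drops below the plain group average. The table values and the narrow-resonance-style weighting are assumptions for illustration, not CALENDF output:

    ```python
    import numpy as np

    # Hypothetical probability table for one energy group: band probabilities p_i
    # and band cross sections sigma_i (barns), plus a background cross section
    # representing the rest of the mixture.
    p       = np.array([0.7, 0.25, 0.05])
    sigma   = np.array([2.0, 50.0, 800.0])
    sigma_b = 10.0

    # Unshielded group average ignores the anticorrelation between flux and cross section.
    sigma_avg = np.sum(p * sigma)

    # Band-wise (self-shielded) effective cross section: each band is weighted by
    # an approximate flux proportional to 1 / (sigma_i + sigma_b) before averaging.
    weights = p / (sigma + sigma_b)
    sigma_eff = np.sum(weights * sigma) / np.sum(weights)

    print(f"unshielded average: {sigma_avg:.1f} b, self-shielded effective: {sigma_eff:.1f} b")
    ```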

  6. Algorithms for monitoring warfarin use: Results from Delphi Method.

    Science.gov (United States)

    Kano, Eunice Kazue; Borges, Jessica Bassani; Scomparini, Erika Burim; Curi, Ana Paula; Ribeiro, Eliane

    2017-10-01

    Warfarin stands as the most prescribed oral anticoagulant. New oral anticoagulants have been approved recently; however, their use is limited and the reversibility techniques of the anticoagulation effect are little known. Thus, our study's purpose was to develop algorithms for therapeutic monitoring of patients taking warfarin based on the opinion of physicians who prescribe this medicine in their clinical practice. The development of the algorithm was performed in two stages, namely: (i) literature review and (ii) algorithm evaluation by physicians using a Delphi Method. Based on the articles analyzed, two algorithms were developed: "Recommendations for the use of warfarin in anticoagulation therapy" and "Recommendations for the use of warfarin in anticoagulation therapy: dose adjustment and bleeding control." Later, these algorithms were analyzed by 19 medical doctors that responded to the invitation and agreed to participate in the study. Of these, 16 responded to the first round, 11 to the second and eight to the third round. A 70% consensus or higher was reached for most issues and at least 50% for six questions. We were able to develop algorithms to monitor the use of warfarin by physicians using a Delphi Method. The proposed method is inexpensive and involves the participation of specialists, and it has proved adequate for the intended purpose. Further studies are needed to validate these algorithms, enabling them to be used in clinical practice.

  7. Text-in-context: a method for extracting findings in mixed-methods mixed research synthesis studies.

    Science.gov (United States)

    Sandelowski, Margarete; Leeman, Jennifer; Knafl, Kathleen; Crandell, Jamie L

    2013-06-01

    Our purpose in this paper is to propose a new method for extracting findings from research reports included in mixed-methods mixed research synthesis studies. International initiatives in the domains of systematic review and evidence synthesis have been focused on broadening the conceptualization of evidence, increased methodological inclusiveness and the production of evidence syntheses that will be accessible to and usable by a wider range of consumers. Initiatives in the general mixed-methods research field have been focused on developing truly integrative approaches to data analysis and interpretation. The data extraction challenges described here were encountered, and the method proposed for addressing these challenges was developed, in the first year of the ongoing (2011-2016) study: Mixed-Methods Synthesis of Research on Childhood Chronic Conditions and Family. To preserve the text-in-context of findings in research reports, we describe a method whereby findings are transformed into portable statements that anchor results to relevant information about sample, source of information, time, comparative reference point, magnitude and significance and study-specific conceptions of phenomena. The data extraction method featured here was developed specifically to accommodate mixed-methods mixed research synthesis studies conducted in nursing and other health sciences, but reviewers might find it useful in other kinds of research synthesis studies. This data extraction method itself constitutes a type of integration to preserve the methodological context of findings when statements are read individually and in comparison to each other. © 2012 Blackwell Publishing Ltd.

  8. Soil Particle Size Analysis by Laser Diffractometry: Result Comparison with Pipette Method

    Science.gov (United States)

    Šinkovičová, Miroslava; Igaz, Dušan; Kondrlová, Elena; Jarošová, Miriam

    2017-10-01

    Soil texture, as the basic soil physical property, provides basic information on the soil grain size distribution as well as on grain size fraction representation. Currently, there are several methods of particle dimension measurement available that are based on different physical principles. The pipette method, based on the different sedimentation velocities of particles with different diameters, is considered to be one of the standard methods for determining the individual grain size fraction distribution. Following technical advancement, optical methods such as laser diffraction can nowadays also be used for grain size distribution determination in the soil. According to the literature review of domestic as well as international sources related to this topic, it is obvious that the results obtained by laser diffractometry do not correspond with the results obtained by the pipette method. The main aim of this paper was to analyse 132 samples of medium fine soil, taken from the Nitra River catchment in Slovakia, from depths of 15-20 cm and 40-45 cm, respectively, using the laser analysers ANALYSETTE 22 MicroTec plus (Fritsch GmbH) and Mastersizer 2000 (Malvern Instruments Ltd). The results obtained by laser diffractometry were compared with the pipette method and regression relationships using linear, exponential, power and polynomial trends were derived. The regressions with the three highest regression coefficients (R2) were further investigated. The closest fit was observed for the polynomial regression. In view of the results obtained, we recommend using the estimated representation of the clay fraction when the analysis is done by laser diffractometry. The advantages of the laser diffraction method comprise the short analysis time, the use of a small sample amount, applicability to various grain size fraction and soil type classification systems, and a wide range of determined fractions. Therefore, it is necessary to focus on this issue further to address the
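
    A minimal sketch of the kind of regression comparison described above, with hypothetical paired clay-fraction values; the pipette value is regressed on the laser-diffraction value and the fits are compared by R²:

    ```python
    import numpy as np

    # Hypothetical paired clay-fraction results (%) for the same samples:
    # pipette method (reference) vs laser diffractometry.
    pipette = np.array([8.0, 12.5, 15.0, 20.0, 24.5, 30.0, 35.5])
    laser   = np.array([4.5, 7.0, 9.0, 12.5, 15.0, 19.5, 23.0])

    def r_squared(y, y_hat):
        ss_res = np.sum((y - y_hat) ** 2)
        ss_tot = np.sum((y - np.mean(y)) ** 2)
        return 1.0 - ss_res / ss_tot

    # Compare linear and polynomial regressions of the pipette value on the laser value.
    for deg, label in [(1, "linear"), (2, "polynomial (deg 2)")]:
        coeffs = np.polyfit(laser, pipette, deg=deg)
        fitted = np.polyval(coeffs, laser)
        print(f"{label}: R^2 = {r_squared(pipette, fitted):.3f}")
    ```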

  9. Results of the determination of He in cenozoic aquifers using the GC method.

    Science.gov (United States)

    Kotowski, Tomasz; Najman, Joanna

    2015-04-01

    Applications of the Helium (He) method known so far consisted mainly of 4He measurements using a special mass spectrometer. 4He measurements for groundwater dating purposes can be replaced by total He (3He+4He) concentration measurements because the content of 3He can be ignored. The concentrations of 3He are very low and 3He/4He ratios do not exceed 1.0·10(-5) in most cases. In this study, the total He concentrations in groundwater were determined using the gas chromatographic (GC) method as an alternative to methods based on spectrometric measurement. He concentrations in groundwater were used for the determination of residence time and groundwater circulation. Additionally, the radiocarbon method was used to determine the value of the external He flux (JHe) in the study area. The obtained low He concentrations and their small variation within the ca. 65 km long section along which groundwater flows indicate that the residence time is likely relatively short and that there is a strong hydraulic connection between the aquifers. The estimated residence time (ca. 3000 years) depends heavily on the large uncertainty of the He concentration resulting from the low concentrations of He, on the external 4He flux value adopted for calculation purposes and on the 14C ages used to estimate the external 4He flux. © 2015, National Ground Water Association.

  10. Development of new therapeutic methods of lung cancer through team approach study

    International Nuclear Information System (INIS)

    Park, Jong Ho; Zo, Jae Ill; Baek, Hee Jong; Jung, Jin Haeng; Lee, Jae Cheol; Ryoo, Baek Yeol; Kim, Mi Sook; Choi, Du Hwan; Park, Sun Young; Lee, Hae Young

    2000-12-01

    The aims of this study were to establish the lung cancer clinics in Korea Cancer Center Hospital and to develop new therapeutic methods for lung cancer in order to increase the cure rate and survival rate of patients. Another purpose of this study was to establish a common treatment protocol in our hospital. All patients operated on in Korea Cancer Center Hospital for lung cancer from 1987 were followed up and evaluated. We have studied the effect of postoperative adjuvant therapy in stage I, II and IIIA non-small cell lung cancer patients from 1989 in a phase III study design. Follow-up examinations were scheduled in these patients and an interim analysis was made. We have also studied the effect of chemotherapeutic agents in small cell lung cancer patients from 1997 in a phase II study design, and evaluated the results of this study. Some important results were as follows. 1. The new therapeutic method (surgery + MVP chemotherapy) was superior to the standard therapeutic one in stage I non-small cell lung cancer patients, so we have to change the standard method of treatment in stage I NSCLC. 2. This new therapeutic method also gave a good result in stage II NSCLC patients, and this result was reported in The Annals of Thoracic Surgery. 3. However, this new therapeutic method was not superior to the standard treatment method (surgery only) in stage IIIA NSCLC patients, so new chemotherapeutic agents must be developed in the future for advanced NSCLC patients. 4. In the results of the randomized phase II studies on small cell lung cancer, there was no difference in survival between the Etoposide + Carboplatin + Ifosfamide + Cisplatin group and the Etoposide + Carboplatin + Ifosfamide + Cisplatin + Tamoxifen group in either the limited or the extended type of small cell lung cancer patients

  11. Basic principles and results of the German risk study

    International Nuclear Information System (INIS)

    Heuser, F.W.; Bayer, A.

    1980-01-01

    In June 1976 the Federal Ministry for Research and Technology commissioned the Gesellschaft fuer Reaktorsicherheit to prepare the German Risk Study, the first part of which has now been completed after three years of work and has recently been published. The German Risk Study is an attempt to determine the societal risk posed by accidents in nuclear power plants under conditions in Germany. For this purpose, the accident rates and the resultant health hazards were determined. By adopting most of the basic premises and methods of the American Rasmussen Study, the German study is intended to allow a comparison to be made with the results of that study. The calculations were based on 19 sites with a total of 25 nuclear generating units presently in operation, under construction or in the licensing procedure in the Federal Republic of Germany. The technical studies were conducted on a 1300 MW PWR as the representative example. The results show that the decisive contributions are made by uncontrolled minor loss-of-coolant accidents and by failures of power supply (emergency power case). Large loss-of-coolant accidents do not play a role. The study also shows the decisive safety function of the containment. (orig.) [de

  12. Radiochemical studies of some preparation methods for phosphorus

    International Nuclear Information System (INIS)

    Loos-Neskovic, C.; Fedoroff, M.

    1983-01-01

    Various methods of radiochemical separation were tested for the determination of phosphorus in metals and alloys by neutron activation analysis. Classical methods of separation revealed some defects when applied to this problem. Methods using liquid extraction gave low yields and were not reproducible. Methods based on precipitation gave better results, but were not selective enough in most cases. Retention on alumina was not possible without preliminary separations. The authors studied a new radiochemical separation based on the extraction of elemental phosphorus in the gaseous phase after reduction at high temperature with carbon. Measurements with radioactive phosphorus showed that the extraction yield is better than 99%. (author)

  13. Development of standard methods for activity measurement of natural radionuclides in waterworks as basis for dose and risk assessment—First results of an Austrian study

    International Nuclear Information System (INIS)

    Stietka, M.; Baumgartner, A.; Seidel, C.; Maringer, F.J.

    2013-01-01

    A comprehensive study aiming to evaluate the risks due to radiation exposure for workers in water supply is being conducted in 21 Austrian waterworks. The development of standard methods for the assessment of the occupational exposure of waterwork staff is part of this study. Preliminary results show a wide range of Rn-222 activity concentrations in waterworks, with values from (28±10) Bq/m3 to (38,000±4000) Bq/m3. Seasonal variations of the Rn-222 activity concentration could also be observed. - Highlights: • In this study the occupational exposure of waterwork staff was evaluated. • The Rn-222 concentration in indoor air in waterworks was measured for 1 year. • Results show a wide range of Rn-222 activity concentrations in waterworks. • Seasonal variations of the Rn-222 activity concentration could be observed

  14. Comparison of the analysis result between two laboratories using different methods

    International Nuclear Information System (INIS)

    Sri Murniasih; Agus Taftazani

    2017-01-01

    A comparison was made of the analysis results for a volcanic ash sample between two laboratories using different analysis methods. The research aims to improve testing laboratory quality and to foster cooperation with a testing laboratory from another country. Samples were tested at the Center for Accelerator of Science and Technology (CAST)-NAA laboratory using NAA, and at the University of Texas (UT), USA, using the ICP-MS and ENAA methods. Of the 12 target elements, the CAST-NAA laboratory was able to report analysis data for 11. The comparison shows that the results for the K, Mn, Ti and Fe elements from the two laboratories agree very closely, as indicated by the RSD values and correlation coefficients of the two laboratories' results. Examination of the differences shows that the results for the Al, Na, K, Fe, V, Mn, Ti, Cr and As elements from the two laboratories are not significantly different. Of the 11 elements reported, only Zn had significantly different values between the two laboratories. (author)
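
    A small sketch of the kind of agreement check described above (relative standard deviation of paired results and correlation coefficient); the element values are hypothetical:

    ```python
    import numpy as np

    # Hypothetical paired results (mg/kg) for a few elements reported by the
    # two laboratories.
    elements = ["K", "Mn", "Ti", "Fe"]
    lab_a = np.array([12100.0, 950.0, 4600.0, 38500.0])
    lab_b = np.array([11800.0, 930.0, 4750.0, 37900.0])

    # Relative standard deviation of each pair and the overall correlation coefficient.
    pair_mean = (lab_a + lab_b) / 2
    pair_sd   = np.std(np.vstack([lab_a, lab_b]), axis=0, ddof=1)
    rsd_pct   = 100 * pair_sd / pair_mean
    r = np.corrcoef(lab_a, lab_b)[0, 1]

    for el, rsd in zip(elements, rsd_pct):
        print(f"{el}: RSD = {rsd:.1f}%")
    print(f"correlation coefficient r = {r:.4f}")
    ```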

  15. Three magnetic particles solid phase radioimmunoassay for T4: Comparison of their results with established methods

    International Nuclear Information System (INIS)

    Bashir, T.

    1996-01-01

    The introduction of solid phase separation techniques is an important improvement in radioimmunoassays and immunoradiometric assays. The magnetic particle solid phase method has additional advantages over others, as the separation is rapid and centrifugation is not required. Three types of magnetic particles have been studied in T4 RIA and the results have been compared with commercial kits and other established methods. (author). 4 refs, 9 figs, 2 tabs

  16. Studies on analytical method and nondestructive measuring method on the sensitization of austenitic stainless steels

    International Nuclear Information System (INIS)

    Onimura, Kichiro; Arioka, Koji; Horai, Manabu; Noguchi, Shigeru.

    1982-03-01

    Austenitic stainless steels are widely used as structural materials for the machines and equipment of various kinds of plants, such as thermal power, nuclear power, and chemical plants. The machines and equipment using this kind of material, however, may suffer corrosion damage while in service, and in some cases these damages are considered to be largely due to the sensitization of the material. It is therefore necessary to develop an analytical method for understanding the sensitization of the material in more detail, and a quantitative nondestructive measuring method which is applicable to various kinds of structures, in order to prevent corrosion damage. From the above viewpoint, studies have been made on an analytical method based on the theory of diffusion of chromium in austenitic stainless steels and on the Electro-Potentiokinetic Reactivation Method (EPR Method) as a nondestructive measuring method, using 304 and 316 austenitic stainless steels having different carbon contents in the base metals. This paper introduces the results of EPR tests on the sensitization of austenitic stainless steels and the correlation between the analytical and experimental results. (author)

  17. A Kinematic Study of Prosodic Structure in Articulatory and Manual Gestures: Results from a Novel Method of Data Collection

    Directory of Open Access Journals (Sweden)

    Jelena Krivokapić

    2017-03-01

    Full Text Available The primary goal of this work is to examine prosodic structure as expressed concurrently through articulatory and manual gestures. Specifically, we investigated the effects of phrase-level prominence (Experiment 1) and of prosodic boundaries (Experiments 2 and 3) on the kinematic properties of oral constriction and manual gestures. The hypothesis guiding this work is that prosodic structure will be similarly expressed in both modalities. To test this, we have developed a novel method of data collection that simultaneously records speech audio, vocal tract gestures (using electromagnetic articulometry) and manual gestures (using motion capture). This method allows us, for the first time, to investigate kinematic properties of body movement and vocal tract gestures simultaneously, which in turn allows us to examine the relationship between speech and body gestures with great precision. A second goal of the paper is thus to establish the validity of this method. Results from two speakers show that manual and oral gestures lengthen under prominence and at prosodic boundaries, indicating that the effects of prosodic structure extend beyond the vocal tract to include body movement.

  18. Designing a mixed methods study in primary care.

    Science.gov (United States)

    Creswell, John W; Fetters, Michael D; Ivankova, Nataliya V

    2004-01-01

    Mixed methods or multimethod research holds potential for rigorous, methodologically sound investigations in primary care. The objective of this study was to use criteria from the literature to evaluate 5 mixed methods studies in primary care and to advance 3 models useful for designing such investigations. We first identified criteria from the social and behavioral sciences to analyze mixed methods studies in primary care research. We then used the criteria to evaluate 5 mixed methods investigations published in primary care research journals. Of the 5 studies analyzed, 3 included a rationale for mixing based on the need to develop a quantitative instrument from qualitative data or to converge information to best understand the research topic. Quantitative data collection involved structured interviews, observational checklists, and chart audits that were analyzed using descriptive and inferential statistical procedures. Qualitative data consisted of semistructured interviews and field observations that were analyzed using coding to develop themes and categories. The studies showed diverse forms of priority: equal priority, qualitative priority, and quantitative priority. Data collection involved quantitative and qualitative data gathered both concurrently and sequentially. The integration of the quantitative and qualitative data in these studies occurred between data analysis from one phase and data collection from a subsequent phase, while analyzing the data, and when reporting the results. We recommend instrument-building, triangulation, and data transformation models for mixed methods designs as useful frameworks to add rigor to investigations in primary care. We also discuss the limitations of our study and the need for future research.

  19. Interpreting Statistical Significance Test Results: A Proposed New "What If" Method.

    Science.gov (United States)

    Kieffer, Kevin M.; Thompson, Bruce

    As the 1994 publication manual of the American Psychological Association emphasized, "p" values are affected by sample size. As a result, it can be helpful to interpret the results of statistical significant tests in a sample size context by conducting so-called "what if" analyses. However, these methods can be inaccurate…

  20. Feasibility study on X-ray source with pinhole imaging method

    International Nuclear Information System (INIS)

    Qiu Rui; Li Junli

    2007-01-01

    In order to verify the feasibility of studying an X-ray source with the pinhole imaging method, and to optimize the design of the X-ray pinhole imaging system, an X-ray pinhole imaging equipment was set up. The change of the image due to changes in the position and intensity of the X-ray source was estimated with a mathematical method and validated by experiment. The results show that the change of the spot position and of the spot grey level is linearly related to the change of the position and intensity of the X-ray source, so it is feasible to study an X-ray source with the pinhole imaging method in this application. The results provide some references for the design of X-ray pinhole imaging systems. (authors)
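
    A minimal geometric sketch of the linear relation the study relies on: for a pinhole camera the image spot shifts in proportion to the source shift, scaled by the (inverted) magnification. The distances are illustrative values, not the actual geometry of the equipment:

    ```python
    # Pinhole imaging: a shift of the source maps to a proportional, inverted
    # shift of the image spot, with magnification m = -d_image / d_source.
    d_source = 500.0   # source-to-pinhole distance (mm), illustrative value
    d_image  = 100.0   # pinhole-to-detector distance (mm), illustrative value
    m = -d_image / d_source

    for source_shift in (0.0, 1.0, 2.0, 5.0):   # mm
        spot_shift = m * source_shift
        print(f"source shift {source_shift:4.1f} mm -> image spot shift {spot_shift:6.2f} mm")
    ```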

  1. Studies on Tasar Cocoon Cooking Using Permeation Method

    Science.gov (United States)

    Javali, Uday C.; Malali, Kiran B.; Ramya, H. G.; Naik, Subhas V.; Padaki, Naveen V.

    2018-02-01

    Cocoon cooking is an important process before reeling of tasar silk yarn. Cooking ensures loosening of the filaments in the tasar cocoons, thereby easing the process of yarn withdrawal during reeling. Tasar cocoons have a very hard shell and hence these cocoons need a chemical cooking process to loosen the silk filaments. An attempt has been made in this article to study the effect of using a vacuum permeation chamber for tasar cocoon cooking in order to reduce the cooking time and improve the quality of tasar silk yarn. The vacuum assisted permeation cooking method has been studied on tasar daba cocoons for cooking efficiency, deflossing and reelability, and its efficiency has been evaluated with respect to different cooking methods, viz. traditional and open pan cooking. The tasar silk produced after the reeling process has been tested for fineness, strength and cohesion properties. Results indicate that the permeation method of tasar cooking ensures uniform cooking with higher efficiency along with better reeling performance and improved yarn properties.

  2. Comparative study between EDXRF and ASTM E572 methods using two-way ANOVA

    Science.gov (United States)

    Krummenauer, A.; Veit, H. M.; Zoppas-Ferreira, J.

    2018-03-01

    Comparison with a reference method is one of the necessary requirements for the validation of non-standard methods. This comparison was made using the experiment planning technique with two-way ANOVA. In the ANOVA, the results obtained using the EDXRF method, to be validated, were compared with the results obtained using the ASTM E572-13 standard test method. Fisher's tests (F-tests) were used for the comparative study of the elements molybdenum, niobium, copper, nickel, manganese, chromium and vanadium. All F-tests for these elements indicate that the null hypothesis (H0) was not rejected. As a result, there is no significant difference between the methods compared. Therefore, according to this study, it is concluded that the EDXRF method satisfied this method comparison requirement.
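
    A minimal sketch of a two-way ANOVA comparison of two methods across several elements using statsmodels; the concentration values are hypothetical and the layout (one measurement per element and method) is an assumption:

    ```python
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.formula.api import ols

    # Hypothetical concentrations (wt%) of a few elements in the same steel sample,
    # measured by the candidate method (EDXRF) and the reference method (ASTM E572).
    df = pd.DataFrame({
        "element": ["Mo", "Nb", "Cu", "Ni", "Mn", "Cr", "V"] * 2,
        "method":  ["EDXRF"] * 7 + ["ASTM_E572"] * 7,
        "value":   [0.21, 0.04, 0.18, 8.10, 1.45, 18.2, 0.06,
                    0.20, 0.04, 0.17, 8.05, 1.48, 18.0, 0.06],
    })

    # Two-way ANOVA without replication: element and method as the two factors.
    model = ols("value ~ C(element) + C(method)", data=df).fit()
    print(sm.stats.anova_lm(model, typ=2))
    # A large p-value for C(method) means the null hypothesis of no method effect
    # is not rejected, i.e. the two methods do not differ significantly.
    ```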

  3. Maxillary sinusitis - a comparative study of different imaging diagnosis methods

    International Nuclear Information System (INIS)

    Hueb, Marcelo Miguel; Borges, Fabiano de Almeida; Pulcinelli, Emilte; Souza, Wandir Ferreira; Borges, Luiz Marcondes

    1999-01-01

    We conducted a prospective study comparing different methods (plain X-rays, computed tomography and ultrasonography mode-A) for the initial diagnosis of maxillary sinusitis. Twenty patients (40 maxillary sinuses) with a clinical history suggestive of sinusitis were included in this study. The results were classified as abnormal or normal, using computed tomography as the gold standard. The sensitivity for ultrasonography and plain X-rays was 84.6% and 69.2%, respectively. The specificity of both methods was 92.6%. This study suggests that ultrasonography can be used as a good follow-up method for patients with maxillary sinusitis. (author)

  4. A statistical method for testing epidemiological results, as applied to the Hanford worker population

    International Nuclear Information System (INIS)

    Brodsky, A.

    1979-01-01

    Some recent reports by Mancuso, Stewart and Kneale claim findings of radiation-produced cancer in the Hanford worker population. These claims are based on statistical computations that use small differences in accumulated exposures between groups dying of cancer and groups dying of other causes; actual mortality and longevity were not reported. This paper presents a statistical method for the evaluation of actual mortality and longevity longitudinally over time, as applied in a primary analysis of the mortality experience of the Hanford worker population. Although available, this method was not utilized in the Mancuso-Stewart-Kneale paper. The author's preliminary longitudinal analysis shows that the gross mortality experience of persons employed at Hanford during the 1943-70 interval did not differ significantly from that of certain controls, when both employees and controls were selected from families with two or more offspring and comparisons were matched by age, sex, race and year of entry into employment. This result is consistent with findings reported by Sanders (Health Phys. vol.35, 521-538, 1978). The method utilizes an approximate chi-square (1 D.F.) statistic for testing population subgroup comparisons, as well as the cumulation of chi-squares (1 D.F.) for testing the overall result of a particular type of comparison. The method is available for computer testing of the Hanford mortality data, and could also be adapted to morbidity or other population studies. (author)
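
    A rough sketch of the cumulation idea described above, using a simple (O−E)²/E form as the approximate 1-d.f. chi-square per stratum; the counts are hypothetical and the exact statistic used in the paper may differ:

    ```python
    from scipy.stats import chi2

    # Hypothetical matched subgroup comparisons: observed deaths among workers vs
    # the number expected from the matched controls in each age/sex/entry-year stratum.
    observed = [12, 30, 22, 9]
    expected = [14.0, 27.5, 24.0, 10.5]

    # Approximate 1-d.f. chi-square per stratum, then cumulate across strata.
    chisq_each = [(o - e) ** 2 / e for o, e in zip(observed, expected)]
    chisq_total = sum(chisq_each)
    dof = len(chisq_each)

    for i, c in enumerate(chisq_each, 1):
        print(f"stratum {i}: chi-square(1) = {c:.2f}, p = {chi2.sf(c, 1):.3f}")
    print(f"overall: chi-square({dof}) = {chisq_total:.2f}, p = {chi2.sf(chisq_total, dof):.3f}")
    ```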

  5. A scoring system for appraising mixed methods research, and concomitantly appraising qualitative, quantitative and mixed methods primary studies in Mixed Studies Reviews.

    Science.gov (United States)

    Pluye, Pierre; Gagnon, Marie-Pierre; Griffiths, Frances; Johnson-Lafleur, Janique

    2009-04-01

    A new form of literature review has emerged, Mixed Studies Review (MSR). These reviews include qualitative, quantitative and mixed methods studies. In the present paper, we examine MSRs in health sciences, and provide guidance on processes that should be included and reported. However, there are no valid and usable criteria for concomitantly appraising the methodological quality of the qualitative, quantitative and mixed methods studies. To propose criteria for concomitantly appraising the methodological quality of qualitative, quantitative and mixed methods studies or study components. A three-step critical review was conducted. 2322 references were identified in MEDLINE, and their titles and abstracts were screened; 149 potentially relevant references were selected and the full-text papers were examined; 59 MSRs were retained and scrutinized using a deductive-inductive qualitative thematic data analysis. This revealed three types of MSR: convenience, reproducible, and systematic. Guided by a proposal, we conducted a qualitative thematic data analysis of the quality appraisal procedures used in the 17 systematic MSRs (SMSRs). Of 17 SMSRs, 12 showed clear quality appraisal procedures with explicit criteria but no SMSR used valid checklists to concomitantly appraise qualitative, quantitative and mixed methods studies. In two SMSRs, criteria were developed following a specific procedure. Checklists usually contained more criteria than needed. In four SMSRs, a reliability assessment was described or mentioned. While criteria for quality appraisal were usually based on descriptors that require specific methodological expertise (e.g., appropriateness), no SMSR described the fit between reviewers' expertise and appraised studies. Quality appraisal usually resulted in studies being ranked by methodological quality. A scoring system is proposed for concomitantly appraising the methodological quality of qualitative, quantitative and mixed methods studies for SMSRs. This

  6. A method for data handling numerical results in parallel OpenFOAM simulations

    International Nuclear Information System (INIS)

    Anton, Alin (Faculty of Automatic Control and Computing, Politehnica University of Timişoara, 2nd Vasile Pârvan Ave., 300223, TM Timişoara, Romania, alin.anton@cs.upt.ro); Muntean, Sebastian (Center for Advanced Research in Engineering Science, Romanian Academy – Timişoara Branch, 24th Mihai Viteazu Ave., 300221, TM Timişoara, Romania)

    2015-01-01

    Parallel computational fluid dynamics simulations produce vast amount of numerical result data. This paper introduces a method for reducing the size of the data by replaying the interprocessor traffic. The results are recovered only in certain regions of interest configured by the user. A known test case is used for several mesh partitioning scenarios using the OpenFOAM toolkit ® [1]. The space savings obtained with classic algorithms remain constant for more than 60 Gb of floating point data. Our method is most efficient on large simulation meshes and is much better suited for compressing large scale simulation results than the regular algorithms

  7. A method for data handling numerical results in parallel OpenFOAM simulations

    Energy Technology Data Exchange (ETDEWEB)

    Anton, Alin [Faculty of Automatic Control and Computing, Politehnica University of Timişoara, 2nd Vasile Pârvan Ave., 300223, TM Timişoara, Romania, alin.anton@cs.upt.ro (Romania); Muntean, Sebastian [Center for Advanced Research in Engineering Science, Romanian Academy – Timişoara Branch, 24th Mihai Viteazu Ave., 300221, TM Timişoara (Romania)

    2015-12-31

    Parallel computational fluid dynamics simulations produce vast amount of numerical result data. This paper introduces a method for reducing the size of the data by replaying the interprocessor traffic. The results are recovered only in certain regions of interest configured by the user. A known test case is used for several mesh partitioning scenarios using the OpenFOAM toolkit® [1]. The space savings obtained with classic algorithms remain constant for more than 60 Gb of floating point data. Our method is most efficient on large simulation meshes and is much better suited for compressing large scale simulation results than the regular algorithms.

  8. Relationship of Indoor, Outdoor and Personal Air (RIOPA) study: study design, methods and quality assurance/control results.

    Science.gov (United States)

    Weisel, Clifford P; Zhang, Junfeng; Turpin, Barbara J; Morandi, Maria T; Colome, Steven; Stock, Thomas H; Spektor, Dalia M; Korn, Leo; Winer, Arthur; Alimokhtari, Shahnaz; Kwon, Jaymin; Mohan, Krishnan; Harrington, Robert; Giovanetti, Robert; Cui, William; Afshar, Masoud; Maberti, Silvia; Shendell, Derek

    2005-03-01

    The Relationship of Indoor, Outdoor and Personal Air (RIOPA) Study was undertaken to evaluate the contribution of outdoor sources of air toxics, as defined in the 1990 Clean Air Act Amendments, to indoor concentrations and personal exposures. The concentrations of 18 volatile organic compounds (VOCs), 17 carbonyl compounds, and fine particulate matter mass (PM2.5) were measured using 48-h outdoor, indoor and personal air samples collected simultaneously. PM2.5 mass, as well as several component species (elemental carbon, organic carbon, polyaromatic hydrocarbons and elemental analysis), was also measured; only PM2.5 mass is reported here. Questionnaires were administered to characterize homes, neighborhoods and personal activities that might affect exposures. The air exchange rate was also measured in each home. Homes in close proximity (<0.5 km) to sources of air toxics were preferentially (2:1) selected for sampling. Approximately 100 non-smoking households were sampled in each of Elizabeth, NJ, Houston, TX, and Los Angeles, CA (100, 105, and 105, respectively), with second visits performed at 84, 93, and 81 homes in each city, respectively. VOC samples were collected at all homes, carbonyls at 90% and PM2.5 at 60% of the homes. Personal samples were collected from nonsmoking adults and a portion of the children living in the target homes. This manuscript provides the RIOPA study design and quality control and assurance data. The results from the RIOPA study can potentially provide information on the influence of ambient sources on indoor air concentrations and exposure for many air toxics and will furnish an opportunity to evaluate exposure models for these compounds.

  9. Comparative study on γ-ray spectrum by several filtering method

    International Nuclear Information System (INIS)

    Yuan Xinyu; Liu Liangjun; Zhou Jianliang

    2011-01-01

    A comparative study was conducted on gamma-ray spectra processed with several commonly used smoothing methods in order to show their filtering effect. The results showed that peaks were widened and overlapping peaks increased when energy-domain filters were applied to the γ-ray spectrum. In the frequency domain, the filter and its parameters should be chosen with care. Wavelet transformation preserves the signal in the high-frequency region well. An improved threshold method combines the advantages of the hard and soft threshold methods, which makes it suitable for detecting weak peaks. A new filter based on the gravity model approach was also put forward, with its denoising level assessed by the standard deviation. This method not only preserved the signal and the net peak area well, but also attained better results with a simple computer program. (authors)
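
    For illustration only, a minimal Python sketch of the hard and soft threshold rules mentioned above, in the textbook form commonly applied to wavelet coefficients; this is not the authors' improved threshold method.

        import numpy as np

        def hard_threshold(coeffs, thr):
            """Zero out coefficients whose magnitude is below the threshold,
            keep the rest unchanged."""
            c = np.asarray(coeffs, dtype=float)
            return np.where(np.abs(c) >= thr, c, 0.0)

        def soft_threshold(coeffs, thr):
            """Shrink all coefficients towards zero by the threshold
            (soft shrinkage), zeroing the small ones."""
            c = np.asarray(coeffs, dtype=float)
            return np.sign(c) * np.maximum(np.abs(c) - thr, 0.0)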

  10. Determination of the food consumption in eleven regions of the european community with a view to studying the radioactive contamination level: Methods used. Results of family enquiries

    International Nuclear Information System (INIS)

    Cresta, M.; Lacourly, G.

    1966-01-01

    In the present report are given the results obtained from food surveys carried out during the period 1963-1965 and involving 9000 families living in eleven regions spread over the six European Community countries. A partial analysis of the results, covering a reduced sample of 3725 families, makes it possible to establish the composition of the mean individual monthly and annual food consumption for each of the eleven regions. Details of the organisation of the survey, of the data processing methods and of the method of presenting the results are given in the first part of the report. The second part presents, in numerical table form, the consumption of various foodstuffs and the feeding principles for each region covered by the survey. Tables summarizing the data make it possible to compare the mean individual consumptions in the various regions studied. (author) [fr

  11. Epidemiological methods for research with drug misusers: review of methods for studying prevalence and morbidity

    Directory of Open Access Journals (Sweden)

    Dunn John

    1999-01-01

    Full Text Available Epidemiological studies of drug misusers have until recently relied on two main forms of sampling: probability and convenience. The former has been used when the aim was simply to estimate the prevalence of the condition and the latter when in-depth studies of the characteristics, profiles and behaviour of drug users were required, but each method has its limitations. Probability samples become impracticable when the prevalence of the condition is very low, less than 0.5% for example, or when the condition being studied is a clandestine activity such as illicit drug use. When stratified random samples are used, it may be difficult to obtain a truly representative sample, depending on the quality of the information used to develop the stratification strategy. The main limitation of studies using convenience samples is that the results cannot be generalised to the whole population of drug users due to selection bias and a lack of information concerning the sampling frame. New methods have been developed which aim to overcome some of these difficulties, for example, social network analysis, snowball sampling, capture-recapture techniques, the privileged access interviewer method and contact tracing. All these methods have been applied to the study of drug misuse. The various methods are described and examples of their use given, drawn from both the Brazilian and international drug misuse literature.
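
    For illustration only, a minimal Python sketch of one of the techniques named above, the two-sample capture-recapture estimate (Chapman's corrected Lincoln-Petersen estimator); the figures in the usage lines are hypothetical.

        def chapman_estimate(n1, n2, m):
            """Chapman's nearly unbiased version of the Lincoln-Petersen estimator.

            n1: size of the first sample (e.g., drug users known to one agency)
            n2: size of the second sample (e.g., users known to a second agency)
            m:  number of individuals appearing in both samples
            """
            return (n1 + 1) * (n2 + 1) / (m + 1) - 1

        # Hypothetical figures: 400 and 300 individuals in two data sources, 60 in both.
        print(round(chapman_estimate(400, 300, 60)))  # estimated population size, about 1978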

  12. Comparison results on preconditioned SOR-type iterative method for Z-matrices linear systems

    Science.gov (United States)

    Wang, Xue-Zhong; Huang, Ting-Zhu; Fu, Ying-Ding

    2007-09-01

    In this paper, we present some comparison theorems on preconditioned iterative methods for solving Z-matrix linear systems. The comparison results show that the rate of convergence of the Gauss-Seidel-type method is faster than that of the SOR-type iterative method.
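
    For illustration only, a minimal NumPy sketch of the plain SOR iteration discussed above (Gauss-Seidel is the special case omega = 1.0); the paper's specific preconditioners and comparison theorems are not reproduced here, and the small Z-matrix example is invented.

        import numpy as np

        def sor(A, b, omega=1.0, tol=1e-10, max_iter=10_000):
            """Plain SOR iteration; omega = 1.0 reduces to Gauss-Seidel."""
            A = np.asarray(A, dtype=float)
            b = np.asarray(b, dtype=float)
            x = np.zeros_like(b)
            n = len(b)
            for k in range(max_iter):
                x_old = x.copy()
                for i in range(n):
                    # Sum uses already-updated entries (j < i) and old entries (j > i).
                    sigma = A[i, :i] @ x[:i] + A[i, i + 1:] @ x_old[i + 1:]
                    x[i] = (1 - omega) * x_old[i] + omega * (b[i] - sigma) / A[i, i]
                if np.linalg.norm(x - x_old, np.inf) < tol:
                    return x, k + 1
            return x, max_iter

        # Small Z-matrix example (non-positive off-diagonal entries); exact solution is [1, 1, 1].
        A = np.array([[4.0, -1.0, -1.0],
                      [-1.0, 4.0, -1.0],
                      [-1.0, -1.0, 4.0]])
        b = np.array([2.0, 2.0, 2.0])
        x_gs, it_gs = sor(A, b, omega=1.0)
        x_sor, it_sor = sor(A, b, omega=1.2)
        print(it_gs, it_sor, x_gs)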

  13. Radioimmunological determination of plasma progesterone. Methods - Results - Indications

    International Nuclear Information System (INIS)

    Gonon-Estrangin, Chantal.

    1978-10-01

    The aim of this work is to describe the radioimmunological determination of plasma progesterone carried out at the hormonology laboratory of the Grenoble University Hospital Centre (Professor E. Chambaz), to compare our results with those of the literature and to present the main clinical indications of this analysis. The measurement method has proved reproducible, specific (the steroid purification stage is unnecessary) and sensitive (detection: 10 picograms of progesterone per tube). In seven normally menstruating women our results agree with published values (in nanograms per millilitre, ng/ml): 0.07 ng/ml to 0.9 ng/ml in the follicular phase, from the start of menstruation until ovulation; then a rapid increase at ovulation with a maximum in the middle of the luteal phase (our values for this maximum range from 7.9 ng/ml to 21.7 ng/ml); and a gradual drop in progesterone secretion until the next menstrual period. In gynecology the radioimmunoassay of plasma progesterone is valuable for diagnostic and therapeutic purposes: - to diagnose the absence of a corpus luteum, - to judge the effectiveness of an ovulation induction treatment [fr

  14. Effect of Chemistry Triangle Oriented Learning Media on Cooperative, Individual and Conventional Method on Chemistry Learning Result

    Science.gov (United States)

    Latisma D, L.; Kurniawan, W.; Seprima, S.; Nirbayani, E. S.; Ellizar, E.; Hardeli, H.

    2018-04-01

    The purpose of this study was to determine which methods work well with Chemistry Triangle-oriented learning media. This quasi-experimental research involved first-grade senior high school students in six schools, namely two SMAN in Solok city, two in Pasaman and two SMKN in Pariaman. Sampling was done by cluster random sampling. Data were collected by test and analyzed by one-way ANOVA and the Kruskal-Wallis test. The results showed that, in the high schools in Solok, the learning outcomes of students taught by the cooperative method were better than those of students taught by the conventional and individual methods, both for students with high initial ability and for those with low ability. In the SMK, the overall learning outcomes of students taught by the conventional method were better than those of students taught by the cooperative and individual methods; for students with high initial ability, outcomes under the individual method were better than under the cooperative method, and for students with low initial ability there was no difference in learning outcomes among the cooperative, individual and conventional methods. In the high schools in Pasaman, no significant difference in learning outcomes was found among the three methods.

  15. Finnsjoen study site. Scope of activities and main results

    International Nuclear Information System (INIS)

    Ahlbom, K.; Andersson, J.E.; Andersson, Peter; Ittner, T.; Tiren, S.; Ljunggren, C.

    1992-12-01

    The Finnsjoen study site was selected in 1977 to provide input to the KBS-1 and KBS-2 performance assessments. The site was later used as a test site for testing new instruments and new site characterization methods, as well as a research site for studying mainly groundwater flow and groundwater transport. Altogether, the Finnsjoen studies have involved 11 cored boreholes, down to a maximum depth of 700 m, and extensive borehole geophysical, geochemical and geohydraulic measurements, as well as rock stress measurements and tracer tests. This report presents the scope of the Finnsjoen studies together with the main results. Conceptual uncertainties in assumptions and models are discussed with emphasis on the models used for the performance assessment SKB91. Of special interest for the Finnsjoen study site is the strong influence of a subhorizontal fracture zone on groundwater flow, transport and chemistry

  16. Some results about the dating of pre hispanic mexican ceramics by the thermoluminescence method

    International Nuclear Information System (INIS)

    Gonzalez M, P.; Mendoza A, D.; Ramirez L, A.; Schaaf, P.

    2004-01-01

    One of the most frequently recurring questions in Archaeometry concerns the age of the studied objects. The first dating methods were based on historical narratives, building styles and manufacturing techniques. However, it has been observed that, as a consequence of the continuous irradiation from naturally occurring radioisotopes and from cosmic rays, some materials, such as archaeological ceramics, accumulate a certain quantity of energy. These types of material can, in principle, be dated through the analysis of this accumulated energy; in that case, ceramic dating can be realized by thermoluminescence (TL) dating. In this work, results obtained by our research group on the TL dating of ceramics belonging to several archaeological zones, such as Edzna (Campeche), Calixtlahuaca and Teotenango (Mexico State) and Hervideros (Durango), are presented. The analysis was realized using the fine-grain mode in a Daybreak model 1100 TL reader system. The radioisotopes that contribute to the accumulated annual dose in the ceramic samples (40K, 238U, 232Th) were determined by means of techniques such as Energy Dispersive X-ray Spectroscopy (EDS) and Neutron Activation Analysis (NAA). Our results agree with results obtained through other methods. (Author) 7 refs., 2 tabs., 5 figs
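
    For illustration only, the basic thermoluminescence age relation implied above (age equals accumulated paleodose divided by the annual dose rate), as a small Python sketch with hypothetical values.

        def tl_age(equivalent_dose_gy, annual_dose_gy_per_year):
            """Basic TL age equation: age = accumulated paleodose / annual dose rate."""
            return equivalent_dose_gy / annual_dose_gy_per_year

        # Hypothetical values: 3.2 Gy accumulated dose, 4.0 mGy per year annual dose.
        print(tl_age(3.2, 4.0e-3))  # -> 800.0 years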

  17. A new method for studying the structure relaxation of amorphous matters

    International Nuclear Information System (INIS)

    Cao Xiaowen

    1989-11-01

    A new method for studying the structure relaxation of amorphous materials by means of the Hall effect is proposed. The structure relaxation of metal-type amorphous InSb has been studied experimentally. The experimental results show that this method is highly sensitive to structure relaxation, and that the mechanism of structure relaxation can be observed

  18. Comparative study of heuristic evaluation and usability testing methods.

    Science.gov (United States)

    Thyvalikakath, Thankam Paul; Monaco, Valerie; Thambuganipalle, Himabindu; Schleyer, Titus

    2009-01-01

    Usability methods, such as heuristic evaluation, cognitive walk-throughs and user testing, are increasingly used to evaluate and improve the design of clinical software applications. There is still some uncertainty, however, as to how those methods can be used to support the development process and evaluation in the most meaningful manner. In this study, we compared the results of a heuristic evaluation with those of formal user tests in order to determine which usability problems were detected by both methods. We conducted heuristic evaluation and usability testing on four major commercial dental computer-based patient records (CPRs), which together cover 80% of the market for chairside computer systems among general dentists. Both methods yielded strong evidence that the dental CPRs have significant usability problems. An average of 50% of empirically-determined usability problems were identified by the preceding heuristic evaluation. Some statements of heuristic violations were specific enough to precisely identify the actual usability problem that study participants encountered. Other violations were less specific, but still manifested themselves in usability problems and poor task outcomes. In this study, heuristic evaluation identified a significant portion of problems found during usability testing. While we make no assumptions about the generalizability of the results to other domains and software systems, heuristic evaluation may, under certain circumstances, be a useful tool to determine design problems early in the development cycle.

  19. A normalization method for combination of laboratory test results from different electronic healthcare databases in a distributed research network.

    Science.gov (United States)

    Yoon, Dukyong; Schuemie, Martijn J; Kim, Ju Han; Kim, Dong Ki; Park, Man Young; Ahn, Eun Kyoung; Jung, Eun-Young; Park, Dong Kyun; Cho, Soo Yeon; Shin, Dahye; Hwang, Yeonsoo; Park, Rae Woong

    2016-03-01

    Distributed research networks (DRNs) afford statistical power by integrating observational data from multiple partners for retrospective studies. However, laboratory test results across care sites are derived using different assays from varying patient populations, making it difficult to simply combine data for analysis. Additionally, existing normalization methods are not suitable for retrospective studies. We normalized laboratory results from different data sources by adjusting for heterogeneous clinico-epidemiologic characteristics of the data and called this the subgroup-adjusted normalization (SAN) method. Subgroup-adjusted normalization renders the means and standard deviations of distributions identical under population structure-adjusted conditions. To evaluate its performance, we compared SAN with existing methods for simulated and real datasets consisting of blood urea nitrogen, serum creatinine, hematocrit, hemoglobin, serum potassium, and total bilirubin. Various clinico-epidemiologic characteristics can be applied together in SAN. For simplicity of comparison, age and gender were used to adjust population heterogeneity in this study. In simulations, SAN had the lowest standardized difference in means (SDM) and Kolmogorov-Smirnov values for all tests, and SAN normalization performed better than normalization using the other methods. The SAN method is applicable in a DRN environment and should facilitate analysis of data integrated across DRN partners for retrospective observational studies. Copyright © 2015 John Wiley & Sons, Ltd.
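
    For illustration only, a simplified Python/pandas sketch of the general idea of subgroup-adjusted normalization as described above: within each clinico-epidemiologic subgroup (here, gender), each site's laboratory values are rescaled onto the mean and standard deviation of a reference site. This is a plain subgroup-wise z-score re-mapping written for the example, not the published SAN algorithm, and the data frame is hypothetical.

        import pandas as pd

        def subgroup_adjusted_normalize(df, value, site, subgroups, ref_site):
            """Within each subgroup, map each site's values so that their mean and
            standard deviation match those of the reference site's subgroup."""
            out = df.copy()
            # Per-site, per-subgroup statistics.
            grp = out.groupby([site] + subgroups)[value]
            out['_m'] = grp.transform('mean')
            out['_sd'] = grp.transform('std')
            # Reference-site statistics, broadcast to all rows by subgroup.
            ref = (out[out[site] == ref_site]
                   .groupby(subgroups)[value].agg(['mean', 'std'])
                   .rename(columns={'mean': '_m_ref', 'std': '_sd_ref'}))
            out = out.merge(ref, left_on=subgroups, right_index=True, how='left')
            out[value + '_norm'] = ((out[value] - out['_m']) / out['_sd']
                                    * out['_sd_ref'] + out['_m_ref'])
            return out.drop(columns=['_m', '_sd', '_m_ref', '_sd_ref'])

        # Hypothetical example: serum creatinine from two sites, adjusted by gender.
        data = pd.DataFrame({
            'site': ['A', 'A', 'A', 'A', 'B', 'B', 'B', 'B'],
            'gender': ['F', 'F', 'M', 'M', 'F', 'F', 'M', 'M'],
            'creatinine': [0.7, 0.9, 1.0, 1.2, 0.8, 1.0, 1.1, 1.3],
        })
        print(subgroup_adjusted_normalize(data, 'creatinine', 'site', ['gender'], ref_site='A'))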

  20. Study the Three Extraction Methods for HBV DNA to Use in PCR

    Directory of Open Access Journals (Sweden)

    N. Sheikh

    2004-07-01

    Full Text Available Diagnosis of hepatitis B is important because of its high prevalence. Recently, the PCR method has found greater interest among the different diagnostic methods. Several reports emphasize false negative results in laboratories using PCR. The aim of this study was to compare three different procedures for HBV DNA extraction. A total of 30 serum samples were received from Shariati hospital. Sera were taken from patients with chronic hepatitis who were HBs antigen positive and HBe antigen negative. The sensitivity of the guanidium hydrochloride method for extracting HBV DNA from serum was evaluated and compared with the phenol-chloroform and boiling methods. The diagnostic PCR kit, obtained from Cynagene, contained Taq polymerase, reaction mixture, dNTPs and reaction buffer. A 353 bp product was amplified using the amplification program provided in the PCR protocol. The comparison of results indicated that the procedure successfully amplified the designed products from hepatitis B in sera. The numbers of positive results were 16, 19 and 23, and the numbers of negative results were 14, 11 and 7, for the boiling, phenol-chloroform and guanidium hydrochloride extraction methods, respectively. PCR is the fastest and most accurate procedure to identify hepatitis B. The guanidium hydrochloride method was the most successful extraction procedure studied in this survey.

  1. Investigation of error estimation method of observational data and comparison method between numerical and observational results toward V and V of seismic simulation

    International Nuclear Information System (INIS)

    Suzuki, Yoshio; Kawakami, Yoshiaki; Nakajima, Norihiro

    2017-01-01

    The method to estimate errors included in observational data and the method to compare numerical results with observational results are investigated toward the verification and validation (V and V) of a seismic simulation. For the error estimation method, 144 papers published in the past 5 years (2010 to 2014) in the structural engineering and earthquake engineering fields, where descriptions of acceleration data are frequent, were surveyed. As a result, it was found that processes to remove components regarded as errors from observational data are used in about 30% of those papers. Errors are caused by the resolution, the linearity, the temperature coefficient for sensitivity, the temperature coefficient for zero shift, the transverse sensitivity, the seismometer property, aliasing, and so on. These processes can be exploited to estimate errors individually. For the method to compare numerical results with observational results, public materials of the ASME V and V Symposium 2012-2015, their references, and the above 144 papers were surveyed. As a result, it was found that six methods have mainly been proposed in existing research. Evaluating those methods against nine items, their advantages and disadvantages were tabulated. No method is yet well established, so it is necessary either to employ these methods while compensating for their disadvantages or to search for a novel method. (author)

  2. Daily radiotoxicological supervision of personnel at the Pierrelatte industrial complex. Methods and results

    International Nuclear Information System (INIS)

    Chalabreysse, Jacques.

    1978-05-01

    A 13-year experience gained from the daily radiotoxicological supervision of personnel at the PIERRELATTE industrial complex is presented. This study is divided into two parts. Part one is theoretical: a bibliographical synthesis of all scattered documents and publications, so that a homogeneous survey of the literature on the subject is available. Part two reviews the experience gained in the professional environment: laboratory measurements and analyses (development of methods and daily applications); mathematical formulae to answer the first questions which arise for an individual liable to be contaminated; and results obtained at PIERRELATTE [fr

  3. Cutaneous blood flow. A comparative study between the thermal recovery method and the radioxenon clearance method

    Energy Technology Data Exchange (ETDEWEB)

    Tavares, C M; Ferreira, J M; Fernandes, F V

    1975-01-01

    Since 1968 a thermal recovery method for studying the cutaneous circulation has been utilized in the detection of skin circulation changes caused by certain pharmacological agents or by some pathological conditions. This method is based on the determination of the thermal recovery of a small area of skin previously cooled. In this work we present the results of a comparative analysis between the thermal recovery method and the clearance of radioactive xenon injected intracutaneously. The study was performed on the distal extremity of the lower limbs in 16 normal subjects, 16 hyperthyroid patients with increased cutaneous temperature and 11 patients with presumably low cutaneous blood flow (3 patients with hypothyroidism and 8 with obstructive arteriosclerosis).

  4. Resource costing for multinational neurologic clinical trials: methods and results.

    Science.gov (United States)

    Schulman, K; Burke, J; Drummond, M; Davies, L; Carlsson, P; Gruger, J; Harris, A; Lucioni, C; Gisbert, R; Llana, T; Tom, E; Bloom, B; Willke, R; Glick, H

    1998-11-01

    We present the results of a multinational resource costing study for a prospective economic evaluation of a new medical technology for treatment of subarachnoid hemorrhage within a clinical trial. The study describes a framework for the collection and analysis of international resource cost data that can contribute to a consistent and accurate intercountry estimation of cost. Of the 15 countries that participated in the clinical trial, we collected cost information in the following seven: Australia, France, Germany, the UK, Italy, Spain, and Sweden. The collection of cost data in these countries was structured through the use of worksheets to provide accurate and efficient cost reporting. We converted total average costs to average variable costs and then aggregated the data to develop study unit costs. When unit costs were unavailable, we developed an index table, based on a market-basket approach, to estimate unit costs. To estimate the cost of a given procedure, the market-basket estimation process required that cost information be available for at least one country. When cost information was unavailable in all countries for a given procedure, we estimated costs using a method based on physician-work and practice-expense resource-based relative value units. Finally, we converted study unit costs to a common currency using purchasing power parity measures. Through this costing exercise we developed a set of unit costs for patient services and per diem hospital services. We conclude by discussing the implications of our costing exercise and suggest guidelines to facilitate more effective multinational costing exercises.

  5. Image restoration by the method of convex projections: part 2 applications and numerical results.

    Science.gov (United States)

    Sezan, M I; Stark, H

    1982-01-01

    The image restoration theory discussed in a previous paper by Youla and Webb [1] is applied to a simulated image and the results are compared with those of the well-known Gerchberg-Papoulis algorithm. The results show that the method of image restoration by projection onto convex sets, by providing a convenient technique for utilizing a priori information, performs significantly better than the Gerchberg-Papoulis method.
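
    For illustration only, a minimal NumPy sketch of restoration by alternating projections onto two convex sets (consistency with known Fourier samples, and non-negativity on a known spatial support); the constraint sets are chosen for the example and are not necessarily those used by Youla and Webb.

        import numpy as np

        def pocs_restore(measured_spectrum, known_mask, support_mask, n_iter=200):
            """Alternating projections onto two convex sets:
            C1 = images whose Fourier coefficients match the measured ones where known,
            C2 = non-negative images confined to a known spatial support."""
            x = np.zeros(support_mask.shape)
            for _ in range(n_iter):
                # Project onto C1: reimpose the known Fourier samples.
                X = np.fft.fft2(x)
                X[known_mask] = measured_spectrum[known_mask]
                x = np.real(np.fft.ifft2(X))
                # Project onto C2: enforce support and non-negativity.
                x = np.where(support_mask, np.maximum(x, 0.0), 0.0)
            return x

    The arguments are a complex 2-D array of measured Fourier data, a boolean mask marking which Fourier samples are known, and a boolean spatial-support mask of the same shape.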

  6. A Pragmatic Smoothing Method for Improving the Quality of the Results in Atomic Spectroscopy

    Science.gov (United States)

    Bennun, Leonardo

    2017-07-01

    A new smoothing method for improving the identification and quantification of spectral functions, based on prior knowledge of the signals that are expected to be quantified, is presented. These signals are used as weighting coefficients in the smoothing algorithm. This smoothing method was conceived to be applied in atomic and nuclear spectroscopies, preferably in those techniques where net counts are proportional to acquisition time, such as particle-induced X-ray emission (PIXE) and other X-ray fluorescence spectroscopic methods. This algorithm, when properly applied, distorts neither the shape nor the intensity of the signal, so it is well suited for all kinds of spectroscopic techniques. The method is extremely effective at reducing high-frequency noise in the signal, much more so than a single rectangular smooth of the same width. As with all smoothing techniques, the proposed method improves the precision of the results, but in this case we also found a systematic improvement in accuracy. We still have to evaluate the improvement in the quality of the results when this method is applied to real experimental data. We expect better characterization of the net area quantification of the peaks, and smaller detection and quantification limits. We have applied this method to signals that obey Poisson statistics, but with the same ideas and criteria it could be applied to time series. In the general case, when this algorithm is applied to experimental results, the characteristic functions required for this weighted smoothing method should be obtained from a system with strong stability. If the sought signals are not perfectly clean, this method should be applied carefully.
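
    For illustration only, a minimal Python sketch of the general idea of smoothing with weights taken from an expected signal shape rather than a flat rectangular window; the Gaussian reference peak and the Poisson test spectrum are invented for the example and this is not the authors' exact algorithm.

        import numpy as np

        def weighted_smooth(counts, reference_shape):
            """Smooth a spectrum with weights taken from a known reference peak shape
            (normalized to unit sum), instead of a flat rectangular window."""
            w = np.asarray(reference_shape, dtype=float)
            w = w / w.sum()
            return np.convolve(counts, w, mode='same')

        # Hypothetical Gaussian reference peak (sigma of about 2 channels) as the weight set.
        channels = np.arange(-7, 8)
        reference = np.exp(-0.5 * (channels / 2.0) ** 2)
        noisy = np.random.default_rng(1).poisson(lam=50, size=512).astype(float)
        smoothed = weighted_smooth(noisy, reference)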

  7. Comparison of Results according to the treatment Method in Maxillary Sinus Carcinoma

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Woong Ki; Jo, Jae Sik; Ahn, Sung Ja; Nam, Taek Keun; Nah, Byung Sik [Chonnam National University College of Medicine, Kwangju (Korea, Republic of); Park, Seung Jin [Gyeongsang National Univ., Jinju (Korea, Republic of)

    1995-03-15

    Purpose: A retrospective analysis was performed to investigate the proper management of maxillary sinus carcinoma. Materials and Methods: The authors analysed 33 patients with squamous cell carcinoma of the maxillary sinus treated at Chonnam University Hospital from January 1986 to December 1992. There were 24 men and 9 women with a median age of 55 years. According to the 1988 AJCC TNM system, one patient was T2, 10 patients were T3 and 22 patients were T4. Cervical lymph node metastasis was observed in 5 patients (N1: 4/33, N2b: 1/33). Patients were classified into 3 groups according to management method. The first group, named 'FAR' (16 patients), consisted of preoperative intra-arterial chemotherapy with 5-fluorouracil (5-FU; mean total dosage 3078 mg) through the superficial temporal artery with concurrent radiation (mean dose delivered 3433 cGy, daily 180-200 cGy) and vitamin A (50,000 IU daily), followed by total maxillectomy and postoperative radiation therapy (mean dose 2351 cGy). The second group, named 'SR' (7 patients), consisted of total maxillectomy followed by postoperative radiation therapy (mean dose 5920 cGy). The third group, named 'R' (6 patients), was treated with radiation alone (mean dose 7164 cGy). The Kaplan-Meier product limit method was used for survival analysis and the Mantel-Cox test was performed for the significance of survival differences between groups. Results: Local recurrence free survival rates at the end of 2 years were 100%, 5-% and 0% in the FAR, SR and R groups, respectively. Disease free survival rates at 2 years were 88.9%, 40% and 50% in the FAR, SR and R groups, respectively. There were statistically significant differences between the FAR and SR groups and between the FAR and R groups in local recurrence free, disease free and overall survival rates, but the difference in each survival rate between the SR and R groups was not significant. Conclusion: In this study the FAR group showed better results than the SR or R group. In the

  8. Comparison of Results according to the treatment Method in Maxillary Sinus Carcinoma

    International Nuclear Information System (INIS)

    Chung, Woong Ki; Jo, Jae Sik; Ahn, Sung Ja; Nam, Taek Keun; Nah, Byung Sik; Park, Seung Jin

    1995-01-01

    Purpose: A retrospective analysis was performed to investigate the proper management of maxillary sinus carcinoma. Materials and Methods: The authors analysed 33 patients with squamous cell carcinoma of the maxillary sinus treated at Chonnam University Hospital from January 1986 to December 1992. There were 24 men and 9 women with a median age of 55 years. According to the 1988 AJCC TNM system, one patient was T2, 10 patients were T3 and 22 patients were T4. Cervical lymph node metastasis was observed in 5 patients (N1: 4/33, N2b: 1/33). Patients were classified into 3 groups according to management method. The first group, named 'FAR' (16 patients), consisted of preoperative intra-arterial chemotherapy with 5-fluorouracil (5-FU; mean total dosage 3078 mg) through the superficial temporal artery with concurrent radiation (mean dose delivered 3433 cGy, daily 180-200 cGy) and vitamin A (50,000 IU daily), followed by total maxillectomy and postoperative radiation therapy (mean dose 2351 cGy). The second group, named 'SR' (7 patients), consisted of total maxillectomy followed by postoperative radiation therapy (mean dose 5920 cGy). The third group, named 'R' (6 patients), was treated with radiation alone (mean dose 7164 cGy). The Kaplan-Meier product limit method was used for survival analysis and the Mantel-Cox test was performed for the significance of survival differences between groups. Results: Local recurrence free survival rates at the end of 2 years were 100%, 5-% and 0% in the FAR, SR and R groups, respectively. Disease free survival rates at 2 years were 88.9%, 40% and 50% in the FAR, SR and R groups, respectively. There were statistically significant differences between the FAR and SR groups and between the FAR and R groups in local recurrence free, disease free and overall survival rates, but the difference in each survival rate between the SR and R groups was not significant. Conclusion: In this study the FAR group showed better results than the SR or R group. In the future prospective randomized

  9. Radionuclide methods application in cardiac studies

    International Nuclear Information System (INIS)

    Kotina, E.D.; Ploskikh, V.A.; Babin, A.V.

    2013-01-01

    Radionuclide methods are among the most modern methods of functional diagnostics of diseases of the cardiovascular system and require the use of mathematical methods for processing and analysing the data obtained during the investigation. The study is carried out by means of single-photon emission computed tomography (SPECT). Mathematical methods and software for SPECT data processing have been developed. This software allows physiologically meaningful indicators to be defined for cardiac studies

  10. The quality of reporting methods and results of cost-effectiveness analyses in Spain: a methodological systematic review.

    Science.gov (United States)

    Catalá-López, Ferrán; Ridao, Manuel; Alonso-Arroyo, Adolfo; García-Altés, Anna; Cameron, Chris; González-Bermejo, Diana; Aleixandre-Benavent, Rafael; Bernal-Delgado, Enrique; Peiró, Salvador; Tabarés-Seisdedos, Rafael; Hutton, Brian

    2016-01-07

    Cost-effectiveness analysis has been recognized as an important tool to determine the efficiency of healthcare interventions and services. There is a need for evaluating the reporting of methods and results of cost-effectiveness analyses and establishing their validity. We describe and examine the reporting characteristics of methods and results of cost-effectiveness analyses conducted in Spain over more than two decades. A methodological systematic review was conducted with the information obtained through an updated literature review in PubMed and complementary databases (e.g. Scopus, ISI Web of Science, the National Health Service Economic Evaluation Database (NHS EED) and Health Technology Assessment (HTA) databases from the Centre for Reviews and Dissemination (CRD), Índice Médico Español (IME) and Índice Bibliográfico Español en Ciencias de la Salud (IBECS)). We identified cost-effectiveness analyses conducted in Spain that used quality-adjusted life years (QALYs) as outcome measures (period 1989-December 2014). Two reviewers independently extracted the data from each paper. The data were analysed descriptively. In total, 223 studies were included. Very few studies (10; 4.5%) reported working from a protocol. Most studies (200; 89.7%) were simulation models and included a median of 1000 patients. Only 105 (47.1%) studies presented an adequate description of the characteristics of the target population. Most study interventions were categorized as therapeutic (189; 84.8%) and nearly half (111; 49.8%) considered an active alternative as the comparator. Effectiveness data were derived from a single study in 87 (39.0%) reports, and only a few (40; 17.9%) used evidence synthesis-based estimates. Few studies (42; 18.8%) reported a full description of methods for QALY calculation. The majority of the studies (147; 65.9%) reported that the study intervention produced "more costs and more QALYs" than the comparator. Most studies (200; 89.7%) reported favourable

  11. Paleomagnetic intensity of Aso pyroclastic flows: Additional results with LTD-DHT Shaw method, Thellier method with pTRM-tail check

    Science.gov (United States)

    Maruuchi, T.; Shibuya, H.

    2009-12-01

    , and 42 specimens were submitted to Thellier experiments. Twelve specimens from 4 sites passed the same criteria as Aso-2 and yielded a mean paleointensity of 43.1±1.4 uT. It again agrees with the value (45.6±1.7 uT) of Takai et al. (2002). The LTD-DHT Shaw method was also applied to 12 specimens from 3 sites, and 4 passed the criteria, giving 38.2±1.7 uT. Although this is a little smaller than the Thellier results, it is considerably larger than the Sint-800 at the time of Aso-4. The Aso-1 result in this study is more consistent with the Sint-800 at that time than that of Takai et al. (2002). For Aso-2 and Aso-4, however, the new reliable paleointensity results suggest that the discrepancy from the Sint-800 cannot be attributed to experimental problems.

  12. A method for modeling laterally asymmetric proton beamlets resulting from collimation

    Energy Technology Data Exchange (ETDEWEB)

    Gelover, Edgar; Wang, Dongxu; Flynn, Ryan T.; Hyer, Daniel E. [Department of Radiation Oncology, University of Iowa, 200 Hawkins Drive, Iowa City, Iowa 52242 (United States); Hill, Patrick M. [Department of Human Oncology, University of Wisconsin, 600 Highland Avenue, Madison, Wisconsin 53792 (United States); Gao, Mingcheng; Laub, Steve; Pankuch, Mark [Division of Medical Physics, CDH Proton Center, 4455 Weaver Parkway, Warrenville, Illinois 60555 (United States)

    2015-03-15

    Purpose: To introduce a method to model the 3D dose distribution of laterally asymmetric proton beamlets resulting from collimation. The model enables rapid beamlet calculation for spot scanning (SS) delivery using a novel penumbra-reducing dynamic collimation system (DCS) with two pairs of trimmers oriented perpendicular to each other. Methods: Trimmed beamlet dose distributions in water were simulated with MCNPX and the collimating effects noted in the simulations were validated by experimental measurement. The simulated beamlets were modeled analytically using integral depth dose curves along with an asymmetric Gaussian function to represent fluence in the beam’s eye view (BEV). The BEV parameters consisted of Gaussian standard deviations (sigmas) along each primary axis (σx1, σx2, σy1, σy2) together with the spatial location of the maximum dose (μx, μy). Percent depth dose variation with trimmer position was accounted for with a depth-dependent correction function. Beamlet growth with depth was accounted for by combining the in-air divergence with Hong’s fit of the Highland approximation along each axis in the BEV. Results: The beamlet model showed excellent agreement with the Monte Carlo simulation data used as a benchmark. The overall passing rate for a 3D gamma test with 3%/3 mm passing criteria was 96.1% between the analytical model and Monte Carlo data in an example treatment plan. Conclusions: The analytical model is capable of accurately representing individual asymmetric beamlets resulting from use of the DCS. This method enables integration of the DCS into a treatment planning system to perform dose computation in patient datasets. The method could be generalized for use with any SS collimation system in which blades, leaves, or trimmers are used to laterally sharpen beamlets.

  13. A method for modeling laterally asymmetric proton beamlets resulting from collimation

    International Nuclear Information System (INIS)

    Gelover, Edgar; Wang, Dongxu; Flynn, Ryan T.; Hyer, Daniel E.; Hill, Patrick M.; Gao, Mingcheng; Laub, Steve; Pankuch, Mark

    2015-01-01

    Purpose: To introduce a method to model the 3D dose distribution of laterally asymmetric proton beamlets resulting from collimation. The model enables rapid beamlet calculation for spot scanning (SS) delivery using a novel penumbra-reducing dynamic collimation system (DCS) with two pairs of trimmers oriented perpendicular to each other. Methods: Trimmed beamlet dose distributions in water were simulated with MCNPX and the collimating effects noted in the simulations were validated by experimental measurement. The simulated beamlets were modeled analytically using integral depth dose curves along with an asymmetric Gaussian function to represent fluence in the beam’s eye view (BEV). The BEV parameters consisted of Gaussian standard deviations (sigmas) along each primary axis (σx1, σx2, σy1, σy2) together with the spatial location of the maximum dose (μx, μy). Percent depth dose variation with trimmer position was accounted for with a depth-dependent correction function. Beamlet growth with depth was accounted for by combining the in-air divergence with Hong’s fit of the Highland approximation along each axis in the BEV. Results: The beamlet model showed excellent agreement with the Monte Carlo simulation data used as a benchmark. The overall passing rate for a 3D gamma test with 3%/3 mm passing criteria was 96.1% between the analytical model and Monte Carlo data in an example treatment plan. Conclusions: The analytical model is capable of accurately representing individual asymmetric beamlets resulting from use of the DCS. This method enables integration of the DCS into a treatment planning system to perform dose computation in patient datasets. The method could be generalized for use with any SS collimation system in which blades, leaves, or trimmers are used to laterally sharpen beamlets

  14. A method for modeling laterally asymmetric proton beamlets resulting from collimation

    Science.gov (United States)

    Gelover, Edgar; Wang, Dongxu; Hill, Patrick M.; Flynn, Ryan T.; Gao, Mingcheng; Laub, Steve; Pankuch, Mark; Hyer, Daniel E.

    2015-01-01

    Purpose: To introduce a method to model the 3D dose distribution of laterally asymmetric proton beamlets resulting from collimation. The model enables rapid beamlet calculation for spot scanning (SS) delivery using a novel penumbra-reducing dynamic collimation system (DCS) with two pairs of trimmers oriented perpendicular to each other. Methods: Trimmed beamlet dose distributions in water were simulated with MCNPX and the collimating effects noted in the simulations were validated by experimental measurement. The simulated beamlets were modeled analytically using integral depth dose curves along with an asymmetric Gaussian function to represent fluence in the beam’s eye view (BEV). The BEV parameters consisted of Gaussian standard deviations (sigmas) along each primary axis (σx1,σx2,σy1,σy2) together with the spatial location of the maximum dose (μx,μy). Percent depth dose variation with trimmer position was accounted for with a depth-dependent correction function. Beamlet growth with depth was accounted for by combining the in-air divergence with Hong’s fit of the Highland approximation along each axis in the BEV. Results: The beamlet model showed excellent agreement with the Monte Carlo simulation data used as a benchmark. The overall passing rate for a 3D gamma test with 3%/3 mm passing criteria was 96.1% between the analytical model and Monte Carlo data in an example treatment plan. Conclusions: The analytical model is capable of accurately representing individual asymmetric beamlets resulting from use of the DCS. This method enables integration of the DCS into a treatment planning system to perform dose computation in patient datasets. The method could be generalized for use with any SS collimation system in which blades, leaves, or trimmers are used to laterally sharpen beamlets. PMID:25735287
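
    For illustration only, a minimal NumPy sketch of a separable asymmetric Gaussian fluence profile of the kind described above, with a different sigma on each side of the maximum along each beam's-eye-view axis; the normalization, depth dependence and parameter values are omitted or invented here.

        import numpy as np

        def asymmetric_gaussian_fluence(x, y, mu_x, mu_y, sx1, sx2, sy1, sy2):
            """Separable asymmetric Gaussian: a different sigma is used on each side
            of the fluence maximum along each beam's-eye-view axis."""
            x = np.asarray(x, dtype=float)
            y = np.asarray(y, dtype=float)
            sig_x = np.where(x < mu_x, sx1, sx2)
            sig_y = np.where(y < mu_y, sy1, sy2)
            return (np.exp(-0.5 * ((x - mu_x) / sig_x) ** 2)
                    * np.exp(-0.5 * ((y - mu_y) / sig_y) ** 2))

        # Hypothetical trimmed beamlet: sharper penumbra on the trimmed (+x, +y) sides.
        xx, yy = np.meshgrid(np.linspace(-15, 15, 61), np.linspace(-15, 15, 61))
        fluence = asymmetric_gaussian_fluence(xx, yy, mu_x=0.0, mu_y=0.0,
                                              sx1=4.0, sx2=2.5, sy1=4.0, sy2=2.5)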

  15. a comparison of methods in a behaviour study of the south african ...

    African Journals Online (AJOL)

    A COMPARISON OF METHODS IN A BEHAVIOUR STUDY OF THE ... Three methods are outlined in this paper and the results obtained from each method were .... There was definitely no aggressive response towards the Sky pointing mate.

  16. Ecological content validation of the Information Assessment Method for parents (IAM-parent): A mixed methods study.

    Science.gov (United States)

    Bujold, M; El Sherif, R; Bush, P L; Johnson-Lafleur, J; Doray, G; Pluye, P

    2018-02-01

    This mixed methods study content validated the Information Assessment Method for parents (IAM-parent) that allows users to systematically rate and comment on online parenting information. Quantitative data and results: 22,407 IAM ratings were collected; of the initial 32 items, descriptive statistics showed that 10 had low relevance. Qualitative data and results: IAM-based comments were collected, and 20 IAM users were interviewed (maximum variation sample); the qualitative data analysis assessed the representativeness of IAM items, and identified items with problematic wording. Researchers, the program director, and Web editors integrated quantitative and qualitative results, which led to a shorter and clearer IAM-parent. Copyright © 2017 The Authors. Published by Elsevier Ltd.. All rights reserved.

  17. Method of research and study of uranium deposits

    International Nuclear Information System (INIS)

    Lenoble, A.

    1955-01-01

    In the first part, the author gives a brief retrospective of the evaluations of uranium deposits in the French Union. The author established a method of prospecting and study, modifiable at any time according to experience and results, permitting a general inventory of uranium resources on the territory. The method is based on: 1 - the determination of geological guides in order to identify the most promising deposits, 2 - the definition of a methodology adapted to every step of the research, 3 - the choice of the equipment adapted to each of these steps. This method, originally established for prospecting in crystalline massifs, is adaptable to the prospecting of sedimentary formations. (M.B.) [fr

  18. Photovoice and Clubfoot: Using a Participatory Research Method to Study Caregiver Adherence to the Ponseti Method in Perú.

    Science.gov (United States)

    Pletch, Alison; Morcuende, Jose; Barriga, Hersey; Segura, Jose; Salas, Alexandro

    2015-01-01

    The Ponseti Method of casting and bracing is the gold-standard treatment for congenital clubfoot in young children. Despite its many advantages, outcomes depend heavily on caregiver adherence to the treatment protocol. Our study explored the experience caregivers had with the Ponseti method using a photography-based participatory research method known as Photovoice. Five adult caregivers were recruited from families pursuing clubfoot treatment at the Children's Hospital in Lima, Perú, during June, 2013. Each was provided a digital camera and training and agreed to photograph their experiences caring for a child undergoing Ponseti Method clubfoot treatment. Participants held four to five weekly one-on-one meetings with the researcher to discuss their photos. They also attended a group meeting at the end of the study to view and discuss photos of other participants. Using photos collected at this meeting, participants identified themes that summarized their experiences with treatment and discussed ways to improve delivery of care in order to support caregiver adherence to treatment. These results were presented to clinicians in Lima who use the Ponseti Method. The Photovoice method allowed researchers and participants to study the experience caregivers have with the Ponseti Method, and results can be used to inform the design of patient-based care models.

  19. [Adverse events management. Methods and results of a development project].

    Science.gov (United States)

    Rabøl, Louise Isager; Jensen, Elisabeth Brøgger; Hellebek, Annemarie H; Pedersen, Beth Lilja

    2006-11-27

    This article describes the methods and results of a project in the Copenhagen Hospital Corporation (H:S) on preventing adverse events. The aim of the project was to raise awareness about patient safety, test a reporting system for adverse events, develop and test methods of analysis of events and propagate ideas about how to prevent adverse events. H:S developed an action plan and a reporting system for adverse events, founded an organization and developed an educational program on theories and methods of learning from adverse events for both leaders and employees. During the three-year period from 1 January 2002 to 31 December 2004, the H:S staff reported 6011 adverse events. In the same period, the organization completed 92 root cause analyses. More than half of these dealt with events that had been optional to report, the other half with events that had been mandatory to report. The number of reports and the front-line staff's attitude towards reporting show that the H:S succeeded in establishing a safety culture. Future work should be centred on developing and testing methods that will prevent adverse events from happening. The objective is to suggest and complete preventive initiatives which will help increase patient safety.

  20. Trojan Horse Method: Recent Results

    International Nuclear Information System (INIS)

    Pizzone, R. G.; Spitaleri, C.

    2008-01-01

    Owing to the presence of the Coulomb barrier at astrophysically relevant kinetic energies, it is very difficult, or sometimes impossible, to measure astrophysical reaction rates in the laboratory. This is why different indirect techniques are being used along with direct measurements. The THM is a unique indirect technique allowing one to measure astrophysical rearrangement reactions down to astrophysically relevant energies. The basic principle and a review of the main applications of the Trojan Horse Method are presented. The applications aiming at the extraction of the bare astrophysical factor Sb(E) and the electron screening potentials Ue for several two-body processes are discussed

  1. Testing the ISP method with the PARIO device: Accuracy of results and influence of homogenization technique

    Science.gov (United States)

    Durner, Wolfgang; Huber, Magdalena; Yangxu, Li; Steins, Andi; Pertassek, Thomas; Göttlein, Axel; Iden, Sascha C.; von Unold, Georg

    2017-04-01

    The particle-size distribution (PSD) is one of the main properties of soils. To determine the proportions of the fine fractions silt and clay, sedimentation experiments are used. Most common are the Pipette and Hydrometer methods. Both need manual sampling at specific times; both are thus time-demanding and rely on experienced operators. Durner et al. (Durner, W., S.C. Iden, and G. von Unold (2017): The integral suspension pressure method (ISP) for precise particle-size analysis by gravitational sedimentation, Water Resources Research, doi:10.1002/2016WR019830) recently developed the integral suspension pressure (ISP) method, which is implemented in the METER Group device PARIO™. This new method estimates continuous PSDs from sedimentation experiments by recording the temporal evolution of the suspension pressure at a certain measurement depth in a sedimentation cylinder. It requires no manual interaction after start and thus no specialized training of the lab personnel. The aim of this study was to test the precision and accuracy of the new method with a variety of materials, to answer the following research questions: (1) Are the results obtained by PARIO reliable and stable? (2) Are the results affected by the initial mixing technique used to homogenize the suspension, or by the presence of sand in the experiment? (3) Are the results identical to those obtained with the Pipette method as the reference method? The experiments were performed with a pure quartz silt material and four real soil materials. PARIO measurements were done repetitively on the same samples in a temperature-controlled lab to characterize the repeatability of the measurements. Subsequently, the samples were investigated by the pipette method to validate the results. We found that the statistical error for the silt fraction from replicate and repetitive measurements was in the range of 1% for the quartz material to 3% for soil materials. Since the sand fractions, as in any sedimentation method, must
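
    For illustration only, a small Python sketch of the Stokes settling relation that underlies gravitational sedimentation methods such as ISP/PARIO and the Pipette method; the particle and fluid properties in the example are typical assumed values, not data from this study.

        def stokes_settling_time(diameter_m, depth_m,
                                 rho_particle=2650.0, rho_fluid=1000.0,
                                 viscosity=1.0e-3, g=9.81):
            """Time for a sphere of a given diameter to settle to the measurement
            depth according to Stokes' law (laminar settling)."""
            velocity = (rho_particle - rho_fluid) * g * diameter_m ** 2 / (18.0 * viscosity)
            return depth_m / velocity

        # A 2 micrometre clay-sized particle settling to a 20 cm measurement depth:
        print(stokes_settling_time(2e-6, 0.20) / 3600.0, "hours")  # roughly 15 hours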

  2. Folic Acid Supplementation and Preterm Birth: Results from Observational Studies

    Directory of Open Access Journals (Sweden)

    Elena Mantovani

    2014-01-01

    Full Text Available Introduction. Folic acid (FA supplementation is recommended worldwide in the periconceptional period for the prevention of neural tube defects. Due to its involvement in a number of cellular processes, its role in other pregnancy outcomes such as miscarriage, recurrent miscarriage, low birth weight, preterm birth (PTB, preeclampsia, abruptio placentae, and stillbirth has been investigated. PTB is a leading cause of perinatal mortality and morbidity; therefore its association with FA supplementation is of major interest. The analysis of a small number of randomized clinical trials (RCTs has not found a beneficial role of FA in reducing the rate of PTBs. Aim of the Study. The aim of this review was to examine the results from recent observational studies about the effect of FA supplementation on PTB. Materials and Methods. We carried out a search on Medline and by manual search of the observational studies from 2009 onwards that analyzed the rate of PTB in patients who received supplementation with FA before and/or throughout pregnancy. Results. The results from recent observational studies suggest a slight reduction of PTBs that is not consistent with the results from RCTs. Further research is needed to better understand the role of FA supplementation before and during pregnancy in PTB.

  3. Advanced methods for the study of PWR cores

    International Nuclear Information System (INIS)

    Lambert, M.; Salvatores, St.; Ferrier, A.; Pelet, J.; Nicaise, N.; Pouliquen, J.Y.; Foret, F.; Chauliac, C.; Johner, J.; Cohen, Ch.

    2003-01-01

    This document gathers the transparencies presented at the 6th technical session of the French nuclear energy society (SFEN) in October 2003. The transparencies of the annual meeting are presented in the introductory part: 1 - status of the French nuclear park: nuclear energy results, management of an exceptional climatic situation: the heat wave of summer 2003 and power generation (J.C. Barral); 2 - status of research on controlled thermonuclear fusion (J. Johner). Then follows the technical session on advanced methods for the study of PWR reactor cores: 1 - the evolution approach to study methodologies (M. Lambert, J. Pelet); 2 - the point of view of the nuclear safety authority (D. Brenot); 3 - the improved decoupled methodology for steam pipe rupture (S. Salvatores, J.Y. Pouliquen); 4 - the MIR method for pellet-clad interaction (renovated IPG methodology) (E. Baud, C. Royere); 5 - the improved fuel management (IFM) studies for Koeberg (C. Cohen); 6 - principle of the methods of accident study implemented for the European pressurized reactor (EPR) (F. Foret, A. Ferrier); 7 - accident studies with the EPR, steam pipe rupture (N. Nicaise, S. Salvatores); 8 - the co-development platform, a new generation of software tools for the new methodologies (C. Chauliac). (J.S.)

  4. New methods for estimating follow-up rates in cohort studies

    Directory of Open Access Journals (Sweden)

    Xiaonan Xue

    2017-12-01

    Full Text Available Abstract Background The follow-up rate, a standard index of the completeness of follow-up, is important for assessing the validity of a cohort study. A common method for estimating the follow-up rate, the “Percentage Method”, defined as the fraction of all enrollees who developed the event of interest or had complete follow-up, can severely underestimate the degree of follow-up. Alternatively, the median follow-up time does not indicate the completeness of follow-up, and the reverse Kaplan-Meier based method and Clark’s Completeness Index (CCI also have limitations. Methods We propose a new definition for the follow-up rate, the Person-Time Follow-up Rate (PTFR, which is the observed person-time divided by total person-time assuming no dropouts. The PTFR cannot be calculated directly since the event times for dropouts are not observed. Therefore, two estimation methods are proposed: a formal person-time method (FPT in which the expected total follow-up time is calculated using the event rate estimated from the observed data, and a simplified person-time method (SPT that avoids estimation of the event rate by assigning full follow-up time to all events. Simulations were conducted to measure the accuracy of each method, and each method was applied to a prostate cancer recurrence study dataset. Results Simulation results showed that the FPT has the highest accuracy overall. In most situations, the computationally simpler SPT and CCI methods are only slightly biased. When applied to a retrospective cohort study of cancer recurrence, the FPT, CCI and SPT showed substantially greater 5-year follow-up than the Percentage Method (92%, 92% and 93% vs 68%. Conclusions The Person-time methods correct a systematic error in the standard Percentage Method for calculating follow-up rates. The easy to use SPT and CCI methods can be used in tandem to obtain an accurate and tight interval for PTFR. However, the FPT is recommended when event rates and
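
    For illustration only, a minimal Python sketch of the simplified person-time (SPT) rate as defined above, in which subjects who develop the event of interest are credited with their full potential follow-up time; the five-subject cohort is hypothetical and this is only one reading of the definition given in the abstract.

        import numpy as np

        def spt_follow_up_rate(observed_time, had_event, potential_time):
            """Simplified person-time (SPT) follow-up rate: subjects with the event
            are credited with their full potential follow-up time; the denominator
            is the total potential follow-up time assuming no dropouts."""
            observed_time = np.asarray(observed_time, dtype=float)
            potential_time = np.asarray(potential_time, dtype=float)
            had_event = np.asarray(had_event, dtype=bool)
            credited = np.where(had_event, potential_time, observed_time)
            return credited.sum() / potential_time.sum()

        # Hypothetical cohort of five subjects, each with 5 years of potential follow-up.
        obs = [5.0, 2.0, 3.5, 5.0, 1.0]        # years actually observed
        event = [False, True, False, False, False]
        potential = [5.0, 5.0, 5.0, 5.0, 5.0]
        print(spt_follow_up_rate(obs, event, potential))  # 0.78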

  5. [Case-non case studies: Principles, methods, bias and interpretation].

    Science.gov (United States)

    Faillie, Jean-Luc

    2017-10-31

    Case-non case studies belong to the methods assessing drug safety by analyzing the disproportionality of notifications of adverse drug reactions in pharmacovigilance databases. First used in the 1980s, this design has seen a significant increase in use over the last few decades. The principle of the case-non case study is to compare drug exposure in cases of a studied adverse reaction with that in reports of other adverse reactions, called "non-cases". Results are presented in the form of a reporting odds ratio (ROR), the interpretation of which makes it possible to identify drug safety signals. This article describes the principle of the case-non case study, the method of calculating the ROR and its confidence interval, the different modalities of analysis and how to interpret its results with regard to the advantages and limitations of this design. Copyright © 2017 Société française de pharmacologie et de thérapeutique. Published by Elsevier Masson SAS. All rights reserved.
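
    For illustration only, a minimal Python sketch of the reporting odds ratio (ROR) with a 95% confidence interval computed by the usual log (Woolf) method from a 2x2 exposure table of cases and non-cases; the counts in the example are hypothetical.

        import math

        def reporting_odds_ratio(a, b, c, d):
            """ROR with a 95% confidence interval (log / Woolf method).

            a: cases (reports of the studied reaction) exposed to the drug
            b: non-cases exposed to the drug
            c: cases not exposed
            d: non-cases not exposed
            """
            ror = (a * d) / (b * c)
            se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
            lo = math.exp(math.log(ror) - 1.96 * se_log)
            hi = math.exp(math.log(ror) + 1.96 * se_log)
            return ror, (lo, hi)

        # Hypothetical counts from a pharmacovigilance database.
        print(reporting_odds_ratio(a=40, b=800, c=160, d=9000))  # ROR ~ 2.81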

  6. A method of estimating conceptus doses resulting from multidetector CT examinations during all stages of gestation

    International Nuclear Information System (INIS)

    Damilakis, John; Tzedakis, Antonis; Perisinakis, Kostas; Papadakis, Antonios E.

    2010-01-01

    Purpose: Current methods for the estimation of conceptus dose from multidetector CT (MDCT) examinations performed on the mother provide dose data for typical protocols with a fixed scan length. However, modified low-dose imaging protocols are frequently used during pregnancy. The purpose of the current study was to develop a method for the estimation of conceptus dose from any MDCT examination of the trunk performed during all stages of gestation. Methods: The Monte Carlo N-Particle (MCNP) radiation transport code was employed in this study to model the Siemens Sensation 16 and Sensation 64 MDCT scanners. Four mathematical phantoms were used, simulating women at 0, 3, 6, and 9 months of gestation. The contribution to the conceptus dose from single simulated scans was obtained at various positions across the phantoms. To investigate the effect of maternal body size and conceptus depth on conceptus dose, phantoms of different sizes were produced by adding layers of adipose tissue around the trunk of the mathematical phantoms. To verify MCNP results, conceptus dose measurements were carried out by means of three physical anthropomorphic phantoms, simulating pregnancy at 0, 3, and 6 months of gestation and thermoluminescence dosimetry (TLD) crystals. Results: The results consist of Monte Carlo-generated normalized conceptus dose coefficients for single scans across the four mathematical phantoms. These coefficients were defined as the conceptus dose contribution from a single scan divided by the CTDI free-in-air measured with identical scanning parameters. Data have been produced to take into account the effect of maternal body size and conceptus position variations on conceptus dose. Conceptus doses measured with TLD crystals showed a difference of up to 19% compared to those estimated by mathematical simulations. Conclusions: Estimation of conceptus doses from MDCT examinations of the trunk performed on pregnant patients during all stages of gestation can be made
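
    For illustration only, a small Python sketch of how normalized conceptus dose coefficients of the kind described above could be combined with a measured CTDI free-in-air to estimate a conceptus dose; the assumption that coefficients for the scanned positions simply sum, and all numbers shown, are illustrative and not taken from the study.

        import numpy as np

        def conceptus_dose(coefficients, ctdi_free_in_air_mgy):
            """Estimate conceptus dose by summing normalized dose coefficients
            (conceptus dose per unit CTDI free-in-air, one per scanned position)
            and scaling by the measured CTDI free-in-air."""
            return float(np.sum(coefficients) * ctdi_free_in_air_mgy)

        # Hypothetical coefficients for the slab positions covered by a trunk exam.
        coeffs = [0.02, 0.05, 0.12, 0.20, 0.22, 0.18, 0.10, 0.04]   # dimensionless
        print(conceptus_dose(coeffs, ctdi_free_in_air_mgy=20.0), "mGy")  # 18.6 mGy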

  7. Why, and how, mixed methods research is undertaken in health services research in England: a mixed methods study

    Science.gov (United States)

    O'Cathain, Alicia; Murphy, Elizabeth; Nicholl, Jon

    2007-01-01

    Background Recently, there has been a surge of international interest in combining qualitative and quantitative methods in a single study – often called mixed methods research. It is timely to consider why and how mixed methods research is used in health services research (HSR). Methods Documentary analysis of proposals and reports of 75 mixed methods studies funded by a research commissioner of HSR in England between 1994 and 2004. Face-to-face semi-structured interviews with 20 researchers sampled from these studies. Results 18% (119/647) of HSR studies were classified as mixed methods research. In the documentation, comprehensiveness was the main driver for using mixed methods research, with researchers wanting to address a wider range of questions than quantitative methods alone would allow. Interviewees elaborated on this, identifying the need for qualitative research to engage with the complexity of health, health care interventions, and the environment in which studies took place. Motivations for adopting a mixed methods approach were not always based on the intrinsic value of mixed methods research for addressing the research question; they could be strategic, for example, to obtain funding. Mixed methods research was used in the context of evaluation, including randomised and non-randomised designs; survey and fieldwork exploratory studies; and instrument development. Studies drew on a limited number of methods – particularly surveys and individual interviews – but used methods in a wide range of roles. Conclusion Mixed methods research is common in HSR in the UK. Its use is driven by pragmatism rather than principle, motivated by the perceived deficit of quantitative methods alone to address the complexity of research in health care, as well as other more strategic gains. Methods are combined in a range of contexts, yet the emerging methodological contributions from HSR to the field of mixed methods research are currently limited to the single

  8. QUALITATIVE METHODS IN CREATIVITY STUDIES

    DEFF Research Database (Denmark)

    Hertel, Frederik

    2015-01-01

    In this article we will focus on developing a qualitative research design suitable for conducting a case study in creativity. The case is a team of workers (see Hertel, 2015) doing industrial cleaning in the Danish food industry. The hypothesis is that these workers are both participating in......-specific methods, involving a discussion of creativity tests and divergent and convergent thinking, for studying creativity in this specific setting. Besides that, we will develop a research design involving a combination of methods necessary for conducting a case study in the setting mentioned.

  9. A Preliminary Study of the Effectiveness of Different Recitation Teaching Methods

    Science.gov (United States)

    Endorf, Robert J.; Koenig, Kathleen M.; Braun, Gregory A.

    2006-02-01

    We present preliminary results from a comparative study of student understanding for students who attended recitation classes which used different teaching methods. Student volunteers from our introductory calculus-based physics course attended a special recitation class that was taught using one of four different teaching methods. A total of 272 students were divided into approximately equal groups for each method. Students in each class were taught the same topic, "Changes in energy and momentum," from Tutorials in Introductory Physics. The different teaching methods varied in the amount of student and teacher engagement. Student understanding was evaluated through pretests and posttests given at the recitation class. Our results demonstrate the importance of the instructor's role in teaching recitation classes. The most effective teaching method was for students working in cooperative learning groups with the instructors questioning the groups using Socratic dialogue. These results provide guidance and evidence for the teaching methods which should be emphasized in training future teachers and faculty members.

  10. Different methods for ethical analysis in health technology assessment: an empirical study.

    Science.gov (United States)

    Saarni, Samuli I; Braunack-Mayer, Annette; Hofmann, Bjørn; van der Wilt, Gert Jan

    2011-10-01

    Ethical analysis can highlight important ethical issues related to implementing a technology, values inherent in the technology itself, and value-decisions underlying the health technology assessment (HTA) process. Ethical analysis is a well-acknowledged part of HTA, yet seldom included in practice. One reason for this is lack of knowledge about the properties and differences between the methods available. This study compares different methods for ethical analysis within HTA. Ethical issues related to bariatric (obesity) surgery were independently evaluated using axiological, casuist, principlist, and EUnetHTA models for ethical analysis within HTA. The methods and results are presented and compared. Despite varying theoretical underpinnings and practical approaches, the four methods identified similar themes: personal responsibility, self-infliction, discrimination, justice, public funding, and stakeholder involvement. The axiological and EUnetHTA models identified a wider range of arguments, whereas casuistry and principlism concentrated more on analyzing a narrower set of arguments deemed more important. Different methods can be successfully used for conducting ethical analysis within HTA. Although our study does not show that different methods in ethics always produce similar results, it supports the view that different methods of ethics can yield relevantly similar results. This suggests that the key conclusions of ethical analyses within HTA can be transferable between methods and countries. The systematic and transparent use of some method of ethics appears more important than the choice of the exact method.

  11. Study on dry-calibration method of ultrasonic flowmeter

    International Nuclear Information System (INIS)

    Ozaki, Yoshihiko; Yasuda, Hidenori.

    1988-01-01

    This paper describes a study on a dry-calibration method for applying an ultrasonic flowmeter in fields such as nuclear or thermal power plants, where high-temperature, pressurized fluids are used in coolant or feedwater systems. For flow measurement with an ultrasonic flowmeter, it is important to obtain a correction coefficient relating the line-averaged axial velocity to the plane-averaged axial velocity. We have developed an analytical method to predict the turbulent flow profiles in pipe cross sections, including bends. The method is based on a parabolic flow model and a k-ε model with wall functions for the near-wall regions. The axial velocity profiles and the correction coefficients predicted by the analytical method were compared with experimental results for water and liquid sodium under various L/D conditions. Both results were shown to be in approximate agreement, within about 5% for the flow profiles and about 2% for the correction coefficients, even though the piping had a 90° bend with a very small radius of curvature. For small L/D conditions, it was also shown that reverse flow effects could not be disregarded in the predominant direction. Nevertheless, the accuracy of the dry-calibration using the analytical method was confirmed to be within about 2%. (author)
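
    To make the correction coefficient concrete, the following Python sketch evaluates the ratio of plane-averaged to line-averaged axial velocity for an assumed power-law turbulent profile; the study itself derived the profiles from a k-ε calculation, so this is only a conceptual check, not the authors' method.

      import numpy as np

      # Correction coefficient = plane-averaged / line-averaged axial velocity for an
      # assumed power-law profile u(r) = u_max * (1 - r/R)**(1/n) in a circular pipe.
      def correction_coefficient(n=7, R=1.0, samples=200_000):
          r = (np.arange(samples) + 0.5) * (R / samples)   # midpoint sampling of the radius
          u = (1.0 - r / R) ** (1.0 / n)
          plane_avg = np.sum(u * 2.0 * np.pi * r) * (R / samples) / (np.pi * R**2)
          line_avg = u.mean()                              # average along a diametral path
          return plane_avg / line_avg

      print(correction_coefficient())                      # ~0.933 for n = 7, i.e. 2n/(2n+1)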

  12. Epiphysiodesis Made with Radio Frequency Ablation: First Results from a Pilot Study

    DEFF Research Database (Denmark)

    Shiguetomi Medina, Juan Manuel; Rahbek, Ole; Stødkilde-Jørgensen, Hans

    Objective Development of a new technique for epiphysiodesis using radiofrequency ablation in an animal model (pig) that involves less scarring and less exposure to X-rays, and reduces the risk of injuring the surrounding structures, compared to current methods. Material and Methods 4 non-mature 40 kg...... performed right after the procedure and 12 weeks later. The length of both tibiae was measured immediately after the ablation and at the end of the study. Results Both legs were of equal length at the beginning of the study, and there was an average leg length difference of 3.7 mm (SD = 0.48) at the end. No damage...

  13. LOGICAL CONDITIONS ANALYSIS METHOD FOR DIAGNOSTIC TEST RESULTS DECODING APPLIED TO COMPETENCE ELEMENTS PROFICIENCY

    Directory of Open Access Journals (Sweden)

    V. I. Freyman

    2015-11-01

    Full Text Available Subject of Research. Representation features of education results for competence-based educational programs are analyzed. The importance of decoding and estimating proficiency for the elements and components of the discipline parts of competences is shown. The purpose and objectives of the research are formulated. Methods. The paper uses methods of mathematical logic, Boolean algebra, and parametric analysis of the results of a complex diagnostic test that controls the proficiency of certain discipline competence elements. Results. A method of logical conditions analysis is created. It makes it possible to formulate logical conditions for determining the proficiency of each discipline competence element controlled by a complex diagnostic test. The normalized test result is divided into non-crossing zones, and a logical condition about the proficiency of the controlled elements is formulated for each of them. Summarized characteristics of the test result zones are introduced. An example of forming logical conditions for a diagnostic test with preset features is provided. Practical Relevance. The proposed method of logical conditions analysis is applied in the decoding algorithm of proficiency test diagnosis for discipline competence elements. It makes it possible to automate the search for elements with insufficient proficiency, and it is also usable for estimating the education results of a discipline or a component of a competence-based educational program.
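
    A purely hypothetical Python sketch of the decoding idea is given below: the normalized test result falls into non-overlapping zones, each mapped to a Boolean condition on element proficiency. The zone boundaries, element names and conditions are invented for illustration and are not taken from the paper.

      # Hypothetical decoding: the normalized result of a complex test falls into
      # non-overlapping zones; each zone implies a Boolean condition on the
      # proficiency of the controlled elements e1 and e2. Boundaries are invented.
      def decode_proficiency(normalized_result):
          zones = [
              (0.00, 0.50, {"e1": False, "e2": False}),   # neither element mastered
              (0.50, 0.75, {"e1": True,  "e2": False}),   # only the first element
              (0.75, 1.00, {"e1": True,  "e2": True}),    # both elements mastered
          ]
          for lo, hi, condition in zones:
              if lo <= normalized_result < hi or normalized_result == hi == 1.0:
                  return condition
          raise ValueError("result outside the normalized range [0, 1]")

      print(decode_proficiency(0.8))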

  14. The Trojan Horse method for nuclear astrophysics: Recent results on resonance reactions

    Energy Technology Data Exchange (ETDEWEB)

    Cognata, M. La; Pizzone, R. G. [Laboratori Nazionali del Sud, Istituto Nazionale di Fisica Nucleare, Catania (Italy); Spitaleri, C.; Cherubini, S.; Romano, S. [Dipartimento di Fisica e Astronomia, Università di Catania, Catania, Italy and Laboratori Nazionali del Sud, Istituto Nazionale di Fisica Nucleare, Catania (Italy); Gulino, M.; Tumino, A. [Kore University, Enna, Italy and Laboratori Nazionali del Sud, Istituto Nazionale di Fisica Nucleare, Catania (Italy); Lamia, L. [Dipartimento di Fisica e Astronomia, Università di Catania, Catania (Italy)

    2014-05-09

    Nuclear astrophysics aims to measure nuclear-reaction cross sections of astrophysical interest to be included in models used to study stellar evolution and nucleosynthesis. Low energies, < 1 MeV or even < 10 keV, are required, since this is the window where these processes are most effective. Two effects have prevented a satisfactory knowledge of the relevant nuclear processes from being achieved, namely, the Coulomb barrier exponentially suppressing the cross section and the presence of atomic electrons. These difficulties have triggered theoretical and experimental investigations to extend our knowledge down to astrophysical energies. For instance, indirect techniques such as the Trojan Horse Method have been devised, yielding new cutting-edge results. In particular, I will focus on the application of this indirect method to resonance reactions. Resonances might dramatically enhance the astrophysical S(E)-factor, so, when they occur right at astrophysical energies, their measurement is crucial to pin down the astrophysical scenario. Unknown or unpredicted resonances might introduce large systematic errors in nucleosynthesis models. These considerations apply to low-energy resonances and to sub-threshold resonances as well, as they may produce sizable modifications of the S-factor due to, for instance, destructive interference with another resonance.

  15. The Trojan Horse method for nuclear astrophysics: Recent results on resonance reactions

    International Nuclear Information System (INIS)

    Cognata, M. La; Pizzone, R. G.; Spitaleri, C.; Cherubini, S.; Romano, S.; Gulino, M.; Tumino, A.; Lamia, L.

    2014-01-01

    Nuclear astrophysics aims to measure nuclear-reaction cross sections of astrophysical interest to be included in models used to study stellar evolution and nucleosynthesis. Low energies, < 1 MeV or even < 10 keV, are required, since this is the window where these processes are most effective. Two effects have prevented a satisfactory knowledge of the relevant nuclear processes from being achieved, namely, the Coulomb barrier exponentially suppressing the cross section and the presence of atomic electrons. These difficulties have triggered theoretical and experimental investigations to extend our knowledge down to astrophysical energies. For instance, indirect techniques such as the Trojan Horse Method have been devised, yielding new cutting-edge results. In particular, I will focus on the application of this indirect method to resonance reactions. Resonances might dramatically enhance the astrophysical S(E)-factor, so, when they occur right at astrophysical energies, their measurement is crucial to pin down the astrophysical scenario. Unknown or unpredicted resonances might introduce large systematic errors in nucleosynthesis models. These considerations apply to low-energy resonances and to sub-threshold resonances as well, as they may produce sizable modifications of the S-factor due to, for instance, destructive interference with another resonance

  16. Comparative study of in-situ filter test methods

    International Nuclear Information System (INIS)

    Marshall, M.; Stevens, D.C.

    1981-01-01

    Available methods of testing high efficiency particulate aerosol (HEPA) filters in-situ have been reviewed. In order to understand the relationship between the results produced by different methods a selection has been compared. Various pieces of equipment for generating and detecting aerosols have been tested and their suitability assessed. Condensation-nuclei, DOP (di-octyl phthalate) and sodium-flame in-situ filter test methods have been studied, using the 500 cfm (9000 m³/h) filter test rig at Harwell and in the field. Both the sodium-flame and DOP methods measure the penetration through leaks and filter material. However the measured penetration through filtered leaks depends on the aerosol size distribution and the detection method. Condensation-nuclei test methods can only be used to measure unfiltered leaks since condensation nuclei have a very low penetration through filtered leaks. A combination of methods would enable filtered and unfiltered leaks to be measured. A condensation-nucleus counter using n-butyl alcohol as the working fluid has the advantage of being able to detect any particle up to 1 μm in diameter, including DOP, and so could be used for this purpose. A single-particle counter has not been satisfactory because of interference from particles leaking into systems under extract, particularly downstream of filters, and because the concentration of the input aerosol has to be severely limited. The sodium-flame method requires a skilled operator and may cause safety and corrosion problems. The DOP method using a total light scattering detector has so far been the most satisfactory. It is fairly easy to use, measures reasonably low values of penetration and gives rapid results. DOP has had no adverse effect on HEPA filters over a long series of tests

  17. Integrating Quantitative and Qualitative Results in Health Science Mixed Methods Research Through Joint Displays.

    Science.gov (United States)

    Guetterman, Timothy C; Fetters, Michael D; Creswell, John W

    2015-11-01

    Mixed methods research is becoming an important methodology to investigate complex health-related topics, yet the meaningful integration of qualitative and quantitative data remains elusive and needs further development. A promising innovation to facilitate integration is the use of visual joint displays that bring data together visually to draw out new insights. The purpose of this study was to identify exemplar joint displays by analyzing the various types of joint displays being used in published articles. We searched for empirical articles that included joint displays in 3 journals that publish state-of-the-art mixed methods research. We analyzed each of 19 identified joint displays to extract the type of display, mixed methods design, purpose, rationale, qualitative and quantitative data sources, integration approaches, and analytic strategies. Our analysis focused on what each display communicated and its representation of mixed methods analysis. The most prevalent types of joint displays were statistics-by-themes and side-by-side comparisons. Innovative joint displays connected findings to theoretical frameworks or recommendations. Researchers used joint displays for convergent, explanatory sequential, exploratory sequential, and intervention designs. We identified exemplars for each of these designs by analyzing the inferences gained through using the joint display. Exemplars represented mixed methods integration, presented integrated results, and yielded new insights. Joint displays appear to provide a structure to discuss the integrated analysis and assist both researchers and readers in understanding how mixed methods provides new insights. We encourage researchers to use joint displays to integrate and represent mixed methods analysis and discuss their value. © 2015 Annals of Family Medicine, Inc.

  18. Implications for monitoring: study designs and interpretation of results

    International Nuclear Information System (INIS)

    Green, R. H.; Montagna, P.

    1996-01-01

    Two innovative statistical approaches to the interpretation and generalization of the results from the study of long-term environmental impacts of offshore oil and gas exploration and production in the Gulf of Mexico were described. The first of the two methods, the Sediment Quality Triad approach, relies on a test of coherence of responses, whereas the second approach uses small scale spatial heterogeneity of response as evidence of impact. As far as the study design was concerned, it was argued that differing objectives which are demanded of the same study (e.g. generalization about environmental impact of similar platforms versus the spatial pattern of impact around individual platforms) are frequently in conflict. If at all possible, they should be avoided since the conflicting demands tend to compromise the design for both situations. 31 refs., 5 figs

  19. STUDIES OF METHOD FOR DETERMINING THE PROTEIN CONCENTRATION OF "MALEIN PPD" BY THE KJELDAHL METHOD

    Directory of Open Access Journals (Sweden)

    Ciuca, V

    2017-06-01

    Full Text Available Glanders is a contagious and fatal disease of horses, donkeys, and mules, caused by infection with the bacterium Burkholderia mallei. The pathogen causes nodules and ulcerations in the upper respiratory tract and lungs. Glanders is transmissible to humans by direct contact with diseased animals or with infected or contaminated material. In the untreated acute disease, the mortality rate can reach 95% within 3 weeks. Malein PPD, the diagnostic product, contains a maximum of 2 mg/ml of Burkholderia mallei. The amount of protein in the biological product "Malein PPD" is measured as nitrogen from the protein molecule, applying the Kjeldahl method (determination of nitrogen by sulphuric acid digestion). The validation study aims to demonstrate that determining the protein of Malein PPD by sulphuric acid digestion is an appropriate and reproducible analytical method that meets the quality requirements of diagnostic reagents. The paper establishes the performance characteristics of the method considered and identifies the factors that influence these characteristics. The Kjeldahl method for determining the protein concentration is considered valid if the results obtained for each validation parameter are within the admissibility criteria. The validation procedure includes details of the working protocol for determining the protein of Malein PPD, validation criteria, experimental results, and mathematical calculations.
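
    The Kjeldahl arithmetic itself is standard and can be sketched as follows; the titration volumes and the nitrogen-to-protein conversion factor of 6.25 are assumptions for the example, not values from this validation study.

      # Standard Kjeldahl arithmetic: percent nitrogen from the back-titration,
      # then protein via a conversion factor (6.25 is assumed here).
      def kjeldahl_protein_percent(v_sample_ml, v_blank_ml, acid_normality,
                                   sample_mass_g, conversion_factor=6.25):
          nitrogen_percent = (1.4007 * (v_sample_ml - v_blank_ml)
                              * acid_normality / sample_mass_g)
          return nitrogen_percent * conversion_factor

      print(kjeldahl_protein_percent(v_sample_ml=10.2, v_blank_ml=0.1,
                                     acid_normality=0.1, sample_mass_g=1.0))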

  20. Monitoring ambient ozone with a passive measurement technique method, field results and strategy

    NARCIS (Netherlands)

    Scheeren, BA; Adema, EH

    1996-01-01

    A low-cost, accurate and sensitive passive measurement method for ozone has been developed and tested. The method is based on the reaction of ozone with indigo carmine which results in colourless reaction products which are detected spectrophotometrically after exposure. Coated glass filters are

  1. Lesion insertion in the projection domain: Methods and initial results

    International Nuclear Information System (INIS)

    Chen, Baiyu; Leng, Shuai; Yu, Lifeng; Yu, Zhicong; Ma, Chi; McCollough, Cynthia

    2015-01-01

    Purpose: To perform task-based image quality assessment in CT, it is desirable to have a large number of realistic patient images with known diagnostic truth. One effective way of achieving this objective is to create hybrid images that combine patient images with inserted lesions. Because conventional hybrid images generated in the image domain fail to reflect the impact of scan and reconstruction parameters on lesion appearance, this study explored a projection-domain approach. Methods: Lesions were segmented from patient images and forward projected to acquire lesion projections. The forward-projection geometry was designed according to a commercial CT scanner and accommodated both axial and helical modes with various focal spot movement patterns. The energy employed by the commercial CT scanner for beam hardening correction was measured and used for the forward projection. The lesion projections were inserted into patient projections decoded from commercial CT projection data. The combined projections were formatted to match those of commercial CT raw data, loaded onto a commercial CT scanner, and reconstructed to create the hybrid images. Two validations were performed. First, to validate the accuracy of the forward-projection geometry, images were reconstructed from the forward projections of a virtual ACR phantom and compared to physically acquired ACR phantom images in terms of CT number accuracy and high-contrast resolution. Second, to validate the realism of the lesion in hybrid images, liver lesions were segmented from patient images and inserted back into the same patients, each at a new location specified by a radiologist. The inserted lesions were compared to the original lesions and visually assessed for realism by two experienced radiologists in a blinded fashion. Results: For the validation of the forward-projection geometry, the images reconstructed from the forward projections of the virtual ACR phantom were consistent with the images physically

  2. Lesion insertion in the projection domain: Methods and initial results

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Baiyu; Leng, Shuai; Yu, Lifeng; Yu, Zhicong; Ma, Chi; McCollough, Cynthia, E-mail: mccollough.cynthia@mayo.edu [Department of Radiology, Mayo Clinic, Rochester, Minnesota 55905 (United States)

    2015-12-15

    Purpose: To perform task-based image quality assessment in CT, it is desirable to have a large number of realistic patient images with known diagnostic truth. One effective way of achieving this objective is to create hybrid images that combine patient images with inserted lesions. Because conventional hybrid images generated in the image domain fail to reflect the impact of scan and reconstruction parameters on lesion appearance, this study explored a projection-domain approach. Methods: Lesions were segmented from patient images and forward projected to acquire lesion projections. The forward-projection geometry was designed according to a commercial CT scanner and accommodated both axial and helical modes with various focal spot movement patterns. The energy employed by the commercial CT scanner for beam hardening correction was measured and used for the forward projection. The lesion projections were inserted into patient projections decoded from commercial CT projection data. The combined projections were formatted to match those of commercial CT raw data, loaded onto a commercial CT scanner, and reconstructed to create the hybrid images. Two validations were performed. First, to validate the accuracy of the forward-projection geometry, images were reconstructed from the forward projections of a virtual ACR phantom and compared to physically acquired ACR phantom images in terms of CT number accuracy and high-contrast resolution. Second, to validate the realism of the lesion in hybrid images, liver lesions were segmented from patient images and inserted back into the same patients, each at a new location specified by a radiologist. The inserted lesions were compared to the original lesions and visually assessed for realism by two experienced radiologists in a blinded fashion. Results: For the validation of the forward-projection geometry, the images reconstructed from the forward projections of the virtual ACR phantom were consistent with the images physically
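
    A conceptual sketch of projection-domain lesion insertion is given below, using a parallel-beam Radon transform from scikit-image as a stand-in for the commercial scanner geometry modeled in these studies (which used validated fan-beam/helical geometries and real raw data); the toy images and parameters are assumptions for illustration only.

      import numpy as np
      from skimage.transform import radon, iradon

      theta = np.linspace(0.0, 180.0, 360, endpoint=False)

      patient = np.zeros((256, 256)); patient[64:192, 64:192] = 1.0   # toy "patient" image
      lesion = np.zeros((256, 256));  lesion[120:130, 140:150] = 0.5  # toy segmented lesion

      patient_sino = radon(patient, theta=theta)   # stands in for decoded patient projections
      lesion_sino = radon(lesion, theta=theta)     # forward-projected lesion
      hybrid_sino = patient_sino + lesion_sino     # insertion in the projection domain
      hybrid_image = iradon(hybrid_sino, theta=theta)  # reconstruction of the hybrid image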

  3. Results of six years of cytogenetic studies in amniotic fluid

    Directory of Open Access Journals (Sweden)

    Enelis Reyes Reyes

    2015-10-01

    Full Text Available Background: research into different genetic diseases is one of the preventive programs of paramount importance at the public health level. The early detection of chromosomopathies and the establishment of an appropriate strategy reduce the morbidity-mortality rate and improve patients' quality of life. Objective: to describe the results of the cytogenetic studies in the amniotic fluid of pregnant women from Las Tunas province during six years, from 2008 to 2014. Methods: a retrospective and descriptive study was carried out to assess the results of cytogenetic studies in amniotic fluid during six years, from 2008 to 2014. The statistical records were checked, and the results, the indication criteria, the distribution of age groups among women of advanced maternal age, and the diagnosed chromosomopathies were assessed. Results: samples with conclusive results outnumbered the non-conclusive and positive ones; 2.3 positive cases of chromosomopathies were diagnosed per 100 studied women at risk; advanced maternal age prevailed as the indication criterion, with the 37 to 40 year-old age group predominating; in the positive cases, numerical chromosomopathies of the trisomy 21 (Down syndrome) type prevailed, with a frequency of 1.2 per 100 pregnant women at risk. Conclusions: the program of cytogenetic diagnosis in amniotic fluid has been an effective tool to detect congenital prenatal defects caused by chromosomopathies, and is very useful in the genetic counselling process.

  4. [Epidemiological methods used in studies in the prevalence of Tourette syndrome].

    Science.gov (United States)

    Stefanoff, Paweł; Mazurek, Jacek

    2003-01-01

    Tourette syndrome (TS) prevalence has been studied since the early 1980s. Its clinical course is characterised by the co-occurrence of motor and vocal tics. The results of previous epidemiological studies were surprisingly divergent: the prevalence varied from 0.5 to 115 cases per 10,000 population. The disease, previously recognised as extremely rare and severe, is now considered quite common, often with a moderate course. Selected methods used in studies of TS prevalence and an analysis of their possible impact on study results are presented. The studies were divided into 3 groups: studies of the hospitalised population, large-scale screenings, and studies involving the school population, based on the characteristics and size of the population, the methods of subject selection, and the diagnostic and screening methods used. Studies of the hospitalised population involved patients with the most severe symptoms, in different age groups, and different methods of final diagnosis confirmation were used. TS prevalence varied from 0.5 up to 15 cases per 10,000 population. Procedures used in large-scale screening studies made it possible to eliminate potential selection bias. Large populations were studied using transparent and repeatable confirmation of diagnoses. Their validity was additionally checked in parallel validity studies. TS prevalence was in the range of 4.3 to 10 cases per 10,000 population. The highest TS prevalence was obtained in studies involving schoolchildren. Data were gathered from multiple sources: from parents, teachers and children, as well as from classroom observation. Diagnoses were made by experienced clinicians. The TS prevalence obtained in school population studies was between 36.2 and 115 per 10,000 population.

  5. Long-Term Results After Simple Versus Complex Stenting of Coronary Artery Bifurcation Lesions Nordic Bifurcation Study 5-Year Follow-Up Results

    DEFF Research Database (Denmark)

    Maeng, M.; Holm, N. R.; Erglis, A.

    2013-01-01

    Objectives This study sought to report the 5-year follow-up results of the Nordic Bifurcation Study. Background Randomized clinical trials with short-term follow-up have indicated that coronary bifurcation lesions may be optimally treated using the optional side branch stenting strategy. Methods...... complex strategy of planned stenting of both the main vessel and the side branch. (C) 2013 by the American College of Cardiology Foundation...

  6. EXPLORATION BY MEANS OF GEOPHYSICAL METHODS OF GEOTHERMAL FIELDS AND CASE STUDIES

    Directory of Open Access Journals (Sweden)

    Züheyr KAMACI

    1997-01-01

    Full Text Available Geothermal energy, one of the renewable energy resources, can save as much as 77 million barrels of petroleum equivalent annually when used in the production of electricity and for environmental heating. Geophysical exploration methods play an important role in geothermal exploration, development and observational studies. Thermal and geoelectrical methods are the most effective, since they show temperature-variation anomalies and suitable locations for mechanical drilling. However, better results can be obtained when the other methods (gravity, magnetic, radiometric, well geophysics and well logs) are used in conjunction with seismic tomography, in addition to the geophysical exploration methods mentioned. Accordingly, various case histories from our country and worldwide are given to illustrate the determination of geothermal energy resources using geophysical exploration techniques. From the results of these studies, a 55 °C hot-water artesian aquifer was found in the Uşak-Banaz geothermal field by applying geoelectrical methods.

  7. Study of test methods for radionuclide migration in aerated zone

    International Nuclear Information System (INIS)

    Li Shushen; Guo Zede; Wang Zhiming

    1993-01-01

    The aerated zone is an important natural barrier against the transport of radionuclides released from LLRW disposal facilities. This paper introduces study methods for radionuclide migration in the aerated zone, including determination of water movement, laboratory simulation tests, and field tracing tests. For a given purpose, the results obtained with the different methods are compared. These methods have been used in a five-year cooperative research project between CIRP and JAERI to establish a methodology for the safety assessment of shallow land disposal of LLRW

  8. Feasibility study of structured diagnosis methods for functional dyspepsia in Korean medicine clinics

    Directory of Open Access Journals (Sweden)

    Jeong Hwan Park

    2017-12-01

    Full Text Available Background: Functional dyspepsia (FD) is the seventh most common disease encountered in Korean medicine (KM) clinics. Despite the large number of FD patients visiting KM clinics, the accumulated medical records have no utility in evidence development, due to being unstructured. This study aimed to construct a standard operating procedure (SOP) with appropriate structured diagnostic methods for FD, and assess the feasibility for use in KM clinics. Methods: Two rounds of professional surveys were conducted by 10 Korean internal medicine professors to select the representative diagnostic methods. A feasibility study was conducted to evaluate compliance and time required for using the structured diagnostic methods by three specialists in two hospitals. Results: As per the results of the professional survey, five questionnaires and one basic diagnostic method were selected. An SOP was constructed based on the survey results, and a feasibility study showed that the SOP compliance score (out of 5) was 3.45 among the subjects, and 3.25 among the practitioners. The SOP was acceptable and was not deemed difficult to execute. The total execution time was 136.5 minutes, out of which the gastric emptying test time was 129 minutes. Conclusion: This feasibility study of the SOP with structured diagnostic methods for FD confirmed it was adequate for use in KM clinics. It is expected that these study findings will be helpful to clinicians who wish to conduct observational studies as well as to generate quantitative medical records to facilitate Big Data research. Keywords: Big Data, Dyspepsia, Korean medicine, Feasibility studies, Observational study

  9. Methods to estimate the between‐study variance and its uncertainty in meta‐analysis†

    Science.gov (United States)

    Jackson, Dan; Viechtbauer, Wolfgang; Bender, Ralf; Bowden, Jack; Knapp, Guido; Kuss, Oliver; Higgins, Julian PT; Langan, Dean; Salanti, Georgia

    2015-01-01

    Meta-analyses are typically used to estimate the overall (mean) effect for an outcome of interest. However, inference about between-study variability, which is typically modelled using a between-study variance parameter, is usually an additional aim. The DerSimonian and Laird method, currently widely used by default to estimate the between-study variance, has long been challenged. Our aim is to identify known methods for estimation of the between-study variance and its corresponding uncertainty, and to summarise the simulation and empirical evidence that compares them. We identified 16 estimators for the between-study variance, seven methods to calculate confidence intervals, and several comparative studies. Simulation studies suggest that, for both dichotomous and continuous data, the estimator proposed by Paule and Mandel and, for continuous data, the restricted maximum likelihood estimator are better alternatives for estimating the between-study variance. Based on the scenarios and results presented in the published studies, we recommend the Q-profile method and the alternative approach based on a 'generalised Cochran between-study variance statistic' to compute corresponding confidence intervals around the resulting estimates. Our recommendations are based on a qualitative evaluation of the existing literature and expert consensus. Evidence-based recommendations require an extensive simulation study where all methods would be compared under the same scenarios. © 2015 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd. PMID:26332144
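
    Two of the estimators discussed above, DerSimonian-Laird and Paule-Mandel, can be sketched in Python as follows; the effect sizes and within-study variances are invented for the example, and the Paule-Mandel solver is a simple bisection, not a reference implementation.

      import numpy as np

      def dersimonian_laird_tau2(y, v):
          y, v = np.asarray(y, float), np.asarray(v, float)
          w = 1.0 / v
          y_bar = np.sum(w * y) / np.sum(w)
          q = np.sum(w * (y - y_bar) ** 2)                 # Cochran's Q
          c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
          return max(0.0, (q - (len(y) - 1)) / c)

      def paule_mandel_tau2(y, v, tol=1e-8, max_iter=200):
          # Bisection for the tau2 at which the generalized Q equals k - 1.
          y, v = np.asarray(y, float), np.asarray(v, float)
          lo, hi = 0.0, 10.0 * np.var(y)
          tau2 = 0.0
          for _ in range(max_iter):
              tau2 = 0.5 * (lo + hi)
              w = 1.0 / (v + tau2)
              y_bar = np.sum(w * y) / np.sum(w)
              q = np.sum(w * (y - y_bar) ** 2)
              if abs(q - (len(y) - 1)) < tol:
                  break
              if q > len(y) - 1:
                  lo = tau2            # Q too large: tau2 must increase
              else:
                  hi = tau2
          return tau2

      y = [0.30, 0.10, 0.45, 0.20]       # illustrative study effects
      v = [0.020, 0.015, 0.030, 0.010]   # illustrative within-study variances
      print(dersimonian_laird_tau2(y, v), paule_mandel_tau2(y, v))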

  10. ADVANCING THE STUDY OF VIOLENCE AGAINST WOMEN USING MIXED METHODS: INTEGRATING QUALITATIVE METHODS INTO A QUANTITATIVE RESEARCH PROGRAM

    Science.gov (United States)

    Testa, Maria; Livingston, Jennifer A.; VanZile-Tamsen, Carol

    2011-01-01

    A mixed methods approach, combining quantitative with qualitative data methods and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women’s sexual victimization has resulted in valuable scientific insight and generation of novel hypotheses for testing. This mixed methods approach is described and recommendations for integrating qualitative data into quantitative research are provided. PMID:21307032

  11. Studies on Hepa filter test methods

    International Nuclear Information System (INIS)

    Lee, S.H.; Jon, K.S.; Park, W.J.; Ryoo, R.

    1981-01-01

    The purpose of this study is to compare testing methods of the HEPA filter adopted in other countries with each other, and to design and construct a test duct system to establish testing methods. The American D.O.P. test method, the British NaCl test method and several other independently developed methods are compared. It is considered that the D.O.P. method is most suitable for in-plant and leak tests

  12. Comparison study on cell calculation method of fast reactor

    International Nuclear Information System (INIS)

    Chiba, Gou

    2002-10-01

    Effective cross sections obtained by cell calculations are used in core calculations in current deterministic methods. Therefore, it is important to calculate the effective cross sections accurately, and several methods have been proposed. In this study, some of these methods are compared to each other using a continuous-energy Monte Carlo method as a reference. The results show that the table look-up method used at the Japan Nuclear Cycle Development Institute (JNC) sometimes differs by over 10% in effective microscopic cross sections and is inferior to the sub-group method. The problem was overcome by introducing a new nuclear constant system developed at JNC, in which an ultra-fine energy group library is used. The system can also deal with resonance interaction effects between nuclides, which cannot be considered by the other methods. In addition, a new method was proposed to calculate effective cross sections accurately for power reactor fuel subassemblies, where the new nuclear constant system cannot be applied. This method uses the sub-group method and the ultra-fine energy group collision probability method. The microscopic effective cross sections obtained by this method agree with the reference values to within 5%. (author)

  13. Studies on Erythropoietin Bioassay Method

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Kyoung Sam; Ro, Heung Kyu; Lee, Mun Ho [Seoul National University College of Medicine, Seoul (Korea, Republic of)

    1975-09-15

    It is the purpose of this paper to design the most suitable method of erythropoietin bioassay in Korea. Bioassays utilizing polycythemic mice are currently in general use for the indirect determination of erythropoietin. Assay animals are usually prepared either by transfusion or by exposure to reduced oxygen tension in a specially constructed chamber. We prepared polycythemic mice in a specially constructed hypobaric chamber. We observed the weights and hematocrits of the mice in the hypobaric chamber, and then the hematocrits and the 72-hour {sup 59}Fe red cell uptake ratio of the hypoxia-induced polycythemic mice after removal from the hypobaric chamber. We designed the method of erythropoietin bioassay according to the results obtained from these experiments. We then measured the 72-hour {sup 59}Fe red cell uptake ratio of the polycythemic mice given normal saline, normal plasma and anemic plasma according to the method we designed. The results are as follows: 1) The hematocrits of the mice in the hypobaric chamber increased to 74% in 11 days. It is preferable to maintain the pressure of the chamber at 400 mmHg for the first 4 days and then at 300 mmHg for the last 10 days, to reduce the death rate and the time spent in the hypobaric chamber. 2) After removal from the hypobaric chamber, the 72-hour {sup 59}Fe red cell uptake ratio decreased rapidly and remained at its lowest level from the fourth to the tenth day. 3) We designed the method of erythropoietin bioassay according to the results of the above experiments and the half-life of erythropoietin. 4) The Korean-produced {sup 59}Fe is a mixture of {sup 55}Fe and {sup 59}Fe, and the {sup 59}Fe red cell uptake ratio in normal mice was far lower with the Korean-produced {sup 59}Fe than with the pure {sup 59}Fe of foreign production. It is therefore desirable to use pure {sup 59}Fe in this method of erythropoietin bioassay. 5) Considering the cost, the technique, the time required and the sensitivity, this is the most suitable method of erythropoietin bioassay in Korea

  14. Evolution of different reaction methods resulting in the formation of AgI125 for use in brachytherapy sources

    International Nuclear Information System (INIS)

    Souza, C.D.; Peleias Jr, F.S.; Rostelato, M.E.C.M.; Zeituni, C.A.; Benega, M.A.G.; Tiezzi, R.; Mattos, F.R.; Rodrigues, B.T.; Oliveira, T.B.; Feher, A.; Moura, J.A.; Costa, O.L.

    2014-01-01

    Prostate cancer represents about 10% of all cases of cancer in the world. Brachytherapy has been extensively used in the early and intermediate stages of the illness. The radiotherapy method reduces the probability of damage to surrounding healthy tissues. The present study compares several methods for the deposition of iodine-125 on a silver substrate (the seed core), in order to choose the most suitable one to be implemented at IPEN. Four methods were selected: method 1 (assay based on electrodeposition), which presented an efficiency of 65.16%; method 2 (assay based on chemical reactions, developed by David Kubiatowicz), which presented an efficiency of 70.80%; method 3 (chemical reaction based on the methodology developed by Dr. Maria Elisa Rostelato), which presented an efficiency of 55.80%; and a new method developed by IPEN, with 90.5% efficiency. Based on the results, the new method is recommended for implementation. (authors)

  15. [Reconsidering children's dreams. A critical review of methods and results in developmental dream research from Freud to contemporary works].

    Science.gov (United States)

    Sándor, Piroska; Bódizs, Róbert

    2014-01-01

    Examining children's dream development is a significant challenge for researchers. Results from studies on children's dreaming may enlighten us on the nature and role of dreaming as well as broaden our knowledge of consciousness and cognitive development. This review summarizes the main questions and historical progress in developmental dream research, with the aim of shedding light on the advantages, disadvantages and effects of different settings and methods on research outcomes. A typical example would be the dreams of 3 to 5 year-olds: they are simple and static, with a relative absence of emotions and active self participation according to laboratory studies; studies using different methodology however found them to be vivid, rich in emotions, with the self as an active participant. Questions about the validity of different methods arise, and are considered within this review. Given that methodological differences can result in highly divergent outcomes, it is strongly recommended for future research to select methodology and treat results more carefully.

  16. A systematic study of genome context methods: calibration, normalization and combination

    Directory of Open Access Journals (Sweden)

    Dale Joseph M

    2010-10-01

    Full Text Available Abstract Background Genome context methods have been introduced in the last decade as automatic methods to predict functional relatedness between genes in a target genome using the patterns of existence and relative locations of the homologs of those genes in a set of reference genomes. Much work has been done in the application of these methods to different bioinformatics tasks, but few papers present a systematic study of the methods and their combination necessary for their optimal use. Results We present a thorough study of the four main families of genome context methods found in the literature: phylogenetic profile, gene fusion, gene cluster, and gene neighbor. We find that for most organisms the gene neighbor method outperforms the phylogenetic profile method by as much as 40% in sensitivity, being competitive with the gene cluster method at low sensitivities. Gene fusion is generally the worst performing of the four methods. A thorough exploration of the parameter space for each method is performed and results across different target organisms are presented. We propose the use of normalization procedures as those used on microarray data for the genome context scores. We show that substantial gains can be achieved from the use of a simple normalization technique. In particular, the sensitivity of the phylogenetic profile method is improved by around 25% after normalization, resulting, to our knowledge, on the best-performing phylogenetic profile system in the literature. Finally, we show results from combining the various genome context methods into a single score. When using a cross-validation procedure to train the combiners, with both original and normalized scores as input, a decision tree combiner results in gains of up to 20% with respect to the gene neighbor method. Overall, this represents a gain of around 15% over what can be considered the state of the art in this area: the four original genome context methods combined using a
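
    As a toy illustration of phylogenetic-profile scoring and the kind of per-gene rescaling advocated above, the following sketch uses Jaccard similarity between presence/absence vectors and then z-scores each gene's scores; the profiles, the choice of similarity measure and the normalization are assumptions, not the paper's exact procedure.

      import numpy as np

      def jaccard(p, q):
          p, q = np.asarray(p), np.asarray(q)
          either = np.sum(p | q)
          return np.sum(p & q) / either if either else 0.0

      # Toy presence/absence profiles across eight reference genomes.
      profiles = {
          "geneA": [1, 1, 0, 1, 0, 1, 1, 0],
          "geneB": [1, 1, 0, 1, 0, 1, 0, 0],
          "geneC": [0, 0, 1, 0, 1, 0, 0, 1],
      }
      genes = list(profiles)
      raw = {g: np.array([jaccard(profiles[g], profiles[h]) for h in genes if h != g])
             for g in genes}
      # Per-gene z-score normalization, analogous in spirit to the rescaling proposed above.
      normalized = {g: (s - s.mean()) / (s.std() + 1e-9) for g, s in raw.items()}
      print(normalized["geneA"])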

  17. Improvement of human cell line activation test (h-CLAT) using short-time exposure methods for prevention of false-negative results.

    Science.gov (United States)

    Narita, Kazuto; Ishii, Yuuki; Vo, Phuc Thi Hong; Nakagawa, Fumiko; Ogata, Shinichi; Yamashita, Kunihiko; Kojima, Hajime; Itagaki, Hiroshi

    2018-01-01

    Recently, animal testing has been affected by increasing ethical, social, and political concerns regarding animal welfare. Several in vitro safety tests for evaluating skin sensitization, such as the human cell line activation test (h-CLAT), have been proposed. However, similar to other tests, the h-CLAT has produced false-negative results, including in tests for acid anhydride and water-insoluble chemicals. In a previous study, we demonstrated that the cause of false-negative results from phthalic anhydride was hydrolysis by an aqueous vehicle, with IL-8 release from THP-1 cells, and that short-time exposure to liquid paraffin (LP) dispersion medium could reduce false-negative results from acid anhydrides. In the present study, we modified the h-CLAT by applying this exposure method. We found that the modified h-CLAT is a promising method for reducing false-negative results obtained from acid anhydrides and chemicals with octanol-water partition coefficients (LogKow) greater than 3.5. Based on the outcomes from the present study, a combination of the original and the modified h-CLAT is suggested for reducing false-negative results. Notably, the combination method provided a sensitivity of 95% (overall chemicals) or 93% (chemicals with LogKow > 2.0), and an accuracy of 88% (overall chemicals) or 81% (chemicals with LogKow > 2.0). We found that the combined method is a promising evaluation scheme for reducing false-negative results seen in existing in vitro skin-sensitization tests. In the future, we expect a combination of original and modified h-CLAT to be applied in a newly developed in vitro test for evaluating skin sensitization.

  18. Application of the DSA preconditioned GMRES formalism to the method of characteristics - First results

    International Nuclear Information System (INIS)

    Le Tellier, R.; Hebert, A.

    2004-01-01

    The method of characteristics is well known for its slow convergence; consequently, as is often done for SN methods, the Generalized Minimal Residual approach (GMRES) has been investigated for its practical implementation and its high reliability. GMRES is one of the most effective Krylov iterative methods for solving large linear systems. Moreover, the system has been 'left preconditioned' with the Algebraic Collapsing Acceleration (ACA), a variant of the Diffusion Synthetic Acceleration (DSA) based on I. Suslov's earlier work. This paper presents the first numerical results of these methods in 2D geometries with material discontinuities. Indeed, previous investigations have shown a degraded effectiveness of Diffusion Synthetic Acceleration with this kind of geometry. Results are presented for 9 x 9 Cartesian assemblies in terms of the speed of convergence of the inner (fixed source) iterations of the method of characteristics. They show a significant improvement in the convergence rate. (authors)
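
    A generic illustration of preconditioned GMRES (not the authors' transport solver) is sketched below with SciPy, using a simple diagonal preconditioner in place of a diffusion-based operator such as ACA/DSA, and a mock sparse system instead of the transport operator.

      import numpy as np
      from scipy.sparse import diags
      from scipy.sparse.linalg import gmres, LinearOperator

      n = 200
      A = diags([-1.0, 2.5, -1.2], offsets=[-1, 0, 1], shape=(n, n), format="csr")
      b = np.ones(n)

      inv_diag = 1.0 / A.diagonal()
      M = LinearOperator((n, n), matvec=lambda x: inv_diag * x)  # crude Jacobi preconditioner

      x, info = gmres(A, b, M=M)      # info == 0 signals convergence
      print(info, np.linalg.norm(A @ x - b))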

  19. Application of photonuclear methods of analysis in biology, medicine, ecological studies

    International Nuclear Information System (INIS)

    Burmistenko, Yu.N.

    1986-01-01

    Examples of the application of photonuclear methods of analysis (PhMA) of substance composition in biology, medicine and ecology are considered. Methods for determining the element composition of soft and bone tissues, blood and urine have been developed. The results of studying the limits of determination for different elements are presented. In ecological investigations, PhMA is applied to studying the composition of atmospheric aerosols, industrial sewage and sewerage wastes, and the pollution of soil, plants and animals with toxic elements.

  20. Monte Carlo Method to Study Properties of Acceleration Factor Estimation Based on the Test Results with Varying Load

    Directory of Open Access Journals (Sweden)

    N. D. Tiannikova

    2014-01-01

    Full Text Available G.D. Kartashov developed a technique for determining the functions that scale rapid (accelerated) test results to the normal mode. Its feature is preliminary testing of products from one lot, including tests in alternating modes. The standard procedure of preliminary tests (researches) is as follows: n groups of products with m elements each start testing in the normal mode and, after a failure of one of the products in a group, the remaining products are tested in the accelerated mode. In addition to tests in the alternating mode, tests in the constantly normal mode are conducted as well. The acceleration factor of rapid tests for this type of product, identical for any lot, is determined using the test results of products from the same lot. A drawback of this technique is that tests in the alternating mode have to be conducted until all products fail, which is not always possible. To avoid this shortcoming, the Renyi criterion is proposed. It allows the scaling functions to be determined using right-censored data, thus giving the opportunity to stop testing before all products have failed. In this work, statistical modeling of the acceleration factor estimation obtained by Renyi statistics minimization is implemented by the Monte Carlo method. The results of the modeling show that the acceleration factor estimate obtained through Renyi statistics minimization is acceptable for rather large n. For small sample volumes, however, some systematic bias of the acceleration factor estimate, which decreases with growing n, is observed for both distributions (exponential and Weibull). Therefore the paper also presents calculated correction factors for the cases of the exponential and Weibull distributions.
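
    In the same spirit, a much-simplified Monte Carlo check of an acceleration-factor estimator can be sketched as follows; the exponential lifetimes, sample sizes and ratio-of-means estimator are assumptions for illustration, not the estimator studied in the paper.

      import numpy as np

      rng = np.random.default_rng(0)

      def acceleration_factor_bias(true_k=3.0, mean_normal=1000.0, n=10, trials=5000):
          estimates = np.empty(trials)
          for t in range(trials):
              normal = rng.exponential(mean_normal, size=n)
              accelerated = rng.exponential(mean_normal / true_k, size=n)
              estimates[t] = normal.mean() / accelerated.mean()  # ratio-of-means estimator
          return estimates.mean() - true_k                       # systematic bias

      for n in (5, 10, 50, 200):
          print(n, acceleration_factor_bias(n=n))   # bias shrinks as n grows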

  1. Comparative study of durability test methods for pellets and briquettes

    Energy Technology Data Exchange (ETDEWEB)

    Temmerman, Michaeel; Rabier, Fabienne [Centre wallon de Recherches agronomiques (CRA-W), 146, chaussee de Namur, B-5030, Gembloux (Belgium); Jensen, Peter Daugbjerg [Forest and Landscape, The Royal Veterinary and Agricultural University, Rolighedsvej 23, DK-1958 Frederiksberg C (Denmark); Hartmann, Hans; Boehm, Thorsten [Technologie- und Foerderzentrum fuer Nachwachsende Rohstoffe-TFZ, Schulgasse 18, D-94315 Straubing (Germany)

    2006-11-15

    Different methods for the determination of the mechanical durability (DU) of pellets and briquettes were compared in international round robin tests involving different laboratories. The DUs of five briquette and 26 pellet types were determined. For briquettes, different rotation numbers of a prototype tumbler and a calculated DU index are compared. For pellet testing, the study compares two standard methods, a tumbling device according to ASAE S 269.4 and the Lignotester according to ONORM M 7135, with a second tumbling method using a prototype tumbler. For the tested methods, the repeatability, the reproducibility and the minimum number of replications required to achieve given accuracy levels were calculated. Additionally, this study evaluates the relation between DU and particle density. The results show, for both pellets and briquettes, that the measured DU values and their variability are influenced by the applied method. Moreover, the variability of the results depends on the biofuel itself. For briquettes of DU above 90%, five replications lead to an accuracy of 2%, while 39 replications are needed to achieve an accuracy of 10% when briquettes of DU below 90% are tested. For pellets, the tumbling device described in the ASAE standard allows acceptable accuracy levels (1%) to be reached with a limited number of replications. Finally, for the tested pellets and briquettes no relation between DU and particle density was found. (author)

  2. Methods to assess intended effects of drug treatment in observational studies are reviewed

    NARCIS (Netherlands)

    Klungel, Olaf H|info:eu-repo/dai/nl/181447649; Martens, Edwin P|info:eu-repo/dai/nl/088859010; Psaty, Bruce M; Grobbee, Diederik E; Sullivan, Sean D; Stricker, Bruno H Ch; Leufkens, Hubert G M|info:eu-repo/dai/nl/075255049; de Boer, A|info:eu-repo/dai/nl/075097346

    2004-01-01

    BACKGROUND AND OBJECTIVE: To review methods that seek to adjust for confounding in observational studies when assessing intended drug effects. METHODS: We reviewed the statistical, economical and medical literature on the development, comparison and use of methods adjusting for confounding. RESULTS:

  3. Application of NUREG-1150 methods and results to accident management

    International Nuclear Information System (INIS)

    Dingman, S.; Sype, T.; Camp, A.; Maloney, K.

    1991-01-01

    The use of NUREG-1150 and similar probabilistic risk assessments in the Nuclear Regulatory Commission (NRC) and industry risk management programs is discussed. Risk management is more comprehensive than the commonly used term accident management. Accident management includes strategies to prevent vessel breach, mitigate radionuclide releases from the reactor coolant system, and mitigate radionuclide releases to the environment. Risk management also addresses prevention of accident initiators, prevention of core damage, and implementation of effective emergency response procedures. The methods and results produced in NUREG-1150 provide a framework within which current risk management strategies can be evaluated, and future risk management programs can be developed and assessed. Examples of the use of the NUREG-1150 framework for identifying and evaluating risk management options are presented. All phases of risk management are discussed, with particular attention given to the early phases of accidents. Plans and methods for evaluating accident management strategies that have been identified in the NRC accident management program are discussed

  4. Application of NUREG-1150 methods and results to accident management

    International Nuclear Information System (INIS)

    Dingman, S.; Sype, T.; Camp, A.; Maloney, K.

    1990-01-01

    The use of NUREG-1150 and similar Probabilistic Risk Assessments in NRC and industry risk management programs is discussed. ''Risk management'' is more comprehensive than the commonly used term ''accident management.'' Accident management includes strategies to prevent vessel breach, mitigate radionuclide releases from the reactor coolant system, and mitigate radionuclide releases to the environment. Risk management also addresses prevention of accident initiators, prevention of core damage, and implementation of effective emergency response procedures. The methods and results produced in NUREG-1150 provide a framework within which current risk management strategies can be evaluated, and future risk management programs can be developed and assessed. Examples of the use of the NUREG-1150 framework for identifying and evaluating risk management options are presented. All phases of risk management are discussed, with particular attention given to the early phases of accidents. Plans and methods for evaluating accident management strategies that have been identified in the NRC accident management program are discussed. 2 refs., 3 figs

  5. Methods of Efficient Study Habits and Physics Learning

    Science.gov (United States)

    Zettili, Nouredine

    2010-02-01

    We want to discuss the methods of efficient study habits and how they can be used by students to help them improve learning physics. In particular, we deal with the most efficient techniques needed to help students improve their study skills. We focus on topics such as how to develop long-term memory, how to improve concentration power, how to take class notes, how to prepare for and take exams, and how to study scientific subjects such as physics. We argue that students who conscientiously use the methods of efficient study habits achieve higher results than those who do not; moreover, a student equipped with the proper study skills will spend much less time learning a subject than a student who has no good study habits. The underlying issue here is not the quantity of time allocated to study efforts by the students, but the efficiency and quality of actions, so that the student can function at peak efficiency. These ideas were developed as part of Project IMPACTSEED (IMproving Physics And Chemistry Teaching in SEcondary Education), an outreach grant funded by the Alabama Commission on Higher Education. This project is motivated by a major pressing local need: a large number of high school physics teachers teach out of field.

  6. Hybrid Method for Mobile learning Cooperative: Study of Timor Leste

    Science.gov (United States)

    da Costa Tavares, Ofelia Cizela; Suyoto; Pranowo

    2018-02-01

    Decision support systems are very useful for helping to solve problems in the modern world, and this study discusses the learning process of savings and loan cooperatives in Timor Leste. The observation shows that the people of Timor Leste are still in the process of learning to use a DSS for running a sound savings and loan cooperative. Based on existing research on credit cooperatives in the Timor Leste community, a mobile application will be built to support the cooperative learning process in East Timorese society. The methods used for decision making are AHP (Analytical Hierarchy Process) and SAW (Simple Additive Weighting), which are applied to score each criterion and its weight. The result of this research is a mobile learning cooperative in a decision support system using the SAW and AHP methods. Originality Value: the mobile application is developed with the AHP and SAW methods to support the decision-making process of a savings and credit cooperative in Timor Leste.

  7. Hybrid Method for Mobile learning Cooperative: Study of Timor Leste

    Directory of Open Access Journals (Sweden)

    da Costa Tavares Ofelia Cizela

    2018-01-01

    Full Text Available Decision support systems are very useful for helping to solve problems in the modern world, and this study discusses the learning process of savings and loan cooperatives in Timor Leste. The observation shows that the people of Timor Leste are still in the process of learning to use a DSS for running a sound savings and loan cooperative. Based on existing research on credit cooperatives in the Timor Leste community, a mobile application will be built to support the cooperative learning process in East Timorese society. The methods used for decision making are AHP (Analytical Hierarchy Process) and SAW (Simple Additive Weighting), which are applied to score each criterion and its weight. The result of this research is a mobile learning cooperative in a decision support system using the SAW and AHP methods. Originality Value: the mobile application is developed with the AHP and SAW methods to support the decision-making process of a savings and credit cooperative in Timor Leste.
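    As a rough illustration of the SAW step described in the two records above, the sketch below ranks a few hypothetical alternatives; the decision matrix, the weights (which in the study would come from AHP pairwise comparisons) and the criterion types are all invented for the example.

```python
import numpy as np

# Hypothetical decision matrix: rows = alternatives, columns = criteria.
scores = np.array([
    [7.0, 3.0, 500.0],   # alternative A
    [9.0, 4.0, 800.0],   # alternative B
    [6.0, 5.0, 400.0],   # alternative C
])
weights = np.array([0.5, 0.3, 0.2])          # assumed weights (sum to 1)
is_benefit = np.array([True, True, False])   # third criterion is a cost

# SAW normalization: benefit criteria are divided by the column maximum,
# cost criteria use the column minimum divided by the value.
norm = np.where(is_benefit,
                scores / scores.max(axis=0),
                scores.min(axis=0) / scores)

# The weighted sum gives the preference score of each alternative.
preference = norm @ weights
print(preference, "-> best alternative index:", preference.argmax())
```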

  8. Elucidation of a method for study on sports tactics

    OpenAIRE

    内山, 治樹

    2007-01-01

    To study sports tactics it is important to extract the deep-layer arrangement that makes the tactics what they really are, rather than analyzing their surface phenomenal forms. The purpose of this study was to propose the "original doctrine-based approach" as a logical method suitable for studying the arrangement of sports tactics and to examine its objective reasonableness. As a result of considering and examining three viewpoints, namely the conceptual regulations of spo...

  9. Natural science methods in field archaeology, with the case study of Crimea

    Science.gov (United States)

    Smekalova, T. N.; Yatsishina, E. B.; Garipov, A. S.; Pasumanskii, A. E.; Ketsko, R. S.; Chudin, A. V.

    2016-07-01

    The natural science methods applied in archaeological field survey are briefly reviewed. They are classified into several groups: remote sensing (analysis of space and aerial photographs, viewshed analysis, study of detailed topographic and special maps, and three-dimensional photogrammetry), geophysical survey, and analysis of cultural layer elements (by geochemical, paleosol, and other methods). The most important principle is the integration of complementary nondestructive and fast natural science methods in order to obtain the most complete and reliable results. Emphasis is placed on the geophysical methods of the study, primarily magnetic exploration. A multidisciplinary study of the monuments of ancient Chersonesos and its "barbarian" environment is described as an example of successful application of a complex technique.

  10. The study of diagnostic accuracy of chest nodules by using different compression methods

    International Nuclear Information System (INIS)

    Liang Zhigang; Kuncheng, L.I.; Zhang Jinghong; Liu Shuliang

    2005-01-01

    Background: The purpose of this study was to compare the diagnostic accuracy for small nodules in the chest by using different compression methods. Method: Two radiologists with 5 years' experience each interpreted 39 chest images twice, using lossless and lossy compression methods. The time interval was 3 weeks. Each time the radiologists interpreted one kind of compressed images. The image browser used the Unisight software provided by the Atlastiger Company in Shanghai. The interpretation results were analyzed with the ROCKIT software and the ROC curves were plotted in Excel 2002. Results: In studies of receiver operating characteristics for scoring the presence or absence of nodules, the images compressed with the lossy method showed no statistical difference compared with the images compressed with the lossless method. Conclusion: The diagnostic accuracy for chest nodules using the lossless and lossy compression methods showed no significant difference; the lossy compression method could therefore be used to transmit and archive chest images with nodules.

  11. A comparative study of different methods for calculating electronic transition rates

    Science.gov (United States)

    Kananenka, Alexei A.; Sun, Xiang; Schubert, Alexander; Dunietz, Barry D.; Geva, Eitan

    2018-03-01

    We present a comprehensive comparison of the following mixed quantum-classical methods for calculating electronic transition rates: (1) nonequilibrium Fermi's golden rule, (2) mixed quantum-classical Liouville method, (3) mean-field (Ehrenfest) mixed quantum-classical method, and (4) fewest switches surface-hopping method (in diabatic and adiabatic representations). The comparison is performed on the Garg-Onuchic-Ambegaokar benchmark charge-transfer model, over a broad range of temperatures and electronic coupling strengths, with different nonequilibrium initial states, in the normal and inverted regimes. Under weak to moderate electronic coupling, the nonequilibrium Fermi's golden rule rates are found to be in good agreement with the rates obtained via the mixed quantum-classical Liouville method that coincides with the fully quantum-mechanically exact results for the model system under study. Our results suggest that the nonequilibrium Fermi's golden rule can serve as an inexpensive yet accurate alternative to Ehrenfest and the fewest switches surface-hopping methods.
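    For orientation only: the textbook, equilibrium form of Fermi's golden rule for the donor-to-acceptor transition rate is shown below; the nonequilibrium variant compared in the paper generalizes this expression to time-dependent rates with nonequilibrium initial nuclear states.

```latex
k_{D \to A} \;=\; \frac{2\pi}{\hbar}\,\bigl|\langle A \,|\, \hat{V} \,|\, D \rangle\bigr|^{2}\,\rho(E_{A})
```

    Here V̂ is the electronic (diabatic) coupling and ρ(E_A) is the Franck-Condon-weighted density of final states.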

  12. Methods for studying short-range order in solid binary solutions

    International Nuclear Information System (INIS)

    Beranger, Gerard

    1969-12-01

    The definition of short-range order and its characteristic parameters are first recalled. The different methods to study short-range order are then examined: X-ray scattering, electrical resistivity, specific heat and thermoelectric power, neutron diffraction, electron spin resonance, and the study of thermodynamic and mechanical properties. The theory of the X-ray diffraction effects due to short-range order and the corresponding experimental method are emphasized. The principal results obtained for binary systems by the different experimental techniques are reported and briefly discussed. The Au-Cu, Li-Mg, Au-Ni and Cu-Zn systems are moreover described. (author) [fr

  13. Active teaching methods, studying responses and learning

    DEFF Research Database (Denmark)

    Christensen, Hans Peter; Vigild, Martin Etchells; Thomsen, Erik Vilain

    Students’ study strategies when exposed to activating teaching methods are measured, analysed and compared to study strategies in more traditional lecture-based teaching.

  14. Recommendations for describing statistical studies and results in general readership science and engineering journals.

    Science.gov (United States)

    Gardenier, John S

    2012-12-01

    This paper recommends how authors of statistical studies can communicate to general audiences fully, clearly, and comfortably. The studies may use statistical methods to explore issues in science, engineering, and society or they may address issues in statistics specifically. In either case, readers without explicit statistical training should have no problem understanding the issues, the methods, or the results at a non-technical level. The arguments for those results should be clear, logical, and persuasive. This paper also provides advice for editors of general journals on selecting high quality statistical articles without the need for exceptional work or expense. Finally, readers are also advised to watch out for some common errors or misuses of statistics that can be detected without a technical statistical background.

  15. Quantifying viruses and bacteria in wastewater—Results, interpretation methods, and quality control

    Science.gov (United States)

    Francy, Donna S.; Stelzer, Erin A.; Bushon, Rebecca N.; Brady, Amie M.G.; Mailot, Brian E.; Spencer, Susan K.; Borchardt, Mark A.; Elber, Ashley G.; Riddell, Kimberly R.; Gellner, Terry M.

    2011-01-01

    Membrane bioreactors (MBR), used for wastewater treatment in Ohio and elsewhere in the United States, have pore sizes small enough to theoretically reduce concentrations of protozoa and bacteria, but not viruses. Sampling for viruses in wastewater is seldom done and not required. Instead, the bacterial indicators Escherichia coli (E. coli) and fecal coliforms are the required microbial measures of effluents for wastewater-discharge permits. Information is needed on the effectiveness of MBRs in removing human enteric viruses from wastewaters, particularly as compared to conventional wastewater treatment before and after disinfection. A total of 73 regular and 28 quality-control (QC) samples were collected at three MBR and two conventional wastewater plants in Ohio during 23 regular and 3 QC sampling trips in 2008-10. Samples were collected at various stages in the treatment processes and analyzed for bacterial indicators E. coli, fecal coliforms, and enterococci by membrane filtration; somatic and F-specific coliphage by the single agar layer (SAL) method; adenovirus, enterovirus, norovirus GI and GII, rotavirus, and hepatitis A virus by molecular methods; and viruses by cell culture. While addressing the main objective of the study (comparing removal of viruses and bacterial indicators in MBR and conventional plants), it was realized that work was needed to identify data analysis and quantification methods for interpreting enteric virus and QC data. Therefore, methods for quantifying viruses, qualifying results, and applying QC data to interpretations are described in this report. During each regular sampling trip, samples were collected (1) before conventional or MBR treatment (post-preliminary), (2) after secondary or MBR treatment (post-secondary or post-MBR), (3) after tertiary treatment (one conventional plant only), and (4) after disinfection (post-disinfection). Glass-wool fiber filtration was used to concentrate enteric viruses from large volumes, and small

  16. Interval estimation methods of the mean in small sample situation and the results' comparison

    International Nuclear Information System (INIS)

    Wu Changli; Guo Chunying; Jiang Meng; Lin Yuangen

    2009-01-01

    The methods of interval estimation of the sample mean, namely the classical method, the bootstrap method, the Bayesian bootstrap method, the jackknife method and the spread method of the empirical characteristic distribution function, are described. Numerical calculation of the intervals for the sample mean is carried out for sample sizes of 4, 5 and 6. The results indicate that the bootstrap method and the Bayesian bootstrap method are much more appropriate than the others in small-sample situations. (authors)
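    For illustration, a minimal percentile-bootstrap interval for a small sample is sketched below; the sample values are invented, and the Bayesian bootstrap and jackknife variants compared in the record differ in how the resampling weights are drawn.

```python
import numpy as np

rng = np.random.default_rng(0)
sample = np.array([4.1, 5.3, 4.8, 5.9, 4.4])   # hypothetical small sample (n = 5)

# Nonparametric bootstrap: resample with replacement and collect the means.
n_boot = 10_000
boot_means = np.array([
    rng.choice(sample, size=sample.size, replace=True).mean()
    for _ in range(n_boot)
])

# Percentile interval for the mean at the 95 % level.
lower, upper = np.percentile(boot_means, [2.5, 97.5])
print(f"bootstrap 95% interval for the mean: ({lower:.2f}, {upper:.2f})")
```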

  17. The Vermont oxford neonatal encephalopathy registry: rationale, methods, and initial results

    Science.gov (United States)

    2012-01-01

    Background In 2006, the Vermont Oxford Network (VON) established the Neonatal Encephalopathy Registry (NER) to characterize infants born with neonatal encephalopathy, describe evaluations and medical treatments, monitor hypothermic therapy (HT) dissemination, define clinical research questions, and identify opportunities for improved care. Methods Eligible infants were ≥ 36 weeks' gestation with seizures, altered consciousness (stupor, coma) during the first 72 hours of life, a 5-minute Apgar score of ≤ 3, or receiving HT. Infants with central nervous system birth defects were excluded. Results From 2006–2010, 95 centers registered 4232 infants. Of those, 59% suffered a seizure, 50% had a 5-minute Apgar score of ≤ 3, 38% received HT, and 18% had stupor/coma documented on neurologic exam. Some infants experienced more than one eligibility criterion. Only 53% had a cord gas obtained and only 63% had a blood gas obtained within 24 hours of birth, important components for determining HT eligibility. Sixty-four percent received ventilator support, 65% received anticonvulsants, 66% had a head MRI, 23% had a cranial CT, 67% had a full-channel electroencephalogram (EEG) and 33% an amplitude-integrated EEG. Of all infants, 87% survived. Conclusions The VON NER describes the heterogeneous population of infants with NE, the subset that received HT, their patterns of care, and outcomes. The optimal routine care of infants with neonatal encephalopathy is unknown. The registry method is well suited to identify opportunities for improvement in the care of infants affected by NE and to study interventions such as HT as they are implemented in clinical practice. PMID:22726296

  18. Recent Studies on Trojan Horse Method

    International Nuclear Information System (INIS)

    Cherubini, S.; Spitaleri, C.; Gulino, M.

    2011-01-01

    The study of nuclear reactions that are important for the understanding of astrophysical problems received an increasing attention over the last decades. The Trojan Horse Method was proposed as a tool to overcome some of the problems connected with the measurement of cross-sections between charged particles at astrophysical energies. Here we present some recent studies on this method. (authors)

  19. Comparative study of discretization methods of microarray data for inferring transcriptional regulatory networks

    Directory of Open Access Journals (Sweden)

    Ji Wei

    2010-10-01

    Full Text Available Abstract Background Microarray data discretization is a basic preprocessing step for many algorithms of gene regulatory network inference. Some common discretization methods from informatics are used to discretize microarray data. Selection of the discretization method is often arbitrary, and no systematic comparison of different discretization methods has been conducted in the context of gene regulatory network inference from time-series gene expression data. Results In this study, we propose a new discretization method, "bikmeans", and compare its performance with four other widely used discretization methods using different datasets, modeling algorithms and numbers of intervals. Sensitivities, specificities and total accuracies were calculated and statistical analysis was carried out. The bikmeans method always gave high total accuracies. Conclusions Our results indicate that proper discretization methods can consistently improve gene regulatory network inference independent of network modeling algorithms and datasets. Our new method, bikmeans, resulted in significantly better total accuracies than the other methods.
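    The bikmeans algorithm itself is not described in the abstract; as a generic illustration of clustering-based discretization of an expression profile, one might do something like the following sketch, in which the expression values and the choice of three levels are arbitrary.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical expression profile of one gene over 12 time points.
expression = np.array([0.2, 0.3, 0.25, 1.1, 1.3, 1.2,
                       2.4, 2.6, 2.5, 1.0, 0.9, 0.4]).reshape(-1, 1)

# Cluster the values into k intervals; each cluster label becomes a discrete
# expression level, relabelled so that levels are ordered by cluster centre.
k = 3
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(expression)
order = np.argsort(km.cluster_centers_.ravel())          # low -> high
relabel = {int(old): new for new, old in enumerate(order)}
discrete = np.array([relabel[int(c)] for c in km.labels_])
print(discrete)
```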

  20. Comparative study of two commercially pure titanium casting methods

    Directory of Open Access Journals (Sweden)

    Renata Cristina Silveira Rodrigues

    2010-10-01

    Full Text Available The interest in using titanium to fabricate removable partial denture (RPD) frameworks has increased, but there are few studies evaluating the effects of casting methods on clasp behavior. OBJECTIVE: This study compared the occurrence of porosities and the retentive force of commercially pure titanium (CP Ti) and cobalt-chromium (Co-Cr) removable partial denture circumferential clasps cast by induction/centrifugation and plasma/vacuum-pressure. MATERIAL AND METHODS: 72 frameworks were cast from CP Ti (n=36) and Co-Cr alloy (n=36; control group). For each material, 18 frameworks were cast by electromagnetic induction and injected by centrifugation, whereas the other 18 were cast by plasma and injected by vacuum-pressure. For each casting method, three subgroups (n=6) were formed: 0.25 mm, 0.50 mm, and 0.75 mm undercuts. The specimens were radiographed and subjected to an insertion/removal test simulating 5 years of framework use. Data were analyzed by ANOVA and Tukey's test to compare materials and casting methods (α=0.05). RESULTS: Three of 18 specimens of the induction/centrifugation group and 9 of 18 specimens of the plasma/vacuum-pressure group presented porosities, but only 1 and 7 specimens, respectively, were rejected for the simulation test. For the Co-Cr alloy, no defects were found. Comparing the casting methods, statistically significant differences (p<0.05) were observed only for the Co-Cr alloy with 0.25 mm and 0.50 mm undercuts. Significant differences were found for the 0.25 mm and 0.75 mm undercuts depending on the material used. For the 0.50 mm undercut, significant differences were found when the materials were induction cast. CONCLUSION: Although both casting methods produced satisfactory CP Ti RPD frameworks, the occurrence of porosities was greater with the plasma/vacuum-pressure method than with the induction/centrifugation method, the latter resulting in higher clasp rigidity and generating higher retention force values.

  1. Analytical method and result of radiation exposure for depressurization accident of HTTR

    International Nuclear Information System (INIS)

    Sawa, K.; Shiozawa, S.; Mikami, H.

    1990-01-01

    The Japan Atomic Energy Research Institute (JAERI) is now proceeding with the construction design of the High Temperature Engineering Test Reactor (HTTR). Since the HTTR has some characteristics different from LWRs, the analytical method of radiation exposure in accidents provided for LWRs cannot be applied directly. This paper describes the analytical method of radiation exposure developed by JAERI for the depressurization accident, which is the most severe accident with respect to radiation exposure among the design basis accidents of the HTTR. The result is also described in this paper.

  2. A study on manufacturing and construction method of buffer

    International Nuclear Information System (INIS)

    Chijimatsu, Masakazu; Sugita, Yutaka; Amemiya, Kiyoshi

    1999-09-01

    As an engineered barrier system in the geological disposal of high-level waste, a multibarrier system is considered. The multibarrier system consists of the vitrified waste, the overpack and the buffer. Bentonite is one of the potential materials for the buffer because of its low water permeability, self-sealing properties, radionuclide adsorption and retardation properties, thermal conductivity, chemical buffering properties, overpack supporting properties, stress buffering properties, etc. In order to evaluate the functions of the buffer, many experiments have been conducted. The evaluations of these functions are based on the assumption that the buffer is emplaced or constructed in the disposal tunnel (or disposal pit) properly. Therefore, it is necessary to study the manufacturing/construction method of the buffer. As manufacturing/construction technologies for the buffer, the block installation method and the in-situ compaction method, etc., are being investigated. The block installation method is to emplace buffer blocks manufactured in advance at the ground facility, and the construction processes of the block installation method underground will be simplified compared with the in-situ compaction method. On the other hand, the in-situ compaction method is to introduce the buffer material with a specified water content into the disposal tunnel and to make the buffer with high density at the site using a compaction machine. With regard to the in-situ compaction method, it is necessary to investigate the optimum finished thickness of one layer because it is impossible to construct the buffer at one time. This report describes the results of compaction property tests and summarizes past investigation results in connection with the manufacturing/construction method. This report then shows the construction method that will be feasible at the actual disposal site. (J.P.N.)

  3. Assessing Cost-Effectiveness in Obesity (ACE-Obesity: an overview of the ACE approach, economic methods and cost results

    Directory of Open Access Journals (Sweden)

    Swinburn Boyd

    2009-11-01

    Full Text Available Abstract Background The aim of the ACE-Obesity study was to determine the economic credentials of interventions which aim to prevent unhealthy weight gain in children and adolescents. We have reported elsewhere on the modelled effectiveness of 13 obesity prevention interventions in children. In this paper, we report on the cost results and associated methods together with the innovative approach to priority setting that underpins the ACE-Obesity study. Methods The Assessing Cost Effectiveness (ACE) approach combines technical rigour with 'due process' to facilitate evidence-based policy analysis. Technical rigour was achieved through use of standardised evaluation methods, a research team that assembles best available evidence and extensive uncertainty analysis. Cost estimates were based on pathway analysis, with resource usage estimated for the interventions and their 'current practice' comparator, as well as associated cost offsets. Due process was achieved through involvement of stakeholders, consensus decisions informed by briefing papers and 2nd stage filter analysis that captures broader factors that influence policy judgements in addition to cost-effectiveness results. The 2nd stage filters agreed by stakeholders were 'equity', 'strength of the evidence', 'feasibility of implementation', 'acceptability to stakeholders', 'sustainability' and 'potential for side-effects'. Results The intervention costs varied considerably, both in absolute terms (from cost saving [6 interventions] to in excess of AUD50m per annum) and when expressed as a 'cost per child' estimate (from Conclusion The use of consistent methods enables valid comparison of potential intervention costs and cost-offsets for each of the interventions. ACE-Obesity informs policy-makers about cost-effectiveness, health impact, affordability and 2nd stage filters for important options for preventing unhealthy weight gain in children. In related articles cost-effectiveness results and

  4. A Comparative Study of Distribution System Parameter Estimation Methods

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Yannan; Williams, Tess L.; Gourisetti, Sri Nikhil Gup

    2016-07-17

    In this paper, we compare two parameter estimation methods for distribution systems: residual sensitivity analysis and state-vector augmentation with a Kalman filter. These two methods were originally proposed for transmission systems, and are still the most commonly used methods for parameter estimation. Distribution systems have much lower measurement redundancy than transmission systems. Therefore, estimating parameters is much more difficult. To increase the robustness of parameter estimation, the two methods are applied with combined measurement snapshots (measurement sets taken at different points in time), so that the redundancy for computing the parameter values is increased. The advantages and disadvantages of both methods are discussed. The results of this paper show that state-vector augmentation is a better approach for parameter estimation in distribution systems. Simulation studies are done on a modified version of IEEE 13-Node Test Feeder with varying levels of measurement noise and non-zero error in the other system model parameters.
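    State-vector augmentation treats an unknown model parameter as an extra, (nearly) constant state and lets the Kalman filter estimate it together with the ordinary states. The toy sketch below uses an invented scalar system rather than a distribution feeder model, with made-up noise levels, purely to illustrate the idea.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical scalar system x[k+1] = x[k] + b*u[k] + w, y[k] = x[k] + v,
# where the gain b is an unknown parameter to be estimated.
b_true, n_steps = 0.7, 200
u = rng.uniform(-1.0, 1.0, n_steps)
x = np.zeros(n_steps)
for k in range(n_steps - 1):
    x[k + 1] = x[k] + b_true * u[k] + 0.01 * rng.standard_normal()
y = x + 0.05 * rng.standard_normal(n_steps)

# Augmented state z = [x, b]; the parameter is modelled as a constant state.
z = np.array([0.0, 0.0])          # initial guess
P = np.diag([1.0, 1.0])           # initial covariance
Q = np.diag([1e-4, 1e-8])         # process noise (parameter nearly constant)
R = 0.05 ** 2                     # measurement noise variance
H = np.array([[1.0, 0.0]])

for k in range(n_steps - 1):
    # Predict with the time-varying transition matrix F = [[1, u_k], [0, 1]].
    F = np.array([[1.0, u[k]], [0.0, 1.0]])
    z = F @ z
    P = F @ P @ F.T + Q
    # Update with the measurement y[k+1].
    S = H @ P @ H.T + R
    K = P @ H.T / S
    z = z + (K * (y[k + 1] - H @ z)).ravel()
    P = (np.eye(2) - K @ H) @ P

print("estimated parameter b:", round(float(z[1]), 3), "(true value:", b_true, ")")
```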

  5. Study on numerical methods for transient flow induced by speed-changing impeller of fluid machinery

    International Nuclear Information System (INIS)

    Wu, Dazhuan; Chen, Tao; Wang, Leqin; Cheng, Wentao; Sun, Youbo

    2013-01-01

    In order to establish a reliable numerical method for solving the transient rotating flow induced by a speed-changing impeller, two numerical methods based on finite volume method (FVM) were presented and analyzed in this study. Two-dimensional numerical simulations of incompressible transient unsteady flow induced by an impeller during starting process were carried out respectively by using DM and DSR methods. The accuracy and adaptability of the two methods were evaluated by comprehensively comparing the calculation results. Moreover, an intensive study on the application of DSR method was conducted subsequently. The results showed that transient flow structure evolution and transient characteristics of the starting impeller are obviously affected by the starting process. The transient flow can be captured by both two methods, and the DSR method shows a higher computational efficiency. As an application example, the starting process of a mixed-flow pump was simulated by using DSR method. The calculation results were analyzed by comparing with the experiment data.

  6. Qualitative and quantitative methods for human factor analysis and assessment in NPP. Investigations and results

    International Nuclear Information System (INIS)

    Hristova, R.; Kalchev, B.; Atanasov, D.

    2005-01-01

    We consider here two basic groups of methods for analysis and assessment of the human factor in the NPP area and give some results from the performed analyses as well. The human factor is the human interaction with the design equipment and with the working environment, and takes into account human capabilities and limits. Within the framework of the qualitative methods for analysis of the human factor, concepts and structural methods for classifying the information connected with the human factor are considered. Emphasis is given to the HPES method for human factor analysis in NPPs. Methods for quantitative assessment of human reliability are considered. These methods allow probabilities to be assigned to the elements of the already structured information about human performance. This part includes an overview of classical methods for human reliability assessment (HRA, THERP) and methods taking into account specific information about human capabilities and limits and about the man-machine interface (CHR, HEART, ATHEANA). Quantitative and qualitative results concerning the influence of the human factor on the occurrence of initiating events at the Kozloduy NPP are presented. (authors)

  7. An Open Conversation on Using Eye-Gaze Methods in Studies of Neurodevelopmental Disorders

    Science.gov (United States)

    Venker, Courtney E.; Kover, Sara T.

    2015-01-01

    Purpose: Eye-gaze methods have the potential to advance the study of neurodevelopmental disorders. Despite their increasing use, challenges arise in using these methods with individuals with neurodevelopmental disorders and in reporting sufficient methodological detail such that the resulting research is replicable and interpretable. Method: This…

  8. Studying collaborative information seeking: Experiences with three methods

    DEFF Research Database (Denmark)

    Hyldegård, Jette Seiden; Hertzum, Morten; Hansen, Preben

    2015-01-01

    , however, benefit from a discussion of methodological issues. This chapter describes the application of three methods for collecting and analyzing data in three CIS studies. The three methods are Multidimensional Exploration, used in a CIS study of students’ information behavior during a group assignment; Task-structured Observation, used in a CIS study of patent engineers; and Condensed Observation, used in a CIS study of information-systems development. The three methods are presented in the context of the studies for which they were devised, and the experiences gained using the methods are discussed. The chapter shows that different methods can be used for collecting and analyzing data about CIS incidents. Two of the methods focused on tasks and events in work settings, while the third was applied in an educational setting. Commonalities and differences among the methods are discussed to inform decisions...

  9. CT-guided percutaneous neurolysis methods. State of the art and first results

    International Nuclear Information System (INIS)

    Schneider, B.; Richter, G.M.; Roeren, T.; Kauffmann, G.W.

    1996-01-01

    We used 21G or 22G fine needles. All CT-guided percutaneous neurolysis methods require proper blood coagulation. Most common CT scanners are suitable for neurolysis if there is enough room for maintaining sterile conditions. All neurolysis methods involve sterile puncture of the ganglia under local anesthesia, a test block with anesthetic and contrast agent to assess the clinical effect, and the definitive block with a mixture of 96% ethanol and local anesthetic. This allows us to correct the position of the needle if we see improper distribution of the test block or unwanted side effects. Though inflammatory complications of the peritoneum due to puncture are rarely seen, we prefer the dorsal approach whenever possible. Results: Seven of 20 legs showed at least transient clinical improvement after CT-guided lumbar sympathectomies; 13 legs had to be amputated. Results of the methods in the literature differ. For lumbar sympathectomy, improved perfusion is reported in 39-89%, depending on the pre-selection of the patient group. Discussion: It was recently proved that sympathectomy improves perfusion not only of the skin but also of the muscle. The hypothesis of a steal effect of sympathectomy on skin perfusion was disproved. Modern aggressive surgical and interventional treatment often leaves for sympathectomy only those patients whose reserves of collateralization are nearly exhausted. We presume this is the reason for the different results we found in our patient group. For thoracic sympathectomy the clinical outcome depends very much on the indications. Whereas palmar hyperhidrosis offers nearly 100% success, only 60-70% of patients with disturbance of perfusion have benefited. Results in celiac ganglion block also differ. Patients with carcinoma of the pancreas and other organs of the upper abdomen benefit in 80-100% of all cases, patients with chronic pancreatitis in 60-80%. (orig./VHE) [de

  10. Tank 48H Waste Composition and Results of Investigation of Analytical Methods

    Energy Technology Data Exchange (ETDEWEB)

    Walker , D.D. [Westinghouse Savannah River Company, AIKEN, SC (United States)

    1997-04-02

    This report serves two purposes. First, it documents the analytical results of Tank 48H samples taken between April and August 1996. Second, it describes investigations of the precision of the sampling and analytical methods used on the Tank 48H samples.

  11. Review of quantum Monte Carlo methods and results for Coulombic systems

    International Nuclear Information System (INIS)

    Ceperley, D.

    1983-01-01

    The various Monte Carlo methods for calculating ground state energies are briefly reviewed. Then a summary of the charged systems that have been studied with Monte Carlo is given. These include the electron gas, small molecules, a metal slab and many-body hydrogen

  12. Tensile strength of concrete under static and intermediate strain rates: Correlated results from different testing methods

    International Nuclear Information System (INIS)

    Wu Shengxing; Chen Xudong; Zhou Jikai

    2012-01-01

    Highlights: ► Tensile strength of concrete increases with increasing strain rate. ► The strain rate sensitivity of the tensile strength of concrete depends on the test method. ► The high stressed volume method can correlate results from various test methods. - Abstract: This paper presents a comparative experiment and analysis of three different methods (direct tension, splitting tension and four-point loading flexural tests) for determination of the tensile strength of concrete under low and intermediate strain rates. In addition, the objective of this investigation is to analyze the suitability of the high stressed volume approach and the Weibull effective volume method for correlating the results of different tensile tests of concrete. The test results show that the strain rate sensitivity of the tensile strength depends on the type of test: the splitting tensile strength of concrete is more sensitive to an increase in strain rate than the flexural and direct tensile strengths. The high stressed volume method could be used to obtain a tensile strength value of concrete free from the influence of the characteristics of tests and specimens. However, the Weibull effective volume method is an inadequate method for describing the failure of concrete specimens determined by different testing methods.
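    For context, the Weibull effective volume approach rests on the weakest-link size-effect relation below, in which m is the Weibull modulus and V_E denotes the effective volume of each specimen type; the exact formulation of the high stressed volume variant used in the paper may differ.

```latex
\frac{\sigma_{1}}{\sigma_{2}} \;=\; \left( \frac{V_{E,2}}{V_{E,1}} \right)^{1/m}
```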

  13. Development of methods to measure hemoglobin adducts by gel electrophoresis - Preliminary results

    International Nuclear Information System (INIS)

    Sun, J.D.; McBride, S.M.

    1988-01-01

    Chemical adducts formed on blood hemoglobin may be a useful biomarker for assessing human exposures to these compounds. This paper reports preliminary results in the development of methods to measure such adducts that may be generally applicable for a wide variety of chemicals. Male F344/N rats were intraperitoneally injected with 14C-BaP dissolved in corn oil. Twenty-four hours later, the rats were sacrificed. Blood samples were collected and globin was isolated. Globin protein was then cleaved into peptide fragments using cyanogen bromide and the fragments separated using 2-dimensional gel electrophoresis. The results showed that the adducted 14C-globin fragments migrated to different areas of the gel than did unadducted fragments. Further research is being conducted to develop methods that will allow quantitation of separated adducted globin fragments from human blood samples without the use of a radiolabel. (author)

  14. Comparison result of inversion of gravity data of a fault by particle swarm optimization and Levenberg-Marquardt methods.

    Science.gov (United States)

    Toushmalani, Reza

    2013-01-01

    The purpose of this study was to compare the performance of two methods for gravity inversion of a fault. The first method, particle swarm optimization (PSO), is a heuristic global optimization algorithm based on swarm intelligence; it originates from research on the movement behavior of bird flocks and fish schools. The second method, the Levenberg-Marquardt algorithm (LM), is an approximation to Newton's method that is also used for training ANNs. In this paper we first discuss the gravity field of a fault, then describe the PSO and LM algorithms, and present the application of the Levenberg-Marquardt algorithm and a particle swarm algorithm to solving the inverse problem of a fault. Most importantly, the parameters of the algorithms are given for the individual tests. The inverse solution reveals that the fault model parameters agree quite well with the known results. Better agreement between the predicted model anomaly and the observed gravity anomaly was found with the PSO method than with the LM method.
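    The following is a minimal global-best PSO sketch on a placeholder misfit function; the swarm size, inertia and acceleration coefficients are typical textbook values rather than the settings used in the study, and a real application would replace the objective with the misfit between modelled and observed fault gravity anomalies.

```python
import numpy as np

rng = np.random.default_rng(2)

def objective(x):
    # Placeholder misfit (sphere function); a gravity inversion would compare
    # modelled and observed anomalies of the fault model instead.
    return np.sum(x ** 2, axis=-1)

# Basic particle swarm optimizer (global-best variant).
n_particles, n_dims, n_iters = 30, 2, 200
w, c1, c2 = 0.7, 1.5, 1.5                     # inertia and acceleration weights

pos = rng.uniform(-5.0, 5.0, (n_particles, n_dims))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = objective(pos)
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(n_iters):
    r1, r2 = rng.random((2, n_particles, n_dims))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    val = objective(pos)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("best solution:", gbest, "misfit:", objective(gbest))
```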

  15. Performance of various mathematical methods for calculation of radioimmunoassay results

    International Nuclear Information System (INIS)

    Sandel, P.; Vogt, W.

    1977-01-01

    Interpolation and regression methods are available for the computer-aided determination of radioimmunological end results. We compared the performance of eight algorithms (weighted and unweighted linear logit-log regression, quadratic logit-log regression, Rodbard's logistic model in the weighted and unweighted form, smoothing spline interpolation with a large and a small smoothing factor, and polygonal interpolation) on the basis of three radioimmunoassays with different reference curve characteristics (digoxin, estriol, human chorionic somatomammotropin = HCS). Great store was set by the accuracy of the approximation at the intermediate points on the curve, i.e. those points that lie midway between two standard concentrations. These concentrations were obtained by weighing and inserted as unknown samples. In the case of digoxin and estriol the polygonal interpolation provided the best results, while the weighted logit-log regression proved superior in the case of HCS. (orig.) [de
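    As an illustration of the simplest of these approaches, an unweighted linear logit-log calibration might look like the sketch below; the standard concentrations and responses are invented, and the weighted, logistic, spline and polygonal variants compared in the record are not shown.

```python
import numpy as np

# Hypothetical calibration standards: concentration (arbitrary units) and
# measured response expressed as the bound fraction B/B0 in (0, 1).
conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])
b_b0 = np.array([0.85, 0.74, 0.60, 0.45, 0.30, 0.18])

# Unweighted linear logit-log fit: logit(B/B0) = a + b * ln(concentration).
logit = np.log(b_b0 / (1.0 - b_b0))
b, a = np.polyfit(np.log(conc), logit, 1)   # slope, intercept

def estimate_concentration(response):
    """Invert the fitted calibration curve for an unknown sample."""
    return np.exp((np.log(response / (1.0 - response)) - a) / b)

print(estimate_concentration(0.5))   # concentration at 50 % binding
```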

  16. The study of necessity of verification-methods for Depleted Uranium

    International Nuclear Information System (INIS)

    Park, J. B.; Ahn, S. H.; Ahn, G. H.; Chung, S. T.; Shin, J. S.

    2006-01-01

    The Republic of Korea (ROK) has tried to establish a management system for depleted uranium since 2004 and has achieved some results in this field, including management software, management skills, and a list of companies using the nuclear material. However, studies on depleted uranium are still insufficient, apart from the studies of KAERI. In terms of the SSAC, we have to study further whether depleted uranium is really a dangerous material or not, and how depleted uranium could be diverted to a nuclear weapon. Depleted uranium has been controlled by item counting in the national system for small quantities of nuclear material. We do not have dedicated technical methods to verify depleted uranium during on-the-spot inspections rather than at laboratory scale. Therefore, I would like to suggest the necessity of verification methods for depleted uranium. Furthermore, I would like to present the methods for the verification of depleted uranium available in the national system up to now.

  17. EPA (Environmental Protection Agency) Method Study 12, cyanide in water. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Winter, J.; Britton, P.; Kroner, R.

    1984-05-01

    EPA Method Study 12, Cyanide in Water reports the results of a study by EMSL-Cincinnati for the parameters, Total Cyanide and Cyanides Amendable to Chlorination, present in water at microgram per liter levels. Four methods: pyridine-pyrazolone, pyridine-barbituric acid, electrode and Roberts-Jackson were used by 112 laboratories in Federal and State agencies, municipalities, universities, and the private/industrial sector. Sample concentrates were prepared in pairs with similar concentrations at each of three levels. Analysts diluted samples to volume with distilled and natural waters and analyzed them. Precision, accuracy, bias and the natural water interference were evaluated for each analytical method and comparisons were made between the four methods.

  18. Experimental study for development of thermic lance cutting method

    International Nuclear Information System (INIS)

    Machida, N.; Katano, Y.; Kamiya, Y.

    1988-01-01

    A series of experiments on a thermic lance cutting method were carried out to obtain useful data for the practical application of this method to the dismantling of reinforced concrete. As a first step, a performance experiment was executed to study basic cutting performance relating to oxygen consumption, extent of bar loss and cutting speed, as well as by-products generated during cutting work such as powdered dust, gas, fumes and slag. An automated and remote-controlled cutting machine was then developed utilizing automated bar supply and ignition. This paper describes the result of these experiments. (author)

  19. Influence of Meibomian Gland Expression Methods on Human Lipid Analysis Results.

    Science.gov (United States)

    Kunnen, Carolina M E; Brown, Simon H J; Lazon de la Jara, Percy; Holden, Brien A; Blanksby, Stephen J; Mitchell, Todd W; Papas, Eric B

    2016-01-01

    To compare the lipid composition of human meibum across three different meibum expression techniques. Meibum was collected from five healthy non-contact lens wearers (aged 20-35 years) after cleaning the eyelid margin using three meibum expression methods: cotton buds (CB), meibomian gland evaluator (MGE) and meibomian gland forceps (MGF). Meibum was also collected using cotton buds without cleaning the eyelid margin (CBn). Lipids were analyzed by chip-based, nano-electrospray mass spectrometry (ESI-MS). Comparisons were made using linear mixed models. Tandem MS enabled identification and quantification of over 200 lipid species across ten lipid classes. There were significant differences between collection techniques in the relative quantities of polar lipids obtained (P<.05). The MGE method returned smaller polar lipid quantities than the CB approaches. No significant differences were found between techniques for nonpolar lipids. No significant differences were found between cleaned and non-cleaned eyelids for polar or nonpolar lipids. Meibum expression technique influences the relative amount of phospholipids in the resulting sample. The highest amounts of phospholipids were detected with the CB approaches and the lowest with the MGE technique. Cleaning the eyelid margin prior to expression was not found to affect the lipid composition of the sample. This may be a consequence of the more forceful expression resulting in cell membrane contamination or higher risk of tear lipid contamination as a result of reflex tearing. Copyright © 2016 Elsevier Inc. All rights reserved.

  20. AISLE: an automatic volumetric segmentation method for the study of lung allometry.

    Science.gov (United States)

    Ren, Hongliang; Kazanzides, Peter

    2011-01-01

    We developed a fully automatic segmentation method for volumetric CT (computed tomography) datasets to support construction of a statistical atlas for the study of allometric laws of the lung. The proposed segmentation method, AISLE (Automated ITK-Snap based on Level-set), is based on the level-set implementation from an existing semi-automatic segmentation program, ITK-Snap. AISLE can segment the lung field without human interaction and provide intermediate graphical results as desired. The preliminary experimental results show that the proposed method can achieve accurate segmentation, in terms of the volumetric overlap metric, compared with the ground-truth segmentation performed by a radiologist.

  1. Study on some factors affecting the results in the use of MIP method in concrete research

    International Nuclear Information System (INIS)

    Kumar, Rakesh; Bhattacharjee, B.

    2003-01-01

    The effects of the rate of pressure application and of the form and type of sample on the porosity and pore size distribution of concrete estimated through mercury intrusion porosimetry (MIP) are presented in this experimental work. Two different forms of concrete sample, namely crushed chunks of concrete and small cores drilled out from concrete beam specimens, were used for this study. The results show that the rate of pressure application in mercury porosimetry has little effect on the porosity and pore size distribution of concrete. It is also demonstrated that small cores drilled out from large concrete specimens are preferable as samples for performing the porosimetry test on concrete.

  2. Design, methods, baseline characteristics and interim results of the Catheter Sampled Blood Archive in Cardiovascular Diseases (CASABLANCA study

    Directory of Open Access Journals (Sweden)

    Hanna K. Gaggin

    2014-11-01

    Conclusions: The CASABLANCA study will examine the role of novel biomarkers and metabolomics for predicting a wide range of cardiovascular, neurologic, and renal complications in patients undergoing angiography. Full results are expected in the latter half of 2014 (ClinicalTrials.Gov # NCT00842868).

  3. A study of interpolation method in diagnosis of carpal tunnel syndrome

    Directory of Open Access Journals (Sweden)

    Alireza Ashraf

    2013-01-01

    Full Text Available Context: The low correlation between patients' signs and symptoms of carpal tunnel syndrome (CTS) and the results of electrodiagnostic tests makes the diagnosis challenging in mild cases. Interpolation is a mathematical method for finding the median nerve conduction velocity (NCV) exactly at the carpal tunnel site. Therefore, it may be helpful in the diagnosis of CTS in patients with equivocal test results. Aim: The aim of this study is to evaluate the interpolation method as a CTS diagnostic test. Settings and Design: Patients with two or more clinical symptoms and signs of CTS in a median nerve territory, with 3.5 ms ≤ distal median sensory latency <4.6 ms, who came to our electrodiagnostic clinics, as well as age-matched healthy control subjects, were recruited in the study. Materials and Methods: Median compound motor action potential and median sensory nerve action potential latencies were measured with a MEDLEC SYNERGY VIASIS electromyography system, and conduction velocities were calculated by both the routine method and the interpolation technique. Statistical Analysis Used: Chi-square and Student's t-test were used for comparing group differences. Cut-off points were calculated using the receiver operating characteristic curve. Results: A sensitivity of 88%, a specificity of 67%, and positive predictive value (PPV) and negative predictive value (NPV) of 70.8% and 84.7% were obtained for median motor NCV, and a sensitivity of 98.3%, a specificity of 91.7%, and PPV and NPV of 91.9% and 98.2% were obtained for median sensory NCV with the interpolation technique. Conclusions: The median motor interpolation method is a good technique, but it has lower sensitivity and specificity than the median sensory interpolation method.

  4. Physical Model Method for Seismic Study of Concrete Dams

    Directory of Open Access Journals (Sweden)

    Bogdan Roşca

    2008-01-01

    Full Text Available The study of the dynamic behaviour of concrete dams by means of the physical model method is very useful for understanding the failure mechanism of these structures under strong earthquake action. The physical model method consists of two main processes. Firstly, a study model must be designed through a physical modeling process using dynamic modeling theory; the result is a system of equations for dimensioning the physical model. After the construction and instrumentation of the scale physical model, a structural analysis based on experimental means is performed. The experimental results are gathered and are available to be analysed. Depending on the aim of the research, an elastic or a failure physical model may be designed. The requirements for elastic model construction are easier to fulfil than those required for a failure model, but the results obtained provide only limited information. In order to study the behaviour of concrete dams under strong seismic action, it is necessary to employ failure physical models able to simulate accurately the possible opening of joints, sliding between concrete blocks and cracking of the concrete. The design relations for both elastic and failure physical models are based on dimensional analysis and consist of similitude relations among the physical quantities involved in the phenomenon. The use of physical models of large or medium dimensions, as well as their instrumentation, creates great advantages, but this operation involves a large amount of financial, logistic and time resources.

  5. Methods of dealing with co-products of biofuels in life-cycle analysis and consequent results within the U.S. context

    International Nuclear Information System (INIS)

    Wang, Michael; Huo Hong; Arora, Salil

    2011-01-01

    Products other than biofuels are produced in biofuel plants. For example, corn ethanol plants produce distillers' grains and solubles. Soybean crushing plants produce soy meal and soy oil, which is used for biodiesel production. Electricity is generated in sugarcane ethanol plants both for internal consumption and export to the electric grid. Future cellulosic ethanol plants could be designed to co-produce electricity with ethanol. It is important to take co-products into account in the life-cycle analysis of biofuels and several methods are available to do so. Although the International Standard Organization's ISO 14040 advocates the system boundary expansion method (also known as the 'displacement method' or the 'substitution method') for life-cycle analyses, application of the method has been limited because of the difficulty in identifying and quantifying potential products to be displaced by biofuel co-products. As a result, some LCA studies and policy-making processes have considered alternative methods. In this paper, we examine the available methods to deal with biofuel co-products, explore the strengths and weaknesses of each method, and present biofuel LCA results with different co-product methods within the U.S. context.

  6. Methods of dealing with co-products of biofuels in life-cycle analysis and consequent results within the U.S. context

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Michael, E-mail: mqwang@anl.gov [Center for Transportation Research, Argonne National Laboratory, Argonne, IL 60439 (United States); Huo Hong [Institute of Energy, Environment, and Economics, Tsinghua University, Beijing, 100084 (China); Arora, Salil [Center for Transportation Research, Argonne National Laboratory, Argonne, IL 60439 (United States)

    2011-10-15

    Products other than biofuels are produced in biofuel plants. For example, corn ethanol plants produce distillers' grains and solubles. Soybean crushing plants produce soy meal and soy oil, which is used for biodiesel production. Electricity is generated in sugarcane ethanol plants both for internal consumption and export to the electric grid. Future cellulosic ethanol plants could be designed to co-produce electricity with ethanol. It is important to take co-products into account in the life-cycle analysis of biofuels and several methods are available to do so. Although the International Standard Organization's ISO 14040 advocates the system boundary expansion method (also known as the 'displacement method' or the 'substitution method') for life-cycle analyses, application of the method has been limited because of the difficulty in identifying and quantifying potential products to be displaced by biofuel co-products. As a result, some LCA studies and policy-making processes have considered alternative methods. In this paper, we examine the available methods to deal with biofuel co-products, explore the strengths and weaknesses of each method, and present biofuel LCA results with different co-product methods within the U.S. context.

  7. Public attitudes towards alcohol control policies in Scotland and England: Results from a mixed-methods study.

    Science.gov (United States)

    Li, Jessica; Lovatt, Melanie; Eadie, Douglas; Dobbie, Fiona; Meier, Petra; Holmes, John; Hastings, Gerard; MacKintosh, Anne Marie

    2017-03-01

    The harmful effects of heavy drinking on health have been widely reported, yet public opinion on governmental responsibility for alcohol control remains divided. This study examines UK public attitudes towards alcohol policies, identifies underlying dimensions that inform these, and relationships with perceived effectiveness. A cross-sectional mixed methods study involving a telephone survey of 3477 adult drinkers aged 16-65 and sixteen focus groups with 89 adult drinkers in Scotland and England was conducted between September 2012 and February 2013. Principal components analysis (PCA) was used to reduce twelve policy statements into underlying dimensions. These dimensions were used in linear regression models examining alcohol policy support by demographics, drinking behaviour and perceptions of UK drinking and government responsibility. Findings were supplemented with a thematic analysis of focus group transcripts. A majority of survey respondents supported all alcohol policies, although the level of support varied by type of policy. Greater enforcement of laws on under-age sales and more police patrolling the streets were strongly supported while support for pricing policies and restricting access to alcohol was more divided. PCA identified four main dimensions underlying support on policies: alcohol availability, provision of health information and treatment services, alcohol pricing, and greater law enforcement. Being female, older, a moderate drinker, and holding a belief that government should do more to reduce alcohol harms were associated with higher support on all policy dimensions. Focus group data revealed findings from the survey may have presented an overly positive level of support on all policies due to differences in perceived policy effectiveness. Perceived effectiveness can help inform underlying patterns of policy support and should be considered in conjunction with standard measures of support in future research on alcohol control policies
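    Principal components analysis of an item battery of this kind can be sketched roughly as below; the respondent matrix is random placeholder data, and the four retained components merely stand in for the availability, information/treatment, pricing and enforcement dimensions reported in the study.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)

# Hypothetical survey matrix: 500 respondents x 12 policy-support items
# scored on a 1-5 agreement scale (random placeholder data).
responses = rng.integers(1, 6, size=(500, 12)).astype(float)

# Standardize the items, then extract four principal components.
scaled = StandardScaler().fit_transform(responses)
pca = PCA(n_components=4).fit(scaled)

print("explained variance ratio:", pca.explained_variance_ratio_)
print("item loadings on the first component:", pca.components_[0].round(2))
```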

  8. Improved Method of Detection Falsification Results the Digital Image in Conditions of Attacks

    Directory of Open Access Journals (Sweden)

    Kobozeva A.A.

    2016-08-01

    Full Text Available The modern level of information technology development has made unauthorized modification of digital content easier than ever before. A very important question at the moment is the effective expert examination of the authenticity of digital images, video and audio, and the development of methods for identifying and localizing violations of their integrity when these contents are used for purposes other than entertainment. The present paper deals with the improvement of a method for detecting the results of cloning in digital images, cloning being one of the falsification tools most frequently used and implemented in all modern graphics editors. The method is intended to detect the clone area and the pre-image area under additional disturbing influences applied to the image after the cloning operation to "mask" the results, which complicates the search process. The improvement is aimed at reducing the number of "false alarms", in which a clone/pre-image area is detected in an original image, or the localization of the identified areas does not correspond to the real clone and pre-image. The proposed improvement, based on the analysis of per-pixel image blocks of different sizes with the least difference from each other, has made it possible for the method to function efficiently regardless of the specifics of the analyzed digital image.

  9. A mixed-methods study into ballet for people living with Parkinson's.

    Science.gov (United States)

    Houston, Sara; McGill, Ashley

    2013-06-01

    Background: Parkinson's is a neurological disease that is physically debilitating and can be socially isolating. Dance is growing in popularity for people with Parkinson's and claims have been made for its benefits. The paper details a mixed-methods study that examined a 12-week dance project for people with Parkinson's, led by English National Ballet. Methods: The effects on balance, stability and posture were measured through the Fullerton Advanced Balance Scale and a plumb-line analysis. The value of participation and movement quality were interpreted through ethnographic methods, grounded theory and Effort analysis. Results: Triangulation of results indicates that people were highly motivated, with 100% adherence, and valued the classes as an important part of their lives. Additionally, results indicated an improvement in balance and stability, although not in posture. Conclusions: Dancing may offer benefit to people with Parkinson's through its intellectual, artistic, social and physical aspects. The paper suggests that a range of research methods is fundamental to capture the importance of multifaceted activity, such as dance, to those with Parkinson's.

  10. Comparative study of methods for potential and actual evapotranspiration determination

    International Nuclear Information System (INIS)

    Kolev, B.

    2004-01-01

    Two types of methods for determining potential and actual evapotranspiration were compared. The first type includes neutron gauge, tensiometers, gypsum blocks and lysimeters; here the actual and potential evapotranspiration were calculated from the water balance equation. The second type of methods used a simulation model for all calculations. The aim of this study was not only to compare and evaluate the methods used; it focused mainly on calculating water use efficiency and the transpiration coefficient in a potential production situation. This makes it possible to choose the best way to optimize water consumption for a given crop. The final results obtained with the best of the methods could be used to apply the principles of sustainable agriculture in any region of Bulgaria. (author)

  11. Metastatic nasopharyngeal carcinoma: clinical study and therapeutic results of 95 cases

    International Nuclear Information System (INIS)

    Khanfir, A.; Frikha, M.; Ghorbel, A.; Drira, M.M.; Karray, H.; Daoud, J.

    2006-01-01

    Purpose. - The objective of this retrospective study was to discuss the epidemio-clinical criteria and the therapeutic results of metastatic nasopharyngeal carcinoma. Patients and methods. - The current study concerned 95 patients with histologically proven nasopharyngeal carcinoma who were metastatic at diagnosis or who had developed late metastasis. We reviewed the epidemio-clinical records of all the patients. Patients were treated with chemotherapy (BEC regimen: bleomycin, epirubicin and cisplatin, or PBF regimen: bleomycin, 5-fluorouracil and cisplatin) and radiotherapy of pauci-metastatic localizations (single or double) or of bone metastases with a high risk of compression or fracture, ± associated with locoregional radiotherapy for patients who were metastatic at diagnosis. Response was assessed according to the WHO criteria. Overall survival was calculated according to the Kaplan-Meier method. Long-term disease-free survival was defined from 36 months. Results. - There were 34 patients who were metastatic at diagnosis and 61 patients who had developed late metastasis. The mean age was 41.5 years (sex ratio: 3.1). Bone metastases were the most frequent (83%). Objective and complete response rates were respectively 75% and 70% for the BEC regimen, and 32% and 16% for the PBF regimen. Twenty-five patients received radiotherapy for pauci-metastatic localizations, among whom 19 patients who were metastatic at diagnosis received locoregional irradiation. The three-year overall survival probability was 15%. Eleven patients were long-term survivors (range: 36-134 months). Conclusion. - Therapeutic results were comparable to those reported in other series using platinum combination chemotherapy. Radiotherapy of metastases yielded long-term survival. (authors)

  12. What Happened to Remote Usability Testing? An Empirical Study of Three Methods

    DEFF Research Database (Denmark)

    Stage, Jan; Andreasen, M. S.; Nielsen, H. V.

    2007-01-01

    The idea of conducting usability tests remotely emerged ten years ago. Since then, it has been studied empirically, and some software organizations employ remote methods. Yet there are still few comparisons involving more than one remote method. This paper presents results from a systematic empirical comparison of three methods for remote usability testing and a conventional laboratory-based think-aloud method. The three remote methods are a remote synchronous condition, where testing is conducted in real time but the test monitor is separated spatially from the test subjects, and two remote...

  13. The albatross plot: A novel graphical tool for presenting results of diversely reported studies in a systematic review.

    Science.gov (United States)

    Harrison, Sean; Jones, Hayley E; Martin, Richard M; Lewis, Sarah J; Higgins, Julian P T

    2017-09-01

    Meta-analyses combine the results of multiple studies of a common question. Approaches based on effect size estimates from each study are generally regarded as the most informative. However, these methods can only be used if comparable effect sizes can be computed from each study, and this may not be the case due to variation in how the studies were done or limitations in how their results were reported. Other methods, such as vote counting, are then used to summarize the results of these studies, but most of these methods are limited in that they do not provide any indication of the magnitude of effect. We propose a novel plot, the albatross plot, which requires only a 1-sided P value and a total sample size from each study (or equivalently a 2-sided P value, direction of effect and total sample size). The plot allows an approximate examination of underlying effect sizes and the potential to identify sources of heterogeneity across studies. This is achieved by drawing contours showing the range of effect sizes that might lead to each P value for given sample sizes, under simple study designs. We provide examples of albatross plots using data from previous meta-analyses, allowing for comparison of results, and an example from when a meta-analysis was not possible. Copyright © 2017 The Authors. Research Synthesis Methods Published by John Wiley & Sons Ltd.
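    A rough sketch of the plotting idea is given below: each study is placed by its two-sided P value (signed by direction of effect) and total sample size, and contours show the standardized mean difference that would produce that P value at that sample size under a simple two-arm design with equal groups, approximating SE(d) by 2/sqrt(n). The study points are invented, and this is only an approximation of the published albatross plot, not the authors' implementation.

```python
# Minimal albatross-plot sketch (not the authors' implementation): studies contribute
# only a two-sided P value, direction of effect and total sample size; the contours
# show the standardized mean difference d that would give each P value at each n,
# assuming two equal arms so that SE(d) is roughly 2/sqrt(n).
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

def two_sided_p(d, n):
    """Approximate two-sided P value for effect size d and total sample size n."""
    se = 2.0 / np.sqrt(n)
    z = d / se
    return 2 * stats.norm.sf(abs(z))

n_grid = np.logspace(1, 4, 200)            # total sample sizes 10..10000
fig, ax = plt.subplots()
for d in (0.1, 0.2, 0.5, 1.0):             # effect-size contours, both directions
    ax.plot(two_sided_p(d, n_grid), n_grid, "k-", lw=0.8)
    ax.plot(-two_sided_p(d, n_grid), n_grid, "k-", lw=0.8)
    ax.annotate(f"d={d}", (two_sided_p(d, n_grid[0]), n_grid[0]))

# hypothetical studies: signed P value (sign = direction of effect) and sample size
studies = [(+0.03, 120), (-0.20, 60), (+0.001, 800), (+0.40, 45)]
for p_signed, n in studies:
    ax.plot(p_signed, n, "o")

ax.set_yscale("log")
ax.set_xlabel("two-sided P value (sign indicates direction of effect)")
ax.set_ylabel("total sample size")
plt.show()
```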

  14. A Critical Study of Agglomerated Multigrid Methods for Diffusion

    Science.gov (United States)

    Nishikawa, Hiroaki; Diskin, Boris; Thomas, James L.

    2011-01-01

    Agglomerated multigrid techniques used in unstructured-grid methods are studied critically for a model problem representative of laminar diffusion in the incompressible limit. The studied target-grid discretizations and discretizations used on agglomerated grids are typical of current node-centered formulations. Agglomerated multigrid convergence rates are presented using a range of two- and three-dimensional randomly perturbed unstructured grids for simple geometries with isotropic and stretched grids. Two agglomeration techniques are used within an overall topology-preserving agglomeration framework. The results show that multigrid with an inconsistent coarse-grid scheme using only the edge terms (also referred to in the literature as a thin-layer formulation) provides considerable speedup over single-grid methods but its convergence deteriorates on finer grids. Multigrid with a Galerkin coarse-grid discretization using piecewise-constant prolongation and a heuristic correction factor is slower and also grid-dependent. In contrast, grid-independent convergence rates are demonstrated for multigrid with consistent coarse-grid discretizations. Convergence rates of multigrid cycles are verified with quantitative analysis methods in which parts of the two-grid cycle are replaced by their idealized counterparts.

  15. Study on highly efficient seismic data acquisition and processing methods based on sparsity constraint

    Science.gov (United States)

    Wang, H.; Chen, S.; Tao, C.; Qiu, L.

    2017-12-01

    High-density, high-fold and wide-azimuth seismic data acquisition methods are widely used to overcome the increasingly sophisticated exploration targets. The acquisition period is longer and longer and the acquisition cost is higher and higher. We carry out the study of highly efficient seismic data acquisition and processing methods based on sparse representation theory (or compressed sensing theory), and achieve some innovative results. The theoretical principles of highly efficient acquisition and processing is studied. We firstly reveal sparse representation theory based on wave equation. Then we study the highly efficient seismic sampling methods and present an optimized piecewise-random sampling method based on sparsity prior information. At last, a reconstruction strategy with the sparsity constraint is developed; A two-step recovery approach by combining sparsity-promoting method and hyperbolic Radon transform is also put forward. The above three aspects constitute the enhanced theory of highly efficient seismic data acquisition. The specific implementation strategies of highly efficient acquisition and processing are studied according to the highly efficient acquisition theory expounded in paragraph 2. Firstly, we propose the highly efficient acquisition network designing method by the help of optimized piecewise-random sampling method. Secondly, we propose two types of highly efficient seismic data acquisition methods based on (1) single sources and (2) blended (or simultaneous) sources. Thirdly, the reconstruction procedures corresponding to the above two types of highly efficient seismic data acquisition methods are proposed to obtain the seismic data on the regular acquisition network. A discussion of the impact on the imaging result of blended shooting is discussed. In the end, we implement the numerical tests based on Marmousi model. The achieved results show: (1) the theoretical framework of highly efficient seismic data acquisition and processing
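    The generic idea behind piecewise-random (jittered) sampling, of which the optimized scheme above is a refinement, can be sketched as follows: the regular acquisition grid is divided into equal segments and one live position is drawn at random within each segment, which bounds the largest gap while breaking the regularity that causes coherent aliasing. The grid size and segment count below are arbitrary; this is not the authors' optimized, sparsity-informed design.

```python
# Generic piecewise-random (jittered) sampling sketch, not the authors' optimized
# scheme: the regular acquisition grid is split into equal segments and one receiver
# position is drawn at random inside each segment.
import numpy as np

def piecewise_random_mask(n_grid, n_segments, seed=0):
    """Return a boolean mask of length n_grid with one live position per segment."""
    rng = np.random.default_rng(seed)
    edges = np.linspace(0, n_grid, n_segments + 1).astype(int)
    mask = np.zeros(n_grid, dtype=bool)
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask[rng.integers(lo, hi)] = True
    return mask

mask = piecewise_random_mask(n_grid=240, n_segments=60)
print(mask.sum(), "of", mask.size, "grid positions sampled")
print("largest gap between samples:", np.diff(np.flatnonzero(mask)).max())
```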

  16. Structural issues affecting mixed methods studies in health research: a qualitative study.

    Science.gov (United States)

    O'Cathain, Alicia; Nicholl, Jon; Murphy, Elizabeth

    2009-12-09

    Health researchers undertake studies which combine qualitative and quantitative methods. Little attention has been paid to the structural issues affecting this mixed methods approach. We explored the facilitators and barriers to undertaking mixed methods studies in health research. Face-to-face semi-structured interviews with 20 researchers experienced in mixed methods research in health in the United Kingdom. Structural facilitators for undertaking mixed methods studies included a perception that funding bodies promoted this approach, and the multidisciplinary constituency of some university departments. Structural barriers to exploiting the potential of these studies included a lack of education and training in mixed methods research, and a lack of templates for reporting mixed methods articles in peer-reviewed journals. The 'hierarchy of evidence' relating to effectiveness studies in health care research, with the randomised controlled trial as the gold standard, appeared to pervade the health research infrastructure. Thus integration of data and findings from qualitative and quantitative components of mixed methods studies, and dissemination of integrated outputs, tended to occur through serendipity and effort, further highlighting the presence of structural constraints. Researchers are agents who may also support current structures - journal reviewers and editors, and directors of postgraduate training courses - and thus have the ability to improve the structural support for exploiting the potential of mixed methods research. The environment for health research in the UK appears to be conducive to mixed methods research but not to exploiting the potential of this approach. Structural change, as well as change in researcher behaviour, will be necessary if researchers are to fully exploit the potential of using mixed methods research.

  17. Study on Separation of Structural Isomer with Magneto-Archimedes method

    Science.gov (United States)

    Kobayashi, T.; Mori, T.; Akiyama, Y.; Mishima, F.; Nishijima, S.

    2017-09-01

    Organic compounds are refined by separating their structural isomers; however, each separation method has some problems. For example, distillation consumes large amounts of energy. In order to solve these problems, a new separation method is needed. Considering that organic compounds are diamagnetic, we focused on the magneto-Archimedes method. With this method, a particle mixture dispersed in a paramagnetic medium can be separated in a magnetic field owing to differences in the density and magnetic susceptibility of the particles. In this study, we succeeded in separating isomers of phthalic acid, as an example of structural isomers, using MnCl2 solution as the paramagnetic medium. In order to use the magneto-Archimedes method for separating materials for food or medicine, we proposed a harmless medium using oxygen and fluorocarbon instead of MnCl2 aqueous solution. As a result, the possibility of separating every structural isomer was shown.
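    The separation relies on the standard magneto-Archimedes balance between gravity, buoyancy and the magnetic forces on particle and medium. A small sketch of that balance is given below; the material properties are illustrative placeholders, not values measured in the study.

```python
# Magneto-Archimedes levitation condition (standard textbook form), with purely
# illustrative property values, not data from the study above.  A particle of density
# rho_p and volume susceptibility chi_p levitates in a medium (rho_m, chi_m) when
#     (chi_p - chi_m)/mu0 * B*dB/dz = (rho_p - rho_m) * g,
# so the required field-gradient product is
#     B*dB/dz = mu0 * (rho_p - rho_m) * g / (chi_p - chi_m).
MU0 = 4e-7 * 3.141592653589793   # vacuum permeability [T m/A]
G = 9.81                         # gravitational acceleration [m/s^2]

def required_B_dBdz(rho_p, chi_p, rho_m, chi_m):
    """Field-gradient product B*dB/dz [T^2/m] needed to levitate the particle."""
    return MU0 * (rho_p - rho_m) * G / (chi_p - chi_m)

# Hypothetical diamagnetic organic particle in a paramagnetic MnCl2 solution
rho_particle, chi_particle = 1590.0, -8.0e-6   # kg/m^3, SI volume susceptibility
rho_medium, chi_medium = 1200.0, 4.0e-4

# A negative value means the field magnitude must decrease with height,
# i.e. the sample sits above the region of maximum field.
print(f"required B*dB/dz = "
      f"{required_B_dBdz(rho_particle, chi_particle, rho_medium, chi_medium):.1f} T^2/m")
```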

  18. E-assessment and an e-training program among elderly care staff lacking formal competence: results of a mixed-methods intervention study.

    Science.gov (United States)

    Nilsson, Annika; Engström, Maria

    2015-05-06

    Among staff working in elderly care, a considerable proportion lack formal competence for their work. Lack of formal competence, in turn, has been linked to higher staff ratings of stress symptoms, sleep disturbances and workload. 1) To describe the strengths and weaknesses of an e-assessment and subsequent e-training program used among elderly care staff who lack formal competence and 2) to study the effects of an e-training program on staff members' working life (quality of care and psychological and structural empowerment) and well-being (job satisfaction and psychosomatic health). The hypothesis was that staff who had completed the e-assessment and the e-training program would rate greater improvements in working life and well-being than would staff who had only participated in the e-assessments. An intervention study with a mixed-methods approach using quantitative (2010-2011) and qualitative data (2011) was conducted in Swedish elderly care. Participants included a total of 41 staff members. To describe the strengths and weaknesses of the e-assessment and the e-training program, qualitative data were gathered using semi-structured interviews together with a study-specific questionnaire. To study the effects of the intervention, quantitative data were collected using questionnaires on: job satisfaction, psychosomatic health, psychological empowerment, structural empowerment and quality of care in an intervention and a comparison group. Staff who completed the e-assessments and the e-training program primarily experienced strengths associated with this approach. The results were also in line with our hypotheses: Staff who completed the e-assessment and the e-training program rated improvements in their working life and well-being. Use of the e-assessments and e-training program employed in the present study could be one way to support elderly care staff who lack formal education by increasing their competence; increased competence, in turn, could improve their

  19. The effect of different methods and image analyzers on the results of the in vivo comet assay.

    Science.gov (United States)

    Kyoya, Takahiro; Iwamoto, Rika; Shimanura, Yuko; Terada, Megumi; Masuda, Shuichi

    2018-01-01

    The in vivo comet assay is a widely used genotoxicity test that can detect DNA damage in a range of organs. It is included in the Organisation for Economic Co-operation and Development Guidelines for the Testing of Chemicals. However, various protocols are still used for this assay, and several different image analyzers are used routinely to evaluate the results. Here, we verified a protocol that largely contributes to the equivalence of results, and we assessed the effect on the results when slides made from the same sample were analyzed using two different image analyzers (Comet Assay IV vs Comet Analyzer). Standardizing the agarose concentrations and DNA unwinding and electrophoresis times had a large impact on the equivalence of the results between the different methods used for the in vivo comet assay. In addition, there was some variation in the sensitivity of the two different image analyzers tested; however this variation was considered to be minor and became negligible when the test conditions were standardized between the two different methods. By standardizing the concentrations of low melting agarose and DNA unwinding and electrophoresis times between both methods used in the current study, the sensitivity to detect the genotoxicity of a positive control substance in the in vivo comet assay became generally comparable, independently of the image analyzer used. However, there may still be the possibility that other conditions, except for the three described here, could affect the reproducibility of the in vivo comet assay.

  20. Critical Appraisal of Mixed Methods Studies

    Science.gov (United States)

    Heyvaert, Mieke; Hannes, Karin; Maes, Bea; Onghena, Patrick

    2013-01-01

    In several subdomains of the social, behavioral, health, and human sciences, research questions are increasingly answered through mixed methods studies, combining qualitative and quantitative evidence and research elements. Accordingly, the importance of including those primary mixed methods research articles in systematic reviews grows. It is…

  1. Studies of Ancient Russian Cultural Objects Using the Neutron Tomography Method

    Directory of Open Access Journals (Sweden)

    Sergey Kichanov

    2018-01-01

    Full Text Available Neutron radiography and tomography is a non-destructive method that provides detailed information about the internal structure of cultural heritage objects. The differences in the neutron attenuation coefficients of constituent elements of the studied objects, as well as the application of modern mathematical algorithms to carry out three-dimensional imaging data analysis, allow one to obtain unique information about the spatial distribution of different phases, the presence of internal defects, or the degree of structural degradation inside valuable cultural objects. The results of the neutron studies of several archaeological objects related to different epochs of the Russian history are reported in order to demonstrate the opportunities provided by the neutron tomography method. The obtained 3D structural volume data, as well as the results of the corresponding data analysis, are presented.

  2. Mixing studies at low grade vacuum pan using the radiotracer method

    International Nuclear Information System (INIS)

    Griffith, J.M

    1999-01-01

    In this paper, some preliminary results achieved in the evaluation of the homogenization time at a vacuum pan for massecuite B and seed preparation, using two approaches of the radiotracer method, are presented. Practically no difference was detected between the on-line method, using a small detector, and the sampling method in mixing studies performed on the high-grade massecuite. Results achieved during the trials performed at the vacuum station show that mechanical agitation, in comparison with normal agitation, improves the performance of mixing in high-grade massecuite B and in seed preparation at the vacuum pan

  3. Mixing studies at low grade vacuum pan using the radiotracer method

    International Nuclear Information System (INIS)

    Griffith, J.M

    1999-01-01

    In this paper, some preliminary results achieved in the evaluation of the homogenization time at a Vacuum Pan for massecuite B and seed preparation, using two approaches of the radiotracer method, are presented. Practically no difference was detected between the on-line method, using a small detector, and the sampling method in mixing studies performed on the high-grade massecuite. Results achieved during the trials performed at the vacuum station show that mechanical agitation, in comparison with normal agitation, improves the performance of mixing in high-grade massecuite B and in seed preparation at the vacuum pan

  4. EPA's radon study results

    International Nuclear Information System (INIS)

    Dowd, R.M.

    1988-01-01

    Last winter, in cooperation with agencies in 10 states and two metropolitan area counties, EPA measured the indoor air radon concentrations of 14,000 houses, some chosen statistically at random and some by request of the homeowner. Passive measurement methodologies were used, such as exposing a charcoal canister to the air for a few days and allowing the air to migrate into the charcoal naturally. To reduce dilution of radon by the outside air, the protocol required that the house be shut up; therefore, the study was conducted during winter. The measuring device was placed in the lowest livable area (usually the basement) of each house to maximize potential concentration. It should be noted that these procedures are generally considered to be screening tests because they result in a worst-case measurement rather than a best value. The results of these findings are presented

  5. Methods and models used in comparative risk studies

    International Nuclear Information System (INIS)

    Devooght, J.

    1983-01-01

    Comparative risk studies make use of a large number of methods and models based upon a set of assumptions incompletely formulated or of value judgements. Owing to the multidimensionality of risks and benefits, the economic and social context may notably influence the final result. Five classes of models are briefly reviewed: accounting of fluxes of effluents, radiation and energy; transport models and health effects; systems reliability and bayesian analysis; economic analysis of reliability and cost-risk-benefit analysis; decision theory in presence of uncertainty and multiple objectives. Purpose and prospect of comparative studies are assessed in view of probable diminishing returns for large generic comparisons [fr

  6. [Study on commercial specification of atractylodes based on Delphi method].

    Science.gov (United States)

    Wang, Hao; Chen, Li-Xiao; Huang, Lu-Qi; Zhang, Tian-Tian; Li, Ying; Zheng, Yu-Guang

    2016-03-01

    This research adopted the Delphi method to evaluate the traditional traits of atractylodes and their rank correlation. Using mathematical statistics, the relationship between the traditional identification indicators and the commercial grades of atractylodes was analyzed. The main characteristics affecting atractylodes commodity specifications and grades were found to be the oil points of the cross-section, the color of the cross-section, the color of the surface, the grain of the cross-section, the texture of the cross-section, and spoilage. The study points out that the original "seventy-six kinds of medicinal materials commodity specification standards" for atractylodes no longer conforms to the actual market situation, and that corresponding commodity specification and grade standards for atractylodes need to be formulated. Combining the Delphi results with the actual market situation, this study proposes a new draft of commodity specifications and grades for atractylodes to serve as the new standard, and provides a reference and theoretical basis for it. Copyright© by the Chinese Pharmaceutical Association.

  7. Robustness study in SSNTD method validation: indoor radon quality

    Energy Technology Data Exchange (ETDEWEB)

    Dias, D.C.S.; Silva, N.C.; Bonifácio, R.L., E-mail: danilacdias@gmail.com [Comissao Nacional de Energia Nuclear (LAPOC/CNEN), Pocos de Caldas, MG (Brazil). Laboratorio de Pocos de Caldas

    2017-07-01

    Quality control practices are indispensable to organizations aiming to reach analytical excellence. Method validation is an essential component to quality systems in laboratories, serving as a powerful tool for standardization and reliability of outcomes. This paper presents a study of robustness conducted over a SSNTD technique validation process, with the goal of developing indoor radon measurements at the highest level of quality. This quality parameter indicates how well a technique is able to provide reliable results in face of unexpected variations along the measurement. In this robustness study, based on the Youden method, 7 analytical conditions pertaining to different phases of the SSNTD technique (with focus on detector etching) were selected. Based on the ideal values for each condition as reference, extreme levels regarded as high and low were prescribed to each condition. A partial factorial design of 8 unique etching procedures was defined, where each presented their own set of high and low condition values. The Youden test provided 8 indoor radon concentration results, which allowed percentage estimations that indicate the potential influence of each analytical condition on the SSNTD technique. As expected, detector etching factors such as etching solution concentration, temperature and immersion time were identified as the most critical parameters to the technique. Detector etching is a critical step in the SSNTD method – one that must be carefully designed during validation and meticulously controlled throughout the entire process. (author)
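    The design described (seven two-level analytical conditions examined in eight runs) corresponds to a standard orthogonal eight-run array of the Youden type. A sketch of such a design and of the usual effect estimate (mean response at the high level minus mean response at the low level of each condition) is given below; the design matrix is a generic orthogonal array and the response values are invented, not the study's measurements.

```python
# Sketch of a Youden-type robustness design (7 two-level factors in 8 runs) and the
# effect estimate used to rank factors: mean(result at high level) - mean(result at
# low level) for each factor.  The matrix is a standard orthogonal 8-run array; the
# responses below are purely illustrative.
import numpy as np

# rows = etching runs, columns = analytical conditions A..G (+1 = high, -1 = low)
design = np.array([
    [+1, +1, +1, +1, +1, +1, +1],
    [+1, +1, -1, +1, -1, -1, -1],
    [+1, -1, +1, -1, +1, -1, -1],
    [+1, -1, -1, -1, -1, +1, +1],
    [-1, +1, +1, -1, -1, +1, -1],
    [-1, +1, -1, -1, +1, -1, +1],
    [-1, -1, +1, +1, -1, -1, +1],
    [-1, -1, -1, +1, +1, +1, -1],
])

# hypothetical indoor radon results (Bq/m^3) from the 8 etching procedures
response = np.array([102.0, 98.0, 110.0, 95.0, 101.0, 97.0, 104.0, 99.0])

factors = list("ABCDEFG")
effects = design.T @ response / 4.0      # each level appears in 4 of the 8 runs
for name, eff in sorted(zip(factors, effects), key=lambda t: -abs(t[1])):
    print(f"condition {name}: effect = {eff:+.2f}")
```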

  8. Robustness study in SSNTD method validation: indoor radon quality

    International Nuclear Information System (INIS)

    Dias, D.C.S.; Silva, N.C.; Bonifácio, R.L.

    2017-01-01

    Quality control practices are indispensable to organizations aiming to reach analytical excellence. Method validation is an essential component to quality systems in laboratories, serving as a powerful tool for standardization and reliability of outcomes. This paper presents a study of robustness conducted over a SSNTD technique validation process, with the goal of developing indoor radon measurements at the highest level of quality. This quality parameter indicates how well a technique is able to provide reliable results in face of unexpected variations along the measurement. In this robustness study, based on the Youden method, 7 analytical conditions pertaining to different phases of the SSNTD technique (with focus on detector etching) were selected. Based on the ideal values for each condition as reference, extreme levels regarded as high and low were prescribed to each condition. A partial factorial design of 8 unique etching procedures was defined, where each presented their own set of high and low condition values. The Youden test provided 8 indoor radon concentration results, which allowed percentage estimations that indicate the potential influence of each analytical condition on the SSNTD technique. As expected, detector etching factors such as etching solution concentration, temperature and immersion time were identified as the most critical parameters to the technique. Detector etching is a critical step in the SSNTD method – one that must be carefully designed during validation and meticulously controlled throughout the entire process. (author)

  9. Comparative study of two methods for determining the diffusible hydrogen content in welds

    International Nuclear Information System (INIS)

    Celio de Abreu, L.; Modenesi, P.J.; Villani-Marques, P.

    1994-01-01

    This work presents a comparative study of methods for measuring the amount of diffusible hydrogen in welds: glycerin, mercury and gas chromatography. The effects of the collecting temperatures and times were analyzed. Basic electrodes of type AWS E 9018-M were humidified and dried at different times and temperatures in order to obtain a large variation in diffusible hydrogen content. The results showed that the collecting time can be reduced when the collecting temperature is raised; that the mercury and chromatography methods give similar results, higher than those obtained by the glycerin method; and that the use of liquid nitrogen in the preparation of the test specimens is unnecessary. The chromatography method shows the lowest dispersion and is the method whose collecting time can be reduced the most by raising the collecting temperature. The use of equations for comparing results obtained by the various methods encountered in the literature is also discussed. (Author) 16 refs

  10. Studies on the comparability of the results from different methods for the radioimmunological determination of digoxin

    International Nuclear Information System (INIS)

    Dwenger, A.; Trautschold, I.

    1978-01-01

    Three iodine-125-digoxin radioimmunoassay kits (A Amersham Buchler; B Boehringer Mannheim; C Schwarz Mann/Becton Dickinson) were evaluated with respect to assay quality and comparability of the results. Intra- and interassay variances were calculated for the following types of samples: three media ((a) pooled serum; (b) artificial human serum; (c) buffer solution with albumin and globulin) containing pure digoxin, sera from a pharmacokinetic study, sera with different concentrations of proteins, a hemolytic serum, and sera with digitoxin and metabolites of spironolactone. The intra-assay precision depended on the medium of the sample and was higher for samples with identical digoxin concentrations in an identical medium (e.g. CV for 2 μg/l in medium a for kit A: 4.3%; for kit B: 7.0%; for kit C: 2.2%) than for samples with identical antigen concentrations in different media (CV for 2 μg/l in media a, b and c for kit A: 6.4%; for kit B: 9.1%; for kit C: 4.3%). The mean recovery in the range 0.5-4 μg/l depended on the kind of medium (a, b or c) and varied for kit A from 84.4% to 100.8%, for kit B from 112.0% to 119.6%, and for kit C from 98.0% to 104.5%. Decreasing serum protein concentrations to less than one half of the physiological concentration gave false negative results for kit A and false positive results for kit C; for kit B this dependency was not observed, but there was a decrease in reproducibility. (orig./AJ) [de

  11. Deficits in knowledge, attitude, and practice towards blood culture sampling: results of a nationwide mixed-methods study among inpatient care physicians in Germany.

    Science.gov (United States)

    Raupach-Rosin, Heike; Duddeck, Arne; Gehrlich, Maike; Helmke, Charlotte; Huebner, Johannes; Pletz, Mathias W; Mikolajczyk, Rafael; Karch, André

    2017-08-01

    Blood culture (BC) sampling rates in Germany are considerably lower than recommended. Aim of our study was to assess knowledge, attitudes, and practice of physicians in Germany regarding BC diagnostics. We conducted a cross-sectional mixed-methods study among physicians working in inpatient care in Germany. Based on the results of qualitative focus groups, a questionnaire-based quantitative study was conducted in 2015-2016. In total, 706 medical doctors and final-year medical students from 11 out of 16 federal states in Germany participated. BC sampling was considered an important diagnostic tool by 95% of the participants. However, only 23% of them would collect BCs in three scenarios for which BC ordering is recommended by present guidelines in Germany; almost one out of ten physicians would not have taken blood cultures in any of the three scenarios. The majority of participants (74%) reported not to adhere to the guideline recommendation that blood culture sampling should include at least two blood culture sets from two different injection sites. High routine in blood culture sampling, perceived importance of blood culture diagnostics, the availability of an in-house microbiological lab, and the department the physician worked in were identified as predictors for good blood culture practice. Our study suggests that there are substantial deficits in BC ordering and the application of guidelines for good BC practice in Germany. Based on these findings, multimodal interventions appear necessary for improving BC diagnostics.

  12. Dental age estimation using Willems method: A digital orthopantomographic study

    Directory of Open Access Journals (Sweden)

    Rezwana Begum Mohammed

    2014-01-01

    Full Text Available In recent years, age estimation in living people has become increasingly important for a variety of reasons, including establishing criminal and legal responsibility, and for many other social events such as obtaining a birth certificate, marriage, beginning a job, joining the army, and retirement. Objectives: The aim of this study was to assess the developmental stages of the left seven mandibular teeth for estimation of dental age (DA) in different age groups and to evaluate the possible correlation between DA and chronological age (CA) in a South Indian population using the Willems method. Materials and Methods: Digital orthopantomograms of 332 subjects (166 males, 166 females) who fit the study criteria were obtained. Development of the mandibular teeth of the left quadrant (from the central incisor to the second molar) was assessed and DA was estimated using the Willems method. Results and Discussion: The present study showed a significant correlation between DA and CA in both males (r = 0.71) and females (r = 0.88). The overall mean difference between the estimated DA and CA for males was 0.69 ± 2.14 years (P 0.05). The Willems method underestimated the mean age of males by 0.69 years and of females by 0.08 years, and showed that females mature earlier than males in the selected population. The mean difference between DA and CA according to the Willems method was 0.39 years, which is statistically significant (P < 0.05). Conclusion: This study showed a significant relation between DA and CA. Thus, digital radiographic assessment of mandibular tooth development can be used to generate a mean DA using the Willems method, as well as an estimated age range for an individual of unknown CA.

  13. Study on methods of quantitative analysis of the biological thin samples in EM X-ray microanalysis

    International Nuclear Information System (INIS)

    Zhang Detian; Zhang Xuemin; He Kun; Yang Yi; Zhang Sa; Wang Baozhen

    2000-01-01

    Objective: To study methods for the quantitative analysis of biological thin samples. Methods: Hall theory was used to address qualitative analysis, background subtraction, separation of overlapping peaks, external radiation and spectral aberrations. Results: Reliable qualitative analysis and precise quantitative analysis were achieved. Conclusion: The methods for analysis of biological thin samples in EM X-ray microanalysis can be used in biomedical research

  14. Comparative study for methods to determine the seismic response of NPP structures

    International Nuclear Information System (INIS)

    Varpasuo, P.

    1995-01-01

    There are many important problem areas in evaluating the seismic response of structures. In this study the effort is concentrated on three of them. The first is the mathematical formulation of the earthquake excitation, for which random vibration theory is taken as the tool. The second area of interest is the soil-structure interaction analysis; the impedance-function approach is chosen, and the focal point of interest is the significance of frequency-dependent impedance functions. The third area of interest is the methods used to determine the structural response. The following three methods were tested: the mode superposition time history method; the complex frequency response method; the response spectrum method. The comparison was made with the aid of the MSC/NASTRAN code. The three methods gave results for the outer containment building response that were in good agreement with each other. (author). 4 refs., 5 figs.
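    Of the three response-determination methods listed, the response spectrum approach is the easiest to illustrate in a few lines: a damped single-degree-of-freedom oscillator is stepped through the ground-motion record for a range of natural periods and the peak response is recorded. The sketch below uses Newmark's average-acceleration scheme and an artificial white-noise record; it is a generic illustration, not the MSC/NASTRAN workflow used in the study.

```python
# Hedged sketch of the response spectrum method: the elastic pseudo-acceleration
# spectrum of a synthetic ground motion is computed by stepping a damped SDOF
# oscillator through the record with Newmark's average-acceleration scheme.
import numpy as np

def sdof_peak_disp(ag, dt, period, zeta=0.05):
    """Peak relative displacement of a damped SDOF oscillator under base acceleration ag."""
    w = 2 * np.pi / period
    m, c, k = 1.0, 2 * zeta * w, w ** 2
    keff = k + 2 * c / dt + 4 * m / dt ** 2
    u = v = a = 0.0
    umax = 0.0
    for p_next in -ag:                      # effective load: -m * ag, with m = 1
        peff = p_next + m * (4 * u / dt ** 2 + 4 * v / dt + a) + c * (2 * u / dt + v)
        u_next = peff / keff
        v_next = 2 * (u_next - u) / dt - v
        a_next = 4 * (u_next - u) / dt ** 2 - 4 * v / dt - a
        u, v, a = u_next, v_next, a_next
        umax = max(umax, abs(u))
    return umax

dt = 0.01
ag = 0.3 * 9.81 * np.random.default_rng(1).standard_normal(2000)   # synthetic record
periods = np.linspace(0.05, 3.0, 60)
spectrum = [(2 * np.pi / T) ** 2 * sdof_peak_disp(ag, dt, T) for T in periods]  # Sa = w^2 * Sd
print("peak pseudo-acceleration (m/s^2):", round(max(spectrum), 2))
```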

  15. Chemometrics-assisted spectrophotometric green method for correcting interferences in biowaiver studies: Application to assay and dissolution profiling study of donepezil hydrochloride tablets

    Science.gov (United States)

    Korany, Mohamed A.; Mahgoub, Hoda; Haggag, Rim S.; Ragab, Marwa A. A.; Elmallah, Osama A.

    2018-06-01

    A green, simple and cost-effective chemometric UV-Vis spectrophotometric method has been developed and validated for correcting interferences that arise during biowaiver studies. Chemometric manipulation was used to enhance the results of direct absorbance measurements at the very low concentrations (with a high incidence of background-noise interference) of the earlier time points of the dissolution profile, using first- and second-derivative (D1 and D2) methods and their corresponding Fourier-function convoluted methods (D1/FF and D2/FF). The method was applied to a biowaiver study of donepezil hydrochloride (DH) as a representative model, comparing two different dosage forms containing 5 mg DH per tablet, as an application of the developed chemometric method for correcting interferences as well as for assay and dissolution testing of the tablet dosage form. The results showed that the first-derivative technique can be used to enhance the data in the low concentration range of DH (1-8 μg mL-1) in the three different pH dissolution media used to estimate the low drug concentrations dissolved at the early points of the biowaiver study. Furthermore, the results showed similarity in phosphate buffer pH 6.8 and dissimilarity in the other two pH media. The method was validated according to ICH guidelines and the USP monograph for both assay (HCl of pH 1.2) and dissolution study in three pH media (HCl of pH 1.2, acetate buffer of pH 4.5 and phosphate buffer of pH 6.8). Finally, the greenness of the method was assessed using two different techniques: the National Environmental Method Index label and the Eco-scale method. Both techniques confirmed the greenness of the proposed method.
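    Derivative spectrophotometry of the D1/D2 kind described above is commonly implemented with a smoothing-differentiation filter. The sketch below uses a Savitzky-Golay filter on a synthetic noisy absorbance band as a stand-in; it is not the authors' Fourier-function convolution procedure, and the band and noise parameters are invented.

```python
# Stand-in sketch for derivative spectrophotometry: first- and second-derivative
# spectra of a noisy synthetic absorbance band computed with a Savitzky-Golay filter.
import numpy as np
from scipy.signal import savgol_filter

wavelength = np.arange(200.0, 320.0, 0.5)                   # nm
band = 0.05 * np.exp(-((wavelength - 270.0) / 12.0) ** 2)   # weak Gaussian band (low conc.)
noise = np.random.default_rng(2).normal(0, 0.002, wavelength.size)
absorbance = band + noise

d1 = savgol_filter(absorbance, window_length=21, polyorder=3, deriv=1, delta=0.5)
d2 = savgol_filter(absorbance, window_length=21, polyorder=3, deriv=2, delta=0.5)

# Peak-to-trough amplitude of the derivative spectrum is the usual analytical signal
print("D1 signal:", round(d1.max() - d1.min(), 5))
print("D2 signal:", round(d2.max() - d2.min(), 5))
```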

  16. Comparative Study of Different Processing Methods for the ...

    African Journals Online (AJOL)

    Both processing methods reduced the cyanide concentration to the minimum level required by the World Health Organization (10 mg/kg). The mechanical pressing-fermentation method removed more cyanide than the fermentation processing method. Keywords: Cyanide, Fermentation, Manihot ...

  17. Experimental methods for studying the diffusion of radioactive gases in solids. VII. Sorption method

    International Nuclear Information System (INIS)

    Bekman, I.N.

    1983-01-01

    The details of the use of a sorption method in the study of the diffusion of gases and vapors labeled with radioactive tracers in solids are considered. Three variants of diffusion systems, which permit the determination of the diffusion coefficient and the solubility constant of gases both from the increase in the amount of diffusate in the sample and from the decrease in its amount in the reservoir, have been tested. Different ways of conducting the experiment are discussed. A universal method for taking into account the processes of absorption and scattering of radiation in the material of the sample is proposed. The experimental results were treated with the aid of a specially developed program package, which runs on computers of the BESM-6 type. Various mathematical models of the diffusion of gases in solids are analyzed. Solutions of the diffusion equations under the boundary conditions of the sorption method are obtained for the cases of diffusion with trapping, dissociative diffusion, and diffusion in a plate containing spherical inclusions. The method has been tested on the example of the diffusion of a radioactive inert gas, radon-222, in low-density polyethylene
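    In the sorption method, the diffusion coefficient is normally extracted by comparing the measured fractional uptake with the textbook series solution of Fick's second law for a plane sheet (Crank). A sketch of that solution is given below with illustrative values of D and sheet thickness; these numbers are not taken from the paper.

```python
# Fractional sorption uptake of a plane sheet (both faces exposed) from the standard
# series solution of Fick's second law.  Values of D and L are illustrative only.
import numpy as np

def fractional_uptake(t, D, L, n_terms=50):
    """M_t / M_inf for a sheet of thickness L with constant surface concentration."""
    n = np.arange(n_terms)[:, None]
    terms = 8.0 / ((2 * n + 1) ** 2 * np.pi ** 2) * np.exp(
        -D * (2 * n + 1) ** 2 * np.pi ** 2 * t / L ** 2)
    return 1.0 - terms.sum(axis=0)

D = 1e-11    # m^2/s, hypothetical diffusion coefficient of the gas in the polymer
L = 1e-3     # m, sheet thickness
t = np.linspace(0, 2e5, 6)   # s
for ti, f in zip(t, fractional_uptake(t, D, L)):
    print(f"t = {ti:8.0f} s   M_t/M_inf = {f:.3f}")
```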

  18. Study of the Socratic method during cognitive restructuring.

    Science.gov (United States)

    Froján-Parga, María Xesús; Calero-Elvira, Ana; Montaño-Fidalgo, Montserrat

    2011-01-01

    Cognitive restructuring, in particular in the form of the Socratic method, is widely used by clinicians. However, little research has been published with respect to underlying processes, which has hindered well-accepted explanations of its effectiveness. The aim of this study is to present a new method of analysis of the Socratic method during cognitive restructuring based on the observation of the therapist's verbal behaviour. Using recordings from clinical sessions, 18 sequences were selected in which the Socratic method was applied by six cognitive-behavioural therapists working at a private clinical centre in Madrid. The recordings involved eight patients requiring therapy for various psychological problems. Observations were coded using a category system designed by the authors and that classifies the therapist's verbal behaviour into seven hypothesized functions based on basic behavioural operations. We used the Observer XT software to code the observed sequences. The results are summarized through a preliminary model which considers three different phases of the Socratic method and some functions of the therapist's verbal behaviour in each of these phases: discriminative and reinforcement functions in the starting phase, informative and motivational functions in the course of the debate, and instructional and reinforcement functions in the final phase. We discuss the long-term potential clinical benefits of the current proposal.  Copyright © 2010 John Wiley & Sons, Ltd.

  19. Automatically classifying sentences in full-text biomedical articles into Introduction, Methods, Results and Discussion.

    Science.gov (United States)

    Agarwal, Shashank; Yu, Hong

    2009-12-01

    Biomedical texts can be typically represented by four rhetorical categories: Introduction, Methods, Results and Discussion (IMRAD). Classifying sentences into these categories can benefit many other text-mining tasks. Although many studies have applied different approaches for automatically classifying sentences in MEDLINE abstracts into the IMRAD categories, few have explored the classification of sentences that appear in full-text biomedical articles. We first evaluated whether sentences in full-text biomedical articles could be reliably annotated into the IMRAD format and then explored different approaches for automatically classifying these sentences into the IMRAD categories. Our results show an overall annotation agreement of 82.14% with a Kappa score of 0.756. The best classification system is a multinomial naïve Bayes classifier trained on manually annotated data that achieved 91.95% accuracy and an average F-score of 91.55%, which is significantly higher than baseline systems. A web version of this system is available online at-http://wood.ims.uwm.edu/full_text_classifier/.
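    The best-performing system described above is a multinomial naïve Bayes classifier over sentence features; the sketch below shows the general shape of such a classifier with a bag-of-words representation. The handful of training sentences are invented placeholders, whereas the real system was trained on the large manually annotated corpus described in the abstract.

```python
# Minimal sketch of an IMRAD sentence classifier of the kind described above:
# a multinomial naive Bayes model over bag-of-words features.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

sentences = [
    "Little is known about the role of this gene in disease.",        # Introduction
    "Cells were cultured for 48 hours before RNA extraction.",        # Methods
    "Expression was significantly higher in treated samples.",        # Results
    "These findings suggest a regulatory role for the protein.",      # Discussion
]
labels = ["introduction", "methods", "results", "discussion"]

clf = make_pipeline(CountVectorizer(lowercase=True, stop_words="english"),
                    MultinomialNB())
clf.fit(sentences, labels)

print(clf.predict(["Samples were centrifuged and the supernatant was collected."]))
```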

  20. Report on the uncertainty methods study

    International Nuclear Information System (INIS)

    1998-06-01

    The Uncertainty Methods Study (UMS) Group, following a mandate from CSNI, has compared five methods for calculating the uncertainty in the predictions of advanced 'best estimate' thermal-hydraulic codes: the Pisa method (based on extrapolation from integral experiments) and four methods identifying and combining input uncertainties. Three of these, the GRS, IPSN and ENUSA methods, use subjective probability distributions, and one, the AEAT method, performs a bounding analysis. Each method has been used to calculate the uncertainty in specified parameters for the LSTF SB-CL-18 5% cold leg small break LOCA experiment in the ROSA-IV Large Scale Test Facility (LSTF). The uncertainty analysis was conducted essentially blind and the participants did not use experimental measurements from the test as input apart from initial and boundary conditions. Participants calculated uncertainty ranges for experimental parameters including pressurizer pressure, primary circuit inventory and clad temperature (at a specified position) as functions of time

  1. Nondestructive, fast methods for burn-up study

    International Nuclear Information System (INIS)

    Schaechter, L.; Hacman, D.; Mot, O.

    1977-01-01

    Nondestructive methods based on high-resolution spectrometry, successfully applied at the Institute for Atomic Physics, are presented. These methods are preferred to destructive chemical methods, the latter being costly and lengthy and not suitable for statistical prediction of nuclear fuel behaviour. The following methods were developed: methods for determining the burn-up of fuel elements and fuel assemblies; a method for determining the U-235 and Pu-239 contributions to the burn-up, together with a code written in FORTRAN IV for numerical calculation of Pu-239 fission vs. burn-up; a high-precision method for burn-up determination by adding burnable poison; a method for predicting the specific power distribution in the fuel elements of research or power reactors; a method for determining the power output of a fuel element in an operating power reactor; and a method for determining the Pu-239 content of a fuel element irradiated in a reactor. The results obtained by these methods improved the fuel management at the VVR-S reactor of the Institute for Atomic Physics, Bucharest, and may be applied to other reactor types [fr

  2. Vitamin D and clinical disease progression in HIV infection: results from the EuroSIDA study

    DEFF Research Database (Denmark)

    Viard, Jean-Paul; Souberbielle, Jean-Claude; Kirk, Ole

    2011-01-01

    BACKGROUND:: We examined the association between vitamin D [25(OH)D] level and disease progression in HIV infection. METHODS:: Within the EuroSIDA study, 2000 persons were randomly selected for 25(OH)D measurement in stored plasma samples closest to study entry. 25(OH)D results were stratified...

  3. Unsupervised text mining methods for literature analysis: a case study for Thomas Pynchon's V.

    Directory of Open Access Journals (Sweden)

    Christos Iraklis Tsatsoulis

    2013-08-01

    Full Text Available We investigate the use of unsupervised text mining methods for the analysis of prose literature works, using Thomas Pynchon's novel 'V'. as a case study. Our results suggest that such methods may be employed to reveal meaningful information regarding the novel’s structure. We report results using a wide variety of clustering algorithms, several distinct distance functions, and different visualization techniques. The application of a simple topic model is also demonstrated. We discuss the meaningfulness of our results along with the limitations of our approach, and we suggest some possible paths for further study.
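    A typical unsupervised pipeline of the kind reported (vectorizing chunks of the text and clustering them) can be sketched as follows. The "chapters" are placeholder strings, the vectorizer and clustering choices (TF-IDF, k-means with two clusters) are arbitrary, and nothing here reproduces the paper's specific algorithms or findings.

```python
# One unsupervised pipeline of the kind reported above: chunks of text are embedded
# as TF-IDF vectors and grouped with k-means.  The "chapters" below are placeholders;
# the study worked on the full text of the novel, which is not reproduced here.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

chapters = [
    "placeholder text of chapter one about sailors and the street",
    "placeholder text of chapter two about espionage in cairo",
    "placeholder text of chapter three about the street and the sewers",
    "placeholder text of chapter four about espionage and siege",
]

X = TfidfVectorizer(stop_words="english").fit_transform(chapters)
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print("cluster assignments:", km.labels_.tolist())
```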

  4. Description and pilot results from a novel method for evaluating return of incidental findings from next-generation sequencing technologies.

    Science.gov (United States)

    Goddard, Katrina A B; Whitlock, Evelyn P; Berg, Jonathan S; Williams, Marc S; Webber, Elizabeth M; Webster, Jennifer A; Lin, Jennifer S; Schrader, Kasmintan A; Campos-Outcalt, Doug; Offit, Kenneth; Feigelson, Heather Spencer; Hollombe, Celine

    2013-09-01

    The aim of this study was to develop, operationalize, and pilot test a transparent, reproducible, and evidence-informed method to determine when to report incidental findings from next-generation sequencing technologies. Using evidence-based principles, we proposed a three-stage process. Stage I "rules out" incidental findings below a minimal threshold of evidence and is evaluated using inter-rater agreement and comparison with an expert-based approach. Stage II documents criteria for clinical actionability using a standardized approach to allow experts to consistently consider and recommend whether results should be routinely reported (stage III). We used expert opinion to determine the face validity of stages II and III using three case studies. We evaluated the time and effort for stages I and II. For stage I, we assessed 99 conditions and found high inter-rater agreement (89%), and strong agreement with a separate expert-based method. Case studies for familial adenomatous polyposis, hereditary hemochromatosis, and α1-antitrypsin deficiency were all recommended for routine reporting as incidental findings. The method requires definition of clinically actionable incidental findings and provides documentation and pilot testing of a feasible approach that is scalable to the whole genome.

  5. Relationships Between Results Of An Internal And External Match Load Determining Method In Male, Singles Badminton Players.

    Science.gov (United States)

    Abdullahi, Yahaya; Coetzee, Ben; Van den Berg, Linda

    2017-07-03

    The study purpose was to determine relationships between results of internal and external match load determining methods. Twenty-one players, who participated in selected badminton championships during the 2014/2015 season served as subjects. The heart rate (HR) values and GPS data of each player were obtained via a fix Polar HR Transmitter Belt and MinimaxX GPS device. Moderate significant Spearman's rank correlations were found between HR and absolute duration (r = 0.43 at a low intensity (LI) and 0.44 at a high intensity (HI)), distance covered (r = 0.42 at a HI) and player load (PL) (r = 0.44 at a HI). Results also revealed an opposite trend for external and internal measures of load as the average relative HR value was found to be the highest for the HI zone (54.1%) compared to the relative measures of external load where average values (1.29-9.89%) were the lowest for the HI zone. In conclusion, our findings show that results of an internal and external badminton match load determining method are more related to each other in the HI zone than other zones and that the strength of relationships depend on the duration of activities that are performed in especially LI and HI zones. Overall, trivial to moderate relationships between results of an internal and external match load determining method in male, singles badminton players reaffirm the conclusions of others that these constructs measure distinctly different demands and should therefore be measured concurrently to fully understand the true requirements of badminton match play.

  6. An at-site flood estimation method in the context of nonstationarity I. A simulation study

    Science.gov (United States)

    Gado, Tamer A.; Nguyen, Van-Thanh-Van

    2016-04-01

    The stationarity of annual flood peak records is the traditional assumption of flood frequency analysis. In some cases, however, as a result of land-use and/or climate change, this assumption is no longer valid. Therefore, new statistical models are needed to capture dynamically the change of probability density functions over time, in order to obtain reliable flood estimation. In this study, an innovative method for nonstationary flood frequency analysis was presented. Here, the new method is based on detrending the flood series and applying the L-moments along with the GEV distribution to the transformed "stationary" series (hereafter, this is called the LM-NS). The LM-NS method was assessed through a comparative study with the maximum likelihood (ML) method for the nonstationary GEV model, as well as with the stationary (S) GEV model. The comparative study, based on Monte Carlo simulations, was carried out for three nonstationary GEV models: a linear dependence of the mean on time (GEV1), a quadratic dependence of the mean on time (GEV2), and linear dependence in both the mean and log standard deviation on time (GEV11). The simulation results indicated that the LM-NS method performs better than the ML method for most of the cases studied, whereas the stationary method provides the least accurate results. An additional advantage of the LM-NS method is to avoid the numerical problems (e.g., convergence problems) that may occur with the ML method when estimating parameters for small data samples.
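    The LM-NS idea described above (detrend the annual peak series, then fit a GEV to the resulting "stationary" series by L-moments) can be sketched as follows, using Hosking's standard approximations for the GEV parameters. The flood series is synthetic, and the exact detrending and re-trending conventions of the paper are not reproduced.

```python
# Sketch of the LM-NS idea: remove a fitted linear trend in the mean from the annual
# flood peak series, then fit a GEV to the detrended series with sample L-moments and
# Hosking's approximations.  The data are synthetic.
import numpy as np
from scipy.special import gamma

def sample_lmoments(x):
    """First two sample L-moments and L-skewness via unbiased probability-weighted moments."""
    x = np.sort(x)
    n = len(x)
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
    l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
    return l1, l2, l3 / l2                      # lambda1, lambda2, tau3

def gev_from_lmoments(l1, l2, t3):
    """GEV parameters (location, scale, shape k) from L-moments (Hosking 1990)."""
    c = 2.0 / (3.0 + t3) - np.log(2.0) / np.log(3.0)
    k = 7.8590 * c + 2.9554 * c ** 2
    alpha = l2 * k / ((1 - 2.0 ** (-k)) * gamma(1 + k))
    xi = l1 - alpha * (1 - gamma(1 + k)) / k
    return xi, alpha, k

def gev_quantile(F, xi, alpha, k):
    return xi + alpha * (1 - (-np.log(F)) ** k) / k

rng = np.random.default_rng(3)
years = np.arange(1960, 2020)
peaks = 100 + 0.8 * (years - years[0]) + rng.gumbel(0, 25, years.size)  # trending series

trend = np.polyfit(years, peaks, 1)
detrended = peaks - np.polyval(trend, years) + peaks.mean()   # "stationary" series

xi, alpha, k = gev_from_lmoments(*sample_lmoments(detrended))
print("GEV parameters:", round(xi, 1), round(alpha, 1), round(k, 3))
print("100-year quantile of detrended series:", round(gev_quantile(0.99, xi, alpha, k), 1))
```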

  7. The Ronnie Gardiner Rhythm and Music Method - a feasibility study in Parkinson's disease.

    Science.gov (United States)

    Pohl, Petra; Dizdar, Nil; Hallert, Eva

    2013-01-01

    To assess the feasibility of the novel intervention, Ronnie Gardiner Rhythm and Music (RGRM™) Method compared to a control group for patients with Parkinson's disease (PD). Eighteen patients, mean age 68, participating in a disability study within a neurological rehabilitation centre, were randomly allocated to intervention group (n = 12) or control group (n = 6). Feasibility was assessed by comparing effects of the intervention on clinical outcome measures (primary outcome: mobility as assessed by two-dimensional motion analysis, secondary outcomes: mobility, cognition, quality of life, adherence, adverse events and eligibility). Univariable analyses showed no significant differences between groups following intervention. However, analyses suggested that patients in the intervention group improved more on mobility (p = 0.006), cognition and quality of life than patients in the control group. There were no adverse events and a high level of adherence to therapy was observed. In this disability study, the use of the RGRM™ Method showed promising results in the intervention group and the adherence level was high. Our results suggest that most assessments chosen are eligible to use in a larger randomized controlled study for patients with PD. The RGRM™ Method appeared to be a useful and safe method that showed promising results in both motor and cognitive functions as well as quality of life in patients with moderate PD. The RGRM™ Method can be used by physiotherapists, occupational, speech and music therapists in neurological rehabilitation. Most measurements were feasible except for Timed-Up-and-Go.

  8. Study of the radioactivity of rocks by the photographic method

    Energy Technology Data Exchange (ETDEWEB)

    Picciotto, E E

    1949-08-16

    The use of photographic plates, and especially of the new Ilford and Kodak plates, in nuclear physics is briefly described. In particular, the application of these methods to the study of the radioactivity of rocks is discussed. In a series of studies made by the authors, the photographic plates were placed in close contact with a thin, highly polished sheet of the rock sample and then developed under specified conditions. This method was used to determine the concentration of U and Th in two radioactive rock samples and the results are given. The samples were then reduced to powder form and the concentrations were again determined. Work on dissolved samples has not yet been completed. In conclusion, the relative merits of these different techniques are indicated.

  9. The relationship of the local food environment with obesity: A systematic review of methods, study quality, and results.

    Science.gov (United States)

    Cobb, Laura K; Appel, Lawrence J; Franco, Manuel; Jones-Smith, Jessica C; Nur, Alana; Anderson, Cheryl A M

    2015-07-01

    To examine the relationship between local food environments and obesity and assess the quality of studies reviewed. Systematic keyword searches identified studies from US and Canada that assessed the relationship of obesity to local food environments. We applied a quality metric based on design, exposure and outcome measurement, and analysis. We identified 71 studies representing 65 cohorts. Overall, study quality was low; 60 studies were cross-sectional. Associations between food outlet availability and obesity were predominantly null. Among non-null associations, we saw a trend toward inverse associations between supermarket availability and obesity (22 negative, 4 positive, 67 null) and direct associations between fast food and obesity (29 positive, 6 negative, 71 null) in adults. We saw direct associations between fast food availability and obesity in lower income children (12 positive, 7 null). Indices including multiple food outlets were most consistently associated with obesity in adults (18 expected, 1 not expected, 17 null). Limiting to higher quality studies did not affect results. Despite the large number of studies, we found limited evidence for associations between local food environments and obesity. The predominantly null associations should be interpreted cautiously due to the low quality of available studies. © 2015 The Obesity Society.

  10. Evaluation and Comparison of the Processing Methods of Airborne Gravimetry Concerning the Errors Effects on Downward Continuation Results: Case Studies in Louisiana (USA) and the Tibetan Plateau (China).

    Science.gov (United States)

    Zhao, Qilong; Strykowski, Gabriel; Li, Jiancheng; Pan, Xiong; Xu, Xinyu

    2017-05-25

    Gravity data gaps in mountainous areas are nowadays often filled in with the data from airborne gravity surveys. Errors caused by the airborne gravimeter sensors and by rough flight conditions cannot be completely eliminated, and the precision of the gravity disturbances generated by airborne gravimetry is around 3-5 mgal. A major obstacle in using airborne gravimetry is the error introduced by downward continuation. In order to improve the results, external high-accuracy gravity information, e.g. from surface data, can be used for high-frequency correction, while satellite information can be applied for low-frequency correction. Surface data may be used to reduce the systematic errors, while regularization methods can reduce the random errors in downward continuation. Airborne gravity surveys are sometimes conducted in mountainous areas, and the most extreme area of the world for this type of survey is the Tibetan Plateau. Since there are no high-accuracy surface gravity data available for this area, the above error minimization method involving the external gravity data cannot be used. We propose a semi-parametric downward continuation method in combination with regularization to suppress the systematic error effect and the random error effect in the Tibetan Plateau; i.e., without the use of the external high-accuracy gravity data. We use a Louisiana airborne gravity dataset from the USA National Oceanic and Atmospheric Administration (NOAA) to demonstrate that the new method works effectively. Furthermore, for the Tibetan Plateau we show that the numerical experiment is also successful using synthetic Earth Gravitational Model 2008 (EGM08)-derived gravity data contaminated with synthetic errors. The estimated systematic errors generated by the method are close to the simulated values. In addition, we study the relationship between the downward continuation altitudes and the error effect. The
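    The role of regularization in downward continuation can be illustrated with a simple wavenumber-domain sketch for a 1D profile: the upward-continuation operator exp(-|k|h) is inverted with a Tikhonov damping term so that high-wavenumber noise is not amplified. This only illustrates why regularization is needed; it is not the semi-parametric method proposed in the study, and the profile, flight height and noise level are invented.

```python
# Illustrative sketch of Tikhonov-regularized downward continuation of a 1D gravity
# profile in the wavenumber domain.  Upward continuation by height h multiplies each
# Fourier coefficient by exp(-|k|h); the regularized inverse exp(-|k|h)/(exp(-2|k|h)
# + alpha) damps the noise blow-up at high wavenumbers.
import numpy as np

def downward_continue(profile, dx, h, alpha=1e-3):
    """Continue a gravity profile downward by height h (same length units as dx)."""
    k = np.abs(2 * np.pi * np.fft.fftfreq(profile.size, d=dx))
    up = np.exp(-k * h)                  # upward-continuation operator
    filt = up / (up ** 2 + alpha)        # Tikhonov-regularized inverse
    return np.fft.ifft(np.fft.fft(profile) * filt).real

# synthetic example: ground-level anomaly observed at flight height, plus noise
x = np.arange(0, 50_000.0, 200.0)                                  # m
true_at_ground = 5.0 * np.exp(-((x - 25_000.0) / 4_000.0) ** 2)    # mgal
k = np.abs(2 * np.pi * np.fft.fftfreq(x.size, d=200.0))
observed = np.fft.ifft(np.fft.fft(true_at_ground) * np.exp(-k * 4000.0)).real
observed += np.random.default_rng(4).normal(0, 0.05, x.size)       # ~airborne noise level

recovered = downward_continue(observed, dx=200.0, h=4000.0, alpha=1e-3)
print("max recovery error (mgal):", round(np.abs(recovered - true_at_ground).max(), 2))
```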

  11. Evaluation and Comparison of the Processing Methods of Airborne Gravimetry Concerning the Errors Effects on Downward Continuation Results: Case Studies in Louisiana (USA) and the Tibetan Plateau (China)

    Science.gov (United States)

    Zhao, Q.

    2017-12-01

    Gravity data gaps in mountainous areas are nowadays often filled in with data from airborne gravity surveys. Errors caused by the airborne gravimeter sensors and by rough flight conditions cannot be completely eliminated; the precision of the gravity disturbances generated by airborne gravimetry is around 3-5 mGal. A major obstacle in using airborne gravimetry is the error introduced by downward continuation. In order to improve the results, external high-accuracy gravity information, e.g., from surface data, can be used for high-frequency correction, while satellite information can be applied for low-frequency correction. Surface data may be used to reduce the systematic errors, while regularization methods can reduce the random errors in downward continuation. Airborne gravity surveys are sometimes conducted in mountainous areas, and the most extreme area of the world for this type of survey is the Tibetan Plateau. Since no high-accuracy surface gravity data are available for this area, the above error minimization method involving external gravity data cannot be used. We propose a semi-parametric downward continuation method in combination with regularization to suppress the systematic and random error effects in the Tibetan Plateau, i.e., without the use of external high-accuracy gravity data. We use a Louisiana airborne gravity dataset from the USA National Oceanic and Atmospheric Administration (NOAA) to demonstrate that the new method works effectively. Furthermore, for the Tibetan Plateau we show that a numerical experiment is also successfully conducted using synthetic Earth Gravitational Model 2008 (EGM08)-derived gravity data contaminated with synthetic errors. The estimated systematic errors generated by the method are close to the simulated values. In addition, we study the relationship between the downward continuation altitudes and the error effect.

  12. COMPARATIVE STUDY ON MILK CASEIN ASSAY METHODS

    Directory of Open Access Journals (Sweden)

    RODICA CĂPRIŢĂ

    2008-05-01

    Full Text Available Casein, the main milk protein, was determined by different assay methods: the gravimetric method, a method based on the neutralization of the excess NaOH used to dissolve the casein precipitate, and a method based on the titration of the acetic acid used for casein precipitation. The last method is the simplest one, with the fewest steps, and also has the lowest degree of error. The results of the experiment revealed that casein represented between 72.6-81.3% of the whole milk protein in experiment 1, between 73.6-81.3% in experiment 2 and between 74.3-81% in experiment 3.

  13. Application to ion exchange study of an interferometry method

    International Nuclear Information System (INIS)

    Platzer, R.

    1960-01-01

    The numerous experiments carried out on ion exchange between clay suspensions and solutions have so far been done by studying the equilibrium between the two phases; with this method it is very difficult to obtain the kinetic properties of the exchange reactions. A method consisting of observation with an interference microscope using polarised white light reveals the variations in concentration which take place during ion exchange between an ionic solution and a montmorillonite slab, as well as between an ionic solution and a grain of organic ion exchanger. By analysing the results it will be possible to compare the exchange constants of organic ion exchangers with those of mineral ion exchangers. (author) [fr

  14. Studying the Night Shift: A Multi-method Analysis of Overnight Academic Library Users

    Directory of Open Access Journals (Sweden)

    David Schwieder

    2017-09-01

    Full Text Available Abstract Objective – This paper reports on a study which assessed the preferences and behaviors of overnight library users at a major state university. The findings were used to guide the design and improvement of overnight library resources and services, and the selection of a future overnight library site. Methods – A multi-method design used descriptive and correlational statistics to analyze data produced by a multi-sample survey of overnight library users. These statistical methods included rankings, percentages, and multiple regression. ResultsResults showed a strong consistency across statistical methods and samples. Overnight library users consistently prioritized facilities like power outlets for electronic devices, and group and quiet study spaces, and placed far less emphasis on assistance from library staff. Conclusions – By employing more advanced statistical and sampling procedures than had been found in previous research, this paper strengthens the validity of findings on overnight user preferences and behaviors. The multi-method research design can also serve to guide future work in this area.

  15. Raw material consumption of the European Union--concept, calculation method, and results.

    Science.gov (United States)

    Schoer, Karl; Weinzettel, Jan; Kovanda, Jan; Giegrich, Jürgen; Lauwigi, Christoph

    2012-08-21

    This article presents the concept, calculation method, and first results of the "Raw Material Consumption" (RMC) economy-wide material flow indicator for the European Union (EU). The RMC measures the final domestic consumption of products in terms of raw material equivalents (RME), i.e. raw materials used in the complete production chain of consumed products. We employed the hybrid input-output life cycle assessment method to calculate RMC. We first developed a highly disaggregated environmentally extended mixed unit input output table and then applied life cycle inventory data for imported products without appropriate representation of production within the domestic economy. Lastly, we treated capital formation as intermediate consumption. Our results show that services, often considered as a solution for dematerialization, account for a significant part of EU raw material consumption, which emphasizes the need to focus on the full production chains and dematerialization of services. Comparison of the EU's RMC with its domestic extraction shows that the EU is nearly self-sufficient in biomass and nonmetallic minerals but extremely dependent on direct and indirect imports of fossil energy carriers and metal ores. This implies an export of environmental burden related to extraction and primary processing of these materials to the rest of the world. Our results demonstrate that internalizing capital formation has significant influence on the calculated RMC.
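
    The raw-material-equivalents idea behind the RMC indicator can be sketched with a toy environmentally extended input-output calculation. The sector split, coefficient values, and variable names below are hypothetical illustrations and are not taken from the EU tables used in the article.

      # Minimal sketch of a raw-material-equivalents (RME) calculation with an
      # environmentally extended input-output model. All numbers are illustrative.
      import numpy as np

      # Inter-industry technical coefficients A (2 hypothetical sectors).
      A = np.array([[0.10, 0.20],
                    [0.30, 0.05]])

      # Raw material extraction per unit of sectoral output (tonnes per EUR).
      E = np.array([0.8, 0.1])

      # Final domestic demand per sector (EUR).
      y = np.array([100.0, 250.0])

      # Leontief inverse: total output required per unit of final demand.
      L = np.linalg.inv(np.eye(2) - A)

      # Raw material consumption of final demand in raw material equivalents.
      rmc = E @ L @ y
      print(f"RMC (raw material equivalents): {rmc:.1f} tonnes")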

  16. Differential scanning calorimetry method for purity determination: A case study on polycyclic aromatic hydrocarbons and chloramphenicol

    International Nuclear Information System (INIS)

    Kestens, V.; Zeleny, R.; Auclair, G.; Held, A.; Roebben, G.; Linsinger, T.P.J.

    2011-01-01

    Highlights: → Purity assessment of polycyclic aromatic hydrocarbons and chloramphenicol by DSC. → DSC results compared with traditional purity methods. → Different methods give different results, multiple method approach recommended. → DSC sensitive to impurities that have similar structures as main component. - Abstract: In this study the validity and suitability of differential scanning calorimetry (DSC) to determine the purity of selected polycyclic aromatic hydrocarbons and chloramphenicol has been investigated. The study materials were two candidate certified reference materials (CRMs), 6-methylchrysene and benzo[a]pyrene, and two different batches of commercially available highly pure chloramphenicol. The DSC results were compared with those obtained by other methods, namely gas and liquid chromatography with mass spectrometric detection, liquid chromatography with diode array detection, and quantitative nuclear magnetic resonance. The purity results obtained by these different analytical methods confirm the well-known challenges of comparing results of different method-defined measurands. In comparison with other methods, DSC has a much narrower working range. This limits the applicability of DSC as purity determination method, for instance during the assignment of the purity value of a CRM. Nevertheless, this study showed that DSC can be a powerful technique to detect impurities that are structurally very similar to the main purity component. From this point of view, and because of its good repeatability, DSC can be considered as a valuable technique to investigate the homogeneity and stability of candidate purity CRMs.
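
    The abstract does not state how the DSC data were evaluated. A common approach in DSC purity determination is the van't Hoff melting-point-depression analysis, and the sketch below illustrates that general technique with invented peak data (fraction melted F and sample temperature T_s), not with the CRM measurements.

      # Hedged sketch of the van't Hoff purity evaluation often used with DSC data.
      # Temperatures, fractions melted, and the enthalpy of fusion are invented.
      import numpy as np

      R = 8.314          # J/(mol*K)
      dH_fus = 25_000.0  # J/mol, assumed enthalpy of fusion of the main component

      # Sample temperature T_s (K) observed at fraction melted F (from the DSC peak).
      F   = np.array([0.2, 0.3, 0.4, 0.6, 0.8, 1.0])
      T_s = np.array([434.10, 434.35, 434.48, 434.60, 434.66, 434.70])

      # Van't Hoff relation: T_s = T0 - (R*T0**2*x2/dH_fus) * (1/F).
      # A straight-line fit of T_s against 1/F gives T0 (intercept) and the slope,
      # from which the impurity mole fraction x2 follows.
      slope, T0 = np.polyfit(1.0 / F, T_s, 1)
      x2 = -slope * dH_fus / (R * T0**2)
      print(f"Estimated purity: {100 * (1 - x2):.2f} mol%")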

  17. Evaluating Method Engineer Performance: an error classification and preliminary empirical study

    Directory of Open Access Journals (Sweden)

    Steven Kelly

    1998-11-01

    Full Text Available We describe an approach to empirically test the use of metaCASE environments to model methods. Both diagrams and matrices have been proposed as a means for presenting the methods. These different paradigms may have their own effects on how easily and well users can model methods. We extend Batra's classification of errors in data modelling to cover metamodelling, and use it to measure the performance of a group of metamodellers using either diagrams or matrices. The tentative results from this pilot study confirm the usefulness of the classification, and show some interesting differences between the paradigms.

  18. Studies of non-contact methods for roughness measurements on wood surfaces

    International Nuclear Information System (INIS)

    Lundberg, I.A.S.; Porankiewicz, B.

    1995-01-01

    The quality of wood surfaces after different kinds of machining processes is a property of great importance for the wood processing industries. The present work is a study whose objective was to evaluate different non-contact methods for measuring the quality of wood surfaces by correlating them with stylus tracing. A number of Scots pine samples were prepared by different kinds of wood machining processes. Surface roughness measurements were performed utilizing two optical non-contact methods. The results indicate that the laser scan method can measure surface roughness on sawn wood with a sufficient degree of accuracy. (author) [de

  19. Aircrew Exposure To Cosmic Radiation Evaluated By Means Of Several Methods; Results Obtained In 2006

    International Nuclear Information System (INIS)

    Ploc, Ondrej; Spurny, Frantisek; Jadrnickova, Iva; Turek, Karel

    2008-01-01

    Routine evaluation of aircraft crew exposure to cosmic radiation in the Czech Republic is performed by means of a calculation method. Measurements onboard aircraft serve as a control tool for the routine method, as well as an opportunity to compare results measured by several methods. The following methods were used in 2006: (1) a mobile dosimetry unit (MDU) type Liulin, a spectrometer of energy deposited in a Si detector; (2) two types of LET spectrometers based on chemically etched track detectors (TED); (3) two types of thermoluminescent detectors; and (4) two calculation methods. The MDU is currently one of the most reliable instruments for evaluation of aircraft crew exposure to cosmic radiation. It is an active device which measures total energy depositions (E_dep) in the semiconductor unit and, after appropriate calibration, is able to give separate estimates of the non-neutron and neutron-like components of H*(10). This contribution consists mostly of results acquired by means of this equipment; measurements with passive detectors and calculations are mentioned for comparison. Reasonably good agreement among all data sets was found.

  20. INTEGRATION OF PRODUCTION AND SUPPLY IN THE LEAN MANUFACTURING CONDITIONS ACCORDING TO THE LOT FOR LOT METHOD LOGIC - RESULTS OF RESEARCH

    Directory of Open Access Journals (Sweden)

    Roman Domański

    2015-12-01

    Full Text Available Background: The review of the literature and observations of business practice indicate that the integration of production and supply is not a well-developed area of science. The author notes that publications on this integration most often focus on selected detailed aspects and are rather postulative in character. This is accompanied by the absence of specific utilitarian solutions (tools) which could be used in business practice. Methods: The research was conducted between 2009 and 2010 in a company in Wielkopolska which operates in the machining sector. The solution of the research problem is based on the author's own concept, the integration model. The cost concept of the solution was built and verified (case study) on the basis of the conditions of the given enterprise (industrial data). Results: Partial verifiability of the results was demonstrated for the entire set of selected material indexes (although in two cases out of three the cost differences to the disadvantage of the lot-for-lot method were small). For the structure of the studied product range, a significant conformity of results, of the order of 67%, was achieved for items typically characteristic of the LfL method (group AX). Conclusions: The formulated research problem and the result of its solution (only 6 material items) demand a lot (orthodoxy) in terms of implementation conditions. The concept of the solution has a narrow field of application in the selected organizational conditions (the studied enterprise). It should be verified by independent studies of this kind at other enterprises.

  1. Feasibility to implement the radioisotopic method of nasal mucociliary transport measurement getting reliable results

    International Nuclear Information System (INIS)

    Troncoso, M.; Opazo, C.; Quilodran, C.; Lizama, V.

    2002-01-01

    Aim: Our goal was to implement the radioisotopic method for measuring the nasal mucociliary velocity of transport (NMVT) in a feasible way, in order to make it easily available, and to validate the accuracy of the results. Such a method is needed when primary ciliary dyskinesia (PCD) is suspected, a disorder characterized by low NMVT and non-specific chronic respiratory symptoms that needs to be confirmed by electron microscopic examination of a cilia biopsy. Methods: We performed one hundred studies from February 2000 until February 2002. Patients were aged 2 months to 39 years, mean 9 years. All of them were referred from the Respiratory Disease Department. Ninety had upper or lower respiratory symptoms; ten were healthy controls. The procedure, performed by the Nuclear Medicine Technologist, consists of placing a 20 μl drop of 99mTc-MAA (0.1 mCi, 4 MBq) behind the head of the inferior turbinate in one nostril using a frontal light, a nasal speculum and a teflon catheter attached to a tuberculin syringe. The drop movement was acquired in a gamma camera-computer system and the velocity was expressed in mm/min. As the patient must not move during the procedure, sedation has to be used in non-cooperative children. Cases with abnormal NMVT values were referred for nasal biopsy. Patients were classified in three groups: normal controls (NC), PCD confirmed by biopsy (PCDB) and cases with respiratory symptoms without biopsy (RSNB). In all patients with NMVT less than 2.4 mm/min, PCD was confirmed by biopsy. There was a clear-cut separation between normal and abnormal values and, interestingly, even the highest NMVT in PCDB cases was lower than the lowest NMVT in NC. The procedure is not as easy as generally described in the literature, because the operator has to acquire some skill and sedation is needed in some cases. Conclusion: The procedure gives reliable, reproducible and objective results. It is safe, inexpensive and quick in cooperative patients.

  2. Methods and results of diuresis renography in infants and children. Methodik und Ergebnisse der Diurese-Nephrographie im Kindesalter

    Energy Technology Data Exchange (ETDEWEB)

    Kleinhans, E. (Klinik fuer Nuklearmedizin, RWTH Aachen (Germany)); Rohrmann, D. (Urologische Klinik, RWTH Aachen (Germany)); Stollbrink, C. (Paediatrische Klinik, RWTH Aachen (Germany)); Mertens, R. (Paediatrische Klinik, RWTH Aachen (Germany)); Jakse, G. (Urologische Klinik, RWTH Aachen (Germany)); Buell, U. (Klinik fuer Nuklearmedizin, RWTH Aachen (Germany))

    1994-02-01

    In infants and children with hydronephrosis, the decision-making process for distinguishing those instances of urinary tract dilatation that require surgical correction from those that do not is based in part on the findings of diuresis renography. Quantitative analysis of the renogram curve pattern is a well-established tool which, in addition, provides comparable results in follow-up studies. However, standardization of the method, including data analysis, does not yet exist. In this study, three parameters obtained by mathematical curve analysis were examined: the clearance half-time of the diuretic response, the clearance within 5 minutes and the clearance within 16 minutes. The 16-minute clearance proved superior in discriminating obstructive impairments of urine drainage from non-obstructive ones. Compared to the clearance half-time, the markedly shorter duration of the examination (16 minutes) is an additional benefit. (orig.)

  3. Phylogenetic representativeness: a new method for evaluating taxon sampling in evolutionary studies

    Directory of Open Access Journals (Sweden)

    Passamonti Marco

    2010-04-01

    Full Text Available Abstract Background Taxon sampling is a major concern in phylogenetic studies. Incomplete, biased, or improper taxon sampling can lead to misleading results in reconstructing evolutionary relationships. Several theoretical methods are available to optimize taxon choice in phylogenetic analyses. However, most involve some knowledge about the genetic relationships of the group of interest (i.e., the ingroup), or even a well-established phylogeny itself; these data are not always available in general phylogenetic applications. Results We propose a new method to assess taxon sampling by developing Clarke and Warwick statistics. This method aims to measure the "phylogenetic representativeness" of a given sample or set of samples and is based entirely on the pre-existing available taxonomy of the ingroup, which is commonly known to investigators. Moreover, our method also accounts for instability and discordance in taxonomies. A Python-based script suite, called PhyRe, has been developed to implement all analyses we describe in this paper. Conclusions We show that this method is sensitive and allows direct discrimination between representative and unrepresentative samples. It is also informative about the addition of taxa to improve taxonomic coverage of the ingroup. Although the investigators' expertise remains essential in this field, phylogenetic representativeness constitutes an objective touchstone in planning phylogenetic studies.

  4. A method for studying decision-making by guideline development groups

    Directory of Open Access Journals (Sweden)

    Michie Susan

    2009-08-01

    Full Text Available Abstract Background Multidisciplinary guideline development groups (GDGs) have considerable influence on UK healthcare policy and practice, but previous research suggests that research evidence is a variable influence on GDG recommendations. The Evidence into Recommendations (EiR) study has been set up to document social-psychological influences on GDG decision-making. In this paper we aim to evaluate the relevance of existing qualitative methodologies to the EiR study and to develop a method best suited to capturing influences on GDG decision-making. Methods A research team comprising three postdoctoral research fellows and a multidisciplinary steering group assessed the utility of extant qualitative methodologies for coding verbatim GDG meeting transcripts and semi-structured interviews with GDG members. A unique configuration of techniques was developed to permit data reduction and analysis. Results Our method incorporates techniques from thematic analysis, grounded theory analysis, content analysis, and framework analysis. Thematic analysis of individual interviews conducted with group members at the start and end of the GDG process defines discrete problem areas to guide data extraction from GDG meeting transcripts. Data excerpts are coded both inductively and deductively, using concepts taken from theories of decision-making, social influence and group processes. These codes inform a framework analysis to describe and explain incidents within GDG meetings. We illustrate the application of the method by discussing some preliminary findings of a study of a National Institute for Health and Clinical Excellence (NICE) acute physical health GDG. Conclusion This method is currently being applied to study the meetings of three NICE GDGs. These cover topics in acute physical health, mental health and public health, and comprise a total of 45 full-day meetings. The method offers potential for application to other health care and decision

  5. MULTICRITERIA METHODS IN PERFORMING COMPANIES’ RESULTS USING ELECTRONIC RECRUITING, CORPORATE COMMUNICATION AND FINANCIAL RATIOS

    Directory of Open Access Journals (Sweden)

    Ivana Bilić

    2011-02-01

    Full Text Available Human resources represent one of a company's most important resources, responsible for creating its competitive advantage. In the search for the most valuable people, companies use different methods. Lately, one of the growing methods is electronic recruiting, not only as a recruitment tool but also as a means of external communication. Additionally, in the process of corporate communication, companies nowadays use electronic corporate communication as the easiest, cheapest and simplest form of business communication. The aim of this paper is to investigate the relationship between three groups of criteria: the main characteristics of the electronic recruiting performed, corporate communication, and selected financial performance indicators. The selected companies were ranked separately by each group of criteria using the multicriteria decision-making method PROMETHEE II. The main idea is to investigate whether companies that are the highest performers on a certain group of criteria obtain similar results with regard to the other groups of criteria or performance results.
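
    The PROMETHEE II ranking step can be illustrated with a minimal sketch. The companies, criteria values, weights, and the simple "usual" preference function below are assumptions for illustration, not the study's data.

      # Minimal PROMETHEE II sketch with the "usual" preference function
      # (P = 1 if the difference favours a over b, else 0). Data are hypothetical.
      import numpy as np

      # Rows = alternatives (companies), columns = criteria (all to be maximized).
      X = np.array([[0.70, 0.55, 0.80],
                    [0.60, 0.75, 0.65],
                    [0.85, 0.40, 0.70]])
      w = np.array([0.5, 0.3, 0.2])          # criteria weights, summing to 1

      n = X.shape[0]
      pi = np.zeros((n, n))                   # aggregated preference indices
      for a in range(n):
          for b in range(n):
              if a != b:
                  d = X[a] - X[b]
                  pi[a, b] = np.sum(w * (d > 0))   # usual preference function

      phi_plus = pi.sum(axis=1) / (n - 1)     # positive outranking flow
      phi_minus = pi.sum(axis=0) / (n - 1)    # negative outranking flow
      phi = phi_plus - phi_minus              # net flow used for the complete ranking
      print("Net flows:", np.round(phi, 3))
      print("Ranking (best first):", np.argsort(-phi))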

  6. Spacelab Science Results Study

    Science.gov (United States)

    Naumann, R. J.; Lundquist, C. A.; Tandberg-Hanssen, E.; Horwitz, J. L.; Germany, G. A.; Cruise, J. F.; Lewis, M. L.; Murphy, K. L.

    2009-01-01

    Beginning with OSTA-1 in November 1981 and ending with Neurolab in March 1998, a total of 36 Shuttle missions carried various Spacelab components such as the Spacelab module, pallet, instrument pointing system, or mission peculiar experiment support structure. The experiments carried out during these flights included astrophysics, solar physics, plasma physics, atmospheric science, Earth observations, and a wide range of microgravity experiments in life sciences, biotechnology, materials science, and fluid physics which includes combustion and critical point phenomena. In all, some 764 experiments were conducted by investigators from the U.S., Europe, and Japan. The purpose of this Spacelab Science Results Study is to document the contributions made in each of the major research areas by giving a brief synopsis of the more significant experiments and an extensive list of the publications that were produced. We have also endeavored to show how these results impacted the existing body of knowledge, where they have spawned new fields, and if appropriate, where the knowledge they produced has been applied.

  7. The Trojan Horse method for nuclear astrophysics: Recent results for direct reactions

    International Nuclear Information System (INIS)

    Tumino, A.; Gulino, M.; Spitaleri, C.; Cherubini, S.; Romano, S.; Cognata, M. La; Pizzone, R. G.; Rapisarda, G. G.; Lamia, L.

    2014-01-01

    The Trojan Horse method is a powerful indirect technique to determine the astrophysical factor for binary rearrangement processes A+x→b+B at astrophysical energies by measuring the cross section for the Trojan Horse (TH) reaction A+a→B+b+s in quasi-free kinematics. The Trojan Horse Method has been successfully applied to many reactions of astrophysical interest, both direct and resonant. In this paper, we will focus on direct sub-processes. The theory of the THM for direct binary reactions will be briefly presented, based on a few-body approach that takes into account off-energy-shell effects and initial- and final-state interactions. Examples of recent results will be presented to demonstrate how the THM works experimentally.

  8. The Trojan Horse method for nuclear astrophysics: Recent results for direct reactions

    Energy Technology Data Exchange (ETDEWEB)

    Tumino, A.; Gulino, M. [Laboratori Nazionali del Sud, Istituto Nazionale di Fisica Nucleare, Catania, Italy and Università degli Studi di Enna Kore, Enna (Italy); Spitaleri, C.; Cherubini, S.; Romano, S. [Laboratori Nazionali del Sud, Istituto Nazionale di Fisica Nucleare, Catania, Italy and Dipartimento di Fisica e Astronomia, Università di Catania, Catania (Italy); Cognata, M. La; Pizzone, R. G.; Rapisarda, G. G. [Laboratori Nazionali del Sud, Istituto Nazionale di Fisica Nucleare, Catania (Italy); Lamia, L. [Dipartimento di Fisica e Astronomia, Università di Catania, Catania (Italy)

    2014-05-09

    The Trojan Horse method is a powerful indirect technique to determine the astrophysical factor for binary rearrangement processes A+x→b+B at astrophysical energies by measuring the cross section for the Trojan Horse (TH) reaction A+a→B+b+s in quasi-free kinematics. The Trojan Horse Method has been successfully applied to many reactions of astrophysical interest, both direct and resonant. In this paper, we will focus on direct sub-processes. The theory of the THM for direct binary reactions will be briefly presented, based on a few-body approach that takes into account off-energy-shell effects and initial- and final-state interactions. Examples of recent results will be presented to demonstrate how the THM works experimentally.

  9. Validity studies among hierarchical methods of cluster analysis using cophenetic correlation coefficient

    International Nuclear Information System (INIS)

    Carvalho, Priscilla R.; Munita, Casimiro S.; Lapolli, André L.

    2017-01-01

    The literature presents many methods for partitioning a data base, and it is difficult to choose which is the most suitable, since the various combinations of methods based on different measures of dissimilarity can lead to different grouping patterns and false interpretations. Nevertheless, little effort has been expended in evaluating these methods empirically using an archaeological data base. Thus, the objective of this work is to make a comparative study of the different cluster analysis methods and identify which is the most appropriate. For this, the study was carried out using a data base of the Archaeometric Studies Group from IPEN-CNEN/SP, in which 45 samples of ceramic fragments from three archaeological sites were analyzed by instrumental neutron activation analysis (INAA) to determine the mass fractions of 13 elements (As, Ce, Cr, Eu, Fe, Hf, La, Na, Nd, Sc, Sm, Th, U). The methods used in this study were: single linkage, complete linkage, average linkage, centroid and Ward. The validation was done using the cophenetic correlation coefficient and, on comparing these values, the average linkage method obtained the best results. A script with some functions was created in the statistical program R to obtain the cophenetic correlation. By means of these values it was possible to choose the most appropriate method to be used for the data base. (author)

  10. Validity studies among hierarchical methods of cluster analysis using cophenetic correlation coefficient

    Energy Technology Data Exchange (ETDEWEB)

    Carvalho, Priscilla R.; Munita, Casimiro S.; Lapolli, André L., E-mail: prii.ramos@gmail.com, E-mail: camunita@ipen.br, E-mail: alapolli@ipen.br [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN-SP), São Paulo, SP (Brazil)

    2017-07-01

    The literature presents many methods for partitioning a data base, and it is difficult to choose which is the most suitable, since the various combinations of methods based on different measures of dissimilarity can lead to different grouping patterns and false interpretations. Nevertheless, little effort has been expended in evaluating these methods empirically using an archaeological data base. Thus, the objective of this work is to make a comparative study of the different cluster analysis methods and identify which is the most appropriate. For this, the study was carried out using a data base of the Archaeometric Studies Group from IPEN-CNEN/SP, in which 45 samples of ceramic fragments from three archaeological sites were analyzed by instrumental neutron activation analysis (INAA) to determine the mass fractions of 13 elements (As, Ce, Cr, Eu, Fe, Hf, La, Na, Nd, Sc, Sm, Th, U). The methods used in this study were: single linkage, complete linkage, average linkage, centroid and Ward. The validation was done using the cophenetic correlation coefficient and, on comparing these values, the average linkage method obtained the best results. A script with some functions was created in the statistical program R to obtain the cophenetic correlation. By means of these values it was possible to choose the most appropriate method to be used for the data base. (author)
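
    To illustrate the validation step described in the two records above, the sketch below compares linkage methods by their cophenetic correlation coefficient using SciPy. The 45 x 13 concentration matrix is random stand-in data, not the IPEN-CNEN/SP measurements, and the original study used R rather than Python.

      # Sketch of comparing linkage methods by cophenetic correlation coefficient.
      # The 45 x 13 concentration matrix here is random; the real study used INAA data.
      import numpy as np
      from scipy.cluster.hierarchy import linkage, cophenet
      from scipy.spatial.distance import pdist

      rng = np.random.default_rng(0)
      X = rng.normal(size=(45, 13))           # 45 ceramic fragments, 13 elements

      d = pdist(X)                            # pairwise Euclidean distances
      for method in ["single", "complete", "average", "centroid", "ward"]:
          Z = linkage(X, method=method)
          c, _ = cophenet(Z, d)               # cophenetic correlation vs. original d
          print(f"{method:>8}: cophenetic correlation = {c:.3f}")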

  11. The reflexive case study method

    DEFF Research Database (Denmark)

    Rittenhofer, Iris

    2015-01-01

    This paper extends the international business research on small to medium-sized enterprises (SME) at the nexus of globalization. Based on a conceptual synthesis across disciplines and theoretical perspectives, it offers management research a reflexive method for case study research of postnational...

  12. A comparison of short-term dispersion estimates resulting from various atmospheric stability classification methods

    International Nuclear Information System (INIS)

    Mitchell, A.E. Jr.

    1982-01-01

    Four methods of classifying atmospheric stability class are applied at four sites to make short-term (1-h) dispersion estimates from a ground-level source based on a model consistent with U.S. Nuclear Regulatory Commission practice. The classification methods include vertical temperature gradient, standard deviation of horizontal wind direction fluctuations (sigma theta), Pasquill-Turner, and modified sigma theta which accounts for meander. Results indicate that modified sigma theta yields reasonable dispersion estimates compared to those produced using methods of vertical temperature gradient and Pasquill-Turner, and can be considered as a potential economic alternative in establishing onsite monitoring programs. (author)
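
    As an illustration of the sigma-theta approach mentioned above, the sketch below maps the standard deviation of horizontal wind direction fluctuations to a Pasquill stability class using commonly cited EPA threshold values. These thresholds, and the meander adjustment of the modified method, are assumptions for illustration and are not taken from the paper.

      # Illustrative sigma-theta stability classification using commonly cited
      # EPA threshold values (degrees); these limits are not taken from the paper.
      def stability_from_sigma_theta(sigma_theta_deg: float) -> str:
          thresholds = [(22.5, "A"), (17.5, "B"), (12.5, "C"),
                        (7.5, "D"), (3.8, "E")]
          for limit, pasquill_class in thresholds:
              if sigma_theta_deg >= limit:
                  return pasquill_class
          return "F"

      for sigma in (25.0, 15.0, 5.0, 2.0):
          print(sigma, "->", stability_from_sigma_theta(sigma))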

  13. A study of environmental polluting factors by neutron activation method

    International Nuclear Information System (INIS)

    Paunoiu, C.; Doca, C.

    2004-01-01

    The paper presents: a) some important factors of environmental pollution; b) the theoretical aspects of Neutron Activation Analysis (NAA) used in the study of environmental pollution; c) the NAA-specific hardware and software facilities existing at the Institute for Nuclear Research; d) a direct application of the NAA method in the study of environmental pollution for the city of Pitesti by the analysis of some ground and vegetation samples; e) results and conclusions. (authors)

  14. Methodics of computing the results of monitoring the exploratory gallery

    Directory of Open Access Journals (Sweden)

    Krúpa Víťazoslav

    2000-09-01

    Full Text Available At the building site of the Višňové-Dubná skala motorway tunnel, priority is given to driving an exploratory gallery that provides detailed geological, engineering-geological, hydrogeological and geotechnical research. This research is based on gathering information for the proposed use of a full-profile driving machine that would drive the motorway tunnel. In the part of the exploratory gallery driven by the TBM method, comprehensive information about the parameters of the driving process is gathered by a computer monitoring system mounted on the driving machine. The monitoring system is based on the industrial computer PC 104. It records 4 basic values of the driving process: the electromotor performance of the driving machine Voest-Alpine ATB 35HA, the speed of the driving advance, the rotation speed of the disintegrating head of the TBM and the total head pressure. The pressure force is evaluated from the pressure in the hydraulic cylinders of the machine. From these values, the strength of the rock mass, the angle of internal friction, etc. are mathematically calculated. These values characterize the rock mass properties and their changes. To define the effectiveness of the driving process, the value of specific energy and the working ability of the driving head are used. The article defines the method of processing the gathered monitoring information, prepared for the driving machine Voest-Alpine ATB 35H at the Institute of Geotechnics SAS. It describes the input forms (protocols) of the developed method, created in an EXCEL program, and shows selected samples of the graphical processing of the first monitoring results obtained from the exploratory gallery driving process in the Višňové-Dubná skala motorway tunnel.
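
    The abstract refers to specific energy as a measure of driving effectiveness but does not give its formula. A commonly used choice is the Teale-type expression combining a thrust term and a rotary term, sketched below with invented machine readings; both the formula choice and the numbers are assumptions for illustration.

      # Hedged sketch: Teale-type specific energy of excavation from TBM monitoring
      # values. The formula choice and all numbers are illustrative assumptions.
      import math

      def specific_energy(thrust_N, torque_Nm, rev_per_s, advance_m_s, area_m2):
          """Specific energy (J/m3) = thrust term + rotary term."""
          thrust_term = thrust_N / area_m2
          rotary_term = (2.0 * math.pi * rev_per_s * torque_Nm) / (area_m2 * advance_m_s)
          return thrust_term + rotary_term

      # Hypothetical readings for a 3.5 m diameter cutting head.
      area = math.pi * (3.5 / 2) ** 2
      se = specific_energy(thrust_N=4.0e6, torque_Nm=6.0e5,
                           rev_per_s=0.15, advance_m_s=0.0008, area_m2=area)
      print(f"Specific energy: {se / 1e6:.1f} MJ/m3")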

  15. Repetitive transcranial magnetic stimulation as an adjuvant method in the treatment of depression: Preliminary results

    Directory of Open Access Journals (Sweden)

    Jovičić Milica

    2014-01-01

    Full Text Available Introduction. Repetitive transcranial magnetic stimulation (rTMS) is a method of brain stimulation which is increasingly used in both clinical practice and research. Studies to date have pointed to a potential antidepressive effect of rTMS, but definitive superiority over placebo has not yet been confirmed. Objective. The aim of the study was to examine the effect of rTMS as an adjuvant treatment to antidepressants during 18 weeks of evaluation starting from the initial application of the protocol. Methods. Four patients with a diagnosis of moderate/severe major depression were included in the study. The protocol involved 2000 stimuli per day (rTMS frequency of 10 Hz, intensity of 120% of the motor threshold) administered over the left dorsolateral prefrontal cortex (DLPFC) for 15 days. Subjective and objective depressive symptoms were measured before the initiation of rTMS and repeatedly evaluated at weeks 3, 6, 12 and 18 from the beginning of the stimulation. Results. After completion of the rTMS protocol two patients demonstrated a reduction of depressive symptoms that was sustained throughout the 15-week follow-up period. One patient showed a tendency toward remission during the first 12 weeks of the study, but relapsed in week 18. One patient showed no significant symptom reduction at any point of follow-up. Conclusion. Preliminary findings suggest that rTMS has good tolerability and can be efficient in accelerating the effect of antidepressants, particularly in individuals with shorter duration of depressive episodes and moderate symptom severity. [Project of the Ministry of Science of the Republic of Serbia, No. III41029 and No. ON175090]

  16. The review and results of different methods for facial recognition

    Science.gov (United States)

    Le, Yifan

    2017-09-01

    In recent years, facial recognition has drawn much attention due to its wide potential applications. As a unique technology in biometric identification, facial recognition represents a significant improvement since it can be operated without the cooperation of the person under detection. Hence, facial recognition can be applied in defense systems, medical detection, human behavior understanding, etc. Several theories and methods have been established to make progress in facial recognition: (1) a novel two-stage facial landmark localization method is proposed which achieves more accurate facial localization on a specific database; (2) a statistical face frontalization method is proposed which outperforms state-of-the-art methods for face landmark localization; (3) a general facial landmark detection algorithm is proposed to handle images with severe occlusion and images with large head poses; (4) three methods are proposed for face alignment, including a shape-augmented regression method, a pose-indexed multi-view method and a learning-based method that regresses local binary features. The aim of this paper is to analyze previous work on different aspects of facial recognition, focusing on concrete methods and performance on various databases. In addition, some improvement measures and suggestions for potential applications will be put forward.

  17. Quantitative methods for studying design protocols

    CERN Document Server

    Kan, Jeff WT

    2017-01-01

    This book is aimed at researchers and students who would like to engage in and deepen their understanding of design cognition research. The book presents new approaches for analyzing design thinking and proposes methods of measuring design processes. These methods seek to quantify design issues and design processes that are defined based on notions from the Function-Behavior-Structure (FBS) design ontology and from linkography. A linkograph is a network of linked design moves or segments. FBS ontology concepts have been used in both design theory and design thinking research and have yielded numerous results. Linkography is one of the most influential and elegant design cognition research methods. In this book Kan and Gero provide novel and state-of-the-art methods of analyzing design protocols that offer insights into design cognition by integrating segmentation with linkography by assigning FBS-based codes to design moves or segments and treating links as FBS transformation processes. They propose and test ...

  18. Discriminant method for the optimization of radionuclide activity in studies of nuclear medicine

    International Nuclear Information System (INIS)

    Perez Diaz, Marlen

    2003-01-01

    A method is presented for the optimization of the radionuclide activity to be administered to adult patients in Nuclear Medicine studies. The method is based on discriminant analysis techniques to build a function that discriminates between groups of differing image quality on the basis of physical parameters such as image contrast and random noise. Image quality is the dependent variable and is assigned by means of expert evaluation and clustering techniques. The function is a linear combination of a reduced set of physical-medical variables, able to discriminate between the groups starting from a large set of measured variables. The method also allows the relative weight of each selected discriminant variable to be established. The behavior of these variables is analyzed across studies carried out with different administered activities, with the objective of determining the minimum value that still allows good image quality (activity optimization criterion). The method is validated by comparing its results with the well-known ROC curves in studies carried out with the Jaszczak phantom (for planar studies) and the cardiac insert phantom (for SPECT studies). The optimal activity value of 99mTc obtained with the application of the method coincided with the one obtained after applying the ROC method with 6 expert observers, both in planar studies and in SPECT, for two different gamma cameras. The method was subsequently applied to static, dynamic and SPECT studies carried out with gamma cameras on an adult population of 210 patients. The variables that determine image quality were obtained for resting nuclear ventriculography, bone scintigraphy, the nuclear renogram, renal scintigraphy and cerebral SPECT, as well as some activity values optimized for the equipment and radiopharmaceuticals available in the country, allowing a better compromise to be established between image quality
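
    A minimal sketch of the discriminant-analysis step is given below using scikit-learn's linear discriminant analysis on synthetic image-quality features (contrast and noise). The actual variables, data, and software used in the study are not reproduced here.

      # Minimal sketch of building a linear discriminant function that separates
      # image-quality groups from physical features. All data are synthetic.
      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      rng = np.random.default_rng(1)
      # Features: [image contrast, random noise]; two quality groups (0 = poor, 1 = good).
      poor = rng.normal(loc=[0.30, 0.20], scale=0.05, size=(40, 2))
      good = rng.normal(loc=[0.55, 0.10], scale=0.05, size=(40, 2))
      X = np.vstack([poor, good])
      y = np.array([0] * 40 + [1] * 40)

      lda = LinearDiscriminantAnalysis()
      lda.fit(X, y)
      print("Discriminant coefficients:", lda.coef_)   # relative weight of each variable
      print("Predicted group for contrast=0.45, noise=0.15:",
            lda.predict([[0.45, 0.15]])[0])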

  19. Comparative analyses reveal discrepancies among results of commonly used methods for Anopheles gambiae molecular form identification

    Directory of Open Access Journals (Sweden)

    Pinto João

    2011-08-01

    Full Text Available Abstract Background The Anopheles gambiae M and S molecular forms, the major malaria vectors in the Afro-tropical region, are undergoing a process of ecological diversification and adaptive lineage splitting which is affecting malaria transmission and vector control strategies in West Africa. These two incipient species are defined on the basis of single nucleotide differences in the IGS and ITS regions of multicopy rDNA located on the X-chromosome. A number of PCR and PCR-RFLP approaches based on form-specific SNPs in the IGS region are used for M and S identification. Moreover, a PCR method to detect the M-specific insertion of a short interspersed transposable element (SINE200) has recently been introduced as an alternative identification approach. However, a large-scale comparative analysis of four widely used PCR or PCR-RFLP genotyping methods for M and S identification had never been carried out to evaluate whether they could be used interchangeably, as commonly assumed. Results The genotyping of more than 400 A. gambiae specimens from nine African countries, and the sequencing of the IGS amplicon of 115 of them, highlighted discrepancies among results obtained by the different approaches due to different kinds of biases, which may result in an overestimation of M/S putative hybrids, as follows: (i) incorrect matching of the M- and S-specific primers used in the allele-specific PCR approach; (ii) presence of polymorphisms in the recognition sequence of the restriction enzymes used in the PCR-RFLP approaches; (iii) incomplete cleavage during the restriction reactions; (iv) presence of different copy numbers of M- and S-specific IGS arrays in single individuals in areas of secondary contact between the two forms. Conclusions The results reveal that the PCR and PCR-RFLP approaches most commonly utilized to identify A. gambiae M and S forms are not fully interchangeable, as usually assumed, and highlight limits of the actual definition of the two molecular forms, which might

  20. Study of Application for Excursion Observation Method in Primary School 2nd Grade Social Studies

    Directory of Open Access Journals (Sweden)

    Ahmet Ali GAZEL

    2014-04-01

    Full Text Available This study aims to investigate how field trips are conducted in the 2nd grade of primary school as part of the social studies course. Data for this research were compiled from 143 permanent social studies teachers working in the primary schools of central Kütahya and its districts during the 2011-2012 school year. Data were compiled using a descriptive survey model. In the research, after taking expert opinions, a measuring tool developed by the researcher was used. Data obtained from the research were transferred to a computer and analyzed. In the analysis of the data, frequency and percentage values were used to determine the distribution. One-way analysis of variance and t-tests for independent samples were also used to determine the significance of differences between the variables. As a result of the research, it was found that insufficient importance is given to the field trip method in social studies lessons. Most of the teachers using this method apply it in the spring months. Teachers usually make use of field trips, independent of the unit/topic, to increase the students' motivation, and they generally use verbal exposition in class after the trips. The biggest difficulty teachers encounter while using the trip-observation method is undisciplined student behavior.

  1. Current Mathematical Methods Used in QSAR/QSPR Studies

    Directory of Open Access Journals (Sweden)

    Peixun Liu

    2009-04-01

    Full Text Available This paper gives an overview of the mathematical methods currently used in quantitative structure-activity/property relationship (QSAR/QSPR) studies. Recently, the mathematical methods applied to the regression of QSAR/QSPR models have been developing very fast, and new methods, such as Gene Expression Programming (GEP), Projection Pursuit Regression (PPR) and Local Lazy Regression (LLR), have appeared on the QSAR/QSPR stage. At the same time, earlier methods, including Multiple Linear Regression (MLR), Partial Least Squares (PLS), Neural Networks (NN), Support Vector Machines (SVM) and so on, are being upgraded to improve their performance in QSAR/QSPR studies. These new and upgraded methods and algorithms are described in detail, and their advantages and disadvantages are evaluated and discussed, to show their application potential in QSAR/QSPR studies in the future.
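
    As an illustration of one of the listed regression techniques (PLS), the sketch below fits a partial least squares model to synthetic descriptor data with scikit-learn. The descriptors, activity values, and model settings are assumptions for illustration only.

      # Sketch of a PLS-based QSAR/QSPR regression on synthetic descriptor data.
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(42)
      X = rng.normal(size=(60, 8))                      # 60 compounds, 8 descriptors
      true_w = np.array([1.5, -2.0, 0.0, 0.7, 0.0, 0.0, 0.3, 0.0])
      y = X @ true_w + rng.normal(scale=0.3, size=60)   # synthetic activity values

      pls = PLSRegression(n_components=3)
      q2 = cross_val_score(pls, X, y, cv=5, scoring="r2").mean()
      r2 = pls.fit(X, y).score(X, y)
      print(f"Training R^2 = {r2:.3f}, cross-validated R^2 (Q^2-like) = {q2:.3f}")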

  2. Methods and optical fibers that decrease pulse degradation resulting from random chromatic dispersion

    Science.gov (United States)

    Chertkov, Michael; Gabitov, Ildar

    2004-03-02

    The present invention provides methods and optical fibers for periodically pinning the actual (random) accumulated chromatic dispersion of an optical fiber to a predicted accumulated dispersion of the fiber through relatively simple modifications of fiber-optic manufacturing methods or retrofitting of existing fibers. If the pinning occurs with sufficient frequency (at a distance less than or equal to a correlation scale), pulse degradation resulting from random chromatic dispersion is minimized. Alternatively, pinning may occur quasi-periodically, i.e., the pinning distance is distributed between approximately zero and approximately two to three times the correlation scale.

  3. SS-HORSE method for studying resonances

    Energy Technology Data Exchange (ETDEWEB)

    Blokhintsev, L. D. [Moscow State University, Skobeltsyn Institute of Nuclear Physics (Russian Federation); Mazur, A. I.; Mazur, I. A., E-mail: 008043@pnu.edu.ru [Pacific National University (Russian Federation); Savin, D. A.; Shirokov, A. M. [Moscow State University, Skobeltsyn Institute of Nuclear Physics (Russian Federation)

    2017-03-15

    A new method for analyzing resonance states based on the Harmonic-Oscillator Representation of Scattering Equations (HORSE) formalism and analytic properties of partial-wave scattering amplitudes is proposed. The method is tested by applying it to the model problem of neutral-particle scattering and can be used to study resonance states on the basis of microscopic calculations performed within various versions of the shell model.

  4. Interdisciplinary Study of Numerical Methods and Power Plants Engineering

    Directory of Open Access Journals (Sweden)

    Ioana OPRIS

    2014-08-01

    Full Text Available The development of technology, electronics and computing has opened the way for cross-disciplinary research that brings benefits by combining the achievements of different fields. To prepare students for their future interdisciplinary work, an interdisciplinary approach to teaching is adopted. This ensures their progress in knowledge, understanding and the ability to navigate through different fields. Aiming at these results, universities introduce new interdisciplinary courses which explore complex problems by studying subjects from different domains. The paper presents a problem encountered in designing power plants. The method of solving the problem is used to explain numerical methods and to exercise programming. The goal of understanding a numerical algorithm that solves a linear system of equations is achieved by using knowledge of heat transfer to design the regenerative circuit of a thermal power plant. In this way, the outcomes from prior courses (mathematics and physics) are used to explain a new subject (numerical methods) and to prepare for future ones (power plants).
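
    The classroom exercise described above, solving a linear system of equations numerically, can be sketched as follows. The coefficient matrix is a placeholder, not the actual regenerative feed-water heat balance of the plant.

      # Sketch of the classroom exercise: solve A x = b with Gaussian elimination
      # (partial pivoting). The system here is a placeholder, not the actual
      # regenerative feed-water heat balance of the plant.
      import numpy as np

      def gauss_solve(A, b):
          A = A.astype(float).copy()
          b = b.astype(float).copy()
          n = len(b)
          for k in range(n - 1):
              p = k + np.argmax(np.abs(A[k:, k]))       # partial pivoting
              A[[k, p]], b[[k, p]] = A[[p, k]], b[[p, k]]
              for i in range(k + 1, n):
                  m = A[i, k] / A[k, k]
                  A[i, k:] -= m * A[k, k:]
                  b[i] -= m * b[k]
          x = np.zeros(n)
          for i in range(n - 1, -1, -1):                # back substitution
              x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
          return x

      A = np.array([[4.0, -1.0, 0.0], [-1.0, 4.0, -1.0], [0.0, -1.0, 3.0]])
      b = np.array([15.0, 10.0, 10.0])
      print(gauss_solve(A, b), np.linalg.solve(A, b))   # both give the same result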

  5. Mercury Exposure in Ireland: Results of the DEMOCOPHES Human Biomonitoring Study

    Directory of Open Access Journals (Sweden)

    Elizabeth Cullen

    2014-09-01

    Full Text Available Background: Monitoring of human exposure to mercury is important due to its adverse health effects. This study aimed to determine the extent of mercury exposure among mothers and their children in Ireland, and to identify factors associated with elevated levels. It formed part of the Demonstration of a study to Coordinate and Perform Human Biomonitoring on a European Scale (DEMOCOPHES) pilot biomonitoring study. Methods: Hair mercury concentrations were determined from a convenience sample of 120 mother/child pairs. Mothers also completed a questionnaire. Rigorous quality assurance within DEMOCOPHES guaranteed the accuracy and international comparability of results. Results: Mercury was detected in 79.2% of the samples from mothers, and 62.5% of children's samples. Arithmetic mean levels in mothers (0.262 µg/g hair) and children (0.149 µg/g hair) did not exceed the US EPA guidance value. Levels were significantly higher for those with higher education, and those who consumed more fish. Conclusions: The study demonstrates the benefit of human biomonitoring for assessing and comparing internal exposure levels, both on a population and an individual basis. It enables the potential harmful impact of mercury to be minimised in those highly exposed, and can therefore significantly contribute to population health.

  6. Methods study of homogeneity and stability test from cerium oxide CRM candidate

    International Nuclear Information System (INIS)

    Samin; Susanna TS

    2016-01-01

    A study of homogeneity and stability test methods for a cerium oxide CRM candidate has been carried out based on ISO 13528 and KAN DP.01.34. The purpose of this study was to select rigorous homogeneity and stability test methods for preparing the cerium oxide CRM. Ten sub-samples of cerium oxide were prepared, with randomly selected analytes representing two compounds, namely CeO_2 and La_2O_3. In each of the 10 sub-samples, the CeO_2 and La_2O_3 contents were analyzed in duplicate with the same analytical method, by the same analyst, and in the same laboratory. The results were evaluated statistically based on ISO 13528 and KAN DP.01.34. According to ISO 13528, the cerium oxide sample is said to be homogeneous if Ss ≤ 0.3 σ and stable if |Xr - Yr| ≤ 0.3 σ. In this study, the homogeneity test for CeO_2 gave Ss = 2.073 x 10^-4, smaller than 0.3 σ (0.5476), and the stability test gave |Xr - Yr| = 0.225, which is < 0.3 σ. For La_2O_3, the homogeneity test gave Ss = 1.649 x 10^-4, smaller than 0.3 σ (0.4865), and the stability test gave |Xr - Yr| = 0.2185, which is < 0.3 σ. Compared with the KAN method, the cerium oxide sample was also shown to be homogeneous, since Fcalc < Ftable, and stable, because |Xi - Xhm| < 0.3 x n IQR. Given that the homogeneity and stability test data for the CeO_2 CRM candidate processed with the statistical methods of ISO 13528 are not significantly different from those processed with the statistical methods of KAN DP.01.34, and both meet the requirements of homogeneity and stability, the homogeneity and stability test methods based on ISO 13528 can be used to prepare the cerium oxide CRM. (author)
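
    The homogeneity criterion quoted above (Ss ≤ 0.3 σ) can be sketched as follows. The duplicate values and σ are invented, and the computation of the between-sample standard deviation from duplicate ranges follows the usual ISO 13528 recipe, which is assumed rather than taken from the paper.

      # Sketch of an ISO 13528-style homogeneity check on duplicate results.
      # The duplicate values and sigma are invented for illustration.
      import numpy as np

      # 10 sub-samples analyzed in duplicate (e.g., CeO2 mass fraction, %).
      dup = np.array([[99.12, 99.15], [99.10, 99.11], [99.14, 99.13],
                      [99.11, 99.12], [99.13, 99.15], [99.12, 99.10],
                      [99.16, 99.14], [99.11, 99.13], [99.12, 99.12],
                      [99.15, 99.13]])
      sigma = 0.10                      # standard deviation for assessment (assumed)

      means = dup.mean(axis=1)
      ranges = np.abs(dup[:, 0] - dup[:, 1])
      s_x2 = means.var(ddof=1)                      # variance of sub-sample averages
      s_w2 = np.sum(ranges ** 2) / (2 * len(dup))   # within-sample variance
      s_s = np.sqrt(max(s_x2 - s_w2 / 2.0, 0.0))    # between-sample standard deviation

      print(f"Ss = {s_s:.5f}, criterion 0.3*sigma = {0.3 * sigma:.5f}")
      print("Homogeneous" if s_s <= 0.3 * sigma else "Not homogeneous")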

  7. Decision making with consonant belief functions: Discrepancy resulting with the probability transformation method used

    Directory of Open Access Journals (Sweden)

    Cinicioglu Esma Nur

    2014-01-01

    Full Text Available Dempster-Shafer belief function theory can address a wider class of uncertainty than standard probability theory does, and this fact appeals to researchers in the operations research community looking for potential application areas. However, the lack of a decision theory for belief functions gives rise to the need to use probability transformation methods for decision making. For the representation of statistical evidence, the class of consonant belief functions is used, which is not closed under Dempster's rule of combination but is closed under Walley's rule of combination. In this research, it is shown that the outcomes obtained using Dempster's and Walley's rules result in different probability distributions when the pignistic transformation is used. However, when the plausibility transformation is used, they result in the same probability distribution. This shows that the choice of the combination rule and probability transformation method may have a significant effect on decision making, since it may change which decision alternative is selected. This result is illustrated via an example of missile type identification.
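
    To make the two transformations concrete, the sketch below applies the pignistic and plausibility transformations to a small consonant mass function. The mass values are arbitrary examples, not the missile-identification data from the paper.

      # Sketch of pignistic vs. plausibility probability transformations applied to
      # a small (consonant) mass function. The mass values are arbitrary examples.
      from itertools import chain

      # Focal elements as frozensets with their masses (nested sets => consonant).
      m = {frozenset({"a"}): 0.5,
           frozenset({"a", "b"}): 0.3,
           frozenset({"a", "b", "c"}): 0.2}
      frame = set(chain.from_iterable(m))

      # Pignistic transformation: BetP(x) = sum over focal sets A containing x of m(A)/|A|.
      betp = {x: sum(mass / len(A) for A, mass in m.items() if x in A) for x in frame}

      # Plausibility transformation: normalize Pl(x) = sum of m(A) over A intersecting {x}.
      pl = {x: sum(mass for A, mass in m.items() if x in A) for x in frame}
      total = sum(pl.values())
      pl_p = {x: v / total for x, v in pl.items()}

      print("BetP:", {x: round(p, 3) for x, p in sorted(betp.items())})
      print("Pl_P:", {x: round(p, 3) for x, p in sorted(pl_p.items())})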

  8. Doppler method leak detection for LMFBR steam generators. Pt. 1. Experimental results of bubble detection using small models

    International Nuclear Information System (INIS)

    Kumagai, Hiromichi

    1999-01-01

    To prevent the expansion of tube damage and to maintain structural integrity in the steam generators (SGs) of fast breeder reactors (FBRs), it is necessary to detect precisely and immediately the leakage of water from heat transfer tubes. For this purpose, an active acoustic method was developed. Previous studies have revealed that in practical steam generators the active acoustic method can detect leaks of 10 l/s within 10 seconds. To prevent the expansion of damage to neighboring tubes, it is necessary to detect smaller leakages of water from the heat transfer tubes. The Doppler method is designed to detect small leakages and to find the source of the leak before damage spreads to neighboring tubes. To evaluate the relationship between the detection sensitivity of the Doppler method and the bubble volume and bubble size, the structural shapes and bubble flow conditions were investigated experimentally using a small structural model. The results show that the Doppler method can detect bubbles under bubble flow conditions, and it is sensitive enough to detect small leakages within a short time. The Doppler method thus has strong potential for the detection of water leakage in SGs. (author)

  9. FY 1974 report on the results of the Sunshine Project. Study of the hydrogen production technology (Study of the hydrogen production technology by thermochemical method); 1974 nendo suiso no seizo gijutsu no kenkyu seika hokokusho. Netsukagakuho ni yoru suiso seizo gijutsu no kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1975-05-30

    For the purpose of developing a new hydrogen production technology, a feasibility study was made of the copper-halogen system and alkali carbonate-iodine system processes, presented as new thermochemical processes, from the viewpoints of the progress of the reactions, side reactions, reaction yield, thermal efficiency, etc. The study proceeded smoothly, achieved its target at an early stage, and has been completed. In the study of the optimal conditions of the processes, the progress of each unit reaction was experimentally confirmed. By measuring the reaction yields, the optimal reaction conditions for expediting the reactions were found. As a result, it was found that both of the proposed processes proceed rather easily, and it was concluded that they are worthy of further engineering study. In the study of improvement and optimization of the processes, since several modified versions of the proposed processes can be considered, an investigation was made of a method to calculate thermal efficiency, which is one of the standards for process evaluation, and the thermal efficiencies of various processes were calculated on a trial basis using this method. As a result, it was concluded that these processes are comparable to other processes. (NEDO)

  10. Finite Element Method Application in Areal Rainfall Estimation Case Study; Mashhad Plain Basin

    Directory of Open Access Journals (Sweden)

    M. Irani

    2016-10-01

    7.08 software environment. The finite element method is a numerical procedure for obtaining solutions to many of the problems encountered in engineering analysis. First, it utilizes discrete elements to obtain the joint displacements and member forces of a structural framework, here used to estimate areal precipitation. Second, it uses continuum elements to obtain approximate solutions to heat transfer, fluid mechanics, and solid mechanics problems. Galerkin's method is used to develop the finite element equations for the field problems. It uses the same functions Ni(x) that were used in the approximating equations. This approach is the basis of the finite element method for problems involving first-derivative terms. It yields the same result as the variational method when applied to differential equations that are self-adjoint. Galerkin's method is relatively simple and eliminates bias by representing the relief with a suitable mathematical model and incorporating this into the integration. In this paper, two powerful techniques were introduced and applied in Galerkin's method: the use of interpolation functions to transform the shape of the element to a perfect square, and the use of Gaussian quadrature to calculate rainfall depth numerically. In this study, the Mashhad plain is divided into 40 quadrilateral elements. In each element, the rain gauge stations are situated at the nodes. The coordinates are given according to UTM, where x and y are the horizontal coordinates and z the vertical (altitude) coordinate. It was necessary at the outset to number the corner nodes in a set manner, and for the purposes of this paper an anticlockwise convention was adopted. Results and Discussion: This paper presents the estimation of mean precipitation (daily, monthly and annual) in the Mashhad plain by Galerkin's method, which was compared with the arithmetic mean, Thiessen, kriging and IDW methods. The values of Galerkin's method were obtained with Matlab 7.08 software and those of Thiessen, kriging and IDW by
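
    The two techniques highlighted above, the isoparametric mapping of a quadrilateral element and Gaussian quadrature, can be illustrated with the short sketch below. The node coordinates and rainfall depths are hypothetical, and the study itself used Matlab rather than Python.

      # Sketch of integrating rainfall depth over one bilinear quadrilateral element
      # with 2x2 Gaussian quadrature. Node coordinates and depths are hypothetical.
      import numpy as np

      # Element nodes (x, y in km), numbered anticlockwise, with rainfall depth z (mm).
      xy = np.array([[0.0, 0.0], [10.0, 0.0], [11.0, 9.0], [-1.0, 10.0]])
      z = np.array([12.0, 15.0, 18.0, 14.0])

      gp = 1.0 / np.sqrt(3.0)                       # 2x2 Gauss points, weights = 1
      volume = 0.0                                  # integral of depth over area (mm*km^2)
      area = 0.0
      for xi in (-gp, gp):
          for eta in (-gp, gp):
              # Bilinear shape functions and their derivatives in (xi, eta).
              N = 0.25 * np.array([(1 - xi) * (1 - eta), (1 + xi) * (1 - eta),
                                   (1 + xi) * (1 + eta), (1 - xi) * (1 + eta)])
              dN = 0.25 * np.array([[-(1 - eta), (1 - eta), (1 + eta), -(1 + eta)],
                                    [-(1 - xi), -(1 + xi), (1 + xi), (1 - xi)]])
              J = dN @ xy                           # 2x2 Jacobian of the mapping
              detJ = np.linalg.det(J)
              volume += N @ z * detJ                # quadrature weight = 1 per point
              area += detJ

      print(f"Areal mean rainfall over the element: {volume / area:.2f} mm")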

  11. Study of the orbital correction method

    International Nuclear Information System (INIS)

    Meserve, R.A.

    1976-01-01

    Two approximations of interest in atomic, molecular, and solid state physics are explored. First, a procedure for calculating an approximate Green's function for use in perturbation theory is derived. In lowest order it is shown to be equivalent to treating the contribution of the bound states of the unperturbed Hamiltonian exactly and representing the continuum contribution by plane waves orthogonalized to the bound states (OPW's). If the OPW approximation were inadequate, the procedure allows for systematic improvement of the approximation. For comparison purposes an exact but more limited procedure for performing second-order perturbation theory, one that involves solving an inhomogeneous differential equation, is also derived. Second, the Kohn-Sham many-electron formalism is discussed and formulae are derived and discussed for implementing perturbation theory within the formalism so as to find corrections to the total energy of a system through second order in the perturbation. Both approximations were used in the calculation of the polarizability of helium, neon, and argon. The calculation included direct and exchange effects by the Kohn-Sham method and full self-consistency was demanded. The results using the differential equation method yielded excellent agreement with the coupled Hartree-Fock results of others and with experiment. Moreover, the OPW approximation yielded satisfactory comparison with the results of calculation by the exact differential equation method. Finally, both approximations were used in the calculation of properties of hydrogen fluoride and methane. The appendix formulates a procedure using group theory and the internal coordinates of a molecular system to simplify the calculation of vibrational frequencies

  12. The Method for Assessing and Forecasting Value of Knowledge in SMEs – Research Results

    Directory of Open Access Journals (Sweden)

    Justyna Patalas-Maliszewska

    2010-10-01

    Decisions by SMEs regarding knowledge development are made at a strategic level (Haas-Edersheim, 2007). Related to knowledge management are approaches to "measure" knowledge, where the literature distinguishes between qualitative and quantitative methods of valuating intellectual capital. Although there is quite a range of such methods for building an intellectual capital reporting system, none of them is widely recognized. This work presents a method, building on existing approaches, for assessing the effectiveness of investing in human resources. The method focuses on SMEs, given their importance for regional development in particular. It consists of four parts: an SME reference model, an indicator matrix to assess investments in knowledge, innovation indicators, and the GMDH algorithm for decision making. The method is illustrated by a case study covering 10 companies.

  13. Diversity Indices as Measures of Functional Annotation Methods in Metagenomics Studies

    KAUST Repository

    Jankovic, Boris R.

    2016-01-26

    Applications of high-throughput techniques in metagenomics studies produce massive amounts of data. Fragments of genomic, transcriptomic and proteomic molecules are all found in metagenomics samples. Laborious and meticulous effort in sequencing and functional annotation is then required to, amongst other objectives, reconstruct a taxonomic map of the environment from which the metagenomics samples were taken. In addition to the computational challenges faced by metagenomics studies, the analysis is further complicated by the presence of contaminants in the samples, potentially resulting in skewed taxonomic analysis. Functional annotation in metagenomics can utilize all available omics data and therefore different methods, each associated with a particular type of data. For example, protein-coding DNA, non-coding RNA or ribosomal RNA data can be used in such an analysis. These methods have their advantages and disadvantages, and the question of comparison among them naturally arises. There are several criteria that can be used when performing such a comparison. Loosely speaking, methods can be evaluated in terms of computational complexity or in terms of the expected biological accuracy. We propose that the concept of diversity used in ecosystem and species diversity studies can be successfully used in evaluating certain aspects of the methods employed in metagenomics studies. We show that when applying the concept of Hill's diversity, the analysis of variations in the diversity order provides valuable clues to the robustness of methods used in the taxonomical analysis.
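
    To make the diversity measure concrete, here is a minimal Python sketch of Hill's diversity of order q applied to a hypothetical vector of taxon counts produced by an annotation method. The record gives neither its data nor its code, so the numbers below are illustrative only.

```python
# Illustrative sketch: Hill's diversity of order q for a taxon-abundance profile,
# e.g. counts of taxa assigned by one functional-annotation method (made-up data).
import numpy as np

def hill_diversity(counts, q):
    """Hill number of order q: q=0 richness, q->1 exp(Shannon), q=2 inverse Simpson."""
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()
    if np.isclose(q, 1.0):
        return np.exp(-np.sum(p * np.log(p)))     # limiting case as q -> 1
    return np.sum(p ** q) ** (1.0 / (1.0 - q))

annotation_counts = [120, 85, 40, 40, 10, 5]       # hypothetical taxon counts
for q in (0, 1, 2):
    print(f"q = {q}: D_q = {hill_diversity(annotation_counts, q):.2f}")
```

    Comparing how D_q changes with q for the outputs of different annotation methods is one way to probe the robustness question raised in the abstract.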

  14. Studies on the method of producing radiographic 170Tm source

    International Nuclear Information System (INIS)

    Maeda, Sho

    1976-08-01

    A method of producing radiographic 170Tm sources has been studied, including target preparation, neutron irradiation, handling of the irradiated target in the hot cell, and source capsules. On the basis of the results, practical 170Tm radiographic sources (29-49 Ci, with pellets 3 mm in diameter and 3 mm long) were produced on a trial basis by neutron irradiation in the JMTR. (auth.)

  15. Study on simulation methods of atrium building cooling load in hot and humid regions

    Energy Technology Data Exchange (ETDEWEB)

    Pan, Yiqun; Li, Yuming; Huang, Zhizhong [Institute of Building Performance and Technology, Sino-German College of Applied Sciences, Tongji University, 1239 Siping Road, Shanghai 200092 (China); Wu, Gang [Weldtech Technology (Shanghai) Co. Ltd. (China)

    2010-10-15

    In recent years, highly glazed atria have become popular because of their architectural aesthetics and the advantage of introducing daylight indoors. However, estimating the cooling load of such atrium buildings is difficult due to the complex thermal phenomena that occur in the atrium space. The study aims to find a simplified method of estimating cooling loads through simulations for various types of atria in hot and humid regions. Atrium buildings are divided into different types, and for every type both CFD and energy models are developed. A standard method is proposed alongside the simplified one to simulate the cooling load of atria in EnergyPlus, based on the different room air temperature patterns resulting from CFD simulation. The standard method incorporates CFD results as input into non-dimensional height room air models in EnergyPlus, and its simulation results are defined as the baseline against which the results of the simplified method are compared for every category of atrium building. To further validate the simplified method, an actual atrium office building was tested on site on a typical summer day and the measured results were compared with simulation results obtained with the simplified method. Finally, appropriate methods of simulating different types of atrium buildings are proposed. (author)

  16. Study of nasal swipe analysis methods at Los Alamos National Laboratory

    International Nuclear Information System (INIS)

    Metcalf, R.A.

    1996-01-01

    The Health Physics Analysis Laboratory (HPAL) performs around 30,000 nasal swipe analyses for transuranic nuclides each year in support of worker health and safety at the Los Alamos National Laboratory (LANL). The analysis method used employs cotton swabs swiped inside a nostril and liquid scintillation analyses of the swabs. The technical basis of this method was developed at LANL and has been in use for over 10 years. Recently, questions regarding the usefulness of a non-homogeneous mixture in liquid scintillation analyses have created a need for re-evaluation of the method. A study of the validity of the method shows it provides reliable, stable, and useful data as an indicator of personnel contamination. The study has also provided insight into the underlying process which occurs to allow the analysis. Further review of this process has shown that similar results can be obtained with different sample matrices, using less material than the current analysis method. This reduction can save HPAL the cost of materials as well as greatly reduce the waste created. Radionuclides of concern include Am-241, Pu-239, and Pu-238

  17. Strengthening of limestone by the impregnation - gamma irradiation method. Results of tests

    International Nuclear Information System (INIS)

    Ramiere, R.; Tassigny, C. de

    1975-04-01

    The method developed by the Centre d'Etudes Nucleaires de Grenoble (France) strengthens stones by impregnation with a styrene resin/liquid polystyrene mixture followed by polymerization under gamma irradiation. It is applicable to stones that can be taken into the laboratory for treatment. The increase in strength of 6 different varieties of French limestone has been quantitatively recorded. The following parameters were studied: the possibility of water migration inside the stones, the improvement of the mechanical properties of the impregnated stone, resistance to freeze-thaw conditions, and artificial ageing of the stones, which causes only minor changes in the appearance of the stone and a negligible decrease in weight.

  18. Methods and introductory results of the Greek national health and nutrition survey - HYDRIA

    Directory of Open Access Journals (Sweden)

    Georgia Martimianaki

    2018-06-01

    Background: According to a large prospective cohort study (with baseline examination in the 1990s) and smaller studies that followed, the population in Greece has gradually lost the favorable morbidity and mortality indices recorded in the 1960s. The HYDRIA survey, conducted in 2013-14, is the first nationally representative survey to collect data on the health and nutrition of the population in Greece. Methods: The survey sample consists of 4011 males (47%) and females aged 18 years and over. Data collection included interviewer-administered questionnaires on personal characteristics, lifestyle choices, dietary habits and medical history; measurements of somatometry and blood pressure; and blood drawing. Weighting factors were applied to ensure national representativeness of the results. Results: Three out of five adults in Greece reported suffering from a chronic disease, with diabetes mellitus and chronic depression being the more frequent ones among older individuals. The population is also experiencing an overweight/obesity epidemic, since seven out of 10 adults are either overweight or obese. In addition, 40% of the population shows indications of hypertension. Smoking is still common, and among women the prevalence was higher in younger age groups. Social disparities were observed in the prevalence of chronic diseases and mortality risk factors (hypertension, obesity, impaired lipid profile and high blood glucose levels). Conclusion: Excess body weight, hypertension, the smoking habit and the population's limited physical activity are the predominant challenges that public health officials have to deal with in formulating policies and designing actions for the population in Greece.

  19. Rainfall assimilation in RAMS by means of the Kuo parameterisation inversion: method and preliminary results

    Science.gov (United States)

    Orlandi, A.; Ortolani, A.; Meneguzzo, F.; Levizzani, V.; Torricella, F.; Turk, F. J.

    2004-03-01

    In order to improve high-resolution forecasts, a specific method for assimilating rainfall rates into the Regional Atmospheric Modelling System (RAMS) has been developed. It is based on the inversion of the Kuo convective parameterisation scheme. A nudging technique is applied to 'gently' increase with time the weight given to the estimated precipitation in the assimilation process. A rough but manageable technique for estimating the partitioning of convective from stratiform precipitation, without requiring any ancillary measurement, is also explained. The method is general purpose, but it is tuned for the assimilation of geostationary satellite rainfall estimates. Preliminary results are presented and discussed, both for fully simulated experiments and for experiments assimilating real satellite-based precipitation observations. For every case study, rainfall data are computed with a rapid-update satellite precipitation estimation algorithm based on IR and MW satellite observations. This research was carried out in the framework of the EURAINSAT project (an EC research project co-funded by the Energy, Environment and Sustainable Development Programme within the topic 'Development of generic Earth observation technologies', Contract number EVG1-2000-00030).
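
    The nudging idea can be summarised with a minimal sketch: a model quantity is relaxed toward an observation-derived value with a weight that is ramped up over the assimilation window. This is only a schematic of the general technique, not the RAMS/Kuo-inversion implementation; the function and parameter names are hypothetical.

```python
# Schematic nudging term (not the RAMS/Kuo implementation): the model value x is
# relaxed toward the observation-derived value x_obs with a weight that grows
# 'gently' with time over the assimilation window.
def nudging_tendency(x, x_obs, t, t_window, g_max):
    """Return only the nudging contribution G(t) * (x_obs - x) to dx/dt."""
    g = g_max * min(t / t_window, 1.0)   # weight increases linearly, then saturates
    return g * (x_obs - x)

# Example: halfway through a 3600 s window with g_max = 1/600 s^-1
print(nudging_tendency(x=2.0, x_obs=5.0, t=1800.0, t_window=3600.0, g_max=1.0 / 600.0))
```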

  20. US country studies program: Results from mitigation studies

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-31

    This paper describes the U.S. Country Studies Program, which was implemented to support the principles and objectives of the Framework Convention on Climate Change (FCCC). The program had three principal objectives: to enhance capabilities to conduct climate change assessments, prepare action plans, and implement technology projects; to help establish a process for developing and implementing national policies and measures; and to support the principles and objectives of the FCCC. As a result, 55 countries are completing studies, more than 2000 analysts engaged in the studies have been trained, and there is much broader understanding of and support for climate change concerns. The article describes the experiences of some countries, along with general observations and conclusions, which are broadly separated into those for developed countries and those for countries with economies in transition.

  1. Study of shielding analysis methods for casks of spent fuel and radioactive waste

    International Nuclear Information System (INIS)

    Saito, Ai

    2017-01-01

    Casks are used for the storage or transport of spent fuel or radioactive waste. Because high shielding performance is required, it is very important to confirm the validity of shielding analysis methods in order to evaluate cask shielding capability appropriately. For this purpose, the following studies were carried out. 1) A series of parameter surveys was performed for several codes to evaluate the differences in their results. 2) Calculations using the MCNP code are effective and theoretically more accurate; however, setting reasonable variance reduction parameters is indispensable. Therefore, the effectiveness of the ADVANTG code, which automatically produces reasonable variance reduction parameters, was evaluated by comparison with the conventional method. As a result, the validity of the shielding analysis methods for casks was confirmed. The results will be taken into consideration in our future shielding analyses. (author)

  2. The method of producing climate change datasets impacts the resulting policy guidance and chance of mal-adaptation

    Directory of Open Access Journals (Sweden)

    Marie Ekström

    2016-12-01

    Impact, adaptation and vulnerability (IAV) research underpins strategies for adaptation to climate change and helps to conceptualise what life may look like in decades to come. Such research draws on information from global climate models (GCMs), though typically post-processed into a secondary product with finer resolution through methods of downscaling. Through worked examples set in an Australian context we assess the influence of GCM sub-setting, geographic area sub-setting and downscaling method on the regional change signal. The examples demonstrate that these choices affect the final results differently depending on factors such as application needs, the range of uncertainty of the projected variable, the amplitude of natural variability, and the size of the study region. For heat extremes, the choice of emissions scenario is of prime importance, but for a given scenario the method of preparing data can affect the magnitude of the projection by a factor of two or more, strongly affecting the indicated adaptation decision. For catchment-level runoff projections, the choice of emissions scenario is less dominant; rather, the method of selecting and producing application-ready datasets is crucial, as demonstrated by results with opposing signs of change, raising the real possibility of mal-adaptive decisions. This work illustrates the potential pitfalls of GCM sub-sampling or the use of a single downscaled product when conducting IAV research. Using the broad range of change from all available model sources, whilst making the application more complex, avoids the larger problem of over-confidence in climate projections and lessens the chance of mal-adaptation.

  3. bNEAT: a Bayesian network method for detecting epistatic interactions in genome-wide association studies

    Directory of Open Access Journals (Sweden)

    Chen Xue-wen

    2011-07-01

    Background: Detecting epistatic interactions plays a significant role in understanding the pathogenesis of complex human diseases and in improving their prevention, diagnosis and treatment. A recent study on automatic detection of epistatic interactions shows that Markov Blanket-based methods are capable of finding genetic variants strongly associated with common diseases and of reducing false positives when the number of instances is large. Unfortunately, a typical dataset from genome-wide association studies consists of a very limited number of examples, where current methods, including Markov Blanket-based methods, may perform poorly. Results: To address small-sample problems, we propose a Bayesian network-based approach (bNEAT) to detect epistatic interactions. The proposed method also employs a branch-and-bound technique for learning. We apply the proposed method to simulated datasets based on four disease models and to a real dataset. Experimental results show that our method outperforms Markov Blanket-based methods and other commonly used methods, especially when the number of samples is small. Conclusions: Our results show that bNEAT retains strong power regardless of the number of samples and is especially suitable for detecting epistatic interactions with slight or no marginal effects. The merits of the proposed approach lie in two aspects: a suitable score for Bayesian network structure learning that can reflect higher-order epistatic interactions, and a heuristic Bayesian network structure learning method.

  4. Microvariability in AGNs: study of different statistical methods - I. Observational analysis

    Science.gov (United States)

    Zibecchi, L.; Andruchow, I.; Cellone, S. A.; Carpintero, D. D.; Romero, G. E.; Combi, J. A.

    2017-05-01

    We present the results of a study of different statistical methods currently used in the literature to analyse the (micro)variability of active galactic nuclei (AGNs) from ground-based optical observations. In particular, we focus on the comparison between the results obtained by applying the so-called C and F statistics, which are based on the ratio of standard deviations and variances, respectively. The motivation for this is that the implementation of these methods leads to different and contradictory results, making the variability classification of the light curves of a certain source dependent on the statistics implemented. For this purpose, we re-analyse the results for an AGN sample observed over several sessions with the 2.15 m 'Jorge Sahade' telescope (CASLEO), San Juan, Argentina. For each AGN, we constructed the nightly differential light curves. We thus obtained a total of 78 light curves for 39 AGNs, and we then applied the statistical tests mentioned above in order to re-classify the variability state of these light curves and in an attempt to find the suitable statistical methodology to study photometric (micro)variations. We conclude that, although the C criterion is not, strictly speaking, a statistical test, it could still be a suitable parameter to detect variability, and that its application yields more reliable variability results, in contrast with the F test.
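
    For concreteness, the two parameters under comparison can be computed as in the sketch below. This is a generic illustration with simulated differential magnitudes, not the authors' pipeline; the 2.576 threshold quoted for the C parameter is the value commonly used in the microvariability literature and is an assumption here.

```python
# Illustrative sketch (simulated data): the C and F parameters discussed above,
# computed from a target-minus-comparison and a comparison-minus-control
# differential light curve.
import numpy as np
from scipy import stats

def c_and_f(diff_target, diff_control):
    """C = ratio of standard deviations; F = ratio of variances."""
    c = np.std(diff_target, ddof=1) / np.std(diff_control, ddof=1)
    f = np.var(diff_target, ddof=1) / np.var(diff_control, ddof=1)
    return c, f

rng = np.random.default_rng(1)
diff_agn  = rng.normal(0.0, 0.02, 50)   # AGN - comparison star (mag)
diff_star = rng.normal(0.0, 0.01, 50)   # comparison - control star (mag)

c, f = c_and_f(diff_agn, diff_star)
f_crit = stats.f.ppf(0.99, dfn=diff_agn.size - 1, dfd=diff_star.size - 1)
print(f"C = {c:.2f} (often flagged variable if C > 2.576)")
print(f"F = {f:.2f}, F_crit(99%) = {f_crit:.2f}")
```

    The contrast discussed in the abstract is visible here: C compares standard deviations against a fixed cut-off, while F compares variances against a critical value that depends on the degrees of freedom.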

  5. Mixed methods research design for pragmatic psychoanalytic studies.

    Science.gov (United States)

    Tillman, Jane G; Clemence, A Jill; Stevens, Jennifer L

    2011-10-01

    Calls for more rigorous psychoanalytic studies have increased over the past decade. The field has been divided by those who assert that psychoanalysis is properly a hermeneutic endeavor and those who see it as a science. A comparable debate is found in research methodology, where qualitative and quantitative methods have often been seen as occupying orthogonal positions. Recently, Mixed Methods Research (MMR) has emerged as a viable "third community" of research, pursuing a pragmatic approach to research endeavors through integrating qualitative and quantitative procedures in a single study design. Mixed Methods Research designs and the terminology associated with this emerging approach are explained, after which the methodology is explored as a potential integrative approach to a psychoanalytic human science. Both qualitative and quantitative research methods are reviewed, as well as how they may be used in Mixed Methods Research to study complex human phenomena.

  6. Studying Cannabis Use Behaviors With Facebook and Web Surveys: Methods and Insights

    Science.gov (United States)

    2018-01-01

    The rapid and wide-reaching expansion of internet access and digital technologies offers epidemiologists numerous opportunities to study health behaviors. One particularly promising new data collection strategy is the use of Facebook’s advertising platform in conjunction with Web-based surveys. Our research team at the Center for Technology and Behavioral Health has used this quick and cost-efficient method to recruit large samples and address unique scientific questions related to cannabis use. In conducting this research, we have gleaned several insights for using this sampling method effectively and have begun to document the characteristics of the resulting data. We believe this information could be useful to other researchers attempting to study cannabis use or, potentially, other health behaviors. The first aim of this paper is to describe case examples of procedures for using Facebook as a survey sampling method for studying cannabis use. We then present several distinctive features of the data produced using this method. Finally, we discuss the utility of this sampling method for addressing specific types of epidemiological research questions. Overall, we believe that sampling with Facebook advertisements and Web surveys is best conceptualized as a targeted, nonprobability-based method for oversampling cannabis users across the United States. PMID:29720366

  7. A comparative study of three different gene expression analysis methods.

    Science.gov (United States)

    Choe, Jae Young; Han, Hyung Soo; Lee, Seon Duk; Lee, Hanna; Lee, Dong Eun; Ahn, Jae Yun; Ryoo, Hyun Wook; Seo, Kang Suk; Kim, Jong Kun

    2017-12-04

    TNF-α regulates immune cells and acts as an endogenous pyrogen. Reverse transcription polymerase chain reaction (RT-PCR) is one of the most commonly used methods for gene expression analysis. Among the alternatives to PCR, loop-mediated isothermal amplification (LAMP) shows good potential in terms of specificity and sensitivity. However, few studies have compared RT-PCR and LAMP for human gene expression analysis. Therefore, in the present study, we compared one-step RT-PCR, two-step RT-LAMP and one-step RT-LAMP for human gene expression analysis, using the human TNF-α gene from peripheral blood cells as a biomarker. Total RNA from the three selected febrile patients was subjected to the three different methods of gene expression analysis. In this comparison, the detection limit of one-step RT-PCR and one-step RT-LAMP was the same, while that of two-step RT-LAMP was inferior. One-step RT-LAMP takes less time, and the experimental result is easy to determine. One-step RT-LAMP is a potentially useful and complementary tool that is fast and reasonably sensitive. In addition, one-step RT-LAMP could be useful in environments lacking specialized equipment or expertise.

  8. Reflections on the added value of using mixed methods in the SCAPE study.

    Science.gov (United States)

    Murphy, Kathy; Casey, Dympna; Devane, Declan; Meskell, Pauline; Higgins, Agnes; Elliot, Naomi; Lalor, Joan; Begley, Cecily

    2014-03-01

    To reflect on the added value that a mixed methods design gave in a large national evaluation study of specialist and advanced practice (SCAPE), and to propose a reporting guide that could help make explicit the added value of mixed methods in other studies. Recently, researchers have focused on how to carry out mixed methods research (MMR) rigorously. The value-added claims for MMR include the capacity to exploit the strengths of, and compensate for the weaknesses inherent in, single designs, to generate comprehensive descriptions of phenomena, to produce more convincing results for funders or policy-makers, and to build methodological expertise. Data illustrating the value-added claims were drawn from the SCAPE study, and studies about the purpose of mixed methods were identified from a search of the literature. The authors explain why and how they undertook the components of the study, and propose a guideline to facilitate such studies. If MMR is to become the third methodological paradigm, then articulation of what extra benefit MMR adds to a study is essential. The authors conclude that MMR did add value, and they found the guideline useful as a way of making value claims explicit. The clear articulation of the procedural aspects of mixed methods research, and the identification of a guideline to facilitate such research, will enable researchers to learn more effectively from each other.

  9. Pharmacogenomics Bias - Systematic distortion of study results by genetic heterogeneity

    Directory of Open Access Journals (Sweden)

    Zietemann, Vera

    2008-04-01

    Background: Decision analyses of drug treatments in chronic diseases require modeling the progression of disease and treatment response beyond the time horizon of clinical or epidemiological studies. In many such models, progression and drug effect have been applied uniformly to all patients; heterogeneity in progression, including pharmacogenomic effects, has been ignored. Objective: We sought to systematically evaluate the existence, direction and relative magnitude of a pharmacogenomics bias (PGX-Bias) resulting from failure to adjust for genetic heterogeneity in both treatment response (HT) and progression of disease (HP) in decision-analytic studies based on clinical study data. Methods: We performed a systematic literature search in electronic databases for studies regarding the effect of genetic heterogeneity on the validity of study results. Included studies have been summarized in evidence tables. Given the lack of evidence from published studies, we performed our own simulation considering both HT and HP. We constructed two simple Markov models with three basic health states (early-stage disease, late-stage disease, dead), one adjusting and the other not adjusting for genetic heterogeneity. Adjustment was done by creating separate disease states for the presence (G+) and absence (G-) of a dichotomous genetic factor. We compared the life expectancy gains attributable to treatment resulting from both models and defined the pharmacogenomics bias as the percent deviation of treatment-related life expectancy gains in the unadjusted model from those in the adjusted model. We calculated the bias as a function of the underlying model parameters to create generic results. We then applied our model to lipid-lowering therapy with pravastatin in patients with coronary atherosclerosis, incorporating the influence of two TaqIB polymorphism variants (B1 and B2) on progression and drug efficacy as reported in the DNA substudy of the REGRESS
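
    The simulation design can be illustrated with a toy version of the two Markov models described above. All transition probabilities, the G+ prevalence and the treatment effects below are invented for illustration; the point is only the mechanics of computing the bias as the percent deviation of the unadjusted from the adjusted life-expectancy gain.

```python
# Toy sketch of the adjusted vs. unadjusted comparison (all numbers hypothetical):
# a three-state Markov cohort model (early-stage, late-stage, dead), run once with
# averaged parameters and once as separate G+ / G- sub-cohorts.
import numpy as np

def life_expectancy(p_progress, p_die_late, years=200):
    """Yearly-cycle cohort model; returns undiscounted life expectancy in years."""
    T = np.array([[1 - p_progress, p_progress, 0.0],
                  [0.0, 1 - p_die_late, p_die_late],
                  [0.0, 0.0, 1.0]])
    state = np.array([1.0, 0.0, 0.0])        # early, late, dead
    le = 0.0
    for _ in range(years):
        le += state[:2].sum()                # person-years alive this cycle
        state = state @ T
    return le

def gain(p_prog_untreated, rr_treatment, p_die_late=0.03):
    """Treatment-related life-expectancy gain; treatment scales progression by rr."""
    return (life_expectancy(p_prog_untreated * rr_treatment, p_die_late)
            - life_expectancy(p_prog_untreated, p_die_late))

# Hypothetical heterogeneity: 30% G+ carriers progress faster and respond less.
share_gpos = 0.3
gain_adjusted = (share_gpos * gain(0.04, 0.9)           # G+ sub-cohort
                 + (1 - share_gpos) * gain(0.02, 0.6))  # G- sub-cohort
gain_unadjusted = gain(share_gpos * 0.04 + (1 - share_gpos) * 0.02,
                       share_gpos * 0.9 + (1 - share_gpos) * 0.6)
bias_pct = 100 * (gain_unadjusted - gain_adjusted) / gain_adjusted
print(f"adjusted gain {gain_adjusted:.2f} y, unadjusted {gain_unadjusted:.2f} y, "
      f"PGX bias {bias_pct:+.1f}%")
```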

  10. The statistical properties of 111,112,113Sn studied with the Oslo method

    Science.gov (United States)

    Tveten, G. M.; Bello Garrote, F. L.; Campo, L. C.; Eriksen, T. K.; Giacoppo, F.; Guttormsen, M.; Görgen, A.; Hagen, T. W.; Hadynska-Klek, K.; Klintefjord, M.; Larsen, A. C.; Maharromova, S.; Nyhus, H. T.; Renstrøm, T.; Rose, S.; Sahin, E.; Siem, S.; Tornyi, T. G.

    2015-05-01

    The γ-ray strength function and level density of 111, 112, 113Sn are being studied at the Oslo Cyclotron Laboratory (OCL) up to the neutron binding energy by applying the Oslo method to particle-γ coincidence data. The preliminary results for the γ-ray strength function are discussed in the context of the results for the more neutron-rich Sn-isotopes previously studied at OCL.

  11. The strategic value of e-HRM: results from an exploratory study in a governmental organization

    NARCIS (Netherlands)

    Bondarouk, Tatiana; Ruel, Hubertus Johannes Maria

    2013-01-01

    This paper presents results from an exploratory study in a governmental organization on the strategic value of electronic human resource management (e-HRM). By applying the organizational capabilities approach, and by means of mixed research methods, data were collected on two generally acclaimed

  12. Study on effective methods to call attention in power plant sites. Towards sharing know-how

    International Nuclear Information System (INIS)

    Tsukada, Tetsuya; Nakamura, Hajime

    1999-01-01

    Methods to call attention during field work in nuclear power plants have not obtained the desired results due to redundancy and poor theoretical support. From the points of view of psychology and human engineering, we theoretically examined the validity of each of the following methods for calling attention: methods deployed in power plants, methods obtained through case studies in other industries, and newly developed methods, and then systematized these methods. Using five typical operations with different operating characteristics as models, we also determined methods deployed for each situation in the operation process. Then we determined and categorized the ways of utilizing methods to call attention according to each operating characteristic. With the aim of utilizing these results in many power plants and promoting the sharing of know-how concerning calling attention, we put together an easy-to-understand 'instruction manual', which contains know-how concerning methods to call attention and an introduction to the newly developed methods. Moreover, we established a 'database' (with a registration function) of methods to call attention, which contains organized methods and patterns of utilizing such methods in each operating characteristic. The present study is thus a report that aims at sharing the know-how, centered on this database. (author)

  13. Insufficiently studied factors related to burnout in nursing: Results from an e-Delphi study

    Science.gov (United States)

    2017-01-01

    Objective: This study aimed to identify potentially important factors in explaining burnout in nursing that have been insufficiently studied or ignored. Methods: A three-round Delphi study via e-mail correspondence was conducted with a group of 40 European experts. The e-Delphi questionnaire consisted of 52 factors identified from a literature review. Experts rated and scored the importance of factors in the occurrence of burnout and the degree of attention given by researchers to each of the variables listed, on a six-point Likert scale. We used the agreement percentage (>80%) to measure the level of consensus between experts. Furthermore, to confirm the level of consensus, we also calculated mean scores and modes. Regardless of the degree of consensus reached by the experts, we calculated the mean stability of the answers for each expert (individual qualitative stability) and the mean of the stability percentages of the experts (qualitative group stability). Results: The response rate in the three rounds was 93.02% (n = 40). Eight new factors were suggested in the first round. After modification, the e-Delphi questionnaire in the second and third rounds had 60 factors. All the factors reached the third round with a consensus level above 80% in terms of the attention that researchers gave them in their studies. Moreover, the data show a total mean qualitative group stability of 96.21%. In the third round, 9 factors were classified by experts as 'studied very little', 17 as 'studied little' and 34 as 'well studied'. Conclusion: The findings show that not all the factors that may influence nursing burnout have received the same attention from researchers. The panel of experts has identified factors that, although important in explaining burnout, have been poorly studied or even forgotten. Our results suggest that further study into factors such as a lack of recognition of part of the tasks that nurses perform, feminine stereotype or excessive bureaucracy is
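
    A minimal sketch of the consensus measures named above is given below, for a single factor and ten hypothetical expert ratings. The exact definition of the agreement percentage is an assumption here: it is taken as the share of experts rating the factor in the top two points of the six-point Likert scale.

```python
# Minimal sketch of the consensus measures (hypothetical ratings, assumed definition
# of "agreement" as a rating of 5 or 6 on the six-point scale).
import numpy as np
from collections import Counter

ratings = np.array([6, 5, 6, 4, 5, 6, 5, 6, 6, 5])      # one factor, ten experts

agreement = 100.0 * np.mean(ratings >= 5)                # agreement percentage
consensus_reached = agreement > 80.0                     # threshold used in the study
mode_rating = Counter(ratings.tolist()).most_common(1)[0][0]
print(f"agreement {agreement:.0f}%, consensus {consensus_reached}, "
      f"mean {ratings.mean():.1f}, mode {mode_rating}")
```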

  14. Case studies: Soil mapping using multiple methods

    Science.gov (United States)

    Petersen, Hauke; Wunderlich, Tina; Hagrey, Said A. Al; Rabbel, Wolfgang; Stümpel, Harald

    2010-05-01

    Soil is a non-renewable resource with fundamental functions such as filtering (e.g. water), storing (e.g. carbon), transforming (e.g. nutrients) and buffering (e.g. contamination). Degradation of soils is by now a well-known fact not only to scientists; decision makers in politics have also accepted it as a serious problem for several environmental aspects. National and international authorities have already worked out preservation and restoration strategies for soil degradation, though how to put these strategies into practice is still a matter of active research. Common to all strategies, however, is that a description of soil state and dynamics is required as a basic step. This involves collecting information on soils with methods ranging from direct soil sampling to remote applications. At an intermediate scale, mobile geophysical methods are applied, with the advantage of fast working progress but the disadvantage of site-specific calibration and interpretation issues. In the framework of the iSOIL project we present here some case studies of soil mapping performed using multiple geophysical methods. We present examples of combined field measurements with EMI, GPR, magnetic and gamma-spectrometric techniques carried out with the mobile multi-sensor system of Kiel University (GER). Depending on soil type and actual environmental conditions, different methods yield information of differing quality. By applying diverse methods we want to determine which methods, or combinations of methods, give the most reliable information on soil state and properties. To investigate the influence of varying material we performed mapping campaigns on field sites with sandy, loamy and loessy soils. Classification of measured or derived attributes shows not only the lateral variability but also gives hints of variation in the vertical distribution of soil material. For all soils, of course, soil water content can be a critical factor concerning a successful

  15. Triangulation of Methods in Labour Studies in Nigeria: Reflections ...

    African Journals Online (AJOL)

    One of the distinctive aspects of social science research in Nigeria as in other ... method in their investigations while relegating qualitative methods to the background. In labour studies, adopting only quantitative method to studying workers ...

  16. RESULTS OF THE QUESTIONNAIRE: ANALYSIS METHODS

    CERN Multimedia

    Staff Association

    2014-01-01

    Five-yearly review of employment conditions   Article S V 1.02 of our Staff Rules states that the CERN “Council shall periodically review and determine the financial and social conditions of the members of the personnel. These periodic reviews shall consist of a five-yearly general review of financial and social conditions;” […] “following methods […] specified in § I of Annex A 1”. Then, turning to the relevant part in Annex A 1, we read that “The purpose of the five-yearly review is to ensure that the financial and social conditions offered by the Organization allow it to recruit and retain the staff members required for the execution of its mission from all its Member States. […] these staff members must be of the highest competence and integrity.” And for the menu of such a review we have: “The five-yearly review must include basic salaries and may include any other financial or soc...

  17. Study of Seed Germination by Soaking Methode of Cacao (Theobroma cacao L.

    Directory of Open Access Journals (Sweden)

    Sulistyani Pancaningtyas

    2014-12-01

    A study of germination methods was conducted to obtain information on seed viability in terms of germination rate, germination percentage and vigour. Germination methods were studied with the aim of achieving efficient and effective germination that is easy to handle and low in cost while giving high vigour. The sand and gunny-sack germination methods require extensive space and a germination period of 3-4 days after planting. This research studied an alternative germination method based on soaking, which can accelerate the germination rate and use space effectively without decreasing the quality of the cacao seedlings. The research was done at the Kaliwining Experimental Station, Indonesian Coffee and Cocoa Research Institute. It consisted of two experiments arranged in a factorial completely randomized design: the first compared germination rates, and the second compared seedling quality, between the soaking and wet gunny-sack germination methods. The results showed that the radicle length obtained with the soaking method was greater than with the wet gunny-sack method. Radicle growth started from 2 hours after soaking, and the radicle length at 4 hours after soaking already differed significantly from that of the gunny-sack method. At 24 hours after soaking the radicle length was 3.69 mm, against 0.681 mm for the wet gunny-sack treatment. Except for hypocotyl length, there was no difference between seedlings obtained from the soaking and wet gunny-sack methods. Hypocotyl length at 36 hours after soaking was 9.15 cm, significantly different from the 5.40 cm obtained with the wet gunny-sack germination method. Keywords: seed germination, soaking method, Theobroma cacao L., cocoa seedlings

  18. EXPERIMENTAL RESULTS OF THE NEPHELINE PHASE III STUDY

    Energy Technology Data Exchange (ETDEWEB)

    Fox, K.; Edwards, T.

    2009-11-09

    This study is the third phase in a series of experiments designed to reduce conservatism in the model that predicts the formation of nepheline, a crystalline phase that can reduce the durability of high level waste glass. A Phase I study developed a series of glass compositions that were very durable while their nepheline discriminator values were well below the current nepheline discriminator limit of 0.62, where nepheline is predicted to crystallize upon slow cooling. A Phase II study selected glass compositions to identify any linear effects of composition on nepheline crystallization and that were restricted to regions that fell within the validation ranges of the Defense Waste Processing Facility (DWPF) Product Composition Control System (PCCS) models. However, it was not possible to identify any linear effects of composition on chemical durability performance for this set of study glasses. The results of the Phase II study alone were not sufficient to recommend modification of the current nepheline discriminator. It was recommended that the next series of experiments continue to focus not only on compositional regions where the PCCS models are considered applicable (i.e., the model validation ranges), but also be restricted to compositional regions where the only constraint limiting processing is the current nepheline discriminator. Two methods were used in selecting glasses for this Phase III nepheline study. The first was based on the relationship of the current nepheline discriminator model to the other DWPF PCCS models, and the second was based on theory of crystallization in mineral and glass melts. A series of 29 test glass compositions was selected for this study using a combination of the two approaches. The glasses were fabricated and characterized in the laboratory. After reviewing the data, the study glasses generally met the target compositions with little issue. Product Consistency Test results correlated well with the crystallization analyses in

  19. Hydrogeochemical methods for studying uranium mineralization in sedimentary rocks

    International Nuclear Information System (INIS)

    Lisitsin, A.K.

    1985-01-01

    The role of hydrogeochemical studies of uranium deposits is considered; such studies permit data to be obtained on the ore-forming role of aqueous solutions. The hydrogeochemistry of ore formation is determined through physicochemical analysis of mineral parageneses. Analyses of the contents of primary and secondary gas-liquid inclusions in the minerals are of great importance. Another way to determine the main features of ore-formation hydrogeochemistry involves the simultaneous analysis of material from a number of deposits of one genetic type but at different stages of their geochemical life: being formed, formed and preserved, and being destroyed. Comparison of the mineralogical-geochemical zonation with the hydrogeochemical zonation in the water-bearing horizon is an efficient method, leading to an objective interpretation of the facts. Such a comparison is essential when determining deposit genesis.

  20. Intraarterial Ultrasound in Pancreatic Cancer: Feasibility Study and Preliminary Results

    International Nuclear Information System (INIS)

    Larena-Avellaneda, Axel; Timm, Stephan; Kickuth, Ralph; Kenn, Werner; Steger, Ulrich; Jurowich, Christian; Germer, Christoph-Thomas

    2010-01-01

    Despite technological advances in computed tomography (CT) and magnetic resonance imaging, the involvement of the celiac or mesenteric artery in pancreatic cancer remains uncertain in many cases. Infiltration of these vessels is important in making decisions about therapy but often can only be definitively determined at laparotomy. Local (intraarterial) ultrasound may increase diagnostic accuracy. Using the Volcano intravascular ultrasound (IVUS) system, we applied a transfemoral method to scan the celiac and mesenteric arteries directly from within the artery. This technique was used in five patients with suspected pancreatic cancer. Technical success was achieved in all cases. In one case, a short dissection of the mesenteric artery occurred but could be managed interventionally. In tumors that did not contact the vessels, IVUS was unable to display the tissue pathology. Our main interest was infiltration of the arteries. In one case, infiltration was clearly shown on the CT scan, while in two patients it was uncertain; in these latter two cases, IVUS correctly predicted infiltration in one and freedom from tumor in the other. In our preliminary study, IVUS correctly predicted arterial infiltration in all cases. IVUS did not provide new information when the tumor was far away from the vessel. Compared with IVUS in the portal vein, the information about the artery is more detailed, and the vessel approach is easier. These results encouraged us to design a prospective study to evaluate the sensitivity and specificity of this method.

  1. Study on calculation of influence function with fracture mechanics analysis method for circumferential through-crack pipe

    International Nuclear Information System (INIS)

    Zheng Bin; Lu Yuechuan; Zang Fenggang; Sun Yingxue

    2009-01-01

    In order to widen the application of the EPRI engineering method, a series of 3D elastic and elastic-plastic fracture mechanics finite element analyses was performed; the crack opening displacements (COD) of cracked pipes were calculated, and a key influence function h2 of the EPRI engineering method was studied against the COD results of the FEM. A method for calculating h2 under combined tension and bending loads is introduced in detail. In order to validate this method, the calculated h2 results were compared with those of EPRI, and the COD results calculated from these h2 values were compared with those of PICEP. The comparisons indicated that the calculated h2 results and the COD results agreed with their corresponding reference values, thereby validating the calculation method of this paper. (authors)

  2. Study on method of characteristics based on cell modular ray tracing

    International Nuclear Information System (INIS)

    Tang Chuntao; Zhang Shaohong

    2009-01-01

    To address the issue of accurately solving neutron transport problems in complex geometries, the method of characteristics (MOC) is studied in this paper, and a quite effective and memory-saving cell modular ray tracing (CMRT) method is developed; the related angle discretization and boundary condition handling issues are also discussed. A CMRT-based MOC code, PEACH, is developed and tested against the C5G7 MOX benchmark problem. Numerical results demonstrate that PEACH gives excellent accuracy for both k-eff and the pin power distribution in neutron transport problems. (authors)

  3. A feasibility study in adapting Shamos Bickel and Hodges Lehman estimator into T-Method for normalization

    Science.gov (United States)

    Harudin, N.; Jamaludin, K. R.; Muhtazaruddin, M. Nabil; Ramlie, F.; Muhamad, Wan Zuki Azman Wan

    2018-03-01

    The T-Method is one of the techniques governed under the Mahalanobis Taguchi System, developed specifically for multivariate prediction. Prediction using the T-Method is possible even with a very limited sample size. The user of the T-Method is required to clearly understand the trend of the population data, since the method does not consider the effect of outliers within it. Outliers may cause apparent non-normality, and classical methods then break down. There exist robust parameter estimates that provide satisfactory results when the data contain outliers as well as when they are free of them; among them are the robust estimates of location and scale called Shamos-Bickel (SB) and Hodges-Lehmann (HL), which can be used in place of the classical mean and standard deviation. Embedding these into the normalization stage of the T-Method may help enhance the accuracy of the T-Method and allow its robustness to be analysed. However, the results of the higher-sample-size case study show that the T-Method has the lowest average error percentage (3.09%) on data with extreme outliers, while HL and SB have the lowest error percentage (4.67%) on data without extreme outliers, with a minimal error difference compared with the T-Method. The trend in prediction error percentages is reversed for the lower-sample-size case study. The results show that with a minimum sample size, where outliers are always at low risk, the T-Method performs better, while for a higher sample size with extreme outliers the T-Method again shows better prediction than the alternatives. For the case studies conducted in this research, the normalization of the T-Method shows satisfactory results, and it does not appear worthwhile to adapt HL and SB, or the normal mean and standard deviation, into it, since they provide only a minimal change in the error percentages.
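
    As a hedged illustration of the estimators named in this record, the sketch below implements the standard textbook forms of the Hodges-Lehmann location estimate and the Shamos scale estimate and uses them as drop-in analogues of mean and standard deviation for normalizing a variable. The data and the consistency constant 1.048 are illustrative assumptions; the paper's exact formulation may differ.

```python
# Illustrative sketch: robust location (Hodges-Lehmann) and scale (Shamos) estimates
# used to normalize a variable in place of the classical mean and standard deviation.
import numpy as np
from itertools import combinations

def hodges_lehmann(x):
    """Median of pairwise (Walsh) averages, including the single observations."""
    x = np.asarray(x, dtype=float)
    pairs = [(a + b) / 2.0 for a, b in combinations(x, 2)]
    return np.median(np.concatenate([x, pairs]))

def shamos(x):
    """Median of pairwise absolute differences, scaled (~1.048) to be roughly
    consistent with the standard deviation under normality."""
    x = np.asarray(x, dtype=float)
    diffs = [abs(a - b) for a, b in combinations(x, 2)]
    return 1.048 * np.median(diffs)

def robust_normalize(x):
    """T-Method-style normalization with HL/Shamos in place of mean/std (assumption)."""
    return (np.asarray(x, dtype=float) - hodges_lehmann(x)) / shamos(x)

data = [10.2, 9.8, 10.5, 10.1, 9.9, 10.3, 25.0]     # one extreme outlier
print(hodges_lehmann(data), shamos(data))
print(np.round(robust_normalize(data), 2))
```

    With the outlier present, the HL/Shamos pair barely moves, whereas the classical mean and standard deviation shift substantially, which is the robustness argument the study set out to test.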

  4. A novel method for trace tritium transport studies

    International Nuclear Information System (INIS)

    Bonheure, Georges; Mlynar, Jan; Murari, A.; Giroud, C.; Popovichev, S.; Belo, P.; Bertalot, L.

    2009-01-01

    A new method combining a free-form solution for the neutron emissivity and the ratio method (Bonheure et al 2006 Nucl. Fusion 46 725-40) is applied to the investigation of tritium particle transport in JET plasmas. The 2D neutron emissivity is calculated using the minimum Fisher regularization (MFR) method (Anton et al 1996 Plasma Phys. Control. Fusion 38 1849, Mlynar et al 2003 Plasma Phys. Control. Fusion 45 169). This method is being developed and studied alongside other methods at JET. The 2D neutron emissivity was significantly improved compared with the first MFR results by constraining the emissivity along the magnetic flux surfaces. 1D profiles suitable for transport analysis are then obtained by subsequent poloidal integration. In the methods on which previous JET publications are based (Stork et al 2005 Nucl. Fusion 45 S181, JET Team (prepared by Zastrow) 1999 Nucl. Fusion 39 1891, Zastrow et al 2004 Plasma Phys. Control. Fusion 46 B255, Adams et al 1993 Nucl. Instrum. Methods A 329 277, Jarvis et al 1997 Fusion Eng. Des. 34-35 59, Jarvis et al 1994 Plasma Phys. Control. Fusion 36 219), the 14.07 MeV D-T neutron line-integral measurements were simulated and the transport coefficients varied until good fits were obtained. In this novel approach, direct knowledge of the tritium concentration, or fuel ratio n_T/n_D, is obtained using all available neutron profile information, e.g. both 2.45 MeV D-D neutron profiles and 14.07 MeV D-T neutron profiles (Bonheure et al 2006 Nucl. Fusion 46 725-40). Tritium particle transport coefficients are then determined by linear regression from the dynamic response of the tritium concentration n_T/n_D profile. The temporal and spatial evolution of the tritium particle concentration was studied for a set of JET discharges with tritium gas puffs from the JET trace tritium experiments. Local tritium transport coefficients were derived from the particle flux equation Γ = -D∇n_T + Vn_T, where D is the particle diffusivity and V
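
    The final step, determining D and V from the flux law, can be illustrated with a small regression sketch on synthetic data. Dividing Γ = -D∇n_T + Vn_T by n_T gives Γ/n_T = -D(∇n_T/n_T) + V, a straight line with slope -D and intercept V. The numbers below are invented, and this is not the JET analysis code.

```python
# Illustrative sketch (synthetic data): recover D and V from the flux law
# Gamma = -D * grad(n_T) + V * n_T by fitting Gamma/n_T against grad(n_T)/n_T.
import numpy as np

rng = np.random.default_rng(0)
D_true, V_true = 1.5, -2.0                      # m^2/s, m/s (assumed values)
grad_over_n = np.linspace(-5.0, 0.0, 40)        # samples of grad(n_T)/n_T (1/m)
flux_over_n = -D_true * grad_over_n + V_true + rng.normal(0.0, 0.1, grad_over_n.size)

slope, intercept = np.polyfit(grad_over_n, flux_over_n, 1)
print(f"D = {-slope:.2f} m^2/s, V = {intercept:.2f} m/s")   # close to the true values
```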

  5. Ion implantation as a method of studying inhomogeneities in superconductors: results for indium films with embedded helium particles

    International Nuclear Information System (INIS)

    Fogel, N.Ya.; Moshenski, A.A.; Dmitrenko, I.M.

    1978-01-01

    The paper considers the applicability of ion implantation into superconductors for investigating the effects of inhomogeneities on their macroscopic properties. Noble-gas-ion implantation into thin superconducting films is shown to be a unique means of systematically studying these effects in a single sample. Data demonstrating the effect of inhomogeneities on the critical current I_c in the mixed state and on phase-transition smearing in He+-ion-irradiated indium films are presented. First, experimental evidence was obtained to support the Larkin-Ovchinnikov theory, which relates I_c and the phase-transition smearing to inhomogeneities of the electron-electron interaction constant g(r) and the electron mean free path l(r). Results are presented for parallel critical field anomalies in He-implanted indium films which are due to an implantation-induced anisotropy of xi(t). Changes in the critical parameters of the film resulting from the implantation are compared with structural changes. (Auth.)

  6. Soybean allergen detection methods--a comparison study

    DEFF Research Database (Denmark)

    Pedersen, M. Højgaard; Holzhauser, T.; Bisson, C.

    2008-01-01

    Soybean containing products are widely consumed, thus reliable methods for detection of soy in foods are needed in order to make appropriate risk assessment studies to adequately protect soy allergic patients. Six methods were compared using eight food products with a declared content of soy...

  7. Analytical methods applied to the study of lattice gauge and spin theories

    International Nuclear Information System (INIS)

    Moreo, Adriana.

    1985-01-01

    A study of the interactions between quarks and gluons is presented. Certain difficulties of quantum chromodynamics in explaining the behaviour of quarks have given rise to the technique of lattice gauge theories. First, the phase diagrams of the discrete space-time theories are studied; the analysis of the phase diagrams is made by numerical and analytical methods. The following items were investigated: a) a variational technique was proposed to obtain very accurate values for the ground and first excited state energies of the analyzed theory; b) a mean-field-like approximation for lattice spin models in the link formulation, which is a generalization of the mean-plaquette technique, was developed; c) a new method to study lattice gauge theories at finite temperature was proposed, and for the first time a non-abelian model was studied with analytical methods; d) an abelian lattice gauge theory with fermionic matter in the strong coupling limit was analyzed. Interesting results applicable to non-abelian gauge theories were obtained. (M.E.L.)

  8. A Case Study of a Mixed Methods Study Engaged in Integrated Data Analysis

    Science.gov (United States)

    Schiazza, Daniela Marie

    2013-01-01

    The nascent field of mixed methods research has yet to develop a cohesive framework of guidelines and procedures for mixed methods data analysis (Greene, 2008). To support the field's development of analytical frameworks, this case study reflects on the development and implementation of a mixed methods study engaged in integrated data analysis.…

  9. Assessment of South African uranium resources: methods and results

    International Nuclear Information System (INIS)

    Camisani-Calzolari, F.A.G.M.; De Klerk, W.J.; Van der Merwe, P.J.

    1985-01-01

    This paper deals primarily with the methods used by the Atomic Energy Corporation of South Africa in arriving at the assessment of the South African uranium resources. The Resource Evaluation Group is responsible for this task, which is carried out on a continuous basis. The evaluation is done on a property-by-property basis and relies upon data submitted to the Nuclear Development Corporation of South Africa by the various companies involved in uranium mining and prospecting in South Africa. Resources are classified into Reasonably Assured (RAR), Estimated Additional (EAR) and Speculative (SR) categories as defined by the NEA/IAEA Steering Group on Uranium Resources. Each category is divided into three cost categories, viz. resources exploitable at less than $80/kg uranium, at $80-130/kg uranium and at $130-260/kg uranium. Resources are reported in quantities of uranium metal that could be recovered after mining and metallurgical losses have been taken into consideration. Resources in the RAR and EAR categories exploitable at costs of less than $130/kg uranium are now estimated at 460 000 t uranium, which represents some 14 per cent of WOCA's (World Outside the Centrally Planned Economies Area) resources. The evaluation of a uranium venture is carried out in various steps, of which the most important, in order of implementation, are: geological interpretation, assessment of in situ resources using techniques varying from manual contouring of values to geostatistics, feasibility studies, and estimation of recoverable resources. Because the choice of an evaluation method is, to some extent, dictated by statistical considerations, frequency distribution curves of the uranium grade variable are illustrated and discussed for characteristic deposits

  10. Study of the Rancimat test method in measuring the oxidation stability of biodiesel ester and blends

    Energy Technology Data Exchange (ETDEWEB)

    Berthiaume, D.; Tremblay, A. [Oleotek Inc., Thetford Mines, PQ (Canada)

    2006-11-15

    This paper provided details of a study conducted to examine the oxidation stability of biodiesel blends. The study tested samples of canola oil, soybean oil, fish oil, yellow grease, and tallow. The EN 14112 (Rancimat) method was used to compare oxidation stability results obtained in previous tests conducted in the United States and Europe. The aim of the study was also to evaluate the influence of peroxide value (PV), acid value (AV) and feedstock source on the oxidative stability of the different samples. The study also evaluated the possibility of developing a validated test method, derived from the EN 14112 method, that specifically considers biodiesel blends. Results of the study indicated that the Rancimat method was not suitable for measuring the oxidation stability of biodiesels blended with petrodiesels. No direct correlation between oxidative stability and PV or AV was observed. It was concluded that fatty acid distribution was not a principal factor in causing changes in oxidation stability. 22 refs., 3 tabs., 1 fig.

  11. Comparison of OpenFOAM and EllipSys3D actuator line methods with (NEW) MEXICO results

    Science.gov (United States)

    Nathan, J.; Meyer Forsting, A. R.; Troldborg, N.; Masson, C.

    2017-05-01

    The Actuator Line Method has existed for more than a decade and has become a well-established choice for simulating wind turbine rotors in computational fluid dynamics. Numerous implementations exist and are used in the wind energy research community. These codes were verified against experimental data such as the MEXICO experiment. Often, however, the verification against other codes was made only on a very broad scale. Therefore this study first attempts a validation by comparing two different implementations, namely an adapted version of SOWFA/OpenFOAM and EllipSys3D, and also a verification by comparing against experimental results from the MEXICO and NEW MEXICO experiments.
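
    For readers unfamiliar with the technique, the core of any actuator line implementation is the projection of blade-element forces onto the flow solver's grid, usually with a Gaussian smearing kernel. The sketch below is a generic illustration of that step; parameter names and values are assumptions, not taken from either of the codes compared in the study.

      import numpy as np

      def project_actuator_force(grid_points, actuator_point, force, epsilon):
          """Spread a blade-element force over nearby grid points with a 3D Gaussian kernel.

          grid_points    : (N, 3) array of cell-centre coordinates
          actuator_point : (3,) position of the actuator-line element
          force          : (3,) aerodynamic force carried by that element
          epsilon        : smearing width controlling how far the force is distributed
          """
          r = np.linalg.norm(grid_points - actuator_point, axis=1)
          eta = np.exp(-(r / epsilon) ** 2) / (epsilon ** 3 * np.pi ** 1.5)  # regularisation kernel
          return np.outer(eta, force)  # (N, 3) body-force density contributions

      # Illustrative use on a small random grid
      grid = np.random.rand(1000, 3) * 10.0
      body_force = project_actuator_force(grid, np.array([5.0, 5.0, 5.0]),
                                          np.array([0.0, 0.0, -100.0]), epsilon=0.5)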

  12. Potential theory for stationary Schrödinger operators: a survey of results obtained with non-probabilistic methods

    Directory of Open Access Journals (Sweden)

    Marco Bramanti

    1992-05-01

    In this paper we deal with a uniformly elliptic operator of the kind Lu = Au + Vu, where the principal part A is in divergence form and V is a function assumed to be in a “Kato class”. This operator has been studied in different contexts, especially using probabilistic techniques. The aim of the present work is to give a unified and simplified presentation of the results obtained with non-probabilistic methods for the operator L on a bounded Lipschitz domain. These results regard: continuity of the solutions of Lu=0; Harnack inequality; estimates on the Green's function and L-harmonic measure; boundary behavior of positive solutions of Lu=0, in particular a “Fatou's theorem”.
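
    For readers who want the setting written out, a standard formulation of the operator and of the Kato-class condition is the following (the precise form of the principal part and the dimension restriction are assumptions of this presentation, not spelled out in the abstract):

      Lu = Au + Vu, \qquad Au = \operatorname{div}\big(a(x)\,\nabla u\big),

      V \in K_n \iff \lim_{r\to 0}\ \sup_{x}\ \int_{|x-y|<r} \frac{|V(y)|}{|x-y|^{\,n-2}}\,dy = 0 \qquad (n \ge 3).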

  13. METHODS OF MEASURING THE EFFECTS OF LIGHTNING BY SIMULATING ITS STRIKES WITH THE INTERVAL ASSESSMENT OF THE RESULTS OF MEASUREMENTS

    Directory of Open Access Journals (Sweden)

    P. V. Kriksin

    2017-01-01

    The article presents the results of the development of new methods aimed at a more accurate interval estimate of the experimental values of voltages on grounding devices of substations and of circuits in the control cables that occur when lightning strikes lightning rods; the abovementioned estimate made it possible to increase the accuracy of the results of the study of lightning noise by 28 %. A more accurate interval estimate was achieved by developing a measurement model that takes into account, along with the measured values, different measurement errors and includes special processing of the measurement results. As a result, the interval containing the true value of the sought voltage is determined with a confidence of 95 %. The methods can be applied to the IK-1 and IKP-1 measurement complexes, consisting of an aperiodic pulse generator, a generator of high-frequency pulses and selective voltmeters, respectively. To evaluate the effectiveness of the developed methods, series of experimental voltage assessments of the grounding devices of ten active high-voltage substations have been carried out in accordance with the developed methods and with traditional techniques. The evaluation results confirmed the possibility of finding the true values of voltage over a wide range, which ought to be considered in the process of technical diagnostics of lightning protection of substations when the measurement results are analysed and measures to reduce the effects of lightning are developed. Also, a comparative analysis of the results of measurements made in accordance with the developed methods and with traditional techniques has demonstrated that the true value of the sought voltage may exceed the measured value by an average of 28 %, which ought to be considered in the further analysis of the parameters of lightning protection at the facility and in the development of corrective actions. The developed methods have been
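
    The abstract does not give the measurement model itself; as a generic illustration of the kind of interval estimate described (assuming normally distributed errors and a 95 % coverage factor), repeated readings and a known instrument uncertainty could be combined as follows:

      import math
      import statistics

      def interval_estimate(readings, instrument_uncertainty, coverage_factor=1.96):
          """Return (lower, upper) bounds of an approximate 95 % interval for the true voltage.

          readings               : repeated voltage measurements (same units)
          instrument_uncertainty : standard uncertainty of the measuring channel, assumed known
          coverage_factor        : 1.96 approximates 95 % coverage for normal errors
          """
          mean = statistics.mean(readings)
          stat_u = statistics.stdev(readings) / math.sqrt(len(readings))   # statistical (type A) term
          combined = math.sqrt(stat_u ** 2 + instrument_uncertainty ** 2)  # combine with instrument (type B) term
          half_width = coverage_factor * combined
          return mean - half_width, mean + half_width

      print(interval_estimate([412.0, 398.0, 405.0, 410.0], instrument_uncertainty=6.0))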

  14. Marshland study brings fruitful results

    Energy Technology Data Exchange (ETDEWEB)

    1978-03-01

    There are approximately 110,000 square kilometers of marshland in China containing peat and other valuable resources. The results of a marshland study have been presented as a chapter in a book titled Physical Geography of China. In 1970, the Kirin Normal University established an experimental plant for the utilization of peat. Four kinds of peat fertilizers produced by this plant have been used with good results at 20 communes in Kirin Province. Four kinds of construction materials (peat board, peat tiles, peat insulation bricks and peat insulation tubes) were also successfully made. A certain type of peat can be used as fuel to heat malt in the process of whiskey making. New applications of peat have been found in medicine, water purification, and in the manufacturing of electrodes for condensers.

  15. Analysis method for the search for neutrinoless double beta decay in the NEMO3 experiment: study of the background and first results

    International Nuclear Information System (INIS)

    Etienvre, A.I.

    2003-04-01

    The NEMO3 detector, installed in the Frejus Underground Laboratory, is dedicated to the study of neutrinoless double beta decay: the observation of this process would signal the massive and Majorana nature of the neutrino. The experiment consists of very thin central source foils (the total mass is equal to 10 kg), a tracking detector made of drift cells operating in Geiger mode, a calorimeter made of plastic scintillators associated with photomultipliers, a coil producing a 30 gauss magnetic field and two shields dedicated to the reduction of the γ-ray and neutron fluxes. In the first part, I describe the implications of several mechanisms, related to trilinear R-parity violation, for double beta decay. The second part is dedicated to a detailed study of the tracking detector of the experiment: after a description of the different working tests, I present the determination of the characteristics of the tracking reconstruction (transverse and longitudinal resolution per Geiger cell, precision of the vertex determination, and charge recognition). The last part corresponds to the analysis of the data taken by the experiment. On the one hand, an upper limit on the Tl-208 activity of the sources has been determined: it is lower than 68 mBq/kg at the 90% confidence level. On the other hand, I have developed and tested on these data a method to analyse the neutrinoless double beta decay signal; this method is based on a maximum likelihood using all the available information. Using this method, I could determine a first and very preliminary upper limit on the effective mass of the neutrino. (author)
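
    As a toy illustration of likelihood-based limit setting of the kind mentioned above (a simple Poisson counting model, not the multi-variable likelihood actually used in the thesis), one can scan a signal strength and read off an approximate 90 % CL upper limit:

      import math

      def poisson_nll(n_obs, s, b):
          """Negative log-likelihood of observing n_obs counts with signal s and background b."""
          mu = s + b
          return mu - n_obs * math.log(mu) + math.lgamma(n_obs + 1)

      def upper_limit(n_obs, b, delta_nll=0.82, step=0.01):
          """Crude 90 % CL upper limit: smallest s whose NLL exceeds the minimum by delta_nll.

          delta_nll = 0.82 (about 1.282**2 / 2) corresponds to a one-sided 90 % interval
          in the Gaussian approximation; this is an illustration, not the thesis method.
          """
          best = min(poisson_nll(n_obs, s, b) for s in
                     [i * step for i in range(int(50 / step))])
          s = 0.0
          while poisson_nll(n_obs, s, b) - best < delta_nll:
              s += step
          return s

      print(upper_limit(n_obs=3, b=2.5))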

  16. The statistical properties of 111,112,113Sn studied with the Oslo method

    Directory of Open Access Journals (Sweden)

    Tveten G. M.

    2015-01-01

    The γ-ray strength function and level density of 111,112,113Sn are being studied at the Oslo Cyclotron Laboratory (OCL) up to the neutron binding energy by applying the Oslo method to particle-γ coincidence data. The preliminary results for the γ-ray strength function are discussed in the context of the results for the more neutron-rich Sn isotopes previously studied at OCL.

  17. IMPLEMENTATION OF THE CASE STUDY METHOD IN DISTANCE LEARNING OF ENTREPRENEURSHIP FUNDAMENTALS

    Directory of Open Access Journals (Sweden)

    O. R. Chepyuk

    2016-01-01

    The aim of the presented publication is to show new opportunities for applying the case study method (educational situations) in the modern educational process of higher education in general, and in particular in teaching the fundamentals of business and economics, where this method has gained special popularity. Methods and results. By means of methods of aggregation, deduction and logical synthesis, the authors developed the principles of the organization of distance training in economic disciplines on the basis of the case study method. The structure of educational cases is designated; the standard set of accompanying materials is designed. These practical problems were solved within the implementation of the Tempus project «Acquisition of Professional and Entrepreneurial Skills by means of Education of Entrepreneurial Spirit and Consultation of the Beginning Entrepreneurs». Possible types and forms of cases were studied; several options for adapting their content to the electronic training environment, which has both restrictions and extensive additional educational potential, are outlined. Various types of cases are shown based on specific examples: illustrating processes and concepts; imitating sample processes; describing original situations in real business that have decisions already realized in practice; and cases with an uncertain answer to the posed problematic issue. The choice of a particular type of case study task is determined by the educational purposes and the required level of mastery of a discipline. Cases supplement each other when forming the fund of evaluative means. Scientific novelty. The majority of researches define the case study method as a group discussion, for educational purposes, of a problem situation and a collective search for its decision, i.e. application of this method assumes classroom full-time courses. The question of using the case study method in a distance format for individual

  18. Learning phacoemulsification. Results of different teaching methods.

    Directory of Open Access Journals (Sweden)

    Hennig Albrecht

    2004-01-01

    We report the learning curves of three eye surgeons converting from sutureless extracapsular cataract extraction to phacoemulsification using different teaching methods. Posterior capsule rupture (PCR) as a per-operative complication and the visual outcome of the first 100 operations were analysed. The PCR rate was 4% and 15% in supervised and unsupervised surgery respectively. Likewise, an uncorrected visual acuity of ≥ 6/18 on the first postoperative day was seen in 62 (62%) of patients and in 22 (22%) in supervised and unsupervised surgery respectively.

  19. Study of flow over object problems by a nodal discontinuous Galerkin-lattice Boltzmann method

    Science.gov (United States)

    Wu, Jie; Shen, Meng; Liu, Chen

    2018-04-01

    Flow-over-object problems are studied by a nodal discontinuous Galerkin-lattice Boltzmann method (NDG-LBM) in this work. Different from the standard lattice Boltzmann method, the current method applies the nodal discontinuous Galerkin method to the streaming process in the LBM to solve the resultant pure convection equation, in which the spatial discretization is completed on unstructured grids and a low-storage explicit Runge-Kutta scheme is used for time marching. The present method thus overcomes the disadvantage of the standard LBM of depending on uniform meshes. Moreover, the collision process in the LBM is completed by using the multiple-relaxation-time scheme. After validation of the NDG-LBM by simulating the lid-driven cavity flow, simulations of flows over a fixed circular cylinder, a stationary airfoil and rotating-stationary cylinders are performed. Good agreement of the present results with previous results is achieved, which indicates that the current NDG-LBM is accurate and effective for flow-over-object problems.
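
    For orientation, the standard lattice Boltzmann update that the paper modifies consists of a local collision step followed by streaming along fixed lattice directions. The minimal D2Q9 sketch below uses the simpler single-relaxation-time (BGK) collision on a uniform periodic grid purely to illustrate the collide-and-stream structure; the paper itself replaces the streaming step with a nodal DG solver on unstructured grids and uses an MRT collision model.

      import numpy as np

      # D2Q9 lattice: discrete velocities and weights
      c = np.array([[0,0],[1,0],[0,1],[-1,0],[0,-1],[1,1],[-1,1],[-1,-1],[1,-1]])
      w = np.array([4/9] + [1/9]*4 + [1/36]*4)

      def equilibrium(rho, u):
          """Equilibrium distributions for D2Q9 (lattice units, cs^2 = 1/3)."""
          cu = np.einsum('ai,xyi->xya', c, u)            # c_a . u at every node
          usq = np.einsum('xyi,xyi->xy', u, u)
          return w * rho[..., None] * (1 + 3*cu + 4.5*cu**2 - 1.5*usq[..., None])

      def collide_and_stream(f, tau):
          """One BGK time step on a periodic uniform grid (illustration only)."""
          rho = f.sum(axis=-1)
          u = np.einsum('xya,ai->xyi', f, c) / rho[..., None]
          f = f + (equilibrium(rho, u) - f) / tau         # BGK collision
          for a, (cx, cy) in enumerate(c):                # streaming along lattice directions
              f[..., a] = np.roll(np.roll(f[..., a], cx, axis=0), cy, axis=1)
          return f

      # Tiny demonstration: a uniform fluid at rest stays uniform
      nx, ny = 32, 32
      f = equilibrium(np.ones((nx, ny)), np.zeros((nx, ny, 2)))
      for _ in range(10):
          f = collide_and_stream(f, tau=0.8)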

  20. KAP STUDY ON CONTRACEPTIVE METHODS IN KANPUR DISTRICT OF UP

    Directory of Open Access Journals (Sweden)

    S K Kaushal

    2010-06-01

    Research question: What is the status of knowledge, attitude and practices about family planning? Objectives: 1. To study the knowledge about various contraceptives, 2. To study the prevalent attitude and practices regarding family planning, 3. To study the influence of social factors affecting contraceptive use & 4. To find out reasons for not adopting contraception. Study design: Cross sectional study. Setting and participants: Rural block of Kanpur District and married women of reproductive age group i.e. 15-49 yrs. Study period: July to December 2005. Sample size: 280 married women of reproductive age group i.e. 15-49 yrs. Study variables: Knowledge status, attitude, practices, social factors, reasons for not using contraceptives. Results: Awareness about contraception was more than 90 percent for all available methods except vasectomy and injectables, which was 31.5% and 8.6% respectively. Only 29.3% of women were currently practicing contraception and nearly half (46.42%) had never used it. OCP and condoms are the most commonly accepted methods. The most common reasons observed for contraceptive defaulters were unavailability (30.88%) and adverse effects (26.47%), and for never users, need not felt (36.92%) and desire for more children (13.84%). Educational status and joint family structure have a positive impact on contraceptive acceptance.

  1. Monte Carlo evaluation of scattering correction methods in 131I studies using pinhole collimator

    International Nuclear Information System (INIS)

    López Díaz, Adlin; San Pedro, Aley Palau; Martín Escuela, Juan Miguel; Rodríguez Pérez, Sunay; Díaz García, Angelina

    2017-01-01

    Scattering is quite important for image activity quantification. In order to study the scattering factors and the efficacy of three multiple-energy-window scatter correction methods during 131I thyroid studies with a pinhole collimator (5 mm hole), a Monte Carlo (MC) simulation was developed. The GAMOS MC code was used to model the gamma camera and the thyroid source geometry. First, to validate the MC gamma camera pinhole-source model, the sensitivity in air and water of the simulated and measured thyroid phantom geometries were compared. Next, simulations to investigate scattering and the results of the triple energy window (TEW), double energy window (DW) and reduced double energy window (RDW) correction methods were performed for different thyroid sizes and depth thicknesses. The relative discrepancies with respect to the MC real events were evaluated. Results: The accuracy of the GAMOS MC model was verified and validated. The image's scattering contribution was significant, between 27-40 %. The discrepancies between the three multiple-energy-window correction methods were significant (between 9-86 %). The reduced double window method (15%) gave discrepancies of 9-16 %. Conclusions: For the simulated thyroid geometry with pinhole, the RDW (15 %) was the most effective. (author)
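
    The abstract does not spell out the window formulas; for reference, the widely used triple-energy-window (TEW) estimate subtracts a scatter count interpolated from two narrow sub-windows flanking the photopeak. A minimal sketch follows; the window widths in the example are illustrative assumptions, not the ones used in the study.

      def tew_scatter_estimate(c_lower, c_upper, w_lower, w_upper, w_main):
          """Triple-energy-window scatter estimate inside the main photopeak window.

          c_lower, c_upper : counts in the narrow windows below and above the photopeak
          w_lower, w_upper : widths (keV) of those narrow windows
          w_main           : width (keV) of the main photopeak window
          """
          return (c_lower / w_lower + c_upper / w_upper) * w_main / 2.0

      def tew_corrected_counts(c_main, c_lower, c_upper, w_lower, w_upper, w_main):
          """Primary (scatter-corrected) counts; clipped at zero to avoid negative estimates."""
          return max(c_main - tew_scatter_estimate(c_lower, c_upper, w_lower, w_upper, w_main), 0.0)

      # Example with assumed 131I windows: 364 keV photopeak +/- 10 %, 6 keV sub-windows
      print(tew_corrected_counts(c_main=12000, c_lower=900, c_upper=300,
                                 w_lower=6.0, w_upper=6.0, w_main=72.8))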

  2. Thin layer chromatographic method for the detection of uric acid: collaborative study.

    Science.gov (United States)

    Thrasher, J J; Abadie, A

    1978-07-01

    A collaborative study has been completed on an improved method for the detection and confirmation of uric acid from bird and insect excreta. The proposed method involves the lithium carbonate solubilization of the suspect excreta material, followed by butanol-methanol-water-acetic acid thin layer chromatography, and trisodium phosphate-phosphotungstic acid color development. The collaborative tests resulted in 100% detection of uric acid standard at the 50 ng level and 75% detection at the 20-25 ng level. No false positives were reported during tests of compounds similar to uric acid. The proposed method has been adopted official first action; the present official final action method, 44.161, will be retained for screening purposes.

  3. Advanced methods for the study of PWR cores; Les methodes d'etudes avancees pour les coeurs de REP

    Energy Technology Data Exchange (ETDEWEB)

    Lambert, M.; Salvatores, St.; Ferrier, A. [Electricite de France (EDF), Service Etudes et Projets Thermiques et Nucleaires, 92 - Courbevoie (France); Pelet, J.; Nicaise, N.; Pouliquen, J.Y.; Foret, F. [FRAMATOME ANP, 92 - Paris La Defence (France); Chauliac, C. [CEA Saclay, Dir. de l' Energie Nucleaire (DEN), 91 - Gif sur Yvette (France); Johner, J. [CEA Cadarache, Dept. de Recherches sur la Fusion Controlee (DRFC), 13 - Saint Paul lez Durance (France); Cohen, Ch

    2003-07-01

    This document gathers the transparencies presented at the 6th technical session of the French nuclear energy society (SFEN) in October 2003. The transparencies of the annual meeting are presented in the introductory part: 1 - status of the French nuclear park: nuclear energy results, management of an exceptional climatic situation: the heat wave of summer 2003 and power generation (J.C. Barral); 2 - status of the research on controlled thermonuclear fusion (J. Johner). Then follows the technical session about the advanced methods for the study of PWR reactor cores: 1 - the evolutionary approach to study methodologies (M. Lambert, J. Pelet); 2 - the point of view of the nuclear safety authority (D. Brenot); 3 - the improved decoupled methodology for the steam pipe rupture (S. Salvatores, J.Y. Pouliquen); 4 - the MIR method for the pellet-clad interaction (renovated IPG methodology) (E. Baud, C. Royere); 5 - the improved fuel management (IFM) studies for Koeberg (C. Cohen); 6 - principle of the accident study methods implemented for the European pressurized reactor (EPR) (F. Foret, A. Ferrier); 7 - accident studies with the EPR, steam pipe rupture (N. Nicaise, S. Salvatores); 8 - the co-development platform, a new generation of software tools for the new methodologies (C. Chauliac). (J.S.)

  4. A COMPARISON OF STUDY RESULTS OF BUSINESS ENGLISH STUDENTS IN E-LEARNING AND FACE-TO-FACE COURSES

    Directory of Open Access Journals (Sweden)

    Petr Kučera

    2012-09-01

    The paper deals with the comparison of results of students in the lessons of a Business English e-learning course with face-to-face teaching at the Faculty of Economics and Management of the CULS in Prague. E-learning as a method of instruction refers to learning using technology, such as the Internet, CD-ROMs and portable devices. A current trend in university teaching is a particular focus on e-learning methods of study, enhancing the quality and effectiveness of studies and self-studies. In the paper we have analysed the current state in the area of English for Specific Purposes (ESP) e-learning research, pointed out the results of a pilot ESP e-learning course in testing a control and an experimental group of students, and results of questionnaires with views of students on e-learning. The paper focuses on the experimental verification of the e-learning influence on the results of both groups of students. Online study material supports an interactive form of teaching by means of multimedia application. It could be used not only for full-time students but also for distance students and centres of lifelong learning.

  5. Child/Adolescent Anxiety Multimodal Study (CAMS: rationale, design, and methods

    Directory of Open Access Journals (Sweden)

    Waslick Bruce D

    2010-01-01

    Abstract Objective To present the design, methods, and rationale of the Child/Adolescent Anxiety Multimodal Study (CAMS), a recently completed federally-funded, multi-site, randomized placebo-controlled trial that examined the relative efficacy of cognitive-behavior therapy (CBT), sertraline (SRT), and their combination (COMB) against pill placebo (PBO) for the treatment of separation anxiety disorder (SAD), generalized anxiety disorder (GAD) and social phobia (SoP) in children and adolescents. Methods Following a brief review of the acute outcomes of the CAMS trial, as well as the psychosocial and pharmacologic treatment literature for pediatric anxiety disorders, the design and methods of the CAMS trial are described. Results CAMS was a six-year, six-site, randomized controlled trial. Four hundred eighty-eight (N = 488) children and adolescents (ages 7-17 years) with DSM-IV-TR diagnoses of SAD, GAD, or SoP were randomly assigned to one of four treatment conditions: CBT, SRT, COMB, or PBO. Assessments of anxiety symptoms, safety, and functional outcomes, as well as putative mediators and moderators of treatment response, were completed in a multi-measure, multi-informant fashion. Manual-based therapies, trained clinicians and independent evaluators were used to ensure treatment and assessment fidelity. A multi-layered administrative structure with representation from all sites facilitated cross-site coordination of the entire trial, study protocols and quality assurance. Conclusions CAMS offers a model for clinical trials methods applicable to psychosocial and psychopharmacological comparative treatment trials by using state-of-the-art methods and rigorous cross-site quality controls. CAMS also provided a large-scale examination of the relative and combined efficacy and safety of the best evidenced-based psychosocial (CBT) and pharmacologic (SSRI) treatments to date for the most commonly occurring pediatric anxiety disorders. Primary and secondary results

  6. Impact of dental implant insertion method on the peri-implant bone tissue: Experimental study

    Directory of Open Access Journals (Sweden)

    Stamatović Novak

    2013-01-01

    Background/Aim. The function of dental implants depends on their stability in bone tissue over an extended period of time, i.e. on osseointegration. The process through which osseointegration is achieved depends on several factors, the surgical insertion method being one of them. The aim of this study was to histopathologically compare the impact of the surgical method of implant insertion on the peri-implant bone tissue. Methods. The experiment was performed on 9 dogs. Eight weeks following the extraction of lower premolars, implants were inserted using the one-stage method on the right mandibular side and the two-stage method on the left side. Three months after implantation the animals were sacrificed. Three distinct regions of bone tissue were histopathologically analyzed, and the results were scored and compared. Results. In the specimens of one-stage implants, an increased amount of collagen fibers was found in 5 specimens where tissue necrosis was also observed. Only moderate osteoblastic activity was found in 3 sections. The analysis of the bone-to-implant contact region revealed statistically significantly better results regarding the amount of collagen tissue fibers for the implants inserted with the two-stage method (Wa = 59 105, α = 0.05). No necrosis or osteoblastic activity was observed. Conclusion. Better results were achieved by the two-stage method in the bone-to-implant contact region regarding the amount of collagen tissue, while the results were identical regarding osteoblastic activity and bone tissue necrosis. There was no difference between the methods in the bone-implant interface region. In the bone tissue adjacent to the implant the results were identical regarding the amount of collagen tissue, osteoblastic reaction and bone tissue necrosis, while better results were achieved by the two-stage method regarding the number of osteocytes.

  7. Study of classification and disposed method for disused sealed radioactive source in Korea

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Suk Hoon; Kim, Ju Youl; Lee, Seung Hee [FNC Technology Co., Ltd.,Yongin (Korea, Republic of)

    2016-09-15

    In accordance with the classification system of radioactive waste in Korea, all the disused sealed radioactive sources (DSRSs) fall under the category of EW, VLLW or LILW, and should be managed in compliance with the restrictions on the disposal method. In this study, the management and disposal method are derived in consideration of the half-life of the radionuclides contained in the source and the A/D value (i.e. the activity A of the source divided by the D value for the relevant radionuclide, which is used to provide an initial ranking of the relative risk of sources), in addition to the domestic classification scheme and disposal method, based on the characteristic analysis and a review of the management practices of the IAEA and foreign countries. For all the DSRSs that are being stored (as of March 2015) in the centralized temporary disposal facility for radioisotope wastes, the applicability of the derived result is confirmed by performing a characteristic analysis and case studies assessing the quantity and volume of DSRSs to be managed by each method. However, the methodology derived from this study is not applicable to the following sources: i) DSRSs without information on the radioactivity, ii) DSRSs for which it is not possible to calculate the specific activity and/or the source-specific A/D value. Accordingly, it is essential to identify the inherent characteristics of each DSRS prior to implementation of this management and disposal method.
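
    The study's screening logic is not given in detail in the abstract; as a purely schematic illustration of combining half-life with the A/D ratio (the thresholds, routing labels and numeric values below are invented for the example and are not the study's actual criteria), one might write:

      from dataclasses import dataclass

      @dataclass
      class SealedSource:
          nuclide: str
          activity_bq: float      # current activity A
          d_value_bq: float       # nuclide-specific D value used for the A/D ranking
          half_life_days: float

      def screen_source(src: SealedSource) -> str:
          """Schematic screening of a disused sealed source; all thresholds are invented."""
          a_over_d = src.activity_bq / src.d_value_bq
          if src.half_life_days < 100.0 and a_over_d < 0.01:
              return "decay storage followed by clearance"
          if a_over_d < 1.0:
              return "candidate for near-surface disposal"
          return "long-term storage pending intermediate-depth disposal"

      # Invented example values, not real nuclide data
      print(screen_source(SealedSource("Ir-192", activity_bq=4.0e10,
                                       d_value_bq=8.0e10, half_life_days=74.0)))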

  8. A method optimization study for atomic absorption ...

    African Journals Online (AJOL)

    A sensitive, reliable and relatively fast method has been developed for the determination of total zinc in insulin using an atomic absorption spectrophotometer. This designed study was used to optimize the procedures for the existing methods. Spectrograms of both standard and sample solutions of zinc were recorded by measuring ...

  9. A study of various methods for calculating locations of lightning events

    Science.gov (United States)

    Cannon, John R.

    1995-01-01

    This article reports on the results of numerical experiments on finding the location of lightning events using different numerical methods. The methods include linear least squares, nonlinear least squares, statistical estimations, cluster analysis and angular filters and combinations of such techniques. The experiments involved investigations of methods for excluding fake solutions which are solutions that appear to be reasonable but are in fact several kilometers distant from the actual location. Some of the conclusions derived from the study are that bad data produces fakes, that no fool-proof method of excluding fakes was found, that a short base-line interferometer under development at Kennedy Space Center to measure the direction cosines of an event shows promise as a filter for excluding fakes. The experiments generated a number of open questions, some of which are discussed at the end of the report.
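
    As an illustration of the nonlinear least squares approach mentioned above (a generic time-of-arrival formulation with assumed sensor positions and synthetic data, not the data set of the study), the source location can be fitted by minimizing arrival-time residuals:

      import numpy as np
      from scipy.optimize import least_squares

      C = 3.0e8  # propagation speed, m/s

      def arrival_residuals(params, sensors, arrival_times):
          """Residuals between measured and predicted arrival times for a trial source."""
          x, y, t0 = params                                  # source position and emission time
          distances = np.linalg.norm(sensors - np.array([x, y]), axis=1)
          return arrival_times - (t0 + distances / C)

      # Assumed sensor layout (m) and synthetic arrival times (s) for a source at (4000, 7000)
      sensors = np.array([[0.0, 0.0], [10000.0, 0.0], [0.0, 10000.0], [10000.0, 10000.0]])
      true_source = np.array([4000.0, 7000.0])
      times = np.linalg.norm(sensors - true_source, axis=1) / C   # emission time t0 = 0

      fit = least_squares(arrival_residuals, x0=[5000.0, 5000.0, 0.0],
                          args=(sensors, times))
      print(fit.x[:2])   # recovered source position, close to (4000, 7000)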

  10. New method for the study and control of crystal growth: dilatometry under thermal gradient. Theory and application

    International Nuclear Information System (INIS)

    Potard, C.

    1975-01-01

    A new method was developed to study and control solidification processes by means of differential dilatometry. A mathematical analysis of this method is made and first results are presented. A relation is established between the variations of the volume of the sample and those of the solid obtained. The gravimetric method used for volume measurement is also analyzed mathematically. These results are applied to two solidification experiments on InSb, in strongly perturbed and in controlled cooling regimes. Details are given on the limits of this method, and further developments towards phase transformation studies and control are envisaged [fr

  11. Study of Soil Decontamination Method Using Supercritical Carbon Dioxide and TBP

    International Nuclear Information System (INIS)

    Park, Jihye; Park, Kwangheon; Jung, Wonyoung

    2014-01-01

    The result of this study points to a possible new method for cheap and less wasteful nuclear waste decontamination. When severe accidents such as the incident at the Fukushima nuclear site occur, the soil near the power plant is contaminated with fission products or by the activated metal structures of the power plant. The form of soil pollution depends on the environment and soil characteristics of the contaminated areas. Thus, a single decontamination method is not effective for site cleanup. In addition, some soil decontamination methods are expensive and generate large amounts of secondary waste. Therefore, we need new soil decontamination methods. In this study, instead of using a conventional solvent method that generates secondary waste, supercritical carbon dioxide was used to remove metal ions from the soil. Supercritical carbon dioxide is known for good permeation characteristics. We expect that this will reduce the cost of soil pollution management. Supercritical carbon dioxide can decontaminate soil easily, as it has the ability to penetrate even narrow gaps with very good moisture permeability. We used TBP, which is a known extractant for actinide metals. TBP is usually used for uranium and strontium extraction. Using a TBP-HNO3 complex and supercritical carbon dioxide, we carried out extraction experiments for several heavy metals in contaminated soil.

  12. Study of Soil Decontamination Method Using Supercritical Carbon Dioxide and TBP

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jihye; Park, Kwangheon; Jung, Wonyoung [Kyunghee Univ., Yongin (Korea, Republic of)

    2014-05-15

    The result of this study points to a possible new method for cheap and less wasteful nuclear waste decontamination. When severe accidents such as the incident at the Fukushima nuclear site occur, the soil near the power plant is contaminated with fission products or by the activated metal structures of the power plant. The form of soil pollution depends on the environment and soil characteristics of the contaminated areas. Thus, a single decontamination method is not effective for site cleanup. In addition, some soil decontamination methods are expensive and generate large amounts of secondary waste. Therefore, we need new soil decontamination methods. In this study, instead of using a conventional solvent method that generates secondary waste, supercritical carbon dioxide was used to remove metal ions from the soil. Supercritical carbon dioxide is known for good permeation characteristics. We expect that this will reduce the cost of soil pollution management. Supercritical carbon dioxide can decontaminate soil easily, as it has the ability to penetrate even narrow gaps with very good moisture permeability. We used TBP, which is a known extractant for actinide metals. TBP is usually used for uranium and strontium extraction. Using a TBP-HNO3 complex and supercritical carbon dioxide, we carried out extraction experiments for several heavy metals in contaminated soil.

  13. Preparation of Samples for Leaf Architecture Studies, A Method for Mounting Cleared Leaves

    Directory of Open Access Journals (Sweden)

    Alejandra Vasco

    2014-09-01

    Premise of the study: Several recent waves of interest in leaf architecture have shown an expanding range of approaches and applications across a number of disciplines. Despite this increased interest, examination of existing archives of cleared and mounted leaves shows that current methods for mounting, in particular, yield unsatisfactory results and deterioration of samples over relatively short periods. Although techniques for clearing and staining leaves are numerous, published techniques for mounting leaves are scarce. Methods and Results: Here we present a complete protocol and recommendations for clearing, staining, and imaging leaves, and, most importantly, a method to permanently mount cleared leaves. Conclusions: The mounting protocol is faster than other methods, inexpensive, and straightforward; moreover, it yields clear and permanent samples that can easily be imaged, scanned, and stored. Specimens mounted with this method preserve well, with leaves that were mounted more than 35 years ago showing no signs of bubbling or discoloration.

  14. A Comparative Study of Potential Evapotranspiration Estimation by Eight Methods with FAO Penman–Monteith Method in Southwestern China

    Directory of Open Access Journals (Sweden)

    Dengxiao Lang

    2017-09-01

    Potential evapotranspiration (PET) is crucial for water resources assessment. In this regard, the FAO (Food and Agriculture Organization) Penman–Monteith method (PM) is commonly recognized as the standard method for PET estimation. However, due to the requirement of detailed meteorological data, the application of PM is often constrained in many regions. Under such circumstances, an alternative method with similar efficiency to that of PM needs to be identified. In this study, three radiation-based methods, Makkink (Mak), Abtew (Abt), and Priestley–Taylor (PT), and five temperature-based methods, Hargreaves–Samani (HS), Thornthwaite (Tho), Hamon (Ham), Linacre (Lin), and Blaney–Criddle (BC), were compared with PM at yearly and seasonal scale, using long-term (50 years) data from 90 meteorology stations in southwest China. Indicators, viz. (videlicet) Nash–Sutcliffe efficiency (NSE), relative error (Re), normalized root mean squared error (NRMSE), and coefficient of determination (R2), were used to evaluate the performance of PET estimations by the above-mentioned eight methods. The results showed that the performance of the methods in PET estimation varied among regions; HS, PT, and Abt overestimated PET, while the others underestimated it. In the Sichuan basin, Mak, Abt and HS yielded similar estimations to that of PM, while, in the Yun-Gui plateau, Abt, Mak, HS, and PT showed better performances. Mak performed the best in the east Tibetan Plateau at yearly and seasonal scale, while HS showed a good performance in summer and autumn. In the arid river valley, HS, Mak, and Abt performed better than the others. On the other hand, Tho, Ham, Lin, and BC could not be used to estimate PET in some regions. In general, radiation-based methods for PET estimation performed better than temperature-based methods among the selected methods in the study area. Among the radiation-based methods, Mak performed the best, while HS showed the best performance among the temperature
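
    Of the temperature-based methods compared, Hargreaves–Samani is a convenient example because it needs only air temperature and extraterrestrial radiation. A sketch of the commonly cited form of the equation follows; the unit handling (0.408 conversion factor) and the example values are assumptions of this illustration.

      def hargreaves_samani_pet(t_mean, t_max, t_min, ra_mj_m2_day):
          """Daily reference PET (mm/day) by the Hargreaves-Samani formulation.

          t_mean, t_max, t_min : daily air temperatures in degrees C
          ra_mj_m2_day         : extraterrestrial radiation in MJ m-2 day-1
          The 0.408 factor converts MJ m-2 day-1 to mm of evaporated-water equivalent.
          """
          return 0.0023 * 0.408 * ra_mj_m2_day * (t_mean + 17.8) * (t_max - t_min) ** 0.5

      print(round(hargreaves_samani_pet(t_mean=22.0, t_max=28.0, t_min=16.0,
                                        ra_mj_m2_day=38.0), 2))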

  15. A comparison between NASCET and ECST methods in the study of carotids

    International Nuclear Information System (INIS)

    Saba, Luca; Mallarini, Giorgio

    2010-01-01

    Purpose: The NASCET and ECST systems quantify carotid artery stenosis using percent diameter ratios from conventional angiography. With the use of multi-detector-row CT scanners it is possible to easily measure plaque area and residual lumen in order to calculate the degree of carotid stenosis. Our purpose was to compare the NASCET and ECST techniques in the measurement of the degree of carotid stenosis by using MDCTA. Methods and material: From February 2007 to October 2007, 83 non-consecutive patients (68 males; 15 females) were studied using multi-detector-row CT. Each patient was assessed by two experienced radiologists for the degree of stenosis by using both the NASCET and ECST methods. Statistical analysis was performed to determine the strength of correlation (Pearson's method) between NASCET and ECST. The Cohen kappa test and Bland-Altman analysis were applied to assess the level of inter- and intra-observer agreement. Results: The Pearson correlation coefficient between NASCET and ECST was 0.962 (p < 0.01). Intra-observer agreement in the NASCET evaluation, using the Cohen statistic, was 0.844 and 0.825. Intra-observer agreement in the ECST evaluation was 0.871 and 0.836. Inter-observer agreement in the NASCET and ECST evaluations was 0.822 and 0.834, respectively. Agreement analysis using Bland-Altman plots showed good intra-/inter-observer agreement for NASCET and optimal intra-/inter-observer agreement for ECST. Conclusions: The results of our study suggest that the NASCET and ECST methods show a strong correlation according to quadratic regression. Intra-observer agreement was high for both NASCET and ECST.
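
    For context, the two systems express the same measurement with different reference diameters: NASCET compares the minimal residual lumen with the disease-free distal internal carotid diameter, while ECST compares it with the estimated original vessel diameter at the level of the stenosis. A small sketch of those ratios follows; the example values are invented.

      def nascet_percent(minimal_lumen_mm, distal_ica_mm):
          """NASCET stenosis: reference is the disease-free distal internal carotid artery."""
          return (1.0 - minimal_lumen_mm / distal_ica_mm) * 100.0

      def ecst_percent(minimal_lumen_mm, estimated_local_diameter_mm):
          """ECST stenosis: reference is the estimated original vessel diameter at the stenosis."""
          return (1.0 - minimal_lumen_mm / estimated_local_diameter_mm) * 100.0

      # Invented example: a 1.5 mm residual lumen
      print(round(nascet_percent(1.5, 5.0), 1))  # ~70.0 % by NASCET
      print(round(ecst_percent(1.5, 8.0), 1))    # ~81.2 % by ECST (same lesion, larger reference)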

  16. Effect of dactyloscopic powders on DNA profiling from enhanced fingerprints: results from an experimental study.

    Science.gov (United States)

    Tozzo, Pamela; Giuliodori, Alice; Rodriguez, Daniele; Caenazzo, Luciana

    2014-03-01

    We conducted a study on the effect of fingerprint enhancement methods on subsequent short tandem repeat profiling. First, we performed a typing study on blood traces deposited on 5 different surfaces and treated with 8 types of dactyloscopic powders. Three different DNA extraction methods were used. Subsequently, we analyzed latent fingerprints on the same 5 surfaces enhanced with the 8 different powders used in the first part of the study. This study has demonstrated that DNA profiling can be performed on fingerprints left on different substrates, and that the substrate affects the amount of DNA that can be recovered for DNA typing. In the first phase of the study, a profile was obtained in 92% of the 120 samples analyzed; in the second part, a profile was obtained in 55% of the 80 samples analyzed and was complete in 32.5% of the cases. From the results obtained, it seems that the powders used in latent fingerprint enhancement, rather than having a direct inhibitory effect on the extraction and amplification of DNA, may cause partial degradation of DNA, reducing the efficiency of the amplification reaction. It should not be forgotten that these results were obtained under laboratory conditions, and in real casework there may still be different problems involved.

  17. When Educational Material Is Delivered: A Mixed Methods Content Validation Study of the Information Assessment Method.

    Science.gov (United States)

    Badran, Hani; Pluye, Pierre; Grad, Roland

    2017-03-14

    The Information Assessment Method (IAM) allows clinicians to report the cognitive impact, clinical relevance, intention to use, and expected patient health benefits associated with clinical information received by email. More than 15,000 Canadian physicians and pharmacists use the IAM in continuing education programs. In addition, information providers can use IAM ratings and feedback comments from clinicians to improve their products. Our general objective was to validate the IAM questionnaire for the delivery of educational material (ecological and logical content validity). Our specific objectives were to measure the relevance and evaluate the representativeness of IAM items for assessing information received by email. A 3-part mixed methods study was conducted (convergent design). In part 1 (quantitative longitudinal study), the relevance of IAM items was measured. Participants were 5596 physician members of the Canadian Medical Association who used the IAM. A total of 234,196 ratings were collected in 2012. The relevance of IAM items with respect to their main construct was calculated using descriptive statistics (relevance ratio R). In part 2 (qualitative descriptive study), the representativeness of IAM items was evaluated. A total of 15 family physicians completed semistructured face-to-face interviews. For each construct, we evaluated the representativeness of IAM items using a deductive-inductive thematic qualitative data analysis. In part 3 (mixing quantitative and qualitative parts), results from quantitative and qualitative analyses were reviewed, juxtaposed in a table, discussed with experts, and integrated. Thus, our final results are derived from the views of users (ecological content validation) and experts (logical content validation). Of the 23 IAM items, 21 were validated for content, while 2 were removed. In part 1 (quantitative results), 21 items were deemed relevant, while 2 items were deemed not relevant (R=4.86% [N=234,196] and R=3.04% [n

  18. in the health service sector – results of literature study

    Directory of Open Access Journals (Sweden)

    Irena Sobańska

    2015-12-01

    Full Text Available The aim of this paper is to provide a review of the existing literature related to the directions of change from thepoint of view of the influence that lean approach has for management and accounting in health care institutions.The article is an account of the content of the selected 19 papers (from more than 200 analyzed published in thefield within the period 1995–2013. The investigation of the literature was conducted in two basic perspectives:theoretical considerations and results of empirical research (case study, questionnaire survey.The method of literature analysis was applied for the realization of the aim formulated in the paper. Twogroups of articles were the object of the analysis: theoretical and presenting explanatory results of empiricalinvestigations.The lean approach, which originated in the motor industry (production factories, is fully suitable for use inhealthcare organizations operating in various cultural contexts, and for reforming national healthcare systems toincrease their efficiency. The spreading and adoption of the lean concept in the medical services sector has anevolutionary character, similarly to the earlier spread of lean in manufacturing industries.

  19. Methods and findings of the SNR study

    International Nuclear Information System (INIS)

    Koeberlein, K.; Schaefer, H.; Spindler, H.

    1983-01-01

    A fact-finding committee of the German Federal Parliament in July 1980 recommended performing a ''risk-oriented study'' of the SNR-300, the German 300 MW fast breeder prototype reactor under construction in Kalkar. The main aim of this study was to allow a comparative safety evaluation between the SNR-300 and a modern PWR, and thus to prepare a basis for a political decision on the SNR-300. The methods and main results of the study are presented in this paper. In the first step of the risk analysis, six groups of accidents were identified which may initiate core destruction. These groups comprise all conceivable accident courses potentially leading to core destruction. By reliability analyses, the expected frequency of each group has been calculated. In the accident analysis, potential failure modes of the reactor tank have been investigated. Core destruction may be accompanied by the release of significant amounts of mechanical energy. The primary coolant system of the SNR-300 is designed to withstand mechanical energy releases up to 370 MJ. Design features make it possible to cool the molten core inside the reactor tank. (orig./RW) [de

  20. Studying Cannabis Use Behaviors With Facebook and Web Surveys: Methods and Insights.

    Science.gov (United States)

    Borodovsky, Jacob T; Marsch, Lisa A; Budney, Alan J

    2018-05-02

    The rapid and wide-reaching expansion of internet access and digital technologies offers epidemiologists numerous opportunities to study health behaviors. One particularly promising new data collection strategy is the use of Facebook's advertising platform in conjunction with Web-based surveys. Our research team at the Center for Technology and Behavioral Health has used this quick and cost-efficient method to recruit large samples and address unique scientific questions related to cannabis use. In conducting this research, we have gleaned several insights for using this sampling method effectively and have begun to document the characteristics of the resulting data. We believe this information could be useful to other researchers attempting to study cannabis use or, potentially, other health behaviors. The first aim of this paper is to describe case examples of procedures for using Facebook as a survey sampling method for studying cannabis use. We then present several distinctive features of the data produced using this method. Finally, we discuss the utility of this sampling method for addressing specific types of epidemiological research questions. Overall, we believe that sampling with Facebook advertisements and Web surveys is best conceptualized as a targeted, nonprobability-based method for oversampling cannabis users across the United States. ©Jacob T Borodovsky, Lisa A Marsch, Alan J Budney. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 02.05.2018.

  1. Review of studies on criticality safety evaluation and criticality experiment methods

    International Nuclear Information System (INIS)

    Naito, Yoshitaka; Yamamoto, Toshihiro; Misawa, Tsuyoshi; Yamane, Yuichi

    2013-01-01

    Since the early 1960s, many studies on criticality safety evaluation have been conducted in Japan. Computer code systems were developed initially by employing finite difference methods, and more recently by using Monte Carlo methods. Criticality experiments have also been carried out in many laboratories in Japan as well as overseas. By effectively using these study results, the Japanese Criticality Safety Handbook was published in 1988, almost at the midpoint of the last 50 years. An increased interest has been shown in criticality safety studies, and a Working Party on Nuclear Criticality Safety (WPNCS) was set up by the Nuclear Science Committee of the Organisation for Economic Co-operation and Development in 1997. WPNCS has several task forces, in charge of the International Criticality Safety Benchmark Evaluation Program (ICSBEP), Subcritical Measurement, Experimental Needs, Burn-up Credit Studies and Minimum Critical Values, respectively. Criticality safety studies in Japan have been carried out in cooperation with WPNCS. This paper describes criticality safety study activities in Japan along with the contents of the Japanese Criticality Safety Handbook and the tasks of WPNCS. (author)

  2. Calculation of the energy provided by a PV generator. Comparative study: Conventional methods vs. artificial neural networks

    International Nuclear Information System (INIS)

    Almonacid, F.; Rus, C.; Perez-Higueras, P.; Hontoria, L.

    2011-01-01

    The use of photovoltaics for electricity generation purposes has recorded one of the largest increases in the field of renewable energies. The energy production of a grid-connected PV system depends on various factors. In a broad sense, it is considered that the annual energy provided by a generator is directly proportional to the annual radiation incident on the plane of the generator and to the installed nominal power. However, a range of factors influences the expected outcome by reducing the generation of energy. The aim of this study is to compare the results of four different methods for estimating the annual energy produced by a PV generator: three of them are classical methods and the fourth is based on an artificial neural network developed by the R and D Group for Solar and Automatic Energy at the University of Jaen. The results obtained show that the method based on an artificial neural network provides better results than the alternative classical methods under study, mainly due to the fact that this method also takes into account some second-order effects, such as low irradiance, angular and spectral effects. -- Research highlights: → It is considered that the annual energy provided by a PV generator is directly proportional to the annual radiation incident on the plane of the generator and to the installed nominal power. → A range of factors influences the expected outcome by reducing the generation of energy (mismatch losses, dirt and dust, Ohmic losses, ...). → The aim of this study is to compare the results of four different methods for estimating the annual energy produced by a PV generator: three of them are classical methods and the fourth is based on an artificial neural network. → The results obtained show that the method based on an artificial neural network provides better results than the alternative classical methods under study. While classical methods have only taken into account temperature losses, the method based in
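
    The abstract does not describe the network architecture; as a generic illustration of the kind of regression model involved (library choice, input variables, layer sizes and the synthetic data below are all assumptions), an irradiation-and-temperature-to-energy mapping could be fitted as follows:

      import numpy as np
      from sklearn.neural_network import MLPRegressor
      from sklearn.preprocessing import StandardScaler
      from sklearn.pipeline import make_pipeline

      # Synthetic training data: daily in-plane irradiation (kWh/m2), ambient temperature (C)
      rng = np.random.default_rng(0)
      irradiation = rng.uniform(0.5, 8.0, 500)
      temperature = rng.uniform(-5.0, 40.0, 500)
      # Toy "measured" daily energy for a 1 kWp array, with losses growing at high temperature
      energy = 0.8 * irradiation * (1 - 0.004 * np.clip(temperature - 25.0, 0, None)) \
               + rng.normal(0, 0.05, 500)

      X = np.column_stack([irradiation, temperature])
      model = make_pipeline(StandardScaler(),
                            MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000,
                                         random_state=0))
      model.fit(X, energy)

      print(model.predict([[5.5, 30.0]]))   # estimated daily energy (kWh) for one condition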

  3. Comparison of microstickies measurement methods. Part II, Results and discussion

    Science.gov (United States)

    Mahendra R. Doshi; Angeles Blanco; Carlos Negro; Concepcion Monte; Gilles M. Dorris; Carlos C. Castro; Axel Hamann; R. Daniel Haynes; Carl Houtman; Karen Scallon; Hans-Joachim Putz; Hans Johansson; R. A. Venditti; K. Copeland; H.-M. Chang

    2003-01-01

    In part I of the article we discussed sample preparation procedure and described various methods used for the measurement of microstickies. Some of the important features of different methods are highlighted in Table 1. Temperatures used in the measurement methods vary from room temperature in some cases, 45 °C to 65 °C in other cases. Sample size ranges from as low as...

  4. ABCD Matrix Method a Case Study

    CERN Document Server

    Seidov, Zakir F; Yahalom, Asher

    2004-01-01

    In the Israeli Electrostatic Accelerator FEL, the distance between the accelerator's end and the wiggler's entrance is about 2.1 m, and the 1.4 MeV electron beam is transported through this space using four similar quadrupoles (FODO channel). The transfer matrix method (ABCD matrix method) was used for simulating the beam transport; a set of programs was written in several programming languages (MATHEMATICA, MATLAB, MATHCAD, MAPLE) and reasonable agreement is demonstrated between experimental results and simulations. Comparison of the ABCD matrix method with direct "numerical experiments" using the EGUN, ELOP, and GPT programs, with and without taking into account space-charge effects, showed the agreement to be good enough as well. Also the inverse problem of finding the emittance of the electron beam at the S1 screen position (before the FODO channel), by using the spot image at the S2 screen position (after the FODO channel) as a function of quad currents, is considered. Spot and beam at both screens are described as tilted eel...
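
    For readers unfamiliar with the technique, the transfer (ABCD) matrix method propagates the transverse phase-space vector (x, x') by multiplying 2x2 matrices for each beamline element. The sketch below chains thin-lens quadrupoles and drifts into a FODO-like channel; lengths and focal lengths are illustrative assumptions, not the parameters of the Israeli FEL beamline.

      import numpy as np

      def drift(length_m):
          """ABCD matrix of a field-free drift space."""
          return np.array([[1.0, length_m],
                           [0.0, 1.0]])

      def thin_quad(focal_m):
          """Thin-lens quadrupole in one transverse plane (focusing if focal_m > 0)."""
          return np.array([[1.0, 0.0],
                           [-1.0 / focal_m, 1.0]])

      # Illustrative FODO-like cell: focusing quad, drift, defocusing quad, drift
      elements = [thin_quad(+0.4), drift(0.5), thin_quad(-0.4), drift(0.5)]

      # Total transfer matrix: later elements multiply from the left
      M = np.eye(2)
      for element in elements:
          M = element @ M

      x0 = np.array([0.002, 0.001])   # initial offset (m) and slope (rad)
      print(M @ x0)                   # transverse state at the end of the cell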

  5. Project Oriented Immersion Learning: Method and Results

    DEFF Research Database (Denmark)

    Icaza, José I.; Heredia, Yolanda; Borch, Ole M.

    2005-01-01

    A pedagogical approach called “project oriented immersion learning” is presented and tested on a graduate online course. The approach combines the Project Oriented Learning method with immersion learning in a virtual enterprise. Students assumed the role of authors hired by a fictitious publishing...... house that develops digital products including e-books, tutorials, web sites and so on. The students defined the problem that their product was to solve; choose the type of product and the content; and built the product following a strict project methodology. A wiki server was used as a platform to hold...

  6. Results of a comparative study on insulin radioimmunoassay in 36 Italian laboratories

    Energy Technology Data Exchange (ETDEWEB)

    Costantini, A; Lostia, O; Malvano, R; Rolleri, E; Taggi, F; Zucchelli, G C [Istituto Superiore di Sanita, Rome (Italy); Consiglio Nazionale delle Ricerche, Pisa (Italy))

    1975-12-01

    An interlaboratory study in which the insulin contents of five plasma samples were estimated by 36 Italian laboratories was coordinated by the Istituto Superiore di Sanita (National Institute of Health) and the Consiglio Nazionale delle Ricerche (National Research Council). A rather large between-laboratory variability resulted, though the ranking of samples according to their insulin concentrations was practically the same. A significant dependence of the estimates on the method used was established. The analysis of the data, aimed at defining the possible reasons for the assay variability, is reported and discussed.

  7. Bioanalysis works in the IAA AMS facility: Comparison of AMS analytical method with LSC method in human mass balance study

    International Nuclear Information System (INIS)

    Miyaoka, Teiji; Isono, Yoshimi; Setani, Kaoru; Sakai, Kumiko; Yamada, Ichimaro; Sato, Yoshiaki; Gunji, Shinobu; Matsui, Takao

    2007-01-01

    Institute of Accelerator Analysis Ltd. (IAA) is the first Contract Research Organization in Japan providing Accelerator Mass Spectrometry (AMS) analysis services for carbon dating and bioanalysis work. The 3 MV AMS machines are maintained with validated analysis methods using multiple control compounds. It is confirmed that these AMS systems have sufficient reliability and sensitivity for each objective. The graphitization of samples for bioanalysis is carried out by our own purification lines, including automatic measurement of the total carbon content in the sample. In this paper, we present the use of AMS analysis in human mass balance and metabolism profiling studies with the IAA 3 MV AMS, comparing results obtained from the same samples with liquid scintillation counting (LSC). Human samples such as plasma, urine and feces were obtained from four healthy volunteers orally administered the 14C-labeled drug Y-700, a novel xanthine oxidase inhibitor, of which the radioactivity was about 3 MBq (85 μCi). For AMS measurement, these samples were diluted 100-10,000-fold with pure water or blank samples. The results indicated that the AMS method had a good correlation with the LSC method (e.g. plasma: r = 0.998, urine: r = 0.997, feces: r = 0.997), and that the drug recovery in the excreta exceeded 92%. The metabolite profiles of plasma, urine and feces obtained with HPLC-AMS corresponded to radio-HPLC results measured at a much higher radioactivity level. These results revealed that AMS analysis at IAA is useful for measuring 14C concentration in bioanalysis studies at very low radioactivity levels.

  8. Leveraging social and digital media for participant recruitment: A review of methods from the Bayley Short Form Formative Study

    OpenAIRE

    Burke-Garcia, Amelia; Mathew, Sunitha

    2017-01-01

    Introduction Social media is increasingly being used in research, including recruitment. Methods For the Bayley Short Form Formative Study, which was conducted under the National Children’s Study, traditional methods of recruitment proved to be ineffective. Therefore, digital media were identified as potential channels for recruitment. Results Results included successful recruitment of over 1800 infant and toddler participants to the Study. Conclusions This paper outlines the methods, res...

  9. First characterization of the expiratory flow increase technique: method development and results analysis

    International Nuclear Information System (INIS)

    Maréchal, L; Barthod, C; Jeulin, J C

    2009-01-01

    This study provides an important contribution to the definition of the expiratory flow increase technique (EFIT). So far, no measuring means were suited to assessing the manual EFIT performed on infants. The proposed method aims at objectively defining the EFIT based on the quantification of pertinent cognitive parameters used by physiotherapists when practicing. We designed and realized customized instrumented gloves endowed with pressure and displacement sensors, together with the associated electronics and software. This new system is specific to the manoeuvre and to the user, and innocuous for the patient. Data were collected and analysed on infants with bronchiolitis managed by an expert physiotherapist. The analysis presented was performed on a group of seven subjects (mean age: 6.1 months, SD: 1.1; mean chest circumference: 44.8 cm, SD: 1.9). The results are consistent with the physiotherapist's tactility. In spite of inevitable variability due to measurements on infants, repeatable quantitative data could be reported regarding the manoeuvre characteristics: the magnitudes of displacements do not exceed 10 mm on either hand; the movement of the thoracic hand is more vertical than the movement of the abdominal hand; the maximum pressure applied with the thoracic hand is about twice as high as with the abdominal hand; the thrust of the manual compression lasts (590 ± 62) ms. Inter-operator measurements are in progress in order to generalize these results.

  10. Methods for environmental change; an exploratory study

    Directory of Open Access Journals (Sweden)

    Kok Gerjo

    2012-11-01

    Abstract Background While the interest of health promotion researchers in change methods directed at the target population has a long tradition, interest in change methods directed at the environment is still developing. In this survey, the focus is on methods for environmental change; especially on how these are composed of methods for individual change (‘Bundling’) and on how, within one environmental level, organizations, methods differ when directed at the management (‘At’) or applied by the management (‘From’). Methods The first part of this online survey dealt with examining the ‘bundling’ of individual-level methods into methods at the environmental level. The question asked was to what extent the use of an environmental-level method would involve the use of certain individual-level methods. In the second part of the survey the question was whether there are differences between applying methods directed ‘at’ an organization (for instance, by a health promoter) versus ‘from’ within an organization itself. All of the 20 respondents are experts in the field of health promotion. Results Methods at the individual level are frequently bundled together as part of a method at a higher ecological level. A number of individual-level methods are popular as part of most of the environmental-level methods, while others are not chosen very often. Interventions directed at environmental agents often have a strong focus on the motivational part of behavior change. There are different approaches to targeting a level or being targeted from a level. The health promoter will use combinations of motivation and facilitation. The manager will use individual-level change methods focusing on self-efficacy and skills. Respondents think that any method may be used under the right circumstances, although few endorsed coercive methods. Conclusions Taxonomies of theoretical change methods for environmental change should include combinations of individual

  11. Realism and Pragmatism in a mixed methods study.

    Science.gov (United States)

    Allmark, Peter; Machaczek, Katarzyna

    2018-06-01

    A discussion of how adopting a Realist rather than a Pragmatist methodology affects the conduct of mixed methods research. Mixed methods approaches are now extensively employed in nursing and other healthcare research. At the same time, Realist methodology is increasingly used as the philosophical underpinning of research in these areas. However, the standard philosophical underpinning of mixed methods research is Pragmatism, which is generally considered incompatible, or at least at odds, with Realism. This paper argues that Realism can be used as the basis of mixed methods research and that doing so carries advantages over using Pragmatism. A mixed methods study of patient handover reports is used to illustrate how Realism affected its design and how the design would have differed had a Pragmatist approach been taken. Design: discussion paper. Data sources: Philosophers Index; Google Scholar. Those undertaking mixed methods research should consider the use of Realist methodology, with the addition of some insights from Pragmatism concerning the start and end points of enquiry. Realism is a plausible alternative methodology for those undertaking mixed methods studies. © 2018 John Wiley & Sons Ltd.

  12. Methods used by Elsam for monitoring precision and accuracy of analytical results

    Energy Technology Data Exchange (ETDEWEB)

    Hinnerskov Jensen, J [Soenderjyllands Hoejspaendingsvaerk, Faelleskemikerne, Aabenraa (Denmark)

    1996-12-01

    Performing round robins at regular intervals is the primary method used by Elsam for monitoring the precision and accuracy of analytical results. The first round robin was started in 1974, and today 5 round robins are running. These are focused on: boiler water and steam, lubricating oils, coal, ion chromatography and dissolved gases in transformer oils. Besides the power plant laboratories in Elsam, the participants are power plant laboratories from the rest of Denmark, industrial and commercial laboratories in Denmark, and finally foreign laboratories. The calculated standard deviations or reproducibilities are compared with acceptable values. These values originate from ISO, ASTM and the like, or from our own experience. Besides providing the laboratories with a tool to check their momentary performance, the round robins are very suitable for evaluating systematic developments on a long-term basis. By splitting up the uncertainty according to methods, sample preparation/analysis, etc., knowledge can be extracted from the round robins for use in many other situations. (au)

  13. A mixed-methods study into ballet for people living with Parkinson's

    Science.gov (United States)

    Houston, Sara; McGill, Ashley

    2012-01-01

    Background: Parkinson's is a neurological disease that is physically debilitating and can be socially isolating. Dance is growing in popularity for people with Parkinson's and claims have been made for its benefits. The paper details a mixed-methods study that examined a 12-week dance project for people with Parkinson's, led by English National Ballet. Methods: The effects on balance, stability and posture were measured through the Fullerton Advanced Balance Scale and a plumb-line analysis. The value of participation and movement quality were interpreted through ethnographic methods, grounded theory and Effort analysis. Results: Triangulation of results indicates that people were highly motivated, with 100% adherence, and valued the classes as an important part of their lives. Additionally, results indicated an improvement in balance and stability, although not in posture. Conclusions: Dancing may offer benefit to people with Parkinson's through its intellectual, artistic, social and physical aspects. The paper suggests that a range of research methods is fundamental to capture the importance of multifaceted activity, such as dance, to those with Parkinson's. PMID:23805165

  14. Bulk Electric Load Cost Calculation Methods: Iraqi Network Comparative Study

    Directory of Open Access Journals (Sweden)

    Qais M. Alias

    2016-09-01

    Full Text Available It is vital in any industry to recover the capital spent plus running costs and a margin of profit for the industry to flourish. The electricity industry, which touches everyday life, follows the same financial-economic strategy. Cost allocation is a major issue in all sectors of the electric industry, viz. generation, transmission and distribution. Generation and distribution service costing is well documented in the literature, while the transmission share is still in need of research. In this work, the cost of supplying a bulk electric load connected to the EHV system is calculated. A simple lump-average method is first used to provide a rough costing guide. In addition, two transmission pricing methods are employed, namely the postage-stamp method and the load-flow-based MW-distance method, to calculate the transmission share in the total cost of each individual bulk load. The results of the three costing methods are then analyzed and compared for the 400 kV Iraqi power grid considered as a case study.
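
    The two transmission pricing approaches named above can be illustrated with a toy allocation. The sketch below is a generic, simplified rendering under assumed loads, line lengths and costs; it is not the data or code of the Iraqi grid study, and in practice the flow contributions for the MW-distance method would come from repeated load-flow runs.

```python
# Hedged sketch: postage-stamp vs. a simplified MW-distance (MW-mile) allocation.
# All numbers and names are illustrative, not taken from the Iraqi grid study.

def postage_stamp(total_cost, load_mw):
    """Allocate total transmission cost in proportion to each load's MW share."""
    system_mw = sum(load_mw.values())
    return {bus: total_cost * mw / system_mw for bus, mw in load_mw.items()}

def mw_mile(line_costs, line_lengths, flows_per_load):
    """Allocate each line's cost in proportion to |MW flow x length| caused by each load.

    flows_per_load[bus][line] = MW flow on `line` attributable to `bus`
    (e.g. obtained from repeated load-flow runs with one load at a time)."""
    shares = {bus: 0.0 for bus in flows_per_load}
    for line, cost in line_costs.items():
        usage = {bus: abs(f.get(line, 0.0)) * line_lengths[line]
                 for bus, f in flows_per_load.items()}
        total = sum(usage.values()) or 1.0
        for bus, u in usage.items():
            shares[bus] += cost * u / total
    return shares

if __name__ == "__main__":
    loads = {"bulk_A": 300.0, "bulk_B": 700.0}           # MW, illustrative
    print(postage_stamp(1_000_000.0, loads))             # cost split 30% / 70%

    lines = {"L1": 120_000.0, "L2": 180_000.0}           # annual line costs
    lengths = {"L1": 80.0, "L2": 120.0}                  # km
    flows = {"bulk_A": {"L1": 90.0, "L2": 30.0},         # MW attributable to each load
             "bulk_B": {"L1": 60.0, "L2": 210.0}}
    print(mw_mile(lines, lengths, flows))
```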

  15. Multivariate Methods for Meta-Analysis of Genetic Association Studies.

    Science.gov (United States)

    Dimou, Niki L; Pantavou, Katerina G; Braliou, Georgia G; Bagos, Pantelis G

    2018-01-01

    Multivariate meta-analysis of genetic association studies and genome-wide association studies has received remarkable attention because it improves the precision of the analysis. Here, we review, summarize and present in a unified framework methods for multivariate meta-analysis of genetic association studies and genome-wide association studies. Starting with the statistical methods used for robust analysis and genetic model selection, we briefly present univariate methods for meta-analysis and then scrutinize multivariate methodologies. Multivariate models of meta-analysis for single gene-disease association studies, including models for haplotype association studies, multiple linked polymorphisms and multiple outcomes, are discussed. The popular Mendelian randomization approach and special cases of meta-analysis addressing issues such as the assumed mode of inheritance, deviation from Hardy-Weinberg equilibrium and gene-environment interactions are also presented. All available methods are illustrated with practical applications, and methodologies that could be developed in the future are discussed. Links to all available software implementing multivariate meta-analysis methods are also provided.

  16. A Mixed Methods Sampling Methodology for a Multisite Case Study

    Science.gov (United States)

    Sharp, Julia L.; Mobley, Catherine; Hammond, Cathy; Withington, Cairen; Drew, Sam; Stringfield, Sam; Stipanovic, Natalie

    2012-01-01

    The flexibility of mixed methods research strategies makes such approaches especially suitable for multisite case studies. Yet the utilization of mixed methods to select sites for these studies is rarely reported. The authors describe their pragmatic mixed methods approach to select a sample for their multisite mixed methods case study of a…

  17. Justice Is the Missing Link in One Health: Results of a Mixed Methods Study in an Urban City State.

    Science.gov (United States)

    Lysaght, Tamra; Capps, Benjamin; Bailey, Michele; Bickford, David; Coker, Richard; Lederman, Zohar; Watson, Sangeetha; Tambyah, Paul Anantharajah

    2017-01-01

    One Health (OH) is an interdisciplinary collaborative approach to human and animal health that aims to break down conventional research and policy 'silos'. OH has been used to develop strategies for zoonotic Emerging Infectious Diseases (EID). However, the ethical case for OH as an alternative to more traditional public health approaches is largely absent from the discourse. To study the ethics of OH, we examined perceptions of the human health and ecological priorities for the management of zoonotic EID in the Southeast Asian country of Singapore. We conducted a mixed methods study using a modified Delphi technique with a panel of 32 opinion leaders and 11 semi-structured interviews with a sub-set of those experts in Singapore. Panellists rated concepts of OH and priorities for zoonotic EID preparedness planning using a series of scenarios developed through the study. Interview data were examined qualitatively using thematic analysis. We found that panellists agreed that OH is a cross-disciplinary collaboration among the veterinary, medical, and ecological sciences, as well as relevant government agencies encompassing animal, human, and environmental health. Although human health was often framed as the most important priority in zoonotic EID planning, our qualitative analysis suggested that consideration of non-human animal health and welfare was also important for an effective and ethical response. The panellists also suggested that effective pandemic planning demands regional leadership and investment from wealthier countries to better enable international cooperation. We argue that EID planning under an OH approach would benefit greatly from an ethical ecological framework that accounts for justice in human, animal, and environmental health.

  18. Hydrogen storage in single-walled carbon nanotubes: methods and results

    International Nuclear Information System (INIS)

    Poirier, E.; Chahine, R.; Tessier, A.; Cossement, D.; Lafi, L.; Bose, T.K.

    2004-01-01

    We present high sensitivity gravimetric and volumetric hydrogen sorption measurement systems adapted for in situ conditioning under high temperature and high vacuum. These systems, which allow for precise measurements on small samples and thorough degassing, are used for sorption measurements on carbon nanostructures. We developed one volumetric system for the pressure range 0-1 bar, and two gravimetric systems for 0-1 bar and 0-100 bars. The use of both gravimetric and volumetric methods allows for the cross-checking of the results. The accuracy of the systems has been determined from hydrogen absorption measurements on palladium. The accuracies of the 0-1 bar volumetric and gravimetric systems are about 10 μg and 20 μg respectively. The accuracy of the 0-100 bars gravimetric system is about 20 μg. Hydrogen sorption measurements on single-walled carbon nanotubes (SWNTs) and metal-incorporated SWNTs are presented. (author)

  19. Highlighting the complexities of a groundwater pilot study during an avian influenza outbreak: Methods, lessons learned, and select contaminant results

    Science.gov (United States)

    Hubbard, Laura E.; Kolpin, Dana W.; Fields, Chad L.; Hladik, Michelle L.; Iwanowicz, Luke

    2017-01-01

    The highly pathogenic avian influenza (H5N2) outbreak in the Midwestern United States (US) in 2015 was the largest animal health emergency in US history, historic in the number of birds and poultry operations impacted and in the corresponding economic loss to the poultry industry. The U.S. Geological Survey (USGS), with the assistance of several state and federal agencies, aided the response to the outbreak by developing a study to determine the extent of virus transport in the environment. The study goals were to: develop the appropriate sampling methods and protocols for measuring avian influenza virus (AIV) in groundwater, provide the first baseline data on AIV and outbreak- and poultry-related contaminant occurrence and movement into groundwater, and document climatological factors that may have affected both survival and transport of AIV to groundwater during the months of the 2015 outbreak. Although site selection was expedient, sample response times were often delayed by the need to build relationships between agencies, groups, and producers, and by logistical constraints. The study's design and sampling process highlight the unpredictable nature of disease outbreaks and the corresponding difficulty of environmental sampling during such events. The lessons learned, including field protocols and approaches, can be used to improve future research on AIV in the environment.

  20. Lagrangian methods for blood damage estimation in cardiovascular devices--How numerical implementation affects the results.

    Science.gov (United States)

    Marom, Gil; Bluestein, Danny

    2016-01-01

    This paper evaluated the influence of various numerical implementation assumptions on predicting blood damage in cardiovascular devices using Lagrangian methods with Eulerian computational fluid dynamics. The implementation assumptions that were tested included various seeding patterns, a stochastic walk model, and simplified trajectory calculations with pathlines. Post-processing implementation options that were evaluated included single-passage and repeated-passage stress accumulation and time averaging. This study demonstrated that the implementation assumptions can significantly affect the resulting stress accumulation, i.e., the blood damage model predictions. Careful consideration should be given to the use of Lagrangian models. Ultimately, the appropriate assumptions should be chosen based on the physics of the specific case, and sensitivity analyses similar to the ones presented here should be employed.
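
    The stress accumulation referred to above is typically computed along each particle pathline by summing the instantaneous scalar stress weighted by residence time, often with power-law exponents. The sketch below is a minimal, generic illustration with made-up trajectory data and placeholder exponents; it is not the damage model or solver used in the paper.

```python
import numpy as np

# Hedged sketch of Lagrangian stress accumulation along a single pathline.
# The exponents and the trajectory samples are illustrative placeholders only.

def stress_accumulation(tau, dt, a=1.0, b=1.0):
    """Accumulate scalar stress along one pathline: sum of tau**a * dt**b."""
    tau = np.asarray(tau, dtype=float)
    dt = np.asarray(dt, dtype=float)
    return np.sum(tau**a * dt**b)

# Example: one particle sampled at 1 ms steps with varying scalar stress (Pa).
tau_history = [2.0, 5.0, 12.0, 7.0, 3.0]
dt_history = [1e-3] * len(tau_history)

print(stress_accumulation(tau_history, dt_history))                  # linear accumulation
print(stress_accumulation(tau_history, dt_history, a=2.4, b=0.8))    # power-law form (placeholder constants)
```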

  1. A computational study of inviscid hypersonic flows using energy relaxation method

    International Nuclear Information System (INIS)

    Nagdewe, Suryakant; Kim, H. D.; Shevare, G. R.

    2008-01-01

    Reasonable analysis of hypersonic flows requires a thermodynamic non-equilibrium model to properly simulate strong shock waves or high-pressure and high-temperature states in the flow field. The energy relaxation method (ERM) has been used to model such non-equilibrium effects, which are generally expressed as a hyperbolic system of equations with a stiff relaxation source term. The relaxation time appearing in the source terms governs the degree of non-equilibrium in the system. In the present study, a numerical analysis has been carried out with varying values of the relaxation time for several hypersonic flows, with AUSM (advection upstream splitting method) as the numerical scheme. Vibrational modes of thermodynamic non-equilibrium are considered. The results show that, as the relaxation time reduces to zero, the solution marches toward equilibrium, while non-equilibrium effects appear as the relaxation time increases. The present computations predicted the experimental results for hypersonic flows with good accuracy. The work carried out suggests that the present energy relaxation method can be robust for the analysis of hypersonic flows.
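
    The qualitative behaviour described above, equilibrium recovered as the relaxation time goes to zero and non-equilibrium persisting as it grows, can be reproduced with a scalar model relaxation equation. The sketch below is a generic illustration of a stiff relaxation source term treated point-implicitly; it is not the AUSM-based flow solver of the study, and all values are assumptions.

```python
# Hedged sketch: behaviour of a stiff relaxation source term dq/dt = (q_eq - q) / tau.
# A point-implicit (backward-Euler) update stays stable even when tau << dt.
# Values are illustrative; this is not the flow solver used in the study.

def point_implicit_step(q, q_eq, dt, tau):
    """One backward-Euler step of the scalar relaxation ODE."""
    return (q + dt / tau * q_eq) / (1.0 + dt / tau)

q0, q_eq, dt = 1.0, 0.2, 1e-3
for tau in (1e-6, 1e-3, 1e-1):            # from near-equilibrium to nearly frozen
    q = q0
    for _ in range(100):
        q = point_implicit_step(q, q_eq, dt, tau)
    print(f"tau={tau:g}: q after 100 steps = {q:.4f}")   # tau -> 0 recovers q_eq quickly
```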

  2. Psychosocial and environmental distress resulting from a volcanic eruption: Study protocol.

    Science.gov (United States)

    Warsini, Sri; Usher, Kim; Buettner, Petra; Mills, Jane; West, Caryn

    2015-01-01

    To examine the psychosocial and environmental distress resulting from the 2010 eruption of the Merapi volcano and to explore the experience of living in an environment damaged by a volcanic eruption. Natural disasters cause psychosocial responses in survivors. While volcanic eruptions are an example of a natural disaster, little is currently known about their psychosocial impact on survivors. Volcanic eruptions also cause degradation of the environment, which is linked to environmental distress; however, little is currently known about this phenomenon. An explanatory mixed methods study. The research will be divided into three phases. The first phase will involve instrument modification, translation and testing. The second phase will involve a survey of a larger sample using the modified and tested questionnaire. The third phase will involve the collection of interviews from a subset of the same participants as in the second phase. Quantitative data will be analyzed to determine the extent of psychosocial and environmental distress experienced by the participants. Qualitative data will be analyzed to explain the variation among the participants. The results of the study will be used to develop strategies to support survivors in the future and to help ameliorate distress.

  3. The study of combining Latin Hypercube Sampling method and LU decomposition method (LULHS method) for constructing spatial random field

    Science.gov (United States)

    WANG, P. T.

    2015-12-01

    Groundwater modeling requires assigning hydrogeological properties to every numerical grid cell. Because of the lack of detailed information and the inherent spatial heterogeneity, geological properties can be treated as random variables. A hydrogeological property is assumed to follow a multivariate distribution with spatial correlation. By sampling random numbers from a given statistical distribution and assigning a value to each grid cell, a random field for modeling can be constructed. Therefore, the sampling procedure plays an important role in the efficiency of the modeling procedure. Latin Hypercube Sampling (LHS) is a stratified random sampling procedure that provides an efficient way to sample variables from their multivariate distributions. This study combines the stratified random procedure of LHS with simulation by LU decomposition to form LULHS. Both conditional and unconditional simulations of LULHS were developed. The simulation efficiency and spatial correlation of LULHS are compared to those of three other simulation methods. The results show that, for both conditional and unconditional simulation, the LULHS method is more efficient in terms of computational effort: fewer realizations are required to achieve the required statistical accuracy and spatial correlation.
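
    The core LULHS idea, stratified (Latin hypercube) standard-normal samples pushed through a lower-triangular factor of the covariance matrix, can be sketched in a few lines. The sketch below is a generic unconditional illustration assuming a 1-D grid, an exponential covariance model and a Cholesky factor as the lower-triangular L; it is not the authors' implementation.

```python
import numpy as np
from scipy.stats import norm

# Hedged sketch of an unconditional LULHS-type simulation: Latin hypercube samples of
# standard normals combined with a lower-triangular factor (here Cholesky) of the
# covariance matrix. Grid, covariance model and parameters are illustrative assumptions.

rng = np.random.default_rng(0)

def latin_hypercube_normal(n_points, n_real):
    """For each grid point, draw n_real stratified N(0,1) samples (one per stratum)."""
    u = np.empty((n_points, n_real))
    for i in range(n_points):
        strata = rng.permutation(n_real)               # one stratum index per realization
        u[i] = (strata + rng.random(n_real)) / n_real  # uniform within each stratum
    return norm.ppf(u)

# 1-D grid with an exponential covariance model
x = np.linspace(0.0, 100.0, 50)
sill, corr_len = 1.0, 20.0
C = sill * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)

L = np.linalg.cholesky(C + 1e-10 * np.eye(len(x)))     # lower-triangular factor
Z = latin_hypercube_normal(len(x), n_real=100)         # stratified N(0,1) samples
fields = L @ Z                                         # 50 grid points x 100 realizations

# Neighbouring grid points should be strongly correlated across realizations
print(fields.shape, round(np.corrcoef(fields[0], fields[1])[0, 1], 2))
```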

  4. Different percentages of false-positive results obtained using five methods for the calculation of reference change values based on simulated normal and ln-normal distributions of data

    DEFF Research Database (Denmark)

    Lund, Flemming; Petersen, Per Hyltoft; Fraser, Callum G

    2016-01-01

    a homeostatic set point that follows a normal (Gaussian) distribution. This set point (or baseline in steady-state) should be estimated from a set of previous samples, but, in practice, decisions based on reference change value are often based on only two consecutive results. The original reference change value......-positive results. The aim of this study was to investigate false-positive results using five different published methods for calculation of reference change value. METHODS: The five reference change value methods were examined using normally and ln-normally distributed simulated data. RESULTS: One method performed...... best in approaching the theoretical false-positive percentages on normally distributed data and another method performed best on ln-normally distributed data. The commonly used reference change value method based on two results (without use of estimated set point) performed worst both on normally...
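
    For orientation, the classical two-result reference change value formula is shown below. This is the textbook expression, included only as a hedged illustration; it is not necessarily identical to any of the five published methods compared in the study.

```python
import math

# Hedged sketch of the classical reference change value (RCV) for two consecutive results.
# Shown for orientation only; the five methods compared in the study may differ from this.

def reference_change_value(cv_analytical, cv_within_subject, z=1.96):
    """RCV (%) = sqrt(2) * z * sqrt(CV_A^2 + CV_I^2), with CVs given in percent."""
    return math.sqrt(2.0) * z * math.hypot(cv_analytical, cv_within_subject)

# Example: CV_A = 3%, CV_I = 6% -> a change larger than ~18.6% is flagged as significant.
print(round(reference_change_value(3.0, 6.0), 1))
```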

  5. Study of applicable methods on safety verification of disposal facilities and waste packages

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-08-15

    Three subjects concerning safety verification for the disposal of low-level radioactive waste were investigated in FY 2012. For radioactive waste disposal facilities, specifications and construction techniques for soil covers intended to prevent destruction by natural events (e.g. earthquakes) were studied in order to establish verification methods for those specifications. For waste packages subject to near-surface pit disposal, settings of the scaling factor and average radioactivity concentration (hereafter referred to as ''SF'') for container-filled and solidified waste packages generated from Kashiwazaki Kariwa Nuclear Power Station Units 1-5, the setting of the cesium residual ratio of molten solidified waste generated from Tokai and Tokai No.2 Power Stations, etc. were studied. These results were finalized taking into account the opinions of the advisory panel and published as JNES-EV reports. In FY 2012, five JNES reports were published, and these have been used as standards for safety verification of waste packages. Verification methods for radioactive wastes subject to near-surface trench disposal and intermediate-depth disposal were also studied. For radioactive wastes which will be returned from overseas, determination methods for the radioactive concentration, heat rate and hydrogen generation rate of CSD-C were established. Determination methods for the radioactive concentration and heat rate of CSD-B were also established. These results will be reflected in verification manuals. (author)

  6. Assessing variability in results in systematic reviews of diagnostic studies

    NARCIS (Netherlands)

    Naaktgeboren, Christiana A; Ochodo, Eleanor A; Van Enst, Wynanda A; de Groot, Joris A H; Hooft, Lotty; Leeflang, Mariska M G; Bossuyt, Patrick M; Moons, Karel G M; Reitsma, Johannes B

    2016-01-01

    BACKGROUND: To describe approaches used in systematic reviews of diagnostic test accuracy studies for assessing variability in estimates of accuracy between studies and to provide guidance in this area. METHODS: Meta-analyses of diagnostic test accuracy studies published between May and September

  7. Self-directed learning can outperform direct instruction in the course of a modern German medical curriculum - results of a mixed methods trial.

    Science.gov (United States)

    Peine, Arne; Kabino, Klaus; Spreckelsen, Cord

    2016-06-03

    Modernised medical curricula in Germany (so-called "reformed study programs") rely increasingly on alternative self-instructed learning forms such as e-learning and curriculum-guided self-study. However, there is a lack of evidence that these methods can outperform conventional teaching methods such as lectures and seminars. This study was conducted in order to compare traditional teaching methods with new instruction forms in terms of learning effect and student satisfaction. In a randomised trial, 244 students of medicine in their third academic year were assigned to one of four study branches representing self-instructed learning forms (e-learning and curriculum-based self-study) and instructed learning forms (lectures and seminars). All groups participated in their respective learning module with standardised materials and instructions. Learning effect was measured with pre-test and post-test multiple-choice questionnaires. Student satisfaction and learning style were examined via self-assessment. Of 244 initial participants, 223 completed the respective module and were included in the study. In the pre-test, the groups showed relatively homogeneous scores. All students showed notable improvements compared with the pre-test results. Participants in the non-self-instructed learning groups reached scores of 14.71 (seminar) and 14.37 (lecture), while the groups of self-instructed learners reached higher scores, 17.23 (e-learning) and 15.81 (self-study). All groups improved significantly; the largest gain was in the e-learning group, whose self-assessment improved by 2.36. The study shows that students in modern study curricula learn better through modern self-instructed methods than through conventional methods. These methods should be used more, as they also show good levels of student acceptance and higher scores in personal self-assessment of knowledge.

  8. Costs and Efficiency of Online and Offline Recruitment Methods: A Web-Based Cohort Study

    Science.gov (United States)

    Riis, Anders H; Hatch, Elizabeth E; Wise, Lauren A; Nielsen, Marie G; Rothman, Kenneth J; Toft Sørensen, Henrik; Mikkelsen, Ellen M

    2017-01-01

    Background The Internet is widely used to conduct research studies on health issues. Many different methods are used to recruit participants for such studies, but little is known about how various recruitment methods compare in terms of efficiency and costs. Objective The aim of our study was to compare online and offline recruitment methods for Internet-based studies in terms of efficiency (number of recruited participants) and costs per participant. Methods We employed several online and offline recruitment methods to enroll 18- to 45-year-old women in an Internet-based Danish prospective cohort study on fertility. Offline methods included press releases, posters, and flyers. Online methods comprised advertisements placed on five different websites, including Facebook and Netdoktor.dk. We defined seven categories of mutually exclusive recruitment methods and used electronic tracking via unique Uniform Resource Locator (URL) and self-reported data to identify the recruitment method for each participant. For each method, we calculated the average cost per participant and efficiency, that is, the total number of recruited participants. Results We recruited 8252 study participants. Of these, 534 were excluded as they could not be assigned to a specific recruitment method. The final study population included 7724 participants, of whom 803 (10.4%) were recruited by offline methods, 3985 (51.6%) by online methods, 2382 (30.8%) by online methods not initiated by us, and 554 (7.2%) by other methods. Overall, the average cost per participant was €6.22 for online methods initiated by us versus €9.06 for offline methods. Costs per participant ranged from €2.74 to €105.53 for online methods and from €0 to €67.50 for offline methods. Lowest average costs per participant were for those recruited from Netdoktor.dk (€2.99) and from Facebook (€3.44). Conclusions In our Internet-based cohort study, online recruitment methods were superior to offline methods in terms

  9. Application of mathematical statistics methods to study fluorite deposits

    International Nuclear Information System (INIS)

    Chermeninov, V.B.

    1980-01-01

    The applicability of mathematical-statistical methods for increasing the reliability of sampling and for geological tasks (the study of regularities of ore formation) is considered. The reliability of core sampling (with respect to the selective abrasion of fluorite) is compared with that of neutron activation logging for fluorine. The core sampling data show higher dispersion than the neutron activation logging results (mean coefficients of variation of 75% and 56%, respectively). However, the hypothesis of equality of the averages of the two sampling methods is confirmed; this testifies to the absence of considerable variability in the ore bodies.

  10. Comparison of OpenFOAM and EllipSys3D actuator line methods with (NEW) MEXICO results

    International Nuclear Information System (INIS)

    Nathan, J; Masson, C; Meyer Forsting, A R; Troldborg, N

    2017-01-01

    The actuator line method has existed for more than a decade and has become a well-established choice for simulating wind rotors in computational fluid dynamics. Numerous implementations exist and are used in the wind energy research community. These codes have been verified against experimental data such as the MEXICO experiment, but comparisons between codes have often been made only on a very broad scale. Therefore, this study first compares two different implementations, namely an adapted version of SOWFA/OpenFOAM and EllipSys3D, and then compares both against experimental results from the MEXICO and NEW MEXICO experiments. (paper)

  11. Effectiveness of the Dader Method for Pharmaceutical Care on Patients with Bipolar I Disorder: Results from the EMDADER-TAB Study.

    Science.gov (United States)

    Salazar-Ospina, Andrea; Amariles, Pedro; Hincapié-García, Jaime A; González-Avendaño, Sebastián; Benjumea, Dora M; Faus, Maria José; Rodriguez, Luis F

    2017-01-01

    Bipolar I disorder (BD-I) is a chronic illness characterized by relapses alternating with periods of remission. Pharmacists can contribute to improved health outcomes in these patients through pharmaceutical care in association with a multidisciplinary health team; however, more evidence derived from randomized controlled trials (RCTs) is needed to demonstrate the effect of pharmaceutical care on patients with BD-I. The objective was to assess the effectiveness of a pharmaceutical intervention using the Dader Method on patients with BD-I, measured by the decrease in the number of hospitalizations, emergency service consultations, and unscheduled outpatient visits from baseline through 1 year of follow-up. This study is based on the EMDADER-TAB trial, an RCT designed to compare pharmaceutical care with the usual care given to outpatients with BD-I in a psychiatric clinic. The main outcome was the use of health care services, analysed using Kaplan-Meier methods and Cox regression. The trial protocol was registered in ClinicalTrials.gov (Identifier NCT01750255). 92 patients were included in the EMDADER-TAB study: 43 pharmaceutical care patients (intervention group) and 49 usual care patients (control group). At baseline, no significant differences in demographic and clinical characteristics were found across the 2 groups. After 1 year of follow-up, the risks of hospitalization and emergency consultation were higher for the control group than for the intervention group (HR = 9.03, P = 0.042; HR = 3.38, P = 0.034, respectively); however, the risk of unscheduled outpatient visits was higher for the intervention group (HR = 4.18, P = 0.028). As a limitation, there was no "placebo" treatment, and patients in the control group might have achieved positive outcomes that reduced the magnitude of the differences from the intervention group. Compared with usual care, pharmaceutical care significantly reduced hospitalizations and emergency service consultations by outpatients with BD-I. This study received funding from

  12. A method for calculating Bayesian uncertainties on internal doses resulting from complex occupational exposures

    International Nuclear Information System (INIS)

    Puncher, M.; Birchall, A.; Bull, R. K.

    2012-01-01

    Estimating uncertainties on doses from bioassay data is of interest in epidemiological studies that estimate cancer risk from occupational exposures to radionuclides. Bayesian methods provide a logical framework for calculating these uncertainties. However, occupational exposures often consist of many intakes, and this can make the Bayesian calculation computationally intractable. This paper describes a novel strategy for increasing the computational speed of the calculation by simplifying the intake pattern to a single composite intake, termed the complex intake regime (CIR). In order to assess whether this approximation is accurate and fast enough for practical purposes, the method is implemented via the Weighted Likelihood Monte Carlo Sampling (WeLMoS) method and evaluated by comparing its performance with a Markov chain Monte Carlo (MCMC) method. The MCMC method gives the full solution (all intakes are independent), but is very computationally intensive to apply routinely. Posterior distributions of model parameter values, intakes and doses are calculated for a representative sample of plutonium workers from the United Kingdom Atomic Energy cohort using the WeLMoS method with the CIR and the MCMC method. The distributions are in good agreement: posterior means and the Q 0.025 and Q 0.975 quantiles are typically within 20%. Furthermore, the WeLMoS method using the CIR converges quickly: a typical case history takes around 10-20 min on a fast workstation, whereas the MCMC method took around 12 h. The advantages and disadvantages of the method are discussed. (authors)
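
    The weighted-likelihood idea, drawing candidate parameter values from the prior and weighting each by the likelihood of the observed bioassay data, can be illustrated with a toy single-intake model. The sketch below is a generic importance-sampling illustration under an assumed exponential retention function and lognormal prior and measurement error; it is not the WeLMoS code, the biokinetic models, or the cohort data.

```python
import numpy as np

# Hedged toy sketch of weighted-likelihood (importance) sampling for a single intake:
# sample intakes from the prior, weight by the likelihood of one bioassay measurement.
# The prior, retention function and all numbers are illustrative assumptions.

rng = np.random.default_rng(1)

def retention_fraction(t_days):
    """Assumed single-exponential retention function (placeholder for a biokinetic model)."""
    return np.exp(-t_days / 50.0)

# Prior: lognormal intake (Bq); measurement: lognormal error around predicted bioassay value
intakes = rng.lognormal(mean=np.log(100.0), sigma=1.5, size=200_000)
measured, t_meas, gsd = 4.0, 30.0, 1.3          # measured activity, day of measurement, geometric SD

predicted = intakes * retention_fraction(t_meas)
log_w = -0.5 * ((np.log(measured) - np.log(predicted)) / np.log(gsd)) ** 2
weights = np.exp(log_w - log_w.max())
weights /= weights.sum()

posterior_mean = np.sum(weights * intakes)
resampled = rng.choice(intakes, size=50_000, p=weights)   # weighted resampling for quantiles
q025, q975 = np.quantile(resampled, [0.025, 0.975])
print(f"posterior mean intake ~ {posterior_mean:.1f} Bq, 95% interval ({q025:.1f}, {q975:.1f})")
```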

  13. Enhancing activated-peroxide formulations for porous materials: Test methods and results

    Energy Technology Data Exchange (ETDEWEB)

    Krauter, Paula [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Tucker, Mark D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Tezak, Matthew S. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Boucher, Raymond [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2012-12-01

    During an urban wide-area incident involving the release of a biological warfare agent, the recovery/restoration effort will require extensive resources and will tax the current capabilities of the government and private contractors. In fact, resources may be so limited that decontamination by facility owners/occupants may become necessary, and a simple decontamination process and material should be available for this use. One potential process for use by facility owners/occupants would be a liquid sporicidal decontaminant, such as pH-amended bleach or activated peroxide, and simple application devices. While pH-amended bleach is currently the recommended low-tech decontamination solution, a less corrosive and toxic decontaminant is desirable. The objective of this project is to provide an operational assessment of an alternative to chlorine bleach for low-tech decontamination applications: activated hydrogen peroxide. This report provides the methods and results for the activated-peroxide evaluation experiments. The results suggest that the efficacy of an activated-peroxide decontaminant is similar to pH-amended bleach on many common materials.

  14. Behavioral Changes Based on a Course in Agroecology: A Mixed Methods Study

    Science.gov (United States)

    Harms, Kristyn; King, James; Francis, Charles

    2009-01-01

    This study evaluated and described student perceptions of a course in agroecology to determine if participants experienced changed perceptions and behaviors resulting from the Agroecosystems Analysis course. A triangulation validating quantitative data mixed methods approach included a written survey comprised of both quantitative and open-ended…

  15. Locating previously unknown patterns in data-mining results: a dual data- and knowledge-mining method

    Directory of Open Access Journals (Sweden)

    Knaus William A

    2006-03-01

    Full Text Available Abstract Background Data mining can be utilized to automate analysis of substantial amounts of data produced in many organizations. However, data mining produces large numbers of rules and patterns, many of which are not useful. Existing methods for pruning uninteresting patterns have only begun to automate the knowledge acquisition step (which is required for subjective measures of interestingness), hence leaving a serious bottleneck. In this paper we propose a method for automatically acquiring knowledge to shorten the pattern list by locating the novel and interesting ones. Methods The dual-mining method is based on automatically comparing the strength of patterns mined from a database with the strength of equivalent patterns mined from a relevant knowledgebase. When these two estimates of pattern strength do not match, a high "surprise score" is assigned to the pattern, identifying the pattern as potentially interesting. The surprise score captures the degree of novelty or interestingness of the mined pattern. In addition, we show how to compute p values for each surprise score, thus filtering out noise and attaching statistical significance. Results We have implemented the dual-mining method using scripts written in Perl and R. We applied the method to a large patient database and a biomedical literature citation knowledgebase. The system estimated association scores for 50,000 patterns, composed of disease entities and lab results, by querying the database and the knowledgebase. It then computed the surprise scores by comparing the pairs of association scores. Finally, the system estimated the statistical significance of the scores. Conclusion The dual-mining method eliminates more than 90% of patterns with strong associations, thus identifying them as uninteresting. We found that the pruning of patterns using the surprise score matched the biomedical evidence in the 100 cases that were examined by hand. The method automates the acquisition of
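
    A generic way to compute such a surprise score is to contrast an association estimate (e.g., a log odds ratio) from the database with the equivalent estimate from the knowledgebase and attach a significance level to the difference. The Python sketch below only illustrates that idea with invented counts and a bootstrap-based p-value; the original system was implemented in Perl and R and its exact scoring may differ.

```python
import numpy as np
from scipy.stats import norm

# Hedged, generic illustration of a "surprise score": the difference between an
# association estimate mined from a patient database and the equivalent estimate
# mined from a literature knowledgebase. Counts are invented for the example.

rng = np.random.default_rng(2)

def log_odds_ratio(a, b, c, d):
    """Log odds ratio of a 2x2 table (a=both, b=pattern only, c=outcome only, d=neither)."""
    return np.log(((a + 0.5) * (d + 0.5)) / ((b + 0.5) * (c + 0.5)))

def bootstrap_lor(table, n_boot):
    """Bootstrap the log odds ratio by resampling the table's cell counts."""
    n = sum(table)
    samples = rng.multinomial(n, np.array(table) / n, size=n_boot)
    return np.array([log_odds_ratio(*s) for s in samples])

def surprise_score(db_table, kb_table, n_boot=5_000):
    obs = log_odds_ratio(*db_table) - log_odds_ratio(*kb_table)
    diffs = bootstrap_lor(db_table, n_boot) - bootstrap_lor(kb_table, n_boot)
    p = 2.0 * norm.sf(abs(obs) / diffs.std())          # normal approx. with bootstrap SE
    return obs, p

# A disease/lab-result pattern that is much stronger in the database than in the literature
print(surprise_score(db_table=(200, 300, 100, 9400), kb_table=(40, 460, 50, 9450)))
```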

  16. Improving the accuracy of myocardial perfusion scintigraphy results by machine learning method

    International Nuclear Information System (INIS)

    Groselj, C.; Kukar, M.

    2002-01-01

    Full text: Machine learning (ML), a rapidly growing subfield of artificial intelligence, has proven over the last decade to be a useful decision-making tool in many fields, including some areas of medicine. Its decision accuracy usually exceeds that of humans. The aim was to assess the applicability of ML to interpreting the results of stress myocardial perfusion scintigraphy for CAD diagnosis. The data of 327 patients undergoing planar stress myocardial perfusion scintigraphy were re-evaluated in the usual way. By comparison with the results of coronary angiography, the sensitivity, specificity and accuracy of the investigation were computed. The data were digitized and the decision procedure repeated with the ML program 'Naive Bayesian classifier'. As ML can handle any number of variables simultaneously, all available disease-related data (regarding history, habitus, risk factors, stress results) were added. The sensitivity, specificity and accuracy of scintigraphy were then expressed in the same way. The results of both decision procedures were compared. With the ML method, 19 more patients out of 327 (5.8%) were correctly diagnosed by stress myocardial perfusion scintigraphy. ML could be an important tool for decision making in myocardial perfusion scintigraphy. (author)
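
    As an illustration of the kind of classifier named above, the sketch below fits a Gaussian naive Bayes model to synthetic data with a few stand-in clinical features. The feature set, data and accuracy are invented for demonstration; the study used the patients' real history, habitus, risk factor, stress and scintigraphy data.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

# Hedged sketch of naive Bayesian classification on synthetic, stand-in clinical features.
# The variables and the generating rule below are invented purely for demonstration.

rng = np.random.default_rng(3)
n = 327
X = np.column_stack([
    rng.normal(60, 10, n),          # age (years)
    rng.integers(0, 2, n),          # diabetes (0/1)
    rng.normal(0.5, 0.2, n),        # perfusion defect score (arbitrary units)
])
y = (0.03 * X[:, 0] + 0.8 * X[:, 1] + 2.0 * X[:, 2] + rng.normal(0, 1, n) > 3.2).astype(int)

clf = GaussianNB()
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```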

  17. Drug-Drug/Drug-Excipient Compatibility Studies on Curcumin using Non-Thermal Methods

    Directory of Open Access Journals (Sweden)

    Moorthi Chidambaram

    2014-05-01

    Full Text Available Purpose: Curcumin is a hydrophobic polyphenol isolated from the dried rhizome of turmeric. The clinical usefulness of curcumin in the treatment of cancer is limited by its poor aqueous solubility, hydrolytic degradation, metabolism, and poor oral bioavailability. To overcome these limitations, we proposed to fabricate curcumin-piperine, curcumin-quercetin and curcumin-silibinin loaded polymeric nanoformulations. However, unfavourable drug-drug and drug-excipient combinations may result in interactions and raise safety concerns. Hence, the present study aimed to assess the interaction of curcumin with the excipients used in the nanoformulations. Methods: The isothermal stress testing method was used to assess drug-drug/drug-excipient compatibility. Results: The combinations of curcumin-piperine, curcumin-quercetin and curcumin-silibinin, and the combinations of the other excipients with curcumin, piperine, quercetin and silibinin, did not show any significant physical or chemical instability. Conclusion: The study concludes that curcumin, piperine, quercetin and silibinin are compatible with each other and with the other excipients.

  18. Testing the usability of the Rapid Impact Assessment Matrix (RIAM) method for comparison of EIA and SEA results

    International Nuclear Information System (INIS)

    Kuitunen, Markku; Jalava, Kimmo; Hirvonen, Kimmo

    2008-01-01

    This study examines how the results of Environmental Impact Assessment (EIA) and Strategic Environmental Assessment (SEA) can be compared using the Rapid Impact Assessment Matrix (RIAM) method. Many tools and techniques have been developed for use in impact assessment processes, including scoping, checklists, matrices, qualitative and quantitative models, literature reviews, and decision-support systems. While impact assessment processes have become more technically complicated, it is recognized that approaches including simpler applications of available tools and techniques are also appropriate. The Rapid Impact Assessment Matrix (RIAM) is a tool for organizing, analysing and presenting the results of a holistic EIA. RIAM was originally developed to compare the impact of alternative procedures within a single project. In this study, we used RIAM to compare the environmental and social impact of different projects, plans and programs realized within the same geographical area. RIAM scoring is based on five separate criteria. The RIAM criteria were applied to the impact that was considered to be the most significant in the evaluated cases, and scores were given for both environmental and social impact. Our results revealed that the RIAM method can be used for comparison and ranking of separate and distinct projects, plans, programs and policies, based on their negative or positive impact. Our data included 142 cases from the area of Central Finland covered by the Regional Council of Central Finland. This sample consisted of various types of projects, ranging from road construction to education programs that applied for EU funding.
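
    RIAM scores are commonly described in the literature as the product of two group-A criteria multiplied by the sum of three group-B criteria. The sketch below uses that commonly cited form with invented criterion values purely to show how separate cases could be ranked; it does not reproduce the criteria values or the cases of the Finnish data set.

```python
# Hedged sketch of Rapid Impact Assessment Matrix (RIAM) scoring as commonly described
# in the literature: ES = (A1 * A2) * (B1 + B2 + B3). Example scores are illustrative.

def riam_environmental_score(a1, a2, b1, b2, b3):
    """a1: importance of condition (0-4), a2: magnitude of change (-3..+3),
    b1: permanence, b2: reversibility, b3: cumulativeness (each 1-3)."""
    return (a1 * a2) * (b1 + b2 + b3)

# Two hypothetical cases to be ranked: a road project and an education program.
cases = {
    "road_construction": riam_environmental_score(a1=3, a2=-2, b1=3, b2=2, b3=2),
    "education_program": riam_environmental_score(a1=2, a2=+1, b1=2, b2=1, b3=1),
}
for name, es in sorted(cases.items(), key=lambda kv: kv[1]):
    print(f"{name}: ES = {es}")
```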

  19. Study on Mechanism Experiments and Evaluation Methods for Water Eutrophication

    Directory of Open Access Journals (Sweden)

    Jiabin Yu

    2017-01-01

    Full Text Available The process of water eutrophication involves the interaction of external factors, nutrients, microorganisms, and other factors. It is complex and has not yet been effectively studied. To examine the formation process of water eutrophication, a set of orthogonal experiments with three factors and four levels is designed to analyze the key factors. At the same time, with the help of a large amount of monitoring data, the principal component analysis method is used to extract the main components of water eutrophication and determine the effective evaluation indicators of eutrophication. Finally, Bayesian uncertainty theory is applied to the evaluation of the eutrophication process using the sample data. The simulation results demonstrate the validity of the research method.
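
    The indicator-extraction step described above can be illustrated with a principal component analysis on synthetic monitoring data. The variables and data in the sketch below are invented stand-ins for typical eutrophication indicators; the sketch only shows the mechanics of extracting principal components and is not the study's data or code.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hedged sketch of the indicator-selection step: PCA on synthetic water-quality data.
# Variable names are illustrative stand-ins (TN, TP, chlorophyll-a, Secchi depth, COD).

rng = np.random.default_rng(4)
n_samples = 200
tn = rng.lognormal(0.0, 0.4, n_samples)                # total nitrogen
tp = 0.05 * tn + rng.lognormal(-3.0, 0.3, n_samples)   # total phosphorus, correlated with TN
chl = 2.0 * tp + rng.normal(0, 0.02, n_samples)        # chlorophyll-a tracks phosphorus
secchi = 1.0 / (1.0 + 5.0 * chl) + rng.normal(0, 0.05, n_samples)
cod = rng.lognormal(1.0, 0.3, n_samples)

X = StandardScaler().fit_transform(np.column_stack([tn, tp, chl, secchi, cod]))
pca = PCA().fit(X)
print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 3))
print("loadings of first component:", np.round(pca.components_[0], 3))
```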

  20. Pediatric esophageal scintigraphy. Results of 200 studies

    International Nuclear Information System (INIS)

    Guillet, J.; Wynchank, S.; Basse-Cathalinat, B.; Christophe, E.; Ducassou, D.; Blanquet, P.

    1983-01-01

    Esophageal transit of a small volume of watery liquid has been observed scintigraphically in 200 studies performed on patients aged between 6 days and 16 years. Qualitative information concerning esophageal morphology and function in the various phases of deglutition, and scintigraphic features of achalasia, stenosis, and other pathologies are described. Measured esophageal transit time and its normal variation, its relevance to the diagnosis of esophagitis, and the monitoring of treatment are discussed. This technique observing distinct deglutitions has proven a useful diagnostic tool. Its advantages and limitations are discussed in comparison with other methods

  1. Pediatric esophageal scintigraphy. Results of 200 studies

    Energy Technology Data Exchange (ETDEWEB)

    Guillet, J.; Wynchank, S.; Basse-Cathalinat, B.; Christophe, E.; Ducassou, D.; Blanquet, P.

    1983-09-01

    Esophageal transit of a small volume of watery liquid has been observed scintigraphically in 200 studies performed on patients aged between 6 days and 16 years. Qualitative information concerning esophageal morphology and function in the various phases of deglutition, and scintigraphic features of achalasia, stenosis, and other pathologies are described. Measured esophageal transit time and its normal variation, its relevance to the diagnosis of esophagitis, and the monitoring of treatment are discussed. This technique observing distinct deglutitions has proven a useful diagnostic tool. Its advantages and limitations are discussed in comparison with other methods.

  2. Suitability of voltage stability study methods for real-time assessment

    DEFF Research Database (Denmark)

    Perez, Angel; Jóhannsson, Hjörtur; Vancraeyveld, Pieter

    2013-01-01

    This paper analyzes the suitability of existing methods for long-term voltage stability assessment for real-time operation. An overview of the relevant methods is followed by a comparison that takes into account their accuracy, computational efficiency and characteristics when used for security...... assessment. The results enable an evaluation of the run time of each method with respect to the number of inputs. Furthermore, the results assist in identifying which of the methods is most suitable for real-time operation in future power systems with production based on fluctuating energy sources....

  3. Studying the Impact of Academic Mobility on Intercultural Competence: A Mixed-Methods Perspective

    Science.gov (United States)

    Cots, Josep M.; Aguilar, Marta; Mas-Alcolea, Sònia; Llanes, Àngels

    2016-01-01

    This paper contributes to the study of the impact of academic mobility on the development of students' intercultural competence (IC). Following Byram, IC is seen as comprising the three components of knowledge, behaviour and attitude. The study adopts a mixed-methods approach, analysing the results of a quantitative pre-stay post-stay survey…

  4. Studying Intense Pulsed Light Method Along With Corticosteroid Injection in Treating Keloid Scars

    OpenAIRE

    Shamsi Meymandi, Simin; Rezazadeh, Azadeh; Ekhlasi, Ali

    2014-01-01

    Background: Results of various studies suggest that the hypertrophic and keloid scars are highly prevalent in the general population and are irritating both physically and mentally. Objective: Considering the variety of existing therapies, intense pulsed light (IPL) method along with corticosteroid injection was evaluated in treating these scars. Materials and Methods: 86 subjects were included in this clinical trial. Eight sessions of therapeutic intervention were done with IPL along with co...

  5. Coal-nuclear energy system. Method of study. Examples of results

    International Nuclear Information System (INIS)

    Deneuve, F.; Le Penhuizic, B.

    1981-01-01

    Given the outlook for hydrocarbon depletion, three primary energy sources could enable supplies to be diversified: nuclear energy, coal and solar energy. These primary energy sources can rarely be used directly and must be converted into energy carriers such as electricity, hydrogen, substitute natural gas, liquid hydrocarbons derived from coal, etc. The nature of future energy carriers and their position in the national energy balance must be examined within the framework of an overall energy pattern. Many of the potential conversion processes are interrelated through their production and consumption. Likewise, seasonal variations in consumption make it necessary to design production plants for peak demand or to create large-scale storage facilities. An initial model taking these interactions into consideration has been worked out to represent the variety of possible solutions. This model can be used to evaluate the technical paths to be followed under different assumptions concerning the future [fr

  6. Nondestructive methods for the structural evaluation of wood floor systems in historic buildings : preliminary results : [abstract

    Science.gov (United States)

    Zhiyong Cai; Michael O. Hunt; Robert J. Ross; Lawrence A. Soltis

    1999-01-01

    To date, there is no standard method for evaluating the structural integrity of wood floor systems using nondestructive techniques. Current methods of examination and assessment are often subjective and therefore tend to yield imprecise or variable results. For this reason, estimates of allowable wood floor loads are often conservative. The assignment of conservatively...

  7. Evaluation of PDA Technical Report No 33. Statistical Testing Recommendations for a Rapid Microbiological Method Case Study.

    Science.gov (United States)

    Murphy, Thomas; Schwedock, Julie; Nguyen, Kham; Mills, Anna; Jones, David

    2015-01-01

    New recommendations for the validation of rapid microbiological methods have been included in the revised Technical Report 33 release from the PDA. The changes include a more comprehensive review of the statistical methods to be used to analyze data obtained during validation. This case study applies those statistical methods to accuracy, precision, ruggedness, and equivalence data obtained using a rapid microbiological methods system being evaluated for water bioburden testing. Results presented demonstrate that the statistical methods described in the PDA Technical Report 33 chapter can all be successfully applied to the rapid microbiological method data sets and gave the same interpretation for equivalence to the standard method. The rapid microbiological method was in general able to pass the requirements of PDA Technical Report 33, though the study shows that there can be occasional outlying results and that caution should be used when applying statistical methods to low average colony-forming unit values. Prior to use in a quality-controlled environment, any new method or technology has to be shown to work as designed by the manufacturer for the purpose required. For new rapid microbiological methods that detect and enumerate contaminating microorganisms, additional recommendations have been provided in the revised PDA Technical Report No. 33. The changes include a more comprehensive review of the statistical methods to be used to analyze data obtained during validation. This paper applies those statistical methods to analyze accuracy, precision, ruggedness, and equivalence data obtained using a rapid microbiological method system being validated for water bioburden testing. The case study demonstrates that the statistical methods described in the PDA Technical Report No. 33 chapter can be successfully applied to rapid microbiological method data sets and give the same comparability results for similarity or difference as the standard method. © PDA, Inc

  8. The numerical method of inverse Laplace transform for calculation of overvoltages in power transformers and test results

    Directory of Open Access Journals (Sweden)

    Mikulović Jovan Č.

    2014-01-01

    Full Text Available A methodology for the calculation of overvoltages in transformer windings, based on a numerical method of inverse Laplace transform, is presented. The mathematical model of the transformer windings is described by partial differential equations corresponding to distributed-parameter electrical circuits. The procedure for calculating overvoltages is applied to windings with an isolated neutral point, a grounded neutral point, or a neutral point grounded through an impedance. A comparative analysis of the results obtained by the proposed numerical method and by an analytical method for the calculation of overvoltages in transformer windings is presented. The results computed by the proposed method are compared with the voltage distributions measured when a voltage surge is applied to a three-phase 30 kVA power transformer. [Project of the Ministry of Science of the Republic of Serbia, nos. TR-33037 and TR-33020
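
    The paper speaks generically of "a numerical method of inverse Laplace transform", so the sketch below shows one widely used member of that class, the Gaver-Stehfest algorithm, checked against a transform pair with a known closed-form inverse. It illustrates numerical Laplace inversion in general and is not the authors' implementation.

```python
import math

# Hedged sketch of a numerical inverse Laplace transform (Gaver-Stehfest algorithm),
# shown only as a generic example of this class of methods.

def stehfest_coefficients(n):
    """Stehfest weights V_k for even n."""
    assert n % 2 == 0
    half = n // 2
    V = []
    for k in range(1, n + 1):
        s = 0.0
        for j in range((k + 1) // 2, min(k, half) + 1):
            s += (j**half * math.factorial(2 * j) /
                  (math.factorial(half - j) * math.factorial(j) *
                   math.factorial(j - 1) * math.factorial(k - j) *
                   math.factorial(2 * j - k)))
        V.append((-1) ** (k + half) * s)
    return V

def invert_laplace(F, t, n=12):
    """Approximate f(t) from its Laplace transform F(s) at a single time t > 0."""
    ln2_t = math.log(2.0) / t
    V = stehfest_coefficients(n)
    return ln2_t * sum(V[k - 1] * F(k * ln2_t) for k in range(1, n + 1))

# Check on a known pair: F(s) = 1 / (s + a)  <->  f(t) = exp(-a t)
a = 3.0
for t in (0.1, 0.5, 1.0):
    print(t, invert_laplace(lambda s: 1.0 / (s + a), t), math.exp(-a * t))
```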

  9. Online Recruitment Methods for Web-Based and Mobile Health Studies: A Review of the Literature.

    Science.gov (United States)

    Lane, Taylor S; Armin, Julie; Gordon, Judith S

    2015-07-22

    Internet and mobile health (mHealth) apps hold promise for expanding the reach of evidence-based health interventions. Research in this area is rapidly expanding. However, these studies may experience problems with recruitment and retention. Web-based and mHealth studies are in need of a wide-reaching and low-cost method of recruitment that will also effectively retain participants for the duration of the study. Online recruitment may be a low-cost and wide-reaching tool in comparison to traditional recruitment methods, although empirical evidence is limited. This study aims to review the literature on online recruitment for, and retention in, mHealth studies. We conducted a review of the literature of studies examining online recruitment methods as a viable means of obtaining mHealth research participants. The data sources used were PubMed, CINAHL, EbscoHost, PyscINFO, and MEDLINE. Studies reporting at least one method of online recruitment were included. A narrative approach enabled the authors to discuss the variability in recruitment results, as well as in recruitment duration and study design. From 550 initial publications, 12 studies were included in this review. The studies reported multiple uses and outcomes for online recruitment methods. Web-based recruitment was the only type of recruitment used in 67% (8/12) of the studies. Online recruitment was used for studies with a variety of health domains: smoking cessation (58%; 7/12) and mental health (17%; 2/12) being the most common. Recruitment duration lasted under a year in 67% (8/12) of the studies, with an average of 5 months spent on recruiting. In those studies that spent over a year (33%; 4/12), an average of 17 months was spent on recruiting. A little less than half (42%; 5/12) of the studies found Facebook ads or newsfeed posts to be an effective method of recruitment, a quarter (25%; 3/12) of the studies found Google ads to be the most effective way to reach participants, and one study showed

  10. Study of Fuze Structure and Reliability Design Based on the Direct Search Method

    Science.gov (United States)

    Lin, Zhang; Ning, Wang

    2017-03-01

    Redundancy design is one of the important methods for improving system reliability, but the design often involves the mutual coupling of multiple factors. In this study, the direct search method is introduced into optimum redundancy configuration for design optimization, in which reliability, cost, structural weight and other factors can be taken into account simultaneously, and the redundancy allocation and reliability design of a critical aircraft system are computed. The results show that this method is convenient and workable, and applicable to the redundancy configuration and optimization of various designs upon appropriate modification. The method thus has good practical value.
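
    A minimal way to picture a direct search over redundancy levels is a discrete neighbourhood search: start from one unit per subsystem and repeatedly accept the feasible unit-increment that most improves system reliability. The sketch below uses that simple scheme with invented reliabilities, costs, weights and budgets; it is not the specific direct-search variant or the fuze data used in the study.

```python
# Hedged sketch of a discrete direct search over redundancy levels for a series system
# of parallel-redundant subsystems. All component data and budgets are illustrative.

r = [0.90, 0.95, 0.85]          # reliability of a single unit in each subsystem
cost = [2.0, 3.0, 1.5]          # cost per unit
weight = [1.0, 0.5, 2.0]        # weight per unit
COST_MAX, WEIGHT_MAX = 18.0, 10.0

def system_reliability(n):
    """Series system of parallel-redundant subsystems: prod(1 - (1 - r_i)**n_i)."""
    rel = 1.0
    for ri, ni in zip(r, n):
        rel *= 1.0 - (1.0 - ri) ** ni
    return rel

def feasible(n):
    return (sum(c * ni for c, ni in zip(cost, n)) <= COST_MAX and
            sum(w * ni for w, ni in zip(weight, n)) <= WEIGHT_MAX)

n = [1, 1, 1]
improved = True
while improved:
    improved = False
    best_move, best_rel = None, system_reliability(n)
    for i in range(len(n)):                 # explore unit-increment moves
        trial = n[:]
        trial[i] += 1
        if feasible(trial) and system_reliability(trial) > best_rel:
            best_move, best_rel = trial, system_reliability(trial)
    if best_move is not None:
        n, improved = best_move, True

print("redundancy levels:", n, "system reliability:", round(system_reliability(n), 5))
```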

  11. Justice Is the Missing Link in One Health: Results of a Mixed Methods Study in an Urban City State.

    Directory of Open Access Journals (Sweden)

    Tamra Lysaght

    Full Text Available One Health (OH) is an interdisciplinary collaborative approach to human and animal health that aims to break down conventional research and policy 'silos'. OH has been used to develop strategies for zoonotic Emerging Infectious Diseases (EID). However, the ethical case for OH as an alternative to more traditional public health approaches is largely absent from the discourse. To study the ethics of OH, we examined perceptions of the human health and ecological priorities for the management of zoonotic EID in the Southeast Asian country of Singapore. We conducted a mixed methods study using a modified Delphi technique with a panel of 32 opinion leaders and 11 semi-structured interviews with a sub-set of those experts in Singapore. Panellists rated concepts of OH and priorities for zoonotic EID preparedness planning using a series of scenarios developed through the study. Interview data were examined qualitatively using thematic analysis. We found that panellists agreed that OH is a cross-disciplinary collaboration among the veterinary, medical, and ecological sciences, as well as relevant government agencies encompassing animal, human, and environmental health. Although human health was often framed as the most important priority in zoonotic EID planning, our qualitative analysis suggested that consideration of non-human animal health and welfare was also important for an effective and ethical response. The panellists also suggested that effective pandemic planning demands regional leadership and investment from wealthier countries to better enable international cooperation. We argue that EID planning under an OH approach would benefit greatly from an ethical ecological framework that accounts for justice in human, animal, and environmental health.

  12. Application of nuclear analytical methods to heavy metal pollution studies of estuaries

    International Nuclear Information System (INIS)

    Anders, B.; Junge, W.; Knoth, J.; Michaelis, W.; Pepelnik, R.; Schwenke, H.

    1984-01-01

    Important objectives of heavy metal pollution studies of estuaries are the understanding of the transport phenomena in these complex ecosystems and the discovery of the pollution history and the geochemical background. Such studies require high precision and accuracy of the analytical methods. Moreover, pronounced spatial heterogeneities and temporal variabilities that are typical for estuaries necessitate the analysis of a great number of samples if relevant results are to be obtained. Both requirements can economically be fulfilled by a proper combination of analytical methods. Applications of energy-dispersive X-ray fluorescence analysis with total reflection of the exciting beam at the sample support and of neutron activation analysis with both thermal and fast neutrons are reported in the light of pollution studies performed in the Lower Elbe River. (orig.)

  13. VARIATIONS OF THE ENERGY METHOD FOR STUDYING CONSTRUCTION STABILITY

    Directory of Open Access Journals (Sweden)

    A. M. Dibirgadzhiev

    2017-01-01

    Full Text Available Objectives. The aim of the work is to find the most rational form of expression of the potential energy of a nonlinear system, with the subsequent use of algebraic means and geometric images of catastrophe theory for studying the behaviour of a construction under load. Various forms of stability criteria for the equilibrium states of constructions are investigated. Some aspects of using various forms of expression of the system's total energy are considered, oriented to the subsequent use of catastrophe theory methods for solving the nonlinear problems of construction calculation associated with discontinuous phenomena. Methods. According to the form of the potential energy expression, the mathematical description of the problem being solved is linked to a specific catastrophe of a universal character from the list of catastrophes. After this, the behaviour of the system can be predicted on the basis of the fundamental propositions formulated in catastrophe theory, without integrating the corresponding system of high-order nonlinear partial differential equations to which the solution of such problems is reduced. Results. The result is presented in the form of uniform geometric images containing all the necessary qualitative and quantitative information about the deformation of whole classes of constructions under load, for a wide range of values of the external (control) and internal (behavioural) parameters. Conclusion. Methods based on catastrophe theory are an effective mathematical tool for solving nonlinear boundary-value problems with parameters associated with discontinuous phenomena, which are poorly analysable by conventional methods. However, they have not yet received due attention from researchers, especially in the field of stability calculations, which remains a complex, relevant and attractive problem within structural mechanics. To solve a concrete nonlinear boundary value problem for calculating
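
    For orientation, the classical energy criterion that such studies build on can be written as follows; this is standard background on the stability of conservative systems, not a formula quoted from the paper.

```latex
% Total potential energy \Pi(q, \lambda) of a conservative system with
% generalised coordinates q_i and a load (control) parameter \lambda.
\frac{\partial \Pi}{\partial q_i} = 0 \quad \text{(equilibrium)}, \qquad
\delta^2 \Pi = \sum_{i,j} \frac{\partial^2 \Pi}{\partial q_i \,\partial q_j}\,
\delta q_i\, \delta q_j
\;
\begin{cases}
> 0 & \text{stable equilibrium},\\
= 0 & \text{critical point (possible catastrophe / snap-through)},\\
< 0 & \text{unstable equilibrium}.
\end{cases}
```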

  14. Statistical methods to correct for verification bias in diagnostic studies are inadequate when there are few false negatives: a simulation study

    Directory of Open Access Journals (Sweden)

    Vickers Andrew J

    2008-11-01

    Full Text Available Abstract Background A common feature of diagnostic research is that results for a diagnostic gold standard are available primarily for patients who are positive for the test under investigation. Data from such studies are subject to what has been termed "verification bias". We evaluated statistical methods for verification bias correction when there are few false negatives. Methods A simulation study was conducted of a screening study subject to verification bias. We compared estimates of the area-under-the-curve (AUC) corrected for verification bias, varying both the rate and mechanism of verification. Results In a single simulated data set, varying false negatives from 0 to 4 led to verification bias corrected AUCs ranging from 0.550 to 0.852. Excess variation associated with low numbers of false negatives was confirmed in simulation studies and by analyses of published studies that incorporated verification bias correction. The 2.5th – 97.5th centile range constituted as much as 60% of the possible range of AUCs for some simulations. Conclusion Screening programs are designed such that there are few false negatives. Standard statistical methods for verification bias correction are inadequate in this circumstance.
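
    The abstract concerns bias-corrected AUC estimates; as a simpler illustration of the same correction principle, the sketch below applies a Begg–Greenes-type adjustment for sensitivity and specificity, which assumes verification depends only on the screening-test result. The counts are invented, and the tiny number of diseased test-negatives (the false negatives) is exactly what makes such corrected estimates unstable.

```python
def begg_greenes(n_pos, n_neg, v_pos, v_neg, d_pos, d_neg):
    """Verification-bias-corrected sensitivity and specificity, assuming
    verification depends only on the screening-test result (MAR).

    n_pos, n_neg : all screened patients, by test result
    v_pos, v_neg : patients verified with the gold standard, by test result
    d_pos, d_neg : diseased patients among the verified, by test result
    """
    p_t_pos = n_pos / (n_pos + n_neg)      # P(T+)
    p_t_neg = 1.0 - p_t_pos                # P(T-)
    p_d_pos = d_pos / v_pos                # P(D | T+), from the verified subset
    p_d_neg = d_neg / v_neg                # P(D | T-), from the verified subset

    se = (p_d_pos * p_t_pos) / (p_d_pos * p_t_pos + p_d_neg * p_t_neg)
    sp = ((1 - p_d_neg) * p_t_neg) / ((1 - p_d_neg) * p_t_neg +
                                      (1 - p_d_pos) * p_t_pos)
    return se, sp

# Invented example: nearly all test-positives verified, few test-negatives;
# with only d_neg = 2 diseased test-negatives the estimates are very unstable.
print(begg_greenes(n_pos=200, n_neg=1800, v_pos=190, v_neg=90, d_pos=60, d_neg=2))
```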

  15. A New Method for Studying the Periodic System Based on a Kohonen Neural Network

    Science.gov (United States)

    Chen, David Zhekai

    2010-01-01

    A new method for studying the periodic system is described based on the combination of a Kohonen neural network and a set of chemical and physical properties. The classification results are directly shown in a two-dimensional map and easy to interpret. This is one of the major advantages of this approach over other methods reported in the…
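
    As a hypothetical illustration of the kind of map the abstract describes, the sketch below trains a small Kohonen self-organising map from scratch on a handful of made-up, roughly normalised element properties and reports where each element lands on the 2-D grid. The property values, grid size and training schedule are all invented and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up, roughly normalised properties for a handful of elements:
# [electronegativity, atomic radius, first ionisation energy]
elements = {"Li": [0.25, 0.90, 0.30], "Na": [0.23, 1.00, 0.26],
            "F":  [1.00, 0.30, 0.95], "Cl": [0.80, 0.45, 0.78],
            "Fe": [0.45, 0.60, 0.45], "Cu": [0.47, 0.58, 0.44]}
data = np.array(list(elements.values()))

# 2-D Kohonen map: a grid of weight vectors trained competitively.
grid_x, grid_y, dim = 4, 4, data.shape[1]
weights = rng.random((grid_x, grid_y, dim))
coords = np.stack(np.meshgrid(np.arange(grid_x), np.arange(grid_y),
                              indexing="ij"), axis=-1)

n_iter, lr0, sigma0 = 500, 0.5, 2.0
for t in range(n_iter):
    x = data[rng.integers(len(data))]
    # best matching unit (BMU): grid node closest to the sample
    bmu = np.unravel_index(np.argmin(((weights - x) ** 2).sum(-1)),
                           (grid_x, grid_y))
    lr = lr0 * np.exp(-t / n_iter)
    sigma = sigma0 * np.exp(-t / n_iter)
    dist2 = ((coords - np.array(bmu)) ** 2).sum(-1)
    h = np.exp(-dist2 / (2 * sigma ** 2))[..., None]   # neighbourhood function
    weights += lr * h * (x - weights)

# Position of each element on the trained 2-D map
for name, vec in elements.items():
    bmu = np.unravel_index(np.argmin(((weights - np.array(vec)) ** 2).sum(-1)),
                           (grid_x, grid_y))
    print(name, "->", bmu)
```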

  16. Spin Is Common in Studies Assessing Robotic Colorectal Surgery: An Assessment of Reporting and Interpretation of Study Results.

    Science.gov (United States)

    Patel, Sunil V; Van Koughnett, Julie Ann M; Howe, Brett; Wexner, Steven D

    2015-09-01

    potential conflict(s) of interest, is concerning. Readers of these articles need to be critical of author conclusions, and publishers should ensure that conclusions correspond with the study methods and results.

  17. Highlighting the complexities of a groundwater pilot study during an avian influenza outbreak: Methods, lessons learned, and select contaminant results.

    Science.gov (United States)

    Hubbard, Laura E; Kolpin, Dana W; Fields, Chad L; Hladik, Michelle L; Iwanowicz, Luke R

    2017-10-01

    The highly pathogenic avian influenza (H5N2) outbreak in the Midwestern United States (US) in 2015 was historic in the number of birds and poultry operations impacted and the corresponding economic loss to the poultry industry, and it was the largest animal health emergency in US history. The U.S. Geological Survey (USGS), with the assistance of several state and federal agencies, aided the response to the outbreak by developing a study to determine the extent of virus transport in the environment. The study goals were to: develop appropriate sampling methods and protocols for measuring avian influenza virus (AIV) in groundwater, provide the first baseline data on AIV and outbreak- and poultry-related contaminant occurrence and movement into groundwater, and document climatological factors that may have affected both survival and transport of AIV to groundwater during the months of the 2015 outbreak. While site selection was expedient, there were often delays in sample response times due both to the relationship building needed between agencies, groups, and producers and to logistical time constraints. This study's design and sampling process highlights the unpredictable nature of disease outbreaks and the corresponding difficulty of environmental sampling during such events. The lessons learned, including field protocols and approaches, can be used to improve future research on AIV in the environment. Published by Elsevier Inc.

  18. The study of technological prevention method of road accident ...

    African Journals Online (AJOL)

    The study of technological methods for preventing road accidents related to the driver and the vehicle. ... a road accident prevention method based on the factors studied. This study can provide strong data-analysis support for research related to road traffic safety. Keywords: road accident; accident prevention; road safety.

  19. GRS Method for Uncertainty and Sensitivity Evaluation of Code Results and Applications

    International Nuclear Information System (INIS)

    Glaeser, H.

    2008-01-01

    In recent years there has been increasing interest in computational reactor safety analysis in replacing conservative evaluation-model calculations by best estimate calculations supplemented by an uncertainty analysis of the code results. The evaluation of the margin to acceptance criteria, for example the maximum fuel rod clad temperature, should be based on the upper limit of the calculated uncertainty range. Uncertainty analysis is needed if useful conclusions are to be obtained from best estimate thermal-hydraulic code calculations; otherwise single values of unknown accuracy would be presented for comparison with regulatory acceptance limits. Methods have been developed and presented to quantify the uncertainty of computer code results. The basic techniques proposed by GRS are presented, together with applications to a large break loss of coolant accident on a reference reactor as well as to an experiment simulating containment behaviour
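
    The GRS approach typically propagates input uncertainties through a limited number of code runs and uses order statistics (Wilks' formula) to fix how many runs are needed for a given probability content and confidence level, independently of the number of uncertain inputs. The sketch below reproduces only this well-known sample-size rule as general background; it is not code taken from the GRS method itself.

```python
def wilks_runs(prob=0.95, conf=0.95, side=1):
    """Smallest number of code runs n such that the highest (side=1) or the
    highest and lowest (side=2) calculated values bound `prob` of the output
    distribution with confidence `conf` (Wilks' formula)."""
    n = 1
    while True:
        if side == 1:
            beta = 1.0 - prob ** n
        else:
            beta = 1.0 - prob ** n - n * (1.0 - prob) * prob ** (n - 1)
        if beta >= conf:
            return n
        n += 1

print(wilks_runs(0.95, 0.95, side=1))  # 59 runs for a one-sided 95%/95% limit
print(wilks_runs(0.95, 0.95, side=2))  # 93 runs for a two-sided 95%/95% limit
```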

  20. A comparative study of the maximum power point tracking methods for PV systems

    International Nuclear Information System (INIS)

    Liu, Yali; Li, Ming; Ji, Xu; Luo, Xi; Wang, Meidi; Zhang, Ying

    2014-01-01

    Highlights: • An improved maximum power point tracking method for PV systems was proposed. • The theoretical derivation procedure of the proposed method was provided. • Simulation models of MPPT trackers were established based on MATLAB/Simulink. • Experiments were conducted to verify the effectiveness of the proposed MPPT method. - Abstract: Maximum power point tracking (MPPT) algorithms play an important role in optimizing the power and efficiency of a photovoltaic (PV) generation system. To address the trade-off in the classical Perturb and Observe (P&Oa) method between response speed and steady-state tracking accuracy, an improved P&O (P&Ob) method based on the Aitken interpolation algorithm is put forward in this paper. To validate the correctness and performance of the proposed method, simulation and experimental studies were carried out. Simulation models of the classical P&Oa method and the improved P&Ob method were established in MATLAB/Simulink to analyze each technique under varying solar irradiation and temperature. The experimental results show that the tracking efficiency of the P&Ob method averages 93%, compared with 72% for the P&Oa method; this conclusion basically agrees with the simulation study. Finally, we propose the applicable conditions and scope of these MPPT methods in practical applications
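
    For readers unfamiliar with the baseline algorithm, the sketch below shows a minimal fixed-step perturb-and-observe loop against a toy single-peak power-voltage curve. The curve, step size and starting voltage are invented for illustration, and the Aitken-interpolation refinement of the improved method is not reproduced here.

```python
def pv_power(v, v_mpp=17.5, p_max=120.0):
    """Toy concave P-V curve peaking at v_mpp (illustrative only)."""
    return max(0.0, p_max - 0.8 * (v - v_mpp) ** 2)

def perturb_and_observe(v0=12.0, step=0.2, n_steps=200):
    """Classical fixed-step P&O: keep perturbing the operating voltage in the
    direction that increased the measured power, reverse it otherwise."""
    v, p_prev, direction = v0, pv_power(v0), +1
    for _ in range(n_steps):
        v += direction * step
        p = pv_power(v)
        if p < p_prev:              # power dropped: reverse the perturbation
            direction = -direction
        p_prev = p
    return v, p_prev

v_op, p_op = perturb_and_observe()
print("operating point: %.2f V, %.1f W" % (v_op, p_op))
```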

  1. Environmental reference materials methods and case studies

    DEFF Research Database (Denmark)

    Schramm-Nielsen, Karina Edith

    1998-01-01

    . This study lasted 22 months as well. The samples were produced and stored according to a 2³ factorial design. The influences of storage temperature, UV radiation and ultra-filtration on the stability of NH4-N and total phosphorous have been investigated. A Youden plot method is suggested for the graphical....... The methods have been evaluated with regard to their robustness towards variations in the chemical analytical method and with regard to the number of times a significant out of control situation is indicated. The second study regards the stability of NH4-N and total phosphorous in autoclaved seawater samples...... with wastewater. The purpose was to improve ortho-phosphate (and total phosphorous) homogeneity. A procedure is suggested which includes freeze-drying and redissolving. All calculations have been performed in SAS® primarily by means of elementary procedures, analyses of variance procedures, SAS Insight and SAS...

  2. A new method for processing INAA results without the necessity of standards

    International Nuclear Information System (INIS)

    Hemon, G.; Philippot, J.C.

    1986-01-01

    When neutron activation analysis is used for elemental determinations in samples taken from the environment and quite different in origin, certain questions arise: is the method absolute or relative, precise or accurate? How should objects be chosen to represent the subject studied? How should the conclusions of the measurement be used? How are the quality and intensity of the flux to be controlled? What corrections are needed for the effects of the perturbing elements uranium and boron? How sensitive is the method or - which amounts to the same thing - what is the best time to analyse an element in a given matrix? The authors attempt to answer these questions and illustrate the subject with a few specific examples: mineral and river water, sea and river sediments, aerosols, quartz tools, hair, nodules and Mn deposits, diamonds, wines, PWR effluents. (author)

  3. Meta-analytic methods for pooling rates when follow-up duration varies: a case study

    Directory of Open Access Journals (Sweden)

    Wolf Fredric M

    2004-07-01

    Full Text Available Abstract Background Meta-analysis can be used to pool rate measures across studies, but challenges arise when follow-up duration varies. Our objective was to compare different statistical approaches for pooling count data of varying follow-up times in terms of estimates of effect, precision, and clinical interpretability. Methods We examined data from a published Cochrane Review of asthma self-management education in children. We selected two rate measures with the largest number of contributing studies: school absences and emergency room (ER) visits. We estimated fixed- and random-effects standardized weighted mean differences (SMD), stratified incidence rate differences (IRD), and stratified incidence rate ratios (IRR). We also fit Poisson regression models, which allowed for further adjustment for clustering by study. Results For both outcomes, all methods gave qualitatively similar estimates of effect in favor of the intervention. For school absences, SMD showed modest results in favor of the intervention (SMD -0.14, 95% CI -0.23 to -0.04). IRD implied that the intervention reduced school absences by 1.8 days per year (IRD -0.15 days/child-month, 95% CI -0.19 to -0.11), while IRR suggested a 14% reduction in absences (IRR 0.86, 95% CI 0.83 to 0.90). For ER visits, SMD showed a modest benefit in favor of the intervention (SMD -0.27, 95% CI: -0.45 to -0.09). IRD implied that the intervention reduced ER visits by 1 visit every 2 years (IRD -0.04 visits/child-month, 95% CI: -0.05 to -0.03), while IRR suggested a 34% reduction in ER visits (IRR 0.66, 95% CI 0.59 to 0.74). In Poisson models, adjustment for clustering lowered the precision of the estimates relative to stratified IRR results. For ER visits but not school absences, failure to incorporate study indicators resulted in a different estimate of effect (unadjusted IRR 0.77, 95% CI 0.59 to 0.99). Conclusions Choice of method among the ones presented had little effect on inference but affected the
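
    As a minimal illustration of one of the approaches named in the abstract, the sketch below pools study-level incidence rate ratios on the log scale with inverse-variance weights (a common fixed-effect approximation). The event counts and person-time values are invented; they are not the Cochrane Review data.

```python
import numpy as np

# Invented per-study data (NOT the Cochrane data): event counts and
# person-time (child-months) in the intervention and control arms.
events_int = np.array([30, 12, 45])
time_int   = np.array([400.0, 150.0, 600.0])
events_ctl = np.array([50, 20, 70])
time_ctl   = np.array([420.0, 160.0, 610.0])

log_irr = np.log((events_int / time_int) / (events_ctl / time_ctl))
var_log = 1.0 / events_int + 1.0 / events_ctl   # approximate variance of log IRR
w = 1.0 / var_log                               # inverse-variance weights

pooled = np.sum(w * log_irr) / np.sum(w)
se = 1.0 / np.sqrt(np.sum(w))
print("pooled IRR %.2f (95%% CI %.2f to %.2f)"
      % (np.exp(pooled), np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se)))
```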

  4. AES Water Architecture Study Interim Results

    Science.gov (United States)

    Sarguisingh, Miriam J.

    2012-01-01

    The mission of the Advanced Exploration System (AES) Water Recovery Project (WRP) is to develop advanced water recovery systems in order to enable NASA human exploration missions beyond low earth orbit (LEO). The primary objective of the AES WRP is to develop water recovery technologies critical to near term missions beyond LEO. The secondary objective is to continue to advance mid-readiness level technologies to support future NASA missions. An effort is being undertaken to establish the architecture for the AES Water Recovery System (WRS) that meets both near and long term objectives. The resultant architecture will be used to guide future technical planning, establish a baseline development roadmap for technology infusion, and establish baseline assumptions for integrated ground and on-orbit environmental control and life support systems (ECLSS) definition. This study is being performed in three phases. Phase I of this study established the scope of the study through definition of the mission requirements and constraints, as well as identifying all possible WRS configurations that meet the mission requirements. Phase II of this study focused on the near term space exploration objectives by establishing an ISS-derived reference schematic for long-duration (>180 day) in-space habitation. Phase III will focus on the long term space exploration objectives, trading the viable WRS configurations identified in Phase I to identify the ideal exploration WRS. The results of Phases I and II are discussed in this paper.

  5. FNSP-studies, 1985-1992 : results and recommendations

    NARCIS (Netherlands)

    Hoorweg, J.C.

    1993-01-01

    The present report is a compilation of summaries detailing the results and recommendations of the more than twenty studies completed to date as part of the Food and Nutrition Studies Programme (FNSP), an international cooperation agreement between the Ministry of Planning and National Development in

  6. Chemical abundances of fast-rotating massive stars. I. Description of the methods and individual results

    Science.gov (United States)

    Cazorla, Constantin; Morel, Thierry; Nazé, Yaël; Rauw, Gregor; Semaan, Thierry; Daflon, Simone; Oey, M. S.

    2017-07-01

    Aims: Recent observations have challenged our understanding of rotational mixing in massive stars by revealing a population of fast-rotating objects with apparently normal surface nitrogen abundances. However, several questions have arisen because of a number of issues, which have rendered a reinvestigation necessary; these issues include the presence of numerous upper limits for the nitrogen abundance, unknown multiplicity status, and a mix of stars with different physical properties, such as their mass and evolutionary state, which are known to control the amount of rotational mixing. Methods: We have carefully selected a large sample of bright, fast-rotating early-type stars of our Galaxy (40 objects with spectral types between B0.5 and O4). Their high-quality, high-resolution optical spectra were then analysed with the stellar atmosphere modelling codes DETAIL/SURFACE or CMFGEN, depending on the temperature of the target. Several internal and external checks were performed to validate our methods; notably, we compared our results with literature data for some well-known objects, studied the effect of gravity darkening, or confronted the results provided by the two codes for stars amenable to both analyses. Furthermore, we studied the radial velocities of the stars to assess their binarity. Results: This first part of our study presents our methods and provides the derived stellar parameters, He, CNO abundances, and the multiplicity status of every star of the sample. It is the first time that He and CNO abundances of such a large number of Galactic massive fast rotators are determined in a homogeneous way. Based on observations obtained with the Heidelberg Extended Range Optical Spectrograph (HEROS) at the Telescopio Internacional de Guanajuato (TIGRE) with the SOPHIE échelle spectrograph at the Haute-Provence Observatory (OHP; Institut Pytheas; CNRS, France), and with the Magellan Inamori Kyocera Echelle (MIKE) spectrograph at the Magellan II Clay telescope

  7. Methods to study postprandial lipemia

    DEFF Research Database (Denmark)

    Ooi, Teik Chye; Nordestgaard, Børge G

    2011-01-01

    to the liver. In general, PPL occurs over 4-6 h in normal individuals, depending on the amount and type of fats consumed. The complexity of PPL changes is compounded by ingestion of food before the previous meal is fully processed. PPL testing is done to determine the impact of (a) exogenous factors...... such as the amount and type of food consumed, and (b) endogenous factors such as the metabolic/genetic status of the subjects, on PPL. To study PPL appropriately, different methods are used to suit the study goal. This paper provides an overview of the methodological aspects of PPL testing. It deals with markers...

  8. Neurochemistry of Alzheimer's disease and related dementias: Results of metabolic imaging and future application of ligand binding methods

    International Nuclear Information System (INIS)

    Frey, K.A.; Koeppe, R.A.; Kuhl, D.E.

    1991-01-01

    Although Alzheimer's disease (AD) has been recognized for over a decade as a leading cause of cognitive decline in the elderly, its etiology remains unknown. Radiotracer imaging studies have revealed characteristic patterns of abnormal energy metabolism and blood flow in AD. A consistent reduction in cerebral glucose metabolism, determined by positron emission tomography, is observed in the parietal, temporal, and frontal association cortices. It is proposed that this occurs on the basis of diffuse cortical pathology, resulting in disproportionate loss of presynaptic input to higher cortical association areas. Postmortem neurochemical studies consistently indicate a severe depletion of cortical presynaptic cholinergic markers in AD. This is accounted for by loss of cholinergic projection neurons in the basal forebrain. In addition, loss of extrinsic serotonergic innervation of the cortex and losses of intrinsic cortical markers such as somatostatin, substance P, glutamate receptors, and glutamate- and GABA-uptake sites are reported. These observations offer the opportunity for study in vivo with the use of radioligand imaging methods under development. The role of tracer imaging studies in the investigation and diagnosis of dementia is likely to become increasingly central, as metabolic imaging provides evidence of abnormality early in the clinical course. New neurochemical imaging methods will allow direct testing of hypotheses of selective neuronal degeneration, and will assist in design of future studies of AD pathophysiology

  9. A study of an active magnetic shielding method for the superconductive Maglev vehicle

    International Nuclear Information System (INIS)

    Nemoto, K.; Komori, M.

    2010-01-01

    Various methods of magnetic shielding have been studied so far to reduce magnetic field strength inside the passenger room of the superconductive Maglev vehicle. Magnetic shielding methods with ferromagnetic materials are very useful, but they tend to be heavier for large space. Though some passive magnetic shielding methods using induced currents in superconducting bulks or superconducting coils have also been studied, the induced current is relatively small and it is difficult to get satisfactory magnetic shielding performance for the passenger room of the Maglev vehicle. Thus, we have proposed an active magnetic shielding method with some superconducting coils of the same length as propulsion-levitation-guidance superconducting coils of the Maglev vehicle. They are arranged under the passenger room of the Maglev vehicle. Then, we studied the shielding effect by canceling magnetic flux density in the passenger room by way of adjusting magnetomotive-forces of the magnetic shielding coils. As a result, it is found that a simple arrangement of two magnetic shielding coils for one propulsion-levitation-guidance superconducting coil on the vehicle shows an effective magnetic shielding.

  10. A study of an active magnetic shielding method for the superconductive Maglev vehicle

    Energy Technology Data Exchange (ETDEWEB)

    Nemoto, K., E-mail: nemoto@kamakuranet.ne.j [Kyushu Institute of Technology, Dept. of Applied Science for Integrated System Engineering, 1-1 Sensui, Tobata, Kitakyushu, Fukuoka 804-8550 (Japan); Komori, M. [Kyushu Institute of Technology, Dept. of Applied Science for Integrated System Engineering, 1-1 Sensui, Tobata, Kitakyushu, Fukuoka 804-8550 (Japan)

    2010-11-01

    Various methods of magnetic shielding have been studied so far to reduce magnetic field strength inside the passenger room of the superconductive Maglev vehicle. Magnetic shielding methods with ferromagnetic materials are very useful, but they tend to be heavier for large space. Though some passive magnetic shielding methods using induced currents in superconducting bulks or superconducting coils have also been studied, the induced current is relatively small and it is difficult to get satisfactory magnetic shielding performance for the passenger room of the Maglev vehicle. Thus, we have proposed an active magnetic shielding method with some superconducting coils of the same length as propulsion-levitation-guidance superconducting coils of the Maglev vehicle. They are arranged under the passenger room of the Maglev vehicle. Then, we studied the shielding effect by canceling magnetic flux density in the passenger room by way of adjusting magnetomotive-forces of the magnetic shielding coils. As a result, it is found that a simple arrangement of two magnetic shielding coils for one propulsion-levitation-guidance superconducting coil on the vehicle shows an effective magnetic shielding.

  11. A study of economic utility resulting from CERN

    International Nuclear Information System (INIS)

    Schmied, H.

    1975-01-01

    The study attempts to quantify the technical and economic benefit to the manufacturing industries involved in CERN contracts, in relation to the expenditures on CERN by its Member States. Interviews were carried out in some 130 European firms, who supplied data on estimates of increased sales and decreased costs due to CERN contracts. This 'economic utility' totals 1,665 million Swiss francs (up to the year 1978), compared with a sales value to CERN of 394 MSF. Utility/sales ratios range from 0.9 to 7.3 for application fields of cables, magnets, cooling systems, vacuum equipment, electronics, and steels; they are as high as 17.3 for computers and 31.6 for precision mechanics. Some 80 per cent of the total reported utility results from sales to markets outside high-energy and nuclear physics, for example, railways, ship-building, refrigeration, power generation and power distribution. For the 877 MSF spent by CERN in European industry from its over-all budget of 3,500 MSF during 1955 to 1973, the total utility is estimated to be nearly 5,000 MSF. The method and procedure of analysis and quantification are discussed in detail and some specific cases are presented as examples. (author)

  12. Assessment of Three Flood Hazard Mapping Methods: A Case Study of Perlis

    Science.gov (United States)

    Azizat, Nazirah; Omar, Wan Mohd Sabki Wan

    2018-03-01

    Flooding is a common natural disaster that affects every state in Malaysia. According to the Drainage and Irrigation Department (DID), in 2007 about 29,270 km2, or 9 percent of the country's area, was prone to flooding. Floods can be devastating catastrophes that affect people, the economy and the environment. Flood hazard mapping is an important part of flood assessment, used to delineate high-risk areas prone to flooding. The purposes of this study are to prepare a flood hazard map of Perlis and to evaluate flood hazard using the frequency ratio, statistical index and Poisson methods. The six factors affecting the occurrence of floods (elevation, distance from the drainage network, rainfall, soil texture, geology and erosion) were mapped using ArcGIS 10.1 software. The flood location map in this study was generated from areas flooded in 2010, as recorded by the DID. These parameters and the flood location map were analysed to prepare a flood hazard map representing the probability of flooding. The results of the analysis were verified using flood location data from 2013, 2014 and 2015. The comparison showed that the statistical index method predicts flood areas better than the frequency ratio and Poisson methods.
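
    The frequency ratio referred to in the abstract is commonly computed, for each class of each conditioning factor, as the share of flood cells falling in the class divided by the share of all cells falling in the class. The sketch below illustrates that calculation with invented class counts; it does not reproduce the study's GIS workflow.

```python
import numpy as np

# Invented example for one conditioning factor (e.g. elevation split into
# four classes): total cells per class and flooded cells per class.
cells_per_class  = np.array([12000, 8000, 5000, 2000])
floods_per_class = np.array([  150,  220,  310,  140])

# Frequency ratio: the class's share of flood cells divided by its share of
# all cells; FR > 1 means the class favours flooding.
fr = ((floods_per_class / floods_per_class.sum())
      / (cells_per_class / cells_per_class.sum()))
for i, value in enumerate(fr, start=1):
    print("class %d: FR = %.2f" % (i, value))

# A simple flood-hazard index for a map cell is then the sum of the FR values
# of the classes that cell falls into, taken over all conditioning factors.
```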

  13. Comparative study of age estimation using dentinal translucency by digital and conventional methods

    Science.gov (United States)

    Bommannavar, Sushma; Kulkarni, Meena

    2015-01-01

    Introduction: Estimating age using the dentition plays a significant role in identifying individuals in forensic cases. Teeth are among the most durable and strongest structures in the human body. The morphology and arrangement of teeth vary from person to person and are unique to an individual, as are fingerprints. Therefore, the use of the dentition is the method of choice in identification of the unknown. Root dentin translucency is considered to be one of the best parameters for dental age estimation. Traditionally, root dentin translucency was measured using calipers. Recently, the use of custom-built software programs has been proposed for the same purpose. Objectives: The present study describes a method to measure root dentin translucency on sectioned teeth using a custom-built software program, Adobe Photoshop 7.0 (Adobe Systems Inc, Mountain View, California). Materials and Methods: A total of 50 single-rooted teeth were sectioned longitudinally to a uniform thickness of 0.25 mm, and root dentin translucency was measured using the digital and caliper methods and compared. The Gustafson morphohistologic approach was used in this study. Results: Correlation coefficients of translucency measurements to age were statistically significant for both methods (P < 0.125), and linear regression equations derived from both methods revealed a better ability of the digital method to assess age. Conclusion: The custom-built software program used in the present study is commercially available and widely used image-editing software. Furthermore, this method is easy to use and less time consuming. The measurements obtained using this method are more precise and thus help in more accurate age estimation. Considering these benefits, the present study recommends the use of the digital method to assess translucency for age estimation. PMID:25709325

  14. Interlaboratory study of a liquid chromatography method for erythromycin: determination of uncertainty.

    Science.gov (United States)

    Dehouck, P; Vander Heyden, Y; Smeyers-Verbeke, J; Massart, D L; Marini, R D; Chiap, P; Hubert, Ph; Crommen, J; Van de Wauw, W; De Beer, J; Cox, R; Mathieu, G; Reepmeyer, J C; Voigt, B; Estevenon, O; Nicolas, A; Van Schepdael, A; Adams, E; Hoogmartens, J

    2003-08-22

    Erythromycin is a mixture of macrolide antibiotics produced by Saccharopolyspora erythraea during fermentation. A new method for the analysis of erythromycin by liquid chromatography has previously been developed. It makes use of an Astec C18 polymeric column. After validation in one laboratory, the method has now been validated in an interlaboratory study. Validation studies are commonly used to test the fitness of an analytical method prior to its use for routine quality testing. The data derived in the interlaboratory study can be used to make an uncertainty statement as well. The relationship between validation and the uncertainty statement is not clear to many analysts, and there is a need to show how the existing data, derived during validation, can be used in practice. Eight laboratories participated in this interlaboratory study. The set-up allowed the determination of the repeatability variance, s²r, and the between-laboratory variance, s²L. Combining s²r and s²L gives the reproducibility variance s²R. It has been shown how these data can be used in future by a single laboratory that wants to make an uncertainty statement concerning the same analysis.
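
    For reference, the standard relationship used in ISO 5725-style interlaboratory studies is sketched below; this is general background on how such variances feed an uncertainty statement, not numbers or formulas quoted from the paper.

```latex
% Repeatability (s_r), between-laboratory (s_L) and reproducibility (s_R) variances:
s_R^2 = s_r^2 + s_L^2 .
% A single laboratory reporting one result x may then take, as a first approximation,
u(x) \approx s_R , \qquad U = k\,u(x) \;\text{ with } k \approx 2
\quad\text{(expanded uncertainty, roughly 95\,\% coverage)} .
```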

  15. Some new results on correlation-preserving factor scores prediction methods

    NARCIS (Netherlands)

    Ten Berge, J.M.F.; Krijnen, W.P.; Wansbeek, T.J.; Shapiro, A.

    1999-01-01

    Anderson and Rubin and McDonald have proposed a correlation-preserving method of factor scores prediction which minimizes the trace of a residual covariance matrix for variables. Green has proposed a correlation-preserving method which minimizes the trace of a residual covariance matrix for factors.

  16. FY 1974 report on the results of the Sunshine Project. Comprehensive study of hydrogen use subsystem and study on the periphery technology (Investigational study on the hydrogen production method by the quinone method); 1974 nendo suiso riyo subsystem no sogoteki kento to shuhen gijutsu ni kansuru kenkyu seika hokokusho. Kinonho ni yoru suiso seizoho ni kansuru chosa kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1975-05-01

    This work is aimed at an investigational study of hydrogen production from water by means of quinone compounds. The paper studied preparation methods using n-TiO2 photosemiconductors and p-GaAs and GaP photosemiconductors as photoelectrodes acting as catalysts of this reaction. The reaction from p-benzoquinone to p-hydroquinone through reaction with H2O was studied in terms of the photochemical reaction and the light/dark reactions of the photosemiconductor electrode. As a result, it was found that this reaction proceeds readily as a photochemical reaction under sunlight, and also that it proceeds at the Pt counter electrode during water electrolysis with an n-TiO2 photosemiconductor electrode. The H2 production reaction from p-hydroquinone was studied in terms of the photochemical reaction, the photosemiconductor electrode reaction, and the effects of electron-transport catalysts such as metal salts and methylviologen. As a result, in the n-TiO2 photosemiconductor electrode reaction, H2 formation was too small to be clearly confirmed. However, it was found that p-hydroquinone can produce H2 in the presence of methylviologen or Fe salts. (NEDO)

  17. Comparative studies on different molecular methods for ...

    African Journals Online (AJOL)

    The present study aims to evaluate two molecular methods for epidemiological typing of multi drug resistant Klebsiella pneumoniae isolated from Mansoura Hospitals. In this study, a total of 300 clinical isolates were collected from different patients distributed among Mansoura Hospitals, Dakahlia governorate, Egypt.

  18. Comparative study of environmental impact assessment methods ...

    African Journals Online (AJOL)

    This study aims to introduce and systematically investigate the environmental issues during important decision-making stages. Meanwhile, impacts of development on the environmental components will be also analyzed. This research studies various methods of predicting the environmental changes and determining the ...

  19. Application of activity theory to analysis of human-related accidents: Method and case studies

    International Nuclear Information System (INIS)

    Yoon, Young Sik; Ham, Dong-Han; Yoon, Wan Chul

    2016-01-01

    This study proposes a new approach to human-related accident analysis based on activity theory. Most of the existing methods seem to be insufficient for comprehensive analysis of human activity-related contextual aspects of accidents when investigating the causes of human errors. Additionally, they identify causal factors and their interrelationships with a weak theoretical basis. We argue that activity theory offers useful concepts and insights to supplement existing methods. The proposed approach gives holistic contextual backgrounds for understanding and diagnosing human-related accidents. It also helps identify and organise causal factors in a consistent, systematic way. Two case studies in Korean nuclear power plants are presented to demonstrate the applicability of the proposed method. Human Factors Analysis and Classification System (HFACS) was also applied to the case studies. The results of using HFACS were then compared with those of using the proposed method. These case studies showed that the proposed approach could produce a meaningful set of human activity-related contextual factors, which cannot easily be obtained by using existing methods. It can be especially effective when analysts think it is important to diagnose accident situations with human activity-related contextual factors derived from a theoretically sound model and to identify accident-related contextual factors systematically. - Highlights: • This study proposes a new method for analysing human-related accidents. • The method was developed based on activity theory. • The concept of activity system model and contradiction was used in the method. • Two case studies in nuclear power plants are presented. • The method is helpful to consider causal factors systematically and comprehensively.

  20. Study on calculation methods for the effective delayed neutron fraction

    International Nuclear Information System (INIS)

    Irwanto, Dwi; Obara, Toru; Chiba, Go; Nagaya, Yasunobu

    2011-03-01

    The effective delayed neutron fraction β_eff is one of the important neutronic parameters from the viewpoint of reactor kinetics. Several Monte-Carlo-based methods to estimate β_eff have been proposed to date. In order to quantify the accuracy of these methods, we study calculation methods for β_eff by analyzing various fast neutron systems including the bare spherical systems (Godiva, Jezebel, Skidoo, Jezebel-240), the reflected spherical systems (Popsy, Topsy, Flattop-23), MASURCA-R2 and MASURCA-ZONA2, and FCA XIX-1, XIX-2 and XIX-3. These analyses are performed using SLAROM-UF and CBG for the deterministic method and MVP-II for the Monte Carlo method. We calculate β_eff with various definitions such as the fundamental value β_0, the standard definition, Nauchi's definition and Meulekamp's definition, and compare these results with each other. Through the present study, we find the following: The largest difference among the standard definition of β_eff, Nauchi's β_eff and Meulekamp's β_eff is approximately 10%. The fundamental value β_0 is considerably larger than the others in several cases. For all the cases, Meulekamp's β_eff is always higher than Nauchi's β_eff. This is because Nauchi's β_eff considers the average neutron multiplicity per fission, which is large in the high energy range (1 MeV-10 MeV), while the definition of Meulekamp's β_eff does not include this parameter. Furthermore, we evaluate the multi-generation effect on β_eff values and demonstrate that this effect should be considered to obtain the standard-definition values of β_eff. (author)
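
    For orientation, one widely used Monte Carlo estimate of β_eff is the so-called prompt method shown below, together with the usual meaning of the fundamental value β_0. These are given as general background and are not identical to the Nauchi or Meulekamp estimators compared in the report.

```latex
% k   : multiplication factor with all (prompt + delayed) fission neutrons,
% k_p : multiplication factor when only prompt neutrons are transported.
\beta_{\mathrm{eff}} \;\approx\; 1 - \frac{k_p}{k},
\qquad
\beta_0 \;=\; \frac{\bar{\nu}_d}{\bar{\nu}}
\quad\text{(delayed fraction of the average fission neutron yield, without adjoint weighting)} .
```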

  1. Studies on the instrumental neutron activation analysis by cadmium ratio method and pair comparator method

    Energy Technology Data Exchange (ETDEWEB)

    Chao, H E; Lu, W D; Wu, S C

    1977-12-01

    The cadmium ratio method and the pair comparator method provide a solution for the effects on the effective activation factors resulting from the variation of the neutron spectrum at different irradiation positions, as usually encountered in the single comparator method. The relations between the activation factors and the neutron spectrum, in terms of the cadmium ratio of the comparator Au or of the activation factor of the Co-Au pair, have been determined for the elements Sc, Cr, Mn, Co, La, Ce, Sm, and Th. The activation factors of the elements at any irradiation position can then be obtained from the cadmium ratio of the comparator and/or the activation factor of the comparator pair. The relations determined should be applicable to different reactors and/or different positions in a reactor. It is shown that, for the isotopes ⁴⁶Sc, ⁵¹Cr, ⁵⁶Mn, ⁶⁰Co, ¹⁴⁰La, ¹⁴¹Ce, ¹⁵³Sm and ²³³Pa, the thermal neutron activation factors determined by these two methods were generally in agreement with theoretical values. Their I₀/σth values also appeared to agree with literature values. The methods were applied to determine the contents of the elements Sc, Cr, Mn, La, Ce, Sm, and Th in U.S.G.S. Standard Rock G-2, and the results were also in agreement with literature values. The cadmium ratio method and the pair comparator method improve on the single comparator method and are more suitable for multi-element analysis of a large number of samples.
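
    In its simplest idealised form (neglecting cadmium transmission and self-shielding corrections), the cadmium ratio the abstract relies on relates the thermal and epithermal contributions to the reaction rate as follows; this is textbook background rather than the exact formulation used by the authors.

```latex
% Reaction rate per nucleus in a thermal flux \phi_{th} and epithermal flux \phi_{epi}:
R \;=\; \phi_{th}\,\sigma_{th} \;+\; \phi_{epi}\, I_0 ,
\qquad
R_{\mathrm{Cd}} \;=\; \frac{\phi_{th}\,\sigma_{th} + \phi_{epi}\, I_0}{\phi_{epi}\, I_0}
\;=\; 1 + \frac{\phi_{th}}{\phi_{epi}}\cdot\frac{\sigma_{th}}{I_0} .
```

    The measured cadmium ratio of a comparator therefore characterises the flux ratio φ_th/φ_epi at an irradiation position, which in turn fixes the effective activation factor of each element through its σ_th and I₀.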

  2. Comparison between two sampling methods by results obtained using petrographic techniques, specially developed for minerals of the Itataia uranium phosphate deposit, Ceara, Brazil

    International Nuclear Information System (INIS)

    Salas, H.T.; Murta, R.L.L.

    1985-01-01

    The results of a comparison of two sampling methods applied to a gallery of the uranium-phosphate ore body of Itataia, Ceara State, Brazil, along 235 metres of mineralized zone, are presented. The results were obtained through petrographic techniques especially developed for and applied to both samplings. In the first, hand samples from systematic sampling at 2-metre intervals were studied; estimated mineralogical compositions were then determined, and some petrogenetic observations were verified for the first time. The second sampling was made at 20-metre intervals: 570 tons of ore were extracted and distributed into sections, and a sample representing each section was studied after crushing at -65. The mineralogy of these samples was quantified and the degree of liberation of apatite calculated. Based on the mineralogical data obtained, it was possible to characterise both samplings and to compare the main mineralogical groups (phosphates, carbonates and silicates). Despite the different methods and methodologies used, and the quite irregular stockwork-type mineralization, the results were satisfactory. (Author) [pt

  3. High rates of HIV seroconversion in pregnant women and low reported levels of HIV testing among male partners in Southern Mozambique: results from a mixed methods study.

    Directory of Open Access Journals (Sweden)

    Caroline De Schacht

    Full Text Available INTRODUCTION: Prevention of acute HIV infections in pregnancy is required to achieve elimination of pediatric HIV. Identification and support for HIV negative pregnant women and their partners, particularly serodiscordant couples, are critical. A mixed method study done in Southern Mozambique estimated HIV incidence during pregnancy, associated risk factors and factors influencing partner's HIV testing. METHODS: Between April 2008 and November 2011, a prospective cohort of 1230 HIV negative pregnant women was followed during pregnancy. A structured questionnaire, HIV testing, and collection of dried blood spots were done at 2-3 scheduled visits. HIV incidence rates were calculated by repeat HIV testing and risk factors assessed by Poisson regression. A qualitative study including 37 individual interviews with men, women, and nurses and 11 focus group discussions (n = 94) with men, women and grandmothers explored motivators and barriers to uptake of male HIV testing. RESULTS: HIV incidence rate was estimated at 4.28/100 women-years (95%CI: 2.33-7.16). Significant risk factors for HIV acquisition were early sexual debut (RR 3.79, 95%CI: 1.04-13.78, p = 0.04) and living in Maputo Province (RR 4.35, 95%CI: 0.97-19.45, p = 0.05). Nineteen percent of women reported that their partner had tested for HIV (93% knew the result), with 8/213 indicating an HIV positive partner, 56% said their partner had not tested and 19% did not know their partner test status. Of the 14 seroconversions, only one reported being in a serodiscordant relationship. Fear of discrimination or stigma was reported as a key barrier to male HIV testing, while knowing the importance of getting tested and receiving care was the main motivator. CONCLUSIONS: HIV incidence during pregnancy is high in Southern Mozambique, but knowledge of partners' HIV status remains low. Knowledge of both partners' HIV status is critical for maximal effectiveness of prevention and treatment services to reach

  4. Standardization of a method to study angiogenesis in a mouse model

    Directory of Open Access Journals (Sweden)

    DAVID FEDER

    2013-01-01

    Full Text Available In the adult organism, angiogenesis is restricted to a few physiological conditions. On the other hand, uncontrolled angiogenesis has often been associated with angiogenesis-dependent pathologies. A variety of animal models have been described to provide more quantitative analysis of in vivo angiogenesis and to characterize pro- and antiangiogenic molecules. However, it is still necessary to establish a quantitative, reproducible and specific method for studies of angiogenesis factors and inhibitors. This work aimed to standardize a method for the study of angiogenesis and to investigate the effects of thalidomide on angiogenesis. Sponges of 0.5 x 0.5 x 0.5 cm were implanted in the backs of mice in two groups, control and experimental (thalidomide 200 mg/kg/day by gavage). After seven days, the sponges were removed. Hemoglobin was measured in the sponge and in the circulation, and the ratio between the values was tested using the nonparametric Mann-Whitney test. The results showed that sponge-induced angiogenesis, quantitated by the ratio between hemoglobin content in serum and in sponge, is a helpful model for in vivo studies of angiogenesis. Moreover, it was observed that sponge-induced angiogenesis can be suppressed by thalidomide, supporting the validity of the standardized method.

  5. Comments on Brodsky's statistical methods for evaluating epidemiological results, and reply by Brodsky, A

    International Nuclear Information System (INIS)

    Frome, E.L.; Khare, M.

    1980-01-01

    Brodsky's paper 'A Statistical Method for Testing Epidemiological Results, as applied to the Hanford Worker Population', (Health Phys., 36, 611-628, 1979) proposed two test statistics for use in comparing the survival experience of a group of employees and controls. This letter states that both of the test statistics were computed using incorrect formulas and concludes that the results obtained using these statistics may also be incorrect. In his reply Brodsky concurs with the comments on the proper formulation of estimates of pooled standard errors in constructing test statistics but believes that the erroneous formulation does not invalidate the major points, results and discussions of his paper. (author)

  6. Water-quality trends in the nation’s rivers and streams, 1972–2012—Data preparation, statistical methods, and trend results

    Science.gov (United States)

    Oelsner, Gretchen P.; Sprague, Lori A.; Murphy, Jennifer C.; Zuellig, Robert E.; Johnson, Henry M.; Ryberg, Karen R.; Falcone, James A.; Stets, Edward G.; Vecchia, Aldo V.; Riskin, Melissa L.; De Cicco, Laura A.; Mills, Taylor J.; Farmer, William H.

    2017-04-04

    Since passage of the Clean Water Act in 1972, Federal, State, and local governments have invested billions of dollars to reduce pollution entering rivers and streams. To understand the return on these investments and to effectively manage and protect the Nation’s water resources in the future, we need to know how and why water quality has been changing over time. As part of the National Water-Quality Assessment Project, of the U.S. Geological Survey’s National Water-Quality Program, data from the U.S. Geological Survey, along with multiple other Federal, State, Tribal, regional, and local agencies, have been used to support the most comprehensive assessment conducted to date of surface-water-quality trends in the United States. This report documents the methods used to determine trends in water quality and ecology because these methods are vital to ensuring the quality of the results. Specific objectives are to document (1) the data compilation and processing steps used to identify river and stream sites throughout the Nation suitable for water-quality, pesticide, and ecology trend analysis, (2) the statistical methods used to determine trends in target parameters, (3) considerations for water-quality, pesticide, and ecology data and streamflow data when modeling trends, (4) sensitivity analyses for selecting data and interpreting trend results with the Weighted Regressions on Time, Discharge, and Season method, and (5) the final trend results at each site. The scope of this study includes trends in water-quality concentrations and loads (nutrient, sediment, major ion, salinity, and carbon), pesticide concentrations and loads, and metrics for aquatic ecology (fish, invertebrates, and algae) for four time periods: (1) 1972–2012, (2) 1982–2012, (3) 1992–2012, and (4) 2002–12. In total, nearly 12,000 trends in concentration, load, and ecology metrics were evaluated in this study; there were 11,893 combinations of sites, parameters, and trend periods. The

  7. Method of research and study of uranium deposits; Methode de recherches et d'etude des gites uraniferes

    Energy Technology Data Exchange (ETDEWEB)

    Lenoble, A [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1955-07-01

    In the first part, the author gives a brief retrospective of the estimates of uranium deposits in the French Union. The author established a method of prospecting and study, adjustable at any time in the light of experience and results, that permits a general inventory of the uranium resources of the territory. The method is based on: 1 - the determination of geological guides in order to identify the most promising deposits, 2 - the definition of a methodology adapted to each step of the research, 3 - the choice of the equipment suited to each step. This method, originally established for prospecting in crystalline massifs, is adaptable to the prospecting of sedimentary formations. (M.B.)

  8. A comparative study of cultural methods for the detection of Salmonella in feed and feed ingredients

    Directory of Open Access Journals (Sweden)

    Haggblom Per

    2009-02-01

    Full Text Available Abstract Background Animal feed as a source of infection to food producing animals is much debated. In order to increase our present knowledge about possible feed transmission it is important to know that the present isolation methods for Salmonella are reliable also for feed materials. In a comparative study the ability of the standard method used for isolation of Salmonella in feed in the Nordic countries, the NMKL71 method (Nordic Committee on Food Analysis), was compared to the Modified Semisolid Rappaport Vassiliadis method (MSRV) and the international standard method (EN ISO 6579:2002). Five different feed materials were investigated, namely wheat grain, soybean meal, rape seed meal, palm kernel meal, pellets of pig feed and also scrapings from a feed mill elevator. Four different levels of the Salmonella serotypes S. Typhimurium, S. Cubana and S. Yoruba were added to each feed material, respectively. For all methods pre-enrichment in Buffered Peptone Water (BPW) was carried out, followed by enrichments in the different selective media and finally plating on selective agar media. Results The results obtained with all three methods showed no differences in detection levels, with an accuracy and sensitivity of 65% and 56%, respectively. However, Müller-Kauffmann tetrathionate-novobiocin broth (MKTTn) performed less well due to many false-negative results on Brilliant Green agar (BGA) plates. Compared to other feed materials palm kernel meal showed a higher detection level with all serotypes and methods tested. Conclusion The results of this study showed that the accuracy, sensitivity and specificity of the investigated cultural methods were equivalent. However, the detection levels for different feed and feed ingredients varied considerably.

  9. Direct-substitution method for studying second harmonic generation in arbitrary optical superlattices

    Directory of Open Access Journals (Sweden)

    Ying Chen

    Full Text Available In this paper, we present the direct-substitution (DS) method to study second-harmonic generation (SHG) in arbitrary one-dimensional optical superlattices (OS). Applying this method to Fibonacci and generalized Fibonacci systems, we obtain the relative intensity of SHG and compare it with previous works. We confirmed the validity of the proposed DS method by comparing our results for SHG in quasiperiodic Fibonacci OS with previous works using the analytical Fourier transform method. Furthermore, the three-dimensional SHG spectra obtained by the DS method present the properties of SHG in Fibonacci OS more distinctly. More importantly, the DS method imposes very few restrictions and can be used to compute directly and conveniently the intensity of SHG in arbitrary OS where quasi-phase-matching (QPM) can be achieved. It shows that the DS method is powerful for the calculation of the electric field and intensity of SHG and can conveniently help experimentalists estimate the distributions of SHG in any designed polarized systems. Keywords: Second-harmonic generation, Direct-substitution, Fibonacci
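
    The abstract does not reproduce the DS formalism itself; as loosely related background, under the usual undepleted-pump quasi-phase-matching picture the relative SHG intensity of a one-dimensional domain structure is proportional to the squared magnitude of the Fourier component of the sign of the nonlinear coefficient at the phase mismatch. The sketch below evaluates that quantity for a Fibonacci word of domains; the block lengths and the mapping of blocks to poling signs are invented for illustration and are not the construction used in the paper.

```python
import numpy as np

def fibonacci_word(n):
    """Fibonacci word via the simultaneous substitution A -> AB, B -> A."""
    s = "A"
    for _ in range(n):
        s = "".join("AB" if c == "A" else "A" for c in s)
    return s

def shg_intensity(word, l_a, l_b, dk):
    """Relative SHG intensity |(1/L) * integral of sign(d(x)) exp(i*dk*x) dx|^2
    for a block sequence where, illustratively, A blocks are poled +1 and B blocks -1."""
    field, x = 0.0 + 0.0j, 0.0
    for c in word:
        length = l_a if c == "A" else l_b
        sign = +1.0 if c == "A" else -1.0
        # analytic integral of exp(i*dk*x) over this domain, weighted by its poling sign
        field += sign * (np.exp(1j * dk * (x + length)) - np.exp(1j * dk * x)) / (1j * dk)
        x += length
    return abs(field / x) ** 2

word = fibonacci_word(10)             # 144 blocks
l_a, l_b = 10.0, 10.0 / 1.618         # illustrative block lengths (arbitrary units)
dks = np.linspace(0.05, 1.5, 2000)    # scan of the phase mismatch
spectrum = [shg_intensity(word, l_a, l_b, dk) for dk in dks]
print("strongest quasi-phase-matching peak at dk = %.3f"
      % dks[int(np.argmax(spectrum))])
```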

  10. Infants and young children modeling method for numerical dosimetry studies: application to plane wave exposure

    International Nuclear Information System (INIS)

    Dahdouh, S; Wiart, J; Bloch, I; Varsier, N; Nunez Ochoa, M A; Peyman, A

    2016-01-01

    Numerical dosimetry studies require the development of accurate numerical 3D models of the human body. This paper proposes a novel method for building 3D heterogeneous young children models combining results obtained from a semi-automatic multi-organ segmentation algorithm and an anatomy deformation method. The data consist of 3D magnetic resonance images, which are first segmented to obtain a set of initial tissues. A deformation procedure guided by the segmentation results is then developed in order to obtain five young children models ranging from the age of 5 to 37 months. By constraining the deformation of an older child model toward a younger one using segmentation results, we assure the anatomical realism of the models. Using the proposed framework, five models, containing thirteen tissues, are built. Three of these models are used in a prospective dosimetry study to analyze young child exposure to radiofrequency electromagnetic fields. The results tend to show the existence of a relationship between age and whole-body exposure. They also highlight the need to specifically study and measure the dielectric properties of children's tissues. (paper)

  11. Infants and young children modeling method for numerical dosimetry studies: application to plane wave exposure

    Science.gov (United States)

    Dahdouh, S.; Varsier, N.; Nunez Ochoa, M. A.; Wiart, J.; Peyman, A.; Bloch, I.

    2016-02-01

    Numerical dosimetry studies require the development of accurate numerical 3D models of the human body. This paper proposes a novel method for building 3D heterogeneous young children models combining results obtained from a semi-automatic multi-organ segmentation algorithm and an anatomy deformation method. The data consist of 3D magnetic resonance images, which are first segmented to obtain a set of initial tissues. A deformation procedure guided by the segmentation results is then developed in order to obtain five young children models ranging from the age of 5 to 37 months. By constraining the deformation of an older child model toward a younger one using segmentation results, we assure the anatomical realism of the models. Using the proposed framework, five models, containing thirteen tissues, are built. Three of these models are used in a prospective dosimetry study to analyze young child exposure to radiofrequency electromagnetic fields. The results tend to show the existence of a relationship between age and whole-body exposure. They also highlight the need to specifically study and measure the dielectric properties of children's tissues.

  12. Experimental study of sodium droplet burning in free fall. Evaluation of preliminary test results

    International Nuclear Information System (INIS)

    Miyahara, Shinya; Ara, Kuniaki

    1998-08-01

    To study sodium leak and combustion behavior phenomenologically and to construct a mechanistic evaluation method, an experimental series on sodium droplet burning in free fall is under way. In this study, the accuracy of the measurement technique used in the preliminary test was assessed and a modified technique was proposed for the next test series. An analytical study of the test results was also conducted to deduce the dominant parameters and important measurement items which would play an important role in the droplet combustion behavior. The results and conclusions are as follows: (1) Assessment of measurement accuracy and modified technique proposed for the next test series. a) The control accuracy of the sodium supply system using β-alumina solid electrolyte was sufficient for generation of the objective size of a single droplet. However, it is necessary to calibrate the correlation between the quantity of electric charge for the sodium supply system and the quantity of supplied sodium. b) The measurement accuracy of the falling velocity using high-speed video was ±0.33 m/s at an upper part and ±0.48 m/s at a lower part of the measurement. To reduce the error, a high-speed stroboscopic method is recommended for measuring the falling velocity of the droplet. (2) Results of the analytical study and deduced dominant parameters and important measurement items. a) The falling behavior of a burning droplet was described by solving the equation of free-falling motion for a rigid sphere. In the case of greater falling heights, it is necessary to study the effects of burning on the falling behavior. b) The mass burned of a falling droplet was calculated using the combustion model according to the 'D²' law during the full burning phase. It is necessary to study the dominant chemical reaction in the burning flame because the mass burned depends on the composition of the reaction products. c) The mass burned was calculated using a surface oxidation model for the pre-ignition phase together with the above model. However, it is
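
    The 'D²' law mentioned in item (2)b is the classical quasi-steady droplet-combustion result, quoted here as general background (the burning-rate constant K depends on the fuel and atmosphere and is not given in the abstract):

```latex
% Quasi-steady droplet burning: the square of the droplet diameter decreases
% linearly in time once full burning is established.
D^2(t) \;=\; D_0^2 \;-\; K\,t ,
\qquad
m(t) \;=\; \frac{\pi}{6}\,\rho_\ell\, D^3(t) ,
```

    where D_0 is the initial diameter, K the burning-rate constant and ρ_ℓ the liquid density, so the mass burned over the full-burning phase follows from the decrease of D(t).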

  13. A Comparative Study on the Architecture Internet of Things and its’ Implementation method

    Science.gov (United States)

    Xiao, Zhiliang

    2017-08-01

    With the rapid development of science and technology, the Internet-based Internet of Things has emerged and achieved good results. In order to build a more complete Internet of Things system and realize its design, the structural indicators of Internet of Things networks need to be compared; on this basis, the ways in which things are connected must be studied in more depth so that the network architecture and its implementation methods can be unified. This paper analyzes two types of Internet of Things systems, makes a brief comparative study of their important indicators, and then introduces the connection and realization methods of the Internet of Things on the basis of its concept and architecture.

  14. Multi-reader ROC studies with split-plot designs: a comparison of statistical methods.

    Science.gov (United States)

    Obuchowski, Nancy A; Gallas, Brandon D; Hillis, Stephen L

    2012-12-01

    Multireader imaging trials often use a factorial design, in which study patients undergo testing with all imaging modalities and readers interpret the results of all tests for all patients. A drawback of this design is the large number of interpretations required of each reader. Split-plot designs have been proposed as an alternative, in which one or a subset of readers interprets all images of a sample of patients, while other readers interpret the images of other samples of patients. In this paper, the authors compare three methods of analysis for the split-plot design. Three statistical methods are presented: the Obuchowski-Rockette method modified for the split-plot design, a newly proposed marginal-mean analysis-of-variance approach, and an extension of the three-sample U-statistic method. A simulation study using the Roe-Metz model was performed to compare the type I error rate, power, and confidence interval coverage of the three test statistics. The type I error rates for all three methods are close to the nominal level but tend to be slightly conservative. The statistical power is nearly identical for the three methods. The coverage of 95% confidence intervals falls close to the nominal coverage for small and large sample sizes. The split-plot multireader, multicase study design can be statistically efficient compared to the factorial design, reducing the number of interpretations required per reader. Three methods of analysis, shown to have nominal type I error rates, similar power, and nominal confidence interval coverage, are available for this study design. Copyright © 2012 AUR. All rights reserved.
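
    The kind of simulation described above can be illustrated with a much-simplified sketch: a Roe-Metz-like score model for a split-plot layout under the null hypothesis of equal modality accuracy, analyzed with a naive paired t-test on reader-level AUC differences. This is not the Obuchowski-Rockette, marginal-mean ANOVA, or U-statistic analysis compared in the paper; it only shows how an empirical type I error rate can be estimated by Monte Carlo.

```python
# Simplified Monte Carlo sketch of a split-plot reader study under the null
# hypothesis (no modality difference). NOT one of the three analyses compared
# in the paper; the naive t-test below ignores part of the correlation
# structure that those methods account for.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def auc(neg, pos):
    """Empirical AUC (Mann-Whitney statistic) from case scores."""
    greater = (pos[:, None] > neg[None, :]).mean()
    ties = (pos[:, None] == neg[None, :]).mean()
    return greater + 0.5 * ties

def one_trial(readers_per_block=3, blocks=2, n_neg=40, n_pos=40,
              case_sd=0.7, noise_sd=0.7):
    """One simulated split-plot study: each block has its own cases, read only
    by the readers assigned to that block, under both modalities."""
    diffs = []
    for _ in range(blocks):
        # case effects shared by all readers and both modalities within a block
        neg_case = rng.normal(0.0, case_sd, n_neg)
        pos_case = rng.normal(1.0, case_sd, n_pos)
        for _ in range(readers_per_block):
            aucs = []
            for _modality in range(2):      # two modalities, equal true AUC
                neg = neg_case + rng.normal(0.0, noise_sd, n_neg)
                pos = pos_case + rng.normal(0.0, noise_sd, n_pos)
                aucs.append(auc(neg, pos))
            diffs.append(aucs[0] - aucs[1])
    # naive paired t-test on reader-level AUC differences
    return stats.ttest_1samp(diffs, 0.0).pvalue

pvals = np.array([one_trial() for _ in range(500)])
print("empirical type I error at alpha = 0.05:", (pvals < 0.05).mean())
```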

  15. The Findings from the OECD/NEA/CSNI UMS (Uncertainty Method Study)

    International Nuclear Information System (INIS)

    D'Auria, F.; Glaeser, H.

    2013-01-01

    Within licensing procedures there is an incentive to replace the conservative requirements for code application by a 'best estimate' concept supplemented by an uncertainty analysis to account for predictive uncertainties of code results. Methods have been developed to quantify these uncertainties. The Uncertainty Methods Study (UMS) Group, following a mandate from the CSNI (Committee on the Safety of Nuclear Installations) of the OECD/NEA (Organization for Economic Cooperation and Development / Nuclear Energy Agency), has compared five methods for calculating the uncertainty in the predictions of advanced 'best estimate' thermal-hydraulic codes. Most of the methods identify and combine input uncertainties. The major differences between the predictions of the methods came from the choice of uncertain parameters and the quantification of the input uncertainties, i.e. the width of the uncertainty ranges. Therefore, suitable experimental and analytical information has to be selected to specify these uncertainty ranges or distributions. After the closure of the Uncertainty Methods Study (UMS) and after the report was issued, comparison calculations of experiment LSTF-SB-CL-18 were performed by the University of Pisa using different versions of the RELAP5 code. It turned out that the version used by two of the participants calculated a 170 K higher peak clad temperature compared with other versions using the same input deck. This may contribute to the differences in the upper limit of the uncertainty ranges. A 'bifurcation' analysis was also performed by the same research group, providing another way of interpreting the high temperature peak calculated by two of the participants. (authors)
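
    One widely used ingredient of such statistical uncertainty methods (for example, GRS-type approaches represented in the UMS) is Wilks' formula, which fixes the minimum number of code runs needed for a given tolerance limit; the sketch below computes the familiar 59-run (one-sided) and 93-run (two-sided) results for 95%/95% statements. This illustrates the order-statistics argument only and is not a reproduction of any participant's method.

```python
# Minimal sketch: Wilks' formula for the minimum number of code runs so that
# the extreme run(s) bound the true 95th percentile with 95% confidence.

def wilks_one_sided(beta=0.95, gamma=0.95):
    """Smallest N with 1 - beta**N >= gamma (one-sided tolerance limit)."""
    n = 1
    while 1.0 - beta**n < gamma:
        n += 1
    return n

def wilks_two_sided(beta=0.95, gamma=0.95):
    """Smallest N with 1 - beta**N - N*(1-beta)*beta**(N-1) >= gamma."""
    n = 2
    while 1.0 - beta**n - n * (1.0 - beta) * beta**(n - 1) < gamma:
        n += 1
    return n

print(wilks_one_sided())   # 59 runs for a one-sided 95%/95% statement
print(wilks_two_sided())   # 93 runs for a two-sided 95%/95% statement
```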

  16. Electromagnetic computation methods for lightning surge protection studies

    CERN Document Server

    Baba, Yoshihiro

    2016-01-01

    This book is the first to consolidate current research and to examine the theories of electromagnetic computation methods in relation to lightning surge protection. The authors introduce and compare existing electromagnetic computation methods such as the method of moments (MOM), the partial element equivalent circuit (PEEC), the finite element method (FEM), the transmission-line modeling (TLM) method, and the finite-difference time-domain (FDTD) method. The application of the FDTD method to lightning protection studies is a topic that has matured through many practical applications in the past decade, and the authors explain the derivation of Maxwell's equations required by the FDTD method, and the modeling of various electrical components needed in computing lightning electromagnetic fields and surges with the FDTD method. The book describes the application of the FDTD method to current and emerging problems of lightning surge protection of continuously more complex installations, particularly in critical infrastructures of e...
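
    As a flavour of the FDTD updates that such texts derive from Maxwell's equations, here is a minimal one-dimensional Yee-scheme sketch (not taken from the book); the grid size, time step, and Gaussian source are arbitrary placeholders rather than a lightning surge model.

```python
# Minimal 1D FDTD (Yee) sketch: leapfrog update of Ex and Hy derived from
# Maxwell's curl equations in free space. Illustrative only; no absorbing
# boundaries, and the source is a generic Gaussian pulse.
import numpy as np

c0 = 299792458.0                  # speed of light in vacuum, m/s
eps0 = 8.8541878128e-12           # vacuum permittivity, F/m
mu0 = 4.0e-7 * np.pi              # vacuum permeability, H/m

nz, nt = 400, 600                 # number of cells and time steps (placeholders)
dz = 1.0e-3                       # cell size, m
dt = dz / (2.0 * c0)              # Courant number 0.5 -> stable in 1D

ex = np.zeros(nz)                 # electric field
hy = np.zeros(nz)                 # magnetic field (staggered half a cell)

for n in range(nt):
    # dHy/dt = -(1/mu0) dEx/dz   (leapfrog half-step)
    hy[:-1] -= dt / (mu0 * dz) * (ex[1:] - ex[:-1])
    # dEx/dt = -(1/eps0) dHy/dz
    ex[1:] -= dt / (eps0 * dz) * (hy[1:] - hy[:-1])
    # soft Gaussian source at the grid centre (placeholder transient, not a
    # standardized lightning current waveform); grid ends simply reflect
    ex[nz // 2] += np.exp(-((n - 60) / 20.0) ** 2)

print("peak |Ex| on the grid:", np.abs(ex).max())
```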

  17. Hawaii Solar Integration Study: Solar Modeling Developments and Study Results; Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Orwig, K.; Corbus, D.; Piwko, R.; Schuerger, M.; Matsuura, M.; Roose, L.

    2012-12-01

    The Hawaii Solar Integration Study (HSIS) is a follow-up to the Oahu Wind Integration and Transmission Study completed in 2010. HSIS focuses on the impacts of higher penetrations of solar energy on the electrical grid and on other generation. HSIS goes beyond the island of Oahu and investigates Maui as well. The study examines reserve strategies, impacts on thermal unit commitment and dispatch, utilization of energy storage, renewable energy curtailment, and other aspects of grid reliability and operation. For the study, high-frequency (2-second) solar power profiles were generated using a new combined Numerical Weather Prediction model/stochastic-kinematic cloud model approach, which represents the 'sharp-edge' effects of clouds passing over solar facilities. As part of the validation process, the solar data were evaluated using a variety of analysis techniques, including wavelets, power spectral densities, ramp distributions, extreme values, and cross correlations. This paper provides an overview of the study objectives, the results of the solar profile validation, and the study results.
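
    One of the validation diagnostics listed above, the ramp distribution, can be sketched in a few lines; the 2-second power profile below is synthetic and stands in for the HSIS solar profiles, which are not reproduced here.

```python
# Sketch of one validation diagnostic mentioned above: the ramp distribution
# of a high-frequency solar power profile. The synthetic 2-second profile is a
# placeholder, not HSIS data.
import numpy as np

rng = np.random.default_rng(1)

dt_s = 2.0                                   # sampling interval: 2 seconds
n = 3 * 3600 // 2                            # three hours of 2-s samples
clear_sky = 30.0 * np.sin(np.linspace(0.0, np.pi, n))    # MW, smooth envelope
cloudy = (rng.random(n) < 0.01).cumsum() % 2             # random cloud passages
power = clear_sky * np.where(cloudy == 0, 1.0, 0.4)      # crude sharp-edge effect

# 1-minute ramps: change in mean power between consecutive 1-minute windows
samples_per_min = int(60 / dt_s)
per_min = power[: n - n % samples_per_min].reshape(-1, samples_per_min).mean(axis=1)
ramps = np.diff(per_min)

for q in (50, 95, 99):
    print(f"{q}th percentile of |1-min ramp|: {np.percentile(np.abs(ramps), q):.2f} MW")
```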

  18. Diffuse reflectance stratigraphy - a new method in the study of loess (?)

    Science.gov (United States)

    József, Szeberényi; Balázs, Bradák; Klaudia, Kiss; József, Kovács; György, Varga; Réka, Balázs; Viczián, István

    2017-04-01

    The different varieties of loess (and the intercalated paleosol layers) together constitute one of the most widespread terrestrial sediments, deposited, altered, and redeposited under the changing climatic conditions of the Pleistocene. To reveal more information about Pleistocene climate cycles and/or environments, a detailed lithostratigraphical subdivision and classification of the loess variations and paleosols is necessary. Besides numerous methods such as various field measurements, semi-quantitative tests, and laboratory investigations, diffuse reflectance spectroscopy (DRS) is one of the well-established methods applied to loess/paleosol sequences. Generally, DRS has been used to separate the detrital and pedogenic mineral components of loess sections via the hematite/goethite ratio. DRS has also been applied jointly with various environmental magnetic investigations such as magnetic susceptibility and isothermal remanent magnetization measurements. In our study, a so-called "diffuse reflectance stratigraphy method" was developed. First, a multivariate mathematical method was applied to compare the results of the spectral reflectance measurements. One of the most suitable multivariate methods is cluster analysis; its purpose here is to group and compare the loess variations and paleosols based on the similarity and common properties of their reflectance curves. Second, besides the basic subdivision of the profiles by the different reflectance curves of the layers, the most characteristic wavelength section of the reflectance curve was determined; this section played the most important role in the classification of the different materials of the section. The reflectance values of individual samples at the characteristic wavelengths were plotted as a function of depth and correlated well with other proxies such as grain-size distribution and magnetic susceptibility data. The results of the correlation showed the significance of the
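
    The clustering step described above can be sketched as follows: reflectance curves are grouped by similarity with hierarchical (Ward) clustering and the tree is cut into classes. The spectra below are synthetic toy curves, not measured DRS data, and the wavelength range is an assumption.

```python
# Sketch of the clustering step: group samples by the similarity of their
# reflectance curves using hierarchical clustering. Synthetic placeholders only.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(2)
wavelengths = np.arange(400, 701, 10)        # nm, visible range (assumed)

def synthetic_spectrum(redness):
    """Toy reflectance curve: 'redder' (more hematite-like) samples reflect
    relatively more near 650 nm."""
    base = 0.2 + 0.3 * (wavelengths - 400) / 300.0
    bump = redness * np.exp(-((wavelengths - 650) / 60.0) ** 2)
    return base + bump + rng.normal(0.0, 0.01, wavelengths.size)

# e.g. 10 loess-like and 10 paleosol-like samples (synthetic)
spectra = np.array([synthetic_spectrum(0.05) for _ in range(10)] +
                   [synthetic_spectrum(0.20) for _ in range(10)])

# Ward linkage on the reflectance curves; cut the dendrogram into two groups
labels = fcluster(linkage(spectra, method="ward"), t=2, criterion="maxclust")
print(labels)
```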

  19. Condensed matter studies by nuclear methods

    International Nuclear Information System (INIS)

    Krolas, K.; Tomala, K.

    1988-01-01

    A separate abstract was prepared for one of the papers in this volume. The remaining 13 papers, dealing with the use, but not with advances in the use, of nuclear methods in studies of condensed matter, were considered outside the subject scope of INIS. (M.F.W.)

  20. Narrative Inquiry as Travel Study Method: Affordances and Constraints

    Science.gov (United States)

    Craig, Cheryl J.; Zou, Yali; Poimbeauf, Rita

    2014-01-01

    This article maps how narrative inquiry--the use of story to study human experience--has been employed as both method and form to capture cross-cultural learning associated with Western doctoral students' travel study to eastern destinations. While others were the first to employ this method in the travel study domain, we are the first to…