WorldWideScience

Sample records for global quantitation methods

  1. Comparison of visual scoring and quantitative planimetry methods for estimation of global infarct size on delayed enhanced cardiac MRI and validation with myocardial enzymes

    Energy Technology Data Exchange (ETDEWEB)

    Mewton, Nathan, E-mail: nmewton@gmail.com [Hopital Cardiovasculaire Louis Pradel, 28, Avenue Doyen Lepine, 69677 Bron cedex, Hospices Civils de Lyon (France); CREATIS-LRMN (Centre de Recherche et d'Applications en Traitement de l'Image et du Signal), Universite Claude Bernard Lyon 1, UMR CNRS 5220, U 630 INSERM (France); Revel, Didier [Hopital Cardiovasculaire Louis Pradel, 28, Avenue Doyen Lepine, 69677 Bron cedex, Hospices Civils de Lyon (France); CREATIS-LRMN (Centre de Recherche et d'Applications en Traitement de l'Image et du Signal), Universite Claude Bernard Lyon 1, UMR CNRS 5220, U 630 INSERM (France); Bonnefoy, Eric [Hopital Cardiovasculaire Louis Pradel, 28, Avenue Doyen Lepine, 69677 Bron cedex, Hospices Civils de Lyon (France); Ovize, Michel [Hopital Cardiovasculaire Louis Pradel, 28, Avenue Doyen Lepine, 69677 Bron cedex, Hospices Civils de Lyon (France); INSERM Unite 886 (France); Croisille, Pierre [Hopital Cardiovasculaire Louis Pradel, 28, Avenue Doyen Lepine, 69677 Bron cedex, Hospices Civils de Lyon (France); CREATIS-LRMN (Centre de Recherche et d'Applications en Traitement de l'Image et du Signal), Universite Claude Bernard Lyon 1, UMR CNRS 5220, U 630 INSERM (France)

    2011-04-15

    Purpose: Although delayed enhanced CMR has become a reference method for infarct size quantification, there is no ideal method to quantify total infarct size in routine clinical practice. In a prospective study we compared the performance and post-processing time of a global visual scoring method to standard quantitative planimetry, and we compared both methods to the peak values of myocardial biomarkers. Materials and methods: This study had local ethics committee approval; all patients gave written informed consent. One hundred and three patients admitted with reperfused AMI to our intensive care unit had a complete CMR study with gadolinium-contrast injection 4 ± 2 days after admission. A global visual score was defined on a 17-segment model and compared with the quantitative planimetric evaluation of hyperenhancement. The peak values of serum Troponin I (TnI) and creatine kinase (CK) release were measured in each patient. Results: The mean percentage of total left ventricular myocardium with hyperenhancement determined by the quantitative planimetry method was 20.1 ± 14.6%, with a range of 1-68%. There was an excellent correlation between quantitative planimetry and visual global scoring for the measurement of hyperenhancement extent (r = 0.94; y = 1.093x + 0.87; SEE = 1.2; P < 0.001). The Bland-Altman plot showed good concordance between the two approaches (mean of the differences = 1.9%, with a standard deviation of 4.7). Mean post-processing time for quantitative planimetry was significantly longer than for visual scoring (23.7 ± 5.7 min vs 5.0 ± 1.1 min, respectively; P < 0.001). Correlation between peak CK and quantitative planimetry was r = 0.82 (P < 0.001), and r = 0.83 (P < 0.001) with visual global scoring. Correlation between peak Troponin I and quantitative planimetry was r = 0.86 (P < 0.001), and r = 0.85 (P < 0.001) with visual global scoring. Conclusion: A visual approach based on a 17-segment model allows a rapid
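The agreement statistics this abstract reports, a Pearson correlation and a Bland-Altman bias with its standard deviation, can be sketched in a few lines. The infarct-size values below are illustrative, not data from the study.

```python
import statistics

def pearson_r(x, y):
    """Pearson correlation coefficient between two paired measurement series."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def bland_altman(x, y):
    """Mean difference (bias) and SD of differences between two methods."""
    diffs = [a - b for a, b in zip(x, y)]
    return statistics.fmean(diffs), statistics.stdev(diffs)

# Illustrative infarct-size estimates (% LV) from two hypothetical readings
planimetry = [5.0, 12.0, 20.0, 33.0, 41.0]
visual = [6.0, 13.5, 18.0, 35.0, 40.0]

r = pearson_r(planimetry, visual)
bias, sd = bland_altman(planimetry, visual)
```

A high `r` with a small bias and narrow SD is the pattern the study reports for visual scoring versus planimetry.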

  2. Comparison of visual scoring and quantitative planimetry methods for estimation of global infarct size on delayed enhanced cardiac MRI and validation with myocardial enzymes

    International Nuclear Information System (INIS)

    Mewton, Nathan; Revel, Didier; Bonnefoy, Eric; Ovize, Michel; Croisille, Pierre

    2011-01-01

    Purpose: Although delayed enhanced CMR has become a reference method for infarct size quantification, there is no ideal method to quantify total infarct size in routine clinical practice. In a prospective study we compared the performance and post-processing time of a global visual scoring method to standard quantitative planimetry, and we compared both methods to the peak values of myocardial biomarkers. Materials and methods: This study had local ethics committee approval; all patients gave written informed consent. One hundred and three patients admitted with reperfused AMI to our intensive care unit had a complete CMR study with gadolinium-contrast injection 4 ± 2 days after admission. A global visual score was defined on a 17-segment model and compared with the quantitative planimetric evaluation of hyperenhancement. The peak values of serum Troponin I (TnI) and creatine kinase (CK) release were measured in each patient. Results: The mean percentage of total left ventricular myocardium with hyperenhancement determined by the quantitative planimetry method was 20.1 ± 14.6%, with a range of 1-68%. There was an excellent correlation between quantitative planimetry and visual global scoring for the measurement of hyperenhancement extent (r = 0.94; y = 1.093x + 0.87; SEE = 1.2; P < 0.001). The Bland-Altman plot showed good concordance between the two approaches (mean of the differences = 1.9%, with a standard deviation of 4.7). Mean post-processing time for quantitative planimetry was significantly longer than for visual scoring (23.7 ± 5.7 min vs 5.0 ± 1.1 min, respectively; P < 0.001). Correlation between peak CK and quantitative planimetry was r = 0.82 (P < 0.001), and r = 0.83 (P < 0.001) with visual global scoring. Correlation between peak Troponin I and quantitative planimetry was r = 0.86 (P < 0.001), and r = 0.85 (P < 0.001) with visual global scoring. Conclusion: A visual approach based on a 17-segment model allows a rapid and accurate

  3. Comparison of different surface quantitative analysis methods. Application to corium

    International Nuclear Information System (INIS)

    Guilbaud, N.; Blin, D.; Perodeaud, Ph.; Dugne, O.; Gueneau, Ch.

    2000-01-01

    In the case of a hypothetical severe accident in a pressurized water reactor, the reactor assembly melts partially or completely. The material formed, called corium, flows out and spreads at the bottom of the reactor. To limit and control the consequences of such an accident, the specifications of the O-U-Zr basic system must be known accurately. To achieve this goal, the corium mix was melted by electron bombardment at very high temperature (3000 K), followed by quenching of the ingot in the Isabel 1 evaporator. Metallographic analyses were then required to validate the thermodynamic databases set by the Thermo-Calc software. The study consists of defining an overall surface quantitative analysis method that is fast and reliable, in order to determine the overall corium composition. The analyzed ingot originated from a (U + Fe + Y + UO2 + ZrO2) mix, with a total mass of 2253.7 grams. Several successive heating steps at average power were performed before a very brief plateau at very high temperature, so that the ingot was formed progressively and without any evaporation liable to modify its initial composition. The central zone of the ingot was then analyzed by qualitative and quantitative global surface methods, to yield the volume composition of the analyzed zone. Corium sample analysis happens to be very complex because of the variety and number of elements present, and also because of the presence of oxygen in a heavy-element (uranium-based) matrix. Three different global quantitative surface analysis methods were used: global EDS analysis (Energy Dispersive Spectrometry) with SEM, global WDS analysis (Wavelength Dispersive Spectrometry) with EPMA, and the coupling of image analysis with EDS or WDS point spectroscopic analyses. The difficulties encountered during the study arose from sample preparation (corium is very sensitive to oxidation) and from the choice of acquisition parameters for the images and analyses. The corium sample studied consisted of two zones displaying

  4. [Methods of quantitative proteomics].

    Science.gov (United States)

    Kopylov, A T; Zgoda, V G

    2007-01-01

    In modern science, proteomic analysis is inseparable from other fields of systems biology. Quantitative proteomics commands huge resources and handles colossal amounts of information on the molecular mechanisms of life. Advances in proteomics help researchers to solve complex problems of cell signaling, posttranslational modification, structural and functional homology of proteins, molecular diagnostics, etc. More than 40 different methods have been developed in proteomics for the quantitative analysis of proteins. Although each method is unique and has certain advantages and disadvantages, all of them use various isotope labels (tags). In this review we consider the most popular and effective methods, employing both chemical modification of proteins and metabolic and enzymatic methods of isotope labeling.

  5. Mixing quantitative with qualitative methods

    DEFF Research Database (Denmark)

    Morrison, Ann; Viller, Stephen; Heck, Tamara

    2017-01-01

    …with or are considering, researching, or working with both quantitative and qualitative evaluation methods (in academia or industry), join us in this workshop. In particular, we look at adding quantitative to qualitative methods to build a whole picture of user experience. We see a need to discuss both quantitative … and qualitative research because there is often a perceived lack of understanding of the rigor involved in each. The workshop will result in a White Paper on the latest developments in this field, within Australia and in comparison with international work. We anticipate sharing submissions and workshop outcomes…

  6. [Progress in stable isotope labeled quantitative proteomics methods].

    Science.gov (United States)

    Zhou, Yuan; Shan, Yichu; Zhang, Lihua; Zhang, Yukui

    2013-06-01

    Quantitative proteomics is an important research field in the post-genomics era. There are two strategies for proteome quantification: label-free methods and stable isotope labeling methods; the latter has become the most important strategy for quantitative proteomics at present. In the past few years, a number of quantitative methods have been developed, supporting rapid progress in biological research. In this work, we discuss the progress in stable isotope labeling methods for quantitative proteomics, including relative and absolute quantification, and then give our opinions on the outlook of proteome quantification methods.

  7. Assessment of acute myocarditis by cardiac magnetic resonance imaging: Comparison of qualitative and quantitative analysis methods.

    Science.gov (United States)

    Imbriaco, Massimo; Nappi, Carmela; Puglia, Marta; De Giorgi, Marco; Dell'Aversana, Serena; Cuocolo, Renato; Ponsiglione, Andrea; De Giorgi, Igino; Polito, Maria Vincenza; Klain, Michele; Piscione, Federico; Pace, Leonardo; Cuocolo, Alberto

    2017-10-26

    To compare cardiac magnetic resonance (CMR) qualitative and quantitative analysis methods for the noninvasive assessment of myocardial inflammation in patients with suspected acute myocarditis (AM). A total of 61 patients with suspected AM underwent coronary angiography and CMR. Qualitative analysis was performed applying the Lake Louise Criteria (LLC), followed by quantitative analysis based on the evaluation of edema ratio (ER) and global relative enhancement (RE). Diagnostic performance was assessed for each method by measuring the area under the curves (AUC) of the receiver operating characteristic analyses. The final diagnosis of AM was based on symptoms and signs suggestive of cardiac disease, evidence of myocardial injury as defined by electrocardiogram changes, elevated troponin I, exclusion of coronary artery disease by coronary angiography, and clinical and echocardiographic follow-up at 3 months after admission to the chest pain unit. In all patients, coronary angiography did not show significant coronary artery stenosis. Troponin I levels and creatine kinase were higher in patients with AM compared to those without (both P …). The AUCs of the qualitative and quantitative (ER 0.89 and global RE 0.80) analyses were also similar. Qualitative and quantitative CMR analysis methods show similar diagnostic accuracy for the diagnosis of AM. These findings suggest that a simplified approach using a shortened CMR protocol including only T2-weighted STIR sequences might be useful to rule out AM in patients with acute coronary syndrome and normal coronary angiography.
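The AUC values quoted above can be reproduced with the rank-based (Mann-Whitney) formulation of the ROC AUC, sketched below; the scores are hypothetical, not the study's ER or RE measurements.

```python
def roc_auc(scores_pos, scores_neg):
    """ROC AUC via the Mann-Whitney statistic: the probability that a
    randomly chosen diseased case scores higher than a non-diseased one,
    counting ties as half."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical edema-ratio values for patients with and without myocarditis
auc = roc_auc([2.4, 2.1, 2.6, 1.9], [1.8, 1.6, 2.0, 1.5])
```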

  8. Qualitative versus quantitative methods in psychiatric research.

    Science.gov (United States)

    Razafsha, Mahdi; Behforuzi, Hura; Azari, Hassan; Zhang, Zhiqun; Wang, Kevin K; Kobeissy, Firas H; Gold, Mark S

    2012-01-01

    Qualitative studies are gaining credibility after a period of being misinterpreted as "not being quantitative." Qualitative method is a broad umbrella term for research methodologies that describe and explain individuals' experiences, behaviors, interactions, and social contexts. In-depth interviews, focus groups, and participant observation are among the qualitative methods of inquiry commonly used in psychiatry. Researchers measure the frequency of occurring events using quantitative methods; qualitative methods, however, provide a broader understanding and a more thorough reasoning behind the event. Hence, they are considered to be of special importance in psychiatry. Besides hypothesis generation in the earlier phases of research, qualitative methods can be employed in questionnaire design, the establishment of diagnostic criteria, feasibility studies, and studies of attitudes and beliefs. Animal models are another area in which qualitative methods can be employed, especially when naturalistic observation of animal behavior is important. However, since qualitative results can reflect the researcher's own view, they need to be statistically confirmed through quantitative methods. The tendency to combine qualitative and quantitative methods as complementary has emerged over recent years. By applying both methods of research, scientists can take advantage of the interpretative characteristics of qualitative methods as well as the experimental dimensions of quantitative methods.

  9. Comparison of two label-free global quantitation methods, APEX and 2D gel electrophoresis, applied to the Shigella dysenteriae proteome

    Directory of Open Access Journals (Sweden)

    Fleischmann Robert D

    2009-06-01

    The in vitro stationary-phase proteome of the human pathogen Shigella dysenteriae serotype 1 (SD1) was quantitatively analyzed in Coomassie Blue G250 (CBB)-stained 2D gels. More than 450 proteins, of which 271 were associated with distinct gel spots, were identified. In parallel, we employed 2D-LC-MS/MS followed by the label-free, computationally modified spectral counting method APEX for absolute protein expression measurements. Of the 4502 genome-predicted SD1 proteins, 1148 proteins were identified with a false-positive discovery rate of 5% and quantitated using 2D-LC-MS/MS and APEX. The dynamic range of the APEX method was approximately one order of magnitude higher than that of CBB-stained spot intensity quantitation. A squared Pearson correlation analysis revealed a reasonably good correlation (R2 = 0.67) for protein quantities surveyed by both methods. The correlation decreased for protein subsets with specific physicochemical properties, such as low Mr values and high hydropathy scores. Stoichiometric ratios of subunits of protein complexes characterized in E. coli were compared with APEX quantitative ratios of orthologous SD1 protein complexes. A high correlation was observed for subunits of soluble cellular protein complexes in several cases, demonstrating versatile applications of the APEX method in quantitative proteomics.
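APEX scales spectral counts by each protein's predicted peptide observability and normalizes across the proteome. A toy sketch of that correction, with made-up counts and observability factors rather than the SD1 data, is:

```python
def apex_like_abundance(spectral_counts, observability, total_molecules=1.0):
    """APEX-style absolute abundance sketch: spectral counts are divided by a
    predicted observability factor O_i, then normalized so the proteome sums
    to an assumed total number of molecules."""
    corrected = [c / o for c, o in zip(spectral_counts, observability)]
    total = sum(corrected)
    return [total_molecules * c / total for c in corrected]

# Two hypothetical proteins with equal counts but different detectability:
# the poorly observable one is inferred to be more abundant
abundances = apex_like_abundance([50, 50], [1.0, 0.5])
```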

  10. CONSTRUCTION THEORY AND NOISE ANALYSIS METHOD OF GLOBAL CGCS2000 COORDINATE FRAME

    Directory of Open Access Journals (Sweden)

    Z. Jiang

    2018-04-01

    The definition, renewal, and maintenance of geodetic datums has long been an international issue. In recent years, many countries have been studying and implementing the modernization and renewal of their local geodetic reference coordinate frames. Based on the precise results of 15 years of continuous observation from the national CORS (continuously operating reference system) network and the mainland GNSS (Global Navigation Satellite System) network between 1999 and 2007, this paper studies the construction of a mathematical model of the Global CGCS2000 frame, mainly analyzing the theory and algorithm of the two-step method for Global CGCS2000 Coordinate Frame formulation. Finally, the noise characteristics of the coordinate time series are estimated quantitatively with the criterion of maximum likelihood estimation.

  11. Barriers to global health development: An international quantitative survey.

    Directory of Open Access Journals (Sweden)

    Bahr Weiss

    Global health's goal of reducing low- and middle-income country (LMIC) versus high-income country health disparities faces complex challenges. Although there have been discussions of barriers, there has not been a broad-based, quantitative survey of such barriers. A total of 432 global health professionals were invited via email to participate in an online survey, with 268 (62%) participating. The survey assessed participants' (A) demographic and global health background, (B) perceptions regarding the seriousness of 66 barriers, (C) detailed ratings of the barriers designated most serious, and (D) potential solutions. Thirty-four of the 66 barriers were seen as moderately or more serious, highlighting the widespread, significant challenges global health development faces. Perceived barrier seriousness differed significantly across domains: Resource Limitations mean = 2.47 (on a 0-4 Likert scale), Priority Selection mean = 2.20, Corruption, Lack of Competence mean = 1.87, Social and Cultural Barriers mean = 1.68. Some system-level predictors showed significant but relatively limited relations. For instance, for Global Health Domain, HIV and Mental Health had higher levels of perceived Social and Cultural Barriers than other GH Domains. Individual-level global health experience predictors had small but significant effects, with the seriousness of (a) Corruption, Lack of Competence and (b) Priority Selection barriers positively correlated with respondents' level of LMIC orientation (e.g., weeks/year spent in LMICs), but Academic Global Health Achievement (e.g., number of global health publications) negatively correlated with overall barrier seriousness. That comparatively few system-level predictors (e.g., Organization Type) were significant suggests these barriers may be relatively fundamental at the system level. Individual-level and system-level effects do have policy implications; e.g., Priority Selection barriers were among the most serious, yet effects on seriousness of how LMIC-oriented a professional

  12. Methods for Quantitative Creatinine Determination.

    Science.gov (United States)

    Moore, John F; Sharer, J Daniel

    2017-04-06

    Reliable measurement of creatinine is necessary to assess kidney function, and also to quantitate drug levels and diagnostic compounds in urine samples. The most commonly used methods are based on the Jaffe principle of alkaline creatinine-picric acid complex color formation. However, other compounds commonly found in serum and urine may interfere with Jaffe creatinine measurements. Therefore, many laboratories have made modifications to the basic method to remove or account for these interfering substances. This appendix summarizes the basic Jaffe method, as well as a modified, automated version. Also described is a high-performance liquid chromatography (HPLC) method that separates creatinine from contaminants prior to direct quantification by UV absorption. Lastly, a liquid chromatography-tandem mass spectrometry (LC-MS/MS) method is described that uses stable isotope dilution to reliably quantify creatinine in any sample. This last approach has been recommended by experts in the field as a means to standardize all quantitative creatinine methods against an accepted reference. © 2017 John Wiley & Sons, Inc.
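In its simplest single-point form, the stable-isotope-dilution calculation behind the LC-MS/MS method reduces to a ratio against the labeled internal standard. Real assays calibrate against a curve of area ratios; the function and numbers below are an illustrative sketch, not the protocol from the appendix.

```python
def isotope_dilution_conc(area_analyte, area_labeled_std, conc_labeled_std,
                          response_factor=1.0):
    """Creatinine concentration from the ratio of analyte to stable-isotope
    internal-standard peak areas, assuming a linear response through zero."""
    return response_factor * (area_analyte / area_labeled_std) * conc_labeled_std

# Sample peak area is twice that of a 50 mg/L labeled standard
conc = isotope_dilution_conc(2.0e6, 1.0e6, 50.0)
```

Because the labeled standard co-elutes and ionizes like the analyte, matrix effects largely cancel in the ratio, which is why this approach is suited to standardization.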

  13. Quantitative methods in psychology: inevitable and useless

    Directory of Open Access Journals (Sweden)

    Aaro Toomela

    2010-07-01

    Science begins with the question, what do I want to know? Science becomes science, however, only when this question is justified and the appropriate methodology is chosen for answering the research question. The research question should precede the other questions; methods should be chosen according to the research question and not vice versa. Modern quantitative psychology has accepted method as primary; research questions are adjusted to the methods. For understanding thinking in modern quantitative psychology, two epistemologies should be distinguished: the structural-systemic, which is based on Aristotelian thinking, and the associative-quantitative, which is based on Cartesian-Humean thinking. The first aims at understanding the structure that underlies the studied processes; the second looks for the identification of cause-effect relationships between events, with no possible access to an understanding of the structures that underlie the processes. Quantitative methodology in particular, as well as mathematical psychology in general, is useless for answering questions about the structures and processes that underlie observed behaviors. Nevertheless, quantitative science is almost inevitable in a situation where the systemic-structural basis of behavior is not well understood; all sorts of applied decisions can be made on the basis of quantitative studies. In order to proceed, psychology should study structures; methodologically, constructive experiments should be added to observations and analytic experiments.

  14. Link-based quantitative methods to identify differentially coexpressed genes and gene Pairs

    Directory of Open Access Journals (Sweden)

    Ye Zhi-Qiang

    2011-08-01

    Background: Differential coexpression analysis (DCEA) is increasingly used for investigating the global transcriptional mechanisms underlying phenotypic changes. Current DCEA methods mostly adopt a gene connectivity-based strategy to estimate differential coexpression, which is characterized by comparing the numbers of gene neighbors in different coexpression networks. Although it simplifies the calculation, this strategy mixes up the identities of the different coexpression neighbors of a gene and fails to differentiate significant differential coexpression changes from trivial ones. In particular, correlation reversal is easily missed, although it probably indicates remarkable biological significance. Results: We developed two link-based quantitative methods, DCp and DCe, to identify differentially coexpressed genes and gene pairs (links). Uniquely exploiting the quantitative coexpression change of each gene pair in the coexpression networks, both methods proved superior to currently popular methods in simulation studies. Re-mining of a publicly available type 2 diabetes (T2D) expression dataset from the perspective of differential coexpression analysis led to discoveries beyond those from differential expression analysis. Conclusions: This work pointed out a critical weakness of current popular DCEA methods and proposed two link-based DCEA algorithms that will contribute to the development of DCEA and help extend it to a broader spectrum.
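The link-based idea, scoring each gene pair by its correlation change between conditions and aggregating over a gene's links, can be sketched as follows. This is a simplified illustration in the spirit of DCp, not the published algorithm.

```python
import math

def link_change(r_cond1, r_cond2):
    """Differential coexpression of one link: the change in the pair's
    correlation between two conditions. A reversal such as +0.8 -> -0.8
    scores higher than a drop from +0.8 to 0, which neighbor-counting
    strategies cannot distinguish."""
    return abs(r_cond1 - r_cond2)

def gene_score(changes):
    """Gene-level score: root mean square of the gene's link changes."""
    return math.sqrt(sum(d * d for d in changes) / len(changes))

reversal = link_change(0.8, -0.8)  # correlation reversal
loss = link_change(0.8, 0.0)       # correlation simply lost
```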

  15. The rise of quantitative methods in Psychology

    Directory of Open Access Journals (Sweden)

    Denis Cousineau

    2005-09-01

    Quantitative methods have a long history in some scientific fields. Indeed, no one today would consider a qualitative data set in physics or a qualitative theory in chemistry. Quantitative methods are so central in these fields that they are often labelled "hard sciences". Here, we examine the question of whether psychology is ready to enter the "hard science club" like biology did in the forties. The facts that (a) over half of the statistical techniques used in psychology are less than 40 years old and that (b) the number of simulations in empirical papers has followed an exponential growth since the eighties both suggest that the answer is yes. The purpose of Tutorials in Quantitative Methods for Psychology is to provide concise and easy access to current methods.

  16. From themes to hypotheses: following up with quantitative methods.

    Science.gov (United States)

    Morgan, David L

    2015-06-01

    One important category of mixed-methods research designs consists of quantitative studies that follow up on qualitative research. In this case, the themes that serve as the results from the qualitative methods generate hypotheses for testing through the quantitative methods. That process requires operationalization to translate the concepts from the qualitative themes into quantitative variables. This article illustrates these procedures with examples that range from simple operationalization to the evaluation of complex models. It concludes with an argument for not only following up qualitative work with quantitative studies but also the reverse, and doing so by going beyond integrating methods within single projects to include broader mutual attention from qualitative and quantitative researchers who work in the same field. © The Author(s) 2015.

  17. Quantitative imaging methods in osteoporosis.

    Science.gov (United States)

    Oei, Ling; Koromani, Fjorda; Rivadeneira, Fernando; Zillikens, M Carola; Oei, Edwin H G

    2016-12-01

    Osteoporosis is characterized by a decreased bone mass and quality resulting in an increased fracture risk. Quantitative imaging methods are critical in the diagnosis and follow-up of treatment effects in osteoporosis. Prior radiographic vertebral fractures and bone mineral density (BMD) as a quantitative parameter derived from dual-energy X-ray absorptiometry (DXA) are among the strongest known predictors of future osteoporotic fractures. Therefore, current clinical decision making relies heavily on accurate assessment of these imaging features. Further, novel quantitative techniques are being developed to appraise additional characteristics of osteoporosis including three-dimensional bone architecture with quantitative computed tomography (QCT). Dedicated high-resolution (HR) CT equipment is available to enhance image quality. At the other end of the spectrum, by utilizing post-processing techniques such as the trabecular bone score (TBS) information on three-dimensional architecture can be derived from DXA images. Further developments in magnetic resonance imaging (MRI) seem promising to not only capture bone micro-architecture but also characterize processes at the molecular level. This review provides an overview of various quantitative imaging techniques based on different radiological modalities utilized in clinical osteoporosis care and research.

  18. Diagnosis of Acute Global Myocarditis Using Cardiac MRI with Quantitative T1 and T2 Mapping: Case Report and Literature Review

    Energy Technology Data Exchange (ETDEWEB)

    Park, Chul Hwan [Department of Radiology and Research Institute of Radiological Science, Yonsei University Health System, Seoul 135-720 (Korea, Republic of); Choi, Eui-Young [Division of Cardiology, Department of Internal Medicine, Gangnam Severance Hospital, Yonsei University College of Medicine, Seoul 135-720 (Korea, Republic of); Greiser, Andreas [Healthcare Sector, Siemens AG, Erlangen D-91052 (Germany); Paek, Mun Young [Siemens Ltd., Seoul 120-837 (Korea, Republic of); Hwang, Sung Ho; Kim, Tae Hoon [Department of Radiology and Research Institute of Radiological Science, Yonsei University Health System, Seoul 135-720 (Korea, Republic of)

    2013-07-01

    The diagnosis of myocarditis can be challenging given that symptoms, clinical exam findings, electrocardiogram results, biomarkers, and echocardiogram results are often non-specific. Endocardial biopsy is an established method for diagnosing myocarditis, but carries the risk of complications and false negative results. Cardiac magnetic resonance imaging (MRI) has become the primary non-invasive imaging tool in patients with suspected myocarditis. Myocarditis can be diagnosed by using three tissue markers including edema, hyperemia/capillary leak, and necrosis/fibrosis. The interpretation of cardiac MR findings can be confusing, especially when the myocardium is diffusely involved. Using T1 and T2 maps, the diagnosis of myocarditis can be made even in cases of global myocarditis with the help of quantitative analysis. We herein describe a case of acute global myocarditis which was diagnosed by using quantitative T1 and T2 mapping.

  19. Quantitative autoradiography - a method of radioactivity measurement

    International Nuclear Information System (INIS)

    Treutler, H.C.; Freyer, K.

    1988-01-01

    In recent years, autoradiography has been developed into a quantitative method of radioactivity measurement. Operating techniques of quantitative autoradiography are demonstrated using special standard objects. Influences of radiation quality, of backscattering in sample and detector materials, and of the sensitivity and fading of the detectors are considered. Furthermore, questions of the quantitative evaluation of autoradiograms are dealt with, and measuring errors are discussed. Finally, some practical uses of quantitative autoradiography are demonstrated by means of the estimation of activity distribution in radioactive foil samples. (author)

  20. Qualitative and quantitative methods in health research

    OpenAIRE

    Vázquez Navarrete, M. Luisa

    2009-01-01

    Introduction: Research in the area of health has traditionally been dominated by quantitative research. However, the complexity of ill-health, which is socially constructed by individuals, health personnel, and health authorities, has motivated the search for other ways to approach knowledge. Aim: To discuss the complementarity of qualitative and quantitative research methods in the generation of knowledge. Contents: The purpose of quantitative research is to measure the magnitude of an event,...

  1. Identification of circulating miRNA biomarkers based on global quantitative real-time PCR profiling

    Directory of Open Access Journals (Sweden)

    Kang Kang

    2012-02-01

    MicroRNAs (miRNAs) are small noncoding RNAs (18-25 nucleotides) that regulate gene expression at the post-transcriptional level. Recent studies have demonstrated the presence of miRNAs in the blood circulation. Deregulation of miRNAs in serum or plasma has been associated with many diseases, including cancers and cardiovascular diseases, suggesting the possible use of miRNAs as diagnostic biomarkers. However, the detection of the small amounts of miRNAs found in serum or plasma requires a method with high sensitivity and accuracy. Therefore, the current study describes polymerase chain reaction (PCR)-based methods for measuring circulating miRNAs. Briefly, the procedure involves four major steps: (1) sample collection and preparation; (2) global miRNA profiling using quantitative real-time PCR (qRT-PCR); (3) data normalization and analysis; and (4) selection and validation of miRNA biomarkers. In conclusion, qRT-PCR is a promising method for the profiling of circulating miRNAs as biomarkers.
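Step (3), data normalization, is commonly done with the 2^-ΔΔCt method (Livak), in which the target Ct is referenced first to a control assay and then to the control group. A minimal sketch with made-up Ct values:

```python
def fold_change_ddct(ct_target_case, ct_ref_case, ct_target_ctrl, ct_ref_ctrl):
    """Relative miRNA level by the 2^-ΔΔCt method: the target miRNA's Ct is
    normalized to a reference (e.g. a spiked-in synthetic miRNA), then the
    case ΔCt is compared against the control-group ΔCt."""
    ddct = (ct_target_case - ct_ref_case) - (ct_target_ctrl - ct_ref_ctrl)
    return 2.0 ** (-ddct)

# A miRNA amplifying 2 cycles earlier (relative to reference) in patient serum
fold = fold_change_ddct(24.0, 20.0, 26.0, 20.0)
```

Each earlier cycle corresponds to a doubling of template, hence the base of 2; a ΔΔCt of -2 indicates a four-fold elevation in cases.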

  2. Quantitative Methods for Teaching Review

    OpenAIRE

    Irina Milnikova; Tamara Shioshvili

    2011-01-01

    A new method for the quantitative evaluation of teaching processes is elaborated. On the basis of score data, the method permits evaluation of teaching efficiency within one group of students and comparison of teaching efficiency across two or more groups. Heterogeneity, stability and total variability indices, both for a single group and for comparisons between groups, are used as the basic characteristics of teaching efficiency. The method is easy to use and permits to rank results of teaching review which...

  3. GLOBAL AND STRICT CURVE FITTING METHOD

    NARCIS (Netherlands)

    Nakajima, Y.; Mori, S.

    2004-01-01

    To find a global and smooth curve fitting, the cubic B-spline method and gathering-line methods are investigated. When segmenting and recognizing a contour curve of a character shape, some global method is required. If we want to connect contour curves around a singular point like crossing points,

  4. Mapcurves: a quantitative method for comparing categorical maps.

    Science.gov (United States)

    William W. Hargrove; M. Hoffman Forrest; Paul F. Hessburg

    2006-01-01

    We present Mapcurves, a quantitative goodness-of-fit (GOF) method that unambiguously shows the degree of spatial concordance between two or more categorical maps. Mapcurves graphically and quantitatively evaluates the degree of fit among any number of maps and quantifies a GOF for each polygon, as well as for the entire map. The Mapcurves method indicates a perfect fit even if...
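The core of the Mapcurves GOF can be sketched as crediting each overlap by the fraction it covers of both categories involved. The toy raster version below (grid cells standing in for polygons) is an illustration of that idea under stated assumptions, not the authors' implementation:

```python
def mapcurves_gof(map_a, map_b, category):
    """GOF for one category of map_a against map_b: sum, over the
    categories of map_b, of (overlap/area_b) * (overlap/area_a)."""
    a_cells = {i for i, c in enumerate(map_a) if c == category}
    gof = 0.0
    for cat_b in set(map_b):
        b_cells = {i for i, c in enumerate(map_b) if c == cat_b}
        overlap = len(a_cells & b_cells)
        if overlap:
            gof += (overlap / len(b_cells)) * (overlap / len(a_cells))
    return gof

# a category that coincides exactly with a category of the other map scores 1.0
perfect = mapcurves_gof(["x", "x", "y"], ["x", "x", "y"], "x")
partial = mapcurves_gof(["x", "x", "y"], ["x", "y", "y"], "x")
```

Note the score depends only on spatial overlap, never on category labels, which is what lets the method compare maps with entirely different legends.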

  5. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    Science.gov (United States)

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis of such data was ANOVA, which enabled not only the effects of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, appropriate for traditional plate count methods. Copyright © 2015 Elsevier B.V. All rights reserved.
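Combining independent uncertainty components "mathematically" is conventionally done in quadrature on the relative scale. A generic sketch (the component values are invented for illustration, not taken from the paper):

```python
import math

def combined_relative_uncertainty(components):
    """Root-sum-of-squares combination of independent relative
    uncertainty components (e.g. organism, product, reading errors)."""
    return math.sqrt(sum(u ** 2 for u in components))

# e.g. 20% microorganism effect, 25% product effect, 10% reading error
u_combined = combined_relative_uncertainty([0.20, 0.25, 0.10])  # ~0.34
```

With these illustrative components the combined relative uncertainty stays just under the 35% figure quoted above.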

  6. Quantitative method for determination of body inorganic iodine

    International Nuclear Information System (INIS)

    Filatov, A.A.; Tatsievskij, V.A.

    1991-01-01

    An original method for the quantitation of body inorganic iodine, based upon the simultaneous administration of a known dose of stable and radioactive iodine with subsequent radiometry of the thyroid, is proposed. The calculation is based upon the principle of the dilution of radioactive iodine in the human inorganic iodine space. The method permits quantitation of the amount of inorganic iodine with regard to individual features of the inorganic space. The method is characterized by simplicity and is non-invasive for the patient.
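The dilution principle can be written down directly: the tracer's specific activity falls in proportion to the size of the pool it mixes into. This is a generic isotope-dilution sketch with invented numbers, not the authors' exact protocol:

```python
def inorganic_iodine_pool(stable_dose_ug, sa_before, sa_after):
    """Isotope dilution: the tracer activity is fixed, so if mixing a known
    stable dose into the body pool dilutes the specific activity from
    sa_before to sa_after, the pre-existing pool size follows directly."""
    return stable_dose_ug * (sa_before / sa_after - 1.0)

# if the specific activity halves, the body pool equals the administered dose
pool_ug = inorganic_iodine_pool(100.0, 2.0, 1.0)
```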

  7. Instrumentation and quantitative methods of evaluation

    International Nuclear Information System (INIS)

    Beck, R.N.; Cooper, M.D.

    1991-01-01

    This report summarizes goals and accomplishments of the research program entitled Instrumentation and Quantitative Methods of Evaluation, during the period January 15, 1989 through July 15, 1991. This program is very closely integrated with the radiopharmaceutical program entitled Quantitative Studies in Radiopharmaceutical Science. Together, they constitute the PROGRAM OF NUCLEAR MEDICINE AND QUANTITATIVE IMAGING RESEARCH within The Franklin McLean Memorial Research Institute (FMI). The program addresses problems involving the basic science and technology that underlie the physical and conceptual tools of radiotracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. 234 refs., 11 figs., 2 tabs

  8. Quantitative Methods for Molecular Diagnostic and Therapeutic Imaging

    OpenAIRE

    Li, Quanzheng

    2013-01-01

    This theme issue provides an overview on the basic quantitative methods, an in-depth discussion on the cutting-edge quantitative analysis approaches as well as their applications for both static and dynamic molecular diagnostic and therapeutic imaging.

  9. Introduction to quantitative research methods an investigative approach

    CERN Document Server

    Balnaves, Mark

    2001-01-01

    Introduction to Quantitative Research Methods is a student-friendly introduction to quantitative research methods and basic statistics. It uses a detective theme throughout the text and in multimedia courseware to show how quantitative methods have been used to solve real-life problems. The book focuses on principles and techniques that are appropriate to introductory level courses in media, psychology and sociology. Examples and illustrations are drawn from historical and contemporary research in the social sciences. The multimedia courseware provides tutorial work on sampling, basic statistics, and techniques for seeking information from databases and other sources. The statistics modules can be used as either part of a detective games or directly in teaching and learning. Brief video lessons in SPSS, using real datasets, are also a feature of the CD-ROM.

  10. Quantitative global sensitivity analysis of a biologically based dose-response pregnancy model for the thyroid endocrine system.

    Science.gov (United States)

    Lumen, Annie; McNally, Kevin; George, Nysia; Fisher, Jeffrey W; Loizou, George D

    2015-01-01

    A deterministic biologically based dose-response model for the thyroidal system in a near-term pregnant woman and the fetus was recently developed to evaluate quantitatively thyroid hormone perturbations. The current work focuses on conducting a quantitative global sensitivity analysis on this complex model to identify and characterize the sources and contributions of uncertainties in the predicted model output. The workflow and methodologies suitable for computationally expensive models, such as the Morris screening method and Gaussian Emulation processes, were used for the implementation of the global sensitivity analysis. Sensitivity indices, such as main, total and interaction effects, were computed for a screened set of the total thyroidal system descriptive model input parameters. Furthermore, a narrower sub-set of the most influential parameters affecting the model output of maternal thyroid hormone levels were identified in addition to the characterization of their overall and pair-wise parameter interaction quotients. The characteristic trends of influence in model output for each of these individual model input parameters over their plausible ranges were elucidated using Gaussian Emulation processes. Through global sensitivity analysis we have gained a better understanding of the model behavior and performance beyond the domains of observation by the simultaneous variation in model inputs over their range of plausible uncertainties. The sensitivity analysis helped identify parameters that determine the driving mechanisms of the maternal and fetal iodide kinetics, thyroid function and their interactions, and contributed to an improved understanding of the system modeled. We have thus demonstrated the use and application of global sensitivity analysis for a biologically based dose-response model for sensitive life-stages such as pregnancy that provides richer information on the model and the thyroidal system modeled compared to local sensitivity analysis.
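The Morris screening step can be illustrated with a crude elementary-effects implementation on a toy model. This stands in for the thyroid model, which is not reproduced here; the function names and settings are hypothetical:

```python
import numpy as np

def morris_mu_star(f, bounds, r=20, delta=0.1, seed=0):
    """Mean absolute elementary effect per input: perturb one parameter
    at a time by a fixed fraction of its range, over r random base points."""
    rng = np.random.default_rng(seed)
    lo = np.array([b[0] for b in bounds], float)
    hi = np.array([b[1] for b in bounds], float)
    k = len(bounds)
    ee = np.zeros((r, k))
    for t in range(r):
        # keep the base point low enough that the perturbed point stays in range
        x = lo + rng.random(k) * (hi - lo) * (1.0 - delta)
        fx = f(x)
        for i in range(k):
            xp = x.copy()
            xp[i] += delta * (hi[i] - lo[i])
            ee[t, i] = (f(xp) - fx) / delta
    return np.abs(ee).mean(axis=0)

# toy model: the first input dominates, the third barely matters
toy = lambda x: 10.0 * x[0] + x[1] + 0.1 * x[2]
mu_star = morris_mu_star(toy, [(0, 1)] * 3)
```

For a screening run like this, a clearly larger mu-star flags a parameter as influential enough to carry forward into the (more expensive) Gaussian emulation stage.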


  12. Nuclear medicine and imaging research (instrumentation and quantitative methods of evaluation)

    International Nuclear Information System (INIS)

    Beck, R.N.; Cooper, M.; Chen, C.T.

    1992-07-01

    This document is the annual progress report for the project entitled "Instrumentation and Quantitative Methods of Evaluation." Progress is reported in separate sections, individually abstracted and indexed for the database. Subject areas reported include theoretical studies of imaging systems and methods, hardware developments, quantitative methods of evaluation, and knowledge transfer: education in quantitative nuclear medicine imaging

  13. Development of Three Methods for Simultaneous Quantitative ...

    African Journals Online (AJOL)

    Development of Three Methods for Simultaneous Quantitative Determination of Chlorpheniramine Maleate and Dexamethasone in the Presence of Parabens in ... Tropical Journal of Pharmaceutical Research ... Results: All the proposed methods were successfully applied to the analysis of raw materials and dosage form.

  14. A CT-based method for fully quantitative 201Tl SPECT

    International Nuclear Information System (INIS)

    Willowson, Kathy; Bailey, Dale; Baldock, Clive

    2009-01-01

    Objectives: To develop and validate a method for quantitative 201Tl SPECT based on corrections derived from X-ray CT data, and to apply the method in the clinic for quantitative determination of recurrence of brain tumours. Method: A previously developed method for achieving quantitative SPECT with 99mTc based on corrections derived from X-ray CT data was extended to apply to 201Tl. Experimental validation was performed on a cylindrical phantom by comparing known injected activity and measured concentration to quantitative calculations. Further evaluation was performed on an RSI Striatal Brain Phantom containing three 'lesions' with activity-to-background ratios of 1:1, 1.5:1 and 2:1. The method was subsequently applied to a series of scans from patients with suspected recurrence of brain tumours (principally glioma) to determine an SUV-like measure (Standardised Uptake Value). Results: The total activity and concentration in the phantom were calculated to within 3% and 1% of the true values, respectively. The calculated values for the concentration of activity in the background and corresponding lesions of the brain phantom (in increasing ratios) were found to be within 2%, 10%, 1% and 2%, respectively, of the true concentrations. Patient studies showed that an initial SUV greater than 1.5 corresponded to a 56% mortality rate in the first 12 months, as opposed to a 14% mortality rate for those with an SUV less than 1.5. Conclusion: The quantitative technique produces accurate results for the radionuclide 201Tl. Initial investigation in clinical brain SPECT suggests a correlation between quantitative uptake and survival.
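An SUV-type measure divides the measured tissue concentration by the concentration expected if the injected dose were spread uniformly over the body. A minimal sketch of that arithmetic (the numbers are invented for illustration, not patient data from the study):

```python
def suv(tissue_kbq_per_ml, injected_kbq, body_weight_g):
    """Standardised Uptake Value, assuming a tissue density of ~1 g/mL
    so that kBq/mL and kBq/g are interchangeable."""
    return tissue_kbq_per_ml / (injected_kbq / body_weight_g)

# 2.0 kBq/mL measured after injecting 100 MBq into a 70 kg patient
value = suv(2.0, 100_000.0, 70_000.0)  # below the 1.5 threshold reported above
```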

  15. Globalization vs. localization: global food challenges and local solutions

    NARCIS (Netherlands)

    Quaye, W.; Jongerden, J.P.; Essegbey, G.; Ruivenkamp, G.T.P.

    2010-01-01

    The objective of this study was to examine the effect of global-local interactions on food production and consumption in Ghana, and identify possible local solutions. Primary data were collected using a combination of quantitative-qualitative methods, which included focus group discussions and

  16. Qualitative to quantitative: linked trajectory of method triangulation in a study on HIV/AIDS in Goa, India.

    Science.gov (United States)

    Bailey, Ajay; Hutter, Inge

    2008-10-01

    With 3.1 million people estimated to be living with HIV/AIDS in India and 39.5 million people globally, the epidemic has posed academics the challenge of identifying behaviours and their underlying beliefs in the effort to reduce the risk of HIV transmission. The Health Belief Model (HBM) is frequently used to identify risk behaviours and adherence behaviour in the field of HIV/AIDS. Risk behaviour studies that apply HBM have been largely quantitative and use of qualitative methodology is rare. The marriage of qualitative and quantitative methods has never been easy. The challenge is in triangulating the methods. Method triangulation has been largely used to combine insights from the qualitative and quantitative methods but not to link both the methods. In this paper we suggest a linked trajectory of method triangulation (LTMT). The linked trajectory aims to first gather individual level information through in-depth interviews and then to present the information as vignettes in focus group discussions. We thus validate information obtained from in-depth interviews and gather emic concepts that arise from the interaction. We thus capture both the interpretation and the interaction angles of the qualitative method. Further, using the qualitative information gained, a survey is designed. In doing so, the survey questions are grounded and contextualized. We employed this linked trajectory of method triangulation in a study on the risk assessment of HIV/AIDS among migrant and mobile men. Fieldwork was carried out in Goa, India. Data come from two waves of studies, first an explorative qualitative study (2003), second a larger study (2004-2005), including in-depth interviews (25), focus group discussions (21) and a survey (n=1259). By employing the qualitative to quantitative LTMT we can not only contextualize the existing concepts of the HBM, but also validate new concepts and identify new risk groups.

  17. [Reconstituting evaluation methods based on both qualitative and quantitative paradigms].

    Science.gov (United States)

    Miyata, Hiroaki; Okubo, Suguru; Yoshie, Satoru; Kai, Ichiro

    2011-01-01

    Debate about the relationship between quantitative and qualitative paradigms is often muddled and confusing, and the clutter of terms and arguments has resulted in the concepts becoming obscure and unrecognizable. In this study we conducted a content analysis of the evaluation methods used in qualitative healthcare research. We extracted descriptions of four types of evaluation paradigm (validity/credibility, reliability/dependability, objectivity/confirmability, and generalizability/transferability), and classified them into subcategories. In quantitative research, there have been many evaluation methods based on qualitative paradigms, and vice versa. Thus, it might not be useful to treat the evaluation methods of the qualitative paradigm as isolated from those of quantitative methods. Choosing practical evaluation methods based on the situation and prior conditions of each study is an important approach for researchers.

  18. Quantitative analysis method for ship construction quality

    Directory of Open Access Journals (Sweden)

    FU Senzong

    2017-03-01

    The excellent performance of a ship is assured by accurate evaluation of its construction quality. For a long time, research into the construction quality of ships has mainly focused on qualitative analysis due to a shortage of process data, which results from limited samples, varied process types and non-standardized processes. Aiming at predicting and controlling the influence of the construction process on the construction quality of ships, this article proposes a quantitative reliability analysis flow path for the ship construction process and a fuzzy calculation method. Based on the process-quality factor model proposed by the Function-Oriented Quality Control (FOQC) method, we combine fuzzy mathematics with the expert grading method to deduce formulations for calculating the fuzzy process reliability of the ordinal connection model, series connection model and mixed connection model. The quantitative analysis method is applied in analyzing the process reliability of a ship's shaft gear box installation, which demonstrates the applicability and effectiveness of the method. The analysis results can serve as a useful reference for setting key quality inspection points and optimizing key processes.

  19. A General Method for Targeted Quantitative Cross-Linking Mass Spectrometry.

    Directory of Open Access Journals (Sweden)

    Juan D Chavez

    Chemical cross-linking mass spectrometry (XL-MS) provides protein structural information by identifying covalently linked proximal amino acid residues on protein surfaces. The information gained by this technique is complementary to other structural biology methods such as X-ray crystallography, NMR and cryo-electron microscopy [1]. The extension of traditional quantitative proteomics methods with chemical cross-linking can provide information on the structural dynamics of protein structures and protein complexes. The identification and quantitation of cross-linked peptides remains challenging for the general community, requiring specialized expertise and ultimately limiting more widespread adoption of the technique. We describe a general method for targeted quantitative mass spectrometric analysis of cross-linked peptide pairs. We report the adaptation of the widely used, open-source software package Skyline for the analysis of quantitative XL-MS data, as a means for data analysis and sharing of methods. We demonstrate the utility and robustness of the method with a cross-laboratory study and present data that are supported by, and validate, previously published data on quantified cross-linked peptide pairs. This advance provides an easy-to-use resource so that any lab with access to an LC-MS system capable of performing targeted quantitative analysis can quickly and accurately measure dynamic changes in protein structure and protein interactions.

  20. Global/local methods for probabilistic structural analysis

    Science.gov (United States)

    Millwater, H. R.; Wu, Y.-T.

    1993-04-01

    A probabilistic global/local method is proposed to reduce the computational requirements of probabilistic structural analysis. A coarser global model is used for most of the computations, with a more refined local model used only at key probabilistic conditions. The global model is used to establish the cumulative distribution function (cdf) and the Most Probable Point (MPP). The local model then uses the predicted MPP to adjust the cdf value. The global/local method is used within the advanced mean value probabilistic algorithm. The local model can be more refined than the global model in terms of finer mesh, smaller time step, tighter tolerances, etc., and can be used with linear or nonlinear models. The basis for this approach is described in terms of the correlation between the global and local models, which can be estimated from the global and local MPPs. A numerical example is presented using the NESSUS probabilistic structural analysis program, with the finite element method used for the structural modeling. The results clearly indicate significant computer savings with minimal loss in accuracy.

  1. Global Convergence of a Modified LS Method

    Directory of Open Access Journals (Sweden)

    Liu JinKui

    2012-01-01

    The LS method is one of the effective conjugate gradient methods for solving unconstrained optimization problems. The paper presents a modified LS method on the basis of the famous LS method and proves strong global convergence for uniformly convex functions and global convergence for general functions under the strong Wolfe line search. The numerical experiments show that the modified LS method is very effective in practice.

  2. A Quantitative Method for Localizing User Interface Problems: The D-TEO Method

    Directory of Open Access Journals (Sweden)

    Juha Lamminen

    2009-01-01

    A large array of evaluation methods has been proposed to identify Website usability problems. In log-based evaluation, information about the performance of users is collected and stored in log files, and used to find problems and deficiencies in Web page designs. Most methods require the programming and modeling of large task models, which are cumbersome processes for evaluators. Also, because much statistical data is collected in log files, recognizing which Web pages require deeper usability analysis is difficult. This paper suggests a novel quantitative method, called D-TEO, for locating problematic Web pages. This semiautomated method explores the decomposition of interaction tasks of directed information search into elementary operations, deploying two quantitative usability criteria, search success and search time, to reveal how a user navigates within a web of hypertext.
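The two criteria can be turned into a per-page screening report directly from log records. This is a hypothetical sketch: the log schema and thresholds below are invented, not taken from the paper:

```python
def page_screening(log_records, success_floor=0.8, time_ceiling=30.0):
    """Flag pages whose directed-search success rate is low or whose mean
    search time is high. Each record is (page, succeeded, seconds)."""
    stats = {}
    for page, succeeded, seconds in log_records:
        n, ok, total = stats.get(page, (0, 0, 0.0))
        stats[page] = (n + 1, ok + bool(succeeded), total + seconds)
    return {page: {"success_rate": ok / n,
                   "mean_time": total / n,
                   "needs_review": ok / n < success_floor or total / n > time_ceiling}
            for page, (n, ok, total) in stats.items()}

logs = [("home", True, 5.0), ("home", True, 7.0),
        ("pricing", False, 40.0), ("pricing", True, 35.0)]
report = page_screening(logs)  # only "pricing" is flagged for deeper analysis
```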

  3. Studying learning in the healthcare setting: the potential of quantitative diary methods.

    Science.gov (United States)

    Ciere, Yvette; Jaarsma, Debbie; Visser, Annemieke; Sanderman, Robbert; Snippe, Evelien; Fleer, Joke

    2015-08-01

    Quantitative diary methods are longitudinal approaches that involve the repeated measurement of aspects of people's experience of daily life. In this article, we outline the main characteristics and applications of quantitative diary methods and discuss how their use may further research in the field of medical education. Quantitative diary methods offer several methodological advantages, such as measuring aspects of learning with great detail, accuracy and authenticity. Moreover, they enable researchers to study how and under which conditions learning in the health care setting occurs and in which way learning can be promoted. Hence, quantitative diary methods may contribute to theory development and the optimization of teaching methods in medical education.

  4. Optimization method for quantitative calculation of clay minerals in soil

    Indian Academy of Sciences (India)

    However, no reliable method for quantitative analysis of clay minerals has been established so far. In this study, an attempt was made to propose an optimization method for the quantitative ... 2. Basic principles. The mineralogical constitution of soil is rather complex. ... K2O, MgO, and TFe as variables for the calculation.

  5. A quantitative analysis of the causes of the global climate change research distribution

    DEFF Research Database (Denmark)

    Pasgaard, Maya; Strange, Niels

    2013-01-01

    investigates whether the need for knowledge on climate changes in the most vulnerable regions of the world is met by the supply of knowledge, measured by scientific research publications from the last decade. A quantitative analysis of more than 15,000 scientific publications from 197 countries investigates... the poorer, fragile and more vulnerable regions of the world. A quantitative keywords analysis of all publications shows that different knowledge domains and research themes dominate across regions, reflecting the divergent global concerns in relation to climate change. In general, research on climate change... the distribution of climate change research and the potential causes of this distribution. More than 13 explanatory variables representing vulnerability, geographical, demographical, economical and institutional indicators are included in the analysis. The results show that the supply of climate change knowledge...

  6. Sample normalization methods in quantitative metabolomics.

    Science.gov (United States)

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real world metabolomics applications. Sample normalization has been sometimes ignored in metabolomics, partially due to the lack of a convenient means of performing sample normalization. We show that several methods are now available and sample normalization should be performed in quantitative metabolomics where the analyzed samples have significant variations in total sample amounts. Copyright © 2015 Elsevier B.V. All rights reserved.
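One of the simplest normalization schemes covered in such reviews is total-sum normalization, which removes between-sample differences in overall amount before metabolites are compared. A minimal sketch (the metabolite names and intensities are invented):

```python
def total_sum_normalize(sample):
    """Scale one sample's feature intensities so they sum to 1,
    removing the effect of total sample amount on each metabolite."""
    total = sum(sample.values())
    return {metabolite: v / total for metabolite, v in sample.items()}

# the same biology at two dilutions becomes directly comparable
concentrated = {"alanine": 40.0, "glycine": 60.0}
dilute = {"alanine": 4.0, "glycine": 6.0}
a, b = total_sum_normalize(concentrated), total_sum_normalize(dilute)
```

More robust alternatives discussed in the literature (e.g. probabilistic quotient normalization) follow the same pattern: estimate a per-sample scale factor, then divide it out.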

  7. Electric Field Quantitative Measurement System and Method

    Science.gov (United States)

    Generazio, Edward R. (Inventor)

    2016-01-01

    A method and system are provided for making a quantitative measurement of an electric field. A plurality of antennas separated from one another by known distances are arrayed in a region that extends in at least one dimension. A voltage difference between at least one selected pair of antennas is measured. Each voltage difference is divided by the known distance associated with the selected pair of antennas corresponding thereto to generate a resulting quantity. The plurality of resulting quantities defined over the region quantitatively describe an electric field therein.
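The computation described is a finite difference per antenna pair: each voltage difference divided by the known separation yields a local field estimate in V/m. A sketch (the array geometry and voltages are invented for illustration):

```python
def field_estimates(voltages, positions_m):
    """E-field estimate between each adjacent antenna pair:
    voltage difference divided by the known separation (V/m)."""
    return [(v2 - v1) / (p2 - p1)
            for (v1, p1), (v2, p2) in zip(zip(voltages, positions_m),
                                          zip(voltages[1:], positions_m[1:]))]

# four antennas spaced 10 cm apart along one dimension in a uniform field
estimates = field_estimates([0.0, 0.5, 1.0, 1.5], [0.0, 0.1, 0.2, 0.3])
```

A uniform field gives the same estimate for every pair; spatial variation in the estimates maps the field over the region.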

  8. Distance-based microfluidic quantitative detection methods for point-of-care testing.

    Science.gov (United States)

    Tian, Tian; Li, Jiuxing; Song, Yanling; Zhou, Leiji; Zhu, Zhi; Yang, Chaoyong James

    2016-04-07

    Equipment-free devices with quantitative readout are of great significance to point-of-care testing (POCT), which provides real-time readout to users and is especially important in low-resource settings. Among various equipment-free approaches, distance-based visual quantitative detection methods rely on reading the visual signal length for corresponding target concentrations, thus eliminating the need for sophisticated instruments. The distance-based methods are low-cost, user-friendly and can be integrated into portable analytical devices. Moreover, such methods enable quantitative detection of various targets by the naked eye. In this review, we first introduce the concept and history of distance-based visual quantitative detection methods. Then, we summarize the main methods for translation of molecular signals to distance-based readout and discuss different microfluidic platforms (glass, PDMS, paper and thread) in terms of applications in biomedical diagnostics, food safety monitoring, and environmental analysis. Finally, the potential and future perspectives are discussed.
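Quantitation in these devices reduces to a calibration curve mapping signal length to target concentration. A least-squares sketch under the assumption of a linear response (the calibration numbers are invented):

```python
def fit_line(xs, ys):
    """Ordinary least squares for conc = a * distance + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

# calibration standards: bar length (mm) vs known concentration (uM)
a, b = fit_line([5.0, 10.0, 15.0], [10.0, 20.0, 30.0])
unknown = a * 12.0 + b  # read a 12 mm bar off the calibration line
```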

  9. [A new method of processing quantitative PCR data].

    Science.gov (United States)

    Ke, Bing-Shen; Li, Guang-Yun; Chen, Shi-Min; Huang, Xiang-Yan; Chen, Ying-Jian; Xu, Jun

    2003-05-01

    Standard PCR can no longer satisfy the needs of biotechnology development and clinical research. After extensive kinetic studies, PE company found a linear relation between the initial template number and the cycle at which the accumulating fluorescent product becomes detectable. On this basis they developed a quantitative PCR technique used in the PE7700 and PE5700. However, the error of this technique is too great for biotechnology development and clinical research, and a better quantitative PCR technique is needed. The mathematical model submitted here combines achievements of related sciences and is based on the PCR principle and a careful analysis of the molecular relationships among the main members of the PCR reaction system. The model describes the functional relation between product quantity (or fluorescence intensity) and the initial template number and other reaction conditions, and can accurately reflect the accumulation of PCR product molecules. Accurate quantitative PCR analysis can be performed using this functional relation: the accumulated PCR product quantity can be obtained from the initial template number. Using this model for quantitative PCR analysis, the result error is related only to the accuracy of the fluorescence intensity measurement, i.e. to the instrument used. For example, when the fluorescence intensity is accurate to 6 digits and the template size is between 100 and 1,000,000, the accuracy of the quantitative result will be more than 99%. The result error differs markedly between analysis methods even under the same conditions on the same instrument. Moreover, if this model is used to process the data, the result is about 80 times more accurate than with the CT method.

  10. Travelling Methods: Tracing the Globalization of Qualitative Communication Research

    Directory of Open Access Journals (Sweden)

    Bryan C. Taylor

    2016-05-01

    Full Text Available Existing discussion of the relationships between globalization, communication research, and qualitative methods emphasizes two images: the challenges posed by globalization to existing communication theory and research methods, and the impact of post-colonial politics and ethics on qualitative research. We draw in this paper on a third image – qualitative research methods as artifacts of globalization – to explore the globalization of qualitative communication research methods. Following a review of literature which tentatively models this process, we discuss two case studies of qualitative research in the disciplinary subfields of intercultural communication and media audience studies. These cases elaborate the forces which influence the articulation of national, disciplinary, and methodological identities which mediate the globalization of qualitative communication research methods.

  11. Quantitative EEG Applying the Statistical Recognition Pattern Method

    DEFF Research Database (Denmark)

    Engedal, Knut; Snaedal, Jon; Hoegh, Peter

    2015-01-01

    BACKGROUND/AIM: The aim of this study was to examine the discriminatory power of quantitative EEG (qEEG) applying the statistical pattern recognition (SPR) method to separate Alzheimer's disease (AD) patients from elderly individuals without dementia and from other dementia patients. METHODS...

  12. Original methods of quantitative analysis developed for diverse samples in various research fields. Quantitative analysis at NMCC

    International Nuclear Information System (INIS)

    Sera, Koichiro

    2003-01-01

    Nishina Memorial Cyclotron Center (NMCC) has been open for nationwide common utilization of positron nuclear medicine (PET) and PIXE since April 1993. At present, nearly 40 PIXE subjects in various research fields are pursued there, and more than 50,000 samples have been analyzed to date. In order to perform quantitative analyses of diverse samples, technical developments in sample preparation, measurement and data analysis have been carried out continuously. In particular, a ''standard-free method for quantitative analysis'' made it possible to analyze infinitesimal samples, powdered samples and untreated bio-samples, which could not be analyzed quantitatively in the past. The ''standard-free method'' and a ''powdered internal standard method'' made target preparation much easier. It has been confirmed that results obtained by these methods show satisfactory accuracy and reproducibility, free of any ambiguity arising from complicated target preparation processes. (author)

  13. Quantitative global and gene-specific promoter methylation in relation to biological properties of neuroblastomas

    Directory of Open Access Journals (Sweden)

    Kiss Nimrod B

    2012-09-01

    Full Text Available Abstract Background In this study we aimed to quantify tumor suppressor gene (TSG) promoter methylation densities in primary neuroblastoma tumors and cell lines. A subset of these TSGs is associated with a CpG island methylator phenotype (CIMP) in other tumor types. Methods The study panel consisted of 38 primary tumors, 7 established cell lines and 4 healthy references. Promoter methylation was determined by bisulphite Pyrosequencing for 14 TSGs, and LINE-1 repeat element methylation was used as an indicator of global methylation levels. Results Overall mean TSG Z-scores were significantly increased in cases with adverse outcome, but were unrelated to global LINE-1 methylation. CIMP with hypermethylation of three or more gene promoters was observed in 6/38 tumors and 7/7 cell lines. Hypermethylation of one or more TSGs (comprising BLU, CASP8, DCR2, CDH1, RASSF1A and RASSF2) was evident in 30/38 tumors. By contrast, only very low levels of promoter methylation were recorded for APC, DAPK1, NORE1A, P14, P16, TP73, PTEN and RARB. Similar patterns of methylation instability were revealed in cell line models and neuroblastoma tumors. Separate analysis of two proposed CASP8 regulatory regions revealed frequent and significant involvement of CpG sites between exons 4 and 5, but modest involvement of the exon 1 region. Conclusions/significance The results highlight the involvement of TSG methylation instability in neuroblastoma tumors and cell lines using quantitative methods, support the use of DNA methylation analyses as a prognostic tool for this tumor type, and underscore the relevance of developing demethylating therapies for its treatment.
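
A mean TSG Z-score of the kind reported above can be sketched as follows: standardize each gene's tumor methylation density against healthy reference samples, then average across the panel. The gene names are reused from the study's panel, but all numerical values here are hypothetical, and this simple formulation is an assumption, not the authors' exact scoring procedure.

```python
from statistics import mean, stdev

def mean_tsg_z_score(tumor, references):
    """Mean z-score of tumor promoter methylation across a TSG panel.

    `tumor` maps gene -> methylation density (%) in the tumor sample;
    `references` maps gene -> list of densities in healthy references.
    Each gene's z-score is (tumor - reference mean) / reference SD.
    """
    z = [(tumor[g] - mean(ref)) / stdev(ref) for g, ref in references.items()]
    return mean(z)

refs = {"RASSF1A": [2.0, 3.0, 4.0], "CASP8": [1.0, 2.0, 3.0]}   # hypothetical
tumor = {"RASSF1A": 40.0, "CASP8": 2.0}                          # hypothetical
print(mean_tsg_z_score(tumor, refs))  # 18.5
```

A strongly hypermethylated promoter (RASSF1A here) dominates the mean, which is why such composite scores can separate adverse-outcome cases even when most genes in the panel are unmethylated.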

  14. [Teaching quantitative methods in public health: the EHESP experience].

    Science.gov (United States)

    Grimaud, Olivier; Astagneau, Pascal; Desvarieux, Moïse; Chambaud, Laurent

    2014-01-01

    Many scientific disciplines, including epidemiology and biostatistics, are used in the field of public health. These quantitative sciences are fundamental tools necessary for the practice of future professionals. What then should be the minimum quantitative sciences training, common to all future public health professionals? By comparing the teaching models developed in Columbia University and those in the National School of Public Health in France, the authors recognize the need to adapt teaching to the specific competencies required for each profession. They insist that all public health professionals, whatever their future career, should be familiar with quantitative methods in order to ensure that decision-making is based on a reflective and critical use of quantitative analysis.

  15. Quantitative Preparation in Doctoral Education Programs: A Mixed-Methods Study of Doctoral Student Perspectives on their Quantitative Training

    Directory of Open Access Journals (Sweden)

    Sarah L Ferguson

    2017-07-01

    Full Text Available Aim/Purpose: The purpose of the current study is to explore student perceptions of their own doctoral-level education and quantitative proficiency. Background: The challenges of preparing doctoral students in education have been discussed in the literature, but largely from the perspective of university faculty and program administrators. The current study directly explores the student voice on this issue. Methodology: Utilizing a sequential explanatory mixed-methods research design, the present study seeks to better understand doctoral-level education students’ perceptions of their quantitative methods training at a large public university in the southwestern United States. Findings: Results from both phases present the need for more application and consistency in doctoral-level quantitative courses. Additionally, there was a consistent theme of internal motivation in the responses, suggesting students perceive their quantitative training to be valuable beyond their personal interest in the topic. Recommendations for Practitioners: Quantitative methods instructors should emphasize practice in their quantitative courses and consider providing additional support for students through the inclusion of lab sections, tutoring, and/or differentiation. Pre-testing statistical ability at the start of a course is also suggested to better meet student needs. Impact on Society: The ultimate goal of quantitative methods in doctoral education is to produce high-quality educational researchers who are prepared to apply their knowledge to problems and research in education. Results of the present study can inform faculty and administrator decisions in doctoral education to best support this goal. Future Research: Using the student perspectives presented in the present study, future researchers should continue to explore effective instructional strategies and curriculum design within education doctoral programs. The inclusion of student voice can strengthen

  16. Global quantitative indices reflecting provider process-of-care: data-base derivation

    Directory of Open Access Journals (Sweden)

    Solomon Patricia J

    2010-04-01

    Full Text Available Abstract Background Controversy has attended the relationship between risk-adjusted mortality and process-of-care. There would be advantage in the establishment, at the data-base level, of global quantitative indices subsuming the diversity of process-of-care. Methods A retrospective, cohort study of patients identified in the Australian and New Zealand Intensive Care Society Adult Patient Database, 1993-2003, at the level of geographic and ICU-level descriptors (n = 35), for both hospital survivors and non-survivors. Process-of-care indices were established by analysis of: (i) the smoothed time-hazard curve of individual patient discharge, determined by pharmaco-kinetic methods as area under the hazard-curve (AUC), reflecting the integrated experience of the discharge process, and time-to-peak-hazard (TMAX, in days), reflecting the time to maximum rate of hospital discharge; and (ii) individual patient ability to optimize output (as length-of-stay) for recorded data-base physiological inputs, estimated as a technical production-efficiency (TE, scaled [0, (maximum) 1]), via the econometric technique of stochastic frontier analysis. For each descriptor, multivariate correlation-relationships between indices and summed mortality probability were determined. Results The data-set consisted of 223129 patients from 99 ICUs with mean (SD) age and APACHE III score of 59.2 (18.9) years and 52.7 (30.6) respectively; 41.7% were female and 45.7% were mechanically ventilated within the first 24 hours post-admission. For survivors, AUC was maximal in rural and for-profit ICUs, whereas TMAX (≥ 7.8 days) and TE (≥ 0.74) were maximal in tertiary-ICUs. For non-survivors, AUC was maximal in tertiary-ICUs, but TMAX (≥ 4.2 days) and TE (≥ 0.69) were maximal in for-profit ICUs. Across descriptors, significant differences in indices were demonstrated (analysis-of-variance, P ≤ 0.0001). Total explained variance, for survivors (0.89) and non-survivors (0.89), was maximized by
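
The two discharge-process indices defined in (i) can be illustrated numerically. The sampled hazard values below are invented, and a trapezoidal sum stands in for the pharmaco-kinetic fit used in the study; this is a sketch of the idea, not the paper's estimation procedure.

```python
def hazard_indices(times, hazard):
    """AUC (trapezoidal) and time-to-peak (TMAX) of a discharge hazard curve.

    `times` (days since admission) and `hazard` (instantaneous discharge
    rate) sample a smoothed time-hazard curve.
    """
    auc = sum((hazard[i] + hazard[i + 1]) / 2 * (times[i + 1] - times[i])
              for i in range(len(times) - 1))
    tmax = times[max(range(len(hazard)), key=hazard.__getitem__)]
    return auc, tmax

# Invented hazard samples: discharge rate rises, peaks on day 2, then falls.
t = [0, 1, 2, 3, 4, 5]
h = [0.00, 0.10, 0.25, 0.20, 0.10, 0.05]
auc, tmax = hazard_indices(t, h)
print(auc, tmax)  # AUC ≈ 0.675, peak hazard on day 2
```

A larger AUC summarizes a more intense overall discharge experience, while a later TMAX indicates that the maximum rate of hospital discharge occurs later in the stay.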

  17. ADVANCING THE STUDY OF VIOLENCE AGAINST WOMEN USING MIXED METHODS: INTEGRATING QUALITATIVE METHODS INTO A QUANTITATIVE RESEARCH PROGRAM

    Science.gov (United States)

    Testa, Maria; Livingston, Jennifer A.; VanZile-Tamsen, Carol

    2011-01-01

    A mixed methods approach, combining quantitative with qualitative data methods and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women’s sexual victimization has resulted in valuable scientific insight and generation of novel hypotheses for testing. This mixed methods approach is described and recommendations for integrating qualitative data into quantitative research are provided. PMID:21307032

  18. Methodological Reporting in Qualitative, Quantitative, and Mixed Methods Health Services Research Articles

    Science.gov (United States)

    Wisdom, Jennifer P; Cavaleri, Mary A; Onwuegbuzie, Anthony J; Green, Carla A

    2012-01-01

    Objectives Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles. Data Sources All empirical articles (n = 1,651) published between 2003 and 2007 from four top-ranked health services journals. Study Design All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed method reports). We used chi-square tests to evaluate differences between article types for each component. Principal Findings Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ2(1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1.27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ2(1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively). Conclusion Few published health services research articles use mixed methods. The frequency of key methodological components is variable. Suggestions are provided to increase the
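
The between-group comparisons reported above use chi-square tests on 2x2 tables (methodological component present/absent by article type). A minimal sketch of the Pearson statistic, with invented counts rather than the study's data:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic (no continuity correction) for the
    2x2 count table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts: a component present in 20/100 articles of one type
# versus 40/100 of another (NOT the counts from the study above).
stat = chi_square_2x2(20, 80, 40, 60)
print(round(stat, 2))  # 9.52
```

With 1 degree of freedom, a statistic this large corresponds to p < 0.005, the same kind of significance comparison the abstract reports.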

  19. Analysis of methods for quantitative renography

    International Nuclear Information System (INIS)

    Archambaud, F.; Maksud, P.; Prigent, A.; Perrin-Fayolle, O.

    1995-01-01

    This article reviews the main methods using renography to estimate renal perfusion indices and to quantify differential and global renal function. The review addresses the pathophysiological significance of estimated parameters according to the underlying models and the choice of the radiopharmaceutical. The dependence of these parameters on the region of interest characteristics and on the methods of background and attenuation corrections are surveyed. Some current recommendations are proposed. (authors). 66 refs., 8 figs

  20. Calibration of quantitative neutron radiography method for moisture measurement

    International Nuclear Information System (INIS)

    Nemec, T.; Jeraj, R.

    1999-01-01

    Quantitative measurements of moisture and hydrogenous matter in building materials by neutron radiography (NR) are regularly performed at the TRIGA Mark II research reactor of the 'Jozef Stefan' Institute in Ljubljana. Calibration of the quantitative method is performed using standard brick samples with known moisture content, and also with a secondary standard, a plexiglas step wedge. In general, the contribution of scattered neutrons to the neutron image is not determined explicitly, which introduces an error into the measured signal. The influence of scattered neutrons is significant in regions with high gradients of moisture concentration, where the build-up of scattered neutrons distorts the moisture concentration profile. In this paper, a detailed analysis of the validity of our calibration method for different geometrical parameters is presented. The error in the measured hydrogen concentration is evaluated experimentally and compared with results obtained by Monte Carlo calculation with the computer code MCNP 4B. Optimal conditions are determined for quantitative moisture measurements in order to minimize the error due to scattered neutrons. The method is tested on concrete samples with high moisture content. (author)

  1. The discussion on the qualitative and quantitative evaluation methods for safety culture

    International Nuclear Information System (INIS)

    Gao Kefu

    2005-01-01

    The fundamental methods of safety culture evaluation are described. Drawing on the practice of quantitative safety culture evaluation at the Daya Bay NPP, quantitative evaluation methods for safety culture are discussed. (author)

  2. Quantitative sociodynamics stochastic methods and models of social interaction processes

    CERN Document Server

    Helbing, Dirk

    1995-01-01

    Quantitative Sociodynamics presents a general strategy for interdisciplinary model building and its application to a quantitative description of behavioural changes based on social interaction processes. Originally, the crucial methods for the modeling of complex systems (stochastic methods and nonlinear dynamics) were developed in physics but they have very often proved their explanatory power in chemistry, biology, economics and the social sciences. Quantitative Sociodynamics provides a unified and comprehensive overview of the different stochastic methods, their interrelations and properties. In addition, it introduces the most important concepts from nonlinear dynamics (synergetics, chaos theory). The applicability of these fascinating concepts to social phenomena is carefully discussed. By incorporating decision-theoretical approaches a very fundamental dynamic model is obtained which seems to open new perspectives in the social sciences. It includes many established models as special cases, e.g. the log...

  3. Quantitative Sociodynamics Stochastic Methods and Models of Social Interaction Processes

    CERN Document Server

    Helbing, Dirk

    2010-01-01

    This new edition of Quantitative Sociodynamics presents a general strategy for interdisciplinary model building and its application to a quantitative description of behavioral changes based on social interaction processes. Originally, the crucial methods for the modeling of complex systems (stochastic methods and nonlinear dynamics) were developed in physics and mathematics, but they have very often proven their explanatory power in chemistry, biology, economics and the social sciences as well. Quantitative Sociodynamics provides a unified and comprehensive overview of the different stochastic methods, their interrelations and properties. In addition, it introduces important concepts from nonlinear dynamics (e.g. synergetics, chaos theory). The applicability of these fascinating concepts to social phenomena is carefully discussed. By incorporating decision-theoretical approaches, a fundamental dynamic model is obtained, which opens new perspectives in the social sciences. It includes many established models a...

  4. Quantitative Method of Measuring Metastatic Activity

    Science.gov (United States)

    Morrison, Dennis R. (Inventor)

    1999-01-01

    The metastatic potential of tumors can be evaluated by the quantitative detection of urokinase and DNA. The cell sample selected for examination is analyzed for the presence of high levels of urokinase and abnormal DNA using analytical flow cytometry and digital image analysis. Other factors such as membrane-associated urokinase, increased DNA synthesis rates and certain receptors can be used in the method for detection of potentially invasive tumors.

  5. Modeling Cancer Metastasis using Global, Quantitative and Integrative Network Biology

    DEFF Research Database (Denmark)

    Schoof, Erwin; Erler, Janine

    understanding of molecular processes which are fundamental to tumorigenesis. In Article 1, we propose a novel framework for how cancer mutations can be studied by taking into account their effect at the protein network level. In Article 2, we demonstrate how global, quantitative data on phosphorylation dynamics...... can be generated using MS, and how this can be modeled using a computational framework for deciphering kinase-substrate dynamics. This framework is described in depth in Article 3, and covers the design of KinomeXplorer, which allows the prediction of kinases responsible for modulating observed...... phosphorylation dynamics in a given biological sample. In Chapter III, we move into Integrative Network Biology, where, by combining two fundamental technologies (MS & NGS), we can obtain more in-depth insights into the links between cellular phenotype and genotype. Article 4 describes the proof...

  6. The method of global learning in teaching foreign languages

    Directory of Open Access Journals (Sweden)

    Tatjana Dragovič

    2001-12-01

    Full Text Available The authors describe the method of global learning of foreign languages, which is based on the principles of neurolinguistic programming (NLP. According to this theory, the educator should use the method of so-called periphery learning, where students learn relaxation techniques and at the same time »incidentally« or subconsciously learn a foreign language. The method of global learning imitates successful strategies of learning in early childhood and therefore creates a relaxed attitude towards learning. Global learning is also compared with standard methods.

  7. Conventional method for the calculation of the global energy cost of buildings; Methode conventionnelle de calcul du cout global energetique des batiments

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2002-05-01

    A working group led by the Electricite de France (EdF), Chauffage Fioul and Gaz de France (GdF) companies was set up, with the support of several building engineering companies, to clarify the use of the method for calculating the global energy cost of buildings. This global cost is one economic decision-making criterion among others. This press kit first presents the content of the method (input data, calculation of annual expenses, calculation of the global energy cost, display of results and limitations of the method). It then fully describes the method and the appendices necessary for its implementation: economic and financial context, general data of the project in progress, environmental data, occupation and comfort level, variants, investment cost of energy systems, investment cost of the structure linked with the energy system, investment cost of other invariant elements of the structure, calculation of consumptions (space heating, hot water, ventilation), maintenance costs (energy systems, structure), operation and exploitation costs, tariffs, consumption costs and taxes, actualized global cost, annualized global cost, and comparison between variants. The method is applied, as an example, to a council building of 23 flats. (J.S.)

  8. The method of quantitative X-ray microanalysis of fine inclusions in copper

    International Nuclear Information System (INIS)

    Morawiec, H.; Kubica, L.; Piszczek, J.

    1978-01-01

    The method of correction for the matrix effect in quantitative X-ray microanalysis was presented. The application of the method was discussed using the example of quantitative analysis of fine inclusions of Cu2S and Cu2O in copper. (author)

  9. Studying learning in the healthcare setting: the potential of quantitative diary methods

    NARCIS (Netherlands)

    Ciere, Yvette; Jaarsma, Debbie; Visser, Annemieke; Sanderman, Robbert; Snippe, Evelien; Fleer, Joke

    2015-01-01

    Quantitative diary methods are longitudinal approaches that involve the repeated measurement of aspects of peoples’ experience of daily life. In this article, we outline the main characteristics and applications of quantitative diary methods and discuss how their use may further research in the

  10. Methodological reporting in qualitative, quantitative, and mixed methods health services research articles.

    Science.gov (United States)

    Wisdom, Jennifer P; Cavaleri, Mary A; Onwuegbuzie, Anthony J; Green, Carla A

    2012-04-01

    Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles. All empirical articles (n = 1,651) published between 2003 and 2007 from four top-ranked health services journals. All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed method reports). We used chi-square tests to evaluate differences between article types for each component. Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ(2) (1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1.27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ(2) (1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively). Few published health services research articles use mixed methods. The frequency of key methodological components is variable. Suggestions are provided to increase the transparency of mixed methods studies and

  11. A generalized quantitative antibody homeostasis model: maintenance of global antibody equilibrium by effector functions.

    Science.gov (United States)

    Prechl, József

    2017-11-01

    The homeostasis of antibodies can be characterized as a balance of production, target binding and receptor-mediated elimination, regulated by an interaction network which controls B-cell development and selection. Recently, we proposed a quantitative model to describe how the concentration and affinity of interacting partners generate a network. Here we argue that this physical, quantitative approach can be extended to the interpretation of the effector functions of antibodies. We define global antibody equilibrium as the zone of molar equivalence of free-antibody, free-antigen and immune-complex concentrations and of the dissociation constant of apparent affinity: [Ab] = [Ag] = [AbAg] = K_D. This zone corresponds to the biologically relevant K_D range of reversible interactions. We show that the thermodynamic and kinetic properties of antibody-antigen interactions correlate with immunological functions. The formation of stable, long-lived immune complexes corresponds to a decrease of entropy and is a prerequisite for the generation of higher-order complexes. As the energy of formation of complexes increases, we observe a gradual shift from silent clearance to inflammatory reactions. These rules can also be applied to complement activation-related immune effector processes, linking the physicochemical principles of innate and adaptive humoral responses. The receptors mediating effector functions span a wide range of affinities, allowing the continuous sampling of antibody-bound antigen over the complete range of concentrations. The generation of multivalent, multicomponent complexes triggers effector functions by crosslinking these receptors on effector cells with increasing enzymatic degradation potential. Thus, antibody homeostasis is a thermodynamic system with complex network properties, nested into the host organism by proper immunoregulatory and effector pathways. Maintenance of global antibody equilibrium is achieved by innate qualitative signals modulating a
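
The equilibrium zone defined above ([Ab] = [Ag] = [AbAg] = K_D) can be checked against the standard 1:1 mass-action quadratic for reversible binding. The 10 nM affinity below is an arbitrary illustrative value, and this generic binding calculation is a sketch of the underlying physical chemistry, not the authors' full network model.

```python
import math

def complex_conc(ab_total, ag_total, kd):
    """Equilibrium immune-complex concentration [AbAg] for 1:1 binding.

    Solves the mass-action relation Kd = (ab_total - C)(ag_total - C) / C,
    taking the physically meaningful root of the quadratic.
    """
    s = ab_total + ag_total + kd
    return (s - math.sqrt(s * s - 4.0 * ab_total * ag_total)) / 2.0

kd = 1e-8  # 10 nM, an assumed illustrative affinity
c = complex_conc(2 * kd, 2 * kd, kd)
# With total antibody and antigen each at 2*Kd, free Ab, free Ag and the
# complex all settle at Kd -- the "global antibody equilibrium" zone.
print(abs(c - kd) < 1e-12)  # True
```

Moving total concentrations away from this zone drives the system toward either mostly free reactants (concentrations far below K_D) or mostly complex (far above), which is what makes the equivalence zone the biologically informative regime.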

  12. Spring and Its Global Echo: Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    A. V. Korotayev

    2017-01-01

    Full Text Available It is shown that the Arab Spring acted as a trigger for a global wave of socio-political destabilization, which significantly exceeded the scale of the Arab Spring itself and affected absolutely all world-system zones. Only in 2011 was the growth of the global number of large-scale anti-government demonstrations, riots and political strikes to a high degree (although not entirely) due to their growth in the Arab world. In the ensuing years, the Arab countries rather made a negative contribution to a very noticeable further increase in the global number of large-scale anti-government demonstrations, riots and general strikes (the global intensity of all three of these important types of socio-political destabilization continued to grow despite the decline in the Arab world). Thus, for all three of these important indicators of socio-political destabilization, the scale of the global echo of the Arab Spring has overshadowed the scale of the Arab Spring itself. Only as regards the fourth considered indicator (major terrorist attacks / guerrilla warfare) did the scale of the global echo for the entire period considered not overshadow the scale of the Arab Spring (and, incidentally, «Winter»); in 2014-2015 Arab countries continued to make a disproportionate contribution to the historically record global values of this sad indicator, the global number of major terrorist attacks / guerrilla warfare. To conclude, the global wave of socio-political destabilization triggered by the Arab Spring led after 2010 to a very significant growth of socio-political instability in absolutely all World System zones. However, this global destabilization wave manifested itself in different World System zones in different ways and not completely synchronously.

  13. Global quantitative indices reflecting provider process-of-care: data-base derivation.

    Science.gov (United States)

    Moran, John L; Solomon, Patricia J

    2010-04-19

    Controversy has attended the relationship between risk-adjusted mortality and process-of-care. There would be advantage in the establishment, at the data-base level, of global quantitative indices subsuming the diversity of process-of-care. A retrospective, cohort study of patients identified in the Australian and New Zealand Intensive Care Society Adult Patient Database, 1993-2003, at the level of geographic and ICU-level descriptors (n = 35), for both hospital survivors and non-survivors. Process-of-care indices were established by analysis of: (i) the smoothed time-hazard curve of individual patient discharge, determined by pharmaco-kinetic methods as area under the hazard-curve (AUC), reflecting the integrated experience of the discharge process, and time-to-peak-hazard (TMAX, in days), reflecting the time to maximum rate of hospital discharge; and (ii) individual patient ability to optimize output (as length-of-stay) for recorded data-base physiological inputs; estimated as a technical production-efficiency (TE, scaled [0,(maximum)1]), via the econometric technique of stochastic frontier analysis. For each descriptor, multivariate correlation-relationships between indices and summed mortality probability were determined. The data-set consisted of 223129 patients from 99 ICUs with mean (SD) age and APACHE III score of 59.2(18.9) years and 52.7(30.6) respectively; 41.7% were female and 45.7% were mechanically ventilated within the first 24 hours post-admission. For survivors, AUC was maximal in rural and for-profit ICUs, whereas TMAX (≥ 7.8 days) and TE (≥ 0.74) were maximal in tertiary-ICUs. For non-survivors, AUC was maximal in tertiary-ICUs, but TMAX (≥ 4.2 days) and TE (≥ 0.69) were maximal in for-profit ICUs. Across descriptors, significant differences in indices were demonstrated (analysis-of-variance, P ≤ 0.0001). Total explained variance, for survivors (0.89) and non-survivors (0.89), was maximized by combinations of indices demonstrating a low correlation with

  14. Validation of the Mass-Extraction-Window for Quantitative Methods Using Liquid Chromatography High Resolution Mass Spectrometry.

    Science.gov (United States)

    Glauser, Gaétan; Grund, Baptiste; Gassner, Anne-Laure; Menin, Laure; Henry, Hugues; Bromirski, Maciej; Schütz, Frédéric; McMullen, Justin; Rochat, Bertrand

    2016-03-15

    A paradigm shift is underway in the field of quantitative liquid chromatography-mass spectrometry (LC-MS) analysis thanks to the arrival of recent high-resolution mass spectrometers (HRMS). The capability of HRMS to perform sensitive and reliable quantifications of a large variety of analytes in HR-full scan mode is showing that it is now realistic to perform quantitative and qualitative analysis with the same instrument. Moreover, HR-full scan acquisition offers a global view of sample extracts and allows retrospective investigations as virtually all ionized compounds are detected with a high sensitivity. In time, the versatility of HRMS together with the increasing need for relative quantification of hundreds of endogenous metabolites should promote a shift from triple-quadrupole MS to HRMS. However, a current "pitfall" in quantitative LC-HRMS analysis is the lack of HRMS-specific guidance for validated quantitative analyses. Indeed, false positive and false negative HRMS detections are rare, albeit possible, if inadequate parameters are used. Here, we investigated two key parameters for the validation of LC-HRMS quantitative analyses: the mass accuracy (MA) and the mass-extraction-window (MEW) that is used to construct the extracted-ion-chromatograms. We propose MA-parameters, graphs, and equations to calculate rational MEW width for the validation of quantitative LC-HRMS methods. MA measurements were performed on four different LC-HRMS platforms. Experimentally determined MEW values ranged between 5.6 and 16.5 ppm and depended on the HRMS platform, its working environment, the calibration procedure, and the analyte considered. The proposed procedure provides a fit-for-purpose MEW determination and prevents false detections.
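
The mass-extraction-window logic can be sketched as follows. The helper names and the simplified peak-list representation are illustrative assumptions, not the validated procedure or any vendor API; real HRMS data would come from centroided scans.

```python
def mass_extraction_window(target_mz, ppm):
    """Lower and upper m/z bounds of an extracted-ion-chromatogram window,
    for a mass-extraction-window expressed in ppm of the target m/z."""
    delta = target_mz * ppm / 1e6
    return (target_mz - delta, target_mz + delta)

def extract_xic(scans, target_mz, ppm):
    """Per-scan summed intensity of peaks inside the ppm window.

    `scans` is a list of scans, each a list of (m/z, intensity) pairs --
    a simplified stand-in for centroided HRMS data.
    """
    lo, hi = mass_extraction_window(target_mz, ppm)
    return [sum(i for mz, i in scan if lo <= mz <= hi) for scan in scans]

# A 10 ppm window around m/z 500.0000 spans approximately 499.995-500.005.
print(mass_extraction_window(500.0, 10.0))
```

Too narrow a window (relative to the instrument's mass accuracy) drops true signal and causes false negatives; too wide a window admits isobaric interferences and causes false positives, which is the trade-off the validation procedure above is designed to quantify.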

  15. Industrial ecology: Quantitative methods for exploring a lower carbon future

    Science.gov (United States)

    Thomas, Valerie M.

    2015-03-01

    Quantitative methods for environmental and cost analyses of energy, industrial, and infrastructure systems are briefly introduced and surveyed, with the aim of encouraging broader utilization and development of quantitative methods in sustainable energy research. Material and energy flow analyses can provide an overall system overview. The methods of engineering economics and cost benefit analysis, such as net present values, are the most straightforward approach for evaluating investment options, with the levelized cost of energy being a widely used metric in electricity analyses. Environmental lifecycle assessment has been extensively developed, with both detailed process-based and comprehensive input-output approaches available. Optimization methods provide an opportunity to go beyond engineering economics to develop detailed least-cost or least-impact combinations of many different choices.
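
    For instance, the levelized cost of energy mentioned above is a ratio of discounted costs to discounted energy output. A minimal sketch with purely illustrative numbers (not from the text):

```python
def lcoe(costs, energy, rate):
    """Levelized cost of energy: present value of costs divided by present
    value of energy delivered, over the project lifetime.

    costs[t] and energy[t] are the year-t cost and output; rate is the
    annual discount rate.
    """
    pv_cost = sum(c / (1 + rate) ** t for t, c in enumerate(costs))
    pv_energy = sum(e / (1 + rate) ** t for t, e in enumerate(energy))
    return pv_cost / pv_energy

# Hypothetical plant: 1000 upfront, 50/yr O&M, 200 MWh/yr for 3 years, 5% rate
cost_per_mwh = lcoe([1000, 50, 50, 50], [0, 200, 200, 200], 0.05)
```

    The same discounting machinery underlies the net-present-value comparisons the survey describes; LCOE simply normalizes the result per unit of energy so that technologies with different lifetimes and cost profiles can be compared.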

  16. Intra-laboratory validation of chronic bee paralysis virus quantitation using an accredited standardised real-time quantitative RT-PCR method.

    Science.gov (United States)

    Blanchard, Philippe; Regnault, Julie; Schurr, Frank; Dubois, Eric; Ribière, Magali

    2012-03-01

    Chronic bee paralysis virus (CBPV) is responsible for chronic bee paralysis, an infectious and contagious disease in adult honey bees (Apis mellifera L.). A real-time RT-PCR assay to quantitate the CBPV load is now available. To propose this assay as a reference method, it was characterised further in an intra-laboratory study during which the reliability and the repeatability of results and the performance of the assay were confirmed. The qPCR assay alone and the whole quantitation method (from sample RNA extraction to analysis) were both assessed following the ISO/IEC 17025 standard and the recent XP U47-600 standard issued by the French Standards Institute. The performance of the qPCR assay and of the overall CBPV quantitation method were validated over a 6-log range from 10² to 10⁸ with a detection limit of 50 and 100 CBPV RNA copies, respectively, and the protocol of the real-time RT-qPCR assay for CBPV quantitation was approved by the French Accreditation Committee.
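
    Quantitation with such an assay is conventionally read off a log-linear standard curve (Cq versus log10 copies). A hedged sketch with a hypothetical dilution series spanning the validated 10²-10⁸ range (the numbers are illustrative, not the paper's data):

```python
def fit_standard_curve(log10_copies, cq_values):
    """Least-squares fit of Cq = slope * log10(copies) + intercept."""
    n = len(log10_copies)
    mx = sum(log10_copies) / n
    my = sum(cq_values) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_copies, cq_values))
    sxx = sum((x - mx) ** 2 for x in log10_copies)
    slope = sxy / sxx
    return slope, my - slope * mx

def copies_from_cq(cq, slope, intercept):
    """Invert the standard curve to estimate a sample's copy number."""
    return 10 ** ((cq - intercept) / slope)

def efficiency(slope):
    """PCR amplification efficiency implied by the slope (1.0 = 100%)."""
    return 10 ** (-1 / slope) - 1

# Hypothetical CBPV standard dilution series over the validated range
logs = [2, 3, 4, 5, 6, 7, 8]
cqs = [35.1, 31.8, 28.4, 25.1, 21.8, 18.4, 15.1]
slope, intercept = fit_standard_curve(logs, cqs)
```

    A slope near -3.32 corresponds to ~100% amplification efficiency, one of the performance characteristics an intra-laboratory validation of this kind checks.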

  17. Characterization of global yeast quantitative proteome data generated from the wild-type and glucose repression Saccharomyces cerevisiae strains: The comparison of two quantitative methods

    DEFF Research Database (Denmark)

    Usaite, Renata; Wohlschlegel, James; Venable, John D.

    2008-01-01

    The quantitative proteomic analysis of complex protein mixtures is emerging as a technically challenging but viable systems-level approach for studying cellular function. This study presents a large-scale comparative analysis of protein abundances from yeast protein lysates derived from both wild-type yeast and yeast strains lacking key components of the Snf1 kinase complex. Four different strains were grown under well-controlled chemostat conditions. Multidimensional protein identification technology followed by quantitation using either spectral counting or stable isotope labeling approaches ... labeling strategy. The stable isotope labeling based quantitative approach was found to be highly reproducible among biological replicates when complex protein mixtures containing small expression changes were analyzed. Where poor correlation between stable isotope labeling and spectral counting was found...

  18. A general method for bead-enhanced quantitation by flow cytometry

    Science.gov (United States)

    Montes, Martin; Jaensson, Elin A.; Orozco, Aaron F.; Lewis, Dorothy E.; Corry, David B.

    2009-01-01

    Flow cytometry provides accurate relative cellular quantitation (percent abundance) of cells from diverse samples, but technical limitations of most flow cytometers preclude accurate absolute quantitation. Several quantitation standards are now commercially available which, when added to samples, permit absolute quantitation of CD4+ T cells. However, these reagents are limited by their cost, technical complexity, requirement for additional software and/or limited applicability. Moreover, few studies have validated the use of such reagents in complex biological samples, especially for quantitation of non-T cells. Here we show that addition to samples of known quantities of polystyrene fluorescence standardization beads permits accurate quantitation of CD4+ T cells from complex cell samples. This procedure, here termed single bead-enhanced cytofluorimetry (SBEC), was equally capable of enumerating eosinophils as well as subcellular fragments of apoptotic cells, moieties with very different optical and fluorescent characteristics. Relative to other proprietary products, SBEC is simple, inexpensive and requires no special software, suggesting that the method is suitable for the routine quantitation of most cells and other particles by flow cytometry. PMID:17067632
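
    The arithmetic behind bead-enhanced quantitation is a simple ratio: because the number of beads added is known, the bead events counted calibrate the fraction of the sample actually analyzed. A sketch with hypothetical numbers:

```python
def sbec_concentration(cell_events, bead_events, beads_added, volume_ul):
    """Absolute cell concentration from single bead-enhanced cytofluorimetry.

    cell_events, bead_events: gated event counts from the cytometer;
    beads_added: known number of standardization beads spiked in;
    volume_ul: sample volume in microlitres.
    """
    cells_in_sample = cell_events / bead_events * beads_added
    return cells_in_sample / volume_ul

# Hypothetical run: 5000 CD4+ events, 2500 bead events,
# 50000 beads added to a 100 uL sample
cd4_per_ul = sbec_concentration(5000, 2500, 50000, 100)
```

    The same ratio applies regardless of what is gated, which is why the approach extends to eosinophils or apoptotic fragments as the abstract reports.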

  19. Global optimization methods for engineering design

    Science.gov (United States)

    Arora, Jasbir S.

    1990-01-01

    The problem is to find a global minimum for Problem P. Necessary and sufficient conditions are available for local optimality. However, a global solution can be assured only under the assumption of convexity of the problem. If the constraint set S is compact and the cost function is continuous on it, the existence of a global minimum is guaranteed. However, since no global optimality conditions are available, a global solution can be found only by an exhaustive search that satisfies the inequality. The exhaustive search can be organized in such a way that the entire design space need not be searched for the solution, which somewhat reduces the computational burden. It is concluded that the zooming algorithm for global optimization appears to be a good alternative to stochastic methods. More testing is needed, and a general, robust, and efficient local minimizer is required. IDESIGN, which is based on a sequential quadratic programming algorithm, was used in all numerical calculations; since the feasible set keeps shrinking, a good algorithm to find an initial feasible point is also required. Such algorithms need to be developed and evaluated.
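
    The zooming idea, repeatedly demanding that any new candidate beat the best cost found so far, can be sketched on a one-dimensional problem. This is an illustrative toy, not the IDESIGN implementation the report used:

```python
def local_refine(f, x, step, lo, hi, iters=60):
    """Crude local minimizer: hill-descend with a shrinking step."""
    for _ in range(iters):
        moved = False
        for cand in (x - step, x + step):
            if lo <= cand <= hi and f(cand) < f(x):
                x, moved = cand, True
        if not moved:
            step /= 2
    return x

def zooming_minimize(f, lo, hi, n_grid=101, eps=1e-6):
    """After each local minimum, require the next start point to beat it
    by eps (the 'zoom' constraint), scanning a coarse grid for such a
    point; stop when no point satisfies the tightened target."""
    best_x, best_f = lo, f(lo)
    while True:
        grid = (lo + (hi - lo) * i / (n_grid - 1) for i in range(n_grid))
        starts = [x for x in grid if f(x) < best_f - eps]
        if not starts:
            return best_x, best_f
        x = local_refine(f, starts[0], (hi - lo) / n_grid, lo, hi)
        if f(x) < best_f:
            best_x, best_f = x, f(x)

# Two-well cost: local minimum near x = +2, global minimum near x = -2.03
f = lambda x: (x * x - 4) ** 2 + x
```

    Each pass shrinks the feasible set by the improvement constraint, mirroring the report's remark that a good method for finding an initial feasible point becomes critical as the set shrinks.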

  20. Quantitative EDXS analysis of organic materials using the ζ-factor method

    International Nuclear Information System (INIS)

    Fladischer, Stefanie; Grogger, Werner

    2014-01-01

    In this study we successfully applied the ζ-factor method to perform quantitative X-ray analysis of organic thin films consisting of light elements. With its ability to intrinsically correct for X-ray absorption, this method significantly improved the quality of the quantification as well as the accuracy of the results compared to conventional techniques in particular regarding the quantification of light elements. We describe in detail the process of determining sensitivity factors (ζ-factors) using a single standard specimen and the involved parameter optimization for the estimation of ζ-factors for elements not contained in the standard. The ζ-factor method was then applied to perform quantitative analysis of organic semiconducting materials frequently used in organic electronics. Finally, the results were verified and discussed concerning validity and accuracy. - Highlights: • The ζ-factor method is used for quantitative EDXS analysis of light elements. • We describe the process of determining ζ-factors from a single standard in detail. • Organic semiconducting materials are successfully quantified

  1. Application of quantitative and qualitative methods for determination ...

    African Journals Online (AJOL)

    This article covers the issues of integration of qualitative and quantitative methods applied when justifying management decision-making in companies implementing lean manufacturing. The authors defined goals and subgoals and justified the evaluation criteria which lead to the increased company value if achieved.

  2. Quantitative evaluation methods of skin condition based on texture feature parameters

    Directory of Open Access Journals (Sweden)

    Hui Pang

    2017-03-01

    In order to quantitatively evaluate the improvement of the skin condition after using skin care products and beauty treatments, a quantitative evaluation method for skin surface state and texture is presented, which is convenient, fast and non-destructive. Human skin images were collected by image sensors. Firstly, the median filter of the 3 × 3 window is used, and then the location of the hairy pixels on the skin is accurately detected according to the gray mean value and color information. Bilinear interpolation is used to modify the gray value of the hairy pixels in order to eliminate the negative effect of noise and tiny hairs on the texture. After the above pretreatment, the gray level co-occurrence matrix (GLCM) is calculated. On this basis, the four characteristic parameters, including the second moment, contrast, entropy and correlation, and their mean value are calculated at 45° intervals. The quantitative evaluation model of skin texture based on GLCM is established, which can calculate the comprehensive parameters of skin condition. Experiments show that the skin condition evaluated by this method agrees both with biochemical skin-evaluation indicators and with human visual experience. This method overcomes the shortcomings of the biochemical evaluation method, namely skin damage and long waiting times, as well as the subjectivity and fuzziness of visual evaluation, thereby achieving a non-destructive, rapid and quantitative evaluation of skin condition. It can be used for health assessment or classification of the skin condition, and can also quantitatively evaluate subtle improvements in skin condition after using skin care products or beauty treatments.

  3. Quantitative evaluation methods of skin condition based on texture feature parameters.

    Science.gov (United States)

    Pang, Hui; Chen, Tianhua; Wang, Xiaoyi; Chang, Zhineng; Shao, Siqi; Zhao, Jing

    2017-03-01

    In order to quantitatively evaluate the improvement of the skin condition after using skin care products and beauty treatments, a quantitative evaluation method for skin surface state and texture is presented, which is convenient, fast and non-destructive. Human skin images were collected by image sensors. Firstly, the median filter of the 3 × 3 window is used, and then the location of the hairy pixels on the skin is accurately detected according to the gray mean value and color information. Bilinear interpolation is used to modify the gray value of the hairy pixels in order to eliminate the negative effect of noise and tiny hairs on the texture. After the above pretreatment, the gray level co-occurrence matrix (GLCM) is calculated. On this basis, the four characteristic parameters, including the second moment, contrast, entropy and correlation, and their mean value are calculated at 45° intervals. The quantitative evaluation model of skin texture based on GLCM is established, which can calculate the comprehensive parameters of skin condition. Experiments show that the skin condition evaluated by this method agrees both with biochemical skin-evaluation indicators and with human visual experience. This method overcomes the shortcomings of the biochemical evaluation method, namely skin damage and long waiting times, as well as the subjectivity and fuzziness of visual evaluation, thereby achieving a non-destructive, rapid and quantitative evaluation of skin condition. It can be used for health assessment or classification of the skin condition, and can also quantitatively evaluate subtle improvements in skin condition after using skin care products or beauty treatments.
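
    A minimal sketch of the core step, computing a normalized GLCM and the four features at offsets spaced 45° apart as the abstract describes (toy two-level image; the hair-pixel preprocessing is omitted):

```python
import math

def glcm(img, dx, dy, levels):
    """Normalized gray-level co-occurrence matrix for pixel offset (dx, dy)."""
    m = [[0.0] * levels for _ in range(levels)]
    rows, cols = len(img), len(img[0])
    n = 0
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dy, c + dx
            if 0 <= r2 < rows and 0 <= c2 < cols:
                m[img[r][c]][img[r2][c2]] += 1
                n += 1
    return [[v / n for v in row] for row in m]

def glcm_features(p):
    """Second moment (energy), contrast, entropy and correlation of a GLCM.
    Correlation assumes a non-constant image (non-zero standard deviations)."""
    L = len(p)
    pairs = [(i, j) for i in range(L) for j in range(L)]
    asm = sum(p[i][j] ** 2 for i, j in pairs)
    contrast = sum((i - j) ** 2 * p[i][j] for i, j in pairs)
    entropy = -sum(p[i][j] * math.log(p[i][j]) for i, j in pairs if p[i][j] > 0)
    mu_i = sum(i * p[i][j] for i, j in pairs)
    mu_j = sum(j * p[i][j] for i, j in pairs)
    sd_i = math.sqrt(sum((i - mu_i) ** 2 * p[i][j] for i, j in pairs))
    sd_j = math.sqrt(sum((j - mu_j) ** 2 * p[i][j] for i, j in pairs))
    corr = sum((i - mu_i) * (j - mu_j) * p[i][j] for i, j in pairs) / (sd_i * sd_j)
    return {'asm': asm, 'contrast': contrast, 'entropy': entropy, 'correlation': corr}

# Toy 2-level checkerboard; offsets at 0, 45, 90 and 135 degrees
img = [[0, 1, 0, 1],
       [1, 0, 1, 0],
       [0, 1, 0, 1],
       [1, 0, 1, 0]]
offsets = [(1, 0), (1, -1), (0, 1), (1, 1)]
feats = [glcm_features(glcm(img, dx, dy, 2)) for dx, dy in offsets]
mean_feats = {k: sum(f[k] for f in feats) / len(feats) for k in feats[0]}
```

    Averaging the features over the four directions, as done here for the final `mean_feats`, reduces sensitivity to texture orientation.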

  4. Quantitative imaging biomarkers: a review of statistical methods for technical performance assessment.

    Science.gov (United States)

    Raunig, David L; McShane, Lisa M; Pennello, Gene; Gatsonis, Constantine; Carson, Paul L; Voyvodic, James T; Wahl, Richard L; Kurland, Brenda F; Schwarz, Adam J; Gönen, Mithat; Zahlmann, Gudrun; Kondratovich, Marina V; O'Donnell, Kevin; Petrick, Nicholas; Cole, Patricia E; Garra, Brian; Sullivan, Daniel C

    2015-02-01

    Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers to measure changes in these features. Critical to the performance of a quantitative imaging biomarker in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in designs, analysis methods, and metrics used to assess a quantitative imaging biomarker for clinical use. It is therefore difficult or impossible to integrate results from different studies or to use reported results to design studies. The Radiological Society of North America and the Quantitative Imaging Biomarker Alliance, with technical, radiological, and statistical experts, developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of quantitative imaging biomarker performance studies so that results from multiple studies can be compared, contrasted, or combined.
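
    As a concrete instance of the metrology such a framework standardizes, the widely used repeatability coefficient can be computed from test-retest data. The formula RC = 1.96·√2·wSD is standard; the measurements below are made up:

```python
import math

def within_subject_sd(replicates):
    """Pooled within-subject standard deviation from repeated measurements,
    one list of replicate values per subject."""
    ss, dof = 0.0, 0
    for reps in replicates:
        mean = sum(reps) / len(reps)
        ss += sum((x - mean) ** 2 for x in reps)
        dof += len(reps) - 1
    return math.sqrt(ss / dof)

def repeatability_coefficient(replicates):
    """RC = 1.96 * sqrt(2) * wSD: the bound that the difference between two
    repeat measurements of the same subject stays within ~95% of the time."""
    return 1.96 * math.sqrt(2) * within_subject_sd(replicates)

# Hypothetical test-retest measurements for three subjects
measurements = [[10.0, 12.0], [20.0, 18.0], [30.0, 31.0]]
rc = repeatability_coefficient(measurements)
```

    Reporting RC alongside linearity and bias estimates is exactly the kind of harmonized metric set that lets studies be compared or combined.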

  5. A quantitative method for evaluating alternatives. [aid to decision making

    Science.gov (United States)

    Forthofer, M. J.

    1981-01-01

    When faced with choosing between alternatives, people tend to use a number of criteria (often subjective, rather than objective) to decide which is the best alternative for them given their unique situation. The subjectivity inherent in the decision-making process can be reduced by the definition and use of a quantitative method for evaluating alternatives. This type of method can help decision makers achieve a degree of uniformity and completeness in the evaluation process, as well as an increased sensitivity to the factors involved. Additional side-effects are better documentation and visibility of the rationale behind the resulting decisions. General guidelines for defining a quantitative method are presented and a particular method (called 'hierarchical weighted average') is defined and applied to the evaluation of design alternatives for a hypothetical computer system capability.
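
    A hierarchical weighted average can be sketched as a recursive weighted mean over a criteria tree. The tree structure and numbers below are hypothetical, not the paper's case study:

```python
def hwa(node):
    """Hierarchical weighted average over a criteria tree.

    A node is either a leaf {'score': s} or an internal node
    {'children': [(weight, subnode), ...]} whose weights sum to 1.
    """
    if 'score' in node:
        return node['score']
    return sum(w * hwa(child) for w, child in node['children'])

# Hypothetical evaluation of one design alternative for a computer system
alternative = {'children': [
    (0.6, {'children': [(0.5, {'score': 8}),     # throughput
                        (0.5, {'score': 6})]}),  # reliability
    (0.4, {'score': 9}),                         # cost
]}
overall = hwa(alternative)
```

    Scoring each alternative against the same tree makes the criteria, weights, and rationale explicit, which is the documentation benefit the abstract notes.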

  6. Quantitative renal cinescintigraphy with iodine-123 hippuran methodological aspects, kit for labeling of hippuran

    International Nuclear Information System (INIS)

    Mehdaoui, A.; Pecking, A.; Delorme, G.; Mathonnat, F.; Debaud, B.; Bardy, A.; Coornaert, S.; Merlin, L.; Vinot, J.M.; Desgrez, A.; Gambini, D.; Vernejoul, P. de.

    1981-08-01

    The development of an extemporaneous kit for the labeling of ortho-iodo-hippuric acid (Hippuran) with iodine-123 allows routine quantitative renal cinescintigraphy, providing in 20 minutes, and in an absolutely non-traumatic way, a very complete renal morphofunctional study including: cortical renal scintigraphy, sequential scintigraphies of the excretory tract, renal functional curves, and tubular, global, and separate clearances for each kidney. This quantitative functional investigation method should take a preferential place in the routine renal work-up. The methodology of the technique is explained and compared to classical methods for the estimation of tubular, global and separate clearances. [fr]

  7. Can qualitative and quantitative methods serve complementary purposes for policy research?

    OpenAIRE

    Maxwell, Daniel G.

    1998-01-01

    Qualitative and quantitative methods in social science research have long been separate spheres with little overlap. However, recent innovations have highlighted the complementarity of qualitative and quantitative approaches. The Accra Food and Nutrition Security Study was designed to incorporate the participation of a variety of constituencies in the research, and to rely on a variety of approaches — both qualitative and quantitative — to data collection and analysis. This paper reviews the ...

  8. Quantitative angiography methods for bifurcation lesions

    DEFF Research Database (Denmark)

    Collet, Carlos; Onuma, Yoshinobu; Cavalcante, Rafael

    2017-01-01

    Bifurcation lesions represent one of the most challenging lesion subsets in interventional cardiology. The European Bifurcation Club (EBC) is an academic consortium whose goal has been to assess and recommend the appropriate strategies to manage bifurcation lesions. The quantitative coronary angiography (QCA) methods for the evaluation of bifurcation lesions have been subject to extensive research. Single-vessel QCA has been shown to be inaccurate for the assessment of bifurcation lesion dimensions. For this reason, dedicated bifurcation software has been developed and validated. These software...

  9. Instrumentation and quantitative methods of evaluation. Progress report, January 15-September 14, 1986

    International Nuclear Information System (INIS)

    Beck, R.N.

    1986-09-01

    This document reports progress under the grant entitled "Instrumentation and Quantitative Methods of Evaluation." Individual reports are presented on projects entitled the physical aspects of radionuclide imaging, image reconstruction and quantitative evaluation, PET-related instrumentation for improved quantitation, improvements in the FMI cyclotron for increased utilization, and methodology for quantitative evaluation of diagnostic performance

  10. Novel approach in quantitative analysis of shearography method

    International Nuclear Information System (INIS)

    Wan Saffiey Wan Abdullah

    2002-01-01

    The application of laser interferometry in industrial non-destructive testing and material characterization is becoming more prevalent, since this method provides non-contact, full-field inspection of the test object. However, its application has so far been limited to qualitative analysis; the current trend is to develop the method further by introducing quantitative analysis, which attempts to characterise the examined defect in detail. This is the design feature for the range of object sizes to be examined. The growing commercial demand for quantitative analysis in NDT and material characterization is driving the quality of optical and analysis instruments. However, very little attention is currently being paid to understanding, quantifying and compensating for the numerous error sources that are a function of the interferometer. This paper presents a comparison of measurement analysis using the established theoretical approach and a new approach that takes into account divergent illumination and other geometrical factors. The difference between the measurement systems could be attributed to these error factors. (Author)

  11. Quantitative methods for the analysis of electron microscope images

    DEFF Research Database (Denmark)

    Skands, Peter Ulrik Vallø

    1996-01-01

    The topic of this thesis is a general introduction to quantitative methods for the analysis of digital microscope images. The images presented have primarily been acquired from Scanning Electron Microscopes (SEM) and interferometer microscopes (IFM). The topic is approached through several examples... foundation of the thesis fall in the areas of: 1) Mathematical Morphology; 2) Distance transforms and applications; and 3) Fractal geometry. Image analysis opens in general the possibility of a quantitative and statistically well-founded measurement of digital microscope images. Herein lies also the conditions...

  12. Phase analysis in duplex stainless steel: comparison of EBSD and quantitative metallography methods

    International Nuclear Information System (INIS)

    Michalska, J; Chmiela, B

    2014-01-01

    The purpose of the research was to work out the qualitative and quantitative analysis of phases in DSS in the as-received state and after thermal aging. For qualitative purposes, SEM observations, EDS analyses and electron backscattered diffraction (EBSD) methods were employed. Quantitative analysis of phases was performed by two methods: EBSD and classical quantitative metallography. A juxtaposition of different etchants for revealing the microstructure and a brief review of sample preparation methods for EBSD studies are presented. Different ways of sample preparation were tested, and based on these results a detailed methodology of DSS phase analysis was developed, including surface finishing, selective etching methods and image acquisition. The advantages and disadvantages of the applied methods were pointed out, and the accuracy of the phase analysis performed by both methods was compared
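
    The quantity both methods ultimately estimate is the phase (area) fraction; on a segmented micrograph it reduces to counting labelled pixels, a digital analogue of classical point counting. The labels and image below are hypothetical:

```python
def phase_fraction(segmented):
    """Area fraction of each phase in a labelled (segmented) micrograph,
    the quantity classical quantitative metallography estimates by point
    counting (illustrative labels: 0 = ferrite, 1 = austenite)."""
    flat = [v for row in segmented for v in row]
    total = len(flat)
    labels = sorted(set(flat))
    return {lab: flat.count(lab) / total for lab in labels}

# Hypothetical 4x4 segmented duplex microstructure
img = [[0, 0, 1, 1],
       [0, 1, 1, 1],
       [0, 0, 0, 1],
       [0, 1, 0, 0]]
fractions = phase_fraction(img)
```

    The practical differences between EBSD and metallographic estimates then come down to how reliably the segmentation (etching and imaging, or diffraction indexing) assigns each point to a phase.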

  13. Embedding Quantitative Methods by Stealth in Political Science: Developing a Pedagogy for Psephology

    Science.gov (United States)

    Gunn, Andrew

    2017-01-01

    Student evaluations of quantitative methods courses in political science often reveal they are characterised by aversion, alienation and anxiety. As a solution to this problem, this paper describes a pedagogic research project with the aim of embedding quantitative methods by stealth into the first-year undergraduate curriculum. This paper…

  14. A global central banker competency model

    Directory of Open Access Journals (Sweden)

    David W. Brits

    2014-07-01

    Full Text Available Orientation: No comprehensive, integrated competency model exists for central bankers. Due to the importance of central banks in the context of the ongoing global financial crisis, it was deemed necessary to design and validate such a model. Research purpose: To craft and validate a comprehensive, integrated global central banker competency model (GCBCM and to assess whether central banks using the GCBCM for training have a higher global influence. Motivation for the study: Limited consensus exists globally about what constitutes a ‘competent’ central banker. A quantitatively validated GCBCM would make a significant contribution to enhancing central banker effectiveness, and also provide a solid foundation for effective people management. Research approach, design and method: A blended quantitative and qualitative research approach was taken. Two sets of hypotheses were tested regarding the relationships between the GCBCM and the training offered, using the model on the one hand, and a central bank’s global influence on the other. Main findings: The GCBCM was generally accepted across all participating central banks globally, although some differences were found between central banks with higher and lower global influence. The actual training offered by central banks in terms of the model, however, is generally limited to technical-functional skills. The GCBCM is therefore at present predominantly aspirational. Significant differences were found regarding the training offered. Practical/managerial implications: By adopting the GCBCM, central banks would be able to develop organisation-specific competency models in order to enhance their organisational capabilities and play their increasingly important global role more effectively. Contribution: A generic conceptual framework for the crafting of a competency model with evaluation criteria was developed. A GCBCM was quantitatively validated.

  15. Simple PVT quantitative method of Kr under high pure N2 condition

    International Nuclear Information System (INIS)

    Li Xuesong; Zhang Zibin; Wei Guanyi; Chen Liyun; Zhai Lihua

    2005-01-01

    A simple PVT quantitative method for Kr in high-purity N₂ was studied. The pressure, volume and temperature of the sample gas were measured by three individual methods to obtain the total sample amount with good uncertainty. The Kr/N₂ ratio could be measured with a GAM 400 quadrupole mass spectrometer, so the quantity of Kr could be calculated from the two measurements above. This method is suited to the quantitative analysis of other simply composed noble gas samples with a high-purity carrier gas. (authors)
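
    The quantitation amounts to the ideal-gas law combined with the mass-spectrometric ratio. A sketch with purely illustrative numbers (not the paper's data):

```python
R = 8.314  # gas constant, J/(mol*K)

def total_moles(p_pa, v_m3, t_k):
    """Total amount of gas from the ideal-gas law, n = PV/RT."""
    return p_pa * v_m3 / (R * t_k)

def kr_moles(p_pa, v_m3, t_k, kr_over_n2):
    """Kr amount: PVT gives the total moles; since Kr is a trace component,
    total moles ~ N2 moles, and the Kr/N2 ratio from the quadrupole mass
    spectrometer converts the total into the Kr quantity."""
    return total_moles(p_pa, v_m3, t_k) * kr_over_n2

# Hypothetical sample: 100 kPa in 1 L at 295 K, Kr/N2 ratio of 2e-6
n_kr = kr_moles(100e3, 1e-3, 295.0, 2e-6)
```

    Because P, V and T are each measured independently, their relative uncertainties combine directly into the uncertainty of the final Kr amount.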

  16. Increasing Literacy in Quantitative Methods: The Key to the Future of Canadian Psychology

    Science.gov (United States)

    Counsell, Alyssa; Cribbie, Robert A.; Harlow, Lisa. L.

    2016-01-01

    Quantitative methods (QM) dominate empirical research in psychology. Unfortunately most researchers in psychology receive inadequate training in QM. This creates a challenge for researchers who require advanced statistical methods to appropriately analyze their data. Many of the recent concerns about research quality, replicability, and reporting practices are directly tied to the problematic use of QM. As such, improving quantitative literacy in psychology is an important step towards eliminating these concerns. The current paper will include two main sections that discuss quantitative challenges and opportunities. The first section discusses training and resources for students and presents descriptive results on the number of quantitative courses required and available to graduate students in Canadian psychology departments. In the second section, we discuss ways of improving quantitative literacy for faculty, researchers, and clinicians. This includes a strong focus on the importance of collaboration. The paper concludes with practical recommendations for improving quantitative skills and literacy for students and researchers in Canada. PMID:28042199

  17. Increasing Literacy in Quantitative Methods: The Key to the Future of Canadian Psychology.

    Science.gov (United States)

    Counsell, Alyssa; Cribbie, Robert A; Harlow, Lisa L

    2016-08-01

    Quantitative methods (QM) dominate empirical research in psychology. Unfortunately most researchers in psychology receive inadequate training in QM. This creates a challenge for researchers who require advanced statistical methods to appropriately analyze their data. Many of the recent concerns about research quality, replicability, and reporting practices are directly tied to the problematic use of QM. As such, improving quantitative literacy in psychology is an important step towards eliminating these concerns. The current paper will include two main sections that discuss quantitative challenges and opportunities. The first section discusses training and resources for students and presents descriptive results on the number of quantitative courses required and available to graduate students in Canadian psychology departments. In the second section, we discuss ways of improving quantitative literacy for faculty, researchers, and clinicians. This includes a strong focus on the importance of collaboration. The paper concludes with practical recommendations for improving quantitative skills and literacy for students and researchers in Canada.

  18. Comparison of conventional, model-based quantitative planar, and quantitative SPECT image processing methods for organ activity estimation using In-111 agents

    International Nuclear Information System (INIS)

    He, Bin; Frey, Eric C

    2006-01-01

    Accurate quantification of organ radionuclide uptake is important for patient-specific dosimetry. The quantitative accuracy from conventional conjugate view methods is limited by overlap of projections from different organs and background activity, and attenuation and scatter. In this work, we propose and validate a quantitative planar (QPlanar) processing method based on maximum likelihood (ML) estimation of organ activities using 3D organ VOIs and a projector that models the image degrading effects. Both a physical phantom experiment and Monte Carlo simulation (MCS) studies were used to evaluate the new method. In these studies, the accuracies and precisions of organ activity estimates for the QPlanar method were compared with those from conventional planar (CPlanar) processing methods with various corrections for scatter, attenuation and organ overlap, and a quantitative SPECT (QSPECT) processing method. Experimental planar and SPECT projections and registered CT data from an RSD Torso phantom were obtained using a GE Millenium VH/Hawkeye system. The MCS data were obtained from the 3D NCAT phantom with organ activity distributions that modelled the uptake of In-111 ibritumomab tiuxetan. The simulations were performed using parameters appropriate for the same system used in the RSD torso phantom experiment. The organ activity estimates obtained from the CPlanar, QPlanar and QSPECT methods from both experiments were compared. From the results of the MCS experiment, even with ideal organ overlap correction and background subtraction, CPlanar methods provided limited quantitative accuracy. The QPlanar method with accurate modelling of the physical factors increased the quantitative accuracy at the cost of requiring estimates of the organ VOIs in 3D. The accuracy of QPlanar approached that of QSPECT, but required much less acquisition and computation time. Similar results were obtained from the physical phantom experiment. We conclude that the QPlanar method, based
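
    For reference, the conventional conjugate-view (geometric-mean) estimate that the CPlanar methods build on can be sketched as follows. The formula is the standard textbook one; the counts and coefficients are hypothetical:

```python
import math

def conjugate_view_activity(counts_ant, counts_post, mu, thickness_cm, cal):
    """Geometric-mean conjugate-view activity estimate:
    A = sqrt(I_ant * I_post) * exp(mu * T / 2) / cal,
    with mu the linear attenuation coefficient (1/cm), T the body
    thickness (cm), and cal the system sensitivity (counts per activity)."""
    geometric_mean = math.sqrt(counts_ant * counts_post)
    return geometric_mean * math.exp(mu * thickness_cm / 2) / cal

# Hypothetical anterior/posterior counts over an organ ROI
activity = conjugate_view_activity(1000.0, 800.0, 0.15, 20.0, 10.0)
```

    The QPlanar method replaces this analytic correction with ML estimation through a projector that models attenuation, scatter, and organ overlap directly.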

  19. QUALITATIVE AND QUANTITATIVE METHODS OF SUICIDE RESEARCH IN OLD AGE

    OpenAIRE

    Ojagbemi, A.

    2017-01-01

    This paper examines the merits of the qualitative and quantitative methods of suicide research in the elderly using two studies identified through a free search of the Pubmed database for articles that might have direct bearing on suicidality in the elderly. The studies have been purposively selected for critical appraisal because they meaningfully reflect the quantitative and qualitative divide as well as the social, economic, and cultural boundaries between the elderly living in sub-Saharan...

  20. Flows method in global analysis

    International Nuclear Information System (INIS)

    Duong Minh Duc.

    1994-12-01

    We study the gradient flows method for W^{r,p}(M,N), where M and N are Riemannian manifolds and r may be less than m/p. We localize some global analysis problems by constructing gradient flows which only change the value of any u in W^{r,p}(M,N) in a local chart of M. (author). 24 refs

  1. Risk prediction, safety analysis and quantitative probability methods - a caveat

    International Nuclear Information System (INIS)

    Critchley, O.H.

    1976-01-01

    Views are expressed on the use of quantitative techniques for the determination of value judgements in nuclear safety assessments, hazard evaluation, and risk prediction. Caution is urged when attempts are made to quantify value judgements in the field of nuclear safety. Criteria are given for the meaningful application of reliability methods, but doubts are expressed about their application to safety analysis, risk prediction and design guidance for experimental or prototype plant. Doubts are also expressed about some concomitant methods of population dose evaluation. The complexities of new designs of nuclear power plants make the problem of safety assessment more difficult, but some possible approaches are suggested as alternatives to the quantitative techniques criticized. (U.K.)

  2. Global/local methods research using a common structural analysis framework

    Science.gov (United States)

    Knight, Norman F., Jr.; Ransom, Jonathan B.; Griffin, O. H., Jr.; Thompson, Danniella M.

    1991-01-01

    Methodologies for global/local stress analysis are described including both two- and three-dimensional analysis methods. These methods are being developed within a common structural analysis framework. Representative structural analysis problems are presented to demonstrate the global/local methodologies being developed.

  3. Quantitative methods for management and economics

    CERN Document Server

    Chakravarty, Pulak

    2009-01-01

    "Quantitative Methods for Management and Economics" is specially prepared for MBA students in India and all over the world. It starts from the basics, so that even a beginner without much mathematical sophistication can grasp the ideas, and then moves on to more complex and professional problems. Thus, both ordinary students and "above average", i.e., bright and sincere, students benefit equally from this book. Since most of the problems are solved or hints are given, students can do well within the short duration of the semesters of their busy course.

  4. Study of the quantitative analysis approach of maintenance by the Monte Carlo simulation method

    International Nuclear Information System (INIS)

    Shimizu, Takashi

    2007-01-01

    This study examines the quantitative valuation of the maintenance activities of a nuclear power plant by the Monte Carlo simulation method. First, the concept of the quantitative valuation of maintenance, whose examination was advanced in the Japan Society of Maintenology and the International Institute of Universality (IUU), is arranged. A basic examination for the quantitative valuation of maintenance was then carried out on a simple feedwater system by the Monte Carlo simulation method. (author)
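The abstract gives no implementation details, but the kind of Monte Carlo valuation it describes can be sketched in a minimal, hypothetical form: estimating the unavailability of a single repairable component by sampling exponential failure and repair times. The failure rate, repair rate, and horizon below are illustrative assumptions, not values from the study.

```python
import random

def simulate_unavailability(failure_rate, repair_rate, horizon, n_runs=2000, seed=1):
    """Estimate the fraction of time a single repairable component is down,
    by Monte Carlo sampling of exponential failure and repair times."""
    rng = random.Random(seed)
    downtime = 0.0
    for _ in range(n_runs):
        t = 0.0
        while t < horizon:
            t += rng.expovariate(failure_rate)      # time to next failure
            if t >= horizon:
                break
            repair = rng.expovariate(repair_rate)   # repair duration
            downtime += min(repair, horizon - t)    # clip at end of horizon
            t += repair
    return downtime / (n_runs * horizon)

# Steady-state unavailability is lambda/(lambda+mu) ~ 0.0099 for these rates
est = simulate_unavailability(failure_rate=0.01, repair_rate=1.0, horizon=1000.0)
```

A real maintenance valuation would extend this with system structure (e.g. the feedwater train) and maintenance-policy parameters, but the sampling core is the same.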

  5. Modelling global container freight transport demand

    NARCIS (Netherlands)

    Tavasszy, L.A.; Ivanova, O.; Halim, R.A.

    2015-01-01

    The objective of this chapter is to discuss methods and techniques for a quantitative and descriptive analysis of future container transport demand at a global level. Information on future container transport flows is useful for various purposes. It is instrumental for the assessment of returns of

  6. Quantitative analysis of iodine in thyroidin. I. Methods of ''dry'' and ''wet'' mineralization

    International Nuclear Information System (INIS)

    Listov, S.A.; Arzamastsev, A.P.

    1986-01-01

    Comparative investigations on the quantitative determination of iodine in thyroidin using different modifications of ''dry'' and ''wet'' mineralization show that, in using these methods, the difficulties due to the characteristic features of the object of investigation itself and of the mineralization method as a whole must be taken into account. The studies show that the most suitable method for the analysis of thyroidin is ''dry'' mineralization with potassium carbonate. A procedure is proposed for the quantitative determination of iodine in thyroidin

  7. An unconventional method of quantitative microstructural analysis

    International Nuclear Information System (INIS)

    Rastani, M.

    1995-01-01

    The experiment described here introduces a simple methodology that could replace the time-consuming and expensive conventional methods of metallographic and quantitative analysis of the effect of thermal treatment on microstructure. The method is ideal for the microstructural evaluation of tungsten filaments and other wire samples, such as copper wire, which can be conveniently coiled. Ten such samples were heat treated by ohmic resistance at temperatures expected to span the recrystallization range. After treatment, the samples were evaluated in the elastic recovery test. The normalized elastic recovery factor was defined in terms of these deflections. Experimentally it has been shown that the elastic recovery factor depends on the degree of recrystallization; in other words, this factor is used to determine the fraction of unrecrystallized material. Because the elastic recovery method examines the whole filament rather than just one section through it, as in the metallographic method, it measures the degree of recrystallization more accurately. The method also requires considerably less time and cost than the conventional method

  8. Quantitative methods for analysing cumulative effects on fish migration success: a review.

    Science.gov (United States)

    Johnson, J E; Patterson, D A; Martins, E G; Cooke, S J; Hinch, S G

    2012-07-01

    It is often recognized, but seldom addressed, that a quantitative assessment of the cumulative effects, both additive and non-additive, of multiple stressors on fish survival would provide a more realistic representation of the factors that influence fish migration. This review presents a compilation of analytical methods applied to a well-studied fish migration, a more general review of quantitative multivariable methods, and a synthesis on how to apply new analytical techniques in fish migration studies. A compilation of adult migration papers from Fraser River sockeye salmon Oncorhynchus nerka revealed a limited number of multivariable methods being applied and the sub-optimal reliance on univariable methods for multivariable problems. The literature review of fisheries science, general biology and medicine identified a large number of alternative methods for dealing with cumulative effects, with a limited number of techniques being used in fish migration studies. An evaluation of the different methods revealed that certain classes of multivariable analyses will probably prove useful in future assessments of cumulative effects on fish migration. This overview and evaluation of quantitative methods gathered from the disparate fields should serve as a primer for anyone seeking to quantify cumulative effects on fish migration survival. © 2012 The Authors. Journal of Fish Biology © 2012 The Fisheries Society of the British Isles.

  9. DREAM: a method for semi-quantitative dermal exposure assessment

    NARCIS (Netherlands)

    Wendel de Joode, B. van; Brouwer, D.H.; Kromhout, H.; Hemmen, J.J. van

    2003-01-01

    This paper describes a new method (DREAM) for structured, semi-quantitative dermal exposure assessment for chemical or biological agents that can be used in occupational hygiene or epidemiology. It is anticipated that DREAM could serve as an initial assessment of dermal exposure, amongst others,

  10. Quantitative bioanalytical and analytical method development of dibenzazepine derivative, carbamazepine: A review

    Directory of Open Access Journals (Sweden)

    Prasanna A. Datar

    2015-08-01

    Full Text Available Bioanalytical methods are widely used for the quantitative estimation of drugs and their metabolites in physiological matrices. These methods can be applied to studies in the areas of human clinical pharmacology and toxicology. The major bioanalytical services are method development, method validation and sample analysis (method application). Various methods such as GC, LC–MS/MS, HPLC, HPTLC, micellar electrokinetic chromatography, and UFLC have been used in laboratories for the qualitative and quantitative analysis of carbamazepine in biological samples throughout all phases of clinical research and quality control. The article incorporates various reported methods developed to help analysts in choosing crucial parameters for new method development of carbamazepine and its derivatives, and also enumerates the metabolites and impurities reported so far. Keywords: Carbamazepine, HPLC, LC–MS/MS, HPTLC, RP-UFLC, Micellar electrokinetic chromatography

  11. A quantitative assessment method for the NPP operators' diagnosis of accidents

    International Nuclear Information System (INIS)

    Kim, M. C.; Seong, P. H.

    2003-01-01

    In this research, we developed a quantitative model for the operators' diagnosis of the accident situation when an accident occurs in a nuclear power plant. After identifying the occurrence probabilities of accidents, the unavailabilities of various information sources, and the causal relationships between accidents and information sources, a Bayesian network is used to analyse the change in the occurrence probabilities of accidents as the operators receive information on the status of the plant. The developed method is applied to a simple example case, and it turned out to be a systematic quantitative analysis method that can cope with the complex relationships between accidents and information sources and with various variables such as accident occurrence probabilities and the unavailabilities of various information sources
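The abstract does not specify the network structure. As an illustration only, the core update step can be sketched in naive-Bayes form (a special case of Bayesian network inference that assumes conditionally independent indicators); the accident types, priors, and likelihoods below are hypothetical numbers, not the paper's.

```python
def bayes_update(priors, likelihoods, evidence):
    """Posterior accident probabilities given observed plant indications,
    assuming indicators are conditionally independent given the accident."""
    posterior = {}
    for acc, p in priors.items():
        for ind in evidence:
            p *= likelihoods[acc].get(ind, 0.0)   # P(indicator | accident)
        posterior[acc] = p
    total = sum(posterior.values())
    return {acc: p / total for acc, p in posterior.items()}

# Hypothetical accident classes and indicator likelihoods
priors = {"LOCA": 0.001, "SGTR": 0.002, "normal": 0.997}
likelihoods = {
    "LOCA":   {"low_pressurizer_pressure": 0.95, "high_containment_radiation": 0.90},
    "SGTR":   {"low_pressurizer_pressure": 0.80, "high_containment_radiation": 0.05},
    "normal": {"low_pressurizer_pressure": 0.01, "high_containment_radiation": 0.001},
}
post = bayes_update(priors, likelihoods,
                    ["low_pressurizer_pressure", "high_containment_radiation"])
```

With both indications observed, the posterior mass shifts sharply toward the LOCA hypothesis even though its prior was tiny; a full Bayesian network would additionally model dependencies among indicators and information-source unavailability.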

  12. On the use of quantitative methods in the Danish food industry

    DEFF Research Database (Denmark)

    Juhl, Hans Jørn; Østergaard, Peder; Kristensen, Kai

    1997-01-01

    Executive summary 1. The paper examines the use of quantitative methods in the Danish food industry and a comparison is made between the food industry and other manufacturing industries. Data was collected in 1991 and 107 manufacturing companies filled in the questionnaire. 20 of the companies were...... orientation is expected to lead to a more intensive use of proactive methods. It will be obvious to compare results from the new investigation with the results presented in this report in order to identify any trends in the use of quantitative methods....... in this paper does not lead to any striking differences between food companies and other manufacturing companies. In both cases there is a heavy concentration on methods used to analyze internal processes. 4. The increasing focus on food products ready for consumption and the general increase in focus on market...

  13. Global methods for reinforced concrete slabs

    International Nuclear Information System (INIS)

    Hoffmann, A.; Lepareux, M.; Combescure, A.

    1985-08-01

    This paper develops the global method strategy to compute elastoplastic thin shells or beams. It is shown how this methodology can be applied to the case of reinforced concrete structures. Two cases of applications are presented: one static, the other dynamic. The numerical results are compared to experimental data

  14. Quantitation of valve regurgitation severity by three-dimensional vena contracta area is superior to flow convergence method of quantitation on transesophageal echocardiography.

    Science.gov (United States)

    Abudiab, Muaz M; Chao, Chieh-Ju; Liu, Shuang; Naqvi, Tasneem Z

    2017-07-01

    Quantitation of regurgitation severity using the proximal isovelocity surface area (PISA) method to calculate effective regurgitant orifice (ERO) area has limitations. Measurement of three-dimensional (3D) vena contracta area (VCA) accurately grades mitral regurgitation (MR) severity on transthoracic echocardiography (TTE). We evaluated 3D VCA quantitation of regurgitant jet severity using 3D transesophageal echocardiography (TEE) in 110 native mitral, aortic, and tricuspid valves and six prosthetic valves in patients with at least mild valvular regurgitation. The ASE-recommended integrative method comprising semiquantitative and quantitative assessment of valvular regurgitation was used as a reference method, including ERO area by 2D PISA for assigning severity of regurgitation grade. Mean age was 62.2±14.4 years; 3D VCA quantitation was feasible in 91% of regurgitant valves compared to 78% by the PISA method. When both methods were feasible and in the presence of a single regurgitant jet, 3D VCA and 2D PISA were similar in differentiating assigned severity (ANOVA P<.001). In valves with multiple jets, however, 3D VCA had a better correlation to assigned severity (ANOVA P<.0001). The agreement of 2D PISA and 3D VCA with the integrative method was 47% and 58% for moderate and 65% and 88% for severe regurgitation, respectively. Measurement of 3D VCA by TEE is superior to the 2D PISA method in determination of regurgitation severity in multiple native and prosthetic valves. © 2017, Wiley Periodicals, Inc.
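The PISA limitation discussed above concerns the standard hemispheric flow-convergence formula, in which regurgitant flow is 2πr²·Va and ERO is that flow divided by the peak regurgitant velocity. A minimal sketch of the calculation (with illustrative values, not patient data from the study):

```python
import math

def ero_pisa(radius_cm, aliasing_velocity_cm_s, peak_velocity_cm_s):
    """Effective regurgitant orifice area (cm^2) by the hemispheric PISA model:
    flow = 2*pi*r^2 * Va (cm^3/s); ERO = flow / peak regurgitant velocity."""
    flow = 2 * math.pi * radius_cm ** 2 * aliasing_velocity_cm_s
    return flow / peak_velocity_cm_s

# Illustrative numbers: r = 1.0 cm, Va = 40 cm/s, peak MR velocity = 500 cm/s
ero = ero_pisa(1.0, 40.0, 500.0)   # ~0.50 cm^2
```

The hemispheric assumption is exactly what breaks down for eccentric or multiple jets, which is why the 3D VCA measurement above performs better in those cases.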

  15. Reproducibility of CSF quantitative culture methods for estimating rate of clearance in cryptococcal meningitis.

    Science.gov (United States)

    Dyal, Jonathan; Akampurira, Andrew; Rhein, Joshua; Morawski, Bozena M; Kiggundu, Reuben; Nabeta, Henry W; Musubire, Abdu K; Bahr, Nathan C; Williams, Darlisha A; Bicanic, Tihana; Larsen, Robert A; Meya, David B; Boulware, David R

    2016-05-01

    Quantitative cerebrospinal fluid (CSF) cultures provide a measure of disease severity in cryptococcal meningitis. The fungal clearance rate by quantitative cultures has become a primary endpoint for phase II clinical trials. This study determined the inter-assay accuracy of three different quantitative culture methodologies. Among 91 participants with meningitis symptoms in Kampala, Uganda, during August-November 2013, 305 CSF samples were prospectively collected from patients at multiple time points during treatment. Samples were simultaneously cultured by three methods: (1) St. George's 100 mcl input volume of CSF with five 1:10 serial dilutions, (2) AIDS Clinical Trials Group (ACTG) method using 1000, 100, 10 mcl input volumes, and two 1:100 dilutions with 100 and 10 mcl input volume per dilution on seven agar plates; and (3) 10 mcl calibrated loop of undiluted and 1:100 diluted CSF (loop). Quantitative culture values did not statistically differ between the St. George and ACTG methods (P=.09) but did differ for the St. George and 10 mcl loop methods; correlation between methods was high (r≥0.88). For detecting sterility, the ACTG method had the highest negative predictive value of 97% (91% St. George, 60% loop), but the ACTG method had occasional (∼10%) difficulties in quantification due to colony clumping. For CSF clearance rate, the St. George and ACTG methods did not differ overall (mean -0.05 ± 0.07 log10 CFU/ml/day; P=.14) on a group level; however, individual-level clearance varied. The St. George and ACTG quantitative CSF culture methods produced comparable but not identical results. Quantitative cultures can inform treatment management strategies. © The Author 2016. Published by Oxford University Press on behalf of The International Society for Human and Animal Mycology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
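The fungal clearance rate used as the trial endpoint here is the slope of log10 CFU/ml against treatment day for serial cultures. A minimal least-squares sketch (the culture values below are made up for illustration):

```python
def clearance_rate(days, log10_cfu):
    """Least-squares slope of log10(CFU/ml) versus treatment day,
    in log10 CFU/ml per day (negative slope = clearance)."""
    n = len(days)
    mx = sum(days) / n
    my = sum(log10_cfu) / n
    num = sum((x - mx) * (y - my) for x, y in zip(days, log10_cfu))
    den = sum((x - mx) ** 2 for x in days)
    return num / den

# Hypothetical serial cultures on days 0, 3, 7, 14 of treatment
rate = clearance_rate([0, 3, 7, 14], [5.2, 4.1, 2.9, 0.8])   # about -0.31
```

Because the endpoint is a per-patient regression slope, small systematic offsets between culture methods cancel, which is consistent with the group-level agreement reported above despite individual-level variation.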

  16. Similar estimates of temperature impacts on global wheat yield by three independent methods

    DEFF Research Database (Denmark)

    Liu, Bing; Asseng, Senthold; Müller, Christoph

    2016-01-01

    The potential impact of global temperature change on global crop yield has recently been assessed with different methods. Here we show that grid-based and point-based simulations and statistical regressions (from historic records), without deliberate adaptation or CO2 fertilization effects, produce similar estimates of temperature impact on wheat yields at global and national scales. With a 1 °C global temperature increase, global wheat yield is projected to decline between 4.1% and 6.4%. Projected relative temperature impacts from different methods were similar for major wheat-producing countries... With the multi-method ensemble, it was possible to quantify ‘method uncertainty’ in addition to model uncertainty. This significantly improves confidence in estimates of climate impacts on global food security.

  17. An improved fast neutron radiography quantitative measurement method

    International Nuclear Information System (INIS)

    Matsubayashi, Masahito; Hibiki, Takashi; Mishima, Kaichiro; Yoshii, Koji; Okamoto, Koji

    2004-01-01

    The validity of a fast neutron radiography quantification method, the Σ-scaling method, which was originally proposed for thermal neutron radiography was examined with Monte Carlo calculations and experiments conducted at the YAYOI fast neutron source reactor. Water and copper were selected as comparative samples for a thermal neutron radiography case and a dense object, respectively. Although different characteristics on effective macroscopic cross-sections were implied by the simulation, the Σ-scaled experimental results with the fission neutron spectrum cross-sections were well fitted to the measurements for both the water and copper samples. This indicates that the Σ-scaling method could be successfully adopted for quantitative measurements in fast neutron radiography

  18. Global Optimization Ensemble Model for Classification Methods

    Science.gov (United States)

    Anwar, Hina; Qamar, Usman; Muzaffar Qureshi, Abdul Wahab

    2014-01-01

    Supervised learning is the process of data mining for deducing rules from training datasets. A broad array of supervised learning algorithms exists, every one of them with its own advantages and drawbacks. There are some basic issues that affect the accuracy of classifier while solving a supervised learning problem, like bias-variance tradeoff, dimensionality of input space, and noise in the input data space. All these problems affect the accuracy of classifier and are the reason that there is no global optimal method for classification. There is not any generalized improvement method that can increase the accuracy of any classifier while addressing all the problems stated above. This paper proposes a global optimization ensemble model for classification methods (GMC) that can improve the overall accuracy for supervised learning problems. The experimental results on various public datasets showed that the proposed model improved the accuracy of the classification models from 1% to 30% depending upon the algorithm complexity. PMID:24883382
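The GMC model itself is not reproduced in the abstract; as a baseline illustration of the ensemble idea it builds on, a plain majority-vote combiner over toy classifiers can be sketched (the threshold classifiers below are invented for the example):

```python
from collections import Counter

def majority_vote(classifiers, x):
    """Combine base classifiers by majority vote; ties resolved by count order."""
    votes = [clf(x) for clf in classifiers]
    return Counter(votes).most_common(1)[0][0]

# Three toy threshold classifiers on a 1-D input
clfs = [lambda x: int(x > 0.3),
        lambda x: int(x > 0.5),
        lambda x: int(x > 0.7)]
label = majority_vote(clfs, 0.6)   # two of the three vote 1
```

The point of an ensemble, per the abstract, is that combining imperfect base learners can offset their individual bias-variance weaknesses; GMC's contribution is how the combination is optimized, which this sketch does not attempt.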

  19. Global Optimization Ensemble Model for Classification Methods

    Directory of Open Access Journals (Sweden)

    Hina Anwar

    2014-01-01

    Full Text Available Supervised learning is the process of data mining for deducing rules from training datasets. A broad array of supervised learning algorithms exists, every one of them with its own advantages and drawbacks. There are some basic issues that affect the accuracy of classifier while solving a supervised learning problem, like bias-variance tradeoff, dimensionality of input space, and noise in the input data space. All these problems affect the accuracy of classifier and are the reason that there is no global optimal method for classification. There is not any generalized improvement method that can increase the accuracy of any classifier while addressing all the problems stated above. This paper proposes a global optimization ensemble model for classification methods (GMC) that can improve the overall accuracy for supervised learning problems. The experimental results on various public datasets showed that the proposed model improved the accuracy of the classification models from 1% to 30% depending upon the algorithm complexity.

  20. Comparison study on qualitative and quantitative risk assessment methods for urban natural gas pipeline network.

    Science.gov (United States)

    Han, Z Y; Weng, W G

    2011-05-15

    In this paper, a qualitative and a quantitative risk assessment methods for urban natural gas pipeline network are proposed. The qualitative method is comprised of an index system, which includes a causation index, an inherent risk index, a consequence index and their corresponding weights. The quantitative method consists of a probability assessment, a consequences analysis and a risk evaluation. The outcome of the qualitative method is a qualitative risk value, and for quantitative method the outcomes are individual risk and social risk. In comparison with previous research, the qualitative method proposed in this paper is particularly suitable for urban natural gas pipeline network, and the quantitative method takes different consequences of accidents into consideration, such as toxic gas diffusion, jet flame, fire ball combustion and UVCE. Two sample urban natural gas pipeline networks are used to demonstrate these two methods. It is indicated that both of the two methods can be applied to practical application, and the choice of the methods depends on the actual basic data of the gas pipelines and the precision requirements of risk assessment. Crown Copyright © 2011. Published by Elsevier B.V. All rights reserved.
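The individual-risk outcome of the quantitative method combines the annual frequency of each accident scenario with the probability of fatality it produces at a fixed location. A minimal sketch (the scenario numbers below are hypothetical, not the paper's data):

```python
def individual_risk(scenarios):
    """Individual risk at a fixed location: sum over accident scenarios of
    (annual frequency) * (probability of fatality at that location)."""
    return sum(freq * p_fatal for freq, p_fatal in scenarios)

# Hypothetical scenarios: (frequency per year, fatality probability)
ir = individual_risk([
    (1e-5, 0.5),   # jet flame
    (5e-6, 0.9),   # unconfined vapour cloud explosion (UVCE)
    (2e-5, 0.1),   # toxic gas diffusion
])
```

Societal risk is built the same way but aggregated over the exposed population into an F-N curve rather than a single location value.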

  1. Quantitative numerical method for analysing slip traces observed by AFM

    International Nuclear Information System (INIS)

    Veselý, J; Cieslar, M; Coupeau, C; Bonneville, J

    2013-01-01

    Atomic force microscopy (AFM) is used more and more routinely to study, at the nanometre scale, the slip traces produced on the surface of deformed crystalline materials. Taking full advantage of the quantitative height data of the slip traces, which can be extracted from these observations, requires however an adequate and robust processing of the images. In this paper an original method is presented, which allows the fitting of AFM scan-lines with a specific parameterized step function without any averaging treatment of the original data. This yields a quantitative and full description of the changes in step shape along the slip trace. The strength of the proposed method is established on several typical examples met in plasticity by analysing nano-scale structures formed on the sample surface by emerging dislocations. (paper)
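The specific parameterized step function used by the authors is not given in the abstract. As a simplified stand-in, fitting a single ideal step (split position plus two plateau levels) to one AFM scan-line by least squares can be sketched:

```python
def fit_step(y):
    """Fit y[i] = a for i < k, b for i >= k, minimizing squared error
    over all split positions k; returns (k, a, b)."""
    best = None
    for k in range(1, len(y)):
        left, right = y[:k], y[k:]
        a = sum(left) / len(left)
        b = sum(right) / len(right)
        err = (sum((v - a) ** 2 for v in left)
               + sum((v - b) ** 2 for v in right))
        if best is None or err < best[0]:
            best = (err, k, a, b)
    return best[1], best[2], best[3]

# Idealized scan-line with a 2 nm slip step at index 4
k, a, b = fit_step([0.0, 0.0, 0.0, 0.0, 2.0, 2.0, 2.0])
```

Repeating such a fit on every scan-line, rather than averaging lines together, is what yields the per-position step heights along the slip trace described above; the authors' actual step model additionally parameterizes the step shape.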

  2. An improved method for quantitatively measuring the sequences of total organic carbon and black carbon in marine sediment cores

    Science.gov (United States)

    Xu, Xiaoming; Zhu, Qing; Zhou, Qianzhi; Liu, Jinzhong; Yuan, Jianping; Wang, Jianghai

    2018-01-01

    Understanding the global carbon cycle is critical to uncovering the mechanisms of global warming and remediating its adverse effects on human activities. Organic carbon in marine sediments is an indispensable part of the global carbon reservoir in global carbon cycling. Evaluating such a reservoir calls for quantitative studies of marine carbon burial, which closely depend on quantifying total organic carbon and black carbon in marine sediment cores and subsequently on obtaining their high-resolution temporal sequences. However, the conventional methods for detecting the contents of total organic carbon or black carbon cannot resolve the following specific difficulties: (1) a very limited amount of each subsample versus the diverse analytical items, (2) a low and fluctuating recovery rate of total organic carbon or black carbon versus the reproducibility of carbon data, and (3) a large number of subsamples versus the rapid batch measurements. In this work, by (i) adopting customized disposable ceramic crucibles with micropore-controlled ability, (ii) developing self-made or customized facilities for the procedures of acidification and chemothermal oxidization, and (iii) optimizing procedures and the carbon-sulfur analyzer, we have built a novel Wang-Xu-Yuan method (the WXY method) for measuring the contents of total organic carbon or black carbon in marine sediment cores, which includes the procedures of pretreatment, weighing, acidification, chemothermal oxidation and quantification, and can fully meet the requirements of establishing their high-resolution temporal sequences, whether in recovery, experimental efficiency, accuracy and reliability of the measurements, or homogeneity of samples. In particular, the usage of disposable ceramic crucibles leads to an evidently simplified experimental scenario, which further results in very high recovery rates for total organic carbon and black carbon. This new technique may provide significant support for

  3. Quantitative Nuclear Medicine Imaging: Concepts, Requirements and Methods

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2014-01-15

    The absolute quantification of radionuclide distribution has been a goal since the early days of nuclear medicine. Nevertheless, the apparent complexity and sometimes limited accuracy of these methods have prevented them from being widely used in important applications such as targeted radionuclide therapy or kinetic analysis. The intricacy of the effects degrading nuclear medicine images and the lack of availability of adequate methods to compensate for these effects have frequently been seen as insurmountable obstacles in the use of quantitative nuclear medicine in clinical institutions. In the last few decades, several research groups have consistently devoted their efforts to the filling of these gaps. As a result, many efficient methods are now available that make quantification a clinical reality, provided appropriate compensation tools are used. Despite these efforts, many clinical institutions still lack the knowledge and tools to adequately measure and estimate the accumulated activities in the human body, thereby using potentially outdated protocols and procedures. The purpose of the present publication is to review the current state of the art of image quantification and to provide medical physicists and other related professionals facing quantification tasks with a solid background of tools and methods. It describes and analyses the physical effects that degrade image quality and affect the accuracy of quantification, and describes methods to compensate for them in planar, single photon emission computed tomography (SPECT) and positron emission tomography (PET) images. The fast paced development of the computational infrastructure, both hardware and software, has made drastic changes in the ways image quantification is now performed. The measuring equipment has evolved from the simple blind probes to planar and three dimensional imaging, supported by SPECT, PET and hybrid equipment. Methods of iterative reconstruction have been developed to allow for

  4. Training the next generation of global health advocates through experiential education: A mixed-methods case study evaluation.

    Science.gov (United States)

    Hoffman, Steven J; Silverberg, Sarah L

    2015-10-15

    This case study evaluates a global health education experience aimed at training the next generation of global health advocates. Demand and interest in global health among Canadian students are well documented, despite the difficulty of integrating meaningful experiences into curricula. Global health advocacy was taught to 19 undergraduate students at McMaster University through an experiential education course, during which they developed a national advocacy campaign on global access to medicines. A quantitative survey and an analysis of social network dynamics were conducted, along with a qualitative analysis of written work and course evaluations. Data were interpreted through a thematic synthesis approach. Themes were identified related to students' learning outcomes, experience and class dynamics. The experiential education format helped students gain authentic, real-world experience in global health advocacy and leadership. The tangible implications for their course work were a key motivating factor. While experiential education is an effective tool for some learning outcomes, it is not suitable for all. In addition, group dynamics and evaluation methods affect the learning environment. Real-world global health issues, public health practice and advocacy approaches can be effectively taught through experiential education, alongside skills like communication and professionalism. Students developed a nuanced understanding of many strategies, challenges and barriers that exist in advocating for public health ideas. These experiences are potentially empowering and confidence-building despite the heavy time commitment they require. Attention should be given to how such experiences are designed, as course dynamics and grading structure significantly influence students' experience.

  5. Criteria for quantitative and qualitative data integration: mixed-methods research methodology.

    Science.gov (United States)

    Lee, Seonah; Smith, Carrol A M

    2012-05-01

    Many studies have emphasized the need and importance of a mixed-methods approach for evaluation of clinical information systems. However, those studies had no criteria to guide integration of multiple data sets. Integrating different data sets serves to actualize the paradigm that a mixed-methods approach argues; thus, we require criteria that provide the right direction to integrate quantitative and qualitative data. The first author used a set of criteria organized from a literature search for integration of multiple data sets from mixed-methods research. The purpose of this article was to reorganize the identified criteria. Through critical appraisal of the reasons for designing mixed-methods research, three criteria resulted: validation, complementarity, and discrepancy. In applying the criteria to empirical data of a previous mixed methods study, integration of quantitative and qualitative data was achieved in a systematic manner. It helped us obtain a better organized understanding of the results. The criteria of this article offer the potential to produce insightful analyses of mixed-methods evaluations of health information systems.

  6. Quantitative Methods in Public Administration: their use and development through time

    NARCIS (Netherlands)

    Groeneveld, S.M.; Tummers, L.G.; Bronkhorst, B.A.C.; Ashikali, T.S.; van Thiel, S.

    2015-01-01

    This article aims to contribute to recent debates on research methods in public administration by examining the use of quantitative methods in public administration research. We analyzed 1,605 articles published between 2001-2010 in four leading journals: JPART, PAR, Governance and PA. Results show

  7. A general first-order global sensitivity analysis method

    International Nuclear Information System (INIS)

    Xu Chonggang; Gertner, George Zdzislaw

    2008-01-01

    Fourier amplitude sensitivity test (FAST) is one of the most popular global sensitivity analysis techniques. The main mechanism of FAST is to assign each parameter with a characteristic frequency through a search function. Then, for a specific parameter, the variance contribution can be singled out of the model output by the characteristic frequency. Although FAST has been widely applied, there are two limitations: (1) the aliasing effect among parameters by using integer characteristic frequencies and (2) the suitability for only models with independent parameters. In this paper, we synthesize the improvement to overcome the aliasing effect limitation [Tarantola S, Gatelli D, Mara TA. Random balance designs for the estimation of first order global sensitivity indices. Reliab Eng Syst Safety 2006; 91(6):717-27] and the improvement to overcome the independence limitation [Xu C, Gertner G. Extending a global sensitivity analysis technique to models with correlated parameters. Comput Stat Data Anal 2007, accepted for publication]. In this way, FAST can be a general first-order global sensitivity analysis method for linear/nonlinear models with as many correlated/uncorrelated parameters as the user specifies. We apply the general FAST to four test cases with correlated parameters. The results show that the sensitivity indices derived by the general FAST are in good agreement with the sensitivity indices derived by the correlation ratio method, which is a non-parametric method for models with correlated parameters
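The correlation ratio method mentioned as the benchmark estimates the first-order index S1 = Var(E[Y|Xi])/Var(Y) non-parametrically, e.g. by binning the parameter of interest. A minimal sketch on a toy additive model (bin count, sample size, and the model itself are arbitrary choices for illustration):

```python
import random

def first_order_index(xs, ys, n_bins=20):
    """Correlation-ratio estimate of S1 = Var(E[Y|X]) / Var(Y),
    using equal-width bins of X as the conditioning variable."""
    lo, hi = min(xs), max(xs)
    width = (hi - lo) / n_bins or 1.0
    bins = {}
    for x, y in zip(xs, ys):
        b = min(int((x - lo) / width), n_bins - 1)
        bins.setdefault(b, []).append(y)
    n = len(ys)
    my = sum(ys) / n
    var_y = sum((y - my) ** 2 for y in ys) / n
    # variance of within-bin conditional means, weighted by bin size
    var_cond = sum(len(v) * (sum(v) / len(v) - my) ** 2
                   for v in bins.values()) / n
    return var_cond / var_y

rng = random.Random(0)
x1 = [rng.uniform(0, 1) for _ in range(20000)]
x2 = [rng.uniform(0, 1) for _ in range(20000)]
y = [a + 0.5 * b for a, b in zip(x1, x2)]   # x1's variance share = 0.8
s1 = first_order_index(x1, y)
```

For this additive model the analytic index is Var(X1)/(Var(X1) + 0.25·Var(X2)) = 0.8, which the binned estimate approaches; FAST instead extracts the same quantity spectrally via characteristic frequencies, and the generalization above extends it to correlated parameters.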

  8. Quantitative impact characterization of aeronautical CFRP materials with non-destructive testing methods

    Energy Technology Data Exchange (ETDEWEB)

    Kiefel, Denis, E-mail: Denis.Kiefel@airbus.com; Stoessel, Rainer, E-mail: Rainer.Stoessel@airbus.com [Airbus Group Innovations, Munich (Germany)]; Grosse, Christian, E-mail: Grosse@tum.de [Technical University Munich (Germany)]

    2015-03-31

    In recent years, an increasing number of safety-relevant structures have been designed and manufactured from carbon fiber reinforced polymers (CFRP) in order to reduce the weight of airplanes by taking advantage of their high specific strength. Non-destructive testing (NDT) methods for quantitative defect analysis of damages include liquid- or air-coupled ultrasonic testing (UT), phased-array ultrasonic techniques, and active thermography (IR). The advantage of these testing methods is their applicability to large areas. However, their quantitative information is often limited to impact localization and size. In addition to these techniques, Airbus Group Innovations operates a micro x-ray computed tomography (μ-XCT) system, which was developed for CFRP characterization. It is an open system which allows different kinds of acquisition, reconstruction, and data evaluation. One main advantage of this μ-XCT system is its high resolution together with 3-dimensional analysis and visualization capabilities, which makes it possible to gain important quantitative information for composite part design and stress analysis. Within this study, different NDT methods are compared on CFRP samples with specified artificial impact damages. The results can be used to select the most suitable NDT method for specific application cases. Furthermore, novel evaluation and visualization methods for impact analyses are developed and presented.

  9. Quantitative impact characterization of aeronautical CFRP materials with non-destructive testing methods

    International Nuclear Information System (INIS)

    Kiefel, Denis; Stoessel, Rainer; Grosse, Christian

    2015-01-01

    In recent years, an increasing number of safety-relevant structures have been designed and manufactured from carbon fiber reinforced polymers (CFRP) in order to reduce the weight of airplanes by taking advantage of their high specific strength. Non-destructive testing (NDT) methods for quantitative defect analysis of damages include liquid- or air-coupled ultrasonic testing (UT), phased-array ultrasonic techniques, and active thermography (IR). The advantage of these testing methods is their applicability to large areas. However, their quantitative information is often limited to impact localization and size. In addition to these techniques, Airbus Group Innovations operates a micro x-ray computed tomography (μ-XCT) system, which was developed for CFRP characterization. It is an open system which allows different kinds of acquisition, reconstruction, and data evaluation. One main advantage of this μ-XCT system is its high resolution together with 3-dimensional analysis and visualization capabilities, which makes it possible to gain important quantitative information for composite part design and stress analysis. Within this study, different NDT methods are compared on CFRP samples with specified artificial impact damages. The results can be used to select the most suitable NDT method for specific application cases. Furthermore, novel evaluation and visualization methods for impact analyses are developed and presented.

  10. THE METHOD OF GLOBAL READING FROM AN INTERDISCIPLINARY PERSPECTIVE

    Directory of Open Access Journals (Sweden)

    Jasmina Delcheva Dizdarevikj

    2018-04-01

    Primary literacy in Macedonian education is in decline. This assertion is supported both by abstract theory and by concrete empirical data. Educational reforms in the national curriculum are under way, and the implementation of the method of global reading is one of the main innovations. Misunderstanding of this method has led to its being criticized as a foreign import and as unnatural and incongruous with the specific features of the Macedonian language. We think that this argument is wrong. That is why this paper explicates the method of global reading and its basis in pedagogy, philosophy, psychology, anthropology and linguistics. The main premise of this paper is the relation of the part to the whole, understood from the different perspectives of philosophy, psychology, linguistics and anthropology. The theories of Kant, Cassirer, Bruner, Benveniste and Geertz are considered in the context of the part-whole problem, by themselves, and also in their relation to the method of global reading.

  11. A quantitative method to measure and evaluate the peelability of shrimps (Pandalus borealis)

    DEFF Research Database (Denmark)

    Gringer, Nina; Dang, Tem Thi; Orlien, Vibeke

    2018-01-01

    A novel, standardized method has been developed to provide a quantitative description of shrimp peelability. The peeling process was based on measuring the strength of the shell-muscle attachment of the shrimp using a texture analyzer, which was then converted into the peeling work. The self-consistent method, insensitive to shrimp size, was proven valid for assessment of ice maturation of shrimps. The quantitative peeling efficiency (peeling work) and performance (degree of shell removal) showed that the decrease in peeling work correlated with the amount of satisfactorily peeled shrimps, indicating an effective weakening of the shell-muscle attachment. The developed method provides the industry with a quantitative analysis for measurement of peeling efficiency and peeling performance of shrimps. It may be used for comparing different maturation conditions in relation to optimization of shrimp peeling.
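
    The "peeling work" measure can be pictured as the area under the texture analyzer's force-displacement curve; a minimal sketch under that assumption (the paper's exact formula is not given in the abstract):

```python
import numpy as np

def peeling_work(force_n, displacement_mm):
    """Peeling work (N*mm, i.e. mJ) as the area under the recorded
    force-displacement curve; treating 'peeling work' as this integral
    is an illustrative assumption, not the paper's stated formula."""
    f = np.asarray(force_n, float)
    d = np.asarray(displacement_mm, float)
    # trapezoidal integration of force over displacement
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(d)))

# a constant 2 N pull sustained over a 10 mm peel
work = peeling_work([2.0, 2.0, 2.0], [0.0, 5.0, 10.0])   # 20.0 mJ
```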

  12. Nailfold capillaroscopic report: qualitative and quantitative methods

    Directory of Open Access Journals (Sweden)

    S. Zeni

    2011-09-01

    Nailfold capillaroscopy (NVC) is a simple and non-invasive method used for the assessment of patients with Raynaud’s phenomenon (RP) and in the differential diagnosis of various connective tissue diseases. The scleroderma pattern abnormalities (giant capillaries, haemorrhages and/or avascular areas) have a positive predictive value for the development of scleroderma spectrum disorders. Thus, an analytical approach to nailfold capillaroscopy can be useful for quantitatively and reproducibly recording various parameters. We developed a new method to assess patients with RP that is capable of predicting the 5-year transition from isolated RP to RP secondary to scleroderma spectrum disorders. This model is a weighted combination of different capillaroscopic parameters (giant capillaries, microhaemorrhages, number of capillaries) that allows physicians to stratify RP patients easily using a relatively simple diagram to deduce prognosis.

  13. Quantitative Efficiency Evaluation Method for Transportation Networks

    Directory of Open Access Journals (Sweden)

    Jin Qin

    2014-11-01

    An effective evaluation of transportation network efficiency/performance is essential to the establishment of sustainable development in any transportation system. Based on a redefinition of transportation network efficiency, a quantitative efficiency evaluation method for transportation networks is proposed, which can reflect the effects of network structure, traffic demands, travel choice, and travel costs on network efficiency. Furthermore, an efficiency-oriented importance measure for network components is presented, which can be used to help engineers identify the critical nodes and links in the network. The numerical examples show that, compared with existing efficiency evaluation methods, the network efficiency value calculated by the proposed method can portray the real operating situation of the transportation network as well as the effects of the main factors on network efficiency. We also find that the network efficiency and the importance values of the network components are both functions of demands and network structure in the transportation network.

  14. Mixed methods in gerontological research: Do the qualitative and quantitative data “touch”?

    Science.gov (United States)

    Happ, Mary Beth

    2010-01-01

    This paper distinguishes between parallel and integrated mixed methods research approaches. Barriers to integrated mixed methods approaches in gerontological research are discussed and critiqued. The author presents examples of mixed methods gerontological research to illustrate approaches to data integration at the levels of data analysis, interpretation, and research reporting. As a summary of the methodological literature, four basic levels of mixed methods data combination are proposed. Opportunities for mixing qualitative and quantitative data are explored using contemporary examples from published studies. Data transformation and visual display, judiciously applied, are proposed as pathways to fuller mixed methods data integration and analysis. Finally, practical strategies for mixing qualitative and quantitative data types are explicated as gerontological research moves beyond parallel mixed methods approaches to achieve data integration. PMID:20077973

  15. A novel method for quantitative geosteering using azimuthal gamma-ray logging

    International Nuclear Information System (INIS)

    Yuan, Chao; Zhou, Cancan; Zhang, Feng; Hu, Song; Li, Chaoliu

    2015-01-01

    A novel method for quantitative geosteering using azimuthal gamma-ray logging is proposed. Real-time upper and lower gamma-ray logs recorded as a logging tool travels through a boundary surface at different relative dip angles are simulated with the Monte Carlo method. The results show that the response points of the upper and lower gamma-ray logs as the logging tool moves towards a highly radioactive formation can be used to predict the relative dip angle, from which the distance from the drill bit to the boundary surface is calculated. - Highlights: • A new method is proposed for geosteering using azimuthal gamma-ray logging. • The new method can quantitatively determine the distance from the drill bit to the boundary surface, whereas the traditional geosteering method can only qualitatively guide the drill bit in reservoirs. • The response points of the real-time upper and lower gamma-ray logs as the logging tool approaches a highly radioactive formation are used to predict the relative dip angle, from which the distance from the drill bit to the boundary surface is calculated.
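
    The geometry behind this idea can be illustrated with a simplified planar-boundary model: a back-of-envelope sketch, not the authors' Monte Carlo workflow, and the function names and small-tool geometry are our assumptions.

```python
import math

def relative_dip(tool_diameter, lag_md):
    """Relative dip angle (radians) between the borehole axis and an
    assumed planar boundary, inferred from the along-hole lag between
    the points where the up- and bottom-facing gamma responses cross
    the boundary."""
    return math.atan2(tool_diameter, lag_md)

def distance_to_boundary(dip, along_hole):
    """Perpendicular distance from the bit to the boundary, given the
    remaining along-hole distance to the projected crossing point."""
    return along_hole * math.sin(dip)

dip = relative_dip(0.2, 0.2)             # equal lag -> 45 degrees
d = distance_to_boundary(dip, 10.0)      # about 7.07 m
```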

  16. Model of global evaluation for energetic resources; Modelo de avaliacao global de recursos energeticos

    Energy Technology Data Exchange (ETDEWEB)

    Fujii, Ricardo Junqueira; Udaeta, Miguel Edgar Morales; Galvao, Luiz Claudio Ribeiro [Universidade de Sao Paulo (USP), SP (Brazil). Dept. de Energia e Automacao Eletricas. Grupo de Energia]. E-mail: ricardo_fujii@pea.usp.br; daeta@pea.usp.br; lcgalvao@pea.usp.br

    2006-07-01

    Traditional energy planning usually takes into account technical and economic costs, considered alongside environmental and a few political constraints; however, there is a lack of methods to evenly assess environmental, economic, social and political costs. This work seeks to change that scenario by elaborating a model to characterize an energy resource along all four dimensions - environmental, political, social and economic - in an integrated view. The model has two objectives: to provide a method to assess the global cost of the energy resource and to estimate its potential considering the limitations imposed by these dimensions. To minimize the complexity of the integration process, the model strongly recommends the use of the Full Cost Accounting (FCA) method to assess the costs and benefits of any given resource. FCA allows considering quantitative and qualitative costs, reducing the need for quantitative data, which are limited in some cases. The model has been applied to the characterization of the region of Aracatuba, located in the western part of the state of Sao Paulo, Brazil. The results showed that the potential of renewable sources is promising, especially when global costs are considered. Some resources, in spite of being economically attractive, do not provide an acceptable global cost. It became clear that the model is a valuable tool when conventional tools fail to address many issues, especially the need for an integrated view of the planning process; the results from this model can be applied in a portfolio selection method to evaluate the best options for power system expansion. Note that the usefulness of this model can be increased when it is adopted together with a method to analyze demand-side management measures, thus offering a complete set of possible energy options for the decision maker. (author)

  17. Applying quantitative benefit-risk analysis to aid regulatory decision making in diagnostic imaging: methods, challenges, and opportunities.

    Science.gov (United States)

    Agapova, Maria; Devine, Emily Beth; Bresnahan, Brian W; Higashi, Mitchell K; Garrison, Louis P

    2014-09-01

    Health agencies making regulatory marketing-authorization decisions use qualitative and quantitative approaches to assess expected benefits and expected risks associated with medical interventions. There is, however, no universal standard approach that regulatory agencies consistently use to conduct benefit-risk assessment (BRA) for pharmaceuticals or medical devices, including for imaging technologies. Economics, health services research, and health outcomes research use quantitative approaches to elicit preferences of stakeholders, identify priorities, and model health conditions and health intervention effects. Challenges to BRA in medical devices are outlined, highlighting additional barriers in radiology. Three quantitative methods (multi-criteria decision analysis, health outcomes modeling, and stated-choice surveys) are assessed using criteria that are important in balancing benefits and risks of medical devices and imaging technologies. To be useful in regulatory BRA, quantitative methods need to aggregate multiple benefits and risks, incorporate qualitative considerations, account for uncertainty, and make clear whose preferences/priorities are being used. Each quantitative method performs differently across these criteria, and little is known about how BRA estimates and conclusions vary by approach. While no specific quantitative method is likely to be the strongest in all of the important areas, quantitative methods may have a place in BRA of medical devices and radiology. Quantitative BRA approaches have been more widely applied to medicines, with fewer BRAs in devices. Despite substantial differences in the characteristics of pharmaceuticals and devices, BRA methods may be as applicable to medical devices and imaging technologies as they are to pharmaceuticals. Further research to guide the development and selection of quantitative BRA methods for medical devices and imaging technologies is needed. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.
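
    Of the three quantitative methods named, multi-criteria decision analysis is the easiest to illustrate. In its simplest weighted-sum form (a generic sketch with hypothetical criteria and numbers, not the assessment performed in the abstract):

```python
def mcda_score(values, weights):
    """Weighted-sum score, the simplest MCDA form: criterion values
    normalized to [0, 1] (higher is better), weights summing to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(v * w for v, w in zip(values, weights))

# hypothetical devices scored on (benefit, safety, uncertainty handling)
weights = [0.5, 0.3, 0.2]
device_a = mcda_score([0.8, 0.6, 0.7], weights)   # 0.72
device_b = mcda_score([0.6, 0.9, 0.5], weights)   # 0.67
```

The weights make explicit whose preferences are being used, which is one of the BRA criteria the abstract highlights.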

  18. Report of the methods for quantitative organ evaluation in nuclear medicine

    International Nuclear Information System (INIS)

    Nakata, Shigeru; Akagi, Naoki; Mimura, Hiroaki; Nagaki, Akio; Takahashi, Yasuyuki

    1999-01-01

    The working group on the methods named in the title reports a summary of its survey of the literature on quantitative evaluation of the brain, heart, liver and kidney. For each organ, the report covers the history, the kinetics of the agents, the methods for quantitative evaluation, and a summary. For the brain, quantitative evaluation of cerebral blood flow scintigraphy with 123I-IMP and 99mTc-HMPAO or -ECD was reviewed, concluding that the present convenient methods have problems with precision, for which a novel method and/or tracer should be developed. For cardiac function, there are methods based either on the behavior of the tracer in the blood, which are excellent in reproducibility, or on the morphology of the cardiac wall, whose images can alternatively be analyzed by CT and MRI. For these, 131I-albumin, 99mTc-albumin, -red blood cells, -MIBI and -tetrofosmin have been used. For the myocardium, 201Tl has been used to evaluate the ischemic region and, with simultaneous use of 99mTc-MIBI or -tetrofosmin, viability. 123I-BMIPP and -MIBG have been developed for myocardial fatty acid metabolism and for cardiac sympathetic nerve function, respectively. Liver function has been evaluated by the blood elimination rate, hepatic uptake, hepatic elimination and hepatic blood flow with the use of 99mTc-labeled colloids, -PMT and -GSA. Quantitative evaluation of renal function is now well established with high precision, since the kinetic behavior of the tracers, such as 99mTc-DTPA, -MAG3, -DMSA and 131I-OIH, is simple. (K.H.)

  19. Getting agile methods to work for Cordys global software product development

    NARCIS (Netherlands)

    van Hillegersberg, Jos; Ligtenberg, Gerwin; Aydin, M.N.; Kotlarsky, J.; Willcocks, L.P.; Oshri, I.

    2011-01-01

    Getting agile methods to work in global software development is a potentially rewarding but challenging task. Agile methods are relatively young and still maturing. The application to globally distributed projects is in its early stages. Various guidelines on how to apply and sometimes adapt agile

  20. A quantitative method for measuring the quality of history matches

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, T.S. [Kerr-McGee Corp., Oklahoma City, OK (United States); Knapp, R.M. [Univ. of Oklahoma, Norman, OK (United States)

    1997-08-01

    History matching can be an efficient tool for reservoir characterization. A "good" history matching job can generate reliable reservoir parameters. However, reservoir engineers are often frustrated when they try to select a "better" match from a series of history matching runs. Without a quantitative measurement, it is always difficult to tell the difference between a "good" and a "better" match. For this reason, we need a quantitative method for testing the quality of matches. This paper presents a method for such a purpose. The method uses three statistical indices to (1) test shape conformity, (2) examine bias errors, and (3) measure magnitude of deviation. The shape conformity test ensures that the shape of a simulated curve matches that of a historical curve. Examining bias errors assures that model reservoir parameters have been calibrated to those of a real reservoir. Measuring the magnitude of deviation assures that the difference between the model and the real reservoir parameters is minimized. The method was first tested on a hypothetical model and then applied to published field studies. The results showed that the method can efficiently measure the quality of matches. It also showed that the method can serve as a diagnostic tool for calibrating reservoir parameters during history matching.
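
    The three roles of the indices can be sketched with common statistical stand-ins; the paper's exact statistics are not reproduced here, so the choices below (Pearson correlation for shape, mean signed error for bias, RMSE for magnitude) are illustrative assumptions:

```python
import numpy as np

def match_quality(history, simulated):
    """Hypothetical stand-ins for the three indices named in the
    abstract: shape conformity ~ Pearson correlation, bias ~ mean
    signed error, magnitude of deviation ~ root-mean-square error."""
    h = np.asarray(history, float)
    s = np.asarray(simulated, float)
    shape = float(np.corrcoef(h, s)[0, 1])   # 1.0 = identical shape
    bias = float(np.mean(s - h))             # 0.0 = no systematic offset
    rmse = float(np.sqrt(np.mean((s - h) ** 2)))
    return shape, bias, rmse

# a simulated curve that tracks the history perfectly but sits 0.1 too high
shape, bias, rmse = match_quality([1.0, 2.0, 3.0, 4.0],
                                  [1.1, 2.1, 3.1, 4.1])
```

This example shows why a single number is not enough: the shape index is perfect while the bias index exposes the systematic offset.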

  1. Microchromatography of hemoglobins. VIII. A general qualitative and quantitative method in plastic drinking straws and the quantitative analysis of Hb-F.

    Science.gov (United States)

    Schroeder, W A; Pace, L A

    1978-03-01

    The microchromatographic procedure for the quantitative analysis of the hemoglobin components in a hemolysate uses columns of DEAE-cellulose in a plastic drinking straw with a glycine-KCN-NaCl developer. Not only may the method be used for the quantitative analysis of Hb-F but also for the analysis of the varied components in mixtures of hemoglobins.

  2. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods.

    Science.gov (United States)

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-04-07

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest.
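
    The ranking step can be sketched directly from the abstract's definition: under the linear no-gold-standard model, each method's estimates relate to the unknown truth through a slope and an additive noise term, and a lower noise-to-slope ratio means better precision. The slope/noise numbers below are invented for illustration:

```python
def noise_to_slope_ratio(slope, noise_sd):
    """NSR figure of merit under the linear no-gold-standard model
    measured = a + b * true + noise: lower NSR means higher precision."""
    return noise_sd / abs(slope)

# rank three hypothetical reconstruction methods (slope b, noise sd)
methods = {"A": (0.95, 2.0), "B": (1.10, 1.5), "C": (0.80, 2.4)}
ranking = sorted(methods, key=lambda m: noise_to_slope_ratio(*methods[m]))
```

Here method B ranks first: its noise is smallest relative to how strongly its output tracks the true value.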

  3. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods

    International Nuclear Information System (INIS)

    Jha, Abhinav K; Frey, Eric C; Caffo, Brian

    2016-01-01

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest.

  4. Visual and Quantitative Analysis Methods of Respiratory Patterns for Respiratory Gated PET/CT.

    Science.gov (United States)

    Son, Hye Joo; Jeong, Young Jin; Yoon, Hyun Jin; Park, Jong-Hwan; Kang, Do-Young

    2016-01-01

    We integrated visual and quantitative methods for analyzing the stability of respiration using four methods: phase space diagrams, Fourier spectra, Poincaré maps, and Lyapunov exponents. Respiratory patterns of 139 patients were grouped based on the combination of the regularity of amplitude, period, and baseline positions. Visual grading was done by inspecting the shape of the diagram and classified into two states: regular and irregular. Quantitation was done by measuring the standard deviations of the x and v coordinates of the Poincaré map (SDx, SDv), by measuring the height of the fundamental peak (A1) in the Fourier spectrum, or by calculating the difference between maximal upward and downward drift (MUD-MDD). Each group showed a characteristic pattern on visual analysis. There were differences in the quantitative parameters (SDx, SDv, A1, and MUD-MDD) among the four groups (one-way ANOVA, p = 0.0001 for MUD-MDD, SDx, and SDv; p = 0.0002 for A1). In ROC analysis, the cutoff value was 0.11 for SDx (AUC: 0.982, p ...) ... quantitative indices of respiratory stability and determining a quantitative cutoff value for differentiating regular and irregular respiration.
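
    The quantitative indices can be sketched from a respiratory trace; the exact definitions of SDx, SDv and A1 are assumed rather than taken verbatim from the paper, with the Poincaré-style map pairing position x(t) with velocity v(t):

```python
import numpy as np

def respiration_stability(signal, fs):
    """Illustrative stability indices: SDx/SDv as the spread of the
    position-velocity (Poincare-style) map, A1 as the height of the
    dominant spectral peak of the trace."""
    x = np.asarray(signal, float)
    v = np.gradient(x) * fs                    # velocity coordinate
    sd_x, sd_v = float(np.std(x)), float(np.std(v))
    spec = np.abs(np.fft.rfft(x - x.mean())) / len(x)
    a1 = float(spec[1:].max())                 # fundamental-peak height
    return sd_x, sd_v, a1

# 0.25 Hz sinusoidal "breathing" sampled at 10 Hz for 100 s
t = np.arange(1000) / 10.0
sd_x, sd_v, a1 = respiration_stability(np.sin(2 * np.pi * 0.25 * t), 10.0)
```

For a perfectly regular trace like this sine wave, the spectral power concentrates in a single tall fundamental peak; irregular breathing spreads it across frequencies and inflates the Poincaré spreads.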

  5. A Framework for Mixing Methods in Quantitative Measurement Development, Validation, and Revision: A Case Study

    Science.gov (United States)

    Luyt, Russell

    2012-01-01

    A framework for quantitative measurement development, validation, and revision that incorporates both qualitative and quantitative methods is introduced. It extends and adapts Adcock and Collier's work, and thus, facilitates understanding of quantitative measurement development, validation, and revision as an integrated and cyclical set of…

  6. Quantitative methods in electroencephalography to access therapeutic response.

    Science.gov (United States)

    Diniz, Roseane Costa; Fontenele, Andrea Martins Melo; Carmo, Luiza Helena Araújo do; Ribeiro, Aurea Celeste da Costa; Sales, Fábio Henrique Silva; Monteiro, Sally Cristina Moutinho; Sousa, Ana Karoline Ferreira de Castro

    2016-07-01

    Pharmacometrics, or Quantitative Pharmacology, aims to quantitatively analyze the interaction between drugs and patients, resting on the tripod of pharmacokinetics, pharmacodynamics and disease monitoring to identify variability in drug response. As a subject of central interest in the training of pharmacists, this work was carried out with a view to promoting this idea through methods to assess the therapeutic response of drugs with central action. This paper discusses quantitative methods (Fast Fourier Transform, Magnitude Squared Coherence, Conditional Entropy, Generalised Linear semi-canonical Correlation Analysis, Statistical Parametric Network and Mutual Information Function) used to evaluate EEG signals obtained after drug administration regimens, together with the main findings and their clinical relevance, as a contribution to the construction of a different pharmaceutical practice. Anderer et al. in 2000 showed the effect of 20 mg of buspirone in 20 healthy subjects at 1, 2, 4, 6 and 8 h after oral ingestion of the drug. The areas of increased power in the theta frequency occurred mainly in the temporo-occipito-parietal region. Sampaio et al. in 2007 showed that the use of bromazepam, which allows the release of GABA (gamma-aminobutyric acid), an inhibitory neurotransmitter of the central nervous system, could theoretically promote dissociation of cortical functional areas, a decrease of functional connectivity, and a decrease of cognitive functions, by means of smaller coherence values (an electrophysiological magnitude measured from the EEG by software). Khodayari-Rostamabad et al. in 2015 suggested that such a measure could potentially be a useful clinical tool to assess adverse effects of opioids and hence give rise to treatment guidelines. There was a relation between changes in pain intensity and brain sources (at maximum activity locations) during remifentanil infusion, despite its potent analgesic effect. The statement of mathematical and computational
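
    The simplest of the listed quantities, an FFT-based band-power measure like the theta-power changes reported for buspirone, can be sketched as follows; this is a minimal generic example, not a method from the cited studies, and the band edges are conventional assumptions:

```python
import numpy as np

def band_power_fraction(eeg, fs, band=(4.0, 8.0)):
    """Fraction of EEG spectral power inside a frequency band
    (theta, 4-8 Hz, by default), computed from the FFT power
    spectrum of the detrended signal."""
    x = np.asarray(eeg, float)
    x = x - x.mean()
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    power = np.abs(np.fft.rfft(x)) ** 2
    in_band = (freqs >= band[0]) & (freqs < band[1])
    return float(power[in_band].sum() / power[1:].sum())

# a pure 6 Hz oscillation lies entirely inside the theta band
t = np.arange(1000) / 100.0
theta_frac = band_power_fraction(np.sin(2 * np.pi * 6.0 * t), fs=100.0)
```

Tracking this fraction per electrode region, before and after dosing, is the kind of quantitative endpoint the reviewed studies report.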

  7. MR Imaging-based Semi-quantitative Methods for Knee Osteoarthritis

    Science.gov (United States)

    JARRAYA, Mohamed; HAYASHI, Daichi; ROEMER, Frank Wolfgang; GUERMAZI, Ali

    2016-01-01

    Magnetic resonance imaging (MRI)-based semi-quantitative (SQ) methods applied to knee osteoarthritis (OA) have been introduced during the last decade and have fundamentally changed our understanding of knee OA pathology since then. Several epidemiological studies and clinical trials have used MRI-based SQ methods to evaluate different outcome measures. Interest in MRI-based SQ scoring systems has led to their continuous update and refinement. This article reviews the different SQ approaches for MRI-based whole-organ assessment of knee OA and also discusses practical aspects of whole-joint assessment. PMID:26632537

  8. [A new method of calibration and positioning in quantitative analysis of multicomponents by single marker].

    Science.gov (United States)

    He, Bing; Yang, Shi-Yan; Zhang, Yan

    2012-12-01

    This paper aims to establish a new method of calibration and positioning in quantitative analysis of multicomponents by single marker (QAMS), using Shuanghuanglian oral liquid as the research object. Relative correction factors were established between the reference, chlorogenic acid, and 11 other active components (neochlorogenic acid, cryptochlorogenic acid, caffeic acid, forsythoside A, scutellarin, isochlorogenic acid B, isochlorogenic acid A, isochlorogenic acid C, baicalin, phillyrin and wogonoside) in Shuanghuanglian oral liquid by 3 correction methods (multipoint correction, slope correction and quantitative factor correction). At the same time, chromatographic peaks were positioned by the linear regression method. Only one standard was used to determine the content of 12 components in Shuanghuanglian oral liquid, instead of requiring many reference substances for quality control. The results showed that, within the linear ranges, no significant differences were found in the quantitative results for the 12 active constituents in 3 batches of Shuanghuanglian oral liquid determined by the 3 correction methods and by the external standard method (ESM) or standard curve method (SCM). This method is also simpler and quicker than literature methods, and the results were accurate, reliable, and reproducible. Positioning chromatographic peaks by the linear regression method was more accurate than using the relative retention times reported in the literature. Slope correction and quantitative factor correction are thus feasible and accurate for controlling the quality of traditional Chinese medicine.
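
    The core QAMS arithmetic can be sketched in two steps: derive a relative correction factor for each component from a mixed standard, then quantify that component in a sample using only the marker's calibration. One common convention is shown; the paper's exact formulation, and all numbers below, are assumptions:

```python
def correction_factor(area_marker, conc_marker, area_i, conc_i):
    """Relative correction factor of component i against the single
    marker, computed from a mixed standard solution (one common QAMS
    convention: ratio of the two response factors)."""
    return (area_marker / conc_marker) / (area_i / conc_i)

def quantify(area_i, f_i, area_marker, conc_marker):
    """Content of component i in a sample, using only the marker's
    calibration plus the precomputed correction factor."""
    return f_i * area_i * conc_marker / area_marker

# hypothetical peak areas/concentrations from a mixed standard
f = correction_factor(1000.0, 10.0, 500.0, 10.0)   # f = 2.0
# then one marker standard suffices for the sample run
c = quantify(400.0, f, 900.0, 9.0)                 # 2*400*9/900 = 8.0
```

This is why a single reference substance can stand in for twelve: each component's response is rescaled to the marker's calibration line through its correction factor.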

  9. Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments.

    Directory of Open Access Journals (Sweden)

    Erin E Conners

    Increasingly, 'place', including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low- and middle-income countries (LMIC), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) participatory mapping; 2) quantitative interviews; 3) sex work venue field observation; 4) time-location-activity diaries; 5) in-depth interviews about daily activity spaces. We found that the mixed methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment on drug and sexual risk behaviors among high-risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource-constrained contexts provides new opportunities for informing public health interventions.

  10. Failure to Integrate Quantitative Measurement Methods of Ocular Inflammation Hampers Clinical Practice and Trials on New Therapies for Posterior Uveitis.

    Science.gov (United States)

    Herbort, Carl P; Tugal-Tutkun, Ilknur; Neri, Piergiorgio; Pavésio, Carlos; Onal, Sumru; LeHoang, Phuc

    2017-05-01

    Uveitis is one of the fields in ophthalmology where a tremendous evolution took place in the past 25 years. Not only did we gain access to more efficient, more targeted, and better tolerated therapies, but also in parallel precise and quantitative measurement methods developed allowing the clinician to evaluate these therapies and adjust therapeutic intervention with a high degree of precision. Objective and quantitative measurement of the global level of intraocular inflammation became possible for most inflammatory diseases with direct or spill-over anterior chamber inflammation, thanks to laser flare photometry. The amount of retinal inflammation could be quantified by using fluorescein angiography to score retinal angiographic signs. Indocyanine green angiography gave imaging insight into the hitherto inaccessible choroidal compartment, rendering possible the quantification of choroiditis by scoring indocyanine green angiographic signs. Optical coherence tomography has enabled measurement and objective monitoring of retinal and choroidal thickness. This multimodal quantitative appraisal of intraocular inflammation represents an exquisite security in monitoring uveitis. What is enigmatic, however, is the slow pace with which these improvements are integrated in some areas. What is even more difficult to understand is the fact that clinical trials to assess new therapeutic agents still mostly rely on subjective parameters such as clinical evaluation of vitreous haze as a main endpoint; whereas a whole array of precise, quantitative, and objective modalities are available for the design of clinical studies. The scope of this work was to review the quantitative investigations that improved the management of uveitis in the past 2-3 decades.

  11. A scoring system for appraising mixed methods research, and concomitantly appraising qualitative, quantitative and mixed methods primary studies in Mixed Studies Reviews.

    Science.gov (United States)

    Pluye, Pierre; Gagnon, Marie-Pierre; Griffiths, Frances; Johnson-Lafleur, Janique

    2009-04-01

    A new form of literature review has emerged, Mixed Studies Review (MSR). These reviews include qualitative, quantitative and mixed methods studies. In the present paper, we examine MSRs in health sciences, and provide guidance on processes that should be included and reported. However, there are no valid and usable criteria for concomitantly appraising the methodological quality of the qualitative, quantitative and mixed methods studies. To propose criteria for concomitantly appraising the methodological quality of qualitative, quantitative and mixed methods studies or study components. A three-step critical review was conducted. 2322 references were identified in MEDLINE, and their titles and abstracts were screened; 149 potentially relevant references were selected and the full-text papers were examined; 59 MSRs were retained and scrutinized using a deductive-inductive qualitative thematic data analysis. This revealed three types of MSR: convenience, reproducible, and systematic. Guided by a proposal, we conducted a qualitative thematic data analysis of the quality appraisal procedures used in the 17 systematic MSRs (SMSRs). Of 17 SMSRs, 12 showed clear quality appraisal procedures with explicit criteria but no SMSR used valid checklists to concomitantly appraise qualitative, quantitative and mixed methods studies. In two SMSRs, criteria were developed following a specific procedure. Checklists usually contained more criteria than needed. In four SMSRs, a reliability assessment was described or mentioned. While criteria for quality appraisal were usually based on descriptors that require specific methodological expertise (e.g., appropriateness), no SMSR described the fit between reviewers' expertise and appraised studies. Quality appraisal usually resulted in studies being ranked by methodological quality. A scoring system is proposed for concomitantly appraising the methodological quality of qualitative, quantitative and mixed methods studies for SMSRs. This

  12. Quantitative magnetic resonance micro-imaging methods for pharmaceutical research.

    Science.gov (United States)

    Mantle, M D

    2011-09-30

    The use of magnetic resonance imaging (MRI) as a tool in pharmaceutical research is now well established and the current literature covers a multitude of different pharmaceutically relevant research areas. This review focuses on the use of quantitative magnetic resonance micro-imaging techniques and how they have been exploited to extract information that is of direct relevance to the pharmaceutical industry. The article is divided into two main areas. The first half outlines the theoretical aspects of magnetic resonance and deals with basic magnetic resonance theory, the effects of nuclear spin-lattice (T(1)), spin-spin (T(2)) relaxation and molecular diffusion upon image quantitation, and discusses the applications of rapid magnetic resonance imaging techniques. In addition to the theory, the review aims to provide some practical guidelines for the pharmaceutical researcher with an interest in MRI as to which MRI pulse sequences/protocols should be used and when. The second half of the article reviews the recent advances and developments that have appeared in the literature concerning the use of quantitative micro-imaging methods to pharmaceutically relevant research. Copyright © 2010 Elsevier B.V. All rights reserved.
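The influence of T(1) and T(2) relaxation on image quantitation, discussed in the review above, can be illustrated with the standard saturation-recovery spin-echo signal equation. This is a textbook sketch, not taken from the review; the parameter values are hypothetical.

```python
import math

def sr_spin_echo_signal(s0, tr, te, t1, t2):
    """Saturation-recovery spin-echo signal:
    S = S0 * (1 - exp(-TR/T1)) * exp(-TE/T2).
    s0: fully relaxed (proton-density) signal; times in seconds."""
    return s0 * (1.0 - math.exp(-tr / t1)) * math.exp(-te / t2)

# With TR >> T1 and TE << T2 the measured intensity approaches S0, i.e.
# the image becomes quantitative (proton-density weighted):
s_quant = sr_spin_echo_signal(100.0, tr=10.0, te=0.001, t1=1.0, t2=0.05)

# With short TR and long TE the intensity is strongly relaxation-weighted,
# so it no longer reports concentration directly:
s_weighted = sr_spin_echo_signal(100.0, tr=0.5, te=0.02, t1=1.0, t2=0.05)
```

The gap between the two values shows why pulse-sequence timing must be chosen (or relaxation times mapped explicitly) before MRI intensities can be interpreted quantitatively.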

  13. Virtualising the Quantitative Research Methods Course: An Island-Based Approach

    Science.gov (United States)

    Baglin, James; Reece, John; Baker, Jenalle

    2015-01-01

    Many recent improvements in pedagogical practice have been enabled by the rapid development of innovative technologies, particularly for teaching quantitative research methods and statistics. This study describes the design, implementation, and evaluation of a series of specialised computer laboratory sessions. The sessions combined the use of an…

  14. Quantitative method of X-ray diffraction phase analysis of building materials

    International Nuclear Information System (INIS)

    Czuba, J.; Dziedzic, A.

    1978-01-01

    Quantitative method of X-ray diffraction phase analysis of building materials, with use of internal standard, has been presented. The errors committed by determining the content of particular phases have been also given. (author)
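The internal standard method named above reduces to a simple intensity-ratio calculation once a calibration constant has been determined from a mixture of known composition. The sketch below uses hypothetical phases and intensities, not data from the paper.

```python
# X-ray diffraction internal standard method: a minimal sketch.
# With a known weight fraction w_s of internal standard mixed into the
# sample, the analyte weight fraction follows from the intensity ratio:
#   w_a = K * (I_a / I_s) * w_s
# where K is fixed by a calibration mixture of known composition.

def calibration_constant(i_a, i_s, w_a, w_s):
    """K from a calibration mixture with known w_a and w_s."""
    return (w_a / w_s) * (i_s / i_a)

def phase_fraction(i_a, i_s, w_s, K):
    """Analyte weight fraction in an unknown sample spiked with w_s standard."""
    return K * (i_a / i_s) * w_s

# Hypothetical calibration mixture: 30 wt% quartz, 20 wt% fluorite standard.
K = calibration_constant(i_a=450.0, i_s=300.0, w_a=0.30, w_s=0.20)

# Unknown sample spiked with 20 wt% fluorite:
w_quartz = phase_fraction(i_a=380.0, i_s=310.0, w_s=0.20, K=K)
```

Because both peaks are measured in the same scan, matrix absorption affects `I_a` and `I_s` alike and largely cancels in the ratio, which is the reason the internal standard variant is favoured for building materials with variable matrices.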

  15. A method for the quantitative determination of crystalline phases by X-ray

    Science.gov (United States)

    Petzenhauser, I.; Jaeger, P.

    1988-01-01

    A mineral analysis method is described for rapid quantitative determination of crystalline substances in those cases in which the sample is present in pure form or in a mixture of known composition. With this method there is no need for prior chemical analysis.

  16. VERIFICATION HPLC METHOD OF QUANTITATIVE DETERMINATION OF AMLODIPINE IN TABLETS

    Directory of Open Access Journals (Sweden)

    Khanin V. A

    2014-10-01

    Full Text Available Introduction. Amlodipine ((±)-2-[(2-aminoethoxy)methyl]-4-(2-chlorophenyl)-1,4-dihydro-6-methyl-3,5-pyridinedicarboxylic acid 3-ethyl 5-methyl ester), as the besylate salt, belongs to the group of selective long-acting calcium channel blockers, the dihydropyridine derivatives. In clinical practice it is used as an antianginal and antihypertensive agent for the treatment of cardiovascular diseases. It is produced as a powder substance and in finished dosage forms (tablets of 2.5, 5 and 10 mg). The scientific literature describes methods for quantitative determination of the drug by spectrophotometry (by its own light absorption and by the product of its reaction with alloxan), by chromatographic techniques, by a kinetic-spectrophotometric method in substances and preparations, and by chromatography-mass spectrometry and stripping voltammetry. For the quantitative determination of amlodipine besylate, the British Pharmacopoeia and the European Pharmacopoeia recommend the liquid chromatography method. In connection with the preparation of the second edition of the State Pharmacopoeia of Ukraine (SPhU), which will include monographs on finished products, we set out to analyze the validation characteristics of the chromatographic quantitative determination of amlodipine besylate in tablets and to verify the analytical procedure. Material and methods. The research used amlodipine besylate substance, batch number AB0401013. The analysis was performed on "Amlodipine" tablets, batch number 20113, manufactured by the pharmaceutical company "Zdorovye". The analytical equipment was a Waters 2695 chromatograph with a 2996 diode array detector (Waters Corp., USA), a Nova-Pak C18 column (300 x 3.9 mm, 4 μm particle size), an ER-182 balance (A&D, Japan) and class A volumetric glassware. Preparation of the test solution: to an accurately weighed sample of powdered tablets equivalent to 50 mg of amlodipine, add 30 ml of methanol, shake for 30 minutes, dilute to 50.0 ml with methanol and filter. 5 ml of the methanol solution is adjusted to
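A tablet assay of this kind ultimately reduces to comparing the test peak area with that of a standard of known concentration. The sketch below shows a generic external-standard calculation; the peak areas, concentrations and dilution scheme are hypothetical, not the values from this verification study.

```python
# External-standard HPLC assay calculation: a minimal sketch with
# hypothetical numbers (not from the paper).

def assay_percent(area_test, area_std, conc_std, dilution_test_ml, label_claim_mg):
    """Percent of label claim found in the sampled tablet powder.
    conc_std: standard concentration in mg/ml
    dilution_test_ml: total volume over which the sampled amount of drug
                      (label_claim_mg) was dissolved and diluted
    """
    conc_test = area_test / area_std * conc_std       # mg/ml found
    found_mg = conc_test * dilution_test_ml           # mg in the sampled powder
    return 100.0 * found_mg / label_claim_mg

# Hypothetical run: a 0.1 mg/ml standard gives area 15200; the test
# solution (50 mg claimed, diluted to 500 ml overall) gives area 15050.
pct_of_claim = assay_percent(15050.0, 15200.0, 0.1, 500.0, 50.0)
```

A result near 100% of the label claim, reproduced across replicate preparations, is what the accuracy and precision criteria of a verification exercise are checking.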

  17. A Quantitative Analytical Method to Test for Salt Effects on Giant Unilamellar Vesicles

    DEFF Research Database (Denmark)

    Hadorn, Maik; Bönzli, Eva; Eggenberger Hotz, Peter

    2011-01-01

    preparation method with automatic haemocytometry. We found that this new quantitative screening method is highly reliable and consistent with previously reported results. Thus, this method may provide a significant methodological advance in analysis of effects on free-standing model membranes....

  18. Similar Estimates of Temperature Impacts on Global Wheat Yield by Three Independent Methods

    Science.gov (United States)

    Liu, Bing; Asseng, Senthold; Müller, Christoph; Ewert, Frank; Elliott, Joshua; Lobell, David B.; Martre, Pierre; Ruane, Alex C.; Wallach, Daniel; Jones, James W.; et al.

    2016-01-01

    The potential impact of global temperature change on global crop yield has recently been assessed with different methods. Here we show that grid-based and point-based simulations and statistical regressions (from historic records), without deliberate adaptation or CO2 fertilization effects, produce similar estimates of temperature impact on wheat yields at global and national scales. With a 1 °C global temperature increase, global wheat yield is projected to decline between 4.1% and 6.4%. Projected relative temperature impacts from different methods were similar for major wheat-producing countries China, India, USA and France, but less so for Russia. Point-based and grid-based simulations, and to some extent the statistical regressions, were consistent in projecting that warmer regions are likely to suffer more yield loss with increasing temperature than cooler regions. By forming a multi-method ensemble, it was possible to quantify 'method uncertainty' in addition to model uncertainty. This significantly improves confidence in estimates of climate impacts on global food security.
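The multi-method ensemble idea can be made concrete with a small calculation: treat each approach (grid-based, point-based, statistical) as a group of models, take the spread of the per-method means as 'method uncertainty', and the spread within each group as model uncertainty. The impact values below are hypothetical placeholders, not the study's data.

```python
# Multi-method ensemble sketch: separate 'method uncertainty' (spread of
# per-method means) from within-method model uncertainty. Values are
# hypothetical yield impacts (% change per 1 °C of global warming).

from statistics import mean, pstdev

impacts = {  # per-model estimates within each method
    "grid_based":  [-4.5, -5.2, -6.1, -4.9],
    "point_based": [-4.1, -5.8, -5.0],
    "statistical": [-5.5, -6.4, -4.8],
}

method_means = {m: mean(v) for m, v in impacts.items()}
ensemble_mean = mean(method_means.values())

method_uncertainty = pstdev(method_means.values())            # across methods
model_uncertainty = {m: pstdev(v) for m, v in impacts.items()}  # within each
```

Averaging method means (rather than pooling all models) keeps a method with many models from dominating the ensemble, and the two uncertainty numbers can then be reported side by side as in the abstract.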

  19. Similar estimates of temperature impacts on global wheat yield by three independent methods

    Science.gov (United States)

    Liu, Bing; Asseng, Senthold; Müller, Christoph; Ewert, Frank; Elliott, Joshua; Lobell, David B.; Martre, Pierre; Ruane, Alex C.; Wallach, Daniel; Jones, James W.; Rosenzweig, Cynthia; Aggarwal, Pramod K.; Alderman, Phillip D.; Anothai, Jakarat; Basso, Bruno; Biernath, Christian; Cammarano, Davide; Challinor, Andy; Deryng, Delphine; Sanctis, Giacomo De; Doltra, Jordi; Fereres, Elias; Folberth, Christian; Garcia-Vila, Margarita; Gayler, Sebastian; Hoogenboom, Gerrit; Hunt, Leslie A.; Izaurralde, Roberto C.; Jabloun, Mohamed; Jones, Curtis D.; Kersebaum, Kurt C.; Kimball, Bruce A.; Koehler, Ann-Kristin; Kumar, Soora Naresh; Nendel, Claas; O'Leary, Garry J.; Olesen, Jørgen E.; Ottman, Michael J.; Palosuo, Taru; Prasad, P. V. Vara; Priesack, Eckart; Pugh, Thomas A. M.; Reynolds, Matthew; Rezaei, Ehsan E.; Rötter, Reimund P.; Schmid, Erwin; Semenov, Mikhail A.; Shcherbak, Iurii; Stehfest, Elke; Stöckle, Claudio O.; Stratonovitch, Pierre; Streck, Thilo; Supit, Iwan; Tao, Fulu; Thorburn, Peter; Waha, Katharina; Wall, Gerard W.; Wang, Enli; White, Jeffrey W.; Wolf, Joost; Zhao, Zhigan; Zhu, Yan

    2016-12-01

    The potential impact of global temperature change on global crop yield has recently been assessed with different methods. Here we show that grid-based and point-based simulations and statistical regressions (from historic records), without deliberate adaptation or CO2 fertilization effects, produce similar estimates of temperature impact on wheat yields at global and national scales. With a 1 °C global temperature increase, global wheat yield is projected to decline between 4.1% and 6.4%. Projected relative temperature impacts from different methods were similar for major wheat-producing countries China, India, USA and France, but less so for Russia. Point-based and grid-based simulations, and to some extent the statistical regressions, were consistent in projecting that warmer regions are likely to suffer more yield loss with increasing temperature than cooler regions. By forming a multi-method ensemble, it was possible to quantify 'method uncertainty' in addition to model uncertainty. This significantly improves confidence in estimates of climate impacts on global food security.

  20. Quantitative Analysis of Ductile Iron Microstructure – A Comparison of Selected Methods for Assessment

    Directory of Open Access Journals (Sweden)

    Mrzygłód B.

    2013-09-01

    Full Text Available Stereological description of a dispersed microstructure is not an easy task and remains the subject of continuous research. In its practical aspect, a correct stereological description of this type of structure is essential for the analysis of coagulation and spheroidisation processes, and for studies of relationships between structure and properties. One of the most frequently used methods for estimating the density Nv and size distribution of particles is the Scheil-Schwartz-Saltykov method. In this article, the authors present selected methods for the quantitative assessment of ductile iron microstructure: the Scheil-Schwartz-Saltykov method, which gives a quantitative description of three-dimensional sets of solids from measurements and counts performed on two-dimensional cross-sections of these sets (microsections), and X-ray computed microtomography, an interesting alternative to traditional methods of microstructure imaging because the analysis directly provides a three-dimensional image of the examined microstructure.
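The Scheil-Schwartz-Saltykov unfolding can be sketched compactly for the idealized case of spherical particles sorted into equal-width diameter classes: a sphere of diameter D produces section circles whose diameter distribution is known analytically, so the per-area section counts form an upper-triangular system that is solved from the largest class down. This is a minimal illustration of the principle, not the authors' implementation.

```python
import math

def saltykov_unfold(n_a, d_max):
    """Scheil-Schwartz-Saltykov unfolding (minimal sketch, spheres assumed).
    n_a[i]: section circles per unit area with diameter in class i,
    for k classes of equal width delta = d_max / k.
    Returns n_v[j]: spheres per unit volume with diameter D_j = (j+1)*delta.
    """
    k = len(n_a)
    delta = d_max / k

    def contrib(i, j):
        # Expected sections per unit area in circle class i produced by a
        # unit number density of spheres of diameter D_j:
        #   sqrt(D_j^2 - d_i^2) - sqrt(D_j^2 - d_{i+1}^2)
        dj = (j + 1) * delta
        lo, hi = i * delta, min((i + 1) * delta, dj)
        if lo >= dj:
            return 0.0          # a sphere cannot yield a larger section
        return math.sqrt(dj * dj - lo * lo) - math.sqrt(dj * dj - hi * hi)

    # Upper-triangular system: back-substitute from the largest class down,
    # subtracting the sections contributed by all larger spheres.
    n_v = [0.0] * k
    for j in range(k - 1, -1, -1):
        s = n_a[j] - sum(contrib(j, m) * n_v[m] for m in range(j + 1, k))
        n_v[j] = s / contrib(j, j)
    return n_v
```

Summing `contrib(i, j)` over all section classes gives D_j, consistent with the stereological result that a sphere of diameter D is cut by a unit-area plane probe with expectation proportional to D.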

  1. QUANTITATIVE EVALUATION METHOD OF ELEMENTS PRIORITY OF CARTOGRAPHIC GENERALIZATION BASED ON TAXI TRAJECTORY DATA

    Directory of Open Access Journals (Sweden)

    Z. Long

    2017-09-01

    Full Text Available Considering the lack of quantitative criteria for the selection of elements in cartographic generalization, this study divided the passenger hotspot areas into three levels, assigned them different weights, and then classified the elements from the different hotspots. On this basis, a method was proposed to quantify the priority of element selection. Subsequently, the quantitative priorities of the different cartographic elements were summarized based on this method. In cartographic generalization, the method can be used to preferentially select the significant elements and discard those that are relatively non-significant.
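The weighting scheme described above amounts to scoring each map element by a weighted sum of its occurrences across the three hotspot levels. The weights and counts below are hypothetical stand-ins for the taxi-trajectory-derived values.

```python
# Element-priority sketch for cartographic generalization: hotspot levels
# carry different weights, and an element's selection priority is the
# weighted sum of its occurrences per level (hypothetical values).

HOTSPOT_WEIGHTS = {1: 3.0, 2: 2.0, 3: 1.0}  # level 1 = densest hotspots

def element_priority(counts_by_level):
    """counts_by_level: {hotspot_level: number of occurrences}."""
    return sum(HOTSPOT_WEIGHTS[lvl] * n for lvl, n in counts_by_level.items())

elements = {
    "main_road":   {1: 12, 2: 5, 3: 2},
    "side_street": {1: 2, 2: 4, 3: 9},
}

# Rank elements so generalization keeps the highest-priority ones first.
ranked = sorted(elements, key=lambda e: element_priority(elements[e]),
                reverse=True)
```

During generalization, elements are then retained from the top of `ranked` until the target map density is reached, which operationalizes "preferentially select the significant elements".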

  2. An improved transmutation method for quantitative determination of the components in multicomponent overlapping chromatograms.

    Science.gov (United States)

    Shao, Xueguang; Yu, Zhengliang; Ma, Chaoxiong

    2004-06-01

    An improved method is proposed for the quantitative determination of the components in multicomponent overlapping chromatograms, based on a known transmutation method. To overcome the main limitation of the transmutation method, namely the oscillation generated in the transmutation process, two techniques were adopted: wavelet transform smoothing and cubic spline interpolation for reducing data points. A new criterion was also developed. With the proposed algorithm, the oscillation can be suppressed effectively, and quantitative determination of the components in both simulated and experimental overlapping chromatograms is successfully achieved.
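The two stabilizing steps named above can be sketched on a synthetic overlapping chromatogram. Note the substitutions: a simple moving-average smoother stands in for the paper's wavelet transform smoothing, and SciPy's `CubicSpline` is used for the point-reduction step; the two-Gaussian test signal and all parameters are hypothetical.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def smooth(y, window=5):
    """Moving-average smoother (stand-in for wavelet-transform smoothing)."""
    kernel = np.ones(window) / window
    return np.convolve(y, kernel, mode="same")

def reduce_points(x, y, step=4):
    """Keep every `step`-th point and rebuild the full trace with a cubic
    spline, suppressing point-to-point oscillation."""
    cs = CubicSpline(x[::step], y[::step])
    return cs(x)

# Synthetic two-component overlapping chromatogram with noise:
x = np.linspace(0.0, 10.0, 401)
rng = np.random.default_rng(0)
y = (np.exp(-((x - 4.0) ** 2) / 0.5)
     + 0.6 * np.exp(-((x - 5.2) ** 2) / 0.4)
     + rng.normal(0.0, 0.01, x.size))

y_clean = reduce_points(x, smooth(y), step=4)
```

After this pre-conditioning, the (de-oscillated) trace is what an iterative decomposition such as the transmutation method would operate on.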

  3. Advancing tuberculosis drug regimen development through innovative quantitative translational pharmacology methods and approaches.

    Science.gov (United States)

    Hanna, Debra; Romero, Klaus; Schito, Marco

    2017-03-01

    The development of novel tuberculosis (TB) multi-drug regimens that are more efficacious and of shorter duration requires a robust drug development pipeline. Advances in quantitative modeling and simulation can be used to maximize the utility of patient-level data from prior and contemporary clinical trials, thus optimizing study design for anti-TB regimens. This perspective article highlights the work of seven project teams developing first-in-class translational and quantitative methodologies that aim to inform drug development decision-making, dose selection, trial design, and safety assessments, in order to achieve shorter and safer therapies for patients in need. These tools offer the opportunity to evaluate multiple hypotheses and provide a means to identify, quantify, and understand relevant sources of variability, to optimize translation and clinical trial design. When incorporated into the broader regulatory sciences framework, these efforts have the potential to transform the development paradigm for TB combination development, as well as other areas of global health. Copyright © 2016. Published by Elsevier Ltd.

  4. Comparison between the boundary layer and global resistivity methods for tearing modes in reversed field configurations

    International Nuclear Information System (INIS)

    Santiago, M.A.M.

    1987-01-01

    A review of the problem of growth rate calculations for tearing modes in field-reversed Θ-pinches is presented. It is shown that, for several sets of experimental data, the methods that analyse the plasma with a global finite resistivity give better quantitative agreement than the boundary layer analysis. A comparative study is made of the m = 1 resistive kink mode and the m = 2 mode, the latter being more dangerous for the onset of rotational instabilities of the plasma column. It can be seen that the imaginary component of the eigenfrequency, which determines the growth rate, agrees well with the experimental data, while the real component differs from the rotational frequency measured in some experiments. (author)

  5. The Direct Lighting Computation in Global Illumination Methods

    Science.gov (United States)

    Wang, Changyaw Allen

    1994-01-01

    Creating realistic images is a computationally expensive process, but it is very important for applications such as interior design, product design, education, virtual reality, and movie special effects. To generate realistic images, state-of-the-art rendering techniques are employed to simulate global illumination, which accounts for the interreflection of light among objects. In this document, we formalize the global illumination problem as an eight-dimensional integral and discuss various methods that can accelerate the process of approximating this integral. We focus on the direct lighting computation, which accounts for the light reaching the viewer from the emitting sources after exactly one reflection; on Monte Carlo sampling methods; and on light source simplification. Results include a new sample generation method, a framework for predicting the total number of samples used in a solution, and a generalized Monte Carlo approach for computing the direct lighting from an environment, which for the first time makes ray tracing feasible for highly complex environments.
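The direct lighting computation has a standard Monte Carlo form: sample points uniformly on an area light and average the geometry-weighted contributions. The sketch below estimates irradiance from a rectangular light; it is a generic textbook estimator (occlusion testing omitted, two-sided emitter assumed), not the dissertation's new sampling method.

```python
import math, random

def _add(a, b): return tuple(x + y for x, y in zip(a, b))
def _sub(a, b): return tuple(x - y for x, y in zip(a, b))
def _scale(a, s): return tuple(x * s for x in a)
def _dot(a, b): return sum(x * y for x, y in zip(a, b))
def _cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def estimate_direct(p, n, corner, u, v, emitted, samples=2000, seed=1):
    """Monte Carlo estimate of irradiance at surface point p (unit normal n)
    from a rectangular area light spanned by edge vectors u, v at `corner`.
    Uniform area sampling: pdf = 1/area, hence the final `* area`."""
    rng = random.Random(seed)
    normal = _cross(u, v)
    area = math.sqrt(_dot(normal, normal))
    light_n = _scale(normal, 1.0 / area)
    total = 0.0
    for _ in range(samples):
        q = _add(corner, _add(_scale(u, rng.random()),
                              _scale(v, rng.random())))
        d = _sub(q, p)
        r2 = _dot(d, d)
        w = _scale(d, 1.0 / math.sqrt(r2))   # unit direction p -> q
        cos_p = max(0.0, _dot(n, w))         # foreshortening at the receiver
        cos_q = abs(_dot(light_n, w))        # foreshortening at the emitter
        total += emitted * cos_p * cos_q / r2
    return total * area / samples

# Small light (0.2 x 0.2) directly above the origin at height 5: the
# estimate should approach the point-source value E*A*cos/r^2 = 0.16.
est = estimate_direct((0.0, 0.0, 0.0), (0.0, 0.0, 1.0),
                      (-0.1, -0.1, 5.0), (0.2, 0.0, 0.0), (0.0, 0.2, 0.0),
                      emitted=100.0)
```

The variance of this estimator grows as the light subtends a larger solid angle or approaches the surface, which is exactly the regime where better sample-generation schemes, like those the dissertation develops, pay off.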

  6. Integration of Qualitative and Quantitative Methods: Building and Interpreting Clusters from Grounded Theory and Discourse Analysis

    Directory of Open Access Journals (Sweden)

    Aldo Merlino

    2007-01-01

    Full Text Available Qualitative methods present a wide spectrum of application possibilities as well as opportunities for combining qualitative and quantitative methods. In the social sciences, fruitful theoretical discussions and a great deal of empirical research have taken place. This article introduces an empirical investigation which demonstrates the logic of combining methodologies as well as the collection and interpretation, both sequential and simultaneous, of qualitative and quantitative data. Specifically, the investigation process will be described, beginning with a grounded theory methodology and its combination with the techniques of structural semiotics discourse analysis to generate—in a first phase—an instrument for quantitative measuring and to understand—in a second phase—clusters obtained by quantitative analysis. This work illustrates how qualitative methods allow for the comprehension of the discursive and behavioral elements under study, and how they function as support, making sense of and giving meaning to quantitative data. URN: urn:nbn:de:0114-fqs0701219

  7. A method of quantitative prediction for sandstone type uranium deposit in Russia and its application

    International Nuclear Information System (INIS)

    Chang Shushuai; Jiang Minzhong; Li Xiaolu

    2008-01-01

    The paper presents the foundational principles of quantitative prediction for sandstone-type uranium deposits in Russia. Some key methods, such as physical-mathematical model construction and deposit prediction, are described. The method has been applied to deposit prediction in the Dahongshan region of the Chaoshui basin. It is concluded that the technique strengthens the methodology of quantitative prediction for sandstone-type uranium deposits and could be used as a new technique in China. (authors)

  8. Biological characteristics of crucian by quantitative inspection method

    Science.gov (United States)

    Chu, Mengqi

    2015-04-01

    Through the quantitative inspection method, the biological characteristics of crucian carp were preliminarily researched. Crucian carp (Carassius auratus), of the order Cypriniformes and family Cyprinidae, is a mainly plant-eating omnivorous fish that is gregarious and exhibits selection and ranking behaviour. Crucian carp are widely distributed, and perennial waters all over the country support their production. The indicators measured in the experiment serve to characterize the growth and reproduction of crucian carp in this area. Using the measured data (such as scale length, scale size and annulus diameter) and the related functions, the growth of crucian carp in any one year can be calculated. Maturity was determined from egg shape, colour, weight, etc.; using the mean egg diameter per 20 eggs and the number of eggs per 0.5 grams, the relative and absolute fecundity of the fish were calculated. The measured crucian carp were females at puberty. Based on the relation between scale diameter and body length, a linear relationship was obtained: y=1.530+3.0649. From the data, fecundity is closely related to age: the older the fish, the more mature the gonad development and the greater the number of eggs; in addition, absolute fecundity increases with the development of the pituitary gland. Quantitative inspection of the bait organisms ingested by crucian carp reveals their main foods, secondary foods and incidental foods, and shows the degree to which crucian carp prefer various bait organisms. Fish fecundity increases with weight gain; it is characteristic of the species and population, and is at the same time influenced by individual age, body length, body weight, environmental conditions (especially nutritional conditions), breeding habits, spawning frequency and egg size. This series of studies of the biological characteristics of crucian carp provides an ecological basis for local crucian carp feeding and breeding
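The gravimetric fecundity estimate implied by the "eggs per 0.5 grams" procedure above is a simple scaling calculation: count eggs in a weighed ovary subsample, scale to the whole ovary for absolute fecundity, then normalize by body weight for relative fecundity. The counts and weights below are hypothetical.

```python
# Gravimetric fecundity estimate: a minimal sketch with hypothetical
# values (not measurements from the study).

def absolute_fecundity(eggs_per_subsample, subsample_g, ovary_g):
    """Total egg count, scaled up from a weighed ovary subsample."""
    return eggs_per_subsample * ovary_g / subsample_g

def relative_fecundity(abs_fec, body_weight_g):
    """Eggs per gram of body weight."""
    return abs_fec / body_weight_g

# Hypothetical fish: 215 eggs counted in a 0.5 g subsample of a 38 g ovary,
# body weight 420 g.
abs_fec = absolute_fecundity(eggs_per_subsample=215, subsample_g=0.5,
                             ovary_g=38.0)
rel_fec = relative_fecundity(abs_fec, body_weight_g=420.0)
```

Relative fecundity is the figure usually compared across fish of different sizes, since absolute fecundity rises with body weight as the abstract notes.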

  9. Genetic Diversity of Globally Dispersed Lacustrine Group I Haptophytes: Implications for Quantitative Temperature Reconstructions

    Science.gov (United States)

    Richter, N.; Longo, W. M.; Amaral-Zettler, L. A.; Huang, Y.

    2017-12-01

    There are significant uncertainties surrounding the forcings that drive terrestrial temperature changes on local and regional scales. Quantitative temperature reconstructions from terrestrial sites, such as lakes, help to unravel the fundamental processes that drive changes in temperature on different temporal and spatial scales. Recent studies at Brown University show that distinct alkenones, long chain ketones produced by haptophytes, are found in many freshwater, alkaline lakes in the Northern Hemisphere, highlighting these systems as targets for quantitative continental temperature reconstructions. These freshwater alkenones are produced by the Group I haptophyte phylotype and are characterized by a distinct signature: the presence of isomeric tri-unsaturated ketones and absence of alkenoates. There are currently no cultured representatives of the "Group I" haptophytes, hence they are only known based on their rRNA gene signatures. Here we present robust evidence that Northern Hemispheric freshwater, alkaline lakes with the characteristic "Group I" alkenone signature all host the same clade of Isochrysidales haptophytes. We employed next generation DNA amplicon sequencing to target haptophyte specific hypervariable regions of the large and small-subunit ribosomal RNA gene from 13 different lakes from three continents (i.e., North America, Europe, and Asia). Combined with previously published sequences, our genetic data show that the Group I haptophyte is genetically diverse on a regional and global scale, and even within the same lake. We present two case studies from a suite of five lakes in Alaska and three in Iceland to assess the impact of various environmental factors affecting Group I diversity and alkenone production. Despite the genetic diversity in this group, the overall ketone signature is conserved. 
Based on global surface sediment samples and in situ Alaskan lake calibrations, alkenones produced by different operational taxonomic units of the Group

  10. A collimator optimization method for quantitative imaging: application to Y-90 bremsstrahlung SPECT.

    Science.gov (United States)

    Rong, Xing; Frey, Eric C

    2013-08-01

    Post-therapy quantitative 90Y bremsstrahlung single photon emission computed tomography (SPECT) has shown great potential to provide reliable activity estimates, which are essential for dose verification. Typically 90Y imaging is performed with high- or medium-energy collimators. However, the energy spectrum of 90Y bremsstrahlung photons is substantially different from that typical for these collimators. In addition, dosimetry requires quantitative images, and collimators are not typically optimized for such tasks. Optimizing a collimator for 90Y imaging is both novel and potentially important. Conventional optimization methods are not appropriate for 90Y bremsstrahlung photons, which have a continuous and broad energy distribution. In this work, the authors developed a parallel-hole collimator optimization method for quantitative tasks that is particularly applicable to radionuclides with complex emission energy spectra. The authors applied the proposed method to develop an optimal collimator for quantitative 90Y bremsstrahlung SPECT in the context of microsphere radioembolization. To account for the effects of the collimator on both the bias and the variance of the activity estimates, the authors used the root mean squared error (RMSE) of the volume of interest activity estimates as the figure of merit (FOM). In the FOM, the bias due to the null space of the image formation process was taken into account. The RMSE was weighted by the inverse mass to reflect the application to dosimetry; for a different application, more relevant weighting could easily be adopted. The authors proposed a parameterization for the collimator that facilitates the incorporation of the important factors (geometric sensitivity, geometric resolution, and septal penetration fraction) determining collimator performance, while keeping the number of free parameters describing the collimator small (i.e., two parameters).
To make the optimization results for quantitative 90Y bremsstrahlung SPECT more
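The inverse-mass-weighted RMSE figure of merit described above can be written as a short function: squared errors of the volume-of-interest activity estimates are weighted by 1/mass, so errors in small structures (where absorbed dose per unit activity is high) count more. The VOI values below are hypothetical.

```python
from math import sqrt

def weighted_rmse_fom(estimates, truths, masses_g):
    """Inverse-mass-weighted RMSE figure of merit (minimal sketch):
    sqrt( sum_i w_i (est_i - true_i)^2 / sum_i w_i ), with w_i = 1/mass_i.
    A collimator design minimizing this FOM would be preferred."""
    w = [1.0 / m for m in masses_g]
    num = sum(wi * (e - t) ** 2 for wi, e, t in zip(w, estimates, truths))
    return sqrt(num / sum(w))

# Hypothetical VOI activity estimates (MBq) vs. truth, with VOI masses:
fom = weighted_rmse_fom(estimates=[102.0, 48.5, 9.7],
                        truths=[100.0, 50.0, 10.0],
                        masses_g=[1200.0, 300.0, 30.0])
```

Because both bias and variance enter the squared error, a design that trades a little resolution for much lower noise (or vice versa) is scored on the combined effect, which is the point of using RMSE rather than either term alone.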

  11. Quantitative Global Heat Transfer in a Mach-6 Quiet Tunnel

    Science.gov (United States)

    Sullivan, John P.; Schneider, Steven P.; Liu, Tianshu; Rubal, Justin; Ward, Chris; Dussling, Joseph; Rice, Cody; Foley, Ryan; Cai, Zeimin; Wang, Bo; hide

    2012-01-01

    This project developed quantitative methods for obtaining heat transfer from temperature sensitive paint (TSP) measurements in the Mach-6 quiet tunnel at Purdue, which is a Ludwieg tube with a downstream valve, moderately-short flow duration and low levels of heat transfer. Previous difficulties with inferring heat transfer from TSP in the Mach-6 quiet tunnel were traced to (1) the large transient heat transfer that occurs during the unusually long tunnel startup and shutdown, (2) the non-uniform thickness of the insulating coating, (3) inconsistencies and imperfections in the painting process and (4) the low levels of heat transfer observed on slender models at typical stagnation temperatures near 430K. Repeated measurements were conducted on 7 degree-half-angle sharp circular cones at zero angle of attack in order to evaluate the techniques, isolate the problems and identify solutions. An attempt at developing a two-color TSP method is also summarized.

  12. Global Convergence of Schubert’s Method for Solving Sparse Nonlinear Equations

    Directory of Open Access Journals (Sweden)

    Huiping Cao

    2014-01-01

    Full Text Available Schubert’s method is an extension of Broyden’s method for solving sparse nonlinear equations, which can preserve the zero-nonzero structure defined by the sparse Jacobian matrix and can retain many good properties of Broyden’s method. In particular, Schubert’s method has been proved to be locally and q-superlinearly convergent. In this paper, we globalize Schubert’s method by using a nonmonotone line search. Under appropriate conditions, we show that the proposed algorithm converges globally and superlinearly. Some preliminary numerical experiments are presented, which demonstrate that our algorithm is effective for large-scale problems.
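The core of Schubert's method is a sparsity-preserving variant of Broyden's rank-one update: each row of the Jacobian approximation is corrected using the step restricted to that row's sparsity pattern, so structural zeros stay zero while the secant equation still holds. The sketch below shows only this update, not the paper's globalized algorithm with its nonmonotone line search.

```python
import numpy as np

def schubert_update(B, s, y, pattern):
    """Schubert's sparsity-preserving quasi-Newton update (minimal sketch).
    B: current Jacobian approximation respecting `pattern`
    s: step x_{k+1} - x_k;  y: F(x_{k+1}) - F(x_k)
    pattern: boolean mask, True where J[i, j] may be nonzero.
    Row-wise: B_i += ((y_i - B_i s) / (s_i^T s_i)) s_i^T,
    where s_i is s zeroed outside row i's pattern.
    """
    B = B.copy()
    for i in range(B.shape[0]):
        si = np.where(pattern[i], s, 0.0)
        denom = si @ si
        if denom > 0.0:
            B[i] += (y[i] - B[i] @ s) / denom * si
    return B
```

Since B's row i is zero off the pattern, `B[i] @ s` equals `B[i] @ si`, and one can check that the updated matrix satisfies the secant equation B_new s = y while never filling in a structurally zero entry — the property that makes the method attractive for large sparse systems.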

  13. Sustainability appraisal. Quantitative methods and mathematical techniques for environmental performance evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Erechtchoukova, Marina G.; Khaiter, Peter A. [York Univ., Toronto, ON (Canada). School of Information Technology; Golinska, Paulina (eds.) [Poznan Univ. of Technology (Poland)

    2013-06-01

    The book presents original research papers on quantitative methods and techniques for evaluating the sustainability of business operations and organizations' overall environmental performance. The contributions describe modern methods and approaches applicable to the multi-faceted problem of sustainability appraisal and help to fill in the generic frameworks presented in the literature with the specific quantitative techniques needed in practice. The scope of the book is interdisciplinary in nature, making it of interest to environmental researchers, business managers and process analysts, information management professionals and environmental decision makers, who will find valuable sources of information for their work-related activities. Each chapter provides sufficient background information, a description of problems, and results, making the book useful for a wider audience. Additional software support is not required. One of the most important issues in developing sustainable management strategies and incorporating ecodesigns in production, manufacturing and operations management is the assessment of the sustainability of business operations and organizations' overall environmental performance. The book presents the results of recent studies on sustainability assessment. It provides a solid reference for researchers in academia and industrial practitioners on the state of the art in sustainability appraisal, including the development and application of sustainability indices, quantitative methods, models and frameworks for the evaluation of current and future welfare outcomes, recommendations on data collection and processing for the evaluation of organizations' environmental performance, and eco-efficiency approaches leading to business process re-engineering.

  14. Quantitative determination and validation of octreotide acetate using 1H-NMR spectroscopy with the internal standard method.

    Science.gov (United States)

    Yu, Chen; Zhang, Qian; Xu, Peng-Yao; Bai, Yin; Shen, Wen-Bin; Di, Bin; Su, Meng-Xiang

    2018-01-01

    Quantitative nuclear magnetic resonance (qNMR) is a well-established technique in quantitative analysis. We present a validated 1H-qNMR method for the assay of octreotide acetate, a cyclic octapeptide. Deuterium oxide was used to remove the undesired exchangeable peaks (proton exchange), in order to isolate the quantitative signals in the crowded spectrum of the peptide and ensure precise quantitative analysis. Gemcitabine hydrochloride was chosen as the internal standard. Experimental conditions, including relaxation delay time, number of scans, and pulse angle, were optimized first. Method validation was then carried out in terms of selectivity, stability, linearity, precision, and robustness. The assay result was compared with that obtained by high performance liquid chromatography, the method provided by the Chinese Pharmacopoeia. The statistical F test, Student's t test, and a nonparametric test at the 95% confidence level indicate that there was no significant difference between the two methods. qNMR is a simple and accurate quantitative tool that needs no specific corresponding reference standards. It has potential for the quantitative analysis of other peptide drugs and for the standardization of the corresponding reference standards. Copyright © 2017 John Wiley & Sons, Ltd.
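The internal-standard assay described above reduces to a single ratio equation relating signal areas, proton counts, molar masses, weighed masses, and standard purity. A minimal sketch of that generic qNMR relation (not the authors' exact protocol; all numeric values are hypothetical):

```python
def qnmr_assay(I_a, I_s, N_a, N_s, M_a, M_s, m_a, m_s, P_s):
    """Content (%) of an analyte by internal-standard qNMR.

    I: integrated signal area, N: number of protons giving the signal,
    M: molar mass (g/mol), m: weighed mass (mg), P: purity (%) of the
    internal standard; suffix _a = analyte, _s = internal standard.
    """
    return (I_a / I_s) * (N_s / N_a) * (M_a / M_s) * (m_s / m_a) * P_s

# Hypothetical numbers: equal molar amounts, a 1-proton standard signal
# and a 2-proton analyte signal, give twice the analyte area.
print(qnmr_assay(I_a=2.0, I_s=1.0, N_a=2, N_s=1,
                 M_a=100.0, M_s=50.0, m_a=10.0, m_s=5.0, P_s=100.0))  # 100.0
```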

  15. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method.

    Science.gov (United States)

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-02-01

    The aim was to develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil, and the results obtained by the two quantification methods were compared by paired t-test. Both assays provided good linearity, accuracy, reproducibility and selectivity for the determination of γ-oryzanol, and the statistical comparison of the quantitative determinations in samples did not show any statistically significant difference between them. As the two methods were found to be equivalent, either can be used for the determination of γ-oryzanol in cold pressed rice bran oil.

  16. Reliability of a semi-quantitative method for dermal exposure assessment (DREAM)

    NARCIS (Netherlands)

    Wendel de Joode, B. van; Hemmen, J.J. van; Meijster, T.; Major, V.; London, L.; Kromhout, H.

    2005-01-01

    Valid and reliable semi-quantitative dermal exposure assessment methods for epidemiological research and for occupational hygiene practice, applicable for different chemical agents, are practically nonexistent. The aim of this study was to assess the reliability of a recently developed

  17. Quantitative ChIP-Seq Normalization Reveals Global Modulation of the Epigenome

    Directory of Open Access Journals (Sweden)

    David A. Orlando

    2014-11-01

    Full Text Available Epigenomic profiling by chromatin immunoprecipitation coupled with massively parallel DNA sequencing (ChIP-seq is a prevailing methodology used to investigate chromatin-based regulation in biological systems such as human disease, but the lack of an empirical methodology to enable normalization among experiments has limited the precision and usefulness of this technique. Here, we describe a method called ChIP with reference exogenous genome (ChIP-Rx that allows one to perform genome-wide quantitative comparisons of histone modification status across cell populations using defined quantities of a reference epigenome. ChIP-Rx enables the discovery and quantification of dynamic epigenomic profiles across mammalian cells that would otherwise remain hidden using traditional normalization methods. We demonstrate the utility of this method for measuring epigenomic changes following chemical perturbations and show how reference normalization of ChIP-seq experiments enables the discovery of disease-relevant changes in histone modification occupancy.
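The reference-normalization idea behind ChIP-Rx is simple to state: each sample's signal is rescaled by the reciprocal of its reference-genome read count, so that equal amounts of spiked-in reference chromatin yield equal normalized signal. A minimal sketch of that scaling step (generic ChIP-Rx-style arithmetic, not the authors' pipeline; read counts are hypothetical):

```python
def rx_scale_factors(ref_counts, per=1e6):
    """Per-sample scale factor alpha = per / (reads mapped to the spiked-in
    reference genome); multiplying a sample's coverage track by alpha
    expresses signal per `per` reference reads, making samples comparable."""
    return {sample: per / n for sample, n in ref_counts.items()}

# Hypothetical counts of reads mapped to the reference epigenome.
alphas = rx_scale_factors({"untreated": 2_000_000, "treated": 500_000})
print(alphas)  # the "treated" sample gets a 4x larger scale factor
```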

  18. Informatics methods to enable sharing of quantitative imaging research data.

    Science.gov (United States)

    Levy, Mia A; Freymann, John B; Kirby, Justin S; Fedorov, Andriy; Fennessy, Fiona M; Eschrich, Steven A; Berglund, Anders E; Fenstermacher, David A; Tan, Yongqiang; Guo, Xiaotao; Casavant, Thomas L; Brown, Bartley J; Braun, Terry A; Dekker, Andre; Roelofs, Erik; Mountz, James M; Boada, Fernando; Laymon, Charles; Oborski, Matt; Rubin, Daniel L

    2012-11-01

    The National Cancer Institute Quantitative Research Network (QIN) is a collaborative research network whose goal is to share data, algorithms and research tools to accelerate quantitative imaging research. A challenge is the variability in tools and analysis platforms used in quantitative imaging. Our goal was to understand the extent of this variation and to develop an approach to enable sharing data and to promote reuse of quantitative imaging data in the community. We performed a survey of the current tools in use by the QIN member sites for representation and storage of their QIN research data including images, image meta-data and clinical data. We identified existing systems and standards for data sharing and their gaps for the QIN use case. We then proposed a system architecture to enable data sharing and collaborative experimentation within the QIN. There are a variety of tools currently used by each QIN institution. We developed a general information system architecture to support the QIN goals. We also describe the remaining architecture gaps we are developing to enable members to share research images and image meta-data across the network. As a research network, the QIN will stimulate quantitative imaging research by pooling data, algorithms and research tools. However, there are gaps in current functional requirements that will need to be met by future informatics development. Special attention must be given to the technical requirements needed to translate these methods into the clinical research workflow to enable validation and qualification of these novel imaging biomarkers. Copyright © 2012 Elsevier Inc. All rights reserved.

  19. Global earthquake casualties due to secondary effects: A quantitative analysis for improving rapid loss analyses

    Science.gov (United States)

    Marano, K.D.; Wald, D.J.; Allen, T.I.

    2010-01-01

    This study presents a quantitative and geospatial description of global losses due to earthquake-induced secondary effects, including landslide, liquefaction, tsunami, and fire, for events during the past 40 years. These processes are of great importance to the US Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, which is currently being developed to deliver rapid earthquake impact and loss assessments following large/significant global earthquakes. An important question is how dominant losses due to secondary effects are (and under what conditions, and in which regions), and thus which of these effects should receive higher priority research efforts in order to enhance PAGER's overall assessment of earthquake losses and alerting for the likelihood of secondary impacts. We find that while 21.5% of fatal earthquakes have deaths due to secondary (non-shaking) causes, only rarely are secondary effects the main cause of fatalities. The recent 2004 Great Sumatra-Andaman Islands earthquake is a notable exception, with extraordinary losses due to tsunami. The potential for secondary hazards varies greatly, and systematically, due to regional geologic and geomorphic conditions. Based on our findings, we have built country-specific disclaimers for PAGER that address the potential for each hazard (Earle et al., Proceedings of the 14th World Conference on Earthquake Engineering, Beijing, China, 2008). We will now focus on ways to model casualties from secondary effects based on their relative importance as well as their general predictability. © Springer Science+Business Media B.V. 2009.

  20. Ratio of slopes method for quantitative analysis in ceramic bodies

    International Nuclear Information System (INIS)

    Zainal Arifin Ahmad; Ahmad Fauzi Mohd Noor; Radzali Othman; Messer, P.F.

    1996-01-01

    A quantitative X-ray diffraction analysis technique developed at the University of Sheffield was adopted, rather than the previously widely used internal standard method, to determine the amounts of the phases present in a reformulated whiteware porcelain and a BaTiO3 electrochemical material. This method, although it still employs an internal standard, was found to be very easy and accurate. The required weight fraction of a phase in the mixture to be analysed is determined from the ratio of the slopes of two linear plots, designated the analysis and reference lines, passing through their origins and fitted by the least squares method.
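The ratio-of-slopes idea can be sketched numerically: fit both the analysis line and the reference line through the origin by least squares, then take the ratio of the fitted slopes as the weight fraction. A minimal illustration (hypothetical intensity data, not the authors' measurements):

```python
def slope_through_origin(x, y):
    """Least-squares slope b of the model y = b*x constrained through the origin."""
    return sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)

def weight_fraction(x_a, y_a, x_r, y_r):
    """Weight fraction = slope(analysis line) / slope(reference line)."""
    return slope_through_origin(x_a, y_a) / slope_through_origin(x_r, y_r)

# Hypothetical data: analysis line has slope 2, reference line slope 4.
x = [1.0, 2.0, 3.0]
print(weight_fraction(x, [2.0, 4.0, 6.0], x, [4.0, 8.0, 12.0]))  # 0.5
```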

  1. The Tunneling Method for Global Optimization in Multidimensional Scaling.

    Science.gov (United States)

    Groenen, Patrick J. F.; Heiser, Willem J.

    1996-01-01

    A tunneling method for global minimization in multidimensional scaling is introduced and adjusted for multidimensional scaling with general Minkowski distances. The method alternates a local search step with a tunneling step in which a different configuration is sought with the same STRESS value. (SLD)

  2. Quantitative comparison of in situ soil CO2 flux measurement methods

    Science.gov (United States)

    Jennifer D. Knoepp; James M. Vose

    2002-01-01

    Development of reliable regional or global carbon budgets requires accurate measurement of soil CO2 flux. We conducted laboratory and field studies to determine the accuracy and comparability of methods commonly used to measure in situ soil CO2 fluxes. Methods compared included CO2...

  3. Proteus mirabilis biofilm - qualitative and quantitative colorimetric methods-based evaluation.

    Science.gov (United States)

    Kwiecinska-Piróg, Joanna; Bogiel, Tomasz; Skowron, Krzysztof; Wieckowska, Ewa; Gospodarek, Eugenia

    2014-01-01

    The ability of Proteus mirabilis strains to form biofilm is a current topic of a number of studies worldwide. In this study, biofilm formation by P. mirabilis strains derived from the urine of catheterized and non-catheterized patients was investigated. A total of 39 P. mirabilis strains isolated from the urine samples of patients of the dr Antoni Jurasz University Hospital No. 1 in Bydgoszcz clinics between 2011 and 2012 were used. Biofilm formation was evaluated using two independent quantitative and qualitative methods based on TTC (2,3,5-triphenyl-tetrazolium chloride) and CV (crystal violet). The obtained results confirmed biofilm formation by all the examined strains, except in the quantitative TTC method, in which 7.7% of the strains did not show this ability. P. mirabilis rods were shown to form biofilm on the surfaces of both biomaterials applied, polystyrene and polyvinyl chloride (Nelaton catheters). The differences in biofilm-forming ability observed between P. mirabilis strains derived from the urine of catheterized and non-catheterized patients were not statistically significant.

  4. Proteus mirabilis biofilm - Qualitative and quantitative colorimetric methods-based evaluation

    Directory of Open Access Journals (Sweden)

    Joanna Kwiecinska-Piróg

    2014-12-01

    Full Text Available The ability of Proteus mirabilis strains to form biofilm is a current topic of a number of studies worldwide. In this study, biofilm formation by P. mirabilis strains derived from the urine of catheterized and non-catheterized patients was investigated. A total of 39 P. mirabilis strains isolated from the urine samples of patients of the dr Antoni Jurasz University Hospital No. 1 in Bydgoszcz clinics between 2011 and 2012 were used. Biofilm formation was evaluated using two independent quantitative and qualitative methods based on TTC (2,3,5-triphenyl-tetrazolium chloride) and CV (crystal violet). The obtained results confirmed biofilm formation by all the examined strains, except in the quantitative TTC method, in which 7.7% of the strains did not show this ability. P. mirabilis rods were shown to form biofilm on the surfaces of both biomaterials applied, polystyrene and polyvinyl chloride (Nelaton catheters). The differences in biofilm-forming ability observed between P. mirabilis strains derived from the urine of catheterized and non-catheterized patients were not statistically significant.

  5. Quantitative Methods in Supply Chain Management Models and Algorithms

    CERN Document Server

    Christou, Ioannis T

    2012-01-01

    Quantitative Methods in Supply Chain Management presents some of the most important methods and tools available for modeling and solving problems arising in the context of supply chain management. In the context of this book, “solving problems” usually means designing efficient algorithms for obtaining high-quality solutions. The first chapter is an extensive optimization review covering continuous unconstrained and constrained linear and nonlinear optimization algorithms, as well as dynamic programming and discrete optimization exact methods and heuristics. The second chapter presents time-series forecasting methods together with prediction market techniques for demand forecasting of new products and services. The third chapter details models and algorithms for planning and scheduling with an emphasis on production planning and personnel scheduling. The fourth chapter presents deterministic and stochastic models for inventory control with a detailed analysis on periodic review systems and algorithmic dev...

  6. Global positioning method based on polarized light compass system

    Science.gov (United States)

    Liu, Jun; Yang, Jiangtao; Wang, Yubo; Tang, Jun; Shen, Chong

    2018-05-01

    This paper presents a global positioning method based on a polarized light compass system. A main limitation of polarization positioning is the environment, such as weak and locally destroyed polarization environments; the solution given in this paper is polarization image de-noising and segmentation, for which the pulse coupled neural network is employed to enhance positioning performance. The prominent advantages of the present positioning technique are as follows: (i) compared to the existing positioning methods based on polarized light, better sun tracking accuracy can be achieved, and (ii) the robustness and accuracy of positioning under weak and locally destroyed polarization environments, such as cloud cover or building shielding, are improved significantly. Finally, field experiments are given to demonstrate the effectiveness and applicability of the proposed global positioning technique. The experiments show that the proposed method outperforms the conventional polarization positioning method, with real-time longitude and latitude accuracies up to 0.0461° and 0.0911°, respectively.

  7. Conductance method for quantitative determination of Photobacterium phosphoreum in fish products

    DEFF Research Database (Denmark)

    Dalgaard, Paw; Mejlholm, Ole; Huss, Hans Henrik

    1996-01-01

    This paper presents the development of a sensitive and selective conductance method for quantitative determination of Photobacterium phosphoreum in fresh fish. A calibration curve with a correlation coefficient of -0.981 was established from conductance detection times (DT) for estimation of cell...

  8. Study on methods of quantitative analysis of the biological thin samples in EM X-ray microanalysis

    International Nuclear Information System (INIS)

    Zhang Detian; Zhang Xuemin; He Kun; Yang Yi; Zhang Sa; Wang Baozhen

    2000-01-01

    Objective: To study methods for the quantitative analysis of biological thin samples. Methods: Hall theory was used to address qualitative analysis, background subtraction, the stripping of overlapping peaks, external radiation, and spectral aberrations. Results: Reliable qualitative analysis and precise quantitative analysis were achieved. Conclusion: The methods for the analysis of biological thin samples in EM X-ray microanalysis can be used in biomedical research.

  9. General Methods for Evolutionary Quantitative Genetic Inference from Generalized Mixed Models.

    Science.gov (United States)

    de Villemereuil, Pierre; Schielzeth, Holger; Nakagawa, Shinichi; Morrissey, Michael

    2016-11-01

    Methods for inference and interpretation of evolutionary quantitative genetic parameters, and for prediction of the response to selection, are best developed for traits with normal distributions. Many traits of evolutionary interest, including many life history and behavioral traits, have inherently nonnormal distributions. The generalized linear mixed model (GLMM) framework has become a widely used tool for estimating quantitative genetic parameters for nonnormal traits. However, whereas GLMMs provide inference on a statistically convenient latent scale, it is often desirable to express quantitative genetic parameters on the scale upon which traits are measured. The parameters of fitted GLMMs, despite being on a latent scale, fully determine all quantities of potential interest on the scale on which traits are expressed. We provide expressions for deriving each of such quantities, including population means, phenotypic (co)variances, variance components including additive genetic (co)variances, and parameters such as heritability. We demonstrate that fixed effects have a strong impact on those parameters and show how to deal with this by averaging or integrating over fixed effects. The expressions require integration of quantities determined by the link function, over distributions of latent values. In general cases, the required integrals must be solved numerically, but efficient methods are available and we provide an implementation in an R package, QGglmm. We show that known formulas for quantities such as heritability of traits with binomial and Poisson distributions are special cases of our expressions. Additionally, we show how fitted GLMM can be incorporated into existing methods for predicting evolutionary trajectories. We demonstrate the accuracy of the resulting method for evolutionary prediction by simulation and apply our approach to data from a wild pedigreed vertebrate population. Copyright © 2016 de Villemereuil et al.
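For one special case the latent-to-observed mapping discussed above has a closed form: a Poisson trait with a log link and Gaussian latent values has observed-scale mean exp(mu + sigma^2/2), the log-normal mean. A minimal sketch of that single standard result (not the QGglmm package itself):

```python
import math

def poisson_log_obs_mean(mu, var):
    """Observed-scale population mean of a Poisson GLMM with log link:
    E[y] = E[exp(l)] with latent l ~ N(mu, var), i.e. exp(mu + var/2)."""
    return math.exp(mu + var / 2.0)

# Latent mean 0 with latent variance 1 gives an observed mean of
# exp(0.5) ~ 1.65, above exp(0) = 1: Jensen's inequality in action.
print(poisson_log_obs_mean(0.0, 1.0))
```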

  10. Domestication of smartphones and mobile applications: A quantitative mixed-method study

    NARCIS (Netherlands)

    de Reuver, G.A.; Nikou, S; Bouwman, W.A.G.A.

    2016-01-01

    Smartphones are finding their way into our daily lives. This paper examines the domestication of smartphones by looking at how the way we use mobile applications affects our everyday routines. Data is collected through an innovative quantitative mixed-method approach, combining log data from

  11. A method of quantitative risk assessment for transmission pipeline carrying natural gas

    International Nuclear Information System (INIS)

    Jo, Young-Do; Ahn, Bum Jong

    2005-01-01

    Regulatory authorities in many countries are moving away from prescriptive approaches for keeping natural gas pipelines safe. As an alternative, risk management based on a quantitative assessment is being considered to improve the level of safety. This paper focuses on the development of a simplified method for the quantitative risk assessment for natural gas pipelines and introduces parameters of fatal length and cumulative fatal length. The fatal length is defined as the integrated fatality along the pipeline associated with hypothetical accidents. The cumulative fatal length is defined as the section of pipeline in which an accident leads to N or more fatalities. These parameters can be estimated easily by using the information of pipeline geometry and population density of a Geographic Information Systems (GIS). To demonstrate the proposed method, individual and societal risks for a sample pipeline have been estimated from the historical data of European Gas Pipeline Incident Data Group and BG Transco. With currently acceptable criteria taken into account for individual risk, the minimum proximity of the pipeline to occupied buildings is approximately proportional to the square root of the operating pressure of the pipeline. The proposed method of quantitative risk assessment may be useful for risk management during the planning and building stages of a new pipeline, and modification of a buried pipeline
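The fatal-length construction above can be sketched numerically: integrate the fatality probability of hypothetical accidents along the pipeline, then multiply by the failure rate to approximate individual risk. A toy illustration using trapezoidal integration (all rates and profiles are hypothetical, not values from the paper):

```python
def fatal_length(xs, fatality):
    """Trapezoidal integral of the lethality profile along the pipeline.

    xs: positions along the pipeline (km); fatality: probability that an
    accident at that position is fatal to the exposed individual.
    """
    return sum((fatality[i] + fatality[i + 1]) / 2.0 * (xs[i + 1] - xs[i])
               for i in range(len(xs) - 1))

def individual_risk(failure_rate_per_km_year, xs, fatality):
    """Individual risk (per year) ~ pipeline failure rate x fatal length."""
    return failure_rate_per_km_year * fatal_length(xs, fatality)

# Toy profile: lethality 1.0 over a 2 km stretch, failure rate 1e-4 per km-year.
xs = [0.0, 1.0, 2.0]
print(individual_risk(1e-4, xs, [1.0, 1.0, 1.0]))  # 0.0002 per year
```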

  12. Quantitative electromechanical impedance method for nondestructive testing based on a piezoelectric bimorph cantilever

    International Nuclear Information System (INIS)

    Fu, Ji; Tan, Chi; Li, Faxin

    2015-01-01

    The electromechanical impedance (EMI) method, which holds great promise in structural health monitoring (SHM), is usually treated as a qualitative method. In this work, we proposed a quantitative EMI method based on a piezoelectric bimorph cantilever using the sample’s local contact stiffness (LCS) as the identification parameter for nondestructive testing (NDT). Firstly, the equivalent circuit of the contact vibration system was established and the analytical relationship between the cantilever’s contact resonance frequency and the LCS was obtained. As the LCS is sensitive to typical defects such as voids and delamination, the proposed EMI method can then be used for NDT. To verify the equivalent circuit model, two piezoelectric bimorph cantilevers were fabricated and their free resonance frequencies were measured and compared with theoretical predictions. It was found that the stiff cantilever’s EMI can be well predicted by the equivalent circuit model while the soft cantilever’s cannot. Then, both cantilevers were assembled into a homemade NDT system using a three-axis motorized stage for LCS scanning. Testing results on a specimen with a prefabricated defect showed that the defect could be clearly reproduced in the LCS image, indicating the validity of the quantitative EMI method for NDT. It was found that the single-frequency mode of the EMI method can also be used for NDT, which is faster but not quantitative. Finally, several issues relating to the practical application of the NDT method were discussed. The proposed EMI-based NDT method offers a simple and rapid solution for damage evaluation in engineering structures and may also shed some light on EMI-based SHM. (paper)

  13. Location of airports - selected quantitative methods

    Directory of Open Access Journals (Sweden)

    Agnieszka Merkisz-Guranowska

    2016-09-01

    Full Text Available Background: The role of air transport in  the economic development of a country and its regions cannot be overestimated. The decision concerning an airport's location must be in line with the expectations of all the stakeholders involved. This article deals with the issues related to the choice of  sites where airports should be located. Methods: Two main quantitative approaches related to the issue of airport location are presented in this article, i.e. the question of optimizing such a choice and the issue of selecting the location from a predefined set. The former involves mathematical programming and formulating the problem as an optimization task, the latter, however, involves ranking the possible variations. Due to various methodological backgrounds, the authors present the advantages and disadvantages of both approaches and point to the one which currently has its own practical application. Results: Based on real-life examples, the authors present a multi-stage procedure, which renders it possible to solve the problem of airport location. Conclusions: Based on the overview of literature of the subject, the authors point to three types of approach to the issue of airport location which could enable further development of currently applied methods.

  14. Research design: qualitative, quantitative and mixed methods approaches. Creswell, John W. Sage, 320 pp, £29.00, ISBN 0761924426.

    Science.gov (United States)

    2004-09-01

    The second edition of Creswell's book has been significantly revised and updated. The author clearly sets out three approaches to research: quantitative, qualitative and mixed methods. As someone who has used mixed methods in my research, it is refreshing to read a textbook that addresses this. The differences between the approaches are clearly identified and a rationale for using each methodological stance provided.

  15. Quantitative bioanalytical and analytical method development of dibenzazepine derivative, carbamazepine: A review ☆

    OpenAIRE

    Datar, Prasanna A.

    2015-01-01

    Bioanalytical methods are widely used for quantitative estimation of drugs and their metabolites in physiological matrices. These methods could be applied to studies in areas of human clinical pharmacology and toxicology. The major bioanalytical services are method development, method validation and sample analysis (method application). Various methods such as GC, LC–MS/MS, HPLC, HPTLC, micellar electrokinetic chromatography, and UFLC have been used in laboratories for the qualitative and qua...

  16. Comparison of two methods of quantitation in human studies of biodistribution and radiation dosimetry

    International Nuclear Information System (INIS)

    Smith, T.

    1992-01-01

    A simple method of quantitating organ radioactivity content for dosimetry purposes, based on relationships between organ count rate and the initial whole body count rate, has been compared with a more rigorous method of absolute quantitation using a transmission scanning technique. Comparisons were on the basis of organ uptake (% administered activity) and resultant organ radiation doses (mGy MBq^-1) in 6 normal male volunteers given a 99mTc-labelled myocardial perfusion imaging agent intravenously at rest and following exercise. In these studies, estimates of individual organ uptakes by the simple method were in error by between +24 and -16% compared with the more accurate method. However, errors in organ dose values were somewhat smaller, and the effective dose was correct to within 3%. (Author)

  17. Quantitative Evaluation of Heavy Duty Machine Tools Remanufacturing Based on Modified Catastrophe Progression Method

    Science.gov (United States)

    Li, Shunhe; Rao, Jianhua; Gui, Lin; Zhang, Weimin; Liu, Degang

    2017-11-01

    The result of remanufacturing evaluation is the basis for judging whether a heavy duty machine tool can be remanufactured in the EOL stage of its lifecycle management, and the objectivity and accuracy of the evaluation are the key to the evaluation method. In this paper, the catastrophe progression method is introduced into the quantitative evaluation of heavy duty machine tools' remanufacturing, and the results are modified by the comprehensive adjustment method, which makes the evaluation results accord with conventional human thinking. The catastrophe progression method is used to establish a quantitative evaluation model for heavy duty machine tools and to evaluate the remanufacturing of a retired TK6916 CNC floor milling-boring machine. The evaluation process is simple and highly quantitative, and the result is objective.

  18. [Development and validation of event-specific quantitative PCR method for genetically modified maize LY038].

    Science.gov (United States)

    Mano, Junichi; Masubuchi, Tomoko; Hatano, Shuko; Futo, Satoshi; Koiwa, Tomohiro; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Akiyama, Hiroshi; Teshima, Reiko; Kurashima, Takeyo; Takabatake, Reona; Kitta, Kazumi

    2013-01-01

    In this article, we report a novel real-time PCR-based analytical method for quantitation of the GM maize event LY038. We designed LY038-specific and maize endogenous reference DNA-specific PCR amplifications. After confirming the specificity and linearity of the LY038-specific PCR amplification, we determined the conversion factor required to calculate the weight-based content of GM organism (GMO) in a multilaboratory evaluation. Finally, in order to validate the developed method, an interlaboratory collaborative trial according to the internationally harmonized guidelines was performed with blind DNA samples containing LY038 at the mixing levels of 0, 0.5, 1.0, 5.0 and 10.0%. The precision of the method was evaluated as the RSD of reproducibility (RSDR), and the values obtained were all less than 25%. The limit of quantitation of the method was judged to be 0.5% based on the definition of ISO 24276 guideline. The results from the collaborative trial suggested that the developed quantitative method would be suitable for practical testing of LY038 maize.
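The conversion-factor arithmetic underlying such event-specific quantitation is straightforward: the copy-number ratio of event-specific to endogenous reference targets, divided by the conversion factor measured on pure GM material, gives the weight-based GMO content. A minimal sketch of that generic formula (not the authors' validated assay; all numbers hypothetical):

```python
def gmo_content_percent(event_copies, endogenous_copies, conversion_factor):
    """Weight-based GMO content (%) from real-time PCR copy numbers.

    conversion_factor: copy-number ratio (event/endogenous) measured in
    100% GM material; it translates the copy ratio into weight percent.
    """
    return (event_copies / endogenous_copies) / conversion_factor * 100.0

# Hypothetical run: 10 event copies vs 1000 endogenous copies, Cf = 0.5.
print(gmo_content_percent(10.0, 1000.0, 0.5))  # 2.0 (% GMO)
```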

  19. Quantitative data analysis methods for 3D microstructure characterization of Solid Oxide Cells

    DEFF Research Database (Denmark)

    Jørgensen, Peter Stanley

    through percolating networks and reaction rates at the triple phase boundaries. Quantitative analysis of microstructure is thus important both in research and development of optimal microstructure design and fabrication. Three dimensional microstructure characterization in particular holds great promise...... for gaining further fundamental understanding of how microstructure affects performance. In this work, methods for automatic 3D characterization of microstructure are studied: from the acquisition of 3D image data by focused ion beam tomography to the extraction of quantitative measures that characterize...... the microstructure. The methods are exemplified by the analysis of Ni-YSZ and LSC-CGO electrode samples. Automatic methods for preprocessing the raw 3D image data are developed. The preprocessing steps correct for errors introduced by the image acquisition by the focused ion beam serial sectioning. Alignment

  20. A comparison of visual and quantitative methods to identify interstitial lung abnormalities

    OpenAIRE

    Kliment, Corrine R.; Araki, Tetsuro; Doyle, Tracy J.; Gao, Wei; Dupuis, Josée; Latourelle, Jeanne C.; Zazueta, Oscar E.; Fernandez, Isis E.; Nishino, Mizuki; Okajima, Yuka; Ross, James C.; Estépar, Raúl San José; Diaz, Alejandro A.; Lederer, David J.; Schwartz, David A.

    2015-01-01

    Background: Evidence suggests that individuals with interstitial lung abnormalities (ILA) on a chest computed tomogram (CT) may have an increased risk of developing a clinically significant interstitial lung disease (ILD). Although the methods used to identify individuals with ILA on chest CT have included both automated quantitative and qualitative visual inspection methods, there has been no direct comparison between these two methods. To investigate this relationship, we created lung density met...

  1. Wavelength Selection Method Based on Differential Evolution for Precise Quantitative Analysis Using Terahertz Time-Domain Spectroscopy.

    Science.gov (United States)

    Li, Zhi; Chen, Weidong; Lian, Feiyu; Ge, Hongyi; Guan, Aihong

    2017-12-01

    Quantitative analysis of component mixtures is an important application of terahertz time-domain spectroscopy (THz-TDS) and has attracted broad interest in recent research. Although the accuracy of quantitative analysis using THz-TDS is affected by a host of factors, wavelength selection from the sample's THz absorption spectrum is the most crucial component. The raw spectrum consists of signals from the sample and scattering and other random disturbances that can critically influence the quantitative accuracy. For precise quantitative analysis using THz-TDS, the signal from the sample needs to be retained while the scattering and other noise sources are eliminated. In this paper, a novel wavelength selection method based on differential evolution (DE) is investigated. By performing quantitative experiments on a series of binary amino acid mixtures using THz-TDS, we demonstrate the efficacy of the DE-based wavelength selection method, which yields an error rate below 5%.
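The differential evolution engine behind such a wavelength-selection scheme is compact. The sketch below shows a generic DE/rand/1/bin minimizer on a toy objective, a stand-in for the authors' spectral-error fitness, which is not specified here:

```python
import random

def de_minimize(f, bounds, pop_size=20, F=0.5, CR=0.9, gens=100, seed=1):
    """Minimal DE/rand/1/bin sketch: mutate three random distinct members,
    binomially cross with the target vector, keep the trial if no worse."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jrand = rng.randrange(dim)  # force at least one mutated gene
            trial = []
            for j in range(dim):
                if rng.random() < CR or j == jrand:
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])
                else:
                    v = pop[i][j]
                lo, hi = bounds[j]
                trial.append(min(max(v, lo), hi))  # clamp to bounds
            ft = f(trial)
            if ft <= fit[i]:  # greedy selection
                pop[i], fit[i] = trial, ft
    best = min(range(pop_size), key=lambda k: fit[k])
    return pop[best], fit[best]

# Toy objective: the sphere function, minimized at the origin.
x, fx = de_minimize(lambda v: sum(t * t for t in v), [(-5.0, 5.0)] * 2)
print(x, fx)
```

In the wavelength-selection setting, the decision vector would encode candidate wavelengths (or a binarized selection mask) and the fitness would be the quantitative prediction error on calibration spectra.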

  2. Examining Stress in Graduate Assistants: Combining Qualitative and Quantitative Survey Methods

    Science.gov (United States)

    Mazzola, Joseph J.; Walker, Erin J.; Shockley, Kristen M.; Spector, Paul E.

    2011-01-01

    The aim of this study was to employ qualitative and quantitative survey methods in a concurrent mixed model design to assess stressors and strains in graduate assistants. The stressors most frequently reported qualitatively were work overload, interpersonal conflict, and organizational constraints; the most frequently reported psychological…

  3. A global calibration method for multiple vision sensors based on multiple targets

    International Nuclear Information System (INIS)

    Liu, Zhen; Zhang, Guangjun; Wei, Zhenzhong; Sun, Junhua

    2011-01-01

    The global calibration of multiple vision sensors (MVS) has been widely studied over the last two decades. In this paper, we present a global calibration method for MVS with non-overlapping fields of view (FOVs) using multiple targets (MT). An MT is constructed by fixing several targets, called sub-targets, together; the mutual coordinate transformations between sub-targets need not be known. The main procedure of the proposed method is as follows: one vision sensor is selected from the MVS to establish the global coordinate frame (GCF), and the MT is placed in front of the vision sensors several (at least four) times. Using the constraint that the relative positions of all sub-targets are invariant, the transformation matrix from the coordinate frame of each vision sensor to the GCF can be solved. Both synthetic and real experiments were carried out with good results. The proposed method has been applied to several real measurement systems and shown to be both flexible and accurate. It can serve as an attractive alternative to existing global calibration methods.
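The numerical core of chaining sensor frames into a GCF is estimating the rigid transform between two sets of corresponding 3D points. A generic building block for this (not the paper's exact algorithm) is the Kabsch/SVD method:

```python
import numpy as np

def rigid_transform(P, Q):
    """Least-squares rigid motion with Q ~= R @ P + t for 3xN point sets
    (Kabsch/SVD method)."""
    cP = P.mean(axis=1, keepdims=True)
    cQ = Q.mean(axis=1, keepdims=True)
    H = (P - cP) @ (Q - cQ).T                  # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.linalg.det(Vt.T @ U.T)              # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t
```

In the paper's setting, the sub-target points observed at several MT placements supply such corresponding point sets, and chaining the estimated transforms expresses every sensor's coordinate frame in the GCF.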

  4. The Use of Quantitative and Qualitative Methods in the Analysis of Academic Achievement among Undergraduates in Jamaica

    Science.gov (United States)

    McLaren, Ingrid Ann Marie

    2012-01-01

    This paper describes a study which uses quantitative and qualitative methods in determining the relationship between academic, institutional and psychological variables and degree performance for a sample of Jamaican undergraduate students. Quantitative methods, traditionally associated with the positivist paradigm, and involving the counting and…

  5. A REVIEW OF QUANTITATIVE METHODS FOR STUDIES OF MINERAL-CONTENT OF INTRAORAL INCIPIENT CARIES LESIONS

    NARCIS (Netherlands)

    TENBOSCH, JJ; ANGMARMANSSON, B

    Modern prospective caries studies require the measurement of small changes in tooth mineral content. Quantitative measurement of changes in mineral content in a single caries lesion is desirable. Quantitative methods can be either destructive or non-destructive. The latter type permits longitudinal

  6. SOCIOLOGICAL MEDIA: MAXIMIZING STUDENT INTEREST IN QUANTITATIVE METHODS VIA COLLABORATIVE USE OF DIGITAL MEDIA

    Directory of Open Access Journals (Sweden)

    Frederick T. Tucker

    2016-10-01

    Full Text Available College sociology lecturers are tasked with inspiring student interest in quantitative methods despite widespread student anxiety about the subject, and a tendency for students to relieve classroom anxiety through habitual web browsing. In this paper, the author details the results of a pedagogical program whereby students at a New York City community college used industry-standard software to design, conduct, and analyze sociological surveys of one another, with the aim of inspiring student interest in quantitative methods and enhancing technical literacy. A chi-square test of independence was performed to determine the effect of the pedagogical process on the students’ ability to discuss sociological methods unrelated to their surveys in their final papers, compared with the author’s students from the previous semester who did not undergo the pedagogical program. The relation between these variables was significant, χ²(3, N = 36) = 9.8, p = .02. Findings suggest that community college students, under lecturer supervision, with minimal prior statistical knowledge, and access to digital media can collaborate in small groups to create and conduct sociological surveys, and discuss methods and results in limited classroom time. College sociology lecturers, instead of combatting student desire to use digital media, should harness this desire to advance student mastery of quantitative methods.
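The reported statistic, χ²(3, N = 36) = 9.8, p = .02, corresponds to a test of independence on a contingency table with three degrees of freedom. A hedged sketch with hypothetical counts (not the study's data) using SciPy:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2 x 4 table: cohort (program vs. comparison semester) by
# category of methods discussion in the final papers.
# df = (2 - 1) * (4 - 1) = 3, matching the reported chi-square(3).
table = np.array([[2, 3, 5, 8],
                  [8, 5, 3, 2]])
chi2, p, dof, expected = chi2_contingency(table)
```

`chi2_contingency` computes expected counts under independence and the Pearson statistic; with a 2x4 table no Yates correction is applied.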

  7. A gas chromatography-mass spectrometry method for the quantitation of clobenzorex.

    Science.gov (United States)

    Cody, J T; Valtier, S

    1999-01-01

    Drugs metabolized to amphetamine or methamphetamine are potentially significant concerns in the interpretation of amphetamine-positive urine drug-testing results. One of these compounds, clobenzorex, is an anorectic drug that is available in many countries. Clobenzorex (2-chlorobenzylamphetamine) is metabolized to amphetamine by the body and excreted in the urine. Following administration, the parent compound was detectable for a shorter time than the metabolite amphetamine, which could be detected for days. Because of the potential complication posed to the interpretation of amphetamine-positive drug tests following administration of this drug, the viability of a current amphetamine procedure using liquid-liquid extraction and conversion to the heptafluorobutyryl derivative followed by gas chromatography-mass spectrometry (GC-MS) analysis was evaluated for identification and quantitation of clobenzorex. Qualitative identification of the drug was relatively straightforward. Quantitative analysis proved to be a far more challenging process. Several compounds were evaluated for use as the internal standard in this method, including methamphetamine-d11, fenfluramine, benzphetamine, and diphenylamine. Results using these compounds proved to be less than satisfactory because of poor reproducibility of the quantitative values. Because of its similar chromatographic properties to the parent drug, the compound 3-chlorobenzylamphetamine (3-Cl-clobenzorex) was evaluated in this study as the internal standard for the quantitation of clobenzorex. Precision studies showed 3-Cl-clobenzorex to produce accurate and reliable quantitative results (within-run relative standard deviations [RSDs] clobenzorex.
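Internal-standard quantitation of the kind described, where the analyte peak area is ratioed to the internal-standard (here 3-Cl-clobenzorex) peak area and read off a linear calibration, reduces to a simple computation. A sketch with invented areas and concentrations, purely for illustration:

```python
import numpy as np

# Hypothetical calibrators: response ratio = analyte area / IS area
conc = np.array([50.0, 100.0, 250.0, 500.0, 1000.0])   # ng/mL, spiked
ratio = np.array([0.11, 0.21, 0.52, 1.04, 2.08])       # illustrative ratios

slope, intercept = np.polyfit(conc, ratio, 1)          # linear calibration

def quantify(sample_ratio):
    """Invert the calibration line to get a concentration estimate."""
    return (sample_ratio - intercept) / slope

unknown = quantify(0.78)  # concentration of an unknown specimen, ng/mL
```

A within-run RSD study would then repeat `quantify` over replicate injections of each control level.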

  8. Quantitative analysis method for niobium in lead zirconate titanate

    International Nuclear Information System (INIS)

    Hara, Hideo; Hashimoto, Toshio

    1986-01-01

    Lead zirconate titanate (PZT) is a strong dielectric ceramic having piezoelectric and pyroelectric properties, and is used most as a piezoelectric material. Also it is a main component of lead lanthanum zirconate titanate (PLZT), which is a typical electrical-optical conversion element. Since these have been developed, the various electronic parts utilizing the piezoelectric characteristics have been put in practical use. The characteristics can be set up by changing the composition of PZT and the kinds and amount of additives. Among the additives, niobium has the action to make metallic ion vacancy in crystals, and by the formation of this vacancy, to ease the movement of domain walls in crystal grains, and to increase resistivity. Accordingly, it is necessary to accurately determine the niobium content for the research and development, quality control and process control. The quantitative analysis methods for niobium used so far have respective demerits, therefore, the authors examined the quantitative analysis of niobium in PZT by using an inductively coupled plasma emission spectro-analysis apparatus which has remarkably developed recently. As the result, the method of dissolving a specimen with hydrochloric acid and hydrofluoric acid, and masking unstable lead with ethylene diamine tetraacetic acid 2 sodium and fluoride ions with boric acid was established. The apparatus, reagents, the experiment and the results are reported. (Kako, I.)

  9. A direct method for estimating the alpha/beta ratio from quantitative dose-response data

    International Nuclear Information System (INIS)

    Stuschke, M.

    1989-01-01

    A one-step optimization method based on a least-squares fit of the linear quadratic model to quantitative tissue response data after fractionated irradiation is proposed. Suitable end-points that can be analysed by this method are growth delay, host survival and quantitative biochemical or clinical laboratory data. The functional dependence between the transformed dose and the measured response is approximated by a polynomial. The method allows for the estimation of the alpha/beta ratio and its confidence limits from all observed responses of the different fractionation schedules. Censored data can be included in the analysis. A method to test the appropriateness of the fit is presented. A computer simulation illustrates the method and its accuracy as exemplified by the growth delay end point. A comparison with a fit of the linear quadratic model to interpolated isoeffect doses shows the advantages of the direct method. (orig./HP) [de]
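The direct fit amounts to regressing the measured effect on dose and dose squared in a single least-squares step and taking the ratio of the two coefficients. A minimal sketch with synthetic single-fraction data (α = 0.3 Gy⁻¹ and β = 0.03 Gy⁻² are illustrative values, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
alpha_true, beta_true = 0.3, 0.03             # Gy^-1 and Gy^-2, illustrative
dose = np.array([1.0, 2.0, 4.0, 6.0, 8.0, 10.0])
effect = alpha_true * dose + beta_true * dose**2 + rng.normal(0.0, 0.01, dose.size)

# One-step least-squares fit of E = alpha*D + beta*D^2 (no intercept)
A = np.column_stack([dose, dose**2])
(alpha, beta), *_ = np.linalg.lstsq(A, effect, rcond=None)
alpha_beta = alpha / beta
```

For fractionated schedules the design matrix would use total dose and total dose times dose-per-fraction, but the one-step structure of the fit is the same.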

  10. Benchmarking sample preparation/digestion protocols reveals tube-gel being a fast and repeatable method for quantitative proteomics.

    Science.gov (United States)

    Muller, Leslie; Fornecker, Luc; Van Dorsselaer, Alain; Cianférani, Sarah; Carapito, Christine

    2016-12-01

    Sample preparation, typically by in-solution or in-gel approaches, has a strong influence on the accuracy and robustness of quantitative proteomics workflows. The major benefit of in-gel procedures is their compatibility with detergents (such as SDS) for protein solubilization. However, SDS-PAGE is a time-consuming approach. Tube-gel (TG) preparation circumvents this drawback as it involves directly trapping the sample in a polyacrylamide gel matrix without electrophoresis. We report here the first global label-free quantitative comparison between TG, stacking gel (SG), and basic liquid digestion (LD). A series of UPS1 standard mixtures (at 0.5, 1, 2.5, 5, 10, and 25 fmol) were spiked in a complex yeast lysate background. TG preparation allowed more yeast proteins to be identified than did the SG and LD approaches, with mean numbers of 1979, 1788, and 1323 proteins identified, respectively. Furthermore, the TG method proved equivalent to SG and superior to LD in terms of the repeatability of the subsequent experiments, with mean CV for yeast protein label-free quantifications of 7, 9, and 10%. Finally, known variant UPS1 proteins were successfully detected in the TG-prepared sample within a complex background with high sensitivity. All the data from this study are accessible on ProteomeXchange (PXD003841). © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
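The repeatability metric quoted above, the mean percent coefficient of variation (CV) of label-free protein quantifications across replicate preparations, is straightforward to compute. A sketch on synthetic intensities with roughly 7% spread, standing in for real data:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical label-free intensities: 100 proteins x 3 replicate preparations,
# each replicate scattered ~7% around the protein's true abundance.
true = rng.uniform(1e5, 1e7, 100)
reps = true[:, None] * rng.normal(1.0, 0.07, (100, 3))

cv = reps.std(axis=1, ddof=1) / reps.mean(axis=1) * 100  # percent CV per protein
mean_cv = cv.mean()
```

Comparing `mean_cv` between preparation protocols (TG vs. SG vs. LD) is exactly the comparison summarized by the 7, 9, and 10% figures in the abstract.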

  11. A quantitative method to determine the orientation of collagen fibers in the dermis

    NARCIS (Netherlands)

    Noorlander, Maril L.; Melis, Paris; Jonker, Ard; van Noorden, Cornelis J. F.

    2002-01-01

    We have developed a quantitative microscopic method to determine changes in the orientation of collagen fibers in the dermis resulting from mechanical stress. The method is based on the use of picrosirius red-stained cryostat sections of piglet skin in which collagen fibers reflect light strongly

  12. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method

    OpenAIRE

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-01-01

    Objective: To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. Methods: TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Results: Both assays provided good linearity, accuracy, reproducibility and selectivity for dete...

  13. Method of quantitative analysis of superconducting metal-containing composite materials

    International Nuclear Information System (INIS)

    Bogomolov, V.N.; Zhuravlev, V.V.; Petranovskij, V.P.; Pimenov, V.A.

    1990-01-01

    Technique for quantitative analysis of superconducting metal-containing composite materials, SnO2-InSn, WO3-InW and ZnO-InZn in particular, has been developed. The method of determining metal content in a composite is based on the dependence of the superconducting transition temperature on alloy composition. Sensitivity of temperature determination: 0.02 K; error of analysis for the InSn system: 0.5%

  14. Some selected quantitative methods of thermal image analysis in Matlab.

    Science.gov (United States)

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods for the skin of a human foot and face. The full source code of the developed application is also provided as an attachment. (Figure: the main window of the program during dynamic analysis of the foot thermal image.) © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Project-Based Learning in Undergraduate Environmental Chemistry Laboratory: Using EPA Methods to Guide Student Method Development for Pesticide Quantitation

    Science.gov (United States)

    Davis, Eric J.; Pauls, Steve; Dick, Jonathan

    2017-01-01

    Presented is a project-based learning (PBL) laboratory approach for an upper-division environmental chemistry or quantitative analysis course. In this work, a combined laboratory class of 11 environmental chemistry students developed a method based on published EPA methods for the extraction of dichlorodiphenyltrichloroethane (DDT) and its…

  16. Quantitative Analysis of Differential Proteome Expression in Bladder Cancer vs. Normal Bladder Cells Using SILAC Method.

    Directory of Open Access Journals (Sweden)

    Ganglong Yang

    Full Text Available The best way to increase patient survival rate is to identify patients who are likely to progress to muscle-invasive or metastatic disease upfront and treat them more aggressively. The human cell lines HCV29 (normal bladder epithelia), KK47 (low-grade nonmuscle-invasive bladder cancer, NMIBC) and YTS1 (metastatic bladder cancer) have been widely used in studies of molecular mechanisms and cell signaling during bladder cancer (BC) progression. However, little attention has been paid to global quantitative proteome analysis of these three cell lines. We labeled HCV29, KK47, and YTS1 cells by the SILAC method using three stable isotopes each of arginine and lysine. Labeled proteins were analyzed by 2D ultrahigh-resolution liquid chromatography LTQ Orbitrap mass spectrometry. Among 3721 unique identified and annotated proteins in KK47 and YTS1 cells, 36 were significantly upregulated and 74 were significantly downregulated with >95% confidence. Differential expression of these proteins was confirmed by western blotting, quantitative RT-PCR, and cell staining with specific antibodies. Gene ontology (GO) term and pathway analysis indicated that the differentially regulated proteins were involved in DNA replication and molecular transport, cell growth and proliferation, cellular movement, immune cell trafficking, and cell death and survival. These proteins and the advanced proteome techniques described here will be useful for further elucidation of molecular mechanisms in BC and other types of cancer.

  17. Method and platform standardization in MRM-based quantitative plasma proteomics.

    Science.gov (United States)

    Percy, Andrew J; Chambers, Andrew G; Yang, Juncong; Jackson, Angela M; Domanski, Dominik; Burkhart, Julia; Sickmann, Albert; Borchers, Christoph H

    2013-12-16

    There exists a growing demand in the proteomics community to standardize experimental methods and liquid chromatography-mass spectrometry (LC/MS) platforms in order to enable the acquisition of more precise and accurate quantitative data. This necessity is heightened by the evolving trend of verifying and validating candidate disease biomarkers in complex biofluids, such as blood plasma, through targeted multiple reaction monitoring (MRM)-based approaches with stable isotope-labeled standards (SIS). Considering the lack of performance standards for quantitative plasma proteomics, we previously developed two reference kits to evaluate the MRM with SIS peptide approach using undepleted and non-enriched human plasma. The first kit tests the effectiveness of the LC/MRM-MS platform (kit #1), while the second evaluates the performance of an entire analytical workflow (kit #2). Here, these kits have been refined for practical use and then evaluated through intra- and inter-laboratory testing on 6 common LC/MS platforms. For an identical panel of 22 plasma proteins, similar concentrations were determined, regardless of the kit, instrument platform, and laboratory of analysis. These results demonstrate the value of the kit and reinforce the utility of standardized methods and protocols. The proteomics community needs standardized experimental protocols and quality control methods in order to improve the reproducibility of MS-based quantitative data. This need is heightened by the evolving trend for MRM-based validation of proposed disease biomarkers in complex biofluids such as blood plasma. We have developed two kits to assist in the inter- and intra-laboratory quality control of MRM experiments: the first kit tests the effectiveness of the LC/MRM-MS platform (kit #1), while the second evaluates the performance of an entire analytical workflow (kit #2). In this paper, we report the use of these kits in intra- and inter-laboratory testing on 6 common LC/MS platforms. 

  18. Comparison of culture-based, vital stain and PMA-qPCR methods for the quantitative detection of viable hookworm ova.

    Science.gov (United States)

    Gyawali, P; Sidhu, J P S; Ahmed, W; Jagals, P; Toze, S

    2017-06-01

    Accurate quantitative measurement of viable hookworm ova from environmental samples is the key to controlling hookworm re-infections in endemic regions. In this study, the accuracy of three quantitative detection methods [culture-based, vital stain and propidium monoazide-quantitative polymerase chain reaction (PMA-qPCR)] was evaluated by enumerating 1,000 ± 50 Ancylostoma caninum ova in the laboratory. The culture-based method was able to quantify an average of 397 ± 59 viable hookworm ova. Similarly, the vital stain and PMA-qPCR methods quantified 644 ± 87 and 587 ± 91 viable ova, respectively. The numbers of viable ova estimated by the culture-based method were significantly lower than those obtained by the vital stain and PMA-qPCR methods. Therefore, both PMA-qPCR and vital stain methods appear to be suitable for the quantitative detection of viable hookworm ova. However, PMA-qPCR would be preferable over the vital stain method in scenarios where ova speciation is needed.

  19. Development of a rapid method for the quantitative determination of deoxynivalenol using Quenchbody

    Energy Technology Data Exchange (ETDEWEB)

    Yoshinari, Tomoya [Division of Microbiology, National Institute of Health Sciences, 1-18-1, Kamiyoga, Setagaya-ku, Tokyo 158-8501 (Japan); Ohashi, Hiroyuki; Abe, Ryoji; Kaigome, Rena [Biomedical Division, Ushio Inc., 1-12 Minamiwatarida-cho, Kawasaki-ku, Kawasaki 210-0855 (Japan); Ohkawa, Hideo [Research Center for Environmental Genomics, Kobe University, 1-1 Rokkodai, Nada, Kobe 657-8501 (Japan); Sugita-Konishi, Yoshiko, E-mail: y-konishi@azabu-u.ac.jp [Department of Food and Life Science, Azabu University, 1-17-71 Fuchinobe, Chuo-ku, Sagamihara, Kanagawa 252-5201 (Japan)

    2015-08-12

    Quenchbody (Q-body) is a novel fluorescent biosensor based on the antigen-dependent removal of a quenching effect on a fluorophore attached to antibody domains. In order to develop a method using Q-body for the quantitative determination of deoxynivalenol (DON), a trichothecene mycotoxin produced by some Fusarium species, anti-DON Q-body was synthesized from the sequence information of a monoclonal antibody specific to DON. When the purified anti-DON Q-body was mixed with DON, a dose-dependent increase in the fluorescence intensity was observed and the detection range was between 0.0003 and 3 mg L⁻¹. The coefficients of variation were 7.9% at 0.003 mg L⁻¹, 5.0% at 0.03 mg L⁻¹ and 13.7% at 0.3 mg L⁻¹, respectively. The limit of detection was 0.006 mg L⁻¹ for DON in wheat. The Q-body showed an antigen-dependent fluorescence enhancement even in the presence of wheat extracts. To validate the analytical method using Q-body, a spike-and-recovery experiment was performed using four spiked wheat samples. The recoveries were in the range of 94.9–100.2%. The concentrations of DON in twenty-one naturally contaminated wheat samples were quantitated by the Q-body method, LC-MS/MS and an immunochromatographic assay kit. The LC-MS/MS analysis showed that the levels of DON contamination in the samples were between 0.001 and 2.68 mg kg⁻¹. The concentrations of DON quantitated by LC-MS/MS were more strongly correlated with those using the Q-body method (R² = 0.9760) than the immunochromatographic assay kit (R² = 0.8824). These data indicate that the Q-body system for the determination of DON in wheat samples was successfully developed and Q-body is expected to have a range of applications in the field of food safety. - Highlights: • A rapid method for quantitation of DON using Q-body has been developed. • A recovery test using the anti-DON Q-body was performed. • The concentrations of DON in wheat
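The two validation figures quoted, percent spike recovery and the squared correlation (R²) between quantitation methods, reduce to simple computations. A sketch with invented numbers in the same ranges as the abstract, for illustration only:

```python
import numpy as np

# Hypothetical spike-and-recovery data (mg/kg DON added vs. measured)
spiked   = np.array([0.1, 0.5, 1.0, 2.0])
measured = np.array([0.097, 0.48, 0.99, 1.95])
recovery = measured / spiked * 100.0          # percent recovery per sample

# Hypothetical method comparison: LC-MS/MS vs. Q-body readings (mg/kg)
lcms  = np.array([0.05, 0.20, 0.70, 1.40, 2.60])
qbody = np.array([0.06, 0.19, 0.72, 1.35, 2.70])
r = np.corrcoef(lcms, qbody)[0, 1]            # Pearson correlation
r2 = r ** 2                                   # the R^2 reported in such comparisons
```

With real data, the same `np.corrcoef` call on the twenty-one naturally contaminated samples would yield the R² values compared in the abstract.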

  20. Advantages of a Dynamic RGGG Method in Qualitative and Quantitative Analysis

    International Nuclear Information System (INIS)

    Shin, Seung Ki; Seong, Poong Hyun

    2009-01-01

    Various studies have been conducted to analyze dynamic interactions among components and process variables in nuclear power plants, which cannot be handled by static reliability analysis methods such as conventional fault tree and event tree techniques. A dynamic reliability graph with general gates (RGGG) method was proposed for intuitive modeling of dynamic systems, enabling one to easily analyze huge and complex systems. In this paper, the advantages of the dynamic RGGG method are assessed in two stages: system modeling and quantitative analysis. A software tool for the dynamic RGGG method is then introduced, together with an application to a real dynamic system.

  1. A method for three-dimensional quantitative observation of the microstructure of biological samples

    Science.gov (United States)

    Wang, Pengfei; Chen, Dieyan; Ma, Wanyun; Wu, Hongxin; Ji, Liang; Sun, Jialin; Lv, Danyu; Zhang, Lu; Li, Ying; Tian, Ning; Zheng, Jinggao; Zhao, Fengying

    2009-07-01

    Contemporary biology has developed into the era of cell biology and molecular biology, and researchers now try to study the mechanisms of all kinds of biological phenomena at the microcosmic level. Accurate description of the microstructure of biological samples is an exigent need in many biomedical experiments. This paper introduces a method for 3-dimensional quantitative observation of the microstructure of vital biological samples based on two-photon laser scanning microscopy (TPLSM). TPLSM is a novel kind of fluorescence microscopy that excels in low optical damage, high resolution, deep penetration depth and suitability for 3-dimensional (3D) imaging. Fluorescently stained samples were observed by TPLSM, and their original shapes were then obtained through 3D image reconstruction. The spatial distribution of all objects in the samples, as well as their volumes, could be derived by image segmentation and mathematical calculation. Thus the 3-dimensionally and quantitatively depicted microstructure of the samples was finally derived. We applied this method to quantitative analysis of the spatial distribution of chromosomes in meiotic mouse oocytes at metaphase, with good results.

  2. Characterization of working iron Fischer-Tropsch catalysts using quantitative diffraction methods

    Science.gov (United States)

    Mansker, Linda Denise

    This study presents the results of the ex-situ characterization of working iron Fischer-Tropsch synthesis (F-TS) catalysts, reacted hundreds of hours at elevated pressures, using a new quantitative x-ray diffraction analytical methodology. Compositions, iron phase structures, and phase particle morphologies were determined and correlated with the observed reaction kinetics. Conclusions were drawn about the character of each catalyst in its most and least active state. The identity of the active phase(s) in the Fe F-TS catalyst has been vigorously debated for more than 45 years. The highly-reduced catalyst, used to convert coal-derived syngas to hydrocarbon products, is thought to form a mixture of oxides, metal, and carbides upon pretreatment and reaction. Commonly, Soxhlet extraction is used to effect catalyst-product slurry separation; however, the extraction process could be producing irreversible changes in the catalyst, contributing to the conflicting results in the literature. X-ray diffraction does not require analyte-matrix separation before analysis, and can detect trace phases down to 300 ppm/2 nm; thus, working catalyst slurries could be characterized as-sampled. Data were quantitatively interpreted employing first-principles methods, including the Rietveld polycrystalline structure method. Pretreated catalysts and pure phases were examined experimentally and modeled to explore specific behavior under x-rays. Then, the working catalyst slurries were quantitatively characterized. Empirical quantitation factors were calculated from experimental data or single-crystal parameters, then validated using the Rietveld method results. In the most active form, after pretreatment in H2 or in CO at ambient pressure, well-preserved working catalysts contained significant amounts of Fe7C3 with trace alpha-Fe, once reaction had commenced at elevated pressure.
Amounts of Fe3O4 were constant and small, with carbide dpavg 65 wt%, regardless of pretreatment gas and pressure, with

  3. The value of quantitative methods for assessment of renal transplant and comparison with physician expertness

    International Nuclear Information System (INIS)

    Firouzi, F.; Fazeli, M.

    2002-01-01

    Radionuclide renal diagnostic studies play an important role in assessing renal allografts. Various quantitative parameters have been derived from the radionuclide renogram to facilitate and confirm the detection of changes in perfusion and/or function of a kidney allograft. These quantitative methods are divided into parameters used for assessing renal graft perfusion and parameters used for evaluating parenchymal function. The blood flow in renal transplants can be quantified by measuring the rate of activity appearance in the kidney graft and the ratio of the integral activity under the transplanted kidney and arterial curves, e.g. Hilson's perfusion index and Karachi's kidney/aortic ratio. Quantitative evaluation of graft extraction and excretion was assessed by parameters derived from the 123I/131I-OIH, 99mTc-DTPA or 99mTc-MAG3 renogram. In this study we retrospectively reviewed the scintigraphies of renal transplant patients, all of whom had undergone renal allograft needle biopsy close to the date of the allograft scan. We applied the quantitative methods to all patients. We observed that the perfusion parameters were affected by the quality of the bolus injection, and that numerical variations were related to changes in the site and size of the region of interest. Quantitative methods for renal parenchymal function were nonspecific and far from defining a specific cause of graft dysfunction. In conclusion, neither perfusion nor parenchymal parameters have enough diagnostic power for a specific diagnosis of graft dysfunction. Physician expertness using scintigraphic images and renogram curves is more sensitive and specific for the diagnosis of renal allograft dysfunction

  4. Real time quantitative phase microscopy based on single-shot transport of intensity equation (ssTIE) method

    Science.gov (United States)

    Yu, Wei; Tian, Xiaolin; He, Xiaoliang; Song, Xiaojun; Xue, Liang; Liu, Cheng; Wang, Shouyu

    2016-08-01

    Microscopy based on the transport of intensity equation provides quantitative phase distributions, which opens another perspective for cellular observations. However, it requires multi-focal image capturing, and mechanical and electrical scanning limits its real-time capability in sample detection. Here, in order to break through this restriction, real-time quantitative phase microscopy based on a single-shot transport of intensity equation method is proposed. A programmed phase mask is designed to realize simultaneous multi-focal image recording without any scanning; thus, phase distributions can be quantitatively retrieved in real time. It is believed the proposed method can be potentially applied in various biological and medical applications, especially for live cell imaging.
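Under the common simplifying assumption of near-uniform in-focus intensity, the transport of intensity equation reduces to a Poisson equation for the phase, ∇²φ = -(k/I₀)·∂I/∂z, which can be inverted with an FFT. The following is a generic sketch of that inversion step only (not the authors' single-shot optical design, which uses a programmed phase mask to capture the multi-focal images simultaneously):

```python
import numpy as np

def tie_phase(I_minus, I_plus, dz, wavelength, pixel, I0=None, reg=1e-12):
    """Recover phase from two defocused intensity images via the TIE,
    assuming near-uniform in-focus intensity I0:
        laplacian(phi) = -(k / I0) * dI/dz
    solved with a regularized FFT Poisson solver (periodic boundaries)."""
    k = 2.0 * np.pi / wavelength
    dIdz = (I_plus - I_minus) / (2.0 * dz)       # central finite difference
    if I0 is None:
        I0 = 0.5 * (I_plus + I_minus).mean()
    ny, nx = dIdz.shape
    fy = np.fft.fftfreq(ny, d=pixel)
    fx = np.fft.fftfreq(nx, d=pixel)
    FX, FY = np.meshgrid(fx, fy)
    lap = -4.0 * np.pi**2 * (FX**2 + FY**2)      # Fourier symbol of the Laplacian
    rhs = -(k / I0) * dIdz
    phi_hat = np.fft.fft2(rhs) / (lap - reg)     # regularized inverse Laplacian
    phi_hat[0, 0] = 0.0                          # DC (mean phase) is undefined
    return np.real(np.fft.ifft2(phi_hat))
```

The single-shot scheme in the abstract supplies `I_minus` and `I_plus` from one camera frame; any TIE solver of this form can then run per frame for real-time retrieval.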

  5. Composition and Quantitation of Microalgal Lipids by ERETIC 1H NMR Method

    Directory of Open Access Journals (Sweden)

    Angelo Fontana

    2013-09-01

    Full Text Available Accurate characterization of biomass constituents is a crucial aspect of research in the biotechnological application of natural products. Here we report an efficient, fast and reproducible method for the identification and quantitation of fatty acids and complex lipids (triacylglycerols, glycolipids, phospholipids) in microalgae under investigation for the development of functional health products (probiotics, food ingredients, drugs, etc.) or third-generation biofuels. The procedure consists of extraction of the biological matrix by a modified Folch method and direct analysis of the resulting material by proton nuclear magnetic resonance (1H NMR). The protocol uses a reference electronic signal as external standard (ERETIC method) and allows assessment of total lipid content, degree of saturation and class distribution, in both high-throughput screening of algal collections and metabolic analysis during genetic or culturing studies. As proof of concept, the methodology was applied to the analysis of three microalgal species (Thalassiosira weissflogii, Cyclotella cryptica and Nannochloropsis salina), which differ drastically in the qualitative and quantitative composition of their fatty acid-based lipids.

  6. Quantitative method of measuring cancer cell urokinase and metastatic potential

    Science.gov (United States)

    Morrison, Dennis R. (Inventor)

    1993-01-01

    The metastatic potential of tumors can be evaluated by the quantitative detection of urokinase and DNA. The cell sample selected for examination is analyzed for the presence of high levels of urokinase and abnormal DNA using analytical flow cytometry and digital image analysis. Other factors such as membrane associated urokinase, increased DNA synthesis rates and certain receptors can be used in the method for detection of potentially invasive tumors.

  7. Use of local noise power spectrum and wavelet analysis in quantitative image quality assurance for EPIDs

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Soyoung [Department of Radiation Oncology, University Hospitals Case and Medical Center, Cleveland, Ohio 44106 (United States); Yan, Guanghua; Bassett, Philip; Samant, Sanjiv, E-mail: samant@ufl.edu [Department of Radiation Oncology, University of Florida College of Medicine, Gainesville, Florida 32608 (United States); Gopal, Arun [Department of Radiation Oncology, University of Maryland School of Medicine, Baltimore, Maryland 21201 (United States)

    2016-09-15

    Purpose: To investigate the use of local noise power spectrum (NPS) to characterize image noise and wavelet analysis to isolate defective pixels and inter-subpanel flat-fielding artifacts for quantitative quality assurance (QA) of electronic portal imaging devices (EPIDs). Methods: A total of 93 image sets including custom-made bar-pattern images and open exposure images were collected from four iViewGT a-Si EPID systems over three years. Global quantitative metrics such as modulation transfer function (MTF), NPS, and detective quantum efficiency (DQE) were computed for each image set. Local NPS was also calculated for individual subpanels by sampling regions of interest within each subpanel of the EPID. The 1D NPS, obtained by radially averaging the 2D NPS, was fitted to a power-law function. The r-square value of the linear regression analysis was used as a singular metric to characterize the noise properties of individual subpanels of the EPID. The sensitivity of the local NPS was first compared with the global quantitative metrics using historical image sets. It was then compared with two commonly used commercial QA systems with images collected after applying two different EPID calibration methods (single-level gain and multilevel gain). To detect isolated defective pixels and inter-subpanel flat-fielding artifacts, Haar wavelet transform was applied on the images. Results: Global quantitative metrics including MTF, NPS, and DQE showed little change over the period of data collection. On the contrary, a strong correlation between the local NPS (r-square values) and the variation of the EPID noise condition was observed. The local NPS analysis indicated image quality improvement with the r-square values increased from 0.80 ± 0.03 (before calibration) to 0.85 ± 0.03 (after single-level gain calibration) and to 0.96 ± 0.03 (after multilevel gain calibration), while the commercial QA systems failed to distinguish the image quality improvement between the two calibration methods.
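The local-NPS metric described above (radially averaging the 2D NPS, then taking the r-square of a log-log power-law fit) can be sketched as follows. This is a minimal illustration, not the authors' implementation; the function names and the simple least-squares fit are assumptions:

```python
import numpy as np

def radial_average(nps2d):
    """Radially average a 2D noise power spectrum into a 1D profile."""
    h, w = nps2d.shape
    y, x = np.indices((h, w))
    r = np.hypot(x - w // 2, y - h // 2).astype(int)  # integer radius bins
    sums = np.bincount(r.ravel(), weights=nps2d.ravel())
    counts = np.bincount(r.ravel())
    return sums / np.maximum(counts, 1)

def powerlaw_rsquare(nps1d):
    """Fit NPS(f) ~ a * f^b on a log-log scale; return the r-square of the fit."""
    f = np.arange(1, len(nps1d))   # skip the zero-frequency bin
    y = np.log(nps1d[1:])
    x = np.log(f)
    slope, intercept = np.polyfit(x, y, 1)
    pred = slope * x + intercept
    ss_res = np.sum((y - pred) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - ss_res / ss_tot
```

An r-square near 1 means the subpanel's noise is well described by a power law, which the study associates with a well-calibrated panel.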

  8. Quantitative Methods in the Study of Local History

    Science.gov (United States)

    Davey, Pene

    1974-01-01

    The author suggests how the quantitative analysis of data from census records, assessment roles, and newspapers may be integrated into the classroom. Suggestions for obtaining quantitative data are provided. (DE)

  9. Unbiased stereological methods used for the quantitative evaluation of guided bone regeneration

    DEFF Research Database (Denmark)

    Aaboe, Else Merete; Pinholt, E M; Schou, S

    1998-01-01

    The present study describes the use of unbiased stereological methods for the quantitative evaluation of the amount of regenerated bone. Using the principle of guided bone regeneration the amount of regenerated bone after placement of degradable or non-degradable membranes covering defects...

  10. The use of digital PCR to improve the application of quantitative molecular diagnostic methods for tuberculosis.

    Science.gov (United States)

    Devonshire, Alison S; O'Sullivan, Denise M; Honeyborne, Isobella; Jones, Gerwyn; Karczmarczyk, Maria; Pavšič, Jernej; Gutteridge, Alice; Milavec, Mojca; Mendoza, Pablo; Schimmel, Heinz; Van Heuverswyn, Fran; Gorton, Rebecca; Cirillo, Daniela Maria; Borroni, Emanuele; Harris, Kathryn; Barnard, Marinus; Heydenrych, Anthenette; Ndusilo, Norah; Wallis, Carole L; Pillay, Keshree; Barry, Thomas; Reddington, Kate; Richter, Elvira; Mozioğlu, Erkan; Akyürek, Sema; Yalçınkaya, Burhanettin; Akgoz, Muslum; Žel, Jana; Foy, Carole A; McHugh, Timothy D; Huggett, Jim F

    2016-08-03

    Real-time PCR (qPCR) based methods, such as the Xpert MTB/RIF, are increasingly being used to diagnose tuberculosis (TB). While qualitative methods are adequate for diagnosis, the therapeutic monitoring of TB patients requires quantitative methods, currently performed using smear microscopy. The potential use of quantitative molecular measurements for therapeutic monitoring has been investigated, but findings have been variable and inconclusive. The lack of an adequate reference method and reference materials is a barrier to understanding the source of such disagreement. Digital PCR (dPCR) offers the potential for accurate quantification of specific DNA sequences in reference materials, which can then be used to evaluate quantitative molecular methods for TB treatment monitoring. To assess a novel approach for the development of quality assurance materials, we used dPCR to quantify specific DNA sequences in a range of prototype reference materials and evaluated accuracy between different laboratories and instruments. The materials were then also used to evaluate the quantitative performance of qPCR and Xpert MTB/RIF in eight clinical testing laboratories. dPCR was found to provide results in good agreement with the other methods tested and to be highly reproducible between laboratories without calibration, even when using different instruments. When the reference materials were analysed with qPCR and Xpert MTB/RIF by clinical laboratories, all laboratories were able to correctly rank the reference materials according to concentration; however, there was a marked difference in the measured magnitude. TB is a disease where quantification of the pathogen could lead to better patient management, and qPCR methods offer the potential to perform such analysis rapidly. However, our findings suggest that when precisely characterised materials are used to evaluate qPCR methods, the measurement result variation is too high to determine whether molecular quantification can reliably support treatment monitoring.

  11. Global scaling for semi-quantitative analysis in FP-CIT SPECT.

    Science.gov (United States)

    Kupitz, D; Apostolova, I; Lange, C; Ulrich, G; Amthauer, H; Brenner, W; Buchert, R

    2014-01-01

    Semi-quantitative characterization of dopamine transporter availability from single photon emission computed tomography (SPECT) with 123I-ioflupane (FP-CIT) is based on uptake ratios relative to a reference region. The aim of this study was to evaluate the whole brain as reference region for semi-quantitative analysis of FP-CIT SPECT. The rationale was that this might reduce statistical noise associated with the estimation of non-displaceable FP-CIT uptake. 150 FP-CIT SPECTs were categorized as neurodegenerative or non-neurodegenerative by an expert. Semi-quantitative analysis of specific binding ratios (SBR) was performed with a custom-made tool based on the Statistical Parametric Mapping software package using predefined regions of interest (ROIs) in the anatomical space of the Montreal Neurological Institute. The following reference regions were compared: predefined ROIs for frontal and occipital lobe and whole brain (without striata, thalamus and brainstem). Tracer uptake in the reference region was characterized by the mean, median or 75th percentile of its voxel intensities. The area (AUC) under the receiver operating characteristic curve was used as performance measure. The highest AUC of 0.973 was achieved by the SBR of the putamen with the 75th percentile in the whole brain as reference. The lowest AUC for the putamen SBR of 0.937 was obtained with the mean in the frontal lobe as reference. We recommend the 75th percentile in the whole brain as reference for semi-quantitative analysis in FP-CIT SPECT. This combination provided the best agreement of the semi-quantitative analysis with visual evaluation of the SPECT images by an expert and, therefore, is appropriate to support less experienced physicians.
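The recommended readout above (putaminal SBR against the 75th percentile of whole-brain voxel intensities) reduces to a one-line computation once the ROI voxels are extracted. A minimal sketch with an illustrative function name, assuming the striatal and reference voxel values are already available as arrays:

```python
import numpy as np

def specific_binding_ratio(target_voxels, reference_voxels, percentile=75):
    """SBR = (target uptake - reference uptake) / reference uptake,
    with non-displaceable uptake characterized by a percentile of the
    reference region's voxel intensities (75th of whole brain, as recommended)."""
    ref = np.percentile(reference_voxels, percentile)
    target = np.mean(target_voxels)
    return (target - ref) / ref
```

Using a high percentile of a large reference region, rather than the mean of a small lobe ROI, is what reduces the statistical noise in the denominator.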

  12. Analysis of global multiscale finite element methods for wave equations with continuum spatial scales

    KAUST Repository

    Jiang, Lijian; Efendiev, Yalchin; Ginting, Victor

    2010-01-01

    In this paper, we discuss a numerical multiscale approach for solving wave equations with heterogeneous coefficients. Our interest comes from geophysics applications and we assume that there is no scale separation with respect to spatial variables. To obtain the solution of these multiscale problems on a coarse grid, we compute global fields such that the solution smoothly depends on these fields. We present a Galerkin multiscale finite element method using the global information and provide a convergence analysis when applied to solve the wave equations. We investigate the relation between the smoothness of the global fields and convergence rates of the global Galerkin multiscale finite element method for the wave equations. Numerical examples demonstrate that the use of global information yields better accuracy for wave equations with heterogeneous coefficients than the local multiscale finite element method. © 2010 IMACS.

  13. Analysis of global multiscale finite element methods for wave equations with continuum spatial scales

    KAUST Repository

    Jiang, Lijian

    2010-08-01

    In this paper, we discuss a numerical multiscale approach for solving wave equations with heterogeneous coefficients. Our interest comes from geophysics applications and we assume that there is no scale separation with respect to spatial variables. To obtain the solution of these multiscale problems on a coarse grid, we compute global fields such that the solution smoothly depends on these fields. We present a Galerkin multiscale finite element method using the global information and provide a convergence analysis when applied to solve the wave equations. We investigate the relation between the smoothness of the global fields and convergence rates of the global Galerkin multiscale finite element method for the wave equations. Numerical examples demonstrate that the use of global information yields better accuracy for wave equations with heterogeneous coefficients than the local multiscale finite element method. © 2010 IMACS.

  14. Semi-quantitative methods yield greater inter- and intraobserver agreement than subjective methods for interpreting 99m technetium-hydroxymethylene-diphosphonate uptake in equine thoracic processi spinosi.

    Science.gov (United States)

    van Zadelhoff, Claudia; Ehrle, Anna; Merle, Roswitha; Jahn, Werner; Lischer, Christoph

    2018-05-09

    Scintigraphy is a standard diagnostic method for evaluating horses with back pain due to suspected thoracic processus spinosus pathology. Lesion detection is based on subjective or semi-quantitative assessments of increased uptake. This retrospective, analytical study aimed to compare semi-quantitative and subjective methods in the evaluation of scintigraphic images of the processi spinosi in the equine thoracic spine. Scintigraphic images of 20 Warmblood horses, presented for assessment of orthopedic conditions between 2014 and 2016, were included in the study. Randomized, blinded image evaluation was performed by 11 veterinarians using subjective and semi-quantitative methods. Subjective grading was performed for the analysis of red-green-blue and grayscale scintigraphic images, which were presented in full size or as masked images. For the semi-quantitative assessment, observers placed regions of interest over each processus spinosus. The uptake ratio of each processus spinosus in comparison to a reference region of interest was determined. Subsequently, a modified semi-quantitative calculation was developed whereby only the highest counts-per-pixel for a specified number of pixels were processed. Inter- and intraobserver agreement was calculated using intraclass correlation coefficients. Inter- and intraobserver intraclass correlation coefficients were 41.65% and 71.39%, respectively, for the subjective image assessment. Additionally, a correlation between intraobserver agreement, experience, and grayscale images was identified. The inter- and intraobserver agreement was significantly increased when using semi-quantitative analysis (97.35% and 98.36%, respectively) or the modified semi-quantitative calculation (98.61% and 98.82%, respectively). The proposed modified semi-quantitative technique showed higher inter- and intraobserver agreement when compared to other methods, which makes it a useful tool for the analysis of scintigraphic images.
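The two ROI readouts compared above can be sketched as a single helper. This is an illustrative sketch: applying the top-N restriction to the reference region as well as to the lesion ROI is my assumption, since the abstract does not state how the reference was treated in the modified calculation:

```python
import numpy as np

def uptake_ratio(roi_counts, reference_counts, top_n=None):
    """Ratio of mean counts-per-pixel in a processus spinosus ROI to a
    reference ROI. If top_n is given, only the highest top_n pixel counts
    of each ROI are used (the 'modified' semi-quantitative calculation)."""
    roi = np.sort(np.ravel(roi_counts))[::-1]   # descending pixel counts
    ref = np.sort(np.ravel(reference_counts))[::-1]
    if top_n is not None:
        roi, ref = roi[:top_n], ref[:top_n]
    return roi.mean() / ref.mean()
```

Restricting the ratio to the hottest pixels makes it less sensitive to exactly where each observer draws the ROI boundary, which is one plausible reason the modified calculation agreed better across observers.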

  15. Lattice Boltzmann methods for global linear instability analysis

    Science.gov (United States)

    Pérez, José Miguel; Aguilar, Alfonso; Theofilis, Vassilis

    2017-12-01

    Modal global linear instability analysis is performed using, for the first time ever, the lattice Boltzmann method (LBM) to analyze incompressible flows with two and three inhomogeneous spatial directions. Four linearization models have been implemented in order to recover the linearized Navier-Stokes equations in the incompressible limit. Two of those models employ the single relaxation time (SRT) and have been proposed previously in the literature as linearization of the collision operator of the lattice Boltzmann equation. Two additional models are derived herein for the first time by linearizing the local equilibrium probability distribution function. Instability analysis results are obtained in three benchmark problems, two in closed geometries and one in open flow, namely the square and cubic lid-driven cavity flow and flow in the wake of the circular cylinder. Comparisons with results delivered by classic spectral element methods verify the accuracy of the proposed new methodologies and point out potential limitations particular to the LBM approach. The known issue of the appearance of numerical instabilities when the SRT model is used in direct numerical simulations employing the LBM is shown to be reflected in a spurious global eigenmode when the SRT model is used in the instability analysis. Although this mode is absent in the multiple relaxation times model, other spurious instabilities can also arise and are documented herein. Areas of potential improvement, in order to make the proposed methodology competitive with established approaches for global instability analysis, are discussed.

  16. Objective evaluation of reconstruction methods for quantitative SPECT imaging in the absence of ground truth.

    Science.gov (United States)

    Jha, Abhinav K; Song, Na; Caffo, Brian; Frey, Eric C

    2015-04-13

    Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective method to evaluate these systems is to compare their performance in the end task that is required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e. we have a gold standard. However, very rarely is this gold standard known in human studies. Thus, no-gold-standard techniques to optimize and evaluate systems and algorithms in the absence of a gold standard are required. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output.

  17. A novel semi-quantitative method for measuring tissue bleeding.

    Science.gov (United States)

    Vukcevic, G; Volarevic, V; Raicevic, S; Tanaskovic, I; Milicic, B; Vulovic, T; Arsenijevic, S

    2014-03-01

    In this study, we describe a new semi-quantitative method for measuring the extent of bleeding in pathohistological tissue samples. To test our novel method, we recruited 120 female patients in their first trimester of pregnancy and divided them into three groups of 40. Group I was the control group, in which no dilation was applied. Group II was an experimental group, in which dilation was performed using classical mechanical dilators. Group III was also an experimental group, in which dilation was performed using a hydraulic dilator. Tissue samples were taken from the patients' cervical canals using a Novak's probe via energetic single-step curettage prior to any dilation in Group I and after dilation in Groups II and III. After the tissue samples were prepared, light microscopy was used to obtain microphotographs at 100x magnification. The surfaces affected by bleeding were measured in the microphotographs using the Autodesk AutoCAD 2009 program and its "polylines" function. The lines were used to mark the area around the entire sample (marked A) and to create "polyline" areas around each bleeding area on the sample (marked B). The percentage of the total area affected by bleeding was calculated using the formula N = Bt × 100 / At, where N is the percentage (%) of the tissue sample surface affected by bleeding, At (A total) is the sum of the surfaces of all of the tissue samples and Bt (B total) is the sum of all the surfaces affected by bleeding in all of the tissue samples. This novel semi-quantitative method utilizes the Autodesk AutoCAD 2009 program, which is simple to use and widely available, thereby offering a new, objective and precise approach to estimating the extent of bleeding in tissue samples.
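The percentage formula N = Bt × 100 / At is simple enough to state as code. A minimal sketch with a hypothetical function name, taking the AutoCAD-measured areas as plain numbers:

```python
def bleeding_percentage(sample_areas, bleeding_areas):
    """N = Bt * 100 / At, where At is the summed area of all tissue
    samples and Bt is the summed area of all bleeding regions."""
    a_total = sum(sample_areas)   # At: total tissue area
    b_total = sum(bleeding_areas) # Bt: total bleeding area
    return b_total * 100.0 / a_total
```

Because only the ratio matters, the areas can be in whatever units the drawing tool reports, as long as both sums use the same units.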

  18. Are Teacher Course Evaluations Biased against Faculty That Teach Quantitative Methods Courses?

    Science.gov (United States)

    Royal, Kenneth D.; Stockdale, Myrah R.

    2015-01-01

    The present study investigated graduate students' responses to teacher/course evaluations (TCE) to determine if students' responses were inherently biased against faculty who teach quantitative methods courses. Item response theory (IRT) and Differential Item Functioning (DIF) techniques were utilized for data analysis. Results indicate students…

  19. [Adequate application of quantitative and qualitative statistic analytic methods in acupuncture clinical trials].

    Science.gov (United States)

    Tan, Ming T; Liu, Jian-ping; Lao, Lixing

    2012-08-01

    Recently, proper use of the statistical methods in traditional Chinese medicine (TCM) randomized controlled trials (RCTs) has received increased attention. Statistical inference based on hypothesis testing is the foundation of clinical trials and evidence-based medicine. In this article, the authors described the methodological differences between literature published in Chinese and Western journals in the design and analysis of acupuncture RCTs and the application of basic statistical principles. In China, qualitative analysis methods have been widely used in acupuncture and TCM clinical trials, while between-group quantitative analysis methods based on clinical symptom scores are commonly used in the West. The evidence for and against these analytical differences was discussed based on the data of RCTs assessing acupuncture for pain relief. The authors concluded that although both methods have their unique advantages, quantitative analysis should be used as the primary analysis while qualitative analysis can be a secondary criterion for analysis. The purpose of this paper is to inspire further discussion of such special issues in clinical research design and thus contribute to the increased scientific rigor of TCM research.

  20. The Global Optimal Algorithm of Reliable Path Finding Problem Based on Backtracking Method

    Directory of Open Access Journals (Sweden)

    Liang Shen

    2017-01-01

    Full Text Available There is a growing interest in finding a global optimal path in transportation networks, particularly when the network suffers from unexpected disturbance. This paper studies the problem of finding a global optimal path to guarantee a given probability of arriving on time in a network with uncertainty, in which the travel time is stochastic instead of deterministic. Traditional path finding methods based on least expected travel time cannot capture the network user's risk-taking behaviors in path finding. To overcome such limitation, reliable path finding algorithms have been proposed, but convergence to the global optimum is seldom addressed in the literature. This paper integrates the K-shortest-path algorithm into a backtracking method to propose a new path finding algorithm under uncertainty. Convergence to the global optimum is guaranteed for the proposed method. Numerical examples are conducted to demonstrate the correctness and efficiency of the proposed algorithm.
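To make the reliability criterion concrete, here is a toy backtracking search that enumerates all simple paths and scores each by its probability of arriving within a time budget, assuming independent, normally distributed link travel times. Both assumptions are illustrative: the paper combines a K-shortest-path algorithm with backtracking rather than exhaustive enumeration, and does not prescribe this distribution:

```python
import math

def on_time_probability(path_edges, mean, var, budget):
    """P(arrival time <= budget) for one path, assuming independent
    normally distributed link travel times (illustrative assumption)."""
    mu = sum(mean[e] for e in path_edges)
    s2 = sum(var[e] for e in path_edges)
    z = (budget - mu) / math.sqrt(s2)
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # standard normal CDF

def most_reliable_path(adj, mean, var, src, dst, budget):
    """Backtracking enumeration of simple paths; globally optimal because
    every candidate path is examined (fine only for small networks)."""
    best = (None, -1.0)
    def dfs(node, visited, edges):
        nonlocal best
        if node == dst:
            p = on_time_probability(edges, mean, var, budget)
            if p > best[1]:
                best = (list(edges), p)
            return
        for nxt in adj.get(node, []):
            if nxt not in visited:
                visited.add(nxt)
                edges.append((node, nxt))
                dfs(nxt, visited, edges)
                edges.pop()
                visited.remove(nxt)
    dfs(src, {src}, [])
    return best
```

On a small network, a path with a higher expected travel time but much lower variance can be the reliable choice, which is exactly the risk-aware behavior that least-expected-time methods miss.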

  1. A method for improving global pyranometer measurements by modeling responsivity functions

    Energy Technology Data Exchange (ETDEWEB)

    Lester, A. [Smith College, Northampton, MA 01063 (United States); Myers, D.R. [National Renewable Energy Laboratory, 1617 Cole Blvd., Golden, CO 80401 (United States)

    2006-03-15

    Accurate global solar radiation measurements are crucial to climate change research and the development of solar energy technologies. Pyranometers produce an electrical signal proportional to global irradiance. The signal-to-irradiance ratio is the responsivity (RS) of the instrument (RS = signal/irradiance = microvolts/(W/m²)). Most engineering measurements are made using a constant RS. It is known that RS varies with day of year, zenith angle, and net infrared radiation. This study proposes a method to find an RS function to model a pyranometer's changing RS. Using a reference irradiance calculated from direct and diffuse instruments, we found instantaneous RS for two global pyranometers over 31 sunny days in a two-year period. We performed successive independent regressions of the error between the constant and instantaneous RS with respect to zenith angle, day of year, and net infrared to obtain an RS function. An alternative method replaced the infrared regression with an independently developed technique to account for thermal offset. Results show lower uncertainties with the function method than with the single calibration value. Lower uncertainties also occur using a black-and-white (8-48), rather than all-black (PSP), shaded pyranometer as the diffuse reference instrument. We conclude that the function method is extremely effective in reducing uncertainty in the irradiance measurements for global PSP pyranometers if they are calibrated at the deployment site. Furthermore, it was found that the function method accounts for the pyranometer's thermal offset, rendering further corrections unnecessary. The improvements in irradiance data achieved in this study will serve to increase the accuracy of solar energy assessments and atmospheric research. (author)
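The successive-regression idea can be sketched as follows. This is a deliberately simplified version using first-order fits; the study's actual functional forms, calibration data, and thermal-offset handling are not reproduced here:

```python
import numpy as np

def fit_rs_function(rs_inst, zenith, doy, net_ir, rs_const):
    """Successive independent linear regressions of the residual error
    (instantaneous RS minus the constant calibration RS) against zenith
    angle, day of year, and net infrared; returns an RS model function."""
    err = rs_inst - rs_const
    coefs = []
    for predictor in (zenith, doy, net_ir):
        c = np.polyfit(predictor, err, 1)     # slope, intercept for this regressor
        coefs.append(c)
        err = err - np.polyval(c, predictor)  # remove this regressor's share
    def rs_model(z, d, ir):
        correction = sum(np.polyval(c, p) for c, p in zip(coefs, (z, d, ir)))
        return rs_const + correction
    return rs_model
```

Applying `rs_model` instead of the single calibration constant converts each microvolt reading to irradiance with a responsivity that tracks sun position, season, and thermal conditions.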

  2. Quantitative analysis of the anti-noise performance of an m-sequence in an electromagnetic method

    Science.gov (United States)

    Yuan, Zhe; Zhang, Yiming; Zheng, Qijia

    2018-02-01

    An electromagnetic method with a transmitted waveform coded by an m-sequence achieved better anti-noise performance compared to the conventional approach using a square wave. The anti-noise performance of the m-sequence varied with multiple coding parameters; hence, a quantitative analysis of the anti-noise performance for m-sequences with different coding parameters was required to optimize them. This paper proposes the concept of an identification system in which the identified Earth impulse response is obtained by measuring the system output when the voltage response is used as input. A quantitative analysis of the anti-noise performance of the m-sequence was achieved by analyzing the amplitude-frequency response of the corresponding identification system. The effects of the coding parameters on the anti-noise performance are summarized by numerical simulation, and their optimization is further discussed in our conclusions; the validity of the conclusions is further verified by field experiment. The quantitative analysis method proposed in this paper provides a new insight into the anti-noise mechanism of the m-sequence, and could be used to evaluate the anti-noise performance of artificial sources in other time-domain exploration methods, such as the seismic method.
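An m-sequence itself is generated by a maximal-length linear-feedback shift register (LFSR). A minimal sketch, where the tap set (3, 4) corresponds to the primitive polynomial x^4 + x^3 + 1 and is chosen here only as an example:

```python
def m_sequence(taps, nbits):
    """Generate one period (2**nbits - 1 chips) of an m-sequence from a
    Fibonacci LFSR with the given feedback taps (1-indexed from the input end)."""
    state = [1] * nbits            # any nonzero seed works
    seq = []
    for _ in range(2 ** nbits - 1):
        seq.append(state[-1])      # output the last register stage
        fb = 0
        for t in taps:
            fb ^= state[t - 1]     # XOR the tapped stages
        state = [fb] + state[:-1]  # shift in the feedback bit
    return seq
```

One period of 2^n − 1 chips is balanced (2^(n−1) ones), which underlies the flat, noise-like spectrum that gives the coded waveform its anti-noise advantage over a square wave.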

  3. Combining Multidisciplinary Science, Quantitative Reasoning and Social Context to Teach Global Sustainability and Prepare Students for 21st Century Grand Challenges

    Science.gov (United States)

    Myers, J. D.

    2011-12-01

    The Earth's seven billion humans are consuming a growing proportion of the world's ecosystem products and services. Human activity has also wrought changes that rival the scale of many natural geologic processes, e.g. erosion, transport and deposition, leading to recognition of a new geological epoch, the Anthropocene. Because of these impacts, several natural systems have been pushed beyond the planetary boundaries that made the Holocene favorable for the expansion of humanity. Given these human-induced stresses on natural systems, global citizens will face an increasing number of grand challenges. Unfortunately, traditional discipline-based introductory science courses do little to prepare students for these complex, scientifically-based and technologically-centered challenges. With NSF funding, an introductory, integrated science course stressing quantitative reasoning and social context has been created at UW. The course (GEOL1600: Global Sustainability: Managing the Earth's Resources) is a lower division course designed around the energy-water-climate (EWC) nexus and integrating biology, chemistry, Earth science and physics. It melds lectures, lecture activities, reading questionnaires and labs to create a learning environment that examines the EWC nexus from a global through regional context. The focus on the EWC nexus, while important socially and intended to motivate students, also provides a coherent framework for identifying which disciplinary scientific principles and concepts to include in the course: photosynthesis and deep time (fossil fuels), biogeochemical cycles (climate), chemical reactions (combustion), electromagnetic radiation (solar power), nuclear physics (nuclear power), phase changes and diagrams (water and climate), etc. Lecture activities are used to give students the practice they need to make quantitative skills routine and automatic. Laboratory exercises cover topics such as energy (coal, petroleum, nuclear power) and water (in Bangladesh).

  4. Use of quantitative molecular diagnostic methods to identify causes of diarrhoea in children: a reanalysis of the GEMS case-control study.

    Science.gov (United States)

    Liu, Jie; Platts-Mills, James A; Juma, Jane; Kabir, Furqan; Nkeze, Joseph; Okoi, Catherine; Operario, Darwin J; Uddin, Jashim; Ahmed, Shahnawaz; Alonso, Pedro L; Antonio, Martin; Becker, Stephen M; Blackwelder, William C; Breiman, Robert F; Faruque, Abu S G; Fields, Barry; Gratz, Jean; Haque, Rashidul; Hossain, Anowar; Hossain, M Jahangir; Jarju, Sheikh; Qamar, Farah; Iqbal, Najeeha Talat; Kwambana, Brenda; Mandomando, Inacio; McMurry, Timothy L; Ochieng, Caroline; Ochieng, John B; Ochieng, Melvin; Onyango, Clayton; Panchalingam, Sandra; Kalam, Adil; Aziz, Fatima; Qureshi, Shahida; Ramamurthy, Thandavarayan; Roberts, James H; Saha, Debasish; Sow, Samba O; Stroup, Suzanne E; Sur, Dipika; Tamboura, Boubou; Taniuchi, Mami; Tennant, Sharon M; Toema, Deanna; Wu, Yukun; Zaidi, Anita; Nataro, James P; Kotloff, Karen L; Levine, Myron M; Houpt, Eric R

    2016-09-24

    Diarrhoea is the second leading cause of mortality in children worldwide, but establishing the cause can be complicated by diverse diagnostic approaches and varying test characteristics. We used quantitative molecular diagnostic methods to reassess causes of diarrhoea in the Global Enteric Multicenter Study (GEMS). GEMS was a study of moderate to severe diarrhoea in children younger than 5 years in Africa and Asia. We used quantitative real-time PCR (qPCR) to test for 32 enteropathogens in stool samples from cases and matched asymptomatic controls from GEMS, and compared pathogen-specific attributable incidences with those found with the original GEMS microbiological methods, including culture, EIA, and reverse-transcriptase PCR. We calculated revised pathogen-specific burdens of disease and assessed causes in individual children. We analysed 5304 sample pairs. For most pathogens, incidence was greater with qPCR than with the original methods, particularly for adenovirus 40/41 (around five times), Shigella spp or enteroinvasive Escherichia coli (EIEC) and Campylobacter jejuni or C coli (around two times), and heat-stable enterotoxin-producing E coli (ST-ETEC; around 1·5 times). The six most attributable pathogens became, in descending order, Shigella spp, rotavirus, adenovirus 40/41, ST-ETEC, Cryptosporidium spp, and Campylobacter spp. Pathogen-attributable diarrhoeal burden was 89·3% (95% CI 83·2-96·0) at the population level, compared with 51·5% (48·0-55·0) in the original GEMS analysis. The top six pathogens accounted for 77·8% (74·6-80·9) of all attributable diarrhoea. With use of model-derived quantitative cutoffs to assess individual diarrhoeal cases, 2254 (42·5%) of 5304 cases had one diarrhoea-associated pathogen detected and 2063 (38·9%) had two or more, with Shigella spp and rotavirus being the pathogens most strongly associated with diarrhoea in children with mixed infections. A quantitative molecular diagnostic approach improved population-level estimates of the attributable causes of diarrhoea.

  5. A Method for Quantitative Determination of Biofilm Viability

    Directory of Open Access Journals (Sweden)

    Maria Strømme

    2012-06-01

    Full Text Available In this study we present a scheme for quantitative determination of biofilm viability offering significant improvement over existing methods with metabolic assays. Existing metabolic assays for quantifying viable bacteria in biofilms usually utilize calibration curves derived from planktonic bacteria, which can introduce large errors due to significant differences in the metabolic and/or growth rates of biofilm bacteria in the assay media compared to their planktonic counterparts. In the presented method we derive the specific growth rate of Streptococcus mutans bacteria biofilm from a series of metabolic assays using the pH indicator phenol red, and show that this information could be used to more accurately quantify the relative number of viable bacteria in a biofilm. We found that the specific growth rate of S. mutans in biofilm mode of growth was 0.70 h−1, compared to 1.09 h−1 in planktonic growth. This method should be applicable to other bacteria types, as well as other metabolic assays, and, for example, to quantify the effect of antibacterial treatments or the performance of bactericidal implant surfaces.
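The correction described above (calibrating with the biofilm-specific rather than the planktonic growth rate) can be illustrated under the simplifying assumption of exponential growth up to a fixed metabolic detection threshold; the function names and the threshold model are illustrative assumptions, not the published protocol:

```python
import math

MU_BIOFILM = 0.70     # h^-1, S. mutans biofilm-specific growth rate (from the study)
MU_PLANKTONIC = 1.09  # h^-1, planktonic rate a naive calibration would use

def relative_viable_count(t_detect_h, mu):
    """Relative initial viable count inferred from the time at which the
    metabolic signal (e.g. phenol red colour change) reaches a fixed
    threshold, assuming exponential growth: N0 proportional to exp(-mu * t)."""
    return math.exp(-mu * t_detect_h)

def calibration_bias(t_detect_h):
    """Fold-error introduced by applying the planktonic rate to a biofilm."""
    return (relative_viable_count(t_detect_h, MU_PLANKTONIC)
            / relative_viable_count(t_detect_h, MU_BIOFILM))
```

At a detection time of a few hours, the planktonic calibration would misestimate the viable count severalfold, which is the error the biofilm-derived rate is meant to correct.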

  6. On the theory of global population growth

    International Nuclear Information System (INIS)

    Kapitza, Sergei P

    2010-01-01

    Ours is an epoch of global demographic revolution, a time of a rapid transition from explosive population growth to a low reproduction level. This, possibly the most momentous change ever witnessed by humankind, has, first and foremost, important implications for the dynamics of population. But it also affects billions of people in all aspects of their lives, and it is for this reason that demographic processes have grown into a vast problem, both globally and in Russia. Their fundamental understanding will to a large extent impact the present, the short-term future following the current critical epoch, stable and uniform global development and its priorities, and indeed global security. Quantitative treatment of historical processes is reached using the phenomenological theory of mankind's population growth. This theory relies on the concepts and methods of physics, and its conclusions should take into account the ideas of economics and genetics. (interdisciplinary physics)

  7. A simplified method for quantitative assessment of the relative health and safety risk of environmental management activities

    International Nuclear Information System (INIS)

    Eide, S.A.; Smith, T.H.; Peatross, R.G.; Stepan, I.E.

    1996-09-01

    This report presents a simplified method to assess the health and safety risk of Environmental Management activities of the US Department of Energy (DOE). The method applies to all types of Environmental Management activities including waste management, environmental restoration, and decontamination and decommissioning. The method is particularly useful for planning or tradeoff studies involving multiple conceptual options because it combines rapid evaluation with a quantitative approach. The method is also potentially applicable to risk assessments of activities other than DOE Environmental Management activities if rapid quantitative results are desired

  8. Effects of ROI definition and reconstruction method on quantitative outcome and applicability in a response monitoring trial

    International Nuclear Information System (INIS)

    Krak, Nanda C.; Boellaard, R.; Hoekstra, Otto S.; Hoekstra, Corneline J.; Twisk, Jos W.R.; Lammertsma, Adriaan A.

    2005-01-01

    Quantitative measurement of tracer uptake in a tumour can be influenced by a number of factors, including the method of defining regions of interest (ROIs) and the reconstruction parameters used. The main purpose of this study was to determine the effects of different ROI methods on quantitative outcome, using two reconstruction methods and the standard uptake value (SUV) as a simple quantitative measure of FDG uptake. Four commonly used methods of ROI definition (manual placement, fixed dimensions, threshold based and maximum pixel value) were used to calculate SUV (SUV(MAN), SUV(15 mm), SUV(50), SUV(75) and SUV(max), respectively) and to generate "metabolic" tumour volumes. Test-retest reproducibility of SUVs and of "metabolic" tumour volumes and the applicability of ROI methods during chemotherapy were assessed. In addition, SUVs calculated on ordered subsets expectation maximisation (OSEM) and filtered back-projection (FBP) images were compared. ROI definition had a direct effect on quantitative outcome. On average, SUV(MAN), SUV(15 mm), SUV(50) and SUV(75) were respectively 48%, 27%, 34% and 15% lower than SUV(max) when calculated on OSEM images. No statistically significant differences were found between SUVs calculated on OSEM and FBP reconstructed images. Highest reproducibility was found for SUV(15 mm) and SUV(MAN) (ICC 0.95 and 0.94, respectively) and for "metabolic" volumes measured with the manual and 50% threshold ROIs (ICC 0.99 for both). Manual, 75% threshold and maximum pixel ROIs could be used throughout therapy, regardless of changes in tumour uptake or geometry. SUVs showed the same trend in relative change in FDG uptake after chemotherapy, irrespective of the ROI method used. The method of ROI definition has a direct influence on quantitative outcome. In terms of simplicity, user-independence, reproducibility and general applicability the threshold-based and fixed dimension methods are the best ROI methods. Threshold methods are in
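
    Two of the ROI definitions above, maximum pixel value and 50%-of-maximum threshold, can be sketched directly. The voxel values, injected dose and body weight are toy numbers, and the body-weight normalization SUV = tissue activity concentration / (injected dose / body weight) is the standard convention assumed here:

    ```python
    def suv(activity_bq_per_ml, injected_dose_bq, body_weight_g):
        """Standard body-weight-normalized SUV."""
        return activity_bq_per_ml / (injected_dose_bq / body_weight_g)

    voxels = [1200.0, 3400.0, 5100.0, 6000.0, 5600.0, 2100.0]  # Bq/ml, toy tumour profile
    dose, weight = 370e6, 75e3  # 370 MBq injected, 75 kg patient

    suvs = [suv(v, dose, weight) for v in voxels]
    suv_max = max(suvs)  # maximum-pixel ROI: SUV(max)

    # 50%-of-maximum threshold ROI, SUV(50): average all voxels >= 0.5 * max
    roi50 = [s for s in suvs if s >= 0.5 * suv_max]
    suv50 = sum(roi50) / len(roi50)
    print(round(suv_max, 3), round(suv50, 3))
    ```

    The threshold ROI is user-independent by construction, which is one reason the abstract ranks threshold-based methods highly.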

  9. Proposal of Evolutionary Simplex Method for Global Optimization Problem

    Science.gov (United States)

    Shimizu, Yoshiaki

    To make agile decisions in a rational manner, the role of optimization engineering has become increasingly important under diversified customer demand. With this point of view, in this paper we propose a new evolutionary method serving as an optimization technique in the paradigm of optimization engineering. The developed method is expected to solve globally the various complicated problems appearing in real-world applications. It evolved from the conventional Nelder and Mead Simplex method by incorporating ideas borrowed from recent meta-heuristic methods such as PSO. After describing an algorithm to handle linear inequality constraints effectively, we validate the effectiveness of the proposed method through comparison with other methods on several benchmark problems.
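
    For orientation, the conventional Nelder-Mead baseline that the proposal extends can be sketched in a few lines. The multi-start loop is only a loose stand-in for the evolutionary/global ingredient, and the quadratic test function is a toy benchmark, not one from the paper:

    ```python
    import random

    def nelder_mead(f, x0, step=0.5, iters=200):
        """Minimal Nelder-Mead simplex: reflection, expansion, contraction, shrink."""
        n = len(x0)
        simplex = [list(x0)] + [
            [x0[j] + (step if j == i else 0.0) for j in range(n)] for i in range(n)
        ]
        for _ in range(iters):
            simplex.sort(key=f)
            centroid = [sum(p[j] for p in simplex[:-1]) / n for j in range(n)]
            worst = simplex[-1]
            refl = [2 * centroid[j] - worst[j] for j in range(n)]         # reflection
            if f(refl) < f(simplex[0]):
                exp = [3 * centroid[j] - 2 * worst[j] for j in range(n)]  # expansion
                simplex[-1] = exp if f(exp) < f(refl) else refl
            elif f(refl) < f(simplex[-2]):
                simplex[-1] = refl
            else:
                contr = [0.5 * (centroid[j] + worst[j]) for j in range(n)]  # contraction
                if f(contr) < f(worst):
                    simplex[-1] = contr
                else:  # shrink all vertices toward the best one
                    best = simplex[0]
                    simplex = [best] + [
                        [0.5 * (p[j] + best[j]) for j in range(n)] for p in simplex[1:]
                    ]
        simplex.sort(key=f)
        return simplex[0]

    def quad(p):
        """Toy convex benchmark with minimum at (3, -1)."""
        x, y = p
        return (x - 3.0) ** 2 + (y + 1.0) ** 2

    # Several random starts, keeping the best result: a crude global wrapper.
    random.seed(7)
    starts = [[random.uniform(-4.0, 4.0), random.uniform(-4.0, 4.0)] for _ in range(4)]
    best = min((nelder_mead(quad, s) for s in starts), key=quad)
    print([round(v, 3) for v in best])
    ```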

  10. Quantitative methods for evaluating the efficacy of thalamic deep brain stimulation in patients with essential tremor.

    Science.gov (United States)

    Wastensson, Gunilla; Holmberg, Björn; Johnels, Bo; Barregard, Lars

    2013-01-01

    Deep brain stimulation (DBS) of the thalamus is a safe and efficient method for treatment of disabling tremor in patients with essential tremor (ET). However, successful tremor suppression after surgery requires careful selection of stimulus parameters. Our aim was to examine the possible use of certain quantitative methods for evaluating the efficacy of thalamic DBS in ET patients in clinical practice, and to compare these methods with traditional clinical tests. We examined 22 patients using the Essential Tremor Rating Scale (ETRS) and quantitative assessment of tremor with the stimulator both activated and deactivated. We used an accelerometer (CATSYS tremor Pen) for quantitative measurement of postural tremor, and a eurythmokinesimeter (EKM) to evaluate kinetic tremor in a rapid pointing task. The efficacy of DBS on tremor suppression was prominent irrespective of the method used. The agreement between clinical rating of postural tremor and tremor intensity as measured by the CATSYS tremor pen was relatively high (rs = 0.74). The agreement between kinetic tremor as assessed by the ETRS and the main outcome variable from the EKM test was low (rs = 0.34). The lack of agreement indicates that the EKM test is not comparable with the clinical test. Quantitative methods, such as the CATSYS tremor pen, could be a useful complement to clinical tremor assessment in evaluating the efficacy of DBS in clinical practice. Future studies should evaluate the precision of these methods and the long-term impact on tremor suppression, activities of daily living (ADL) function and quality of life.
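
    The agreement statistic used here, Spearman's rank correlation, can be computed from paired ratings without external libraries. The paired scores below are hypothetical, chosen only to show the calculation:

    ```python
    def rank(xs):
        """Average 1-based ranks, with ties sharing their mean rank."""
        order = sorted(range(len(xs)), key=lambda i: xs[i])
        ranks = [0.0] * len(xs)
        i = 0
        while i < len(order):
            j = i
            while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1
            for k in range(i, j + 1):
                ranks[order[k]] = avg
            i = j + 1
        return ranks

    def spearman(x, y):
        """Spearman rs = Pearson correlation of the rank vectors."""
        rx, ry = rank(x), rank(y)
        mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
        num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
        den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
        return num / den

    # Hypothetical paired scores: clinical ETRS rating vs accelerometer intensity
    clinical = [1, 2, 2, 3, 4, 4, 5, 6]
    device = [0.8, 1.1, 1.6, 1.9, 2.4, 2.2, 3.0, 3.8]
    print(round(spearman(clinical, device), 2))
    ```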

  11. Quantitative mass spectrometric analysis of glycoproteins combined with enrichment methods.

    Science.gov (United States)

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive and high-throughput analysis of the enriched glycoproteome, in quantitative assays as well as in qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of a certain disease, the development of efficient analysis tools for aberrant glycoproteins is very important for a deep understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes the protein glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable isotope-labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recent published studies. © 2014 The Authors. Mass Spectrometry Reviews Published by Wiley Periodicals, Inc.

  12. Method for the quantitation of steroids in umbilical cord plasma

    International Nuclear Information System (INIS)

    Schindler, A.E.; Sparke, H.

    1975-01-01

    A method for simultaneous quantitation of nine steroids in cord plasma is described which consisted of Amberlite XAD-2 column chromatography at a constant temperature of 45 °C, enzyme hydrolysis with β-glucuronidase/aryl sulfatase, addition of five radioactive internal standards, ethyl acetate extraction, thin-layer chromatography and quantitation by gas-liquid chromatography after trimethylsilyl ether derivative formation. Reliability criteria were established and the following steroid concentrations found: progesterone, 132.1 ± 102.5 μg/100 ml; pregnenolone, 57.3 ± 45.7 μg/100 ml; dehydroepiandrosterone, 46.5 ± 29.4 μg/100 ml; pregnanediol, 67.5 ± 46.6 μg/100 ml; 16-ketoandrostenediol, 19.8 ± 13.7 μg/100 ml; 16α-hydroxydehydroepiandrosterone, 126.3 ± 86.9 μg/100 ml; 16α-hydroxypregnenolone, 78.2 ± 56.5 μg/100 ml; androstenetriol, 22.2 ± 17.5 μg/100 ml and oestriol, 127.7 ± 116.9 μg/100 ml. (author)

  13. Quantitative assessment of contact and non-contact lateral force calibration methods for atomic force microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Tran Khac, Bien Cuong; Chung, Koo-Hyun, E-mail: khchung@ulsan.ac.kr

    2016-02-15

    Atomic Force Microscopy (AFM) has been widely used for measuring friction force at the nano-scale. However, one of the key challenges faced by AFM researchers is to calibrate an AFM system to interpret a lateral force signal as a quantifiable force. In this study, five rectangular cantilevers were used to quantitatively compare three different lateral force calibration methods to demonstrate the legitimacy and to establish confidence in the quantitative integrity of the proposed methods. The Flat-Wedge method is based on a variation of the lateral output on a surface with flat and changing slopes, the Multi-Load Pivot method is based on taking pivot measurements at several locations along the cantilever length, and the Lateral AFM Thermal-Sader method is based on determining the optical lever sensitivity from the thermal noise spectrum of the first torsional mode with a known torsional spring constant from the Sader method. The results of the calibration using the Flat-Wedge and Multi-Load Pivot methods were found to be consistent within experimental uncertainties, and the experimental uncertainties of the two methods were found to be less than 15%. However, the lateral force sensitivity determined by the Lateral AFM Thermal-Sader method was found to be 8–29% smaller than those obtained from the other two methods. This discrepancy decreased to 3–19% when the torsional mode correction factor for an ideal cantilever was used, which suggests that the torsional mode correction should be taken into account to establish confidence in the Lateral AFM Thermal-Sader method. - Highlights: • Quantitative assessment of three lateral force calibration methods for AFM. • Advantages and disadvantages of three different lateral force calibration methods. • Implementation of the Multi-Load Pivot method as a non-contact calibration technique. • The torsional mode correction for the Lateral AFM Thermal-Sader method.

  14. Quantitative assessment of contact and non-contact lateral force calibration methods for atomic force microscopy

    International Nuclear Information System (INIS)

    Tran Khac, Bien Cuong; Chung, Koo-Hyun

    2016-01-01

    Atomic Force Microscopy (AFM) has been widely used for measuring friction force at the nano-scale. However, one of the key challenges faced by AFM researchers is to calibrate an AFM system to interpret a lateral force signal as a quantifiable force. In this study, five rectangular cantilevers were used to quantitatively compare three different lateral force calibration methods to demonstrate the legitimacy and to establish confidence in the quantitative integrity of the proposed methods. The Flat-Wedge method is based on a variation of the lateral output on a surface with flat and changing slopes, the Multi-Load Pivot method is based on taking pivot measurements at several locations along the cantilever length, and the Lateral AFM Thermal-Sader method is based on determining the optical lever sensitivity from the thermal noise spectrum of the first torsional mode with a known torsional spring constant from the Sader method. The results of the calibration using the Flat-Wedge and Multi-Load Pivot methods were found to be consistent within experimental uncertainties, and the experimental uncertainties of the two methods were found to be less than 15%. However, the lateral force sensitivity determined by the Lateral AFM Thermal-Sader method was found to be 8–29% smaller than those obtained from the other two methods. This discrepancy decreased to 3–19% when the torsional mode correction factor for an ideal cantilever was used, which suggests that the torsional mode correction should be taken into account to establish confidence in the Lateral AFM Thermal-Sader method. - Highlights: • Quantitative assessment of three lateral force calibration methods for AFM. • Advantages and disadvantages of three different lateral force calibration methods. • Implementation of the Multi-Load Pivot method as a non-contact calibration technique. • The torsional mode correction for the Lateral AFM Thermal-Sader method.

  15. A 3D global-to-local deformable mesh model based registration and anatomy-constrained segmentation method for image guided prostate radiotherapy

    International Nuclear Information System (INIS)

    Zhou Jinghao; Kim, Sung; Jabbour, Salma; Goyal, Sharad; Haffty, Bruce; Chen, Ting; Levinson, Lydia; Metaxas, Dimitris; Yue, Ning J.

    2010-01-01

    Purpose: In the external beam radiation treatment of prostate cancers, successful implementation of adaptive radiotherapy and conformal radiation dose delivery is highly dependent on precise and expeditious segmentation and registration of the prostate volume between the simulation and the treatment images. The purpose of this study is to develop a novel, fast, and accurate segmentation and registration method to increase the computational efficiency to meet the restricted clinical treatment time requirement in image guided radiotherapy. Methods: The method developed in this study used soft tissues to capture the transformation between the 3D planning CT (pCT) images and 3D cone-beam CT (CBCT) treatment images. The method incorporated a global-to-local deformable mesh model based registration framework as well as an automatic anatomy-constrained robust active shape model (ACRASM) based segmentation algorithm in the 3D CBCT images. The global registration was based on the mutual information method, and the local registration was to minimize the Euclidean distance of the corresponding nodal points from the global transformation of deformable mesh models, which implicitly used the information of the segmented target volume. The method was applied to six data sets of prostate cancer patients. Target volumes delineated by the same radiation oncologist on the pCT and CBCT were chosen as the benchmarks and were compared to the segmented and registered results. The distance-based and the volume-based estimators were used to quantitatively evaluate the results of segmentation and registration. Results: The ACRASM segmentation algorithm was compared to the original active shape model (ASM) algorithm by evaluating the values of the distance-based estimators. With respect to the corresponding benchmarks, the mean distance ranged from -0.85 to 0.84 mm for ACRASM and from -1.44 to 1.17 mm for ASM.
The mean absolute distance ranged from 1.77 to 3.07 mm for ACRASM and from 2.45 to

  16. Quantitative Research Methods in Chaos and Complexity: From Probability to Post Hoc Regression Analyses

    Science.gov (United States)

    Gilstrap, Donald L.

    2013-01-01

    In addition to qualitative methods presented in chaos and complexity theories in educational research, this article addresses quantitative methods that may show potential for future research studies. Although much in the social and behavioral sciences literature has focused on computer simulations, this article explores current chaos and…

  17. Development of a quantitative safety assessment method for nuclear I and C systems including human operators

    International Nuclear Information System (INIS)

    Kim, Man Cheol

    2004-02-01

    Conventional PSA (probabilistic safety analysis) is performed in the framework of event tree analysis and fault tree analysis. In conventional PSA, I and C systems and human operators are assumed to be independent for simplicity. But the dependency of human operators on I and C systems, and the dependency of I and C systems on human operators, are gradually being recognized as significant. I believe that it is time to consider the interdependency between I and C systems and human operators in the framework of PSA. But, unfortunately, it seems that we do not have appropriate methods for incorporating the interdependency between I and C systems and human operators in the framework of PSA. Conventional human reliability analysis (HRA) methods were not developed to consider the interdependency, and the modeling of the interdependency using conventional event tree analysis and fault tree analysis seems to be, even though it does not seem to be impossible, quite complex. To incorporate the interdependency between I and C systems and human operators, we need a new method for HRA and a new method for modeling the I and C systems, man-machine interface (MMI), and human operators for quantitative safety assessment. As a new method for modeling the I and C systems, MMI and human operators, I develop a new system reliability analysis method, reliability graph with general gates (RGGG), which can substitute for conventional fault tree analysis. RGGG is an intuitive and easy-to-use method for system reliability analysis, while as powerful as conventional fault tree analysis. To demonstrate the usefulness of the RGGG method, it is applied to the reliability analysis of the Digital Plant Protection System (DPPS), which is the actual plant protection system of the Ulchin 5 and 6 nuclear power plants located in the Republic of Korea.
The latest version of the fault tree for DPPS, which is developed by the Integrated Safety Assessment team in Korea Atomic Energy Research Institute (KAERI), consists of 64

  18. A quantitative method to analyse an open answer questionnaire: A case study about the Boltzmann Factor

    International Nuclear Information System (INIS)

    Battaglia, Onofrio Rosario; Di Paola, Benedetto

    2015-01-01

    This paper describes a quantitative method to analyse an open-ended questionnaire. Student responses to a specially designed written questionnaire are quantitatively analysed by non-hierarchical clustering, the k-means method. Through this we can characterise students' behaviour with respect to their expertise in formulating explanations for phenomena or processes and/or in using a given model in different contexts. The physics topic is the Boltzmann Factor, which allows the students to have a unifying view of different phenomena in different contexts.
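
    The clustering step can be sketched as follows. The two-dimensional coding of each student response (a correctness score and a model-use score) and the scores themselves are hypothetical, and a deterministic farthest-point seeding replaces random initialization so the toy run is reproducible:

    ```python
    def kmeans2(points, iters=20):
        """Plain two-cluster k-means (Lloyd's algorithm) with deterministic
        farthest-point seeding."""
        def d2(a, b):
            return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

        centers = [points[0], max(points, key=lambda p: d2(p, points[0]))]
        clusters = [[], []]
        for _ in range(iters):
            clusters = [[], []]
            for p in points:  # assignment step: nearest center
                clusters[0 if d2(p, centers[0]) <= d2(p, centers[1]) else 1].append(p)
            centers = [  # update step: cluster means (keep old center if empty)
                (sum(p[0] for p in cl) / len(cl), sum(p[1] for p in cl) / len(cl))
                if cl else centers[i]
                for i, cl in enumerate(clusters)
            ]
        return centers, clusters

    # Hypothetical coded responses: (correctness score, model-use score) per student
    responses = [(0.1, 0.2), (0.2, 0.1), (0.15, 0.3),   # novice-like answers
                 (0.8, 0.9), (0.9, 0.85), (0.85, 0.7)]  # expert-like answers
    centers, clusters = kmeans2(responses)
    print(sorted(len(c) for c in clusters))
    ```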

  19. Nuclear medicine and imaging research. Instrumentation and quantitative methods of evaluation. Progress report, January 15, 1984-January 14, 1985

    International Nuclear Information System (INIS)

    Beck, R.N.; Cooper, M.D.

    1984-09-01

    This program addresses problems involving the basic science and technology of radioactive tracer methods as they relate to nuclear medicine and imaging. The broad goal is to develop new instruments and methods for image formation, processing, quantitation and display, so as to maximize the diagnostic information per unit of absorbed radiation dose to the patient. Project I addresses problems associated with the quantitative imaging of single-photon emitters; Project II addresses similar problems associated with the quantitative imaging of positron emitters; Project III addresses methodological problems associated with the quantitative evaluation of the efficacy of diagnostic imaging procedures

  20. Method of quantitative x-ray diffractometric analysis of Ta-Ta2C system

    International Nuclear Information System (INIS)

    Gavrish, A.A.; Glazunov, M.P.; Korolev, Yu.M.; Spitsyn, V.I.; Fedoseev, G.K.

    1976-01-01

    The system Ta-Ta 2 C has been considered because of specific features of the diffraction patterns of the components, namely, overlapping of the most intensive reflexes of both phases. The method of the standard binary system has been used for quantitative analysis. Because of overlapping of the intensive reflexes d(101)=2.36 Å (Ta 2 C) and d(110)=2.33 Å (Ta), the other, most intensive, reflexes have been used for quantitative determination of Ta 2 C and Ta: d(103)=1.404 Å for tantalum subcarbide and d(211)=1.35 Å for tantalum. Besides, the Ta and Ta 2 C phases have been determined quantitatively with the use of another pair of reflexes: d(102)=1.82 Å for Ta 2 C and d(200)=1.65 Å for tantalum. The agreement between the results obtained while performing the quantitative phase analysis is good. To increase the reliability and accuracy of the quantitative determination of Ta and Ta 2 C, it is expedient to carry out the analysis with the use of the two above-mentioned pairs of reflexes located in different regions of the diffraction spectrum. Thus, the procedure of quantitative analysis of Ta and Ta 2 C in different ratios has been developed, taking into account the specific features of the diffraction patterns of these components as well as the ability of Ta 2 C to texture in the process of preparation
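
    The arithmetic behind a standard binary-system quantification can be sketched as a reference-intensity ratio over one pair of non-overlapping reflexes. The measured intensities and the calibration constant k below are hypothetical illustration values; in practice k would come from a standard mixture of known composition:

    ```python
    def weight_fraction(i_phase1, i_phase2, k):
        """w1 = I1 / (I1 + k * I2) for a two-phase mixture, with k calibrated
        on a standard of known composition."""
        return i_phase1 / (i_phase1 + k * i_phase2)

    # Hypothetical integrated intensities of the d(103) Ta2C and d(211) Ta
    # reflexes (arbitrary units), and a hypothetical calibration constant.
    i_ta2c, i_ta = 820.0, 1150.0
    k = 1.4

    w_ta2c = weight_fraction(i_ta2c, i_ta, k)
    print(round(w_ta2c, 3), round(1 - w_ta2c, 3))
    ```

    Repeating the calculation on the second reflex pair, as the abstract recommends, gives an independent estimate whose agreement with the first is a consistency check.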

  1. Examining Elementary Preservice Teachers’ Self-Efficacy Beliefs: Combination of Quantitative and Qualitative Methods

    Directory of Open Access Journals (Sweden)

    Çiğdem ŞAHİN-TAŞKIN

    2010-04-01

    Full Text Available This study examines elementary preservice teachers' self-efficacy beliefs. Quantitative and qualitative research methods were used in this study. In the quantitative part, data were collected from 122 final-year preservice teachers. The instrument developed by Tschannen-Moran and Woolfolk-Hoy (2001) was administered to the preservice teachers. Findings of the quantitative part revealed that preservice teachers' self-efficacy towards the teaching profession was not fully adequate. There were no differences in preservice teachers' self-efficacy towards teaching with regard to gender or achievement. In the qualitative part of the study, preservice teachers responded to the factors involving Student Engagement and Classroom Management based on experiences they had gained in teaching practice. However, their explanations relied on their theoretical knowledge regarding the Instructional Strategies factor. This could be explained by their lack of experience regarding this factor.

  2. Dual respiratory and cardiac motion estimation in PET imaging: Methods design and quantitative evaluation.

    Science.gov (United States)

    Feng, Tao; Wang, Jizhe; Tsui, Benjamin M W

    2018-04-01

    The goal of this study was to develop and evaluate four post-reconstruction respiratory and cardiac (R&C) motion vector field (MVF) estimation methods for cardiac 4D PET data. In Method 1, the dual R&C motions were estimated directly from the dual R&C gated images. In Method 2, respiratory motion (RM) and cardiac motion (CM) were separately estimated from the respiratory gated only and cardiac gated only images. The effects of RM on CM estimation were modeled in Method 3 by applying an image-based RM correction on the cardiac gated images before CM estimation, while the effects of CM on RM estimation were neglected. Method 4 iteratively models the mutual effects of RM and CM during dual R&C motion estimation. Realistic simulation data were generated for quantitative evaluation of the four methods. Almost noise-free PET projection data were generated from the 4D XCAT phantom with realistic R&C MVF using Monte Carlo simulation. Poisson noise was added to the scaled projection data to generate additional datasets of two more different noise levels. All the projection data were reconstructed using a 4D image reconstruction method to obtain dual R&C gated images. The four dual R&C MVF estimation methods were applied to the dual R&C gated images and the accuracy of motion estimation was quantitatively evaluated using the root mean square error (RMSE) of the estimated MVFs. Results show that among the four estimation methods, Method 2 performed the worst for the noise-free case while Method 1 performed the worst for noisy cases in terms of quantitative accuracy of the estimated MVF. Methods 4 and 3 showed comparable results and achieved RMSE lower by up to 35% than that of Method 1 for noisy cases. In conclusion, we have developed and evaluated four different post-reconstruction R&C MVF estimation methods for use in 4D PET imaging.
Comparison of the performance of four methods on simulated data indicates separate R&C estimation with modeling of RM before CM estimation (Method 3) to be
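
    The accuracy metric used in this evaluation, the RMSE of an estimated motion vector field against ground truth, can be sketched directly; the toy displacement fields below are hypothetical:

    ```python
    import math

    def mvf_rmse(est, true):
        """Root mean square error between two motion vector fields, each a
        list of (dx, dy, dz) voxel displacements."""
        se = sum((e[i] - t[i]) ** 2 for e, t in zip(est, true) for i in range(3))
        return math.sqrt(se / (3 * len(est)))

    # Toy fields (mm displacements); the estimate is the truth plus small errors
    truth = [(1.0, 0.0, 2.0), (0.5, 1.5, 0.0), (2.0, 1.0, 1.0)]
    estimate = [(1.1, 0.0, 1.8), (0.5, 1.4, 0.1), (2.2, 1.0, 1.0)]
    print(round(mvf_rmse(estimate, truth), 4))
    ```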

  3. Qualitative and quantitative methods for human factor analysis and assessment in NPP. Investigations and results

    International Nuclear Information System (INIS)

    Hristova, R.; Kalchev, B.; Atanasov, D.

    2005-01-01

    We consider here two basic groups of methods for analysis and assessment of the human factor in the NPP area and give some results from performed analyses as well. The human factor is the human interaction with the design equipment and with the working environment, taking into account human capabilities and limits. Within the qualitative methods for analysis of the human factor, we consider concepts and structural methods for classifying the information connected with the human factor. Emphasis is given to the HPES method for human factor analysis in NPP. Methods for quantitative assessment of human reliability are considered. These methods allow assigning probabilities to the elements of the already structured information about human performance. This part includes an overview of classical methods for human reliability assessment (HRA, THERP) and of methods taking into account specific information about human capabilities and limits and about the man-machine interface (CHR, HEART, ATHEANA). Quantitative and qualitative results concerning the influence of the human factor on the occurrence of initiating events in the Kozloduy NPP are presented. (authors)

  4. A Stereological Method for the Quantitative Evaluation of Cartilage Repair Tissue

    Science.gov (United States)

    Nyengaard, Jens Randel; Lind, Martin; Spector, Myron

    2015-01-01

    Objective To implement stereological principles to develop an easy applicable algorithm for unbiased and quantitative evaluation of cartilage repair. Design Design-unbiased sampling was performed by systematically sectioning the defect perpendicular to the joint surface in parallel planes providing 7 to 10 hematoxylin–eosin stained histological sections. Counting windows were systematically selected and converted into image files (40-50 per defect). The quantification was performed by two-step point counting: (1) calculation of defect volume and (2) quantitative analysis of tissue composition. Step 2 was performed by assigning each point to one of the following categories based on validated and easy distinguishable morphological characteristics: (1) hyaline cartilage (rounded cells in lacunae in hyaline matrix), (2) fibrocartilage (rounded cells in lacunae in fibrous matrix), (3) fibrous tissue (elongated cells in fibrous tissue), (4) bone, (5) scaffold material, and (6) others. The ability to discriminate between the tissue types was determined using conventional or polarized light microscopy, and the interobserver variability was evaluated. Results We describe the application of the stereological method. In the example, we assessed the defect repair tissue volume to be 4.4 mm3 (CE = 0.01). The tissue fractions were subsequently evaluated. Polarized light illumination of the slides improved discrimination between hyaline cartilage and fibrocartilage and increased the interobserver agreement compared with conventional transmitted light. Conclusion We have applied a design-unbiased method for quantitative evaluation of cartilage repair, and we propose this algorithm as a natural supplement to existing descriptive semiquantitative scoring systems. We also propose that polarized light is effective for discrimination between hyaline cartilage and fibrocartilage. PMID:26069715
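
    The two counting steps described above reduce to simple arithmetic. All numbers below (grid area per point, section spacing, point counts) are hypothetical illustration values; the volume follows the Cavalieri-type estimator V = ΣP × (area per point) × (section spacing):

    ```python
    # Step 1: defect volume from points hitting the repair tissue on each
    # of 8 systematically spaced parallel sections (hypothetical counts).
    area_per_point = 0.04   # mm^2 of grid area associated with each point
    section_spacing = 0.4   # mm between parallel sections
    points_per_section = [10, 14, 18, 20, 19, 15, 9, 5]
    volume = sum(points_per_section) * area_per_point * section_spacing  # mm^3

    # Step 2: classify each counted point by tissue category and report
    # the fraction of the repair tissue occupied by each category.
    counts = {"hyaline": 38, "fibrocartilage": 41, "fibrous": 21, "bone": 10}
    total = sum(counts.values())
    fractions = {name: n / total for name, n in counts.items()}
    print(round(volume, 3), round(fractions["hyaline"], 3))
    ```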

  5. A method to determine the necessity for global signal regression in resting-state fMRI studies.

    Science.gov (United States)

    Chen, Gang; Chen, Guangyu; Xie, Chunming; Ward, B Douglas; Li, Wenjun; Antuono, Piero; Li, Shi-Jiang

    2012-12-01

    In resting-state functional MRI studies, the global signal (operationally defined as the global average of resting-state functional MRI time courses) is often considered a nuisance effect and commonly removed in preprocessing. This global signal regression method can introduce artifacts, such as false anticorrelated resting-state networks in functional connectivity analyses. Therefore, the efficacy of this technique as a correction tool remains questionable. In this article, we establish that the accuracy of the estimated global signal is determined by the level of global noise (i.e., non-neural noise that has a global effect on the resting-state functional MRI signal). When the global noise level is low, the global signal resembles the resting-state functional MRI time courses of the largest cluster, but not those of the global noise. Using real data, we demonstrate that the global signal is strongly correlated with the default mode network components and has biological significance. These results call into question whether or not global signal regression should be applied. We introduce a method to quantify global noise levels, and show that a criterion for global signal regression can be derived from it. By using this criterion, one can determine whether to include or exclude global signal regression so as to minimize errors in functional connectivity measures. Copyright © 2012 Wiley Periodicals, Inc.
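
    The preprocessing step under discussion, computing the global signal and regressing it out of each voxel time course by ordinary least squares, can be sketched as follows; the two toy "voxels" are hypothetical and perfectly share the global component, so regression removes them entirely:

    ```python
    def regress_out_global(voxels):
        """Remove the global mean time course from each voxel series by OLS.
        voxels: list of equal-length time courses (lists of floats)."""
        t = len(voxels[0])
        g = [sum(v[i] for v in voxels) / len(voxels) for i in range(t)]  # global signal
        gm = sum(g) / t
        gc = [x - gm for x in g]                 # demeaned global regressor
        gg = sum(x * x for x in gc)
        cleaned = []
        for v in voxels:
            vm = sum(v) / t
            beta = sum((v[i] - vm) * gc[i] for i in range(t)) / gg  # OLS slope
            cleaned.append([v[i] - vm - beta * gc[i] for i in range(t)])
        return g, cleaned

    # Toy data: two "voxels" that are scaled copies of one shared drift
    voxels = [[1.0, 2.0, 3.0, 4.0], [2.0, 4.0, 6.0, 8.0]]
    g, cleaned = regress_out_global(voxels)
    print(max(abs(x) for row in cleaned for x in row) < 1e-9)
    ```

    The cleaned residuals vanish here precisely because every voxel is fully explained by the global signal; whether that removal is appropriate on real data is exactly the question the abstract's criterion addresses.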

  6. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

    Full Text Available Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations-COSO, 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957 lament that although the introduction of risk quantification to enhance the degree of objectivity in finance, for instance, paralleled its development in the manufacturing industry, the same is not true in Higher Education Institutions (HEIs. In this regard, the objective of the paper was to demonstrate the methods and process of Quantitative Risk Analysis (QRA through the likelihood of occurrence of risk (phase I. This paper serves as the first of a two-phased study, which sampled one hundred (100 risk analysts at a university in the greater Eastern Cape Province of South Africa. The analysis of likelihood of occurrence of risk by logistic regression and percentages was conducted to investigate whether or not there was a significant difference between groups (analysts in respect of QRA. The Hosmer and Lemeshow test was non-significant with a chi-square (χ² = 8.181; p = 0.300, which indicated a good model fit, since the data did not deviate significantly from the model. The study concluded that to derive an overall likelihood rating indicating the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1 threat source motivation and capability, (2 nature of the vulnerability, and (3 existence and effectiveness of current controls (methods and process.
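
    The goodness-of-fit test cited above, Hosmer-Lemeshow, can be sketched with stdlib Python. The toy data are hypothetical and perfectly calibrated by construction, and the closed-form chi-square survival function used here is only valid for an even number of degrees of freedom (so the number of groups must be even):

    ```python
    import math

    def hosmer_lemeshow(probs, outcomes, groups=4):
        """Hosmer-Lemeshow statistic: chi-square over risk-ordered groups,
        with a closed-form p-value valid for even df = groups - 2."""
        pairs = sorted(zip(probs, outcomes))
        n, chi2 = len(pairs), 0.0
        for g in range(groups):
            chunk = pairs[g * n // groups:(g + 1) * n // groups]
            obs = sum(y for _, y in chunk)   # observed events in the group
            exp = sum(p for p, _ in chunk)   # expected events in the group
            m = len(chunk)
            chi2 += (obs - exp) ** 2 / (exp * (1 - exp / m))
        k = (groups - 2) // 2
        p = math.exp(-chi2 / 2) * sum((chi2 / 2) ** i / math.factorial(i) for i in range(k))
        return chi2, p

    # Perfectly calibrated toy data: in each risk stratum the event rate
    # equals the predicted probability, so the test cannot reject the model.
    probs = [0.2] * 5 + [0.4] * 5 + [0.6] * 5 + [0.8] * 5
    outcomes = [1, 0, 0, 0, 0] + [1, 1, 0, 0, 0] + [1, 1, 1, 0, 0] + [1, 1, 1, 1, 0]
    chi2, p = hosmer_lemeshow(probs, outcomes)
    print(round(chi2, 6), round(p, 6))
    ```

    A non-significant result (large p), as in the study's χ² = 8.181 with p = 0.300, means the logistic model's predicted risks match the observed group rates acceptably.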

  7. Are three generations of quantitative molecular methods sufficient in medical virology? Brief review.

    Science.gov (United States)

    Clementi, Massimo; Bagnarelli, Patrizia

    2015-10-01

    In the last two decades, the development of quantitative molecular methods has characterized the evolution of clinical virology more than any other methodological advancement. Using these methods, a great number of studies have efficiently addressed in vivo the role of viral load, viral replication activity, and viral transcriptional profiles as correlates of disease outcome and progression, and have highlighted the physiopathology of important virus diseases of humans. Furthermore, these studies have contributed to a better understanding of virus-host interactions and have sharply revolutionized the research strategies in basic and medical virology. In addition, and importantly from a medical point of view, quantitative methods have provided a rationale for therapeutic intervention and therapy monitoring in medically important viral diseases. Despite the advances in technology and the development of three generations of molecular methods within the last two decades (competitive PCR, real-time PCR, and digital PCR), great challenges still remain for viral testing, related not only to standardization, accuracy, and precision, but also to the selection of the best molecular targets for clinical use and to the identification of thresholds for risk stratification and therapeutic decisions. Future research directions, novel methods and technical improvements could be important to address these challenges.

  8. Quantitative autoradiography of neurochemicals

    International Nuclear Information System (INIS)

    Rainbow, T.C.; Biegon, A.; Bleisch, W.V.

    1982-01-01

    Several new methods have been developed that apply quantitative autoradiography to neurochemistry. These methods are derived from the 2-deoxyglucose (2DG) technique of Sokoloff (1), which uses quantitative autoradiography to measure the rate of glucose utilization in brain structures. The new methods allow the measurement of the rate of cerebral protein synthesis and the levels of particular neurotransmitter receptors by quantitative autoradiography. As with the 2DG method, the new techniques can measure molecular levels in micron-sized brain structures, and can be used in conjunction with computerized systems of image processing. It is possible that many neurochemical measurements could be made by computerized analysis of quantitative autoradiograms.

  9. Quantitative assessment of breast density: comparison of different methods

    International Nuclear Information System (INIS)

    Qin Naishan; Guo Li; Dang Yi; Song Luxin; Wang Xiaoying

    2011-01-01

    Objective: To compare different methods of quantitative breast density measurement. Methods: The study included sixty patients who underwent both mammography and breast MRI. Breast density was computed automatically on digital mammograms with an R2 workstation. Two experienced radiologists read the mammograms and assessed breast density with the Wolfe and ACR classifications, respectively. A fuzzy C-means clustering algorithm (FCM) was used to assess breast density on MRI. Each assessment method was repeated after 2 weeks. Spearman and Pearson correlations of inter- and intrareader and intermodality density estimates were computed. Results: Inter- and intrareader correlations of the Wolfe classification were 0.74 and 0.65, and those of the ACR classification were 0.74 and 0.82, respectively. The correlation between the Wolfe and ACR classifications was 0.77. A high interreader correlation of 0.98 and an intrareader correlation of 0.96 were observed with the MR FCM measurement. The correlation between digital mammograms and MRI in the assessment of breast density was also high (r=0.81, P<0.01). Conclusion: The high correlation between breast density estimates on digital mammograms and MRI FCM suggested the former could be used as a simple and accurate method. (authors)

  10. Validation of quantitative 1H NMR method for the analysis of pharmaceutical formulations

    International Nuclear Information System (INIS)

    Santos, Maiara da S.

    2013-01-01

    The need for effective and reliable quality control in products from pharmaceutical industries renders the analyses of their active ingredients and constituents of great importance. This study presents the theoretical basis of ¹H NMR for quantitative analyses and an example of the method validation according to Resolution RE N. 899 by the Brazilian National Health Surveillance Agency (ANVISA), in which the compound paracetamol was the active ingredient. All evaluated parameters (selectivity, linearity, accuracy, repeatability and robustness) showed satisfactory results. It was concluded that a single NMR measurement provides structural and quantitative information of active components and excipients in the sample. (author)

  11. A novel method for quantitative geosteering using azimuthal gamma-ray logging.

    Science.gov (United States)

    Yuan, Chao; Zhou, Cancan; Zhang, Feng; Hu, Song; Li, Chaoliu

    2015-02-01

    A novel method for quantitative geosteering using azimuthal gamma-ray logging is proposed. Real-time up and bottom gamma-ray logs, recorded as a logging tool travels through a boundary surface at different relative dip angles, are simulated with the Monte Carlo method. The results show that the response points of the up and bottom gamma-ray logs as the tool moves towards a highly radioactive formation can be used to predict the relative dip angle, from which the distance from the drill bit to the boundary surface is then calculated. Copyright © 2014 Elsevier Ltd. All rights reserved.

  12. Quantitative Methods Intervention: What Do the Students Want?

    Science.gov (United States)

    Frankland, Lianne; Harrison, Jacqui

    2016-01-01

    The shortage of social science graduates with competent quantitative skills jeopardises the UK's competitive economy, the effectiveness of public policy making, and the UK's status as a world leader in higher education and research (British Academy for Humanities and Social Sciences, 2012). There is a growing demand for quantitative skills across all…

  13. Comparison of salivary collection and processing methods for quantitative HHV-8 detection.

    Science.gov (United States)

    Speicher, D J; Johnson, N W

    2014-10-01

    Saliva is a proven diagnostic fluid for the qualitative detection of infectious agents, but the accuracy of viral load determinations is unknown. Stabilising fluids impede nucleic acid degradation, compared with collection onto ice and then freezing, and we have shown that the DNA Genotek P-021 prototype kit (P-021) can produce high-quality DNA after 14 months of storage at room temperature. Here we evaluate the quantitative capability of 10 collection/processing methods. Unstimulated whole mouth fluid was spiked with a mixture of HHV-8 cloned constructs, 10-fold serial dilutions were produced, and samples were extracted and then examined with quantitative PCR (qPCR). Calibration curves were compared by linear regression and qPCR dynamics. All methods extracted with commercial spin columns produced linear calibration curves with a large dynamic range and gave accurate viral loads. Ethanol precipitation of the P-021 does not produce a linear standard curve, and virus is lost in the cell pellet. DNA extractions from the P-021 using commercial spin columns produced linear standard curves with a wide dynamic range and an excellent limit of detection. When extracted with spin columns, the P-021 enables accurate viral loads down to 23 copies μl⁻¹ DNA. The quantitative and long-term storage capability of this system makes it ideal for the study of salivary DNA viruses in resource-poor settings. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  14. Quantitative Methods to Evaluate Timetable Attractiveness

    DEFF Research Database (Denmark)

    Schittenhelm, Bernd; Landex, Alex

    2009-01-01

    The article describes how the attractiveness of timetables can be evaluated quantitatively to ensure a consistent evaluation of timetables. Since the different key stakeholders (infrastructure manager, train operating company, customers, and society) have different opinions on what an attractive...

  15. Quantitative sacroiliac scintigraphy. The effect of method of selection of region of interest

    International Nuclear Information System (INIS)

    Davis, M.C.; Turner, D.A.; Charters, J.R.; Golden, H.E.; Ali, A.; Fordham, E.W.

    1984-01-01

    Various authors have advocated quantitative methods of evaluating bone scintigrams to detect sacroiliitis, while others have not found them useful. Many explanations for this disagreement have been offered, including differences in the method of case selection, ethnicity, gender, and previous drug therapy. It would appear that one of the most important impediments to consistent results is the variability in selecting sacroiliac joint and reference regions of interest (ROIs). The effect of ROI selection seems particularly important because of the normal variability of radioactivity within the reference regions that have been used (sacrum, spine, iliac wing) and the inhomogeneity of activity in the SI joints. We have investigated the effect of ROI selection using five different methods representative of, though not necessarily identical to, those found in the literature. Each method produced unique mean indices that differed between patients with ankylosing spondylitis (AS) and controls. The method of Ayres (19) proved superior (largest mean difference, smallest variance), but none worked well as a diagnostic tool because of substantial overlap of the distributions of indices of the patient and control groups. We conclude that ROI selection is important in determining results, and that quantitative scintigraphic methods in general are not effective tools for diagnosing AS. Among the possible factors limiting success, difficulty in selecting a stable reference area seems of particular importance.

  16. Soil fungal community responses to global changes

    DEFF Research Database (Denmark)

    Haugwitz, Merian Skouw

    Global change will affect the functioning and structure of terrestrial ecosystems and since soil fungi are key players in organic matter decomposition and nutrient turnover, shifts in fungal community composition might have a strong impact on soil functioning. The main focus of this thesis...... was therefore to investigate the impact of global environmental changes on soil fungal communities in a temperate and subartic heath ecosystem. The objective was further to determine global change effects on major functional groups of fungi and analyze the influence of fungal community changes on soil carbon...... and nutrient availability and storage. By combining molecular methods such as 454 pyrosequencing and quantitative PCR of fungal ITS amplicons with analyses of soil enzymes, nutrient pools of carbon, nitrogen and phosphorus we were able to characterize soil fungal communities as well as their impact on nutrient...

  17. A quantitative method for assessing resilience of interdependent infrastructures

    International Nuclear Information System (INIS)

    Nan, Cen; Sansavini, Giovanni

    2017-01-01

    The importance of understanding system resilience and identifying ways to enhance it, especially for interdependent infrastructures our daily life depends on, has been recognized not only by academics, but also by the corporate and public sectors. During recent years, several methods and frameworks have been proposed and developed to explore applicable techniques to assess and analyze system resilience in a comprehensive way. However, they are often tailored to specific disruptive hazards/events, or fail to properly include all the phases such as absorption, adaptation, and recovery. In this paper, a quantitative method for the assessment of the system resilience is proposed. The method consists of two components: an integrated metric for system resilience quantification and a hybrid modeling approach for representing the failure behavior of infrastructure systems. The feasibility and applicability of the proposed method are tested using an electric power supply system as the exemplary infrastructure. Simulation results highlight that the method proves effective in designing, engineering and improving the resilience of infrastructures. Finally, system resilience is proposed as a proxy to quantify the coupling strength between interdependent infrastructures. - Highlights: • A method for quantifying resilience of interdependent infrastructures is proposed. • It combines multi-layer hybrid modeling and a time-dependent resilience metric. • The feasibility of the proposed method is tested on the electric power supply system. • The method provides insights to decision-makers for strengthening system resilience. • Resilience capabilities can be used to engineer interdependencies between subsystems.
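
    The paper's integrated resilience metric is not reproduced in the abstract. A common time-dependent proxy, shown here as a hedged sketch rather than the authors' formulation, is the ratio of the area under the measured performance curve to the area under the target performance across the absorption-adaptation-recovery window:

    ```python
    def resilience(times, performance, target=1.0):
        """Area-ratio resilience proxy: integral of measured performance
        divided by integral of target performance over the event window
        (trapezoidal rule)."""
        num = den = 0.0
        for i in range(1, len(times)):
            dt = times[i] - times[i - 1]
            num += 0.5 * (performance[i] + performance[i - 1]) * dt
            den += target * dt
        return num / den

    # Hypothetical profile: disruption at t=2, full recovery by t=8.
    t = [0, 2, 3, 5, 8, 10]
    p = [1.0, 1.0, 0.4, 0.6, 1.0, 1.0]
    print(round(resilience(t, p), 3))  # → 0.81
    ```

    A value of 1.0 would mean performance never dropped below target; deeper or longer disruptions pull the ratio down.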

  18. The laboratory of quantitative methods in historic monument research at the CTU Prague

    International Nuclear Information System (INIS)

    Musilek, L.; Cechak, T.; Kubelik, M.; Pavelka, K.; Pavlik, M.

    2001-01-01

    A 'Laboratory of Quantitative Methods in Historic Monument Research' has been established at the Department of Dosimetry and Application of Ionizing Radiation of the CTU Prague. Its primary orientation is the investigation of historic architecture, although other objects of art can also be investigated. In the first phase, one investigative method was established for each of the above groups: X-ray fluorescence as the analytic method, thermoluminescence for dating and photogrammetry for surveying. The first results demonstrate the need and usefulness of these methods for investigations in the rich architectural heritage of the Czech Republic.

  19. A Quantitative Method for Long-Term Water Erosion Impacts on Productivity with a Lack of Field Experiments: A Case Study in Huaihe Watershed, China

    Directory of Open Access Journals (Sweden)

    Degen Lin

    2016-07-01

    Full Text Available Water erosion reduces farmland productivity, and with longer periods of cultivation agricultural productivity becomes increasingly vulnerable. The vulnerability of farmland productivity to long-term water erosion therefore needs assessment. The key to quantitative assessment is a quantitative method, with water-loss scenarios, for calculating productivity losses due to long-term water erosion. This study uses the Agricultural Policy Environmental eXtender (APEX) model and the global hydrological watershed unit, selecting the Huaihe River watershed as a case study to describe the methodology. An erosion-variable control method considering soil and water conservation measure scenarios was used to study the relationship between long-term erosion and productivity losses, fitting a 3D surface over three elements (time, the cumulative amount of water erosion, and productivity losses) to measure long-term water erosion. Results showed that: (1) the 3D surfaces fit significantly well, and fitting by the 3D surface reflects the impact of long-term water erosion on productivity more accurately than fitting by a 2D curve over two elements (water erosion and productivity losses); (2) the cumulative loss surface can reflect differences in productivity loss caused by long-term water erosion.

  20. Implementation of a quantitative Foucault knife-edge method by means of isophotometry

    Science.gov (United States)

    Zhevlakov, A. P.; Zatsepina, M. E.; Kirillovskii, V. K.

    2014-06-01

    Detailed description of stages of computer processing of the shadowgrams during implementation of a modern quantitative Foucault knife-edge method is presented. The map of wave-front aberrations introduced by errors of an optical surface or a system, along with the results of calculation of the set of required characteristics of image quality, are shown.

  1. Integrating Quantitative and Qualitative Data in Mixed Methods Research--Challenges and Benefits

    Science.gov (United States)

    Almalki, Sami

    2016-01-01

    This paper is concerned with investigating the integration of quantitative and qualitative data in mixed methods research and whether, in spite of its challenges, it can be of positive benefit to many investigative studies. The paper introduces the topic, defines the terms with which this subject deals and undertakes a literature review to outline…

  2. Improving statistical inference on pathogen densities estimated by quantitative molecular methods: malaria gametocytaemia as a case study.

    Science.gov (United States)

    Walker, Martin; Basáñez, María-Gloria; Ouédraogo, André Lin; Hermsen, Cornelus; Bousema, Teun; Churcher, Thomas S

    2015-01-16

    Quantitative molecular methods (QMMs) such as quantitative real-time polymerase chain reaction (q-PCR), reverse-transcriptase PCR (qRT-PCR) and quantitative nucleic acid sequence-based amplification (QT-NASBA) are increasingly used to estimate pathogen density in a variety of clinical and epidemiological contexts. These methods are often classified as semi-quantitative, yet estimates of reliability or sensitivity are seldom reported. Here, a statistical framework is developed for assessing the reliability (uncertainty) of pathogen densities estimated using QMMs and the associated diagnostic sensitivity. The method is illustrated with quantification of Plasmodium falciparum gametocytaemia by QT-NASBA. The reliability of pathogen (e.g. gametocyte) densities, and the accompanying diagnostic sensitivity, estimated by two contrasting statistical calibration techniques, are compared; a traditional method and a mixed model Bayesian approach. The latter accounts for statistical dependence of QMM assays run under identical laboratory protocols and permits structural modelling of experimental measurements, allowing precision to vary with pathogen density. Traditional calibration cannot account for inter-assay variability arising from imperfect QMMs and generates estimates of pathogen density that have poor reliability, are variable among assays and inaccurately reflect diagnostic sensitivity. The Bayesian mixed model approach assimilates information from replica QMM assays, improving reliability and inter-assay homogeneity, providing an accurate appraisal of quantitative and diagnostic performance. Bayesian mixed model statistical calibration supersedes traditional techniques in the context of QMM-derived estimates of pathogen density, offering the potential to improve substantially the depth and quality of clinical and epidemiological inference for a wide variety of pathogens.
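
    For context, the "traditional" calibration the abstract contrasts with the Bayesian mixed-model approach fits a standard curve on a dilution series and inverts it for unknown samples. A minimal sketch with a hypothetical 10-fold dilution series (an ideal-efficiency slope of about -3.3 Ct per decade); the numbers are invented, not from the study:

    ```python
    def fit_standard_curve(log10_density, ct):
        """Ordinary least-squares fit ct = a + b * log10(density):
        the traditional single-curve calibration."""
        n = len(ct)
        mx, my = sum(log10_density) / n, sum(ct) / n
        b = sum((x - mx) * (y - my) for x, y in zip(log10_density, ct)) / \
            sum((x - mx) ** 2 for x in log10_density)
        return my - b * mx, b  # intercept a, slope b

    def estimate_density(a, b, ct_obs):
        """Invert the fitted curve to estimate pathogen density."""
        return 10 ** ((ct_obs - a) / b)

    # Hypothetical dilution standards: densities 10^1..10^5 and Ct values.
    logs = [1, 2, 3, 4, 5]
    cts = [33.2, 29.9, 26.6, 23.3, 20.0]
    a, b = fit_standard_curve(logs, cts)
    print(round(estimate_density(a, b, 28.25), 1))
    ```

    This single-curve inversion is exactly what cannot express inter-assay variability, which is the abstract's motivation for the Bayesian mixed-model alternative.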

  3. An Optimal Method for Developing Global Supply Chain Management System

    Directory of Open Access Journals (Sweden)

    Hao-Chun Lu

    2013-01-01

    Full Text Available Owing to the transparency of supply chains, enhancing the competitiveness of industries has become a vital factor. Therefore, many developing countries look for possible methods to save costs. From this point of view, this study deals with the complicated liberalization policies in the global supply chain management system and proposes a mathematical model, via flow-control constraints, which is utilized to cope with bonded warehouses for obtaining maximal profits. Numerical experiments illustrate that the proposed model can be effectively solved to obtain the optimal profits in the global supply chain environment.

  4. New 'ex vivo' radioisotopic method of quantitation of platelet deposition

    International Nuclear Information System (INIS)

    Badimon, L.; Mayo Clinic, Rochester, MN; Thrombosis and Atherosclerosis Unit, Barcelona; Mayo Clinic, Rochester, MN; Fuster, V.; Chesebro, J.H.; Dewanjee, M.K.

    1983-01-01

    We have developed a sensitive and quantitative method for the 'ex vivo' evaluation of platelet deposition on collagen strips from rabbit Achilles tendon superfused by flowing blood, and applied it to four animal species: cat, rabbit, dog and pig. Autologous platelets were labeled with indium-111-tropolone and injected into the animal 24 hr before superfusion, and the number of deposited platelets was quantitated from the tendon gamma-radiation and the blood platelet count. We detected some platelet consumption with superfusion time when blood was reinfused into the contralateral jugular vein after collagen contact, but not if blood was discarded after the contact. Therefore, in order to have a more physiological animal model, we decided to discard blood after superfusion of the tendon. In all species except the cat there was a linear relationship between the increase of platelets on the tendon and the time of exposure to blood superfusion. The highest number of platelets deposited on the collagen was found in cats, the lowest in dogs. Ultrastructural analysis showed that the platelets were deposited as aggregates after only 5 min of superfusion. (orig.)

  5. The development of quantitative determination method of organic acids in complex poly herbal extraction

    Directory of Open Access Journals (Sweden)

    I. L. Dyachok

    2016-08-01

    Full Text Available Aim. The development of a sensitive, economical and rapid method for the quantitative determination of organic acids, calculated as isovaleric acid, in a complex poly-herbal extract, with the use of digital technologies. Materials and methods. A model complex poly-herbal extract with sedative action was chosen as the research object. The extract is composed of the following medicinal plants: Valeriana officinalis L., Crataegus, Melissa officinalis L., Hypericum, Mentha piperita L., Humulus lupulus, Viburnum. Based on the chemical composition of the plant components, the main pharmacologically active compounds of the extract are: polyphenolic substances (flavonoids), contained in Crataegus, Viburnum, Hypericum, Mentha piperita L. and Humulus lupulus; organic acids, including isovaleric acid, contained in Valeriana officinalis L., Mentha piperita L., Melissa officinalis L. and Viburnum; and amino acids, contained in Valeriana officinalis L. For the determination of organic acids at low concentration we applied an instrumental method of analysis, namely conductometric titration, based on the dependence of the conductivity of an aqueous solution of the complex poly-herbal extract on its organic acid content. Results. The obtained analytical dependences, which describe the tangent lines to the conductometric curve before and after the equivalence point, make it possible to determine the volume of titrant consumed and to carry out the quantitative determination of organic acids in digital mode. Conclusion. The proposed method enables determination of the equivalence point and quantitative determination of organic acids, calculated as isovaleric acid, with the use of digital technologies, which allows the method as a whole to be computerized.
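
    The procedure described (fitting tangent lines to the conductometric curve before and after the equivalence point and intersecting them) can be sketched numerically; the conductance readings below are hypothetical:

    ```python
    def line_fit(xs, ys):
        """Least-squares intercept and slope for one branch of the curve."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
        return my - b * mx, b

    def equivalence_point(before, after):
        """Intersect the tangent lines fitted before and after the
        equivalence point to get the titrant volume consumed."""
        a1, b1 = line_fit(*before)
        a2, b2 = line_fit(*after)
        return (a2 - a1) / (b1 - b2)

    # Hypothetical conductance (mS) vs titrant volume (ml) readings.
    before = ([0.0, 1.0, 2.0, 3.0], [5.0, 4.0, 3.0, 2.0])  # falling branch
    after = ([5.0, 6.0, 7.0, 8.0], [3.0, 4.5, 6.0, 7.5])   # rising branch
    print(round(equivalence_point(before, after), 2))  # → 3.8
    ```

    The intersection at 3.8 ml is the titrant volume from which the organic acid content, calculated as isovaleric acid, would follow.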

  6. Hooke–Jeeves Method-used Local Search in a Hybrid Global Optimization Algorithm

    Directory of Open Access Journals (Sweden)

    V. D. Sulimov

    2014-01-01

    Full Text Available Modern methods for the optimization-based investigation of complex systems rely on developing and updating mathematical models of those systems by solving the appropriate inverse problems. The input data required for a solution are obtained from the analysis of experimentally determined characteristics of a system or process. The sought quantities are causal characteristics, such as the coefficients of the equations of the mathematical model of the object, boundary conditions, etc. The optimization approach is one of the main ways of solving inverse problems. In the general case it is necessary to find a global extremum of a criterion function that is not everywhere differentiable. Global optimization methods are widely used in problems of identification and computational diagnostics, as well as in optimal control, computed tomography, image restoration, neural network training, and other intelligent technologies. The increasingly complicated optimization problems observed over recent decades lead to more complicated mathematical models, making the solution of the corresponding extremal problems significantly more difficult. Many practical applications impose conditions that restrict modeling. As a consequence, in inverse problems the criterion functions can be non-differentiable and noisy. Noise means that calculating derivatives is difficult and unreliable, which leads to the use of optimization methods that do not require derivatives. The efficiency of deterministic global optimization algorithms is significantly restricted by their dependence on the dimension of the extremal problem. When the number of variables is large, stochastic global optimization algorithms are used. Since stochastic algorithms can be computationally too expensive, this drawback restricts their application. This motivates hybrid algorithms that combine a stochastic algorithm for scanning the variable space with deterministic local search
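
    The Hooke-Jeeves local search named in the title is a derivative-free pattern search, which suits the noisy, non-differentiable criterion functions described above. A minimal sketch of the classic method (not the authors' hybrid algorithm):

    ```python
    def hooke_jeeves(f, x0, step=0.5, shrink=0.5, tol=1e-6, max_iter=10_000):
        """Hooke-Jeeves pattern search: exploratory moves along each
        coordinate, then a pattern move through the improved point;
        the step shrinks when no move helps."""
        def explore(base, fbase, h):
            x, fx = list(base), fbase
            for i in range(len(x)):
                for d in (h, -h):
                    trial = list(x)
                    trial[i] += d
                    ft = f(trial)
                    if ft < fx:
                        x, fx = trial, ft
                        break
            return x, fx

        x, fx = list(x0), f(x0)
        it = 0
        while step > tol and it < max_iter:
            it += 1
            xn, fn = explore(x, fx, step)
            if fn < fx:
                # Pattern move: jump further along the successful direction.
                pattern = [2 * n - o for n, o in zip(xn, x)]
                fp = f(pattern)
                x, fx = (pattern, fp) if fp < fn else (xn, fn)
            else:
                step *= shrink
        return x, fx

    # Minimize a smooth quadratic with minimum at (1, -2) (test case).
    xmin, fmin = hooke_jeeves(lambda v: (v[0] - 1) ** 2 + 4 * (v[1] + 2) ** 2,
                              [0.0, 0.0])
    print([round(c, 3) for c in xmin])
    ```

    Because only function values are compared, no derivatives are ever computed, which is precisely why such methods survive the noisy criterion functions discussed in the abstract.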

  7. Global Mortality Impact of the 1957-1959 Influenza Pandemic

    DEFF Research Database (Denmark)

    Viboud, Cécile; Simonsen, Lone; Fuentes, Rodrigo

    2016-01-01

    BACKGROUND: Quantitative estimates of the global burden of the 1957 influenza pandemic are lacking. Here we fill this gap by modeling historical mortality statistics. METHODS: We used annual rates of age- and cause-specific deaths to estimate pandemic-related mortality in excess of background...... levels in 39 countries in Europe, the Asia-Pacific region, and the Americas. We modeled the relationship between excess mortality and development indicators to extrapolate the global burden of the pandemic. RESULTS: The pandemic-associated excess respiratory mortality rate was 1.9/10,000 population (95...... excess deaths (95% CI, .7 million-1.5 million excess deaths) globally to the 1957-1959 pandemic. CONCLUSIONS: The global mortality rate of the 1957-1959 influenza pandemic was moderate relative to that of the 1918 pandemic but was approximately 10-fold greater than that of the 2009 pandemic. The impact...

  8. Quantitative methods of data analysis for the physical sciences and engineering

    CERN Document Server

    Martinson, Douglas G

    2018-01-01

    This book provides thorough and comprehensive coverage of most of the new and important quantitative methods of data analysis for graduate students and practitioners. In recent years, data analysis methods have exploded alongside advanced computing power, and it is critical to understand such methods to get the most out of data, and to extract signal from noise. The book excels in explaining difficult concepts through simple explanations and detailed explanatory illustrations. Most unique is the focus on confidence limits for power spectra and their proper interpretation, something rare or completely missing in other books. Likewise, there is a thorough discussion of how to assess uncertainty via use of Expectancy, and the easy to apply and understand Bootstrap method. The book is written so that descriptions of each method are as self-contained as possible. Many examples are presented to clarify interpretations, as are user tips in highlighted boxes.
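
    The bootstrap method the book highlights can be shown in a few lines: resample the data with replacement many times, recompute the statistic, and take percentiles of the resulting distribution as confidence limits. A hedged sketch (the data are invented):

    ```python
    import random

    def bootstrap_ci(data, stat, n_boot=5000, alpha=0.05, seed=42):
        """Percentile bootstrap: resample with replacement, recompute
        the statistic, and read the CI off the empirical distribution."""
        rng = random.Random(seed)
        reps = sorted(
            stat([rng.choice(data) for _ in data]) for _ in range(n_boot)
        )
        lo = reps[int(alpha / 2 * n_boot)]
        hi = reps[int((1 - alpha / 2) * n_boot) - 1]
        return lo, hi

    sample = [2.1, 2.4, 1.9, 2.8, 3.0, 2.2, 2.6, 2.0, 2.7, 2.3]
    mean = lambda xs: sum(xs) / len(xs)
    lo, hi = bootstrap_ci(sample, mean)
    print(round(lo, 2), round(hi, 2))
    ```

    The appeal, as the book notes, is that the same recipe works for statistics whose sampling distributions have no convenient closed form.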

  9. Quantitative comparison of analysis methods for spectroscopic optical coherence tomography: reply to comment

    NARCIS (Netherlands)

    Bosschaart, Nienke; van Leeuwen, Ton; Aalders, Maurice C.G.; Faber, Dirk

    2014-01-01

    We reply to the comment by Kraszewski et al on “Quantitative comparison of analysis methods for spectroscopic optical coherence tomography.” We present additional simulations evaluating the proposed window function. We conclude that our simulations show good qualitative agreement with the results of

  10. Can You Repeat That Please?: Using Monte Carlo Simulation in Graduate Quantitative Research Methods Classes

    Science.gov (United States)

    Carsey, Thomas M.; Harden, Jeffrey J.

    2015-01-01

    Graduate students in political science come to the discipline interested in exploring important political questions, such as "What causes war?" or "What policies promote economic growth?" However, they typically do not arrive prepared to address those questions using quantitative methods. Graduate methods instructors must…
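
    A classroom-style Monte Carlo simulation of the kind discussed here draws many samples from a known population and inspects the sampling distribution of an estimator; for example, the standard deviation of sample means should approach σ/√n (0.2 for σ = 1, n = 25). A minimal sketch with invented settings:

    ```python
    import random
    import statistics

    def sampling_distribution(n, reps, seed=1):
        """Monte Carlo: draw `reps` samples of size `n` from a standard
        normal population and collect the sample means, making the
        abstract idea of a sampling distribution concrete."""
        rng = random.Random(seed)
        return [statistics.mean(rng.gauss(0, 1) for _ in range(n))
                for _ in range(reps)]

    means = sampling_distribution(n=25, reps=2000)
    # The spread of the means should be close to sigma/sqrt(n) = 0.2.
    print(round(statistics.stdev(means), 2))
    ```

    Repeating the experiment with different `n` lets students watch the standard error shrink at the 1/√n rate rather than take it on faith.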

  11. Qualitative Methods Can Enrich Quantitative Research on Occupational Stress: An Example from One Occupational Group

    Science.gov (United States)

    Schonfeld, Irvin Sam; Farrell, Edwin

    2010-01-01

    The chapter examines the ways in which qualitative and quantitative methods support each other in research on occupational stress. Qualitative methods include eliciting from workers unconstrained descriptions of work experiences, careful first-hand observations of the workplace, and participant-observers describing "from the inside" a…

  12. Quantitative analysis of γ–oryzanol content in cold pressed rice bran oil by TLC–image analysis method

    Directory of Open Access Journals (Sweden)

    Apirak Sakunpak

    2014-02-01

    Conclusions: The TLC-densitometric and TLC-image analysis methods provided similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between the TLC-densitometric and TLC-image analysis methods. As both methods were found to be equivalent, either can therefore be used for the determination of γ-oryzanol in cold pressed rice bran oil.

  13. Qualitative and quantitative determination of ubiquinones by the method of high-efficiency liquid chromatography

    International Nuclear Information System (INIS)

    Yanotovskii, M.T.; Mogilevskaya, M.P.; Obol'nikova, E.A.; Kogan, L.M.; Samokhvalov, G.I.

    1986-01-01

    A method has been developed for the qualitative and quantitative determination of the ubiquinones CoQ6-CoQ10, using high-efficiency reversed-phase liquid chromatography. Tocopherol acetate was used as the internal standard.

  14. Quantitative rotating frame relaxometry methods in MRI.

    Science.gov (United States)

    Gilani, Irtiza Ali; Sepponen, Raimo

    2016-06-01

    Macromolecular degeneration and biochemical changes in tissue can be quantified using rotating frame relaxometry in MRI. It has been shown in several studies that the rotating frame longitudinal relaxation rate constant (R1ρ ) and the rotating frame transverse relaxation rate constant (R2ρ ) are sensitive biomarkers of phenomena at the cellular level. In this comprehensive review, existing MRI methods for probing the biophysical mechanisms that affect the rotating frame relaxation rates of the tissue (i.e. R1ρ and R2ρ ) are presented. Long acquisition times and high radiofrequency (RF) energy deposition into tissue during the process of spin-locking in rotating frame relaxometry are the major barriers to the establishment of these relaxation contrasts at high magnetic fields. Therefore, clinical applications of R1ρ and R2ρ MRI using on- or off-resonance RF excitation methods remain challenging. Accordingly, this review describes the theoretical and experimental approaches to the design of hard RF pulse cluster- and adiabatic RF pulse-based excitation schemes for accurate and precise measurements of R1ρ and R2ρ . The merits and drawbacks of different MRI acquisition strategies for quantitative relaxation rate measurement in the rotating frame regime are reviewed. In addition, this review summarizes current clinical applications of rotating frame MRI sequences. Copyright © 2016 John Wiley & Sons, Ltd.

  15. Similar estimates of temperature impacts on global wheat yield by three independent methods

    NARCIS (Netherlands)

    Liu, Bing; Asseng, Senthold; Müller, Christoph; Ewert, Frank; Elliott, Joshua; Lobell, David B.; Martre, Pierre; Ruane, Alex C.; Wallach, Daniel; Jones, James W.; Supit, Iwan; Wolf, Joost

    2016-01-01

    The potential impact of global temperature change on global crop yield has recently been assessed with different methods. Here we show that grid-based and point-based simulations and statistical regressions (from historic records), without deliberate adaptation or CO2 fertilization effects,

  16. Identification of metabolic system parameters using global optimization methods

    Directory of Open Access Journals (Sweden)

    Gatzke Edward P

    2006-01-01

    Full Text Available Background: The problem of estimating the parameters of dynamic models of complex biological systems from time series data is becoming increasingly important. Methods and results: Particular consideration is given to metabolic systems that are formulated as Generalized Mass Action (GMA) models. The estimation problem is posed as a global optimization task, for which novel techniques can be applied to determine the best set of parameter values given the measured responses of the biological system. The challenge is that this task is nonconvex. Nonetheless, deterministic optimization techniques can be used to find a global solution that best reconciles the model parameters and measurements. Specifically, the paper employs branch-and-bound principles to identify the best set of model parameters from observed time course data and illustrates this method with an existing model of the fermentation pathway in Saccharomyces cerevisiae. This is a relatively simple yet representative system with five dependent states and a total of 19 unknown parameters whose values are to be determined. Conclusion: The efficacy of the branch-and-reduce algorithm is illustrated by the S. cerevisiae example. The method described in this paper is likely to be widely applicable in the dynamic modeling of metabolic networks.
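
    Branch-and-bound itself can be illustrated in one dimension: bound each subinterval from below using a known Lipschitz constant, prune subintervals whose bound cannot beat the incumbent, and split the rest. This sketch shows the principle only, not the paper's branch-and-reduce algorithm for GMA models:

    ```python
    import math

    def branch_and_bound(f, lo, hi, lipschitz, tol=1e-4):
        """Deterministic branch-and-bound on [lo, hi] for a function
        with Lipschitz constant L: the lower bound on [a, b] is
        f(midpoint) - L*(b - a)/2; intervals whose bound cannot beat
        the incumbent (within tol) are pruned, the rest are split."""
        best_x, best_f = lo, f(lo)
        stack = [(lo, hi)]
        while stack:
            a, b = stack.pop()
            m = 0.5 * (a + b)
            fm = f(m)
            if fm < best_f:
                best_x, best_f = m, fm
            # Prune: no point in [a, b] can improve on the incumbent.
            if fm - lipschitz * (b - a) / 2 >= best_f - tol:
                continue
            stack.append((a, m))
            stack.append((m, b))
        return best_x, best_f

    # Multimodal test function with global minimum near x ≈ -2.67.
    x, fx = branch_and_bound(lambda t: math.sin(3 * t) + 0.5 * t,
                             -3.0, 3.0, lipschitz=3.5)
    print(round(x, 3), round(fx, 3))
    ```

    The pruning rule guarantees the returned value lies within `tol` of the true global minimum, which is the certainty that distinguishes deterministic global methods from stochastic search.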

  17. Global regularization method for planar restricted three-body problem

    Directory of Open Access Journals (Sweden)

    Sharaf M.A.

    2015-01-01

    Full Text Available In this paper, a global regularization method for the planar restricted three-body problem is proposed by using the transformation z = x + iy = ν cos^n(u + iv), where i = √−1, 0 < ν ≤ 1 and n is a positive integer. The method is developed analytically and computationally. For the analytical developments, analytical solutions in power series of the pseudotime τ are obtained for the positions and velocities (u, v, u′, v′) and (x, y, ẋ, ẏ) in the regularized and physical planes respectively; the physical time t is also obtained as a power series in τ. Moreover, relations between the coefficients of the power series are obtained for two consecutive values of n. Also, we developed analytical solutions in power series form for the inverse problem of finding τ in terms of t. As typical examples, three symbolic expressions for the coefficients of the power series were developed in terms of initial values. As to the computational developments, the global regularized equations of motion are developed together with their initial values in forms suitable for digital computation using any differential equations solver. On the other hand, for numerical evaluation of the power series, an efficient method depending on the continued fraction theory is provided.
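    As a small numerical companion to the abstract, the snippet below evaluates a map of the form z = ν cos^n(u + iv) with `cmath`; the sample parameter values are arbitrary, and reading the formula as an n-th power of the complex cosine is an assumption based on the abstract's notation.

```python
import cmath

# Sketch of a regularizing map z = x + i y = nu * cos(u + i v) ** n,
# evaluated numerically at an arbitrary regularized-plane point (u, v).
def to_physical(u, v, nu=0.8, n=2):
    z = nu * cmath.cos(complex(u, v)) ** n
    return z.real, z.imag   # physical-plane coordinates (x, y)

x, y = to_physical(0.3, 0.1)
```

For n = 1 and ν = 1 the map reduces to the classical identity cos(u + iv) = cos u cosh v − i sin u sinh v, which gives an easy sanity check.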

  18. Domestication of smartphones and mobile applications: A quantitative mixed-method study

    OpenAIRE

    de Reuver, G.A.; Nikou, S.; Bouwman, W.A.G.A.

    2016-01-01

    Smartphones are finding their way into our daily lives. This paper examines the domestication of smartphones by looking at how the way we use mobile applications affects our everyday routines. Data is collected through an innovative quantitative mixed-method approach, combining log data from smartphones and survey (perception) data. We find that there are dimensions of domestication that explain how the use of smartphones affects our daily routines. Contributions are stronger for downloaded a...

  19. Simple and ultra-fast recognition and quantitation of compounded monoclonal antibodies: Application to flow injection analysis combined to UV spectroscopy and matching method.

    Science.gov (United States)

    Jaccoulet, E; Schweitzer-Chaput, A; Toussaint, B; Prognon, P; Caudron, E

    2018-09-01

    Compounding of monoclonal antibodies (mAbs) is constantly increasing in hospitals. Quality control (QC) of compounded mAbs based on quantification and identification is required to prevent potential errors, and a fast method is needed to manage outpatient chemotherapy administration. A simple and ultra-fast (less than 30 s) method using flow injection analysis combined with a least-squares matching method provided by the analyzer software was performed and evaluated for the routine hospital QC of three compounded mAbs: bevacizumab, infliximab and rituximab. The method was evaluated through qualitative and quantitative parameters. Preliminary analysis of the UV absorption and second-derivative spectra of the mAbs allowed us to adapt analytical conditions according to the therapeutic range of the mAbs. In terms of quantitative QC, linearity, accuracy and precision were assessed as specified in ICH guidelines. Very satisfactory recovery was achieved and the RSD (%) of the intermediate precision was less than 1.1%. Qualitative analytical parameters were also evaluated in terms of specificity, sensitivity and global precision through a confusion matrix. Results were concentration- and mAb-dependent, and excellent (100%) specificity and sensitivity were reached within a specific concentration range. Finally, routine application to "real life" samples (n = 209) from different batches of the three mAbs complied with the specifications of the quality control, i.e. excellent identification (100%) and concentrations within ±15% of the target belonging to the calibration range. The successful combination of second-derivative spectroscopy and the least-squares matching method demonstrates the value of FIA for the ultra-fast QC of mAbs after compounding. Copyright © 2018 Elsevier B.V. All rights reserved.
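    The core of such a matching step can be sketched in a few lines: fit each reference spectrum to the measured one with a single least-squares scale factor, keep the best fit for identification, and use the scale for quantification. The "spectra" below are made-up absorbance vectors, not real mAb data.

```python
# Toy least-squares spectral matching: each reference is fit to the sample
# with one scale factor a = <s, r> / <r, r>; the smallest residual wins.
REFERENCES = {
    "bevacizumab": [0.10, 0.40, 0.90, 0.60, 0.20],
    "infliximab":  [0.15, 0.55, 0.70, 0.65, 0.30],
    "rituximab":   [0.05, 0.30, 0.95, 0.50, 0.10],
}

def match(sample):
    best_name, best_resid, best_scale = None, float("inf"), 0.0
    for name, ref in REFERENCES.items():
        scale = sum(s * r for s, r in zip(sample, ref)) / sum(r * r for r in ref)
        resid = sum((s - scale * r) ** 2 for s, r in zip(sample, ref))
        if resid < best_resid:
            best_name, best_resid, best_scale = name, resid, scale
    return best_name, best_scale   # scale tracks concentration vs. the standard

sample = [0.2 * a for a in REFERENCES["rituximab"]]  # a 0.2x diluted standard
name, scale = match(sample)
```

In a real QC setting the scale factor would be converted to a concentration via the reference standard's known concentration.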

  20. The Performance of Indian Equity Funds in the Era of Quantitative Easing

    Directory of Open Access Journals (Sweden)

    Ömer Faruk Tan

    2015-10-01

    Full Text Available This study aims to evaluate the performance of Indian equity funds between January 2009 and October 2014. This study period coincides with the period of quantitative easing during which the developing economies in financial markets have been influenced. After the global financial crisis of 2008 came a period of quantitative easing (QE), creating an increase in the money supply and leading to a capital flow from developed countries to developing countries. During this 5-year 10-month period, in which the relevant quantitative easing continued, Indian CNX500 price index yielded approximately 21% compounded on average, per annum. In this study, Indian equity funds are examined in order to compare these funds’ performance within this period. Within this scope, 12 Indian equity funds are chosen. In order to measure these funds’ performances, the Sharpe ratio (1966), Treynor ratio (1965), and Jensen’s alpha (1968) methods are used. Jensen’s alpha is also used in identifying selectivity skills of fund managers. Additionally, the Treynor & Mazuy (1966) regression analysis method is applied to show the market timing ability of fund managers.
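    The three performance measures named above can be sketched directly from per-period returns; rf is the per-period risk-free rate, beta comes from the fund's covariance with the market, and Jensen's alpha is the excess over the CAPM expectation. All numbers below are illustrative, not Indian fund data.

```python
# Sharpe, Treynor and Jensen's alpha from toy return series.
def mean(xs):
    return sum(xs) / len(xs)

def metrics(fund, market, rf):
    mf, mm = mean(fund), mean(market)
    cov = mean([(f - mf) * (m - mm) for f, m in zip(fund, market)])
    var_m = mean([(m - mm) ** 2 for m in market])
    beta = cov / var_m
    std_f = mean([(f - mf) ** 2 for f in fund]) ** 0.5
    sharpe = (mf - rf) / std_f            # excess return per unit total risk
    treynor = (mf - rf) / beta            # excess return per unit market risk
    jensen = mf - (rf + beta * (mm - rf)) # alpha vs. the CAPM expectation
    return sharpe, treynor, jensen, beta

market = [0.02, -0.01, 0.03, 0.01, -0.02, 0.04]
fund = [1.5 * m + 0.002 for m in market]  # levered market plus selectivity
sharpe, treynor, alpha, beta = metrics(fund, market, rf=0.001)
```

For this constructed fund beta is exactly 1.5, and the alpha works out to 0.002 + 0.5·rf = 0.0025 per period.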

  1. Cross-method validation as a solution to the problem of excessive simplification of measurement in quantitative IR research

    DEFF Research Database (Denmark)

    Beach, Derek

    2007-01-01

    The purpose of this article is to make IR scholars more aware of the costs of choosing quantitative methods. The article first shows that quantification can have analytical ‘costs’ when the measures created are too simple to capture the essence of the systematized concept that was supposed...... detail based upon a review of the democratic peace literature. I then offer two positive suggestions for a way forward. First, I argue that quantitative scholars should spend more time validating their measures, and in particular should engage in multi-method partnerships with qualitative scholars...... that have a deep understanding of particular cases in order to exploit the comparative advantages of qualitative methodology, using the more accurate qualitative measures to validate their own quantitative measures. Secondly, quantitative scholars should lower their level of ambition given the often poor...

  2. Risk Assessment Method for Offshore Structure Based on Global Sensitivity Analysis

    Directory of Open Access Journals (Sweden)

    Zou Tao

    2012-01-01

    Full Text Available Based on global sensitivity analysis (GSA), this paper proposes a new risk assessment method for offshore structure design. The method first quantifies the significance of all random variables and their parameters; by comparing their degrees of importance, minor factors can be neglected, which simplifies the subsequent global uncertainty analysis. Global uncertainty analysis (GUA) is an effective way to study the complexity and randomness of natural events. Since field-measured data and statistical results often have inevitable errors and uncertainties which lead to inaccurate prediction and analysis, the risk in the design stage of offshore structures caused by uncertainties in environmental loads, sea level, and marine corrosion must be taken into account. In this paper, the multivariate compound extreme value distribution model (MCEVD) is applied to predict the extreme sea state of wave, current, and wind. The maximum structural stress and deformation of a jacket platform are analyzed and compared with different design standards. The calculation results sufficiently demonstrate the new risk assessment method’s rationality and reliability.
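    The screening step, ranking inputs and dropping minor ones, can be illustrated with a minimal variance-based index (this is a generic sketch, not the paper's MCEVD machinery): for Y = f(X1, X2) with independent inputs on a grid, the first-order index of X1 is S1 = Var(E[Y | X1]) / Var(Y), and factors with small indices are candidates for neglect.

```python
# Deterministic first-order sensitivity index on an independent input grid.
def first_order_index(f, xs1, xs2):
    ys = [f(a, b) for a in xs1 for b in xs2]
    mu = sum(ys) / len(ys)
    var_y = sum((y - mu) ** 2 for y in ys) / len(ys)
    # E[Y | X1 = a], then its variance over the X1 grid
    cond_means = [sum(f(a, b) for b in xs2) / len(xs2) for a in xs1]
    var_cond = sum((c - mu) ** 2 for c in cond_means) / len(xs1)
    return var_cond / var_y

grid = [i / 10 for i in range(11)]
s1 = first_order_index(lambda a, b: 4 * a + b, grid, grid)  # X1 dominates
s2 = first_order_index(lambda a, b: 4 * b + a, grid, grid)  # X1 is minor here
```

For an additive model with independent inputs the indices sum to one; here s1 = 16/17 and s2 = 1/17, so the second model's first input would be screened out.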

  3. A novel dual energy method for enhanced quantitative computed tomography

    Science.gov (United States)

    Emami, A.; Ghadiri, H.; Rahmim, A.; Ay, M. R.

    2018-01-01

    Accurate assessment of bone mineral density (BMD) is critically important in clinical practice, and conveniently enabled via quantitative computed tomography (QCT). Meanwhile, dual-energy QCT (DEQCT) enables enhanced detection of small changes in BMD relative to single-energy QCT (SEQCT). In the present study, we aimed to investigate the accuracy of QCT methods, with particular emphasis on a new dual-energy approach, in comparison to single-energy and conventional dual-energy techniques. We used a sinogram-based analytical CT simulator to model the complete chain of CT data acquisitions, and assessed performance of SEQCT and different DEQCT techniques in quantification of BMD. We demonstrate a 120% reduction in error when using a proposed dual-energy Simultaneous Equation by Constrained Least-squares method, enabling more accurate bone mineral measurements.
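    The basic dual-energy idea can be sketched as a two-by-two linear system: at each energy the measured attenuation is modeled as a weighted sum of two basis-material densities, and two measurements determine both densities. The coefficients below are invented for illustration, not calibrated mass-attenuation values, and this plain solve stands in for the paper's constrained least-squares variant.

```python
# Dual-energy decomposition sketch: mu(E) = a(E)*d_bone + b(E)*d_soft
# at two energies gives a 2x2 linear system in the two densities.
def solve2(a11, a12, a21, a22, y1, y2):
    det = a11 * a22 - a12 * a21
    return ((y1 * a22 - y2 * a12) / det, (a11 * y2 - a21 * y1) / det)

A_LO = (0.50, 0.20)   # (bone, soft) coefficients at the low energy
A_HI = (0.30, 0.18)   # and at the high energy
d_bone_true, d_soft_true = 1.2, 0.9
mu_lo = A_LO[0] * d_bone_true + A_LO[1] * d_soft_true   # synthetic measurements
mu_hi = A_HI[0] * d_bone_true + A_HI[1] * d_soft_true
d_bone, d_soft = solve2(A_LO[0], A_LO[1], A_HI[0], A_HI[1], mu_lo, mu_hi)
```

With noisy measurements and more than two energies this becomes an (optionally constrained) least-squares problem rather than an exact solve.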

  4. A New Green Method for the Quantitative Analysis of Enrofloxacin by Fourier-Transform Infrared Spectroscopy.

    Science.gov (United States)

    Rebouças, Camila Tavares; Kogawa, Ana Carolina; Salgado, Hérida Regina Nunes

    2018-05-18

    Background: A green analytical chemistry method was developed for quantification of enrofloxacin in tablets. The drug, a second-generation fluoroquinolone, was first introduced in veterinary medicine for the treatment of various bacterial species. Objective: This study proposed to develop, validate, and apply a reliable, low-cost, fast, and simple IR spectroscopy method for quantitative routine determination of enrofloxacin in tablets. Methods: The method was completely validated according to the International Conference on Harmonisation guidelines, showing accuracy, precision, selectivity, robustness, and linearity. Results: It was linear over the concentration range of 1.0-3.0 mg with correlation coefficients >0.9999 and LOD and LOQ of 0.12 and 0.36 mg, respectively. Conclusions: Now that this IR method has met performance qualifications, it can be adopted and applied for the analysis of enrofloxacin tablets for production process control. The validated method can also be utilized to quantify enrofloxacin in tablets and thus is an environmentally friendly alternative for the routine analysis of enrofloxacin in quality control. Highlights: A new green method for the quantitative analysis of enrofloxacin by Fourier-Transform Infrared spectroscopy was validated. It is a fast, clean and low-cost alternative for the evaluation of enrofloxacin tablets.

  5. New chromatographic method for separating Omeprazole from its degradation components and the quantitatively determining it in its pharmaceutical products

    International Nuclear Information System (INIS)

    Touma, M.; Rajab, A.; Seuleiman, M.

    2007-01-01

    A new chromatographic method for the quantitative determination of omeprazole in its pharmaceutical products was developed. Omeprazole and its degradation components were well separated in the same chromatogram by using high performance liquid chromatography (HPLC). The new analytical method has been validated by these characteristic tests: accuracy, precision, range, linearity, specificity/selectivity, limit of detection (LOD) and limit of quantitation (LOQ). (author)

  6. New chromatographic Methods for Separation of Lansoprazole from its Degradation Components and The Quantitative Determination in its Pharmaceutical Products

    International Nuclear Information System (INIS)

    Touma, M.; Rajab, A.

    2009-01-01

    A new chromatographic method was found for the quantitative determination of lansoprazole in its pharmaceutical products. Lansoprazole and its degradation components were well separated in the same chromatogram by using high performance liquid chromatography (HPLC). The new analytical method has been validated by these characteristic tests: accuracy, precision, range, linearity, specificity/selectivity, limit of detection (LOD) and limit of quantitation (LOQ). (author)

  7. Quantitative phase analysis of uranium carbide from x-ray diffraction data using the Rietveld method

    International Nuclear Information System (INIS)

    Singh Mudher, K.D.; Krishnan, K.

    2003-01-01

    Quantitative phase analysis of a uranium carbide sample was carried out from the x-ray diffraction data by the Rietveld profile fitting method. The method does not require the addition of any reference material. The percentages of UC, UC2 and UO2 phases in the sample were determined. (author)

  8. The Mechanism of Russian Nanoindustry Development Caused by Globalization: Methods and Tools

    Directory of Open Access Journals (Sweden)

    Avtonomova Oksana Alekseevna

    2015-05-01

    Full Text Available Establishing an effective mechanism for the functioning of the Russian nanoindustry by means of the benefits of globalization is one of the most important factors in raising the competitiveness of the national economy in the context of transition to a new technological paradigm. Nanotechnology is one of the key factors of this new paradigm. The mechanism of nanoindustrial development of the Russian Federation caused by globalization is characterized in the article as a way of task-oriented implementation and management of the global nanotechnology industry development on the basis of cooperation of entities at different levels of the global economic system using the appropriate methods, tools, resources and communication channels, factors and capitals. The mechanism aims at resolving the described contradictions faced by Russian entities in their business activities in the sphere of production, consumption and promotion of nanotechnologies, nanogoods and nanoservices. Within the framework of a theoretical research, the author proposes a classification of methods and tools for the development of the Russian nanoindustry through international cooperation by the criteria of economic functions: planning, institution, organization, management, investment, finance, information, analysis, and control, all aimed at promoting the unification of concepts and actions of collaborating entities in the sphere of nanotechnology. The developed methodology of the international nanoindustrial interaction of Russian entities includes the result-oriented, institutional, organizational, budgetary, investment, tax, informative, and administrative methods, as well as analysis, audit, accounting and evaluation. Besides, the article proves the feasibility of applying marketing tools in the sphere of nanoindustrial cooperation aimed at developing a more efficient strategy of promoting products with nanofeatures to the global market.

  9. A scanning electron microscope method for automated, quantitative analysis of mineral matter in coal

    Energy Technology Data Exchange (ETDEWEB)

    Creelman, R.A.; Ward, C.R. [R.A. Creelman and Associates, Epping, NSW (Australia)

    1996-07-01

    Quantitative mineralogical analysis has been carried out on a series of nine coal samples from Australia, South Africa and China using a newly-developed automated image analysis system coupled to a scanning electron microscope. The image analysis system (QEM*SEM) gathers X-ray spectra and backscattered electron data from a number of points on a conventional grain-mount polished section under the SEM, and interprets the data from each point in mineralogical terms. The cumulative data in each case were integrated to provide a volumetric modal analysis of the species present in the coal samples, expressed as percentages of the respective coals' mineral matter. The QEM*SEM results were compared to data obtained from the same samples using other methods of quantitative mineralogical analysis, namely X-ray diffraction of the low-temperature oxygen-plasma ash and normative calculation from the (high-temperature) ash analysis and carbonate CO2 data. Good agreement was obtained from all three methods for quartz in the coals, and also for most of the iron-bearing minerals. The correlation between results from the different methods was less strong, however, for individual clay minerals, or for minerals such as calcite, dolomite and phosphate species that made up only relatively small proportions of the mineral matter. The image analysis approach, using the electron microscope for mineralogical studies, has significant potential as a supplement to optical microscopy in quantitative coal characterisation. 36 refs., 3 figs., 4 tabs.

  10. Global Convergence of a Spectral Conjugate Gradient Method for Unconstrained Optimization

    Directory of Open Access Journals (Sweden)

    Jinkui Liu

    2012-01-01

    Full Text Available A new nonlinear spectral conjugate descent method for solving unconstrained optimization problems is proposed on the basis of the CD method and the spectral conjugate gradient method. For any line search, the new method satisfies the sufficient descent condition g_k^T d_k < −‖g_k‖². Moreover, we prove that the new method is globally convergent under the strong Wolfe line search. The numerical results show that the new method is more effective for the given test problems from the CUTE test problem library (Bongartz et al., 1995) in contrast to the famous CD method, FR method, and PRP method.
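    The structure of such a method, a conjugate-gradient loop whose search direction must satisfy a sufficient-descent condition g_k^T d_k ≤ −c‖g_k‖², can be sketched with a generic Fletcher-Reeves-style update and Armijo backtracking on a toy quadratic; this is not the paper's exact spectral CD variant, and the restart rule is a common safeguard rather than part of the proposed method.

```python
# Nonlinear CG sketch on f(x) = 1.5*x0^2 + x0*x1 + x1^2 - x0 - x1
# (minimum at x = (0.2, 0.4)), with a descent-condition safeguard.
def f(x):
    return 1.5 * x[0] ** 2 + x[0] * x[1] + x[1] ** 2 - x[0] - x[1]

def grad(x):
    return [3 * x[0] + x[1] - 1, x[0] + 2 * x[1] - 1]

def norm2(v):
    return sum(c * c for c in v)

x = [0.0, 0.0]
g = grad(x)
d = [-c for c in g]                          # steepest descent to start
for _ in range(200):
    if norm2(g) < 1e-16:
        break
    # Armijo backtracking line search along d
    t, fx, slope = 1.0, f(x), sum(gi * di for gi, di in zip(g, d))
    while f([xi + t * di for xi, di in zip(x, d)]) > fx + 1e-4 * t * slope:
        t *= 0.5
    x = [xi + t * di for xi, di in zip(x, d)]
    g_new = grad(x)
    beta = norm2(g_new) / norm2(g)           # Fletcher-Reeves coefficient
    d = [-gn + beta * di for gn, di in zip(g_new, d)]
    if sum(gn * di for gn, di in zip(g_new, d)) > -1e-12:
        d = [-gn for gn in g_new]            # restart if descent is lost
    g = g_new
```

The restart test enforces g_k^T d_k < 0 explicitly; methods like the one in the record are designed so that the stronger bound holds for any line search without such a safeguard.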

  11. Quantitative firing transformations of a triaxial ceramic by X-ray diffraction methods

    International Nuclear Information System (INIS)

    Conconi, M.S.; Gauna, M.R.; Serra, M.F.; Suarez, G.; Aglietti, E.F.; Rendtorff, N.M.

    2014-01-01

    The firing transformations of traditional (clay based) ceramics are of technological and archaeological interest, and are usually reported qualitatively or semi-quantitatively. These kinds of systems present an important complexity, especially for X-ray diffraction techniques, due to the presence of fully crystalline, low crystalline and amorphous phases. In this article we present the results of a qualitative and quantitative X-ray diffraction Rietveld analysis of the evolution of the fully crystalline (kaolinite, quartz, cristobalite, feldspars and/or mullite), low crystalline (metakaolinite and/or spinel type pre-mullite) and glassy phases of a triaxial (clay-quartz-feldspar) ceramic fired in a wide temperature range between 900 and 1300 deg C. The methodology employed to determine low crystalline and glassy phase abundances is based on a combination of the internal standard method and the use of a nanocrystalline model in which the long-range order is lost, respectively. A preliminary sintering characterization was carried out by following the evolution of contraction, density and porosity with the firing temperature. Simultaneous thermo-gravimetric and differential thermal analysis was carried out to elucidate the actual temperatures at which the chemical changes occur. Finally, the quantitative analysis based on the Rietveld refinement of the X-ray diffraction patterns was performed. The kaolinite decomposition into metakaolinite was determined quantitatively; the intermediate (980 deg C) spinel type alumino-silicate formation was also quantified; the incongruent fusion of the potash feldspar was observed and quantified together with the final mullitization and the amorphous (glassy) phase formation. The methodology used to analyze the X-ray diffraction patterns proved to be suitable to evaluate quantitatively the thermal transformations that occur in a complex system like the triaxial ceramics. The evaluated phases can be easily correlated with the processing variables and materials
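    The internal-standard bookkeeping behind such amorphous-content estimates can be sketched in a few lines: spiking a known weight fraction of a standard lets the Rietveld-refined (crystalline-only, normalized) fractions be rescaled to absolute ones, and the deficit from unity is the amorphous content. The numbers below are illustrative, not the paper's data.

```python
# Internal standard method: W_i(abs) = W_i(Rietveld) * W_std(true) / W_std(Rietveld).
def absolute_fractions(rietveld, std_name, std_true):
    scale = std_true / rietveld[std_name]       # known spike / refined value
    absolute = {p: w * scale for p, w in rietveld.items() if p != std_name}
    amorphous = 1.0 - std_true - sum(absolute.values())   # of the spiked sample
    return absolute, amorphous

# Refined (normalized) fractions of a sample spiked with 20 wt% corundum:
rietveld = {"quartz": 0.25, "mullite": 0.35, "corundum_std": 0.40}
absolute, amorphous = absolute_fractions(rietveld, "corundum_std", 0.20)
amorphous_unspiked = amorphous / (1.0 - 0.20)   # relative to the original sample
```

Here the refined corundum fraction (0.40) is double the weighed-in spike (0.20), so all crystalline fractions are halved and half of the spiked sample is amorphous.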

  12. Quantitative firing transformations of a triaxial ceramic by X-ray diffraction methods

    Energy Technology Data Exchange (ETDEWEB)

    Conconi, M.S.; Gauna, M.R.; Serra, M.F. [Centro de Tecnologia de Recursos Minerales y Ceramica (CETMIC), Buenos Aires (Argentina); Suarez, G.; Aglietti, E.F.; Rendtorff, N.M., E-mail: rendtorff@cetmic.unlp.edu.ar [Universidad Nacional de La Plata (UNLP), Buenos Aires (Argentina). Fac. de Ciencias Exactas. Dept. de Quimica

    2014-10-15

    The firing transformations of traditional (clay based) ceramics are of technological and archaeological interest, and are usually reported qualitatively or semi-quantitatively. These kinds of systems present an important complexity, especially for X-ray diffraction techniques, due to the presence of fully crystalline, low crystalline and amorphous phases. In this article we present the results of a qualitative and quantitative X-ray diffraction Rietveld analysis of the evolution of the fully crystalline (kaolinite, quartz, cristobalite, feldspars and/or mullite), low crystalline (metakaolinite and/or spinel type pre-mullite) and glassy phases of a triaxial (clay-quartz-feldspar) ceramic fired in a wide temperature range between 900 and 1300 deg C. The methodology employed to determine low crystalline and glassy phase abundances is based on a combination of the internal standard method and the use of a nanocrystalline model in which the long-range order is lost, respectively. A preliminary sintering characterization was carried out by following the evolution of contraction, density and porosity with the firing temperature. Simultaneous thermo-gravimetric and differential thermal analysis was carried out to elucidate the actual temperatures at which the chemical changes occur. Finally, the quantitative analysis based on the Rietveld refinement of the X-ray diffraction patterns was performed. The kaolinite decomposition into metakaolinite was determined quantitatively; the intermediate (980 deg C) spinel type alumino-silicate formation was also quantified; the incongruent fusion of the potash feldspar was observed and quantified together with the final mullitization and the amorphous (glassy) phase formation. The methodology used to analyze the X-ray diffraction patterns proved to be suitable to evaluate quantitatively the thermal transformations that occur in a complex system like the triaxial ceramics. The evaluated phases can be easily correlated with the processing variables and materials

  13. Acceptability criteria for linear dependence in validating UV-spectrophotometric methods of quantitative determination in forensic and toxicological analysis

    Directory of Open Access Journals (Sweden)

    L. Yu. Klimenko

    2014-08-01

    Full Text Available Introduction. This article is the result of the authors’ research in the field of development of approaches to validation of quantitative determination methods for purposes of forensic and toxicological analysis, and is devoted to the problem of forming acceptability criteria for the validation parameter «linearity/calibration model». The aim of the research. The purpose of this paper is to analyse the present approaches to acceptability estimation of the calibration model chosen for method description according to the requirements of the international guidances, and to form our own approaches to acceptability estimation of the linear dependence when carrying out the validation of UV-spectrophotometric methods of quantitative determination for forensic and toxicological analysis. Materials and methods. UV-spectrophotometric method of doxylamine quantitative determination in blood. Results. The approaches to acceptability estimation of calibration models when carrying out the validation of bioanalytical methods stated in international papers, namely «Guidance for Industry: Bioanalytical Method Validation» (U.S. FDA, 2001), «Standard Practices for Method Validation in Forensic Toxicology» (SWGTOX, 2012), «Guidance for the Validation of Analytical Methodology and Calibration of Equipment used for Testing of Illicit Drugs in Seized Materials and Biological Specimens» (UNODC, 2009) and «Guideline on validation of bioanalytical methods» (EMA, 2011), have been analysed. It has been suggested to be guided by domestic developments in the field of validation of analysis methods for medicines and, particularly, by the approaches to validation in the variant of the calibration curve method, for forming the acceptability criteria of the obtained linear dependences when carrying out the validation of UV-spectrophotometric methods of quantitative determination for forensic and toxicological analysis. The choice of the method of calibration curve is

  14. Quantitative analysis of drug distribution by ambient mass spectrometry imaging method with signal extinction normalization strategy and inkjet-printing technology.

    Science.gov (United States)

    Luo, Zhigang; He, Jingjing; He, Jiuming; Huang, Lan; Song, Xiaowei; Li, Xin; Abliz, Zeper

    2018-03-01

    Quantitative mass spectrometry imaging (MSI) is a robust approach that provides both quantitative and spatial information in drug candidate research. However, because of complicated signal suppression and interference, acquiring accurate quantitative information from MSI data remains a challenge, especially for whole-body tissue samples. Ambient MSI techniques using spray-based ionization appear to be ideal for pharmaceutical quantitative MSI analysis. However, this is more challenging, as it involves almost no sample preparation and is more susceptible to ion suppression/enhancement. Herein, based on our developed air flow-assisted desorption electrospray ionization (AFADESI)-MSI technology, an ambient quantitative MSI method was introduced by integrating inkjet-printing technology with normalization of the signal extinction coefficient (SEC) using the target compound itself. The method utilizes a single calibration curve to quantify multiple tissue types. Basic blue 7 and an antitumor drug candidate (S-(+)-deoxytylophorinidine, CAT) were chosen to initially validate the feasibility and reliability of the quantitative MSI method. Rat tissue sections (heart, kidney, and brain) administered with CAT were then analyzed. The quantitative MSI analysis results were cross-validated against LC-MS/MS data for the same tissues. Their consistency suggests that the approach can rapidly obtain quantitative MSI data without introducing interference into the in-situ environment of the tissue sample, and has the potential to provide a high-throughput, economical and reliable approach for drug discovery and development. Copyright © 2017 Elsevier B.V. All rights reserved.
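    The "single calibration curve" step reduces to ordinary least squares on the printed calibration spots, then inverting the fitted line for unknown pixels. The values below are synthetic stand-ins for normalized signals, not data from the study.

```python
# Fit signal = m*concentration + b by least squares, then invert for unknowns.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    m = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return m, my - m * mx

conc   = [0.0, 1.0, 2.0, 4.0, 8.0]        # printed calibration concentrations
signal = [0.05, 1.02, 2.10, 4.01, 8.08]   # normalized intensities (synthetic)
m, b = fit_line(conc, signal)
unknown = (5.0 - b) / m                   # concentration for a signal of 5.0
```

With well-normalized signals the fitted slope is close to the nominal response (about 1.0 here) and the inversion is a one-liner per pixel.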

  15. Sample preparation methods for quantitative detection of DNA by molecular assays and marine biosensors

    International Nuclear Information System (INIS)

    Cox, Annie M.; Goodwin, Kelly D.

    2013-01-01

    Highlights:
    • DNA extraction methods affected measured qPCR target recovery.
    • Recovery and variability differed, sometimes by more than an order of magnitude.
    • SCODA did not offer significant improvement with PCR-inhibited seawater.
    • Aggressive lysis did appear to improve target recovery.
    • Reliable and affordable correction methods are needed for quantitative PCR.

    Abstract: The need for quantitative molecular methods is growing in environmental, food, and medical fields but is hindered by low and variable DNA extraction and by co-extraction of PCR inhibitors. DNA extracts from Enterococcus faecium, seawater, and seawater spiked with E. faecium and Vibrio parahaemolyticus were tested by qPCR for target recovery and inhibition. Conventional and novel methods were tested, including Synchronous Coefficient of Drag Alteration (SCODA) and lysis and purification systems used on an automated genetic sensor (the Environmental Sample Processor, ESP). Variable qPCR target recovery and inhibition were measured, significantly affecting target quantification. An aggressive lysis method that utilized chemical, enzymatic, and mechanical disruption enhanced target recovery compared to commercial kit protocols. SCODA purification did not show marked improvement over commercial spin columns. Overall, the data suggested a general need to improve sample preparation and to accurately assess and account for DNA recovery and inhibition in qPCR applications.
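    The quantitative bookkeeping behind such recovery comparisons is straightforward: a Ct standard curve converts cycle thresholds to quantities, the curve's slope gives the amplification efficiency, and a spiked-sample measurement yields a recovery factor for correcting field samples. All numbers below are invented for illustration.

```python
# qPCR sketch: standard curve Ct = slope*log10(Q) + intercept,
# efficiency from the slope, and a simple recovery correction.
def quantity_from_ct(ct, slope, intercept):
    return 10 ** ((ct - intercept) / slope)

def efficiency(slope):
    return 10 ** (-1.0 / slope) - 1.0        # 1.0 means perfect doubling

def recovery_corrected(measured, spiked_measured, spiked_true):
    recovery = spiked_measured / spiked_true # fraction of spiked target recovered
    return measured / recovery               # correct the sample estimate

eff = efficiency(-3.3219)                    # slope of a ~100% efficient assay
q = quantity_from_ct(25.0, -3.3219, 38.0)
corrected = recovery_corrected(1.0e4, 6.0e4, 1.0e5)   # 60% recovery observed
```

The record's point is that such corrections are only trustworthy if recovery and inhibition are actually measured per matrix, since they varied by more than an order of magnitude.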

  16. A method for normalizing pathology images to improve feature extraction for quantitative pathology

    International Nuclear Information System (INIS)

    Tam, Allison; Barker, Jocelyn; Rubin, Daniel

    2016-01-01

    Purpose: With the advent of digital slide scanning technologies and the potential proliferation of large repositories of digital pathology images, many research studies can leverage these data for biomedical discovery and to develop clinical applications. However, quantitative analysis of digital pathology images is impeded by batch effects generated by varied staining protocols and staining conditions of pathological slides. Methods: To overcome this problem, this paper proposes a novel, fully automated stain normalization method to reduce batch effects and thus aid research in digital pathology applications. The method, intensity centering and histogram equalization (ICHE), normalizes a diverse set of pathology images by first scaling the centroids of the intensity histograms to a common point and then applying a modified version of contrast-limited adaptive histogram equalization. Normalization was performed on two datasets of digitized hematoxylin and eosin (H&E) slides of different tissue slices from the same lung tumor, and one immunohistochemistry dataset of digitized slides created by restaining one of the H&E datasets. Results: The ICHE method was evaluated based on image intensity values, quantitative features, and the effect on downstream applications, such as a computer aided diagnosis. For comparison, three methods from the literature were reimplemented and evaluated using the same criteria. The authors found that ICHE not only improved performance compared with un-normalized images, but in most cases showed improvement compared with previous methods for correcting batch effects in the literature. Conclusions: ICHE may be a useful preprocessing step in a digital pathology image processing pipeline
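    A toy version of the two-stage idea, center each image's intensities to a common target, then equalize the histogram, can be written for a flat 8-bit pixel list. The real method centers histogram centroids and uses a modified CLAHE; this sketch substitutes plain global equalization and invented pixel values.

```python
# Minimal ICHE-flavored sketch: intensity centering + global equalization.
def center(pixels, target=128.0):
    shift = target - sum(pixels) / len(pixels)
    return [min(255, max(0, round(p + shift))) for p in pixels]

def equalize(pixels):
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    cdf, total, running = [0] * 256, len(pixels), 0
    for v in range(256):
        running += hist[v]
        cdf[v] = running
    cdf_min = next(c for c in cdf if c > 0)
    return [round((cdf[p] - cdf_min) * 255 / (total - cdf_min)) for p in pixels]

img = [90, 100, 100, 110, 120, 130]   # a dark, low-contrast "image"
out = equalize(center(img))           # centered, then stretched to full range
```

After centering, every image's mean sits at the same target, so downstream features are compared on a common intensity footing; equalization then spreads the values across the full dynamic range.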

  17. A method for normalizing pathology images to improve feature extraction for quantitative pathology

    Energy Technology Data Exchange (ETDEWEB)

    Tam, Allison [Stanford Institutes of Medical Research Program, Stanford University School of Medicine, Stanford, California 94305 (United States); Barker, Jocelyn [Department of Radiology, Stanford University School of Medicine, Stanford, California 94305 (United States); Rubin, Daniel [Department of Radiology, Stanford University School of Medicine, Stanford, California 94305 and Department of Medicine (Biomedical Informatics Research), Stanford University School of Medicine, Stanford, California 94305 (United States)

    2016-01-15

    Purpose: With the advent of digital slide scanning technologies and the potential proliferation of large repositories of digital pathology images, many research studies can leverage these data for biomedical discovery and to develop clinical applications. However, quantitative analysis of digital pathology images is impeded by batch effects generated by varied staining protocols and staining conditions of pathological slides. Methods: To overcome this problem, this paper proposes a novel, fully automated stain normalization method to reduce batch effects and thus aid research in digital pathology applications. The method, intensity centering and histogram equalization (ICHE), normalizes a diverse set of pathology images by first scaling the centroids of the intensity histograms to a common point and then applying a modified version of contrast-limited adaptive histogram equalization. Normalization was performed on two datasets of digitized hematoxylin and eosin (H&E) slides of different tissue slices from the same lung tumor, and one immunohistochemistry dataset of digitized slides created by restaining one of the H&E datasets. Results: The ICHE method was evaluated based on image intensity values, quantitative features, and the effect on downstream applications, such as a computer aided diagnosis. For comparison, three methods from the literature were reimplemented and evaluated using the same criteria. The authors found that ICHE not only improved performance compared with un-normalized images, but in most cases showed improvement compared with previous methods for correcting batch effects in the literature. Conclusions: ICHE may be a useful preprocessing step in a digital pathology image processing pipeline.

  18. Two quantitative forecasting methods for macroeconomic indicators in Czech Republic

    Directory of Open Access Journals (Sweden)

    Mihaela BRATU (SIMIONESCU)

    2012-03-01

    Full Text Available Econometric modelling and exponential smoothing techniques are two quantitative forecasting methods with good results in practice, but the objective of the research was to find out which of the two techniques is better for short-run predictions. Therefore, for inflation, unemployment and the interest rate in the Czech Republic, accuracy indicators were calculated for the predictions based on these methods. Short-run forecasts on a horizon of 3 months were made for December 2011-February 2012, the econometric models being updated. For the Czech Republic, the exponential smoothing techniques provided more accurate forecasts than the econometric models (VAR(2) models, the ARMA procedure and models with lagged variables). One explanation for the better performance of the smoothing techniques would be that short-run predictions are more influenced by the recent evolution of the indicators.
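The comparison described above can be illustrated with a minimal sketch (illustrative data and a generic accuracy indicator, not the authors' models): simple exponential smoothing produces one-step-ahead forecasts, which are then scored by RMSE against a naive benchmark.

```python
# Simple exponential smoothing as a one-step-ahead forecaster, scored by
# RMSE. The series values are invented for illustration only.

def ses_forecasts(series, alpha=0.5):
    """One-step-ahead simple exponential smoothing forecasts."""
    level = series[0]
    forecasts = []
    for y in series[1:]:
        forecasts.append(level)              # forecast for this observation
        level = alpha * y + (1 - alpha) * level  # update after observing y
    return forecasts

def rmse(actual, predicted):
    return (sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)) ** 0.5

inflation = [2.1, 2.3, 2.2, 2.5, 2.4, 2.6]       # illustrative monthly data
ses_err = rmse(inflation[1:], ses_forecasts(inflation))
naive_err = rmse(inflation[1:], inflation[:-1])  # naive: last value carries forward
```

Whichever method yields the lower error on held-out months would be preferred for short-run prediction, mirroring the accuracy-indicator comparison in the abstract.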

  19. Rapid method for protein quantitation by Bradford assay after elimination of the interference of polysorbate 80.

    Science.gov (United States)

    Cheng, Yongfeng; Wei, Haiming; Sun, Rui; Tian, Zhigang; Zheng, Xiaodong

    2016-02-01

    The Bradford assay is one of the most common methods for measuring protein concentrations. However, some pharmaceutical excipients, such as detergents, interfere with the Bradford assay even at low concentrations. Protein precipitation can be used to overcome sample incompatibility with protein quantitation, but the rate of protein recovery after acetone precipitation is only about 70%. In this study, we found that sucrose not only increased the rate of protein recovery after 1 h of acetone precipitation, but also did not interfere with the Bradford assay. We therefore developed a method for rapid protein quantitation in protein drugs even when they contain interfering substances.
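The quantitation step behind any Bradford-type measurement is a standard curve. The sketch below (illustrative absorbance values, not data from the paper) fits a linear curve of absorbance against known standard concentrations by least squares and inverts it for an unknown sample.

```python
# Hypothetical Bradford-style standard curve: least-squares line through
# absorbance readings of protein standards, then back-calculation of an
# unknown. All numbers are invented for illustration.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

standards_ug_ml = [0.0, 250.0, 500.0, 750.0, 1000.0]   # known concentrations
absorbance      = [0.00, 0.12, 0.25, 0.37, 0.50]       # A595 readings
slope, intercept = fit_line(standards_ug_ml, absorbance)

unknown_abs = 0.30
unknown_conc = (unknown_abs - intercept) / slope   # back-calculated ug/mL
```

Interference from excipients like polysorbate 80 would distort the sample's absorbance relative to this curve, which is why the precipitation clean-up step matters.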

  20. Study on a quantitative evaluation method of equipment maintenance level and plant safety level for giant complex plant system

    International Nuclear Information System (INIS)

    Aoki, Takayuki

    2010-01-01

    In this study, a quantitative method is discussed for maintenance level, which is determined by two factors: the maintenance plan and the field work implementation ability of the maintenance crew. A quantitative evaluation method for the safety level of a giant complex plant system is also discussed. The following results were obtained. (1) The equipment condition after maintenance work is determined by two factors: the maintenance plan and the field work implementation ability possessed by the maintenance crew. The equipment condition determined by these two factors was named the 'equipment maintenance level' and its quantitative evaluation method was clarified. (2) The CDF of a nuclear power plant evaluated by using a failure rate reflecting the above maintenance level is quite different from the CDF evaluated by using existing failure rates, which include a safety margin. The former CDF was named the 'plant safety level' of the plant system and its quantitative evaluation method was clarified. (3) Enhancing the equipment maintenance level means an improvement of maintenance quality, which results in an enhancement of the plant safety level. Therefore, the plant safety level should always be watched as a plant performance indicator. (author)

  1. Balance between qualitative and quantitative verification methods

    International Nuclear Information System (INIS)

    Nidaira, Kazuo

    2012-01-01

    The amount of inspection effort for verification of declared nuclear material needs to be optimized in a situation where both qualitative and quantitative measures are applied. Game theory was used to investigate the relation between detection probability and deterrence of diversion. Payoffs used in the theory were quantified for cases of conventional safeguards and integrated safeguards by using AHP, the Analytic Hierarchy Process. It then became possible to estimate the detection probability under integrated safeguards that has deterrence capability equivalent to the detection probability under conventional safeguards. In addition, the distribution of inspection effort between qualitative and quantitative measures was estimated. Although the AHP has some ambiguities in quantifying qualitative factors, its application to optimization in safeguards is useful for reconsidering the detection probabilities under integrated safeguards. (author)
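The AHP quantification step mentioned above can be sketched with the common row geometric-mean approximation of priority weights. The pairwise comparison matrix below is invented for illustration and is not the authors' safeguards payoff data.

```python
# AHP priority weights via the row geometric-mean approximation
# (a sketch of the kind of quantification the abstract describes).

def ahp_weights(matrix):
    """Row geometric means of a pairwise comparison matrix, normalized to 1."""
    n = len(matrix)
    gms = []
    for row in matrix:
        prod = 1.0
        for a in row:
            prod *= a
        gms.append(prod ** (1.0 / n))
    total = sum(gms)
    return [g / total for g in gms]

# Hypothetical judgments: criterion A is 3x as important as B, 5x as C.
pairwise = [
    [1.0,   3.0, 5.0],
    [1/3.0, 1.0, 2.0],
    [1/5.0, 0.5, 1.0],
]
weights = ahp_weights(pairwise)   # weights sum to 1, A ranked highest
```

The exact eigenvector method gives slightly different numbers, but for consistent matrices the geometric-mean shortcut agrees with it.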

  2. A new method to estimate global mass transport and its implication for sea level rise

    Science.gov (United States)

    Yi, S.; Heki, K.

    2017-12-01

    Estimates of changes in global land mass from GRACE observations can be achieved by two methods, a mascon method and a forward modeling method. However, results from these two methods show inconsistent secular trends. The sea level budget can be used to validate the consistency among observations of sea level rise by altimetry, steric change by the Argo project, and mass change by GRACE. Mascon products from JPL, GSFC and CSR are compared here; we find that none of these three products closes the sea level budget, while the problem can be solved by a new forward modeling method. We further investigate the origin of this difference and speculate that it is caused by signal leakage from the ocean mass. It is well recognized that land signals leak into the oceans, but the reverse also happens. We stress the importance of correcting for leakage from the ocean when estimating global land mass. Based on a reconciled sea level budget, we confirm that global sea level rise accelerated significantly over 2005-2015, as a result of the ongoing global temperature increase.
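The budget-closure test used as the validation criterion reduces to a simple identity: the altimetry total should equal the steric plus mass components, and any residual trend points to systematic errors such as leakage. The numbers below are illustrative, not the paper's estimates.

```python
# Sea level budget closure check (illustrative trend values in mm/yr).

def budget_residual(total_mm_yr, steric_mm_yr, mass_mm_yr):
    """Residual = observed total rise minus (steric + mass) components."""
    return total_mm_yr - (steric_mm_yr + mass_mm_yr)

altimetry   = 3.5   # total sea level rise (altimetry)
argo_steric = 1.1   # steric change (Argo)
grace_mass  = 2.1   # ocean mass change (GRACE)

residual = budget_residual(altimetry, argo_steric, grace_mass)
# A residual near zero means the budget closes; a persistent nonzero
# trend suggests leakage or other systematic error in one component.
```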

  3. Full quantitative phase analysis of hydrated lime using the Rietveld method

    Energy Technology Data Exchange (ETDEWEB)

    Lassinantti Gualtieri, Magdalena, E-mail: magdalena.gualtieri@unimore.it [Dipartimento Ingegneria dei Materiali e dell' Ambiente, Universita Degli Studi di Modena e Reggio Emilia, Via Vignolese 905/a, I-41100 Modena (Italy); Romagnoli, Marcello; Miselli, Paola; Cannio, Maria [Dipartimento Ingegneria dei Materiali e dell' Ambiente, Universita Degli Studi di Modena e Reggio Emilia, Via Vignolese 905/a, I-41100 Modena (Italy); Gualtieri, Alessandro F. [Dipartimento di Scienze della Terra, Universita Degli Studi di Modena e Reggio Emilia, I-41100 Modena (Italy)

    2012-09-15

    Full quantitative phase analysis (FQPA) using X-ray powder diffraction and Rietveld refinements is a well-established method for the characterization of various hydraulic binders such as Portland cement and hydraulic limes. In this paper, the Rietveld method is applied to hydrated lime, a non-hydraulic traditional binder. The potential presence of an amorphous phase in this material is generally ignored. Both synchrotron radiation and a conventional X-ray source were used for data collection. The applicability of the developed control file for the Rietveld refinements was investigated using samples spiked with glass. The results were cross-checked by other independent methods such as thermal and chemical analyses. The sample microstructure was observed by transmission electron microscopy. It was found that the consistency between the different methods was satisfactory, supporting the validity of FQPA for this material. For the samples studied in this work, the amount of amorphous material was in the range 2-15 wt.%.
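The glass-spiking check described above rests on the classic internal-standard calculation: if the Rietveld refinement (normalized over crystalline phases only) over-reports the known spike fraction, the excess is attributed to amorphous material. The numbers below are illustrative, not the paper's measurements.

```python
# Internal-standard estimate of amorphous content, the standard companion
# calculation to Rietveld FQPA on spiked samples.

def amorphous_fraction(w_spike, r_spike):
    """Amorphous weight fraction of the original (unspiked) sample.

    w_spike: true weight fraction of the added crystalline standard
    r_spike: the standard's apparent fraction from a Rietveld refinement
             normalized over crystalline phases only
    """
    amorphous_in_mix = 1.0 - w_spike / r_spike   # amorphous share of spiked mix
    return amorphous_in_mix / (1.0 - w_spike)    # rescale to the original sample

# Hypothetical case: 20 wt.% standard added, but Rietveld reports it as
# 25 wt.% of the crystalline phases, so part of the sample is amorphous.
a = amorphous_fraction(0.20, 0.25)
```

If the refinement recovers the spike fraction exactly (r_spike equals w_spike), the formula returns zero amorphous content, as expected.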

  4. Full quantitative phase analysis of hydrated lime using the Rietveld method

    International Nuclear Information System (INIS)

    Lassinantti Gualtieri, Magdalena; Romagnoli, Marcello; Miselli, Paola; Cannio, Maria; Gualtieri, Alessandro F.

    2012-01-01

    Full quantitative phase analysis (FQPA) using X-ray powder diffraction and Rietveld refinements is a well-established method for the characterization of various hydraulic binders such as Portland cement and hydraulic limes. In this paper, the Rietveld method is applied to hydrated lime, a non-hydraulic traditional binder. The potential presence of an amorphous phase in this material is generally ignored. Both synchrotron radiation and a conventional X-ray source were used for data collection. The applicability of the developed control file for the Rietveld refinements was investigated using samples spiked with glass. The results were cross-checked by other independent methods such as thermal and chemical analyses. The sample microstructure was observed by transmission electron microscopy. It was found that the consistency between the different methods was satisfactory, supporting the validity of FQPA for this material. For the samples studied in this work, the amount of amorphous material was in the range 2–15 wt.%.

  5. Yours, Mine and Ours: Theorizing the Global Articulation of Qualitative Research Methods and Academic Disciplines

    Directory of Open Access Journals (Sweden)

    Bryan C. Taylor

    2016-02-01

    Full Text Available Two current forms of globalization are inherently interesting to academic qualitative researchers. The first is the globalization of qualitative research methods themselves. The second is the globalization of academic disciplines in which those methods are institutionalized as a valuable resource for professional practices of teaching and scholarly research. This essay argues that patterns in existing discussion of these two trends create an opportunity for innovative scholarship. That opportunity involves reflexively leveraging qualitative research methods to study the simultaneous negotiation by academic communities of both qualitative methods and their professional discipline. Five theories that serve to develop this opportunity are reviewed, focusing on their related benefits and limitations, and the specific research questions they yield. The essay concludes by synthesizing distinctive commitments of this proposed research program.

  6. Quantitative Method to Measure Thermal Conductivity of One-Dimensional Nanostructures Based on Scanning Thermal Wave Microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Park, Kyung Bae; Chung, Jae Hun; Hwang, Gwang Seok; Jung, Eui Han; Kwon, Oh Myoung [Korea University, Seoul (Korea, Republic of)

    2014-12-15

    We present a method to quantitatively measure the thermal conductivity of one-dimensional nanostructures by utilizing scanning thermal wave microscopy (STWM) at a nanoscale spatial resolution. In this paper, we explain the principle for measuring the thermal diffusivity of one-dimensional nanostructures using STWM and the theoretical analysis procedure for quantifying the thermal diffusivity. The STWM method obtains the thermal conductivity by measuring the thermal diffusivity, using only the phase lag of the transferred thermal wave relative to the propagation distance. It is not affected by the thermal contact resistances between the heat source and the nanostructure or between the nanostructure and the probe, so the heat flux applied to the nanostructure is accurately obtained. The proposed method provides a very simple and quantitative measurement compared with conventional measurement techniques.
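The phase-lag relation underlying this kind of measurement can be sketched as follows. Assuming the standard 1D thermal-wave result, the phase lag grows with distance as phi = x * sqrt(pi * f / alpha), so measuring phase versus distance yields the diffusivity alpha without knowing the contact resistances. The numerical values are illustrative assumptions, not data from the paper.

```python
# Thermal diffusivity from the phase lag of a thermal wave (1D model).
import math

def diffusivity_from_phase(freq_hz, distance_m, phase_rad):
    """alpha = pi * f * x^2 / phi^2, inverted from phi = x*sqrt(pi*f/alpha)."""
    return math.pi * freq_hz * distance_m ** 2 / phase_rad ** 2

f = 1000.0    # heating modulation frequency, Hz (assumed)
x = 1e-6      # probe offset along the nanostructure, m (assumed)
alpha = 5e-5  # assumed diffusivity, m^2/s

phi = x * math.sqrt(math.pi * f / alpha)        # forward model: phase lag
recovered = diffusivity_from_phase(f, x, phi)   # round-trips back to alpha
```

Because only the slope of phase versus distance enters, a constant phase offset from contact resistance cancels out, which is the advantage the abstract highlights.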

  7. The Methodical Approaches to the Research of Informatization of the Global Economic Development

    Directory of Open Access Journals (Sweden)

    Kazakova Nadezhda A.

    2018-03-01

    Full Text Available The article is aimed at researching the informatization of global economic development. The complex of issues connected with the informatization of the world's countries in conditions of globalization is considered. The development of informatization in the global economic space facilitates the opening of new markets for international trade enterprises, transnational corporations and other organizations, which not only provide exports but also create production capacities for local producers. A methodical approach is proposed that includes three stages, together with the formation of input information on the status of informatization in the global economic development of the world's countries.

  8. Fluorescent foci quantitation for high-throughput analysis

    Directory of Open Access Journals (Sweden)

    Elena Ledesma-Fernández

    2015-06-01

    Full Text Available A number of cellular proteins localize to discrete foci within cells, for example DNA repair proteins, microtubule organizing centers, P bodies or kinetochores. It is often possible to measure the fluorescence emission from tagged proteins within these foci as a surrogate for the concentration of that specific protein. We wished to develop tools that would allow quantitation of fluorescence foci intensities in high-throughput studies. As proof of principle we have examined the kinetochore, a large multi-subunit complex that is critical for the accurate segregation of chromosomes during cell division. Kinetochore perturbations lead to aneuploidy, which is a hallmark of cancer cells. Hence, understanding kinetochore homeostasis and regulation are important for a global understanding of cell division and genome integrity. The 16 budding yeast kinetochores colocalize within the nucleus to form a single focus. Here we have created a set of freely-available tools to allow high-throughput quantitation of kinetochore foci fluorescence. We use this ‘FociQuant’ tool to compare methods of kinetochore quantitation and we show proof of principle that FociQuant can be used to identify changes in kinetochore protein levels in a mutant that affects kinetochore function. This analysis can be applied to any protein that forms discrete foci in cells.
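The core measurement in foci quantitation is background-corrected integrated intensity. The sketch below is a hypothetical minimal version of that idea, not the FociQuant implementation: sum the fluorescence in the focus region of interest and subtract the local background estimated from surrounding pixels.

```python
# Background-corrected integrated intensity of a fluorescent focus
# (illustrative pixel values; real tools operate on image stacks).

def focus_intensity(roi, background_pixels):
    """Sum of ROI pixel intensities minus the expected background contribution."""
    bg_per_pixel = sum(background_pixels) / len(background_pixels)
    return sum(roi) - bg_per_pixel * len(roi)

kinetochore_roi = [210, 240, 255, 230]   # bright focus pixels (hypothetical)
local_bg = [50, 52, 48, 50, 50, 50]      # surrounding nuclear signal

signal = focus_intensity(kinetochore_roi, local_bg)
```

Comparing this corrected signal between wild-type and mutant cells is how a change in kinetochore protein levels would show up in a high-throughput screen.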

  9. The use of Triangulation in Social Sciences Research : Can qualitative and quantitative methods be combined?

    Directory of Open Access Journals (Sweden)

    Ashatu Hussein

    2015-03-01

    Full Text Available This article refers to a study in Tanzania on fringe benefits or welfare via the work contract, where we worked both quantitatively and qualitatively. My focus is on the vital issue of combining methods or methodologies. There have been mixed views on the use of triangulation in research. Some authors argue that triangulation is just for increasing wider and deeper understanding of the study phenomenon, while others argue that triangulation is actually used to increase study accuracy; in this case triangulation is one of the validity measures. Triangulation is defined as the use of multiple methods, mainly qualitative and quantitative, in studying the same phenomenon for the purpose of increasing study credibility. This implies that triangulation is the combination of two or more methodological approaches, theoretical perspectives, data sources, investigators and analysis methods to study the same phenomenon. However, using both qualitative and quantitative paradigms in the same study has resulted in debate, with some researchers arguing that the two paradigms differ epistemologically and ontologically. Nevertheless, both paradigms are designed to further understanding of a particular subject area, and both have strengths and weaknesses. Thus, when they are combined there is a great possibility of neutralizing the flaws of one method and strengthening the benefits of the other, for better research results. To reap the benefits of the two paradigms and minimize the drawbacks of each, the combination of the two approaches is advocated in this article. The quality of our studies on welfare to combat poverty is crucial, especially when we want our conclusions to matter in practice.

  10. PCR-free quantitative detection of genetically modified organism from raw materials. An electrochemiluminescence-based bio bar code method.

    Science.gov (United States)

    Zhu, Debin; Tang, Yabing; Xing, Da; Chen, Wei R

    2008-05-15

    A bio bar code assay based on oligonucleotide-modified gold nanoparticles (Au-NPs) provides a PCR-free method for quantitative detection of nucleic acid targets. However, the current bio bar code assay requires lengthy experimental procedures including the preparation and release of bar code DNA probes from the target-nanoparticle complex and immobilization and hybridization of the probes for quantification. Herein, we report a novel PCR-free electrochemiluminescence (ECL)-based bio bar code assay for the quantitative detection of genetically modified organism (GMO) from raw materials. It consists of tris-(2,2'-bipyridyl) ruthenium (TBR)-labeled bar code DNA, nucleic acid hybridization using Au-NPs and biotin-labeled probes, and selective capture of the hybridization complex by streptavidin-coated paramagnetic beads. The detection of target DNA is realized by direct measurement of ECL emission of TBR. It can quantitatively detect target nucleic acids with high speed and sensitivity. This method can be used to quantitatively detect GMO fragments from real GMO products.

  11. Mixed multiscale finite element methods using approximate global information based on partial upscaling

    KAUST Repository

    Jiang, Lijian

    2009-10-02

    The use of limited global information in multiscale simulations is needed when there is no scale separation. Previous approaches entail fine-scale simulations in the computation of the global information. The computation of the global information is expensive. In this paper, we propose the use of approximate global information based on partial upscaling. A requirement for partial homogenization is to capture long-range (non-local) effects present in the fine-scale solution, while homogenizing some of the smallest scales. The local information at these smallest scales is captured in the computation of basis functions. Thus, the proposed approach allows us to avoid the computations at the scales that can be homogenized. This results in coarser problems for the computation of global fields. We analyze the convergence of the proposed method. Mathematical formalism is introduced, which allows estimating the errors due to small scales that are homogenized. The proposed method is applied to simulate two-phase flows in heterogeneous porous media. Numerical results are presented for various permeability fields, including those generated using two-point correlation functions and channelized permeability fields from the SPE Comparative Project (Christie and Blunt, SPE Reserv Evalu Eng 4:308-317, 2001). We consider simple cases where one can identify the scales that can be homogenized. For more general cases, we suggest the use of upscaling on the coarse grid with the size smaller than the target coarse grid where multiscale basis functions are constructed. This intermediate coarse grid renders a partially upscaled solution that contains essential non-local information. Numerical examples demonstrate that the use of approximate global information provides better accuracy than purely local multiscale methods. © 2009 Springer Science+Business Media B.V.
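A toy example in the spirit of the partial homogenization discussed above: for 1D flow through layers in series, the effective (upscaled) permeability that a coarse solver would use is the harmonic mean of the fine-scale values. This is a textbook special case, not the paper's mixed multiscale finite element scheme.

```python
# Harmonic-mean upscaling of layered permeability (1D series flow).

def harmonic_mean(perms):
    """Effective permeability of equal-thickness layers in series."""
    return len(perms) / sum(1.0 / k for k in perms)

fine_scale = [100.0, 1.0, 100.0, 1.0]   # strongly heterogeneous layers (mD)
k_eff = harmonic_mean(fine_scale)       # low-permeability layers dominate
```

The result sits far below the arithmetic mean because the tight layers throttle the flow; capturing such non-local effects on coarser grids is exactly what the approximate global information in the paper is designed to preserve.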

  12. A method to forecast quantitative variables relating to nuclear public acceptance

    International Nuclear Information System (INIS)

    Ohnishi, T.

    1992-01-01

    A methodology is proposed for forecasting the future trend of quantitative variables profoundly related to the public acceptance (PA) of nuclear energy. The social environment influencing PA is first modeled by breaking it down into a finite number of fundamental elements; the interactive formulae between the quantitative variables that characterize each element are then determined by using the actual values of the variables in the past. Inputting the estimated values of exogenous variables into these formulae, the forecast values of the endogenous variables can finally be obtained. Using this method, the problem of nuclear PA in Japan is treated as an example. The context is considered to comprise a public sector, the general social environment and socio-psychology. The public sector is broken down into three elements: the general public, the inhabitants living around nuclear facilities and the activists of anti-nuclear movements, whereas the social environment and socio-psychological factors are broken down into several elements, such as news media and psychological factors. Twenty-seven endogenous and seven exogenous variables are introduced to quantify these elements. After quantitatively formulating the interactions between them and extrapolating the exogenous variables into the future, estimates are made of the growth or attenuation of the endogenous variables, such as the pro- and anti-nuclear fractions in public opinion polls and the frequency of occurrence of anti-nuclear movements. (author)

  13. Global river flood hazard maps: hydraulic modelling methods and appropriate uses

    Science.gov (United States)

    Townend, Samuel; Smith, Helen; Molloy, James

    2014-05-01

    Flood hazard is not well understood or documented in many parts of the world. Consequently, the (re-)insurance sector now needs to better understand where the potential for considerable river flooding aligns with significant exposure. For example, international manufacturing companies are often attracted to countries with emerging economies, meaning that events such as the 2011 Thailand floods have resulted in many multinational businesses with assets in these regions incurring large, unexpected losses. This contribution addresses and critically evaluates the hydraulic methods employed to develop a consistent global-scale set of river flood hazard maps, used to fill the knowledge gap outlined above. The basis of the modelling approach is an innovative, bespoke 1D/2D hydraulic model (RFlow) which has been used to model a global river network of over 5.3 million kilometres. Estimated flood peaks at each of these model nodes are determined using an empirically based rainfall-runoff approach linking design rainfall to design river flood magnitudes. The hydraulic model is used to determine extents and depths of floodplain inundation following river bank overflow. From this, deterministic flood hazard maps are calculated for several design return periods between 20 and 1,500 years. Firstly, we will discuss the rationale behind the hydraulic modelling methods and inputs chosen to produce a consistent, globally scaled river flood hazard map. This will highlight how a model designed to work with global datasets can be more favourable for hydraulic modelling at the global scale, and why using innovative techniques customised for broad-scale use is preferable to modifying existing hydraulic models. Similarly, the advantages and disadvantages of both 1D and 2D modelling will be explored and balanced against the time, computing and human resources available, particularly when using a Digital Surface Model at 30 m resolution. Finally, we will suggest some

  14. Credit Institutions Management Evaluation using Quantitative Methods

    Directory of Open Access Journals (Sweden)

    Nicolae Dardac

    2006-02-01

    Full Text Available Credit institutions' supervision by state authorities is mostly assimilated with systemic risk prevention. At present, the mission is oriented toward analyzing the risk profile of credit institutions and the mechanisms and existing systems that, as management tools, provide banks with the proper instruments to avoid and control specific banking risks. Rating systems are sophisticated measurement instruments capable of assuring the above objectives, such as success in banking risk management. Management quality is one of the most important elements in the set of variables used in the rating process for credit operations. Evaluation of this quality is, generally speaking, founded on qualitative appreciations, which can induce subjectivism and heterogeneity into the rating. The problem can be solved by complementarily using quantitative techniques such as DEA (Data Envelopment Analysis).
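Full DEA solves a linear program per decision-making unit, but the one-input, one-output special case reduces to normalizing each unit's output/input ratio by the best ratio, which is enough to illustrate the efficiency-scoring idea. The data below are invented, not from the article.

```python
# One-input, one-output DEA-style efficiency scores (illustrative special
# case; general DEA with multiple inputs/outputs requires linear programming).

def efficiency_scores(inputs, outputs):
    """Output/input ratios normalized so the frontier unit scores 1.0."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

staff = [10.0, 20.0, 15.0]   # input, e.g. employees (hypothetical banks)
loans = [50.0, 80.0, 90.0]   # output, e.g. loans issued

scores = efficiency_scores(staff, loans)   # the third bank defines the frontier
```

Units scoring below 1.0 are inefficient relative to the frontier, which is the kind of objective, comparable measure the article proposes as a complement to subjective management-quality ratings.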

  15. Solution identification and quantitative analysis of fiber-capacitive drop analyzer based on multivariate statistical methods

    Science.gov (United States)

    Chen, Zhe; Qiu, Zurong; Huo, Xinming; Fan, Yuming; Li, Xinghua

    2017-03-01

    A fiber-capacitive drop analyzer is an instrument that monitors a growing droplet to produce a capacitive opto-tensiotrace (COT). Each COT is an integration of fiber light intensity signals and capacitance signals and can reflect the unique physicochemical properties of a liquid. In this study, we propose a solution identification and concentration quantitation method based on multivariate statistical methods. Eight characteristic values are extracted from each COT. A series of COT characteristic values of training solutions at different concentrations composes a data library for that kind of solution. A two-stage linear discriminant analysis is applied to analyze the different solution libraries and establish discriminant functions. Test solutions can be discriminated by these functions. After determining the variety of a test solution, a Spearman correlation test and principal components analysis are used to filter and reduce the dimensions of the eight characteristic values, producing a new representative parameter. A cubic spline interpolation function is built between the parameters and concentrations, based on which we can calculate the concentration of the test solution. Methanol, ethanol, n-propanol, and saline solutions are taken as experimental subjects in this paper. For each solution, nine or ten different concentrations are chosen as the standard library, and the other two concentrations compose the test group. By using the methods mentioned above, all eight test solutions are correctly identified and the average relative error of quantitative analysis is 1.11%. The proposed method is feasible; it enlarges the applicable scope of recognizing liquids based on the COT and improves the precision of concentration quantitation.

  16. Quantitative functional scintigraphy of the salivary glands: A new method of interpreting and clinical results

    International Nuclear Information System (INIS)

    Schneider, P.; Trauring, G.; Haas, J.P.; Noodt, A.; Draf, W.

    1984-01-01

    Tc-99m pertechnetate is injected i.v. and the kinetics of the tracer in the salivary glands are analyzed using a gamma camera and a computer system. To visualize regional gland function, phase images as well as so-called gradient images are generated, which reflect the rate of tracer inflow and outflow. The time-activity curves for the individual glands, obtained with the ROI technique, show an initial rise which reflects the pertechnetate uptake potential of the gland and is superimposed by background activity. After a standardized lemon juice dose the curve drops steeply, with the slope depending on the outflow potential of the gland and the background activity. In the past, attempts at quantifying the uptake and elimination functions have failed because of problems in allowing for the variable background component of the time-activity curves, which normally amounts to about 60%. In 25 patients in whom one gland had been removed surgically, the background activity was examined in terms of its time course and regional pattern, and a patient- and gland-specific subtraction method was developed for visualizing the time-activity curves of isolated glands devoid of any background activity and describing the uptake and elimination potentials in quantitative terms. Using this new method we evaluated 305 salivary gland scans. Normal ranges for the quantitative parameters were established and their reproducibility was examined. Unlike qualitative functional images of the salivary glands, the new quantitative method offers accurate evidence of the extent of gland function and thus helps to decide whether a gland should be salvaged (conservative versus surgical treatment). However, quantitation does not furnish any clues on the benign or malignant nature of a tumor. (Author)

  17. A validated method for the quantitation of 1,1-difluoroethane using a gas in equilibrium method of calibration.

    Science.gov (United States)

    Avella, Joseph; Lehrer, Michael; Zito, S William

    2008-10-01

    1,1-Difluoroethane (DFE), also known as Freon 152A, is a member of a class of compounds known as halogenated hydrocarbons. A number of these compounds have gained notoriety because of their ability to induce rapid onset of intoxication after inhalation exposure. Abuse of DFE has necessitated the development of methods for its detection and quantitation in postmortem and human performance specimens. Furthermore, methodologies applicable to research studies are required, as there have been limited toxicokinetic and toxicodynamic reports published on DFE. This paper describes a method for the quantitation of DFE using a gas chromatography-flame-ionization headspace technique that employs solventless standards for calibration. Two calibration curves using 0.5 mL whole blood calibrators were developed, covering 0.225-1.350 mg/L (curve A) and 9.0-180.0 mg/L (curve B). These were evaluated for linearity (0.9992 and 0.9995), a limit of detection of 0.018 mg/L, a limit of quantitation of 0.099 mg/L (recovery 111.9%, CV 9.92%), and an upper limit of linearity of 27,000.0 mg/L. Combined-curve recovery for a 98.0 mg/L DFE control prepared using an alternate technique was 102.2%, with a CV of 3.09%. No matrix interference was observed in DFE-enriched blood, urine or brain specimens, nor did analysis of variance detect any significant differences (alpha = 0.01) in the area under the curve of blood, urine or brain specimens at three identical DFE concentrations. The method is suitable for use in forensic laboratories because validation was performed on instrumentation routinely used in forensic labs and because the calibration range can easily be adjusted. Perhaps more importantly, it is also useful for research-oriented studies because removing solvent from standard preparation eliminates the possibility of solvent-induced changes to the gas/liquid partitioning of DFE or chromatographic interference due to the presence of solvent in specimens.
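The recovery and CV figures reported in validations like this follow standard definitions: recovery is the mean measured value as a percentage of the nominal concentration, and CV is the relative standard deviation of replicates. The replicate data below are illustrative, not the study's measurements.

```python
# Recovery (%) and coefficient of variation (%) for a QC control, as used
# in analytical method validation. Replicate values are hypothetical.

def recovery_and_cv(measured, nominal):
    """Return (recovery %, CV %) for replicate measurements of a control."""
    n = len(measured)
    mean = sum(measured) / n
    var = sum((m - mean) ** 2 for m in measured) / (n - 1)  # sample variance
    recovery_pct = 100.0 * mean / nominal
    cv_pct = 100.0 * var ** 0.5 / mean
    return recovery_pct, cv_pct

# Hypothetical replicate analyses of a 98.0 mg/L DFE control:
replicates = [100.0, 103.0, 97.0, 101.0]
rec, cv = recovery_and_cv(replicates, 98.0)
```

Acceptance criteria in forensic method validation typically require recovery close to 100% and a CV below a laboratory-defined threshold at each QC level.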

  18. Quantitative aspects of myocardial perfusion imaging

    International Nuclear Information System (INIS)

    Vogel, R.A.

    1980-01-01

    Myocardial perfusion measurements have traditionally been performed in a quantitative fashion using application of the Sapirstein, Fick, Kety-Schmidt, or compartmental analysis principles. Although global myocardial blood flow measurements have not proven clinically useful, regional determinations have substantially advanced our understanding of and ability to detect myocardial ischemia. With the introduction of thallium-201, such studies have become widely available, although these have generally undergone qualitative evaluation. Using computer-digitized data, several methods for the quantification of myocardial perfusion images have been introduced. These include orthogonal and polar coordinate systems and anatomically oriented region of interest segmentation. Statistical ranges of normal and time-activity analyses have been applied to these data, resulting in objective and reproducible means of data evaluation

  19. Integrating Quantitative and Qualitative Results in Health Science Mixed Methods Research Through Joint Displays

    Science.gov (United States)

    Guetterman, Timothy C.; Fetters, Michael D.; Creswell, John W.

    2015-01-01

    PURPOSE Mixed methods research is becoming an important methodology to investigate complex health-related topics, yet the meaningful integration of qualitative and quantitative data remains elusive and needs further development. A promising innovation to facilitate integration is the use of visual joint displays that bring data together visually to draw out new insights. The purpose of this study was to identify exemplar joint displays by analyzing the various types of joint displays being used in published articles. METHODS We searched for empirical articles that included joint displays in 3 journals that publish state-of-the-art mixed methods research. We analyzed each of 19 identified joint displays to extract the type of display, mixed methods design, purpose, rationale, qualitative and quantitative data sources, integration approaches, and analytic strategies. Our analysis focused on what each display communicated and its representation of mixed methods analysis. RESULTS The most prevalent types of joint displays were statistics-by-themes and side-by-side comparisons. Innovative joint displays connected findings to theoretical frameworks or recommendations. Researchers used joint displays for convergent, explanatory sequential, exploratory sequential, and intervention designs. We identified exemplars for each of these designs by analyzing the inferences gained through using the joint display. Exemplars represented mixed methods integration, presented integrated results, and yielded new insights. CONCLUSIONS Joint displays appear to provide a structure to discuss the integrated analysis and assist both researchers and readers in understanding how mixed methods provides new insights. We encourage researchers to use joint displays to integrate and represent mixed methods analysis and discuss their value. PMID:26553895

  20. Application of new least-squares methods for the quantitative infrared analysis of multicomponent samples

    International Nuclear Information System (INIS)

    Haaland, D.M.; Easterling, R.G.

    1982-01-01

    Improvements have been made in previous least-squares regression analyses of infrared spectra for the quantitative estimation of concentrations of multicomponent mixtures. Spectral baselines are fitted by least-squares methods, and overlapping spectral features are accounted for in the fitting procedure. Selection of peaks above a threshold value reduces computation time and data storage requirements. Four weighted least-squares methods incorporating different baseline assumptions were investigated using FT-IR spectra of the three pure xylene isomers and their mixtures. By fitting only regions of the spectra that follow Beer's Law, accurate results can be obtained using three of the fitting methods even when baselines are not corrected to zero. Accurate results can also be obtained using one of the fits even in the presence of Beer's Law deviations. This is a consequence of pooling the weighted results for each spectral peak such that the greatest weighting is automatically given to those peaks that adhere to Beer's Law. It has been shown with the xylene spectra that semiquantitative results can be obtained even when all the major components are not known or when expected components are not present. This improvement over previous methods greatly expands the utility of quantitative least-squares analyses
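The classical least-squares idea described above, with the baseline included in the fitting procedure rather than corrected beforehand, can be sketched as follows. The Gaussian "pure-component" bands, noise level, and sloping linear baseline are invented stand-ins for the xylene spectra, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(0)
wn = np.linspace(600, 800, 200)            # synthetic wavenumber axis (cm^-1)

def band(center, width):
    return np.exp(-0.5 * ((wn - center) / width) ** 2)

# Assumed pure-component spectra standing in for the three xylene isomers.
pure = np.stack([band(650, 8), band(700, 10), band(760, 6)])

true_c = np.array([0.2, 0.5, 0.3])
baseline = 0.05 + 1e-4 * (wn - wn[0])      # uncorrected sloping baseline
mix = true_c @ pure + baseline + rng.normal(0, 1e-4, wn.size)

# Design matrix: pure spectra plus a constant-and-linear baseline model,
# so the baseline is fitted by least squares along with the concentrations.
X = np.vstack([pure, np.ones_like(wn), wn - wn[0]]).T
coef, *_ = np.linalg.lstsq(X, mix, rcond=None)
est_c = coef[:3]
print(np.round(est_c, 3))
```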

  1. Deterministic quantitative risk assessment development

    Energy Technology Data Exchange (ETDEWEB)

    Dawson, Jane; Colquhoun, Iain [PII Pipeline Solutions Business of GE Oil and Gas, Cramlington Northumberland (United Kingdom)

    2009-07-01

    Current risk assessment practice in pipeline integrity management is to use a semi-quantitative index-based or model based methodology. This approach has been found to be very flexible and provide useful results for identifying high risk areas and for prioritizing physical integrity assessments. However, as pipeline operators progressively adopt an operating strategy of continual risk reduction with a view to minimizing total expenditures within safety, environmental, and reliability constraints, the need for quantitative assessments of risk levels is becoming evident. Whereas reliability based quantitative risk assessments can be and are routinely carried out on a site-specific basis, they require significant amounts of quantitative data for the results to be meaningful. This need for detailed and reliable data tends to make these methods unwieldy for system-wide risk assessment applications. This paper describes methods for estimating risk quantitatively through the calibration of semi-quantitative estimates to failure rates for peer pipeline systems. The methods involve the analysis of the failure rate distribution, and techniques for mapping the rate to the distribution of likelihoods available from currently available semi-quantitative programs. By applying point value probabilities to the failure rates, deterministic quantitative risk assessment (QRA) provides greater rigor and objectivity than can usually be achieved through the implementation of semi-quantitative risk assessment results. The method permits a fully quantitative approach or a mixture of QRA and semi-QRA to suit the operator's data availability and quality, and analysis needs. For example, consequence analysis can be quantitative or can address qualitative ranges for consequence categories. Likewise, failure likelihoods can be output as classical probabilities or as expected failure frequencies as required. (author)

  2. Nuclear medicine and imaging research. Instrumentation and quantitative methods of evaluation. Progress report, January 15, 1985-January 14, 1986

    International Nuclear Information System (INIS)

    Beck, R.N.; Cooper, M.D.

    1985-09-01

    This program of research addresses problems involving the basic science and technology of radioactive tracer methods as they relate to nuclear medicine and imaging. The broad goal is to develop new instruments and methods for image formation, processing, quantitation, and display, so as to maximize the diagnostic information per unit of absorbed radiation dose to the patient. These developments are designed to meet the needs imposed by new radiopharmaceuticals developed to solve specific biomedical problems, as well as to meet the instrumentation needs associated with radiopharmaceutical production and quantitative clinical feasibility studies of the brain with PET VI. Project I addresses problems associated with the quantitative imaging of single-photon emitters; Project II addresses similar problems associated with the quantitative imaging of positron emitters; Project III addresses methodological problems associated with the quantitative evaluation of the efficacy of diagnostic imaging procedures. The original proposal covered work to be carried out over the three-year contract period. This report covers progress made during Year Three. 36 refs., 1 tab

  3. A probabilistic method for computing quantitative risk indexes from medical injuries compensation claims.

    Science.gov (United States)

    Dalle Carbonare, S; Folli, F; Patrini, E; Giudici, P; Bellazzi, R

    2013-01-01

    The increasing demand of health care services and the complexity of health care delivery require Health Care Organizations (HCOs) to approach clinical risk management through proper methods and tools. An important aspect of risk management is to exploit the analysis of medical injuries compensation claims in order to reduce adverse events and, at the same time, to optimize the costs of health insurance policies. This work provides a probabilistic method to estimate the risk level of a HCO by computing quantitative risk indexes from medical injury compensation claims. Our method is based on the estimate of a loss probability distribution from compensation claims data through parametric and non-parametric modeling and Monte Carlo simulations. The loss distribution can be estimated both on the whole dataset and, thanks to the application of a Bayesian hierarchical model, on stratified data. The approach allows quantitative assessment of the risk structure of the HCO by analyzing the loss distribution and deriving its expected value and percentiles. We applied the proposed method to 206 cases of injuries with compensation requests collected from 1999 to the first semester of 2007 by the HCO of Lodi, in the Northern part of Italy. We computed the risk indexes taking into account the different clinical departments and the different hospitals involved. The approach proved to be useful to understand the HCO risk structure in terms of frequency, severity, expected and unexpected loss related to adverse events.
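The core of the approach above, estimating a loss distribution from claims and deriving its expected value and percentiles by Monte Carlo simulation, can be sketched along these lines. The claim severities, claim frequency, and lognormal severity assumption below are all illustrative; the paper's 206-claim dataset and its parametric/non-parametric modeling details are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical compensation-claim severities (arbitrary currency units);
# stand-ins for a real claims dataset, here drawn from a lognormal.
claims = rng.lognormal(mean=9.0, sigma=1.2, size=206)

# Parametric severity model: fit a lognormal to the observed claims.
mu, sigma = np.log(claims).mean(), np.log(claims).std(ddof=1)
lam = 206 / 8.5   # assumed average claims per year over the observation window

# Monte Carlo: yearly aggregate loss = sum of N ~ Poisson(lam) severities.
n_sim = 20000
n_claims = rng.poisson(lam, n_sim)
agg = np.array([rng.lognormal(mu, sigma, n).sum() for n in n_claims])

expected_loss = agg.mean()              # "expected loss" risk index
var_95 = np.quantile(agg, 0.95)        # high-percentile "unexpected loss"
print(round(expected_loss), round(var_95))
```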

  4. Levels of reconstruction as complementarity in mixed methods research: a social theory-based conceptual framework for integrating qualitative and quantitative research.

    Science.gov (United States)

    Carroll, Linda J; Rothe, J Peter

    2010-09-01

    As in other areas of health research, there has been increasing use of qualitative methods to study public health problems such as injuries and injury prevention. Likewise, the integration of qualitative and quantitative research (mixed methods) is beginning to assume a more prominent role in public health studies, and using mixed methods has great potential for gaining a broad and comprehensive understanding of injuries and their prevention. However, qualitative and quantitative research methods are based on two inherently different paradigms, and their integration requires a conceptual framework that permits the unity of these two methods. We present a theory-driven framework for viewing qualitative and quantitative research, which enables us to integrate them in a conceptually sound and useful manner. This framework has its foundation within the philosophical concept of complementarity, as espoused in the physical and social sciences, and draws on Bergson's metaphysical work on the 'ways of knowing'. Through understanding how data are constructed and reconstructed, and the different levels of meaning that can be ascribed to qualitative and quantitative findings, we can use a mixed-methods approach to gain a conceptually sound, holistic knowledge about injury phenomena that will enhance our development of relevant and successful interventions.

  5. A Simple Linear Regression Method for Quantitative Trait Loci Linkage Analysis With Censored Observations

    OpenAIRE

    Anderson, Carl A.; McRae, Allan F.; Visscher, Peter M.

    2006-01-01

    Standard quantitative trait loci (QTL) mapping techniques commonly assume that the trait is both fully observed and normally distributed. When considering survival or age-at-onset traits these assumptions are often incorrect. Methods have been developed to map QTL for survival traits; however, they are both computationally intensive and not available in standard genome analysis software packages. We propose a grouped linear regression method for the analysis of continuous survival data. Using...

  6. QUANTITATIVE MEASUREMENT AND ASSESSMENT OF THE EFFECTS OF GLOBALIZATION OF COMPANIES AND MARKETS

    Directory of Open Access Journals (Sweden)

    N. Kovtun

    2015-10-01

    Full Text Available This paper presents the results of improving the author's methodology for assessing the globalization level of companies and markets. Based on an analysis of the features of global companies and global markets referred to in the scientific literature, criteria that can be used to determine the globalization level of companies and markets are suggested. In addition, the globalization level of the ten largest companies (according to the Forbes Global 2000 Leading Companies rating in 2015) was identified, as well as that of the corresponding industry markets: auto and truck manufacturers, major banks, software and programming, large department stores (retailers), telecommunication services, electronics producers, electronics, oil and gas operations.

  7. QACD: A method for the quantitative assessment of compositional distribution in geologic materials

    Science.gov (United States)

    Loocke, M. P.; Lissenberg, J. C. J.; MacLeod, C. J.

    2017-12-01

    In order to fully understand the petrogenetic history of a rock, it is critical to obtain a thorough characterization of the chemical and textural relationships of its mineral constituents. Element mapping combines the microanalytical techniques that allow for the analysis of major- and minor elements at high spatial resolutions (e.g., electron microbeam analysis) with 2D mapping of samples in order to provide unprecedented detail regarding the growth histories and compositional distributions of minerals within a sample. We present a method for the acquisition and processing of large area X-ray element maps obtained by energy-dispersive X-ray spectrometer (EDS) to produce a quantitative assessment of compositional distribution (QACD) of mineral populations within geologic materials. By optimizing the conditions at which the EDS X-ray element maps are acquired, we are able to obtain full thin section quantitative element maps for most major elements in relatively short amounts of time. Such maps can be used to not only accurately identify all phases and calculate mineral modes for a sample (e.g., a petrographic thin section), but, critically, enable a complete quantitative assessment of their compositions. The QACD method has been incorporated into a python-based, easy-to-use graphical user interface (GUI) called Quack. The Quack software facilitates the generation of mineral modes, element and molar ratio maps and the quantification of full-sample compositional distributions. The open-source nature of the Quack software provides a versatile platform which can be easily adapted and modified to suit the needs of the user.

  8. A comparison of quantitative methods for clinical imaging with hyperpolarized (13)C-pyruvate.

    Science.gov (United States)

    Daniels, Charlie J; McLean, Mary A; Schulte, Rolf F; Robb, Fraser J; Gill, Andrew B; McGlashan, Nicholas; Graves, Martin J; Schwaiger, Markus; Lomas, David J; Brindle, Kevin M; Gallagher, Ferdia A

    2016-04-01

    Dissolution dynamic nuclear polarization (DNP) enables the metabolism of hyperpolarized (13)C-labelled molecules, such as the conversion of [1-(13)C]pyruvate to [1-(13)C]lactate, to be dynamically and non-invasively imaged in tissue. Imaging of this exchange reaction in animal models has been shown to detect early treatment response and correlate with tumour grade. The first human DNP study has recently been completed, and, for widespread clinical translation, simple and reliable methods are necessary to accurately probe the reaction in patients. However, there is currently no consensus on the most appropriate method to quantify this exchange reaction. In this study, an in vitro system was used to compare several kinetic models, as well as simple model-free methods. Experiments were performed using a clinical hyperpolarizer, a human 3 T MR system, and spectroscopic imaging sequences. The quantitative methods were compared in vivo by using subcutaneous breast tumours in rats to examine the effect of pyruvate inflow. The two-way kinetic model was the most accurate method for characterizing the exchange reaction in vitro, and the incorporation of a Heaviside step inflow profile was best able to describe the in vivo data. The lactate time-to-peak and the lactate-to-pyruvate area under the curve ratio were simple model-free approaches that accurately represented the full reaction, with the time-to-peak method performing indistinguishably from the best kinetic model. Finally, extracting data from a single pixel was a robust and reliable surrogate of the whole region of interest. This work has identified appropriate quantitative methods for future work in the analysis of human hyperpolarized (13)C data. © 2016 The Authors. NMR in Biomedicine published by John Wiley & Sons Ltd.
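The two-way exchange model and the model-free metrics mentioned above (lactate time-to-peak, lactate-to-pyruvate AUC ratio) can be sketched with a simple forward-Euler integration. The rate constants, relaxation times, and instantaneous-bolus input below are assumed round numbers, not the paper's fitted values or its Heaviside inflow model.

```python
import numpy as np

# Two-way exchange P <-> L with T1 relaxation, integrated by forward Euler.
kpl, klp = 0.05, 0.01      # s^-1, assumed exchange rate constants
T1p, T1l = 30.0, 20.0      # s, assumed longitudinal relaxation times
dt, T = 0.1, 60.0
t = np.arange(0, T, dt)

P = np.zeros(t.size)
L = np.zeros(t.size)
P[0] = 1.0                 # instantaneous bolus of hyperpolarized pyruvate (a.u.)

for i in range(1, t.size):
    dP = -kpl * P[i-1] + klp * L[i-1] - P[i-1] / T1p
    dL = kpl * P[i-1] - klp * L[i-1] - L[i-1] / T1l
    P[i] = P[i-1] + dt * dP
    L[i] = L[i-1] + dt * dL

# Model-free metrics from the abstract: lactate time-to-peak and the
# lactate-to-pyruvate area-under-curve ratio.
ttp = t[np.argmax(L)]
auc_ratio = L.sum() / P.sum()
print(round(ttp, 1), round(auc_ratio, 3))
```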

  9. AI/OR computational model for integrating qualitative and quantitative design methods

    Science.gov (United States)

    Agogino, Alice M.; Bradley, Stephen R.; Cagan, Jonathan; Jain, Pramod; Michelena, Nestor

    1990-01-01

    A theoretical framework for integrating qualitative and numerical computational methods for optimally-directed design is described. The theory is presented as a computational model and features of implementations are summarized where appropriate. To demonstrate the versatility of the methodology we focus on four seemingly disparate aspects of the design process and their interaction: (1) conceptual design, (2) qualitative optimal design, (3) design innovation, and (4) numerical global optimization.

  10. Quantitative investment analysis

    CERN Document Server

    DeFusco, Richard

    2007-01-01

    In the "Second Edition" of "Quantitative Investment Analysis," financial experts Richard DeFusco, Dennis McLeavey, Jerald Pinto, and David Runkle outline the tools and techniques needed to understand and apply quantitative methods to today's investment process.

  11. Development and evaluation of event-specific quantitative PCR method for genetically modified soybean A2704-12.

    Science.gov (United States)

    Takabatake, Reona; Akiyama, Hiroshi; Sakata, Kozue; Onishi, Mari; Koiwa, Tomohiro; Futo, Satoshi; Minegishi, Yasutaka; Teshima, Reiko; Mano, Junichi; Furui, Satoshi; Kitta, Kazumi

    2011-01-01

    A novel real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) soybean event, A2704-12. During the plant transformation, DNA fragments derived from pUC19 plasmid were integrated in A2704-12, and the region was found to be A2704-12 specific. The pUC19-derived DNA sequences were used as primers for the specific detection of A2704-12. We first tried to construct a standard plasmid for A2704-12 quantification using pUC19. However, non-specific signals appeared with both qualitative and quantitative PCR analyses using the specific primers with pUC19 as a template, and we then constructed a plasmid using pBR322. The conversion factor (C(f)), which is required to calculate the amount of the genetically modified organism (GMO), was experimentally determined with two real-time PCR instruments, the Applied Biosystems 7900HT and the Applied Biosystems 7500. The determined C(f) values were both 0.98. The quantitative method was evaluated by means of blind tests in multi-laboratory trials using the two real-time PCR instruments. The limit of quantitation for the method was estimated to be 0.1%. The trueness and precision were evaluated as the bias and reproducibility of relative standard deviation (RSD(R)), and the determined bias and RSD(R) values for the method were each less than 20%. These results suggest that the developed method would be suitable for practical analyses for the detection and quantification of A2704-12.
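The role of the conversion factor in this kind of event-specific quantitation can be illustrated with a minimal sketch: the GMO amount is the event-specific to taxon-specific copy-number ratio divided by C(f). The function name and the copy numbers are hypothetical; only the C(f) value of 0.98 comes from the abstract.

```python
# Sketch of GMO content estimation from real-time PCR copy numbers using a
# conversion factor, as in event-specific quantitation schemes. The copy
# numbers in the example call are invented.
CF_A2704_12 = 0.98   # conversion factor reported for both instruments

def gmo_percent(event_copies: float, taxon_copies: float,
                cf: float = CF_A2704_12) -> float:
    """GMO % = (event-specific / taxon-specific copy ratio) / Cf * 100."""
    if taxon_copies <= 0:
        raise ValueError("taxon-specific copy number must be positive")
    return (event_copies / taxon_copies) / cf * 100.0

print(round(gmo_percent(49.0, 10000.0), 2))   # prints 0.5
```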

  12. Quantitative method for measurement of the Goos-Hanchen effect based on source divergence considerations

    International Nuclear Information System (INIS)

    Gray, Jeffrey F.; Puri, Ashok

    2007-01-01

    In this paper we report on a method for quantitative measurement and characterization of the Goos-Hanchen effect based upon the real world performance of optical sources. A numerical model of a nonideal plane wave is developed in terms of uniform divergence properties. This model is applied to the Goos-Hanchen shift equations to determine beam shift displacement characteristics, which provides quantitative estimates of finite shifts near critical angle. As a potential technique for carrying out a meaningful comparison with experiments, a classical method of edge detection is discussed. To this end a line spread Green's function is defined which can be used to determine the effective transfer function of the near critical angle behavior of divergent plane waves. The process yields a distributed (blurred) output with a line spread function characteristic of the inverse square root nature of the Goos-Hanchen shift equation. A parameter of interest for measurement is given by the edge shift function. Modern imaging and image processing methods provide suitable techniques for exploiting the edge shift phenomena to attain refractive index sensitivities of the order of 10(-6), comparable with the recent results reported in the literature.

  13. A Global Network Alignment Method Using Discrete Particle Swarm Optimization.

    Science.gov (United States)

    Huang, Jiaxiang; Gong, Maoguo; Ma, Lijia

    2016-10-19

    Molecular interactions data increase exponentially with the advance of biotechnology. This makes it possible and necessary to comparatively analyse the different data at a network level. Global network alignment is an important network comparison approach to identify conserved subnetworks and get insight into evolutionary relationships across species. Network alignment, which is analogous to subgraph isomorphism, is known to be an NP-hard problem. In this paper, we introduce a novel heuristic Particle-Swarm-Optimization based Network Aligner (PSONA), which optimizes a weighted global alignment model considering both protein sequence similarity and interaction conservations. The particle statuses and status updating rules are redefined in a discrete form by using permutation. A seed-and-extend strategy is employed to guide the searching for the superior alignment. The proposed initialization method "seeds" matches with high sequence similarity into the alignment, which guarantees the functional coherence of the mapping nodes. A greedy local search method is designed as the "extension" procedure to iteratively optimize the edge conservations. PSONA is compared with several state-of-the-art methods on ten network pairs combined by five species. The experimental results demonstrate that the proposed aligner can map the proteins with high functional coherence and can be used as a booster to effectively refine the well-studied aligners.

  14. GLOBAL CLASSIFICATION OF DERMATITIS DISEASE WITH K-MEANS CLUSTERING IMAGE SEGMENTATION METHODS

    OpenAIRE

    Prafulla N. Aerkewar1 & Dr. G. H. Agrawal2

    2018-01-01

    The objective of this paper is to present a global technique for the classification of different dermatitis disease lesions using k-means clustering image segmentation. The word global is used in the sense that all dermatitis diseases presenting skin lesions on the body are classified into four categories using k-means image segmentation and the nntool of Matlab. Through the image segmentation technique and nntool, one can analyze and study the segmentation properties of skin lesions occurring in...
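The k-means segmentation step described above can be sketched on a toy 1-D intensity image: pixels are clustered by intensity, and the brighter cluster is taken as the "lesion". The synthetic pixel populations and the lesion-by-brightness assumption are illustrative stand-ins for the paper's dermatitis photographs and its Matlab nntool classification stage.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy image: two intensity populations standing in for skin and lesion pixels.
img = np.concatenate([rng.normal(0.2, 0.02, 500),   # background skin
                      rng.normal(0.8, 0.02, 100)])  # lesion pixels

def kmeans_1d(x, k=2, iters=20):
    """Plain Lloyd's k-means on scalar intensities."""
    centers = np.quantile(x, np.linspace(0.1, 0.9, k))
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        centers = np.array([x[labels == j].mean() for j in range(k)])
    return labels, centers

labels, centers = kmeans_1d(img)
lesion_cluster = int(np.argmax(centers))            # brighter cluster = lesion
lesion_fraction = (labels == lesion_cluster).mean()
print(np.round(centers, 2), round(lesion_fraction, 3))
```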

  15. Global Modeling of Microwave Three Terminal Active Devices Using the FDTD Method

    National Research Council Canada - National Science Library

    Mrabet, O. E; Essaaidi, M; Drissi, M'hamed

    2005-01-01

    This paper presents a new approach for the global electromagnetic analysis of the three-Terminal active linear and nonlinear microwave circuits using the Finite-Difference Time Domain (FDTD) Method...

  16. Quantitative analysis of Tl-201 myocardial perfusion image with special reference to circumferential profile method

    Energy Technology Data Exchange (ETDEWEB)

    Miyanaga, Hajime [Kyoto Prefectural Univ. of Medicine (Japan)

    1982-08-01

    A quantitative analysis of thallium-201 myocardial perfusion images (MPI) was attempted using the circumferential profile method (CPM); the first purpose of this study was to assess the clinical utility of this method for the detection of myocardial ischemia. In patients with coronary artery disease, CPM analysis of exercise Tl-MPI showed high sensitivity (9/12, 75%) and specificity (9/9, 100%), whereas exercise ECG showed high sensitivity (9/12, 75%) but relatively low specificity (7/9, 78%). In patients with myocardial infarction, CPM also showed high sensitivity (34/38, 89%) for the detection of myocardial necrosis, compared with visual interpretation (31/38, 81%) and with ECG (31/38, 81%). Defect score correlated well with the number of abnormal Q waves. In the exercise study, CPM was also sensitive to the change in perfusion defect in Tl-MPI produced by exercise. The results therefore indicate that CPM analyzes Tl-MPI not only quantitatively but also objectively. Although ECG is the most commonly used diagnostic tool for ischemic heart disease, several exercise-induced ischemic ECG changes remain under discussion as criteria. The second purpose of this study was therefore to evaluate these ischemic ECG changes against quantitatively analyzed exercise Tl-MPI. ST depression (ischemic, 1 mm, and junctional, 2 mm or more), ST elevation (1 mm or more), and coronary T-wave reversion in exercise ECG were thought to be ischemic changes.
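A circumferential profile in this general sense samples the maximum counts along rays from the ventricular center at fixed angular steps and normalizes the result; segments falling below a threshold contribute to a defect score. The sketch below does this on a synthetic ring image with an artificial cold segment; the geometry, threshold, and scoring rule are assumptions for illustration, not the paper's protocol.

```python
import numpy as np

# Synthetic "short-axis" image: a ring of activity with a cold 60-degree
# sector standing in for a perfusion defect (not real Tl-201 data).
n = 64
y, x = np.mgrid[0:n, 0:n]
cx = cy = (n - 1) / 2
r = np.hypot(x - cx, y - cy)
theta = np.arctan2(y - cy, x - cx)
img = np.exp(-0.5 * ((r - 20) / 3) ** 2)        # ring of activity
img[(theta > 0) & (theta < np.pi / 3)] *= 0.4   # simulated defect sector

# Circumferential profile: max counts along each ray from the center.
angles = np.linspace(-np.pi, np.pi, 60, endpoint=False)
profile = np.empty(angles.size)
for i, a in enumerate(angles):
    rad = np.linspace(5, 30, 100)
    px = np.clip((cx + rad * np.cos(a)).astype(int), 0, n - 1)
    py = np.clip((cy + rad * np.sin(a)).astype(int), 0, n - 1)
    profile[i] = img[py, px].max()

profile /= profile.max()
defect_segments = int((profile < 0.6).sum())    # crude "defect score"
print(defect_segments)
```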

  17. Integrating Quantitative and Qualitative Results in Health Science Mixed Methods Research Through Joint Displays.

    Science.gov (United States)

    Guetterman, Timothy C; Fetters, Michael D; Creswell, John W

    2015-11-01

    Mixed methods research is becoming an important methodology to investigate complex health-related topics, yet the meaningful integration of qualitative and quantitative data remains elusive and needs further development. A promising innovation to facilitate integration is the use of visual joint displays that bring data together visually to draw out new insights. The purpose of this study was to identify exemplar joint displays by analyzing the various types of joint displays being used in published articles. We searched for empirical articles that included joint displays in 3 journals that publish state-of-the-art mixed methods research. We analyzed each of 19 identified joint displays to extract the type of display, mixed methods design, purpose, rationale, qualitative and quantitative data sources, integration approaches, and analytic strategies. Our analysis focused on what each display communicated and its representation of mixed methods analysis. The most prevalent types of joint displays were statistics-by-themes and side-by-side comparisons. Innovative joint displays connected findings to theoretical frameworks or recommendations. Researchers used joint displays for convergent, explanatory sequential, exploratory sequential, and intervention designs. We identified exemplars for each of these designs by analyzing the inferences gained through using the joint display. Exemplars represented mixed methods integration, presented integrated results, and yielded new insights. Joint displays appear to provide a structure to discuss the integrated analysis and assist both researchers and readers in understanding how mixed methods provides new insights. We encourage researchers to use joint displays to integrate and represent mixed methods analysis and discuss their value. © 2015 Annals of Family Medicine, Inc.

  18. Quantitative methods for developing C2 system requirement

    Energy Technology Data Exchange (ETDEWEB)

    Tyler, K.K.

    1992-06-01

    The US Army established the Army Tactical Command and Control System (ATCCS) Experimentation Site (AES) to provide a place where material and combat developers could experiment with command and control systems. The AES conducts fundamental and applied research involving command and control issues using a number of research methods, ranging from large force-level experiments, to controlled laboratory experiments, to studies and analyses. The work summarized in this paper was done by Pacific Northwest Laboratory under task order from the Army Tactical Command and Control System Experimentation Site. The purpose of the task was to develop the functional requirements for army engineer automation and support software, including MCS-ENG. A client, such as an army engineer, has certain needs and requirements of his or her software; these needs must be presented in ways that are readily understandable to the software developer. A requirements analysis, then, such as the one described in this paper, is simply the means of communication between those who would use a piece of software and those who would develop it. The analysis from which this paper was derived attempted to bridge the "communications gap" between army combat engineers and software engineers. It sought to derive and state the software needs of army engineers in ways that are meaningful to software engineers. In doing this, it followed a natural sequence of investigation: (1) what does an army engineer do, (2) with which tasks can software help, (3) how much will it cost, and (4) where is the highest payoff? This paper demonstrates how each of these questions was addressed during an analysis of the functional requirements of engineer support software. Systems engineering methods were used in the task analysis, and a quantitative scoring method was developed to score responses regarding the feasibility of task automation. The paper discusses the methods used to perform utility and cost-benefits estimates.
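A quantitative scoring scheme of the kind described, scoring candidate tasks on weighted criteria and ranking them by benefit per unit cost to find the highest payoff, can be sketched as below. The task names, criteria, weights, scores, and cost figures are all invented for illustration; the paper's actual scoring instrument is not reproduced.

```python
# Weighted utility scores divided by estimated cost rank candidate tasks
# for automation; all values here are hypothetical.
criteria_weights = {"frequency": 0.4, "error_reduction": 0.35, "time_saved": 0.25}

tasks = {
    "terrain_analysis":  {"frequency": 9, "error_reduction": 7, "time_saved": 8, "cost": 3.0},
    "report_generation": {"frequency": 6, "error_reduction": 5, "time_saved": 9, "cost": 1.5},
    "route_planning":    {"frequency": 8, "error_reduction": 8, "time_saved": 6, "cost": 4.0},
}

def payoff(task: dict) -> float:
    """Weighted utility per unit cost: the 'highest payoff' criterion."""
    utility = sum(criteria_weights[c] * task[c] for c in criteria_weights)
    return utility / task["cost"]

ranked = sorted(tasks, key=lambda t: payoff(tasks[t]), reverse=True)
print(ranked[0])   # prints report_generation
```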

  19. Global tractography with embedded anatomical priors for quantitative connectivity analysis

    Directory of Open Access Journals (Sweden)

    Alia eLemkaddem

    2014-11-01

    Full Text Available The main assumption of fiber-tracking algorithms is that fiber trajectories are represented by paths of highest diffusion, which is usually accomplished by following the principal diffusion directions estimated in every voxel from the measured diffusion MRI data. The state-of-the-art approaches, known as global tractography, reconstruct all the fiber tracts of the whole brain simultaneously by solving a global energy minimization problem. The tractograms obtained with these algorithms outperform any previous technique but, unfortunately, the price to pay is an increased computational cost which is not suitable in many practical settings, both in terms of time and memory requirements. Furthermore, existing global tractography algorithms suffer from an important shortcoming that is crucial in the context of brain connectivity analyses. As no anatomical priors are used during the reconstruction process, the recovered fiber tracts are not guaranteed to connect cortical regions and, as a matter of fact, most of them stop prematurely in the white matter. This not only unnecessarily slows down the estimation procedure and potentially biases any subsequent analysis but also, most importantly, prevents the de facto quantification of brain connectivity. In this work, we propose a novel approach for global tractography that is specifically designed for connectivity analysis applications by explicitly enforcing anatomical priors of the tracts in the optimization and considering the effective contribution of each of them, i.e. volume, to the acquired diffusion MRI image. We evaluated our approach on both a realistic diffusion MRI phantom and in-vivo data, and also compared its performance to existing tractography algorithms.

  20. Empowering people to change occupational behaviours to address critical global issues.

    Science.gov (United States)

    Ikiugu, Moses N; Westerfield, Madeline A; Lien, Jamie M; Theisen, Emily R; Cerny, Shana L; Nissen, Ranelle M

    2015-06-01

    The greatest threat to human well-being in this century is climate change and related global issues. We examined the effectiveness of the Modified Instrumentalism in Occupational Therapy model as a framework for facilitating occupational behaviour change to address climate change and related issues. Eleven individuals participated in this mixed-methods single-subject-design study. Data were gathered using the Modified Assessment and Intervention Instrument for Instrumentalism in Occupational Therapy and Daily Occupational Inventories. Quantitative data were analyzed using two- and three-standard deviation band methods. Qualitative data were analyzed using heuristic phenomenological procedures. Occupational performance changed for five participants. Participants' feelings shifted from frustration and helplessness to empowerment and a desire for action. They felt empowered to find occupation-based solutions to the global issues. Occupation-based interventions that increase personal awareness of the connection between occupational performance and global issues could empower people to be agents for action to ameliorate the issues.
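The two-standard-deviation band method mentioned above flags a behaviour change when consecutive intervention-phase points fall outside the baseline mean ± 2 SD band. A minimal sketch, with invented baseline and intervention scores and a two-consecutive-points decision rule assumed for illustration:

```python
import numpy as np

# Hypothetical single-subject data: baseline phase, then intervention phase.
baseline = np.array([4.0, 5.0, 4.5, 5.5, 4.8, 5.2])
intervention = np.array([5.0, 6.0, 7.5, 8.0, 7.8, 8.2])

mean, sd = baseline.mean(), baseline.std(ddof=1)
upper, lower = mean + 2 * sd, mean - 2 * sd       # 2-SD band from baseline

outside = (intervention > upper) | (intervention < lower)

# Longest run of consecutive out-of-band intervention points.
run = best = 0
for flag in outside:
    run = run + 1 if flag else 0
    best = max(best, run)

changed = best >= 2   # assumed decision rule: >= 2 consecutive points outside
print(round(upper, 2), changed)
```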

  1. Fractal aspects and convergence of Newton's method

    Energy Technology Data Exchange (ETDEWEB)

    Drexler, M. [Oxford Univ. Computing Lab. (United Kingdom)]

    1996-12-31

    Newton's Method is a widely established iterative algorithm for solving non-linear systems. Its appeal lies in its great simplicity, easy generalization to multiple dimensions and a quadratic local convergence rate. Despite these features, little is known about its global behavior. In this paper, we will explain a seemingly random global convergence pattern using fractal concepts and show that the behavior of the residual is entirely explicable. We will also establish quantitative results for the convergence rates. Knowing the mechanism of fractal generation, we present a stabilization to the orthodox Newton method that remedies the fractal behavior and improves convergence.
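
    The quadratic local convergence described above can be illustrated with a minimal 1-D Newton iteration; the example function f(x) = x^2 - 2 is a hypothetical test problem, not taken from the paper:

```python
# Minimal 1-D Newton iteration: x <- x - f(x)/f'(x).
# Near a simple root the error roughly squares at each step (quadratic convergence).

def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Return an approximate root of f starting from x0."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        x -= fx / df(x)
    return x

# Hypothetical test problem: f(x) = x^2 - 2, root sqrt(2).
root = newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, 1.0)
```

    Far from a root, the iteration can jump erratically between basins of attraction, which is precisely the fractal behavior the paper analyzes.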

  2. Decadal Changes in Global Ocean Annual Primary Production

    Science.gov (United States)

    Gregg, Watson; Conkright, Margarita E.; Behrenfeld, Michael J.; Ginoux, Paul; Casey, Nancy W.; Koblinsky, Chester J. (Technical Monitor)

    2002-01-01

    The Sea-viewing Wide Field-of-View Sensor (SeaWiFS) has produced the first multi-year time series of global ocean chlorophyll observations since the demise of the Coastal Zone Color Scanner (CZCS) in 1986. Global observations from 1997-present from SeaWiFS combined with observations from 1979-1986 from the CZCS should in principle provide an opportunity to observe decadal changes in global ocean annual primary production, since chlorophyll is the primary driver for estimates of primary production. However, incompatibilities between algorithms have so far precluded quantitative analysis. We have developed and applied compatible processing methods for the CZCS, using modern advances in atmospheric correction and consistent bio-optical algorithms to advance the CZCS archive to comparable quality with SeaWiFS. We applied blending methodologies, in which in situ observations are incorporated into the CZCS and SeaWiFS data records, to improve the residuals. These re-analyzed, blended data records provide maximum compatibility and permit, for the first time, a quantitative analysis of the changes in global ocean primary production between the early-to-mid 1980s and the present, using synoptic satellite observations. An intercomparison of the global and regional primary production from these blended satellite observations is important for understanding global climate change and its effects on ocean biota. Photosynthesis by chlorophyll-containing phytoplankton is responsible for biotic uptake of carbon in the oceans and, ultimately, potentially from the atmosphere. Global ocean annual primary production decreased from the CZCS record to SeaWiFS by nearly 6% from the early 1980s to the present. Annual primary production in the high latitudes was responsible for most of the decadal change. Conversely, primary production in the low latitudes generally increased, with the exception of the tropical Pacific. The differences and similarities of the two data records provide evidence

  3. From individual innovation to global impact: the Global Cooperation on Assistive Technology (GATE) innovation snapshot as a method for sharing and scaling.

    Science.gov (United States)

    Layton, Natasha; Murphy, Caitlin; Bell, Diane

    2018-05-09

    Assistive technology (AT) is an essential facilitator of independence and participation, both for people living with the effects of disability and/or non-communicable disease, as well as people aging with resultant functional decline. The World Health Organization (WHO) recognizes the substantial gap between the need for and provision of AT and is leading change through the Global Cooperation on Assistive Technology (GATE) initiative. Showcasing innovations gathered from 92 global researchers, innovators, users and educators of AT through the WHO GREAT Summit, this article provides an analysis of ideas and actions on a range of dimensions in order to provide a global overview of AT innovation. The accessible method used to capture and showcase this data is presented and critiqued, concluding that "innovation snapshots" are a rapid and concise strategy to capture and showcase AT innovation and to foster global collaboration. Implications for Rehabilitation Focal tools such as ePosters with uniform data requirements enable the rapid sharing of information. A diversity of innovative practices are occurring globally in the areas of AT Products, Policy, Provision, People and Personnel. The method offered for Innovation Snapshots had substantial uptake and is a feasible means to capture data across a range of stakeholders. Meeting accessibility criteria is an emerging competency in the AT community. Substantial areas of common interest exist across regions and globally in the AT community, demonstrating the effectiveness of information sharing platforms such as GATE and supporting the idea of regional forums and networks.

  4. A method to characterize the roughness of 2-D line features: recrystallization boundaries

    DEFF Research Database (Denmark)

    Sun, Jun; Zhang, Yubin; Dahl, Anders Bjorholm

    2017-01-01

    A method is presented, which allows quantification of the roughness of nonplanar boundaries of objects for which the neutral plane is not known. The method provides quantitative descriptions of both the local and global characteristics. How the method can be used to estimate the sizes of rough fe...

  5. A novel baseline correction method using convex optimization framework in laser-induced breakdown spectroscopy quantitative analysis

    Science.gov (United States)

    Yi, Cancan; Lv, Yong; Xiao, Han; Ke, Ke; Yu, Xun

    2017-12-01

    For the laser-induced breakdown spectroscopy (LIBS) quantitative analysis technique, baseline correction is an essential part of LIBS data preprocessing. Baseline drift, a widely observed phenomenon, is generated by fluctuation of the laser energy, inhomogeneity of sample surfaces and background noise, and it has aroused the interest of many researchers. Most of the prevalent algorithms need to preset some key parameters, such as a suitable spline function and the fitting order, and thus lack adaptability. Based on the characteristics of LIBS signals, namely the sparsity of spectral peaks and the low-pass-filtered character of the baseline, a novel baseline correction and spectral data denoising method is studied in this paper. The improved technique uses a convex optimization scheme to form a non-parametric baseline correction model. Meanwhile, an asymmetric penalty function is employed to enhance the signal-to-noise ratio (SNR) of the LIBS signal and improve reconstruction precision. Furthermore, an efficient iterative algorithm is applied to the optimization process so as to ensure convergence. To validate the proposed method, the concentrations of chromium (Cr), manganese (Mn) and nickel (Ni) in 23 certified high-alloy steel samples are assessed using quantitative models with Partial Least Squares (PLS) and Support Vector Machine (SVM). Because no prior knowledge of sample composition or mathematical hypothesis is required, the proposed method has better accuracy in quantitative analysis than other methods, and it fully reflects its adaptive ability.
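
    The paper's exact convex model is not reproduced here, but the same idea (a smoothness penalty plus an asymmetric penalty so that peaks stay above the fitted baseline) can be sketched with a classic asymmetric least squares iteration; the parameters lam and p, and the synthetic spectrum, are illustrative assumptions:

```python
import numpy as np

def asls_baseline(y, lam=1e5, p=0.01, n_iter=10):
    """Asymmetric least squares baseline: minimize sum_i w_i (y_i - z_i)^2
    + lam * sum (second differences of z)^2, with small weight w_i = p
    where y sits above the fit (peaks) and 1 - p elsewhere."""
    n = len(y)
    D = np.diff(np.eye(n), n=2, axis=0)  # (n-2, n) second-difference operator
    w = np.ones(n)
    for _ in range(n_iter):
        A = np.diag(w) + lam * D.T @ D
        z = np.linalg.solve(A, w * y)
        w = np.where(y > z, p, 1.0 - p)
    return z

# Synthetic demo: linear drift plus one narrow emission peak.
x = np.linspace(0.0, 1.0, 300)
drift = 2.0 + 0.5 * x
spectrum = drift + np.exp(-((x - 0.5) ** 2) / 0.0005)
baseline = asls_baseline(spectrum)
```

    Because the second-difference penalty vanishes on linear functions, a linear drift is recovered almost exactly away from the peak, while the asymmetric weights keep the fit from being pulled up under the peak.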

  6. Preliminary research on quantitative methods of water resources carrying capacity based on water resources balance sheet

    Science.gov (United States)

    Wang, Yanqiu; Huang, Xiaorong; Gao, Linyun; Guo, Biying; Ma, Kai

    2018-06-01

    Water resources are not only basic natural resources, but also strategic economic resources and ecological control factors. Water resources carrying capacity constrains the sustainable development of regional economy and society. Studies of water resources carrying capacity can provide helpful information about how the socioeconomic system is both supported and restrained by the water resources system. Based on the research of different scholars, major problems in the study of water resources carrying capacity were summarized as follows: the definition of water resources carrying capacity is not yet unified; quantification methods based on these inconsistent definitions are poor in operability; the current quantitative research methods of water resources carrying capacity do not fully reflect the principles of sustainable development; and it is difficult to quantify the relationship among water resources, the economy and society, and the ecological environment. Therefore, it is necessary to develop a better quantitative evaluation method to determine regional water resources carrying capacity. This paper proposes a new approach to quantifying water resources carrying capacity (that is, through the compilation of the water resources balance sheet) to get a grasp of regional water resources depletion and water environmental degradation (as well as regional water resources stock assets and liabilities), figure out the pressure that socioeconomic activities place on the environment, and discuss the quantitative calculation methods and technical route of water resources carrying capacity that are able to embody the substance of sustainable development.

  7. A comparison of ancestral state reconstruction methods for quantitative characters.

    Science.gov (United States)

    Royer-Carenzi, Manuela; Didier, Gilles

    2016-09-07

    Choosing an ancestral state reconstruction method among the alternatives available for quantitative characters may be puzzling. We present here a comparison of seven of them, namely the maximum likelihood, restricted maximum likelihood, generalized least squares under Brownian, Brownian-with-trend and Ornstein-Uhlenbeck models, phylogenetic independent contrasts and squared parsimony methods. A review of the relations between these methods shows that the maximum likelihood, the restricted maximum likelihood and the generalized least squares under Brownian model infer the same ancestral states and can only be distinguished by the distributions accounting for the reconstruction uncertainty which they provide. The respective accuracy of the methods is assessed over character evolution simulated under a Brownian motion with (and without) directional or stabilizing selection. We give the general form of ancestral state distributions conditioned on leaf states under the simulation models. Ancestral distributions are used first, to give a theoretical lower bound of the expected reconstruction error, and second, to develop an original evaluation scheme which is more efficient than comparing the reconstructed and the simulated states. Our simulations show that: (i) the distributions of the reconstruction uncertainty provided by the methods generally make sense (some more than others); (ii) it is essential to detect the presence of an evolutionary trend and to choose a reconstruction method accordingly; (iii) all the methods show good performances on characters under stabilizing selection; (iv) without trend or stabilizing selection, the maximum likelihood method is generally the most accurate. Copyright © 2016 Elsevier Ltd. All rights reserved.
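
    For intuition about why the Brownian-motion-based methods above agree, consider the simplest case of a star phylogeny: each tip value is Normal(root, sigma^2 * t_i), so the ML root state is the inverse-branch-length weighted mean of the tips. A minimal sketch with hypothetical data (not from the paper):

```python
def bm_root_mle(tip_states, branch_lengths):
    """ML ancestral (root) state under Brownian motion on a star tree:
    each tip x_i ~ Normal(root, sigma^2 * t_i), so the MLE is the
    inverse-variance weighted mean of the tips, with weights 1/t_i."""
    weights = [1.0 / t for t in branch_lengths]
    total = sum(weights)
    return sum(w * x for w, x in zip(weights, tip_states)) / total

# Equal branch lengths reduce to the plain mean of the tip states.
root_equal = bm_root_mle([1.0, 2.0, 3.0], [1.0, 1.0, 1.0])
# A tip on a longer branch carries less weight in the estimate.
root_weighted = bm_root_mle([0.0, 4.0], [1.0, 3.0])
```

    On a general tree the same weighting logic applies recursively down the tree, which is the essence of the independent-contrasts and GLS reconstructions compared in the paper.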

  8. Nuclear medicine and image research: instrumentation and quantitative methods of evaluation. Comprehensive 3-year progress report, January 15, 1983-January 14, 1986

    International Nuclear Information System (INIS)

    Beck, R.N.; Cooper, M.D.

    1985-09-01

    This program of research addresses problems involving the basic science and technology of radioactive tracer methods as they relate to nuclear medicine and imaging. The broad goal is to develop new instruments and methods for image formation, processing, quantitation, and display, so as to maximize the diagnostic information per unit of absorbed radiation dose to the patient. Project I addresses problems in the quantitative imaging of single-photon emitters; Project II addresses similar problems associated with the quantitative imaging of positron emitters; Project III addresses methodological problems associated with the quantitative evaluation of the efficacy of diagnostic imaging procedures

  9. Measuring Globalization: Existing Methods and Their Implications for Teaching Global Studies and Forecasting

    Science.gov (United States)

    Zinkina, Julia; Korotayev, Andrey; Andreev, Aleksey I.

    2013-01-01

    Purpose: The purpose of this paper is to encourage discussions regarding the existing approaches to globalization measurement (taking mainly the form of indices and rankings) and their shortcomings in terms of applicability to developing Global Studies curricula. Another aim is to propose an outline for the globalization measurement methodology…

  10. An improved method for quantitative magneto-optical analysis of superconductors

    International Nuclear Information System (INIS)

    Laviano, F; Botta, D; Chiodoni, A; Gerbaldo, R; Ghigo, G; Gozzelino, L; Zannella, S; Mezzetti, E

    2003-01-01

    We report on the analysis method to extract quantitative local electrodynamics in superconductors by means of the magneto-optical technique. First, we discuss the calibration procedure to convert local light intensity values into the magnetic induction field distribution, focusing on the role played by the generally disregarded magnetic induction components parallel to the indicator film plane (in-plane field effect). To establish the reliability of the whole technique, the method used to reconstruct the electrical current density distribution is reported, together with a numerical test example. The methodology is applied to measure local magnetic field and current distributions on a typical good-quality YBa2Cu3O7-x film. We show how the in-plane field influences the MO measurements, after which we present an algorithm to account for the in-plane field components. The meaningful impact of this correction on the experimental results is shown. Afterwards, we discuss some aspects of the electrodynamics of the superconducting sample

  11. Web Applications Vulnerability Management using a Quantitative Stochastic Risk Modeling Method

    Directory of Open Access Journals (Sweden)

    Sergiu SECHEL

    2017-01-01

    Full Text Available The aim of this research is to propose a quantitative risk modeling method that reduces the guesswork and uncertainty in the vulnerability and risk assessment activities of web-based applications, while giving users the flexibility to assess risk according to their risk appetite and tolerance with a high degree of assurance. The research method is based on work done by the OWASP Foundation on this subject, but their risk rating methodology needed debugging and updates in key areas that are presented in this paper. The modified risk modeling method uses Monte Carlo simulations to model risk characteristics that cannot be determined without guesswork. It was tested in vulnerability assessment activities on real production systems and in theory, by assigning discrete uniform assumptions to all risk characteristics (risk attributes) and evaluating the results after 1.5 million rounds of Monte Carlo simulations.
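
    The discrete uniform assumption can be sketched as follows; the OWASP-style structure (risk = likelihood x impact, each averaged from several 0-9 factor scores) is a simplified illustration, not the authors' calibrated model:

```python
import random

N_FACTORS = 4  # illustrative: OWASP averages several 0-9 factor scores

def simulate_risk(n_rounds=100_000, seed=1):
    """Monte Carlo over an OWASP-style risk score, drawing every factor
    from a discrete uniform distribution on 0..9 (the guess-free
    assumption described in the abstract)."""
    rng = random.Random(seed)
    scores = []
    for _ in range(n_rounds):
        likelihood = sum(rng.randint(0, 9) for _ in range(N_FACTORS)) / N_FACTORS
        impact = sum(rng.randint(0, 9) for _ in range(N_FACTORS)) / N_FACTORS
        scores.append(likelihood * impact)
    return scores

scores = simulate_risk()
mean_risk = sum(scores) / len(scores)  # expected value is 4.5 * 4.5 = 20.25
```

    In practice one would replace the uniform draws with calibrated distributions per risk attribute and read percentiles of the simulated score distribution against the organization's risk tolerance.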

  12. Integrated and global pseudotargeted metabolomics strategy applied to screening for quality control markers of Citrus TCMs.

    Science.gov (United States)

    Shu, Yisong; Liu, Zhenli; Zhao, Siyu; Song, Zhiqian; He, Dan; Wang, Menglei; Zeng, Honglian; Lu, Cheng; Lu, Aiping; Liu, Yuanyan

    2017-08-01

    Traditional Chinese medicine (TCM) exerts its therapeutic effect in a holistic fashion with the synergistic function of multiple characteristic constituents. The holism philosophy of TCM is coincident with global and systematic theories of metabolomics. The proposed pseudotargeted metabolomics methodologies were employed for the establishment of reliable quality control markers for use in the screening strategy of TCMs. Pseudotargeted metabolomics integrates the advantages of both targeted and untargeted methods. In the present study, targeted metabolomics equipped with the gold standard RRLC-QqQ-MS method was employed for in vivo quantitative plasma pharmacochemistry study of characteristic prototypic constituents. Meanwhile, untargeted metabolomics using UHPLC-QE Orbitrap HRMS with better specificity and selectivity was employed for identification of untargeted metabolites in the complex plasma matrix. In all, 32 prototypic metabolites were quantitatively determined, and 66 biotransformed metabolites were convincingly identified after being orally administered with standard extracts of four labeled Citrus TCMs. The global absorption and metabolism process of complex TCMs was depicted in a systematic manner.

  13. Critical Quantitative Inquiry in Context

    Science.gov (United States)

    Stage, Frances K.; Wells, Ryan S.

    2014-01-01

    This chapter briefly traces the development of the concept of critical quantitative inquiry, provides an expanded conceptualization of the tasks of critical quantitative research, offers theoretical explanation and justification for critical research using quantitative methods, and previews the work of quantitative criticalists presented in this…

  14. Prediction of Coronal Mass Ejections from Vector Magnetograms: Quantitative Measures as Predictors

    Science.gov (United States)

    Falconer, D. A.; Moore, R. L.; Gary, G. A.

    2001-05-01

    In a pilot study of 4 active regions (Falconer, D.A. 2001, JGR, in press), we derived two quantitative measures of an active region's global nonpotentiality from the region's vector magnetogram, 1) the net current (IN), and 2) the length of the strong-shear, strong-field main neutral line (LSS), and used these two measures as indicators of the CME productivity of the active regions. We compared the global nonpotentiality measures to the active regions' CME productivity determined from GOES and Yohkoh/SXT observations. We found that two of the active regions were highly globally nonpotential and were CME productive, while the other two active regions had little global nonpotentiality and produced no CMEs. At the Fall 2000 AGU (Falconer, Moore, & Gary, 2000, EOS 81, 48 F998), we reported on an expanded study (12 active regions and 17 magnetograms) in which we evaluated four quantitative global measures of an active region's magnetic field and compared these measures with the CME productivity. The four global measures (all derived from MSFC vector magnetograms) included our two previous measures (IN and LSS) as well as two new ones, the total magnetic flux (Φ) (a measure of an active region's size), and the normalized twist (α = μIN/Φ). We found that the three measures of global nonpotentiality (IN, LSS, α) were all well correlated (>99% confidence level) with an active region's CME productivity within ±2 days of the day of the magnetogram. We will now report on our findings of how good our quantitative measures are as predictors of active-region CME productivity, using only CMEs that occurred after the magnetogram. We report the preliminary skill test of these quantitative measures as predictors. We compare the CME prediction success of our quantitative measures to the CME prediction success based on an active region's past CME productivity. We examine the cases of the handful of false positives and false negatives to look for improvements to our predictors. This work is

  15. Development and applications of quantitative NMR spectroscopy

    International Nuclear Information System (INIS)

    Yamazaki, Taichi

    2016-01-01

    Recently, quantitative NMR spectroscopy has attracted attention as an analytical method that can easily secure traceability to the SI unit system, and discussions about its accuracy and uncertainty have also begun. This paper focuses on the literature on the advancement of quantitative NMR spectroscopy reported between 2009 and 2016, and introduces both NMR measurement conditions and actual analysis cases in quantitative NMR. Quantitative NMR spectroscopy using an internal reference method generally enables accurate quantitative analysis in a quick and versatile way, and it is possible to obtain precision sufficient for the evaluation of pure substances and standard solutions. Since the external reference method easily prevents contamination of the sample and allows the sample to be recovered, there are many reported cases related to the quantitative analysis of biologically related samples and highly scarce natural products whose NMR spectra are complicated. In terms of precision, the internal reference method is superior. As quantitative NMR spectroscopy spreads widely, discussions are also progressing on how to adopt this analytical method as an official method in various countries around the world. In Japan, this method is listed in the Pharmacopoeia and the Japanese Standards for Food Additives, and it is also used as the official method for purity evaluation. In the future, this method is expected to spread as a general-purpose analysis method that can ensure traceability to the SI unit system. (A.O.)
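
    The internal reference method mentioned above rests on a simple proportionality between signal integrals and molar amounts of contributing protons; a sketch of the standard purity equation follows (variable names are ours and the numbers are hypothetical):

```python
def qnmr_purity(I_s, I_ref, N_s, N_ref, M_s, M_ref, m_s, m_ref, P_ref):
    """Internal-reference qNMR purity (%): signal integrals I, numbers of
    contributing protons N, molar masses M, weighed masses m, and the
    certified purity P_ref (%) of the internal reference substance.
    purity = (I_s/I_ref) * (N_ref/N_s) * (M_s/M_ref) * (m_ref/m_s) * P_ref"""
    return (I_s / I_ref) * (N_ref / N_s) * (M_s / M_ref) * (m_ref / m_s) * P_ref

# Hypothetical check: a 2-proton analyte signal with twice the area of a
# 1-proton reference signal, equal molar masses and equal weighed masses.
purity = qnmr_purity(I_s=2.0, I_ref=1.0, N_s=2, N_ref=1,
                     M_s=100.0, M_ref=100.0, m_s=10.0, m_ref=10.0,
                     P_ref=100.0)
```

    Traceability to the SI follows because every factor is either a ratio of measured quantities or the certified purity of the reference material.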

  16. Facilitating arrhythmia simulation: the method of quantitative cellular automata modeling and parallel running

    Directory of Open Access Journals (Sweden)

    Mondry Adrian

    2004-08-01

    Full Text Available Abstract Background Many arrhythmias are triggered by abnormal electrical activity at the ionic channel and cell level, and then evolve spatio-temporally within the heart. To understand arrhythmias better and to diagnose them more precisely by their ECG waveforms, a whole-heart model is required to explore the association between the massively parallel activities at the channel/cell level and the integrative electrophysiological phenomena at organ level. Methods We have developed a method to build large-scale electrophysiological models by using extended cellular automata, and to run such models on a cluster of shared memory machines. We describe here the method, including the extension of a language-based cellular automaton to implement quantitative computing, the building of a whole-heart model with Visible Human Project data, the parallelization of the model on a cluster of shared memory computers with OpenMP and MPI hybrid programming, and a simulation algorithm that links cellular activity with the ECG. Results We demonstrate that electrical activities at channel, cell, and organ levels can be traced and captured conveniently in our extended cellular automaton system. Examples of some ECG waveforms simulated with a 2-D slice are given to support the ECG simulation algorithm. A performance evaluation of the 3-D model on a four-node cluster is also given. Conclusions Quantitative multicellular modeling with extended cellular automata is a highly efficient and widely applicable method to weave experimental data at different levels into computational models. This process can be used to investigate complex and collective biological activities that can be described neither by their governing differential equations nor by discrete parallel computation. Transparent cluster computing is a convenient and effective method to make time-consuming simulation feasible. Arrhythmias, as a typical case, can be effectively simulated with the methods
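
    As a toy stand-in for the quantitative cellular automata described above, a Greenberg-Hastings excitable-medium automaton (resting / excited / refractory states on a 2-D grid) shows how purely local rules propagate an excitation wave; this is an illustrative sketch, not the authors' whole-heart model:

```python
import numpy as np

REST, EXCITED, REFRACTORY = 0, 1, 2

def step(grid):
    """One Greenberg-Hastings update on a periodic 2-D grid:
    excited -> refractory, refractory -> resting, and a resting cell
    fires when any of its 4 neighbours is excited."""
    excited = grid == EXCITED
    neigh = (np.roll(excited, 1, axis=0) | np.roll(excited, -1, axis=0) |
             np.roll(excited, 1, axis=1) | np.roll(excited, -1, axis=1))
    return np.where(grid == EXCITED, REFRACTORY,
           np.where(grid == REFRACTORY, REST,
           np.where(neigh, EXCITED, REST)))

grid = np.zeros((20, 20), dtype=int)
grid[10, 10] = EXCITED        # point stimulus at the centre
grid = step(grid)             # the wavefront expands to the 4 neighbours
```

    The refractory state behind the front is what allows re-entrant (spiral) waves, the CA analogue of the arrhythmic activity the paper simulates; per-cell update rules are also what make the model embarrassingly parallel across a cluster.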

  17. Global warming and climate change: control methods

    International Nuclear Information System (INIS)

    Laal, M.; Aliramaie, A.

    2008-01-01

    This paper aimed at finding the causes of global warming and ways to bring it under control. Data were based on scientific opinion as given in synthesis reports of news, articles, web sites, and books. Global warming is the observed and projected increase in the average temperature of Earth's atmosphere and oceans. Carbon dioxide and other air pollutants collect in the atmosphere like a thickening blanket, trapping the sun's heat and causing the planet to warm up. Pollution is one of the biggest man-made problems, and burning fossil fuels is its main source. As average temperature increases, habitats, species and people are threatened by drought, changes in rainfall, altered seasons, and more violent storms and floods. Indeed, the life cycle of nuclear power results in relatively little pollution. Energy efficiency, solar, wind and other renewable fuels are other weapons against global warming. Human activity, primarily the burning of fossil fuels, is the major driving factor in global warming. Curtailing the release of carbon dioxide into the atmosphere by reducing the use of oil, gasoline and coal, and employing alternative energy sources, are the tools for keeping global warming under control. Global warming can be slowed and stopped with practical actions that yield a cleaner, healthier atmosphere

  18. PCR-free quantitative detection of genetically modified organism from raw materials – A novel electrochemiluminescence-based bio-barcode method

    Science.gov (United States)

    Zhu, Debin; Tang, Yabing; Xing, Da; Chen, Wei R.

    2018-01-01

    Bio-barcode assay based on oligonucleotide-modified gold nanoparticles (Au-NPs) provides a PCR-free method for quantitative detection of nucleic acid targets. However, the current bio-barcode assay requires lengthy experimental procedures, including the preparation and release of barcode DNA probes from the target-nanoparticle complex, and immobilization and hybridization of the probes for quantification. Herein, we report a novel PCR-free electrochemiluminescence (ECL)-based bio-barcode assay for the quantitative detection of genetically modified organism (GMO) from raw materials. It consists of tris-(2,2'-bipyridyl) ruthenium (TBR)-labeled barcode DNA, nucleic acid hybridization using Au-NPs and biotin-labeled probes, and selective capture of the hybridization complex by streptavidin-coated paramagnetic beads. The detection of target DNA is realized by direct measurement of the ECL emission of TBR. It can quantitatively detect target nucleic acids with high speed and sensitivity. This method can be used to quantitatively detect GMO fragments from real GMO products. PMID:18386909

  19. A Quantitative Method to Screen Common Bean Plants for Resistance to Bean common mosaic necrosis virus.

    Science.gov (United States)

    Strausbaugh, C A; Myers, J R; Forster, R L; McClean, P E

    2003-11-01

    ABSTRACT A quantitative method to screen common bean (Phaseolus vulgaris) plants for resistance to Bean common mosaic necrosis virus (BCMNV) is described. Four parameters were assessed in developing the quantitative method: symptoms associated with systemic virus movement, plant vigor, virus titer, and plant dry weight. Based on these parameters, two rating systems (V and VV rating) were established. Plants from 21 recombinant inbred lines (RILs) from a Sierra (susceptible) x Olathe (partially resistant) cross inoculated with the BCMNV-NL-3 K strain were used to evaluate this quantitative approach. In all, 11 RILs exhibited very susceptible reactions and 10 RILs expressed partially resistant reactions, thus fitting a 1:1 susceptible/partially resistant ratio (chi(2) = 0.048, P = 0.827) and suggesting that the response is mediated by a single gene. Using the classical qualitative approach based only on symptom expression, the RILs were difficult to separate into phenotypic groups because of a continuum of responses. By plotting mean percent reduction in either V (based on visual symptoms) or VV (based on visual symptoms and vigor) rating versus enzyme-linked immunosorbent assay (ELISA) absorbance values, RILs could be separated clearly into different phenotypic groups. The utility of this quantitative approach also was evaluated on plants from 12 cultivars or pure lines inoculated with one of three strains of BCMNV. Using the mean VV rating and ELISA absorbance values, significant differences were established not only in cultivar and pure line comparisons but also in virus strain comparisons. This quantitative system should be particularly useful for the evaluation of the independent action of bc genes, the discovery of new genes associated with partial resistance, and assessing virulence of virus strains.

  20. Development of method quantitative content of dihydroquercetin. Report 1

    Directory of Open Access Journals (Sweden)

    Олександр Юрійович Владимиров

    2016-01-01

    Full Text Available Today, scientific interest in the study of flavonoids in plant objects is increasing markedly owing to their high biological activity. In this regard, an urgent task of analytical chemistry is the development of accessible analytical techniques for the determination of flavonoids in plant objects. Aim. The aim was to develop a specific technique for the quantitative determination of dihydroquercetin (DQW) and to determine its validation characteristics. Methods. A technique for photocolorimetric quantification of DQW was elaborated, based on the specific reaction of cyanidin chloride formation when zinc powder is added to a dihydroquercetin solution in an acidic medium. Results. A photocolorimetric technique for determining flavonoids, recalculated as DQW, has been developed, and its basic validation characteristics have been determined. The obtained metrological characteristics of the photocolorimetric technique for determining DQW did not exceed the admissibility criteria of the State Pharmacopoeia of Ukraine. Conclusions. Statistical analysis of the experimental data showed that the developed technique can be used for quantification of DQW. The metrological data indicate that, when the method was reproduced in two different laboratories, the measured value was 101.85 ± 2.54 % at a confidence probability of 95 %

  1. A Targeted LC-MS/MS Method for the Simultaneous Detection and Quantitation of Egg, Milk, and Peanut Allergens in Sugar Cookies.

    Science.gov (United States)

    Boo, Chelsea C; Parker, Christine H; Jackson, Lauren S

    2018-01-01

    Food allergy is a growing public health concern, with many individuals reporting allergies to multiple food sources. Compliance with food labeling regulations and prevention of inadvertent cross-contact in manufacturing requires the use of reliable methods for the detection and quantitation of allergens in processed foods. In this work, a novel liquid chromatography-tandem mass spectrometry multiple-reaction monitoring method for multiallergen detection and quantitation of egg, milk, and peanut was developed and evaluated in an allergen-incurred baked sugar cookie matrix. A systematic evaluation of method parameters, including sample extraction, concentration, and digestion, were optimized for candidate allergen peptide markers. The optimized method enabled the reliable detection and quantitation of egg, milk, and peanut allergens in sugar cookies, with allergen concentrations as low as 5 ppm allergen-incurred ingredient.

  2. Presentation of a method for consequence modeling and quantitative risk assessment of fire and explosion in process industry (Case study: Hydrogen Production Process

    Directory of Open Access Journals (Sweden)

    M J Jafari

    2013-05-01

    Conclusion: Since the proposed method is applicable in all phases of process or system design and estimates the risk of fire and explosion through a quantitative, comprehensive, mathematically based approach, it can be used as an alternative to qualitative and semi-quantitative methods.

  3. The quantitative methods boot camp: teaching quantitative thinking and computing skills to graduate students in the life sciences.

    Directory of Open Access Journals (Sweden)

    Melanie I Stefan

    2015-04-01

Full Text Available The past decade has seen a rapid increase in the ability of biologists to collect large amounts of data. It is therefore vital that research biologists acquire the necessary skills during their training to visualize, analyze, and interpret such data. To begin to meet this need, we have developed a "boot camp" in quantitative methods for biology graduate students at Harvard Medical School. The goal of this short, intensive course is to enable students to use computational tools to visualize and analyze data, to strengthen their computational thinking skills, and to simulate and thus extend their intuition about the behavior of complex biological systems. The boot camp teaches basic programming using biological examples from statistics, image processing, and data analysis. This integrative approach to teaching programming and quantitative reasoning motivates students' engagement by demonstrating the relevance of these skills to their work in life science laboratories. Students also have the opportunity to analyze their own data or explore a topic of interest in more detail. The class is taught with a mixture of short lectures, Socratic discussion, and in-class exercises. Students spend approximately 40% of their class time working through both short and long problems. A high instructor-to-student ratio allows students to get assistance or additional challenges when needed, thus enhancing the experience for students at all levels of mastery. Data collected from end-of-course surveys from the last five offerings of the course (between 2012 and 2014) show that students report high learning gains and feel that the course prepares them for solving quantitative and computational problems they will encounter in their research. We outline our course here which, together with the course materials freely available online under a Creative Commons License, should help to facilitate similar efforts by others.

  4. The quantitative methods boot camp: teaching quantitative thinking and computing skills to graduate students in the life sciences.

    Science.gov (United States)

    Stefan, Melanie I; Gutlerner, Johanna L; Born, Richard T; Springer, Michael

    2015-04-01

    The past decade has seen a rapid increase in the ability of biologists to collect large amounts of data. It is therefore vital that research biologists acquire the necessary skills during their training to visualize, analyze, and interpret such data. To begin to meet this need, we have developed a "boot camp" in quantitative methods for biology graduate students at Harvard Medical School. The goal of this short, intensive course is to enable students to use computational tools to visualize and analyze data, to strengthen their computational thinking skills, and to simulate and thus extend their intuition about the behavior of complex biological systems. The boot camp teaches basic programming using biological examples from statistics, image processing, and data analysis. This integrative approach to teaching programming and quantitative reasoning motivates students' engagement by demonstrating the relevance of these skills to their work in life science laboratories. Students also have the opportunity to analyze their own data or explore a topic of interest in more detail. The class is taught with a mixture of short lectures, Socratic discussion, and in-class exercises. Students spend approximately 40% of their class time working through both short and long problems. A high instructor-to-student ratio allows students to get assistance or additional challenges when needed, thus enhancing the experience for students at all levels of mastery. Data collected from end-of-course surveys from the last five offerings of the course (between 2012 and 2014) show that students report high learning gains and feel that the course prepares them for solving quantitative and computational problems they will encounter in their research. We outline our course here which, together with the course materials freely available online under a Creative Commons License, should help to facilitate similar efforts by others.

  5. Quantitative Simulation of QARBM Challenge Events During Radiation Belt Enhancements

    Science.gov (United States)

    Li, W.; Ma, Q.; Thorne, R. M.; Bortnik, J.; Chu, X.

    2017-12-01

Various physical processes are known to affect energetic electron dynamics in the Earth's radiation belts, but their quantitative effects at different times and locations in space need further investigation. This presentation focuses on the quantitative roles of the various physical processes that affect Earth's radiation belt electron dynamics during radiation belt enhancement challenge events (storm-time vs. non-storm-time) selected by the GEM Quantitative Assessment of Radiation Belt Modeling (QARBM) focus group. We construct realistic global distributions of whistler-mode chorus waves, adopt various versions of radial diffusion models (statistical and event-specific), and use the global evolution of other potentially important plasma waves, including plasmaspheric hiss, magnetosonic waves, and electromagnetic ion cyclotron waves, from all available multi-satellite measurements. These state-of-the-art wave properties and distributions on a global scale are used to calculate diffusion coefficients, which are then adopted as inputs to a 3D diffusion simulation of the dynamical electron evolution during the storm-time and non-storm-time acceleration events, respectively. We explore the similarities and differences in the dominant physical processes that cause radiation belt electron dynamics during the storm-time and non-storm-time acceleration events. The quantitative role of each physical process is determined by comparing against the Van Allen Probes electron observations at different energies, pitch angles, and L-MLT regions. This quantitative comparison further indicates instances when quasilinear theory is sufficient to explain the observed electron dynamics or when nonlinear interaction is required to reproduce the energetic electron evolution observed by the Van Allen Probes.
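
Radial diffusion, one of the processes compared in such simulations, can be illustrated with a minimal 1-D sketch. The equation below is the generic textbook form, and the Gaussian profile and power-law D_LL coefficients are invented for illustration; they are not the event-specific models used in the study.

```python
import numpy as np

def radial_diffusion_step(f, L, D_LL, dt):
    """One explicit finite-difference step of the 1-D radial diffusion
    equation df/dt = L^2 d/dL (D_LL / L^2 * df/dL), holding the two
    boundary values fixed (Dirichlet)."""
    dL = L[1] - L[0]
    D_half = 0.5 * (D_LL[1:] + D_LL[:-1])     # D at half-grid points
    L_half = 0.5 * (L[1:] + L[:-1])
    flux = D_half / L_half**2 * (f[1:] - f[:-1]) / dL
    f_new = f.copy()
    f_new[1:-1] += dt * L[1:-1]**2 * (flux[1:] - flux[:-1]) / dL
    return f_new

# Toy setup: Gaussian phase-space density, steep power-law D_LL profile
L = np.linspace(3.0, 7.0, 81)
f = np.exp(-((L - 5.0) / 0.3) ** 2)
D = 1e-4 * (L / 5.0) ** 10        # invented magnitude; power-law in form only
for _ in range(200):
    f = radial_diffusion_step(f, L, D, dt=0.1)
```

The explicit scheme is only stable for small enough dt; production radiation belt codes typically use implicit solvers and add pitch-angle and energy diffusion terms.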

  6. Methods of Bank Valuation in the Age of Globalization

    Directory of Open Access Journals (Sweden)

    Alexander Karminsky

    2015-01-01

Full Text Available This paper reviews the theory of value-based management at the commercial bank and the main valuation methods in the age of globalization. The paper identifies five main factors that significantly influence the selection and building of valuation models: funding, liquidity, risks, exogenous factors and the capital cushion. It is shown that valuation models can be classified depending on the underlying cash flows. Particular attention is paid to models based on potentially available cash flows (discounted cash flow-oriented approaches, DCF) and models based on residual income flows (residual income-oriented approaches). In addition, we consider an alternative approach based on comparison with same-sector banks (multiples). For bank valuation, the equity discounted cash flow method (Equity DCF) is recommended. Equity DCF values the equity of a bank directly by discounting cash flows to equity at the cost of equity (from the Capital Asset Pricing Model, CAPM) rather than at the weighted average cost of capital (WACC). For operational management, residual income-oriented approaches are recommended because they are better aligned with the process of internal planning and forecasting in banks. For strategic management, residual income-oriented methods are most useful when expected cash flows are negative throughout the forecast period; discounted cash flow-oriented approaches are preferable when expected cash flows are positive and the model is used to support investment decisions. The proposed classification can be developed further to serve medium-term bank management tasks in the age of globalization.
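
The Equity DCF recipe described here (discounting cash flows to equity at a CAPM cost of equity rather than at the WACC) can be sketched in a few lines. All figures are invented for illustration.

```python
def capm_cost_of_equity(risk_free, beta, market_return):
    """CAPM: r_e = r_f + beta * (E[r_m] - r_f)."""
    return risk_free + beta * (market_return - risk_free)

def equity_dcf_value(cash_flows_to_equity, terminal_value, r_e):
    """Equity DCF: discount forecast cash flows to equity, plus a
    terminal value placed at the final forecast year, at the cost of
    equity."""
    pv = sum(cf / (1.0 + r_e) ** t
             for t, cf in enumerate(cash_flows_to_equity, start=1))
    return pv + terminal_value / (1.0 + r_e) ** len(cash_flows_to_equity)

# Illustrative three-year forecast (all figures invented)
r_e = capm_cost_of_equity(risk_free=0.03, beta=1.2, market_return=0.08)
value = equity_dcf_value([100.0, 110.0, 120.0], terminal_value=1500.0, r_e=r_e)
```

The contrast with an enterprise DCF is only in what is discounted and at what rate: free cash flows to the firm at the WACC there, cash flows to equity at r_e here.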

  7. Reconciling incongruous qualitative and quantitative findings in mixed methods research: exemplars from research with drug using populations.

    Science.gov (United States)

    Wagner, Karla D; Davidson, Peter J; Pollini, Robin A; Strathdee, Steffanie A; Washburn, Rachel; Palinkas, Lawrence A

    2012-01-01

Mixed methods research is increasingly being promoted in the health sciences as a way to gain more comprehensive understandings of how social processes and individual behaviours shape human health. Mixed methods research most commonly combines qualitative and quantitative data collection and analysis strategies. Often, integrating findings from multiple methods is assumed to confirm or validate the findings from one method with the findings from another, seeking convergence or agreement between methods. Cases in which findings from different methods are congruous are generally thought of as ideal, whilst conflicting findings may, at first glance, appear problematic. However, the latter situation provides the opportunity for a process through which apparently discordant results are reconciled, potentially leading to new emergent understandings of complex social phenomena. This paper presents three case studies drawn from the authors' research on HIV risk amongst injection drug users in which mixed methods studies yielded apparently discrepant results. We use these case studies (involving injection drug users [IDUs] using a Needle/Syringe Exchange Program in Los Angeles, CA, USA; IDUs seeking to purchase needle/syringes at pharmacies in Tijuana, Mexico; and young street-based IDUs in San Francisco, CA, USA) to identify challenges associated with integrating findings from mixed methods projects, summarize lessons learned, and make recommendations for how to more successfully anticipate and manage the integration of findings. Despite the challenges inherent in reconciling apparently conflicting findings from qualitative and quantitative approaches, in keeping with others who have argued in favour of integrating mixed methods findings, we contend that such an undertaking has the potential to yield benefits that emerge only through the struggle to reconcile discrepant results and may provide a sum that is greater than the individual qualitative and quantitative parts.

  8. Digital integrated protection system: Quantitative methods for dependability evaluation

    International Nuclear Information System (INIS)

    Krotoff, H.; Benski, C.

    1986-01-01

The inclusion of programmed digital techniques in the SPIN system provides the user with the capability of performing sophisticated processing operations. However, it makes the quantitative evaluation of the overall failure probabilities somewhat more intricate, because: a single component may be involved in several functions; self-tests may readily be incorporated to monitor the dependable operation of the equipment at all times. This paper describes the methods implemented by MERLIN GERIN for evaluating: the probabilities that protective actions are not initiated when required (dangerous failures); the probabilities that such protective actions are initiated accidentally. Although the communication focuses on the programmed portion of the SPIN (UAIP), it also deals with evaluations performed within the scope of studies that do not exclusively cover the UAIPs

  9. Developing the "120 by 20" goal for the Global FP2020 Initiative.

    Science.gov (United States)

    Brown, Win; Druce, Nel; Bunting, Julia; Radloff, Scott; Koroma, Desmond; Gupta, Srishti; Siems, Brian; Kerrigan, Monica; Kress, Dan; Darmstadt, Gary L

    2014-03-01

    This report describes the purpose for developing a quantitative goal for the London Summit on Family Planning held in July 2012, the methodology behind its formulation, and the lessons learned in the process. The London Summit has evolved into the global initiative known as FP2020, and the goal has become "120 by 20," or reaching 120 million additional users of modern contraceptive methods by 2020 in the world's poorest countries. The success of FP2020 will first be evaluated on the basis of quantitative verification to determine that the "120 by 20" goal was reached. More important, however, is the extent to which the goal today serves as a global rallying cry to mobilize resources and leadership around current family planning programs, with a focus on voluntary family planning and quality of care, and with an emphasis on meeting girls' and women's unmet needs and their right to practice contraception. We hope this article provides greater transparency and understanding of the FP2020 goal, and that the global goal spurs annual monitoring of progress toward national goals in the world's poorest countries. © 2014 The Population Council, Inc.

  10. Quantitative firing transformations of a triaxial ceramic by X-ray diffraction methods

    Directory of Open Access Journals (Sweden)

    M. S. Conconi

    2014-12-01

Full Text Available The firing transformations of traditional (clay-based) ceramics are of technological and archeological interest, and are usually reported qualitatively or semiquantitatively. These systems present considerable complexity, especially for X-ray diffraction techniques, due to the presence of fully crystalline, low-crystalline and amorphous phases. In this article we present the results of a qualitative and quantitative X-ray diffraction Rietveld analysis of the evolution of the fully crystalline (kaolinite, quartz, cristobalite, feldspars and/or mullite), low-crystalline (metakaolinite and/or spinel-type pre-mullite) and glassy phases of a triaxial (clay-quartz-feldspar) ceramic fired over a wide temperature range between 900 and 1300 ºC. The methodology employed to determine low-crystalline and glassy phase abundances is based on a combination of the internal standard method and the use of a nanocrystalline model in which long-range order is lost, respectively. A preliminary sintering characterization was carried out through the evolution of contraction, density and porosity with firing temperature. Simultaneous thermo-gravimetric and differential thermal analysis was carried out to elucidate the actual temperatures at which the chemical changes occur. Finally, the quantitative analysis based on Rietveld refinement of the X-ray diffraction patterns was performed. The kaolinite decomposition into metakaolinite was determined quantitatively; the intermediate (980 ºC) spinel-type alumino-silicate formation was also quantified; the incongruent fusion of the potash feldspar was observed and quantified, together with the final mullitization and the formation of the amorphous (glassy) phase. The methodology used to analyze the X-ray diffraction patterns proved suitable for quantitatively evaluating the thermal transformations that occur in a complex system such as a triaxial ceramic. The evaluated phases can be easily correlated with the processing variables and
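
The internal-standard step mentioned above (rescaling Rietveld weight fractions by a spike added in a known amount, so that any deficit is assigned to amorphous/glassy content) can be sketched as follows. The phase names and numbers are illustrative, not the paper's data.

```python
def internal_standard_quantify(w_meas, standard, w_std_added):
    """Internal-standard rescaling: w_meas are Rietveld weight fractions
    normalized over the crystalline phases (including the added standard);
    w_std_added is the known weight fraction of the standard in the spiked
    sample. Returns true crystalline fractions and the amorphous/glassy
    content, both referred to the original (unspiked) sample."""
    k = w_std_added / w_meas[standard]                 # rescaling factor
    w_spiked = {ph: k * w for ph, w in w_meas.items() if ph != standard}
    amorph_spiked = 1.0 - w_std_added - sum(w_spiked.values())
    scale = 1.0 / (1.0 - w_std_added)                  # remove the spike
    w_true = {ph: w * scale for ph, w in w_spiked.items()}
    return w_true, amorph_spiked * scale

# Example: a 20 wt% corundum spike that Rietveld reports as 25% of the
# crystalline fraction implies 25% of the original sample is amorphous.
phases, amorph = internal_standard_quantify(
    {"quartz": 0.45, "mullite": 0.30, "corundum": 0.25},
    standard="corundum", w_std_added=0.20)
```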

  11. Bridging the qualitative-quantitative divide: Experiences from conducting a mixed methods evaluation in the RUCAS programme.

    Science.gov (United States)

    Makrakis, Vassilios; Kostoulas-Makrakis, Nelly

    2016-02-01

    Quantitative and qualitative approaches to planning and evaluation in education for sustainable development have often been treated by practitioners from a single research paradigm. This paper discusses the utility of mixed method evaluation designs which integrate qualitative and quantitative data through a sequential transformative process. Sequential mixed method data collection strategies involve collecting data in an iterative process whereby data collected in one phase contribute to data collected in the next. This is done through examples from a programme addressing the 'Reorientation of University Curricula to Address Sustainability (RUCAS): A European Commission Tempus-funded Programme'. It is argued that the two approaches are complementary and that there are significant gains from combining both. Using methods from both research paradigms does not, however, mean that the inherent differences among epistemologies and methodologies should be neglected. Based on this experience, it is recommended that using a sequential transformative mixed method evaluation can produce more robust results than could be accomplished using a single approach in programme planning and evaluation focussed on education for sustainable development. Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. A novel approach for evaluating the performance of real time quantitative loop-mediated isothermal amplification-based methods.

    Science.gov (United States)

    Nixon, Gavin J; Svenstrup, Helle F; Donald, Carol E; Carder, Caroline; Stephenson, Judith M; Morris-Jones, Stephen; Huggett, Jim F; Foy, Carole A

    2014-12-01

Molecular diagnostic measurements are currently underpinned by the polymerase chain reaction (PCR). There are also a number of alternative nucleic acid amplification technologies which, unlike PCR, work at a single temperature. These 'isothermal' methods reportedly offer potential advantages over PCR, such as simplicity, speed and resistance to inhibitors, and could also be used for quantitative molecular analysis. However, there are currently limited mechanisms to evaluate their quantitative performance, which would assist assay development and study comparisons. This study uses a sexually transmitted infection diagnostic model in combination with an adapted metric termed isothermal doubling time (IDT), akin to PCR efficiency, to compare quantitative PCR and quantitative loop-mediated isothermal amplification (qLAMP) assays, and to quantify the impact of matrix interference. The performance metric described here facilitates the comparison of qLAMP assays and could assist assay development and validation activities.
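
A hedged sketch of an IDT-style calculation, assuming the metric is derived from the slope of a time-to-threshold standard curve (the paper's exact definition may differ); the dilution-series numbers are synthetic.

```python
import numpy as np

def isothermal_doubling_time(log10_copies, time_to_threshold):
    """Fit time-to-threshold (Tt) against log10 input copies; the slope
    is the time per ten-fold change in template, and scaling it by
    log10(2) converts it to a per-doubling time (an IDT-style metric)."""
    slope, _intercept = np.polyfit(log10_copies, time_to_threshold, 1)
    return -slope * np.log10(2)

# Synthetic LAMP dilution series: Tt rises ~5 min per ten-fold dilution
log10_n0 = np.array([6.0, 5.0, 4.0, 3.0, 2.0])
tt_min = np.array([10.1, 15.0, 19.9, 25.2, 30.0])
idt = isothermal_doubling_time(log10_n0, tt_min)   # ~1.5 min per doubling
```

This mirrors how qPCR efficiency is read off a Ct-vs-log-concentration standard curve, replacing cycles with minutes.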

  13. A combined usage of stochastic and quantitative risk assessment methods in the worksites: Application on an electric power provider

    International Nuclear Information System (INIS)

    Marhavilas, P.K.; Koulouriotis, D.E.

    2012-01-01

No single method can build a realistic forecasting model or risk assessment process for worksites; future perspectives should focus on combined forecasting/estimation approaches. The main purpose of this paper is to gain insight into a risk prediction and estimation methodological framework that combines three different methods: the proportional quantitative-risk-assessment technique (PRAT), the time-series stochastic process (TSP), and the method of estimating societal risk (SRE) by F–N curves. To demonstrate the usefulness of combining stochastic and quantitative risk assessment methods, an application to an electric power provider is presented, using empirical data.

  14. Forecasting with quantitative methods the impact of special events in time series

    OpenAIRE

    Nikolopoulos, Konstantinos

    2010-01-01

Abstract Quantitative methods are very successful for producing baseline forecasts of time series; however, these models can forecast neither the timing nor the impact of special events such as promotions or strikes. In most cases the timing of such events is not known, so they are usually referred to as shocks (economics) or special events (forecasting). Sometimes the timing of such events is known a priori (i.e. a future promotion); but even then the impact of the forthcom...

  15. Effects of calibration methods on quantitative material decomposition in photon-counting spectral computed tomography using a maximum a posteriori estimator.

    Science.gov (United States)

    Curtis, Tyler E; Roeder, Ryan K

    2017-10-01

    Advances in photon-counting detectors have enabled quantitative material decomposition using multi-energy or spectral computed tomography (CT). Supervised methods for material decomposition utilize an estimated attenuation for each material of interest at each photon energy level, which must be calibrated based upon calculated or measured values for known compositions. Measurements using a calibration phantom can advantageously account for system-specific noise, but the effect of calibration methods on the material basis matrix and subsequent quantitative material decomposition has not been experimentally investigated. Therefore, the objective of this study was to investigate the influence of the range and number of contrast agent concentrations within a modular calibration phantom on the accuracy of quantitative material decomposition in the image domain. Gadolinium was chosen as a model contrast agent in imaging phantoms, which also contained bone tissue and water as negative controls. The maximum gadolinium concentration (30, 60, and 90 mM) and total number of concentrations (2, 4, and 7) were independently varied to systematically investigate effects of the material basis matrix and scaling factor calibration on the quantitative (root mean squared error, RMSE) and spatial (sensitivity and specificity) accuracy of material decomposition. Images of calibration and sample phantoms were acquired using a commercially available photon-counting spectral micro-CT system with five energy bins selected to normalize photon counts and leverage the contrast agent k-edge. Material decomposition of gadolinium, calcium, and water was performed for each calibration method using a maximum a posteriori estimator. Both the quantitative and spatial accuracy of material decomposition were most improved by using an increased maximum gadolinium concentration (range) in the basis matrix calibration; the effects of using a greater number of concentrations were relatively small in
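
The decomposition step can be illustrated with ordinary least squares standing in for the MAP estimator used in the study (a MAP solution would add a photon-noise model and a prior); the basis-matrix values below are invented.

```python
import numpy as np

# Hypothetical material basis matrix: rows = 5 energy bins, columns =
# attenuation per unit concentration of (gadolinium, calcium, water).
# The bump in the first column mimics a k-edge; all values are invented.
A = np.array([[0.9, 0.50, 0.20],
              [1.4, 0.45, 0.19],
              [0.8, 0.40, 0.18],
              [0.7, 0.35, 0.17],
              [0.6, 0.30, 0.16]])

true_c = np.array([2.0, 1.0, 0.5])   # unknown concentrations (arb. units)
b = A @ true_c                       # noiseless multi-energy measurement

# Solve the per-voxel linear system A @ c = b for the concentrations
c_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
```

The calibration question studied in the paper amounts to how well the columns of A are estimated from phantom scans: errors in A propagate directly into c_hat.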

  16. Understanding Variation in Treatment Effects in Education Impact Evaluations: An Overview of Quantitative Methods. NCEE 2014-4017

    Science.gov (United States)

    Schochet, Peter Z.; Puma, Mike; Deke, John

    2014-01-01

    This report summarizes the complex research literature on quantitative methods for assessing how impacts of educational interventions on instructional practices and student learning differ across students, educators, and schools. It also provides technical guidance about the use and interpretation of these methods. The research topics addressed…

  17. Alzheimer disease: Quantitative analysis of I-123-iodoamphetamine SPECT brain imaging

    International Nuclear Information System (INIS)

    Hellman, R.S.; Tikofsky, R.S.; Collier, B.D.; Hoffmann, R.G.; Palmer, D.W.; Glatt, S.L.; Antuono, P.G.; Isitman, A.T.; Papke, R.A.

    1989-01-01

    To enable a more quantitative diagnosis of senile dementia of the Alzheimer type (SDAT), the authors developed and tested a semiautomated method to define regions of interest (ROIs) to be used in quantitating results from single photon emission computed tomography (SPECT) of regional cerebral blood flow performed with N-isopropyl iodine-123-iodoamphetamine. SPECT/IMP imaging was performed in ten patients with probable SDAT and seven healthy subjects. Multiple ROIs were manually and semiautomatically generated, and uptake was quantitated for each ROI. Mean cortical activity was estimated as the average of the mean activity in 24 semiautomatically generated ROIs; mean cerebellar activity was determined from the mean activity in separate ROIs. A ratio of parietal to cerebellar activity less than 0.60 and a ratio of parietal to mean cortical activity less than 0.90 allowed correct categorization of nine of ten and eight of ten patients, respectively, with SDAT and all control subjects. The degree of diminished mental status observed in patients with SDAT correlated with both global and regional changes in IMP uptake
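
The two reported cutoffs translate directly into a small helper; the activity values in the example are hypothetical.

```python
def sdat_ratio_flags(parietal, cerebellar, mean_cortical):
    """Apply the two cutoffs reported above: parietal/cerebellar < 0.60
    and parietal/mean-cortical < 0.90 are taken as suggestive of SDAT."""
    return {
        "parietal_cerebellar": parietal / cerebellar < 0.60,
        "parietal_cortical": parietal / mean_cortical < 0.90,
    }

# Hypothetical mean ROI activities (arbitrary units)
flags = sdat_ratio_flags(parietal=55.0, cerebellar=100.0, mean_cortical=80.0)
# here both ratios (0.55 and ~0.69) fall below their cutoffs
```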

  18. A Quantitative Study of Global Software Development Teams, Requirements, and Software Projects

    Science.gov (United States)

    Parker, Linda L.

    2016-01-01

    The study explored the relationship between global software development teams, effective software requirements, and stakeholders' perception of successful software development projects within the field of information technology management. It examined the critical relationship between Global Software Development (GSD) teams creating effective…

  19. Determination of global heart function parameters

    International Nuclear Information System (INIS)

    Adam, W.E.; Hoffmann, H.; Sigel, H.; Bitter, F.; Nechwatal, W.; Stauch, M.; Ulm Univ.; Freiburg Univ.

    1980-01-01

1. ECG-triggered scintigraphy of the interior of the heart (radionuclide ventriculography) is a reliable non-invasive technique for assessing the global and regional function of the left ventricle. 2. The most important global parameter is the output function (OF) of the left ventricle, which can be measured exactly. A decrease of the OF under load is typical of load-induced coronary insufficiency, but is not specific for it. 3. A wall-motion disturbance due to coronary heart disease is recognized with the highest sensitivity by the decrease of the maximum relaxation velocity of the global left-ventricular time-activity curve (rapid filling phase). 4. Regional wall-motion disturbances can be measured quantitatively by viewing the radionuclide ventriculogram on the display. 5. Quantitative measurement of regional function requires an extensive analysis of the local time-activity curves of a representative cardiac cycle. For this, the amplitude and phase as well as the contraction and relaxation velocities of all time-activity curves are determined by Fourier analysis, and their spatial distributions are displayed (parametric scans). 6. The parametric scans (distributions of amplitude, phase, and contraction and relaxation velocities) describe the regional wall motion in detail; the reliability of their quantitative acquisition must be confirmed by further investigations. (orig.) [de
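
The Fourier step described in point 5 (amplitude and phase of each pixel's time-activity curve) can be sketched for a single curve; the synthetic cosine below stands in for real gated data.

```python
import numpy as np

def first_harmonic(tac):
    """Amplitude and phase of the first Fourier harmonic of a
    time-activity curve sampled over one cardiac cycle."""
    tac = np.asarray(tac, dtype=float)
    coeff = np.fft.rfft(tac)[1]          # first-harmonic coefficient
    amplitude = 2.0 * np.abs(coeff) / tac.size
    phase = np.angle(coeff)
    return amplitude, phase

# Synthetic per-pixel curve: baseline plus one cosine cycle with known
# amplitude (20) and phase (-1.0 rad)
n = 16
t = np.arange(n)
tac = 100 + 20 * np.cos(2 * np.pi * t / n - 1.0)
amp, ph = first_harmonic(tac)
```

Applied per pixel, the amplitude image maps contraction strength and the phase image maps contraction timing, which is what the parametric scans display.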

  20. [The development and validation of the methods for the quantitative determination of sibutramine derivatives in dietary supplements].

    Science.gov (United States)

    Stern, K I; Malkova, T L

The objective of the present study was the development and validation of methods for the quantitative determination of the demethylated derivatives of sibutramine, desmethyl sibutramine and didesmethyl sibutramine. Gas-liquid chromatography with flame ionization detection was used for the quantitative determination of these substances in dietary supplements. Conditions for the chromatographic determination of the analytes in the presence of the reference standard, methyl stearate, were proposed that allow efficient separation to be achieved. The method has the necessary sensitivity, specificity, linearity, accuracy, and precision (on an intra-day and inter-day basis), confirming its good validation characteristics. The proposed method can be employed in analytical laboratories for the quantitative determination of sibutramine derivatives in biologically active dietary supplements.

  1. A novel approach for evaluating the performance of real time quantitative loop-mediated isothermal amplification-based methods

    Directory of Open Access Journals (Sweden)

    Gavin J. Nixon

    2014-12-01

Full Text Available Molecular diagnostic measurements are currently underpinned by the polymerase chain reaction (PCR). There are also a number of alternative nucleic acid amplification technologies which, unlike PCR, work at a single temperature. These ‘isothermal’ methods reportedly offer potential advantages over PCR, such as simplicity, speed and resistance to inhibitors, and could also be used for quantitative molecular analysis. However, there are currently limited mechanisms to evaluate their quantitative performance, which would assist assay development and study comparisons. This study uses a sexually transmitted infection diagnostic model in combination with an adapted metric termed isothermal doubling time (IDT), akin to PCR efficiency, to compare quantitative PCR and quantitative loop-mediated isothermal amplification (qLAMP) assays, and to quantify the impact of matrix interference. The performance metric described here facilitates the comparison of qLAMP assays and could assist assay development and validation activities.

  2. Computational Experience with Globally Convergent Descent Methods for Large Sparse Systems of Nonlinear Equations

    Czech Academy of Sciences Publication Activity Database

    Lukšan, Ladislav; Vlček, Jan

    1998-01-01

Roč. 8, č. 3-4 (1998), s. 201-223 ISSN 1055-6788 R&D Projects: GA ČR GA201/96/0918 Keywords: nonlinear equations * Armijo-type descent methods * Newton-like methods * truncated methods * global convergence * nonsymmetric linear systems * conjugate gradient-type methods * residual smoothing * computational experiments Subject RIV: BB - Applied Statistics, Operational Research

  3. A method for improved clustering and classification of microscopy images using quantitative co-localization coefficients

    LENUS (Irish Health Repository)

    Singan, Vasanth R

    2012-06-08

Abstract. Background: The localization of proteins to specific subcellular structures in eukaryotic cells provides important information with respect to their function. Fluorescence microscopy approaches to determine localization distribution have proved to be an essential tool in the characterization of unknown proteins, and are now particularly pertinent as a result of the wide availability of fluorescently-tagged constructs and antibodies. However, there are currently very few image analysis options able to effectively discriminate proteins with apparently similar distributions in cells, despite this information being important for protein characterization. Findings: We have developed a novel method for combining two existing image analysis approaches, which results in highly efficient and accurate discrimination of proteins with seemingly similar distributions. We have combined image texture-based analysis with quantitative co-localization coefficients, a method that has traditionally only been used to study the spatial overlap between two populations of molecules. Here we describe and present a novel application for quantitative co-localization, as applied to the study of Rab family small GTP binding proteins localizing to the endomembrane system of cultured cells. Conclusions: We show how quantitative co-localization can be used alongside texture feature analysis, resulting in improved clustering of microscopy images. The use of co-localization as an additional clustering parameter is non-biased and highly applicable to high-throughput image data sets.
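
As an illustration of the quantitative co-localization coefficients involved, here is a minimal sketch of the Pearson and Manders M1 coefficients on synthetic two-channel data (the standard textbook forms, not the authors' pipeline).

```python
import numpy as np

def pearson_coloc(ch1, ch2):
    """Pearson co-localization coefficient between two channel images."""
    a = ch1 - ch1.mean()
    b = ch2 - ch2.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

def manders_m1(ch1, ch2, thresh2=0.0):
    """Manders M1: fraction of channel-1 intensity in pixels where
    channel 2 exceeds a threshold."""
    return float(ch1[ch2 > thresh2].sum() / ch1.sum())

# Synthetic two-channel image pair: channel 2 largely tracks channel 1
rng = np.random.default_rng(0)
ch1 = rng.random((64, 64))
ch2 = 0.8 * ch1 + 0.2 * rng.random((64, 64))
r = pearson_coloc(ch1, ch2)   # strongly co-localized, r close to 1
```

In the approach described, such coefficients would be computed per image and appended to texture feature vectors before clustering.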

  4. Global Monitoring of Water Supply and Sanitation: History, Methods and Future Challenges

    Science.gov (United States)

    Bartram, Jamie; Brocklehurst, Clarissa; Fisher, Michael B.; Luyendijk, Rolf; Hossain, Rifat; Wardlaw, Tessa; Gordon, Bruce

    2014-01-01

    International monitoring of drinking water and sanitation shapes awareness of countries’ needs and informs policy, implementation and research efforts to extend and improve services. The Millennium Development Goals established global targets for drinking water and sanitation access; progress towards these targets, facilitated by international monitoring, has contributed to reducing the global disease burden and increasing quality of life. The experiences of the MDG period generated important lessons about the strengths and limitations of current approaches to defining and monitoring access to drinking water and sanitation. The methods by which the Joint Monitoring Programme (JMP) of WHO and UNICEF tracks access and progress are based on analysis of data from household surveys and linear regression modelling of these results over time. These methods provide nationally-representative and internationally-comparable insights into the drinking water and sanitation facilities used by populations worldwide, but also have substantial limitations: current methods do not address water quality, equity of access, or extra-household services. Improved statistical methods are needed to better model temporal trends. This article describes and critically reviews JMP methods in detail for the first time. It also explores the impact of, and future directions for, international monitoring of drinking water and sanitation. PMID:25116635

  5. Quantitation of esophageal transit and gastroesophageal reflux

    International Nuclear Information System (INIS)

    Malmud, L.S.; Fisher, R.S.

    1986-01-01

    Scintigraphic techniques are the only quantitative methods for the evaluation of esophageal transit and gastroesophageal reflux. By comparison, other techniques are not quantitative and are either indirect, inconvenient, or less sensitive. Methods such as perfusion techniques, which measure flow, require the introduction of a tube assembly into the gastrointestinal tract, with the possible introduction of artifacts into the measurements due to the indwelling tubes. Earlier authors, using radionuclide markers, introduced a method for measuring gastric emptying which was both tubeless and quantitative in comparison to other techniques. More recently, a number of scintigraphic methods have been introduced for the quantitation of esophageal transit and clearance, the detection and quantitation of gastroesophageal reflux, the measurement of gastric emptying using a mixed solid-liquid meal, and the quantitation of enterogastric reflux. This chapter reviews current techniques for the evaluation of esophageal transit and gastroesophageal reflux.

  6. Pleistocene Lake Bonneville and Eberswalde Crater of Mars: Quantitative Methods for Recognizing Poorly Developed Lacustrine Shorelines

    Science.gov (United States)

    Jewell, P. W.

    2014-12-01

    The ability to quantify shoreline features on Earth has been aided by advances in the acquisition of high-resolution topography through laser imaging and photogrammetry. Well-defined and well-documented features such as the Bonneville, Provo, and Stansbury shorelines of Late Pleistocene Lake Bonneville are recognizable to the untrained eye and easily mappable on aerial photos. The continuity and correlation of lesser shorelines must rely on quantitative algorithms for processing high-resolution data in order to gain widespread scientific acceptance. Using Savitzky-Golay filters and the geomorphic methods and criteria described by Hare et al. [2001], minor, transgressive, erosional shorelines of Lake Bonneville have been identified and correlated across the basin with varying degrees of statistical confidence. Results solve one of the key paradoxes of Lake Bonneville first described by G. K. Gilbert in the late 19th century and point the way toward understanding climatically driven oscillations of the Last Glacial Maximum in the Great Basin of the United States. Similar techniques have been applied to the Eberswalde Crater area of Mars using HiRISE DEMs (1 m horizontal resolution), where a paleolake is hypothesized to have existed. Results illustrate the challenges of identifying shorelines where long-term aeolian processes have degraded the shorelines and field validation is not possible. The work illustrates the promises and challenges of identifying remnants of a global ocean elsewhere on the red planet.
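
    A minimal sketch of the smoothing step named above: a Savitzky-Golay filter applied to a synthetic elevation profile, with the filter's analytic first derivative used to flag a subtle bench (slope minimum) of the kind a shoreline leaves. The profile and parameter values are invented for illustration; this is not the authors' pipeline.

```python
import numpy as np
from scipy.signal import savgol_filter

# Synthetic elevation profile: a smooth slope with a small bench
# (shoreline-like terrace) plus measurement noise. Distances in m.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 200.0, 401)                  # 0.5 m spacing
bench = -1.5 / (1.0 + np.exp(-(x - 100.0)))       # subtle step in the slope
z = 0.05 * x + bench + rng.normal(0.0, 0.05, x.size)

# Savitzky-Golay smoothing, and the filter's analytic first derivative
# (local slope), both from a cubic fit over a 31-sample window.
z_smooth = savgol_filter(z, window_length=31, polyorder=3)
slope = savgol_filter(z, window_length=31, polyorder=3,
                      deriv=1, delta=x[1] - x[0])

# A bench shows up as a local slope minimum; flag the flattest point.
candidate = x[np.argmin(slope)]
```

    On this synthetic profile the flagged position falls near the bench at x = 100 m, while raw differencing of the noisy profile would bury the signal in noise.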

  7. Understanding Factors that Shape Gender Attitudes in Early Adolescence Globally: A Mixed-Methods Systematic Review

    Science.gov (United States)

    Gibbs, Susannah; Blum, Robert Wm; Moreau, Caroline; Chandra-Mouli, Venkatraman; Herbert, Ann; Amin, Avni

    2016-01-01

    Background: Early adolescence (ages 10–14) is a period of increased expectations for boys and girls to adhere to socially constructed and often stereotypical norms that perpetuate gender inequalities. The endorsement of such gender norms is closely linked to poor adolescent sexual and reproductive and other health-related outcomes, yet little is known about the factors that influence young adolescents’ personal gender attitudes. Objectives: To explore factors that shape gender attitudes in early adolescence across different cultural settings globally. Methods: A mixed-methods systematic review was conducted of the peer-reviewed literature in 12 databases from 1984–2014. Four reviewers screened the titles and abstracts of articles and reviewed full-text articles in duplicate. Data extraction and quality assessments were conducted using standardized templates by study design. Thematic analysis was used to synthesize quantitative and qualitative data organized by the social-ecological framework (individual, interpersonal and community/societal-level factors influencing gender attitudes). Results: Eighty-two studies (46 quantitative, 31 qualitative, 5 mixed-methods) spanning 29 countries were included. Ninety percent of studies were from North America or Western Europe. The review findings indicate that young adolescents, across cultural settings, commonly express stereotypical or inequitable gender attitudes, and such attitudes appear to vary by individual sociodemographic characteristics (sex, race/ethnicity and immigration, social class, and age). Findings highlight that interpersonal influences (family and peers) are central influences on young adolescents’ construction of gender attitudes, and these gender socialization processes differ for boys and girls. The role of community factors (e.g. media) is less clear, though there is some evidence that schools may reinforce stereotypical gender attitudes among young adolescents. Conclusions: The findings from this

  8. The q-G method : A q-version of the Steepest Descent method for global optimization.

    Science.gov (United States)

    Soterroni, Aline C; Galski, Roberto L; Scarabello, Marluce C; Ramos, Fernando M

    2015-01-01

    In this work, the q-Gradient (q-G) method, a q-version of the Steepest Descent method, is presented. The main idea behind the q-G method is the use of the negative of the q-gradient vector of the objective function as the search direction. The q-gradient vector, or simply the q-gradient, is a generalization of the classical gradient vector based on the concept of Jackson's derivative from the q-calculus. Its use provides the algorithm with an effective mechanism for escaping from local minima. The q-G method reduces to the Steepest Descent method when the parameter q tends to 1. The algorithm has three free parameters, and it is implemented so that the search process gradually shifts from global exploration at the beginning to local exploitation at the end. We evaluated the q-G method on 34 test functions and compared its performance with that of 34 optimization algorithms, including derivative-free algorithms and the Steepest Descent method. Our results show that the q-G method is competitive and has great potential for solving multimodal optimization problems.
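
    A minimal sketch of the idea in Python, under stated simplifications: the search direction is the negative q-gradient, whose components are Jackson derivatives, and q is driven toward 1 so the iteration degenerates into Steepest Descent. The paper's three free parameters and exact q-schedule are not reproduced; names such as `qg_descent` and the geometric shrink schedule are illustrative, not the authors' implementation.

```python
import numpy as np

def jackson_derivative(f, x, q):
    """Componentwise q-derivative D_q f(x) = (f(q*x_i) - f(x)) / ((q-1)*x_i).
    Falls back to a central difference where x_i ~ 0 or q ~ 1."""
    x = np.asarray(x, dtype=float)
    d = np.empty_like(x)
    for i in range(x.size):
        xi = x[i]
        if abs(xi) < 1e-12 or abs(q - 1.0) < 1e-12:
            h = 1e-6
            e = np.zeros_like(x)
            e[i] = h
            d[i] = (f(x + e) - f(x - e)) / (2 * h)
        else:
            xq = x.copy()
            xq[i] = q * xi
            d[i] = (f(xq) - f(x)) / ((q - 1.0) * xi)
    return d

def qg_descent(f, x0, q0=1.5, shrink=0.99, lr=0.05, steps=200):
    """Toy q-G iteration: follow the negative q-gradient while q -> 1,
    shifting from exploration (q far from 1) to local search (q ~ 1)."""
    x, q = np.asarray(x0, dtype=float), q0
    for _ in range(steps):
        x = x - lr * jackson_derivative(f, x, q)
        q = 1.0 + (q - 1.0) * shrink   # schedule q geometrically toward 1
    return x

f = lambda v: float(np.sum(v ** 2))    # convex test function
x_min = qg_descent(f, [3.0, -2.0])     # converges toward the origin
```

    For f(x) = Σx_i², the Jackson derivative in coordinate i is (q + 1)·x_i, which reduces to the classical gradient 2x_i as q → 1, matching the reduction to Steepest Descent stated above.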

  9. New 'ex vivo' radioisotopic method of quantitation of platelet deposition

    Energy Technology Data Exchange (ETDEWEB)

    Badimon, L.; Fuster, V.; Chesebro, J.H.; Dewanjee, M.K.

    1983-01-01

    We have developed a sensitive and quantitative method for the 'ex vivo' evaluation of platelet deposition on collagen strips, from rabbit Achilles tendon, superfused by flowing blood, and applied it to four animal species: cat, rabbit, dog and pig. Autologous platelets were labeled with indium-111-tropolone and injected into the animal 24 hr before the superfusion, and the number of deposited platelets was quantitated from the tendon gamma-radiation and the blood platelet count. We detected some platelet consumption with superfusion time when blood was reinfused into the contralateral jugular vein after collagen contact, but not if blood was discarded after the contact. Therefore, in order to have a more physiological animal model, we decided to discard blood after superfusion of the tendon. In all species except the cat, there was a linear relationship between the increase of platelets on the tendon and the time of exposure to blood superfusion. The highest number of platelets deposited on the collagen was found in cats, the lowest in dogs. Ultrastructural analysis showed that the platelets were deposited as aggregates after only 5 min of superfusion.

  10. Quantitative Imaging Biomarkers: A Review of Statistical Methods for Computer Algorithm Comparisons

    Science.gov (United States)

    2014-01-01

    Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research. PMID:24919829

  11. Quantitative imaging biomarkers: a review of statistical methods for computer algorithm comparisons.

    Science.gov (United States)

    Obuchowski, Nancy A; Reeves, Anthony P; Huang, Erich P; Wang, Xiao-Feng; Buckler, Andrew J; Kim, Hyun J Grace; Barnhart, Huiman X; Jackson, Edward F; Giger, Maryellen L; Pennello, Gene; Toledano, Alicia Y; Kalpathy-Cramer, Jayashree; Apanasovich, Tatiyana V; Kinahan, Paul E; Myers, Kyle J; Goldgof, Dmitry B; Barboriak, Daniel P; Gillies, Robert J; Schwartz, Lawrence H; Sullivan, Daniel C

    2015-02-01

    Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research.

  12. A novel method for morphological pleomorphism and heterogeneity quantitative measurement: Named cell feature level co-occurrence matrix.

    Science.gov (United States)

    Saito, Akira; Numata, Yasushi; Hamada, Takuya; Horisawa, Tomoyoshi; Cosatto, Eric; Graf, Hans-Peter; Kuroda, Masahiko; Yamamoto, Yoichiro

    2016-01-01

    Recent developments in molecular pathology and genetic/epigenetic analysis of cancer tissue have resulted in a marked increase in objective and measurable data. In comparison, the traditional morphological analysis approach to pathology diagnosis, which can connect these molecular data and clinical diagnosis, is still mostly subjective. Even though the advent and popularization of digital pathology has provided a boost to computer-aided diagnosis, some important pathological concepts still remain largely non-quantitative and their associated data measurements depend on the pathologist's sense and experience. Such features include pleomorphism and heterogeneity. In this paper, we propose a method for the objective measurement of pleomorphism and heterogeneity, using a cell-level co-occurrence matrix. Our method is based on the widely used gray-level co-occurrence matrix (GLCM), in which relations between neighboring pixel intensity levels are captured in a co-occurrence matrix, followed by the application of analysis functions such as Haralick features. In a pathological tissue image, through image processing techniques, each nucleus can be measured, and each nucleus has its own measurable features such as nucleus size, roundness, contour length, and intra-nucleus texture data (GLCM is one such method). In our approach, each nucleus in the tissue image plays the role that a pixel plays in the GLCM, and the most important point is how to define the neighborhood of each nucleus. We define three types of neighborhoods of a nucleus, then create the co-occurrence matrix and apply Haralick feature functions. In each image, pleomorphism and heterogeneity are then determined quantitatively. Because in our method one pixel corresponds to one nucleus feature, we named it the Cell Feature Level Co-occurrence Matrix (CFLCM). We tested this method for several nucleus features. CFLCM is shown to be a useful quantitative method for assessing pleomorphism and heterogeneity in histopathological images.
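
    The record builds on the standard gray-level co-occurrence matrix. As a point of reference, a minimal GLCM with one Haralick feature (contrast) can be sketched as follows; the toy image and function names are illustrative, and CFLCM itself would replace pixel intensity levels with per-nucleus feature levels and a nucleus-neighborhood definition.

```python
import numpy as np

def glcm(image, levels, offset=(0, 1), symmetric=True, normed=True):
    """Gray-level co-occurrence matrix for one pixel offset (dr, dc):
    counts how often level i sits next to level j at that offset."""
    dr, dc = offset
    m = np.zeros((levels, levels), dtype=float)
    rows, cols = image.shape
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dr, c + dc
            if 0 <= r2 < rows and 0 <= c2 < cols:
                m[image[r, c], image[r2, c2]] += 1
    if symmetric:
        m = m + m.T          # count each pair in both directions
    if normed:
        m = m / m.sum()      # turn counts into joint probabilities
    return m

def haralick_contrast(p):
    """Haralick contrast: sum over (i, j) of (i - j)^2 * p(i, j)."""
    i, j = np.indices(p.shape)
    return float(np.sum((i - j) ** 2 * p))

img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]], dtype=int)
p = glcm(img, levels=4)
contrast = haralick_contrast(p)
```

    In CFLCM terms, `img` would be replaced by a per-nucleus feature level (e.g. binned nucleus size), and the pixel offset by one of the paper's three nucleus-neighborhood definitions.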

  13. Quantitative methods for developing C2 system requirement

    Energy Technology Data Exchange (ETDEWEB)

    Tyler, K.K.

    1992-06-01

    The US Army established the Army Tactical Command and Control System (ATCCS) Experimentation Site (AES) to provide a place where material and combat developers could experiment with command and control systems. The AES conducts fundamental and applied research involving command and control issues using a number of research methods, ranging from large force-level experiments, to controlled laboratory experiments, to studies and analyses. The work summarized in this paper was done by Pacific Northwest Laboratory under task order from the Army Tactical Command and Control System Experimentation Site. The purpose of the task was to develop the functional requirements for army engineer automation and support software, including MCS-ENG. A client, such as an army engineer, has certain needs and requirements of his or her software; these needs must be presented in ways that are readily understandable to the software developer. A requirements analysis, then, such as the one described in this paper, is simply the means of communication between those who would use a piece of software and those who would develop it. The analysis from which this paper was derived attempted to bridge the "communications gap" between army combat engineers and software engineers. It sought to derive and state the software needs of army engineers in ways that are meaningful to software engineers. In doing this, it followed a natural sequence of investigation: (1) what does an army engineer do, (2) with which tasks can software help, (3) how much will it cost, and (4) where is the highest payoff? This paper demonstrates how each of these questions was addressed during an analysis of the functional requirements of engineer support software. Systems engineering methods were used in a task analysis, and a quantitative scoring method was developed to score responses regarding the feasibility of task automation. The paper discusses the methods used to perform utility and cost-benefit estimates.

  14. [Research on rapid and quantitative detection method for organophosphorus pesticide residue].

    Science.gov (United States)

    Sun, Yuan-Xin; Chen, Bing-Tai; Yi, Sen; Sun, Ming

    2014-05-01

    The methods of physical-chemical inspection adopted in traditional pesticide residue detection require many pretreatment processes and are time-consuming and complicated. In the present study, the authors take chlorpyrifos, applied widely in the present agricultural field, as the research object and propose a rapid and quantitative detection method for organophosphorus pesticide residues. At first, according to the chemical characteristics of chlorpyrifos and the chromogenic effect and secondary pollution of several colorimetric reagents, a pretreatment scheme based on the chromogenic reaction of chlorpyrifos with resorcinol in a weakly alkaline environment was determined. Secondly, by analyzing the UV-Vis spectrum data of chlorpyrifos samples whose content was between 0.5 and 400 mg kg-1, it was confirmed that the characteristic information after the color reaction was mainly concentrated in the 360-400 nm range. Thirdly, a full-spectrum forecasting model was established based on partial least squares, whose correlation coefficient of calibration was 0.9996, correlation coefficient of prediction reached 0.9956, standard deviation of calibration (RMSEC) was 2.8147 mg kg-1, and standard deviation of verification (RMSEP) was 8.0124 mg kg-1. Fourthly, a waveband centered at 400 nm was extracted as the characteristic region to build a forecasting model, whose correlation coefficient of calibration was 0.9996, correlation coefficient of prediction reached 0.9993, standard deviation of calibration (RMSEC) was 2.5667 mg kg-1, and standard deviation of verification (RMSEP) was 4.8866 mg kg-1, respectively. At last, by analyzing the near-infrared spectrum data of chlorpyrifos samples with contents between 0.5 and 16 mg kg-1, the authors found that although the characteristics of the chromogenic functional group are not obvious, the change of absorption peaks of resorcinol itself in the neighborhood of 5200 cm
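
    The full-spectrum model described above is a partial least squares calibration. A minimal PLS1 (NIPALS) sketch on synthetic "spectra" is shown below; the data, dimensions, and function names are invented for illustration, and the authors' actual spectra and preprocessing are not reproduced.

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """Minimal NIPALS PLS1. Returns (x_mean, y_mean, B) such that
    predictions are y_mean + (X_new - x_mean) @ B."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xr, yr = X - x_mean, y - y_mean
    W, P, Q = [], [], []
    for _ in range(n_components):
        w = Xr.T @ yr                 # weight vector (covariance direction)
        w /= np.linalg.norm(w)
        t = Xr @ w                    # scores
        tt = t @ t
        p = Xr.T @ t / tt             # X loadings
        q = (yr @ t) / tt             # y loading
        Xr = Xr - np.outer(t, p)      # deflate X
        yr = yr - q * t               # deflate y
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    B = W @ np.linalg.solve(P.T @ W, Q)   # regression vector on centered data
    return x_mean, y_mean, B

# Synthetic "absorbance spectra": two latent factors drive both the
# spectrum and the concentration, as in a two-component mixture.
rng = np.random.default_rng(1)
scores = rng.normal(size=(40, 2))              # latent factors
loadings = rng.normal(size=(2, 60))            # spectral shapes
X = scores @ loadings + rng.normal(0, 1e-3, (40, 60))
y = scores @ np.array([2.0, -1.0])             # "concentration"

x_mean, y_mean, B = pls1_fit(X[:30], y[:30], n_components=2)
y_pred = y_mean + (X[30:] - x_mean) @ B
```

    Restricting `X` to a narrow waveband before fitting mirrors the paper's fourth step, where a model built on the 400 nm characteristic region outperformed the full-spectrum model on prediction error.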

  15. Global human capital: integrating education and population.

    Science.gov (United States)

    Lutz, Wolfgang; KC, Samir

    2011-07-29

    Almost universally, women with higher levels of education have fewer children. Better education is associated with lower mortality, better health, and different migration patterns. Hence, the global population outlook depends greatly on further progress in education, particularly of young women. By 2050, the highest and lowest education scenarios--assuming identical education-specific fertility rates--result in world population sizes of 8.9 and 10.0 billion, respectively. Better education also matters for human development, including health, economic growth, and democracy. Existing methods of multi-state demography can quantitatively integrate education into standard demographic analysis, thus adding the "quality" dimension.

  16. A quantitative analysis of Tl-201 myocardial perfusion image with special reference to circumferential profile method

    International Nuclear Information System (INIS)

    Miyanaga, Hajime

    1982-01-01

    A quantitative analysis of thallium-201 myocardial perfusion images (MPI) was attempted using the circumferential profile method (CPM), and the first purpose of this study was to assess the clinical utility of this method for the detection of myocardial ischemia. In patients with coronary artery disease, CPM analysis of exercise Tl-MPI showed high sensitivity (9/12, 75%) and specificity (9/9, 100%), whereas exercise ECG showed high sensitivity (9/12, 75%) but relatively low specificity (7/9, 78%). In patients with myocardial infarction, CPM also showed high sensitivity (34/38, 89%) for the detection of myocardial necrosis, compared with visual interpretation (31/38, 81%) and with ECG (31/38, 81%). Defect score correlated well with the number of abnormal Q waves. In the exercise study, CPM was also sensitive to the change in perfusion defect in Tl-MPI produced by exercise. The results indicate that CPM is a quantitative and objective method for analyzing Tl-MPI. Although ECG is the most commonly used diagnostic tool for ischemic heart disease, several exercise-induced ischemic changes in ECG are still under discussion as criteria. The second purpose of this study was therefore to evaluate these ischemic ECG changes against quantitatively analyzed exercise Tl-MPI. ST depression (ischemic, 1 mm and junctional, 2 mm or more), ST elevation (1 mm or more), and coronary T-wave reversion in exercise ECG were regarded as ischemic changes. (J.P.N.)

  17. Quantitative method to assess caries via fluorescence imaging from the perspective of autofluorescence spectral analysis

    Science.gov (United States)

    Chen, Q. G.; Zhu, H. H.; Xu, Y.; Lin, B.; Chen, H.

    2015-08-01

    A quantitative method to discriminate caries lesions for a fluorescence imaging system is proposed in this paper. The autofluorescence spectral investigation of 39 teeth samples classified by the International Caries Detection and Assessment System levels was performed at 405 nm excitation. The major differences in the different caries lesions focused on the relative spectral intensity range of 565-750 nm. The spectral parameter, defined as the ratio of wavebands at 565-750 nm to the whole spectral range, was calculated. The image component ratio R/(G + B) of color components was statistically computed by considering the spectral parameters (e.g. autofluorescence, optical filter, and spectral sensitivity) in our fluorescence color imaging system. Results showed that the spectral parameter and image component ratio presented a linear relation. Therefore, the image component ratio was graded as 1.62 to quantitatively classify sound, early decay, established decay, and severe decay tissues, respectively. Finally, the fluorescence images of caries were experimentally obtained, and the corresponding image component ratio distribution was compared with the classification result. A method to determine the numerical grades of caries using a fluorescence imaging system was proposed. This method can be applied to similar imaging systems.
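
    The classification above rests on the per-region image component ratio R/(G + B). A minimal sketch with invented pixel values, using the record's 1.62 grade only as an illustrative cutoff:

```python
import numpy as np

def component_ratio(rgb):
    """Mean R/(G+B) over an image region; rgb is an HxWx3 array."""
    rgb = np.asarray(rgb, dtype=float)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return float(np.mean(r / np.clip(g + b, 1e-9, None)))  # avoid /0

# Two synthetic 8x8 patches: greenish "sound enamel" autofluorescence
# versus a red-shifted patch, since lesions fluoresce relatively more in
# the red (565-750 nm) band under 405 nm excitation.
sound = np.dstack([np.full((8, 8), 60.0),
                   np.full((8, 8), 90.0),
                   np.full((8, 8), 40.0)])
lesion = np.dstack([np.full((8, 8), 170.0),
                    np.full((8, 8), 60.0),
                    np.full((8, 8), 40.0)])

ratios = component_ratio(sound), component_ratio(lesion)
```

    In a real system the grade boundaries would be calibrated against the spectral parameter via the linear relation reported above, not hard-coded.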

  18. Do Methods Matter in Global Leadership Development? Testing the Global Leadership Development Ecosystem Conceptual Model

    Science.gov (United States)

    Walker, Jennie L.

    2018-01-01

    As world communication, technology, and trade become increasingly integrated through globalization, multinational corporations seek employees with global leadership skills. However, the demand for these skills currently outweighs the supply. Given the rarity of globally ready leaders, global competency development should be emphasized in business…

  19. A Study on Efficiency Improvement of the Hybrid Monte Carlo/Deterministic Method for Global Transport Problems

    International Nuclear Information System (INIS)

    Kim, Jong Woo; Woo, Myeong Hyeon; Kim, Jae Hyun; Kim, Do Hyun; Shin, Chang Ho; Kim, Jong Kyung

    2017-01-01

    In this study, a hybrid Monte Carlo/deterministic method for radiation transport analysis in a global system is explained. The FW-CADIS methodology constructs the weight-window parameters and is useful for most global MC calculations. However, due to the assumption that a particle is scored at a tally, fewer particles are transported to the periphery of mesh tallies. To compensate for this space-dependency, we modified the module in the ADVANTG code to add the proposed method. We solved a simple test problem for comparison with results from the FW-CADIS methodology, and it was confirmed that a uniform statistical error was secured as intended. In the future, more practical problems will be added. It might be useful to perform radiation transport analysis using the hybrid Monte Carlo/deterministic method in global transport problems.

  20. New approaches for the analysis of confluent cell layers with quantitative phase digital holographic microscopy

    Science.gov (United States)

    Pohl, L.; Kaiser, M.; Ketelhut, S.; Pereira, S.; Goycoolea, F.; Kemper, Björn

    2016-03-01

    Digital holographic microscopy (DHM) enables high-resolution non-destructive inspection of technical surfaces and minimally-invasive label-free live cell imaging. However, the analysis of confluent cell layers represents a challenge, as quantitative DHM phase images in this case do not provide sufficient information for image segmentation, determination of the cellular dry mass or calculation of the cell thickness. We present novel strategies for the analysis of confluent cell layers with quantitative DHM phase contrast utilizing a histogram-based evaluation procedure. The applicability of our approach is illustrated by the quantification of drug-induced cell morphology changes, and it is shown that the method is capable of reliably quantifying global morphology changes of confluent cell layers.

  1. Practicable methods for histological section thickness measurement in quantitative stereological analyses.

    Science.gov (United States)

    Matenaers, Cyrill; Popper, Bastian; Rieger, Alexandra; Wanke, Rüdiger; Blutke, Andreas

    2018-01-01

    The accuracy of quantitative stereological analysis tools such as the (physical) disector method substantially depends on the precise determination of the thickness of the analyzed histological sections. One conventional method for measurement of histological section thickness is to re-embed the section of interest vertically to its original section plane. The section thickness is then measured in a subsequently prepared histological section of this orthogonally re-embedded sample. However, the orthogonal re-embedding (ORE) technique is quite work- and time-intensive and may produce inaccurate section thickness measurement values due to unintentional, slightly oblique (non-orthogonal) positioning of the re-embedded sample section. Here, an improved ORE method is presented, allowing for determination of the factual section plane angle of the re-embedded section and correction of measured section thickness values for oblique (non-orthogonal) sectioning. For this, the analyzed section is mounted flat on a foil of known thickness (calibration foil), and both the section and the calibration foil are then vertically (re-)embedded. The section angle of the re-embedded section is then calculated from the deviation of the measured section thickness of the calibration foil from its factual thickness, using basic geometry. To find a practicable, fast, and accurate alternative to ORE, the suitability of spectral reflectance (SR) measurement for determination of plastic section thicknesses was evaluated. Using a commercially available optical reflectometer (F20, Filmetrics®, USA), the thicknesses of 0.5 μm thick semi-thin Epon (glycid ether) sections and of 1-3 μm thick plastic sections (glycolmethacrylate/methylmethacrylate, GMA/MMA), as regularly used in physical disector analyses, could be precisely measured within a few seconds. Compared to the section thicknesses determined by ORE, SR measures displayed less than 1% deviation. Our results prove the applicability
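
    The angle correction described above follows from basic trigonometry: an oblique cut inflates the apparent thickness of both the calibration foil and the section by the same factor 1/cos(α), so cos(α) can be read off the foil and applied to the section. A sketch under these assumptions (the function name and the micrometer values are illustrative, not the paper's data):

```python
import math

def oblique_correction(foil_true_um, foil_measured_um, section_measured_um):
    """Correct a section thickness measured in an orthogonally re-embedded
    sample for oblique sectioning, using a calibration foil of known
    thickness. An oblique cut at angle alpha inflates apparent thickness
    by 1/cos(alpha), so cos(alpha) = true / measured foil thickness."""
    cos_alpha = foil_true_um / foil_measured_um
    if not 0.0 < cos_alpha <= 1.0:
        raise ValueError("measured foil thickness must be >= true thickness")
    alpha_deg = math.degrees(math.acos(cos_alpha))
    corrected = section_measured_um * cos_alpha
    return corrected, alpha_deg

# Example: a 25.0 um calibration foil appears 25.9 um thick, so the
# re-embedded block was cut slightly obliquely; correct a 1.05 um reading.
corrected, angle = oblique_correction(25.0, 25.9, 1.05)
```

    Note that the section thickness itself never needs the angle explicitly; the foil ratio is the whole correction factor, which is why a single calibration foil per block suffices.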

  2. Radioisotopic neutron transmission spectrometry: Quantitative analysis by using partial least-squares method

    International Nuclear Information System (INIS)

    Kim, Jong-Yun; Choi, Yong Suk; Park, Yong Joon; Jung, Sung-Hee

    2009-01-01

    Neutron spectrometry, based on the scattering of high-energy fast neutrons from a radioisotope and their slowing-down by light hydrogen atoms, is a useful technique for non-destructive, quantitative measurement of hydrogen content because it has a large measuring volume and is not affected by temperature, pressure, pH value and color. The most common choice for a radioisotope neutron source is 252Cf or 241Am-Be. In this study, 252Cf with a neutron flux of 6.3x10^6 n/s has been used as an attractive neutron source because of its high neutron flux and weak radioactivity. Pulse-height neutron spectra have been obtained by using an in-house built radioisotopic neutron spectrometric system equipped with a 3He detector and multi-channel analyzer, including a neutron shield. As a preliminary study, a polyethylene block (density of ~0.947 g/cc and area of 40 cm x 25 cm) was used for the determination of hydrogen content by using multivariate calibration models, depending on the thickness of the block. Compared with the results obtained from a simple linear calibration model, the partial least-squares regression (PLSR) method offered better performance in quantitative data analysis. It also revealed that the PLSR method in a neutron spectrometric system can be promising for the real-time, online monitoring of the powder process to determine the content of any type of molecules containing hydrogen nuclei.

  3. Global self-esteem and method effects: competing factor structures, longitudinal invariance, and response styles in adolescents.

    Science.gov (United States)

    Urbán, Róbert; Szigeti, Réka; Kökönyei, Gyöngyi; Demetrovics, Zsolt

    2014-06-01

    The Rosenberg Self-Esteem Scale (RSES) is a widely used measure for assessing self-esteem, but its factor structure is debated. Our goals were to compare 10 alternative models for the RSES and to quantify and predict the method effects. This sample involves two waves (N = 2,513 9th-grade and 2,370 10th-grade students) from five waves of a school-based longitudinal study. The RSES was administered in each wave. The global self-esteem factor with two latent method factors yielded the best fit to the data. The global factor explained a large amount of the common variance (61% and 46%); however, a relatively large proportion of the common variance was attributed to the negative method factor (34% and 41%), and a small proportion of the common variance was explained by the positive method factor (5% and 13%). We conceptualized the method effect as a response style and found that being a girl and having a higher number of depressive symptoms were associated with both low self-esteem and negative response style, as measured by the negative method factor. Our study supported the one global self-esteem construct and quantified the method effects in adolescents.

  4. Global self-esteem and method effects: competing factor structures, longitudinal invariance and response styles in adolescents

    Science.gov (United States)

    Urbán, Róbert; Szigeti, Réka; Kökönyei, Gyöngyi; Demetrovics, Zsolt

    2013-01-01

    The Rosenberg Self-Esteem Scale (RSES) is a widely used measure for assessing self-esteem, but its factor structure is debated. Our goals were to compare 10 alternative models for the RSES and to quantify and predict the method effects. This sample involves two waves (N = 2,513 ninth-grade and 2,370 tenth-grade students) from five waves of a school-based longitudinal study. The RSES was administered in each wave. The global self-esteem factor with two latent method factors yielded the best fit to the data. The global factor explained a large amount of the common variance (61% and 46%); however, a relatively large proportion of the common variance was attributed to the negative method factor (34% and 41%), and a small proportion of the common variance was explained by the positive method factor (5% and 13%). We conceptualized the method effect as a response style, and found that being a girl and having a higher number of depressive symptoms were associated with both low self-esteem and negative response style, as measured by the negative method factor. Our study supported the one global self-esteem construct and quantified the method effects in adolescents. PMID:24061931

  5. Relationship between Plaque Echo, Thickness and Neovascularization Assessed by Quantitative and Semi-quantitative Contrast-Enhanced Ultrasonography in Different Stenosis Groups.

    Science.gov (United States)

    Song, Yan; Feng, Jun; Dang, Ying; Zhao, Chao; Zheng, Jie; Ruan, Litao

    2017-12-01

    The aim of this study was to determine the relationship between plaque echo, thickness and neovascularization in different stenosis groups using quantitative and semi-quantitative contrast-enhanced ultrasound (CEUS) in patients with carotid atherosclerosis plaque. A total of 224 plaques were divided into mild stenosis (Quantitative and semi-quantitative methods were used to assess plaque neovascularization and determine the relationship between plaque echo, thickness and neovascularization. Correlation analysis revealed no relationship of neovascularization with plaque echo in the groups using either quantitative or semi-quantitative methods. Furthermore, there was no correlation of neovascularization with plaque thickness using the semi-quantitative method. The ratio of areas under the curve (RAUC) was negatively correlated with plaque thickness (r = -0.317, p = 0.001) in the mild stenosis group. With the quartile method, plaque thickness of the mild stenosis group was divided into four groups, with significant differences between the 1.5-2.2 mm and ≥3.5 mm groups (p = 0.002), 2.3-2.8 mm and ≥3.5 mm groups (p quantitative and quantitative CEUS methods characterizing neovascularization of plaque are equivalent with respect to assessing relationships between neovascularization, echogenicity and thickness. However, the quantitative method could fail for plaque <3.5 mm because of motion artifacts. Copyright © 2017 World Federation for Ultrasound in Medicine and Biology. Published by Elsevier Inc. All rights reserved.

  6. Global-Mindedness and Intercultural Competence: A Quantitative Study of Pre-Service Teachers

    Science.gov (United States)

    Cui, Qi

    2013-01-01

    This study assessed pre-service teachers' levels of global-mindedness and intercultural competence using the Global-Mindedness Scale (GMS) and the Cultural Intelligence Scale (CQS) and investigated the correlation between the two. The study examined whether the individual scale factors such as gender, perceived competence in non-native language or…

  7. Understanding Factors that Shape Gender Attitudes in Early Adolescence Globally: A Mixed-Methods Systematic Review.

    Science.gov (United States)

    Kågesten, Anna; Gibbs, Susannah; Blum, Robert Wm; Moreau, Caroline; Chandra-Mouli, Venkatraman; Herbert, Ann; Amin, Avni

    2016-01-01

    Early adolescence (ages 10-14) is a period of increased expectations for boys and girls to adhere to socially constructed and often stereotypical norms that perpetuate gender inequalities. The endorsement of such gender norms is closely linked to poor adolescent sexual and reproductive and other health-related outcomes, yet little is known about the factors that influence young adolescents' personal gender attitudes. To explore factors that shape gender attitudes in early adolescence across different cultural settings globally. A mixed-methods systematic review was conducted of the peer-reviewed literature in 12 databases from 1984-2014. Four reviewers screened the titles and abstracts of articles and reviewed full text articles in duplicate. Data extraction and quality assessments were conducted using standardized templates by study design. Thematic analysis was used to synthesize quantitative and qualitative data organized by the social-ecological framework (individual, interpersonal and community/societal-level factors influencing gender attitudes). Eighty-two studies (46 quantitative, 31 qualitative, 5 mixed-methods) spanning 29 countries were included. Ninety percent of studies were from North America or Western Europe. The review findings indicate that young adolescents, across cultural settings, commonly express stereotypical or inequitable gender attitudes, and such attitudes appear to vary by individual sociodemographic characteristics (sex, race/ethnicity and immigration, social class, and age). Findings highlight that interpersonal influences (family and peers) are central influences on young adolescents' construction of gender attitudes, and these gender socialization processes differ for boys and girls. The role of community factors (e.g. media) is less clear, though there is some evidence that schools may reinforce stereotypical gender attitudes among young adolescents. The findings from this review suggest that young adolescents in different cultural

  8. Understanding Factors that Shape Gender Attitudes in Early Adolescence Globally: A Mixed-Methods Systematic Review.

    Directory of Open Access Journals (Sweden)

    Anna Kågesten

    Full Text Available Early adolescence (ages 10-14) is a period of increased expectations for boys and girls to adhere to socially constructed and often stereotypical norms that perpetuate gender inequalities. The endorsement of such gender norms is closely linked to poor adolescent sexual and reproductive and other health-related outcomes, yet little is known about the factors that influence young adolescents' personal gender attitudes. To explore factors that shape gender attitudes in early adolescence across different cultural settings globally. A mixed-methods systematic review was conducted of the peer-reviewed literature in 12 databases from 1984-2014. Four reviewers screened the titles and abstracts of articles and reviewed full text articles in duplicate. Data extraction and quality assessments were conducted using standardized templates by study design. Thematic analysis was used to synthesize quantitative and qualitative data organized by the social-ecological framework (individual, interpersonal and community/societal-level factors influencing gender attitudes). Eighty-two studies (46 quantitative, 31 qualitative, 5 mixed-methods) spanning 29 countries were included. Ninety percent of studies were from North America or Western Europe. The review findings indicate that young adolescents, across cultural settings, commonly express stereotypical or inequitable gender attitudes, and such attitudes appear to vary by individual sociodemographic characteristics (sex, race/ethnicity and immigration, social class, and age). Findings highlight that interpersonal influences (family and peers) are central influences on young adolescents' construction of gender attitudes, and these gender socialization processes differ for boys and girls. The role of community factors (e.g. media) is less clear, though there is some evidence that schools may reinforce stereotypical gender attitudes among young adolescents. The findings from this review suggest that young adolescents in different

  9. What Is in Your Wallet? Quantitation of Drugs of Abuse on Paper Currency with a Rapid LC-MS/MS Method

    Science.gov (United States)

    Parker, Patrick D.; Beers, Brandon; Vergne, Matthew J.

    2017-01-01

    Laboratory experiments were developed to introduce students to the quantitation of drugs of abuse by high performance liquid chromatography-tandem mass spectrometry (LC-MS/MS). Undergraduate students were introduced to internal standard quantitation and the LC-MS/MS method optimization for cocaine. Cocaine extracted from paper currency was…

  10. Innovative Born Globals

    DEFF Research Database (Denmark)

    Kraus, Sascha; Brem, Alexander; Muench, Miriam

    2017-01-01

    Internationalization is a hot topic in innovation management, yet the phenomenon of “Born Globals” has so far been studied mainly in the domains of Entrepreneurship and International Management. As business model design plays a key role for Born Globals, we link these two concepts. For this, we...... propose hypotheses about the influence of efficiency-centered and novelty-centered business model design on international firm performance. To test these hypotheses, we performed a quantitative survey with 252 founders of international companies in Germany, Switzerland and Liechtenstein. Additionally, we...... gained further insights through a case study analysis of 11 Born Globals. The results show that business model design matters to international firm performance, and the business model design of Born Globals tends to be more efficiency-centered. Based on a multiple case study, we analyzed business models...

  11. Parameter identification and global sensitivity analysis of Xin'anjiang model using meta-modeling approach

    Directory of Open Access Journals (Sweden)

    Xiao-meng Song

    2013-01-01

    Full Text Available Parameter identification, model calibration, and uncertainty quantification are important steps in the model-building process, and are necessary for obtaining credible results and valuable information. Sensitivity analysis of a hydrological model is a key step in model uncertainty quantification: it can identify the dominant parameters, reduce calibration uncertainty, and enhance optimization efficiency. Classical approaches, however, require a long time and a high computational cost to quantitatively assess the sensitivity of a multi-parameter hydrological model. For this reason, a two-step statistical evaluation framework using global techniques is presented. It is based on (1) a screening method (Morris) for qualitative ranking of parameters, and (2) a variance-based method integrated with a meta-model for quantitative sensitivity analysis, i.e., the Sobol method integrated with the response surface model (RSMSobol). First, the Morris screening method was used to qualitatively identify the parameters' sensitivity, and ten parameters were then selected for quantification of the sensitivity indices. Subsequently, the RSMSobol method was used to quantify the sensitivity, i.e., the first-order and total sensitivity indices based on the response surface model (RSM) were calculated. The RSMSobol method can not only quantify the sensitivity but also reduce the computational cost, with good accuracy compared to the classical approaches. This approach will be effective and reliable in the global sensitivity analysis of a complex large-scale distributed hydrological model.
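    The screening step of such a two-step framework can be illustrated in a few lines. The sketch below implements Morris-style elementary effects (the mean absolute effect, mu*) for a toy two-parameter function; it is not the Xin'anjiang model or the authors' implementation, and the model function and all parameter values are invented for the example.

```python
import random
from statistics import mean

def model(x):
    """Toy stand-in for a hydrological model (NOT Xin'anjiang):
    parameter x[0] dominates the output, x[1] is nearly inert."""
    return 10.0 * x[0] + 0.1 * x[1] ** 2

def morris_mu_star(f, k, r=50, delta=0.25, seed=0):
    """Estimate mu* (mean absolute elementary effect) for each of k
    parameters on the unit hypercube, using r one-at-a-time steps."""
    rng = random.Random(seed)
    effects = [[] for _ in range(k)]
    for _ in range(r):
        x = [rng.uniform(0.0, 1.0 - delta) for _ in range(k)]
        y0 = f(x)
        for i in range(k):
            xp = list(x)
            xp[i] += delta  # perturb only parameter i
            effects[i].append(abs(f(xp) - y0) / delta)
    return [mean(e) for e in effects]

mu_star = morris_mu_star(model, k=2)
# mu_star[0] >> mu_star[1]: parameter 0 would be kept for the
# quantitative (Sobol) stage, parameter 1 could be fixed.
```

    In the full framework, only the parameters ranked influential here would be carried into the more expensive variance-based (Sobol/RSM) stage.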

  12. Interlaboratory validation of quantitative duplex real-time PCR method for screening analysis of genetically modified maize.

    Science.gov (United States)

    Takabatake, Reona; Koiwa, Tomohiro; Kasahara, Masaki; Takashima, Kaori; Futo, Satoshi; Minegishi, Yasutaka; Akiyama, Hiroshi; Teshima, Reiko; Oguchi, Taichi; Mano, Junichi; Furui, Satoshi; Kitta, Kazumi

    2011-01-01

    To reduce the cost and time required to routinely perform genetically modified organism (GMO) tests, we developed a duplex quantitative real-time PCR method for a screening analysis simultaneously targeting an event-specific segment of GA21 and the Cauliflower Mosaic Virus 35S promoter (P35S) segment [Oguchi et al., J. Food Hyg. Soc. Japan, 50, 117-125 (2009)]. To confirm the validity of the method, an interlaboratory collaborative study was conducted. In the collaborative study, conversion factors (Cfs), which are required to calculate the GMO amount (%), were first determined for two real-time PCR instruments, the ABI PRISM 7900HT and the ABI PRISM 7500. A blind test was then conducted. The limit of quantitation for both GA21 and P35S was estimated to be 0.5% or less. The trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSD(R)), respectively. The determined bias and RSD(R) were each less than 25%. We believe the developed method would be useful for the practical screening analysis of GM maize.
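    The role of the conversion factor (Cf) can be sketched briefly. In assays of this style, the GMO amount is commonly computed from the ratio of event-specific to endogenous-gene copy numbers, normalized by the Cf measured on 100% GM reference material for each instrument. The function name and all numbers below are hypothetical, not values from the study:

```python
def gmo_percent(gm_copies, endogenous_copies, cf):
    """GMO amount (%) from real-time PCR copy numbers. cf is the
    conversion factor: the GM/endogenous copy-number ratio measured
    on a 100% GM reference, determined per instrument."""
    return (gm_copies / endogenous_copies) / cf * 100.0

# Hypothetical run: 50 GA21 copies vs 20,000 endogenous copies, with
# an assumed Cf of 0.5, giving a GMO amount right at a 0.5% threshold.
print(gmo_percent(50, 20_000, 0.5))  # ≈ 0.5
```

    Determining Cf separately for the 7900HT and 7500 instruments, as in the study, compensates for instrument-specific amplification differences.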

  13. NNAlign: A Web-Based Prediction Method Allowing Non-Expert End-User Discovery of Sequence Motifs in Quantitative Peptide Data

    DEFF Research Database (Denmark)

    Andreatta, Massimo; Schafer-Nielsen, Claus; Lund, Ole

    2011-01-01

    Recent advances in high-throughput technologies have made it possible to generate both gene and protein sequence data at an unprecedented rate and scale thereby enabling entirely new "omics"-based approaches towards the analysis of complex biological processes. However, the amount and complexity...... to interpret large data sets. We have recently developed a method, NNAlign, which is generally applicable to any biological problem where quantitative peptide data is available. This method efficiently identifies underlying sequence patterns by simultaneously aligning peptide sequences and identifying motifs...... associated with quantitative readouts. Here, we provide a web-based implementation of NNAlign allowing non-expert end-users to submit their data (optionally adjusting method parameters), and in return receive a trained method (including a visual representation of the identified motif) that subsequently can...

  14. A quantitative method for risk assessment of agriculture due to climate change

    Science.gov (United States)

    Dong, Zhiqiang; Pan, Zhihua; An, Pingli; Zhang, Jingting; Zhang, Jun; Pan, Yuying; Huang, Lei; Zhao, Hui; Han, Guolin; Wu, Dong; Wang, Jialin; Fan, Dongliang; Gao, Lin; Pan, Xuebiao

    2018-01-01

    Climate change has greatly affected agriculture. Agriculture faces increasing risks because of its sensitivity and vulnerability to climate change. Scientific assessment of climate change-induced agricultural risks could help societies deal actively with climate change and ensure food security. However, quantitative assessment of risk is a difficult issue. Here, based on the IPCC assessment reports, a quantitative method for risk assessment of agriculture due to climate change is proposed. Risk is described as the product of the degree of loss and its probability of occurrence. The degree of loss can be expressed by the yield change amplitude. The probability of occurrence can be calculated with the new concept of climate change effect-accumulated frequency (CCEAF). Specific steps of this assessment method are suggested. The method proved feasible and practical when tested on spring wheat in Wuchuan County, Inner Mongolia. The results show that the fluctuation of spring wheat yield increased with the warming and drying climatic trend in Wuchuan County. For the maximum temperature increase of 88.3%, the maximum yield decrease and its probability were 3.5 and 64.6%, respectively, and the risk was 2.2%. For the maximum precipitation decrease of 35.2%, the maximum yield decrease and its probability were 14.1 and 56.1%, respectively, and the risk was 7.9%. For the combined impacts of temperature and precipitation, the maximum yield decrease and its probability were 17.6 and 53.4%, respectively, and the risk increased to 9.4%. Without appropriate adaptation strategies, both the degree of loss from the negative impacts of multiple climatic factors and its probability of occurrence will increase, and the risk will grow accordingly.
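    The risk definition above is simple enough to verify by hand: risk is the product of the degree of loss and its probability of occurrence. A minimal sketch (the function name is invented) reproduces the combined temperature-and-precipitation case, 17.6% maximum yield decrease with 53.4% probability:

```python
def climate_risk(loss, probability):
    """Risk as the product of the degree of loss and its probability
    of occurrence, both expressed as fractions of 1."""
    return loss * probability

# Combined temperature-and-precipitation case from the abstract:
risk = climate_risk(0.176, 0.534)
print(round(risk * 100, 1))  # → 9.4 (%)
```

    The same arithmetic recovers the single-factor cases: 0.141 × 0.561 gives the 7.9% precipitation risk reported above.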

  15. Quantitative cerebral H215O perfusion PET without arterial blood sampling, a method based on washout rate

    International Nuclear Information System (INIS)

    Treyer, Valerie; Jobin, Mathieu; Burger, Cyrill; Buck, Alfred; Teneggi, Vincenzo

    2003-01-01

    The quantitative determination of regional cerebral blood flow (rCBF) is important in certain clinical and research applications. The disadvantage of most quantitative methods using H2(15)O positron emission tomography (PET) is the need for arterial blood sampling. In this study a new non-invasive method for rCBF quantification was evaluated. The method is based on the washout rate of H2(15)O following intravenous injection. All results were obtained with Alpert's method, which yields maps of the washin parameter K1 (rCBF(K1)) and the washout parameter k2 (rCBF(k2)). Maps of rCBF(K1) were computed with measured arterial input curves. Maps of rCBF(k2*) were calculated with a standard input curve, the mean of eight individual input curves. The mean of grey-matter rCBF(k2*) (CBF(k2*)) was then compared with the mean of rCBF(K1) (CBF(K1)) in ten healthy volunteer smokers who underwent two PET sessions on day 1 and day 3. Each session consisted of three serial H2(15)O scans. Reproducibility was analysed using the rCBF difference scan 3-scan 2 in each session. The perfusion reserve (PR = rCBF(acetazolamide) - rCBF(baseline)) following acetazolamide challenge was calculated with rCBF(k2*) (PR(k2*)) and rCBF(K1) (PR(K1)) in ten patients with cerebrovascular disease. The difference CBF(k2*) - CBF(K1) was 5.90±8.12 ml/min/100 ml (mean±SD, n=55). The SD of the scan 3-scan 1 difference was 6.1% for rCBF(k2*) and rCBF(K1), demonstrating a high reproducibility. Perfusion reserve values determined with rCBF(K1) and rCBF(k2*) were in high agreement (difference PR(k2*) - PR(K1) = -6.5±10.4%, PR expressed as percentage increase from baseline). In conclusion, a new non-invasive method for the quantitative determination of rCBF is presented. The method is in good agreement with Alpert's original method and the reproducibility is high. It does not require arterial blood sampling, yields quantitative voxel-by-voxel maps of rCBF, and is computationally efficient and easy to implement

  16. Inverse methods for 3D quantitative optical coherence elasticity imaging (Conference Presentation)

    Science.gov (United States)

    Dong, Li; Wijesinghe, Philip; Hugenberg, Nicholas; Sampson, David D.; Munro, Peter R. T.; Kennedy, Brendan F.; Oberai, Assad A.

    2017-02-01

    In elastography, quantitative elastograms are desirable as they are system and operator independent. Such quantification also facilitates more accurate diagnosis, longitudinal studies and studies performed across multiple sites. In optical elastography (compression, surface-wave or shear-wave), quantitative elastograms are typically obtained by assuming some form of homogeneity. This simplifies data processing at the expense of smearing sharp transitions in elastic properties, and/or introducing artifacts in these regions. Recently, we proposed an inverse problem-based approach to compression OCE that does not assume homogeneity, and overcomes the drawbacks described above. In this approach, the difference between the measured and predicted displacement field is minimized by seeking the optimal distribution of elastic parameters. The predicted displacements and recovered elastic parameters together satisfy the constraint of the equations of equilibrium. This approach, which has been applied in two spatial dimensions assuming plane strain, has yielded accurate material property distributions. Here, we describe the extension of the inverse problem approach to three dimensions. In addition to the advantage of visualizing elastic properties in three dimensions, this extension eliminates the plane strain assumption and is therefore closer to the true physical state. It does, however, incur greater computational costs. We address this challenge through a modified adjoint problem, spatially adaptive grid resolution, and three-dimensional decomposition techniques. Through these techniques the inverse problem is solved on a typical desktop machine within a wall clock time of 20 hours. We present the details of the method and quantitative elasticity images of phantoms and tissue samples.

  17. Quantitative X-ray methods of amorphous content and crystallinity determination of SiO2, in Quartz and Opal mixture

    International Nuclear Information System (INIS)

    Ketabdari, M.R.; Ahmadi, K.; Esmaeilnia Shirvani, A.; Tofigh, A.

    2001-01-01

    The X-ray diffraction technique is commonly used for qualitative analysis of minerals and has also been used successfully for quantitative measurements. In this research, the matrix-flushing method and a new X-ray diffraction method were used to determine the crystallinity and amorphous content of an Opal and Quartz mixture. The PCAPD software was used for the quantitative analysis of these two minerals

  18. Fundamental and clinical studies on simultaneous, quantitative analysis of hepatobiliary and gastrointestinal scintigrams using double isotope method

    Energy Technology Data Exchange (ETDEWEB)

    Aoki, Y; Kakihara, M; Sasaki, M; Tabuse, Y; Takei, N [Wakayama Medical Coll. (Japan)]

    1981-04-01

    The double isotope method was applied to carry out simultaneous, quantitative analysis of hepatobiliary and gastrointestinal scintigrams. A scintillation camera with a parallel collimator for medium energy was connected to a computer to distinguish the two isotopes at a time. 4 mCi of 99mTc-(Sn)-pyridoxylideneisoleucine (Tc-PI) and 200 µCi of 111In-diethylenetriaminepentaacetic acid (In-DTPA) were administered by i.v. injection and orally, respectively. Three normal subjects (two women and one man) and 16 patients operated on for gastric cancer (10 reconstructed by the Roux-en-Y method after total gastrectomy, and 6 reconstructed by interposing the jejunum between the esophagus and duodenum) were investigated. The process of bile secretion and its mixing with food was followed quantitatively by scanning. Analysis of the time-activity variation at each organ indicated that the interposition operation gave a more physiological recovery than the Roux-en-Y method. The method is noninvasive to patients and is promising for following the process and activity of digestion in any digestive organ after surgery.

  19. ExSTA: External Standard Addition Method for Accurate High-Throughput Quantitation in Targeted Proteomics Experiments.

    Science.gov (United States)

    Mohammed, Yassene; Pan, Jingxi; Zhang, Suping; Han, Jun; Borchers, Christoph H

    2018-03-01

    Targeted proteomics using MRM with stable-isotope-labeled internal-standard (SIS) peptides is the current method of choice for protein quantitation in complex biological matrices. Better quantitation can be achieved with the internal standard-addition method, where successive increments of synthesized natural form (NAT) of the endogenous analyte are added to each sample, a response curve is generated, and the endogenous concentration is determined at the x-intercept. Internal NAT-addition, however, requires multiple analyses of each sample, resulting in increased sample consumption and analysis time. To compare the following three methods, an MRM assay for 34 high-to-moderate abundance human plasma proteins is used: classical internal SIS-addition, internal NAT-addition, and external NAT-addition-generated in buffer using NAT and SIS peptides. Using endogenous-free chicken plasma, the accuracy is also evaluated. The internal NAT-addition outperforms the other two in precision and accuracy. However, the curves derived by internal vs. external NAT-addition differ by only ≈3.8% in slope, providing comparable accuracies and precision with good CV values. While the internal NAT-addition method may be "ideal", this new external NAT-addition can be used to determine the concentration of high-to-moderate abundance endogenous plasma proteins, providing a robust and cost-effective alternative for clinical analyses or other high-throughput applications. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
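    The NAT-addition response curve described above reduces to a straight-line fit whose x-intercept gives the endogenous concentration. A minimal sketch with invented numbers (not the authors' 34-protein assay): spikes of the synthesized natural-form peptide are fitted against the measured response, and the concentration is the magnitude of the x-intercept.

```python
def endogenous_conc(added, response):
    """Fit response = slope * added + intercept by least squares;
    the endogenous concentration is the magnitude of the
    x-intercept, i.e. intercept / slope."""
    n = len(added)
    mx = sum(added) / n
    my = sum(response) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(added, response))
             / sum((x - mx) ** 2 for x in added))
    intercept = my - slope * mx
    return intercept / slope

# Hypothetical standard-addition series: fmol of NAT peptide spiked
# into a sample truly containing 2.0 fmol, response 0.5 per fmol.
added = [0.0, 1.0, 2.0, 4.0]
response = [0.5 * (2.0 + a) for a in added]
print(endogenous_conc(added, response))  # → 2.0
```

    The external variant (ExSTA) builds this curve once in buffer rather than in every sample, which is what makes it cheaper in sample and instrument time.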

  20. Methods and instrumentation for quantitative microchip capillary electrophoresis

    NARCIS (Netherlands)

    Revermann, T.

    2007-01-01

    The development of novel instrumentation and analytical methodology for quantitative microchip capillary electrophoresis (MCE) is described in this thesis. Demanding only small quantities of reagents and samples, microfluidic instrumentation is highly advantageous. Fast separations at high voltages

  1. Quantitation of global and regional left ventricular function by MRI

    NARCIS (Netherlands)

    van der Geest, RJ; Reiber, JHC; Reiber, JHC; VanDerWall, EE

    1998-01-01

    Magnetic resonance imaging (MRI) provides several imaging strategies for assessing left ventricular function. As a three-dimensional imaging technique, all measurements can be performed without relying on geometrical assumptions. Global and regional function parameters can be derived from

  2. Negative health system effects of Global Fund's investments in AIDS, tuberculosis and malaria from 2002 to 2009: systematic review.

    Science.gov (United States)

    Car, Josip; Paljärvi, Tapio; Car, Mate; Kazeem, Ayodele; Majeed, Azeem; Atun, Rifat

    2012-10-01

    By using the Global Fund as a case example, we aim to critically evaluate the evidence generated from 2002 to 2009 for potential negative health system effects of Global Health Initiatives (GHIs). Design: systematic review of the research literature. Setting: developing countries. Interventions: all interventions potentially affecting health systems that were funded by the Global Fund. Main outcome measures: negative health system effects of Global Fund investments as reported by study authors. We identified 24 studies commenting on adverse effects on health systems arising from Global Fund investments. Sixteen were quantitative studies, six were qualitative and two used both quantitative and qualitative methods, but none explicitly stated that the studies were originally designed to capture or to assess health system effects (positive or negative). Only seemingly anecdotal evidence or authors' perceptions/interpretations of circumstances could be extracted from the included studies. This study shows that much of the currently available evidence generated between 2002 and 2009 on GHIs' potential negative health system effects is not of the quality expected or needed to best serve the academic or broader community. The majority of the reviewed research did not fulfil the requirements of rigorous scientific evidence.

  3. Quantitative film radiography

    International Nuclear Information System (INIS)

    Devine, G.; Dobie, D.; Fugina, J.; Hernandez, J.; Logan, C.; Mohr, P.; Moss, R.; Schumacher, B.; Updike, E.; Weirup, D.

    1991-01-01

    We have developed a system of quantitative radiography in order to produce quantitative images displaying homogeneity of parts. The materials that we characterize are synthetic composites and may contain important subtle density variations not discernible by examining a raw film x-radiograph. In order to quantitatively interpret film radiographs, it is necessary to digitize, interpret, and display the images. Our integrated system of quantitative radiography displays accurate, high-resolution pseudo-color images in units of density. We characterize approximately 10,000 parts per year in hundreds of different configurations and compositions with this system. This report discusses: the method; film processor monitoring and control; verifying film and processor performance; and correction of scatter effects

  4. A method to quantitate regional wall motion in left ventriculography using Hildreth algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Terashima, Mikio [Hyogo Red Cross Blood Center (Japan); Naito, Hiroaki; Sato, Yoshinobu; Tamura, Shinichi; Kurosawa, Tsutomu

    1998-06-01

    Quantitative measurement of ventricular wall motion is indispensable for the objective evaluation of cardiac function in coronary artery disease. We have modified Hildreth's algorithm to estimate excursions of the ventricular wall on left ventricular images yielded by various imaging techniques. Tagging cine-MRI was carried out on 7 healthy volunteers. The original Hildreth method, the modified Hildreth method and the centerline method were applied to the outlines of the images obtained to estimate excursion of the left ventricular wall and regional shortening, and the accuracy of these methods was evaluated against the values actually measured using the attached tags. The accuracy of the original Hildreth method was comparable to that of the centerline method, while the modified Hildreth method was significantly more accurate than the centerline method (P<0.05). Regional shortening as estimated using the modified Hildreth method differed less from the actually measured regional shortening than did the shortening estimated using the centerline method (P<0.05). The modified Hildreth method allowed reasonable estimation of left ventricular wall excursion in all cases where it was applied. These results indicate that when applied to left ventriculograms for ventricular wall motion analysis, the modified Hildreth method is more useful than the original Hildreth method. (author)

  5. Proceedings First Workshop on Quantitative Formal Methods : theory and applications (QFM'09, Eindhoven, The Netherlands, November 3, 2009)

    NARCIS (Netherlands)

    Andova, S.; McIver, A.; D'Argenio, P.R.; Cuijpers, P.J.L.; Markovski, J.; Morgan, C.; Núñez, M.

    2009-01-01

    This volume contains the papers presented at the 1st workshop on Quantitative Formal Methods: Theory and Applications, which was held in Eindhoven on 3 November 2009 as part of the International Symposium on Formal Methods 2009. This volume contains the final versions of all contributions accepted

  6. A systematic study on the influencing parameters and improvement of quantitative analysis of multi-component with single marker method using notoginseng as research subject.

    Science.gov (United States)

    Wang, Chao-Qun; Jia, Xiu-Hong; Zhu, Shu; Komatsu, Katsuko; Wang, Xuan; Cai, Shao-Qing

    2015-03-01

    A new quantitative analysis of multi-component with single marker (QAMS) method for 11 saponins (ginsenosides Rg1, Rb1, Rg2, Rh1, Rf, Re and Rd; notoginsenosides R1, R4, Fa and K) in notoginseng was established, in which 6 of these saponins were individually used as internal reference substances to investigate the influences of chemical structure, concentrations of the quantified components, and purities of the standard substances on the accuracy of the QAMS method. The results showed that the concentration of the analyte in the sample solution was the major influencing parameter, whereas the other parameters had minimal influence on the accuracy of the QAMS method. A new method for calculating the relative correction factors by linear regression was established (the linear regression method), which reduced the deviations of the QAMS method from the external standard method from 1.20%±0.02%-23.29%±3.23% to 0.10%±0.09%-8.84%±2.85% in comparison with the previous method. The differences between the external standard method and the QAMS method using relative correction factors calculated by the linear regression method were below 5% in the quantitative determination of Rg1, Re, R1, Rd and Fa in 24 notoginseng samples and of Rb1 in 21 notoginseng samples, and mostly below 10% for Rf, Rg2, R4 and N-K (the differences for these 4 constituents were bigger because their contents were lower) in all 24 notoginseng samples. The results indicated that the contents assayed by the new QAMS method could be considered as accurate as those assayed by the external standard method. In addition, a method for determining the applicable concentration ranges of the components quantified by the QAMS method was established for the first time, which could ensure its high accuracy and could be applied to QAMS methods of other TCMs. The present study demonstrated the practicability of the application of the QAMS method for the quantitative analysis of multi
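    The core QAMS computation can be sketched as follows: one marker with a known concentration supplies the response factor, and every other component is quantified through its relative correction factor. All names, peak areas and factors below are hypothetical, and the direction in which the correction factor is defined (marker over component, or the reverse) varies between papers; this sketch assumes content_i = A_i / (f_i · (A_s / C_s)).

```python
def qams_contents(areas, f_rel, marker_area, marker_conc):
    """Quantitative analysis of multi-components with a single
    marker: content_i = A_i / (f_i * (A_s / C_s)), where f_i is
    the relative correction factor of component i vs. marker s."""
    k_s = marker_area / marker_conc  # response factor of the marker
    return {name: a / (f_rel[name] * k_s) for name, a in areas.items()}

# Hypothetical peak areas and relative correction factors, with
# ginsenoside Rg1 as the single marker (area 1000, conc 50 units).
areas = {"Rb1": 1200.0, "Re": 300.0}
f_rel = {"Rb1": 0.8, "Re": 1.2}
contents = qams_contents(areas, f_rel, marker_area=1000.0, marker_conc=50.0)
# Rb1: 1200 / (0.8 * 20) = 75.0; Re: 300 / (1.2 * 20) = 12.5
```

    The study's contribution is fitting the f_i values by linear regression over multiple concentration levels instead of computing them from a single point, which is what shrinks the deviation from the external standard method.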

  7. Can't Count or Won't Count? Embedding Quantitative Methods in Substantive Sociology Curricula: A Quasi-Experiment.

    Science.gov (United States)

    Williams, Malcolm; Sloan, Luke; Cheung, Sin Yi; Sutton, Carole; Stevens, Sebastian; Runham, Libby

    2016-06-01

    This paper reports on a quasi-experiment in which quantitative methods (QM) are embedded within a substantive sociology module. Through measuring student attitudes before and after the intervention alongside control group comparisons, we illustrate the impact that embedding has on the student experience. Our findings are complex and even contradictory. Whilst the experimental group were less likely to be distrustful of statistics and appreciate how QM inform social research, they were also less confident about their statistical abilities, suggesting that through 'doing' quantitative sociology the experimental group are exposed to the intricacies of method and their optimism about their own abilities is challenged. We conclude that embedding QM in a single substantive module is not a 'magic bullet' and that a wider programme of content and assessment diversification across the curriculum is preferential.

  8. Keyframes Global Map Establishing Method for Robot Localization through Content-Based Image Matching

    Directory of Open Access Journals (Sweden)

    Tianyang Cao

    2017-01-01

    Full Text Available Self-localization and mapping are important for indoor mobile robots. We report a robust algorithm for map building and subsequent localization especially suited for indoor floor-cleaning robots. Common methods such as SLAM can easily be defeated by kidnapping through collisions or confused by similar-looking objects. Therefore, a keyframe-based global-map method for robot localization in multiple rooms and corridors is needed. Content-based image matching is the core of this method. It is designed for this situation, establishing keyframes that contain both floor and distorted wall images. Image distortion, caused by the robot's view angle and movement, is analyzed and deduced. An image matching solution is presented, consisting of extraction of the overlap regions of keyframes and rebuilding of the overlap region through subblock matching. To improve accuracy, ceiling-point detection and mismatched-subblock checking are incorporated. This matching method can process environment video effectively. In experiments, less than 5% of frames are extracted as keyframes to build the global map; these are widely spaced and overlap one another. Through this method, the robot can localize itself by matching its real-time vision frames against the keyframe map. Even with many similar objects or backgrounds in the environment, or when the robot is kidnapped, localization is achieved with a position RMSE <0.5 m.

  9. Comparative Evaluation of Four Real-Time PCR Methods for the Quantitative Detection of Epstein-Barr Virus from Whole Blood Specimens.

    Science.gov (United States)

    Buelow, Daelynn; Sun, Yilun; Tang, Li; Gu, Zhengming; Pounds, Stanley; Hayden, Randall

    2016-07-01

    Monitoring of Epstein-Barr virus (EBV) load in immunocompromised patients has become integral to their care. An increasing number of reagents are available for quantitative detection of EBV; however, little comparative data have been published. Four real-time PCR systems (one using laboratory-developed reagents and three using analyte-specific reagents) were compared with one another for detection of EBV from whole blood. Whole blood specimens seeded with EBV were used to determine quantitative linearity, analytical measurement range, lower limit of detection, and CV for each assay. Retrospective testing of 198 clinical samples was performed in parallel with all methods; results were compared to determine relative quantitative and qualitative performance. All assays showed similar performance. No significant difference was found in limit of detection (3.12-3.49 log10 copies/mL; P = 0.37). A strong qualitative correlation across assays was seen with clinical samples (positive detection rates of 89.5%-95.8%). Quantitative correlation of clinical samples across assays was also seen in pairwise regression analysis, with R(2) ranging from 0.83 to 0.95. Normalizing clinical sample results to IU/mL did not alter the quantitative correlation between assays. Quantitative EBV detection by real-time PCR can be performed over a wide linear dynamic range, using three different commercially available reagents and laboratory-developed methods. EBV was detected with comparable sensitivity and quantitative correlation for all assays. Copyright © 2016 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.
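The pairwise regression comparability described above can be sketched with a small helper that computes the coefficient of determination between two assays' log10 viral loads. This is a generic illustration; the sample values below are hypothetical, not from the study.

```python
def r_squared(x, y):
    """Coefficient of determination for a simple linear regression of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot

# Hypothetical EBV loads (log10 copies/mL) from two assays on the same samples
assay_a = [2.1, 3.0, 3.9, 4.8, 5.7]
assay_b = [2.3, 2.9, 4.1, 4.7, 5.9]
print(round(r_squared(assay_a, assay_b), 3))
```

Running all 198 clinical samples through such a fit, pairwise for each assay combination, is what yields the R(2) range reported above.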

  10. Quantitative model for the generic 3D shape of ICMEs at 1 AU

    Science.gov (United States)

    Démoulin, P.; Janvier, M.; Masías-Meza, J. J.; Dasso, S.

    2016-10-01

    Context. Interplanetary imagers provide 2D projected views of the densest plasma parts of interplanetary coronal mass ejections (ICMEs), while in situ measurements provide magnetic field and plasma parameter measurements along the spacecraft trajectory, that is, along a 1D cut. The data therefore only give a partial view of the 3D structures of ICMEs. Aims: By studying a large number of ICMEs, crossed at different distances from their apex, we develop statistical methods to obtain a quantitative generic 3D shape of ICMEs. Methods: In a first approach we theoretically obtained the expected statistical distribution of the shock-normal orientation from assuming simple models of 3D shock shapes, including distorted profiles, and compared their compatibility with observed distributions. In a second approach we used the shock normal and the flux rope axis orientations together with the impact parameter to provide statistical information across the spacecraft trajectory. Results: The study of different 3D shock models shows that the observations are compatible with a shock that is symmetric around the Sun-apex line as well as with an asymmetry up to an aspect ratio of around 3. Moreover, flat or dipped shock surfaces near their apex can only be rare cases. Next, the sheath thickness and the ICME velocity have no global trend along the ICME front. Finally, regrouping all these new results and those of our previous articles, we provide a quantitative ICME generic 3D shape, including the global shape of the shock, the sheath, and the flux rope. Conclusions: The obtained quantitative generic ICME shape will have implications for several aims. For example, it constrains the output of typical ICME numerical simulations. It is also a base for studying the transport of high-energy solar and cosmic particles during an ICME propagation as well as for modeling and forecasting space weather conditions near Earth.

  11. Investigation of a dual modal method for bone pathologies using quantitative ultrasound and photoacoustics

    Science.gov (United States)

    Steinberg, Idan; Gannot, Israel; Eyal, Avishay

    2015-03-01

    Osteoporosis is a widespread disease that has a catastrophic impact on patients' lives and overwhelming related healthcare costs. In recent works, we have developed a multi-spectral, frequency domain photoacoustic method for the evaluation of bone pathologies. This method has great advantages over pure ultrasonic or optical methods, as it provides both molecular information from the bone absorption spectrum and bone mechanical status from the characteristics of the ultrasound propagation. These characteristics include both the speed of sound (SOS) and broadband ultrasonic attenuation (BUA). To test the method's quantitative predictions, we have constructed a combined ultrasound and photoacoustic setup. Here, we present a dual-modality system and experimentally compare the methods on bone samples in vitro. The differences between the two modalities are shown to provide valuable insight into bone structure and functional status.
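The two ultrasonic characteristics mentioned, SOS and BUA, can be estimated from a time-of-flight measurement and a linear fit to the attenuation spectrum. A minimal sketch follows; the units, function names, and example numbers are illustrative assumptions, not the authors' code.

```python
def speed_of_sound(thickness_mm, transit_time_us):
    """SOS in m/s from sample thickness (mm) and ultrasonic transit time (us)."""
    return thickness_mm / transit_time_us * 1000.0

def broadband_ultrasonic_attenuation(freqs_mhz, attenuation_db):
    """BUA (dB/MHz): slope of a least-squares line fitted to the measured
    attenuation spectrum over the analysis band."""
    n = len(freqs_mhz)
    mf = sum(freqs_mhz) / n
    ma = sum(attenuation_db) / n
    num = sum((f - mf) * (a - ma) for f, a in zip(freqs_mhz, attenuation_db))
    den = sum((f - mf) ** 2 for f in freqs_mhz)
    return num / den

# Hypothetical measurement: 40 mm sample, 25 us transit time -> 1600 m/s
print(speed_of_sound(40.0, 25.0))
```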

  12. Comparison of fundamental analytical methods for quantitative determination of copper(II) ion

    Directory of Open Access Journals (Sweden)

    Ačanski Marijana M.

    2008-01-01

    Full Text Available Copper is a ductile metal with excellent electrical conductivity, and finds extensive use as an electrical conductor, heat conductor, building material, and component of various alloys. In this work the accuracy of methods for the quantitative determination of copper(II) ion (gravimetric and titrimetric methods of analysis) was studied. Gravimetric methods do not require a calibration or standardization step (as all other analytical procedures except coulometry do) because the results are calculated directly from the experimental data and molar masses. Thus, when only one or two samples are to be analyzed, a gravimetric procedure may be the method of choice because it involves less time and effort than a procedure that requires preparation of standards and calibration. In the gravimetric analysis performed here, the concentration of copper(II) ion is established through the measurement of the mass of CuSCN and CuO. Titrimetric analysis is a process in which a standard reagent is added to a solution of an analyte until the reaction between the analyte and reagent is judged to be complete. In the titrimetric analysis performed here, the concentration of copper(II) ion is established through the measurement of the volume of different standard reagents: Km, Na2S2O3 and AgNO3. Results were discussed individually and comparatively with respect to accuracy, reproducibility and rapidity. The relative error was calculated for all methods.
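The gravimetric calculation described, where the result follows directly from the weighed precipitate mass and molar masses, can be sketched as follows. The molar masses are standard values; the function names are ours.

```python
# Standard atomic molar masses (g/mol)
M_CU, M_O, M_S, M_C, M_N = 63.546, 15.999, 32.06, 12.011, 14.007

def copper_from_cuo(mass_cuo_g):
    """Mass of Cu (g) contained in a weighed CuO precipitate
    (gravimetric factor M_Cu / M_CuO)."""
    return mass_cuo_g * M_CU / (M_CU + M_O)

def copper_from_cuscn(mass_cuscn_g):
    """Mass of Cu (g) contained in a weighed CuSCN precipitate
    (gravimetric factor M_Cu / M_CuSCN)."""
    return mass_cuscn_g * M_CU / (M_CU + M_S + M_C + M_N)
```

For example, each gram of CuO precipitate corresponds to roughly 0.799 g of copper, and each gram of CuSCN to roughly 0.522 g.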

  13. A quantitative method to evaluate mesenchymal stem cell lipofection using real-time PCR.

    Science.gov (United States)

    Ribeiro, S C; Mendes, R; Madeira, C; Monteiro, G A; da Silva, C L; Cabral, J M S

    2010-01-01

    Genetic modification of human mesenchymal stem cells (MSC) is a powerful tool to improve the therapeutic utility of these cells and to increase knowledge of their regulation mechanisms. In this context, strong efforts have been made recently to develop efficient nonviral gene delivery systems. Although several studies have addressed this question, most of them use the end product of a reporter gene rather than quantification of DNA uptake to test transfection efficiency. In this study, we established a method based on quantitative real-time PCR (RT-PCR) to determine the intracellular plasmid DNA copy number in human MSC after lipofection. The procedure requires neither specific cell lysis nor DNA purification. The influence of cell number on the RT-PCR sensitivity was evaluated. The method showed good reproducibility, high sensitivity, and a wide linear range of 75-2.5 x 10⁶ plasmid DNA copies per cell. RT-PCR results were then compared with the percentage of transfected cells assessed by flow cytometry, which showed that flow cytometry-based results are not always proportional to the plasmid cellular uptake determined by RT-PCR. This work contributed to the establishment of a rapid quantitative assay to determine intracellular plasmid DNA in stem cells, which will be extremely beneficial for the optimization of gene delivery strategies. © 2010 American Institute of Chemical Engineers
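Intracellular copy numbers of this kind are typically read off a qPCR standard curve relating Ct to log10 copies. A generic sketch of that inversion; the slope and intercept values are hypothetical defaults, not the study's calibration.

```python
def copies_from_ct(ct, slope=-3.32, intercept=38.0):
    """Invert a qPCR standard curve Ct = slope * log10(copies) + intercept.
    A slope of -3.32 corresponds to 100% amplification efficiency."""
    return 10.0 ** ((ct - intercept) / slope)

def plasmid_copies_per_cell(ct, n_cells, slope=-3.32, intercept=38.0):
    """Average intracellular plasmid copy number per lysed cell."""
    return copies_from_ct(ct, slope, intercept) / n_cells
```

With these hypothetical parameters, a Ct of 24.72 corresponds to about 10^4 plasmid copies in the lysate; dividing by the cell count yields copies per cell.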

  14. Comparison of several methods to calculate sunshine hours from global radiation; Vergelijking van diverse methodes voor de berekening van zonneschijnduur uit globale straling

    Energy Technology Data Exchange (ETDEWEB)

    Schipper, J.

    2004-07-01

    non-scientific use of sunshine hour data. As the sensor can also be used to make accurate solar radiation measurements, scientific data can be collected at the same time. Further development resulted in an alteration of this algorithm in 1993 (Algorithm Bergman). In this study the results for 'sunshine duration' calculated by both algorithms will be compared to measurements from the Campbell-Stokes. The results of the comparison will be discussed and concluding remarks will be given. [Translated from Dutch] KNMI has stored hourly sunshine-duration values since 1899 in its climatological database. These values form the basis for numerous climatological products. On 1 January 1993, all KNMI stations switched to a different method for determining sunshine duration. Since that date the old and trusted Campbell-Stokes (burn-trace) method is no longer used; instead, an indirect method based on global radiation was adopted. The reason for the change was new WMO guidelines: in September 1989 the WMO withdrew the Campbell-Stokes as the standard instrument for determining sunshine duration because of its large measurement uncertainty and climate sensitivity. KNMI therefore opted for a more indirect method. To this end, Slob developed a method in which sunshine duration is derived from measurements of global radiation, in particular from the 10-minute registrations of the mean, maximum and minimum global radiation values, the so-called 'Slob algorithm'. From this information the algorithm determines the sunshine duration per 10-minute interval. The 10-minute values form the basis for calculating the hourly and daily sunshine-duration totals. The method was introduced operationally in the period 1991-1993. The new data-collection techniques made it possible, using a higher time resolution (10-minute basis), to achieve an accurate
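A heavily simplified sketch of the idea behind deriving sunshine duration from global radiation: an interval counts as sunny when measured radiation is high relative to the clear-sky value. The operational Slob algorithm distinguishes several cases using the 10-minute mean, maximum and minimum radiation and the solar elevation; the single threshold below is illustrative only.

```python
def sunshine_minutes(g_max, g_clear_sky, interval_min=10, threshold=0.4):
    """Toy derivation of sunshine duration for one 10-minute interval:
    the interval counts as fully sunny when the maximum global radiation
    exceeds a fixed fraction of the clear-sky value, otherwise not at all.
    (Illustrative threshold; not the operational Slob algorithm.)"""
    if g_clear_sky <= 0:
        return 0.0
    return float(interval_min) if g_max / g_clear_sky > threshold else 0.0
```

Summing such per-interval estimates over an hour or a day gives the hourly and daily totals the record describes.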

  15. Two quantitative methods for assessment of [Tc-99m]-MAA in the lung in the treatment of liver metastases: a case study

    International Nuclear Information System (INIS)

    Willowson, Kathy P.; Bailey, Dale L.; Baldock, Clive

    2009-01-01

    Full text: Objective: The use of Y-90 microspheres to treat metastatic liver cancer is becoming widely utilized. Despite the fact that the microspheres are delivered directly to the liver, some activity may bypass the liver capillaries and be shunted to the lungs. To evaluate the percentage of pulmonary breakthrough, a pre-therapy test is performed using Tc-99m labeled spheres. The aim of this project was to compare two quantitative methods for assessing lung uptake, and consider the possibility of organ-specific quantification. Method: A previously validated method for achieving CT-based quantitative SPECT was compared to a simple planar approach. A 44 year old man suffering from metastatic liver sarcoma was referred to the clinic for pre-therapy evaluation. After injection of Tc-99m labeled microspheres and routine imaging, a SPECT/CT was acquired and specific organ uptake values calculated. A further calibrated injection of [Tc-99m]-MAA was then given as a simplified alternative to quantify lung uptake by comparing pre and post counts. Results: The quantitative SPECT/CT method correctly accounted for all injected activity and found 80% of the dose was retained in the liver and 4% in the lungs. The planar method found ~4% of the dose in the lungs. Conclusion: The quantitative technique we have developed allows for accurate calculation of organ-specific uptake, which has important implications for treatment. The additional MAA injection offers a simplified but accurate method to quantify lung uptake. 1. K.P. Willowson, D.L. Bailey and C. Baldock (2008) Quantitative SPECT reconstructions using CT-derived corrections. Phys Med Biol 53:3099-3112.
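The planar lung-uptake estimate amounts to a simple count ratio; a sketch of the lung shunt fraction commonly computed for [Tc-99m]-MAA studies (variable names and example counts are ours, not from the case report):

```python
def lung_shunt_fraction(lung_counts, liver_counts):
    """Fraction of activity shunted to the lungs, from background-corrected
    planar counts over the lung and liver regions of interest."""
    return lung_counts / (lung_counts + liver_counts)

# Hypothetical counts: 4,000 over the lungs, 96,000 over the liver -> 0.04
print(lung_shunt_fraction(4000.0, 96000.0))
```

A shunt fraction of 0.04 corresponds to the ~4% pulmonary breakthrough reported by both methods above.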

  16. Validated ¹H and ¹³C Nuclear Magnetic Resonance Methods for the Quantitative Determination of Glycerol in Drug Injections.

    Science.gov (United States)

    Lu, Jiaxi; Wang, Pengli; Wang, Qiuying; Wang, Yanan; Jiang, Miaomiao

    2018-05-15

    In the current study, we employed high-resolution proton and carbon nuclear magnetic resonance spectroscopy (¹H and ¹³C NMR) for quantitative analysis of glycerol in drug injections without any complex pre-treatment or derivatization of samples. The established methods were validated with good specificity, linearity, accuracy, precision, stability, and repeatability. Our results revealed that the content of glycerol could be calculated conveniently and directly via the integration ratios of peak areas with an internal standard in ¹H NMR spectra, while integration of peak heights was appropriate for ¹³C NMR in combination with an external calibration of glycerol. The developed methods were both successfully applied to drug injections. Quantitative NMR methods show an extensive prospect for glycerol determination in various liquid samples.
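Internal-standard ¹H qNMR of the kind described is conventionally computed as m_a = (I_a/I_s)(N_s/N_a)(M_a/M_s)m_s, where I are integrated peak areas, N proton counts, M molar masses, and m_s the weighed standard mass. A sketch of that standard formula (our function, not the authors' code):

```python
def qnmr_mass(area_analyte, area_std, protons_analyte, protons_std,
              molar_mass_analyte, molar_mass_std, mass_std):
    """Analyte mass from internal-standard 1H qNMR: the peak-area ratio,
    normalized by the number of protons each peak represents and by the
    molar masses, times the weighed mass of the internal standard."""
    return (area_analyte / area_std) * (protons_std / protons_analyte) \
        * (molar_mass_analyte / molar_mass_std) * mass_std
```

With identical areas, proton counts, and molar masses, the analyte mass equals the standard mass, which is a quick sanity check on the implementation.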

  17. Verification of practicability of quantitative reliability evaluation method (De-BDA) in nuclear power plants

    International Nuclear Information System (INIS)

    Takahashi, Kinshiro; Yukimachi, Takeo.

    1988-01-01

    A variety of methods have been applied to the study of reliability analysis, with human factors included, in order to enhance the safety and availability of nuclear power plants. De-BDA (Detailed Block Diagram Analysis) is one such method, developed with the objective of creating a more comprehensive and understandable tool for quantitative analysis of reliability associated with plant operations. The practicability of this method has been verified by applying it to reliability analysis of various phases of plant operation, as well as to evaluation of an enhanced man-machine interface in the central control room. (author)

  18. A New Quantitative Method for the Non-Invasive Documentation of Morphological Damage in Paintings Using RTI Surface Normals

    Directory of Open Access Journals (Sweden)

    Marcello Manfredi

    2014-07-01

    Full Text Available In this paper we propose a reliable surface imaging method for the non-invasive detection of morphological changes in paintings. Usually, the evaluation and quantification of changes and defects results mostly from an optical and subjective assessment, through the comparison of the previous and subsequent state of conservation and by means of condition reports. Using quantitative Reflectance Transformation Imaging (RTI we obtain detailed information on the geometry and morphology of the painting surface with a fast, precise and non-invasive method. Accurate and quantitative measurements of deterioration were acquired after the painting experienced artificial damage. Morphological changes were documented using normal vector images while the intensity map succeeded in highlighting, quantifying and describing the physical changes. We estimate that the technique can detect a morphological damage slightly smaller than 0.3 mm, which would be difficult to detect with the eye, considering the painting size. This non-invasive tool could be very useful, for example, to examine paintings and artwork before they travel on loan or during a restoration. The method lends itself to automated analysis of large images and datasets. Quantitative RTI thus eases the transition of extending human vision into the realm of measuring change over time.
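Morphological change between two RTI acquisitions can be quantified per pixel as the angle between the surface normals recovered before and after. A minimal sketch of that comparison (our illustration, not the authors' implementation):

```python
import math

def normal_change_deg(n_before, n_after):
    """Angle (degrees) between two unit surface normals at one pixel;
    larger angles indicate stronger local morphological change."""
    dot = sum(a * b for a, b in zip(n_before, n_after))
    dot = max(-1.0, min(1.0, dot))  # guard against rounding outside [-1, 1]
    return math.degrees(math.acos(dot))
```

Mapping this angle over the whole painting surface produces an intensity map of damage like the one described above; unchanged regions score near zero.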

  19. Consumer behavior changing: methods of evaluation

    Directory of Open Access Journals (Sweden)

    Elīna Gaile-Sarkane

    2013-11-01

    Full Text Available The article is devoted to methods of analyzing consumer buying behavior and to evaluation of the most important factors that influence consumer behavior. This research investigates the changes in consumer behavior caused by globalization and the development of information technologies; it helps to identify the specific factors that should be taken into account in the evaluation of consumer behavior. The authors employ well-established quantitative and qualitative methods of research: grouping, analysis, synthesis, the expert method, statistical methods, etc. The research findings disclosed that it is possible to introduce new methods for the evaluation of changing consumer behavior.

  20. Assessment of Nutritional Status of Nepalese Hemodialysis Patients by Anthropometric Examinations and Modified Quantitative Subjective Global Assessment

    Directory of Open Access Journals (Sweden)

    Arun Sedhain

    2015-01-01

    Full Text Available Objective: To assess the nutritional status of patients on maintenance hemodialysis by using modified quantitative subjective global assessment (MQSGA) and anthropometric measurements. Method: We conducted a cross-sectional descriptive analytical study to assess the nutritional status of fifty-four patients with chronic kidney disease undergoing maintenance hemodialysis, using MQSGA and different anthropometric and laboratory measurements such as body mass index (BMI), mid-arm circumference (MAC), mid-arm muscle circumference (MAMC), triceps skin fold (TSF) and biceps skin fold (BSF), serum albumin, C-reactive protein (CRP) and lipid profile, in a government tertiary hospital at Kathmandu, Nepal. Results: Based on MQSGA criteria, 66.7% of the patients suffered from mild to moderate malnutrition and 33.3% were well nourished. None of the patients were severely malnourished. CRP was positive in 56.3% of patients. Serum albumin, MAC and BMI were (mean ± SD) 4.0 ± 0.3 mg/dl, 22 ± 2.6 cm and 19.6 ± 3.2 kg/m² respectively. MQSGA showed negative correlation with MAC (r = −0.563; P < 0.001), BMI (r = −0.448; P < 0.001), MAMC (r = −0.506; P < 0.0001), TSF (r = −0.483; P < 0.0002), and BSF (r = −0.508; P < 0.0001). Negative correlation of MQSGA was also found with total cholesterol, triglyceride, LDL cholesterol and HDL cholesterol, without statistical significance. Conclusion: Mild to moderate malnutrition was present in two thirds of the patients undergoing hemodialysis. Anthropometric measurements such as BMI, MAC, MAMC, BSF and TSF were negatively correlated with MQSGA. Anthropometric and laboratory assessment tools could be used for nutritional assessment as they are relatively easy, cheap and practical markers of nutritional status.
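MAMC in studies like this is conventionally derived from MAC and TSF rather than measured directly; a sketch of that standard anthropometric formula (units as indicated in the comments):

```python
import math

def mamc_cm(mac_cm, tsf_mm):
    """Mid-arm muscle circumference (cm) from mid-arm circumference (cm)
    and triceps skinfold (mm): MAMC = MAC - pi * TSF."""
    return mac_cm - math.pi * (tsf_mm / 10.0)
```

For example, the cohort's mean MAC of 22 cm with a 10 mm triceps skinfold would give a MAMC of roughly 18.9 cm.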

  1. A Proposal on the Quantitative Homogeneity Analysis Method of SEM Images for Material Inspections

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Song Hyun; Kim, Jong Woo; Shin, Chang Ho [Hanyang University, Seoul (Korea, Republic of); Choi, Jung-Hoon; Cho, In-Hak; Park, Hwan Seo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    A scanning electron microscope (SEM) is an instrument for inspecting the surface microstructure of materials. The SEM uses electron beams to image material surfaces at high magnification; various chemical analyses can therefore be performed from the SEM images. As such, it is widely used for material inspection, chemical characterization, and biological analysis. In the field of nuclear criticality analysis, the homogeneity of a compound material is an important parameter to check before using the material in a nuclear system. In our previous study, we attempted to use the SEM for homogeneity analysis of materials. In this study, a quantitative homogeneity analysis method for SEM images is proposed for material inspections. The method is based on stochastic analysis of the grayscale information in the SEM images.
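A toy version of a grayscale-based homogeneity index (our construction, not the authors' stochastic method): tile the image into blocks and take the coefficient of variation of the block-mean grayscale, so that lower values indicate a more homogeneous surface.

```python
def homogeneity_cv(image, block):
    """Toy homogeneity index for a grayscale image given as a list of rows:
    the coefficient of variation of mean grayscale over non-overlapping
    block x block tiles. 0.0 means perfectly homogeneous."""
    means = []
    for i in range(0, len(image) - block + 1, block):
        for j in range(0, len(image[0]) - block + 1, block):
            vals = [image[i + di][j + dj]
                    for di in range(block) for dj in range(block)]
            means.append(sum(vals) / len(vals))
    mu = sum(means) / len(means)
    var = sum((m - mu) ** 2 for m in means) / len(means)
    return (var ** 0.5) / mu
```

A uniform image scores exactly 0.0, while an image split into dark and bright regions scores high, which matches the intuition the abstract appeals to.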

  2. A Proposal on the Quantitative Homogeneity Analysis Method of SEM Images for Material Inspections

    International Nuclear Information System (INIS)

    Kim, Song Hyun; Kim, Jong Woo; Shin, Chang Ho; Choi, Jung-Hoon; Cho, In-Hak; Park, Hwan Seo

    2015-01-01

    A scanning electron microscope (SEM) is an instrument for inspecting the surface microstructure of materials. The SEM uses electron beams to image material surfaces at high magnification; various chemical analyses can therefore be performed from the SEM images. As such, it is widely used for material inspection, chemical characterization, and biological analysis. In the field of nuclear criticality analysis, the homogeneity of a compound material is an important parameter to check before using the material in a nuclear system. In our previous study, we attempted to use the SEM for homogeneity analysis of materials. In this study, a quantitative homogeneity analysis method for SEM images is proposed for material inspections. The method is based on stochastic analysis of the grayscale information in the SEM images.

  3. Quantitative method to assess caries via fluorescence imaging from the perspective of autofluorescence spectral analysis

    International Nuclear Information System (INIS)

    Chen, Q G; Xu, Y; Zhu, H H; Chen, H; Lin, B

    2015-01-01

    A quantitative method to discriminate caries lesions for a fluorescence imaging system is proposed in this paper. The autofluorescence spectral investigation of 39 teeth samples classified by the International Caries Detection and Assessment System levels was performed at 405 nm excitation. The major differences in the different caries lesions focused on the relative spectral intensity range of 565–750 nm. The spectral parameter, defined as the ratio of wavebands at 565–750 nm to the whole spectral range, was calculated. The image component ratio R/(G + B) of color components was statistically computed by considering the spectral parameters (e.g. autofluorescence, optical filter, and spectral sensitivity) in our fluorescence color imaging system. Results showed that the spectral parameter and image component ratio presented a linear relation. Therefore, the image component ratio was graded as <0.66, 0.66–1.06, 1.06–1.62, and >1.62 to quantitatively classify sound, early decay, established decay, and severe decay tissues, respectively. Finally, the fluorescence images of caries were experimentally obtained, and the corresponding image component ratio distribution was compared with the classification result. A method to determine the numerical grades of caries using a fluorescence imaging system was proposed. This method can be applied to similar imaging systems. (paper)
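The grading rule above maps the image component ratio R/(G+B) onto the four tissue classes via the reported cut-offs (0.66, 1.06, 1.62); a direct sketch:

```python
def caries_grade(r, g, b):
    """Grade caries tissue from the image component ratio R/(G+B),
    using the cut-offs reported in the abstract."""
    ratio = r / (g + b)
    if ratio < 0.66:
        return "sound"
    if ratio < 1.06:
        return "early decay"
    if ratio < 1.62:
        return "established decay"
    return "severe decay"
```

Applied pixel-wise (or region-wise) to a fluorescence color image, this yields the quantitative classification map the paper compares against ICDAS levels.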

  4. Mode decomposition methods for flows in high-contrast porous media. Global-local approach

    KAUST Repository

    Ghommem, Mehdi; Presho, Michael; Calo, Victor M.; Efendiev, Yalchin R.

    2013-01-01

    In this paper, we combine concepts of the generalized multiscale finite element method (GMsFEM) and mode decomposition methods to construct a robust global-local approach for model reduction of flows in high-contrast porous media. This is achieved by implementing Proper Orthogonal Decomposition (POD) and Dynamic Mode Decomposition (DMD) techniques on a coarse grid computed using GMsFEM. The resulting reduced-order approach enables a significant reduction in the flow problem size while accurately capturing the behavior of fully-resolved solutions. We consider a variety of high-contrast coefficients and present the corresponding numerical results to illustrate the effectiveness of the proposed technique. This paper is a continuation of our work presented in Ghommem et al. (2013) [1] where we examine the applicability of POD and DMD to derive simplified and reliable representations of flows in high-contrast porous media on fully resolved models. In the current paper, we discuss how these global model reduction approaches can be combined with local techniques to speed-up the simulations. The speed-up is due to inexpensive, while sufficiently accurate, computations of global snapshots. © 2013 Elsevier Inc.
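The POD step can be illustrated at miniature scale with the method of snapshots: the eigenvalues of the snapshot correlation matrix give the energy captured by each mode. A two-snapshot sketch of that idea (real GMsFEM/POD pipelines use many snapshots and an SVD; this hand-rolled 2x2 eigensolve is purely illustrative):

```python
import math

def pod_energies(snapshots):
    """Method-of-snapshots POD for exactly two snapshot vectors: return the
    eigenvalues of the 2x2 correlation matrix C[i][j] = <x_i, x_j>, i.e. the
    energy captured by each POD mode, largest first."""
    x1, x2 = snapshots
    a = sum(u * u for u in x1)          # <x1, x1>
    b = sum(u * v for u, v in zip(x1, x2))  # <x1, x2>
    d = sum(v * v for v in x2)          # <x2, x2>
    tr, det = a + d, a * d - b * b
    disc = math.sqrt(tr * tr - 4.0 * det)
    return [(tr + disc) / 2.0, (tr - disc) / 2.0]
```

Two identical snapshots concentrate all energy in a single mode (one zero eigenvalue), which is exactly when model reduction pays off; orthogonal snapshots split the energy evenly and are incompressible.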

  5. Mode decomposition methods for flows in high-contrast porous media. Global-local approach

    KAUST Repository

    Ghommem, Mehdi

    2013-11-01

    In this paper, we combine concepts of the generalized multiscale finite element method (GMsFEM) and mode decomposition methods to construct a robust global-local approach for model reduction of flows in high-contrast porous media. This is achieved by implementing Proper Orthogonal Decomposition (POD) and Dynamic Mode Decomposition (DMD) techniques on a coarse grid computed using GMsFEM. The resulting reduced-order approach enables a significant reduction in the flow problem size while accurately capturing the behavior of fully-resolved solutions. We consider a variety of high-contrast coefficients and present the corresponding numerical results to illustrate the effectiveness of the proposed technique. This paper is a continuation of our work presented in Ghommem et al. (2013) [1] where we examine the applicability of POD and DMD to derive simplified and reliable representations of flows in high-contrast porous media on fully resolved models. In the current paper, we discuss how these global model reduction approaches can be combined with local techniques to speed-up the simulations. The speed-up is due to inexpensive, while sufficiently accurate, computations of global snapshots. © 2013 Elsevier Inc.

  6. Quantitative graph theory mathematical foundations and applications

    CERN Document Server

    Dehmer, Matthias

    2014-01-01

    The first book devoted exclusively to quantitative graph theory, Quantitative Graph Theory: Mathematical Foundations and Applications presents and demonstrates existing and novel methods for analyzing graphs quantitatively. Incorporating interdisciplinary knowledge from graph theory, information theory, measurement theory, and statistical techniques, this book covers a wide range of quantitative-graph theoretical concepts and methods, including those pertaining to real and random graphs, such as: comparative approaches (graph similarity or distance); graph measures to characterize graphs quantitat

  7. A new method for analysing socio-ecological patterns of vulnerability

    OpenAIRE

    Kok, M.; Lüdeke, M.; Lucas, P.; Sterzel, T.; Walther, C.; Janssen, P.; Sietz, D.; de Soysa, I.

    2016-01-01

    This paper presents a method for the analysis of socio-ecological patterns of vulnerability of people being at risk of losing their livelihoods as a consequence of global environmental change. This method fills a gap in methodologies for vulnerability analysis by providing generalizations of the factors that shape vulnerability in specific socio-ecological systems and showing their spatial occurrence. The proposed method consists of four steps that include both quantitative and qualitative an...

  8. SU-F-J-112: Clinical Feasibility Test of An RF Pulse-Based MRI Method for the Quantitative Fat-Water Segmentation

    Energy Technology Data Exchange (ETDEWEB)

    Yee, S; Wloch, J; Pirkola, M [William Beaumont Hospital, Royal Oak, MI (United States)

    2016-06-15

    Purpose: Quantitative fat-water segmentation is important not only because of the clinical utility of fat-suppressed MRI images in better detecting lesions of clinical significance (in the midst of bright fat signal) but also because of the possible physical need, in which CT-like images based on the materials' photon attenuation properties may have to be generated from MR images; particularly, as in the case of an MR-only radiation oncology environment, to obtain radiation dose calculations, or as in the case of a hybrid PET/MR modality, to obtain an attenuation correction map for quantitative PET reconstruction. The majority of such quantitative fat-water segmentations have been performed using the Dixon method and its variations, which must enforce proper (often predefined) echo time (TE) settings in the pulse sequences. Therefore, such methods cannot be directly combined with ultrashort TE (UTE) sequences that, taking advantage of very low TE values (∼10s of microseconds), might be beneficial for directly detecting bone. Recently, an RF pulse-based method (http://dx.doi.org/10.1016/j.mri.2015.11.006), termed the PROD pulse method, was introduced as a method of quantitative fat-water segmentation that does not depend on predefined TE settings. Here, the clinical feasibility of this method is verified in brain tumor patients by combining the PROD pulse with several sequences. Methods: In a clinical 3T MRI, the PROD pulse was combined with turbo spin echo (e.g. TR=1500, TE=16 or 60, ETL=15) or turbo field echo (e.g. TR=5.6, TE=2.8, ETL=12) sequences without specifying TE values. Results: Fat-water segmentation was possible without having to set specific TE values. Conclusion: The PROD pulse method is clinically feasible. Although not yet combined with UTE sequences in our laboratory, the method is potentially compatible with them, and thus might be useful to directly segment fat, water, bone and air.
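For contrast with the TE-independent PROD approach, the classic two-point Dixon separation that the abstract says depends on TE settings can be sketched per pixel: with one image acquired at an in-phase TE and one at an opposed-phase TE, water and fat signals follow from a sum and a difference.

```python
def dixon_fat_water(in_phase, opposed_phase):
    """Two-point Dixon separation at one pixel (the TE-dependent approach
    the PROD method avoids): water = (IP + OP)/2, fat = (IP - OP)/2.
    Assumes magnitude-consistent, phase-corrected signals."""
    water = (in_phase + opposed_phase) / 2.0
    fat = (in_phase - opposed_phase) / 2.0
    return fat, water
```

The need for those two specific TE values is precisely what makes this scheme incompatible with UTE acquisitions, motivating the RF pulse-based alternative.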

  9. Comparison of reverse transcription-quantitative polymerase chain reaction methods and platforms for single cell gene expression analysis.

    Science.gov (United States)

    Fox, Bridget C; Devonshire, Alison S; Baradez, Marc-Olivier; Marshall, Damian; Foy, Carole A

    2012-08-15

    Single cell gene expression analysis can provide insights into development and disease progression by profiling individual cellular responses as opposed to reporting the global average of a population. Reverse transcription-quantitative polymerase chain reaction (RT-qPCR) is the "gold standard" for the quantification of gene expression levels; however, the technical performance of kits and platforms aimed at single cell analysis has not been fully defined in terms of sensitivity and assay comparability. We compared three kits using purification columns (PicoPure) or direct lysis (CellsDirect and Cells-to-CT) combined with a one- or two-step RT-qPCR approach using dilutions of cells and RNA standards to the single cell level. Single cell-level messenger RNA (mRNA) analysis was possible using all three methods, although the precision, linearity, and effect of lysis buffer and cell background differed depending on the approach used. The impact of using a microfluidic qPCR platform versus a standard instrument was investigated for potential variability introduced by preamplification of template or scaling down of the qPCR to nanoliter volumes using laser-dissected single cell samples. The two approaches were found to be comparable. These studies show that accurate gene expression analysis is achievable at the single cell level and highlight the importance of well-validated experimental procedures for low-level mRNA analysis. Copyright © 2012 Elsevier Inc. All rights reserved.

  10. Quantitation of left ventricular dimensions and function by digital video subtraction angiography

    International Nuclear Information System (INIS)

    Higgins, C.B.; Norris, S.L.; Gerber, K.H.; Slutsky, R.A.; Ashburn, W.L.; Baily, N.

    1982-01-01

    Digital video subtraction angiography (DVSA) after central intravenous administration of contrast media was used in experimental animals and in patients with suspected coronary artery disease to quantitate left ventricular dimensions and regional and global contractile function. In animals, measurements of left ventricular (LV) volumes, wall thickness, ejection fraction, segmental contraction, and cardiac output correlated closely with sonocardiometry or thermodilution measurements. In patients, volumes and ejection fractions calculated from mask mode digital images correlated closely with direct left ventriculography. Global and segmental contractile function was displayed in patients by ejection shell images, stroke volume images, and time interval difference images. Central cardiovascular function was also quantitated by measurement of pulmonary transit time and calculation of pulmonary blood volume from digital fluoroscopic images. DVSA was shown to be useful and accurate in the quantitation of central cardiovascular physiology
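The global contractile measure quantified above, ejection fraction, follows directly from the angiographic volume measurements:

```python
def ejection_fraction(edv_ml, esv_ml):
    """LV ejection fraction from end-diastolic and end-systolic volumes
    (both in mL), as derived from ventriculographic images."""
    return (edv_ml - esv_ml) / edv_ml
```

For instance, hypothetical volumes of EDV = 120 mL and ESV = 50 mL give an ejection fraction of about 0.58.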

  11. A theoretical global optimization method for vapor-compression refrigeration systems based on entransy theory

    International Nuclear Information System (INIS)

    Xu, Yun-Chao; Chen, Qun

    2013-01-01

    Vapor-compression refrigeration systems have been one of the essential energy conversion systems for humankind and nowadays consume huge amounts of energy. Many effective methods exist for optimizing the energy efficiency of these systems, but they rely mainly on engineering experience and computer simulations rather than theoretical analysis, owing to the complex and vague physical essence of the processes involved. We attempt to propose a theoretical global optimization method based on in-depth physical analysis of the involved physical processes, i.e. heat transfer analysis for the condenser and evaporator, through introducing the entransy theory, and thermodynamic analysis for the compressor and expansion valve. The integration of heat transfer and thermodynamic analyses forms the overall physical optimization model for the system, describing the relation between all the unknown parameters and the known conditions, which makes theoretical global optimization possible. With the aid of mathematical conditional-extremum solutions, an optimization equation group and the optimal configuration of all the unknown parameters are obtained analytically. Finally, via the optimization of a typical vapor-compression refrigeration system under various working conditions to minimize the total heat transfer area of the heat exchangers, the validity and superiority of the newly proposed optimization method are demonstrated. - Highlights: • A global optimization method for vapor-compression systems is proposed. • Integrating heat transfer and thermodynamic analyses forms the optimization model. • A mathematical relation between design parameters and requirements is derived. • Entransy dissipation is introduced into heat transfer analysis. • The validity of the method is proved via optimization of practical cases

  12. Development of quantitative analysis method for stereotactic brain image. Assessment of reduced accumulation in extent and severity using anatomical segmentation

    International Nuclear Information System (INIS)

    Mizumura, Sunao; Kumita, Shin-ichiro; Cho, Keiichi; Ishihara, Makiko; Nakajo, Hidenobu; Toba, Masahiro; Kumazaki, Tatsuo

    2003-01-01

    With visual assessment by three-dimensional (3D) brain image analysis methods using a stereotactic brain coordinate system, such as three-dimensional stereotactic surface projections and statistical parametric mapping, it is difficult to quantitatively assess anatomical information and the extent of an abnormal region. In this study, we devised a method to quantitatively assess local abnormal findings by segmenting a brain map according to anatomical structure. Through quantitative local abnormality assessment using this method, we studied the characteristics of the distribution of reduced blood flow in cases with dementia of the Alzheimer type (DAT). Using twenty-five cases with DAT (mean age, 68.9 years), all of whom were diagnosed with probable Alzheimer's disease based on the National Institute of Neurological and Communicative Disorders and Stroke-Alzheimer's Disease and Related Disorders Association (NINCDS-ADRDA) criteria, we collected I-123 iodoamphetamine SPECT data. A 3D brain map generated with the 3D-stereotactic surface projections (SSP) program was compared with data from 20 cases in the control group, who were age-matched to the subject cases. To study local abnormalities on the 3D images, we divided the whole brain into 24 segments based on anatomical classification. For each segment we assessed the extent of the abnormal region (the proportion of coordinates with a Z-value exceeding the threshold, among all coordinates within the segment) and its severity (the average Z-value of the coordinates exceeding the threshold). This method clarified the orientation and expansion of reduced accumulation by classifying stereotactic brain coordinates according to anatomical structure, and was considered useful for quantitatively grasping distribution abnormalities in the brain and changes in abnormality distribution. (author)

  13. Can’t Count or Won’t Count? Embedding Quantitative Methods in Substantive Sociology Curricula: A Quasi-Experiment

    Science.gov (United States)

    Williams, Malcolm; Sloan, Luke; Cheung, Sin Yi; Sutton, Carole; Stevens, Sebastian; Runham, Libby

    2015-01-01

    This paper reports on a quasi-experiment in which quantitative methods (QM) are embedded within a substantive sociology module. By measuring student attitudes before and after the intervention, alongside control group comparisons, we illustrate the impact that embedding has on the student experience. Our findings are complex and even contradictory. Whilst the experimental group were less likely to be distrustful of statistics and more likely to appreciate how QM inform social research, they were also less confident about their statistical abilities, suggesting that through ‘doing’ quantitative sociology the experimental group are exposed to the intricacies of method and their optimism about their own abilities is challenged. We conclude that embedding QM in a single substantive module is not a ‘magic bullet’ and that a wider programme of content and assessment diversification across the curriculum is preferable. PMID:27330225

  14. A Comparative Study on Recently-Introduced Nature-Based Global Optimization Methods in Complex Mechanical System Design

    Directory of Open Access Journals (Sweden)

    Abdulbaset El Hadi Saad

    2017-10-01

    Advanced global optimization algorithms have been continuously introduced and improved to solve various complex design optimization problems for which the objective and constraint functions can only be evaluated through computation-intensive numerical analyses or simulations with a large number of design variables. The often implicit, multimodal, and ill-shaped objective and constraint functions in high-dimensional and “black-box” forms demand that the search be carried out using a low number of function evaluations with high search efficiency and good robustness. This work investigates the performance of six recently introduced, nature-inspired global optimization methods: Artificial Bee Colony (ABC), Firefly Algorithm (FFA), Cuckoo Search (CS), Bat Algorithm (BA), Flower Pollination Algorithm (FPA), and Grey Wolf Optimizer (GWO). These approaches are compared in terms of search efficiency and robustness in solving a set of representative benchmark problems in smooth-unimodal, non-smooth unimodal, smooth multimodal, and non-smooth multimodal function forms. In addition, four classic engineering optimization examples and a real-life complex mechanical system design optimization problem, floating offshore wind turbine design optimization, are used as additional test cases representing computationally expensive black-box global optimization problems. Results from this comparative study show that the ability of these global optimization methods to obtain a good solution diminishes as the dimension of the problem, or number of design variables, increases. Although none of these methods is universally capable, the study finds that GWO and ABC are more efficient on average than the other four in obtaining high quality solutions efficiently and consistently, solving 86% and 80% of the tested benchmark problems, respectively. The research contributes to future improvements of global optimization methods.
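To make the comparison concrete, one of the six methods, the Grey Wolf Optimizer, can be sketched in a few lines. This is a minimal illustrative implementation only (the population size, iteration count, and the Rastrigin test function are my own choices, not the benchmark settings used in the paper):

```python
import math
import random

def gwo(objective, dim, bounds, n_wolves=20, n_iters=300, seed=42):
    """Minimal Grey Wolf Optimizer sketch. The three best wolves
    (alpha, beta, delta) guide the pack; the control parameter `a`
    decays linearly from 2 to 0, shifting the search from
    exploration toward exploitation."""
    rng = random.Random(seed)
    lo, hi = bounds
    wolves = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_wolves)]

    def clamp(x):
        return max(lo, min(hi, x))

    for it in range(n_iters):
        wolves.sort(key=objective)
        alpha, beta, delta = wolves[0], wolves[1], wolves[2]
        a = 2.0 * (1.0 - it / n_iters)  # linearly decreasing control parameter
        for i in range(n_wolves):
            new_pos = []
            for d in range(dim):
                guided = []
                for leader in (alpha, beta, delta):
                    r1, r2 = rng.random(), rng.random()
                    A = 2.0 * a * r1 - a  # step-size coefficient
                    C = 2.0 * r2          # leader-weighting coefficient
                    dist = abs(C * leader[d] - wolves[i][d])
                    guided.append(leader[d] - A * dist)
                new_pos.append(clamp(sum(guided) / 3.0))
            wolves[i] = new_pos

    best = min(wolves, key=objective)
    return best, objective(best)

# Example: minimize the multimodal Rastrigin function in 5 dimensions.
def rastrigin(x):
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

best, fbest = gwo(rastrigin, dim=5, bounds=(-5.12, 5.12))
```

On a multimodal function like Rastrigin the sketch converges toward a low-lying basin rather than guaranteeing the global minimum, which mirrors the paper's observation that performance degrades with dimensionality.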

  15. The current preference for the immuno-analytical ELISA method for quantitation of steroid hormones (endocrine disruptor compounds) in wastewater in South Africa.

    Science.gov (United States)

    Manickum, Thavrin; John, Wilson

    2015-07-01

    The availability of national test centers to offer a routine service for analysis and quantitation of some selected steroid hormones [natural estrogens (17-β-estradiol, E2; estrone, E1; estriol, E3), synthetic estrogen (17-α-ethinylestradiol, EE2), androgen (testosterone), and progestogen (progesterone)] in wastewater matrix was investigated; corresponding internationally used chemical- and immuno-analytical test methods were reviewed. The enzyme-linked immunosorbent assay (ELISA) (immuno-analytical technique) was also assessed for its suitability as a routine test method to quantitate the levels of these hormones at a sewage/wastewater treatment plant (WTP) (Darvill, Pietermaritzburg, South Africa), over a 2-year period. The method performance and other relevant characteristics of the immuno-analytical ELISA method were compared to the conventional chemical-analytical methodology, like gas/liquid chromatography-mass spectrometry (GC/LC-MS), and GC-LC/tandem mass spectrometry (MSMS), for quantitation of the steroid hormones in wastewater and environmental waters. The national immuno-analytical ELISA technique was found to be sensitive (LOQ 5 ng/L, LOD 0.2-5 ng/L), accurate (mean recovery 96%), precise (RSD 7-10%), and cost-effective for screening and quantitation of these steroid hormones in wastewater and environmental water matrix. A survey of the most current international literature indicates a fairly equal use of the LC-MS/MS, GC-MS/MS (chemical-analytical), and ELISA (immuno-analytical) test methods for screening and quantitation of the target steroid hormones in both water and wastewater matrix. Internationally, the observed sensitivity, based on LOQ (ng/L), for the steroid estrogens E1, E2, EE2, is, in decreasing order: LC-MSMS (0.08-9.54) > GC-MS (1) > ELISA (5) (chemical-analytical > immuno-analytical). At the national level, the routine, unoptimized chemical-analytical LC-MSMS method was found to lack the required sensitivity for meeting environmental

  16. Quantitative SPECT reconstruction for brain distribution with a non-uniform attenuation using a regularizing method

    International Nuclear Information System (INIS)

    Soussaline, F.; Bidaut, L.; Raynaud, C.; Le Coq, G.

    1983-06-01

    An analytical solution to the SPECT reconstruction problem, in which the actual attenuation effect can be included, was developed using a regularizing iterative method (RIM). The potential of this approach in quantitative brain studies using a tracer for cerebrovascular disorders is now under evaluation. Mathematical simulations of a distributed activity in the brain surrounded by the skull, and physical phantom studies, were performed using a rotating-camera-based SPECT system, allowing calibration of the system and evaluation of the adapted method. In the simulation studies, the contrast obtained along a profile was less than 5%, the standard deviation 8%, and the quantitative accuracy 13%, for a uniform emission distribution of mean = 100 per pixel and a double attenuation coefficient of μ = 0.115 cm⁻¹ and 0.5 cm⁻¹. Clinical data obtained after injection of 123I (AMPI) were reconstructed using the RIM in subjects with and without cerebrovascular diseases or lesion defects. Contour-finding techniques were used for delineation of the brain and the skull, and measured attenuation coefficients were assumed within these two regions. Using volumes of interest selected on homogeneous regions of a hemisphere and mirrored symmetrically, the statistical uncertainty for 300 K events in the tomogram was found to be 12%, and the index of symmetry was 4% for a normal distribution. These results suggest that quantitative SPECT reconstruction of brain distributions is feasible, and that, combined with an adapted tracer and an adequate model, physiopathological parameters could be extracted.

  17. A global optimization method for evaporative cooling systems based on the entransy theory

    International Nuclear Information System (INIS)

    Yuan, Fang; Chen, Qun

    2012-01-01

    The evaporative cooling technique, one of the most widely used methods, is essential to both energy conservation and environmental protection. This contribution introduces a global optimization method for indirect evaporative cooling systems with coupled heat and mass transfer processes, based on the entransy theory, to improve their energy efficiency. First, we classify the irreversible processes in the system into the heat transfer process, the coupled heat and mass transfer process, and the mixing process of waters in different branches, where the irreversibility is evaluated by the entransy dissipation. Then, through the total system entransy dissipation, we establish the theoretical relationship of the user demands with both the geometrical structures of each heat exchanger and the operating parameters of each fluid, and derive two optimization equation groups focusing on two typical optimization problems. Finally, an indirect evaporative cooling system is taken as an example to illustrate the application of the newly proposed optimization method. It is concluded that there exists an optimal circulating water flow rate that minimizes the total thermal conductance of the system. Furthermore, with different user demands and moist air inlet conditions, it is global optimization, rather than parametric analysis, that will obtain the optimal performance of the system. -- Highlights: ► Introduce a global optimization method for evaporative cooling systems. ► Establish the direct relation between user demands and the design parameters. ► Obtain two groups of optimization equations for two typical optimization objectives. ► Solving the equations offers the optimal design parameters for the system. ► Provide the instruction for the design of coupled heat and mass transfer systems.

  18. Quantitative Nuclear Medicine. Chapter 17

    Energy Technology Data Exchange (ETDEWEB)

    Ouyang, J.; El Fakhri, G. [Massachusetts General Hospital and Harvard Medical School, Boston (United States)

    2014-12-15

    Planar imaging is still used in clinical practice although tomographic imaging (single photon emission computed tomography (SPECT) and positron emission tomography (PET)) is becoming more established. In this chapter, quantitative methods for both imaging techniques are presented. Planar imaging is limited to single photon. For both SPECT and PET, the focus is on the quantitative methods that can be applied to reconstructed images.

  19. Quantitative analysis of receptor imaging

    International Nuclear Information System (INIS)

    Fu Zhanli; Wang Rongfu

    2004-01-01

    Model-based methods for quantitative analysis of receptor imaging, including kinetic, graphical, and equilibrium methods, are introduced in detail. Some technical problems facing quantitative analysis of receptor imaging, such as the correction for in vivo metabolism of the tracer, the radioactivity contribution from blood volume within the ROI, and the estimation of the nondisplaceable ligand concentration, are also reviewed briefly

  20. Spatial access priority mapping (SAPM) with fishers: a quantitative GIS method for participatory planning.

    Science.gov (United States)

    Yates, Katherine L; Schoeman, David S

    2013-01-01

    Spatial management tools, such as marine spatial planning and marine protected areas, are playing an increasingly important role in attempts to improve marine management and accommodate conflicting needs. Robust data are needed to inform decisions among different planning options, and early inclusion of stakeholder involvement is widely regarded as vital for success. One of the biggest stakeholder groups, and the most likely to be adversely impacted by spatial restrictions, is the fishing community. In order to take their priorities into account, planners need to understand spatial variation in their perceived value of the sea. Here a readily accessible, novel method for quantitatively mapping fishers' spatial access priorities is presented. Spatial access priority mapping, or SAPM, uses only basic functions of standard spreadsheet and GIS software. Unlike the use of remote-sensing data, SAPM actively engages fishers in participatory mapping, documenting rather than inferring their priorities. By so doing, SAPM also facilitates the gathering of other useful data, such as local ecological knowledge. The method was tested and validated in Northern Ireland, where over 100 fishers participated in a semi-structured questionnaire and mapping exercise. The response rate was excellent, 97%, demonstrating fishers' willingness to be involved. The resultant maps are easily accessible and instantly informative, providing a very clear visual indication of which areas are most important for the fishers. The maps also provide quantitative data, which can be used to analyse the relative impact of different management options on the fishing industry and can be incorporated into planning software, such as MARXAN, to ensure that conservation goals can be met with minimal negative impact on the industry. This research shows how spatial access priority mapping can facilitate the early engagement of fishers and the ready incorporation of their priorities into the decision-making process

  1. Evaluation of global and regional left ventricular function obtained by quantitative gated SPECT using {sup 99m}Tc-tetrofosmin for left ventricular dysfunction

    Energy Technology Data Exchange (ETDEWEB)

    Ban, Kazunobu; Nakajima, Tohru; Iseki, Harukazu; Abe, Sumihisa; Handa, Shunnosuke; Suzuki, Yutaka [Tokai Univ., Isehara, Kanagawa (Japan). School of Medicine

    2000-08-01

    The quantitative gated SPECT (QGS) software is able to calculate LV volumes and visualize LV wall motion and perfusion throughout the cardiac cycle using an automatic edge detection algorithm of the left ventricle. We evaluated the reliability of global and regional LV function assessment derived from QGS by comparing it with the results from left ventriculo-cineangiography (LVG). In 20 patients with left ventricular dysfunction who underwent ECG gated {sup 99m}Tc-tetrofosmin SPECT, the end-diastolic volume (EDV), end-systolic volume (ESV) and ejection fraction (LVEF) were calculated. The QGS-assessed regional wall motion was determined using the cinematic display. QGS-derived EDV, ESV and LVEF correlated well with those by LVG (p<0.001 for each). There was a good correlation between wall motion score (WMS) derived from the QGS and the LVG (r=0.40, p<0.05). In some patients with extensive myocardial infarction, there was a discrepancy in the regional wall motion results between QGS and LVG. The ECG-gated SPECT using QGS is useful to evaluate global and regional LV functions in left ventricular dysfunction. (author)

  2. Making Social Work Count: A Curriculum Innovation to Teach Quantitative Research Methods and Statistical Analysis to Undergraduate Social Work Students in the United Kingdom

    Science.gov (United States)

    Teater, Barbra; Roy, Jessica; Carpenter, John; Forrester, Donald; Devaney, John; Scourfield, Jonathan

    2017-01-01

    Students in the United Kingdom (UK) are found to lack knowledge and skills in quantitative research methods. To address this gap, a quantitative research methods and statistical analysis curriculum comprising 10 individual lessons was developed, piloted, and evaluated at two universities. The evaluation found that BSW students' (N = 81)…

  3. Handling large numbers of observation units in three-way methods for the analysis of qualitative and quantitative two-way data

    NARCIS (Netherlands)

    Kiers, Henk A.L.; Marchetti, G.M.

    1994-01-01

    Recently, a number of methods have been proposed for the exploratory analysis of mixtures of qualitative and quantitative variables. In these methods for each variable an object by object similarity matrix is constructed, and these are consequently analyzed by means of three-way methods like

  4. A new method to generate a high-resolution global distribution map of lake chlorophyll

    Science.gov (United States)

    Sayers, Michael J; Grimm, Amanda G.; Shuchman, Robert A.; Deines, Andrew M.; Bunnell, David B.; Raymer, Zachary B; Rogers, Mark W.; Woelmer, Whitney; Bennion, David; Brooks, Colin N.; Whitley, Matthew A.; Warner, David M.; Mychek-Londer, Justin G.

    2015-01-01

    A new method was developed, evaluated, and applied to generate a global dataset of growing-season chlorophyll-a (chl) concentrations in 2011 for freshwater lakes. Chl observations from freshwater lakes are valuable for estimating lake productivity as well as assessing the role that these lakes play in carbon budgets. The standard 4 km NASA OceanColor L3 chlorophyll concentration products generated from MODIS and MERIS sensor data are not sufficiently representative of global chl values because these can only resolve larger lakes, which generally have lower chl concentrations than lakes of smaller surface area. Our new methodology utilizes the 300 m-resolution MERIS full-resolution full-swath (FRS) global dataset as input and does not rely on the land mask used to generate standard NASA products, which masks many lakes that are otherwise resolvable in MERIS imagery. The new method produced chl concentration values for 78,938 and 1,074 lakes in the northern and southern hemispheres, respectively. The mean chl for lakes visible in the MERIS composite was 19.2 ± 19.2, the median was 13.3, and the interquartile range was 3.90–28.6 mg m−3. The accuracy of the MERIS-derived values was assessed by comparison with temporally near-coincident and globally distributed in situ measurements from the literature (n = 185, RMSE = 9.39, R2 = 0.72). This represents the first global-scale dataset of satellite-derived chl estimates for medium to large lakes.

  5. A simple linear regression method for quantitative trait loci linkage analysis with censored observations.

    Science.gov (United States)

    Anderson, Carl A; McRae, Allan F; Visscher, Peter M

    2006-07-01

    Standard quantitative trait loci (QTL) mapping techniques commonly assume that the trait is both fully observed and normally distributed. When considering survival or age-at-onset traits these assumptions are often incorrect. Methods have been developed to map QTL for survival traits; however, they are both computationally intensive and not available in standard genome analysis software packages. We propose a grouped linear regression method for the analysis of continuous survival data. Using simulation we compare this method to both the Cox and Weibull proportional hazards models and a standard linear regression method that ignores censoring. The grouped linear regression method is of equivalent power to both the Cox and Weibull proportional hazards methods and is significantly better than the standard linear regression method when censored observations are present. The method is also robust to the proportion of censored individuals and the underlying distribution of the trait. On the basis of linear regression methodology, the grouped linear regression model is computationally simple and fast and can be implemented readily in freely available statistical software.
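The attenuation that motivates the grouped approach is easy to demonstrate: naively regressing a right-censored trait on genotype biases the estimated QTL effect toward zero. A minimal simulation sketch (the effect size, censoring threshold, and sample size are illustrative assumptions, not values from the study):

```python
import random

def ols_slope(xs, ys):
    """Ordinary least-squares slope of y on x."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

rng = random.Random(0)
true_effect = 2.0   # hypothetical additive QTL effect per allele
censor_at = 11.0    # right-censoring threshold (e.g. end of study)

# Simulate an age-at-onset trait influenced by genotype (0, 1, or 2 alleles).
genotypes = [rng.choice([0, 1, 2]) for _ in range(2000)]
traits = [10.0 + true_effect * g + rng.gauss(0.0, 1.0) for g in genotypes]

# Right-censor: onsets beyond the threshold are only known to exceed it.
censored = [min(t, censor_at) for t in traits]

naive = ols_slope(genotypes, censored)  # ignores censoring; attenuated
full = ols_slope(genotypes, traits)     # uses the (unobservable) full data
```

The naive slope falls well short of the true effect, which is why the abstract finds standard linear regression significantly worse than methods that model the censoring.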

  6. The Advantages and Disadvantages of Using Qualitative and Quantitative Approaches and Methods in Language "Testing and Assessment" Research: A Literature Review

    Science.gov (United States)

    Rahman, Md Shidur

    2017-01-01

    Researchers in various disciplines often use qualitative and quantitative research methods and approaches in their studies. Some of these researchers like to be known as qualitative researchers; others like to be regarded as quantitative researchers. The researchers are thus sharply polarised, and they engage in a competition of pointing…

  7. SPECT myocardial blood flow quantitation toward clinical use: a comparative study with {sup 13}N-Ammonia PET myocardial blood flow quantitation

    Energy Technology Data Exchange (ETDEWEB)

    Hsu, Bailing [University of Missouri-Columbia, Nuclear Science and Engineering Institute, Columbia, Missouri (United States); Hu, Lien-Hsin; Yang, Bang-Hung; Ting, Chien-Hsin; Huang, Wen-Sheng [Taipei Veterans General Hospital, Department of Nuclear Medicine, Taipei (China); Chen, Lung-Ching [Shin Kong Wu-Ho Su Memorial Hospital, Division of Cardiology, Taipei (China); Chen, Yen-Kung [Shin Kong Wu-Ho Su Memorial Hospital, Department of Nuclear Medicine, Taipei (China); Hung, Guang-Uei [Chang Bing Show Chwan Memorial Hospital, Department of Nuclear Medicine, Changhua (China); Wu, Tao-Cheng [National Yang-Ming University, Cardiovascular Research Center, Taipei (China)

    2017-01-15

    The aim of this study was to evaluate the accuracy of myocardial blood flow (MBF) quantitation with {sup 99m}Tc-Sestamibi (MIBI) single photon emission computed tomography (SPECT) compared with {sup 13}N-Ammonia (NH3) positron emission tomography (PET) in the same cohorts. Recent advances in SPECT technology have been applied to develop MBF quantitation as a promising tool to diagnose coronary artery disease (CAD) in areas where PET MBF quantitation is not available. However, whether the SPECT approach can achieve the same level of accuracy as the PET approach for clinical use still needs further investigation. Twelve healthy volunteers (HVT) and 16 clinical patients with CAD received both MIBI SPECT and NH3 PET flow scans. Dynamic SPECT images acquired with high temporal resolution were fully corrected for physical factors and processed to quantify K1 using standard compartmental modeling. The human MIBI tracer extraction fraction (EF) was determined by comparing MIBI K1 and NH3 flow in the HVT group and then used to convert K1 to flow values for all subjects. MIBI and NH3 flow values were systematically compared to validate the SPECT approach. The human MIBI EF was determined as [1.0-0.816*exp(-0.267/MBF)]. Global and regional MBF and myocardial flow reserve (MFR) from MIBI SPECT and NH3 PET were highly correlated for all subjects (global R{sup 2}: MBF = 0.92, MFR = 0.78; regional R{sup 2}: MBF ≥ 0.88, MFR ≥ 0.71). No significant differences in rest flow, stress flow, or MFR between the two approaches were observed (all p ≥ 0.088). Bland-Altman plots overall revealed small bias between MIBI SPECT and NH3 PET (global: ΔMBF = -0.03 ml/min/g, ΔMFR = 0.07; regional: ΔMBF = -0.07 - 0.06, ΔMFR = -0.02 - 0.22). SPECT quantitation can measure myocardial blood flow as accurately as PET quantitation when the comprehensive imaging factors of SPECT that drive the variability between the two approaches are fully addressed and corrected
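The abstract reports the human MIBI extraction fraction as EF = [1.0 - 0.816*exp(-0.267/MBF)]. A hedged sketch of how a measured uptake rate K1 could be converted back to flow by numerically inverting that relation (the Renkin-Crone form K1 = MBF * EF(MBF) and the bisection bounds are assumptions for illustration, not the authors' published processing pipeline):

```python
import math

def extraction_fraction(mbf):
    """Human MIBI extraction fraction as reported in the abstract:
    EF = 1.0 - 0.816 * exp(-0.267 / MBF)."""
    return 1.0 - 0.816 * math.exp(-0.267 / mbf)

def k1_from_mbf(mbf):
    """Renkin-Crone style relation (assumption): K1 = MBF * EF(MBF)."""
    return mbf * extraction_fraction(mbf)

def mbf_from_k1(k1, lo=1e-6, hi=10.0, tol=1e-9):
    """Invert K1 = MBF * EF(MBF) by bisection; over this range the
    relation is monotonically increasing in MBF."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if k1_from_mbf(mid) < k1:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For example, a stress flow of 1.5 ml/min/g maps to a K1 of about 0.48 and recovers cleanly through the inverse.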

  8. SPECT myocardial blood flow quantitation toward clinical use: a comparative study with 13N-Ammonia PET myocardial blood flow quantitation

    International Nuclear Information System (INIS)

    Hsu, Bailing; Hu, Lien-Hsin; Yang, Bang-Hung; Ting, Chien-Hsin; Huang, Wen-Sheng; Chen, Lung-Ching; Chen, Yen-Kung; Hung, Guang-Uei; Wu, Tao-Cheng

    2017-01-01

    The aim of this study was to evaluate the accuracy of myocardial blood flow (MBF) quantitation with 99mTc-Sestamibi (MIBI) single photon emission computed tomography (SPECT) compared with 13N-Ammonia (NH3) positron emission tomography (PET) in the same cohorts. Recent advances in SPECT technology have been applied to develop MBF quantitation as a promising tool to diagnose coronary artery disease (CAD) in areas where PET MBF quantitation is not available. However, whether the SPECT approach can achieve the same level of accuracy as the PET approach for clinical use still needs further investigation. Twelve healthy volunteers (HVT) and 16 clinical patients with CAD received both MIBI SPECT and NH3 PET flow scans. Dynamic SPECT images acquired with high temporal resolution were fully corrected for physical factors and processed to quantify K1 using standard compartmental modeling. The human MIBI tracer extraction fraction (EF) was determined by comparing MIBI K1 and NH3 flow in the HVT group and then used to convert K1 to flow values for all subjects. MIBI and NH3 flow values were systematically compared to validate the SPECT approach. The human MIBI EF was determined as [1.0-0.816*exp(-0.267/MBF)]. Global and regional MBF and myocardial flow reserve (MFR) from MIBI SPECT and NH3 PET were highly correlated for all subjects (global R2: MBF = 0.92, MFR = 0.78; regional R2: MBF ≥ 0.88, MFR ≥ 0.71). No significant differences in rest flow, stress flow, or MFR between the two approaches were observed (all p ≥ 0.088). Bland-Altman plots overall revealed small bias between MIBI SPECT and NH3 PET (global: ΔMBF = -0.03 ml/min/g, ΔMFR = 0.07; regional: ΔMBF = -0.07 - 0.06, ΔMFR = -0.02 - 0.22). SPECT quantitation can measure myocardial blood flow as accurately as PET quantitation when the comprehensive imaging factors of SPECT that drive the variability between the two approaches are fully addressed and corrected. (orig.)

  9. Quantitative measurement of mixtures by terahertz time–domain ...

    Indian Academy of Sciences (India)

    earth and space science, quality control of food and agricultural products and global environmental monitoring. In quantitative applications, terahertz technology has been widely used for studying different kinds of mixtures, such as amino acids [8], ternary chemical mixtures [9], pharmaceuticals [10], and racemic compounds [11].

  10. Validation of quantitative method for azoxystrobin residues in green beans and peas.

    Science.gov (United States)

    Abdelraheem, Ehab M H; Hassan, Sayed M; Arief, Mohamed M H; Mohammad, Somaia G

    2015-09-01

    This study presents a method validation for the extraction and quantitative analysis of azoxystrobin residues in green beans and peas using HPLC-UV, with the results confirmed by GC-MS. The method involved initial extraction with acetonitrile after the addition of salts (magnesium sulfate and sodium chloride), followed by a cleanup step with activated neutral carbon. The validation parameters linearity, matrix effect, LOQ, specificity, trueness, and repeatability precision were attained. The spiking levels for the trueness and precision experiments were 0.1, 0.5, and 3 mg/kg. For HPLC-UV analysis, mean recoveries ranged from 83.69% to 91.58% and from 81.99% to 107.85% for green beans and peas, respectively. For GC-MS analysis, mean recoveries ranged from 76.29% to 94.56% and from 80.77% to 100.91% for green beans and peas, respectively. According to these results, the method has proven to be efficient for the extraction and determination of azoxystrobin residues in green beans and peas. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. A Comparison of Multivariate and Pre-Processing Methods for Quantitative Laser-Induced Breakdown Spectroscopy of Geologic Samples

    Science.gov (United States)

    Anderson, R. B.; Morris, R. V.; Clegg, S. M.; Bell, J. F., III; Humphries, S. D.; Wiens, R. C.

    2011-01-01

    The ChemCam instrument selected for the Curiosity rover is capable of remote laser-induced breakdown spectroscopy (LIBS).[1] We used a remote LIBS instrument similar to ChemCam to analyze 197 geologic slab samples and 32 pressed-powder geostandards. The slab samples are well-characterized and have been used to validate the calibration of previous instruments on Mars missions, including CRISM [2], OMEGA [3], the MER Pancam [4], Mini-TES [5], and Moessbauer [6] instruments and the Phoenix SSI [7]. The resulting dataset was used to compare multivariate methods for quantitative LIBS and to determine the effect of grain size on calculations. Three multivariate methods - partial least squares (PLS), multilayer perceptron artificial neural networks (MLP ANNs) and cascade correlation (CC) ANNs - were used to generate models and extract the quantitative composition of unknown samples. PLS can be used to predict one element (PLS1) or multiple elements (PLS2) at a time, as can the neural network methods. Although MLP and CC ANNs were successful in some cases, PLS generally produced the most accurate and precise results.

  12. Transcending the Quantitative-Qualitative Divide with Mixed Methods Research: A Multidimensional Framework for Understanding Congruence and Completeness in the Study of Values

    Science.gov (United States)

    McLafferty, Charles L., Jr.; Slate, John R.; Onwuegbuzie, Anthony J.

    2010-01-01

    Quantitative research dominates published literature in the helping professions. Mixed methods research, which integrates quantitative and qualitative methodologies, has received a lukewarm reception. The authors address the iterative separation that infuses theory, praxis, philosophy, methodology, training, and public perception and propose a…

  13. Quantitative methods for compensation of matrix effects and self-absorption in Laser Induced Breakdown Spectroscopy signals of solids

    Science.gov (United States)

    Takahashi, Tomoko; Thornton, Blair

    2017-12-01

    This paper reviews methods to compensate for matrix effects and self-absorption during quantitative analysis of the composition of solids measured using Laser Induced Breakdown Spectroscopy (LIBS), and their applications to in-situ analysis. Methods to reduce matrix and self-absorption effects on calibration curves are first introduced, and the conditions under which calibration curves are applicable to quantification of the composition of solid samples, along with their limitations, are discussed. While calibration-free LIBS (CF-LIBS), which corrects matrix effects theoretically based on the Boltzmann distribution law and the Saha equation, has been applied in a number of studies, several requirements must be satisfied for the calculated chemical compositions to be valid; moreover, peaks of all elements contained in the target need to be detected, which is a bottleneck for in-situ analysis of unknown materials. Multivariate analysis techniques are gaining momentum in LIBS analysis. Among the available techniques, principal component regression (PCR) analysis and partial least squares (PLS) regression analysis, which can extract composition-related information from the full spectral data, are widely established methods and have been applied to various fields, including in-situ applications in air and planetary exploration. Artificial neural networks (ANNs), which can model non-linear effects, have also been investigated as a quantitative method, and their applications are introduced. The ability to make quantitative estimates from LIBS signals is seen as a key element for the technique to gain wider acceptance as an analytical method, especially in in-situ applications. To accelerate this process, it is recommended that accuracy be described using common figures of merit expressing the overall normalised accuracy, such as the normalised root mean square error (NRMSE), when comparing the accuracy obtained from different setups and analytical methods.
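
    The NRMSE figure of merit mentioned above can be computed as the RMSE normalised by the range of the reference values (normalisation conventions vary; dividing by the range is one common choice, assumed here):

```python
import numpy as np

def nrmse(y_true, y_pred):
    """Root mean square error normalised by the range of the reference values."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
    return rmse / (y_true.max() - y_true.min())

# Illustrative reference vs estimated compositions (wt.%)
ref = [10.0, 20.0, 30.0, 40.0]
est = [12.0, 19.0, 33.0, 38.0]
print(round(nrmse(ref, est), 4))  # 0.0707
```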

  14. Validation of HPLC method for the simultaneous and quantitative determination of 12 UV-filters in cosmetics.

    Science.gov (United States)

    Nyeborg, M; Pissavini, M; Lemasson, Y; Doucet, O

    2010-02-01

    The aim of the study was the validation of a high-performance liquid chromatography (HPLC) method for the simultaneous quantitative determination of twelve commonly used organic UV-filters (phenylbenzimidazole sulfonic acid, benzophenone-3, isoamyl p-methoxycinnamate, diethylamino hydroxybenzoyl hexyl benzoate, octocrylene, ethylhexyl methoxycinnamate, ethylhexyl salicylate, butyl methoxydibenzoylmethane, diethylhexyl butamido triazone, ethylhexyl triazone, methylene bis-benzotriazolyl tetramethylbutylphenol and bis-ethylhexyloxyphenol methoxyphenyl triazine) contained in suncare products. The separation and quantitative determination were performed in <30 min, using a Symmetry Shield® C18 (5 µm) column from Waters and a mobile phase (gradient mode) consisting of ethanol and acidified water. UV measurements were carried out at multiple wavelengths, according to the absorption of the analytes.

  15. The quantitative imaging network: the role of quantitative imaging in radiation therapy

    International Nuclear Information System (INIS)

    Tandon, Pushpa; Nordstrom, Robert J.; Clark, Laurence

    2014-01-01

    The potential value of modern medical imaging methods has created a need for mechanisms to develop, translate and disseminate emerging imaging technologies and, ideally, to quantitatively correlate those with other related laboratory methods, such as the genomics and proteomics analyses required to support clinical decisions. One strategy to meet these needs efficiently and cost effectively is to develop an international network to share and reach consensus on best practices, imaging protocols, common databases, and open science strategies, and to collaboratively seek opportunities to leverage resources wherever possible. One such network is the Quantitative Imaging Network (QIN) started by the National Cancer Institute, USA. The mission of the QIN is to improve the role of quantitative imaging for clinical decision making in oncology by the development and validation of data acquisition, analysis methods, and other quantitative imaging tools to predict or monitor the response to drug or radiation therapy. The network currently has 24 teams (two from Canada and 22 from the USA) and several associate members, including one from Tata Memorial Centre, Mumbai, India. Each QIN team collects data from ongoing clinical trials and develops software tools for quantitation and validation to create standards for imaging research, and for use in developing models for therapy response prediction and measurement and tools for clinical decision making. The members of QIN are addressing a wide variety of cancers (head and neck, prostate, breast, brain, lung, liver and colon) using multiple imaging modalities (PET, CT, MRI, FMISO PET, DW-MRI, PET-CT). (author)

  16. Quantitative myocardial blood flow with Rubidium-82 PET

    DEFF Research Database (Denmark)

    Hagemann, Christoffer E; Ghotbi, Adam A; Kjær, Andreas

    2015-01-01

    Positron emission tomography (PET) allows assessment of myocardial blood flow in absolute terms (ml/min/g). Quantification of myocardial blood flow (MBF) and myocardial flow reserve (MFR) extends the scope of conventional semi-quantitative myocardial perfusion imaging (MPI), e.g. in 1) identificat... With the association between global MFR and major adverse cardiovascular events (MACE), and together with new diagnostic possibilities from measuring the longitudinal myocardial perfusion gradient, cardiac (82)Rb PET faces a promising clinical future. This article reviews current evidence on quantitative (82)Rb PET's ability...
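
    MFR is conventionally the ratio of hyperaemic (stress) MBF to rest MBF, both in ml/min/g; a minimal sketch with illustrative values:

```python
def myocardial_flow_reserve(mbf_stress, mbf_rest):
    """MFR: ratio of stress to rest myocardial blood flow (both in ml/min/g)."""
    if mbf_rest <= 0:
        raise ValueError("rest MBF must be positive")
    return mbf_stress / mbf_rest

# Illustrative values: stress MBF 2.1 ml/min/g, rest MBF 0.7 ml/min/g
print(round(myocardial_flow_reserve(2.1, 0.7), 2))  # 3.0
```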

  17. THE STUDY OF SOCIAL REPRESENTATIONS BY THE VIGNETTE METHOD: A QUANTITATIVE INTERPRETATION

    Directory of Open Access Journals (Sweden)

    Ж В Пузанова

    2017-12-01

    The article focuses on the prospects of the vignette method as a new method in empirical sociology and a good alternative to conventional mass survey methods. The article consists of several sections differing in focus. The vignette method is not popular among Russian scientists, but has a long history abroad. A wide range of problems can be addressed by this method (e.g. the prospects for guardianship and its evaluation, international students' adaptation to the educational system, social justice studies, marketing and business research, etc.). The vignette method can be used for studying various problems, including sensitive questions (e.g. HIV, drugs, psychological trauma, etc.), because it is one of the projective techniques. Projective techniques allow researchers to obtain more reliable information, because the respondent projects one situation onto another, but at the same time responds to his own stimulus. The article considers the advantages and disadvantages of the method, and the authors describe its limitations. The article presents the key principles for designing and developing vignettes depending on the research type, and the authors provide examples of their own vignettes tested in the course of their own empirical research. The authors highlight the advantages of logical-combinatorial approaches (especially the JSM method with its dichotomy) for the analysis of data in quantitative research. They also consider another method of data analysis that implies the technique of "steeping", i.e. the respondent gets new information step by step, which extends his previous knowledge.

  18. A new method for quantitative assessment of resilience engineering by PCA and NT approach: A case study in a process industry

    International Nuclear Information System (INIS)

    Shirali, Gh.A.; Mohammadfam, I.; Ebrahimipour, V.

    2013-01-01

    In recent years, resilience engineering (RE) has attracted widespread interest from industry as well as academia because it presents a new way of thinking about safety and accidents. Although the concept of RE has been defined scholarly in various areas, only a few studies specifically focus on how to measure RE, so there is a gap in assessing resilience by quantitative methods. This research aimed at presenting a new method for the quantitative assessment of RE using a questionnaire and based on principal component analysis. Six resilience indicators, i.e., top management commitment, just culture, learning culture, awareness and opacity, preparedness, and flexibility, were chosen, and data on these indicators were gathered with a questionnaire in the 11 units of a process industry. The data were analyzed using a principal component analysis (PCA) approach. The analysis also yields scores for the resilience indicators and the process units, and the process units were ranked using these scores. Consequently, the prescribed approach can identify the poor indicators and process units. This is the first study that considers a quantitative assessment in the RE area conducted through PCA. Implementation of the proposed method would enable managers to recognize the current weaknesses and challenges against the resilience of their system. -- Highlights: •We quantitatively measure the potential of resilience. •The results are more tangible to understand and interpret. •The method facilitates comparison of resilience state among various process units. •The method facilitates comparison of units' resilience state with the best practice
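
    A PCA-based composite score of this kind can be sketched as follows; the indicator weighting (retained components combined in proportion to explained variance) is a common convention assumed here, not necessarily the authors' exact procedure, and the data are randomly generated:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical questionnaire data: rows = 11 process units, columns = the six
# resilience indicators; values stand in for mean item scores on a 1-5 scale.
rng = np.random.default_rng(1)
scores = rng.uniform(1, 5, size=(11, 6))

# Standardise, then weight indicators by PCA loadings, combining the
# retained components in proportion to their explained variance.
Z = StandardScaler().fit_transform(scores)
pca = PCA(n_components=3).fit(Z)
weights = pca.explained_variance_ratio_ @ pca.components_
unit_scores = Z @ weights  # one composite resilience score per unit

ranking = np.argsort(unit_scores)[::-1]  # highest-scoring unit first
print(ranking.shape)  # (11,)
```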

  19. A novel quantitative analysis method of three-dimensional fluorescence spectra for vegetable oils contents in edible blend oil

    Science.gov (United States)

    Xu, Jing; Wang, Yu-Tian; Liu, Xiao-Fei

    2015-04-01

    Edible blend oil is a mixture of vegetable oils. Eligible blend oil can meet the daily human need for two essential fatty acids to achieve balanced nutrition. Each vegetable oil has a different composition, so the vegetable oil contents in edible blend oil determine its nutritional components. A high-precision quantitative analysis method to detect the vegetable oil contents in blend oil is therefore necessary to ensure balanced nutrition. Three-dimensional fluorescence spectroscopy offers high selectivity, high sensitivity, and high efficiency, and efficient extraction and full use of the information in three-dimensional fluorescence spectra improve the accuracy of the measurement. A novel quantitative analysis method based on Quasi-Monte Carlo integration is proposed to improve the measurement sensitivity and reduce the random error. The partial least squares method is used to solve the nonlinear equations to avoid the effect of multicollinearity. The recovery rates of blend oil mixed from peanut oil, soybean oil and sunflower oil are calculated to verify the accuracy of the method; they are higher than those of the linear method commonly used for component concentration measurement.
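
    The Quasi-Monte Carlo idea can be illustrated independently of the spectroscopy: low-discrepancy points (e.g. a Sobol sequence) typically estimate a smooth integral with lower error than pseudo-random sampling. A sketch using `scipy.stats.qmc` with an illustrative integrand, not the paper's spectra:

```python
import numpy as np
from scipy.stats import qmc

# Estimate the integral of f(x, y) = x * y over the unit square
# (exact value 0.25) with a scrambled Sobol low-discrepancy sequence.
sampler = qmc.Sobol(d=2, scramble=True, seed=0)
points = sampler.random_base2(m=10)  # 2**10 = 1024 quasi-random points

estimate = np.mean(points[:, 0] * points[:, 1])
print(abs(estimate - 0.25) < 0.01)  # True
```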

  20. Application of quantitative real-time PCR compared to filtration methods for the enumeration of Escherichia coli in surface waters within Vietnam.

    Science.gov (United States)

    Vital, Pierangeli G; Van Ha, Nguyen Thi; Tuyet, Le Thi Hong; Widmer, Kenneth W

    2017-02-01

    Surface water samples were collected from the Saigon River, rural and suburban canals, and urban runoff canals in Ho Chi Minh City, Vietnam, and were processed to enumerate Escherichia coli. Quantification was done through membrane filtration and quantitative real-time polymerase chain reaction (PCR). Mean log colony-forming unit (CFU)/100 ml E. coli counts in the dry season for river/suburban canals and urban canals were log 2.8 and 3.7, respectively, using the membrane filtration method, while using TaqMan quantitative real-time PCR they were log 2.4 and 2.8, respectively. For the wet season, counts determined by the membrane filtration method in river/suburban canal and urban canal samples had means of log 3.7 and 4.1, respectively, while mean counts using quantitative PCR were log 3 and 2, respectively; the wet-season qPCR counts for urban canal samples were significantly lower than those determined by conventional culture methods. These results show that while quantitative real-time PCR can be used to determine levels of fecal indicator bacteria in surface waters, there are some limitations to its application and it may be impacted by sources of runoff based on surveyed samples.
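
    Counts like these are compared on a log10 scale; a minimal sketch of computing the mean log CFU/100 ml per method (the counts below are invented for illustration, not the study's data):

```python
import numpy as np

# Illustrative CFU/100 ml counts for the same samples by each method
filtration = np.array([5.0e3, 1.2e4, 8.0e3, 2.5e4])  # membrane filtration
qpcr = np.array([2.0e3, 9.0e2, 1.5e3, 3.0e3])        # quantitative real-time PCR

log_filtration = np.log10(filtration)
log_qpcr = np.log10(qpcr)

print(round(log_filtration.mean(), 2), round(log_qpcr.mean(), 2))  # 4.02 3.23
```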