WorldWideScience

Sample records for quantitation methods apex

  1. Comparison of two label-free global quantitation methods, APEX and 2D gel electrophoresis, applied to the Shigella dysenteriae proteome

    Directory of Open Access Journals (Sweden)

    Fleischmann Robert D

    2009-06-01

    Full Text Available The in vitro stationary phase proteome of the human pathogen Shigella dysenteriae serotype 1 (SD1) was quantitatively analyzed in Coomassie Blue G250 (CBB)-stained 2D gels. More than four hundred and fifty proteins, of which 271 were associated with distinct gel spots, were identified. In parallel, we employed 2D-LC-MS/MS followed by the label-free, computationally modified spectral counting method APEX for absolute protein expression measurements. Of the 4502 genome-predicted SD1 proteins, 1148 were identified at a false discovery rate of 5% and quantitated using 2D-LC-MS/MS and APEX. The dynamic range of the APEX method was approximately one order of magnitude higher than that of CBB-stained spot intensity quantitation. A squared Pearson correlation analysis revealed a reasonably good correlation (R2 = 0.67) for protein quantities surveyed by both methods. The correlation decreased for protein subsets with specific physicochemical properties, such as low Mr values and high hydropathy scores. Stoichiometric ratios of subunits of protein complexes characterized in E. coli were compared with APEX quantitative ratios of orthologous SD1 protein complexes. A high correlation was observed for subunits of soluble cellular protein complexes in several cases, demonstrating versatile applications of the APEX method in quantitative proteomics.
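    The squared Pearson correlation used above to compare the two quantitation methods can be sketched in a few lines; the abundance values below are invented for illustration and are not taken from the SD1 data set.

```python
import math

# Hypothetical log-scale protein abundances from two label-free methods
# (illustrative values only, not from the study).
apex = [2.1, 3.4, 2.8, 4.0, 3.1, 2.5, 3.7, 2.9]
gel  = [1.9, 3.6, 2.6, 4.2, 3.0, 2.2, 3.9, 3.1]

def pearson_r2(x, y):
    """Squared Pearson correlation coefficient of two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return (cov / (sx * sy)) ** 2

print(round(pearson_r2(apex, gel), 3))
```

    A value near 1 indicates close agreement between the two methods; the study's reported R2 of 0.67 corresponds to moderate agreement over the full proteome.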

  2. An In Vitro Comparison of Propex II Apex Locator to Standard Radiographic Method

    Science.gov (United States)

    Chakravarthy Pishipati, Kalyan Vinayak

    2013-01-01

    Introduction: The aim of this in vitro study was to compare the accuracy of radiography in assessing working length with that of the Propex II apex locator. Materials and Methods: Thirty single-canal extracted human teeth with patent apical foramina were selected. Access cavities were prepared. Anatomic length (AL) was determined by inserting a K-file into the root canal until the file tip was just visible at the most coronal aspect of the apical foramen; subsequently, 0.5 mm was deducted from this measured length. Working length by the radiographic method (RL) was determined using Ingle’s method. The Propex II apex locator was used to determine the electronic working length (EL). From these measured lengths, AL was deducted to obtain the D-value; a D-value within +/-0.5 mm was considered acceptable. Results: The percentage accuracy of RL and the Propex II apex locator was 76.6% and 86.6%, respectively. A paired t-test revealed a significant difference between RL and the Propex II apex locator. Conclusion: The Propex II apex locator determined working length more accurately than the radiographic method. PMID:23922572
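    The D-value criterion described above is simple to express in code; the tooth lengths below are hypothetical, chosen only to illustrate the ±0.5 mm acceptability rule.

```python
# D-value check as described: D = measured length - anatomic length (AL),
# acceptable if |D| <= 0.5 mm. Lengths in mm are invented for illustration.
anatomic   = [20.0, 21.5, 19.0, 22.0, 20.5]  # AL per tooth
electronic = [20.2, 21.4, 19.6, 22.1, 20.4]  # apex-locator readings

def percent_acceptable(measured, reference, tol=0.5):
    """Percentage of measurements whose D-value lies within +/- tol mm."""
    hits = sum(1 for m, r in zip(measured, reference) if abs(m - r) <= tol)
    return 100.0 * hits / len(measured)

print(percent_acceptable(electronic, anatomic))  # 80.0 (one of five exceeds 0.5 mm)
```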

  3. Electronic apex locators.

    Science.gov (United States)

    Gordon, M P J; Chandler, N P

    2004-07-01

    Prior to root canal treatment at least one undistorted radiograph is required to assess canal morphology. The apical extent of instrumentation and the final root filling have a role in treatment success, and are primarily determined radiographically. Electronic apex locators reduce the number of radiographs required and assist where radiographic methods create difficulty. They may also indicate cases where the apical foramen is some distance from the radiographic apex. Other roles include the detection of root canal perforation. A review of the literature focussed first on the subject of electronic apex location. A second review used the names of apex location devices. From the combined searches, 113 pertinent articles in English were found. This paper reviews the development, action, use and types of electronic apex locators.

  4. Novel apexification method in a non-vital tooth with an open apex: a case report.

    Directory of Open Access Journals (Sweden)

    Hamid Razavian

    2014-06-01

    Full Text Available Many materials have been introduced for apexification, each with its own advantages and disadvantages. This case report presents a new method of apexification using a combination of deproteinized bovine bone mineral (DBBM) and enamel matrix derivative (EMD). After irrigating the canal of the maxillary right canine with 2.5% sodium hypochlorite, a mixture of Bio-Oss and EMD was packed into the apical region to form an apical barrier, and the canal was obturated by the thermoplastic gutta-percha technique with AH26 sealer; the coronal seal was achieved with resin-bonded composite. The size of the periapical lesion decreased significantly at the 3-, 6-, 12- and 18-month follow-ups. The patient had no radiographic signs or clinical symptoms at the 24-month follow-up, and complete maturation of the apex and healing of the periapical bone were achieved.

  5. Comparison of digital radiography and apex locator with the conventional method in root length determination of primary teeth

    Directory of Open Access Journals (Sweden)

    I E Neena

    2011-01-01

    Full Text Available Aim: The purpose of this study was to compare the working length in primary tooth endodontics determined using intraoral digital radiovisiography and an apex locator with the conventional method for accuracy. Materials and Methods: This in vivo study was conducted on 30 primary teeth indicated for pulpectomy in patients aged 5-11 years. All experimental teeth had adequate remaining tooth structure for rubber dam isolation and radiographically visible canals. Endodontic treatment was required due to irreversible pulpitis or pulp necrosis. A standardized intraoral periapical radiograph of the tooth was taken by the conventional paralleling technique; the distances between the source and the tooth, and between the tooth and the film, were standardized using an X-ray positioning device. During the pulpectomy procedure, the working length was determined by digital radiography and by the apex locator. The measurements were then compared with the conventional root canal measurement technique for accuracy. Results: Working lengths determined in primary molars using digital radiography and the apex locator showed no significant difference in mean measurements compared with the conventional radiographic method. Conclusions: The apex locator is comparable to conventional radiography in determining working length in primary teeth, without radiation. Intraoral digital radiography is the safest method for determining working length, with a significant reduction in radiation exposure. Hence, both techniques can safely be used as alternatives to conventional radiographic methods for determining working length in primary teeth.

  6. Quantitative vs qualitative research methods.

    Science.gov (United States)

    Lakshman, M; Sinha, L; Biswas, M; Charles, M; Arora, N K

    2000-05-01

    Quantitative methods have been widely used because things that can be measured or counted gain scientific credibility over the unmeasurable. But the extent of biological abnormality, its severity, its consequences and the impact of illness cannot be satisfactorily captured by quantitative research alone. In such situations, qualitative methods take a holistic perspective, preserving the complexities of human behavior by addressing the "why" and "how" questions. In this paper an attempt has been made to highlight the strengths and weaknesses of both methods, and to show that a balanced mix of qualitative and quantitative methods yields the most valid and reliable results.

  7. Quantitative imaging methods in osteoporosis.

    Science.gov (United States)

    Oei, Ling; Koromani, Fjorda; Rivadeneira, Fernando; Zillikens, M Carola; Oei, Edwin H G

    2016-12-01

    Osteoporosis is characterized by a decreased bone mass and quality resulting in an increased fracture risk. Quantitative imaging methods are critical in the diagnosis and follow-up of treatment effects in osteoporosis. Prior radiographic vertebral fractures and bone mineral density (BMD) as a quantitative parameter derived from dual-energy X-ray absorptiometry (DXA) are among the strongest known predictors of future osteoporotic fractures. Therefore, current clinical decision making relies heavily on accurate assessment of these imaging features. Further, novel quantitative techniques are being developed to appraise additional characteristics of osteoporosis including three-dimensional bone architecture with quantitative computed tomography (QCT). Dedicated high-resolution (HR) CT equipment is available to enhance image quality. At the other end of the spectrum, by utilizing post-processing techniques such as the trabecular bone score (TBS) information on three-dimensional architecture can be derived from DXA images. Further developments in magnetic resonance imaging (MRI) seem promising to not only capture bone micro-architecture but also characterize processes at the molecular level. This review provides an overview of various quantitative imaging techniques based on different radiological modalities utilized in clinical osteoporosis care and research.

  8. Learning Apex programming

    CERN Document Server

    Kaufman, Matt

    2015-01-01

    If you are a developer who has some object-oriented programming experience, Learning Apex Programming is the perfect book for you. This book is most appropriate for developers who wish to gain an understanding of the Force.com platform and how to use Apex to create business applications.

  9. Qualitative versus quantitative methods in psychiatric research.

    Science.gov (United States)

    Razafsha, Mahdi; Behforuzi, Hura; Azari, Hassan; Zhang, Zhiqun; Wang, Kevin K; Kobeissy, Firas H; Gold, Mark S

    2012-01-01

    Qualitative studies are regaining credibility after a period of being dismissed as "not quantitative." Qualitative method is a broad umbrella term for research methodologies that describe and explain individuals' experiences, behaviors, interactions, and social contexts. In-depth interviews, focus groups, and participant observation are among the qualitative methods of inquiry commonly used in psychiatry. Researchers measure the frequency of events using quantitative methods; qualitative methods, however, provide a broader understanding and more thorough reasoning behind the events, and are hence considered of special importance in psychiatry. Besides hypothesis generation in the earlier phases of research, qualitative methods can be employed in questionnaire design, establishment of diagnostic criteria, feasibility studies, and studies of attitudes and beliefs. Animal models are another area in which qualitative methods can be employed, especially when naturalistic observation of animal behavior is important. However, since qualitative results can reflect the researcher's own view, they need to be confirmed statistically with quantitative methods. The tendency to combine qualitative and quantitative methods as complementary has emerged over recent years. By applying both methods of research, scientists can take advantage of the interpretative characteristics of qualitative methods as well as the experimental dimensions of quantitative methods.

  10. Monte Carlo studies of APEX

    Energy Technology Data Exchange (ETDEWEB)

    Ahmad, I.; Back, B.B.; Betts, R.R. [and others

    1995-08-01

    An essential component in the assessment of the significance of the results from APEX is a demonstrated understanding of the acceptance and response of the apparatus. This requires detailed simulations which can be compared to the results of various source and in-beam measurements. These simulations were carried out using the computer codes EGS and GEANT, both specifically designed for this purpose. As far as is possible, all details of the geometry of APEX were included. We compared the results of these simulations with measurements using electron conversion sources, positron sources and pair sources. The overall agreement is quite acceptable and some of the details are still being worked on. The simulation codes were also used to compare the results of measurements of in-beam positron and conversion electrons with expectations based on known physics or other methods. Again, satisfactory agreement is achieved. We are currently working on the simulation of various pair-producing scenarios such as the decay of a neutral object in the mass range 1.5-2.0 MeV and also the emission of internal pairs from nuclear transitions in the colliding ions. These results are essential input to the final results from APEX on cross section limits for various, previously proposed, sharp-line producing scenarios.
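    The kind of acceptance estimate described above can be illustrated with a toy Monte Carlo; the isotropic source and the angular cut below are invented for illustration and bear no relation to the actual EGS/GEANT models of APEX.

```python
import random

# Toy Monte Carlo acceptance estimate: sample isotropic emission directions
# (uniform in cos(theta)) and count the fraction that falls inside an
# assumed detector acceptance window |cos(theta)| < 0.5.
random.seed(1)  # fixed seed for reproducibility
N = 100_000
hits = sum(1 for _ in range(N) if abs(random.uniform(-1.0, 1.0)) < 0.5)
acceptance = hits / N
print(acceptance)  # close to the analytic value of 0.5 for this toy geometry
```

    Real detector simulations add energy response, scattering and geometry, but the principle, counting accepted events over generated events, is the same.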

  11. Quantitative methods in psychology: inevitable and useless

    Directory of Open Access Journals (Sweden)

    Aaro Toomela

    2010-07-01

    Full Text Available Science begins with the question, what do I want to know? Science becomes science, however, only when this question is justified and the appropriate methodology is chosen for answering the research question. Research question should precede the other questions; methods should be chosen according to the research question and not vice versa. Modern quantitative psychology has accepted method as primary; research questions are adjusted to the methods. For understanding thinking in modern quantitative psychology, two epistemologies should be distinguished: structural-systemic that is based on Aristotelian thinking, and associative-quantitative that is based on Cartesian-Humean thinking. The first aims at understanding the structure that underlies the studied processes; the second looks for identification of cause-effect relationships between the events with no possible access to the understanding of the structures that underlie the processes. Quantitative methodology in particular as well as mathematical psychology in general, is useless for answering questions about structures and processes that underlie observed behaviors. Nevertheless, quantitative science is almost inevitable in a situation where the systemic-structural basis of behavior is not well understood; all sorts of applied decisions can be made on the basis of quantitative studies. In order to proceed, psychology should study structures; methodologically, constructive experiments should be added to observations and analytic experiments.

  12. The rise of quantitative methods in Psychology

    Directory of Open Access Journals (Sweden)

    Denis Cousineau

    2005-09-01

    Full Text Available Quantitative methods have a long history in some scientific fields. Indeed, no one today would consider a qualitative data set in physics or a qualitative theory in chemistry. Quantitative methods are so central in these fields that they are often labelled "hard sciences". Here, we examine the question of whether psychology is ready to enter the "hard science club" as biology did in the forties. The facts that (a) over half of the statistical techniques used in psychology are less than 40 years old and that (b) the number of simulations in empirical papers has grown exponentially since the eighties both suggest that the answer is yes. The purpose of Tutorials in Quantitative Methods for Psychology is to provide concise and easy access to current methods.

  13. Electric Field Quantitative Measurement System and Method

    Science.gov (United States)

    Generazio, Edward R. (Inventor)

    2016-01-01

    A method and system are provided for making a quantitative measurement of an electric field. A plurality of antennas separated from one another by known distances are arrayed in a region that extends in at least one dimension. A voltage difference between at least one selected pair of antennas is measured. Each voltage difference is divided by the known distance associated with the selected pair of antennas corresponding thereto to generate a resulting quantity. The plurality of resulting quantities defined over the region quantitatively describe an electric field therein.
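    The measurement principle above, dividing each voltage difference by the known antenna separation, can be sketched as follows; the voltages and distances are illustrative values, not taken from the patent.

```python
# Field magnitude estimated per antenna pair as E = dV / d (V/m),
# assuming an approximately uniform field over each separation.
# The values below are illustrative only.
pairs = [
    {"dv": 3.00, "d": 0.250},   # voltage difference (V), separation (m)
    {"dv": 6.25, "d": 0.500},
    {"dv": 1.50, "d": 0.125},
]
fields = [p["dv"] / p["d"] for p in pairs]  # field estimate at each pair
print(fields)  # [12.0, 12.5, 12.0]
```

    The list of per-pair estimates is what quantitatively describes the field over the arrayed region.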

  14. Review paper. Quantitative methods in neuropathology.

    Science.gov (United States)

    Armstrong, Richard A

    2010-01-01

    The last decade has seen a considerable increase in the application of quantitative methods in the study of histological sections of brain tissue and especially in the study of neurodegenerative disease. These disorders are characterised by the deposition and aggregation of abnormal or misfolded proteins in the form of extracellular protein deposits such as senile plaques (SP) and intracellular inclusions such as neurofibrillary tangles (NFT). Quantification of brain lesions and studying the relationships between lesions and normal anatomical features of the brain, including neurons, glial cells, and blood vessels, has become an important method of elucidating disease pathogenesis. This review describes methods for quantifying the abundance of a histological feature such as density, frequency, and 'load' and the sampling methods by which quantitative measures can be obtained including plot/quadrant sampling, transect sampling, and the point-quarter method. In addition, methods for determining the spatial pattern of a histological feature, i.e., whether the feature is distributed at random, regularly, or is aggregated into clusters, are described. These methods include the use of the Poisson and binomial distributions, pattern analysis by regression, Fourier analysis, and methods based on mapped point patterns. Finally, the statistical methods available for studying the degree of spatial correlation between pathological lesions and neurons, glial cells, and blood vessels are described.
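    One of the simplest Poisson-based spatial tests alluded to above is the variance-to-mean (index of dispersion) test on quadrat counts; the lesion counts below are invented for illustration.

```python
# Index of dispersion on counts of lesions per sample quadrat:
# variance/mean ~ 1 suggests a random (Poisson) spatial distribution,
# > 1 suggests clustering, < 1 suggests regularity.
# Counts are illustrative, not real neuropathology data.
counts = [2, 0, 5, 1, 0, 7, 0, 3, 6, 0, 4, 8]

n = len(counts)
mean = sum(counts) / n
var = sum((c - mean) ** 2 for c in counts) / (n - 1)  # sample variance
ratio = var / mean
print(round(ratio, 2))  # > 1 here, consistent with clustered lesions
```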

  15. Surface Free Energy Determination of APEX Photosensitive Glass

    OpenAIRE

    William R. Gaillard; Emanuel Waddell; Williams, John D.

    2016-01-01

    Surface free energy (SFE) plays an important role in microfluidic device operation. Photosensitive glasses such as APEX offer numerous advantages over traditional glasses for microfluidics, yet the SFE for APEX has not been previously reported. We calculate SFE with the Owens/Wendt geometric method by using contact angles measured with the Sessile drop technique. While the total SFE for APEX is found to be similar to traditional microstructurable glasses, the polar component is lower, which i...
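    A sketch of the Owens/Wendt geometric-mean calculation from Sessile-drop contact angles, using two standard probe liquids; the contact angles below are assumed values, not the measurements reported for APEX glass.

```python
import math

# Owens/Wendt: gamma_L*(1+cos(theta)) = 2*(sqrt(gs_d*gl_d) + sqrt(gs_p*gl_p)).
# Dividing by 2*sqrt(gl_d) makes it linear in x = sqrt(gs_d), y = sqrt(gs_p);
# two probe liquids give a 2x2 system.
# Liquid data: (total, dispersive, polar) surface tension in mN/m,
# plus an ASSUMED contact angle in degrees (illustrative, not from the paper).
liquids = {
    "water":         (72.8, 21.8, 51.0, 70.0),
    "diiodomethane": (50.8, 50.8,  0.0, 40.0),
}

rows = []
for total, disp, polar, theta in liquids.values():
    lhs = total * (1 + math.cos(math.radians(theta))) / (2 * math.sqrt(disp))
    rows.append((1.0, math.sqrt(polar / disp), lhs))  # x + b*y = c

# Solve the 2x2 linear system by Cramer's rule.
(a1, b1, c1), (a2, b2, c2) = rows
det = a1 * b2 - a2 * b1
x = (c1 * b2 - c2 * b1) / det
y = (a1 * c2 - a2 * c1) / det

gamma_d, gamma_p = x ** 2, y ** 2  # dispersive and polar SFE components
print(round(gamma_d + gamma_p, 1), "mN/m total SFE")
```

    With these assumed angles the dispersive component dominates, which is the qualitative behaviour the abstract describes (a comparatively low polar component).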

  16. Quantitative Method of Measuring Metastatic Activity

    Science.gov (United States)

    Morrison, Dennis R. (Inventor)

    1999-01-01

    The metastatic potential of tumors can be evaluated by the quantitative detection of urokinase and DNA. The cell sample selected for examination is analyzed for the presence of high levels of urokinase and abnormal DNA using analytical flow cytometry and digital image analysis. Other factors, such as membrane-associated urokinase, increased DNA synthesis rates and certain receptors, can be used in the method for detection of potentially invasive tumors.

  17. apex: phylogenetics with multiple genes.

    Science.gov (United States)

    Jombart, Thibaut; Archer, Frederick; Schliep, Klaus; Kamvar, Zhian; Harris, Rebecca; Paradis, Emmanuel; Goudet, Jérome; Lapp, Hilmar

    2017-01-01

    Genetic sequences of multiple genes are becoming increasingly common for a wide range of organisms, including viruses, bacteria and eukaryotes. While such data may sometimes be treated as a single locus, in practice a number of biological and statistical phenomena can lead to phylogenetic incongruence. In such cases, different loci should, at least as a preliminary step, be examined and analysed separately. The R software has become a popular platform for phylogenetics, with several packages implementing distance-based, parsimony and likelihood-based phylogenetic reconstruction, and an even greater number of packages implementing phylogenetic comparative methods. Unfortunately, basic data structures and tools for analysing multiple genes have so far been lacking, thereby limiting the potential for investigating phylogenetic incongruence. In this study, we introduce the new R package apex to fill this gap. apex implements new object classes, which extend existing standards for storing DNA and amino acid sequences, and provides a number of convenient tools for handling, visualizing and analysing these data. We introduce the main features of the package and illustrate its functionalities through the analysis of a simple data set.

  18. Quantitative methods in classical perturbation theory.

    Science.gov (United States)

    Giorgilli, A.

    Poincaré proved that the series commonly used in celestial mechanics are typically non-convergent, although their usefulness is generally evident. Recent work in perturbation theory has shed light on this conjecture of Poincaré, bringing into evidence that the series of perturbation theory, although non-convergent in general, nevertheless furnish valuable approximations to the true orbits for a very long time, which in some practical cases can be comparable with the age of the universe. The aim of the paper is to introduce the quantitative methods of perturbation theory that make it possible to obtain such powerful results.

  19. Quantitative methods for management and economics

    CERN Document Server

    Chakravarty, Pulak

    2009-01-01

    "Quantitative Methods for Management and Economics" is specially prepared for MBA students in India and all over the world. It starts from the basics, so that even a beginner without much mathematical sophistication can grasp the ideas, and then moves on to more complex, professional problems. Thus both ordinary students and above-average (bright and sincere) students benefit equally from this book. Since most of the problems are solved or hints are given, students can do well within the short duration of the semesters of their busy course.

  20. Quantitative Efficiency Evaluation Method for Transportation Networks

    Directory of Open Access Journals (Sweden)

    Jin Qin

    2014-11-01

    Full Text Available An effective evaluation of transportation network efficiency/performance is essential to the establishment of sustainable development in any transportation system. Based on a redefinition of transportation network efficiency, a quantitative efficiency evaluation method for transportation network is proposed, which could reflect the effects of network structure, traffic demands, travel choice, and travel costs on network efficiency. Furthermore, the efficiency-oriented importance measure for network components is presented, which can be used to help engineers identify the critical nodes and links in the network. The numerical examples show that, compared with existing efficiency evaluation methods, the network efficiency value calculated by the method proposed in this paper can portray the real operation situation of the transportation network as well as the effects of main factors on network efficiency. We also find that the network efficiency and the importance values of the network components both are functions of demands and network structure in the transportation network.

  21. In vitro evaluation of two methods of ultrasonic irrigation on marginal adaptation of MTA plugs in open apex teeth: a SEM analysis.

    Science.gov (United States)

    Khalighinejad, Navid; Barekatine, Behnaz; Hasheminia, Seyed Mohsen; Gheibolahi, Hamed

    2014-01-01

    Different factors can affect the marginal adaptation of MTA. The present study was designed to investigate the effect of two ultrasonic irrigation methods on the marginal adaptation of MTA plugs in open apex teeth by scanning electron microscopy. Thirty single mature teeth were included in this in vitro experimental prospective study. An MTA plug 5 mm thick was inserted at the end of each canal, and after 24 h an ultrasonic file was used to irrigate the canals and remove MTA remnants. Teeth were randomly divided into three groups: in the first and second groups, the canals were irrigated for 1 min with 2.5% sodium hypochlorite with a #25 ultrasonic file in direct contact with and 1 mm away from the MTA plug, respectively. The third group was not irrigated and served as control. Transverse sections 1 mm thick were prepared through the coronal and apical parts of the MTA plug, and the specimens were prepared for SEM analysis. The extent of the gap was measured linearly under the SEM. Statistical analysis was performed using the Kruskal-Wallis test in SPSS software ver. 18 (α = 0.05). There was no significant difference between groups regarding marginal gap size in the apical (P = 0.17) or coronal sections (P = 0.33). However, the mean marginal gap size was higher in the apical section than in the coronal section. It can be concluded that ultrasonic irrigation does not adversely affect the marginal adaptation of MTA plugs.

  22. Quantitative gold nanoparticle analysis methods: A review.

    Science.gov (United States)

    Yu, Lei; Andriola, Angelo

    2010-08-15

    Research and development in the area of gold nanoparticle (AuNP) preparation, characterization, and applications have burgeoned in recent years. Many of the techniques and protocols are very mature, but two major concerns arise with the mass production and consumption of AuNP-based products. First, how many AuNPs exist in a dispersion? Second, where are the AuNPs after digestion by the environment, and how many are there? To answer these two questions, reliable and reproducible methods are needed to analyze the existence and population of AuNPs in samples. This review summarizes the most recent chemical and particle quantitative analysis methods that have been used to characterize the concentration (in moles of gold per liter) or population (in particles per mL) of AuNPs, including mass spectrometry, electroanalytical methods, spectroscopic methods, and particle counting methods. These methods may count the number of AuNPs directly or analyze the total concentration of elemental gold in an AuNP dispersion.
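    Converting between the two reported units, moles of gold per liter and particles per mL, requires only an assumed particle size; the sketch below assumes monodisperse spheres, and the inputs are illustrative.

```python
import math

# Convert a gold molar concentration into an AuNP number density,
# assuming monodisperse gold spheres of a known diameter.
NA = 6.02214076e23   # Avogadro constant, 1/mol
RHO_AU = 19.3        # bulk gold density, g/cm^3
M_AU = 196.97        # gold molar mass, g/mol

def particles_per_ml(molar_gold, diameter_nm):
    """AuNP number density (particles/mL) from gold molarity (mol/L)."""
    d_cm = diameter_nm * 1e-7                               # nm -> cm
    np_volume = (math.pi / 6) * d_cm ** 3                   # sphere volume, cm^3
    atoms_per_np = np_volume * RHO_AU / M_AU * NA           # Au atoms per particle
    gold_atoms_per_ml = molar_gold * NA / 1000.0            # mol/L -> atoms/mL
    return gold_atoms_per_ml / atoms_per_np

# Illustrative inputs: 0.1 mM gold, 15 nm particles.
print(f"{particles_per_ml(1e-4, 15):.2e} particles/mL")
```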

  23. Nailfold capillaroscopic report: qualitative and quantitative methods

    Directory of Open Access Journals (Sweden)

    S. Zeni

    2011-09-01

    Full Text Available Nailfold capillaroscopy (NVC) is a simple and non-invasive method used for the assessment of patients with Raynaud’s phenomenon (RP) and in the differential diagnosis of various connective tissue diseases. The scleroderma-pattern abnormalities (giant capillaries, haemorrhages and/or avascular areas) have a positive predictive value for the development of scleroderma spectrum disorders. Thus, an analytical approach to nailfold capillaroscopy can be useful for recording various parameters quantitatively and reproducibly. We developed a new method of assessing patients with RP that is capable of predicting the 5-year transition from isolated RP to RP secondary to scleroderma spectrum disorders. This model is a weighted combination of different capillaroscopic parameters (giant capillaries, microhaemorrhages, number of capillaries) that allows physicians to stratify RP patients easily, using a relatively simple diagram to deduce prognosis.

  24. Location of airports - selected quantitative methods

    Directory of Open Access Journals (Sweden)

    Agnieszka Merkisz-Guranowska

    2016-09-01

    Full Text Available Background: The role of air transport in the economic development of a country and its regions cannot be overestimated. The decision concerning an airport's location must be in line with the expectations of all the stakeholders involved. This article deals with the issues related to the choice of sites where airports should be located. Methods: Two main quantitative approaches related to the issue of airport location are presented in this article, i.e. the question of optimizing such a choice and the issue of selecting the location from a predefined set. The former involves mathematical programming and formulating the problem as an optimization task; the latter involves ranking the possible variants. Owing to their different methodological backgrounds, the authors present the advantages and disadvantages of both approaches and point to the one which currently has practical application. Results: Based on real-life examples, the authors present a multi-stage procedure which makes it possible to solve the problem of airport location. Conclusions: Based on the overview of the literature on the subject, the authors point to three types of approach to the issue of airport location which could enable further development of currently applied methods.

  25. Quantitative methods for assessing drug synergism.

    Science.gov (United States)

    Tallarida, Ronald J

    2011-11-01

    Two or more drugs that individually produce overtly similar effects will sometimes display greatly enhanced effects when given in combination. When the combined effect is greater than that predicted by their individual potencies, the combination is said to be synergistic. A synergistic interaction allows the use of lower doses of the combination constituents, a situation that may reduce adverse reactions. Drug combinations are quite common in the treatment of cancers, infections, pain, and many other diseases and situations. The determination of synergism is a quantitative pursuit that involves a rigorous demonstration that the combination effect is greater than that which is expected from the individual drug's potencies. The basis of that demonstration is the concept of dose equivalence, which is discussed here and applied to an experimental design and data analysis known as isobolographic analysis. That method, and a related method of analysis that also uses dose equivalence, are presented in this brief review, which provides the mathematical basis for assessing synergy and an optimization strategy for determining the dose combination.
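    The dose-equivalence idea behind isobolographic analysis can be captured in a one-line interaction index; the doses below are hypothetical, chosen only to illustrate the additivity criterion.

```python
# Interaction index from the additive isobole a/A + b/B = 1, where A and B
# are the doses of each drug ALONE producing the chosen effect (e.g. ED50)
# and (a, b) is a combination producing the same effect.
# index < 1 indicates synergy, == 1 additivity, > 1 sub-additivity.
def interaction_index(a, A, b, B):
    return a / A + b / B

# Hypothetical example: each drug alone needs 10.0 and 4.0 dose units,
# but 2.0 + 1.0 in combination achieves the same effect.
print(interaction_index(2.0, 10.0, 1.0, 4.0))  # 0.45 -> synergistic
```

    A combination lying exactly on the line a/A + b/B = 1 is what the additive prediction from the individual potencies looks like; points below that line are the synergistic cases described above.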

  26. [Progress in stable isotope labeled quantitative proteomics methods].

    Science.gov (United States)

    Zhou, Yuan; Shan, Yichu; Zhang, Lihua; Zhang, Yukui

    2013-06-01

    Quantitative proteomics is an important research field in the post-genomics era. There are two strategies for proteome quantification: label-free methods and stable isotope labeling methods, the latter of which have become the most important strategy for quantitative proteomics at present. In the past few years, a number of quantitative methods have been developed, supporting rapid progress in biological research. In this work, we discuss progress in stable isotope labeling methods for quantitative proteomics, covering both relative and absolute quantification, and then give our opinions on the outlook for proteome quantification methods.

  27. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

    Full Text Available Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations (COSO), 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that although the introduction of quantifying risk to enhance the degree of objectivity in finance, for instance, developed quite in parallel with its development in the manufacturing industry, the same is not true in Higher Education Institutions (HEIs). In this regard, the objective of the paper was to demonstrate the methods and process of Quantitative Risk Analysis (QRA) through the likelihood of occurrence of risk (phase I). This paper serves as the first of a two-phase study, which sampled one hundred (100) risk analysts at a University in the greater Eastern Cape Province of South Africa. The analysis of likelihood of occurrence of risk by logistic regression and percentages was conducted to investigate whether or not there was a significant difference between groups (analysts) in respect of QRA. The Hosmer and Lemeshow test was non-significant, with a chi-square (X2 = 8.181; p = 0.300), indicating a good model fit, since the data did not significantly deviate from the model. The study concluded that to derive an overall likelihood rating indicating the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat source motivation and capability, (2) nature of the vulnerability, and (3) existence and effectiveness of current controls (methods and process).

  8. Endovascular treatment for ruptured basilar apex aneurysm

    Directory of Open Access Journals (Sweden)

    Sheng LI

    2011-12-01

    Full Text Available Objective The present study aims to demonstrate the effectiveness and safety of endovascular interventional therapy for ruptured basilar apex aneurysm. Methods The imaging data, methods of endovascular treatment, and clinical results of 12 patients suffering from ruptured basilar apex aneurysms from January 2001 to December 2009 were retrospectively analyzed. The 12 patients comprised 5 males and 7 females, and their ages ranged from 21 to 58 years. Results Nine patients suffered from narrow-necked aneurysms, which were directly embolized, and the other three suffered from wide-necked aneurysms, which were embolized using a microstent. Eight aneurysms were completely embolized, and the other four were partly embolized. No rebleeding occurred within the follow-up period of 12 to 36 months, and all patients recovered well without neurological deficits. Conclusions Endovascular treatment for ruptured basilar apex aneurysm is therefore a semi-invasive, safe, and effective method.

  9. Credit Institutions Management Evaluation using Quantitative Methods

    Directory of Open Access Journals (Sweden)

    Nicolae Dardac

    2006-02-01

    Full Text Available The mission of state authorities in supervising credit institutions is mostly assimilated with systemic risk prevention. At present, the mission is oriented toward analyzing the risk profile of credit institutions and the mechanisms and existing systems that, as management tools, provide banks with the proper instruments to avoid and control specific banking risks. Rating systems are sophisticated measurement instruments capable of assuring the above objectives, such as success in banking risk management. Management quality is one of the most important elements in the set of variables used in the rating process for credit operations. Evaluation of this quality is, generally speaking, based on qualitative appreciations, which can induce subjectivity and heterogeneity into the rating. The problem can be solved by the complementary use of quantitative techniques such as DEA (Data Envelopment Analysis).

  10. Surface Free Energy Determination of APEX Photosensitive Glass

    Directory of Open Access Journals (Sweden)

    William R. Gaillard

    2016-02-01

    Full Text Available Surface free energy (SFE) plays an important role in microfluidic device operation. Photosensitive glasses such as APEX offer numerous advantages over traditional glasses for microfluidics, yet the SFE for APEX has not been previously reported. We calculate SFE with the Owens/Wendt geometric method, using contact angles measured with the sessile drop technique. While the total SFE for APEX is found to be similar to that of traditional microstructurable glasses, the polar component is lower, which is likely attributable to composition. The SFE was modified at each stage of device fabrication, but the SFE of the stock and fully processed glass was found to be approximately the same, at a value of 51 mJ·m⁻². APEX exhibited inconsistent wetting behavior attributable to an inhomogeneous surface chemical composition. Means to produce more consistent wetting of photosensitive glass for microfluidic applications are discussed.
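    The Owens/Wendt calculation referred to above can be sketched with two probe liquids. This is a minimal example, assuming standard literature values for the water and diiodomethane surface tension components (an assumption, not data from the article); the dispersive and polar components of the solid follow from the two contact angles:

```python
import math

# Reference probe-liquid data (total, dispersive, polar components, mJ/m^2).
# Standard literature values -- an assumption, not taken from the article.
LIQUIDS = {
    "water":         {"total": 72.8, "d": 21.8, "p": 51.0},
    "diiodomethane": {"total": 50.8, "d": 50.8, "p": 0.0},
}

def owens_wendt(theta_water_deg, theta_dim_deg):
    """Return (gamma_d, gamma_p, gamma_total) of the solid in mJ/m^2,
    from the Owens/Wendt relation:
    gamma_L (1 + cos theta) = 2 [sqrt(gd_S gd_L) + sqrt(gp_S gp_L)]."""
    def lhs(liq, theta):
        return liq["total"] * (1 + math.cos(math.radians(theta))) / 2.0
    dim = LIQUIDS["diiodomethane"]
    wat = LIQUIDS["water"]
    # Apolar liquid -> polar term vanishes: solve for sqrt(gamma_s^d)
    x = lhs(dim, theta_dim_deg) / math.sqrt(dim["d"])
    # The water equation then yields sqrt(gamma_s^p)
    y = (lhs(wat, theta_water_deg) - x * math.sqrt(wat["d"])) / math.sqrt(wat["p"])
    gd, gp = x * x, y * y
    return gd, gp, gd + gp

# Example with hypothetical contact angles (degrees):
gamma_d, gamma_p, gamma_total = owens_wendt(70.0, 45.0)
```

    Larger contact angles yield a lower computed SFE, and the polar component is driven mainly by the water angle, which is why the study could attribute APEX's low polar component to its composition.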

  11. Blending Qualitative & Quantitative Research Methods in Theses and Dissertations.

    Science.gov (United States)

    Thomas, R. Murray

    This guide discusses combining qualitative and quantitative research methods in theses and dissertations. It covers a wide array of methods, the strengths and limitations of each, and how they can be effectively interwoven into various research designs. The first chapter is "The Qualitative and the Quantitative." Part 1, "A Catalogue of…

  12. qualitative and quantitative methods of suicide research in old age

    African Journals Online (AJOL)

    This paper examines the merits of the qualitative and quantitative methods of suicide research in the .... Moreover, this type of data collection method may engender a higher .... illustrated ed: Springer Science & Business Media;. 2011. 204. 13.

  13. The APEX-SZ Instrument

    CERN Document Server

    Schwan, Daniel; Basu, Kaustuv; Bender, Amy N; Bertoldi, Frank; Cho, Hsaio-Mei; Chon, Guyong; Clarke, John; Dobbs, Matt; Ferrusca, Daniel; Gusten, Rolfe; Halverson, Nils W; Holzapfel, William L; Horellou, Cathy; Johansson, Daniel; Johnson, Bradley R; Kennedy, James; Kermish, Zigmund; Kneissl, Ruediger; Lanting, Trevor; Lee, Adrian T; Lueker, Martin; Mehl, Jared; Menten, Karl M; Muders, Dirk; Pacaud, Florian; Plagge, Thomas; Reichardt, Christian L; Richards, Paul L; Schaaf, Rienhold; Schilke, Peter; Sommer, Martin W; Spieler, Helmuth; Tucker, Carole; Weiss, Axel; Westbrook, Benjamin; Zahn, Oliver

    2010-01-01

    The APEX-SZ instrument is a millimeter-wave cryogenic receiver designed to observe galaxy clusters via the Sunyaev-Zel'dovich effect from the 12 m APEX telescope on the Atacama plateau in Chile. The receiver contains a focal plane of 280 superconducting transition-edge sensor (TES) bolometers instrumented with a frequency-domain multiplexed readout system. The bolometers are cooled to 280 mK via a three-stage helium sorption refrigerator and a mechanical pulse-tube cooler. Three warm mirrors, two 4 K lenses, and a horn array couple the TES bolometers to the telescope. APEX-SZ observes in a single frequency band at 150 GHz with 1' angular resolution and a 22' field-of-view, all well suited for cluster mapping. The APEX-SZ receiver has played a key role in the introduction of several new technologies including TES bolometers, the frequency-domain multiplexed readout, and the use of a pulse-tube cooler with bolometers. As a result of these new technologies, the instrument has a higher instantaneous sensitivity a...

  14. A Practical Qualitative+Quantitative Method-S-ANN

    Institute of Scientific and Technical Information of China (English)

    GUAN Wei; SHEN Jin-sheng; LI Peng-fei

    2002-01-01

    In this paper, a practical qualitative + quantitative method named S-ANN is proposed as a forecasting tool, in which an artificial neural network (ANN) from AI is used to handle the quantitative knowledge and the SCENARIO method from systems engineering is used to handle the qualitative knowledge. As a case study, the S-ANN method is employed to forecast the ridership of Beijing public transportation; the results show that the S-ANN method is feasible and easy to operate.

  15. From themes to hypotheses: following up with quantitative methods.

    Science.gov (United States)

    Morgan, David L

    2015-06-01

    One important category of mixed-methods research designs consists of quantitative studies that follow up on qualitative research. In this case, the themes that serve as the results from the qualitative methods generate hypotheses for testing through the quantitative methods. That process requires operationalization to translate the concepts from the qualitative themes into quantitative variables. This article illustrates these procedures with examples that range from simple operationalization to the evaluation of complex models. It concludes with an argument for not only following up qualitative work with quantitative studies but also the reverse, and doing so by going beyond integrating methods within single projects to include broader mutual attention from qualitative and quantitative researchers who work in the same field. © The Author(s) 2015.

  16. Measurement of Apex Offsets for Fiber Connector End Faces

    Institute of Scientific and Technical Information of China (English)

    XU Yong-xiang; ZHU Ri-hong; CHEN Lei

    2006-01-01

    As one of the most important geometric parameters of a PC-type fiber connector end face, apex offset can contribute to high insertion loss and a high back-reflection reading. A novel measurement method for this parameter, the connector rotating-π method, is proposed. With this method, the apex offset of a common connector end face is measured, and the result is compared with that measured by a Norland 3000 fiber connector end-face interferometer; the difference between the two results is 1.8 μm. Meanwhile, the influences of the relevant error sources on apex offset measurement under the rotating-π method and the apex-core method are analyzed, and two error equations are derived. The analysis shows that, compared with the apex-core method, if the two additional sub-tilts of the axis within, and perpendicular to, the principal plane caused by rotation are no bigger than the original axis tilt angle, the maximum measurement error is reduced by at least 22.5% with the rotating-π method. The practicability of the method is confirmed by experiments.

  17. Quantitative methods for studying design protocols

    CERN Document Server

    Kan, Jeff WT

    2017-01-01

    This book is aimed at researchers and students who would like to engage in and deepen their understanding of design cognition research. The book presents new approaches for analyzing design thinking and proposes methods of measuring design processes. These methods seek to quantify design issues and design processes that are defined based on notions from the Function-Behavior-Structure (FBS) design ontology and from linkography. A linkograph is a network of linked design moves or segments. FBS ontology concepts have been used in both design theory and design thinking research and have yielded numerous results. Linkography is one of the most influential and elegant design cognition research methods. In this book Kan and Gero provide novel and state-of-the-art methods of analyzing design protocols that offer insights into design cognition by integrating segmentation with linkography by assigning FBS-based codes to design moves or segments and treating links as FBS transformation processes. They propose and test ...

  18. Quantitative Hydrocarbon Energies from the PMO Method.

    Science.gov (United States)

    Cooper, Charles F.

    1979-01-01

    Details a procedure for accurately calculating the quantum mechanical energies of hydrocarbons using the perturbational molecular orbital (PMO) method, which does not require the use of a computer. (BT)

  19. Orthogonal Series Methods for Both Qualitative and Quantitative Data

    OpenAIRE

    Hall, Peter

    1983-01-01

    We introduce and describe orthogonal series methods for estimating the density of qualitative, quantitative or mixed data. The techniques are completely nonparametric in character, and so may be used in situations where parametric models are difficult to construct. Just this situation arises in the context of mixed--both qualitative and quantitative--data, where there are few parametric models.

  20. Optimization of statistical methods impact on quantitative proteomics data

    NARCIS (Netherlands)

    Pursiheimo, A.; Vehmas, A.P.; Afzal, S.; Suomi, T.; Chand, T.; Strauss, L.; Poutanen, M.; Rokka, A.; Corthals, G.L.; Elo, L.L.

    2015-01-01

    As tools for quantitative label-free mass spectrometry (MS) rapidly develop, a consensus about the best practices is not apparent. In the work described here we compared popular statistical methods for detecting differential protein expression from quantitative MS data using both controlled

  1. Novel method for quantitative estimation of biofilms

    DEFF Research Database (Denmark)

    Syal, Kirtimaan

    2017-01-01

    Biofilm protects bacteria from stress and hostile environment. Crystal violet (CV) assay is the most popular method for biofilm determination adopted by different laboratories so far. However, biofilm layer formed at the liquid-air interphase known as pellicle is extremely sensitive to its washin...

  2. Development of Three Methods for Simultaneous Quantitative ...

    African Journals Online (AJOL)

    Purpose: To develop new selective, precise, and accurate methods for the ... of the mixture on silica gel plates using chloroform: methanol (93:7, v/v) as a mobile phase. ... Thus, they are suitable for use in quality control (QC) laboratories and ...

  3. Review of Quantitative Software Reliability Methods

    Energy Technology Data Exchange (ETDEWEB)

    Chu, T.L.; Yue, M.; Martinez-Guridi, M.; Lehner, J.

    2010-09-17

    The current U.S. Nuclear Regulatory Commission (NRC) licensing process for digital systems rests on deterministic engineering criteria. In its 1995 probabilistic risk assessment (PRA) policy statement, the Commission encouraged the use of PRA technology in all regulatory matters to the extent supported by the state-of-the-art in PRA methods and data. Although many activities have been completed in the area of risk-informed regulation, the risk-informed analysis process for digital systems has not yet been satisfactorily developed. Since digital instrumentation and control (I&C) systems are expected to play an increasingly important role in nuclear power plant (NPP) safety, the NRC established a digital system research plan that defines a coherent set of research programs to support its regulatory needs. One of the research programs included in the NRC's digital system research plan addresses risk assessment methods and data for digital systems. Digital I&C systems have some unique characteristics, such as using software, and may have different failure causes and/or modes than analog I&C systems; hence, their incorporation into NPP PRAs entails special challenges. The objective of the NRC's digital system risk research is to identify and develop methods, analytical tools, and regulatory guidance for (1) including models of digital systems into NPP PRAs, and (2) using information on the risks of digital systems to support the NRC's risk-informed licensing and oversight activities. For several years, Brookhaven National Laboratory (BNL) has worked on NRC projects to investigate methods and tools for the probabilistic modeling of digital systems, as documented mainly in NUREG/CR-6962 and NUREG/CR-6997. However, the scope of this research principally focused on hardware failures, with limited reviews of software failure experience and software reliability methods. NRC also sponsored research at the Ohio State University investigating the modeling of

  4. Change of time methods in quantitative finance

    CERN Document Server

    Swishchuk, Anatoliy

    2016-01-01

    This book is devoted to the history of Change of Time Methods (CTM), the connections of CTM to stochastic volatilities and finance, fundamental aspects of the theory of CTM, basic concepts, and its properties. An emphasis is given on many applications of CTM in financial and energy markets, and the presented numerical examples are based on real data. The change of time method is applied to derive the well-known Black-Scholes formula for European call options, and to derive an explicit option pricing formula for a European call option for a mean-reverting model for commodity prices. Explicit formulas are also derived for variance and volatility swaps for financial markets with a stochastic volatility following a classical and delayed Heston model. The CTM is applied to price financial and energy derivatives for one-factor and multi-factor alpha-stable Levy-based models. Readers should have a basic knowledge of probability and statistics, and some familiarity with stochastic processes, such as Brownian motion, ...

  5. Some thoughts on humanitarian logistics and quantitative methods

    CSIR Research Space (South Africa)

    Cooper, Antony K

    2008-05-01

    Full Text Available Some of the research issues in humanitarian logistics and quantitative methods discussed in this presentation are Identifying people in a disaster; Facilitating movement of people and aid; Geographic Information Services (GIS) to support...

  6. Researching on quantitative project management plan and implementation method

    Science.gov (United States)

    Wang, Xin; Ren, Aihua; Liu, Xiangshang

    2017-08-01

    With the practice of high-maturity process improvement, more and more attention has been paid to CMMI and other process improvement frameworks. The key to achieving high process maturity is to quantify the process. At present, descriptions of high-maturity software process improvement lack a specific introduction to the corresponding improvement steps or their implementation. In this paper, based on the quantitative management framework and the statistical analysis techniques recommended for high maturity, we propose planning and implementation methods for enterprise process improvement. These methods provide the enterprise with quantitative process management, give projects a systematic process for quantitative management, and finally evaluate the effectiveness of quantitatively managed projects. The method is then used to verify the effectiveness of the framework in guiding the enterprise toward high-maturity process improvement.

  7. Oracle APEX 4.2 cookbook

    CERN Document Server

    Van Zoest, Michel

    2013-01-01

    As a cookbook, this book enables you to create APEX web applications and to implement features with immediately usable recipes that unleash the powerful functionality of Oracle APEX 4.2. Each recipe is presented as a separate, standalone entity, and the reading of other, prior recipes is not required. It can be seen as a reference and a practical guide to APEX development. This book is aimed both at developers new to the APEX environment and at intermediate developers. More advanced developers will also gain from the information at hand. If you are new to APEX you will find recipes to start develo

  8. Comparison of subtemporal versus presigmoidal approaches for exposing petrous apex utilizing virtual reality technique

    Directory of Open Access Journals (Sweden)

    Ke TANG

    2016-08-01

    Full Text Available Objective To perform a quantitative comparison of microanatomical features between the subtemporal and presigmoidal minimally invasive approaches for exposing the petrous apex, on the basis of a virtual reality image model. Methods CT and MRI were performed on 15 adult cadaver heads (30 sides) to establish a virtual reality three-dimensional anatomical model of the petrous apex. The superior edge of the root of the temporal bone zygomatic process and the mastoidale on the calvaria were selected as craniotomy landmark points for the subtemporal and presigmoidal approaches. The petrous apex was selected as the exposure landmark point on the skull base. The lines between the craniotomy and exposure landmark points were used as axes to outline cylinders simulating the surgical routes of the subtemporal and presigmoidal approaches. Anatomical exposures in the two surgical routes were observed and measured, and statistical comparison was performed by paired t-test. Results The surgical route of the subtemporal approach passed through the middle skull base and temporal lobe to reach the petrous apex. Petrous bone drilling was performed to expose the internal acoustic meatus, facial nerve and labyrinth; then the trigeminal nerve, superior petrosal sinus and cavernous sinus were exposed. The surgical route of the presigmoidal approach was performed by drilling the petrous bone through the mastoid, passing the vertical segment of the facial nerve. The glomus jugulare, lower cranial nerves, ossicular chain, labyrinth and internal carotid artery (ICA) were then exposed in turn. Reaching the internal acoustic meatus, the route exposed the anterior inferior cerebellar artery (AICA) and the facial-acoustic nerve complex. Reaching the petrous apex, the route involved the superior cerebellar artery, superior petrosal sinus, inferior petrosal sinus, cavernous sinus, trigeminal nerve and part of the temporal lobe. The volumes of route, osseous structures, facial-acoustic complex, labyrinth and vein involved in the presigmoidal approach were more than those in subtemporal

  9. A quantitative method for silica flux evaluation

    Science.gov (United States)

    Schonewille, R. H.; O'Connell, G. J.; Toguri, J. M.

    1993-02-01

    In the smelting of copper and copper/nickel concentrates, the role of silica flux is to aid in the removal of iron by forming a slag phase. Alternatively, the role of flux may be regarded as a means of controlling the formation of magnetite, which can severely hinder the operation of a furnace. To adequately control the magnetite level, the flux must react rapidly with all of the FeO within the bath. In the present study, a rapid method for silica flux evaluation that can be used directly in the smelter has been developed. Samples of flux are mixed with iron sulfide and magnetite and then smelted at a temperature of 1250 °C. Argon was swept over the reaction mixture and analyzed continuously for sulfur dioxide. The sulfur dioxide concentration with time was found to contain two peaks, the first one being independent of the flux content of the sample. A flux quality parameter has been defined as the height-to-time ratio of the second peak. The value of this parameter for pure silica is 5100 ppm/min. The effects of silica content, silica particle size, and silicate mineralogy were investigated. It was found that a limiting flux quality is achieved for particle sizes less than 0.1 mm in diameter and that fluxes containing feldspar are generally of a poorer quality. The relative importance of free silica and melting point was also studied using synthetic flux mixtures, with free silica displaying the strongest effect.
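    The flux quality parameter defined above (the height-to-time ratio of the second SO2 peak) can be sketched as follows. The peak picking is a naive local-maximum search and the two-Gaussian trace is synthetic; both are illustrative assumptions, not the authors' procedure:

```python
import math

def flux_quality(times, so2_ppm):
    """Flux quality parameter (ppm/min): height-to-time ratio of the
    second SO2 peak in the off-gas trace. Peak picking is a naive
    local-maximum search, sufficient for a smooth trace."""
    peaks = [(times[i], so2_ppm[i])
             for i in range(1, len(times) - 1)
             if so2_ppm[i - 1] < so2_ppm[i] >= so2_ppm[i + 1]]
    if len(peaks) < 2:
        raise ValueError("expected at least two peaks in the SO2 trace")
    t2, h2 = peaks[1]          # the second peak characterizes the flux
    return h2 / t2

# Synthetic two-peak SO2 trace (Gaussian shapes, arbitrary parameters):
times = [i * 0.1 for i in range(101)]                       # minutes
trace = [3000 * math.exp(-((t - 2.0) ** 2) / 0.5) +
         5100 * math.exp(-((t - 6.0) ** 2) / 0.5) for t in times]
q = flux_quality(times, trace)
```

    For this synthetic trace the second peak (5100 ppm at 6 min) gives a quality of 850 ppm/min; the study's benchmark for pure silica is 5100 ppm/min.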

  10. Optimization method for quantitative calculation of clay minerals in soil

    Indian Academy of Sciences (India)

    Libo Hao; Qiaoqiao Wei; Yuyan Zhao; Zilong Lu; Xinyun Zhao

    2015-04-01

    Determination of types and amounts for clay minerals in soil are important in environmental, agricultural, and geological investigations. Many reliable methods have been established to identify clay mineral types. However, no reliable method for quantitative analysis of clay minerals has been established so far. In this study, an attempt was made to propose an optimization method for the quantitative determination of clay minerals in soil based on bulk chemical composition data. The fundamental principles and processes of the calculation are elucidated. Some samples were used for reliability verification of the method and the results prove the simplicity and efficacy of the approach.
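    The core idea of fitting mineral abundances to bulk chemistry can be illustrated with a toy least-squares sketch. Everything here is an assumption for illustration: the two-mineral restriction, the normal-equations solver, and the oxide signatures are not taken from the article, which does not publish its algorithm in this abstract:

```python
def solve_fractions(mineral_oxides, bulk):
    """Least-squares estimate of two mineral weight fractions from bulk
    oxide data, via the 2x2 normal equations. A toy sketch of fitting
    mineral abundances to bulk chemical composition."""
    a, b = mineral_oxides        # oxide vectors of the two minerals (wt%)
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
    m11, m12, m22 = dot(a, a), dot(a, b), dot(b, b)
    r1, r2 = dot(a, bulk), dot(b, bulk)
    det = m11 * m22 - m12 * m12  # nonzero when signatures are independent
    x1 = (r1 * m22 - r2 * m12) / det
    x2 = (m11 * r2 - m12 * r1) / det
    return x1, x2

# Illustrative oxide signatures (SiO2, Al2O3, MgO wt%) for two clays:
kaolinite = [46.5, 39.5, 0.0]
smectite  = [60.0, 22.0, 4.0]
# A synthetic 30/70 mixture should be recovered exactly:
bulk = [0.3 * k + 0.7 * s for k, s in zip(kaolinite, smectite)]
x_kaol, x_smec = solve_fractions((kaolinite, smectite), bulk)
```

    Real soils require more minerals than oxide constraints allow to be solved exactly, which is why an optimization (rather than direct inversion) approach such as the one the authors propose is needed.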

  11. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    Science.gov (United States)

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of the quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media or plate-count technique on the estimated microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpreting errors) was established. The most appropriate method for statistical analysis of such data was ANOVA, which enabled not only the effects of individual factors but also their interactions to be estimated. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, appropriate for traditional plate-count methods.
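    Combining independent uncertainty elements "mathematically", as described above, is conventionally a root-sum-of-squares of the relative components. A minimal sketch (the four component values are illustrative, chosen only so the combined value lands under the reported 35%; they are not the study's data):

```python
import math

def combined_relative_uncertainty(components):
    """Root-sum-of-squares combination of independent relative
    uncertainty components: u_c = sqrt(sum(u_i^2))."""
    return math.sqrt(sum(u * u for u in components))

# Hypothetical components: microorganism type, product effect,
# reading error, interpretation error (as relative fractions).
u = combined_relative_uncertainty([0.20, 0.15, 0.18, 0.10])
```

    With these illustrative components the combined relative uncertainty is about 32%, consistent in magnitude with the ≤35% figure the study reports for plate-count methods.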

  12. [Reconstituting evaluation methods based on both qualitative and quantitative paradigms].

    Science.gov (United States)

    Miyata, Hiroaki; Okubo, Suguru; Yoshie, Satoru; Kai, Ichiro

    2011-01-01

    Debate about the relationship between quantitative and qualitative paradigms is often muddled and confusing and the clutter of terms and arguments has resulted in the concepts becoming obscure and unrecognizable. In this study we conducted content analysis regarding evaluation methods of qualitative healthcare research. We extracted descriptions on four types of evaluation paradigm (validity/credibility, reliability/credibility, objectivity/confirmability, and generalizability/transferability), and classified them into subcategories. In quantitative research, there has been many evaluation methods based on qualitative paradigms, and vice versa. Thus, it might not be useful to consider evaluation methods of qualitative paradigm are isolated from those of quantitative methods. Choosing practical evaluation methods based on the situation and prior conditions of each study is an important approach for researchers.

  13. Optimization of Statistical Methods Impact on Quantitative Proteomics Data.

    Science.gov (United States)

    Pursiheimo, Anna; Vehmas, Anni P; Afzal, Saira; Suomi, Tomi; Chand, Thaman; Strauss, Leena; Poutanen, Matti; Rokka, Anne; Corthals, Garry L; Elo, Laura L

    2015-10-02

    As tools for quantitative label-free mass spectrometry (MS) rapidly develop, a consensus about the best practices is not apparent. In the work described here we compared popular statistical methods for detecting differential protein expression from quantitative MS data using both controlled experiments with known quantitative differences for specific proteins used as standards as well as "real" experiments where differences in protein abundance are not known a priori. Our results suggest that data-driven reproducibility-optimization can consistently produce reliable differential expression rankings for label-free proteome tools and are straightforward in their application.

  14. QUALITATIVE AND QUANTITATIVE METHODS OF SUICIDE RESEARCH IN OLD AGE.

    Science.gov (United States)

    Ojagbemi, A

    2017-06-01

    This paper examines the merits of the qualitative and quantitative methods of suicide research in the elderly using two studies identified through a free search of the Pubmed database for articles that might have direct bearing on suicidality in the elderly. The studies have been purposively selected for critical appraisal because they meaningfully reflect the quantitative and qualitative divide as well as the social, economic, and cultural boundaries between the elderly living in sub-Saharan Africa and Europe. The paper concludes that an integration of both the qualitative and quantitative research approaches may provide a better platform for unraveling the complex phenomenon of suicide in the elderly.

  15. Assessment of central venous catheter-associated infections using semi-quantitative or quantitative culture methods

    Directory of Open Access Journals (Sweden)

    E. L. Pizzolitto

    2009-01-01

    Full Text Available

    Semiquantitative (Maki) and quantitative (Brun-Buisson) culture techniques were employed in the diagnosis of catheter-related bloodstream infections (CRBSI) in patients who had a short-term central venous catheter (inserted for 30 days). The diagnosis of CRBSI was based on the results of semiquantitative and quantitative culture of material from the removed catheters. Catheter tips (118) from 100 patients were evaluated by both methods. Semiquantitative analysis revealed 34 catheters (28.8%) colonized by ≥15 colony-forming units (CFU), while quantitative cultures (34 catheters, 28.8%) showed growth of ≥10³ CFU/mL. Bacteremia was confirmed in four patients by isolating microorganisms of identical species from both catheters and blood samples. Using the semiquantitative culture technique on short-term central venous catheter tips, with a cut-off level of ≥15 CFU, we found 100.0% sensitivity, 68.4% specificity, 25.0% positive predictive value (PPV), 100.0% negative predictive value (NPV), 71.4% efficiency, and 9.5% prevalence. The quantitative method, with a cut-off limit of ≥10³ CFU/mL, gave identical values: sensitivity 100.0%, specificity 68.4%, PPV 25.0%, NPV 100.0%, efficiency 71.4%, and prevalence 9.5%. We conclude that the semiquantitative and quantitative culture methods, evaluated in parallel for the first time in Brazil, have similar sensitivity and specificity. Keywords: central venous catheter; semi-quantitative culture; quantitative culture; catheter-related bacteremia.
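    The reported diagnostic indices all follow from a standard 2×2 contingency table. A minimal sketch (the patient counts below are reconstructed so as to match the reported percentages; the abstract itself does not state the full table):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard diagnostic indices from a 2x2 table of
    true/false positives and negatives."""
    total = tp + fp + fn + tn
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),             # positive predictive value
        "npv": tn / (tn + fn),             # negative predictive value
        "efficiency": (tp + tn) / total,   # overall accuracy
        "prevalence": (tp + fn) / total,
    }

# A 2x2 table consistent with the figures reported above
# (reconstructed counts, an assumption): 4 confirmed CRBSI, all
# culture-positive; 12 false positives; 26 true negatives.
m = diagnostic_metrics(tp=4, fp=12, fn=0, tn=26)
```

    These counts reproduce the study's figures: sensitivity 100.0%, specificity 68.4%, PPV 25.0%, NPV 100.0%, efficiency 71.4%, prevalence 9.5%.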

  16. A Quantitative Evaluation Method for Transportation Network Efficiency

    Directory of Open Access Journals (Sweden)

    Jin Qin

    2014-02-01

    Full Text Available The efficiency of a transportation network is a comprehensive performance index of the network. In order to evaluate the operation of a transportation network objectively, a scientific quantitative evaluation method for network efficiency is very important. In this study, a new quantitative evaluation method for transportation network efficiency is developed, which can evaluate transportation network performance comprehensively and reasonably. The method is defined in the context of network equilibrium and reflects the influences of travel behavior, travel cost, travel demand, and link flows, all in the equilibrium state, on network efficiency. The computed results are compared with those of a previously proposed method in a numerical example, showing that the new method can quantitatively reflect the influence of traffic flows, user behavior, and network structure on transportation network efficiency, in accordance with the practical situation.

  17. Improvements to the APEX apparatus

    Energy Technology Data Exchange (ETDEWEB)

    Ahmad, I.; Back, B.B.; Betts, R.R. [and others

    1995-08-01

    A number of technical issues led us to rework extensively the APEX apparatus in summer 1994. During the earlier runs, a significant fraction of the 432 silicon detector elements showed degraded resolution such that they had to be excluded from the final analysis in software. The effect of this is to reduce the efficiency of APEX and possibly also to introduce holes in the acceptance which, for some perhaps exotic scenarios, might reduce the acceptance to an unacceptably low level. Also, the energy thresholds below which it is not possible to generate timing information from the silicon detectors, were high enough that the low-energy acceptance of APEX was compromised to a significant extent. The origins of these difficulties were in part due to degraded performance of the silicon detectors themselves, problems with the silicon cooling systems and electronics problems. Both silicon arrays were disassembled and sub-standard detectors replaced, all detectors were also cleaned with the result that all detectors now performed at the specified values of leakage current. The silicon cooling systems were disassembled and rebuilt with the result that many small leaks were fixed. Defective electronics channels were repaired or replaced. The rotating target wheel was also improved with the installation of new bearings and a computer-controlled rotation and readout system. The rebuilt wheel can now run at speeds up to 900 rpm for weeks on end without breakdown. The target wheel and associated beam sweeping now work extremely well so that low-melting-point targets such as Pb and In can be used in quite intense beams without melting.

  18. Proceedings First Workshop on Quantitative Formal Methods: Theory and Applications

    CERN Document Server

    Andova, Suzana; D'Argenio, Pedro; Cuijpers, Pieter; Markovski, Jasen; Morgan, Caroll; Núñez, Manuel; 10.4204/EPTCS.13

    2009-01-01

    This volume contains the papers presented at the 1st workshop on Quantitative Formal Methods: Theory and Applications, which was held in Eindhoven on 3 November 2009 as part of the International Symposium on Formal Methods 2009. This volume contains the final versions of all contributions accepted for presentation at the workshop.

  19. Introduction to quantitative research methods an investigative approach

    CERN Document Server

    Balnaves, Mark

    2001-01-01

    Introduction to Quantitative Research Methods is a student-friendly introduction to quantitative research methods and basic statistics. It uses a detective theme throughout the text and in multimedia courseware to show how quantitative methods have been used to solve real-life problems. The book focuses on principles and techniques that are appropriate to introductory-level courses in media, psychology and sociology. Examples and illustrations are drawn from historical and contemporary research in the social sciences. The multimedia courseware provides tutorial work on sampling, basic statistics, and techniques for seeking information from databases and other sources. The statistics modules can be used either as part of the detective games or directly in teaching and learning. Brief video lessons in SPSS, using real datasets, are also a feature of the CD-ROM.

  1. Quantitative sociodynamics stochastic methods and models of social interaction processes

    CERN Document Server

    Helbing, Dirk

    1995-01-01

    Quantitative Sociodynamics presents a general strategy for interdisciplinary model building and its application to a quantitative description of behavioural changes based on social interaction processes. Originally, the crucial methods for the modeling of complex systems (stochastic methods and nonlinear dynamics) were developed in physics but they have very often proved their explanatory power in chemistry, biology, economics and the social sciences. Quantitative Sociodynamics provides a unified and comprehensive overview of the different stochastic methods, their interrelations and properties. In addition, it introduces the most important concepts from nonlinear dynamics (synergetics, chaos theory). The applicability of these fascinating concepts to social phenomena is carefully discussed. By incorporating decision-theoretical approaches a very fundamental dynamic model is obtained which seems to open new perspectives in the social sciences. It includes many established models as special cases, e.g. the log...

  2. Quantitative Sociodynamics Stochastic Methods and Models of Social Interaction Processes

    CERN Document Server

    Helbing, Dirk

    2010-01-01

    This new edition of Quantitative Sociodynamics presents a general strategy for interdisciplinary model building and its application to a quantitative description of behavioral changes based on social interaction processes. Originally, the crucial methods for the modeling of complex systems (stochastic methods and nonlinear dynamics) were developed in physics and mathematics, but they have very often proven their explanatory power in chemistry, biology, economics and the social sciences as well. Quantitative Sociodynamics provides a unified and comprehensive overview of the different stochastic methods, their interrelations and properties. In addition, it introduces important concepts from nonlinear dynamics (e.g. synergetics, chaos theory). The applicability of these fascinating concepts to social phenomena is carefully discussed. By incorporating decision-theoretical approaches, a fundamental dynamic model is obtained, which opens new perspectives in the social sciences. It includes many established models a...

  3. Oracle Apex reporting tips and tricks

    CERN Document Server

    Bara, George

    2013-01-01

    Take advantage of all the exciting reporting features of Oracle Application Express 4.2. Designed for a hands-on approach, this book contains in-depth practical guidelines from George Bara, a well-known Oracle APEX expert and blogger. From classic to interactive reports, web services and PDF printing, "Oracle Apex Reporting Tips & Tricks" is a must-have for all database developers who want to make the most of the Oracle APEX reporting engine.

  4. An overview of electronic apex locators: part 1.

    Science.gov (United States)

    Ali, R; Okechukwu, N C; Brunton, P; Nattress, B

    2013-02-01

    To effectively carry out root canal therapy, the clinician must accurately determine the apical limit of the root canal system as well as the position of the canal terminus. Its position can be estimated using a variety of techniques, including radiographs, tactile feedback from endodontic instruments and electronic apex locators. This article describes the micro-anatomy of the apical terminus, different methods of measuring root canal system length and how a tooth can function as an electrical capacitor. This capacitor model represents a starting point upon which all apex locators are based. An understanding of this model can help the practitioner to optimise the use of apex locators, understand their limitations and avoid errors that can occur.

  5. Oracle APEX 4.2 reporting

    CERN Document Server

    Pathak, Vishal

    2013-01-01

    Oracle APEX 4.2 Reporting is a practical tutorial for intermediate to advanced use, with plenty of step-by-step instructions and business scenarios for understanding and implementing the ins and outs of making reports. "Oracle APEX 4.2 Reporting" is for you if you design or develop advanced solutions in APEX or wish to know about the advanced features of APEX. If you wish to have a 360-degree view of reporting technologies or work in a complex heterogeneous enterprise, this is a must-have.

  6. A review of methods for quantitative evaluation of spinal curvature

    OpenAIRE

    2015-01-01

    The aim of this paper is to provide a complete overview of the existing methods for quantitative evaluation of spinal curvature from medical images, and to summarize the relevant publications, which may not only assist in the introduction of other researchers to the field, but also be a valuable resource for studying the existing methods or developing new methods and evaluation strategies. Key evaluation issues and future considerations, supported by the results of the overview, are also discussed.

  7. A review of methods for quantitative evaluation of spinal curvature.

    Science.gov (United States)

    Vrtovec, Tomaz; Pernus, Franjo; Likar, Bostjan

    2009-05-01

    The aim of this paper is to provide a complete overview of the existing methods for quantitative evaluation of spinal curvature from medical images, and to summarize the relevant publications, which may not only assist in the introduction of other researchers to the field, but also be a valuable resource for studying the existing methods or developing new methods and evaluation strategies. Key evaluation issues and future considerations, supported by the results of the overview, are also discussed.

  8. Analysis of Forecasting Sales By Using Quantitative And Qualitative Methods

    Directory of Open Access Journals (Sweden)

    B. Rama Sanjeeva Sresta,

    2016-09-01

    Full Text Available This paper focuses on the analysis of sales forecasting using quantitative and qualitative methods. The forecast should help create a model for measuring success and setting goals from financial and operational viewpoints. The resulting model should tell whether we have met our goals with respect to measures, targets, and initiatives.

  9. Interference of electronic apex locators with implantable cardioverter defibrillators

    NARCIS (Netherlands)

    Idzahi, K.; de Cock, C.C.; Shemesh, H.; Brand, H.S.

    2014-01-01

    Introduction The purpose of this in vitro study was to evaluate the potential electromagnetic interference of electronic apex locators (EALs) on implantable cardioverter defibrillators (ICDs). Methods Four different EALs were tested for their ability to interfere with the correct function of 3 diffe

  10. A quantitative method for evaluating alternatives. [aid to decision making

    Science.gov (United States)

    Forthofer, M. J.

    1981-01-01

    When faced with choosing between alternatives, people tend to use a number of criteria (often subjective, rather than objective) to decide which is the best alternative for them given their unique situation. The subjectivity inherent in the decision-making process can be reduced by the definition and use of a quantitative method for evaluating alternatives. This type of method can help decision makers achieve a degree of uniformity and completeness in the evaluation process, as well as an increased sensitivity to the factors involved. Additional side effects are better documentation and visibility of the rationale behind the resulting decisions. General guidelines for defining a quantitative method are presented, and a particular method (called 'hierarchical weighted average') is defined and applied to the evaluation of design alternatives for a hypothetical computer system capability.
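
    The abstract names the 'hierarchical weighted average' but does not define it here; the sketch below is my own minimal reading of such a scheme, with two hypothetical criteria, hypothetical weights, and invented 0-10 scores for two design alternatives.

```python
def weighted_average(scores, weights):
    """Weighted mean; weights need not sum to 1."""
    return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

def hierarchical_score(alternative, hierarchy):
    """Two-level weighted average.
    hierarchy: list of (criterion_weight, sub_weights) pairs.
    alternative: list of sub-score lists, one per criterion."""
    crit_scores = [weighted_average(subs, sub_w)
                   for subs, (_, sub_w) in zip(alternative, hierarchy)]
    return weighted_average(crit_scores, [w for w, _ in hierarchy])

# Hypothetical hierarchy: cost (weight 0.6) and performance (weight 0.4),
# each split into two weighted sub-criteria.
hierarchy = [(0.6, [0.5, 0.5]), (0.4, [0.7, 0.3])]
alt_a = [[8, 6], [5, 9]]   # invented sub-criterion scores, 0-10 scale
alt_b = [[4, 7], [9, 8]]
score_a = hierarchical_score(alt_a, hierarchy)
score_b = hierarchical_score(alt_b, hierarchy)
```

    Under these invented weights, alternative B edges out A; the point of such a method is that the ranking and its rationale are explicit and auditable.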

  11. Quantitative methods for the analysis of electron microscope images

    DEFF Research Database (Denmark)

    Skands, Peter Ulrik Vallø

    1996-01-01

    The topic of this thesis is a general introduction to quantitative methods for the analysis of digital microscope images. The images presented have primarily been acquired from scanning electron microscopes (SEM) and interferometer microscopes (IFM). The topic is approached through several examples...... foundation of the thesis falls in the areas of: 1) Mathematical Morphology; 2) Distance transforms and applications; and 3) Fractal geometry. Image analysis opens, in general, the possibility of a quantitative and statistically well-founded measurement of digital microscope images. Herein lie also the conditions...

  12. A rare case of petrous apex osteoma.

    Science.gov (United States)

    Cece, Hasan; Yildiz, Sema; Iynen, Ismail; Karakas, Omer; Karakas, Ekrem; Dogan, Ferit

    2012-06-01

    Osteomas are the most common tumours of the cranial vault and facial skeleton. Temporal bone osteoma is a rare entity. An osteoma arising from the petrous apex is extremely rare. We present a case of osteoma arising from the petrous apex followed by a discussion of the etiology, presentation, and radiologic findings.

  13. A method for quantitatively determining temporomandibular joint bony relationships.

    Science.gov (United States)

    Blaschke, D D; Blaschke, T J

    1981-01-01

    The spatial relationship of the mandibular condyle to the temporal component of the TMJ is determined quantitatively from lateral tomograms using a new method of measuring specific areas of the joint space. Condyle position along the posteroanterior axis of the joint is expressed as a ratio of the posterior joint space area divided by the anterior joint space area (the P/A ratio). A description of the method, as used in a clinical evaluation of 50 asymptomatic TMJs, and a statistical evaluation of the method's reproducibility and accuracy are given. The method was found to have both high reproducibility and high accuracy.
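
    Once the two joint-space areas have been traced on the tomogram, the P/A ratio itself is a one-line computation; the areas below are invented for illustration.

```python
def pa_ratio(posterior_area, anterior_area):
    """Posterior joint space area divided by anterior joint space area."""
    if anterior_area <= 0:
        raise ValueError("anterior joint space area must be positive")
    return posterior_area / anterior_area

# Hypothetical areas (mm^2) measured on a lateral tomogram.
ratio = pa_ratio(posterior_area=12.0, anterior_area=15.0)
```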

  14. Petrous apex lesions in the pediatric population

    Energy Technology Data Exchange (ETDEWEB)

    Radhakrishnan, Rupa [University of Cincinnati College of Medicine, Department of Radiology, Cincinnati, OH (United States); Cincinnati Children's Hospital Medical Center, Department of Radiology, Cincinnati, OH (United States); Son, Hwa Jung [University of Cincinnati College of Medicine, Department of Otolaryngology-Head and Neck Surgery, Cincinnati, OH (United States); Koch, Bernadette L. [Cincinnati Children's Hospital Medical Center, Department of Radiology, Cincinnati, OH (United States)

    2014-03-15

    A variety of abnormal imaging findings of the petrous apex are encountered in children. Many petrous apex lesions are identified incidentally while images of the brain or head and neck are being obtained for indications unrelated to the temporal bone. Differential considerations of petrous apex lesions in children include "leave me alone" lesions, infectious or inflammatory lesions, fibro-osseous lesions, neoplasms and neoplasm-like lesions, as well as a few rare miscellaneous conditions. Some lesions are similar to those encountered in adults, and some are unique to children. Langerhans cell histiocytosis (LCH) and primary and metastatic pediatric malignancies such as neuroblastoma, rhabdomyosarcoma and Ewing sarcoma are more likely to be encountered in children. Lesions such as petrous apex cholesterol granuloma, cholesteatoma and chondrosarcoma are more common in adults and are rarely a diagnostic consideration in children. We present a comprehensive pictorial review of CT and MRI appearances of pediatric petrous apex lesions. (orig.)

  15. Operating cost budgeting methods: quantitative methods to improve the process

    Directory of Open Access Journals (Sweden)

    José Olegário Rodrigues da Silva

    Full Text Available Abstract Operating cost forecasts are used in economic feasibility studies of projects and in the budgeting process. Studies have pointed out that some companies are not satisfied with the budgeting process, and chief executive officers want updates more frequently. In these cases, the main problem lies in the costs versus benefits. Companies seek simple and cheap forecasting methods without, at the same time, conceding quality of the resulting information. This study aims to compare operating cost forecasting models to identify the ones that are relatively easy to implement and produce smaller deviations. For this purpose, we applied ARIMA (autoregressive integrated moving average) and distributed dynamic lag models to data from a Brazilian petroleum company. The results suggest that the models have potential application, and that multivariate models fitted better and proved a better way to forecast costs than univariate models.

  16. Application of triple potential step amperometry method for quantitative electroanalysis

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    A novel quantitative electroanalysis method, triple potential step amperometry (TPSA), was developed and is explained with an example of nitrobenzene analysis in water. The selectivity of TPSA was improved by controlling the potential step within a narrow interval and by using an enzyme-modified electrode: the narrow potential step allows the method to avoid most interferents, and the enzyme-modified electrode selectively enhances the response of the target substance. The peak area was investigated for quantitative calibration; nitrobenzene concentration showed a linear relation with peak area, with a correlation coefficient of 0.9995. The t-test and F-test were applied to evaluate the reliability of TPSA; the results showed no evidence of systematic error, and the method did not differ significantly from CV. Its fast detection and small number of potential changes make TPSA suitable for low-cost automatic monitoring equipment.
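
    The peak-area calibration reported above is an ordinary least-squares line; the sketch below uses only the standard library and invented calibration points (the 0.9995 coefficient belongs to the paper's data, not to these numbers).

```python
import math

def linear_fit(x, y):
    """Ordinary least squares y = a*x + b, plus the Pearson correlation r."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    b = my - a * mx
    r = sxy / math.sqrt(sxx * syy)
    return a, b, r

# Hypothetical calibration: nitrobenzene concentration (mg/L) vs. TPSA peak area.
conc = [0.5, 1.0, 2.0, 4.0, 8.0]
area = [0.9, 2.1, 4.0, 8.2, 15.9]
slope, intercept, r = linear_fit(conc, area)

# Invert the calibration to estimate the concentration of an unknown sample
# whose measured peak area is 10.0 (arbitrary units).
unknown_conc = (10.0 - intercept) / slope
```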

  17. Quantitative Analysis of Polarimetric Model-Based Decomposition Methods

    Directory of Open Access Journals (Sweden)

    Qinghua Xie

    2016-11-01

    Full Text Available In this paper, we analyze the robustness of the parameter inversion provided by general polarimetric model-based decomposition methods from the perspective of a quantitative application. The general model and algorithm we have studied is the method proposed recently by Chen et al., which makes use of the complete polarimetric information and outperforms traditional decomposition methods in terms of feature extraction from land covers. Nevertheless, a quantitative analysis of the retrieved parameters from that approach suggests that further investigations are required in order to fully confirm the links between a physically-based model (i.e., approaches derived from the Freeman–Durden concept) and its outputs as intermediate products before any biophysical parameter retrieval is addressed. To this aim, we propose some modifications to the optimization algorithm employed for model inversion, including redefined boundary conditions, transformation of variables, and a different strategy for value initialization. A number of Monte Carlo simulation tests for typical scenarios are carried out and show that the parameter estimation accuracy of the proposed method is significantly increased with respect to the original implementation. Fully polarimetric airborne datasets at L-band acquired by the German Aerospace Center's (DLR's) experimental synthetic aperture radar (E-SAR) system were also used for testing purposes. The results show different qualitative descriptions of the same cover from six different model-based methods. According to the Bragg coefficient ratio (i.e., β), they are prone to provide wrong numerical inversion results, which could prevent any subsequent quantitative characterization of specific areas in the scene. Besides the particular improvements proposed over an existing polarimetric inversion method, this paper is aimed at pointing out the necessity of checking quantitatively the accuracy of model-based PolSAR techniques for a

  18. Novel method for ANA quantitation using IIF imaging system.

    Science.gov (United States)

    Peng, Xiaodong; Tang, Jiangtao; Wu, Yongkang; Yang, Bin; Hu, Jing

    2014-02-01

    A variety of antinuclear antibodies (ANAs) are found in the serum of patients with autoimmune diseases. The detection of abnormal ANA titers is a critical criterion for the diagnosis of systemic lupus erythematosus (SLE) and other connective tissue diseases. Indirect immunofluorescence assay (IIF) on HEp-2 cells is the gold standard method to determine the presence of ANA and therefore provides information about the localization of autoantigens that is useful for diagnosis. However, its utility in prognosis and in monitoring disease activity has been limited by the lack of standardization in performing the technique, subjectivity in interpreting the results, and the fact that it is only semi-quantitative. On the other hand, ELISA can quantitate ANA but provides no further information about the localization of the autoantigens. It would be ideal to integrate the quantitative and qualitative methods. To address this issue, this study was conducted to quantitatively detect ANAs using an IIF imaging analysis system. Serum samples from patients with positive ANA (including speckled, homogeneous, nuclear mixture and cytoplasmic mixture patterns) and negative ANA were tested for ANA titers by classical IIF; the image of each sample was acquired by the digital imaging system and the green fluorescence intensity was quantified with the Image-Pro Plus software. A good correlation was found between the two methods, with correlation coefficients (R(2)) for the various ANA patterns of 0.942 (speckled), 0.942 (homogeneous), 0.923 (nuclear mixture) and 0.760 (cytoplasmic mixture), respectively. The fluorescence density was linearly correlated with the log of the ANA titers in the various ANA patterns (R(2)>0.95). Moreover, the novel ANA quantitation method showed good reproducibility (F=0.091, p>0.05) with mean±SD and CV% of positive, and negative quality controls were equal to 126.4±9.6 and 7.6%, 10.4±1.25 and 12
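
    The reported relation, fluorescence density linear in the log of the ANA titer, amounts to a log-linear calibration. The titers and densities below are invented; only the fitting procedure is illustrated.

```python
import math

# Hypothetical calibration: ANA titers (dilution denominators) and mean
# green-fluorescence densities from the imaging system.
titers = [80, 160, 320, 640, 1280]
density = [42.0, 58.0, 75.0, 90.0, 108.0]

# Least-squares fit of density against log10(titer).
x = [math.log10(t) for t in titers]
n = len(x)
mx, my = sum(x) / n, sum(density) / n
slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, density))
         / sum((xi - mx) ** 2 for xi in x))
intercept = my - slope * mx

def titer_from_density(d):
    """Invert the log-linear calibration to estimate a titer."""
    return 10 ** ((d - intercept) / slope)
```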

  19. Ozone Determination: A Comparison of Quantitative Analysis Methods

    Directory of Open Access Journals (Sweden)

    Rachmat Triandi Tjahjanto

    2012-10-01

    Full Text Available A comparison of quantitative ozone analysis by spectrophotometric and volumetric methods has been studied. The aim of this research is to determine the better method by considering the effect of reagent concentration and volume on the measured ozone concentration. The ozone analyzed in this research was synthesized from air and then used to ozonize methyl orange and potassium iodide solutions at different concentrations and volumes. Ozonation was carried out for 20 minutes at an air flow rate of 363 mL/minute. The concentrations of the ozonized methyl orange and potassium iodide solutions were analyzed by the spectrophotometric and volumetric methods, respectively. The results of this research show that the concentration and volume of the reagent have an effect on the measured ozone concentration. Based on the results of both methods, it can be concluded that the volumetric method is better than the spectrophotometric method.

  20. Quantitative methods for somatosensory evaluation in atypical odontalgia

    DEFF Research Database (Denmark)

    Porporatti, André Luís; Costa, Yuri Martins; Stuginski-Barbosa, Juliana;

    2015-01-01

    A systematic review was conducted to identify reliable somatosensory evaluation methods for atypical odontalgia (AO) patients. The computerized search included the main databases (MEDLINE, EMBASE, and Cochrane Library). The included studies used the following quantitative sensory testing (QST) methods: mechanical detection threshold (MDT), mechanical pain threshold (MPT) (pinprick), pressure pain threshold (PPT), dynamic mechanical allodynia with a cotton swab (DMA1) or a brush (DMA2), warm detection threshold (WDT), cold detection threshold (CDT), heat pain threshold (HPT), cold pain detection...... compared with healthy subjects. In clinical settings, the most reliable evaluation method for AO in patients with persistent idiopathic facial pain would be intraindividual assessments using HPT or mechanical allodynia tests....

  1. Some selected quantitative methods of thermal image analysis in Matlab.

    Science.gov (United States)

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images, and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for the area of the skin of a human foot and face. The full source code of the developed application is also provided as an attachment. (Figure: the main window of the program during dynamic analysis of the foot thermal image.)

  2. [Quantitative method of representative contaminants in groundwater pollution risk assessment].

    Science.gov (United States)

    Wang, Jun-Jie; He, Jiang-Tao; Lu, Yan; Liu, Li-Ya; Zhang, Xiao-Liang

    2012-03-01

    In light of the fact that stress vulnerability assessment in groundwater pollution risk assessment lacks an effective quantitative system, a new system was proposed based on representative contaminants and their corresponding emission quantities, derived from an analysis of groundwater pollution sources. A quantitative method for the representative contaminants in this system was established by analyzing the three properties of representative contaminants and determining the research emphasis using the analytic hierarchy process. The method was applied to the assessment of groundwater pollution risk in Beijing. The results demonstrated that the hazards of the representative contaminants depended greatly on the research emphasis, and that the hazard ranking of the three representative contaminants differed from the ranking of their corresponding properties. This suggests that the subjective tendency of the research emphasis has a decisive impact on the calculated results. In addition, normalizing the three properties by rank order and unifying the quantified property results would amplify or diminish the relative property characteristics of different representative contaminants.
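
    The analytic hierarchy process (AHP) step mentioned above derives weights from a pairwise-comparison matrix; a minimal power-iteration sketch, with a hypothetical reciprocal matrix comparing three contaminant properties (the specific comparison values are my assumption, not the paper's):

```python
def ahp_weights(matrix, iters=100):
    """Approximate the principal eigenvector of a pairwise-comparison
    matrix by power iteration; returns weights summing to 1."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [wi / s for wi in w]
    return w

# Hypothetical reciprocal matrix comparing three properties of a
# representative contaminant (e.g. toxicity vs. mobility vs. persistence):
# entry [i][j] says how much more important property i is than property j.
m = [[1.0, 3.0, 5.0],
     [1 / 3, 1.0, 2.0],
     [1 / 5, 1 / 2, 1.0]]
weights = ahp_weights(m)
```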

  3. Country Risk Analysis: A Survey of the Quantitative Methods

    OpenAIRE

    Hiranya K Nath

    2008-01-01

    With globalization and financial integration, there has been rapid growth of international lending and foreign direct investment (FDI). In view of this emerging trend, country risk analysis has become extremely important for the international creditors and investors. This paper briefly discusses the concepts and definitions, and presents a survey of the quantitative methods that are used to address various issues related to country risk. It also gives a summary review of selected empirical st...

  4. Quantitative method of measuring cancer cell urokinase and metastatic potential

    Science.gov (United States)

    Morrison, Dennis R. (Inventor)

    1993-01-01

    The metastatic potential of tumors can be evaluated by the quantitative detection of urokinase and DNA. The cell sample selected for examination is analyzed for the presence of high levels of urokinase and abnormal DNA using analytical flow cytometry and digital image analysis. Other factors such as membrane associated urokinase, increased DNA synthesis rates and certain receptors can be used in the method for detection of potentially invasive tumors.

  5. Analysis of Preoperative Detection for Apex Prostate Cancer by Transrectal Biopsy

    Directory of Open Access Journals (Sweden)

    Tomokazu Sazuka

    2013-01-01

    Full Text Available Background. The aim of this study was to determine concordance rates for prostatectomy specimens and transrectal needle biopsy samples in various areas of the prostate in order to assess the diagnostic accuracy of the transrectal biopsy approach, especially for presurgical detection of cancer in the prostatic apex. Materials and Methods. From 2006 to 2011, 158 patients whose radical prostatectomy specimens had been evaluated were retrospectively enrolled in this study. Concordance rates for the histopathology results of prostatectomy specimens and needle biopsy samples were evaluated in 8 prostatic sections (apex, middle, base, and transitional zones bilaterally) from 73 patients diagnosed at this institution, as well as factors for detecting apex cancer among a total of 118 true-positive and false-negative apex cancers. Results. Prostate cancer was found most frequently (85%) in the apex of all patients. Of 584 histopathology sections, 153 (49%) from all areas were false negatives, as were 45% of apex biopsy samples. No readily available preoperative factors for detecting apex cancer were identified. Conclusions. In Japanese patients, the most frequent location of prostate cancer is the apex. There is a high false-negative rate for transrectal biopsy samples. To improve the detection rate, transperineal biopsy or more accurate imaging technology is needed.

  6. Quantitative, qualitative, and collaborative methods: approaching indigenous ecological knowledge heterogeneity

    Directory of Open Access Journals (Sweden)

    Jeremy Spoon

    2014-09-01

    Full Text Available I discuss the use of quantitative, qualitative, and collaborative methods to document and operationalize Indigenous ecological knowledge, using case studies from the Nepalese Himalaya and Great Basin. Both case studies applied results to natural and cultural resource management and interpretation for the public. These approaches attempt to reposition the interview subjects to serve as active contributors to the research and its outcomes. I argue that the study of any body of Indigenous knowledge requires a context-specific methodology and mutually agreed upon processes and outcomes. In the Nepalese Himalaya, I utilized linked quantitative and qualitative methods to understand how tourism influenced Sherpa place-based spiritual concepts, species, and landscape knowledge inside Sagarmatha (Mount Everest National Park and Buffer Zone. In this method, Sherpa collaborated in the development of the research questions, the design, and in the review of results. The research in the Great Basin employed collaborative qualitative methods to document Numic (Southern Paiute and Western Shoshone ecological knowledge of federal lands within their ancestral territory and attempted to piece together fragmented and contested histories of place. In this method, Numic peoples collaborated on the development of research questions and design; however they also conducted most of the interviews. In both cases, I selected particular suites of methods depending on the context and created forums for the translation of this information to applied outcomes. The methods were also improved and innovated through praxis.

  7. A quantitative method for measuring the quality of history matches

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, T.S. [Kerr-McGee Corp., Oklahoma City, OK (United States); Knapp, R.M. [Univ. of Oklahoma, Norman, OK (United States)

    1997-08-01

    History matching can be an efficient tool for reservoir characterization. A "good" history matching job can generate reliable reservoir parameters. However, reservoir engineers are often frustrated when they try to select a "better" match from a series of history matching runs. Without a quantitative measurement, it is always difficult to tell the difference between a "good" match and a "better" one. For this reason, we need a quantitative method for testing the quality of matches. This paper presents a method for that purpose. The method uses three statistical indices to (1) test shape conformity, (2) examine bias errors, and (3) measure magnitude of deviation. The shape conformity test ensures that the shape of a simulated curve matches that of a historical curve. Examining bias errors assures that model reservoir parameters have been calibrated to those of a real reservoir. Measuring the magnitude of deviation assures that the difference between the model and the real reservoir parameters is minimized. The method was first tested on a hypothetical model and then applied to published field studies. The results showed that the method can efficiently measure the quality of matches, and that it can serve as a diagnostic tool for calibrating reservoir parameters during history matching.
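
    The abstract names the three indices but does not define them; one plausible trio (Pearson correlation for shape conformity, mean signed error for bias, and RMSE for magnitude of deviation) is sketched below. Both the choice of statistics and the pressure histories are my assumptions.

```python
import math

def match_quality(simulated, historical):
    """Three illustrative indices for a history match: shape (Pearson r),
    bias (mean signed error), and magnitude (root-mean-square error)."""
    n = len(simulated)
    ms = sum(simulated) / n
    mh = sum(historical) / n
    cov = sum((s - ms) * (h - mh) for s, h in zip(simulated, historical))
    vs = sum((s - ms) ** 2 for s in simulated)
    vh = sum((h - mh) ** 2 for h in historical)
    shape = cov / math.sqrt(vs * vh)
    bias = sum(s - h for s, h in zip(simulated, historical)) / n
    rmse = math.sqrt(sum((s - h) ** 2
                         for s, h in zip(simulated, historical)) / n)
    return shape, bias, rmse

# Invented reservoir pressure histories (psi): model run vs. field data.
sim = [3000, 2950, 2890, 2840, 2800]
hist = [3010, 2945, 2900, 2835, 2795]
shape, bias, rmse = match_quality(sim, hist)
```

    A match would then be judged "better" when shape is closer to 1 while bias and RMSE are closer to 0, which is one way to make the comparison between runs quantitative.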

  8. Informatics methods to enable sharing of quantitative imaging research data.

    Science.gov (United States)

    Levy, Mia A; Freymann, John B; Kirby, Justin S; Fedorov, Andriy; Fennessy, Fiona M; Eschrich, Steven A; Berglund, Anders E; Fenstermacher, David A; Tan, Yongqiang; Guo, Xiaotao; Casavant, Thomas L; Brown, Bartley J; Braun, Terry A; Dekker, Andre; Roelofs, Erik; Mountz, James M; Boada, Fernando; Laymon, Charles; Oborski, Matt; Rubin, Daniel L

    2012-11-01

    The National Cancer Institute Quantitative Imaging Network (QIN) is a collaborative research network whose goal is to share data, algorithms and research tools to accelerate quantitative imaging research. A challenge is the variability in tools and analysis platforms used in quantitative imaging. Our goal was to understand the extent of this variation and to develop an approach to enable data sharing and promote reuse of quantitative imaging data in the community. We performed a survey of the tools currently in use by the QIN member sites for representation and storage of their QIN research data, including images, image metadata and clinical data. We identified existing systems and standards for data sharing and their gaps for the QIN use case. We then proposed a system architecture to enable data sharing and collaborative experimentation within the QIN. A variety of tools are currently used by each QIN institution. We developed a general information system architecture to support the QIN goals. We also describe the remaining architecture gaps we are developing to enable members to share research images and image metadata across the network. As a research network, the QIN will stimulate quantitative imaging research by pooling data, algorithms and research tools. However, there are gaps in current functional requirements that will need to be met by future informatics development. Special attention must be given to the technical requirements needed to translate these methods into the clinical research workflow to enable validation and qualification of these novel imaging biomarkers.

  9. Apex Locator: A Reliable and Easy Guide

    OpenAIRE

    Meza DDS, Marco

    2015-01-01

    Nowadays we can place greater trust in the apex locator in endodontics. We can now expect more accurate working length measurements, not only of the apical constriction but of the total length of the roots. Endodontics cannot leave behind the use of the apex locator, given its usefulness not only in the determination of the working length but also in the diagnosis of perforations or fractures. Accordingly, this article presents a reliable and simple guide to the use of the apex locator during eno...

  10. Quantitative phosphoproteomic analysis using iTRAQ method.

    Science.gov (United States)

    Asano, Tomoya; Nishiuchi, Takumi

    2014-01-01

    The MAPK (mitogen-activated protein kinase) cascade plays important roles in plant perception of and reaction to developmental and environmental cues. Phosphoproteomics is useful for identifying target proteins regulated by a MAPK-dependent signaling pathway. Here, we introduce quantitative phosphoproteomic analysis using a chemical labeling method. The isobaric tag for relative and absolute quantitation (iTRAQ) method is an MS-based technique for quantifying protein expression among up to eight different samples in one experiment. In this technique, peptides are labeled with stable isotope-coded covalent tags. We perform quantitative phosphoproteomics comparing Arabidopsis wild type and a stress-responsive mapkk mutant after phytotoxin treatment. To comprehensively identify the downstream phosphoproteins of MAPKK, total proteins were extracted from phytotoxin-treated wild-type and mapkk mutant plants. The phosphoproteins were purified with a Pro-Q(®) Diamond Phosphoprotein Enrichment Kit and digested with trypsin. The resulting peptides were labeled with iTRAQ reagents and then quantified and identified with a MALDI TOF/TOF analyzer. We identified many phosphoproteins whose levels were decreased in the mapkk mutant compared with wild type.
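
    The quantitation step that iTRAQ enables can be sketched as a ratio of reporter-ion intensities against a reference channel. The intensities and the channel assignment below are hypothetical illustrations, not data from the study.

```python
# Hypothetical reporter-ion intensities for one peptide across four iTRAQ
# channels (114-117); real values would come from the MALDI TOF/TOF spectra.
reporter = {"114": 1200.0, "115": 1150.0, "116": 310.0, "117": 295.0}

reference = "114"  # channel assumed to carry the wild-type sample

# Relative quantitation: each channel's intensity divided by the reference.
ratios = {ch: inten / reporter[reference] for ch, inten in reporter.items()}

for ch, r in sorted(ratios.items()):
    print(f"channel {ch}: ratio {r:.2f}")
```

    A ratio well below 1 in a mutant channel (e.g. channel 116 here) corresponds to the decreased phosphoprotein levels the study reports for the mapkk mutant.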

  11. An improved quantitative analysis method for plant cortical microtubules.

    Science.gov (United States)

    Lu, Yi; Huang, Chenyang; Wang, Jia; Shang, Peng

    2014-01-01

    The arrangement of plant cortical microtubules can reflect the physiological state of cells. However, little attention has been paid to the image quantitative analysis of plant cortical microtubules so far. In this paper, Bidimensional Empirical Mode Decomposition (BEMD) algorithm was applied in the image preprocessing of the original microtubule image. And then Intrinsic Mode Function 1 (IMF1) image obtained by decomposition was selected to do the texture analysis based on Grey-Level Cooccurrence Matrix (GLCM) algorithm. Meanwhile, in order to further verify its reliability, the proposed texture analysis method was utilized to distinguish different images of Arabidopsis microtubules. The results showed that the effect of BEMD algorithm on edge preserving accompanied with noise reduction was positive, and the geometrical characteristic of the texture was obvious. Four texture parameters extracted by GLCM perfectly reflected the different arrangements between the two images of cortical microtubules. In summary, the results indicate that this method is feasible and effective for the image quantitative analysis of plant cortical microtubules. It not only provides a new quantitative approach for the comprehensive study of the role played by microtubules in cell life activities but also supplies references for other similar studies.
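
    The GLCM texture step can be sketched in plain numpy on a toy image; the abstract does not name its four texture parameters, so the two computed below (contrast and energy) are representative Haralick-style choices, and the 4-level patch stands in for the IMF1 image the BEMD step would produce.

```python
import numpy as np

# Toy 4-level image patch standing in for the BEMD-derived IMF1 image.
img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]])
levels = 4

# Grey-level co-occurrence matrix for horizontal neighbours (distance 1, angle 0).
glcm = np.zeros((levels, levels))
for row in img:
    for a, b in zip(row[:-1], row[1:]):
        glcm[a, b] += 1
glcm /= glcm.sum()  # normalise to joint probabilities

# Two classic GLCM texture parameters.
i, j = np.indices((levels, levels))
contrast = float(((i - j) ** 2 * glcm).sum())  # local intensity variation
energy = float((glcm ** 2).sum())              # textural uniformity
print("contrast:", contrast, "energy:", energy)
```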

  12. An Improved Quantitative Analysis Method for Plant Cortical Microtubules

    Directory of Open Access Journals (Sweden)

    Yi Lu

    2014-01-01

    Full Text Available The arrangement of plant cortical microtubules can reflect the physiological state of cells. However, little attention has been paid to the image quantitative analysis of plant cortical microtubules so far. In this paper, Bidimensional Empirical Mode Decomposition (BEMD) algorithm was applied in the image preprocessing of the original microtubule image. And then Intrinsic Mode Function 1 (IMF1) image obtained by decomposition was selected to do the texture analysis based on Grey-Level Cooccurrence Matrix (GLCM) algorithm. Meanwhile, in order to further verify its reliability, the proposed texture analysis method was utilized to distinguish different images of Arabidopsis microtubules. The results showed that the effect of BEMD algorithm on edge preserving accompanied with noise reduction was positive, and the geometrical characteristic of the texture was obvious. Four texture parameters extracted by GLCM perfectly reflected the different arrangements between the two images of cortical microtubules. In summary, the results indicate that this method is feasible and effective for the image quantitative analysis of plant cortical microtubules. It not only provides a new quantitative approach for the comprehensive study of the role played by microtubules in cell life activities but also supplies references for other similar studies.

  13. A Quantitative Method for Microtubule Analysis in Fluorescence Images.

    Science.gov (United States)

    Lan, Xiaodong; Li, Lingfei; Hu, Jiongyu; Zhang, Qiong; Dang, Yongming; Huang, Yuesheng

    2015-12-01

    Microtubule analysis is of significant value for a better understanding of normal and pathological cellular processes. Although immunofluorescence microscopic techniques have proven useful in the study of microtubules, comparative results commonly rely on a descriptive and subjective visual analysis. We developed an objective and quantitative method based on image processing and analysis of fluorescently labeled microtubular patterns in cultured cells. We used a multi-parameter approach by analyzing four quantifiable characteristics to compose our quantitative feature set. Then we interpreted specific changes in the parameters and revealed the contribution of each feature set using principal component analysis. In addition, we verified that different treatment groups could be clearly discriminated using principal components of the multi-parameter model. High predictive accuracy of four commonly used multi-classification methods confirmed our method. These results demonstrated the effectiveness and efficiency of our method in the analysis of microtubules in fluorescence images. Application of the analytical methods presented here provides information concerning the organization and modification of microtubules, and could aid in the further understanding of structural and functional aspects of microtubules under normal and pathological conditions.
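
    The multi-parameter PCA step described above can be sketched with plain numpy; the feature matrix here is random stand-in data, since the paper's four quantified characteristics are not named in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical feature matrix: 6 cells x 4 quantified microtubule features.
X = rng.normal(size=(6, 4))

# Principal component analysis via SVD of the mean-centred data.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

explained = s ** 2 / (s ** 2).sum()   # fraction of variance per component
scores = Xc @ Vt.T                    # cell coordinates in PC space

print("variance explained per component:", np.round(explained, 3))
```

    In the paper's workflow, the `scores` of the leading components are what allow treatment groups to be visually and statistically discriminated.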

  14. Quantitative Phase Analysis by the Rietveld Method for Forensic Science.

    Science.gov (United States)

    Deng, Fei; Lin, Xiaodong; He, Yonghong; Li, Shu; Zi, Run; Lai, Shijun

    2015-07-01

    Quantitative phase analysis (QPA) helps determine the type attributes of an object because it reveals the content of its constituents. QPA by the Rietveld method requires neither the measurement of calibration data nor the use of an internal standard; however, the approximate crystal structure of each phase in a mixture is necessary. In this study, 8 synthetic mixtures composed of potassium nitrate and sulfur were analyzed by the Rietveld QPA method. The Rietveld refinement was accomplished with the Material Analysis Using Diffraction (MAUD) program and evaluated by three agreement indices. Results showed that Rietveld QPA yielded precise results, with errors generally less than 2.0% absolute. In addition, a criminal case that was solved with the help of the Rietveld QPA method is also introduced. This method will allow forensic investigators to acquire detailed information about material evidence, which can point the way for case detection and court proceedings.
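
    Once the scale factors are refined, Rietveld QPA conventionally converts them to weight fractions with the Hill & Howard relation W_p = S_p(ZMV)_p / Σ_i S_i(ZMV)_i. The sketch below shows that arithmetic; the scale factors and cell parameters are illustrative numbers, not refined values for these mixtures.

```python
# Hill & Howard relation: S is the refined scale factor, Z the formula units
# per cell, M the formula mass and V the unit-cell volume of each phase.
# All numbers below are illustrative, not refined KNO3/sulfur values.
phases = {
    "KNO3":   {"S": 1.8e-4, "Z": 4,  "M": 101.10, "V": 384.0},
    "sulfur": {"S": 2.5e-4, "Z": 16, "M": 32.06,  "V": 832.0},
}

szmv = {name: p["S"] * p["Z"] * p["M"] * p["V"] for name, p in phases.items()}
total = sum(szmv.values())
weight_fractions = {name: v / total for name, v in szmv.items()}

for name, w in weight_fractions.items():
    print(f"{name}: {100 * w:.1f} wt%")
```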

  15. Quantitative Methods in Supply Chain Management Models and Algorithms

    CERN Document Server

    Christou, Ioannis T

    2012-01-01

    Quantitative Methods in Supply Chain Management presents some of the most important methods and tools available for modeling and solving problems arising in the context of supply chain management. In the context of this book, “solving problems” usually means designing efficient algorithms for obtaining high-quality solutions. The first chapter is an extensive optimization review covering continuous unconstrained and constrained linear and nonlinear optimization algorithms, as well as dynamic programming and discrete optimization exact methods and heuristics. The second chapter presents time-series forecasting methods together with prediction market techniques for demand forecasting of new products and services. The third chapter details models and algorithms for planning and scheduling with an emphasis on production planning and personnel scheduling. The fourth chapter presents deterministic and stochastic models for inventory control with a detailed analysis on periodic review systems and algorithmic dev...

  16. Quantitative EEG Applying the Statistical Recognition Pattern Method

    DEFF Research Database (Denmark)

    Engedal, Knut; Snaedal, Jon; Hoegh, Peter

    2015-01-01

    BACKGROUND/AIM: The aim of this study was to examine the discriminatory power of quantitative EEG (qEEG) applying the statistical pattern recognition (SPR) method to separate Alzheimer's disease (AD) patients from elderly individuals without dementia and from other dementia patients. METHODS...... accepted criteria by at least 2 clinicians. EEGs were recorded in a standardized way and analyzed independently of the clinical diagnoses, using the SPR method. RESULTS: In receiver operating characteristic curve analyses, the qEEGs separated AD patients from healthy elderly individuals with an area under...... the curve (AUC) of 0.90, representing a sensitivity of 84% and a specificity of 81%. The qEEGs further separated patients with Lewy body dementia or Parkinson's disease dementia from AD patients with an AUC of 0.9, a sensitivity of 85% and a specificity of 87%. CONCLUSION: qEEG using the SPR method could...

  17. A quantitative method for determining the robustness of complex networks

    Science.gov (United States)

    Qin, Jun; Wu, Hongrun; Tong, Xiaonian; Zheng, Bojin

    2013-06-01

    Most current studies estimate the invulnerability of complex networks using a qualitative method that analyzes the decay rate of network performance. This method results in confusion over the invulnerability of various types of complex networks. By normalizing network performance and defining a baseline, this paper defines the invulnerability index as the integral of the normalized network performance curve minus the baseline. This quantitative method seeks to measure network invulnerability under both edge and node attacks and provides a definition distinguishing the robustness and fragility of networks. To demonstrate the proposed method, three small-world networks were selected as test beds. The simulation results indicate that the proposed invulnerability index can effectively and accurately quantify network resilience and can handle both node and edge attacks. The index can provide a valuable reference for determining network invulnerability in future research.
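
    A minimal sketch of the index under sequential node attack: record the normalized performance (here, the largest-connected-component fraction) after each removal and integrate the curve above a baseline. The ring lattice, the performance measure, and the zero baseline are simplifying assumptions, not the paper's exact choices.

```python
def largest_cc_fraction(adj, removed):
    """Fraction of the original nodes lying in the largest connected
    component after deleting the nodes in `removed` (iterative DFS)."""
    n = len(adj)
    seen, best = set(removed), 0
    for start in range(n):
        if start in seen:
            continue
        stack, size = [start], 0
        seen.add(start)
        while stack:
            u = stack.pop()
            size += 1
            for w in adj[u]:
                if w not in seen:
                    seen.add(w)
                    stack.append(w)
        best = max(best, size)
    return best / n

# Ring lattice as a minimal stand-in for the paper's small-world test beds.
n = 50
adj = {v: {(v - 1) % n, (v + 1) % n} for v in range(n)}

# Normalized performance curve under sequential node attack.
removed, curve = set(), [1.0]
for v in range(n):
    removed.add(v)
    curve.append(largest_cc_fraction(adj, removed))

# Invulnerability index: area under the normalized curve minus a baseline
# (trapezoid rule over the attack fraction axis; baseline 0 assumed here).
baseline = 0.0
index = sum((curve[k] + curve[k + 1]) / 2 - baseline for k in range(n)) / n
print(f"invulnerability index: {index:.3f}")
```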

  18. Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments.

    Directory of Open Access Journals (Sweden)

    Erin E Conners

    Full Text Available Increasingly, 'place', including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low and middle income countries (LMIC), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) Participatory mapping; 2) Quantitative interviews; 3) Sex work venue field observation; 4) Time-location-activity diaries; 5) In-depth interviews about daily activity spaces. We found that the mixed-methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment on drug and sexual risk behaviors among high risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource constrained contexts provides new opportunities for informing public health interventions.

  19. Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments

    Science.gov (United States)

    Conners, Erin E.; West, Brooke S.; Roth, Alexis M.; Meckel-Parker, Kristen G.; Kwan, Mei-Po; Magis-Rodriguez, Carlos; Staines-Orozco, Hugo; Clapp, John D.; Brouwer, Kimberly C.

    2016-01-01

    Increasingly, ‘place’, including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low and middle income countries (LMIC), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) Participatory mapping; 2) Quantitative interviews; 3) Sex work venue field observation; 4) Time-location-activity diaries; 5) In-depth interviews about daily activity spaces. We found that the mixed-methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment on drug and sexual risk behaviors among high risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource constrained contexts provides new opportunities for informing public health interventions. PMID:27191846

  20. A New Kinetic Spectrophotometric Method for the Quantitation of Amorolfine

    Science.gov (United States)

    Poza, Cristian; Contreras, David; Yáñez, Jorge; Nacaratte, Fallon; Toral, M. Inés

    2017-01-01

    Amorolfine (AOF) is a compound with fungicide activity based on the dual inhibition of growth of the fungal cell membrane, the biosynthesis and accumulation of sterols, and the reduction of ergosterol. In this work a sensitive kinetic and spectrophotometric method for the AOF quantitation based on the AOF oxidation by means of KMnO4 at 30 min (fixed time), pH alkaline, and ionic strength controlled was developed. Measurements of changes in absorbance at 610 nm were used as criterion of the oxidation progress. In order to maximize the sensitivity, different experimental reaction parameters were carefully studied via factorial screening and optimized by multivariate method. The linearity, intraday, and interday assay precision and accuracy were determined. The absorbance-concentration plot corresponding to tap water spiked samples was rectilinear, over the range of 7.56 × 10−6–3.22 × 10−5 mol L−1, with detection and quantitation limits of 2.49 × 10−6 mol L−1 and 7.56 × 10−6 mol L−1, respectively. The proposed method was successfully validated for the application of the determination of the drug in the spiked tap water samples and the percentage recoveries were 94.0–105.0%. The method is simple and does not require expensive instruments or complicated extraction steps of the reaction product.
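
    The reported detection and quantitation limits are the kind of figures obtained from a calibration line via the standard ICH-style estimates LOD = 3.3·σ/slope and LOQ = 10·σ/slope. The sketch below illustrates that calculation; the calibration points are invented for illustration, not the study's measurements.

```python
import numpy as np

# Hypothetical calibration data: AOF concentration (mol/L) vs. absorbance
# change at 610 nm after the 30 min fixed-time oxidation.
conc = np.array([0.8, 1.2, 1.6, 2.0, 2.4, 3.0]) * 1e-5
absb = np.array([0.081, 0.122, 0.158, 0.201, 0.239, 0.302])

slope, intercept = np.polyfit(conc, absb, 1)
residuals = absb - (slope * conc + intercept)
sd = residuals.std(ddof=2)  # residual standard deviation of the linear fit

# ICH-style estimates of the detection and quantitation limits.
lod = 3.3 * sd / slope
loq = 10 * sd / slope
print(f"slope={slope:.3e}  LOD={lod:.2e} mol/L  LOQ={loq:.2e} mol/L")
```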

  1. A New Kinetic Spectrophotometric Method for the Quantitation of Amorolfine

    Directory of Open Access Journals (Sweden)

    César Soto

    2017-01-01

    Full Text Available Amorolfine (AOF) is a compound with fungicide activity based on the dual inhibition of growth of the fungal cell membrane, the biosynthesis and accumulation of sterols, and the reduction of ergosterol. In this work a sensitive kinetic and spectrophotometric method for the AOF quantitation based on the AOF oxidation by means of KMnO4 at 30 min (fixed time), pH alkaline, and ionic strength controlled was developed. Measurements of changes in absorbance at 610 nm were used as criterion of the oxidation progress. In order to maximize the sensitivity, different experimental reaction parameters were carefully studied via factorial screening and optimized by multivariate method. The linearity, intraday, and interday assay precision and accuracy were determined. The absorbance-concentration plot corresponding to tap water spiked samples was rectilinear, over the range of 7.56 × 10−6–3.22 × 10−5 mol L−1, with detection and quantitation limits of 2.49 × 10−6 mol L−1 and 7.56 × 10−6 mol L−1, respectively. The proposed method was successfully validated for the application of the determination of the drug in the spiked tap water samples and the percentage recoveries were 94.0–105.0%. The method is simple and does not require expensive instruments or complicated extraction steps of the reaction product.

  2. Apex Predators Program Age and Growth Data

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Apex Predators Program staff have collected vertebral centra from sportfishing tournaments, cruises, commercial fishermen and strandings in the Northeast US since...

  3. Apex Predators Program Sportfishing Tournament Data

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Apex Predators Program staff have collected shark sportfishing tournament data from the Northeast US since the 1960's. These tournaments offer a unique opportunity...

  4. [Electronic Apex Locator as a dental instrument].

    Science.gov (United States)

    Lin, S; Winocur-Arias, O; Slutzky-Goldberg, I

    2009-04-01

    Electronic Apex Locators (EAL) have become widely used in the last decade. The first apex locator was introduced in 1962, based on a constant electrical resistance (6.5 kΩ) between the oral mucosa and the periodontal ligament. The first and second generations of EAL were inaccurate and could not detect the apex in the presence of conducting fluids. The third generation solved this problem by using two alternating frequencies and calculating the impedance between them. This provided reliable and accurate results in dry canals, or in the presence of blood, electrolytes or other fluids in the root canal, when the pulp was necrotic or when there was a perforation along the root. The Root ZX and Apit (Endex) are the most documented devices. The new fourth generation of apex locators is a diverse group: some use multifrequency currents, others use a "lookup matrix" rather than calculating the readings. Several of the newer EALs are smaller, and others connect to computers.

  5. Quantitative magnetic resonance micro-imaging methods for pharmaceutical research.

    Science.gov (United States)

    Mantle, M D

    2011-09-30

    The use of magnetic resonance imaging (MRI) as a tool in pharmaceutical research is now well established and the current literature covers a multitude of different pharmaceutically relevant research areas. This review focuses on the use of quantitative magnetic resonance micro-imaging techniques and how they have been exploited to extract information that is of direct relevance to the pharmaceutical industry. The article is divided into two main areas. The first half outlines the theoretical aspects of magnetic resonance and deals with basic magnetic resonance theory, the effects of nuclear spin-lattice (T(1)), spin-spin (T(2)) relaxation and molecular diffusion upon image quantitation, and discusses the applications of rapid magnetic resonance imaging techniques. In addition to the theory, the review aims to provide some practical guidelines for the pharmaceutical researcher with an interest in MRI as to which MRI pulse sequences/protocols should be used and when. The second half of the article reviews the recent advances and developments that have appeared in the literature concerning the use of quantitative micro-imaging methods to pharmaceutically relevant research. Copyright © 2010 Elsevier B.V. All rights reserved.
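
    The T1 and T2 effects on image quantitation mentioned above follow the standard spin-echo signal equation S = S0·(1 − exp(−TR/T1))·exp(−TE/T2). The sketch below shows how a practical choice of TR and TE attenuates the measured signal relative to the true spin density; the tissue values are illustrative, not taken from the review.

```python
import math

def spin_echo_signal(s0, tr, te, t1, t2):
    """Spin-echo signal equation: S = S0*(1 - exp(-TR/T1))*exp(-TE/T2).
    Finite TR (T1 saturation) and TE (T2 decay) both bias quantitation."""
    return s0 * (1 - math.exp(-tr / t1)) * math.exp(-te / t2)

# Illustrative relaxation times in ms (not values from the review).
t1, t2 = 900.0, 80.0

fully_relaxed = spin_echo_signal(1.0, tr=5 * t1, te=0.0, t1=t1, t2=t2)
practical = spin_echo_signal(1.0, tr=1500.0, te=20.0, t1=t1, t2=t2)

print(f"signal recovered: {100 * practical / fully_relaxed:.1f}% of true density")
```

    This is why quantitative micro-imaging protocols either use long TR/short TE or explicitly map T1 and T2 to correct the images.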

  6. Petrous apex mucocele: high resolution CT

    Energy Technology Data Exchange (ETDEWEB)

    Memis, A. [Dept. of Radiology, Hospital of Ege Univ., Bornova, Izmir (Turkey); Alper, H. [Dept. of Radiology, Hospital of Ege Univ., Bornova, Izmir (Turkey); Calli, C. [Dept. of Radiology, Hospital of Ege Univ., Bornova, Izmir (Turkey); Ozer, H. [Dept. of Radiology, Hospital of Ege Univ., Bornova, Izmir (Turkey); Ozdamar, N. [Dept. of Neurosurgery, Hospital of Ege Univ., Bornova, Izmir (Turkey)

    1994-11-01

    Mucocele of the petrous apex is very rare, only three cases having been reported. Since this area is inaccessible to direct examination, imaging, preferably high resolution computed tomography (HR CT) is essential. We report a case showing an eroding, non enhancing mass with sharp, lobulated contours, within the petrous apex. The presence of a large air cell on the opposite side suggested a mucocele. (orig.)

  7. Correlation between two methods of florbetapir PET quantitative analysis.

    Science.gov (United States)

    Breault, Christopher; Piper, Jonathan; Joshi, Abhinay D; Pirozzi, Sara D; Nelson, Aaron S; Lu, Ming; Pontecorvo, Michael J; Mintun, Mark A; Devous, Michael D

    2017-01-01

    This study evaluated performance of a commercially available standardized software program for calculation of florbetapir PET standard uptake value ratios (SUVr) in comparison with an established research method. Florbetapir PET images for 183 subjects clinically diagnosed as cognitively normal (CN), mild cognitive impairment (MCI) or probable Alzheimer's disease (AD) (45 AD, 60 MCI, and 78 CN) were evaluated using two software processing algorithms. The research method uses a single florbetapir PET template generated by averaging both amyloid positive and amyloid negative registered brains together. The commercial software simultaneously optimizes the registration between the florbetapir PET images and three templates: amyloid negative, amyloid positive, and an average. Cortical average SUVr values were calculated across six predefined anatomic regions with respect to the whole cerebellum reference region. SUVr values were well correlated between the two methods (r2 = 0.98). The relationship between the methods computed from the regression analysis is: Commercial method SUVr = (0.9757*Research SUVr) + 0.0299. A previously defined cutoff SUVr of 1.1 for distinguishing amyloid positivity by the research method corresponded to 1.1 (95% CI = 1.098, 1.11) for the commercial method. This study suggests that the commercial method is comparable to the published research method of SUVr analysis for florbetapir PET images, thus facilitating the potential use of standardized quantitative approaches to PET amyloid imaging.
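
    The reported regression makes the cutoff mapping easy to reproduce: applying the study's relationship to the research-method positivity threshold of 1.1 returns essentially the same value on the commercial scale, which is the paper's point.

```python
# Relationship reported by the study between the two SUVr pipelines:
#   commercial SUVr = 0.9757 * research SUVr + 0.0299
def research_to_commercial(suvr_research):
    return 0.9757 * suvr_research + 0.0299

# Map the published research-method amyloid-positivity cutoff of 1.1
# onto the commercial method's scale.
cutoff_commercial = research_to_commercial(1.1)
print(f"commercial-method cutoff: {cutoff_commercial:.2f}")
```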

  8. VERIFICATION HPLC METHOD OF QUANTITATIVE DETERMINATION OF AMLODIPINE IN TABLETS

    Directory of Open Access Journals (Sweden)

    Khanin V. A

    2014-10-01

    Full Text Available Introduction. Amlodipine ((±)-2-[(2-aminoethoxy)methyl]-4-(2-chlorophenyl)-1,4-dihydro-6-methyl-3,5-pyridinedicarboxylic acid 3-ethyl 5-methyl ester), used as the besylate salt, belongs to the group of selective long-acting calcium channel blockers of the dihydropyridine class. In clinical practice it is used as an antianginal and antihypertensive agent for the treatment of cardiovascular diseases. It is produced as a powder substance and in finished dosage forms (tablets of 2.5, 5 and 10 mg). The scientific literature describes methods for the quantitative determination of the drug by spectrophotometry (by its own light absorption and via its reaction product with alloxan), by chromatographic techniques, by a kinetic-spectrophotometric method in substances and preparations, and by chromatography-mass spectrometry and stripping voltammetry. For the quantitative determination of amlodipine besylate, the British Pharmacopoeia and the European Pharmacopoeia recommend liquid chromatography. In connection with the preparation of the second edition of the SPhU, which will include monographs on the finished product, we set out to analyze the validation characteristics of the chromatographic quantitative determination of amlodipine besylate in tablets and to verify the analytical procedure. Material & methods. The research used amlodipine besylate substance, batch number AB0401013. The analyzed product was "Amlodipine" tablets, batch number 20113, manufactured by the pharmaceutical company "Zdorovye". The analytical equipment was a Waters 2695 chromatograph with a 2996 diode array detector (Waters Corp., USA), fitted with a Nova-Pak C18 column, 300 x 3.9 mm, particle size 4 μm; an ER-182 balance (AND, Japan); and class A volumetric glassware. Preparation of the test solution: to an accurately weighed sample of tablet powder equivalent to 50 mg amlodipine, add 30 ml of methanol, shake for 30 minutes, dilute to 50.0 ml with methanol and filter. 5 ml of the methanol solution adjusted to

  9. [Quantitative and qualitative research methods, can they coexist yet?].

    Science.gov (United States)

    Hunt, Elena; Lavoie, Anne-Marise

    2011-06-01

    Qualitative design is gaining ground in nursing research. In spite of this relative progress, however, the evidence-based practice movement continues to dominate and to emphasize the exclusive value of quantitative design (particularly randomized clinical trials) for clinical decision making. In the current context, convenient to those in power making utilitarian decisions on the one hand, and with nursing criticism of the establishment in favor of qualitative research on the other, it is difficult to choose a practical and ethical path that values the nursing role within the health care system, keeps us committed to quality care and maintains the researcher's integrity. Both qualitative and quantitative methods have advantages and disadvantages, and clearly neither can, by itself, capture, describe and explain reality adequately. Therefore, a balance between the two methods is needed. Researchers bear a responsibility to society and science, and they should opt for the design best suited to answering the research question, not promote the design favored by research funding bodies.

  10. Researching and deploying an APEX security scanning tool

    CERN Document Server

    Vali, Silvia

    2016-01-01

    Most APEX applications were not developed with security in mind or were developed many years ago, and the old version of APEX in use exposes these applications to a variety of potential security risks. CERN develops and uses many APEX applications, but none of the currently used tools provides a sufficient means of vulnerability scanning for such applications. The version of APEX currently used at CERN is 4.2.6, whilst the latest version is 5.1. This report provides the reader with an overview of APEX and the APEX-SERT vulnerability scanning tool, as well as a summary of testing the APEX-SERT tool on existing APEX applications used at CERN and on the samples created during this project. The goal of this project was to research existing tools for vulnerability scanning of APEX applications and to deploy the chosen tool for use by APEX developers.

  11. A novel semi-quantitative method for measuring tissue bleeding.

    Science.gov (United States)

    Vukcevic, G; Volarevic, V; Raicevic, S; Tanaskovic, I; Milicic, B; Vulovic, T; Arsenijevic, S

    2014-03-01

    In this study, we describe a new semi-quantitative method for measuring the extent of bleeding in pathohistological tissue samples. To test our novel method, we recruited 120 female patients in their first trimester of pregnancy and divided them into three groups of 40. Group I was the control group, in which no dilation was applied. Group II was an experimental group, in which dilation was performed using classical mechanical dilators. Group III was also an experimental group, in which dilation was performed using a hydraulic dilator. Tissue samples were taken from the patients' cervical canals using a Novak's probe via energetic single-step curettage prior to any dilation in Group I and after dilation in Groups II and III. After the tissue samples were prepared, light microscopy was used to obtain microphotographs at 100x magnification. The surfaces affected by bleeding were measured in the microphotographs using the Autodesk AutoCAD 2009 program and its "polylines" function. The lines were used to mark the area around the entire sample (marked A) and to create "polyline" areas around each bleeding area on the sample (marked B). The percentage of the total area affected by bleeding was calculated using the formula: N = Bt x 100 / At where N is the percentage (%) of the tissue sample surface affected by bleeding, At (A total) is the sum of the surfaces of all of the tissue samples and Bt (B total) is the sum of all the surfaces affected by bleeding in all of the tissue samples. This novel semi-quantitative method utilizes the Autodesk AutoCAD 2009 program, which is simple to use and widely available, thereby offering a new, objective and precise approach to estimate the extent of bleeding in tissue samples.
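
    The abstract's formula N = Bt × 100 / At is a straightforward ratio of summed areas; the sketch below applies it to hypothetical AutoCAD "polyline" measurements (the area values are invented for illustration).

```python
# Areas (arbitrary AutoCAD units) measured with the "polylines" tool:
# one total outline (A) and zero or more bleeding regions (B) per sample.
samples = [
    {"A": 1250.0, "B": [40.0, 22.5]},
    {"A": 980.0,  "B": [15.0]},
    {"A": 1100.0, "B": []},          # sample with no visible bleeding
]

a_total = sum(s["A"] for s in samples)           # At: sum of sample surfaces
b_total = sum(sum(s["B"]) for s in samples)      # Bt: sum of bleeding surfaces

# N = Bt x 100 / At, as defined in the abstract.
n_percent = b_total * 100 / a_total
print(f"bleeding extent: {n_percent:.2f}%")
```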

  12. New method for quantitative evaluation of esophageal sensibility.

    Science.gov (United States)

    López-Merino, V; Benages, A; Molina, R; Marcos-Buscheck, C; Tomás-Ridocci, M; Mora, F; Moreno-Osset, E; Mínguez, M

    1986-06-01

    A method for quantitating esophageal sensibility by an electrical stimulation test is described. Square-wave stimuli of different voltages and durations were delivered to the esophagus, three series of electrical stimuli being used at successive durations (0.5, 1, 2, 4, 8 and 16 ms); in each series the discharge voltage was increased progressively from 0 mV until the subject noted the first sensation. This procedure was carried out at all esophageal levels. The following parameters were analyzed: the sensory threshold along the esophagus; the relation of threshold sensibility (mV) to stimulus duration (ms); and the rheobase and chronaxie for each esophageal level. At all esophageal levels the sensory threshold was regular and coherent; in the middle esophagus a zone was found having a higher sensory threshold than the proximal and distal esophageal zones. The relationship between the sensory threshold and the inverse of the stimulus duration indicated that esophageal sensibility follows Weiss's basic law of excitation, at least with this type of stimulus, the rheobase and chronaxie being representative of the sensibility threshold along the esophagus. Quantitative esophageal sensibility is therefore concluded to be particularly suited to evaluation by electrical stimulation.
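
    Weiss's law predicts a threshold that falls hyperbolically with stimulus duration, V(t) = rheobase × (1 + chronaxie/t), so the threshold equals twice the rheobase when t equals the chronaxie. The sketch below evaluates it at the study's stimulus durations; the rheobase and chronaxie values are illustrative, not measurements from the study.

```python
# Weiss's law of excitation: threshold as a function of stimulus duration t.
def weiss_threshold(t_ms, rheobase_mv, chronaxie_ms):
    return rheobase_mv * (1 + chronaxie_ms / t_ms)

# Illustrative values (not data from the study).
rheobase, chronaxie = 400.0, 1.5  # mV, ms

for t in (0.5, 1, 2, 4, 8, 16):   # the stimulus durations used in the test
    print(f"{t:4} ms -> threshold {weiss_threshold(t, rheobase, chronaxie):.0f} mV")
```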

  13. Clinical implications of APEX1 and Jagged1 as chemoresistance factors in biliary tract cancer

    Science.gov (United States)

    Kim, Hong-Beum; Cho, Won Jin; Choi, Nam Gyu; Kim, Sung-Soo; Park, Jun Hee; Lee, Hee-Jeong

    2017-01-01

    Purpose Biliary cancer is a highly malignant neoplasm with a poor prognosis, and most patients need to undergo palliative chemotherapy; however, a major clinical problem associated with the use of chemotherapy is chemoresistance. We therefore aimed to investigate the clinical implications of apurinic/apyrimidinic endodeoxyribonuclease 1 (APEX1) and Jagged1 as chemoresistance factors in biliary tract cancer. Methods We used 5 human biliary tract cancer cell lines (SNU-245, SNU-308, SNU-478, SNU-1079, and SNU-1196), and investigated the chemosensitivity related to APEX1 and Jagged1 through 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide (MTT) assays and Western blotting. In addition, 10 patients with advanced biliary cancer were divided into 2 groups according to chemotherapy response, and their tumors were examined by immunohistochemistry using APEX1 and Jagged1 antibodies; protein expression level was scored for staining intensity and percentage of positive cells. Results MTT assays after APEX1 knockdown showed that the cell lines with strong coexpression of APEX1 and Jagged1 (SNU-245, SNU-1079, and SNU-1196) showed a greater decrease in the IC50 of each chemotherapeutic agent (5-fluorouracil, gemcitabine, and cisplatin). Western blot analysis of APEX1 and Jagged1 expression in biliary cancer cell lines after APEX1 knockdown demonstrated decreased Jagged1 expression. The APEX1 and Jagged1 expression levels measured by immunohistochemistry were higher in chemorefractory patients than in chemoresponsive patients. Conclusion These results demonstrate that simultaneous high expression of APEX1 and Jagged1 is associated with chemoresistance in biliary cancer and suggest that it is a potential therapeutic target for chemoresistance in advanced biliary cancer. PMID:28090501

  14. Assessing polarization effects for the Airborne imaging spectrometer APEX

    Directory of Open Access Journals (Sweden)

    U. Böttger

    2006-01-01

Full Text Available In the scope of design activities for the hyperspectral airborne imaging spectrometer APEX, the acceptable sensitivity of the spectrometer to linear polarization is analyzed by assessing the amount of polarization of reflected light in the atmosphere-surface system. A large number of calculations is performed for a wide variety of viewing geometries to study the influence of aerosol models, natural surfaces, and flight altitudes over the spectral range from the near-UV to the short-wave infrared (SWIR). Thereafter the design of the imaging spectrometer is outlined accounting for these requirements, and a method for partially correcting the instrument polarization sensitivity is briefly introduced. APEX design and post-processing capabilities will make it possible to reduce the influence of polarization sensitivity on at-sensor radiance and on its higher-level products for most observation conditions.

  15. A new method for quantitatively characterizing atmospheric oxidation capacity

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

Based on atmospheric chemical kinetics, the rate constant of overall pseudo-first-order oxidation removal of gaseous pollutants (Kpor,T) is proposed to characterize the atmospheric oxidation capacity in the troposphere. Being a quantitative parameter, Kpor,T can be used to address issues related to atmospheric oxidation capacity. By applying this method, the regional oxidation capacity of the atmosphere in the Pearl River Delta (PRD) is numerically simulated based on the CBM-IV chemical mechanism. Results show significant spatio-temporal variation of the atmospheric oxidation capacity in PRD. It is found that OH-initiated oxidations, heterogeneous oxidation of SO2, and photolysis of aldehydes are the three most important oxidation processes influencing the atmospheric oxidation capacity in PRD.
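The pseudo-first-order bookkeeping behind a parameter like Kpor,T can be sketched numerically. The rate constants and oxidant number densities below are illustrative assumptions, not values from the paper:

```python
# Sketch of an overall pseudo-first-order removal constant for one
# pollutant. Second-order rate constants k (cm^3 molecule^-1 s^-1)
# and oxidant number densities (molecule cm^-3) are invented here.
oxidants = {
    "OH":  (8.5e-12, 1.0e6),   # k(pollutant + OH),  [OH]
    "O3":  (1.0e-17, 7.5e11),  # k(pollutant + O3),  [O3]
    "NO3": (2.0e-16, 1.0e8),   # k(pollutant + NO3), [NO3]
}

# Overall pseudo-first-order removal constant: K = sum_i k_i * [X_i]
K = sum(k * n for k, n in oxidants.values())  # s^-1
lifetime_h = 1.0 / K / 3600.0                 # e-folding lifetime in hours

print(f"K = {K:.2e} s^-1, lifetime ~ {lifetime_h:.1f} h")
```

Summing such terms over all pollutants and oxidation channels, weighted by concentration, is what turns the idea into a regional oxidation-capacity measure.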

  16. A Quantitative Vainberg Method for Black Box Scattering

    Science.gov (United States)

    Galkowski, Jeffrey

    2017-01-01

We give a quantitative version of Vainberg's method relating pole-free regions to propagation of singularities for black box scatterers. In particular, we show that there is a logarithmic resonance-free region near the real axis of size τ with polynomial bounds on the resolvent if and only if the wave propagator gains derivatives at rate τ. Next we show that if there exist singularities in the wave trace at times tending to infinity which smooth at rate τ, then there are resonances in logarithmic strips whose width is given by τ. As our main application of these results, we give sharp bounds on the size of resonance-free regions in scattering on geometrically nontrapping manifolds with conic points. Moreover, these bounds are generically optimal on exteriors of nontrapping polygonal domains.

  17. A quantitative assessment method for Ascaris eggs on hands.

    Directory of Open Access Journals (Sweden)

    Aurelie Jeandron

Full Text Available The importance of hands in the transmission of soil-transmitted helminths, especially Ascaris and Trichuris infections, is under-researched, partly because of the absence of a reliable method to quantify the number of eggs on hands. The aim of this study was therefore to develop a method to assess the number of Ascaris eggs on hands and determine the egg recovery rate of the method. Under laboratory conditions, hands were seeded with a known number of Ascaris eggs, air dried, and washed in a plastic bag retaining the washing water, in order to determine egg recovery rates for four different detergents (cationic [benzethonium chloride 0.1% and cetylpyridinium chloride CPC 0.1%], anionic [7X 1% - quadrafos, glycol ether, and dioctyl sulfosuccinate sodium salt], and non-ionic [Tween80 0.1% - polyethylene glycol sorbitan monooleate]) and two egg detection methods (McMaster technique and FLOTAC). A modified concentration McMaster technique showed the highest egg recovery rate from bags. Two of the four diluted detergents (benzethonium chloride 0.1% and 7X 1%) also showed a higher egg recovery rate and were then compared with de-ionized water for recovery of helminth eggs from hands. The highest recovery rate (95.6%) was achieved with a hand rinse performed with 7X 1%. Washing hands with de-ionized water resulted in an egg recovery rate of 82.7%. This washing method, performed with a low concentration of detergent, offers potential for quantitative investigation of contamination of hands with Ascaris eggs and of their role in human infection. Follow-up studies are needed to validate the hand washing method under field conditions, e.g. including people of different ages, lower levels of contamination, and various levels of hand cleanliness.
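The recovery-rate arithmetic in such seeding experiments is straightforward to sketch. The egg counts below are invented for illustration; only the 95.6% best-case figure corresponds to a rate reported in the abstract:

```python
def recovery_rate(recovered: int, seeded: int) -> float:
    """Percentage of seeded eggs recovered by a given wash protocol."""
    return 100.0 * recovered / seeded

# Hypothetical counts for hands each seeded with 500 Ascaris eggs:
counts = {
    "7X 1%": 478,                       # 478/500 -> 95.6%
    "benzethonium chloride 0.1%": 450,  # invented count
    "de-ionized water": 414,            # invented count
}
rates = {name: recovery_rate(n, 500) for name, n in counts.items()}
print(rates)
```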

  18. The large APEX bolometer camera LABOCA

    Science.gov (United States)

    Siringo, Giorgio; Kreysa, Ernst; Kovacs, Attila; Schuller, Frederic; Weiß, Axel; Esch, Walter; Gemünd, Hans-Peter; Jethava, Nikhil; Lundershausen, Gundula; Güsten, Rolf; Menten, Karl M.; Beelen, Alexandre; Bertoldi, Frank; Beeman, Jeffrey W.; Haller, Eugene E.; Colin, Angel

    2008-07-01

A new facility instrument, the Large APEX Bolometer Camera (LABOCA), developed by the Max-Planck-Institut für Radioastronomie (MPIfR, Bonn, Germany), was commissioned in May 2007 for operation on the Atacama Pathfinder Experiment telescope (APEX), a 12 m submillimeter radio telescope located at 5100 m altitude on Llano de Chajnantor in northern Chile. For mapping, this 295-bolometer camera for the 870 micron atmospheric window operates in total power mode, without wobbling the secondary mirror. One LABOCA beam is 19 arcsec FWHM, and the field of view of the complete array covers 100 square arcmin. Combined with the high efficiency of APEX and the excellent atmospheric transmission at the site, LABOCA offers unprecedented capability for large-scale mapping of submillimeter continuum emission. Details of design and operation are presented.

  19. A Method for Quantitative Determination of Biofilm Viability

    Directory of Open Access Journals (Sweden)

    Maria Strømme

    2012-06-01

    Full Text Available In this study we present a scheme for quantitative determination of biofilm viability offering significant improvement over existing methods with metabolic assays. Existing metabolic assays for quantifying viable bacteria in biofilms usually utilize calibration curves derived from planktonic bacteria, which can introduce large errors due to significant differences in the metabolic and/or growth rates of biofilm bacteria in the assay media compared to their planktonic counterparts. In the presented method we derive the specific growth rate of Streptococcus mutans bacteria biofilm from a series of metabolic assays using the pH indicator phenol red, and show that this information could be used to more accurately quantify the relative number of viable bacteria in a biofilm. We found that the specific growth rate of S. mutans in biofilm mode of growth was 0.70 h−1, compared to 1.09 h−1 in planktonic growth. This method should be applicable to other bacteria types, as well as other metabolic assays, and, for example, to quantify the effect of antibacterial treatments or the performance of bactericidal implant surfaces.
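The specific growth rate underlying the paper's correction can be computed from two biomass readings under an exponential-growth assumption. The readings below are invented, chosen only so that the resulting rates land near the reported 0.70 h−1 (biofilm) and 1.09 h−1 (planktonic) values:

```python
import math

def specific_growth_rate(n1: float, n2: float, dt_h: float) -> float:
    """mu = ln(N2/N1) / dt, assuming exponential growth over dt hours."""
    return math.log(n2 / n1) / dt_h

# Hypothetical viable-count proxies taken one hour apart:
mu_biofilm = specific_growth_rate(1.0e6, 2.01e6, 1.0)     # ~0.70 h^-1
mu_planktonic = specific_growth_rate(1.0e6, 2.97e6, 1.0)  # ~1.09 h^-1

# Ratio by which a calibration curve built from planktonic cells
# would overestimate viable biofilm bacteria in a metabolic assay:
correction = mu_planktonic / mu_biofilm
```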

  20. Comparative Evaluation of Quantitative Test Methods for Gases on a Hard Surface

    Science.gov (United States)

    2017-02-01

ECBC-TR-1426; Vipin Rastogi. In this study, two quantitative test methods were evaluated (the Three Step Method [TSM] and the Quantitative Disk Carrier Test [QCT]) for

  1. Breast tumour visualization using 3D quantitative ultrasound methods

    Science.gov (United States)

    Gangeh, Mehrdad J.; Raheem, Abdul; Tadayyon, Hadi; Liu, Simon; Hadizad, Farnoosh; Czarnota, Gregory J.

    2016-04-01

Breast cancer is one of the most common cancer types, accounting for 29% of all cancer cases. Early detection and treatment have a crucial impact on improving the survival of affected patients. Ultrasound (US) is a non-ionizing, portable, inexpensive, real-time imaging modality for screening and quantifying breast cancer. Due to these attractive attributes, the last decade has witnessed many studies on using quantitative ultrasound (QUS) methods in tissue characterization. However, these studies have mainly been limited to 2-D QUS methods using hand-held US (HHUS) scanners. With the availability of automated breast ultrasound (ABUS) technology, this study is the first to develop 3-D QUS methods for the ABUS visualization of breast tumours. Using an ABUS system, unlike with a manual 2-D HHUS device, the patient's whole breast was scanned in an automated manner. The acquired frames were subsequently examined, and a region of interest (ROI) was selected in each frame where tumour was identified. Standard 2-D QUS methods were used to compute spectral and backscatter coefficient (BSC) parametric maps on the selected ROIs. Next, the computed 2-D parameters were mapped to a Cartesian 3-D space, interpolated, and rendered to provide a transparent color-coded visualization of the entire breast tumour. Such 3-D visualization can potentially be used for further analysis of breast tumours in terms of their size and extent. Moreover, the 3-D volumetric scans can be used for tissue characterization and the categorization of breast tumours as benign or malignant by quantifying the computed parametric maps over the whole tumour volume.

  2. Petrous apex lesions outcome in 21 cases

    Directory of Open Access Journals (Sweden)

    Hekmatara M

    1997-09-01

Full Text Available Petrous apex lesions of the temporal bone progress slowly; most of the time they not only destroy this area but also involve neighbouring structures, and it is through the symptoms of this neighbouring neurovascular involvement that the lesions can be recognized. The most common symptoms of involvement of the petrous apex are: headache, hearing loss of conductive or sensorineural type, paresthesia and anesthesia of the trigeminal nerve, and paresis or paralysis of the facial and abducens nerves. In a retrospective study in the ENT and HNS wards of Amiralam hospital, 148 patients were operated on for temporal bone tumors; of these, 21 (13.6%) had petrous apex lesions of the temporal bone. Eleven (52.4%) of these 21 patients were men and the remaining 10 (47.6%) were women. The average age of the patients was 37 years. The common pathologies in these patients were glomus jugulare tumors, hemangioma, schwannoma, meningioma, congenital cholesteatoma, and giant cell granuloma. The operations performed on these patients were infratemporal, translabyrinthine, and middle fossa approaches. This study shows that the petrous apex is an occult site: the symptoms of these lesions are not characteristic, and meticulous attention to the history and physical examination is very helpful in recognizing the lesions and their extent.

  3. CHAMP+: a powerful array receiver for APEX

    NARCIS (Netherlands)

    Kasemann, C.; Güsten, R.; Heyminck, S.; Klein, B.; Klein, T.; Philipp, S.D.; Korn, A.; Schneider, G.; Henseler, A.; Baryshev, A.; Klapwijk, T.M.

    2006-01-01

    CHAMP+, a dual-color 2 × 7 element heterodyne array for operation in the 450 μm and 350 μm atmospheric windows is under development. The instrument, which is currently undergoing final evaluation in the laboratories, will be deployed for commissioning at the APEX telescope in August this year. With

  5. A Quantitative Method for Localizing User Interface Problems: The D-TEO Method

    Directory of Open Access Journals (Sweden)

    Juha Lamminen

    2009-01-01

Full Text Available A large array of evaluation methods has been proposed to identify Website usability problems. In log-based evaluation, information about the performance of users is collected and stored in log files, and used to find problems and deficiencies in Web page designs. Most methods require the programming and modeling of large task models, which are cumbersome processes for evaluators. Also, because much statistical data is collected in log files, recognizing which Web pages require deeper usability analysis is difficult. This paper suggests a novel quantitative method, called D-TEO, for locating problematic Web pages. This semiautomated method explores the decomposition of the interaction tasks of directed information search into elementary operations, deploying two quantitative usability criteria, search success and search time, to reveal how a user navigates within a web of hypertext.

  6. Quantitative methods to study epithelial morphogenesis and polarity.

    Science.gov (United States)

    Aigouy, B; Collinet, C; Merkel, M; Sagner, A

    2017-01-01

    Morphogenesis of an epithelial tissue emerges from the behavior of its constituent cells, including changes in shape, rearrangements, and divisions. In many instances the directionality of these cellular events is controlled by the polarized distribution of specific molecular components. In recent years, our understanding of morphogenesis and polarity highly benefited from advances in genetics, microscopy, and image analysis. They now make it possible to measure cellular dynamics and polarity with unprecedented precision for entire tissues throughout their development. Here we review recent approaches to visualize and measure cell polarity and tissue morphogenesis. The chapter is organized like an experiment. We first discuss the choice of cell and polarity reporters and describe the use of mosaics to reveal hidden cell polarities or local morphogenetic events. Then, we outline application-specific advantages and disadvantages of different microscopy techniques and image projection algorithms. Next, we present methods to extract cell outlines to measure cell polarity and detect cellular events underlying morphogenesis. Finally, we bridge scales by presenting approaches to quantify the specific contribution of each cellular event to global tissue deformation. Taken together, we provide an in-depth description of available tools and theoretical concepts to quantitatively study cell polarity and tissue morphogenesis over multiple scales.

  7. Quantitative Methods for Comparing Different Polyline Stream Network Models

    Energy Technology Data Exchange (ETDEWEB)

    Danny L. Anderson; Daniel P. Ames; Ping Yang

    2014-04-01

Two techniques for exploring the relative horizontal accuracy of complex linear spatial features are described, and sample source code (pseudo code) is presented for this purpose. The first technique, relative sinuosity, is presented as a measure of the complexity or detail of a polyline network in comparison to a reference network. We term the second technique longitudinal root mean squared error (LRMSE) and present it as a means for quantitatively assessing the horizontal variance between two polyline data sets representing digitized (reference) and derived stream and river networks. Both relative sinuosity and LRMSE are shown to be suitable measures of horizontal stream network accuracy for assessing quality and variation in linear features. Both techniques have been used in two recent investigations involving the extraction of hydrographic features from LiDAR elevation data. One confirmed that, with the greatly increased resolution of LiDAR data, smaller cell sizes yielded better stream network delineations, based on sinuosity and LRMSE, when using LiDAR-derived DEMs. The other demonstrated a new method of delineating stream channels directly from LiDAR point clouds, without the intermediate step of deriving a DEM, showing that direct delineation from LiDAR point clouds yielded a much better match, as indicated by the LRMSE.
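The sinuosity half of this comparison can be sketched compactly. The coordinate lists are toy data, and this reading of "relative sinuosity" (ratio of the two networks' sinuosities) is an assumption, not the authors' published pseudo code:

```python
import math

def sinuosity(points):
    """Along-path length divided by straight-line endpoint distance."""
    path = sum(math.dist(a, b) for a, b in zip(points, points[1:]))
    return path / math.dist(points[0], points[-1])

# Toy networks: a digitized (reference) reach and a coarser derived one.
reference = [(0, 0), (1, 1), (2, 0), (3, 1), (4, 0)]
derived = [(0, 0), (2, 0.5), (4, 0)]

# Relative sinuosity > 1 suggests the derived network under-represents
# the detail of the reference network.
relative = sinuosity(reference) / sinuosity(derived)
```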

  8. Quantitative Methods for Analysing Joint Questionnaire Data: Exploring the Role of Joint in Force Design

    Science.gov (United States)

    2015-08-01

David... joint activities selected by respondents, and the small sample size, quantitative analysis was conducted on the collected data. This statistical...

  9. Comparative morphology of dendritic arbors in populations of Purkinje cells in mouse sulcus and apex.

    Science.gov (United States)

    Nedelescu, Hermina; Abdelhack, Mohamed

    2013-01-01

Foliation divides the mammalian cerebellum into structurally distinct subdivisions, including the concave sulcus and the convex apex. Purkinje cell (PC) dendritic morphology varies between subdivisions and changes significantly ontogenetically. Since dendritic morphology both enables and limits sensory-motor circuit function, it is important to understand how neuronal architectures differ between brain regions. This study employed quantitative confocal microscopy to reconstruct dendritic arbors of cerebellar PCs expressing green fluorescent protein and compared arbor morphology between PCs of sulcus and apex in young and old mice. Arbors were digitized from high z-resolution (0.25 µm) image stacks using an adaptation of Neurolucida's (MBF Bioscience) continuous contour tracing tool, designed for drawing neuronal somata. The reconstructed morphologies reveal that dendritic arbors of sulcus and apex exhibit profound differences. In sulcus, 72% of the young PC population possesses two primary dendrites, whereas in apex, only 28% do. Spatial constraints in the young sulcus cause significantly more dendritic arbor overlap than in the young apex, a distinction that disappears in adulthood. However, adult sulcus PC arbors develop a greater number of branch crossings. These results suggest developmental neuronal plasticity that enables cerebellar PCs to attain the correct functional adult architecture under different spatial constraints.

  10. A fast semi-quantitative method for Plutonium determination in an alpine firn/ice core

    Science.gov (United States)

    Gabrieli, J.; Cozzi, G.; Vallelonga, P.; Schwikowski, M.; Sigl, M.; Boutron, C.; Barbante, C.

    2009-04-01

    deposition decreased very sharply reaching a minimum in 1967. The third period (1967-1975) is characterized by irregular Pu profiles with smaller peaks (about 20-30% compared to the 1964 peak) which could be due to French and Chinese tests. Comparison with the Pu profiles obtained from the Col du Dome and Belukha ice cores by AMS (Accelerator Mass Spectrometry) shows very good agreement. Considering the semi-quantitative method and the analytical uncertainty, the results are also quantitatively comparable. However, the Pu concentrations at Colle Gnifetti are normally 2-3 times greater than in Col du Dome. This could be explained by different air mass transport or, more likely, different accumulation rates at each site.

  11. In vitro evaluation of the accuracy of five different electronic apex locators

    Directory of Open Access Journals (Sweden)

    Nasil Sakkir

    2015-01-01

Full Text Available Objective of Study: To evaluate in vitro the efficacy of five different electronic apex locators (Root ZX II, i-Root, Endo Master, Triauto ZX, and Elements apex locator) in locating the minor diameter. Materials and Methods: Thirty freshly extracted single-rooted maxillary central incisors were used for the study. Standard access preparation was carried out and the teeth were glued to three plastic frames containing alginate. Electronic working length measurements were determined using all five apex locators. Following this, the actual canal length was determined by introducing a size 15 K-file into the canal until the tip of the file became visible at the apical foramen under a microscope. The mean values of actual length and electronic working length readings were compared using Student's t-test and multiple comparison procedures. Results: The average value for actual root canal length was 22.483 ± 1.8731 mm, and the mean electronic root canal length values for the Root ZX II, i-Root, Elements, Endo Master, and Triauto ZX apex locators were 22.483 ± 1.7640 mm, 22.400 ± 1.7390 mm, 22.717 ± 1.9462 mm, 22.767 ± 1.9061 mm, and 22.417 ± 1.7523 mm, respectively. P > 0.05 for all five tested apex locators. Conclusion: All five modern apex locators tested in this study can determine the working length with high precision and greater predictability.

  12. Comparison of a quantitative microtiter method, a quantitative automated method, and the plate-count method for determining microbial complement resistance.

    Science.gov (United States)

    Lee, M D; Wooley, R E; Brown, J; Spears, K R; Nolan, L K; Shotts, E B

    1991-01-01

A quantitative microtiter method for determining the degree of complement resistance or sensitivity of microorganisms is described. The microtiter method is compared with a quantitative automated system and the standard plate-count technique. Data were accumulated from 30 avian Escherichia coli isolates incubated at 35 °C with either chicken plasma or heat-inactivated chicken plasma. Analysis of data generated by the automated system and plate-count techniques resulted in a classification of the microorganisms into three groups: those sensitive to the action of complement; those of intermediate sensitivity to the action of complement; and those resistant to the action of complement. Although the three methods studied did not agree absolutely, there were statistically significant correlations among them.

  13. Thermography as a quantitative imaging method for assessing postoperative inflammation

    Science.gov (United States)

    Christensen, J; Matzen, LH; Vaeth, M; Schou, S; Wenzel, A

    2012-01-01

Objective: To assess differences in skin temperature between the operated and control sides of the face after mandibular third molar surgery using thermography. Methods: 127 patients had 1 mandibular third molar removed. Before the surgery, standardized thermograms were taken of both sides of the patient's face using a Flir ThermaCam™ E320 (Precisions Teknik AB, Halmstad, Sweden). The imaging procedure was repeated 2 days and 7 days after surgery. A region of interest including the third molar region was marked on each image, and the mean temperature within each region of interest was calculated. Differences between sides and over time were assessed using paired t-tests. Results: No significant difference was found between the operated side and the control side either before or 7 days after surgery (p > 0.3). The temperature of the operated side (mean: 32.39 °C, range: 28.9–35.3 °C) was higher than that of the control side (mean: 32.06 °C, range: 28.5–35.0 °C) 2 days after surgery [0.33 °C, 95% confidence interval (CI): 0.22–0.44 °C, p < 0.001]. After 2 days, the operated side was not significantly different from the temperature pre-operatively (p = 0.12), whereas the control side had a lower temperature (0.57 °C, 95% CI: 0.29–0.86 °C, p < 0.001). Conclusions: Thermography seems useful for quantitative assessment of inflammation between the intervention side and the control side after surgical removal of mandibular third molars. However, thermography cannot be used to assess absolute temperature changes, due to normal variations in skin temperature over time. PMID:22752326
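The paired design here (operated vs. control side of the same face) maps onto a paired t statistic, which can be sketched with the standard library alone. The temperature values below are invented for illustration and are not the study's data:

```python
import math
import statistics

def paired_t(x, y):
    """t statistic for a paired t-test on matched samples x and y."""
    diffs = [a - b for a, b in zip(x, y)]
    n = len(diffs)
    # mean difference over its standard error
    return statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(n))

# Hypothetical mean ROI temperatures (°C) two days after surgery:
operated = [32.9, 32.1, 33.0, 31.8, 32.6, 32.4]
control = [32.4, 31.9, 32.6, 31.5, 32.3, 32.1]

t = paired_t(operated, control)  # positive t: operated side is warmer
```

The p-value would then come from the t distribution with n − 1 degrees of freedom (e.g. via scipy.stats), which is omitted here to keep the sketch dependency-free.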

14. Electronic apex locator: A comprehensive literature review — Part II: Effect of different clinical and technical conditions on electronic apex locator's accuracy

    Directory of Open Access Journals (Sweden)

    Hamid Razavian

    2014-01-01

Full Text Available Introduction: To investigate the effects of different clinical and technical conditions on the accuracy of electronic apex locators (EALs). Materials and Methods: "Tooth apex," "dental instrument," "odontometry," "electronic medical," and "electronic apex locator" were searched as primary identifiers via the Medline/PubMed, Cochrane Library, and Scopus databases up to 30 July 2013. Original articles that fulfilled the inclusion criteria were selected and reviewed. Results: Out of 402 relevant studies, 183 were selected based on the inclusion criteria; 75 of these studies are presented in this part. Pulp vitality conditions, root resorption, and the types of files and irrigating materials do not affect an EAL's accuracy; however, the file size and foramen diameter can affect its accuracy. Conclusions: Various clinical conditions such as file size and foramen diameter may affect EALs' accuracy. However, more randomized clinical trials are needed for a definitive conclusion.

  15. Using Apex To Construct CPM-GOMS Models

    Science.gov (United States)

    John, Bonnie; Vera, Alonso; Matessa, Michael; Freed, Michael; Remington, Roger

    2006-01-01

A process for automatically generating computational models of human/computer interactions, as well as graphical and textual representations of the models, has been built on the conceptual foundation of a method known in the art as CPM-GOMS. This method is so named because it combines (1) the task decomposition of analysis according to an underlying method known in the art as the goals, operators, methods, and selection (GOMS) method with (2) a model of human resource usage at the level of cognitive, perceptual, and motor (CPM) operations. CPM-GOMS models have made accurate predictions about the behaviors of skilled computer users in routine tasks, but heretofore such models have been generated in a tedious, error-prone manual process. In the present process, CPM-GOMS models are generated automatically from a hierarchical task decomposition expressed by use of a computer program, known as Apex, designed previously to be used to model human behavior in complex, dynamic tasks. An inherent capability of Apex for scheduling of resources automates the difficult task of interleaving the cognitive, perceptual, and motor resources that underlie common task operators (e.g., move and click mouse). The user interface of Apex automatically generates Program Evaluation Review Technique (PERT) charts, which enable modelers to visualize the complex parallel behavior represented by a model. Because interleaving and the generation of displays to aid visualization are automated, it is now feasible to construct arbitrarily long sequences of behaviors. The process was tested by using Apex to create a CPM-GOMS model of a relatively simple human/computer-interaction task and comparing the time predictions of the model with measurements of the times taken by human users in performing the various steps of the task. The task was to withdraw $80 in cash from an automated teller machine (ATM). For the test, a Visual Basic mockup of an ATM was created, with a provision for input from (and measurement

  16. A Framework for Mixing Methods in Quantitative Measurement Development, Validation, and Revision: A Case Study

    Science.gov (United States)

    Luyt, Russell

    2012-01-01

    A framework for quantitative measurement development, validation, and revision that incorporates both qualitative and quantitative methods is introduced. It extends and adapts Adcock and Collier's work, and thus, facilitates understanding of quantitative measurement development, validation, and revision as an integrated and cyclical set of…

  18. ADVANCING THE STUDY OF VIOLENCE AGAINST WOMEN USING MIXED METHODS: INTEGRATING QUALITATIVE METHODS INTO A QUANTITATIVE RESEARCH PROGRAM

    Science.gov (United States)

    Testa, Maria; Livingston, Jennifer A.; VanZile-Tamsen, Carol

    2011-01-01

    A mixed methods approach, combining quantitative with qualitative data methods and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women’s sexual victimization has resulted in valuable scientific insight and generation of novel hypotheses for testing. This mixed methods approach is described and recommendations for integrating qualitative data into quantitative research are provided. PMID:21307032

  19. ADVANCING THE STUDY OF VIOLENCE AGAINST WOMEN USING MIXED METHODS: INTEGRATING QUALITATIVE METHODS INTO A QUANTITATIVE RESEARCH PROGRAM

    OpenAIRE

    Testa, Maria; Livingston, Jennifer A.; VanZile-Tamsen, Carol

    2011-01-01

    A mixed methods approach, combining quantitative with qualitative data methods and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women’s sexual victimization has resulted in valuable scientific insight and generation of novel hypotheses for testing. This mixed methods approach is described and recommendations for integrating qualitative data into quantitative res...

  20. Advancing the study of violence against women using mixed methods: integrating qualitative methods into a quantitative research program.

    Science.gov (United States)

    Testa, Maria; Livingston, Jennifer A; VanZile-Tamsen, Carol

    2011-02-01

    A mixed methods approach, combining quantitative with qualitative data methods and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women's sexual victimization has resulted in valuable scientific insight and generation of novel hypotheses for testing. This mixed methods approach is described and recommendations for integrating qualitative data into quantitative research are provided.

  2. Asthma education: different viewpoints elicited by qualitative and quantitative methods.

    Science.gov (United States)

    Damon, Scott A; Tardif, Richard R

    2015-04-01

    This project began as a qualitative examination of how asthma education provided by health professionals could be improved. Unexpected qualitative findings regarding the use of Asthma Action Plans and the importance of insurance reimbursement for asthma education prompted further quantitative examination. Qualitative individual interviews were conducted with primary care physicians in private practice who routinely provide initial diagnoses of asthma, and focus groups were conducted with other clinicians in private primary care practices who routinely provide asthma education. Using the DocStyles quantitative tool, two questions regarding Asthma Action Plans and insurance reimbursement were asked of a representative sample of physicians and other clinicians. The utility of Asthma Action Plans was questioned in the 2012 qualitative study. Qualitative findings also raised questions regarding whether reimbursement is the barrier to asthma education for patients performed by medical professionals it is thought to be. The 2013 quantitative findings show that the majority of clinicians see Asthma Action Plans as useful. The question of whether reimbursement is a barrier to providing asthma education to patients was not resolved by the quantitative data. The majority of clinicians see Asthma Action Plans as a useful tool for patient education. Clinicians had less clear opinions on whether the lack of defined reimbursement codes acted as a barrier to asthma education. The study also provided useful audience data for design of new asthma educational tools developed by CDC.

  3. Guidelines for Reporting Quantitative Methods and Results in Primary Research

    Science.gov (United States)

    Norris, John M.; Plonsky, Luke; Ross, Steven J.; Schoonen, Rob

    2015-01-01

    Adequate reporting of quantitative research about language learning involves careful consideration of the logic, rationale, and actions underlying both study designs and the ways in which data are analyzed. These guidelines, commissioned and vetted by the board of directors of "Language Learning," outline the basic expectations for…

  4. Guidelines for reporting quantitative methods and results in primary research

    NARCIS (Netherlands)

    Norris, J.M.; Plonsky, L.; Ross, S.J.; Schoonen, R.

    2015-01-01

    Adequate reporting of quantitative research about language learning involves careful consideration of the logic, rationale, and actions underlying both study designs and the ways in which data are analyzed. These guidelines, commissioned and vetted by the board of directors of Language Learning, outline the basic expectations for…

  5. Guidelines for reporting quantitative methods and results in primary research

    NARCIS (Netherlands)

    Norris, J.M.; Plonsky, L.; Ross, S.J.; Schoonen, R.

    2015-01-01

    Adequate reporting of quantitative research about language learning involves careful consideration of the logic, rationale, and actions underlying both study designs and the ways in which data are analyzed. These guidelines, commissioned and vetted by the board of directors of Language Learning, outline the basic expectations for…

  6. Guidelines for Reporting Quantitative Methods and Results in Primary Research

    Science.gov (United States)

    Norris, John M.; Plonsky, Luke; Ross, Steven J.; Schoonen, Rob

    2015-01-01

    Adequate reporting of quantitative research about language learning involves careful consideration of the logic, rationale, and actions underlying both study designs and the ways in which data are analyzed. These guidelines, commissioned and vetted by the board of directors of "Language Learning," outline the basic expectations for…

  7. Data acquisition with the APEX hyperspectral sensor

    Directory of Open Access Journals (Sweden)

    Vreys Kristin

    2016-03-01

    Full Text Available APEX (Airborne Prism EXperiment) is a high spectral and spatial resolution hyperspectral sensor developed by a Swiss-Belgian consortium on behalf of the European Space Agency. Since the acceptance of the instrument in 2010, it has been operated jointly by the Flemish Institute for Technological Research (VITO, Mol, Belgium) and the Remote Sensing Laboratories (RSL, Zurich, Switzerland). During this period, several flight campaigns have been performed across Europe, gathering over 4 Terabytes of raw data. Following radiometric, geometric and atmospheric processing, this data has been provided to a multitude of Belgian and European researchers, institutes and agencies, including the European Space Agency (ESA), the European Facility for Airborne Research (EUFAR) and the Belgian Science Policy Office (BelSPO). The applications of APEX data span a wide range of research topics, e.g. landcover mapping (mountainous, coastal, countryside and urban regions), the assessment of important structural and (bio)physical characteristics of vegetative and non-vegetative species, the tracing of atmospheric gases, and water content analysis (chlorophyll, suspended matter). Recurrent instrument calibration, accurate flight planning and preparation, and experienced pilots and instrument operators are crucial to successful data acquisition campaigns. In this paper, we highlight in detail these practical aspects of a typical APEX data acquisition campaign.

  8. Rhabdomyolysis Presenting as Orbital Apex Syndrome.

    Science.gov (United States)

    Wi, Jae Min; Chi, Mijung

    2016-01-01

    Rhabdomyolysis is a condition in which striated muscle tissue breaks down rapidly and releases muscular cell constituents into extracellular fluid and the circulation. Renal symptoms, such as acute renal failure, are major complications of rhabdomyolysis. However, rhabdomyolysis with orbital complications has not previously been reported. Here, we report the first case of rhabdomyolysis presenting as orbital apex syndrome. A 66-year-old man presented with right periorbital swelling with erythematous patches and conjunctival chemosis. In addition, swelling, redness, and vesicles were observed in both lower legs. He was found in a drunken state with the right side of his face pressed against a table. Ophthalmic examination showed right eye fixation in all directions and ischemic change of the retina. Blood testing showed elevated muscle enzymes consistent with muscle destruction, and computed tomography of the orbit showed swelling of the right extraocular muscles and crowding of the right orbital apex. Under a diagnosis of rhabdomyolysis-associated orbital apex syndrome and central retinal artery occlusion, intravenous steroid and antibiotic therapy with intraocular pressure-lowering topicals was begun. Clinical presentation, treatment course, and follow-up are discussed.

  9. A Quantitative Method for Long-Term Water Erosion Impacts on Productivity with a Lack of Field Experiments: A Case Study in Huaihe Watershed, China

    Directory of Open Access Journals (Sweden)

    Degen Lin

    2016-07-01

    Full Text Available Water erosion reduces farmland productivity, and the longer the period of cultivation, the more vulnerable agricultural productivity becomes. The vulnerability of farmland productivity to long-term water erosion therefore needs assessment. The key to quantitative assessment is a method that uses water loss scenarios to calculate the productivity losses caused by long-term water erosion. This study uses the agricultural policy environmental extender (APEX) model and the global hydrological watershed unit, and selects the Huaihe River watershed as a case study to describe the methodology. An erosion-variable control method considering soil and water conservation measure scenarios was used to study the relationship between long-term erosion and productivity losses, fitting a 3D surface (relating three elements: time, the cumulative amount of water erosion, and productivity losses) to measure long-term water erosion. Results showed that: (1) the 3D surfaces fit significantly well; fitting by the 3D surface reflects the impact of long-term water erosion on productivity more accurately than fitting by a 2D curve (relating two elements: water erosion and productivity losses); (2) the cumulative loss surface can reflect differences in productivity loss caused by long-term water erosion.
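
The surface-fitting idea described in the abstract can be sketched with least squares on synthetic data; the coefficients, units, noise level and functional form below are illustrative assumptions, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic records: cultivation time t (years) and cumulative erosion E (arbitrary units).
t = rng.uniform(1, 30, 200)
E = rng.uniform(0, 100, 200)
# Hypothetical productivity loss depending on both time and erosion, plus noise.
loss = 0.2 * E + 0.5 * t + 0.005 * t * E + rng.normal(0, 1.0, 200)

def fit_r2(X, y):
    """Least-squares fit y ~ X @ beta; return the coefficient of determination R^2."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

ones = np.ones_like(t)
# 2D curve: loss as a function of cumulative erosion only.
r2_2d = fit_r2(np.column_stack([ones, E]), loss)
# 3D surface: loss as a function of time, erosion and their interaction.
r2_3d = fit_r2(np.column_stack([ones, t, E, t * E]), loss)

print(round(r2_2d, 3), round(r2_3d, 3))
```

Because the synthetic loss depends on time as well as erosion, the three-element surface explains more variance than the two-element curve, mirroring the paper's conclusion (1).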

  10. Integrating Quantitative and Qualitative Results in Health Science Mixed Methods Research Through Joint Displays

    National Research Council Canada - National Science Library

    Guetterman, Timothy C; Fetters, Michael D; Creswell, John W

    2015-01-01

    .... We analyzed each of 19 identified joint displays to extract the type of display, mixed methods design, purpose, rationale, qualitative and quantitative data sources, integration approaches, and analytic...

  11. Development of a Fluorescence Quantitative PCR Method for Detection of Marteilia refringens in Shellfish

    Institute of Scientific and Technical Information of China (English)

    Liji XIE; Zhixun XIE; Yaoshan PANG; Jiabo LIU; Xianwen DENG; Zhiqin XIE

    2012-01-01

    Abstract [Objective] The aim of this paper was to develop a fluorescence quantitative PCR method for detection of M. refringens in shellfish. [Method] A pair of primers and a TaqMan probe were designed and synthesized according to the conserved gene sequences of M. refringens in GenBank, so as to develop a fluorescence quantitative PCR method for detection of M. refringens. The developed fluorescence quantitative PCR method was compared with conventional PCR detection. [Result] The fluorescence quantitative PCR could detect 40 template copies of plasmid DNA, and its sensitivity was 100 times higher than that of conventional PCR. The detection results for Perkinsus sp., Haplosporidium sp., Aeromonas hydrophila, Pseudomonas fluorescens, Vibrio parahaemolyticus, Vibrio alginolyticus, Vibrio fluvialis and Vibrio mimicus were negative. [Conclusion] The fluorescence quantitative PCR method for M. refringens established in this paper is specific, sensitive, rapid and quantitative with good repeatability, and can be used for clinical detection of M. refringens infection.

  12. Quantitative risk assessment methods for cancer and noncancer effects.

    Science.gov (United States)

    Baynes, Ronald E

    2012-01-01

    Human health risk assessments have evolved from the more qualitative approaches to more quantitative approaches in the past decade. This has been facilitated by the improvement in computer hardware and software capability and novel computational approaches being slowly recognized by regulatory agencies. These events have helped reduce the reliance on experimental animals as well as better utilization of published animal toxicology data in deriving quantitative toxicity indices that may be useful for risk management purposes. This chapter briefly describes some of the approaches as described in the guidance documents from several of the regulatory agencies as it pertains to hazard identification and dose-response assessment of a chemical. These approaches are contrasted with more novel computational approaches that provide a better grasp of the uncertainty often associated with chemical risk assessments. Copyright © 2012 Elsevier Inc. All rights reserved.

  13. Terminology input in ESP course-books: quantitative evaluation method

    OpenAIRE

    JENDRYCH ELZBIETA

    2016-01-01

    The effectiveness of any teaching process depends, among other determinants, on the quality of input material and didactic relevance of teaching materials. ESP teachers try to select course-books and supplementary materials which can teach better than others. What criteria of selection do they use? Intuition and experience. Both may certainly be useful but they cannot give the hard evidence that quantitative language studies provide. Today, this hard evidence is needed more than ever before b...

  14. A quantitative SMRT cell sequencing method for ribosomal amplicons.

    Science.gov (United States)

    Jones, Bethan M; Kustka, Adam B

    2017-04-01

    Advances in sequencing technologies continue to provide unprecedented opportunities to characterize microbial communities. For example, the Pacific Biosciences Single Molecule Real-Time (SMRT) platform has emerged as a unique approach harnessing DNA polymerase activity to sequence template molecules, enabling long reads at low costs. With the aim to simultaneously classify and enumerate in situ microbial populations, we developed a quantitative SMRT (qSMRT) approach that involves the addition of exogenous standards to quantify ribosomal amplicons derived from environmental samples. The V7-9 regions of 18S SSU rDNA were targeted and quantified from protistan community samples collected in the Ross Sea during the Austral summer of 2011. We used three standards of different length and optimized conditions to obtain accurate quantitative retrieval across the range of expected amplicon sizes, a necessary criterion for analyzing taxonomically diverse 18S rDNA molecules from natural environments. The ability to concurrently identify and quantify microorganisms in their natural environment makes qSMRT a powerful, rapid and cost-effective approach for defining ecosystem diversity and function.
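
The quantitative logic of exogenous spike-in standards can be sketched as follows; the standard names, copy numbers and read counts are invented for illustration and are not data from the study:

```python
# Known copies of each exogenous standard spiked into the sample before sequencing.
added = {"std_short": 1.0e5, "std_mid": 1.0e5, "std_long": 1.0e5}

# Reads recovered for the standards and for two hypothetical protistan taxa.
reads = {"std_short": 480, "std_mid": 510, "std_long": 450,
         "taxon_A": 960, "taxon_B": 240}

# Copies represented per read, averaged over the three standards. Using
# standards of different lengths (as in the paper) checks that recovery is
# uniform across the expected amplicon size range.
per_read = sum(added[s] / reads[s] for s in added) / len(added)

# Absolute abundance estimate for each taxon.
copies = {t: reads[t] * per_read for t in ("taxon_A", "taxon_B")}
print({t: round(c) for t, c in copies.items()})
```

The same read table thus yields classification (which taxa) and enumeration (how many copies) in one pass, which is the point of the qSMRT design.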

  15. Efficiency of 2 electronic apex locators on working length determination: A clinical study

    Directory of Open Access Journals (Sweden)

    Sibel Koçak

    2013-01-01

    Full Text Available Aims: The aim of this clinical study was to evaluate the clinical accuracy of two electronic apex locators (EALs). Materials and Methods: A total of 120 patients with 283 roots were randomized into three groups: traditional radiographic method, EAL (Root ZX mini), and apex-locating endodontic motor (VDW Gold) for working length (WL) determination. Root canals were instrumented to a size ProTaper F3 nickel-titanium file. The obturation quality of the matched tapered master cone (ProTaper F3) was assessed to judge the accuracy of the WL. Statistical Analysis Used: Descriptive statistics were expressed as numbers and percentages. The Pearson chi-square test was used to determine differences between groups. P < 0.05 was considered statistically significant for all tests. Results: There was no statistically significant difference between the three tested groups (P = 0.894). Conclusions: The success of both apex locators was similar to that of the radiographic WL determination technique.

  16. To develop a quantitative method for predicting shrinkage porosity in squeeze casting

    Institute of Scientific and Technical Information of China (English)

    Shaomin Li; Kenichiro Mine; Shinji Sanakanishi; Koichi Anzai

    2009-01-01

    In order to secure high strength and high elongation of suspension parts, it is critical to predict shrinkage porosity quantitatively. A new simulation method for quantitative prediction of shrinkage porosity when replenishing molten metal has been proposed for the squeeze casting process. To examine the accuracy of the calculation model, the proposed method was applied to a plate model.

  17. Studying learning in the healthcare setting: the potential of quantitative diary methods

    NARCIS (Netherlands)

    Ciere, Yvette; Jaarsma, Debbie; Visser, Annemieke; Sanderman, Robbert; Snippe, Evelien; Fleer, Joke

    2015-01-01

    Quantitative diary methods are longitudinal approaches that involve the repeated measurement of aspects of people’s experience of daily life. In this article, we outline the main characteristics and applications of quantitative diary methods and discuss how their use may further research in the field.

  18. Studying learning in the healthcare setting : the potential of quantitative diary methods

    NARCIS (Netherlands)

    Ciere, Yvette; Jaarsma, Debbie; Visser, Annemieke; Sanderman, Robbert; Snippe, Evelien; Fleer, Joke

    2015-01-01

    Quantitative diary methods are longitudinal approaches that involve the repeated measurement of aspects of people's experience of daily life. In this article, we outline the main characteristics and applications of quantitative diary methods and discuss how their use may further research in the field.

  19. Difficulties Experienced by Education and Sociology Students in Quantitative Methods Courses.

    Science.gov (United States)

    Murtonen, Mari; Lehtinen, Erno

    2003-01-01

    Examined difficulties Finnish university students experienced in learning quantitative methods. Education and sociology students rated different topics on the basis of their difficulty. Overall, students considered statistics and quantitative methods more difficult than other domains. They tended to polarize academic subjects into…

  20. Studying learning in the healthcare setting : the potential of quantitative diary methods

    NARCIS (Netherlands)

    Ciere, Yvette; Jaarsma, Debbie; Visser, Annemieke; Sanderman, Robbert; Snippe, Evelien; Fleer, Joke

    2015-01-01

    Quantitative diary methods are longitudinal approaches that involve the repeated measurement of aspects of people's experience of daily life. In this article, we outline the main characteristics and applications of quantitative diary methods and discuss how their use may further research in the field.

  1. Studying learning in the healthcare setting: the potential of quantitative diary methods

    NARCIS (Netherlands)

    Ciere, Yvette; Jaarsma, Debbie; Visser, Annemieke; Sanderman, Robbert; Snippe, Evelien; Fleer, Joke

    2015-01-01

    Quantitative diary methods are longitudinal approaches that involve the repeated measurement of aspects of people’s experience of daily life. In this article, we outline the main characteristics and applications of quantitative diary methods and discuss how their use may further research in the field.

  2. Measurement and theoretical analysis of the under-seal pressure of rotary engine apex seals

    Energy Technology Data Exchange (ETDEWEB)

    Matsuura, K.; Terasaki, K. (Aoyama gakuin Univ., Tokyo (Japan). School of Science and Technology)

    1991-08-25

    A fundamental structure for an experimental rotary engine, which satisfies the conditions of the actual machine and can deliver the phenomena on the rotor as multichannel electrical signals, and a method of extracting those signals have been developed. The pressures under two apex seals were measured at medium speed and up to high load, yielding several new findings. The pressure under the 6 mm thick apex seal fell considerably below that of the working chamber as the rotation speed increased at high load, whereas the decrease was small with the 3 mm thick apex seal. The temperature of the gas flowing into the under-seal chamber, the temperature of the sealing groove wall, and the pressures in the working chamber before and after the apex seals, measured by a pressure sensor mounted in the housing, were compared with values calculated by a theoretical analysis method for the under-seal pressure, to clarify the effect of each factor on the under-seal pressure. 13 refs., 18 figs., 1 tab.

  3. Neurothekeoma of petrous apex: A rare entity

    Directory of Open Access Journals (Sweden)

    Zarina Abdul Assis

    2013-01-01

    Full Text Available Intraosseous nerve sheath tumors are very rare, accounting for less than 0.2% of primary bone tumors. We present an 18-year-old female with left facial paresis of 1 year's duration. Magnetic resonance imaging (MRI) demonstrated an expansile, multiseptated, enhancing bony lesion in the left petrous apex. There was also abnormal enhancement of the 7th-8th nerve complex within the internal auditory canal. The tumor was excised by a subtemporal extradural approach. The lesion was diagnosed as intraosseous neurothekeoma on histopathology. This is an extremely rare tumor, and its MRI appearance in this location is described for the first time in the literature.

  4. Quantitative XRD Analysis of Cement Clinker by the Multiphase Rietveld Method

    Institute of Scientific and Technical Information of China (English)

    HONG Han-lie; FU Zheng-yi; MIN Xin-min

    2003-01-01

    Quantitative phase analysis of Portland cement clinker samples was performed using an adaptation of the Rietveld method. The Rietveld quantitative analysis program, originally in Fortran 77 code, was substantially rewritten in Visual Basic with a Windows 9X graphical user interface, which frees it from the 640 KB directly usable memory constraint and allows convenient operation under Windows. The Rietveld quantitative method provides numerous advantages over conventional XRD quantitative methods, especially regarding intensity anomalies and peak superposition problems. Examples of its use are given together with results from other methods. It is concluded that, at present, the Rietveld method is the most suitable one for quantitative phase analysis of Portland cement clinker.
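
Once a Rietveld refinement has produced a scale factor S_i for each phase, weight fractions follow from the standard relation W_i = S_i(ZMV)_i / Σ_j S_j(ZMV)_j, where Z is the number of formula units per cell, M the formula mass and V the cell volume. A minimal sketch follows; the phase names are the usual clinker phases, but the scale factors and ZMV values are made up for illustration:

```python
# Phase weight fractions from refined Rietveld scale factors:
#   W_i = S_i * (Z M V)_i / sum_j S_j * (Z M V)_j
# Illustrative (invented) refined values for four clinker phases.
phases = {
    #            scale S   Z*M*V (arbitrary consistent units)
    "alite":     (1.20e-6, 4.0e6),
    "belite":    (0.45e-6, 2.3e6),
    "aluminate": (0.30e-6, 1.5e6),
    "ferrite":   (0.25e-6, 1.8e6),
}

total = sum(s * zmv for s, zmv in phases.values())
fractions = {name: s * zmv / total for name, (s, zmv) in phases.items()}

print({name: round(100 * w, 1) for name, w in fractions.items()})  # weight %
```

This normalization is what lets Rietveld analysis sidestep the overlap and intensity-anomaly problems of single-peak XRD quantitation: the whole pattern, not one reflection, determines each S_i.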

  5. Accuracy of three electronic apex locators in the presence of different irrigating solutions

    Directory of Open Access Journals (Sweden)

    Ana Laura Pion Carvalho

    2010-12-01

    Full Text Available The present study compared the accuracy of three electronic apex locators (EALs) - Elements Diagnostic®, Root ZX® and Apex DSP® - in the presence of different irrigating solutions (0.9% saline solution and 1% sodium hypochlorite). The electronic measurements were carried out by three examiners, using twenty extracted human permanent maxillary central incisors. A size 10 K-file was introduced into the root canals until reaching the 0.0 mark, and was subsequently retracted to the 1.0 mark. The gold standard (GS) measurement was obtained by combining visual and radiographic methods, and was set 1 mm short of the apical foramen. Electronic length values closer to the GS (± 0.5 mm) were considered accurate measures. Intraclass correlation coefficients (ICCs) were used to verify inter-examiner agreement. The comparison among the EALs was performed using the McNemar and Kruskal-Wallis tests. Elements Diagnostic® and Root ZX® performed similarly (p > 0.05), independent of the irrigating solutions used. The measurements taken with these two EALs were more accurate than those taken with Apex DSP®, regardless of the irrigating solution used (p < 0.05). It was concluded that the Elements Diagnostic® and Root ZX® apex locators are able to locate the cementum-dentine junction more precisely than Apex DSP®. The presence of irrigating solutions does not interfere with the performance of the EALs.

  6. [Quantitative analysis of alloy steel based on laser induced breakdown spectroscopy with partial least squares method].

    Science.gov (United States)

    Cong, Zhi-Bo; Sun, Lan-Xiang; Xin, Yong; Li, Yang; Qi, Li-Feng; Yang, Zhi-Jia

    2014-02-01

    In the present paper, both the partial least squares (PLS) method and the calibration curve (CC) method were used for quantitative analysis of laser-induced breakdown spectroscopy data obtained from standard alloy steel samples. Both major and trace elements were quantitatively analyzed. Comparing the results of the two calibration methods yielded some useful conclusions: for major elements, the PLS method is better than the CC method in quantitative analysis; more importantly, for trace elements, the CC method cannot give quantitative results because of the extremely weak characteristic spectral lines, whereas the PLS method retains good quantitative capability. The regression coefficients of the PLS method were also compared with the original spectral data, including the background interference, to explain the advantage of the PLS method in LIBS quantitative analysis. The results prove that the PLS method, applied to laser-induced breakdown spectroscopy, is suitable for quantitative analysis of trace elements such as C in the metallurgical industry.
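
The contrast between the two calibration strategies can be sketched with a toy NIPALS PLS1 implementation on synthetic "spectra"; everything here (channel positions, interference model, component count) is an illustrative assumption, not the paper's data or code:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic spectra: 40 samples x 50 channels. Concentration c drives several
# channels; a background term interferes with the single channel that a
# univariate calibration curve would rely on.
n, p = 40, 50
c = rng.uniform(0.1, 5.0, n)               # concentrations (arbitrary units)
background = rng.normal(0, 0.5, n)         # matrix/background interference
X = rng.normal(0, 0.05, (n, p))
X[:, 10] += 1.0 * c + background           # main analytical line (interfered)
X[:, 20] += 0.6 * c                        # secondary lines (clean)
X[:, 30] += 0.3 * c

def pls1_fit(X, y, k=2):
    """Minimal NIPALS PLS1: regression coefficients for mean-centered data."""
    Xc, yc = X - X.mean(0), y - y.mean()
    W, P, Q = [], [], []
    for _ in range(k):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)
        t = Xc @ w
        p_ = Xc.T @ t / (t @ t)
        q = yc @ t / (t @ t)
        Xc = Xc - np.outer(t, p_)          # deflate X
        yc = yc - q * t                    # deflate y
        W.append(w); P.append(p_); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    return W @ np.linalg.inv(P.T @ W) @ Q

B = pls1_fit(X, c, k=3)
pred_pls = (X - X.mean(0)) @ B + c.mean()

# Calibration curve: ordinary least squares on the single main line only.
a, b = np.polyfit(X[:, 10], c, 1)
pred_cc = a * X[:, 10] + b

rmse = lambda y, yhat: float(np.sqrt(np.mean((y - yhat) ** 2)))
print(rmse(c, pred_pls), rmse(c, pred_cc))
```

Because PLS pools information from all correlated channels, it remains usable when any single line is weak or interfered, which is the behavior the abstract reports for trace elements.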

  7. Modeling conflict : research methods, quantitative modeling, and lessons learned.

    Energy Technology Data Exchange (ETDEWEB)

    Rexroth, Paul E.; Malczynski, Leonard A.; Hendrickson, Gerald A.; Kobos, Peter Holmes; McNamara, Laura A.

    2004-09-01

    This study investigates the factors that lead countries into conflict. Specifically, political, social and economic factors may offer insight as to how prone a country (or set of countries) may be for inter-country or intra-country conflict. Largely methodological in scope, this study examines the literature for quantitative models that address or attempt to model conflict both in the past, and for future insight. The analysis concentrates specifically on the system dynamics paradigm, not the political science mainstream approaches of econometrics and game theory. The application of this paradigm builds upon the most sophisticated attempt at modeling conflict as a result of system level interactions. This study presents the modeling efforts built on limited data and working literature paradigms, and recommendations for future attempts at modeling conflict.

  8. New method to quantitatively evaluate the homogeneity of asphalt mixtures

    Institute of Scientific and Technical Information of China (English)

    WU Wen-liang; ZHANG Xiao-ning; WANG Duan-yi; LI Zhi

    2009-01-01

    To evaluate the homogeneity of asphalt mixtures, images of sections obtained by cutting an asphalt mixture specimen horizontally or vertically were analyzed with digital image processing techniques, and the particle area ratio was obtained by applying a sector scan to horizontal specimens and a vertical scan to vertical ones. The results indicate that the influence of the random distribution of aggregates when cutting the specimen can be eliminated by using colored aggregates to distinguish coarse from fine aggregates and using a color threshold to segment the images. With three typical gradations, and having shown that the particle area ratio obeys a normal distribution, it is feasible to quantitatively evaluate the homogeneity of asphalt mixtures using the variability of the particle area ratio as an index.
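
The particle-area-ratio index can be sketched on a synthetic section image; the threshold value, strip width, blob model and the choice of the coefficient of variation as the variability index are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic 200x200 "section image": colored aggregate pixels are bright,
# mastic/fine material is dark.
img = rng.uniform(0.0, 0.4, (200, 200))
for _ in range(60):                        # scatter bright aggregate blobs
    r, c = rng.integers(0, 190, 2)
    img[r:r + 10, c:c + 10] = rng.uniform(0.7, 1.0)

# Color threshold segments coarse aggregate from the rest.
binary = img > 0.5

# Particle area ratio in vertical scan strips (one per 20-pixel band).
ratios = [binary[:, j:j + 20].mean() for j in range(0, 200, 20)]

# Lower variability of the strip ratios => more homogeneous mixture.
cv = float(np.std(ratios) / np.mean(ratios))
print([round(r, 3) for r in ratios], round(cv, 3))
```

Repeating this over many sections and checking that the strip ratios are approximately normally distributed is what justifies summarizing homogeneity with a single variability statistic.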

  9. [Research progress of real-time quantitative PCR method for group A rotavirus detection].

    Science.gov (United States)

    Guo, Yan-Qing; Li, Dan-Di; Duan, Zhao-Jun

    2013-11-01

    Group A rotavirus is one of the most significant etiological agents of acute gastroenteritis among infants and young children worldwide. So far, several methods have been established for the detection of rotavirus, including electron microscopy (EM), enzyme immunoassay (EIA), reverse transcription-polymerase chain reaction (RT-PCR) and real-time quantitative PCR. Compared with the other methods, real-time quantitative PCR has advantages in specificity, sensitivity, genotyping and quantitative accuracy. This article gives an overview of the application of the real-time quantitative PCR technique to the detection of group A rotavirus.
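
The standard-curve arithmetic behind absolute real-time qPCR quantification can be sketched as follows; the Ct values and copy numbers are invented for illustration:

```python
import math

# Hypothetical standard curve from a plasmid dilution series:
# (known copy number, measured Ct value).
standards = [(1e6, 15.1), (1e5, 18.5), (1e4, 21.9),
             (1e3, 25.3), (1e2, 28.7), (4e1, 30.0)]

# Linear fit of Ct against log10(copies): Ct = slope * log10(N) + intercept.
xs = [math.log10(n) for n, _ in standards]
ys = [ct for _, ct in standards]
m = len(xs)
mx, my = sum(xs) / m, sum(ys) / m
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
intercept = my - slope * mx

# Amplification efficiency from the slope (a slope of about -3.32 means 100%).
efficiency = 10 ** (-1 / slope) - 1

def copies_from_ct(ct):
    """Interpolate an unknown sample's copy number from its Ct value."""
    return 10 ** ((ct - intercept) / slope)

print(round(slope, 2), round(efficiency, 2))
```

A sample with a lower Ct crossed the fluorescence threshold earlier and therefore maps to a higher starting copy number, which is what gives the assay its quantitative readout.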

  10. Quantitative Method for Simultaneous Analysis of Acetaminophen and 6 Metabolites.

    Science.gov (United States)

    Lammers, Laureen A; Achterbergh, Roos; Pistorius, Marcel C M; Romijn, Johannes A; Mathôt, Ron A A

    2017-04-01

    Hepatotoxicity after ingestion of high-dose acetaminophen [N-acetyl-para-aminophenol (APAP)] is caused by the metabolites of the drug. To gain more insight into factors influencing susceptibility to APAP hepatotoxicity, quantification of APAP and its metabolites is important. A few methods have been developed to simultaneously quantify APAP and its most important metabolites; however, these methods require comprehensive sample preparation and long run times. The aim of this study was to develop and validate a simplified but sensitive method for the simultaneous quantification of acetaminophen, the main metabolites acetaminophen glucuronide and acetaminophen sulfate, and 4 cytochrome P450-mediated metabolites, using liquid chromatography with mass spectrometric (LC-MS) detection. The method was developed and validated in human plasma, and it entailed a single sample-preparation procedure, enabling quick processing of the samples, followed by an LC-MS method with a chromatographic run time of 9 minutes. The method was validated for selectivity, linearity, accuracy, imprecision, dilution integrity, recovery, process efficiency, ionization efficiency, and carryover effect. The method showed good selectivity without matrix interferences. For all analytes, the mean process efficiency was >86%, and the mean ionization efficiency was >94%. Furthermore, the accuracy was between 90.3% and 112% for all analytes, and the within- and between-run imprecision were within acceptance limits. The method presented here enables the simultaneous quantification of APAP and 6 of its metabolites. It is less time consuming than previously reported methods because it requires only a single and simple sample-preparation procedure followed by an LC-MS method with a short run time. This analytical method is therefore useful for both clinical and research purposes.

  11. Increasing Literacy in Quantitative Methods: The Key to the Future of Canadian Psychology.

    Science.gov (United States)

    Counsell, Alyssa; Cribbie, Robert A; Harlow, Lisa L

    2016-08-01

    Quantitative methods (QM) dominate empirical research in psychology. Unfortunately most researchers in psychology receive inadequate training in QM. This creates a challenge for researchers who require advanced statistical methods to appropriately analyze their data. Many of the recent concerns about research quality, replicability, and reporting practices are directly tied to the problematic use of QM. As such, improving quantitative literacy in psychology is an important step towards eliminating these concerns. The current paper will include two main sections that discuss quantitative challenges and opportunities. The first section discusses training and resources for students and presents descriptive results on the number of quantitative courses required and available to graduate students in Canadian psychology departments. In the second section, we discuss ways of improving quantitative literacy for faculty, researchers, and clinicians. This includes a strong focus on the importance of collaboration. The paper concludes with practical recommendations for improving quantitative skills and literacy for students and researchers in Canada.

  12. Increasing Literacy in Quantitative Methods: The Key to the Future of Canadian Psychology

    Science.gov (United States)

    Counsell, Alyssa; Cribbie, Robert A.; Harlow, Lisa. L.

    2016-01-01

    Quantitative methods (QM) dominate empirical research in psychology. Unfortunately most researchers in psychology receive inadequate training in QM. This creates a challenge for researchers who require advanced statistical methods to appropriately analyze their data. Many of the recent concerns about research quality, replicability, and reporting practices are directly tied to the problematic use of QM. As such, improving quantitative literacy in psychology is an important step towards eliminating these concerns. The current paper will include two main sections that discuss quantitative challenges and opportunities. The first section discusses training and resources for students and presents descriptive results on the number of quantitative courses required and available to graduate students in Canadian psychology departments. In the second section, we discuss ways of improving quantitative literacy for faculty, researchers, and clinicians. This includes a strong focus on the importance of collaboration. The paper concludes with practical recommendations for improving quantitative skills and literacy for students and researchers in Canada. PMID:28042199

  13. Disordered Speech Assessment Using Automatic Methods Based on Quantitative Measures

    Directory of Open Access Journals (Sweden)

    Christine Sapienza

    2005-06-01

    Speech quality assessment methods are necessary for evaluating and documenting treatment outcomes of patients suffering from degraded speech due to Parkinson's disease, stroke, or other disease processes. Subjective methods of speech quality assessment are more accurate and more robust than objective methods but are time-consuming and costly. We propose a novel objective measure of speech quality assessment that builds on traditional speech processing techniques such as dynamic time warping (DTW) and the Itakura-Saito (IS) distortion measure. Initial results show that our objective measure correlates well with the more expensive subjective methods.

  14. Geometric correction of APEX hyperspectral data

    Directory of Open Access Journals (Sweden)

    Vreys Kristin

    2016-03-01

    Hyperspectral imagery originating from airborne sensors is nowadays widely used for the detailed characterization of land surface. The correct mapping of pixel positions to ground locations largely contributes to the success of the applications. Accurate geometric correction, also referred to as "orthorectification", is thus an important prerequisite which must be performed prior to using airborne imagery for evaluations like change detection, or mapping or overlaying the imagery with existing data sets or maps. A so-called "ortho-image" provides an accurate representation of the earth's surface, having been adjusted for lens distortions, camera tilt and topographic relief. In this paper, we describe the different steps in the geometric correction process of APEX hyperspectral data, as applied in the Central Data Processing Center (CDPC) at the Flemish Institute for Technological Research (VITO, Mol, Belgium). APEX ortho-images are generated through direct georeferencing of the raw images, thereby making use of sensor interior and exterior orientation data, boresight calibration data and elevation data. They can be referenced to any user-specified output projection system and can be resampled to any output pixel size.

  15. APEX - the Hyperspectral ESA Airborne Prism Experiment

    Directory of Open Access Journals (Sweden)

    Koen Meuleman

    2008-10-01

    The airborne ESA-APEX (Airborne Prism Experiment) hyperspectral mission simulator is described with its distinct specifications for providing high-quality remote sensing data. The concepts of automatic calibration, performed in the Calibration Home Base (CHB) by using the Control Test Master (CTM), the In-Flight Calibration facility (IFC), quality flagging (QF), specific processing in a dedicated Processing and Archiving Facility (PAF), and vicarious calibration experiments are presented. A preview of major applications and the corresponding development efforts to provide scientific data products up to level 2/3 to the user is presented for limnology, vegetation, aerosols, general classification routines and rapid mapping tasks. BRDF (Bidirectional Reflectance Distribution Function) issues are discussed and the spectral database SPECCHIO (Spectral Input/Output) is introduced. The optical performance as well as the dedicated software utilities make APEX a state-of-the-art hyperspectral sensor, capable of (a) satisfying the needs of several research communities and (b) helping to further the understanding of the Earth's complex mechanisms.

  16. Intraosseous Schwannoma of the Petrous Apex.

    Science.gov (United States)

    Tamura, Ryota; Takahashi, Satoshi; Kohno, Maya; Kameyama, Kaori; Fujiwara, Hirokazu; Yoshida, Kazunari

    2015-07-01

    Background and Importance: Intraosseous schwannoma is a relatively rare clinical entity that typically arises in vertebral and mandibular bone. Intraosseous schwannoma located entirely within the petrous bone is exceedingly rare, and only two cases have been reported to date. Clinical Presentation: A 47-year-old Asian man was referred to our hospital with a chief complaint of double vision. Neurologic examination revealed left abducens nerve palsy. Radiologic imaging showed a 35-mm osteolytic expansive lesion located in the left petrous apex. We made a preoperative diagnosis of chondrosarcoma and performed surgical resection via a left subtemporal epidural approach with anterior petrosectomy. The histopathologic diagnosis of the tumor was schwannoma. A schwannoma arising from the cranial nerves was excluded on the basis of intraoperative findings, and intraosseous schwannoma was diagnosed. The postoperative course was uneventful, and the abducens nerve palsy resolved immediately after surgery. Conclusion: The differential diagnosis of intraosseous schwannoma should be considered for an osteolytic mass lesion within the petrous apex. Subcapsular tumor removal was considered ideal in terms of preservation of the cranial nerves and vessels around the tumor.

  17. ASTRO APEx(®) and RO-ILS™ are applicable to medical malpractice in radiation oncology.

    Science.gov (United States)

    Zaorsky, Nicholas G; Ricco, Anthony G; Churilla, Thomas M; Horwitz, Eric M; Den, Robert B

    2016-11-01

    To analyze malpractice trials in radiation oncology and assess how ASTRO APEx(®) and RO-ILS™ apply to such cases. The Westlaw database was reviewed using PICOS/PRISMA methods. Fisher's exact and Mann-Whitney U tests were used to identify factors associated with outcomes. Of the 34 cases identified, external beam was used in 26 (77%). The most common factors behind malpractice claims were excessive toxicity (80%) and lack of informed consent (66%). The ASTRO APEx pillars and RO-ILS were applicable to all but one case. Factors favoring the defendant included the statute of limitations (odds ratio: 8.1; 95% CI: 1.3-50); those favoring the plaintiff included patient death (odds ratio: 0.7; 95% CI: 0.54-0.94). APEx and RO-ILS are applicable to malpractice trials in radiation oncology.

  18. Cholesterol granuloma of the petrous apex: CT diagnosis

    Energy Technology Data Exchange (ETDEWEB)

    Lo, W.W.M.; Solti-Bohman, L.G.; Brackmann, D.E.; Gruskin, P.

    1984-12-01

    Cholesterol granuloma of the petrous apex is a readily recognizable and treatable entity that is more common than previously realized. Cholesterol granuloma grows slowly in the petrous apex as a mass lesion until it produces hearing loss, tinnitus, vertigo, and facial twitching. Twelve cases of cholesterol granuloma of the petrous apex are illustrated; ten of these are analyzed in detail, especially with respect to CT findings. A sharply and smoothly marginated expansile lesion in the petrous apex that is isodense on plain CT and nonenhancing is in all probability a cholesterol granuloma. Preoperative recognition by CT is important for planning proper treatment.

  19. Semi-quantitative method to estimate levels of Campylobacter

    Science.gov (United States)

    Introduction: Research projects utilizing live animals and/or systems often require reliable, accurate quantification of Campylobacter following treatments. Even with marker strains, conventional methods designed to quantify are labor and material intensive requiring either serial dilutions or MPN ...

  20. Reconstruction-classification method for quantitative photoacoustic tomography

    CERN Document Server

    Malone, Emma; Cox, Ben T; Arridge, Simon R

    2015-01-01

    We propose a combined reconstruction-classification method for simultaneously recovering absorption and scattering in turbid media from images of absorbed optical energy. This method exploits knowledge that optical parameters are determined by a limited number of classes to iteratively improve their estimate. Numerical experiments show that the proposed approach allows for accurate recovery of absorption and scattering in 2 and 3 dimensions, and delivers superior image quality with respect to traditional reconstruction-only approaches.

  1. Integration of Qualitative and Quantitative Methods: Building and Interpreting Clusters from Grounded Theory and Discourse Analysis

    Directory of Open Access Journals (Sweden)

    Aldo Merlino

    2007-01-01

    Qualitative methods present a wide spectrum of application possibilities as well as opportunities for combining qualitative and quantitative methods. In the social sciences, fruitful theoretical discussions and a great deal of empirical research have taken place. This article introduces an empirical investigation which demonstrates the logic of combining methodologies as well as the collection and interpretation, both sequential and simultaneous, of qualitative and quantitative data. Specifically, the investigation process is described, beginning with a grounded theory methodology and its combination with the techniques of structural semiotics discourse analysis to generate, in a first phase, an instrument for quantitative measuring and to understand, in a second phase, clusters obtained by quantitative analysis. This work illustrates how qualitative methods allow for the comprehension of the discursive and behavioral elements under study, and how they function as support for making sense of and giving meaning to quantitative data. URN: urn:nbn:de:0114-fqs0701219

  2. Quantitative Diagnostic Method for Biceps Long Head Tendinitis by Using Ultrasound

    OpenAIRE

    Shih-Wei Huang; Wei-Te Wang

    2013-01-01

    Objective. To investigate the feasibility of grayscale quantitative diagnostic method for biceps tendinitis and determine the cut-off points of a quantitative biceps ultrasound (US) method to diagnose biceps tendinitis. Design. Prospective cross-sectional case controlled study. Setting. Outpatient rehabilitation service. Methods. A total of 336 shoulder pain patients with suspected biceps tendinitis were recruited in this prospective observational study. The grayscale pixel data of the range ...

  3. A REVIEW OF QUANTITATIVE METHODS FOR STUDIES OF MINERAL-CONTENT OF INTRAORAL INCIPIENT CARIES LESIONS

    NARCIS (Netherlands)

    TENBOSCH, JJ; ANGMARMANSSON, B

    1991-01-01

    Modern prospective caries studies require the measurement of small changes in tooth mineral content. Quantitative measurements of changes in mineral content in a single caries lesion are desirable. Quantitative methods can be either destructive or non-destructive. The latter type permits longitudinal

  5. University Students' Research Orientations: Do Negative Attitudes Exist toward Quantitative Methods?

    Science.gov (United States)

    Murtonen, Mari

    2005-01-01

    This paper examines university social science and education students' views of research methodology, especially asking whether a negative research orientation towards quantitative methods exists. Finnish (n = 196) and US (n = 122) students answered a questionnaire concerning their views on quantitative, qualitative, empirical, and theoretical…

  6. Quantitative Measurement Method for Possible Rib Fractures in Chest Radiographs

    Science.gov (United States)

    Kim, Jaeil; Kim, Sungjun; Kim, Young Jae

    2013-01-01

    Objectives: This paper proposes a measurement method to quantify the abnormal characteristics of the broken parts of ribs using local texture and shape features in chest radiographs. Methods: Our measurement method comprises two steps: a measurement-area assignment and sampling step using a spline curve and sampling lines orthogonal to the spline curve, and a fracture-ness measurement step with three measures: asymmetry and two gray-level co-occurrence matrix (GLCM) based measures (contrast and homogeneity). They were designed to quantify the regional shape and texture features of ribs along the centerline. The discriminating ability of our method was evaluated through region of interest (ROI) analysis and a rib fracture classification test using a support vector machine. Results: Statistically significant differences were found between the measured values from fracture and normal ROIs for asymmetry and the GLCM-based measures. The rib fracture classifier, trained with the measured values in the ROI analysis, detected every rib fracture from the chest radiographs used for ROI analysis, but it also classified some unbroken parts of ribs as abnormal (8 to 17 line sets; length of each line set, 2.998 ± 2.652 mm; length of centerlines, 131.067 ± 29.460 mm). Conclusions: Our measurement method, which includes a flexible measurement technique for the curved shape of ribs and the proposed shape and texture measures, could discriminate suspicious regions of ribs for possible rib fractures in chest radiographs. PMID:24175118
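    The GLCM-based measures this abstract names, contrast and homogeneity, are standard texture statistics and easy to sketch. The following is an illustrative pure-Python implementation for a one-pixel horizontal offset, not the authors' code; the tiny test image is invented.

```python
# Sketch (illustrative assumptions, not the paper's code): a gray-level
# co-occurrence matrix (GLCM) for pixel pairs one step to the right, plus
# the two texture measures named in the abstract: contrast and homogeneity.

def glcm(img, levels):
    """Normalized co-occurrence frequencies of (left, right) gray-level pairs."""
    counts = [[0.0] * levels for _ in range(levels)]
    total = 0
    for row in img:
        for a, b in zip(row, row[1:]):
            counts[a][b] += 1
            total += 1
    return [[c / total for c in row] for row in counts]

def contrast(p):
    # Weighted by squared gray-level difference: high for abrupt transitions.
    return sum(p[i][j] * (i - j) ** 2 for i in range(len(p)) for j in range(len(p)))

def homogeneity(p):
    # Inverse-difference weighting: high when co-occurring levels are similar.
    return sum(p[i][j] / (1 + abs(i - j)) for i in range(len(p)) for j in range(len(p)))

# invented 4x4 image with 4 gray levels
img = [[0, 0, 1, 1],
       [0, 0, 1, 1],
       [0, 2, 2, 2],
       [2, 2, 3, 3]]
p = glcm(img, levels=4)
print(round(contrast(p), 3), round(homogeneity(p), 3))
```

A fracture would show up as a local rise in contrast (and drop in homogeneity) along the rib centerline, which is what the abstract's classifier exploits.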

  7. Advanced quantitative magnetic nondestructive evaluation methods - Theory and experiment

    Science.gov (United States)

    Barton, J. R.; Kusenberger, F. N.; Beissner, R. E.; Matzkanin, G. A.

    1979-01-01

    The paper reviews the scale of fatigue crack phenomena in relation to the size detection capabilities of nondestructive evaluation methods. An assessment of several features of fatigue in relation to the inspection of ball and roller bearings suggested the use of magnetic methods; magnetic domain phenomena, including the interaction of domains and inclusions and the influence of stress and magnetic field on domains, are discussed. Experimental results indicate that simplified calculations can be used to predict many features of these results; however, the data and the predictions of analytic models based on finite element computer analysis do not agree with respect to certain features. Experimental analyses of rod-type fatigue specimens, which relate magnetic measurements to crack opening displacement, crack volume and crack depth, should provide methods for improved crack characterization in relation to fracture mechanics and life prediction.

  8. A method for quantitative measurement of lumbar intervertebral disc structures

    DEFF Research Database (Denmark)

    Tunset, Andreas; Kjær, Per; Samir Chreiteh, Shadi

    2013-01-01

    There is a shortage of agreement studies relevant for measuring changes over time in lumbar intervertebral disc structures. The objectives of this study were: 1) to develop a method for measurement of intervertebral disc height, anterior and posterior disc material and dural sac diameter using MRI...

  9. Development of a method for quantitative measures of lumbar intervertebral disc structures

    DEFF Research Database (Denmark)

    Tunset, Andreas; Kjær, Per; Chreiteh, Shadi Samir

    2013-01-01

    applicable description of quantitative methods for measuring lumbar disc herniations and related structures on sagittal MRIs. The objectives of this study were: 1) to develop methods for quantitative measures of intervertebral discs, lumbar disc herniations and dural sac/spinal canal using MRIs, 2) to evaluate the agreement of these methods, and 3) to identify factors in the measurement procedures that may compromise agreement. Methods: In this intra- and inter-rater agreement study, lumbar quantitative measurements were performed on magnetic resonance images from 32 participants from a study cohort representative of the Danish general population. A new method for quantitative measures of intervertebral discs and related structures was developed and systematically described. MRI images were measured twice by one rater for intra-rater agreement and once by a second rater for inter-rater agreement. Length...

  11. The Application of Quantitative Methods to the Landscape Architecture Science of China

    Institute of Scientific and Technical Information of China (English)

    QIAO Lifang; CAO Wei; ZHANG Shaowei

    2008-01-01

    Influenced by traditional methods in landscape architecture, studies in the landscape architecture science of China are fundamentally subjective and qualitative, and focus on aesthetic elements. Quantitative methods are commonly used in the physical sciences, and there is great potential for their application to landscape architecture science. The paper summarizes the progress in the application of quantitative methods to landscape architecture science in China, covering the evaluation of eco-effects, sightseeing effects, results of disposing, economic benefits, engineering quality, greenbelt landscapes, landscape patterns and usage of greenbelts. It shows that quantitative methods have aroused general concern and have been well applied in certain fields, although shortcomings remain in the research to be improved. Finally, important directions for Chinese landscape architecture science are proposed.

  12. Reassessing the trophic role of reef sharks as apex predators on coral reefs

    Science.gov (United States)

    Frisch, Ashley J.; Ireland, Matthew; Rizzari, Justin R.; Lönnstedt, Oona M.; Magnenat, Katalin A.; Mirbach, Christopher E.; Hobbs, Jean-Paul A.

    2016-06-01

    Apex predators often have strong top-down effects on ecosystem components and are therefore a priority for conservation and management. Due to their large size and conspicuous predatory behaviour, reef sharks are typically assumed to be apex predators, but their functional role is yet to be confirmed. In this study, we used stomach contents and stable isotopes to estimate diet, trophic position and carbon sources for three common species of reef shark (Triaenodon obesus, Carcharhinus melanopterus and C. amblyrhynchos) from the Great Barrier Reef (Australia) and evaluated their assumed functional role as apex predators by qualitative and quantitative comparisons with other sharks and large predatory fishes. We found that reef sharks do not occupy the apex of coral reef food chains, but instead have functional roles similar to those of large predatory fishes such as snappers, emperors and groupers, which are typically regarded as high-level mesopredators. We hypothesise that a degree of functional redundancy exists within this guild of predators, potentially explaining why shark-induced trophic cascades are rare or subtle in coral reef ecosystems. We also found that reef sharks participate in multiple food webs (pelagic and benthic) and are sustained by multiple sources of primary production. We conclude that large conspicuous predators, be they elasmobranchs or any other taxon, should not axiomatically be regarded as apex predators without thorough analysis of their diet. In the case of reef sharks, our dietary analyses suggest they should be reassigned to an alternative trophic group such as high-level mesopredators. This change will facilitate improved understanding of how reef communities function and how removal of predators (e.g., via fishing) might affect ecosystem properties.
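    Stable-isotope estimates of trophic position of the kind this abstract describes are conventionally computed with the nitrogen-isotope formula of Post (2002): TP = TP_base + (δ15N_consumer − δ15N_base) / Δ15N. A minimal sketch follows; the isotope values are invented and the 3.4‰ per-trophic-level enrichment is the commonly assumed default, not a value from this paper.

```python
# Sketch of the standard nitrogen-isotope trophic position estimate
# (Post 2002). All numeric inputs below are hypothetical.

def trophic_position(d15n_consumer, d15n_base, tp_base=2.0, enrichment=3.4):
    """TP = TP_base + (d15N_consumer - d15N_base) / per-level enrichment.

    tp_base is the trophic position of the baseline organism (2.0 for a
    primary consumer); enrichment is the assumed d15N step per level (per mil).
    """
    return tp_base + (d15n_consumer - d15n_base) / enrichment

# e.g. a consumer at 13.5 per mil against a primary-consumer baseline at 6.7
print(round(trophic_position(d15n_consumer=13.5, d15n_base=6.7), 2))
```

A shark scoring around 4.0 on this scale would sit with high-level mesopredators rather than at the apex, which is the kind of comparison the study draws.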

  13. Statistical Methods for Quantitatively Detecting Fungal Disease from Fruits’ Images

    OpenAIRE

    Jagadeesh D. Pujari; Yakkundimath, Rajesh Siddaramayya; Byadgi, Abdulmunaf Syedhusain

    2013-01-01

    In this paper we propose statistical methods for detecting fungal disease and classifying it based on disease severity levels. Most fruit diseases are caused by bacteria, fungi, viruses, etc., of which fungi are responsible for a large number of diseases in fruits. In this study, images of fruits affected by different fungal symptoms were collected and categorized based on disease severity. Statistical features like block-wise, gray level co-occurrence matrix (GLCM), gray level runlength matr...

  14. A quantitative sampling method for Oncomelania quadrasi by filter paper.

    Science.gov (United States)

    Tanaka, H; Santos, M J; Matsuda, H; Yasuraoka, K; Santos, A T

    1975-08-01

    Filter paper was found to attract Oncomelania quadrasi in waters in the same way as fallen dried banana leaves, although fewer snails of other species were collected on the former than on the latter. Snails were collected in limited areas using a tube sampler (85 cm² cross-sectional area) and a filter paper (20 × 20 cm) sampler. The sheet of filter paper was placed close to the spot where a tube sample was taken, and recovered after 24 hours. At each sampling, 30 samples were taken by each method in an area, and sampling was performed four times. The correlation between the number of snails collected by the tube and that by filter paper was studied. The ratio of the snail counts by the tube sampler to those by the filter paper was 1.18. A loose correlation was observed between the snail counts of both methods, as shown by the correlation coefficient r = 0.6502. The formulas for the regression lines were Y = 0.77X + 1.6 and X = 0.55Y + 1.35 for the three experiments, where Y is the number of snails collected by tube sampling and X is the number of snails collected on the sheet of filter paper. The type of snail distribution was studied in the 30 samples taken by each method and was observed to be nearly the same for both sampling methods. All sampling data were found to fit the negative binomial distribution, with the value of the constant k in (q − p)^−k varying widely from 0.5775 to 5.9186. In each experiment, the constant k was always larger in tube sampling than in filter paper sampling. This indicates that the uneven distribution of snails on the soil surface becomes more conspicuous with filter paper sampling.
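    The two regression lines quoted in the abstract can be used directly to convert one sampler's count into an estimate of the other's. A minimal sketch using only the coefficients reported above; the example input is arbitrary.

```python
# Converting between the two samplers' snail counts using the regression
# lines reported in the abstract (Y = tube count, X = filter-paper count).
# The coefficients are taken from the text; the inputs are illustrative.

def tube_count_from_filter_paper(x):
    """Predict the tube-sampler count Y from a filter-paper count X: Y = 0.77X + 1.6."""
    return 0.77 * x + 1.6

def filter_paper_count_from_tube(y):
    """Predict the filter-paper count X from a tube-sampler count Y: X = 0.55Y + 1.35."""
    return 0.55 * y + 1.35

# e.g. 10 snails on a filter-paper sheet, or 10 snails in a tube sample
print(tube_count_from_filter_paper(10))
print(filter_paper_count_from_tube(10))
```

Note that the two lines are separate least-squares fits (Y on X, and X on Y), so they are not exact inverses of each other; that asymmetry is expected with a loose correlation like r = 0.6502.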

  15. Facile colorimetric methods for the quantitative determination of tetramisole hydrochloride

    Science.gov (United States)

    Amin, A. S.; Dessouki, H. A.

    2002-10-01

    Facile, rapid and sensitive methods for the determination of tetramisole hydrochloride in pure form and in dosage forms are described. The procedures are based on the formation of coloured products with the chromogenic reagents alizarin blue BB (I), alizarin red S (II), alizarin violet 3R (III) and alizarin yellow G (IV). The coloured products showed absorption maxima at 605, 468, 631 and 388 nm for I-IV, respectively. The colours obtained were stable for 24 h. The colour systems obeyed Beer's law in the concentration ranges 1.0-36, 0.8-32, 1.2-42 and 0.8-30 μg ml-1, respectively. The results obtained showed good recoveries, with relative standard deviations of 1.27, 0.96, 1.13 and 1.35%, respectively. The detection and determination limits were found to be 1.0 and 3.8, 1.2 and 4.2, 1.0 and 3.9, and 1.4 and 4.8 ng ml-1 for the I-IV complexes, respectively. Applications of the method to representative pharmaceutical formulations are presented, and the validity was assessed by applying the standard addition technique, giving results comparable with those obtained using the official method.
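    Determinations of this kind rest on Beer's law holding over the stated concentration ranges: absorbance is linear in concentration, so a least-squares calibration line fitted to standards can be inverted to read unknowns. An illustrative sketch follows; the standards and absorbance values are invented, not the paper's data.

```python
# Illustrative Beer's-law calibration sketch (not from the paper):
# fit A = m*c + b to standards by least squares, then invert to read the
# concentration of an unknown from its measured absorbance.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    m = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return m, my - m * mx

def concentration(absorbance, m, b):
    """Invert the calibration line: c = (A - b) / m."""
    return (absorbance - b) / m

# hypothetical standards within a linear (Beer's-law) range, in ug/ml
concs = [5.0, 10.0, 20.0, 30.0]
absorb = [0.11, 0.21, 0.41, 0.61]   # invented absorbance readings
m, b = fit_line(concs, absorb)
print(round(concentration(0.31, m, b), 1))  # unknown with A = 0.31
```

The standard addition technique mentioned in the abstract is a guard against matrix effects: known spikes are added to the sample itself and the same linear extrapolation is applied.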

  16. Towards Accurate Application Characterization for Exascale (APEX)

    Energy Technology Data Exchange (ETDEWEB)

    Hammond, Simon David [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    Sandia National Laboratories has been engaged in hardware and software codesign activities for a number of years; indeed, it might be argued that prototyping of clusters as far back as the CPLANT machines, and many large capability resources including ASCI Red and RedStorm, were examples of codesigned solutions. As the research supporting our codesign activities has moved closer to investigating on-node runtime behavior, a natural hunger has grown for detailed analysis of both hardware and algorithm performance from the perspective of low-level operations. The Application Characterization for Exascale (APEX) LDRD was a project conceived to address some of these concerns. Primarily, the research was intended to focus on generating accurate and reproducible low-level performance metrics using tools that could scale to production-class code bases. Alongside this research was an advocacy and analysis role associated with evaluating tools for production use, working with leading industry vendors to develop and refine solutions required by our code teams, and directly engaging with production code developers to form a context for the application analysis and a bridge to the research community within Sandia. On each of these accounts significant progress has been made, particularly, as this report covers, in the low-level analysis of operations for important classes of algorithms. This report summarizes the development of a collection of tools under the APEX research program and leaves to other SAND and L2 milestone reports the description of codesign progress with Sandia's production users/developers.

  17. Comparison of Two Methods: Qualitative and Quantitative Study of C - Reactive Protein

    Directory of Open Access Journals (Sweden)

    Kiaei, MR. (BSc

    2014-06-01

    Background and Objective: C-reactive protein (CRP) is an acute phase protein produced in the liver. It is present at less than 5 mg per deciliter in the serum and body fluids of normal individuals, but it increases suddenly within a few hours of an inflammatory reaction. It is also increased in bacterial and viral infections, active rheumatic fever, acute myocardial infarction and rheumatoid arthritis. The aim of this study was to investigate CRP levels by qualitative and quantitative methods. Material and Methods: The CRP of 200 patients was investigated by quantitative and qualitative methods. Qualitative CRP testing was conducted three times by different people, using two kits (Bionic and Omega), and the mean of the results was reported. For quantitative CRP testing, immunoturbidimetry was used. Results: In the qualitative CRP test with the Bionic kit, 180 (90%) were negative, 6 (3%) weakly positive, 9 (4.5%) +1 and 5 (2.5%) +2. In the qualitative CRP test with the Omega kit, 148 (74%) were negative, 32 (16%) weakly positive, 13 (6.5%) +1, 4 (2%) +2 and 3 (1.5%) +3. A high percentage of the qualitative results that were weakly positive became negative by quantitative methods, while qualitative results of +1 and higher were confirmed positive by quantitative methods. Conclusion: It seems that in the early stages of inflammatory disease, quantitative methods are preferred to qualitative methods. Also, when CRP test results are weakly positive by qualitative methods, they should be controlled by quantitative methods as well. Keywords: CRP; Quantitative CRP Test; Qualitative CRP Test

  18. A New Method for Quantitative Simulating Hydrocarbon Expulsion and Its Application

    Institute of Scientific and Technical Information of China (English)

    Mingcheng Li

    1994-01-01

    Quantitative Simulating of Oil Expulsion. 1. Traditional method: The oil expulsion (QE) can be calculated by subtracting the oil residuum (QR) from the oil generation (QG), and the expulsion efficiency (fE) is then obtained by dividing QE by QG, as in the following equations: QE = QG − QR and fE = QE / QG.
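    The two relations stated verbally in the abstract amount to simple arithmetic, sketched below with hypothetical input quantities.

```python
# Minimal numeric sketch of the two relations in the abstract:
# expelled oil QE = QG - QR, and expulsion efficiency fE = QE / QG.
# The input quantities below are hypothetical.

def expulsion(qg, qr):
    """Return (QE, fE) for oil generation qg and oil residuum qr."""
    qe = qg - qr   # oil expelled = generated minus retained (residuum)
    fe = qe / qg   # expulsion efficiency, a fraction of generated oil
    return qe, fe

qe, fe = expulsion(qg=100.0, qr=40.0)
print(qe, fe)  # 60.0 0.6
```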

  19. Integrating Quantitative and Qualitative Results in Health Science Mixed Methods Research Through Joint Displays

    National Research Council Canada - National Science Library

    Guetterman, Timothy C; Fetters, Michael D; Creswell, John W

    2015-01-01

    Mixed methods research is becoming an important methodology to investigate complex health-related topics, yet the meaningful integration of qualitative and quantitative data remains elusive and needs further development...

  20. APEX; current status of the airborne dispersive pushbroom imaging spectrometer

    NARCIS (Netherlands)

    Nieke, J.; Itten, K.I.; Kaiser, J.W.; Schlapfer, D.; Brazile, J.; Debruyn, W.; Meuleman, K.; Kempeneers, P.; Neukom, A.; Feusi, H.; Adolph, P.; Moser, R.; Schilliger, T.; Kohler, P.; Meng, M.; Piesbergen, J.; Strobl, P.; Schaepman, M.E.; Gavira, J.; Ulbrich, G.J.; Meynart, R.

    2004-01-01

    Recently, a joint Swiss/Belgian initiative started a project to build a new generation airborne imaging spectrometer, namely APEX (Airborne Prism Experiment) under the ESA funding scheme named PRODEX. APEX is a dispersive pushbroom imaging spectrometer operating in the spectral range between 380 - 2

  1. Petrous apex arachnoid cyst extending into Meckel's cave.

    Science.gov (United States)

    Batra, Arun; Tripathi, Rajendra Prasad; Singh, Anil Kumar; Tatke, Medha

    2002-09-01

    A rare case of arachnoid cyst involving the petrous apex with an unusual clinical presentation is described, with special emphasis on the imaging features and the importance of accurate presurgical diagnosis. Differentiation from other benign lesions involving the petrous apex and the role of newer MR techniques in the diagnosis of these lesions are highlighted.

  2. Petrous apex cephalocele er en sjælden lidelse

    DEFF Research Database (Denmark)

    Ingolfsdottir, Harpa Maria; Martens, Pernille Christina; Kaltoft, Nicolai;

    2011-01-01

    Petrous apex cephalocele (PAC) is a rare lesion, which represents a herniation from the posterolateral portion of Meckel's cave into the petrous apex. The pathologic explanation is still unknown. We report a patient with clinical symptoms of facial pain, hearing loss and cerebrospinal fluid

  3. 21 CFR 870.2840 - Apex cardiographic transducer.

    Science.gov (United States)

    2010-04-01

    (a) Identification. An apex cardiographic transducer is a device used to detect motion...

  4. ATTENUATION OF DIFFRACTED MULTIPLES WITH AN APEX-SHIFTED TANGENT-SQUARED RADON TRANSFORM IN IMAGE SPACE

    Directory of Open Access Journals (Sweden)

    Alvarez Gabriel

    2006-12-01

    In this paper, we propose a method to attenuate diffracted multiples with an apex-shifted tangent-squared Radon transform in angle-domain common image gathers (ADCIGs). Usually, where diffracted multiples are a problem, the wave field propagation is complex and the moveout of primaries and multiples in data space is irregular. The method handles the complexity of the wave field propagation by wave-equation migration, provided that migration velocities are reasonably accurate. As a result, the moveout of the multiples is well behaved in the ADCIGs. For 2D data, the apex-shifted tangent-squared Radon transform maps the 2D image space into a 3D model cube whose dimensions are depth, curvature and apex-shift distance.
    Well-corrected primaries map to or near the zero curvature plane and specularly-reflected multiples map to or near the zero apex-shift plane. Diffracted multiples map elsewhere in the cube according to their curvature and apex-shift distance. Thus, specularly reflected as well as diffracted multiples can be attenuated simultaneously. This approach is illustrated with a segment of a 2D seismic line over a large salt body in the Gulf of Mexico. It is shown that ignoring the apex shift compromises the attenuation of the diffracted multiples, whereas the approach proposed attenuates both the specularly-reflected and the diffracted multiples without compromising the primaries.
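    The mapping described above can be sketched as an adjoint summation along curves z(γ) = z0 + q·tan²(γ - γ0) in each gather. This is a minimal illustration of the transform geometry, not the authors' wave-equation implementation; all function and variable names are hypothetical.

```python
import numpy as np

def apex_shifted_radon(gather, z, gammas, qs, apexes):
    """Adjoint apex-shifted tangent-squared Radon transform (sketch).

    gather : (nz, ngamma) angle-domain common image gather (ADCIG)
    Sums the gather along curves z(gamma) = z0 + q * tan^2(gamma - gamma0),
    producing a (depth, curvature, apex-shift) model cube.
    """
    nz = len(z)
    dz = z[1] - z[0]
    cube = np.zeros((nz, len(qs), len(apexes)))
    for iq, q in enumerate(qs):
        for ia, g0 in enumerate(apexes):
            tan2 = np.tan(gammas - g0) ** 2
            for ig, t2 in enumerate(tan2):
                # depth sample that each output z0 reads from, at this angle
                idx = np.round((z + q * t2 - z[0]) / dz).astype(int)
                ok = (idx >= 0) & (idx < nz)
                cube[ok, iq, ia] += gather[idx[ok], ig]
    return cube
```

    An event with curvature q and apex shift γ0 stacks coherently at that (q, γ0) cell; primaries concentrate near q = 0 and specular multiples near γ0 = 0, which is what makes the two classes separable in the cube.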

  5. Quantitative evaluation of solar wind time-shifting methods

    Science.gov (United States)

    Cameron, Taylor; Jackel, Brian

    2016-11-01

    Nine years of solar wind dynamic pressure and geosynchronous magnetic field data are used for a large-scale statistical comparison of uncertainties associated with several different algorithms for propagating solar wind measurements. The MVAB-0 scheme is best overall, performing on average a minute more accurately than a flat time-shift. We also evaluate the accuracy of these time-shifting methods as a function of solar wind magnetic field orientation. We find that all time-shifting algorithms perform significantly worse (>5 min) due to geometric effects when the solar wind magnetic field is radial (parallel or antiparallel to the Earth-Sun line). Finally, we present an empirical scheme that performs almost as well as MVAB-0 on average and slightly better than MVAB-0 for intervals with nonradial B.
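    The flat time-shift that serves as the baseline in this comparison (MVAB-0 instead propagates along a minimum-variance phase-front normal) amounts to ballistic convection at the observed solar wind speed. A minimal sketch, with hypothetical names and an assumed monitor-at-L1 geometry:

```python
def flat_time_shift(t_obs_s, x_monitor_km, x_target_km, vx_km_s):
    # Ballistic ("flat") propagation: assume the solar wind parcel convects
    # unchanged at its observed anti-sunward speed from monitor to target.
    return t_obs_s + (x_monitor_km - x_target_km) / vx_km_s
```

    For a monitor ~1.5 million km sunward of the target moving at 400 km/s, the delay is 3750 s, roughly an hour, which is why even minute-level accuracy differences between algorithms matter for forecasting.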

  6. The Application of the Semi-quantitative Risk Assessment Method to Urban Natural Gas Pipelines

    OpenAIRE

    Bai, Yongqiang; LiangHai LV; Wang, Tong

    2013-01-01

    This paper provides a method of semi-quantitative risk assessment for urban gas pipelines, by modifying Kent analysis method. The influence factors of fault frequency and consequence for urban gas pipelines are analyzed, and the grade rules are studied on. The grade rules of fault frequency and consequence for urban natural gas pipelines are provided. Using semi-quantitative risk matrix, the risk grade of the urban gas pipelines is obtained, and the risk primary sort for gas pipel...

  7. Measuring in action research : four ways of integrating quantitative methods in participatory dynamics

    OpenAIRE

    Martí Olivé, Joel

    2015-01-01

    Background: INCASI Project H2020-MSCA-RISE-2015, GA 691004, WP1: Compilation. Although action research uses both qualitative and quantitative methods, few contributions have addressed the specific role of the latter in this kind of research. This paper focuses on how quantitative methods can be integrated with participatory dynamics in action research designs. Four types of integration are defined and exemplified. The paper concludes with some reflections on how the integration of quantita...

  8. [An evaluation of methods for the quantitative determination of praziquantel as a substance].

    Science.gov (United States)

    Lopatin, B V; Bebris, N K; Lopatina, N B

    1989-01-01

    The feasibility of quantitative determination of the new anthelmintic praziquantel as a substance by spectral and chemical analytical procedures has been investigated. Chemical methods based on nitrogen measurement in the samples were shown to lack the precision that is obligatory for drug analysis. Quantitative procedures based on UV spectrophotometry are of low precision and selectivity. Infrared spectroscopy is the only method of praziquantel assay that meets the requirements for drug substance measurement.

  9. Optimization of Quantitative PCR Methods for Enteropathogen Detection.

    Science.gov (United States)

    Liu, Jie; Gratz, Jean; Amour, Caroline; Nshama, Rosemary; Walongo, Thomas; Maro, Athanasia; Mduma, Esto; Platts-Mills, James; Boisen, Nadia; Nataro, James; Haverstick, Doris M; Kabir, Furqan; Lertsethtakarn, Paphavee; Silapong, Sasikorn; Jeamwattanalert, Pimmada; Bodhidatta, Ladaporn; Mason, Carl; Begum, Sharmin; Haque, Rashidul; Praharaj, Ira; Kang, Gagandeep; Houpt, Eric R

    2016-01-01

    Detection and quantification of enteropathogens in stool specimens is useful for diagnosing the cause of diarrhea but is technically challenging. Here we evaluate several important determinants of quantification: specimen collection, nucleic acid extraction, and extraction and amplification efficiency. First, we evaluate the molecular detection and quantification of pathogens in rectal swabs versus stool, using paired flocked rectal swabs and whole stool collected from 129 children hospitalized with diarrhea in Tanzania. Swabs generally yielded a higher quantification cycle (Cq) (average 29.7, standard deviation 3.5 vs. 25.3 ± 2.9 from stool, P<0.001) but were still able to detect 80% of pathogens with a Cq < 30 in stool. Second, a simplified total nucleic acid (TNA) extraction procedure was compared to separate DNA and RNA extractions and showed 92% (318/344) sensitivity and 98% (951/968) specificity, with no difference in Cq value for the positive results (ΔCq(DNA+RNA-TNA) = -0.01 ± 1.17, P = 0.972, N = 318). Third, we devised a quantification scheme that adjusts pathogen quantity to the specimen's extraction and amplification efficiency, and show that this better estimates the quantity of spiked specimens than the raw target Cq. In sum, these methods for enteropathogen quantification, stool sample collection, and nucleic acid extraction will be useful for laboratories studying enteric disease.
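    The third result, adjusting a pathogen's quantity for the specimen's extraction and amplification efficiency, can be illustrated with a spike-in control: the Cq delay the control suffers in a given specimen is credited back to the target. The exact adjustment scheme is not given in the abstract, so the functions below are an assumed sketch of the idea, with hypothetical names.

```python
def efficiency_adjusted_cq(target_cq, spike_cq_observed, spike_cq_expected):
    # Credit the target with the Cq delay that the spiked-in control
    # suffered during extraction/amplification of the same specimen.
    return target_cq - (spike_cq_observed - spike_cq_expected)

def relative_quantity(cq, reference_cq=30.0, efficiency=1.0):
    # Fold-quantity relative to reference_cq, assuming an amplification
    # factor of (1 + efficiency) per cycle (2.0 for a perfect reaction).
    return (1.0 + efficiency) ** (reference_cq - cq)
```

    For example, a target at Cq 28 in a specimen whose spike-in ran 2 cycles late (27 observed vs. 25 expected) is adjusted to an effective Cq of 26, i.e. a 4-fold higher estimated quantity.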

  10. New methods for quantitative and qualitative facial studies: an overview.

    Science.gov (United States)

    Thomas, I T; Hintz, R J; Frias, J L

    1989-01-01

    The clinical study of birth defects has traditionally followed the Gestalt approach, with a trend in recent years toward more objective delineation. Data collection, however, has been largely restricted to measurements from X-rays and anthropometry. In other fields, new techniques are being applied that capitalize on modern computer technology. One such technique is remote sensing, of which photogrammetry is a branch. Cartographers, surveyors and engineers, using specially designed cameras, have applied geometrical techniques to locate points on an object precisely. These techniques, in their long-range application, have become part of our industrial technology and have assumed great importance with the development of satellite-borne surveillance systems. The close-range application of similar techniques has the potential for extremely accurate clinical measurement. We are currently evaluating the application of remote sensing to facial measurement using three conventional 35 mm still cameras. The subject is photographed in front of a carefully measured grid, and digitization is then carried out on the 35-mm slides: specific landmarks on the craniofacial surface are identified, along with points on the background grid and the four corners of the slide frame, and are registered as x-y coordinates by a digitizer. These coordinates are then converted into precise locations in object space. The technique is capable of producing measurements to within 1/100th of an inch. We suggest that remote sensing methods such as this may well be of great value in the study of congenital malformations.
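    The grid-based conversion from digitizer coordinates to object space can be illustrated, for a single plane, as a projective mapping fitted from four known grid points. This is an illustrative sketch only, not the authors' three-camera reconstruction; the function names are hypothetical.

```python
import numpy as np

def fit_homography(src, dst):
    """Fit a plane projective map from 4 point pairs (x, y) -> (u, v)."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)  # h33 fixed to 1

def apply_homography(H, pt):
    """Map a digitized slide point into object-space coordinates."""
    x, y = pt
    w = H[2, 0] * x + H[2, 1] * y + H[2, 2]
    return ((H[0, 0] * x + H[0, 1] * y + H[0, 2]) / w,
            (H[1, 0] * x + H[1, 1] * y + H[1, 2]) / w)
```

    With the four grid corners as correspondences, any landmark digitized on the slide can then be mapped into grid (object) coordinates; combining several calibrated views is what yields full 3D locations.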

  11. A practical and sensitive method of quantitating lymphangiogenesis in vivo.

    Science.gov (United States)

    Majumder, Mousumi; Xin, Xiping; Lala, Peeyush K

    2013-07-01

    To address the inadequacy of current assays, we developed a directed in vivo lymphangiogenesis assay (DIVLA) by modifying an established directed in vivo angiogenesis assay. Silicon tubes (angioreactors) were implanted in the dorsal flanks of nude mice. Tubes contained either growth factor-reduced basement membrane extract (BME) alone (negative control) or BME containing vascular endothelial growth factor (VEGF)-D (positive control for lymphangiogenesis) or FGF-2/VEGF-A (positive control for angiogenesis) or a high VEGF-D-expressing breast cancer cell line MDA-MB-468LN (468LN), or VEGF-D-silenced 468LN. Lymphangiogenesis was detected superficially with Evans Blue dye tracing and measured in the cellular contents of angioreactors by multiple approaches: lymphatic vessel endothelial hyaluronan receptor-1 (Lyve1) protein (immunofluorescence) and mRNA (qPCR) expression and a visual scoring of lymphatic vs blood capillaries with dual Lyve1 (or Prox-1 or Podoplanin)/Cd31 immunostaining in cryosections. Lymphangiogenesis was absent with BME, high with VEGF-D or VEGF-D-producing 468LN cells and low with VEGF-D-silenced 468LN. Angiogenesis was absent with BME, high with FGF-2/VEGF-A, moderate with 468LN or VEGF-D and low with VEGF-D-silenced 468LN. The method was reproduced in a syngeneic murine C3L5 tumor model in C3H/HeJ mice with dual Lyve1/Cd31 immunostaining. Thus, DIVLA presents a practical and sensitive assay of lymphangiogenesis, validated with multiple approaches and markers. It is highly suited to identifying pro- and anti-lymphangiogenic agents, as well as shared or distinct mechanisms regulating lymphangiogenesis vs angiogenesis, and is widely applicable to research in vascular/tumor biology.

  12. Quantitative bioanalytical and analytical method development of dibenzazepine derivative, carbamazepine: A review

    OpenAIRE

    Prasanna A. Datar

    2015-01-01

    Bioanalytical methods are widely used for quantitative estimation of drugs and their metabolites in physiological matrices. These methods could be applied to studies in areas of human clinical pharmacology and toxicology. The major bioanalytical services are method development, method validation and sample analysis (method application). Various methods such as GC, LC–MS/MS, HPLC, HPTLC, micellar electrokinetic chromatography, and UFLC have been used in laboratories for the qualitative and qua...

  13. Phase analysis in duplex stainless steel: comparison of EBSD and quantitative metallography methods

    Science.gov (United States)

    Michalska, J.; Chmiela, B.

    2014-03-01

    The purpose of the research was to work out a qualitative and quantitative analysis of the phases in DSS in the as-received state and after thermal aging. For qualitative purposes, SEM observations, EDS analyses and electron backscattered diffraction (EBSD) methods were employed. Quantitative analysis of the phases was performed by two methods: EBSD and classical quantitative metallography. A juxtaposition of different etchants for revealing the microstructure and a brief review of sample preparation methods for EBSD studies are presented. Different ways of sample preparation were tested, and based on these results a detailed methodology of DSS phase analysis was developed, including surface finishing, selective etching methods and image acquisition. The advantages and disadvantages of the applied methods are pointed out, and the accuracy of the phase analysis performed by the two methods is compared.

  14. The Ten Beads method: a novel way to collect quantitative data in rural Uganda

    Directory of Open Access Journals (Sweden)

    Francis Mulekya Bwambale

    2013-07-01

    Full Text Available This paper illustrates how locally appropriate methods can be used to collect quantitative data from illiterate respondents. This method uses local beads to represent quantities, which is a novel yet potentially valuable methodological improvement over standard Western survey methods.

  15. Mixing Qualitative and Quantitative Methods: Insights into Design and Analysis Issues

    Science.gov (United States)

    Lieber, Eli

    2009-01-01

    This article describes and discusses issues related to research design and data analysis in the mixing of qualitative and quantitative methods. It is increasingly desirable to use multiple methods in research, but questions arise as to how best to design and analyze the data generated by mixed methods projects. I offer a conceptualization for such…

  16. Methodological Reporting in Qualitative, Quantitative, and Mixed Methods Health Services Research Articles

    Science.gov (United States)

    Wisdom, Jennifer P; Cavaleri, Mary A; Onwuegbuzie, Anthony J; Green, Carla A

    2012-01-01

    Objectives Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles. Data Sources All empirical articles (n = 1,651) published between 2003 and 2007 from four top-ranked health services journals. Study Design All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed method reports). We used chi-square tests to evaluate differences between article types for each component. Principal Findings Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ2(1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1.27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ2(1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively). Conclusion Few published health services research articles use mixed methods. The frequency of key methodological components is variable. Suggestions are provided to increase the

  17. The quantitative methods boot camp: teaching quantitative thinking and computing skills to graduate students in the life sciences.

    Science.gov (United States)

    Stefan, Melanie I; Gutlerner, Johanna L; Born, Richard T; Springer, Michael

    2015-04-01

    The past decade has seen a rapid increase in the ability of biologists to collect large amounts of data. It is therefore vital that research biologists acquire the necessary skills during their training to visualize, analyze, and interpret such data. To begin to meet this need, we have developed a "boot camp" in quantitative methods for biology graduate students at Harvard Medical School. The goal of this short, intensive course is to enable students to use computational tools to visualize and analyze data, to strengthen their computational thinking skills, and to simulate and thus extend their intuition about the behavior of complex biological systems. The boot camp teaches basic programming using biological examples from statistics, image processing, and data analysis. This integrative approach to teaching programming and quantitative reasoning motivates students' engagement by demonstrating the relevance of these skills to their work in life science laboratories. Students also have the opportunity to analyze their own data or explore a topic of interest in more detail. The class is taught with a mixture of short lectures, Socratic discussion, and in-class exercises. Students spend approximately 40% of their class time working through both short and long problems. A high instructor-to-student ratio allows students to get assistance or additional challenges when needed, thus enhancing the experience for students at all levels of mastery. Data collected from end-of-course surveys from the last five offerings of the course (between 2012 and 2014) show that students report high learning gains and feel that the course prepares them for solving quantitative and computational problems they will encounter in their research. We outline our course here which, together with the course materials freely available online under a Creative Commons License, should help to facilitate similar efforts by others.

  18. The quantitative methods boot camp: teaching quantitative thinking and computing skills to graduate students in the life sciences.

    Directory of Open Access Journals (Sweden)

    Melanie I Stefan

    2015-04-01

    Full Text Available The past decade has seen a rapid increase in the ability of biologists to collect large amounts of data. It is therefore vital that research biologists acquire the necessary skills during their training to visualize, analyze, and interpret such data. To begin to meet this need, we have developed a "boot camp" in quantitative methods for biology graduate students at Harvard Medical School. The goal of this short, intensive course is to enable students to use computational tools to visualize and analyze data, to strengthen their computational thinking skills, and to simulate and thus extend their intuition about the behavior of complex biological systems. The boot camp teaches basic programming using biological examples from statistics, image processing, and data analysis. This integrative approach to teaching programming and quantitative reasoning motivates students' engagement by demonstrating the relevance of these skills to their work in life science laboratories. Students also have the opportunity to analyze their own data or explore a topic of interest in more detail. The class is taught with a mixture of short lectures, Socratic discussion, and in-class exercises. Students spend approximately 40% of their class time working through both short and long problems. A high instructor-to-student ratio allows students to get assistance or additional challenges when needed, thus enhancing the experience for students at all levels of mastery. Data collected from end-of-course surveys from the last five offerings of the course (between 2012 and 2014) show that students report high learning gains and feel that the course prepares them for solving quantitative and computational problems they will encounter in their research. We outline our course here which, together with the course materials freely available online under a Creative Commons License, should help to facilitate similar efforts by others.

  19. Revisiting the Quantitative-Qualitative Debate: Implications for Mixed-Methods Research

    Science.gov (United States)

    SALE, JOANNA E. M.; LOHFELD, LYNNE H.; BRAZIL, KEVIN

    2015-01-01

    Health care research includes many studies that combine quantitative and qualitative methods. In this paper, we revisit the quantitative-qualitative debate and review the arguments for and against using mixed-methods. In addition, we discuss the implications stemming from our view, that the paradigms upon which the methods are based have a different view of reality and therefore a different view of the phenomenon under study. Because the two paradigms do not study the same phenomena, quantitative and qualitative methods cannot be combined for cross-validation or triangulation purposes. However, they can be combined for complementary purposes. Future standards for mixed-methods research should clearly reflect this recommendation. PMID:26523073

  20. Validation of PCR methods for quantitation of genetically modified plants in food.

    Science.gov (United States)

    Hübner, P; Waiblinger, H U; Pietsch, K; Brodmann, P

    2001-01-01

    For enforcement of the recently introduced labeling threshold for genetically modified organisms (GMOs) in food ingredients, quantitative detection methods such as quantitative competitive PCR (QC-PCR) and real-time PCR are applied by official food control laboratories. The experiences of 3 European food control laboratories in validating such methods were compared to describe realistic performance characteristics of quantitative PCR detection methods. The limit of quantitation (LOQ) of GMO-specific, real-time PCR was experimentally determined to be 30-50 target molecules, which is close to theoretical prediction. Starting PCR with 200 ng genomic plant DNA, the LOQ depends primarily on the genome size of the target plant and ranges from 0.02% for rice to 0.7% for wheat. The precision of quantitative PCR detection methods, expressed as relative standard deviation (RSD), varied from 10 to 30%. Using test samples containing Bt176 corn and applying Bt176-specific QC-PCR, mean values deviated from true values by -7 to 18%, with an average of 2 ± 10%. Ruggedness of real-time PCR detection methods was assessed in an interlaboratory study analyzing commercial, homogeneous food samples. Roundup Ready soybean DNA contents were determined in the range of 0.3 to 36%, relative to soybean DNA, with RSDs of about 25%. Taking the precision of quantitative PCR detection methods into account, suitable sample plans and sample sizes for GMO analysis are suggested. Because quantitative GMO detection methods measure GMO contents of samples in relation to reference material (calibrants), high priority must be given to international agreements and standardization on certified reference materials.
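    The genome-size dependence of the relative LOQ follows from simple copy-number arithmetic: a fixed 200 ng of DNA contains far fewer genome copies for a large genome than a small one. The sketch below uses assumed 1C genome masses (rice ~0.5 pg, wheat ~17 pg; these figures are not from the paper) to reproduce the reported order of magnitude.

```python
def relative_loq_percent(loq_copies, genome_1c_pg, input_ng=200.0):
    # number of haploid-genome copies in the PCR input (1 ng = 1000 pg)
    total_copies = input_ng * 1000.0 / genome_1c_pg
    # smallest quantifiable GMO fraction, as a percentage of all copies
    return 100.0 * loq_copies / total_copies
```

    With an LOQ of 50 copies this gives ~0.013% for rice and ~0.43% for wheat, consistent in magnitude with the 0.02% and 0.7% quoted above.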

  1. Comparison study on qualitative and quantitative risk assessment methods for urban natural gas pipeline network.

    Science.gov (United States)

    Han, Z Y; Weng, W G

    2011-05-15

    In this paper, a qualitative and a quantitative risk assessment method for urban natural gas pipeline networks are proposed. The qualitative method is comprised of an index system, which includes a causation index, an inherent risk index, a consequence index and their corresponding weights. The quantitative method consists of a probability assessment, a consequence analysis and a risk evaluation. The outcome of the qualitative method is a qualitative risk value, and for the quantitative method the outcomes are individual risk and social risk. In comparison with previous research, the qualitative method proposed in this paper is particularly suitable for urban natural gas pipeline networks, and the quantitative method takes different accident consequences into consideration, such as toxic gas diffusion, jet flame, fireball combustion and UVCE. Two sample urban natural gas pipeline networks are used to demonstrate the two methods. Both methods can be applied in practice; the choice between them depends on the basic data available for the gas pipelines and on the precision requirements of the risk assessment.
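    The quantitative outcome above, individual risk at a location, is conventionally the sum over accident scenarios of failure frequency times the conditional probabilities of the outcome and of fatality. A minimal sketch with hypothetical numbers (not taken from the paper):

```python
def individual_risk(scenarios):
    """Location-specific individual risk (per year), summed over accident
    scenarios of (failure frequency, P(outcome | failure), P(fatality))."""
    return sum(freq * p_outcome * p_fatality
               for freq, p_outcome, p_fatality in scenarios)

# illustrative scenarios at one receptor: jet flame, fireball, UVCE
example = [(1e-4, 0.5, 0.1),   # jet flame
           (1e-4, 0.2, 0.5),   # fireball
           (1e-4, 0.1, 0.3)]   # unconfined vapor cloud explosion
```

    Social risk is then built from the same scenario set by pairing each outcome's frequency with the number of people affected, yielding an F-N curve rather than a single number.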

  2. An in vitro comparison of two modern apex locators.

    Science.gov (United States)

    Weiger, R; John, C; Geigle, H; Löst, C

    1999-11-01

    Two apex locators were compared regarding their ability to accurately locate the apical constriction in the presence of various canal fluids at different meter readings. Forty-one root canals were filled with 1% NaOCl, 3% H2O2, and 0.9% NaCl, respectively. Electronic working length (EWL) measurements were recorded with the apex locators Root ZX (meter readings: "Apex", "0.5", and "1") and Apit (meter readings: "Apex" and "3"). The deviation of the EWL from the apical constriction was determined. The proportion of measurements within +/- 0.5 mm of the apical constriction ranged between 0.76 and 0.85 for Root ZX at the meter readings "Apex" and "0.5," regardless of the canal contents. Apit consistently displayed shorter measurement values than Root ZX and reached the highest proportions at the meter reading "Apex": 0.59 (1% NaOCl), 0.61 (3% H2O2), and 0.68 (0.9% NaCl). In the presence of NaOCl, Root ZX provides the more accurate EWL measurements at the meter reading "0.5" and "Apex."

  3. Accuracy of four electronic apex locators: an in vitro evaluation.

    Science.gov (United States)

    De Moor, R J; Hommez, G M; Martens, L C; De Boever, J G

    1999-04-01

    In the present study, the accuracy and operator dependency of four electronic canal length measuring devices (Apex Finder AFA Model 7005, Apex-Finder, Neosono Ultima EZ and Apit 2) were compared under a set of specified conditions. The electronic apex locators were tested in unflared dry, flared wet and flared dry canals, and in a gelatin as well as a sodium hypochlorite sponge model. Fifteen extracted single-canaled teeth were selected. The differences between the canal lengths obtained by the electronic apex locators and the actual canal lengths were scored. Only the Apex-Finder was found to be unreliable (deviations greater than ± 0.5 mm from the apical foramen). This device was also found to be particularly operator-dependent. A ranking based on a precision of ± 0.1 mm from the apical foramen showed the Apex Finder AFA Model 7005 to be the most accurate. Early coronal flaring did not ensure better or more precise readings. The gelatin model was evaluated to be more suitable than the sodium hypochlorite model for testing electronic apex locators in vitro.

  4. Accuracy of three electronic apex locators in the presence of different irrigating solutions.

    Science.gov (United States)

    Carvalho, Ana Laura Pion; Moura-Netto, Cacio; Moura, Abilio Albuquerque Maranhão de; Marques, Márcia Martins; Davidowicz, Harry

    2010-01-01

    The present study compared the accuracy of three electronic apex locators (EALs) - Elements Diagnostic®, Root ZX® and Apex DSP® - in the presence of different irrigating solutions (0.9% saline solution and 1% sodium hypochlorite). The electronic measurements were carried out by three examiners, using twenty extracted human permanent maxillary central incisors. A size 10 K-file was introduced into the root canals until reaching the 0.0 mark, and was subsequently retracted to the 1.0 mark. The gold standard (GS) measurement was obtained by combining visual and radiographic methods, and was set 1 mm short of the apical foramen. Electronic length values closer to the GS (± 0.5 mm) were considered accurate measures. Intraclass correlation coefficients (ICCs) were used to verify inter-examiner agreement. The comparison among the EALs was performed using the McNemar and Kruskal-Wallis tests. Elements Diagnostic® and Root ZX® did not differ significantly from each other (p > 0.05), independent of the irrigating solutions used, and the measurements taken with these two EALs were more accurate than those taken with Apex DSP®, regardless of the irrigating solution used (p < 0.05). These two apex locators are able to locate the cementum-dentine junction more precisely than Apex DSP®. The presence of irrigating solutions does not interfere with the performance of the EALs.

  5. APEX version 2.0: latest version of the cross-platform analysis program for EXAFS.

    Science.gov (United States)

    Dimakis, N; Bunker, G

    2001-03-01

    This report describes recent progress on APEX, a free, open source, cross-platform set of EXAFS data analysis software. In a previous report we described APEX 1.0 (Dimakis, N. and Bunker, G., 1999), a free and open source suite of basic X-Ray Absorption Fine Structure (XAFS) data analysis programs for classical data reduction and single scattering analysis. The first version of APEX was, to our knowledge, the only cross-platform (Linux/IRIX/Windows/MacOS) EXAFS analysis program, but it lacked important features such as multiple scattering fitting, generic format conversion from ASCII to University of Washington (UW) binary-type files, and user-friendly interactive graphics. In the enhanced version described here we have added cross-platform interactive graphics based on the BLT package, an extension to Tcl/Tk. Some of the utilities have been rewritten in native Tcl/Tk, allowing faster and more integrated functionality with the main package. The package has also been ported to SunOS. APEX 2.0 in its current form is suitable for routine data analysis and training. Addition of more advanced methods of data analysis is planned.

  6. [A new method of calibration and positioning in quantitative analysis of multicomponents by single marker].

    Science.gov (United States)

    He, Bing; Yang, Shi-Yan; Zhang, Yan

    2012-12-01

    This paper aims to establish a new method of calibration and positioning for quantitative analysis of multicomponents by single marker (QAMS), using Shuanghuanglian oral liquid as the research object. Relative correction factors were established, with chlorogenic acid as the reference, for the other 11 active components (neochlorogenic acid, cryptochlorogenic acid, caffeic acid, forsythoside A, scutellarin, isochlorogenic acid B, isochlorogenic acid A, isochlorogenic acid C, baicalin, phillyrin and wogonoside) in Shuanghuanglian oral liquid by 3 correction methods (multipoint correction, slope correction and quantitative factor correction). At the same time, chromatographic peaks were positioned by the linear regression method. Only one standard was used to determine the content of 12 components in Shuanghuanglian oral liquid, instead of requiring a separate reference substance for each component in quality control. The results showed that, within the linear ranges, no significant differences were found between the quantitative results for the 12 active constituents in 3 batches of Shuanghuanglian oral liquid determined by the 3 correction methods and those from the external standard method (ESM) or standard curve method (SCM). The method is simpler and quicker than literature methods, and the results were accurate, reliable and reproducible. Positioning chromatographic peaks by the linear regression method was also more accurate than the relative retention time used in the literature. The slope and quantitative factor corrections are feasible and accurate for controlling the quality of traditional Chinese medicine.
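    The core QAMS bookkeeping can be sketched in two steps: a relative correction factor established once from standards of both compounds, then contents in a sample computed from peak areas using only the single reference marker. This is a generic illustration of the QAMS relationship, not the paper's specific correction procedures; the names are hypothetical.

```python
def correction_factor(area_ref, conc_ref, area_i, conc_i):
    # relative correction factor f_i of component i versus the reference
    # marker, established from standard solutions of both compounds
    return (area_ref * conc_i) / (area_i * conc_ref)

def qams_content(area_i, f_i, area_ref_sample, conc_ref_sample):
    # content of component i in a sample, using only the single reference
    # standard (e.g. chlorogenic acid) measured in the same run
    return f_i * area_i * conc_ref_sample / area_ref_sample
```

    For linear detector responses area = k·conc, f_i reduces to k_ref/k_i, so the reference peak effectively recalibrates every other component's response in each run.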

  7. Quantitative imaging biomarkers: a review of statistical methods for technical performance assessment.

    Science.gov (United States)

    Raunig, David L; McShane, Lisa M; Pennello, Gene; Gatsonis, Constantine; Carson, Paul L; Voyvodic, James T; Wahl, Richard L; Kurland, Brenda F; Schwarz, Adam J; Gönen, Mithat; Zahlmann, Gudrun; Kondratovich, Marina V; O'Donnell, Kevin; Petrick, Nicholas; Cole, Patricia E; Garra, Brian; Sullivan, Daniel C

    2015-02-01

    Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers to measure changes in these features. Critical to the performance of a quantitative imaging biomarker in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in designs, analysis method, and metrics used to assess a quantitative imaging biomarker for clinical use. It is therefore difficult or not possible to integrate results from different studies or to use reported results to design studies. The Radiological Society of North America and the Quantitative Imaging Biomarker Alliance with technical, radiological, and statistical experts developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of quantitative imaging biomarker performance studies so that results from multiple studies can be compared, contrasted, or combined.

  8. Project-Based Learning in Undergraduate Environmental Chemistry Laboratory: Using EPA Methods to Guide Student Method Development for Pesticide Quantitation

    Science.gov (United States)

    Davis, Eric J.; Pauls, Steve; Dick, Jonathan

    2017-01-01

    Presented is a project-based learning (PBL) laboratory approach for an upper-division environmental chemistry or quantitative analysis course. In this work, a combined laboratory class of 11 environmental chemistry students developed a method based on published EPA methods for the extraction of dichlorodiphenyltrichloroethane (DDT) and its…

  9. Quantitative interferometric microscopic flow cytometer with expanded principal component analysis method

    Science.gov (United States)

    Wang, Shouyu; Jin, Ying; Yan, Keding; Xue, Liang; Liu, Fei; Li, Zhenhua

    2014-11-01

    Quantitative interferometric microscopy is used in biological and medical fields, and a wealth of applications have been proposed to detect different kinds of biological samples. Here, we develop a phase-detecting cytometer based on quantitative interferometric microscopy with an expanded principal component analysis phase retrieval method to obtain phase distributions of red blood cells with a spatial resolution of ~1.5 μm. Since the expanded principal component analysis method is a time-domain phase retrieval algorithm, it avoids the disadvantages of traditional frequency-domain algorithms. Additionally, the phase retrieval method realizes high-speed phase imaging from multiple microscopic interferograms captured by a CCD camera as the biological cells are scanned across the field of view. We believe this method can be a powerful tool for quantitatively measuring the phase distributions of different biological samples in biological and medical fields.

  10. QUANTITATIVE EVALUATION METHOD OF ELEMENTS PRIORITY OF CARTOGRAPHIC GENERALIZATION BASED ON TAXI TRAJECTORY DATA

    Directory of Open Access Journals (Sweden)

    Z. Long

    2017-09-01

    Full Text Available Considering the lack of quantitative criteria for the selection of elements in cartographic generalization, this study divided passenger hotspot areas into three levels, assigned them different weights, and then classified the elements from the different hotspots. On this basis, a method was proposed to quantify the priority of element selection. Subsequently, the quantitative priority of different cartographic elements was summarized based on this method. In cartographic generalization, the method can be used to preferentially select significant elements and discard those that are relatively insignificant.

  11. An improved transmutation method for quantitative determination of the components in multicomponent overlapping chromatograms.

    Science.gov (United States)

    Shao, Xueguang; Yu, Zhengliang; Ma, Chaoxiong

    2004-06-01

    An improved method is proposed for the quantitative determination of multicomponent overlapping chromatograms based on a known transmutation method. To overcome the main limitation of the transmutation method caused by the oscillation generated in the transmutation process, two techniques--wavelet transform smoothing and the cubic spline interpolation for reducing data points--were adopted, and a new criterion was also developed. By using the proposed algorithm, the oscillation can be suppressed effectively, and quantitative determination of the components in both the simulated and experimental overlapping chromatograms is successfully obtained.

  12. Resection of the Tooth Apex with Diode Laser

    Directory of Open Access Journals (Sweden)

    Uzunov Tz.

    2014-06-01

    Full Text Available An “in vitro” experimental study has been carried out on 70 extracted teeth. Laser resection of the root apex has been carried out with a diode laser beam with a wavelength of 810 ± 10 nm. Radiation of sequentially increasing power has been applied in electrosurgery mode, as follows: 1.3 W, 2 W, 3 W, 4 W, 5 W, 6 W and 7 W. Successful resection of the tooth apex has been achieved at 3 W, 4 W, 5 W, 6 W and 7 W. It was established that the tooth apex carbonizes when resected with the laser.

  13. A General Method for Targeted Quantitative Cross-Linking Mass Spectrometry

    Science.gov (United States)

    Chavez, Juan D.; Eng, Jimmy K.; Schweppe, Devin K.; Cilia, Michelle; Rivera, Keith; Zhong, Xuefei; Wu, Xia; Allen, Terrence; Khurgel, Moshe; Kumar, Akhilesh; Lampropoulos, Athanasios; Larsson, Mårten; Maity, Shuvadeep; Morozov, Yaroslav; Pathmasiri, Wimal; Perez-Neut, Mathew; Pineyro-Ruiz, Coriness; Polina, Elizabeth; Post, Stephanie; Rider, Mark; Tokmina-Roszyk, Dorota; Tyson, Katherine; Vieira Parrine Sant'Ana, Debora; Bruce, James E.

    2016-01-01

    Chemical cross-linking mass spectrometry (XL-MS) provides protein structural information by identifying covalently linked proximal amino acid residues on protein surfaces. The information gained by this technique is complementary to other structural biology methods such as x-ray crystallography, NMR and cryo-electron microscopy[1]. The extension of traditional quantitative proteomics methods with chemical cross-linking can provide information on the structural dynamics of protein structures and protein complexes. The identification and quantitation of cross-linked peptides remains challenging for the general community, requiring specialized expertise and ultimately limiting more widespread adoption of the technique. We describe a general method for targeted quantitative mass spectrometric analysis of cross-linked peptide pairs. We report the adaptation of the widely used, open-source software package Skyline for the analysis of quantitative XL-MS data as a means for data analysis and sharing of methods. We demonstrate the utility and robustness of the method with a cross-laboratory study and present data that are supported by and validate previously published data on quantified cross-linked peptide pairs. This advance provides an easy-to-use resource so that any lab with access to an LC-MS system capable of performing targeted quantitative analysis can quickly and accurately measure dynamic changes in protein structure and protein interactions. PMID:27997545

  14. Quantitative bioanalytical and analytical method development of dibenzazepine derivative, carbamazepine: A review

    Directory of Open Access Journals (Sweden)

    Prasanna A. Datar

    2015-08-01

    Full Text Available Bioanalytical methods are widely used for quantitative estimation of drugs and their metabolites in physiological matrices. These methods could be applied to studies in areas of human clinical pharmacology and toxicology. The major bioanalytical services are method development, method validation and sample analysis (method application). Various methods such as GC, LC–MS/MS, HPLC, HPTLC, micellar electrokinetic chromatography, and UFLC have been used in laboratories for the qualitative and quantitative analysis of carbamazepine in biological samples throughout all phases of clinical research and quality control. The article incorporates various reported methods developed to help analysts in choosing crucial parameters for new method development of carbamazepine and its derivatives and also enumerates metabolites, and impurities reported so far.

  15. Quantitative bioanalytical and analytical method development of dibenzazepine derivative, carbamazepine: A review☆

    Institute of Scientific and Technical Information of China (English)

    Prasanna A. Datar

    2015-01-01

    Bioanalytical methods are widely used for quantitative estimation of drugs and their metabolites in physiological matrices. These methods could be applied to studies in areas of human clinical pharmacology and toxicology. The major bioanalytical services are method development, method validation and sample analysis (method application). Various methods such as GC, LC-MS/MS, HPLC, HPTLC, micellar electrokinetic chromatography, and UFLC have been used in laboratories for the qualitative and quantitative analysis of carbamazepine in biological samples throughout all phases of clinical research and quality control. The article incorporates various reported methods developed to help analysts in choosing crucial parameters for new method development of carbamazepine and its derivatives and also enumerates metabolites, and impurities reported so far.

  16. Reproducibility of CSF quantitative culture methods for estimating rate of clearance in cryptococcal meningitis.

    Science.gov (United States)

    Dyal, Jonathan; Akampurira, Andrew; Rhein, Joshua; Morawski, Bozena M; Kiggundu, Reuben; Nabeta, Henry W; Musubire, Abdu K; Bahr, Nathan C; Williams, Darlisha A; Bicanic, Tihana; Larsen, Robert A; Meya, David B; Boulware, David R

    2016-05-01

    Quantitative cerebrospinal fluid (CSF) cultures provide a measure of disease severity in cryptococcal meningitis. The fungal clearance rate by quantitative cultures has become a primary endpoint for phase II clinical trials. This study determined the inter-assay accuracy of three different quantitative culture methodologies. Among 91 participants with meningitis symptoms in Kampala, Uganda, during August-November 2013, 305 CSF samples were prospectively collected from patients at multiple time points during treatment. Samples were simultaneously cultured by three methods: (1) the St. George's method, with a 100 mcl input volume of CSF and five 1:10 serial dilutions; (2) the AIDS Clinical Trials Group (ACTG) method, using 1000, 100, and 10 mcl input volumes plus two 1:100 dilutions with 100 and 10 mcl input volume per dilution, on seven agar plates; and (3) a 10 mcl calibrated loop of undiluted and 1:100 diluted CSF (loop). Quantitative culture values did not statistically differ between the St. George and ACTG methods (P = .09) but did between St. George and the 10 mcl loop (P < .001). Repeated-measures pairwise correlation between any of the methods was high (r ≥ 0.88). For detecting sterility, the ACTG method had the highest negative predictive value of 97% (91% St. George, 60% loop), but the ACTG method had occasional (∼10%) difficulties in quantification due to colony clumping. For CSF clearance rate, the St. George and ACTG methods did not differ overall (mean -0.05 ± 0.07 log10 CFU/ml/day; P = .14) on a group level; however, individual-level clearance varied. The St. George and ACTG quantitative CSF culture methods produced comparable but not identical results. Quantitative cultures can inform treatment management strategies.
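
The two quantities the study compares can be reproduced with simple arithmetic: a colony count from a diluted plating is back-calculated to CFU/ml, and the clearance rate is the least-squares slope of log10(CFU/ml) versus day. This is a generic sketch of those textbook calculations, not the trial's analysis code:

```python
import math

def cfu_per_ml(colonies, dilution_factor, plated_volume_ml):
    """Back-calculate concentration from a plate count."""
    return colonies * dilution_factor / plated_volume_ml

def clearance_rate(days, cfu_ml):
    """Ordinary least-squares slope of log10(CFU/ml) vs time,
    in log10 CFU/ml/day (negative = clearing)."""
    ys = [math.log10(v) for v in cfu_ml]
    n = len(days)
    mx, my = sum(days) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(days, ys))
    den = sum((x - mx) ** 2 for x in days)
    return num / den

# 50 colonies from 100 mcl (0.1 ml) of a 1:10 dilution -> 5000 CFU/ml
conc = cfu_per_ml(50, 10, 0.1)
# A ten-fold drop per day corresponds to -1.0 log10 CFU/ml/day
rate = clearance_rate([0, 1, 2, 3], [1e6, 1e5, 1e4, 1e3])
```

Plates yielding zero colonies need special handling (they cannot enter the log directly), which is one reason the sterility detection limits of the three plating schemes differ.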

  17. A Method for Quantitative Analysis of Chemical Mixtures with THz Time Domain Spectroscopy

    Institute of Scientific and Technical Information of China (English)

    ZHANG Zeng-Yan; JI Te; YU Xiao-Han; XIAO Ti-Qiao; XU Hong-Jie

    2006-01-01

    A method for analysing chemical mixtures quantitatively with terahertz time domain spectroscopy is proposed. The experimental results demonstrate the feasibility of this technique. The transmission coefficient of the THz wave at the sample surface is taken into account to improve the analytic precision. Isomer mixtures are chosen as the experimental samples. Compared to similar techniques, the analytic precision is evidently improved by this method.

  18. Quantitative Research Methods in Chaos and Complexity: From Probability to Post Hoc Regression Analyses

    Science.gov (United States)

    Gilstrap, Donald L.

    2013-01-01

    In addition to qualitative methods presented in chaos and complexity theories in educational research, this article addresses quantitative methods that may show potential for future research studies. Although much in the social and behavioral sciences literature has focused on computer simulations, this article explores current chaos and…

  19. Can You Repeat That Please?: Using Monte Carlo Simulation in Graduate Quantitative Research Methods Classes

    Science.gov (United States)

    Carsey, Thomas M.; Harden, Jeffrey J.

    2015-01-01

    Graduate students in political science come to the discipline interested in exploring important political questions, such as "What causes war?" or "What policies promote economic growth?" However, they typically do not arrive prepared to address those questions using quantitative methods. Graduate methods instructors must…

  20. Quantitative Methods in Public Administration: Their Use and Development Through Time

    NARCIS (Netherlands)

    Groeneveld, S.; Tummers, L.; Bronkhorst, B.; Ashikali, T.; Thiel, S. van

    2015-01-01

    This article aims to contribute to recent debates on research methods in public administration by examining the use of quantitative methods in public administration research. We analyzed 1,605 articles published between 2001-2010 in four leading journals: Journal of Public Administration Research and Theory, Public Administration Review, Governance, and Public Administration.

  1. Quantitative Methods in Public Administration: their use and development through time

    NARCIS (Netherlands)

    Groeneveld, S.M.; Tummers, L.G.; Bronkhorst, B.A.C.; Ashikali, T.S.; van Thiel, S.

    2015-01-01

    This article aims to contribute to recent debates on research methods in public administration by examining the use of quantitative methods in public administration research. We analyzed 1,605 articles published between 2001-2010 in four leading journals: JPART, PAR, Governance and PA. Results show

  3. A method for the quantitative determination of crystalline phases by X-ray

    Science.gov (United States)

    Petzenhauser, I.; Jaeger, P.

    1988-01-01

    A mineral analysis method is described for rapid quantitative determination of crystalline substances in those cases in which the sample is present in pure form or in a mixture of known composition. With this method there is no need for prior chemical analysis.

  4. Student Performance in a Quantitative Methods Course under Online and Face-to-Face Delivery

    Science.gov (United States)

    Verhoeven, Penny; Wakeling, Victor

    2011-01-01

    In a study conducted at a large public university, the authors assessed, for an upper-division quantitative methods business core course, the impact of delivery method (online versus face-to-face) on the success rate (percentage of enrolled students earning a grade of A, B, or C in the course). The success rate of the 161 online students was 55.3%,…

  5. Developing Investigative Entry Points: Exploring the Use of Quantitative Methods in English Education Research

    Science.gov (United States)

    McGraner, Kristin L.; Robbins, Daniel

    2010-01-01

    Although many research questions in English education demand the use of qualitative methods, this paper will briefly explore how English education researchers and doctoral students may use statistics and quantitative methods to inform, complement, and/or deepen their inquiries. First, the authors will provide a general overview of the survey areas…

  8. Qualitative Methods Can Enrich Quantitative Research on Occupational Stress: An Example from One Occupational Group

    Science.gov (United States)

    Schonfeld, Irvin Sam; Farrell, Edwin

    2010-01-01

    The chapter examines the ways in which qualitative and quantitative methods support each other in research on occupational stress. Qualitative methods include eliciting from workers unconstrained descriptions of work experiences, careful first-hand observations of the workplace, and participant-observers describing "from the inside" a…

  9. Using the Taguchi method for rapid quantitative PCR optimization with SYBR Green I.

    Science.gov (United States)

    Thanakiatkrai, Phuvadol; Welch, Lindsey

    2012-01-01

    Here, we applied the Taguchi method, an engineering optimization process, to successfully determine the optimal conditions for three SYBR Green I-based quantitative PCR assays. This method balanced the effects of all factors and their associated levels by using an orthogonal array rather than a factorial array. Instead of running 27 experiments with the conventional factorial method, the Taguchi method achieved the same optimal conditions using only nine experiments, saving valuable resources.
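
The economy described above (nine runs instead of 27 for three factors at three levels) comes from the L9 orthogonal array: each level of each factor appears equally often against every level of the other factors, so per-level mean responses can be compared directly. A minimal sketch, with a synthetic response standing in for real qPCR data:

```python
# First three columns of the standard L9(3^4) orthogonal array:
# 9 runs cover 3 factors at 3 levels (a full factorial would need 27).
L9 = [(1, 1, 1), (1, 2, 2), (1, 3, 3),
      (2, 1, 2), (2, 2, 3), (2, 3, 1),
      (3, 1, 3), (3, 2, 1), (3, 3, 2)]

def best_levels(responses, larger_is_better=True):
    """Pick, for each factor, the level with the best mean response."""
    choose = max if larger_is_better else min
    best = []
    for factor in range(3):
        level_means = {}
        for level in (1, 2, 3):
            vals = [r for run, r in zip(L9, responses) if run[factor] == level]
            level_means[level] = sum(vals) / len(vals)
        best.append(choose(level_means, key=level_means.get))
    return best

# Synthetic response: each factor's effect grows with its level, so the
# analysis should recover level 3 as best for all three factors.
responses = [10 * a + b + 0.1 * c for (a, b, c) in L9]
optimum = best_levels(responses)  # [3, 3, 3]
```

Because the array is balanced, each per-level mean averages over all levels of the other factors, which is what lets nine runs stand in for the full factorial when interactions are small.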

  10. A quantitative PCR method to quantify ruminant DNA in porcine crude heparin.

    Science.gov (United States)

    Concannon, Sean P; Wimberley, P Brett; Workman, Wesley E

    2011-01-01

    Heparin is a well-known glycosaminoglycan extracted from porcine intestines. Increased vigilance for transmissible spongiform encephalopathy in animal-derived pharmaceuticals requires methods to prevent the introduction of heparin from ruminants into the supply chain. The sensitivity, specificity, and precision of the quantitative polymerase chain reaction (PCR) make it a superior analytical platform for screening heparin raw material for bovine-, ovine-, and caprine-derived material. A quantitative PCR probe and primer set homologous to the ruminant Bov-A2 short interspersed nuclear element (SINE) locus (Mendoza-Romero et al. J. Food Prot. 67:550-554, 2004) demonstrated nearly equivalent affinities for bovine, ovine, and caprine DNA targets, while exhibiting no cross-reactivity with porcine DNA in the quantitative PCR method. A second PCR primer and probe set, specific for the porcine PRE1 SINE sequence, was also developed to quantify the background porcine DNA level. DNA extraction and purification was not necessary for analysis of the raw heparin samples, although digestion of the sample with heparinase was employed. The method exhibits a quantitation range of 0.3-3,000 ppm ruminant DNA in heparin. Validation parameters of the method included accuracy, repeatability, precision, specificity, range, quantitation limit, and linearity.
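
Quantitation over a range like 0.3-3,000 ppm is typically read off a qPCR standard curve: Cq is linear in log10(quantity), amplification efficiency follows from the slope, and unknowns are interpolated. The sketch below is a generic standard-curve calculation under those textbook assumptions, not the paper's validated procedure:

```python
import math

def fit_standard_curve(quantities, cqs):
    """OLS fit of Cq = intercept + slope * log10(quantity)."""
    xs = [math.log10(q) for q in quantities]
    n = len(xs)
    mx, my = sum(xs) / n, sum(cqs) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, cqs)) / \
        sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

def efficiency(slope):
    """Amplification efficiency; 1.0 means perfect doubling per cycle."""
    return 10.0 ** (-1.0 / slope) - 1.0

def quantity_from_cq(cq, slope, intercept):
    """Interpolate an unknown back off the standard curve."""
    return 10.0 ** ((cq - intercept) / slope)

# Synthetic dilution series: slope -3.32 (near-100% efficiency), intercept 35.
quantities = [1.0, 10.0, 100.0, 1000.0]
cqs = [35.0 - 3.32 * math.log10(q) for q in quantities]
slope, intercept = fit_standard_curve(quantities, cqs)
unknown = quantity_from_cq(28.36, slope, intercept)  # ~100
```

In the dual-assay design described above, one such curve (ruminant Bov-A2 target) quantifies contaminating DNA while the second (porcine PRE1 target) quantifies the background, and the ratio gives the ppm figure.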

  11. An efficient quantitation method of next-generation sequencing libraries by using MiSeq sequencer.

    Science.gov (United States)

    Katsuoka, Fumiki; Yokozawa, Junji; Tsuda, Kaoru; Ito, Shin; Pan, Xiaoqing; Nagasaki, Masao; Yasuda, Jun; Yamamoto, Masayuki

    2014-12-01

    Library quantitation is a critical step to obtain high data output in Illumina HiSeq sequencers. Here, we introduce a library quantitation method that uses the Illumina MiSeq sequencer designated as quantitative MiSeq (qMiSeq). In this procedure, 96 dual-index libraries, including control samples, are denatured, pooled in equal volume, and sequenced by MiSeq. We found that relative concentration of each library can be determined based on the observed index ratio and can be used to determine HiSeq run condition for each library. Thus, qMiSeq provides an efficient way to quantitate a large number of libraries at a time.
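
The core arithmetic of such an index-counting approach is simple: after the equal-volume pooled MiSeq run, each library's share of indexed reads is proportional to its relative molar concentration, so the read ratios indicate how to rebalance the pool for the production run. A minimal sketch under that assumption (names and numbers illustrative, not from the paper):

```python
def relative_concentrations(index_read_counts):
    """Each library's read count divided by the mean count across the pool."""
    mean_count = sum(index_read_counts.values()) / len(index_read_counts)
    return {lib: n / mean_count for lib, n in index_read_counts.items()}

def rebalanced_volumes(rel_conc, base_volume_ul=10.0):
    """Pool each library inversely to its relative concentration so that
    all libraries contribute evenly next time."""
    return {lib: base_volume_ul / r for lib, r in rel_conc.items()}

# Hypothetical index counts from the quantitation run:
counts = {"lib_A": 100_000, "lib_B": 300_000, "lib_C": 200_000}
rel = relative_concentrations(counts)   # A: 0.5, B: 1.5, C: 1.0
vols = rebalanced_volumes(rel)          # A: 20.0, B: ~6.67, C: 10.0
```

Spiking control samples of known concentration into the 96-plex pool, as the abstract describes, anchors these relative values so they can be converted to loading concentrations for the HiSeq run.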

  12. Research on Petroleum Reservoir Diagenesis and Damage Using EDS Quantitative Analysis Method With Standard Samples

    Institute of Scientific and Technical Information of China (English)

包书景; 陈文学; et al.

    2000-01-01

    In recent years, the X-ray spectrometer has developed not only in resolution, but also towards dynamic analysis, computer modeling processing, standard-sample quantitative analysis and supra-light element analysis. With the gradual sophistication of the quantitative analysis system software, the rationality and accuracy of the established sample reference documents have become the most important guarantee of the reliability of sample quantitative analysis. This work is an important technical subject in China's petroleum reservoir research. Through two years of research and experimental work, an EDS quantitative analysis method for petroleum geology and reservoir research has been established, and reference documents for five mineral (silicate, etc.) specimen standards have been compiled. By closely combining the morphological and compositional characters of the minerals and applying them to reservoir diagenesis research and the prevention of damage to oil formations, we have obtained clear geological results.

  13. Quantitative methods used in Australian health promotion research: a review of publications from 1992-2002.

    Science.gov (United States)

    Smith, Ben J; Zehle, Katharina; Bauman, Adrian E; Chau, Josephine; Hawkshaw, Barbara; Frost, Steven; Thomas, Margaret

    2006-04-01

    This study examined the use of quantitative methods in Australian health promotion research in order to identify methodological trends and priorities for strengthening the evidence base for health promotion. Australian health promotion articles were identified by hand-searching publications from 1992-2002 in six journals: Health Promotion Journal of Australia, Australian and New Zealand Journal of Public Health, Health Promotion International, Health Education Research, Health Education and Behavior and the American Journal of Health Promotion. The study designs and statistical methods used in articles presenting quantitative research were recorded. 591 (57.7%) of the 1,025 articles used quantitative methods. Cross-sectional designs were used in the majority (54.3%) of studies, with pre- and post-test (14.6%) and post-test only (9.5%) the next most common designs. Bivariate statistical methods were used in 45.9% of papers, multivariate methods in 27.1% and simple numbers and proportions in 25.4%. Few studies used higher-level statistical techniques. While most studies used quantitative methods, the majority were descriptive in nature. The study designs and statistical methods used provided limited scope for demonstrating intervention effects or understanding the determinants of change.

  14. [Study on the multivariate quantitative analysis method for steel alloy elements using LIBS].

    Science.gov (United States)

    Gu, Yan-hong; Li, Ying; Tian, Ye; Lu, Yuan

    2014-08-01

    Quantitative analysis of steel alloys was carried out using laser-induced breakdown spectroscopy (LIBS), taking into account the complex matrix effects in steel alloy samples. The laser-induced plasma was generated by a Q-switched Nd:YAG laser operating at 1064 nm with a pulse width of 10 ns and a repetition frequency of 10 Hz. The LIBS signal was coupled to an echelle spectrometer and recorded by a highly sensitive ICCD detector. To obtain the best experimental conditions, parameters such as the detection delay, the ICCD integration gate width and the detecting position relative to the sample surface were optimized. The experimental results showed that the optimum detection delay time was 1.5 μs, the optimal integration gate width was 2 μs and the best detecting position was 1.5 mm below the alloy sample's surface. The samples used in the experiments were ten standard steel alloy samples and two unknown steel alloy samples. The quantitative analysis was investigated under the optimum experimental parameters, with elements Cr and Ni in the steel alloy samples taken as the detection targets. The analysis was carried out with methods based on conventional univariate quantitative analysis, multiple linear regression and partial least squares (PLS), respectively. It turned out that the correlation coefficients of the calibration curves were not very high with the conventional univariate calibration method, and the relative errors for the two predicted samples were unsatisfactory. The conventional univariate quantitative analysis method therefore cannot effectively serve the purpose of quantitative analysis for multi-component, complex-matrix steel alloy samples. With the multiple linear regression method, the analysis accuracy was improved effectively. The method based on partial least squares (PLS) turned out to be the best of the three quantitative analysis methods applied.
    Based on PLS, the correlation coefficient of calibration curve for Cr is 0

  15. Microchromatography of hemoglobins. VIII. A general qualitative and quantitative method in plastic drinking straws and the quantitative analysis of Hb-F.

    Science.gov (United States)

    Schroeder, W A; Pace, L A

    1978-03-01

    The microchromatographic procedure for the quantitative analysis of the hemoglobin components in a hemolysate uses columns of DEAE-cellulose in a plastic drinking straw with a glycine-KCN-NaCl developer. Not only may the method be used for the quantitative analysis of Hb-F but also for the analysis of the varied components in mixtures of hemoglobins.

  16. An overview of electronic apex locators: part 2.

    Science.gov (United States)

    Ali, R; Okechukwu, N C; Brunton, P; Nattress, B

    2013-03-01

    A number of electronic apex locators are available for use during endodontic treatment. The use of third and fourth generation electronic apex locators (EAL) are recommended to help clinicians determine the apical limit of the root canal system (RCS). The presence of different irrigating media in the RCS does not impact significantly on the performance of third/fourth generation apex locators. The devices are most accurate at determining the apical limit when the attached endodontic file contacts the periodontal ligament space and the visual analogue displays 'Apex' or '0.' Given the accuracies of modern generation EALs, the clinician should be able to consistently identify the apical limit of the tooth under treatment. Their use in conjunction with appropriate radiographs and the clinician's knowledge of average RCS lengths and anatomy will maximise the successful outcome of any orthograde endodontic treatment.

  17. On quantitative prediction of mine structure and application of the method

    Institute of Scientific and Technical Information of China (English)

    夏玉成; 樊怀仁

    2002-01-01

    Coal mining activity is often restricted by geologic structural conditions, so it is very important to know the distribution of mine structures in advance of mining. For this reason, the traditional qualitative procedure must give way to quantitative prediction methods backed by mathematical theory and computer technology. This paper explores some relevant problems with the method, introducing a software package, MSPS, used to automatically and quantitatively predict the relative complexity of geologic structures in different blocks of a coal mining area, with an application example employing the software to select the most suitable mining sites.

  18. Mixed methods in gerontological research: Do the qualitative and quantitative data “touch”?

    Science.gov (United States)

    Happ, Mary Beth

    2010-01-01

    This paper distinguishes between parallel and integrated mixed methods research approaches. Barriers to integrated mixed methods approaches in gerontological research are discussed and critiqued. The author presents examples of mixed methods gerontological research to illustrate approaches to data integration at the levels of data analysis, interpretation, and research reporting. As a summary of the methodological literature, four basic levels of mixed methods data combination are proposed. Opportunities for mixing qualitative and quantitative data are explored using contemporary examples from published studies. Data transformation and visual display, judiciously applied, are proposed as pathways to fuller mixed methods data integration and analysis. Finally, practical strategies for mixing qualitative and quantitative data types are explicated as gerontological research moves beyond parallel mixed methods approaches to achieve data integration. PMID:20077973

  19. Rapid quantitative pharmacodynamic imaging by a novel method: theory, simulation testing and proof of principle

    Directory of Open Access Journals (Sweden)

    Kevin J. Black

    2013-08-01

    Full Text Available Pharmacological challenge imaging has mapped, but rarely quantified, the sensitivity of a biological system to a given drug. We describe a novel method called rapid quantitative pharmacodynamic imaging. This method combines pharmacokinetic-pharmacodynamic modeling, repeated small doses of a challenge drug over a short time scale, and functional imaging to rapidly provide quantitative estimates of drug sensitivity, including EC50 (the concentration of drug that produces half the maximum possible effect). We first test the method with simulated data, assuming a typical sigmoidal dose-response curve and assuming imperfect imaging that includes artifactual baseline signal drift and random error. With these few assumptions, rapid quantitative pharmacodynamic imaging reliably estimates EC50 from the simulated data, except when noise overwhelms the drug effect or when the effect occurs only at high doses. In preliminary fMRI studies of primate brain using a dopamine agonist, the observed noise level is modest compared with observed drug effects, and a quantitative EC50 can be obtained from some regional time-signal curves. Taken together, these results suggest that research and clinical applications for rapid quantitative pharmacodynamic imaging are realistic.
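
The key output, EC50, can be illustrated with an Emax dose-response model fitted by a simple grid search. This is a toy version of the estimation problem (hyperbolic curve with Hill coefficient 1, plateau crudely estimated from the largest observed effect), not the authors' pharmacokinetic-pharmacodynamic model:

```python
def emax_model(conc, emax, ec50):
    """Hyperbolic (Hill n = 1) dose-response: E = Emax * C / (EC50 + C)."""
    return emax * conc / (ec50 + conc)

def fit_ec50(concs, effects, ec50_grid):
    """Grid-search EC50 minimizing squared error; crude plateau = max effect.
    Assumes the dosing range actually reaches the plateau."""
    emax = max(effects)

    def sse(ec50):
        return sum((emax_model(c, emax, ec50) - y) ** 2
                   for c, y in zip(concs, effects))

    return min(ec50_grid, key=sse)

# Synthetic data from a true EC50 of 5 (Emax = 100, highest dose near plateau).
concs = [0.5, 1.0, 2.0, 5.0, 10.0, 50.0, 1e6]
effects = [100.0 * c / (5.0 + c) for c in concs]
ec50_hat = fit_ec50(concs, effects, ec50_grid=[1.0, 2.0, 5.0, 10.0, 20.0])
```

The abstract's failure modes show up directly in this toy: if noise swamps the effect the SSE surface flattens, and if all doses sit far below EC50 the grid candidates become indistinguishable.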

  20. A rapid, sensitive, and selective method for quantitation of lamprey migratory pheromones in river water.

    Science.gov (United States)

    Stewart, Michael; Baker, Cindy F; Cooney, Terry

    2011-11-01

    The methodology of using fish pheromones, or chemical signatures, as a tool to monitor or manage species of fish is rapidly gaining popularity. Unequivocal detection and accurate quantitation of extremely low concentrations of these chemicals in natural waters is paramount to using this technique as a management tool. Various species of lamprey are known to produce a mixture of three important migratory pheromones: petromyzonol sulfate (PS), petromyzonamine disulfate (PADS), and petromyzosterol disulfate (PSDS), but presently there are no established robust methods for quantitation of all three pheromones. In this study, we report a new, highly sensitive and selective method for the rapid identification and quantitation of these pheromones in river water samples. The procedure is based on pre-concentration, followed by liquid chromatography/tandem mass spectrometry (LC/MS/MS) analysis. The method is fast, with unambiguous pheromone determination. Practical quantitation limits of 0.25 ng/l were achieved for PS and PADS and 2.5 ng/l for PSDS in river water, using a 200-fold pre-concentration; however, lower quantitation limits can be achieved with greater pre-concentration. The methodology can be modified easily to include other chemicals of interest. Furthermore, the pre-concentration step can be applied easily in the field, circumventing potential stability issues of these chemicals.

  1. Effect of preflaring on Root ZX apex locators.

    Science.gov (United States)

    Ibarrola, J L; Chapman, B L; Howard, J H; Knowles, K I; Ludlow, M O

    1999-09-01

    The Root ZX apex locator is an example of a generation of apex locators that identify the terminus of the canal by measuring a ratio between two electrical impedances. Studies have shown this device to have a high degree of accuracy. However, the manufacturer warns that the performance of these devices is limited by the presence of calcifications and dentinal shaving obstructions. An in vitro study was designed to determine if preflaring of canals would facilitate the passage of files to the apical foramen by eliminating cervical interferences and to see what effect this would have on the performance of the Root ZX apex locator. Thirty-two canals were divided into two groups. Group 1 was not manipulated before use of the Root ZX apex locator and served as control. In group 2, the canals were preflared before the use of the Root ZX apex locator. The working length files were secured in place and measured with the linear measurement tool used by the Visilog 5 imaging program. Results of this study suggest that preflaring of canals will allow working length files to more consistently reach the apical foramen (p = 0.015), which in turn increases the efficacy of the Root ZX apex locator.

  2. Quantitative data analysis methods for bead-based DNA hybridization assays using generic flow cytometry platforms.

    Science.gov (United States)

    Corrie, S R; Lawrie, G A; Battersby, B J; Ford, K; Rühmann, A; Koehler, K; Sabath, D E; Trau, M

    2008-05-01

    Bead-based assays are in demand for rapid genomic and proteomic assays for both research and clinical purposes. Standard quantitative procedures addressing raw data quality and analysis are required to ensure the data are consistent and reproducible across laboratories, independent of flow platform. Quantitative procedures have been introduced spanning raw histogram analysis through to absolute target quantitation. These included models developed to estimate the absolute number of sample molecules bound per bead (Langmuir isotherm), relative quantitative comparisons (two-sided t-tests), and statistical analyses investigating the quality of raw fluorescence data. The absolute target quantitation method revealed a concentration range (below probe saturation) of Cy5-labeled synthetic cytokeratin 19 (K19) RNA of ca. 1 × 10^4 to 500 × 10^4 molecules/bead, with a binding constant of ca. 1.6 nM. Raw hybridization frequency histograms were observed to be highly reproducible across 10 triplex assay replicates, and only three assay replicates were required to distinguish overlapping peaks representing small sequence mismatches. This study provides a quantitative scheme for determining the absolute target concentration in nucleic acid hybridization reactions and the equilibrium binding constants for individual probe/target pairs. It is envisaged that such studies will form the basis of standard analytical procedures for bead-based cytometry assays to ensure reproducibility in inter- and intra-platform comparisons of data between laboratories. (c) 2008 International Society for Advancement of Cytometry.
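The Langmuir-isotherm estimate of molecules bound per bead is a standard nonlinear fit. In the sketch below the concentration series, noise model, and use of SciPy are illustrative assumptions, with the binding constant set near the reported ca. 1.6 nM.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(c_nM, b_max, k_d):
    """Langmuir isotherm: molecules bound per bead vs. free target concentration."""
    return b_max * c_nM / (k_d + c_nM)

rng = np.random.default_rng(1)
conc = np.array([0.05, 0.1, 0.25, 0.5, 1.0, 2.0, 5.0, 10.0, 25.0])  # nM (hypothetical series)
true_bmax, true_kd = 5.0e6, 1.6
bound = langmuir(conc, true_bmax, true_kd)
bound *= 1 + 0.03 * rng.standard_normal(conc.size)   # multiplicative measurement noise

(bmax_fit, kd_fit), _ = curve_fit(langmuir, conc, bound, p0=[1e6, 1.0])
```

The fitted `k_d` is the equilibrium binding constant and `b_max` the saturation level (the upper end of the below-saturation range reported in the abstract).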

  3. Combining qualitative and quantitative sampling, data collection, and analysis techniques in mixed-method studies.

    Science.gov (United States)

    Sandelowski, M

    2000-06-01

    Researchers have increasingly turned to mixed-method techniques to expand the scope and improve the analytic power of their studies. Yet there is still relatively little direction on and much confusion about how to combine qualitative and quantitative techniques. These techniques are neither paradigm- nor method-linked; researchers' orientations to inquiry and their methodological commitments will influence how they use them. Examples of sampling combinations include criterion sampling from instrument scores, random purposeful sampling, and stratified purposeful sampling. Examples of data collection combinations include the use of instruments for fuller qualitative description, for validation, as guides for purposeful sampling, and as elicitation devices in interviews. Examples of data analysis combinations include interpretively linking qualitative and quantitative data sets and the transformation processes of qualitizing and quantitizing.

  4. Methodological reporting in qualitative, quantitative, and mixed methods health services research articles.

    Science.gov (United States)

    Wisdom, Jennifer P; Cavaleri, Mary A; Onwuegbuzie, Anthony J; Green, Carla A

    2012-04-01

    Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles. All empirical articles (n = 1,651) published between 2003 and 2007 from four top-ranked health services journals were examined. All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed method reports). We used chi-square tests to evaluate differences between article types for each component. Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ²(1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1.27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ²(1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively). Few published health services research articles use mixed methods. The frequency of key methodological components is variable. Suggestions are provided to increase the transparency of mixed methods studies and…
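The comparison of component proportions rests on a chi-square test of a 2×2 contingency table. The counts below are hypothetical: they roughly mirror the reported 21.94 versus 47.07 percent split but are not the study's data, so the resulting statistic differs from the published χ².

```python
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table of methodological components:
# rows = article type, columns = component present / absent
table = [[68, 242],    # mixed methods reports
         [706, 794]]   # quantitative reports
chi2, p, dof, expected = chi2_contingency(table, correction=False)
# dof is 1 for a 2x2 table; a small p indicates the proportions differ
```

`chi2_contingency` also returns the expected counts under independence, which is a useful check that no cell is too sparse for the test to be valid.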

  5. A Method for Comprehensive Glycosite-Mapping and Direct Quantitation of Serum Glycoproteins.

    Science.gov (United States)

    Hong, Qiuting; Ruhaak, L Renee; Stroble, Carol; Parker, Evan; Huang, Jincui; Maverakis, Emanual; Lebrilla, Carlito B

    2015-12-04

    A comprehensive glycan map was constructed for the top eight abundant glycoproteins in plasma using both specific and nonspecific enzyme digestions followed by nano liquid chromatography (LC)-chip/quadrupole time-of-flight mass spectrometry (MS) analysis. Glycopeptides were identified using an in-house software tool, GPFinder. A sensitive and reproducible multiple reaction monitoring (MRM) technique on a triple quadrupole MS was developed and applied to quantify immunoglobulins G, A, M, and their site-specific glycans simultaneously and directly from human serum/plasma without protein enrichments. A total of 64 glycopeptides and 15 peptides were monitored for IgG, IgA, and IgM in a 20 min ultra high performance (UP)LC gradient. The absolute protein contents were quantified using peptide calibration curves. The glycopeptide ion abundances were normalized to the respective protein abundances to separate protein glycosylation from protein expression. This technique yields higher method reproducibility and less sample loss when compared with the quantitation method that involves protein enrichments. The absolute protein quantitation has a wide linear range (3-4 orders of magnitude) and low limit of quantitation (femtomole level). This rapid and robust quantitation technique, which provides quantitative information for both proteins and glycosylation, will further facilitate disease biomarker discoveries.
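The two quantitation steps described, absolute protein amount from a peptide calibration curve followed by glycoform normalization, can be sketched as below. All peak areas and amounts are hypothetical; the linear calibration is the standard approach, not the authors' exact procedure.

```python
import numpy as np

# Hypothetical peptide calibration curve: MRM peak area vs. amount injected (fmol)
std_amount = np.array([1.0, 5.0, 10.0, 50.0, 100.0])          # fmol
std_area = np.array([2.1e3, 9.8e3, 2.0e4, 1.01e5, 2.0e5])     # peak areas

slope, intercept = np.polyfit(std_amount, std_area, 1)

# Absolute protein content of a serum sample from its peptide peak area
sample_area = 5.2e4
protein_fmol = (sample_area - intercept) / slope

# Normalize a glycopeptide ion abundance to the protein abundance,
# separating site-specific glycosylation from protein expression
glycopeptide_area = 8.0e3
normalized_glycoform = glycopeptide_area / protein_fmol
```

The normalization step is what lets a change in a glycoform be attributed to glycosylation rather than to a change in the protein's overall concentration.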

  6. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods.

    Science.gov (United States)

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-04-07

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended upon the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest

  8. Integrating Quantitative and Qualitative Data in Mixed Methods Research--Challenges and Benefits

    Science.gov (United States)

    Almalki, Sami

    2016-01-01

    This paper is concerned with investigating the integration of quantitative and qualitative data in mixed methods research and whether, in spite of its challenges, it can be of positive benefit to many investigative studies. The paper introduces the topic, defines the terms with which this subject deals and undertakes a literature review to outline…

  9. Implementation of a quantitative Foucault knife-edge method by means of isophotometry

    Science.gov (United States)

    Zhevlakov, A. P.; Zatsepina, M. E.; Kirillovskii, V. K.

    2014-06-01

    Detailed description of stages of computer processing of the shadowgrams during implementation of a modern quantitative Foucault knife-edge method is presented. The map of wave-front aberrations introduced by errors of an optical surface or a system, along with the results of calculation of the set of required characteristics of image quality, are shown.

  10. A GC-FID method for quantitative analysis of N,N-carbonyldiimidazole.

    Science.gov (United States)

    Lee, Claire; Mangion, Ian

    2016-03-20

    N,N-Carbonyldiimidazole (CDI), a common synthetic reagent used in commercial scale pharmaceutical synthesis, is known to be sensitive to hydrolysis from ambient moisture. This liability demands a simple, robust analytical method to quantitatively determine reagent quality to ensure reproducible performance in chemical reactions. This work describes a protocol for a rapid GC-FID based analysis of CDI.

  11. Overcoming Methods Anxiety: Qualitative First, Quantitative Next, Frequent Feedback along the Way

    Science.gov (United States)

    Bernstein, Jeffrey L.; Allen, Brooke Thomas

    2013-01-01

    Political Science research methods courses face two problems. First is what to cover, as there are too many techniques to explore in any one course. Second is dealing with student anxiety around quantitative material. We explore a novel way to approach these issues. Our students began by writing a qualitative paper. They followed with a term…

  12. Examining Stress in Graduate Assistants: Combining Qualitative and Quantitative Survey Methods

    Science.gov (United States)

    Mazzola, Joseph J.; Walker, Erin J.; Shockley, Kristen M.; Spector, Paul E.

    2011-01-01

    The aim of this study was to employ qualitative and quantitative survey methods in a concurrent mixed model design to assess stressors and strains in graduate assistants. The stressors most frequently reported qualitatively were work overload, interpersonal conflict, and organizational constraints; the most frequently reported psychological…

  13. Method for Quantitative Determination of Spatial Polymer Distribution in Alginate Beads Using Raman Spectroscopy

    NARCIS (Netherlands)

    Heinemann, Matthias; Meinberg, Holger; Büchs, Jochen; Koß, Hans-Jürgen; Ansorge-Schumacher, Marion B.

    2005-01-01

    A new method based on Raman spectroscopy is presented for non-invasive, quantitative determination of the spatial polymer distribution in alginate beads of approximately 4 mm diameter. With the experimental setup, a two-dimensional image is created along a thin measuring line through the bead…

  14. Qualitative and Quantitative Research Methods: Old Wine in New Bottles? On Understanding and Interpreting Educational Phenomena

    Science.gov (United States)

    Smeyers, Paul

    2008-01-01

    Generally educational research is grounded in the empirical traditions of the social sciences (commonly called quantitative and qualitative methods) and is as such distinguished from other forms of scholarship such as theoretical, conceptual or methodological essays, critiques of research traditions and practices and those studies grounded in the…

  15. Objective evaluation of reconstruction methods for quantitative SPECT imaging in the absence of ground truth

    Science.gov (United States)

    Jha, Abhinav K.; Song, Na; Caffo, Brian; Frey, Eric C.

    2015-03-01

    Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective method to evaluate these systems is by comparing their performance in the end task that is required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e. we have a gold standard. However, very rarely is this gold standard known in human studies. Thus, no-gold-standard techniques to optimize and evaluate systems and algorithms in the absence of gold standard are required. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output.

  17. Virtualising the Quantitative Research Methods Course: An Island-Based Approach

    Science.gov (United States)

    Baglin, James; Reece, John; Baker, Jenalle

    2015-01-01

    Many recent improvements in pedagogical practice have been enabled by the rapid development of innovative technologies, particularly for teaching quantitative research methods and statistics. This study describes the design, implementation, and evaluation of a series of specialised computer laboratory sessions. The sessions combined the use of an…

  18. An Elephant in the Room: Bias in Evaluating a Required Quantitative Methods Course

    Science.gov (United States)

    Fletcher, Joseph F.; Painter-Main, Michael A.

    2014-01-01

    Undergraduate Political Science programs often require students to take a quantitative research methods course. Such courses are typically among the most poorly rated. This can be due, in part, to the way in which courses are evaluated. Students are generally asked to provide an overall rating, which, in turn, is widely used by students, faculty,…

  20. Examination of Quantitative Methods Used in Early Intervention Research: Linkages with Recommended Practices.

    Science.gov (United States)

    Snyder, Patricia; Thompson, Bruce; McLean, Mary E.; Smith, Barbara J.

    2002-01-01

    Findings are reported related to the research methods and statistical techniques used in 450 group quantitative studies examined by the Council for Exceptional Children's Division for Early Childhood Recommended Practices Project. Studies were analyzed across seven dimensions including sampling procedures, variable selection, variable definition,…

  1. Quantitative comparison of analysis methods for spectroscopic optical coherence tomography: reply to comment

    NARCIS (Netherlands)

    Bosschaart, Nienke; van Leeuwen, Ton; Aalders, Maurice C.G.; Faber, Dirk

    2014-01-01

    We reply to the comment by Kraszewski et al on “Quantitative comparison of analysis methods for spectroscopic optical coherence tomography.” We present additional simulations evaluating the proposed window function. We conclude that our simulations show good qualitative agreement with the results of

  2. New Performance Metrics for Quantitative Polymerase Chain Reaction-Based Microbial Source Tracking Methods

    Science.gov (United States)

    Binary sensitivity and specificity metrics are not adequate to describe the performance of quantitative microbial source tracking methods because the estimates depend on the amount of material tested and limit of detection. We introduce a new framework to compare the performance ...

  5. Counting Better? An Examination of the Impact of Quantitative Method Teaching on Statistical Anxiety and Confidence

    Science.gov (United States)

    Chamberlain, John Martyn; Hillier, John; Signoretta, Paola

    2015-01-01

    This article reports the results of research concerned with students' statistical anxiety and confidence to both complete and learn to complete statistical tasks. Data were collected at the beginning and end of a quantitative method statistics module. Students recognised the value of numeracy skills but felt they were not necessarily relevant for…

  6. Are Teacher Course Evaluations Biased against Faculty That Teach Quantitative Methods Courses?

    Science.gov (United States)

    Royal, Kenneth D.; Stockdale, Myrah R.

    2015-01-01

    The present study investigated graduate students' responses to teacher/course evaluations (TCE) to determine if students' responses were inherently biased against faculty who teach quantitative methods courses. Item response theory (IRT) and Differential Item Functioning (DIF) techniques were utilized for data analysis. Results indicate students…

  7. Paradigms Lost and Pragmatism Regained: Methodological Implications of Combining Qualitative and Quantitative Methods

    Science.gov (United States)

    Morgan, David L.

    2007-01-01

    This article examines several methodological issues associated with combining qualitative and quantitative methods by comparing the increasing interest in this topic with the earlier renewal of interest in qualitative research during the 1980s. The first section argues for the value of Kuhn's concept of paradigm shifts as a tool for examining…

  8. Reliability of a semi-quantitative method for dermal exposure assessment (DREAM)

    NARCIS (Netherlands)

    Wendel de Joode, B. van; Hemmen, J.J. van; Meijster, T.; Major, V.; London, L.; Kromhout, H.

    2005-01-01

    Valid and reliable semi-quantitative dermal exposure assessment methods for epidemiological research and for occupational hygiene practice, applicable for different chemical agents, are practically nonexistent. The aim of this study was to assess the reliability of a recently developed semi-quantitative…

  9. MULTI-PEAK MATCH INTENSITY RATIO METHOD OF QUANTITATIVE X-RAY DIFFRACTION PHASE ANALYSIS

    Institute of Scientific and Technical Information of China (English)

    G. Chu; Y.F. Cong; H.J. You

    2003-01-01

    A new method for quantitative phase analysis based on the X-ray diffraction multi-peak match intensity ratio is proposed. The multi-peak match intensity ratio among the phases in a mixture sample is obtained by least-squares regression of all diffraction peaks in the sample's X-ray diffraction spectrum against the relative intensity distributions of each phase's standard peaks in the JCPDS cards. Replacing the usually adopted single-line intensity ratio with this multi-peak match intensity ratio improves the precision of quantitative phase analysis of mixture samples. Analysis of four groups of mixture samples, using the multi-peak match intensity ratio together with the adiabatic and matrix-flushing principles of quantitative X-ray diffraction phase analysis, showed that the experimental results agree with theory.
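The least-squares regression over all matched peaks can be viewed as a linear system: each observed peak intensity is a weighted sum of the phases' reference intensity patterns, with the weights giving the phase fractions. The patterns and fractions below are invented for illustration and are not from the cited study.

```python
import numpy as np

# Hypothetical reference intensities: rows = diffraction peaks, columns = phases;
# each column is the relative peak-intensity pattern of one pure phase (JCPDS-style)
ref = np.array([
    [100.0,   0.0],
    [ 40.0,  60.0],
    [  0.0, 100.0],
    [ 20.0,  30.0],
])

true_frac = np.array([0.7, 0.3])
observed = ref @ true_frac
observed *= 1 + 0.02 * np.random.default_rng(2).standard_normal(observed.size)

# Least-squares regression over all matched peaks, then normalize to fractions
raw, *_ = np.linalg.lstsq(ref, observed, rcond=None)
frac = raw / raw.sum()
```

Using every matched peak rather than a single line is what over-determines the system and, as the abstract argues, improves the precision of the recovered fractions.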

  10. A method for the rapid qualitative and quantitative analysis of 4,4-dimethyl sterols.

    Science.gov (United States)

    Gibbons, G F; Mitropoulos, K A; Ramananda, K

    1973-09-01

    A simple and relatively rapid technique has been developed for the separation of several 4,4-dimethyl steryl acetates, some of which contain sterically hindered nuclear double bonds. The method involves thin-layer chromatography on silver nitrate-impregnated silica gel and silver nitrate-impregnated alumina. The separated steryl acetates may then be analyzed quantitatively by gas-liquid chromatography.

  11. A Rapid and Sensitive Method for the Quantitation of Legionella Pneumophila Antigen from Human Urine.

    Science.gov (United States)

    1980-11-12

    A reverse passive hemagglutination test was developed to assay concentrations of soluble antigen of Legionnaires' Disease (Legionella pneumophila) in human urine. Authors: Joseph A. Mangiafico, Kenneth W. Hedlund, and Allen R. Knott. Approved for public release; distribution unlimited.

  12. Potential Guidelines for Conducting and Reporting Environmental Education Research: Quantitative Methods of Inquiry.

    Science.gov (United States)

    Smith-Sebasto, N. J.

    2001-01-01

    Presents potential guidelines for conducting and reporting environmental education research using quantitative methods of inquiry that were developed during a 10-hour (1-1/2 day) workshop sponsored by the North American Commission on Environmental Education Research during the 1998 annual meeting of the North American Association for Environmental…

  13. A thioacidolysis method tailored for higher-throughput quantitative analysis of lignin monomers.

    Science.gov (United States)

    Harman-Ware, Anne E; Foster, Cliff; Happs, Renee M; Doeppke, Crissa; Meunier, Kristoffer; Gehan, Jackson; Yue, Fengxia; Lu, Fachuang; Davis, Mark F

    2016-10-01

    Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β-O-4 linkages. Current thioacidolysis methods are low-throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment and uses a non-chlorinated organic solvent and is tailored for higher-throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1-2 mg of biomass per assay and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentrating and drying have been eliminated to help aid in consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. The method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses. © 2016 The Authors. Biotechnology Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. A thioacidolysis method tailored for higher-throughput quantitative analysis of lignin monomers

    Energy Technology Data Exchange (ETDEWEB)

    Harman-Ware, Anne E. [Bioenergy Science Center, Golden CO USA; National Bioenergy Center, National Renewable Energy Laboratory, Golden CO USA; Foster, Cliff [Great Lakes BioEnergy Research Center, Michigan State University, East Lansing MI USA; Happs, Renee M. [Bioenergy Science Center, Golden CO USA; National Bioenergy Center, National Renewable Energy Laboratory, Golden CO USA; Doeppke, Crissa [Bioenergy Science Center, Golden CO USA; National Bioenergy Center, National Renewable Energy Laboratory, Golden CO USA; Meunier, Kristoffer [Great Lakes BioEnergy Research Center, Michigan State University, East Lansing MI USA; Gehan, Jackson [Great Lakes BioEnergy Research Center, Michigan State University, East Lansing MI USA; Yue, Fengxia [Wisconsin Bioenergy Initiative, University of Wisconsin, Madison WI USA; Lu, Fachuang [Wisconsin Bioenergy Initiative, University of Wisconsin, Madison WI USA; Davis, Mark F. [Bioenergy Science Center, Golden CO USA; National Bioenergy Center, National Renewable Energy Laboratory, Golden CO USA

    2016-09-14

    Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β-O-4 linkages. Current thioacidolysis methods are low-throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment and uses a non-chlorinated organic solvent and is tailored for higher-throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1-2 mg of biomass per assay and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentrating and drying have been eliminated to help aid in consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. The method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses.

  15. Bayesian data augmentation methods for the synthesis of qualitative and quantitative research findings

    Science.gov (United States)

    Crandell, Jamie L.; Voils, Corrine I.; Chang, YunKyung; Sandelowski, Margarete

    2010-01-01

    The possible utility of Bayesian methods for the synthesis of qualitative and quantitative research has been repeatedly suggested but insufficiently investigated. In this project, we developed and used a Bayesian method for synthesis, with the goal of identifying factors that influence adherence to HIV medication regimens. We investigated the effect of 10 factors on adherence. Recognizing that not all factors were examined in all studies, we considered standard methods for dealing with missing data and chose a Bayesian data augmentation method. We were able to summarize, rank, and compare the effects of each of the 10 factors on medication adherence. This is a promising methodological development in the synthesis of qualitative and quantitative research. PMID:21572970
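
The augmentation idea, alternating between imputing missing values from the current model and updating parameters from the completed data, can be sketched with a toy Gibbs sampler. The normal model, effect scale, and missingness pattern below are illustrative assumptions, not the authors' actual synthesis model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated adherence effect estimates: true mean 0.7, known sd 0.1
true_mu, sigma = 0.7, 0.1
y = rng.normal(true_mu, sigma, size=40)
y[:10] = np.nan  # pretend 10 studies did not report this factor

observed = y[~np.isnan(y)]
n_total = y.size
n_miss = n_total - observed.size

# Gibbs sampler with data augmentation: alternately impute the missing
# values from the current model, then update mu from the completed data.
mu = float(observed.mean())
draws = []
for it in range(2000):
    # Augmentation step: fill in missing values given current mu
    y_miss = rng.normal(mu, sigma, size=n_miss)
    y_full = np.concatenate([observed, y_miss])
    # Posterior for mu with a flat prior: Normal(mean(y_full), sigma^2/n)
    mu = rng.normal(y_full.mean(), sigma / np.sqrt(n_total))
    if it >= 500:  # discard burn-in
        draws.append(mu)

posterior_mean = float(np.mean(draws))
```

In this simple setting the augmented chain converges to the observed-data posterior; the value of the machinery is that the same alternation extends to the multi-factor, partially reported designs the abstract describes.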

  16. A method for the extraction and quantitation of phycoerythrin from algae

    Science.gov (United States)

    Stewart, D. E.

    1982-01-01

    A summary of a new technique for the extraction and quantitation of phycoerythrin (PHE) from algal samples is described. Results of the analysis of four extracts representing three PHE types from algae, including cryptomonad and cyanophyte types, are presented. The method of extraction and an equation for quantitation are given. A graph showing the relationship between concentration and fluorescence units is provided; it may be used with samples fluorescing around 575-580 nm (probably dominated by cryptophytes in estuarine waters) and 560 nm (dominated by cyanophytes characteristic of the open ocean).
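
The record's equation and graph are not reproduced here, so the sketch below only illustrates the general idea of converting fluorescence units to PHE concentration through a linear calibration; the calibration values are hypothetical.

```python
import numpy as np

# Hypothetical calibration standards for one PHE type: fluorescence units
# vs. known phycoerythrin concentration (ug/L); values are illustrative only.
fluor = np.array([10.0, 20.0, 40.0, 80.0, 160.0])
conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])

# Least-squares line through the calibration points
slope, intercept = np.polyfit(fluor, conc, 1)

def phe_concentration(f):
    """Estimate PHE concentration (ug/L) from a fluorescence reading."""
    return slope * f + intercept

estimate = phe_concentration(60.0)
```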

  17. Quantitative measurement of displacement and strain by the numerical moiré method

    Institute of Scientific and Technical Information of China (English)

    Chunwang Zhao; Yongming Xing; Pucun Bai; Lifu Wang

    2008-01-01

    A numerical moiré method with a sensitivity as high as 0.03 nm is presented. A quantitative displacement and strain analysis program based on this method has been developed and applied to an edge dislocation and a stacking fault in aluminum. The measured strain of the edge dislocation is compared with the theoretical prediction given by the Peierls-Nabarro dislocation model. The displacement of the stacking fault is also obtained.

  18. Analysis of Preoperative Detection for Apex Prostate Cancer by Transrectal Biopsy

    OpenAIRE

    Tomokazu Sazuka; Takashi Imamoto; Takeshi Namekawa; Takanobu Utsumi; Mitsuru Yanagisawa; Koji Kawamura; Naoto Kamiya; Hiroyoshi Suzuki; Takeshi Ueda; Satoshi Ota; Yukio Nakatani; Tomohiko Ichikawa

    2013-01-01

    Background. The aim of this study was to determine concordance rates for prostatectomy specimens and transrectal needle biopsy samples in various areas of the prostate in order to assess diagnostic accuracy of the transrectal biopsy approach, especially for presurgical detection of cancer in the prostatic apex. Materials and Methods. From 2006 to 2011, 158 patients whose radical prostatectomy specimens had been evaluated were retrospectively enrolled in this study. Concordance rates for h...

  19. Linking multidimensional functional diversity to quantitative methods: a graphical hypothesis-evaluation framework.

    Science.gov (United States)

    Boersma, Kate S; Dee, Laura E; Miller, Steve J; Bogan, Michael T; Lytle, David A; Gitelman, Alix I

    2016-03-01

    Functional trait analysis is an appealing approach to study differences among biological communities because traits determine species' responses to the environment and their impacts on ecosystem functioning. Despite a rapidly expanding quantitative literature, it remains challenging to conceptualize concurrent changes in multiple trait dimensions ("trait space") and select quantitative functional diversity methods to test hypotheses prior to analysis. To address this need, we present a widely applicable framework for visualizing ecological phenomena in trait space to guide the selection, application, and interpretation of quantitative functional diversity methods. We describe five hypotheses that represent general patterns of responses to disturbance in functional community ecology and then apply a formal decision process to determine appropriate quantitative methods to test ecological hypotheses. As a part of this process, we devise a new statistical approach to test for functional turnover among communities. Our combination of hypotheses and metrics can be applied broadly to address ecological questions across a range of systems and study designs. We illustrate the framework with a case study of disturbance in freshwater communities. This hypothesis-driven approach will increase the rigor and transparency of applied functional trait studies.
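
As one concrete example of the kind of quantitative functional diversity metric the framework helps select, the sketch below computes an unweighted functional dispersion (mean distance of species to the community trait centroid) for two hypothetical communities; the trait values and the simplified, abundance-free form of the FDis metric are assumptions for illustration.

```python
import numpy as np

def functional_dispersion(traits):
    """Mean Euclidean distance of species to the community trait centroid
    (an unweighted analogue of the FDis metric)."""
    traits = np.asarray(traits, dtype=float)
    centroid = traits.mean(axis=0)
    return float(np.linalg.norm(traits - centroid, axis=1).mean())

# Two hypothetical communities in a 2-trait space (e.g. body size, dispersal)
pre_disturbance = [[1.0, 2.0], [3.0, 0.5], [0.2, 3.0], [2.5, 2.5]]
post_disturbance = [[1.4, 1.9], [1.6, 2.1], [1.5, 2.0]]  # traits converged

fdis_pre = functional_dispersion(pre_disturbance)
fdis_post = functional_dispersion(post_disturbance)
```

A drop in dispersion after disturbance, as constructed here, corresponds to the trait-convergence hypothesis among the response patterns the framework formalizes.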

  20. The LASSO and sparse least square regression methods for SNP selection in predicting quantitative traits.

    Science.gov (United States)

    Feng, Zeny Z; Yang, Xiaojian; Subedi, Sanjeena; McNicholas, Paul D

    2012-01-01

    Recent work concerning quantitative traits of interest has focused on selecting a small subset of single nucleotide polymorphisms (SNPs) from amongst the SNPs responsible for the phenotypic variation of the trait. When considered as covariates, the large number of variables (SNPs) and their association with those in close proximity pose challenges for variable selection. The features of sparsity and shrinkage of regression coefficients of the least absolute shrinkage and selection operator (LASSO) method appear attractive for SNP selection. Sparse partial least squares (SPLS) is also appealing as it combines the features of sparsity in subset selection and dimension reduction to handle correlations amongst SNPs. In this paper we investigate application of the LASSO and SPLS methods for selecting SNPs that predict quantitative traits. We evaluate the performance of both methods with different criteria and under different scenarios using simulation studies. Results indicate that these methods can be effective in selecting SNPs that predict quantitative traits but are limited by some conditions. Both methods perform similarly overall but each exhibits advantages over the other in given situations. Both methods are applied to Canadian Holstein cattle data to compare their performance.
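
A minimal illustration of LASSO-based SNP selection, using a hand-rolled cyclic coordinate descent with soft-thresholding on simulated 0/1/2 genotypes (not the paper's data or software); the penalty value and simulation settings are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated genotypes: 200 individuals x 30 SNPs coded 0/1/2
X = rng.integers(0, 3, size=(200, 30)).astype(float)
beta_true = np.zeros(30)
beta_true[[0, 1]] = [1.5, -1.0]          # only SNPs 0 and 1 affect the trait
y = X @ beta_true + rng.normal(0, 0.5, 200)

# Center, then minimize (1/2)||y - Xb||^2 + lam*||b||_1 by cyclic
# coordinate descent; each update is a soft-thresholding step.
Xc = X - X.mean(axis=0)
yc = y - y.mean()
lam = 30.0
beta = np.zeros(30)
for _ in range(200):
    for j in range(30):
        r = yc - Xc @ beta + Xc[:, j] * beta[j]   # partial residual
        rho = Xc[:, j] @ r
        beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / (Xc[:, j] @ Xc[:, j])

selected = np.flatnonzero(np.abs(beta) > 1e-6).tolist()
```

With this penalty the irrelevant coefficients are shrunk exactly to zero while the two causal SNPs are retained, which is the sparsity property that makes LASSO attractive for SNP selection.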

  1. A Case Study Concerning Sales Prediction Using Sales Quantitative Prediction Methods

    Directory of Open Access Journals (Sweden)

    Simona Elena Dragomirescu

    2010-08-01

    Full Text Available Sales condition the entire activity of an enterprise, and their variation is considered the main risk factor for the performance and financial position of the enterprise. The importance of elaborating such a budget lies in: (a) in the long term, establishing the investment and financing plans; (b) in the medium term, establishing the publicity and promotion expenses budget; and (c) in the short term, determining the production level and supply program and optimizing the labor force. Several methods exist for planning the sales volume, among them causal, non-causal, direct, indirect, judgmental and statistical methods. All these methods have advantages and disadvantages. Quantitative methods are those that base predictions on numerical statistical data. Linear adjustment and correlation may be applied to study general tendencies in sales evolution when the tendency is linear.
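
For the linear-adjustment case the article mentions, a least-squares trend fit and a one-step forecast can be sketched as follows; the monthly figures are invented for illustration.

```python
import numpy as np

# Twelve months of hypothetical sales with a linear upward tendency
months = np.arange(1, 13)
sales = np.array([100, 104, 108, 112, 116, 120, 124, 128, 132, 136, 140, 144],
                 dtype=float)

# Linear adjustment: fit sales = a*month + b by least squares
a, b = np.polyfit(months, sales, 1)

# Predict the sales volume for month 13
forecast = a * 13 + b
```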

  2. Combining qualitative and quantitative research within mixed method research designs: a methodological review.

    Science.gov (United States)

    Östlund, Ulrika; Kidd, Lisa; Wengström, Yvonne; Rowa-Dewar, Neneh

    2011-03-01

    It has been argued that mixed methods research can be useful in nursing and health science because of the complexity of the phenomena studied. However, the integration of qualitative and quantitative approaches continues to be one of much debate and there is a need for a rigorous framework for designing and interpreting mixed methods research. This paper explores the analytical approaches (i.e. parallel, concurrent or sequential) used in mixed methods studies within healthcare and exemplifies the use of triangulation as a methodological metaphor for drawing inferences from qualitative and quantitative findings originating from such analyses. This review of the literature used systematic principles in searching CINAHL, Medline and PsycINFO for healthcare research studies which employed a mixed methods approach and were published in the English language between January 1999 and September 2009. In total, 168 studies were included in the results. Most studies originated in the United States of America (USA), the United Kingdom (UK) and Canada. The analytic approach most widely used was parallel data analysis. A number of studies used sequential data analysis; far fewer studies employed concurrent data analysis. Very few of these studies clearly articulated the purpose for using a mixed methods design. The use of the methodological metaphor of triangulation on convergent, complementary, and divergent results from mixed methods studies is exemplified and an example of developing theory from such data is provided. A trend for conducting parallel data analysis on quantitative and qualitative data in mixed methods healthcare research has been identified in the studies included in this review. Using triangulation as a methodological metaphor can facilitate the integration of qualitative and quantitative findings, help researchers to clarify their theoretical propositions and the basis of their results. This can offer a better understanding of the links between theory and

  3. Integrated Geophysical Methods Applied to Geotechnical and Geohazard Engineering: From Qualitative to Quantitative Analysis and Interpretation

    Science.gov (United States)

    Hayashi, K.

    2014-12-01

    The Near-Surface is a region of day-to-day human activity on the earth. It is exposed to the natural phenomena which sometimes cause disasters. This presentation covers a broad spectrum of the geotechnical and geohazard ways of mitigating disaster and conserving the natural environment using geophysical methods and emphasizes the contribution of geophysics to such issues. The presentation focuses on the usefulness of geophysical surveys in providing information to mitigate disasters, rather than the theoretical details of a particular technique. Several techniques are introduced at the level of concept and application. Topics include various geohazard and geoenvironmental applications, such as earthquake disaster mitigation, preventing floods triggered by torrential rain, environmental conservation and studying the effect of global warming. Among the geophysical techniques, the active and passive surface wave, refraction and resistivity methods are mainly highlighted. Together with the geophysical techniques, several related issues, such as performance-based design, standardization or regularization, internet access and databases are also discussed. The presentation discusses the application of geophysical methods to engineering investigations from a non-uniqueness point of view and introduces the concepts of "integrated" and "quantitative". Most geophysical analyses are essentially non-unique and it is very difficult to obtain unique and reliable engineering solutions from only one geophysical method (Fig. 1). The only practical way to improve the reliability of an investigation is the joint use of several geophysical and geotechnical investigation methods, an integrated approach to geophysics. The result of a geophysical method is generally vague: "here is a high-velocity layer, it may be bedrock", "this low-resistivity section may contain clayey soils". Such vague, qualitative and subjective interpretation is of little worth in general engineering design work.

  4. Quantitation of genistein and genistin in soy dry extracts by UV-Visible spectrophotometric method

    Directory of Open Access Journals (Sweden)

    Isabela da Costa César

    2008-01-01

    Full Text Available This paper describes the development and validation of a UV-Visible spectrophotometric method for quantitation of genistein and genistin in soy dry extracts, after reaction with aluminum chloride. The method was shown to be linear (r² = 0.9999), precise (R.S.D. < 2%), accurate (recovery of 101.56%) and robust. Seven samples of soy dry extracts were analyzed by the validated spectrophotometric method and by RP-HPLC. Genistein concentrations determined by spectrophotometry (0.63% - 16.05%) were slightly higher than values obtained by HPLC analysis (0.40% - 12.79%); however, the results of both methods showed a strong correlation.
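
A typical calibration-curve computation behind such spectrophotometric quantitation can be sketched as below; the standard concentrations, absorbances, and sample reading are hypothetical, not the paper's data.

```python
import numpy as np

# Hypothetical genistein standards: concentration (ug/mL) vs. absorbance
# after the aluminum chloride reaction; values are illustrative only.
conc_std = np.array([2.0, 4.0, 8.0, 16.0, 32.0])
absorbance = np.array([0.10, 0.20, 0.40, 0.80, 1.60])

# Beer-Lambert behaviour in the linear range: A = m*c + b
m, b = np.polyfit(conc_std, absorbance, 1)
r2 = np.corrcoef(conc_std, absorbance)[0, 1] ** 2  # linearity check

# Quantify an extract sample from its measured absorbance
sample_abs = 0.55
sample_conc = (sample_abs - b) / m
```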

  5. Quantitative data analysis methods for 3D microstructure characterization of Solid Oxide Cells

    DEFF Research Database (Denmark)

    Jørgensen, Peter Stanley

    through percolating networks and reaction rates at the triple phase boundaries. Quantitative analysis of microstructure is thus important both in research and development of optimal microstructure design and fabrication. Three dimensional microstructure characterization in particular holds great promise...... for gaining further fundamental understanding of how microstructure affects performance. In this work, methods for automatic 3D characterization of microstructure are studied: from the acquisition of 3D image data by focused ion beam tomography to the extraction of quantitative measures that characterize...... the microstructure. The methods are exemplified by the analysis of Ni-YSZ and LSC-CGO electrode samples. Automatic methods for preprocessing the raw 3D image data are developed. The preprocessing steps correct for errors introduced by the image acquisition by the focused ion beam serial sectioning. Alignment...

  6. Quantitative interferometric microscopy with two dimensional Hilbert transform based phase retrieval method

    Science.gov (United States)

    Wang, Shouyu; Yan, Keding; Xue, Liang

    2017-01-01

    In order to obtain high contrast images and detailed descriptions of label-free samples, quantitative interferometric microscopy combined with phase retrieval is designed to obtain sample phase distributions from fringes. Since the accuracy and efficiency of the recovered phases are affected by the phase retrieval method, approaches offering higher precision and faster processing speed are still in demand. Here, a two-dimensional Hilbert transform based phase retrieval method is adopted for cellular phase imaging; it not only preserves more sample detail than the classical fast Fourier transform based method, but also overcomes the disadvantage of the traditional Hilbert transform algorithm, whose one-dimensional processing causes phase ambiguities. Both simulations and experiments are provided, proving that the proposed phase retrieval approach can acquire quantitative sample phases with high accuracy and fast speed.
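
As a simplified, one-dimensional analogue of the fringe-demodulation step (the paper's method is two-dimensional), the sketch below recovers a known phase bump from a cosine fringe via an FFT-implemented Hilbert/analytic-signal transform; the carrier frequency and phase profile are arbitrary test values.

```python
import numpy as np

N = 2048
x = np.linspace(0.0, 1.0, N, endpoint=False)
f0 = 64.0                                      # carrier fringe frequency
phi = 1.0 * np.exp(-((x - 0.5) ** 2) / 0.02)   # "sample" phase bump (rad)
fringe = np.cos(2 * np.pi * f0 * x + phi)

# Analytic signal via the FFT (a one-dimensional Hilbert transform):
# double the positive frequencies, zero the negative ones.
F = np.fft.fft(fringe)
H = np.zeros(N)
H[0] = 1.0
H[1:N // 2] = 2.0
H[N // 2] = 1.0
analytic = np.fft.ifft(F * H)

# Unwrap the phase and subtract the linear carrier term
phase = np.unwrap(np.angle(analytic)) - 2 * np.pi * f0 * x
phase -= phase[0] - phi[0]                     # remove the constant offset
max_err = float(np.max(np.abs(phase - phi)))
```

Because the bump is slowly varying relative to the carrier, the recovered phase tracks the true profile closely; the 2D extension in the paper is what removes the directional ambiguities of this 1D processing.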

  7. Preliminary researches to standardize a method of quantitative analysis on Lactobacillus acidophilus in poultry feed

    Directory of Open Access Journals (Sweden)

    Daniele Gallazzi

    2010-01-01

    Full Text Available The study focuses on the method and the problems of quantitative analysis of Lactobacillus acidophilus after its addition to commercial poultry feed, whose coarse grinding makes it unsuitable for the “IDF Standard quantitative method for lactic acid bacteria count at 37°C” employed for dairy products. Poultry feed was prepared every month. A sample was collected before and after adding Lactobacillus acidophilus, and analyses were carried out at T0 and at 15 and 28 days of food storage at 4-6°C. The best outcomes (over 30% more recovered cells compared to the standard method) resulted from samples subjected to homogenization and the addition of Skim Milk Powder.

  8. [Development and validation of event-specific quantitative PCR method for genetically modified maize LY038].

    Science.gov (United States)

    Mano, Junichi; Masubuchi, Tomoko; Hatano, Shuko; Futo, Satoshi; Koiwa, Tomohiro; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Akiyama, Hiroshi; Teshima, Reiko; Kurashima, Takeyo; Takabatake, Reona; Kitta, Kazumi

    2013-01-01

    In this article, we report a novel real-time PCR-based analytical method for quantitation of the GM maize event LY038. We designed LY038-specific and maize endogenous reference DNA-specific PCR amplifications. After confirming the specificity and linearity of the LY038-specific PCR amplification, we determined the conversion factor required to calculate the weight-based content of GM organism (GMO) in a multilaboratory evaluation. Finally, in order to validate the developed method, an interlaboratory collaborative trial according to the internationally harmonized guidelines was performed with blind DNA samples containing LY038 at the mixing levels of 0, 0.5, 1.0, 5.0 and 10.0%. The precision of the method was evaluated as the RSD of reproducibility (RSDR), and the values obtained were all less than 25%. The limit of quantitation of the method was judged to be 0.5% based on the definition of ISO 24276 guideline. The results from the collaborative trial suggested that the developed quantitative method would be suitable for practical testing of LY038 maize.
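
The weight-based GMO content calculation implied by the conversion-factor approach can be sketched as follows; the copy numbers and conversion factor below are hypothetical, and the formula shown is the commonly used copy-ratio form, not necessarily the exact expression used in the study.

```python
# Weight-based GMO content from a duplex real-time PCR measurement.
# Cf is the event/endogenous copy-number ratio measured in pure GM material.

def gmo_content_percent(event_copies, endogenous_copies, cf):
    """GMO content (% w/w) = (event/endogenous copy ratio) / Cf * 100."""
    return (event_copies / endogenous_copies) / cf * 100.0

# Hypothetical example: 450 LY038 copies vs. 60,000 maize reference
# copies, with an assumed conversion factor of 0.30
content = gmo_content_percent(450.0, 60000.0, 0.30)
```

With these illustrative numbers the sample would be reported at 2.5% GMO, well above the method's 0.5% limit of quantitation.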

  9. Quantitative impact characterization of aeronautical CFRP materials with non-destructive testing methods

    Science.gov (United States)

    Kiefel, Denis; Stoessel, Rainer; Grosse, Christian

    2015-03-01

    In recent years, an increasing number of safety-relevant structures are designed and manufactured from carbon fiber reinforced polymers (CFRP) in order to reduce the weight of airplanes by taking advantage of their specific strength. Non-destructive testing (NDT) methods for quantitative defect analysis of damages are liquid- or air-coupled ultrasonic testing (UT), phased array ultrasonic techniques, and active thermography (IR). The advantage of these testing methods is their applicability to large areas. However, their quantitative information is often limited to impact localization and size. In addition to these techniques, Airbus Group Innovations operates a micro x-ray computed tomography (μ-XCT) system, which was developed for CFRP characterization. It is an open system which allows different kinds of acquisition, reconstruction, and data evaluation. One main advantage of this μ-XCT system is its high resolution with 3-dimensional analysis and visualization opportunities, which makes it possible to gain important quantitative information for composite part design and stress analysis. Within this study, different NDT methods are compared on CFRP samples with specified artificial impact damages. The results can be used to select the most suitable NDT method for specific application cases. Furthermore, novel evaluation and visualization methods for impact analysis are developed and presented.

  10. Quantitative impact characterization of aeronautical CFRP materials with non-destructive testing methods

    Energy Technology Data Exchange (ETDEWEB)

    Kiefel, Denis, E-mail: Denis.Kiefel@airbus.com, E-mail: Rainer.Stoessel@airbus.com; Stoessel, Rainer, E-mail: Denis.Kiefel@airbus.com, E-mail: Rainer.Stoessel@airbus.com [Airbus Group Innovations, Munich (Germany); Grosse, Christian, E-mail: Grosse@tum.de [Technical University Munich (Germany)

    2015-03-31

    In recent years, an increasing number of safety-relevant structures are designed and manufactured from carbon fiber reinforced polymers (CFRP) in order to reduce the weight of airplanes by taking advantage of their specific strength. Non-destructive testing (NDT) methods for quantitative defect analysis of damages are liquid- or air-coupled ultrasonic testing (UT), phased array ultrasonic techniques, and active thermography (IR). The advantage of these testing methods is their applicability to large areas. However, their quantitative information is often limited to impact localization and size. In addition to these techniques, Airbus Group Innovations operates a micro x-ray computed tomography (μ-XCT) system, which was developed for CFRP characterization. It is an open system which allows different kinds of acquisition, reconstruction, and data evaluation. One main advantage of this μ-XCT system is its high resolution with 3-dimensional analysis and visualization opportunities, which makes it possible to gain important quantitative information for composite part design and stress analysis. Within this study, different NDT methods are compared on CFRP samples with specified artificial impact damages. The results can be used to select the most suitable NDT method for specific application cases. Furthermore, novel evaluation and visualization methods for impact analysis are developed and presented.

  11. Quantitative Analysis of Ductile Iron Microstructure – A Comparison of Selected Methods for Assessment

    Directory of Open Access Journals (Sweden)

    B. Mrzygłód

    2013-07-01

    Full Text Available Stereological description of a dispersed microstructure is not an easy task and remains the subject of continuous research. In its practical aspect, a correct stereological description of this type of structure is essential for the analysis of processes of coagulation and spheroidisation, or for studies of relationships between structure and properties. One of the most frequently used methods for estimating the density Nv and size distribution of particles is the Scheil-Schwartz-Saltykov method. In this article, the authors present selected methods for quantitative assessment of ductile iron microstructure: the Scheil-Schwartz-Saltykov method, which allows a quantitative description of three-dimensional sets of solids from measurements and counts performed on two-dimensional cross-sections of these sets (microsections), and X-ray computed microtomography, which is an interesting alternative to traditional methods of microstructure imaging for structural studies since the analysis provides three-dimensional imaging of the examined microstructures.

  12. Laboratory and field validation of a Cry1Ab protein quantitation method for water.

    Science.gov (United States)

    Strain, Katherine E; Whiting, Sara A; Lydy, Michael J

    2014-10-01

    The widespread planting of crops expressing insecticidal proteins derived from the soil bacterium Bacillus thuringiensis (Bt) has given rise to concerns regarding potential exposure to non-target species. These proteins are released from the plant throughout the growing season into soil and surface runoff and may enter adjacent waterways via runoff, erosion, aerial deposition of particulates, or plant debris. It is crucial to be able to accurately quantify Bt protein concentrations in the environment to aid in risk analyses and decision making. Enzyme-linked immunosorbent assay (ELISA) is commonly used for quantitation of Bt proteins in the environment; however, there are no published methods detailing and validating the extraction and quantitation of Bt proteins in water. The objective of the current study was to optimize the extraction of a Bt protein, Cry1Ab, from three water matrices and validate the ELISA method for specificity, precision, accuracy, stability, and sensitivity. Recovery of the Cry1Ab protein was matrix-dependent and ranged from 40 to 88% in the validated matrices, with an overall method detection limit of 2.1 ng/L. Precision among two plates and within a single plate was confirmed with a coefficient of variation less than 20%. The ELISA method was verified in field and laboratory samples, demonstrating the utility of the validated method. The implementation of a validated extraction and quantitation protocol adds consistency and reliability to field-collected data regarding transgenic products.
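
The validation statistics mentioned (matrix recovery, precision as CV) reduce to simple arithmetic; the replicate readings below are invented for illustration, and the 20% CV acceptance limit is taken from the abstract.

```python
import statistics

# Hypothetical replicate Cry1Ab readings (ng/L) for a 50 ng/L spiked sample
replicates = [41.0, 44.5, 39.8, 43.2, 42.0]
spiked = 50.0

mean_conc = statistics.mean(replicates)
recovery_pct = mean_conc / spiked * 100.0                  # matrix recovery
cv_pct = statistics.stdev(replicates) / mean_conc * 100.0  # precision (CV)

acceptable_precision = cv_pct < 20.0  # acceptance criterion from the study
```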

  13. THE DETECTION OF MDR1 GENE EXPRESSION USING FLUOROGENIC PROBE QUANTITATIVE RT-PCR METHOD

    Institute of Scientific and Technical Information of China (English)

    高劲松; 马刚; 仝明; 陈佩毅; 王传华; 何蕴韶

    2001-01-01

    Objective: To establish a fluorogenic probe quantitative RT-PCR (FQ-RT-PCR) method for detection of the expression of the MDR1 gene in tumor cells and to investigate the expression of the MDR1 gene in patients with lung cancer. Methods: The fluorogenic quantitative RT-PCR method for detection of the expression of the MDR1 gene was established. K562/ADM and K562 cell lines and 45 tumor tissues from patients with lung cancer were examined on a PE Applied Biosystems 7700 Sequence Detection system. Results: The average levels of MDR1 gene expression in K562/ADM cells and K562 cells were (6.86±0.65)×107 copies/mg RNA and (8.49±0.67)×105 copies/mg RNA, respectively. The former was 80.8 times greater than the latter. Each sample was measured 10 times and the coefficient of variation (CV) was 9.5% and 7.9%, respectively. Various levels of MDR1 gene expression were detected in 12 of 45 patients with lung cancer. Conclusion: Quantitative detection of MDR1 gene expression in tumor cells was achieved by using FQ-RT-PCR. FQ-RT-PCR is an accurate and sensitive method that is easy to perform. Using this method, low levels of MDR1 gene expression could be detected in 24% of the patients with lung cancer.
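
The reported 80.8-fold difference follows directly from the two mean expression levels:

```python
# Mean MDR1 expression levels reported in the abstract (copies per mg RNA)
k562_adm = 6.86e7     # adriamycin-resistant K562/ADM cells
k562_parent = 8.49e5  # parental K562 cells

fold_difference = k562_adm / k562_parent
```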

  14. Acute orbital apex syndrome and rhino-orbito-cerebral mucormycosis

    Directory of Open Access Journals (Sweden)

    Anders UM

    2015-04-01

    Full Text Available Ursula M Anders,1 Elise J Taylor,1 Joseph R Martel,1–3 James B Martel1–3 1Research Center, Martel Eye Medical Group, Rancho Cordova, 2Graduate Medical Education, California Northstate University College of Medicine, Elk Grove, 3Department of Ophthalmology, Dignity Health, Carmichael, CA, USA Purpose: To demonstrate the successful clinical identification and management of rhino-orbital mucormycosis, a fungal infection with a high mortality rate. Patients and methods: A diabetic male patient with a headache and orbital apex syndrome in the right eye was examined using computed tomography (CT and magnetic resonance imaging (MRI for a possible fungal infection. Endoscopic surgical resection was performed and a pathology sample was taken. Specimens were prepared with Gömöri methenamine silver and hematoxylin and eosin staining. The patient was treated with liposomal amphotericin B 400 mg daily, followed by posaconazole 400 mg twice daily. Results: CT and MRI revealed a mass of the right sphenoid spreading into the orbit, indicative of a fungal infection. The biopsy confirmed the diagnosis of mucormycosis. Complete recovery of eyelid and oculomotor function was achieved after 10 months of treatment, although the patient continues to suffer from irreversible blindness in the right eye due to optic nerve atrophy. He has been without signs or symptoms of recurrence. Conclusion: Patients with rhino-orbito-cerebral mucormycosis need extensive surgical and medical treatment to maximize outcomes. Success requires multidisciplinary management. Keywords: ophthalmoplegia, sixth nerve palsy, diabetes mellitus, nephrotoxicity, amphotericin B, posaconazole

  15. An in vivo radiographic evaluation of the accuracy of Apex and iPex electronic Apex locators.

    Science.gov (United States)

    Paludo, Laura; Souza, Sophia Lopes de; Só, Marcus Vinícius Reis; Rosa, Ricardo Abreu da; Vier-Pelisser, Fabiana Vieira; Duarte, Marco Antônio Húngaro

    2012-01-01

    The aim of this study was to evaluate in vivo the clinical applicability of two electronic apex locators (EALs) - Apex (Septodont) and iPex (NSK) - in different groups of human teeth by using radiography. The working lengths (WLs) of 100 root canals were determined electronically. The EAL to be used first was chosen randomly and a K-file was inserted into the root canal until the EAL display indicated the location of the apical constriction (0 mm). The K-file was fixed to the tooth and a periapical radiograph was taken using a radiographic film holder. The K-file was removed and the WL was measured. The same procedure was repeated using the other EAL. Radiographs were examined with the aid of a light-box with lens of ×4 magnification by two blinded experienced endodontists. The distance between the file tip and the root apex was recorded as follows: (A) +1 to 0 mm, (B) -0.1 to 0.5 mm, (C) -0.6 to 1 mm, (D) -1.1 to 1.5 mm, and (E) -1.6 mm or greater. For statistical purposes, these scores were divided into 2 subgroups according to the radiographic apex: acceptable (B, C, and D) and non-acceptable (A and E). Statistically significant differences were not found between the results of Apex and iPex in terms of acceptable and non-acceptable measurements (p>0.05) or in terms of the distance recorded from file tip and the radiographic apex (p>0.05). Apex and iPex EALs provided reliable measurements for WL determination for endodontic therapy.

  16. A processing method enabling the use of peak height for accurate and precise proton NMR quantitation.

    Science.gov (United States)

    Hays, Patrick A; Thompson, Robert A

    2009-10-01

    In NMR, peak area quantitation is the most common method used because the area under a peak or peak group is proportional to the number of nuclei at those frequencies. Peak height quantitation has not enjoyed as much utility because of poor precision and linearity as a result of inconsistent shapes and peak widths (measured at half height). By using a post-acquisition processing method employing a Gaussian or line-broadening (exponential decay) apodization (i.e. weighting function) to normalize the shape and width of the internal standard (ISTD) peak, the heights of an analyte calibration spectrum can be compared to the analyte peaks in a sample spectrum resulting in accurate and precise quantitative results. Peak height results compared favorably with 'clean' peak area results for several hundred illicit samples of methamphetamine HCl, cocaine HCl, and heroin HCl, of varying composition and purity. Using peak height and peak area results together can enhance the confidence in the reported purity value; a major advantage in high throughput, automated quantitative analyses. Published in 2009 by John Wiley & Sons, Ltd.
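
The rationale, that applying a common line-broadening apodization makes peak widths (and therefore heights, for equal peak areas) comparable across spectra, can be demonstrated on synthetic FIDs; the decay rates and apodization constant below are arbitrary illustrative values, not the paper's processing parameters.

```python
import numpy as np

# Two synthetic on-resonance FIDs with equal spin counts (equal peak areas)
# but different natural linewidths (different T2 decay rates, in s^-1)
t = np.linspace(0.0, 4.0, 4096, endpoint=False)
fid_narrow = np.exp(-1.0 * t)   # T2 = 1.0 s
fid_broad = np.exp(-5.0 * t)    # T2 = 0.2 s

def peak_height(fid):
    # On resonance, the spectral peak height equals the sum of the FID
    return float(np.abs(np.fft.fft(fid))[0])

ratio_raw = peak_height(fid_broad) / peak_height(fid_narrow)

# Apply the same strong exponential line-broadening apodization to both;
# the imposed linewidth now dominates both peaks, so their widths (and
# hence their heights, for equal areas) become nearly equal.
lb = np.exp(-100.0 * t)
ratio_apodized = peak_height(fid_broad * lb) / peak_height(fid_narrow * lb)
```

Before apodization the broad peak is several times shorter than the narrow one despite equal areas; after the shared line-broadening their heights agree to within a few percent, which is why normalized peak heights become usable for quantitation.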

  17. Nanopatterning and tuning of optical taper antenna apex for tip-enhanced Raman scattering performance

    Science.gov (United States)

    Kharintsev, S. S.; Rogov, A. M.; Kazarian, S. G.

    2013-09-01

    This paper focuses on finding optimal electrochemical conditions from linear sweep voltammetry analysis for preparing highly reproducible tip-enhanced Raman scattering (TERS) conical gold tips with dc-pulsed voltage etching. Special attention is given to the reproducibility of tip apex shapes with different etchant mixtures. We show that the fractional Brownian motion model enables a mathematical description of the decaying current kinetics during the whole etching process up to the cutoff event. Further progress in preparation of highly reproducible smooth and sharp tip apexes is related to the effect of an additive, such as isopropanol, to aqueous acids. A finite-difference time-domain method based near-field analysis provides evidence that TERS performance depends critically on tip orientation relative to a highly focused laser beam. A TERS based criterion for recognizing gold tips able to couple/decouple optical near- and far-fields is proposed.

  19. Nanopatterning and tuning of optical taper antenna apex for tip-enhanced Raman scattering performance

    Energy Technology Data Exchange (ETDEWEB)

    Kharintsev, S. S.; Rogov, A. M. [Department of Optics and Nanophotonics, Institute of Physics, Kazan Federal University, Kremlevskaya 16, Kazan 420008 (Russian Federation); Kazarian, S. G. [Department of Chemical Engineering, Imperial College London, London SW7 2AZ (United Kingdom)

    2013-09-15

    This paper focuses on finding optimal electrochemical conditions from linear sweep voltammetry analysis for preparing highly reproducible tip-enhanced Raman scattering (TERS) conical gold tips with dc-pulsed voltage etching. Special attention is given to the reproducibility of tip apex shapes with different etchant mixtures. We show that the fractional Brownian motion model enables a mathematical description of the decaying current kinetics during the whole etching process up to the cutoff event. Further progress in preparation of highly reproducible smooth and sharp tip apexes is related to the effect of an additive, such as isopropanol, to aqueous acids. A finite-difference time-domain method based near-field analysis provides evidence that TERS performance depends critically on tip orientation relative to a highly focused laser beam. A TERS based criterion for recognizing gold tips able to couple/decouple optical near- and far-fields is proposed.

  20. Penumbra pattern assessment in acute stroke patients: comparison of quantitative and non-quantitative methods in whole brain CT perfusion.

    Directory of Open Access Journals (Sweden)

    Kolja M Thierfelder

    BACKGROUND AND PURPOSE: While penumbra assessment has become an important part of clinical decision making for acute stroke patients, there is a lack of studies measuring the reliability and reproducibility of defined assessment techniques in the clinical setting. Our aim was to determine the reliability and reproducibility of different types of three-dimensional penumbra assessment methods in stroke patients who underwent whole brain CT perfusion imaging (WB-CTP). MATERIALS AND METHODS: We included 29 patients with a confirmed MCA infarction who underwent initial WB-CTP with a scan coverage of 100 mm in the z-axis. Two blinded and experienced readers assessed the flow-volume mismatch twice and in two quantitative ways: a volumetric mismatch analysis using OsiriX imaging software (MM(VOL)) and visual estimation of mismatch (MM(EST)). Complementarily, the semiquantitative Alberta Stroke Programme Early CT Score for CT perfusion was used to define mismatch (MM(ASPECTS)). A favorable penumbral pattern was defined by a mismatch of ≥30% in combination with a cerebral blood flow deficit of ≤90 ml and an MM(ASPECTS) score of ≥1, respectively. Inter- and intrareader agreement was determined by kappa values and ICCs. RESULTS: Overall, MM(VOL) showed considerably higher inter-/intrareader agreement (ICCs: 0.751/0.843) compared to MM(EST) (0.292/0.749). In the subgroup of large (≥50 mL) perfusion deficits, inter- and intrareader agreement of MM(VOL) was excellent (ICCs: 0.961/0.942), while MM(EST) interreader agreement was poor (0.415) and intrareader agreement was good (0.919). With respect to penumbra classification, MM(VOL) showed the highest agreement (interreader agreement: 25 agreements/4 non-agreements/κ: 0.595; intrareader agreement: 27/2/0.833), followed by MM(EST) (22/7/0.471; 23/6/0.577) and MM(ASPECTS) (18/11/0.133; 21/8/0.340). CONCLUSION: The evaluated approach of volumetric mismatch assessment is superior to purely visual and ASPECTS-based penumbra assessment.

  1. Electronic apex locator: A comprehensive literature review - Part I: Different generations, comparison with other techniques and different usages

    Directory of Open Access Journals (Sweden)

    Hamid Mosleh

    2014-01-01

    Introduction: To compare electronic apex locators (EALs) with other root canal length determination techniques and to evaluate other usages of these devices. Materials and Methods: "Tooth apex," "Dental instrument," "Odontometry," "Electronic medical," and "Electronic apex locator" were searched as primary identifiers via the Medline/PubMed, Cochrane Library, and Scopus databases up to 30 July 2013. Original articles that fulfilled the inclusion criteria were selected and reviewed. Results: Out of 402 relevant studies, 183 were selected based on the inclusion criteria; 108 of these are presented in this part. Under the same conditions, no significant differences could be seen between different EALs of one generation. The application of EALs can result in lower patient radiation exposure, exact diagnosis of fractures, fewer perforations, and better retreatment. Conclusions: EALs were more accurate than other techniques in root canal length determination.

  2. Assessment of a semi-quantitative screening method for diagnosis of ethylene glycol poisoning.

    Science.gov (United States)

    Sankaralingam, Arun; Thomas, Annette; James, David R; Wierzbicki, Anthony S

    2017-07-01

    Background: Ethylene glycol poisoning remains a rare but important presentation to acute toxicology units. Guidelines recommend that ethylene glycol be available as an 'urgent' test within 4 h, but this is difficult to deliver in practice. This study assessed a semi-quantitative enzymatic spectrophotometric assay for ethylene glycol compatible with automated platforms. Methods: The ethylene glycol method was assessed in 21 samples from patients with an increased anion gap and metabolic acidosis not due to ethylene glycol ingestion, and seven samples known to contain ethylene glycol. All samples were analysed on a laboratory spectrophotometer in random order, blinded to their origin. Results: Seven samples were known to contain ethylene glycol at concentrations >100 mg/L. The method correctly identified all seven of these samples as containing ethylene glycol, no false positives were observed, and thirteen samples gave clear negative results. Plotting the semi-quantitative ethylene glycol concentrations against results obtained when the samples had been analysed using the quantitative method on an automated analyser showed a good correlation (R = 0.84) but with an apparent under-recovery. Conclusions: The semi-quantitative assay for ethylene glycol was able to discriminate well between samples containing ethylene glycol and those with other causes of acidosis. It is a practical small-scale assay for rapid identification of cases of ethylene glycol poisoning.

  3. A Novel Image Cytometric Method for Quantitation of Immunohistochemical Staining of Cytoplasmic Antigens

    Directory of Open Access Journals (Sweden)

    M. Guillaud

    1997-01-01

    Evaluation of molecular markers by immunohistochemical labelling of tissue sections has traditionally been performed qualitatively by trained pathologists. For markers with a staining component present outside of the nucleus, there has been no image histometric method available to reliably and consistently define cell interfaces within the tissue. We present a new method of approximating cellular boundaries to define cellular regions within which quantitative measurements of staining intensity may be made. The method is based upon Voronoi tessellation of a defined region of interest (ROI) and requires only the positions of the nuclear centroids within the ROI.
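The tessellation step can be sketched as a discrete Voronoi partition: label every pixel of the ROI with the index of its nearest nuclear centroid, then aggregate stain intensity per label. The image size, centroid positions and toy "stain" below are invented for illustration:

```python
import numpy as np

def voronoi_regions(shape, centroids):
    """Label each pixel with the index of its nearest nuclear centroid,
    i.e. a discrete Voronoi tessellation of the ROI."""
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    pts = np.asarray(centroids, dtype=float)          # (n, 2) array of (y, x)
    d2 = (yy[..., None] - pts[:, 0]) ** 2 + (xx[..., None] - pts[:, 1]) ** 2
    return np.argmin(d2, axis=-1)

def mean_stain_per_cell(stain, labels, n_cells):
    # Average staining intensity inside each approximated cell region.
    return [float(stain[labels == i].mean()) for i in range(n_cells)]

labels = voronoi_regions((100, 100), [(20, 20), (20, 80), (75, 50)])
stain = np.zeros((100, 100))
stain[labels == 2] = 1.0          # toy "cytoplasmic stain" around one nucleus
means = mean_stain_per_cell(stain, labels, 3)
print(means)
```

Each nucleus thus receives the stain signal from its surrounding tessellation cell, which is the quantity the paper measures per cell.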

  4. Evaluation of the quantitative performances of supercritical fluid chromatography: from method development to validation.

    Science.gov (United States)

    Dispas, Amandine; Lebrun, Pierre; Ziemons, Eric; Marini, Roland; Rozet, Eric; Hubert, Philippe

    2014-08-01

    Recently, the number of papers about SFC has increased drastically, but little of this work has focused on the quantitative performance of the technique. In order to demonstrate the potential of UHPSFC, the present work discusses the steps of the analytical life cycle of a method, from development to validation and application. Moreover, the quantitative performance of UHPSFC was evaluated in comparison with UHPLC, which is the main technique used for quality control in the pharmaceutical industry and can therefore be considered a reference. The methods were developed using a Design Space strategy, leading to the optimization of a robust method; in this context, when the Design Space optimization guarantees quality, no further robustness study is required prior to validation. The methods were then geometrically transferred in order to reduce the analysis time. The UHPSFC and UHPLC methods were validated based on the total error approach using accuracy profiles. Even though UHPLC showed better precision and sensitivity, the UHPSFC method is able to give accurate results over a dosing range larger than the 80-120% range required by the European Medicines Agency. Consequently, UHPSFC results are valid and could be used for the control of the active substance in a finished pharmaceutical product. Finally, the validated UHPSFC method was used to analyse real samples and gave results similar to those of the reference method (UHPLC).

  5. Paraganglioma presenting as cholesterol granuloma of the petrous apex.

    Science.gov (United States)

    Heman-Ackah, Selena E; Huang, Tina C

    2013-09-01

    We report the unique finding of a petrous apex cholesterol granuloma associated with a paraganglioma, also known as a glomus jugulare tumor, in a 52-year-old woman who presented to our department with pulsatile tinnitus, hearing loss, aural fullness, and disequilibrium. She had been treated for a petrous apex cholesterol granuloma 20 years earlier, at which time she had undergone drainage of the granuloma via subtotal petrous apicectomy. When she came to our facility approximately 20 years later, she had signs and symptoms consistent with a jugular paraganglioma, which was likely to have been present at the time of her initial presentation for the cholesterol granuloma. In fact, microscopic bleeding from the paraganglioma might have led to the formation of the cholesterol granuloma. The metachronous presentation of these two entities, which to our knowledge has not been reported previously in the literature, indicates the potential association of paragangliomas with the formation of cholesterol granulomas of the petrous apex.

  6. Study on Correlation and Quantitative Error Estimation Method Among the Splitting Shear Wave Identification Methods

    Institute of Scientific and Technical Information of China (English)

    Liu Xiqiang; Zhou Huilan; Li Hong; Gai Dianguang

    2000-01-01

    Based on the propagation characteristics of shear waves in anisotropic layers, the correlation among several splitting shear-wave identification methods has been studied. This paper puts forward a method for estimating splitting shear-wave phases and their reliability, using the assumption that the variance of noise and useful signal data obey a normal distribution. To check the validity of the new method, the identification results and the error estimation corresponding to a 95% confidence level, obtained by analyzing simulated signals, are given.

  7. Business Scenario Evaluation Method Using Monte Carlo Simulation on Qualitative and Quantitative Hybrid Model

    Science.gov (United States)

    Samejima, Masaki; Akiyoshi, Masanori; Mitsukuni, Koshichiro; Komoda, Norihisa

    We propose a business scenario evaluation method using a qualitative and quantitative hybrid model. In order to evaluate business factors with qualitative causal relations, we introduce statistical values based on the propagation and combination of effects of business factors by Monte Carlo simulation. In propagating an effect, we divide the range of each factor by landmarks and decide the effect on a destination node based on the divided ranges. In combining effects, we decide the effect of each arc using a contribution degree and sum all effects. Through results of application to practical models, it is confirmed that there are no differences between results obtained by quantitative relations and results obtained by the proposed method at a risk rate of 5%.
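As a rough sketch of the approach (not the authors' model or landmark scheme), a Monte Carlo run can propagate an uncertain input factor through weighted causal arcs and read the outcome distribution off at the 5% risk rate; the factors, arc weights and noise levels below are all invented:

```python
import random

# Toy causal chain: marketing -> demand -> profit, with qualitative arc
# strengths mapped onto numeric "contribution degrees" (illustrative values).
def simulate(n_trials=10000, seed=1):
    rng = random.Random(seed)
    outcomes = []
    for _ in range(n_trials):
        marketing = rng.uniform(0.5, 1.0)               # uncertain input factor
        demand = 0.8 * marketing + rng.gauss(0, 0.05)   # arc weight 0.8 + noise
        profit = 0.6 * demand + rng.gauss(0, 0.05)      # arc weight 0.6 + noise
        outcomes.append(profit)
    outcomes.sort()
    # 5% risk rate: report the lower 5th percentile of the simulated outcome.
    return outcomes[int(0.05 * n_trials)], sum(outcomes) / n_trials

p5, mean = simulate()
print(round(p5, 3), round(mean, 3))
```

Scenarios can then be compared on the 5th-percentile value rather than only on the mean, which is the role the risk rate plays in the paper's evaluation.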

  8. Revisiting the Isobole and Related Quantitative Methods for Assessing Drug Synergism

    OpenAIRE

    Tallarida, Ronald J.

    2012-01-01

    The isobole is well established and commonly used in the quantitative study of agonist drug combinations. This article reviews the isobole, its derivation from the concept of dose equivalence, and its usefulness in providing the predicted effect of an agonist drug combination, a topic not discussed in pharmacology textbooks. This review addresses that topic and also shows that an alternate method, called “Bliss independence,” is inconsistent with the isobolar approach and also has a less clea...

  9. A quantitative PCR method to quantify ruminant DNA in porcine crude heparin

    OpenAIRE

    Concannon, Sean P.; Wimberley, P. Brett; Workman, Wesley E.

    2010-01-01

    Heparin is a well-known glycosaminoglycan extracted from porcine intestines. Increased vigilance for transmissible spongiform encephalopathy in animal-derived pharmaceuticals requires methods to prevent the introduction of heparin from ruminants into the supply chain. The sensitivity, specificity, and precision of the quantitative polymerase chain reaction (PCR) make it a superior analytical platform for screening heparin raw material for bovine-, ovine-, and caprine-derived material. A quant...

  10. Comparison of Two Methods: Qualitative and Quantitative Study of C - Reactive Protein

    OpenAIRE

    Kiaei, MR. (BSc); HedayatMofidi, M. (MSc); Koohsar, F.; Amini, A; Hoseinzadeh, S.; Mirbazel, A.; Hesari, Z. (MSc)

    2014-01-01

    Background and Objective: C-reactive protein (CRP) is an acute phase protein produced in the liver. It is present at less than 5 mg per deciliter in the serum and body fluids of normal individuals, but it increases suddenly within a few hours of an inflammatory reaction. It is also increased in bacterial and viral infections, active rheumatic fever, acute myocardial infarction and rheumatoid arthritis. The aim of this study was to investigate CRP levels by qualitative and quantitative methods. ...

  11. A method for estimating and removing streaking artifacts in quantitative susceptibility mapping.

    Science.gov (United States)

    Li, Wei; Wang, Nian; Yu, Fang; Han, Hui; Cao, Wei; Romero, Rebecca; Tantiwongkosi, Bundhit; Duong, Timothy Q; Liu, Chunlei

    2015-03-01

    Quantitative susceptibility mapping (QSM) is a novel MRI method for quantifying tissue magnetic property. In the brain, it reflects the molecular composition and microstructure of the local tissue. However, susceptibility maps reconstructed from single-orientation data still suffer from streaking artifacts which obscure structural details and small lesions. We propose and have developed a general method for estimating streaking artifacts and subtracting them from susceptibility maps. Specifically, this method uses a sparse linear equation and least-squares (LSQR)-algorithm-based method to derive an initial estimation of magnetic susceptibility, a fast quantitative susceptibility mapping method to estimate the susceptibility boundaries, and an iterative approach to estimate the susceptibility artifact from ill-conditioned k-space regions only. With a fixed set of parameters for the initial susceptibility estimation and subsequent streaking artifact estimation and removal, the method provides an unbiased estimate of tissue susceptibility with negligible streaking artifacts, as compared to multi-orientation QSM reconstruction. This method allows for improved delineation of white matter lesions in patients with multiple sclerosis and small structures of the human brain with excellent anatomical details. The proposed methodology can be extended to other existing QSM algorithms.
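For context, the ill-conditioned k-space regions that give rise to these streaks are where the unit dipole kernel approaches zero (the "magic-angle" cones). A hedged NumPy sketch of the kernel and such a mask follows; the 0.1 threshold is arbitrary, and the paper's artifact-estimation procedure is considerably more elaborate:

```python
import numpy as np

def dipole_kernel(shape, threshold=0.1):
    """k-space dipole kernel D = 1/3 - kz^2/|k|^2 and a mask of the
    ill-conditioned region |D| < threshold, where single-orientation
    QSM streaking artifacts originate."""
    kx, ky, kz = np.meshgrid(*(np.fft.fftfreq(s) for s in shape), indexing="ij")
    k2 = kx ** 2 + ky ** 2 + kz ** 2
    with np.errstate(divide="ignore", invalid="ignore"):
        frac = np.where(k2 > 0, kz ** 2 / k2, 0.0)
    D = 1.0 / 3.0 - frac
    D[0, 0, 0] = 0.0                 # undefined at the k-space origin
    return D, np.abs(D) < threshold

D, ill = dipole_kernel((32, 32, 32))
print(D.shape, float(ill.mean()))
```

Restricting the artifact estimate to this mask is what lets the method correct streaks without altering the well-conditioned part of the susceptibility solution.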

  12. Sustainability appraisal. Quantitative methods and mathematical techniques for environmental performance evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Erechtchoukova, Marina G.; Khaiter, Peter A. [York Univ., Toronto, ON (Canada). School of Information Technology; Golinska, Paulina (eds.) [Poznan Univ. of Technology (Poland)

    2013-06-01

    The book will present original research papers on the quantitative methods and techniques for the evaluation of the sustainability of business operations and organizations' overall environmental performance. The book contributions will describe modern methods and approaches applicable to the multi-faceted problem of sustainability appraisal and will help to fulfil generic frameworks presented in the literature with the specific quantitative techniques so needed in practice. The scope of the book is interdisciplinary in nature, making it of interest to environmental researchers, business managers and process analysts, information management professionals and environmental decision makers, who will find valuable sources of information for their work-related activities. Each chapter will provide sufficient background information, a description of problems, and results, making the book useful for a wider audience. Additional software support is not required. One of the most important issues in developing sustainable management strategies and incorporating ecodesigns in production, manufacturing and operations management is the assessment of the sustainability of business operations and organizations' overall environmental performance. The book presents the results of recent studies on sustainability assessment. It provides a solid reference for researchers in academia and industrial practitioners on the state-of-the-art in sustainability appraisal including the development and application of sustainability indices, quantitative methods, models and frameworks for the evaluation of current and future welfare outcomes, recommendations on data collection and processing for the evaluation of organizations' environmental performance, and eco-efficiency approaches leading to business process re-engineering.

  13. A scanning electron microscope method for automated, quantitative analysis of mineral matter in coal

    Energy Technology Data Exchange (ETDEWEB)

    Creelman, R.A.; Ward, C.R. [R.A. Creelman and Associates, Epping, NSW (Australia)

    1996-07-01

    Quantitative mineralogical analysis has been carried out on a series of nine coal samples from Australia, South Africa and China using a newly-developed automated image analysis system coupled to a scanning electron microscope. The image analysis system (QEM*SEM) gathers X-ray spectra and backscattered electron data from a number of points on a conventional grain-mount polished section under the SEM, and interprets the data from each point in mineralogical terms. The cumulative data in each case were integrated to provide a volumetric modal analysis of the species present in the coal samples, expressed as percentages of the respective coals' mineral matter. The QEM*SEM results were compared to data obtained from the same samples using other methods of quantitative mineralogical analysis, namely X-ray diffraction of the low-temperature oxygen-plasma ash and normative calculation from the (high-temperature) ash analysis and carbonate CO2 data. Good agreement was obtained from all three methods for quartz in the coals, and also for most of the iron-bearing minerals. The correlation between results from the different methods was less strong, however, for individual clay minerals, or for minerals such as calcite, dolomite and phosphate species that made up only relatively small proportions of the mineral matter. The image analysis approach, using the electron microscope for mineralogical studies, has significant potential as a supplement to optical microscopy in quantitative coal characterisation. 36 refs., 3 figs., 4 tabs.

  14. Sixth Nerve Palsy from Cholesterol Granuloma of the Petrous Apex

    Science.gov (United States)

    Roemer, Ségolène; Maeder, Philippe; Daniel, Roy Thomas; Kawasaki, Aki

    2017-01-01

    Herein, we report a patient who had an isolated sixth nerve palsy due to a petrous apex cholesterol granuloma. The sixth nerve palsy appeared acutely and then spontaneously resolved over several months, initially suggesting a microvascular origin of the palsy. Subsequent recurrences of the palsy indicated a different pathophysiologic etiology and MRI revealed the lesion at the petrous apex. Surgical resection improved the compressive effect of the lesion at Dorello’s canal and clinical improvement was observed. A relapsing–remitting sixth nerve palsy is an unusual presentation of this rare lesion. PMID:28261154

  15. Study of foramen openings and their concurrence with root apexes

    Directory of Open Access Journals (Sweden)

    Sandra Maria Alves SAYÃO MAIA

    2005-05-01

    The present study aimed to evaluate the anatomic concurrence between foramen openings and root apexes in 247 upper and lower permanent human molar canals, the distance between these structures, the instrument that best fits into the root canal, and the direction of foramen deviation. Sixty-four of the canals were partially impenetrable and were discarded. The findings showed that 39.9% of the root canals studied had an apical foramen concurrent with the root apex and 60.1% did not. Clinicians must be made aware of this important anatomical detail, which can be indispensable for successful endodontic treatment.

  16. APEX survey of southern high mass star forming regions

    CERN Document Server

    Hieret, C; Menten, K M; Schilke, P; Thorwirth, S; Wyrowski, F

    2007-01-01

    A systematic study of a large sample of sources, covering a wide range of galactocentric distances, masses and luminosities, is a fast and efficient way of obtaining a good overview of the different stages of high-mass star formation. With these goals in mind, we have started a survey of 40 color-selected IRAS sources south of -20 degrees declination with the APEX telescope on Chajnantor, Chile. Our first APEX results already demonstrate that the selection criteria were successful, since some of the sources are very rich in molecular lines.

  17. Automatic segmentation method of striatum regions in quantitative susceptibility mapping images

    Science.gov (United States)

    Murakawa, Saki; Uchiyama, Yoshikazu; Hirai, Toshinori

    2015-03-01

    Abnormal accumulation of brain iron has been detected in various neurodegenerative diseases. Quantitative susceptibility mapping (QSM) is a novel contrast mechanism in magnetic resonance (MR) imaging that enables quantitative analysis of local tissue susceptibility. Automatic segmentation tools for brain regions on QSM images would therefore be helpful for radiologists' quantitative analysis of various neurodegenerative diseases. The purpose of this study was to develop an automatic segmentation and classification method for striatum regions on QSM images. Our image database consisted of 22 QSM images obtained from healthy volunteers on a 3.0 T MR scanner. The voxel size was 0.9×0.9×2 mm and the matrix size of each slice image was 256×256 pixels. In our computerized method, a template matching technique was first used for the detection of slice images containing striatum regions. An image registration technique was subsequently employed for the classification of striatum regions using anatomical knowledge. After the image registration, the voxels in the target image that corresponded to striatum regions in the reference image were classified into three striatum regions: the head of the caudate nucleus, the putamen, and the globus pallidus. The experimental results indicated that 100% (21/21) of the slice images containing striatum regions were detected accurately. The subjective evaluation of the classification results indicated that 20 of 21 (95.2%) showed good or adequate quality. Our computerized method would be useful for the quantitative analysis of Parkinson's disease on QSM images.
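The slice-detection idea can be sketched with a whole-slice normalized cross-correlation score, a simple stand-in for the template matching step; the volume sizes and data below are synthetic:

```python
import numpy as np

def ncc(image, template):
    """Normalized cross-correlation between an image and a same-size template."""
    a = image - image.mean()
    b = template - template.mean()
    denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
    return float((a * b).sum() / denom) if denom else 0.0

def best_slice(volume, template):
    # Pick the slice whose NCC score against the template is highest.
    scores = [ncc(s, template) for s in volume]
    return int(np.argmax(scores)), scores

rng = np.random.default_rng(0)
volume = rng.random((5, 64, 64))
template = volume[3] + 0.1 * rng.random((64, 64))   # noisy copy of slice 3
idx, scores = best_slice(volume, template)
print(idx)
```

In the paper the template would be a reference slice known to contain the striatum; here the "template" is simply a noisy copy of one slice, which the score correctly recovers.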

  18. Criteria for quantitative and qualitative data integration: mixed-methods research methodology.

    Science.gov (United States)

    Lee, Seonah; Smith, Carrol A M

    2012-05-01

    Many studies have emphasized the need for and importance of a mixed-methods approach for the evaluation of clinical information systems. However, those studies had no criteria to guide the integration of multiple data sets. Integrating different data sets serves to actualize the paradigm for which a mixed-methods approach argues; thus, we require criteria that provide the right direction for integrating quantitative and qualitative data. The first author used a set of criteria organized from a literature search on the integration of multiple data sets from mixed-methods research. The purpose of this article was to reorganize the identified criteria. Through critical appraisal of the reasons for designing mixed-methods research, three criteria resulted: validation, complementarity, and discrepancy. In applying the criteria to empirical data of a previous mixed-methods study, integration of quantitative and qualitative data was achieved in a systematic manner, and it helped us obtain a better-organized understanding of the results. The criteria of this article offer the potential to produce insightful analyses of mixed-methods evaluations of health information systems.

  19. An Augmented Classical Least Squares Method for Quantitative Raman Spectral Analysis against Component Information Loss

    Directory of Open Access Journals (Sweden)

    Yan Zhou

    2013-01-01

    We propose an augmented classical least squares (ACLS) calibration method for quantitative Raman spectral analysis that is robust against component information loss. Raman spectral signals with low analyte-concentration correlations were selected and used as substitutes for unknown quantitative component information during the CLS calibration procedure. The number of selected signals was determined using the leave-one-out root-mean-square error of cross-validation (RMSECV) curve. An ACLS model was built based on the augmented concentration matrix and the reference spectral signal matrix. The proposed method was compared with partial least squares (PLS) and principal component regression (PCR) using one example: a data set recorded from an experiment on analyte concentration determination using Raman spectroscopy. A 2-fold cross-validation with a Venetian blinds strategy was exploited to evaluate the predictive power of the proposed method. One-way analysis of variance (ANOVA) was used to assess the difference in predictive power between the proposed method and existing methods. Results indicated that the proposed method is effective at increasing the robustness of the traditional CLS model's predictions against component information loss, and its predictive power is comparable to that of PLS or PCR.
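The core ACLS idea, augmenting the estimated pure-spectra matrix with extra signals that stand in for un-calibrated component information, can be sketched on synthetic data. Here, for simplicity, the interferent's own spectrum is used as the augmenting signal, whereas the paper selects measured signals with low analyte-concentration correlations:

```python
import numpy as np

rng = np.random.default_rng(0)
n_wl = 200
k_known = rng.random((1, n_wl))      # pure spectrum of the calibrated analyte
k_unknown = rng.random((1, n_wl))    # interferent missing from the calibration

# Calibration mixtures secretly contain the interferent too.
c_known = np.array([[0.2], [0.5], [0.8], [1.0]])
c_unknown = np.array([[0.3], [0.1], [0.4], [0.2]])
X = c_known @ k_known + c_unknown @ k_unknown

# Plain CLS: estimate pure spectra from the known concentrations only (biased,
# since the interferent contribution leaks into the estimated analyte spectrum).
K_cls = np.linalg.lstsq(c_known, X, rcond=None)[0]

# ACLS: augment the spectral matrix with a signal standing in for the lost
# component information (idealized here as the interferent spectrum itself).
K_acls = np.vstack([K_cls, k_unknown])

x_test = 0.6 * k_known + 0.25 * k_unknown
c_cls = np.linalg.lstsq(K_cls.T, x_test.ravel(), rcond=None)[0]
c_acls = np.linalg.lstsq(K_acls.T, x_test.ravel(), rcond=None)[0]
print(round(float(c_cls[0]), 3), round(float(c_acls[0]), 3))
```

The augmented model recovers the true analyte level (0.6) while plain CLS is biased by the unmodelled component, which is the failure mode the paper targets.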

  20. Development of quantitative duplex real-time PCR method for screening analysis of genetically modified maize.

    Science.gov (United States)

    Oguchi, Taichi; Onishi, Mari; Minegishi, Yasutaka; Kurosawa, Yasunori; Kasahara, Masaki; Akiyama, Hiroshi; Teshima, Reiko; Futo, Satoshi; Furui, Satoshi; Hino, Akihiro; Kitta, Kazumi

    2009-06-01

    A duplex real-time PCR method was developed for quantitative screening analysis of GM maize. The duplex real-time PCR simultaneously detected two GM-specific segments, namely the cauliflower mosaic virus (CaMV) 35S promoter (P35S) segment and an event-specific segment for GA21 maize which does not contain P35S. Calibration was performed with a plasmid calibrant specially designed for the duplex PCR. The result of an in-house evaluation suggested that the analytical precision of the developed method was almost equivalent to those of simplex real-time PCR methods, which have been adopted as ISO standard methods for the analysis of GMOs in foodstuffs and have also been employed for the analysis of GMOs in Japan. In addition, this method will reduce both the cost and time requirement of routine GMO analysis by half. The high analytical performance demonstrated in the current study would be useful for the quantitative screening analysis of GM maize. We believe the developed method will be useful for practical screening analysis of GM maize, although interlaboratory collaborative studies should be conducted to confirm this.
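The plasmid-calibrant quantitation follows standard real-time PCR arithmetic: fit Ct against log10 copy number for a dilution series, convert sample Ct values back to copies, and express GM content as a ratio of the GM-specific to the reference signal. All Ct values and the dilution series below are hypothetical, not from the paper:

```python
def fit_standard_curve(log_copies, cts):
    """Least-squares line Ct = slope * log10(copies) + intercept."""
    n = len(cts)
    mx = sum(log_copies) / n
    my = sum(cts) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(log_copies, cts))
             / sum((x - mx) ** 2 for x in log_copies))
    return slope, my - slope * mx

def copies_from_ct(ct, slope, intercept):
    return 10 ** ((ct - intercept) / slope)

# Ten-fold plasmid dilution series, ~3.32 cycles apart (ideal efficiency).
logs = [2, 3, 4, 5]
cts = [31.0, 27.7, 24.4, 21.1]
s, b = fit_standard_curve(logs, cts)
eff = 10 ** (-1 / s) - 1                 # amplification efficiency from slope
gmo_copies = copies_from_ct(26.0, s, b)  # hypothetical GM-specific Ct
ref_copies = copies_from_ct(23.0, s, b)  # hypothetical reference Ct
print(round(eff, 2), round(100 * gmo_copies / ref_copies, 1))
```

A slope near -3.32 (efficiency near 100%) indicates a well-behaved assay; the duplex design simply reads both targets from one reaction against one plasmid calibrant.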

  1. A method for normalizing pathology images to improve feature extraction for quantitative pathology

    Energy Technology Data Exchange (ETDEWEB)

    Tam, Allison [Stanford Institutes of Medical Research Program, Stanford University School of Medicine, Stanford, California 94305 (United States); Barker, Jocelyn [Department of Radiology, Stanford University School of Medicine, Stanford, California 94305 (United States); Rubin, Daniel [Department of Radiology, Stanford University School of Medicine, Stanford, California 94305 and Department of Medicine (Biomedical Informatics Research), Stanford University School of Medicine, Stanford, California 94305 (United States)

    2016-01-15

    Purpose: With the advent of digital slide scanning technologies and the potential proliferation of large repositories of digital pathology images, many research studies can leverage these data for biomedical discovery and to develop clinical applications. However, quantitative analysis of digital pathology images is impeded by batch effects generated by varied staining protocols and staining conditions of pathological slides. Methods: To overcome this problem, this paper proposes a novel, fully automated stain normalization method to reduce batch effects and thus aid research in digital pathology applications. Their method, intensity centering and histogram equalization (ICHE), normalizes a diverse set of pathology images by first scaling the centroids of the intensity histograms to a common point and then applying a modified version of contrast-limited adaptive histogram equalization. Normalization was performed on two datasets of digitized hematoxylin and eosin (H&E) slides of different tissue slices from the same lung tumor, and one immunohistochemistry dataset of digitized slides created by restaining one of the H&E datasets. Results: The ICHE method was evaluated based on image intensity values, quantitative features, and the effect on downstream applications, such as computer-aided diagnosis. For comparison, three methods from the literature were reimplemented and evaluated using the same criteria. The authors found that ICHE not only improved performance compared with un-normalized images, but in most cases showed improvement compared with previous methods for correcting batch effects in the literature. Conclusions: ICHE may be a useful preprocessing step in a digital pathology image processing pipeline.
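A much-simplified sketch of the two ICHE stages on synthetic "batches": intensity centering followed by histogram equalization (plain global equalization here, where ICHE uses a modified contrast-limited adaptive version), with all image statistics invented:

```python
import numpy as np

def intensity_center(img, target=128.0):
    """Shift the image's intensity-histogram centroid (mean) to a common point."""
    out = img.astype(float) + (target - img.mean())
    return np.clip(out, 0, 255)

def hist_equalize(img, nbins=256):
    """Plain histogram equalization, a simplified stand-in for ICHE's
    contrast-limited adaptive equalization step."""
    flat = img.ravel()
    hist, edges = np.histogram(flat, bins=nbins, range=(0, 255))
    cdf = hist.cumsum() / flat.size
    centers = (edges[:-1] + edges[1:]) / 2
    return np.interp(flat, centers, 255 * cdf).reshape(img.shape)

rng = np.random.default_rng(0)
batch_a = rng.normal(90, 20, (64, 64))    # "dark" staining batch
batch_b = rng.normal(170, 20, (64, 64))   # "bright" staining batch
norm_a = hist_equalize(intensity_center(batch_a))
norm_b = hist_equalize(intensity_center(batch_b))
print(round(abs(norm_a.mean() - norm_b.mean()), 1))
```

The two batches start roughly 80 grey levels apart; after centering and equalization their intensity statistics nearly coincide, which is the batch-effect reduction the paper quantifies with downstream features.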

  2. Validation of quantitative and qualitative methods for detecting allergenic ingredients in processed foods in Japan.

    Science.gov (United States)

    Sakai, Shinobu; Adachi, Reiko; Akiyama, Hiroshi; Teshima, Reiko

    2013-06-19

    A labeling system for food allergenic ingredients was established in Japan in April 2002. To monitor the labeling, the Japanese government announced official methods for detecting allergens in processed foods in November 2002. The official methods consist of quantitative screening tests using enzyme-linked immunosorbent assays (ELISAs) and qualitative confirmation tests using Western blotting or the polymerase chain reaction (PCR). In addition, the Japanese government designated 10 μg protein/g food (soluble protein weight of the allergenic ingredient per food weight), determined by ELISA, as the labeling threshold. To standardize the official methods, the criteria for the validation protocol were described in the official guidelines. This paper, which was presented at the Advances in Food Allergen Detection Symposium, ACS National Meeting and Expo, San Diego, CA, Spring 2012, describes the validation protocol outlined in the official Japanese guidelines, the results of interlaboratory studies for the quantitative detection method (ELISA for crustacean proteins) and the qualitative detection method (PCR for shrimp and crab DNAs), and the reliability of the detection methods.

  3. A method to quantitate regional wall motion in left ventriculography using Hildreth algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Terashima, Mikio [Hyogo Red Cross Blood Center (Japan); Naito, Hiroaki; Sato, Yoshinobu; Tamura, Shinichi; Kurosawa, Tsutomu

    1998-06-01

    Quantitative measurement of ventricular wall motion is indispensable for objective evaluation of cardiac function associated with coronary artery disease. We have modified Hildreth's algorithm to estimate excursions of the ventricular wall on left ventricular images yielded by various imaging techniques. Tagged cine-MRI was carried out on 7 healthy volunteers. The original Hildreth method, the modified Hildreth method and the centerline method were applied to the outlines of the images obtained, to estimate excursion of the left ventricular wall and regional shortening, and the accuracy of these methods was evaluated against the values measured directly using the attached tags. The accuracy of the original Hildreth method was comparable to that of the centerline method, while the modified Hildreth method was significantly more accurate than the centerline method (P<0.05). Regional shortening as estimated using the modified Hildreth method differed less from the actually measured regional shortening than did the shortening estimated using the centerline method (P<0.05). The modified Hildreth method allowed reasonable estimation of left ventricular wall excursion in all cases where it was applied. These results indicate that when applied to left ventriculograms for ventricular wall motion analysis, the modified Hildreth method is more useful than the original Hildreth method. (author)

  4. Interobserver comparison of CT and MRI-based prostate apex definition. Clinical relevance for conformal radiotherapy treatment planning

    Energy Technology Data Exchange (ETDEWEB)

    Wachter, S.; Wachter-Gerstner, N.; Goldner, G.; Poetter, R. [University Hospital Vienna (Austria). Dept. of Radiotherapy and Radiobiology; Bock, T. [University Hospital Vienna (Austria). Dept. of Radiotherapy and Radiobiology; University Hospital Kiel (Germany). Interdisciplinary Center of Brachytherapy; Kovacs, G. [University Hospital Kiel (Germany). Interdisciplinary Center of Brachytherapy; Fransson, A. [University Hospital Vienna (Austria). Dept. of Radiotherapy and Radiobiology; Karolinska Hospital, Stockholm (Sweden). Dept. of Hospital

    2002-05-01

    Background: CT is widely used for conformal radiotherapy treatment planning of prostate carcinoma. Its main limitation lies at the prostatic apex, which cannot be separated from the urogenital diaphragm. The aim of this study was to compare the localization of the prostatic apex on CT and axial MRI with that on sagittal MRI in an interobserver analysis. Patients and Methods: 22 patients with pathologically proven prostatic carcinoma were included in the analysis. In all patients, sagittal and axial T2-weighted MRI and conventional CT were performed. The positions of the MRI and CT apices were localized independently by three observers in relation to the intertrochanteric line. In addition, the ability to define the apical border of the prostatic gland was judged subjectively on a five-point score. Results: The apex of the prostate could be discriminated significantly better (p<0.001) on MRI than on CT, with the best judgement for sagittal MRI. The interobserver variation in the definition of the prostatic apex was significantly (p=0.009) smaller for sagittal MRI than for axial MRI and CT. On average, the apex as determined by sagittal MRI, axial MRI and CT was located 29 mm, 27 mm and 24 mm above the intertrochanteric line. Compared with sagittal MRI, the apex defined by CT would have led to an additional treatment of 6-13 mm in 10/22 patients; the apex defined by axial MRI, in only five patients. Conclusion: Additional MRI provides superior anatomic information, especially in the apical portion of the prostate, and should be recommended for every patient in the treatment planning process. It helps to avoid unnecessary irradiation of healthy tissue and could decrease anal side effects and radiation-induced impotency by reducing the extent of irradiated penile structures. (orig.)

  5. Development of a rapid method for the quantitative determination of deoxynivalenol using Quenchbody

    Energy Technology Data Exchange (ETDEWEB)

    Yoshinari, Tomoya [Division of Microbiology, National Institute of Health Sciences, 1-18-1, Kamiyoga, Setagaya-ku, Tokyo 158-8501 (Japan); Ohashi, Hiroyuki; Abe, Ryoji; Kaigome, Rena [Biomedical Division, Ushio Inc., 1-12 Minamiwatarida-cho, Kawasaki-ku, Kawasaki 210-0855 (Japan); Ohkawa, Hideo [Research Center for Environmental Genomics, Kobe University, 1-1 Rokkodai, Nada, Kobe 657-8501 (Japan); Sugita-Konishi, Yoshiko, E-mail: y-konishi@azabu-u.ac.jp [Department of Food and Life Science, Azabu University, 1-17-71 Fuchinobe, Chuo-ku, Sagamihara, Kanagawa 252-5201 (Japan)

    2015-08-12

    Quenchbody (Q-body) is a novel fluorescent biosensor based on the antigen-dependent removal of a quenching effect on a fluorophore attached to antibody domains. In order to develop a method using Q-body for the quantitative determination of deoxynivalenol (DON), a trichothecene mycotoxin produced by some Fusarium species, anti-DON Q-body was synthesized from the sequence information of a monoclonal antibody specific to DON. When the purified anti-DON Q-body was mixed with DON, a dose-dependent increase in the fluorescence intensity was observed and the detection range was between 0.0003 and 3 mg L⁻¹. The coefficients of variation were 7.9% at 0.003 mg L⁻¹, 5.0% at 0.03 mg L⁻¹ and 13.7% at 0.3 mg L⁻¹, respectively. The limit of detection was 0.006 mg L⁻¹ for DON in wheat. The Q-body showed an antigen-dependent fluorescence enhancement even in the presence of wheat extracts. To validate the analytical method using Q-body, a spike-and-recovery experiment was performed using four spiked wheat samples. The recoveries were in the range of 94.9-100.2%. The concentrations of DON in twenty-one naturally contaminated wheat samples were quantitated by the Q-body method, LC-MS/MS and an immunochromatographic assay kit. The LC-MS/MS analysis showed that the levels of DON contamination in the samples were between 0.001 and 2.68 mg kg⁻¹. The concentrations of DON quantitated by LC-MS/MS were more strongly correlated with those obtained using the Q-body method (R² = 0.9760) than with the immunochromatographic assay kit (R² = 0.8824). These data indicate that the Q-body system for the determination of DON in wheat samples was successfully developed and Q-body is expected to have a range of applications in the field of food safety. - Highlights: • A rapid method for quantitation of DON using Q-body has been developed. • A recovery test using the anti-DON Q-body was performed. • The concentrations of DON in wheat
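The spike-and-recovery validation reduces to simple arithmetic. The sketch below uses made-up numbers (not the paper's data) to show how a recovery percentage in the reported 94.9-100.2% range is computed.

```python
def recovery_percent(measured, spiked, baseline=0.0):
    """Spike-and-recovery: the fraction of the spiked amount that the
    assay actually reports back, expressed as a percentage.  `baseline`
    is any analyte already present before spiking."""
    return 100.0 * (measured - baseline) / spiked

# hypothetical example: a blank wheat sample spiked with 1.0 mg/kg DON
# for which the Q-body assay reads back 0.95 mg/kg
print(round(recovery_percent(0.95, 1.0), 1))  # 95.0
```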

  6. Quantitative assessment of contact and non-contact lateral force calibration methods for atomic force microscopy.

    Science.gov (United States)

    Tran Khac, Bien Cuong; Chung, Koo-Hyun

    2016-02-01

    Atomic Force Microscopy (AFM) has been widely used for measuring friction force at the nano-scale. However, one of the key challenges faced by AFM researchers is to calibrate an AFM system to interpret a lateral force signal as a quantifiable force. In this study, five rectangular cantilevers were used to quantitatively compare three different lateral force calibration methods to demonstrate the legitimacy and to establish confidence in the quantitative integrity of the proposed methods. The Flat-Wedge method is based on a variation of the lateral output on a surface with flat and changing slopes, the Multi-Load Pivot method is based on taking pivot measurements at several locations along the cantilever length, and the Lateral AFM Thermal-Sader method is based on determining the optical lever sensitivity from the thermal noise spectrum of the first torsional mode with a known torsional spring constant from the Sader method. The results of the calibration using the Flat-Wedge and Multi-Load Pivot methods were found to be consistent within experimental uncertainties, and the experimental uncertainties of the two methods were found to be less than 15%. However, the lateral force sensitivity determined by the Lateral AFM Thermal-Sader method was found to be 8-29% smaller than those obtained from the other two methods. This discrepancy decreased to 3-19% when the torsional mode correction factor for an ideal cantilever was used, which suggests that the torsional mode correction should be taken into account to establish confidence in Lateral AFM Thermal-Sader method.

  7. Quantitative firing transformations of a triaxial ceramic by X-ray diffraction methods

    Energy Technology Data Exchange (ETDEWEB)

    Conconi, M.S.; Gauna, M.R.; Serra, M.F. [Centro de Tecnologia de Recursos Minerales y Ceramica (CETMIC), Buenos Aires (Argentina); Suarez, G.; Aglietti, E.F.; Rendtorff, N.M., E-mail: rendtorff@cetmic.unlp.edu.ar [Universidad Nacional de La Plata (UNLP), Buenos Aires (Argentina). Fac. de Ciencias Exactas. Dept. de Quimica

    2014-10-15

    The firing transformations of traditional (clay-based) ceramics are of technological and archaeological interest, and are usually reported qualitatively or semi-quantitatively. These systems present an important complexity, especially for X-ray diffraction techniques, due to the presence of fully crystalline, low-crystalline and amorphous phases. In this article we present the results of a qualitative and quantitative X-ray diffraction Rietveld analysis of the evolution of the fully crystalline (kaolinite, quartz, cristobalite, feldspars and/or mullite), low-crystalline (metakaolinite and/or spinel-type pre-mullite) and glassy phases of a triaxial (clay-quartz-feldspar) ceramic fired over a wide temperature range between 900 and 1300 °C. The methodology employed to determine the low-crystalline and glassy phase abundances is based on a combination of the internal standard method and the use of a nanocrystalline model in which long-range order is lost, respectively. A preliminary sintering characterization was carried out by following the evolution of contraction, density and porosity with firing temperature. Simultaneous thermo-gravimetric and differential thermal analysis was carried out to elucidate the actual temperatures at which the chemical changes occur. Finally, the quantitative analysis based on Rietveld refinement of the X-ray diffraction patterns was performed. The kaolinite decomposition into metakaolinite was determined quantitatively; the intermediate (980 °C) spinel-type alumino-silicate formation was also quantified; the incongruent fusion of the potash feldspar was observed and quantified, together with the final mullitization and the amorphous (glassy) phase formation. The methodology used to analyze the X-ray diffraction patterns proved suitable for quantitatively evaluating the thermal transformations that occur in a complex system like triaxial ceramics. The evaluated phases can be easily correlated with the processing variables and materials.
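The internal-standard route to the amorphous fraction can be illustrated with a short calculation. This is a generic sketch of the internal standard method, not the authors' code; the phase names and the 20% spike level in the example are invented.

```python
def true_fractions(apparent, std_name, std_weight_fraction):
    """Internal-standard correction for Rietveld phase fractions.

    `apparent` maps phase name -> Rietveld weight fraction (these sum
    to 1 over the crystalline phases only).  The known spike level of
    the standard rescales them to absolute fractions of the mixture,
    and the remaining deficit is assigned to the amorphous phase."""
    scale = std_weight_fraction / apparent[std_name]
    true = {p: f * scale for p, f in apparent.items() if p != std_name}
    true["amorphous"] = 1.0 - std_weight_fraction - sum(true.values())
    return true
```

For example, spiking 20 wt% corundum into a sample and refining apparent fractions of 0.25 corundum / 0.75 quartz implies 60 wt% quartz and 20 wt% amorphous material in the mixture.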

  8. The Use of Quantitative and Qualitative Methods in the Analysis of Academic Achievement among Undergraduates in Jamaica

    Science.gov (United States)

    McLaren, Ingrid Ann Marie

    2012-01-01

    This paper describes a study which uses quantitative and qualitative methods in determining the relationship between academic, institutional and psychological variables and degree performance for a sample of Jamaican undergraduate students. Quantitative methods, traditionally associated with the positivist paradigm, and involving the counting and…

  10. A novel method for quantitative geosteering using azimuthal gamma-ray logging.

    Science.gov (United States)

    Yuan, Chao; Zhou, Cancan; Zhang, Feng; Hu, Song; Li, Chaoliu

    2015-02-01

    A novel method for quantitative geosteering using azimuthal gamma-ray logging is proposed. Real-time up and bottom gamma-ray logs recorded as a logging tool travels through a boundary surface at different relative dip angles are simulated with the Monte Carlo method. The results show that the response points of the up and bottom gamma-ray logs as the tool moves towards a highly radioactive formation can be used to predict the relative dip angle, from which the distance from the drill bit to the boundary surface is then calculated.
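One plausible reading of the geometry, sketched below, is that the up and bottom detectors (separated by roughly the tool diameter) cross the boundary at different along-hole positions, so the offset between their response points fixes the relative dip angle, and trigonometry then gives the distance to the boundary. This is a hedged illustration of that idea, not the paper's actual equations.

```python
import math

def relative_dip_deg(tool_diameter, delta_s):
    """Relative angle between borehole and bed boundary, assuming the
    up/bottom detectors (separated by the tool diameter) respond to
    the boundary `delta_s` apart along the hole."""
    return math.degrees(math.atan2(tool_diameter, delta_s))

def distance_to_boundary(along_hole, dip_deg):
    """Perpendicular distance to the boundary after drilling
    `along_hole` metres past the detection point, at the given
    relative dip angle."""
    return along_hole * math.sin(math.radians(dip_deg))
```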

  11. Quantitative Evaluation Methods of In-Line X-Ray Phase Contrast Techniques

    Institute of Scientific and Technical Information of China (English)

    LI Zheng; LI Cheng-Quan; YU Ai-Min

    2007-01-01

    By revealing the relationship between edge visibility and imaging parameters in in-line phase contrast imaging (PCI), we propose a method to quantitatively measure the contributions of absorption and phase shift from acquired images. We also prove that edge visibility grows with increasing source-object and object-detector distances. The result is validated by the relative phase factor and by experiments conducted on a microfocus x-ray source. This method provides a new approach to evaluating in-line PCI images and is helpful for selecting imaging parameters.

  12. A method for quantitative analysis of aquatic humic substances in clear water based on carbon concentration.

    Science.gov (United States)

    Tsuda, Kumiko; Takata, Akihiro; Shirai, Hidekado; Kozaki, Katsutoshi; Fujitake, Nobuhide

    2012-01-01

    Aquatic humic substances (AHSs) are major constituents of dissolved organic matter (DOM) in freshwater, where they perform a number of important ecological and geochemical functions, yet no method exists for quantifying all AHSs. We have developed a method for the quantitative analysis of AHSs based on their carbon concentration. Our approach includes: (1) the development of techniques for clear-water samples with low AHS concentrations, which normally complicate quantification; (2) avoiding carbon contamination in the laboratory; and (3) optimizing the AHS adsorption conditions.

  13. Quantitative Evaluation of the Total Magnetic Moments of Colloidal Magnetic Nanoparticles: A Kinetics-based Method.

    Science.gov (United States)

    Liu, Haiyi; Sun, Jianfei; Wang, Haoyao; Wang, Peng; Song, Lina; Li, Yang; Chen, Bo; Zhang, Yu; Gu, Ning

    2015-06-08

    A kinetics-based method is proposed to quantitatively characterize the collective magnetization of colloidal magnetic nanoparticles. The method is based on the relationship between the magnetic force on a colloidal droplet and the movement of the droplet under a gradient magnetic field. Through computational analysis of kinetic parameters such as displacement, velocity, and acceleration, the magnetization of colloidal magnetic nanoparticles can be calculated. In our experiments, the values measured using our method exhibited a better linear correlation with magnetothermal heating than those obtained using a vibrating sample magnetometer and a magnetic balance. This finding indicates that this method may be more suitable than the commonly used methods for evaluating the collective magnetism of colloidal magnetic nanoparticles under low magnetic fields. Accurate evaluation of the magnetic properties of colloidal nanoparticles is of great importance for the standardization of magnetic nanomaterials and for their practical application in biomedicine.
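The kinetics idea can be sketched as follows under strong simplifying assumptions (1-D motion, drag neglected, uniform field gradient); the function name and inputs are illustrative, not from the paper.

```python
import numpy as np

def moment_from_track(t, x, droplet_mass, grad_b):
    """Estimate the total magnetic moment of a colloidal droplet from
    its tracked 1-D positions under a known field gradient, using
    Newton's law with drag neglected:  mass * a = mu * dB/dx.

    t, x : arrays of time stamps (s) and positions (m)
    droplet_mass : droplet mass (kg); grad_b : field gradient (T/m)."""
    v = np.gradient(x, t)   # velocity by finite differences
    a = np.gradient(v, t)   # acceleration by finite differences
    return droplet_mass * np.mean(a) / grad_b
```

In practice drag and buoyancy matter, so a real analysis would fit the full equation of motion to the track rather than average the raw acceleration.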

  14. A method for improved clustering and classification of microscopy images using quantitative co-localization coefficients

    LENUS (Irish Health Repository)

    Singan, Vasanth R

    2012-06-08

    Abstract. Background: The localization of proteins to specific subcellular structures in eukaryotic cells provides important information with respect to their function. Fluorescence microscopy approaches to determine localization distribution have proved to be an essential tool in the characterization of unknown proteins, and are now particularly pertinent as a result of the wide availability of fluorescently-tagged constructs and antibodies. However, there are currently very few image analysis options able to effectively discriminate proteins with apparently similar distributions in cells, despite this information being important for protein characterization. Findings: We have developed a novel method for combining two existing image analysis approaches, which results in highly efficient and accurate discrimination of proteins with seemingly similar distributions. We have combined image texture-based analysis with quantitative co-localization coefficients, a method that has traditionally only been used to study the spatial overlap between two populations of molecules. Here we describe and present a novel application for quantitative co-localization, as applied to the study of Rab family small GTP binding proteins localizing to the endomembrane system of cultured cells. Conclusions: We show how quantitative co-localization can be used alongside texture feature analysis, resulting in improved clustering of microscopy images. The use of co-localization as an additional clustering parameter is non-biased and highly applicable to high-throughput image data sets.
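The co-localization coefficient at the heart of the approach is, in its simplest form, Pearson's correlation of pixel intensities between two channels. A minimal sketch, assuming two same-sized single-plane images:

```python
import numpy as np

def pearson_colocalization(ch1, ch2):
    """Pearson's co-localization coefficient between two fluorescence
    channels: the correlation of pixel intensities over the image.
    Ranges from -1 (mutually exclusive) to +1 (perfect overlap)."""
    a = ch1.ravel().astype(np.float64)
    b = ch2.ravel().astype(np.float64)
    a -= a.mean()
    b -= b.mean()
    return float((a @ b) / np.sqrt((a @ a) * (b @ b)))
```

In the clustering application, this scalar (computed between a protein channel and each reference marker) is appended to the texture feature vector for each image.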

  15. [Study of infrared spectroscopy quantitative analysis method for methane gas based on data mining].

    Science.gov (United States)

    Zhang, Ai-Ju

    2013-10-01

    Monitoring of methane gas is one of the important factors affecting coal mine safety, and online real-time monitoring of methane is used for mine safety protection. To improve the accuracy of model analysis, in the present paper the author uses infrared spectroscopy to study a quantitative gas analysis algorithm. By applying data mining techniques to the multi-component infrared spectroscopy quantitative analysis algorithm, it was found that a cluster-analysis partial least squares algorithm is clearly superior, in terms of accuracy, to partial least squares alone. In addition, to reduce the influence of errors in individual calibration samples on model accuracy, cluster analysis was used for data preprocessing; this denoising method was found to improve the analysis accuracy.
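The cluster-then-calibrate idea can be sketched compactly. This is a toy stand-in, not the paper's algorithm: a two-cluster k-means followed by a separate linear model per cluster, where the paper would use partial least squares on multi-component infrared spectra.

```python
import numpy as np

def cluster_then_regress(X, y, n_iter=20, seed=0):
    """Split calibration samples into two clusters (tiny k-means),
    then fit a separate linear model per cluster.  Plain least squares
    stands in here for the partial least squares used in the paper."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), 2, replace=False)]
    for _ in range(n_iter):  # k-means with k = 2
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for k in range(2):
            if np.any(labels == k):
                centers[k] = X[labels == k].mean(axis=0)
    models = {}
    for k in range(2):  # per-cluster least-squares fit [slope..., intercept]
        mask = labels == k
        Xk = np.column_stack([X[mask], np.ones(mask.sum())])
        models[k] = np.linalg.lstsq(Xk, y[mask], rcond=None)[0]
    return centers, models
```

At prediction time a new spectrum would be assigned to its nearest cluster center and quantified with that cluster's model.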

  16. Report of the methods for quantitative organ evaluation in nuclear medicine

    Energy Technology Data Exchange (ETDEWEB)

    Nakata, Shigeru [Ehime Univ., Matsuyama (Japan). Hospital; Akagi, Naoki; Mimura, Hiroaki; Nagaki, Akio; Takahashi, Yasuyuki

    1999-06-01

    The working group on the methods named in the title reported a summary of its survey of the literature on evaluation of the brain, heart, liver and kidney. The report consists of the history, the kinetics of the agents, the methods for quantitative evaluation and a summary for each organ. For the brain, quantitative evaluation of cerebral blood flow scintigraphy with ¹²³I-IMP and ⁹⁹ᵐTc-HMPAO or -ECD was reviewed, with the conclusion that the present convenient methods have problems with precision, for which a novel method and/or tracer should be developed. For cardiac function, there are methods based either on the behavior of the tracer in the blood, which offer excellent reproducibility, or on the morphology of the cardiac wall, whose images can alternatively be analyzed by CT and MRI. For these, ¹³¹I-albumin, ⁹⁹ᵐTc-albumin, -red blood cells, -MIBI and -tetrofosmin have been used. For the myocardium, ²⁰¹Tl has been used to evaluate the ischemic region and, with simultaneous use of ⁹⁹ᵐTc-MIBI or -tetrofosmin, viability. ¹²³I-BMIPP and -MIBG have been developed for myocardial fatty acid metabolism and for cardiac sympathetic nerve function, respectively. Liver function has been evaluated by the blood elimination rate, hepatic uptake, hepatic elimination and hepatic blood flow with the use of ⁹⁹ᵐTc-labeled colloids, -PMT and -GSA. Quantitative evaluation of renal function is now well established with high precision, since the kinetic behavior of the tracers, such as ⁹⁹ᵐTc-DTPA, -MAG3, -DMSA and ¹³¹I-OIH, is simple. (K.H.)

  17. The use of Triangulation in Social Sciences Research : Can qualitative and quantitative methods be combined?

    Directory of Open Access Journals (Sweden)

    Ashatu Hussein

    2015-03-01

    Full Text Available This article refers to a study in Tanzania on fringe benefits, or welfare via the work contract, in which we work both quantitatively and qualitatively. My focus is on the vital issue of combining methods or methodologies. There have been mixed views on the use of triangulation in research. Some authors argue that triangulation is just for increasing the wider and deeper understanding of the study phenomenon, while others argue that triangulation is actually used to increase study accuracy; in this case, triangulation is one of the validity measures. Triangulation is defined as the use of multiple methods, mainly qualitative and quantitative, in studying the same phenomenon for the purpose of increasing study credibility. This implies that triangulation is the combination of two or more methodological approaches, theoretical perspectives, data sources, investigators and analysis methods to study the same phenomenon. However, using both qualitative and quantitative paradigms in the same study has resulted in debate, with some researchers arguing that the two paradigms differ epistemologically and ontologically. Nevertheless, both paradigms are designed towards understanding a particular subject area of interest, and both have strengths and weaknesses. Thus, when they are combined, there is a great possibility of neutralizing the flaws of one method and strengthening the benefits of the other for better research results. To reap the benefits of the two paradigms and minimize the drawbacks of each, the combination of the two approaches is advocated in this article. The quality of our studies on welfare to combat poverty is crucial, especially when we want our conclusions to matter in practice.

  18. Are three generations of quantitative molecular methods sufficient in medical virology? Brief review.

    Science.gov (United States)

    Clementi, Massimo; Bagnarelli, Patrizia

    2015-10-01

    In the last two decades, the development of quantitative molecular methods has characterized the evolution of clinical virology more than any other methodological advancement. Using these methods, a great number of studies have efficiently addressed in vivo the role of viral load, viral replication activity, and viral transcriptional profiles as correlates of disease outcome and progression, and have highlighted the physiopathology of important virus diseases of humans. Furthermore, these studies have contributed to a better understanding of virus-host interactions and have sharply revolutionized research strategies in basic and medical virology. In addition, and importantly from a medical point of view, quantitative methods have provided a rationale for therapeutic intervention and therapy monitoring in medically important viral diseases. Despite the advances in technology and the development of three generations of molecular methods within the last two decades (competitive PCR, real-time PCR, and digital PCR), great challenges still remain for viral testing, related not only to standardization, accuracy, and precision, but also to selection of the best molecular targets for clinical use and to the identification of thresholds for risk stratification and therapeutic decisions. Future research directions, novel methods and technical improvements could be important to address these challenges.

  19. Link-based quantitative methods to identify differentially coexpressed genes and gene Pairs

    Directory of Open Access Journals (Sweden)

    Ye Zhi-Qiang

    2011-08-01

    Full Text Available Abstract. Background: Differential coexpression analysis (DCEA) is increasingly used for investigating the global transcriptional mechanisms underlying phenotypic changes. Current DCEA methods mostly adopt a gene connectivity-based strategy to estimate differential coexpression, which is characterized by comparing the numbers of gene neighbors in different coexpression networks. Although it simplifies the calculation, this strategy mixes up the identities of a gene's different coexpression neighbors, and fails to differentiate significant differential coexpression changes from trivial ones. In particular, correlation reversal is easily missed, although it probably indicates remarkable biological significance. Results: We developed two link-based quantitative methods, DCp and DCe, to identify differentially coexpressed genes and gene pairs (links). Uniquely exploiting the quantitative coexpression change of each gene pair in the coexpression networks, both methods proved superior to currently popular methods in simulation studies. Re-mining a publicly available type 2 diabetes (T2D) expression dataset from the perspective of differential coexpression analysis led to discoveries beyond those from differential expression analysis. Conclusions: This work pointed out a critical weakness of current popular DCEA methods, and proposed two link-based DCEA algorithms that will contribute to the development of DCEA and help extend it to a broader spectrum.
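The link-based idea, stripped to its core, scores each gene pair by the change in its correlation between two conditions, so a reversal from +1 to -1 scores maximally instead of vanishing in a neighbor count. A minimal sketch (not the published DCp/DCe implementations; the 0.8 threshold is arbitrary):

```python
import numpy as np

def dc_links(expr1, expr2, genes, threshold=0.8):
    """Score every gene pair (link) by the change in its Pearson
    correlation between two conditions, |r1 - r2|.  Rows of expr1 and
    expr2 are genes, columns are samples.  Correlation reversals
    (r1 = +1, r2 = -1) score maximally under this measure."""
    r1, r2 = np.corrcoef(expr1), np.corrcoef(expr2)
    links = []
    n = len(genes)
    for i in range(n):
        for j in range(i + 1, n):
            d = abs(r1[i, j] - r2[i, j])
            if d >= threshold:
                links.append((genes[i], genes[j], round(d, 3)))
    return links
```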

  20. Development and validation of HPLC method for quantitative analysis of triamcinolone in biodegradable microparticles

    Directory of Open Access Journals (Sweden)

    A. A. Silva-Júnior

    2009-01-01

    Full Text Available A simple, rapid, selective and specific high performance liquid chromatographic (HPLC) method for quantitative analysis of triamcinolone in poly(lactide-co-glycolide) (PLGA) microparticles was developed. The chromatographic conditions were: reversed-phase C18 column, 250 mm x 4.6 mm, particle size 5 μm, with the column oven thermostated at 35 ºC ± 2 ºC. The mobile phase was methanol/water 45:55 (v/v) and elution was isocratic at a flow rate of 1 mL.min-1. Detection was performed with a UV-Vis detector at 239 nm, and the injected sample volume was 10 µL. The standard curve was linear (r² > 0.999) in the concentration range 100-2500 ng.mL-1. The method showed adequate precision, with a relative standard deviation (RSD) smaller than 3%. Accuracy was assessed by adding a drug standard, and good recovery values were obtained at all drug concentrations used. The method showed specificity and selectivity, with linearity in the working range and good precision and accuracy, making it very suitable for quantitation of triamcinolone in PLGA microparticles. Keywords: triamcinolone; HPLC analytical method; PLGA microparticles; analytical method validation.
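Quantitation against a standard curve like the one described is a linear inversion: fit peak area against concentration, then map each measured area back through the fit. The calibration points and detector response below are made up for illustration, not taken from the paper.

```python
import numpy as np

def calibrate(conc, area):
    """Fit the standard curve (peak area vs. concentration) by linear
    least squares and return a function mapping a measured peak area
    back to a concentration."""
    slope, intercept = np.polyfit(conc, area, 1)
    return lambda a: (a - intercept) / slope

# hypothetical calibration points spanning the 100-2500 ng/mL range
conc = np.array([100.0, 500.0, 1000.0, 1500.0, 2500.0])
area = 0.02 * conc + 1.0  # invented linear detector response
to_conc = calibrate(conc, area)
print(round(to_conc(21.0)))  # 1000
```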

  1. Emerging flow injection mass spectrometry methods for high-throughput quantitative analysis.

    Science.gov (United States)

    Nanita, Sergio C; Kaldon, Laura G

    2016-01-01

    Where does flow injection analysis mass spectrometry (FIA-MS) stand relative to ambient mass spectrometry (MS) and chromatography-MS? Improvements in FIA-MS methods have resulted in fast-expanding uses of this technique. A key advantage of FIA-MS over chromatography-MS is fast analysis (run times are typically much shorter than chromatographic separations), which matters wherever quantitative screening of chemicals needs to be performed rapidly and reliably. The FIA-MS methods discussed herein have demonstrated quantitation of diverse analytes, including pharmaceuticals, pesticides, environmental contaminants, and endogenous compounds, at levels ranging from parts-per-billion (ppb) to parts-per-million (ppm) in very complex matrices (such as blood, urine, and a variety of foods of plant and animal origin), allowing successful applications of the technique in clinical diagnostics, metabolomics, environmental sciences, toxicology, and detection of adulterated/counterfeited goods. The recent boom in applications of FIA-MS for high-throughput quantitative analysis has been driven in part by (1) the continuous improvements in sensitivity and selectivity of MS instrumentation, (2) the introduction of novel sample preparation procedures compatible with standalone mass spectrometric analysis, such as salting-out assisted liquid-liquid extraction (SALLE) with volatile solutes and NH4(+) QuEChERS, and (3) the need to improve the efficiency of laboratories to satisfy increasing analytical demand while lowering operational cost. The advantages and drawbacks of quantitative analysis by FIA-MS are discussed in comparison to chromatography-MS and ambient MS (e.g., DESI, LAESI, DART). Generally, FIA-MS sits 'in the middle' between ambient MS and chromatography-MS, offering a balance between analytical capability and sample analysis throughput suitable for broad applications in life sciences, agricultural chemistry, consumer safety, and beyond.

  2. Combinative Method Using Multi-components Quantitation and HPLC Fingerprint for Comprehensive Evaluation of Gentiana crassicaulis

    Science.gov (United States)

    Song, Jiuhua; Chen, Fengzheng; Liu, Jiang; Zou, Yuanfeng; Luo, Yun; Yi, Xiaoyan; Meng, Jie; Chen, Xingfu

    2017-01-01

    Background: Gentiana crassicaulis () is an important traditional Chinese herb. Like other herbs, its chemical compounds vary greatly by the environmental and genetic factors, as a result, the quality is always different even from the same region, and therefore, the quality evaluation is necessary for its safety and effective use. In this study, a comprehensive method including HPLC quantitative analysis and fingerprints was developed to evaluate the quality of Cujingqinjiao and to classify the samples collected from Lijiang City of Yunnan province. A total of 30 common peaks including four identified peaks, were found, and were involved for further characterization and quality control of Cujingqinjiao. Twenty-one batches of samples from Lijiang City of Yunnan Province were evaluated by similarity analysis (SA), hierarchical cluster analysis (HCA), principal component analysis (PCA) and factor analysis (FA) according to the characteristic of common peaks. Results: The obtained data showed good stability and repeatability of the chromatographic fingerprint, similarity values were all more than 0.90. This study demonstrated that a combination of the chromatographic quantitative analysis and fingerprint offered an efficient way to quality consistency evaluation of Cujingqinjiao. Consistent results were obtained to show that samples from a same origin could be successfully classified into two groups. Conclusion: This study revealed that the combinative method was reliable, simple and sensitive for fingerprint analysis, moreover, for quality control and pattern recognition of Cujingqinjiao. 
SUMMARY: HPLC quantitative analysis and fingerprints were developed to evaluate the quality of Gentiana crassicaulis. Similarity analysis, hierarchical cluster analysis, principal component analysis and factor analysis were employed to analyse the chromatographic dataset. The results of multi-component quantitation, similarity analysis, hierarchical cluster analysis and principal component analysis were consistent.

  3. Method Specific Calibration Corrects for DNA Extraction Method Effects on Relative Telomere Length Measurements by Quantitative PCR

    Science.gov (United States)

    Holland, Rebecca; Underwood, Sarah; Fairlie, Jennifer; Psifidi, Androniki; Ilska, Joanna J.; Bagnall, Ainsley; Whitelaw, Bruce; Coffey, Mike; Banos, Georgios; Nussey, Daniel H.

    2016-01-01

    Telomere length (TL) is increasingly being used as a biomarker in epidemiological, biomedical and ecological studies. A wide range of DNA extraction techniques have been used in telomere experiments and recent quantitative PCR (qPCR) based studies suggest that the choice of DNA extraction method may influence average relative TL (RTL) measurements. Such extraction method effects may limit the use of historically collected DNA samples extracted with different methods. However, if extraction method effects are systematic an extraction method specific (MS) calibrator might be able to correct for them, because systematic effects would influence the calibrator sample in the same way as all other samples. In the present study we tested whether leukocyte RTL in blood samples from Holstein Friesian cattle and Soay sheep measured by qPCR was influenced by DNA extraction method and whether MS calibration could account for any observed differences. We compared two silica membrane-based DNA extraction kits and a salting out method. All extraction methods were optimized to yield enough high quality DNA for TL measurement. In both species we found that silica membrane-based DNA extraction methods produced shorter RTL measurements than the non-membrane-based method when calibrated against an identical calibrator. However, these differences were not statistically detectable when a MS calibrator was used to calculate RTL. This approach produced RTL measurements that were highly correlated across extraction methods (r > 0.76) and had coefficients of variation lower than 10% across plates of identical samples extracted by different methods. Our results are consistent with previous findings that popular membrane-based DNA extraction methods may lead to shorter RTL measurements than non-membrane-based methods. 
However, we also demonstrate that these differences can be accounted for by using an extraction method-specific calibrator, offering researchers a simple means of accounting for such extraction method effects.
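The method-specific calibration described above is, in ΔΔCq terms, a matter of dividing each sample's telomere/reference ratio by the same ratio from a calibrator extracted with the same method, so that systematic extraction effects cancel. A minimal sketch with hypothetical Cq values (assuming the common 2^-ΔΔCq model with 100% amplification efficiency; function and variable names are illustrative):

```python
def rtl(cq_telomere, cq_reference, calibrator):
    """Relative telomere length by the 2^-ddCq model.

    `calibrator` is the (telomere Cq, reference Cq) pair measured on a
    calibrator sample extracted with the SAME method as the test sample,
    so any systematic extraction effect cancels out of the ratio."""
    d_sample = cq_telomere - cq_reference
    d_cal = calibrator[0] - calibrator[1]
    return 2.0 ** -(d_sample - d_cal)

# Hypothetical Cq values for one sample measured after two extraction
# methods; the method effect shifts sample and calibrator Cqs equally.
membrane = rtl(14.2, 19.0, calibrator=(14.7, 19.0))
salting_out = rtl(13.9, 19.0, calibrator=(14.4, 19.0))
# Both give the same RTL once each method uses its own calibrator.
print(round(membrane, 6), round(salting_out, 6))
```

Calibrating each extraction method against an identically extracted calibrator is what makes the RTL values comparable across methods in this sketch.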

  4. Method Specific Calibration Corrects for DNA Extraction Method Effects on Relative Telomere Length Measurements by Quantitative PCR.

    Science.gov (United States)

    Seeker, Luise A; Holland, Rebecca; Underwood, Sarah; Fairlie, Jennifer; Psifidi, Androniki; Ilska, Joanna J; Bagnall, Ainsley; Whitelaw, Bruce; Coffey, Mike; Banos, Georgios; Nussey, Daniel H

    2016-01-01

    Telomere length (TL) is increasingly being used as a biomarker in epidemiological, biomedical and ecological studies. A wide range of DNA extraction techniques have been used in telomere experiments and recent quantitative PCR (qPCR) based studies suggest that the choice of DNA extraction method may influence average relative TL (RTL) measurements. Such extraction method effects may limit the use of historically collected DNA samples extracted with different methods. However, if extraction method effects are systematic an extraction method specific (MS) calibrator might be able to correct for them, because systematic effects would influence the calibrator sample in the same way as all other samples. In the present study we tested whether leukocyte RTL in blood samples from Holstein Friesian cattle and Soay sheep measured by qPCR was influenced by DNA extraction method and whether MS calibration could account for any observed differences. We compared two silica membrane-based DNA extraction kits and a salting out method. All extraction methods were optimized to yield enough high quality DNA for TL measurement. In both species we found that silica membrane-based DNA extraction methods produced shorter RTL measurements than the non-membrane-based method when calibrated against an identical calibrator. However, these differences were not statistically detectable when a MS calibrator was used to calculate RTL. This approach produced RTL measurements that were highly correlated across extraction methods (r > 0.76) and had coefficients of variation lower than 10% across plates of identical samples extracted by different methods. Our results are consistent with previous findings that popular membrane-based DNA extraction methods may lead to shorter RTL measurements than non-membrane-based methods. 
However, we also demonstrate that these differences can be accounted for by using an extraction method-specific calibrator, offering researchers a simple means of accounting for such extraction method effects.

  5. An evaluation of root ZX and elements diagnostic apex locators.

    Science.gov (United States)

    Tselnik, Marat; Baumgartner, J Craig; Marshall, J Gordon

    2005-07-01

    The purpose of this study was to compare the accuracy of the Root ZX and Elements Diagnostic electronic apex locators under clinical conditions. Thirty-six teeth planned for extraction were used. Each tooth was decoronated, coronally flared with Orifice Shapers, and irrigated with 2.6% sodium hypochlorite. Working lengths were measured with K-files using both electronic apex locators. The files were cemented at the last measured working length and the teeth were extracted. The apical 4 mm of each canal were exposed and photographed under 15x and 30x magnification. Images of each apex were projected and the distance from the file tip to the minor diameter was determined. The mean distances from the file tip to the minor diameter were 0.346 mm for the Elements Diagnostic and 0.410 mm for the Root ZX, both beyond the minor constriction. In locating the minor constriction, the Root ZX was accurate 75% of the time to +/-0.5 mm, 83.3% to +/-0.75 mm, and 88.9% to +/-1 mm. The Elements Diagnostic was accurate 75% of the time to +/-0.5 mm, 88.9% to +/-0.75 mm, and 91.7% to +/-1 mm. There was no statistically significant difference between the accuracy of the two electronic apex locators in locating the minor diameter (at the 0.05 significance level).
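Accuracy figures like those above amount to counting measurements whose file-tip-to-target distance falls within a tolerance band. A minimal sketch (the signed distances below are hypothetical, not the study's data):

```python
def accuracy_within(distances_mm, tol_mm):
    """Percentage of measurements within +/- tol_mm of the target
    (here, the minor diameter of the canal)."""
    hits = sum(1 for d in distances_mm if abs(d) <= tol_mm)
    return 100.0 * hits / len(distances_mm)

# Hypothetical signed distances in mm (negative = short of the minor diameter)
distances = [0.1, -0.3, 0.6, 0.45, -0.8, 0.2, 0.0, 1.1, -0.4, 0.5]
for tol in (0.5, 0.75, 1.0):
    print(tol, accuracy_within(distances, tol))
```

Widening the tolerance band can only keep or raise the percentage, which is why the reported accuracies rise monotonically from ±0.5 mm to ±1 mm.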

  6. Phosphorus modeling in tile drained agricultural systems using APEX

    Science.gov (United States)

    Phosphorus losses through tile drained systems in agricultural landscapes may be causing the persistent eutrophication problems observed in surface water. The purpose of this paper is to evaluate the state of the science in the Agricultural Policy/Environmental eXtender (APEX) model related to surf...

  7. Aspergillosis of the Petrous Apex and Meckel's Cave

    OpenAIRE

    Ederies, Ash; Chen, Joseph; Aviv, Richard I.; Pirouzmand, Farhad; Bilbao, Juan M.; Thompson, Andrew L.; Symons, Sean P.

    2010-01-01

    Cranial cerebral aspergillosis is a rare entity in immunocompetent patients. Invasive disease involving the petrous apex and Meckel's cave has rarely been described. We present a case of localized invasive petrous apical and Meckel's cave disease in an immunocompetent patient who presented with hemicranial neuralgic pain.

  8. Aspergillosis of the Petrous Apex and Meckel's Cave.

    Science.gov (United States)

    Ederies, Ash; Chen, Joseph; Aviv, Richard I; Pirouzmand, Farhad; Bilbao, Juan M; Thompson, Andrew L; Symons, Sean P

    2010-05-01

    Cranial cerebral aspergillosis is a rare entity in immunocompetent patients. Invasive disease involving the petrous apex and Meckel's cave has rarely been described. We present a case of localized invasive petrous apical and Meckel's cave disease in an immunocompetent patient who presented with hemicranial neuralgic pain.

  9. In vitro comparison of four different electronic apex locators to ...

    African Journals Online (AJOL)

    2014-04-14

    Apr 14, 2014 ... Nigerian Journal of Clinical Practice • Nov-Dec 2014 • Vol 17 • Issue 6. Abstract. Objectives: The aim of this study was to evaluate the accuracy of four ... The distance between the tip of the file and the major foramen was measured ... Key words: Clearing technique, electronic apex locator, major foramen.

  10. CT scan screening is associated with increased distress among subjects of the APExS

    Directory of Open Access Journals (Sweden)

    Stoufflet Audrey

    2010-10-01

    Full Text Available Abstract Background: The aim of this study was to assess the psychological consequences of HRCT scan screening in retired asbestos-exposed workers. Methods: A HRCT-scan screening program for asbestos-related diseases was carried out in four regions of France. At baseline (T1), subjects filled in self-administered occupational questionnaires. In two of the regions, subjects also received a validated psychological scale, namely the psychological consequences questionnaire (PCQ). The physician was required to provide the subject with the results of the HRCT scan at a final visit. A second assessment of psychological consequences was performed 6 months after the HRCT-scan examination (T2). PCQ scores were compared quantitatively (t-test, general linear model) and qualitatively (chi²-test, logistic regression) to screening results. Multivariate analyses were adjusted for gender, age, smoking, asbestos exposure and counseling. Results: Among the 832 subjects included in this psychological impact study, HRCT-scan screening was associated with a significant increase of the psychological score 6 months after the examination relative to baseline values (8.31 to 10.08). Conclusion: This study suggests that HRCT-scan screening may be associated with increased distress in asbestos-exposed subjects. If confirmed, these results may have consequences for HRCT-scan screening recommendations.
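The quantitative comparison above rests on a paired t-test of PCQ scores at T1 versus T2. A minimal sketch with hypothetical scores (the study's own data are not reproduced here):

```python
import math

def paired_t(before, after):
    """Paired t statistic and degrees of freedom for pre/post scores
    (e.g. PCQ at baseline T1 versus follow-up T2)."""
    diffs = [a - b for a, b in zip(after, before)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)
    return mean / math.sqrt(var / n), n - 1

# Hypothetical PCQ scores for six subjects at T1 and T2
t1_scores = [8, 7, 9, 10, 8, 8]
t2_scores = [10, 9, 10, 12, 11, 9]
t_stat, df = paired_t(t1_scores, t2_scores)
print(round(t_stat, 2), df)  # a positive t indicates higher scores at T2
```

The t statistic is then compared against the t distribution with n-1 degrees of freedom to obtain the p-value.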

  11. A Quantitative Analysis on Two RFS-Based Filtering Methods for Multicell Tracking

    Directory of Open Access Journals (Sweden)

    Yayun Ren

    2014-01-01

    Full Text Available Multiobject filters developed from the theory of random finite sets (RFS have recently become well-known methods for solving multiobject tracking problem. In this paper, we present two RFS-based filtering methods, Gaussian mixture probability hypothesis density (GM-PHD filter and multi-Bernoulli filter, to quantitatively analyze their performance on tracking multiple cells in a series of low-contrast image sequences. The GM-PHD filter, under linear Gaussian assumptions on the cell dynamics and birth process, applies the PHD recursion to propagate the posterior intensity in an analytic form, while the multi-Bernoulli filter estimates the multitarget posterior density through propagating the parameters of a multi-Bernoulli RFS that approximates the posterior density of multitarget RFS. Numerous performance comparisons between the two RFS-based methods are carried out on two real cell images sequences and demonstrate that both yield satisfactory results that are in good agreement with manual tracking method.

  12. An experimental method for quantitatively evaluating the elemental processes of indoor radioactive aerosol behavior.

    Science.gov (United States)

    Yamazawa, H; Yamada, S; Xu, Y; Hirao, S; Moriizumi, J

    2015-11-01

    An experimental method for quantitatively evaluating the elemental processes governing the indoor behaviour of naturally occurring radioactive aerosols was proposed. This method utilises the transient response of aerosol concentrations to an artificial change in the aerosol removal rate induced by turning an air purifier on and off. It was shown that the indoor-outdoor exchange rate and the indoor deposition rate could be estimated from continuous measurements of outdoor and indoor aerosol number concentrations using the method proposed in this study. Although the scatter of the estimated parameters is relatively large, both methods gave consistent results. It was also found that the size distribution of radioactive aerosol particles, and hence the activity median aerodynamic diameter, remained largely unaffected by the operation of the air purifier, implying the predominance of the exchange and deposition processes over other processes that change the size distribution, such as size growth by coagulation and the size dependence of deposition.
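The parameter-estimation idea described above, inferring removal rates from the transient response of indoor concentrations, can be sketched as a log-linear fit to a decay curve. The function name and the synthetic data are illustrative only, not the authors' procedure:

```python
import math

def total_loss_rate(times_h, conc, c_background=0.0):
    """Estimate the total removal rate (air exchange + deposition + purifier)
    from the log-linear decay of indoor concentration after the purifier is
    switched on: least-squares slope of ln(C - background) versus time."""
    ys = [math.log(c - c_background) for c in conc]
    n = len(times_h)
    mx = sum(times_h) / n
    my = sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(times_h, ys)) / \
            sum((x - mx) ** 2 for x in times_h)
    return -slope  # per hour

# Synthetic decay generated with a true total removal rate of 1.5 per hour
times = [0.0, 0.5, 1.0, 1.5, 2.0]
conc = [1000.0 * math.exp(-1.5 * t) for t in times]
rate = total_loss_rate(times, conc)
print(round(rate, 2))  # -> 1.5
```

Comparing the fitted rate with the purifier on versus off isolates the purifier's contribution, leaving the exchange-plus-deposition term.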

  13. Development and application of quantitative detection method for viral hemorrhagic septicemia virus (VHSV) genogroup IVa.

    Science.gov (United States)

    Kim, Jong-Oh; Kim, Wi-Sik; Kim, Si-Woo; Han, Hyun-Ja; Kim, Jin Woo; Park, Myoung Ae; Oh, Myung-Joo

    2014-05-23

    Viral hemorrhagic septicemia virus (VHSV) is a problematic pathogen in olive flounder (Paralichthys olivaceus) aquaculture farms in Korea. Thus, it is necessary to develop a rapid and accurate diagnostic method to detect this virus. We developed a quantitative RT-PCR (qRT-PCR) method based on the nucleocapsid (N) gene sequence of Korean VHSV isolate (Genogroup IVa). The slope and R² values of the primer set developed in this study were -0.2928 (96% efficiency) and 0.9979, respectively. Its comparison with viral infectivity calculated by traditional quantifying method (TCID₅₀) showed a similar pattern of kinetic changes in vitro and in vivo. The qRT-PCR method reduced detection time compared to that of TCID₅₀, making it a very useful tool for VHSV diagnosis.
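The link between a qPCR standard-curve slope and amplification efficiency follows the conventional relation E = 10^(-1/slope) - 1. A sketch (the slope value below is illustrative, chosen to give roughly the 96% efficiency reported; it is not the study's reported coefficient):

```python
def efficiency_from_slope(slope):
    """Amplification efficiency from the slope of a Cq-vs-log10(copies)
    standard curve: E = 10**(-1/slope) - 1. Perfect doubling per cycle
    gives slope -1/log10(2) ~ -3.32, i.e. E = 1.0 (100%)."""
    return 10.0 ** (-1.0 / slope) - 1.0

slope = -3.42  # illustrative value giving roughly 96% efficiency
e = efficiency_from_slope(slope)
print(round(e * 100, 1))
```

Efficiencies near 100% and an R² close to 1 are what make a primer set usable for absolute quantitation against a standard curve.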

  14. Development and Application of Quantitative Detection Method for Viral Hemorrhagic Septicemia Virus (VHSV Genogroup IVa

    Directory of Open Access Journals (Sweden)

    Jong-Oh Kim

    2014-05-01

    Full Text Available Viral hemorrhagic septicemia virus (VHSV) is a problematic pathogen in olive flounder (Paralichthys olivaceus) aquaculture farms in Korea. Thus, it is necessary to develop a rapid and accurate diagnostic method to detect this virus. We developed a quantitative RT-PCR (qRT-PCR) method based on the nucleocapsid (N) gene sequence of a Korean VHSV isolate (Genogroup IVa). The slope and R² values of the primer set developed in this study were −0.2928 (96% efficiency) and 0.9979, respectively. Its comparison with viral infectivity calculated by a traditional quantifying method (TCID50) showed a similar pattern of kinetic changes in vitro and in vivo. The qRT-PCR method reduced detection time compared to that of TCID50, making it a very useful tool for VHSV diagnosis.

  15. [Application and Integration of Qualitative and Quantitative Research Methods in Intervention Studies in Rehabilitation Research].

    Science.gov (United States)

    Wirtz, M A; Strohmer, J

    2016-06-01

    In order to develop and evaluate interventions in rehabilitation research, a wide range of empirical research methods may be adopted. Qualitative research methods emphasize the relevance of an open research focus and a natural proximity to research objects. Accordingly, qualitative methods offer particular benefits when researchers strive to identify and organize unknown information aspects (inductive purpose). Quantitative research methods, by contrast, require a high degree of standardization and transparency of the research process, and a clear definition of efficacy and effectiveness exists (deductive purpose). These paradigmatic approaches are characterized by almost opposite key characteristics, application standards, purposes and quality criteria. Hence, specific aspects have to be considered if researchers aim to select or combine these approaches in order to ensure an optimal gain in knowledge. © Georg Thieme Verlag KG Stuttgart · New York.

  16. COMPUTATIONAL METHOD FOR SEMI-QUANTITATIVE ANALYSIS OF IMMUNOBLOTS OF MODIFIED PROTEINS USING IMAGEJ

    Directory of Open Access Journals (Sweden)

    Aditya Arya

    2015-09-01

    Full Text Available Oxidative stress is associated with the generation of reactive oxygen/nitrogen species (RNOS), which non-enzymatically modify active functional groups in proteins, mostly turning them into protein carbonyls or nitrosyls. These changes render the protein molecules non-functional and drive them to degradation or to the formation of cross-linked aggregates. Limited methods are available for detection of protein modifications: enzyme-linked immunoassays as a quantitative method and immunoblotting-based qualitative methods are the most common. Visual examination of immunoblots containing a characteristic pattern makes a fair comparison difficult if the differences are trifling or the modifications are of varying degree across the complete range. This necessitates the use of image processing tools for a fair comparison. We report here a computational approach using ImageJ to process immunoblots and obtain significant inter-group comparisons. This method also provides software developers and programmers an opportunity to augment gel processing tools with such plugins and features.
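The ImageJ-style workflow described above boils down to background subtraction, band integration, and normalization to a loading control. A minimal sketch with hypothetical lane profiles (this illustrates the general densitometry approach, not the authors' plugin):

```python
def band_intensity(profile, background=None):
    """Integrated band intensity from a lane's 1-D densitometric profile.
    The baseline is approximated by the profile minimum (a stand-in for
    ImageJ's rolling-ball background subtraction)."""
    bg = min(profile) if background is None else background
    return sum(v - bg for v in profile)

def normalize(modified, loading_control):
    """Express a modified-protein signal relative to its loading control
    so lanes and groups can be compared fairly."""
    return band_intensity(modified) / band_intensity(loading_control)

# Hypothetical lane profiles (e.g. column sums over a cropped band region)
carbonyl = [10, 12, 40, 80, 42, 12, 10]
actin = [10, 11, 50, 90, 52, 11, 10]
print(round(normalize(carbonyl, actin), 3))  # -> 0.829
```

Normalizing to a loading control such as actin is what turns raw band intensities into semi-quantitative, comparable values.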

  17. Full quantitative phase analysis of hydrated lime using the Rietveld method

    Energy Technology Data Exchange (ETDEWEB)

    Lassinantti Gualtieri, Magdalena, E-mail: magdalena.gualtieri@unimore.it [Dipartimento Ingegneria dei Materiali e dell' Ambiente, Universita Degli Studi di Modena e Reggio Emilia, Via Vignolese 905/a, I-41100 Modena (Italy); Romagnoli, Marcello; Miselli, Paola; Cannio, Maria [Dipartimento Ingegneria dei Materiali e dell' Ambiente, Universita Degli Studi di Modena e Reggio Emilia, Via Vignolese 905/a, I-41100 Modena (Italy); Gualtieri, Alessandro F. [Dipartimento di Scienze della Terra, Universita Degli Studi di Modena e Reggio Emilia, I-41100 Modena (Italy)

    2012-09-15

    Full quantitative phase analysis (FQPA) using X-ray powder diffraction and Rietveld refinements is a well-established method for the characterization of various hydraulic binders such as Portland cement and hydraulic limes. In this paper, the Rietveld method is applied to hydrated lime, a non-hydraulic traditional binder. The potential presence of an amorphous phase in this material is generally ignored. Both synchrotron radiation and a conventional X-ray source were used for data collection. The applicability of the developed control file for the Rietveld refinements was investigated using samples spiked with glass. The results were cross-checked by other independent methods such as thermal and chemical analyses. The sample microstructure was observed by transmission electron microscopy. It was found that the consistency between the different methods was satisfactory, supporting the validity of FQPA for this material. For the samples studied in this work, the amount of amorphous material was in the range 2-15 wt.%.

  18. Solution identification and quantitative analysis of fiber-capacitive drop analyzer based on multivariate statistical methods

    Science.gov (United States)

    Chen, Zhe; Qiu, Zurong; Huo, Xinming; Fan, Yuming; Li, Xinghua

    2017-03-01

    A fiber-capacitive drop analyzer is an instrument which monitors a growing droplet to produce a capacitive opto-tensiotrace (COT). Each COT is an integration of fiber light intensity signals and capacitance signals and can reflect the unique physicochemical properties of a liquid. In this study, we propose a method for solution identification and concentration quantitation based on multivariate statistical methods. Eight characteristic values are extracted from each COT. A series of COT characteristic values of training solutions at different concentrations composes a data library for that kind of solution. A two-stage linear discriminant analysis is applied to analyze the different solution libraries and establish discriminant functions, by which test solutions can be discriminated. After determining the variety of a test solution, the Spearman correlation test and principal components analysis are used to filter the eight characteristic values and reduce their dimensionality, producing a new representative parameter. A cubic spline interpolation function is built between this parameter and concentration, from which the concentration of the test solution can be calculated. Methanol, ethanol, n-propanol, and saline solutions are taken as experimental subjects in this paper. For each solution, nine or ten different concentrations are chosen as the standard library, and the other two concentrations compose the test group. Using the methods described above, all eight test solutions were correctly identified and the average relative error of quantitative analysis was 1.11%. The proposed method is feasible: it enlarges the applicable scope of recognizing liquids based on the COT and improves the precision of concentration quantitation as well.

  19. Novel method for quantitative ANA measurement using near-infrared imaging.

    Science.gov (United States)

    Peterson, Lisa K; Wells, Daniel; Shaw, Laura; Velez, Maria-Gabriela; Harbeck, Ronald; Dragone, Leonard L

    2009-09-30

    Antinuclear antibodies (ANA) have been detected in patients with systemic rheumatic diseases and are used in the screening and/or diagnosis of autoimmunity in patients as well as mouse models of systemic autoimmunity. Indirect immunofluorescence (IIF) on HEp-2 cells is the gold standard for ANA screening. However, its usefulness is limited in diagnosis, prognosis and monitoring of disease activity due to the lack of standardization in performing the technique, subjectivity in interpreting the results and the fact that it is only semi-quantitative. Various immunological techniques have been developed in an attempt to improve upon the method to quantify ANA, including enzyme-linked immunosorbent assays (ELISAs), line immunoassays (LIAs), multiplexed bead immunoassays and IIF on substrates other than HEp-2 cells. Yet IIF on HEp-2 cells remains the most common screening method for ANA. In this study, we describe a simple quantitative method to detect ANA which combines IIF on HEp-2 coated slides with analysis using a near-infrared imaging (NII) system. Using NII to determine ANA titer, 86.5% (32 of 37) of the titers for human patient samples were within 2 dilutions of those determined by IIF, which is the acceptable range for proficiency testing. Combining an initial screening for nuclear staining using microscopy with titration by NII resulted in 97.3% (36 of 37) of the titers detected to be within two dilutions of those determined by IIF. The NII method for quantitative ANA measurements using serum from both patients and mice with autoimmunity provides a fast, relatively simple, objective, sensitive and reproducible assay, which could easily be standardized for comparison between laboratories.
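The "within two dilutions" criterion used above for proficiency testing can be expressed with doubling-dilution arithmetic. A sketch with hypothetical titers (the pairs below are not the study's data):

```python
import math

def within_n_dilutions(titer_a, titer_b, n=2):
    """True if two endpoint titers (reciprocal dilutions, e.g. 40, 80,
    160, ...) agree within n doubling dilutions."""
    return abs(math.log2(titer_a / titer_b)) <= n

# Hypothetical IIF vs NII titer pairs for four sera
pairs = [(40, 160), (80, 80), (40, 320), (1280, 640)]
agree = [within_n_dilutions(a, b) for a, b in pairs]
print(agree)  # -> [True, True, False, True]
```

Because serial dilutions are two-fold, comparing log2 ratios is the natural way to count how many dilution steps two titers differ by.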

  20. Quantitative dot blot analysis (QDB), a versatile high throughput immunoblot method.

    Science.gov (United States)

    Tian, Geng; Tang, Fangrong; Yang, Chunhua; Zhang, Wenfeng; Bergquist, Jonas; Wang, Bin; Mi, Jia; Zhang, Jiandi

    2017-08-29

    The lack of access to an affordable method of high-throughput immunoblot analysis for daily use remains a big challenge for scientists worldwide. We propose here Quantitative Dot Blot analysis (QDB) to meet this demand. With a defined linear range, QDB analysis fundamentally transforms the traditional immunoblot method into a true quantitative assay. Its convenience in analyzing large numbers of samples also enables bench scientists to examine protein expression levels across multiple parameters. In addition, the small amount of sample lysate needed for analysis means significant savings in research resources and effort. The method was evaluated at both the cellular and tissue levels, with unexpected observations that would otherwise be hard to achieve using conventional immunoblot methods like Western blot analysis. Using the QDB technique, we were able to observe a significant age-dependent alteration of CAPG protein expression level in TRAMP mice. We believe that the adoption of QDB analysis will have an immediate impact on biological and biomedical research by providing much-needed high-throughput information at the protein level in this "Big Data" era.

  1. A probabilistic method for computing quantitative risk indexes from medical injuries compensation claims.

    Science.gov (United States)

    Dalle Carbonare, S; Folli, F; Patrini, E; Giudici, P; Bellazzi, R

    2013-01-01

    The increasing demand for health care services and the complexity of health care delivery require Health Care Organizations (HCOs) to approach clinical risk management with proper methods and tools. An important aspect of risk management is to exploit the analysis of medical injuries compensation claims in order to reduce adverse events and, at the same time, to optimize the costs of health insurance policies. This work provides a probabilistic method to estimate the risk level of a HCO by computing quantitative risk indexes from medical injury compensation claims. Our method is based on the estimate of a loss probability distribution from compensation claims data through parametric and non-parametric modeling and Monte Carlo simulations. The loss distribution can be estimated both on the whole dataset and, thanks to the application of a Bayesian hierarchical model, on stratified data. The approach allows quantitative assessment of the risk structure of the HCO by analyzing the loss distribution and deriving its expected value and percentiles. We applied the proposed method to 206 cases of injuries with compensation requests collected from 1999 to the first semester of 2007 by the HCO of Lodi, in the Northern part of Italy. We computed the risk indexes taking into account the different clinical departments and the different hospitals involved. The approach proved useful for understanding the HCO risk structure in terms of frequency, severity, expected and unexpected loss related to adverse events.
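The loss-distribution approach described above can be sketched as a frequency/severity Monte Carlo simulation: draw a claim count per year, draw a severity per claim, and read the expected value and percentiles off the simulated aggregate losses. All distributions and parameters below are illustrative assumptions, not the study's fitted models:

```python
import random
import statistics

def simulate_annual_losses(n_years, claim_rate, sev_mu, sev_sigma, seed=42):
    """Monte Carlo aggregate-loss simulation: Poisson-like claim counts
    (sampled via exponential inter-arrival times) and lognormal severities."""
    rng = random.Random(seed)
    losses = []
    for _ in range(n_years):
        # Count claims arriving within one year (unit interval)
        n, t = 0, 0.0
        while True:
            t += rng.expovariate(claim_rate)
            if t > 1.0:
                break
            n += 1
        losses.append(sum(rng.lognormvariate(sev_mu, sev_sigma) for _ in range(n)))
    return losses

losses = simulate_annual_losses(5000, claim_rate=25, sev_mu=9.0, sev_sigma=1.2)
expected_loss = statistics.mean(losses)
p95 = sorted(losses)[int(0.95 * len(losses))]  # tail percentile ("unexpected loss")
print(expected_loss < p95)  # -> True
```

The gap between the expected loss and a tail percentile is the kind of risk index the abstract refers to as expected versus unexpected loss.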

  2. Development of a HPLC Method for the Quantitative Determination of Capsaicin in Collagen Sponge

    Directory of Open Access Journals (Sweden)

    Chun-Lian Guo

    2015-01-01

    Full Text Available Controlling the concentration of drugs in pharmaceutical products is essential to patient safety. In this study, a simple and sensitive HPLC method is developed to quantitatively analyze capsaicin in collagen sponge. The capsaicin was extracted from the sponge for 30 min by ultrasonic wave extraction with methanol as the solvent. The chromatographic method used an isocratic system of acetonitrile-water (70:30) at a flow rate of 1 mL/min, with detection at 280 nm. Capsaicin was successfully separated with good linearity (regression equation A = 9.7182C + 0.8547; R² = 1.0) and excellent recovery (99.72%). The mean capsaicin concentration in collagen sponge was 49.32 mg/g (RSD = 1.30%; n = 3). In conclusion, the ultrasonic wave extraction method is simple and the extraction efficiency is high. The HPLC assay has excellent sensitivity and specificity and is a convenient method for capsaicin detection in collagen sponge. This paper is the first to report the quantitative analysis of capsaicin in collagen sponge.
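Back-calculating concentration from peak area just inverts the reported calibration line. A sketch using the regression coefficients quoted in the abstract (the peak area below is synthetic, not a measured value):

```python
# Calibration reported in the abstract: A = 9.7182*C + 0.8547 (R^2 = 1.0),
# where A is peak area and C is capsaicin concentration.
SLOPE, INTERCEPT = 9.7182, 0.8547

def concentration(peak_area):
    """Back-calculate concentration from peak area via the calibration line."""
    return (peak_area - INTERCEPT) / SLOPE

def recovery_percent(measured, spiked):
    """Spike recovery, as in the 99.72% figure reported above."""
    return 100.0 * measured / spiked

area = SLOPE * 5.0 + INTERCEPT  # synthetic peak area for a concentration of 5
print(round(concentration(area), 3))  # -> 5.0
```

Recovery is computed the same way in practice: the back-calculated amount of a spiked standard divided by the known spiked amount.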

  3. An improved method for retrospective motion correction in quantitative T2* mapping.

    Science.gov (United States)

    Nöth, Ulrike; Volz, Steffen; Hattingen, Elke; Deichmann, Ralf

    2014-05-15

    A new method for motion correction of T2*-weighted data and resulting quantitative T2* maps is presented. For this method, additional data sets with a reduced number of phase encoding steps covering the k-space centre are acquired. Motion correction is based on a 3-step procedure: (1) calculation of improved input data sets with reduced artefact levels from the original data, (2) creation of a target data set free of movement artefacts on the basis of the improved input data sets, and (3) fitting of original data to the target data set, yielding an optimum combination of acquired k-space data which suppresses lines affected by movement. The method was tested on healthy subjects performing pre-trained movement. Motion correction was successful unless the same k-space line was affected by movement in all data sets acquired on a specific subject. The method was applied to patients suffering from subarachnoid haemorrhage (group 1) or tumours (group 2) with accompanying edema in the brain. Motion correction improved the interpretability of T2*-weighted patient data and resulting quantitative T2* maps considerably by allowing a clear delineation between ventricle and edema and a clear localisation of haemorrhage (group 1) or a clear delineation of tumour accompanying edema (group 2) which was not possible in data affected by movement.

  4. An in vivo comparative evaluation to determine the accuracy of working length between radiographic and electronic apex locators

    Directory of Open Access Journals (Sweden)

    S Vijay Singh

    2012-01-01

    Full Text Available Background: An in vivo comparative evaluation to determine the accuracy of working length between radiographic and electronic apex locators. Aim: The study aimed to evaluate the accuracy of an electronic apex locator in determining the working length of the root canal, and to compare it with the radiographic method of working length determination. Materials and Methods: A total of 20 teeth selected for the study were scheduled for extraction for periodontal or orthodontic reasons. An access cavity was prepared and the clinical estimated working length (CEWL) was determined with no. 10-25 K-files. A radiograph was then taken to determine the radiographic estimated working length (REWL). For electronic measurement of the root canal, a no. 10 K-file was advanced toward the apex until it reached 0.5 mm short of the apex as shown by the apex locator. After fixing the file with light-cured composite, the tooth was extracted and the tooth surface was ground longitudinally with a straight fissure diamond bur until the root canal and the tip of the file were visible. The distance of the file from the minor constriction was measured with the help of a stereomicroscope. Statistical analysis: The chi-square test was used for statistical analysis in this study. Results: The chi-square test (χ² = 21.034, P < 0.001) indicated that a significant difference exists among the groups. The electronic method showed the highest number of cases with the working length at the minor constriction. Conclusion: The electronic method for determining the working length of the root canal was found to be more accurate than the radiographic method.

  5. Integrating Quantitative and Qualitative Results in Health Science Mixed Methods Research Through Joint Displays

    Science.gov (United States)

    Guetterman, Timothy C.; Fetters, Michael D.; Creswell, John W.

    2015-01-01

    PURPOSE Mixed methods research is becoming an important methodology to investigate complex health-related topics, yet the meaningful integration of qualitative and quantitative data remains elusive and needs further development. A promising innovation to facilitate integration is the use of visual joint displays that bring data together visually to draw out new insights. The purpose of this study was to identify exemplar joint displays by analyzing the various types of joint displays being used in published articles. METHODS We searched for empirical articles that included joint displays in 3 journals that publish state-of-the-art mixed methods research. We analyzed each of 19 identified joint displays to extract the type of display, mixed methods design, purpose, rationale, qualitative and quantitative data sources, integration approaches, and analytic strategies. Our analysis focused on what each display communicated and its representation of mixed methods analysis. RESULTS The most prevalent types of joint displays were statistics-by-themes and side-by-side comparisons. Innovative joint displays connected findings to theoretical frameworks or recommendations. Researchers used joint displays for convergent, explanatory sequential, exploratory sequential, and intervention designs. We identified exemplars for each of these designs by analyzing the inferences gained through using the joint display. Exemplars represented mixed methods integration, presented integrated results, and yielded new insights. CONCLUSIONS Joint displays appear to provide a structure to discuss the integrated analysis and assist both researchers and readers in understanding how mixed methods provides new insights. We encourage researchers to use joint displays to integrate and represent mixed methods analysis and discuss their value. PMID:26553895

  6. Assessing size of pituitary adenomas: a comparison of qualitative and quantitative methods on MR.

    Science.gov (United States)

    Davies, Benjamin M; Carr, Elizabeth; Soh, Calvin; Gnanalingham, Kanna K

    2016-04-01

A variety of methods are used for estimating pituitary tumour size in clinical practice and in research. Quantitative methods, such as maximum tumour dimension, and qualitative methods, such as Hardy and Knosp grades, are well established but do not give an accurate assessment of the tumour volume. We therefore sought to compare existing measures of pituitary tumours with more quantitative methods of tumour volume estimation. Magnetic resonance imaging was reviewed for 99 consecutive patients with pituitary adenomas awaiting surgery between 2010 and 2013. Maximal tumour diameter, Hardy and Knosp grades were compared with tumour volume estimates by the ellipsoid equation [(4/3)π·a·b·c] (i.e. ellipsoid volume) and slice-by-slice perimetry (i.e. perimeter volume). Ellipsoid and perimeter methods of tumour volume estimation strongly correlated (R² = 0.99, p < 0.0001). However, the correlation was less strong with increasing tumour size, with the ellipsoid method slightly underestimating. The mean differences were -0.11 (95 % CI, -0.35, 0.14), -0.74 (95 % CI, -2.2, 0.74) and -1.4 (95 % CI, -6.4, 3.7) for micro-tumours, macro-tumours and giant tumours respectively. Tumour volume correlated with maximal diameter, following a cubic distribution. Correlations of tumour volume with Hardy and Knosp grades were less strong. Perimeter and ellipsoid methods give a good estimation of tumour volume, whereas Knosp and Hardy grades may offer other clinically relevant information, such as cavernous sinus invasion or chiasmal compression. Thus the different methods of estimating tumour size are likely to have different clinical utilities.
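The two volume estimates compared in this record can be sketched as follows. This is a minimal illustration, assuming a, b and c are the three orthogonal tumour diameters (so the semi-axes are a/2, b/2, c/2) and using a simple Cavalieri-style slice sum as a stand-in for the perimetry method; the abstract's exact measurement conventions are not stated, and all names here are illustrative.

```python
import math

def ellipsoid_volume(a, b, c):
    """Ellipsoid volume from three orthogonal diameters a, b, c.

    Assumes a, b, c are full diameters, so the semi-axes are a/2, b/2, c/2:
    V = (4/3) * pi * (a/2) * (b/2) * (c/2) = pi * a * b * c / 6.
    """
    return math.pi * a * b * c / 6.0

def slice_volume(slice_areas, slice_thickness):
    """Perimetry-style estimate: sum of per-slice cross-sectional
    areas multiplied by the slice thickness (a Cavalieri sum)."""
    return sum(slice_areas) * slice_thickness

# Example: a hypothetical 20 x 18 x 16 mm macro-tumour
v = ellipsoid_volume(20.0, 18.0, 16.0)
```

The slice-based estimate makes no shape assumption, which is consistent with the abstract's finding that the ellipsoid formula slightly underestimates for large, irregular tumours.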

  7. Comparison of two quantitative fit-test methods using N95 filtering facepiece respirators.

    Science.gov (United States)

    Sietsema, Margaret; Brosseau, Lisa M

    2016-08-01

    Current regulations require annual fit testing before an employee can wear a respirator during work activities. The goal of this research is to determine whether respirator fit measured with two TSI Portacount instruments simultaneously sampling ambient particle concentrations inside and outside of the respirator facepiece is similar to fit measured during an ambient aerosol condensation nuclei counter quantitative fit test. Sixteen subjects (ten female; six male) were recruited for a range of facial sizes. Each subject donned an N95 filtering facepiece respirator, completed two fit tests in random order (ambient aerosol condensation nuclei counter quantitative fit test and two-instrument real-time fit test) without removing or adjusting the respirator between tests. Fit tests were compared using Spearman's rank correlation coefficients. The real-time two-instrument method fit factors were similar to those measured with the single-instrument quantitative fit test. The first four exercises were highly correlated (r > 0.7) between the two protocols. Respirator fit was altered during the talking or grimace exercise, both of which involve facial movements that could dislodge the facepiece. Our analyses suggest that the new real-time two-instrument methodology can be used in future studies to evaluate fit before and during work activities.
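The fit factors discussed in this record can be sketched numerically. In quantitative fit testing, the per-exercise fit factor is the ratio of ambient to in-mask particle concentration, and the overall fit factor is conventionally combined across exercises as a harmonic-mean-style expression (n divided by the sum of reciprocals); treat that combination rule and all numbers below as illustrative assumptions rather than the study's exact protocol.

```python
def fit_factor(c_ambient, c_inside):
    """Per-exercise fit factor: ratio of ambient particle concentration
    to the concentration measured inside the facepiece."""
    return c_ambient / c_inside

def overall_fit_factor(exercise_ffs):
    """Overall fit factor across exercises, combined as the
    harmonic-mean-style expression used in quantitative fit
    testing: n / sum(1/FF_i)."""
    return len(exercise_ffs) / sum(1.0 / ff for ff in exercise_ffs)

# Hypothetical (ambient, in-mask) concentration pairs for four exercises
pairs = [(2000.0, 10.0), (2000.0, 20.0), (2000.0, 8.0), (2000.0, 40.0)]
ffs = [fit_factor(a, i) for a, i in pairs]
off = overall_fit_factor(ffs)
```

The reciprocal sum means a single poor exercise (such as the grimace exercise noted in the abstract) dominates the overall result, which is why fit can pass most exercises yet still fail overall.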

  8. A new method for quantitative real-time polymerase chain reaction data analysis.

    Science.gov (United States)

    Rao, Xiayu; Lai, Dejian; Huang, Xuelin

    2013-09-01

    Quantitative real-time polymerase chain reaction (qPCR) is a sensitive gene quantification method that has been extensively used in biological and biomedical fields. The currently used methods for PCR data analysis, including the threshold cycle method and linear and nonlinear model-fitting methods, all require subtracting background fluorescence. However, the removal of background fluorescence can hardly be accurate and therefore can distort results. We propose a new method, the taking-difference linear regression method, to overcome this limitation. Briefly, for each two consecutive PCR cycles, we subtract the fluorescence in the former cycle from that in the latter cycle, transforming the n cycle raw data into n-1 cycle data. Then, linear regression is applied to the natural logarithm of the transformed data. Finally, PCR amplification efficiencies and the initial DNA molecular numbers are calculated for each reaction. This taking-difference method avoids the error in subtracting an unknown background, and thus it is more accurate and reliable. This method is easy to perform, and this strategy can be extended to all current methods for PCR data analysis.
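The taking-difference idea described above can be sketched directly: differencing consecutive fluorescence readings cancels a constant background, the log-differences are linear in cycle number, and the slope and intercept recover the amplification efficiency and starting amount. A minimal sketch with simulated data (function names and values are illustrative):

```python
import math

def taking_difference_fit(fluorescence):
    """Fit F_n = F0 * E**n + B by differencing consecutive cycles:
    D_n = F_{n+1} - F_n = F0 * (E - 1) * E**n, which cancels the
    constant background B.  ln(D_n) is then linear in n with
    slope ln(E)."""
    diffs = [b - a for a, b in zip(fluorescence, fluorescence[1:])]
    xs = list(range(len(diffs)))
    ys = [math.log(d) for d in diffs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    E = math.exp(slope)                   # amplification efficiency (1 < E <= 2)
    F0 = math.exp(intercept) / (E - 1.0)  # initial fluorescence ~ starting copies
    return E, F0

# Simulated exponential-phase data: F0 = 0.5, E = 1.9, background = 2.0
data = [2.0 + 0.5 * 1.9 ** n for n in range(12)]
E, F0 = taking_difference_fit(data)
```

On noiseless simulated data the background term drops out exactly, which is the method's central claim; real traces would of course carry noise and a plateau phase that this sketch ignores.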

  9. Quantitative Determination of Synthesized Genotoxic Impurities in Nifuroxazide Capsules by Validated Chromatographic Methods.

    Science.gov (United States)

    Abdelwahab, Nada S; Ali, Nouruddin W; Zaki, Marco M; Abdelkawy, M; El-Saadi, Mohammed T

    2017-07-31

Two accurate, selective, and precise chromatographic methods, namely TLC-densitometric and reversed-phase (RP)-HPLC, were developed and validated for the simultaneous determination of nifuroxazide (NIF) and its four synthesized impurities, which are also reported to be its related substances. NIF was determined in the ranges of 10–100 μg/band and 10–100 μg/mL in the TLC and RP-HPLC methods, respectively. The developed TLC-densitometric method depended on the separation and quantitation of the studied components on silica gel 60 F254 TLC plates. Ethyl acetate–acetone–methanol–ammonia (85 + 25 + 5 + 0.5, v/v/v/v) was used as the developing system, and the separated bands were UV-scanned at 230 nm. The developed RP-HPLC method, on the other hand, depended on chromatographic separation using a C8 column at 25°C and an aqueous solution of 0.1% sodium lauryl sulfate–acetonitrile as the mobile phase, delivered according to a gradient elution program. Factors affecting the developed methods were studied and optimized, and method validation was carried out according to International Conference on Harmonization guidelines. The proposed methods were successfully applied for the determination of the studied drug in its bulk powder and in its pharmaceutical formulation, and showed no significant difference when compared with a reported RP-HPLC method. Their advantage is that they are the first stability-indicating methods for NIF and its genotoxic impurities.

  10. Quantitative firing transformations of a triaxial ceramic by X-ray diffraction methods

    Directory of Open Access Journals (Sweden)

    M. S. Conconi

    2014-12-01

Full Text Available The firing transformations of traditional (clay-based) ceramics are of technological and archeological interest, and are usually reported qualitatively or semiquantitatively. These systems present considerable complexity, especially for X-ray diffraction techniques, owing to the presence of fully crystalline, low-crystallinity and amorphous phases. In this article we present the results of a qualitative and quantitative X-ray diffraction Rietveld analysis of the evolution of the fully crystalline (kaolinite, quartz, cristobalite, feldspars and/or mullite), low-crystallinity (metakaolinite and/or spinel-type pre-mullite) and glassy phases of a triaxial (clay-quartz-feldspar) ceramic fired over a wide temperature range between 900 and 1300 ºC. The methodology employed to determine the low-crystallinity and glassy phase abundances is based on the internal standard method and on a nanocrystalline model in which long-range order is lost, respectively. A preliminary sintering characterization was carried out by following the evolution of contraction, density and porosity with firing temperature. Simultaneous thermo-gravimetric and differential thermal analysis was carried out to elucidate the actual temperatures at which the chemical changes occur. Finally, the quantitative analysis based on Rietveld refinement of the X-ray diffraction patterns was performed. The kaolinite decomposition into metakaolinite was determined quantitatively; the intermediate (980 ºC) spinel-type alumino-silicate formation was also quantified; the incongruent fusion of the potash feldspar was observed and quantified, together with the final mullitization and the amorphous (glassy) phase formation. The methodology used to analyze the X-ray diffraction patterns proved to be suitable for evaluating quantitatively the thermal transformations that occur in a complex system like the triaxial ceramics. The evaluated phases can be easily correlated with the processing variables and
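The internal-standard step mentioned in this record can be illustrated with the generic correction used in quantitative phase analysis: a known weighed fraction of standard rescales the Rietveld fractions (which are normalised over crystalline phases), and the shortfall is assigned to amorphous/glassy content. This is the textbook formula, not necessarily the authors' exact nanocrystalline-model procedure; all numbers are hypothetical.

```python
def amorphous_fraction(w_std_added, r_std_rietveld):
    """Internal-standard estimate of amorphous (glassy) content.

    w_std_added     -- weight fraction of internal standard actually weighed in
    r_std_rietveld  -- apparent fraction returned by Rietveld refinement,
                       which normalises crystalline phases to sum to 1

    If the standard appears larger than it should, part of the sample is
    amorphous.  Overall crystalline scale = w_std_added / r_std_rietveld,
    and the amorphous fraction of the original (standard-free) sample is
    (1 - scale) / (1 - w_std_added)."""
    scale = w_std_added / r_std_rietveld
    return (1.0 - scale) / (1.0 - w_std_added)

# Example: 20 wt% standard added; Rietveld reports it as 25% of crystalline phases
x_amorphous = amorphous_fraction(0.20, 0.25)  # -> 0.25 of the original sample
```

The same bookkeeping underlies glassy-phase tracking across firing temperatures: each refined pattern yields one amorphous fraction, which can then be plotted against temperature.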

  11. Exploring the use of storytelling in quantitative research fields using a multiple case study method

    Science.gov (United States)

    Matthews, Lori N. Hamlet

The purpose of this study was to explore the emerging use of storytelling in quantitative research fields. The focus was not on examining storytelling in research, but rather on how stories are used in various ways within the social context of quantitative research environments. In-depth interviews were conducted with seven professionals who had experience using storytelling in their work, and my personal experience with the subject matter was also used as a source of data, in line with the notion of researcher-as-instrument. This study is qualitative in nature and is guided by two supporting theoretical frameworks, the sociological perspective and narrative inquiry. A multiple case study methodology was used to gain insight into why participants decided to use stories or storytelling in a quantitative research environment that may not be traditionally open to such methods. This study also attempted to identify how storytelling can strengthen or supplement existing research, as well as what value stories can provide to the practice of research in general. Five thematic findings emerged from the data and were grouped under two headings, "Experiencing Research" and "Story Work." The themes were found to be consistent with four main theoretical functions of storytelling identified in existing scholarly literature: (a) sense-making; (b) meaning-making; (c) culture; and (d) communal function. The five themes that emerged from this study, consistent with the existing literature, are: (a) social context; (b) quantitative versus qualitative; (c) we think and learn in terms of stories; (d) stories tie experiences together; and (e) making sense and meaning. Recommendations are offered in the form of implications for various social contexts, and topics for further research are presented as well.

  12. Integrating Quantitative and Qualitative Results in Health Science Mixed Methods Research Through Joint Displays.

    Science.gov (United States)

    Guetterman, Timothy C; Fetters, Michael D; Creswell, John W

    2015-11-01

    Mixed methods research is becoming an important methodology to investigate complex health-related topics, yet the meaningful integration of qualitative and quantitative data remains elusive and needs further development. A promising innovation to facilitate integration is the use of visual joint displays that bring data together visually to draw out new insights. The purpose of this study was to identify exemplar joint displays by analyzing the various types of joint displays being used in published articles. We searched for empirical articles that included joint displays in 3 journals that publish state-of-the-art mixed methods research. We analyzed each of 19 identified joint displays to extract the type of display, mixed methods design, purpose, rationale, qualitative and quantitative data sources, integration approaches, and analytic strategies. Our analysis focused on what each display communicated and its representation of mixed methods analysis. The most prevalent types of joint displays were statistics-by-themes and side-by-side comparisons. Innovative joint displays connected findings to theoretical frameworks or recommendations. Researchers used joint displays for convergent, explanatory sequential, exploratory sequential, and intervention designs. We identified exemplars for each of these designs by analyzing the inferences gained through using the joint display. Exemplars represented mixed methods integration, presented integrated results, and yielded new insights. Joint displays appear to provide a structure to discuss the integrated analysis and assist both researchers and readers in understanding how mixed methods provides new insights. We encourage researchers to use joint displays to integrate and represent mixed methods analysis and discuss their value. © 2015 Annals of Family Medicine, Inc.

  13. Method and platform standardization in MRM-based quantitative plasma proteomics.

    Science.gov (United States)

    Percy, Andrew J; Chambers, Andrew G; Yang, Juncong; Jackson, Angela M; Domanski, Dominik; Burkhart, Julia; Sickmann, Albert; Borchers, Christoph H

    2013-12-16

    There exists a growing demand in the proteomics community to standardize experimental methods and liquid chromatography-mass spectrometry (LC/MS) platforms in order to enable the acquisition of more precise and accurate quantitative data. This necessity is heightened by the evolving trend of verifying and validating candidate disease biomarkers in complex biofluids, such as blood plasma, through targeted multiple reaction monitoring (MRM)-based approaches with stable isotope-labeled standards (SIS). Considering the lack of performance standards for quantitative plasma proteomics, we previously developed two reference kits to evaluate the MRM with SIS peptide approach using undepleted and non-enriched human plasma. The first kit tests the effectiveness of the LC/MRM-MS platform (kit #1), while the second evaluates the performance of an entire analytical workflow (kit #2). Here, these kits have been refined for practical use and then evaluated through intra- and inter-laboratory testing on 6 common LC/MS platforms. For an identical panel of 22 plasma proteins, similar concentrations were determined, regardless of the kit, instrument platform, and laboratory of analysis. These results demonstrate the value of the kit and reinforce the utility of standardized methods and protocols. The proteomics community needs standardized experimental protocols and quality control methods in order to improve the reproducibility of MS-based quantitative data. This need is heightened by the evolving trend for MRM-based validation of proposed disease biomarkers in complex biofluids such as blood plasma. We have developed two kits to assist in the inter- and intra-laboratory quality control of MRM experiments: the first kit tests the effectiveness of the LC/MRM-MS platform (kit #1), while the second evaluates the performance of an entire analytical workflow (kit #2). In this paper, we report the use of these kits in intra- and inter-laboratory testing on 6 common LC/MS platforms. 

  14. MES buffer affects Arabidopsis root apex zonation and root growth by suppressing superoxide generation in root apex

    Directory of Open Access Journals (Sweden)

    Tomoko eKagenishi

    2016-02-01

Full Text Available In plants, growth of roots and root hairs is regulated by fine cellular control of pH and reactive oxygen species. MES, 2-(N-morpholino)ethanesulfonic acid, one of the Good's buffers, has been broadly used for buffering media and is considered suitable for plant growth at a concentration of 0.1% (w/v), because the buffer capacity of MES spans pH 5.5-7.0 (for Arabidopsis, pH 5.8). However, many reports have shown that, in nature, roots require different pH values on the surface of specific root apex zones, namely the meristem, transition zone and elongation zone. Despite the fact that roots are routinely grown on media containing buffer molecules, little is known about the impact of MES on root growth. Here, we examined the effects of different concentrations of MES buffer on growing roots of Arabidopsis thaliana. Our results show that 1% MES significantly inhibited root growth, the number of root hairs and the length of the meristem, whereas 0.1% promoted root growth and enlarged the root apex area (the region spanning from the root tip up to the transition zone). Furthermore, superoxide generation in the root apex disappeared at 1% MES. These results suggest that MES disturbs normal root morphogenesis by changing reactive oxygen species (ROS) homeostasis in the root apex.

  15. A method for quantitative assessment of artifacts in EEG, and an empirical study of artifacts.

    Science.gov (United States)

    Kappel, Simon L; Looney, David; Mandic, Danilo P; Kidmose, Preben

    2014-01-01

Wearable EEG systems for continuous brain monitoring are an emergent technology that involves significant technical challenges. Some of these arise because such systems operate in conditions that are far less controllable with respect to interference and artifacts than is the case for conventional systems. Quantitative assessment of artifacts provides a means for optimization with respect to electrode technology, electrode location, electronic instrumentation and system design. To this end, we propose an artifact assessment method and evaluate it in an empirical study of 3 subjects and 5 different types of artifacts. The study showed consistent results across subjects and artifacts.

  16. A mixed methods approach to understanding cyberbullying: A role for both quantitative and qualitative research.

    OpenAIRE

    Mc Guckin, Conor; Espey, K; Duffy, J

    2013-01-01

The study investigated the incidence and nature of cyber-bullying within six post-primary schools in Northern Ireland. A mixed methods sequential explanatory design was employed. The first, quantitative phase involved questionnaires with 757 year 8 and year 11 pupils (57.5% females, n = 435; 42.5% males, n = 322) ranging in age from 11 to 15 years (mean = 13.04 years). The second, qualitative phase involved focus groups with two groups of pupils (n = 14). Cyber-bullying was l...

  17. Collaborating to improve the use of free-energy and other quantitative methods in drug discovery

    Science.gov (United States)

    Sherborne, Bradley; Shanmugasundaram, Veerabahu; Cheng, Alan C.; Christ, Clara D.; DesJarlais, Renee L.; Duca, Jose S.; Lewis, Richard A.; Loughney, Deborah A.; Manas, Eric S.; McGaughey, Georgia B.; Peishoff, Catherine E.; van Vlijmen, Herman

    2016-12-01

    In May and August, 2016, several pharmaceutical companies convened to discuss and compare experiences with Free Energy Perturbation (FEP). This unusual synchronization of interest was prompted by Schrödinger's FEP+ implementation and offered the opportunity to share fresh studies with FEP and enable broader discussions on the topic. This article summarizes key conclusions of the meetings, including a path forward of actions for this group to aid the accelerated evaluation, application and development of free energy and related quantitative, structure-based design methods.

  18. Conductance method for quantitative determination of Photobacterium phosphoreum in fish products

    DEFF Research Database (Denmark)

    Dalgaard, Paw; Mejlholm, Ole; Huss, Hans Henrik

    1996-01-01

    This paper presents the development of a sensitive and selective conductance method for quantitative determination of Photobacterium phosphoreum in fresh fish. A calibration curve with a correlation coefficient of -0.981 was established from conductance detection times (DT) for estimation of cell...... organisms tested. In naturally contaminated fresh fish, P. phosphoreum was specifically enumerated when it made up 0.1% of the total level of micro-organisms. The repeatability (S.D.%) of the conductance assay ranged from 2.9% to 7.3%...

  19. Quantitative and qualitative methods in medical education research: AMEE Guide No 90: Part II.

    Science.gov (United States)

    Tavakol, Mohsen; Sandars, John

    2014-10-01

Medical educators need to understand and conduct medical education research in order to make informed decisions based on the best evidence, rather than rely on their own hunches. The purpose of this Guide is to provide medical educators, especially those who are new to medical education research, with a basic understanding of how quantitative and qualitative methods contribute to the medical education evidence base through their different inquiry approaches and also how to select the most appropriate inquiry approach to answer their research questions.

  20. Quantitative and qualitative methods in medical education research: AMEE Guide No 90: Part I.

    Science.gov (United States)

    Tavakol, Mohsen; Sandars, John

    2014-09-01

    Medical educators need to understand and conduct medical education research in order to make informed decisions based on the best evidence, rather than rely on their own hunches. The purpose of this Guide is to provide medical educators, especially those who are new to medical education research, with a basic understanding of how quantitative and qualitative methods contribute to the medical education evidence base through their different inquiry approaches and also how to select the most appropriate inquiry approach to answer their research questions.

  1. Simple methods for the qualitative identification and quantitative determination of macrolide antibiotics.

    Science.gov (United States)

    Danielson, N D; Holeman, J A; Bristol, D C; Kirzner, D H

    1993-02-01

Pyrolysis-gas chromatography is shown to be a rapid, straightforward method for the qualitative differentiation of the macrolide antibiotics erythromycin, oleandomycin, troleandomycin, spiramycin and tylosin. Organic salts do not interfere, and identification of erythromycin and troleandomycin in commercial products is viable. Spectrophotometric quantitation of these same five antibiotics after reaction with concentrated sulphuric acid is studied at about 470 nm. Reaction conditions such as acid concentration, time and temperature are provided. The sugar moieties of the antibiotics are proposed as the reactive sites. Detection limits are about 0.2-1.0 µg ml⁻¹ [corrected], and analysis of pharmaceutical products should be possible.

  2. A quantitative application of the J-based configuration analysis method to a flexible small molecule.

    Science.gov (United States)

    Sharman, Gary J

    2007-04-01

    An example of the use of the J-based configuration analysis method to determine relative stereochemistry of a small molecule related to reboxetine is described. This study was complicated by the fact that the molecule did not exhibit J-couplings and NOEs consistent with a single conformation, but rather an ensemble average. A quantitative fitting procedure using predicted couplings and NOEs from all possible conformers was used. This gave a clear indication of the stereochemistry, and the populations of the conformers involved.
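The quantitative fitting procedure described above, in which observed couplings are modeled as population-weighted averages over conformer-predicted values, can be sketched for a two-conformer ensemble with a simple least-squares scan. The function name and the coupling values are illustrative, not taken from the reboxetine study.

```python
def fit_two_state_population(j_observed, j_conf_a, j_conf_b, n_steps=1000):
    """Scan the population p of conformer A (0..1) and return the value
    minimising the sum of squared differences between observed couplings
    and the population-weighted average p*J_A + (1-p)*J_B."""
    best_p, best_rss = 0.0, float("inf")
    for i in range(n_steps + 1):
        p = i / n_steps
        rss = sum((jo - (p * ja + (1 - p) * jb)) ** 2
                  for jo, ja, jb in zip(j_observed, j_conf_a, j_conf_b))
        if rss < best_rss:
            best_p, best_rss = p, rss
    return best_p, best_rss

# Illustrative predicted 3J couplings (Hz) for two conformers, and a
# synthetic "observation" corresponding to a 70:30 ensemble
ja = [11.0, 2.5, 4.0]
jb = [3.0, 10.5, 8.0]
j_obs = [0.7 * a + 0.3 * b for a, b in zip(ja, jb)]
p_best, rss = fit_two_state_population(j_obs, ja, jb)
```

In the full method, each candidate relative configuration contributes its own conformer set, and the configuration whose best-fit ensemble gives the lowest residual is preferred; NOE intensities can enter the same residual alongside the couplings.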

  3. On the use of quantitative methods in the Danish food industry

    DEFF Research Database (Denmark)

    Juhl, Hans Jørn; Østergaard, Peder; Kristensen, Kai

    1997-01-01

    Executive summary 1. The paper examines the use of quantitative methods in the Danish food industry and a comparison is made between the food industry and other manufacturing industries. Data was collected in 1991 and 107 manufacturing companies filled in the questionnaire. 20 of the companies were...... food companies. 2. The main purpose with this paper is to obtain a frame of reference for a much bigger investigation just carried out by the authors in the Danish food industry with particular emphasis on collection and processing of information during product development. 3. The comparison...

  4. Quantitative method for measurement of the Goos-Hanchen effect based on source divergence considerations

    Science.gov (United States)

    Gray, Jeffrey F.; Puri, Ashok

    2007-06-01

In this paper we report on a method for quantitative measurement and characterization of the Goos-Hanchen effect based upon the real world performance of optical sources. A numerical model of a nonideal plane wave is developed in terms of uniform divergence properties. This model is applied to the Goos-Hanchen shift equations to determine beam shift displacement characteristics, which provides quantitative estimates of finite shifts near critical angle. As a potential technique for carrying out a meaningful comparison with experiments, a classical method of edge detection is discussed. To this end a line spread Green’s function is defined which can be used to determine the effective transfer function of the near critical angle behavior of divergent plane waves. The process yields a distributed (blurred) output with a line spread function characteristic of the inverse square root nature of the Goos-Hanchen shift equation. A parameter of interest for measurement is given by the edge shift function. Modern imaging and image processing methods provide suitable techniques for exploiting the edge shift phenomena to attain refractive index sensitivities of the order of 10⁻⁶, comparable with the recent results reported in the literature.
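The inverse-square-root character of the shift near the critical angle, which the abstract identifies as the signature captured by the line spread function, can be sketched numerically. Only the angular dependence is implemented below; the wavelength-dependent prefactor of the full shift expression is deliberately omitted, so the function returns a relative (unitless) shift, and the angles are illustrative.

```python
import math

def gh_shift_relative(theta_deg, theta_c_deg):
    """Relative Goos-Hanchen shift above the critical angle, keeping only
    the inverse-square-root angular dependence noted in the abstract:
    d(theta) ~ 1 / sqrt(sin^2(theta) - sin^2(theta_c)).
    The absolute (wavelength-dependent) prefactor is omitted."""
    s = math.sin(math.radians(theta_deg))
    sc = math.sin(math.radians(theta_c_deg))
    if s <= sc:
        raise ValueError("incidence angle must exceed the critical angle")
    return 1.0 / math.sqrt(s * s - sc * sc)

# The shift grows without bound as incidence approaches the critical angle
near = gh_shift_relative(41.9, 41.8)  # just above critical
far = gh_shift_relative(60.0, 41.8)
```

This divergence is exactly what makes near-critical-angle edge-shift measurements so sensitive to small refractive index changes.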

  5. A simple method for unbiased quantitation of adoptively transferred cells in solid tissues

    DEFF Research Database (Denmark)

    Petersen, Mikkel; Petersen, Charlotte Christie; Agger, Ralf

    2006-01-01

In a mouse model, we demonstrate how to obtain a direct, unbiased estimate of the total number of adoptively transferred cells in a variety of organs at different time points. The estimate is obtained by a straightforward method based on the optical fractionator principle. Specifically, non... node at six different time points following adoptive transfer (from 60 s to 1 week), providing a quantitative estimate of the organ distribution of the transferred cells over time. These estimates were obtained by microscopy of uniform samples of thick sections from the respective organs. The samples were chosen and prepared in accordance with the optical fractionator principle. We demonstrate that the method is simple, precise, and well suited for quantitative immunological studies.

  6. Quantitative diagnostic method for biceps long head tendinitis by using ultrasound.

    Science.gov (United States)

    Huang, Shih-Wei; Wang, Wei-Te

    2013-01-01

To investigate the feasibility of a grayscale quantitative diagnostic method for biceps tendinitis and to determine the cut-off points of a quantitative biceps ultrasound (US) method for diagnosing biceps tendinitis. Design: prospective cross-sectional case-controlled study, conducted in an outpatient rehabilitation service. A total of 336 shoulder pain patients with suspected biceps tendinitis were recruited in this prospective observational study. The grayscale pixel data of the region of interest (ROI) were obtained for both the transverse and longitudinal views of the biceps US. A total of 136 patients were classified as having biceps tendinitis, and 200 patients were classified as not having biceps tendinitis, based on the diagnostic criteria. Based on the Youden index, the cut-off points for the standard deviation (StdDev) of the ROI values were determined as 26.85 for the transverse view and 21.25 for the longitudinal view. When the ROI evaluation of the US surpassed the cut-off point, the sensitivity was 68% and the specificity 90% for the StdDev of the transverse view, and the sensitivity was 81% and the specificity 73% for the StdDev of the longitudinal view in diagnosing biceps tendinitis. For equivocal cases or inexperienced sonographers, our study provides a more objective method for diagnosing biceps tendinitis in shoulder pain patients.
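The classification rule described in this record can be sketched as a threshold on the standard deviation of ROI grayscale values, using the cut-offs reported in the abstract (26.85 transverse, 21.25 longitudinal). The pixel samples below are hypothetical, and how the authors sampled their ROIs is an assumption.

```python
import statistics

# Cut-offs for the StdDev of ROI grayscale pixels, as reported in the
# abstract (derived via the Youden index)
CUTOFFS = {"transverse": 26.85, "longitudinal": 21.25}

def roi_suggests_tendinitis(pixel_values, view):
    """Return True when the ROI grayscale standard deviation exceeds
    the view-specific cut-off.  pixel_values is a flat list of
    grayscale intensities sampled from the region of interest."""
    std_dev = statistics.pstdev(pixel_values)
    return std_dev > CUTOFFS[view]

# Hypothetical ROI samples: a heterogeneous (high-variance) region
# versus a nearly uniform one
heterogeneous = [10, 90, 20, 110, 15, 95, 30, 120]
uniform = [60, 62, 59, 61, 60, 63, 58, 61]
```

The intuition is that an inflamed tendon images with a more heterogeneous echotexture, raising the grayscale variance within the ROI.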

  7. A Method for Quantitative Phase Analysis of Nanocrystalline Zirconium Dioxide Polymorphs.

    Science.gov (United States)

    Zhou, Zhiqiang; Guo, Li

    2015-04-01

A method based on X-ray diffractometry was developed for quantitative phase analysis of nanocrystalline zirconium dioxide polymorphs, and the corresponding formulas were derived. The key factors therein were evaluated by rigorous theoretical calculation and fully verified by experimentation. An iterative process was devised so that experimental verification could proceed despite the lack of pure ZrO2 crystal polymorphs. By this method, the weight ratio of tetragonal ZrO2 (t-ZrO2) to monoclinic ZrO2 (m-ZrO2) in any mixture containing nanocrystalline t-ZrO2 and m-ZrO2, or their weight fractions in a mixture composed of nanocrystalline t-ZrO2 and m-ZrO2, can be determined from a single XRD test. It is proved by both theoretical calculation and experimental testing that mutual substitutions of t-ZrO2 and cubic ZrO2 (c-ZrO2) over a wide range have almost no impact on the XRD patterns of their mixtures. Given this, and the similarity in properties of t-ZrO2 and c-ZrO2, the two can be treated as one phase. The high agreement between the theoretical and experimental results in this work also proves the validity and reliability of theoretical calculation based on X-ray diffractometry theory for such quantitative phase analysis. This method has the potential to be extended to other materials.
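The record does not reproduce the paper's own derived formulas, so as an illustration of this class of calculation the sketch below uses the classic Garvie–Nicholson integrated-intensity ratio for ZrO2 polymorphs; this substitution is mine, not the authors' method, and the intensity values are hypothetical.

```python
def monoclinic_fraction_gn(i_m_11m1, i_m_111, i_t_101):
    """Garvie-Nicholson-style integrated-intensity ratio for ZrO2:
    X_m = (I_m(-111) + I_m(111)) / (I_m(-111) + I_m(111) + I_t(101)),
    where the I's are integrated intensities of the indicated
    monoclinic and tetragonal reflections."""
    m = i_m_11m1 + i_m_111
    return m / (m + i_t_101)

# Hypothetical integrated intensities from an XRD pattern
x_m = monoclinic_fraction_gn(150.0, 90.0, 240.0)  # -> 0.5
x_t = 1.0 - x_m
```

In practice an intensity ratio like this is usually converted to a volume or weight fraction with a nonlinear correction (e.g. Toraya's), which this sketch omits; and because the t-ZrO2 and c-ZrO2 patterns are nearly indistinguishable here, the "tetragonal" term effectively covers both, consistent with the abstract's single-phase treatment.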

  8. Evaluation of HAART by HIV-1 Quantitative Method and Immunological Function Inspection

    Institute of Scientific and Technical Information of China (English)

    陈琳; 冯铁建; 李良成; 何建凡

    2001-01-01

Objective: To observe the effectiveness of highly active antiretroviral therapy (HAART) on HIV/AIDS patients. Methods: Using HIV-1 quantitative methods and immunological function inspection, we monitored 4 HIV/AIDS patients who were suffering from immunological deficiency and were treated with HAART. Results: The reproduction of HIV in all 4 patients was efficiently controlled by the 4th week of treatment. The average viral load decreased by 1.99 log/ml (0.73-2.46 log/ml). The numbers of CD4+ and CD8+ cells showed a steady, continuous increase 4 to 12 weeks after the treatment, with increases of 67.2% and 103.0%, respectively. A correlation study among different variables after the treatment revealed a positive correlation between the number of CD4+ cells and the numbers of CD3+ and CD8+ cells, and a negative correlation between the number of CD4+ cells and plasma viral load. Conclusion: The HIV-1 quantitative method (plasma viral load) and the number of CD4+ cells in peripheral blood can be used as important reference indicators in evaluating HAART.

  9. Generalized multiple internal standard method for quantitative liquid chromatography mass spectrometry.

    Science.gov (United States)

    Hu, Yuan-Liang; Chen, Zeng-Ping; Chen, Yao; Shi, Cai-Xia; Yu, Ru-Qin

    2016-05-06

In this contribution, a multiplicative effects model for the generalized multiple-internal-standard method (MEMGMIS) was proposed to solve the signal instability problem of LC-MS over time. The MEMGMIS model seamlessly integrates the multiple-internal-standard strategy with multivariate calibration, and makes full use of all the information carried by multiple internal standards during the quantification of target analytes. Unlike existing methods based on multiple internal standards, MEMGMIS does not require selecting an optimal internal standard for the quantification of each specific analyte. MEMGMIS was applied to a proof-of-concept model system: the simultaneous quantitative analysis of five edible artificial colorants in two kinds of cocktail drinks. Experimental results demonstrated that MEMGMIS models established on LC-MS data of calibration samples prepared with ultrapure water provided quite satisfactory concentration predictions for colorants in cocktail samples from LC-MS data measured 10 days after the LC-MS analysis of the calibration samples. The average relative prediction errors of MEMGMIS models did not exceed 6.0%, considerably better than the corresponding values of commonly used univariate calibration models combined with multiple internal standards. The advantages of good performance and simple implementation render the MEMGMIS model a promising alternative tool in quantitative LC-MS assays.
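The published MEMGMIS model is not specified in this record, so the sketch below is a deliberately simpler stand-in that captures only the core idea of pooling several internal standards instead of picking one: an analyte's response is corrected by the geometric mean of the internal standards' drift ratios. All function names and numbers are illustrative.

```python
import math

def drift_corrected_response(analyte_response, is_responses_now, is_responses_ref):
    """Correct an analyte's LC-MS response for multiplicative signal
    drift using the geometric mean of the drift ratios of several
    internal standards, rather than a single 'best' internal standard."""
    ratios = [now / ref for now, ref in zip(is_responses_now, is_responses_ref)]
    drift = math.exp(sum(math.log(r) for r in ratios) / len(ratios))
    return analyte_response / drift

# Three internal standards all reading ~20% low today versus the
# original calibration run:
corrected = drift_corrected_response(800.0,
                                     [80.0, 160.0, 40.0],
                                     [100.0, 200.0, 50.0])
# drift = 0.8, so the corrected response is 1000.0
```

The geometric mean suits multiplicative drift because averaging in log space treats over- and under-responses symmetrically; the actual MEMGMIS model embeds this kind of correction in a multivariate calibration.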

  10. Quantitative evaluation of sustainable development based on ecological footprint method: a case study of Tianjin

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Since the concept of sustainable development emerged in the late 1980s, more and more countries and regions have adopted sustainable development as their development strategy, but for decades no effective methods were available to assess it quantitatively. Since the ecological footprint evaluation method was introduced in 1992, it has become popular for quantitative assessment of sustainable development because of its convenience, ease of understanding, and reliability. As one of the biggest coastal cities in north China and the economic center of the Bohai Coastal Region, Tianjin had a gross domestic product (GDP) of 369.762 billion yuan in 2005, accounting for 2.0% of the whole nation's GDP. The paper analyzes Tianjin's development with the ecological footprint method, and the results show that Tianjin's ecological footprint and biocapacity in 2005 were 2.507 gha/cap and 0.276 gha/cap respectively, giving an ecological deficit of 2.230 gha/cap. From 1980 to 2005, Tianjin's ecological deficit per 10^4 yuan GDP decreased, while the per capita ecological deficit has tended to increase rapidly in recent years. These results demonstrate that Tianjin is in a state of unsustainable development.
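The deficit figure follows directly from the definition used by the ecological footprint method, as a quick check shows (the rounded inputs give 2.231; the abstract reports 2.230, presumably computed from unrounded values):

```python
# ecological deficit = ecological footprint - biocapacity
# (all in global hectares per capita, gha/cap), using Tianjin's 2005 values
footprint = 2.507    # gha/cap
biocapacity = 0.276  # gha/cap
deficit = footprint - biocapacity
print(f"{deficit:.3f} gha/cap")  # 2.231 gha/cap
```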

  11. Quantitative method to assess caries via fluorescence imaging from the perspective of autofluorescence spectral analysis

    Science.gov (United States)

    Chen, Q. G.; Zhu, H. H.; Xu, Y.; Lin, B.; Chen, H.

    2015-08-01

    A quantitative method to discriminate caries lesions for a fluorescence imaging system is proposed in this paper. The autofluorescence spectral investigation of 39 teeth samples classified by the International Caries Detection and Assessment System levels was performed at 405 nm excitation. The major differences in the different caries lesions focused on the relative spectral intensity range of 565-750 nm. The spectral parameter, defined as the ratio of wavebands at 565-750 nm to the whole spectral range, was calculated. The image component ratio R/(G + B) of color components was statistically computed by considering the spectral parameters (e.g. autofluorescence, optical filter, and spectral sensitivity) in our fluorescence color imaging system. Results showed that the spectral parameter and image component ratio presented a linear relation. Therefore, the image component ratio was graded as 1.62 to quantitatively classify sound, early decay, established decay, and severe decay tissues, respectively. Finally, the fluorescence images of caries were experimentally obtained, and the corresponding image component ratio distribution was compared with the classification result. A method to determine the numerical grades of caries using a fluorescence imaging system was proposed. This method can be applied to similar imaging systems.
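The two quantities the abstract relates can be sketched as follows; the spectrum and RGB image below are synthetic placeholders (not measured data), and the grading thresholds are not reproduced:

```python
import numpy as np

# Spectral parameter: ratio of the 565-750 nm band to the whole spectrum,
# computed on a toy Gaussian autofluorescence spectrum.
wavelength = np.arange(450, 751)                       # nm
spectrum = np.exp(-((wavelength - 630) / 60.0) ** 2)   # synthetic spectrum

band = (wavelength >= 565) & (wavelength <= 750)
spectral_parameter = spectrum[band].sum() / spectrum.sum()

# Image component ratio R/(G+B) averaged over a synthetic RGB image.
rgb = np.random.default_rng(1).uniform(1, 255, size=(8, 8, 3))
r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
component_ratio = float(np.mean(r / (g + b)))

print(round(float(spectral_parameter), 3), round(component_ratio, 3))
```

On real data, the linear relation reported in the abstract would let the per-pixel R/(G+B) map stand in for the spectral parameter when classifying tissue.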

  12. A method to prioritize quantitative traits and individuals for sequencing in family-based studies.

    Directory of Open Access Journals (Sweden)

    Kaanan P Shah

    Full Text Available Owing to recent advances in DNA sequencing, it is now technically feasible to evaluate the contribution of rare variation to complex traits and diseases. However, it is still cost prohibitive to sequence the whole genome (or exome) of all individuals in each study. For quantitative traits, one strategy to reduce cost is to sequence individuals in the tails of the trait distribution. However, the next challenge becomes how to prioritize traits and individuals for sequencing, since individuals are often characterized for dozens of medically relevant traits. In this article, we describe a new method, the Rare Variant Kinship Test (RVKT), which leverages relationship information in family-based studies to identify quantitative traits that are likely influenced by rare variants. Conditional on nuclear families and extended pedigrees, we evaluate the power of the RVKT via simulation. Not unexpectedly, the power of our method depends strongly on effect size, and to a lesser extent, on the frequency of the rare variant and the number and type of relationships in the sample. As an illustration, we also apply our method to data from two genetic studies in the Old Order Amish, a founder population with extensive genealogical records. Remarkably, we implicate the presence of a rare variant that lowers fasting triglyceride levels in the Heredity and Phenotype Intervention (HAPI) Heart study (p = 0.044), consistent with the presence of a previously identified null mutation in the APOC3 gene that lowers fasting triglyceride levels in HAPI Heart study participants.

  13. A non-invasive method of qualitative and quantitative measurement of drugs.

    Science.gov (United States)

    Westerman, S T; Gilbert, L M

    1981-09-01

    Methods for quick qualitative and quantitative evaluation of drug intake are needed, especially during emergency situations such as drug overdose and alcohol intoxication. The electronystagmograph was used in an attempt to develop a non-invasive method for identification of drug intake, and to study the effects of alcohol and other drugs on the vestibular system. Results of the study reveal that alcohol, diazepam, opiates, barbiturates, cocaine, marijuana, and hallucinogenic drugs produce characteristic printout patterns that can be evaluated qualitatively. This method is a practical, non-invasive, objective procedure that provides a rapid qualitative assessment of drug intake. Its potential uses are extensive, including evaluation of drug intake in emergency overdose situations, monitoring anesthesia during surgery, evaluating drug intake in women about to deliver (as well as the effects on the newborn), and determining whether persons being tested on a polygraph are under the influence of drugs.

  14. Spectroscopic characterization and quantitative determination of atorvastatin calcium impurities by novel HPLC method

    Science.gov (United States)

    Gupta, Lokesh Kumar

    2012-11-01

    Seven process-related impurities were identified by LC-MS in the atorvastatin calcium drug substance. The structures of the impurities were confirmed by modern spectroscopic techniques such as 1H NMR and IR, and by physicochemical studies using synthesized authentic reference compounds. The synthesized reference samples of the impurity compounds were used for quantitative HPLC determination. These impurities were detected by a newly developed gradient, reversed-phase high-performance liquid chromatography (HPLC) method. The system suitability of the HPLC analysis established the validity of the separation. The analytical method was validated according to International Conference on Harmonisation (ICH) guidelines with respect to specificity, precision, accuracy, linearity, robustness, and stability of analytical solutions, demonstrating the power of the newly developed HPLC method.

  15. Evaluate the Impact of your Education and Outreach Program Using the Quantitative Collaborative Impact Analysis Method

    Science.gov (United States)

    Scalice, D.; Davis, H. B.

    2015-12-01

    The AGU scientific community has a strong motivation to improve the STEM knowledge and skills of today's youth, and we are dedicating increasing amounts of our time and energy to education and outreach work. Scientists and educational project leads can benefit from a deeper connection to the value of evaluation, how to work with an evaluator, and how to effectively integrate evaluation into projects to increase their impact. This talk will introduce a method for evaluating educational activities, including public talks, professional development workshops for educators, youth engagement programs, and more. We will discuss the impetus for developing this method--the Quantitative Collaborative Impact Analysis Method--how it works, and the successes we've had with it in the NASA Astrobiology education community.

  16. Method for quantitative estimation of position perception using a joystick during linear movement.

    Science.gov (United States)

    Wada, Y; Tanaka, M; Mori, S; Chen, Y; Sumigama, S; Naito, H; Maeda, M; Yamamoto, M; Watanabe, S; Kajitani, N

    1996-12-01

    We designed a method for quantitatively estimating self-motion perception during passive body movement on a sled. Subjects were instructed to tilt a joystick in proportion to the perceived displacement from a given starting position during linear movements with displacements of 4 m, 10 m and 16 m, induced by constant accelerations of 0.02 g, 0.05 g and 0.08 g along the antero-posterior axis. With this method, we could monitor not only subjective position perception but also response latencies for the beginning (RLbgn) and end (RLend) of the linear movement. Perceived body position fitted Stevens' power law, R = kS^n, where R is the joystick output, k is a constant, S is the displacement during the linear movement, and n is an exponent. RLbgn decreased as linear acceleration increased. We conclude that this method is useful in analyzing the features and sensitivities of self-motion perception during movement.
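Fitting Stevens' power law R = kS^n is typically done by linear regression in log-log space, since log R = log k + n log S. A minimal sketch with simulated joystick readings (the true k and n below are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
S = np.array([4.0, 10.0, 16.0])          # actual displacements (m)
k_true, n_true = 0.8, 0.9                # assumed "true" parameters
# simulated joystick outputs with 2% multiplicative noise
R = k_true * S ** n_true * rng.normal(1.0, 0.02, size=S.size)

# linear fit in log-log space: slope = n, intercept = log k
n_hat, logk_hat = np.polyfit(np.log(S), np.log(R), 1)
k_hat = np.exp(logk_hat)
print(round(k_hat, 2), round(n_hat, 2))
```

An exponent n near 1 would indicate near-veridical position perception; n < 1 indicates compression of perceived distance.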

  17. A method for estimating the effective number of loci affecting a quantitative character.

    Science.gov (United States)

    Slatkin, Montgomery

    2013-11-01

    A likelihood method is introduced that jointly estimates the number of loci and the additive effect of alleles that account for the genetic variance of a normally distributed quantitative character in a randomly mating population. The method assumes that measurements of the character are available from one or both parents and an arbitrary number of full siblings. The method uses the fact, first recognized by Karl Pearson in 1904, that the variance of a character among offspring depends on both the parental phenotypes and on the number of loci. Simulations show that the method performs well provided that data from a sufficient number of families (on the order of thousands) are available. This method assumes that the loci are in Hardy-Weinberg and linkage equilibrium but does not assume anything about the linkage relationships. It performs equally well if all loci are on the same non-recombining chromosome provided they are in linkage equilibrium. The method can be adapted to take account of loci already identified as being associated with the character of interest. In that case, the method estimates the number of loci not already known to affect the character. The method applied to measurements of crown-rump length in 281 family trios in a captive colony of African green monkeys (Chlorocebus aethiops sabaeus) estimates the number of loci to be 112 and the additive effect to be 0.26 cm. A parametric bootstrap analysis shows that a rough confidence interval has a lower bound of 14 loci.
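Pearson's observation underlying the method can be demonstrated with a toy simulation (not the likelihood method itself): for a fixed total additive variance, offspring of high-phenotype parents vary little when few loci contribute (extreme parents are then nearly homozygous), but vary more when many loci contribute. All parameters below are invented.

```python
import numpy as np

rng = np.random.default_rng(3)

def offspring_var_given_extreme_parents(n_loci, n_pairs=2000, n_off=20):
    a = np.sqrt(2.0 / n_loci)          # keeps additive variance = 1
    # parental genotypes: 0, 1 or 2 copies of the "+" allele per locus
    gm = rng.binomial(2, 0.5, size=(n_pairs, n_loci))
    gf = rng.binomial(2, 0.5, size=(n_pairs, n_loci))
    pheno = a * ((gm - 1).sum(1) + (gf - 1).sum(1))
    top = np.argsort(pheno)[-200:]     # most extreme parent pairs
    vs = []
    for i in top:
        # each offspring draws one allele from each parent per locus
        off = (rng.binomial(1, gm[i] / 2.0, size=(n_off, n_loci))
               + rng.binomial(1, gf[i] / 2.0, size=(n_off, n_loci)))
        vs.append(np.var(a * (off - 1).sum(1)))
    return float(np.mean(vs))

v_few = offspring_var_given_extreme_parents(2)
v_many = offspring_var_given_extreme_parents(50)
print(v_few < v_many)  # True: fewer loci -> less variance among offspring
```

This dependence of within-family variance on the number of loci is exactly what the likelihood method exploits to estimate that number from family data.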

  18. Quantitative Detection Method of Hydroxyapatite Nanoparticles Based on Eu(3+) Fluorescent Labeling in Vitro and in Vivo.

    Science.gov (United States)

    Xie, Yunfei; Perera, Thalagalage Shalika Harshani; Li, Fang; Han, Yingchao; Yin, Meizhen

    2015-11-04

    One major challenge for the application of hydroxyapatite nanoparticles (nHAP) in nanomedicine is the lack of a quantitative detection method. Herein, we developed a quantitative detection method for nHAP based on Eu(3+) fluorescent labeling via a simple chemical coprecipitation method. Trace amounts of nHAP in cells and tissues can be quantitatively detected on the basis of fluorescent quantitative determination of the Eu(3+) ions in the nHAP crystal lattice. The lowest concentration of Eu(3+) ions that can be quantitatively detected is 0.5 nM using DELFIA enhancement solution. This methodology is broadly applicable for studying the tissue distribution and metabolism of nHAP in vivo.

  19. A novel benzene quantitative analysis method using miniaturized metal ionization gas sensor and non-linear bistable dynamic system.

    Science.gov (United States)

    Tang, Xuxiang; Liu, Fuqi

    2015-01-01

    In this paper, a novel benzene quantitative analysis method utilizing a miniaturized metal ionization gas sensor and a non-linear bistable dynamic system was investigated. An Al-plate anodic gas-ionization sensor was installed for current-voltage data measurement. The measurement data were analyzed by the non-linear bistable dynamic system. Results demonstrated that this method achieves quantitative determination of benzene concentration. This method is promising for laboratory safety management in benzene leak detection.

  20. Quantitative measurement of speech sound distortions with the aid of minimum variance spectral estimation method for dentistry use.

    Science.gov (United States)

    Bereteu, L; Drăgănescu, G E; Stănescu, D; Sinescu, C

    2011-12-01

    In this paper, we seek an adequate quantitative method based on minimum variance spectral analysis to reflect the dependence of speech quality on the correct positioning of dental prostheses. We also seek quantitative parameters that reflect the correct position of dental prostheses in a sensitive manner.

  1. Selective Weighted Least Squares Method for Fourier Transform Infrared Quantitative Analysis.

    Science.gov (United States)

    Wang, Xin; Li, Yan; Wei, Haoyun; Chen, Xia

    2016-10-26

    Classical least squares (CLS) regression is a popular multivariate statistical method used frequently for quantitative analysis with Fourier transform infrared (FT-IR) spectrometry. Classical least squares provides the best unbiased estimator for uncorrelated residual errors with zero mean and equal variance. However, the noise in FT-IR spectra, which accounts for a large portion of the residual errors, is heteroscedastic. Thus, if this zero-mean noise dominates the residual errors, the weighted least squares (WLS) regression method described in this paper is a better estimator than CLS. However, if bias errors, such as the residual baseline error, are significant, WLS may perform worse than CLS. In this paper, we compare the effects of noise and bias error on CLS and WLS in quantitative analysis. Results indicated that for wavenumbers with low absorbance, the bias error dominates the total error, so CLS performs better than WLS. However, for wavenumbers with high absorbance, the noise dominates, and WLS proves better than CLS. Thus, we propose a selective weighted least squares (SWLS) regression that processes data at different wavenumbers using either CLS or WLS based on a selection criterion, i.e., whether the absorbance is lower or higher than a threshold. The effects of various factors on the optimal threshold value (OTV) for SWLS have been studied through numerical simulations. These simulations showed that: (1) the concentration and the analyte type had minimal effect on the OTV; and (2) the major factor that influences the OTV is the ratio between the bias error and the standard deviation of the noise. The last part of this paper is dedicated to quantitative analysis of methane gas spectra and methane/toluene gas mixture spectra measured by FT-IR spectrometry, using CLS, WLS, and SWLS.
The standard error of prediction (SEP), bias of prediction (bias), and the residual sum of squares of the errors
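The SWLS idea can be sketched for a single-analyte system; the pure-component spectrum, noise model, baseline bias, and threshold below are all invented for illustration and are not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(4)
wn = np.arange(200)                                 # channel index
k = np.exp(-((wn - 100) / 25.0) ** 2)               # pure-component spectrum
c_true = 0.7
noise_sd = 0.002 + 0.02 * k                         # noise grows with signal
# measured spectrum: Beer-Lambert term + heteroscedastic noise + baseline bias
a = c_true * k + rng.normal(0, noise_sd) + 0.003

threshold = 0.3                                     # illustrative OTV
# SWLS idea: uniform weights (CLS-like) below the absorbance threshold,
# inverse-variance weights (WLS) above it, in one weighted regression
w = np.where(k > threshold, 1.0 / noise_sd ** 2, 1.0 / noise_sd.mean() ** 2)
c_swls = float((w * k) @ a / ((w * k) @ k))
print(round(c_swls, 2))
```

Setting `threshold = 0` recovers pure WLS and `threshold = np.inf` pure CLS, which makes the trade-off the abstract describes easy to explore numerically.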

  2. Quantitative Tagless Copurification: A Method to Validate and Identify Protein-Protein Interactions*

    Science.gov (United States)

    Shatsky, Maxim; Dong, Ming; Liu, Haichuan; Yang, Lee Lisheng; Choi, Megan; Singer, Mary E.; Geller, Jil T.; Fisher, Susan J.; Hall, Steven C.; Hazen, Terry C.; Brenner, Steven E.; Butland, Gareth; Jin, Jian; Witkowska, H. Ewa; Chandonia, John-Marc; Biggin, Mark D.

    2016-01-01

    Identifying protein-protein interactions (PPIs) at an acceptable false discovery rate (FDR) is challenging. Previously we identified several hundred PPIs from affinity purification-mass spectrometry (AP-MS) data for the bacteria Escherichia coli and Desulfovibrio vulgaris. These two interactomes have lower FDRs than any of the nine interactomes proposed previously for bacteria and are more enriched in PPIs validated by other data than the nine earlier interactomes. To more thoroughly determine the accuracy of our or other interactomes and to discover further PPIs de novo, here we present a quantitative tagless method that employs iTRAQ MS to measure the copurification of endogenous proteins through orthogonal chromatography steps. 5273 fractions from a four-step fractionation of a D. vulgaris protein extract were assayed, resulting in the detection of 1242 proteins. Protein partners from our D. vulgaris and E. coli AP-MS interactomes copurify as frequently as pairs belonging to three benchmark data sets of well-characterized PPIs. In contrast, the protein pairs from the nine other bacterial interactomes copurify two- to 20-fold less often. We also identify 200 high-confidence D. vulgaris PPIs based on tagless copurification and colocalization in the genome. These PPIs are as strongly validated by other data as our AP-MS interactomes and overlap with our AP-MS interactome for D. vulgaris within 3% of expectation, once FDRs and false negative rates are taken into account. Finally, we reanalyzed data from two quantitative tagless screens of human cell extracts. We estimate that the novel PPIs reported in these studies have an FDR of at least 85% and find that less than 7% of the novel PPIs identified in each screen overlap. Our results establish that a quantitative tagless method can be used to validate and identify PPIs, but that such data must be analyzed carefully to minimize the FDR. PMID:27099342

  3. Spatial access priority mapping (SAPM) with fishers: a quantitative GIS method for participatory planning.

    Science.gov (United States)

    Yates, Katherine L; Schoeman, David S

    2013-01-01

    Spatial management tools, such as marine spatial planning and marine protected areas, are playing an increasingly important role in attempts to improve marine management and accommodate conflicting needs. Robust data are needed to inform decisions among different planning options, and early inclusion of stakeholder involvement is widely regarded as vital for success. One of the biggest stakeholder groups, and the most likely to be adversely impacted by spatial restrictions, is the fishing community. In order to take their priorities into account, planners need to understand spatial variation in their perceived value of the sea. Here a readily accessible, novel method for quantitatively mapping fishers' spatial access priorities is presented. Spatial access priority mapping, or SAPM, uses only basic functions of standard spreadsheet and GIS software. Unlike the use of remote-sensing data, SAPM actively engages fishers in participatory mapping, documenting rather than inferring their priorities. By so doing, SAPM also facilitates the gathering of other useful data, such as local ecological knowledge. The method was tested and validated in Northern Ireland, where over 100 fishers participated in a semi-structured questionnaire and mapping exercise. The response rate was excellent, 97%, demonstrating fishers' willingness to be involved. The resultant maps are easily accessible and instantly informative, providing a very clear visual indication of which areas are most important for the fishers. The maps also provide quantitative data, which can be used to analyse the relative impact of different management options on the fishing industry and can be incorporated into planning software, such as MARXAN, to ensure that conservation goals can be met at minimum negative impact to the industry. 
This research shows how spatial access priority mapping can facilitate the early engagement of fishers and the ready incorporation of their priorities into the decision-making process.

  4. Spatial access priority mapping (SAPM) with fishers: a quantitative GIS method for participatory planning.

    Directory of Open Access Journals (Sweden)

    Katherine L Yates

    Full Text Available Spatial management tools, such as marine spatial planning and marine protected areas, are playing an increasingly important role in attempts to improve marine management and accommodate conflicting needs. Robust data are needed to inform decisions among different planning options, and early inclusion of stakeholder involvement is widely regarded as vital for success. One of the biggest stakeholder groups, and the most likely to be adversely impacted by spatial restrictions, is the fishing community. In order to take their priorities into account, planners need to understand spatial variation in their perceived value of the sea. Here a readily accessible, novel method for quantitatively mapping fishers' spatial access priorities is presented. Spatial access priority mapping, or SAPM, uses only basic functions of standard spreadsheet and GIS software. Unlike the use of remote-sensing data, SAPM actively engages fishers in participatory mapping, documenting rather than inferring their priorities. By so doing, SAPM also facilitates the gathering of other useful data, such as local ecological knowledge. The method was tested and validated in Northern Ireland, where over 100 fishers participated in a semi-structured questionnaire and mapping exercise. The response rate was excellent, 97%, demonstrating fishers' willingness to be involved. The resultant maps are easily accessible and instantly informative, providing a very clear visual indication of which areas are most important for the fishers. The maps also provide quantitative data, which can be used to analyse the relative impact of different management options on the fishing industry and can be incorporated into planning software, such as MARXAN, to ensure that conservation goals can be met at minimum negative impact to the industry. This research shows how spatial access priority mapping can facilitate the early engagement of fishers and the ready incorporation of their priorities into the decision-making process.

  5. A Stereological Method for the Quantitative Evaluation of Cartilage Repair Tissue

    Science.gov (United States)

    Nyengaard, Jens Randel; Lind, Martin; Spector, Myron

    2015-01-01

    Objective To implement stereological principles to develop an easily applicable algorithm for unbiased and quantitative evaluation of cartilage repair. Design Design-unbiased sampling was performed by systematically sectioning the defect perpendicular to the joint surface in parallel planes, providing 7 to 10 hematoxylin–eosin stained histological sections. Counting windows were systematically selected and converted into image files (40-50 per defect). The quantification was performed by two-step point counting: (1) calculation of defect volume and (2) quantitative analysis of tissue composition. Step 2 was performed by assigning each point to one of the following categories based on validated and easily distinguishable morphological characteristics: (1) hyaline cartilage (rounded cells in lacunae in hyaline matrix), (2) fibrocartilage (rounded cells in lacunae in fibrous matrix), (3) fibrous tissue (elongated cells in fibrous tissue), (4) bone, (5) scaffold material, and (6) others. The ability to discriminate between the tissue types was determined using conventional or polarized light microscopy, and the interobserver variability was evaluated. Results We describe the application of the stereological method. In the example, we assessed the defect repair tissue volume to be 4.4 mm^3 (CE = 0.01). The tissue fractions were subsequently evaluated. Polarized light illumination of the slides improved discrimination between hyaline cartilage and fibrocartilage and increased the interobserver agreement compared with conventional transmitted light. Conclusion We have applied a design-unbiased method for quantitative evaluation of cartilage repair, and we propose this algorithm as a natural supplement to existing descriptive semiquantitative scoring systems. We also propose that polarized light is effective for discrimination between hyaline cartilage and fibrocartilage. PMID:26069715
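The two-step point counting can be sketched numerically; all counts and grid constants below are hypothetical, chosen only to illustrate the arithmetic (a Cavalieri-type volume estimate in step 1 and category volume fractions in step 2):

```python
# Step 1 (Cavalieri estimator): V = t * a_p * total points on the defect,
# where t is the distance between parallel sections and a_p the area
# associated with each grid point.
t = 0.5        # mm between sections (assumed)
a_p = 0.04     # mm^2 per grid point (assumed)
points_per_section = [25, 31, 28, 30, 27, 26, 29]   # hypothetical counts
volume = t * a_p * sum(points_per_section)          # mm^3

# Step 2: volume fraction of each tissue category = its points / all points
counts = {"hyaline": 70, "fibrocartilage": 55, "fibrous": 40,
          "bone": 20, "scaffold": 10, "other": 5}
total = sum(counts.values())
fractions = {name: n / total for name, n in counts.items()}
print(round(volume, 2), round(fractions["hyaline"], 2))
```

Multiplying each fraction by the step-1 volume then yields the absolute volume of each repair tissue type in the defect.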

  6. A Stereological Method for the Quantitative Evaluation of Cartilage Repair Tissue.

    Science.gov (United States)

    Foldager, Casper Bindzus; Nyengaard, Jens Randel; Lind, Martin; Spector, Myron

    2015-04-01

    To implement stereological principles to develop an easily applicable algorithm for unbiased and quantitative evaluation of cartilage repair. Design-unbiased sampling was performed by systematically sectioning the defect perpendicular to the joint surface in parallel planes, providing 7 to 10 hematoxylin-eosin stained histological sections. Counting windows were systematically selected and converted into image files (40-50 per defect). The quantification was performed by two-step point counting: (1) calculation of defect volume and (2) quantitative analysis of tissue composition. Step 2 was performed by assigning each point to one of the following categories based on validated and easily distinguishable morphological characteristics: (1) hyaline cartilage (rounded cells in lacunae in hyaline matrix), (2) fibrocartilage (rounded cells in lacunae in fibrous matrix), (3) fibrous tissue (elongated cells in fibrous tissue), (4) bone, (5) scaffold material, and (6) others. The ability to discriminate between the tissue types was determined using conventional or polarized light microscopy, and the interobserver variability was evaluated. We describe the application of the stereological method. In the example, we assessed the defect repair tissue volume to be 4.4 mm^3 (CE = 0.01). The tissue fractions were subsequently evaluated. Polarized light illumination of the slides improved discrimination between hyaline cartilage and fibrocartilage and increased the interobserver agreement compared with conventional transmitted light. We have applied a design-unbiased method for quantitative evaluation of cartilage repair, and we propose this algorithm as a natural supplement to existing descriptive semiquantitative scoring systems. We also propose that polarized light is effective for discrimination between hyaline cartilage and fibrocartilage.

  7. Quantitative analysis of patients with celiac disease by video capsule endoscopy: A deep learning method.

    Science.gov (United States)

    Zhou, Teng; Han, Guoqiang; Li, Bing Nan; Lin, Zhizhe; Ciaccio, Edward J; Green, Peter H; Qin, Jing

    2017-06-01

    Celiac disease is one of the most common diseases in the world. Capsule endoscopy is an alternative way to visualize the entire small intestine without invasiveness to the patient. It is useful for characterizing celiac disease, but hours are needed to manually analyze the retrospective data of a single patient. Computer-aided quantitative analysis by a deep learning method helps in alleviating the workload during analysis of the retrospective videos. Capsule endoscopy clips from 6 celiac disease patients and 5 controls were preprocessed for training. Frames with a large field of opaque extraluminal fluid or air bubbles were removed automatically by a pre-selection algorithm. The frames were then cropped and intensity-corrected prior to frame rotation in the proposed new method. GoogLeNet was trained with these frames. Capsule endoscopy clips from 5 additional celiac disease patients and 5 additional controls were then used for testing. The trained GoogLeNet was able to distinguish frames from capsule endoscopy clips of celiac disease patients vs controls. A quantitative measurement with evaluation of confidence was developed to assess the severity level of pathology in the subjects. Relying on the evaluation confidence, GoogLeNet achieved 100% sensitivity and specificity for the testing set. A t-test confirmed that the evaluation confidence significantly distinguishes celiac disease patients from controls. Furthermore, the evaluation confidence may also relate to the severity level of small bowel mucosal lesions. A deep convolutional neural network was thus established for quantitative measurement of the existence and degree of pathology throughout the small intestine, which may improve computer-aided clinical techniques for assessing mucosal atrophy and other etiologies in real time with videocapsule endoscopy. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. A SVM-based quantitative fMRI method for resting-state functional network detection.

    Science.gov (United States)

    Song, Xiaomu; Chen, Nan-kuei

    2014-09-01

    Resting-state functional magnetic resonance imaging (fMRI) aims to measure baseline neuronal connectivity independent of specific functional tasks and to capture changes in connectivity due to neurological diseases. Most existing network detection methods rely on a fixed threshold to identify functionally connected voxels under the resting state. Due to fMRI non-stationarity, the threshold cannot adapt to the variation of data characteristics across sessions and subjects, and generates unreliable mapping results. In this study, a new method is presented for resting-state fMRI data analysis. Specifically, resting-state network mapping is formulated as an outlier detection process that is implemented using a one-class support vector machine (SVM). The results are refined by using a spatial-feature-domain prototype selection method and two-class SVM reclassification. The final decision on each voxel is made by comparing its probabilities of being functionally connected and unconnected, rather than by applying a threshold. Multiple features for resting-state analysis were extracted and examined using an SVM-based feature selection method, and the most representative features were identified. The proposed method was evaluated using synthetic and experimental fMRI data. A comparison study was also performed with independent component analysis (ICA) and correlation analysis. The experimental results show that the proposed method can provide comparable or better network detection performance than ICA and correlation analysis. The method is potentially applicable to various resting-state quantitative fMRI studies.
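The outlier-detection step alone can be sketched with scikit-learn's one-class SVM. The voxel features below are synthetic (e.g., a seed-correlation score plus one more feature), and for simplicity the model is trained on a background-only sample rather than on all voxels as a full pipeline would:

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(5)
# synthetic voxel feature vectors
background = rng.normal(0.0, 0.2, size=(950, 2))   # unconnected voxels
network = rng.normal(0.8, 0.1, size=(50, 2))       # connected voxels

# one-class SVM learns the support of the background distribution;
# connected voxels then surface as outliers (predicted label -1)
ocsvm = OneClassSVM(nu=0.05, kernel="rbf", gamma="scale").fit(background)
hit_rate = float(np.mean(ocsvm.predict(network) == -1))
false_rate = float(np.mean(ocsvm.predict(background) == -1))
print(round(hit_rate, 2), round(false_rate, 2))
```

In the method described above, this raw outlier labeling would then be refined by prototype selection and a two-class SVM before the final per-voxel probability comparison.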

  9. Development of the local magnification method for quantitative evaluation of endoscope geometric distortion

    Science.gov (United States)

    Wang, Quanzeng; Cheng, Wei-Chung; Suresh, Nitin; Hua, Hong

    2016-05-01

    With improved diagnostic capabilities and complex optical designs, endoscopic technologies are advancing. As one of several important optical performance characteristics, geometric distortion can negatively affect size estimation and feature-identification-related diagnosis. Therefore, a quantitative and simple distortion evaluation method is imperative for both the endoscopic industry and medical device regulatory agencies; however, no such method is available yet. While image correction techniques are rather mature, they heavily depend on computational power to process multidimensional image data based on complex mathematical models that are difficult to understand. Some commonly used distortion evaluation methods, such as picture height distortion (DPH) or radial distortion (DRAD), are either too simple to accurately describe the distortion or subject to the error of deriving a reference image. We developed the basic local magnification (ML) method to evaluate endoscope distortion. Based on this method, we also developed ways to calculate DPH and DRAD. The method overcomes the aforementioned limitations, has clear physical meaning across the whole field of view, and can facilitate lesion size estimation during diagnosis. Most importantly, the method can facilitate bringing endoscopic technology to market and could potentially be adopted in an international endoscope standard.
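The quantities involved can be illustrated with a toy radial distortion model (the model and its coefficient are assumptions for illustration, not the paper's derivation): given a mapping r_d = r_u(1 + k1·r_u²) from undistorted to distorted image radius, the local magnification varies as d(r_d)/d(r_u), and the classic radial distortion percentage is DRAD = 100·(r_d − r_u)/r_u.

```python
import numpy as np

k1 = -0.08                          # barrel distortion coefficient (assumed)
r_u = np.linspace(0.01, 1.0, 100)   # normalized undistorted radius
r_d = r_u * (1 + k1 * r_u ** 2)     # toy distortion model

local_mag = np.gradient(r_d, r_u)   # local magnification relative to on-axis
drad = 100.0 * (r_d - r_u) / r_u    # percent radial distortion

print(round(float(drad[-1]), 1))    # -8.0 % at the field edge
```

A negative DRAD growing toward the field edge is the familiar barrel distortion of wide-angle endoscopes; a map of local magnification over the field of view is what allows lesion size to be corrected wherever the lesion appears in the image.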

  10. Painting With and Without Numbers: the use of qualitative and quantitative methods to study social learning

    Directory of Open Access Journals (Sweden)

    Leoni Warne

    2005-05-01

    Full Text Available The Enterprise Social Learning Architecture (ESLA) team of the Defence Science and Technology Organisation (DSTO) conducted a four-year research study investigating social learning within the Australian Defence Organisation (ADO). The immediate aim of this research was to understand the issues inherent in building learning, adaptive and sustainable systems. The long-term objective was to develop architectures that would support the development of information systems to guide and enhance organisational learning and facilitate knowledge management. In this paper we will discuss the methodologies used by the ESLA team to gain understanding into effective social learning and the organisational and cultural factors that support such learning. Also, the paper will discuss the lessons learned from methodological approaches to this study as well as support tools used to analyse large volumes of qualitative data. There has been an increasing emphasis in the past decade on investigating the social and organisational factors that may underpin successful information system development and usage (Butterfield and Pendegraft, 1996; Davenport and Prusak, 1992; DeLone and McLean, 1992). Investigation of these issues necessitates a sound understanding of organisational culture, human social interactions, communication and relationships, and reflects an increasing awareness of the importance of the social aspects of socio-technical systems that people work and operate in. This paper describes the process by which the qualitative methods in this study of knowledge processes were expanded to include quantitative methods. It focuses on how this combination of data collection methods evolved, and the ways in which it was capitalised on to provide a much more enriched set of findings than would have been the case if qualitative or quantitative methods had been used alone. The paper also focuses on pitfalls that arose in the use of the various methods, including those

  11. Quantitative determination of sibutramine in adulterated herbal slimming formulations by TLC-image analysis method.

    Science.gov (United States)

    Phattanawasin, Panadda; Sotanaphun, Uthai; Sukwattanasinit, Tasamaporn; Akkarawaranthorn, Jariya; Kitchaiya, Sarunyaporn

    2012-06-10

    A simple thin layer chromatographic (TLC)-image analysis method was developed for rapid determination and quantitation of sibutramine hydrochloride (SH) adulterated in herbal slimming products. Chromatographic separation of SH was achieved on a silica gel 60 F(254) TLC plate, using toluene-n-hexane-diethylamine (9:1:0.3, v/v/v) as the mobile phase and Dragendorff reagent for spot detection. Image analysis of the scanned TLC plate was performed to quantify the amount of SH. The polynomial regression data for the calibration plots showed a good linear relationship in the concentration range of 1-6 μg/spot. The limits of detection and quantitation were 190 and 634 ng/spot, respectively. The method gave satisfactory specificity, precision, accuracy and robustness, and was applied for determination of SH in herbal formulations. The contents of SH in adulterated samples determined by TLC-image analysis and TLC-densitometry were also compared. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
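    The calibration-and-limits workflow in records like this one can be sketched as follows. The record uses polynomial regression; the sketch below assumes a straight-line fit and the common 3.3σ/S and 10σ/S definitions of LOD and LOQ, with made-up data:

```python
import statistics

def linear_calibration(conc, signal):
    """Least-squares line plus LOD/LOQ from residual SD over slope
    (LOD = 3.3*sigma/S, LOQ = 10*sigma/S)."""
    mx, my = statistics.fmean(conc), statistics.fmean(signal)
    slope = sum((x - mx) * (y - my) for x, y in zip(conc, signal)) \
        / sum((x - mx) ** 2 for x in conc)
    intercept = my - slope * mx
    resid = [y - (intercept + slope * x) for x, y in zip(conc, signal)]
    sigma = (sum(r * r for r in resid) / (len(conc) - 2)) ** 0.5
    return slope, intercept, 3.3 * sigma / slope, 10 * sigma / slope

conc = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]        # micrograms per spot
signal = [2.0 * c + 1.0 for c in conc]       # idealized spot intensities
slope, intercept, lod, loq = linear_calibration(conc, signal)
```

    With real, noisy intensities the residual SD is nonzero and the LOD/LOQ come out as finite concentrations, as in the reported 190 and 634 ng/spot.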

  12. Evaluation of the remineralization capacity of CPP-ACP containing fluoride varnish by different quantitative methods

    Directory of Open Access Journals (Sweden)

    Selcuk SAVAS

    Full Text Available ABSTRACT Objective The aim of this study was to evaluate the efficacy of CPP-ACP containing fluoride varnish for remineralizing white spot lesions (WSLs with four different quantitative methods. Material and Methods Four windows (3x3 mm were created on the enamel surfaces of bovine incisor teeth. A control window was covered with nail varnish, and WSLs were created on the other windows (after demineralization, first week and fourth week in acidified gel system. The test material (MI Varnish was applied on the demineralized areas, and the treated enamel samples were stored in artificial saliva. At the fourth week, the enamel surfaces were tested by surface microhardness (SMH, quantitative light-induced fluorescence-digital (QLF-D, energy-dispersive spectroscopy (EDS and laser fluorescence (LF pen. The data were statistically analyzed (α=0.05. Results While the LF pen measurements showed significant differences at baseline, after demineralization, and after the one-week remineralization period (p0.05. With regards to the SMH and QLF-D analyses, statistically significant differences were found among all the phases (p<0.05. After the 1- and 4-week treatment periods, the calcium (Ca and phosphate (P concentrations and Ca/P ratio were higher compared to those of the demineralization surfaces (p<0.05. Conclusion CPP-ACP containing fluoride varnish provides remineralization of WSLs after a single application and seems suitable for clinical use.

  13. Comparison of analytic methods for quantitative real-time polymerase chain reaction data.

    Science.gov (United States)

    Chen, Ping; Huang, Xuelin

    2015-11-01

    Polymerase chain reaction (PCR) is a laboratory procedure to amplify and simultaneously quantify targeted DNA molecules, and then detect the product of the reaction at the end of all the amplification cycles. A more modern technique, real-time PCR, also known as quantitative PCR (qPCR), detects the product after each cycle of the progressing reaction by applying a specific fluorescence technique. The quantitative methods currently used to analyze qPCR data result in varying levels of estimation quality. This study compares the accuracy and precision of the estimation achieved by eight different models when applied to the same qPCR dataset. Also, the study evaluates a newly introduced data preprocessing approach, the taking-the-difference approach, and compares it to the currently used approach of subtracting the background fluorescence. The taking-the-difference method subtracts the fluorescence in the former cycle from that in the latter cycle to avoid estimating the background fluorescence. The results obtained from the eight models show that taking-the-difference is a better way to preprocess qPCR data compared to the original approach because of a reduction in the background estimation error. The results also show that weighted models are better than non-weighted models, and that the precision of the estimation achieved by the mixed models is slightly better than that achieved by the linear regression models.
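    The taking-the-difference preprocessing described above can be shown on a toy signal. In the exponential phase, successive differences cancel any constant background, and their ratio recovers the amplification efficiency; the background, amplitude, and efficiency values below are made up:

```python
def efficiency_from_differences(fluorescence):
    # taking-the-difference: F[c] - F[c-1] cancels a constant background,
    # and in the exponential phase the ratio of successive differences
    # equals the amplification efficiency E
    diffs = [b - a for a, b in zip(fluorescence, fluorescence[1:])]
    ratios = [d2 / d1 for d1, d2 in zip(diffs, diffs[1:])]
    return sum(ratios) / len(ratios)

# synthetic exponential-phase signal with constant background (made-up values)
B, A, E = 5.0, 0.01, 1.9
signal = [B + A * E ** c for c in range(1, 11)]
eff = efficiency_from_differences(signal)   # recovers E without estimating B
```

    This is why the approach avoids the background-estimation error of the subtract-the-background preprocessing: B never has to be estimated at all.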

  14. Proteus mirabilis biofilm - qualitative and quantitative colorimetric methods-based evaluation.

    Science.gov (United States)

    Kwiecinska-Piróg, Joanna; Bogiel, Tomasz; Skowron, Krzysztof; Wieckowska, Ewa; Gospodarek, Eugenia

    2014-01-01

    The ability of Proteus mirabilis strains to form biofilm is a current topic of research worldwide. In this study, the biofilm formation of P. mirabilis strains derived from the urine of catheterized and non-catheterized patients was investigated. A total of 39 P. mirabilis strains isolated between 2011 and 2012 from urine samples of patients of the Dr. Antoni Jurasz University Hospital No. 1 clinics in Bydgoszcz were used. Biofilm formation was evaluated using two independent quantitative and qualitative methods with TTC (2,3,5-triphenyl-tetrazolium chloride) and CV (crystal violet) application. The obtained results confirmed biofilm formation by all the examined strains, except in the quantitative method with TTC, in which 7.7% of the strains did not show this ability. It was shown that P. mirabilis rods have the ability to form biofilm on the surfaces of both biomaterials applied, polystyrene and polyvinyl chloride (Nelaton catheters). The differences in ability to form biofilm observed between P. mirabilis strains derived from the urine of the catheterized and non-catheterized patients were not statistically significant.

  16. Dynamic and Quantitative Method of Analyzing Service Consistency Evolution Based on Extended Hierarchical Finite State Automata

    Directory of Open Access Journals (Sweden)

    Linjun Fan

    2014-01-01

    Full Text Available This paper is concerned with the dynamic evolution analysis and quantitative measurement of primary factors that cause service inconsistency in service-oriented distributed simulation applications (SODSA). Traditional methods are mostly qualitative and empirical, and they do not consider the dynamic disturbances among factors in services' evolution behaviors such as producing, publishing, calling, and maintenance. Moreover, SODSA are rapidly evolving in terms of large-scale, reusable, compositional, pervasive, and flexible features, which presents difficulties in the usage of traditional analysis methods. To resolve these problems, a novel dynamic evolution model, extended hierarchical service-finite state automata (EHS-FSA), is constructed based on finite state automata (FSA), which formally depicts the overall changing processes of service consistency states. In addition, the service consistency evolution algorithms (SCEAs) based on EHS-FSA are developed to quantitatively assess these impact factors. Experimental results show that bad reusability (17.93% on average) is the biggest influential factor, noncomposition of atomic services (13.12%) is the second biggest one, and service version confusion (1.2%) is the smallest one. Compared with previous qualitative analysis, SCEAs present good effectiveness and feasibility. This research can guide the engineers of service consistency technologies toward obtaining a higher level of consistency in SODSA.

  17. Joint Analysis Method for Major Genes Controlling Multiple Correlated Quantitative Traits

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Based on the major gene and polygene mixed inheritance model for multiple correlated quantitative traits, the authors proposed a new joint segregation analysis method for a major gene controlling multiple correlated quantitative traits, which includes major gene detection and estimation of its effect and variation. The effect and variation of the major gene are estimated by the maximum likelihood method implemented via the expectation-maximization (EM) algorithm. The major gene is tested with the likelihood ratio (LR) test statistic. Extensive simulation studies showed that joint analysis not only increases the statistical power of major gene detection but also improves the precision and accuracy of major gene effect estimates. An example of the plant height and the number of tillers of the F2 population in the rice cross Duonieai × Zhonghua 11 was used in the illustration. The results indicated that the genetic difference of these two traits in this cross refers to only one pleiotropic major gene. The additive effect and dominance effect of the major gene are estimated as -21.3 and 40.6 cm on plant height, and 22.7 and -25.3 on number of tillers, respectively. The major gene shows overdominance for plant height and close to complete dominance for number of tillers.
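    The EM machinery behind such segregation analyses can be illustrated with a generic two-component normal mixture, a simplified stand-in for the major-gene/polygene likelihood (the synthetic phenotype data and component parameters are made up, not the rice data):

```python
import math
import random

def em_two_normals(xs, iters=100):
    """EM for a two-component normal mixture -- an illustrative stand-in for
    fitting genotype classes in a major-gene segregation analysis."""
    mu = [min(xs), max(xs)]
    var = [1.0, 1.0]
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each value
        resp = []
        for x in xs:
            dens = [w[k] / math.sqrt(2 * math.pi * var[k])
                    * math.exp(-(x - mu[k]) ** 2 / (2 * var[k]))
                    for k in (0, 1)]
            s = dens[0] + dens[1]
            resp.append([d / s for d in dens])
        # M-step: re-estimate weights, means and variances
        for k in (0, 1):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(xs)
            mu[k] = sum(r[k] * x for r, x in zip(resp, xs)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2
                         for r, x in zip(resp, xs)) / nk + 1e-9
    return w, mu, var

# synthetic F2-like phenotypes: two well-separated genotype classes (made up)
random.seed(1)
data = ([random.gauss(0.0, 1.0) for _ in range(200)]
        + [random.gauss(8.0, 1.0) for _ in range(200)])
weights, means, variances = em_two_normals(data)
```

    A likelihood-ratio test for the major gene would then compare the maximized mixture likelihood against a single-normal null, which this sketch omits.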

  18. Clinical Comparison of Root Length Measurements with Electronic Apex Locator and Conventional Radiography in Mandibular Deciduous Teeth

    Directory of Open Access Journals (Sweden)

    Eskandarian T.

    2011-04-01

    Full Text Available Statement of Problem: Success in pulpectomy of deciduous teeth greatly depends on the accuracy of root length measurements, which have mostly been done radiographically. However, X-ray risks and patient cooperation have usually been a challenge for clinicians. Purpose: The aim of the present study was the clinical comparison of root length measurements with an electronic apex locator and conventional radiography in the mandibular deciduous teeth of 4-6 year old children. Materials and Method: In the current clinical trial, 15 mandibular molars with 60 canals in 4-6 year old patients with a treatment plan of pulpectomy were chosen. The root lengths measured with the apex locator and with parallel-technique radiography were evaluated with the same reference point. Data were analyzed using simple linear regression, coefficient of correlation, coefficient of variability, and Bland-Altman plots. Results: The accuracy of the electronic apex locator measurements within ±0.5 mm of the apical foramen was 85%. In all cases, without considering pulp status, the difference between the two techniques was not significant. Conclusion: Electronic apex locators are recommended for root length measurements of deciduous mandibular molars without apical resorption, disregarding the pulp status, especially when initial radiographic films are available.
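    The agreement analysis used in studies like this one can be sketched directly: a Bland-Altman bias with 95% limits of agreement, plus the fraction of paired measurements within a ±0.5 mm tolerance. The paired root lengths below are made-up numbers, not the study's data:

```python
import statistics

def bland_altman(a, b):
    """Bias and 95% limits of agreement between two measurement methods."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.fmean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

def within_tolerance(a, b, tol=0.5):
    # fraction of paired measurements agreeing within +/- tol (mm)
    return sum(abs(x - y) <= tol for x, y in zip(a, b)) / len(a)

# made-up paired root lengths (mm): apex locator vs radiography
apex = [15.0, 14.5, 16.0, 15.5, 14.0, 15.0]
radio = [15.2, 14.9, 15.8, 15.6, 14.6, 15.1]
bias, (lo, hi) = bland_altman(apex, radio)
```

    The "accuracy within ±0.5 mm" figures quoted across these apex-locator records are exactly the `within_tolerance` quantity.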

  19. Evaluation of Conventional Radiography and an Electronic Apex Locator in Determining the Working Length in C-shaped Canals

    Science.gov (United States)

    Jafarzadeh, Hamid; Beyrami, Masoud; Forghani, Maryam

    2017-01-01

    Introduction: The purpose of this in vitro study was to compare the accuracy of working length determination using the apex locator versus conventional radiography in C-shaped canals. Methods and Materials: After confirming the actual C-shaped anatomy using cone-beam computed tomography (CBCT), 22 extracted C-shaped mandibular second molars were selected and decoronated at the cemento-enamel junction. The actual working length of these canals was determined by inserting a #15 K-file until the tip could be seen through the apical foramen; the working length was then established by subtracting 0.5 mm from this length. The working length was also determined using conventional analog radiography and an electronic apex locator (EAL), both of which were compared with the actual working length. The data were statistically analyzed using the paired t-test and the marginal homogeneity test. Results: There were no significant differences between the working lengths obtained with the apex locator and conventional radiography for the mesiolingual and distal canals (P>0.05), while significant differences were observed in measurements of the mesiobuccal canals (P=0.036). Within a ±0.5 mm tolerance margin there was no significant difference between the EAL and conventional radiography. Conclusion: The apex locator was more accurate in determining the working length of C-shaped canals compared with conventional radiography. PMID:28179926

  20. A novel HPTLC method for quantitative estimation of biomarkers in polyherbal formulation

    Institute of Scientific and Technical Information of China (English)

    Zeeshan Ahmed Sheikh; Sadia Shakeel; Somia Gul; Aqib Zahoor; Saleha Suleman Khan; Faisal Haider Zaidi; Khan Usmanghani

    2015-01-01

    Objective: To explore the quantitative estimation of the biomarkers gallic acid and berberine in the polyherbal formulation Entoban syrup. Methods: High performance thin layer chromatography was performed to evaluate the presence of gallic acid and berberine, employing toluene:ethyl acetate:formic acid:methanol 12:9:4:0.5 (v/v/v/v) and ethanol:water:formic acid 90:9:1 (v/v/v) as mobile phases, respectively. Results: The Rf values for gallic acid (0.58) and berberine (0.76) in both sample and reference standard were found comparable under UV light at 273 nm and 366 nm, respectively. The high performance thin layer chromatography method developed for quantitation was simple, accurate and specific. Conclusions: The present standardization provides a specific and accurate tool to develop qualifications for identity, transparency and reproducibility of biomarkers in Entoban syrup.
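    The Rf comparison underlying this identification step is simple arithmetic: spot migration distance over solvent-front distance, matched against the reference standard within a small tolerance. The migration distances and tolerance below are made-up illustrative values:

```python
def rf_value(spot_distance_mm, solvent_front_mm):
    """Retardation factor: spot migration over solvent-front migration."""
    return spot_distance_mm / solvent_front_mm

def matches_reference(rf_sample, rf_standard, tol=0.02):
    # identity check: sample and reference spots should give comparable Rf
    return abs(rf_sample - rf_standard) <= tol

# made-up migration distances reproducing the reported Rf of gallic acid (0.58)
rf_gallic = rf_value(46.4, 80.0)
```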

  2. A Quantitative Calculation Method of Composite Spatial Direction Similarity Concerning Scale Differences

    Directory of Open Access Journals (Sweden)

    CHEN Zhanlong

    2016-03-01

    Full Text Available This article introduces a new model for direction relations between multiple spatial objects at multiple scales, together with a corresponding similarity assessment method. The model improves on the direction relation matrix, which quantitatively models direction relations at the object scale. Using the idea of decomposition and the optimum solution of the transportation problem, it computes the minimum conversion cost between multiple direction matrices, namely the distance between a pair of matrices, thereby quantifying the difference between a pair of directions; finally, it obtains the similarity values between arbitrary pairs of multiple spatial objects and compares the results. Experiments on calculating similarity between objects at different scales show that the presented method is efficient, accurate, and capable of obtaining results consistent with human cognition.
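    The "minimum conversion cost" between direction matrices is a transportation (optimal transport) problem. As a simplified illustration, for histograms on a single ordered axis with unit ground distance the transportation problem has a closed form: the accumulated absolute difference of the CDFs. The histograms below are made up; the paper's full method works on 2-D direction-relation matrices:

```python
def transport_cost_1d(p, q):
    """Minimum conversion cost between two histograms on an ordered axis
    (unit ground distance): the 1-D transportation problem reduces to the
    running sum of absolute CDF differences."""
    cost, carry = 0.0, 0.0
    for pi, qi in zip(p, q):
        carry += pi - qi          # mass that must still move rightwards
        cost += abs(carry)
    return cost

# moving all mass two tiles costs 2; identical histograms cost 0
d = transport_cost_1d([1.0, 0.0, 0.0], [0.0, 0.0, 1.0])
```

    In the general 2-D case the same minimum-cost flow must be found with a transportation solver rather than this closed form.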

  3. Genetic programming: a novel method for the quantitative analysis of pyrolysis mass spectral data.

    Science.gov (United States)

    Gilbert, R J; Goodacre, R; Woodward, A M; Kell, D B

    1997-11-01

    A technique for the analysis of multivariate data by genetic programming (GP) is described, with particular reference to the quantitative analysis of orange juice adulteration data collected by pyrolysis mass spectrometry (PyMS). The dimensionality of the input space was reduced by ranking variables according to product moment correlation or mutual information with the outputs. The GP technique as described gives predictive errors equivalent to, if not better than, more widespread methods such as partial least squares and artificial neural networks but additionally can provide a means for easing the interpretation of the correlation between input and output variables. The described application demonstrates that by using the GP method for analyzing PyMS data the adulteration of orange juice with 10% sucrose solution can be quantified reliably over a 0-20% range with an RMS error in the estimate of ∼1%.
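    The dimensionality-reduction step mentioned above (ranking input variables by product moment correlation with the outputs) can be sketched as follows; the toy data and variable names are assumptions for illustration:

```python
import statistics

def pearson(xs, ys):
    # product moment (Pearson) correlation coefficient
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs)
           * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

def rank_variables(columns, y):
    """Rank input variables (e.g. m/z intensities) by |r| with the output."""
    scored = [(abs(pearson(col, y)), i) for i, col in enumerate(columns)]
    return [i for _, i in sorted(scored, reverse=True)]

# made-up data: variable 0 tracks the output, variable 1 is noise-like
y = [0.0, 5.0, 10.0, 15.0, 20.0]
cols = [[0.1, 5.2, 9.8, 15.1, 19.9], [3.0, 1.0, 4.0, 1.0, 5.0]]
order = rank_variables(cols, y)
```

    Only the top-ranked variables are then fed into the genetic program, shrinking the GP search space.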

  4. Methods for a quantitative evaluation of odd-even staggering effects

    CERN Document Server

    Olmi, Alessandro

    2015-01-01

    Odd-even effects, also known as "staggering" effects, are a common feature observed in the yield distributions of fragments produced in different types of nuclear reactions. We review old methods, and propose new ones, for a quantitative estimation of these effects as a function of the proton or neutron number of the reaction products. All methods are compared on the basis of Monte Carlo simulations. We find that some are not well suited for the task, the most reliable ones being those based either on a non-linear fit with a properly oscillating function or on a third (or fourth) finite difference approach. In any case, high statistics are of paramount importance to prevent spurious structures from appearing merely because of statistical fluctuations in the data and of strong correlations among the yields of neighboring fragments.
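    One common variant of the third-finite-difference approach mentioned above extracts the staggering amplitude from differences of the logarithmic yields; a third difference cancels any smooth trend up to quadratic order exactly. The synthetic yield curve below (linear trend plus a fixed 0.3 staggering) is a made-up example:

```python
def staggering_third_difference(log_yields):
    """Odd-even staggering from the third finite difference of ln(yield):
    delta(Z) = (-1)**(Z+1) / 8 * [L(Z+3) - 3*L(Z+2) + 3*L(Z+1) - L(Z)].
    A smooth (up to quadratic) trend in L cancels exactly."""
    return [((-1) ** (z + 1)) / 8.0
            * (log_yields[z + 3] - 3 * log_yields[z + 2]
               + 3 * log_yields[z + 1] - log_yields[z])
            for z in range(len(log_yields) - 3)]

# synthetic ln(yields): smooth linear trend plus a 0.3 odd-even staggering
lnY = [1.0 + 0.1 * z + 0.3 * ((-1) ** z) for z in range(12)]
deltas = staggering_third_difference(lnY)
```

    On this noiseless input every delta equals the injected 0.3; with real, finite statistics the deltas fluctuate, which is the point the record makes about needing high statistics.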

  5. Standard test method for quantitative determination of americium 241 in plutonium by Gamma-Ray spectrometry

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1994-01-01

    1.1 This test method covers the quantitative determination of americium 241 by gamma-ray spectrometry in plutonium nitrate solution samples that do not contain significant amounts of radioactive fission products or other high specific activity gamma-ray emitters. 1.2 This test method can be used to determine the americium 241 in samples of plutonium metal, oxide and other solid forms, when the solid is appropriately sampled and dissolved. 1.3 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  6. Quantitative Study on Nonmetallic Inclusion Particles in Steels by Automatic Image Analysis With Extreme Values Method

    Institute of Scientific and Technical Information of China (English)

    Cássio Barbosa; José Brant de Campos; J(ǒ)neo Lopes do Nascimento; Iêda Maria Vieira Caminha

    2009-01-01

    The presence of nonmetallic inclusion particles, which appear during the steelmaking process, is harmful to the properties of steels, mainly as a function of aspects such as the size, volume fraction, shape, and distribution of these particles. The automatic image analysis technique is one of the most important tools for the quantitative determination of these parameters. The classical Student approach and the Extreme Values Method (EVM) were used for determining inclusion size and shape and for evaluating the distance between inclusion particles. The results thus obtained indicated that there were significant differences in the characteristics of the inclusion particles in the analyzed products. The two methods achieved results with some differences, indicating that EVM could be used as a faster and more reliable statistical methodology.
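    Extreme-values analysis of inclusions typically fits a Gumbel distribution to the maximum inclusion size found in each inspection field and extrapolates a return level. The method-of-moments fit and the made-up maxima below are a simplified sketch of that idea, not the paper's procedure:

```python
import math
import statistics

EULER_GAMMA = 0.5772156649

def gumbel_fit(maxima):
    """Method-of-moments Gumbel fit to per-field maximum inclusion sizes:
    sd = pi*beta/sqrt(6) and mean = mu + gamma*beta."""
    beta = statistics.stdev(maxima) * math.sqrt(6) / math.pi
    mu = statistics.fmean(maxima) - EULER_GAMMA * beta
    return mu, beta

def return_level(mu, beta, T):
    # inclusion size expected to be exceeded once in T inspection fields
    return mu - beta * math.log(-math.log(1 - 1 / T))

# made-up maximum inclusion diameters (micrometres) from 8 inspection fields
maxima = [12.0, 15.5, 11.0, 18.2, 14.1, 13.3, 16.8, 12.9]
mu, beta = gumbel_fit(maxima)
```

    The return level for a large T predicts the largest inclusion expected in a much bigger inspected volume, which is what makes EVM attractive for cleanliness rating.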

  7. Enantiomer labelling, a method for the quantitative analysis of amino acids.

    Science.gov (United States)

    Frank, H; Nicholson, G J; Bayer, E

    1978-12-21

    Enantiomer labelling, a method for the quantitative analysis of optically active natural compounds by gas chromatography, involves the use of the unnatural enantiomer as an internal standard. With Chirasil-Val, a chiral stationary phase that is thermally stable up to 240 degrees, the enantiomers of amino acids and a variety of other compounds can be separated and quantitated. Incomplete recovery from the sample, incomplete derivatization, hydrolysis and thermal decomposition of the derivative, and shifting response factors can be compensated for by adding the unnatural enantiomer. The accuracy of amino acid analysis by enantiomer labelling is equal or superior to that of hitherto known methods. The procedure affords a complete analysis of peptides with respect to both amino acid composition and the optical purity of each amino acid.
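    The internal-standard arithmetic behind enantiomer labelling is compact: because both enantiomers share a response factor and suffer identical losses, the natural amount follows from the peak-area ratio against the known spike. The peak areas and spike amount below are made-up values:

```python
def enantiomer_label_quant(area_natural, area_spiked, spike_amount_nmol):
    """Both enantiomers have identical response factors and identical losses,
    so the amount ratio equals the peak-area ratio against the spiked
    unnatural enantiomer."""
    return area_natural / area_spiked * spike_amount_nmol

# made-up peaks: natural L-enantiomer vs a 10 nmol unnatural-enantiomer spike
amount_l = enantiomer_label_quant(4500.0, 3000.0, 10.0)
```

    This is why incomplete recovery or derivatization drops out of the result: both peak areas shrink by the same factor, leaving the ratio unchanged.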

  8. A robust GC-MS method for the quantitation of fatty acids in biological systems.

    Science.gov (United States)

    Jayasinghe, Nirupama Samanmalie; Dias, Daniel Anthony

    2013-01-01

    Fatty acids (FAs) are involved in a wide range of functions in biological systems. It is important to measure the exact amount of fatty acids in biological matrices in order to determine their levels and understand the roles they play. The ability to quantify fatty acids in various systems, especially plant species and microbes, has recently paved the way to the mass production of pharmaceuticals and energy substitutes including biodiesel. This chapter describes an efficient method to quantify the total fatty acids (TFAs) in biological systems using gas chromatography-mass spectrometry (GC-MS) and a commercially available standard mix of fatty acid methyl esters (FAMEs), with a step-by-step methodology to set up a quantitation method in the Agilent Chemstation software.

  9. Simple saponification method for the quantitative determination of carotenoids in green vegetables.

    Science.gov (United States)

    Larsen, Erik; Christensen, Lars P

    2005-08-24

    A simple, reliable, and gentle saponification method for the quantitative determination of carotenoids in green vegetables was developed. The method involves an extraction procedure with acetone and the selective removal of the chlorophylls and esterified fatty acids from the organic phase using a strongly basic resin (Ambersep 900 OH). Extracts from common green vegetables (beans, broccoli, green bell pepper, chive, lettuce, parsley, peas, and spinach) were analyzed by high-performance liquid chromatography (HPLC) for their content of major carotenoids before and after action of Ambersep 900 OH. The mean recovery percentages for most carotenoids [(all-E)-violaxanthin, (all-E)-lutein epoxide, (all-E)-lutein, neolutein A, and (all-E)-beta-carotene] after saponification of the vegetable extracts with Ambersep 900 OH were close to 100% (99-104%), while the mean recovery percentages of (9'Z)-neoxanthin increased to 119% and that of (all-E)-neoxanthin and neolutein B decreased to 90% and 72%, respectively.

  10. Applying quantitative benefit-risk analysis to aid regulatory decision making in diagnostic imaging: methods, challenges, and opportunities.

    Science.gov (United States)

    Agapova, Maria; Devine, Emily Beth; Bresnahan, Brian W; Higashi, Mitchell K; Garrison, Louis P

    2014-09-01

    Health agencies making regulatory marketing-authorization decisions use qualitative and quantitative approaches to assess expected benefits and expected risks associated with medical interventions. There is, however, no universal standard approach that regulatory agencies consistently use to conduct benefit-risk assessment (BRA) for pharmaceuticals or medical devices, including for imaging technologies. Economics, health services research, and health outcomes research use quantitative approaches to elicit preferences of stakeholders, identify priorities, and model health conditions and health intervention effects. Challenges to BRA in medical devices are outlined, highlighting additional barriers in radiology. Three quantitative methods (multi-criteria decision analysis, health outcomes modeling, and stated-choice survey) are assessed using criteria that are important in balancing benefits and risks of medical devices and imaging technologies. To be useful in regulatory BRA, quantitative methods need to: aggregate multiple benefits and risks, incorporate qualitative considerations, account for uncertainty, and make clear whose preferences/priorities are being used. Each quantitative method performs differently across these criteria, and little is known about how BRA estimates and conclusions vary by approach. While no specific quantitative method is likely to be the strongest in all of the important areas, quantitative methods may have a place in BRA of medical devices and radiology. Quantitative BRA approaches have been more widely applied in medicines, with fewer BRAs in devices. Despite substantial differences in characteristics of pharmaceuticals and devices, BRA methods may be as applicable to medical devices and imaging technologies as they are to pharmaceuticals. Further research to guide the development and selection of quantitative BRA methods for medical devices and imaging technologies is needed. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.
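    Of the three quantitative methods the record lists, multi-criteria decision analysis is the simplest to sketch: criterion scores are aggregated with stakeholder weights. The criteria, weights, and scores below are invented for illustration, not from the paper:

```python
def mcda_score(weights, scores):
    """Weighted-sum multi-criteria decision analysis: an option's benefit and
    risk criterion scores (0-1) are aggregated with stakeholder weights."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(w * s for w, s in zip(weights, scores))

# made-up criteria for an imaging device: [diagnostic benefit, image quality,
# radiation risk (reversed so higher = safer), cost burden (reversed)]
weights = [0.4, 0.3, 0.2, 0.1]
device_a = mcda_score(weights, [0.9, 0.8, 0.6, 0.5])
device_b = mcda_score(weights, [0.6, 0.9, 0.8, 0.6])
```

    The choice of weights is exactly the "whose preferences/priorities" question the record raises; different stakeholder weightings can reverse the ranking.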

  11. Malignant gliomas: current perspectives in diagnosis, treatment, and early response assessment using advanced quantitative imaging methods

    Directory of Open Access Journals (Sweden)

    Ahmed R

    2014-03-01

    Full Text Available Rafay Ahmed,1 Matthew J Oborski,2 Misun Hwang,1 Frank S Lieberman,3 James M Mountz1; 1Department of Radiology, 2Department of Bioengineering, University of Pittsburgh, Pittsburgh, PA, USA; 3Department of Neurology and Department of Medicine, Division of Hematology/Oncology, University of Pittsburgh School of Medicine, Pittsburgh, PA, USA. Abstract: Malignant gliomas consist of glioblastomas, anaplastic astrocytomas, anaplastic oligodendrogliomas and anaplastic oligoastrocytomas, and some less common tumors such as anaplastic ependymomas and anaplastic gangliogliomas. Malignant gliomas have high morbidity and mortality. Even with optimal treatment, median survival is only 12–15 months for glioblastomas and 2–5 years for anaplastic gliomas. However, recent advances in imaging and quantitative analysis of image data have led to earlier diagnosis of tumors and tumor response to therapy, providing oncologists with a greater time window for therapy management. In addition, improved understanding of tumor biology, genetics, and resistance mechanisms has enhanced surgical techniques, chemotherapy methods, and radiotherapy administration. After proper diagnosis and institution of appropriate therapy, there is now a vital need for quantitative methods that can sensitively detect malignant glioma response to therapy at early follow-up times, when changes in management of nonresponders can have their greatest effect. Currently, response is largely evaluated by measuring magnetic resonance contrast and size change, but this approach does not take into account the key biologic steps that precede tumor size reduction. Molecular imaging is ideally suited to measuring early response by quantifying cellular metabolism, proliferation, and apoptosis, activities altered early in treatment. We expect that successful integration of quantitative imaging biomarker assessment into the early phase of clinical trials could provide a novel approach for testing new therapies.

  12. A New Quantitative Method for Evaluating Dry Powder Inhalation Efficiency in Asthma Patients.

    Science.gov (United States)

    Liang, Yasha; Hu, Hefang; Tian, Cuijie; Lei, Yi; Liu, Chuntao; Luo, Fengming

    2016-10-01

    Many methods have been developed to evaluate dry powder inhalation techniques and their efficiency for disease control in asthma patients. However, it is difficult to apply these methods to clinical practice and research. In this study, we introduce a simple new method that can be applied to dry powder inhalation techniques to evaluate their efficiency in clinical practice. Twenty volunteers were recruited to evaluate the reliability of this new method. One hundred one asthma patients who met the inclusion criteria participated in this study. A dark cloth covered the outlet of the inhaler during dry powder inhalation. The image formed by the inhalation process was evaluated using analysis software and converted into integrated optical density (IOD). Inhalation techniques were scored before and after inhalation technique training, and asthma control was evaluated using the Asthma Control Questionnaire (ACQ) before inhalation technique training. The relative standard deviation of IOD ranged from 3.8% to 7.8%. In patients with or without prior inhaler use, both the IOD and inhalation technique scores improved significantly after inhalation technique training (p<0.05). This quantitative method is equivalent to traditional methods for dry powder inhalation evaluation. This study also indicated that training significantly improved the inhalation technique and efficiency in asthma patients with or without prior inhaler use.

  13. An Improved DNA Extraction Method for Efficient and Quantitative Recovery of Phytoplankton Diversity in Natural Assemblages.

    Directory of Open Access Journals (Sweden)

    Jian Yuan

    Full Text Available Marine phytoplankton are highly diverse, with different species possessing different cell coverings, posing challenges for thoroughly breaking the cells in DNA extraction yet preserving DNA integrity. While quantitative molecular techniques have been increasingly used in phytoplankton research, an effective and simple method broadly applicable to different lineages and natural assemblages is still lacking. In this study, we developed a bead-beating protocol based on our previous experience and tested it against 9 species of phytoplankton representing different lineages and different cell covering rigidities. We found the bead-beating method enhanced the final yield of DNA (up to 2-fold) in comparison with the non-bead-beating method, while also preserving the DNA integrity. When our method was applied to a field sample collected at a subtropical bay located in Xiamen, China, the resultant ITS clone library revealed a highly diverse assemblage of phytoplankton and other micro-eukaryotes, including Archaea, Amoebozoa, Chlorophyta, Ciliophora, Bacillariophyta, Dinophyta, Fungi, Metazoa, etc. The appearance of thecate dinoflagellates, thin-walled phytoplankton and "naked" unicellular organisms indicates that our method could obtain the intact DNA of organisms with different cell coverings. All the results demonstrate that our method is useful for DNA extraction of phytoplankton and environmental surveys of their diversity and abundance.

  14. An Improved DNA Extraction Method for Efficient and Quantitative Recovery of Phytoplankton Diversity in Natural Assemblages

    Science.gov (United States)

    Yuan, Jian; Li, Meizhen; Lin, Senjie

    2015-01-01

    Marine phytoplankton are highly diverse with different species possessing different cell coverings, posing challenges for thoroughly breaking the cells in DNA extraction yet preserving DNA integrity. While quantitative molecular techniques have been increasingly used in phytoplankton research, an effective and simple method broadly applicable to different lineages and natural assemblages is still lacking. In this study, we developed a bead-beating protocol based on our previous experience and tested it against 9 species of phytoplankton representing different lineages and different cell covering rigidities. We found the bead-beating method enhanced the final yield of DNA (highest as 2 folds) in comparison with the non-bead-beating method, while also preserving the DNA integrity. When our method was applied to a field sample collected at a subtropical bay located in Xiamen, China, the resultant ITS clone library revealed a highly diverse assemblage of phytoplankton and other micro-eukaryotes, including Archaea, Amoebozoa, Chlorophyta, Ciliphora, Bacillariophyta, Dinophyta, Fungi, Metazoa, etc. The appearance of thecate dinoflagellates, thin-walled phytoplankton and “naked” unicellular organisms indicates that our method could obtain the intact DNA of organisms with different cell coverings. All the results demonstrate that our method is useful for DNA extraction of phytoplankton and environmental surveys of their diversity and abundance. PMID:26218575

  15. Rapid quantitative analysis of lipids using a colorimetric method in a microplate format.

    Science.gov (United States)

    Cheng, Yu-Shen; Zheng, Yi; VanderGheynst, Jean S

    2011-01-01

    A colorimetric sulfo-phospho-vanillin (SPV) method was developed for high throughput analysis of total lipids. The developed method uses a reaction mixture that is maintained in a 96-well microplate throughout the entire assay. The new assay provides the following advantages over other methods of lipid measurement: (1) background absorbance can be easily corrected for each well, (2) there is less risk of handling and transferring sulfuric acid contained in reaction mixtures, (3) color develops more consistently providing more accurate measurement of absorbance, and (4) the assay can be used for quantitative measurement of lipids extracted from a wide variety of sources. Unlike other spectrophotometric approaches that use fluorescent dyes, the optimal spectra and reaction conditions for the developed assay do not vary with the sample source. The developed method was used to measure lipids in extracts from four strains of microalgae. No significant difference was found in lipid determination when lipid content was measured using the new method and compared to results obtained using a macro-gravimetric method.
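
The per-well background correction and calibration that the microplate format enables (point 1 in the abstract) can be sketched as follows. All absorbance values and lipid masses below are hypothetical, not taken from the study:

```python
import numpy as np

# Hypothetical SPV assay readings, all taken in the same 96-well plate.
# Background absorbance is read per well before color development and
# subtracted, then a linear calibration is fitted against lipid standards.
standards_ug = np.array([0.0, 25.0, 50.0, 100.0, 200.0])   # lipid per well
raw_abs      = np.array([0.08, 0.21, 0.33, 0.58, 1.07])
background   = np.array([0.05, 0.05, 0.06, 0.05, 0.06])

slope, intercept = np.polyfit(standards_ug, raw_abs - background, 1)

def lipid_ug(a_raw, a_bg):
    """Convert a background-corrected well absorbance to lipid mass (ug)."""
    return ((a_raw - a_bg) - intercept) / slope

print(f"unknown well: ~{lipid_ug(0.45, 0.05):.0f} ug lipid")
```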

  16. New Spectrophotometric Methods for Quantitative Determination of 7-ADCA in Pharmaceutical Formulations

    Directory of Open Access Journals (Sweden)

    Medikondu Kishore,

    2010-09-01

    Full Text Available Three simple, sensitive and accurate methods are described for the determination of 7-aminodeacetoxycephalosporanic acid (7-ADCA) in bulk drug and in formulations. Methods Ma to Mc are based on ion-association complex formation between 7-ADCA and NQS (Ma), vanillin (Mb) and ninhydrin (Mc) solutions. The chromogen, being extractable with chloroform, could be measured quantitatively at 480 nm (Ma) and 560 nm (Mb and Mc). All variables were studied to optimize the reaction conditions. Regression analysis of the Beer's law plots showed good correlation in the concentration ranges 4-24 µg/mL for Ma, 0.4-2.4 µg/mL for Mb and 0.5-3.0 µg/mL for Mc. The calculated molar absorptivity values are 5.945 x 10^3, 1.722 x 10^5, and 6.701 x 10^4 L/mol/cm for Ma to Mc, respectively. The methods were successfully applied to the determination of 7-ADCA in formulations and the results tallied well with the label claim. The results were statistically compared with those of a literature method by applying the Student’s t-test and F-test. No interference was observed from the concomitant substances normally added to preparations. The accuracy and validity of the methods were further ascertained by performing recovery experiments via the standard-addition method.
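
The molar absorptivity values quoted follow from the Beer–Lambert law, ε = A/(c·l), once the mass concentration is converted to molarity. A sketch with an illustrative absorbance reading; the ~214.2 g/mol molar mass of 7-ADCA and the example values are assumptions, not figures from the paper:

```python
# Beer–Lambert bookkeeping behind a reported molar absorptivity:
# epsilon = A / (c * l), with c converted from ug/mL to mol/L.
M_7ADCA = 214.2          # g/mol (assumed)
PATH_CM = 1.0            # standard 1 cm path length

def molar_absorptivity(absorbance, conc_ug_per_ml):
    conc_mol_per_l = conc_ug_per_ml * 1e-6 * 1000.0 / M_7ADCA  # ug/mL -> mol/L
    return absorbance / (conc_mol_per_l * PATH_CM)

# An absorbance of 0.50 at 18 ug/mL gives epsilon on the order of 10^3
# L/mol/cm, the magnitude reported for method Ma.
print(f"epsilon ~ {molar_absorptivity(0.50, 18.0):.3g} L/mol/cm")
```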

  17. How to use linear regression and correlation in quantitative method comparison studies.

    Science.gov (United States)

    Twomey, P J; Kroll, M H

    2008-04-01

    Linear regression methods try to determine the best linear relationship between data points, while correlation coefficients assess the association (as opposed to agreement) between the two methods. Linear regression and correlation play an important part in the interpretation of quantitative method comparison studies. Their major strength is that they are widely known and, as a result, both are employed in the vast majority of method comparison studies. While previously performed by hand, the availability of statistical packages means that regression analysis is usually performed by software packages including MS Excel, with or without the software program Analyze-it, as well as by other software packages. Such techniques need to be employed in a way that compares the agreement between the two methods examined and, more importantly, because we are dealing with individual patients, whether the degree of agreement is clinically acceptable. Despite their use for many years, there is considerable ignorance about the validity as well as the pros and cons of linear regression and correlation techniques. This review article describes the types of linear regression and correlation (parametric and non-parametric methods) and the necessary general and specific requirements. The selection of the type of regression depends on where one has been trained, the tradition of the laboratory and the availability of adequate software.
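
The distinction the abstract draws, that correlation measures association while agreement is judged from the fitted line, can be illustrated with a minimal sketch on hypothetical paired patient results:

```python
import numpy as np

# Hypothetical paired results: the same analyte measured in 8 patients
# by two methods (units arbitrary).
x = np.array([2.1, 3.4, 4.0, 5.2, 6.8, 7.5, 9.1, 10.3])   # method A
y = np.array([2.3, 3.3, 4.4, 5.0, 7.1, 7.9, 9.0, 10.8])   # method B

slope, intercept = np.polyfit(x, y, 1)      # ordinary least squares
r = np.corrcoef(x, y)[0, 1]                 # Pearson correlation

# r close to 1 shows association only; agreement is judged from the line
# (slope near 1, intercept near 0) and from whether the deviations are
# clinically acceptable for individual patients.
print(f"y = {slope:.3f}x + {intercept:.3f}, r = {r:.4f}")
```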

  18. Wavelength Selection Method Based on Differential Evolution for Precise Quantitative Analysis Using Terahertz Time-Domain Spectroscopy.

    Science.gov (United States)

    Li, Zhi; Chen, Weidong; Lian, Feiyu; Ge, Hongyi; Guan, Aihong

    2017-01-01

    Quantitative analysis of component mixtures is an important application of terahertz time-domain spectroscopy (THz-TDS) and has attracted broad interest in recent research. Although the accuracy of quantitative analysis using THz-TDS is affected by a host of factors, wavelength selection from the sample's THz absorption spectrum is the most crucial component. The raw spectrum consists of signals from the sample and scattering and other random disturbances that can critically influence the quantitative accuracy. For precise quantitative analysis using THz-TDS, the signal from the sample needs to be retained while the scattering and other noise sources are eliminated. In this paper, a novel wavelength selection method based on differential evolution (DE) is investigated. By performing quantitative experiments on a series of binary amino acid mixtures using THz-TDS, we demonstrate the efficacy of the DE-based wavelength selection method, which yields an error rate below 5%.
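
As a sketch of the idea (a toy re-implementation, not the authors' method), a minimal DE/rand/1/bin loop can evolve a wavelength mask that discards noise-dominated channels in a synthetic two-component Beer's-law mixture. All spectra, noise levels and DE parameters below are fabricated for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n_wl = 40
wl = np.linspace(0.0, 1.0, n_wl)

# Pure-component spectra (Gaussian bands) of a made-up binary mixture.
s1 = np.exp(-((wl - 0.3) / 0.08) ** 2)
s2 = np.exp(-((wl - 0.7) / 0.08) ** 2)
S = np.vstack([s1, s2]).T                        # shape (n_wl, 2)

# Twelve calibration mixtures; the last 10 channels get heavy scattering noise.
C = rng.uniform(0.2, 1.0, size=(12, 2))          # true concentrations
A = C @ S.T
A[:, 30:] += rng.normal(0.0, 0.5, size=(12, 10))

def fitness(genome):
    """Mean absolute concentration error using only the selected channels."""
    mask = genome > 0.5
    if mask.sum() < 2:
        return 1e6                               # need at least 2 channels
    C_hat, *_ = np.linalg.lstsq(S[mask], A[:, mask].T, rcond=None)
    return float(np.abs(C_hat.T - C).mean())

# Minimal DE/rand/1/bin over a continuous genome thresholded into a mask.
pop = rng.uniform(0.0, 1.0, size=(30, n_wl))
cost = np.array([fitness(p) for p in pop])
F, CR = 0.6, 0.9
for _ in range(60):
    for i in range(len(pop)):
        a, b, c = pop[rng.choice(len(pop), 3, replace=False)]
        mutant = np.clip(a + F * (b - c), 0.0, 1.0)
        trial = np.where(rng.uniform(size=n_wl) < CR, mutant, pop[i])
        tc = fitness(trial)
        if tc < cost[i]:
            pop[i], cost[i] = trial, tc

best = pop[np.argmin(cost)]
print("noisy channels kept:", int((best[30:] > 0.5).sum()), "of 10")
print("best concentration error:", round(float(cost.min()), 4))
```

The selection pressure comes entirely from the fitness function: masks that retain scattering-corrupted channels predict concentrations worse, so DE drives them out.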

  19. Investigation of the quantitative accuracy of 3D iterative reconstruction algorithms in comparison to filtered back projection method: a phantom study

    Science.gov (United States)

    Abuhadi, Nouf; Bradley, David; Katarey, Dev; Podolyak, Zsolt; Sassi, Salem

    2014-03-01

    Introduction: Single-Photon Emission Computed Tomography (SPECT) is used to measure and quantify radiopharmaceutical distribution within the body. The accuracy of quantification depends on acquisition parameters and reconstruction algorithms. Until recently, most SPECT images were constructed using filtered back projection techniques with no attenuation or scatter corrections. The introduction of 3-D iterative reconstruction algorithms, together with the availability of both computed tomography (CT)-based attenuation correction and scatter correction, may provide for more accurate measurement of radiotracer bio-distribution. The effect of attenuation and scatter corrections on the accuracy of SPECT measurements is well researched. It has been suggested that the combination of CT-based attenuation correction and scatter correction can allow for more accurate quantification of radiopharmaceutical distribution in SPECT studies (Bushberg et al., 2012). However, the effect of respiratory-induced cardiac motion on SPECT images acquired using higher-resolution algorithms such as 3-D iterative reconstruction with attenuation and scatter corrections has not been investigated. Aims: To investigate the quantitative accuracy of 3D iterative reconstruction algorithms in comparison to filtered back projection (FBP) methods implemented on cardiac SPECT/CT imaging with and without CT-based attenuation and scatter corrections; to investigate the effects of respiratory-induced cardiac motion on myocardial perfusion quantification; and to present a comparison of spatial resolution for FBP and ordered subset expectation maximization (OSEM) Flash 3D, with and without respiratory-induced motion, and with and without attenuation and scatter correction. Methods: This study was performed on a Siemens Symbia T16 SPECT/CT system using clinical acquisition protocols. Respiratory-induced cardiac motion was simulated by imaging a cardiac phantom insert whilst moving it using a respiratory motion motor.

  20. Wholly Endoscopic Permeatal Removal of a Petrous Apex Cholesteatoma

    Directory of Open Access Journals (Sweden)

    Todd Kanzara

    2014-01-01

    Full Text Available We report a case of a petrous apex cholesteatoma which was managed with a wholly endoscopic permeatal approach. A 63-year-old Caucasian male presented with a 10-year history of right-sided facial palsy and profound deafness. On examination in our clinic, the patient had a grade VI House-Brackmann paresis, otoscopic evidence of attic cholesteatoma behind an intact drum, and extensive scarring of the face from previous facial reanimation surgery. Imaging review was suggestive of petrous apex cholesteatoma. An initial decision to manage the patient conservatively was later reviewed on account of the patient suffering recurrent epileptic seizures. A wholly endoscopic permeatal approach was used with successful outcomes. In addition to the case report we also provide a brief description of the technique and a review of the relevant literature.

  1. APEX 1 mm line survey of the Orion Bar

    CERN Document Server

    Leurini, S; Thorwirth, S; Parise, B; Schilke, P; Comito, C; Wyrowski, F; Güsten, R; Bergman, P; Menten, K M; Nyman, L A A

    2006-01-01

    Unbiased molecular line surveys are a powerful tool for analyzing the physical and chemical parameters of astronomical objects and are the only means for obtaining a complete view of the molecular inventory of a given source. The present work represents the first such investigation of a photon-dominated region. The first results of an ongoing millimeter-wave survey obtained towards the Orion Bar are reported. The APEX telescope in combination with the APEX-2A facility receiver was employed in this investigation. We derived the physical parameters of the gas through LVG analyses of the methanol and formaldehyde data. Information on the sulfur and deuterium chemistry of photon-dominated regions is obtained from detections of several sulfur-bearing molecules and DCN.

  2. The Doppler paradigm and the APEX-EPOS-ORANGE quandary

    CERN Document Server

    Griffin, J J

    1996-01-01

    The experimental detection of the sharp lines of the e+e− puzzle is viewed as a struggle against Doppler broadening. Gedanken experiments which are realistic in zeroth order of detail are analyzed to show that the ORANGE and EPOS/I geometries select narrower slices of a Doppler-broadened line than spherically inclusive (APEX- and EPOS/II-like) apparatuses. Roughly speaking, the latter require event-by-event Doppler reconstruction simply to regain an even footing with the former. This suggests that APEX's or EPOS/II's coincident pair distributions must be statistically superior to those of EPOS/I or ORANGE in order to support a comparable inference about sharp structure. Under present circumstances, independent alternative data is invaluable. Therefore, a corroboration of Sakai's 330.1 keV (< 3 keV wide) electron line in few-MeV e^+ or e^- bombardments of U and Th targets could prove crucial.

  3. Quantitative GSL-glycome analysis of human whole serum based on an EGCase digestion and glycoblotting method

    Science.gov (United States)

    Furukawa, Jun-ichi; Sakai, Shota; Yokota, Ikuko; Okada, Kazue; Hanamatsu, Hisatoshi; Kobayashi, Takashi; Yoshida, Yasunobu; Higashino, Kenichi; Tamura, Tomohiro; Igarashi, Yasuyuki; Shinohara, Yasuro

    2015-01-01

    Glycosphingolipids (GSLs) are lipid molecules linked to carbohydrate units that form the plasma membrane lipid raft, which is clustered with sphingolipids, sterols, and specific proteins, and thereby contributes to membrane physical properties and specific recognition sites for various biological events. These bioactive GSL molecules consequently affect the pathophysiology and pathogenesis of various diseases. Thus, altered expression of GSLs in various diseases may be of importance for disease-related biomarker discovery. However, analysis of GSLs in blood is particularly challenging because GSLs are present at extremely low concentrations in serum/plasma. In this study, we established absolute GSL-glycan analysis of human serum based on endoglycoceramidase digestion and glycoblotting purification. We established two sample preparation protocols, one with and the other without GSL extraction using chloroform/methanol. Similar amounts of GSL-glycans were recovered with the two protocols. Both protocols permitted absolute quantitation of GSL-glycans using as little as 20 μl of serum. Using 10 healthy human serum samples, up to 42 signals corresponding to GSL-glycan compositions could be quantitatively detected, and the total serum GSL-glycan concentration was calculated to be 12.1–21.4 μM. We further applied this method to TLC-prefractionated serum samples. These findings will assist the discovery of disease-related biomarkers by serum GSL-glycomics. PMID:26420879

  4. A Dilute-and-Shoot LC-MS Method for Quantitating Opioids in Oral Fluid.

    Science.gov (United States)

    Enders, Jeffrey R; McIntire, Gregory L

    2015-10-01

    Opioid testing represents a dominant share of the market in pain management clinical testing facilities. Testing of this drug class in oral fluid (OF) has begun to rise in popularity. OF analysis has traditionally required extensive clean-up protocols and sample concentration, which can be avoided. This work highlights the use of a fast, 'dilute-and-shoot' method that performs no considerable sample manipulation. A quantitative method for the determination of eight common opioids and associated metabolites (codeine, morphine, hydrocodone, hydromorphone, norhydrocodone, oxycodone, noroxycodone and oxymorphone) in OF is described herein. OF sample is diluted 10-fold in methanol/water and then analyzed using an Agilent chromatographic stack coupled with an AB SCIEX 4500. The method has a 2.2-min LC gradient and a cycle time of 2.9 min. In contrast to most published methods of this particular type, this method uses no sample clean-up or concentration and has a considerably faster LC gradient, making it ideal for very high-throughput laboratories. Importantly, the method requires only 100 μL of sample and is diluted 10-fold prior to injection to help with instrument viability. Baseline separation of all isobaric opioids listed above was achieved on a phenyl-hexyl column. The validated calibration range for this method is 2.5-1,000 ng/mL. This 'dilute-and-shoot' method removes the unnecessary, costly and time-consuming extraction steps found in traditional methods and still surpasses all analytical requirements. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
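
The back-calculation implied by the 10-fold dilution is simple bookkeeping. A sketch, assuming the instrument reports the concentration in the diluted vial and that the validated 2.5–1,000 ng/mL range refers to neat oral fluid (conventions that vary between laboratories):

```python
DILUTION_FACTOR = 10               # OF diluted 10-fold in methanol/water
CAL_LOW, CAL_HIGH = 2.5, 1000.0    # validated range, ng/mL (assumed neat OF)

def neat_concentration(vial_ng_ml):
    """Back-calculate the neat oral-fluid concentration from the reading
    on the diluted injection, and flag whether the result falls inside
    the validated calibration range."""
    neat = vial_ng_ml * DILUTION_FACTOR
    return neat, CAL_LOW <= neat <= CAL_HIGH

print(neat_concentration(50.0))
```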

  5. Mucormycosis with Orbital Apex Syndrome in a Renal Transplant Recipient

    Directory of Open Access Journals (Sweden)

    Ebru Kursun

    2015-06-01

    Full Text Available Mucormycosis is a rarely encountered invasive fungal infection with high mortality. Solid organ transplantation is one of the risk factors for mucormycosis. Mucormycosis can be classified into six different groups according to anatomical localization: rhinocerebral, pulmonary, cutaneous, gastrointestinal, disseminated, and other less common involvements. This paper presents a mucormycosis case with rhinoorbitocerebral involvement in a renal transplant recipient, which manifested with orbital apex syndrome. [Cukurova Med J 2015; 40(2): 384-389]

  6. IMPROVED RP-HPLC METHOD FOR QUANTITATIVE ESTIMATION OF STEVIOSIDE IN STEVIA REBAUDIANA BERTONI BURM

    Directory of Open Access Journals (Sweden)

    Shankar Katekhaye

    2011-01-01

    Full Text Available An RP-HPLC method with UV array detection was established for the determination of stevioside, an extract of the herbal S. rebaudiana plant. The stevioside was separated using an isocratic solvent system consisting of methanol and 0.1% orthophosphoric acid (v/v) in water (70:30) at a flow rate of 1.0 ml/min and a detection wavelength of 219 nm. The method was validated for linearity, precision, accuracy, limit of detection (LOD), and limit of quantitation (LOQ). The linearity of the proposed method was obtained in the range of 5.0-75 μg/ml with a regression coefficient of 0.9999. Intraday and interday precision studies showed a relative standard deviation of less than 2.5%. The accuracy of the proposed method was determined by a recovery study conducted at 3 different levels; the average recovery was 97-99%. The LOD and LOQ were 0.02 and 0.05 µg/ml, respectively. The content of stevioside obtained in the dried leaf powder was within the ranges of 6.83-7.91% and 1.7-2.9% w/w, respectively. The proposed method is simple, sensitive, yet reproducible. It is therefore suitable for routine analysis of stevioside in S. rebaudiana Bertoni.

  7. Quantitative measurement of ultrasound pressure field by optical phase contrast method and acoustic holography

    Science.gov (United States)

    Oyama, Seiji; Yasuda, Jun; Hanayama, Hiroki; Yoshizawa, Shin; Umemura, Shin-ichiro

    2016-07-01

    A fast and accurate measurement of an ultrasound field with various exposure sequences is necessary to ensure the efficacy and safety of various ultrasound applications in medicine. The most common method used to measure an ultrasound pressure field, that is, hydrophone scanning, requires a long scanning time and potentially disturbs the field. This may limit the efficiency of developing applications of ultrasound. In this study, an optical phase contrast method enabling fast and noninterfering measurements is proposed. In this method, the modulated phase of light caused by the focused ultrasound pressure field is measured. Then, a computed tomography (CT) algorithm used to quantitatively reconstruct a three-dimensional (3D) pressure field is applied. For a high-intensity focused ultrasound field, a new approach that combines the optical phase contrast method and acoustic holography was attempted. First, the optical measurement of focused ultrasound was rapidly performed over the field near a transducer. Second, the nonlinear propagation of the measured ultrasound was simulated. The result of the new approach agreed well with that of the measurement using a hydrophone and was improved from that of the phase contrast method alone with phase unwrapping.

  8. Worked examples of alternative methods for the synthesis of qualitative and quantitative research in systematic reviews

    Science.gov (United States)

    Lucas, Patricia J; Baird, Janis; Arai, Lisa; Law, Catherine; Roberts, Helen M

    2007-01-01

    Background The inclusion of qualitative studies in systematic reviews poses methodological challenges. This paper presents worked examples of two methods of data synthesis (textual narrative and thematic), used in relation to one review, with the aim of enabling researchers to consider the strength of different approaches. Methods A systematic review of lay perspectives of infant size and growth was conducted, locating 19 studies (including both qualitative and quantitative). The data extracted from these were synthesised using both a textual narrative and a thematic synthesis. Results The processes of both methods are presented, showing a stepwise progression to the final synthesis. Both methods led us to similar conclusions about lay views toward infant size and growth. Differences between methods lie in the way they dealt with study quality and heterogeneity. Conclusion On the basis of the work reported here, we consider textual narrative and thematic synthesis have strengths and weaknesses in relation to different research questions. Thematic synthesis holds most potential for hypothesis generation, but may obscure heterogeneity and quality appraisal. Textual narrative synthesis is better able to describe the scope of existing research and account for the strength of evidence, but is less good at identifying commonality. PMID:17224044

  9. A small-scale method for quantitation of carotenoids in bacteria and yeasts.

    Science.gov (United States)

    Kaiser, Philipp; Surmann, Peter; Vallentin, Gerald; Fuhrmann, Herbert

    2007-07-01

    Microbial carotenoids are difficult to extract because of their embedding in a compact matrix and their prominent sensitivity to degradation. Especially for carotenoid analysis of bacteria and yeasts, there is a lack of information about the capability, precision and recovery of the methods used. Accordingly, we investigated the feasibility, throughput and validity of a new small-scale method using Micrococcus luteus and Rhodotorula glutinis for testing purposes. For disintegration and extraction, we combined primarily mild techniques: enzymatically, we used combinations of lysozyme and lipase for bacteria as well as lyticase and lipase for yeasts. Additional mechanical treatment included sonication and freeze-thawing cycles. Chemical treatment with dimethylsulfoxide was applied for yeasts only. For extraction we used a methanol-chloroform mixture stabilized efficiently with butylated hydroxytoluene and alpha-tocopherol. Separation of compounds was achieved with HPLC, applying a binary methanol/tert-butyl methyl ether gradient on a polymeric reversed-phase C30 column. Substances of interest were detected and identified using a photodiode array (PDA) detector, and carotenoids were quantitated as all-trans-beta-carotene equivalents. For evaluation of the recovery and reproducibility of the extraction method, we used beta-8'-apo-carotenal as an internal standard. The method provides a sensitive tool for the determination of carotenoids from bacteria and yeasts and also for small changes in the carotenoid spectrum of a single species. Corequisite large experiments are facilitated by the high throughput of the method.
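
The internal-standard bookkeeping behind the beta-8'-apo-carotenal standard mentioned above reduces to two ratios. Peak areas, amounts and the response factor in this sketch are hypothetical:

```python
def is_recovery_percent(found_is_ug, added_is_ug):
    """Extraction recovery estimated from the internal standard."""
    return found_is_ug / added_is_ug * 100.0

def carotenoid_ug(area_analyte, area_is, added_is_ug, response_factor=1.0):
    """Analyte amount (as all-trans-beta-carotene equivalents) by the
    internal-standard ratio method (response factor assumed)."""
    return area_analyte / area_is * added_is_ug * response_factor

# Hypothetical HPLC-PDA peak areas and IS amounts:
print(carotenoid_ug(45000.0, 30000.0, 2.0))   # analyte amount, ug
print(is_recovery_percent(1.8, 2.0))          # extraction recovery, %
```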

  10. Validation procedures for quantitative gluten ELISA methods: AOAC allergen community guidance and best practices.

    Science.gov (United States)

    Koerner, Terry B; Abbott, Michael; Godefroy, Samuel Benrejeb; Popping, Bert; Yeung, Jupiter M; Diaz-Amigo, Carmen; Roberts, James; Taylor, Steve L; Baumert, Joseph L; Ulberth, Franz; Wehling, Paul; Koehler, Peter

    2013-01-01

    The food allergen analytical community is endeavoring to create harmonized guidelines for the validation of food allergen ELISA methodologies to help protect food-sensitive individuals and promote consumer confidence. This document provides additional guidance to existing method validation publications for quantitative food allergen ELISA methods. The gluten-specific criteria provided in this document are divided into sections for information required by the method developer about the assay and information for the implementation of the multilaboratory validation study. Many of these recommendations and guidance are built upon the widely accepted Codex Alimentarius definitions and recommendations for gluten-free foods. The information in this document can be used as the basis of a harmonized validation protocol for any ELISA method for gluten, whether proprietary or nonproprietary, that will be submitted to AOAC and/or regulatory authorities or other bodies for status recognition. Future work is planned for the implementation of this guidance document for the validation of gluten methods and the creation of gluten reference materials.

  11. Simultaneous quantitative determination of paracetamol and tramadol in tablet formulation using UV spectrophotometry and chemometric methods

    Science.gov (United States)

    Glavanović, Siniša; Glavanović, Marija; Tomišić, Vladislav

    2016-03-01

    The UV spectrophotometric methods for simultaneous quantitative determination of paracetamol and tramadol in paracetamol-tramadol tablets were developed. The spectrophotometric data obtained were processed by means of partial least squares (PLS) and genetic algorithm coupled with PLS (GA-PLS) methods in order to determine the content of active substances in the tablets. The results gained by chemometric processing of the spectroscopic data were statistically compared with those obtained by means of validated ultra-high performance liquid chromatographic (UHPLC) method. The accuracy and precision of data obtained by the developed chemometric models were verified by analysing the synthetic mixture of drugs, and by calculating recovery as well as relative standard error (RSE). A statistically good agreement was found between the amounts of paracetamol determined using PLS and GA-PLS algorithms, and that obtained by UHPLC analysis, whereas for tramadol GA-PLS results were proven to be more reliable compared to those of PLS. The simplest and the most accurate and precise models were constructed by using the PLS method for paracetamol (mean recovery 99.5%, RSE 0.89%) and the GA-PLS method for tramadol (mean recovery 99.4%, RSE 1.69%).
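
The figures of merit reported (mean recovery and relative standard error) reduce to simple formulas. A sketch on hypothetical spiked-sample data, using one common definition of RSE:

```python
import numpy as np

# Hypothetical spiked-sample results (amounts in ug): amount added to the
# synthetic mixture vs amount found by the calibration model.
added = np.array([100.0, 200.0, 300.0, 400.0, 500.0])
found = np.array([ 99.1, 198.7, 299.5, 396.0, 499.0])

recovery = float((found / added * 100.0).mean())          # mean recovery, %
rse = float(np.sqrt(((found - added) ** 2).sum()
                    / (added ** 2).sum()) * 100.0)        # one common RSE form

print(f"mean recovery = {recovery:.1f}%, RSE = {rse:.2f}%")
```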

  12. The use of electromagnetic induction methods for establishing quantitative permafrost models in West Greenland

    Science.gov (United States)

    Ingeman-Nielsen, Thomas; Brandt, Inooraq

    2010-05-01

    The sedimentary settings at West Greenlandic town and infrastructural development sites are dominated by fine-grained marine deposits of late- to post-glacial origin. Prior to permafrost formation, these materials were leached by percolating precipitation, resulting in depletion of salts. Present-day permafrost in these deposits is therefore very ice-rich, with ice contents approaching 50-70% vol. in some areas. Such formations are of great concern in building and construction projects in Greenland, as they lose strength and bearing capacity upon thaw. It is therefore of both technical and economic interest to develop methods to precisely investigate and determine parameters such as ice content and depth to bedrock in these areas. In terms of geophysical methods for near-surface investigations, traditional methods such as Electrical Resistivity Tomography (ERT) and Refraction Seismics (RS) have generally been applied with success. The georadar method usually fails due to very limited penetration depth in the fine-grained materials, and Electromagnetic Induction (EMI) methods are seldom applicable for quantitative interpretation due to the very high resistivities causing low induced currents and thus small secondary fields. Nevertheless, in some areas of Greenland the marine sequence was exposed relatively late, and as a result the sediments may not be completely leached of salts. In such cases, layers with pore water salinity approaching that of sea water may be present below an upper layer of very ice-rich permafrost. The saline pore water causes a freezing-point depression which results in technically unfrozen sediments at permafrost temperatures around -3 °C. Traditional ERT and VES measurements are severely affected by equivalency problems in these settings, practically prohibiting reasonable quantitative interpretation without constraining information. Such prior information may be obtained, of course, from boreholes, but equipment capable of drilling

  13. Bifid cardiac apex in a 25-year-old male with sudden cardiac death.

    Science.gov (United States)

    Wu, Annie; Kay, Deborah; Fishbein, Michael C

    2014-01-01

    Although a bifid cardiac apex is common in certain marine animals, it is an uncommon finding in humans. When present, bifid cardiac apex is usually associated with other congenital heart anomalies. We present a case of bifid cardiac apex that was an incidental finding in a 25-year-old male with sudden cardiac death from combined drug toxicity. On gross examination, there was a bifid cardiac apex with a 2-cm long cleft. There were no other significant gross or microscopic abnormalities. This case represents the very rare occurrence of a bifid cardiac apex as an isolated cardiac anomaly.

  14. DEVELOPMENT OF THE TECHNIQUE OF QUANTITATIVE DEFINITION OF TRIAZAVIRIN IN WATER SOLUTIONS WITH USE OF THE SPECTROPHOTOMETRY METHOD

    OpenAIRE

    M. Yu. Kinev; O. A. Melnikova; A. Yu. Petro; D. V. Zaboyarkina

    2014-01-01

    The authors developed procedures for the quantitative determination of Triazavirin in aqueous solutions by spectrophotometry. Three variants of the spectrophotometric method were used: direct spectrophotometry, spectrophotometry according to A.M. Firordt, and spectrophotometry against a standard solution. Metrological characteristics were calculated for all procedures. The developed quantitative procedures are applicable for practical use in the...

  15. A quantitative method for zoning of protected areas and its spatial ecological implications.

    Science.gov (United States)

    Del Carmen Sabatini, María; Verdiell, Adriana; Rodríguez Iglesias, Ricardo M; Vidal, Marta

    2007-04-01

    Zoning is a key prescriptive tool for administration and management of protected areas. However, the lack of zoning is common for most protected areas in developing countries and, as a consequence, many protected areas are not effective in achieving the goals for which they were created. In this work, we introduce a quantitative method to expeditiously zone protected areas and we evaluate its ecological implications on hypothetical zoning cases. A real-world application is reported for the Talampaya National Park, a UNESCO World Heritage Site located in Argentina. Our method is a modification of the zoning forest model developed by Bos [Bos, J., 1993. Zoning in forest management: a quadratic assignment problem solved by simulated annealing. Journal of Environmental Management 37, 127-145.]. Main innovations involve a quadratic function of distance between land units, non-reciprocal weights for adjacent land uses (mathematically represented by a non-symmetric matrix), and the possibility of imposing a connectivity constraint. Due to its intrinsic spatial dimension, the zoning problem belongs to the NP-hard class, i.e. a solution can only be obtained in non-polynomial time [Nemhauser, G., Wolsey, L., 1988. Integer and Combinatorial Optimization. John Wiley, New York.]. For that purpose, we applied a simulated annealing heuristic implemented as a FORTRAN language routine. Our innovations were effective in achieving zoning designs more compatible with biological diversity protection. The quadratic distance term facilitated the delineation of core zones for elements of significance; the connectivity constraint minimized fragmentation; non-reciprocal land use weightings contributed to better representing management decisions, and influenced mainly the edge and shape of zones. This quantitative method can assist the zoning process within protected areas by offering many zonation scheme alternatives with minimum cost, time and effort. This ability provides a new tool to
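    The simulated annealing heuristic at the core of such a zoning model can be sketched as follows: assign one of a few uses to each land unit under fixed area quotas, and minimize an incompatibility cost that decays with the squared distance between units, using a deliberately non-symmetric weight matrix as in the abstract. All numbers (grid size, weights, cooling schedule) are illustrative, not the authors' FORTRAN routine:

```python
import numpy as np

rng = np.random.default_rng(0)

N, USES = 6, 3                       # 6x6 grid of land units, 3 candidate uses
W = np.array([[0.0, 1.0, 4.0],       # W[a, b]: penalty for use b lying near use a
              [0.5, 0.0, 1.0],       # (non-symmetric on purpose)
              [3.0, 1.0, 0.0]])
xy = np.array([(i, j) for i in range(N) for j in range(N)], dtype=float)
d2 = ((xy[:, None, :] - xy[None, :, :]) ** 2).sum(-1)
prox = 1.0 / (1.0 + d2)              # closeness weight, quadratic in distance
np.fill_diagonal(prox, 0.0)

def cost(z):
    # total incompatibility: incompatible uses close together cost the most
    return (W[z[:, None], z[None, :]] * prox).sum()

z = np.repeat(np.arange(USES), N * N // USES)   # equal area quota per use
rng.shuffle(z)
initial_cost = best = cur = cost(z)
temp = 1.0
for _ in range(20000):
    i, j = rng.choice(N * N, size=2, replace=False)
    z[i], z[j] = z[j], z[i]          # swapping two units keeps the quotas fixed
    new = cost(z)
    if new <= cur or rng.random() < np.exp((cur - new) / temp):
        cur = new
        best = min(best, cur)
    else:
        z[i], z[j] = z[j], z[i]      # undo the rejected swap
    temp *= 0.9995                   # geometric cooling
print(initial_cost, best)
```

    Swapping pairs of units, rather than relabeling single units, is what enforces the fixed area allocation per use throughout the search.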

  16. A Novel Method of Quantitative Anterior Chamber Depth Estimation Using Temporal Perpendicular Digital Photography

    Science.gov (United States)

    Zamir, Ehud; Kong, George Y.X.; Kowalski, Tanya; Coote, Michael; Ang, Ghee Soon

    2016-01-01

    Purpose We hypothesize that: (1) Anterior chamber depth (ACD) is correlated with the relative anteroposterior position of the pupillary image, as viewed from the temporal side. (2) Such a correlation may be used as a simple quantitative tool for estimation of ACD. Methods Two hundred sixty-six phakic eyes had lateral digital photographs taken from the temporal side, perpendicular to the visual axis, and underwent optical biometry (Nidek AL scanner). The relative anteroposterior position of the pupillary image was expressed using the ratio between: (1) lateral photographic temporal limbus to pupil distance (“E”) and (2) lateral photographic temporal limbus to cornea distance (“Z”). In the first chronological half of patients (Correlation Series), E:Z ratio (EZR) was correlated with optical biometric ACD. The correlation equation was then used to predict ACD in the second half of patients (Prediction Series) and compared to their biometric ACD for agreement analysis. Results A strong linear correlation was found between EZR and ACD, R = −0.91, R2 = 0.81. Bland-Altman analysis showed good agreement between predicted ACD using this method and the optical biometric ACD. The mean error was −0.013 mm (range −0.377 to 0.336 mm), standard deviation 0.166 mm. The 95% limits of agreement were ±0.33 mm. Conclusions Lateral digital photography and EZR calculation is a novel method to quantitatively estimate ACD, requiring minimal equipment and training. Translational Relevance EZ ratio may be employed in screening for angle closure glaucoma. It may also be helpful in outpatient medical clinic settings, where doctors need to judge the safety of topical or systemic pupil-dilating medications versus their risk of triggering acute angle closure glaucoma. Similarly, non ophthalmologists may use it to estimate the likelihood of acute angle closure glaucoma in emergency presentations. PMID:27540496
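    The two-stage design (fit a linear model on a correlation series, then test agreement on a prediction series) is easy to mirror on synthetic numbers. The sketch below invents EZR and ACD values with the reported negative correlation; it is not the study's dataset:

```python
import numpy as np

rng = np.random.default_rng(1)

n = 266
ezr = rng.uniform(0.3, 0.7, n)                   # synthetic E:Z ratios
acd = 4.2 - 3.0 * ezr + rng.normal(0, 0.15, n)   # negative correlation, as reported

# first half: "Correlation Series" used to fit the linear model
fit_ezr, fit_acd = ezr[:133], acd[:133]
slope, intercept = np.polyfit(fit_ezr, fit_acd, 1)

# second half: "Prediction Series" used for agreement analysis
pred = slope * ezr[133:] + intercept
err = pred - acd[133:]
mean_err = err.mean()
loa = 1.96 * err.std(ddof=1)                     # 95% limits of agreement
print(round(mean_err, 3), round(loa, 3))
```

    On real data the limits of agreement quantify how far a photographically predicted ACD may stray from the biometric one, which is the clinically relevant question for screening.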

  17. A quantitative method for risk assessment of agriculture due to climate change

    Science.gov (United States)

    Dong, Zhiqiang; Pan, Zhihua; An, Pingli; Zhang, Jingting; Zhang, Jun; Pan, Yuying; Huang, Lei; Zhao, Hui; Han, Guolin; Wu, Dong; Wang, Jialin; Fan, Dongliang; Gao, Lin; Pan, Xuebiao

    2016-11-01

    Climate change has greatly affected agriculture. Agriculture faces increasing risks because of its sensitivity and vulnerability to climate change. Scientific assessment of climate change-induced agricultural risks could help to actively deal with climate change and ensure food security. However, quantitative assessment of risk is a difficult issue. Here, based on the IPCC assessment reports, a quantitative method for risk assessment of agriculture due to climate change is proposed. Risk is described as the product of the degree of loss and its probability of occurrence. The degree of loss can be expressed by the yield change amplitude. The probability of occurrence can be calculated by the new concept of climate change effect-accumulated frequency (CCEAF). Specific steps of this assessment method are suggested. The method is shown to be feasible and practical using spring wheat in Wuchuan County of Inner Mongolia as a test example. The results show that the fluctuation of spring wheat yield increased with the warming and drying climatic trend in Wuchuan County. The maximum yield decrease and its probability were 3.5 and 64.6%, respectively, for the maximum temperature increase of 88.3%, and its risk was 2.2%. The maximum yield decrease and its probability were 14.1 and 56.1%, respectively, for the maximum precipitation decrease of 35.2%, and its risk was 7.9%. For the comprehensive impacts of temperature and precipitation, the maximum yield decrease and its probability were 17.6 and 53.4%, respectively, and the risk increased to 9.4%. If appropriate adaptation strategies are not adopted, the degree of loss from the negative impacts of multiple climatic factors and its probability of occurrence will both increase, and the risk will grow accordingly.
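    The core definition, risk as the product of the degree of loss and its probability of occurrence, reproduces two of the abstract's own figures directly:

```python
def risk(loss_pct, prob_pct):
    """Risk (%) = degree of loss (%) x probability of occurrence (%)."""
    return loss_pct * prob_pct / 100.0

# the precipitation case: 14.1% maximum yield decrease with probability 56.1%
print(round(risk(14.1, 56.1), 1))   # 7.9
# the combined temperature-and-precipitation case: 17.6% decrease, probability 53.4%
print(round(risk(17.6, 53.4), 1))   # 9.4
```

    The probabilities themselves come from the CCEAF concept, i.e. the accumulated frequency with which a given climate effect (and hence loss level) is reached in the historical record.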

  18. A new method and software for quantitative analysis of continuous intracranial pressure recordings.

    Science.gov (United States)

    Eide, P K; Fremming, A D

    2001-12-01

    Computer software utilising a new method for quantitative analysis of intracranial pressure (ICP) was developed to provide a more accurate analysis of continuously recorded ICP. Intracranial pressure curves were analysed by the software to explore the relationship between mean ICP and the presence of ICP elevations. The Sensometrics Pressure Analyser (version 1.2) software provides a quantitative analysis of the ICP curve, presenting the ICP recordings as a matrix of numbers of ICP elevations of different levels (e.g. 20, 30 or 40 mmHg) and durations (e.g. 0.5, 5 or 10 minutes). The number of ICP elevations may be standardised by calculating the number of elevations during, for instance, a 10-hour period. The computer software was used to retrospectively analyse the ICP curves in our first 127 consecutive patients undergoing continuous 24-hour ICP monitoring during the two-year period from February 1997 to December 1998. The indications for ICP monitoring were suspected hydrocephalus, craniosynostosis or shunt failure. Analysis of the ICP curves revealed a rather weak relationship between mean ICP and the number of apparently abnormal ICP elevations (that is, elevations of 20 mmHg or above). Abnormal ICP elevations were present in a relatively high proportion of cases with a normal mean ICP below 10 mmHg, or a borderline mean ICP between 10 and 15 mmHg. In addition, the ICP data of two cases are presented, suggesting that mean ICP may be an inaccurate measure of ICP. The results of analysing ICP curves by means of this method and software reveal that calculation of ICP elevations of different levels and durations may represent a more accurate description of the ICP curve than calculation of mean ICP. The method may enhance the clinical application of ICP monitoring.
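    The matrix of elevation counts by level and duration can be sketched as a run-length computation over the pressure signal. The signal below is synthetic (a slow sinusoidal swing around a normal mean ICP), and the levels and two of the durations follow those quoted in the abstract; this is not the Sensometrics implementation:

```python
import numpy as np

def count_elevations(icp, fs_hz, level_mmhg, min_duration_s):
    """Count runs where ICP stays at or above a level for at least a duration."""
    above = icp >= level_mmhg
    # locate runs of consecutive samples above the threshold
    edges = np.diff(above.astype(int), prepend=0, append=0)
    starts = np.where(edges == 1)[0]
    ends = np.where(edges == -1)[0]
    return int(np.sum((ends - starts) / fs_hz >= min_duration_s))

fs = 1.0
t = np.arange(0, 3600, 1 / fs)                 # one hour of recording at 1 Hz
icp = 12 + 10 * np.sin(2 * np.pi * t / 600)    # mean 12 mmHg, slow waves peaking at 22

# matrix of counts: levels (mmHg) x minimum durations (seconds)
matrix = {(lvl, dur): count_elevations(icp, fs, lvl, dur)
          for lvl in (20, 30, 40) for dur in (30, 300)}
print(matrix)
```

    With this synthetic signal, only the 20 mmHg / 30 s cell is populated (six slow-wave crests of roughly two minutes each); the higher levels are never reached, which is exactly the kind of structure a mean-ICP summary would hide.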

  19. Smartphone based hand-held quantitative phase microscope using the transport of intensity equation method.

    Science.gov (United States)

    Meng, Xin; Huang, Huachuan; Yan, Keding; Tian, Xiaolin; Yu, Wei; Cui, Haoyang; Kong, Yan; Xue, Liang; Liu, Cheng; Wang, Shouyu

    2016-12-20

    In order to realize high contrast imaging with portable devices for potential mobile healthcare, we demonstrate a hand-held smartphone based quantitative phase microscope using the transport of intensity equation method. With a cost-effective illumination source and compact microscope system, multi-focal images of samples can be captured by the smartphone's camera via manual focusing. Phase retrieval is performed using a self-developed Android application, which calculates sample phases from multi-plane intensities via solving the Poisson equation. We test the portable microscope using a random phase plate with known phases, and to further demonstrate its performance, a red blood cell smear, a Pap smear and monocot root and broad bean epidermis sections are also successfully imaged. Considering its advantages as an accurate, high-contrast, cost-effective and field-portable device, the smartphone based hand-held quantitative phase microscope is a promising tool which can be adopted in the future in remote healthcare and medical diagnosis.
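    Under the simplifying assumption of uniform in-focus intensity I0, the transport-of-intensity equation reduces to a Poisson problem, dI/dz = -(I0/k) * laplacian(phi), which can be inverted spectrally with FFTs (assuming periodic boundaries). This round-trip sketch is a generic TIE solver, not the authors' Android implementation:

```python
import numpy as np

def tie_phase(didz, i0, k, dx):
    """Recover phase from the axial intensity derivative (uniform-I0 TIE)."""
    n = didz.shape[0]
    f = np.fft.fftfreq(n, d=dx) * 2 * np.pi
    kx, ky = np.meshgrid(f, f, indexing="ij")
    lap = -(kx ** 2 + ky ** 2)           # Fourier symbol of the Laplacian
    lap[0, 0] = 1.0                      # avoid division by zero at DC
    phi_hat = np.fft.fft2(-(k / i0) * didz) / lap
    phi_hat[0, 0] = 0.0                  # phase is defined up to a constant
    return np.fft.ifft2(phi_hat).real

# round-trip check with a known smooth, periodic phase
n, dx, i0, wavelength = 128, 1e-6, 1.0, 500e-9
k = 2 * np.pi / wavelength
x = np.arange(n) * dx
X, Y = np.meshgrid(x, x, indexing="ij")
phi = np.sin(2 * np.pi * X / (n * dx)) * np.cos(4 * np.pi * Y / (n * dx))

# forward model: axial intensity derivative produced by this phase
f = np.fft.fftfreq(n, d=dx) * 2 * np.pi
kx, ky = np.meshgrid(f, f, indexing="ij")
didz = np.fft.ifft2(-(kx ** 2 + ky ** 2) * np.fft.fft2(phi)).real * (-(i0 / k))

phi_rec = tie_phase(didz, i0, k, dx)
print(np.abs(phi_rec - phi).max())
```

    In practice dI/dz is estimated by finite differences of the multi-focal images captured through manual focusing, which is where most of the real-world error enters.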

  20. Quantitative Imaging Biomarkers: A Review of Statistical Methods for Computer Algorithm Comparisons

    Science.gov (United States)

    2014-01-01

    Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research. PMID:24919829

  1. Quantitative imaging biomarkers: a review of statistical methods for computer algorithm comparisons.

    Science.gov (United States)

    Obuchowski, Nancy A; Reeves, Anthony P; Huang, Erich P; Wang, Xiao-Feng; Buckler, Andrew J; Kim, Hyun J Grace; Barnhart, Huiman X; Jackson, Edward F; Giger, Maryellen L; Pennello, Gene; Toledano, Alicia Y; Kalpathy-Cramer, Jayashree; Apanasovich, Tatiyana V; Kinahan, Paul E; Myers, Kyle J; Goldgof, Dmitry B; Barboriak, Daniel P; Gillies, Robert J; Schwartz, Lawrence H; Sullivan, Daniel C

    2015-02-01

    Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research.

  2. Quantitative Analysis of Differential Proteome Expression in Bladder Cancer vs. Normal Bladder Cells Using SILAC Method.

    Directory of Open Access Journals (Sweden)

    Ganglong Yang

    Full Text Available The best way to increase patient survival rate is to identify patients who are likely to progress to muscle-invasive or metastatic disease upfront and treat them more aggressively. The human cell lines HCV29 (normal bladder epithelia), KK47 (low grade nonmuscle invasive bladder cancer, NMIBC), and YTS1 (metastatic bladder cancer) have been widely used in studies of molecular mechanisms and cell signaling during bladder cancer (BC) progression. However, little attention has been paid to global quantitative proteome analysis of these three cell lines. We labeled HCV29, KK47, and YTS1 cells by the SILAC method using three stable isotopes each of arginine and lysine. Labeled proteins were analyzed by 2D ultrahigh-resolution liquid chromatography LTQ Orbitrap mass spectrometry. Among 3721 unique identified and annotated proteins in KK47 and YTS1 cells, 36 were significantly upregulated and 74 were significantly downregulated with >95% confidence. Differential expression of these proteins was confirmed by western blotting, quantitative RT-PCR, and cell staining with specific antibodies. Gene ontology (GO) term and pathway analysis indicated that the differentially regulated proteins were involved in DNA replication and molecular transport, cell growth and proliferation, cellular movement, immune cell trafficking, and cell death and survival. These proteins and the advanced proteome techniques described here will be useful for further elucidation of molecular mechanisms in BC and other types of cancer.
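    SILAC quantitation ultimately comes down to per-protein heavy/light intensity ratios and a significance cutoff on the log fold change. A toy sketch with synthetic intensities; the z-score cutoff here merely stands in for the study's >95%-confidence criterion:

```python
import numpy as np

rng = np.random.default_rng(4)

n = 200
light = rng.lognormal(10, 1, n)              # "light"-labeled intensities
ratio = np.exp(rng.normal(0, 0.2, n))        # biological/technical scatter
ratio[:5] *= 8.0                             # five genuinely upregulated proteins
heavy = light * ratio                        # "heavy"-labeled intensities

log2fc = np.log2(heavy / light)              # per-protein log2 fold change
z = (log2fc - np.median(log2fc)) / log2fc.std(ddof=1)
up = np.where(z > 1.96)[0]                   # stand-in for the 95% confidence call
print(sorted(up.tolist()))
```

    Real pipelines additionally aggregate peptide-level ratios into protein-level ones and correct for multiple testing, which this sketch deliberately omits.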

  3. Evaluation of the remineralization capacity of CPP-ACP containing fluoride varnish by different quantitative methods.

    Science.gov (United States)

    Savas, Selcuk; Kavrìk, Fevzi; Kucukyìlmaz, Ebru

    2016-01-01

    The aim of this study was to evaluate the efficacy of CPP-ACP containing fluoride varnish for remineralizing white spot lesions (WSLs) with four different quantitative methods. Four windows (3x3 mm) were created on the enamel surfaces of bovine incisor teeth. A control window was covered with nail varnish, and WSLs were created on the other windows (after demineralization, first week and fourth week) in an acidified gel system. The test material (MI Varnish) was applied on the demineralized areas, and the treated enamel samples were stored in artificial saliva. At the fourth week, the enamel surfaces were tested by surface microhardness (SMH), quantitative light-induced fluorescence-digital (QLF-D), energy-dispersive spectroscopy (EDS) and laser fluorescence (LF pen). The data were statistically analyzed (α=0.05). The LF pen measurements showed significant differences at baseline, after demineralization, and after the one-week remineralization period (p<0.05), but not thereafter (p>0.05). With regards to the SMH and QLF-D analyses, statistically significant differences were found among all the phases (p<0.05). CPP-ACP containing fluoride varnish provides remineralization of WSLs after a single application and seems suitable for clinical use.

  4. Spiked proteomic standard dataset for testing label-free quantitative software and statistical methods.

    Science.gov (United States)

    Ramus, Claire; Hovasse, Agnès; Marcellin, Marlène; Hesse, Anne-Marie; Mouton-Barbosa, Emmanuelle; Bouyssié, David; Vaca, Sebastian; Carapito, Christine; Chaoui, Karima; Bruley, Christophe; Garin, Jérôme; Cianférani, Sarah; Ferro, Myriam; Dorssaeler, Alain Van; Burlet-Schiltz, Odile; Schaeffer, Christine; Couté, Yohann; Gonzalez de Peredo, Anne

    2016-03-01

    This data article describes a controlled, spiked proteomic dataset for which the "ground truth" of variant proteins is known. It is based on the LC-MS analysis of samples composed of a fixed background of yeast lysate and different spiked amounts of the UPS1 mixture of 48 recombinant proteins. It can be used to objectively evaluate bioinformatic pipelines for label-free quantitative analysis, and their ability to detect variant proteins with good sensitivity and low false discovery rate in large-scale proteomic studies. More specifically, it can be useful for tuning software tools parameters, but also testing new algorithms for label-free quantitative analysis, or for evaluation of downstream statistical methods. The raw MS files can be downloaded from ProteomeXchange with identifier PXD001819. Starting from some raw files of this dataset, we also provide here some processed data obtained through various bioinformatics tools (including MaxQuant, Skyline, MFPaQ, IRMa-hEIDI and Scaffold) in different workflows, to exemplify the use of such data in the context of software benchmarking, as discussed in details in the accompanying manuscript [1]. The experimental design used here for data processing takes advantage of the different spike levels introduced in the samples composing the dataset, and processed data are merged in a single file to facilitate the evaluation and illustration of software tools results for the detection of variant proteins with different absolute expression levels and fold change values.

  5. Quantitative evaluation of besifloxacin ophthalmic suspension by HPLC, application to bioassay method and cytotoxicity studies.

    Science.gov (United States)

    Costa, Márcia C N; Barden, Amanda T; Andrade, Juliana M M; Oppe, Tércio P; Schapoval, Elfrides E S

    2014-02-01

    Besifloxacin (BSF) is a synthetic chiral fluoroquinolone developed for the topical treatment of ophthalmic infections. The present study reports the development and validation of a microbiological assay, applying the cylinder-plate method, for the determination of BSF in ophthalmic suspension. To assess this methodology, a high performance liquid chromatography (HPLC) method was also developed and validated for the quantification of BSF. The HPLC method showed specificity, linearity in the range of 20-80 µg mL(-1) (r=0.9998), precision, accuracy and robustness. The microbiological method is based on the inhibitory effect of BSF upon the strain of Staphylococcus epidermidis ATCC 12228 used as a test microorganism. The bioassay validation yielded excellent results and included linearity, precision, accuracy, robustness and selectivity. The assay results were treated statistically by analysis of variance (ANOVA) and were found to be linear (r=0.9974) in the range of 0.5-2.0 µg mL(-1), precise (inter-assay: RSD=0.84), accurate (101.4%), specific and robust. The bioassay and the previously validated HPLC method were compared using Student's t test, which indicated that there was no statistically significant difference between these two methods. These results confirm that the proposed microbiological method can be used in routine analysis for the quantitative determination of BSF in ophthalmic suspension. A preliminary stability study during the HPLC validation demonstrated that BSF is unstable under UV conditions. The photodegradation kinetics of BSF in water showed a first-order reaction for the drug product (ophthalmic suspension) and a second-order reaction for the reference standard (RS) under UVA light. UVA-degraded samples of BSF were also studied in order to determine the preliminary in vitro cytotoxicity against mononuclear cells. The results indicated that BSF does not alter

  6. Evaluation of methods for oligonucleotide array data via quantitative real-time PCR

    Directory of Open Access Journals (Sweden)

    Morris Daryl E

    2006-01-01

    Full Text Available Abstract Background There are currently many different methods for processing and summarizing probe-level data from Affymetrix oligonucleotide arrays. It is of great interest to validate these methods and identify those that are most effective. There is no single best way to do this validation, and a variety of approaches is needed. Moreover, gene expression data are collected to answer a variety of scientific questions, and the same method may not be best for all questions. Only a handful of validation studies have been done so far, most of which rely on spike-in datasets and focus on the question of detecting differential expression. Here we seek methods that excel at estimating relative expression. We evaluate methods by identifying those that give the strongest linear association between expression measurements by array and the "gold-standard" assay. Quantitative reverse-transcription polymerase chain reaction (qRT-PCR is generally considered the "gold-standard" assay for measuring gene expression by biologists and is often used to confirm findings from microarray data. Here we use qRT-PCR measurements to validate methods for the components of processing oligo array data: background adjustment, normalization, mismatch adjustment, and probeset summary. An advantage of our approach over spike-in studies is that methods are validated on a real dataset that was collected to address a scientific question. Results We initially identify three of six popular methods that consistently produced the best agreement between oligo array and RT-PCR data for medium- and high-intensity genes. The three methods are generally known as MAS5, gcRMA, and the dChip mismatch mode. For medium- and high-intensity genes, we identified use of data from mismatch probes (as in MAS5 and dChip mismatch and a sequence-based method of background adjustment (as in gcRMA as the most important factors in methods' performances. However, we found poor reliability for methods

  7. Improved Dynamic Analysis method for quantitative PIXE and SXRF element imaging of complex materials

    Energy Technology Data Exchange (ETDEWEB)

    Ryan, C.G., E-mail: chris.ryan@csiro.au; Laird, J.S.; Fisher, L.A.; Kirkham, R.; Moorhead, G.F.

    2015-11-15

    The Dynamic Analysis (DA) method in the GeoPIXE software provides a rapid tool to project quantitative element images from PIXE and SXRF imaging event data both for off-line analysis and in real-time embedded in a data acquisition system. Initially, it assumes uniform sample composition, background shape and constant model X-ray relative intensities. A number of image correction methods can be applied in GeoPIXE to correct images to account for chemical concentration gradients, differential absorption effects, and to correct images for pileup effects. A new method, applied in a second pass, uses an end-member phase decomposition obtained from the first pass, and DA matrices determined for each end-member, to re-process the event data with each pixel treated as an admixture of end-member terms. This paper describes the new method and demonstrates through examples and Monte-Carlo simulations how it better tracks spatially complex composition and background shape while still benefitting from the speed of DA.
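    The second pass treats each pixel as an admixture of end-member terms; stripped of the DA-matrix machinery, that per-pixel step is a linear unmixing problem. A sketch with made-up end-member compositions, solved by ordinary least squares (GeoPIXE itself applies a DA matrix determined for each end-member):

```python
import numpy as np

# End-member compositions: rows = measured element yields, columns = phases.
# Values are illustrative only.
E = np.array([[10.0, 1.0, 0.2],
              [0.5, 8.0, 1.0],
              [0.1, 0.5, 6.0]])
true_frac = np.array([0.6, 0.3, 0.1])   # assumed mixing fractions at one pixel
pixel = E @ true_frac                   # simulated per-pixel element yields

# unmix: solve for the admixture of end-member terms at this pixel
frac, *_ = np.linalg.lstsq(E, pixel, rcond=None)
frac = np.clip(frac, 0, None)           # physical fractions are non-negative
frac /= frac.sum()                      # normalise to unit total
print(frac)
```

    Repeating this per pixel is what lets the second pass track spatially varying composition instead of assuming one uniform matrix for the whole map.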

  8. A simple, quantitative method using alginate gel to determine rat colonic tumor volume in vivo.

    Science.gov (United States)

    Irving, Amy A; Young, Lindsay B; Pleiman, Jennifer K; Konrath, Michael J; Marzella, Blake; Nonte, Michael; Cacciatore, Justin; Ford, Madeline R; Clipson, Linda; Amos-Landgraf, James M; Dove, William F

    2014-04-01

    Many studies of the response of colonic tumors to therapeutics use tumor multiplicity as the endpoint to determine the effectiveness of the agent. These studies can be greatly enhanced by accurate measurements of tumor volume. Here we present a quantitative method to easily and accurately determine colonic tumor volume. This approach uses a biocompatible alginate to create a negative mold of a tumor-bearing colon; this mold is then used to make positive casts of dental stone that replicate the shape of each original tumor. The weight of the dental stone cast correlates highly with the weight of the dissected tumors. After refinement of the technique, overall error in tumor volume was 16.9% ± 7.9% and includes error from both the alginate and dental stone procedures. Because this technique is limited to molding of tumors in the colon, we utilized the Apc(Pirc/+) rat, which has a propensity for developing colonic tumors that reflect the location of the majority of human intestinal tumors. We have successfully used the described method to determine tumor volumes ranging from 4 to 196 mm³. Alginate molding combined with dental stone casting is a facile method for determining tumor volume in vivo without costly equipment or knowledge of analytic software. This broadly accessible method creates the opportunity to objectively study colonic tumors over time in living animals in conjunction with other experiments and without transferring animals from the facility where they are maintained.

  9. Intracranial aneurysm segmentation in 3D CT angiography: method and quantitative validation

    Science.gov (United States)

    Firouzian, Azadeh; Manniesing, R.; Flach, Z. H.; Risselada, R.; van Kooten, F.; Sturkenboom, M. C. J. M.; van der Lugt, A.; Niessen, W. J.

    2010-03-01

    Accurately quantifying aneurysm shape parameters is of clinical importance, as it is an important factor in choosing the right treatment modality (i.e. coiling or clipping), in predicting rupture risk and operative risk and for pre-surgical planning. The first step in aneurysm quantification is to segment it from other structures that are present in the image. As manual segmentation is a tedious procedure and prone to inter- and intra-observer variability, there is a need for an automated method which is accurate and reproducible. In this paper a novel semi-automated method for segmenting aneurysms in Computed Tomography Angiography (CTA) data based on Geodesic Active Contours is presented and quantitatively evaluated. Three different image features are used to steer the level set to the boundary of the aneurysm, namely intensity, gradient magnitude and variance in intensity. The method requires minimum user interaction, i.e. clicking a single seed point inside the aneurysm which is used to estimate the vessel intensity distribution and to initialize the level set. The results show that the developed method is reproducible, and performs in the range of interobserver variability in terms of accuracy.

  10. Hotspot Identification for Shanghai Expressways Using the Quantitative Risk Assessment Method

    Science.gov (United States)

    Chen, Can; Li, Tienan; Sun, Jian; Chen, Feng

    2016-01-01

    Hotspot identification (HSID) is the first and key step of the expressway safety management process. This study presents a new HSID method using the quantitative risk assessment (QRA) technique. Crashes that are likely to happen for a specific site are treated as the risk. The aggregation of the crash occurrence probability for all exposure vehicles is estimated based on the empirical Bayesian method. As for the consequences of crashes, crashes may not only cause direct losses (e.g., occupant injuries and property damages) but also result in indirect losses. The indirect losses are expressed by the extra delays calculated using the deterministic queuing diagram method. The direct losses and indirect losses are uniformly monetized to be considered as the consequences of this risk. The potential costs of crashes, as a criterion to rank high-risk sites, can be explicitly expressed as the sum of the crash probability for all passing vehicles and the corresponding consequences of crashes. A case study on the urban expressways of Shanghai is presented. The results show that the new QRA method for HSID enables the identification of a set of high-risk sites that truly reveal the potential crash costs to society.
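    The ranking criterion, potential crash cost as crash probability summed over exposed vehicles times the monetized direct-plus-indirect consequences, can be sketched with hypothetical site figures (the delay costs stand in for the deterministic queuing-diagram estimates):

```python
# Hypothetical expressway sites: annual exposure, per-vehicle crash
# probability, and monetized direct and indirect (delay) losses per crash.
sites = {
    "A": dict(veh_per_year=2.0e6, crash_prob=1.5e-6, direct=40000.0, delay=12000.0),
    "B": dict(veh_per_year=3.5e6, crash_prob=0.8e-6, direct=40000.0, delay=25000.0),
    "C": dict(veh_per_year=1.2e6, crash_prob=2.4e-6, direct=40000.0, delay=8000.0),
}

def potential_cost(s):
    # expected crashes per year times total consequence per crash
    return s["veh_per_year"] * s["crash_prob"] * (s["direct"] + s["delay"])

ranked = sorted(sites, key=lambda name: potential_cost(sites[name]), reverse=True)
print(ranked)  # highest-risk site first
```

    Note how the ranking can differ from one based on crash frequency alone: a site with fewer expected crashes but heavier congestion consequences can still top the list.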

  11. Polymorphism in nimodipine raw materials: development and validation of a quantitative method through differential scanning calorimetry.

    Science.gov (United States)

    Riekes, Manoela Klüppel; Pereira, Rafael Nicolay; Rauber, Gabriela Schneider; Cuffini, Silvia Lucia; de Campos, Carlos Eduardo Maduro; Silva, Marcos Antonio Segatto; Stulzer, Hellen Karine

    2012-11-01

    Due to the physical-chemical and therapeutic impacts of polymorphism, its monitoring in raw materials is necessary. The purpose of this study was to develop and validate a quantitative method to determine the polymorphic content of nimodipine (NMP) raw materials based on differential scanning calorimetry (DSC). The polymorphs required for the development of the method were characterized through DSC, X-ray powder diffraction (XRPD) and Raman spectroscopy and their polymorphic identity was confirmed. The developed method was found to be linear, robust, precise, accurate and specific. Three different samples obtained from distinct suppliers (NMP 1, NMP 2 and NMP 3) were firstly characterized through XRPD and DSC as polymorphic mixtures. The determination of their polymorphic identity revealed that all samples presented the Modification I (Mod I) or metastable form in greatest proportion. Since the commercial polymorph is Mod I, the polymorphic characteristic of the samples analyzed needs to be investigated. Thus, the proposed method provides a useful tool for the monitoring of the polymorphic content of NMP raw materials.

  12. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis.

    Science.gov (United States)

    Larimer, Curtis; Winder, Eric; Jeters, Robert; Prowant, Matthew; Nettleship, Ian; Addleman, Raymond Shane; Bonheyo, George T

    2016-01-01

    The accumulation of bacteria in surface-attached biofilms can be detrimental to human health, dental hygiene, and many industrial processes. Natural biofilms are soft and often transparent, and they have heterogeneous biological composition and structure over micro- and macroscales. As a result, it is challenging to quantify the spatial distribution and overall intensity of biofilms. In this work, a new method was developed to enhance the visibility and quantification of bacterial biofilms. First, broad-spectrum biomolecular staining was used to enhance the visibility of the cells, nucleic acids, and proteins that make up biofilms. Then, an image analysis algorithm was developed to objectively and quantitatively measure biofilm accumulation from digital photographs and results were compared to independent measurements of cell density. This new method was used to quantify the growth intensity of Pseudomonas putida biofilms as they grew over time. This method is simple and fast, and can quantify biofilm growth over a large area with approximately the same precision as the more laborious cell counting method. Stained and processed images facilitate assessment of spatial heterogeneity of a biofilm across a surface. This new approach to biofilm analysis could be applied in studies of natural, industrial, and environmental biofilms.
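    After staining, the quantification step reduces to thresholding the stain intensity and reporting coverage and mean intensity over the image. A minimal sketch on a synthetic image; the threshold value is arbitrary, and the paper's algorithm operates on digital photographs of stained biofilms:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic "stained" image: low-intensity background with a bright biofilm patch
img = rng.normal(0.1, 0.02, (128, 128))
img[32:96, 32:96] += 0.5

def quantify(img, threshold):
    """Return (coverage fraction, mean stain intensity of covered pixels)."""
    mask = img > threshold
    coverage = mask.mean()
    intensity = img[mask].mean() if mask.any() else 0.0
    return coverage, intensity

coverage, intensity = quantify(img, threshold=0.3)
print(coverage, intensity)
```

    Running the same computation tile by tile, rather than on the whole frame, is what exposes the spatial heterogeneity of growth across a surface that the abstract emphasizes.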

  13. WNN-Based Network Security Situation Quantitative Prediction Method and Its Optimization

    Institute of Scientific and Technical Information of China (English)

    Ji-Bao Lai; Hui-Qiang Wang; Xiao-Wu Liu; Ying Liang; Rui-Juan Zheng; Guo-Sheng Zhao

    2008-01-01

    The accurate, real-time prediction of the network security situation is the premise and basis of preventing intrusions and attacks in a large-scale network. In order to predict the security situation more accurately, a quantitative prediction method based on a Wavelet Neural Network optimized with a Genetic Algorithm (GAWNN) is proposed. After analyzing the past and current network security situation in detail, we build a prediction model based on a wavelet neural network optimized by an improved genetic algorithm, and then adopt GAWNN to predict the non-linear time series of the network security situation. Simulation experiments show that the proposed method outperforms the Wavelet Neural Network (WNN) and Back Propagation Neural Network (BPNN) methods of the same architecture in convergence speed, function approximation and prediction accuracy. Moreover, the prediction results reveal security trends and patterns early enough for security analysts and administrators to adjust security policies in near real time.
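    The GAWNN idea can be reduced to a toy: a one-neuron wavelet "network" whose parameters are tuned by a simple elitist genetic algorithm to fit a short time series. The network size, GA operators and rates below are illustrative assumptions, not the paper's configuration.

```python
import math
import random

random.seed(0)
series = [math.sin(0.3 * t) for t in range(20)]          # toy "situation" series
pairs = [(series[t], series[t + 1]) for t in range(19)]  # predict next value

def morlet(u):
    """Morlet mother wavelet, a common WNN activation."""
    return math.cos(1.75 * u) * math.exp(-u * u / 2.0)

def predict(params, x):
    w, a, b, c = params                       # weight, scale, shift, bias
    return w * morlet((x - b) / a) + c

def fitness(params):                          # sum of squared one-step errors
    return sum((predict(params, x) - y) ** 2 for x, y in pairs)

def mutate(params):
    return [p + random.gauss(0, 0.1) for p in params]

pop = [[random.uniform(-1, 1), random.uniform(0.5, 2), 0.0, 0.0]
       for _ in range(30)]
initial_best = min(fitness(p) for p in pop)
for _ in range(100):                          # elitist GA: keep the 10 best
    pop.sort(key=fitness)
    pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(20)]
final_best = min(fitness(p) for p in pop)
```

    Because the elites survive unchanged, the best fitness can never worsen across generations; the paper's improved GA additionally uses crossover and adaptive rates.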

  14. Analysis of training sample selection strategies for regression-based quantitative landslide susceptibility mapping methods

    Science.gov (United States)

    Erener, Arzu; Sivas, A. Abdullah; Selcuk-Kestel, A. Sevtap; Düzgün, H. Sebnem

    2017-07-01

    All quantitative landslide susceptibility mapping (QLSM) methods require two basic data types: a landslide inventory and the factors that influence landslide occurrence (landslide influencing factors, LIF). The accuracy of QLSM methods differs depending on the type of landslide, the nature of the triggers and the LIF. Moreover, how to balance the number of 0's (non-occurrence) and 1's (occurrence) in the training set obtained from the landslide inventory, and how to select which of the 1's and 0's to include in QLSM models, play a critical role in the accuracy of the QLSM. Although the performance of various QLSM methods has been investigated extensively in the literature, the challenge of training set construction has not been adequately addressed. To tackle this challenge, this study uses three different training set selection strategies, along with the original data set, to test the performance of three regression methods: Logistic Regression (LR), Bayesian Logistic Regression (BLR) and Fuzzy Logistic Regression (FLR). The first sampling strategy is proportional random sampling (PRS), which takes into account a weighted selection of landslide occurrences in the sample set. The second, non-selective nearby sampling (NNS), includes randomly selected sites and their surrounding neighboring points at certain preselected distances to capture the impact of clustering. The third, selective nearby sampling (SNS), concentrates on the group of 1's and their surrounding neighborhood: a randomly selected group of landslide sites and their neighborhood are considered in the analyses, with parameters similar to NNS. It is found that the LR-PRS, FLR-PRS and BLR-whole-data set-ups, in that order, yield the best fits among the alternatives. The results indicate that in regression-based QLSM, avoiding spatial correlation in the data set is critical for model performance.
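    The balancing problem the abstract describes can be sketched in a few lines: before fitting any regression model, draw an equal (or otherwise weighted) number of 1's and 0's from the inventory. The synthetic inventory and class sizes below are assumptions for illustration, in the spirit of PRS rather than its exact specification.

```python
import random

random.seed(42)
# Synthetic inventory: 50 landslide cells (label 1), 950 stable cells (label 0).
inventory = [(i, 1) for i in range(50)] + [(i, 0) for i in range(50, 1000)]

def balanced_sample(data, n_per_class):
    """Draw n_per_class occurrences and n_per_class non-occurrences."""
    ones = [d for d in data if d[1] == 1]
    zeros = [d for d in data if d[1] == 0]
    return random.sample(ones, n_per_class) + random.sample(zeros, n_per_class)

train = balanced_sample(inventory, 40)
labels = [y for _, y in train]
print(sum(labels), len(train))  # 40 positives out of 80 samples
```

    NNS and SNS would replace the uniform draws with neighborhood-based selection around chosen sites.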

  15. Evaluation of radar-gauge merging methods for quantitative precipitation estimates

    Directory of Open Access Journals (Sweden)

    E. Goudenhoofdt

    2008-10-01

    Accurate quantitative precipitation estimates are of crucial importance for hydrological studies and applications. When spatial precipitation fields are required, rain gauge measurements are often combined with weather radar observations. In this paper, we evaluate several radar-gauge merging methods with various degrees of complexity, from mean field bias correction to geostatistical merging techniques. The study area is the Walloon region of Belgium, which is mostly located in the Meuse catchment. Observations from a C-band Doppler radar and a dense rain gauge network are used to retrieve daily rainfall accumulations over this area. The relative performance of the different merging methods is assessed through a comparison against daily measurements from an independent gauge network. A 3-year verification is performed using several statistical quality parameters. The geostatistical merging methods perform best, decreasing the mean absolute error by 40% with respect to the original data; a mean field bias correction still achieves a reduction of 25%. A seasonal analysis shows that the benefit of using radar observations is particularly significant during summer. The effect of network density on the performance of the methods is also investigated, using a simple approach to remove gauges from a network. The analysis reveals that the sensitivity to network density is relatively high for the geostatistical methods but rather small for the simple methods. The geostatistical methods give the best results at all network densities except at a very low density of 1 gauge per 500 km², where a range-dependent adjustment complemented with a static local bias correction performs best.
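    Mean field bias correction, the simplest method in the comparison, applies one multiplicative factor to the whole radar field: the ratio of mean gauge rainfall to mean radar rainfall at the gauge locations. The values below are toy data, not the study's observations.

```python
# Mean field bias (MFB) correction of a radar rainfall field.

gauge = [4.0, 6.0, 10.0]           # daily accumulations at three gauges (mm)
radar_at_gauges = [2.0, 3.0, 5.0]  # collocated radar estimates (mm)

# Single multiplicative bias factor from the collocated pairs.
bias = sum(gauge) / sum(radar_at_gauges)   # 20 / 10 = 2.0

radar_field = [[1.0, 2.5], [3.0, 0.5]]     # any radar pixel grid (mm)
corrected = [[bias * px for px in row] for row in radar_field]
print(bias, corrected)  # 2.0 [[2.0, 5.0], [6.0, 1.0]]
```

    The geostatistical methods go further by letting the gauge-radar relationship vary in space instead of using one field-wide factor.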

  16. Method of quantitatively separating uranium from specimens of natural water by sorption on silica

    Energy Technology Data Exchange (ETDEWEB)

    Putral, A.; Schwochau, K.

    1981-08-11

    A method of quantitatively separating uranium from specimens of natural water, especially sea water, or from solutions of comparable composition, in which silica gel is used to adsorb the uranium. The specimen liquid is passed over granular silica gel with an average grain diameter not exceeding 0.5 mm. The flow rate is selected so that the contact time of the liquid with the silica gel is at least 50 seconds, after which the uranium adsorbed on the silica gel is recovered by elution with at least 0.1 normal oxidizing mineral acid. The volume of eluate used should be at least twice the volume of the silica gel, and this proportion should, if possible, not be exceeded.
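    The 50-second contact-time constraint translates directly into a maximum flow rate for a given column. A back-of-the-envelope check, with the bed volume and void fraction as illustrative assumptions (neither is specified in the abstract):

```python
# Maximum flow rate through a silica-gel column that still satisfies the
# method's minimum contact time; contact time ~ void volume / flow rate.

bed_volume_ml = 100.0
porosity = 0.4                        # assumed void fraction of the packed gel
void_volume_ml = bed_volume_ml * porosity

min_contact_s = 50.0
max_flow_ml_per_s = void_volume_ml / min_contact_s
print(max_flow_ml_per_s)  # 0.8 mL/s for this hypothetical column
```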

  17. Rapid and Inexpensive Screening of Genomic Copy Number Variations Using a Novel Quantitative Fluorescent PCR Method

    Directory of Open Access Journals (Sweden)

    Martin Stofanko

    2013-01-01

    Detection of human microdeletion and microduplication syndromes poses a significant burden on public healthcare systems in developing countries. With genome-wide diagnostic assays frequently inaccessible, targeted low-cost PCR-based approaches are preferred. However, their reproducibility depends on equally efficient amplification from a number of target and control primers. To address this, the recently described technique called Microdeletion/Microduplication Quantitative Fluorescent PCR (MQF-PCR) was shown to reliably detect four human syndromes by quantifying DNA amplification in an internally controlled PCR reaction. Here, we confirm its utility in the detection of eight human microdeletion syndromes, including the more common WAGR, Smith-Magenis, and Potocki-Lupski syndromes, with 100% sensitivity and 100% specificity. We present the selection, design, and performance evaluation of detection primers using a variety of approaches. We conclude that MQF-PCR is an easily adaptable method for the detection of human pathological chromosomal aberrations.
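    The dosage logic behind internally controlled assays of this kind can be sketched as a ratio test: the target amplicon signal is normalized to a control amplicon from a region of known copy number, and ratios near 0.5, 1.0 and 1.5 suggest deletion, normal dosage and duplication. The thresholds below are illustrative assumptions, not the paper's calibrated cut-offs.

```python
# Hypothetical dosage call from quantified fluorescent PCR signals.

def dosage_call(target_signal, control_signal):
    """Classify copy-number state from the target/control signal ratio."""
    ratio = target_signal / control_signal
    if ratio < 0.75:
        return "deletion"      # one target copy vs. two control copies
    if ratio > 1.25:
        return "duplication"   # three target copies vs. two control copies
    return "normal"

print(dosage_call(52.0, 100.0))   # deletion
print(dosage_call(98.0, 100.0))   # normal
print(dosage_call(148.0, 100.0))  # duplication
```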

  18. Study on microscope hyperspectral medical imaging method for biomedical quantitative analysis

    Institute of Scientific and Technical Information of China (English)

    LI QingLi; XUE YongQi; XIAO GongHai; ZHANG JingFa

    2008-01-01

    A microscopic pushbroom hyperspectral imaging system was developed based on microscopy and spectral imaging technology, following the principle of spectral imagers in remote sensing. The basic principle and key technologies of this system are presented and the system performance is analyzed. Methods and algorithms are proposed to preprocess and normalize the microscopic hyperspectral data and to retrieve the transmittance spectrum of samples. As a case study, the microscopic hyperspectral imaging system was used to image retina sections of different rats, with significant results. Experimental results show that the system can be used for quantitative assessment and for evaluating the effect of medication in biomedical research.
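    Transmittance retrieval from raw pushbroom data commonly normalizes the sample signal against dark and white (blank) reference frames. This is a plausible sketch of that step, not the authors' exact algorithm; the counts are invented.

```python
# Per-band transmittance from raw detector counts: T = (S - D) / (W - D),
# where S is the sample frame, D the dark frame, W the white reference.

def transmittance(sample, dark, white):
    """Inputs are lists of raw counts, one entry per spectral band."""
    return [(s - d) / (w - d) for s, d, w in zip(sample, dark, white)]

sample_counts = [300.0, 550.0, 800.0]     # three toy bands
dark_counts = [100.0, 100.0, 100.0]
white_counts = [1100.0, 1100.0, 1100.0]
print(transmittance(sample_counts, dark_counts, white_counts))
# -> [0.2, 0.45, 0.7]
```

    Dividing out the dark and white frames removes detector offset and illumination non-uniformity before any quantitative comparison between samples.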

  19. Measuring access to medicines: a review of quantitative methods used in household surveys

    Directory of Open Access Journals (Sweden)

    Domingues Marlos R

    2010-05-01

    Abstract Background Medicine access is an important goal of medicine policy; however, the evaluation of medicine access is a subject under conceptual and methodological development. The aim of this study was to describe quantitative methodologies for measuring medicine access at the household level, with access expressed as paid or unpaid medicine acquisition. Methods Searches were carried out in electronic databases and institutional health sites, within references from retrieved papers, and by contacting authors. Results Nine papers were located. The methodologies of the studies differed in recall period, recruitment of subjects, and characterization of medicine access. Conclusions Standardized medicine-access indicators and appropriate recall periods are required to evaluate different medicines and access dimensions and to improve comparability across studies. In addition, specific keywords should be established to support future literature reviews on this topic.

  20. An effective method for the quantitative detection of porcine endogenous retrovirus in pig tissues.

    Science.gov (United States)

    Zhang, Peng; Yu, Ping; Wang, Wei; Zhang, Li; Li, Shengfu; Bu, Hong

    2010-05-01

    Xenotransplantation shows great promise for providing a virtually limitless supply of cells, tissues, and organs for a variety of therapeutic procedures. However, the potential of porcine endogenous retrovirus (PERV) as a human-tropic pathogen, and particularly as a public health risk, is a major concern for xenotransplantation. This study focuses on the detection of PERV copy number in various tissues and organs of the Banna Minipig Inbred (BMI) line, collected from 2006 to 2007 at West China Hospital, Sichuan University. Real-time quantitative polymerase chain reaction (SYBR Green I) was performed. The results showed that the pol gene had the highest copy number across tissues compared with gag, envA, and envB. Our experiment offers a rapid and accurate method for determining copy number in various tissues and is especially suitable for the selection of tissues or organs for future clinical xenotransplantation.
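    Absolute copy-number estimates in SYBR Green qPCR are usually read off a standard curve, in which the cycle threshold (Ct) is linear in log10 of the input copies. The sketch below fits such a curve and inverts it; the standards and Ct values are invented for illustration, not the study's data.

```python
import math

# Fit Ct = m*log10(N) + b from dilution standards, then recover copy number
# for an unknown sample from its Ct: N = 10**((Ct - b) / m).

standards = [(1e3, 30.0), (1e4, 26.7), (1e5, 23.4), (1e6, 20.1)]  # (copies, Ct)

logs = [math.log10(n) for n, _ in standards]
cts = [ct for _, ct in standards]
mx = sum(logs) / len(logs)
my = sum(cts) / len(cts)
m = sum((x - mx) * (y - my) for x, y in zip(logs, cts)) / \
    sum((x - mx) ** 2 for x in logs)
b = my - m * mx

def copies_from_ct(ct):
    """Invert the standard curve for an unknown sample's Ct."""
    return 10 ** ((ct - b) / m)

print(round(copies_from_ct(23.4)))  # mid standard -> ~1e5 copies
```

    A slope near -3.32 corresponds to an amplification efficiency of 100%, a common sanity check on the curve before quantifying unknowns.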