WorldWideScience

Sample records for bioimage analysis methods

  1. A biosegmentation benchmark for evaluation of bioimage analysis methods

    Directory of Open Access Journals (Sweden)

    Kvilekval Kristian

    2009-11-01

    Full Text Available Abstract Background We present a biosegmentation benchmark that includes infrastructure, datasets with associated ground truth, and validation methods for biological image analysis. The primary motivation for creating this resource comes from the fact that it is very difficult, if not impossible, for an end-user to choose from the wide range of segmentation methods available in the literature for a particular bioimaging problem. No single algorithm is likely to be equally effective on a diverse set of images, and each method has its own strengths and limitations. We hope that our benchmark resource will be of considerable help both to bioimaging researchers looking for novel image processing methods and to image processing researchers exploring the application of their methods to biology. Results Our benchmark consists of different classes of images and ground truth data, ranging in scale from subcellular and cellular to tissue level, each of which poses its own set of challenges to image analysis. The associated ground truth data can be used to evaluate the effectiveness of different methods, to improve methods and to compare results. Standard evaluation methods and some analysis tools are integrated into a database framework that is available online at http://bioimage.ucsb.edu/biosegmentation/. Conclusion This online benchmark will facilitate the integration and comparison of image analysis methods for bioimages. While the primary focus is on biological images, we believe that the dataset and infrastructure will be of interest to researchers and developers working with biological image analysis, image segmentation and object tracking in general.
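
    To make the evaluation idea concrete: a benchmark of this kind scores a candidate segmentation against ground truth with overlap measures. The sketch below is a generic illustration, not code from the benchmark itself; the function name and use of NumPy are assumptions. It computes the common Dice and Jaccard overlap scores for binary masks:

        import numpy as np

        def overlap_scores(pred, truth):
            """Dice and Jaccard overlap between two binary masks (1 = object)."""
            pred = pred.astype(bool)
            truth = truth.astype(bool)
            inter = np.logical_and(pred, truth).sum()
            union = np.logical_or(pred, truth).sum()
            dice = 2.0 * inter / max(pred.sum() + truth.sum(), 1)
            jaccard = inter / max(union, 1)
            return dice, jaccard

        # Toy 4x4 example: the prediction has one extra pixel vs. ground truth.
        pred = np.array([[0, 1, 1, 0], [0, 1, 1, 0], [0, 0, 0, 0], [0, 0, 0, 0]])
        truth = np.array([[0, 1, 1, 0], [0, 1, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]])
        print(overlap_scores(pred, truth))  # (0.857..., 0.75)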

  2. Chapter 17: bioimage informatics for systems pharmacology.

    Directory of Open Access Journals (Sweden)

    Fuhai Li

    2013-04-01

    Full Text Available Recent advances in automated high-resolution fluorescence microscopy and robotic handling have made possible the systematic and cost-effective study of diverse morphological changes within a large population of cells under a variety of perturbations, e.g., drugs, compounds, metal catalysts, RNA interference (RNAi). Cell population-based studies deviate from conventional microscopy studies on a few cells, and could provide stronger statistical power for drawing experimental observations and conclusions. However, it is challenging to manually extract and quantify phenotypic changes from the large amounts of complex image data generated. Thus, bioimage informatics approaches are needed to rapidly and objectively quantify and analyze the image data. This paper provides an overview of the bioimage informatics challenges and approaches in image-based studies for drug and target discovery. The concepts and capabilities of image-based screening are first illustrated by a few practical examples investigating different kinds of phenotypic changes caused by drugs, compounds, or RNAi. The bioimage analysis approaches, including object detection, segmentation, and tracking, are then described. Subsequently, the quantitative features, phenotype identification, and multidimensional profile analysis for profiling the effects of drugs and targets are summarized. Moreover, a number of publicly available software packages for bioimage informatics are listed for further reference. It is expected that this review will help readers, including those without bioimage informatics expertise, understand the capabilities, approaches, and tools of bioimage informatics and apply them to advance their own studies.
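
    As a minimal sketch of the detect-segment-measure workflow described above (a hedged illustration using scikit-image; the threshold choice, size filter, and feature set are assumptions for illustration, not the chapter's method):

        import numpy as np
        from skimage import filters, measure, morphology

        def quantify_cells(image):
            """Detect, segment, and measure cells in one fluorescence channel."""
            mask = image > filters.threshold_otsu(image)      # global threshold
            mask = morphology.remove_small_objects(mask, min_size=50)
            labels = measure.label(mask)                      # one label per cell
            feats = [(r.label, r.area, r.eccentricity, r.mean_intensity)
                     for r in measure.regionprops(labels, intensity_image=image)]
            return labels, feats                              # per-cell feature table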

  3. A Computational Framework for Bioimaging Simulation

    Science.gov (United States)

    Watabe, Masaki; Arjunan, Satya N. V.; Fukushima, Seiya; Iwamoto, Kazunari; Kozuka, Jun; Matsuoka, Satomi; Shindo, Yuki; Ueda, Masahiro; Takahashi, Koichi

    2015-01-01

    Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in living cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks, and at understanding how higher-level functions emerge from the combined action of biomolecules. However, formidable challenges remain in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from the stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in bioimaging systems and hinder quantitative comparison between cell models and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units. PMID:26147508
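
    The gist of such a forward simulation can be sketched as follows. This is a generic toy model of the "systematic effects" mentioned above (Gaussian PSF blur, Poisson photon counting, camera offset and read noise), not the authors' framework; all parameter values are invented:

        import numpy as np
        from scipy.ndimage import gaussian_filter

        rng = np.random.default_rng(0)

        def render(fluorophores, psf_sigma=2.0, photons=200.0,
                   offset=100.0, read_sigma=2.0):
            """Toy forward model: PSF blur, then shot noise and camera noise."""
            expected = gaussian_filter(fluorophores * photons, psf_sigma)
            shot = rng.poisson(expected)                  # photon-counting statistics
            read = rng.normal(offset, read_sigma, expected.shape)
            return shot + read                            # digital image in counts

        truth = np.zeros((64, 64))
        truth[32, 32] = 1.0                               # one simulated molecule
        image = render(truth)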

  4. A Computational Framework for Bioimaging Simulation.

    Science.gov (United States)

    Watabe, Masaki; Arjunan, Satya N V; Fukushima, Seiya; Iwamoto, Kazunari; Kozuka, Jun; Matsuoka, Satomi; Shindo, Yuki; Ueda, Masahiro; Takahashi, Koichi

    2015-01-01

    Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in living cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks, and at understanding how higher-level functions emerge from the combined action of biomolecules. However, formidable challenges remain in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from the stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in bioimaging systems and hinder quantitative comparison between cell models and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units.

  5. A Computational Framework for Bioimaging Simulation.

    Directory of Open Access Journals (Sweden)

    Masaki Watabe

    Full Text Available Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in living cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks, and at understanding how higher-level functions emerge from the combined action of biomolecules. However, formidable challenges remain in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from the stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in bioimaging systems and hinder quantitative comparison between cell models and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units.

  6. Digital liver biopsy: Bio-imaging of fatty liver for translational and clinical research.

    Science.gov (United States)

    Mancini, Marcello; Summers, Paul; Faita, Francesco; Brunetto, Maurizia R; Callea, Francesco; De Nicola, Andrea; Di Lascio, Nicole; Farinati, Fabio; Gastaldelli, Amalia; Gridelli, Bruno; Mirabelli, Peppino; Neri, Emanuele; Salvadori, Piero A; Rebelos, Eleni; Tiribelli, Claudio; Valenti, Luca; Salvatore, Marco; Bonino, Ferruccio

    2018-02-27

    The rapidly growing field of functional, molecular and structural bio-imaging is providing an extraordinary new opportunity to overcome the limits of invasive liver biopsy and introduce a "digital biopsy" for the in vivo study of liver pathophysiology. To foster the application of bio-imaging in clinical and translational research, there is a need to standardize the methods of both acquisition and storage of liver bio-images. It is hoped that the combination of digital, liquid and histologic liver biopsies will provide an innovative synergistic tri-dimensional approach to identifying new aetiologies, diagnostic and prognostic biomarkers and therapeutic targets for the optimization of personalized therapy of liver diseases and liver cancer. A group of experts of different disciplines (the Special Interest Group for Personalized Hepatology of the Italian Association for the Study of the Liver, the Institute for Biostructures and Bio-imaging of the National Research Council and the Bio-banking and Biomolecular Resources Research Infrastructure) discussed criteria, methods and guidelines for facilitating the requisite data collection. This manuscript provides a multi-author review of the issue with a special focus on fatty liver.

  7. Multi-element bioimaging of Arabidopsis thaliana roots

    DEFF Research Database (Denmark)

    Persson, Daniel Olof; Chen, Anle; Aarts, Mark G.M.

    2016-01-01

    Better understanding of root function is central for the development of plants with more efficient nutrient uptake and translocation. We here present a method for multielement bioimaging at the cellular level in roots of the genetic model system Arabidopsis (Arabidopsis thaliana). Using conventio...... omics techniques. To demonstrate the potential of the method, we analyzed a mutant of Arabidopsis unable to synthesize the metal chelator nicotianamine. The mutant accumulated substantially more zinc and manganese than the wild type in the tissues surrounding the vascular cylinder. For iron, the images...... looked completely different, with iron bound mainly in the epidermis of the wild-type plants but confined to the cortical cell walls of the mutant. The method offers the power of inductively coupled plasma-mass spectrometry to be fully employed, thereby providing a basis for detailed studies of ion...

  8. Phase shifting white light interferometry using colour CCD for optical metrology and bio-imaging applications

    Science.gov (United States)

    Upputuri, Paul Kumar; Pramanik, Manojit

    2018-02-01

    Phase shifting white light interferometry (PSWLI) has been widely used for optical metrology applications because of its precision, reliability, and versatility. White light interferometry using a monochrome CCD makes the measurement process slow for metrology applications. WLI integrated with a Red-Green-Blue (RGB) CCD camera is finding imaging applications in the fields of optical metrology and bio-imaging. Wavelength-dependent refractive index profiles of biological samples were computed from colour white light interferograms. In recent years, whole-field refractive index profiles of red blood cells (RBCs), onion skin, fish cornea, etc. were measured from RGB interferograms. In this paper, we discuss the bio-imaging applications of colour CCD based white light interferometry. The approach makes the measurement faster, easier, cost-effective, and even dynamic by using single-fringe analysis methods, for industrial applications.
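
    For reference, the standard four-step phase-shifting calculation that such systems apply, here written per colour channel of an RGB interferogram, can be sketched as follows (generic textbook formula, not the authors' specific algorithm; the array layout is an assumption):

        import numpy as np

        def four_step_phase(i1, i2, i3, i4):
            """Wrapped phase from four frames shifted by 0, 90, 180, 270 degrees:
            I_k = A + B*cos(phase + delta_k)  =>  phase = atan2(I4 - I2, I1 - I3)."""
            return np.arctan2(i4 - i2, i1 - i3)           # wrapped to (-pi, pi]

        # With each frame an H x W x 3 RGB array (assumed layout), one call yields
        # one wrapped phase map per colour channel, i.e. per wavelength:
        # phase_rgb = four_step_phase(f0, f1, f2, f3)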

  9. Bio-imaging and visualization for patient-customized simulations

    CERN Document Server

    Luo, Xiongbiao; Li, Shuo

    2014-01-01

    This book contains the full papers presented at the MICCAI 2013 workshop Bio-Imaging and Visualization for Patient-Customized Simulations (MWBIVPCS 2013). MWBIVPCS 2013 brought together researchers representing several fields, such as Biomechanics, Engineering, Medicine, Mathematics, Physics and Statistics. The contributions included in this book present and discuss new trends in those fields, using several methods and techniques, including the finite element method, similarity metrics, optimization processes, graphs, hidden Markov models, sensor calibration, fuzzy logic, data mining, cellular automata, active shape models, template matching and level sets. These serve as tools to address more efficiently different and timely applications involving signal and image acquisition, image processing and analysis, image segmentation, image registration and fusion, computer simulation, image-based modelling, simulation and surgical planning, image-guided robot-assisted surgery and image-based diagnosis. This book...

  10. Functionalized Nanolipobubbles Embedded Within a Nanocomposite Hydrogel: a Molecular Bio-imaging and Biomechanical Analysis of the System.

    Science.gov (United States)

    Mufamadi, Maluta S; Choonara, Yahya E; Kumar, Pradeep; du Toit, Lisa C; Modi, Girish; Naidoo, Dinesh; Iyuke, Sunny E; Pillay, Viness

    2017-04-01

    The purpose of this study was to explore the use of molecular bio-imaging systems and biomechanical dynamics to elucidate the fate of a nanocomposite hydrogel system prepared by merging FITC-labeled nanolipobubbles within a cross-linked hydrogel network. The nanocomposite hydrogel system was characterized by size distribution analysis and zeta potential as well as shear thinning behavior, elastic modulus (G'), viscous loss modulus (G"), TEM, and FTIR. In addition, molecular bio-imaging via Vevo ultrasound and Cell-viZio techniques evaluated the stability and distribution of the nanolipobubbles within the cross-linked hydrogel. FITC-labeled and functionalized nanolipobubbles had particle sizes between 135 and 158 nm (PdI = 0.129 and 0.190) and a zeta potential of -34 mV. TEM and ultrasound imaging revealed the uniformity and dimensional stability of the functionalized nanolipobubbles pre- and post-embedment into the cross-linked hydrogel. The shear thinning behavior of the hydrogel was governed by the polymer concentration and the cross-linker, glutaraldehyde. Ultrasound analysis and Cell-viZio bio-imaging were highly suitable to visualize the fluorescent image-guided nanolipobubbles and their morphology post-embedment into the hydrogel to form the nanocomposite system. Since the nanocomposite is intended for targeted treatment of neurodegenerative disorders, the distribution of the functionalized nanolipobubbles into PC12 neuronal cells was also ascertained via confocal microscopy. Results demonstrated effective release and localization of the nanolipobubbles within PC12 neuronal cells. The molecular structure of the synthetic surface peptide remained intact for an extended period to ensure potency for targeted delivery from the hydrogel ex vivo. These findings provide further insight into the properties of nanocomposite hydrogels for specialized drug delivery.

  11. New horizons in biomagnetics and bioimaging

    International Nuclear Information System (INIS)

    Ueno, Shogo; Sekino, Masaki

    2006-01-01

    This paper reviews recently developed techniques in biomagnetics and bioimaging such as transcranial magnetic stimulation (TMS), magnetic resonance imaging (MRI), and cancer therapy based on magnetic stimulation. The technique of localized and vectorial TMS has made it possible to obtain non-invasive functional mapping of the human brain, and the development of new bioimaging technologies such as current distribution MRI and conductivity MRI may make it possible to understand the dynamics of brain functions, which include millisecond-level changes in functional regions and dynamic relations between brain neuronal networks. These techniques are leading medicine and biology toward new horizons through novel applications of magnetism. (author)

  12. Functional mesoporous silica nanoparticles for bio-imaging applications.

    Science.gov (United States)

    Cha, Bong Geun; Kim, Jaeyun

    2018-03-22

    Biomedical investigations using mesoporous silica nanoparticles (MSNs) have received significant attention because of their unique properties, including controllable mesoporous structure, high specific surface area, large pore volume, and tunable particle size. These unique features make MSNs suitable for simultaneous diagnosis and therapy, with unique advantages in encapsulating and loading a variety of therapeutic agents, delivering these agents to the desired location, and releasing the drugs in a controlled manner. Among various clinical areas, nanomaterials-based bio-imaging techniques have advanced rapidly with the development of diverse functional nanoparticles. Due to the unique features of MSNs, an imaging agent supported by MSNs can be a promising system for developing targeted bio-imaging contrast agents with high structural stability and enhanced functionality that enable imaging of various modalities. Here, we review recent achievements in the development of functional MSNs for bio-imaging applications, including optical imaging, magnetic resonance imaging (MRI), positron emission tomography (PET), computed tomography (CT), ultrasound imaging, and multimodal imaging for early diagnosis. With further improvement in noninvasive bio-imaging techniques, MSN-supported imaging agent systems are expected to contribute to clinical applications in the future. This article is categorized under: Diagnostic Tools > In Vivo Nanodiagnostics and Imaging; Nanotechnology Approaches to Biology > Nanoscale Systems in Biology. © 2018 Wiley Periodicals, Inc.

  13. Upconverting nanophosphors for bioimaging

    Energy Technology Data Exchange (ETDEWEB)

    Lim, Shuang Fang; Zhuo Rui [Department of MAE, Princeton University, Princeton, NJ 08544 (United States); Riehn, Robert [Department of Physics, North Carolina State University, Raleigh, NC 27695 (United States); Tung, Chih-kuan; Dalland, Joanna; Austin, Robert H [Department of Physics, Princeton University, Princeton, NJ 08544 (United States); Ryu, William S [Lewis-Sigler Institute for Integrative Genomics, Princeton University, Princeton, NJ 08544 (United States)

    2009-10-07

    Upconverting nanoparticles (UCNPs) when excited in the near-infrared (NIR) region display anti-Stokes emission, whereby the emitted photon is higher in energy than the excitation energy. The material system achieves this by converting two or more infrared photons into visible photons. The use of infrared excitation confers benefits for bioimaging because of its deeper penetration into biological tissues and the lack of autofluorescence. We demonstrate here sub-10 nm upconverting rare earth oxide UCNPs, synthesized by a combustion method, that can be stably suspended in water when amine-modified. The amine-modified UCNPs show specific surface immobilization onto patterned gold surfaces. Finally, the low toxicity of the UCNPs is verified by testing on the multicellular C. elegans nematode.

  14. Bioimage Informatics in the context of Drosophila research.

    Science.gov (United States)

    Jug, Florian; Pietzsch, Tobias; Preibisch, Stephan; Tomancak, Pavel

    2014-06-15

    Modern biological research relies heavily on microscopic imaging. The advanced genetic toolkit of Drosophila makes it possible to label molecular and cellular components with an unprecedented level of specificity, necessitating the application of the most sophisticated imaging technologies. Imaging in Drosophila spans all scales, from single molecules to entire populations of adult organisms, and from electron microscopy to live imaging of developmental processes. As imaging approaches become more complex and ambitious, there is an increasing need for quantitative, computer-mediated image processing and analysis to make sense of the imagery. Bioimage Informatics is an emerging research field that covers all aspects of biological image analysis, from data handling, through processing, to quantitative measurements, analysis and data presentation. Some of the most advanced, large-scale projects, combining cutting-edge imaging with complex bioimage informatics pipelines, are realized in the Drosophila research community. In this review, we discuss current research in biological image analysis specifically relevant to the type of systems-level image datasets that are uniquely available for the Drosophila model system. We focus on how state-of-the-art computer vision algorithms are impacting the ability of Drosophila researchers to analyze biological systems in space and time. We pay particular attention to how these algorithmic advances from computer science are made usable to practicing biologists through open source platforms and how biologists can themselves participate in their further development. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  15. A bio-image sensor for simultaneous detection of multi-neurotransmitters.

    Science.gov (United States)

    Lee, You-Na; Okumura, Koichi; Horio, Tomoko; Iwata, Tatsuya; Takahashi, Kazuhiro; Hattori, Toshiaki; Sawada, Kazuaki

    2018-03-01

    We report here a new bio-image sensor for simultaneous detection of the spatial and temporal distribution of multiple neurotransmitters. It consists of multiple enzyme-immobilized membranes on a 128 × 128 pixel array with read-out circuitry. Apyrase and acetylcholinesterase (AChE), as selective elements, are used to recognize adenosine 5'-triphosphate (ATP) and acetylcholine (ACh), respectively. To enhance the spatial resolution, hydrogen ion (H+) diffusion barrier layers are deposited on top of the bio-image sensor, and their ability to prevent H+ diffusion is demonstrated. The results are used to design the spacing among enzyme-immobilized pixels and the null H+ sensor so as to minimize undesired signal overlap caused by H+ diffusion. Using this bio-image sensor, we can obtain H+ diffusion-independent imaging of concentration gradients of ATP and ACh in real time. The sensing characteristics, such as sensitivity and limit of detection, are determined experimentally. With the proposed bio-image sensor, the possibility exists for customizable monitoring of the activities of various neurochemicals by using different kinds of proton-consuming or proton-generating enzymes. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Advances in Bio-Imaging From Physics to Signal Understanding Issues State-of-the-Art and Challenges

    CERN Document Server

    Racoceanu, Daniel; Gouaillard, Alexandre

    2012-01-01

    Advances in imaging devices and image processing stem from cross-fertilization between many fields of research, such as Chemistry, Physics, Mathematics and Computer Science. The BioImaging community feels the urge to integrate its various results, discoveries and innovations more intensively into ready-to-use tools that can address all the new, exciting challenges that Life Scientists (biologists, medical doctors, ...) keep providing, almost on a daily basis. Devising innovative chemical probes, for example, is an archetypal goal in which image quality improvement must be driven by the physics of acquisition, the image processing and analysis algorithms, and the chemical skills, in order to design an optimal bioprobe. This book offers an overview of current advances in many research fields related to bioimaging and highlights the current limitations that would need to be addressed in the next decade to design fully integrated bioimaging devices.

  17. An Overview of data science uses in bioimage informatics.

    Science.gov (United States)

    Chessel, Anatole

    2017-02-15

    This review aims at providing a practical overview of the use of statistical features and associated data science methods in bioimage informatics. To achieve a quantitative link between images and biological concepts, one typically replaces an object coming from an image (a segmented cell or intracellular object, a pattern of expression or localisation, even a whole image) by a vector of numbers. These range from carefully crafted, biologically relevant measurements to features learnt through deep neural networks. This replacement allows the use of practical algorithms for visualisation, comparison and inference, such as those from machine learning or multivariate statistics. While originating, in biology, mainly in high-content screening, these methods are integral to the use of data science for the quantitative analysis of microscopy images to gain biological insight, and they are sure to gather more interest as the need to make sense of the increasing amount of acquired imaging data grows more pressing. Copyright © 2017 Elsevier Inc. All rights reserved.
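
    A minimal sketch of the "object to feature vector" replacement described above, using hand-crafted scikit-image region properties followed by PCA for visualisation (the library choices and feature set are assumptions for illustration; a learnt embedding could be swapped in):

        import numpy as np
        from skimage import measure
        from sklearn.decomposition import PCA

        def objects_to_vectors(labels, image):
            """Replace each segmented object by a vector of hand-crafted features."""
            props = measure.regionprops(labels, intensity_image=image)
            return np.array([[p.area, p.perimeter, p.eccentricity,
                              p.mean_intensity, p.solidity] for p in props])

        # X = objects_to_vectors(labels, image)        # one row per object
        # xy = PCA(n_components=2).fit_transform(X)    # 2-D map for visualisation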

  18. Upconverting and NIR emitting rare earth based nanostructures for NIR-bioimaging

    Science.gov (United States)

    Hemmer, Eva; Venkatachalam, Nallusamy; Hyodo, Hiroshi; Hattori, Akito; Ebina, Yoshie; Kishimoto, Hidehiro; Soga, Kohei

    2013-11-01

    In recent years, significant progress was achieved in the field of nanomedicine and bioimaging, but the development of new biomarkers for reliable detection of diseases at an early stage, molecular imaging, targeting and therapy remains crucial. The disadvantages of commonly used organic dyes include photobleaching, autofluorescence, phototoxicity and scattering when UV (ultraviolet) or visible light is used for excitation. The limited penetration depth of the excitation light and the visible emission into and from the biological tissue is a further drawback with regard to in vivo bioimaging. Lanthanide containing inorganic nanostructures emitting in the near-infrared (NIR) range under NIR excitation may overcome those problems. Due to the outstanding optical and magnetic properties of lanthanide ions (Ln3+), nanoscopic host materials doped with Ln3+, e.g. Y2O3:Er3+,Yb3+, are promising candidates for NIR-NIR bioimaging. Ln3+-doped gadolinium-based inorganic nanostructures, such as Gd2O3:Er3+,Yb3+, have a high potential as opto-magnetic markers allowing the combination of time-resolved optical imaging and magnetic resonance imaging (MRI) of high spatial resolution. Recent progress in our research on over-1000 nm NIR fluorescent nanoprobes for in vivo NIR-NIR bioimaging will be discussed in this review.

  19. Biomagnetics and bioimaging for medical applications

    Energy Technology Data Exchange (ETDEWEB)

    Ueno, Shoogo [Department of Biomedical Engineering, Graduate School of Medicine, University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-0033 (Japan)]. E-mail: ueno@medes.m.u-tokyo.ac.jp; Sekino, Masaki [Department of Biomedical Engineering, Graduate School of Medicine, University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-0033 (Japan)

    2006-09-15

    This paper reviews medical applications of the recently developed techniques in biomagnetics and bioimaging such as transcranial magnetic stimulation, magnetoencephalography, magnetic resonance imaging, cancer therapy based on magnetic stimulation, and magnetic control of cell orientation and cell growth. These techniques are leading medicine and biology into a new horizon through the novel applications of magnetism.

  20. Biomagnetics and bioimaging for medical applications

    International Nuclear Information System (INIS)

    Ueno, Shoogo; Sekino, Masaki

    2006-01-01

    This paper reviews medical applications of the recently developed techniques in biomagnetics and bioimaging such as transcranial magnetic stimulation, magnetoencephalography, magnetic resonance imaging, cancer therapy based on magnetic stimulation, and magnetic control of cell orientation and cell growth. These techniques are leading medicine and biology into a new horizon through the novel applications of magnetism.

  1. Nanodiamonds and silicon quantum dots: ultrastable and biocompatible luminescent nanoprobes for long-term bioimaging.

    Science.gov (United States)

    Montalti, M; Cantelli, A; Battistelli, G

    2015-07-21

    Fluorescence bioimaging is a powerful, versatile method for investigating, both in vivo and in vitro, the complex structures and functions of living organisms in real time and space, including with super-resolution techniques. Being poorly invasive, fluorescence bioimaging is suitable for long-term observation of biological processes. Long-term detection is, however, partially prevented by photobleaching of organic fluorescent probes. Semiconductor quantum dots, in contrast, are ultrastable fluorescent contrast agents detectable even at the single-nanoparticle level. The emission color of quantum dots is size-dependent, and nanoprobes emitting in the near-infrared (NIR) region are ideal for low-background in vivo imaging. The biocompatibility of nanoparticles containing toxic elements is debated. Recent safety concerns have enforced the search for alternative ultrastable luminescent nanoprobes. The most recent results demonstrated that optimized silicon quantum dots (Si QDs) and fluorescent nanodiamonds (FNDs) show almost no photobleaching in a physiological environment. Moreover, in vitro and in vivo toxicity studies demonstrated their unique biocompatibility. Si QDs and FNDs are hence ideal diagnostic tools and promising non-toxic vectors for the delivery of therapeutic cargos. The most relevant examples of applications of Si QDs and FNDs to long-term bioimaging are discussed in this review, comparing the toxicity and stability of different nanoprobes.

  2. Bioimaging of teeth and their surrounding tissues and biofilm

    DEFF Research Database (Denmark)

    Dige, Irene; Spin-Neto, Rubens; Kraft, David Christian Evar

    At the Department of Dentistry and Oral Health, bioimaging is a central part of our research on dental tissues and diseases of the oral cavity. We conduct research on understanding, preventing, and treating such diseases, and there has been a strategic focus on the image-based investigation...... of clinical problems. For example, because of the etiological role of biofilms in many diseases, including dental caries and periodontitis, we have investigated biofilm ecology combining newer molecular techniques such as Confocal Laser Scanning Microscopy (CLSM) and fluorescence techniques. These methods...

  3. Semantic segmentation of bioimages using convolutional neural networks

    CSIR Research Space (South Africa)

    Wiehman, S

    2016-07-01

    Full Text Available Convolutional neural networks have shown great promise in both general image segmentation and bioimage segmentation. In this paper, the application of different convolutional network architectures is explored on the C. elegans live...

  4. [Gender equality activity in the Bioimaging Society].

    Science.gov (United States)

    Suzaki, Etsuko

    2013-09-01

    Gender equality activity in the Bioimaging Society was initiated in 2005, when it joined the Japan Inter-Society Liaison Association Committee for Promoting Equal Participation of Men and Women in Science and Engineering (EPMEWSE). The Gender Equality Committee of the Bioimaging Society is acting on this issue by following the policy of the EPMEWSE, and has also been planning and conducting lectures at annual meetings of the society to gain the understanding, consent, and cooperation of the society's members and to make them conscious of gender equality. Women's participation in the society has been promoted through the activities of the Gender Equality Committee, and the number of women officers in the society has increased from two out of 40 members in 2005 to five out of 44 in 2013. The activities of the Gender Equality Committee of the Japanese Association of Anatomists (JAA) have just started. More than 400 women belong to the JAA. When these women members join together and collaborate, women's participation in the JAA will increase.

  5. Bio-imaging of colorectal cancer models using near infrared labeled epidermal growth factor.

    Directory of Open Access Journals (Sweden)

    Gadi Cohen

    Full Text Available Novel strategies that target the epidermal growth factor receptor (EGFR) have led to the clinical development of monoclonal antibodies, which treat metastatic colorectal cancer (mCRC), but only subgroups of patients with wild-type KRAS and increased EGFR gene copy number respond to these agents. Furthermore, resistance to EGFR blockade inevitably occurs, making future therapy difficult. Novel bio-imaging (BOI) methods may assist in the quantification of EGFR in mCRC tissue, thus complementing immunohistochemistry methodology in guiding the future treatment of these patients. The aim of the present study was to explore the usefulness of near infrared-labeled EGF (EGF-NIR) for bio-imaging of CRC using in vitro and in vivo orthotopic tumor CRC models and ex vivo human CRC tissues. We describe the preparation and characterization of EGF-NIR and investigate binding, using BOI, of a panel of CRC cell culture models resembling the heterogeneity of human CRC tissues. EGF-NIR was specifically and selectively bound by EGFR-expressing CRC cells, and the intensity of the EGF-NIR signal-to-background ratio (SBR) reflected EGFR levels; dose-response and time course imaging experiments provided optimal conditions for quantification of EGFR levels by BOI. EGF-NIR imaging of mice with HT-29 orthotopic CRC tumors indicated that EGF-NIR is more slowly cleared from the tumor, and the highest SBR between tumor and normal adjacent tissue was achieved two days post-injection. Furthermore, images of dissected tissues demonstrated accumulation of EGF-NIR in the tumor and liver. EGF-NIR specifically and strongly labeled EGFR-positive human CRC tissues, while adjacent CRC tissue and EGFR-negative tissues showed weak NIR signals. This study emphasizes the use of EGF-NIR for preclinical studies. Combined with other methods, EGF-NIR could provide an additional specific bio-imaging tool in the standardization of measurements of EGFR expression in CRC tissues.
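
    The signal-to-background ratio used above has a simple operational form; a hedged sketch (generic ROI-based definition, assumed to match the paper's usage; function and variable names are illustrative):

        import numpy as np

        def signal_to_background(image, tumor_mask, background_mask):
            """SBR: mean NIR intensity inside the tumor ROI divided by the mean
            intensity in an adjacent normal-tissue ROI (boolean masks)."""
            return image[tumor_mask].mean() / image[background_mask].mean()

        # sbr = signal_to_background(nir_image, tumor_roi, normal_roi)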

  6. Quantum dots in bio-imaging: Revolution by the small

    International Nuclear Information System (INIS)

    Arya, Harinder; Kaul, Zeenia; Wadhwa, Renu; Taira, Kazunari; Hirano, Takashi; Kaul, Sunil C.

    2005-01-01

    Visual analysis of biomolecules is an integral avenue of basic and applied biological research. It has been widely carried out by tagging of nucleotides and proteins with traditional fluorophores that are limited in their application by features such as photobleaching, spectral overlaps, and operational difficulties. Quantum dots (QDs) are emerging as a superior alternative and are poised to change the world of bio-imaging and further its applications in basic and applied biology. The interdisciplinary field of nanobiotechnology is experiencing a revolution and QDs as an enabling technology have become a harbinger of this hybrid field. Within a decade, research on QDs has evolved from being a pure science subject to the one with high-end commercial applications

  7. Advanced bioimaging technologies in assessment of the quality of bone and scaffold materials. Techniques and applications

    International Nuclear Information System (INIS)

    Qin Ling; Leung, Kwok Sui; Griffith, J.F.

    2007-01-01

    This book provides a perspective on the current status of bioimaging technologies developed to assess the quality of musculoskeletal tissue, with an emphasis on bone and cartilage. It offers evaluations of scaffold biomaterials developed for enhancing the repair of musculoskeletal tissues. These bioimaging techniques include micro-CT, nano-CT, pQCT/QCT, MRI, and ultrasound, which provide not only 2-D and 3-D images of the related organs or tissues, but also quantifications of the relevant parameters. The advanced bioimaging technologies developed for the above applications are also extended by incorporating imaging contrast-enhancement materials. Thus, this book provides a unique platform for multidisciplinary collaborations in education and joint R and D among various professions, including biomedical engineering, biomaterials, and basic and clinical medicine. (orig.)

  8. Nanodiamonds for optical bioimaging

    Energy Technology Data Exchange (ETDEWEB)

    Hui, Yuen Yung; Chang, Huan-Cheng [Institute of Atomic and Molecular Sciences, Academia Sinica, Taipei 10617, Taiwan (China); Cheng, Chia-Liang, E-mail: yyhui@pub.iams.sinica.edu.t, E-mail: clcheng@mail.ndhu.edu.t, E-mail: hchang@gate.sinica.edu.t [Department of Physics, National Dong-Hwa University, Hualien 97401, Taiwan (China)

    2010-09-22

    Diamond has received increasing attention for its promising biomedical applications. The material is highly biocompatible and can be easily conjugated with bioactive molecules. Recently, nanoscale diamond has been applied as light scattering labels and luminescent optical markers. The luminescence, arising from photoexcitation of colour centres, can be substantially enhanced when type Ib diamond nanocrystals are bombarded by a high-energy particle beam and then annealed to form negatively charged nitrogen-vacancy centres. The centre absorbs strongly at 560 nm, fluoresces efficiently in the far-red region and is exceptionally photostable (without photoblinking and photobleaching). It is an ideal candidate for long-term imaging and tracking in complex cellular environments. This review summarizes recent advances in the development of fluorescent nanodiamonds for optical bioimaging with single particle sensitivity and nanometric resolution.

  9. Nanodiamonds for optical bioimaging

    International Nuclear Information System (INIS)

    Hui, Yuen Yung; Chang, Huan-Cheng; Cheng, Chia-Liang

    2010-01-01

    Diamond has received increasing attention for its promising biomedical applications. The material is highly biocompatible and can be easily conjugated with bioactive molecules. Recently, nanoscale diamond has been applied as light scattering labels and luminescent optical markers. The luminescence, arising from photoexcitation of colour centres, can be substantially enhanced when type Ib diamond nanocrystals are bombarded by a high-energy particle beam and then annealed to form negatively charged nitrogen-vacancy centres. The centre absorbs strongly at 560 nm, fluoresces efficiently in the far-red region and is exceptionally photostable (without photoblinking and photobleaching). It is an ideal candidate for long-term imaging and tracking in complex cellular environments. This review summarizes recent advances in the development of fluorescent nanodiamonds for optical bioimaging with single particle sensitivity and nanometric resolution.

  10. Neodymium-doped nanoparticles for infrared fluorescence bioimaging: The role of the host

    Energy Technology Data Exchange (ETDEWEB)

    Rosal, Blanca del; Pérez-Delgado, Alberto; Rocha, Ueslen; Martín Rodríguez, Emma; Jaque, Daniel, E-mail: daniel.jaque@uam.es [Fluorescence Imaging Group, Dpto. de Física de Materiales, Facultad de Ciencias, Universidad Autónoma de Madrid, Campus de Cantoblanco, Madrid 28049 (Spain); Misiak, Małgorzata; Bednarkiewicz, Artur [Wroclaw Research Centre EIT+, ul. Stabłowicka 147, 54-066 Wrocław (Poland); Institute of Physics, University of Tartu, 14c Ravila Str., 50411 Tartu (Estonia); Vanetsev, Alexander S. [Institute of Low Temperature and Structure Research, PAS, ul. Okólna 2, 50-422 Wrocław (Poland); Orlovskii, Yurii [Institute of Low Temperature and Structure Research, PAS, ul. Okólna 2, 50-422 Wrocław (Poland); Prokhorov General Physics Institute RAS, 38 Vavilov Str., 119991 Moscow (Russian Federation); Jovanović, Dragana J.; Dramićanin, Miroslav D. [Vinča Institute of Nuclear Sciences, University of Belgrade, P.O. Box 522, Belgrade 11001 (Serbia); Upendra Kumar, K.; Jacinto, Carlos [Grupo de Fotônica e Fluidos Complexos, Instituto de Física, Universidade Federal de Alagoas, 57072-900 Maceió-AL (Brazil); Navarro, Elizabeth [Depto. de Química, Eco Catálisis, UAM-Iztapalapa, Sn. Rafael Atlixco 186, México 09340, D.F (Mexico); and others

    2015-10-14

    The spectroscopic properties of different infrared-emitting neodymium-doped nanoparticles (LaF3:Nd3+, SrF2:Nd3+, NaGdF4:Nd3+, NaYF4:Nd3+, KYF4:Nd3+, GdVO4:Nd3+, and Nd:YAG) have been systematically analyzed. A comparison of the spectral shapes of both emission and absorption spectra is presented, from which the relevant role played by the host matrix is evidenced. The lack of a "universal" optimum system for infrared bioimaging is discussed, as the specific bioimaging application and the experimental setup for infrared imaging determine the neodymium-doped nanoparticle to be preferentially used in each case.

  11. Fluorescent magnetic hybrid nanoprobe for multimodal bioimaging

    Energy Technology Data Exchange (ETDEWEB)

    Koktysh, Dmitry [Department of Chemistry, Vanderbilt University, Station B 351822, Nashville, TN 37235 (United States); Bright, Vanessa; Pham, Wellington, E-mail: dmitry.koktysh@vanderbilt.edu, E-mail: wellington.pham@vanderbilt.edu [Institute of Imaging Science, Vanderbilt University, 1161 21st Avenue South AA, 1105 MCN, Nashville, TN 37232 (United States)

    2011-07-08

    A fluorescent magnetic hybrid imaging nanoprobe (HINP) was fabricated by the conjugation of superparamagnetic Fe3O4 nanoparticles and visible-light-emitting (~600 nm) fluorescent CdTe/CdS quantum dots (QDs). The assembly strategy used the covalent linking of the oxidized dextran shell of the magnetic particles to the glutathione ligands of the QDs. The synthesized HINP formed stable water-soluble colloidal dispersions. The structure and properties of the particles were characterized by transmission electron and atomic force microscopy, energy-dispersive X-ray analysis and inductively coupled plasma optical emission spectroscopy, dynamic light scattering analysis, optical absorption and photoluminescence spectroscopy, and fluorescent imaging. The luminescence imaging region of the nanoprobe was extended to the near-infrared (NIR) (~800 nm) by conjugation of the superparamagnetic nanoparticles with synthesized CdHgTe/CdS QDs. The cadmium- and mercury-based QDs in HINP can be easily replaced by novel water-soluble glutathione-stabilized AgInS2/ZnS QDs to present a new class of cadmium-free multimodal imaging agents. The observed NIR photoluminescence of the fluorescent magnetic nanocomposites supports their use for bioimaging. The developed HINP provides dual imaging channels for simultaneous optical and magnetic resonance imaging.

  12. Yb3+,Er3+,Eu3+-codoped YVO4 material for bioimaging with dual mode excitation

    International Nuclear Information System (INIS)

    Thao, Chu Thi Bich; Huy, Bui The; Sharipov, Mirkomil; Kim, Jin-Ik; Dao, Van-Duong; Moon, Ja-Young; Lee, Yong-Ill

    2017-01-01

    We propose an efficient bioimaging strategy using Yb3+,Er3+,Eu3+-triplet-doped YVO4 nanoparticles synthesized with a polymer as a template. The obtained particles are nanoscale, uniform, and offer flexible excitation. The effect of Eu3+ ions on the luminescence properties of YVO4:Yb3+,Er3+,Eu3+ was investigated, and the upconversion mechanism of the prepared material is also discussed. The structure and optical properties of the prepared material were characterized by X-ray diffraction (XRD), Fourier-transform IR spectroscopy (FTIR), scanning electron microscopy (SEM), transmission electron microscopy (TEM), and upconversion and photoluminescence spectra. The Commission Internationale de l'Eclairage (CIE) chromaticity coordinates were investigated to confirm the performance of the color luminescent emission. The prepared YVO4:Yb3+,Er3+,Eu3+ nanoparticles could be easily dispersed in water by surface modification with cysteine (Cys) and glutathione (GSH). The aqueous dispersion of the modified YVO4:Yb3+,Er3+,Eu3+ exhibits bright upconversion and downconversion luminescence and has been applied for bioimaging of HeLa cells. Our developed material with dual-mode excitation offers a promising advance in bioimaging. - Highlights: • The prepared particles possess nanoscale size and uniformity. • The material exhibits strong emission under dual-mode excitation. • The surface-modified material has been applied for bioimaging of HeLa cells. • Low cytotoxicity, no auto-fluorescence

  13. Silicon nanomaterials platform for bioimaging, biosensing, and cancer therapy.

    Science.gov (United States)

    Peng, Fei; Su, Yuanyuan; Zhong, Yiling; Fan, Chunhai; Lee, Shuit-Tong; He, Yao

    2014-02-18

    Silicon nanomaterials are an important class of nanomaterials with great potential for technologies including energy, catalysis, and biotechnology because of their many attractive properties, including biocompatibility, abundance, and unique electronic, optical, and mechanical properties. Silicon nanomaterials are known to have little or no toxicity owing to the favorable biocompatibility of silicon, which is an important precondition for biological and biomedical applications. In addition, the huge surface-to-volume ratios of silicon nanomaterials are responsible for their unique optical, mechanical, or electronic properties, which offer exciting opportunities for the design of high-performance silicon-based functional nanoprobes, nanosensors, and nanoagents for biological analysis, detection, and disease treatment. Moreover, silicon is the second most abundant element (after oxygen) on earth, providing plentiful and inexpensive resources for large-scale and low-cost preparation of silicon nanomaterials for practical applications. Because of these attractive traits, and in parallel with a growing interest in their design and synthesis, silicon nanomaterials are extensively investigated for wide-ranging applications, including energy, catalysis, optoelectronics, and biology. Among them, bioapplications of silicon nanomaterials are of particular interest. In the past decade, scientists have made an extensive effort to construct a silicon nanomaterials platform for various biological and biomedical applications, such as biosensors, bioimaging, and cancer treatment, as new and powerful tools for disease diagnosis and therapy. Nonetheless, there are few review articles covering these important and promising achievements to promote the awareness of the development of silicon nanobiotechnology. In this Account, we summarize recent representative works to highlight the recent developments of silicon functional nanomaterials for a new, powerful platform for biological and

  14. The Intersection of CMOS Microsystems and Upconversion Nanoparticles for Luminescence Bioimaging and Bioassays

    Directory of Open Access Journals (Sweden)

    Liping Wei

    2014-09-01

    Full Text Available Organic fluorophores and quantum dots are ubiquitous as contrast agents for bio-imaging and as labels in bioassays to enable the detection of biological targets and processes. Upconversion nanoparticles (UCNPs) offer a different set of opportunities as labels in bioassays and for bioimaging. UCNPs are excited at near-infrared (NIR) wavelengths, where biological molecules are optically transparent, and they luminesce in the visible and ultraviolet (UV) wavelength range, which is suitable for detection using complementary metal-oxide-semiconductor (CMOS) technology. These nanoparticles provide multiple sharp emission bands, long lifetimes, tunable emission, high photostability, and low cytotoxicity, which render them particularly useful for bio-imaging applications and multiplexed bioassays. This paper surveys several key concepts surrounding upconversion nanoparticles and the systems that detect and process the corresponding luminescence signals. The principle of photon upconversion, tuning of emission wavelengths, UCNP bioassays, and UCNP time-resolved techniques are described. Electronic readout systems for signal detection and processing suitable for UCNP luminescence using CMOS technology are discussed. This includes recent progress in miniaturized detectors, integrated spectral sensing, and high-precision time-domain circuits. Emphasis is placed on the physical attributes of UCNPs that map strongly to the technical features that CMOS devices excel in delivering, exploring the interoperability between the two technologies.
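
    The time-resolved techniques mentioned above exploit the long UCNP luminescence lifetimes: gating the detector open only after a delay rejects short-lived autofluorescence. A toy numerical sketch of this idea follows; the lifetimes, gate delay, and sampling are invented values, not figures from the paper:

        import numpy as np

        t = np.linspace(0.0, 2e-3, 2000)                  # time axis in seconds
        ucnp = np.exp(-t / 200e-6)                        # ~200 us lifetime (assumed)
        autofluor = np.exp(-t / 5e-9)                     # ~ns-scale background
        gate = t > 20e-6                                  # open detector after 20 us

        # The gated integral keeps nearly all UCNP signal and essentially none
        # of the fast autofluorescence background.
        print(np.trapz(ucnp[gate], t[gate]), np.trapz(autofluor[gate], t[gate]))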

  15. A microwave-assisted solution combustion synthesis to produce europium-doped calcium phosphate nanowhiskers for bioimaging applications.

    Science.gov (United States)

    Wagner, Darcy E; Eisenmann, Kathryn M; Nestor-Kalinoski, Andrea L; Bhaduri, Sarit B

    2013-09-01

    Biocompatible nanoparticles possessing fluorescent properties offer attractive possibilities for multifunctional bioimaging and/or drug and gene delivery applications. Many of the limitations with current imaging systems center on the properties of the optical probes in relation to equipment technical capabilities. Here we introduce a novel high aspect ratio and highly crystalline europium-doped calcium phosphate nanowhisker produced using a simple microwave-assisted solution combustion synthesis method for use as a multifunctional bioimaging probe. X-ray diffraction confirmed the material phase as europium-doped hydroxyapatite. Fluorescence emission and excitation spectra and their corresponding peaks were identified using spectrofluorimetry and validated with fluorescence, confocal and multiphoton microscopy. The nanowhiskers were found to exhibit red and far red wavelength fluorescence under ultraviolet excitation with an optimal peak emission of 696 nm achieved with a 350 nm excitation. Relatively narrow emission bands were observed, which may permit their use in multicolor imaging applications. Confocal and multiphoton microscopy confirmed that the nanoparticles provide sufficient intensity to be utilized in imaging applications. Copyright © 2013 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.

  16. Hybridization chain reaction: a versatile molecular tool for biosensing, bioimaging, and biomedicine.

    Science.gov (United States)

    Bi, Sai; Yue, Shuzhen; Zhang, Shusheng

    2017-07-17

    Developing powerful, simple and low-cost DNA amplification techniques is of great significance to bioanalysis and biomedical research. Thus far, many signal amplification strategies have been developed, such as polymerase chain reaction (PCR), rolling circle amplification (RCA), and DNA strand displacement amplification (SDA). In particular, hybridization chain reaction (HCR), a type of toehold-mediated strand displacement (TMSD) reaction, has attracted great interest because of its enzyme-free nature, isothermal conditions, simple protocols, and excellent amplification efficiency. In a typical HCR, an analyte initiates the cross-opening of two DNA hairpins, yielding nicked double helices that are analogous to alternating copolymers. As an efficient amplification platform, HCR has been utilized for the sensitive detection of a wide variety of analytes, including nucleic acids, proteins, small molecules, and cells. In recent years, more complicated sets of monomers have been designed to develop nonlinear HCR, such as branched HCR and even dendritic systems, achieving quadratic and exponential growth mechanisms. In addition, HCR has attracted enormous attention in the fields of bioimaging and biomedicine, including applications in fluorescence in situ hybridization (FISH) imaging, live cell imaging, and targeted drug delivery. In this review, we introduce the fundamentals of HCR and examine the visualization and analysis techniques for HCR products in detail. The most recent HCR developments in biosensing, bioimaging, and biomedicine are subsequently discussed with selected examples. Finally, the review provides insight into the challenges and future perspectives of HCR.

  17. Three-photon-excited luminescence from unsymmetrical cyanostilbene aggregates: morphology tuning and targeted bioimaging.

    Science.gov (United States)

    Mandal, Amal Kumar; Sreejith, Sivaramapanicker; He, Tingchao; Maji, Swarup Kumar; Wang, Xiao-Jun; Ong, Shi Li; Joseph, James; Sun, Handong; Zhao, Yanli

    2015-05-26

    We report an experimental observation of aggregation-induced enhanced luminescence upon three-photon excitation in aggregates formed from a class of unsymmetrical cyanostilbene derivatives. Changing the side chains (-CH3, -C6H13, -C7H15O3, and folic acid) attached to the cyanostilbene core leads to instantaneous formation of aggregates with sizes ranging from the micrometer to the nanometer scale in aqueous conditions. The crystal structure of a derivative with a methyl side chain reveals the planarization of the unsymmetrical cyanostilbene core, causing luminescence from the corresponding aggregates upon three-photon excitation. Furthermore, folic acid-attached cyanostilbene forms well-dispersed spherical nanoaggregates that show a high three-photon cross-section of 6.0 × 10⁻⁸⁰ cm⁶ s² photon⁻² and high luminescence quantum yield in water. To demonstrate the targeted bioimaging capability of the nanoaggregates, three cell lines (the HEK293 healthy cell line, the MCF7 cancerous cell line, and the HeLa cancerous cell line) were employed for the investigations on the basis of their different folate receptor expression levels. Two kinds of nanoaggregates, with and without the folic acid targeting ligand, were chosen for three-photon bioimaging studies. The viability of the three cell types incubated with a high concentration of nanoaggregates remained above 70% after 24 h. It was observed that the nanoaggregates without the folic acid unit did not undergo endocytosis by either the healthy or the cancerous cell lines. No obvious endocytosis of the folic acid-attached nanoaggregates was observed for the HEK293 and MCF7 cell lines, which have a low expression of the folate receptor. Interestingly, a significant amount of endocytosis and internalization of the folic acid-attached nanoaggregates was observed in HeLa cells, which have a high expression of the folate receptor, under three-photon excitation, indicating targeted bioimaging of folic acid-attached nanoaggregates to the cancer cell line

  18. A novel fluorescence probe based on triphenylamine Schiff base for bioimaging and responding to pH and Fe3+

    International Nuclear Information System (INIS)

    Wang, Lei; Yang, Xiaodong; Chen, Xiuli; Zhou, Yuping; Lu, Xiaodan; Yan, Chenggong; Xu, Yikai; Liu, Ruiyuan; Qu, Jinqing

    2017-01-01

    A novel fluorescence probe 1 based on triphenylamine was synthesized and characterized by NMR, IR, high-resolution mass spectrometry, and elemental analysis. Its fluorescence was quenched when the pH was below 2, and there was a linear relationship between the fluorescence intensity and pH over the range 2 to 7. Its fluorescence emission was reversible in acidic and alkaline solution. Furthermore, it exhibited remarkable selectivity and high sensitivity to Fe3+ and was able to detect Fe3+ in aqueous solution with a low detection limit of 0.511 μM. A Job plot showed that the binding stoichiometry of 1 with Fe3+ was 1:1. Further observations from 1H NMR titration suggested that the coordination interaction between Fe3+ and the nitrogen atom of the C=N bond promoted an intramolecular charge transfer (ICT) or energy transfer process, causing fluorescence quenching. Additionally, 1 could also be applied to the detection of Fe3+ in living cells and to bioimaging. - Graphical abstract: The triphenylamine-based fluorescence probe can detect pH and Fe3+ simultaneously in aqueous solution and can be applied to the detection of Fe3+ in living cells and to bioimaging. - Highlights: • The fluorescence probe is sensitive to pH under strongly acidic conditions. • The fluorescence chemosensor can detect pH and Fe3+ simultaneously. • The recognition can be carried out in aqueous solution. • The probe can also be applied to the detection of Fe3+ in living cells and to bioimaging. • The sensor is synthesized easily in one step.
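
    For context, detection limits like the 0.511 μM figure above are commonly obtained with the 3σ/slope convention from a linear calibration. A hedged sketch follows; the calibration points and blank standard deviation are invented for illustration and are not the paper's data:

        import numpy as np

        # Invented calibration: fluorescence intensity vs [Fe3+] in uM.
        conc = np.array([0.0, 1.0, 2.0, 4.0, 8.0])
        signal = np.array([1000.0, 941.0, 884.0, 768.0, 537.0])

        slope, intercept = np.polyfit(conc, signal, 1)    # linear calibration fit
        blank_sd = 12.0                                   # sd of blank replicates (assumed)
        lod = 3.0 * abs(blank_sd / slope)                 # 3-sigma/slope convention
        print(f"LOD ~ {lod:.2f} uM")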

  19. Bioimaging mass spectrometry of trace elements – recent advance and applications of LA-ICP-MS: A review

    Energy Technology Data Exchange (ETDEWEB)

    Becker, J.Sabine, E-mail: s.becker@fz-juelich.de [Central Institute for Engineering, Electronics and Analytics (ZEA-3), Forschungszentrum Jülich, Jülich D-52425 (Germany); Matusch, Andreas, E-mail: a.matusch@fz-juelich.de [Institute for Neuroscience and Medicine (INM-2), Forschungszentrum Jülich, Jülich D-52425 (Germany); Wu, Bei, E-mail: b.wu@fz-juelich.de [Central Institute for Engineering, Electronics and Analytics (ZEA-3), Forschungszentrum Jülich, Jülich D-52425 (Germany)

    2014-07-04

    Highlights: • Bioimaging LA-ICP-MS is established for trace metals within biomedical specimens. • Trace metal imaging allows brain function and neurodegenerative diseases to be studied. • Laser microdissection ICP-MS was applied to mouse brain hippocampus and wheat root. - Abstract: Bioimaging using laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) offers the capability to quantify trace elements and isotopes within tissue sections with a spatial resolution ranging from about 10 to 100 μm. Distribution analysis helps clarify basic questions of biomedical research and enables bioaccumulation and bioavailability studies for ecological and toxicological risk assessment in humans, animals, and plants. Major application fields of mass spectrometry imaging (MSI) and metallomics have been brain and cancer research, animal model validation, drug development, and plant science. Here we give an overview of the latest achievements in methods and applications. Recent improvements in ablation systems, operation, and cell design have enabled progressively better spatial resolutions, down to 1 μm. Meanwhile, a body of research has accumulated covering basic principles of the element architecture in animals and plants that could be reproduced consistently by several laboratories, such as the distribution of Fe, Cu, and Zn in rodent brain. Several studies investigated the distribution and delivery of metallo-drugs in animals. Hyper-accumulating plants and pollution indicator organisms have been the key topics in environmental science. Increasingly, larger series of samples are analyzed, be it in the frame of comparisons between intervention and control groups, of time kinetics, or of three-dimensional atlas approaches.
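    To make the quantification step concrete, raw ion-count maps are converted pixel by pixel into concentration maps using a calibration against matrix-matched standards. The following minimal sketch assumes a linear calibration; the data and calibration values are synthetic stand-ins, not material from the review.

        import numpy as np

        rng = np.random.default_rng(0)

        # Stand-in for a raw ion-count map of one isotope (e.g. 56Fe) assembled
        # from LA-ICP-MS line scans of a tissue section (synthetic data here).
        counts = rng.poisson(lam=500.0, size=(64, 64)).astype(float)

        # Assumed linear calibration from matrix-matched standards:
        #   counts = slope * concentration + background
        slope, background = 125.0, 40.0  # counts per (ug/g); blank counts

        # Pixel-wise conversion to concentration (ug/g); negative noise
        # excursions are clipped to zero.
        conc_map = np.clip((counts - background) / slope, 0.0, None)

        print(f"mean Fe: {conc_map.mean():.2f} ug/g, max: {conc_map.max():.2f} ug/g")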

  20. Microwave assisted synthesis of luminescent carbonaceous nanoparticles from silk fibroin for bioimaging.

    Science.gov (United States)

    Gao, Hongzhi; Teng, Choon Peng; Huang, Donghong; Xu, Wanqing; Zheng, Chaohui; Chen, Yisong; Liu, Minghuan; Yang, Da-Peng; Lin, Ming; Li, Zibiao; Ye, Enyi

    2017-11-01

    Bombyx mori silk, a natural protein-based biopolymer with high nitrogen content, is abundant and sustainable because of its large-scale worldwide production every year. In this study, we developed a facile and fast microwave-assisted synthesis of luminescent carbonaceous nanoparticles using Bombyx mori silk fibroin and silk solution as the precursors. The obtained carbonaceous nanoparticles exhibit a photoluminescence quantum yield of ~20%, high stability, low cytotoxicity, and high biocompatibility. Most importantly, we successfully demonstrated bioimaging using these luminescent carbonaceous nanoparticles with excitation-dependent luminescence. In addition, the microwave-assisted hydrothermal method can be extended to convert other biomass into functional nanomaterials. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Bioimaging of M1 cells using ceramic nanophosphors: Synthesis and toxicity assay of Y2O3 nanoparticles

    International Nuclear Information System (INIS)

    Venkatachalam, N; Soga, K; Tsuji, T; Okumura, Y; Fukuda, R

    2009-01-01

    Er3+-doped Y2O3 nanoparticles were synthesized by enzymatic and polymer-assisted homogeneous co-precipitation methods. The resulting particle size is about 30-40 nm with a narrow size distribution, smaller than that obtained by conventional homogeneous and alkali precipitation methods. The particles show bright green (550 nm) and red (660 nm) upconversion (UC) as well as near-infrared (NIR) fluorescence (1550 nm) under 980 nm excitation. Bioimaging of M1 cells using the nanoparticles was successfully demonstrated. A nanoparticle concentration of 0.5 mg/ml was found to be suitable for bioimaging of M1 cells under physiological conditions. Cellular uptake of the nanoparticles is evidenced by bright-field, UC, and NIR fluorescence images of live M1 cells. Our studies suggest that a low concentration of nanoparticles is sufficient for imaging once the particles are taken up by M1 cells, and that this concentration keeps the cells alive. It was further demonstrated that, under physiological conditions, Y2O3 nanoparticles emit UC and NIR fluorescence in M1 cells even after surface modification with PEG-b-PAAc polymer. Moreover, the surface-modified nanoparticles show a lower toxic effect on M1 cells compared to bare nanoparticles.

  2. Photon upconversion towards applications in energy conversion and bioimaging

    Science.gov (United States)

    Sun, Qi-C.; Ding, Yuchen C.; Sagar, Dodderi M.; Nagpal, Prashant

    2017-12-01

    The field of plasmonics can play an important role in developing novel devices for applications in energy and healthcare. In this review article, we consider the progress made in the design and fabrication of upconverting nanoparticles and metal nanostructures for precisely manipulating light photons, with wavelengths of several hundred nanometers, at nanometer length scales, and describe how to tailor their interactions with molecules and surfaces so that two or more lower-energy photons can be combined into a single higher-energy photon in a process called photon upconversion. The review begins by introducing the current state of the art in upconverting nanoparticle synthesis and achievements in color tuning and upconversion enhancement. Through understanding and tailoring the underlying physical processes, color tuning and strong upconversion enhancement have been demonstrated by coupling with surface plasmon polariton waves, especially for low-intensity or diffuse infrared radiation. Since more than 30% of incident sunlight is not utilized in most photovoltaic cells, photon upconversion is one of the promising approaches to break the so-called Shockley-Queisser thermodynamic limit for a single-junction solar cell. Furthermore, since the low-energy photons typically fall within the biological window of optical transparency, this approach can also be particularly beneficial for novel biosensing and bioimaging techniques. Taken together, recent research boosts the applications of photon upconversion using designed metal nanostructures and nanoparticles for green energy, bioimaging, and therapy.
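    A standard diagnostic tied to this process: the number of photons pooled per upconverted photon is read off the slope of a log-log plot of emission intensity versus excitation power. The relation below is the textbook form, added here for orientation.

        % For an n-photon upconversion process below saturation (P = pump power):
        \[
          I_{\mathrm{em}} \propto P^{\,n}
          \quad\Longrightarrow\quad
          n = \frac{d\,\log I_{\mathrm{em}}}{d\,\log P}
        \]
        % n is close to 2 for the common two-photon-pooling schemes discussed here.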

  3. Ratiometric Afterglow Nanothermometer for Simultaneous in Situ Bioimaging and Local Tissue Temperature Sensing

    NARCIS (Netherlands)

    Yang, J.; Liu, Y.; Zhao, Y.; Gong, Z.; Zhang, M.; Yan, D.; Zhu, H.; Liu, C.; Xu, C.; Zhang, H.

    2017-01-01

    Simultaneous in situ bioimage tracing and temperature sensing are two foci of modern biomedicine that have motivated the design of novel luminescent nanothermometers with dual functions. To minimize the disadvantages of existing approaches, like the surface effect of nanoparticles,

  4. Single-pulse CARS based multimodal nonlinear optical microscope for bioimaging.

    Science.gov (United States)

    Kumar, Sunil; Kamali, Tschackad; Levitte, Jonathan M; Katz, Ori; Hermann, Boris; Werkmeister, Rene; Považay, Boris; Drexler, Wolfgang; Unterhuber, Angelika; Silberberg, Yaron

    2015-05-18

    Noninvasive label-free imaging of biological systems demands not only high-speed three-dimensional prescreening of morphology over a wide field of view but also extraction of the microscopic functional and molecular details within. Capitalizing on the unique advantages of different nonlinear optical effects, a multimodal nonlinear optical microscope can be a powerful tool for bioimaging. Bringing together intensity-dependent contrast mechanisms via second harmonic generation, third harmonic generation, and four-wave mixing for structure-sensitive imaging, and the single-beam/single-pulse coherent anti-Stokes Raman scattering technique for chemically sensitive imaging in the fingerprint region, we have developed a simple and nearly alignment-free multimodal nonlinear optical microscope based on a single wide-band Ti:Sapphire femtosecond pulse laser source. Successful imaging tests were carried out on two exemplary biological samples, a canine femur bone and collagen fibrils harvested from a rat tail. Since the ultra-broadband femtosecond laser is a suitable source for performing high-resolution optical coherence tomography, a wide-field optical coherence tomography arm can easily be incorporated into the presented multimodal microscope, making it a versatile optical imaging tool for noninvasive label-free bioimaging.

  5. A novel fluorescence probe based on triphenylamine Schiff base for bioimaging and responding to pH and Fe{sup 3+}

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Lei; Yang, Xiaodong; Chen, Xiuli [School of Chemistry and Chemical Engineering, South China University of Technology, Guangzhou 510640 (China); Zhou, Yuping [Guangdong Provincial key Laboratory of New Drug Screening, School of Pharmaceutical Sciences, Southern Medical University, Guangzhou 510515 (China); Lu, Xiaodan; Yan, Chenggong; Xu, Yikai [Medical Imaging Center, Nanfang Hospital, Southern Medical University, Guangzhou 510515 (China); Liu, Ruiyuan, E-mail: ruiyliu@smu.edu.cn [Guangdong Provincial key Laboratory of New Drug Screening, School of Pharmaceutical Sciences, Southern Medical University, Guangzhou 510515 (China); Qu, Jinqing, E-mail: cejqqu@scut.edu.cn [School of Chemistry and Chemical Engineering, South China University of Technology, Guangzhou 510640 (China)

    2017-03-01

    A novel fluorescence probe 1 based on triphenylamine was synthesized and characterized by NMR, IR, high-resolution mass spectrometry, and elemental analysis. Its fluorescence was quenched at pH below 2, the fluorescence intensity varied linearly with pH over the range 2-7, and the fluorescence emission was reversible between acidic and alkaline solution. Furthermore, the probe exhibited remarkable selectivity and high sensitivity to Fe{sup 3+} and was able to detect Fe{sup 3+} in aqueous solution with a low detection limit of 0.511 μM. A Job plot showed that the binding stoichiometry of 1 with Fe{sup 3+} was 1:1. Further {sup 1}H NMR titration suggested that a coordination interaction between Fe{sup 3+} and the nitrogen atom of the C=N bond promoted an intramolecular charge transfer (ICT) or energy transfer process, causing fluorescence quenching. Additionally, 1 could be applied for detecting Fe{sup 3+} in living cells and for bioimaging. - Graphical abstract: The triphenylamine-based fluorescence probe can detect pH and Fe{sup 3+} simultaneously in aqueous solution and can be applied for detecting Fe{sup 3+} in living cells and bioimaging. - Highlights: • The fluorescence probe is sensitive to pH under strongly acidic conditions. • The fluorescence chemosensor can detect pH and Fe{sup 3+} simultaneously. • The recognition can be carried out in aqueous solution. • The probe can also be applied for detecting Fe{sup 3+} in living cells and bioimaging. • The sensor is easily synthesized in one step.

  6. Quick synthesis of 2-propanol derived fluorescent carbon dots for bioimaging applications

    Science.gov (United States)

    Angamuthu, Raja; Palanisamy, Priya; Vasudevan, Vasanthakumar; Nagarajan, Sedhu; Rajendran, Ramesh; Vairamuthu, Raj

    2018-04-01

    Herein, for the first time, we present an ingenious one-pot preparative method for fluorescent carbon dots from 2-propanol (2P-CDs) without external treatments. The structure, morphology, chemical composition, and fluorescence properties of the 2P-CDs were examined. The results confirm that the as-synthesized 2P-CDs are amorphous, monodispersed, and spherical, with an average particle size of 2.5 ± 0.7 nm. Most importantly, excitation-dependent emission properties were observed, suggesting that these 2P-CDs may be used in multicolor bioimaging applications. When incubated with HeLa cells, the 2P-CDs exhibit low cytotoxicity and good biocompatibility. Confocal microscopy images show the uptake of 2P-CDs by HeLa cells, demonstrating their potential application as a biomarker.

  7. Fluorescent carbon nanoparticles derived from natural materials of mango fruit for bio-imaging probes

    Science.gov (United States)

    Jeong, Chan Jin; Roy, Arup Kumer; Kim, Sung Han; Lee, Jung-Eun; Jeong, Ji Hoon; In, Insik; Park, Sung Young

    2014-11-01

    Water soluble fluorescent carbon nanoparticles (FCP) obtained from a single natural source, mango fruit, were developed as unique materials for non-toxic bio-imaging with different colors and particle sizes. The prepared FCPs showed blue (FCP-B), green (FCP-G) and yellow (FCP-Y) fluorescence, derived by the controlled carbonization method. The FCPs demonstrated hydrodynamic diameters of 5-15 nm, holding great promise for clinical applications. The biocompatible FCPs demonstrated great potential in biological fields through the results of in vitro imaging and in vivo biodistribution. Using intravenously administered FCPs with different colored particles, we precisely defined the clearance and biodistribution, showing rapid and efficient urinary excretion for safe elimination from the body. These findings therefore suggest the promising possibility of using natural sources for producing fluorescent materials. Electronic supplementary information (ESI) available. See DOI: 10.1039/c4nr04805a

  8. Persistent Luminescence Nanophosphor Involved Near-Infrared Optical Bioimaging for Investigation of Foodborne Probiotics Biodistribution in Vivo: A Proof-of-Concept Study.

    Science.gov (United States)

    Liu, Yaoyao; Liu, Jing-Min; Zhang, Dongdong; Ge, Kun; Wang, Peihua; Liu, Huilin; Fang, Guozhen; Wang, Shuo

    2017-09-20

    Probiotics have attracted great attention in food nutrition and safety research, but thus far analytical techniques for visualized, real-time monitoring of probiotics ingested in vivo remain limited. Herein, an optical bioimaging technique is introduced for investigation of foodborne probiotics biodistribution in vivo, employing near-infrared (NIR) emitting persistent luminescence nanophosphors (PLNPs) of Cr3+-doped zinc gallogermanate (ZGGO) as the contrast nanoprobes. The ultrabrightness, super-long afterglow, polydispersed size, low toxicity, and excellent photostability and biocompatibility of the PLNPs qualify them as a tracer for labeling probiotics via antibody (anti-Gram-positive bacteria LTA antibody) recognition, as well as a contrast agent for long-term bioimaging of the probiotics. In vivo optical bioimaging assays showed that the LTA antibody-functionalized ZGGO nanoprobes, which could be efficiently tagged to the probiotics, were successfully applied for real-time monitoring and nondestructive probing of the biodistribution of probiotics inside the living body after oral administration. This work presents a proof of concept that exploits bioimaging methodology for real-time, nondestructive study of foodborne probiotics behavior in vivo, which could open up a novel way of food safety detection and nutrition investigation.

  9. Biochemistry and biomedicine of quantum dots: from biodetection to bioimaging, drug discovery, diagnostics, and therapy.

    Science.gov (United States)

    Yao, Jun; Li, Pingfan; Li, Lin; Yang, Mei

    2018-07-01

    According to recent research, nanotechnology based on quantum dots (QDs) has been widely applied in the fields of bioimaging, drug delivery, and drug analysis, and it has therefore become one of the major forces driving basic and applied research. The application of nanotechnology in bioimaging has attracted particular attention. Through in vitro labeling, it was found that luminescent QDs possess many useful properties, such as narrow emission, broad UV excitation, bright fluorescence, and high photostability. QDs also show great potential in whole-body imaging. Because QDs can be combined with biomolecules, they can be used for targeted drug delivery and diagnosis. These characteristics make QDs useful for applications in pharmacy and pharmacology. This review focuses on various applications of QDs, especially in imaging, drug delivery, pharmaceutical analysis, photothermal therapy, biochips, and targeted surgery. Finally, conclusions are drawn by presenting some critical challenges and a perspective on how this field can be expected to develop in the future. Quantum dot research is an emerging interdisciplinary field that involves physics, chemistry, materials science, biology, medicine, and more, and QD-based nanotechnology has been applied in depth in biochemistry and biomedicine. Forward-looking areas with inspiring application prospects, such as immunoassay, DNA analysis, biological monitoring, drug discovery, in vitro labeling, in vivo imaging, and tumor targeting, are closely connected to human life and health and are at the forefront of science and technology. Furthermore, this review covers not only traditional biochemical detection but also emphasizes potential applications in life science and biomedicine. Copyright © 2018 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.

  10. Yb{sup 3+},Er{sup 3+},Eu{sup 3+}-codoped YVO{sub 4} material for bioimaging with dual mode excitation

    Energy Technology Data Exchange (ETDEWEB)

    Thao, Chu Thi Bich [Department of Chemistry, Changwon National University, Changwon 641-773 (Korea, Republic of); Huy, Bui The [Department of Chemistry, Changwon National University, Changwon 641-773 (Korea, Republic of); Institute of Research and Development, Duy Tan University, K7/25 Quang Trung, Da Nang (Viet Nam); Sharipov, Mirkomil [Department of Chemistry, Changwon National University, Changwon 641-773 (Korea, Republic of); Kim, Jin-Ik. [Department of Biochemistry and Health Science, Changwon National University, Changwon 641-773 (Korea, Republic of); Dao, Van-Duong [Department of Chemical Engineering & Applied Chemistry, Chungnam National University, Daejeon 305-764 (Korea, Republic of); Moon, Ja-Young [Department of Biochemistry and Health Science, Changwon National University, Changwon 641-773 (Korea, Republic of); Lee, Yong-Ill, E-mail: yilee@changwon.ac.kr [Department of Chemistry, Changwon National University, Changwon 641-773 (Korea, Republic of)

    2017-06-01

    We propose an efficient bioimaging strategy using Yb{sup 3+},Er{sup 3+},Eu{sup 3+}-triply doped YVO{sub 4} nanoparticles synthesized with a polymer as a template. The obtained particles are nanoscale, uniform, and flexibly excitable. The effect of Eu{sup 3+} ions on the luminescence properties of YVO{sub 4}:Yb{sup 3+},Er{sup 3+},Eu{sup 3+} was investigated, and the upconversion mechanism of the prepared material is discussed. The structure and optical properties of the prepared material were characterized using X-ray diffraction (XRD), Fourier-transform IR spectroscopy (FTIR), scanning electron microscopy (SEM), transmission electron microscopy (TEM), and upconversion and photoluminescence spectra. The Commission Internationale de l'Eclairage (CIE) chromaticity coordinates were determined to confirm the performance of the color luminescent emission. The prepared YVO{sub 4}:Yb{sup 3+},Er{sup 3+},Eu{sup 3+} nanoparticles could easily be dispersed in water after surface modification with cysteine (Cys) and glutathione (GSH). The aqueous dispersion of the modified YVO{sub 4}:Yb{sup 3+},Er{sup 3+},Eu{sup 3+} exhibits bright upconversion and downconversion luminescence and has been applied to bioimaging of HeLa cells. The developed material with dual-mode excitation offers a promising advance in bioimaging. - Highlights: • The prepared particles possess nanoscale size and uniform morphology. • The material exhibits strong emission under dual-mode excitation. • The surface-modified material has been applied to bioimaging of HeLa cells. • Low cytotoxicity and no autofluorescence.

  11. Carnegie Mellon University bioimaging day 2014: Challenges and opportunities in digital pathology.

    Science.gov (United States)

    Rohde, Gustavo K; Ozolek, John A; Parwani, Anil V; Pantanowitz, Liron

    2014-01-01

    Recent advances in digital imaging are impacting the practice of pathology. One of the key enabling technologies leading the way towards this transformation is whole slide imaging (WSI), which allows glass slides to be converted into large image files that can be shared, stored, and analyzed rapidly. Many applications of this novel technology have evolved in the last decade, including education, research, and clinical applications. This publication highlights a collection of abstracts, each corresponding to a talk given at Carnegie Mellon University's (CMU) Bioimaging Day 2014, co-sponsored by the Biomedical Engineering and Lane Center for Computational Biology Departments at CMU. Topics related specifically to digital pathology are presented in this collection of abstracts, including digital workflow implementation, imaging and artifacts, storage demands, and automated image analysis algorithms.

  12. Carnegie Mellon University bioimaging day 2014: Challenges and opportunities in digital pathology

    Directory of Open Access Journals (Sweden)

    Gustavo K Rohde

    2014-01-01

    Full Text Available Recent advances in digital imaging are impacting the practice of pathology. One of the key enabling technologies leading the way towards this transformation is whole slide imaging (WSI), which allows glass slides to be converted into large image files that can be shared, stored, and analyzed rapidly. Many applications of this novel technology have evolved in the last decade, including education, research, and clinical applications. This publication highlights a collection of abstracts, each corresponding to a talk given at Carnegie Mellon University's (CMU) Bioimaging Day 2014, co-sponsored by the Biomedical Engineering and Lane Center for Computational Biology Departments at CMU. Topics related specifically to digital pathology are presented in this collection of abstracts, including digital workflow implementation, imaging and artifacts, storage demands, and automated image analysis algorithms.

  13. A way toward analyzing high-content bioimage data by means of semantic annotation and visual data mining

    Science.gov (United States)

    Herold, Julia; Abouna, Sylvie; Zhou, Luxian; Pelengaris, Stella; Epstein, David B. A.; Khan, Michael; Nattkemper, Tim W.

    2009-02-01

    In recent years, bioimaging has turned from qualitative measurements towards a high-throughput and high-content modality, providing multiple variables for each biological sample analyzed. We present a system that combines machine-learning-based semantic image annotation and visual data mining to analyze such new multivariate bioimage data. Machine learning is employed for automatic semantic annotation of regions of interest. The annotation is the prerequisite for a biological object-oriented exploration of the feature space derived from the image variables. With the aid of visual data mining, the obtained data can be explored simultaneously in the image domain as well as in the feature domain. Especially when little is known about the underlying data, for example when exploring the effects of a drug treatment, visual data mining can greatly aid the process of data evaluation. We demonstrate how our system is used for image evaluation to obtain information relevant to diabetes studies and the screening of new anti-diabetes treatments. Cells of the islets of Langerhans and the whole pancreas in pancreas tissue samples are annotated, and object-specific molecular features are extracted from aligned multichannel fluorescence images. These are interactively evaluated for cell type classification in order to determine cell number and mass. Only a few parameters need to be specified, which makes the system usable for non-computer-experts as well and allows for high-throughput analysis.
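    As a flavor of the object-oriented classification step described above, the sketch below trains a generic classifier on per-object features to call cell types. The feature matrix, labels, and classifier choice are illustrative assumptions, not the authors' system.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(1)

        # Stand-in for per-object features extracted from aligned multichannel
        # fluorescence images (e.g. per-channel mean intensity, area, shape).
        X = rng.normal(size=(300, 5))
        # Stand-in annotations for two cell types (e.g. beta vs. non-beta).
        y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

        # Generic classifier for semantic annotation of segmented objects;
        # cross-validation gives a quick estimate of annotation accuracy.
        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        scores = cross_val_score(clf, X, y, cv=5)
        print(f"cell-type accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")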

  14. Nanostructures Derived from Starch and Chitosan for Fluorescence Bio-Imaging

    Science.gov (United States)

    Zu, Yinxue; Bi, Jingran; Yan, Huiping; Wang, Haitao; Song, Yukun; Zhu, Bei-Wei; Tan, Mingqian

    2016-01-01

    Fluorescent nanostructures (NSs) derived from polysaccharides have drawn great attention as novel fluorescent probes for potential bio-imaging applications. Herein, we report a facile alkali-assisted hydrothermal method to fabricate polysaccharide NSs using starch and chitosan as raw materials. Transmission electron microscopy (TEM) showed average particle sizes of 14 nm and 75 nm for the starch and chitosan NSs, respectively. Fourier transform infrared (FT-IR) spectroscopy analysis showed a large number of hydroxyl or amino groups on the surface of these polysaccharide-based NSs. Strong fluorescence with excitation-dependent emission behaviour was observed under ultraviolet excitation. Interestingly, the photostability of the NSs was found to be superior to that of fluorescein and rhodamine B. The quantum yield of the starch NSs reached 11.12% under excitation at 360 nm. Oxidative metal ions, including Cu(II), Hg(II), and Fe(III), quenched the fluorescence of the prepared NSs. Both kinds of multicoloured NSs showed maximum fluorescence intensity at pH 7, while the fluorescence intensity decreased dramatically in either acidic or basic environments (at pH 3 or 11). A cytotoxicity study of the starch NSs showed low cytotoxicity, with 80% cell viability after 24 h of incubation at concentrations below 10 mg/mL. The study also demonstrated the possibility of using the multicoloured starch NSs for imaging mouse melanoma cells and guppy fish. PMID:28335258
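    For context, quantum yields like the 11.12% reported here are typically obtained by the comparative method against a reference dye; the standard single-point formula is sketched below, as an illustration rather than the procedure used in the paper.

        % Relative quantum yield against a reference fluorophore:
        % I = integrated emission, A = absorbance at the excitation wavelength,
        % n = solvent refractive index; subscript "ref" marks the standard.
        \[
          \Phi = \Phi_{\mathrm{ref}}
                 \cdot \frac{I}{I_{\mathrm{ref}}}
                 \cdot \frac{A_{\mathrm{ref}}}{A}
                 \cdot \frac{n^{2}}{n_{\mathrm{ref}}^{2}}
        \]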

  15. Ultra-bright emission from hexagonal boron nitride defects as a new platform for bio-imaging and bio-labelling

    Science.gov (United States)

    Elbadawi, Christopher; Tran, Trong Toan; Shimoni, Olga; Totonjian, Daniel; Lobo, Charlene J.; Grosso, Gabriele; Moon, Hyowan; Englund, Dirk R.; Ford, Michael J.; Aharonovich, Igor; Toth, Milos

    2016-12-01

    Bio-imaging requires robust, ultra-bright probes that cause no toxicity to the cellular environment, maintain their stability, and are chemically inert. In this work we present hexagonal boron nitride (hBN) nanoflakes that host narrowband, ultra-bright single photon emitters [1]. The emitters are optically stable at room temperature and under ambient conditions. hBN has also been noted to be noncytotoxic, and significant advances have been made in its functionalization with biomolecules [2,3]. We further demonstrate two methods of engineering this new range of extremely robust multicolour emitters across the visible and near-infrared spectral ranges for large-scale sensing and biolabeling applications.

  16. Star polymer-based unimolecular micelles and their application in bio-imaging and diagnosis.

    Science.gov (United States)

    Jin, Xin; Sun, Pei; Tong, Gangsheng; Zhu, Xinyuan

    2018-02-03

    As a novel kind of polymer with a covalently linked core-shell structure, star polymers form nanostructures in aqueous media over the entire concentration range, behaving as unimolecular micelles under high dilution and as multi-micelle aggregates otherwise. These unique morphologies endow star polymers with excellent stability and functionality, making them a promising platform for bio-applications. A variety of functions, including imaging and therapeutics, can be achieved through rational structural design of star polymers, and the abundance of end-groups on the shell offers the opportunity for further modification. In the last decades, star polymers have become an attractive platform for the fabrication of novel nano-systems for bio-imaging and diagnosis. Focusing on the specific topology and physicochemical properties of star polymers, we review recent developments in star polymer-based unimolecular micelles and their bio-applications in imaging and diagnosis. The main content of this review summarizes the synthesis of integrated architectures of star polymers and their self-assembly behavior in aqueous media, focusing especially on recent advances in their bio-imaging and diagnostic applications. Finally, we conclude with remarks and outlooks for further exploration in this field. Copyright © 2018 Elsevier Ltd. All rights reserved.

  17. Size effects in the quantum yield of CdTe quantum dots for optimum fluorescence bioimaging

    International Nuclear Information System (INIS)

    Jacinto, C.; Rocha, U.S.; Maestro, L.M.; Garcia-Sole, J.; Jaque, D.

    2011-01-01

    Full text: Semiconductor nanocrystals, usually referred to as quantum dots (QDs), are nowadays regarded as one of the building blocks of modern photonics. They constitute bright and photostable fluorescence sources whose emission and absorption properties can be tailored through their size. Recent advances in the controlled modification of their surface have made possible the development of water-soluble QDs without any deterioration of their fluorescence properties. This has made them excellent selective optical markers for fluorescence bio-imaging experiments. The suitability of colloidal QDs for bio-imaging is pushed forward by their large two-photon absorption cross-section, so that their visible luminescence (associated with the recombination of electron-hole pairs) can also be efficiently excited under infrared excitation (two-photon excitation). This, in turn, allows for large penetration depths in tissues, minimization of autofluorescence, and superior spatial imaging resolution. In addition, recent works have demonstrated the ability of QDs to act as nano-thermometers based on the thermal sensitivity of their fluorescence bands. Based on all these outstanding properties, QDs have been successfully used to mark individual receptors in cell membranes, to measure intracellular temperature, and to label living embryos at different stages. Most of the QD-based bio-images reported up to now were obtained using either CdSe or CdTe QDs, since both are commercially available with a high degree of quality. They show similar fluorescence properties and optical performance when used in bio-imaging. Nevertheless, CdTe QDs have very recently attracted much attention since the hyper-thermal sensitivity of their fluorescence bands was discovered. Based on this, it has been postulated that intracellular thermal sensing with resolutions as fine as 0.25 °C can be achieved with CdTe QDs, three times better than
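    The figure of merit behind temperature resolutions such as the 0.25 °C quoted above is the relative thermal sensitivity of the luminescence readout; the standard definitions are sketched below in textbook form, for orientation only.

        % Delta is the thermometric readout (an intensity or an intensity ratio),
        % S_r the relative sensitivity, and delta(T) the attainable temperature
        % resolution for a readout uncertainty delta(Delta).
        \[
          S_{r} = \frac{1}{\Delta}\left|\frac{\partial \Delta}{\partial T}\right|,
          \qquad
          \delta T \approx \frac{1}{S_{r}}\,\frac{\delta \Delta}{\Delta}
        \]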

  18. Oleyl-hyaluronan micelles loaded with upconverting nanoparticles for bio-imaging

    Energy Technology Data Exchange (ETDEWEB)

    Pospisilova, Martina, E-mail: martina.pospisilova@contipro.com; Mrazek, Jiri; Matuska, Vit; Kettou, Sofiane; Dusikova, Monika; Svozil, Vit; Nesporova, Kristina; Huerta-Angeles, Gloria; Vagnerova, Hana; Velebny, Vladimir [Contipro Biotech (Czech Republic)

    2015-09-15

    Hyaluronan (HA) is an interesting polymer for nanoparticle coating due to its biocompatibility and enhanced cell interaction via the CD44 receptor. Here, we describe the incorporation of oleate-capped β-NaYF{sub 4}:Yb{sup 3+},Er{sup 3+} nanoparticles (UCNP-OA) into amphiphilic HA by a microemulsion method. The resulting structures have a spherical, micelle-like appearance with a hydrodynamic diameter of 180 nm. UCNP-OA-loaded HA micelles show good stability in PBS buffer and cell culture media. The intensity of the green emission of UCNP-OA-loaded HA micelles in water is about five times higher than that of ligand-free UCNPs, indicating that amphiphilic HA effectively protects UCNP luminescence from quenching by water molecules. We found that UCNP-OA-loaded HA micelles at concentrations up to 50 μg mL{sup −1} increase the cell viability of normal human dermal fibroblasts (NHDF), while the viability of human breast adenocarcinoma MDA-MB-231 cells is reduced at these concentrations. The utility of UCNP-OA-loaded HA micelles as a bio-imaging probe was demonstrated in vitro by successful labelling of NHDF and MDA-MB-231 cells overexpressing the CD44 receptor.

  19. Comparative studies of upconversion luminescence characteristics and cell bioimaging based on one-step synthesized upconversion nanoparticles capped with different functional groups

    Energy Technology Data Exchange (ETDEWEB)

    Tsang, Ming-Kiu [Department of Applied Physics, The Hong Kong Polytechnic University, Hong Kong (China); Chan, Chi-Fai; Wong, Ka-Leung [Department of Chemistry, Hong Kong Baptist University (Hong Kong); Hao, Jianhua, E-mail: jh.hao@polyu.edu.hk [Department of Applied Physics, The Hong Kong Polytechnic University, Hong Kong (China)

    2015-01-15

    Herein, three types of upconverting NaGdF{sub 4}:Yb/Er nanoparticles (UCNPs) were synthesized via one-step hydrothermal synthesis with polyethylene glycol (PEG), polyethylenimine (PEI), and 6-aminocaproic acid (6AA) functionalization. FTIR spectra and ζ-potentials were measured to confirm the successful capping of PEG, PEI, and 6AA on the UCNPs. The regular morphology and cubic phase of the functionalized UCNPs are attributed to the capping effect of the surfactants. Tunable upconversion luminescence (UCL) from red to green was observed under 980 nm laser excitation, and the UCL tuning is attributed to the different surface ligands. Moreover, surface-group-dependent UCL bioimaging was performed in HeLa cells. The enhanced UCL bioimaging demonstrated by the PEI-functionalized UCNPs evidences high cell uptake, which is explained by the electrostatic attraction between the amino groups (–NH{sub 2}) and the cell membrane. The functionalized UCNPs also demonstrated low cytotoxicity in MTT assays. Additionally, these UCNPs exhibit paramagnetic properties under a magnetic field. - Highlights: • Tunable upconversion emission via capped functional groups at fixed composition. • Surface-dependent upconversion luminescence bioimaging in HeLa cells. • Low cytotoxicity. • Additional paramagnetic property due to Gd{sup 3+} ions.

  20. Alexa Fluor-labeled Fluorescent Cellulose Nanocrystals for Bioimaging Solid Cellulose in Spatially Structured Microenvironments

    Energy Technology Data Exchange (ETDEWEB)

    Grate, Jay W.; Mo, Kai-For; Shin, Yongsoon; Vasdekis, Andreas; Warner, Marvin G.; Kelly, Ryan T.; Orr, Galya; Hu, Dehong; Dehoff, Karl J.; Brockman, Fred J.; Wilkins, Michael J.

    2015-03-18

    Cellulose nanocrystal materials have been labeled with modern Alexa Fluor dyes in a process that first links the dye to a cyanuric chloride molecule. Subsequent reaction with cellulose nanocrystals provides a dyed solid microcrystalline cellulose material that can be used for bioimaging and is suitable for deposition in films and spatially structured microenvironments. It is demonstrated with single-molecule fluorescence microscopy that these films are subject to hydrolysis by cellulase enzymes.

  1. Rapid colorimetric sensing of gadolinium by EGCG-derived AgNPs: the development of a nanohybrid bioimaging probe.

    Science.gov (United States)

    Singh, Rohit Kumar; Mishra, Sourav; Jena, Satyapriya; Panigrahi, Bijayananda; Das, Bhaskar; Jayabalan, Rasu; Parhi, Pankaj Kumar; Mandal, Dindyal

    2018-04-17

    Polyphenol-functionalized silver nanoparticles (AgNPs) have been developed and demonstrated as colorimetric sensors for the selective detection of gadolinium. The newly obtained AgNP-Gd3+ conjugates exhibit high aqueous dispersibility and excitation-dependent fluorescence emission, offering multicolor bioimaging potential owing to their excellent luminescence properties.

  2. Confinement of carbon dots localizing to the ultrathin layered double hydroxides toward simultaneous triple-mode bioimaging and photothermal therapy.

    Science.gov (United States)

    Weng, Yangziwan; Guan, Shanyue; Lu, Heng; Meng, Xiangmin; Kaassis, Abdessamad Y; Ren, Xiaoxue; Qu, Xiaozhong; Sun, Chenghua; Xie, Zheng; Zhou, Shuyun

    2018-07-01

    It is a great challenge to develop multifunctional nanocarriers for cancer diagnosis and therapy. Herein, versatile CDs/ICG-uLDH nanovehicles for triple-modal fluorescence/photoacoustic/two-photon bioimaging and effective photothermal therapy were prepared via facile self-assembly of red-emission carbon dots (CDs) and indocyanine green (ICG) with ultrathin layered double hydroxides (uLDHs). Due to the J-aggregates of ICG constructed in the self-assembly process, the CDs/ICG-uLDHs stabilized the photothermal agent ICG and enhanced its photothermal efficiency. Furthermore, the unique confinement effect of the uLDHs extended the fluorescence lifetime of the CDs in favor of bioimaging. Considering their excellent in vitro and in vivo phototherapeutic and multimodal imaging performance, this work provides a promising platform for the construction of multifunctional theranostic nanocarrier systems for cancer treatment. Copyright © 2018 Elsevier B.V. All rights reserved.

  3. A special issue on reviews in biomedical applications of nanomaterials, tissue engineering, stem cells, bioimaging, and toxicity.

    Science.gov (United States)

    Nalwa, Hari Singh

    2014-10-01

    This second special issue in a series from the Journal of Biomedical Nanotechnology contains another 30 state-of-the-art reviews focused on the biomedical applications of nanomaterials, biosensors, bone tissue engineering, MRI and bioimaging, single-cell detection, stem cells, endothelial progenitor cells, toxicity and biosafety of nanodrugs, and nanoparticle-based new therapeutic approaches for cancer and for hepatic and cardiovascular disease.

  4. Synthesis of Au NP@MoS2 Quantum Dots Core@Shell Nanocomposites for SERS Bio-Analysis and Label-Free Bio-Imaging

    Directory of Open Access Journals (Sweden)

    Xixi Fei

    2017-06-01

    Full Text Available In this work, we report a facile method using MoS2 quantum dots (QDs) as reducers that react directly with HAuCl4 to synthesize Au nanoparticle@MoS2 quantum dot (Au NP@MoS2 QDs) core@shell nanocomposites with an ultrathin shell of ca. 1 nm. The prepared Au NP@MoS2 QDs show high surface-enhanced Raman scattering (SERS) sensitivity as well as satisfactory SERS reproducibility and stability. The limit of detection of the hybrids for crystal violet reaches 0.5 nM, with a reasonable linear response range from 0.5 μM to 0.5 nM (R2 ≈ 0.974). Furthermore, near-infrared SERS detection based on Au NP@MoS2 QDs in living cells is achieved, with distinct Raman signals clearly assigned to various cellular components. Meanwhile, distinguishable SERS images are acquired from 4T1 cells incubated with Au NP@MoS2 QDs. Consequently, this straightforward strategy shows the great potential of Au NP@MoS2 QDs as a superior SERS substrate for chemical and biological detection as well as bio-imaging.
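    To illustrate how a detection limit and an R² of this kind are extracted, the sketch below fits a calibration line over a wide concentration range. Fitting intensity against log concentration is a common choice for SERS data spanning several decades; all numbers are synthetic, not taken from the paper.

        import numpy as np

        # Synthetic SERS calibration points for crystal violet spanning
        # 0.5 nM to 0.5 uM (a log-spaced dilution series).
        conc_M = np.array([5e-10, 5e-9, 5e-8, 5e-7])
        intensity = np.array([120.0, 340.0, 520.0, 760.0])

        # Fit intensity versus log10(concentration) and compute R^2.
        x = np.log10(conc_M)
        slope, intercept = np.polyfit(x, intensity, 1)
        pred = slope * x + intercept
        ss_res = np.sum((intensity - pred) ** 2)
        ss_tot = np.sum((intensity - intensity.mean()) ** 2)
        r2 = 1.0 - ss_res / ss_tot
        print(f"slope = {slope:.1f} per decade, R^2 = {r2:.3f}")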

  5. The BioImage Study: novel approaches to risk assessment in the primary prevention of atherosclerotic cardiovascular disease--study design and objectives

    NARCIS (Netherlands)

    Muntendam, Pieter; McCall, Carol; Sanz, Javier; Falk, Erling; Fuster, Valentin; Badimon, Juan José; Botnar, René M.; Daemen, Mat J. A. P.; Fayad, Zahi A.; Garcia, Mario; Ginsburg, Geoffrey S.; Hazen, Stanley L.; King, Spencer B.; Moreno, Pedro R.; Nordestgaard, Børge G.; Rudd, James H. F.; Shah, Predimon K.; Sillesen, Henrik; van der Steen, Antonius F. W.; Yacoub, Magdi H.; Yuan, Chun; Bui, James T.; Chen, Christopher J.; Seow, Albert; Plump, Andrew; Raichlen, Joel; Smit, Paul; Stolzenbach, James C.; Urquhart, Richard

    2010-01-01

    The identification of asymptomatic individuals at risk for near-term atherothrombotic events, so that optimal preventive treatment can be ensured, remains a challenging goal. In the BioImage Study, novel approaches are tested in a typical health-plan population. Based on certain demographic and risk

  6. Hydrothermal synthesis of NaLuF4:153Sm,Yb,Tm nanoparticles and their application in dual-modality upconversion luminescence and SPECT bioimaging.

    Science.gov (United States)

    Yang, Yang; Sun, Yun; Cao, Tianye; Peng, Juanjuan; Liu, Ying; Wu, Yongquan; Feng, Wei; Zhang, Yingjian; Li, Fuyou

    2013-01-01

    Upconversion luminescence (UCL) properties and radioactivity have been integrated into NaLuF(4):(153)Sm,Yb,Tm nanoparticles by a facile one-step hydrothermal method, making these nanoparticles candidates for UCL and single-photon emission computed tomography (SPECT) dual-modal bioimaging in vivo. The introduction of a small amount of radioactive (153)Sm(3+) hardly alters the upconversion luminescence properties of the nanoparticles. The as-designed nanoparticles showed very low cytotoxicity, no obvious tissue damage within 7 days, and excellent in vitro and in vivo performance in dual-modal bioimaging. By combining UCL and SPECT imaging in vivo, the distribution of the nanoparticles in living animals was studied; the results indicated that the particles accumulated mainly in the liver and spleen. Therefore, the concept of (153)Sm(3+)/Yb(3+)/Tm(3+) co-doped NaLuF(4) nanoparticles for UCL and SPECT dual-modality whole-body imaging in vivo may serve as a platform for next-generation probes for ultra-sensitive molecular imaging from the cellular scale to whole-body evaluation. It also introduces an easy methodology for quantifying the in vivo biodistribution of nanomaterials, which the community still needs to understand further. Copyright © 2012 Elsevier Ltd. All rights reserved.

  7. Supramolecular assembly affording a ratiometric two-photon fluorescent nanoprobe for quantitative detection and bioimaging.

    Science.gov (United States)

    Wang, Peng; Zhang, Cheng; Liu, Hong-Wen; Xiong, Mengyi; Yin, Sheng-Yan; Yang, Yue; Hu, Xiao-Xiao; Yin, Xia; Zhang, Xiao-Bing; Tan, Weihong

    2017-12-01

    Fluorescence-based quantitative analyses of vital biomolecules are in great demand in biomedical science owing to their rapid, sensitive, non-damaging, and specific detection. However, fluorescence strategies for quantitative detection are usually hard to design and achieve. Inspired by supramolecular chemistry, a two-photon-excited fluorescent supramolecular nanoplatform (TPSNP) was designed for quantitative analysis, with three parts: host molecules (β-CD polymers), a guest fluorophore as the sensing probe (Np-Ad), and a guest internal reference (NpRh-Ad). In this strategy, the TPSNP possesses the merits of (i) improved water solubility and biocompatibility; (ii) increased tissue penetration depth for bioimaging by two-photon excitation; (iii) quantitative and tunable assembly of functional guest molecules to obtain optimized detection conditions; (iv) a common approach that avoids the limitation of complicated design through adjustment of sensing probes; and (v) accurate quantitative analysis by virtue of the reference molecules. As a proof of concept, we utilized the two-photon fluorescent probe NHS-Ad-based TPSNP-1 to realize accurate quantitative analysis of hydrogen sulfide (H2S) with high sensitivity and good selectivity in live cells, deep tissues, and ex vivo-dissected organs, suggesting that the TPSNP is an ideal quantitative indicator for clinical samples. Moreover, the TPSNP will pave the way for designing and preparing advanced supramolecular sensors for biosensing and biomedicine.
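    The quantitative advantage of the ratiometric design is that normalizing the responsive emission by the internal reference cancels variations in probe concentration and excitation power; a minimal statement of the readout, as our illustration:

        % I_probe: analyte-responsive emission; I_ref: internal-reference emission.
        % The ratio R, not either channel alone, is calibrated against the
        % analyte concentration c (here H2S) and inverted for quantification.
        \[
          R = \frac{I_{\mathrm{probe}}}{I_{\mathrm{ref}}},
          \qquad
          c = f^{-1}(R) \quad \text{with calibration curve } R = f(c)
        \]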

  8. Gadolinia nanofibers as a multimodal bioimaging and potential radiation therapy agent

    Science.gov (United States)

    Grishin, A. M.; Jalalian, A.; Tsindlekht, M. I.

    2015-05-01

    Continuous bead-free C-type cubic gadolinium oxide (Gd2O3) nanofibers, 20-30 μm long and 40-100 nm in diameter, were sintered by a sol-gel calcination-assisted electrospinning technique. Dipole-dipole interaction of neighboring Gd3+ ions in nanofibers with a large length-to-diameter aspect ratio results in a kind of superparamagnetic behavior: the fibers are magnetized twice as strongly as Gd2O3 powder. Compared with commercial Gd-DTPA/Magnevist®, diethyleneglycol-coated Gd2O3 (Gd2O3-DEG) fibers show high 1/T1 and 1/T2 proton relaxivities. Intense room-temperature photoluminescence, high NMR relaxivity, and the high neutron scattering cross-section of the 157Gd nucleus make Gd2O3 fibers promising for multimodal bioimaging and neutron capture therapy.

  9. Gadolinia nanofibers as a multimodal bioimaging and potential radiation therapy agent

    Directory of Open Access Journals (Sweden)

    A. M. Grishin

    2015-05-01

    Full Text Available Continuous bead-free C-type cubic gadolinium oxide (Gd2O3) nanofibers, 20-30 μm long and 40-100 nm in diameter, were sintered by a sol-gel calcination-assisted electrospinning technique. Dipole-dipole interaction of neighboring Gd3+ ions in nanofibers with a large length-to-diameter aspect ratio results in a kind of superparamagnetic behavior: the fibers are magnetized twice as strongly as Gd2O3 powder. Compared with commercial Gd-DTPA/Magnevist®, diethyleneglycol-coated Gd2O3 (Gd2O3-DEG) fibers show high 1/T1 and 1/T2 proton relaxivities. Intense room-temperature photoluminescence, high NMR relaxivity, and the high neutron scattering cross-section of the 157Gd nucleus make Gd2O3 fibers promising for multimodal bioimaging and neutron capture therapy.

  10. Bioengineered II-VI semiconductor quantum dot-carboxymethylcellulose nanoconjugates as multifunctional fluorescent nanoprobes for bioimaging live cells

    Science.gov (United States)

    Mansur, Alexandra A. P.; Mansur, Herman S.; Mansur, Rafael L.; de Carvalho, Fernanda G.; Carvalho, Sandhra M.

    2018-01-01

    Colloidal semiconductor quantum dots (QDs) are light-emitting ultra-small nanoparticles that have emerged as a new class of nanoprobes with unique optical properties for bioimaging and biomedical diagnostics. However, for most biomedical applications, biocompatibility and water solubility are mandatory; these can be achieved through surface modification, forming QD nanoconjugates. In this study, II-VI semiconductor quantum dots of type MX (M = Cd, Pb, Zn; X = S) were synthesized directly in aqueous media at room temperature using carboxymethylcellulose sodium salt (CMC) acting simultaneously as stabilizing and surface biofunctional ligand. These nanoconjugates were extensively characterized using UV-visible spectroscopy, photoluminescence spectroscopy, X-ray photoelectron spectroscopy, Fourier transform infrared spectroscopy, X-ray diffraction, transmission electron microscopy, dynamic light scattering, and zeta potential analysis. The results demonstrated that the biopolymer was effective in nucleating and stabilizing colloidal nanocrystals of CdS, ZnS, and PbS with average diameters ranging from 2.0 to 5.0 nm, depending on the composition of the semiconductor core, which showed a quantum-size confinement effect. These QD/polysaccharide conjugates showed luminescent activity from the UV-visible to the near-infrared range of the spectrum under violet laser excitation. Moreover, bioassays proved that these novel nanoconjugates were biocompatible and behaved as composition-dependent fluorescent nanoprobes for in vitro live cell bioimaging, with very promising prospects for use in numerous biomedical applications and nanomedicine.
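    The quantum-size confinement effect mentioned here is commonly estimated with the Brus effective-mass approximation, which links the nanocrystal band gap to its radius; this is a textbook estimate, not a result from the paper.

        % E_g(R): band gap of a QD of radius R; m_e*, m_h*: effective masses;
        % eps: dielectric constant. Smaller R gives a larger gap, hence the
        % size- and composition-tunable emission of the CdS/ZnS/PbS cores.
        \[
          E_g(R) = E_g^{\mathrm{bulk}}
          + \frac{\hbar^{2}\pi^{2}}{2R^{2}}
            \left(\frac{1}{m_e^{*}} + \frac{1}{m_h^{*}}\right)
          - \frac{1.8\,e^{2}}{4\pi\varepsilon\varepsilon_{0}R}
        \]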

  11. Bis-pyridinium quadrupolar derivatives. High Stokes shift selective probes for bio-imaging

    Science.gov (United States)

    Salice, Patrizio; Versari, Silvia; Bradamante, Silvia; Meinardi, Francesco; Macchi, Giorgio; Pagani, Giorgio A.; Beverina, Luca

    2013-11-01

    We describe the design, synthesis, and characterization of five high-Stokes-shift quadrupolar heteroaryl compounds suitable as fluorescent probes in bio-imaging. In particular, we characterize the photophysical properties and the intracellular localization of each dye in Human Umbilical Vein Endothelial Cells (HUVEC) and Human Mesenchymal Stem Cells (HMSCs). We show that, among all the investigated derivatives, the 2,5-bis[1-(4-N-methylpyridinium)ethen-2-yl]-N-methylpyrrole salt is the best candidate as a selective mitochondrial tracker. Finally, we recorded the full emission spectrum of the best-performing, exclusively mitochondria-selective fluorescent probe directly from stained HUVEC cells. The emission spectrum collected from the stained mitochondria shows a markedly more pronounced vibronic structure than the emission of the free fluorophore in solution.

  12. Carbon Nanodots: Dual‐Color‐Emitting Carbon Nanodots for Multicolor Bioimaging and Optogenetic Control of Ion Channels (Adv. Sci. 11/2017)

    OpenAIRE

    Kim, Hyemin; Park, Yoonsang; Beack, Songeun; Han, Seulgi; Jung, Dooyup; Cha, Hyung Joon; Kwon, Woosung; Hahn, Sei Kwang

    2017-01-01

    Carbon nanodots (CNDs) have been widely investigated for theranostic applications including fluorescence imaging, photoacoustic imaging, photothermal therapy, and photodynamic therapy. In article number 1700325, Sei Kwang Hahn, Woosung Kwon, and co-workers develop dual-color-emitting CNDs uniquely designed by electronic structure engineering for both futuristic multicolor bioimaging and optogenetic control of ion channels.

  13. Folate receptor targeting silica nanoparticle probe for two-photon fluorescence bioimaging

    Science.gov (United States)

    Wang, Xuhua; Yao, Sheng; Ahn, Hyo-Yang; Zhang, Yuanwei; Bondar, Mykhailo V.; Torres, Joseph A.; Belfield, Kevin D.

    2010-01-01

    Narrow-dispersity organically modified silica nanoparticles (SiNPs), ~30 nm in diameter, entrapping a hydrophobic two-photon-absorbing fluorenyl dye, were synthesized by hydrolysis of triethoxyvinylsilane and (3-aminopropyl)triethoxysilane in the nonpolar core of Aerosol-OT micelles. The surface of the SiNPs was functionalized with folic acid to specifically deliver the probe to folate receptor (FR) over-expressing HeLa cells, making these folate two-photon dye-doped SiNPs potential candidates as probes for two-photon fluorescence microscopy (2PFM) bioimaging. In vitro studies using FR over-expressing HeLa cells and low-FR-expressing MG63 cells demonstrated specific cellular uptake of the functionalized nanoparticles. One-photon fluorescence microscopy (1PFM) imaging, 2PFM imaging, and two-photon fluorescence lifetime microscopy (2P-FLIM) imaging of HeLa cells incubated with the folate-modified two-photon dye-doped SiNPs were demonstrated. PMID:21258480

  14. Molecular engineering of two-photon fluorescent probes for bioimaging applications

    Science.gov (United States)

    Liu, Hong-Wen; Liu, Yongchao; Wang, Peng; Zhang, Xiao-Bing

    2017-03-01

    During the past two decades, two-photon microscopy (TPM), which utilizes two near-infrared photons as the excitation source, has emerged as a novel and attractive imaging tool for biological research. Compared with one-photon microscopy, TPM offers several advantages, such as lower background fluorescence in living cells and tissues, reduced photodamage and photobleaching of biosamples, better 3D spatial localization, and increased penetration depth. Small-molecule-based two-photon fluorescent probes have been well developed for the detection and imaging of various analytes in biological systems. In this review, we give a general introduction to the molecular engineering of two-photon fluorescent probes based on different fluorescence response mechanisms for bioimaging applications during the past decade. Inspired by the advantages of small-molecule two-photon fluorescent probes in biological imaging applications, we expect that more attention will be devoted to the development of new two-photon fluorophores and to applications of TPM in bioanalysis and disease diagnosis.

  15. Automatic DNA Diagnosis for 1D Gel Electrophoresis Images using Bio-image Processing Technique.

    Science.gov (United States)

    Intarapanich, Apichart; Kaewkamnerd, Saowaluck; Shaw, Philip J; Ukosakit, Kittipat; Tragoonrung, Somvong; Tongsima, Sissades

    2015-01-01

    DNA gel electrophoresis is a molecular biology technique for separating different sizes of DNA fragments. Applications of DNA gel electrophoresis include DNA fingerprinting (genetic diagnosis), size estimation of DNA, and DNA separation for Southern blotting. Accurate interpretation of DNA banding patterns from electrophoretic images can be laborious and error prone when a large number of bands are interrogated manually. Although many bio-imaging techniques have been proposed, none of them can fully automate the typing of DNA owing to the complexities of migration patterns typically obtained. We developed an image-processing tool that automatically calls genotypes from DNA gel electrophoresis images. The image processing workflow comprises three main steps: 1) lane segmentation, 2) extraction of DNA bands and 3) band genotyping classification. The tool was originally intended to facilitate large-scale genotyping analysis of sugarcane cultivars. We tested the proposed tool on 10 gel images (433 cultivars) obtained from polyacrylamide gel electrophoresis (PAGE) of PCR amplicons for detecting intron length polymorphisms (ILP) on one locus of the sugarcanes. These gel images demonstrated many challenges in automated lane/band segmentation in image processing including lane distortion, band deformity, high degree of noise in the background, and bands that are very close together (doublets). Using the proposed bio-imaging workflow, lanes and DNA bands contained within are properly segmented, even for adjacent bands with aberrant migration that cannot be separated by conventional techniques. The software, called GELect, automatically performs genotype calling on each lane by comparing with an all-banding reference, which was created by clustering the existing bands into the non-redundant set of reference bands. The automated genotype calling results were verified by independent manual typing by molecular biologists. This work presents an automated genotyping tool from DNA
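    As a flavor of the first step of such a workflow (lane segmentation), lane centers can be estimated from the column-wise intensity profile of the gel image, in which lanes appear as dark troughs. The sketch below illustrates the idea on synthetic data and is not the GELect implementation.

        import numpy as np
        from scipy.signal import find_peaks

        rng = np.random.default_rng(2)

        # Synthetic grayscale gel image: light background, four dark lanes.
        img = np.full((200, 300), 200.0) + rng.normal(0.0, 5.0, (200, 300))
        for center in (50, 110, 170, 230):
            img[:, center - 10:center + 10] -= 60.0

        # Column-wise mean intensity; peaks of the inverted profile mark
        # lane centers (dark troughs in the original image).
        profile = img.mean(axis=0)
        centers, _ = find_peaks(-profile, distance=30, prominence=20)
        print("estimated lane centers (columns):", centers.tolist())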

  16. Automatic DNA Diagnosis for 1D Gel Electrophoresis Images using Bio-image Processing Technique

    Science.gov (United States)

    2015-01-01

    Background DNA gel electrophoresis is a molecular biology technique for separating different sizes of DNA fragments. Applications of DNA gel electrophoresis include DNA fingerprinting (genetic diagnosis), size estimation of DNA, and DNA separation for Southern blotting. Accurate interpretation of DNA banding patterns from electrophoretic images can be laborious and error prone when a large number of bands are interrogated manually. Although many bio-imaging techniques have been proposed, none of them can fully automate the typing of DNA owing to the complexities of migration patterns typically obtained. Results We developed an image-processing tool that automatically calls genotypes from DNA gel electrophoresis images. The image processing workflow comprises three main steps: 1) lane segmentation, 2) extraction of DNA bands and 3) band genotyping classification. The tool was originally intended to facilitate large-scale genotyping analysis of sugarcane cultivars. We tested the proposed tool on 10 gel images (433 cultivars) obtained from polyacrylamide gel electrophoresis (PAGE) of PCR amplicons for detecting intron length polymorphisms (ILP) on one locus of the sugarcanes. These gel images demonstrated many challenges in automated lane/band segmentation in image processing including lane distortion, band deformity, high degree of noise in the background, and bands that are very close together (doublets). Using the proposed bio-imaging workflow, lanes and DNA bands contained within are properly segmented, even for adjacent bands with aberrant migration that cannot be separated by conventional techniques. The software, called GELect, automatically performs genotype calling on each lane by comparing with an all-banding reference, which was created by clustering the existing bands into the non-redundant set of reference bands. The automated genotype calling results were verified by independent manual typing by molecular biologists. Conclusions This work presents an

  17. Multifunctional nanocomposite based on halloysite nanotubes for efficient luminescent bioimaging and magnetic resonance imaging.

    Science.gov (United States)

    Zhou, Tao; Jia, Lei; Luo, Yi-Feng; Xu, Jun; Chen, Ru-Hua; Ge, Zhi-Jun; Ma, Tie-Liang; Chen, Hong; Zhu, Tao-Feng

A novel multifunctional halloysite nanotube (HNT)-based Fe3O4@HNT-polyethyleneimine-Tip-Eu(dibenzoylmethane)3 nanocomposite (Fe-HNT-Eu NC) with both photoluminescent and magnetic properties was fabricated by a simple one-step hydrothermal process combined with the coupling grafting method; it exhibited high suspension stability and excellent photophysical behavior. The as-prepared multifunctional Fe-HNT-Eu NC was characterized using various techniques. The results of cell viability assays, cell morphological observation, and an in vivo toxicity assay indicated that the NC exhibited excellent biocompatibility over the studied concentration range, suggesting that the obtained Fe-HNT-Eu NC is a suitable material for bioimaging and biological applications in human hepatic adenocarcinoma cells. Furthermore, the biocompatible Fe-HNT-Eu NC displayed superparamagnetic behavior with high saturation magnetization and also functioned as a magnetic resonance imaging (MRI) contrast agent in vitro and in vivo. The results of the MRI tests indicated that the Fe-HNT-Eu NC can significantly decrease the T2 signal intensity values of normal liver tissue and thus make the boundary between the normal liver and the transplanted cancer more distinct, thereby effectively improving cancer diagnosis.

  18. Gadolinia nanofibers as a multimodal bioimaging and potential radiation therapy agent

    Energy Technology Data Exchange (ETDEWEB)

    Grishin, A. M., E-mail: grishin@kth.se, E-mail: grishin@inmatech.com [KTH Royal Institute of Technology, SE-164 40 Stockholm-Kista (Sweden); INMATECH Intelligent Materials Technology, SE-127 45 Skärholmen (Sweden); Petrozavodsk State University, 185910 Petrozavodsk, Karelian Republic (Russian Federation); Jalalian, A. [KTH Royal Institute of Technology, SE-164 40 Stockholm-Kista (Sweden); INMATECH Intelligent Materials Technology, SE-127 45 Skärholmen (Sweden); Tsindlekht, M. I. [Racah Institute of Physics, Hebrew University of Jerusalem, 91904 Jerusalem (Israel)

    2015-05-15

Continuous bead-free C-type cubic gadolinium oxide (Gd{sub 2}O{sub 3}) nanofibers 20-30 μm long and 40-100 nm in diameter were sintered by a sol-gel calcination-assisted electrospinning technique. Dipole-dipole interaction of neighboring Gd{sup 3+} ions in nanofibers with a large length-to-diameter aspect ratio results in a kind of superparamagnetic behavior: the fibers are magnetized twice as strongly as Gd{sub 2}O{sub 3} powder. Compared with commercial Gd-DTPA/Magnevist{sup ®}, diethyleneglycol-coated Gd{sub 2}O{sub 3} (Gd{sub 2}O{sub 3}-DEG) fibers show high 1/T{sub 1} and 1/T{sub 2} proton relaxivities. Intense room-temperature photoluminescence, high NMR relaxivity and the high neutron scattering cross-section of the {sup 157}Gd nucleus promise to make Gd{sub 2}O{sub 3} fibers suitable for multimodal bioimaging and neutron capture therapy.

  19. CMEIAS bioimage informatics that define the landscape ecology of immature microbial biofilms developed on plant rhizoplane surfaces

    Directory of Open Access Journals (Sweden)

    Frank B Dazzo

    2015-10-01

Full Text Available Colonization of the rhizoplane habitat is an important activity that enables certain microorganisms to promote plant growth. Here we describe various types of computer-assisted microscopy that reveal important ecological insights into early microbial colonization behavior within biofilms on plant root surfaces grown in soil. Examples of the primary data are obtained by analysis of processed images of rhizoplane biofilm landscapes analyzed at single-cell resolution using the emerging technology of CMEIAS bioimage informatics software. Included are various quantitative analyses of the in situ biofilm landscape ecology of microbes during their pioneer colonization of white clover roots, and of a rhizobial biofertilizer strain colonizing rice roots, where it significantly enhances the productivity of this important crop plant. The results show that the spatial patterns of immature biofilms developed on rhizoplanes that interface rhizosphere soil are highly structured (rather than distributed randomly) when analyzed at the appropriate spatial scale, indicating that regionalized microbial cell-cell interactions and the local environment can significantly affect their cooperative and competitive colonization behaviors.

  20. System and method for making quantum dots

    KAUST Repository

    Bakr, Osman M.

    2015-05-28

    Embodiments of the present disclosure provide for methods of making quantum dots (QDs) (passivated or unpassivated) using a continuous flow process, systems for making QDs using a continuous flow process, and the like. In one or more embodiments, the QDs produced using embodiments of the present disclosure can be used in solar photovoltaic cells, bio-imaging, IR emitters, or LEDs.

  1. Nitrogen-Vacancy color center in diamond-emerging nanoscale applications in bioimaging and biosensing.

    Science.gov (United States)

    Balasubramanian, Gopalakrishnan; Lazariev, Andrii; Arumugam, Sri Ranjini; Duan, De-Wen

    2014-06-01

The Nitrogen-Vacancy (NV) color center in diamond is a flourishing research area that has displayed remarkable progress in recent years. The system offers great potential for realizing futuristic applications in nanoscience, benefiting fields ranging from bioimaging to quantum sensing. The ability to image single NV color centers in a nanodiamond and to manipulate the NV electron spin optically under ambient conditions is the main driving force behind developments in nanoscale sensing and novel imaging techniques. In this article we discuss the current status of applications of fluorescent nanodiamonds (FNDs) for optical super-resolution nanoscopy and magneto-optical (spin-assisted) sub-wavelength localization and imaging. We present emerging applications such as single-molecule spin imaging, nanoscale imaging of biomagnetic fields, and sensing of molecular fluctuations and temperatures in live cellular environments. We summarize other current advances and future prospects of NV diamond for imaging and sensing pertaining to biomedical applications. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. Mechanochemistry of Chitosan-Coated Zinc Sulfide (ZnS) Nanocrystals for Bio-imaging Applications

    Science.gov (United States)

    Bujňáková, Zdenka; Dutková, Erika; Kello, Martin; Mojžiš, Ján; Baláž, Matej; Baláž, Peter; Shpotyuk, Oleh

    2017-05-01

ZnS nanocrystals were prepared in chitosan solution (0.1 wt.%) using wet ultra-fine milling. The obtained suspension was stable, with a high zeta potential (+57 mV). Changes in the FTIR spectrum confirmed the successful surface coating of the ZnS nanoparticles by chitosan. The prepared ZnS nanocrystals possessed interesting optical properties, which were verified in vitro. Four cancer cell lines were selected (CaCo-2, HCT116, HeLa, and MCF-7), and after their treatment with the nanosuspension, the distribution of ZnS in the cells was studied using a fluorescence microscope. The particles were clearly visible; they passed through the cell membrane and accumulated in the cytosol. The biological activity of the cells was not influenced by the nanoparticles; they did not cause cell death, and only the granularity of the cells was increased as a consequence of cellular uptake. These results confirm the potential of ZnS nanocrystals for use in bio-imaging applications.

  3. Elemental bioimaging of nanosilver-coated prostheses using X-ray fluorescence spectroscopy and laser ablation-inductively coupled plasma-mass spectrometry.

    Science.gov (United States)

    Blaske, Franziska; Reifschneider, Olga; Gosheger, Georg; Wehe, Christoph A; Sperling, Michael; Karst, Uwe; Hauschild, Gregor; Höll, Steffen

    2014-01-07

    The distribution of different chemical elements from a nanosilver-coated bone implant was visualized, combining the benefits of two complementary methods for elemental bioimaging, the nondestructive micro X-ray fluorescence (μ-XRF), and laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS). Challenges caused by the physically inhomogeneous materials including bone and soft tissues were addressed by polymer embedding. With the use of μ-XRF, fast sample mapping was achieved obtaining titanium and vanadium signals from the metal implant as well as phosphorus and calcium signals representing hard bone tissue and sulfur distribution representing soft tissues. Only by the use of LA-ICP-MS, the required high sensitivity and low detection limits for the determination of silver were obtained. Metal distribution within the part of cancellous bone was revealed for silver as well as for the implant constituents titanium, vanadium, and aluminum. Furthermore, the detection of coinciding high local zirconium and aluminum signals at the implant surface indicates remaining blasting abrasive from preoperative surface treatment of the nanosilver-coated device.

  4. Thermal, optical and vibrational studies of tyrosine doped LaF3:Ce nanoparticles for bioimaging and biotagging

    Science.gov (United States)

    Singh, Amit T.

    2018-05-01

Upconversion quantum dots of tyrosine-doped LaF3:Ce nanoparticles have been synthesized by a wet chemical route. Thermal studies (TGA/DTA) confirm the crystallinity and stability of the different phases of the synthesized nanoparticles. The UV-Visible spectra show multiple absorption edges at 215.60 nm and 243.10 nm, indicating the quantum-dot nature of the synthesized nanoparticles. The PL spectra showed upconversion with a sharp emission peak at 615 nm (red colour). The FT-Raman spectra of the synthesized nanoparticles show modification of the nanoparticle surface in the form of functional and skeletal groups. The upconversion behaviour of the synthesized nanoparticles indicates their potential application in bioimaging and biotagging.

  5. Fabrication of transferrin functionalized gold nanoclusters/graphene oxide nanocomposite for turn-on near-infrared fluorescent bioimaging of cancer cells and small animals.

    Science.gov (United States)

    Wang, Yong; Chen, Jia-Tong; Yan, Xiu-Ping

    2013-02-19

    Transferrin (Tf)-functionalized gold nanoclusters (Tf-AuNCs)/graphene oxide (GO) nanocomposite (Tf-AuNCs/GO) was fabricated as a turn-on near-infrared (NIR) fluorescent probe for bioimaging cancer cells and small animals. A one-step approach was developed to prepare Tf-AuNCs via a biomineralization process with Tf as the template. Tf acted not only as a stabilizer and a reducer but also as a functional ligand for targeting the transferrin receptor (TfR). The prepared Tf-AuNCs gave intense NIR fluorescence that can avoid interference from biological media such as tissue autofluorescence and scattering light. The assembly of Tf-AuNCs and GO gave the Tf-AuNCs/GO nanocomposite, a turn-on NIR fluorescent probe with negligible background fluorescence due to the super fluorescence quenching property of GO. The NIR fluorescence of the Tf-AuNCs/GO nanocomposite was effectively restored in the presence of TfR, due to the specific interaction between Tf and TfR and the competition of TfR with the GO for the Tf in Tf-AuNCs/GO composite. The developed turn-on NIR fluorescence probe offered excellent water solubility, stability, and biocompatibility, and exhibited high specificity to TfR with negligible cytotoxicity. The probe was successfully applied for turn-on fluorescent bioimaging of cancer cells and small animals.

  6. Surface chemistry manipulation of gold nanorods preserves optical properties for bio-imaging applications

    Energy Technology Data Exchange (ETDEWEB)

    Polito, Anthony B.; Maurer-Gardner, Elizabeth I.; Hussain, Saber M., E-mail: saber.hussain@us.af.mil [Air Force Research Laboratory, Molecular Bioeffects Branch, Bioeffects Division, Human Effectiveness Directorate (United States)

    2015-12-15

Due to their anisotropic shape, gold nanorods (GNRs) possess a number of advantages for biosystem use, including enhanced surface area and tunable optical properties within the near-infrared (NIR) region. However, cetyl trimethylammonium bromide-related cytotoxicity, overall poor cellular uptake following surface chemistry modifications, and loss of NIR optical properties due to intracellular aggregation of the material together remain obstacles for nano-based biomedical GNR applications. In this article, we report that tannic acid-coated 11-mercaptoundecyl trimethylammonium bromide (MTAB) GNRs (MTAB-TA) show no significant decrease in in vitro cell viability and no significant stress activation after exposure to A549 human alveolar epithelial cells. In addition, MTAB-TA GNRs demonstrate a substantial level of cellular uptake while displaying a unique intracellular clustering pattern. This clustering pattern significantly reduces intracellular aggregation, preserving the NIR optical properties of the GNRs, vital for biomedical imaging applications. These results demonstrate how surface chemistry modifications enhance biocompatibility, allow for a higher rate of internalization with low intracellular aggregation of MTAB-TA GNRs, and identify them as prime candidates for use in nano-based bio-imaging applications.

  7. A comparison of analytical methods for detection of [14C]trichloroacetic acid-derived radioactivity in needles and branches of spruce (Picea sp.)

    International Nuclear Information System (INIS)

    Kretzschmar, M.; Matucha, M.; Uhlirova, H.

    1994-01-01

The branches (wood and needles) of spruces of varying age treated with [14C]trichloroacetic acid (3.7 GBq/mmol) were studied using the following methods. Qualitative: - conventional macroautoradiography with X-ray film and histological classification. Quantitative: - 14C combustion analysis with the sample oxidizer A 307 (Canberra/Packard) followed by measurement of radioactivity using the LS counter 6000 (Beckman Instruments); - digital autoradiography with the Digital Autoradiograph LB 286 (Berthold GmbH); - digital autoradiography with the Bio-imaging Analyzer BAS 2000 (Fuji Film Co.). (orig.)

  8. Optimization of a dedicated bio-imaging beamline at the European X-ray FEL

    Energy Technology Data Exchange (ETDEWEB)

    Geloni, Gianluca [European XFEL GmbH, Hamburg (Germany); Kocharyan, Vitali; Saldin, Evgeni [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2012-09-15

We recently proposed a basic concept for the design and layout of the undulator source for a dedicated bio-imaging beamline at the European XFEL. The goal of the optimized scheme proposed here is to enable experimental simplification and performance improvement. The core of the scheme is composed of soft and hard X-ray self-seeding setups. Based on an improved design for both monochromators, it is possible to increase the design electron energy to 17.5 GeV over the photon energy range between 2 keV and 13 keV, which is the most preferable for life-science experiments. One advantage of operating at such a high electron energy is the increase in X-ray output peak power. Another advantage is that 17.5 GeV is the preferred operation energy for SASE1 and SASE2 beamline users. Since it will be necessary to run all the XFEL lines at the same electron energy, this choice will reduce the interference with other undulator lines and increase the total amount of scheduled beam time. In this work we also study the performance of the self-seeding scheme accounting for the spatiotemporal coupling caused by the use of a single-crystal monochromator. Our analysis indicates that this distortion is easily suppressed by the right choice of diamond crystal planes, and that the proposed undulator source yields about the same performance as an X-ray seed pulse with no coupling. Simulations show that the FEL power reaches 2 TW in the 3 keV-5 keV photon energy range, which is the most preferable for single-biomolecule imaging.

  9. Optimization of a dedicated bio-imaging beamline at the European X-ray FEL

    International Nuclear Information System (INIS)

    Geloni, Gianluca; Kocharyan, Vitali; Saldin, Evgeni

    2012-09-01

We recently proposed a basic concept for the design and layout of the undulator source for a dedicated bio-imaging beamline at the European XFEL. The goal of the optimized scheme proposed here is to enable experimental simplification and performance improvement. The core of the scheme is composed of soft and hard X-ray self-seeding setups. Based on an improved design for both monochromators, it is possible to increase the design electron energy to 17.5 GeV over the photon energy range between 2 keV and 13 keV, which is the most preferable for life-science experiments. One advantage of operating at such a high electron energy is the increase in X-ray output peak power. Another advantage is that 17.5 GeV is the preferred operation energy for SASE1 and SASE2 beamline users. Since it will be necessary to run all the XFEL lines at the same electron energy, this choice will reduce the interference with other undulator lines and increase the total amount of scheduled beam time. In this work we also study the performance of the self-seeding scheme accounting for the spatiotemporal coupling caused by the use of a single-crystal monochromator. Our analysis indicates that this distortion is easily suppressed by the right choice of diamond crystal planes, and that the proposed undulator source yields about the same performance as an X-ray seed pulse with no coupling. Simulations show that the FEL power reaches 2 TW in the 3 keV-5 keV photon energy range, which is the most preferable for single-biomolecule imaging.

  10. A new simple phthalimide-based fluorescent probe for highly selective cysteine and bioimaging for living cells

    Science.gov (United States)

    Shen, Youming; Zhang, Xiangyang; Zhang, Youyu; Zhang, Chunxiang; Jin, Junling; Li, Haitao

    2017-10-01

A new turn-on phthalimide fluorescent probe has been designed and synthesized for sensing cysteine (Cys) based on an excited-state intramolecular proton transfer (ESIPT) process. It consists of a 3-hydroxyphthalimide derivative moiety as the fluorophore and an acrylic ester group as the recognition receptor. The acrylic ester acts as an ESIPT blocking agent. Upon addition of cysteine, intermolecular nucleophilic attack of cysteine on the acrylic ester releases the fluorescent 3-hydroxyphthalimide derivative, thereby enabling the ESIPT process and leading to fluorescence enhancement. The probe displays high sensitivity, excellent selectivity, and a large Stokes shift toward cysteine. The fluorescence titration was linear from 0 to 1.0 × 10−5 M, with a low detection limit (6 × 10−8 M). In addition, the probe could be used for bio-imaging in living cells.

  11. Antibiotic Conjugated Fluorescent Carbon Dots as a Theranostic Agent for Controlled Drug Release, Bioimaging, and Enhanced Antimicrobial Activity

    Directory of Open Access Journals (Sweden)

    Mukeshchand Thakur

    2014-01-01

Full Text Available A novel report on microwave-assisted synthesis of bright carbon dots (C-dots) using gum arabic (GA) and their use as a molecular vehicle to ferry ciprofloxacin hydrochloride, a broad-spectrum antibiotic, is presented in this work. Density gradient centrifugation (DGC) was used to separate different types of C-dots. After careful analysis of the fractions obtained after centrifugation, ciprofloxacin was attached to synthesize a ciprofloxacin-C-dot conjugate (Cipro@C-dots). Release of ciprofloxacin was found to be tightly regulated under physiological conditions. Cipro@C-dots were found to be biocompatible on Vero cells as compared to free ciprofloxacin (1.2 mM), even at very high concentrations. Bare C-dots (∼13 mg mL−1) were used for microbial imaging of the simplest eukaryotic model, Saccharomyces cerevisiae (yeast). Bright green fluorescence was obtained during live imaging of yeast cells under a fluorescence microscope, suggesting C-dot incorporation inside the cells. The Cipro@C-dots conjugate also showed enhanced antimicrobial activity against both model Gram-positive and Gram-negative microorganisms. Thus, the Cipro@C-dots conjugate paves the way not only for bioimaging but also for an efficient new nanocarrier for controlled drug release with high antimicrobial activity, thereby serving as a potential tool for theranostics.

  12. Turn-off fluorescence sensor for the detection of ferric ion in water using green synthesized N-doped carbon dots and its bio-imaging.

    Science.gov (United States)

    Edison, Thomas Nesakumar Jebakumar Immanuel; Atchudan, Raji; Shim, Jae-Jin; Kalimuthu, Senthilkumar; Ahn, Byeong-Cheol; Lee, Yong Rok

    2016-05-01

This paper reports a turn-off fluorescence sensor for Fe(3+) ions in water using fluorescent N-doped carbon dots as a probe. A simple and efficient hydrothermal carbonization of Prunus avium fruit extract for the synthesis of fluorescent nitrogen-doped carbon dots (N-CDs) is described. This green approach proceeds quickly and provides good-quality N-CDs. The mean size of the synthesized N-CDs was approximately 7 nm, as calculated from high-resolution transmission electron microscopy images. X-ray photoelectron spectroscopy and Fourier transform infrared spectroscopy revealed the presence of -OH, -NH2, -COOH, and -CO functional groups on the surface of the CDs. The N-CDs showed excellent fluorescence properties, emitting blue fluorescence at 411 nm upon excitation at 310 nm. The calculated quantum yield of the synthesized N-CDs is 13% against quinine sulfate as a reference fluorophore. The synthesized N-CDs were used as a fluorescent probe for the selective and sensitive detection of biologically important Fe(3+) ions in water by fluorescence spectroscopy and for bio-imaging of MDA-MB-231 cells. The limit of detection (LOD) and the Stern-Volmer quenching constant of the synthesized N-CDs for Fe(3+) ions were 0.96 μM and 2.0958 × 10(3) M(-1), respectively. The green-synthesized N-CDs thus serve as a promising candidate for the detection of Fe(3+) ions and for bio-imaging. Copyright © 2016 Elsevier B.V. All rights reserved.
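
    The two figures of merit quoted above are conventionally extracted from a fluorescence titration. The sketch below shows the standard calculation (Stern-Volmer fit F0/F = 1 + Ksv[Q]; LOD from the 3σ criterion); the numerical data in it are illustrative assumptions, not the paper's measurements.

```python
# How a Stern-Volmer constant and an LOD are conventionally derived.
# All numbers below are illustrative, not the paper's data.
import numpy as np

# Quencher (Fe3+) concentrations in M and measured fluorescence intensities
conc = np.array([0.0, 2e-6, 4e-6, 6e-6, 8e-6, 10e-6])
F    = np.array([1000.0, 995.8, 991.7, 987.6, 983.5, 979.5])

F0 = F[0]
ksv, intercept = np.polyfit(conc, F0 / F, 1)    # Stern-Volmer: F0/F = 1 + Ksv[Q]
print(f"Ksv = {ksv:.3e} M^-1")

# Limit of detection from the 3-sigma criterion
sigma_blank = 0.3                       # std. dev. of blank replicates (assumed)
slope = abs(np.polyfit(conc, F, 1)[0])  # sensitivity of F vs concentration
print(f"LOD = {3 * sigma_blank / slope:.2e} M")
```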

  13. Surface-confined fluorescence enhancement of Au nanoclusters anchoring to a two-dimensional ultrathin nanosheet toward bioimaging

    Science.gov (United States)

    Tian, Rui; Yan, Dongpeng; Li, Chunyang; Xu, Simin; Liang, Ruizheng; Guo, Lingyan; Wei, Min; Evans, David G.; Duan, Xue

    2016-05-01

Gold nanoclusters (Au NCs) as ultrasmall fluorescent nanomaterials possess discrete electronic energy and unique physicochemical properties, but suffer from relatively low quantum yield (QY) which severely affects their application in displays and imaging. To solve this conundrum and obtain highly-efficient fluorescent emission, 2D exfoliated layered double hydroxide (ELDH) nanosheets were employed to localize Au NCs with a density as high as 5.44 × 10¹³ cm⁻², by virtue of the surface confinement effect of ELDH. Both experimental studies and computational simulations testify that the excited electrons of Au NCs are strongly confined by MgAl-ELDH nanosheets, which results in a largely promoted QY as well as prolonged fluorescence lifetime (both ~7 times enhancement). In addition, the as-fabricated Au NC/ELDH hybrid material exhibits excellent imaging properties with good stability and biocompatibility in the intracellular environment. Therefore, this work provides a facile strategy to achieve highly luminescent Au NCs via surface-confined emission enhancement imposed by ultrathin inorganic nanosheets, which can be potentially used in bio-imaging and cell labelling.

  14. Multifunctional nanocomposite based on halloysite nanotubes for efficient luminescent bioimaging and magnetic resonance imaging

    Directory of Open Access Journals (Sweden)

    Zhou T

    2016-09-01

Full Text Available Tao Zhou,1 Lei Jia,1 Yi-Feng Luo,2 Jun Xu,1 Ru-Hua Chen,2 Zhi-Jun Ge,2 Tie-Liang Ma,2 Hong Chen,2 Tao-Feng Zhu2 1Department of Physics and Chemistry, Henan Polytechnic University, Jiaozuo, Henan, 2The Affiliated Yixing Hospital of Jiangsu University, Yixing, Jiangsu, People’s Republic of China Abstract: A novel multifunctional halloysite nanotube (HNT)-based Fe3O4@HNT-polyethyleneimine-Tip-Eu(dibenzoylmethane)3 nanocomposite (Fe-HNT-Eu NC) with both photoluminescent and magnetic properties was fabricated by a simple one-step hydrothermal process combined with the coupling grafting method, which exhibited high suspension stability and excellent photophysical behavior. The as-prepared multifunctional Fe-HNT-Eu NC was characterized using various techniques. The results of cell viability assay, cell morphological observation, and in vivo toxicity assay indicated that the NC exhibited excellent biocompatibility over the studied concentration range, suggesting that the obtained Fe-HNT-Eu NC was a suitable material for bioimaging and biological applications in human hepatic adenocarcinoma cells. Furthermore, the biocompatible Fe-HNT-Eu NC displayed superparamagnetic behavior with high saturation magnetization and also functioned as a magnetic resonance imaging (MRI) contrast agent in vitro and in vivo. The results of the MRI tests indicated that the Fe-HNT-Eu NC can significantly decrease the T2 signal intensity values of the normal liver tissue and thus make the boundary between the normal liver and transplanted cancer more distinct, thus effectively improving the diagnosis effect of cancers. Keywords: halloysite nanotube, lanthanide complex, iron oxide, luminescence, contrast agent

  15. A Novel Type of Aqueous Dispersible Ultrathin-Layered Double Hydroxide Nanosheets for in Vivo Bioimaging and Drug Delivery.

    Science.gov (United States)

    Yan, Li; Zhou, Mengjiao; Zhang, Xiujuan; Huang, Longbiao; Chen, Wei; Roy, Vellaisamy A L; Zhang, Wenjun; Chen, Xianfeng

    2017-10-04

Layered double hydroxide (LDH) nanoparticles have been widely used for various biomedical applications. However, because of the difficulty of surface functionalization of LDH nanoparticles, the systemic administration of these nanomaterials for in vivo therapy remains a bottleneck. In this work, we develop a novel type of aqueous dispersible two-dimensional ultrathin LDH nanosheet with a size of about 50 nm and a thickness of about 1.4 to 4 nm. We are able to covalently attach positively charged rhodamine B fluorescent molecules to the nanosheets, and the nanohybrid retains strong fluorescence in liquid and even in dry powder form, making it suitable for bioimaging. Beyond this, it is convenient to modify the nanosheets with neutral poly(ethylene glycol) (PEG), so the nanohybrid is suitable for drug delivery through systemic administration. Indeed, when these nanostructures were used to deliver a negatively charged anticancer drug, methotrexate (MTX), in a mouse model, dramatically improved therapeutic efficacy was achieved, indicated by the effective inhibition of tumor growth. Furthermore, our systematic in vivo safety investigation, including measurement of body weight, biodistribution in major organs, hematology analysis, blood biochemical assays, and hematoxylin and eosin staining, demonstrates that the new material is biocompatible. Overall, this work represents a major development in the path toward modifying functional LDH nanomaterials for clinical applications.

  16. Metal and Complementary Molecular Bioimaging in Alzheimer’s Disease

    Directory of Open Access Journals (Sweden)

    Nady eBraidy

    2014-07-01

Full Text Available Alzheimer’s disease (AD) is the leading cause of dementia in the elderly. AD represents a complex neurological disorder which is best understood as the consequence of a number of interconnected genetic and lifestyle variables, which culminate in multiple changes to brain structure and function. At a molecular level, metal dyshomeostasis is frequently observed in AD due to anomalous binding of metals such as iron (Fe), copper (Cu) and zinc (Zn), or impaired regulation of redox-active metals, which can induce the formation of cytotoxic reactive oxygen species and neuronal damage. Neuroimaging of metals in a variety of intact brain cells and tissues is emerging as an important tool for increasing our understanding of the role of metal dysregulation in AD. Several imaging techniques have been used to study the cerebral metallo-architecture in biological specimens to obtain spatially resolved data on the chemical elements present in a sample. Hyperspectral techniques, such as particle-induced X-ray emission (PIXE), energy dispersive X-ray spectroscopy (EDS), X-ray fluorescence microscopy (XFM), synchrotron X-ray fluorescence (SXRF), secondary ion mass spectrometry (SIMS), and laser ablation inductively coupled plasma mass spectrometry (LA-ICPMS), can reveal relative intensities and even semi-quantitative concentrations of a large set of elements with differing spatial resolution and detection sensitivities. Other mass spectrometric and spectroscopic imaging techniques such as laser ablation electrospray ionisation mass spectrometry (LA ESI-MS), MALDI imaging mass spectrometry (MALDI-IMS), and Fourier transform infrared spectroscopy (FTIR) can be used to correlate changes in elemental distribution with the underlying pathology in AD brain specimens. The current review aims to discuss the advantages and challenges of using these emerging elemental and molecular imaging techniques, and highlights clinical achievements in AD research using bioimaging techniques.

  17. Multimodal Theranostic Nanoformulations Permit Magnetic Resonance Bioimaging of Antiretroviral Drug Particle Tissue-Cell Biodistribution

    Science.gov (United States)

    Kevadiya, Bhavesh D.; Woldstad, Christopher; Ottemann, Brendan M.; Dash, Prasanta; Sajja, Balasrinivasa R.; Lamberty, Benjamin; Morsey, Brenda; Kocher, Ted; Dutta, Rinku; Bade, Aditya N.; Liu, Yutong; Callen, Shannon E.; Fox, Howard S.; Byrareddy, Siddappa N.; McMillan, JoEllyn M.; Bronich, Tatiana K.; Edagwa, Benson J.; Boska, Michael D.; Gendelman, Howard E.

    2018-01-01

RATIONALE: Long-acting slow effective release antiretroviral therapy (LASER ART) was developed to improve patient regimen adherence, prevent new infections, and facilitate drug delivery to human immunodeficiency virus cell and tissue reservoirs. In an effort to facilitate LASER ART development, “multimodal imaging theranostic nanoprobes” were created. These allow combined bioimaging, drug pharmacokinetics and tissue biodistribution tests in animal models. METHODS: Europium (Eu³⁺)-doped cobalt ferrite (CF) dolutegravir (DTG)-loaded (EuCF-DTG) nanoparticles were synthesized and then fully characterized based on their size, shape and stability. These were then used as platforms for nanoformulated drug biodistribution. RESULTS: Folic acid (FA) decoration of EuCF-DTG (FA-EuCF-DTG) nanoparticles facilitated macrophage targeting and sped drug entry across cell barriers. Macrophage uptake was higher for FA-EuCF-DTG than EuCF-DTG nanoparticles, with relaxivities of r2 = 546 mM⁻¹s⁻¹ and r2 = 564 mM⁻¹s⁻¹ in saline, and r2 = 850 mM⁻¹s⁻¹ and r2 = 876 mM⁻¹s⁻¹ in cells, respectively. The values were ten or more times higher than those observed for ultrasmall superparamagnetic iron oxide particles (r2 = 31.15 mM⁻¹s⁻¹ in saline) at identical iron concentrations. Drug particles were detected in macrophage Rab compartments by dual fluorescence labeling. Replicate particles elicited sustained antiretroviral responses. After parenteral injection of FA-EuCF-DTG and EuCF-DTG into rats and rhesus macaques, drug, iron and cobalt levels, measured by LC-MS/MS, magnetic resonance imaging, and ICP-MS, were concordant. CONCLUSION: We posit that these theranostic nanoprobes can assess LASER ART drug delivery and be used as part of a precision nanomedicine therapeutic strategy. PMID:29290806
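
    The relaxivities quoted above are, by convention, the slopes of the relaxation rate 1/T2 plotted against contrast-agent concentration. A minimal sketch of that fit, with assumed illustrative data rather than the study's measurements:

```python
# Transverse relaxivity r2 is the slope of 1/T2 against contrast-agent
# concentration: 1/T2 = 1/T2(0) + r2 * [Fe]. Illustrative numbers only.
import numpy as np

fe_mM = np.array([0.00, 0.05, 0.10, 0.20, 0.40])   # iron concentration (mM)
t2_ms = np.array([120.0, 34.5, 20.1, 10.9, 5.7])   # measured T2 (ms)

rate = 1.0 / (t2_ms / 1000.0)                      # 1/T2 in s^-1
r2, r0 = np.polyfit(fe_mM, rate, 1)                # linear fit
print(f"r2 = {r2:.0f} mM^-1 s^-1 (baseline 1/T2 = {r0:.1f} s^-1)")
```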

  18. Bioimaging of metallothioneins in ocular tissue sections by laser ablation-ICP-MS using bioconjugated gold nanoclusters as specific tags.

    Science.gov (United States)

    Cruz-Alonso, María; Fernandez, Beatriz; Álvarez, Lydia; González-Iglesias, Héctor; Traub, Heike; Jakubowski, Norbert; Pereiro, Rosario

    2017-12-18

An immunohistochemical method is described to visualize the distribution of metallothioneins 1/2 (MT 1/2) and metallothionein 3 (MT 3) in human ocular tissue. It makes use of (a) antibodies conjugated to gold nanoclusters (AuNCs) acting as labels, and (b) laser ablation (LA) coupled to inductively coupled plasma mass spectrometry (ICP-MS). Water-soluble fluorescent AuNCs (with an average size of 2.7 nm) were synthesized and then conjugated to the antibodies by carbodiimide coupling. The surface of the modified AuNCs was then blocked with hydroxylamine to avoid nonspecific interactions with the biological tissue. Immunoassays for MT 1/2 and MT 3 were performed in ocular tissue sections (5 μm thick) from two post mortem human donors. Imaging studies were then performed by confocal fluorescence microscopy, and LA-ICP-MS of the retina was used to measure the signal for gold. Signal amplification by the >500 gold atoms in each nanocluster allowed the antigens (MT 1/2 and MT 3) to be imaged by LA-ICP-MS using a laser spot size as small as 4 μm. The image patterns found in the retina are in good agreement with those obtained by conventional fluorescence immunohistochemistry, which was used as an established reference method. Graphical abstract: Gold nanoclusters (AuNCs) conjugated to a primary specific antibody serve as a label for amplified bioimaging of metallothioneins (MTs) by laser ablation coupled to inductively coupled plasma mass spectrometry (ICP-MS) in human ocular tissue sections.

  19. A chromogenic and fluorogenic rhodol-based chemosensor for hydrazine detection and its application in live cell bioimaging

    Science.gov (United States)

    Tiensomjitr, Khomsan; Noorat, Rattha; Chomngam, Sinchai; Wechakorn, Kanokorn; Prabpai, Samran; Kanjanasirirat, Phongthon; Pewkliang, Yongyut; Borwornpinyo, Suparerk; Kongsaeree, Palangpon

    2018-04-01

A rhodol-based fluorescent probe has been developed as a selective hydrazine chemosensor using levulinate as the recognition site. The rhodol levulinate probe (RL) demonstrated high selectivity and sensitivity toward hydrazine over other analytes. The chromogenic response of an RL solution to hydrazine, from colorless to pink, can be readily observed by the naked eye, while strong fluorescence emission can be monitored upon excitation at 525 nm. Detection proceeds via ring-opening of the spirolactone initiated by hydrazinolysis, triggering the fluorescence emission with a 53-fold enhancement. The probe reacts rapidly with hydrazine in aqueous medium, with a detection limit of 26 nM (0.83 ppb), lower than the threshold limit value (TLV) of 10 ppb suggested by the U.S. Environmental Protection Agency. Furthermore, RL-impregnated paper strips could detect hydrazine vapor. Owing to its membrane permeability, RL was also applied to bioimaging of hydrazine in live HepG2 cells by confocal fluorescence microscopy.

  20. BIRD: Bio-Image Referral Database. Design and implementation of a new web based and patient multimedia data focused system for effective medical diagnosis and therapy.

    Science.gov (United States)

    Pinciroli, Francesco; Masseroli, Marco; Acerbo, Livio A; Bonacina, Stefano; Ferrari, Roberto; Marchente, Mario

    2004-01-01

This paper presents a low-cost software platform prototype supporting health care personnel in retrieving patient referral multimedia data. This information is centralized on a server machine and structured using a flexible eXtensible Markup Language (XML) Bio-Image Referral Database (BIRD). Data are distributed on demand to requesting clients over an Intranet network and transformed via eXtensible Stylesheet Language (XSL) to be visualized in a uniform way in commonly available browsers. The core server software has been developed in the PHP Hypertext Preprocessor scripting language, which is versatile and well suited to building a dynamic Web environment.

  1. Quantitative analysis of macro-ARG using IP system

    International Nuclear Information System (INIS)

    Nakajima, Eiichi; Kawai, Kenji; Furuta, Yoshitake

    1997-01-01

Recent progress in imaging plate (IP) systems allows us to analyze autoradiographic images quantitatively. In 'whole-body autoradiography', a method that clarifies the distribution of radioisotopes or labeled compounds in the tissues and organs of a freeze-dried whole-body section of small animals such as rats and mice, the sections are pressed against an IP for exposure, the IP is scanned by a Bio-Imaging Analyzer (Fuji Photo Film Co., Ltd), and a digital autoradiographic image is obtained. Quantitative data concerning the activity in different tissues can be obtained using an isotope scale as a reference source. The fading effect, application of the IP system to receptor-binding ARG distributions, analysis of radio-spots on TLC, and radioactive concentrations in liquids such as blood are also discussed. (author)
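
    The isotope-scale step amounts to a linear calibration: standards of known activity are co-exposed on the plate, a line is fitted to their measured photostimulated luminescence (PSL), and the fit is inverted for tissue regions of interest. A minimal sketch under assumed numbers (not from the study):

```python
# Converting imaging-plate PSL signal to radioactivity with an isotope
# scale: linear calibration from co-exposed standards of known activity.
# Numbers are illustrative, not from the study.
import numpy as np

std_activity_bq = np.array([10.0, 50.0, 100.0, 500.0, 1000.0])   # standards
std_psl         = np.array([21.0, 103.0, 208.0, 1015.0, 2040.0]) # measured PSL

slope, offset = np.polyfit(std_activity_bq, std_psl, 1)

def psl_to_bq(psl):
    """Invert the calibration for a tissue region of interest."""
    return (psl - offset) / slope

print(f"liver ROI: {psl_to_bq(640.0):.0f} Bq")   # hypothetical ROI reading
```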

  2. Open source tools for fluorescent imaging.

    Science.gov (United States)

    Hamilton, Nicholas A

    2012-01-01

As microscopy becomes increasingly automated and imaging expands in the spatial and time dimensions, quantitative analysis tools for fluorescent imaging are becoming critical, both to remove bottlenecks in throughput and to fully extract and exploit the information contained in the imaging. In recent years there has been a flurry of activity in the development of bio-image analysis tools and methods, with the result that there are now many high-quality, well-documented, and well-supported open source bio-image analysis projects with large user bases that cover essentially every aspect from image capture to publication. These open source solutions are now providing a viable alternative to commercial solutions. More importantly, they are forming an interoperable and interconnected network of tools that allow data and analysis methods to be shared between many of the major projects. Just as researchers build on, transmit, and verify knowledge through publication, open source analysis methods and software are creating a foundation that can be built upon, transmitted, and verified. Here we describe many of the major projects, their capabilities, and features. We also give an overview of the current state of open source software for fluorescent microscopy analysis and the many reasons to use and develop open source methods. Copyright © 2012 Elsevier Inc. All rights reserved.
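
    As a flavor of the kind of routine quantification such open source projects automate, the following minimal sketch uses scikit-image, one project of the kind surveyed here; the input file name is hypothetical.

```python
# A minimal fluorescent-image quantification step: global Otsu threshold,
# connected-component labelling, then per-object measurements.
from skimage import filters, io, measure

img = io.imread("cells.tif", as_gray=True)    # hypothetical input image
mask = img > filters.threshold_otsu(img)      # foreground/background split
labels = measure.label(mask)                  # label connected objects

for region in measure.regionprops(labels, intensity_image=img):
    if region.area < 20:                      # skip small noise objects
        continue
    print(region.label, region.area, region.mean_intensity)
```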

  3. Fusion genetic analysis of jasmonate-signalling mutants in Arabidopsis

    DEFF Research Database (Denmark)

    Jensen, Anders Bøgh; Raventos, D.; Mundy, John Williams

    2002-01-01

    as two recessive mutants, designated joe1 and 2, that overexpress the reporter. Genetic analysis indicated that reporter overexpression in the joe mutants requires COI. joe1 responded to MeJA with increased anthocyanin accumulation, while joe2 responded with decreased root growth inhibition. In addition...... activity was also induced by the protein kinase inhibitor staurosporine and antagonized by the protein phosphatase inhibitor okadaic acid. FLUC bio-imaging, RNA gel-blot analysis and progeny analyses identified three recessive mutants that underexpress the FLUC reporter, designated jue1, 2 and 3, as well...

  4. Fluorescent pH-Sensing Probe Based on Biorefinery Wood Lignosulfonate and Its Application in Human Cancer Cell Bioimaging.

    Science.gov (United States)

    Xue, Yuyuan; Liang, Wanshan; Li, Yuan; Wu, Ying; Peng, Xinwen; Qiu, Xueqing; Liu, Jinbin; Sun, Runcang

    2016-12-28

A water-soluble, ratiometric fluorescent pH probe, L-SRhB, was synthesized by grafting spirolactam rhodamine B (SRhB) onto lignosulfonate (LS). FL-SRhB, the ring-opened product of L-SRhB, was also prepared. The pH-response experiment indicated that L-SRhB responds rapidly to pH changes from 4.60 to 6.20 with a pKa of 5.35, indicating its potential for pH detection in acidic organelles. The two probes were internalized successfully by living cells through the endocytosis pathway and could distinguish normal cells from cancer cells by their different cell-staining rates. In addition, L-SRhB showed obvious cytotoxicity to cancer cells, whereas it was nontoxic to normal cells under the same conditions, so it might also have potential in cancer therapy. L-SRhB might thus be a promising ratiometric fluorescent pH sensor and bioimaging dye for the recognition of cancer cells. The results also provide a new perspective on the high-value utilization of lignin.
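
    A probe governed by a single protonation equilibrium with pKa 5.35 should follow the Henderson-Hasselbalch relation across the reported working range. The sketch below computes the expected fraction of the fluorescent (protonated, ring-opened) form; the single-equilibrium assumption is ours, implied but not stated by the abstract.

```python
# Expected response of a pH probe with pKa 5.35 under a simple
# Henderson-Hasselbalch model (an assumption for illustration).
import numpy as np

PKA = 5.35

def open_fraction(ph):
    """Fraction of probe in its fluorescent (protonated, ring-opened) form,
    assuming a single protonation equilibrium."""
    return 1.0 / (1.0 + 10.0 ** (ph - PKA))

for ph in np.arange(4.6, 6.3, 0.4):   # the reported working range
    print(f"pH {ph:.1f}: {open_fraction(ph):.2f}")
```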

  5. Reduced background autofluorescence for cell imaging using nanodiamonds and lanthanide chelates.

    Science.gov (United States)

    Cordina, Nicole M; Sayyadi, Nima; Parker, Lindsay M; Everest-Dass, Arun; Brown, Louise J; Packer, Nicolle H

    2018-03-14

Bio-imaging is a key technique for tracking and monitoring important biological processes and fundamental biomolecular interactions; however, the interference of background autofluorescence with targeted fluorophores is problematic for many bio-imaging applications. This study reports on two novel methods for reducing interference from cellular autofluorescence in bio-imaging. The first method uses fluorescent nanodiamonds (FNDs) containing nitrogen vacancy centers. FNDs emit at near-infrared wavelengths typically longer than those of most cellular autofluorescence and, when appropriately functionalized, can be used for background-free imaging of targeted biomolecules. The second method uses europium-chelating tags with long fluorescence lifetimes, which enhance background-free imaging because cellular autofluorescence lifetimes are short. In this study, we used both methods to target E-selectin, a transmembrane glycoprotein that is activated by inflammation, and demonstrated background-free fluorescent staining in fixed endothelial cells. Our findings indicate that both FND- and europium-based staining can improve fluorescent bio-imaging capabilities by reducing competition with cellular autofluorescence. Nanodiamonds of 30 nm coated with the E-selectin antibody were found to enable the most sensitive detection of E-selectin in inflamed cells, with a 40-fold increase in detected intensity.
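
    The europium-chelate route works because lanthanide emission persists for microseconds to milliseconds while autofluorescence decays within nanoseconds, so a delayed (time-gated) acquisition keeps the label and discards the background. A toy calculation of that principle, with assumed typical lifetimes:

```python
# Toy illustration of time-gated suppression of autofluorescence:
# gating out the first microsecond removes essentially all short-lived
# background while keeping most of the long-lived lanthanide signal.
import numpy as np

tau_auto = 5e-9     # typical autofluorescence lifetime (~ns, assumed)
tau_eu   = 500e-6   # typical Eu(3+) chelate lifetime (~hundreds of us, assumed)
gate     = 1e-6     # detector opens 1 us after the excitation pulse

def surviving_fraction(tau, t_gate):
    """Fraction of an exponential decay's photons emitted after t_gate."""
    return np.exp(-t_gate / tau)

print(f"autofluorescence kept: {surviving_fraction(tau_auto, gate):.2e}")
print(f"europium signal kept:  {surviving_fraction(tau_eu, gate):.2f}")
```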

  6. Core-shell designs of photoluminescent nanodiamonds with porous silica coatings for bioimaging and drug delivery II: application.

    Science.gov (United States)

    Prabhakar, Neeraj; Näreoja, Tuomas; von Haartman, Eva; Karaman, Didem Şen; Jiang, Hua; Koho, Sami; Dolenko, Tatiana A; Hänninen, Pekka E; Vlasov, Denis I; Ralchenko, Victor G; Hosomi, Satoru; Vlasov, Igor I; Sahlgren, Cecilia; Rosenholm, Jessica M

    2013-05-07

Recent advances within materials science and its interdisciplinary applications in biomedicine have emphasized the potential of using a single multifunctional composite material for concurrent drug delivery and biomedical imaging. Here we present a novel composite material consisting of a photoluminescent nanodiamond (ND) core with a porous silica (SiO2) shell. This multifunctional probe serves as an alternative nanomaterial to address the existing problems with delivery and subsequent tracing of particles. Whereas the unique optical properties of NDs allow long-term live cell imaging and tracking of cellular processes, mesoporous silica nanoparticles (MSNs) have proven to be efficient drug carriers. The advantages of both NDs and MSNs were hereby integrated in the new composite material, ND@MSN. The optical properties provided by the ND core rendered the nanocomposite suitable for microscopy imaging in fluorescence and reflectance mode, as well as for super-resolution microscopy as a STED label, whereas the porous silica coating provided efficient intracellular delivery capacity, especially in surface-functionalized form. This study demonstrates how this novel nanomaterial can be exploited for both bioimaging and drug delivery in future theranostic applications.

  7. DiversePathsJ: diverse shortest paths for bioimage analysis.

    Science.gov (United States)

    Uhlmann, Virginie; Haubold, Carsten; Hamprecht, Fred A; Unser, Michael

    2018-02-01

    We introduce a formulation for the general task of finding diverse shortest paths between two end-points. Our approach is not linked to a specific biological problem and can be applied to a large variety of images thanks to its generic implementation as a user-friendly ImageJ/Fiji plugin. It relies on the introduction of additional layers in a Viterbi path graph, which requires slight modifications to the standard Viterbi algorithm rules. This layered graph construction allows for the specification of various constraints imposing diversity between solutions. The software allows obtaining a collection of diverse shortest paths under some user-defined constraints through a convenient and user-friendly interface. It can be used alone or be integrated into larger image analysis pipelines. http://bigwww.epfl.ch/algorithms/diversepathsj. michael.unser@epfl.ch or fred.hamprecht@iwr.uni-heidelberg.de. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
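
    For intuition, the sketch below finds several mutually diverse Viterbi paths on a cost image. It uses a simple iterative re-weighting (penalize pixels already used, then re-solve) as a stand-in for the plugin's layered-graph construction, which the abstract describes but which is more involved; it is not the DiversePathsJ algorithm itself.

```python
# Diverse shortest paths by iterative penalization -- an illustrative
# stand-in for the layered Viterbi graph used by DiversePathsJ.
import numpy as np

def viterbi_path(cost):
    """Min-cost left-to-right path that moves to the same or an adjacent
    row at each column (standard Viterbi dynamic programming)."""
    rows, cols = cost.shape
    acc = cost.astype(float).copy()
    back = np.zeros((rows, cols), dtype=int)
    for c in range(1, cols):
        for r in range(rows):
            best, best_pr = min(
                (acc[pr, c - 1], pr)
                for pr in (r - 1, r, r + 1) if 0 <= pr < rows
            )
            acc[r, c] += best
            back[r, c] = best_pr
    r = int(np.argmin(acc[:, -1]))
    path = [r]
    for c in range(cols - 1, 0, -1):   # backtrack from the last column
        r = back[r, c]
        path.append(r)
    return path[::-1]                  # one row index per column

def diverse_paths(cost, k=3, penalty=10.0):
    """k paths that avoid each other: after each solve, re-used pixels
    are made expensive so the next path is pushed elsewhere."""
    cost = cost.astype(float).copy()
    paths = []
    for _ in range(k):
        p = viterbi_path(cost)
        paths.append(p)
        for col, row in enumerate(p):
            cost[row, col] += penalty
    return paths

paths = diverse_paths(np.random.rand(20, 40), k=3)
```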

  8. Fluorescent carbon dot-gated multifunctional mesoporous silica nanocarriers for redox/enzyme dual-responsive targeted and controlled drug delivery and real-time bioimaging.

    Science.gov (United States)

    Wang, Ying; Cui, Yu; Zhao, Yating; He, Bing; Shi, Xiaoli; Di, Donghua; Zhang, Qiang; Wang, Siling

    2017-08-01

A distinctive and personalized nanocarrier is described here for controlled and targeted antitumor drug delivery and real-time bioimaging, obtained by combining a redox/enzyme dual-responsive disulfide-conjugated carbon dot with mesoporous silica nanoparticles (MSN-SS-CDHA). The carbon dot, which provides the controlling and targeting abilities, was prepared through a polymerization reaction using citric acid and HA as starting materials (named CDHA). The as-prepared MSN-SS-CDHA exhibited not only superior photostability and excellent biocompatibility, but also the ability to target A549 cells overexpressing CD44 receptors. Upon loading the antitumor drug doxorubicin (DOX) into the mesoporous channels of the MSN nanoparticles, CDHA with a diameter of 3 nm completely blocked the pore entrances of the DOX-encapsulated MSN nanoparticles (pore size about 3 nm), thus preventing premature leakage of DOX and preserving the antitumor activity until triggered by specific stimuli in the tumor environment. The results of the cell imaging and cytotoxicity studies demonstrated that the redox/enzyme dual-responsive DOX-encapsulated MSN-SS-CDHA nanoparticles can selectively deliver and control the release of DOX into tumor cells. Ex vivo fluorescence images showed a much stronger fluorescence of MSN-SS-CDHA-DOX in the tumor site than in normal tissues, greatly facilitating the accumulation of DOX in the target tissue. In contrast, its counterpart MSN-SH-DOX exhibited no or much lower tumor cytotoxicity and drug accumulation in tumor tissue. In addition, MSN-SS-CD was used as a control to investigate the targeting ability of MSN-SS-CDHA toward A549 cells; the results indicated that MSN-SS-CDHA achieved higher cellular uptake through CD44 receptor-mediated endocytosis than MSN-SS-CD in A549 cells. Such redox/enzyme dual-responsive targeted nanocarriers offer a useful strategy for achieving selective, controlled, and targeted delivery of antitumor drugs.

  9. A new "off-on" fluorescent probe for Al(3+) in aqueous solution based on rhodamine B and its application to bioimaging.

    Science.gov (United States)

    Huang, Qi; Zhang, Qingyou; Wang, Enze; Zhou, Yanmei; Qiao, Han; Pang, Lanfang; Yu, Fang

    2016-01-05

In this paper, a new fluorescent probe has been synthesized and applied as an "off-on" sensor for the detection of Al(3+) with high sensitivity and excellent selectivity in aqueous media. The sensor, named RBP, was easily prepared by a one-step reaction between rhodamine B hydrazide and pyridoxal hydrochloride. Its structure has been characterized by nuclear magnetic resonance and electrospray ionization mass spectrometry. The fluorescence intensity and absorbance of the sensor showed good linearity with the concentration of Al(3+) in the ranges of 0-12.5 μM and 8-44 μM, respectively, with detection limits of 0.23 μM and 1.90 μM. The sensor RBP was preliminarily applied to the determination of Al(3+) in water samples from the lake of Henan University and in tap water, with satisfying results. Moreover, it can be used as a bioimaging reagent for imaging Al(3+) in living cells. Copyright © 2015 Elsevier B.V. All rights reserved.

  10. Fluorescent probe based on heteroatom containing styrylcyanine: pH-sensitive properties and bioimaging in vivo

    International Nuclear Information System (INIS)

    Yang, Xiaodong; Gao, Ya; Huang, Zhibing; Chen, Xiaohui; Ke, Zhiyong; Zhao, Peiliang; Yan, Yichen; Liu, Ruiyuan; Qu, Jinqing

    2015-01-01

A novel fluorescent probe based on a heteroatom-containing styrylcyanine is synthesized. The fluorescence of the probe is bright green in basic and neutral media but dark orange in strongly acidic environments, and the response can be reversibly switched. This behavior enables it to work as a fluorescent pH sensor in the solution state and as a chemosensor for detecting acidic and basic volatile organic compounds. Analyses by NMR spectroscopy confirm that protonation or deprotonation of the pyridinyl moiety is responsible for the sensing process. In addition, fluorescence microscopic imaging of the probe in live cells and zebrafish was achieved successfully, suggesting that the probe has good cell-membrane permeability and low cytotoxicity. - Graphical abstract: A novel styrylcyanine-based fluorescent pH probe was designed and synthesized; its fluorescence is bright green in basic and neutral media but dark orange in strongly acidic environments. This behavior enables it to work as a fluorescent pH sensor in the solution state, and as a chemosensor for detecting volatile organic compounds of high acidity or basicity in the solid state. In addition, it can be used for fluorescence imaging in living cells and living organisms. - Highlights: • Bright green fluorescence was observed in basic and neutral media. • Dark orange fluorescence was found in strongly acidic environments. • Volatile organic compounds with high acidity and basicity could be detected. • Bioimaging in living cells and living organisms was achieved successfully

  11. Interfacial synthesis of polyethyleneimine-protected copper nanoclusters: Size-dependent tunable photoluminescence, pH sensor and bioimaging.

    Science.gov (United States)

    Wang, Chan; Yao, Yagang; Song, Qijun

    2016-04-01

Copper nanoclusters (CuNCs) offer excellent potential as functional biological probes due to their unique photoluminescence (PL) properties. Herein, CuNCs capped with hyperbranched polyethylenimine (PEI) were prepared by an interfacial etching approach. The resultant PEI-CuNCs exhibited good dispersion and strong fluorescence with high quantum yields (QYs, up to 7.5%), making them well suited for bioimaging. By changing the reaction temperature from 25 to 150 °C, the size of the PEI-CuNCs was varied from 1.8 to 3.5 nm, and thus tunable PL was achieved, as confirmed by transmission electron microscopy (TEM) imaging and PL spectra. In addition, the PEI-CuNCs exhibited pH-responsive absorption characteristics, changing color from colorless to blue as the pH was raised from 2.0 to 13.2, and thus could be used as a color indicator for pH detection. The PEI-CuNCs also exhibited good biocompatibility and low cytotoxicity toward 293T cells in the MTT assay. Owing to their positively charged surface, the PEI-CuNCs were able to capture DNA, and the PEI-CuNC/DNA complexes could enter cells for efficient gene expression. Armed with these attractive properties, the synthesized PEI-CuNCs are quite promising for biological applications. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. [Quantitative data analysis for live imaging of bone].

    Science.gov (United States)

    Seno, Shigeto

Because bone is a hard tissue, it has long been difficult to observe the interior of living bone. With recent progress in microscopy and fluorescent probe technology, it has become possible to observe the activities of the various cells that make up bone tissue. On the other hand, the quantitative growth of the data and the diversification and complexity of the images make quantitative analysis by visual inspection difficult, and a methodology for processing microscopic images and analyzing the resulting data has been awaited. In this article, we introduce the research field of bioimage informatics, the boundary area between biology and information science, and then outline basic image-processing technology for the quantitative analysis of live-imaging data of bone.

  13. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    Science.gov (United States)

    West, Phillip B [Idaho Falls, ID; Novascone, Stephen R [Idaho Falls, ID; Wright, Jerry P [Idaho Falls, ID

    2011-09-27

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  14. Meso-ester and carboxylic acid substituted BODIPYs with far-red and near-infrared emission for bioimaging applications

    KAUST Repository

    Ni, Yong

    2014-01-21

    A series of meso-ester-substituted BODIPY derivatives 1-6 are synthesized and characterized. In particular, dyes functionalized with oligo(ethylene glycol) ether styryl or naphthalene vinylene groups at the α positions of the BODIPY core (3-6) become partially soluble in water, and their absorptions and emissions are located in the far-red or near-infrared region. Three synthetic approaches are attempted to access the meso-carboxylic acid (COOH)-substituted BODIPYs 7 and 8 from the meso-ester-substituted BODIPYs. Two feasible synthetic routes are developed successfully, including one short route with only three steps. The meso-COOH-substituted BODIPY 7 is completely soluble in pure water, and its fluorescence maximum reaches around 650 nm with a fluorescence quantum yield of up to 15 %. Time-dependent density functional theory calculations are conducted to understand the structure-optical properties relationship, and it is revealed that the Stokes shift is dependent mainly on the geometric change from the ground state to the first excited singlet state. Furthermore, cell staining tests demonstrate that the meso-ester-substituted BODIPYs (1 and 3-6) and one of the meso-COOH-substituted BODIPYs (8) are very membrane-permeable. These features make these meso-ester- and meso-COOH-substituted BODIPY dyes attractive for bioimaging and biolabeling applications in living cells. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. COMPUTER METHODS OF GENETIC ANALYSIS.

    Directory of Open Access Journals (Sweden)

    A. L. Osipov

    2017-02-01

    Full Text Available The article describes the basic statistical methods used in the genetic analysis of human traits: segregation analysis, linkage analysis and allelic association studies. Software supporting the implementation of these methods was developed.

  16. The surface analysis methods

    International Nuclear Information System (INIS)

    Deville, J.P.

    1998-01-01

    Nowadays, there are many surface analysis methods, each having its specificity, its qualities, its constraints (for instance vacuum) and its limits. Expensive in time and investment, these methods have to be used deliberately. This article is aimed at non-specialists. It gives some elements of choice according to the information sought, the sensitivity, the constraints of use or the answer to a precise question. After recalling the fundamental principles which govern these analysis methods, based on the interaction between radiation (ultraviolet, X-ray) or particles (ions, electrons) and matter, two methods are described in more detail: Auger electron spectroscopy (AES) and X-ray photoemission spectroscopy (ESCA or XPS). Indeed, they are the most widespread methods in laboratories, the easiest to use and probably the most productive for the analysis of surfaces of industrial materials or samples subjected to treatments in aggressive media. (O.M.)

  17. STOCHASTIC METHODS IN RISK ANALYSIS

    Directory of Open Access Journals (Sweden)

    Vladimíra OSADSKÁ

    2017-06-01

    Full Text Available In this paper, we review basic stochastic methods which can be used to extend state-of-the-art deterministic analytical methods for risk analysis. We conclude that the standard deterministic analytical methods depend strongly on the practical experience and knowledge of the evaluator, and that stochastic methods should therefore be introduced. New risk analysis methods should consider the uncertainties in input values. We show how large the impact on the analysis results can be by solving a practical FMECA example with uncertainties modelled using Monte Carlo sampling.
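
    As a rough illustration of the idea (not code from the paper), the sketch below propagates input uncertainty through an FMECA risk priority number (RPN = S x O x D) by Monte Carlo sampling; the triangular distributions and the threshold of 200 are assumed purely for the example.

    ```python
    # Illustrative sketch: Monte Carlo propagation of uncertainty through
    # an FMECA risk priority number. All distributions are assumptions.
    import numpy as np

    rng = np.random.default_rng(seed=1)
    n = 100_000

    # Expert estimates as triangular(low, mode, high) on the usual 1-10 scales.
    severity   = rng.triangular(6, 7, 9, size=n)
    occurrence = rng.triangular(2, 4, 6, size=n)
    detection  = rng.triangular(3, 5, 8, size=n)

    rpn = severity * occurrence * detection

    print(f"mean RPN: {rpn.mean():.1f}")
    print(f"90% interval: [{np.percentile(rpn, 5):.1f}, {np.percentile(rpn, 95):.1f}]")
    print(f"P(RPN > 200): {(rpn > 200).mean():.3f}")  # assumed action threshold
    ```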

  18. An organic dye with very large Stokes-shift and broad tunability of fluorescence: Potential two-photon probe for bioimaging and ultra-sensitive solid-state gas sensor

    Energy Technology Data Exchange (ETDEWEB)

    He, Tingchao; Tian, Xiaoqing; Lin, Xiaodong, E-mail: linxd@szu.edu.cn, E-mail: hdsun@ntu.edu.sg [College of Physics Science and Technology, Shenzhen University, Shenzhen 518060 (China); Wang, Yue; Zhao, Xin; Sun, Handong, E-mail: linxd@szu.edu.cn, E-mail: hdsun@ntu.edu.sg [Division of Physics and Applied Physics, and Centre for Disruptive Photonic Technologies (CDPT), School of Physical and Mathematical Sciences, Nanyang Technological University, 21 Nanyang Link, Singapore 637371 (Singapore); Gao, Yang; Grimsdale, Andrew C. [School of Materials Science and Engineering, Nanyang Technological University, Singapore 639798 (Singapore)

    2016-01-04

    Light-emitting nonlinear optical molecules, especially those with large Stokes shifts and broad tunability of their emission wavelength, have attracted considerable attention for various applications including biomedical imaging and fluorescent sensors. However, most fluorescent chromophores have only limited potential for such applications due to small Stokes shifts, narrow tunability of fluorescence emissions, and small optical nonlinearity in highly polar solvents. In this work, we demonstrate that a two-photon absorbing stilbene chromophore exhibits a large two-photon absorption action cross-section (ηδ = 320 GM) in dimethylsulfoxide (DMSO) and shows broad fluorescence tunability (125 nm) by manipulating the polarity of the surrounding medium. Importantly, a very large Stokes shift of up to 227 nm is achieved in DMSO. Thanks to these features, this chromophore can be utilized as a two-photon probe for bioimaging applications and in an ultrasensitive solid-state gas detector.

  19. Denoising of Microscopy Images: A Review of the State-of-the-Art, and a New Sparsity-Based Method.

    Science.gov (United States)

    Meiniel, William; Olivo-Marin, Jean-Christophe; Angelini, Elsa D

    2018-08-01

    This paper reviews the state of the art in denoising methods for biological microscopy images and introduces a new and original sparsity-based algorithm. The proposed method combines total variation (TV) spatial regularization, enhancement of low-frequency information, and aggregation of sparse estimators, and is able to handle simple and complex types of noise (Gaussian, Poisson, and mixed) without any a priori model and with a single set of parameter values. An extended comparison is also presented that evaluates the denoising performance of thirteen state-of-the-art denoising methods (including ours) specifically designed to handle the different types of noise found in bioimaging. Quantitative and qualitative results on synthetic and real images show that the proposed method outperforms the others in the majority of the tested scenarios.
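
    For readers who want to experiment, the snippet below applies an off-the-shelf TV denoiser from scikit-image to a Poisson-corrupted microscopy image. It illustrates only the TV ingredient of the approach, not the authors' full aggregation algorithm; the weight value is an arbitrary choice.

    ```python
    # Generic TV denoising of a Poisson-noisy microscopy image (scikit-image).
    from skimage import data, util
    from skimage.restoration import denoise_tv_chambolle
    from skimage.metrics import peak_signal_noise_ratio as psnr

    image = util.img_as_float(data.cell())            # sample microscopy image
    noisy = util.random_noise(image, mode='poisson')  # Poisson noise, typical in bioimaging

    denoised = denoise_tv_chambolle(noisy, weight=0.1)

    print(f"PSNR noisy:    {psnr(image, noisy, data_range=1.0):.2f} dB")
    print(f"PSNR denoised: {psnr(image, denoised, data_range=1.0):.2f} dB")
    ```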

  20. Coloured Petri Nets: Basic Concepts, Analysis Methods and Practical Use. Vol. 2, Analysis Methods

    DEFF Research Database (Denmark)

    Jensen, Kurt

    This three-volume work presents a coherent description of the theoretical and practical aspects of coloured Petri nets (CP-nets). The second volume contains a detailed presentation of the analysis methods for CP-nets, which allow the modeller to investigate dynamic properties of CP-nets. The main ideas behind the analysis methods are described, as well as the mathematics on which they are based and how the methods are supported by computer tools. Some parts of the volume are theoretical while others are application oriented. The purpose of the volume is to teach the reader how to use the formal analysis methods, which does not require a deep understanding of the underlying mathematical theory.

  1. Methods of Multivariate Analysis

    CERN Document Server

    Rencher, Alvin C

    2012-01-01

    Praise for the Second Edition "This book is a systematic, well-written, well-organized text on multivariate analysis packed with intuition and insight . . . There is much practical wisdom in this book that is hard to find elsewhere."-IIE Transactions Filled with new and timely content, Methods of Multivariate Analysis, Third Edition provides examples and exercises based on more than sixty real data sets from a wide variety of scientific fields. It takes a "methods" approach to the subject, placing an emphasis on how students and practitioners can employ multivariate analysis in real-life situations.

  2. Multivariate analysis: models and method

    International Nuclear Information System (INIS)

    Sanz Perucha, J.

    1990-01-01

    Data treatment techniques are increasingly used as computer methods become more widely accessible. Multivariate analysis consists of a group of statistical methods applied to study objects or samples characterized by multiple variables. The final goal is decision making. The paper describes the models and methods of multivariate analysis.

  3. Analysis apparatus and method of analysis

    International Nuclear Information System (INIS)

    1976-01-01

    A continuous streaming method developed for the execution of immunoassays is described in this patent. In addition, a suitable apparatus for the method was developed, whereby magnetic particles are automatically employed for the consecutive analysis of a series of liquid samples via the RIA technique.

  4. Multivariate analysis methods in physics

    International Nuclear Information System (INIS)

    Wolter, M.

    2007-01-01

    A review of multivariate methods based on statistical training is given. Several multivariate methods useful in high-energy physics analysis are discussed. Selected examples from current research in particle physics are presented, both from on-line trigger selection and from off-line analysis. Statistical training methods are also described and some new applications are suggested.

  5. Membrane mimetic surface functionalization of nanoparticles: Methods and applications

    Science.gov (United States)

    Weingart, Jacob; Vabbilisetty, Pratima; Sun, Xue-Long

    2013-01-01

    Nanoparticles (NPs), due to their size-dependent physical and chemical properties, have shown remarkable potential for a wide range of applications over the past decades. Particularly, the biological compatibilities and functions of NPs have been extensively studied for expanding their potential in areas of biomedical application such as bioimaging, biosensing, and drug delivery. In doing so, surface functionalization of NPs by introducing synthetic ligands and/or natural biomolecules has become a critical component in regards to the overall performance of the NP system for its intended use. Among known examples of surface functionalization, the construction of an artificial cell membrane structure, based on phospholipids, has proven effective in enhancing biocompatibility and has become a viable alternative to more traditional modifications, such as direct polymer conjugation. Furthermore, certain bioactive molecules can be immobilized onto the surface of phospholipid platforms to generate displays more reminiscent of cellular surface components. Thus, NPs with membrane-mimetic displays have found use in a range of bioimaging, biosensing, and drug delivery applications. This review herein describes recent advances in the preparations and characterization of integrated functional NPs covered by artificial cell membrane structures and their use in various biomedical applications. PMID:23688632

  6. Basic methods of isotope analysis

    International Nuclear Information System (INIS)

    Ochkin, A.V.; Rozenkevich, M.B.

    2000-01-01

    The bases of the most widely applied methods of isotope analysis are briefly presented. The possibilities and analytical characteristics of mass-spectrometric, spectral, radiochemical and special methods of isotope analysis, including the application of magnetic resonance, chromatography and refractometry, are considered.

  7. Gravimetric and titrimetric methods of analysis

    International Nuclear Information System (INIS)

    Rives, R.D.; Bruks, R.R.

    1983-01-01

    Gravimetric and titrimetric methods of analysis are considered. Methods of complexometric titration are mentioned, as well as methods of increasing sensitivity in titrimetry. Gravimetry and titrimetry are applied during analysis for traces of geological materials

  8. Phase transformation and spectroscopic adjustment of Gd{sub 2}O{sub 3}:Eu{sup 3+} synthesized by hydrothermal method

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Zijun; Wang, Pei; Zhong, Jiuping, E-mail: zhongjp@mail.sysu.edu.cn; Liang, Hongbin; Wang, Jing

    2014-08-01

    The microcrystalline Gd{sub 2}O{sub 3}:Eu{sup 3+} phosphors were synthesized by the hydrothermal method with a post-annealing treatment. Powder X-ray diffraction (XRD) indicated that the phase transformation from cubic to monoclinic occurred at about 1673 K. The morphologies and sizes were characterized by scanning electron microscopy (SEM). It was found that the morphology of Gd{sub 2}O{sub 3}:Eu{sup 3+} changed from nanorod to microparticle as the phase changed from cubic to monoclinic. In order to evaluate the effects of sites and phases on the luminescence behavior, the photoluminescence (PL) properties of both phases were investigated. Dominant red emission was observed due to efficient energy transfer among the sites as well as the strong excitation of the O{sup 2−}–Eu{sup 3+} charge transfer band. It was calculated that the monoclinic structure has a higher degree of distortion. More importantly, the phase transformation resulted in a red shift of the strongest emission peak of Eu{sup 3+} from 610.5 to 622.5 nm, closer to the optical transmission window for bioimaging. - Highlights: • Raising the annealing temperature induces a phase transformation from cubic to monoclinic. • Different phases and sites lead to distinct photoluminescence properties. • The monoclinic structure is calculated to have a higher degree of distortion. • The monoclinic phase, emitting at a longer wavelength, is proposed for bioimaging.

  9. Moyer's method of mixed dentition analysis: a meta-analysis ...

    African Journals Online (AJOL)

    The applicability of tables derived from the data Moyer used to other ethnic groups has ... This implies that Moyer's method of prediction may have population variations. ... Key Words: meta-analysis, mixed dentition analysis, Moyer's method

  10. β-Ga2O3:Cr(3+) nanoparticle: A new platform with near infrared photoluminescence for drug targeting delivery and bio-imaging simultaneously.

    Science.gov (United States)

    Wang, Xin-Shi; Situ, Jun-Qing; Ying, Xiao-Ying; Chen, Hui; Pan, Hua-fei; Jin, Yi; Du, Yong-Zhong

    2015-08-01

    Multifunctional nanoparticles which integrate therapeutic agents and bio-imaging agents into one carrier are emerging as a promising therapeutic platform. Herein, GaOOH:Cr(3+) was first synthesized using an improved hydrothermal method (atmospheric pressure, 95 °C), and by manipulating the pH of the reaction medium, GaOOH:Cr(3+) particles of different sizes (125.70 nm, 200.60 nm and 313.90 nm) were obtained. β-Ga2O3:Cr(3+) nanoparticles with porous structures were then developed by calcination of the GaOOH:Cr(3+). The fabricated porous β-Ga2O3:Cr(3+) nanoparticles could effectively absorb doxorubicin hydrochloride (DOX) (loading rate: approximately 8%) and showed near-infrared photoluminescence with a 695 nm emission. Furthermore, the β-Ga2O3:Cr(3+) nanoparticles were coated with l-Cys-modified hyaluronic acid (HA-Cys), exploiting the electrostatic interaction and the cross-linking effect of disulfide bonds to improve stability. The DOX-loaded, HA-Cys-coated β-Ga2O3:Cr(3+) nanoparticles (HA/β-Ga2O3:Cr(3+)/DOX) showed redox-sensitive drug release behavior. The HA-Cys-coated β-Ga2O3:Cr(3+) nanoparticles showed low cytotoxicity toward MCF-7 and HeLa cell lines. Cellular uptake studies of HA/β-Ga2O3:Cr(3+)/DOX, using the near-infrared photoluminescence of the β-Ga2O3:Cr(3+) nanoparticles and the fluorescence of DOX, demonstrated that HA/β-Ga2O3:Cr(3+)/DOX could be internalized into tumor cells quickly, a process affected by the size and shape of the β-Ga2O3:Cr(3+) nanoparticles. Copyright © 2015 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.

  11. Seismic design and analysis methods

    International Nuclear Information System (INIS)

    Varpasuo, P.

    1993-01-01

    Seismic load is, in many areas of the world, the most important loading condition from the point of view of structural strength. Taking this into account, it is understandable that there has been a strong allocation of resources to seismic analysis during the past ten years. This study has three focal areas: (1) random vibrations; (2) soil-structure interaction; and (3) methods for determining structural response. The solution of random vibration problems is clarified with the aid of applications, and from the point of view of mathematical treatment and formulation it is deemed sufficient to give the relevant sources. In the soil-structure interaction analysis the focus has been the significance of frequency-dependent impedance functions. It was found that describing the soil with frequency-dependent impedance functions decreases the structural response, and this is thus always the preferred method when compared to more conservative analysis types. Of the methods for determining the structural response, the following four were tested: (1) the time history method; (2) the complex frequency-response method; (3) the response spectrum method; and (4) the equivalent static force method. The time history method appeared to be the most accurate, and the complex frequency-response method had the widest area of application. (orig.). (14 refs., 35 figs.)

  12. Constructing an Intelligent Patent Network Analysis Method

    Directory of Open Access Journals (Sweden)

    Chao-Chan Wu

    2012-11-01

    Full Text Available Patent network analysis, an advanced method of patent analysis, is a useful tool for technology management. This method visually displays all the relationships among the patents and enables the analysts to intuitively comprehend the overview of a set of patents in the field of the technology being studied. Although patent network analysis possesses relative advantages different from traditional methods of patent analysis, it is subject to several crucial limitations. To overcome the drawbacks of the current method, this study proposes a novel patent analysis method, called the intelligent patent network analysis method, to make a visual network with great precision. Based on artificial intelligence techniques, the proposed method provides an automated procedure for searching patent documents, extracting patent keywords, and determining the weight of each patent keyword in order to generate a sophisticated visualization of the patent network. This study proposes a detailed procedure for generating an intelligent patent network that is helpful for improving the efficiency and quality of patent analysis. Furthermore, patents in the field of Carbon Nanotube Backlight Unit (CNT-BLU were analyzed to verify the utility of the proposed method.

  13. Analysis of Precision of Activation Analysis Method

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Nørgaard, K.

    1973-01-01

    The precision of an activation-analysis method prescribes the estimation of the precision of a single analytical result. The adequacy of these estimates to account for the observed variation between duplicate results from the analysis of different samples and materials, is tested by the statistic T...

  14. Trial Sequential Methods for Meta-Analysis

    Science.gov (United States)

    Kulinskaya, Elena; Wood, John

    2014-01-01

    Statistical methods for sequential meta-analysis have applications also for the design of new trials. Existing methods are based on group sequential methods developed for single trials and start with the calculation of a required information size. This works satisfactorily within the framework of fixed effects meta-analysis, but conceptual…

  15. ImagePy: an open-source, Python-based and platform-independent software package for bioimage analysis.

    Science.gov (United States)

    Wang, Anliang; Yan, Xiaolong; Wei, Zhijun

    2018-04-27

    This note presents the design of a scalable software package named ImagePy for analysing biological images. Our contribution concentrates on facilitating the extensibility and interoperability of the software by decoupling the data model from the user interface. Especially with assistance from the Python ecosystem, this software framework makes modern computer algorithms easier to apply in bioimage analysis. ImagePy is free and open source software, with documentation and code available at https://github.com/Image-Py/imagepy under the BSD license. It has been tested on the Windows, Mac and Linux operating systems. wzjdlut@dlut.edu.cn or yxdragon@imagepy.org.
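
    To give a flavour of the kind of pipeline such a package wraps behind its user interface, here is a plain scikit-image sketch of a simple segmentation-and-measurement workflow. It deliberately does not use ImagePy's own API, whose details are not given here; the sample image and the size threshold are arbitrary.

    ```python
    # Generic bioimage workflow: threshold, clean up, label, measure.
    import numpy as np
    from skimage import data, filters, measure, morphology

    image = data.coins()                      # stand-in for a microscopy image
    thresh = filters.threshold_otsu(image)    # global Otsu threshold
    mask = morphology.remove_small_objects(image > thresh, min_size=64)

    labels = measure.label(mask)              # connected-component labelling
    props = measure.regionprops(labels)
    print(f"{labels.max()} objects; mean area = "
          f"{np.mean([p.area for p in props]):.1f} px")
    ```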

  16. The surface analysis methods; Les méthodes d'analyse des surfaces

    Energy Technology Data Exchange (ETDEWEB)

    Deville, J.P. [Institut de Physique et Chimie, 67 - Strasbourg (France)

    1998-11-01

    Nowadays, there are many surface analysis methods, each having its specificity, its qualities, its constraints (for instance vacuum) and its limits. Expensive in time and investment, these methods have to be used deliberately. This article is aimed at non-specialists. It gives some elements of choice according to the information sought, the sensitivity, the constraints of use or the answer to a precise question. After recalling the fundamental principles which govern these analysis methods, based on the interaction between radiation (ultraviolet, X-ray) or particles (ions, electrons) and matter, two methods are described in more detail: Auger electron spectroscopy (AES) and X-ray photoemission spectroscopy (ESCA or XPS). Indeed, they are the most widespread methods in laboratories, the easiest to use and probably the most productive for the analysis of surfaces of industrial materials or samples subjected to treatments in aggressive media. (O.M.) 11 refs.

  17. The Use of Object-Oriented Analysis Methods in Surety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Craft, Richard L.; Funkhouser, Donald R.; Wyss, Gregory D.

    1999-05-01

    Object-oriented analysis methods have been used in the computer science arena for a number of years to model the behavior of computer-based systems. This report documents how such methods can be applied to surety analysis. By embodying the causality and behavior of a system in a common object-oriented analysis model, surety analysts can make the assumptions that underlie their models explicit and thus better communicate with system designers. Furthermore, given minor extensions to traditional object-oriented analysis methods, it is possible to automatically derive a wide variety of traditional risk and reliability analysis methods from a single common object model. Automatic model extraction helps ensure consistency among analyses and enables the surety analyst to examine a system from a wider variety of viewpoints in a shorter period of time. Thus it provides a deeper understanding of a system's behaviors and surety requirements. This report documents the underlying philosophy behind the common object model representation, the methods by which such common object models can be constructed, and the rules required to interrogate the common object model for derivation of traditional risk and reliability analysis models. The methodology is demonstrated in an extensive example problem.

  18. Nanostructure materials for biosensing and bioimaging applications

    Science.gov (United States)

    Law, Wing Cheung

    not fully understand, three possible factors are concluded after systematic research: (i) an increase of the absolute mass in each binding event, (ii) an increase in the bulk refractive index of the analyte, and (iii) coupling between the localized surface plasmon resonance (LSPR) of metallic nanoparticles and the surface plasmon resonance (SPR) of the sensing film. Indeed, the role of plasmonic coupling in sensitivity enhancement is still an open question. In order to obtain a better understanding of this phenomenon, extended studies were performed at the end of part I to investigate how the LSPR properties of metallic nanoparticle labels correlate with the enhancement factor. For this purpose, gold nanorods (Au-NRs) were chosen as the amplification labels because of the easy tunability of the LSPR peak of Au-NRs. After reading the "Results and Discussion" section, readers will have a better understanding of the "plasmonic coupling" between the sensing film and the metallic labels under a suitable operating laser source. In the second part of the thesis, the bioimaging part, the application of nanostructured materials in live cancer cell imaging and small animal imaging is demonstrated. Different types of imaging techniques are available in laboratories and clinics: optical imaging, computed tomography (CT), magnetic resonance imaging (MRI), positron emission tomography (PET), thermography and ultrasound imaging. Although such imaging techniques have been well developed and used for over a decade, improving the sensitivity, enhancing the contrast, decreasing the acquisition time and reducing the toxicity of the contrast agent are highly desirable. For optical imaging, scientists discovered that the use of near-infrared fluorescent materials can assist the surgeon in locating the tumor, the nerve and the lymph node more accurately. For CT scans, the use of Au-NRs as the contrast agent can improve the sensitivity. Iron oxide nanoparticle or gadolinium ion containing

  19. Text analysis methods, text analysis apparatuses, and articles of manufacture

    Science.gov (United States)

    Whitney, Paul D; Willse, Alan R; Lopresti, Charles A; White, Amanda M

    2014-10-28

    Text analysis methods, text analysis apparatuses, and articles of manufacture are described according to some aspects. In one aspect, a text analysis method includes accessing information indicative of data content of a collection of text comprising a plurality of different topics, using a computing device, analyzing the information indicative of the data content, and using results of the analysis, identifying a presence of a new topic in the collection of text.

  20. Methods for Risk Analysis

    International Nuclear Information System (INIS)

    Alverbro, Karin

    2010-01-01

    Many decision-making situations today affect both humans and the environment. In practice, many such decisions are made without an overall view, prioritising one or the other of the two areas. Now and then these two areas of regulation come into conflict; e.g. the best alternative as regards environmental considerations is not always the best from a human safety perspective, and vice versa. This report was prepared within a major project with the aim of developing a framework in which both the environmental aspects and the human safety aspects are integrated, and decisions can be made taking both fields into consideration. The safety risks have to be analysed in order to be successfully avoided, and one way of doing this is to use different kinds of risk analysis methods. There is an abundance of existing methods to choose from, and new methods are constantly being developed. This report describes some of the risk analysis methods currently available for analysing safety and examines the relationships between them. The focus here is mainly on human safety aspects.

  1. Excitation methods for energy dispersive analysis

    International Nuclear Information System (INIS)

    Jaklevic, J.M.

    1976-01-01

    The rapid development in recent years of energy dispersive x-ray fluorescence analysis has been based primarily on improvements in semiconductor detector x-ray spectrometers. However, the whole analysis system performance is critically dependent on the availability of optimum methods of excitation for the characteristic x rays in specimens. A number of analysis facilities based on various methods of excitation have been developed over the past few years. A discussion is given of the features of various excitation methods including charged particles, monochromatic photons, and broad-energy band photons. The effects of the excitation method on background and sensitivity are discussed from both theoretical and experimental viewpoints. Recent developments such as pulsed excitation and polarized photons are also discussed

  2. Development of advanced MCR task analysis methods

    International Nuclear Information System (INIS)

    Na, J. C.; Park, J. H.; Lee, S. K.; Kim, J. K.; Kim, E. S.; Cho, S. B.; Kang, J. S.

    2008-07-01

    This report describes a task analysis methodology for advanced HSI designs. Task analysis was performed using procedure-based hierarchical task analysis and task decomposition methods, and the results were recorded in a database. Using the task analysis results, we developed a static prototype of the advanced HSI, together with human factors engineering verification and validation methods for evaluating the prototype. In addition to the procedure-based task analysis methods, workload estimation based on the analysis of task performance time, as well as analyses for the design of information structures and interaction structures, will be necessary.

  3. Conceptual design of an undulator system for a dedicated bio-imaging beamline at the European X-ray FEL

    Energy Technology Data Exchange (ETDEWEB)

    Geloni, Gianluca [European XFEL GmbH, Hamburg (Germany); Kocharyan, Vitali; Saldin, Evgeni [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2012-05-15

    We describe a possible future upgrade of the European XFEL consisting in the construction of an undulator beamline dedicated to life science experiments. The availability of free undulator tunnels at the European XFEL facility offers a unique opportunity to build a beamline optimized for coherent diffraction imaging of complex molecules, like proteins and other biologically interesting structures. Crucial parameters for such a bio-imaging beamline are photon energy range, peak power, and pulse duration. The key component of the setup is the undulator source. The peak power is maximized in the photon energy range between 3 keV and 13 keV by the use of a very efficient combination of self-seeding, fresh bunch and tapered undulator techniques. The unique combination of ultra-high peak power of 1 TW in the entire energy range and ultrashort pulse duration tunable from 2 fs to 10 fs would allow for single-shot coherent imaging of protein molecules with sizes larger than 10 nm. Also, the new beamline would enable imaging of large biological structures in the water window, between 0.3 keV and 0.4 keV. In order to make use of standardized components, at present we favor the use of SASE3-type undulator segments. The number of segments, 40, is determined by the tapered length for the design output power of 1 TW. The present plan assumes the use of a nominal electron bunch with a charge of 0.1 nC. Experiments will be performed without interference with the other three undulator beamlines. Therefore, the total amount of scheduled beam time per year is expected to be up to 4000 hours.

  4. Parametric Methods for Order Tracking Analysis

    DEFF Research Database (Denmark)

    Nielsen, Jesper Kjær; Jensen, Tobias Lindstrøm

    2017-01-01

    Order tracking analysis is often used to find the critical speeds at which structural resonances are excited by a rotating machine. Typically, order tracking analysis is performed via non-parametric methods. In this report, however, we demonstrate some of the advantages of using a parametric method...

  5. Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture

    Science.gov (United States)

    Sanfilippo, Antonio P [Richland, WA; Cowell, Andrew J [Kennewick, WA; Gregory, Michelle L [Richland, WA; Baddeley, Robert L [Richland, WA; Paulson, Patrick R [Pasco, WA; Tratz, Stephen C [Richland, WA; Hohimer, Ryan E [West Richland, WA

    2012-03-20

    Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture are described according to some aspects. In one aspect, a hypothesis analysis method includes providing a hypothesis, providing an indicator which at least one of supports and refutes the hypothesis, using the indicator, associating evidence with the hypothesis, weighting the association of the evidence with the hypothesis, and using the weighting, providing information regarding the accuracy of the hypothesis.

  6. Investigation on method of elasto-plastic analysis for piping system (benchmark analysis)

    International Nuclear Information System (INIS)

    Kabaya, Takuro; Kojima, Nobuyuki; Arai, Masashi

    2015-01-01

    This paper provides a method of elasto-plastic analysis for the practical seismic design of nuclear piping systems. JSME started a task to establish such a method, and benchmark analyses have been performed within the task to investigate candidate methods; our company has participated in these benchmark analyses. As a result, we have settled on a method which accurately simulates the results of piping excitation tests. The recommended method of elasto-plastic analysis is as follows: 1) The elasto-plastic analysis is composed of a dynamic analysis of the piping system modelled with beam elements and a static analysis of the deformed elbow modelled with shell elements. 2) A bilinear elasto-plastic property is applied: the yield point is the standardized yield point multiplied by 1.2, the second gradient is 1/100 of Young's modulus, and kinematic hardening is used as the hardening rule. 3) The fatigue life is evaluated from the strain ranges obtained by the elasto-plastic analysis, using the rainflow method and the fatigue curve of previous studies. (author)
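
    The bilinear property in item 2 can be written down directly. The sketch below assumes monotonic loading (so the kinematic hardening history is not tracked) and uses illustrative material values; it is not the paper's analysis code.

    ```python
    # Bilinear stress-strain curve: yield at 1.2 x standardized yield point,
    # post-yield slope of E/100, as described in item 2 above.
    def bilinear_stress(strain: float, E: float, sy_nominal: float) -> float:
        sy = 1.2 * sy_nominal      # standardized yield point x 1.2
        E2 = E / 100.0             # second gradient: Young's modulus / 100
        eps_y = sy / E             # strain at yield
        if abs(strain) <= eps_y:
            return E * strain      # elastic branch
        sign = 1.0 if strain > 0 else -1.0
        return sign * (sy + E2 * (abs(strain) - eps_y))  # plastic branch

    # Example with assumed values: E = 200 GPa, nominal yield 250 MPa (MPa units)
    print(bilinear_stress(0.004, E=200e3, sy_nominal=250.0))  # ~305 MPa
    ```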

  7. Scope-Based Method Cache Analysis

    DEFF Research Database (Denmark)

    Huber, Benedikt; Hepp, Stefan; Schoeberl, Martin

    2014-01-01

    The quest for time-predictable systems has led to the exploration of new hardware architectures that simplify analysis and reasoning in the temporal domain, while still providing competitive performance. For the instruction memory, the method cache is a conceptually attractive solution.

  8. Methods for RNA Analysis

    DEFF Research Database (Denmark)

    Olivarius, Signe

    Aiming to facilitate RNA analysis, this thesis introduces proteomics- as well as transcriptomics-based methods for the functional characterization of RNA. Since RNAs rely on interactions with proteins, the establishment of protein-binding profiles is essential for the characterization of RNAs. For analysis of the transcriptome, 5’ end capture of RNA is combined with next-generation sequencing for high-throughput quantitative assessment of transcription start sites by two different methods. The methods presented here allow for functional investigation of coding as well as noncoding RNA and contribute to future...

  9. Computational methods in power system analysis

    CERN Document Server

    Idema, Reijer

    2014-01-01

    This book treats state-of-the-art computational methods for power flow studies and contingency analysis. In the first part the authors present the relevant computational methods and mathematical concepts. In the second part, power flow and contingency analysis are treated. Furthermore, traditional methods to solve such problems are compared to modern solvers, developed using the knowledge of the first part of the book. Finally, these solvers are analyzed both theoretically and experimentally, clearly showing the benefits of the modern approach.

  10. comparison of elastic-plastic FE method and engineering method for RPV fracture mechanics analysis

    International Nuclear Information System (INIS)

    Sun Yingxue; Zheng Bin; Zhang Fenggang

    2009-01-01

    This paper describes the FE analysis of elastic-plastic fracture mechanics for a crack in the RPV beltline using the ABAQUS code. The stress intensity factor and J-integral of the crack under PTS transients were calculated and evaluated. The results are also compared with those obtained by an engineering analysis method. It is shown that the results of the engineering analysis method are somewhat larger than those of the 3D elastic-plastic fracture mechanics FE analysis; the engineering analysis method is thus more conservative than the elastic-plastic fracture mechanics method. (authors)

  11. Methods for seismic analysis of nuclear power plants

    International Nuclear Information System (INIS)

    Gantenbein, F.

    1990-01-01

    The seismic analysis of a complex structure, such as a nuclear power plant, is done in several steps, and an overview of the methods used in each of these steps is given. First, seismic analysis of the buildings, taking into account structures with significant mass or stiffness: the input to the building analysis, called the ground motion, is described by an accelerogram or a response spectrum. In this step, soil-structure interaction has to be taken into account; various methods are available (impedance, finite element). The response of the structure can be calculated by the spectral method or by time history analysis; the advantages and limitations of each method are shown. Second, calculation of the floor response spectra, which are the input data for the equipment analysis; methods to calculate these spectra are described. Third, seismic analysis of the equipment; methods for both monosupported and multisupported equipment are presented. In addition, methods to analyse equipment with nonlinearities associated with the boundary conditions, such as impacts and sliding, are presented. (author). 30 refs, 15 figs
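
    As an illustration of the response spectrum idea mentioned above, the sketch below sweeps single-degree-of-freedom oscillators over frequency against a toy accelerogram and records the peak pseudo-acceleration of each; the input signal, damping ratio and frequency grid are invented for the example.

    ```python
    # Response spectrum sketch: peak SDOF response vs oscillator frequency.
    import numpy as np
    from scipy.signal import lsim, lti

    dt = 0.005
    t = np.arange(0, 20, dt)
    rng = np.random.default_rng(0)
    a_g = rng.standard_normal(t.size) * np.exp(-0.2 * t)  # toy accelerogram

    zeta = 0.05                              # assumed 5% damping
    freqs = np.linspace(0.5, 25, 50)         # Hz
    spectrum = []
    for f in freqs:
        w = 2 * np.pi * f
        # relative displacement: x'' + 2*zeta*w*x' + w^2*x = -a_g
        sys = lti([-1.0], [1.0, 2 * zeta * w, w * w])
        _, x, _ = lsim(sys, U=a_g, T=t)
        spectrum.append(w * w * np.max(np.abs(x)))  # pseudo-acceleration

    print(f"peak Sa = {max(spectrum):.2f} at {freqs[int(np.argmax(spectrum))]:.1f} Hz")
    ```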

  12. Agar/gelatin bilayer gel matrix fabricated by simple thermo-responsive sol-gel transition method.

    Science.gov (United States)

    Wang, Yifeng; Dong, Meng; Guo, Mengmeng; Wang, Xia; Zhou, Jing; Lei, Jian; Guo, Chuanhang; Qin, Chaoran

    2017-08-01

    We present a simple and environmentally-friendly method to generate an agar/gelatin bilayer gel matrix for further biomedical applications. In this method, the thermally responsive sol-gel transitions of agar and gelatin combined with the different transition temperatures are exquisitely employed to fabricate the agar/gelatin bilayer gel matrix and achieve separate loading for various materials (e.g., drugs, fluorescent materials, and nanoparticles). Importantly, the resulting bilayer gel matrix provides two different biopolymer environments (a polysaccharide environment vs a protein environment) with a well-defined border, which allows the loaded materials in different layers to retain their original properties (e.g., magnetism and fluorescence) and reduce mutual interference. In addition, the loaded materials in the bilayer gel matrix exhibit an interesting release behavior under the control of thermal stimuli. Consequently, the resulting agar/gelatin bilayer gel matrix is a promising candidate for biomedical applications in drug delivery, controlled release, fluorescence labeling, and bio-imaging. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Dimensionality Reduction Methods: Comparative Analysis of methods PCA, PPCA and KPCA

    Directory of Open Access Journals (Sweden)

    Jorge Arroyo-Hernández

    2016-01-01

    Full Text Available Dimensionality reduction methods are algorithms that map a data set into subspaces of fewer dimensions derived from the original space, allowing a description of the data at lower cost. Due to their importance, they are widely used in machine learning processes. This article presents a comparative analysis of the PCA, PPCA and KPCA dimensionality reduction methods. A reconstruction experiment with worm-shape data was performed using landmark structures located on the body contour, with each method using different numbers of principal components. The results showed that all the methods can be seen as alternative processes. Nevertheless, thanks to its potential for analysis in the feature space and the method presented for calculating its preimage, KPCA offers a better approach for recognition processes and pattern extraction.
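
    A quick way to reproduce the spirit of this comparison is with scikit-learn's PCA and KernelPCA, whose fit_inverse_transform option learns exactly the kind of preimage map discussed above. The synthetic data and parameter choices below are ours, not the authors'.

    ```python
    # Linear PCA vs kernel PCA reconstruction on nonlinear toy data.
    import numpy as np
    from sklearn.datasets import make_circles
    from sklearn.decomposition import PCA, KernelPCA

    X, _ = make_circles(n_samples=400, factor=0.3, noise=0.05, random_state=0)

    pca = PCA(n_components=1).fit(X)
    X_pca = pca.inverse_transform(pca.transform(X))

    # fit_inverse_transform=True learns the preimage map for KPCA
    kpca = KernelPCA(n_components=1, kernel='rbf', gamma=10,
                     fit_inverse_transform=True).fit(X)
    X_kpca = kpca.inverse_transform(kpca.transform(X))

    print("PCA reconstruction error: ", np.mean((X - X_pca) ** 2))
    print("KPCA reconstruction error:", np.mean((X - X_kpca) ** 2))
    ```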

  14. Instrumental neutron activation analysis as a routine method for rock analysis

    International Nuclear Information System (INIS)

    Rosenberg, R.J.

    1977-06-01

    Instrumental neutron activation methods for the analysis of geological samples have been developed. Special emphasis has been laid on improving sensitivity and accuracy in order to maximize the quality of the analyses. Furthermore, the procedures have been automated as far as possible in order to minimize the cost of the analysis. A short review of the basic literature is given, followed by a description of the principles of the method. All aspects concerning sensitivity are discussed thoroughly with a view to the analyst's possibilities of influencing them. Experimentally determined detection limits for Na, Al, K, Ca, Sc, Cr, Ti, V, Mn, Fe, Ni, Co, Rb, Zr, Sb, Cs, Ba, La, Ce, Nd, Sm, Eu, Gd, Tb, Dy, Yb, Lu, Hf, Ta, Th and U are given. The errors of the method are discussed, followed by the actions taken to avoid them. The most significant error was caused by flux deviation, but this was avoided by building a rotating sample holder to rotate the samples during irradiation. A scheme for the INAA of 32 elements is proposed. The method has been automated as far as possible, and an automatic γ-spectrometer and a computer program for the automatic calculation of the results are described. Furthermore, a completely automated uranium analyzer based on delayed neutron counting is described. The methods are discussed in view of their applicability to rock analysis. The sensitivity varies considerably from element to element: instrumental activation analysis is an excellent method for the analysis of some specific elements like the lanthanides, thorium and uranium, but less so for many other elements. The accuracy is good, varying from 2% to 10% for most elements. For most elements instrumental activation analysis is a rather expensive method; there are, however, a few exceptions. The most important of these is uranium. The analysis of uranium by delayed neutron counting is an inexpensive means for the analysis of the large numbers of samples needed for

  15. One-step microwave synthesis of photoluminescent carbon nanoparticles from sodium dextran sulfate water solution

    Science.gov (United States)

    Kokorina, Alina A.; Goryacheva, Irina Y.; Sapelkin, Andrei V.; Sukhorukov, Gleb B.

    2018-04-01

    Photoluminescent (PL) carbon nanoparticles (CNPs) have been synthesized by one-step microwave irradiation from a water solution of sodium dextran sulfate (DSS) as the sole carbon source. The microwave (MW) method is very simple and cheap and provides fast synthesis of CNPs. We varied the synthesis time to obtain highly luminescent CNPs. The synthesized CNPs exhibit excitation-dependent photoluminescence, and the final CNP water solution shows blue-green luminescence. The CNPs have low cytotoxicity and good photostability and are potentially suitable candidates for bioimaging, analysis or analytical tests.

  16. Probabilistic methods for rotordynamics analysis

    Science.gov (United States)

    Wu, Y.-T.; Torng, T. Y.; Millwater, H. R.; Fossum, A. F.; Rheinfurth, M. H.

    1991-01-01

    This paper summarizes the development of the methods and a computer program to compute the probability of instability of dynamic systems that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the eigenvalues or Routh-Hurwitz test functions are investigated. Computational methods based on a fast probability integration concept and an efficient adaptive importance sampling method are proposed to perform efficient probabilistic analysis. A numerical example is provided to demonstrate the methods.
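
    A brute-force counterpart to the fast probability integration mentioned above is plain Monte Carlo over the eigenvalue criterion: sample the uncertain parameters, form the state matrix of the second-order system, and count samples with an eigenvalue in the right half-plane. The single-degree-of-freedom system and parameter distributions below are illustrative only.

    ```python
    # Monte Carlo estimate of P(instability) via the eigenvalue criterion.
    import numpy as np

    rng = np.random.default_rng(42)
    n_samples = 20_000
    unstable = 0

    for _ in range(n_samples):
        k = rng.normal(100.0, 10.0)   # stiffness, assumed distribution
        c = rng.normal(0.5, 0.4)      # damping, may go negative
        m = 1.0
        # state-space form of m*x'' + c*x' + k*x = 0
        A = np.array([[0.0, 1.0],
                      [-k / m, -c / m]])
        if np.any(np.real(np.linalg.eigvals(A)) > 0):
            unstable += 1

    print(f"estimated P(instability) = {unstable / n_samples:.4f}")
    ```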

  17. Analysis methods (from 301 to 351)

    International Nuclear Information System (INIS)

    Analysis methods of materials used in the nuclear field (uranium, plutonium and their compounds, zirconium, magnesium, water...) and determination of impurities. Only reliable methods are selected.

  18. Quantitative analysis method for ship construction quality

    Directory of Open Access Journals (Sweden)

    FU Senzong

    2017-03-01

    Full Text Available The excellent performance of a ship is assured by the accurate evaluation of its construction quality. For a long time, research into the construction quality of ships has mainly focused on qualitative analysis due to a shortage of process data, which results from limited samples, varied process types and non-standardized processes. Aiming at predicting and controlling the influence of the construction process on the construction quality of ships, this article proposes a quantitative reliability analysis flow for the ship construction process together with a fuzzy calculation method. Based on the process-quality factor model proposed by the Function-Oriented Quality Control (FOQC) method, we combine fuzzy mathematics with the expert grading method to deduce formulations for calculating the fuzzy process reliability of the ordinal connection model, the series connection model and the mixed connection model. The quantitative analysis method is applied to the process reliability of a ship's shaft gearbox installation, which proves the applicability and effectiveness of the method. The analysis results can be a useful reference for setting key quality inspection points and optimizing key processes.
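
    A minimal sketch of the series-connection case, assuming triangular fuzzy reliabilities graded by experts and the component-wise product rule as an approximation of the paper's formulation (the component values are invented):

    ```python
    # Series-connection process reliability with triangular fuzzy numbers
    # (low, mode, high); product taken point-wise as an approximation.
    def fuzzy_series_reliability(components):
        """Multiply triangular fuzzy reliabilities component-wise."""
        low = mode = high = 1.0
        for lo, md, hi in components:
            low, mode, high = low * lo, mode * md, high * hi
        return low, mode, high

    # three process steps, graded by experts (assumed values)
    steps = [(0.90, 0.95, 0.99), (0.85, 0.92, 0.97), (0.88, 0.94, 0.98)]
    print(fuzzy_series_reliability(steps))  # (pessimistic, modal, optimistic)
    ```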

  19. Accident Analysis and Barrier Function (AEB) Method. Manual for Incident Analysis

    International Nuclear Information System (INIS)

    Svenson, Ola

    2000-02-01

    The Accident Analysis and Barrier Function (AEB) Method models an accident or incident as a series of interactions between human and technical systems. In the sequence of human and technical errors leading to an accident there is, in principle, a possibility to arrest the development between each two successive errors. This can be done by a barrier function which, for example, can stop an operator from making an error. A barrier function can be performed by one or several barrier function systems. To illustrate, a mechanical system, a computer system or another operator can all perform a given barrier function to stop an operator from making an error. The barrier function analysis consists of analysis of suggested improvements, the effectiveness of the improvements, the costs of implementation, the probability of implementation, the cost of maintaining the barrier function, the probability that maintenance will be kept up to standards, and the generalizability of the suggested improvement. The AEB method is similar to the US method called HPES, but differs from it in several ways. To exemplify, the AEB method puts more emphasis on technical errors than HPES. In contrast to HPES, which describes a series of events, the AEB method models only errors. This gives a more focused analysis, making it well suited for checking other HPES-type accident analyses. However, the AEB method is a generic and stand-alone method that has also been applied in fields other than nuclear power, such as traffic accident analysis

  20. Accident Analysis and Barrier Function (AEB) Method. Manual for Incident Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Svenson, Ola [Stockholm Univ. (Sweden). Dept. of Psychology

    2000-02-01

    The Accident Analysis and Barrier Function (AEB) Method models an accident or incident as a series of interactions between human and technical systems. In the sequence of human and technical errors leading to an accident there is, in principle, a possibility to arrest the development between each two successive errors. This can be done by a barrier function which, for example, can stop an operator from making an error. A barrier function can be performed by one or several barrier function systems. To illustrate, a mechanical system, a computer system or another operator can all perform a given barrier function to stop an operator from making an error. The barrier function analysis consists of analysis of suggested improvements, the effectiveness of the improvements, the costs of implementation, the probability of implementation, the cost of maintaining the barrier function, the probability that maintenance will be kept up to standards, and the generalizability of the suggested improvement. The AEB method is similar to the US method called HPES, but differs from it in several ways. To exemplify, the AEB method puts more emphasis on technical errors than HPES. In contrast to HPES, which describes a series of events, the AEB method models only errors. This gives a more focused analysis, making it well suited for checking other HPES-type accident analyses. However, the AEB method is a generic and stand-alone method that has also been applied in fields other than nuclear power, such as traffic accident analysis.

  1. 21 CFR 2.19 - Methods of analysis.

    Science.gov (United States)

    2010-04-01

    ... Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL GENERAL ADMINISTRATIVE RULINGS AND DECISIONS General Provisions § 2.19 Methods of analysis. Where the method of analysis... issues of the “Journal of the Association of Official Analytical Chemists”), which are incorporated by...

  2. Inelastic analysis methods for piping systems

    International Nuclear Information System (INIS)

    Boyle, J.T.; Spence, J.

    1980-01-01

    The analysis of pipework systems which operate in an environment where local inelastic strains are evident is one of the most demanding problems facing the stress analyst in the nuclear field. The spatial complexity of even the most modest system makes a detailed analysis using finite element techniques beyond the scope of current computer technology. For this reason the emphasis has been on simplified methods. It is the aim of this paper to provide a reasonably complete, state-of-the-art review of inelastic pipework analysis methods and to attempt to highlight areas where reliable information is lacking and further work is needed. (orig.)

  3. Limitations of systemic accident analysis methods

    Directory of Open Access Journals (Sweden)

    Casandra Venera BALAN

    2016-12-01

    Full Text Available In terms of systems theory, the description of complex accidents is not limited to the analysis of the sequence of individual events and conditions; it highlights nonlinear functional characteristics and frames human or technical performance in relation to the normal, safe functioning of the system. Thus, research on the system's entities as a whole is no longer an abstraction of a concrete situation, but goes beyond the theoretical limits set by analysis based on linear methods. Despite the issues outlined above, the hypothesis that there is no complete method for accident analysis is supported by the nonlinearity of the considered functions or restrictions, which imposes a broad view of the elements introduced in the analysis so that elements corresponding to nominal parameters or trigger factors can be identified.

  4. Web2.0 paves new ways for collaborative and exploratory analysis of Chemical Compounds in Spectrometry Data

    Directory of Open Access Journals (Sweden)

    Loyek Christian

    2011-06-01

    Full Text Available In today's life science projects, sharing data and data interpretation is becoming increasingly important. This calls for novel information technology approaches which enable the integration of expert knowledge from different disciplines with advanced data analysis facilities in a collaborative manner. Since recent developments in web technologies offer scientific communities new ways of cooperation and communication, we propose a fully web-based software approach for the collaborative analysis of bioimage data and demonstrate the applicability of Web 2.0 techniques to ion mobility spectrometry image data. Our approach allows collaborating experts to easily share, explore and discuss complex image data without installing any software packages. Scientists only need a username and a password to access our system and can directly start exploring and analyzing their data.

  5. A catalog of automated analysis methods for enterprise models.

    Science.gov (United States)

    Florez, Hector; Sánchez, Mario; Villalobos, Jorge

    2016-01-01

    Enterprise models are created for documenting and communicating the structure and state of the Business and Information Technology elements of an enterprise. After models are completed, they are mainly used to support analysis. Model analysis is an activity typically based on human skills, and due to the size and complexity of the models this process can be complicated, with omissions or miscalculations very likely. This situation has fostered research on automated analysis methods for supporting analysts in enterprise analysis processes. By reviewing the literature, we found several analysis methods; nevertheless, they are based on specific situations and different metamodels, so some analysis methods might not be applicable to all enterprise models. This paper presents the work of compiling (through a literature review), classifying, structuring, and characterizing automated analysis methods for enterprise models, expressing them in a standardized modeling language. In addition, we have implemented the analysis methods in our modeling tool.
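
    To make the notion of an automated analysis concrete, here is a toy example over an enterprise model represented as a plain dependency graph; both the model and the fan-in metric are invented for illustration and do not reflect the catalog's metamodels.

    ```python
    # Toy automated analysis: find the element most depended upon.
    from collections import defaultdict

    model = {  # element -> elements it depends on (invented example)
        "CRM app": ["Customer DB", "Auth service"],
        "Billing app": ["Customer DB"],
        "Auth service": [],
        "Customer DB": [],
    }

    # analysis: fan-in per element (how many elements depend on it)
    fan_in = defaultdict(int)
    for deps in model.values():
        for d in deps:
            fan_in[d] += 1

    critical = max(fan_in, key=fan_in.get)
    print(f"highest fan-in: {critical} ({fan_in[critical]} dependants)")
    ```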

  6. Study on mixed analysis method for fatigue analysis of oblique safety injection nozzle on main piping

    International Nuclear Information System (INIS)

    Lu Xifeng; Zhang Yixiong; Ai Honglei; Wang Xinjun; He Feng

    2014-01-01

    The simplified analysis method and the detailed analysis method are both used for the fatigue analysis of nozzles on the main piping. Because the structure of the oblique safety injection nozzle is complex and it is subjected to some severe transients, the results obtained with the simplified analysis method are heavily penalized and cannot be validated. The detailed analysis method is less conservative, but it is more complex, time-consuming and laborious. To reduce the conservatism and save time, a mixed analysis method combining the simplified analysis method with the detailed analysis method is used for the fatigue analysis. The heat transfer parameters between the fluid and the structure used for the analysis were obtained from a heat transfer property experiment. The results show that the mixed analysis, in which the heat transfer properties are considered, can reduce the conservatism effectively, and that the mixed analysis method is a more effective and practical method for the fatigue analysis of the oblique safety injection nozzle. (authors)

  7. Analysis of Vibration Diagnostics Methods for Induction Motors

    Directory of Open Access Journals (Sweden)

    A. P. Kalinov

    2012-01-01

    Full Text Available The paper presents an analysis of existing vibration diagnostics methods. In order to evaluate the efficiency of applying a method, the following criteria have been proposed: the volume of input data required for establishing a diagnosis, data content, software and hardware level, and execution time for vibration diagnostics. According to the mentioned criteria, a classification of vibration diagnostics methods is presented to determine their advantages and disadvantages and to identify directions for their development and improvement. The paper contains a comparative estimation of the methods in accordance with the proposed criteria, according to which the most efficient methods are spectral analysis and spectral analysis of the vibration signal envelope.
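
    Envelope spectral analysis, ranked highly above, is straightforward to prototype with SciPy's Hilbert transform: demodulate the signal, then inspect the spectrum of the envelope. The simulated fault signal and its parameters below are invented for the example.

    ```python
    # Envelope spectrum of a simulated amplitude-modulated fault signal.
    import numpy as np
    from scipy.signal import hilbert

    fs = 10_000                          # sampling rate, Hz
    t = np.arange(0, 1.0, 1 / fs)
    carrier = np.sin(2 * np.pi * 2000 * t)        # resonance at 2 kHz
    am = 1 + 0.5 * np.sin(2 * np.pi * 30 * t)     # 30 Hz fault modulation
    noise = 0.1 * np.random.default_rng(0).standard_normal(t.size)
    signal = am * carrier + noise

    envelope = np.abs(hilbert(signal))            # analytic-signal magnitude
    spec = np.abs(np.fft.rfft(envelope - envelope.mean()))
    freqs = np.fft.rfftfreq(envelope.size, 1 / fs)
    print(f"dominant envelope frequency: {freqs[np.argmax(spec)]:.1f} Hz")  # ~30 Hz
    ```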

  8. Analysis of numerical methods

    CERN Document Server

    Isaacson, Eugene

    1994-01-01

    This excellent text for advanced undergraduates and graduate students covers norms, numerical solution of linear systems and matrix factoring, iterative solutions of nonlinear equations, eigenvalues and eigenvectors, polynomial approximation, and other topics. It offers a careful analysis and stresses techniques for developing new methods, plus many examples and problems. 1966 edition.

  9. Development of analysis methods for seismically isolated nuclear structures

    International Nuclear Information System (INIS)

    Yoo, Bong; Lee, Jae-Han; Koo, Gyeng-Hoi

    2002-01-01

    KAERI's contributions to the project entitled Development of Analysis Methods for Seismically Isolated Nuclear Structures, carried out during 1996-1999 under the IAEA CRP on the intercomparison of analysis methods for predicting the behaviour of seismically isolated nuclear structures, are briefly described. The aim was to develop numerical analysis methods and to compare the analysis results with the benchmark test results for seismic isolation bearings and isolated nuclear structures provided by participating countries. Certain progress in the analysis procedures for isolation bearings and isolated nuclear structures has been made throughout the IAEA CRPs, and the analysis methods developed can be improved for future nuclear facility applications. (author)

  10. Radiochemistry and nuclear methods of analysis

    International Nuclear Information System (INIS)

    Ehmann, W.D.; Vance, D.

    1991-01-01

    This book provides both the fundamentals of radiochemistry and specific applications of nuclear techniques to analytical chemistry. It includes such areas of application as radioimmunoassay and activation techniques using very short-lived indicator radionuclides. It emphasizes the current nuclear methods of analysis such as neutron activation, PIXE, nuclear reaction analysis, Rutherford backscattering, isotope dilution analysis and others.

  11. Analysis of vignette method data in sociological research

    Directory of Open Access Journals (Sweden)

    Zh V Puzanova

    2016-12-01

    Full Text Available The article considers the vignette method as a projective technique that can be an alternative to traditional methods in mass surveys. The authors present an example of a study of the social representations of an intelligent man using the vignette method: they identify the meaning of the concept 'social representations' as suggested by S. Moscovici, the scientist who introduced its conceptualization and empirical interpretation, and describe the structure of social representations, which consists of a 'core' and a 'periphery' according to the theory of J.-C. Abric. The article shows the process of creating vignettes and of choosing their number and the conditions for applying the tool. The main emphasis is placed on the analysis of data obtained through the vignette method by calculating indices, discriminant analysis and logistic regression, and an explanation for the application of these three techniques is given. The authors describe the research procedure, the creation of the tool and the sample, and compare the results of each method of analysis. The discriminant analysis and logistic regression data confirm each other, which is an important verification of the results of the different methods of analysis.

  12. Applying homotopy analysis method for solving differential-difference equation

    International Nuclear Information System (INIS)

    Wang Zhen; Zou Li; Zhang Hongqing

    2007-01-01

    In this Letter, we apply the homotopy analysis method to solving differential-difference equations. A simple but typical example is used to illustrate the validity and the great potential of the generalized homotopy analysis method for solving differential-difference equations. Comparisons are made between the results of the proposed method and exact solutions. The results show that the homotopy analysis method is an attractive method for solving differential-difference equations.
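
    The record does not reproduce the construction; for orientation, the core of the standard homotopy analysis method is Liao's zeroth-order deformation equation, which embeds the target problem $\mathcal{N}[u(t)]=0$ into a family indexed by $q \in [0,1]$:

        (1-q)\,\mathcal{L}\left[\phi(t;q)-u_0(t)\right] = q\,\hbar\,\mathcal{N}\left[\phi(t;q)\right]

    Here $\mathcal{L}$ is an auxiliary linear operator, $u_0$ the initial guess, and $\hbar$ the convergence-control parameter; $\phi(t;0)=u_0(t)$, $\phi(t;1)=u(t)$, and expanding $\phi$ in powers of $q$ yields the solution series $u = u_0 + \sum_{m\ge 1} u_m$. For a differential-difference equation, $\mathcal{N}$ simply mixes derivatives with shifted arguments such as $u_{n\pm 1}(t)$.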

  13. SWOT ANALYSIS ON SAMPLING METHOD

    Directory of Open Access Journals (Sweden)

    CHIS ANCA OANA

    2014-07-01

    Full Text Available Audit sampling involves the application of audit procedures to less than 100% of the items within an account balance or class of transactions. Our article aims to study audit sampling in the audit of financial statements. As an audit technique in wide use, in both its statistical and nonstatistical forms, the method is very important for auditors. It should be applied correctly to give a fair view of the financial statements and to satisfy the needs of all financial users. To be applied correctly, the method must be understood by all its users, and mainly by auditors. Otherwise, the risk of applying it incorrectly would cause loss of reputation and discredit, litigation and even prison. Since there is no unitary practice and methodology for applying the technique, the risk of applying it incorrectly is quite high. The SWOT analysis is a technique that shows the advantages, disadvantages, threats and opportunities. We applied SWOT analysis to the study of the sampling method from the perspective of three players: the audit company, the audited entity and the users of the financial statements. The study shows that by applying the sampling method the audit company and the audited entity both save time, effort and money. The disadvantages of the method are the difficulty of applying it and of understanding its insight. Being in wide use as an audit method and a factor in a correct audit opinion, the sampling method's advantages, disadvantages, threats and opportunities must be understood by auditors.

  14. HUMAN RELIABILITY ANALYSIS WITH THE COGNITIVE RELIABILITY AND ERROR ANALYSIS METHOD (CREAM) APPROACH

    Directory of Open Access Journals (Sweden)

    Zahirah Alifia Maulida

    2015-01-01

    Full Text Available Work accidents in the grinding and welding areas have ranked highest over the last five years at PT. X. These accidents were caused by human error, which occurs under the influence of the physical and non-physical working environment. This study uses scenarios to predict and reduce the likelihood of human error with the CREAM (Cognitive Reliability and Error Analysis Method) approach. CREAM is a human reliability analysis method used to obtain the Cognitive Failure Probability (CFP), which can be calculated in two ways, the basic method and the extended method. The basic method yields only a general failure probability, whereas the extended method yields a CFP for each task. The results show that the factors influencing the occurrence of error in grinding and welding work are the adequacy of the organisation, the adequacy of the man-machine interface (MMI) and operational support, the availability of procedures/planning, and the adequacy of training and experience. The cognitive aspect with the highest error value in grinding work is planning, with a CFP of 0.3, and in welding work it is the cognitive aspect execution, with a CFP of 0.18. To reduce the cognitive error values in grinding and welding work, the recommendations given are regular training, more detailed work instructions and familiarisation with the tools. Keywords: CREAM (cognitive reliability and error analysis method), HRA (human reliability analysis), cognitive error
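
    As a hedged illustration of the extended-method bookkeeping described above (the numbers below are placeholders, not values from the CREAM tables), each cognitive activity starts from a nominal cognitive failure probability that is then scaled by weighting factors assigned to the common performance conditions (CPCs):

        # Illustrative sketch of the CREAM extended method: CFP = nominal CFP
        # multiplied by one weighting factor per common performance condition.
        NOMINAL_CFP = {"planning": 1.0e-2, "execution": 3.0e-3}  # placeholder values

        def adjusted_cfp(cognitive_activity, cpc_weights):
            """cpc_weights: one multiplier per CPC, e.g. organisation, MMI and
            operational support, procedures, training -- the four factors the
            study found dominant."""
            cfp = NOMINAL_CFP[cognitive_activity]
            for w in cpc_weights:
                cfp *= w
            return min(cfp, 1.0)  # a probability cannot exceed 1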

  15. Multiple Beta Spectrum Analysis Method Based on Spectrum Fitting

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Uk Jae; Jung, Yun Song; Kim, Hee Reyoung [UNIST, Ulsan (Korea, Republic of)]

    2016-05-15

    When a sample of several mixed radioactive nuclides is measured, it is difficult to separate the nuclides because their spectra overlap. For this reason, a simple mathematical analysis method for the spectrum analysis of mixed beta-ray sources has been studied. However, the existing research was in need of a more accurate spectral analysis method, as it suffered from limited accuracy. This study describes methods for separating mixed beta-ray sources through the analysis of the beta spectrum slope, based on curve fitting, to resolve the existing problem. Among the fitting methods examined (Fourier, polynomial, Gaussian and sum of sines), the sum-of-sines fitting method was found to be the best for obtaining the equation for the distribution of a mixed beta spectrum. It was shown to be the most appropriate for the analysis of spectra with various ratios of mixed nuclides. It is thought that this method could be applied to the rapid spectrum analysis of mixed beta-ray sources.
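
    As a minimal sketch of the fitting step (a generic least-squares fit is assumed here; the paper's exact parameterization is not given in this record), a sum-of-sines model can be fitted to a measured spectrum with SciPy:

        # Hedged sketch: fit a sum-of-sines model to a beta spectrum histogram.
        import numpy as np
        from scipy.optimize import curve_fit

        def sum_of_sines(E, *p):
            """p holds (amplitude, frequency, phase) triples, one per term."""
            y = np.zeros_like(E, dtype=float)
            for a, b, c in zip(p[0::3], p[1::3], p[2::3]):
                y += a * np.sin(b * E + c)
            return y

        # energy, counts = measured spectrum arrays; three terms as a starting guess:
        # popt, _ = curve_fit(sum_of_sines, energy, counts, p0=[1.0, 0.01, 0.0] * 3)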

  16. Substoichiometric method in the simple radiometric analysis

    International Nuclear Information System (INIS)

    Ikeda, N.; Noguchi, K.

    1979-01-01

    The substoichiometric method is applied to simple radiometric analysis. Two methods - the standard reagent method and the standard sample method - are proposed. The validity of the principle of the methods is verified experimentally in the determination of silver by the precipitation method, or of zinc by the ion-exchange or solvent-extraction method. The proposed methods are simple and rapid compared with the conventional superstoichiometric method. (author)

  17. Comparison of Methods for the Simultaneous Analysis of Bioactive Compounds of Eurycoma longifolia Jack using Different Analysis Methods

    International Nuclear Information System (INIS)

    Salmah Moosa; Sobri Hussein; Rusli Ibrahim; Maizatul Akmam Md Nasir

    2011-01-01

    Eurycoma longifolia Jack (Tongkat Ali; genus: Eurycoma; family: Simaroubaceae) is one of the most popular tropical herbal plants. The plant contains a series of quassinoids, which are mainly responsible for its bitter taste. The plant extract, especially from the roots, is traditionally used for enhancing testosterone levels in men. The roots have also been used in indigenous traditional medicines for their unique anti-malarial, anti-pyretic, antiulcer, cytotoxic and aphrodisiac properties. As part of on-going research on the bioactive compounds of Eurycoma longifolia, an evaluation of an optimized analysis method and of the parameters that influence LC-MS analysis was carried out. Identification of the bioactive compounds was based on comparison of calculated retention times and mass spectral data with literature values. Examination of the Eurycoma longifolia samples showed some variations and differences in terms of LC-MS parameters. However, the combined method using methanol as the solvent, an injection volume of 1.0 μL, analysis in ultra scan mode and acetic acid as the acidic modifier is the optimum method for LC-MS analysis of Eurycoma longifolia, because it successfully detected the optimum mass of compounds with good resolution and perfect separation within a short analysis time. (author)

  18. Current status of methods for shielding analysis

    International Nuclear Information System (INIS)

    Engle, W.W.

    1980-01-01

    Current methods used in shielding analysis and recent improvements in those methods are discussed. The status of methods development is discussed based on needs cited at the 1977 International Conference on Reactor Shielding. Additional areas where methods development is needed are discussed

  19. Direct methods of soil-structure interaction analysis for earthquake loadings

    International Nuclear Information System (INIS)

    Yun, J. B.; Kim, J. M.; Kim, Y. S. and others

    1993-07-01

    The objectives of this study are to review the methods of soil-structure interaction system analysis, particularly the direct method, and to carry out a blind prediction analysis of the Forced Vibration Test (FVT) before backfill in the course of the Hualien LSST project. The scope and contents of this study are as follows: a theoretical review of soil-structure interaction analysis methods, free-field response analysis methods, modelling methods for the unbounded exterior region, and the Hualien LSST FVT blind prediction analysis before backfill. The analysis results are found to compare very well with the field test results

  20. Inorganic nanolayers: structure, preparation, and biomedical applications

    Directory of Open Access Journals (Sweden)

    Saifullah B

    2015-09-01

    Full Text Available Bullo Saifullah, Mohd Zobir B Hussein, Materials Synthesis and Characterization Laboratory, Institute of Advanced Technology (ITMA), Universiti Putra Malaysia, Serdang, Malaysia. Abstract: Hydrotalcite-like compounds are two-dimensional inorganic nanolayers also known as clay minerals or anionic clays or layered double hydroxides/layered hydroxy salts, and have emerged as a single type of material with numerous biomedical applications, such as drug delivery, gene delivery, cosmetics, and biosensing. Inorganic nanolayers are promising materials due to their fascinating properties, such as ease of preparation, ability to intercalate different types of anions (inorganic, organic, biomolecules, and even genes), high thermal stability, delivery of intercalated anions in a sustained manner, high biocompatibility, and easy biodegradation. Inorganic nanolayers have been the focus for researchers over the last decade, resulting in widening application horizons, especially in the field of biomedical science. These nanolayers have been widely applied in drug and gene delivery. They have also been applied in biosensing technology, and most recently in bioimaging science. The suitability of inorganic nanolayers for application in drug delivery, gene delivery, biosensing technology, and bioimaging science makes them ideal materials to be applied for theranostic purposes. In this paper, we review the structure, methods of preparation, and latest advances made by inorganic nanolayers in such biomedical applications as drug delivery, gene delivery, biosensing, and bioimaging. Keywords: inorganic nanolayers, layered double hydroxides, layered hydroxy salts, drug delivery, biosensors, bioimaging

  1. Genome analysis methods - PGDBj Registered plant list, Marker list, QTL list, Plant DB link & Genome analysis methods | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available List Contact us PGDBj Registered plant list, Marker list, QTL list, Plant DB link & Genome analysis methods Genome analysis methods Data detail Data name Genome analysis methods DOI 10.18908/lsdba.nbdc01194-01-005 Description of data contents The current status and related information of the genomic analysis of each organism (March, 2014). In the case of organisms for which genomic analysis has been carried out, the d...e File name: pgdbj_dna_marker_linkage_map_genome_analysis_methods_en.zip File URL: ftp://ftp.biosciencedbc.j

  2. Multivariate Methods for Meta-Analysis of Genetic Association Studies.

    Science.gov (United States)

    Dimou, Niki L; Pantavou, Katerina G; Braliou, Georgia G; Bagos, Pantelis G

    2018-01-01

    Multivariate meta-analysis of genetic association studies and genome-wide association studies has received remarkable attention as it improves the precision of the analysis. Here, we review, summarize and present in a unified framework methods for multivariate meta-analysis of genetic association studies and genome-wide association studies. Starting with the statistical methods used for robust analysis and genetic model selection, we briefly present univariate methods for meta-analysis and then scrutinize multivariate methodologies. Multivariate models of meta-analysis for single gene-disease association studies, including models for haplotype association studies, multiple linked polymorphisms and multiple outcomes, are discussed. The popular Mendelian randomization approach and special cases of meta-analysis addressing issues such as the assumption of the mode of inheritance, deviation from Hardy-Weinberg equilibrium and gene-environment interactions are also presented. All available methods are enriched with practical applications, and methodologies that could be developed in the future are discussed. Links to all available software implementing multivariate meta-analysis methods are also provided.

  3. Chemical methods of rock analysis

    National Research Council Canada - National Science Library

    Jeffery, P. G; Hutchison, D

    1981-01-01

    A practical guide to the methods in general use for the complete analysis of silicate rock material and for the determination of all those elements present in major, minor or trace amounts in silicate...

  4. Nonlinear programming analysis and methods

    CERN Document Server

    Avriel, Mordecai

    2012-01-01

    This text provides an excellent bridge between principal theories and concepts and their practical implementation. Topics include convex programming, duality, generalized convexity, analysis of selected nonlinear programs, techniques for numerical solutions, and unconstrained optimization methods.

  5. Application of Software Safety Analysis Methods

    International Nuclear Information System (INIS)

    Park, G. Y.; Hur, S.; Cheon, S. W.; Kim, D. H.; Lee, D. Y.; Kwon, K. C.; Lee, S. J.; Koo, Y. H.

    2009-01-01

    A fully digitalized reactor protection system, which is called the IDiPS-RPS, was developed through the KNICS project. The IDiPS-RPS has four redundant and separated channels. Each channel is mainly composed of a group of bistable processors which redundantly compare process variables with their corresponding setpoints and a group of coincidence processors that generate a final trip signal when a trip condition is satisfied. Each channel also contains a test processor called the ATIP and a display and command processor called the COM. All the functions were implemented in software. During the development of the safety software, various software safety analysis methods were applied, in parallel to the verification and validation (V and V) activities, along the software development life cycle. The software safety analysis methods employed were the software hazard and operability (Software HAZOP) study, the software fault tree analysis (Software FTA), and the software failure modes and effects analysis (Software FMEA)

  6. Advanced analysis methods in particle physics

    Energy Technology Data Exchange (ETDEWEB)

    Bhat, Pushpalatha C.; /Fermilab

    2010-10-01

    Each generation of high energy physics experiments is grander in scale than the previous - more powerful, more complex and more demanding in terms of data handling and analysis. The spectacular performance of the Tevatron and the beginning of operations of the Large Hadron Collider have placed us at the threshold of a new era in particle physics. The discovery of the Higgs boson or another agent of electroweak symmetry breaking and evidence of new physics may be just around the corner. The greatest challenge in these pursuits is to extract the extremely rare signals, if any, from huge backgrounds arising from known physics processes. The use of advanced analysis techniques is crucial in achieving this goal. In this review, I discuss the concepts of optimal analysis, some important advanced analysis methods and a few examples. The judicious use of these advanced methods should enable new discoveries and produce results with better precision, robustness and clarity.

  7. A review of analysis methods about thermal buckling

    International Nuclear Information System (INIS)

    Moulin, D.; Combescure, A.; Acker, D.

    1987-01-01

    This paper highlights the main items emerging from a large bibliographical survey of strain-induced buckling analysis methods applicable to the building of fast neutron reactor structures. The work is centred on the practical analysis methods used in construction codes to account for the strain-buckling of thin and slender structures. Methods proposed in the literature in past and present studies are briefly described. Experimental, theoretical and numerical methods are considered. Methods applicable to design and their degree of validation are indicated

  8. Multifunctional nanoparticle-EpCAM aptamer bioconjugates: a paradigm for targeted drug delivery and imaging in cancer therapy.

    Science.gov (United States)

    Das, Manasi; Duan, Wei; Sahoo, Sanjeeb K

    2015-02-01

    The promising proposition of multifunctional nanoparticles for cancer diagnostics and therapeutics has inspired the development of the theranostic approach for improved cancer therapy. Moreover, active targeting of the drug carrier to a specific target site is crucial for providing efficient delivery of therapeutics and imaging agents. In this regard, the present study investigates the theranostic capabilities of nutlin-3a loaded poly (lactide-co-glycolide) nanoparticles, functionalized with a targeting ligand (EpCAM aptamer) and an imaging agent (quantum dots) for cancer therapy and bioimaging. A wide spectrum of in vitro analyses (cellular uptake study, cytotoxicity assay, cell cycle and apoptosis analysis, apoptosis-associated protein study) revealed superior therapeutic potentiality of the targeted NPs over other formulations in EpCAM-expressing cells. Moreover, our nanotheranostic system served as a superlative bio-imaging modality both in 2D monolayer culture and in a tumor spheroid model. Our results suggest that these aptamer-guided multifunctional NPs may act as an indispensable nanotheranostic approach to cancer therapy. It was concluded that the studied multifunctional targeted nanoparticle may become a viable and efficient approach in cancer therapy. Copyright © 2015 Elsevier Inc. All rights reserved.

  9. Spectroscopic Chemical Analysis Methods and Apparatus

    Science.gov (United States)

    Hug, William F. (Inventor); Reid, Ray D. (Inventor); Bhartia, Rohit (Inventor); Lane, Arthur L. (Inventor)

    2018-01-01

    Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted along with photoluminescence spectroscopy (i.e. fluorescence and/or phosphorescence spectroscopy) to provide high levels of sensitivity and specificity in the same instrument.

  10. Review of strain buckling: analysis methods

    International Nuclear Information System (INIS)

    Moulin, D.

    1987-01-01

    This report represents an attempt to review the mechanical analysis methods reported in the literature to account for the specific behaviour that we call buckling under strain. In this report, this expression covers all buckling mechanisms in which the strains imposed play a role, whether they act alone (as in simple buckling under controlled strain), or whether they act with other loadings (primary loading, such as pressure, for example). Attention is focused on the practical problems relevant to LMFBR reactors. The components concerned are distinguished by their high slenderness ratios and by rather high thermal levels, both constant and variable with time. Conventional static buckling analysis methods are not always appropriate for the consideration of buckling under strain. New methods must therefore be developed in certain cases. It is also hoped that this review will facilitate the coding of these analytical methods to aid the constructor in his design task and to identify the areas which merit further investigation

  11. Evaluation and presentation of analysis methods for reception analysis in reprocessing

    International Nuclear Information System (INIS)

    Mainka, E.

    1985-01-01

    The fissile material content in the dissolving or balancing tank of a reprocessing plant has special significance in nuclear fuel balancing. This is the first opportunity for destructive analysis of the fuel content of the material after burn-up of fuel elements in the reactor. In the current state-of-the-art, all balancing methods are based directly or indirectly on data obtained by chemical analysis. The following methods are evaluated: mass-spectroscopic isotope dilution analysis, X-ray fluorescence spectroscopy, isotopic correlation, gamma absorptiometry, redox titration, emission spectroscopy after plasma excitation, alpha spectroscopy, and laser Raman spectroscopy

  12. Alternative methods for the seismic analysis of piping systems

    International Nuclear Information System (INIS)

    Anon.

    1993-01-01

    This document is a review of 12 methods and criteria for the seismic analysis of piping systems. Each of the twelve chapters in this document covers the important technical aspects of a given method. The technical aspects presented are those the Subcommittee on Dynamic Stress Criteria believes important to the application of the method, and should not be considered a positive or negative endorsement of any of the methods. There are many variables in the analysis of a piping system that can influence the selection of the analysis method and criteria to be applied. These variables include system configuration, technical issues, precedent, licensing considerations, and regulatory acceptance. They must all be considered in selecting the appropriate seismic analysis method and criteria. This is relevant for nuclear power plants

  13. Meshless methods in biomechanics bone tissue remodelling analysis

    CERN Document Server

    Belinha, Jorge

    2014-01-01

    This book presents the complete formulation of a new advanced discretization meshless technique: the Natural Neighbour Radial Point Interpolation Method (NNRPIM). In addition, two of the most popular meshless methods, the EFGM and the RPIM, are fully presented. Being a truly meshless method, the major advantages of the NNRPIM over the FEM and other meshless methods are its remeshing flexibility and the higher accuracy of the obtained variable field. Using the natural neighbour concept, the NNRPIM permits the influence-domain to be determined organically, resembling natural cellular behaviour. This innovation permits the analysis of convex boundaries and extremely irregular meshes, which is an advantage in biomechanical analysis, with no extra computational effort associated.   This volume shows how to extend the NNRPIM to bone tissue remodelling analysis, expecting to contribute new numerical tools and strategies to permit a more efficient numerical biomechanical analysis.

  14. Cask crush pad analysis using detailed and simplified analysis methods

    International Nuclear Information System (INIS)

    Uldrich, E.D.; Hawkes, B.D.

    1997-01-01

    A crush pad has been designed and analyzed to absorb the kinetic energy of a spent nuclear fuel shipping cask hypothetically dropped into a 44-ft.-deep cask unloading pool at the Fluorinel and Storage Facility (FAST). This facility, located at the Idaho Chemical Processing Plant (ICPP) at the Idaho National Engineering and Environmental Laboratory (INEEL), is a US Department of Energy site. The basis for this study is an analysis by Uldrich and Hawkes. The purpose of this analysis was to evaluate various hypothetical cask drop orientations to ensure that the crush pad design was adequate and that the cask deceleration at impact was less than 100 g. It is demonstrated herein that a large spent fuel shipping cask, when dropped onto a foam crush pad, can be analyzed either by hand methods or by sophisticated dynamic finite element analysis using computer codes such as ABAQUS. Results from the two methods are compared to evaluate the accuracy of the simplified hand analysis approach

  15. Nonlinear nonstationary analysis with the finite element method

    International Nuclear Information System (INIS)

    Vaz, L.E.

    1981-01-01

    In this paper, after some introductory remarks on numerical methods for the integration of initial value problems, the applicability of the finite element method for transient diffusion analysis as well as dynamic and inelastic analysis is discussed, and some examples are presented. (RW) [de

  16. A Comparison of Card-sorting Analysis Methods

    DEFF Research Database (Denmark)

    Nawaz, Ather

    2012-01-01

    This study investigates how the choice of analysis method for card sorting studies affects the suggested information structure for websites. In the card sorting technique, a variety of methods are used to analyse the resulting data. The analysis of card sorting data helps user experience (UX) designers to discover the patterns in how users make classifications and thus to develop an optimal, user-centred website structure. During analysis, the recurrence of patterns of classification between users influences the resulting website structure. However, the algorithm used in the analysis influences the recurrent patterns found and thus has consequences for the resulting website design. This paper draws attention to the choice of card sorting analysis methods and techniques and shows how it impacts the results. The research focuses on how the same data for card sorting can lead to different website structures...

  17. Recent Advances in the Method of Forces: Integrated Force Method of Structural Analysis

    Science.gov (United States)

    Patnaik, Surya N.; Coroneos, Rula M.; Hopkins, Dale A.

    1998-01-01

    Stress that can be induced in an elastic continuum can be determined directly through the simultaneous application of the equilibrium equations and the compatibility conditions. In the literature, this direct stress formulation is referred to as the integrated force method. This method, which uses forces as the primary unknowns, complements the popular equilibrium-based stiffness method, which considers displacements as the unknowns. The integrated force method produces accurate stress, displacement, and frequency results even for modest finite element models. This version of the force method should be developed as an alternative to the stiffness method because the latter method, which has been researched for the past several decades, may have entered its developmental plateau. Stress plays a primary role in the development of aerospace and other products, and its analysis is difficult. Therefore, it is advisable to use both methods to calculate stress and eliminate errors through comparison. This paper examines the role of the integrated force method in analysis, animation and design.

  18. Trace element analysis of environmental samples by multiple prompt gamma-ray analysis method

    International Nuclear Information System (INIS)

    Oshima, Masumi; Matsuo, Motoyuki; Shozugawa, Katsumi

    2011-01-01

    The multiple γ-ray detection method has been proved to be a high-resolution and high-sensitivity method in application to nuclide quantification. The neutron prompt γ-ray analysis method is successfully extended by combining it with the multiple γ-ray detection method; the combination is called multiple prompt γ-ray analysis, MPGA. In this review we show the principle of this method and its characteristics. Several examples of its application to environmental samples, especially river sediments in urban areas and sea sediment samples, are also described. (author)

  19. A comparison of analysis methods to estimate contingency strength.

    Science.gov (United States)

    Lloyd, Blair P; Staubitz, Johanna L; Tapp, Jon T

    2018-05-09

    To date, several data analysis methods have been used to estimate contingency strength, yet few studies have compared these methods directly. To compare the relative precision and sensitivity of four analysis methods (i.e., exhaustive event-based, nonexhaustive event-based, concurrent interval, concurrent+lag interval), we applied all methods to a simulated data set in which several response-dependent and response-independent schedules of reinforcement were programmed. We evaluated the degree to which contingency strength estimates produced from each method (a) corresponded with expected values for response-dependent schedules and (b) showed sensitivity to parametric manipulations of response-independent reinforcement. Results indicated both event-based methods produced contingency strength estimates that aligned with expected values for response-dependent schedules, but differed in sensitivity to response-independent reinforcement. The precision of interval-based methods varied by analysis method (concurrent vs. concurrent+lag) and schedule type (continuous vs. partial), and showed similar sensitivities to response-independent reinforcement. Recommendations and considerations for measuring contingencies are identified. © 2018 Society for the Experimental Analysis of Behavior.
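
    The four estimators themselves are not reproduced in this record; as a hedged sketch, an event-based contingency strength estimate can take the classic delta-P form, the difference between the probability of the consequence given a response and given no response (the paper's exhaustive and nonexhaustive variants differ in which observations count):

        # Illustrative event-based contingency estimate (delta-P form); not the
        # paper's exact algorithms.
        def delta_p(observations):
            """observations: iterable of (response: bool, consequence: bool)."""
            with_r = [c for r, c in observations if r]
            without_r = [c for r, c in observations if not r]
            p_c_given_r = sum(with_r) / len(with_r) if with_r else 0.0
            p_c_given_not_r = sum(without_r) / len(without_r) if without_r else 0.0
            return p_c_given_r - p_c_given_not_r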

  20. The ethnographic method and its relationship with the domain analysis

    Directory of Open Access Journals (Sweden)

    Manuel Alejandro Romero Quesada

    2016-03-01

    Full Text Available This paper analyzes the theoretical and conceptual relationship of the ethnographic method with domain analysis. A documentary analysis was performed, exploring the categories of domain analysis and the ethnographic method. As a result, the points of contact between domain analysis and the ethnographic method were analyzed in epistemological, methodological and procedural terms. It is concluded that the ethnographic method is an important research tool for scanning the turbulent socio-cultural scenarios that occur within the discursive communities that constitute domains of knowledge.

  1. Application of status uncertainty analysis methods for AP1000 LBLOCA calculation

    International Nuclear Information System (INIS)

    Zhang Shunxiang; Liang Guoxing

    2012-01-01

    Parameter uncertainty analysis establishes, by reasonable methods, the response relations between input parameter uncertainties and output uncertainties. The application of parameter uncertainty analysis makes the simulation of the plant state more accurate and improves the plant economy while giving reasonable assurance of safety. The AP1000 LBLOCA was analyzed in this paper, and the results indicate that the random sampling statistical analysis method, the sensitivity analysis numerical method and the traditional error propagation analysis method can all provide a quite large peak cladding temperature (PCT) safety margin, which is helpful for choosing a suitable uncertainty analysis method to improve the plant economy. Additionally, the random sampling statistical analysis method, which applies mathematical statistics theory, yields the largest safety margin because it reduces the conservatism. Compared with the traditional conservative bounding parameter analysis method, the random sampling method can provide a PCT margin of 100 K, while the other two methods can only provide 50-60 K. (authors)
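
    The record does not say how the random sampling statistical analysis method sets its sample size; in LOCA practice this is commonly done with Wilks-type nonparametric tolerance limits (an assumption here, not a statement from the record): with 59 random code runs, the largest computed PCT bounds the 95th percentile with at least 95% confidence.

        # Sketch of the first-order, one-sided Wilks criterion: the confidence
        # that the maximum of n independent runs exceeds the gamma quantile.
        def wilks_confidence(n, gamma=0.95):
            return 1.0 - gamma ** n

        # wilks_confidence(59) -> ~0.9515, i.e. the classic 59-run 95/95 rule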

  2. Approaches to automatic parameter fitting in a microscopy image segmentation pipeline: An exploratory parameter space analysis

    Directory of Open Access Journals (Sweden)

    Christian Held

    2013-01-01

    Full Text Available Introduction: Research and diagnosis in medicine and biology often require the assessment of a large amount of microscopy image data. Although digital pathology and new bioimaging technologies are finding their way into clinical practice and pharmaceutical research, some general methodological issues in automated image analysis are still open. Methods: In this study, we address the problem of fitting the parameters in a microscopy image segmentation pipeline. We propose to fit the parameters of the pipeline's modules with optimization algorithms, such as genetic algorithms or coordinate descent, and show how visual exploration of the parameter space can help to identify sub-optimal parameter settings that need to be avoided. Results: This is of significant help in the design of our automatic parameter fitting framework, which enables us to tune the pipeline for large sets of micrographs. Conclusion: The underlying parameter spaces pose a challenge for manual as well as automated parameter optimization, as they can show several local performance maxima. Hence, optimization strategies that are not able to jump out of local performance maxima, like the hill climbing algorithm, often end in a local maximum.
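
    As a minimal sketch of the coordinate-descent strategy mentioned above (score() stands in for evaluating the segmentation pipeline against annotated micrographs; it is not part of the study's published code):

        # Hedged sketch: tune one pipeline parameter at a time over a value grid,
        # repeating a few sweeps. Note: like hill climbing, this greedy scheme
        # can get stuck in the local performance maxima the study warns about.
        def coordinate_descent(start, grids, score, sweeps=3):
            """start: dict of parameters; grids: dict name -> iterable of values."""
            best = dict(start)
            for _ in range(sweeps):
                for name, grid in grids.items():
                    candidates = [dict(best, **{name: v}) for v in grid]
                    best = max(candidates, key=score)
            return best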

  3. EpiTools: An Open-Source Image Analysis Toolkit for Quantifying Epithelial Growth Dynamics.

    Science.gov (United States)

    Heller, Davide; Hoppe, Andreas; Restrepo, Simon; Gatti, Lorenzo; Tournier, Alexander L; Tapon, Nicolas; Basler, Konrad; Mao, Yanlan

    2016-01-11

    Epithelia grow and undergo extensive rearrangements to achieve their final size and shape. Imaging the dynamics of tissue growth and morphogenesis is now possible with advances in time-lapse microscopy, but a true understanding of their complexities is limited by automated image analysis tools to extract quantitative data. To overcome such limitations, we have designed a new open-source image analysis toolkit called EpiTools. It provides user-friendly graphical user interfaces for accurately segmenting and tracking the contours of cell membrane signals obtained from 4D confocal imaging. It is designed for a broad audience, especially biologists with no computer-science background. Quantitative data extraction is integrated into a larger bioimaging platform, Icy, to increase the visibility and usability of our tools. We demonstrate the usefulness of EpiTools by analyzing Drosophila wing imaginal disc growth, revealing previously overlooked properties of this dynamic tissue, such as the patterns of cellular rearrangements. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  4. Comparability of river suspended-sediment sampling and laboratory analysis methods

    Science.gov (United States)

    Groten, Joel T.; Johnson, Gregory D.

    2018-03-06

    Accurate measurements of suspended sediment, a leading water-quality impairment in many Minnesota rivers, are important for managing and protecting water resources; however, water-quality standards for suspended sediment in Minnesota are based on grab field sampling and total suspended solids (TSS) laboratory analysis methods that have underrepresented concentrations of suspended sediment in rivers compared to U.S. Geological Survey equal-width-increment or equal-discharge-increment (EWDI) field sampling and suspended sediment concentration (SSC) laboratory analysis methods. Because of this underrepresentation, the U.S. Geological Survey, in collaboration with the Minnesota Pollution Control Agency, collected concurrent grab and EWDI samples at eight sites to compare results obtained using different combinations of field sampling and laboratory analysis methods. Study results determined that grab field sampling and TSS laboratory analysis results were biased substantially low compared to EWDI sampling and SSC laboratory analysis results, respectively. Differences in both field sampling and laboratory analysis methods caused the grab and TSS methods to be biased substantially low; the difference attributable to the laboratory analysis methods was slightly greater than that attributable to the field sampling methods. Sand-sized particles had a strong effect on the comparability of the field sampling and laboratory analysis methods. These results indicated that grab field sampling and TSS laboratory analysis methods fail to capture most of the sand being transported by the stream. There is less of a difference between samples collected with grab field sampling and analyzed for TSS and the concentration of fines in SSC. Even though differences are present, the strong correlations between SSC and TSS concentrations provide the opportunity to develop site-specific relations to address transport processes not captured by grab field sampling and TSS laboratory analysis methods.

  5. Nodal method for fast reactor analysis

    International Nuclear Information System (INIS)

    Shober, R.A.

    1979-01-01

    In this paper, a nodal method applicable to fast reactor diffusion theory analysis has been developed. This method has been shown to be accurate and efficient in comparison to highly optimized finite difference techniques. The use of an analytic solution to the diffusion equation as a means of determining accurate coupling relationships between nodes has been shown to be highly accurate and efficient in specific two-group applications, as well as in the current multigroup method

  6. Inorganic nanolayers: structure, preparation, and biomedical applications.

    Science.gov (United States)

    Saifullah, Bullo; Hussein, Mohd Zobir B

    2015-01-01

    Hydrotalcite-like compounds are two-dimensional inorganic nanolayers also known as clay minerals or anionic clays or layered double hydroxides/layered hydroxy salts, and have emerged as a single type of material with numerous biomedical applications, such as drug delivery, gene delivery, cosmetics, and biosensing. Inorganic nanolayers are promising materials due to their fascinating properties, such as ease of preparation, ability to intercalate different type of anions (inorganic, organic, biomolecules, and even genes), high thermal stability, delivery of intercalated anions in a sustained manner, high biocompatibility, and easy biodegradation. Inorganic nanolayers have been the focus for researchers over the last decade, resulting in widening application horizons, especially in the field of biomedical science. These nanolayers have been widely applied in drug and gene delivery. They have also been applied in biosensing technology, and most recently in bioimaging science. The suitability of inorganic nanolayers for application in drug delivery, gene delivery, biosensing technology, and bioimaging science makes them ideal materials to be applied for theranostic purposes. In this paper, we review the structure, methods of preparation, and latest advances made by inorganic nanolayers in such biomedical applications as drug delivery, gene delivery, biosensing, and bioimaging.

  7. Safety relief valve alternate analysis method

    International Nuclear Information System (INIS)

    Adams, R.H.; Javid, A.; Khatua, T.P.

    1981-01-01

    An experimental test program was started in the United States in 1976 to define and quantify Safety Relief Valve (SRV) phenomena in General Electric Mark I Suppression Chambers. The testing considered several discharge devices and was used to correlate SRV load prediction models. The program was funded by utilities with Mark I containments and has resulted in a detailed SRV load definition as a portion of the Mark I containment program Load Definition Report (LDR). The USNRC has reviewed and approved the LDR SRV load definition. In addition, the USNRC has permitted calibration of structural models used for predicting torus response to SRV loads. Model calibration is subject to confirmatory in-plant testing. The SRV methodology given in the LDR requires that transient dynamic pressures be applied to a torus structural model that includes a fluid added mass matrix. Preliminary evaluations of torus response have indicated order of magnitude conservatisms, with respect to test results, which could result in unrealistic containment modifications. In addition, structural response trends observed in full-scale tests between cold pipe, first valve actuation and hot pipe, subsequent valve actuation conditions have not been duplicated using current analysis methods. It was suggested by others that an energy approach using current fluid models be utilized to define loads. An alternate SRV analysis method is defined to correct suppression chamber structural response to a level that permits economical but conservative design. Simple analogs are developed for the purpose of correcting the analytical response obtained from LDR analysis methods. The analogs evaluated considered forced vibration and free vibration structural response. The corrected response correlated well with in-plant test response. The correlation of the analytical model at test conditions permits application of the alternate analysis method at design conditions. (orig./HP)

  8. Analysis and study on core power capability with margin method

    International Nuclear Information System (INIS)

    Liu Tongxian; Wu Lei; Yu Yingrui; Zhou Jinman

    2015-01-01

    Core power capability analysis focuses on the power distribution control of the reactor within a given mode of operation, for the purpose of defining the allowed normal operating space so that Condition I maneuvering flexibility is maintained and Condition II occurrences are adequately protected against by the reactor protection system. Traditional core power capability analysis methods, such as the synthesis method or the advanced three-dimensional method, usually calculate the key safety parameters of the power distribution and then verify that these parameters meet the design criteria. For a PWR with an on-line power distribution monitoring system, core power capability analysis calculates the maximum power level that just meets the design criteria. On the basis of the 3D FAC method of Westinghouse, the calculation model of core power capability analysis with the margin method is introduced to provide a reference for engineers. The core power capability analysis at a specific burnup of the Sanmen NPP is performed with the margin method. The results demonstrate the rationality of the margin method. The calculation model of the margin method not only helps engineers to master core power capability analysis for the AP1000, but also provides a reference for the core power capability analysis of other PWRs with on-line power distribution monitoring systems. (authors)

  9. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    Science.gov (United States)

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of the quantitative microbiological methods used in pharmaceutical analysis consists of several components. An analysis of the most important sources of variability of quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of the microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis for such data was ANOVA, which enabled not only the effect of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, appropriate for traditional plate-count methods. Copyright © 2015 Elsevier B.V. All rights reserved.

  10. 3D/4D multiscale imaging in acute lymphoblastic leukemia cells: visualizing dynamics of cell death

    Science.gov (United States)

    Sarangapani, Sreelatha; Mohan, Rosmin Elsa; Patil, Ajeetkumar; Lang, Matthew J.; Asundi, Anand

    2017-06-01

    Quantitative phase detection is a new methodology that provides quantitative information on cellular morphology to monitor cell status, drug response and toxicity. In this paper, the morphological changes in acute leukemia cells treated with chitosan were detected using d'Bioimager, a robust imaging system. Quantitative phase images of the cells were obtained with numerical analysis. The results show that the average area and optical volume of the chitosan-treated cells are significantly reduced when compared with the control cells, which reveals the effect of chitosan on the cancer cells. From the results it can be concluded that d'Bioimager can be used as a non-invasive imaging alternative for measuring the morphological changes of living cells in real time.

  11. An introduction to numerical methods and analysis

    CERN Document Server

    Epperson, James F

    2013-01-01

    Praise for the First Edition ". . . outstandingly appealing with regard to its style, contents, considerations of requirements of practice, choice of examples, and exercises." -Zentralblatt MATH ". . . carefully structured with many detailed worked examples." -The Mathematical Gazette The Second Edition of the highly regarded An Introduction to Numerical Methods and Analysis provides a fully revised guide to numerical approximation. The book continues to be accessible and expertly guides readers through the many available techniques of numerical methods and analysis. An Introduction to

  12. A strategy for evaluating pathway analysis methods.

    Science.gov (United States)

    Yu, Chenggang; Woo, Hyung Jun; Yu, Xueping; Oyama, Tatsuya; Wallqvist, Anders; Reifman, Jaques

    2017-10-13

    Researchers have previously developed a multitude of methods designed to identify biological pathways associated with specific clinical or experimental conditions of interest, with the aim of facilitating biological interpretation of high-throughput data. Before practically applying such pathway analysis (PA) methods, we must first evaluate their performance and reliability, using datasets where the pathways perturbed by the conditions of interest have been well characterized in advance. However, such 'ground truths' (or gold standards) are often unavailable. Furthermore, previous evaluation strategies that have focused on defining 'true answers' are unable to systematically and objectively assess PA methods under a wide range of conditions. In this work, we propose a novel strategy for evaluating PA methods independently of any gold standard, either established or assumed. The strategy involves the use of two mutually complementary metrics, recall and discrimination. Recall measures the consistency between the perturbed pathways identified by applying a particular analysis method to an original large dataset and those identified by the same method applied to a sub-dataset of the original dataset. In contrast, discrimination measures specificity: the degree to which the perturbed pathways identified by a particular method applied to a dataset from one experiment differ from those identified by the same method applied to a dataset from a different experiment. We used these metrics and 24 datasets to evaluate six widely used PA methods. The results highlighted the common challenge in reliably identifying significant pathways from small datasets. Importantly, we confirmed the effectiveness of our proposed dual-metric strategy by showing that previous comparative studies corroborate the performance evaluations of the six methods obtained by our strategy. Unlike any previously proposed strategy for evaluating the performance of PA methods, our dual-metric strategy does not rely on any ground truth.
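
    Taking the two metrics at face value from the description above (the paper's exact formulas may differ), both reduce to overlaps between sets of significant pathways:

        # Sketch: recall and discrimination as set overlaps of reported pathways.
        def overlap(a, b):
            a, b = set(a), set(b)
            return len(a & b) / len(a | b) if (a | b) else 0.0

        def recall(pathways_full, pathways_subset):
            """Consistency between results on a dataset and on its sub-dataset."""
            return overlap(pathways_full, pathways_subset)

        def discrimination(pathways_exp1, pathways_exp2):
            """Specificity: low overlap across unrelated experiments scores high."""
            return 1.0 - overlap(pathways_exp1, pathways_exp2)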

  13. Mass spectrometric methods for trace analysis of metals

    International Nuclear Information System (INIS)

    Bahr, U.; Schulten, H.R.

    1981-01-01

    A brief outline is given of the principles of mass spectrometry (MS) and the fundamentals of qualitative and quantitative mass spectrometric analysis emphasizing recent developments and results. Classical methods of the analysis of solids, i.e. spark-source MS and thermal ionization MS, as well as recent methods of metal analysis are described. Focal points in this survey of recently developed techniques include secondary ion MS, laser probe MS, plasma ion source MS, gas discharge MS and field desorption MS. Here, a more detailed description is given and the merits of these emerging methods are discussed more explicitly. In particular, the results of the field desorption techniques in elemental analyses are reviewed and critically evaluated

  14. Spatial analysis statistics, visualization, and computational methods

    CERN Document Server

    Oyana, Tonny J

    2015-01-01

    An introductory text for the next generation of geospatial analysts and data scientists, Spatial Analysis: Statistics, Visualization, and Computational Methods focuses on the fundamentals of spatial analysis using traditional, contemporary, and computational methods. Outlining both non-spatial and spatial statistical concepts, the authors present practical applications of geospatial data tools, techniques, and strategies in geographic studies. They offer a problem-based learning (PBL) approach to spatial analysis-containing hands-on problem-sets that can be worked out in MS Excel or ArcGIS-as well as detailed illustrations and numerous case studies. The book enables readers to: Identify types and characterize non-spatial and spatial data Demonstrate their competence to explore, visualize, summarize, analyze, optimize, and clearly present statistical data and results Construct testable hypotheses that require inferential statistical analysis Process spatial data, extract explanatory variables, conduct statisti...

  15. Method for statistical data analysis of multivariate observations

    CERN Document Server

    Gnanadesikan, R

    1997-01-01

    A practical guide for multivariate statistical techniques-- now updated and revised In recent years, innovations in computer technology and statistical methodologies have dramatically altered the landscape of multivariate data analysis. This new edition of Methods for Statistical Data Analysis of Multivariate Observations explores current multivariate concepts and techniques while retaining the same practical focus of its predecessor. It integrates methods and data-based interpretations relevant to multivariate analysis in a way that addresses real-world problems arising in many areas of inte

  16. Global/local methods research using a common structural analysis framework

    Science.gov (United States)

    Knight, Norman F., Jr.; Ransom, Jonathan B.; Griffin, O. H., Jr.; Thompson, Danniella M.

    1991-01-01

    Methodologies for global/local stress analysis are described including both two- and three-dimensional analysis methods. These methods are being developed within a common structural analysis framework. Representative structural analysis problems are presented to demonstrate the global/local methodologies being developed.

  17. Research on neutron noise analysis stochastic simulation method for α calculation

    International Nuclear Information System (INIS)

    Zhong Bin; Shen Huayun; She Ruogu; Zhu Shengdong; Xiao Gang

    2014-01-01

    The prompt decay constant α has significant applications in the physical design and safety analysis of nuclear facilities. To overcome the difficulty of calculating the α value with the Monte Carlo method, and to improve the precision, a new method based on neutron noise analysis technology was presented. This method employs stochastic simulation and the theory of neutron noise analysis technology. Firstly, the evolution of stochastic neutrons was simulated by a discrete-events Monte Carlo method based on the theory of generalized semi-Markov processes, and the neutron noise in the detectors was then obtained from the neutron signal. Secondly, neutron noise analysis methods such as the Rossi-α method, the Feynman-α method, the zero-probability method, and the cross-correlation method were used to calculate the α value. All of the parameters used in the neutron noise analysis methods were calculated with auto-adaptive arithmetic. The α values from these methods accord with each other, with a largest relative deviation of 7.9%, which proves the feasibility of the α calculation method based on neutron noise analysis stochastic simulation. (authors)
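
    Of the four techniques listed, the Feynman-α (variance-to-mean) method has the most compact form. As a hedged sketch (a generic gate-based estimator, not the authors' implementation):

        # Sketch: Feynman-Y excess variance from detector event times. Fitting
        # Y(T) = Y_inf * (1 - (1 - exp(-alpha*T)) / (alpha*T)) over gate widths T
        # then yields the prompt decay constant alpha.
        import numpy as np

        def feynman_y(event_times, gate_width):
            edges = np.arange(0.0, event_times.max() + gate_width, gate_width)
            counts, _ = np.histogram(event_times, bins=edges)
            m = counts.mean()
            return counts.var() / m - 1.0 if m > 0 else 0.0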

  18. Methods for geochemical analysis

    Science.gov (United States)

    Baedecker, Philip A.

    1987-01-01

    The laboratories for analytical chemistry within the Geologic Division of the U.S. Geological Survey are administered by the Office of Mineral Resources. The laboratory analysts provide analytical support to those programs of the Geologic Division that require chemical information and conduct basic research in analytical and geochemical areas vital to the furtherance of Division program goals. Laboratories for research and geochemical analysis are maintained at the three major centers in Reston, Virginia, Denver, Colorado, and Menlo Park, California. The Division has an expertise in a broad spectrum of analytical techniques, and the analytical research is designed to advance the state of the art of existing techniques and to develop new methods of analysis in response to special problems in geochemical analysis. The geochemical research and analytical results are applied to the solution of fundamental geochemical problems relating to the origin of mineral deposits and fossil fuels, as well as to studies relating to the distribution of elements in varied geologic systems, the mechanisms by which they are transported, and their impact on the environment.

  19. Numerical methods in software and analysis

    CERN Document Server

    Rice, John R

    1992-01-01

    Numerical Methods, Software, and Analysis, Second Edition introduces science and engineering students to the methods, tools, and ideas of numerical computation. Introductory courses in numerical methods face a fundamental problem: there is too little time to learn too much. This text solves that problem by using high-quality mathematical software. In fact, the objective of the text is to present scientific problem solving using standard mathematical software. This book discusses numerous programs and software packages focusing on the IMSL library (including the PROTRAN system) and ACM Algorithm

  20. CHAPTER 7. BERYLLIUM ANALYSIS BY NON-PLASMA BASED METHODS

    Energy Technology Data Exchange (ETDEWEB)

    Ekechukwu, A

    2009-04-20

    The most common method of analysis for beryllium is inductively coupled plasma atomic emission spectrometry (ICP-AES). This method, along with inductively coupled plasma mass spectrometry (ICP-MS), is discussed in Chapter 6. However, other methods exist and have been used for different applications. These methods include spectroscopic, chromatographic, colorimetric, and electrochemical. This chapter provides an overview of beryllium analysis methods other than plasma spectrometry (inductively coupled plasma atomic emission spectrometry or mass spectrometry). The basic methods, detection limits and interferences are described. Specific applications from the literature are also presented.

  1. Nuclear analysis methods. Rudiments of radiation protection

    International Nuclear Information System (INIS)

    Roth, E.

    1998-01-01

    The nuclear analysis methods are generally used to analyse radioactive elements, but they can also be used for chemical analysis, in fields such as the analysis and characterization of traces. The principles of radiation protection (ALARA) are explained, the biological effects of ionizing radiations are given, and the elements and units used in radiation protection are summarized in tables. A part of this article is devoted to how to apply radiation protection in a nuclear analysis laboratory. (N.C.)

  2. Analysis and synthesis of a logic control circuit by binary analysis methods

    International Nuclear Information System (INIS)

    Chicheportiche, Armand

    1974-06-01

    The analytical study of the logic circuits described in this report clearly shows the efficiency of the methods proposed by binary analysis. This study is a very new approach to logic, and these mathematical methods are systematically precise in their applications. The detailed operations of an automatic system can be studied in a way that cannot be reached by other methods. The definition and use of transition equations allow the determination of the different commutations in the auxiliary switch functions of a sequential system. This new way of analyzing digital circuits will certainly develop in the very near future. [fr]

  3. Neutron activation analysis: principle and methods

    International Nuclear Information System (INIS)

    Reddy, A.V.R.; Acharya, R.

    2006-01-01

    Neutron activation analysis (NAA) is a powerful isotope-specific nuclear analytical technique for the simultaneous determination of the elemental composition of major, minor and trace elements in diverse matrices. The technique is capable of yielding high analytical sensitivity and low detection limits (ppm to ppb). Due to the high penetration power of neutrons and gamma rays, NAA experiences negligible matrix effects in samples of different origins. Depending on the sample matrix and the element of interest, NAA is used either non-destructively, known as instrumental neutron activation analysis (INAA), or through chemical NAA methods. The present article describes the principle of NAA and its different methods, and gives an overview of some applications in fields like environment, biology, geology, material sciences, nuclear technology and forensic sciences. (author)

  4. Analysis of concrete beams using applied element method

    Science.gov (United States)

    Lincy Christy, D.; Madhavan Pillai, T. M.; Nagarajan, Praveen

    2018-03-01

    The Applied Element Method (AEM) is a displacement-based method of structural analysis. Some of its features are similar to those of the Finite Element Method (FEM). In AEM, the structure is analysed by dividing it into several elements, similar to FEM; but in AEM, elements are connected by springs instead of nodes as in the case of FEM. In this paper, the background to AEM is discussed and the necessary equations are derived. To illustrate the application of AEM, it has been used to analyse a plain concrete beam with fixed support conditions. The analysis is limited to 2-dimensional structures. It was found that the number of springs does not have much influence on the results. AEM could predict deflections and reactions with a reasonable degree of accuracy.
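
    As an illustration of the element-to-spring idealization described above, the following sketch computes the normal and shear spring stiffnesses used in the standard AEM formulation (k_n = E·d·t/a, k_s = G·d·t/a for adjacent square elements); the material and geometry values are assumed, not taken from the paper:

        # Material and geometry (assumed values, not from the paper)
        E = 30.0e9                  # Young's modulus of concrete, Pa
        nu = 0.2                    # Poisson's ratio
        G = E / (2.0 * (1.0 + nu))  # shear modulus

        a = 0.10           # distance between element centroids, m
        t = 0.20           # element thickness, m
        n_springs = 10     # springs distributed along the shared face
        d = a / n_springs  # face length served by each spring pair

        # Normal and shear stiffness of each connecting spring
        k_n = E * d * t / a
        k_s = G * d * t / a
        print(f"k_n = {k_n:.3e} N/m, k_s = {k_s:.3e} N/m")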

  5. Vibration analysis of the piping system using the modal analysis method, 1

    International Nuclear Information System (INIS)

    Fujikawa, Takeshi; Kurohashi, Michiya; Inoue, Yoshio

    1975-01-01

    A modal analysis method was developed for the vibration analysis of piping systems in nuclear or chemical plants, based on finite element theory, and verified by a sinusoidal vibration method. The natural vibration equation for piping was derived with stiffness, damping and mass matrices, and the eigenvalues were obtained by the usual method; the forced vibration equation for piping was then derived in the same manner, and the particular solutions were given by the modal method from the eigenvalues of the natural vibration equation. Three simple piping models (one-, two- and three-dimensional) were made, and the natural vibration frequencies were measured with forced input from an electrodynamic shaker and a loudspeaker. The experimental values of the natural vibration frequencies showed good agreement with the results of the analytical method. Therefore, the theoretical approach to piping system vibration was proved to be valid. (Iwase, T.)
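
    The eigenvalue step described above can be sketched for a lumped-parameter model: solve the undamped free-vibration problem K·x = ω²·M·x, then superpose modes for a harmonic force. The 3-DOF matrices below are illustrative placeholders, not a real piping model:

        import numpy as np
        from scipy.linalg import eigh

        k, m = 1.0e6, 10.0                       # assumed stiffness and mass
        K = k * np.array([[ 2.0, -1.0,  0.0],
                          [-1.0,  2.0, -1.0],
                          [ 0.0, -1.0,  1.0]])
        M = m * np.eye(3)

        w2, modes = eigh(K, M)                   # solves K x = w^2 M x
        freqs_hz = np.sqrt(w2) / (2.0 * np.pi)
        print("natural frequencies [Hz]:", np.round(freqs_hz, 2))

        # Undamped steady-state response to f(t) = F0 sin(W t) at DOF 2,
        # by modal superposition (modes are mass-normalized by eigh)
        F0 = np.array([0.0, 1.0, 0.0])           # force amplitudes, N
        W = 2.0 * np.pi * 5.0                    # forcing frequency, rad/s
        q = (modes.T @ F0) / (w2 - W ** 2)       # modal amplitudes
        x = modes @ q                            # physical amplitudes, m
        print("steady-state amplitudes [m]:", x)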

  6. Chemical analysis of cyanide in cyanidation process: review of methods

    International Nuclear Information System (INIS)

    Nova-Alonso, F.; Elorza-Rodriguez, E.; Uribe-Salas, A.; Perez-Garibay, R.

    2007-01-01

    In cyanidation, the worldwide method for precious metals recovery, the chemical analysis of cyanide is a very important but complex operation. Cyanide can be present as different species, each of them with a different stability, toxicity, analysis method and elimination technique. For cyanide analysis there exists a wide selection of analytical methods, but most of them present difficulties because of the interference of species present in the solution. This paper presents the different methods available for the chemical analysis of cyanide: titration, specific electrode and distillation, with special emphasis on the interference problem, with the aim of helping in the interpretation of the results. (Author)

  7. Lanthanide light for biology and medical diagnosis

    International Nuclear Information System (INIS)

    Bünzli, Jean-Claude G.

    2016-01-01

    Optical imaging emerges as a vital component of the various techniques needed to meet the stringent requirements of modern bioanalysis and bioimaging. Lanthanide luminescent bioprobes (LLBs) have greatly contributed to this field during the past 35 years because they have definite advantages such as little or no photobleaching and, thanks to time-gated detection, high sensitivity. The review summarizes the numerous tools offered by LLBs in their various forms: coordination compounds, nanoparticles, upconverting nanoparticles and their bioconjugates. It then focuses on biosensing, including point-of-care analysis, and on both in vitro and in vivo bioimaging with visible and NIR light. The last section compares the performance of LLBs with that of other commonly used bioprobes (organic dyes, quantum dots, and transition metal complexes). It is concluded that although LLBs will not replace all existing bioprobes, they add invaluable new technologies to the biologist's and medical doctor's toolboxes. A good deal of the improvement is achieved through nanotechnologies, which demonstrates that progress in the biosciences depends on the intersection of different disciplines: photophysics, chemistry, biochemistry, nanotechnology, and materials science. - Highlights: • Lanthanide luminescent bioprobes (LLBs) are indispensable tools in the biosciences. • The tools provided by LLBs are summarized. • Main trends in biosensing and point-of-care analysis are presented. • Issues regarding optical bioimaging with visible and NIR light are described. • Characteristics of LLBs, including nanoparticles, are compared to other bioprobes.

  8. Lanthanide light for biology and medical diagnosis

    Energy Technology Data Exchange (ETDEWEB)

    Bünzli, Jean-Claude G., E-mail: jean-claude.bunzli@epfl.ch [Fujian Institute of Research on the Structure of Matter, Chinese Academy of Sciences, Fuzhou, Fujian 35002 (China); Institute of Chemical Sciences and Engineering, École Polytechnique Fédérale de Lausanne (EPFL), 1015 Lausanne (Switzerland)

    2016-02-15

    Optical imaging emerges as a vital component of the various techniques needed to meet the stringent requirements of modern bioanalysis and bioimaging. Lanthanide luminescent bioprobes (LLBs) have greatly contributed to this field during the past 35 years because they have definite advantages such as little or no photobleaching and, thanks to time-gated detection, high sensitivity. The review summarizes the numerous tools offered by LLBs in their various forms: coordination compounds, nanoparticles, upconverting nanoparticles and their bioconjugates. It then focuses on biosensing, including point-of-care analysis, and on both in vitro and in vivo bioimaging with visible and NIR light. The last section compares the performance of LLBs with that of other commonly used bioprobes (organic dyes, quantum dots, and transition metal complexes). It is concluded that although LLBs will not replace all existing bioprobes, they add invaluable new technologies to the biologist's and medical doctor's toolboxes. A good deal of the improvement is achieved through nanotechnologies, which demonstrates that progress in the biosciences depends on the intersection of different disciplines: photophysics, chemistry, biochemistry, nanotechnology, and materials science. - Highlights: • Lanthanide luminescent bioprobes (LLBs) are indispensable tools in the biosciences. • The tools provided by LLBs are summarized. • Main trends in biosensing and point-of-care analysis are presented. • Issues regarding optical bioimaging with visible and NIR light are described. • Characteristics of LLBs, including nanoparticles, are compared to other bioprobes.

  9. Statistical methods for categorical data analysis

    CERN Document Server

    Powers, Daniel

    2008-01-01

    This book provides a comprehensive introduction to methods and models for categorical data analysis and their applications in social science research. Companion website also available, at https://webspace.utexas.edu/dpowers/www/

  10. Method and procedure of fatigue analysis for nuclear equipment

    International Nuclear Information System (INIS)

    Wen Jing; Fang Yonggang; Lu Yan; Zhang Yue; Sun Zaozhan; Zou Mingzhong

    2014-01-01

    As an example, the fatigue analysis for the upper head of the pressurizer in one NPP was carried out using ANSYS, a finite element analysis software package. According to the RCC-M code, only two kinds of typical transients of temperature and pressure were considered in the fatigue analysis. Meanwhile, the influence of earthquake loading was taken into account. The method and procedure of fatigue analysis for nuclear safety equipment are described in detail. This paper provides a reference for the fatigue analysis and assessment of nuclear safety grade equipment and piping. (authors)

  11. Substructure method of soil-structure interaction analysis for earthquake loadings

    Energy Technology Data Exchange (ETDEWEB)

    Park, H. G.; Joe, Y. H. [Industrial Development Research Center, Univ. of Incheon, Incheon (Korea, Republic of)

    1997-07-15

    The substructure method has been widely adopted for soil-structure interaction analysis because of its simplicity and economy in practical applications. However, the substructure method has some limitations in application and does not always give reliable results, especially for embedded structures or layered soil conditions. The objective of this study is to validate the reliability of the soil-structure interaction analysis results obtained by the proposed substructure method using a lumped-parameter model, and to suggest a method for the seismic design of nuclear power plant structures with specific design conditions. In this study, the theoretical background and modeling technique of the soil-structure interaction phenomenon have been reviewed, and an analysis technique based on the substructure method using a lumped-parameter model has been suggested. The practicality and reliability of the proposed method have been validated through its application to the seismic analysis of large-scale seismic test models. A technical guide for the practical application and evaluation of the proposed method has also been provided through various parametric studies.

  12. The delayed neutron method of uranium analysis

    International Nuclear Information System (INIS)

    Wall, T.

    1989-01-01

    The technique of delayed neutron analysis (DNA) is discussed. The DNA rig installed on the MOATA reactor, the assay standards and the types of samples which have been assayed are described. Of the total sample throughput of about 55,000 units since the uranium analysis service began, some 78% has been concerned with the analysis of uranium ore samples derived from mining and exploration. Delayed neutron analysis provides a high-sensitivity, low-cost uranium analysis method for both uranium exploration and other applications. It is particularly suitable for the analysis of large sample batches and for non-destructive analysis over a wide range of matrices. 8 refs., 4 figs., 3 tabs

  13. Phase Plane Analysis Method of Nonlinear Traffic Phenomena

    Directory of Open Access Journals (Sweden)

    Wenhuan Ai

    2015-01-01

    Full Text Available A new phase plane analysis method for analyzing complex nonlinear traffic phenomena is presented in this paper. This method makes use of variable substitution to transform a traditional traffic flow model into a new model which is suitable for analysis in the phase plane. According to the new model, various traffic phenomena, such as the well-known shock waves, rarefaction waves, and stop-and-go waves, are analyzed in the phase plane. From the phase plane diagrams, we can see the relationship between traffic jams and system instability, so the problem of traffic flow can be converted into one of system stability. The results show that the traffic phenomena described by the new method are consistent with those described by traditional methods. Moreover, the phase plane analysis highlights the unstable traffic phenomena we are chiefly concerned about and describes the variation of density or velocity with time or along sections more clearly.
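
    A minimal sketch of the kind of phase-plane diagram the method inspects, using a generic two-state system rather than the paper's traffic flow model (the dynamics below are assumed for illustration only):

        import numpy as np
        import matplotlib.pyplot as plt
        from scipy.integrate import solve_ivp

        def rhs(t, state, mu=-0.3):
            # Assumed illustrative dynamics: damped, weakly nonlinear
            x, y = state
            return [y, mu * y - x + 0.1 * x ** 3]

        fig, ax = plt.subplots()
        for x0 in np.linspace(-1.5, 1.5, 7):     # several initial conditions
            sol = solve_ivp(rhs, (0.0, 30.0), [x0, 0.0], max_step=0.05)
            ax.plot(sol.y[0], sol.y[1], lw=0.8)
        ax.set_xlabel("density deviation x")
        ax.set_ylabel("dx/dt")
        ax.set_title("phase-plane trajectories (illustrative system)")
        plt.show()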

  14. Stochastic seismic floor response analysis method for various damping systems

    International Nuclear Information System (INIS)

    Kitada, Y.; Hattori, K.; Ogata, M.; Kanda, J.

    1991-01-01

    A study using the stochastic seismic response analysis method, which is applicable to the estimation of floor response spectra, is carried out. A shortcoming of this stochastic seismic response analysis method is that it tends to overestimate floor response spectra for low-damping systems, e.g. 1% of the critical damping ratio. The cause of this shortcoming was investigated, and a number of improvements were made to the original method by taking the correlation of successive peaks in a response time history into account. The improved method was applied to a typical BWR reactor building. The resulting floor response spectra are compared with those obtained by deterministic time history analysis. Floor response spectra estimated by the improved method consistently cover the response spectra obtained by the time history analysis for various damping ratios. (orig.)

  15. Probabilistic Analysis Methods for Hybrid Ventilation

    DEFF Research Database (Denmark)

    Brohus, Henrik; Frier, Christian; Heiselberg, Per

    This paper discusses a general approach for the application of probabilistic analysis methods in the design of ventilation systems. The aims and scope of probabilistic versus deterministic methods are addressed with special emphasis on hybrid ventilation systems. A preliminary application of stochastic differential equations is presented, comprising a general heat balance for an arbitrary number of loads and zones in a building to determine the thermal behaviour under random conditions.
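
    A minimal sketch of a single-zone stochastic heat balance of the kind mentioned above, assuming a model of the form C·dT = (Q - UA·(T - T_out))·dt + σ·dW integrated with the Euler-Maruyama scheme; all parameter values are illustrative, not from the paper:

        import numpy as np

        rng = np.random.default_rng(1)
        C = 5.0e6       # zone heat capacity, J/K (assumed)
        UA = 200.0      # envelope conductance, W/K (assumed)
        Q = 1500.0      # internal and solar load, W (assumed)
        T_out = 5.0     # outdoor temperature, deg C (assumed)
        sigma = 2000.0  # noise intensity on the heat balance, W (assumed)

        dt = 60.0                      # time step, s
        n = 24 * 60                    # one day in one-minute steps
        T = np.empty(n)
        T[0] = 20.0
        for i in range(1, n):
            drift = (Q - UA * (T[i - 1] - T_out)) / C
            noise = (sigma / C) * np.sqrt(dt) * rng.standard_normal()
            T[i] = T[i - 1] + drift * dt + noise

        print(f"mean zone temperature {T.mean():.2f} deg C, std {T.std():.3f} K")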

  16. Microlocal methods in the analysis of the boundary element method

    DEFF Research Database (Denmark)

    Pedersen, Michael

    1993-01-01

    The application of the boundary element method in numerical analysis is based upon the use of boundary integral operators stemming from multiple layer potentials. The regularity properties of these operators are vital in the development of boundary integral equations and error estimates. We show...

  17. Intellectual Data Analysis Method for Evaluation of Virtual Teams

    Directory of Open Access Journals (Sweden)

    Sandra Strigūnaitė

    2013-01-01

    Full Text Available The purpose of the article is to present a method for virtual team performance evaluation based on intelligent analysis of team member collaboration data. The motivation for the research is the goal of creating an evaluation method that approximates ambiguous expert evaluations. The hierarchical fuzzy-rule-based method evaluates data from virtual team interaction instances related to the implementation of project tasks. The suggested method is designed to help project managers or virtual team leaders evaluate virtual teamwork on the basis of captured data. The main point of the method is the ability to reproduce human thinking and the expert valuation process in data analysis by applying fuzzy logic: fuzzy sets, fuzzy signatures and fuzzy rules. The fuzzy set principle used in the method allows numerical values of the evaluation criteria to be transformed into linguistic terms and used in constructing fuzzy rules. Fuzzy signatures make it possible to construct a hierarchical criteria structure, which avoids the exponential increase in the number of fuzzy rules as more input variables are included. The suggested method is intended to be applied in virtual collaboration software as a real-time teamwork evaluation tool. The research shows that by applying fuzzy logic to team collaboration data analysis it is possible to obtain evaluations equal to expert insights. The method covers virtual team, project task and team collaboration data analysis. The advantage of the suggested method is the possibility of using variables gained from virtual collaboration systems as inputs to the fuzzy rules. Fuzzy-logic-based evaluation of virtual teamwork collaboration provides evidence that can be investigated in the future, and the method can be seen as the next step in the development of virtual collaboration software.
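
    A minimal sketch of the fuzzy rule-evaluation step, with invented membership functions and an invented two-indicator rule base; the paper's actual criteria hierarchy and rules are not reproduced here:

        def tri(x, a, b, c):
            """Triangular membership function on [a, c], peaking at b."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

        def evaluate_team(activity, timeliness):
            # Fuzzify two collaboration indicators scaled to [0, 10]
            act_high = tri(activity, 4, 10, 16)      # "activity is high"
            time_good = tri(timeliness, 4, 10, 16)   # "timeliness is good"
            act_low = tri(activity, -6, 0, 6)
            time_poor = tri(timeliness, -6, 0, 6)
            # Rules: AND as min; each rule points at a crisp performance score
            rules = [
                (min(act_high, time_good), 9.0),  # both good -> high score
                (min(act_high, time_poor), 5.0),  # active but late -> medium
                (min(act_low, time_good), 5.0),
                (min(act_low, time_poor), 1.0),   # both poor -> low score
            ]
            # Weighted-average defuzzification
            total = sum(w for w, _ in rules)
            return sum(w * s for w, s in rules) / total if total else 0.0

        print(f"team score: {evaluate_team(activity=8.0, timeliness=6.5):.2f} / 10")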

  18. An advanced probabilistic structural analysis method for implicit performance functions

    Science.gov (United States)

    Wu, Y.-T.; Millwater, H. R.; Cruse, T. A.

    1989-01-01

    In probabilistic structural analysis, the performance or response functions are usually implicitly defined and must be evaluated by numerical analysis methods such as finite element methods. In such cases, the most commonly used probabilistic analysis tool is the mean-based second-moment method, which provides only the first two statistical moments. This paper presents a generalized advanced mean value (AMV) method which is capable of establishing the distributions to provide additional information for reliability design. The method requires slightly more computation than the second-moment method but is highly efficient relative to the alternative methods. In particular, the examples show that the AMV method can be used to solve problems involving non-monotonic functions that result in truncated distributions.
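
    A minimal sketch of the AMV idea for independent normal inputs: linearize g at the means to locate the most probable point (MPP) for a target probability level, then re-evaluate the full g there. The performance function below is an invented stand-in for an implicit finite element response:

        import numpy as np
        from scipy.stats import norm

        # Inputs: independent normals with assumed means and std deviations
        mu = np.array([10.0, 3.0])
        sd = np.array([1.0, 0.5])

        def g(x):
            # Invented nonlinear response (stand-in for an implicit FE model)
            return x[0] ** 2 / (1.0 + x[1])

        def grad_g(x, h=1.0e-6):
            # Central-difference gradient, as one would wrap a numerical model
            e = np.eye(len(x))
            return np.array([(g(x + h * e[i]) - g(x - h * e[i])) / (2 * h)
                             for i in range(len(x))])

        beta = norm.ppf(0.99)                 # target 99th-percentile level
        grad_u = grad_g(mu) * sd              # gradient in standardized space
        alpha = grad_u / np.linalg.norm(grad_u)

        z_mv = g(mu) + np.linalg.norm(grad_u) * beta  # mean-value estimate
        u_star = alpha * beta                         # MPP of the linearized g
        z_amv = g(mu + sd * u_star)                   # AMV: re-evaluate g there
        print(f"MV 99th percentile ~ {z_mv:.2f}, AMV ~ {z_amv:.2f}")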

  19. Overview of methods for uncertainty analysis and sensitivity analysis in probabilistic risk assessment

    International Nuclear Information System (INIS)

    Iman, R.L.; Helton, J.C.

    1985-01-01

    Probabilistic Risk Assessment (PRA) is playing an increasingly important role in the nuclear reactor regulatory process. The assessment of uncertainties associated with PRA results is widely recognized as an important part of the analysis process. One of the major criticisms of the Reactor Safety Study was that its representation of uncertainty was inadequate. The desire for the capability to treat uncertainties with the MELCOR risk code being developed at Sandia National Laboratories is indicative of the current interest in this topic. However, as yet, uncertainty analysis and sensitivity analysis in the context of PRA is a relatively immature field. In this paper, available methods for uncertainty analysis and sensitivity analysis in a PRA are reviewed. This review first treats methods for use with individual components of a PRA and then considers how these methods could be combined in the performance of a complete PRA. In the context of this paper, the goal of uncertainty analysis is to measure the imprecision in PRA outcomes of interest, and the goal of sensitivity analysis is to identify the major contributors to this imprecision. There are a number of areas that must be considered in uncertainty analysis and sensitivity analysis for a PRA: (1) information, (2) systems analysis, (3) thermal-hydraulic phenomena/fission product behavior, (4) health and economic consequences, and (5) display of results. Each of these areas and the synthesis of them into a complete PRA are discussed

  20. Probabilistic methods in fire-risk analysis

    International Nuclear Information System (INIS)

    Brandyberry, M.D.

    1989-01-01

    The first part of this work outlines a method for assessing the frequency of ignition of a consumer product in a building and shows how the method would be used in an example scenario utilizing upholstered furniture as the product and radiant auxiliary heating devices (electric heaters, wood stoves) as the ignition source. Deterministic thermal models of the heat-transport processes are coupled with parameter uncertainty analysis of the models and with a probabilistic analysis of the events involved in a typical scenario. This leads to a distribution for the frequency of ignition for the product. In the second part, fire-risk analysis as currently used in nuclear plants is outlined, along with a discussion of the relevant uncertainties. The use of the computer code COMPBRN in fire-growth analysis is discussed, along with the use of response-surface methodology to quantify uncertainties in the code's use. Generalized response surfaces are developed for temperature versus time for a cable tray, as well as a surface for the hot-gas-layer temperature and depth for a room of arbitrary geometry within a typical nuclear power plant compartment. These surfaces are then used to simulate the cable tray damage time in a compartment fire experiment.

  1. Methods of charged-particle activation analysis

    International Nuclear Information System (INIS)

    Chaudhri, M. Anwar; Chaudhri, M. Nasir; Jabbar, Q.; Nadeem, Q.

    2006-01-01

    The accuracy of Chaudhri's method for charged-particle activation analysis, published in J. Radioanal. Chem. (1977) v. 37, p. 243, has been further demonstrated by extensive calculations. The nuclear reactions 12C(d,n)13N, 63Cu(3He,p)65Zn, 107Ag(α,n)110In and 208Pb(d,p)209Pb, the cross sections of which were easily available, have been examined for the detection of 12C, 63Cu, 107Ag and 208Pb, respectively, in matrices of Cu, Zr and Pb, at bombarding energies of 4-22 MeV. The 'standard' is assumed to be in a carbon matrix. It has been clearly demonstrated that Chaudhri's method, which makes charged-particle activation analysis as simple as neutron activation analysis, provides results which are almost identical to, or only about 1-2% different from, the results obtained using the full 'Activity Equation', which involves solving complex integrals. It is valid even when the difference in the average atomic weights of the matrices of the standard and the sample is large. (author)

  2. The same analysis approach: Practical protection against the pitfalls of novel neuroimaging analysis methods.

    Science.gov (United States)

    Görgen, Kai; Hebart, Martin N; Allefeld, Carsten; Haynes, John-Dylan

    2017-12-27

    Standard neuroimaging data analysis based on traditional principles of experimental design, modelling, and statistical inference is increasingly complemented by novel analysis methods, driven e.g. by machine learning. While these novel approaches provide new insights into neuroimaging data, they often have unexpected properties, generating a growing literature on possible pitfalls. We propose to meet this challenge by adopting a habit of systematic testing of experimental design, analysis procedures, and statistical inference. Specifically, we suggest applying the analysis method used for the experimental data also to aspects of the experimental design, simulated confounds, simulated null data, and control data. We stress the importance of keeping the analysis method the same in the main and test analyses, because only in this way can possible confounds and unexpected properties be reliably detected and avoided. We describe and discuss this Same Analysis Approach in detail, and demonstrate it in two worked examples using multivariate decoding. With these examples, we reveal two sources of error: a mismatch between counterbalancing (crossover designs) and cross-validation, which leads to systematic below-chance accuracies, and linear decoding of a nonlinear effect, a difference in variance. Copyright © 2017 Elsevier Inc. All rights reserved.
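
    A minimal sketch of the approach: run an identical decoding pipeline on the real labels and on permuted (null) labels; systematic departures from chance on the null data flag a confound in the design or pipeline. The data below are random placeholders:

        import numpy as np
        from sklearn.model_selection import cross_val_score
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        X = rng.standard_normal((80, 50))   # 80 "trials" x 50 "voxels"
        y = np.repeat([0, 1], 40)           # two experimental conditions

        clf = SVC(kernel="linear")
        real = cross_val_score(clf, X, y, cv=8).mean()

        # Same analysis, label-permuted null data: should hover near chance
        null = [cross_val_score(clf, X, rng.permutation(y), cv=8).mean()
                for _ in range(20)]
        print(f"real accuracy: {real:.2f}")
        print(f"null accuracy: mean {np.mean(null):.2f} (chance = 0.50)")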

  3. Dual ant colony operational modal analysis parameter estimation method

    Science.gov (United States)

    Sitarz, Piotr; Powałka, Bartosz

    2018-01-01

    Operational Modal Analysis (OMA) is a common technique used to examine the dynamic properties of a system. Contrary to experimental modal analysis, the input signal is generated by the object's ambient environment. Operational modal analysis mainly aims at determining the number of pole pairs and at estimating the modal parameters. Many methods are used for parameter identification; some operate in the time domain, others in the frequency domain. The former use correlation functions, the latter spectral density functions. While some methods require the user to select poles from a stabilisation diagram, others try to automate the selection process. The dual ant colony operational modal analysis parameter estimation method (DAC-OMA) presents a new approach to the problem, avoiding the issues involved in the stabilisation diagram. The presented algorithm is fully automated. It uses deterministic methods to define the intervals of the estimated parameters, thus reducing the problem to an optimisation task which is conducted with dedicated software based on an ant colony optimisation algorithm. The combination of deterministic methods restricting the parameter intervals and artificial intelligence yields very good results, also for closely spaced modes and significantly varied mode shapes within one measurement point.

  4. Basic methods of isotope analysis; Osnovnye metody analiza izotopov

    Energy Technology Data Exchange (ETDEWEB)

    Ochkin, A V; Rozenkevich, M B

    2000-07-01

    The bases of the most widely applied methods of isotope analysis are briefly presented. The possibilities and analytical characteristics of mass-spectrometric, spectral, radiochemical and special methods of isotope analysis, including the application of magnetic resonance, chromatography and refractometry, are considered.

  5. Digital spectral analysis parametric, non-parametric and advanced methods

    CERN Document Server

    Castanié, Francis

    2013-01-01

    Digital Spectral Analysis provides a single source that offers complete coverage of the spectral analysis domain. This self-contained work includes details on advanced topics that are usually presented in scattered sources throughout the literature. The theoretical principles necessary for the understanding of spectral analysis are discussed in the first four chapters: fundamentals, digital signal processing, estimation in spectral analysis, and time-series models. An entire chapter is devoted to the non-parametric methods most widely used in industry. High resolution methods a

  6. A Thiazole Coumarin (TC) Turn-On Fluorescence Probe for AT-Base Pair Detection and Multipurpose Applications in Different Biological Systems

    Science.gov (United States)

    Narayanaswamy, Nagarjun; Kumar, Manoj; Das, Sadhan; Sharma, Rahul; Samanta, Pralok K.; Pati, Swapan K.; Dhar, Suman K.; Kundu, Tapas K.; Govindaraju, T.

    2014-01-01

    Sequence-specific recognition of DNA by small turn-on fluorescence probes is a promising tool for bioimaging, bioanalytical and biomedical applications. Here, the authors report a novel cell-permeable and red fluorescent hemicyanine-based thiazole coumarin (TC) probe for DNA recognition, nuclear staining and cell cycle analysis. TC exhibited strong fluorescence enhancement in the presence of DNA containing AT-base pairs, but did not fluoresce with GC sequences, single-stranded DNA, RNA and proteins. The fluorescence staining of HeLa S3 and HEK 293 cells by TC followed by DNase and RNase digestion studies depicted the selective staining of DNA in the nucleus over the cytoplasmic region. Fluorescence-activated cell sorting (FACS) analysis by flow cytometry demonstrated the potential application of TC in cell cycle analysis in HEK 293 cells. Metaphase chromosome and malaria parasite DNA imaging studies further confirmed the in vivo diagnostic and therapeutic applications of probe TC. Probe TC may find multiple applications in fluorescence spectroscopy, diagnostics, bioimaging and molecular and cell biology. PMID:25252596

  7. Analysis of mixed data methods & applications

    CERN Document Server

    de Leon, Alexander R

    2013-01-01

    A comprehensive source on mixed data analysis, Analysis of Mixed Data: Methods & Applications summarizes the fundamental developments in the field. Case studies are used extensively throughout the book to illustrate interesting applications from economics, medicine and health, marketing, and genetics. Carefully edited for smooth readability and seamless transitions between chapters, all chapters follow a common structure, with an introduction and a concluding summary, and include illustrative examples from real-life case studies in developmental toxicology

  8. Direct methods of soil-structure interaction analysis for earthquake loadings(II)

    Energy Technology Data Exchange (ETDEWEB)

    Yun, Chung Bang; Lee, S R; Kim, J M; Park, K L; Oh, S B; Choi, J S; Kim, Y S [Korea Advanced Institute of Science Technology, Daejeon (Korea, Republic of)

    1994-07-15

    In this study, methods for 3-D soil-structure interaction analysis have been studied: the 3-D axisymmetric analysis method, the 3-D axisymmetric finite element method incorporating infinite elements, and 3-D boundary element methods. The computer code named 'KIESSI - PF' has been developed, based on the 3-D axisymmetric finite element method coupled with the infinite element method. It is able to simulate forced vibration test results of a soil-structure interaction system. The Hualien FVT post-correlation analysis before backfill and the blind prediction analysis after backfill have been carried out using the developed computer code 'KIESSI - PF'.

  9. Direct methods of soil-structure interaction analysis for earthquake loadings(II)

    International Nuclear Information System (INIS)

    Yun, Chung Bang; Lee, S. R.; Kim, J. M.; Park, K. L.; Oh, S. B.; Choi, J. S.; Kim, Y. S.

    1994-07-01

    In this study, methods for 3-D soil-structure interaction analysis have been studied: the 3-D axisymmetric analysis method, the 3-D axisymmetric finite element method incorporating infinite elements, and 3-D boundary element methods. The computer code named 'KIESSI - PF' has been developed, based on the 3-D axisymmetric finite element method coupled with the infinite element method. It is able to simulate forced vibration test results of a soil-structure interaction system. The Hualien FVT post-correlation analysis before backfill and the blind prediction analysis after backfill have been carried out using the developed computer code 'KIESSI - PF'

  10. Computational methods for nuclear criticality safety analysis

    International Nuclear Information System (INIS)

    Maragni, M.G.

    1992-01-01

    Nuclear criticality safety analyses require the utilization of methods which have been tested and verified against benchmark results. In this work, criticality calculations based on the KENO-IV and MCNP codes were studied, aiming at the qualification of these methods at IPEN-CNEN/SP and COPESP. The utilization of variance reduction techniques is important to reduce the computer execution time, and several of them are analysed. As a practical example of the above methods, a criticality safety analysis for the storage tubes for irradiated fuel elements from the IEA-R1 research reactor has been carried out. This analysis showed that the MCNP code is more adequate for problems with complex geometries, and that the KENO-IV code gives conservative results when the generalized geometry option is not used. (author)

  11. Relating Actor Analysis Methods to Policy Problems

    NARCIS (Netherlands)

    Van der Lei, T.E.

    2009-01-01

    For a policy analyst, the policy problem is the starting point of the policy analysis process. During this process the policy analyst structures the policy problem and chooses an appropriate set of methods or techniques to analyze the problem (Goeller 1984). The methods of the policy

  12. Bayesian data analysis in population ecology: motivations, methods, and benefits

    Science.gov (United States)

    Dorazio, Robert

    2016-01-01

    During the 20th century ecologists largely relied on the frequentist system of inference for the analysis of their data. However, in the past few decades ecologists have become increasingly interested in the use of Bayesian methods of data analysis. In this article I provide guidance to ecologists who would like to decide whether Bayesian methods can be used to improve their conclusions and predictions. I begin by providing a concise summary of Bayesian methods of analysis, including a comparison of differences between Bayesian and frequentist approaches to inference when using hierarchical models. Next I provide a list of problems where Bayesian methods of analysis may arguably be preferred over frequentist methods. These problems are usually encountered in analyses based on hierarchical models of data. I describe the essentials required for applying modern methods of Bayesian computation, and I use real-world examples to illustrate these methods. I conclude by summarizing what I perceive to be the main strengths and weaknesses of using Bayesian methods to solve ecological inference problems.
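
    As a minimal illustration of the Bayesian machinery discussed above, the sketch below estimates a detection probability from repeat surveys with a conjugate Beta-Binomial model; the counts are invented placeholders, not data from the article:

        import numpy as np
        from scipy import stats

        detections, surveys = 14, 40   # hypothetical survey outcome
        a0, b0 = 1.0, 1.0              # Beta(1, 1) = uniform prior

        # Conjugacy: posterior is Beta(a0 + successes, b0 + failures)
        post = stats.beta(a0 + detections, b0 + surveys - detections)
        print(f"posterior mean  : {post.mean():.3f}")
        print(f"95% credible int: ({post.ppf(0.025):.3f}, {post.ppf(0.975):.3f})")

        # Posterior predictive check by Monte Carlo simulation
        rng = np.random.default_rng(0)
        p_draws = post.rvs(size=10_000, random_state=rng)
        pred = rng.binomial(surveys, p_draws)
        print(f"P(predicted >= observed) = {(pred >= detections).mean():.2f}")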

  13. Probabilistic structural analysis methods for space transportation propulsion systems

    Science.gov (United States)

    Chamis, C. C.; Moore, N.; Anis, C.; Newell, J.; Nagpal, V.; Singhal, S.

    1991-01-01

    Information on probabilistic structural analysis methods for space propulsion systems is given in viewgraph form. Information is given on deterministic certification methods, probability of failure, component response analysis, stress responses for 2nd stage turbine blades, Space Shuttle Main Engine (SSME) structural durability, and program plans. .

  14. Multi-dye theranostic nanoparticle platform for bioimaging and cancer therapy

    Directory of Open Access Journals (Sweden)

    Singh AK

    2012-06-01

    Full Text Available Amit K Singh,1,2 Megan A Hahn,2 Luke G Gutwein,3 Michael C Rule,4 Jacquelyn A Knapik,5 Brij M Moudgil,1,2 Stephen R Grobmyer,3 Scott C Brown,2,61Department of Materials Science and Engineering, College of Engineering, 2Particle Engineering Research Center, College of Engineering, 3Division of Surgical Oncology, Department of Surgery, College of Medicine, 4Cell and Tissue Analysis Core, McKnight Brain Institute, 5Department of Pathology, College of Medicine, University of Florida, Gainesville, FL, USA; 6DuPont Central Research and Development, Corporate Center for Analytical Science, Wilmington, DE, USABackground: Theranostic nanomaterials composed of fluorescent and photothermal agents can both image and provide a method of disease treatment in clinical oncology. For in vivo use, the near-infrared (NIR window has been the focus of the majority of studies, because of greater light penetration due to lower absorption and scatter of biological components. Therefore, having both fluorescent and photothermal agents with optical properties in the NIR provides the best chance of improved theranostic capabilities utilizing nanotechnology.Methods: We developed nonplasmonic multi-dye theranostic silica nanoparticles (MDT-NPs, combining NIR fluorescence visualization and photothermal therapy within a single nanoconstruct comprised of molecular components. A modified NIR fluorescent heptamethine cyanine dye was covalently incorporated into a mesoporous silica matrix and a hydrophobic metallo-naphthalocyanine dye with large molar absorptivity was loaded into the pores of these fluorescent particles. The imaging and therapeutic capabilities of these nanoparticles were demonstrated in vivo using a direct tumor injection model.Results: The fluorescent nanoparticles are bright probes (300-fold enhancement in quantum yield versus free dye that have a large Stokes shift (>110 nm. Incorporation of the naphthalocyanine dye and exposure to NIR laser excitation

  15. Method of signal analysis

    International Nuclear Information System (INIS)

    Berthomier, Charles

    1975-01-01

    A method capable of handling the amplitude and frequency-time laws of certain kinds of geophysical signals is described here. This method is based upon the analytic signal idea of Gabor and Ville, which is constructed either in the time domain, by adding an imaginary part to the real signal (the in-quadrature signal), or in the frequency domain, by suppressing negative frequency components. The instantaneous frequency of the initial signal is then defined as the time derivative of the phase of the analytic signal, and its amplitude, or envelope, as the modulus of this complex signal. The method is applied to three types of magnetospheric signals: chorus, whistlers and pearls. The results obtained by analog and numerical calculations are compared to results obtained by classical systems using filters, i.e. based upon a different definition of the concept of frequency. The precision with which the frequency-time laws are determined then leads to an examination of the principle of the method, to a definition of the instantaneous power density spectrum attached to the signal, and to the first consequences of this definition. In this way, a two-dimensional representation of the signal is introduced which is less deformed by the properties of the analysis system than the usual representation, and which moreover has the advantage of being obtainable practically in real time. [fr]
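
    A minimal sketch of the analytic-signal construction described above: add an in-quadrature imaginary part via the Hilbert transform, then read off the envelope and the instantaneous frequency as the time derivative of the phase. The chirp signal is synthetic:

        import numpy as np
        from scipy.signal import hilbert

        fs = 1000.0                            # sampling rate, Hz
        t = np.arange(0.0, 2.0, 1.0 / fs)
        x = np.cos(2 * np.pi * (20 * t + 10 * t ** 2))  # upward 20 Hz chirp

        z = hilbert(x)                         # analytic signal x + i*H[x]
        envelope = np.abs(z)                   # instantaneous amplitude
        phase = np.unwrap(np.angle(z))
        inst_freq = np.gradient(phase, t) / (2 * np.pi)  # dphase/dt / 2*pi

        mid = len(t) // 2                      # sample at t = 1 s
        print(f"instantaneous frequency at 1 s: {inst_freq[mid]:.1f} Hz "
              f"(theory: {20 + 20 * 1.0:.1f} Hz)")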

  16. Rare earths analysis of rock samples by instrumental neutron activation analysis, internal standard method

    International Nuclear Information System (INIS)

    Silachyov, I.

    2016-01-01

    The application of instrumental neutron activation analysis for the determination of long-lived rare earth elements (REE) in rock samples is considered in this work. Two different methods are statistically compared: the well-established external standard method, carried out using standard reference materials, and the internal standard method (ISM), using Fe, determined through X-ray fluorescence analysis, as the element-comparator. The ISM proved to be the more precise method for a wide range of REE contents and can be recommended for routine practice. (author)

  17. Electromagnetic modeling method for eddy current signal analysis

    International Nuclear Information System (INIS)

    Lee, D. H.; Jung, H. K.; Cheong, Y. M.; Lee, Y. S.; Huh, H.; Yang, D. J.

    2004-10-01

    An electromagnetic modeling method for eddy current signal analysis is necessary before an experiment is performed. Electromagnetic modeling methods consist of analytical methods and numerical methods. The numerical methods can in turn be divided into the Finite Element Method (FEM), the Boundary Element Method (BEM) and the Volume Integral Method (VIM). Each modeling method has its merits and demerits, so a suitable modeling method can be chosen by considering the characteristics of each. This report explains the principle and application of each modeling method and compares the modeling programs.

  18. Direct methods of soil-structure interaction analysis for earthquake loadings(II)

    Energy Technology Data Exchange (ETDEWEB)

    Yun, Chung Bang; Lee, S. R.; Kim, J. M.; Park, K. L.; Oh, S. B.; Choi, J. S.; Kim, Y. S. [Korea Advanced Institute of Science Technology, Daejeon (Korea, Republic of)

    1994-07-15

    In this study, methods for 3-D soil-structure interaction analysis have been studied: the 3-D axisymmetric analysis method, the 3-D axisymmetric finite element method incorporating infinite elements, and 3-D boundary element methods. The computer code named 'KIESSI - PF' has been developed, based on the 3-D axisymmetric finite element method coupled with the infinite element method. It is able to simulate forced vibration test results of a soil-structure interaction system. The Hualien FVT post-correlation analysis before backfill and the blind prediction analysis after backfill have been carried out using the developed computer code 'KIESSI - PF'.

  19. Fluorescence and confocal imaging of mammalian cells using conjugated oligoelectrolytes with phenylenevinylene core

    Energy Technology Data Exchange (ETDEWEB)

    Milczarek, Justyna; Pawlowska, Roza; Zurawinski, Remigiusz; Lukasik, Beata; Garner, Logan E.; Chworos, Arkadiusz

    2017-05-01

    Over the last few years, considerable effort has been devoted to finding molecular fluorescent probes that fulfil applicability requirements. Due to their good optical properties and affinity for biological structures, conjugated oligoelectrolytes (COEs) can be considered promising dyes for application in fluorescence-based bioimaging. In this work, we synthesized COEs with a phenylenevinylene core (PV-COEs) and applied them as fluorescent membrane-specific probes. The cytotoxic effects of each COE were probed on cancerous and non-cancerous cell types, and little or no toxicity was observed over a wide range of concentrations. The intensity of cell fluorescence following COE staining was determined by photoluminescence analysis and fluorescence-activated cell sorting (FACS). Intercalation of the tested COEs into mammalian cell membranes was revealed by fluorescence and confocal microscopy colocalization with commercial dyes specific for cellular structures including mitochondria, the Golgi apparatus and the endoplasmic reticulum. The phenylenevinylene conjugated oligoelectrolytes have been found to be suitable for fluorescent bioimaging of mammalian cells and membrane-rich organelles. Due to their water solubility coupled with spontaneous intercalation into cells, favorable photophysical features, ease of cell staining, low cytotoxicity and selectivity for membranous structures, PV-COEs can be applied as markers for fluorescence imaging of a variety of cell types.

  20. Reliability and risk analysis methods research plan

    International Nuclear Information System (INIS)

    1984-10-01

    This document presents a plan for reliability and risk analysis methods research to be performed mainly by the Reactor Risk Branch (RRB), Division of Risk Analysis and Operations (DRAO), Office of Nuclear Regulatory Research. It includes those activities of other DRAO branches which are very closely related to those of the RRB. Related or interfacing programs of other divisions, offices and organizations are merely indicated. The primary use of this document is envisioned as an NRC working document, covering about a 3-year period, to foster better coordination in reliability and risk analysis methods development between the offices of Nuclear Regulatory Research and Nuclear Reactor Regulation. It will also serve as an information source for contractors and others to more clearly understand the objectives, needs, programmatic activities and interfaces together with the overall logical structure of the program

  1. Multiple predictor smoothing methods for sensitivity analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Helton, Jon Craig; Storlie, Curtis B.

    2006-08-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (1) locally weighted regression (LOESS), (2) additive models, (3) projection pursuit regression, and (4) recursive partitioning regression. The indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present.
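
    A minimal sketch of the smoothing idea: for each input, LOESS-smooth the scatter of model output against that input and measure the variance the smooth explains as an R²-like importance. The test function below is invented, not the WIPP performance assessment:

        import numpy as np
        from statsmodels.nonparametric.smoothers_lowess import lowess

        rng = np.random.default_rng(0)
        n = 500
        X = rng.uniform(-1.0, 1.0, size=(n, 3))   # three sampled inputs
        y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.standard_normal(n)
        # (the third input is intentionally inactive)

        for i in range(X.shape[1]):
            smooth = lowess(y, X[:, i], frac=0.4, return_sorted=False)
            r2 = 1.0 - np.var(y - smooth) / np.var(y)  # variance explained
            print(f"input {i + 1}: LOESS R^2 = {r2:.2f}")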

  2. Multiple predictor smoothing methods for sensitivity analysis

    International Nuclear Information System (INIS)

    Helton, Jon Craig; Storlie, Curtis B.

    2006-01-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (1) locally weighted regression (LOESS), (2) additive models, (3) projection pursuit regression, and (4) recursive partitioning regression. The indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present

  3. Life cycle analysis of electricity systems: Methods and results

    International Nuclear Information System (INIS)

    Friedrich, R.; Marheineke, T.

    1996-01-01

    The two methods for full energy chain analysis, process analysis and input/output analysis, are discussed. A combination of these two methods provides the most accurate results. Such a hybrid analysis of the full energy chains of six different power plants is presented and discussed. The results of such analyses depend on the time, site and technique of each process step and therefore have no general validity. For renewable energy systems, the emissions from the generation of a back-up system should be added. (author). 7 figs, 1 fig

  4. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method

    OpenAIRE

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-01-01

    Objective: To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. Methods: TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Results: Both assays provided good linearity, accuracy, reproducibility and selectivity for dete...
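
    A minimal sketch of the comparison step named above, a paired t-test between two quantification methods applied to the same samples; the γ-oryzanol values are invented placeholders:

        import numpy as np
        from scipy import stats

        # gamma-oryzanol content (%) by the two methods on the same samples
        densitometric = np.array([1.52, 1.48, 1.61, 1.55, 1.49, 1.58])
        image_based = np.array([1.50, 1.47, 1.63, 1.53, 1.51, 1.57])

        t_stat, p_value = stats.ttest_rel(densitometric, image_based)
        print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
        # p > 0.05 would indicate no significant difference between methods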

  5. Development of computational methods of design by analysis for pressure vessel components

    International Nuclear Information System (INIS)

    Bao Shiyi; Zhou Yu; He Shuyan; Wu Honglin

    2005-01-01

    Stress classification is not only one of the key steps when a pressure vessel component is designed by analysis, but also a difficulty which has long puzzled engineers and designers. At present, several computational methods of design by analysis for calculating and categorizing the stress field of pressure vessel components have been developed and applied, such as stress equivalent linearization, the two-step approach, the primary structure method, the elastic compensation method and the GLOSS R-node method. Moreover, the ASME code also gives an inelastic method of design by analysis for limiting gross plastic deformation only. When pressure vessel components are designed by analysis, there are sometimes huge differences between the results obtained with the different calculation and analysis methods mentioned above; this is the main reason that has limited wide application of the design-by-analysis approach. Recently, a new approach, presented in the new proposal of a European standard, CEN's unfired pressure vessel standard EN 13445-3, tries to avoid the problems of stress classification by analyzing the various failure mechanisms of the pressure vessel structure directly, based on elastic-plastic theory. In this paper, some of the stress classification methods mentioned above are described briefly, and the computational methods cited in the European pressure vessel standard, such as the deviatoric map and nonlinear analysis methods (plastic analysis and limit analysis), are depicted compendiously. Furthermore, the characteristics of the computational methods of design by analysis are summarized to help select the proper computational method when designing a pressure vessel component by analysis. (authors)

  6. Three-Phase Harmonic Analysis Method for Unbalanced Distribution Systems

    Directory of Open Access Journals (Sweden)

    Jen-Hao Teng

    2014-01-01

    Full Text Available Due to the unbalanced features of distribution systems, a three-phase harmonic analysis method is essential to accurately analyze the harmonic impact on distribution systems. Moreover, harmonic analysis is the basic tool for harmonic filter design and harmonic resonance mitigation; therefore, the computational performance should also be efficient. An accurate and efficient three-phase harmonic analysis method for unbalanced distribution systems is proposed in this paper. The variations of bus voltages, bus current injections and branch currents affected by harmonic current injections can be analyzed by two relationship matrices developed from the topological characteristics of distribution systems. Some useful formulas are then derived to solve the three-phase harmonic propagation problem. After the harmonic propagation for each harmonic order is calculated, the total harmonic distortion (THD for bus voltages can be calculated accordingly. The proposed method has better computational performance, since the time-consuming full admittance matrix inverse employed by the commonly-used harmonic analysis methods is not necessary in the solution procedure. In addition, the proposed method can provide novel viewpoints in calculating the branch currents and bus voltages under harmonic pollution which are vital for harmonic filter design. Test results demonstrate the effectiveness and efficiency of the proposed method.
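
    A minimal sketch of the THD computation mentioned above: take the FFT of one cycle of a distorted bus voltage and form THD as the ratio of the root-sum-square of the harmonic magnitudes to the fundamental magnitude. The waveform is synthetic:

        import numpy as np

        f0, fs = 60.0, 60.0 * 256                   # fundamental, sampling
        t = np.arange(0.0, 1.0 / f0, 1.0 / fs)      # exactly one cycle
        v = (np.sin(2 * np.pi * f0 * t)
             + 0.05 * np.sin(2 * np.pi * 5 * f0 * t)   # 5% fifth harmonic
             + 0.03 * np.sin(2 * np.pi * 7 * f0 * t))  # 3% seventh harmonic

        spec = np.abs(np.fft.rfft(v)) / len(v) * 2  # single-sided magnitudes
        fund = spec[1]                # bin 1 = fundamental (one-cycle window)
        harmonics = spec[2:50]        # bins 2..49 = harmonic orders 2..49
        thd = np.sqrt(np.sum(harmonics ** 2)) / fund
        print(f"THD = {100 * thd:.2f} %")           # expect about 5.83 %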

  7. Flows method in global analysis

    International Nuclear Information System (INIS)

    Duong Minh Duc.

    1994-12-01

    We study the gradient flows method for W^{r,p}(M,N), where M and N are Riemannian manifolds and r may be less than m/p. We localize some global analysis problems by constructing gradient flows which only change the value of any u in W^{r,p}(M,N) in a local chart of M. (author). 24 refs

  8. Instrumental methods of analysis, 7th edition

    International Nuclear Information System (INIS)

    Willard, H.H.; Merritt, L.L. Jr.; Dean, J.A.; Settle, F.A. Jr.

    1988-01-01

    The authors have prepared an organized and generally polished product. The book is fashioned to be used as a textbook for an undergraduate instrumental analysis course, a supporting textbook for graduate-level courses, and a general reference work on analytical instrumentation and techniques for professional chemists. Four major areas are emphasized: data collection and processing, spectroscopic instrumentation and methods, liquid and gas chromatographic methods, and electrochemical methods. Analytical instrumentation and methods have been updated, and a thorough citation of pertinent recent literature is included

  9. Multifunctional Collaborative Modeling and Analysis Methods in Engineering Science

    Science.gov (United States)

    Ransom, Jonathan B.; Broduer, Steve (Technical Monitor)

    2001-01-01

    Engineers are challenged to produce better designs in less time and for less cost. Hence, to investigate novel and revolutionary design concepts, accurate, high-fidelity results must be assimilated rapidly into the design, analysis, and simulation process. This assimilation should consider diverse mathematical modeling and multi-discipline interactions necessitated by concepts exploiting advanced materials and structures. Integrated high-fidelity methods with diverse engineering applications provide the enabling technologies to assimilate these high-fidelity, multi-disciplinary results rapidly at an early stage in the design. These integrated methods must be multifunctional, collaborative, and applicable to the general field of engineering science and mechanics. Multifunctional methodologies and analysis procedures are formulated for interfacing diverse subdomain idealizations including multi-fidelity modeling methods and multi-discipline analysis methods. These methods, based on the method of weighted residuals, ensure accurate compatibility of primary and secondary variables across the subdomain interfaces. Methods are developed using diverse mathematical modeling (i.e., finite difference and finite element methods) and multi-fidelity modeling among the subdomains. Several benchmark scalar-field and vector-field problems in engineering science are presented with extensions to multidisciplinary problems. Results for all problems presented are in overall good agreement with the exact analytical solution or the reference numerical solution. Based on the results, the integrated modeling approach using the finite element method for multi-fidelity discretization among the subdomains is identified as most robust. The multiple-method approach is advantageous when interfacing diverse disciplines in which each of the method's strengths are utilized. The multifunctional methodology presented provides an effective mechanism by which domains with diverse idealizations are

  10. Dynamic relaxation method in analysis of reinforced concrete bent elements

    Directory of Open Access Journals (Sweden)

    Anna Szcześniak

    2015-12-01

    Full Text Available The paper presents a method for the analysis of the nonlinear behaviour of reinforced concrete bent elements subjected to short-term static load. Considerations regarding the modelling of deformation processes in reinforced concrete elements are presented. The method of structure effort analysis was developed using the finite difference method. The Dynamic Relaxation Method, which, after the introduction of critical damping, allows for a description of the static behaviour of a structural element, was used to solve the system of nonlinear equilibrium equations. In order to increase the method's effectiveness in post-critical analysis, the Arc Length Parameter on the equilibrium path was introduced into the computational procedure. Keywords: reinforced concrete elements, physical nonlinearity, geometrical nonlinearity, dynamic relaxation method, arc-length method
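
    The dynamic relaxation idea named in this record can be illustrated on a linear system: the static solution of K u = f is recovered as the rest state of a fictitious damped dynamic system. This is only a minimal sketch under assumed mass, damping and time-step choices; the paper's finite-difference formulation with arc-length control is not reproduced here.

```python
import numpy as np

def dynamic_relaxation(K, f, dt=0.01, zeta=1.0, tol=1e-8, max_steps=200000):
    """Solve K u = f by integrating fictitious damped dynamics to rest."""
    u = np.zeros(len(f))
    v = np.zeros(len(f))
    m = np.diag(K).copy()                        # fictitious lumped masses
    omega = np.sqrt(min(np.linalg.eigvalsh(K)) / m.max())
    c = 2.0 * zeta * omega                       # near-critical damping
    for _ in range(max_steps):
        r = f - K @ u                            # out-of-balance forces
        if np.linalg.norm(r) < tol:
            break
        a = r / m - c * v                        # damped equations of motion
        v += a * dt                              # semi-implicit Euler step
        u += v * dt
    return u

K = np.array([[4.0, -1.0], [-1.0, 3.0]])
f = np.array([1.0, 2.0])
print(dynamic_relaxation(K, f), np.linalg.solve(K, f))  # should agree
```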

  11. Methods for Force Analysis of Overconstrained Parallel Mechanisms: A Review

    Science.gov (United States)

    Liu, Wen-Lan; Xu, Yun-Dou; Yao, Jian-Tao; Zhao, Yong-Sheng

    2017-11-01

    The force analysis of overconstrained PMs is relatively complex and difficult, and the methods for it have long been a research hotspot. However, few publications analyze the characteristics and application scopes of the various methods, which makes it difficult for researchers and engineers to master and adopt them properly. A review of the methods for force analysis of both passive and active overconstrained PMs is presented. The existing force analysis methods for these two kinds of overconstrained PMs are classified according to their main ideas. Each category is briefly demonstrated and evaluated with respect to the amount of calculation, the comprehensiveness with which limbs' deformation is considered, and the existence of explicit expressions for the solutions, which provides an important reference for researchers and engineers to quickly find a suitable method. The similarities and differences between the statically indeterminate problem of passive overconstrained PMs and that of active overconstrained PMs are discussed, and a universal method for these two kinds of overconstrained PMs is pointed out. The existing deficiencies and development directions of force analysis methods for overconstrained systems are indicated based on this overview.

  12. Security analysis and improvements to the PsychoPass method.

    Science.gov (United States)

    Brumen, Bostjan; Heričko, Marjan; Rozman, Ivan; Hölbl, Marko

    2013-08-13

    In a recent paper, Pietro Cipresso et al. proposed the PsychoPass method, a simple way to create strong passwords that are easy to remember. However, the method has some security issues that need to be addressed. Our objective was to perform a security analysis of the PsychoPass method and outline its limitations and possible improvements. We used brute-force and dictionary-attack analyses of the PsychoPass method to outline its weaknesses. The first issue with the PsychoPass method is that it requires password reproduction on the same keyboard layout as was used to generate the password. The second issue is a security weakness: although the produced password is 24 characters long, the password is still weak. We elaborate on the weakness and propose a solution that produces strong passwords. The proposed version first requires the use of the SHIFT and ALT-GR keys in combination with other keys, and second, the keys need to be 1-2 distances apart. The proposed improved PsychoPass method yields passwords that can be broken only in hundreds of years based on current computing powers. The proposed PsychoPass method requires 10 keys, as opposed to 20 keys in the original method, for comparable password strength.
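
    The strength comparison in this record comes down to keyspace arithmetic: entropy grows with the per-position choice set as well as with password length, so fewer keystrokes can still yield a stronger password if each step has more options. A sketch with placeholder choice counts (the figures are assumptions for illustration, not the authors' exact numbers):

```python
import math

def password_strength(choices_per_position, length, guesses_per_second=1e12):
    """Entropy in bits and worst-case exhaustive-search time in years
    for a password drawn uniformly from a fixed per-position choice set."""
    bits = length * math.log2(choices_per_position)
    seconds = choices_per_position ** length / guesses_per_second
    return bits, seconds / (3600 * 24 * 365)

# Placeholder figures: 20 positions with ~9 reachable neighbour keys each,
# vs. 10 positions whose choice set is enlarged by SHIFT/ALT-GR modifiers
# and 1-2 key distances (~220 options per step).
for label, (c, n) in {"original": (9, 20), "improved": (220, 10)}.items():
    bits, years = password_strength(c, n)
    print(f"{label}: {bits:.1f} bits, ~{years:.1e} years to exhaust")
```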

  13. Deterministic factor analysis: methods of integro-differentiation of non-integral order

    Directory of Open Access Journals (Sweden)

    Valentina V. Tarasova

    2016-12-01

    Full Text Available Objective: to summarize the methods of deterministic factor economic analysis, namely the differential calculus and the integral method. Methods: mathematical methods for integro-differentiation of non-integral order; the theory of derivatives and integrals of fractional (non-integral) order. Results: the basic concepts are formulated and new methods are developed that take into account the memory and non-locality effects in the quantitative description of the influence of individual factors on the change in the effective economic indicator. Two methods are proposed for integro-differentiation of non-integral order for the deterministic factor analysis of economic processes with memory and non-locality. It is shown that the method of integro-differentiation of non-integral order can give more accurate results compared with standard methods (the method of differentiation using first-order derivatives and the integral method using first-order integration) for a wide class of functions describing effective economic indicators. Scientific novelty: new methods of deterministic factor analysis are proposed, namely the method of differential calculus of non-integral order and the integral method of non-integral order. Practical significance: the basic concepts and formulas of the article can be used in scientific and analytical activity for factor analysis of economic processes. The proposed method for integro-differentiation of non-integral order extends the capabilities of deterministic factor economic analysis. The new quantitative method of deterministic factor analysis may become the beginning of quantitative studies of the behavior of economic agents with memory (hereditarity) and spatial non-locality. The proposed methods of deterministic factor analysis can be used in the study of economic processes which follow the exponential law, in which the indicators (endogenous variables) are power functions of the factors (exogenous variables), including the processes
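
    For readers unfamiliar with operators of non-integral order, the sketch below shows the standard Grünwald-Letnikov discretization of a fractional derivative, the kind of memory-carrying operator the proposed factor-analysis methods build on. The scheme and the test function are textbook material, not the authors' formulas.

```python
import numpy as np

def gl_fractional_derivative(f_samples, alpha, h):
    """Grünwald-Letnikov fractional derivative of order alpha
    on a uniform grid with step h (first-order accurate)."""
    n = len(f_samples)
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):               # w_k = (-1)^k * binom(alpha, k)
        w[k] = w[k - 1] * (k - 1.0 - alpha) / k
    d = np.empty(n)
    for i in range(n):                  # weighted history sum at each node
        d[i] = np.dot(w[: i + 1], f_samples[i::-1]) / h**alpha
    return d

t = np.linspace(0.0, 1.0, 101)
print(gl_fractional_derivative(t**2, 0.5, t[1] - t[0])[-1])
# half-derivative of t^2 at t = 1; exact value Gamma(3)/Gamma(2.5) ~ 1.504
```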

  14. Magnetic-relaxation method of analysis of inorganic substances

    International Nuclear Information System (INIS)

    Popel', A.A.

    1978-01-01

    The magnetic-relaxation method for the quantitative analysis of inorganic substances, based on the dependence of the magnetic nuclear relaxation times on the quantity of paramagnetic centres in a solution, is considered. Some methods of measuring nuclear magnetic relaxation times are characterized: the weak-oscillation-generator method and pulse methods. The effects of temperature, overall solution viscosity, diamagnetic salt concentration and medium acidity on the nuclear relaxation rate are described. The determination sensitivity is estimated, and the means of increasing it, the determinable concentration intervals and the selectivity of the method are considered. The application of the method to the study of complexing in solution is described. Particular attention is given to the investigation of heteroligand homocentre, heterocentre and protonated complexes, as well as to the exchange of particles of the first coordination sphere with particles from the bulk of the solution. Equations for the calculation of equilibrium constants in different systems are given. The possibility of determining diamagnetic ions by the magnetic-relaxation method using paramagnetic indicators is confirmed by the quantitative analysis of indium, gallium, thorium and scandium in their salt solutions.

  15. Dependability Analysis Methods For Configurable Software

    International Nuclear Information System (INIS)

    Dahll, Gustav; Pulkkinen, Urho

    1996-01-01

    Configurable software systems are systems which are built up by standard software components in the same way as a hardware system is built up by standard hardware components. Such systems are often used in the control of NPPs, also in safety related applications. A reliability analysis of such systems is therefore necessary. This report discusses what configurable software is, and what is particular with respect to reliability assessment of such software. Two very commonly used techniques in traditional reliability analysis, viz. failure mode, effect and criticality analysis (FMECA) and fault tree analysis are investigated. A real example is used to illustrate the discussed methods. Various aspects relevant to the assessment of the software reliability in such systems are discussed. Finally some models for quantitative software reliability assessment applicable on configurable software systems are described. (author)
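
    As a minimal illustration of the fault tree technique mentioned in this record, the sketch below evaluates a two-gate tree for a hypothetical configurable system; the event probabilities and the tree structure are invented for the example.

```python
def or_gate(*p):
    """P(at least one input event occurs), independent inputs."""
    q = 1.0
    for pi in p:
        q *= 1.0 - pi
    return 1.0 - q

def and_gate(*p):
    """P(all input events occur), independent inputs."""
    q = 1.0
    for pi in p:
        q *= pi
    return q

# Hypothetical top event: the system fails if a standard component AND
# its redundant twin both fail, OR the application-specific
# configuration logic is faulty.
p_component, p_redundant, p_config = 1e-3, 1e-3, 5e-4
p_top = or_gate(and_gate(p_component, p_redundant), p_config)
print(f"P(top event) = {p_top:.3e}")  # ~5.010e-04
```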

  16. Creating infinite contrast in fluorescence microscopy by using lanthanide centered emission

    DEFF Research Database (Denmark)

    Carro-Temboury, Miguel R.; Arppe, Riikka Matleena; Hempel, Casper

    2017-01-01

    The popularity of fluorescence microscopy arises from the inherent mode of action, where the fluorescence emission from probes is used to visualize selected features on a presumed dark background. However, the background is rarely truly dark, and image processing and analysis are needed to enhance the fluorescent signal that is ascribed to the selected feature. The image acquisition is facilitated by using considerable illumination and bright probes at a relatively high concentration in order to make the fluorescent signal significantly more intense than the background signal. Here, we present two methods ..., while method II resolves the fluorescent signal by subtracting a background calculated via the gradient. Both methods improve the signal-to-background ratio significantly, and we suggest that spectral imaging of lanthanide-centered emission can be used as a tool to obtain absolute contrast in bioimaging.

  17. Detector Simulation: Data Treatment and Analysis Methods

    CERN Document Server

    Apostolakis, J

    2011-01-01

    Detector Simulation in 'Data Treatment and Analysis Methods', part of 'Landolt-Börnstein - Group I Elementary Particles, Nuclei and Atoms: Numerical Data and Functional Relationships in Science and Technology, Volume 21B1: Detectors for Particles and Radiation. Part 1: Principles and Methods'. This document is part of Part 1 'Principles and Methods' of Subvolume B 'Detectors for Particles and Radiation' of Volume 21 'Elementary Particles' of Landolt-Börnstein - Group I 'Elementary Particles, Nuclei and Atoms'. It contains the Section '4.1 Detector Simulation' of Chapter '4 Data Treatment and Analysis Methods' with the content: 4.1 Detector Simulation 4.1.1 Overview of simulation 4.1.1.1 Uses of detector simulation 4.1.2 Stages and types of simulation 4.1.2.1 Tools for event generation and detector simulation 4.1.2.2 Level of simulation and computation time 4.1.2.3 Radiation effects and background studies 4.1.3 Components of detector simulation 4.1.3.1 Geometry modeling 4.1.3.2 External fields 4.1.3.3 Intro...

  18. Analysis of spectral methods for the homogeneous Boltzmann equation

    KAUST Repository

    Filbet, Francis; Mouhot, Clément

    2011-01-01

    The development of accurate and fast algorithms for the Boltzmann collision integral and their analysis represent a challenging problem in scientific computing and numerical analysis. Recently, several works were devoted to the derivation of spectrally accurate schemes for the Boltzmann equation, but very few of them were concerned with the stability analysis of the method. In particular there was no result of stability except when the method was modified in order to enforce the positivity preservation, which destroys the spectral accuracy. In this paper we propose a new method to study the stability of homogeneous Boltzmann equations perturbed by smoothed balanced operators which do not preserve positivity of the distribution. This method takes advantage of the "spreading" property of the collision, together with estimates on regularity and entropy production. As an application we prove stability and convergence of spectral methods for the Boltzmann equation, when the discretization parameter is large enough (with explicit bound). © 2010 American Mathematical Society.

  20. Analysis of risk assessment methods for goods trucking

    Directory of Open Access Journals (Sweden)

    Yunyazova A.O.

    2018-04-01

    Full Text Available The article considers models of risk assessment that can be applied to cargo transportation for forecasting possible damage, in the form of financial and material costs, in order to reduce the probability of its occurrence. The analysis of risk by the method «Criterion. Event. Rule» is presented. This method is based on the collection of information by various means, assigning an assessment to the identified risks, ranking them and formulating a report on the analysis. It can be carried out as a fully manual method of collecting information and performing calculations, or it can be brought to an automated level, from data collection to the delivery of finished results (but in this case some nuances that could significantly influence the outcome of the analysis can be ignored). The expert method is of particular importance, since it relies directly on human experience, and here a special role is played by the human factor. The collection of information and the assessments assigned to risk groups depend on the extent to which the experts agree on the issue. The smaller the fluctuations in the values of the experts' estimates, the more accurate and optimal the results will be.

  1. Metal speciation: survey of environmental methods of analysis

    Energy Technology Data Exchange (ETDEWEB)

    Mach, M.H.; Nott, B.; Scott, J.W.; Maddalone, R.F.; Whiddon, N.T. [TRW, Redondo Beach, CA (United States). Chemistry Technology Dept.

    1996-07-01

    As part of a recent task under the EPRI Analytical Methods Qualification Program (RP 1851), TRW has surveyed the methods available for monitoring metal species in typical utility aqueous discharge streams. Methods for determining the individual species of these metals can become important in a regulatory sense as the EPA transitions to assessment of environmental risk based on bioavailability. For example, EPA considers methyl mercury and Cr(VI) much more toxic to the aquatic environment than inorganic mercury or Cr(III). The species of a given element can also differ in their transport and bioaccumulation. Methods for speciation generally include a selective separation step followed by standard metals analysis. Speciation, therefore, is mainly derived from the separation step and not from the method of final quantitation. Examples of separation/analysis include: selective extraction followed by graphite furnace atomic absorption or ICP-MS; separation by GC followed by metals detection; chelation and/or direct separation by LC followed by UV measurement or metals detection; and ion chromatography with conductivity, UV, or metals detection. There are a number of sampling issues associated with metal species such as stabilization (maintaining oxidation state), absorption, and filtration that need to be addressed in order to obtain and maintain a representative sample for analysis. 45 refs., 1 tab.

  2. Economic analysis of alternative LLW disposal methods

    International Nuclear Information System (INIS)

    Foutes, C.E.

    1987-01-01

    The Environmental Protection Agency (EPA) has evaluated the costs and benefits of alternative disposal technologies as part of its program to develop generally applicable environmental standards for the land disposal of low-level radioactive waste (LLW). Costs, population health effects and Critical Population Group (CPG) exposures resulting from alternative waste treatment and disposal methods were developed and input into the analysis. The cost-effectiveness analysis took into account a number of waste streams, hydrogeologic and climatic region settings, and waste treatment and disposal methods. Total costs of each level of a standard included costs for packaging, processing, transportation, and burial of waste. Benefits are defined in terms of reductions in the general population health risk (expected fatal cancers and genetic effects) evaluated over 10,000 years. A cost-effectiveness ratio was calculated for each alternative standard. This paper describes the alternatives considered and preliminary results of the cost-effectiveness analysis

  3. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method.

    Science.gov (United States)

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-02-01

    To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol. The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between TLC-densitometric and TLC-image analysis methods. As both methods were found to be equal, they therefore can be used for the determination of γ-oryzanol in cold pressed rice bran oil.
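
    The statistical comparison reported above is a standard paired t-test on per-sample measurements. A minimal sketch with invented γ-oryzanol values (not the study's data):

```python
import numpy as np
from scipy import stats

# Hypothetical gamma-oryzanol contents (mg/g) of the same six oil
# samples measured by both assays; values invented for illustration.
densitometric = np.array([2.81, 2.95, 3.02, 2.77, 2.90, 2.85])
image_based = np.array([2.79, 2.98, 3.00, 2.80, 2.88, 2.86])

t_stat, p_value = stats.ttest_rel(densitometric, image_based)
print(f"t = {t_stat:.2f}, p = {p_value:.2f}")
# p > 0.05 -> no statistically significant difference between the methods
```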

  4. Comparing methods of classifying life courses: Sequence analysis and latent class analysis

    NARCIS (Netherlands)

    Elzinga, C.H.; Liefbroer, Aart C.; Han, Sapphire

    2017-01-01

    We compare life course typology solutions generated by sequence analysis (SA) and latent class analysis (LCA). First, we construct an analytic protocol to arrive at typology solutions for both methodologies and present methods to compare the empirical quality of alternative typologies. We apply this

  6. Digital dream analysis: a revised method.

    Science.gov (United States)

    Bulkeley, Kelly

    2014-10-01

    This article demonstrates the use of a digital word search method designed to provide greater accuracy, objectivity, and speed in the study of dreams. A revised template of 40 word search categories, built into the website of the Sleep and Dream Database (SDDb), is applied to four "classic" sets of dreams: The male and female "Norm" dreams of Hall and Van de Castle (1966), the "Engine Man" dreams discussed by Hobson (1988), and the "Barb Sanders Baseline 250" dreams examined by Domhoff (2003). A word search analysis of these original dream reports shows that a digital approach can accurately identify many of the same distinctive patterns of content found by previous investigators using much more laborious and time-consuming methods. The results of this study emphasize the compatibility of word search technologies with traditional approaches to dream content analysis. Copyright © 2014 Elsevier Inc. All rights reserved.

  7. Qualitative data analysis a methods sourcebook

    CERN Document Server

    Miles, Matthew B; Saldana, Johnny

    2014-01-01

    The Third Edition of Miles & Huberman's classic research methods text is updated and streamlined by Johnny Saldaña, author of The Coding Manual for Qualitative Researchers. Several of the data display strategies from previous editions are now presented in re-envisioned and reorganized formats to enhance reader accessibility and comprehension. The Third Edition's presentation of the fundamentals of research design and data management is followed by five distinct methods of analysis: exploring, describing, ordering, explaining, and predicting. Miles and Huberman's original research studies are profiled and accompanied with new examples from Saldaña's recent qualitative work. The book's most celebrated chapter, "Drawing and Verifying Conclusions," is retained and revised, and the chapter on report writing has been greatly expanded, and is now called "Writing About Qualitative Research." Comprehensive and authoritative, Qualitative Data Analysis has been elegantly revised for a new generation of qualitative r...

  8. Application of Stochastic Sensitivity Analysis to Integrated Force Method

    Directory of Open Access Journals (Sweden)

    X. F. Wei

    2012-01-01

    Full Text Available As a new formulation in structural analysis, the Integrated Force Method has been successfully applied to many structures in civil, mechanical, and aerospace engineering due to its accurate estimation of forces in computation. It is now being further extended to the probabilistic domain. For the assessment of uncertainty effects in system optimization and identification, the probabilistic sensitivity analysis of IFM was further investigated in this study. A set of stochastic sensitivity analysis formulations for the Integrated Force Method was developed using the perturbation method. Numerical examples are presented to illustrate its application. Its efficiency and accuracy were also substantiated with direct Monte Carlo simulations and the reliability-based sensitivity method. The numerical algorithm was shown to be readily adaptable to the existing program, since the models of stochastic finite element and stochastic design sensitivity are almost identical.

  9. Novel approach in quantitative analysis of shearography method

    International Nuclear Information System (INIS)

    Wan Saffiey Wan Abdullah

    2002-01-01

    The application of laser interferometry in industrial non-destructive testing and material characterization is becoming more prevalent, since this method provides non-contact, full-field inspection of the test object. However, its application has so far been limited to qualitative analysis; the current trend is toward the development of quantitative analysis, which attempts to characterize the examined defect in detail and to accommodate a range of object sizes. The growing commercial demand for quantitative analysis in NDT and material characterization is driving the quality of optical and analysis instruments. However, very little attention is currently being paid to understanding, quantifying and compensating for the numerous error sources which are a function of the interferometer. This paper presents a comparison of measurement analysis using the established theoretical approach and a new approach that takes into account the factor of divergent illumination and other geometrical factors. The differences between the measurement systems can be attributed to these error factors. (Author)

  10. Comparison of global sensitivity analysis methods – Application to fuel behavior modeling

    Energy Technology Data Exchange (ETDEWEB)

    Ikonen, Timo, E-mail: timo.ikonen@vtt.fi

    2016-02-15

    Highlights: • Several global sensitivity analysis methods are compared. • The methods’ applicability to nuclear fuel performance simulations is assessed. • The implications of large input uncertainties and complex models are discussed. • Alternative strategies to perform sensitivity analyses are proposed. - Abstract: Fuel performance codes have two characteristics that make their sensitivity analysis challenging: large uncertainties in input parameters and complex, non-linear and non-additive structure of the models. The complex structure of the code leads to interactions between inputs that show as cross terms in the sensitivity analysis. Due to the large uncertainties of the inputs these interactions are significant, sometimes even dominating the sensitivity analysis. For the same reason, standard linearization techniques do not usually perform well in the analysis of fuel performance codes. More sophisticated methods are typically needed in the analysis. To this end, we compare the performance of several sensitivity analysis methods in the analysis of a steady state FRAPCON simulation. The comparison of importance rankings obtained with the various methods shows that even the simplest methods can be sufficient for the analysis of fuel maximum temperature. However, the analysis of the gap conductance requires more powerful methods that take into account the interactions of the inputs. In some cases, moment-independent methods are needed. We also investigate the computational cost of the various methods and present recommendations as to which methods to use in the analysis.
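
    For readers new to variance-based global sensitivity analysis, the sketch below estimates first-order Sobol indices with a pick-and-freeze Monte Carlo estimator on a toy additive model. It is a generic illustration of one common estimator (Saltelli-style), not the fuel-performance analysis performed in the paper.

```python
import numpy as np

def first_order_sobol(model, n_inputs, n_samples=2**14, seed=0):
    """Monte Carlo estimate of first-order Sobol indices for a model
    with independent U(0,1) inputs."""
    rng = np.random.default_rng(seed)
    A = rng.random((n_samples, n_inputs))
    B = rng.random((n_samples, n_inputs))
    yA, yB = model(A), model(B)
    var = np.var(yA)
    s1 = np.empty(n_inputs)
    for i in range(n_inputs):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                # resample only input i
        s1[i] = np.mean(yB * (model(ABi) - yA)) / var
    return s1

# Additive test model: variance shares should be close to (1, 4, 9)/14
model = lambda x: x[:, 0] + 2.0 * x[:, 1] + 3.0 * x[:, 2]
print(first_order_sobol(model, 3))         # ~ [0.07, 0.29, 0.64]
```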

  11. Advanced symbolic analysis for VLSI systems methods and applications

    CERN Document Server

    Shi, Guoyong; Tlelo Cuautle, Esteban

    2014-01-01

    This book provides comprehensive coverage of the recent advances in symbolic analysis techniques for design automation of nanometer VLSI systems. The presentation is organized in parts of fundamentals, basic implementation methods and applications for VLSI design. Topics emphasized include statistical timing and crosstalk analysis, statistical and parallel analysis, performance bound analysis and behavioral modeling for analog integrated circuits. Among the recent advances, the Binary Decision Diagram (BDD) based approaches are studied in depth. The BDD-based hierarchical symbolic analysis approaches have essentially broken the analog circuit size barrier. In particular, this book: • Provides an overview of classical symbolic analysis methods and a comprehensive presentation on the modern BDD-based symbolic analysis techniques; • Describes detailed implementation strategies for BDD-based algorithms, including the principles of zero-suppression, variable ordering and canonical reduction; • Int...

  12. Transit Traffic Analysis Zone Delineating Method Based on Thiessen Polygon

    Directory of Open Access Journals (Sweden)

    Shuwei Wang

    2014-04-01

    Full Text Available A green transportation system composed of rail transit, buses and bicycles could be significant in alleviating traffic congestion. However, the inaccuracy of current transit ridership forecasting methods is imposing a negative impact on the development of urban transit systems. Traffic Analysis Zone (TAZ) delineation is a fundamental and essential step in ridership forecasting, yet the existing delineation methods in four-step models have problems in reflecting the travel characteristics of urban transit. This paper aims to develop a Transit Traffic Analysis Zone delineation method as a supplement to traditional TAZs in transit service analysis. The deficiencies of current TAZ delineation methods were analyzed, and the requirements of Transit Traffic Analysis Zones (TTAZ) were summarized. Considering these requirements, Thiessen polygons were introduced into TTAZ delineation. In order to validate its feasibility, Beijing was then taken as an example to delineate TTAZs, followed by a spatial analysis of office buildings within a TTAZ and transit station departure passengers. Analysis results show that TTAZs based on Thiessen polygons can reflect transit travel characteristics and merit in-depth research.
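
    Thiessen polygons are the Voronoi cells of the station points, so a TTAZ-style partition can be sketched directly with scipy; the station coordinates below are invented, and cells on the edge of the study area would still need clipping to a boundary.

```python
import numpy as np
from scipy.spatial import Voronoi

# Hypothetical transit station coordinates (km); each Voronoi cell is
# the Thiessen polygon, i.e. the candidate TTAZ, of the station inside it.
stations = np.array([[0.0, 0.0], [2.0, 0.5], [1.0, 2.0],
                     [3.0, 2.5], [0.5, 3.0]])
vor = Voronoi(stations)

for i, region_index in enumerate(vor.point_region):
    region = vor.regions[region_index]
    if -1 in region:      # unbounded cell: clip to the study boundary
        print(f"station {i}: unbounded cell")
    else:
        print(f"station {i}: vertices\n{vor.vertices[region]}")
```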

  13. Modern methods of wine quality analysis

    Directory of Open Access Journals (Sweden)

    Галина Зуфарівна Гайда

    2015-06-01

    Full Text Available In this paper, physical-chemical and enzymatic methods for the quantitative analysis of the basic wine components are reviewed. The results of our own experiments on the development of enzyme- and cell-based amperometric sensors for ethanol, lactate, glucose and arginine are presented.

  14. X-ray fluorescence method for trace analysis and imaging

    International Nuclear Information System (INIS)

    Hayakawa, Shinjiro

    2000-01-01

    X-ray fluorescence analysis has a long history as a conventional bulk elemental analysis technique with medium sensitivity. However, with the use of synchrotron radiation, the x-ray fluorescence method has become a unique analytical technique which can provide trace elemental information with spatial resolution. To obtain quantitative information on trace elemental distributions using the x-ray fluorescence method, a theoretical description of the x-ray fluorescence yield is given. Moreover, methods and instruments for trace characterization with a scanning x-ray microprobe are described. (author)

  15. Application of Homotopy Analysis Method to Solve Relativistic Toda Lattice System

    International Nuclear Information System (INIS)

    Wang Qi

    2010-01-01

    In this letter, the homotopy analysis method is successfully applied to solve the Relativistic Toda lattice system. Comparisons are made between the results of the proposed method and exact solutions. The analysis results show that the homotopy analysis method is a powerful and easy-to-use analytic tool for solving systems of differential-difference equations. (general)

  16. The colour analysis method applied to homogeneous rocks

    Directory of Open Access Journals (Sweden)

    Halász Amadé

    2015-12-01

    Full Text Available Computer-aided colour analysis can facilitate cyclostratigraphic studies. Here we report on a case study involving the development of a digital colour analysis method for examination of the Boda Claystone Formation which is the most suitable in Hungary for the disposal of high-level radioactive waste. Rock type colours are reddish brown or brownish red, or any shade between brown and red. The method presented here could be used to differentiate similar colours and to identify gradual transitions between these; the latter are of great importance in a cyclostratigraphic analysis of the succession. Geophysical well-logging has demonstrated the existence of characteristic cyclic units, as detected by colour and natural gamma. Based on our research, colour, natural gamma and lithology correlate well. For core Ib-4, these features reveal the presence of orderly cycles with thicknesses of roughly 0.64 to 13 metres. Once the core has been scanned, this is a time- and cost-effective method.

  17. Applicability of soil-structure interaction analysis methods for earthquake loadings (V)

    International Nuclear Information System (INIS)

    Chang, S. P.; Ko, H. M.; Kim, J. K.; Yoon, J. Y.; Chin, B. M.; Yang, T. S.; Park, J. Y.; Cho, J. R.; Ryu, H.

    1997-07-01

    The ultimate goals of this research are to cultivate the capability of accurate SSI analysis and to develop an effective soil-structure interaction analysis method and computer program by comparing analysis results obtained in the Lotung/Hualien LSST projects. The scope of this study is to establish a method of soil-structure interaction analysis using hyperelements, to develop a computer program for SSI analysis, to carry out parametric studies for the comprehension of the characteristics and applicability of hyperelements, and to verify the validity and applicability of this method (or program) through the analysis of the seismic response of the Hualien LSST project. In this study, we verified the validity and efficiency of the soil-structure interaction analysis method using hyperelements and developed computer programs based on them. Based on 1-dimensional wave propagation theory, we developed a computer program for free-field analysis considering the primary non-linearity of seismic responses, and using this program we computed the effective ground earthquake motions of soil regions. The computer programs using hyperelements can treat the non-homogeneity of soil regions very easily and perform the analysis quickly by using analytical solutions in the horizontal direction. So this method would be a very efficient and practical one.

  18. Design Analysis Method for Multidisciplinary Complex Product using SysML

    Directory of Open Access Journals (Sweden)

    Liu Jihong

    2017-01-01

    Full Text Available In the design of multidisciplinary complex products, model-based systems engineering methods are widely used. However, these methodologies contain only the modeling order and simple analysis steps, and lack integrated design analysis methods supporting the whole process. To solve this problem, a conceptual design analysis method integrating modern design methods is proposed. First, based on requirement analysis with a quantization matrix, the user's needs are quantitatively evaluated and translated into system requirements. Then, through function decomposition against a function knowledge base, the total function is semi-automatically decomposed into predefined atomic functions. Each function is matched to a predefined structure through the behaviour layer using function-structure mapping based on interface matching. Finally, based on the design structure matrix (DSM), the structure reorganization is completed. The analysis process is implemented with SysML and illustrated through an aircraft air conditioning system for validation.

  19. Theoretical analysis and experimental study of spray degassing method

    International Nuclear Information System (INIS)

    Wu Ruizhi; Shu Da; Sun Baode; Wang Jun; Li Fei; Chen Haiyan; Lu YanLing

    2005-01-01

    A new hydrogen-removal method for aluminum melt, spray degassing, is presented, and the thermodynamic and kinetic analyses of the method are discussed. A comparison between the thermodynamics and kinetics of the spray degassing method and the rotary impeller degassing method is made. The thermodynamic analysis shows that the relationship between the final hydrogen content of the aluminum melt and the ratio of purge gas flow rate to melt flow rate is linear. The result of the thermodynamic calculation shows that, in spray degassing, when the ratio G/q is larger than 2.2 x 10^-6, the final hydrogen content will be less than 0.1 ml/100 g Al. From the kinetic analysis, the degassing effect is affected by both the size of the melt droplets and the time taken for the droplets to travel from the sprayer to the bottom of the treatment tank. In a numerical calculation, the hydrogen in aluminum melt can be degassed from 0.2 ml/100 g Al to 0.05 ml/100 g Al in 0.02 s with the spray degassing method. Finally, water-model experiments with the spray degassing method and the rotary impeller degassing method are presented, together with melt experiments. Both the water-model and the melt experiments show that the degassing effect of the spray degassing method is better than that of the rotary impeller method

  20. Cost–benefit analysis method for building solutions

    International Nuclear Information System (INIS)

    Araújo, Catarina; Almeida, Manuela; Bragança, Luís; Barbosa, José Amarilio

    2016-01-01

    Highlights: • A new cost–benefit method was developed to compare building solutions. • The method considers energy performance, life cycle costs and investment willingness. • The graphical analysis helps stakeholders to easily compare building solutions. • The method was applied to a case study showing consistency and feasibility. - Abstract: The building sector is responsible for consuming approximately 40% of the final energy in Europe. However, more than 50% of this consumption can be reduced through energy-efficient measures. Our society is facing not only a severe and unprecedented environmental crisis but also an economic crisis of similar magnitude. In light of this, the EU has developed legislation promoting the use of the Cost-Optimal (CO) method, in which the selection criterion is based on life cycle costs, in order to improve building energy efficiency. Nevertheless, studies show that the implementation of energy-efficient solutions is far from ideal. Therefore, it is very important to analyse the reasons for this gap between theory and implementation as well as improve selection methods. This study aims to develop a methodology based on a cost-effectiveness analysis, which can be seen as an improvement to the CO method as it considers the investment willingness of stakeholders in the selection process of energy-efficient solutions. The method uses a simple graphical display in which the stakeholders' investment willingness is identified as the slope of a reference line, allowing easy selection between building solutions. This method will lead to the selection of more desired (from the stakeholders' point of view) and more energy-efficient solutions than those selected through the CO method.

  1. Research and application of sampling and analysis method of sodium aerosol

    International Nuclear Information System (INIS)

    Yu Xiaochen; Guo Qingzhou; Wen Ximeng

    1998-01-01

    A sampling and analysis method for sodium aerosol has been developed. Vacuum sampling technology is used in the sampling process, and the analysis methods adopted are volumetric analysis and atomic absorption. When the absolute content of sodium is in the range of 0.1 mg to 1.0 mg, the deviation between the results of volumetric analysis and atomic absorption is less than 2%. The method has been applied successfully in a sodium aerosol removal device. The analysis range, accuracy and precision meet the requirements for sodium aerosol research

  2. Endurance time method for Seismic analysis and design of structures

    International Nuclear Information System (INIS)

    Estekanchi, H.E.; Vafai, A.; Sadeghazar, M.

    2004-01-01

    In this paper, a new method for performance based earthquake analysis and design has been introduced. In this method, the structure is subjected to accelerograms that impose increasing dynamic demand on the structure with time. Specified damage indexes are monitored up to the collapse level or other performance limit that defines the endurance limit point for the structure. Also, a method for generating standard intensifying accelerograms has been described. Three accelerograms have been generated using this method. Furthermore, the concept of Endurance Time has been described by applying these accelerograms to single and multi degree of freedom linear systems. The application of this method for analysis of complex nonlinear systems has been explained. Endurance Time method provides a uniform approach to seismic analysis and design of complex structures that can be applied in numerical and experimental investigations
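
    A toy version of the Endurance Time idea: integrate a damped single-degree-of-freedom oscillator under an excitation whose amplitude grows with time, and record when a damage index (here a displacement limit) is first exceeded. The synthetic record, parameters and limit below are illustrative assumptions, not the paper's standardized accelerograms.

```python
import numpy as np

def endurance_time(omega=2 * np.pi, zeta=0.05, drift_limit=0.01,
                   duration=40.0, dt=0.005, seed=0):
    """First time a linear SDOF oscillator exceeds a displacement limit
    under band-limited noise with a linearly growing envelope."""
    rng = np.random.default_rng(seed)
    t = np.arange(0.0, duration, dt)
    ag = (t / duration) * rng.standard_normal(t.size)  # intensifying input
    u = v = 0.0
    for ti, a in zip(t, ag):
        acc = -a - 2.0 * zeta * omega * v - omega**2 * u  # equation of motion
        v += acc * dt                                     # semi-implicit Euler
        u += v * dt
        if abs(u) > drift_limit:
            return ti              # endurance time: damage index exceeded
    return None                    # limit never reached within the record

print(endurance_time())            # endurance time in seconds
```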

  3. EXPLANATORY METHODS OF MARKETING DATA ANALYSIS – THEORETICAL AND METHODOLOGICAL CONSIDERATIONS

    Directory of Open Access Journals (Sweden)

    Rozalia GABOR

    2010-01-01

    Full Text Available Explanatory methods of data analysis, also named supervised learning methods by some authors, enable researchers to identify and analyse configurations of relations between two or several variables, most of them with high accuracy, since statistical significance can be tested by calculating the confidence level associated with validating the relation concerned across the entire population and not only the surveyed sample. The paper presents some of these methods: variance analysis, covariance analysis, segmentation and discriminant analysis, noting, for every method, its area of applicability in marketing research.

  4. Statistical models and methods for reliability and survival analysis

    CERN Document Server

    Couallier, Vincent; Huber-Carol, Catherine; Mesbah, Mounir; Huber -Carol, Catherine; Limnios, Nikolaos; Gerville-Reache, Leo

    2013-01-01

    Statistical Models and Methods for Reliability and Survival Analysis brings together contributions by specialists in statistical theory as they discuss their applications providing up-to-date developments in methods used in survival analysis, statistical goodness of fit, stochastic processes for system reliability, amongst others. Many of these are related to the work of Professor M. Nikulin in statistics over the past 30 years. The authors gather together various contributions with a broad array of techniques and results, divided into three parts - Statistical Models and Methods, Statistical

  5. Lag profile inversion method for EISCAT data analysis

    Directory of Open Access Journals (Sweden)

    I. I. Virtanen

    2008-03-01

    Full Text Available The present standard EISCAT incoherent scatter experiments are based on alternating codes that are decoded in power domain by simple summation and subtraction operations. The signal is first digitised and then different lagged products are calculated and decoded in real time. Only the decoded lagged products are saved for further analysis so that both the original data samples and the undecoded lagged products are lost. A fit of plasma parameters can be later performed using the recorded lagged products. In this paper we describe a different analysis method, which makes use of statistical inversion in removing range ambiguities from the lag profiles. An analysis program carrying out both the lag profile inversion and the fit of the plasma parameters has been constructed. Because recording the received signal itself instead of the lagged products allows very flexible data analysis, the program is constructed to use raw data, i.e. IQ-sampled signal recorded from an IF stage of the radar. The program is now capable of analysing standard alternating-coded EISCAT experiments as well as experiments with any other kind of radar modulation if raw data is available. The program calculates the ambiguous lag profiles and is capable of inverting them as such but, for analysis in real time, time integration is needed before inversion. We demonstrate the method using alternating code experiments in the EISCAT UHF radar and specific hardware connected to the second IF stage of the receiver. This method produces a data stream of complex samples, which are stored for later processing. The raw data is analysed with lag profile inversion and the results are compared to those given by the standard method.
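
    For background: with the raw IQ stream recorded, forming the range-ambiguous lagged products that lag profile inversion subsequently deconvolves is simple bookkeeping. A minimal sketch of that first step only (not the EISCAT decoding or inversion chain):

```python
import numpy as np

def ambiguous_lag_profiles(iq, n_lags):
    """Range-ambiguous lagged products L[k, t] = z(t) * conj(z(t + k))
    of a sampled IQ voltage stream z."""
    n = iq.size
    lags = np.zeros((n_lags, n), dtype=complex)
    for k in range(n_lags):
        lags[k, : n - k] = iq[: n - k] * np.conj(iq[k:])
    return lags

rng = np.random.default_rng(1)
iq = rng.standard_normal(1000) + 1j * rng.standard_normal(1000)
print(ambiguous_lag_profiles(iq, n_lags=8).shape)  # (8, 1000)
```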

  6. Annular dispersed flow analysis model by Lagrangian method and liquid film cell method

    International Nuclear Information System (INIS)

    Matsuura, K.; Kuchinishi, M.; Kataoka, I.; Serizawa, A.

    2003-01-01

    A new annular dispersed flow analysis model was developed. In this model, both droplet behavior and liquid film behavior were analyzed simultaneously. Droplet behavior in turbulent flow was analyzed by the Lagrangian method with a refined stochastic model, while liquid film behavior was simulated using a moving-rough-wall boundary condition and a liquid film cell model, which was used to estimate the liquid film flow rate. The height of the moving rough wall was estimated by a disturbance wave height correlation. In each liquid film cell, the liquid film flow rate was calculated by considering the droplet deposition and entrainment flow rates; the droplet deposition flow rate was calculated by the Lagrangian method and the entrainment flow rate by an entrainment correlation. For the verification of the moving rough wall model, turbulent flow analysis results under annular flow conditions were compared with experimental data, and the agreement was fairly good. Furthermore, annular dispersed flow experiments were analyzed in order to verify the droplet behavior model and the liquid film cell model. The experimental results for the radial distribution of droplet mass flux were compared with analysis results. The agreement was good under low liquid flow rate conditions and poor under high liquid flow rate conditions, but after modifying the entrainment rate correlation the agreement became good even at high liquid flow rates. This means that the basic analysis method for droplet and liquid film behavior was sound. In future work, verification calculations should be carried out under different experimental conditions and the entrainment rate correlation should also be refined

  7. Task analysis methods applicable to control room design review (CDR)

    International Nuclear Information System (INIS)

    Moray, N.P.; Senders, J.W.; Rhodes, W.

    1985-06-01

    This report presents the results of a research study conducted in support of the human factors engineering program of the Atomic Energy Control Board in Canada. It contains five products which may be used by the Atomic Energy Control Board in relation to Task Analysis of jobs in CANDU nuclear power plants: 1. a detailed method for preparing for a task analysis; 2. a Task Data Form for recording task analysis data; 3. a detailed method for carrying out task analyses; 4. a guide to assessing alternative methods for performing task analyses, if such are proposed by utilities or consultants; and 5. an annotated bibliography on task analysis. In addition, a short explanation of the origins, nature and uses of task analysis is provided, with some examples of its cost effectiveness. 35 refs

  8. A METHOD FOR EXERGY ANALYSIS OF SUGARCANE BAGASSE BOILERS

    Directory of Open Access Journals (Sweden)

    CORTEZ L.A.B.

    1998-01-01

    Full Text Available This work presents a method for conducting a thermodynamic analysis of sugarcane bagasse boilers. The method is based on the standard and actual reactions, which allow the calculation of the enthalpies of each process sub-equation and of the exergies of each of the main flow rates participating in the combustion. The method is presented using an example with real data from a sugarcane bagasse boiler. A summary of the results obtained is also presented, based on the First Law of Thermodynamics analysis, the exergetic efficiencies, and the irreversibility rates. The method presented is very rigorous with respect to data consistency, particularly for the flue gas composition.

  9. Use of some nuclear methods for materials analysis

    International Nuclear Information System (INIS)

    Habbani, Farouk

    1994-01-01

    A review is given of the use of two nuclear-related analytical methods, namely X-ray fluorescence (XRF) and neutron activation analysis (NAA), for the determination of the elemental composition of various materials. Special emphasis is given to the use of XRF for the analysis of geological samples, and of NAA for the analysis of foodstuffs for their protein content. (Author)

  10. Beyond perturbation introduction to the homotopy analysis method

    CERN Document Server

    Liao, Shijun

    2003-01-01

    Solving nonlinear problems is inherently difficult, and the stronger the nonlinearity, the more intractable solutions become. Analytic approximations often break down as nonlinearity becomes strong, and even perturbation approximations are valid only for problems with weak nonlinearity. This book introduces a powerful new analytic method for nonlinear problems, homotopy analysis, that remains valid even with strong nonlinearity. In Part I, the author starts with a very simple example, then presents the basic ideas, detailed procedures, and the advantages (and limitations) of homotopy analysis. Part II illustrates the application of homotopy analysis to many interesting nonlinear problems. These range from simple bifurcations of a nonlinear boundary-value problem to the Thomas-Fermi atom model, Volterra's population model, Von Kármán swirling viscous flow, and nonlinear progressive waves in deep water. Although the homotopy analysis method has been verified in a number of prestigious journals, it has yet to be ...

  11. Paired comparisons analysis: an axiomatic approach to ranking methods

    NARCIS (Netherlands)

    Gonzalez-Diaz, J.; Hendrickx, Ruud; Lohmann, E.R.M.A.

    2014-01-01

    In this paper we present an axiomatic analysis of several ranking methods for general tournaments. We find that the ranking method obtained by applying maximum likelihood to the (Zermelo-)Bradley-Terry model, the most common method in statistics and psychology, is one of the ranking methods that

  12. Solving the discrete KdV equation with homotopy analysis method

    International Nuclear Information System (INIS)

    Zou, L.; Zong, Z.; Wang, Z.; He, L.

    2007-01-01

    In this Letter, we apply the homotopy analysis method to differential-difference equations. We take the discrete KdV equation as an example, and successfully obtain double periodic wave solutions and solitary wave solutions. This illustrates the validity and great potential of the homotopy analysis method for solving the discrete KdV equation. Comparisons are made between the results of the proposed method and exact solutions. The results reveal that the proposed method is very effective and convenient

  13. Statistical trend analysis methods for temporal phenomena

    Energy Technology Data Exchange (ETDEWEB)

    Lehtinen, E.; Pulkkinen, U. [VTT Automation, (Finland); Poern, K. [Poern Consulting, Nykoeping (Sweden)

    1997-04-01

    We consider point events occurring in a random way in time. In many applications the pattern of occurrence is of intrinsic interest as indicating a trend or some other systematic feature in the rate of occurrence. The purpose of this report is to survey briefly different statistical trend analysis methods and illustrate their applicability to temporal phenomena in particular. The trend testing of point events is usually seen as the testing of the hypotheses concerning the intensity of the occurrence of events. When the intensity function is parametrized, the testing of trend is a typical parametric testing problem. In industrial applications the operational experience generally does not suggest any specified model and method in advance. Therefore, and particularly, if the Poisson process assumption is very questionable, it is desirable to apply tests that are valid for a wide variety of possible processes. The alternative approach for trend testing is to use some non-parametric procedure. In this report we have presented four non-parametric tests: The Cox-Stuart test, the Wilcoxon signed ranks test, the Mann test, and the exponential ordered scores test. In addition to the classical parametric and non-parametric approaches we have also considered the Bayesian trend analysis. First we discuss a Bayesian model, which is based on a power law intensity model. The Bayesian statistical inferences are based on the analysis of the posterior distribution of the trend parameters, and the probability of trend is immediately seen from these distributions. We applied some of the methods discussed in an example case. It should be noted, that this report is a feasibility study rather than a scientific evaluation of statistical methods, and the examples can only be seen as demonstrations of the methods. 14 refs, 10 figs.
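
    Of the four non-parametric procedures listed above, the Cox-Stuart test is the quickest to sketch: pair each observation with its counterpart half a series later and apply a sign test to the differences. A minimal implementation with made-up count data (scipy's binomial test does the final step):

```python
from scipy import stats

def cox_stuart(x):
    """Two-sided Cox-Stuart sign test for a monotone trend."""
    c = len(x) // 2
    pairs = zip(x[:c], x[-c:])        # drops the middle value if n is odd
    diffs = [b - a for a, b in pairs if b != a]   # ties are discarded
    n_pos = sum(d > 0 for d in diffs)
    return stats.binomtest(n_pos, n=len(diffs), p=0.5).pvalue

# Yearly event counts with an apparent increase (invented numbers):
counts = [2, 3, 2, 4, 5, 4, 6, 7, 6, 8]
print(f"p = {cox_stuart(counts):.4f}")  # -> p = 0.0625
```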

  15. Sensitivity Analysis of Structures by Virtual Distortion Method

    DEFF Research Database (Denmark)

    Gierlinski, J.T.; Holnicki-Szulc, J.; Sørensen, John Dalsgaard

    1991-01-01

    are used in structural optimization, see Haftka [4]. The recently developed Virtual Distortion Method (VDM) is a numerical technique which offers an efficient approach to the calculation of the sensitivity derivatives. This method has been originally applied to structural remodelling and collapse analysis, see...

  16. Bioimaging of cells and tissues using accelerator-based sources.

    Science.gov (United States)

    Petibois, Cyril; Cestelli Guidi, Mariangela

    2008-07-01

    A variety of techniques exist that provide chemical information in the form of a spatially resolved image: electron microprobe analysis, nuclear microprobe analysis, synchrotron radiation microprobe analysis, secondary ion mass spectrometry, and confocal fluorescence microscopy. Linear (LINAC) and circular (synchrotrons) particle accelerators have been constructed worldwide to provide the scientific community with unprecedented analytical performance. Now, these facilities match at least one of the three analytical features required for the biological field: (1) a sufficient spatial resolution for single-cell analysis ... The pros and cons of the most popular techniques that have been implemented on accelerator-based sources to address analytical issues on biological specimens are discussed.

  17. Reference peak method for analysis of doublets in gamma-ray spectrometry used in neutron activation analysis

    International Nuclear Information System (INIS)

    Wasek, M.; Cichowlas, A.; Sterlinski, S.; Dybczynski, R.

    2000-01-01

    A simple algebraic method for the quantitative analysis of doublets in gamma-ray spectra from HPGe detectors is presented. The calculation algorithm is implemented using the Microsoft Excel program. The method does not require any assumptions regarding the shape of the peaks in the spectrum. The possibilities for quantitative analysis of doublets of various intensity ratios and for separation of their components are discussed in detail. Practical examples proved the usefulness of the method also for the analysis of closed doublets. (author)

  18. Analysis of a time fractional wave-like equation with the homotopy analysis method

    International Nuclear Information System (INIS)

    Xu Hang; Cang Jie

    2008-01-01

    The time fractional wave-like differential equation with a variable coefficient is studied analytically. By using a simple transformation, the governing equation is reduced to two fractional ordinary differential equations. Then the homotopy analysis method is employed to derive the solutions of these equations. Accurate series solutions are obtained. In particular, when ℏ_f = ℏ_g = -1, these solutions are exactly the same as the results given by the Adomian decomposition method. The present work shows the validity and great potential of the homotopy analysis method for solving nonlinear fractional differential equations. The basic idea described in this Letter is expected to be further employed to solve other similar nonlinear problems in fractional calculus.

  19. Determining wood chip size: image analysis and clustering methods

    Directory of Open Access Journals (Sweden)

    Paolo Febbi

    2013-09-01

    Full Text Available One of the standard methods for the determination of the size distribution of wood chips is the oscillating screen method (EN 15149-1:2010). Recent literature demonstrated how image analysis could return highly accurate measures of the dimensions defined for each individual particle, and could promote a new method, based on the geometrical shape, to determine the chip size in a more accurate way. A sample of wood chips (8 litres) was sieved through horizontally oscillating sieves, using five different screen hole diameters (3.15, 8, 16, 45, 63 mm); the wood chips were sorted in decreasing size classes and the mass of all fractions was used to determine the size distribution of the particles. Since the chip shape and size influence the sieving results, Wang's theory, which concerns the geometric forms, was considered. A cluster analysis on the shape descriptors (Fourier descriptors) and size descriptors (area, perimeter, Feret diameters, eccentricity) was applied to observe the chip distribution. The UPGMA algorithm was applied on the Euclidean distance. The obtained dendrogram shows a group separation in accordance with the original three sieving fractions. A comparison has been made between the traditional sieving and the clustering results. This preliminary result shows that the image analysis-based method has a high potential for the characterization of wood chip size distribution and could be further investigated. Moreover, this method could be implemented in an online detection machine for chip size characterization. An improvement of the results is expected by using supervised multivariate methods that utilize known class memberships. The main objective of future activities will be to shift the analysis from a 2-dimensional method to a 3-dimensional acquisition process.
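
    The clustering step described in this record can be sketched as UPGMA (average linkage) on Euclidean distances over standardized size and shape descriptors. The descriptor values below are invented for illustration and do not come from the study.

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster
      from scipy.spatial.distance import pdist

      # Rows: chips; columns: area, perimeter, max/min Feret diameter, eccentricity.
      descriptors = np.array([
          [120.0,  45.0, 18.0,  9.0, 0.85],
          [115.0,  44.0, 17.5,  9.2, 0.83],
          [480.0,  95.0, 40.0, 16.0, 0.91],
          [500.0,  98.0, 41.0, 15.5, 0.92],
          [950.0, 140.0, 62.0, 22.0, 0.95],
      ])
      X = (descriptors - descriptors.mean(axis=0)) / descriptors.std(axis=0)

      Z = linkage(pdist(X, metric="euclidean"), method="average")  # UPGMA
      labels = fcluster(Z, t=3, criterion="maxclust")  # cut into 3 size classes
      print(labels)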

  20. Inverse thermal analysis method to study solidification in cast iron

    DEFF Research Database (Denmark)

    Dioszegi, Atilla; Hattel, Jesper

    2004-01-01

    Solidification modelling of cast metals is widely used to predict final properties in cast components. Accurate models necessitate good knowledge of the solidification behaviour. The present study includes a re-examination of the Fourier thermal analysis method. This involves an inverse numerical...... solution of a 1-dimensional heat transfer problem connected to solidification of cast alloys. In the analysis, the relation between the thermal state and the fraction solid of the metal is evaluated by a numerical method. This method contains an iteration algorithm controlled by an under-relaxation term...... inverse thermal analysis was tested on both experimental and simulated data....

  1. Analysis of Highly Nonlinear Oscillation System Using He's Max-Min Method and Comparison with Homotopy Analysis Method and Energy Balance Methods

    DEFF Research Database (Denmark)

    Ibsen, Lars Bo; Barari, Amin; Kimiaeifar, Amin

    2010-01-01

    of calculations. Results obtained by max–min are compared with the Homotopy Analysis Method (HAM), energy balance and numerical solution, and it is shown that simply one term is enough to obtain a highly accurate result, in contrast to HAM with just one term in the series solution. Finally, the phase plane to show...... the stability of systems is plotted and discussed....

  2. Neutron activation analysis of certified samples by the absolute method

    Science.gov (United States)

    Kadem, F.; Belouadah, N.; Idiri, Z.

    2015-07-01

    The nuclear reaction analysis technique is mainly based on the relative method or on the use of activation cross sections. In order to validate nuclear data for the calculated cross sections evaluated from systematic studies, we used the neutron activation analysis technique (NAA) to determine the various constituent concentrations of certified samples of animal blood, milk and hay. In this analysis, the absolute method is used. The neutron activation technique involves irradiating the sample and subsequently measuring its activity. The fundamental activation equation connects several physical parameters, including the cross section, which is essential for the quantitative determination of the different elements composing the sample without resorting to the use of standard samples. Called the absolute method, it allows measurements as accurate as the relative method. The results obtained by the absolute method showed that the values are as precise as those of the relative method, which requires a standard sample for each element to be quantified.
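
    A minimal numerical sketch of the fundamental activation equation mentioned above is given below; the reaction, flux, and timing values are illustrative assumptions, not data from the study.

      import math

      N_A = 6.022e23          # Avogadro's number [1/mol]

      def element_mass(activity, sigma_cm2, flux, half_life_s,
                       t_irr, t_decay, molar_mass, isotopic_abundance):
          """Mass [g] of the target element from the measured activity [Bq]."""
          lam = math.log(2) / half_life_s
          saturation = 1.0 - math.exp(-lam * t_irr)   # build-up during irradiation
          decay = math.exp(-lam * t_decay)            # decay before counting
          n_target = activity / (sigma_cm2 * flux * saturation * decay)
          return n_target * molar_mass / (N_A * isotopic_abundance)

      # Example: Na-23(n,gamma)Na-24 with an assumed thermal flux of 1e13 n/cm2/s.
      m = element_mass(activity=250.0, sigma_cm2=0.53e-24, flux=1e13,
                       half_life_s=14.96 * 3600, t_irr=3600, t_decay=1800,
                       molar_mass=22.99, isotopic_abundance=1.0)
      print(f"sodium mass ~ {m:.2e} g")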

  3. Analysis of a Braking System on the Basis of Structured Analysis Methods

    OpenAIRE

    Ben Salem J.; Lakhoua M.N.; El Amraoui L.

    2016-01-01

    In this paper, we present the general context of research in the domain of analysis and modeling of mechatronic systems. In fact, we present a bibliographic review of some research works on the systemic analysis of mechatronic systems. To better understand their characteristics, we start with an introduction to mechatronic systems and the various fields related to these systems; then we present a few analysis and design methods applied to mechatronic systems. Finally, we apply the two...

  4. Comparison of neutron activation analysis with other instrumental methods for elemental analysis of airborne particulate matter

    International Nuclear Information System (INIS)

    Regge, P. de; Lievens, F.; Delespaul, I.; Monsecour, M.

    1976-01-01

    A comparison of instrumental methods, including neutron activation analysis, X-ray fluorescence spectrometry, atomic absorption spectrometry and emission spectrometry, for the analysis of heavy metals in airborne particulate matter is described. The merits and drawbacks of each method for the routine analysis of a large number of samples are discussed. The sample preparation technique, calibration and statistical data relevant to each method are given. Concordant results are obtained by the different methods for Co, Cu, Ni, Pb and Zn. Less good agreement is obtained for Fe, Mn and V. The results are not in agreement for the elements Cd and Cr. Using data obtained on the dust sample distributed by Euratom-ISPRA within the framework of an interlaboratory comparison, the accuracy of each method for the various elements is estimated. Neutron activation analysis was found to be the most sensitive and accurate of the non-destructive analysis methods. Only atomic absorption spectrometry has a comparable sensitivity, but requires considerable preparation work. X-ray fluorescence spectrometry is less sensitive and shows biases for Cr and V. Automatic emission spectrometry with simultaneous measurement of the beam intensities by photomultipliers is the fastest and most economical technique, though at the expense of some precision and sensitivity. (author)

  5. Research on Visual Analysis Methods of Terrorism Events

    Science.gov (United States)

    Guo, Wenyue; Liu, Haiyan; Yu, Anzhu; Li, Jing

    2016-06-01

    Under the situation that terrorism events occur more and more frequently throughout the world, improving the response capability to social security incidents has become an important test of governments' ability to govern. Visual analysis has become an important method of event analysis for its advantages of intuitiveness and effectiveness. To analyse events' spatio-temporal distribution characteristics, correlations among event items and development trends, the spatio-temporal characteristics of terrorism events are discussed. A suitable event data table structure based on the "5W" theory is designed. Then, six types of visual analysis are proposed, and how to use thematic maps and statistical charts to realize visual analysis of terrorism events is studied. Finally, experiments were carried out using the data provided by the Global Terrorism Database, and the results prove the usability of the methods.

  6. Linear Algebraic Method for Non-Linear Map Analysis

    International Nuclear Information System (INIS)

    Yu, L.; Nash, B.

    2009-01-01

    We present a newly developed method to analyze some non-linear dynamics problems such as the Henon map using a matrix analysis method from linear algebra. Choosing the Henon map as an example, we analyze the spectral structure, the tune-amplitude dependence, the variation of tune and amplitude during the particle motion, etc., using the method of Jordan decomposition which is widely used in conventional linear algebra.
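
    The tune-amplitude dependence mentioned in this record can be illustrated numerically: the sketch below tracks an accelerator-style Henon map (linear rotation plus a sextupole-like kick) and reads the tune from the eigenvalue phase of the linear matrix and from turn-by-turn data. The bare tune and kick form are assumptions chosen for illustration, not the authors' parameters.

      import numpy as np

      nu0 = 0.31                       # assumed bare tune
      c, s = np.cos(2*np.pi*nu0), np.sin(2*np.pi*nu0)

      # Linear one-turn matrix: eigenvalues exp(+/- 2*pi*i*nu) encode the tune.
      M = np.array([[c, s], [-s, c]])
      eig = np.linalg.eigvals(M)
      print("tune from eigenvalue phase:", np.abs(np.angle(eig)) / (2*np.pi))

      def track(x0, n=1024):
          """Iterate the Henon map: sextupole kick followed by rotation."""
          x, p = x0, 0.0
          xs = np.empty(n)
          for i in range(n):
              p = p + x*x                       # nonlinear kick
              x, p = c*x + s*p, -s*x + c*p      # linear rotation
              xs[i] = x
          return xs

      # Tune vs amplitude: dominant FFT line of the turn-by-turn data.
      for x0 in (0.01, 0.05, 0.1):
          f = np.abs(np.fft.rfft(track(x0)))
          print(x0, (np.argmax(f[1:]) + 1) / 1024)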

  7. DESCRIBING FUNCTION METHOD FOR PI-FUZZY CONTROLLED SYSTEMS STABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Stefan PREITL

    2004-12-01

    Full Text Available The paper proposes a global stability analysis method dedicated to fuzzy control systems containing Mamdani PI-fuzzy controllers with output integration to control SISO linear / linearized plants. The method is expressed in terms of relatively simple steps, and it is based on: the generalization of the describing function method for the considered fuzzy control systems to the MIMO case, the approximation of the describing functions by applying the least squares method. The method is applied to the stability analysis of a class of PI-fuzzy controlled servo-systems, and validated by considering a case study.

  8. Chemical analysis by nuclear methods. v. 2

    International Nuclear Information System (INIS)

    Alfassi, Z.B.

    1998-01-01

    'Chemical Analysis by Nuclear Methods' is an effort of some renowned authors in the field of nuclear chemistry and radiochemistry, compiled by Alfassi, Z.B. and translated into a Farsi version collected in two volumes. The second volume consists of the following chapters: ion recoil scattering and elastic scattering detection are dealt with in the eleventh chapter; the twelfth chapter is devoted to nuclear reaction analysis using charged particles; X-ray emission is discussed in the thirteenth chapter; the fourteenth chapter is about the use of ion microprobes; X-ray fluorescence analysis is discussed in the fifteenth chapter; alpha, beta and gamma ray scattering in chemical analysis are dealt with in chapter sixteen; Moessbauer spectroscopy and positron annihilation are discussed in chapters seventeen and eighteen. The last two chapters are about isotope dilution analysis and radioimmunoassay.

  9. Application of pulse spectro- zonal luminescent method for the rapid method of material analysis

    International Nuclear Information System (INIS)

    Lisitsin, V.M.; Oleshko, V.I.; Yakovlev, A.N.

    2004-01-01

    Full text: The scope of luminescent methods of analysis covers a wide enough range of substances, as luminescence can be excited in the overwhelming majority of nonmetals. The analytical capabilities of luminescent methods can be essentially expanded by the use of pulsed excitation and registration of luminescence spectra with time-resolved methods. The most promising approach is to use pulses of high-current electron beams of nanosecond duration for excitation, for the following reasons: excitation is carried out by a deeply penetrating ionizing radiation; the radiation pulse has a high power, up to 10^8 W, but an energy of no more than 1 J; and the pulse has nanosecond duration. Electrons with energies of 300-400 keV penetrate to depths of a few tenths of a millimetre, i.e. they create volumetric excitation of a sample. Therefore the luminescence excited by an electron beam carries information about the volumetric properties of the substance. The high excitation density allows one to detect and study centers (defects) having a small luminescence yield and to analyze weakly luminescent objects. The appearance of new effects may also be useful for the analysis of materials. Information can be obtained from the change of the spectral structure of the luminescence over time after the end of the excitation pulse and from the kinetic characteristics of the luminescence decay. The point is that the radiation energy is absorbed mainly by the matrix, and the electronic excitations are then transferred to the luminescence centers (defects) of the lattice. Therefore, after the creation of electronic excitations the luminescence spectrum can change repeatedly, conveying information on the centers (defects) which are the most effective radiators at a given time. Hence, the study of the change of emission spectra over time provides an additional way of discriminating the information on the centers of a

  10. Determinants of investment behaviour. Methods and applications of meta-analysis

    International Nuclear Information System (INIS)

    Koetse, M.J.

    2006-01-01

    Meta-analysis is gradually gaining ground in economics as a research method to objectively and quantitatively summarise a body of existing empirical evidence. This dissertation studies the performance of well-known meta-analytic models and presents two meta-analysis applications. Despite its many attractive features, meta-analysis faces several methodical difficulties, especially when applied in economic research. We investigate two specific methodical problems that any meta-analysis in economics will have to deal with, viz., systematic effect-size variation due to primary-study misspecifications, and random effect-size heterogeneity. Using Monte-Carlo analysis we investigate the effects of these methodical problems on the results of a meta-analysis, and study the small-sample properties of several well-known and often applied meta-estimators. The focus of the meta-analysis applications is on two topics that are relevant for understanding investment behaviour, viz., the impact of uncertainty on investment spending, and the potential for substitution of capital for energy in production processes. In the first application we aim to shed light on the direction of the relationship between investment and uncertainty, and to uncover which factors are empirically relevant for explaining the wide variety in study outcomes. In the second application our goal is to analyse the direction and magnitude of capital-energy substitution potential, and to analyse the empirical relevance of suggested sources of variation in elasticity estimates.

  11. Determinants of investment behaviour. Methods and applications of meta-analysis

    Energy Technology Data Exchange (ETDEWEB)

    Koetse, M.J.

    2006-03-14

    Meta-analysis is gradually gaining ground in economics as a research method to objectively and quantitatively summarise a body of existing empirical evidence. This dissertation studies the performance of well-known meta-analytic models and presents two meta-analysis applications. Despite its many attractive features, meta-analysis faces several methodical difficulties, especially when applied in economic research. We investigate two specific methodical problems that any meta-analysis in economics will have to deal with, viz., systematic effect-size variation due to primary-study misspecifications, and random effect-size heterogeneity. Using Monte-Carlo analysis we investigate the effects of these methodical problems on the results of a meta-analysis, and study the small-sample properties of several well-known and often applied meta-estimators. The focus of the meta-analysis applications is on two topics that are relevant for understanding investment behaviour, viz., the impact of uncertainty on investment spending, and the potential for substitution of capital for energy in production processes. In the first application we aim to shed light on the direction of the relationship between investment and uncertainty, and to uncover which factors are empirically relevant for explaining the wide variety in study outcomes. In the second application our goal is to analyse the direction and magnitude of capital-energy substitution potential, and to analyse the empirical relevance of suggested sources of variation in elasticity estimates.

  12. Development of Ultraviolet Spectrophotometric Method for Analysis ...

    African Journals Online (AJOL)

    HP

    Method for Analysis of Lornoxicam in Solid Dosage Forms. Sunit Kumar Sahoo ... testing. Mean recovery was 100.82% for tablets. Low values of % RSD indicate .... Saharty E, Refaat YS, Khateeb ME. Stability-Indicating Spectrophotometric.

  13. An approximate methods approach to probabilistic structural analysis

    Science.gov (United States)

    Mcclung, R. C.; Millwater, H. R.; Wu, Y.-T.; Thacker, B. H.; Burnside, O. H.

    1989-01-01

    A probabilistic structural analysis method (PSAM) is described which makes an approximate calculation of the structural response of a system, including the associated probabilistic distributions, with minimal computation time and cost, based on a simplified representation of the geometry, loads, and material. The method employs the fast probability integration (FPI) algorithm of Wu and Wirsching. Typical solution strategies are illustrated by formulations for a representative critical component chosen from the Space Shuttle Main Engine (SSME) as part of a major NASA-sponsored program on PSAM. Typical results are presented to demonstrate the role of the methodology in engineering design and analysis.

  14. Numerical methods and analysis of multiscale problems

    CERN Document Server

    Madureira, Alexandre L

    2017-01-01

    This book is about numerical modeling of multiscale problems, and introduces several asymptotic analysis and numerical techniques which are necessary for a proper approximation of equations that depend on different physical scales. Aimed at advanced undergraduate and graduate students in mathematics, engineering and physics – or researchers seeking a no-nonsense approach –, it discusses examples in their simplest possible settings, removing mathematical hurdles that might hinder a clear understanding of the methods. The problems considered are given by singularly perturbed reaction-advection-diffusion equations in one- and two-dimensional domains, partial differential equations in domains with rough boundaries, and equations with oscillatory coefficients. This work shows how asymptotic analysis can be used to develop and analyze models and numerical methods that are robust and work well for a wide range of parameters.

  15. Social network analysis: Presenting an underused method for nursing research.

    Science.gov (United States)

    Parnell, James Michael; Robinson, Jennifer C

    2018-06-01

    This paper introduces social network analysis as a versatile method with many applications in nursing research. Social networks have been studied for years in many social science fields. The methods continue to advance but remain unknown to most nursing scholars. Discussion paper. English-language and translated literature was searched from Ovid Healthstar, CINAHL, PubMed Central, Scopus and hard-copy texts from 1965 - 2017. Social network analysis first emerged in the nursing literature in 1995 and appears minimally through the present day. To convey the versatility and applicability of social network analysis in nursing, hypothetical scenarios are presented. The scenarios are illustrative of three approaches to social network analysis and include key elements of social network research design. The methods of social network analysis are underused in nursing research, primarily because they are unknown to most scholars. However, there is methodological flexibility and epistemological versatility capable of supporting quantitative and qualitative research. The analytic techniques of social network analysis can add new insight into many areas of nursing inquiry, especially those influenced by cultural norms. Furthermore, visualization techniques associated with social network analysis can be used to generate new hypotheses. Social network analysis can potentially uncover findings not accessible through methods commonly used in nursing research. Social networks can be analysed based on individual-level attributes, whole networks and subgroups within networks; an illustration is sketched below. Computations derived from social network analysis may stand alone to answer a research question or be incorporated as variables into robust statistical models. © 2018 John Wiley & Sons Ltd.
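
    The three analytic levels named in this record (individual attributes, whole network, subgroups) can be sketched with networkx; the tiny collaboration network below is hypothetical.

      import networkx as nx
      from networkx.algorithms import community

      G = nx.Graph()
      G.add_edges_from([
          ("Ana", "Ben"), ("Ana", "Caro"), ("Ben", "Caro"),   # one work team
          ("Caro", "Dee"),                                    # bridge tie
          ("Dee", "Eli"), ("Eli", "Fay"), ("Dee", "Fay"),     # second team
      ])

      # Individual level: centrality as a node attribute.
      print(nx.degree_centrality(G))
      # Whole-network level: density of the complete network.
      print(nx.density(G))
      # Subgroup level: cohesive communities within the network.
      print(list(community.greedy_modularity_communities(G)))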

  16. A comparative study of three different gene expression analysis methods.

    Science.gov (United States)

    Choe, Jae Young; Han, Hyung Soo; Lee, Seon Duk; Lee, Hanna; Lee, Dong Eun; Ahn, Jae Yun; Ryoo, Hyun Wook; Seo, Kang Suk; Kim, Jong Kun

    2017-12-04

    TNF-α regulates immune cells and acts as an endogenous pyrogen. Reverse transcription polymerase chain reaction (RT-PCR) is one of the most commonly used methods for gene expression analysis. Among the alternatives to PCR, loop-mediated isothermal amplification (LAMP) shows good potential in terms of specificity and sensitivity. However, few studies have compared RT-PCR and LAMP for human gene expression analysis. Therefore, in the present study, we compared one-step RT-PCR, two-step RT-LAMP and one-step RT-LAMP for human gene expression analysis. We compared the three gene expression analysis methods using the human TNF-α gene as a biomarker from peripheral blood cells. Total RNA from three selected febrile patients was subjected to the three different methods of gene expression analysis. In the comparison, the detection limits of one-step RT-PCR and one-step RT-LAMP were the same, while that of two-step RT-LAMP was inferior. One-step RT-LAMP takes less time, and the experimental result is easy to determine. One-step RT-LAMP is a potentially useful and complementary tool that is fast and reasonably sensitive. In addition, one-step RT-LAMP could be useful in environments lacking specialized equipment or expertise.

  17. Creep analysis by the path function method

    International Nuclear Information System (INIS)

    Akin, J.E.; Pardue, R.M.

    1977-01-01

    The finite element method has become a common analysis procedure for the creep analysis of structures. The most recent programs are designed to handle a general class of material properties and are able to calculate elastic, plastic, and creep components of strain under general loading histories. The constant stress approach is too crude a model to accurately represent the actual behaviour of the stress for large time steps. The true path of a point in the effective stress-effective strain (σ^e - ε^c) plane is often one in which the slope is rapidly changing. Thus the stress level quickly moves away from the initial stress level and then gradually approaches the final one. The result is that the assumed constant stress level quickly becomes inaccurate. What is required is a better method of approximation of the true path in the σ^e - ε^c space. The method described here is called the path function approach because it employs an assumed function to estimate the motion of points in the σ^e - ε^c space. (Auth.)

  18. Global/local methods for probabilistic structural analysis

    Science.gov (United States)

    Millwater, H. R.; Wu, Y.-T.

    1993-04-01

    A probabilistic global/local method is proposed to reduce the computational requirements of probabilistic structural analysis. A coarser global model is used for most of the computations with a local more refined model used only at key probabilistic conditions. The global model is used to establish the cumulative distribution function (cdf) and the Most Probable Point (MPP). The local model then uses the predicted MPP to adjust the cdf value. The global/local method is used within the advanced mean value probabilistic algorithm. The local model can be more refined with respect to the global model in terms of finer mesh, smaller time step, tighter tolerances, etc. and can be used with linear or nonlinear models. The basis for this approach is described in terms of the correlation between the global and local models which can be estimated from the global and local MPPs. A numerical example is presented using the NESSUS probabilistic structural analysis program with the finite element method used for the structural modeling. The results clearly indicate a significant computer savings with minimal loss in accuracy.

  19. Earthquake Hazard Analysis Methods: A Review

    Science.gov (United States)

    Sari, A. M.; Fakhrurrozi, A.

    2018-02-01

    Earthquakes are among the natural disasters with the most significant impact in terms of risk and damage. Countries such as China, Japan, and Indonesia are located on active movements of continental plates, with more frequent earthquake occurrence compared to other countries. Several methods of earthquake hazard analysis have been applied, for example analyzing seismic zones and earthquake hazard micro-zonation, the Neo-Deterministic Seismic Hazard Analysis (N-DSHA) method, and remote sensing. In application, it is necessary to review the effectiveness of each technique in advance. Considering time efficiency and data accuracy, remote sensing is used as a reference to assess earthquake hazard accurately and quickly, as only limited time is available for the right decision-making shortly after a disaster. Exposed areas and possibly vulnerable areas due to earthquake hazards can be easily analyzed using remote sensing. Technological developments in remote sensing such as GeoEye-1 provide added value and excellence in the use of remote sensing as one of the methods in the assessment of earthquake risk and damage. Furthermore, the use of this technique is expected to be considered in designing policies for disaster management in particular, and can reduce the risk of natural disasters such as earthquakes in Indonesia.

  20. Development of rupture process analysis method for great earthquakes using Direct Solution Method

    Science.gov (United States)

    Yoshimoto, M.; Yamanaka, Y.; Takeuchi, N.

    2010-12-01

    Conventional rupture process analysis methods using teleseismic body waves were based on ray theory. Therefore, these methods have the following problems when applied to great earthquakes such as the 2004 Sumatra earthquake: (1) difficulty in computing all later phases such as the PP reflection phase, (2) impossibility of computing the so-called "W phase", the long-period phase arriving before the S wave, and (3) implausibility of the hypothesis that the distance from the observation points to the hypocenter is large compared to the fault length. To solve the above-mentioned problems, we have developed a new method which uses the synthetic seismograms computed by the Direct Solution Method (DSM, e.g. Kawai et al. 2006) as Green's functions. We used the DSM software (http://www.eri.u-tokyo.ac.jp/takeuchi/software/) for computing the Green's functions up to 1 Hz for the IASP91 (Kennett and Engdahl, 1991) model, and determined the final slip distributions using the waveform inversion method (Kikuchi et al. 2003). First we confirmed that the Green's functions computed by DSM were accurate at higher frequencies up to 1 Hz. Next we applied the new method to the rupture process analysis of the Mw 8.0 (GCMT) Solomon Islands earthquake of April 1, 2007. We found that this earthquake consisted of two asperities and that the rupture propagated across the subducting Simbo ridge. The obtained slip distribution correlates better with the aftershock distribution than that from the existing method. Furthermore, the new method retains the accuracy of the existing method with respect to the direct P wave and reflection phases near the source, and also accurately calculates later phases such as the PP wave.

  1. PIXE - a new method for elemental analysis

    International Nuclear Information System (INIS)

    Johansson, S.A.E.

    1983-01-01

    By elemental analysis we mean the determination of which chemical elements are present in a sample and of their concentrations. This is an old and important problem in chemistry. The earliest methods were purely chemical, and many such methods are still used. However, various methods based on physical principles have gradually become more and more important. One such method is neutron activation. When the sample is bombarded with neutrons it becomes radioactive, and the various radioactive isotopes produced can be identified by the radiation they emit. From the measured intensity of the radiation one can calculate how much of a certain element is present in the sample. Another possibility is to study the light emitted when the sample is excited in various ways. A spectroscopic investigation of the light can identify the chemical elements and also allows a determination of their concentrations in the sample. In the same way, if a sample can be brought to emit X-rays, this radiation is also characteristic of the elements present and can be used to determine the elemental concentrations. One such X-ray method which has been developed recently is PIXE. The name is an acronym for Particle Induced X-ray Emission and indicates the principle of the method. Particles in this context means heavy, charged particles such as protons and alpha-particles of rather high energy. Hence, in PIXE analysis the sample is irradiated in the beam of an accelerator and the emitted X-rays are studied. (author)

  2. An operational modal analysis method in frequency and spatial domain

    Science.gov (United States)

    Wang, Tong; Zhang, Lingmi; Tamura, Yukio

    2005-12-01

    A frequency and spatial domain decomposition method (FSDD) for operational modal analysis (OMA) is presented in this paper, which is an extension of the complex mode indicator function (CMIF) method for experimental modal analysis (EMA). The theoretical background of the FSDD method is clarified. Singular value decomposition is adopted to separate the signal space from the noise space. Finally, an enhanced power spectrum density (PSD) is proposed to obtain more accurate modal parameters by curve fitting in the frequency domain. Moreover, a simulation case and an application case are used to validate this method.
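
    The core indicator step behind CMIF/FSDD can be sketched as follows: the singular values of the output cross-PSD matrix at each frequency line peak at the structural modes. The two-channel signal below is synthetic (two modes plus noise); all values are illustrative assumptions, not data from the paper.

      import numpy as np
      from scipy.signal import csd

      fs, t = 256.0, np.arange(0, 60, 1/256.0)
      rng = np.random.default_rng(0)
      m1 = np.sin(2*np.pi*12.0*t)          # mode at 12 Hz
      m2 = np.sin(2*np.pi*31.0*t)          # mode at 31 Hz
      y = np.vstack([1.0*m1 + 0.4*m2,
                     0.6*m1 - 0.8*m2]) + 0.3*rng.standard_normal((2, t.size))

      # Build the 2x2 cross-PSD matrix G(f) and take its singular values.
      f, _ = csd(y[0], y[0], fs=fs, nperseg=1024)
      G = np.empty((f.size, 2, 2), dtype=complex)
      for i in range(2):
          for j in range(2):
              G[:, i, j] = csd(y[i], y[j], fs=fs, nperseg=1024)[1]
      sv = np.linalg.svd(G, compute_uv=False)  # shape (n_freq, 2)

      # The strongest first-singular-value lines sit at/near the modal frequencies.
      print("strongest lines near:", f[np.argsort(sv[:, 0])[-3:]])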

  3. COMPETITIVE INTELLIGENCE ANALYSIS - SCENARIOS METHOD

    Directory of Open Access Journals (Sweden)

    Ivan Valeriu

    2014-07-01

    Full Text Available Keeping a company in the top performing players in the relevant market depends not only on its ability to develop continually, sustainably and balanced, to the standards set by the customer and competition, but also on the ability to protect its strategic information and to know in advance the strategic information of the competition. In addition, given that economic markets, regardless of their profile, enable interconnection not only among domestic companies, but also between domestic companies and foreign companies, the issue of economic competition moves from the national economies to the field of interest of regional and international economic organizations. The stakes for each economic player is to keep ahead of the competition and to be always prepared to face market challenges. Therefore, it needs to know as early as possible, how to react to others’ strategy in terms of research, production and sales. If a competitor is planning to produce more and cheaper, then it must be prepared to counteract quickly this movement. Competitive intelligence helps to evaluate the capabilities of competitors in the market, legally and ethically, and to develop response strategies. One of the main goals of the competitive intelligence is to acknowledge the role of early warning and prevention of surprises that could have a major impact on the market share, reputation, turnover and profitability in the medium and long term of a company. This paper presents some aspects of competitive intelligence, mainly in terms of information analysis and intelligence generation. Presentation is theoretical and addresses a structured method of information analysis - scenarios method – in a version that combines several types of analysis in order to reveal some interconnecting aspects of the factors governing the activity of a company.

  4. Data Analysis Methods for Paleogenomics

    DEFF Research Database (Denmark)

    Avila Arcos, Maria del Carmen

    The work presented in this thesis is the result of research carried out during a three-year PhD at the Centre for GeoGenetics, Natural History Museum of Denmark, University of Copenhagen, under supervision of Professor Tom Gilbert. The PhD was funded by the Danish National Research Foundation (Danmarks Grundforskningfond) 'Centre of Excellence in GeoGenetics' grant, with additional funding provided by the Danish Council for Independent Research 'Sapere Aude' programme. The thesis comprises five chapters, all of which represent different projects that involved the analysis of massive amounts of data, thanks to the introduction of NGS and the implementation of data analysis methods specific for each project. Chapters 1 to 3 have been published in peer-reviewed journals and Chapter 4 is currently in review. Chapter 5 consists of a manuscript describing initial results of an ongoing research project.

  5. Sensitivity analysis methods and a biosphere test case implemented in EIKOS

    Energy Technology Data Exchange (ETDEWEB)

    Ekstroem, P.A.; Broed, R. [Facilia AB, Stockholm, (Sweden)

    2006-05-15

    Computer-based models can be used to approximate real life processes. These models are usually based on mathematical equations, which depend on several variables. The predictive capability of models is therefore limited by the uncertainty in the values of these variables. Sensitivity analysis is used to apportion the relative importance each uncertain input parameter has on the output variation. Sensitivity analysis is therefore an essential tool in simulation modelling and for performing risk assessments. Simple sensitivity analysis techniques based on fitting the output to a linear equation are often used, for example correlation or linear regression coefficients. These methods work well for linear models, but for non-linear models their sensitivity estimations are not accurate. Usually models of complex natural systems are non-linear. Within the scope of this work, various sensitivity analysis methods, which can cope with linear, non-linear, as well as non-monotone problems, have been implemented in a software package, EIKOS, written in the Matlab language. The following sensitivity analysis methods are supported by EIKOS: the Pearson product moment correlation coefficient (CC), the Spearman Rank Correlation Coefficient (RCC), Partial (Rank) Correlation Coefficients (PCC), Standardized (Rank) Regression Coefficients (SRC), the Sobol' method, Jansen's alternative, the Extended Fourier Amplitude Sensitivity Test (EFAST) as well as the classical FAST method and the Smirnov and the Cramer-von Mises tests. A graphical user interface has also been developed, from which the user can easily load or call the model and perform a sensitivity analysis as well as an uncertainty analysis. The implemented sensitivity analysis methods have been benchmarked with well-known test functions and compared with other sensitivity analysis software, with successful results. An illustration of the applicability of EIKOS is added to the report. The test case used is a landscape model consisting of several

  6. Sensitivity analysis methods and a biosphere test case implemented in EIKOS

    International Nuclear Information System (INIS)

    Ekstroem, P.A.; Broed, R.

    2006-05-01

    Computer-based models can be used to approximate real life processes. These models are usually based on mathematical equations, which depend on several variables. The predictive capability of models is therefore limited by the uncertainty in the values of these variables. Sensitivity analysis is used to apportion the relative importance each uncertain input parameter has on the output variation. Sensitivity analysis is therefore an essential tool in simulation modelling and for performing risk assessments. Simple sensitivity analysis techniques based on fitting the output to a linear equation are often used, for example correlation or linear regression coefficients. These methods work well for linear models, but for non-linear models their sensitivity estimations are not accurate. Usually models of complex natural systems are non-linear. Within the scope of this work, various sensitivity analysis methods, which can cope with linear, non-linear, as well as non-monotone problems, have been implemented in a software package, EIKOS, written in the Matlab language. The following sensitivity analysis methods are supported by EIKOS: the Pearson product moment correlation coefficient (CC), the Spearman Rank Correlation Coefficient (RCC), Partial (Rank) Correlation Coefficients (PCC), Standardized (Rank) Regression Coefficients (SRC), the Sobol' method, Jansen's alternative, the Extended Fourier Amplitude Sensitivity Test (EFAST) as well as the classical FAST method and the Smirnov and the Cramer-von Mises tests. A graphical user interface has also been developed, from which the user can easily load or call the model and perform a sensitivity analysis as well as an uncertainty analysis. The implemented sensitivity analysis methods have been benchmarked with well-known test functions and compared with other sensitivity analysis software, with successful results. An illustration of the applicability of EIKOS is added to the report. The test case used is a landscape model consisting of several linked
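
    Three of the screening measures listed in the two records above (Pearson CC, Spearman RCC, SRC) are sketched below for a toy non-linear model; the model and sample size are invented for illustration and are unrelated to EIKOS itself.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(7)
      n = 2000
      x1, x2, x3 = rng.uniform(0, 1, n), rng.uniform(0, 1, n), rng.uniform(0, 1, n)
      y = 4.0*x1 + x2**3 + 0.1*rng.standard_normal(n)   # x3 is inert

      for name, x in (("x1", x1), ("x2", x2), ("x3", x3)):
          cc = stats.pearsonr(x, y)[0]
          rcc = stats.spearmanr(x, y)[0]
          print(f"{name}: CC={cc:+.2f}  RCC={rcc:+.2f}")

      # SRC: regression coefficients of the standardized inputs on the
      # standardized output.
      X = np.column_stack([(v - v.mean())/v.std() for v in (x1, x2, x3)])
      src, *_ = np.linalg.lstsq(X, (y - y.mean())/y.std(), rcond=None)
      print("SRC:", np.round(src, 2))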

  7. Bearing Procurement Analysis Method by Total Cost of Ownership Analysis and Reliability Prediction

    Science.gov (United States)

    Trusaji, Wildan; Akbar, Muhammad; Sukoyo; Irianto, Dradjad

    2018-03-01

    In making a bearing procurement analysis, price and reliability must both be considered as decision criteria, since price determines the direct cost (acquisition cost) while the reliability of the bearing determines indirect costs such as maintenance cost. Although the indirect cost is hard to identify and measure, it contributes substantially to the overall cost that will be incurred. So, the indirect cost of reliability must be considered when making a bearing procurement analysis. This paper explains a bearing evaluation method that uses total cost of ownership analysis to consider price and maintenance cost as decision criteria. Furthermore, since failure data are lacking when the bearing evaluation phase is conducted, a reliability prediction method is used to predict bearing reliability from its dynamic load rating parameter. With this method, a bearing with a higher price but higher reliability is preferable for long-term planning, whereas for short-term planning the cheaper one with lower reliability is preferable. This contextuality can give rise to conflict between stakeholders. Thus, the planning horizon needs to be agreed by all stakeholders before making a procurement decision.
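
    The reliability-prediction step can be illustrated with the basic rating life formula L10 = (C/P)^3 x 10^6 revolutions for ball bearings; the prices, loads, and ratings below are hypothetical, and the cost-per-hour figure is only a crude stand-in for a full total-cost-of-ownership analysis.

      def l10_hours(C_kN, P_kN, rpm, exponent=3.0):
          """Basic rating life in operating hours (exponent 3 for ball bearings)."""
          revolutions = (C_kN / P_kN) ** exponent * 1e6
          return revolutions / (rpm * 60.0)

      # Hypothetical alternatives: B costs more but has a higher dynamic load rating C.
      for name, price, C in (("bearing A", 40.0, 25.0), ("bearing B", 65.0, 32.0)):
          hours = l10_hours(C_kN=C, P_kN=5.0, rpm=1500)
          print(f"{name}: L10 ~ {hours:,.0f} h, price/hour ~ {price/hours:.5f}")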

  8. Elastic and inelastic methods of piping systems analysis: a preliminary review

    International Nuclear Information System (INIS)

    Reich, M.; Esztergar, E.P.; Spence, J.; Boyle, J.; Chang, T.Y.

    1975-02-01

    A preliminary review of the methods used for elastic and inelastic piping system analysis is presented. The following principal conclusions are reached: techniques for the analysis of complex piping systems operating in the high temperature creep regime should be further developed; accurate analysis of a complete pipework system in creep using the ''complete shell finite element methods'' is not feasible at the present, and the ''reduced shell finite element method'' still requires excessive computer time and also requires further investigation regarding the compatibility problems associated with the pipe bend element, particularly when applied to cases involving general loading conditions; and with the current size of proposed high temperature systems requiring the evaluation of long-term operating life (30 to 40 years), it is important to adopt a simplified analysis method. A design procedure for a simplified analysis method based on currently available techniques applied in a three-stage approach is outlined. The work required for implementation of these procedures together with desirable future developments are also briefly discussed. Other proposed simplified approximations also are reviewed in the text. 101 references. (U.S.)

  9. A New Boron Analysis Method

    Energy Technology Data Exchange (ETDEWEB)

    Weitman, J; Daaverhoeg, N; Farvolden, S

    1970-07-01

    In connection with fast neutron (n, α) cross section measurements a novel boron analysis method has been developed. The boron concentration is inferred from the mass spectrometrically determined number of helium atoms produced in the thermal and epithermal B-10(n, α) reaction. The relation between helium amount and boron concentration is given, including corrections for self-shielding effects and background levels. Direct and diffusion losses of helium are calculated, and losses due to gettering, adsorption and HF ionization in the release stage are discussed. A series of boron determinations is described and the results are compared with those obtained by other methods, showing excellent agreement. The lower limit of boron concentration which can be measured varies with the type of sample. In e.g. steel, concentrations below 10^-5 % boron in samples of 0.1-1 gram may be determined.

  10. Method for environmental risk analysis (MIRA) revision 2007

    International Nuclear Information System (INIS)

    2007-04-01

    OLF's instruction manual for carrying out environmental risk analyses provides a unified approach and a common framework for environmental risk assessments, based on the best information available. The manual implies standardization of a series of parameters, input data and partial analyses that are included in the environmental risk analysis. Environmental risk analyses carried out according to the MIRA method will thus be comparable between fields and between companies. In this revision, an update of the text in accordance with today's practice for environmental risk analyses and prevailing regulations is emphasized. Moreover, method adjustments for especially protected beach habitats have been introduced, as well as a general method for estimating environmental risk to fish. Emphasis has also been put on improving the ability of environmental risk analysis to contribute to better management of environmental risk in the companies. (ml)

  11. Applicability of finite element method to collapse analysis of steel connection under compression

    International Nuclear Information System (INIS)

    Zhou, Zhiguang; Nishida, Akemi; Kuwamura, Hitoshi

    2010-01-01

    It is often necessary to study the collapse behavior of steel connections. In this study, the limit load of a steel pyramid-to-tube socket connection subjected to uniform compression was investigated by means of FEM and experiment. The steel connection was modeled using 4-node shell elements. Three kinds of analysis were conducted: linear buckling, nonlinear buckling and modified Riks method analysis. For the linear buckling analysis, a linear eigenvalue analysis was done. For the nonlinear buckling analysis, an eigenvalue analysis was performed for the buckling load in a nonlinear manner based on the incremental stiffness matrices, with nonlinear material properties and large displacement considered. For the modified Riks method analysis, the compressive load was applied using the modified Riks method, again considering nonlinear material properties and large displacement. The results of the FEM analyses were compared with the experimental results. This shows that nonlinear buckling and modified Riks method analyses are more accurate than linear buckling analysis because they employ nonlinear, large-deflection analysis to estimate buckling loads. Moreover, the calculated limit loads from the nonlinear buckling and modified Riks method analyses are close. It can be concluded that modified Riks method analysis is more effective for collapse analysis of steel connections under compression. Finally, modified Riks method analysis is used to perform parametric studies of the thickness of the pyramid. (author)

  12. Application of numerical analysis methods to thermoluminescence dosimetry

    International Nuclear Information System (INIS)

    Gomez Ros, J. M.; Delgado, A.

    1989-01-01

    This report presents the application of numerical methods to thermoluminescence dosimetry (TLD), showing the advantages obtained over conventional evaluation systems. Different configurations of the analysis method are presented to operate in specific dosimetric applications of TLD, such as environmental monitoring and mailed dosimetry systems for quality assurance in radiotherapy facilities. (Author) 10 refs

  13. An approach for quantification of platinum distribution in tissues by LA-ICP-MS imaging using isotope dilution analysis.

    Science.gov (United States)

    Moraleja, I; Mena, M L; Lázaro, A; Neumann, B; Tejedor, A; Jakubowski, N; Gómez-Gómez, M M; Esteban-Fernández, D

    2018-02-01

    Laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) has been revealed as a convenient technique for trace elemental imaging in tissue sections, providing elemental 2D distributions at a quantitative level. For quantification purposes, several approaches have been proposed in the literature in recent years, such as the use of CRMs or matrix-matched standards. The use of isotope dilution (ID) for quantification by LA-ICP-MS has also been described, but so far it has mainly been useful for bulk analysis and not feasible for spatial measurements. In this work, a quantification method based on ID analysis was developed by printing isotope-enriched inks onto kidney slices from rats treated with antitumoral Pt-based drugs using a commercial ink-jet device, in order to perform an elemental quantification in different areas of the bio-images. For the ID experiments, 194Pt-enriched platinum was used. The methodology was validated by deposition of natural Pt standard droplets with a known amount of Pt onto the surface of a control tissue, where even 50 pg of Pt could be quantified, with recoveries higher than 90%. The amount of Pt present in the whole kidney slices was quantified for cisplatin-, carboplatin- and oxaliplatin-treated rats. The results obtained were in accordance with those previously reported. The amount of Pt distributed between the medullar and cortical areas was also quantified, revealing different behavior for the three drugs. Copyright © 2017 Elsevier B.V. All rights reserved.
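
    A hedged sketch of the isotope dilution arithmetic behind such a printed-ink calibration is given below: moles of natural-Pt analyte are obtained from the measured 194Pt/195Pt ratio of the sample/spike blend. The spike abundances and all inputs are illustrative assumptions, not the authors' values; natural abundances are tabulated values.

      # Natural Pt isotopic abundances (194Pt, 195Pt) and an assumed enriched spike.
      a194_nat, a195_nat = 0.3286, 0.3378
      a194_sp,  a195_sp  = 0.957,  0.030

      def sample_moles(n_spike, R_blend):
          """Moles of natural-Pt analyte from spike moles and the blend 194/195 ratio."""
          return n_spike * (a194_sp - R_blend * a195_sp) / (R_blend * a195_nat - a194_nat)

      # Example: 1e-12 mol of spike printed on the spot, blend ratio measured at 2.0.
      n = sample_moles(n_spike=1e-12, R_blend=2.0)
      print(f"analyte ~ {n:.2e} mol  (~{n * 195.08 * 1e12:.0f} pg Pt)")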

  14. Simplified analysis method for vibration of fusion reactor components with magnetic damping

    International Nuclear Information System (INIS)

    Tanaka, Yoshikazu; Horie, Tomoyoshi; Niho, Tomoya

    2000-01-01

    This paper describes two simplified analysis methods for magnetically damped vibration. One is a method that modifies the result of a finite element uncoupled analysis using a coupling intensity parameter, and the other is a method that uses the solution and coupled eigenvalues of a single-degree-of-freedom coupled model. To verify these methods, numerical analyses of a plate and a thin cylinder were performed. The comparison between the results of the former method and the finite element tightly coupled analysis shows almost satisfactory agreement. The results of the latter method agree very well with the finite element tightly coupled results because of the coupled eigenvalues. Since vibration with magnetic damping can be evaluated using these methods without finite element coupled analysis, these approximate methods will be practical and useful for a wide range of design analyses taking account of the magnetic damping effect.

  15. Features of an advanced human reliability analysis method, AGAPE-ET

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Whan; Jung, Won Dea; Park, Jin Kyun [Korea Atomic Energy Research Institute, Taejeon (Korea, Republic of)

    2005-11-15

    This paper presents the main features of an advanced human reliability analysis (HRA) method, AGAPE-ET. It has the capabilities to deal with the diagnosis failures and the errors of commission (EOC), which have not been normally treated in the conventional HRAs. For the analysis of the potential for diagnosis failures, an analysis framework, which is called the misdiagnosis tree analysis (MDTA), and a taxonomy of the misdiagnosis causes with appropriate quantification schemes are provided. For the identification of the EOC events from the misdiagnosis, some procedural guidance is given. An example of the application of the method is also provided.

  16. Features of an advanced human reliability analysis method, AGAPE-ET

    International Nuclear Information System (INIS)

    Kim, Jae Whan; Jung, Won Dea; Park, Jin Kyun

    2005-01-01

    This paper presents the main features of an advanced human reliability analysis (HRA) method, AGAPE-ET. It has the capabilities to deal with the diagnosis failures and the errors of commission (EOC), which have not been normally treated in the conventional HRAs. For the analysis of the potential for diagnosis failures, an analysis framework, which is called the misdiagnosis tree analysis (MDTA), and a taxonomy of the misdiagnosis causes with appropriate quantification schemes are provided. For the identification of the EOC events from the misdiagnosis, some procedural guidance is given. An example of the application of the method is also provided

  17. Nonlinear structural analysis using integrated force method

    Indian Academy of Sciences (India)

    A new formulation termed the Integrated Force Method (IFM) was proposed by Patnaik ... nated ``Structure (n, m)'' where (n, m) are the force and displacement degrees of ..... Patnaik S N, Yadagiri S 1976 Frequency analysis of structures.

  18. Some selected quantitative methods of thermal image analysis in Matlab.

    Science.gov (United States)

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images, and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for the area of the skin of a human foot and of a face. The full source code of the developed application is also provided as an attachment. (Figure: the main window of the program during dynamic analysis of the foot thermal image.) © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Gene set analysis: limitations in popular existing methods and proposed improvements.

    Science.gov (United States)

    Mishra, Pashupati; Törönen, Petri; Leino, Yrjö; Holm, Liisa

    2014-10-01

    Gene set analysis is the analysis of a set of genes that collectively contribute to a biological process. Most popular gene set analysis methods are based on empirical P-values, which require a large number of permutations. Despite the numerous gene set analysis methods developed in the past decade, the most popular methods still suffer from serious limitations. We present a gene set analysis method (mGSZ) based on the Gene Set Z-scoring function (GSZ) and asymptotic P-values. Asymptotic P-value calculation requires fewer permutations and thus speeds up the gene set analysis process. We compare the GSZ scoring function with seven popular gene set scoring functions and show that GSZ stands out as the best scoring function. In addition, we show improved performance of the GSA method when the max-mean statistic is replaced by the GSZ scoring function. We demonstrate the importance of both gene and sample permutations by showing the consequences of the absence of one or the other. A comparison of asymptotic and empirical methods of P-value estimation demonstrates a clear advantage of asymptotic P-values over empirical ones. We show that mGSZ outperforms the state-of-the-art methods based on two different evaluations. We compared mGSZ results with permutation and rotation tests and show that rotation does not improve our asymptotic P-values. We also propose well-known asymptotic distribution models for three of the compared methods. mGSZ is available as an R package from cran.r-project.org. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
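
    The asymptotic-versus-empirical P-value trade-off discussed in this record can be illustrated with a plain mean-based set statistic (deliberately simpler than the actual GSZ function); the data below are simulated.

      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(3)
      scores = rng.standard_normal(5000)        # per-gene differential scores
      set_idx = rng.choice(5000, 50, replace=False)
      scores[set_idx] += 0.5                    # the gene set is mildly enriched
      stat = scores[set_idx].mean()

      # Empirical P: re-draw random gene sets of the same size many times.
      null = np.array([scores[rng.choice(5000, 50, replace=False)].mean()
                       for _ in range(10000)])
      p_emp = (1 + np.sum(null >= stat)) / (1 + null.size)

      # Asymptotic P: normal approximation with the (slightly biased) null mean/variance.
      z = (stat - scores.mean()) / (scores.std() / np.sqrt(50))
      p_asy = 1 - norm.cdf(z)
      print(f"empirical P = {p_emp:.4g}, asymptotic P = {p_asy:.4g}")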

  20. Assessment of modern methods of human factor reliability analysis in PSA studies

    International Nuclear Information System (INIS)

    Holy, J.

    2001-12-01

    The report is structured as follows: Classical terms and objects (Probabilistic safety assessment as a framework for human reliability assessment; Human failure within the PSA model; Basic types of operator failure modelled in a PSA study and analyzed by HRA methods; Qualitative analysis of human reliability; Quantitative analysis of human reliability used; Process of analysis of nuclear reactor operator reliability in a PSA study); New terms and objects (Analysis of dependences; Errors of omission; Errors of commission; Error forcing context); and Overview and brief assessment of human reliability analysis (Basic characteristics of the methods; Assets and drawbacks of the use of each of HRA method; History and prospects of the use of the methods). (P.A.)

  1. A Review of Classical Methods of Item Analysis.

    Science.gov (United States)

    French, Christine L.

    Item analysis is a very important consideration in the test development process. It is a statistical procedure for analyzing test items that combines methods used to evaluate their important characteristics, such as difficulty, discrimination, and the distractibility of the items in a test. This paper reviews some of the classical methods for…
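
    Two of the classical item statistics named above can be computed in a few lines: difficulty as the proportion correct, and discrimination as the corrected item-total (point-biserial) correlation. The 0/1 response matrix below is made up for illustration.

      import numpy as np

      # Rows: examinees; columns: items (1 = correct, 0 = incorrect).
      resp = np.array([
          [1, 1, 1, 0],
          [1, 1, 0, 0],
          [1, 0, 1, 1],
          [1, 1, 1, 1],
          [0, 0, 0, 0],
          [1, 0, 0, 0],
      ])

      difficulty = resp.mean(axis=0)   # p-value (proportion correct) per item
      total = resp.sum(axis=1)
      disc = []
      for j in range(resp.shape[1]):
          rest = total - resp[:, j]    # exclude the item from its own criterion
          disc.append(np.corrcoef(resp[:, j], rest)[0, 1])

      print("difficulty:", np.round(difficulty, 2))
      print("discrimination:", np.round(disc, 2))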

  2. Analysis of nitrate absorption and transport in non-nodulated and nodulated soybean plants with 13NO3- and 15NO3-

    International Nuclear Information System (INIS)

    Sato, Takashi; Ohtake, Norikuni; Ohyama, Takuji

    1999-01-01

    Nodulating (T202) and non-nodulating (T201) soybean isolines were hydroponically cultivated, and nitrate labeled with 13N or 15N was added to the culture solution in order to investigate nitrate absorption and transport in soybean. The accumulation pattern of the absorbed 13N in the first trifoliate was observed by a positron emitting tracer imaging system (PETIS) as well as a bioimaging analyzer system (BAS). The 15N abundance of each part was determined by emission spectrometry. Real-time changes in the two-dimensional image of the radioactivity could be monitored by PETIS, and in addition the distribution of 13N in the whole plant could be observed by BAS. However, quantitative data were hardly obtainable from the 13N analysis. The stable isotope 15N is more reliable for quantitative analysis of each part. By combining the data obtained in the 15N and 13N tracer experiments, the absorption and translocation of N in the plant can be figured out more clearly. (author)

  3. Development of an analysis rule of diagnosis error for standard method of human reliability analysis

    International Nuclear Information System (INIS)

    Jeong, W. D.; Kang, D. I.; Jeong, K. S.

    2003-01-01

    This paper presents the status of development of the Korean standard method for Human Reliability Analysis (HRA), and proposes a standard procedure and rules for the evaluation of diagnosis error probability. The quality of the KSNP HRA was evaluated using the requirements of the ASME PRA standard guideline, and the design requirements for the standard HRA method were defined. The analysis procedure and rules developed so far for analyzing diagnosis error probability are suggested as a part of the standard method. A comprehensive application study was also performed to evaluate the suitability of the proposed rules.

  4. New experimental and analysis methods in I-DLTS

    International Nuclear Information System (INIS)

    Pandey, S.U.; Middelkamp, P.; Li, Z.; Eremin, V.

    1998-02-01

    A new experimental apparatus to perform I-DLTS measurements is presented. The method is shown to be faster and more sensitive than traditional double boxcar I-DLTS systems. A novel analysis technique utilizing multiple exponential fits to the I-DLTS signal from a highly neutron-irradiated silicon sample is presented with a discussion of the results. It is shown that the new method has better resolution and can deconvolute overlapping peaks more accurately than previous methods.
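
    A hedged sketch of the multiple-exponential-fit idea on synthetic data: scipy's curve_fit recovers the amplitudes and time constants of two overlapping decays. The component values below are assumptions for illustration, not measured I-DLTS data.

        import numpy as np
        from scipy.optimize import curve_fit

        def two_exp(t, a1, tau1, a2, tau2):
            # One decay term per trap level contributing to the transient.
            return a1 * np.exp(-t / tau1) + a2 * np.exp(-t / tau2)

        t = np.linspace(0.0, 5.0, 500)
        signal = two_exp(t, 1.0, 0.3, 0.5, 1.5)                       # synthetic transient
        signal += np.random.default_rng(1).normal(0.0, 0.01, t.size)  # measurement noise

        popt, _ = curve_fit(two_exp, t, signal, p0=[1.0, 0.1, 1.0, 1.0])
        print(popt)  # recovered amplitudes and time constants of the overlapping decays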

  5. Methods of I-129 analysis for environmental monitoring

    International Nuclear Information System (INIS)

    Nomura, T.; Katagiri, H.; Kitahara, Y.; Fukuda, S.

    1980-01-01

    Among the radioiodine isotopes discharged from nuclear facilities, I-129 has the longest half-life (1.7 x 10^7 years) and thus environmental monitoring of this nuclide is important. Methods of analysis of low-level I-129 in environmental samples such as milk, crops, seaweeds and soils are described. The iodine is separated from the dried or pulverized samples by ignition at 1000 °C in a quartz combustion apparatus with a stream of oxygen. Stable I-127 is simultaneously determined and the atom ratio of 129I/127I calculated in order to evaluate the thyroid dose by the specific activity method. Results of an analysis of typical food samples collected near the fuel reprocessing plant of Tokai Works showed no I-129 concentrations higher than the detection limit of the method (10^-2 pCi for a 10 g dry sample). (UK)
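
    The specific activity bookkeeping mentioned above can be illustrated with a short calculation; the atom ratio and iodine mass below are invented inputs, and only the physical constants (the half-life from the record, Avogadro's number, 1 pCi = 0.037 Bq) are taken as given.

        import math

        N_A = 6.022e23                              # atoms/mol
        T_HALF_129I = 1.7e7 * 365.25 * 24 * 3600    # half-life of I-129, in seconds
        M_I = 127.0                                 # g/mol, stable iodine

        def i129_activity_pci(atom_ratio, stable_iodine_g):
            # Activity of I-129 (pCi) from the measured 129I/127I atom ratio.
            n_127 = stable_iodine_g / M_I * N_A     # atoms of stable I-127
            n_129 = atom_ratio * n_127              # atoms of I-129
            bq = n_129 * math.log(2) / T_HALF_129I  # A = lambda * N, in Bq
            return bq / 0.037                       # 1 pCi = 0.037 Bq

        print(i129_activity_pci(atom_ratio=1e-8, stable_iodine_g=0.01))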

  6. Time series analysis methods and applications for flight data

    CERN Document Server

    Zhang, Jianye

    2017-01-01

    This book focuses on different facets of flight data analysis, including the basic goals, methods, and implementation techniques. As mass flight data possess the typical characteristics of time series, time series analysis methods and their application to flight data are illustrated from several aspects, such as data filtering, data extension, feature optimization, similarity search, trend monitoring, fault diagnosis, and parameter prediction. An intelligent information-processing platform for flight data has been established to assist in aircraft condition monitoring, training evaluation and scientific maintenance. The book will serve as a reference resource for people working in aviation management and maintenance, as well as for researchers and engineers in the fields of data analysis and data mining.

  7. Development of spectral history methods for pin-by-pin core analysis method using three-dimensional direct response matrix

    International Nuclear Information System (INIS)

    Mitsuyasu, T.; Ishii, K.; Hino, T.; Aoyama, M.

    2009-01-01

    Spectral history methods for a pin-by-pin core analysis method using the three-dimensional direct response matrix have been developed. The direct response matrix is formalized by four sub-response matrices in order to respond to the core eigenvalue k and thus can be recomposed at each outer iteration in the core analysis. For core analysis, it is necessary to take into account the burn-up effect related to spectral history. One of the methods is to evaluate the nodal burn-up spectrum obtained using the out-going neutron current. The other is to correct the fuel rod neutron production rates obtained by the pin-by-pin correction. These spectral history methods were tested in a heterogeneous system. The test results show that the neutron multiplication factor error can be reduced by half during burn-up, and the nodal neutron production rate errors can be reduced by 30% or more. The root-mean-square differences between the relative fuel rod neutron production rate distributions can be reduced to within 1.1%. This means that these methods can accurately reflect the effects of intra- and inter-assembly heterogeneities during burn-up and can be used for core analysis. Core analysis with the DRM method was carried out for an ABWR quarter core and it was found that both thermal power and coolant-flow distributions converged smoothly. (authors)

  8. Simple gas chromatographic method for furfural analysis.

    Science.gov (United States)

    Gaspar, Elvira M S M; Lopes, João F

    2009-04-03

    A new, simple gas chromatographic method was developed for the direct analysis of 5-hydroxymethylfurfural (5-HMF), 2-furfural (2-F) and 5-methylfurfural (5-MF) in liquid and water-soluble foods, using direct immersion SPME coupled to GC-FID and/or GC-TOF-MS. The fiber (DVB/CAR/PDMS) conditions were optimized: pH effect, temperature, adsorption and desorption times. The method is simple and accurate (RSD …); the analysis of furfurals will contribute to characterising and quantifying their presence in the human diet.

  9. Piping dynamic analysis by the synthesis method

    International Nuclear Information System (INIS)

    Bezler, P.; Curreri, J.R.

    1976-01-01

    Since piping systems are a frequent source of noise and vibration, their efficient dynamic analysis is imperative. As an alternative to more conventional analysis methods, an application of the synthesis method to piping vibration analysis is demonstrated. Specifically, the technique is illustrated by determining the normal modes and natural frequencies of a composite bend from the normal mode and natural frequency data of its two component parts. A comparison is made of the results to those derived for the composite bend by other techniques.

  10. Advances in methods and applications of reliability and safety analysis

    International Nuclear Information System (INIS)

    Fieandt, J.; Hossi, H.; Laakso, K.; Lyytikaeinen, A.; Niemelae, I.; Pulkkinen, U.; Pulli, T.

    1986-01-01

    The know-how in reliability and safety design and analysis techniques at VTT has been established over several years of analyzing reliability in the Finnish nuclear power plants Loviisa and Olkiluoto. This experience has later been applied and developed for use in the process industry, conventional power industry, automation and electronics. VTT develops and transfers methods and tools for reliability and safety analysis to the private and public sectors. The technology transfer takes place in joint development projects with potential users. Several computer-aided methods, such as RELVEC for reliability modelling and analysis, have been developed. The tools developed are today used by major Finnish companies in the fields of automation, nuclear power, shipbuilding and electronics. Development of computer-aided and other methods needed in the analysis of operating experience, reliability or safety is ongoing in a number of research and development projects.

  11. Principal component analysis of dynamic fluorescence images for diagnosis of diabetic vasculopathy

    Science.gov (United States)

    Seo, Jihye; An, Yuri; Lee, Jungsul; Ku, Taeyun; Kang, Yujung; Ahn, Chulwoo; Choi, Chulhee

    2016-04-01

    Indocyanine green (ICG) fluorescence imaging has been clinically used for noninvasive visualizations of vascular structures. We have previously developed a diagnostic system based on dynamic ICG fluorescence imaging for sensitive detection of vascular disorders. However, because high-dimensional raw data were used, the analysis of the ICG dynamics proved difficult. We used principal component analysis (PCA) in this study to extract important elements without significant loss of information. We examined ICG spatiotemporal profiles and identified critical features related to vascular disorders. PCA time courses of the first three components showed a distinct pattern in diabetic patients. Among the major components, the second principal component (PC2) represented arterial-like features. The explained variance of PC2 in diabetic patients was significantly lower than in normal controls. To visualize the spatial pattern of PCs, pixels were mapped with red, green, and blue channels. The PC2 score showed an inverse pattern between normal controls and diabetic patients. We propose that PC2 can be used as a representative bioimaging marker for the screening of vascular diseases. It may also be useful in simple extractions of arterial-like features.
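
    A rough sketch of the analysis pipeline this record implies, using scikit-learn's PCA on a synthetic stand-in for the ICG image stack; only the reshape-to-(pixels x frames) layout and the RGB mapping of the first three component scores are taken from the record.

        import numpy as np
        from sklearn.decomposition import PCA

        frames = np.random.default_rng(2).random((64, 64, 120))  # stand-in ICG stack
        X = frames.reshape(-1, frames.shape[-1])                 # pixels x time points

        pca = PCA(n_components=3)
        scores = pca.fit_transform(X)         # per-pixel scores for PC1..PC3
        print(pca.explained_variance_ratio_)  # e.g. compare the PC2 share across subjects

        # Map the first three PC scores onto red, green and blue channels.
        lo, hi = scores.min(axis=0), scores.max(axis=0)
        rgb_image = ((scores - lo) / (hi - lo)).reshape(64, 64, 3)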

  12. New Isotope Analysis Method: Atom Trap Mass Spectrometry

    International Nuclear Information System (INIS)

    Ko, Kwang Hoon; Park, Hyun Min; Han, Jae Min; Kim, Taek Soo; Cha, Yong Ho; Lim, Gwon; Jeong, Do Young

    2011-01-01

    Trace isotope analysis has played an important role in science, archaeological dating, geology, biology and the nuclear industry. Some fission products such as Sr-90, Cs-135 and Kr-85 can be released to the environment when a nuclear accident occurs or a reprocessing plant operates. Thus, the analysis of artificially produced radioactive isotopes has been of interest in the nuclear industry, but it is difficult to detect them due to natural abundances of less than 10^-10. In general, radiochemical methods have been applied to detect ultra-trace radioisotopes, but these have the disadvantages of long measurement times for long-lived radioisotopes and toxic chemical processes for purification. The accelerator mass spectrometer has high isotope selectivity, but the system is huge and its selectivity is affected by isobars. Laser-based methods such as RIMS (Resonance Ionization Mass Spectrometry) have the advantage of being free of isobar effects, but the system size is still huge for a highly isotope-selective system. Recently, ATTA (Atom Trap Trace Analysis) has been successfully applied to detect the ultra-trace isotopes Kr-81 and Kr-85. ATTA is an isobar-effect-free detection method with high isotope selectivity, and the system size is small. However, it requires a steady atomic beam source during detection and does not allow simultaneous detection of several isotopes. In this presentation, we introduce a new isotope detection method, Atom Trap Mass Spectrometry (ATMS), which couples an atom trap with a mass spectrometer. We expect that it can overcome the disadvantages of ATTA while retaining the advantages of both ATTA and mass spectrometry. The basic concept and the system design will be presented. In addition, the experimental status of ATMS will also be presented.

  13. A rapid chemical method for lysing Arabidopsis cells for protein analysis

    Directory of Open Access Journals (Sweden)

    Takano Tetsuo

    2011-07-01

    Full Text Available Abstract Background Protein extraction is a frequent procedure in biological research. For preparation of plant cell extracts, plant materials usually have to be ground and homogenized to physically break the robust cell wall, but this step is laborious and time-consuming when a large number of samples are handled at once. Results We developed a chemical method for lysing Arabidopsis cells without grinding. In this method, plants are boiled for just 10 minutes in a solution containing a Ca2+ chelator and detergent. Cell extracts prepared by this method were suitable for SDS-PAGE and immunoblot analysis. This method was also applicable to genomic DNA extraction for PCR analysis. Our method was applied to many other plant species, and worked well for some of them. Conclusions Our method is rapid and economical, and allows many samples to be prepared simultaneously for protein analysis. Our method is useful not only for Arabidopsis research but also research on certain other species.

  14. Evaluation of the Methods for Response Analysis under Non-Stationary Excitation

    Directory of Open Access Journals (Sweden)

    R.S. Jangid

    1999-01-01

    Full Text Available Response of structures to non-stationary ground motion can be obtained either by evolutionary spectral analysis or by the Markov approach. In certain conditions, a quasi-stationary analysis can also be performed. The first two methods of analysis are difficult to apply in complex situations such as problems involving soil-structure interaction, non-classical damping and primary-secondary structure interaction. The quasi-stationary analysis, on the other hand, provides an easier solution procedure for such cases. Herein, the effectiveness of the quasi-stationary analysis is examined with the help of the analysis of a single degree-of-freedom (SDOF) system under a set of parametric variations. For this purpose, responses of the SDOF system to uniformly modulated non-stationary random ground excitation are obtained by the three methods and compared. In addition, the relative computational efforts for the different methods are also investigated.

  15. The stress analysis method for three-dimensional composite materials

    Science.gov (United States)

    Nagai, Kanehiro; Yokoyama, Atsushi; Maekawa, Zen'ichiro; Hamada, Hiroyuki

    1994-05-01

    This study proposes a stress analysis method for three-dimensionally fiber-reinforced composite materials. In this method, the rule of mixtures for composites is successfully applied to 3-D space in which material properties vary three-dimensionally. The fundamental formulas for Young's modulus, shear modulus, and Poisson's ratio are derived. We also discuss a strength estimation and an optimum material design technique for 3-D composite materials. The analysis is executed for a triaxial orthogonally woven fabric, and the results are compared to experimental data in order to verify the accuracy of this method. The present methodology can be easily understood with basic material mechanics and elementary mathematics, so a computer program of this theory can be written without difficulty. Furthermore, this method can be applied to various types of 3-D composites because of its general-purpose characteristics.
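
    The rule of mixtures the method builds on can be shown for a single direction. The sketch below uses the textbook Voigt and Reuss forms as an assumption; the paper's full 3-D formulation is not reproduced here.

        def longitudinal_modulus(e_fiber, e_matrix, v_fiber):
            # Voigt (iso-strain) estimate along the fiber direction.
            return v_fiber * e_fiber + (1.0 - v_fiber) * e_matrix

        def transverse_modulus(e_fiber, e_matrix, v_fiber):
            # Reuss (iso-stress) estimate transverse to the fibers.
            return 1.0 / (v_fiber / e_fiber + (1.0 - v_fiber) / e_matrix)

        # Example: carbon fiber (230 GPa) in epoxy (3.5 GPa) at 60% fiber volume.
        print(longitudinal_modulus(230.0, 3.5, 0.6))  # ~139.4 GPa
        print(transverse_modulus(230.0, 3.5, 0.6))    # ~8.6 GPa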

  16. A comparison of uncertainty analysis methods using a groundwater flow model

    International Nuclear Information System (INIS)

    Doctor, P.G.; Jacobson, E.A.; Buchanan, J.A.

    1988-06-01

    This report evaluates three uncertainty analysis methods that are proposed for use in performance assessment activities within the OCRWM and Nuclear Regulatory Commission (NRC) communities. The three methods are Monte Carlo simulation with unconstrained sampling, Monte Carlo simulation with Latin Hypercube sampling, and first-order analysis. Monte Carlo simulation with unconstrained sampling is a generally accepted uncertainty analysis method, but it has the disadvantage of being costly and time consuming. Latin Hypercube sampling was proposed to make Monte Carlo simulation more efficient; although it was originally formulated for independent variables, which is a major drawback in performance assessment modeling, Latin Hypercube sampling can be used to generate correlated samples. The first-order method is efficient to implement because it is based on the first-order Taylor series expansion; however, there is concern that it does not adequately describe the variability for complex models. These three uncertainty analysis methods were evaluated using a calibrated groundwater flow model of an unconfined aquifer in southern Arizona. The two simulation methods produced similar results, although the Latin Hypercube method tends to produce samples whose estimates of statistical parameters are closer to the desired parameters. The mean travel times for the first-order method do not agree with those of the simulations. In addition, the first-order method produces estimates of variance in travel times that are more variable than those produced by the simulation methods, resulting in nonconservative tolerance intervals. 13 refs., 33 figs
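
    A small sketch contrasting the two sampling schemes, using scipy's quasi-Monte Carlo module (assumed available, scipy >= 1.7) and a cheap analytic stand-in for the groundwater model:

        import numpy as np
        from scipy.stats import qmc

        def travel_time(params):
            # Toy stand-in for the calibrated groundwater flow model.
            conductivity, porosity = params[:, 0], params[:, 1]
            return porosity * 100.0 / conductivity

        n, rng = 100, np.random.default_rng(3)

        # Monte Carlo with unconstrained sampling on [1, 10] x [0.1, 0.4].
        mc = np.column_stack([rng.uniform(1, 10, n), rng.uniform(0.1, 0.4, n)])

        # Latin Hypercube: one sample per equiprobable stratum of each input.
        lhs = qmc.scale(qmc.LatinHypercube(d=2, seed=3).random(n),
                        l_bounds=[1.0, 0.1], u_bounds=[10.0, 0.4])

        print(travel_time(mc).mean(), travel_time(lhs).mean())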

  17. Data Analysis Methods for Library Marketing

    Science.gov (United States)

    Minami, Toshiro; Kim, Eunja

    Our society is rapidly changing into an information society, where people's needs and requests for information access differ widely from person to person. A library's mission is to provide its users, or patrons, with the most appropriate information. Libraries have to know the profiles of their patrons in order to fulfil such a role. The aim of library marketing is to develop methods based on library data, such as circulation records, book catalogs, book-usage data, and others. In this paper we first discuss the methodology and importance of library marketing. Then we demonstrate its usefulness through some examples of analysis methods applied to the circulation records of Kyushu University and Guacheon Library, and some implications obtained from the results of these methods. Our research is a first step towards a future in which library marketing is an indispensable tool.

  18. Phosphorus analysis in milk samples by neutron activation analysis method

    International Nuclear Information System (INIS)

    Oliveira, R.M. de; Cunha, I.I.L.

    1991-01-01

    The determination of phosphorus in milk samples by instrumental thermal neutron activation analysis is described. The procedure involves a short irradiation in a nuclear reactor and measurement of the beta radiation emitted by phosphorus-32 after a suitable decay period. The sources of error were studied and the established method was applied to standard reference materials of known phosphorus content. (author)

  19. Different methods for ethical analysis in health technology assessment: an empirical study.

    Science.gov (United States)

    Saarni, Samuli I; Braunack-Mayer, Annette; Hofmann, Bjørn; van der Wilt, Gert Jan

    2011-10-01

    Ethical analysis can highlight important ethical issues related to implementing a technology, values inherent in the technology itself, and value-decisions underlying the health technology assessment (HTA) process. Ethical analysis is a well-acknowledged part of HTA, yet seldom included in practice. One reason for this is lack of knowledge about the properties and differences between the methods available. This study compares different methods for ethical analysis within HTA. Ethical issues related to bariatric (obesity) surgery were independently evaluated using axiological, casuist, principlist, and EUnetHTA models for ethical analysis within HTA. The methods and results are presented and compared. Despite varying theoretical underpinnings and practical approaches, the four methods identified similar themes: personal responsibility, self-infliction, discrimination, justice, public funding, and stakeholder involvement. The axiological and EUnetHTA models identified a wider range of arguments, whereas casuistry and principlism concentrated more on analyzing a narrower set of arguments deemed more important. Different methods can be successfully used for conducting ethical analysis within HTA. Although our study does not show that different methods in ethics always produce similar results, it supports the view that different methods of ethics can yield relevantly similar results. This suggests that the key conclusions of ethical analyses within HTA can be transferable between methods and countries. The systematic and transparent use of some method of ethics appears more important than the choice of the exact method.

  20. A method for rapid similarity analysis of RNA secondary structures

    Directory of Open Access Journals (Sweden)

    Liu Na

    2006-11-01

    Full Text Available Abstract Background Owing to the rapid expansion of RNA structure databases in recent years, efficient methods for structure comparison are in demand for function prediction and evolutionary analysis. Usually, the similarity of RNA secondary structures is evaluated based on tree models and dynamic programming algorithms. We present here a new method for the similarity analysis of RNA secondary structures. Results Three sets of real data have been used as input for the example applications. Set I includes structures from 5S rRNAs. Set II includes secondary structures from RNase P and RNase MRP. Set III includes structures from 16S rRNAs. Reasonable phylogenetic trees are derived for these three sets of data using our method. Moreover, our program runs faster than some existing ones. Conclusion The well-known Lempel-Ziv algorithm can efficiently extract the information on repeated patterns encoded in RNA secondary structures, and makes our method an alternative way to analyze the similarity of RNA secondary structures. This method will also be useful to researchers who are interested in evolutionary analysis.
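
    A hedged sketch of the Lempel-Ziv idea on dot-bracket strings: counting phrases in an LZ78-style parse gives a complexity value, and joint complexities give a crude distance. This is an illustrative variant, not the authors' exact measure.

        def lz_phrase_count(s):
            # Number of distinct phrases in a left-to-right LZ78-style parse.
            phrases, current = set(), ""
            for ch in s:
                current += ch
                if current not in phrases:
                    phrases.add(current)
                    current = ""
            return len(phrases) + (1 if current else 0)

        def lz_distance(a, b):
            # Smaller when one structure's repeated patterns help "compress" the other.
            ca, cb = lz_phrase_count(a), lz_phrase_count(b)
            cab, cba = lz_phrase_count(a + b), lz_phrase_count(b + a)
            return max(cab - ca, cba - cb) / max(ca, cb)

        s1 = "((((...))))..((...))"   # toy dot-bracket secondary structures
        s2 = "((((....))))..((..))"
        print(lz_distance(s1, s2))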

  1. Statistical learning methods in high-energy and astrophysics analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zimmermann, J. [Forschungszentrum Juelich GmbH, Zentrallabor fuer Elektronik, 52425 Juelich (Germany) and Max-Planck-Institut fuer Physik, Foehringer Ring 6, 80805 Munich (Germany)]. E-mail: zimmerm@mppmu.mpg.de; Kiesling, C. [Max-Planck-Institut fuer Physik, Foehringer Ring 6, 80805 Munich (Germany)

    2004-11-21

    We discuss several popular statistical learning methods used in high-energy- and astro-physics analysis. After a short motivation for statistical learning we present the most popular algorithms and discuss several examples from current research in particle- and astro-physics. The statistical learning methods are compared with each other and with standard methods for the respective application.

  2. Statistical learning methods in high-energy and astrophysics analysis

    International Nuclear Information System (INIS)

    Zimmermann, J.; Kiesling, C.

    2004-01-01

    We discuss several popular statistical learning methods used in high-energy- and astro-physics analysis. After a short motivation for statistical learning we present the most popular algorithms and discuss several examples from current research in particle- and astro-physics. The statistical learning methods are compared with each other and with standard methods for the respective application

  3. Mathematical methods in time series analysis and digital image processing

    CERN Document Server

    Kurths, J; Maass, P; Timmer, J

    2008-01-01

    The aim of this volume is to bring together research directions in theoretical signal and imaging processing developed rather independently in electrical engineering, theoretical physics, mathematics and the computer sciences. In particular, mathematically justified algorithms and methods, the mathematical analysis of these algorithms and methods, as well as the investigation of connections between methods from time series analysis and image processing are reviewed. An interdisciplinary comparison of these methods, drawing upon common sets of test problems from medicine and geophysical/environmental sciences, is also addressed. This volume coherently summarizes work carried out in the field of theoretical signal and image processing. It focuses on non-linear and non-parametric models for time series as well as on adaptive methods in image processing.

  4. Dynamic Error Analysis Method for Vibration Shape Reconstruction of Smart FBG Plate Structure

    Directory of Open Access Journals (Sweden)

    Hesheng Zhang

    2016-01-01

    Full Text Available Shape reconstruction of aerospace plate structures is an important issue for the safe operation of aerospace vehicles. One way to achieve such reconstruction is by constructing a smart fiber Bragg grating (FBG) plate structure with discrete distributed FBG sensor arrays and using reconstruction algorithms, in which error analysis of the reconstruction algorithm is a key link. Considering that traditional error analysis methods can only deal with static data, a new dynamic data error analysis method is proposed based on the LMS algorithm for shape reconstruction of smart FBG plate structures. Firstly, the smart FBG structure and the orthogonal curved network based reconstruction method are introduced. Then, a dynamic error analysis model is proposed for dynamic reconstruction error analysis. Thirdly, parameter identification is done for the proposed dynamic error analysis model based on the least mean square (LMS) algorithm. Finally, an experimental verification platform is constructed and an experimental dynamic reconstruction analysis is done. Experimental results show that the dynamic characteristics of the reconstruction performance for the plate structure can be obtained accurately based on the proposed dynamic error analysis method. The proposed method can also be used for other data acquisition systems and data processing systems as a general error analysis method.
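
    A minimal sketch of the least mean square update at the heart of the proposed parameter identification; the three-weight linear model and step size are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(4)
        true_w = np.array([0.8, -0.3, 0.5])   # unknown model parameters
        w = np.zeros(3)                       # LMS estimate, updated online
        mu = 0.05                             # step size

        for _ in range(2000):
            x = rng.normal(size=3)                   # input sample (e.g. strain readings)
            d = true_w @ x + rng.normal(0.0, 0.01)   # desired response with noise
            e = d - w @ x                            # instantaneous error
            w += mu * e * x                          # LMS weight update

        print(w)  # converges toward true_w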

  5. Preliminary study of elemental analysis of hydroxyapatite used neutron activation analysis method

    International Nuclear Information System (INIS)

    Yustinus Purwamargapratala; Rina Mulyaningsih

    2010-01-01

    A preliminary study has been carried out on the elemental analysis of synthesized hydroxyapatite using the neutron activation analysis method. Hydroxyapatite is the main constituent of bones and teeth and can be synthesized from limestone and phosphoric acid. Hydroxyapatite can be used as a substitute material for human and animal bones and teeth. Testing the metal content is necessary to prevent the risk of damage to bones and teeth due to contamination. In the analysis using the neutron activation method, with samples irradiated at a neutron flux of 10^13 n s^-1 cm^-2 for one minute, the impurities Al (48.60±6.47 mg/kg), Cl (38.00±7.47 mg/kg), Mn (1.05±0.19 mg/kg), and Mg (2095.30±203.66 mg/kg) were detected, whereas with irradiation times of 10 and 40 minutes and a decay time of three days, K (103.89±26.82 mg/kg), Br (1617.06±193.66 mg/kg), and Na (125.10±9.57 mg/kg) were found. These results indicate that the impurities Al, Cl, Mn, Mg, Br, K and Na are present, although in very small amounts that do not cause damage to bones and teeth. (author)

  6. Mixing Methods in Organizational Ethics and Organizational Innovativeness Research : Three Approaches to Mixed Methods Analysis

    OpenAIRE

    Riivari, Elina

    2015-01-01

    This chapter discusses three categories of mixed methods analysis techniques: variable-oriented, case-oriented, and process/experience-oriented. All three categories combine qualitative and quantitative approaches to research methodology. The major differences among the categories are the focus of the study, the available analysis techniques and the temporal aspect of the study. In variable-oriented analysis, the study focus is relationships between the research phenomena. In case-oriente...

  7. Screening, sensitivity, and uncertainty for the CREAM method of Human Reliability Analysis

    International Nuclear Information System (INIS)

    Bedford, Tim; Bayley, Clare; Revie, Matthew

    2013-01-01

    This paper reports a sensitivity analysis of the Cognitive Reliability and Error Analysis Method (CREAM) for Human Reliability Analysis. We consider three different aspects: the difference between the outputs of the Basic and Extended methods on the same HRA scenario; the variability in outputs through the choices made for common performance conditions (CPCs); and the variability in outputs through the assignment of choices for cognitive function failures (CFFs). We discuss the problem of interpreting categories when applying the method, compare its quantitative structure to that of first-generation methods, and discuss how dependence is modelled with the approach. We show that the control mode intervals used in the Basic method are too narrow to be consistent with the Extended method. This motivates a new screening method that gives improved accuracy with respect to the Basic method, in the sense that it (on average) halves the uncertainty associated with the Basic method. We make some observations on the design of a screening method that are generally applicable in Risk Analysis. Finally, we propose a new method of combining CPC weights with nominal probabilities so that the calculated probabilities are always in range (i.e. between 0 and 1), while satisfying sensible properties that are consistent with the overall CREAM method.

  8. Chapter 11. Community analysis-based methods

    Energy Technology Data Exchange (ETDEWEB)

    Cao, Y.; Wu, C.H.; Andersen, G.L.; Holden, P.A.

    2010-05-01

    Microbial communities are each a composite of populations whose presence and relative abundance in water or other environmental samples are a direct manifestation of environmental conditions, including the introduction of microbe-rich fecal material and factors promoting persistence of the microbes therein. As shown by culture-independent methods, different animal-host fecal microbial communities appear distinctive, suggesting that their community profiles can be used to differentiate fecal samples and to potentially reveal the presence of host fecal material in environmental waters. Cross-comparisons of microbial communities from different hosts also reveal relative abundances of genetic groups that can be used to distinguish sources. In increasing order of their information richness, several community analysis methods hold promise for MST applications: phospholipid fatty acid (PLFA) analysis, denaturing gradient gel electrophoresis (DGGE), terminal restriction fragment length polymorphism (TRFLP), cloning/sequencing, and PhyloChip. Specific case studies involving TRFLP and PhyloChip approaches demonstrate the ability of community-based analyses of contaminated waters to confirm a diagnosis of water quality based on host-specific marker(s). The success of community-based MST for comprehensively confirming fecal sources relies extensively upon using appropriate multivariate statistical approaches. While community-based MST is still under evaluation and development as a primary diagnostic tool, results presented herein demonstrate its promise. Coupled with its inherently comprehensive ability to capture an unprecedented amount of microbiological data that is relevant to water quality, the tools for microbial community analysis are increasingly accessible, and community-based approaches have unparalleled potential for translation into rapid, perhaps real-time, monitoring platforms.

  9. Robust Methods for Moderation Analysis with a Two-Level Regression Model.

    Science.gov (United States)

    Yang, Miao; Yuan, Ke-Hai

    2016-01-01

    Moderation analysis has many applications in social sciences. Most widely used estimation methods for moderation analysis assume that errors are normally distributed and homoscedastic. When these assumptions are not met, the results from a classical moderation analysis can be misleading. For more reliable moderation analysis, this article proposes two robust methods with a two-level regression model when the predictors do not contain measurement error. One method is based on maximum likelihood with Student's t distribution and the other is based on M-estimators with Huber-type weights. An algorithm for obtaining the robust estimators is developed. Consistent estimates of standard errors of the robust estimators are provided. The robust approaches are compared against normal-distribution-based maximum likelihood (NML) with respect to power and accuracy of parameter estimates through a simulation study. Results show that the robust approaches outperform NML under various distributional conditions. Application of the robust methods is illustrated through a real data example. An R program is developed and documented to facilitate the application of the robust methods.
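
    A hedged sketch of the Huber-type route using statsmodels' general-purpose RLM (not the authors' two-level implementation): a moderation model is fitted as a regression with an interaction term, with heavy-tailed noise standing in for the non-normal errors.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(5)
        x, z = rng.normal(size=200), rng.normal(size=200)
        y = 1.0 + 0.5 * x + 0.3 * z + 0.4 * x * z + rng.standard_t(df=3, size=200)

        # Moderation model: the x*z interaction carries the moderation effect.
        X = sm.add_constant(np.column_stack([x, z, x * z]))
        fit = sm.RLM(y, X, M=sm.robust.norms.HuberT()).fit()
        print(fit.params)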

  10. Analysis of Brick Masonry Wall using Applied Element Method

    Science.gov (United States)

    Lincy Christy, D.; Madhavan Pillai, T. M.; Nagarajan, Praveen

    2018-03-01

    The Applied Element Method (AEM) is a versatile tool for structural analysis. Analysis is done by discretising the structure, as in the Finite Element Method (FEM). In AEM, elements are connected by a set of normal and shear springs instead of nodes. AEM is extensively used for the analysis of brittle materials. Brick masonry walls can be effectively analyzed in the framework of AEM. The composite nature of a masonry wall can be easily modelled using springs: the brick springs and mortar springs are assumed to be connected in series, as sketched below. The brick masonry wall is analyzed and the failure load is determined for different loading cases. The results were used to find the brick aspect ratio that best strengthens a brick masonry wall.
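
    The series connection of brick and mortar springs can be made concrete with the spring stiffness form k = E·d·t/a commonly quoted for AEM (E: elastic modulus, d: spring spacing, t: element thickness, a: represented material length); all numeric values below are assumptions.

        def spring_stiffness(e_mod, d, t, a):
            # Normal spring stiffness for the slice of material it represents.
            return e_mod * d * t / a

        def series(k1, k2):
            # Equivalent stiffness of two springs connected in series.
            return k1 * k2 / (k1 + k2)

        k_brick = spring_stiffness(e_mod=5.0e9, d=0.01, t=0.1, a=0.10)   # Pa, m
        k_mortar = spring_stiffness(e_mod=1.0e9, d=0.01, t=0.1, a=0.01)
        print(series(k_brick, k_mortar))  # the mortar joint softens the connection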

  11. A new modification of summary-based analysis method for large software system testing

    Directory of Open Access Journals (Sweden)

    A. V. Sidorin

    2015-01-01

    Full Text Available As automated testing tools become common practice, thorough computer-aided testing of large software systems, including system inter-component interfaces, is required. To achieve good coverage, one should overcome the scalability problems of different methods of analysis, which arise from the impossibility of analyzing all execution paths. The objective of this research is to build a method for inter-procedural analysis whose efficiency enables the analysis of large software systems (such as the Android OS codebase) as a whole in a reasonable time (no more than 4 hours). This article reviews existing methods of software analysis for detecting potential defects. It focuses on the symbolic execution method, since it is widely used both in static analysis of source code and in hybrid analysis of object files and intermediate representation (concolic testing). The method of symbolic execution involves partitioning the set of input data values into equivalence classes while choosing an execution path. The paper also considers the advantages of this method and its shortcomings. One of the main scalability problems is related to inter-procedural analysis: analysis time grows rapidly if an inlining method is used. This work therefore proposes a summary-based analysis method to solve the scalability problems. Clang Static Analyzer, an open-source static analyzer (part of the LLVM project), has been chosen as a target system. It allows us to compare the performance of inlining and summary-based inter-procedural analysis. A mathematical model for preliminary estimation is described in order to identify possible factors of performance improvement.

  12. The Fusion of Financial Analysis and Seismology: Statistical Methods from Financial Market Analysis Applied to Earthquake Data

    Science.gov (United States)

    Ohyanagi, S.; Dileonardo, C.

    2013-12-01

    As a natural phenomenon, earthquake occurrence is difficult to predict. Statistical analysis of earthquake data was performed using the candlestick chart and Bollinger Band methods. These statistical methods, commonly used in the financial world to analyze market trends, were tested against earthquake data. Earthquakes above Mw 4.0 located off the shore of Sanriku (37.75°N ~ 41.00°N, 143.00°E ~ 144.50°E) from February 1973 to May 2013 were selected for analysis. Two specific patterns in earthquake occurrence were recognized through the analysis. One is a spread of the candlesticks prior to the occurrence of events greater than Mw 6.0. A second pattern shows convergence in the Bollinger Band, which implies a positive or negative change in the trend of earthquakes. Both patterns match general models for the buildup and release of strain through the earthquake cycle, and agree with the characteristics of both the candlestick chart and Bollinger Band analyses. These results show there is a high correlation between patterns in earthquake occurrence and trend analysis by these two statistical methods. The results of this study support the appropriateness of applying these financial analysis methods to the analysis of earthquake occurrence.
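
    A minimal sketch of the Bollinger Band computation on a synthetic magnitude series (pandas assumed); band convergence, flagged here by a narrowing bandwidth, is the pattern the record associates with a change in trend.

        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(6)
        mags = pd.Series(4.0 + rng.gamma(2.0, 0.4, 300))  # synthetic Mw >= 4.0 events

        window = 20
        mid = mags.rolling(window).mean()
        sd = mags.rolling(window).std()
        upper, lower = mid + 2 * sd, mid - 2 * sd

        bandwidth = (upper - lower) / mid   # a narrowing bandwidth marks band convergence
        print(bandwidth.tail())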

  13. Analysis of live cell images: Methods, tools and opportunities.

    Science.gov (United States)

    Nketia, Thomas A; Sailem, Heba; Rohde, Gustavo; Machiraju, Raghu; Rittscher, Jens

    2017-02-15

    Advances in optical microscopy, biosensors and cell culturing technologies have transformed live cell imaging. Thanks to these advances, live cell imaging plays an increasingly important role in basic biology research as well as at all stages of drug development. Image analysis methods are needed to extract quantitative information from these vast and complex data sets. The aim of this review is to provide an overview of available image analysis methods for live cell imaging, in particular the required preprocessing, image segmentation, cell tracking and data visualisation methods. The potential opportunities provided by recent advances in machine learning, especially deep learning, and computer vision are discussed. This review includes an overview of the different available software packages and toolkits. Copyright © 2017. Published by Elsevier Inc.
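
    A hedged sketch of one minimal preprocessing / segmentation / measurement chain using scikit-image; real live-cell pipelines would add denoising, watershed splitting of touching cells, and frame-to-frame tracking.

        import numpy as np
        from skimage import filters, measure, morphology

        image = np.random.default_rng(7).random((128, 128))   # stand-in for one frame

        smoothed = filters.gaussian(image, sigma=2)           # preprocessing
        mask = smoothed > filters.threshold_otsu(smoothed)    # segmentation
        mask = morphology.remove_small_objects(mask, min_size=20)

        labels = measure.label(mask)                # one label per segmented object
        for region in measure.regionprops(labels):  # quantitative per-cell features
            print(region.label, region.area, region.centroid)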

  14. An exergy method for compressor performance analysis

    Energy Technology Data Exchange (ETDEWEB)

    McGovern, J A; Harte, S [Trinity Coll., Dublin (Ireland)

    1995-07-01

    An exergy method for compressor performance analysis is presented. Its purpose is to identify and quantify defects in the use of a compressor's shaft power. This information can be used as the basis for compressor design improvements. The defects are attributed to friction, irreversible heat transfer, fluid throttling, and irreversible fluid mixing. They are described, on a common basis, as exergy destruction rates, and their locations are identified. The method can be used with any type of positive displacement compressor, and is most readily applied where a detailed computer simulation program is available for the compressor. An analysis of an open reciprocating refrigeration compressor using R12 refrigerant is given as an example. The results presented consist of graphs of the instantaneous rates of exergy destruction according to the mechanisms involved, a pie chart of the breakdown of the average shaft power wastage by mechanism, and a pie chart with a breakdown by location. (author)

  15. α-Cut method based importance measure for criticality analysis in fuzzy probability – Based fault tree analysis

    International Nuclear Information System (INIS)

    Purba, Julwan Hendry; Sony Tjahyani, D.T.; Widodo, Surip; Tjahjono, Hendro

    2017-01-01

    Highlights: • FPFTA deals with epistemic uncertainty using fuzzy probability. • Criticality analysis is important for reliability improvement. • An α-cut method based importance measure is proposed for criticality analysis in FPFTA. • The α-cut method based importance measure utilises α-cut multiplication, α-cut subtraction, and an area defuzzification technique. • Benchmarking confirms that the proposed method is feasible for criticality analysis in FPFTA. -- Abstract: Fuzzy probability based fault tree analysis (FPFTA) has recently been developed and proposed to deal with the limitations of conventional fault tree analysis. In FPFTA, the reliabilities of basic events, intermediate events and the top event are characterized by fuzzy probabilities. Furthermore, the quantification of the FPFTA is based on a fuzzy multiplication rule and a fuzzy complementation rule to propagate uncertainties from the basic events to the top event. Since the objective of fault tree analysis is to improve the reliability of the system being evaluated, it is necessary to find the weakest path in the system; for this purpose, criticality analysis can be implemented. Various importance measures based on conventional probabilities have been developed and proposed for criticality analysis in fault tree analysis. However, none of those importance measures can be applied for criticality analysis in FPFTA, which is based on fuzzy probability. To be fully applicable in nuclear power plant probabilistic safety assessment, FPFTA needs to have its corresponding importance measure. The objective of this study is to develop an α-cut method based importance measure to evaluate and rank the importance of basic events for criticality analysis in FPFTA. To demonstrate the applicability of the proposed measure, a case study is performed and its results are then benchmarked against the results generated by four well-known importance measures in conventional fault tree analysis. The results…
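
    The α-cut arithmetic underlying the proposed measure can be sketched for triangular fuzzy probabilities; the fuzzy numbers below are invented, and only the standard α-cut and interval-multiplication rules are assumed.

        def alpha_cut(tri, alpha):
            # Interval [lo, hi] of a triangular fuzzy number (a, m, b) at level alpha.
            a, m, b = tri
            return (a + alpha * (m - a), b - alpha * (b - m))

        def interval_mul(x, y):
            # Interval multiplication for non-negative probability intervals.
            return (x[0] * y[0], x[1] * y[1])

        p1 = (0.01, 0.02, 0.04)   # fuzzy basic-event probabilities (invented)
        p2 = (0.03, 0.05, 0.08)
        for alpha in (0.0, 0.5, 1.0):
            print(alpha, interval_mul(alpha_cut(p1, alpha), alpha_cut(p2, alpha)))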

  16. In Vivo Imaging of Nitric Oxide by Magnetic Resonance Imaging Techniques

    Directory of Open Access Journals (Sweden)

    Rakesh Sharma

    2014-01-01

    Full Text Available Nitric oxide (NO) biosensors are novel tools for real-time bioimaging of tissue oxygen changes and physiological monitoring of tissue vasculature. Nitric oxide behavior further enhances its role in mapping signal transduction at the molecular level. Spectrometric electron paramagnetic resonance (EPR) and fluorometric imaging are well-known techniques with the potential for in vivo bioimaging of NO. In tissues, NO is a specific target of nitrosyl compounds for chemical reaction, which provides a unique opportunity for the application of newly identified NO biosensors. However, the accuracy and sensitivity of NO biosensors still need to be improved. Another potential magnetic resonance technique, based on short-term NO effects on proton relaxation enhancement, is magnetic resonance imaging (MRI), and some NO biosensors may be used as potent imaging contrast agents for the measurement of tumor size by MRI combined with fluorescent imaging. The present review provides supporting information regarding the possible use of nitrosyl compounds as NO biosensors in MRI and fluorescent bioimaging, showing their measurement limitations and quantitative accuracy. These new approaches open a perspective on the bioimaging of NO and the in vivo elucidation of NO effects by magnetic resonance techniques.

  17. Comparing sensitivity analysis methods to advance lumped watershed model identification and evaluation

    Directory of Open Access Journals (Sweden)

    Y. Tang

    2007-01-01

    Full Text Available This study seeks to identify sensitivity tools that will advance our understanding of lumped hydrologic models for the purposes of model improvement, calibration efficiency and improved measurement schemes. Four sensitivity analysis methods were tested: (1) local analysis using parameter estimation software (PEST), (2) regional sensitivity analysis (RSA), (3) analysis of variance (ANOVA), and (4) Sobol's method. The methods' relative efficiencies and effectiveness have been analyzed and compared. These four sensitivity methods were applied to the lumped Sacramento soil moisture accounting model (SAC-SMA) coupled with SNOW-17. Results from this study characterize model sensitivities for two medium-sized watersheds within the Juniata River Basin in Pennsylvania, USA. Comparative results for the four sensitivity methods are presented for a 3-year time series with 1 h, 6 h, and 24 h time intervals. The results of this study show that model parameter sensitivities are heavily impacted by the choice of analysis method as well as the model time interval. Differences between the two adjacent watersheds also suggest strong influences of local physical characteristics on the sensitivity methods' results. This study also contributes a comprehensive assessment of the repeatability, robustness, efficiency, and ease of implementation of the four sensitivity methods. Overall, ANOVA and Sobol's method were shown to be superior to RSA and PEST. Relative to one another, ANOVA has reduced computational requirements and Sobol's method yielded more robust sensitivity rankings.
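
    A brief sketch of a variance-based (Sobol') analysis using the SALib package (an assumption; the study itself does not prescribe a library), with a toy response standing in for SAC-SMA/SNOW-17:

        import numpy as np
        from SALib.sample import saltelli
        from SALib.analyze import sobol

        problem = {
            "num_vars": 3,
            "names": ["capacity", "recession", "melt_factor"],  # illustrative names
            "bounds": [[10.0, 300.0], [0.0, 1.0], [1.0, 6.0]],
        }

        X = saltelli.sample(problem, 1024)          # Saltelli cross-sampling design
        Y = X[:, 0] * X[:, 1] + np.sin(X[:, 2])     # toy model response

        Si = sobol.analyze(problem, Y)
        print(Si["S1"], Si["ST"])   # first-order and total-order sensitivity indices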

  18. Numerical analysis of melting/solidification phenomena using a moving boundary problem analysis method X-FEM

    International Nuclear Information System (INIS)

    Uchibori, Akihiro; Ohshima, Hiroyuki

    2008-01-01

    A numerical analysis method for melting/solidification phenomena has been developed to evaluate the feasibility of several candidate techniques in the nuclear fuel cycle. Our method is based on the eXtended Finite Element Method (X-FEM), which has been used for moving boundary problems. A key technique of the X-FEM is to incorporate a signed distance function into the finite element interpolation to represent the discontinuous gradient of the temperature at a moving solid-liquid interface. The construction of the finite element equation, the quadrature technique and the method used to solve the equation are reported here. The numerical solutions of the one-dimensional Stefan problem, solidification in a two-dimensional square corner and melting of pure gallium are compared to the exact solutions or to experimental data. Through these analyses, the validity of the newly developed numerical analysis method has been demonstrated. (author)
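
    For the one-dimensional Stefan benchmark mentioned above, the exact interface position is s(t) = 2λ√(αt), with λ the root of the classical transcendental equation; a hedged sketch of that reference solution, with an illustrative diffusivity and Stefan number:

        import math
        from scipy.optimize import brentq
        from scipy.special import erf

        def stefan_lambda(ste):
            # Solve lam * exp(lam**2) * erf(lam) = Ste / sqrt(pi) for lam.
            f = lambda lam: lam * math.exp(lam * lam) * erf(lam) - ste / math.sqrt(math.pi)
            return brentq(f, 1e-9, 5.0)

        alpha = 1.0e-6             # thermal diffusivity, m^2/s (illustrative)
        lam = stefan_lambda(0.5)   # Stefan number of 0.5
        print(lam, 2.0 * lam * math.sqrt(alpha * 3600.0))  # interface position after 1 h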

  19. A Method of Fire Scenarios Identification in a Consolidated Fire Risk Analysis

    International Nuclear Information System (INIS)

    Lim, Ho Gon; Han, Sang Hoon; Yang, Joon Eon

    2010-01-01

    Conventional fire PSA considers only two kinds of fire scenarios: fire without propagation, and a single propagation to a neighboring compartment. Recently, a consolidated fire risk analysis using a single fault tree (FT) was developed; however, the fire scenario identification in the new method is similar to that of the conventional fire analysis method. The present study develops a new method of fire scenario identification for a consolidated fire risk analysis. An equation for fire propagation is developed to identify fire scenarios, and a method for mapping fire scenarios onto the internal event risk model is discussed. Finally, an algorithm for an automatic program is suggested.

  20. Spatial Analysis Methods of Road Traffic Collisions

    DEFF Research Database (Denmark)

    Loo, Becky P. Y.; Anderson, Tessa Kate

    Spatial Analysis Methods of Road Traffic Collisions centers on the geographical nature of road crashes, and uses spatial methods to provide a greater understanding of the patterns and processes that cause them. Written by internationally known experts in the field of transport geography, the book outlines the key issues in identifying hazardous road locations (HRLs), considers current approaches used for reducing and preventing road traffic collisions, and outlines a strategy for improved road safety. The book covers spatial accuracy, validation, and other statistical issues, as well as link...

  1. Development and analysis of finite volume methods

    International Nuclear Information System (INIS)

    Omnes, P.

    2010-05-01

    This document is a synthesis of a set of works concerning the development and analysis of finite volume methods used for the numerical approximation of partial differential equations (PDEs) stemming from physics. In the first part, the document deals with co-localized Godunov-type schemes for the Maxwell and wave equations, with a study of the loss of precision of this scheme at low Mach number. In the second part, discrete differential operators are built on fairly general, in particular very distorted or nonconforming, two-dimensional meshes. These operators are used to approximate the solutions of PDEs modelling diffusion, electrostatics, magnetostatics and electromagnetism by the discrete duality finite volume method (DDFV) on staggered meshes. The third part presents the numerical analysis and some a priori as well as a posteriori error estimates for the discretization of the Laplace equation by the DDFV scheme. The last part is devoted to the order of convergence, in the L2 norm, of the finite volume approximation of the solution of the Laplace equation in one dimension, and on meshes with orthogonality properties in two dimensions. Necessary and sufficient conditions, relative to the mesh geometry and to the regularity of the data, are provided that ensure the second-order convergence of the method. (author)

  2. An unconventional method of quantitative microstructural analysis

    International Nuclear Information System (INIS)

    Rastani, M.

    1995-01-01

    The experiment described here introduces a simple methodology which could be used to replace the time-consuming and expensive conventional methods of metallographic and quantitative analysis of the effect of thermal treatment on microstructure. The method is ideal for the microstructural evaluation of tungsten filaments and other wire samples, such as copper wire, which can be conveniently coiled. Ten such samples were heat treated by ohmic resistance at temperatures expected to span the recrystallization range. After treatment, the samples were evaluated in the elastic recovery test, and the normalized elastic recovery factor was defined in terms of the measured deflections. It has been shown experimentally that the elastic recovery factor depends on the degree of recrystallization; in other words, this factor is used to determine the fraction of unrecrystallized material. Because the elastic recovery method examines the whole filament rather than just one section through the filament, as in the metallographic method, it measures the degree of recrystallization more accurately. The method also requires considerably less time and cost than the conventional method.

  3. Smoothed analysis of the k-means method

    NARCIS (Netherlands)

    Arthur, David; Manthey, Bodo; Röglin, Heiko

    2011-01-01

    The k-means method is one of the most widely used clustering algorithms, drawing its popularity from its speed in practice. Recently, however, it was shown to have exponential worst-case running time. In order to close the gap between practical performance and theoretical analysis, the k-means

  4. Method validation for uranium content analysis using a potentiometer T-90

    International Nuclear Information System (INIS)

    Torowati; Ngatijo; Rahmiati

    2016-01-01

    An experimental method validation has been conducted for uranium content analysis using a Potentiometer T-90. The method validation experiment was performed in the quality control laboratory of the Experimental Fuel Element Installation, PTBBN - BATAN. The objective is to determine the precision and accuracy of analytical results for uranium analysis with reference to the latest American Standard Test Method (ASTM) C1267-11, which modifies the reference method by reducing reagent consumption by 10% relative to the original method. ASTM C1267-11 is a new standard that substitutes for the older ASTM C799, Vol. 12.01, 2003; it is therefore necessary to validate the renewed method. The tool used for the analysis of uranium was the Potentiometer T-90 and the material used was standard uranium oxide powder CRM (Certified Reference Material). Validation of the method was done by analyzing the standard uranium powder with seven weighings and seven analyses. The analysis results were used to determine the accuracy, precision, Relative Standard Deviation (RSD), Horwitz coefficient of variation, and the limits of detection and quantitation. The average uranium content obtained in this method validation is 84.36% with a Standard Deviation (SD) of 0.12%, a Relative Standard Deviation (RSD) of 0.14% and a 2/3 Horwitz coefficient of variation (CV Horwitz) of 2.05%. The results show that the RSD value is smaller than the (2/3) CV Horwitz value, which means that this method has high precision. The accuracy value obtained is 0.48%, and since the acceptance limit for a high level of accuracy is an accuracy value < 2.00%, this method is regarded as having a high degree of accuracy [1]. The limit of detection (LOD) and the limit of quantitation (LOQ) are 0.0145 g/L and 0.0446 g/L, respectively. It is concluded that the ASTM C1267-11 reference method is valid for use. (author)
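
    The Horwitz check can be reproduced from the standard Horwitz function; how the mass fraction C is defined determines the numeric value, so the inputs below are illustrative rather than the paper's exact computation.

        import math

        def horwitz_rsd_percent(mass_fraction):
            # Horwitz predicted reproducibility RSD (%) at a given mass fraction.
            return 2.0 ** (1.0 - 0.5 * math.log10(mass_fraction))

        c = 0.8436                   # e.g. a uranium mass fraction of 84.36%
        cv = horwitz_rsd_percent(c)
        print(cv, 2.0 / 3.0 * cv)    # compare the measured RSD with the 2/3 limit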

  5. Solving system of DAEs by homotopy analysis method

    International Nuclear Information System (INIS)

    Awawdeh, Fadi; Jaradat, H.M.; Alsayyed, O.

    2009-01-01

    Homotopy analysis method (HAM) is applied to systems of differential-algebraic equations (DAEs). The HAM is proved to be very effective, simple and convenient to give approximate analytical solutions to DAEs.

  6. Combined optimal-pathlengths method for near-infrared spectroscopy analysis

    International Nuclear Information System (INIS)

    Liu Rong; Xu Kexin; Lu Yanhui; Sun Huili

    2004-01-01

    Near-infrared (NIR) spectroscopy is a rapid, reagent-less and nondestructive analytical technique, which is being increasingly employed for quantitative applications in chemistry, pharmaceutics and the food industry, and for the optical analysis of biological tissue. The performance of NIR technology greatly depends on the ability to control and acquire data from the instrument and to calibrate and analyse the data. Optical pathlength is a key parameter of the NIR instrument, and it has been thoroughly discussed for univariate quantitative analysis in the presence of photometric errors. Although multiple wavelengths can provide more chemical information, it is difficult to determine a single pathlength that is suitable for every wavelength region. A theoretical investigation of a selection procedure for multiple pathlengths, called the combined optimal-pathlengths (COP) method, is presented in this paper, and an extensive comparison with the single pathlength method is also performed on simulated and experimental NIR spectral data sets. The results obtained show that the COP method can greatly improve the prediction accuracy in NIR spectroscopy quantitative analysis.

  7. Development of rapid urine analysis method for uranium

    Energy Technology Data Exchange (ETDEWEB)

    Kuwabara, J.; Noguchi, H. [Japan Atomic Energy Research Institute, Tokai, Ibaraki (Japan)

    2000-05-01

    ICP-MS has begun to spread in the field of individual monitoring for internal exposure as a very effective instrument for uranium analysis. Although ICP-MS has very high sensitivity, it requires a longer time than conventional analysis, such as fluorescence analysis, because the matrix must be sufficiently removed from a urine sample. To shorten the time required for the urine bioassay by ICP-MS, a rapid uranium analysis method using ICP-MS connected to a flow injection system was developed. Since this method does not involve chemical separation steps, the time required is equivalent to that of the conventional analysis. A measurement test was carried out using 10 urine solutions prepared from a urine sample; the required volume of urine solution is 5 ml. The main chemical treatment is only digestion with 5 ml of nitric acid using a microwave oven, to decompose organic matter and to dissolve suspended or precipitated matter. The microwave oven can digest 10 samples at once within an hour. The volume of digested sample solution was adjusted to 10 ml. The prepared sample solutions were introduced directly into the ICP-MS without any chemical separation procedure. The ICP-MS was connected to a flow injection system and an autosampler. The flow injection system can minimize the matrix effects caused by salts dissolved in high-matrix solutions, such as urine samples without chemical separation, because it introduces a micro volume of sample solution into the ICP-MS. The ICP-MS detected uranium within 2 min/sample using the autosampler. The 10 solutions prepared from a urine sample showed an average uranium concentration in urine of 7.5 ng/l with a 10% standard deviation. The detection limit is about 1 ng/l. The total time required was less than 4 hours for the analysis of 10 samples. In the series of measurements, no memory effect was observed. The present analysis method using the ICP-MS equipped with the flow injection system demonstrated that the shortening of time required on high

  8. Development of rapid urine analysis method for uranium

    International Nuclear Information System (INIS)

    Kuwabara, J.; Noguchi, H.

    2000-01-01

    ICP-MS has begun to spread in the field of individual monitoring for internal exposure as a very effective instrument for uranium analysis. Although ICP-MS has very high sensitivity, it requires a longer time than conventional analysis, such as fluorescence analysis, because the matrix must be sufficiently removed from a urine sample. To shorten the time required for the urine bioassay by ICP-MS, a rapid uranium analysis method using ICP-MS connected to a flow injection system was developed. Since this method does not involve chemical separation steps, the time required is equivalent to that of the conventional analysis. A measurement test was carried out using 10 urine solutions prepared from a urine sample; the required volume of urine solution is 5 ml. The main chemical treatment is only digestion with 5 ml of nitric acid using a microwave oven, to decompose organic matter and to dissolve suspended or precipitated matter. The microwave oven can digest 10 samples at once within an hour. The volume of digested sample solution was adjusted to 10 ml. The prepared sample solutions were introduced directly into the ICP-MS without any chemical separation procedure. The ICP-MS was connected to a flow injection system and an autosampler. The flow injection system can minimize the matrix effects caused by salts dissolved in high-matrix solutions, such as urine samples without chemical separation, because it introduces a micro volume of sample solution into the ICP-MS. The ICP-MS detected uranium within 2 min/sample using the autosampler. The 10 solutions prepared from a urine sample showed an average uranium concentration in urine of 7.5 ng/l with a 10% standard deviation. The detection limit is about 1 ng/l. The total time required was less than 4 hours for the analysis of 10 samples. In the series of measurements, no memory effect was observed. The present analysis method using the ICP-MS equipped with the flow injection system demonstrated that the shortening of time required on high

  9. Application of econometric and ecology analysis methods in physics software

    Science.gov (United States)

    Han, Min Cheol; Hoff, Gabriela; Kim, Chan Hyeong; Kim, Sung Hun; Grazia Pia, Maria; Ronchieri, Elisabetta; Saracco, Paolo

    2017-10-01

    Some data analysis methods typically used in econometric studies and in ecology have been evaluated and applied in physics software environments. They concern the evolution of observables through objective identification of change points and trends, and measurements of inequality, diversity and evenness across a data set. Within each analysis area, various statistical tests and measures have been examined. This conference paper gives a brief overview of some of these methods.
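
    As a concrete illustration of two of the measures named above, the sketch below computes the Gini coefficient (inequality) and Pielou's evenness (a normalised Shannon diversity) in Python. This is a minimal sketch, not the paper's implementation; the per-module contribution counts are invented for the example.

        import numpy as np

        def gini(x):
            """Gini coefficient of a non-negative sample (0 = perfect equality)."""
            x = np.sort(np.asarray(x, dtype=float))
            n = x.size
            ranks = np.arange(1, n + 1)
            return 2.0 * np.sum(ranks * x) / (n * np.sum(x)) - (n + 1.0) / n

        def pielou_evenness(counts):
            """Shannon entropy of category counts, normalised by its maximum."""
            p = np.asarray(counts, dtype=float)
            p = p[p > 0] / p.sum()
            return -np.sum(p * np.log(p)) / np.log(p.size)

        contributions = [120, 80, 75, 40, 5, 3, 2]  # hypothetical per-module sizes
        print(f"Gini: {gini(contributions):.3f}")
        print(f"Evenness: {pielou_evenness(contributions):.3f}")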

  10. Improvements in biamperometric method for remote analysis of uranium

    International Nuclear Information System (INIS)

    Palamalai, A.; Thankachan, T.S.; Balasubramanian, G.R.

    1979-01-01

    One of the titrimetric methods most suitable for remote operations with master-slave manipulators inside hot cells is the biamperometric method. The biamperometric method for the analysis of uranium reported in the literature is found to give rise to a significant bias, especially with small aliquots of uranium, and the waste volume is also considerable, which is undesirable from the point of view of radioactive waste disposal. In the present method, both the bias and the waste volume are reduced. The addition of vanadyl sulphate is also found necessary to provide a sharp end point in the titration curve. The role of vanadyl sulphate in improving the titration method has been investigated by spectrophotometry and electrometry. A new mechanism for the role of vanadyl sulphate, which is in conformity with the observations made in the coulometric titration of uranium, is proposed. Interference from deliberate additions of high concentrations of stable fission product species is found to be negligible. Hence this method is considered highly suitable for the remote analysis of uranium in intensely radioactive reprocessing solutions for control purposes, provided radioactivity does not pose new problems. (auth.)

  11. Interface and thin film analysis: Comparison of methods, trends

    International Nuclear Information System (INIS)

    Werner, H.W.; Torrisi, A.

    1990-01-01

    Thin film properties are governed by a number of parameters such as: Surface and interface chemical composition, microstructure and the distribution of defects, dopants and impurities. For the determination of most of these aspects sophisticated analytical methods are needed. An overview of these analytical methods is given including: - Features and modes of analytical methods; - Main characteristics, advantages and disadvantages of the established methods [e.g. ESCA (Electron Spectroscopy for Chemical Analysis), AES (Auger Electron Spectroscopy), SIMS (Secondary Ion Mass Spectrometry), RBS (Rutherford Backscattering Spectrometry), SEM (Scanning Electron Microscopy), TEM (Transmission Electron Microscopy), illustrated with typical examples]; - Presentation of relatively new methods such as XRM (X-ray Microscopy) and SCAM (Scanning Acoustic Microscopy). Some features of ESCA (chemical information, insulator analysis, non-destructive depth profiling) have been selected for a more detailed presentation, viz. to illustrate the application of ESCA to practical problems. Trends in instrumental development and analytical applications of the techniques are discussed; the need for a multi-technique approach to solve complex analytical problems is emphasized. (orig.)

  12. Participant Interaction in Asynchronous Learning Environments: Evaluating Interaction Analysis Methods

    Science.gov (United States)

    Blanchette, Judith

    2012-01-01

    The purpose of this empirical study was to determine the extent to which three different objective analytical methods--sequence analysis, surface cohesion analysis, and lexical cohesion analysis--can most accurately identify specific characteristics of online interaction. Statistically significant differences were found in all points of…

  13. Robust methods for multivariate data analysis

    DEFF Research Database (Denmark)

    Frosch, Stina; Von Frese, J.; Bro, Rasmus

    2005-01-01

    Outliers may hamper proper classical multivariate analysis, and lead to incorrect conclusions. To remedy the problem of outliers, robust methods are developed in statistics and chemometrics. Robust methods reduce or remove the effect of outlying data points and allow the "good" data to primarily... determine the result. This article reviews the most commonly used robust multivariate regression and exploratory methods that have appeared since 1996 in the field of chemometrics. Special emphasis is put on the robust versions of chemometric standard tools like PCA and PLS and the corresponding robust...

  14. Comparison of urine analysis using manual and sedimentation methods.

    Science.gov (United States)

    Kurup, R; Leich, M

    2012-06-01

    Microscopic examination of urine sediment is an essential part of the evaluation of renal and urinary tract diseases. Traditionally, urine sediments are assessed by microscopic examination of centrifuged urine. However, the current method used by the Georgetown Public Hospital Corporation Medical Laboratory involves uncentrifuged urine. To support a high level of care, the results provided to the physician must be accurate and reliable for proper diagnosis. The aim of this study is to determine whether the centrifuged method is more clinically significant than the uncentrifuged method. In this study, a comparison between the results obtained from the centrifuged and uncentrifuged methods was performed. A total of 167 urine samples were randomly collected and analysed during the period April-May 2010 at the Medical Laboratory, Georgetown Public Hospital Corporation. The urine samples were first analysed microscopically by the uncentrifuged method, and then by the centrifuged method. The results obtained from both methods were recorded in a log book. These results were then entered into a database created in Microsoft Excel and analysed for differences and similarities using this application. Analysis was further done in SPSS software to compare the results using Pearson's correlation. When compared using Pearson's correlation coefficient analysis, both methods showed a good correlation between urinary sediments, with the exception of white blood cells. The centrifuged method had a slightly higher identification rate for all of the parameters. There is substantial agreement between the centrifuged and uncentrifuged methods. However, the uncentrifuged method provides a rapid turnaround time.
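
    The correlation step described above reduces to a paired comparison of counts from the two methods. A minimal sketch in Python, with invented counts standing in for the study's data (which are not reproduced here):

        from scipy.stats import pearsonr

        # paired sediment counts for the same specimens (invented values)
        centrifuged   = [2, 5, 0, 12, 3, 7, 1, 9]
        uncentrifuged = [1, 4, 0, 10, 3, 6, 1, 8]

        r, p_value = pearsonr(centrifuged, uncentrifuged)
        print(f"r = {r:.3f}, p = {p_value:.4f}")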

  15. The Dynamic Monte Carlo Method for Transient Analysis of Nuclear Reactors

    NARCIS (Netherlands)

    Sjenitzer, B.L.

    2013-01-01

    In this thesis a new method for the analysis of power transients in a nuclear reactor is developed, which is more accurate than the present state-of-the-art methods. Transient analysis is an important tool when designing nuclear reactors, since it predicts the behaviour of a reactor during changing

  16. Improvement of human reliability analysis method for PRA

    International Nuclear Information System (INIS)

    Tanji, Junichi; Fujimoto, Haruo

    2013-09-01

    It is required to refine the human reliability analysis (HRA) method by, for example, incorporating consideration of the cognitive process of the operator into the evaluation of diagnosis errors and decision-making errors, as part of the development and improvement of methods used in probabilistic risk assessments (PRAs). JNES has developed an HRA method based on ATHENA, which is suitable for handling the structured relationship among diagnosis errors, decision-making errors and the operator cognition process. This report summarizes the outcomes obtained from the improvement of the HRA method, in which enhancements were made to evaluate how the degraded plant condition affects the operator cognitive process and to evaluate human error probabilities (HEPs) corresponding to the contents of operator tasks. In addition, this report describes the results of case studies on representative accident sequences to investigate the applicability of the HRA method developed. HEPs of the same accident sequences are also estimated using the THERP method, the most popularly used HRA method, and the results obtained using these two methods are compared to depict the differences between the methods and the issues to be solved. Important conclusions obtained are as follows: (1) Improvement of the HRA method using an operator cognitive action model. Clarification of the factors to be considered in the evaluation of human errors, incorporation of degraded plant safety conditions into HRA, and investigation of HEPs affected by the contents of operator tasks were made to improve the HRA method, which can integrate the operator cognitive action model into the ATHENA method. In addition, the procedures of the improved method were delineated in detail in the form of a flowchart. (2) Case studies and comparison with the results evaluated by the THERP method. Four operator actions modeled in the PRAs of representative BWR5 and 4-loop PWR plants were selected and evaluated as case studies. These cases were also evaluated using the THERP method.

  17. Establishment of analysis method for methane detection by gas chromatography

    Science.gov (United States)

    Liu, Xinyuan; Yang, Jie; Ye, Tianyi; Han, Zeyu

    2018-02-01

    The study focused on the establishment of an analysis method for methane determination by gas chromatography. Methane was detected by a hydrogen flame ionization detector, and the quantitative relationship was determined by the working curve y=2041.2x+2187 with a correlation coefficient of 0.9979. A relative standard deviation of 2.60-6.33% and a recovery rate of 96.36%∼105.89% were obtained during the parallel determination of standard gas. This method is not well suited to biogas content analysis, because the methane content of biogas would exceed the measurement range of the method.
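
    The calibration step amounts to fitting a straight working curve to standard-gas injections and inverting it for unknowns. A short sketch, assuming hypothetical standard concentrations and peak areas consistent with the reported curve y=2041.2x+2187:

        import numpy as np

        rng = np.random.default_rng(0)
        conc = np.array([1.0, 2.0, 5.0, 10.0, 20.0])                 # standards
        area = 2041.2 * conc + 2187 + rng.normal(0, 400, conc.size)  # detector response

        a, b = np.polyfit(conc, area, 1)                  # working curve y = ax + b
        r = np.corrcoef(conc, area)[0, 1]                 # correlation coefficient
        print(f"y = {a:.1f}x + {b:.0f}, r = {r:.4f}")

        unknown_area = 12000.0                            # a sample injection
        print(f"estimated concentration: {(unknown_area - b) / a:.2f}")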

  18. Ratio of slopes method for quantitative analysis in ceramic bodies

    International Nuclear Information System (INIS)

    Zainal Arifin Ahmad; Ahmad Fauzi Mohd Noor; Radzali Othman; Messer, P.F.

    1996-01-01

    A quantitative X-ray diffraction analysis technique developed at the University of Sheffield was adopted, rather than the previously widely used internal standard method, to determine the amount of the phases present in a reformulated whiteware porcelain and a BaTiO3 electrochemical material. This method, although it still employs an internal standard, was found to be very easy and accurate. The required weight fraction of a phase in the mixture to be analysed is determined from the ratio of the slopes of two linear plots, designated as the analysis and reference lines, passing through their origins, using the least squares method.
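
    The core computation is simple: fit the analysis and reference lines through the origin by least squares and take the ratio of their slopes. The sketch below shows this arithmetic only; what is plotted on each axis, and the data values used here, are illustrative stand-ins rather than the full Sheffield procedure.

        import numpy as np

        def slope_through_origin(x, y):
            # least-squares slope of a line constrained through the origin
            x, y = np.asarray(x, float), np.asarray(y, float)
            return np.sum(x * y) / np.sum(x * x)

        # invented intensity data for the two linear plots
        analysis_x,  analysis_y  = [1, 2, 3, 4], [0.42, 0.79, 1.23, 1.61]
        reference_x, reference_y = [1, 2, 3, 4], [1.05, 2.02, 2.98, 4.01]

        w = (slope_through_origin(analysis_x, analysis_y)
             / slope_through_origin(reference_x, reference_y))
        print(f"estimated weight fraction: {w:.3f}")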

  19. Mendelian randomization analysis of a time-varying exposure for binary disease outcomes using functional data analysis methods.

    Science.gov (United States)

    Cao, Ying; Rajan, Suja S; Wei, Peng

    2016-12-01

    A Mendelian randomization (MR) analysis is performed to analyze the causal effect of an exposure variable on a disease outcome in observational studies, by using genetic variants that affect the disease outcome only through the exposure variable. This method has recently gained popularity among epidemiologists given the success of genetic association studies. Many exposure variables of interest in epidemiological studies are time varying, for example, body mass index (BMI). Although longitudinal data have been collected in many cohort studies, current MR studies only use one measurement of a time-varying exposure variable, which cannot adequately capture the long-term time-varying information. We propose using the functional principal component analysis method to recover the underlying individual trajectory of the time-varying exposure from the sparsely and irregularly observed longitudinal data, and then conduct MR analysis using the recovered curves. We further propose two MR analysis methods. The first assumes a cumulative effect of the time-varying exposure variable on the disease risk, while the second assumes a time-varying genetic effect and employs functional regression models. We focus on statistical testing for a causal effect. Our simulation studies mimicking the real data show that the proposed functional data analysis based methods incorporating longitudinal data have substantial power gains compared to standard MR analysis using only one measurement. We used the Framingham Heart Study data to demonstrate the promising performance of the new methods as well as inconsistent results produced by the standard MR analysis that relies on a single measurement of the exposure at some arbitrary time point. © 2016 WILEY PERIODICALS, INC.

  20. Systematic approaches to data analysis from the Critical Decision Method

    Directory of Open Access Journals (Sweden)

    Martin Sedlár

    2015-01-01

    Full Text Available The aim of the present paper is to introduce how to analyse qualitative data from the Critical Decision Method. First, a characterization of the method provides a meaningful introduction to the issue. This method, used in naturalistic decision making research, is one of the cognitive task analysis methods; it is based on a retrospective semi-structured interview about a critical incident from work, and it may be applied in various domains such as emergency services, the military, transport, sport or industry. Researchers can make two types of methodological adaptation. Within-method adaptations modify the way of conducting the interviews, and cross-method adaptations combine this method with other related methods. There are many descriptions of conducting the interview, but descriptions of how the data should be analysed are rare. Some researchers use conventional approaches like content analysis, grounded theory or individual procedures with reference to the objectives of the research project. Wong (2004) describes two approaches to data analysis proposed for this method of data collection, which are described and reviewed here in detail. They enable systematic work with a large amount of data. The structured approach organizes the data according to an a priori analysis framework and is suitable for a clearly defined object of research. Each incident is studied separately. First, a decision chart showing the main decision points is made, followed by an incident summary. The decision points are used to identify the relevant statements from the transcript, which are analysed in terms of the Recognition-Primed Decision Model. Finally, the results from all the analysed incidents are integrated. The limitation of the structured approach is that it may not reveal some interesting concepts. The emergent themes approach helps to identify these concepts while maintaining a systematic framework for analysis, and it is used for exploratory research designs.

  1. Comparison of different surface quantitative analysis methods. Application to corium

    International Nuclear Information System (INIS)

    Guilbaud, N.; Blin, D.; Perodeaud, Ph.; Dugne, O.; Gueneau, Ch.

    2000-01-01

    In case of a severe hypothetical accident in a pressurized water reactor, the reactor assembly melts partially or completely. The material formed, called corium, flows out and spreads at the bottom of the reactor. To limit and control the consequences of such an accident, the specifications of the O-U-Zr basic system must be known accurately. To achieve this goal, the corium mix was melted by electron bombardment at very high temperature (3000 K), followed by quenching of the ingot in the Isabel 1 evaporator. Metallographic analyses were then required to validate the thermodynamic databases used by the Thermo-Calc software. The study consists in defining an overall surface quantitative analysis method that is fast and reliable, in order to determine the overall corium composition. The analyzed ingot originated from a (U + Fe + Y + UO2 + ZrO2) mix, with a total mass of 2253.7 grams. Several successive heatings at average power were performed before a very brief plateau at very high temperature, so that the ingot was formed progressively and without any evaporation liable to modify its initial composition. The central zone of the ingot was then analyzed by qualitative and quantitative global surface methods, to yield the volume composition of the analyzed zone. Corium sample analysis happens to be very complex because of the variety and number of elements present, and also because of the presence of oxygen in a heavy-element matrix based on uranium. Three different global quantitative surface analysis methods were used: global EDS analysis (Energy Dispersive Spectrometry) with SEM, global WDS analysis (Wavelength Dispersive Spectrometry) with EPMA, and the coupling of image analysis with EDS or WDS point spectroscopic analyses. The difficulties encountered during the study arose from sample preparation (corium is very sensitive to oxidation) and from the choice of acquisition parameters for the images and analyses. The corium sample studied consisted of two zones displaying

  2. Fringe image analysis based on the amplitude modulation method.

    Science.gov (United States)

    Gai, Shaoyan; Da, Feipeng

    2010-05-10

    A novel phase-analysis method is proposed. To obtain the fringe order of a fringe image, an amplitude-modulation fringe pattern is used in combination with the phase-shift method. The primary phase value is obtained by a phase-shift algorithm, and the fringe-order information is encoded in the amplitude-modulation fringe pattern. Unlike other methods, the amplitude-modulation fringe identifies the fringe order by the amplitude of the fringe pattern. In an amplitude-modulation fringe pattern, each fringe has its own amplitude; thus, the order information is integrated into one fringe pattern, and the absolute fringe phase can be calculated correctly and quickly from the amplitude-modulation fringe image. The detailed algorithm is given, and an error analysis of the method is also presented. Experimental results are shown for a full-field shape measurement system in which the data have been processed using the proposed algorithm. (c) 2010 Optical Society of America.
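
    The phase-shift part of the scheme can be illustrated with the standard four-step algorithm, which recovers the wrapped (primary) phase from four fringe images shifted by pi/2; the amplitude-based fringe-order assignment is the paper's own contribution and is not reproduced here.

        import numpy as np

        def wrapped_phase(I1, I2, I3, I4):
            """Primary phase in (-pi, pi] from four pi/2-shifted fringe images."""
            return np.arctan2(I4 - I2, I1 - I3)

        # synthetic test: unit background and amplitude, linear phase ramp
        x = np.linspace(0, 4 * np.pi, 512)
        imgs = [1 + np.cos(x + k * np.pi / 2) for k in range(4)]
        phi = wrapped_phase(*imgs)   # equals x wrapped into (-pi, pi]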

  3. Numerical Solution of Nonlinear Fredholm Integro-Differential Equations Using Spectral Homotopy Analysis Method

    Directory of Open Access Journals (Sweden)

    Z. Pashazadeh Atabakan

    2013-01-01

    Full Text Available The spectral homotopy analysis method (SHAM), a modification of the homotopy analysis method (HAM), is applied to obtain solutions of high-order nonlinear Fredholm integro-differential problems. The existence and uniqueness of the solution and the convergence of the proposed method are proved. Some examples are given to demonstrate the efficiency and accuracy of the proposed method. The SHAM results show that the proposed approach is quite reasonable when compared to the homotopy analysis method, Lagrange interpolation solutions, and exact solutions.

  4. Scalable Kernel Methods and Algorithms for General Sequence Analysis

    Science.gov (United States)

    Kuksa, Pavel

    2011-01-01

    Analysis of large-scale sequential data has become an important task in machine learning and pattern recognition, inspired in part by numerous scientific and technological applications such as document and text classification or the analysis of biological sequences. However, current computational methods for sequence comparison still lack…

  5. [Sample preparation methods for chromatographic analysis of organic components in atmospheric particulate matter].

    Science.gov (United States)

    Hao, Liang; Wu, Dapeng; Guan, Yafeng

    2014-09-01

    The determination of the organic composition of atmospheric particulate matter (PM) is of great importance in understanding how PM affects human health, the environment, climate, and ecosystems. Organic components are also the scientific basis for emission source tracking, PM regulation and risk management. Therefore, the molecular characterization of the organic fraction of PM has become one of the priority research issues in the field of environmental analysis. Due to the extreme complexity of PM samples, chromatographic methods have been the chief selection. The common procedure for the analysis of organic components in PM includes several steps: sample collection on fiber filters, sample preparation (transforming the sample into a form suitable for chromatographic analysis), and analysis by chromatographic methods. Among these steps, the sample preparation methods largely determine the throughput and the data quality. Solvent extraction methods followed by sample pretreatment (e.g. pre-separation, derivatization, pre-concentration) have long been used for PM sample analysis, while thermal desorption methods have mainly focused on the analysis of non-polar organic components in PM. In this paper, the sample preparation methods prior to chromatographic analysis of organic components in PM are reviewed comprehensively, and the corresponding merits and limitations of each method are briefly discussed.

  6. A complex neutron activation method for the analysis of biological materials

    International Nuclear Information System (INIS)

    Ordogh, M.

    1978-01-01

    The aim of the present work was to deal primarily with a few essential trace elements and to obtain reliable results of adequate accuracy and precision for the analysis of biological samples. A few elements other than trace elements were determined by the nondestructive technique, as they can be well evaluated from the gamma spectra. In the development of the method, BOWEN's kale was chosen as the model material. To confirm the reliability of the method, two samples proposed by the IAEA in the frame of an international comparative analysis series were analysed. The comparative analysis shows the present method to be reliable; the precision and accuracy are good. (author)

  7. Analysis and comparison of biometric methods

    OpenAIRE

    Zatloukal, Filip

    2011-01-01

    The thesis deals with biometrics and biometric systems and the possibility of using these systems in the enterprise. The aim of this study is the analysis and description of selected types of biometric identification methods and their advantages and shortcomings. The work is divided into two parts. The first part is theoretical and describes the basic concepts of biometrics, biometric identification criteria, currently used identification systems, the ways biometric systems are used, performance measurem...

  8. Study of nasal swipe analysis methods at Los Alamos National Laboratory

    International Nuclear Information System (INIS)

    Metcalf, R.A.

    1996-01-01

    The Health Physics Analysis Laboratory (HPAL) performs around 30,000 nasal swipe analyses for transuranic nuclides each year in support of worker health and safety at the Los Alamos National Laboratory (LANL). The analysis method employs cotton swabs swiped inside a nostril and liquid scintillation analysis of the swabs. The technical basis of this method was developed at LANL and has been in use for over 10 years. Recently, questions regarding the usefulness of a non-homogeneous mixture in liquid scintillation analyses have created a need for re-evaluation of the method. A study of the validity of the method shows it provides reliable, stable, and useful data as an indicator of personnel contamination. The study has also provided insight into the underlying process which occurs to allow the analysis. Further review of this process has shown that similar results can be obtained with different sample matrices, using less material than the current analysis method. This reduction can save HPAL the cost of materials as well as greatly reduce the waste created. Radionuclides of concern include Am-241, Pu-239, and Pu-238.

  9. Preliminary Tests For Development Of A Non-Pertechnetate Analysis Method

    Energy Technology Data Exchange (ETDEWEB)

    Diprete, D. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); McCabe, D. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2016-09-28

    The objective of this task was to develop a non-pertechnetate analysis method that the 222-S lab could easily implement. The initial scope involved working with 222-S laboratory personnel to adapt the existing Tc analytical method to fractionate the non-pertechnetate and pertechnetate. SRNL then developed and tested a method using commercial sorbents containing Aliquat® 336 to extract the pertechnetate (thereby separating it from non-pertechnetate), followed by oxidation, extraction, and stripping steps, and finally analysis by beta counting and mass spectrometry. Several additional items were partially investigated, including the impacts of a 137Cs removal step. The method was initially tested on SRS tank waste samples to determine its viability. Although SRS tank waste does not contain non-pertechnetate, testing with it was useful to investigate the compatibility, separation efficiency, interference removal efficacy, and sensitivity of the method.

  10. Comparation studies of uranium analysis method using spectrophotometer and voltammeter

    International Nuclear Information System (INIS)

    Sugeng Pomomo

    2013-01-01

    A comparative study of uranium analysis methods using a spectrophotometer and a voltammeter was carried out. The objective of the experiment is to examine the reliability of the analysis methods and the instrument performance by evaluating the parameters linearity, accuracy, precision and detection limit. Uranyl nitrate hexahydrate was used as the standard, and the sample was a solvent mixture of tributyl phosphate and kerosene containing uranium (from the phosphoric acid purification unit of Petrokimia Gresik). Uranium (U) was stripped from the sample using 0.5 N HNO3 and then analyzed with both instruments. Analysis of the standard shows that both methods give good linearity, with a correlation coefficient > 0.999. Spectrophotometry gives an accuracy of 99.34-101.05% with a relative standard deviation (RSD) of 1.03% and a detection limit (DL) of 0.05 ppm. Voltammetry gives an accuracy of 95.63-101.49% with an RSD of 3.91% and a detection limit (DL) of 0.509 ppm. The analysis of sludge samples gave significantly different results: spectrophotometry gives a U concentration of 4.445 ppm with an RSD of 6.74%, and voltammetry gives a U concentration of 7.693 ppm with an RSD of 19.53%. (author)

  11. Large-scale automated image analysis for computational profiling of brain tissue surrounding implanted neuroprosthetic devices using Python.

    Science.gov (United States)

    Rey-Villamizar, Nicolas; Somasundar, Vinay; Megjhani, Murad; Xu, Yan; Lu, Yanbin; Padmanabhan, Raghav; Trett, Kristen; Shain, William; Roysam, Badri

    2014-01-01

    In this article, we describe the use of Python for large-scale automated server-based bio-image analysis in FARSIGHT, a free and open-source toolkit of image analysis methods for quantitative studies of complex and dynamic tissue microenvironments imaged by modern optical microscopes, including confocal, multi-spectral, multi-photon, and time-lapse systems. The core FARSIGHT modules for image segmentation, feature extraction, tracking, and machine learning are written in C++, leveraging widely used libraries including ITK, VTK, Boost, and Qt. For solving complex image analysis tasks, these modules must be combined into scripts using Python. As a concrete example, we consider the problem of analyzing 3-D multi-spectral images of brain tissue surrounding implanted neuroprosthetic devices, acquired using high-throughput multi-spectral spinning disk step-and-repeat confocal microscopy. The resulting images typically contain 5 fluorescent channels. Each channel consists of 6000 × 10,000 × 500 voxels with 16 bits/voxel, implying image sizes exceeding 250 GB. These images must be mosaicked, pre-processed to overcome imaging artifacts, and segmented to enable cellular-scale feature extraction. The features are used to identify cell types, and perform large-scale analysis for identifying spatial distributions of specific cell types relative to the device. Python was used to build a server-based script (Dell 910 PowerEdge servers with 4 sockets/server with 10 cores each, 2 threads per core and 1TB of RAM running on Red Hat Enterprise Linux linked to a RAID 5 SAN) capable of routinely handling image datasets at this scale and performing all these processing steps in a collaborative multi-user multi-platform environment. Our Python script enables efficient data storage and movement between computers and storage servers, logs all the processing steps, and performs full multi-threaded execution of all codes, including open and closed-source third party libraries.
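
    A much-simplified sketch of the kind of server-side batch script the article describes, in the article's own language (Python): walk a directory of image tiles, process each in a thread pool, and log every step. The FARSIGHT modules themselves are C++ and are not reproduced; process_tile() and the "tiles" directory are hypothetical stand-ins for the mosaicking, artifact-correction, and segmentation calls.

        import logging
        from concurrent.futures import ThreadPoolExecutor, as_completed
        from pathlib import Path

        logging.basicConfig(filename="pipeline.log", level=logging.INFO,
                            format="%(asctime)s %(levelname)s %(message)s")

        def process_tile(path):
            # stand-in for the real per-tile processing chain
            logging.info("processing %s", path)
            return f"{path.name}: ok"

        tiles = sorted(Path("tiles").glob("*.tif"))   # hypothetical input layout
        with ThreadPoolExecutor(max_workers=20) as pool:
            futures = [pool.submit(process_tile, t) for t in tiles]
            for fut in as_completed(futures):
                logging.info(fut.result())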

  12. Report of a CSNI workshop on uncertainty analysis methods. Volume 1 + 2

    International Nuclear Information System (INIS)

    Wickett, A.J.; Yadigaroglu, G.

    1994-08-01

    The OECD NEA CSNI Principal Working Group 2 (PWG2) Task Group on Thermal Hydraulic System Behaviour (TGTHSB) has, in recent years, received presentations of a variety of different methods to analyze the uncertainty in the calculations of advanced unbiased (best estimate) codes. Proposals were also made for an International Standard Problem (ISP) to compare the uncertainty analysis methods. The objectives for the Workshop were to discuss and fully understand the principles of uncertainty analysis relevant to LOCA modelling and like problems, to examine the underlying issues from first principles, in preference to comparing and contrasting the currently proposed methods, to reach consensus on the issues identified as far as possible while not avoiding the controversial aspects, to identify as clearly as possible unreconciled differences, and to issue a Status Report. Eight uncertainty analysis methods were presented. A structured discussion of various aspects of uncertainty analysis followed - the need for uncertainty analysis, identification and ranking of uncertainties, characterisation, quantification and combination of uncertainties and applications, resources and future developments. As a result, the objectives set out above were, to a very large extent, achieved. Plans for the ISP were also discussed. Volume 1 contains a record of the discussions on uncertainty methods. Volume 2 is a compilation of descriptions of the eight uncertainty analysis methods presented at the workshop

  13. Application of the probabilistic approximate analysis method to a turbopump blade analysis. [for Space Shuttle Main Engine

    Science.gov (United States)

    Thacker, B. H.; Mcclung, R. C.; Millwater, H. R.

    1990-01-01

    An eigenvalue analysis of a typical space propulsion system turbopump blade is presented using an approximate probabilistic analysis methodology. The methodology was developed originally to investigate the feasibility of computing probabilistic structural response using closed-form approximate models. This paper extends the methodology to structures for which simple closed-form solutions do not exist. The finite element method will be used for this demonstration, but the concepts apply to any numerical method. The results agree with detailed analysis results and indicate the usefulness of using a probabilistic approximate analysis in determining efficient solution strategies.

  14. Piezoelectric Analysis of Saw Sensor Using Finite Element Method

    Directory of Open Access Journals (Sweden)

    Vladimír KUTIŠ

    2013-06-01

    Full Text Available In this contribution, modeling and simulation of a surface acoustic wave (SAW) sensor using the finite element method will be presented. The SAW sensor is made from a piezoelectric GaN layer and a SiC substrate. Two different analysis types are investigated - modal and transient. Both analyses are 2D only. The goal of the modal analysis is to determine the eigenfrequency of the SAW, which is used in the following transient analysis. In the transient analysis, wave propagation in the SAW sensor is investigated. Both analyses were performed using the FEM code ANSYS.

  15. Impact response analysis of cask for spent fuel by dimensional analysis and mode superposition method

    International Nuclear Information System (INIS)

    Kim, Y. J.; Kim, W. T.; Lee, Y. S.

    2006-01-01

    Full text: Due to the potential for accidents, the transportation safety of radioactive material has become extremely important these days. The most important means of accomplishing safety in the transportation of radioactive material is the integrity of the cask. The cask for spent fuel generally consists of a cask body and two impact limiters. The impact limiters are attached at the upper and lower ends of the cask body. The cask must meet general requirements and test requirements for normal transport conditions and hypothetical accident conditions in accordance with IAEA regulations. Among the test requirements for hypothetical accident conditions, the 9 m drop test, in which the cask is dropped from a height of 9 m onto an unyielding surface to produce maximum damage, is a very important requirement because it can affect the structural soundness of the cask. So far, the impact response analysis for the 9 m drop test has been obtained by the finite element method, with a complex computational procedure. In this study, empirical equations for the impact forces in the 9 m drop test are formulated by dimensional analysis. Using the empirical equations, the characteristics of the materials used for the impact limiters are then analysed. The dynamic impact response of the cask body is also analysed using the mode superposition method, and the analysis method is proposed. The results are validated by comparison with previous experimental results and finite element analysis results. The present method is simpler than the finite element method and can be used to predict the impact response of the cask.
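
    For reference, the mode superposition step takes the standard textbook form below; the paper's empirical impact-force equations, which supply the forcing term F(t), are its own result and are not reproduced here.

        % textbook mode superposition: modal expansion and uncoupled modal equations
        \[
          \mathbf{u}(t) = \sum_{i=1}^{m} \boldsymbol{\phi}_i \, q_i(t), \qquad
          \ddot{q}_i + 2\zeta_i \omega_i \dot{q}_i + \omega_i^2 q_i
            = \frac{\boldsymbol{\phi}_i^{\mathsf T}\,\mathbf{F}(t)}{m_i}, \qquad
          m_i = \boldsymbol{\phi}_i^{\mathsf T}\mathbf{M}\,\boldsymbol{\phi}_i .
        \]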

  16. Titanium Analysis of Ilmenite Bangka by Spectrophotometric Method Using Perhidrol

    International Nuclear Information System (INIS)

    Rusydi-S

    2004-01-01

    Determination of titanium by the spectrophotometric method using perhydrol has been carried out. The purpose of the experiment is to establish the conditions for titanium analysis of ilmenite by the spectrophotometric method. The experimental parameters, i.e. the analysis conditions, include the Ti-perhydrol complex spectrum, the acidity of the complex, the perhydrol concentration, linearity, the influence of anions, the detection limit, interfering elements, application of the method to the 40 SRM standard, and analysis of ilmenite samples. The results of the experiments are: the spectrum of the Ti-perhydrol complex lies at 410 nm, the acidity of the complex is H2SO4 2 M, the perhydrol concentration is 0.24%, the calibration curve is linear at 1-80 ppm, the anions PO4, NO3, Cl and CO3 have no influence up to 2000 ppm, the detection limit is 0.30 ppm, and Mo and V interfere. The method, applied to the 40 SRM standard, shows a Ti content of 1580 ppm against the standard value of 1600 ppm; the Ti content of high-grade ilmenite samples is 31.46% and of low-grade samples 13.55%. (author)

  17. Validation study of core analysis methods for full MOX BWR

    International Nuclear Information System (INIS)

    2013-01-01

    JNES has been developing a technical database used in reviewing the validation of core analysis methods of LWRs on the following occasions: (1) confirming the core safety parameters of the initial core (one-third MOX core) through a full MOX core in the Oma Nuclear Power Plant, which is under construction, (2) licensing high-burnup MOX cores in the future, and (3) reviewing topical reports on core analysis codes for safety design and evaluation. Based on the technical database, JNES will issue a guide for reviewing the core analysis methods used for the safety design and evaluation of LWRs. The database will also be used for validation and improvement of core analysis codes developed by JNES. JNES has progressed with the following projects: (1) improving a Doppler reactivity analysis model in the Monte Carlo calculation code MVP, (2) a sensitivity study of nuclear cross section data on reactivity calculations of experimental cores composed of UO2 and MOX fuel rods, (3) analysis of isotopic composition data for UO2 and MOX fuels, and (4) the guide for reviewing the core analysis codes, and others. (author)

  19. MANNER OF STOCKS SORTING USING CLUSTER ANALYSIS METHODS

    Directory of Open Access Journals (Sweden)

    Jana Halčinová

    2014-06-01

    Full Text Available The aim of the present article is to show the possibility of using methods of cluster analysis in the classification of stocks of finished products. Cluster analysis creates groups (clusters) of finished products according to similarity in demand, i.e. the customer requirements for each product. The manner of sorting stocks of finished products into clusters is described with a practical example. The resulting clusters are incorporated into the draft layout of the distribution warehouse.

  20. Application of computer intensive data analysis methods to the analysis of digital images and spatial data

    DEFF Research Database (Denmark)

    Windfeld, Kristian

    1992-01-01

    Computer-intensive methods for data analysis in a traditional setting have developed rapidly in the last decade. The application and adaptation of some of these methods to the analysis of multivariate digital images and spatial data are explored, evaluated and compared to well established classical... into the projection pursuit is presented. Examples from remote sensing are given. The ACE algorithm for computing non-linear transformations for maximizing correlation is extended and applied to obtain a non-linear transformation that maximizes autocorrelation or 'signal' in a multivariate image... This is a generalization of the minimum/maximum autocorrelation factors (MAF's), which is a linear method. The non-linear method is compared to the linear method when analyzing a multivariate TM image from Greenland. The ACE method is shown to give a more detailed decomposition of the image than the MAF transformation...

  1. Evaluation of sample extraction methods for proteomics analysis of green algae Chlorella vulgaris.

    Science.gov (United States)

    Gao, Yan; Lim, Teck Kwang; Lin, Qingsong; Li, Sam Fong Yau

    2016-05-01

    Many protein extraction methods have been developed for plant proteome analysis but information is limited on the optimal protein extraction method from algae species. This study evaluated four protein extraction methods, i.e. direct lysis buffer method, TCA-acetone method, phenol method, and phenol/TCA-acetone method, using green algae Chlorella vulgaris for proteome analysis. The data presented showed that phenol/TCA-acetone method was superior to the other three tested methods with regards to shotgun proteomics. Proteins identified using shotgun proteomics were validated using sequential window acquisition of all theoretical fragment-ion spectra (SWATH) technique. Additionally, SWATH provides protein quantitation information from different methods and protein abundance using different protein extraction methods was evaluated. These results highlight the importance of green algae protein extraction method for subsequent MS analysis and identification. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Investigating Convergence Patterns for Numerical Methods Using Data Analysis

    Science.gov (United States)

    Gordon, Sheldon P.

    2013-01-01

    The article investigates the patterns that arise in the convergence of numerical methods, particularly those in the errors involved in successive iterations, using data analysis and curve fitting methods. In particular, the results obtained are used to convey a deeper level of understanding of the concepts of linear, quadratic, and cubic…
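
    The kind of data analysis the article describes can be reproduced in a few lines: estimate the order of convergence p from three successive errors via p ≈ log(e_{n+1}/e_n) / log(e_n/e_{n-1}). A sketch using Newton's method on f(x) = x^2 - 2, where the estimates should approach 2:

        import math

        x, root = 3.0, math.sqrt(2.0)
        errors = []
        for _ in range(5):
            x = x - (x * x - 2.0) / (2.0 * x)   # Newton step
            errors.append(abs(x - root))

        # sliding triples of successive errors give the order estimate
        for e0, e1, e2 in zip(errors, errors[1:], errors[2:]):
            p = math.log(e2 / e1) / math.log(e1 / e0)
            print(f"estimated order: {p:.2f}")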

  3. Phase analysis in duplex stainless steel: comparison of EBSD and quantitative metallography methods

    International Nuclear Information System (INIS)

    Michalska, J; Chmiela, B

    2014-01-01

    The purpose of the research was to work out a qualitative and quantitative analysis of the phases in DSS in the as-received state and after thermal aging. SEM observations, EDS analyses and electron backscattered diffraction (EBSD) methods were employed. Quantitative analysis of the phases was performed by two methods: EBSD and classical quantitative metallography. A juxtaposition of different etchants for revealing the microstructure and a brief review of sample preparation methods for EBSD studies are presented. Different ways of sample preparation were tested, and based on these results a detailed methodology of DSS phase analysis was developed, including surface finishing, selective etching methods and image acquisition. The advantages and disadvantages of the applied methods were pointed out, and the accuracy of the phase analysis performed by both methods was compared.

  4. Robust gene selection methods using weighting schemes for microarray data analysis.

    Science.gov (United States)

    Kang, Suyeon; Song, Jongwoo

    2017-09-02

    A common task in microarray data analysis is to identify informative genes that are differentially expressed between two different states. Owing to the high-dimensional nature of microarray data, identification of significant genes has been essential in analyzing the data. However, the performances of many gene selection techniques are highly dependent on the experimental conditions, such as the presence of measurement error or a limited number of sample replicates. We have proposed new filter-based gene selection techniques, by applying a simple modification to significance analysis of microarrays (SAM). To prove the effectiveness of the proposed method, we considered a series of synthetic datasets with different noise levels and sample sizes along with two real datasets. The following findings were made. First, our proposed methods outperform conventional methods for all simulation set-ups. In particular, our methods are much better when the given data are noisy and sample size is small. They showed relatively robust performance regardless of noise level and sample size, whereas the performance of SAM became significantly worse as the noise level became high or sample size decreased. When sufficient sample replicates were available, SAM and our methods showed similar performance. Finally, our proposed methods are competitive with traditional methods in classification tasks for microarrays. The results of simulation study and real data analysis have demonstrated that our proposed methods are effective for detecting significant genes and classification tasks, especially when the given data are noisy or have few sample replicates. By employing weighting schemes, we can obtain robust and reliable results for microarray data analysis.
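
    A minimal sketch of the SAM-style statistic that such filter methods build on: a t-like score whose denominator is inflated by a small constant s0 so that genes with tiny variances are not over-ranked. The authors' specific weighting schemes are not reproduced; the data below are synthetic.

        import numpy as np

        def sam_stat(g1, g2, s0=0.1):
            # pooled standard error with a small "fudge" constant s0
            n1, n2 = g1.shape[1], g2.shape[1]
            pooled = (g1.var(axis=1, ddof=1) * (n1 - 1) +
                      g2.var(axis=1, ddof=1) * (n2 - 1)) / (n1 + n2 - 2)
            s = np.sqrt(pooled * (1.0 / n1 + 1.0 / n2))
            return (g2.mean(axis=1) - g1.mean(axis=1)) / (s + s0)

        rng = np.random.default_rng(0)
        expr1 = rng.normal(size=(1000, 4))      # 1000 genes, 4 replicates
        expr2 = rng.normal(size=(1000, 4))
        expr2[:50] += 2.0                       # 50 truly up-regulated genes
        d = sam_stat(expr1, expr2)
        print("top-ranked genes:", np.argsort(-d)[:10])  # largest scores first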

  5. Image segmentation and particles classification using texture analysis method

    Directory of Open Access Journals (Sweden)

    Mayar Aly Atteya

    Full Text Available Introduction: Ingredients of oily fish include a large amount of polyunsaturated fatty acids, which are important elements in various metabolic processes in humans and have also been used to prevent diseases. However, in an attempt to reduce cost, recent developments are starting to replace fish-oil ingredients with products of microalgae, which also produce polyunsaturated fatty acids. To do so, it is important to closely monitor morphological changes in algae cells and to monitor their age in order to achieve the best results. This paper aims to describe an advanced vision-based system to automatically detect, classify, and track organic cells using a recently developed SOPAT system (Smart On-line Particle Analysis Technology), a photo-optical image acquisition device combined with innovative image analysis software. Methods: The proposed method includes image de-noising, binarization and enhancement, as well as object recognition, localization and classification based on the analysis of particle size and texture. Results: The method allowed cell size to be computed correctly for each particle separately. By computing an area histogram for the input images (1 h, 18 h, and 42 h), the variation could be observed, showing a clear increase in cell size. Conclusion: The proposed method allows algae particles to be correctly identified with accuracies up to 99% and classified correctly with accuracies up to 100%.

  6. The Constant Comparative Analysis Method Outside of Grounded Theory

    Science.gov (United States)

    Fram, Sheila M.

    2013-01-01

    This commentary addresses the gap in the literature regarding discussion of the legitimate use of Constant Comparative Analysis Method (CCA) outside of Grounded Theory. The purpose is to show the strength of using CCA to maintain the emic perspective and how theoretical frameworks can maintain the etic perspective throughout the analysis. My…

  7. Regional frequency analysis of extreme rainfalls using partial L moments method

    Science.gov (United States)

    Zakaria, Zahrahtul Amani; Shabri, Ani

    2013-07-01

    An approach based on regional frequency analysis using L moments and LH moments is revisited in this study. Subsequently, an alternative regional frequency analysis using the partial L moments (PL moments) method is employed, and a new relationship for homogeneity analysis is developed. The results were then compared with those obtained using the methods of L moments and LH moments of order two. The Selangor catchment, consisting of 37 sites and located on the west coast of Peninsular Malaysia, is chosen as a case study. PL moments for the generalized extreme value (GEV), generalized logistic (GLO), and generalized Pareto distributions were derived and used to develop the regional frequency analysis procedure. The PL moment ratio diagram and the Z test were employed in determining the best-fit distribution. Comparison between the three approaches showed that the GLO and GEV distributions were identified as suitable for representing the statistical properties of extreme rainfall in Selangor. Monte Carlo simulation used for performance evaluation shows that the method of PL moments outperforms the L and LH moments methods for the estimation of large-return-period events.
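
    For orientation, the sketch below computes ordinary sample L-moments from probability-weighted moments; the PL-moments variant additionally censors the lower part of the sample and is defined in the paper, so it is not reproduced here. The annual-maximum rainfall values are invented.

        import numpy as np

        def sample_lmoments(x):
            """First three sample L-moments and the L-skewness ratio t3."""
            x = np.sort(np.asarray(x, float))
            n = x.size
            j = np.arange(1, n + 1)
            b0 = x.mean()                                   # probability-weighted
            b1 = np.sum((j - 1) * x) / (n * (n - 1))        # moments b0, b1, b2
            b2 = np.sum((j - 1) * (j - 2) * x) / (n * (n - 1) * (n - 2))
            l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
            return l1, l2, l3 / l2

        annual_max = [87, 103, 61, 140, 95, 122, 78, 150, 99, 110]  # mm, invented
        print(sample_lmoments(annual_max))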

  8. Analysis of investment appeal of the industrial enterprise by eigenstate method

    Directory of Open Access Journals (Sweden)

    Buslaeva O.S.

    2017-01-01

    Full Text Available An analysis of enterprise performance is considered. The problem is solved based on the analysis of the basic indicators of the functioning of the enterprise in terms of improving economic stability, and also on the development of an autoregulation mechanism for the economic stability of the enterprise. The eigenstate method is proposed for the analysis of the basic indicators of the enterprise, as it allows the construction of an economic stability model of the enterprise. A methodology for the economic stability analysis of an enterprise on the basis of the eigenstate method is described. The formulas for calculating the complex indicator of economic stability are given. The effectiveness of the methodology is demonstrated with the example of the economic stability analysis of a large trading company.

  9. Dependent data in social sciences research forms, issues, and methods of analysis

    CERN Document Server

    Eye, Alexander; Wiedermann, Wolfgang

    2015-01-01

    This volume presents contributions on handling data in which the postulate of independence in the data matrix is violated. When this postulate is violated and the methods assuming independence are still applied, the estimated parameters are likely to be biased, and statistical decisions are very likely to be incorrect. Problems associated with dependence in data have been known for a long time, and have led to the development of tailored methods for the analysis of dependent data in various areas of statistical analysis. These methods include, for example, methods for the analysis of longitudinal data, corrections for dependency, and corrections for degrees of freedom. This volume contains the following five sections: growth curve modeling, directional dependence, dyadic data modeling, item response modeling (IRT), and other methods for the analysis of dependent data (e.g., approaches for modeling cross-section dependence, multidimensional scaling techniques, and mixed models). Researchers and graduate students...

  10. Deterministic methods for sensitivity and uncertainty analysis in large-scale computer models

    International Nuclear Information System (INIS)

    Worley, B.A.; Oblow, E.M.; Pin, F.G.; Maerker, R.E.; Horwedel, J.E.; Wright, R.Q.; Lucius, J.L.

    1987-01-01

    The fields of sensitivity and uncertainty analysis are dominated by statistical techniques when large-scale modeling codes are being analyzed. This paper reports on the development and availability of two systems, GRESS and ADGEN, that make use of computer calculus compilers to automate the implementation of deterministic sensitivity analysis capability into existing computer models. This automation removes the traditional limitation of deterministic sensitivity methods. The paper describes a deterministic uncertainty analysis method (DUA) that uses derivative information as a basis to propagate parameter probability distributions to obtain result probability distributions. The paper demonstrates the deterministic approach to sensitivity and uncertainty analysis as applied to a sample problem that models the flow of water through a borehole. The sample problem is used as a basis to compare the cumulative distribution function of the flow rate as calculated by the standard statistical methods and the DUA method. The DUA method gives a more accurate result based upon only two model executions compared to fifty executions in the statistical case
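
    The idea behind the DUA step can be sketched without the GRESS/ADGEN machinery: propagate parameter uncertainties through the model to first order using derivatives. Here the derivatives come from central finite differences on a generic stand-in model(); the actual systems obtain them by automated differentiation of the model source code, and the parameter values below are invented.

        import numpy as np

        def model(p):
            # generic smooth stand-in for the real simulation model
            return p[0] * np.exp(-p[1]) + p[2] ** 2

        p0     = np.array([2.0, 0.5, 1.0])    # nominal parameter values (invented)
        sigmas = np.array([0.2, 0.05, 0.1])   # parameter standard deviations

        grad = np.empty_like(p0)
        for i in range(p0.size):
            h = 1e-6 * max(abs(p0[i]), 1.0)   # central finite-difference step
            pp, pm = p0.copy(), p0.copy()
            pp[i] += h
            pm[i] -= h
            grad[i] = (model(pp) - model(pm)) / (2 * h)

        # first-order variance propagation, independent parameters assumed
        sigma_y = np.sqrt(np.sum((grad * sigmas) ** 2))
        print(f"y = {model(p0):.4f} +/- {sigma_y:.4f}")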

  11. Analysis of Piezoelectric Solids using Finite Element Method

    Science.gov (United States)

    Aslam, Mohammed; Nagarajan, Praveen; Remanan, Mini

    2018-03-01

    Piezoelectric materials are extensively used in smart structures as sensors and actuators. In this paper, static analysis of three piezoelectric solids is performed using the general-purpose finite element software Abaqus. The simulation results from Abaqus are compared with the results obtained using numerical methods like the Boundary Element Method (BEM) and the meshless point collocation method (PCM). The BEM and PCM are cumbersome for complex shapes and complicated boundary conditions. This paper shows that the software Abaqus can be used to solve the governing equations of piezoelectric solids in a much simpler and faster way than the BEM and PCM.

  12. Recent characterization of steel by surface analysis methods

    International Nuclear Information System (INIS)

    Suzuki, Shigeru

    1996-01-01

    Surface analysis methods, such as Auger electron spectroscopy, X-ray photoelectron spectroscopy, secondary ion mass spectrometry, glow discharge optical emission spectrometry and so on, have become indispensable for characterizing the surfaces and interfaces of many kinds of steel. Although a number of studies on the characterization of steel by these methods have been carried out, several problems still remain in quantification and depth profiling. Nevertheless, the methods have provided essential information on the concentration and chemical state of elements at surfaces and interfaces. Recent results on the characterization of oxide layers, coated films, etc. on steel surfaces are reviewed here. (author). 99 refs.

  13. Economic analysis of alternative LLW disposal methods

    International Nuclear Information System (INIS)

    Foutes, C.E.; Queenan, C.J. III

    1987-01-01

    The Environmental Protection Agency (EPA) has evaluated the costs and benefits of alternative disposal technologies as part of its program to develop generally applicable environmental standards for the land disposal of low-level radioactive waste (LLW). Costs, population health effects and Critical Population Group (CPG) exposures resulting from alternative waste treatment and disposal methods were evaluated both in absolute terms and also relative to a base case (current practice). Incremental costs of the standard included costs for packaging, processing, transportation, and burial of waste. Benefits are defined in terms of reductions in the general population health risk (expected fatal cancers and genetic effects) evaluated over 10,000 years. A cost-effectiveness ratio, defined as the incremental cost per avoided health effect, was calculated for each alternative standard. The cost-effectiveness analysis took into account a number of waste streams, hydrogeologic and climatic region settings, and waste treatment and disposal methods. This paper describes the alternatives considered and preliminary results of the cost-effectiveness analysis. 15 references, 7 figures, 3 tables

  14. Applications of modern statistical methods to analysis of data in physical science

    Science.gov (United States)

    Wicker, James Eric

    Modern methods of statistical and computational analysis offer solutions to dilemmas confronting researchers in physical science. Although the ideas behind modern statistical and computational analysis methods were originally introduced in the 1970's, most scientists still rely on methods written during the early era of computing. These researchers, who analyze increasingly voluminous and multivariate data sets, need modern analysis methods to extract the best results from their studies. The first section of this work showcases applications of modern linear regression. Since the 1960's, many researchers in spectroscopy have used classical stepwise regression techniques to derive molecular constants. However, problems with thresholds of entry and exit for model variables plague this analysis method. Other criticisms of this kind of stepwise procedure include its inefficient searching method, the order in which variables enter or leave the model, and problems with overfitting data. We implement an information scoring technique that overcomes the assumptions inherent in the stepwise regression process to calculate molecular model parameters. We believe that this kind of information-based model evaluation can be applied to more general analysis situations in physical science. The second section proposes new methods of multivariate cluster analysis. The K-means algorithm and the EM algorithm, introduced in the 1960's and 1970's respectively, formed the basis of multivariate cluster analysis methodology for many years. However, shortcomings of these methods include strong dependence on initial seed values and inaccurate results when the data seriously depart from hypersphericity. We propose new cluster analysis methods based on genetic algorithms that overcome the strong dependence on initial seed values. In addition, we propose a generalization of the Genetic K-means algorithm which can accurately identify clusters with complex hyperellipsoidal covariance

  15. Synthesis of Enterprise and Value-Based Methods for Multiattribute Risk Analysis

    International Nuclear Information System (INIS)

    Kenley, C. Robert; Collins, John W.; Beck, John M.; Heydt, Harold J.; Garcia, Chad B.

    2001-01-01

    This paper describes a method for performing multiattribute decision analysis to prioritize approaches to handling risks during the development and operation of complex socio-technical systems. The method combines risk categorization based on enterprise views, risk prioritization of the categories based on the Analytic Hierarchy Process (AHP), and more standard probability-consequence rating schemes. We also apply value-based testing methods used in software development to prioritize risk-handling approaches. We describe a tool that synthesizes the methods and performs a multiattribute analysis of the technical and programmatic risks on the Next Generation Nuclear Plant (NGNP) enterprise.
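
    As a hedged illustration of the AHP step, the sketch below derives priority weights for three invented risk categories from a pairwise-comparison matrix via its principal eigenvector, together with Saaty's consistency index; it is not the tool described in the paper.

    ```python
    # Sketch: AHP priority weights from a pairwise-comparison matrix.
    import numpy as np

    A = np.array([
        [1.0, 3.0, 5.0],      # invented comparisons of three risk categories
        [1/3, 1.0, 2.0],
        [1/5, 1/2, 1.0],
    ])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)                    # principal eigenpair
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                   # normalized priority weights

    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)           # Saaty's consistency index
    print("weights:", w.round(3), " CI:", round(ci, 4))
    ```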

  16. Comparative analysis of clustering methods for gene expression time course data

    Directory of Open Access Journals (Sweden)

    Ivan G. Costa

    2004-01-01

    Full Text Available This work performs a data driven comparative study of clustering methods used in the analysis of gene expression time courses (or time series. Five clustering methods found in the literature of gene expression analysis are compared: agglomerative hierarchical clustering, CLICK, dynamical clustering, k-means and self-organizing maps. In order to evaluate the methods, a k-fold cross-validation procedure adapted to unsupervised methods is applied. The accuracy of the results is assessed by the comparison of the partitions obtained in these experiments with gene annotation, such as protein function and series classification.

  17. Numerical methods design, analysis, and computer implementation of algorithms

    CERN Document Server

    Greenbaum, Anne

    2012-01-01

    Numerical Methods provides a clear and concise exploration of standard numerical analysis topics, as well as nontraditional ones, including mathematical modeling, Monte Carlo methods, Markov chains, and fractals. Filled with appealing examples that will motivate students, the textbook considers modern application areas, such as information retrieval and animation, and classical topics from physics and engineering. Exercises use MATLAB and promote understanding of computational results. The book gives instructors the flexibility to emphasize different aspects--design, analysis, or computer implementation--of numerical algorithms, depending on the background and interests of students. Designed for upper-division undergraduates in mathematics or computer science classes, the textbook assumes that students have prior knowledge of linear algebra and calculus, although these topics are reviewed in the text. Short discussions of the history of numerical methods are interspersed throughout the chapters. The book a...

  18. An Intelligent Method for Structural Reliability Analysis Based on Response Surface

    Institute of Scientific and Technical Information of China (English)

    桂劲松; 刘红; 康海贵

    2004-01-01

    As water depth increases, the structural safety and reliability of a system become more important and more challenging to assess, so structural reliability methods must be applied in ocean engineering design, such as offshore platform design. If the performance function is known explicitly in a structural reliability analysis, the first-order second-moment method is often used. If the performance function cannot be expressed explicitly, the response surface method is generally used, because it is conceptually clear and simple to program. However, the traditional response surface method fits a quadratic polynomial, which cannot resolve the accuracy problem because the true limit state surface is fitted well only in the area near the checking point. In this paper, an intelligent computing method based on a whole-area response surface is proposed for situations where the performance function cannot be expressed explicitly in structural reliability analysis. In this method, a fuzzy-neural-network response surface for the whole area is constructed first, and the structural reliability is then calculated by a genetic algorithm. Because all the sample points for training the network come from the whole area, the true limit state surface can be fitted over the whole area. Calculated examples and comparative analysis show that the proposed method performs much better than the traditional quadratic-polynomial response surface method: the amount of finite element computation is largely reduced, the accuracy of calculation is improved, and the true limit state surface is fitted well over the whole area. The method proposed in this paper is therefore suitable for engineering application.
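
    For contrast with the proposed whole-area method, the following sketch implements the traditional baseline it improves upon: fit a quadratic response surface to samples of a performance function (an invented stand-in for an expensive finite element model) and estimate the failure probability by Monte Carlo on the cheap surrogate.

    ```python
    # Sketch: quadratic response surface + Monte Carlo failure probability.
    import numpy as np

    def g_true(x):                        # invented stand-in for an FE analysis
        return 5.0 - x[:, 0] ** 2 - 0.5 * x[:, 1]

    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 2))         # design samples around the mean point
    y = g_true(X)

    # Quadratic basis without cross terms: [1, x1, x2, x1^2, x2^2]
    basis = lambda Z: np.column_stack([np.ones(len(Z)), Z, Z ** 2])
    coef, *_ = np.linalg.lstsq(basis(X), y, rcond=None)

    # Monte Carlo on the surrogate: Pf = P(g_hat <= 0)
    Xmc = rng.normal(size=(200_000, 2))
    print("Pf ~", (basis(Xmc) @ coef <= 0).mean())
    ```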

  19. The Impact of Normalization Methods on RNA-Seq Data Analysis

    Science.gov (United States)

    Zyprych-Walczak, J.; Szabelska, A.; Handschuh, L.; Górczak, K.; Klamecka, K.; Figlerowicz, M.; Siatkowski, I.

    2015-01-01

    High-throughput sequencing technologies, such as the Illumina HiSeq, are powerful new tools for investigating a wide range of biological and medical problems. The massive and complex data sets produced by the sequencers create a need for the development of statistical and computational methods that can tackle the analysis and management of the data. Data normalization is one of the most crucial steps of data processing, and it must be considered carefully as it has a profound effect on the results of the analysis. In this work, we focus on a comprehensive comparison of five normalization methods related to sequencing depth, widely used for transcriptome sequencing (RNA-seq) data, and their impact on the results of gene expression analysis. Based on this study, we suggest a universal workflow that can be applied for the selection of the optimal normalization procedure for any particular data set. The described workflow includes calculation of the bias and variance values for the control genes, sensitivity and specificity of the methods, and classification errors, as well as generation of diagnostic plots. Combining the above information facilitates the selection of the most appropriate normalization method for the studied data sets and determines which methods can be used interchangeably. PMID:26176014
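
    As one concrete example of a sequencing-depth normalization, the sketch below computes DESeq-style median-of-ratios size factors on an invented count table; the five methods compared by the paper are not named in this record, so this is purely illustrative.

    ```python
    # Sketch: DESeq-style median-of-ratios size factors (invented counts).
    import numpy as np

    counts = np.array([          # genes x samples; zero-count genes excluded
        [100, 200,  90],
        [ 50, 110,  40],
        [300, 590, 310],
        [ 10,  25,   8],
    ], dtype=float)

    log_geo_mean = np.log(counts).mean(axis=1)        # per-gene reference
    ratios = np.log(counts) - log_geo_mean[:, None]
    size_factors = np.exp(np.median(ratios, axis=0))  # per-sample factors
    normalized = counts / size_factors
    print("size factors:", size_factors.round(3))
    ```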

  20. Strength Analysis on Ship Ladder Using Finite Element Method

    Science.gov (United States)

    Budianto; Wahyudi, M. T.; Dinata, U.; Ruddianto; Eko P., M. M.

    2018-01-01

    In designing a ship's structure, the applicable classification rules must be followed. In this case, the design of a ladder (staircase) on a ferry ship must be reviewed against the loads arising during ship operations, both at sea and in port. The classification rules refer to the calculation of the structural components, which can be analysed using the finite element method. The classification rules used in the design of the ferry ship were those of BKI (Bureau of Classification Indonesia), so the material composition and mechanical properties must follow the classification of the vessel. The structural analysis was performed with a finite-element-based structural analysis package. The analysis showed that the ladder structure can withstand a 140 kg load under static, dynamic, and impact conditions. The resulting safety factors indicate that the structure is safe without the strength of the structure being excessive.

  1. Direct method of analysis of an isotropic rectangular plate

    African Journals Online (AJOL)

    eobe

    This work evaluates the static analysis of an isotropic rectangular plate with various ... The energy method according to Ritz is used to obtain the total potential energy of the plate by employing the ... for rectangular plate analysis, as the behavior of the ... results obtained by previous research work that used ...

  2. A well test analysis method accounting for pre-test operations

    International Nuclear Information System (INIS)

    Silin, D.B.; Tsang, C.-F.

    2003-01-01

    We propose to use regular monitoring data from a production or injection well for estimating the formation hydraulic properties in the vicinity of the wellbore without interrupting operations. In our approach, we select a portion of the pumping data over a certain time interval and then derive our conclusions from analysis of these data. A distinctive feature that sets the proposed approach apart from conventional methods is the introduction of an additional parameter, an effective pre-test pumping rate. This additional parameter is derived from a rigorous asymptotic analysis of the flow model. In this way, we account for the non-uniform pressure distribution at the beginning of the testing time interval caused by pre-test operations at the well. With synthetic and field examples, we demonstrate that the deviation of the matching curve from the data, which is usually attributed to skin and wellbore storage effects, can also be interpreted through this new parameter. Moreover, with our method the data curve is matched equally well and the results of the analysis remain stable when the analyzed data interval is perturbed, whereas traditional methods are sensitive to the choice of the data interval. A special efficient minimization procedure has been developed for finding the best-fitting parameters. We augment the analysis with a procedure for estimating the ambient reservoir pressure and the dimensionless wellbore radius. The methods reported here have been implemented in the code ODA (Operations Data Analysis). A beta version of the code is available for free testing and evaluation to interested parties

  3. Reliability-Based Stability Analysis of Rock Slopes Using Numerical Analysis and Response Surface Method

    Science.gov (United States)

    Dadashzadeh, N.; Duzgun, H. S. B.; Yesiloglu-Gultekin, N.

    2017-08-01

    While advanced numerical techniques in slope stability analysis are successfully used in deterministic studies, they have so far found limited use in probabilistic analyses due to their high computation cost. The first-order reliability method (FORM) is one of the most efficient probabilistic techniques to perform probabilistic stability analysis by considering the associated uncertainties in the analysis parameters. However, it is not possible to directly use FORM in numerical slope stability evaluations as it requires definition of a limit state performance function. In this study, an integrated methodology for probabilistic numerical modeling of rock slope stability is proposed. The methodology is based on response surface method, where FORM is used to develop an explicit performance function from the results of numerical simulations. The implementation of the proposed methodology is performed by considering a large potential rock wedge in Sumela Monastery, Turkey. The accuracy of the developed performance function to truly represent the limit state surface is evaluated by monitoring the slope behavior. The calculated probability of failure is compared with Monte Carlo simulation (MCS) method. The proposed methodology is found to be 72% more efficient than MCS, while the accuracy is decreased with an error of 24%.

  4. Comparative Analysis Of Dempster Shafer Method With Certainty Factor Method For Diagnose Stroke Diseases

    Directory of Open Access Journals (Sweden)

    Erwin Kuit Panggabean

    2018-02-01

    Full Text Available Advances in artificial intelligence technology have allowed expert systems to be applied to disease detection. One application concerns a disease that has recently caused much concern in Indonesian society, namely stroke. The expert system methods used are Dempster-Shafer and certainty factors, and the two methods are compared for the diagnosis of stroke. Based on the analysis, the certainty factor method is found to be better than Dempster-Shafer and more accurate in handling the knowledge representation of stroke according to the disease symptoms obtained from a hospital in Medan, reflecting the distinct algorithms underlying the two methods.

  5. Multiobjective flux balancing using the NISE method for metabolic network analysis.

    Science.gov (United States)

    Oh, Young-Gyun; Lee, Dong-Yup; Lee, Sang Yup; Park, Sunwon

    2009-01-01

    Flux balance analysis (FBA) is well acknowledged as an analysis tool for metabolic networks in the framework of metabolic engineering. However, FBA is limited in solving multiobjective optimization problems that consider multiple conflicting objectives. In this study, we propose a novel multiobjective flux balance analysis method that adapts the noninferior set estimation (NISE) method (Solanki et al., 1993) for multiobjective linear programming (MOLP) problems. The NISE method can generate an approximation of the Pareto curve for conflicting objectives without redundant iterations of single-objective optimization. Furthermore, the flux distributions at each Pareto optimal solution can be obtained for understanding the internal flux changes in the metabolic network. The functionality of this approach is shown by applying it to a genome-scale in silico model of E. coli. Multiple objectives for poly(3-hydroxybutyrate) [P(3HB)] production are considered simultaneously, and relationships among them are identified. The Pareto curve for maximizing succinic acid production vs. maximizing biomass production is used for the in silico analysis of various combinatorial knockout strains. The proposed method accelerates strain improvement in metabolic engineering by reducing the computation time for obtaining the Pareto curve and the analysis time for the flux distribution at each Pareto optimal solution. (c) 2009 American Institute of Chemical Engineers Biotechnol. Prog., 2009.
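
    The sketch below traces a small Pareto front for two conflicting linear objectives with a fixed weighted-sum sweep; NISE itself picks weights adaptively from the normals of the current noninferior set, and the toy LP here is not a metabolic model.

    ```python
    # Sketch: weighted-sum sweep over two conflicting linear objectives.
    import numpy as np
    from scipy.optimize import linprog

    # maximize f1 = x1 and f2 = x2  s.t.  x1 + 2*x2 <= 10,  3*x1 + x2 <= 15
    A_ub, b_ub = [[1, 2], [3, 1]], [10, 15]

    for w in np.linspace(0.0, 1.0, 6):
        # linprog minimizes, so negate the weighted objective
        res = linprog(c=[-w, -(1 - w)], A_ub=A_ub, b_ub=b_ub,
                      bounds=[(0, None)] * 2)
        print(f"w={w:.1f}: f1={res.x[0]:.2f}, f2={res.x[1]:.2f}")
    ```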

  6. Methods in carbon K-edge NEXAFS: Experiment and analysis

    International Nuclear Information System (INIS)

    Watts, B.; Thomsen, L.; Dastoor, P.C.

    2006-01-01

    Near-edge X-ray absorption spectroscopy (NEXAFS) is widely used to probe the chemistry and structure of surface layers. Moreover, using ultra-high brilliance polarised synchrotron light sources, it is possible to determine the molecular alignment of ultra-thin surface films. However, the quantitative analysis of NEXAFS data is complicated by many experimental factors and, historically, the essential methods of calibration, normalisation and artefact removal are presented in the literature in a somewhat fragmented manner, thus hindering their integrated implementation as well as their further development. This paper outlines a unified, systematic approach to the collection and quantitative analysis of NEXAFS data with a particular focus upon carbon K-edge spectra. As a consequence, we show that current methods neglect several important aspects of the data analysis process, which we address with a combination of novel and adapted techniques. We discuss multiple approaches in solving the issues commonly encountered in the analysis of NEXAFS data, revealing the inherent assumptions of each approach and providing guidelines for assessing their appropriateness in a broad range of experimental situations

  7. Study on color difference estimation method of medicine biochemical analysis

    Science.gov (United States)

    Wang, Chunhong; Zhou, Yue; Zhao, Hongxia; Sun, Jiashi; Zhou, Fengkun

    2006-01-01

    Biochemical analysis is an important inspection and diagnosis method in hospital clinics, and the biochemical analysis of urine is one important item. Urine test paper shows a corresponding color for each detection target and each illness degree. The color difference between a standard threshold and the test-paper color can be used to judge the illness degree, so that further analysis and diagnosis can be made. Color is a three-dimensional psychophysical variable, whereas reflectance is a one-dimensional variable; a color-difference estimation method for urine tests can therefore achieve better precision and convenience than the conventional test method based on one-dimensional reflectance, and can support an accurate diagnosis. A digital camera can easily capture an image of the urine test paper, making the urine biochemical analysis convenient. In the experiment, color images of urine test paper were taken with a popular color digital camera and saved on a computer on which a simple color space conversion (RGB -> XYZ -> L*a*b*) and the calculation software were installed. Test samples are graded according to intelligent detection of quantitative color. Because the images from every test are saved in the computer, the whole course of an illness can be monitored. This method can also be used in other medical biochemical analyses that involve color. Experimental results show that the test method is quick and accurate; it can be used in hospitals, calibration organizations and homes, so its application prospects are extensive.
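
    A minimal sketch of the color pipeline described above (RGB -> XYZ -> L*a*b*, followed by a CIE76 color difference), using the standard sRGB/D65 constants; the test-paper and standard colors are invented.

    ```python
    # Sketch: RGB -> XYZ -> L*a*b* conversion and a CIE76 color difference.
    import numpy as np

    M = np.array([[0.4124, 0.3576, 0.1805],    # sRGB -> XYZ (D65)
                  [0.2126, 0.7152, 0.0722],
                  [0.0193, 0.1192, 0.9505]])
    WHITE = np.array([0.95047, 1.0, 1.08883])  # D65 reference white

    def rgb_to_lab(rgb):
        rgb = np.asarray(rgb, float) / 255.0
        rgb = np.where(rgb > 0.04045, ((rgb + 0.055) / 1.055) ** 2.4, rgb / 12.92)
        xyz = M @ rgb / WHITE
        f = np.where(xyz > (6 / 29) ** 3,
                     np.cbrt(xyz), xyz / (3 * (6 / 29) ** 2) + 4 / 29)
        return np.array([116 * f[1] - 16,            # L*
                         500 * (f[0] - f[1]),        # a*
                         200 * (f[1] - f[2])])       # b*

    test_patch = rgb_to_lab([180, 150, 60])   # hypothetical test-paper pixel
    standard = rgb_to_lab([200, 160, 50])     # hypothetical standard color
    print("delta E =", round(float(np.linalg.norm(test_patch - standard)), 2))
    ```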

  8. Advanced Cellular and Biomolecular Imaging at Lehigh University, (PA) Final Scientific/Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Cassimeris, Lynne, U.

    2010-09-10

    Lehigh University is establishing an interdisciplinary program in high resolution cellular and subcellular biological imaging for a range of applications including improved cancer detection. The completed DOE project added to Lehigh's bio-imaging infrastructure through acquisition of a new confocal microscope system as well as upgrades to two pieces of existing equipment. Bio-imaging related research at Lehigh was also supported through two seed grants for initiation of new projects.

  9. Implementation of the method of quantitative analysis by proton induced X-ray analysis and application to the analysis of aerosols

    International Nuclear Information System (INIS)

    Margulis, W.

    1977-09-01

    Fundamental aspects for the implementation of the method of quantitative analysis by proton induced X-ray spectroscopy are discussed. The calibration of the system was made by determining a response coefficient for selected elements, both by irradiating known amounts of these elements as well as by the use of theoretical and experimental parameters. The results obtained by these two methods agree within 5% for the analysed elements. A computer based technique of spectrum decomposition was developed to facilitate routine analysis. Finally, aerosol samples were measured as an example of a possible application of the method, and the results are discussed. (Author) [pt

  10. A general first-order global sensitivity analysis method

    International Nuclear Information System (INIS)

    Xu Chonggang; Gertner, George Zdzislaw

    2008-01-01

    Fourier amplitude sensitivity test (FAST) is one of the most popular global sensitivity analysis techniques. The main mechanism of FAST is to assign each parameter with a characteristic frequency through a search function. Then, for a specific parameter, the variance contribution can be singled out of the model output by the characteristic frequency. Although FAST has been widely applied, there are two limitations: (1) the aliasing effect among parameters by using integer characteristic frequencies and (2) the suitability for only models with independent parameters. In this paper, we synthesize the improvement to overcome the aliasing effect limitation [Tarantola S, Gatelli D, Mara TA. Random balance designs for the estimation of first order global sensitivity indices. Reliab Eng Syst Safety 2006; 91(6):717-27] and the improvement to overcome the independence limitation [Xu C, Gertner G. Extending a global sensitivity analysis technique to models with correlated parameters. Comput Stat Data Anal 2007, accepted for publication]. In this way, FAST can be a general first-order global sensitivity analysis method for linear/nonlinear models with as many correlated/uncorrelated parameters as the user specifies. We apply the general FAST to four test cases with correlated parameters. The results show that the sensitivity indices derived by the general FAST are in good agreement with the sensitivity indices derived by the correlation ratio method, which is a non-parametric method for models with correlated parameters
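
    A bare-bones classical FAST sketch for first-order indices of a toy model with independent uniform inputs; the random-balance and correlated-parameter extensions the paper synthesizes are not reproduced, and the frequency set is just a small interference-free example.

    ```python
    # Sketch: classical FAST first-order indices for a toy model.
    import numpy as np

    def fast_first_order(model, omegas, n=10_001, M=4):
        s = np.pi * (2 * np.arange(1, n + 1) - n - 1) / n      # s in (-pi, pi)
        # Search curve mapping s to uniform(0,1) samples for each parameter
        X = 0.5 + np.arcsin(np.sin(np.outer(s, omegas))) / np.pi
        y = model(X)
        harmonics = np.arange(1, n // 2)
        A = np.array([np.mean(y * np.cos(p * s)) for p in harmonics])
        B = np.array([np.mean(y * np.sin(p * s)) for p in harmonics])
        V = 2 * np.sum(A**2 + B**2)                            # total variance
        S = []
        for w in omegas:                                       # per-parameter share
            Vi = 2 * sum(A[p * w - 1] ** 2 + B[p * w - 1] ** 2
                         for p in range(1, M + 1))
            S.append(Vi / V)
        return np.array(S)

    model = lambda X: X[:, 0] + 0.5 * X[:, 1] ** 2 + 0.1 * X[:, 2]
    print(fast_first_order(model, omegas=np.array([11, 35, 115])))
    ```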

  11. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system structural components

    Science.gov (United States)

    Cruse, T. A.

    1987-01-01

    The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.

  12. Probabilistic Structural Analysis Methods for select space propulsion system structural components (PSAM)

    Science.gov (United States)

    Cruse, T. A.; Burnside, O. H.; Wu, Y.-T.; Polch, E. Z.; Dias, J. B.

    1988-01-01

    The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.

  13. Research on reactor physics analysis method based on Monte Carlo homogenization

    International Nuclear Information System (INIS)

    Ye Zhimin; Zhang Peng

    2014-01-01

    In order to meet the demands of the nuclear energy market in the future, many new concepts for nuclear energy systems have been put forward. The traditional deterministic neutronics analysis methods have been challenged in two respects: the ability to process generic geometries, and the multi-spectrum applicability of multigroup cross section libraries. Due to its strong geometry modeling capability and its use of continuous-energy cross section libraries, the Monte Carlo method has been widely used in reactor physics calculations, and more and more research on the Monte Carlo method has been carried out. Neutronics-thermal hydraulics coupling analysis based on the Monte Carlo method has been realized. However, it still faces the problems of long computation times and slow convergence, which make it inapplicable to reactor core fuel management simulations. Drawing on the deterministic core analysis method, a new two-step core analysis scheme is proposed in this work. First, Monte Carlo simulations are performed for each assembly, and the assembly-homogenized multigroup cross sections are tallied at the same time. Second, the core diffusion calculations are done with these multigroup cross sections. The new scheme achieves high efficiency while maintaining acceptable precision, so it can be used as an effective tool for the design and analysis of innovative nuclear energy systems. Numerical tests have been carried out in this work to verify the new scheme. (authors)
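
    A minimal sketch of the second (deterministic) step of such a two-step scheme: a 1-D, one-group diffusion eigenvalue solve by power iteration, using homogenized constants of the kind that Monte Carlo assembly tallies would provide. All constants here are invented placeholders.

    ```python
    # Sketch: 1-D one-group diffusion eigenvalue solve (power iteration).
    import numpy as np

    D, siga, nusigf = 1.2, 0.03, 0.035   # invented homogenized constants
    L, n = 100.0, 200                    # slab width (cm), mesh cells
    h = L / n

    # Finite-difference loss operator, zero-flux boundaries:
    #   -D * phi'' + siga * phi = (1/k) * nusigf * phi
    A = (np.diag(np.full(n, 2 * D / h**2 + siga))
         + np.diag(np.full(n - 1, -D / h**2), 1)
         + np.diag(np.full(n - 1, -D / h**2), -1))

    phi, k = np.ones(n), 1.0
    for _ in range(200):                 # power iteration on the fission source
        fission = nusigf * phi
        phi_new = np.linalg.solve(A, fission / k)
        k *= (nusigf * phi_new).sum() / fission.sum()
        phi = phi_new
    print("k_eff ~", round(k, 5))
    ```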

  14. Evaluation of methods for seismic analysis of nuclear fuel reprocessing plants, part 1

    International Nuclear Information System (INIS)

    Tokarz, F.J.; Murray, R.C.; Arthur, D.F.; Feng, W.W.; Wight, L.H.; Zaslawsky, M.

    1975-01-01

    Currently, no guidelines exist for choosing methods of structural analysis to evaluate the seismic hazard of nuclear fuel reprocessing plants. This study examines available methods and their applicability to fuel reprocessing plant structures. The results of this study should provide a basis for establishing guidelines recommending methods of seismic analysis for evaluating future fuel reprocessing plants. The approach taken is: (1) to identify critical plant structures and place them in four categories (structures at or near grade; deeply embedded structures; fully buried structures; equipment/vessels/attachments/piping), (2) to select a representative structure in each of the first three categories and perform static and dynamic analysis on each, and (3) to evaluate and recommend method(s) of analysis for structures within each category. The Barnwell Nuclear Fuel Plant is selected as representative of future commercial reprocessing plants. The effect of site characteristics on the structural response is also examined. The response spectra method of analysis combined with the finite element model for each category is recommended. For structures founded near or at grade, the lumped mass model could also be used. If a time history response is required, a time-history analysis is necessary. (U.S.)

  15. Statistical data analysis using SAS intermediate statistical methods

    CERN Document Server

    Marasinghe, Mervyn G

    2018-01-01

    The aim of this textbook (previously titled SAS for Data Analytics) is to teach the use of SAS for statistical analysis of data for advanced undergraduate and graduate students in statistics, data science, and disciplines involving analyzing data. The book begins with an introduction beyond the basics of SAS, illustrated with non-trivial, real-world, worked examples. It proceeds to SAS programming and applications, SAS graphics, statistical analysis of regression models, analysis of variance models, analysis of variance with random and mixed effects models, and then takes the discussion beyond regression and analysis of variance to conclude. Pedagogically, the authors introduce theory and methodological basis topic by topic, present a problem as an application, followed by a SAS analysis of the data provided and a discussion of results. The text focuses on applied statistical problems and methods. Key features include: end of chapter exercises, downloadable SAS code and data sets, and advanced material suitab...

  16. Using non-parametric methods in econometric production analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    2012-01-01

    Econometric estimation of production functions is one of the most common methods in applied economic production analysis. These studies usually apply parametric estimation techniques, which obligate the researcher to specify a functional form of the production function, of which the Cobb ... parameter estimates, but also in biased measures which are derived from the parameters, such as elasticities. Therefore, we propose to use non-parametric econometric methods. First, these can be applied to verify the functional form used in parametric production analysis. Second, they can be directly used ... by investigating the relationship between the elasticity of scale and the farm size. We use a balanced panel data set of 371 specialised crop farms for the years 2004-2007. A non-parametric specification test shows that neither the Cobb-Douglas function nor the Translog function are consistent with the "true ...

  17. Managerial Methods Based on Analysis, Recommended to a Boarding House

    Directory of Open Access Journals (Sweden)

    Solomia Andreş

    2015-06-01

    Full Text Available The paper presents a few theoretical and practical contributions regarding the implementation of analysis-based methods, namely a SWOT analysis and an economic analysis, from the perspective and demands of managing a firm that operates profitably as a boarding house. The two types of managerial methods recommended to the firm offer the real and detailed information necessary for understanding the firm's status and for elaborating predictions aimed at maintaining business viability.

  18. Markov Chain Monte Carlo Methods for Bayesian Data Analysis in Astronomy

    Science.gov (United States)

    Sharma, Sanjib

    2017-08-01

    Markov Chain Monte Carlo based Bayesian data analysis has now become the method of choice for analyzing and interpreting data in almost all disciplines of science. In astronomy, over the last decade, we have also seen a steady increase in the number of papers that employ Monte Carlo based Bayesian analysis. New, efficient Monte Carlo based methods are continuously being developed and explored. In this review, we first explain the basics of Bayesian theory and discuss how to set up data analysis problems within this framework. Next, we provide an overview of various Monte Carlo based methods for performing Bayesian data analysis. Finally, we discuss advanced ideas that enable us to tackle complex problems and thus hold great promise for the future. We also distribute downloadable computer software (available at https://github.com/sanjibs/bmcmc/ ) that implements some of the algorithms and examples discussed here.
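
    A minimal random-walk Metropolis sampler of the kind reviewed in the paper, targeting a toy one-dimensional posterior; the target, step size and chain length are illustrative only.

    ```python
    # Sketch: random-walk Metropolis sampling of a toy 1-D posterior.
    import numpy as np

    def log_post(theta):                  # unnormalized log posterior (toy)
        return -0.5 * (theta - 2.0) ** 2 / 0.5 ** 2

    rng = np.random.default_rng(42)
    theta, chain = 0.0, []
    for _ in range(20_000):
        prop = theta + rng.normal(scale=0.5)           # symmetric proposal
        if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
            theta = prop                                # accept
        chain.append(theta)

    samples = np.array(chain[5_000:])                   # discard burn-in
    print("posterior mean ~", samples.mean().round(3))  # should approach 2.0
    ```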

  19. Advanced Analysis Methods in High Energy Physics

    Energy Technology Data Exchange (ETDEWEB)

    Pushpalatha C. Bhat

    2001-10-03

    During the coming decade, high energy physics experiments at the Fermilab Tevatron and around the globe will use very sophisticated equipment to record unprecedented amounts of data in the hope of making major discoveries that may unravel some of Nature's deepest mysteries. The discovery of the Higgs boson and signals of new physics may be around the corner. The use of advanced analysis techniques will be crucial in achieving these goals. The author discusses some of the novel methods of analysis that could prove to be particularly valuable for finding evidence of any new physics, for improving precision measurements and for exploring parameter spaces of theoretical models.

  20. Benchmarking Foot Trajectory Estimation Methods for Mobile Gait Analysis

    Directory of Open Access Journals (Sweden)

    Julius Hannink

    2017-08-01

    Full Text Available Mobile gait analysis systems based on inertial sensing on the shoe are applied in a wide range of applications. Especially for medical applications, they can give new insights into motor impairment in, e.g., neurodegenerative disease and help objectify patient assessment. One key component in these systems is the reconstruction of the foot trajectories from inertial data. In literature, various methods for this task have been proposed. However, performance is evaluated on a variety of datasets due to the lack of large, generally accepted benchmark datasets. This hinders a fair comparison of methods. In this work, we implement three orientation estimation and three double integration schemes for use in a foot trajectory estimation pipeline. All methods are drawn from literature and evaluated against a marker-based motion capture reference. We provide a fair comparison on the same dataset consisting of 735 strides from 16 healthy subjects. As a result, the implemented methods are ranked and we identify the most suitable processing pipeline for foot trajectory estimation in the context of mobile gait analysis.
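
    A hedged sketch of one double-integration scheme from such a pipeline: integrate stride-segmented forward acceleration twice and remove velocity drift linearly, assuming zero velocity at both stride boundaries (a common zero-velocity-update assumption). The signal and sample rate are synthetic.

    ```python
    # Sketch: stride-wise double integration with linear velocity dedrifting.
    import numpy as np

    fs = 100.0                                    # sample rate (Hz), assumed
    t = np.arange(0.0, 1.0, 1 / fs)               # one pre-segmented stride
    acc = 2.0 * np.sin(2 * np.pi * t)             # synthetic forward acceleration

    vel = np.cumsum(acc) / fs                     # first integration
    drift = np.linspace(0.0, vel[-1], len(vel))   # enforce v = 0 at stride end
    pos = np.cumsum(vel - drift) / fs             # second integration
    print("estimated stride length ~ %.3f m" % pos[-1])
    ```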

  1. Asset Analysis Method for the Cyber Security of Man Machine Interface System

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Sung Kon; Kim, Hun Hee; Shin, Yeong Cheol [Korea Hydro and Nuclear Power, Daejeon (Korea, Republic of)

    2010-10-15

    As digital MMIS (Man Machine Interface System) is applied in Nuclear Power Plants (NPP), cyber security is becoming more and more important. The regulatory guide (KINS/GT-N27) requires that an implementation plan for cyber security be prepared for an NPP, and recommends the following 4 processes: 1) an asset analysis of MMIS, 2) a vulnerability analysis of MMIS, 3) establishment of countermeasures, and 4) establishment of an operational guideline for cyber security. The conventional method for the asset analysis is performed mainly with a table for each asset, which requires a great deal of effort because of the duplication of information. This paper presents an asset analysis method using an object oriented approach for the NPP

  2. Asset Analysis Method for the Cyber Security of Man Machine Interface System

    International Nuclear Information System (INIS)

    Kang, Sung Kon; Kim, Hun Hee; Shin, Yeong Cheol

    2010-01-01

    As digital MMIS (Man Machine Interface System) is applied in Nuclear Power Plants (NPP), cyber security is becoming more and more important. The regulatory guide (KINS/GT-N27) requires that an implementation plan for cyber security be prepared for an NPP, and recommends the following 4 processes: 1) an asset analysis of MMIS, 2) a vulnerability analysis of MMIS, 3) establishment of countermeasures, and 4) establishment of an operational guideline for cyber security. The conventional method for the asset analysis is performed mainly with a table for each asset, which requires a great deal of effort because of the duplication of information. This paper presents an asset analysis method using an object oriented approach for the NPP

  3. An exact method for solving logical loops in reliability analysis

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi

    2009-01-01

    This paper presents an exact method for solving logical loops in reliability analysis. Systems that include logical loops are usually described by simultaneous Boolean equations. We first present a basic rule for solving simultaneous Boolean equations. Next, we show the analysis procedures for a three-component system with external supports. Third, more detailed discussion is given on the establishment of the logical loop relation. Finally, we take up two typical structures which include more than one logical loop; their analysis results and corresponding GO-FLOW charts are given. The proposed analytical method is applicable to loop structures that can be described by simultaneous Boolean equations, and it is very useful in evaluating the reliability of complex engineering systems.

  4. Collage Portraits as a Method of Analysis in Qualitative Research

    Directory of Open Access Journals (Sweden)

    Paula Gerstenblatt PhD

    2013-02-01

    Full Text Available This article explores the use of collage portraits in qualitative research and analysis. Collage portraiture, an area of arts-based research (ABR, is gaining stature as a method of analysis and documentation in many disciplines. This article presents a method of creating collage portraits to support a narrative thematic analysis that explored the impact of participation in an art installation construction. Collage portraits provide the opportunity to include marginalized voices and encourage a range of linguistic and non-linguistic representations to articulate authentic lived experiences. Other potential benefits to qualitative research are cross-disciplinary study and collaboration, innovative ways to engage and facilitate dialogue, and the building and dissemination of knowledge.

  5. Regional analysis of annual maximum rainfall using TL-moments method

    Science.gov (United States)

    Shabri, Ani Bin; Daud, Zalina Mohd; Ariff, Noratiqah Mohd

    2011-06-01

    Information on the distribution of rainfall amounts is of great importance for the design of water-related structures. One of the concerns of hydrologists and engineers is the choice of probability distribution for modeling regional data. In this study, the regional frequency analysis approach based on L-moments is first revisited, and an alternative regional frequency analysis using the TL-moments method is then employed; the results of the two methods are compared. The analysis was based on daily annual maximum rainfall data from 40 stations in Selangor, Malaysia. TL-moments for the generalized extreme value (GEV) and generalized logistic (GLO) distributions were derived and used to develop the regional frequency analysis procedure. The TL-moment ratio diagram and the Z-test were employed to determine the best-fit distribution. Comparison of the two approaches showed that L-moments and TL-moments produced equivalent results. The GLO and GEV distributions were identified as the most suitable distributions for representing the statistical properties of extreme rainfall in Selangor. Monte Carlo simulation was used for performance evaluation, and it showed that the TL-moments method was more efficient for lower-quantile estimation than the L-moments method.
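
    For orientation, the sketch below computes sample L-moments (the untrimmed special case of TL-moments) from probability-weighted moments; the annual-maximum values are invented, and the TL-moment trimming itself is not shown.

    ```python
    # Sketch: sample L-moments from probability-weighted moments.
    import numpy as np

    def l_moments(x):
        x = np.sort(np.asarray(x, float))
        n = len(x)
        j = np.arange(1, n + 1)
        b0 = x.mean()
        b1 = np.sum((j - 1) / (n - 1) * x) / n
        b2 = np.sum((j - 1) * (j - 2) / ((n - 1) * (n - 2)) * x) / n
        l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
        return l1, l2, l3 / l2       # location, scale, L-skewness (t3)

    annual_max = [87, 120, 95, 140, 110, 101, 160, 92, 133, 125]  # mm, invented
    print(l_moments(annual_max))
    ```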

  6. Using discriminant analysis as a nucleation event classification method

    Directory of Open Access Journals (Sweden)

    S. Mikkonen

    2006-01-01

    Full Text Available More than three years of measurements of aerosol size distributions and various gas and meteorological parameters made in the Po Valley, Italy were analysed in this study to examine which of the meteorological and trace gas variables affect the emergence of nucleation events. As the analysis method, we used discriminant analysis with a non-parametric Epanechnikov kernel, a form of non-parametric density estimation. The best classification result for our data was reached with the combination of relative humidity, ozone concentration and a third-degree polynomial of radiation. RH appeared to have a preventing effect on new particle formation, whereas the effects of O3 and radiation were more conducive. The concentrations of SO2 and NO2 also appeared to have a significant effect on the emergence of nucleation events, but because of the large number of missing observations, we had to exclude them from the final analysis.
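
    A sketch of kernel discriminant analysis in the spirit described above: fit one Epanechnikov-kernel density per class and classify by the larger prior-weighted density. The two standardized predictors and all data are synthetic stand-ins.

    ```python
    # Sketch: Epanechnikov-kernel discriminant classification (synthetic data).
    import numpy as np
    from sklearn.neighbors import KernelDensity

    rng = np.random.default_rng(3)
    X_event = rng.normal([-1.0, 1.0], 0.6, size=(120, 2))  # standardized vars
    X_non = rng.normal([1.0, -0.5], 0.6, size=(180, 2))

    kde_event = KernelDensity(kernel="epanechnikov", bandwidth=0.8).fit(X_event)
    kde_non = KernelDensity(kernel="epanechnikov", bandwidth=0.8).fit(X_non)
    log_priors = np.log([len(X_event), len(X_non)])        # unnormalized priors

    def classify(x):
        scores = [kde_event.score_samples([x])[0] + log_priors[0],
                  kde_non.score_samples([x])[0] + log_priors[1]]
        return ["event", "non-event"][int(np.argmax(scores))]

    print(classify([-0.8, 0.9]))   # expected: "event"
    ```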

  7. An experimental-numerical method for comparative analysis of joint prosthesis

    International Nuclear Information System (INIS)

    Claramunt, R.; Rincon, E.; Zubizarreta, V.; Ros, A.

    2001-01-01

    The analysis of mechanical stresses in bones is very difficult because of their complex mechanical and morphological characteristics. This complexity makes generalized modelling, and conclusions derived from prototype tests, very questionable. In this article, a relatively simple and systematic comparative analysis method that allows us to establish behavioural differences between different kinds of prostheses is presented. The method, applicable in principle to any joint problem, is based on analysing the perturbations produced in the natural stress state of a bone after insertion of a joint prosthesis, and combines numerical analysis using a 3-D finite element model with experimental studies based on photoelastic coatings and electric extensometry. The experimental method is applied to compare two cement-free femoral stems of total hip prostheses of different philosophies: one anatomic stem of the new generation, set obliquely over cancellous bone, and one madreporic stem with trochantero-diaphyseal support over cortical bone. (Author) 4 refs

  8. Fluid-Induced Vibration Analysis for Reactor Internals Using Computational FSI Method

    Energy Technology Data Exchange (ETDEWEB)

    Moon, Jong Sung; Yi, Kun Woo; Sung, Ki Kwang; Im, In Young; Choi, Taek Sang [KEPCO E and C, Daejeon (Korea, Republic of)

    2013-10-15

    Reactor coolant flow makes the Reactor Vessel Internals (RVI) vibrate and may affect their structural integrity. U.S. NRC Regulatory Guide 1.20 requires the Comprehensive Vibration Assessment Program (CVAP) to verify the structural integrity of the RVI against Fluid-Induced Vibration (FIV). This paper introduces a fluid-induced vibration analysis method which calculates the response of the RVI to both deterministic and random loads at once and which utilizes a more realistic pressure distribution obtained with the computational Fluid Structure Interaction (FSI) method. For the structural vibration analyses, the hydraulic forces on the RVI of OPR1000 and APR1400 were computed from hydraulic formulas and from the CVAP measurements at Palo Verde Unit 1 and Yonggwang Unit 4. The hydraulic forces were divided into deterministic and random turbulence loads, applied to the finite element model, and the responses to them were combined into the resultant stresses. This is a simple and integrative way to obtain the structural dynamic responses of reactor internals to various flow-induced loads. Because the analysis in this paper omitted the bypass flow region and the Inner Barrel Assembly (IBA) due to limited computer resources, it will be necessary to find an effective way to consider all regions in the RV for FIV analysis in the future.

  9. CB3PMF - Thermohidraulic analysis using the open lateral boundary method

    International Nuclear Information System (INIS)

    Borges, R.C.; Andrade, G.G. de

    1985-01-01

    A calculation method for the thermohydraulic analysis of a nuclear reactor having a large number of sub-channels is presented. The method uses an open lateral boundary, which maintains the influence of the external boundaries of the channel under study and adds to the external face of the channel physical model important characteristics that other computational models identify only at the sub-channel level. This makes it possible to retain the mixing characteristics that exist between the channel under analysis and the neighboring ones from the previous step. The method is shown to be valid, reliable and applicable to steady-state thermohydraulic analysis, and it permits greater flexibility in the application of coefficients and correlations. The additional computing time is negligible compared to the information obtained. (F.E.) [pt

  10. Distinct Interfacial Fluorescence in Oil-in-Water Emulsions via Exciton Migration of Conjugated Polymers.

    Science.gov (United States)

    Koo, Byungjin; Swager, Timothy M

    2017-09-01

    Commercial dyes are extensively utilized to stain specific phases for the visualization applications in emulsions and bioimaging. In general, dyes emit only one specific fluorescence signal and thus, in order to stain various phases and/or interfaces, one needs to incorporate multiple dyes and carefully consider their compatibility to avoid undesirable interactions with each other and with the components in the system. Herein, surfactant-type, perylene-endcapped fluorescent conjugated polymers that exhibit two different emissions are reported, which are cyan in water and red at oil-water interfaces. The interfacially distinct red emission results from enhanced exciton migration from the higher-bandgap polymer backbone to the lower-bandgap perylene endgroup. The confocal microscopy images exhibit the localized red emission exclusively from the circumference of oil droplets. This exciton migration and dual fluorescence of the polymers in different physical environments can provide a new concept of visualization methods in many amphiphilic colloidal systems and bioimaging. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Stimulated emission depletion microscopy resolves individual nitrogen vacancy centers in diamond nanocrystals.

    Science.gov (United States)

    Arroyo-Camejo, Silvia; Adam, Marie-Pierre; Besbes, Mondher; Hugonin, Jean-Paul; Jacques, Vincent; Greffet, Jean-Jacques; Roch, Jean-François; Hell, Stefan W; Treussart, François

    2013-12-23

    Nitrogen-vacancy (NV) color centers in nanodiamonds are highly promising for bioimaging and sensing. However, resolving individual NV centers within nanodiamond particles and the controlled addressing and readout of their spin state has remained a major challenge. Spatially stochastic super-resolution techniques cannot provide this capability in principle, whereas coordinate-controlled super-resolution imaging methods, like stimulated emission depletion (STED) microscopy, have been predicted to fail in nanodiamonds. Here we show that, contrary to these predictions, STED can resolve single NV centers in 40-250 nm sized nanodiamonds with a resolution of ≈10 nm. Even multiple adjacent NVs located in single nanodiamonds can be imaged individually down to relative distances of ≈15 nm. Far-field optical super-resolution of NVs inside nanodiamonds is highly relevant for bioimaging applications of these fluorescent nanolabels. The targeted addressing and readout of individual NV(-) spins inside nanodiamonds by STED should also be of high significance for quantum sensing and information applications.

  12. Assessment of the Prony's method for BWR stability analysis

    International Nuclear Information System (INIS)

    Ortiz-Villafuerte, Javier; Castillo-Duran, Rogelio; Palacios-Hernandez, Javier C.

    2011-01-01

    Highlights: → This paper describes a method to determine the degree of stability of a BWR. → A performance comparison between Prony's method and common AR techniques is presented. → Benchmark data and actual BWR transient data are used for comparison. → DR and f results are presented and discussed. → Prony's method is shown to be a robust technique for BWR stability. - Abstract: It is known that Boiling Water Reactors are susceptible to power oscillations in regions of high power and low coolant flow in the power-flow operational map. It is possible to fall into one of these instability regions during reactor startup, since power and coolant flow are both increased but not proportionally. Another possibility for falling into those areas is a trip of the recirculation pumps. Stability monitoring in such cases can be difficult, because the amount or quality of the power signal data required for calculating the key stability parameters may not be enough to provide reliable results in an adequate time range. In this work, Prony's method is presented as a complementary alternative for determining the degree of stability of a BWR from time series data. This analysis method can provide information about the decay ratio and oscillation frequency from power signals obtained during transient events. However, so far not many applications to Boiling Water Reactor operation have been reported, so the scope of such analysis for actual transient events is not yet well established. This work first presents a comparison of the decay ratio and oscillation frequency results obtained by Prony's method with those obtained by the participants of the Forsmark 1 and 2 Boiling Water Reactor Stability Benchmark using diverse techniques. Then, a comparison of decay ratio and oscillation frequency results is performed for four real BWR transient event data sets, using Prony's method and two other techniques based on autoregressive modeling. The four
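
    A bare-bones Prony/linear-prediction sketch for extracting a decay ratio (DR) and oscillation frequency from a power signal; the signal is synthetic, and the model order is set to 2 because the toy signal has a single mode (real signals need higher orders and noise handling).

    ```python
    # Sketch: Prony/linear-prediction estimate of decay ratio and frequency.
    import numpy as np

    fs = 25.0                                  # sampling rate (Hz), assumed
    t = np.arange(0.0, 40.0, 1 / fs)
    y = np.exp(-0.05 * t) * np.sin(2 * np.pi * 0.5 * t)  # true DR = exp(-0.1)

    p = 2   # order 2 suffices for this single-mode toy; real data need more
    # Linear prediction: y[n] = a1*y[n-1] + ... + ap*y[n-p]
    Y = np.column_stack([y[p - k - 1:len(y) - k - 1] for k in range(p)])
    a, *_ = np.linalg.lstsq(Y, y[p:], rcond=None)

    z = np.roots(np.r_[1.0, -a])               # discrete-time poles
    dom = z[np.argmax(np.abs(z.imag))]         # dominant oscillatory pole
    theta = abs(np.angle(dom))
    dr = np.abs(dom) ** (2 * np.pi / theta)    # amplitude ratio per cycle
    freq = theta * fs / (2 * np.pi)
    print(f"DR ~ {dr:.3f}, f ~ {freq:.3f} Hz")
    ```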

  13. Review of analysis methods for prestressed concrete reactor vessels

    International Nuclear Information System (INIS)

    Dodge, W.G.; Bazant, Z.P.; Gallagher, R.H.

    1977-02-01

    Theoretical and practical aspects of analytical models and numerical procedures for detailed analysis of prestressed concrete reactor vessels are reviewed. Constitutive models and numerical algorithms for time-dependent and nonlinear response of concrete and various methods for modeling crack propagation are discussed. Published comparisons between experimental and theoretical results are used to assess the accuracy of these analytical methods

  14. Automatic extraction of nuclei centroids of mouse embryonic cells from fluorescence microscopy images.

    Directory of Open Access Journals (Sweden)

    Md Khayrul Bashar

    Full Text Available Accurate identification of cell nuclei and their tracking using three dimensional (3D) microscopic images is a demanding task in many biological studies. Manual identification of nuclei centroids from images is an error-prone task, sometimes impossible to accomplish due to low contrast and the presence of noise. Nonetheless, only a few methods are available for 3D bioimaging applications, which sharply contrasts with 2D analysis, where many methods already exist. In addition, most methods essentially adopt segmentation, for which a reliable solution is still unknown, especially for 3D bio-images having juxtaposed cells. In this work, we propose a new method that can directly extract nuclei centroids from fluorescence microscopy images. This method involves three steps: (i) pre-processing, (ii) local enhancement, and (iii) centroid extraction. The first step includes two variations: the first variation (Variant-1) uses the whole 3D pre-processed image, whereas the second one (Variant-2) restricts the pre-processed image to the candidate regions or the candidate hybrid image for further processing. In the second step, multiscale cube filtering is employed in order to locally enhance the pre-processed image. Centroid extraction in the third step consists of three stages. In Stage-1, we compute a local characteristic ratio at every voxel and extract local maxima regions as candidate centroids using a ratio threshold. Stage-2 removes spurious centroids from the Stage-1 results by analyzing the shapes of intensity profiles from the enhanced image. An iterative procedure based on the nearest-neighborhood principle is then proposed to merge fragmented nuclei. Both qualitative and quantitative analyses on a set of 100 images of 3D mouse embryos are performed. Investigations reveal a promising achievement of the technique presented in terms of average sensitivity and precision (i.e., 88.04% and 91.30% for Variant-1; 86.19% and 95.00% for Variant-2)
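
    An illustrative sketch of the flavor of step (iii): locate local maxima of a cube-filtered 3-D volume and keep those above a threshold. The volume is synthetic, and a plain intensity threshold stands in for the paper's characteristic ratio and profile-shape tests.

    ```python
    # Sketch: cube filtering + local-maxima centroid extraction in 3-D.
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(7)
    vol = rng.random((40, 40, 40)) * 0.1                 # background noise
    zz, yy, xx = np.ogrid[:40, :40, :40]
    for c in [(10, 12, 8), (25, 30, 20), (33, 9, 31)]:   # three fake nuclei
        vol += np.exp(-((zz - c[0])**2 + (yy - c[1])**2 + (xx - c[2])**2) / 8.0)

    smoothed = ndimage.uniform_filter(vol, size=5)       # cube filtering
    peaks = ((smoothed == ndimage.maximum_filter(smoothed, size=7))
             & (smoothed > 0.4))                         # plain threshold
    print(np.argwhere(peaks))                            # candidate centroids
    ```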

  15. Advantages of a Dynamic RGGG Method in Qualitative and Quantitative Analysis

    International Nuclear Information System (INIS)

    Shin, Seung Ki; Seong, Poong Hyun

    2009-01-01

    Various researches have been conducted in order to analyze dynamic interactions among components and process variables in nuclear power plants which cannot be handled by static reliability analysis methods such as conventional fault tree and event tree techniques. A dynamic reliability graph with general gates (RGGG) method was proposed for an intuitive modeling of dynamic systems and it enables one to easily analyze huge and complex systems. In this paper, advantages of the dynamic RGGG method are assessed through two stages: system modeling and quantitative analysis. And then a software tool for dynamic RGGG method is introduced and an application to a real dynamic system is accompanied

  16. Data analysis through interactive computer animation method (DATICAM)

    International Nuclear Information System (INIS)

    Curtis, J.N.; Schwieder, D.H.

    1983-01-01

    DATICAM is an interactive computer animation method designed to aid in the analysis of nuclear research data. DATICAM was developed at the Idaho National Engineering Laboratory (INEL) by EG and G Idaho, Inc. INEL analysts use DATICAM to produce computer codes that are better able to predict the behavior of nuclear power reactors. In addition to increased code accuracy, DATICAM has saved manpower and computer costs. DATICAM has been generalized to assist in the data analysis of virtually any data-producing dynamic process

  17. Who's in and why? A typology of stakeholder analysis methods for natural resource management.

    Science.gov (United States)

    Reed, Mark S; Graves, Anil; Dandy, Norman; Posthumus, Helena; Hubacek, Klaus; Morris, Joe; Prell, Christina; Quinn, Claire H; Stringer, Lindsay C

    2009-04-01

    Stakeholder analysis means many things to different people. Various methods and approaches have been developed in different fields for different purposes, leading to confusion over the concept and practice of stakeholder analysis. This paper asks how and why stakeholder analysis should be conducted for participatory natural resource management research. This is achieved by reviewing the development of stakeholder analysis in business management, development and natural resource management. The normative and instrumental theoretical basis for stakeholder analysis is discussed, and a stakeholder analysis typology is proposed. This consists of methods for: i) identifying stakeholders; ii) differentiating between and categorising stakeholders; and iii) investigating relationships between stakeholders. The range of methods that can be used to carry out each type of analysis is reviewed. These methods and approaches are then illustrated through a series of case studies funded through the Rural Economy and Land Use (RELU) programme. These case studies show the wide range of participatory and non-participatory methods that can be used, and discuss some of the challenges and limitations of existing methods for stakeholder analysis. The case studies also propose new tools and combinations of methods that can more effectively identify and categorise stakeholders and help understand their inter-relationships.

  18. Development of motion image prediction method using principal component analysis

    International Nuclear Information System (INIS)

    Chhatkuli, Ritu Bhusal; Demachi, Kazuyuki; Kawai, Masaki; Sakakibara, Hiroshi; Kamiaka, Kazuma

    2012-01-01

    Respiratory motion limits the accuracy of the area irradiated during lung cancer radiation therapy. Many methods have been introduced to minimize the irradiation of healthy tissue caused by lung tumor motion. The purpose of this research is to develop an algorithm that improves image-guided radiation therapy through the prediction of motion images. We predict the motion images using principal component analysis (PCA) and the multi-channel singular spectrum analysis (MSSA) method. The images/movies were successfully predicted and verified using the developed algorithm. With the proposed prediction method it is possible to forecast the tumor images over the next breathing period. Implementing this method in real time is believed to be significant for a higher level of tumor tracking, including the detection of sudden abdominal changes during radiation therapy. (author)
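
    A hedged sketch of the prediction idea: project a frame sequence onto a few principal components, forecast each score series one step ahead (a least-squares AR(2) fit stands in for the paper's MSSA step), and reconstruct the predicted frame. The frames are synthetic one-dimensional "images".

    ```python
    # Sketch: PCA score extrapolation for one-step-ahead frame prediction.
    import numpy as np

    T, npix = 100, 64
    t = np.arange(T)
    cycle = np.sin(2 * np.pi * t / 20.0)                 # ~20-frame breathing
    frames = np.outer(cycle, np.linspace(0, 1, npix))    # T x pixels

    mean = frames.mean(axis=0)
    U, S, Vt = np.linalg.svd(frames - mean, full_matrices=False)
    k = 2
    scores = U[:, :k] * S[:k]                            # component scores

    def ar2_predict(x):
        """One-step AR(2) forecast fitted by least squares."""
        X = np.column_stack([x[1:-1], x[:-2]])
        a, *_ = np.linalg.lstsq(X, x[2:], rcond=None)
        return a[0] * x[-1] + a[1] * x[-2]

    next_scores = np.array([ar2_predict(scores[:, i]) for i in range(k)])
    predicted = mean + next_scores @ Vt[:k]
    truth = np.sin(2 * np.pi * T / 20.0) * np.linspace(0, 1, npix)
    print("max prediction error:", float(np.abs(predicted - truth).max()))
    ```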

  19. Review of the analysis methods of surface crack for straight pipe and elbow

    International Nuclear Information System (INIS)

    Kim, H. S.; Jang, Y. S.; Jin, T. E.

    1999-01-01

    The objective of this paper is to identify optimum EPFM analysis methods for straight pipe and elbow by comparing load-carrying capacities. To do this, analytical and finite element analyses were performed, and the results were compared with those in the literature and with experimental data to verify their validity. The comparison showed that, among the analytical methods other than FEM, the NSC method for straight pipe and the SC.ELB2 method for elbow were the appropriate ones for predicting load-carrying capacities. However, since the predictions scattered depending on analysis conditions such as geometry and material as well as on the analytical method, cautious application of the analytical methods is necessary

  20. Development Optimization and Uncertainty Analysis Methods for Oil and Gas Reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Ettehadtavakkol, Amin, E-mail: amin.ettehadtavakkol@ttu.edu [Texas Tech University (United States); Jablonowski, Christopher [Shell Exploration and Production Company (United States); Lake, Larry [University of Texas at Austin (United States)

    2017-04-15

    Uncertainty complicates the development optimization of oil and gas exploration and production projects, but methods have been devised to analyze uncertainty and its impact on optimal decision-making. This paper compares two methods for development optimization and uncertainty analysis: Monte Carlo (MC) simulation and stochastic programming. Two example problems for a gas field development and an oilfield development are solved and discussed to elaborate the advantages and disadvantages of each method. Development optimization involves decisions regarding the configuration of initial capital investment and subsequent operational decisions. Uncertainty analysis involves the quantification of the impact of uncertain parameters on the optimum design concept. The gas field development problem is designed to highlight the differences in the implementation of the two methods and to show that both methods yield the exact same optimum design. The results show that both MC optimization and stochastic programming provide unique benefits, and that the choice of method depends on the goal of the analysis. While the MC method generates more useful information, along with the optimum design configuration, the stochastic programming method is more computationally efficient in determining the optimal solution. Reservoirs comprise multiple compartments and layers with multiphase flow of oil, water, and gas. We present a workflow for development optimization under uncertainty for these reservoirs, and solve an example on the design optimization of a multicompartment, multilayer oilfield development.