WorldWideScience

Sample records for bioimage analysis methods

  1. A biosegmentation benchmark for evaluation of bioimage analysis methods

    Directory of Open Access Journals (Sweden)

    Kvilekval Kristian

    2009-11-01

    Background: We present a biosegmentation benchmark that includes infrastructure, datasets with associated ground truth, and validation methods for biological image analysis. The primary motivation for creating this resource is that it is very difficult, if not impossible, for an end user to choose among the wide range of segmentation methods available in the literature for a particular bioimaging problem. No single algorithm is likely to be equally effective on a diverse set of images, and each method has its own strengths and limitations. We hope that our benchmark resource will be of considerable help both to bioimaging researchers looking for novel image processing methods and to image processing researchers exploring the application of their methods to biology. Results: Our benchmark consists of different classes of images and ground truth data, ranging in scale from the subcellular and cellular to the tissue level, each of which poses its own set of challenges to image analysis. The associated ground truth data can be used to evaluate the effectiveness of different methods, to improve methods, and to compare results. Standard evaluation methods and some analysis tools are integrated into a database framework that is available online at http://bioimage.ucsb.edu/biosegmentation/. Conclusion: This online benchmark will facilitate the integration and comparison of image analysis methods for bioimages. While the primary focus is on biological images, we believe that the dataset and infrastructure will be of interest to researchers and developers working with biological image analysis, image segmentation, and object tracking in general.
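
Given benchmark masks and associated ground truth, segmentation quality is typically scored with region-overlap measures. The sketch below is a minimal illustration with invented toy masks, not code from the benchmark itself; it computes the Dice coefficient and Jaccard index for a binary segmentation:

```python
import numpy as np

def dice_jaccard(pred: np.ndarray, truth: np.ndarray):
    """Region-overlap scores between a binary prediction and ground truth."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    inter = np.logical_and(pred, truth).sum()
    union = np.logical_or(pred, truth).sum()
    dice = 2 * inter / (pred.sum() + truth.sum())
    jaccard = inter / union
    return float(dice), float(jaccard)

# Toy example: a 16-pixel square ground truth and a prediction
# shifted down by one row.
truth = np.zeros((8, 8), dtype=bool)
truth[2:6, 2:6] = True
pred = np.zeros((8, 8), dtype=bool)
pred[3:7, 2:6] = True

dice, jaccard = dice_jaccard(pred, truth)
print(dice, jaccard)
```

Both scores reach 1.0 only for a perfect match, which makes them convenient for ranking competing segmentation methods on the same ground truth.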

  2. Robust normalization protocols for multiplexed fluorescence bioimage analysis.

    Science.gov (United States)

    Ahmed Raza, Shan E; Langenkämper, Daniel; Sirinukunwattana, Korsuk; Epstein, David; Nattkemper, Tim W; Rajpoot, Nasir M

    2016-01-01

    The study of the mapping and interaction of co-localized proteins at a sub-cellular level is important for understanding complex biological phenomena. One of the recent techniques to map co-localized proteins is to use standard immunofluorescence microscopy in a cyclic manner (Nat Biotechnol 24:1270-8, 2006; Proc Natl Acad Sci 110:11982-7, 2013). Unfortunately, these techniques suffer from variability in the intensity and positioning of signals from protein markers within a run and across different runs. It is therefore necessary to standardize protocols for preprocessing the multiplexed bioimaging (MBI) data from multiple runs to a comparable scale before any further analysis can be performed on the data. In this paper, we compare various normalization protocols and propose, on the basis of the obtained results, a robust normalization technique that produces consistent results on MBI data collected from different runs using the Toponome Imaging System (TIS). Normalization results produced by the proposed method on a sample TIS data set for colorectal cancer patients were ranked favorably by two pathologists and two biologists. We show that the proposed method produces higher between-class Kullback-Leibler (KL) divergence and lower within-class KL divergence on a distribution of cell phenotypes from colorectal cancer and histologically normal samples. PMID:26949415
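
The evaluation criterion used here, KL divergence between distributions of cell phenotypes, can be illustrated on toy histograms (the numbers below are invented for illustration): after a good normalization, within-class divergence should be low and between-class divergence high.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Discrete KL divergence D(P||Q) between two histograms."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

# Toy phenotype histograms: two similar "normal" samples and one
# "tumour" sample with a very different phenotype distribution.
normal_a = [40, 35, 15, 10]
normal_b = [38, 36, 16, 10]
tumour   = [10, 15, 35, 40]

within_class  = kl_divergence(normal_a, normal_b)   # should be small
between_class = kl_divergence(normal_a, tumour)     # should be large
print(within_class, between_class)
```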

  3. BioImage Suite: An integrated medical image analysis suite: An update

    OpenAIRE

    Papademetris, Xenophon; Jackowski, Marcel P; Rajeevan, Nallakkandi; DiStasio, Marcello; Okuda, Hirohito; Constable, R. Todd; Staib, Lawrence H.

    2006-01-01

    BioImage Suite is an NIH-supported medical image analysis software suite developed at Yale. It leverages both the Visualization Toolkit (VTK) and the Insight Toolkit (ITK) and includes many additional algorithms for image analysis, especially in the areas of segmentation, registration, diffusion-weighted image processing, and fMRI analysis. BioImage Suite has a user-friendly interface developed in the Tcl scripting language. A final beta version is freely available for download.

  5. Bioimage Informatics for Big Data.

    Science.gov (United States)

    Peng, Hanchuan; Zhou, Jie; Zhou, Zhi; Bria, Alessandro; Li, Yujie; Kleissas, Dean Mark; Drenkow, Nathan G; Long, Brian; Liu, Xiaoxiao; Chen, Hanbo

    2016-01-01

    Bioimage informatics is a field wherein high-throughput image informatics methods are used to solve challenging scientific problems related to biology and medicine. When the image datasets become larger and more complicated, many conventional image analysis approaches are no longer applicable. Here, we discuss two critical challenges of large-scale bioimage informatics applications, namely, data accessibility and adaptive data analysis. We highlight case studies to show that these challenges can be tackled based on distributed image computing as well as machine learning of image examples in a multidimensional environment. PMID:27207370

  6. Transforms and Operators for Directional Bioimage Analysis: A Survey.

    Science.gov (United States)

    Püspöki, Zsuzsanna; Storath, Martin; Sage, Daniel; Unser, Michael

    2016-01-01

    We give a methodology-oriented perspective on directional image analysis and rotation-invariant processing. We review the state of the art in the field and make connections with recent mathematical developments in functional analysis and wavelet theory. We unify our perspective within a common framework using operators. The intent is to provide image-processing methods that can be deployed in algorithms that analyze biomedical images with improved rotation invariance and high directional sensitivity. We start our survey with classical methods such as directional gradients and the structure tensor. Then, we discuss how these methods can be improved with respect to robustness, invariance to geometric transformations (with a particular interest in scaling), and computational cost. To address robustness against noise, we move forward to higher degrees of directional selectivity and discuss Hessian-based detection schemes. To present multiscale approaches, we explain the differences between Fourier filters, directional wavelets, curvelets, and shearlets. To reduce the computational cost, we address the problem of matching directional patterns by proposing steerable filters, where one might perform arbitrary rotations and optimizations without discretizing the orientation. We define the property of steerability and give an introduction to the design of steerable filters. We cover the spectrum from simple steerable filters through pyramid schemes up to steerable wavelets. We also present illustrations on the design of steerable wavelets and their application to pattern recognition. PMID:27207363
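
The core steerability property the survey describes, that the response at any orientation is a linear combination of a few basis responses, can be sketched for first-order Gaussian derivatives. This is a toy illustration (assuming SciPy is available for the correlation step), not code from the survey:

```python
import numpy as np
from scipy.ndimage import correlate

def gaussian_derivative_kernels(sigma=1.5, radius=4):
    """x- and y-derivative-of-Gaussian kernels: the steerable basis pair."""
    ax = np.arange(-radius, radius + 1)
    xx, yy = np.meshgrid(ax, ax)
    g = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return -xx / sigma**2 * g, -yy / sigma**2 * g

def steer(resp_x, resp_y, theta):
    """First-order response at angle theta, with no re-filtering."""
    return np.cos(theta) * resp_x + np.sin(theta) * resp_y

# Correlate the image with the two basis kernels once...
image = np.zeros((32, 32))
image[:, 16:] = 1.0                 # vertical step edge
gx, gy = gaussian_derivative_kernels()
rx = correlate(image, gx)
ry = correlate(image, gy)

# ...then evaluate any orientation by steering the two responses.
r0  = steer(rx, ry, 0.0)            # aligned with the edge gradient
r90 = steer(rx, ry, np.pi / 2)      # perpendicular: essentially no response
print(np.abs(r0).max(), np.abs(r90).max())
```

Only two convolutions are ever computed; every other orientation comes from the interpolation formula, which is exactly the computational saving the survey attributes to steerable filters.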

  8. A Novel Approach for Information Content Retrieval and Analysis of Bio-Images using Datamining techniques

    Directory of Open Access Journals (Sweden)

    Ayyagari Sri Nagesh

    2012-11-01

    In the biomedical image processing domain, content-based analysis and information retrieval of bio-images is critical for disease diagnosis. Content-Based Image Analysis and Information Retrieval (CBIAIR) has become a significant part of information retrieval technology. One challenge in this area is that the ever-increasing number of bio-images acquired through the digital world makes brute-force searching almost impossible. The structural object content of medical images and object identification play a significant role in image content analysis and information retrieval. There are three fundamental components of content-based bio-image retrieval: visual-feature extraction, multi-dimensional indexing, and the retrieval process. Each image has three content features: colour, texture, and shape. Colour and texture are both important visual features used in content-based image retrieval to improve results. In this paper, we present an effective image retrieval system, called CBIAIR, that uses texture, shape, and colour features. First, we develop a new pixel-based texture pattern feature for the CBIAIR system. We then use a semantic colour feature for the colour-based component, while shape-based feature selection is done using an existing technique. For retrieval, these features are extracted from the query image and matched against the feature library using a feature-weighted distance. All feature vectors are stored in the database using an indexing procedure. Finally, the relevant images whose matching distance is less than a predefined threshold are retrieved from the image database after applying a k-NN classifier.
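
The retrieval step described in this abstract, matching query features against a feature library with a feature-weighted distance, thresholding, and keeping the k nearest neighbours, can be sketched as follows (the feature values and weights below are hypothetical):

```python
import numpy as np

def weighted_distance(query, feats, weights):
    """Feature-weighted Euclidean distance from a query vector to a library."""
    diff = feats - query
    return np.sqrt((weights * diff**2).sum(axis=1))

# Hypothetical feature library: rows are images, columns are
# (texture, colour, shape) descriptors already scaled to [0, 1].
library = np.array([
    [0.90, 0.10, 0.20],
    [0.85, 0.15, 0.25],
    [0.10, 0.90, 0.80],
])
weights = np.array([0.5, 0.3, 0.2])     # texture weighted highest
query = np.array([0.88, 0.12, 0.22])

d = weighted_distance(query, library, weights)
k, threshold = 2, 0.5
# k nearest neighbours, then drop any beyond the distance threshold.
retrieved = [int(i) for i in np.argsort(d)[:k] if d[i] < threshold]
print(retrieved)
```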

  9. Upconverting nanophosphors for bioimaging

    Energy Technology Data Exchange (ETDEWEB)

    Lim, Shuang Fang; Zhuo Rui [Department of MAE, Princeton University, Princeton, NJ 08544 (United States); Riehn, Robert [Department of Physics, North Carolina State University, Raleigh, NC 27695 (United States); Tung, Chih-kuan; Dalland, Joanna; Austin, Robert H [Department of Physics, Princeton University, Princeton, NJ 08544 (United States); Ryu, William S [Lewis-Sigler Institute for Integrative Genomics, Princeton University, Princeton, NJ 08544 (United States)

    2009-10-07

    Upconverting nanoparticles (UCNPs), when excited in the near-infrared (NIR) region, display anti-Stokes emission whereby the emitted photon is higher in energy than the excitation energy. The material system achieves this by converting two or more infrared photons into visible photons. The use of the infrared confers benefits to bioimaging because of its deeper penetrating power in biological tissues and the lack of autofluorescence. We demonstrate here sub-10 nm, upconverting rare earth oxide UCNPs synthesized by a combustion method that can be stably suspended in water when amine modified. The amine-modified UCNPs show specific surface immobilization onto patterned gold surfaces. Finally, the low toxicity of the UCNPs is verified by testing on the multicellular nematode C. elegans.

  10. Chapter 17: bioimage informatics for systems pharmacology.

    Directory of Open Access Journals (Sweden)

    Fuhai Li

    2013-04-01

    Recent advances in automated high-resolution fluorescence microscopy and robotic handling have made possible the systematic and cost-effective study of diverse morphological changes within a large population of cells under a variety of perturbations, e.g., drugs, compounds, metal catalysts, or RNA interference (RNAi). Cell population-based studies deviate from conventional microscopy studies on a few cells and can provide stronger statistical power for drawing experimental observations and conclusions. However, it is challenging to manually extract and quantify phenotypic changes from the large amounts of complex image data generated. Thus, bioimage informatics approaches are needed to rapidly and objectively quantify and analyze the image data. This paper provides an overview of the bioimage informatics challenges and approaches in image-based studies for drug and target discovery. The concepts and capabilities of image-based screening are first illustrated by a few practical examples investigating different kinds of phenotypic changes caused by drugs, compounds, or RNAi. The bioimage analysis approaches, including object detection, segmentation, and tracking, are then described. Subsequently, the quantitative features, phenotype identification, and multidimensional profile analysis for profiling the effects of drugs and targets are summarized. Moreover, a number of publicly available software packages for bioimage informatics are listed for further reference. It is expected that this review will help readers, including those without bioimage informatics expertise, understand the capabilities, approaches, and tools of bioimage informatics and apply them to advance their own studies.
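
As one concrete example of the tracking step mentioned in this overview, a minimal greedy nearest-neighbour linker for cell centroids across two frames might look like this (a toy sketch, not code from any package listed in the review):

```python
import numpy as np

def link_frames(prev_centroids, next_centroids, max_dist=10.0):
    """Greedily link each cell in the previous frame to its nearest
    unclaimed centroid in the next frame, within a distance gate."""
    links, taken = {}, set()
    for i, p in enumerate(prev_centroids):
        d = np.linalg.norm(next_centroids - p, axis=1)
        for j in np.argsort(d):
            if d[j] <= max_dist and int(j) not in taken:
                links[i] = int(j)
                taken.add(int(j))
                break
    return links

# Two cells that moved slightly between frames; detection order swapped.
prev_c = np.array([[10.0, 10.0], [40.0, 40.0]])
next_c = np.array([[42.0, 41.0], [11.0, 12.0]])
links = link_frames(prev_c, next_c)
print(links)
```

Real trackers add gap closing, divisions, and globally optimal assignment, but this greedy gate-and-link step is the basic building block.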

  11. Quantitative analysis for radiation image measured by bio-image analyzer

    International Nuclear Information System (INIS)

    The bio-image analyzer is a system for detecting radiation images. In the system, the radiation image recorded on an imaging plate (coated with photostimulable phosphor on a polyester plate) is read out as light signals by laser beam excitation, and the image data are processed by a computer. The system is mainly applied to the autoradiography of biological samples. In order to clarify the characteristics of the analyzer, the factors that affect the quantification of the radiation image were investigated. The photostimulable phosphor shows a fading phenomenon whose magnitude depends on the storage temperature and period. After irradiation with C14-β rays for a definite time, the plates were stored for 1 hour to 14 days at 10 °C to 40 °C and read out. The absolute output value, defined as the value unaffected by fading, was determined from the relation between irradiation time and output by extrapolating the storage time to zero. Calibration factors relative to this absolute value were calculated and expressed as a function of storage time and temperature. The fading effects after Tl204-β and γ ray irradiation were also examined, and the fading rates almost coincided with that of C14-β rays. (author)
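
The fading correction described here can be sketched numerically. Assuming a single-exponential fading model (an assumption made purely for illustration; the data points below are synthetic, generated from S(t) = 1000·exp(-t/100)), the absolute output is the extrapolation of the signal to zero storage time, and the calibration factor follows from the fitted time constant:

```python
import numpy as np

# Synthetic fading data: signal read after increasing storage times (hours).
t = np.array([1.0, 4.0, 24.0, 72.0])
s = np.array([990.0, 961.0, 787.0, 487.0])

# Log-linear least-squares fit; the intercept extrapolates to zero storage time.
slope, intercept = np.polyfit(t, np.log(s), 1)
s0 = float(np.exp(intercept))       # fading-free ("absolute") output
tau = -1.0 / slope                  # fading time constant at this temperature

def calibration_factor(storage_time):
    """Factor that converts a faded reading back to the absolute value."""
    return float(np.exp(storage_time / tau))

print(round(s0), round(tau, 1))
```

In practice one such fit per storage temperature yields the calibration-factor table as a function of both storage time and temperature.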

  12. Semiquantitative fluorescence method for bioconjugation analysis.

    Science.gov (United States)

    Brasil, Aluízio G; Carvalho, Kilmara H G; Leite, Elisa S; Fontes, Adriana; Santos, Beate Saegesser

    2014-01-01

    Quantum dots (QDs) have been used as fluorescent probes in biological and medical fields such as bioimaging, bioanalytics, and immunofluorescence assays. For these applications, it is important to characterize the QD-protein bioconjugates. This chapter provides details on a versatile method to confirm quantum dot-protein conjugation, including the required materials and instrumentation, in order to perform a step-by-step semiquantitative analysis of the bioconjugation efficiency using fluorescence plate readings. Although the protocols to confirm QD-protein attachment shown here were developed for CdTe QDs coated with specific ligands and proteins, the principles are the same for other QD-protein bioconjugates. PMID:25103803

  13. Bio-imaging and visualization for patient-customized simulations

    CERN Document Server

    Luo, Xiongbiao; Li, Shuo

    2014-01-01

    This book contains the full papers presented at the MICCAI 2013 workshop on Bio-Imaging and Visualization for Patient-Customized Simulations (MWBIVPCS 2013). MWBIVPCS 2013 brought together researchers representing several fields, such as biomechanics, engineering, medicine, mathematics, physics, and statistics. The contributions included in this book present and discuss new trends in those fields, using several methods and techniques, including the finite element method, similarity metrics, optimization processes, graphs, hidden Markov models, sensor calibration, fuzzy logic, data mining, cellular automata, active shape models, template matching, and level sets. These serve as tools to address more efficiently different and timely applications involving signal and image acquisition, image processing and analysis, image segmentation, image registration and fusion, computer simulation, image-based modelling, simulation and surgical planning, image-guided robot-assisted surgery, and image-based diagnosis. This boo...

  14. Open source bioimage informatics for cell biology.

    Science.gov (United States)

    Swedlow, Jason R; Eliceiri, Kevin W

    2009-11-01

    Significant technical advances in imaging, molecular biology, and genomics have fueled a revolution in cell biology, in that the molecular and structural processes of the cell are now visualized and measured routinely. Driving much of this recent development has been the advent of computational tools for the acquisition, visualization, analysis, and dissemination of these datasets. These tools collectively make up a new subfield of computational biology called bioimage informatics, which is facilitated by open source approaches. We discuss why open source tools for image informatics in cell biology are needed, describe some of the key general attributes that make an open source imaging application successful, and point to opportunities for further interoperability that should greatly accelerate future cell biology discovery.

  15. Elemental bioimaging and speciation analysis for the investigation of Wilson's disease using μXRF and XANES.

    Science.gov (United States)

    Hachmöller, Oliver; Buzanich, Ana Guilherme; Aichler, Michaela; Radtke, Martin; Dietrich, Dörthe; Schwamborn, Kristina; Lutz, Lisa; Werner, Martin; Sperling, Michael; Walch, Axel; Karst, Uwe

    2016-07-13

    A liver biopsy specimen from a Wilson's disease (WD) patient was analyzed by means of micro-X-ray fluorescence (μXRF) spectroscopy to determine the elemental distribution. First, bench-top μXRF was utilized for a coarse scan of the sample under laboratory conditions. The resulting distribution maps of copper and iron enabled the determination of a region of interest (ROI) for further analysis. In order to obtain more detailed elemental information, this ROI was analyzed by synchrotron radiation (SR)-based μXRF with a beam size of 4 μm, offering resolution at the cellular level. Distribution maps of elements in addition to copper and iron, such as zinc and manganese, were obtained owing to the higher sensitivity of SR-μXRF. In addition, X-ray absorption near edge structure (XANES) spectroscopy was performed to identify the oxidation states of copper in WD. This speciation analysis indicated a mixture of copper(I) and copper(II) within the WD liver tissue.
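
XANES speciation of this kind is commonly done by linear-combination fitting of reference spectra. The sketch below recovers the fractions of two components from a synthetic mixture; the toy edge shapes and the hidden 60/40 split are invented for illustration, not the paper's data:

```python
import numpy as np

# Hypothetical reference XANES spectra for Cu(I) and Cu(II) on a common
# energy grid (eV), plus a measured spectrum that is a hidden 60/40
# mixture with a little noise.
rng = np.random.default_rng(0)
energy = np.linspace(8975, 9010, 50)
cu1 = 1 / (1 + np.exp(-(energy - 8982)))    # toy edge shapes, not real data
cu2 = 1 / (1 + np.exp(-(energy - 8990)))
measured = 0.6 * cu1 + 0.4 * cu2 + rng.normal(0, 0.005, energy.size)

# Least-squares fit of the two-column reference matrix, clipped at zero,
# then normalized to fractional composition.
A = np.column_stack([cu1, cu2])
coeffs, *_ = np.linalg.lstsq(A, measured, rcond=None)
coeffs = np.clip(coeffs, 0, None)
fractions = coeffs / coeffs.sum()
print(fractions.round(2))
```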

  16. Biomagnetics and bioimaging for medical applications

    Energy Technology Data Exchange (ETDEWEB)

    Ueno, Shoogo [Department of Biomedical Engineering, Graduate School of Medicine, University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-0033 (Japan)]. E-mail: ueno@medes.m.u-tokyo.ac.jp; Sekino, Masaki [Department of Biomedical Engineering, Graduate School of Medicine, University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-0033 (Japan)

    2006-09-15

    This paper reviews medical applications of the recently developed techniques in biomagnetics and bioimaging such as transcranial magnetic stimulation, magnetoencephalography, magnetic resonance imaging, cancer therapy based on magnetic stimulation, and magnetic control of cell orientation and cell growth. These techniques are leading medicine and biology into a new horizon through the novel applications of magnetism.

  17. Quantum dots in bio-imaging: Revolution by the small

    International Nuclear Information System (INIS)

    Visual analysis of biomolecules is an integral avenue of basic and applied biological research. It has been widely carried out by tagging nucleotides and proteins with traditional fluorophores, which are limited in their application by features such as photobleaching, spectral overlap, and operational difficulties. Quantum dots (QDs) are emerging as a superior alternative and are poised to change the world of bio-imaging and further its applications in basic and applied biology. The interdisciplinary field of nanobiotechnology is experiencing a revolution, and QDs as an enabling technology have become a harbinger of this hybrid field. Within a decade, research on QDs has evolved from a pure science subject to one with high-end commercial applications.

  18. Advances in Bio-Imaging From Physics to Signal Understanding Issues State-of-the-Art and Challenges

    CERN Document Server

    Racoceanu, Daniel; Gouaillard, Alexandre

    2012-01-01

    Advances in imaging devices and image processing stem from cross-fertilization between many fields of research, such as chemistry, physics, mathematics, and computer science. The bioimaging community feels the urge to integrate its various results, discoveries, and innovations more intensively into ready-to-use tools that can address all the new, exciting challenges that life scientists (biologists, medical doctors, ...) keep providing, almost on a daily basis. Devising innovative chemical probes, for example, is an archetypal goal in which image quality improvement must be driven by the physics of acquisition, the image processing and analysis algorithms, and the chemical skills required to design an optimal bioprobe. This book offers an overview of the current advances in many research fields related to bioimaging and highlights the current limitations that would need to be addressed in the next decade to design a fully integrated bioimaging device.

  19. A simple one-step synthesis of melanin-originated red shift emissive carbonaceous dots for bioimaging.

    Science.gov (United States)

    Hu, Chuan; Liu, Yongmei; Chen, Jiantao; He, Qin; Gao, Huile

    2016-10-15

    Carbonaceous dots (CDs) are superior nanomaterials owing to their promising luminescence properties and good biocompatibility. However, most CDs have relatively short excitation/emission wavelengths, which restricts their application in bioimaging. In this study, a simple one-step procedure was developed for the synthesis of melanin-originated CDs (MNPs). The MNPs showed two long red-shifted emissions at 570 nm and 645 nm with broad absorptions from 200 nm to 400 nm and from 500 nm to 700 nm, suggesting the great potential of MNPs in bioimaging. In addition, several experiments indicated that MNPs possessed good serum stability and good blood compatibility. In vitro, MNPs could be taken up by C6 cells in a concentration- and time-dependent manner with endosomes involved. In conclusion, MNPs were prepared using a simple one-step method with unique optical and good biological properties and could be used for bioimaging. PMID:27416289

  20. Intelligent spectral signature bio-imaging in vivo for surgical applications

    Science.gov (United States)

    Jeong, Jihoon; Frykman, Philip K.; Gaon, Mark; Chung, Alice P.; Lindsley, Erik H.; Hwang, Jae Y.; Farkas, Daniel L.

    2007-02-01

    Multi-spectral imaging provides digital images of a scene or object at a large, usually sequential, number of wavelengths, generating precise optical spectra at every pixel. We use the term "spectral signature" for a quantitative plot of optical property variations as a function of wavelength. We present here intelligent spectral signature bio-imaging methods we developed, including automatic signature selection based on machine learning algorithms, database search-based automatic color allocation, and selected visualization schemes matching these approaches. Using this intelligent spectral signature bio-imaging method, we could discriminate normal from aganglionic colon tissue in the Hirschsprung's disease mouse model with over 95% sensitivity and specificity under various similarity measures, and could distinguish anatomic structures such as the parathyroid gland, thyroid gland, and pre-tracheal fat in the dissected neck of the rat in vivo.
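
One common similarity measure for spectral signatures is the spectral angle. A minimal classifier that assigns a pixel spectrum to the nearest class signature might look like this (all signatures and band values below are hypothetical, not the paper's data):

```python
import numpy as np

def spectral_angle(a, b):
    """Angle (radians) between two spectral signatures; smaller = more similar."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

# Hypothetical mean signatures for two tissue classes over 5 wavelength bands.
normal      = np.array([0.20, 0.35, 0.55, 0.40, 0.25])
aganglionic = np.array([0.30, 0.30, 0.35, 0.50, 0.45])

pixel = np.array([0.22, 0.36, 0.52, 0.41, 0.27])    # unknown pixel spectrum

# Assign the pixel to the class whose signature subtends the smallest angle.
label = min(("normal", normal), ("aganglionic", aganglionic),
            key=lambda c: spectral_angle(pixel, c[1]))[0]
print(label)
```

The spectral angle ignores overall brightness, which makes it robust to the illumination variations that plague raw per-band intensity comparisons.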

  1. Synthesis, Structure, Properties, and Bioimaging of a Fluorescent Nitrogen-Linked Bisbenzothiadiazole.

    Science.gov (United States)

    Mota, Alberto A R; Corrêa, José R; Carvalho, Pedro H P R; de Sousa, Núbia M P; de Oliveira, Heibbe C B; Gatto, Claudia C; da Silva Filho, Demétrio A; de Oliveira, Aline L; Neto, Brenno A D

    2016-04-01

    This paper describes the synthesis, structure, photophysical properties, and bioimaging application of a novel 2,1,3-benzothiadiazole (BTD)-based rationally designed fluorophore. The capability of undergoing efficient stabilizing processes from the excited state allowed the novel BTD derivative to be used as a stable probe for bioimaging applications. No notable photobleaching effect or degradation could be observed during the experimental time period. Before the synthesis, the molecular architecture of the novel BTD derivative was evaluated by means of DFT calculations to validate the chosen design. Single-crystal X-ray analysis revealed the nearly flat characteristics of the structure in a syn conformation. The fluorophore was successfully tested as a live-cell-imaging probe and efficiently stained MCF-7 breast cancer cell lineages. PMID:26930300

  2. Applications of graphene and its derivatives in intracellular biosensing and bioimaging.

    Science.gov (United States)

    Zhu, Xiaohua; Liu, Yang; Li, Pei; Nie, Zhou; Li, Jinghong

    2016-08-01

    Graphene has a unique planar structure, as well as excellent electronic properties, and has attracted a great deal of interest from scientists. Graphene and its derivatives display advantageous characteristics as a biosensing platform due to their high surface area, good biocompatibility and ease of functionalization. Moreover, graphene and its derivatives exhibit excellent optical properties; thus they are considered to be promising and attractive candidates for bioimaging, mainly of cells and tissues. Following an introduction and a discussion of the optical properties of graphene, this review assesses the methods for engineering the functions of graphene and its derivatives. Specific examples are given on the use of graphene and its derivatives in fluorescence bioimaging, surface-enhanced Raman scattering (SERS) imaging, and magnetic resonance imaging (MRI). Finally, the prospects and further developments in this exciting field of graphene-based materials are suggested. PMID:27373227

  3. Bioimaging mass spectrometry of trace elements – recent advance and applications of LA-ICP-MS: A review

    Energy Technology Data Exchange (ETDEWEB)

    Becker, J.Sabine, E-mail: s.becker@fz-juelich.de [Central Institute for Engineering, Electronics and Analytics (ZEA-3), Forschungszentrum Jülich, Jülich D-52425 (Germany); Matusch, Andreas, E-mail: a.matusch@fz-juelich.de [Institute for Neuroscience and Medicine (INM-2), Forschungszentrum Jülich, Jülich D-52425 (Germany); Wu, Bei, E-mail: b.wu@fz-juelich.de [Central Institute for Engineering, Electronics and Analytics (ZEA-3), Forschungszentrum Jülich, Jülich D-52425 (Germany)

    2014-07-04

    Highlights:
    • Bioimaging LA-ICP-MS is established for trace metals within biomedical specimens.
    • Trace metal imaging allows the study of brain function and neurodegenerative diseases.
    • Laser microdissection ICP-MS was applied to mouse brain hippocampus and wheat root.

    Abstract: Bioimaging using laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) offers the capability to quantify trace elements and isotopes within tissue sections with a spatial resolution ranging from about 10 to 100 μm. Distribution analysis helps clarify basic questions of biomedical research and enables bioaccumulation and bioavailability studies for ecological and toxicological risk assessment in humans, animals, and plants. Major application fields of mass spectrometry imaging (MSI) and metallomics have been brain and cancer research, animal model validation, drug development, and plant science. Here we give an overview of the latest achievements in methods and applications. Recent improvements in ablation systems, operation, and cell design have enabled progressively better spatial resolutions, down to 1 μm. Meanwhile, a body of research has accumulated covering basic principles of the element architecture in animals and plants that can be reproduced consistently across laboratories, such as the distribution of Fe, Cu, and Zn in rodent brain. Several studies have investigated the distribution and delivery of metallo-drugs in animals. Hyper-accumulating plants and pollution indicator organisms have been the key topics in environmental science. Increasingly, larger series of samples are analyzed, be it in the frame of comparisons between intervention and control groups, of time kinetics, or of three-dimensional atlas approaches.

  4. Gold nanoclusters with enhanced tunable fluorescence as bioimaging probes.

    Science.gov (United States)

    Palmal, Sharbari; Jana, Nikhil R

    2014-01-01

    The development of unique bioimaging probes offering essential information about biological environments is an important step forward in biomedical science. Nanotechnology offers a variety of novel imaging nanoprobes with high photostability compared to conventional molecular probes, which often suffer from rapid photobleaching. Although great advances have been made in the development of semiconductor nanocrystal-based fluorescent imaging probes, potential toxicity from their heavy metal components limits their in vivo therapeutic and clinical application. Recent work shows that fluorescent gold clusters (FGCs) can be a promising nontoxic alternative to semiconductor nanocrystals. FGC-derived imaging nanoprobes offer stable and tunable visible emission, small hydrodynamic size, and high biocompatibility, and have been exploited in a variety of in vitro and in vivo imaging applications. In this review, we focus on the synthetic advances and bioimaging application potential of FGCs. In particular, we emphasize functional FGCs that are bright and stable enough to be useful as bioimaging probes.

  5. Methods of Multivariate Analysis

    CERN Document Server

    Rencher, Alvin C

    2012-01-01

    Praise for the Second Edition "This book is a systematic, well-written, well-organized text on multivariate analysis packed with intuition and insight . . . There is much practical wisdom in this book that is hard to find elsewhere."-IIE Transactions Filled with new and timely content, Methods of Multivariate Analysis, Third Edition provides examples and exercises based on more than sixty real data sets from a wide variety of scientific fields. It takes a "methods" approach to the subject, placing an emphasis on how students and practitioners can employ multivariate analysis in real-life situations.

  6. New nanoplatforms based on UCNPs linking with polyhedral oligomeric silsesquioxane (POSS) for multimodal bioimaging

    Science.gov (United States)

    Ge, Xiaoqian; Dong, Liang; Sun, Lining; Song, Zhengmei; Wei, Ruoyan; Shi, Liyi; Chen, Haige

    2015-04-01

    A new and facile method was used to transfer upconversion luminescent nanoparticles from hydrophobic to hydrophilic by linking polyhedral oligomeric silsesquioxane (POSS) on the surface of the upconversion nanoparticles. In comparison with the unmodified upconversion nanoparticles, the POSS-modified upconversion nanoplatforms [POSS-UCNPs(Er), POSS-UCNPs(Tm)] displayed good monodispersion and water-solubility, while their particle size did not change substantially. Owing to the low cytotoxicity and good biocompatibility determined by methyl thiazolyl tetrazolium (MTT) assay and histology and hematology analysis, the POSS-modified upconversion nanoplatforms were successfully applied to upconversion luminescence imaging of living cells in vitro and a nude mouse in vivo (upon excitation at 980 nm). In addition, the doped Gd3+ ion endows the POSS-UCNPs with effective T1 signal enhancement, and they were successfully applied to in vivo magnetic resonance imaging (MRI) of a Kunming mouse, which makes them potential MRI positive-contrast agents. More importantly, the corner organic groups of POSS can be easily modified, yielding various kinds of POSS-UCNPs with many potential applications. Therefore, the method and results may provide more exciting opportunities for multimodal bioimaging and multifunctional applications.

  7. Visible-light-excited and europium-emissive nanoparticles for highly-luminescent bioimaging in vivo.

    Science.gov (United States)

    Wu, Yongquan; Shi, Mei; Zhao, Lingzhi; Feng, Wei; Li, Fuyou; Huang, Chunhui

    2014-07-01

    Europium(III)-based materials, with their characteristic millisecond photoluminescence lifetimes, have been considered ideal time-gated luminescence probes for bioimaging, but their application in luminescent small-animal imaging in vivo is still limited. Here, a water-soluble, stable, highly luminescent nanosystem, Ir-Eu-MSN (MSN = mesoporous silica nanoparticles, Ir-Eu = [Ir(dfppy)2(pic-OH)]3Eu·2H2O, dfppy = 2-(2,4-difluorophenyl)pyridine, pic-OH = 3-hydroxy-2-carboxypyridine), was developed by an in situ coordination reaction that forms an insoluble dinuclear iridium(III)-complex-sensitized europium(III) emissive complex within mesoporous silica nanoparticles (MSNs) with high loading efficiency. Compared with the usual approach of physical adsorption, this in situ reaction strategy provided 20-fold higher loading efficiency (43.2%) of the insoluble Ir-Eu complex in MSNs. The nanoparticles in the solid state showed bright red luminescence with a high quantum yield of 55.2%, and the excitation window extended up to 470 nm. The Ir-Eu-MSN nanoparticles were used for luminescence imaging in living cells under excitation at 458 nm with confocal microscopy, which was confirmed by flow cytometry. Furthermore, the Ir-Eu-MSN nanoparticles were successfully applied to high-contrast luminescent lymphatic imaging in vivo under a low excitation power density of 5 mW cm(-2). This synthetic method provides a universal strategy for combining hydrophobic complexes with hydrophilic MSNs for in vivo bioimaging.

  8. Analysis of numerical methods

    CERN Document Server

    Isaacson, Eugene

    1994-01-01

    This excellent text for advanced undergraduates and graduate students covers norms, numerical solution of linear systems and matrix factoring, iterative solutions of nonlinear equations, eigenvalues and eigenvectors, polynomial approximation, and other topics. It offers a careful analysis and stresses techniques for developing new methods, plus many examples and problems. 1966 edition.

  9. Nanostructures Derived from Starch and Chitosan for Fluorescence Bio-Imaging

    Directory of Open Access Journals (Sweden)

    Yinxue Zu

    2016-07-01

    Fluorescent nanostructures (NSs) derived from polysaccharides have drawn great attention as novel fluorescent probes for potential bio-imaging applications. Herein, we report a facile alkali-assisted hydrothermal method to fabricate polysaccharide NSs using starch and chitosan as raw materials. Transmission electron microscopy (TEM) demonstrated that the average particle sizes are 14 nm and 75 nm for starch and chitosan NSs, respectively. Fourier transform infrared (FT-IR) spectroscopy analysis showed that there are a large number of hydroxyl or amino groups on the surface of these polysaccharide-based NSs. Strong fluorescence with an excitation-dependent emission behaviour was observed under ultraviolet excitation. Interestingly, the photostability of the NSs was found to be superior to that of fluorescein and rhodamine B. The quantum yield of the starch NSs could reach 11.12% under excitation at 360 nm. Oxidative metal ions including Cu(II), Hg(II) and Fe(III) exhibited a quenching effect on the fluorescence intensity of the prepared NSs. Both kinds of multicoloured NSs showed a maximum fluorescence intensity at pH 7, while the fluorescence intensity decreased dramatically in either an acidic or a basic environment (at pH 3 or 11). A cytotoxicity study of the starch NSs showed low cytotoxicity, with 80% viability after 24 h incubation at concentrations below 10 mg/mL. The study also showed the possibility of using the multicoloured starch NSs for imaging mouse melanoma cells and guppy fish.
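    Quantum yields like the 11.12% reported above are usually measured by the relative (comparative) method against a reference fluorophore. A minimal sketch of that calculation follows; the measurement values are hypothetical, not taken from the paper.

```python
# Relative quantum yield, single-point comparative method:
# QY = QY_ref * (I / I_ref) * (A_ref / A) * (n^2 / n_ref^2),
# where I is the integrated emission intensity, A the absorbance at the
# excitation wavelength, and n the solvent refractive index.
def relative_qy(qy_ref, intensity, absorbance, intensity_ref, absorbance_ref,
                n=1.33, n_ref=1.33):
    return (qy_ref * (intensity / intensity_ref)
            * (absorbance_ref / absorbance) * (n ** 2 / n_ref ** 2))

# Hypothetical readings against quinine sulfate (QY ~0.54 in 0.1 M H2SO4),
# with sample and reference matched to the same absorbance.
qy = relative_qy(qy_ref=0.54, intensity=2.0e5, absorbance=0.05,
                 intensity_ref=9.7e5, absorbance_ref=0.05)
```

    Matching the absorbances of sample and reference (as in the example) cancels the inner-filter correction and leaves only the ratio of integrated emissions.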

  10. Methods for RNA Analysis

    DEFF Research Database (Denmark)

    Olivarius, Signe

    While increasing evidence appoints diverse types of RNA as key players in the regulatory networks underlying cellular differentiation and metabolism, the potential functions of thousands of conserved RNA structures encoded in mammalian genomes remain to be determined. Since the functions of most RNAs rely on interactions with proteins, the establishment of protein-binding profiles is essential for the characterization of RNAs. Aiming to facilitate RNA analysis, this thesis introduces proteomics- as well as transcriptomics-based methods for the functional characterization of RNA. First, RNA-protein pulldown combined with mass spectrometry analysis is applied for in vivo as well as in vitro identification of RNA-binding proteins, the latter succeeding in verifying known RNA-protein interactions. Secondly, acknowledging the significance of flexible promoter usage for the diversification…

  11. Theranostic liposomes loaded with quantum dots and apomorphine for brain targeting and bioimaging

    Directory of Open Access Journals (Sweden)

    Wen CJ

    2012-03-01

    Chih-Jen Wen1,*, Li-Wen Zhang2,*, Saleh A Al-Suwayeh3, Tzu-Chen Yen1, Jia-You Fang2,4 1Molecular Imaging Center, Chang Gung Memorial Hospital, Gueishan, Taoyuan, Taiwan; 2Pharmaceutics Laboratory, Graduate Institute of Natural Products, Chang Gung University, Gueishan, Taoyuan, Taiwan; 3Department of Pharmaceutics, College of Pharmacy, King Saud University, Riyadh, Saudi Arabia; 4Department of Cosmetic Science, Chang Gung University of Science and Technology, Gueishan, Taoyuan, Taiwan *These authors contributed equally to this work. Abstract: Quantum dots (QDs) and apomorphine were incorporated into liposomes to eliminate uptake by the liver and enhance brain targeting. We describe the preparation, physicochemical characterization, in vivo bioimaging, and brain endothelial cell uptake of the theranostic liposomes. The QDs and the drug were mainly located in the bilayer membrane and inner core of the liposomes, respectively. Spherical vesicles with a mean diameter of ~140 nm were formed. The QDs were completely encapsulated by the vesicles, and nearly 80% encapsulation was achieved for apomorphine. A greater fluorescence intensity was observed in mouse brains treated with the liposomes than with free QDs. This result was further confirmed by ex vivo imaging of the organs. QD uptake by the heart and liver was reduced by liposomal incorporation, while apomorphine accumulation in the brain increased 2.4-fold. According to a hyperspectral imaging analysis, the multifunctional liposomes, but not the aqueous solution, carried QDs into the brain. The liposomes were efficiently endocytosed into bEND3 cells; the mechanisms involved were clathrin- and caveola-mediated endocytosis, both energy-dependent. To the best of our knowledge, our group is the first to develop liposomes with a QD-drug hybrid for the aim of imaging and treating brain disorders. Keywords: liposomes, quantum dots, apomorphine

  12. Nanomaterials for bio-imaging and therapeutics

    Science.gov (United States)

    Liu, Yu-San

    2007-12-01

    In this thesis we studied the applications of colloidal nanocrystal quantum dots (QDs) in biomedical studies. We investigated the synthesis of QDs and report a relatively simple method for producing them. To produce QDs that are more stable and have higher fluorescence quantum efficiency than those produced by other methods (typically CdSe/ZnS core/shell structures), we developed a CdSe/ZnSe/ZnS (core/shell/shell) nanocrystal complex, capped with the small molecule mercaptoacetic acid (MAA) for aqueous solubilization and low toxicity. These MAA-capped QDs can be used as a visualization aid for a multi-functional probe combining the functions of viruses and carbon nanotubes (CNTs). A mild method of tagging virus with a polycationic solution, Polybrene, at 4°C was developed; this method preserves most viral infectivity. The probes can be used to induce a higher death rate in cells under near-infrared laser irradiation than in cells without them, and thus, after additional improvements, may find applications in the study of cancer therapy. The optical properties of MAA-capped QDs are pH dependent; in particular, the fluorescence intensity increases with the pH of the environment (between pH 4 and 10). The results open a new avenue to exploit QDs as nanoscale sensors for localized physical and chemical properties in cells.

  13. Communication Network Analysis Methods.

    Science.gov (United States)

    Farace, Richard V.; Mabee, Timothy

    This paper reviews a variety of analytic procedures that can be applied to network data, discussing the assumptions and usefulness of each procedure when applied to the complexity of human communication. Special attention is paid to the network properties measured or implied by each procedure. Factor analysis and multidimensional scaling are among…

  14. Aqueous synthesis of high bright and tunable near-infrared AgInSe2-ZnSe quantum dots for bioimaging.

    Science.gov (United States)

    Che, Dongchen; Zhu, Xiaoxu; Wang, Hongzhi; Duan, Yourong; Zhang, Qinghong; Li, Yaogang

    2016-02-01

    Efficient synthetic methods for near-infrared quantum dots with good biophysical properties as bioimaging agents are urgently required. In this work, a simple and fast synthesis of highly luminescent, near-infrared AgInSe2-ZnSe quantum dots (QDs) with tunable emissions in aqueous media is reported. This method avoids high temperature, high pressure and organic solvents, directly generating water-dispersible AgInSe2-ZnSe QDs. The photoluminescence emission peak of the AgInSe2-ZnSe QDs ranged from 625 to 940 nm, with quantum yields up to 31%. With their high quantum yield, near-infrared emission and low cytotoxicity, the AgInSe2-ZnSe QDs could serve as good cell labels, showing great potential for applications in bio-imaging.
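    The reported 625-940 nm emission range can be sanity-checked against photon energy with the standard conversion E(eV) ≈ 1239.84 / λ(nm), which places the tuning window between roughly 2.0 eV (deep red) and 1.3 eV (NIR):

```python
# Photon energy from emission wavelength: E = hc / lambda,
# with hc ~= 1239.84 eV*nm.
def photon_energy_ev(wavelength_nm):
    return 1239.84 / wavelength_nm

e_short = photon_energy_ev(625)  # short-wavelength end of the range, ~1.98 eV
e_long = photon_energy_ev(940)   # long-wavelength (NIR) end, ~1.32 eV
```

    The ~0.66 eV span reflects how strongly alloying with ZnSe shifts the effective band gap of the AgInSe2 core.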

  16. Complementing Gender Analysis Methods.

    Science.gov (United States)

    Kumar, Anant

    2016-01-01

    The existing gender analysis frameworks start with a premise that men and women are equal and should be treated equally. These frameworks give emphasis on equal distribution of resources between men and women and believe that this will bring equality which is not always true. Despite equal distribution of resources, women tend to suffer and experience discrimination in many areas of their lives such as the power to control resources within social relationships, and the need for emotional security and reproductive rights within interpersonal relationships. These frameworks believe that patriarchy as an institution plays an important role in women's oppression, exploitation, and it is a barrier in their empowerment and rights. Thus, some think that by ensuring equal distribution of resources and empowering women economically, institutions like patriarchy can be challenged. These frameworks are based on proposed equality principle which puts men and women in competing roles. Thus, the real equality will never be achieved. Contrary to the existing gender analysis frameworks, the Complementing Gender Analysis framework proposed by the author provides a new approach toward gender analysis which not only recognizes the role of economic empowerment and equal distribution of resources but suggests to incorporate the concept and role of social capital, equity, and doing gender in gender analysis which is based on perceived equity principle, putting men and women in complementing roles that may lead to equality. In this article the author reviews the mainstream gender theories in development from the viewpoint of the complementary roles of gender. This alternative view is argued based on existing literature and an anecdote of observations made by the author. While criticizing the equality theory, the author offers equity theory in resolving the gender conflict by using the concept of social and psychological capital.

  18. High resolution laser mass spectrometry bioimaging.

    Science.gov (United States)

    Murray, Kermit K; Seneviratne, Chinthaka A; Ghorai, Suman

    2016-07-15

    Mass spectrometry imaging (MSI) was introduced more than five decades ago with secondary ion mass spectrometry (SIMS) and a decade later with laser desorption/ionization (LDI) mass spectrometry (MS). Large-biomolecule imaging by matrix-assisted laser desorption/ionization (MALDI) was developed in the 1990s and ambient laser MS a decade ago. Although SIMS has been capable of imaging with a moderate mass range at sub-micrometer lateral resolution from its inception, laser MS requires additional effort to achieve a lateral resolution of 10 μm or below, which is required to image at the size scale of single mammalian cells. This review covers untargeted large-biomolecule MSI using lasers for desorption/ionization or laser desorption and post-ionization. These methods include laser microprobe (LDI) MSI, MALDI MSI, laser ambient and atmospheric pressure MSI, and near-field laser ablation MS. Novel approaches to improving lateral resolution are discussed, including oversampling, beam shaping, transmission geometry, reflective and through-hole objectives, microscope mode, and near-field optics.

  19. Engineering nanosilver as an antibacterial, biosensor and bioimaging material.

    Science.gov (United States)

    Sotiriou, Georgios A; Pratsinis, Sotiris E

    2011-10-01

    The capacity of nanosilver (Ag nanoparticles) to destroy infectious micro-organisms makes it one of the most powerful antimicrobial agents, an attractive feature against "super-bugs" resistant to antibiotics. Furthermore, its plasmonic properties facilitate its employment as a biosensor or bioimaging agent. Here, the interaction of nanosilver with biological systems including bacteria and mammalian cells is reviewed. The toxicity of nanosilver is discussed focusing on Ag(+) ion release in liquid solutions. Biomedical applications of nanosilver are also presented capitalizing on its antimicrobial and plasmonic properties and summarizing its advantages, limitations and challenges. Though a lot needs to be learned about the toxicity of nanosilver, enough is known to safely use it in a spectrum of applications with minimal impact to the environment and human health.

  20. Firm Analysis by Different Methods

    OpenAIRE

    Píbilová, Kateřina

    2012-01-01

    This Diploma Thesis deals with an analysis of the company made by selected methods. The external environment of the company is analysed using PESTLE analysis and Porter's five-factor model. The internal environment is analysed by means of the Kralicek Quick test and fundamental analysis. A SWOT analysis relates the opportunities and threats of the external environment to the strengths and weaknesses of the company. A proposal for improving the company's economic management is designed on the basis…

  1. Analysis of Precision of Activation Analysis Method

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Nørgaard, K.

    1973-01-01

    The precision of an activation-analysis method prescribes the estimation of the precision of a single analytical result. The adequacy of these estimates to account for the observed variation between duplicate results from the analysis of different samples and materials is tested by the statistic T…
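    The statistic T in such duplicate-analysis tests is commonly taken as the sum of squared differences between duplicate results, each weighted by its estimated variance, and compared against a chi-square distribution. The sketch below assumes that form (the abstract is truncated before defining T) and uses invented numbers.

```python
# Assumed form of the duplicate-consistency statistic:
# T = sum_i (x_i1 - x_i2)^2 / (s_i1^2 + s_i2^2),
# compared to a chi-square distribution with one degree of freedom per pair.
# A T far above the chi-square expectation suggests the a-priori precision
# estimates are too optimistic.
def duplicate_t(pairs):
    """pairs: iterable of (x1, s1, x2, s2) duplicate results with their
    estimated standard deviations."""
    return sum((x1 - x2) ** 2 / (s1 ** 2 + s2 ** 2)
               for x1, s1, x2, s2 in pairs)

# Two hypothetical duplicate determinations
pairs = [(10.2, 0.3, 9.8, 0.3), (5.1, 0.2, 5.0, 0.2)]
t_stat = duplicate_t(pairs)
```

    With two pairs, t_stat would be compared against a chi-square distribution with two degrees of freedom.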

  2. High-efficiency upconversion luminescent sensing and bioimaging of Hg(II) by chromophoric ruthenium complex-assembled nanophosphors.

    Science.gov (United States)

    Liu, Qian; Peng, Juanjuan; Sun, Lining; Li, Fuyou

    2011-10-25

    A chromophoric ruthenium complex-assembled nanophosphor (N719-UCNPs) was developed as a highly selective water-soluble probe for upconversion luminescence sensing and bioimaging of intracellular mercury ions. The prepared nanophosphors were characterized by X-ray powder diffraction (XRD), transmission electron microscopy (TEM), energy-dispersive X-ray analysis (EDXA), Fourier transform infrared spectroscopy (FTIR), and X-ray photoelectron spectroscopy (XPS). The application of N719-UCNPs to sensing Hg(2+) was confirmed by optical titration experiments and upconversion luminescence live-cell imaging. Using the ratiometric upconversion luminescence as the detection signal, the detection limit of this nanoprobe for Hg(2+) in water was down to 1.95 ppb, lower than the maximum level (2 ppb) of Hg(2+) in drinking water set by the United States EPA. Importantly, the nanoprobe N719-UCNPs was shown to be capable of monitoring changes in the distribution of Hg(2+) in living cells by upconversion luminescence bioimaging.
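    A detection limit like the 1.95 ppb quoted above is conventionally derived as three standard deviations of the blank signal divided by the calibration slope. The sketch below uses hypothetical ratiometric readings, none taken from the paper.

```python
import numpy as np

# Hypothetical ratiometric upconversion-luminescence calibration for Hg2+
conc = np.array([0.0, 2.0, 4.0, 8.0])           # standards (ppb)
ratio = np.array([0.100, 0.104, 0.108, 0.116])  # luminescence ratio signal
slope, intercept = np.polyfit(conc, ratio, 1)

# Repeated blank measurements of the same ratiometric signal
blank = np.array([0.100, 0.102, 0.098, 0.101, 0.099])

# 3-sigma detection limit: LOD = 3 * s(blank) / slope
lod_ppb = 3 * blank.std(ddof=1) / slope
```

    The ratiometric readout matters here: taking the ratio of two emission bands cancels fluctuations in excitation power that would otherwise inflate the blank standard deviation.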

  3. G-quadruplex enhanced fluorescence of DNA-silver nanoclusters and their application in bioimaging

    Science.gov (United States)

    Zhu, Jinbo; Zhang, Libing; Teng, Ye; Lou, Baohua; Jia, Xiaofang; Gu, Xiaoxiao; Wang, Erkang

    2015-07-01

    Guanine proximity based fluorescence-enhanced DNA-templated silver nanoclusters (AgNCs) have been reported and applied for bioanalysis. Herein, we studied the G-quadruplex-enhanced fluorescence of DNA-AgNCs and reached several significant conclusions, which will be helpful for the design of future probes. Our results demonstrate that a G-quadruplex can also effectively stimulate the fluorescence potential of AgNCs. The major contribution of the G-quadruplex is to provide guanine bases, and its special structure has no measurable impact. The DNA-templated AgNCs were further analysed by native polyacrylamide gel electrophoresis, and the guanine proximity enhancement mechanism could be visually verified by this method. Moreover, the fluorescence emission of C3A (CCCA)4-stabilized AgNCs was found to be easily and effectively enhanced by G-quadruplexes such as T30695, AS1411 and TBA, especially AS1411. Benefiting from the high brightness of AS1411-enhanced DNA-AgNCs and the specific binding affinity of AS1411 for nucleolin, the AS1411-enhanced AgNCs can stain cancer cells for bioimaging.

  4. Nanoemulsion-templated polyelectrolyte multifunctional nanocapsules for DNA entrapment and bioimaging.

    Science.gov (United States)

    Bazylińska, Urszula; Saczko, Jolanta

    2016-01-01

    The emerging field of bionanotechnology aims at advancing colloidal and biomedical research via the introduction of multifunctional nanoparticle-based containers intended for both gene therapy and bioimaging. In the present contribution we entrapped model genetic material (herring testes DNA) in newly designed non-viral vectors, i.e., multifunctional nanocapsules obtained by layer-by-layer (LbL) adsorption of DNA and the oppositely charged polysaccharide chitosan (CHIT) on a nanoemulsion core, loaded with IR-780 indocyanine (used as the fluorescent marker) and stabilized by gemini-type ammonium salts: N,N,N',N'-tetramethyl-N,N'-di(dodecyl)-ethylenediammonium bromide, d(DDA)PBr, and N,N,N',N'-tetramethyl-N,N'-di(dodecyl)-butylenediammonium bromide, d(DDA)BBr. Ternary phase diagrams of the surfactant-oil-water systems were determined by the titration method. The stability of the nanoemulsions obtained with IR-780 solubilized in the oleic acid (OA) or isopropyl myristate (IPM) phase was then evaluated by backscattering (BS) profiles and ζ-potential measurements. In the next step, CHIT and DNA layers were subsequently deposited on the kinetically stable nanoemulsion cores. The IR-780-loaded nanocarriers covered by (DNA/CHIT)4 bilayers showed a high ζ-potential (about +43 mV, by Doppler electrophoresis), a size <120 nm and a spherical shape, as analyzed by dynamic light scattering (DLS), atomic force microscopy (AFM) and scanning electron microscopy (SEM). Finally, the long-lasting nanosystems were subjected to in vitro biological studies on human cancer cell lines: doxorubicin-sensitive breast (MCF-7/WT), epithelial lung adenocarcinoma (A549) and skin melanoma (MEWO). The biological response of the cell cultures was expressed as cytotoxic activity evaluated by an MTT-based proliferation assay, as well as by bioimaging of the intracellular localization of IR-780 molecules loaded in the multilayer DNA-deposited nanocontainers, provided by confocal laser scanning microscopy.

  5. Organising multi-dimensional biological image information: the BioImage Database.

    Science.gov (United States)

    Carazo, J M; Stelzer, E H; Engel, A; Fita, I; Henn, C; Machtynger, J; McNeil, P; Shotton, D M; Chagoyen, M; de Alarcón, P A; Fritsch, R; Heymann, J B; Kalko, S; Pittet, J J; Rodriguez-Tomé, P; Boudier, T

    1999-01-01

    Nowadays it is possible to unravel complex information at all levels of cellular organization by obtaining multi-dimensional image information. At the macromolecular level, three-dimensional (3D) electron microscopy, together with other techniques, is able to reach resolutions at the nanometer or subnanometer level. The information is delivered in the form of 3D volumes containing samples of a given function, for example, the electron density distribution within a given macromolecule. The same situation happens at the cellular level with the new forms of light microscopy, particularly confocal microscopy, all of which produce biological 3D volume information. Furthermore, it is possible to record sequences of images over time (videos), as well as sequences of volumes, bringing key information on the dynamics of living biological systems. It is in this context that work on BioImage started two years ago, and that its first version is now presented here. In essence, BioImage is a database specifically designed to contain multi-dimensional images, perform queries and interactively work with the resulting multi-dimensional information on the World Wide Web, as well as accomplish the required cross-database links. Two sister home pages of BioImage can be accessed at http://www.bioimage.org and http://www-embl.bioimage.org

  6. Development of functional gold nanorods for bioimaging and photothermal therapy

    International Nuclear Information System (INIS)

    Gold nanorods have a strong surface plasmon band in the near-infrared region and are used as photothermal converters. Since near-infrared light penetrates deeply into tissues, gold nanorods have been expected to serve as contrast agents for near-infrared bioimaging, photosensitizers for photothermal therapy, and functional devices for drug delivery systems responding to near-infrared irradiation. In this study, the surface plasmon bands of intravenously injected gold nanorods were monitored in the mouse abdomen using a spectrophotometer equipped with an integrating sphere, and the pharmacokinetic parameters of the gold nanorods after intravenous injection were determined. Next, PEG-modified gold nanorods were injected directly into subcutaneous tumors in mice, and the tumors were irradiated with near-infrared pulsed laser light. Significant tumor damage and suppression of tumor growth were observed. We also constructed a targeted delivery system for the gold nanorods by modifying them with a thermo-responsive polymer and a peptide responding to protease activity. These modified gold nanorods are expected to serve as functional nanodevices for photothermal therapy and drug delivery.

  7. Development of functional gold nanorods for bioimaging and photothermal therapy

    Energy Technology Data Exchange (ETDEWEB)

    Niidome, T, E-mail: niidome.takuro.655@m.kyushu-u.ac.j [Faculty of Engineering, Kyushu University, Fukuoka 819-0395 (Japan) and Center for Future Chemistry, Kyushu University, Fukuoka 819-0395 (Japan) and PRESTO, Japan Science and Technology Agency, Kawaguchi 332-0012 (Japan)

    2010-06-01

    Gold nanorods have a strong surface plasmon band in the near-infrared region and are used as photothermal converters. Since near-infrared light penetrates deeply into tissues, gold nanorods have been expected to serve as contrast agents for near-infrared bioimaging, photosensitizers for photothermal therapy, and functional devices for drug delivery systems responding to near-infrared irradiation. In this study, the surface plasmon bands of intravenously injected gold nanorods were monitored in the mouse abdomen using a spectrophotometer equipped with an integrating sphere, and the pharmacokinetic parameters of the gold nanorods after intravenous injection were determined. Next, PEG-modified gold nanorods were injected directly into subcutaneous tumors in mice, and the tumors were irradiated with near-infrared pulsed laser light. Significant tumor damage and suppression of tumor growth were observed. We also constructed a targeted delivery system for the gold nanorods by modifying them with a thermo-responsive polymer and a peptide responding to protease activity. These modified gold nanorods are expected to serve as functional nanodevices for photothermal therapy and drug delivery.

  8. Tunable Fabrication of Molybdenum Disulfide Quantum Dots for Intracellular MicroRNA Detection and Multiphoton Bioimaging.

    Science.gov (United States)

    Dai, Wenhao; Dong, Haifeng; Fugetsu, Bunshi; Cao, Yu; Lu, Huiting; Ma, Xinlei; Zhang, Xueji

    2015-09-01

    The controllable synthesis of small molybdenum disulfide (MoS2) quantum dots (QDs) has not been investigated in great detail. Here, a facile and efficient approach for the synthesis of controllable-size MoS2 QDs with excellent photoluminescence (PL), using a sulfuric acid-assisted ultrasonic route, is developed for this investigation. Various MoS2 structures, including monolayer MoS2 flakes, nanoporous MoS2, and MoS2 QDs, can be obtained simply by controlling the ultrasonication duration. Comprehensive microscopic and spectroscopic characterization demonstrates that the MoS2 QDs have uniform lateral size and exhibit excellent excitation-independent blue PL. The as-generated MoS2 QDs show a high quantum yield of 9.65%, a long fluorescence lifetime of 4.66 ns, and good fluorescence stability over a broad pH range from 4 to 10. Given their good intrinsic optical properties and large surface area, combined with excellent physiological stability and biocompatibility, a MoS2 QD-based intracellular microRNA imaging analysis system is successfully constructed. Importantly, the MoS2 QDs perform well as multiphoton bioimaging labels. The proposed synthesis strategy paves a new way for the facile and efficient preparation of MoS2 QDs with tunable size for biomedical imaging and optoelectronic device applications. PMID:26033986
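A quantum yield like the 9.65% quoted above is typically obtained by the relative (reference-dye) method. As a rough illustration (not the authors' actual procedure, and with invented numbers), the standard comparative formula can be computed as:

```python
# Relative quantum-yield estimate:
#   QY = QY_ref * (I / I_ref) * (A_ref / A) * (n / n_ref)**2
# where I is the integrated emission intensity, A the absorbance at the
# excitation wavelength, and n the solvent refractive index.

def relative_quantum_yield(qy_ref, integral, integral_ref,
                           absorbance, absorbance_ref,
                           n=1.333, n_ref=1.333):
    """Comparative QY against a reference fluorophore (illustrative only)."""
    return (qy_ref * (integral / integral_ref)
            * (absorbance_ref / absorbance) * (n / n_ref) ** 2)

if __name__ == "__main__":
    # Hypothetical sample measured against a quinine-sulfate-like
    # reference (QY_ref = 0.54); all numbers are made up.
    qy = relative_quantum_yield(0.54, integral=8.0e5, integral_ref=3.1e6,
                                absorbance=0.05, absorbance_ref=0.036)
    print(f"{qy:.3f}")
```

All inputs here (integrals, absorbances, reference dye) are assumptions for demonstration; only the formula itself is standard.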

  9. SWOT ANALYSIS ON SAMPLING METHOD

    Directory of Open Access Journals (Sweden)

    CHIS ANCA OANA

    2014-07-01

    Full Text Available Audit sampling involves the application of audit procedures to less than 100% of items within an account balance or class of transactions. Our article aims to study audit sampling in the audit of financial statements. As an audit technique largely used, in both its statistical and nonstatistical forms, the method is very important for auditors. It should be applied correctly to give a fair view of the financial statements and to satisfy the needs of all financial users. In order to be applied correctly, the method must be understood by all its users and mainly by auditors. Otherwise, the risk of applying it incorrectly would cause loss of reputation and discredit, litigation and even prison. Since there is no unitary practice and methodology for applying the technique, the risk of applying it incorrectly is quite high. SWOT analysis is a technique that shows advantages, disadvantages, threats and opportunities. We applied SWOT analysis to the sampling method from the perspective of three players: the audit company, the audited entity and the users of financial statements. The study shows that by applying the sampling method both the audit company and the audited entity save time, effort and money. The disadvantages of the method are the difficulty of applying it and of understanding its insights. Being largely used as an audit method, and being a factor in a correct audit opinion, the sampling method's advantages, disadvantages, threats and opportunities must be understood by auditors.

  10. Multifunctional nanocomposite based on halloysite nanotubes for efficient luminescent bioimaging and magnetic resonance imaging

    Science.gov (United States)

    Zhou, Tao; Jia, Lei; Luo, Yi-Feng; Xu, Jun; Chen, Ru-Hua; Ge, Zhi-Jun; Ma, Tie-Liang; Chen, Hong; Zhu, Tao-Feng

    2016-01-01

    A novel multifunctional halloysite nanotube (HNT)-based Fe3O4@HNT-polyethyleneimine-Tip-Eu(dibenzoylmethane)3 nanocomposite (Fe-HNT-Eu NC) with both photoluminescent and magnetic properties was fabricated by a simple one-step hydrothermal process combined with a coupling-grafting method; it exhibited high suspension stability and excellent photophysical behavior. The as-prepared multifunctional Fe-HNT-Eu NC was characterized using various techniques. The results of cell viability assays, cell morphological observation, and in vivo toxicity assays indicated that the NC exhibited excellent biocompatibility over the studied concentration range, suggesting that the obtained Fe-HNT-Eu NC is a suitable material for bioimaging and biological applications in human hepatic adenocarcinoma cells. Furthermore, the biocompatible Fe-HNT-Eu NC displayed superparamagnetic behavior with high saturation magnetization and also functioned as a magnetic resonance imaging (MRI) contrast agent in vitro and in vivo. The MRI tests indicated that the Fe-HNT-Eu NC can significantly decrease the T2 signal intensity of normal liver tissue and thus make the boundary between the normal liver and the transplanted cancer more distinct, effectively improving the diagnosis of cancers. PMID:27698562

  11. Oleyl-hyaluronan micelles loaded with upconverting nanoparticles for bio-imaging

    Energy Technology Data Exchange (ETDEWEB)

    Pospisilova, Martina, E-mail: martina.pospisilova@contipro.com; Mrazek, Jiri; Matuska, Vit; Kettou, Sofiane; Dusikova, Monika; Svozil, Vit; Nesporova, Kristina; Huerta-Angeles, Gloria; Vagnerova, Hana; Velebny, Vladimir [Contipro Biotech (Czech Republic)

    2015-09-15

    Hyaluronan (HA) represents an interesting polymer for nanoparticle coating due to its biocompatibility and enhanced cell interaction via the CD44 receptor. Here, we describe the incorporation of oleate-capped β–NaYF{sub 4}:Yb{sup 3+}, Er{sup 3+} nanoparticles (UCNP-OA) into amphiphilic HA by the microemulsion method. The resulting structures have a spherical, micelle-like appearance with a hydrodynamic diameter of 180 nm. UCNP-OA-loaded HA micelles show good stability in PBS buffer and cell culture media. The intensity of the green emission of UCNP-OA-loaded HA micelles in water is about five times higher than that of ligand-free UCNP, indicating that amphiphilic HA effectively protects UCNP luminescence from quenching by water molecules. We found that UCNP-OA-loaded HA micelles at concentrations up to 50 μg mL{sup −1} increase the cell viability of normal human dermal fibroblasts (NHDF), while the viability of human breast adenocarcinoma MDA–MB–231 cells is reduced at these concentrations. The utility of UCNP-OA-loaded HA micelles as a bio-imaging probe was demonstrated in vitro by successful labelling of NHDF and MDA–MB–231 cells overexpressing the CD44 receptor.

  12. Laser-synthesized oxide-passivated bright Si quantum dots for bioimaging

    Science.gov (United States)

    Gongalsky, M. B.; Osminkina, L. A.; Pereira, A.; Manankov, A. A.; Fedorenko, A. A.; Vasiliev, A. N.; Solovyev, V. V.; Kudryavtsev, A. A.; Sentis, M.; Kabashin, A. V.; Timoshenko, V. Yu.

    2016-04-01

    Crystalline silicon (Si) nanoparticles present an extremely promising object for bioimaging based on photoluminescence (PL) in the visible and near-infrared spectral regions, but their efficient PL emission in aqueous suspension is typically observed only after wet chemistry procedures leading to residual toxicity issues. Here, we introduce ultrapure laser-synthesized Si-based quantum dots (QDs), which are water-dispersible and exhibit bright exciton PL in the window of relative tissue transparency near 800 nm. Based on the laser ablation of crystalline Si targets in gaseous helium, followed by ultrasound-assisted dispersion of the deposited films in physiological saline, the proposed method avoids any toxic by-products during the synthesis. We demonstrate efficient contrast of the Si QDs in living cells by following the exciton PL. We also show that the prepared QDs do not provoke any cytotoxicity effects while penetrating into the cells and efficiently accumulating near the cell membrane and in the cytoplasm. Combined with the possibility of enabling parallel therapeutic channels, ultrapure laser-synthesized Si nanostructures present a unique object for cancer theranostic applications.

  13. Statistical methods for bioimpedance analysis

    Directory of Open Access Journals (Sweden)

    Christian Tronstad

    2014-04-01

    Full Text Available This paper gives a basic overview of relevant statistical methods for the analysis of bioimpedance measurements, with an aim to answer questions such as: How do I begin with planning an experiment? How many measurements do I need to take? How do I deal with large amounts of frequency sweep data? Which statistical test should I use, and how do I validate my results? Beginning with the hypothesis and the research design, the methodological framework for making inferences based on measurements and statistical analysis is explained. This is followed by a brief discussion on correlated measurements and data reduction before an overview is given of statistical methods for comparison of groups, factor analysis, association, regression and prediction, explained in the context of bioimpedance research. The last chapter is dedicated to the validation of a new method by different measures of performance. A flowchart is presented for selection of statistical method, and a table is given for an overview of the most important terms of performance when evaluating new measurement technology.
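The data-reduction step for large frequency-sweep datasets that this overview mentions is commonly handled with principal component analysis; a minimal NumPy sketch on synthetic sweep data (invented numbers, not taken from the paper) might look like:

```python
import numpy as np

# Each row is one frequency sweep (one measurement), each column the
# impedance magnitude at one frequency. PCA via SVD of the mean-centered
# matrix reduces the highly correlated frequency points to a few
# uncorrelated component scores per sweep.
rng = np.random.default_rng(0)
freqs = np.logspace(3, 6, 100)                   # 1 kHz .. 1 MHz
base = 500.0 / (1.0 + (freqs / 5.0e4) ** 2)      # toy dispersion curve
amp = rng.normal(1.0, 0.1, size=(30, 1))         # subject-to-subject variation
X = amp * base + rng.normal(scale=2.0, size=(30, 100))

Xc = X - X.mean(axis=0)                          # mean-center each frequency
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)                  # variance share per component
scores = Xc @ Vt[:2].T                           # keep the first two components

print(scores.shape)                              # 30 sweeps -> 2 numbers each
```

The first component here captures the dominant subject-to-subject amplitude variation, so each 100-point sweep is summarized by two scores with little information loss.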

  14. Assessment of Thorium Analysis Methods

    International Nuclear Information System (INIS)

    An assessment of thorium analytical methods for mixed power fuel, consisting of titrimetry, X-ray fluorescence spectrometry, UV-VIS spectrometry, alpha spectrometry, emission spectrography, polarography, chromatography (HPLC) and neutron activation, was carried out. It can be concluded that the analytical methods with high accuracy (standard deviation < 3%) were titrimetry, neutron activation analysis and UV-VIS spectrometry, whereas the methods with low accuracy (standard deviation 3-10%) were alpha spectrometry and emission spectrography. Ore samples can be analyzed by X-ray fluorescence spectrometry, neutron activation analysis, UV-VIS spectrometry, emission spectrography, chromatography and alpha spectrometry. Concentrated samples can be analyzed by X-ray fluorescence spectrometry; simulated samples can be analyzed by titrimetry, polarography and UV-VIS spectrometry; and samples with thorium as a minor constituent can be analyzed by neutron activation analysis and alpha spectrometry. Thorium purity (impurity elements in thorium samples) can be analyzed by emission spectrography. Considering interference aspects, in general analytical methods without molecular reactions are better than those involving molecular reactions (author). 19 refs., 1 tab.

  15. Method of photon spectral analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gehrke, Robert J. (Idaho Falls, ID); Putnam, Marie H. (Idaho Falls, ID); Killian, E. Wayne (Idaho Falls, ID); Helmer, Richard G. (Idaho Falls, ID); Kynaston, Ronnie L. (Blackfoot, ID); Goodwin, Scott G. (Idaho Falls, ID); Johnson, Larry O. (Pocatello, ID)

    1993-01-01

    A spectroscopic method to rapidly measure the presence of plutonium in soils, filters, smears, and glass waste forms by measuring the uranium L-shell x-ray emissions associated with the decay of plutonium. In addition, the technique can simultaneously acquire spectra of samples and automatically analyze them for the amount of americium and .gamma.-ray emitting activation and fission products present. The samples are counted with a large-area, thin-window, n-type germanium spectrometer which is equally efficient for the detection of low-energy x-rays (10-2000 keV) and high-energy .gamma. rays (>1 MeV). An 8192- or 16,384-channel analyzer is used to acquire the entire photon spectrum at one time. A dual-energy, time-tagged pulser is injected into the test input of the preamplifier to monitor the energy scale and detector resolution. The L x-ray portion of each spectrum is analyzed by a linear-least-squares spectral fitting technique. The .gamma.-ray portion of each spectrum is analyzed by a standard Ge .gamma.-ray analysis program. This method can be applied to any analysis involving x- and .gamma.-ray analysis in one spectrum and is especially useful when interferences in the x-ray region can be identified from the .gamma.-ray analysis and accommodated during the x-ray analysis.
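The linear-least-squares spectral fitting step described above can be sketched generically: model the observed spectrum as a linear combination of known component shapes and solve for the amplitudes. The line shapes and counts below are invented stand-ins, not the actual analysis code:

```python
import numpy as np

# Model an observed spectrum as a linear combination of known component
# shapes (e.g., x-ray line profiles plus a background) and solve for the
# component amplitudes by linear least squares.
channels = np.arange(200)

def gaussian(center, width):
    """Unit-height Gaussian line profile over the channel axis."""
    return np.exp(-0.5 * ((channels - center) / width) ** 2)

# Design matrix: two invented line shapes plus a flat background term.
A = np.column_stack([gaussian(60, 4.0),
                     gaussian(95, 5.0),
                     np.ones_like(channels, dtype=float)])

true_amplitudes = np.array([1200.0, 800.0, 50.0])
spectrum = A @ true_amplitudes          # noise-free synthetic spectrum

fitted, residuals, rank, sv = np.linalg.lstsq(A, spectrum, rcond=None)
print(np.round(fitted, 1))              # recovers [1200.  800.   50.]
```

With real counting data the fit would be weighted by the Poisson uncertainty of each channel; this sketch omits that for brevity.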

  16. Method of photon spectral analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gehrke, R.J.; Putnam, M.H.; Killian, E.W.; Helmer, R.G.; Kynaston, R.L.; Goodwin, S.G.; Johnson, L.O.

    1993-04-27

    A spectroscopic method to rapidly measure the presence of plutonium in soils, filters, smears, and glass waste forms by measuring the uranium L-shell x-ray emissions associated with the decay of plutonium. In addition, the technique can simultaneously acquire spectra of samples and automatically analyze them for the amount of americium and [gamma]-ray emitting activation and fission products present. The samples are counted with a large-area, thin-window, n-type germanium spectrometer which is equally efficient for the detection of low-energy x-rays (10-2,000 keV) and high-energy [gamma] rays (>1 MeV). An 8,192- or 16,384-channel analyzer is used to acquire the entire photon spectrum at one time. A dual-energy, time-tagged pulser is injected into the test input of the preamplifier to monitor the energy scale and detector resolution. The L x-ray portion of each spectrum is analyzed by a linear-least-squares spectral fitting technique. The [gamma]-ray portion of each spectrum is analyzed by a standard Ge [gamma]-ray analysis program. This method can be applied to any analysis involving x- and [gamma]-ray analysis in one spectrum and is especially useful when interferences in the x-ray region can be identified from the [gamma]-ray analysis and accommodated during the x-ray analysis.

  17. Statistical methods for bioimpedance analysis

    OpenAIRE

    Christian Tronstad; Are Hugo Pripp

    2014-01-01

    This paper gives a basic overview of relevant statistical methods for the analysis of bioimpedance measurements, with an aim to answer questions such as: How do I begin with planning an experiment? How many measurements do I need to take? How do I deal with large amounts of frequency sweep data? Which statistical test should I use, and how do I validate my results? Beginning with the hypothesis and the research design, the methodological framework for making inferences based on measurements a...

  18. Nonlinear programming analysis and methods

    CERN Document Server

    Avriel, Mordecai

    2003-01-01

    Comprehensive and complete, this overview provides a single-volume treatment of key algorithms and theories. The author provides clear explanations of all theoretical aspects, with rigorous proof of most results. The two-part treatment begins with the derivation of optimality conditions and discussions of convex programming, duality, generalized convexity, and analysis of selected nonlinear programs. The second part concerns techniques for numerical solutions and unconstrained optimization methods, and it presents commonly used algorithms for constrained nonlinear optimization problems. This g

  19. Metal and Complementary Molecular Bioimaging in Alzheimer’s Disease

    Directory of Open Access Journals (Sweden)

    Nady Braidy

    2014-07-01

    Full Text Available Alzheimer’s disease (AD) is the leading cause of dementia in the elderly. AD represents a complex neurological disorder which is best understood as the consequence of a number of interconnected genetic and lifestyle variables that culminate in multiple changes to brain structure and function. At a molecular level, metal dyshomeostasis is frequently observed in AD due to anomalous binding of metals such as iron (Fe), copper (Cu) and zinc (Zn), or impaired regulation of redox-active metals, which can induce the formation of cytotoxic reactive oxygen species and neuronal damage. Neuroimaging of metals in a variety of intact brain cells and tissues is emerging as an important tool for increasing our understanding of the role of metal dysregulation in AD. Several imaging techniques have been used to study the cerebral metallo-architecture in biological specimens to obtain spatially resolved data on the chemical elements present in a sample. Hyperspectral techniques, such as particle-induced X-ray emission (PIXE), energy-dispersive X-ray spectroscopy (EDS), X-ray fluorescence microscopy (XFM), synchrotron X-ray fluorescence (SXRF), secondary ion mass spectrometry (SIMS), and laser ablation inductively coupled plasma mass spectrometry (LA-ICPMS), can reveal relative intensities and even semi-quantitative concentrations of a large set of elements with differing spatial resolutions and detection sensitivities. Other mass spectrometric and spectroscopic imaging techniques, such as laser ablation electrospray ionisation mass spectrometry (LA ESI-MS), MALDI imaging mass spectrometry (MALDI-IMS), and Fourier transform infrared spectroscopy (FTIR), can be used to correlate changes in elemental distribution with the underlying pathology in AD brain specimens. The current review aims to discuss the advantages and challenges of using these emerging elemental and molecular imaging techniques, and highlights clinical achievements in AD research using bioimaging techniques.

  20. Gait analysis methods in rehabilitation

    Directory of Open Access Journals (Sweden)

    Baker Richard

    2006-03-01

    Full Text Available Abstract Introduction Brand's four reasons for clinical tests and his analysis of the characteristics of valid biomechanical tests for use in orthopaedics are taken as a basis for determining what methodologies are required for gait analysis in a clinical rehabilitation context. Measurement methods in clinical gait analysis The state of the art of optical systems capable of measuring the positions of retro-reflective markers placed on the skin is sufficiently advanced that they are probably no longer a significant source of error in clinical gait analysis. Determining the anthropometry of the subject and compensating for soft tissue movement in relation to the underlying bones are now the principal problems. Techniques for using functional tests to determine joint centres and axes of rotation are starting to be used successfully. Probably the last great challenge for optical systems is in using computational techniques to compensate for soft tissue movement. In the long-term future it is possible that direct imaging of bones and joints in three dimensions (using MRI or fluoroscopy) may replace marker-based systems. Methods for interpreting gait analysis data There is still no accepted general theory of why we walk the way we do. In the absence of this, many explanations of walking address the mechanisms by which specific movements are achieved by particular muscles. A whole new methodology is developing to determine the functions of individual muscles. This needs further development and validation. A particular requirement is for subject-specific models incorporating 3-dimensional imaging data of the musculo-skeletal anatomy with kinematic and kinetic data. Methods for understanding the effects of intervention Clinical gait analysis is extremely limited if it does not allow clinicians to choose between alternative possible interventions or to predict outcomes. This can be achieved either by rigorously planned clinical trials or using

  1. Data Analysis Methods for Paleogenomics

    DEFF Research Database (Denmark)

    Avila Arcos, Maria del Carmen

    The thesis comprises five chapters, all of which represent different projects that involved the analysis of massive amounts... of sequence data, generated using next-generation sequencing (NGS) technologies, from either forensic (Chapter 1) or ancient (Chapters 2-5) materials. These chapters present projects very different in nature, reflecting the diversity of questions that have become possible to address in the ancient DNA field..., thanks to the introduction of NGS and the implementation of data analysis methods specific for each project. Chapters 1 to 3 have been published in peer-reviewed journals and Chapter 4 is currently in review. Chapter 5 consists of a manuscript describing initial results of an ongoing research project... (Danmarks Grundforskningfond) 'Centre of Excellence in GeoGenetics' grant, with additional funding provided by the Danish Council for Independent Research 'Sapere Aude' programme.

  2. One-step fabrication of nitrogen-doped fluorescent nanoparticles from non-conjugated natural products and their temperature-sensing and bioimaging applications

    Directory of Open Access Journals (Sweden)

    Xiaoling Zeng

    2015-03-01

    Full Text Available A facile solvothermal method was used to prepare N-doped fluorescent nanoparticles (NFNPs) at the gram scale from tartaric acid/citric acid/ethylenediamine using oleic acid as the reaction medium. The quantum yield of the obtained fluorescent nanoparticles reached 48.7%. The NFNPs were characterized by multiple analytical techniques. In combination with circular dichroism (CD) spectra, the structure and the origin of the photoluminescence of the NFNPs are discussed. The fluorescence intensity of the obtained NFNPs showed remarkable stability and exhibited a reversible temperature-dependent enhancement/quenching. The products, with low cytotoxicity, could be introduced into target cells for in vitro bioimaging.

  3. Recent developments in gold(I) coordination chemistry: luminescence properties and bioimaging opportunities.

    Science.gov (United States)

    Langdon-Jones, Emily E; Pope, Simon J A

    2014-09-18

    The fascinating biological activity of gold coordination compounds has led to the development of a wide range of complexes. The precise biological action of such species is often poorly understood and the ability to map gold distribution in cellular environments is key. This article discusses the recent progress in luminescent Au(I) complexes whilst considering their utility in bioimaging and therapeutics.

  4. From Science History and Applications Developments of Life System to Bio-Imaging Technology

    Institute of Scientific and Technical Information of China (English)

    YAN Li-min; LOU Wei; HE Guo-sen

    2004-01-01

    This paper presents a brief history and application developments of imaging technology for life systems, and discusses bio-imaging technology. A realistic example of real-time image measurement and parallel processing is given. Finally, the emerging converging technology of nano-bio-info-cogno (NBIC) is discussed as a future trend.

  5. Multiscale 3D bioimaging: from cell, tissue to whole organism

    Science.gov (United States)

    Lau, S. H.; Wang, Ge; Chandrasekeran, Margam; Fan, Victor; Nazrul, Mohd; Chang, Hauyee; Fong, Tiffany; Gelb, Jeff; Feser, Michael; Yun, Wenbing

    2009-05-01

    While electron microscopes and AFMs are capable of high-resolution imaging down to molecular levels, there is an ongoing problem in integrating these results into the larger-scale structure and function of tissues and organs within a complex organism. Imaging biological samples with optical microscopy is predominantly done with histology and immunohistochemistry, which can take up to several weeks of preparation, are artifact-prone and are only available as individual 2D images. At the nano resolution scale, higher-resolution electron microscopy and AFM are used, but again these require destructive sample preparation and the data are 2D. To bridge this gap, we describe a rapid, non-invasive, hierarchical bioimaging technique using a novel lab-based x-ray computed tomography system to characterize complex biological organisms at multiple scales: from whole organs (mesoscale), to calcified and soft tissue (microscale), to subcellular structures, nanomaterials and cell-scaffold interactions (nanoscale). While microCT (micro x-ray computed tomography) is gaining in popularity for non-invasive bone and tissue imaging, contrast and resolution are still vastly inadequate compared to histology. In this study we present multiscale results from a novel microCT and a nanoCT (nano x-ray tomography) system. The novel microCT can image large specimens and tissue samples at histology-like, submicron voxel resolution, often without contrast agents, while the nanoCT, using x-ray optics similar to those used in synchrotron radiation facilities, has 20 nm voxel resolution, suitable for studying cellular and subcellular morphology and nanomaterials. Multiscale examples involving both calcified and soft tissue are illustrated, including imaging a rat tibia down to the individual channels of osteocyte canaliculi and lacunae, and an unstained whole murine lung down to its alveoli. The role of the novel CT is also discussed as a possible means for rapid virtual histology using a biopsy of a human

  6. Alexa fluor-labeled fluorescent cellulose nanocrystals for bioimaging solid cellulose in spatially structured microenvironments.

    Science.gov (United States)

    Grate, Jay W; Mo, Kai-For; Shin, Yongsoon; Vasdekis, Andreas; Warner, Marvin G; Kelly, Ryan T; Orr, Galya; Hu, Dehong; Dehoff, Karl J; Brockman, Fred J; Wilkins, Michael J

    2015-03-18

    Methods to covalently conjugate Alexa Fluor dyes to cellulose nanocrystals, at limiting amounts that retain the overall structure of the nanocrystals as model cellulose materials, were developed using two approaches. In the first, aldehyde groups are created on the cellulose surfaces by reaction with limiting amounts of sodium periodate, a reaction well known for oxidizing vicinal diols to create dialdehyde structures. Reductive amination reactions were then applied to bind Alexa Fluor dyes with terminal amino groups on the linker section. In the absence of the reductive step, dye washes out of the nanocrystal suspension, whereas with the reductive step, a colored product is obtained with the characteristic spectral bands of the conjugated dye. In the second approach, Alexa Fluor dyes were modified to contain a chloro-substituted triazine ring at the end of the linker section. These modified dyes were then reacted with cellulose nanocrystals in acetonitrile at elevated temperature, again isolating material with the characteristic spectral bands of the Alexa Fluor dye. Reactions with Alexa Fluor 546 are given as detailed examples, labeling on the order of 1% of the total glucopyranose rings of the cellulose nanocrystals at dye loadings of ca. 5 μg/mg cellulose. Fluorescent cellulose nanocrystals were deposited in pore-network microfluidic structures (PDMS), and proof-of-principle bioimaging experiments showed that the spatial localization of the solid cellulose deposits could be determined and their disappearance under the action of Celluclast enzymes or microbes observed over time. In addition, single-molecule fluorescence microscopy was demonstrated as a method to follow the disappearance of solid cellulose deposits over time, following the decrease in the number of single blinking dye molecules with time rather than the fluorescence intensity.

  7. Computational methods for global/local analysis

    Science.gov (United States)

    Ransom, Jonathan B.; Mccleary, Susan L.; Aminpour, Mohammad A.; Knight, Norman F., Jr.

    1992-01-01

    Computational methods for global/local analysis of structures which include both uncoupled and coupled methods are described. In addition, global/local analysis methodology for automatic refinement of incompatible global and local finite element models is developed. Representative structural analysis problems are presented to demonstrate the global/local analysis methods.

  8. Optimization of a dedicated bio-imaging beamline at the European X-ray FEL

    Energy Technology Data Exchange (ETDEWEB)

    Geloni, Gianluca [European XFEL GmbH, Hamburg (Germany); Kocharyan, Vitali; Saldin, Evgeni [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2012-09-15

    We recently proposed a basic concept for the design and layout of the undulator source for a dedicated bio-imaging beamline at the European XFEL. The goal of the optimized scheme proposed here is to enable experimental simplification and performance improvement. The core of the scheme is composed of soft and hard X-ray self-seeding setups. Based on the use of an improved design for both monochromators, it is possible to increase the design electron energy up to 17.5 GeV in the photon energy range between 2 keV and 13 keV, which is the most preferable for life science experiments. An advantage of operating at such a high electron energy is the increase of the X-ray output peak power. Another advantage is that 17.5 GeV is the preferred operation energy for SASE1 and SASE2 beamline users. Since it will be necessary to run all the XFEL lines at the same electron energy, this choice will reduce the interference with other undulator lines and increase the total amount of scheduled beam time. In this work we also study the performance of the self-seeding scheme, accounting for the spatiotemporal coupling caused by the use of a single-crystal monochromator. Our analysis indicates that this distortion is easily suppressed by the right choice of diamond crystal planes, and that the proposed undulator source yields about the same performance as for an X-ray seed pulse with no coupling. Simulations show that the FEL power reaches 2 TW in the 3 keV-5 keV photon energy range, which is the most preferable for single biomolecule imaging.

  9. Multifunctional nanocomposite based on halloysite nanotubes for efficient luminescent bioimaging and magnetic resonance imaging

    Directory of Open Access Journals (Sweden)

    Zhou T

    2016-09-01

    Full Text Available Tao Zhou,1 Lei Jia,1 Yi-Feng Luo,2 Jun Xu,1 Ru-Hua Chen,2 Zhi-Jun Ge,2 Tie-Liang Ma,2 Hong Chen,2 Tao-Feng Zhu2 1Department of Physics and Chemistry, Henan Polytechnic University, Jiaozuo, Henan, 2The Affiliated Yixing Hospital of Jiangsu University, Yixing, Jiangsu, People’s Republic of China Abstract: A novel multifunctional halloysite nanotube (HNT)-based Fe3O4@HNT-polyethyleneimine-Tip-Eu(dibenzoylmethane)3 nanocomposite (Fe-HNT-Eu NC) with both photoluminescent and magnetic properties was fabricated by a simple one-step hydrothermal process combined with a coupling-grafting method; it exhibited high suspension stability and excellent photophysical behavior. The as-prepared multifunctional Fe-HNT-Eu NC was characterized using various techniques. The results of cell viability assays, cell morphological observation, and in vivo toxicity assays indicated that the NC exhibited excellent biocompatibility over the studied concentration range, suggesting that the obtained Fe-HNT-Eu NC is a suitable material for bioimaging and biological applications in human hepatic adenocarcinoma cells. Furthermore, the biocompatible Fe-HNT-Eu NC displayed superparamagnetic behavior with high saturation magnetization and also functioned as a magnetic resonance imaging (MRI) contrast agent in vitro and in vivo. The MRI tests indicated that the Fe-HNT-Eu NC can significantly decrease the T2 signal intensity of normal liver tissue and thus make the boundary between the normal liver and the transplanted cancer more distinct, effectively improving the diagnosis of cancers. Keywords: halloysite nanotube, lanthanide complex, iron oxide, luminescence, contrast agent

  10. Rapid solid-phase microwave synthesis of highly photoluminescent nitrogen-doped carbon dots for Fe3+ detection and cellular bioimaging

    Science.gov (United States)

    He, Guili; Xu, Minghan; Shu, Mengjun; Li, Xiaolin; Yang, Zhi; Zhang, Liling; Su, Yanjie; Hu, Nantao; Zhang, Yafei

    2016-09-01

    Recently, carbon dots (CDs) have been playing an increasingly important role in industrial production and the biomedical field because of their excellent properties. As such, finding an efficient method to quickly synthesize large quantities of relatively high-purity CDs is of great interest. Herein, a facile and novel microwave method has been applied to prepare nitrogen-doped CDs (N-doped CDs) within 8 min using L-glutamic acid as the sole reaction precursor under solid-phase conditions. The as-prepared N-doped CDs, with an average size of 1.64 nm, are well dispersed in aqueous solution. The photoluminescence of the N-doped CDs is pH-sensitive and excitation-dependent. The N-doped CDs show a strong blue fluorescence with a relatively high fluorescence quantum yield of 41.2%, which remains stable even under high ionic strength. Since their surface is rich in oxygen-containing functional groups, the N-doped CDs can be applied to selectively detect Fe3+ with a limit of detection of 10(-5) M. In addition, they are also used for cellular bioimaging because of their high fluorescence intensity and nearly zero cytotoxicity. The solid-phase microwave method seems to be an effective strategy for rapidly obtaining high-quality N-doped CDs and expands their applications in ion detection and cellular bioimaging.

  12. Tissue cartography: compressing bio-image data by dimensional reduction.

    Science.gov (United States)

    Heemskerk, Idse; Streichan, Sebastian J

    2015-12-01

    The high volumes of data produced by state-of-the-art optical microscopes encumber research. We developed a method that reduces data size and processing time by orders of magnitude while disentangling signal by taking advantage of the laminar structure of many biological specimens. Our Image Surface Analysis Environment automatically constructs an atlas of 2D images for arbitrarily shaped, dynamic and possibly multilayered surfaces of interest. Built-in correction for cartographic distortion ensures that no information on the surface is lost, making the method suitable for quantitative analysis. We applied our approach to 4D imaging of a range of samples, including a Drosophila melanogaster embryo and a Danio rerio beating heart.

  13. Novel methods for spectral analysis

    Science.gov (United States)

    Roy, R.; Sumpter, B. G.; Pfeffer, G. A.; Gray, S. K.; Noid, D. W.

    1991-06-01

    In this review article, various techniques for obtaining estimates of parameters related to the spectrum of an underlying process are discussed. These techniques include the conventional nonparametric FFT approach and more recently developed parametric techniques such as maximum entropy, MUSIC, and ESPRIT, the latter two being classified as signal-subspace or eigenvector techniques. These estimators span the spectrum of possible estimators in that extremes of a priori knowledge are assumed (nonparametric versus parametric) and extremes in the underlying model of the observed process (deterministic versus stochastic) are involved. The advantage of parametric techniques is their ability to provide very accurate estimates using data from extremely short time intervals. Several applications of these novel methods for frequency analysis of very short time data are presented. These include calculation of dispersion curves, and the density of vibrational states g(ω) for many-body systems, semiclassical transition frequencies, overtone linewidths, and resonance energies of the time-dependent Schrödinger equation for few-body problems.
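
    To make the nonparametric-versus-parametric contrast concrete, the sketch below estimates the frequency of a single noisy sinusoid from a short 64-sample record using both a zero-padded FFT and a minimal, generic MUSIC pseudospectrum (a textbook implementation written for illustration, not the authors' code; all signal parameters are invented):

```python
import numpy as np

def music_spectrum(x, n_sinusoids, m, freqs):
    """Minimal MUSIC pseudospectrum for real sinusoids in white noise."""
    N = len(x)
    # Stack overlapping length-m snapshots and form the sample covariance.
    X = np.array([x[i:i + m] for i in range(N - m + 1)])
    R = (X.T @ X) / X.shape[0]
    w, v = np.linalg.eigh(R)                  # eigenvalues ascending
    En = v[:, : m - 2 * n_sinusoids]          # noise subspace (2 dims per real sinusoid)
    p = []
    for f in freqs:
        a = np.exp(2j * np.pi * f * np.arange(m))   # steering vector
        p.append(1.0 / np.linalg.norm(En.conj().T @ a) ** 2)
    return np.array(p)

# Short, noisy record: 64 samples of a 0.2 cycles/sample sinusoid.
rng = np.random.default_rng(0)
n = np.arange(64)
x = np.cos(2 * np.pi * 0.2 * n) + 0.1 * rng.standard_normal(64)

grid = np.linspace(0.01, 0.49, 481)
f_music = grid[np.argmax(music_spectrum(x, n_sinusoids=1, m=16, freqs=grid))]
f_fft = np.argmax(np.abs(np.fft.rfft(x, 1024))) / 1024   # zero-padded FFT peak
print(f_music, f_fft)   # both close to 0.2
```

    On such short records the signal-subspace estimator can be evaluated on an arbitrarily fine frequency grid, which is the accuracy advantage the review attributes to parametric techniques.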

  14. Stable and Size-Tunable Aggregation-Induced Emission Nanoparticles Encapsulated with Nanographene Oxide and Applications in Three-Photon Fluorescence Bioimaging.

    Science.gov (United States)

    Zhu, Zhenfeng; Qian, Jun; Zhao, Xinyuan; Qin, Wei; Hu, Rongrong; Zhang, Hequn; Li, Dongyu; Xu, Zhengping; Tang, Ben Zhong; He, Sailing

    2016-01-26

    Organic fluorescent dyes with high quantum yield are widely applied in bioimaging and biosensing. However, most of them suffer from a severe effect called aggregation-caused quenching (ACQ), meaning that their fluorescence is quenched at high molecular concentrations or in the aggregated state. Aggregation-induced emission (AIE) is the diametrically opposite phenomenon, and luminogens with this feature can effectively solve the ACQ problem. Graphene oxide has been utilized as a quencher for many fluorescent dyes, a property on which biosensing schemes have been built. However, using graphene oxide as a surface-modification agent for fluorescent nanoparticles is seldom reported. In this article, we used nanographene oxide (NGO) to encapsulate fluorescent nanoparticles consisting of an AIE dye named TPE-TPA-FN (TTF). NGO significantly improved the stability of the nanoparticles in aqueous dispersion. In addition, this method could flexibly control the size of the nanoparticles as well as increase their emission efficiency. We then used the NGO-modified TTF nanoparticles to achieve three-photon fluorescence bioimaging. The architecture of ear blood vessels in mice and the distribution of nanoparticles in zebrafish could be observed clearly. Furthermore, we extended this method to other AIE luminogens and showed it to be widely applicable.

  15. Sustainable, Rapid Synthesis of Bright-Luminescent CuInS2-ZnS Alloyed Nanocrystals: Multistage Nano-xenotoxicity Assessment and Intravital Fluorescence Bioimaging in Zebrafish-Embryos

    Science.gov (United States)

    Chetty, S. Shashank; Praneetha, S.; Basu, Sandeep; Sachidanandan, Chetana; Murugan, A. Vadivel

    2016-05-01

    Near-infrared (NIR) luminescent CuInS2-ZnS alloyed nanocrystals (CIZS-NCs) for fluorescence bioimaging have received considerable interest in recent years. Owing to their unique optical properties and low toxicity, they have become a desirable alternative to heavy-metal-based NCs and organic dyes for bioimaging and optoelectronic applications. In the present study, bright and robust CIZS-NCs have been synthesized within 5 min at temperatures as high as 230 °C, without requiring any inert-gas atmosphere, via a microwave-solvothermal (MW-ST) method. Subsequently, the in vitro and in vivo nano-xenotoxicity and cellular uptake of the MUA-functionalized CIZS-NCs were investigated in L929, Vero, and MCF7 cell lines and in zebrafish embryos. We observed minimal toxicity and acute teratogenic consequences up to 62.5 μg/mL of the CIZS-NCs in zebrafish embryos. We also observed spontaneous uptake of the MUA-functionalized CIZS-NCs by 3 dpf zebrafish embryos, evident through bright red fluorescence emission at a concentration as low as 7.8 μg/mL. Hence, we propose that the rapid, low-cost, large-scale “sustainable” MW-ST synthesis of CIZS-NCs yields an ideal bio-nanoprobe with good temporal and spatial resolution for rapid labeling, long-term in vivo tracking, and intravital fluorescence bioimaging (IVBI).

  16. Neodymium-doped nanoparticles for infrared fluorescence bioimaging: The role of the host

    Energy Technology Data Exchange (ETDEWEB)

    Rosal, Blanca del; Pérez-Delgado, Alberto; Rocha, Ueslen; Martín Rodríguez, Emma; Jaque, Daniel, E-mail: daniel.jaque@uam.es [Fluorescence Imaging Group, Dpto. de Física de Materiales, Facultad de Ciencias, Universidad Autónoma de Madrid, Campus de Cantoblanco, Madrid 28049 (Spain); Misiak, Małgorzata; Bednarkiewicz, Artur [Wroclaw Research Centre EIT+, ul. Stabłowicka 147, 54-066 Wrocław (Poland); Institute of Physics, University of Tartu, 14c Ravila Str., 50411 Tartu (Estonia); Vanetsev, Alexander S. [Institute of Low Temperature and Structure Research, PAS, ul. Okólna 2, 50-422 Wrocław (Poland); Orlovskii, Yurii [Institute of Low Temperature and Structure Research, PAS, ul. Okólna 2, 50-422 Wrocław (Poland); Prokhorov General Physics Institute RAS, 38 Vavilov Str., 119991 Moscow (Russian Federation); Jovanović, Dragana J.; Dramićanin, Miroslav D. [Vinča Institute of Nuclear Sciences, University of Belgrade, P.O. Box 522, Belgrade 11001 (Serbia); Upendra Kumar, K.; Jacinto, Carlos [Grupo de Fotônica e Fluidos Complexos, Instituto de Física, Universidade Federal de Alagoas, 57072-900 Maceió-AL (Brazil); Navarro, Elizabeth [Depto. de Química, Eco Catálisis, UAM-Iztapalapa, Sn. Rafael Atlixco 186, México 09340, D.F (Mexico); and others

    2015-10-14

    The spectroscopic properties of different infrared-emitting neodymium-doped nanoparticles (LaF{sub 3}:Nd{sup 3+}, SrF{sub 2}:Nd{sup 3+}, NaGdF{sub 4}: Nd{sup 3+}, NaYF{sub 4}: Nd{sup 3+}, KYF{sub 4}: Nd{sup 3+}, GdVO{sub 4}: Nd{sup 3+}, and Nd:YAG) have been systematically analyzed. A comparison of the spectral shapes of both emission and absorption spectra is presented, from which the relevant role played by the host matrix is evidenced. The lack of a “universal” optimum system for infrared bioimaging is discussed, as the specific bioimaging application and the experimental setup for infrared imaging determine the neodymium-doped nanoparticle to be preferentially used in each case.

  17. Upconverting and NIR emitting rare earth based nanostructures for NIR-bioimaging

    Science.gov (United States)

    Hemmer, Eva; Venkatachalam, Nallusamy; Hyodo, Hiroshi; Hattori, Akito; Ebina, Yoshie; Kishimoto, Hidehiro; Soga, Kohei

    2013-11-01

    In recent years, significant progress was achieved in the field of nanomedicine and bioimaging, but the development of new biomarkers for reliable detection of diseases at an early stage, molecular imaging, targeting and therapy remains crucial. The disadvantages of commonly used organic dyes include photobleaching, autofluorescence, phototoxicity and scattering when UV (ultraviolet) or visible light is used for excitation. The limited penetration depth of the excitation light and the visible emission into and from the biological tissue is a further drawback with regard to in vivo bioimaging. Lanthanide containing inorganic nanostructures emitting in the near-infrared (NIR) range under NIR excitation may overcome those problems. Due to the outstanding optical and magnetic properties of lanthanide ions (Ln3+), nanoscopic host materials doped with Ln3+, e.g. Y2O3:Er3+,Yb3+, are promising candidates for NIR-NIR bioimaging. Ln3+-doped gadolinium-based inorganic nanostructures, such as Gd2O3:Er3+,Yb3+, have a high potential as opto-magnetic markers allowing the combination of time-resolved optical imaging and magnetic resonance imaging (MRI) of high spatial resolution. Recent progress in our research on over-1000 nm NIR fluorescent nanoprobes for in vivo NIR-NIR bioimaging will be discussed in this review.

  18. Nanoparticles prepared from porous silicon nanowires for bio-imaging and sonodynamic therapy.

    Science.gov (United States)

    Osminkina, Liubov A; Sivakov, Vladimir A; Mysov, Grigory A; Georgobiani, Veronika A; Natashina, Ulyana А; Talkenberg, Florian; Solovyev, Valery V; Kudryavtsev, Andrew A; Timoshenko, Victor Yu

    2014-01-01

    The cytotoxicity, photoluminescence, bio-imaging, and sonosensitizing properties of silicon nanoparticles (SiNPs) prepared by ultrasound grinding of porous silicon nanowires (SiNWs) have been investigated. SiNWs were formed by metal (silver)-assisted wet chemical etching of heavily boron-doped (100)-oriented single-crystalline silicon wafers. The prepared SiNWs and aqueous suspensions of SiNPs exhibit efficient room-temperature photoluminescence (PL) in the spectral region of 600 to 1,000 nm, which is explained by the radiative recombination of excitons confined in the small silicon nanocrystals of which SiNWs and SiNPs consist. On the one hand, in vitro studies have demonstrated low cytotoxicity of SiNPs and possibilities for their bio-imaging applications. On the other hand, it has been found that SiNPs can act as efficient sensitizers of ultrasound-induced suppression of the viability of Hep-2 cancer cells.

  19. Constructing an Intelligent Patent Network Analysis Method

    OpenAIRE

    Wu, Chao-Chan; Yao, Ching-Bang

    2012-01-01

    Patent network analysis, an advanced method of patent analysis, is a useful tool for technology management. This method visually displays all the relationships among the patents and enables the analysts to intuitively comprehend the overview of a set of patents in the field of the technology being studied. Although patent network analysis possesses relative advantages different from traditional methods of patent analysis, it is subject to several crucial limitations. To overcome the drawbacks...

  20. Rapid coal proximate analysis by thermogravimetric method

    Energy Technology Data Exchange (ETDEWEB)

    Mao Jianxiong; Yang Dezhong; Zhao Baozhong

    1987-09-01

    Rapid coal proximate analysis by thermogravimetric analysis (TGA) can be used as an alternative to the standard proximate analysis. This paper presents a program set up to rapidly perform coal proximate analysis using a thermal analyzer with a TGA module. A comparison between coal proximate analyses by the standard method (GB) and by TGA is also given. It shows that most data from TGA fall within the tolerance limits of the standard method.
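
    The bookkeeping behind a TGA-based proximate analysis reduces to mass differences between the plateaus of the temperature program. A sketch with invented mass readings (the program outline follows common practice, not necessarily this paper's):

```python
# Proximate analysis from TGA plateau masses (mg). The four readings are:
# as-loaded, after drying in N2 (~110 C), after devolatilization in N2
# (~950 C), and the residue after switching to air (combustion).
# All numbers below are invented for illustration.
def proximate_from_tga(m_initial, m_dry, m_char, m_ash):
    moisture = 100.0 * (m_initial - m_dry) / m_initial
    volatile = 100.0 * (m_dry - m_char) / m_initial
    ash = 100.0 * m_ash / m_initial
    fixed_carbon = 100.0 - moisture - volatile - ash
    return moisture, volatile, ash, fixed_carbon

moist, vm, ash, fc = proximate_from_tga(20.0, 19.2, 13.0, 2.4)
print(moist, vm, ash, fc)   # wt% on an as-received basis
```

    Fixed carbon is obtained by difference, exactly as in the standard gravimetric procedure, which is why a single TGA run can replace four separate determinations.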

  1. Alexa Fluor-labeled Fluorescent Cellulose Nanocrystals for Bioimaging Solid Cellulose in Spatially Structured Microenvironments

    Energy Technology Data Exchange (ETDEWEB)

    Grate, Jay W.; Mo, Kai-For; Shin, Yongsoon; Vasdekis, Andreas; Warner, Marvin G.; Kelly, Ryan T.; Orr, Galya; Hu, Dehong; Dehoff, Karl J.; Brockman, Fred J.; Wilkins, Michael J.

    2015-03-18

    Cellulose nanocrystal materials have been labeled with modern Alexa Fluor dyes in a process that first links the dye to a cyanuric chloride molecule. Subsequent reaction with cellulose nanocrystals provides dyed solid microcrystalline cellulose material that can be used for bioimaging and is suitable for deposition in films and spatially structured microenvironments. It is demonstrated with single-molecule fluorescence microscopy that these films are subject to hydrolysis by cellulase enzymes.

  2. Laser-synthesized oxide-passivated bright Si quantum dots for bioimaging

    OpenAIRE

    M. B. Gongalsky; Osminkina, L. A.; A. Pereira; A. A. Manankov; Fedorenko, A. A.; Vasiliev, A. N.; Solovyev, V. V.; Kudryavtsev, A. A.; Sentis, M.; Kabashin, A. V.; V. Yu. Timoshenko

    2016-01-01

    Crystalline silicon (Si) nanoparticles present an extremely promising object for bioimaging based on photoluminescence (PL) in the visible and near-infrared spectral regions, but their efficient PL emission in aqueous suspension is typically observed after wet chemistry procedures leading to residual toxicity issues. Here, we introduce ultrapure laser-synthesized Si-based quantum dots (QDs), which are water-dispersible and exhibit bright exciton PL in the window of relative tissue transparenc...

  3. Functional CdSe/CdS@SiO2 nanoparticles for bioimaging applications

    OpenAIRE

    Aubert, Tangi; Wassmuth, Daniel; Soenen, Stefaan; Van Deun, Rik; Braeckmans, Kevin; Hens, Zeger

    2014-01-01

    Semiconductor quantum dots (QDs) constitute very promising candidates as light emitters for numerous applications in the field of biotechnology, such as cell labeling or other bioimaging techniques. For such applications, semiconductor QDs represent an attractive alternative to classic organic fluorophores as they exhibit a far superior photostability by several orders of magnitude and a higher brightness thanks to large absorption cross-sections. Within this family of materials, core-shell h...

  4. Breathing laser as an inertia-free swept source for high-quality ultrafast optical bioimaging.

    Science.gov (United States)

    Wei, Xiaoming; Xu, Jingjiang; Xu, Yiqing; Yu, Luoqin; Xu, Jianbing; Li, Bowen; Lau, Andy K S; Wang, Xie; Zhang, Chi; Tsia, Kevin K; Wong, Kenneth K Y

    2014-12-01

    We demonstrate an all-fiber breathing laser as an inertia-free swept source (BLISS), with an ultra-compact design, for the emerging ultrafast bioimaging modalities. The unique feature of BLISS is its broadband wavelength-swept operation (∼60 nm) with superior temporal stability in terms of both long-term (0.08 dB over 27 h) and shot-to-shot power variations (2.1%). More importantly, it enables a wavelength sweep rate of >10 MHz (∼7×10⁸ nm/s), orders of magnitude faster than existing swept sources based on mechanical or electrical tuning techniques. BLISS thus represents a practical new generation of swept source operating in the unmet megahertz swept-rate regime that aligns with the pressing need for scaling optical bioimaging speed in ultrafast phenomena studies and high-throughput screening applications. To showcase its utility in high-speed optical bioimaging, we here employ BLISS for ultrafast time-stretch microscopy and multi-MHz optical coherence tomography of biological specimens at a single-shot line-scan rate or A-scan rate of 11.5 MHz. PMID:25490629
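
    The quoted ∼7×10⁸ nm/s figure is simply the sweep span multiplied by the sweep repetition rate; a one-line sanity check:

```python
# Quick arithmetic check of the quoted sweep rate: a ~60 nm span swept at
# the 11.5 MHz line-scan rate gives ~7e8 nm/s, as stated in the abstract.
span_nm = 60.0        # wavelength-swept range of BLISS
rate_hz = 11.5e6      # single-shot line-scan (A-scan) rate
sweep_rate_nm_per_s = span_nm * rate_hz
print(f"{sweep_rate_nm_per_s:.2e} nm/s")
```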

  5. The Intersection of CMOS Microsystems and Upconversion Nanoparticles for Luminescence Bioimaging and Bioassays

    Directory of Open Access Journals (Sweden)

    Liping Wei

    2014-09-01

    Full Text Available Organic fluorophores and quantum dots are ubiquitous as contrast agents for bio-imaging and as labels in bioassays to enable the detection of biological targets and processes. Upconversion nanoparticles (UCNPs) offer a different set of opportunities as labels in bioassays and for bioimaging. UCNPs are excited at near-infrared (NIR) wavelengths where biological molecules are optically transparent, and their luminescence in the visible and ultraviolet (UV) wavelength range is suitable for detection using complementary metal-oxide-semiconductor (CMOS) technology. These nanoparticles provide multiple sharp emission bands, long lifetimes, tunable emission, high photostability, and low cytotoxicity, which render them particularly useful for bio-imaging applications and multiplexed bioassays. This paper surveys several key concepts surrounding upconversion nanoparticles and the systems that detect and process the corresponding luminescence signals. The principle of photon upconversion, tuning of emission wavelengths, UCNP bioassays, and UCNP time-resolved techniques are described. Electronic readout systems for signal detection and processing suitable for UCNP luminescence using CMOS technology are discussed. This includes recent progress in miniaturized detectors, integrated spectral sensing, and high-precision time-domain circuits. Emphasis is placed on the physical attributes of UCNPs that map strongly to the technical features that CMOS devices excel in delivering, exploring the interoperability between the two technologies.

  6. Evaluation methods of SWOT analysis

    OpenAIRE

    VANĚK, Michal; Mikoláš, Milan; Žváková, Kateřina

    2012-01-01

    Strategic management is an integral part of top management. By formulating the right strategy and its subsequent implementation, a managed organization can attract and retain a comparative advantage. In order to fulfil this expectation, the strategy also has to be supported with relevant findings of performed strategic analyses. The best known and probably the most common of these is a SWOT analysis. In practice, however, the analysis is reduced to mere presentation of influence factors, whic...

  7. Convergence analysis of combinations of different methods

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Y. [Clarkson Univ., Potsdam, NY (United States)

    1994-12-31

    This paper provides a convergence analysis for combinations of different numerical methods for solving systems of differential equations. The author proves that combinations of two convergent linear multistep methods or Runge-Kutta methods produce a new convergent method of which the order is equal to the smaller order of the two original methods.
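
    The paper's conclusion (the combined method's order equals the smaller of the two original orders) can be checked empirically. The sketch below alternates explicit Euler (order 1) with classical RK4 (order 4) on y' = y over [0, 1] and estimates the observed order from two step sizes (an illustrative experiment, not from the paper):

```python
import numpy as np

def euler(f, y, t, h):
    return y + h * f(t, y)

def rk4(f, y, t, h):
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def solve_alternating(h, T=1.0):
    """Integrate y' = y, y(0) = 1, alternating Euler and RK4 steps."""
    f = lambda t, y: y
    y, t, use_euler = 1.0, 0.0, True
    while t < T - 1e-12:
        y = euler(f, y, t, h) if use_euler else rk4(f, y, t, h)
        t += h
        use_euler = not use_euler
    return y

errs = [abs(solve_alternating(h) - np.e) for h in (0.01, 0.005)]
order = np.log2(errs[0] / errs[1])
print(order)   # close to 1: the lower-order method dominates
```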

  8. Probabilistic methods in combinatorial analysis

    CERN Document Server

    Sachkov, Vladimir N

    2014-01-01

    This 1997 work explores the role of probabilistic methods for solving combinatorial problems. These methods not only provide the means of efficiently using such notions as characteristic and generating functions, the moment method and so on but also let us use the powerful technique of limit theorems. The basic objects under investigation are nonnegative matrices, partitions and mappings of finite sets, with special emphasis on permutations and graphs, and equivalence classes specified on sequences of finite length consisting of elements of partially ordered sets; these specify the probabilist

  9. Applying critical analysis - main methods

    Directory of Open Access Journals (Sweden)

    Miguel Araujo Alonso

    2012-02-01

    Full Text Available What is the usefulness of critical appraisal of literature? Critical analysis is a fundamental condition for the correct interpretation of any study that is subject to review. In epidemiology, in order to learn how to read a publication, we must be able to analyze it critically. Critical analysis allows us to check whether a study fulfills certain previously established methodological inclusion and exclusion criteria. This is frequently used in conducting systematic reviews, although eligibility criteria are generally limited to the study design. Critical analysis of literature can be done implicitly while reading an article, as in reading for personal interest, or can be conducted in a structured manner, using explicit and previously established criteria. The latter is done when formally reviewing a topic.

  10. Nonlinear structural analysis using integrated force method

    Indian Academy of Sciences (India)

    N R B Krishnam Raju; J Nagabhushanam

    2000-08-01

    Though the use of the integrated force method for linear investigations is well recognised, no efforts had been made to extend this method to nonlinear structural analysis. This paper presents attempts to use the method for analysing nonlinear structures. A general formulation of nonlinear structural analysis is given. Typical highly nonlinear benchmark problems are considered. The characteristic matrices of the elements used in these problems are developed and the structures are then analysed. The results of the analysis are compared with those of the displacement method. It is demonstrated that the integrated force method is equally viable and efficient as compared to the displacement method.

  11. Root Cause Analysis: Methods and Mindsets.

    Science.gov (United States)

    Kluch, Jacob H.

    This instructional unit is intended for use in training operations personnel and others involved in scram analysis at nuclear power plants in the techniques of root cause analysis. Four lessons are included. The first lesson provides an overview of the goals and benefits of the root cause analysis method. Root cause analysis techniques are covered…

  12. Cost Analysis: Methods and Realities.

    Science.gov (United States)

    Cummings, Martin M.

    1989-01-01

    Argues that librarians need to be concerned with cost analysis of library functions and services because, in the allocation of resources, decision makers will favor library managers who demonstrate understanding of the relationships between costs and productive outputs. Factors that should be included in a reliable scheme for cost accounting are…

  13. Hybrid methods for cybersecurity analysis :

    Energy Technology Data Exchange (ETDEWEB)

    Davis, Warren Leon,; Dunlavy, Daniel M.

    2014-01-01

    Early 2010 saw a significant change in adversarial techniques aimed at network intrusion: a shift from malware delivered via email attachments toward the use of hidden, embedded hyperlinks to initiate sequences of downloads and interactions with web sites and network servers containing malicious software. Enterprise security groups were well poised and experienced in defending against the former attacks, but the new types of attacks were larger in number, more challenging to detect, dynamic in nature, and required the development of new technologies and analytic capabilities. The Hybrid LDRD project was aimed at delivering new capabilities in large-scale data modeling and analysis to enterprise security operators and analysts and understanding the challenges of detection and prevention of emerging cybersecurity threats. Leveraging previous LDRD research efforts and capabilities in large-scale relational data analysis, large-scale discrete data analysis and visualization, and streaming data analysis, new modeling and analysis capabilities were quickly brought to bear on the problems in email phishing and spear phishing attacks in the Sandia enterprise security operational groups at the onset of the Hybrid project. As part of this project, a software development and deployment framework was created within the security analyst workflow tool sets to facilitate the delivery and testing of new capabilities as they became available, and machine learning algorithms were developed to address the challenge of dynamic threats. Furthermore, researchers from the Hybrid project were embedded in the security analyst groups for almost a full year, engaged in daily operational activities and routines, creating an atmosphere of trust and collaboration between the researchers and security personnel. The Hybrid project has altered the way that research ideas can be incorporated into the production environments of Sandia's enterprise security groups, reducing time to deployment from months and

  14. Analysis methods for photovoltaic applications

    Energy Technology Data Exchange (ETDEWEB)

    None

    1980-01-01

    Because photovoltaic power systems are being considered for an ever-widening range of applications, it is appropriate for system designers to have knowledge of and access to photovoltaic power systems simulation models and design tools. This brochure gives brief descriptions of a variety of such aids and was compiled after surveying both manufacturers and researchers. Services available through photovoltaic module manufacturers are outlined, and computer codes for systems analysis are briefly described. (WHK)

  15. Probabilistic Analysis Methods for Hybrid Ventilation

    DEFF Research Database (Denmark)

    Brohus, Henrik; Frier, Christian; Heiselberg, Per

    This paper discusses a general approach for the application of probabilistic analysis methods in the design of ventilation systems. The aims and scope of probabilistic versus deterministic methods are addressed with special emphasis on hybrid ventilation systems. A preliminary application...

  16. The Functional Methods of Discourse Analysis

    Institute of Scientific and Technical Information of China (English)

    覃卓敏

    2008-01-01

    From the macroscopic angle of function, methods of discourse analysis are clarified in order to identify two important methods in pragmatics, through which discourse can be better understood.

  17. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    Science.gov (United States)

    West, Phillip B.; Novascone, Stephen R.; Wright, Jerry P.

    2011-09-27

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  18. Infinitesimal methods of mathematical analysis

    CERN Document Server

    Pinto, J S

    2004-01-01

    This modern introduction to infinitesimal methods is a translation of the book Métodos Infinitesimais de Análise Matemática by José Sousa Pinto of the University of Aveiro, Portugal and is aimed at final year or graduate level students with a background in calculus. Surveying modern reformulations of the infinitesimal concept with a thoroughly comprehensive exposition of important and influential hyperreal numbers, the book includes previously unpublished material on the development of hyperfinite theory of Schwartz distributions and its application to generalised Fourier transforms and harmon

  19. The Qualitative Method of Impact Analysis.

    Science.gov (United States)

    Mohr, Lawrence B.

    1999-01-01

    Discusses qualitative methods of impact analysis and provides an introductory treatment of one such approach. Combines an awareness of an alternative causal epistemology with current knowledge of qualitative methods of data collection and measurement to produce an approach to the analysis of impacts. (SLD)

  20. CMEIAS bioimage informatics that define the landscape ecology of immature microbial biofilms developed on plant rhizoplane surfaces

    Directory of Open Access Journals (Sweden)

    Frank B Dazzo

    2015-10-01

    Full Text Available Colonization of the rhizoplane habitat is an important activity that enables certain microorganisms to promote plant growth. Here we describe various types of computer-assisted microscopy that reveal important ecological insights into early microbial colonization behavior within biofilms on plant root surfaces grown in soil. Examples of the primary data are obtained by analysis of processed images of rhizoplane biofilm landscapes analyzed at single-cell resolution using the emerging technology of CMEIAS bioimage informatics software. Included are various quantitative analyses of the in situ biofilm landscape ecology of microbes during their pioneer colonization of white clover roots, and of a rhizobial biofertilizer strain colonized on rice roots where it significantly enhances the productivity of this important crop plant. The results show that spatial patterns of immature biofilms developed on rhizoplanes that interface rhizosphere soil are highly structured (rather than distributed randomly) when analyzed at the appropriate spatial scale, indicating that regionalized microbial cell-cell interactions and the local environment can significantly affect their cooperative and competitive colonization behaviors.
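
    One standard way to test whether a colonization pattern is structured rather than random is a nearest-neighbour index such as Clark and Evans' R, where values near 1 indicate spatial randomness and values well below 1 indicate clustering. The sketch below is a generic illustration on synthetic coordinates, not the CMEIAS implementation:

```python
import numpy as np

def clark_evans(points, area):
    """Clark-Evans nearest-neighbour index R (no edge correction)."""
    n = len(points)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    mean_nn = d.min(axis=1).mean()
    expected = 0.5 / np.sqrt(n / area)   # expected mean NN distance under CSR
    return mean_nn / expected

rng = np.random.default_rng(1)
area = 100.0 * 100.0
random_pts = rng.uniform(0, 100, size=(200, 2))        # spatially random cells
centers = rng.uniform(0, 100, size=(20, 2))            # 20 microcolony centers
clustered = centers.repeat(10, axis=0) + rng.normal(0, 1.5, size=(200, 2))

r_random = clark_evans(random_pts, area)
r_clustered = clark_evans(clustered, area)
print(round(r_random, 2), round(r_clustered, 2))   # ~1 vs well below 1
```

    As the abstract notes, the verdict depends on the spatial scale: the clustered pattern above looks random if each microcolony is collapsed to a single point.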

  1. Bioimaging of metals in brain tissue by laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) and metallomics.

    Science.gov (United States)

    Becker, J Sabine; Matusch, Andreas; Palm, Christoph; Salber, Dagmar; Morton, Kathryn A; Becker, J Susanne

    2010-02-01

    Laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) has been developed and established as an emerging technique for generating quantitative images of metal distributions in thin tissue sections of brain samples (such as human, rat and mouse brain), with applications in research related to neurodegenerative disorders. A new analytical protocol is described which includes sample preparation by cryo-cutting of thin tissue sections and matrix-matched laboratory standards, mass spectrometric measurements, data acquisition, and quantitative analysis. Specific examples of the bioimaging of metal distributions in normal rodent brains are provided. Differences from normal were assessed in Parkinson's disease and stroke brain models. Furthermore, changes during normal aging were studied. Powerful analytical techniques are also required for the determination and characterization of metal-containing proteins within a large pool of proteins, e.g., after denaturing or non-denaturing electrophoretic separation of proteins in one-dimensional and two-dimensional gels. LA-ICP-MS can be employed to detect metalloproteins in protein bands or spots separated after gel electrophoresis. MALDI-MS can then be used to identify specific metal-containing proteins in these bands or spots. The combination of these techniques is described in the second section.

  2. Probabilistic structural analysis by extremum methods

    Science.gov (United States)

    Nafday, Avinash M.

    1990-01-01

    The objective is to demonstrate discrete extremum methods of structural analysis as a tool for structural system reliability evaluation. Specifically, linear and multiobjective linear programming models for analysis of rigid plastic frames under proportional and multiparametric loadings, respectively, are considered. Kinematic and static approaches for analysis form a primal-dual pair in each of these models and have a polyhedral format. Duality relations link extreme points and hyperplanes of these polyhedra and lead naturally to dual methods for system reliability evaluation.
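
The static (lower-bound) approach described above can be written as a linear program: maximize the load factor subject to equilibrium and plastic-moment constraints. A toy sketch for a fixed-ended beam under a central point load (my own example, solved with scipy rather than the paper's frame models):

```python
from scipy.optimize import linprog

# Fixed-ended beam, span L, plastic moment Mp, central load lam * P.
# Static variables: load factor lam and the (equal) support moment m.
# Mid-span moment: lam*P*L/4 - m.  Yield: |m| <= Mp and |mid| <= Mp.
P, L, Mp = 1.0, 1.0, 1.0

c = [-1.0, 0.0]  # maximize lam  ->  minimize -lam
A_ub = [
    [P * L / 4, -1.0],   #  lam*P*L/4 - m <= Mp
    [-P * L / 4, 1.0],   # -(lam*P*L/4 - m) <= Mp
    [0.0, 1.0],          #  m <= Mp
    [0.0, -1.0],         # -m <= Mp
]
b_ub = [Mp, Mp, Mp, Mp]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (None, None)])
lam_collapse = -res.fun
print(lam_collapse)  # -> 8.0, the classical collapse factor 8*Mp/(P*L)
```

By LP duality, the kinematic approach minimizes plastic dissipation over collapse mechanisms and attains the same collapse factor, which is the primal-dual pairing the abstract refers to.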

  3. Computational methods in power system analysis

    CERN Document Server

    Idema, Reijer

    2014-01-01

    This book treats state-of-the-art computational methods for power flow studies and contingency analysis. In the first part the authors present the relevant computational methods and mathematical concepts. In the second part, power flow and contingency analysis are treated. Furthermore, traditional methods to solve such problems are compared to modern solvers, developed using the knowledge of the first part of the book. Finally, these solvers are analyzed both theoretically and experimentally, clearly showing the benefits of the modern approach.
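
The traditional solver against which such modern methods are compared is Newton-Raphson iteration on the bus power-mismatch equations. A minimal two-bus sketch (illustrative line and load data; a finite-difference Jacobian keeps it short, whereas production solvers assemble the Jacobian analytically and exploit its sparsity):

```python
import numpy as np

# Two-bus system: bus 1 is the slack (V = 1 at angle 0), bus 2 is a PQ bus.
z = 0.01 + 0.10j                  # line impedance (p.u.), assumed values
y = 1 / z
Y = np.array([[y, -y], [-y, y]])  # bus admittance matrix
P2, Q2 = -0.5, -0.2               # specified injections at bus 2 (a load)

def mismatch(x):
    """Active/reactive power mismatch at bus 2 for unknowns [theta2, V2]."""
    th2, v2 = x
    V = np.array([1.0 + 0j, v2 * np.exp(1j * th2)])
    S = V * np.conj(Y @ V)        # complex power injected at each bus
    return np.array([S[1].real - P2, S[1].imag - Q2])

x = np.array([0.0, 1.0])          # flat start
for _ in range(10):               # Newton iterations
    f = mismatch(x)
    J = np.empty((2, 2))
    for j in range(2):            # finite-difference Jacobian column
        dx = np.zeros(2); dx[j] = 1e-7
        J[:, j] = (mismatch(x + dx) - f) / 1e-7
    x = x - np.linalg.solve(J, f)

print(x)  # converged voltage angle (rad) and magnitude at bus 2
```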

  4. Water-soluble photoluminescent fullerene capped mesoporous silica for pH-responsive drug delivery and bioimaging

    Science.gov (United States)

    Tan, Lei; Wu, Tao; Tang, Zhao-Wen; Xiao, Jian-Yun; Zhuo, Ren-Xi; Shi, Bin; Liu, Chuan-Jun

    2016-08-01

    In this paper, a biocompatible and water-soluble fluorescent fullerene (C60-TEG-COOH) coated mesoporous silica nanoparticle (MSN) was successfully fabricated for pH-sensitive drug release and fluorescent cell imaging. The MSN was first reacted with 3-aminopropyltriethoxysilane to obtain an amino-modified MSN, and then the water-soluble C60 with a carboxyl group was used to cover the surface of the MSN through electrostatic interaction with the amino group in PBS solution (pH = 7.4). The release of doxorubicin hydrochloride (DOX) could be triggered under a mild acidic environment (lysosome, pH = 5.0) due to the protonation of C60-TEG-COO-, which induced the dissociation of the C60-TEG-COOH modified MSN (MSN@C60). Furthermore, the uptake of nanoparticles by cells could be tracked because of the green fluorescent property of the C60-modified MSN. In an in vitro study, the prepared materials showed excellent biocompatibility and the DOX-loaded nanocarrier exhibited efficient anticancer ability. This work offered a simple method for designing a simultaneous pH-responsive drug delivery and bioimaging system.

  5. Matrix methods for bare resonator eigenvalue analysis.

    Science.gov (United States)

    Latham, W P; Dente, G C

    1980-05-15

    Bare resonator eigenvalues have traditionally been calculated using Fox and Li iterative techniques or the Prony method presented by Siegman and Miller. A theoretical framework for bare resonator eigenvalue analysis is presented. Several new methods are given and compared with the Prony method.

  6. Computational structural analysis and finite element methods

    CERN Document Server

    Kaveh, A

    2014-01-01

    Graph theory gained initial prominence in science and engineering through its strong links with matrix algebra and computer science. Moreover, the structure of the mathematics is well suited to that of engineering problems in analysis and design. The methods of analysis in this book employ matrix algebra, graph theory and meta-heuristic algorithms, which are ideally suited for modern computational mechanics. Efficient methods are presented that lead to highly sparse and banded structural matrices. The main features of the book include: application of graph theory for efficient analysis; extension of the force method to finite element analysis; application of meta-heuristic algorithms to ordering and decomposition (sparse matrix technology); efficient use of symmetry and regularity in the force method; and simultaneous analysis and design of structures.
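
The "highly sparse and banded structural matrices" goal is pursued by reordering the nodes of the structural graph. As a generic illustration (reverse Cuthill-McKee via scipy, not the book's force-method or meta-heuristic orderings):

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import reverse_cuthill_mckee

def bandwidth(a):
    """Largest distance of a nonzero entry from the diagonal."""
    i, j = np.nonzero(a)
    return int(np.max(np.abs(i - j)))

# Stiffness-like pattern of a 10-node chain whose nodes were labeled badly.
n = 10
rng = np.random.default_rng(0)
label = rng.permutation(n)
a = np.eye(n)
for k in range(n - 1):            # chain edges under the shuffled labels
    a[label[k], label[k + 1]] = a[label[k + 1], label[k]] = 1.0

perm = reverse_cuthill_mckee(csr_matrix(a), symmetric_mode=True)
a_rcm = a[np.ix_(perm, perm)]
print(bandwidth(a), "->", bandwidth(a_rcm))  # reordering shrinks the band
```

A smaller bandwidth directly reduces the fill-in and factorization cost of the assembled structural matrices.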

  7. Stimulus responsive nanogel with innate near IR fluorescent capability for drug delivery and bioimaging.

    Science.gov (United States)

    Vijayan, Vineeth M; Shenoy, Sachin J; Victor, Sunita P; Muthu, Jayabalan

    2016-10-01

    A brighter, non-toxic and biocompatible optical imaging agent is one of the major quests of biomedical research. Herein, we report a photoluminescent comacromer [PEG-poly(propylene fumarate)-citric acid-glycine] and a novel stimulus (pH) responsive nanogel endowed with excitation-wavelength-dependent fluorescence (EDF) for combined drug delivery and bioimaging applications. The comacromer, when excited at different wavelengths in the visible region from 400 nm to 640 nm, exhibits fluorescent emissions from 510 nm to 718 nm in aqueous conditions. It has a high Stokes shift (120 nm), fluorescence lifetime (7 nanoseconds) and quantum yield (50%). The nanogel, C-PLM-NG, prepared with this photoluminescent comacromer and N,N-dimethyl amino ethylmethacrylate (DMEMA), has spherical morphology with particle sizes around 100 nm and 180 nm at pH 7.4 (physiological) and pH 5.5 (the intracellular acidic condition of cancer cells), respectively. Studies of the fluorescence characteristics of C-PLM-NG in aqueous conditions reveal a large red-shift, with emissions from 523 nm to 700 nm for excitations from 460 nm to 600 nm, confirming the EDF characteristics. Imaging the near-IR emission with excitation at 535 nm was accomplished using cut-off filters. The nanogel undergoes pH-responsive swelling and releases around 50% of loaded doxorubicin (DOX) at pH 5.5, compared with 15% at pH 7.4. In vitro cytotoxicity studies with the MTT assay and hemolysis revealed that the present nanogel is non-toxic. DOX-loaded C-PLM-NG internalized by HeLa cells induces lysis of the cancer cells, and the inherent EDF characteristics of C-PLM-NG enable cellular imaging of HeLa cells. Studies of the biodistribution and clearance of C-PLM-NG from the body of mice confirm the bioimaging capability and safety of the present nanogel. This is the first report of a polymeric nanogel with innate near-IR emission for bioimaging applications.

  8. Evidence of three-level trophic transfer of quantum dots in an aquatic food chain by using bioimaging.

    Science.gov (United States)

    Lee, Woo-Mi; An, Youn-Joo

    2015-05-01

    In this study, we demonstrated the three-level trophic transfer of quantum dots (QDs) within the aquatic food chain. Using bioimaging, we observed QD transfer from protozoa (Astasia longa) to zooplankton (Moina macrocopa) to fish (Danio rerio). Bioimaging is an effective tool that can improve our understanding of the delivery of nanomaterials in vivo. Measurement with an intravital multiphoton laser scanning microscope visually proved the transfer of QDs from the first to the second and the second to the third levels. As QDs may be passed from lower organisms to humans via the food chain, our findings have implications for the safety of their use. PMID:25119416

  9. Size effects in the quantum yield of Cd Te quantum dots for optimum fluorescence bioimaging

    Energy Technology Data Exchange (ETDEWEB)

    Jacinto, C.; Rocha, U.S. [Universidade Federal de Alagoas (UFAL), Maceio, AL (Brazil). Inst. de Fisica. Grupo de Fotonica e Fluidos Complexos; Maestro, L.M.; Garcia-Sole, J.; Jaque, D. [Universidad Autonoma de Madrid (Spain). Dept. de Fisica de Materiales. Fluorescence Imaging Group

    2011-07-01

    Full text: Semiconductor nano-crystals, usually referred to as Quantum Dots (QDs), are nowadays regarded as one of the building blocks of modern photonics. They constitute bright and photostable fluorescence sources whose emission and absorption properties can be tailored through their size. Recent advances in the controlled modification of their surface have made possible the development of water-soluble QDs without any deterioration of their fluorescence properties. This has made them excellent selective optical markers for fluorescence bio-imaging experiments. The suitability of colloidal QDs for bio-imaging is pushed forward by their large two-photon absorption cross section, so that their visible luminescence (associated with the recombination of electron-hole pairs) can also be efficiently excited under infrared (two-photon) excitation. This, in turn, allows for large penetration depths in tissues, minimization of auto-fluorescence, and superior spatial imaging resolution. In addition, recent works have demonstrated the ability of QDs to act as nano-thermometers based on the thermal sensitivity of their fluorescence bands. Based on all these outstanding properties, QDs have been successfully used to mark individual receptors in cell membranes, to perform intracellular temperature measurements, and to label living embryos at different stages. Most of the QD-based bio-images reported up to now were obtained using either CdSe or CdTe QDs, since both are commercially available with a high degree of quality. They show similar fluorescence properties and optical performance when used in bio-imaging. Nevertheless, CdTe QDs have very recently attracted much attention since the hyper-thermal sensitivity of their fluorescence bands was discovered. Based on this, it has been postulated that intracellular thermal sensing with resolutions as good as 0.25 deg C can be achieved with CdTe QDs, three times better than…

  10. Three-photon-excited luminescence from unsymmetrical cyanostilbene aggregates: morphology tuning and targeted bioimaging.

    Science.gov (United States)

    Mandal, Amal Kumar; Sreejith, Sivaramapanicker; He, Tingchao; Maji, Swarup Kumar; Wang, Xiao-Jun; Ong, Shi Li; Joseph, James; Sun, Handong; Zhao, Yanli

    2015-05-26

    We report an experimental observation of aggregation-induced enhanced luminescence upon three-photon excitation in aggregates formed from a class of unsymmetrical cyanostilbene derivatives. Changing side chains (-CH3, -C6H13, -C7H15O3, and folic acid) attached to the cyanostilbene core leads to instantaneous formation of aggregates with sizes ranging from micrometer to nanometer scale in aqueous conditions. The crystal structure of a derivative with a methyl side chain reveals the planarization in the unsymmetrical cyanostilbene core, causing luminescence from corresponding aggregates upon three-photon excitation. Furthermore, folic acid attached cyanostilbene forms well-dispersed spherical nanoaggregates that show a high three-photon cross-section of 6.0 × 10^(-80) cm^6 s^2 photon^(-2) and high luminescence quantum yield in water. In order to demonstrate the targeted bioimaging capability of the nanoaggregates, three cell lines (HEK293 healthy cell line, MCF7 cancerous cell line, and HeLa cancerous cell line) were employed for the investigations on the basis of their different folate receptor expression level. Two kinds of nanoaggregates with and without the folic acid targeting ligand were chosen for three-photon bioimaging studies. The cell viability of three types of cells incubated with high concentration of nanoaggregates still remained above 70% after 24 h. It was observed that the nanoaggregates without the folic acid unit could not undergo the endocytosis by both healthy and cancerous cell lines. No obvious endocytosis of folic acid attached nanoaggregates was observed from the HEK293 and MCF7 cell lines having a low expression of the folate receptor. Interestingly, a significant amount of endocytosis and internalization of folic acid attached nanoaggregates was observed from HeLa cells with a high expression of the folate receptor under three-photon excitation, indicating targeted bioimaging of folic acid attached nanoaggregates to the cancer cell line.

  11. Recent Progress on the Preparation of Luminescent Silicon Nanoparticles for Bio-Imaging Applications

    Science.gov (United States)

    Maurice, V.; Sublemontier, O.; Herlin-Boime, N.; Doris, E.; Raccurt, O.; Sanson, A.

    2010-10-01

    Luminescent silicon nanoparticles produced by laser pyrolysis are considered a possible alternative to toxic quantum dots in bioimaging applications. However, these nanoparticles are fully oxidized when kept in water; therefore, the luminescent silicon core must be protected from oxidation. The Si nanoparticles were embedded in monodisperse silica beads (~50 nm) produced in microemulsion. The silica beads protect the silicon core and stabilize the photoluminescence over time. They are well dispersed in water and biological medium, with a colloidal stability of several days.

  12. An introduction to numerical methods and analysis

    CERN Document Server

    Epperson, James F

    2013-01-01

    Praise for the First Edition: ". . . outstandingly appealing with regard to its style, contents, considerations of requirements of practice, choice of examples, and exercises." -Zentralblatt MATH ". . . carefully structured with many detailed worked examples." -The Mathematical Gazette. The Second Edition of the highly regarded An Introduction to Numerical Methods and Analysis provides a fully revised guide to numerical approximation. The book continues to be accessible and expertly guides readers through the many available techniques of numerical methods and analysis. An Introduction to…

  13. An Analysis Method of Business Application Framework

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    We discuss the evolution of the object-oriented software development process based on software patterns. For developing mature software frameworks and components, we advocate eliciting and incorporating software patterns to ensure the quality and reusability of software frameworks. On the basis of requirement specifications for the business application domain, we present an analysis method and a basic role model for software frameworks. We also elicit an analysis pattern for framework architecture, and design basic role classes and their structure.

  14. Causal Moderation Analysis Using Propensity Score Methods

    Science.gov (United States)

    Dong, Nianbo

    2012-01-01

    This paper is based on previous studies in applying propensity score methods to study multiple treatment variables to examine the causal moderator effect. The propensity score methods will be demonstrated in a case study to examine the causal moderator effect, where the moderators are categorical and continuous variables. Moderation analysis is an…
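
The propensity-score machinery underlying such analyses can be sketched, in a simpler setting than the paper's moderation design, as inverse-probability weighting with a logistic propensity model. All data here are simulated, with a single binary treatment and a known true effect:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 8000
x = rng.normal(size=n)                       # observed confounder
p_true = 1 / (1 + np.exp(-0.5 * x))
t = (rng.random(n) < p_true).astype(float)   # non-random treatment assignment
y = 2.0 * t + x + rng.normal(size=n)         # true treatment effect = 2

# Fit a logistic propensity model P(T=1|X) by Newton/IRLS iterations.
X = np.column_stack([np.ones(n), x])
beta = np.zeros(2)
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    W = p * (1 - p)
    beta += np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (t - p))

p_hat = 1 / (1 + np.exp(-X @ beta))
# Inverse-probability-weighted means estimate the average treatment effect.
ate = np.mean(t * y / p_hat) - np.mean((1 - t) * y / (1 - p_hat))
print(round(ate, 2))  # close to the simulated effect of 2
```

A moderation analysis would additionally compare such weighted contrasts across levels of a moderator variable; this sketch only shows the weighting idea.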

  15. ANALYSIS OF MODERN CAR BODY STRAIGHTENING METHODS

    Directory of Open Access Journals (Sweden)

    Arhun, Sch.

    2013-01-01

    Full Text Available The analysis of modern car body panel straightening methods is carried out. Both traditional and alternative methods of car body panel straightening are described. The relevance of magnetic pulse technology is substantiated. The main advantages of magnetic pulse car body straightening technology are determined.

  16. Relating Actor Analysis Methods to Policy Problems

    NARCIS (Netherlands)

    Van der Lei, T.E.

    2009-01-01

    For a policy analyst the policy problem is the starting point for the policy analysis process. During this process the policy analyst structures the policy problem and makes a choice for an appropriate set of methods or techniques to analyze the problem (Goeller 1984). The methods of the policy anal

  17. Empirical likelihood method in survival analysis

    CERN Document Server

    Zhou, Mai

    2015-01-01

    Add the Empirical Likelihood to Your Nonparametric Toolbox. Empirical Likelihood Method in Survival Analysis explains how to use the empirical likelihood method for right censored survival data. The author uses R for calculating empirical likelihood and includes many worked out examples with the associated R code. The datasets and code are available for download on his website and CRAN. The book focuses on all the standard survival analysis topics treated with empirical likelihood, including hazard functions, cumulative distribution functions, analysis of the Cox model, and computation of empiric…
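
For right-censored data, the nonparametric likelihood is maximized by the Kaplan-Meier product-limit estimator, around which empirical-likelihood-ratio inference is then built. The book works in R; here is a language-neutral sketch of the estimator on toy data:

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate; events[i] = 1 for death, 0 for censored.
    Returns (time, S(time)) pairs in time order (no tied event times here)."""
    s, out = 1.0, []
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    for i in order:
        if events[i] == 1:
            s *= 1 - 1 / at_risk   # conditional survival past this event time
        at_risk -= 1
        out.append((times[i], s))
    return out

# Toy right-censored sample: subject observed at t=2 was censored.
est = kaplan_meier([1, 2, 3, 4], [1, 0, 1, 1])
print(est)  # -> [(1, 0.75), (2, 0.75), (3, 0.375), (4, 0.0)]
```

The empirical likelihood approach constrains the jump sizes of this estimator and profiles the likelihood ratio to get confidence regions without a variance estimate.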

  18. Multifunctional NaYF4:Yb, Er@mSiO2@Fe3O4-PEG nanoparticles for UCL/MR bioimaging and magnetically targeted drug delivery

    Science.gov (United States)

    Liu, Bei; Li, Chunxia; Ma, Ping'an; Chen, Yinyin; Zhang, Yuanxin; Hou, Zhiyao; Huang, Shanshan; Lin, Jun

    2015-01-01

    A low-toxicity multifunctional nanoplatform integrating both multimodal diagnosis methods and antitumor therapy is highly desirable to assure antitumor efficiency. In this work, we show a convenient and adjustable synthesis of the multifunctional nanoparticles NaYF4:Yb, Er@mSiO2@Fe3O4-PEG (MFNPs) based on different sizes of up-conversion nanoparticles (UCNPs). With strong up-conversion fluorescence offered by the UCNPs, superparamagnetism attributed to the Fe3O4 nanoparticles, and a porous structure coming from the mesoporous SiO2 shell, the as-obtained MFNPs can be utilized not only as a contrast agent for dual-modal up-conversion luminescence (UCL)/magnetic resonance (MR) bio-imaging, but also to achieve effective magnetically targeted antitumor chemotherapy both in vitro and in vivo. Furthermore, the UCL intensity of the UCNPs and the magnetic properties of Fe3O4 in the MFNPs were carefully balanced. Silica coating and further PEG modification improve the hydrophilicity and biocompatibility of the as-synthesized MFNPs, which was confirmed by in vitro/in vivo biocompatibility and in vivo long-term bio-distribution tests. These results reveal that the UCNP-based magnetically targeted drug carrier system we synthesized has great promise for future multimodal bio-imaging and targeted cancer therapy.

  19. Serum-stable quantum dot--protein hybrid nanocapsules for optical bio-imaging

    Science.gov (United States)

    Lee, Jeong Yu; Nam, Dong Heon; Oh, Mi Hwa; Kim, Youngsun; Choi, Hyung Seok; Jeon, Duk Young; Beum Park, Chan; Nam, Yoon Sung

    2014-05-01

    We introduce shell cross-linked protein/quantum dot (QD) hybrid nanocapsules as a serum-stable systemic delivery nanocarrier for tumor-targeted in vivo bio-imaging applications. Highly luminescent, heavy-metal-free Cu0.3InS2/ZnS (CIS/ZnS) core-shell QDs are synthesized and mixed with amine-reactive six-armed poly(ethylene glycol) (PEG) in dichloromethane. Emulsification in an aqueous solution containing human serum albumin (HSA) results in shell cross-linked nanocapsules incorporating CIS/ZnS QDs, exhibiting high luminescence and excellent dispersion stability in a serum-containing medium. Folic acid is introduced as a tumor-targeting ligand. The feasibility of tumor-targeted in vivo bio-imaging is demonstrated by measuring the fluorescence intensity of several major organs and tumor tissue after an intravenous tail vein injection of the nanocapsules into nude mice. The cytotoxicity of the QD-loaded HSA-PEG nanocapsules is also examined in several types of cells. Our results show that the cellular uptake of the QDs is critical for cytotoxicity. Moreover, a significantly lower level of cell death is observed in the CIS/ZnS QDs compared to nanocapsules loaded with cadmium-based QDs. This study suggests that the systemic tumor targeting of heavy-metal-free QDs using shell cross-linked HSA-PEG hybrid nanocapsules is a promising route for in vivo tumor diagnosis with reduced non-specific toxicity.

  20. Excitation methods for energy dispersive analysis

    International Nuclear Information System (INIS)

    The rapid development in recent years of energy dispersive x-ray fluorescence analysis has been based primarily on improvements in semiconductor detector x-ray spectrometers. However, the whole analysis system performance is critically dependent on the availability of optimum methods of excitation for the characteristic x rays in specimens. A number of analysis facilities based on various methods of excitation have been developed over the past few years. A discussion is given of the features of various excitation methods including charged particles, monochromatic photons, and broad-energy band photons. The effects of the excitation method on background and sensitivity are discussed from both theoretical and experimental viewpoints. Recent developments such as pulsed excitation and polarized photons are also discussed

  1. Scope-Based Method Cache Analysis

    DEFF Research Database (Denmark)

    Huber, Benedikt; Hepp, Stefan; Schoeberl, Martin

    2014-01-01

    The quest for time-predictable systems has led to the exploration of new hardware architectures that simplify analysis and reasoning in the temporal domain, while still providing competitive performance. For the instruction memory, the method cache is a conceptually attractive solution, as it requests memory transfers at well-defined instructions only. In this article, we present a new cache analysis framework that generalizes and improves work on cache persistence analysis. The analysis demonstrates that a global view on the cache behavior permits the precise analysis of caches which are hard…

  2. Semiconductor Quantum Dots for Bioimaging and Biodiagnostic Applications

    OpenAIRE

    Kairdolf, Brad A.; Andrew M Smith; Stokes, Todd H.; Wang, May D.; Young, Andrew N.; Nie, Shuming

    2013-01-01

    Semiconductor quantum dots (QDs) are light-emitting particles on the nanometer scale that have emerged as a new class of fluorescent labels for chemical analysis, molecular imaging, and biomedical diagnostics. Compared with traditional fluorescent probes, QDs have unique optical and electronic properties such as size-tunable light emission, narrow and symmetric emission spectra, and broad absorption spectra that enable the simultaneous excitation of multiple fluorescence colors. QDs are also ...

  3. A special issue on reviews in biomedical applications of nanomaterials, tissue engineering, stem cells, bioimaging, and toxicity.

    Science.gov (United States)

    Nalwa, Hari Singh

    2014-10-01

    This second special issue of the Journal of Biomedical Nanotechnology in a series contains another 30 state-of-the-art reviews focused on the biomedical applications of nanomaterials, biosensors, bone tissue engineering, MRI and bioimaging, single-cell detection, stem cells, endothelial progenitor cells, toxicity and biosafety of nanodrugs, nanoparticle-based new therapeutic approaches for cancer, hepatic and cardiovascular disease.

  5. Biochemical and Bioimaging Evidence of Cholesterol in Acquired Cholesteatoma

    DEFF Research Database (Denmark)

    Thorsted, Bjarne; Bloksgaard, Maria; Groza, Alexandra;

    2016-01-01

    OBJECTIVES: To quantify the barrier sterols and image the lipid structures in the matrix of acquired cholesteatoma, and to compare the distribution with that found in stratum corneum from normal skin, with the goal of resolving their potential influence on cholesteatoma growth. METHODS: High-performance thin-layer chromatography (HPTLC) was used to achieve a quantitative biochemical determination of the sterols. The intercellular lipids were visualized by Coherent Anti-Stokes Raman scattering (CARS) microscopy, which enables label-free imaging of the lipids in intact tissue samples. RESULTS…

  6. Advanced analysis methods in particle physics

    Energy Technology Data Exchange (ETDEWEB)

    Bhat, Pushpalatha C.; /Fermilab

    2010-10-01

    Each generation of high energy physics experiments is grander in scale than the previous - more powerful, more complex and more demanding in terms of data handling and analysis. The spectacular performance of the Tevatron and the beginning of operations of the Large Hadron Collider, have placed us at the threshold of a new era in particle physics. The discovery of the Higgs boson or another agent of electroweak symmetry breaking and evidence of new physics may be just around the corner. The greatest challenge in these pursuits is to extract the extremely rare signals, if any, from huge backgrounds arising from known physics processes. The use of advanced analysis techniques is crucial in achieving this goal. In this review, I discuss the concepts of optimal analysis, some important advanced analysis methods and a few examples. The judicious use of these advanced methods should enable new discoveries and produce results with better precision, robustness and clarity.
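
A standard building block of such optimal analyses is the median discovery significance for a counting experiment with s expected signal events over b expected background, which refines the naive s/√b figure of merit; a short sketch of the well-known Asimov formula:

```python
import math

def asimov_significance(s, b):
    """Median discovery significance (in sigma) for s signal over b background,
    from the profile-likelihood-ratio approximation for a counting experiment."""
    return math.sqrt(2 * ((s + b) * math.log(1 + s / b) - s))

print(asimov_significance(10, 100))  # slightly below s/sqrt(b) = 1.0
print(asimov_significance(50, 100))  # the gap widens as s/b grows
```

In the small-s/b limit the expression reduces to s/√b, which is why the naive estimate works for tiny signals but overstates significance for larger ones.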

  7. Analysis of mixed data methods & applications

    CERN Document Server

    de Leon, Alexander R

    2013-01-01

    A comprehensive source on mixed data analysis, Analysis of Mixed Data: Methods & Applications summarizes the fundamental developments in the field. Case studies are used extensively throughout the book to illustrate interesting applications from economics, medicine and health, marketing, and genetics. Carefully edited for smooth readability and seamless transitions between chapters. All chapters follow a common structure, with an introduction and a concluding summary, and include illustrative examples from real-life case studies in developmental toxicolog…

  8. Chromatographic methods for analysis of triazine herbicides.

    Science.gov (United States)

    Abbas, Hana Hassan; Elbashir, Abdalla A; Aboul-Enein, Hassan Y

    2015-01-01

    Gas chromatography (GC) and high-performance liquid chromatography (HPLC) coupled to different detectors, and in combination with different sample extraction methods, are most widely used for analysis of triazine herbicides in different environmental samples. Nowadays, many variations and modifications of extraction and sample preparation methods such as solid-phase microextraction (SPME), hollow fiber-liquid phase microextraction (HF-LPME), stir bar sorptive extraction (SBSE), headspace-solid phase microextraction (HS-SPME), dispersive liquid-liquid microextraction (DLLME), dispersive liquid-liquid microextraction based on solidification of floating organic droplet (DLLME-SFO), ultrasound-assisted emulsification microextraction (USAEME), and others have been introduced and developed to obtain sensitive and accurate methods for the analysis of these hazardous compounds. In this review, several analytical properties such as linearity, sensitivity, repeatability, and accuracy for each developed method are discussed, and excellent results were obtained for most of the developed methods combined with GC and HPLC techniques for the analysis of triazine herbicides. This review gives an overview of recent publications on the application of GC and HPLC for analysis of triazine herbicide residues in various samples. PMID:25849823
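
The linearity and sensitivity figures compared in such reviews come from the calibration curve. A generic sketch using the common ICH-style estimates LOD = 3.3·σ/S and LOQ = 10·σ/S, where S is the slope and σ the residual standard deviation; the atrazine calibration numbers are invented:

```python
import numpy as np

# Hypothetical atrazine calibration: spiked concentration (ug/L) vs peak area.
conc = np.array([1.0, 2.0, 5.0, 10.0, 20.0])
area = np.array([10.4, 19.8, 50.3, 99.6, 200.1])

slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
resid_sd = np.sqrt(np.sum((area - pred) ** 2) / (len(conc) - 2))
r2 = 1 - np.sum((area - pred) ** 2) / np.sum((area - area.mean()) ** 2)

lod = 3.3 * resid_sd / slope   # limit of detection
loq = 10.0 * resid_sd / slope  # limit of quantification
print(f"slope={slope:.3f}  R^2={r2:.5f}  LOD={lod:.3f}  LOQ={loq:.3f}")
```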

  10. Review of strain buckling: analysis methods

    International Nuclear Information System (INIS)

    This report represents an attempt to review the mechanical analysis methods reported in the literature to account for the specific behaviour that we call buckling under strain. In this report, this expression covers all buckling mechanisms in which the strains imposed play a role, whether they act alone (as in simple buckling under controlled strain), or whether they act with other loadings (primary loading, such as pressure, for example). Attention is focused on the practical problems relevant to LMFBR reactors. The components concerned are distinguished by their high slenderness ratios and by rather high thermal levels, both constant and variable with time. Conventional static buckling analysis methods are not always appropriate for the consideration of buckling under strain. New methods must therefore be developed in certain cases. It is also hoped that this review will facilitate the coding of these analytical methods to aid the constructor in his design task and to identify the areas which merit further investigation

  11. An introduction to numerical methods and analysis

    CERN Document Server

    Epperson, J F

    2007-01-01

    Praise for the First Edition: ". . . outstandingly appealing with regard to its style, contents, considerations of requirements of practice, choice of examples, and exercises." -Zentralblatt MATH ". . . carefully structured with many detailed worked examples . . ." -The Mathematical Gazette ". . . an up-to-date and user-friendly account . . ." -Mathematika. An Introduction to Numerical Methods and Analysis addresses the mathematics underlying approximation and scientific computing and successfully explains where approximation methods come from, why they sometimes work (or d…

  12. Gold-Speckled Multimodal Nanoparticles for Noninvasive Bioimaging

    Science.gov (United States)

    2008-01-01

    In this report the synthesis, characterization, and functional evaluation of a multimodal nanoparticulate contrast agent for noninvasive imaging through both magnetic resonance imaging (MRI) and photoacoustic tomography (PAT) is presented. The nanoparticles described herein enable high-resolution and highly sensitive three-dimensional diagnostic imaging through the synergistic coupling of MRI and PAT capabilities. Gadolinium (Gd)-doped gold-speckled silica (GSS) nanoparticles, ranging from 50 to 200 nm, have been prepared in a simple one-pot synthesis using nonionic microemulsions. The photoacoustic signal is generated from nonuniform, discontinuous gold nanodomains speckled across the silica surface, whereas the MR contrast is provided through Gd incorporated in the silica matrix. The presence of a discontinuous speckled surface, as opposed to a continuous gold shell, allows sufficient bulk water exchange with the Gd ions to generate a strong MR contrast. The dual imaging capabilities of the particles have been demonstrated through in silico and in vitro methods. The described particles also have the capacity for therapeutic applications including the thermal ablation of tumors through the absorption of irradiated light. PMID:19466201

  13. Progress of MEMS Scanning Micromirrors for Optical Bio-Imaging

    Directory of Open Access Journals (Sweden)

    Lih Y. Lin

    2015-11-01

    Full Text Available Microelectromechanical systems (MEMS have an unmatched ability to incorporate numerous functionalities into ultra-compact devices, and due to their versatility and miniaturization, MEMS have become an important cornerstone in biomedical and endoscopic imaging research. To incorporate MEMS into such applications, it is critical to understand underlying architectures involving choices in actuation mechanism, including the more common electrothermal, electrostatic, electromagnetic, and piezoelectric approaches, reviewed in this paper. Each has benefits and tradeoffs and is better suited for particular applications or imaging schemes due to achievable scan ranges, power requirements, speed, and size. Many of these characteristics are fabrication-process dependent, and this paper discusses various fabrication flows developed to integrate additional optical functionality beyond simple lateral scanning, enabling dynamic control of the focus or mirror surface. Out of this provided MEMS flexibility arises some challenges when obtaining high resolution images: due to scanning non-linearities, calibration of MEMS scanners may become critical, and inherent image artifacts or distortions during scanning can degrade image quality. Several reviewed methods and algorithms have been proposed to address these complications from MEMS scanning. Given their impact and promise, great effort and progress have been made toward integrating MEMS and biomedical imaging.

  14. Numerical methods in software and analysis

    CERN Document Server

    Rice, John R

    1992-01-01

    Numerical Methods, Software, and Analysis, Second Edition introduces science and engineering students to the methods, tools, and ideas of numerical computation. Introductory courses in numerical methods face a fundamental problem-there is too little time to learn too much. This text solves that problem by using high-quality mathematical software. In fact, the objective of the text is to present scientific problem solving using standard mathematical software. This book discusses numerous programs and software packages focusing on the IMSL library (including the PROTRAN system) and ACM Algorithm

  15. Practical Fourier analysis for multigrid methods

    CERN Document Server

    Wienands, Roman

    2004-01-01

    Before applying multigrid methods to a project, mathematicians, scientists, and engineers need to answer questions related to the quality of convergence, whether a development will pay out, whether multigrid will work for a particular application, and what the numerical properties are. Practical Fourier Analysis for Multigrid Methods uses a detailed and systematic description of local Fourier k-grid (k=1,2,3) analysis for general systems of partial differential equations to provide a framework that answers these questions.This volume contains software that confirms written statements about convergence and efficiency of algorithms and is easily adapted to new applications. Providing theoretical background and the linkage between theory and practice, the text and software quickly combine learning by reading and learning by doing. The book enables understanding of basic principles of multigrid and local Fourier analysis, and also describes the theory important to those who need to delve deeper into the detai...

  16. Multiple predictor smoothing methods for sensitivity analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Helton, Jon Craig; Storlie, Curtis B.

    2006-08-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (1) locally weighted regression (LOESS), (2) additive models, (3) projection pursuit regression, and (4) recursive partitioning regression. The indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present.
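    As an illustration of the idea behind smoothing-based sensitivity analysis, the sketch below uses a minimal LOESS-style local linear smoother (not the report's actual implementation) and ranks two inputs by how much of the output variance each one's smooth explains; the function names and sample data are invented for the example.

```python
import numpy as np

def loess_r2(x, y, frac=0.3):
    """Crude LOESS-style smoother: fit a tricube-weighted local line at each
    point, then report how much of y's variance the smooth explains (R^2),
    used here as a nonparametric sensitivity measure for input x."""
    n = len(x)
    k = max(int(frac * n), 3)
    yhat = np.empty(n)
    for i in range(n):
        d = np.abs(x - x[i])
        idx = np.argsort(d)[:k]                        # k nearest neighbours
        w = (1 - (d[idx] / d[idx].max()) ** 3) ** 3    # tricube weights
        A = np.vstack([np.ones(k), x[idx]]).T
        beta = np.linalg.lstsq(A * w[:, None], y[idx] * w, rcond=None)[0]
        yhat[i] = beta[0] + beta[1] * x[i]
    return 1 - np.sum((y - yhat) ** 2) / np.sum((y - y.mean()) ** 2)

rng = np.random.default_rng(1)
x1, x2 = rng.uniform(-1, 1, 300), rng.uniform(-1, 1, 300)
y = np.sin(np.pi * x1) + 0.1 * rng.normal(size=300)    # y depends on x1 only

r_informative = loess_r2(x1, y)
r_dummy = loess_r2(x2, y)
print(r_informative > r_dummy)   # the nonlinear but real input ranks higher
```

    A linear-regression R^2 would understate the nonlinear x1 effect; the smoother recovers it, which is the point the abstract makes against traditional linear or rank regression.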

  17. Reliability and risk analysis methods research plan

    International Nuclear Information System (INIS)

    This document presents a plan for reliability and risk analysis methods research to be performed mainly by the Reactor Risk Branch (RRB), Division of Risk Analysis and Operations (DRAO), Office of Nuclear Regulatory Research. It includes those activities of other DRAO branches which are very closely related to those of the RRB. Related or interfacing programs of other divisions, offices and organizations are merely indicated. The primary use of this document is envisioned as an NRC working document, covering about a 3-year period, to foster better coordination in reliability and risk analysis methods development between the offices of Nuclear Regulatory Research and Nuclear Reactor Regulation. It will also serve as an information source for contractors and others to more clearly understand the objectives, needs, programmatic activities and interfaces together with the overall logical structure of the program

  18. Digital Forensics Analysis of Spectral Estimation Methods

    CERN Document Server

    Mataracioglu, Tolga

    2011-01-01

    Steganography is the art and science of writing hidden messages in such a way that no one apart from the intended recipient knows of the existence of the message. In today's world, it is widely used in order to secure information. In this paper, the traditional spectral estimation methods are introduced. The performance of each method is examined by comparing all of the spectral estimation methods, and from those performance analyses, a brief summary of the pros and cons of the spectral estimation methods is given. We also give a steganography demo by hiding information in a sound signal and manage to pull the information (i.e., the true frequency of the information signal) out of the sound by means of the spectral estimation methods.
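    The demo described above can be sketched with the simplest spectral estimate, the periodogram; the 440 Hz "information" tone, sampling rate, and noise level below are illustrative choices, not the paper's actual signal.

```python
import numpy as np

def estimate_tone_frequency(signal, sample_rate):
    """Estimate the dominant frequency of a signal from its periodogram
    (the classical non-parametric spectral estimate)."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2              # periodogram
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum)]                        # peak location

# Hide a weak 440 Hz "information" tone in noise, then pull it back out.
rng = np.random.default_rng(0)
fs, n = 8000, 4096
t = np.arange(n) / fs
stego = 0.1 * np.sin(2 * np.pi * 440.0 * t) + rng.normal(0.0, 0.05, n)

f_hat = estimate_tone_frequency(stego, fs)
print(round(f_hat, 1))   # recovered frequency, within one FFT bin of 440 Hz
```

    Parametric estimators (e.g. autoregressive models) trade this method's simplicity for finer frequency resolution, which is the kind of trade-off the paper's comparison addresses.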

  19. Computational methods for nuclear criticality safety analysis

    International Nuclear Information System (INIS)

    Nuclear criticality safety analyses require the use of methods that have been tested and verified against benchmark results. In this work, criticality calculations based on the KENO-IV and MCNP codes are studied, aiming at the qualification of these methods at IPEN-CNEN/SP and COPESP. The use of variance reduction techniques is important for reducing computer execution time, and several of them are analysed. As a practical example of the above methods, a criticality safety analysis has been carried out for the storage tubes for irradiated fuel elements from the IEA-R1 research reactor. This analysis showed that the MCNP code is more adequate for problems with complex geometries, while the KENO-IV code gives conservative results when the generalized geometry option is not used. (author)

  20. Heteroscedastic regression analysis method for mixed data

    Institute of Scientific and Technical Information of China (English)

    FU Hui-min; YUE Xiao-rui

    2011-01-01

    The heteroscedastic regression model was established, and a heteroscedastic regression analysis method is presented for mixed data composed of complete data, type-I censored data and type-II censored data from the location-scale distribution. The best unbiased estimations of the regression coefficients, as well as the confidence limits of the location parameter and scale parameter, are given. Furthermore, the point estimations and confidence limits of percentiles are obtained. Thus, the traditional multiple regression analysis method, which is suitable only for complete data from the normal distribution, is extended to heteroscedastic mixed data from the location-scale distribution. The presented method therefore has a broad range of promising applications.
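    A minimal sketch of the censored-data idea: below, a normal location-scale model is fitted by plain maximum likelihood to a mix of complete observations and type-I right-censored ones (value known only to exceed its censoring point). This is not the paper's best-unbiased-estimation procedure, just an illustration of how censored points enter the likelihood through the survival function; all numbers are invented.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def censored_normal_mle(complete, censor_points):
    """ML fit of a normal location-scale model to complete observations
    plus type-I right-censored ones (only a lower bound is recorded)."""
    def nll(p):
        mu, log_s = p
        s = np.exp(log_s)
        ll = norm.logpdf(complete, mu, s).sum()        # complete data term
        ll += norm.logsf(censor_points, mu, s).sum()   # censored data term
        return -ll
    res = minimize(nll, x0=[np.mean(complete), 0.0])
    return res.x[0], np.exp(res.x[1])

rng = np.random.default_rng(2)
full = rng.normal(10.0, 2.0, 500)          # true mu = 10, sigma = 2
cut = 12.0                                  # observations above 12 censored
mu_hat, s_hat = censored_normal_mle(full[full <= cut],
                                    np.full((full > cut).sum(), cut))
print(mu_hat, s_hat)   # close to 10 and 2 despite ~16% censoring
```

    Ignoring the censored points and averaging only the complete ones would bias the location estimate downward; the survival-function term corrects for this.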

  1. Influence of gold nanoparticle architecture on in vitro bioimaging and cellular uptake

    Energy Technology Data Exchange (ETDEWEB)

    Polat, Ozlem [Fatih University, Department of Bio and Nanotechnology Engineering (Turkey); Karagoz, Aysel; Isık, Sevim [Fatih University, Department of Biology, Faculty of Arts and Science (Turkey); Ozturk, Ramazan, E-mail: rozturk@fatih.edu.tr [Fatih University, Department of Chemistry, Faculty of Arts and Science (Turkey)

    2014-12-15

    Gold nanoparticles (GNPs) are favorable nanostructures for several biological applications due to their easy synthesis and biocompatible properties. Commonly studied GNP shapes are the nanosphere (AuNS), nanorod (AuNR), and nanocage (AuNC). In addition to distinct geometries and structural symmetries, these shapes have different photophysical properties detected by surface plasmon resonances. Therefore, choosing the best-shaped GNP for a specific purpose is crucial to the success of the application. In this study, all three shapes of GNP were investigated for their potential to interact with cell surface receptors. Anti-HER2 antibody was conjugated to the surface of the nanoparticles. MCF-7 breast adenocarcinoma and hMSC human mesenchymal cell lines were treated with the GNPs and analyzed for cellular uptake and bioimaging efficiencies using UV–vis spectroscopy and dark-field microscopy.

  2. Power System Transient Stability Analysis through a Homotopy Analysis Method

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Shaobu; Du, Pengwei; Zhou, Ning

    2014-04-01

    As an important function of energy management systems (EMSs), online contingency analysis plays an important role in providing power system security warnings of instability. At present, N-1 contingency analysis still relies on time-consuming numerical integration. To save computational cost, the paper proposes a quasi-analytical method to evaluate transient stability through the frequency sensitivities of time-domain periodic solutions against initial values. First, dynamic systems described in classical models are modified into damping-free systems whose solutions are either periodic or expanded (non-convergent). Second, because the sensitivities experience sharp changes when periodic solutions vanish and turn into expanded solutions, transient stability is assessed using the sensitivity. Third, homotopy analysis is introduced to extract frequency information and evaluate the sensitivities only from initial values, so that time-consuming numerical integration is avoided. Finally, a simple case is presented to demonstrate application of the proposed method, and simulation results show that the proposed method is promising.

  3. A DECOMPOSITION METHOD OF STRUCTURAL DECOMPOSITION ANALYSIS

    Institute of Scientific and Technical Information of China (English)

    LI Jinghua

    2005-01-01

    Over the past two decades, structural decomposition analysis (SDA) has developed into a major analytical tool in the field of input-output (IO) techniques, but the method was found to suffer from one or more of the following problems. The decomposition forms, which are used to measure the contribution of a specific determinant, are not unique due to the existence of a multitude of equivalent forms, irrational due to the weights of different determinants not matching, and inexact due to the existence of large interaction terms. In this paper, a decomposition method is derived to overcome these deficiencies, and we prove that the result of this approach is equal to the Shapley value in cooperative games, and so some properties of the method are obtained. Beyond that, the two approaches that have been used predominantly in the literature are proved to be approximate solutions of the method.
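    The Shapley-value idea the paper builds on can be illustrated directly: attribute a change in an aggregate to each determinant by averaging that determinant's marginal contribution over all orders of switching determinants from old to new values. The revenue example below is invented; for two determinants this reproduces the familiar midpoint-weight decomposition.

```python
from itertools import permutations
from math import factorial

def shapley_decomposition(f, old, new):
    """Attribute f(new) - f(old) to each determinant by averaging its
    marginal contribution over all switching orders (Shapley value)."""
    n = len(old)
    contrib = [0.0] * n
    for order in permutations(range(n)):
        x = list(old)
        for i in order:
            before = f(x)
            x[i] = new[i]                 # switch determinant i to new value
            contrib[i] += f(x) - before   # its marginal contribution
    return [c / factorial(n) for c in contrib]

# Example: revenue = price * quantity; decompose the change in revenue.
f = lambda v: v[0] * v[1]
parts = shapley_decomposition(f, old=[10.0, 5.0], new=[12.0, 7.0])
print(parts, sum(parts))   # contributions sum exactly to f(new) - f(old)
```

    The exact additivity (no interaction residual) is precisely the "inexactness" deficiency of naive decomposition forms that the Shapley construction removes, at the cost of factorial growth in the number of orderings.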

  4. Generalized analysis method for neutron resonance transmission analysis

    International Nuclear Information System (INIS)

    Neutron resonance densitometry (NRD) is a non-destructive analysis method, which can be applied to quantify special nuclear materials (SNM) in small particle-like debris of melted fuel that are formed in severe accidents of nuclear reactors such as the Fukushima Daiichi nuclear power plants. NRD uses neutron resonance transmission analysis (NRTA) to quantify SNM and neutron resonance capture analysis (NRCA) to identify matrix materials and impurities. To apply NRD for the characterization of arbitrary-shaped thick materials, a generalized method for the analysis of NRTA data has been developed. The method has been applied on data resulting from transmission through thick samples with an irregular shape and an areal density of SNM up to 0.253 at/b (≈100 g/cm²). The investigation shows that NRD can be used to quantify SNM with a high accuracy not only in inhomogeneous samples made of particle-like debris but also in samples made of large rocks with an irregular shape by applying the generalized analysis method for NRTA. (author)

  5. Spectroscopic Chemical Analysis Methods and Apparatus

    Science.gov (United States)

    Hug, William F.; Reid, Ray D.

    2012-01-01

    This invention relates to non-contact spectroscopic methods and apparatus for performing chemical analysis and the ideal wavelengths and sources needed for this analysis. It employs deep ultraviolet (200- to 300-nm spectral range) electron-beam-pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light-emitting devices, and hollow cathode metal ion lasers. Three achieved goals for this innovation are to reduce the size (under 20 L), the weight [under 100 lb (45 kg)], and the power consumption (under 100 W). This method can be used in a microscope or macroscope to provide measurement of Raman and/or native fluorescence emission spectra, either by point-by-point measurement or by global imaging of emissions within specific ultraviolet spectral bands. In other embodiments, the method can be used in analytical instruments such as capillary electrophoresis, capillary electro-chromatography, high-performance liquid chromatography, flow cytometry, and related instruments for detection and identification of unknown analytes using a combination of native fluorescence and/or Raman spectroscopic methods. This design provides an electron-beam-pumped semiconductor radiation-producing method, or source, that can emit at a wavelength (or wavelengths) below 300 nm, e.g. in the deep ultraviolet between about 200 and 300 nm, and more preferably less than 260 nm. In some variations, the method is to produce incoherent radiation, while in other implementations it produces laser radiation. In some variations, this object is achieved by using an AlGaN emission medium, while in other implementations a diamond emission medium may be used. This instrument irradiates a sample with deep UV radiation, and then uses an improved filter for separating wavelengths to be detected. This provides a multi-stage analysis of the sample. To avoid the difficulties related to producing deep UV semiconductor sources, a pumping approach has been developed that uses

  6. Model-based methods for linkage analysis.

    Science.gov (United States)

    Rice, John P; Saccone, Nancy L; Corbett, Jonathan

    2008-01-01

    The logarithm of an odds ratio (LOD) score method originated in a seminal article by Newton Morton in 1955. The method is broadly concerned with issues of power and the posterior probability of linkage, ensuring that a reported linkage has a high probability of being a true linkage. In addition, the method is sequential so that pedigrees or LOD curves may be combined from published reports to pool data for analysis. This approach has been remarkably successful for 50 years in identifying disease genes for Mendelian disorders. After discussing these issues, we consider the situation for complex disorders where the maximum LOD score statistic shares some of the advantages of the traditional LOD score approach, but is limited by unknown power and the lack of sharing of the primary data needed to optimally combine analytic results. We may still learn from the LOD score method as we explore new methods in molecular biology and genetic analysis to utilize the complete human DNA sequence and the cataloging of all human genes.
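    For the simple phase-known case, a LOD score is straightforward to compute: it compares the likelihood of the observed recombinant/non-recombinant counts at a candidate recombination fraction theta against free recombination (theta = 0.5). The counts below are illustrative, not from the article.

```python
import math

def lod_score(recombinants, nonrecombinants, theta):
    """LOD score for recombination fraction theta versus the null of free
    recombination (theta = 0.5), for phase-known meioses."""
    loglik = (recombinants * math.log10(theta)
              + nonrecombinants * math.log10(1 - theta))
    null = (recombinants + nonrecombinants) * math.log10(0.5)
    return loglik - null

# 2 recombinants out of 20 meioses, evaluated at the MLE theta = 0.1:
lod = lod_score(2, 18, 0.1)
print(round(lod, 2))
```

    The value just clears 3, the classical threshold at which a reported linkage is taken to have a high posterior probability of being true, and scores from independent pedigrees can simply be added, which is the sequential pooling property the abstract highlights.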

  7. Spatial Analysis Methods of Road Traffic Collisions

    DEFF Research Database (Denmark)

    Loo, Becky P. Y.; Anderson, Tessa Kate

    Spatial Analysis Methods of Road Traffic Collisions centers on the geographical nature of road crashes, and uses spatial methods to provide a greater understanding of the patterns and processes that cause them. Written by internationally known experts in the field of transport geography, the book outlines the key issues in identifying hazardous road locations (HRLs), considers current approaches used for reducing and preventing road traffic collisions, and outlines a strategy for improved road safety. The book covers spatial accuracy, validation, and other statistical issues, as well as link...

  8. ANALYSIS METHOD OF AUTOMATIC PLANETARY TRANSMISSION KINEMATICS

    Directory of Open Access Journals (Sweden)

    Józef DREWNIAK

    2014-06-01

    Full Text Available In the present paper, a planetary automatic transmission is modeled by means of contour graphs. The goals of the modeling are versatile: ratio calculation via algorithmic equation generation, and the analysis of velocities and accelerations. Exemplary gear runs are analyzed; several drives/gears are consecutively taken into account, discussing functional schemes, assigned contour graphs, and the generated systems of equations and their solutions. The advantages of the method are its algorithmic approach and its generality, in that particular drives are cases of the generally created model. Moreover, the method allows for further analysis and synthesis tasks, e.g. checking the isomorphism of design solutions.

  9. Text analysis devices, articles of manufacture, and text analysis methods

    Science.gov (United States)

    Turner, Alan E; Hetzler, Elizabeth G; Nakamura, Grant C

    2013-05-28

    Text analysis devices, articles of manufacture, and text analysis methods are described according to some aspects. In one aspect, a text analysis device includes processing circuitry configured to analyze initial text to generate a measurement basis usable in analysis of subsequent text, wherein the measurement basis comprises a plurality of measurement features from the initial text, a plurality of dimension anchors from the initial text and a plurality of associations of the measurement features with the dimension anchors, and wherein the processing circuitry is configured to access a viewpoint indicative of a perspective of interest of a user with respect to the analysis of the subsequent text, and wherein the processing circuitry is configured to use the viewpoint to generate the measurement basis.

  10. Cloud Based Development Issues: A Methodical Analysis

    Directory of Open Access Journals (Sweden)

    Sukhpal Singh

    2012-11-01

    Full Text Available Cloud based development is a challenging task for various software engineering projects, especially for those which demand extraordinary quality, reusability and security along with general architecture. In this paper we present a report on a methodical analysis of cloud based development problems published in major computer science and software engineering journals and conferences by various researchers. Research papers were collected from different scholarly databases using search engines within a particular period of time. A total of 89 research papers were analyzed in this methodical study, and we categorized them into four classes according to the problems they address. The majority of the research papers focused on quality (24 papers) associated with cloud based development, and 16 papers focused on analysis and design. By considering the areas focused on by existing authors and their gaps, untouched areas of cloud based development can be identified for future research.

  11. Quantitative gold nanoparticle analysis methods: A review.

    Science.gov (United States)

    Yu, Lei; Andriola, Angelo

    2010-08-15

    Research and development in the area of gold nanoparticle (AuNP) preparation, characterization, and applications have been burgeoning in recent years. Many of the techniques and protocols are very mature, but two major concerns accompany the mass domestic production and consumption of AuNP-based products. First, how many AuNPs exist in a dispersion? Second, where are the AuNPs after digestion by the environment, and how many are there? To answer these two questions, reliable and reproducible methods are needed to analyze the existence and the population of AuNPs in samples. This review summarizes the most recent chemical and particle quantitative analysis methods that have been used to characterize the concentration (in number of moles of gold per liter) or population (in number of particles per mL) of AuNPs. The methods summarized in this review include mass spectrometry, electroanalytical methods, spectroscopic methods, and particle counting methods. These methods may count the number of AuNPs directly or analyze the total concentration of elemental gold in an AuNP dispersion.

  12. Single-cell analysis - Methods and protocols

    Directory of Open Access Journals (Sweden)

    Carlo Alberto Redi

    2013-06-01

    Full Text Available This is certainly a timely volume in the Methods in Molecular Biology series: we have already entered the synthetic biology era, and thus we need to be aware of the new methodological advances able to fulfill the new needs of biologists, biotechnologists and nano-biotechnologists. Notably, among these, the possibility to perform single-cell analysis allows researchers to capture single-cell responses...

  13. Finite Volume Methods: Foundation and Analysis

    Science.gov (United States)

    Barth, Timothy; Ohlberger, Mario

    2003-01-01

    Finite volume methods are a class of discretization schemes that have proven highly successful in approximating the solution of a wide variety of conservation law systems. They are extensively used in fluid mechanics, porous media flow, meteorology, electromagnetics, models of biological processes, semi-conductor device simulation and many other engineering areas governed by conservative systems that can be written in integral control volume form. This article reviews elements of the foundation and analysis of modern finite volume methods. The primary advantages of these methods are numerical robustness through the obtention of discrete maximum (minimum) principles, applicability on very general unstructured meshes, and the intrinsic local conservation properties of the resulting schemes. Throughout this article, specific attention is given to scalar nonlinear hyperbolic conservation laws and the development of high order accurate schemes for discretizing them. A key tool in the design and analysis of finite volume schemes suitable for non-oscillatory discontinuity capturing is discrete maximum principle analysis. A number of building blocks used in the development of numerical schemes possessing local discrete maximum principles are reviewed in one and several space dimensions, e.g. monotone fluxes, E-fluxes, TVD discretization, non-oscillatory reconstruction, slope limiters, positive coefficient schemes, etc. When available, theoretical results concerning a priori and a posteriori error estimates are given. Further advanced topics are then considered such as high order time integration, discretization of diffusion terms and the extension to systems of nonlinear conservation laws.
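    A minimal instance of such a scheme is first-order upwind finite volume for linear advection: each update is a convex combination of neighboring cell averages, so the discrete maximum principle and local conservation described above can be checked numerically. The mesh size, CFL number, and step initial data below are arbitrary illustrative choices.

```python
import numpy as np

def upwind_step(u, nu):
    """One step of the first-order upwind finite volume scheme for
    u_t + a u_x = 0 with a > 0 on a periodic 1-D mesh; nu = a*dt/dx <= 1.
    The update u_i <- (1 - nu) u_i + nu u_{i-1} is a convex combination."""
    return u - nu * (u - np.roll(u, 1))

n, nu = 100, 0.5
u = np.where(np.arange(n) < n // 2, 1.0, 0.0)   # step initial data in [0, 1]
mass0 = u.sum()
for _ in range(200):
    u = upwind_step(u, nu)

print(u.min() >= 0.0, u.max() <= 1.0)           # discrete maximum principle
print(abs(u.sum() - mass0) < 1e-9)              # local (here global) mass conservation
```

    The price of these robustness properties is first-order accuracy and heavy smearing of the step; the non-oscillatory reconstructions and slope limiters reviewed in the article exist precisely to raise the order while keeping the maximum principle.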

  14. Spectroscopic chemical analysis methods and apparatus

    Science.gov (United States)

    Hug, William F. (Inventor); Reid, Ray D. (Inventor); Bhartia, Rohit (Inventor)

    2013-01-01

    Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted along with photoluminescence spectroscopy (i.e. fluorescence and/or phosphorescence spectroscopy) to provide high levels of sensitivity and specificity in the same instrument.

  15. Statistical trend analysis methods for temporal phenomena

    International Nuclear Information System (INIS)

    We consider point events occurring in a random way in time. In many applications the pattern of occurrence is of intrinsic interest as indicating a trend or some other systematic feature in the rate of occurrence. The purpose of this report is to survey briefly different statistical trend analysis methods and illustrate their applicability to temporal phenomena in particular. The trend testing of point events is usually seen as the testing of hypotheses concerning the intensity of the occurrence of events. When the intensity function is parametrized, the testing of trend is a typical parametric testing problem. In industrial applications the operational experience generally does not suggest any specified model and method in advance. Therefore, and particularly if the Poisson process assumption is very questionable, it is desirable to apply tests that are valid for a wide variety of possible processes. The alternative approach for trend testing is to use some non-parametric procedure. In this report we present four non-parametric tests: the Cox-Stuart test, the Wilcoxon signed ranks test, the Mann test, and the exponential ordered scores test. In addition to the classical parametric and non-parametric approaches we also consider Bayesian trend analysis. First we discuss a Bayesian model based on a power-law intensity model. The Bayesian statistical inferences are based on the analysis of the posterior distribution of the trend parameters, and the probability of trend is immediately seen from these distributions. We applied some of the methods discussed in an example case. It should be noted that this report is a feasibility study rather than a scientific evaluation of statistical methods, and the examples can only be seen as demonstrations of the methods
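    One classical parametric trend test of the kind surveyed above, the Laplace test, is compact enough to sketch: under a homogeneous Poisson process on (0, T], the centered and scaled mean of the event times is approximately standard normal, and large positive values indicate an increasing rate of occurrence. The failure times below are invented.

```python
import math

def laplace_trend_statistic(event_times, horizon):
    """Laplace test statistic for trend in a point process observed on
    (0, horizon]: ~ N(0, 1) under a homogeneous Poisson process; large
    positive values indicate an increasing intensity."""
    n = len(event_times)
    mean_t = sum(event_times) / n
    return (mean_t - horizon / 2) / (horizon * math.sqrt(1 / (12 * n)))

# Failure times clustered late in the observation window suggest a trend.
times = [12, 35, 48, 61, 70, 78, 84, 89, 93, 97]
stat = laplace_trend_statistic(times, 100)
print(round(stat, 2))   # exceeds 1.645, the one-sided 5% critical value
```

    Because the statistic is derived under the Poisson assumption, it illustrates exactly the limitation the report raises: when that assumption is questionable, the non-parametric tests listed above are preferable.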

  16. Space Debris Reentry Analysis Methods and Tools

    Institute of Scientific and Technical Information of China (English)

    WU Ziniu; HU Ruifeng; QU Xi; WANG Xiang; WU Zhe

    2011-01-01

    The reentry of an uncontrolled spacecraft may break it into many pieces of debris at altitudes in the range of 75-85 km. The surviving fragments could pose great hazard and risk to the ground and people. In recent years, methods and tools for predicting and analyzing debris reentry and for ground risk assessment have been studied and developed at the National Aeronautics and Space Administration (NASA), the European Space Agency (ESA) and other organizations, including the group of the present authors. This paper briefly reviews the current progress on the topic of debris reentry. We outline the Monte Carlo method for uncertainty analysis, breakup prediction, and the parameters affecting the survivability of debris. The existing analysis tools can be classified into two categories, i.e. the object-oriented and the spacecraft-oriented methods, the latter being more accurate than the former. Past object-oriented tools include objects of only simple shapes. For more realistic simulation, here we present an object-oriented tool, the debris reentry and ablation prediction system (DRAPS), developed by the present authors, which extends the object shapes to 15 types, as well as 51 predefined motions and relevant aerodynamic and aerothermal models. The aerodynamic and aerothermal models in DRAPS are validated using the direct simulation Monte Carlo (DSMC) method.
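    The Monte Carlo uncertainty analysis mentioned above can be caricatured with a toy casualty-expectation estimate: sum population density times casualty area over surviving fragments, sampling the fragment count and per-fragment area from assumed uncertainty distributions. Every distribution and number below is invented for illustration; real tools use validated breakup, survivability, and population models.

```python
import random

def expected_casualty(pop_density_km2, frag_casualty_area_m2,
                      samples=100_000, seed=0):
    """Toy Monte Carlo estimate of casualty expectation for surviving
    debris. Fragment count and per-fragment casualty area are drawn from
    assumed (illustrative) uncertainty distributions."""
    rng = random.Random(seed)
    density_m2 = pop_density_km2 * 1e-6       # people per km^2 -> per m^2
    total = 0.0
    for _ in range(samples):
        n_frag = rng.randint(5, 15)           # uncertain surviving fragment count
        area = sum(rng.uniform(0.5, 1.5) * frag_casualty_area_m2
                   for _ in range(n_frag))    # uncertain casualty areas
        total += density_m2 * area            # casualty expectation, this draw
    return total / samples

ec = expected_casualty(30.0, 8.0)
print(ec)   # around 2.4e-3 expected casualties for these assumed inputs
```

    The same sample-average structure carries over when the per-draw model is a full trajectory, breakup, and ablation simulation rather than a closed-form sum, which is why the method dominates reentry risk tools despite its cost.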

  17. Data Analysis Methods for Library Marketing

    Science.gov (United States)

    Minami, Toshiro; Kim, Eunja

    Our society is rapidly changing into an information society, in which people's needs and requests for information access differ widely from person to person. A library's mission is to provide its users, or patrons, with the most appropriate information. Libraries have to know the profiles of their patrons in order to fulfill such a role. The aim of library marketing is to develop methods based on library data, such as circulation records, book catalogs, book-usage data, and others. In this paper we first discuss the methodology and importance of library marketing. Then we demonstrate its usefulness through some examples of analysis methods applied to the circulation records of Kyushu University and Guacheon Library, and some implications obtained as the results of these methods. Our research is a first step towards a future in which library marketing is an indispensable tool.

  18. Review of Computational Stirling Analysis Methods

    Science.gov (United States)

    Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.

    2004-01-01

    Nuclear thermal-to-electric power conversion carries the promise of longer-duration missions and higher scientific data transmission rates back to Earth for both Mars rovers and deep space missions. A free-piston Stirling convertor is a candidate technology that is considered an efficient and reliable power conversion device for such purposes. While already very efficient, it is believed that better Stirling engines can be developed if the losses inherent in current designs could be better understood. However, these engines are difficult to instrument, and so efforts are underway to simulate a complete Stirling engine numerically. This has only recently been attempted, and a review of the methods leading up to and including such computational analysis is presented. Finally, it is proposed that the quality and depth of understanding of Stirling losses may be improved by utilizing the higher fidelity and efficiency of recently developed numerical methods. One such method, the Ultra HI-FI technique, is presented in detail.

  19. Analysis and estimation of risk management methods

    Directory of Open Access Journals (Sweden)

    Kankhva Vadim Sergeevich

    2016-05-01

    Full Text Available At present, risk management is an integral part of state policy in all countries with developed market economies, and companies offering consulting services and implementation of risk management systems have carved out a niche. Unfortunately, conscious preventive risk management in Russia is still far from being a standardized part of a construction company's activity, which often leads to scandals and disapproval when projects are implemented unprofessionally. The authors present the results of an investigation of the modern understanding of the existing classification of methodologies and offer their own concept of a classification matrix of risk management methods. The developed matrix is based on analyzing each method in terms of its incoming and outgoing transformed information, which may include different elements of the risk control stages. The proposed approach thus allows the possibilities of each method to be analyzed.

  20. Optical methods for the analysis of dermatopharmacokinetics

    Science.gov (United States)

    Lademann, Juergen; Weigmann, Hans-Juergen; von Pelchrzim, R.; Sterry, Wolfram

    2002-07-01

    Tape stripping combined with spectroscopic measurement is a simple, noninvasive method for analyzing the dermatopharmacokinetics of cosmetic products and topically applied drugs. The absorbance at 430 nm was used to characterize the amount of corneocytes on the tape strips and was compared to the increase in weight of the tapes after their removal from the skin surface. The penetration profiles of two UV filter substances used in sunscreens were determined. The combined method of tape stripping and spectroscopic measurement can also be used to investigate the dermatopharmacokinetics of topically applied drugs passing through the skin. Differences in the penetration profiles of the steroid clobetasol, applied to the skin at the same concentration in different formulations, are presented.

  1. A comparison of analytical methods for detection of [14C]trichloro acetic acid-derived radioactivity in needles and branches of spruce (Picea sp.)

    International Nuclear Information System (INIS)

    The branches (wood and needles) of spruces of varying age treated with [14C]trichloroacetic acid (3.7 GBq/mmol) were studied using the following methods. Qualitative: conventional macroautoradiography with X-ray film and histological classification. Quantitative: 14C combustion analysis with the sample oxidizer A 307 (Canberra/Packard) followed by measurement of radioactivity with the LS 6000 counter (Beckman Instruments); digital autoradiography with the Digital Autoradiograph LB 286 (Berthold GmbH); and digital autoradiography with the Bio-imaging Analyzer BAS 2000 (Fuji Film Co.). (orig.)

  2. Optimizing the synthesis of CdS/ZnS core/shell semiconductor nanocrystals for bioimaging applications

    Directory of Open Access Journals (Sweden)

    Li-wei Liu

    2014-06-01

    Full Text Available In this study, we report on CdS/ZnS nanocrystals as a luminescence probe for bioimaging applications. CdS nanocrystals capped with a ZnS shell showed enhanced luminescence intensity, stronger stability and a longer lifetime compared to uncapped CdS. The CdS/ZnS nanocrystals were stabilized in Pluronic F127 block copolymer micelles, offering an optically and colloidally stable contrast agent for in vitro and in vivo imaging. Photostability tests showed that the ZnS protective shell not only enhances the brightness of the QDs but also improves their stability in a biological environment. An in vivo imaging study showed that F127-CdS/ZnS micelles had strong luminescence. These results suggest that these nanoparticles have significant advantages for bioimaging applications and may offer a new direction for the early detection of cancer in humans.

  3. Mesoporous silica nanoparticles with organo-bridged silsesquioxane framework as innovative platforms for bioimaging and therapeutic agent delivery.

    Science.gov (United States)

    Du, Xin; Li, Xiaoyu; Xiong, Lin; Zhang, Xueji; Kleitz, Freddy; Qiao, Shi Zhang

    2016-06-01

    Mesoporous silica materials with organo-bridged silsesquioxane frameworks are a synergistic combination of inorganic silica, mesopores and organic groups, resulting in novel or enhanced physicochemical and biocompatibility properties compared with conventional mesoporous silica materials of pure Si-O composition. With the rapid development of nanotechnology, monodispersed nanoscale periodic mesoporous organosilica nanoparticles (PMO NPs) and organo-bridged mesoporous silica nanoparticles (MSNs) with various organic groups and structures have recently been synthesized from 100% bridged, or partially bridged, organosilica precursors, respectively. These materials have since been employed as carrier platforms to construct bioimaging and/or therapeutic agent delivery nanosystems for nano-biomedical applications, where they demonstrate unique and/or enhanced properties and performance. This review article provides a comprehensive overview of the controlled synthesis of PMO NPs and organo-bridged MSNs, their physicochemical and biocompatibility properties, and their nano-biomedical applications as bioimaging agents and/or therapeutic agent delivery systems. PMID:27017579

  4. Optimizing the synthesis of CdS/ZnS core/shell semiconductor nanocrystals for bioimaging applications.

    Science.gov (United States)

    Liu, Li-Wei; Hu, Si-Yi; Pan, Ying; Zhang, Jia-Qi; Feng, Yue-Shu; Zhang, Xi-He

    2014-01-01

    In this study, we report on CdS/ZnS nanocrystals as a luminescence probe for bioimaging applications. CdS nanocrystals capped with a ZnS shell showed enhanced luminescence intensity, stronger stability and a longer lifetime compared to uncapped CdS. The CdS/ZnS nanocrystals were stabilized in Pluronic F127 block copolymer micelles, offering an optically and colloidally stable contrast agent for in vitro and in vivo imaging. Photostability tests showed that the ZnS protective shell not only enhances the brightness of the QDs but also improves their stability in a biological environment. An in vivo imaging study showed that F127-CdS/ZnS micelles had strong luminescence. These results suggest that these nanoparticles have significant advantages for bioimaging applications and may offer a new direction for the early detection of cancer in humans. PMID:24991530

  5. Thermal Analysis Methods for Aerobraking Heating

    Science.gov (United States)

    Amundsen, Ruth M.; Gasbarre, Joseph F.; Dec, John A.

    2005-01-01

    As NASA begins exploration of other planets, a method of non-propulsively slowing vehicles at the planet, aerobraking, may become a valuable technique for managing vehicle design mass and propellant. An example of this is Mars Reconnaissance Orbiter (MRO), which will launch in late 2005 and reach Mars in March of 2006. In order to save propellant, MRO will use aerobraking to modify the initial orbit at Mars. The spacecraft will dip into the atmosphere briefly on each orbit, and during the drag pass, the atmospheric drag on the spacecraft will slow it, thus lowering the orbit apoapsis. The largest area on the spacecraft, and that most affected by the heat generated during the aerobraking process, is the solar arrays. A thermal analysis of the solar arrays was conducted at NASA Langley, to simulate their performance throughout the entire roughly 6-month period of aerobraking. Several interesting methods were used to make this analysis more rapid and robust. Two separate models were built for this analysis, one in Thermal Desktop for radiation and orbital heating analysis, and one in MSC.Patran for thermal analysis. The results from the radiation model were mapped in an automated fashion to the Patran thermal model that was used to analyze the thermal behavior during the drag pass. A high degree of automation in file manipulation as well as other methods for reducing run time were employed, since toward the end of the aerobraking period the orbit period is short, and in order to support flight operations the runs must be computed rapidly. All heating within the Patran Thermal model was combined in one section of logic, such that data mapped from the radiation model and aeroheating model, as well as skin temperature effects on the aeroheating and surface radiation, could be incorporated easily. This approach calculates the aeroheating at any given node, based on its position and temperature as well as the density and velocity at that trajectory point. Run times on

  6. Blood proteins analysis by Raman spectroscopy method

    Science.gov (United States)

    Artemyev, D. N.; Bratchenko, I. A.; Khristoforova, Yu. A.; Lykina, A. A.; Myakinin, O. O.; Kuzmina, T. P.; Davydkin, I. L.; Zakharov, V. P.

    2016-04-01

    This work is devoted to studying the possibility of measuring plasma protein (albumin, globulin) concentrations with a Raman spectroscopy setup. Both blood plasma and whole blood were studied in this research. The obtained Raman spectra showed significant variation in the intensities of certain spectral bands (940, 1005, 1330, 1450 and 1650 cm-1) for different protein fractions. Partial least squares regression analysis was used to determine correlation coefficients. We have shown that the proposed method reflects the structure and biochemical composition of the major blood proteins.
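    The workflow the abstract describes — regressing a protein concentration onto full spectra with partial least squares — can be sketched as follows. This is an illustrative example, not the authors' code: the spectra are synthetic, with the concentration modulating the 1005 and 1650 cm-1 bands plus noise, and the PLS1 implementation is a minimal NIPALS variant.

    ```python
    # Illustrative sketch (not the authors' code): estimating a protein
    # concentration from Raman spectra with partial least squares (PLS)
    # regression. Spectra are synthetic: concentration modulates the
    # 1005 and 1650 cm^-1 bands, with additive noise.
    import numpy as np

    rng = np.random.default_rng(0)
    wavenumbers = np.arange(400, 1800)              # cm^-1 axis
    n_samples = 40
    conc = rng.uniform(30, 50, n_samples)           # e.g. albumin, g/L

    def band(center, width):
        return np.exp(-((wavenumbers - center) / width) ** 2)

    spectra = (np.outer(conc, band(1005, 8) + 0.7 * band(1650, 25))
               + rng.normal(0, 0.5, (n_samples, wavenumbers.size)))

    def pls1_fit_predict(X, y, n_components=3):
        """Minimal NIPALS PLS1: returns in-sample predictions of y."""
        Xc, yc = X - X.mean(0), y - y.mean()
        Xr, yr = Xc.copy(), yc.copy()
        W, P, q = [], [], []
        for _ in range(n_components):
            w = Xr.T @ yr                           # weight vector
            w /= np.linalg.norm(w)
            t = Xr @ w                              # scores
            p = Xr.T @ t / (t @ t)                  # loadings
            W.append(w); P.append(p); q.append((yr @ t) / (t @ t))
            Xr -= np.outer(t, p)                    # deflate X
            yr -= q[-1] * t                         # deflate y
        W, P, q = np.array(W).T, np.array(P).T, np.array(q)
        B = W @ np.linalg.solve(P.T @ W, q)         # regression vector
        return Xc @ B + y.mean()

    pred = pls1_fit_predict(spectra, conc)
    r = np.corrcoef(conc, pred)[0, 1]               # correlation coefficient
    print(f"correlation coefficient r = {r:.3f}")
    ```

    With the strong synthetic bands the correlation coefficient comes out close to 1; on real spectra it would quantify how well the band intensities track the protein fraction.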

  7. Method and apparatus for simultaneous spectroelectrochemical analysis

    Science.gov (United States)

    Chatterjee, Sayandev; Bryan, Samuel A; Schroll, Cynthia A; Heineman, William R

    2013-11-19

    An apparatus and method of simultaneous spectroelectrochemical analysis is disclosed. A transparent surface is provided. An analyte solution on the transparent surface is contacted with a working electrode and at least one other electrode. Light from a light source is focused on either a surface of the working electrode or the analyte solution. The light reflected from either the surface of the working electrode or the analyte solution is detected. The potential of the working electrode is adjusted, and spectroscopic changes of the analyte solution that occur with changes in thermodynamic potentials are monitored.

  8. FUZZY METHOD FOR FAILURE CRITICALITY ANALYSIS

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    The greatest benefit is realized from failure mode, effects and criticality analysis (FMECA) when it is done early in the design phase and tracks product changes as they evolve; design changes can then be made more economically than if problems are discovered after the design has been completed. However, when the discovered design flaws must be prioritized for corrective action, precise information on their probability of occurrence, the effect of the failure, and their detectability is often not available. To solve this problem, this paper describes a new method, based on fuzzy sets, for prioritizing failures for corrective action in an FMECA. Its successful application to a container crane shows that the proposed method is both reasonable and practical.
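    One common way to realize such a fuzzy prioritization — a sketch of the general idea, not the paper's exact formulation — is to express occurrence (O), severity (S) and detectability (D) as triangular fuzzy numbers, combine them into a fuzzy risk priority number, and rank failures by centroid defuzzification. The crane failure modes and ratings below are invented for illustration.

    ```python
    # Minimal sketch of fuzzy failure prioritization (hypothetical data,
    # not the paper's formulation): O, S, D as triangular fuzzy numbers
    # (l, m, u), fuzzy RPN as their vertex-wise product, ranking by the
    # centroid of the result.
    def tfn_product(a, b):
        # Approximate product of two triangular fuzzy numbers.
        return tuple(a[i] * b[i] for i in range(3))

    def centroid(t):
        # Centroid defuzzification of a triangular fuzzy number.
        return sum(t) / 3.0

    failures = {
        "hoist brake slip":    {"O": (2, 3, 4), "S": (7, 8, 9),  "D": (4, 5, 6)},
        "trolley rail wear":   {"O": (5, 6, 7), "S": (4, 5, 6),  "D": (2, 3, 4)},
        "spreader twist-lock": {"O": (1, 2, 3), "S": (8, 9, 10), "D": (6, 7, 8)},
    }

    def fuzzy_rpn(f):
        return tfn_product(tfn_product(f["O"], f["S"]), f["D"])

    ranking = sorted(failures,
                     key=lambda name: centroid(fuzzy_rpn(failures[name])),
                     reverse=True)
    for name in ranking:
        print(name, round(centroid(fuzzy_rpn(failures[name])), 1))
    ```

    The fuzzy representation lets imprecise engineering judgments ("severity is around 9") enter the ranking directly instead of being forced into a single crisp score.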

  9. Economic analysis of alternative LLW disposal methods

    International Nuclear Information System (INIS)

    The Environmental Protection Agency (EPA) has evaluated the costs and benefits of alternative disposal technologies as part of its program to develop generally applicable environmental standards for the land disposal of low-level radioactive waste (LLW). Costs, population health effects and Critical Population Group (CPG) exposures resulting from alternative waste treatment and disposal methods were evaluated both in absolute terms and also relative to a base case (current practice). Incremental costs of the standard included costs for packaging, processing, transportation, and burial of waste. Benefits are defined in terms of reductions in the general population health risk (expected fatal cancers and genetic effects) evaluated over 10,000 years. A cost-effectiveness ratio, defined as the incremental cost per avoided health effect, was calculated for each alternative standard. The cost-effectiveness analysis took into account a number of waste streams, hydrogeologic and climatic region settings, and waste treatment and disposal methods. This paper describes the alternatives considered and preliminary results of the cost-effectiveness analysis. 15 references, 7 figures, 3 tables

  10. 3D analysis methods - Study and seminar

    International Nuclear Information System (INIS)

    The first part of the report results from a study that was performed as a Nordic co-operation activity with active participation from Studsvik Scandpower and Westinghouse Atom in Sweden, and VTT in Finland. The purpose of the study was to identify and investigate the effects arising from using 3D transient computer codes in BWR safety analysis, and their influence on the transient analysis methodology. One of the main questions involves the critical power ratio (CPR) calculation methodology. The present approach, where the CPR calculation is performed with a separate hot channel calculation, can be artificially conservative. In the investigated cases, no dramatic minimum CPR effect coming from the 3D calculation is apparent. Some cases show some decrease in the transient change of minimum CPR with the 3D calculation, which confirms the general thinking that the 1D calculation is conservative. On the other hand, the observed effect on neutron flux behaviour is quite large. In a slower transient the 3D effect might be stronger. The second part of the report is a summary of a related seminar that was held on the 3D analysis methods. The seminar was sponsored by the Reactor Safety part (NKS-R) of the Nordic Nuclear Safety Research Programme (NKS). (au)

  11. Chapter 11. Community analysis-based methods

    Energy Technology Data Exchange (ETDEWEB)

    Cao, Y.; Wu, C.H.; Andersen, G.L.; Holden, P.A.

    2010-05-01

    Microbial communities are each a composite of populations whose presence and relative abundance in water or other environmental samples are a direct manifestation of environmental conditions, including the introduction of microbe-rich fecal material and factors promoting persistence of the microbes therein. As shown by culture-independent methods, different animal-host fecal microbial communities appear distinctive, suggesting that their community profiles can be used to differentiate fecal samples and to potentially reveal the presence of host fecal material in environmental waters. Cross-comparisons of microbial communities from different hosts also reveal relative abundances of genetic groups that can be used to distinguish sources. In increasing order of their information richness, several community analysis methods hold promise for MST applications: phospholipid fatty acid (PLFA) analysis, denaturing gradient gel electrophoresis (DGGE), terminal restriction fragment length polymorphism (TRFLP), cloning/sequencing, and PhyloChip. Specific case studies involving TRFLP and PhyloChip approaches demonstrate the ability of community-based analyses of contaminated waters to confirm a diagnosis of water quality based on host-specific marker(s). The success of community-based MST for comprehensively confirming fecal sources relies extensively upon using appropriate multivariate statistical approaches. While community-based MST is still under evaluation and development as a primary diagnostic tool, results presented herein demonstrate its promise. The tools for microbial community analysis are increasingly accessible, and, coupled with their inherently comprehensive ability to capture an unprecedented amount of microbiological data relevant to water quality, community-based approaches have unparalleled potential for translation into rapid, perhaps real-time, monitoring platforms.

  12. Generalized Analysis of a Distribution Separation Method

    Directory of Open Access Journals (Sweden)

    Peng Zhang

    2016-04-01

    Full Text Available Separating two probability distributions from a mixture model that is made up of the combinations of the two is essential to a wide range of applications. For example, in information retrieval (IR, there often exists a mixture distribution consisting of a relevance distribution that we need to estimate and an irrelevance distribution that we hope to get rid of. Recently, a distribution separation method (DSM was proposed to approximate the relevance distribution, by separating a seed irrelevance distribution from the mixture distribution. It was successfully applied to an IR task, namely pseudo-relevance feedback (PRF, where the query expansion model is often a mixture term distribution. Although initially developed in the context of IR, DSM is indeed a general mathematical formulation for probability distribution separation. Thus, it is important to further generalize its basic analysis and to explore its connections to other related methods. In this article, we first extend DSM’s theoretical analysis, which was originally based on the Pearson correlation coefficient, to entropy-related measures, including the KL-divergence (Kullback–Leibler divergence, the symmetrized KL-divergence and the JS-divergence (Jensen–Shannon divergence. Second, we investigate the distribution separation idea in a well-known method, namely the mixture model feedback (MMF approach. We prove that MMF also complies with the linear combination assumption, and then, DSM’s linear separation algorithm can largely simplify the EM algorithm in MMF. These theoretical analyses, as well as further empirical evaluation results demonstrate the advantages of our DSM approach.
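    The linear combination assumption at the heart of DSM can be illustrated directly: if the mixture is p_mix = (1 - λ)·p_rel + λ·p_irrel, then given the mixture, the seed irrelevance distribution and the mixing weight λ, the relevance distribution separates algebraically. In the sketch below λ is assumed known (DSM itself estimates it), and the toy vocabulary and distributions are invented.

    ```python
    # Sketch of DSM's linear separation step (toy data; the mixing
    # weight lam is assumed known here, whereas DSM estimates it).
    import numpy as np

    vocab = ["jaguar", "car", "speed", "cat", "zoo"]
    p_rel_true = np.array([0.05, 0.40, 0.35, 0.10, 0.10])  # relevance
    p_irrel    = np.array([0.40, 0.05, 0.05, 0.25, 0.25])  # seed irrelevance
    lam = 0.3
    p_mix = (1 - lam) * p_rel_true + lam * p_irrel         # observed mixture

    # Algebraic separation: invert the linear combination.
    p_rel = (p_mix - lam * p_irrel) / (1 - lam)
    p_rel = np.clip(p_rel, 0, None)                        # guard negatives
    p_rel /= p_rel.sum()                                   # renormalize
    print(dict(zip(vocab, np.round(p_rel, 3))))
    ```

    With exact λ the true relevance distribution is recovered perfectly; in practice the quality of the separation depends on how well λ and the seed distribution are estimated, which is where the correlation- and divergence-based analyses in the article come in.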

  13. Analysis Method for Quantifying Vehicle Design Goals

    Science.gov (United States)

    Fimognari, Peter; Eskridge, Richard; Martin, Adam; Lee, Michael

    2007-01-01

    A document discusses a method for using Design Structure Matrices (DSM), coupled with high-level tools representing important life-cycle parameters, to comprehensively conceptualize a flight/ground space transportation system design by dealing with such variables as performance, up-front costs, downstream operations costs, and reliability. This approach also weighs operational approaches based on their effect on upstream design variables so that it is possible to readily, yet defensively, establish linkages between operations and these upstream variables. To avoid the large range of problems that have defeated previous methods of dealing with the complex problems of transportation design, and to cut down the inefficient use of resources, the method described in the document identifies those areas that are of sufficient promise and that provide a higher grade of analysis for those issues, as well as the linkages at issue between operations and other factors. Ultimately, the system is designed to save resources and time, and allows for the evolution of operable space transportation system technology, and design and conceptual system approach targets.

  14. Method and tool for network vulnerability analysis

    Science.gov (United States)

    Swiler, Laura Painton; Phillips, Cynthia A.

    2006-03-14

    A computer system analysis tool and method that will allow for qualitative and quantitative assessment of security attributes and vulnerabilities in systems including computer networks. The invention is based on generation of attack graphs wherein each node represents a possible attack state and each edge represents a change in state caused by a single action taken by an attacker or unwitting assistant. Edges are weighted using metrics such as attacker effort, likelihood of attack success, or time to succeed. Generation of an attack graph is accomplished by matching information about attack requirements (specified in "attack templates") to information about computer system configuration (contained in a configuration file that can be updated to reflect system changes occurring during the course of an attack) and assumed attacker capabilities (reflected in "attacker profiles"). High risk attack paths, which correspond to those considered suited to application of attack countermeasures given limited resources for applying countermeasures, are identified by finding "epsilon optimal paths."
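    The core idea — an attack graph whose edges carry metrics such as attacker effort, with high-risk paths found as optimal paths — can be sketched with a small weighted digraph and Dijkstra's algorithm. The states and weights below are hypothetical, not taken from the patented tool.

    ```python
    # Illustrative sketch (hypothetical states and effort weights):
    # nodes are attack states, edge weights model attacker effort,
    # and the highest-risk path is the least-effort path from initial
    # access to the goal, found with Dijkstra's algorithm.
    import heapq

    graph = {
        "external":     [("webserver", 2), ("phished-user", 1)],
        "phished-user": [("workstation", 1)],
        "webserver":    [("db-server", 4)],
        "workstation":  [("db-server", 2)],
        "db-server":    [("domain-admin", 3)],
        "domain-admin": [],
    }

    def least_effort_path(g, start, goal):
        pq = [(0, start, [start])]
        seen = set()
        while pq:
            cost, node, path = heapq.heappop(pq)
            if node == goal:
                return cost, path
            if node in seen:
                continue
            seen.add(node)
            for nxt, w in g[node]:
                if nxt not in seen:
                    heapq.heappush(pq, (cost + w, nxt, path + [nxt]))
        return None

    print(least_effort_path(graph, "external", "domain-admin"))
    ```

    Here the phishing route (effort 1 + 1 + 2 + 3 = 7) beats the webserver route (2 + 4 + 3 = 9), so countermeasures would be targeted at the phishing path first; the "epsilon optimal paths" of the patent generalize this to all paths within a margin of the optimum.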

  15. Digital methods for mediated discourse analysis

    DEFF Research Database (Denmark)

    Kjær, Malene; Larsen, Malene Charlotte

    2015-01-01

    In this paper we discuss methodological strategies for collecting multimodal data using digital resources. The aim is to show how digital resources can provide ethnographic insights into mediated actions (Scollon, 2002) that can otherwise be difficult to observe or engage in due to, for instance, restrictions or privately mediated settings. Having used mediated discourse analysis (Scollon 2002, Scollon & Scollon, 2004) as a framework in two different research projects, we show how the framework, in correlation with digital resources for data gathering, provides new understandings of 1) the daily practice of health care professionals (Author 1, 2014) and 2) young people's identity construction on social media platforms (Author 2, 2010, 2015, in press). The paper's contribution is a methodological discussion on digital data collection using methods such as online interviewing (via e-mail or chat...

  16. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

    Full Text Available Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations-COSO, 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that although the introduction of quantified risk to enhance the degree of objectivity in, for instance, finance was quite parallel to its development in the manufacturing industry, the same is not true in Higher Education Institutions (HEIs). In this regard, the objective of the paper was to demonstrate the methods and process of Quantitative Risk Analysis (QRA) through the likelihood of occurrence of risk (phase I). This paper serves as the first of a two-phased study, which sampled one hundred (100) risk analysts in a university in the greater Eastern Cape Province of South Africa. The analysis of likelihood of occurrence of risk by logistic regression and percentages was conducted to investigate whether or not there was a significant difference between groups (analysts) in respect of QRA. The Hosmer and Lemeshow test was non-significant, with a chi-square (X2 = 8.181; p = 0.300), which indicated a good model fit, since the data did not deviate significantly from the model. The study concluded that to derive an overall likelihood rating indicating the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat source motivation and capability, (2) nature of the vulnerability, and (3) existence and effectiveness of current controls (methods and process).
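    The Hosmer-Lemeshow statistic used in the abstract can be sketched as follows: fitted probabilities are binned into (typically ten) groups, and observed versus expected event counts are compared with a chi-square-type statistic on g - 2 degrees of freedom. The data and fit below are synthetic, only the mechanics of the test are shown.

    ```python
    # Minimal sketch of the Hosmer-Lemeshow goodness-of-fit statistic
    # (synthetic data; a well-calibrated model, so the statistic should
    # be unremarkable relative to chi-square with 8 df).
    import numpy as np

    rng = np.random.default_rng(1)
    p_hat = np.sort(rng.uniform(0.05, 0.95, 200))   # fitted probabilities
    y = rng.binomial(1, p_hat)                      # outcomes drawn from them

    groups = np.array_split(np.arange(200), 10)     # deciles of fitted risk
    hl = 0.0
    for g in groups:
        obs = y[g].sum()                            # observed events
        exp = p_hat[g].sum()                        # expected events
        n = len(g)
        pbar = exp / n
        hl += (obs - exp) ** 2 / (n * pbar * (1 - pbar))
    print(f"Hosmer-Lemeshow statistic: {hl:.2f} on 8 degrees of freedom")
    ```

    A non-significant statistic (as in the study's chi-square of 8.181, p = 0.300) means the grouped observed and expected counts agree, i.e. the logistic model fits the data adequately.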

  17. Gap analysis: Concepts, methods, and recent results

    Science.gov (United States)

    Jennings, M.D.

    2000-01-01

    Rapid progress is being made in the conceptual, technical, and organizational requirements for generating synoptic multi-scale views of the earth's surface and its biological content. Using the spatially comprehensive data that are now available, researchers, land managers, and land-use planners can, for the first time, quantitatively place landscape units - from general categories such as 'Forests' or 'Cold-Deciduous Shrubland Formation' to more specific categories such as 'Picea glauca-Abies balsamea-Populus spp. Forest Alliance' - in their large-area contexts. The National Gap Analysis Program (GAP) has developed the technical and organizational capabilities necessary for the regular production and analysis of such information. This paper provides a brief overview of concepts and methods as well as some recent results from the GAP projects. Clearly, new frameworks for biogeographic information and organizational cooperation are needed if we are to have any hope of documenting the full range of species occurrences and ecological processes in ways meaningful to their management. The GAP experience provides one model for achieving these new frameworks.

  18. Three-dimensional elemental bio-imaging of Fe, Zn, Cu, Mn and P in a 6-hydroxydopamine lesioned mouse brain.

    Science.gov (United States)

    Hare, Dominic J; George, Jessica L; Grimm, Rudolph; Wilkins, Simon; Adlard, Paul A; Cherny, Robert A; Bush, Ashley I; Finkelstein, David I; Doble, Philip

    2010-11-01

    Three-dimensional maps of iron (Fe), zinc (Zn), copper (Cu), manganese (Mn) and phosphorus (P) in a 6-hydroxydopamine (6-OHDA) lesioned mouse brain were constructed employing a novel quantitative laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS) imaging method known as elemental bio-imaging. The 3D maps were produced by ablating serial consecutive sections taken from the same animal. Each section was quantified against tissue standards, resulting in a three-dimensional map representing the variation of trace element concentrations in the area of the mouse brain surrounding the substantia nigra (SN). Damage caused by the needle or the toxin did not alter the distribution of Zn and Cu, but significantly altered Fe in and around the SN, and both Mn and Fe around the needle track. A 20% increase in nigral Fe concentration was observed within the lesioned hemisphere. This technique clearly shows the naturally heterogeneous distributions of these elements throughout the brain and the perturbations that occur following trauma or intoxication. The method may be applied to three-dimensional modelling of trace elements in a wide range of tissue samples. PMID:21072366
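    The quantification step described above — converting the raw ablation signal of each section into concentrations via matrix-matched tissue standards — amounts to a per-element linear calibration applied pixel-wise. The standard concentrations, signals and intensity map below are invented for illustration.

    ```python
    # Sketch of LA-ICP-MS quantification against tissue standards
    # (invented numbers): a linear calibration from standards of known
    # concentration converts a raw intensity map (counts per second)
    # into a concentration map (ug/g), one ablated section at a time.
    import numpy as np

    # Standards: known Fe concentration (ug/g) vs measured signal (cps).
    conc_std   = np.array([0.0, 50.0, 100.0, 200.0])
    signal_std = np.array([120.0, 5100.0, 10050.0, 20100.0])

    slope, intercept = np.polyfit(signal_std, conc_std, 1)

    # Raw 3x3 intensity map for one section (cps per pixel).
    raw = np.array([[9000., 15000., 9500.],
                    [8800., 21000., 9100.],
                    [9200., 14800., 8900.]])
    fe_map = slope * raw + intercept          # concentration map, ug/g
    print(np.round(fe_map, 1))
    ```

    Stacking such calibrated maps from serial consecutive sections is what yields the three-dimensional concentration model described in the abstract.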

  19. Super-resolution fluorescent materials: an insight into design and bioimaging applications.

    Science.gov (United States)

    Yang, Zhigang; Sharma, Amit; Qi, Jing; Peng, Xiao; Lee, Dong Yeop; Hu, Rui; Lin, Danying; Qu, Junle; Kim, Jong Seung

    2016-08-22

    Living organisms are generally composed of complex cellular processes which persist only within their native environments. To enhance our understanding of the biological processes lying within complex milieus, various techniques have been developed. Specifically, the emergence of super-resolution microscopy has generated a renaissance in cell biology by redefining the existing dogma towards nanoscale cell dynamics, single synaptic vesicles, and other complex bioprocesses by overcoming the diffraction-imposed resolution barrier that is associated with conventional microscopy techniques. Besides the typical technical reliance on the optical framework and computational algorithms, super-resolution imaging microscopy resorts largely to fluorescent materials with special photophysical properties, including fluorescent proteins, organic fluorophores and nanomaterials. In this tutorial review article, with the emphasis on cell biology, we summarize the recent developments in fluorescent materials being utilized in various super-resolution techniques with successful integration into bio-imaging applications. Fluorescent proteins (FPs) applied in super-resolution microscopy will not be covered herein, as they have already been well summarized; additionally, we demonstrate the breadth of opportunities offered from a future perspective. PMID:27296269

  20. Optimization of a dedicated bio-imaging beamline at the European X-ray FEL

    CERN Document Server

    Geloni, Gianluca; Saldin, Evgeni

    2012-01-01

    We recently proposed a basic concept for the design and layout of the undulator source for a dedicated bio-imaging beamline at the European XFEL. The goal of the optimized scheme proposed here is to enable experimental simplification and performance improvement. The core of the scheme is composed of soft and hard X-ray self-seeding setups. Based on the use of an improved design for both monochromators, it is possible to increase the design electron energy up to 17.5 GeV in the photon energy range between 2 keV and 13 keV, which is most preferable for life science experiments. An advantage of operating at such high electron energy is the increase of the X-ray output peak power. Another advantage is that 17.5 GeV is the preferred operation energy for SASE1 and SASE2 beamline users. Since it will be necessary to run all the XFEL lines at the same electron energy, this choice will reduce the interference with other undulator lines and increase the total amount of scheduled beam time. In this work we also propose a stu...

  1. Fluorescent Polymer Nanoparticles Based on Dyes: Seeking Brighter Tools for Bioimaging.

    Science.gov (United States)

    Reisch, Andreas; Klymchenko, Andrey S

    2016-04-01

    Speed, resolution and sensitivity of today's fluorescence bioimaging can be drastically improved by fluorescent nanoparticles (NPs) that are many-fold brighter than organic dyes and fluorescent proteins. While the field is currently dominated by inorganic NPs, notably quantum dots (QDs), fluorescent polymer NPs encapsulating large quantities of dyes (dye-loaded NPs) have emerged recently as an attractive alternative. These new nanomaterials, inspired from the fields of polymeric drug delivery vehicles and advanced fluorophores, can combine superior brightness with biodegradability and low toxicity. Here, we describe the strategies for synthesis of dye-loaded polymer NPs by emulsion polymerization and assembly of pre-formed polymers. Superior brightness requires strong dye loading without aggregation-caused quenching (ACQ). Only recently several strategies of dye design were proposed to overcome ACQ in polymer NPs: aggregation induced emission (AIE), dye modification with bulky side groups and use of bulky hydrophobic counterions. The resulting NPs now surpass the brightness of QDs by ≈10-fold for a comparable size, and have started reaching the level of the brightest conjugated polymer NPs. Other properties, notably photostability, color, blinking, as well as particle size and surface chemistry are also systematically analyzed. Finally, major and emerging applications of dye-loaded NPs for in vitro and in vivo imaging are reviewed. PMID:26901678

  2. Gadolinia nanofibers as a multimodal bioimaging and potential radiation therapy agent

    Energy Technology Data Exchange (ETDEWEB)

    Grishin, A. M., E-mail: grishin@kth.se, E-mail: grishin@inmatech.com [KTH Royal Institute of Technology, SE-164 40 Stockholm-Kista (Sweden); INMATECH Intelligent Materials Technology, SE-127 45 Skärholmen (Sweden); Petrozavodsk State University, 185910 Petrozavodsk, Karelian Republic (Russian Federation); Jalalian, A. [KTH Royal Institute of Technology, SE-164 40 Stockholm-Kista (Sweden); INMATECH Intelligent Materials Technology, SE-127 45 Skärholmen (Sweden); Tsindlekht, M. I. [Racah Institute of Physics, Hebrew University of Jerusalem, 91904 Jerusalem (Israel)

    2015-05-15

    Continuous bead-free C-type cubic gadolinium oxide (Gd2O3) nanofibers 20-30 μm long and 40-100 nm in diameter were sintered by a sol-gel calcination-assisted electrospinning technique. Dipole-dipole interaction of neighboring Gd3+ ions in nanofibers with a large length-to-diameter aspect ratio results in a kind of superparamagnetic behavior: the fibers are magnetized twice as strongly as Gd2O3 powder. Compared with commercial Gd-DTPA/Magnevist®, diethyleneglycol-coated Gd2O3 (Gd2O3-DEG) fibers show high 1/T1 and 1/T2 proton relaxivities. Intense room-temperature photoluminescence, high NMR relaxivity and the high neutron scattering cross-section of the 157Gd nucleus promise to make Gd2O3 fibers suitable for multimodal bioimaging and neutron capture therapy.

  3. Fluorescent probe based on heteroatom containing styrylcyanine: pH-sensitive properties and bioimaging in vivo

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Xiaodong [School of Chemistry and Chemical Engineering, South China University of Technology, Guangzhou 510640 (China); Gao, Ya; Huang, Zhibing; Chen, Xiaohui; Ke, Zhiyong [School of Basic Medical Science, Southern Medical University, Guangzhou 510515 (China); Zhao, Peiliang; Yan, Yichen [Department of Organic Pharmaceutical Chemistry, School of Pharmaceutical Sciences, Southern Medical University, Guangzhou 510515 (China); Liu, Ruiyuan, E-mail: ruiyliu@smu.edu.cn [Department of Organic Pharmaceutical Chemistry, School of Pharmaceutical Sciences, Southern Medical University, Guangzhou 510515 (China); Qu, Jinqing, E-mail: cejqqu@scut.edu.cn [School of Chemistry and Chemical Engineering, South China University of Technology, Guangzhou 510640 (China)

    2015-07-01

    A novel fluorescent probe based on a heteroatom-containing styrylcyanine is synthesized. The fluorescence of the probe is bright green in basic and neutral media but dark orange in strongly acidic environments, and can be reversibly switched between the two. This behavior enables it to work as a fluorescent pH sensor in the solution state and as a chemosensor for detecting acidic and basic volatile organic compounds. NMR spectroscopic analyses confirm that protonation or deprotonation of the pyridinyl moiety is responsible for the sensing process. In addition, fluorescence microscopy images of the probe in live cells and zebrafish were obtained successfully, suggesting that the probe has good cell-membrane permeability and low cytotoxicity. - Graphical abstract: A novel styrylcyanine-based fluorescent pH probe was designed and synthesized; its fluorescence is bright green in basic and neutral media but dark orange in strongly acidic environments. This behavior enables it to work as a fluorescent pH sensor in the solution state and as a chemosensor for volatile organic compounds of high acidity or basicity in the solid state. In addition, it can be used for fluorescence imaging in living cells and living organisms. - Highlights: • Bright green fluorescence was observed in basic and neutral media. • Dark orange fluorescence was found in strongly acidic environments. • Volatile organic compounds of high acidity or basicity could be detected. • Bioimaging in living cells and living organisms was achieved successfully.

  4. Aqueous synthesis and biostabilization of CdS@ZnS quantum dots for bioimaging applications

    Science.gov (United States)

    Chen, L.; Liu, Y.; Lai, C.; Berry, R. M.; Tam, K. C.

    2015-10-01

    Bionanohybrids, combining biocompatible natural polymers with inorganic materials, have aroused interest because of their structural, functional, and environmental advantages. In this work, we report on the stabilization of CdS@ZnS core-shell quantum dots (QDs) using carboxylated cellulose nanocrystals (CNCs) as nanocarriers in the aqueous phase. High colloidal stability was achieved through sufficient negative charge on the CNC surface and the coordination of Cd2+ to carboxylate groups. This coordination allows the in-situ nucleation and growth of QDs on the CNC surface. The influences of the QD-to-CNC ratio, pH and ZnS coating on the colloidal stability and photoluminescence properties of CNC/QD nanohybrids were also studied. The results showed that products obtained at pH 8 with a CdS-to-CNC weight ratio of 0.19 and a ZnS/CdS molar ratio of 1.5 possessed excellent colloidal stability and the highest photoluminescence intensity. By anchoring QDs on rigid bionanotemplates, CNC/CdS@ZnS exhibited long-term colloidal and optical stability. Using biocompatible CNCs as nanocarriers, the products have been demonstrated to exhibit low cytotoxicity towards HeLa cells and can serve as promising red-emitting fluorescent bioimaging probes.

  5. Fluorescent probe based on heteroatom containing styrylcyanine: pH-sensitive properties and bioimaging in vivo

    International Nuclear Information System (INIS)

    A novel fluorescent probe based on a heteroatom-containing styrylcyanine is synthesized. The fluorescence of the probe is bright green in basic and neutral media but dark orange in strongly acidic environments, and can be reversibly switched between the two. This behavior enables it to work as a fluorescent pH sensor in the solution state and as a chemosensor for detecting acidic and basic volatile organic compounds. NMR spectroscopic analyses confirm that protonation or deprotonation of the pyridinyl moiety is responsible for the sensing process. In addition, fluorescence microscopy images of the probe in live cells and zebrafish were obtained successfully, suggesting that the probe has good cell-membrane permeability and low cytotoxicity. - Graphical abstract: A novel styrylcyanine-based fluorescent pH probe was designed and synthesized; its fluorescence is bright green in basic and neutral media but dark orange in strongly acidic environments. This behavior enables it to work as a fluorescent pH sensor in the solution state and as a chemosensor for volatile organic compounds of high acidity or basicity in the solid state. In addition, it can be used for fluorescence imaging in living cells and living organisms. - Highlights: • Bright green fluorescence was observed in basic and neutral media. • Dark orange fluorescence was found in strongly acidic environments. • Volatile organic compounds of high acidity or basicity could be detected. • Bioimaging in living cells and living organisms was achieved successfully.

  6. Surface chemistry manipulation of gold nanorods preserves optical properties for bio-imaging applications

    International Nuclear Information System (INIS)

    Due to their anisotropic shape, gold nanorods (GNRs) possess a number of advantages for biosystem use, including enhanced surface area and tunable optical properties within the near-infrared (NIR) region. However, cetyl trimethylammonium bromide-related cytotoxicity, overall poor cellular uptake following surface chemistry modifications, and loss of NIR optical properties due to intracellular aggregation together remain obstacles for nano-based biomedical GNR applications. In this article, we report that tannic acid-coated 11-mercaptoundecyl trimethylammonium bromide (MTAB) GNRs (MTAB-TA) show no significant decrease in in vitro cell viability and no stress activation after exposure to A549 human alveolar epithelial cells. In addition, MTAB-TA GNRs demonstrate a substantial level of cellular uptake while displaying a unique intracellular clustering pattern. This clustering pattern significantly reduces intracellular aggregation, preserving the NIR optical properties of the GNRs, vital for biomedical imaging applications. These results demonstrate how surface chemistry modifications enhance biocompatibility, allow a higher rate of internalization with low intracellular aggregation of MTAB-TA GNRs, and identify them as prime candidates for use in nano-based bio-imaging applications. Graphical Abstract

  7. Surface chemistry manipulation of gold nanorods preserves optical properties for bio-imaging applications

    Energy Technology Data Exchange (ETDEWEB)

    Polito, Anthony B.; Maurer-Gardner, Elizabeth I.; Hussain, Saber M., E-mail: saber.hussain@us.af.mil [Air Force Research Laboratory, Molecular Bioeffects Branch, Bioeffects Division, Human Effectiveness Directorate (United States)

    2015-12-15

    Due to their anisotropic shape, gold nanorods (GNRs) possess a number of advantages for biosystem use, including enhanced surface area and tunable optical properties within the near-infrared (NIR) region. However, cetyl trimethylammonium bromide-related cytotoxicity, overall poor cellular uptake following surface chemistry modifications, and loss of NIR optical properties due to intracellular aggregation together remain obstacles for nano-based biomedical GNR applications. In this article, we report that tannic acid-coated 11-mercaptoundecyl trimethylammonium bromide (MTAB) GNRs (MTAB-TA) show no significant decrease in in vitro cell viability and no stress activation after exposure to A549 human alveolar epithelial cells. In addition, MTAB-TA GNRs demonstrate a substantial level of cellular uptake while displaying a unique intracellular clustering pattern. This clustering pattern significantly reduces intracellular aggregation, preserving the NIR optical properties of the GNRs, vital for biomedical imaging applications. These results demonstrate how surface chemistry modifications enhance biocompatibility, allow a higher rate of internalization with low intracellular aggregation of MTAB-TA GNRs, and identify them as prime candidates for use in nano-based bio-imaging applications. Graphical Abstract.

  8. System and method for making quantum dots

    KAUST Repository

    Bakr, Osman M.

    2015-05-28

    Embodiments of the present disclosure provide for methods of making quantum dots (QDs) (passivated or unpassivated) using a continuous flow process, systems for making QDs using a continuous flow process, and the like. In one or more embodiments, the QDs produced using embodiments of the present disclosure can be used in solar photovoltaic cells, bio-imaging, IR emitters, or LEDs.

  9. Coloured Petri Nets: Basic Concepts, Analysis Methods and Practical Use. Vol. 2, Analysis Methods

    DEFF Research Database (Denmark)

    Jensen, Kurt

    This three-volume work presents a coherent description of the theoretical and practical aspects of coloured Petri nets (CP-nets). The second volume contains a detailed presentation of the analysis methods for CP-nets. They allow the modeller to investigate dynamic properties of CP-nets. The main ...

  10. Bio-image warehouse system: concept and implementation of a diagnosis-based data warehouse for advanced imaging modalities in neuroradiology.

    Science.gov (United States)

    Minati, L; Ghielmetti, F; Ciobanu, V; D'Incerti, L; Maccagnano, C; Bizzi, A; Bruzzone, M G

    2007-03-01

    Advanced neuroimaging techniques, such as functional magnetic resonance imaging (fMRI), chemical shift spectroscopy imaging (CSI), diffusion tensor imaging (DTI), and perfusion-weighted imaging (PWI), create novel challenges in terms of data storage and management: huge amounts of raw data are generated, the results of analysis may depend on the software and settings that have been used, and intermediate files are most often inherently not compliant with the current DICOM (digital imaging and communication in medicine) standard, as they contain multidimensional complex and tensor arrays and various other types of data structures. A software architecture, referred to as the Bio-Image Warehouse System (BIWS), which can be used alongside a radiology information system/picture archiving and communication system (RIS/PACS) to store neuroimaging data for research purposes, is presented. The system architecture is conceived to enable querying by diagnosis according to a predefined two-layer classification taxonomy. The operational impact of the system and the time needed to get acquainted with the web-based interface and with the taxonomy are found to be limited. The development of modules enabling automated creation of statistical templates is proposed.

  11. Comprehensive cosmographic analysis by Markov chain method

    International Nuclear Information System (INIS)

    We study the possibility of extracting model-independent information about the dynamics of the Universe by using cosmography. We intend to explore it systematically, to learn about its limitations and its real possibilities. Here we adhere to the series-expansion approach on which cosmography is based. We apply it to different data sets: type Ia supernovae (SNeIa), Hubble parameters extracted from differential galaxy ages, gamma-ray bursts, and baryon acoustic oscillation data. We go beyond past results in the literature by extending the series expansion up to the fourth order in the scale factor, which implies the analysis of the deceleration q0, the jerk j0, and the snap s0. We use the Markov chain Monte Carlo (MCMC) method to analyze the data statistically. We also try to relate direct results from cosmography to dynamical dark energy (DE) models parametrized by the Chevallier-Polarski-Linder model, extracting clues about the matter content and the dark energy parameters. The main results are: (a) even if relying on a mathematical approximate assumption such as the scale-factor series expansion in terms of time, cosmography can be extremely useful in assessing dynamical properties of the Universe; (b) the deceleration parameter clearly confirms the present acceleration phase; (c) the MCMC method can help give narrower constraints in parameter estimation, in particular for the higher-order cosmographic parameters (the jerk and the snap), with respect to the literature; and (d) both the estimation of the jerk and the DE parameters reflect the possibility of a deviation from the ΛCDM cosmological model.
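
    The MCMC fit described above can be sketched numerically. Below is a minimal, self-contained Metropolis sampler (plain Python, no external libraries) that recovers the deceleration parameter q0 from mock luminosity distances generated with a second-order series expansion; the fiducial value, noise level, and redshift grid are illustrative assumptions, not the data sets used in the paper.

```python
import math, random

random.seed(42)

# Mock data (illustrative): luminosity distances in units of c/H0 at redshifts zs,
# generated from a 2nd-order cosmographic expansion with fiducial q0 = -0.55.
Q0_TRUE, SIGMA = -0.55, 0.01
zs = [0.05 * i for i in range(1, 21)]

def d_L(z, q0):
    # Series expansion truncated at z^2: d_L = z + (1 - q0) z^2 / 2
    return z + 0.5 * (1.0 - q0) * z * z

data = [d_L(z, Q0_TRUE) + random.gauss(0.0, SIGMA) for z in zs]

def log_like(q0):
    # Gaussian log-likelihood (up to an additive constant)
    return -0.5 * sum((d - d_L(z, q0)) ** 2 for z, d in zip(zs, data)) / SIGMA ** 2

# Metropolis-Hastings random walk over q0
q, ll, chain = 0.0, log_like(0.0), []
for step in range(20000):
    q_new = q + random.gauss(0.0, 0.05)
    ll_new = log_like(q_new)
    if math.log(random.random()) < ll_new - ll:   # accept/reject
        q, ll = q_new, ll_new
    if step >= 2000:                              # discard burn-in
        chain.append(q)

q_mean = sum(chain) / len(chain)
print(round(q_mean, 2))   # posterior mean, close to the fiducial -0.55
```

    The same machinery extends to higher-order parameters (jerk, snap) by widening the expansion and sampling a parameter vector instead of a scalar.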

  12. Echinacea purpurea: Pharmacology, phytochemistry and analysis methods

    Directory of Open Access Journals (Sweden)

    Azadeh Manayi

    2015-01-01

    Echinacea purpurea (Asteraceae) is a perennial medicinal herb with important immunostimulatory and anti-inflammatory properties, especially for the alleviation of cold symptoms. The plant has also attracted scientists' attention to other aspects of its beneficial effects; for instance, antianxiety, antidepression, cytotoxic, and antimutagenic effects induced by the plant have been revealed in various studies. The findings of the clinical trials are controversial in terms of side effects: while some studies revealed beneficial effects of the plant on patients and no severe adverse effects, others have reported serious side effects including abdominal pain, angioedema, dyspnea, nausea, pruritus, rash, erythema, and urticaria. Other biological activities of the plant, such as antioxidant, antibacterial, antiviral, and larvicidal activities, have been reported in previous experimental studies. Different classes of secondary metabolites of the plant, such as alkamides, caffeic acid derivatives, polysaccharides, and glycoproteins, are believed to be biologically and pharmacologically active. Concurrent determination and single analysis of cichoric acid and alkamides have been successfully developed, mainly by using high-performance liquid chromatography (HPLC) coupled with different detectors, including UV spectrophotometric, coulometric electrochemical, and electrospray ionization mass spectrometric detectors. In spite of the many experiments successfully accomplished using E. purpurea, many questions remain unanswered, and future investigations may aim for complete recognition of the plant's mechanism of action using new, complementary methods.

  13. Methods for the proximate analysis of peat

    Energy Technology Data Exchange (ETDEWEB)

    Sheppard, J.D.; Tibbetts, T.E.; Forgeron, D.W.

    1986-01-01

    An investigation was conducted into methods for determining the percentages of volatile matter and ash in peat. Experiments were performed on two types of sphagnum peat: a decomposed fuel peat and a commercial horticultural-grade peat. The heating apparatus consisted of a standard programmable furnace (Fisher Coal Analyser) and a thermogravimetric analyser with a module for differential scanning calorimetry (Mettler TA 3000 system). The results indicate that the seven-minute test for volatile matter at either 900 °C or 950 °C does not fully differentiate volatiles from fixed carbon and, depending on the degree of decomposition, up to sixty minutes at 900 °C may be required. The TGA system is very useful in discriminating between different fractions of volatile matter; the relative fractions are more important in determining burning characteristics than the total percentage of volatiles. Ashing must be performed under conditions sufficiently severe to ensure complete combustion of organics; the required severity depends mainly on the degree of decomposition and sample size. Use of TGA and DSC to study the combustion of peat provides much more information than the standard proximate analysis. 14 refs.
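
    The proximate fractions discussed above follow from simple mass differences between successive heating stages. A minimal sketch, assuming weighings after drying, devolatilization, and ashing; the masses are illustrative, not the paper's measurements:

```python
# Hypothetical proximate analysis of a peat sample from four mass readings (grams):
# as-received, after drying, after volatile release in inert gas, and after ashing.
def proximate(m_wet, m_dry, m_after_devol, m_ash):
    moisture = 100.0 * (m_wet - m_dry) / m_wet
    volatile = 100.0 * (m_dry - m_after_devol) / m_wet
    fixed_c  = 100.0 * (m_after_devol - m_ash) / m_wet
    ash      = 100.0 * m_ash / m_wet
    return moisture, volatile, fixed_c, ash

m, v, f, a = proximate(1.000, 0.900, 0.320, 0.045)
print(round(m, 2), round(v, 2), round(f, 2), round(a, 2))  # 10.0 58.0 27.5 4.5
```

    The four fractions sum to 100 % of the as-received mass, which is a convenient consistency check on any set of weighings.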

  14. Mapping Cigarettes Similarities using Cluster Analysis Methods

    Directory of Open Access Journals (Sweden)

    Lorentz Jäntschi

    2007-09-01

    The aim of the research was to investigate the relationships and/or occurrences in and between chemical-composition information (tar, nicotine, carbon monoxide), market information (brand, manufacturer, price), and public-health information (class, health warning), as well as clustering, for a sample of cigarette data. Thirty cigarette brands were analyzed. Categorical (cigarette brand, manufacturer, health warning, class) and continuous (tar, nicotine and carbon monoxide concentrations, package price) variables were collected for the investigation of chemical composition, market information and public-health information. Multiple linear regression and two clustering techniques were applied. The study revealed interesting results. The carbon monoxide concentration proved to be linked with the tar and nicotine concentrations. The applied clustering methods identified groups of cigarette brands that showed similar characteristics; the tar and carbon monoxide concentrations were the main criteria used in clustering. Analysis of a larger sample could reveal more relevant and useful information regarding the similarities between cigarette brands.
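
    Clustering brands on the continuous variables can be sketched with a plain k-means implementation. The brand labels and (tar, nicotine, CO) values below are illustrative stand-ins, not the study's data.

```python
import random
random.seed(0)

# Hypothetical (tar, nicotine, CO) values in mg per cigarette for six brands.
brands = {
    "A": (1.0, 0.10, 1.5), "B": (2.0, 0.20, 2.0), "C": (1.5, 0.15, 1.8),
    "D": (12.0, 1.00, 13.0), "E": (14.0, 1.20, 15.0), "F": (13.0, 1.10, 14.0),
}

def dist2(p, q):
    return sum((a - b) ** 2 for a, b in zip(p, q))

def kmeans(points, k, iters=50):
    cents = random.sample(points, k)           # initial centroids
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:                       # assign to nearest centroid
            groups[min(range(k), key=lambda i: dist2(p, cents[i]))].append(p)
        cents = [tuple(sum(c) / len(g) for c in zip(*g)) if g else cents[i]
                 for i, g in enumerate(groups)]  # recompute centroids
    return cents

cents = kmeans(list(brands.values()), 2)
label = {name: min(range(2), key=lambda i: dist2(p, cents[i]))
         for name, p in brands.items()}
print(label)   # low-tar brands A-C fall in one cluster, high-tar D-F in the other
```

    Because the tar and CO axes dominate the squared distances here, they drive the grouping, mirroring the paper's observation that these two concentrations were the main clustering criteria.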

  15. Instrumental neutron activation analysis - a routine method

    International Nuclear Information System (INIS)

    This thesis describes the way in which instrumental neutron activation analysis (INAA) has been developed at IRI into an automated system for routine analysis. The basis of this work is formed by 20 publications describing the development of INAA since 1968. (Auth.)

  16. Critical Security Methods : New Frameworks for Analysis

    NARCIS (Netherlands)

    Voelkner, Nadine; Huysmans, Jef; Claudia, Aradau; Neal, Andrew

    2015-01-01

    Critical Security Methods offers a new approach to research methods in critical security studies. It argues that methods are not simply tools to bridge the gap between security theory and security practice. Rather, to practise methods critically means engaging in a more free and experimental interplay.

  17. K-method of cognitive mapping analysis

    OpenAIRE

    Snarskii, A. A.; Zorinets, D. I.; Lande, D. V.; Levchenko, A. V.

    2016-01-01

    We introduce a new calculation method (the K-method) for cognitive maps. The K-method consists of two consecutive stages. In the first stage, a subgraph composed of all paths from one selected node (concept) to another node (concept) is extracted from the cognitive map (a directed weighted graph). In the second stage, after transition to an undirected graph (symmetrization of the adjacency matrix), the influence of one node on another is calculated with the Kirchhoff method. In the proposed method, there is no problem...
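
    The Kirchhoff stage can be sketched as an effective-conductance computation on the symmetrized graph: treat edge weights as conductances, inject a unit current at the source concept, ground the target, and solve the reduced Laplacian system. The small graph below is illustrative; this is a sketch of the general Kirchhoff step, not the authors' exact pipeline.

```python
def effective_conductance(n, edges, s, t):
    """Effective conductance between nodes s and t of an undirected weighted graph."""
    # Build the graph Laplacian from (i, j, weight) edges.
    L = [[0.0] * n for _ in range(n)]
    for i, j, w in edges:
        L[i][i] += w; L[j][j] += w
        L[i][j] -= w; L[j][i] -= w
    # Ground node t (delete its row/column), inject unit current at s,
    # then solve the reduced system A v = b by Gaussian elimination.
    idx = [k for k in range(n) if k != t]
    A = [[L[i][j] for j in idx] for i in idx]
    b = [1.0 if i == s else 0.0 for i in idx]
    m = len(idx)
    for col in range(m):                       # forward elimination w/ pivoting
        piv = max(range(col, m), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, m):
            f = A[r][col] / A[col][col]
            for c in range(col, m):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    v = [0.0] * m                              # back substitution
    for r in range(m - 1, -1, -1):
        v[r] = (b[r] - sum(A[r][c] * v[c] for c in range(r + 1, m))) / A[r][r]
    v_s = v[idx.index(s)]
    return 1.0 / v_s   # unit current / voltage drop = effective conductance

# Two unit-weight edges in series between nodes 0 and 2: conductance 0.5
print(effective_conductance(3, [(0, 1, 1.0), (1, 2, 1.0)], 0, 2))  # -> 0.5
```

    Parallel paths add conductance (two parallel unit edges give 2.0), which is exactly the property that lets the K-method aggregate the influence carried by all paths between two concepts.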

  18. Advanced Software Methods for Physics Analysis

    International Nuclear Information System (INIS)

    Unprecedented data-analysis complexity is experienced in modern high-energy physics experiments. The complexity arises from the growing size of recorded data samples, the large number of data analyses performed by different users in each experiment, and the complexity of each individual analysis. For this reason, the requirements on software for data analysis impose a very high level of reliability. We present two concrete examples: the first from BaBar's experience with the migration to a new Analysis Model, including the definition of a new model for the Event Data Store; the second, a toolkit for multivariate statistical and parametric Monte Carlo analysis developed using generic programming.

  19. Optimizing the synthesis of CdS/ZnS core/shell semiconductor nanocrystals for bioimaging applications

    OpenAIRE

    Li-wei Liu; Si-yi Hu; Ying Pan; Jia-qi Zhang; Yue-shu Feng; Xi-he Zhang

    2014-01-01

    In this study, we report on CdS/ZnS nanocrystals as a luminescent probe for bioimaging applications. CdS nanocrystals capped with a ZnS shell had enhanced luminescence intensity, stronger stability and a longer lifetime compared to uncapped CdS. The CdS/ZnS nanocrystals were stabilized in Pluronic F127 block copolymer micelles, offering optically and colloidally stable contrast agents for in vitro and in vivo imaging. Photostability tests showed that the ZnS protective shell n...

  20. Solving Generalised Riccati Differential Equations by Homotopy Analysis Method

    Directory of Open Access Journals (Sweden)

    J. Vahidi

    2013-07-01

    In this paper, the quadratic Riccati differential equation is solved by means of an analytic technique, namely the homotopy analysis method (HAM). Comparisons are made between the Adomian decomposition method (ADM), the homotopy analysis method, and the exact solution. The results reveal that the proposed method is very effective and simple.
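
    As a numerical stand-in for the comparison against the exact solution, the sketch below integrates the standard test case y' = 1 - y², y(0) = 0 (a quadratic Riccati equation with exact solution y = tanh(t)) by classical RK4. The particular equation is an illustrative assumption, not necessarily the one treated in the paper.

```python
import math

def rk4(f, y0, t_end, n):
    """Classical 4th-order Runge-Kutta from t=0 to t=t_end in n steps."""
    h, y, t = t_end / n, y0, 0.0
    for _ in range(n):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h * k1 / 2)
        k3 = f(t + h / 2, y + h * k2 / 2)
        k4 = f(t + h, y + h * k3)
        y += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        t += h
    return y

f = lambda t, y: 1.0 - y * y          # quadratic Riccati right-hand side
approx = rk4(f, 0.0, 1.0, 100)
print(abs(approx - math.tanh(1.0)))   # error well below 1e-8
```

    Analytic techniques like HAM produce a series approximation instead of a grid of points; a benchmark of this kind supplies the reference values against which such series are checked.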

  1. Polydiacetylene-enclosed near-infrared fluorescent semiconducting polymer dots for bioimaging and sensing.

    Science.gov (United States)

    Wu, Pei-Jing; Kuo, Shih-Yu; Huang, Ya-Chi; Chen, Chuan-Pin; Chan, Yang-Hsiang

    2014-05-20

    Semiconducting polymer dots (P-dots) recently have emerged as a new type of ultrabright fluorescent probe with promising applications in biological imaging and detection. With the increasing desire for near-infrared (NIR) fluorescing probes for in vivo biological measurements, the currently available NIR-emitting P-dots are very limited and the leaching of the encapsulated dyes/polymers has usually been a concern. To address this challenge, we first embedded the NIR dyes into the matrix of poly[(9,9-dioctylfluorene)-co-2,1,3-benzothiadiazole-co-4,7-di(thiophen-2-yl)-2,1,3-benzothiadiazole] (PF-BT-DBT) polymer and then enclosed the doped P-dots with polydiacetylenes (PDAs) to avoid potential leakage of the entrapped NIR dyes from the P-dot matrix. These PDA-enclosed NIR-emitting P-dots not only emitted much stronger NIR fluorescence than conventional organic molecules but also exhibited enhanced photostability over CdTe quantum dots, free NIR dyes, and gold nanoclusters. We next conjugated biomolecules onto the surface of the resulting P-dots and demonstrated their capability for specific cellular labeling without any noticeable nonspecific binding. To employ this new class of material as a facile sensing platform, an easy-to-prepare test paper, obtained by soaking the paper into the PDA-enclosed NIR-emitting P-dot solution, was used to sense external stimuli such as ions, temperature, or pH, depending on the surface functionalization of PDAs. We believe these PDA-coated NIR-fluorescing P-dots will be very useful in a variety of bioimaging and analytical applications. PMID:24749695

  2. Low-temperature approach to highly emissive copper indium sulfide colloidal nanocrystals and their bioimaging applications.

    Science.gov (United States)

    Yu, Kui; Ng, Peter; Ouyang, Jianying; Zaman, Md Badruz; Abulrob, Abedelnasser; Baral, Toya Nath; Fatehi, Dorothy; Jakubek, Zygmunt J; Kingston, David; Wu, Xiaohua; Liu, Xiangyang; Hebert, Charlie; Leek, Donald M; Whitfield, Dennis M

    2013-04-24

    We report our newly developed low-temperature synthesis of colloidal photoluminescent (PL) CuInS2 nanocrystals (NCs) and their in vitro and in vivo imaging applications. With diphenylphosphine sulphide (SDPP) as a S precursor made from elemental S and diphenylphosphine, this is a noninjection based approach in 1-dodecanethiol (DDT) with excellent synthetic reproducibility and large-scale capability. For a typical synthesis with copper iodide (CuI) as a Cu source and indium acetate (In(OAc)3) as an In source, the growth temperature was as low as 160 °C and the feed molar ratios were 1Cu-to-1In-to-4S. Amazingly, the resulting CuInS2 NCs in toluene exhibit quantum yield (QY) of ~23% with photoemission peaking at ~760 nm and full width at half maximum (FWHM) of ~140 nm. With a mean size of ~3.4 nm (measured from the vertices to the bases of the pyramids), they are pyramidal in shape with a crystal structure of tetragonal chalcopyrite. In situ (31)P NMR (monitored from 30 °C to 100 °C) and in situ absorption at 80 °C suggested that the Cu precursor should be less reactive toward SDPP than the In precursor. For our in vitro and in vivo imaging applications, CuInS2/ZnS core-shell QDs were synthesized; afterwards, dihydrolipoic acid (DHLA) or 11-mercaptoundecanoic acid (MUA) were used for ligand exchange and then bio-conjugation was performed. Two single-domain antibodies (sdAbs) were used. One was 2A3 for in vitro imaging of BxPC3 pancreatic cancer cells. The other was EG2 for in vivo imaging of a Glioblastoma U87MG brain tumour model. The bioimaging data illustrate that the CuInS2 NCs from our SDPP-based low-temperature noninjection approach are good quality. PMID:23486927

  3. Methods for analysis of fluoroquinolones in biological fluids

    Science.gov (United States)

    Methods for analysis of 10 selected fluoroquinolone antibiotics in biological fluids are reviewed. Approaches for sample preparation, detection methods, limits of detection and quantitation and recovery information are provided for both single analyte and multi-analyte fluoroquinolone methods....

  4. NOA: a novel Network Ontology Analysis method

    OpenAIRE

    Wang, Jiguang; Huang, Qiang; Liu, Zhi-Ping; Wang, Yong; Wu, Ling-Yun; Chen, Luonan; Zhang, Xiang-Sun

    2011-01-01

    Gene ontology analysis has become a popular and important tool in bioinformatics, and current ontology analyses are mainly conducted on an individual gene or a gene list. However, recent molecular network analysis reveals that the same list of genes with different interactions may perform different functions. Therefore, it is necessary to consider molecular interactions to correctly and specifically annotate biological networks. Here, we propose a novel Network Ontology Analysis (NOA) method...
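
    NOA annotates interactions rather than gene lists; for contrast, here is a minimal sketch of the classical list-based step it generalizes: a hypergeometric enrichment p-value for a single ontology term. All counts are illustrative.

```python
from math import comb

def hypergeom_pval(N, K, n, k):
    """P(X >= k) when drawing n genes from N, of which K carry the term."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / comb(N, n)

# 1000 genes in the background, 50 annotated with the term;
# our list of 20 genes contains 5 of them -> strong enrichment.
p = hypergeom_pval(1000, 50, 20, 5)
print(p < 0.01)   # -> True
```

    NOA's key change is the universe over which this test runs: edges (interactions) annotated by the terms of their endpoint genes, instead of the genes themselves.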

  5. Methods for Mediation Analysis with Missing Data

    Science.gov (United States)

    Zhang, Zhiyong; Wang, Lijuan

    2013-01-01

    Despite wide application of both mediation models and missing-data techniques, formal discussion of mediation analysis with missing data is still rare. We introduce and compare four approaches to dealing with missing data in mediation analysis, including listwise deletion, pairwise deletion, multiple imputation (MI), and a two-stage maximum…
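
    The simplest of the four approaches, listwise deletion, can be sketched directly: drop incomplete rows, then estimate the indirect effect a·b from two least-squares fits (M on X for a; Y on M controlling for X for b). The data are illustrative, and this sketch omits the standard errors a real analysis would report.

```python
def mean(v):
    return sum(v) / len(v)

def cov(u, v):
    mu, mv = mean(u), mean(v)
    return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / (len(u) - 1)

def indirect_effect(rows):
    """Indirect effect a*b from (X, M, Y) rows; None marks a missing value."""
    complete = [r for r in rows if None not in r]      # listwise deletion
    X, M, Y = (list(c) for c in zip(*complete))
    a = cov(X, M) / cov(X, X)                          # M = i + a*X
    # b: partial coefficient of M in Y = i + c'*X + b*M (2x2 normal equations)
    sxx, smm, sxm = cov(X, X), cov(M, M), cov(X, M)
    sxy, smy = cov(X, Y), cov(M, Y)
    b = (smy * sxx - sxy * sxm) / (smm * sxx - sxm ** 2)
    return a * b

rows = [(0, 1, 2), (1, 0, 0), (2, 3, 6), (3, 2, 4),
        (4, None, 8), (5, 6, 12), (6, 6.5, None), (7, 6, 12)]
print(round(indirect_effect(rows), 2))   # -> 1.76
```

    Listwise deletion is unbiased only under restrictive missingness assumptions, which is precisely why the paper compares it against MI and likelihood-based alternatives.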

  6. Statistical Analysis and Multivariate Methods in MS Excel

    OpenAIRE

    Postler, Štěpán

    2010-01-01

    The aim of this thesis is to create an application in Microsoft Office Excel 2003 that allows the user to solve problems using selected statistical analysis methods. This application is provided in the stat.xls file, which is the main part of the thesis. The XLS file enables Excel to apply methods of univariate and bivariate categorical frequency analysis, univariate, bivariate and even multivariate analysis of quantitative data, and index analysis of economic data. To apply the methods the fi...

  7. Multiscale Methods for Nuclear Reactor Analysis

    Science.gov (United States)

    Collins, Benjamin S.

    The ability to accurately predict local pin powers in nuclear reactors is necessary to understand the mechanisms that cause fuel pin failure during steady-state and transient operation. In the research presented here, methods are developed to improve the local solution using high-order methods with boundary conditions from a low-order global solution. Several different core configurations were tested to determine the improvement in the local pin powers compared to the standard techniques, which use diffusion theory and pin power reconstruction (PPR). Two different multiscale methods were developed and analyzed: the post-refinement multiscale method and the embedded multiscale method. The post-refinement multiscale method uses the global solution to determine boundary conditions for the local solution. The local solution is solved using either a fixed boundary source or an albedo boundary condition; this solution is "post-refinement" and thus has no impact on the global solution. The embedded multiscale method allows the local solver to change the global solution, providing an improved global and local solution. The post-refinement multiscale method is assessed using three core designs. When the local solution has more energy groups, the fixed-source method has some difficulties near the interface; however, the albedo method works well for all cases. To remedy the boundary-condition errors of the fixed-source method, a buffer region is used to act as a filter, which decreases the sensitivity of the solution to the boundary condition. Both the albedo and fixed-source methods benefit from the use of a buffer region. Unlike the post-refinement method, the embedded multiscale method alters the global solution. The ability to change the global solution allows for refinement in areas where the errors in the few-group nodal diffusion are typically large. The embedded method is shown to improve the global solution when it is applied to a MOX/LEU assembly.

  8. SYSTEMATIZATION AND ANALYSIS OF METHODS FOR MACHINING HOLES IN COMPOSITES

    OpenAIRE

    Пасічник, Віталій Анатолійович; Черказний, Віталій Юрійович

    2015-01-01

    Purpose. To analyze and systematize the literature on mechanical methods of machining holes in composite materials, and to create a system of evaluation according to the needs and situation. Design/methodology/approach. Using estimates across sets of indicators of quality, productivity, efficiency and the technological capabilities of each method, the system determines which method is the most effective in a given situation. Findings. An analysis was conducted of methods for machining holes in com...

  9. Present methods for mineralogical analysis of uranium ores

    International Nuclear Information System (INIS)

    The most promising methods of mineralogical analysis of uranium and uranium-containing minerals, ores and rocks are considered, including X-ray diffraction, electron microscopy and fluorescence spectroscopy. The physical basis and capabilities of each method are described, and examples of its practical application are presented. A comparative characterization of the methods for mineralogical analysis of radioactive ores and their reprocessing products is given. Attention is paid to the equipment and various devices for analysis.

  10. Cost benefit analysis methods in public sector

    OpenAIRE

    Kinnunen, T.

    2016-01-01

    Cost-benefit analysis is an economic analysis tool that can be used to support public decision making when there are several mutually exclusive alternatives under consideration. It compares the monetary value of the benefits resulting from a specific project or policy with the costs accrued by it. However, it appears to be used mainly for investment projects and not for analyzing public services. This thesis is a literature study on the use of cost-benefit analysis in the p...

  11. Nuclear analysis methods in monitoring occupational health

    International Nuclear Information System (INIS)

    With the increasing industrialisation of the world has come an increase in exposure to hazardous chemicals. Their effect on the body depends upon the concentration of the element in the work environment, its chemical form, the possible routes of intake, and the individual's biological response to the chemical. Nuclear techniques of analysis, such as neutron activation analysis (NAA) and proton-induced X-ray emission analysis (PIXE), have played an important role in understanding the effects hazardous chemicals can have on occupationally exposed workers. In this review, examples of their application, mainly in monitoring exposure to heavy metals, are discussed.

  12. Numerical analysis in electromagnetics the TLM method

    CERN Document Server

    Saguet, Pierre

    2013-01-01

    The aim of this book is to give a broad overview of the TLM (Transmission Line Matrix) method, which is one of the "time-domain numerical methods". These methods are known for their significant reliance on computer resources, but they have the advantage of being highly general. The TLM method has acquired a reputation as a powerful and effective tool among numerous teams and still benefits today from significant theoretical developments. In particular, in recent years, its ability to simulate various situations with excellent precision, including complex materials, has been...

  13. Chemical Analysis Methods for Silicon Carbide

    Institute of Scientific and Technical Information of China (English)

    Shen Keyin

    2006-01-01

    1. General and Scope: This Standard specifies the methods for determination of silicon dioxide, free silicon, free carbon, total carbon, silicon carbide and ferric sesquioxide in silicon carbide abrasive material.

  14. Steel mill products analysis using qualities methods

    Directory of Open Access Journals (Sweden)

    B. Gajdzik

    2016-10-01

    The article presents steel mill product analysis using quality tools. The subjects of quality control were bolts and a ball bushing. A Pareto chart and failure mode and effect analysis (FMEA) were used to assess the faultiness of the products. In the case of the bolt, the faultiness analysis detected the following defects: failure to keep the dimensional tolerance, dents and imprints, improper roughness, lack of pre-machining, non-compatibility of the electroplating and faults on the surface. Analysis of the ball bushing also revealed defects such as: failure to keep the dimensional tolerance, dents and imprints, improper surface roughness, lack of surface pre-machining, as well as sharp edges and splitting of the material.
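
    The FMEA ranking mentioned above is usually driven by a risk priority number, RPN = severity x occurrence x detection. A minimal sketch, using the bolt defect names from the abstract but entirely invented scores (the paper's actual ratings are not given here):

```python
# Hypothetical FMEA tally: RPN = severity * occurrence * detection (1-10 scales).
# Defect names follow the bolt example in the abstract; all scores are invented.
defects = {
    "dimensional tolerance not kept": (7, 6, 4),
    "dents and imprints": (4, 7, 3),
    "improper roughness": (5, 4, 5),
}
rpn = {name: s * o * d for name, (s, o, d) in defects.items()}

# Rank defects by descending RPN to prioritize corrective action.
for name, value in sorted(rpn.items(), key=lambda kv: -kv[1]):
    print(f"{name}: RPN = {value}")
```

    The highest-RPN failure mode is addressed first; after a corrective measure, the scores are re-estimated and the ranking recomputed.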

  15. Human resources methods analysis in engineering company

    OpenAIRE

    Akšteinová, Michaela

    2009-01-01

    The aim of this bachelor thesis is to analyze the human resources activities of SPG Czech, s.r.o., a machine-producing company. The thesis is divided into a theoretical and a practical part. The theoretical part describes the human resources activities: work analysis, recruiting and selecting employees, onboarding and adjustment of employees, managing job performance and evaluating employees, education and development of employees, rewarding employees and care of employees. Based...

  16. Analysis of queues methods and applications

    CERN Document Server

    Gautam, Natarajan

    2012-01-01

    Introduction; Analysis of Queues: Where, What, and How?; Systems Analysis: Key Results; Queueing Fundamentals and Notations; Psychology in Queueing; Reference Notes; Exercises; Exponential Interarrival and Service Times: Closed-Form Expressions; Solving Balance Equations via Arc Cuts; Solving Balance Equations Using Generating Functions; Solving Balance Equations Using Reversibility; Reference Notes; Exercises; Exponential Interarrival and Service Times: Numerical Techniques and Approximations; Multidimensional Birth and Death Chains; Multidimensional Markov Chains; Finite-State Markov Chains; Reference Notes; Exerci...

  17. Information Security Risk Analysis Methods and Research Trends: AHP and Fuzzy Comprehensive Method

    OpenAIRE

    Ming-Chang Lee

    2014-01-01

    Information security risk analysis is becoming an increasingly essential component of an organization's operations. Traditional information security risk analysis uses quantitative and qualitative methods, each of which has advantages for information risk analysis. However, the hierarchy process has been widely used in security assessment. A future research direction may be the development and application of soft computing such as rough sets, grey set...

  18. SAGE Research Methods Datasets: A Data Analysis Educational Tool.

    Science.gov (United States)

    Vardell, Emily

    2016-01-01

    SAGE Research Methods Datasets (SRMD) is an educational tool designed to offer users the opportunity to obtain hands-on experience with data analysis. Users can search for and browse authentic datasets by method, discipline, and data type. Each of the datasets is supplemented with educational material on the research method and clear guidelines for how to approach data analysis.

  19. Sensometrics methods for descriptive analysis and sorting task. Two applications

    OpenAIRE

    Miranda, Karen

    2013-01-01

    We present the results obtained with two sensory methods: descriptive analysis and categorization (sorting task). Several statistical methods were applied to analyze the results: ANOVA, PCA, MCA and MFA. The sensometric methods for descriptive analysis and the sorting task were applied in two cases related to the food and beverage industries.

  20. Methods of Fourier analysis and approximation theory

    CERN Document Server

    Tikhonov, Sergey

    2016-01-01

    Different facets of interplay between harmonic analysis and approximation theory are covered in this volume. The topics included are Fourier analysis, function spaces, optimization theory, partial differential equations, and their links to modern developments in the approximation theory. The articles of this collection originated from two events. The first event took place during the 9th ISAAC Congress in Krakow, Poland, 5th-9th August 2013, at the section “Approximation Theory and Fourier Analysis”. The second event was the conference on Fourier Analysis and Approximation Theory in the Centre de Recerca Matemàtica (CRM), Barcelona, during 4th-8th November 2013, organized by the editors of this volume. All articles selected to be part of this collection were carefully reviewed.

  1. Method for chromium analysis and speciation

    Energy Technology Data Exchange (ETDEWEB)

    Aiken, Abigail M.; Peyton, Brent M.; Apel, William A.; Petersen, James N.

    2004-11-02

    A method of detecting a metal in a sample comprising a plurality of metals is disclosed. The method comprises providing the sample comprising the metal to be detected. The sample is added to a reagent solution comprising an enzyme and a substrate, where the enzyme is inhibited by the metal to be detected. An array of chelating agents is used to eliminate the inhibitory effects of the additional metals in the sample. An enzymatic activity in the sample is determined and compared to the enzymatic activity in a control solution to detect the metal. A method of determining the concentration of the metal in the sample is also disclosed, as is a method of detecting the valence state of a metal.

  2. Economic Analysis of Paddy Threshing Methods

    OpenAIRE

    Prasanna, P.H.S.N.; L H P Gunaratne; Withana, W.D.R.S.

    2004-01-01

    Post-harvest losses of paddy in Sri Lanka are as high as 15 percent of total production. Of this, about 24 percent of losses occur during the threshing and cleaning stage with tractor treading being the most common paddy threshing method. In order to overcome these deficiencies, recently small and combined threshers have been introduced. This study attempted to determine the efficiency of different paddy threshing methods, and to estimate the profitability of small and combined thresher owner...

  3. AVIS: analysis method for document coherence

    International Nuclear Information System (INIS)

    The present document gives a short insight into AVIS, a method for verifying the quality of technical documents. The paper presents the applied approach, based on the K.O.D. method, defines the quality criteria of a technical document, and describes the means of evaluating these criteria. (authors). 9 refs., 2 figs

  4. Game data analysis tools and methods

    CERN Document Server

    Coupart, Thibault

    2013-01-01

    This book features an introduction to the basic theoretical tenets of data analysis from a game developer's point of view, as well as a practical guide to performing gameplay analysis on a real-world game. This book is ideal for video game developers who want to try and experiment with the game analytics approach for their own productions. It will provide a good overview of the themes you need to pay attention to, and will pave the way for success. Furthermore, the book also provides a wide range of concrete examples that will be useful for any game data analysts or scientists who want to impro...

  5. Chemical aspects of nuclear methods of analysis

    International Nuclear Information System (INIS)

    This final report includes papers which fall into three general areas: development of practical pre-analysis separation techniques, uranium/thorium separation from other elements for analytical and processing operations, and theory and mechanism of separation techniques. A separate abstract was prepared for each of the 9 papers

  6. Mixed Methods Analysis of Enterprise Social Networks

    DEFF Research Database (Denmark)

    Behrendt, Sebastian; Richter, Alexander; Trier, Matthias

    2014-01-01

    The increasing use of enterprise social networks (ESN) generates vast amounts of data, giving researchers and managerial decision makers unprecedented opportunities for analysis. However, more transparency about the available data dimensions and how these can be combined is needed to yield accurate...

  7. Analysis of Two Methods to Evaluate Antioxidants

    Science.gov (United States)

    Tomasina, Florencia; Carabio, Claudio; Celano, Laura; Thomson, Leonor

    2012-01-01

    This exercise is intended to introduce undergraduate biochemistry students to the analysis of antioxidants as a biotechnological tool. In addition, some statistical resources will also be used and discussed. Antioxidants play an important metabolic role, preventing oxidative stress-mediated cell and tissue injury. Knowing the antioxidant content…

  8. Transonic wing analysis using advanced computational methods

    Science.gov (United States)

    Henne, P. A.; Hicks, R. M.

    1978-01-01

    This paper discusses the application of three-dimensional computational transonic flow methods to several different types of transport wing designs. The purpose of these applications is to evaluate the basic accuracy and limitations associated with such numerical methods. The use of such computational methods for practical engineering problems can only be justified after favorable evaluations are completed. The paper summarizes a study of both the small-disturbance and the full potential technique for computing three-dimensional transonic flows. Computed three-dimensional results are compared to both experimental measurements and theoretical results. Comparisons are made not only of pressure distributions but also of lift and drag forces. Transonic drag rise characteristics are compared. Three-dimensional pressure distributions and aerodynamic forces, computed from the full potential solution, compare reasonably well with experimental results for a wide range of configurations and flow conditions.

  9. Statistical methods for astronomical data analysis

    CERN Document Server

    Chattopadhyay, Asis Kumar

    2014-01-01

    This book introduces “Astrostatistics” as a subject in its own right with rewarding examples, including work by the authors with galaxy and Gamma Ray Burst data to engage the reader. This includes a comprehensive blending of Astrophysics and Statistics. The first chapter’s coverage of preliminary concepts and terminologies for astronomical phenomenon will appeal to both Statistics and Astrophysics readers as helpful context. Statistics concepts covered in the book provide a methodological framework. A unique feature is the inclusion of different possible sources of astronomical data, as well as software packages for converting the raw data into appropriate forms for data analysis. Readers can then use the appropriate statistical packages for their particular data analysis needs. The ideas of statistical inference discussed in the book help readers determine how to apply statistical tests. The authors cover different applications of statistical techniques already developed or specifically introduced for ...

  10. Adaptive computational methods for aerothermal heating analysis

    Science.gov (United States)

    Price, John M.; Oden, J. Tinsley

    1988-01-01

    The development of adaptive gridding techniques for finite-element analysis of fluid dynamics equations is described. The developmental work was done with the Euler equations with concentration on shock and inviscid flow field capturing. Ultimately this methodology is to be applied to a viscous analysis for the purpose of predicting accurate aerothermal loads on complex shapes subjected to high speed flow environments. The development of local error estimate strategies as a basis for refinement strategies is discussed, as well as the refinement strategies themselves. The application of the strategies to triangular elements and a finite-element flux-corrected-transport numerical scheme are presented. The implementation of these strategies in the GIM/PAGE code for 2-D and 3-D applications is documented and demonstrated.

  11. Discrete element analysis methods of generic differential quadratures

    CERN Document Server

    Chen, Chang-New

    2008-01-01

    Presents generic differential quadrature, extended differential quadrature and the related discrete element analysis methods, and demonstrates their ability to solve generic scientific and engineering problems.

  12. An ESDIRK Method with Sensitivity Analysis Capabilities

    DEFF Research Database (Denmark)

    Kristensen, Morten Rode; Jørgensen, John Bagterp; Thomsen, Per Grove;

    2004-01-01

    of the sensitivity equations. A key feature is the reuse of information already computed for the state integration, hereby minimizing the extra effort required for sensitivity integration. Through case studies the new algorithm is compared to an extrapolation method and to the more established BDF based approaches...

  13. Simplified Processing Method for Meter Data Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Fowler, Kimberly M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Colotelo, Alison H. A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Downs, Janelle L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ham, Kenneth D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Henderson, Jordan W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Montgomery, Sadie A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Vernon, Christopher R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Parker, Steven A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-11-01

    A simple, quick metered-data processing method that can be used for Army Metered Data Management System (MDMS) and Logistics Innovation Agency data, and may also be useful for other large data sets. It is intended for large data sets where the analyst has little information about the buildings.

  14. Analysis of Vibration Diagnostics Methods for Induction Motors

    Directory of Open Access Journals (Sweden)

    A. Kalinov

    2012-01-01

    The paper presents an analysis of existing vibration diagnostics methods. To evaluate the efficiency of applying a method, the following criteria are proposed: the volume of input data required for establishing a diagnosis, the data content, the software and hardware level, and the execution time of the diagnostics. According to these criteria, a classification of vibration diagnostics methods is presented to determine their advantages and disadvantages and to identify directions for their development and improvement. The paper contains a comparative estimation of the methods against the proposed criteria, according to which the most efficient methods are spectral analysis and spectral analysis of the vibration signal envelope.
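
    Envelope spectral analysis, the second method singled out above, demodulates a signal before taking its spectrum so that low-frequency modulation (e.g. a fault-repetition rate) becomes visible. A minimal numpy-only sketch with invented parameters, a 3 kHz carrier amplitude-modulated at 50 Hz, standing in for a real motor vibration record:

```python
import numpy as np

# Synthetic vibration: 3 kHz carrier modulated at 50 Hz (all values invented).
fs = 20000
t = np.arange(0, 1.0, 1 / fs)
signal = (1 + 0.5 * np.cos(2 * np.pi * 50 * t)) * np.sin(2 * np.pi * 3000 * t)

# Analytic signal via FFT (same construction scipy.signal.hilbert uses);
# its magnitude is the amplitude envelope.
n = len(signal)
spectrum = np.fft.fft(signal)
h = np.zeros_like(spectrum)
h[0] = 1
h[1:n // 2] = 2
h[n // 2] = 1
envelope = np.abs(np.fft.ifft(spectrum * h))

# The spectrum of the (mean-removed) envelope peaks at the modulation frequency.
env_spec = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(n, 1 / fs)
peak = freqs[np.argmax(env_spec)]
print(f"dominant envelope frequency: {peak:.0f} Hz")
```

    A plain spectrum of `signal` would show energy near 3 kHz only; the envelope spectrum exposes the 50 Hz modulation, which is why the method is favored for bearing and gear diagnostics.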

  15. Updated Methods for Seed Shape Analysis

    Directory of Open Access Journals (Sweden)

    Emilio Cervantes

    2016-01-01

    Morphological variation in seed characters includes differences in seed size and shape. Seed shape is an important trait in plant identification and classification. It also has agronomic importance because it reflects genetic, physiological, and ecological components and affects yield, quality, and market price. The use of digital technologies, together with the development of quantification and modeling methods, allows a better description of seed shape. Image processing systems are used in the automatic determination of seed size and shape, becoming a basic tool in the study of diversity. Seed shape is described by a variety of indexes (circularity, roundness, and the J index). Comparing the seed image to a geometrical figure (circle, cardioid, ellipse, ellipsoid, etc.) provides a precise quantification of shape. The methods of shape quantification based on these models are useful for an accurate description, allowing comparison between genotypes or along developmental phases as well as establishing the level of variation in different sets of seeds.
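
    One of the indexes named in the abstract, circularity, is a standard image-analysis quantity computed from a region's measured area A and perimeter P as 4*pi*A/P^2 (1.0 for a perfect circle). A minimal sketch with invented measurements, not the paper's data:

```python
import math

def circularity(area, perimeter):
    """Shape index 4*pi*A/P^2: 1.0 for a circle, smaller for rougher outlines."""
    return 4 * math.pi * area / perimeter ** 2

# Unit circle: area pi, perimeter 2*pi -> circularity exactly 1.
print(circularity(math.pi, 2 * math.pi))

# Unit square: area 1, perimeter 4 -> circularity pi/4, about 0.785.
print(circularity(1.0, 4.0))
```

    In practice A and P come from a segmented seed image (e.g. pixel counts and contour length), and the index is compared across genotypes or developmental stages.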

  16. MLPA method for PMP22 gene analysis

    OpenAIRE

    Kokalj-Vokač, Nadja; Stangler Herodež, Špela; Zagradišnik, Boris

    2005-01-01

    DNA copy number alterations are responsible for several categories of human diseases and syndromes. These changes can be detected by cytogenetic studies when there is involvement of several kilobases or megabases of DNA. Examination of sub-microscopic changes is possible by using short probes flanked by the same primer pairs. Multiplex ligation-dependent probe amplification (MLPA) is a simple, high resolution method by which not sample nucleic acids but probes added to the samples are amplifi...

  17. Computational stress analysis using finite volume methods

    OpenAIRE

    Fallah, Nosrat Allah

    2000-01-01

    There is a growing interest in applying finite volume methods to model solid mechanics problems and multi-physics phenomena. During the last ten years an increasing amount of activity has taken place in this area. Unlike the finite element formulation, which generally involves volume integrals, the finite volume formulation transfers volume integrals to surface integrals using the divergence theorem. This transformation for convection and diffusion terms in the governing equations, ensures...

  18. Geometrical Methods for Power Network Analysis

    CERN Document Server

    Bellucci, Stefano; Gupta, Neeraj

    2013-01-01

    This book is a short introduction to power system planning and operation using advanced geometrical methods. The approach is based on well-known insights and techniques developed in theoretical physics in the context of Riemannian manifolds. The proof of principle and robustness of this approach is examined in the context of the IEEE 5 bus system. This work addresses applied mathematicians, theoretical physicists and power engineers interested in novel mathematical approaches to power network theory.

  19. PERFORMANCE ANALYSIS OF HARDWARE TROJAN DETECTION METHODS

    OpenAIRE

    Ehsan, Sharifi; Kamal, Mohammadiasl; Mehrdad, Havasi; Amir, Yazdani

    2015-01-01

    Due to the increasing use of information and communication technologies in most aspects of life, the security of information has drawn the attention of governments and industry as well as researchers. In this regard, structural attacks on the functions of a chip are called hardware Trojans, and they are capable of rendering ineffective the security protecting our systems and data. This threat represents a big challenge for cyber-security, as it is nearly impossible to detect with any currently ...

  20. Casting defects analysis by the Pareto method

    Directory of Open Access Journals (Sweden)

    B. Borowiecki

    2011-07-01

    On the basis of the received results, a Pareto-Lorenz diagram was formed. The graph showed that about 70% of the total number of casting defects was accounted for by three defect types, i.e. three types of causes: sand holes, porosity and slag inclusions. These defects show that it is necessary to take up the design of the gating system. The remaining 8 causes concerned only about 25% of the total number of casting defects. Analysis of the received results permits determining the direction of corrective actions in order to eliminate, or at least limit, most of the defects.
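
    The Pareto-Lorenz step described above is just a sorted cumulative-share tally. A sketch with hypothetical defect counts (the cause names echo the abstract; the numbers are invented so that the top three causes reach 70%, matching the reported pattern):

```python
# Hypothetical defect counts per cause; not the paper's data.
counts = {"sand holes": 41, "porosity": 18, "slag inclusions": 11,
          "misrun": 6, "cold shut": 5, "shrinkage": 4, "cracks": 3,
          "scabs": 3, "swells": 3, "drops": 3, "pinholes": 3}

# Sort causes by frequency and accumulate their percentage share.
total = sum(counts.values())
cumulative, running = {}, 0
for name, n in sorted(counts.items(), key=lambda kv: -kv[1]):
    running += n
    cumulative[name] = 100 * running / total
print(cumulative)
```

    Plotting `counts` as bars with `cumulative` as a line gives the Pareto chart; causes to the left of the chosen cut-off (often 70-80% cumulative) are addressed first.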

  1. Extrudate Expansion Modelling through Dimensional Analysis Method

    DEFF Research Database (Denmark)

    A new model framework is proposed to correlate extrudate expansion and extrusion operation parameters for a food extrusion cooking process through the dimensional analysis principle, i.e. the Buckingham pi theorem. Three dimensionless groups, i.e. energy, water content and temperature, are suggested to describe the extrudate expansion. From the three dimensionless groups, an equation with three experimentally determined parameters is derived to express the extrudate expansion. The model is evaluated with whole wheat flour and aquatic feed extrusion experimental data. The average deviations...
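
    The Buckingham pi theorem invoked above says the dimensionless groups are a basis of the null space of the dimension matrix, so their number is (variables) minus (rank). A generic textbook illustration (drag on a sphere: force F, velocity v, diameter d, density rho, viscosity mu), not the paper's extrusion variables:

```python
import numpy as np

# Dimension matrix: rows are exponents of mass, length, time;
# columns are the variables F, v, d, rho, mu.
D = np.array([[1.0, 0.0, 0.0, 1.0, 1.0],
              [1.0, 1.0, 1.0, -3.0, -1.0],
              [-2.0, -1.0, 0.0, 0.0, -1.0]])

rank = np.linalg.matrix_rank(D)
n_groups = D.shape[1] - rank            # Buckingham pi: 5 variables - rank 3 = 2
_, _, Vt = np.linalg.svd(D)
null = Vt[rank:]                        # exponent vectors of the pi groups
print(f"{n_groups} dimensionless groups")
```

    Each row of `null` is an exponent vector whose product of variables is dimensionless (here spanning the Reynolds number and a drag coefficient); the paper's three groups arise the same way from its own variable list.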

  2. Paired comparisons analysis: an axiomatic approach to ranking methods

    NARCIS (Netherlands)

    Gonzalez-Diaz, J.; Hendrickx, Ruud; Lohmann, E.R.M.A.

    2014-01-01

    In this paper we present an axiomatic analysis of several ranking methods for general tournaments. We find that the ranking method obtained by applying maximum likelihood to the (Zermelo-)Bradley-Terry model, the most common method in statistics and psychology, is one of the ranking methods that per

  3. Sensitivity analysis via reduced order adjoint method

    International Nuclear Information System (INIS)

    Notwithstanding the voluminous literature on adjoint sensitivity analysis, it has generally been dismissed by practitioners as cumbersome, with limited value in realistic engineering models. This perception reflects two limitations of adjoint sensitivity analysis: a) its most effective application is limited to the calculation of first-order variations; when higher-order derivatives are required, it quickly becomes computationally inefficient; and b) the number of adjoint model evaluations depends on the number of responses, which renders it ineffective for multi-physics models where entire distributions, such as the flux and power distributions, are often transferred between the various physics models. To overcome these challenges, this manuscript employs recent advances in reduced order modeling to re-cast the adjoint model equations into a form that renders their application to real reactor models practical. Past work applied reduced order modeling techniques to reduce general nonlinear high-dimensional models by identifying mathematical subspaces, called active subspaces, that capture all dominant features of the model, including both linear and nonlinear variations. We demonstrate the application of these techniques to the calculation of first-order derivatives, commonly known as sensitivity coefficients, for a fuel assembly model with many responses. We show that the computational cost becomes dependent on the physics model itself, via the so-called rank of the active subspace, rather than on the number of responses or parameters. (author)
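
    A toy sketch of the active-subspace idea mentioned above: sample gradients of a response, then an SVD of the stacked gradients reveals the few parameter directions that actually drive it. The quadratic response below is invented purely for illustration; the paper's reactor models are far richer.

```python
import numpy as np

rng = np.random.default_rng(0)
w = np.array([3.0, 0.01, 0.01, 0.01])   # response f(x) = (w.x)^2: one dominant direction

def grad(x):
    """Gradient of the toy response f(x) = (w.x)^2."""
    return 2 * (w @ x) * w

# Stack gradient samples at random parameter points and take an SVD;
# the decay of the singular values exposes the active subspace dimension.
G = np.stack([grad(rng.standard_normal(4)) for _ in range(50)])
_, s, Vt = np.linalg.svd(G, full_matrices=False)
print(s / s[0])                          # sharp drop after the first value: 1-D active subspace
```

    Once the dominant rows of `Vt` are known, sensitivities need only be resolved within that low-dimensional subspace, which is the source of the cost reduction claimed in the abstract.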

  4. The Constant Comparative Method of Qualitative Analysis

    Directory of Open Access Journals (Sweden)

    Barney G. Glaser, Ph.D.

    2008-11-01

    Currently, the general approaches to the analysis of qualitative data are these: 1. If the analyst wishes to convert qualitative data into crudely quantifiable form so that he can provisionally test a hypothesis, he codes the data first and then analyzes it. He makes an effort to code “all relevant data [that] can be brought to bear on a point,” and then systematically assembles, assesses and analyzes these data in a fashion that will “constitute proof for a given proposition.” 2. If the analyst wishes only to generate theoretical ideas (new categories and their properties, hypotheses and interrelated hypotheses), he cannot be confined to the practice of coding first and then analyzing the data since, in generating theory, he is constantly redesigning and reintegrating his theoretical notions as he reviews his material. Analysis serves his purpose, but the explicit coding itself often seems an unnecessary, burdensome task. As a result, the analyst merely inspects his data for new properties of his theoretical categories, and writes memos on these properties. We wish to suggest a third approach.

  5. Information Security Risk Analysis Methods and Research Trends: AHP and Fuzzy Comprehensive Method

    Directory of Open Access Journals (Sweden)

    Ming-Chang Lee

    2014-02-01

    Information security risk analysis is becoming an increasingly essential component of an organization's operations. Traditional information security risk analysis uses quantitative and qualitative methods, each of which has advantages for information risk analysis. However, the hierarchy process has been widely used in security assessment. A future research direction may be the development and application of soft computing such as rough sets, grey sets, fuzzy systems, genetic algorithms, support vector machines, Bayesian networks and hybrid models. Hybrid models are developed by integrating two or more existing models. Practical advice for evaluating information security risk is discussed. This approach combines AHP and the fuzzy comprehensive method.
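
    The AHP step common to such hybrids derives priority weights from a pairwise comparison matrix via its principal eigenvector (Saaty's method). A minimal sketch with an invented 3x3 judgment matrix for three risk criteria, not the paper's data:

```python
import numpy as np

# Reciprocal pairwise comparison matrix for criteria A, B, C (invented):
# A is judged 3x as important as B and 5x as important as C.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# The principal eigenvector, normalized to sum to 1, gives the weights.
vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)
w = np.abs(vecs[:, k].real)
w /= w.sum()
print(np.round(w, 3))   # criterion A dominates
```

    In a fuzzy comprehensive evaluation these weights are then combined with a fuzzy membership matrix over the risk levels; checking the consistency ratio of `A` before using the weights is standard practice.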

  6. Vulnerability analysis of three remote voting methods

    CERN Document Server

    Enguehard, Chantal

    2009-01-01

    This article analyses three methods of remote voting in an uncontrolled environment: postal voting, internet voting and hybrid voting. It breaks down the voting process into different stages and compares their vulnerabilities considering criteria that must be respected in any democratic vote: confidentiality, anonymity, transparency, vote unicity and authenticity. Whether for safety or reliability, each vulnerability is quantified by three parameters: size, visibility and difficulty to achieve. The study concludes that the automatisation of treatments combined with the dematerialisation of the objects used during an election tends to substitute visible vulnerabilities of a lesser magnitude by invisible and widespread vulnerabilities.

  7. Stochastic back analysis of permeability coefficient using generalized Bayesian method

    Institute of Scientific and Technical Information of China (English)

    Zheng Guilan; Wang Yuan; Wang Fei; Yang Jian

    2008-01-01

    Owing to the fact that the conventional deterministic back analysis of the permeability coefficient cannot reflect the uncertainties of parameters, including the hydraulic head at the boundary, the permeability coefficient and the measured hydraulic head, a stochastic back analysis taking these parameter uncertainties into consideration was performed using the generalized Bayesian method. Based on the stochastic finite element method (SFEM) for a seepage field, the variable metric algorithm and the generalized Bayesian method, formulas for stochastic back analysis of the permeability coefficient were derived. A case study of seepage analysis of a sluice foundation is presented to illustrate the proposed method. The results indicate that, with the generalized Bayesian method, which considers the uncertainties of the measured hydraulic head, the permeability coefficient and the hydraulic head at the boundary, both the mean and the standard deviation of the permeability coefficient can be obtained, and the standard deviation is less than that obtained by the conventional Bayesian method. Therefore, the present method is valid and applicable.
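
    The core effect reported above, a Bayesian update shrinking the standard deviation of the permeability coefficient, can be seen in a one-dimensional conjugate Gaussian sketch. All numbers are invented and the observation model is a deliberate simplification of the paper's SFEM-based formulation:

```python
# Gaussian prior on log10(k) and a Gaussian "measurement" of it (values invented).
prior_mean, prior_var = -5.0, 0.5 ** 2
obs, obs_var = -4.6, 0.3 ** 2

# Conjugate normal-normal update: precisions add, means are precision-weighted.
post_var = 1 / (1 / prior_var + 1 / obs_var)
post_mean = post_var * (prior_mean / prior_var + obs / obs_var)
print(post_mean, post_var ** 0.5)   # posterior std is below both prior and obs std
```

    The posterior mean lands between the prior mean and the observation, and its variance is smaller than either input variance, the same qualitative behavior the abstract reports for the generalized Bayesian back analysis.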

  8. Experimental and analysis methods in radiochemical experiments

    Science.gov (United States)

    Cattadori, C. M.; Pandola, L.

    2016-04-01

    Radiochemical experiments made the history of neutrino physics by achieving the first observation of solar neutrinos (Cl experiment) and the first detection of the fundamental pp solar neutrinos component (Ga experiments). They measured along decades the integral νe charged current interaction rate in the exposed target. The basic operation principle is the chemical separation of the few atoms of the new chemical species produced by the neutrino interactions from the rest of the target, and their individual counting in a low-background counter. The smallness of the expected interaction rate (1 event per day in a ˜ 100 ton target) poses severe experimental challenges on the chemical and on the counting procedures. The main aspects related to the analysis techniques employed in solar neutrino experiments are reviewed and described, with a special focus given to the event selection and the statistical data treatment.

  9. Surface analysis methods in materials science

    CERN Document Server

    Sexton, Brett; Smart, Roger

    1992-01-01

    The idea for this book stemmed from a remark by Philip Jennings of Murdoch University in a discussion session following a regular meeting of the Australian Surface Science group. He observed that a text on surface analysis and applications to materials suitable for final year undergraduate and postgraduate science students was not currently available. Furthermore, the members of the Australian Surface Science group had the research experience and range of coverage of surface analytical techniques and applications to provide a text for this purpose. A list of techniques and applications to be included was agreed at that meeting. The intended readership of the book has been broadened since the early discussions, particularly to encompass industrial users, but there has been no significant alteration in content. The editors, in consultation with the contributors, have agreed that the book should be prepared for four major groups of readers: - senior undergraduate students in chemistry, physics, metallur...

  10. Experimental and analysis methods in radiochemical experiments

    Energy Technology Data Exchange (ETDEWEB)

    Cattadori, C.M. [INFN, Milano (Italy); Pandola, L. [Laboratori Nazionali del Sud, INFN, Catania (Italy); Gran Sasso Science Institute, INFN, L' Aquila (Italy)

    2016-04-15

    Radiochemical experiments made the history of neutrino physics by achieving the first observation of solar neutrinos (Cl experiment) and the first detection of the fundamental pp solar neutrinos component (Ga experiments). They measured along decades the integral νe charged current interaction rate in the exposed target. The basic operation principle is the chemical separation of the few atoms of the new chemical species produced by the neutrino interactions from the rest of the target, and their individual counting in a low-background counter. The smallness of the expected interaction rate (1 event per day in a ∼ 100 ton target) poses severe experimental challenges on the chemical and on the counting procedures. The main aspects related to the analysis techniques employed in solar neutrino experiments are reviewed and described, with a special focus given to the event selection and the statistical data treatment. (orig.)

  11. Comparative Study Among Least Square Method, Steepest Descent Method, and Conjugate Gradient Method for Atmospheric Sounder Data Analysis

    Directory of Open Access Journals (Sweden)

    Kohei Arai

    2013-09-01

    Full Text Available A comparative study among the Least Square Method (LSM), Steepest Descent Method (SDM), and Conjugate Gradient Method (CGM) for atmospheric sounder data analysis (estimation of vertical profiles of water vapor) is conducted. Through simulation studies, it is found that CGM shows the best estimation accuracy, followed by SDM and LSM. The dependency of the methods on atmospheric models is also clarified.
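The three estimators can be contrasted on a toy linear retrieval problem. The sketch below is a generic illustration, not Arai's sounder model: the matrix `A`, profile size, and noise level are all invented stand-ins for the weighting functions and the water-vapor profile.

```python
import numpy as np

rng = np.random.default_rng(0)
# Invented toy retrieval: y = A x + noise.
A = rng.normal(size=(40, 10))
x_true = rng.normal(size=10)
y = A @ x_true + 0.01 * rng.normal(size=40)

# Least Square Method: direct solution.
x_lsm = np.linalg.lstsq(A, y, rcond=None)[0]

# SDM and CGM iterate on the normal equations M x = b with M = A^T A.
M, b = A.T @ A, A.T @ y

def steepest_descent(M, b, iters=500):
    x = np.zeros_like(b)
    for _ in range(iters):
        r = b - M @ x                      # residual = negative gradient
        alpha = (r @ r) / (r @ (M @ r))    # exact line search
        x = x + alpha * r
    return x

def conjugate_gradient(M, b, iters=10):
    x = np.zeros_like(b)
    r = b - M @ x
    p = r.copy()
    for _ in range(iters):
        Mp = M @ p
        alpha = (r @ r) / (p @ Mp)
        x = x + alpha * p
        r_new = r - alpha * Mp
        beta = (r_new @ r_new) / (r @ r)   # Fletcher-Reeves update
        p = r_new + beta * p
        r = r_new
    return x

for name, xe in [("LSM", x_lsm),
                 ("SDM", steepest_descent(M, b)),
                 ("CGM", conjugate_gradient(M, b))]:
    print(name, np.linalg.norm(xe - x_true))
```

On this well-conditioned toy problem all three converge to the noise-limited answer; CGM does so in at most 10 iterations (the dimension of the unknown), which illustrates why it can dominate SDM.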

  12. Molten Salt Breeder Reactor Analysis Methods

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jinsu; Jeong, Yongjin; Lee, Deokjung [Ulsan National Institute of Science and Technology, Ulsan (Korea, Republic of)

    2015-05-15

    Utilizing the uranium-thorium fuel cycle shows considerable potential for the MSR. The concept of the MSBR deserves to be revisited because of the molten salt reactor's advantages, such as outstanding neutron economy, the possibility of continuous online reprocessing and refueling, a high level of inherent safety, and the economic benefit of avoiding the fuel fabrication process. For the development of MSR research, this paper provides MSBR single-cell, two-cell and whole-core models for computer code input, and several calculation results including depletion calculations for each model. The calculations are carried out using MCNP6, a Monte Carlo computer code, which includes CINDER90 for depletion calculation using ENDF/B-VII nuclear data. From the calculation results for various reactor design parameters, the temperature coefficients are all negative at the initial state, and the MTC becomes positive at the equilibrium state. From the results of control rod worth, the graphite control rods alone cannot make the core subcritical at the initial state; at the equilibrium state, however, the core can be made subcritical by graphite control rods alone. Through comparison of the results of the models, the two-cell method can represent the MSBR core model more accurately with only slightly more computational resources than the single-cell method. Many of the thermal-spectrum MSRs have adopted a multi-region single-fluid strategy.

  13. Chemical bioimaging for the subcellular localization of trace elements by high contrast TEM, TEM/X-EDS, and NanoSIMS.

    Science.gov (United States)

    Penen, Florent; Malherbe, Julien; Isaure, Marie-Pierre; Dobritzsch, Dirk; Bertalan, Ivo; Gontier, Etienne; Le Coustumer, Philippe; Schaumlöffel, Dirk

    2016-09-01

    Chemical bioimaging offers an important contribution to the investigation of biochemical functions, biosorption and bioaccumulation processes of trace elements via their localization at the cellular and even at the subcellular level. This paper describes the combined use of high contrast transmission electron microscopy (HC-TEM), energy dispersive X-ray spectroscopy (X-EDS), and nano secondary ion mass spectrometry (NanoSIMS) applied to a model organism, the unicellular green alga Chlamydomonas reinhardtii. HC-TEM, providing a lateral resolution of 1 nm, was used for imaging the ultrastructure of algal cells, which have diameters of 5-10 μm. TEM coupled to X-EDS (TEM/X-EDS) combined textural (morphology and size) analysis with detection of Ca, P, K, Mg, Fe, and Zn in selected subcellular granules using an X-EDS probe size of approx. 1 μm. However, instrumental sensitivity was at the limit for trace element detection. NanoSIMS allowed chemical imaging of macro and trace elements with subcellular resolution (element mapping). Ca, Mg, and P as well as the trace elements Fe, Cu, and Zn present at basal levels were detected in pyrenoids, contractile vacuoles, and granules. Some metals were even localized in small vesicles of about 200 nm size. Sensitive subcellular localization of trace metals was made possible by the application of a recently developed RF plasma oxygen primary ion source on NanoSIMS, which showed good improvements in terms of lateral resolution (below 50 nm), sensitivity, and stability. Furthermore, correlative single cell imaging was developed, combining the advantages of TEM and NanoSIMS. An advanced sample preparation protocol provided adjacent ultramicrotome sections for parallel TEM and NanoSIMS analyses of the same cell. Thus, the C. reinhardtii cellular ultrastructure could be directly related to the spatial distribution of metals in different cell organelles such as vacuoles and the chloroplast. PMID:27288221

  14. CHAPTER 7. BERYLLIUM ANALYSIS BY NON-PLASMA BASED METHODS

    Energy Technology Data Exchange (ETDEWEB)

    Ekechukwu, A

    2009-04-20

    The most common method of analysis for beryllium is inductively coupled plasma atomic emission spectrometry (ICP-AES). This method, along with inductively coupled plasma mass spectrometry (ICP-MS), is discussed in Chapter 6. However, other methods exist and have been used for different applications. These methods include spectroscopic, chromatographic, colorimetric, and electrochemical. This chapter provides an overview of beryllium analysis methods other than plasma spectrometry (inductively coupled plasma atomic emission spectrometry or mass spectrometry). The basic methods, detection limits and interferences are described. Specific applications from the literature are also presented.

  15. Simultaneous realization of Hg2+ sensing, magnetic resonance imaging and upconversion luminescence in vitro and in vivo bioimaging based on hollow mesoporous silica coated UCNPs and ruthenium complex

    Science.gov (United States)

    Ge, Xiaoqian; Sun, Lining; Ma, Binbin; Jin, Di; Dong, Liang; Shi, Liyi; Li, Nan; Chen, Haige; Huang, Wei

    2015-08-01

    We have constructed a multifunctional nanoprobe with sensing and imaging properties by using hollow mesoporous silica coated upconversion nanoparticles (UCNPs) and a Hg2+ responsive ruthenium (Ru) complex. The Ru complex was loaded into the hollow mesoporous silica and the UCNPs acted as an energy donor, transferring luminescence energy to the Ru complex. Furthermore, polyethylenimine (PEI) was assembled on the surface of the mesoporous silica to achieve better hydrophilicity and bio-compatibility. Upon addition of Hg2+, a blue shift of the absorption peak of the Ru complex is observed and the energy transfer process between the UCNPs and the Ru complex is blocked, resulting in an increase of the green emission intensity of the UCNPs. The unchanged 801 nm emission of the nanoprobe was used as an internal standard reference, and the detection limit of Hg2+ was determined to be 0.16 μM for this nanoprobe in aqueous solution. In addition, based on the low cytotoxicity as studied by CCK-8 assay, the nanoprobe was successfully applied for cell imaging and small animal imaging. Furthermore, when doped with Gd3+ ions, the nanoprobe was successfully applied to in vivo magnetic resonance imaging (MRI) of Kunming mice, which demonstrates its potential as a positive-contrast MRI agent. Therefore, the method and results may provide more exciting opportunities to afford nanoprobes with multimodal bioimaging and multifunctional applications.

  16. Modified Homotopy Analysis Method for Zakharov-Kuznetsov Equations

    Directory of Open Access Journals (Sweden)

    Muhammad USMAN

    2013-01-01

    Full Text Available In this paper, we apply Modified Homotopy Analysis Method (MHAM to find appropriate solutions of Zakharov-Kuznetsov equations which are of utmost importance in applied and engineering sciences. The proposed modification is the elegant coupling of Homotopy Analysis Method (HAM and Taylor’s series. Numerical results coupled with graphical representation explicitly reveal the complete reliability of the proposed algorithm.

  17. An improved evaluation method for fault tree kinetic analysis

    International Nuclear Information System (INIS)

    By means of the exclusive sum of products of a fault tree, the improved method uses the basic event parameters directly in the synthetic evaluation and makes fault tree kinetic analysis simpler. This paper also provides a reasonable evaluation method for the kinetic analysis of basic events which have parameters of the synthetic distribution
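The evaluation over a sum of products can be illustrated with inclusion-exclusion over minimal cut sets. This is a generic sketch, not the paper's improved kinetic method; the two-cut-set tree and the basic-event probabilities are invented.

```python
from itertools import combinations

# Hypothetical fault tree: top event T = (A and B) or (A and C),
# with invented basic-event probabilities.
p = {"A": 0.1, "B": 0.2, "C": 0.3}
cut_sets = [{"A", "B"}, {"A", "C"}]

def prob_union(cut_sets, p):
    """Exact top-event probability via inclusion-exclusion over minimal cut sets,
    assuming independent basic events."""
    total = 0.0
    for k in range(1, len(cut_sets) + 1):
        for combo in combinations(cut_sets, k):
            events = set().union(*combo)   # union of the chosen cut sets
            term = 1.0
            for e in events:
                term *= p[e]
            total += (-1) ** (k + 1) * term
    return total

print(prob_union(cut_sets, p))  # 0.1*0.2 + 0.1*0.3 - 0.1*0.2*0.3 = 0.044
```

Rewriting the structure function as an exclusive (disjoint) sum of products avoids the combinatorial blow-up of inclusion-exclusion on larger trees, which is the kind of simplification the abstract refers to.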

  18. A Comparison of Card-sorting Analysis Methods

    DEFF Research Database (Denmark)

    Nawaz, Ather

    2012-01-01

    This study investigates how the choice of analysis method for card sorting studies affects the suggested information structure for websites. In the card sorting technique, a variety of methods are used to analyse the resulting data. The analysis of card sorting data helps user experience (UX...

  19. Analysis of mesoscale forecasts using ensemble methods

    CERN Document Server

    Gross, Markus

    2016-01-01

    Mesoscale forecasts are now routinely performed as elements of operational forecasts and their outputs appear convincing. However, despite their realistic appearance, the comparison to observations is at times less favorable. At the grid scale these forecasts often do not compare well with observations. This is partly due to the chaotic system underlying the weather. Another key problem is that it is impossible to evaluate the risk of making decisions based on these forecasts because they do not provide a measure of confidence. Ensembles provide this information in the ensemble spread and quartiles. However, running global ensembles at the meso- or sub-mesoscale involves substantial computational resources. National centers do run such ensembles, but the subject of this publication is a method which requires significantly less computation. The ensemble-enhanced mesoscale system presented here does not aim at the creation of an improved mesoscale forecast model. Nor is it intended to create an improved ensemble syste...

  20. Analysis of speech waveform quantization methods

    Directory of Open Access Journals (Sweden)

    Tadić Predrag R.

    2008-01-01

    Full Text Available Digitalization, consisting of sampling and quantization, is the first step in any digital signal processing algorithm. In most cases, the quantization is uniform. However, having knowledge of certain stochastic attributes of the signal (namely, the probability density function, or pdf), quantization can be made more efficient, in the sense of achieving a greater signal to quantization noise ratio. This means that narrower channel bandwidths are required for transmitting a signal of the same quality. Alternatively, if signal storage is of interest, rather than transmission, considerable savings in memory space can be made. This paper presents several available methods for speech signal pdf estimation, and quantizer optimization in the sense of minimizing the quantization error power.
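The gain from pdf-matched quantization can be demonstrated with a classic μ-law compander against a uniform quantizer. The sketch below is an invented illustration, not the paper's optimization: the Laplacian "speech" signal, its scale, and the bit depth are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
# Speech-like amplitudes are roughly Laplacian; this synthetic signal is an
# invented stand-in for a real recording, scaled well below full range.
x = np.clip(rng.laplace(scale=0.05, size=100_000), -1, 1)

def uniform_quantize(x, bits):
    step = 2.0 / (2 ** bits)
    return np.clip(np.round(x / step) * step, -1, 1)

def mulaw_quantize(x, bits, mu=255.0):
    # Compand, quantize uniformly, expand: the mu-law idea matches the
    # small-amplitude-heavy pdf far better than uniform steps.
    y = np.sign(x) * np.log1p(mu * np.abs(x)) / np.log1p(mu)
    yq = uniform_quantize(y, bits)
    return np.sign(yq) * np.expm1(np.abs(yq) * np.log1p(mu)) / mu

def snr_db(x, xq):
    return 10 * np.log10(np.sum(x ** 2) / np.sum((x - xq) ** 2))

print("uniform:", snr_db(x, uniform_quantize(x, 8)))
print("mu-law :", snr_db(x, mulaw_quantize(x, 8)))
```

Because most samples are small, the companded quantizer spends its levels where the pdf has mass and achieves a noticeably higher SNR at the same bit rate, which is exactly the efficiency argument in the abstract.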

  1. Analysis of the Wing Tsun Punching Methods

    Directory of Open Access Journals (Sweden)

    Jeff Webb

    2012-07-01

    Full Text Available The three punching techniques of Wing Tsun, while few in number, represent an effective approach to striking with the closed fist. At first glance, the rather short stroke of each punch would seem disproportionate to the amount of power it generates. Therefore, this article will discuss the structure and body mechanics of each punch, in addition to the various training methods employed for developing power. Two of the Wing Tsun punches, namely the lifting punch and the hooking punch, are often confused with similar punches found in Western boxing. The key differences between the Wing Tsun and boxing punches, both in form and function, will be discussed. Finally, the strategy for applying the Wing Tsun punches will serve as the greatest factor in differentiating them from the punches of other martial arts styles.

  2. New analysis method for passive microrheology

    Science.gov (United States)

    Nishi, Kengo; Schmidt, Christoph; Mackintosh, Fred

    Passive microrheology is an experimental technique used to measure the mechanical response of materials from the fluctuations of micron-sized beads embedded in the medium. Microrheology is well suited to study rheological properties of materials that are difficult to obtain in larger amounts and also of materials inside of single cells. In one common approach, one uses the fluctuation-dissipation theorem to obtain the imaginary part of the material response function from the power spectral density of bead displacement fluctuations, while the real part of the response function is calculated using a Kramers-Kronig integral. The high-frequency cut-off of this integral strongly affects the real part of the response function in the high frequency region. Here, we discuss how to obtain more accurate values of the real part of the response function by an alternative method using autocorrelation functions.
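One route from bead fluctuations to correlation functions is the Wiener-Khinchin theorem: the autocorrelation is the inverse Fourier transform of the power spectral density. The sketch below is a generic illustration of that estimator, not the authors' alternative method; the AR(1) trajectory is an invented stand-in for bead displacement data.

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic "bead trajectory": an AR(1) series with known lag-1 correlation phi.
n, phi = 4096, 0.95
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()

# Wiener-Khinchin: autocorrelation = inverse FFT of the power spectral density.
X = np.fft.fft(x - x.mean(), n=2 * n)   # zero-pad to avoid circular wrap-around
psd = np.abs(X) ** 2
acf = np.fft.ifft(psd).real[:n] / np.arange(n, 0, -1)  # unbiased normalization
acf /= acf[0]

print(acf[1])  # close to phi for an AR(1) process
```

The same FFT-based machinery gives the PSD used with the fluctuation-dissipation theorem; the autocorrelation route simply works in the time domain instead, avoiding the Kramers-Kronig cut-off issue the abstract describes.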

  3. Structural Analysis Using Computer Based Methods

    Science.gov (United States)

    Dietz, Matthew R.

    2013-01-01

    The stiffness of a flex hose that will be used in the umbilical arms of the Space Launch System's mobile launcher needed to be determined in order to properly qualify ground umbilical plate behavior during vehicle separation post T-0. This data is also necessary to properly size and design the motors used to retract the umbilical arms. Therefore an experiment was created to determine the stiffness of the hose. Before the test apparatus for the experiment could be built, the structure had to be analyzed to ensure it would not fail under given loading conditions. The design model was imported into the analysis software and optimized to decrease runtime while still providing accurate results and allowing for seamless meshing. Areas exceeding the allowable stresses in the structure were located and modified before submitting the design for fabrication. In addition, a mock-up of a deep space habitat and its support frame was designed and needed to be analyzed for structural integrity under different loading conditions. The load cases were provided by the customer and were applied to the structure after optimizing the geometry. Once again, weak points in the structure were located, recommended design changes were made to the customer, and the process was repeated until the load conditions were met without exceeding the allowable stresses. After the stresses met the required factors of safety, the designs were released for fabrication.

  4. Schedulability Analysis Method of Timing Constraint Petri Nets

    Institute of Scientific and Technical Information of China (English)

    李慧芳; 范玉顺

    2002-01-01

    Timing constraint Petri nets (TCPNs) can be used to model a real-time system specification and to verify the timing behavior of the system. This paper describes the limitations of the reachability analysis method in analyzing complex systems for existing TCPNs. Based on further research on the schedulability analysis method with various topology structures, a more general state reachability analysis method is proposed. To meet various requirements of timely response for actual systems, this paper puts forward a heuristic method for selecting decision-spans of transitions and develops a heuristic algorithm for schedulability analysis of TCPNs. Examples are given showing the practicality of the method in the schedulability analysis for real-time systems with various structures.

  5. Method and procedure of fatigue analysis for nuclear equipment

    International Nuclear Information System (INIS)

    As an example, the fatigue analysis for the upper head of the pressurizer in one NPP was carried out by using ANSYS, a finite element method analysis software. According to RCC-M code, only two kinds of typical transients of temperature and pressure were considered in the fatigue analysis. Meanwhile, the influence of earthquake was taken into account. The method and procedure of fatigue analysis for nuclear safety equipment were described in detail. This paper provides a reference for fatigue analysis and assessment of nuclear safety grade equipment and pipe. (authors)

  6. Printing metal-spiked inks for LA-ICP-MS bioimaging internal standardization: comparison of the different nephrotoxic behavior of cisplatin, carboplatin, and oxaliplatin.

    Science.gov (United States)

    Moraleja, Irene; Esteban-Fernández, Diego; Lázaro, Alberto; Humanes, Blanca; Neumann, Boris; Tejedor, Alberto; Luz Mena, M; Jakubowski, Norbert; Gómez-Gómez, M Milagros

    2016-03-01

    The study of the distribution of the cytostatic drugs cisplatin, carboplatin, and oxaliplatin along the kidney may help to understand their different nephrotoxic behavior. Laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) allows the acquisition of trace element images in biological tissues. However, the results obtained are affected by variations in the sample matrix and by instrumental drift. In this work, an internal standardization method based on printing an Ir-spiked ink onto the surface of the sample has been developed to evaluate the different distributions and accumulation levels of the aforementioned drugs along the kidney of a rat model. A conventional ink-jet printer was used to print 4 μm fresh sagittal kidney tissue slices. A reproducible and homogeneous deposition of the ink along the tissue was observed. The ink was partially absorbed on top of the tissue. Thus, this approach provides a pseudo-internal standardization, because the ablation of the sample and of the internal standard takes place sequentially and not simultaneously. A satisfactory normalization of LA-ICP-MS bioimages, and therefore a reliable comparison of kidneys treated with different Pt-based drugs, was achieved even for tissues analyzed on different days. Due to the complete ablation of the sample, the transport of the ablated internal standard and tissue to the inductively coupled plasma mass spectrometer (ICP-MS) takes place practically at the same time. Pt accumulation in the kidney was observed in accordance with the dosages administered for each drug. Although the accumulation rates of cisplatin and oxaliplatin are both high, their Pt distributions differ. The strong nephrotoxicity observed for cisplatin and the absence of such a side effect in the case of oxaliplatin could explain these distribution differences.
The homogeneous distribution of oxaliplatin in the cortical and medullar areas could be related with its higher affinity for

  7. Complex system analysis using CI methods

    Science.gov (United States)

    Fathi, Madjid; Hildebrand, Lars

    1999-03-01

    Modern technical tasks often need the use of complex system models. In many complex cases the model parameters can be gained using neural networks, but these systems allow only a one-way simulation from the input values to the learned output values. If evaluation in the other direction is needed, these models allow no direct evaluation. This task can be solved using evolutionary algorithms, which are part of computational intelligence. The term computational intelligence covers three fields of artificial intelligence: fuzzy logic, artificial neural networks, and evolutionary algorithms. We will focus only on evolutionary algorithms and fuzzy logic. Evolutionary algorithms cover the fields of genetic algorithms, evolution strategies and evolutionary programming. These methods can be used to optimize technical problems. Evolutionary algorithms have certain advantages if these problems have no convenient mathematical properties, such as continuity or the possibility to obtain derivatives. Fuzzy logic systems normally lack these properties. The use of a combination of evolutionary algorithms and fuzzy logic allows an evaluation of the learned simulation models in the direction from output to input values. An example can be given from the field of screw rotor design.
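The output-to-input inversion described above can be sketched with a minimal evolution strategy that searches for an input whose model output matches a target, using only forward evaluations and no derivatives. Everything here is invented for illustration: the "forward model", the target value, and all strategy parameters.

```python
import numpy as np

rng = np.random.default_rng(7)

# Stand-in "forward model": imagine a trained network or fuzzy system
# mapping an input to an output (this toy function is invented).
def forward(x):
    return np.sin(3 * x) + 0.5 * x

target = 0.7  # desired output; we search for an input that produces it

# A simple (mu+lambda)-style evolution strategy with elitism and an
# annealed mutation step; it needs no gradient information.
pop = rng.uniform(-2.0, 2.0, size=50)
sigma = 0.3
for gen in range(100):
    fitness = np.abs(forward(pop) - target)
    parents = pop[np.argsort(fitness)[:10]]                   # keep 10 best
    mutants = np.repeat(parents, 4) + sigma * rng.normal(size=40)
    pop = np.concatenate([parents, mutants])                  # elitism
    sigma *= 0.95                                             # anneal step size
best = pop[np.argmin(np.abs(forward(pop) - target))]
print(forward(best))  # close to the 0.7 target
```

Since only fitness comparisons are used, the same loop would work unchanged if `forward` were a non-differentiable fuzzy system, which is exactly the advantage the abstract claims for evolutionary algorithms.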

  8. RESULTS OF THE QUESTIONNAIRE: ANALYSIS METHODS

    CERN Multimedia

    Staff Association

    2014-01-01

    Five-yearly review of employment conditions   Article S V 1.02 of our Staff Rules states that the CERN “Council shall periodically review and determine the financial and social conditions of the members of the personnel. These periodic reviews shall consist of a five-yearly general review of financial and social conditions;” […] “following methods […] specified in § I of Annex A 1”. Then, turning to the relevant part in Annex A 1, we read that “The purpose of the five-yearly review is to ensure that the financial and social conditions offered by the Organization allow it to recruit and retain the staff members required for the execution of its mission from all its Member States. […] these staff members must be of the highest competence and integrity.” And for the menu of such a review we have: “The five-yearly review must include basic salaries and may include any other financial or soc...

  9. Complementarity of Traffic Flow Intersecting Method with Intersection Capacity Analysis

    OpenAIRE

    Lanović, Zdenko

    2009-01-01

    The paper studies the complementarity of the methods from the field of traffic flow theory: methods of traffic flow intersecting intensity and the method for the at-grade intersection capacity analysis. Apart from checking mutual implications of these methods, the proportionality of mutual influences is assessed. Harmonized application of these methods acts efficiently on the entire traffic network, and not only on the intersections that are usually incorrectly represented as the only network...

  10. Method for detecting software anomalies based on recurrence plot analysis

    Directory of Open Access Journals (Sweden)

    Michał Mosdorf

    2012-03-01

    Full Text Available The presented paper evaluates a method for detecting software anomalies based on recurrence plot analysis of the trace log generated by software execution. The described method is based on windowed recurrence quantification analysis for selected measures (e.g. recurrence rate (RR) or determinism (DET)). Initial results show that the proposed method is useful in detecting silent software anomalies that do not result in typical crashes (e.g. exceptions).
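The RR measure named above is simply the fraction of point pairs in a series that fall within a distance threshold of each other. The sketch below is a simplified one-dimensional version (no embedding, invented threshold and signals), not the paper's windowed pipeline.

```python
import numpy as np

rng = np.random.default_rng(4)

def recurrence_rate(series, eps):
    """Fraction of point pairs closer than eps: the RQA 'RR' measure,
    computed here without time-delay embedding for simplicity."""
    d = np.abs(series[:, None] - series[None, :])   # pairwise distance matrix
    return (d < eps).mean()

# A periodic trace recurs far more often than white noise at the same threshold.
t = np.linspace(0, 8 * np.pi, 400)
rr_periodic = recurrence_rate(np.sin(t), eps=0.1)
rr_noise = recurrence_rate(rng.normal(size=400), eps=0.1)
print(rr_periodic > rr_noise)
```

Sliding such a computation over windows of a trace log and watching RR (or DET, which counts diagonal line structures) drift away from its baseline is the essence of the windowed-RQA anomaly detection idea.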

  11. Reliability analysis method applied in slope stability: slope prediction and forecast on stability analysis

    Institute of Scientific and Technical Information of China (English)

    Wenjuan ZHANG; Li CHEN; Ning QU; Hai'an LIANG

    2006-01-01

    Landslides are one kind of geologic hazard that happens all over the world. They bring huge losses to human life and property; therefore, it is very important to research them. This study focused on the combination of single and regional landslide analysis, and of the traditional slope stability analysis method with the reliability analysis method. Meanwhile, methods for slope prediction and reliability analysis were discussed.

  12. Fluorescent MoS2 Quantum Dots: Ultrasonic Preparation, Up-Conversion and Down-Conversion Bioimaging, and Photodynamic Therapy.

    Science.gov (United States)

    Dong, Haifeng; Tang, Songsong; Hao, Yansong; Yu, Haizhu; Dai, Wenhao; Zhao, Guifeng; Cao, Yu; Lu, Huiting; Zhang, Xueji; Ju, Huangxian

    2016-02-10

    Small-size molybdenum disulfide (MoS2) quantum dots (QDs) with desired optical properties were controllably synthesized by tetrabutylammonium-assisted ultrasonication of multilayered MoS2 powder via an OH-mediated chain-like Mo-S bond cleavage mode. This tunable top-down approach to the precise fabrication of MoS2 QDs finally enables detailed experimental investigation of their optical properties. The synthesized MoS2 QDs present good down-conversion photoluminescence behavior and exhibit remarkable up-conversion photoluminescence for bioimaging. The mechanism of the emerging photoluminescence was investigated. Furthermore, a (1)O2 production ability of MoS2 QDs superior to that of the commercial photosensitizer PpIX was demonstrated, which has great potential application for photodynamic therapy. These early results on the tunable synthesis of MoS2 QDs with desired photoproperties can lead to applications in the biomedical and optoelectronics fields.

  14. Stability and Accuracy Analysis for Taylor Series Numerical Method

    Institute of Scientific and Technical Information of China (English)

    赵丽滨; 姚振汉; 王寿梅

    2004-01-01

    The Taylor series numerical method (TSNM) is a time integration method for solving problems in structural dynamics. In this paper, a detailed analysis of the stability behavior and accuracy characteristics of this method is given. It is proven by a spectral decomposition method that TSNM is conditionally stable and belongs to the category of explicit time integration methods. By a similar analysis, the characteristic indicators of time integration methods, the percentage period elongation and the amplitude decay of TSNM, are derived in a closed form. The analysis plays an important role in implementing a procedure for automatic searching and finding convergence radii of TSNM. Finally, a linear single degree of freedom undamped system is analyzed to test the properties of the method.
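Conditional stability of explicit time integration is easy to demonstrate numerically. The sketch below uses the standard explicit central-difference scheme on an undamped single-degree-of-freedom oscillator, as a generic stand-in rather than TSNM itself; the frequency, time steps, and step counts are invented.

```python
import numpy as np

# Undamped SDOF oscillator x'' + w^2 x = 0 integrated with the explicit
# central-difference scheme: x_{n+1} = 2 x_n - x_{n-1} - (w*dt)^2 x_n.
# Stable only when dt < 2/w (conditional stability, as for TSNM).
def central_difference(w, dt, steps):
    x_prev = 1.0                                # x(0) = 1
    x = 1.0 - 0.5 * (w * dt) ** 2               # Taylor start-up, v(0) = 0
    amp = abs(x)
    for _ in range(steps):
        x_next = 2 * x - x_prev - (w * dt) ** 2 * x
        x_prev, x = x, x_next
        amp = max(amp, abs(x))
    return amp

w = 2 * np.pi                                    # 1 Hz oscillator
stable = central_difference(w, dt=0.01, steps=2000)   # dt < 2/w: bounded
unstable = central_difference(w, dt=0.5, steps=200)   # dt > 2/w: blows up
print(stable, unstable)
```

Inside the stability limit the amplitude stays near 1 indefinitely; outside it the response grows geometrically within a few steps, which is the behavior a spectral-decomposition stability analysis predicts for any conditionally stable explicit method.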

  15. Stochastic Plane Stress Analysis with Elementary Stiffness Matrix Decomposition Method

    Science.gov (United States)

    Er, G. K.; Wang, M. C.; Iu, V. P.; Kou, K. P.

    2010-05-01

    In this study, the efficient analytical method named elementary stiffness matrix decomposition (ESMD) is further investigated and utilized for the moment evaluation of stochastic plane stress problems, in comparison with the conventional perturbation method in stochastic finite element analysis. In order to evaluate the performance of this method, computer programs were written and numerical results for stochastic plane stress problems were obtained. The numerical analysis shows that the computational efficiency is much increased and the computer memory requirement can be much reduced by using the ESMD method.

  16. Research on the Analysis Method of Micro Concentration of Uranium

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    A spectrophotometric method is used for the analysis of micro concentrations of uranium in aqueous and organic phases in order to test the feasibility of the TBP/OK-dimethylbenzene-TTA method for assaying the organic phase and the concentrated hydrochloric acid-arsenazo Ⅲ method for assaying the aqueous phase. It is

  17. NONLINEAR DATA RECONCILIATION METHOD BASED ON KERNEL PRINCIPAL COMPONENT ANALYSIS

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    In the industrial process situation, principal component analysis (PCA) is a general method in data reconciliation. However, PCA is sometimes unfeasible for nonlinear feature analysis and limited in application to nonlinear industrial processes. Kernel PCA (KPCA) is an extension of PCA and can be used for nonlinear feature analysis. A nonlinear data reconciliation method based on KPCA is proposed. The basic idea of this method is that firstly the original data are mapped to a high-dimensional feature space by a nonlinear function, and PCA is implemented in the feature space. Then nonlinear feature analysis is implemented and the data are reconstructed by using the kernel. The data reconciliation method based on KPCA is applied to a ternary distillation column. Simulation results show that this method can filter the noise in measurements of a nonlinear process and that the reconciled data can represent the true information of the nonlinear process.
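The "map to feature space, then do PCA" pipeline can be sketched directly: build the kernel matrix, center it in feature space, and take its leading eigenvectors. The toy data below (two noisy concentric rings) are an invented stand-in for nonlinear process measurements, not the paper's distillation-column data, and the RBF kernel width is an assumption.

```python
import numpy as np

rng = np.random.default_rng(5)
# Invented nonlinear structure: two noisy concentric rings of 100 points each.
theta = rng.uniform(0, 2 * np.pi, 200)
r = np.repeat([1.0, 3.0], 100) + 0.05 * rng.normal(size=200)
X = np.column_stack([r * np.cos(theta), r * np.sin(theta)])

# RBF kernel matrix, then centering in feature space: Kc = J K J.
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq / 2.0)
n = len(X)
J = np.eye(n) - np.ones((n, n)) / n
Kc = J @ K @ J

# Kernel principal components are eigenvectors of the centered kernel matrix.
eigvals, eigvecs = np.linalg.eigh(Kc)
pc1 = eigvecs[:, -1]   # scores of each sample on the leading kernel PC

# The kernel "sees" the nonlinear grouping: same-ring points are far more
# similar than points on different rings, which linear PCA cannot express.
within = 0.5 * (K[:100, :100].mean() + K[100:, 100:].mean())
between = K[:100, 100:].mean()
print(within > between)
```

Reconciliation then amounts to projecting noisy samples onto the leading kernel components and reconstructing, so that measurement noise orthogonal to the learned nonlinear structure is filtered out.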

  18. One-step synthesis of amino-functionalized ultrasmall near infrared-emitting persistent luminescent nanoparticles for in vitro and in vivo bioimaging

    Science.gov (United States)

    Shi, Junpeng; Sun, Xia; Zhu, Jianfei; Li, Jinlei; Zhang, Hongwu

    2016-05-01

    Near infrared (NIR)-emitting persistent luminescent nanoparticles (NPLNPs) have attracted much attention in bioimaging because they can provide long-term in vivo imaging with a high signal-to-noise ratio (SNR). However, conventional NPLNPs with large particle sizes that lack modifiable surface groups suffer from many serious limitations in bioimaging. Herein, we report a one-step synthesis of amino-functionalized ZnGa2O4:Cr,Eu nanoparticles (ZGO) that have an ultrasmall size, where ethylenediamine served as the reactant to fabricate the ZGO as well as the surfactant ligand to control the nanocrystal size and form surface amino groups. The ZGO exhibited a narrow particle size distribution, a bright NIR emission and a long afterglow luminescence. In addition, due to the excellent conjugation ability of the surface amino groups, the ZGO can be easily conjugated with many bio-functional molecules, which has been successfully utilized to realize in vitro and in vivo imaging. More importantly, the ZGO achieved re-excitation imaging using 650 nm and 808 nm NIR light in situ, which is advantageous for long-term and higher SNR bioimaging.

  19. Geothermal water and gas: collected methods for sampling and analysis. Comment issue. [Compilation of methods

    Energy Technology Data Exchange (ETDEWEB)

    Douglas, J.G.; Serne, R.J.; Shannon, D.W.; Woodruff, E.M.

    1976-08-01

    A collection of methods for sampling and analysis of geothermal fluids and gases is presented. Compilations of analytic options for constituents in water and gases are given. Also, a survey of published methods of laboratory water analysis is included. It is stated that no recommendation of the applicability of the methods to geothermal brines should be assumed since the intent of the table is to encourage and solicit comments and discussion leading to recommended analytical procedures for geothermal waters and research. (WHK)

  20. Inverse thermal analysis method to study solidification in cast iron

    DEFF Research Database (Denmark)

    Dioszegi, Atilla; Hattel, Jesper

    2004-01-01

    Solidification modelling of cast metals is widely used to predict final properties in cast components. Accurate models necessitate good knowledge of the solidification behaviour. The present study includes a re-examination of the Fourier thermal analysis method. This involves an inverse numerical...... solution of a 1-dimensional heat transfer problem connected to solidification of cast alloys. In the analysis, the relation between the thermal state and the fraction solid of the metal is evaluated by a numerical method. This method contains an iteration algorithm controlled by an under relaxation term...... inverse thermal analysis was tested on both experimental and simulated data....

  1. Some selected quantitative methods of thermal image analysis in Matlab.

    Science.gov (United States)

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images, and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for the skin of a human foot and of a face. The full source code of the developed application is provided as an attachment, together with the main window of the program during dynamic analysis of a foot thermal image.

  2. Advanced symbolic analysis for VLSI systems methods and applications

    CERN Document Server

    Shi, Guoyong; Tlelo Cuautle, Esteban

    2014-01-01

    This book provides comprehensive coverage of the recent advances in symbolic analysis techniques for design automation of nanometer VLSI systems. The presentation is organized in parts of fundamentals, basic implementation methods and applications for VLSI design. Topics emphasized include statistical timing and crosstalk analysis, statistical and parallel analysis, performance bound analysis and behavioral modeling for analog integrated circuits. Among the recent advances, the Binary Decision Diagram (BDD) based approaches are studied in depth. The BDD-based hierarchical symbolic analysis approaches have essentially broken the analog circuit size barrier. In particular, this book: • Provides an overview of classical symbolic analysis methods and a comprehensive presentation on the modern BDD-based symbolic analysis techniques; • Describes detailed implementation strategies for BDD-based algorithms, including the principles of zero-suppression, variable ordering and canonical reduction; • Int...

  3. Digital spectral analysis parametric, non-parametric and advanced methods

    CERN Document Server

    Castanié, Francis

    2013-01-01

    Digital Spectral Analysis provides a single source that offers complete coverage of the spectral analysis domain. This self-contained work includes details on advanced topics that are usually presented in scattered sources throughout the literature. The theoretical principles necessary for the understanding of spectral analysis are discussed in the first four chapters: fundamentals, digital signal processing, estimation in spectral analysis, and time-series models. An entire chapter is devoted to the non-parametric methods most widely used in industry. High resolution methods a

  4. Comparison of extraction methods for analysis of flavonoids in onions

    OpenAIRE

    Soeltoft, Malene; Knuthsen, Pia; Nielsen, John

    2008-01-01

    Onions are known to contain high levels of flavonoids, and a comparison of the efficiency, reproducibility and detection limits of various extraction methods has been made in order to develop fast and reliable analytical methods for analysis of flavonoids in onions. Conventional and classical methods are time- and solvent-consuming, and the presence of light and oxygen during sample preparation facilitates degradation reactions. Thus, classical methods were compared with microwave (irradiatio...

  5. An operational modal analysis method in frequency and spatial domain

    Institute of Scientific and Technical Information of China (English)

    Wang Tong; Zhang Lingmi; Tamura Yukio

    2005-01-01

    A frequency and spatial domain decomposition method (FSDD) for operational modal analysis (OMA) is presented in this paper, which is an extension of the complex mode indicator function (CMIF) method for experimental modal analysis (EMA). The theoretical background of the FSDD method is clarified. Singular value decomposition is adopted to separate the signal space from the noise space. Finally, an enhanced power spectrum density (PSD) is proposed to obtain more accurate modal parameters by curve fitting in the frequency domain. Moreover, a simulation case and an application case are used to validate this method.
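
The singular-value step at the heart of CMIF-style methods can be sketched in a few lines: at each frequency line, take the SVD of the output spectral density matrix and track the first singular value, whose peaks indicate modes. The 2-DOF frequencies, damping and mode shapes below are invented for illustration, not taken from the paper:

```python
import numpy as np

# hypothetical 2-DOF system: natural frequencies (Hz), damping, mode shapes
fn = np.array([2.0, 5.0])
zeta = 0.02
phi = np.array([[1.0, 1.0],
                [1.0, -1.0]])

freqs = np.linspace(0.5, 8.0, 400)
cmif = np.empty(freqs.size)
for i, f in enumerate(freqs):
    # modal superposition of the frequency response matrix H(f)
    H = sum(np.outer(phi[:, r], phi[:, r])
            / (fn[r]**2 - f**2 + 2j * zeta * fn[r] * f) for r in range(2))
    G = H @ H.conj().T                    # output spectral density, white input
    cmif[i] = np.linalg.svd(G, compute_uv=False)[0]  # first singular value

peak = freqs[np.argmax(cmif)]             # dominant peak near the 2 Hz mode
```

In the full FSDD procedure the singular vectors at each peak approximate the mode shapes, and curve fitting of the enhanced PSD refines frequency and damping.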

  6. Reliability analysis of reactor systems by applying probability method

    International Nuclear Information System (INIS)

    The probability method chosen for analysing reactor system reliability is considered realistic since it is based on verified experimental data; in essence it is a statistical method. The probability method developed takes into account the probability distribution of permitted levels of relevant parameters and their particular influence on the reliability of the system as a whole. The proposed method is rather general, and was used for the problem of thermal safety analysis of a reactor system. This analysis enables basic properties of the system to be analyzed under different operation conditions; expressed in the form of probabilities, the results show the reliability of the system as a whole as well as the reliability of each component.

  7. A METHOD FOR EXERGY ANALYSIS OF SUGARCANE BAGASSE BOILERS

    Directory of Open Access Journals (Sweden)

    CORTEZ L.A.B.

    1998-01-01

    Full Text Available This work presents a method to conduct a thermodynamic analysis of sugarcane bagasse boilers. The method is based on the standard and actual reactions, which allows the calculation of the enthalpies of each process subequation and the exergies of each of the main flowrates participating in the combustion. The method is presented using an example with real data from a sugarcane bagasse boiler. A summary of the results obtained is also presented, based on the 1st Law of Thermodynamics analysis, the exergetic efficiencies, and the irreversibility rates. The method presented is very rigorous with respect to data consistency, particularly for the flue gas composition.

  8. A Simple Buckling Analysis Method for Airframe Composite Stiffened Panel by Finite Strip Method

    Science.gov (United States)

    Tanoue, Yoshitsugu

    Carbon fiber reinforced plastics (CFRP) have been used in structural components for newly developed aircraft and spacecraft. The main structures of an airframe, such as the fuselage and wings, are essentially composed of stiffened panels. Therefore, in the structural design of airframes, it is important to evaluate the buckling strength of the composite stiffened panels. The widely used finite element method (FEM) can analyze any stiffened panel shape with various boundary conditions. However, in the early phase of airframe development, many studies are required in structural design prior to carrying out detailed drawings. In this phase, performing structural analysis using only FEM may not be very efficient. This paper describes a simple buckling analysis method for composite stiffened panels, which is based on the finite strip method. This method can deal with isotropic and anisotropic laminated plates and shells with several boundary conditions. The accuracy of this method was verified by comparing it with theoretical analysis and FEM analysis (NASTRAN). It has been observed that the buckling coefficients calculated via the present method are in agreement with results found by detailed analysis methods. Consequently, this method is designed to be an effective calculation tool for buckling analysis in the early phases of airframe design.
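
A useful sanity check for any such buckling tool is the classical closed-form critical stress of a simply supported plate in uniaxial compression, with buckling coefficient k = 4. The material and geometry values below are hypothetical (a generic aluminium skin), not taken from the paper:

```python
import math

# hypothetical aluminium skin panel (assumed values, not from the paper)
E, nu = 70e9, 0.3        # Young's modulus (Pa), Poisson's ratio
t, b = 0.002, 0.200      # thickness and loaded width (m)
k = 4.0                  # buckling coefficient, simply supported edges

# classical critical buckling stress: sigma_cr = k*pi^2*E/(12*(1-nu^2)) * (t/b)^2
sigma_cr = k * math.pi**2 * E / (12.0 * (1.0 - nu**2)) * (t / b) ** 2
# ~25.3 MPa for these inputs
```

A finite strip or FEM result for the same plate should reproduce this value closely, which makes the formula a convenient verification case.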

  9. Task analysis methods applicable to control room design review (CDR)

    International Nuclear Information System (INIS)

    This report presents the results of a research study conducted in support of the human factors engineering program of the Atomic Energy Control Board in Canada. It contains five products which may be used by the Atomic Energy Control Board in relation to Task Analysis of jobs in CANDU nuclear power plants: 1. a detailed method for preparing for a task analysis; 2. a Task Data Form for recording task analysis data; 3. a detailed method for carrying out task analyses; 4. a guide to assessing alternative methods for performing task analyses, if such are proposed by utilities or consultants; and 5. an annotated bibliography on task analysis. In addition, a short explanation of the origins, nature and uses of task analysis is provided, with some examples of its cost effectiveness. 35 refs

  10. Drop impact analysis method of radioactive material container

    International Nuclear Information System (INIS)

    Background: It is important to ensure the safety of radioactive material containers during transportation. Purpose: In the procedure of reviewing radioactive material container transportation, the drop impact analysis of the container is a very important factor. Methods: This paper presents a drop impact analysis method for radioactive material containers. First, several drop cases of the container, such as horizontal drop, vertical drop and gradient drop, are calculated with the well-known transient dynamic analysis program LS-DYNA. Second, the stress evaluation is done according to the rules concerning fatigue analysis in the ASME Section III Division I Appendices. Results: With this method, we can judge whether the container's strength is adequate. Conclusions: The radioactive material container's strength is shown by analysis to be adequate. (authors)

  11. Comparison of methods for vibration analysis of electrostatic precipitators

    Institute of Scientific and Technical Information of China (English)

    Iwona Adamiec-Wójcik; Andrzej Nowak; Stanisław Wojciech

    2011-01-01

    The paper presents two methods for the formulation of free vibration analysis of collecting electrodes of precipitators. The first, called the hybrid finite element method, combines the finite element method, used for calculations of spring deformations, with the rigid finite element method, used to reflect mass and geometrical features. As a result, a model with a diagonal mass matrix is obtained. Due to the specific geometry of the electrodes, which are long plates of complicated shapes, the second method proposed is the strip method, which is a semi-analytical method. The strip method allows us to formulate the equations of motion with a considerably smaller number of generalized coordinates. Results of numerical calculations obtained by both methods are compared with those obtained using commercial software such as ANSYS and ABAQUS. Good compatibility of results is achieved.

  12. Meshless methods in biomechanics bone tissue remodelling analysis

    CERN Document Server

    Belinha, Jorge

    2014-01-01

    This book presents the complete formulation of a new advanced discretization meshless technique: the Natural Neighbour Radial Point Interpolation Method (NNRPIM). In addition, two of the most popular meshless methods, the EFGM and the RPIM, are fully presented. Being a truly meshless method, the major advantages of the NNRPIM over the FEM and other meshless methods are the remeshing flexibility and the higher accuracy of the obtained variable field. Using the natural neighbour concept, the NNRPIM permits the influence-domain to be determined organically, resembling the natural behaviour of cellulae. This innovation permits the analysis of convex boundaries and extremely irregular meshes, which is an advantage in biomechanical analysis, with no extra computational effort associated. This volume shows how to extend the NNRPIM to bone tissue remodelling analysis, expecting to contribute new numerical tools and strategies that permit a more efficient numerical biomechanical analysis.

  13. Review of assessment methods discount rate in investment analysis

    Directory of Open Access Journals (Sweden)

    Yamaletdinova Guzel Hamidullovna

    2011-08-01

    Full Text Available The article examines the current methods of calculating discount rate in investment analysis and business valuation, as well as analyzes the key problems using various techniques in terms of the Russian economy.

  14. Comparative analysis of myocardial revascularization methods for ischemic heart disease

    Directory of Open Access Journals (Sweden)

    Sinkeev M.S.

    2012-09-01

    Full Text Available This literature review is devoted to a comparative analysis of clinical research on the efficiency, and the frequency of complications, of surgical and medicamentous methods of treating coronary heart disease.

  15. Allometric Scaling Analysis for City Development: Model, Method, and Applications

    CERN Document Server

    Chen, Yanguang

    2011-01-01

    An allometric scaling analysis method based on ideas from fractal theory, general system theory, and the analytical hierarchy process is proposed to make a comprehensive evaluation of the relative level of city development.
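
The core of such an allometric analysis is estimating the scaling exponent b in y = a·x^b, typically by ordinary least squares in log-log space. A minimal sketch with synthetic city data (the population/area numbers below are invented, not from the paper):

```python
import numpy as np

# hypothetical city data: indicator y scales with population x as y = a * x**b
rng = np.random.default_rng(1)
x = np.logspace(4, 7, 40)                                 # populations
y = 3.0 * x**0.85 * np.exp(0.01 * rng.standard_normal(x.size))  # ~1% noise

# taking logs turns y = a*x**b into log y = log a + b*log x, a straight line
b, log_a = np.polyfit(np.log(x), np.log(y), 1)            # slope = exponent b
```

The fitted slope recovers the scaling exponent (here close to the assumed 0.85); deviations of individual cities from the fitted line are what the comprehensive evaluation then ranks.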

  16. Development of root observation method by image analysis system

    OpenAIRE

    Kim, Giyoung

    1995-01-01

    Knowledge of plant roots is important for determining plant-soil relationships, managing soil effectively, studying nutrient and water extraction, and creating a soil quality index. Plant root research is limited by the large amount of time and labor required to wash the roots from the soil and measure the viable roots. A root measurement method based on image analysis was proposed to reduce the time and labor requirement. A thinning algorithm-based image analysis method was us...

  17. Rapid method to determine proximate analysis and pyritic sulfur

    Energy Technology Data Exchange (ETDEWEB)

    Rowe, M.W.; Hyman, M.

    1985-05-01

    The use of thermomagnetogravimetry has been proposed as an alternative to the ASTM methods for measuring the pyritic sulphur content of coal and for proximate analysis. This paper presents a comparison of the results of thermogravimetry for proximate analysis and thermomagnetometry for pyritic sulphur with ASTM values on the same samples. The thermomagnetogravimetric technique is quicker and easier than the ASTM methods, and of comparable accuracy.

  18. Numerical methods design, analysis, and computer implementation of algorithms

    CERN Document Server

    Greenbaum, Anne

    2012-01-01

    Numerical Methods provides a clear and concise exploration of standard numerical analysis topics, as well as nontraditional ones, including mathematical modeling, Monte Carlo methods, Markov chains, and fractals. Filled with appealing examples that will motivate students, the textbook considers modern application areas, such as information retrieval and animation, and classical topics from physics and engineering. Exercises use MATLAB and promote understanding of computational results. The book gives instructors the flexibility to emphasize different aspects--design, analysis, or c

  19. Method development of gas analysis with mass spectrometer

    International Nuclear Information System (INIS)

    Dissolved gas content in deep saline groundwater is an important factor, which has to be known and taken into account when planning the deep repository for the spent nuclear fuel. Posiva has investigated dissolved gases in deep groundwaters since the 1990s. In 2002 Posiva started a project that focused on developing the mass spectrometric method for measuring the dissolved gas content of deep saline groundwater. The main idea of the project was to analyse the dissolved gas content of both the gas phase and the water phase by a mass spectrometer. The development of the method started in the autumn of 2003. One of the aims was to create a method for gas analysis parallel to the gas chromatographic method. The starting point of this project was to test whether gases could be analysed directly from water using a membrane inlet in the mass spectrometer. The main objective was to develop mass spectrometric methods for gas analysis with direct and membrane inlets. An analysis method for dissolved gases was developed for direct gas inlet mass spectrometry. The accuracy of the analysis method was tested with parallel real PAVE samples analysed in the laboratory of Insinööritoimisto Paavo Ristola Oy. The results were good. The development of the membrane inlet mass spectrometric method still continues. Two different membrane materials (silicone and teflon) were tested. Some basic tests (linearity, repeatability and detection limits for different gases) will be done by this method. (orig.)

  20. Analysis of spectral methods for the homogeneous Boltzmann equation

    KAUST Repository

    Filbet, Francis

    2011-04-01

    The development of accurate and fast algorithms for the Boltzmann collision integral and their analysis represent a challenging problem in scientific computing and numerical analysis. Recently, several works were devoted to the derivation of spectrally accurate schemes for the Boltzmann equation, but very few of them were concerned with the stability analysis of the method. In particular there was no result of stability except when the method was modified in order to enforce the positivity preservation, which destroys the spectral accuracy. In this paper we propose a new method to study the stability of homogeneous Boltzmann equations perturbed by smoothed balanced operators which do not preserve positivity of the distribution. This method takes advantage of the "spreading" property of the collision, together with estimates on regularity and entropy production. As an application we prove stability and convergence of spectral methods for the Boltzmann equation, when the discretization parameter is large enough (with explicit bound). © 2010 American Mathematical Society.

  1. Stochastic Analysis Method of Sea Environment Simulated by Numerical Models

    Institute of Scientific and Technical Information of China (English)

    刘德辅; 焦桂英; 张明霞; 温书勤

    2003-01-01

    This paper proposes the stochastic analysis method of sea environment simulated by numerical models, such as wave height, current field, design sea levels and longshore sediment transport. Uncertainty and sensitivity analysis of input and output factors of numerical models, their long-term distribution and confidence intervals are described in this paper.

  2. The Constant Comparative Analysis Method Outside of Grounded Theory

    Science.gov (United States)

    Fram, Sheila M.

    2013-01-01

    This commentary addresses the gap in the literature regarding discussion of the legitimate use of Constant Comparative Analysis Method (CCA) outside of Grounded Theory. The purpose is to show the strength of using CCA to maintain the emic perspective and how theoretical frameworks can maintain the etic perspective throughout the analysis. My…

  3. Method of morphological analysis of enterprise management organizational structure

    OpenAIRE

    Heorhiadi, N.; Iwaszczuk, N.; Vilhutska, R.

    2013-01-01

    The essence of the method of morphological analysis of enterprise management organizational structure is described in the article. Setting the levels of morphological decomposition and specifying the sets of elements are necessary for morphological analysis. Based on empirical research, the factors that influence the formation and use of enterprise management organizational structures were identified.

  4. Application of Data Mining methods in analysis of company's activity

    OpenAIRE

    Tyurina Dina N.

    2013-01-01

    The article considers the expediency of applying Data Mining techniques along with traditional statistical methods of analysis of the financial and economic activity of a company, for revealing all possible factors that influence the effectiveness of its functioning by means of solving clusterisation tasks. It shows the main advantages of applying Data Mining techniques in the analysis of a company's activity. It offers an algorithm for conducting analysis of a company's activity, which facilitates significant ...

  5. One-step green synthetic approach for the preparation of multicolor emitting copper nanoclusters and their applications in chemical species sensing and bioimaging.

    Science.gov (United States)

    Bhamore, Jigna R; Jha, Sanjay; Mungara, Anil Kumar; Singhal, Rakesh Kumar; Sonkeshariya, Dhanshri; Kailasa, Suresh Kumar

    2016-06-15

    A one-step green microwave synthetic approach was developed for the synthesis of copper nanoclusters (Cu NCs), which were used as a fluorescent probe for the sensitive detection of thiram and paraquat in water and food samples. Unexpectedly, the prepared Cu NCs exhibited strong orange fluorescence with an emission peak at 600 nm. Under optimized conditions, the quenching of the Cu NCs emission peak at 600 nm was linearly proportional to thiram and paraquat concentrations in the ranges from 0.5 to 1000 µM and from 0.2 to 1000 µM, with detection limits of 70 nM and 49 nM, respectively. In addition, bioimaging studies against Bacillus subtilis through confocal fluorescence microscopy indicated that the Cu NCs showed strong blue and green fluorescence signals, good permeability and minimum toxicity against various bacteria species, which demonstrates their potential feasibility for chemical species sensing and bioimaging applications. PMID:26851582
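
The 3σ/slope arithmetic behind detection limits like 70 nM can be sketched as follows. The calibration points and the blank standard deviation below are assumed values chosen so the result lands near the reported limit; they are not data from the paper:

```python
import numpy as np

# hypothetical calibration: quenching signal vs. analyte concentration (µM)
conc = np.array([0.5, 1, 5, 10, 50, 100, 500, 1000], dtype=float)
quench = 0.004 * conc + 0.02       # idealized linear response (assumed)

# linear least-squares calibration
slope, intercept = np.polyfit(conc, quench, 1)

# IUPAC-style detection limit: 3 * (std. dev. of blank) / slope
sd_blank = 9.3e-5                  # assumed blank noise
lod_uM = 3 * sd_blank / slope      # ~0.07 µM, i.e. ~70 nM
```

With these assumed numbers the computed limit reproduces the order of magnitude the abstract reports; in practice the blank standard deviation comes from repeated blank measurements.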

  6. Review of analysis methods for prestressed concrete reactor vessels

    Energy Technology Data Exchange (ETDEWEB)

    Dodge, W.G.; Bazant, Z.P.; Gallagher, R.H.

    1977-02-01

    Theoretical and practical aspects of analytical models and numerical procedures for detailed analysis of prestressed concrete reactor vessels are reviewed. Constitutive models and numerical algorithms for time-dependent and nonlinear response of concrete and various methods for modeling crack propagation are discussed. Published comparisons between experimental and theoretical results are used to assess the accuracy of these analytical methods.

  7. Application of numerical analysis methods to thermoluminescence dosimetry

    International Nuclear Information System (INIS)

    This report presents the application of numerical methods to thermoluminescence dosimetry (TLD), showing the advantages obtained over conventional evaluation systems. Different configurations of the analysis method are presented to operate in specific dosimetric applications of TLD, such as environmental monitoring and mailed dosimetry systems for quality assurance in radiotherapy facilities. (Author) 10 refs

  8. Practical Use of Computationally Frugal Model Analysis Methods.

    Science.gov (United States)

    Hill, Mary C; Kavetski, Dmitri; Clark, Martyn; Ye, Ming; Arabi, Mazdak; Lu, Dan; Foglia, Laura; Mehl, Steffen

    2016-03-01

    Three challenges compromise the utility of mathematical models of groundwater and other environmental systems: (1) a dizzying array of model analysis methods and metrics make it difficult to compare evaluations of model adequacy, sensitivity, and uncertainty; (2) the high computational demands of many popular model analysis methods (requiring 1000s, 10,000s, or more model runs) make them difficult to apply to complex models; and (3) many models are plagued by unrealistic nonlinearities arising from the numerical model formulation and implementation. This study proposes a strategy to address these challenges through a careful combination of model analysis and implementation methods. In this strategy, computationally frugal model analysis methods (often requiring a few dozen parallelizable model runs) play a major role, and computationally demanding methods are used for problems where (relatively) inexpensive diagnostics suggest the frugal methods are unreliable. We also argue in favor of detecting and, where possible, eliminating unrealistic model nonlinearities; this increases the realism of the model itself and facilitates the application of frugal methods. Literature examples are used to demonstrate the use of frugal methods and associated diagnostics. We suggest that the strategy proposed in this paper would allow the environmental sciences community to achieve greater transparency and falsifiability of environmental models, and obtain greater scientific insight from ongoing and future modeling efforts. PMID:25810333
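
A minimal example of a frugal method of the kind the authors favor is one-at-a-time local sensitivity analysis, which needs only n+1 model runs for n parameters (versus thousands for global sampling methods). The toy model below is hypothetical, standing in for an expensive environmental simulator:

```python
import numpy as np

def local_sensitivities(model, p, rel_step=1e-6):
    """Forward-difference one-at-a-time sensitivities: n+1 model runs."""
    p = np.asarray(p, dtype=float)
    base = model(p)                       # one base run
    sens = np.empty_like(p)
    for i in range(p.size):               # one perturbed run per parameter
        dp = rel_step * max(abs(p[i]), 1.0)
        q = p.copy()
        q[i] += dp
        sens[i] = (model(q) - base) / dp
    return sens

# hypothetical two-parameter "model" standing in for a simulator
model = lambda p: p[0] ** 2 + 3.0 * p[1]
s = local_sensitivities(model, [2.0, 5.0])   # ~[4, 3] analytically
```

The perturbed runs are independent and therefore trivially parallelizable, which is exactly why such diagnostics stay cheap for complex models.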

  9. Investigating Convergence Patterns for Numerical Methods Using Data Analysis

    Science.gov (United States)

    Gordon, Sheldon P.

    2013-01-01

    The article investigates the patterns that arise in the convergence of numerical methods, particularly those in the errors involved in successive iterations, using data analysis and curve fitting methods. In particular, the results obtained are used to convey a deeper level of understanding of the concepts of linear, quadratic, and cubic…
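
The kind of data analysis described can be reproduced in a few lines: generate the error sequence of an iterative method and estimate its convergence order from successive error ratios. A sketch using Newton's method for √2 (an assumed example, not necessarily one from the article):

```python
import math

root = math.sqrt(2.0)
x = 1.5
errs = []
for _ in range(4):                 # record the errors of four iterates
    errs.append(abs(x - root))
    x = 0.5 * (x + 2.0 / x)        # Newton step for f(x) = x**2 - 2

# order estimate p ~ log(e_{n+1}/e_n) / log(e_n/e_{n-1})
e = errs
p = math.log(e[3] / e[2]) / math.log(e[2] / e[1])
```

For Newton's method the estimate comes out very close to 2, the quadratic convergence the theory predicts; the same ratio test applied to bisection or secant errors yields orders near 1 and 1.618.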

  10. Robust methods for multivariate data analysis A1

    DEFF Research Database (Denmark)

    Frosch, Stina; Von Frese, J.; Bro, Rasmus

    2005-01-01

    Outliers may hamper proper classical multivariate analysis, and lead to incorrect conclusions. To remedy the problem of outliers, robust methods are developed in statistics and chemometrics. Robust methods reduce or remove the effect of outlying data points and allow the "good" data to primarily...

  11. Bayesian data analysis in population ecology: motivations, methods, and benefits

    Science.gov (United States)

    Dorazio, Robert

    2016-01-01

    During the 20th century ecologists largely relied on the frequentist system of inference for the analysis of their data. However, in the past few decades ecologists have become increasingly interested in the use of Bayesian methods of data analysis. In this article I provide guidance to ecologists who would like to decide whether Bayesian methods can be used to improve their conclusions and predictions. I begin by providing a concise summary of Bayesian methods of analysis, including a comparison of differences between Bayesian and frequentist approaches to inference when using hierarchical models. Next I provide a list of problems where Bayesian methods of analysis may arguably be preferred over frequentist methods. These problems are usually encountered in analyses based on hierarchical models of data. I describe the essentials required for applying modern methods of Bayesian computation, and I use real-world examples to illustrate these methods. I conclude by summarizing what I perceive to be the main strengths and weaknesses of using Bayesian methods to solve ecological inference problems.
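
The simplest illustration of the Bayesian machinery discussed is a conjugate Beta-Binomial update, e.g. for a detection probability; the prior and the survey counts below are hypothetical:

```python
from fractions import Fraction

# Beta(a, b) prior on a detection probability; uniform prior = Beta(1, 1)
a, b = Fraction(1), Fraction(1)

# hypothetical binomial data: k detections in n survey occasions
k, n = 7, 10

# conjugate update: posterior is Beta(a + k, b + n - k)
a_post, b_post = a + k, b + (n - k)
post_mean = a_post / (a_post + b_post)   # exact posterior mean, 2/3 here
```

Hierarchical models of the kind the article emphasizes extend this idea by placing further priors on a and b; the posterior then usually requires MCMC rather than a closed form.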

  12. Integrated numerical methods for hypersonic aircraft cooling systems analysis

    Science.gov (United States)

    Petley, Dennis H.; Jones, Stuart C.; Dziedzic, William M.

    1992-01-01

    Numerical methods have been developed for the analysis of hypersonic aircraft cooling systems. A general purpose finite difference thermal analysis code is used to determine areas which must be cooled. Complex cooling networks of series and parallel flow can be analyzed using a finite difference computer program. Both internal fluid flow and heat transfer are analyzed, because increased heat flow causes a decrease in the flow of the coolant. The steady state solution is obtained by a successive point iterative method. The transient analysis uses implicit forward-backward differencing. Several examples of the use of the program in studies of hypersonic aircraft and rockets are provided.
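
The successive point iteration mentioned for the steady state can be sketched on a toy conduction path (Gauss-Seidel sweeps over the nodes); the node count and boundary temperatures here are assumed, not from the paper:

```python
# five nodes along a conduction path; end temperatures held fixed
T = [100.0, 0.0, 0.0, 0.0, 0.0]        # initial guess; T[0], T[4] are boundaries

for _ in range(200):                   # successive point (Gauss-Seidel) sweeps
    for i in range(1, 4):
        # steady-state energy balance with equal conductances on both sides:
        # each interior node relaxes to the mean of its neighbours
        T[i] = 0.5 * (T[i - 1] + T[i + 1])

# converges to the linear profile [100, 75, 50, 25, 0]
```

A real cooling-network solver couples such temperature sweeps with flow-rate updates, iterating both until the heat-flow/coolant-flow interaction converges.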

  13. The Methods of Sensitivity Analysis and Their Usage for Analysis of Multicriteria Decision

    Directory of Open Access Journals (Sweden)

    Rūta Simanavičienė

    2011-08-01

    Full Text Available In this paper we describe the fields of application of sensitivity analysis methods. We review the application of these methods in multiple criteria decision making when the initial data are numbers. We formulate the problem of which of the sensitivity analysis methods is more effective for use in the decision making process. Article in Lithuanian

  14. Fault Diagnosis and Reliability Analysis Using Fuzzy Logic Method

    Institute of Scientific and Technical Information of China (English)

    Miao Zhinong; Xu Yang; Zhao Xiangyu

    2006-01-01

    A new fuzzy logic fault diagnosis method is proposed. In this method, fuzzy equations are employed to estimate the component state of a system based on the measured system performance and the relationship between component state and system performance, which is called the "performance-parameter" knowledge base and is constructed by experts. Compared with the traditional fault diagnosis method, this fuzzy logic method can use human intuitive knowledge and does not need a precise mapping between system performance and component state. Simulation proves its effectiveness in fault diagnosis. Then, the reliability analysis is performed based on the fuzzy logic method.
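
A toy version of such a fuzzy estimate of component state from measured performance might look like this; the triangular membership breakpoints and the pump-pressure rule are invented for illustration, not taken from the paper's knowledge base:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def diagnose(pressure_bar):
    """Hypothetical expert rule: low pump pressure suggests a degraded pump."""
    return {
        "degraded": tri(pressure_bar, 0.0, 1.0, 2.5),
        "healthy": tri(pressure_bar, 1.5, 3.0, 4.5),
    }

state = diagnose(1.2)   # membership degrees for each component state
```

The expert's intuitive knowledge lives entirely in the membership shapes, so no precise performance-to-state mapping is required; a reliability analysis can then weight failure modes by these membership degrees.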

  15. Bearing Capacity Analysis Using Meshless Local Petrov-Galerkin Method

    Directory of Open Access Journals (Sweden)

    Mužík Juraj

    2014-05-01

    Full Text Available The paper deals with the use of a meshless method for soil bearing capacity analysis. There are many formulations of meshless methods. The article presents the Meshless Local Petrov-Galerkin method (MLPG), a local weak formulation of the equilibrium equations. The main difference between meshless methods and the conventional finite element method (FEM) is that meshless shape functions are constructed using a randomly scattered set of points, without any relation between points. The Heaviside step function is the test function used in the meshless implementation presented in the article. The Heaviside test function makes the weak-formulation integral very simple, because the only body integral remaining in the governing equation is due to the body force.

  16. Mode Analysis with Autocorrelation Method (Single Time Series) in Tokamak

    Science.gov (United States)

    Saadat, Shervin; Salem, Mohammad K.; Goranneviss, Mahmoud; Khorshid, Pejman

    2010-08-01

    In this paper the plasma mode is analyzed with a statistical method designated the autocorrelation function. The autocorrelation function uses one time series, so for this purpose only one Mirnov coil is needed. After autocorrelation analysis of the Mirnov coil data, a spectral density diagram is plotted. From its symmetries and trends, the spectral density diagram can be used to analyze the plasma mode. The effects of RHF fields are investigated with this method in the IR-T1 tokamak, and the results correspond with those of multichannel methods such as SVD and FFT.
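
The single-time-series procedure (autocorrelation of one coil signal, then a spectral density via the Wiener-Khinchin relation) can be sketched as follows; the 50 Hz test oscillation and sample rate are assumed stand-ins, not tokamak data:

```python
import numpy as np

fs = 1000.0                                   # sample rate, Hz (assumed)
t = np.arange(1000) / fs
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * 50.0 * t) + 0.1 * rng.standard_normal(t.size)

# autocorrelation of the single (mean-removed) time series
xc = x - x.mean()
acf = np.correlate(xc, xc, mode="full")

# Wiener-Khinchin: spectral density = Fourier transform of the ACF
spec = np.abs(np.fft.rfft(acf))
freqs = np.fft.rfftfreq(acf.size, d=1.0 / fs)
mode_freq = freqs[np.argmax(spec)]            # dominant mode frequency, ~50 Hz
```

The autocorrelation step suppresses uncorrelated noise relative to the periodic mode, which is why a single coil can suffice where raw-FFT approaches would use several.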

  17. Multifunctional Collaborative Modeling and Analysis Methods in Engineering Science

    Science.gov (United States)

    Ransom, Jonathan B.; Broduer, Steve (Technical Monitor)

    2001-01-01

    Engineers are challenged to produce better designs in less time and for less cost. Hence, to investigate novel and revolutionary design concepts, accurate, high-fidelity results must be assimilated rapidly into the design, analysis, and simulation process. This assimilation should consider diverse mathematical modeling and multi-discipline interactions necessitated by concepts exploiting advanced materials and structures. Integrated high-fidelity methods with diverse engineering applications provide the enabling technologies to assimilate these high-fidelity, multi-disciplinary results rapidly at an early stage in the design. These integrated methods must be multifunctional, collaborative, and applicable to the general field of engineering science and mechanics. Multifunctional methodologies and analysis procedures are formulated for interfacing diverse subdomain idealizations including multi-fidelity modeling methods and multi-discipline analysis methods. These methods, based on the method of weighted residuals, ensure accurate compatibility of primary and secondary variables across the subdomain interfaces. Methods are developed using diverse mathematical modeling (i.e., finite difference and finite element methods) and multi-fidelity modeling among the subdomains. Several benchmark scalar-field and vector-field problems in engineering science are presented with extensions to multidisciplinary problems. Results for all problems presented are in overall good agreement with the exact analytical solution or the reference numerical solution. Based on the results, the integrated modeling approach using the finite element method for multi-fidelity discretization among the subdomains is identified as most robust. The multiple-method approach is advantageous when interfacing diverse disciplines in which each of the method's strengths are utilized. The multifunctional methodology presented provides an effective mechanism by which domains with diverse idealizations are

  18. Application of homotopy analysis method for solving nonlinear Cauchy problem

    Directory of Open Access Journals (Sweden)

    V.G. Gupta

    2012-11-01

    Full Text Available In this paper, by means of the homotopy analysis method (HAM), the solutions of some nonlinear Cauchy problems of parabolic-hyperbolic type are obtained exactly in the form of convergent Taylor series. The HAM contains the auxiliary parameter ħ, which provides a convenient way of controlling the convergence region of the series solutions. The method is also applied to linear examples to obtain exact solutions. The results reveal that the proposed method is very effective and simple.
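    The construction described above can be sketched in general form (a generic outline of Liao's HAM, not the paper's specific equations): for a nonlinear equation $\mathcal{N}[u(t)]=0$, the HAM embeds the problem in a zeroth-order deformation equation,

```latex
% Zeroth-order deformation equation: q in [0,1] is the embedding parameter,
% L an auxiliary linear operator, u_0 an initial guess, hbar the convergence-control parameter.
(1 - q)\,\mathcal{L}\bigl[\phi(t;q) - u_0(t)\bigr] = q\,\hbar\,\mathcal{N}\bigl[\phi(t;q)\bigr]
% As q runs from 0 to 1, phi deforms from u_0 to the solution u. Expanding in q,
\phi(t;q) = u_0(t) + \sum_{m=1}^{\infty} u_m(t)\, q^{m},
\qquad
u(t) = u_0(t) + \sum_{m=1}^{\infty} u_m(t) \quad \text{at } q = 1,
% and hbar is chosen so that the series converges in the region of interest.
```

    The freedom to tune ħ is what distinguishes HAM from ordinary perturbation expansions, which have no such convergence-control parameter.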

  19. Parametric quadratic programming method for elastic contact fracture analysis

    OpenAIRE

    Su, RKL; Zhu, Y; Leung, AYT

    2002-01-01

    A solution procedure for elastic contact fracture mechanics is proposed in this paper. The procedure is based on quadratic programming and the finite element method (FEM). The parametric quadratic programming method for two-dimensional contact mechanics analysis is applied to crack problems involving crack surfaces in frictional contact. Based on a linear complementary contact condition, the parametric variational principle and the FEM, a linear complementary method is ex...

  20. Unascertained Factor Method of Dynamic Characteristic Analysis for Antenna Structures

    Institute of Scientific and Technical Information of China (English)

    ZHU Zeng-qing; LIANG Zhen-tao; CHEN Jian-jun

    2008-01-01

    The dynamic characteristic analysis model of antenna structures is built, in which the structural physical parameters and geometrical dimensions are all considered as unascertained variables. A structural dynamic characteristic analysis method based on the unascertained factor method is then given. The computational expression of the structural characteristic is developed from the mathematical expression of the unascertained factor and the principles of unascertained rational number arithmetic. An example is given, in which the possible values and confidence degrees of the unascertained structural characteristics are obtained. The calculated results show that the method is feasible and effective.

  1. Statistical methods of SNP data analysis with applications

    CERN Document Server

    Bulinski, Alexander; Shashkin, Alexey; Yaskov, Pavel

    2011-01-01

    Various statistical methods important for genetic analysis are considered and developed. Namely, we concentrate on the multifactor dimensionality reduction, logic regression, random forests and stochastic gradient boosting. These methods and their new modifications, e.g., the MDR method with "independent rule", are used to study the risk of complex diseases such as cardiovascular ones. The roles of certain combinations of single nucleotide polymorphisms and external risk factors are examined. To perform the data analysis concerning the ischemic heart disease and myocardial infarction the supercomputer SKIF "Chebyshev" of the Lomonosov Moscow State University was employed.
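    The MDR step mentioned above can be illustrated with a minimal pure-Python sketch: pool each multilocus genotype combination into a high- or low-risk group by its case/control ratio, then exhaustively search SNP pairs for the best classifier. The data layout, function names and the toy interaction example are our own illustrative assumptions, not the authors' implementation.

```python
from itertools import combinations

def mdr_classify(genotypes, labels, pair, threshold=1.0):
    """Label each genotype combination of a SNP pair as high risk (1)
    or low risk (0) by its case/control ratio."""
    counts = {}
    for row, y in zip(genotypes, labels):
        key = (row[pair[0]], row[pair[1]])
        cases, controls = counts.get(key, (0, 0))
        counts[key] = (cases + y, controls + (1 - y))
    return {k: 1 if ca > threshold * co else 0 for k, (ca, co) in counts.items()}

def mdr_accuracy(genotypes, labels, pair):
    """Classification accuracy of the pooled high/low-risk rule for one pair."""
    rule = mdr_classify(genotypes, labels, pair)
    hits = sum(rule[(g[pair[0]], g[pair[1]])] == y for g, y in zip(genotypes, labels))
    return hits / len(labels)

def best_pair(genotypes, labels):
    """Exhaustive search over SNP pairs, as in MDR's combinatorial step."""
    n_snps = len(genotypes[0])
    return max(combinations(range(n_snps), 2),
               key=lambda p: mdr_accuracy(genotypes, labels, p))
```

    On a toy dataset where disease status is an interaction of the first two SNPs (invisible to single-locus tests), the exhaustive pair search recovers the interacting pair.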

  2. Summary of fracture mechanics problems analysis method in ABAQUS

    Directory of Open Access Journals (Sweden)

    Duan Hongjun

    2015-07-01

    Full Text Available Fracture mechanics is the discipline that studies the strength of materials or structures containing cracks and the laws governing crack propagation. ABAQUS offers many analysis capabilities, including fracture analysis; it is easy to use and makes it straightforward to model complicated problems. To support the effective study of strong-discontinuity problems such as cracks, ABAQUS provides two methods of simulating crack problems. This paper describes the two methods and compares them.

  3. Intellectual Data Analysis Method for Evaluation of Virtual Teams

    Directory of Open Access Journals (Sweden)

    Dalia Krikščiūnienė

    2012-07-01

    Full Text Available The purpose of the article is to present a method for virtual team performance evaluation based on intelligent team member collaboration data analysis. The motivation for the research is based on the ability to create an evaluation method that is similar to ambiguous expert evaluations. The concept of the hierarchical fuzzy rule based method aims to evaluate the data from virtual team interaction instances related to implementation of project tasks. The suggested method is designed for project managers or virtual team leaders to help in virtual teamwork evaluation that is based on captured data analysis. The main point of the method is the ability to repeat human thinking and expert valuation process for data analysis by applying fuzzy logic: fuzzy sets, fuzzy signatures and fuzzy rules. The fuzzy set principle used in the method allows evaluation criteria numerical values to transform into linguistic terms and use it in constructing fuzzy rules. Using a fuzzy signature is possible in constructing a hierarchical criteria structure. This structure helps to solve the problem of exponential increase of fuzzy rules including more input variables. The suggested method is aimed to be applied in the virtual collaboration software as a real time teamwork evaluation tool. The research shows that by applying fuzzy logic for team collaboration data analysis it is possible to get evaluations equal to expert insights. The method includes virtual team, project task and team collaboration data analysis. The advantage of the suggested method is the possibility to use variables gained from virtual collaboration systems as fuzzy rules inputs. Information on fuzzy logic based virtual teamwork collaboration evaluation has evidence that can be investigated in the future. Also the method can be seen as the next virtual collaboration software development step.

  4. Intellectual Data Analysis Method for Evaluation of Virtual Teams

    Directory of Open Access Journals (Sweden)

    Sandra Strigūnaitė

    2013-01-01

    Full Text Available The purpose of the article is to present a method for virtual team performance evaluation based on intelligent team member collaboration data analysis. The motivation for the research is based on the ability to create an evaluation method that is similar to ambiguous expert evaluations. The concept of the hierarchical fuzzy rule based method aims to evaluate the data from virtual team interaction instances related to implementation of project tasks. The suggested method is designed for project managers or virtual team leaders to help in virtual teamwork evaluation that is based on captured data analysis. The main point of the method is the ability to repeat human thinking and expert valuation process for data analysis by applying fuzzy logic: fuzzy sets, fuzzy signatures and fuzzy rules. The fuzzy set principle used in the method allows evaluation criteria numerical values to transform into linguistic terms and use it in constructing fuzzy rules. Using a fuzzy signature is possible in constructing a hierarchical criteria structure. This structure helps to solve the problem of exponential increase of fuzzy rules including more input variables. The suggested method is aimed to be applied in the virtual collaboration software as a real time teamwork evaluation tool. The research shows that by applying fuzzy logic for team collaboration data analysis it is possible to get evaluations equal to expert insights. The method includes virtual team, project task and team collaboration data analysis. The advantage of the suggested method is the possibility to use variables gained from virtual collaboration systems as fuzzy rules inputs. Information on fuzzy logic based virtual teamwork collaboration evaluation has evidence that can be investigated in the future. Also the method can be seen as the next virtual collaboration software development step.
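    The fuzzification step described above, transforming numerical criteria values into linguistic terms, can be sketched in a few lines. The term definitions, the score range and the min t-norm rule combiner are hypothetical choices for illustration, not the authors' rule base.

```python
def triangular(x, a, b, c):
    """Triangular membership function with support (a, c) and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical linguistic terms for a collaboration-activity score in [0, 10].
TERMS = {
    "low":    (-1e-9, 0.0, 5.0),          # peak at 0, fades out by 5
    "medium": (2.0, 5.0, 8.0),
    "high":   (5.0, 10.0, 10.0 + 1e-9),   # peak at 10
}

def fuzzify(score):
    """Map a numerical criterion value to membership degrees in linguistic terms."""
    return {term: triangular(score, *abc) for term, abc in TERMS.items()}

def rule_and(*degrees):
    """Fuzzy AND (min t-norm) for combining rule antecedents."""
    return min(degrees)
```

    A score of 6.5, for example, is simultaneously "medium" to degree 0.5 and "high" to degree 0.3, which is exactly the kind of graded judgment an expert evaluator makes.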

  5. Complexity of software trustworthiness and its dynamical statistical analysis methods

    Institute of Scientific and Technical Information of China (English)

    ZHENG ZhiMing; MA ShiLong; LI Wei; JIANG Xin; WEI Wei; MA LiLi; TANG ShaoTing

    2009-01-01

    Developing trusted software has become an important trend and a natural choice in the development of software technology and applications. At present, methods for the measurement and assessment of software trustworthiness cannot guarantee safe and reliable operation of software systems completely and effectively. Based on the study of dynamical systems, this paper interprets the characteristics of the behaviour of software systems and the basic scientific problems of software trustworthiness complexity, analyzes the characteristics of this complexity, and proposes to study software trustworthiness measurement in terms of it. Using dynamical statistical analysis methods, the paper advances an invariant-measure based assessment method of software trustworthiness by statistical indices, and thereby provides a dynamical criterion for the untrustworthiness of software systems. By an example, the feasibility of the proposed dynamical statistical analysis method in software trustworthiness measurement is demonstrated using numerical simulations and theoretical analysis.

  6. SOLUTION OF THE ENSO DELAYED OSCILLATOR WITH HOMOTOPY ANALYSIS METHOD

    Institute of Scientific and Technical Information of China (English)

    WU Zi-ku

    2009-01-01

    An ENSO delayed oscillator is considered. The El Niño atmospheric physics oscillation is an abnormal phenomenon involved in the tropical Pacific ocean-atmosphere interactions. The conceptual oscillator model should consider the variations of both the eastern and western Pacific anomaly patterns. Using the homotopy analysis method, approximate expansions of the solution of the corresponding problem are constructed. The method is based on a continuous variation from an initial trial to the exact solution. A Maclaurin series expansion provides a successive approximation of the solution through repeated application of a differential operator with the initial trial as the first term. This approach does not require the use of perturbation parameters, and the solution series converges rapidly with the number of terms. Comparison of the approximate analytical solution obtained by the homotopy analysis method with the exact solution shows that the homotopy analysis method is valid for solving the strongly nonlinear ENSO delayed oscillator model.
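    For context, a classic conceptual ENSO delayed oscillator takes the Suarez-Schopf form (the paper's exact model and coefficients may differ):

```latex
% T: eastern Pacific SST anomaly (nondimensional). The cubic term bounds growth,
% and the delayed term represents the western Pacific signal returning after
% the equatorial wave transit time \delta.
\frac{\mathrm{d}T}{\mathrm{d}t} = T - T^{3} - \alpha\, T(t - \delta),
\qquad 0 < \alpha < 1,\ \ \delta > 0 .
```

    The delay term makes the equation a nonlinear delay-differential equation, which is why perturbation-free series methods such as HAM are attractive here.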

  7. Global metabolite analysis of yeast: evaluation of sample preparation methods

    DEFF Research Database (Denmark)

    Villas-Bôas, Silas Granato; Højer-Pedersen, Jesper; Åkesson, Mats Fredrik;

    2005-01-01

    Sample preparation is considered one of the limiting steps in microbial metabolome analysis. Eukaryotes and prokaryotes behave very differently during the several steps of classical sample preparation methods for analysis of metabolites. Even within the eukaryote kingdom there is a vast diversity of cell structures that make it imprudent to blindly adopt protocols that were designed for a specific group of microorganisms. We have therefore reviewed and evaluated the whole sample preparation procedures for analysis of yeast metabolites. Our focus has been on the current needs in metabolome analysis, which is the analysis of a large number of metabolites with very diverse chemical and physical properties. This work reports the leakage of intracellular metabolites observed during quenching yeast cells with cold methanol solution, the efficacy of six different methods for the extraction

  8. Qualitative analysis of the CCEBC/EEAC method

    Institute of Scientific and Technical Information of China (English)

    LIAO Haohui; TANG Yun

    2004-01-01

    The CCEBC/EEAC method is an effective method in the quantitative analysis of power system transient stability. This paper provides a qualitative analysis of the CCEBC/EEAC method and shows that, from a geometrical point of view, the CCCOI-RM transformation used in the CCEBC/EEAC method can be regarded as a projection of the variables of the system model onto a weighted vector space, from which a generalized P-δ trajectory is obtained. Since a transient process of power systems can be approximately regarded as a time-piecewise simple Hamiltonian system, in order to qualitatively analyse the CCEBC/EEAC method, this paper compares the potential energy of a two-machine infinite bus system with its CCEBC/EEAC energy. Numerical results indicate their similarity. Clarifying the qualitative relation between these two kinds of energy is significant in mathematically verifying the CCEBC/EEAC criterion for judging power system transient stability. Moreover, the qualitative analysis of the CCEBC/EEAC method enables us to better understand some important phenomena revealed by quantitative analysis, such as multi-swing loss of stability and isolated stable domains.

  9. INDIRECT DETERMINATION METHOD OF DYNAMIC FORCE BY USING CEPSTRUM ANALYSIS

    Institute of Scientific and Technical Information of China (English)

    吴淼; 魏任之

    1996-01-01

    The dynamic load spectrum is one of the most important bases for the design and dynamic characteristics analysis of machines. But it is difficult to measure on many occasions, especially for mining machines, due to their harsh working conditions and the high cost of measurements. In such situations, the load spectrum has to be obtained by indirect determination methods. A new method to identify the load spectrum, the cepstrum analysis method, is presented in this paper. This method can be used to eliminate the filtering influence of the transfer function on the response signals, so that the load spectrum can be determined indirectly. Experimental and engineering examples indicate that this method has the advantages that the calculation is simple and the measurement is easy.
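    The deconvolution idea behind the method can be sketched in a few lines: a measured response spectrum is the product of the load spectrum and the transfer function, so their log magnitudes add, and therefore their cepstra add, which allows the filtering effect of the structure to be separated out. This is a generic real-cepstrum sketch with a naive DFT, not the paper's algorithm.

```python
import cmath
import math

def dft(x):
    """Naive discrete Fourier transform (fine for tiny illustrative signals)."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    """Naive inverse DFT."""
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * math.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

def real_cepstrum(x):
    """c = IDFT(log|DFT(x)|): convolution in time (load * transfer path)
    becomes addition in the cepstral domain."""
    log_mag = [math.log(abs(X) + 1e-12) for X in dft(x)]  # tiny floor avoids log(0)
    return [c.real for c in idft(log_mag)]
```

    The key property exercised below: the cepstrum of a circular convolution of two signals equals the sum of their individual cepstra, which is what lets the load spectrum be recovered once the transfer function's cepstrum is known.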

  10. Transit Traffic Analysis Zone Delineating Method Based on Thiessen Polygon

    Directory of Open Access Journals (Sweden)

    Shuwei Wang

    2014-04-01

    Full Text Available A green transportation system composed of transit, buses and bicycles could be significant in alleviating traffic congestion. However, the inaccuracy of current transit ridership forecasting methods is imposing a negative impact on the development of urban transit systems. Traffic Analysis Zone (TAZ) delineation is a fundamental and essential step in ridership forecasting, and the existing delineation methods in four-step models have problems in reflecting the travel characteristics of urban transit. This paper aims to develop a Transit Traffic Analysis Zone delineation method as a supplement to traditional TAZs in transit service analysis. The deficiencies of current TAZ delineation methods were analyzed, and the requirements of the Transit Traffic Analysis Zone (TTAZ) were summarized. Considering these requirements, the Thiessen polygon was introduced into TTAZ delineation. In order to validate its feasibility, Beijing was then taken as an example to delineate TTAZs, followed by a spatial analysis of office buildings within a TTAZ and transit station departure passengers. The analysis results show that TTAZs based on Thiessen polygons can reflect transit travel characteristics and are worthy of in-depth research.
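    The essence of a Thiessen-polygon partition is that every location belongs to the zone of its nearest station. A minimal sketch of that assignment step follows; the coordinates are hypothetical, and a production version would use a computational-geometry library to construct the actual polygon boundaries.

```python
def nearest_station(point, stations):
    """A point belongs to the Thiessen polygon of its nearest station
    (squared Euclidean distance suffices for comparison)."""
    return min(stations, key=lambda s: (s[0] - point[0]) ** 2 + (s[1] - point[1]) ** 2)

def delineate_ttaz(points, stations):
    """Partition demand points into TTAZs: one zone per transit station."""
    zones = {s: [] for s in stations}
    for p in points:
        zones[nearest_station(p, stations)].append(p)
    return zones
```

    Aggregating office buildings or departure passengers per zone then reduces to summing over each station's point list.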

  11. A study of applicability of soil-structure interaction analysis method using boundary element method

    Energy Technology Data Exchange (ETDEWEB)

    Kim, M. K. [KAERI, Taejon (Korea, Republic of); Kim, M. K. [Yonsei University, Seoul (Korea, Republic of)

    2003-07-01

    In this study, a numerical method for Soil-Structure Interaction (SSI) analysis using the FE-BE coupling method is developed. The total system is divided into two parts, the so-called far field and near field. The far field is modeled by a boundary element formulation using the multi-layered dynamic fundamental solution and coupled with the near field modeled by finite elements. In order to verify the seismic response analysis, the results are compared with those of another commercial code. Finally, several SSI analyses under seismic loading are performed to examine the dynamic behavior of the system. The results show that the developed method can be an efficient numerical method for SSI analysis.

  12. Hydrothermal analysis in engineering using control volume finite element method

    CERN Document Server

    Sheikholeslami, Mohsen

    2015-01-01

    Control volume finite element methods (CVFEM) bridge the gap between finite difference and finite element methods, using the advantages of both methods for simulation of multi-physics problems in complex geometries. In Hydrothermal Analysis in Engineering Using Control Volume Finite Element Method, CVFEM is covered in detail and applied to key areas of thermal engineering. Examples, exercises, and extensive references are used to show the use of the technique to model key engineering problems such as heat transfer in nanofluids (to enhance performance and compactness of energy systems),

  13. AN ANALYSIS METHOD FOR HIGH-SPEED CIRCUIT SYSTEMS

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    A new method for analyzing high-speed circuit systems is presented. The method adds transmission line end currents to the circuit variables of the classical modified nodal approach. The matrix equation describing a high-speed circuit system can then be formulated directly and analyzed conveniently in its normative form. A time-domain analysis method for transmission lines is also introduced. The two methods are combined to efficiently analyze high-speed circuit systems containing general transmission lines. A numerical experiment is presented and the results are compared with those calculated by HSPICE.

  14. Rapid method of proximate analysis of coals from Indian coalfields

    Energy Technology Data Exchange (ETDEWEB)

    Alam, N.; Paul, S.K. [CFRI, Dhanbad (India)

    2001-07-01

    Proximate analysis is useful for finding the percentage of ash in coal, which is the threshold requirement by which the commercial value of coal is adjudged. In the present study, a rapid method for determination of the ash percentage in coal is discussed. A rapid method is always desirable for releasing more quality parameters within the scheduled time. The results obtained by this method were found to be satisfactory and comparable with the results obtained by the British Standard specifications. It is concluded that this method can be effectively adopted where sulphur, carbonates and alkalis are present in low amounts. 4 refs., 3 tabs.

  15. Extended Finite Element Method for Fracture Analysis of Structures

    CERN Document Server

    Mohammadi, Soheil

    2008-01-01

    This important textbook provides an introduction to the concepts of the newly developed extended finite element method (XFEM) for fracture analysis of structures, as well as for other related engineering applications. One of the main advantages of the method is that it avoids any need for remeshing or geometric crack modelling in numerical simulation, while generating discontinuous fields along a crack and around its tip. The second major advantage of the method is that by a small increase in number of degrees of freedom, far more accurate solutions can be obtained. The method has recently been

  16. Continuum damage growth analysis using element free Galerkin method

    Indian Academy of Sciences (India)

    C O Arun; B N Rao; S M Srinivasan

    2010-06-01

    This paper presents an elasto-plastic element free Galerkin formulation based on the Newton–Raphson algorithm for damage growth analysis. An isotropic ductile damage evolution law is used. A study has been carried out using the proposed element free Galerkin method to understand the effect of initial damage and its growth on the structural response of single- and bi-material problems. A simple method is adopted for enforcing essential boundary conditions (EBCs) by scaling the function approximation using a scaling matrix, when non-singular weight functions are used over the entire domain of the problem definition. Numerical examples comprising one- and two-dimensional problems are presented to illustrate the effectiveness of the proposed method in the analysis of uniform and non-uniform damage evolution problems. The effect of material discontinuity on damage growth analysis is also presented.

  17. The colour analysis method applied to homogeneous rocks

    Science.gov (United States)

    Halász, Amadé; Halmai, Ákos

    2015-12-01

    Computer-aided colour analysis can facilitate cyclostratigraphic studies. Here we report on a case study involving the development of a digital colour analysis method for examination of the Boda Claystone Formation which is the most suitable in Hungary for the disposal of high-level radioactive waste. Rock type colours are reddish brown or brownish red, or any shade between brown and red. The method presented here could be used to differentiate similar colours and to identify gradual transitions between these; the latter are of great importance in a cyclostratigraphic analysis of the succession. Geophysical well-logging has demonstrated the existence of characteristic cyclic units, as detected by colour and natural gamma. Based on our research, colour, natural gamma and lithology correlate well. For core Ib-4, these features reveal the presence of orderly cycles with thicknesses of roughly 0.64 to 13 metres. Once the core has been scanned, this is a time- and cost-effective method.

  18. Identification of the isomers using principal component analysis (PCA) method

    Science.gov (United States)

    Kepceoǧlu, Abdullah; Gündoǧdu, Yasemin; Ledingham, Kenneth William David; Kilic, Hamdi Sukur

    2016-03-01

    In this work, we have carried out a detailed statistical analysis of experimental mass spectra from xylene isomers. Principal Component Analysis (PCA) was used to identify the isomers, which cannot be distinguished using conventional statistical methods for the interpretation of their mass spectra. Experiments were carried out using a linear TOF-MS coupled to a femtosecond laser system as the energy source for the ionisation processes. The collected data have been analysed and interpreted using PCA as a multivariate analysis of these spectra. This demonstrates the strength of the method in distinguishing isomers that cannot be identified using conventional mass analysis obtained through dissociative ionisation processes on these molecules. The PCA results, depending on the laser pulse energy and the background pressure in the spectrometers, are presented in this work.
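    A minimal sketch of what PCA does to such spectra: centre the data, form the covariance matrix, and project each spectrum onto the dominant eigenvector (found here by power iteration). This is a generic illustration of the technique, not the authors' analysis pipeline, and the toy data are invented.

```python
def pca_first_component(data, iters=200):
    """Dominant principal component via power iteration on the covariance matrix.
    Returns the component vector and the score of each observation on it."""
    n, d = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(d)]
    centered = [[row[j] - means[j] for j in range(d)] for row in data]
    # Sample covariance matrix (d x d).
    cov = [[sum(r[i] * r[j] for r in centered) / (n - 1) for j in range(d)]
           for i in range(d)]
    v = [1.0] * d
    for _ in range(iters):  # power iteration: v <- Cv / ||Cv||
        w = [sum(cov[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    scores = [sum(r[j] * v[j] for j in range(d)) for r in centered]
    return v, scores
```

    Observations that conventional single-variable statistics cannot separate may still fall into distinct groups when plotted by their scores on the first few components, which is exactly how PCA separates the isomers' spectra.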

  19. Structural analysis with the finite element method linear statics

    CERN Document Server

    Oñate, Eugenio

    2013-01-01

    STRUCTURAL ANALYSIS WITH THE FINITE ELEMENT METHOD Linear Statics Volume 1: The Basis and Solids Eugenio Oñate The two volumes of this book cover most of the theoretical and computational aspects of the linear static analysis of structures with the Finite Element Method (FEM). The content of the book is based on the lecture notes of a basic course on Structural Analysis with the FEM taught by the author at the Technical University of Catalonia (UPC) in Barcelona, Spain for the last 30 years. Volume 1 presents the basis of the FEM for structural analysis and a detailed description of the finite element formulation for axially loaded bars, plane elasticity problems, axisymmetric solids and general three dimensional solids. Each chapter describes the background theory for each structural model considered, details of the finite element formulation and guidelines for the application to structural engineering problems. The book includes a chapter on miscellaneous topics such as treatment of inclined supports, elas...

  20. Nonlinear Dimensionality Reduction Methods in Climate Data Analysis

    CERN Document Server

    Ross, Ian

    2008-01-01

    Linear dimensionality reduction techniques, notably principal component analysis, are widely used in climate data analysis as a means to aid in the interpretation of datasets of high dimensionality. These linear methods may not be appropriate for the analysis of data arising from nonlinear processes occurring in the climate system. Numerous techniques for nonlinear dimensionality reduction have been developed recently that may provide a potentially useful tool for the identification of low-dimensional manifolds in climate data sets arising from nonlinear dynamics. In this thesis I apply three such techniques to the study of El Nino/Southern Oscillation variability in tropical Pacific sea surface temperatures and thermocline depth, comparing observational data with simulations from coupled atmosphere-ocean general circulation models from the CMIP3 multi-model ensemble. The three methods used here are a nonlinear principal component analysis (NLPCA) approach based on neural networks, the Isomap isometric mappin...

  1. Beyond perturbation introduction to the homotopy analysis method

    CERN Document Server

    Liao, Shijun

    2003-01-01

    Solving nonlinear problems is inherently difficult, and the stronger the nonlinearity, the more intractable solutions become. Analytic approximations often break down as nonlinearity becomes strong, and even perturbation approximations are valid only for problems with weak nonlinearity. This book introduces a powerful new analytic method for nonlinear problems, homotopy analysis, that remains valid even with strong nonlinearity. In Part I, the author starts with a very simple example, then presents the basic ideas, detailed procedures, and the advantages (and limitations) of homotopy analysis. Part II illustrates the application of homotopy analysis to many interesting nonlinear problems. These range from simple bifurcations of a nonlinear boundary-value problem to the Thomas-Fermi atom model, Volterra's population model, Von Kármán swirling viscous flow, and nonlinear progressive waves in deep water. Although the homotopy analysis method has been verified in a number of prestigious journals, it has yet to be ...

  2. Computer vision analysis of image motion by variational methods

    CERN Document Server

    Mitiche, Amar

    2014-01-01

    This book presents a unified view of image motion analysis under the variational framework. Variational methods, rooted in physics and mechanics, but appearing in many other domains, such as statistics, control, and computer vision, address a problem from an optimization standpoint, i.e., they formulate it as the optimization of an objective function or functional. The methods of image motion analysis described in this book use the calculus of variations to minimize (or maximize) an objective functional which transcribes all of the constraints that characterize the desired motion variables. The book addresses the four core subjects of motion analysis: motion estimation, detection, tracking, and three-dimensional interpretation. Each topic is covered in a dedicated chapter. The presentation is prefaced by an introductory chapter which discusses the purpose of motion analysis. Further, a chapter is included which gives the basic tools and formulae related to curvature, Euler-Lagrange equations, unconstrained de...

  3. Adaptive computational methods for SSME internal flow analysis

    Science.gov (United States)

    Oden, J. T.

    1986-01-01

    Adaptive finite element methods for the analysis of classes of problems in compressible and incompressible flow of interest in SSME (space shuttle main engine) analysis and design are described. The general objective of the adaptive methods is to improve and to quantify the quality of numerical solutions to the governing partial differential equations of fluid dynamics in two-dimensional cases. There are several different families of adaptive schemes that can be used to improve the quality of solutions in complex flow simulations. Among these are: (1) r-methods (node-redistribution or moving mesh methods), in which a fixed number of nodal points is allowed to migrate to points in the mesh where high error is detected; (2) h-methods, in which the mesh size h is automatically refined to reduce local error; and (3) p-methods, in which the local degree p of the finite element approximation is increased to reduce local error. Two of the three basic techniques have been studied in this project: an r-method for steady Euler equations in two dimensions and a p-method for transient, laminar, viscous incompressible flow. Numerical results are presented. A brief introduction to residual methods of a posteriori error estimation is also given and some pertinent conclusions of the study are listed.
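    The h-method described above can be sketched for a 1-D mesh: any element whose local error indicator exceeds a tolerance is bisected, concentrating resolution where the solution is poorly resolved. The error indicator used below (plain element size) is a hypothetical stand-in for illustration, not the residual-based estimators used in the project.

```python
def h_refine(mesh, error_indicator, tol):
    """One pass of h-refinement on a 1-D mesh (a sorted list of node
    coordinates): bisect every element whose error estimate exceeds tol."""
    new_nodes = list(mesh)
    for a, b in zip(mesh, mesh[1:]):          # iterate over elements [a, b]
        if error_indicator(a, b) > tol:
            new_nodes.append((a + b) / 2.0)   # bisect the offending element
    return sorted(new_nodes)
```

    In practice the pass is repeated, re-solving and re-estimating after each refinement, until every element's indicator falls below the tolerance.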

  4. Fusion genetic analysis of jasmonate-signalling mutants in Arabidopsis

    DEFF Research Database (Denmark)

    Jensen, Anders Bøgh; Raventos, D.; Mundy, John Williams

    2002-01-01

    activity was also induced by the protein kinase inhibitor staurosporine and antagonized by the protein phosphatase inhibitor okadaic acid. FLUC bio-imaging, RNA gel-blot analysis and progeny analyses identified three recessive mutants that underexpress the FLUC reporter, designated jue1, 2 and 3, as well...

  5. MANNER OF STOCKS SORTING USING CLUSTER ANALYSIS METHODS

    Directory of Open Access Journals (Sweden)

    Jana Halčinová

    2014-06-01

    Full Text Available The aim of the present article is to show the possibility of using cluster analysis methods in the classification of stocks of finished products. Cluster analysis creates groups (clusters) of finished products according to similarity in demand, i.e. customer requirements for each product. The manner of sorting stocks of finished products by clusters is described with a practical example. The resulting clusters are incorporated into the draft layout of the distribution warehouse.
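    The clustering step can be sketched with plain k-means on one-dimensional demand figures (e.g. units ordered per product per week): products with similar demand end up in the same cluster and can be slotted together in the warehouse layout. The demand numbers are invented for illustration and the initialization is deliberately naive; the article does not specify which clustering algorithm was used.

```python
def kmeans(points, k, iters=20):
    """Plain k-means on 1-D demand figures."""
    # Naive init: spread initial centroids across the sorted data.
    centroids = sorted(points)[::max(1, len(points) // k)][:k]
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for p in points:  # assign each product to its nearest centroid
            idx = min(range(len(centroids)), key=lambda i: abs(p - centroids[i]))
            clusters[idx].append(p)
        # Move each centroid to the mean of its cluster.
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters
```

    With two clearly separated demand groups, the algorithm recovers a slow-moving and a fast-moving cluster, which is the split a warehouse layout would exploit.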

  6. Reliability analysis method for slope stability based on sample weight

    Directory of Open Access Journals (Sweden)

    Zhi-gang YANG

    2009-09-01

    Full Text Available The single safety factor criteria for slope stability evaluation, derived from the rigid limit equilibrium method or finite element method (FEM, may not include some important information, especially for steep slopes with complex geological conditions. This paper presents a new reliability method that uses sample weight analysis. Based on the distribution characteristics of random variables, the minimal sample size of every random variable is extracted according to a small sample t-distribution under a certain expected value, and the weight coefficient of each extracted sample is considered to be its contribution to the random variables. Then, the weight coefficients of the random sample combinations are determined using the Bayes formula, and different sample combinations are taken as the input for slope stability analysis. According to one-to-one mapping between the input sample combination and the output safety coefficient, the reliability index of slope stability can be obtained with the multiplication principle. Slope stability analysis of the left bank of the Baihetan Project is used as an example, and the analysis results show that the present method is reasonable and practicable for the reliability analysis of steep slopes with complex geological conditions.
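    The mapping from weighted sample combinations to safety factors can be sketched as follows: each input combination's weight is the product of its samples' weights (assuming independent variables, a simplification of the paper's Bayes-formula weighting), and the reliability is the total weight of the combinations judged stable. The variable values and the safety-factor function below are invented for illustration.

```python
from itertools import product

def reliability(variables, safety_factor, threshold=1.0):
    """Weighted-sample reliability: `variables` is a list of sample lists,
    each sample a (value, weight) pair with weights summing to 1 per variable.
    A combination counts toward reliability if its safety factor >= threshold."""
    total = 0.0
    for combo in product(*variables):
        weight = 1.0
        values = []
        for value, w in combo:
            weight *= w          # multiplication principle over the combination
            values.append(value)
        if safety_factor(*values) >= threshold:
            total += weight
    return total
```

    Because every combination is enumerated, the one-to-one mapping between input sample combinations and output safety coefficients mentioned in the abstract is explicit in the loop.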

  7. Using non-parametric methods in econometric production analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    2012-01-01

    Econometric estimation of production functions is one of the most common methods in applied economic production analysis. These studies usually apply parametric estimation techniques, which obligate the researcher to specify a functional form of the production function of which the Cobb...... parameter estimates, but also in biased measures which are derived from the parameters, such as elasticities. Therefore, we propose to use non-parametric econometric methods. First, these can be applied to verify the functional form used in parametric production analysis. Second, they can be directly used...... different so that general conclusions based on the non-parametric estimation deviate from the conclusions based on the Translog model....

  8. A Coupling Model of the Discontinuous Deformation Analysis Method and the Finite Element Method

    Institute of Scientific and Technical Information of China (English)

    ZHANG Ming; YANG Heqing; LI Zhongkui

    2005-01-01

Neither the finite element method nor the discontinuous deformation analysis method can solve problems very well on its own in rock mechanics and engineering due to their extreme complexity. A coupling method combining both should have wider applicability. Such a model, coupling the discontinuous deformation analysis method and the finite element method, is proposed in this paper. In the model, so-called line blocks are introduced to deal with the interaction across the common interfacial boundary between the discontinuous deformation analysis domain and the finite element domain. The interfacial conditions during the incremental iteration process are satisfied by means of the line blocks. The requirement of gradually small displacements in each incremental step of this coupling method is met through a displacement control procedure. The model is simple in concept and easy to implement numerically. A numerical example is given. The displacements obtained by the coupling method agree well with those obtained by the finite element method, which shows the rationality of the model and the validity of the implementation scheme.

  9. Comparative study of analysis methods in biospeckle phenomenon

    Science.gov (United States)

    da Silva, Emerson Rodrigo; Muramatsu, Mikiya

    2008-04-01

In this work we present a review of the main statistical properties of speckle patterns and carry out a comparative study of the most widely used methods for analysis and extraction of information from optical speckle grain. The first- and second-order space-time statistics are discussed from an overview perspective. The biospeckle phenomenon receives detailed attention, especially in its application to monitoring of activity in tissues. The main techniques used to obtain information from speckle patterns are presented, with special prominence given to the autocorrelation function, co-occurrence matrices, Fujii's method, Briers' contrast, and spatial and temporal contrast analysis (LASCA and LASTCA). An incipient method of analysis, based on the study of the contrast of successive correlations, is introduced. Numerical simulations, using different probability density functions for the velocities of the scatterers, were made with two objectives: to test the analysis methods and to aid interpretation of in vivo results. Vegetable and animal tissues are investigated, achieving the monitoring of the senescence process and vascularization maps on leaves, the tracking of fungus-contaminated fruits, the mapping of activity in flowers, and the analysis of healing in rats subjected to abdominal surgery. Experiments using the biospeckle phenomenon in microscopy are carried out. Finally, the potential of biospeckle as a diagnostic tool for chronic vein ulcers treated with low-intensity laser therapy is evaluated, and the better analysis methods for each kind of tissue are pointed out.
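Of the techniques listed, Fujii's method is compact enough to show directly: it accumulates normalized frame-to-frame intensity differences into an activity map. A minimal pure-Python sketch; the frame data are hypothetical grey levels, not from the study.

```python
def fujii_map(frames):
    """Fujii's method: accumulate |I_k - I_{k+1}| / (I_k + I_{k+1})
    pixel by pixel over consecutive frames; larger values indicate
    higher local (bio)activity."""
    rows, cols = len(frames[0]), len(frames[0][0])
    activity = [[0.0] * cols for _ in range(rows)]
    for k in range(len(frames) - 1):
        for i in range(rows):
            for j in range(cols):
                a, b = frames[k][i][j], frames[k + 1][i][j]
                if a + b > 0:  # skip dark pixels to avoid division by zero
                    activity[i][j] += abs(a - b) / (a + b)
    return activity

# Two hypothetical 2x2 speckle frames (grey levels).
frames = [[[10, 20], [30, 40]],
          [[12, 18], [30, 44]]]
activity = fujii_map(frames)
```

A static pixel (identical in both frames) accumulates zero activity; fluctuating pixels accumulate positive values.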

  10. Open Rotor Computational Aeroacoustic Analysis with an Immersed Boundary Method

    Science.gov (United States)

    Brehm, Christoph; Barad, Michael F.; Kiris, Cetin C.

    2016-01-01

Reliable noise prediction capabilities are essential to enable novel fuel-efficient open rotor designs that can meet community and cabin noise standards. Toward this end, immersed boundary methods have reached a level of maturity where more and more complex flow problems can be tackled with this approach. This paper demonstrates that our higher-order immersed boundary method provides the ability for aeroacoustic analysis of wake-dominated flow fields generated by a contra-rotating open rotor. This is a first-of-its-kind aeroacoustic simulation of an open rotor propulsion system employing an immersed boundary method. In addition to discussing the methodology of applying the immersed boundary method to this moving boundary problem, we provide a detailed validation of the aeroacoustic analysis approach employing the Launch Ascent and Vehicle Aerodynamics (LAVA) solver. Two free-stream Mach numbers, M=0.2 and M=0.78, are considered in this analysis, based on the nominal take-off and cruise flow conditions. The simulation data are compared to available experimental data and other computational results employing more conventional CFD methods. Spectral analysis is used to determine the dominant wave propagation pattern in the acoustic near-field.

  11. Study on color difference estimation method of medicine biochemical analysis

    Science.gov (United States)

    Wang, Chunhong; Zhou, Yue; Zhao, Hongxia; Sun, Jiashi; Zhou, Fengkun

    2006-01-01

Biochemical analysis in medicine is an important inspection and diagnosis method in hospital clinics, and the biochemical analysis of urine is one important item. Urine test paper shows a corresponding color for each detection target or degree of illness. The color difference between the standard threshold and the color of the urine test paper can be used to judge the degree of illness, enabling further analysis and diagnosis. Color is a three-dimensional physical variable with a psychological component, while reflectance is a one-dimensional variable; therefore, estimating color difference in urine tests can achieve better precision and convenience than the conventional test method based on one-dimensional reflectance, and can support an accurate diagnosis. A digital camera can easily capture an image of the urine test paper and thus carry out the urine biochemical analysis conveniently. In the experiment, the color image of the urine test paper is taken by a popular color digital camera and saved on a computer that runs a simple color-space conversion (RGB -> XYZ -> L*a*b*) and the calculation software. The test sample is graded according to intelligent detection of quantitative color. The images from each test are saved on the computer, so the whole course of the illness can be monitored. This method can also be used in other medical biochemical analyses that involve color. Experimental results show that this test method is quick and accurate; it can be used in hospitals, calibration organizations, and homes, so its application prospects are extensive.
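The RGB -> XYZ -> L*a*b* pipeline the abstract describes can be sketched as follows. The sRGB linearization, conversion matrix, and D65 white point are standard assumptions, since the record does not give the camera's calibration; the CIE76 delta-E between a test-paper color and a standard threshold then serves as the single-number color difference.

```python
def srgb_to_lab(r, g, b):
    """sRGB (0-255) -> linear RGB -> CIE XYZ (D65) -> CIE L*a*b*."""
    def lin(c):
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    rl, gl, bl = lin(r), lin(g), lin(b)
    # Standard sRGB -> XYZ matrix (D65 white point).
    x = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl
    y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
    z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl
    xn, yn, zn = 0.95047, 1.0, 1.08883  # D65 reference white
    def f(t):
        return t ** (1 / 3) if t > 0.008856 else 7.787 * t + 16 / 116
    fx, fy, fz = f(x / xn), f(y / yn), f(z / zn)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

def delta_e(lab1, lab2):
    """CIE76 color difference between two L*a*b* values."""
    return sum((p - q) ** 2 for p, q in zip(lab1, lab2)) ** 0.5

lab_white = srgb_to_lab(255, 255, 255)
```

Grading a sample then amounts to finding the standard threshold color with the smallest delta-E from the measured patch.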

  12. Quantitative Phase Analysis by the Rietveld Method for Forensic Science.

    Science.gov (United States)

    Deng, Fei; Lin, Xiaodong; He, Yonghong; Li, Shu; Zi, Run; Lai, Shijun

    2015-07-01

Quantitative phase analysis (QPA) is helpful for determining the type attribute of an object because it presents the content of its constituents. QPA by the Rietveld method requires neither the measurement of calibration data nor the use of an internal standard; however, the approximate crystal structure of each phase in a mixture is necessary. In this study, 8 synthetic mixtures composed of potassium nitrate and sulfur were analyzed by the Rietveld QPA method. The Rietveld refinement was accomplished with a material analysis using diffraction program and evaluated by three agreement indices. Results showed that Rietveld QPA yielded precise results, with errors generally less than 2.0% absolute. In addition, a criminal case which was solved with the help of the Rietveld QPA method is also introduced. This method will allow forensic investigators to acquire detailed information about the material evidence, which can point the way for case detection and court proceedings.
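Rietveld QPA recovers weight fractions directly from the refined scale factors via the standard Hill-Howard relation, which is why no internal standard is needed. The numbers below are hypothetical stand-ins, not the paper's refined values for the KNO3/sulfur mixtures.

```python
def rietveld_weight_fractions(phases):
    """Hill-Howard relation: W_p = S_p (Z M V)_p / sum_j S_j (Z M V)_j,
    where S is the refined scale factor, Z the formula units per cell,
    M the formula mass, and V the unit-cell volume."""
    zmvs = [(name, s * z * m * v) for name, s, z, m, v in phases]
    total = sum(t for _, t in zmvs)
    return {name: t / total for name, t in zmvs}

# Hypothetical refined values for a KNO3 / sulfur mixture:
# (phase, scale factor, Z, formula mass, cell volume).
mix = [
    ("KNO3",   1.2e-4, 4,   101.10, 480.0),
    ("sulfur", 2.5e-6, 128, 32.07,  3300.0),
]
w = rietveld_weight_fractions(mix)
```

The fractions sum to one by construction, so only relative scale factors matter.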

  13. A method for exergy analysis of sugar cane bagasse boilers

    Energy Technology Data Exchange (ETDEWEB)

    Cortez, L.A.B.; Gomez, E.O. [Universidade Estadual de Campinas, SP (Brazil). Faculdade de Engenharia Agricola

    1998-03-01

This work presents a method for conducting a thermodynamic analysis of sugarcane bagasse boilers. The method is based on the standard and actual reactions, which allow calculation of the enthalpies of each process sub-equation and the exergies of each of the main flow rates participating in the combustion. The method is presented using an example with real data from a sugarcane bagasse boiler. A summary of the results obtained is also presented, based on the 1st Law of Thermodynamics analysis, together with the exergetic efficiencies and the irreversibility rates. The method presented is very rigorous with respect to data consistency, particularly for the flue gas composition. (author) 11 refs., 1 fig., 6 tabs.; e-mail: cortez at agr.unicamp.br
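The two summary quantities the abstract names, exergetic efficiency and irreversibility rate, follow from the exergy flows by simple bookkeeping. A minimal sketch with hypothetical exergy flow values, not the boiler data from the paper:

```python
def exergetic_efficiency(exergy_out_useful, exergy_in):
    """Second-law efficiency: useful exergy output over exergy input."""
    return exergy_out_useful / exergy_in

def irreversibility_rate(exergy_in, exergy_out_total):
    """Exergy destroyed in the process: the residual of the exergy balance."""
    return exergy_in - exergy_out_total

# Hypothetical flows: bagasse fuel exergy in 100 MW, steam exergy out 25 MW.
eta_ex = exergetic_efficiency(25.0, 100.0)
i_rate = irreversibility_rate(100.0, 25.0)  # MW destroyed if steam is the only outflow
```

Combustion irreversibility typically dominates such balances, which is why second-law efficiencies of bagasse boilers are far below their first-law efficiencies.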

  14. Variable cluster analysis method for building neural network model

    Institute of Scientific and Technical Information of China (English)

    王海东; 刘元东

    2004-01-01

To address the competing requirements that input variables be reduced as much as possible while still fully explaining the output variables when building a neural network model of a complicated system, a variable selection method based on cluster analysis was investigated. A similarity coefficient which describes the mutual relation of variables was defined. The methods of the highest contribution rate, part replacing whole, and variable replacement were put forward and deduced from information theory. Software for the neural network based on cluster analysis, which provides several methods for defining the variable similarity coefficient, clustering the system variables, and evaluating the variable clusters, was developed and applied to build a neural network forecast model of cement clinker quality. The results show that the network scale, training time, and prediction accuracy are all satisfactory. The practical application demonstrates that this method of selecting variables for neural networks is feasible and effective.
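A minimal sketch of the idea under assumed definitions (the abstract does not give the paper's exact similarity coefficient): use the absolute Pearson correlation as the similarity coefficient, group variables whose similarity exceeds a threshold, and keep one representative per cluster as a network input ("part replacing whole"). The data are hypothetical.

```python
import math

def similarity(x, y):
    """Similarity coefficient, here assumed to be |Pearson r|
    (non-constant series assumed, to keep the sketch short)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return abs(cov / (sx * sy))

def select_inputs(data, threshold=0.9):
    """Greedy clustering: a variable joins an existing cluster if its
    similarity to that cluster's representative exceeds the threshold;
    only representatives become network inputs."""
    kept = []
    for name in data:
        if all(similarity(data[name], data[k]) < threshold for k in kept):
            kept.append(name)
    return kept

# Hypothetical system variables: "b" is nearly a multiple of "a".
data = {"a": [1.0, 2.0, 3.0, 4.0],
        "b": [2.0, 4.0, 6.0, 8.1],
        "c": [4.0, 1.0, 3.0, 2.0]}
inputs = select_inputs(data)
```

Here "b" is dropped as redundant with "a", shrinking the input layer without losing information.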

  15. Quaternion-based discriminant analysis method for color face recognition.

    Science.gov (United States)

    Xu, Yong

    2012-01-01

Pattern recognition techniques have been used to automatically recognize objects and personal identities, predict the function of a protein or the category of a cancer, identify lesions, perform product inspection, and so on. In this paper we propose a novel quaternion-based discriminant method. This method represents and classifies color images in a simple and mathematically tractable way. The proposed method is suitable for a large variety of real-world applications such as color face recognition and classification of ground targets shown in multispectrum remote images. The method first uses quaternion numbers to denote the pixels in a color image and exploits a quaternion vector to represent the color image. It then uses the linear discriminant analysis algorithm to transform the quaternion vector into a lower-dimensional quaternion vector and classifies it in this space. The experimental results show that the proposed method can obtain a very high accuracy for color face recognition. PMID:22937054

  16. Analysis of Stress Updates in the Material-point Method

    DEFF Research Database (Denmark)

    Andersen, Søren; Andersen, Lars

    2009-01-01

    The material-point method (MPM) is a new numerical method for analysis of large strain engineering problems. The MPM applies a dual formulation, where the state of the problem (mass, stress, strain, velocity etc.) is tracked using a finite set of material points while the governing equations...... updating and integrating stresses in time is problematic. This is discussed using an example of the dynamical collapse of a soil column....

  17. Probabilistic safety analysis : a new nuclear power plants licensing method

    International Nuclear Information System (INIS)

    After a brief retrospect of the application of Probabilistic Safety Analysis in the nuclear field, the basic differences between the deterministic licensing method, currently in use, and the probabilistic method are explained. Next, the two main proposals (by the AIF and the ACRS) concerning the establishment of the so-called quantitative safety goals (or simply 'safety goals') are separately presented and afterwards compared in their most fundamental aspects. Finally, some recent applications and future possibilities are discussed. (Author)

  18. Statistical methods for genetic association analysis involving complex longitudinal data

    OpenAIRE

    Salem, Rany Mansour

    2009-01-01

Most, if not all, human phenotypes exhibit a temporal, dosage-dependent, or age effect. In this work, I explore and showcase the use of different analytical methods for assessing the genetic contribution to traits with temporal trends, or what I refer to as 'dynamic complex traits' (DCTs). The study of DCTs could offer insights into disease pathogenesis that are not achievable in other research settings. I describe the development and application of a method of DCT analysis termed 'Curve-Based ...

  19. Homotopy Analysis Method for Solving Biological Population Model

    Institute of Scientific and Technical Information of China (English)

    A.M. Arafa; S,Z. Rida; H. Mohamed

    2011-01-01

    In this paper, the homotopy analysis method (HAM) is applied to solve generalized biological population models. The fractional derivatives are described by Caputo's sense. The method introduces a significant improvement in this field over existing techniques. Results obtained using the scheme presented here agree well with the analytical solutions and the numerical results presented in Ref [6]. However, the fundamental solutions of these equations still exhibit useful scaling properties that make them attractive for applications.

  20. Modified method of analysis for surgical correction of facial asymmetry

    OpenAIRE

    Christou, Terpsithea; Kau, Chung How; Waite, Peter D.; Kheir, Nadia Abou; Mouritsen, David

    2013-01-01

Introduction: The aim of this article was to present a new method of analysis using a three-dimensional (3D) model of an actual patient with facial asymmetry, for the assessment of her facial changes and the quantification of the deformity. This patient underwent orthodontic and surgical treatment to correct a severe facial asymmetry. Materials and Methods: The surgical procedure was complex and the case was challenging. The treatment procedure required an orthodontic approach followed by Le ...

  1. Methods and techniques for bio-system's materials behaviour analysis

    OpenAIRE

    MITU, LEONARD GABRIEL

    2014-01-01

In the context of the rapid development of research on biosystem structure materials, a representative direction is the analysis of the behavior of these materials. This direction of research requires various means and methods for theoretical and experimental measurement and evaluation. The PhD thesis "Methods and means for analyzing the behavior of biosystem structure materials" takes precedence in this area of research, aimed at studying the behavior of poly...

  2. Iteration Complexity Analysis of Block Coordinate Descent Methods

    OpenAIRE

    Hong, Mingyi; Wang, Xiangfeng; Razaviyayn, Meisam; Luo, Zhi-Quan

    2013-01-01

    In this paper, we provide a unified iteration complexity analysis for a family of general block coordinate descent (BCD) methods, covering popular methods such as the block coordinate gradient descent (BCGD) and the block coordinate proximal gradient (BCPG), under various different coordinate update rules. We unify these algorithms under the so-called Block Successive Upper-bound Minimization (BSUM) framework, and show that for a broad class of multi-block nonsmooth convex problems, all algor...
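A minimal sketch of the BCGD scheme the abstract covers: cyclically update one block of variables at a time with a gradient step while holding the other blocks fixed. The objective and step size are hypothetical, chosen only to make convergence visible.

```python
def block_coordinate_gradient_descent(grad_blocks, x0, step, iters):
    """Cyclic block coordinate gradient descent: in each sweep, every
    block takes one gradient step while all other blocks stay fixed."""
    x = [list(block) for block in x0]
    for _ in range(iters):
        for b, grad in enumerate(grad_blocks):
            g = grad(x)  # gradient of the objective w.r.t. block b
            x[b] = [xi - step * gi for xi, gi in zip(x[b], g)]
    return x

# Hypothetical smooth convex objective f(x, y) = (x - 1)^2 + (y + 2)^2,
# split into two one-dimensional blocks; minimizer is (1, -2).
grads = [
    lambda x: [2.0 * (x[0][0] - 1.0)],   # gradient w.r.t. block 0
    lambda x: [2.0 * (x[1][0] + 2.0)],   # gradient w.r.t. block 1
]
sol = block_coordinate_gradient_descent(grads, [[0.0], [0.0]], step=0.25, iters=50)
```

The iteration-complexity results in the paper bound how fast such sweeps approach the optimum for a much broader family of update rules.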

  3. ANALYSIS METHODS ON STABILITY OF TALL AND BEDDING CREEP SLOPE

    Institute of Scientific and Technical Information of China (English)

    RUI Yongqin; JIANG Zhiming; LIU Jinghui

    1995-01-01

    Based on a model of slope engineering geology, the creep and failure mechanism of tall and bedding slopes are analyzed in depth in this paper. The creep laws of weak intercalations are also discussed. The analysis of the stability of the creep slope and the age forecasting of the sliding slope have been conducted through numerical simulations using the Finite Element Method (FEM) and the Distinct Element Method (DEM).

  4. On the methods and examples of aircraft impact analysis

    International Nuclear Information System (INIS)

    Conclusions: Aircraft impact analysis can be performed today within feasible run times using PCs and available advanced commercial finite element software tools. Adequate element and material model technologies exist. Explicit time integration enables analysis of very large deformation Missile/Target impacts. Meshless/particle based methods may be beneficial for large deformation concrete “punching shear” analysis – potentially solves the “element erosion” problem associated with FE, but are not generally implemented yet in major commercial software. Verification of the complicated modeling technologies continues to be a challenge. Not much work has been done yet on ACI shock loading – redundant and physically separated safety trains key to success. Analysis approach and detail should be “balanced” - commensurate with the significant uncertainties - do not “over-do” details of some parts of the model (e.g., the plane) and the analysis

  5. Validation study of core analysis methods for full MOX BWR

    International Nuclear Information System (INIS)

    JNES has been developing a technical database used in reviewing validation of core analysis methods of LWRs in the coming occasions: (1) confirming the core safety parameters of the initial core (one-third MOX core) through a full MOX core in Oma Nuclear Power Plant, which is under construction, (2) licensing high-burnup MOX cores in the future and (3) reviewing topical reports on core analysis codes for safety design and evaluation. Based on the technical database, JNES will issue a guide for reviewing the core analysis methods used for safety design and evaluation of LWRs. The database will also be used for validating and improving the core analysis codes developed by JNES. JNES has progressed with the projects: (1) improving a Doppler reactivity analysis model in the Monte Carlo calculation code MVP, (2) a sensitivity study of nuclear cross section data on reactivity calculation of experimental cores composed of UO2 and MOX fuel rods, (3) analysis of isotopic composition data for UO2 and MOX fuels and (4) the guide for reviewing the core analysis codes and others. (author)

  6. Validation study of core analysis methods for full MOX BWR

    International Nuclear Information System (INIS)

    JNES has been developing a technical database used in reviewing validation of core analysis methods of LWRs in the coming occasions: (1) confirming the core safety parameters of the initial core (one-third MOX core) through a full MOX core in Oma Nuclear Power Plant, which is under construction, (2) licensing high-burnup MOX cores in the future and (3) reviewing topical reports on core analysis codes for safety design and evaluation. Based on the technical database, JNES will issue a guide for reviewing the core analysis methods used for safety design and evaluation of LWRs. The database will also be used for validating and improving the core analysis codes developed by JNES. JNES has progressed with the projects: (1) analysis of the measurement data of Doppler reactivity in experimental MOX cores simulating LWR cores, (2) measurements of isotopic compositions of fission product nuclides in high-burnup BWR UO2 fuels and the analysis of the measurement data, and (3) neutronics analysis of the experimental data that has been obtained in the international joint programs such as FUBILA and REBUS. (author)

  7. Further development of a static seismic analysis method for piping systems: The load coefficient method

    International Nuclear Information System (INIS)

    Currently the ASME Boiler and Pressure Vessel Code is considering a simplified static analysis method for seismic design of piping systems for incorporation into Appendix N of Section III, Division 1, of the Code. This proposed method, called the Load Coefficient Method, uses coefficients ranging from 0.4 to 1.0, times the peak value of the in-structure response spectra, with a static analysis technique to evaluate the response of piping systems to seismic events. The coefficient used is a function of the pipe support spacing and hence of the frequency response of the system; in general, the greater the support spacing, the lower the frequency, the lower the spectral response, and hence the lower the coefficient. The results of the Load Coefficient Method static analyses have been compared to analyses using the response spectrum modal analysis method. Reaction loads were also evaluated, with one important modification: a minimum support reaction load as a function of nominal pipe diameter has been established. This ensures that lightly loaded supports, regardless of the analytical method used, are designed to realistic values, eliminating the potential for under-designed supports. With respect to the accelerations applicable to inline components, a factor of 0.9 times the square root of the sum of squares (SRSS) of the horizontal floor spectra peaks was determined to envelop the horizontal accelerations, and a coefficient of 1.2 was shown to envelop the vertical accelerations. Presented in this paper are the current form of the Load Coefficient Method, a summary of the results of the over 2,700 benchmark analyses of piping system segments which form the basis for the acceptance of the method, and an explanation of the use of the method.
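The arithmetic of the method is simple enough to sketch. The values below are hypothetical (the actual coefficient would come from the Code's tables keyed to support spacing); the sketch shows the equivalent static load F = C * Sa_peak * m and the 0.9 * SRSS combination of horizontal floor-spectrum peaks described in the abstract.

```python
def static_seismic_load(mass_kg, peak_spectral_accel, coefficient):
    """Equivalent static seismic load F = C * Sa_peak * m,
    with C between 0.4 and 1.0 chosen from the support spacing
    (longer spans -> lower frequency -> lower C)."""
    assert 0.4 <= coefficient <= 1.0
    return coefficient * peak_spectral_accel * mass_kg

def srss(*components):
    """Square root of the sum of squares, used to combine the
    horizontal floor-spectrum peaks."""
    return sum(c * c for c in components) ** 0.5

# Hypothetical pipe segment: 120 kg tributary mass, 8 m/s^2 spectral peak.
F = static_seismic_load(120.0, 8.0, coefficient=0.6)  # newtons
# Horizontal acceleration for inline components: 0.9 * SRSS of the two
# horizontal floor-spectrum peaks (here 6 and 7 m/s^2, illustrative).
a_h = 0.9 * srss(6.0, 7.0)
```

The computed load would then be compared against the minimum support reaction load for the pipe's nominal diameter.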

  8. Multi-dye theranostic nanoparticle platform for bioimaging and cancer therapy

    Directory of Open Access Journals (Sweden)

    Singh AK

    2012-06-01

    Full Text Available Amit K Singh,1,2 Megan A Hahn,2 Luke G Gutwein,3 Michael C Rule,4 Jacquelyn A Knapik,5 Brij M Moudgil,1,2 Stephen R Grobmyer,3 Scott C Brown2,6; 1Department of Materials Science and Engineering, College of Engineering, 2Particle Engineering Research Center, College of Engineering, 3Division of Surgical Oncology, Department of Surgery, College of Medicine, 4Cell and Tissue Analysis Core, McKnight Brain Institute, 5Department of Pathology, College of Medicine, University of Florida, Gainesville, FL, USA; 6DuPont Central Research and Development, Corporate Center for Analytical Science, Wilmington, DE, USA. Background: Theranostic nanomaterials composed of fluorescent and photothermal agents can both image and provide a method of disease treatment in clinical oncology. For in vivo use, the near-infrared (NIR) window has been the focus of the majority of studies, because of greater light penetration due to lower absorption and scatter by biological components. Therefore, having both fluorescent and photothermal agents with optical properties in the NIR provides the best chance of improved theranostic capabilities utilizing nanotechnology. Methods: We developed nonplasmonic multi-dye theranostic silica nanoparticles (MDT-NPs), combining NIR fluorescence visualization and photothermal therapy within a single nanoconstruct comprised of molecular components. A modified NIR fluorescent heptamethine cyanine dye was covalently incorporated into a mesoporous silica matrix, and a hydrophobic metallo-naphthalocyanine dye with large molar absorptivity was loaded into the pores of these fluorescent particles. The imaging and therapeutic capabilities of these nanoparticles were demonstrated in vivo using a direct tumor injection model. Results: The fluorescent nanoparticles are bright probes (300-fold enhancement in quantum yield versus the free dye) that have a large Stokes shift (>110 nm). Incorporation of the naphthalocyanine dye and exposure to NIR laser excitation

  9. Meso-ester and carboxylic acid substituted BODIPYs with far-red and near-infrared emission for bioimaging applications

    KAUST Repository

    Ni, Yong

    2014-01-21

    A series of meso-ester-substituted BODIPY derivatives 1-6 are synthesized and characterized. In particular, dyes functionalized with oligo(ethylene glycol) ether styryl or naphthalene vinylene groups at the α positions of the BODIPY core (3-6) become partially soluble in water, and their absorptions and emissions are located in the far-red or near-infrared region. Three synthetic approaches are attempted to access the meso-carboxylic acid (COOH)-substituted BODIPYs 7 and 8 from the meso-ester-substituted BODIPYs. Two feasible synthetic routes are developed successfully, including one short route with only three steps. The meso-COOH-substituted BODIPY 7 is completely soluble in pure water, and its fluorescence maximum reaches around 650 nm with a fluorescence quantum yield of up to 15 %. Time-dependent density functional theory calculations are conducted to understand the structure-optical properties relationship, and it is revealed that the Stokes shift is dependent mainly on the geometric change from the ground state to the first excited singlet state. Furthermore, cell staining tests demonstrate that the meso-ester-substituted BODIPYs (1 and 3-6) and one of the meso-COOH-substituted BODIPYs (8) are very membrane-permeable. These features make these meso-ester- and meso-COOH-substituted BODIPY dyes attractive for bioimaging and biolabeling applications in living cells. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Using AIE Luminogen for Long-term and Low-background Three-Photon Microscopic Functional Bioimaging

    Science.gov (United States)

    Zhu, Zhenfeng; Leung, Chris W. T.; Zhao, Xinyuan; Wang, Yalun; Qian, Jun; Tang, Ben Zhong; He, Sailing

    2015-10-01

    Fluorescent probes are among the most popular bioimaging markers used to monitor metabolic processes of living cells. However, long-term light excitation leads to photobleaching of fluorescent probes, unavoidable autofluorescence, and photodamage to cells. To overcome these limitations, we synthesized a photostable luminogen named TPE-TPP with an aggregation-induced emission (AIE) characteristic, and achieved its three-photon imaging with femtosecond laser excitation at 1020 nm. Using TPE-TPP as the fluorescent probe, three-photon microscopy under 1020 nm excitation showed little photodamage and low autofluorescence in HeLa cells. Due to the AIE effect, the TPE-TPP nanoaggregates taken up by cells were resistant to photobleaching under three-photon excitation for an extended period of time. Furthermore, we demonstrated that for the present TPE-TPP AIE luminogen, three-photon microscopy (with 1020 nm excitation) had a better signal-to-noise ratio than two-photon microscopy (with 810 nm excitation) in tissue imaging.

  11. In situ bioimaging of hydrogen sulfide uncovers its pivotal role in regulating nitric oxide-induced lateral root formation.

    Directory of Open Access Journals (Sweden)

    Yan-Jun Li

    Full Text Available Hydrogen sulfide (H2S) is an important gasotransmitter in mammals. Despite the physiological changes induced in plants by the exogenous H2S donor NaHS, whether and how H2S works as a true cellular signal in plants needs to be examined. A self-developed specific fluorescent probe (WSP-1) was applied to track endogenous H2S in tomato (Solanum lycopersicum) roots in situ. Bioimaging combined with pharmacological and biochemical approaches was used to investigate the cross-talk among H2S, nitric oxide (NO), and Ca(2+) in regulating lateral root formation. Endogenous H2S accumulation was clearly associated with primordium initiation and lateral root emergence. The NO donor SNP stimulated the generation of endogenous H2S and the expression of the gene coding for the enzyme responsible for endogenous H2S synthesis. Scavenging H2S or inhibiting H2S synthesis partially blocked SNP-induced lateral root formation and the expression of lateral root-related genes. The stimulatory effect of SNP on Ca(2+) accumulation and CaM1 (calmodulin 1) expression could be abolished by inhibiting H2S synthesis. A Ca(2+) chelator or Ca(2+) channel blocker attenuated NaHS-induced lateral root formation. Our study confirmed the role of H2S as a cellular signal in plants, acting as a mediator between NO and Ca(2+) in regulating lateral root formation.

  12. Transuranic waste characterization sampling and analysis methods manual

    International Nuclear Information System (INIS)

    The Transuranic Waste Characterization Sampling and Analysis Methods Manual (Methods Manual) provides a unified source of information on the sampling and analytical techniques that enable Department of Energy (DOE) facilities to comply with the requirements established in the current revision of the Transuranic Waste Characterization Quality Assurance Program Plan (QAPP) for the Waste Isolation Pilot Plant (WIPP) Transuranic (TRU) Waste Characterization Program (the Program). This Methods Manual includes all of the testing, sampling, and analytical methodologies accepted by DOE for use in implementing the Program requirements specified in the QAPP

  13. Application of the IPEBS method to dynamic contingency analysis

    Energy Technology Data Exchange (ETDEWEB)

    Martins, A.C.B. [FURNAS, Rio de Janeiro, RJ (Brazil); Pedroso, A.S. [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil)

    1994-12-31

    Dynamic contingency analysis is certainly a demanding task in the context of dynamic performance evaluation. This paper presents the results of a test checking the contingency screening capability of the IPEBS method. A Brazilian 1100-bus, 112-generator system was used in the test; the ranking of the contingencies based on critical clearing times obtained with IPEBS was compared with the ranking derived from detailed time-domain simulation. The results of this comparison encourage us to recommend the use of the method in industry applications, on a complementary basis to the current method of time-domain simulation. (author) 5 refs., 1 fig., 2 tabs.
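The screening step the abstract describes reduces, once critical clearing times (CCTs) are in hand, to a severity ranking: the smaller the CCT, the more severe the contingency. A trivial sketch with hypothetical CCT values:

```python
def rank_contingencies(cct_by_contingency):
    """Rank contingencies by critical clearing time,
    most severe (smallest CCT) first."""
    return sorted(cct_by_contingency, key=cct_by_contingency.get)

# Hypothetical CCTs in seconds for three line-fault contingencies.
ccts = {"lineA": 0.35, "lineB": 0.12, "lineC": 0.20}
ranking = rank_contingencies(ccts)
```

Comparing this IPEBS-based ranking with the ranking from detailed time-domain simulation is exactly the test the paper reports.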

  14. Wing analysis using a transonic potential flow computational method

    Science.gov (United States)

    Henne, P. A.; Hicks, R. M.

    1978-01-01

    The ability of the method to compute wing transonic performance was determined by comparing computed results with both experimental data and results computed by other theoretical procedures. Both pressure distributions and aerodynamic forces were evaluated. Comparisons indicated that the method is a significant improvement in transonic wing analysis capability. In particular, the computational method generally calculated the correct development of three-dimensional pressure distributions from subcritical to transonic conditions. Complicated, multiple shocked flows observed experimentally were reproduced computationally. The ability to identify the effects of design modifications was demonstrated both in terms of pressure distributions and shock drag characteristics.

  15. Background and Mathematical Analysis of Diffusion MRI Methods.

    Science.gov (United States)

    Ozcan, Alpay; Wong, Kenneth H; Larson-Prior, Linda; Cho, Zang-Hee; Mun, Seong K

    2012-03-01

    The addition of a pair of magnetic field gradient pulses initially provided the measurement of spin motion with nuclear magnetic resonance (NMR) techniques. In the adaptation of DW-NMR techniques to magnetic resonance imaging (MRI), the taxonomy of mathematical models is divided into two categories: model matching and spectral methods. In this review, the methods are summarized starting from early diffusion-weighted (DW) NMR models, followed by their adaptation to DW MRI. Finally, a newly introduced Fourier-analysis-based unifying theory, so-called Complete Fourier Direct MRI, is included to explain the mechanisms of the existing methods.
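The workhorse model underlying most model-matching DW-MRI methods is the mono-exponential signal decay S(b) = S0 * exp(-b * D). A minimal sketch of the two-point estimate of the apparent diffusion coefficient, using synthetic numbers rather than anything from the review:

```python
import math

def adc_two_point(s1, s2, b1, b2):
    """Apparent diffusion coefficient from the mono-exponential model
    S(b) = S0 * exp(-b * D), estimated from two b-values:
    D = ln(S1 / S2) / (b2 - b1)."""
    return math.log(s1 / s2) / (b2 - b1)

# Synthetic signals generated with D = 1.0e-3 mm^2/s and S0 = 100.
b1, b2 = 0.0, 1000.0
s1 = 100.0 * math.exp(-b1 * 1.0e-3)
s2 = 100.0 * math.exp(-b2 * 1.0e-3)
d = adc_two_point(s1, s2, b1, b2)
```

Spectral methods, by contrast, characterize the displacement distribution without committing to this single-exponent form.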

  16. Numerical analysis of jet breakup behavior using particle method

    International Nuclear Information System (INIS)

    A continuous jet changes to droplets where jet breakup occurs. In this study, two-dimensional numerical analysis of jet breakup is performed using the MPS method (Moving Particle Semi-implicit Method) which is a particle method for incompressible flows. The continuous fluid surrounding the jet is neglected. Dependencies of the jet breakup length on the Weber number and the Froude number agree with the experiment. The size distribution of droplets is in agreement with the Nukiyama-Tanasawa distribution which has been widely used as an experimental correlation. Effects of the Weber number and the Froude number on the size distribution are also obtained. (author)

  19. NUMERICAL ANALYSIS ON BINOMIAL TREE METHODS FOR AMERICAN LOOKBACK OPTIONS

    Institute of Scientific and Technical Information of China (English)

    戴民

    2001-01-01

    Lookback options are path-dependent options. In general, binomial tree methods, as the most popular approaches to pricing options, involve a path-dependent variable as well as the underlying asset price for lookback options. However, for floating strike lookback options, a single-state-variable binomial tree method can be constructed. This paper is devoted to the convergence analysis of the single-state binomial tree methods for both discretely and continuously monitored American floating strike lookback options. We also investigate some properties of such options, including the effects of expiration date, interest rate and dividend yield on option prices, and properties of optimal exercise boundaries.
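
    The payoff being priced can be illustrated by brute force (this is not the paper's single-state lattice method): the value of a discretely monitored European floating-strike lookback call is computed by enumerating every path of a CRR binomial tree. The function name and all parameter values are our own.

```python
import math
from itertools import product

def lookback_floating_call_bruteforce(S0, r, sigma, T, n):
    """Price a European floating-strike lookback call (payoff S_T - min S_t)
    by enumerating all 2^n paths of a CRR binomial tree.
    Illustrative only: exponential cost, unlike a single-state lattice."""
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt))
    d = 1.0 / u
    p = (math.exp(r * dt) - d) / (u - d)      # risk-neutral up probability
    price = 0.0
    for path in product((0, 1), repeat=n):    # 1 = up move, 0 = down move
        S, Smin, prob = S0, S0, 1.0
        for step in path:
            S *= u if step else d
            Smin = min(Smin, S)
            prob *= p if step else (1.0 - p)
        price += prob * (S - Smin)            # path-dependent payoff
    return math.exp(-r * T) * price
```

For example, `lookback_floating_call_bruteforce(100.0, 0.05, 0.2, 1.0, 10)` enumerates 1024 paths; the single-state-variable lattice achieves the same discretely monitored price in polynomial time.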

  20. Transuranic waste characterization sampling and analysis methods manual

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-05-01

    The Transuranic Waste Characterization Sampling and Analysis Methods Manual (Methods Manual) provides a unified source of information on the sampling and analytical techniques that enable Department of Energy (DOE) facilities to comply with the requirements established in the current revision of the Transuranic Waste Characterization Quality Assurance Program Plan (QAPP) for the Waste Isolation Pilot Plant (WIPP) Transuranic (TRU) Waste Characterization Program (the Program). This Methods Manual includes all of the testing, sampling, and analytical methodologies accepted by DOE for use in implementing the Program requirements specified in the QAPP.

  1. Scientific and methodical approaches to analysis of enterprise development potential

    Directory of Open Access Journals (Sweden)

    Hrechina Iryna V.

    2014-01-01

    Full Text Available The modern state of the Ukrainian economy urges enterprises to search for new development possibilities, which makes the subject of this study topical. The article systematises existing approaches to analysing the potential of enterprise development and marks out two main scientific approaches: the first is directed at analysing the prospects of self-development of the economic system; the second at analysing the probability of growth opportunities. To improve the process of forming methods for analysing enterprise development potential, the article offers an organisational model of such methods and characterises its main elements. It develops analysis methods based on indicators of potentialogical sustainability. The scientific novelty of the obtained results lies in the possibility of identifying the main directions of enterprise development with the use of an enterprise development potential ratio: self-development or the probability of augmenting opportunities, traced through the interconnection of resources and profit.

  2. Finite strip method combined with other numerical methods for the analysis of plates

    Science.gov (United States)

    Cheung, M. S.; Li, Wenchang

    1992-09-01

    Finite plate strips are combined with finite elements or boundary elements in the analysis of rectangular plates with some minor irregularities such as openings, skew edges, etc. The plate is divided into regular and irregular regions. The regular region is analyzed by the finite strip method while the irregular one is analyzed by the finite element or boundary element method. A special transition element and strip are developed in order to connect the two regions. Numerical examples show the accuracy and efficiency of this combined analysis.

  3. Fast nonlinear regression method for CT brain perfusion analysis.

    Science.gov (United States)

    Bennink, Edwin; Oosterbroek, Jaap; Kudo, Kohsuke; Viergever, Max A; Velthuis, Birgitta K; de Jong, Hugo W A M

    2016-04-01

    Although computed tomography (CT) perfusion (CTP) imaging enables rapid diagnosis and prognosis of ischemic stroke, current CTP analysis methods have several shortcomings. We propose a fast nonlinear regression method with a box-shaped model (boxNLR) that has important advantages over the current state-of-the-art method, block-circulant singular value decomposition (bSVD). These advantages include improved robustness to attenuation curve truncation, extensibility, and unified estimation of perfusion parameters. The method is compared with bSVD and with a commercial SVD-based method. The three methods were quantitatively evaluated by means of a digital perfusion phantom, described by Kudo et al. and qualitatively with the aid of 50 clinical CTP scans. All three methods yielded high Pearson correlation coefficients ([Formula: see text]) with the ground truth in the phantom. The boxNLR perfusion maps of the clinical scans showed higher correlation with bSVD than the perfusion maps from the commercial method. Furthermore, it was shown that boxNLR estimates are robust to noise, truncation, and tracer delay. The proposed method provides a fast and reliable way of estimating perfusion parameters from CTP scans. This suggests it could be a viable alternative to current commercial and academic methods.
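
    The box-shaped model idea can be sketched as follows (a toy version, not the authors' boxNLR solver): the tissue curve is modeled as the AIF convolved with a box-shaped residue function of height CBF and width MTT, and the two parameters are recovered by nonlinear least squares, here via a coarse grid search standing in for a Levenberg-Marquardt style solver. The AIF shape and all parameter values are invented.

```python
import numpy as np

dt = 1.0
t = np.arange(40) * dt                     # seconds
aif = (t / 4.0) ** 3 * np.exp(-t / 1.5)    # gamma-variate-like AIF (assumed)

def tissue_curve(cbf, mtt):
    """Forward model: AIF convolved with a box residue of height cbf, width mtt."""
    box = cbf * (t < mtt)
    return dt * np.convolve(aif, box)[: len(t)]

cbf_true, mtt_true = 0.02, 4.0
measured = tissue_curve(cbf_true, mtt_true)   # noise-free synthetic data

# Least-squares fit over a parameter grid that contains the true values
grid = [(c, m) for c in np.linspace(0.005, 0.05, 10)
               for m in np.arange(1.0, 9.0, 1.0)]
cbf_est, mtt_est = min(grid,
                       key=lambda p: np.sum((measured - tissue_curve(*p)) ** 2))
```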

  4. METHODS ADVANCEMENT FOR MILK ANALYSIS: THE MAMA STUDY

    Science.gov (United States)

    The Methods Advancement for Milk Analysis (MAMA) study was designed by US EPA and CDC investigators to provide data to support the technological and study design needs of the proposed National Children's Study (NCS). The NCS is a multi-Agency-sponsored study, authorized under the...

  5. Classification of analysis methods for characterization of magnetic nanoparticle properties

    DEFF Research Database (Denmark)

    Posth, O.; Hansen, Mikkel Fougt; Steinhoff, U.;

    2015-01-01

    The aim of this paper is to provide a roadmap for the standardization of magnetic nanoparticle (MNP) characterization. We have assessed common MNP analysis techniques under various criteria in order to define the methods that can be used as either standard techniques for magnetic particle charact...

  6. CHROMATOGRAPHIC METHODS IN THE ANALYSIS OF CHOLESTEROL AND RELATED LIPIDS

    NARCIS (Netherlands)

    HOVING, EB

    1995-01-01

    Methods using thin-layer chromatography, solid-phase extraction, gas chromatography, high-performance liquid chromatography and supercritical fluid chromatography are described for the analysis of single cholesterol, esterified and sulfated cholesterol, and for cholesterol in the context of other li

  7. Single corn kernel aflatoxin B1 extraction and analysis method

    Science.gov (United States)

    Aflatoxins are highly carcinogenic compounds produced by the fungus Aspergillus flavus. Aspergillus flavus is a phytopathogenic fungus that commonly infects crops such as cotton, peanuts, and maize. The goal was to design an effective sample preparation method and analysis for the extraction of afla...

  8. Transuranic waste characterization sampling and analysis methods manual. Revision 1

    International Nuclear Information System (INIS)

    This Methods Manual provides a unified source of information on the sampling and analytical techniques that enable Department of Energy (DOE) facilities to comply with the requirements established in the current revision of the Transuranic Waste Characterization Quality Assurance Program Plan (QAPP) for the Waste Isolation Pilot Plant (WIPP) Transuranic (TRU) Waste Characterization Program (the Program) and the WIPP Waste Analysis Plan. This Methods Manual includes all of the testing, sampling, and analytical methodologies accepted by DOE for use in implementing the Program requirements specified in the QAPP and the WIPP Waste Analysis Plan. The procedures in this Methods Manual are comprehensive and detailed and are designed to provide the necessary guidance for the preparation of site-specific procedures. With some analytical methods, such as Gas Chromatography/Mass Spectrometry, the Methods Manual procedures may be used directly. With other methods, such as nondestructive characterization, the Methods Manual provides guidance rather than a step-by-step procedure. Sites must meet all of the specified quality control requirements of the applicable procedure. Each DOE site must document the details of the procedures it will use and demonstrate the efficacy of such procedures to the Manager, National TRU Program Waste Characterization, during Waste Characterization and Certification audits

  9. Transuranic waste characterization sampling and analysis methods manual. Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Suermann, J.F.

    1996-04-01

    This Methods Manual provides a unified source of information on the sampling and analytical techniques that enable Department of Energy (DOE) facilities to comply with the requirements established in the current revision of the Transuranic Waste Characterization Quality Assurance Program Plan (QAPP) for the Waste Isolation Pilot Plant (WIPP) Transuranic (TRU) Waste Characterization Program (the Program) and the WIPP Waste Analysis Plan. This Methods Manual includes all of the testing, sampling, and analytical methodologies accepted by DOE for use in implementing the Program requirements specified in the QAPP and the WIPP Waste Analysis Plan. The procedures in this Methods Manual are comprehensive and detailed and are designed to provide the necessary guidance for the preparation of site-specific procedures. With some analytical methods, such as Gas Chromatography/Mass Spectrometry, the Methods Manual procedures may be used directly. With other methods, such as nondestructive characterization, the Methods Manual provides guidance rather than a step-by-step procedure. Sites must meet all of the specified quality control requirements of the applicable procedure. Each DOE site must document the details of the procedures it will use and demonstrate the efficacy of such procedures to the Manager, National TRU Program Waste Characterization, during Waste Characterization and Certification audits.

  10. Collage Portraits as a Method of Analysis in Qualitative Research

    Directory of Open Access Journals (Sweden)

    Paula Gerstenblatt PhD

    2013-02-01

    Full Text Available This article explores the use of collage portraits in qualitative research and analysis. Collage portraiture, an area of arts-based research (ABR, is gaining stature as a method of analysis and documentation in many disciplines. This article presents a method of creating collage portraits to support a narrative thematic analysis that explored the impact of participation in an art installation construction. Collage portraits provide the opportunity to include marginalized voices and encourage a range of linguistic and non-linguistic representations to articulate authentic lived experiences. Other potential benefits to qualitative research are cross-disciplinary study and collaboration, innovative ways to engage and facilitate dialogue, and the building and dissemination of knowledge.

  11. Intelligent classification methods of grain kernels using computer vision analysis

    International Nuclear Information System (INIS)

    In this paper, a digital image analysis method was developed to classify seven kinds of individual grain kernels (common rice, glutinous rice, rough rice, brown rice, buckwheat, common barley and glutinous barley) widely planted in Korea. A total of 2800 color images of individual grain kernels were acquired as a data set. Seven color and ten morphological features were extracted and processed by linear discriminant analysis to improve the efficiency of the identification process. The output features from linear discriminant analysis were used as input to the four-layer back-propagation network to classify different grain kernel varieties. The data set was divided into three groups: 70% for training, 20% for validation, and 10% for testing the network. The classification experimental results show that the proposed method is able to classify the grain kernel varieties efficiently
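
    The discriminant-analysis step can be sketched on synthetic data (three made-up Gaussian classes standing in for the seventeen color and morphological features; a nearest-centroid rule in the discriminant space replaces the back-propagation network):

```python
import numpy as np

rng = np.random.default_rng(2)
means = np.array([[0.0, 0.0, 0.0], [5.0, 5.0, 0.0], [0.0, 5.0, 5.0]])
X = np.vstack([rng.normal(m, 1.0, size=(60, 3)) for m in means])
y = np.repeat([0, 1, 2], 60)

# Fisher LDA: maximize between-class over within-class scatter
mu = X.mean(axis=0)
Sw = sum(np.cov(X[y == c].T, bias=True) * (y == c).sum() for c in range(3))
Sb = sum((y == c).sum() * np.outer(X[y == c].mean(0) - mu,
                                   X[y == c].mean(0) - mu) for c in range(3))
evals, evecs = np.linalg.eig(np.linalg.solve(Sw, Sb))
W = np.real(evecs[:, np.argsort(evals.real)[::-1][:2]])   # top 2 directions

# Nearest class centroid in the discriminant space
Z = X @ W
centroids = np.array([Z[y == c].mean(axis=0) for c in range(3)])
pred = np.argmin(((Z[:, None, :] - centroids[None, :, :]) ** 2).sum(-1), axis=1)
accuracy = (pred == y).mean()
```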

  12. Fourier analysis for discontinuous Galerkin and related methods

    Institute of Scientific and Technical Information of China (English)

    ZHANG MengPing; SHU Chi-Wang

    2009-01-01

    In this paper we review a series of recent work on using a Fourier analysis technique to study the stability and error estimates for the discontinuous Galerkin method and other related schemes. The advantage of this approach is that it can reveal instability of certain "bad" schemes; it can verify stability for certain good schemes which are not easily amenable to standard finite element stability analysis techniques; it can provide quantitative error comparisons among different schemes; and it can be used to study superconvergence and time evolution of errors for the discontinuous Galerkin method. We will briefly describe this Fourier analysis technique, summarize its usage in stability and error estimates for various schemes, and indicate the advantages and disadvantages of this technique in comparison with other finite element techniques.
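
    The technique can be demonstrated on the simplest possible target (first-order upwind for the linear advection equation u_t + a u_x = 0, not a DG scheme): substituting the Fourier mode u_j^n = g^n e^{ijθ} into the update yields an amplification factor whose modulus decides stability. The CFL number below is assumed.

```python
import numpy as np

lam = 0.5                                     # CFL number a*dt/dx (assumed)
theta = np.linspace(0.0, 2.0 * np.pi, 721)    # all Fourier wavenumbers
g = 1.0 - lam * (1.0 - np.exp(-1j * theta))   # amplification factor g(theta)
max_growth = np.abs(g).max()                  # <= 1 for all theta means stable
```

Here |g(θ)|² = 1 − 2λ(1−λ)(1−cos θ), so the scheme is stable exactly for 0 ≤ λ ≤ 1; the review applies the same mode substitution to far less obvious schemes.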

  13. Transient Analysis of Hysteresis Queueing Model Using Matrix Geometric Method

    Directory of Open Access Journals (Sweden)

    Wajiha Shah

    2011-10-01

    Full Text Available Various analytical methods have been proposed for the transient analysis of a queueing system in the scalar domain. In this paper, a vector-domain-based transient analysis is proposed for the hysteresis queueing system with internal thresholds for efficient and numerically stable analysis. In this system, the arrival rate of customers is controlled through the internal thresholds, and the system is analyzed as a quasi-birth-and-death process through the matrix geometric method combined with a vector-form Runge-Kutta numerical procedure which utilizes the special matrices. The arrival and service processes of the system follow a Markovian distribution. We analyze the mean number of customers in the system in the transient state against varying time for a Markovian distribution. The results show that the effect of oscillation/hysteresis depends on the difference between the two internal threshold values.
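
    A stripped-down sketch of the transient approach (a plain M/M/1/K queue without the paper's hysteresis thresholds or matrix-geometric structure): integrate the Kolmogorov forward equations p' = pQ with the classical Runge-Kutta scheme and read off the transient mean number of customers. All rates are invented.

```python
import numpy as np

K, lam, mu = 10, 0.8, 1.0                  # capacity, arrival, service rates
Q = np.zeros((K + 1, K + 1))               # generator of the birth-death chain
for n in range(K + 1):
    if n < K:
        Q[n, n + 1] = lam                  # arrival
    if n > 0:
        Q[n, n - 1] = mu                   # service completion
    Q[n, n] = -Q[n].sum()                  # rows sum to zero

def rhs(p):
    return p @ Q                           # forward equations p' = p Q

p = np.zeros(K + 1)
p[0] = 1.0                                 # start from an empty system
dt, T = 0.01, 20.0
for _ in range(int(T / dt)):               # classical RK4 time stepping
    k1 = rhs(p)
    k2 = rhs(p + dt / 2 * k1)
    k3 = rhs(p + dt / 2 * k2)
    k4 = rhs(p + dt * k3)
    p = p + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

mean_customers = float(np.arange(K + 1) @ p)
```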

  14. Using discriminant analysis as a nucleation event classification method

    Directory of Open Access Journals (Sweden)

    S. Mikkonen

    2006-01-01

    Full Text Available More than three years of measurements of aerosol size distribution and different gas and meteorological parameters made in the Po Valley, Italy were analysed in this study to examine which of the meteorological and trace gas variables affect the emergence of nucleation events. As the analysis method, we used discriminant analysis with the non-parametric Epanechnikov kernel, included in a non-parametric density estimation method. The best classification result for our data was reached with the combination of relative humidity, ozone concentration and a third-degree polynomial of radiation. RH appeared to have a preventing effect on new particle formation, whereas the effects of O3 and radiation were more conducive. The concentrations of SO2 and NO2 also appeared to have a significant effect on the emergence of nucleation events, but because of the great number of missing observations, we had to exclude them from the final analysis.
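
    The kernel discriminant idea can be sketched with a single predictor (toy data, not the Po Valley measurements): estimate class-conditional densities with the Epanechnikov kernel and assign each new observation to the class with the higher estimated density. Class centers and the bandwidth are assumptions.

```python
import numpy as np

def epanechnikov_kde(x, data, h):
    """Kernel density estimate at points x with Epanechnikov kernel,
    K(u) = 0.75*(1 - u^2) for |u| < 1, bandwidth h."""
    u = (x - data[:, None]) / h
    k = np.where(np.abs(u) < 1.0, 0.75 * (1.0 - u ** 2), 0.0)
    return k.mean(axis=0) / h

rng = np.random.default_rng(1)
event_rh = rng.normal(40.0, 5.0, 200)      # RH on hypothetical event days
nonevent_rh = rng.normal(70.0, 5.0, 200)   # RH on hypothetical non-event days
h = 5.0

x_new = np.array([35.0, 45.0, 68.0, 75.0])
is_event = epanechnikov_kde(x_new, event_rh, h) > \
           epanechnikov_kde(x_new, nonevent_rh, h)
```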

  15. Research on Visual Analysis Methods of Terrorism Events

    Science.gov (United States)

    Guo, Wenyue; Liu, Haiyan; Yu, Anzhu; Li, Jing

    2016-06-01

    With terrorism events occurring more and more frequently throughout the world, improving the response capability to social security incidents has become an important test of a government's governing ability. Visual analysis has become an important method of event analysis for its advantages of being intuitive and effective. To analyse events' spatio-temporal distribution characteristics, correlations among event items and development trends, terrorism events' spatio-temporal characteristics are discussed. A suitable event data table structure based on the "5W" theory is designed. Then, six types of visual analysis are proposed, and how to use thematic maps and statistical charts to realize visual analysis of terrorism events is studied. Finally, experiments have been carried out using the data provided by the Global Terrorism Database, and the results prove the feasibility of the methods.

  16. Template matching method for the analysis of interstellar cloud structure

    CERN Document Server

    Juvela, M

    2016-01-01

    The structure of the interstellar medium can be characterised at large scales in terms of its global statistics (e.g. power spectra) and at small scales by the properties of individual cores. Interest has been increasing in structures at intermediate scales, which has resulted in a number of methods being developed for the analysis of filamentary structures. We describe the application of the generic template-matching (TM) method to the analysis of maps. Our aim is to show that it provides a fast and still relatively robust way to identify elongated structures or other image features. We present the implementation of a TM algorithm for map analysis. The results are compared against the rolling Hough transform (RHT), one of the methods previously used to identify filamentary structures. We illustrate the method by applying it to Herschel surface brightness data. The performance of the TM method is found to be comparable to that of RHT, but TM appears to be more robust regarding the input parameters, for example, those related t...
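
    A toy version of the TM idea (our own sketch, not the authors' implementation): correlate an image patch against unit-norm bar templates at several orientations and keep the best-matching angle. The patch size and angle grid are assumptions.

```python
import numpy as np

size = 15
yy, xx = np.mgrid[:size, :size] - size // 2

def bar_template(angle_deg):
    """Unit-norm, zero-mean bar: pixels within 1 unit of a line through
    the patch center at the given orientation."""
    a = np.deg2rad(angle_deg)
    dist = np.abs(-np.sin(a) * xx + np.cos(a) * yy)  # distance to the line
    t = (dist <= 1.0).astype(float)
    t -= t.mean()
    return t / np.linalg.norm(t)

image = bar_template(0.0)            # a horizontal bar as the observed patch
angles = [0.0, 30.0, 60.0, 90.0, 120.0, 150.0]
responses = [float((image * bar_template(a)).sum()) for a in angles]
best_angle = angles[int(np.argmax(responses))]
```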

  17. A method for rapid similarity analysis of RNA secondary structures

    Directory of Open Access Journals (Sweden)

    Liu Na

    2006-11-01

    Full Text Available Abstract Background Owing to the rapid expansion of RNA structure databases in recent years, efficient methods for structure comparison are in demand for function prediction and evolutionary analysis. Usually, the similarity of RNA secondary structures is evaluated based on tree models and dynamic programming algorithms. We present here a new method for the similarity analysis of RNA secondary structures. Results Three sets of real data have been used as input for the example applications. Set I includes the structures from 5S rRNAs. Set II includes the secondary structures from RNase P and RNase MRP. Set III includes the structures from 16S rRNAs. Reasonable phylogenetic trees are derived for these three sets of data by using our method. Moreover, our program runs faster as compared to some existing ones. Conclusion The famous Lempel-Ziv algorithm can efficiently extract the information on repeated patterns encoded in RNA secondary structures and makes our method an alternative to analyze the similarity of RNA secondary structures. This method will also be useful to researchers who are interested in evolutionary analysis.
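
    The Lempel-Ziv idea can be sketched directly on dot-bracket strings. The complexity function below is the classic Lempel-Ziv (1976) phrase count, and the distance normalization is one common choice, not necessarily the paper's exact formula.

```python
def lz_complexity(s):
    """Number of phrases in the Lempel-Ziv exhaustive parsing of s."""
    i, c, n = 0, 0, len(s)
    while i < n:
        l = 1
        # extend the phrase while it can still be copied from earlier history
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        c += 1
        i += l
    return c

def lz_distance(x, y):
    """Normalized LZ-based dissimilarity: 0 for identical-looking strings."""
    cx, cy, cxy = lz_complexity(x), lz_complexity(y), lz_complexity(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

hairpin = "(((...)))"      # toy dot-bracket structures
bulge = "..((....))"
```

Repeated patterns in the concatenation x+y add few new phrases when the two structures share motifs, so similar structures get small distances.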

  18. Precise methods for conducted EMI modeling,analysis,and prediction

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Focusing on the state-of-the-art conducted EMI prediction, this paper presents a noise source lumped circuit modeling and identification method, an EMI modeling method based on multiple slope approximation of switching transitions, and a double Fourier integral method modeling PWM conversion units to achieve an accurate modeling of EMI noise source. Meanwhile, a new sensitivity analysis method, a general coupling model for steel ground loops, and a partial element equivalent circuit method are proposed to identify and characterize conducted EMI coupling paths. The EMI noise and propagation modeling provide an accurate prediction of conducted EMI in the entire frequency range (0–10 MHz) with good practicability and generality. Finally a new measurement approach is presented to identify the surface current of large dimensional metal shell. The proposed analytical modeling methodology is verified by experimental results.

  19. Time series analysis by the Maximum Entropy method

    Energy Technology Data Exchange (ETDEWEB)

    Kirk, B.L.; Rust, B.W.; Van Winkle, W.

    1979-01-01

    The principal subject of this report is the use of the Maximum Entropy method for spectral analysis of time series. The classical Fourier method is also discussed, mainly as a standard for comparison with the Maximum Entropy method. Examples are given which clearly demonstrate the superiority of the latter method over the former when the time series is short. The report also includes a chapter outlining the theory of the method, a discussion of the effects of noise in the data, a chapter on significance tests, a discussion of the problem of choosing the prediction filter length, and, most importantly, a description of a package of FORTRAN subroutines for making the various calculations. Cross-referenced program listings are given in the appendices. The report also includes a chapter demonstrating the use of the programs by means of an example. Real time series like the lynx data and sunspot numbers are also analyzed. 22 figures, 21 tables, 53 references.
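
    A common implementation of Maximum Entropy spectral analysis is Burg's recursion for the autoregressive coefficients; the sketch below (our own, not the report's FORTRAN package) shows it resolving a sinusoid from a short noisy record, where a short-window Fourier spectrum would smear the line. Signal parameters and model order are assumptions.

```python
import numpy as np

def burg_ar(x, order):
    """Burg's method: AR coefficients a (a[0] = 1) and driving noise power E."""
    f = np.array(x, dtype=float)    # forward prediction errors
    b = f.copy()                    # backward prediction errors
    a = np.array([1.0])
    E = np.dot(f, f) / len(f)
    for _ in range(order):
        ff, bb = f[1:], b[:-1]
        k = -2.0 * np.dot(ff, bb) / (np.dot(ff, ff) + np.dot(bb, bb))
        f, b = ff + k * bb, bb + k * ff        # update error sequences
        a = np.concatenate([a, [0.0]])
        a = a + k * a[::-1]                    # Levinson-style update
        E *= 1.0 - k * k
    return a, E

rng = np.random.default_rng(3)
t = np.arange(64)                              # deliberately short record
x = np.sin(2 * np.pi * 0.125 * t) + 0.05 * rng.normal(size=64)
a, E = burg_ar(x, order=4)

# Maximum Entropy spectrum P(f) = E / |A(e^{i 2 pi f})|^2
freqs = np.linspace(0.0, 0.5, 1001)
denom = np.abs(np.exp(-2j * np.pi * np.outer(freqs, np.arange(len(a)))) @ a) ** 2
peak = freqs[np.argmax(E / denom)]
```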

  20. Nonlinear fault diagnosis method based on kernel principal component analysis

    Institute of Scientific and Technical Information of China (English)

    Yan Weiwu; Zhang Chunkai; Shao Huihe

    2005-01-01

    To ensure that a system runs in working order, detection and diagnosis of faults play an important role in industrial processes. This paper proposes a nonlinear fault diagnosis method based on kernel principal component analysis (KPCA). In the proposed method, using essential information of the nonlinear system extracted by KPCA, we construct a KPCA model of the nonlinear system under normal working conditions. New data are then projected onto the KPCA model. When new data are incompatible with the KPCA model, it can be concluded that the nonlinear system is out of normal working condition. The proposed method was applied to fault diagnosis of rolling bearings. Simulation results show that the proposed method provides an effective approach for fault detection and diagnosis of nonlinear systems.
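
    A minimal NumPy-only sketch of KPCA-based monitoring (our own reduced version, not the authors' implementation): build an RBF-kernel PCA model from normal operating data, then score new samples with a feature-space residual (SPE-style) statistic; samples from a faulty regime should score higher. The data, kernel width, and number of components are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(100, 2))          # normal operating data
gamma, n, p = 0.5, len(X), 5                     # kernel width, samples, PCs

def rbf_kernel(A, B):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

K = rbf_kernel(X, X)
one = np.ones((n, n)) / n
Kc = K - one @ K - K @ one + one @ K @ one       # centered kernel matrix
w, V = np.linalg.eigh(Kc)
w, V = w[::-1][:p], V[:, ::-1][:, :p]            # top-p eigenpairs
alphas = V / np.sqrt(w)                          # normalized dual vectors

def spe(x):
    """Residual of x in feature space after removing the p principal parts."""
    kx = rbf_kernel(x[None, :], X)[0]
    ktc = kx - K.mean(axis=0) - kx.mean() + K.mean()   # centered cross-kernel
    z = alphas.T @ ktc                           # scores on the p components
    d2 = 1.0 - 2.0 * kx.mean() + K.mean()        # distance to feature mean
    return d2 - np.dot(z, z)

normal_spe = np.mean([spe(x) for x in rng.normal(0, 1, size=(50, 2))])
fault_spe = np.mean([spe(x) for x in rng.normal(6, 1, size=(50, 2))])
```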

  1. Cytotoxicity and fluorescence studies of silica-coated CdSe quantum dots for bioimaging applications

    Energy Technology Data Exchange (ETDEWEB)

    Vibin, Muthunayagam [University of Kerala, Department of Biochemistry (India); Vinayakan, Ramachandran [National Institute for Interdisciplinary Science and Technology (CSIR), Photosciences and Photonics (India); John, Annie [Sree Chitra Tirunal Institute of Medical Sciences and Technology, Biomedical Technology Wing (India); Raji, Vijayamma; Rejiya, Chellappan S.; Vinesh, Naresh S.; Abraham, Annie, E-mail: annieab2@yahoo.co.in [University of Kerala, Department of Biochemistry (India)

    2011-06-15

    The toxicological effects of silica-coated CdSe quantum dots (QDs) were investigated systematically on human cervical cancer cell line. Trioctylphosphine oxide capped CdSe QDs were synthesized and rendered water soluble by overcoating with silica, using aminopropyl silane as silica precursor. The cytotoxicity studies were conducted by exposing cells to freshly synthesized QDs as a function of time (0-72 h) and concentration up to micromolar level by Lactate dehydrogenase assay, MTT [3-(4,5-Dimethylthiazol-2-yl)-2,5-Diphenyltetrazolium Bromide] assay, Neutral red cell viability assay, Trypan blue dye exclusion method and morphological examination of cells using phase contrast microscope. The in vitro analysis results showed that the silica-coated CdSe QDs were nontoxic even at higher loadings. Subsequently the in vivo fluorescence was also demonstrated by intravenous administration of the QDs in Swiss albino mice. The fluorescence images in the cryosections of tissues depicted strong luminescence property of silica-coated QDs under biological conditions. These results confirmed the role of these luminescent materials in biological labeling and imaging applications.

  2. Development of evaluation method for software safety analysis techniques

    International Nuclear Information System (INIS)

    Full text: Following the massive adoption of digital Instrumentation and Control (I and C) systems for nuclear power plants (NPP), various Software Safety Analysis (SSA) techniques are used to evaluate NPP safety when adopting an appropriate digital I and C system, and then to reduce risk to an acceptable level. However, each technique has its specific advantages and disadvantages. If two or more techniques can be complementarily incorporated, the SSA combination would be more acceptable. As a result, if proper evaluation criteria are available, the analyst can choose an appropriate technique combination to perform analysis on the basis of available resources. This research evaluated the software safety analysis techniques applicable nowadays, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flowgraph Methodology (DFM), and simulation-based model analysis, and then determined indexes in view of their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. These indexes may help decision makers and software safety analysts choose the best SSA combination and arrange their own software safety plans. By the proposed method, analysts can evaluate various SSA combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (without transfer rate) combination is not competitive due to the dilemma of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for understanding the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio. However, their disadvantages lie in completeness and complexity.
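
    The index-based selection can be sketched as a simple weighted decision matrix; every weight and score below is invented for illustration, since the abstract does not publish such a table.

```python
# Indexes named in the abstract; weights and 1-5 scores are hypothetical.
indexes = ["dynamic capability", "completeness", "achievability",
           "detail", "signal/noise ratio", "complexity", "implementation cost"]
weights = [0.2, 0.2, 0.15, 0.15, 0.1, 0.1, 0.1]

combos = {
    "PHA+FMEA+FTA+Markov": [2, 3, 2, 4, 3, 2, 2],   # hypothetical scores
    "DFM+simulation":      [5, 4, 4, 4, 4, 2, 3],
}
scores = {name: sum(w * s for w, s in zip(weights, vals))
          for name, vals in combos.items()}
best = max(scores, key=scores.get)
```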

  3. Integrated Data Collection Analysis (IDCA) Program - SSST Testing Methods

    Energy Technology Data Exchange (ETDEWEB)

    Sandstrom, Mary M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brown, Geoffrey W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Daniel N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Pollard, Colin J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Warner, Kirstin F. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Remmers, Daniel L. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Sorensen, Daniel N. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Whinnery, LeRoy L. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Phillips, Jason J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Shelley, Timothy J. [Bureau of Alcohol, Tobacco and Firearms (ATF), Huntsville, AL (United States); Reyes, Jose A. [Applied Research Associates, Tyndall AFB, FL (United States); Hsu, Peter C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Reynolds, John G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-03-25

    The Integrated Data Collection Analysis (IDCA) program is conducting a proficiency study for Small- Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are the methods used for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis during the IDCA program. These methods changed throughout the Proficiency Test and the reasons for these changes are documented in this report. The most significant modifications in standard testing methods are: 1) including one specified sandpaper in impact testing among all the participants, 2) diversifying liquid test methods for selected participants, and 3) including sealed sample holders for thermal testing by at least one participant. This effort, funded by the Department of Homeland Security (DHS), is putting the issues of safe handling of these materials in perspective with standard military explosives. The study is adding SSST testing results for a broad suite of different HMEs to the literature. Ultimately the study will suggest new guidelines and methods and possibly establish the SSST testing accuracies needed to develop safe handling practices for HMEs. Each participating testing laboratory uses identical test materials and preparation methods wherever possible. The testing performers involved are Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory (LANL), Indian Head Division, Naval Surface Warfare Center, (NSWC IHD), Sandia National Laboratories (SNL), and Air Force Research Laboratory (AFRL/RXQL). These tests are conducted as a proficiency study in order to establish some consistency in test protocols, procedures, and experiments and to compare results when these testing variables cannot be made consistent.

  4. Static Aeroelastic Analysis with an Inviscid Cartesian Method

    Science.gov (United States)

    Rodriguez, David L.; Aftosmis, Michael J.; Nemec, Marian; Smith, Stephen C.

    2014-01-01

    An embedded-boundary, Cartesian-mesh flow solver is coupled with a three degree-of-freedom structural model to perform static, aeroelastic analysis of complex aircraft geometries. The approach solves a nonlinear, aerostructural system of equations using a loosely-coupled strategy. An open-source, 3-D discrete-geometry engine is utilized to deform a triangulated surface geometry according to the shape predicted by the structural model under the computed aerodynamic loads. The deformation scheme is capable of modeling large deflections and is applicable to the design of modern, very-flexible transport wings. The coupling interface is modular so that aerodynamic or structural analysis methods can be easily swapped or enhanced. After verifying the structural model with comparisons to Euler beam theory, two applications of the analysis method are presented as validation. The first is a relatively stiff, transport wing model which was a subject of a recent workshop on aeroelasticity. The second is a very flexible model recently tested in a low speed wind tunnel. Both cases show that the aeroelastic analysis method produces results in excellent agreement with experimental data.
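
    The loosely-coupled strategy can be illustrated with a one-degree-of-freedom toy (a typical section on a torsional spring, not the paper's Cartesian CFD plus three-DOF wing model): alternate aerodynamic and structural updates until the twist converges, then compare with the closed-form aeroelastic solution. All constants are invented.

```python
# Dynamic pressure q, area S, offset e, spring stiffness k, lift slope,
# rigid angle of attack -- all hypothetical values.
q, S, e, k, cl_alpha, alpha0 = 30.0, 2.0, 0.25, 500.0, 5.7, 0.05

theta = 0.0                                   # elastic twist (rad)
for _ in range(200):                          # loosely-coupled iteration
    L = q * S * cl_alpha * (alpha0 + theta)   # "aerodynamic solver"
    theta_new = L * e / k                     # "structural solver"
    if abs(theta_new - theta) < 1e-12:
        break
    theta = theta_new

# Closed-form static aeroelastic solution for comparison
theta_exact = q * S * cl_alpha * alpha0 * e / (k - q * S * cl_alpha * e)
```

The fixed-point iteration converges here because the aeroelastic feedback gain q·S·cl_alpha·e/k is below one; near divergence the same loop would need relaxation, which is one reason full solvers treat the coupled system more carefully.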

  5. Variational Methods in Sensitivity Analysis and Optimization for Aerodynamic Applications

    Science.gov (United States)

    Ibrahim, A. H.; Hou, G. J.-W.; Tiwari, S. N. (Principal Investigator)

    1996-01-01

    Variational method (VM) sensitivity analysis, which is the continuous alternative to discrete sensitivity analysis, is employed to derive the costate (adjoint) equations, the transversality conditions, and the functional sensitivity derivatives. In the derivation of the sensitivity equations, the variational method uses the generalized calculus of variations, in which the variable boundary is considered as the design function. The converged solution of the state equations together with the converged solution of the costate equations are integrated along the domain boundary to uniquely determine the functional sensitivity derivatives with respect to the design function. The determination of the sensitivity derivatives of the performance index or functional thus entails the coupled solutions of the state and costate equations. As the stable and converged numerical solution of the costate equations with their boundary conditions is a priori unknown, numerical stability analysis is performed on both the state and costate equations. Thereafter, based on the amplification factors obtained by solving the generalized eigenvalue equations, the stability behavior of the costate equations is discussed and compared with that of the state (Euler) equations. The stability analysis suggests that a converged and stable solution of the costate equations is possible only if their computational domain is transformed to take into account the reverse-flow nature of the costate equations. The application of the variational method to aerodynamic shape optimization problems is demonstrated for internal flow problems in the supersonic Mach number range. The study shows that, while maintaining the accuracy of the functional sensitivity derivatives within a range reasonable for engineering prediction purposes, the variational method yields a substantial gain in computational efficiency, i.e., computer time and memory, when compared with the finite-difference approach.

  6. Addition to the method of dimensional analysis in hydraulic problems

    Directory of Open Access Journals (Sweden)

    A.M. Kalyakin

    2013-03-01

    Full Text Available Modern engineering design, structures, and especially machines implementing new technologies confront engineers with problems that require immediate solution. The importance of the method of dimensional analysis as a tool for the ordinary engineer is therefore increasing, since it allows developers to obtain quick and quite simple solutions to even very complex tasks. The method of dimensional analysis can be applied to almost any field of physics and engineering, but it is especially effective at solving problems of mechanics and applied mechanics: hydraulics, fluid mechanics, structural mechanics, etc. Until now, the main obstacle to applying the method of dimensional analysis in its classic form has been multifactorial problems (those with many arguments), whose solution was rather difficult and sometimes impossible. To overcome these difficulties, the authors of this study propose a simple device: the use of combined variables. The main result of the study is a simple algorithm whose application makes it possible to solve a large class of previously unsolvable problems.
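
    The mechanical core of classical dimensional analysis (the Buckingham Pi step the abstract builds on) is a null-space computation on the dimension matrix. Below is a small sketch using exact rational arithmetic, applied to a standard textbook pipe-flow example (pressure drop, density, velocity, diameter, viscosity); the example and variable names are ours, not from the article.

```python
from fractions import Fraction

def nullspace(A):
    """Rational null-space basis of A via reduced row echelon form.
    Rows of A are base dimensions (M, L, T); columns are physical variables."""
    A = [[Fraction(x) for x in row] for row in A]
    rows, cols = len(A), len(A[0])
    pivots, r = [], 0
    for c in range(cols):
        piv = next((i for i in range(r, rows) if A[i][c] != 0), None)
        if piv is None:
            continue
        A[r], A[piv] = A[piv], A[r]
        A[r] = [x / A[r][c] for x in A[r]]          # normalize pivot row
        for i in range(rows):
            if i != r and A[i][c] != 0:             # eliminate the column
                A[i] = [a - A[i][c] * b for a, b in zip(A[i], A[r])]
        pivots.append(c)
        r += 1
        if r == rows:
            break
    basis = []
    for f in (c for c in range(cols) if c not in pivots):
        v = [Fraction(0)] * cols
        v[f] = Fraction(1)
        for i, p in enumerate(pivots):
            v[p] = -A[i][f]                         # back-substitute free column
        basis.append(v)
    return basis

# Columns: exponents of [dp, rho, v, D, mu] in the base dimensions M, L, T.
dims = [[1, 1, 0, 0, 1],      # M
        [-1, -3, 1, 1, -1],   # L
        [-2, 0, -1, 0, -1]]   # T
pi_groups = nullspace(dims)   # each vector = exponents of one dimensionless group
```

Five variables and a rank-3 dimension matrix give 5 - 3 = 2 dimensionless groups (here, the Euler and Reynolds numbers up to powers).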

  7. Validation study of core analysis methods for full MOX BWR

    International Nuclear Information System (INIS)

    JNES has been developing a technical database to be used in reviewing the validation of LWR core analysis methods on the following occasions: (1) confirming core safety parameters from the initial core (one-third MOX core) through a full MOX core in the Oma Nuclear Power Plant, which is under construction, (2) licensing high-burnup MOX cores in the future, and (3) reviewing topical reports on core analysis codes for safety design and evaluation. Based on this technical database, JNES will issue a guide for reviewing the core analysis methods used for safety design and evaluation of LWRs. The database will also be used for validating and improving core analysis codes developed by JNES. JNES has carried out (1) measurements of Doppler reactivity in experimental MOX cores simulating LWR cores, (2) measurements of isotopic compositions of fission product nuclides in high-burnup BWR UO2 fuels, and (3) neutronics analysis of the experimental data obtained in international joint programs such as FUBILA and REBUS. (author)

  8. An approximate method of analysis for notched unidirectional composites

    Science.gov (United States)

    Zweben, C.

    1974-01-01

    An approximate method is proposed for the analysis of unidirectional, filamentary composite materials having slit notches perpendicular to the fibers and subjected to tension parallel to the fibers. The approach is based on an engineering model which incorporates important effects of material heterogeneity by considering average extensional stresses in the fibers and average shear stresses in the matrix. Effects of interfacial failure and matrix plasticity at the root of the notch are considered. Predictions of the analysis are in reasonably good agreement with previous analytical models and experimental data for graphite/epoxy.

  9. Quantitative methods for the analysis of electron microscope images

    DEFF Research Database (Denmark)

    Skands, Peter Ulrik Vallø

    1996-01-01

    The topic of this thesis is a general introduction to quantitative methods for the analysis of digital microscope images. The images presented have primarily been acquired from scanning electron microscopes (SEM) and interferometer microscopes (IFM). The topic is approached through several examples...... foundation of the thesis fall in the areas of: 1) Mathematical Morphology; 2) Distance transforms and applications; and 3) Fractal geometry. Image analysis opens in general the possibility of quantitative and statistically well-founded measurement of digital microscope images. Herein lie also the conditions...

  10. Thermal Analysis of Thin Plates Using the Finite Element Method

    Science.gov (United States)

    Er, G. K.; Iu, V. P.; Liu, X. L.

    2010-05-01

    An isotropic thermal plate is analyzed with the finite element method, and the solution procedure is presented. The elementary stiffness matrix and loading vector are derived rigorously from the variational principle and the principle of minimum potential energy. Numerical results are obtained from the derived equations and tested against available exact solutions. Problems in the finite element analysis are identified: it is found that the finite element solutions cannot converge near the corners of the plate as the number of elements increases. The derived equations presented in this paper are fundamental for our further study of more complicated thermal plate analysis.
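
    The assemble-and-solve workflow the abstract describes (element stiffness, load vector, solve, compare with an exact solution) can be illustrated on the simplest possible relative: a one-dimensional steady heat-conduction problem with linear elements. This is a deliberately reduced sketch of the same verification idea, not the plate formulation of the paper.

```python
def fem_1d(n=8, f=1.0):
    """Piecewise-linear FEM for -u'' = f on (0,1) with u(0) = u(1) = 0.
    Returns nodal values at the interior nodes x_i = i/n, i = 1..n-1."""
    h = 1.0 / n
    m = n - 1                       # number of interior unknowns
    diag = [2.0 / h] * m            # assembled stiffness: tridiag(-1/h, 2/h, -1/h)
    off = -1.0 / h
    rhs = [f * h] * m               # consistent load vector for constant f
    # Thomas algorithm for the tridiagonal system
    c = [0.0] * m
    d = [0.0] * m
    c[0] = off / diag[0]
    d[0] = rhs[0] / diag[0]
    for i in range(1, m):
        denom = diag[i] - off * c[i - 1]
        c[i] = off / denom
        d[i] = (rhs[i] - off * d[i - 1]) / denom
    u = [0.0] * m
    u[-1] = d[-1]
    for i in range(m - 2, -1, -1):  # back substitution
        u[i] = d[i] - c[i] * u[i + 1]
    return u

u = fem_1d()   # exact solution is u(x) = f*x*(1-x)/2, here with f = 1
```

In this 1-D model problem the linear-element solution is nodally exact, which makes it a convenient sanity check before moving to plate elements, where (as the abstract notes) convergence issues can appear near corners.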

  11. Application of computer intensive data analysis methods to the analysis of digital images and spatial data

    DEFF Research Database (Denmark)

    Windfeld, Kristian

    1992-01-01

    Computer-intensive methods for data analysis in a traditional setting have developed rapidly in the last decade. The application and adaptation of some of these methods to the analysis of multivariate digital images and spatial data are explored, evaluated and compared to well-established classical...... linear methods. Different strategies for selecting projections (linear combinations) of multivariate images are presented. An exploratory, iterative method for finding interesting projections, originating in data analysis, is compared to principal components. A method for introducing spatial context...... structural images for heavy minerals based on irregularly sampled geochemical data. This methodology has proven useful in producing images that reflect real geological structures, with potential application in mineral exploration. A method for removing laboratory-produced map-sheet patterns in spatial data...

  12. Nanosilicon properties, synthesis, applications, methods of analysis and control

    CERN Document Server

    Ischenko, Anatoly A; Aslalnov, Leonid A

    2015-01-01

    Nanosilicon: Properties, Synthesis, Applications, Methods of Analysis and Control examines the latest developments on the physics and chemistry of nanosilicon. The book focuses on methods for producing nanosilicon, its electronic and optical properties, research methods to characterize its spectral and structural properties, and its possible applications. The first part of the book covers the basic properties of semiconductors, including causes of the size dependence of the properties, structural and electronic properties, and physical characteristics of the various forms of silicon. It presents theoretical and experimental research results as well as examples of porous silicon and quantum dots. The second part discusses the synthesis of nanosilicon, modification of the surface of nanoparticles, and properties of the resulting particles. The authors give special attention to the photoluminescence of silicon nanoparticles. The third part describes methods used for studying and controlling the structure and pro...

  13. Issues in benchmarking human reliability analysis methods : a literature review.

    Energy Technology Data Exchange (ETDEWEB)

    Lois, Erasmia (US Nuclear Regulatory Commission); Forester, John Alan; Tran, Tuan Q. (Idaho National Laboratory, Idaho Falls, ID); Hendrickson, Stacey M. Langfitt; Boring, Ronald L. (Idaho National Laboratory, Idaho Falls, ID)

    2008-04-01

    There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessment (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study is currently underway that compares HRA methods with each other and against operator performance in simulator studies. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.

  14. Similarity theory based method for MEMS dynamics analysis

    Institute of Scientific and Technical Information of China (English)

    LI Gui-xian; PENG Yun-feng; ZHANG Xin

    2008-01-01

    A new method for MEMS dynamics analysis is presented, based on similarity theory. With this method, the similarity between two systems can be captured in terms of the physical quantities and governing equations of different energy fields, and the unknown dynamic characteristics of one system can then be analyzed according to the similar, known ones of the other system. The possibility of establishing a pair of similar systems between MEMS and other energy systems is also discussed, based on the equivalence between mechanics and electrics, and the feasibility of the method is then demonstrated by an example in which the squeeze-film damping force in a MEMS device is compared with the current of its equivalent circuit established by this method.
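
    The core idea above, that two systems governed by the same equation have identical responses, can be demonstrated numerically on the damped oscillator m*x'' + c*x' + k*x = F, the mechanical twin of a series RLC circuit (L*q'' + R*q' + q/C = V). Scaling all four coefficients by a common factor leaves the governing equation, and hence the response, unchanged. The parameter values below are illustrative only.

```python
def simulate(m, c, k, F, dt=1e-4, steps=2000):
    """Semi-implicit Euler for m*x'' + c*x' + k*x = F, zero initial state."""
    x = v = 0.0
    out = []
    for _ in range(steps):
        v += (F - c * v - k * x) / m * dt
        x += v * dt
        out.append(x)
    return out

# A MEMS-scale system and a "similar" system with every coefficient scaled
# by 64: (64m)x'' + (64c)x' + (64k)x = 64F reduces to the same equation.
small = simulate(1e-6, 2e-4, 10.0, 1e-3)
scaled = simulate(64e-6, 128e-4, 640.0, 64e-3)
```

Both trajectories coincide and settle at the static deflection F/k, which is the sense in which the scaled system can stand in for the original during analysis.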

  15. HARMONIC ANALYSIS OF SVPWM INVERTER USING MULTIPLE-PULSES METHOD

    Directory of Open Access Journals (Sweden)

    Mehmet YUMURTACI

    2009-01-01

    Full Text Available Space Vector Modulation (SVM) is a popular and important PWM technique for three-phase voltage source inverters in the control of induction motors. In this study, harmonic analysis of Space Vector PWM (SVPWM) is carried out using the multiple-pulses method. The multiple-pulses method calculates the Fourier coefficients of the individual positive and negative pulses of the output PWM waveform and adds them together, using the principle of superposition, to obtain the Fourier coefficients of the whole PWM output signal. Harmonic magnitudes can be calculated directly by this method without linearization, look-up tables or Bessel functions. The results obtained by applying SVPWM for various parameter values are compared with the results obtained with the multiple-pulses method.
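
    The superposition step described above can be sketched directly: the nth Fourier coefficients of a single rectangular pulse of amplitude A on the angle interval (t1, t2) are a_n = A(sin n·t2 - sin n·t1)/(nπ) and b_n = A(cos n·t1 - cos n·t2)/(nπ), and the coefficients of the whole waveform are the sum over its pulses. The square-wave check below is our own sanity test, not data from the paper.

```python
import math

def harmonic(pulses, n):
    """nth Fourier coefficients (a_n, b_n) of a waveform built from
    rectangular pulses, each given as (amplitude, theta_start, theta_end)
    over one 2*pi period; per-pulse coefficients are simply summed."""
    a = sum(A / (n * math.pi) * (math.sin(n * t2) - math.sin(n * t1))
            for A, t1, t2 in pulses)
    b = sum(A / (n * math.pi) * (math.cos(n * t1) - math.cos(n * t2))
            for A, t1, t2 in pulses)
    return a, b

# Sanity check on a unit square wave: b_n = 4/(n*pi) for odd n, 0 for even n.
square = [(1.0, 0.0, math.pi), (-1.0, math.pi, 2 * math.pi)]
a1, b1 = harmonic(square, 1)
```

For an SVPWM waveform the pulse list would come from the computed switching angles; the harmonic evaluation itself is unchanged.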

  16. Stability Analysis of a Variant of the Prony Method

    Directory of Open Access Journals (Sweden)

    Rodney Jaramillo

    2012-01-01

    Full Text Available Prony-type methods are used in many engineering applications to determine the exponential fit corresponding to a data set. In this paper we study a variant of Prony's method that was used by Martín-Landrove et al. in a process of segmentation of T2-weighted MRI brain images. We show the equivalence between that method and the classical Prony method and study the stability of the computed solutions with respect to noise in the data set. In particular, we show that the relative error in the calculation of the exponential fit parameters is linear with respect to the noise in the data. Our analysis is based on classical results from linear algebra, matrix computation theory, and the theory of stability for roots of polynomials.
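
    For readers unfamiliar with the classical Prony method discussed above, here is a minimal two-term version: the samples are assumed to satisfy a linear recurrence, whose coefficients give a characteristic polynomial; its roots are the exponential bases, and the amplitudes follow from a linear solve. The function name and test signal are ours, for illustration.

```python
import cmath

def prony2(x):
    """Two-term Prony fit x[k] = c1*z1**k + c2*z2**k from 4 noise-free samples."""
    # Linear-prediction coefficients from x[k+2] = a1*x[k+1] + a0*x[k]
    det = x[1] * x[1] - x[0] * x[2]
    a1 = (x[1] * x[2] - x[0] * x[3]) / det
    a0 = (x[1] * x[3] - x[2] * x[2]) / det
    # Roots of the characteristic polynomial z**2 - a1*z - a0
    disc = cmath.sqrt(a1 * a1 + 4 * a0)
    z1, z2 = (a1 + disc) / 2, (a1 - disc) / 2
    # Amplitudes from the first two samples
    c2 = (x[1] - z1 * x[0]) / (z2 - z1)
    c1 = x[0] - c2
    return (c1, z1), (c2, z2)

samples = [2 * 0.5**k + 3 * 0.8**k for k in range(4)]
(c1, z1), (c2, z2) = prony2(samples)
pred = lambda k: (c1 * z1**k + c2 * z2**k).real   # reconstructed model
```

With noisy data this exact-interpolation form becomes ill-conditioned, which is precisely the stability question the paper analyzes.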

  17. Comparative analysis of different methods for graphene nanoribbon synthesis

    Directory of Open Access Journals (Sweden)

    Tošić Dragana D.

    2013-01-01

    Full Text Available Graphene nanoribbons (GNRs) are thin strips of graphene that have captured the interest of scientists due to their unique structure and promising applications in electronics. This paper presents the results of a comparative analysis of the morphological properties of graphene nanoribbons synthesized by different methods. Various methods have been reported for graphene nanoribbon synthesis. Lithography methods usually include electron-beam (e-beam) lithography, atomic force microscopy (AFM) lithography, and scanning tunnelling microscopy (STM) lithography. Sonochemical and chemical methods exist as well, namely chemical vapour deposition (CVD) and anisotropic etching. Graphene nanoribbons can also be fabricated by unzipping carbon nanotubes (CNTs). We propose a new, highly efficient method for graphene nanoribbon production by gamma irradiation of graphene dispersed in cyclopentanone (CPO). The surface morphology of the graphene nanoribbons was visualized with atomic force and transmission electron microscopy. It was determined that the dimensions of the graphene nanoribbons are inversely proportional to the applied gamma irradiation dose. The narrowest nanoribbons were 10-20 nm wide and 1 nm high, with regular and smooth edges. In comparison to other synthesis methods, the dimensions of graphene nanoribbons synthesized by gamma irradiation are slightly larger, but the yield of nanoribbons is much higher. Fourier transform infrared spectroscopy was used for structural analysis of the graphene nanoribbons. Results of photoluminescence spectroscopy revealed for the first time that the synthesized nanoribbons show photoluminescence in the blue region of visible light, in contrast to graphene nanoribbons synthesized by other methods. Based on these facts, we believe that our synthesis method has good prospects for potential future mass production of graphene nanoribbons with uniform size, as well as for future investigations of carbon nanomaterials for

  18. The monostandard method in thermal neutron activation analysis

    International Nuclear Information System (INIS)

    A simple method is described for instrumental multielement thermal neutron activation analysis using a monostandard. For geological and air dust samples, iron is used as the comparator, while sodium has advantages for biological materials. To test the capabilities of this method, the effective cross sections of the 23 elements determined were evaluated at a reactor site with an almost pure thermal neutron flux of about 9 x 10^12 n cm^-2 s^-1 and an epithermal neutron contribution of less than 0.03%. The obtained values mostly agree well with the best literature values of thermal neutron cross sections. The results of an analysis by activation at the same site agree well with the relative method using a multielement standard and, for several standard reference materials, with the certified element contents. A comparison of the element contents obtained by the monostandard and relative methods, together with the corresponding precisions and accuracies, is given. A brief survey of the monostandard method is presented. (orig.)
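
    The arithmetic behind a monostandard (single-comparator) determination reduces to a ratio of count rates scaled by lumped nuclear-data factors for the analyte and the comparator. The sketch below is a deliberately simplified illustration with made-up factor values; it omits decay, saturation and self-shielding corrections that a real analysis must apply.

```python
def monostandard_mass(counts_x, counts_s, mass_s, k_x, k_s):
    """Mass of analyte x from a single comparator s.

    k_* lumps the nuclear data and detection factors for each nuclide
    (cross section x isotopic abundance x detection efficiency / atomic mass);
    counts are assumed already decay- and saturation-corrected.
    """
    return mass_s * (counts_x / counts_s) * (k_s / k_x)

# Self-consistent synthetic check: generate counts from known masses.
flux_time = 3.0e5                  # arbitrary common irradiation factor
k_fe, k_na = 2.5, 7.0              # illustrative lumped factors (not real data)
m_fe, m_na = 1.0e-3, 4.0e-4        # grams; iron is the comparator here
c_fe = m_fe * k_fe * flux_time
c_na = m_na * k_na * flux_time
m_est = monostandard_mass(c_na, c_fe, m_fe, k_na, k_fe)   # recovers m_na
```

Because the flux-and-time factor is common to analyte and comparator, it cancels in the ratio, which is the practical advantage of the monostandard approach.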

  19. Comparative analysis of redirection methods for asteroid resource exploitation

    Science.gov (United States)

    Bazzocchi, Michael C. F.; Emami, M. Reza

    2016-03-01

    An in-depth analysis and systematic comparison of asteroid redirection methods are performed within a resource exploitation framework using different assessment mechanisms. Through this framework, mission objectives and constraints are specified for the redirection of an asteroid from a near-Earth orbit to a stable orbit in the Earth-Moon system. The paper provides a detailed investigation of five redirection methods, i.e., ion beam, tugboat, gravity tractor, laser sublimation, and mass ejector, with respect to their capabilities for a redirection mission. A set of mission level criteria are utilized to assess the performance of each redirection method, and the means of assigning attributes to each criterion is discussed in detail. In addition, the uncertainty in physical characteristics of the asteroid population is quantified through the use of Monte Carlo analysis. The Monte Carlo simulation provides insight into the performance robustness of the redirection methods with respect to the targeted asteroid range. Lastly, the attributes for each redirection method are aggregated using three different multicriteria assessment approaches, i.e., the Analytical Hierarchy Process, a utility-based approach, and a fuzzy aggregation mechanism. The results of each assessment approach as well as recommendations for further studies are discussed in detail.
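
    One of the three aggregation schemes named above, the weighted-sum utility approach, can be sketched as a min-max normalization of each criterion across candidates followed by a weighted sum. The attribute values and weights below are purely illustrative placeholders, not figures from the study, and every criterion is assumed pre-oriented so that higher is better.

```python
def weighted_scores(attrs, weights):
    """Min-max normalize each criterion across candidates, then weight-sum."""
    crits = list(weights)
    lo = {c: min(a[c] for a in attrs.values()) for c in crits}
    hi = {c: max(a[c] for a in attrs.values()) for c in crits}
    out = {}
    for name, a in attrs.items():
        s = 0.0
        for c in crits:
            span = hi[c] - lo[c]
            s += weights[c] * ((a[c] - lo[c]) / span if span else 1.0)
        out[name] = s
    return out

# Illustrative attribute values only -- NOT the study's data.
attrs = {
    "ion beam":          {"mass_moved": 0.6, "readiness": 0.8, "cost_score": 0.5},
    "tugboat":           {"mass_moved": 0.9, "readiness": 0.4, "cost_score": 0.3},
    "laser sublimation": {"mass_moved": 0.3, "readiness": 0.5, "cost_score": 0.7},
}
weights = {"mass_moved": 0.5, "readiness": 0.3, "cost_score": 0.2}
scores = weighted_scores(attrs, weights)
best = max(scores, key=scores.get)
```

AHP and fuzzy aggregation differ in how the weights and attribute values are elicited and combined, but they consume the same kind of per-criterion attribute table.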

  20. Analysis on Large Deformation Compensation Method for Grinding Machine

    Directory of Open Access Journals (Sweden)

    Wang Ya-jie

    2013-08-01

    Full Text Available The positioning accuracy of computer numerical control machine tools and manufacturing systems is affected by structural deformations, especially for large-sized systems. Structural deformations of the machine body are difficult to model and to predict, and research on directly measuring the deformation and compensating for it is fairly limited both at home and abroad, rarely addressing how the compensation amount is calculated. A new method to compensate the large deformation caused by self-weight is presented in this paper. First, the compensation method is summarized. Then, static force analysis of the large grinding machine is performed through APDL (ANSYS Parametric Design Language), which automatically extracts results into data files, yielding the displacements of N points along the working stroke of the mechanical arm. Next, the mathematical model and the corresponding flat rectangular function are established, and analysis of the displacements of the N points shows that the new compensation method is feasible. Finally, MATLAB is used to calculate the compensation amount, and the accuracy of the proposed method is proved. Practice shows that the error remaining after compensation by the large-deformation compensation method can meet the requirements of grinding.

  1. Preparation of 99mTc-EDTA-MN and Its Bioimaging in Mouse

    Directory of Open Access Journals (Sweden)

    Yongshuai QI

    2015-07-01

    Full Text Available Background and objective Hypoxia is an important biological characteristic of solid tumors: the presence of hypoxic cells makes tumors insensitive to radiotherapy and chemotherapy, increasing their resistance to conventional treatment, so detecting the degree of hypoxia in tumor tissue is of great significance. Hypoxia imaging in nuclear medicine can reflect the degree of tissue hypoxia using agents that are selectively retained in hypoxic cells or tissues, including nitroimidazoles and non-nitroimidazoles; nitroimidazoles are at present the most widely and deeply researched hypoxic-cell imaging agents in China and abroad. The aims of this work were to develop the hypoxia imaging agent EDTA-MN, to study the feasibility of labeling it with 99mTc by the direct labeling method, to evaluate the radiochemical properties of 99mTc-EDTA-MN, and to observe the distribution of 99mTc-labeled EDTA-MN in nude mice bearing non-small cell lung cancer (A549) xenografts, providing experimental evidence for its further research and application. Methods The radiolabeling of EDTA-MN with 99mTc was performed by the direct labeling method, testing the reaction dosage (10 mg, 5 mg, 2 mg), stannous chloride dosage (8 mg/mL, 4 mg/mL, 2 mg/mL) and labeling pH (2, 4, 5, 6) one by one, using orthogonal design analysis to find the optimal labeling conditions. The labeling rate, radiochemical purity, lipid-water partition coefficient and in vitro stability in normal saline (NS) were determined by TLC and HPLC, and the distribution of 99mTc-EDTA-MN in nude mice was studied preliminarily. Results The labeling rate of 99mTc-EDTA-MN under the best labeling conditions was (84.11±2.83)%, and the radiochemical purity was higher than 90% after HPLC purification, without any notable decomposition at room temperature over a period of 12 h.

  2. Synthesis of aircraft structures using integrated design and analysis methods

    Science.gov (United States)

    Sobieszczanski-Sobieski, J.; Goetz, R. C.

    1978-01-01

    Systematic research to develop and validate methods for structural sizing of an airframe designed with the use of composite materials and active controls is reported. The research program includes procedures for computing aeroelastic loads, static and dynamic aeroelasticity, analysis and synthesis of active controls, and optimization techniques. Development of the methods is concerned with the most effective ways of integrating and sequencing the procedures in order to generate a structural sizing and associated active control system that is optimal with respect to a given merit function constrained by strength and aeroelasticity requirements.

  3. Recent Developments in Helioseismic Analysis Methods and Solar Data Assimilation

    CERN Document Server

    Schad, Ariane; Duvall, Tom L; Roth, Markus; Vorontsov, Sergei V

    2016-01-01

    We review recent advances and results in enhancing and developing helioseismic analysis methods and in solar data assimilation. In the first part of this paper we will focus on selected developments in time-distance and global helioseismology. In the second part, we review the application of data assimilation methods on solar data. Relating solar surface observations as well as helioseismic proxies with solar dynamo models by means of the techniques from data assimilation is a promising new approach to explore and to predict the magnetic activity cycle of the Sun.

  4. Methods for Analysis of Outdoor Performance Data (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Jordan, D.

    2011-02-01

    The ability to accurately predict power delivery over the course of time is of vital importance to the growth of the photovoltaic (PV) industry. Two key cost drivers are the efficiency with which sunlight is converted into power, and how this conversion efficiency develops over time. Accurate knowledge of the power decline over time, expressed as a degradation rate, is essential and important to all stakeholders: utility companies, integrators, investors, and scientists alike. Different methods to determine degradation rates from discrete versus continuous data are presented, and some general best-practice methods are outlined. In addition, historical degradation rates and some preliminary analysis with respect to climate are given.
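
    The most common way to extract a degradation rate from a performance time series is an ordinary least-squares trend expressed as percent of the initial value per year. The sketch below uses a synthetic, noise-free monthly series with a built-in -0.8 %/yr trend; the numbers are illustrative and not from the presentation.

```python
def degradation_rate(t_years, perf):
    """OLS slope of performance vs. time, as percent of initial value per year."""
    n = len(t_years)
    mt = sum(t_years) / n
    mp = sum(perf) / n
    slope = (sum((t - mt) * (p - mp) for t, p in zip(t_years, perf))
             / sum((t - mt) ** 2 for t in t_years))
    return 100.0 * slope / perf[0]

# Synthetic 5-year monthly performance-ratio series, losing 0.8 % per year.
t = [i / 12.0 for i in range(60)]
p = [1.0 - 0.008 * ti for ti in t]
rate = degradation_rate(t, p)   # recovers the built-in trend
```

On real field data the series must first be corrected for seasonality and irradiance, which is where the methodological choices surveyed above come in.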

  5. New pressure transient analysis methods for naturally fractured reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Serra, K.; Raghavan, R.; Reynolds, A.C.

    1983-10-01

    This paper presents new methods for analyzing pressure drawdown and buildup data obtained at wells producing naturally fractured reservoirs. The model used in this study assumes unsteady-state fluid transfer from the matrix system to the fracture system. A new flow regime is identified. The discovery of this flow regime explains field behavior that has been considered unusual. The probability of obtaining data reflecting this flow regime in a field test is higher than that of obtaining the classical responses given in the literature. The identification of this new flow regime provides methods for preparing a complete analysis of pressure data obtained from naturally fractured reservoirs. Applications to field data are discussed.

  7. Error analysis of symplectic Lanczos method for Hamiltonian eigenvalue problem

    Institute of Scientific and Technical Information of China (English)

    YAN Qingyou; WEI Xiaopeng

    2006-01-01

    A rounding error analysis of the symplectic Lanczos method for solving the large-scale sparse Hamiltonian eigenvalue problem is given. If no breakdown occurs in the method, it can be shown that the Hamiltonian structure-preserving requirement does not destroy the essential features of the nonsymmetric Lanczos algorithm. The relationship between the loss of J-orthogonality among the symplectic Lanczos vectors and the convergence of the Ritz values in the symmetric Lanczos algorithm is discussed. It is demonstrated that, under certain assumptions, the computed J-orthogonal Lanczos vectors lose J-orthogonality when some Ritz values begin to converge. Our analysis closely follows the recent works of Bai and Fabbender.
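
    The J-orthogonality that the analysis above tracks can be checked numerically: with the structure matrix J = [[0, I], [-I, 0]], a matrix V of Lanczos vectors is J-orthogonal when V^T J V = J, and the size of the residual V^T J V - J measures the loss. The tiny dense-matrix helpers below are our own illustration, not the paper's algorithm.

```python
def matmul(A, B):
    return [[sum(A[i][t] * B[t][j] for t in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(r) for r in zip(*A)]

def j_orthogonality_defect(V):
    """max |(V^T J V - J)_ij| for the 2n x 2n structure matrix J."""
    n2 = len(V)
    n = n2 // 2
    J = [[0.0] * n2 for _ in range(n2)]
    for i in range(n):
        J[i][n + i] = 1.0      # upper-right identity block
        J[n + i][i] = -1.0     # lower-left negative identity block
    M = matmul(transpose(V), matmul(J, V))
    return max(abs(M[i][j] - J[i][j]) for i in range(n2) for j in range(n2))

# In the 2x2 case, any matrix with determinant 1 is symplectic.
S = [[2.0, 3.0], [1.0, 2.0]]   # det = 1, so the defect is zero
defect = j_orthogonality_defect(S)
```

In the rounding-error analysis this defect grows from machine precision as Ritz values converge, which is the phenomenon the abstract describes.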

  8. Development of Photogrammetric Methods of Stress Analysis and Quality Control

    CERN Document Server

    Kubik, D L; Kubik, Donna L.; Greenwood, John A.

    2003-01-01

    A photogrammetric method of stress analysis has been developed to test thin, nonstandard windows designed for hydrogen absorbers, major components of a muon cooling channel. The purpose of the absorber window tests is to demonstrate an understanding of the window behavior and strength as a function of applied pressure. This is done by comparing the deformation of the window, measured via photogrammetry, to the deformation predicted by finite element analysis (FEA). Since the FEA indicates a strong sensitivity of strain to the window thickness, photogrammetric methods were also chosen to measure the thickness of the window, providing more accurate input data for the FEA. This, plus improvements made in hardware and testing procedures, resulted in a precision of 5 microns in all dimensions and substantial agreement with FEA predictions.

  9. Computational methods for efficient structural reliability and reliability sensitivity analysis

    Science.gov (United States)

    Wu, Y.-T.

    1993-01-01

    This paper presents recent developments in efficient structural reliability analysis methods. The paper proposes an efficient, adaptive importance sampling (AIS) method that can be used to compute reliability and reliability sensitivities. The AIS approach uses a sampling density that is proportional to the joint PDF of the random variables. Starting from an initial approximate failure domain, sampling proceeds adaptively and incrementally with the goal of reaching a sampling domain that is slightly greater than the failure domain to minimize over-sampling in the safe region. Several reliability sensitivity coefficients are proposed that can be computed directly and easily from the above AIS-based failure points. These probability sensitivities can be used for identifying key random variables and for adjusting design to achieve reliability-based objectives. The proposed AIS methodology is demonstrated using a turbine blade reliability analysis problem.
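
    The flavor of the importance-sampling idea above can be shown on the simplest reliability problem with a known answer: P[X > beta] for standard normal X, sampling from a density shifted to the design point and reweighting by the density ratio. This is a deliberately simplified, non-adaptive sketch of the AIS idea, not the paper's algorithm; the function name and parameters are ours.

```python
import math
import random

def is_failure_prob(beta=3.0, n=20000, seed=7):
    """Estimate P[X > beta], X ~ N(0,1), by importance sampling centred
    at the design point x = beta."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        y = rng.gauss(beta, 1.0)        # sample from the shifted density N(beta, 1)
        if y > beta:                    # indicator of the failure domain
            # density ratio phi(y) / phi(y - beta) = exp(beta^2/2 - beta*y)
            total += math.exp(0.5 * beta * beta - beta * y)
    return total / n

pf = is_failure_prob()
pf_exact = 0.5 * math.erfc(3.0 / math.sqrt(2.0))   # exact tail probability
```

Centering the sampling density on the failure boundary makes roughly half the samples land in the failure domain, so far fewer samples are needed than with crude Monte Carlo at this probability level (about 1.35e-3); the adaptive scheme in the paper goes further by refining the sampling domain iteratively.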

  10. Extraction, chromatographic and mass spectrometric methods for lipid analysis.

    Science.gov (United States)

    Pati, Sumitra; Nie, Ben; Arnold, Robert D; Cummings, Brian S

    2016-05-01

    Lipids make up a diverse subset of biomolecules that are responsible for mediating a variety of structural and functional properties as well as modulating cellular functions such as trafficking, regulation of membrane proteins and subcellular compartmentalization. In particular, phospholipids are the main constituents of biological membranes and play major roles in cellular processes like transmembrane signaling and structural dynamics. The chemical and structural variety of lipids makes analysis using a single experimental approach quite challenging. Research in the field relies on the use of multiple techniques to detect and quantify components of cellular lipidomes as well as determine structural features and cellular organization. Understanding these features can allow researchers to elucidate the biochemical mechanisms by which lipid-lipid and/or lipid-protein interactions take place within the conditions of study. Herein, we provide an overview of essential methods for the examination of lipids, including extraction methods, chromatographic techniques and approaches for mass spectrometric analysis.

  11. Methods of Analysis of Electronic Money in Banks

    Directory of Open Access Journals (Sweden)

    Melnychenko Oleksandr V.

    2014-03-01

    Full Text Available The article identifies methods for the analysis of electronic money, formalizes its instruments, and offers an integral indicator, which should be calculated both by issuing banks and by banks that carry out operations with electronic money issued by other banks. Calculation of the integral indicator would allow a comprehensive assessment of a given bank's activity with electronic money and comparison of different banks across the aggregate of indicators used to study the electronic money market, its level of development, etc. The article presents methods for the economic analysis of electronic money in banks along the following directions: solvency and liquidity, efficiency of electronic money issue, business activity of the bank, and social responsibility. Moreover, the indicators proposed for each of the directions should be taken into account when building the integral indicators by which banks are studied: business activity, profitability, solvency, liquidity and so on.

  12. Dynamic Characteristic Analysis and Experiment for Integral Impeller Based on Cyclic Symmetry Analysis Method

    Institute of Scientific and Technical Information of China (English)

    WU Qiong; ZHANG Yidu; ZHANG Hongwei

    2012-01-01

A cyclic symmetry analysis method is proposed for analyzing the dynamic characteristics of thin-walled integral impellers. The reliability and feasibility of the method are investigated by means of simulation and experiment. The fundamental cyclic symmetry equations and their solutions are derived for the cyclic symmetry structure, and the computational efficiency of the sector model is compared with that of the whole structure. Comparison of results obtained by finite element analysis (FEA) and experiment shows that the local dynamic characteristics of the integral impeller are consistent with those of a single cyclic symmetry blade. When the integral impeller is constrained and the thin-walled blade is the object of analysis, the dynamic characteristics of the integral impeller can be approximated by those of the cyclic symmetry blade. Hence, the cyclic symmetry analysis method effectively improves efficiency and yields more parameter information for the dynamic characteristics of integral impellers.
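The computational saving behind cyclic symmetry can be illustrated on a toy ring of identical sectors. The stiffness values and the scalar (rather than finite-element) sector model below are invented for the sketch; the point is only that a block-circulant system can be solved one harmonic at a time:

```python
# Illustrative sketch (not the paper's FEA model): for a rotationally
# periodic structure the full stiffness matrix is circulant, so its
# eigenvalues can be recovered sector-by-sector with a phase shift
# exp(2*pi*i*k/N) instead of assembling the whole impeller.
import numpy as np

N = 8                      # number of identical sectors (blades), assumed
k_s, k_c = 4.0, 1.0        # sector stiffness and inter-sector coupling, assumed

# Full circulant stiffness matrix of a ring of N coupled sectors.
K = np.zeros((N, N))
for i in range(N):
    K[i, i] = k_s
    K[i, (i + 1) % N] = -k_c
    K[i, (i - 1) % N] = -k_c

full_eigs = np.sort(np.linalg.eigvalsh(K))

# Cyclic-symmetry reduction: one scalar problem per harmonic index k.
reduced_eigs = np.sort([k_s - 2 * k_c * np.cos(2 * np.pi * k / N)
                        for k in range(N)])

print(np.allclose(full_eigs, reduced_eigs))  # True: sector analysis suffices
```

In a real impeller each scalar entry becomes a sector-sized FE block, but the harmonic-by-harmonic decoupling is the same, which is the source of the efficiency gain the abstract reports.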

  13. Big Data and Specific Analysis Methods for Insurance Fraud Detection

    Directory of Open Access Journals (Sweden)

    Ramona BOLOGA

    2014-02-01

Full Text Available Analytics is the future of big data, because only transforming data into information gives them value and can turn data in business into competitive advantage. Large data volumes, their variety and the increasing speed of their growth stretch the boundaries of traditional data warehouses and ETL tools. This paper investigates the benefits of Big Data technology and the main methods of analysis that can be applied to the particular case of fraud detection in the public health insurance system in Romania.

  14. Intelligence Intrusion Detection Prevention Systems using Object Oriented Analysis method

    OpenAIRE

    DR.K.KUPPUSAMY; S. Murugan

    2010-01-01

This paper sets out to provide a model for “Intelligence Intrusion Detection Prevention Systems using Object Oriented Analysis method”. It describes the state’s overall requirements regarding the acquisition and implementation of intrusion prevention and detection systems with intelligence (IIPS/IIDS). It is designed to provide a deeper understanding of intrusion prevention and detection principles for those who may be responsible for acquiring, implementing or monitoring such sy...

  15. Pseudo-dynamic method for structural analysis of automobile seats

    OpenAIRE

    J. O. Carneiro; Melo, F. J. Q. de; Pereira, J. T.; Teixeira, V.

    2005-01-01

This work describes the application of a pseudo-dynamic (PsD) method to the dynamic analysis of passenger seats for the automotive industry. The design of such components involves a structural test considering the action of dynamic forces arising from a crash scenario. The laboratory certification of these automotive components consists essentially of the inspection of the propagation and extension of plastic deformation zones in metallic members of the seat structure as cons...

  16. Computational Methods for Failure Analysis and Life Prediction

    Science.gov (United States)

    Noor, Ahmed K. (Compiler); Harris, Charles E. (Compiler); Housner, Jerrold M. (Compiler); Hopkins, Dale A. (Compiler)

    1993-01-01

    This conference publication contains the presentations and discussions from the joint UVA/NASA Workshop on Computational Methods for Failure Analysis and Life Prediction held at NASA Langley Research Center 14-15 Oct. 1992. The presentations focused on damage failure and life predictions of polymer-matrix composite structures. They covered some of the research activities at NASA Langley, NASA Lewis, Southwest Research Institute, industry, and universities. Both airframes and propulsion systems were considered.

  17. Back analysis of microplane model parameters using soft computing methods

    CERN Document Server

    Kucerova, A; Zeman, J

    2009-01-01

A new procedure based on layered feed-forward neural networks for microplane material model parameter identification is proposed in the present paper. Its novelties are the use of the Latin Hypercube Sampling method for generating training sets, the systematic employment of stochastic sensitivity analysis, and the training of the neural network by an evolutionary (genetic) algorithm. Advantages and disadvantages of this approach, together with possible extensions, are thoroughly discussed and analyzed.
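The Latin Hypercube Sampling step mentioned above can be sketched in a few lines; the parameter bounds and sample count are arbitrary, and the neural-network identification is not reproduced:

```python
# Minimal Latin Hypercube Sampling sketch for generating training sets
# over material-parameter ranges: each dimension is cut into n equal
# strata, one random point is drawn per stratum, and strata are paired
# randomly across dimensions.
import random

def latin_hypercube(n_samples, bounds, rng=None):
    """bounds: list of (low, high) per parameter; returns n_samples points."""
    rng = rng or random.Random(0)
    dims = len(bounds)
    samples = [[0.0] * dims for _ in range(n_samples)]
    for d, (lo, hi) in enumerate(bounds):
        strata = list(range(n_samples))
        rng.shuffle(strata)              # random pairing across dimensions
        for i, s in enumerate(strata):
            u = (s + rng.random()) / n_samples   # one point per stratum
            samples[i][d] = lo + u * (hi - lo)
    return samples

# Invented two-parameter design space:
pts = latin_hypercube(10, [(0.0, 1.0), (100.0, 200.0)])
print(len(pts))
```

Compared with plain Monte Carlo, every one-dimensional stratum is guaranteed to be sampled exactly once, which is why LHS gives well-spread training sets at small sample counts.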

  18. Interval Analysis of the Finite Element Method for Stochastic Structures

    Institute of Scientific and Technical Information of China (English)

    刘长虹; 刘筱玲; 陈虬

    2004-01-01

A random parameter can be transformed into an interval number in the structural analysis with the concept of the confidence interval. Hence, analyses of uncertain structural systems can be carried out in traditional FEM software. In some cases, the number of solutions required for stochastic structures is nearly the same as for traditional structural problems. In addition, a new method to evaluate the failure probability of structures is presented for the needs of modern engineering design.
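The transformation of a random parameter into an interval number can be illustrated with a one-element example; the load and the stiffness bounds below are invented:

```python
# Toy interval-arithmetic sketch of the abstract's idea: replace a
# random stiffness by a confidence interval and propagate the bounds
# through a one-element analysis u = F / k (illustrative only).

def interval_div(num, k_lo, k_hi):
    """Bounds of num / k for k in [k_lo, k_hi], assuming k > 0."""
    assert 0 < k_lo <= k_hi
    return num / k_hi, num / k_lo      # division is monotone for k > 0

F = 10.0                      # deterministic load (assumed)
k_lo, k_hi = 95.0, 105.0      # stiffness as a confidence interval (assumed)
u_lo, u_hi = interval_div(F, k_lo, k_hi)
print(u_lo, u_hi)             # guaranteed displacement bounds
```

For a full FE model the same monotonicity argument no longer holds element-wise, which is exactly why dedicated interval FEM formulations are needed; this sketch only shows the single-parameter case.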

  19. A Method of Automated Nonparametric Content Analysis for Social Science

    OpenAIRE

    Hopkins, Daniel J.; King, Gary

    2010-01-01

    The increasing availability of digitized text presents enormous opportunities for social scientists. Yet hand coding many blogs, speeches, government records, newspapers, or other sources of unstructured text is infeasible. Although computer scientists have methods for automated content analysis, most are optimized to classify individual documents, whereas social scientists instead want generalizations about the population of documents, such as the proportion in a given category. Unfortunatel...

  20. Continuum methods of physical modeling continuum mechanics, dimensional analysis, turbulence

    CERN Document Server

    Hutter, Kolumban

    2004-01-01

    The book unifies classical continuum mechanics and turbulence modeling, i.e. the same fundamental concepts are used to derive model equations for material behaviour and turbulence closure and complements these with methods of dimensional analysis. The intention is to equip the reader with the ability to understand the complex nonlinear modeling in material behaviour and turbulence closure as well as to derive or invent his own models. Examples are mostly taken from environmental physics and geophysics.

  1. Applied Methods for Analysis of Economic Structure and Change

    OpenAIRE

    Anderstig, Christer

    1988-01-01

The thesis comprises five papers and an introductory overview of applied models and methods. The papers concern interdependences and interrelations in models applied to empirical analyses of various problems related to production, consumption, location and trade. Among different definitions of 'structural analysis', one refers to the study of the properties of economic models under the assumption of invariant structural relations; this definition is close to what is aimed at in the present case....

  2. THE WAVELET ANALYSIS METHOD ON THE TRANSIENT SIGNAL

    Institute of Scientific and Technical Information of China (English)

    吴淼

    1996-01-01

Many dynamic signals of mining machines are transient, such as the load signals when a roadheader's cutting head cuts in or out, and the response signals produced by these loads. For such transient signals the traditional Fourier analysis method is quite inadequate. The limitations in resolution of analysing them with the Short-Time Fourier Transform (STFT) are discussed in this paper. Because the wavelet transform has the characteristics of a flexible window and multiresolution analysis, we apply it to analyse these transient signals. As a practical example, using the D18 wavelet and Mallat's tree algorithm with MATLAB, the discrete wavelet transform was calculated for the simulated response signals of a three-degree-of-freedom vibration system under impulse and random excitations. The results of the wavelet transform demonstrate its effectiveness and superiority in analysing transient signals of mining machines.
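A minimal sketch of the idea, assuming a single-level Haar transform as a stand-in for the paper's D18 wavelet and Mallat tree: the detail coefficients localise a transient in time, which a global Fourier spectrum cannot do:

```python
# One Mallat decomposition level with the Haar wavelet (a stand-in for
# the D18 wavelet used in the paper), applied to a short transient to
# show how detail coefficients localise an impulse in time.
import math

def haar_dwt(signal):
    """One decomposition level: (approximation, detail) coefficients."""
    s = math.sqrt(2.0)
    approx = [(a + b) / s for a, b in zip(signal[0::2], signal[1::2])]
    detail = [(a - b) / s for a, b in zip(signal[0::2], signal[1::2])]
    return approx, detail

# Flat signal with one impulse at sample 9 (a crude 'cut-in' transient).
x = [0.0] * 16
x[9] = 1.0
approx, detail = haar_dwt(x)
peak = max(range(len(detail)), key=lambda i: abs(detail[i]))
print(peak)  # detail coefficient 4 covers samples 8-9: impulse localised
```

Deeper levels of the Mallat tree would repeat `haar_dwt` on `approx`, giving the multiresolution picture the abstract refers to.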

  3. Simplified QCD fit method for BSM analysis of HERA data

    CERN Document Server

    Turkot, Oleksii; Zarnecki, Aleksander Filip

    2016-01-01

The high-precision HERA data can be used as an input to a QCD analysis within the DGLAP formalism to obtain the detailed description of the proton structure in terms of the parton distribution functions (PDFs). However, when searching for Beyond Standard Model (BSM) contributions in the data one should take into account the possibility that the PDF set may already have been biased by partially or totally absorbing previously unrecognised new physics contributions. The ZEUS Collaboration has proposed a new approach to the BSM analysis of the inclusive $ep$ data based on the simultaneous QCD fits of parton distribution functions together with contributions of new physics processes. Unfortunately, the limit-setting procedure in the frequentist approach is very time-consuming in this method, as the full QCD analysis has to be repeated for numerous data replicas. We describe a simplified approach, based on the Taylor expansion of the cross section predictions in terms of PDF parameters, which allowed us to reduce the calc...
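The Taylor-expansion shortcut can be sketched for a single parameter: once the prediction is linearised around the nominal fit, each replica fit collapses to an analytic least-squares update instead of a full QCD fit. All numbers below are invented, and a real analysis expands in many PDF parameters simultaneously:

```python
# Sketch of the simplified approach: expand the cross section to first
# order in a PDF parameter p around the nominal fit, so a replica fit
# becomes a cheap quadratic minimisation. Values are illustrative.
import numpy as np

p0 = 1.0
sigma0 = np.array([2.0, 1.5, 1.1])      # nominal predictions (assumed)
dsigma_dp = np.array([0.4, 0.3, 0.2])   # Taylor coefficients at p0 (assumed)

def predict(p):
    return sigma0 + dsigma_dp * (p - p0)

data = np.array([2.2, 1.65, 1.2])       # one pseudo-data replica (assumed)
err = np.array([0.1, 0.1, 0.1])

# chi2(p) is quadratic in p under the linearisation, so the minimum
# is analytic -- no iterative QCD fit per replica:
w = dsigma_dp / err
r = (data - sigma0) / err
p_best = p0 + np.dot(w, r) / np.dot(w, w)
print(p_best)
```

Repeating the last three lines over thousands of replicas is cheap, which is the speed-up that makes frequentist limit setting tractable in this scheme.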

  4. A simplified method for elastic-plastic-creep structural analysis

    Science.gov (United States)

    Kaufman, A.

    1985-01-01

    A simplified inelastic analysis computer program (ANSYPM) was developed for predicting the stress-strain history at the critical location of a thermomechanically cycled structure from an elastic solution. The program uses an iterative and incremental procedure to estimate the plastic strains from the material stress-strain properties and a plasticity hardening model. Creep effects are calculated on the basis of stress relaxation at constant strain, creep at constant stress or a combination of stress relaxation and creep accumulation. The simplified method was exercised on a number of problems involving uniaxial and multiaxial loading, isothermal and nonisothermal conditions, dwell times at various points in the cycles, different materials and kinematic hardening. Good agreement was found between these analytical results and nonlinear finite element solutions for these problems. The simplified analysis program used less than 1 percent of the CPU time required for a nonlinear finite element analysis.
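The stress-relaxation-at-constant-strain step can be sketched with a simple incremental loop. The linear Maxwell law and all material numbers below are assumptions for illustration, not ANSYPM's actual creep model:

```python
# Illustrative incremental procedure in the spirit of the simplified
# method: during a dwell at constant strain, stress relaxes according
# to a creep law. A linear Maxwell relaxation (sigma' = -sigma / tau)
# stands in for the program's material model (assumption).
import math

def relax(sigma0, tau, dt, steps):
    sigma = sigma0
    for _ in range(steps):          # explicit incremental update
        sigma += -sigma / tau * dt
    return sigma

# Invented dwell: 50 time units on a material with tau = 50.
sigma_end = relax(sigma0=100.0, tau=50.0, dt=0.1, steps=500)
print(sigma_end, 100.0 * math.exp(-1.0))   # numeric vs closed form
```

The incremental answer tracks the closed-form exponential closely at this step size; a production implementation would combine such relaxation increments with plasticity updates, as the abstract describes.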

  5. Analysis Method for Non-Nominal First Acquisition

    Science.gov (United States)

    Sieg, Detlef; Mugellesi-Dow, Roberta

    2007-01-01

    First this paper describes a method how the trajectory of the launcher can be modelled for the contingency analysis without having much information about the launch vehicle itself. From a dense sequence of state vectors a velocity profile is derived which is sufficiently accurate to enable the Flight Dynamics Team to integrate parts of the launcher trajectory on its own and to simulate contingency cases by modifying the velocity profile. Then the paper focuses on the thorough visibility analysis which has to follow the contingency case or burn performance simulations. In the ideal case it is possible to identify a ground station which is able to acquire the satellite independent from the burn performance. The correlations between the burn performance and the pointing at subsequent ground stations are derived with the aim of establishing simple guidelines which can be applied quickly and which significantly improve the chance of acquisition at subsequent ground stations. In the paper the method is applied to the Soyuz/Fregat launch with the MetOp satellite. Overall the paper shows that the launcher trajectory modelling with the simulation of contingency cases in connection with a ground station visibility analysis leads to a proper selection of ground stations and acquisition methods. In the MetOp case this ensured successful contact of all ground stations during the first hour after separation without having to rely on any early orbit determination result or state vector update.

  6. Extending methods: using Bourdieu's field analysis to further investigate taste

    Science.gov (United States)

    Schindel Dimick, Alexandra

    2015-06-01

    In this commentary on Per Anderhag, Per-Olof Wickman and Karim Hamza's article Signs of taste for science, I consider how their study is situated within the concern for the role of science education in the social and cultural production of inequality. Their article provides a finely detailed methodology for analyzing the constitution of taste within science education classrooms. Nevertheless, because the authors' socially situated methodology draws upon Bourdieu's theories, it seems equally important to extend these methods to consider how and why students make particular distinctions within a relational context—a key aspect of Bourdieu's theory of cultural production. By situating the constitution of taste within Bourdieu's field analysis, researchers can explore the ways in which students' tastes and social positionings are established and transformed through time, space, place, and their ability to navigate the field. I describe the process of field analysis in relation to the authors' paper and suggest that combining the authors' methods with a field analysis can provide a strong methodological and analytical framework in which theory and methods combine to create a detailed understanding of students' interest in relation to their context.

  7. The colour analysis method applied to homogeneous rocks

    Directory of Open Access Journals (Sweden)

    Halász Amadé

    2015-12-01

    Full Text Available Computer-aided colour analysis can facilitate cyclostratigraphic studies. Here we report on a case study involving the development of a digital colour analysis method for examination of the Boda Claystone Formation which is the most suitable in Hungary for the disposal of high-level radioactive waste. Rock type colours are reddish brown or brownish red, or any shade between brown and red. The method presented here could be used to differentiate similar colours and to identify gradual transitions between these; the latter are of great importance in a cyclostratigraphic analysis of the succession. Geophysical well-logging has demonstrated the existence of characteristic cyclic units, as detected by colour and natural gamma. Based on our research, colour, natural gamma and lithology correlate well. For core Ib-4, these features reveal the presence of orderly cycles with thicknesses of roughly 0.64 to 13 metres. Once the core has been scanned, this is a time- and cost-effective method.

  8. Human reliability analysis methods for probabilistic safety assessment

    International Nuclear Information System (INIS)

    Human reliability analysis (HRA) of a probabilistic safety assessment (PSA) includes identifying human actions from safety point of view, modelling the most important of them in PSA models, and assessing their probabilities. As manifested by many incidents and studies, human actions may have both positive and negative effect on safety and economy. Human reliability analysis is one of the areas of probabilistic safety assessment (PSA) that has direct applications outside the nuclear industry. The thesis focuses upon developments in human reliability analysis methods and data. The aim is to support PSA by extending the applicability of HRA. The thesis consists of six publications and a summary. The summary includes general considerations and a discussion about human actions in the nuclear power plant (NPP) environment. A condensed discussion about the results of the attached publications is then given, including new development in methods and data. At the end of the summary part, the contribution of the publications to good practice in HRA is presented. In the publications, studies based on the collection of data on maintenance-related failures, simulator runs and expert judgement are presented in order to extend the human reliability analysis database. Furthermore, methodological frameworks are presented to perform a comprehensive HRA, including shutdown conditions, to study reliability of decision making, and to study the effects of wrong human actions. In the last publication, an interdisciplinary approach to analysing human decision making is presented. The publications also include practical applications of the presented methodological frameworks. (orig.)

  9. Assessment of the Prony's method for BWR stability analysis

    International Nuclear Information System (INIS)

    Highlights: → This paper describes a method to determine the degree of stability of a BWR. → Performance comparison between Prony's and common AR techniques is presented. → Benchmark data and actual BWR transient data are used for comparison. → DR and f results are presented and discussed. → The Prony's method is shown to be a robust technique for BWR stability. - Abstract: It is known that Boiling Water Reactors are susceptible to present power oscillations in regions of high power and low coolant flow, in the power-flow operational map. It is possible to fall in one of such instability regions during reactor startup, since both power and coolant flow are being increased but not proportionally. One other possibility for falling into those areas is the occurrence of a trip of recirculation pumps. Stability monitoring in such cases can be difficult, because the amount or quality of power signal data required for calculation of the stability key parameters may not be enough to provide reliable results in an adequate time range. In this work, the Prony's Method is presented as one complementary alternative to determine the degree of stability of a BWR, through time series data. This analysis method can provide information about decay ratio and oscillation frequency from power signals obtained during transient events. However, so far not many applications in Boiling Water Reactors operation have been reported and supported to establish the scope of using such analysis for actual transient events. This work presents first a comparison of decay ratio and frequency oscillation results obtained by Prony's method and those results obtained by the participants of the Forsmark 1 and 2 Boiling Water Reactor Stability Benchmark using diverse techniques. Then, a comparison of decay ratio and frequency oscillation results is performed for four real BWR transient event data, using Prony's method and two other techniques based on an autoregressive modeling. The four
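Prony's method itself is compact enough to sketch for a noise-free signal with a single oscillation mode; the sampling rate, pole values and the two-term model below are illustrative, and real BWR signals require more model terms and noise handling:

```python
# Minimal two-exponential Prony sketch: estimate decay ratio (DR) and
# oscillation frequency from a sampled signal via linear prediction.
import numpy as np

dt = 0.1
n = np.arange(200)
sigma, f = -0.2, 0.5                       # decaying 0.5 Hz mode (assumed)
x = np.exp(sigma * n * dt) * np.cos(2 * np.pi * f * n * dt)

# Linear prediction: x[k] = a1*x[k-1] + a2*x[k-2]  (least squares)
A = np.column_stack([x[1:-1], x[:-2]])
a1, a2 = np.linalg.lstsq(A, x[2:], rcond=None)[0]

# Prony poles are the roots of z^2 - a1*z - a2.
z = np.roots([1.0, -a1, -a2])[0]
s = np.log(z) / dt                          # continuous-time pole
freq = abs(s.imag) / (2 * np.pi)
decay_ratio = np.exp(2 * np.pi * s.real / abs(s.imag))  # amplitude ratio per period
print(round(freq, 3), round(decay_ratio, 3))
```

The recovered pole reproduces the assumed 0.5 Hz frequency, and the decay ratio exp(2*pi*sigma/omega) is exactly the per-period amplitude ratio used as the BWR stability key parameter.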

  10. Summary oral reflective analysis: a method for interview data analysis in feminist qualitative research.

    Science.gov (United States)

    Thompson, S M; Barrett, P A

    1997-12-01

    This article explores an innovative approach to qualitative data analysis called Summary Oral Reflective Analysis (SORA). The method preserves the richness and contextuality of in-depth interview data within a broader feminist philosophical perspective. This multidisciplinary approach was developed in two individual research programs within a cooperative, collaborative arrangement. It represents a creative response to perceived deficiencies in the pragmatics of qualitative data analysis where the maintenance of data contextuality is critical. PMID:9398939

  12. Development of Analysis Methods for Designing with Composites

    Science.gov (United States)

    Madenci, E.

    1999-01-01

The project involved the development of new analysis methods to achieve efficient design of composite structures. We developed a complex variational formulation to analyze the in-plane and bending coupling response of an unsymmetrically laminated plate with an elliptical cutout subjected to arbitrary edge loading as shown in Figure 1. This formulation utilizes four independent complex potentials that satisfy the coupled in-plane and bending equilibrium equations, thus eliminating the area integrals from the strain energy expression. The solution to a finite geometry laminate under arbitrary loading is obtained by minimizing the total potential energy function and solving for the unknown coefficients of the complex potentials. The validity of this approach is demonstrated by comparison with finite element analysis predictions for a laminate with an inclined elliptical cutout under bi-axial loading. The geometry and loading of this laminate with a lay-up of [-45/45] are shown in Figure 2. The deformed configuration shown in Figure 3 reflects the presence of bending-stretching coupling. The validity of the present method is established by comparing the out-of-plane deflections along the boundary of the elliptical cutout from the present approach with those of the finite element method. The comparison shown in Figure 4 indicates remarkable agreement. The details of this method are described in a manuscript by Madenci et al. (1998).

  13. The SMART CLUSTER METHOD - adaptive earthquake cluster analysis and declustering

    Science.gov (United States)

    Schaefer, Andreas; Daniell, James; Wenzel, Friedemann

    2016-04-01

Earthquake declustering is an essential part of almost any statistical analysis of spatial and temporal properties of seismic activity, with usual applications comprising probabilistic seismic hazard assessments (PSHAs) and earthquake prediction methods. The nature of earthquake clusters and the subsequent declustering of earthquake catalogues play a crucial role in determining the magnitude-dependent earthquake return period and its spatial variation. Various methods have been developed by other researchers to address this issue, ranging in complexity from rather simple statistical window methods to complex epidemic models. This study introduces the smart cluster method (SCM), a new methodology to identify earthquake clusters, which uses an adaptive point process for spatio-temporal identification. Hereby, an adaptive search algorithm for data point clusters is adopted. It uses the earthquake density in the spatio-temporal neighbourhood of each event to adjust the search properties. The identified clusters are subsequently analysed to determine directional anisotropy, focusing on a strong correlation along the rupture plane, and the search space is adjusted with respect to these directional properties. In the case of rapid subsequent ruptures like the 1992 Landers sequence or the 2010/2011 Darfield-Christchurch events, an adaptive classification procedure is applied to disassemble subsequent ruptures which may have been grouped into an individual cluster, using near-field searches, support vector machines and temporal splitting. The steering parameters of the search behaviour are linked to local earthquake properties like magnitude of completeness, earthquake density and Gutenberg-Richter parameters. The method is capable of identifying and classifying earthquake clusters in space and time. It is tested and validated using earthquake data from California and New Zealand.
As a result of the cluster identification process, each event in
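A drastically simplified window-based clustering in the spirit of (though far short of) SCM can illustrate the core operation: events that fall within a spatial and temporal window of an earlier cluster member join that cluster. Window sizes and events below are invented; SCM adapts these windows to local seismicity instead of fixing them:

```python
# Toy spatio-temporal window clustering (a fixed-window stand-in for
# the adaptive SCM algorithm): an event closer than d_max km and t_max
# days to any earlier cluster member joins that cluster.
def cluster_events(events, d_max, t_max):
    """events: list of (t_days, x_km, y_km); returns a cluster id per event."""
    labels = [-1] * len(events)
    next_id = 0
    for i, (t, x, y) in enumerate(events):
        for j in range(i):
            tj, xj, yj = events[j]
            if abs(t - tj) <= t_max and ((x - xj)**2 + (y - yj)**2)**0.5 <= d_max:
                labels[i] = labels[j]   # join the earlier event's cluster
                break
        if labels[i] == -1:
            labels[i] = next_id         # start a new cluster
            next_id += 1
    return labels

events = [(0.0, 0.0, 0.0), (0.5, 2.0, 0.0),   # mainshock + aftershock
          (30.0, 50.0, 50.0)]                  # unrelated background event
print(cluster_events(events, d_max=10.0, t_max=5.0))  # [0, 0, 1]
```

Declustering then amounts to keeping one representative event (e.g. the largest) per cluster; SCM's contribution is making `d_max`/`t_max` adaptive and direction-aware rather than constant.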

  14. Microscale extraction method for HPLC carotenoid analysis in vegetable matrices

    Directory of Open Access Journals (Sweden)

    Sidney Pacheco

    2014-10-01

Full Text Available In order to generate simple, efficient analytical methods that are also fast, clean and economical, and capable of producing reliable results for a large number of samples, a microscale extraction method for the analysis of carotenoids in vegetable matrices was developed. The efficiency of this adapted method was checked by comparing the results obtained from vegetable matrices in terms of extraction equivalence, time required and reagents. Six matrices were used: tomato (Solanum lycopersicum L.), carrot (Daucus carota L.), sweet potato with orange pulp (Ipomoea batatas (L.) Lam.), pumpkin (Cucurbita moschata Duch.), watermelon (Citrullus lanatus (Thunb.) Matsum. & Nakai) and sweet potato (Ipomoea batatas (L.) Lam.) flour. Quantification of total carotenoids was made by spectrophotometry. Quantification and determination of carotenoid profiles were performed by High Performance Liquid Chromatography with photodiode array detection. Microscale extraction was faster, cheaper and cleaner than the commonly used method, and advantageous for analytical laboratories.

  15. Urinary density measurement and analysis methods in neonatal unit care

    Directory of Open Access Journals (Sweden)

    Maria Vera Lúcia Moreira Leitão Cardoso

    2013-09-01

Full Text Available The objective was to assess urine collection methods through cotton in contact with genitalia and a urinary collector to measure urinary density in newborns. This is a quantitative intervention study carried out in a neonatal unit of Fortaleza-CE, Brazil, in 2010. The sample consisted of 61 newborns randomly chosen to compose the study group. Most neonates were full term (31; 50.8%) and male (33; 54%). Data on urinary density measured through the cotton and collector methods presented statistically significant differences (p<0.05). The analysis of interquartile ranges between subgroups resulted in statistical differences between urinary collector/reagent strip (1005) and cotton/reagent strip (1010); however, there was no difference between urinary collector/refractometer (1008) and cotton/refractometer. Therefore, further research should be conducted with larger samples using the methods investigated in this study, whenever possible comparing urine density values to laboratory tests.

  16. Sampling and analysis methods for geothermal fluids and gases

    Energy Technology Data Exchange (ETDEWEB)

    Watson, J.C.

    1978-07-01

    The sampling procedures for geothermal fluids and gases include: sampling hot springs, fumaroles, etc.; sampling condensed brine and entrained gases; sampling steam-lines; low pressure separator systems; high pressure separator systems; two-phase sampling; downhole samplers; and miscellaneous methods. The recommended analytical methods compiled here cover physical properties, dissolved solids, and dissolved and entrained gases. The sequences of methods listed for each parameter are: wet chemical, gravimetric, colorimetric, electrode, atomic absorption, flame emission, x-ray fluorescence, inductively coupled plasma-atomic emission spectroscopy, ion exchange chromatography, spark source mass spectrometry, neutron activation analysis, and emission spectrometry. Material on correction of brine component concentrations for steam loss during flashing is presented. (MHR)

  17. Analysis of Photovoltaic System Energy Performance Evaluation Method

    Energy Technology Data Exchange (ETDEWEB)

    Kurtz, S.; Newmiller, J.; Kimber, A.; Flottemesch, R.; Riley, E.; Dierauf, T.; McKee, J.; Krishnani, P.

    2013-11-01

Documentation of the energy yield of a large photovoltaic (PV) system over a substantial period can be useful to measure a performance guarantee, as an assessment of the health of the system, for verification of a performance model to then be applied to a new system, or for a variety of other purposes. Although the measurement of this performance metric might appear to be straightforward, there are a number of subtleties associated with variations in weather and imperfect data collection that complicate the determination and data analysis. A performance assessment is most valuable when it is completed with a very low uncertainty and when the subtleties are systematically addressed, yet currently no standard exists to guide this process. This report summarizes a draft methodology for an Energy Performance Evaluation Method, the philosophy behind the draft method, and the lessons that were learned by implementing the method.
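One common metric that such evaluations build on, the performance ratio, can be sketched as follows. The formula shown is the generic irradiance-normalised PR, not necessarily the draft method's exact definition, and the numbers are invented:

```python
# Generic performance ratio (PR): measured AC energy divided by the
# energy a nameplate-rated array would produce under the measured
# plane-of-array irradiance. This is a common industry metric, offered
# here as an illustration, not as the report's exact draft method.
def performance_ratio(e_measured_kwh, poa_irradiance_kwh_m2, p_dc0_kw):
    """PR = measured energy / (nameplate power * irradiance / 1 kW/m2)."""
    e_expected = p_dc0_kw * poa_irradiance_kwh_m2 / 1.0
    return e_measured_kwh / e_expected

# One month of invented numbers for a 100 kW array:
pr = performance_ratio(e_measured_kwh=12000.0,
                       poa_irradiance_kwh_m2=150.0,
                       p_dc0_kw=100.0)
print(round(pr, 2))  # 0.8
```

The weather subtleties the report discusses enter through the irradiance term: without measured plane-of-array irradiance (or a temperature correction), the same system yields different apparent PR values in different months.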

  18. Bond-graph Methods for Electric Circuits Analysis

    Directory of Open Access Journals (Sweden)

    GRAVA Adriana

    2012-10-01

Full Text Available The paper presents a bond-graph method for solving and analyzing an electric circuit with four or more circuit loops. Using this method, the time for circuit analysis is much shorter than with a classical method. Besides determining the intensities of the electrical currents through the sides of the circuit, bond-graphs provide the possibility of obtaining the transmittance of the analyzed system by a fast working method. The main advantage of bond-graphs is the interaction with various areas of physics. The analyzed electrical circuit could be a part of a complex physical system that could be modeled and analyzed as a unitary system by using bond-graphs.

  19. Performance analysis of image fusion methods in transform domain

    Science.gov (United States)

    Choi, Yoonsuk; Sharifahmadian, Ershad; Latifi, Shahram

    2013-05-01

Image fusion involves merging two or more images in such a way as to retain the most desirable characteristics of each. There are various image fusion methods and they can be classified into three main categories: i) Spatial domain, ii) Transform domain, and iii) Statistical domain. We focus on the transform domain in this paper as spatial domain methods are primitive and statistical domain methods suffer from a significant increase of computational complexity. In the field of image fusion, performance analysis is important since the evaluation result gives valuable information which can be utilized in various applications, such as military, medical imaging, remote sensing, and so on. In this paper, we analyze and compare the performance of fusion methods based on four different transforms: i) wavelet transform, ii) curvelet transform, iii) contourlet transform and iv) nonsubsampled contourlet transform. Fusion framework and scheme are explained in detail, and two different sets of images are used in our experiments. Furthermore, various performance evaluation metrics are adopted to quantitatively analyze the fusion results. The comparison results show that the nonsubsampled contourlet transform method performs better than the other three methods. During the experiments, we also found that a decomposition level of 3 showed the best fusion performance, and decomposition levels beyond level-3 did not significantly affect the fusion results.
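The evaluation side can be illustrated with a deliberately primitive example: pixel-average fusion (a spatial-domain baseline, not one of the four transforms compared in the paper) scored by Shannon entropy, one widely used fusion metric:

```python
# Average fusion of two toy 'images' plus a Shannon-entropy score.
# Average fusion is the primitive spatial-domain baseline the paper
# dismisses; it is used here only to show how a fusion metric is
# computed on a fused result.
import math
from collections import Counter

def entropy(img):
    """Shannon entropy (bits) of the pixel-value histogram."""
    counts = Counter(p for row in img for p in row)
    total = sum(counts.values())
    return -sum(c / total * math.log2(c / total) for c in counts.values())

a = [[0, 0], [255, 255]]        # toy 2x2 source images
b = [[0, 255], [0, 255]]
fused = [[(pa + pb) // 2 for pa, pb in zip(ra, rb)] for ra, rb in zip(a, b)]
print(fused, entropy(fused))
```

Higher entropy is usually read as more retained information; transform-domain methods replace the averaging step with coefficient-selection rules in the wavelet/curvelet/contourlet domain while the metric side stays the same.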

  20. Application of the holistic methods in analysis of organic milk

    Directory of Open Access Journals (Sweden)

    Anka Popović-Vranješ

    2012-12-01

Full Text Available Organic farming has advantages in terms of environmental protection, biodiversity, soil quality, animal welfare and pesticide residues. Unlike conventional production, the “organic chain” means that healthy soil leads to healthy animal feed, which leads to healthy cows with normal milk and, eventually, to healthy consumers. Since this must be scientifically proven, there is an increasing need for scientific methods that can reveal the benefits of organic food. For this purpose, holistic methods such as biocrystallization and the rising picture method are introduced. Biocrystallization shows that organic milk is systematically more “balanced”, with more “ordered structure” and better “integration and coordination.” Previous studies using the biocrystallization method were performed on raw milk produced under different conditions, on differently treated milk (heat treatment and homogenization) and on butter. Biocrystallization pictures are first assessed visually and then by computer analysis of the texture of the images, which is used to estimate image density. The rising picture method, which normally works in parallel with biocrystallization, can differentiate samples of Demeter and organic milk from conventional production, as well as milk treated differently during processing. Compared with conventional milk, organic milk shows better results in terms of impact on consumer health under both conventional and holistic methods.

  1. Comparative analysis of design methods of transversally loaded diaphragms

    Directory of Open Access Journals (Sweden)

    Bonić Zoran

    2015-01-01

    Full Text Available Reinforced concrete diaphragms are in-built supporting structures constructed directly in the ground. They are intended to receive lateral soil pressures, and due to their thickness-to-height ratio they belong to the group of deformable structures. The paper presents different design methods for transversally loaded diaphragms, as well as the constitutive soil models that can be used in this context. To compare the described methods, a design example of a reinforced-concrete diaphragm was worked out and the obtained results were analyzed. The diaphragm is first treated using classical analytical methods, and then using numerical methods based on the concept of problem discretization with the finite differences method and the STRESS, TOWER and PLAXIS software. The goal of the paper is to predict the behavior of the diaphragm and the surrounding soil as accurately as possible, and to determine the relevant actions required for the design. [Projekat Ministarstva nauke Republike Srbije, br. TR36028: Development and improvement of methods for analyses of soil - structure interaction based on theoretical and experimental research

  2. Pharmacokinetics of quercetin-loaded nanodroplets with ultrasound activation and their use for bioimaging

    Directory of Open Access Journals (Sweden)

    Chang LW

    2015-04-01

    Full Text Available Li-Wen Chang,1 Mei-Ling Hou,1 Shuo-Hui Hung,2 Lie-Chwen Lin,3 Tung-Hu Tsai1,4–6 1Institute of Traditional Medicine, School of Medicine, National Yang-Ming University, 2Department of Surgery, 3National Research Institute of Chinese Medicine, Ministry of Health and Welfare, Taipei, 4Department of Education and Research, Taipei City Hospital, 5School of Pharmacy, College of Pharmacy, Kaohsiung Medical University, Kaohsiung, 6Graduate Institute of Acupuncture Science, China Medical University, Taichung, Taiwan Abstract: Bubble formulations have both diagnostic and therapeutic applications. However, research on nanobubbles/nanodroplets remains in the initial stages. In this study, a nanodroplet formulation was prepared and loaded with a novel class of chemotherapeutic drug, ie, quercetin, to observe its pharmacokinetic properties and ultrasonic bioimaging of specific sites, namely the abdominal vein and bladder. Four parallel groups were designed to investigate the effects of ultrasound and nanodroplets on the pharmacokinetics of quercetin. These groups were quercetin alone, quercetin triggered with ultrasound, quercetin-encapsulated in nanodroplets, and quercetin encapsulated in nanodroplets triggered with ultrasound. Spherical vesicles with a mean diameter of 280 nm were formed, and quercetin was completely encapsulated within. In vivo ultrasonic imaging confirmed that the nanodroplets could be treated by ultrasound. The results indicate that the initial 5-minute serum concentration, area under the concentration–time curve, elimination half-life, and clearance of quercetin were significantly enhanced by nanodroplets with or without ultrasound. Keywords: nanodroplets, quercetin, ultrasonic pharmacokinetics, ultrasonic imaging, ultrasound
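
    The pharmacokinetic quantities compared across the four groups (initial serum concentration, AUC, elimination half-life, clearance) are standard non-compartmental estimates. A minimal sketch, assuming a mono-exponential terminal phase and using the last four sampling points for the log-linear fit — both assumptions for illustration, not the study's exact protocol:

```python
import numpy as np

def nca_metrics(t, c, dose):
    """Non-compartmental PK estimates from a concentration-time profile.
    Returns (AUC by trapezoidal rule, terminal half-life, clearance = dose/AUC)."""
    auc = float(np.sum((c[1:] + c[:-1]) / 2.0 * np.diff(t)))
    # log-linear regression over the terminal phase (last four samples, an assumption)
    slope, _ = np.polyfit(t[-4:], np.log(c[-4:]), 1)
    t_half = np.log(2.0) / -slope
    return auc, t_half, dose / auc
```

    On a simulated mono-exponential profile the recovered half-life matches ln(2)/k, which is a quick sanity check before applying such a routine to real serum data.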

  3. Dual-Emissive Cyclometalated Iridium(III) Polypyridine Complexes as Ratiometric Biological Probes and Organelle-Selective Bioimaging Reagents.

    Science.gov (United States)

    Zhang, Kenneth Yin; Liu, Hua-Wei; Tang, Man-Chung; Choi, Alex Wing-Tat; Zhu, Nianyong; Wei, Xi-Guang; Lau, Kai-Chung; Lo, Kenneth Kam-Wing

    2015-07-01

    In this Article, we present a series of cyclometalated iridium(III) polypyridine complexes of the formula [Ir(N^C)2(N^N)](PF6) that showed dual emission under ambient conditions. The structures of the cyclometalating and diimine ligands were changed systematically to investigate the effects of the substituents on the dual-emission properties of the complexes. On the basis of the photophysical data, the high-energy (HE) and low-energy (LE) emission features of the complexes were assigned to triplet intraligand (³IL) and triplet charge-transfer (³CT) excited states, respectively. Time-dependent density functional theory (TD-DFT) calculations supported these assignments and indicated that the dual emission resulted from the interruption of the communication between the higher-lying ³IL and the lower-lying ³CT states by a triplet amine-to-ligand charge-transfer (³NLCT) state. Also, the avidin-binding properties of the biotin complexes were studied by emission titrations, and the results showed that the dual-emissive complexes can be utilized as ratiometric probes for avidin. Additionally, all the complexes exhibited efficient cellular uptake by live HeLa cells. The MTT and Annexin V assays confirmed that no cell death and early apoptosis occurred during the cell imaging experiments. Interestingly, laser-scanning confocal microscopy revealed that the complexes were selectively localized on the cell membrane, mitochondria, or both, depending on the nature of the substituents of the ligands. The results of this work will contribute to the future development of dual-emissive transition metal complexes as ratiometric probes and organelle-selective bioimaging reagents. PMID:26087119

  4. A method of mixed dentition analysis in the mandible.

    Science.gov (United States)

    Motokawa, W; Ozaki, M; Soejima, Y; Yoshida, Y

    1987-01-01

    We developed a method of space analysis based on the fact that the measurement between the distal surfaces of the mandibular permanent lateral incisors is approximately equal to the combined widths of the mandibular permanent canine and premolars. This method is referred to as the Interlateral Incisor Width (I.L.I.W.) Analysis. One hundred and nineteen Japanese children without malocclusion were selected for the study. Various measurements of teeth were taken in their mouths with a modified, fine-tipped, electronic digital caliper and recorded in a handheld computer connected to the caliper. Statistical analyses were conducted to compare the accuracy of the I.L.I.W., Ono, Moyers, and Ballard and Wylie analyses in the mandibular arch. In summary, correlation coefficients for the sum of the actual mesiodistal dimensions of the canine and premolars with their predicted values obtained by each of the four analyses were r = 0.63 for I.L.I.W., r = 0.55 for Ono, r = 0.57 for Moyers, and r = 0.55 for Ballard and Wylie. Our I.L.I.W. method presented the best correlation of the four analyses, although each indicated a relatively low correlation. This method does appear to be clinically valid, since it is simple enough to enable the practitioner to estimate the combined dimension of the unerupted canine and premolars by measurement, in the mouth, of the distance between the distal surfaces of both mandibular permanent lateral incisors, instead of on study casts. It is recommended that a radiographic method be used in conjunction with our method to obtain a more accurate estimate.(ABSTRACT TRUNCATED AT 250 WORDS) PMID:3470327

  5. Accident Analysis and Barrier Function (AEB) Method. Manual for Incident Analysis

    International Nuclear Information System (INIS)

    The Accident Analysis and Barrier Function (AEB) Method models an accident or incident as a series of interactions between human and technical systems. In the sequence of human and technical errors leading to an accident there is, in principle, a possibility to arrest the development between each two successive errors. This can be done by a barrier function which, for example, can stop an operator from making an error. A barrier function can be performed by one or several barrier function systems. To illustrate, a mechanical system, a computer system or another operator can each perform a given barrier function to stop an operator from making an error. The barrier function analysis covers suggested improvements, the effectiveness of the improvements, the costs of implementation, the probability of implementation, the cost of maintaining the barrier function, the probability that maintenance will be kept up to standards, and the generalizability of the suggested improvement. The AEB method is similar to the US method called HPES, but differs from it in several ways. For example, the AEB method puts more emphasis on technical errors than HPES. In contrast to HPES, which describes a series of events, the AEB method models only errors. This gives a more focused analysis, making it well suited for checking other HPES-type accident analyses. However, the AEB method is a generic, stand-alone method that has been applied in fields other than nuclear power, such as traffic accident analysis

  6. Biclustering methods: biological relevance and application in gene expression analysis.

    Science.gov (United States)

    Oghabian, Ali; Kilpinen, Sami; Hautaniemi, Sampsa; Czeizler, Elena

    2014-01-01

    DNA microarray technologies are used extensively to profile the expression levels of thousands of genes under various conditions, yielding extremely large data-matrices. Thus, analyzing this information and extracting biologically relevant knowledge becomes a considerable challenge. A classical approach for tackling this challenge is to use clustering (also known as one-way clustering) methods where genes (or respectively samples) are grouped together based on the similarity of their expression profiles across the set of all samples (or respectively genes). An alternative approach is to develop biclustering methods to identify local patterns in the data. These methods extract subgroups of genes that are co-expressed across only a subset of samples and may feature important biological or medical implications. In this study we evaluate 13 biclustering and 2 clustering (k-means and hierarchical) methods. We use several approaches to compare their performance on two real gene expression data sets. For this purpose we apply four evaluation measures in our analysis: (1) we examine how well the considered (bi)clustering methods differentiate various sample types; (2) we evaluate how well the groups of genes discovered by the (bi)clustering methods are annotated with similar Gene Ontology categories; (3) we evaluate the capability of the methods to differentiate genes that are known to be specific to the particular sample types we study and (4) we compare the running time of the algorithms. In the end, we conclude that as long as the samples are well defined and annotated, the contamination of the samples is limited, and the samples are well replicated, biclustering methods such as Plaid and SAMBA are useful for discovering relevant subsets of genes and samples. PMID:24651574
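
    When ground-truth gene modules are available, (bi)clustering output is often scored by the best Jaccard overlap between recovered and true biclusters. A small sketch of such a recovery score — the function names are ours, and this complements rather than reproduces the paper's four evaluation measures:

```python
import numpy as np

def jaccard(b1, b2):
    """Jaccard index between two biclusters, each given as a (row_set, col_set) pair,
    computed over the sets of (row, col) cells they cover."""
    cells1 = {(r, c) for r in b1[0] for c in b1[1]}
    cells2 = {(r, c) for r in b2[0] for c in b2[1]}
    return len(cells1 & cells2) / len(cells1 | cells2)

def recovery(found, truth):
    """Mean, over the true biclusters, of the best Jaccard match among found ones."""
    return float(np.mean([max(jaccard(f, t) for f in found) for t in truth]))
```

    A recovery of 1.0 means every true bicluster was found exactly; partial overlaps degrade the score gracefully.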

  7. Biclustering methods: biological relevance and application in gene expression analysis.

    Directory of Open Access Journals (Sweden)

    Ali Oghabian

    Full Text Available DNA microarray technologies are used extensively to profile the expression levels of thousands of genes under various conditions, yielding extremely large data-matrices. Thus, analyzing this information and extracting biologically relevant knowledge becomes a considerable challenge. A classical approach for tackling this challenge is to use clustering (also known as one-way clustering) methods where genes (or respectively samples) are grouped together based on the similarity of their expression profiles across the set of all samples (or respectively genes). An alternative approach is to develop biclustering methods to identify local patterns in the data. These methods extract subgroups of genes that are co-expressed across only a subset of samples and may feature important biological or medical implications. In this study we evaluate 13 biclustering and 2 clustering (k-means and hierarchical) methods. We use several approaches to compare their performance on two real gene expression data sets. For this purpose we apply four evaluation measures in our analysis: (1) we examine how well the considered (bi)clustering methods differentiate various sample types; (2) we evaluate how well the groups of genes discovered by the (bi)clustering methods are annotated with similar Gene Ontology categories; (3) we evaluate the capability of the methods to differentiate genes that are known to be specific to the particular sample types we study and (4) we compare the running time of the algorithms. In the end, we conclude that as long as the samples are well defined and annotated, the contamination of the samples is limited, and the samples are well replicated, biclustering methods such as Plaid and SAMBA are useful for discovering relevant subsets of genes and samples.

  8. The Use of Object-Oriented Analysis Methods in Surety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Craft, Richard L.; Funkhouser, Donald R.; Wyss, Gregory D.

    1999-05-01

    Object-oriented analysis methods have been used in the computer science arena for a number of years to model the behavior of computer-based systems. This report documents how such methods can be applied to surety analysis. By embodying the causality and behavior of a system in a common object-oriented analysis model, surety analysts can make the assumptions that underlie their models explicit and thus better communicate with system designers. Furthermore, given minor extensions to traditional object-oriented analysis methods, it is possible to automatically derive a wide variety of traditional risk and reliability analysis methods from a single common object model. Automatic model extraction helps ensure consistency among analyses and enables the surety analyst to examine a system from a wider variety of viewpoints in a shorter period of time. Thus it provides a deeper understanding of a system's behaviors and surety requirements. This report documents the underlying philosophy behind the common object model representation, the methods by which such common object models can be constructed, and the rules required to interrogate the common object model for derivation of traditional risk and reliability analysis models. The methodology is demonstrated in an extensive example problem.

  9. The development of a 3D risk analysis method.

    Science.gov (United States)

    I, Yet-Pole; Cheng, Te-Lung

    2008-05-01

    Much attention has been paid to quantitative risk analysis (QRA) research in recent years due to the increasingly severe disasters that have happened in the process industries. Owing to its computational complexity, very few software packages, such as SAFETI, can really make the risk presentation meet practical requirements. Moreover, the traditional risk presentation method, like the individual risk contour in SAFETI, is mainly based on the consequence analysis results of dispersion modeling, which usually assumes that the vapor cloud disperses over a constant ground roughness on a flat terrain with no obstructions and no concentration fluctuations — quite different from the real situation of a chemical process plant. These models usually over-predict the hazardous regions in order to remain conservative, which also increases the uncertainty of the simulation results. On the other hand, a more rigorous model such as a computational fluid dynamics (CFD) model can resolve the previous limitations; however, it cannot by itself resolve the complexity of the risk calculations. In this research, a conceptual three-dimensional (3D) risk calculation method was proposed, combining the results of a series of CFD simulations with post-processing procedures to obtain 3D individual risk iso-surfaces. It is believed that such a technique will not be limited to risk analysis at ground level, but can also be extended to aerial, submarine, or space risk analyses in the near future.

  10. Primary component analysis method and reduction of seismicity parameters

    Institute of Scientific and Technical Information of China (English)

    WANG Wei; MA Qin-zhong; LIN Ming-zhou; WU Geng-feng; WU Shao-chun

    2005-01-01

    In this paper, primary component analysis is applied to 8 seismicity parameters — earthquake frequency N (ML≥3.0), b-value, 7-value, A(b)-value, Mf-value, Ac-value, C-value and D-value — that reflect the characteristics of the magnitude, time and space distribution of seismicity from different respects. By using the primary component analysis method, a synthesis parameter W reflecting the anomalous features of the earthquake magnitude, time and space distribution can be obtained. Generally, there is some correlation among the 8 parameters, but their variations differ from period to period, and earthquake prediction based on these individual parameters does not perform very well. However, the synthesis parameter W showed obvious anomalies before 13 earthquakes (MS>5.8) that occurred in North China, which indicates that the synthesis parameter W can better reflect the anomalous characteristics of the magnitude, time and space distribution of seismicity. Other problems related to the conclusions drawn by the primary component analysis method are also discussed.
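
    The synthesis parameter W can be understood as the projection of the standardized parameter time series onto the leading principal component. A minimal sketch of that construction — the paper's exact normalization and weighting may differ:

```python
import numpy as np

def synthesis_parameter(X):
    """Project standardized seismicity parameters onto the first principal component.
    X: (n_windows, n_parameters) matrix, e.g. columns for N, b-value, Mf-value, ...
    Returns the time series W of first-principal-component scores."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)    # standardize each parameter
    cov = np.cov(Z, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)            # eigenvalues in ascending order
    w1 = vecs[:, -1]                            # leading eigenvector
    return Z @ w1
```

    When the parameters share a common anomaly signal, W tracks that signal even though each individual parameter is noisy — which is the intuition behind its better predictive behavior.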

  11. A Method for Treating Discretization Error in Nondeterministic Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Alvin, K.F.

    1999-01-27

    A response surface methodology-based technique is presented for treating discretization error in non-deterministic analysis. The response surface, or metamodel, is estimated from computer experiments which vary both uncertain physical parameters and the fidelity of the computational mesh. The resultant metamodel is then used to propagate the variabilities in the continuous input parameters, while the mesh size is taken to zero, its asymptotic limit. With respect to mesh size, the metamodel is equivalent to Richardson extrapolation, in which solutions on coarser and finer meshes are used to estimate discretization error. The method is demonstrated on a one dimensional prismatic bar, in which uncertainty in the third vibration frequency is estimated by propagating variations in material modulus, density, and bar length. The results demonstrate the efficiency of the method for combining non-deterministic analysis with error estimation to obtain estimates of total simulation uncertainty. The results also show the relative sensitivity of failure estimates to solution bias errors in a reliability analysis, particularly when the physical variability of the system is low.
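
    The mesh-size dimension of the metamodel behaves like classical Richardson extrapolation: given solutions on meshes of size h and h/2 and a leading error term of order p, the h → 0 limit is estimated directly. A minimal sketch:

```python
def richardson(f_h, f_h2, p=2):
    """Richardson extrapolation to mesh size h -> 0, given solutions f_h (mesh h)
    and f_h2 (mesh h/2), assuming a leading discretization error of order p."""
    return f_h2 + (f_h2 - f_h) / (2**p - 1)
```

    For a quantity with a pure O(h^p) error the extrapolated value is exact, which is why pairs of coarse/fine solutions suffice to estimate the discretization error.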

  12. Analysis on electric energy measuring method based on multi-resolution analysis

    Institute of Scientific and Technical Information of China (English)

    ZHANG Xiao-bing; CUI Jia-rui; LIANG Yuan-hua; WANG Mu-kun

    2006-01-01

    Along with the massive application of non-linear loads and impact loads, many non-stationary stochastic signals such as harmonics, inter-harmonics, impulse signals and so on are introduced into the electric network, and these non-stationary stochastic signals affect the accuracy of electric energy measurement. Traditional methods like Fourier analysis can be applied efficiently to stationary stochastic signals, but have little effect on non-stationary stochastic signals. In light of this, the form of the electric network signals in the wavelet domain is discussed in this paper. A measurement method for active power based on multi-resolution analysis of the stochastic process is presented. This method has a wider application scope than traditional Fourier analysis, and it is of good referential and practical value for raising the level of existing electric energy measurement.
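
    Because an orthonormal wavelet transform preserves inner products (Parseval), active power can be computed from products of the wavelet coefficients of voltage and current. A sketch using the orthonormal Haar transform for a record length that is a power of two — the paper's choice of wavelet and decomposition depth may differ:

```python
import numpy as np

def haar_coeffs(x):
    """Full orthonormal Haar decomposition of x (length must be a power of two)."""
    coeffs = []
    a = np.asarray(x, dtype=float)
    while len(a) > 1:
        d = (a[0::2] - a[1::2]) / np.sqrt(2)   # detail band at this scale
        a = (a[0::2] + a[1::2]) / np.sqrt(2)   # approximation for next level
        coeffs.append(d)
    coeffs.append(a)
    return np.concatenate(coeffs)

def active_power(v, i):
    """Active power from wavelet-domain products; equals mean(v*i) by Parseval."""
    return float(np.dot(haar_coeffs(v), haar_coeffs(i)) / len(v))
```

    The practical advantage over a plain time-domain average is that the per-scale products also localize which frequency bands (harmonics, impulses) carry the power.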

  13. Analysis of Fiber deposition using Automatic Image Processing Method

    Science.gov (United States)

    Belka, M.; Lizal, F.; Jedelsky, J.; Jicha, M.

    2013-04-01

    Fibers are a permanent threat to human health. They have the ability to penetrate deep into the human lung, deposit there and cause health hazards, e.g. lung cancer. An experiment was carried out to gain more data about the deposition of fibers. Monodisperse glass fibers were delivered into a realistic model of human airways with an inspiratory flow rate of 30 l/min. The replica included the human airways from the oral cavity up to the seventh generation of branching. After the delivery, deposited fibers were rinsed from the model and placed on nitrocellulose filters. A novel method was established for deposition data acquisition, based on the principle of image analysis. The images were captured by a high-definition camera attached to a phase contrast microscope. Results of the new method were compared with the standard PCM method, which follows methodology NIOSH 7400, and a good match was found. The new method was found applicable for the evaluation of fibers, and the deposition fraction and deposition efficiency were calculated afterwards.

  14. Analysis of Fiber deposition using Automatic Image Processing Method

    Directory of Open Access Journals (Sweden)

    Jicha M.

    2013-04-01

    Full Text Available Fibers are a permanent threat to human health. They have the ability to penetrate deep into the human lung, deposit there and cause health hazards, e.g. lung cancer. An experiment was carried out to gain more data about the deposition of fibers. Monodisperse glass fibers were delivered into a realistic model of human airways with an inspiratory flow rate of 30 l/min. The replica included the human airways from the oral cavity up to the seventh generation of branching. After the delivery, deposited fibers were rinsed from the model and placed on nitrocellulose filters. A novel method was established for deposition data acquisition, based on the principle of image analysis. The images were captured by a high-definition camera attached to a phase contrast microscope. Results of the new method were compared with the standard PCM method, which follows methodology NIOSH 7400, and a good match was found. The new method was found applicable for the evaluation of fibers, and the deposition fraction and deposition efficiency were calculated afterwards.

  15. SIMS: a hybrid method for rapid conformational analysis.

    Directory of Open Access Journals (Sweden)

    Bryant Gipson

    Full Text Available Proteins are at the root of many biological functions, often performing complex tasks as the result of large changes in their structure. Describing the exact details of these conformational changes, however, remains a central challenge for computational biology due to the enormous computational requirements of the problem. This has engendered the development of a rich variety of useful methods designed to answer specific questions at different levels of spatial, temporal, and energetic resolution. These methods fall largely into two classes: physically accurate, but computationally demanding methods and fast, approximate methods. We introduce here a new hybrid modeling tool, the Structured Intuitive Move Selector (SIMS), designed to bridge the divide between these two classes, while allowing the benefits of both to be seamlessly integrated into a single framework. This is achieved by applying a modern motion planning algorithm, borrowed from the field of robotics, in tandem with a well-established protein modeling library. SIMS can combine precise energy calculations with approximate or specialized conformational sampling routines to produce rapid, yet accurate, analysis of the large-scale conformational variability of protein systems. Several key advancements are shown, including the abstract use of generically defined moves (conformational sampling methods) and an expansive probabilistic conformational exploration. We present three example problems that SIMS is applied to and demonstrate a rapid solution for each. These include the automatic determination of "active" residues for the hinge-based system Cyanovirin-N, exploring conformational changes involving long-range coordinated motion between non-sequential residues in Ribose-Binding Protein, and the rapid discovery of a transient conformational state of Maltose-Binding Protein, previously only determined by Molecular Dynamics. For all cases we provide energetic validations using well

  16. Approaches and methods for econometric analysis of market power

    DEFF Research Database (Denmark)

    Perekhozhuk, Oleksandr; Glauben, Thomas; Grings, Michael;

    2016-01-01

    This study discusses two widely used approaches in the New Empirical Industrial Organization (NEIO) literature and examines the strengths and weaknesses of the Production-Theoretic Approach (PTA) and the General Identification Method (GIM) for the econometric analysis of market power in agricultural and food markets. We provide a framework that may help researchers to evaluate and improve structural models of market power. Starting with the specification of the approaches in question, we compare published empirical studies of market power with respect to the choice of the applied approach, functional forms, estimation methods and derived estimates of the degree of market power. Thereafter, we use our framework to evaluate several structural models based on PTA and GIM to measure oligopsony power in the Ukrainian dairy industry. The PTA-based results suggest that the estimated parameters

  17. Numerical Analysis of Multilayer Waveguides Using Effective Refractive Index Method

    Institute of Scientific and Technical Information of China (English)

    GAO Shao-Wen; CAO Jun-Cheng; FENG Song-Lin

    2003-01-01

    With the help of the effective refractive index method, we have numerically analyzed a multilayer planar waveguide structure and calculated the propagation constants, confinement factors, and transverse electric (TE) modes. A five-layer waveguide model has been provided to analyze the electromagnetic wave propagation process. The analysis method has been applied to the 980 nm laser with an active layer of GaInAs/GaInAsP strained quantum wells, GaInAsP confinement layers and GaInP cap layers. By changing the thickness of the confinement layers, we obtained a confinement factor as high as 95% with the higher TE modes TE1 and TE2. The results are in good agreement with the experiment by A. Al-Muhanna et al. and suggest a new way to enhance the output power of semiconductor lasers. The analysis method can also be extended to any other slab multilayer waveguide structure, and the results are useful for the fabrication of optoelectronic devices.

  18. Alpha track analysis using nuclear emulsions as a preselecting method for safeguards environmental sample analysis

    International Nuclear Information System (INIS)

    Alpha track analysis in state-of-the-art nuclear emulsions was investigated to develop a preselecting method for environmental sampling for safeguards, based on counting individual alpha and fission tracks from nuclear material. We developed an automatic scanning system and software for the readout of alpha tracks in the emulsions. Automatic analysis of alpha tracks from a uranium ore sample was demonstrated. - Highlights: • Automatic scanning system and software were developed for alpha track analysis. • Basic performance of alpha track readout in novel nuclear emulsions was investigated. • NIT was a promising candidate for alpha track analysis from nuclear material

  19. Acoustic analysis of lightweight auto-body based on finite element method and boundary element method

    Institute of Scientific and Technical Information of China (English)

    LIANG Xinhua; ZHU Ping; LIN Zhongqin; ZHANG Yan

    2007-01-01

    A lightweight automotive prototype using alternative materials and gauge thickness is studied by a numerical method. The noise, vibration, and harshness (NVH) performance is the main target of this study. In the range of 1-150 Hz, the frequency response function (FRF) of the body structure is calculated by a finite element method (FEM) to get the dynamic behavior of the auto-body structure. The pressure response of the interior acoustic domain is solved by a boundary element method (BEM). To find the panel contributing most to the inner sound pressure, the panel acoustic contribution analysis (PACA) is performed. Finally, the most contributing panel is located and the resulting structural optimization is found to be more efficient.

  20. Statistical analysis of the precision of the Match method

    Directory of Open Access Journals (Sweden)

    R. Lehmann

    2005-05-01

    Full Text Available The Match method quantifies chemical ozone loss in the polar stratosphere. The basic idea consists in calculating the forward trajectory of an air parcel that has been probed by an ozone measurement (e.g., by an ozone sonde or satellite) and finding a second ozone measurement close to this trajectory. Such an event is called a ''match''. A rate of chemical ozone destruction can be obtained by a statistical analysis of several tens of such match events. Information on the uncertainty of the calculated rate can be inferred from the scatter of the ozone mixing ratio difference (second measurement minus first measurement) associated with individual matches. A standard analysis would assume that the errors of these differences are statistically independent. However, this assumption may be violated because different matches can share a common ozone measurement, so that the errors associated with these match events become statistically dependent. Taking this effect into account, we present an analysis of the uncertainty of the final Match result. It has been applied to Match data from the Arctic winters 1995, 1996, 2000, and 2003. For these ozone-sonde Match studies the effect of the error correlation on the uncertainty estimates is rather small: compared to a standard error analysis, the uncertainty estimates increase by 15% on average. However, the effect is more pronounced for typical satellite Match analyses: for an Antarctic satellite Match study (2003), the uncertainty estimates increase by 60% on average.
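
    With a correlation matrix for the match differences, the standard error of the mean difference generalizes from σ/√n to a quadratic form. A sketch — the identity matrix recovers the standard independent-error analysis, while the paper's actual estimator builds the correlation matrix from which matches share a measurement:

```python
import numpy as np

def mean_uncertainty(diffs, corr):
    """Standard error of the mean ozone difference allowing for error correlations.
    diffs: match differences (second minus first measurement).
    corr:  n x n correlation matrix of their errors (np.eye(n) -> standard analysis)."""
    n = len(diffs)
    var = np.var(diffs, ddof=1)
    w = np.ones(n) / n                    # equal weights for the plain mean
    return float(np.sqrt(w @ (var * corr) @ w))
```

    Positive correlations inflate the uncertainty, which is exactly the 15–60% effect reported in the abstract.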

  1. Statistical methods for the detection and analysis of radioactive sources

    Science.gov (United States)

    Klumpp, John

    In the present study, we consider four topics in the statistical analysis of radioactive sources: Bayesian methods for the analysis of count rate data, analysis of energy data, a model for non-constant background count rate distributions, and a zero-inflated model of the sample count rate. The study begins with a review of Bayesian statistics and techniques for analyzing count rate data. Next, we consider a novel system for incorporating energy information into count rate measurements which searches for elevated count rates in multiple energy regions simultaneously. The system analyzes time-interval data in real time to sequentially update a probability distribution for the sample count rate. We then consider a "moving target" model of background radiation in which the instantaneous background count rate is a function of time, rather than being fixed. Unlike the sequential update system, this model assumes a large body of pre-existing data which can be analyzed retrospectively. Finally, we propose a novel Bayesian technique which allows for simultaneous source detection and count rate analysis. This technique is fully compatible with, but independent of, the sequential update system and moving target model.
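
    The sequential-update idea for count rate data has a compact conjugate form: with a Gamma(α, β) prior on the Poisson rate, observing k counts in time t updates (α, β) to (α + k, β + t). A toy sketch — the study's actual system works on time-interval data across multiple energy regions, which this omits:

```python
def update_gamma(alpha, beta, counts, time):
    """Conjugate Bayesian update for a Poisson count rate with a Gamma(alpha, beta)
    prior: observing `counts` events in `time` seconds yields
    Gamma(alpha + counts, beta + time)."""
    return alpha + counts, beta + time

def posterior_mean(alpha, beta):
    """Posterior mean count rate of a Gamma(alpha, beta) distribution."""
    return alpha / beta
```

    Each new observation can be folded in as it arrives, which is what makes the real-time sequential update cheap.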

  2. Analysis of medicinal plant extracts by neutron activation method

    International Nuclear Information System (INIS)

    This dissertation presents the results of the analysis of medicinal plant extracts using the neutron activation method. Instrumental neutron activation analysis was applied to the determination of the elements Al, Br, Ca, Ce, Cl, Cr, Cs, Fe, K, La, Mg, Mn, Na, Rb, Sb, Sc and Zn in medicinal extracts obtained from the plants Achyrocline satureioides DC, Casearia sylvestris, Centella asiatica, Citrus aurantium L., Solanum lycocarpum, Solidago microglossa, Stryphnodendron barbatiman and Zingiber officinale R. The elements Hg and Se were determined using radiochemical separation, by means of retention of Se on an HMD inorganic exchanger and solvent extraction of Hg by a bismuth diethyl-dithiocarbamate solution. The precision and accuracy of the results were evaluated by analysing reference materials. The therapeutic action of some elements found in the plant extracts analyzed is briefly discussed.

  3. Analysis Methods for Progressive Damage of Composite Structures

    Science.gov (United States)

    Rose, Cheryl A.; Davila, Carlos G.; Leone, Frank A.

    2013-01-01

    This document provides an overview of recent accomplishments and lessons learned in the development of general progressive damage analysis methods for predicting the residual strength and life of composite structures. These developments are described within their State-of-the-Art (SoA) context and the associated technology barriers. The emphasis of the authors is on developing these analysis tools for application at the structural level. Hence, modeling of damage progression is undertaken at the mesoscale, where the plies of a laminate are represented as a homogeneous orthotropic continuum. The aim of the present effort is to establish the ranges of validity of available models, to identify technology barriers, and to establish the foundations of future investigation efforts. These are the necessary steps towards accurate and robust simulations that can replace some of the expensive and time-consuming "building block" tests that are currently required for the design and certification of aerospace structures.

  4. Automated migration analysis based on cell texture: method & reliability

    Directory of Open Access Journals (Sweden)

    Chittenden Thomas W

    2005-03-01

    Full Text Available Abstract Background In this paper, we present and validate a way to measure automatically the extent of cell migration based on automated examination of a series of digital photographs. It was designed specifically to identify the impact of Second Hand Smoke (SHS) on endothelial cell migration but has broader applications. The analysis has two stages: (1) preprocessing of image texture, and (2) migration analysis. Results The output is a graphic overlay that indicates the front lines of cell migration superimposed on each original image, with automated reporting of the distance traversed vs. time. Expert comparison with manual placement of the leading edge shows complete equivalence of automated vs. manual leading-edge definition for cell migration measurement. Conclusion Our method is indistinguishable from careful manual determination of cell front lines, with the advantages of full automation, objectivity, and speed.
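A toy version of the migration-analysis stage can clarify what "front line" and "distance traversed" mean here: given a binary mask from a texture-preprocessing step (not reproduced), the leading edge in each row is the furthest occupied column, and the advance between two frames is the difference. The masks and numbers below are invented for illustration.

```python
import numpy as np

# Leading-edge extraction from a binary cell mask, row by row.
def leading_edge(mask):
    """Column index of the migration front in each row of a binary mask;
    rows with no cells report -1."""
    cols = np.arange(mask.shape[1])
    return np.where(mask.any(axis=1), (mask * cols).max(axis=1), -1)

frame0 = np.zeros((4, 10), dtype=bool); frame0[:, :3] = True  # cells in cols 0-2
frame1 = np.zeros((4, 10), dtype=bool); frame1[:, :6] = True  # cells in cols 0-5
advance = leading_edge(frame1) - leading_edge(frame0)         # pixels per row
```

Dividing the advance by the time between photographs gives the distance-vs.-time reporting the abstract describes.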

  5. Circuit Distortion Analysis Based on the Simplified Newton's Method

    Directory of Open Access Journals (Sweden)

    M. M. Gourary

    2011-01-01

    Full Text Available A new computational technique for distortion analysis of nonlinear circuits is presented. The new technique is applicable to the same class of circuits, namely, weakly nonlinear and time-varying circuits, as the periodic Volterra series. However, unlike the Volterra series, it does not require the computation of the second and third derivatives of device models. The new method is computationally efficient compared with a complete multitone nonlinear steady-state analysis such as harmonic balance. Moreover, the new technique naturally allows computing and characterizing the contributions of individual circuit components to the overall circuit distortion. This paper presents the theory of the new technique, a discussion of the numerical aspects, and numerical results.

  6. Modern wing flutter analysis by computational fluid dynamics methods

    Science.gov (United States)

    Cunningham, Herbert J.; Batina, John T.; Bennett, Robert M.

    1988-01-01

    The application and assessment of the recently developed CAP-TSD transonic small-disturbance code for flutter prediction is described. The CAP-TSD code has been developed for aeroelastic analysis of complete aircraft configurations and was previously applied to the calculation of steady and unsteady pressures with favorable results. Generalized aerodynamic forces and flutter characteristics are calculated and compared with linear theory results and with experimental data for a 45 deg sweptback wing. These results are in good agreement with the experimental flutter data, which is a first step toward validating CAP-TSD for general transonic aeroelastic applications. The paper presents these results and comparisons along with general remarks regarding modern wing flutter analysis by computational fluid dynamics methods.

  7. Methods and criteria for safety analysis (FIN L2535)

    International Nuclear Information System (INIS)

    In response to the NRC request for a proposal dated October 20, 1992, Westinghouse Savannah River Company (WSRC) submits this proposal to provide contractual assistance for FIN L2535, "Methods and Criteria for Safety Analysis," as specified in the Statement of Work attached to the request for proposal. The Statement of Work involves development of safety analysis guidance for NRC licensees, arranging a workshop on this guidance, and revising NRC Regulatory Guide 3.52. This response to the request for proposal offers for consideration the following advantages of WSRC in performing this work: experience; qualification of personnel and resource commitment; technical and organizational approach; mobilization plan; and key personnel and resumes. In addition, the following items required by the NRC are attached: Schedule II, Savannah River Site - Job Cost Estimate; NRC Form 189, Project and Budget Proposal for NRC Work, page 1; NRC Form 189, Project and Budget Proposal for NRC Work, page 2; Project Description

  8. New pulse shape analysis method with multi-shaping amplifiers

    International Nuclear Information System (INIS)

    A novel pulse-shape-analysis method that uses 'similarity' to recognize an individual pulse shape is presented. We obtain four pulse heights by using four linear amplifiers with time constants of 0.5, 2, 3 and 6 μs, and treat the combination of the four pulse heights as a pattern vector. Each pulse shape is analyzed using the similarity. The method has been applied to improving the characteristics of a CdZnTe semiconductor detector (eV Products 180.5.5.5s, 5x5x5 mm). A CdZnTe semiconductor detector has prominent properties that are desirable in a radiation detector: its high atomic numbers give a larger detection efficiency for X or gamma rays than that of other semiconductor detectors such as Si or Ge, and its large forbidden band gap energy permits room-temperature operation. However, as is common with other compound semiconductor materials, the pulse shapes from CdZnTe detectors differ from event to event depending on the position of the radiation interaction, because of the different mobilities of holes and electrons and the short lifetime of the holes due to trapping in the bulk. We corrected each pulse height by analyzing the pulse shapes with the similarity and compensating accordingly. After the correction procedure, characteristics of the energy spectrum of the CdZnTe semiconductor detector, such as the peak-to-valley ratio and the photopeak efficiency, were improved. The results are tabulated. This method is simple and useful for pulse shape analysis and can be used in many other applications.
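One natural choice of 'similarity' between four-component pattern vectors is the cosine measure, sketched below. The abstract does not define its measure precisely, and the pulse-height values here are invented, so this is an illustration of the pattern-vector idea rather than the authors' exact algorithm.

```python
import numpy as np

def similarity(v, ref):
    """Cosine similarity between a pulse-height pattern vector and a
    reference shape; 1.0 means identical shape regardless of amplitude."""
    v, ref = np.asarray(v, dtype=float), np.asarray(ref, dtype=float)
    return float(v @ ref / (np.linalg.norm(v) * np.linalg.norm(ref)))

# Pulse heights from four shaping amplifiers (0.5, 2, 3 and 6 us time
# constants); the numbers are illustrative, not measured data.
reference = [0.62, 0.95, 1.00, 0.98]     # full-charge-collection shape
event     = [0.31, 0.475, 0.50, 0.49]    # same shape, half the amplitude
distorted = [0.90, 0.70, 0.55, 0.30]     # hole-trapping-like shape
```

An event whose shape matches the reference scores 1 at any amplitude, while a trapping-distorted pulse scores lower and can be flagged for height correction.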

  9. Bayesian methods for the design and analysis of noninferiority trials.

    Science.gov (United States)

    Gamalo-Siebers, Margaret; Gao, Aijun; Lakshminarayanan, Mani; Liu, Guanghan; Natanegara, Fanni; Railkar, Radha; Schmidli, Heinz; Song, Guochen

    2016-01-01

    The gold standard for evaluating treatment efficacy of a medical product is a placebo-controlled trial. However, when the use of placebo is considered to be unethical or impractical, a viable alternative for evaluating treatment efficacy is through a noninferiority (NI) study where a test treatment is compared to an active control treatment. The minimal objective of such a study is to determine whether the test treatment is superior to placebo. An assumption is made that if the active control treatment remains efficacious, as was observed when it was compared against placebo, then a test treatment that has comparable efficacy with the active control, within a certain range, must also be superior to placebo. Because of this assumption, the design, implementation, and analysis of NI trials present challenges for sponsors and regulators. In designing and analyzing NI trials, substantial historical data are often required on the active control treatment and placebo. Bayesian approaches provide a natural framework for synthesizing the historical data in the form of prior distributions that can effectively be used in the design and analysis of an NI clinical trial. Despite a flurry of recent research activities in the area of Bayesian approaches in medical product development, there are still substantial gaps in recognition and acceptance of Bayesian approaches in NI trial design and analysis. The Bayesian Scientific Working Group of the Drug Information Association provides a coordinated effort to target the education and implementation issues on Bayesian approaches for NI trials. In this article, we provide a review of both frequentist and Bayesian approaches in NI trials, and elaborate on the implementation of two common Bayesian methods, the hierarchical prior method and the meta-analytic-predictive approach. Simulations are conducted to investigate the properties of the Bayesian methods, and some real clinical trial examples are presented for illustration.
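The normal-normal calculation below sketches how a prior synthesized from historical data enters an NI analysis: the posterior probability that the treatment difference clears the margin is compared with a decision threshold. All numbers are hypothetical, and this conjugate calculation is a deliberately simplified stand-in for the hierarchical and meta-analytic-predictive priors discussed in the article.

```python
import math

# Normal-normal sketch of a Bayesian noninferiority (NI) assessment.
def ni_probability(diff_mean, diff_se, margin, prior_mean=0.0, prior_se=10.0):
    """Posterior P(test - control > -margin) with a normal prior on the
    treatment difference (e.g. built from historical trials) and a
    normal likelihood summarizing the current trial."""
    w_prior, w_data = prior_se ** -2, diff_se ** -2        # precisions
    post_var = 1.0 / (w_prior + w_data)
    post_mean = post_var * (w_prior * prior_mean + w_data * diff_mean)
    z = (-margin - post_mean) / post_var ** 0.5
    return 0.5 * math.erfc(z / math.sqrt(2.0))             # P(X > -margin)

# Observed difference -0.5 (SE 0.8) against a hypothetical NI margin of 2.0
p = ni_probability(diff_mean=-0.5, diff_se=0.8, margin=2.0)
```

A tighter prior (smaller prior_se), justified by strong historical data on the active control, pulls the posterior toward the historical estimate and can make the NI decision with fewer current-trial patients; that trade-off is exactly why the borrowing must be done carefully.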

  10. Applications of Automation Methods for Nonlinear Fracture Test Analysis

    Science.gov (United States)

    Allen, Phillip A.; Wells, Douglas N.

    2013-01-01

    Using automated and standardized computer tools to calculate the pertinent test result values has several advantages: 1. allowing high-fidelity solutions to complex nonlinear phenomena that would be impractical to express in written equation form; 2. eliminating errors associated with the interpretation and programming of analysis procedures from the text of test standards; 3. lessening the need for expertise in the areas of solid mechanics, fracture mechanics, numerical methods, and/or finite element modeling to achieve sound results; and 4. providing one computer tool and/or one set of solutions for all users for a more "standardized" answer. In summary, this approach allows a non-expert with rudimentary training to get the best practical solution based on the latest understanding with minimum difficulty. Other existing ASTM standards that cover complicated phenomena use standard computer programs: 1. ASTM C1340/C1340M-10 - Standard Practice for Estimation of Heat Gain or Loss Through Ceilings Under Attics Containing Radiant Barriers by Use of a Computer Program; 2. ASTM F2815 - Standard Practice for Chemical Permeation through Protective Clothing Materials: Testing Data Analysis by Use of a Computer Program; 3. ASTM E2807 - Standard Specification for 3D Imaging Data Exchange, Version 1.0. The verification, validation, and round-robin processes required of a computer tool closely parallel the methods used to ensure the validity of equations included in test standards. The use of automated analysis tools allows the creation and practical implementation of advanced fracture mechanics test standards that capture the physics of a nonlinear fracture mechanics problem without adding undue burden or expense for the user. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.

  11. Analysis of medieval glass by X-ray spectrometric methods

    International Nuclear Information System (INIS)

    Systematic investigation of the 16th century glasses of Ljubljana is motivated by the spread of Italian glass-working technology into central Europe. The glass was probed using the external-beam PIXE technique because of its non-destructiveness. Initial test measurements were performed by the methods of PIGE, XRF, electron probe microanalysis, and LA-ICP-MS. The PIXE data were evaluated statistically using principal component analysis and by minimizing the stress function. The manufacturing procedures were indicated by the Rb/Sr content of the glass: the investigated glasses were mainly produced with the ash (not potash) of halophytic plants.

  12. CAD-Oriented Noise Analysis Method of Nonlinear Microwave Circuits

    Institute of Scientific and Technical Information of China (English)

    WANGJun; TANGGaodi; CHENHuilian

    2003-01-01

    A general method is introduced which is capable of making accurate, quantitative predictions about the noise of different types of nonlinear microwave circuits. This new approach also elucidates several design criteria for making it suitable for CAD-oriented analysis, via identifying the mechanisms by which intrinsic device noise and external noise sources contribute to the total equivalent noise. In particular, it explains the details of how the noise spectrum at the port of interest is obtained. The theory also naturally leads to additional important design insights. In the illustrative experiments, excellent agreement among theory, simulations, and measurements is observed.

  13. Generalized Method of Variational Analysis for 3-D Flow

    Institute of Scientific and Technical Information of China (English)

    兰伟仁; 黄思训; 项杰

    2004-01-01

    The generalized method of variational analysis (GMVA) suggested for 2-D wind observations by Huang et al. is extended to 3-D cases. Just as in the 2-D cases, the regularization idea is applied, but due to the complexity of the 3-D cases, the vertical vorticity is taken as a stable functional. The results indicate that wind observations can be both variationally optimized and filtered. The efficiency of GMVA is also checked in a numerical test. Finally, 3-D wind observations with random disturbances are processed by GMVA after being filtered.

  14. Modelling application for cognitive reliability and error analysis method

    Directory of Open Access Journals (Sweden)

    Fabio De Felice

    2013-10-01

    Full Text Available The automation of production systems has delegated to machines the execution of highly repetitive and standardized tasks. In the last decade, however, the failure of the fully automatic factory model has led to partially automated configurations of production systems. In this scenario, the centrality and responsibility of the role entrusted to human operators are heightened, because the role requires problem-solving and decision-making ability. Thus, the human operator is the core of a cognitive process that leads to decisions, and the safety of the whole system depends on his or her reliability. The aim of this paper is to propose a modelling application for the cognitive reliability and error analysis method.

  15. Analysis of PCA Method in Image Recognition with MATLAB

    Institute of Scientific and Technical Information of China (English)

    ZHAO Ping

    2014-01-01

    The growing need for effective biometric identification is widely acknowledged. Human face recognition is an important area in the field of biometrics. It has been an active area of research for several decades, but still remains a challenging problem because of the complexity of the human face. Principal Component Analysis (PCA), or the eigenface method, is a de facto standard in human face recognition. In this paper, the principle of PCA is introduced, and the compressing and rebuilding of the image is accomplished with a MATLAB program.
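The compress-and-rebuild cycle the paper implements in MATLAB can be expressed compactly with a singular value decomposition, which is the linear algebra behind the eigenface method. The sketch below uses Python with random vectors standing in for face images; the paper's own program is not reproduced.

```python
import numpy as np

# PCA compression and reconstruction via SVD (eigenface-style).
rng = np.random.default_rng(0)
faces = rng.normal(size=(40, 64))        # 40 "images" of 64 pixels each
mean = faces.mean(axis=0)
centered = faces - mean

U, s, Vt = np.linalg.svd(centered, full_matrices=False)
k = 10
eigenfaces = Vt[:k]                      # top-k principal axes ("eigenfaces")

codes = centered @ eigenfaces.T          # compress: 64 -> 10 numbers/image
rebuilt = codes @ eigenfaces + mean      # approximate reconstruction

err_k = np.linalg.norm(faces - rebuilt)                             # lossy
err_full = np.linalg.norm(faces - ((centered @ Vt.T) @ Vt + mean))  # lossless
```

Keeping all components reconstructs the data exactly; truncating to k components is the compression step, and the k-dimensional codes are what a recognizer actually compares.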

  16. Roof collapse of shallow tunnels with limit analysis method

    Institute of Scientific and Technical Information of China (English)

    YANG Xiao-li; LONG Ze-xiang

    2015-01-01

    A new failure mechanism is proposed to analyze roof collapse based on a nonlinear failure criterion. The limit analysis approach and the variational principle are used to obtain analytical findings concerning the stability of a potential roof. A parametric study is then carried out to derive the influence of the corresponding parameters on the collapsing shape, which is of considerable engineering significance for guiding tunnel excavations. In comparison with existing results, the findings show the agreement and validity of the proposed method. The actual collapse in certain shallow tunnels is well in accordance with the proposed failure mechanism.

  17. Methods for analysis of citrinin in human blood and urine.

    Science.gov (United States)

    Blaszkewicz, Meinolf; Muñoz, Katherine; Degen, Gisela H

    2013-06-01

    Citrinin (CIT), produced by several Penicillium, Aspergillus, and Monascus species, has been detected as a contaminant in feeds, grains, and other food commodities. CIT can co-occur with ochratoxin A (OTA), a mycotoxin also known for its nephrotoxicity, and this raises concern regarding possible combined effects. But, in contrast to OTA, data on CIT contamination in foods for human consumption are scarce, and CIT biomonitoring has not been conducted so far due to a lack of suitable methods for human specimens. Thus, it was the aim of the present study to develop sensitive methods for the analysis of CIT in human blood and urine to investigate human exposure. To this end, we assessed different methods of sample preparation and instrumental analysis for these matrices. Clean-up of blood plasma by protein precipitation followed by LC-MS/MS-based analysis allowed robust detection of CIT (LOD 0.07 ng/mL, LOQ 0.15 ng/mL). For urine, sample clean-up by an immunoaffinity column (CitriTest®) proved to be clearly superior to SPE with RP18 material for subsequent analysis by LC-MS/MS. For CIT and its metabolite dihydrocitrinone (HO-CIT), the LOD and LOQ determined by external calibration curves in matrix were 0.02 and 0.05 ng/mL for CIT, and those for HO-CIT were 0.05 and 0.1 ng/mL urine. The newly developed method was applied in a small pilot study: CIT was present in all plasma samples from 8 German adults, at concentrations ranging from 0.11 to 0.26 ng/mL. The molar (nM) concentrations of CIT are similar to those measured for OTA in these samples as a result of dietary mycotoxin intake. CIT was detected in 8/10 urines (from 4 adults and 6 infants) in a range of 0.16-0.79 ng/mL, and HO-CIT was present in 5/10 samples at similar concentrations. Thus, CIT is excreted in urine both as the parent compound and as the metabolite. These first results in humans point to the need for further studies on CIT exposure. PMID:23354378
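The LOD and LOQ figures quoted above come from external calibration curves. A common convention (LOD = 3.3 s/slope, LOQ = 10 s/slope, with s the residual standard deviation of the linear fit) is sketched below with invented calibration data; the article may well use a different convention, so this illustrates the idea rather than reproducing its numbers.

```python
import numpy as np

# LOD/LOQ from a linear external calibration curve.
def lod_loq(conc, response):
    conc = np.asarray(conc, dtype=float)
    response = np.asarray(response, dtype=float)
    slope, intercept = np.polyfit(conc, response, 1)     # linear fit
    resid = response - (slope * conc + intercept)
    s = np.sqrt(np.sum(resid ** 2) / (len(conc) - 2))    # residual std dev
    return 3.3 * s / slope, 10.0 * s / slope

conc = [0.05, 0.1, 0.2, 0.5, 1.0, 2.0]          # ng/mL standards (invented)
resp = [520, 1010, 2040, 4980, 10100, 19900]    # peak areas (invented)
lod, loq = lod_loq(conc, resp)
```

Preparing the standards in matrix, as the study does, keeps matrix effects inside the calibration rather than biasing the sample results.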

  18. Analysis of electroperforated materials using the quadrat counts method

    Energy Technology Data Exchange (ETDEWEB)

    Miranda, E; Garzon, C; Garcia-Garcia, J [Departament d' Enginyeria Electronica, Universitat Autonoma de Barcelona, 08193 Bellaterra, Barcelona (Spain); MartInez-Cisneros, C; Alonso, J, E-mail: enrique.miranda@uab.cat [Departament de Quimica AnalItica, Universitat Autonoma de Barcelona, 08193 Bellaterra, Barcelona (Spain)

    2011-06-23

    The electroperforation distribution in thin porous materials is investigated using the quadrat counts method (QCM), a classical statistical technique aimed at evaluating the deviation from complete spatial randomness (CSR). Perforations are created by means of electrical discharges generated by needle-like tungsten electrodes. The objective of perforating a thin porous material is to enhance its air permeability, a critical issue in many industrial applications involving paper, plastics, textiles, etc. Using image analysis techniques and specialized statistical software, it is shown that the perforation locations follow, beyond a certain length scale, a homogeneous 2D Poisson distribution.
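At its core the QCM bins point locations into a grid of quadrats and compares the variance-to-mean ratio of the counts with the value 1 expected for a homogeneous Poisson process. The point patterns below are simulated stand-ins, not the paper's perforation data.

```python
import numpy as np

# Quadrat counts sketch: index of dispersion of quadrat counts and the
# associated chi-square statistic for testing CSR.
def quadrat_dispersion(x, y, nx, ny, extent=1.0):
    counts, _, _ = np.histogram2d(x, y, bins=[nx, ny],
                                  range=[[0, extent], [0, extent]])
    c = counts.ravel()
    index = c.var(ddof=1) / c.mean()     # ~1 under CSR, >1 if clustered
    chi2 = index * (c.size - 1)          # compare to chi-square, n-1 dof
    return index, chi2

rng = np.random.default_rng(2)
x, y = rng.uniform(0, 1, 1000), rng.uniform(0, 1, 1000)        # CSR pattern
idx_csr, _ = quadrat_dispersion(x, y, 10, 10)

xc, yc = rng.uniform(0, 0.1, 1000), rng.uniform(0, 0.1, 1000)  # clustered
idx_clu, _ = quadrat_dispersion(xc, yc, 10, 10)
```

The quadrat size sets the length scale of the test, which is why the paper's CSR conclusion holds only "beyond a certain length scale".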

  19. Analysis of TRIGA reactor thermal power calibration method

    International Nuclear Information System (INIS)

    Analysis of the thermal power method for the nuclear instrumentation of the TRIGA reactor in Ljubljana is described. Thermal power calibration was performed at different power levels and under different conditions. Different heat loss processes from the reactor pool to the surroundings are considered. It is shown that the use of a proper calorimetric calibration procedure and of heat loss corrections improves the accuracy of the measurement. To correct for the position of the control rods, perturbation factors are introduced. It is shown that the use of the perturbation factors enables power readings from the nuclear instrumentation with better accuracy than without corrections. (author)
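The calorimetric calibration rests on the heat balance P = m·c·dT/dt plus a heat-loss correction, which the sketch below makes concrete. The pool mass, heat-up rate, and loss term are illustrative values, not the Ljubljana TRIGA tank parameters.

```python
# Calorimetric reactor power estimate from the pool heat-up rate.
def thermal_power(mass_kg, dT_dt, heat_loss_w=0.0, c_p=4186.0):
    """Thermal power in watts: the heat absorbed by the water
    (m * c_p * dT/dt, with dT/dt in K/s) plus the heat lost from
    the pool to the surroundings during the measurement."""
    return mass_kg * c_p * dT_dt + heat_loss_w

# 20 t of water warming 0.5 K per minute, with 5 kW lost to surroundings
p = thermal_power(20e3, 0.5 / 60.0, heat_loss_w=5e3)
```

Ignoring the loss term biases the calibration low, which is why the abstract stresses the heat-loss corrections.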

  20. Analysis method set up to check against adulterated export honey

    International Nuclear Information System (INIS)

    Over the past few years, North America has experienced occasional problems with the adulteration of honey, mainly by the addition of other, cheaper sugars to increase bulk and lower production costs. The main addition was usually high-fructose corn syrup, which has a chemical composition similar to that of honey. As a consequence of this type of adulteration, a method for its detection was developed using isotope ratio mass spectrometry (IRMS). This was later refined to be more sensitive and is now specified as an Official Test. The Institute of Geological and Nuclear Sciences has now set up the analysis method to the international criteria at the Rafter Stable Isotope Laboratory in Lower Hutt. 2 refs
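The refined IRMS test works as an internal-standard comparison: the carbon-13 delta value of whole honey is compared with that of its own extracted protein, because C4-derived syrups such as HFCS are isotopically heavier and shift the honey value away from the protein value. The -9.7 permil C4 reference value and the 7% action limit below follow the commonly published convention for this test; the sample delta values are invented, and the exact criteria of the Official Test referenced above may differ.

```python
# Internal-standard IRMS check for C4 sugar addition to honey.
DELTA_C4 = -9.7          # typical delta-13C of C4 sugars (permil vs VPDB)
ACTION_LIMIT = 7.0       # % apparent C4 sugar commonly taken as adulteration

def apparent_c4_sugar(delta_protein, delta_honey):
    """Apparent C4 sugar content (%) from the delta-13C of the extracted
    honey protein (internal standard) and of the whole honey."""
    return (delta_protein - delta_honey) / (delta_protein - DELTA_C4) * 100.0

pure = apparent_c4_sugar(-25.1, -25.3)    # protein and honey agree
spiked = apparent_c4_sugar(-25.1, -22.0)  # honey shifted toward the C4 value
```

Using the honey's own protein as the reference cancels the natural floral variation in delta-13C that would otherwise mask moderate syrup additions.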