WorldWideScience

Sample records for bioimage analysis methods

  1. A biosegmentation benchmark for evaluation of bioimage analysis methods

    Directory of Open Access Journals (Sweden)

    Kvilekval Kristian

    2009-11-01

    Full Text Available Abstract Background We present a biosegmentation benchmark that includes infrastructure, datasets with associated ground truth, and validation methods for biological image analysis. The primary motivation for creating this resource is that it is very difficult, if not impossible, for an end user to choose from the wide range of segmentation methods available in the literature for a particular bioimaging problem. No single algorithm is likely to be equally effective on a diverse set of images, and each method has its own strengths and limitations. We hope that our benchmark resource will be of considerable help both to bioimaging researchers looking for novel image processing methods and to image processing researchers exploring the application of their methods to biology. Results Our benchmark consists of different classes of images and ground truth data, ranging in scale from subcellular and cellular to tissue level, each of which poses its own set of challenges to image analysis. The associated ground truth data can be used to evaluate the effectiveness of different methods, to improve methods, and to compare results. Standard evaluation methods and some analysis tools are integrated into a database framework that is available online at http://bioimage.ucsb.edu/biosegmentation/. Conclusion This online benchmark will facilitate the integration and comparison of image analysis methods for bioimages. While the primary focus is on biological images, we believe that the dataset and infrastructure will be of interest to researchers and developers working with biological image analysis, image segmentation and object tracking in general.

  2. Robust normalization protocols for multiplexed fluorescence bioimage analysis.

    Science.gov (United States)

    Ahmed Raza, Shan E; Langenkämper, Daniel; Sirinukunwattana, Korsuk; Epstein, David; Nattkemper, Tim W; Rajpoot, Nasir M

    2016-01-01

    The study of the mapping and interaction of co-localized proteins at a sub-cellular level is important for understanding complex biological phenomena. One of the recent techniques to map co-localized proteins is to use standard immuno-fluorescence microscopy in a cyclic manner (Nat Biotechnol 24:1270-8, 2006; Proc Natl Acad Sci 110:11982-7, 2013). Unfortunately, these techniques suffer from variability in the intensity and positioning of signals from protein markers within a run and across different runs. Therefore, it is necessary to standardize protocols for preprocessing the multiplexed bioimaging (MBI) data from multiple runs to a comparable scale before any further analysis can be performed. In this paper, we compare various normalization protocols and propose, on the basis of the obtained results, a robust normalization technique that produces consistent results on MBI data collected from different runs using the Toponome Imaging System (TIS). Normalization results produced by the proposed method on a sample TIS data set from colorectal cancer patients were ranked favorably by two pathologists and two biologists. We show that the proposed method produces higher between-class Kullback-Leibler (KL) divergence and lower within-class KL divergence on a distribution of cell phenotypes from colorectal cancer and histologically normal samples. PMID:26949415
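    As an illustration of the evaluation criterion above, the following sketch computes discrete KL divergence on hypothetical cell-phenotype histograms (the values and class labels are invented, not taken from the paper); a normalization that works well should push between-class divergence up and within-class divergence down.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(p || q) between two discrete distributions."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

# Hypothetical phenotype histograms: two tumour samples vs one normal sample.
tumour_a = [40, 30, 20, 10]
tumour_b = [38, 32, 18, 12]
normal   = [10, 15, 30, 45]

within  = kl_divergence(tumour_a, tumour_b)   # same class: should be small
between = kl_divergence(tumour_a, normal)     # different classes: should be large
print(within < between)  # → True for these illustrative histograms
```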

  3. BioImage Suite: An integrated medical image analysis suite: An update

    OpenAIRE

    Papademetris, Xenophon; Jackowski, Marcel P; Rajeevan, Nallakkandi; DiStasio, Marcello; Okuda, Hirohito; Constable, R. Todd; Staib, Lawrence H.

    2006-01-01

    BioImage Suite is an NIH-supported medical image analysis software suite developed at Yale. It leverages both the Visualization Toolkit (VTK) and the Insight Toolkit (ITK), and it includes many additional algorithms for image analysis, especially in the areas of segmentation, registration, diffusion-weighted image processing and fMRI analysis. BioImage Suite has a user-friendly interface developed in the Tcl scripting language. A final beta version is freely available for download.

  4. A Novel Approach for Information Content Retrieval and Analysis of Bio-Images using Datamining techniques

    OpenAIRE

    Ayyagari Sri Nagesh; G.P.Saradhi Varma; Govardhan, A.

    2012-01-01

    In the bio-medical image processing domain, content-based analysis and information retrieval of bio-images is critical for disease diagnosis. Content-Based Image Analysis and Information Retrieval (CBIAIR) has become a significant part of information retrieval technology. One challenge in this area is that the ever-increasing number of bio-images acquired through the digital world makes brute-force searching almost impossible. Medical Image structural objects content and object identific...

  5. Transforms and Operators for Directional Bioimage Analysis: A Survey.

    Science.gov (United States)

    Püspöki, Zsuzsanna; Storath, Martin; Sage, Daniel; Unser, Michael

    2016-01-01

    We give a methodology-oriented perspective on directional image analysis and rotation-invariant processing. We review the state of the art in the field and make connections with recent mathematical developments in functional analysis and wavelet theory. We unify our perspective within a common framework using operators. The intent is to provide image-processing methods that can be deployed in algorithms that analyze biomedical images with improved rotation invariance and high directional sensitivity. We start our survey with classical methods such as the directional gradient and the structure tensor. Then, we discuss how these methods can be improved with respect to robustness, invariance to geometric transformations (with a particular interest in scaling), and computational cost. To address robustness against noise, we move on to higher degrees of directional selectivity and discuss Hessian-based detection schemes. To present multiscale approaches, we explain the differences between Fourier filters, directional wavelets, curvelets, and shearlets. To reduce the computational cost, we address the problem of matching directional patterns with steerable filters, which allow arbitrary rotations and optimizations without discretizing the orientation. We define the property of steerability and give an introduction to the design of steerable filters. We cover the spectrum from simple steerable filters through pyramid schemes up to steerable wavelets. We also present illustrations of the design of steerable wavelets and their application to pattern recognition. PMID:27207363
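    The steerability property discussed above can be sketched in a few lines: a first-order derivative-of-Gaussian filter steered to any angle is just a cosine/sine mixture of its x and y basis kernels, so no orientation discretization is needed (kernel size and sigma below are arbitrary choices, not values from the survey).

```python
import numpy as np

def gaussian_derivatives(size=9, sigma=2.0):
    """x- and y-derivative-of-Gaussian kernels (the steerable basis pair)."""
    r = np.arange(size) - size // 2
    x, y = np.meshgrid(r, r)
    g = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    gx = -x / sigma**2 * g
    gy = -y / sigma**2 * g
    return gx, gy

def steered_kernel(theta, gx, gy):
    """First-order steerable filter oriented at angle theta (radians)."""
    return np.cos(theta) * gx + np.sin(theta) * gy

gx, gy = gaussian_derivatives()
# Steering at 0 and pi/2 recovers the basis filters exactly:
assert np.allclose(steered_kernel(0.0, gx, gy), gx)
assert np.allclose(steered_kernel(np.pi / 2, gx, gy), gy)
# An oriented response at any angle is thus a linear mix of only two
# convolution results, one per basis kernel.
```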

  6. A Novel Approach for Information Content Retrieval and Analysis of Bio-Images using Datamining techniques

    Directory of Open Access Journals (Sweden)

    Ayyagari Sri Nagesh

    2012-11-01

    Full Text Available In the bio-medical image processing domain, content-based analysis and information retrieval of bio-images is critical for disease diagnosis. Content-Based Image Analysis and Information Retrieval (CBIAIR) has become a significant part of information retrieval technology. One challenge in this area is that the ever-increasing number of bio-images acquired through the digital world makes brute-force searching almost impossible. The structural object content of medical images and object identification play a significant role in image content analysis and information retrieval. There are three fundamental components of content-based bio-image retrieval: visual-feature extraction, multi-dimensional indexing, and the retrieval process. Each image has three content features, colour, texture and shape, of which colour and texture are important visual features used in content-based image retrieval to improve results. In this paper, we present an effective image retrieval system, CBIAIR (Content-Based Image Analysis and Information Retrieval), that uses texture, shape and colour features. First, we develop a new pixel-based texture pattern feature for the CBIAIR system. We then use a semantic colour feature for colour-based retrieval, while shape-based feature selection is done using an existing technique. For retrieval, these features are extracted from the query image and matched against the feature library using a feature-weighted distance. All feature vectors are stored in the database using an indexing procedure. Finally, the relevant images whose matching distance is less than a predefined threshold are retrieved from the image database after applying a K-NN classifier.
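    The retrieval step described above (feature-weighted distance against a feature library, then a threshold on the matching distance) might be sketched as follows; the feature vectors, weights and threshold are invented for illustration and do not come from the paper.

```python
import numpy as np

def weighted_distance(q, f, w):
    """Feature-weighted Euclidean distance between a query and library vectors."""
    return np.sqrt(np.sum(w * (q - f) ** 2, axis=-1))

def retrieve(query, library, weights, threshold, k=3):
    """Indices of the k nearest library images whose distance is below threshold."""
    d = weighted_distance(query, library, weights)
    candidates = np.where(d < threshold)[0]
    order = candidates[np.argsort(d[candidates])]
    return order[:k].tolist()

# Hypothetical 4-D feature vectors (texture, colour, shape components).
library = np.array([[0.1, 0.2, 0.3, 0.4],
                    [0.9, 0.8, 0.7, 0.6],
                    [0.1, 0.25, 0.3, 0.45]])
weights = np.array([0.4, 0.3, 0.2, 0.1])   # texture weighted most heavily
query = np.array([0.1, 0.2, 0.3, 0.4])
print(retrieve(query, library, weights, threshold=0.5))  # → [0, 2]
```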

  7. Bioimage analysis of Shigella infection reveals targeting of colonic crypts.

    Science.gov (United States)

    Arena, Ellen T; Campbell-Valois, Francois-Xavier; Tinevez, Jean-Yves; Nigro, Giulia; Sachse, Martin; Moya-Nilges, Maryse; Nothelfer, Katharina; Marteyn, Benoit; Shorte, Spencer L; Sansonetti, Philippe J

    2015-06-23

    Few studies within the pathogenic field have used advanced imaging and analytical tools to quantitatively measure pathogenicity in vivo. In this work, we present a novel approach for the investigation of host-pathogen processes based on medium-throughput 3D fluorescence imaging. The guinea pig model for Shigella flexneri invasion of the colonic mucosa was used to monitor the infectious process over time with GFP-expressing S. flexneri. A precise quantitative imaging protocol was devised to follow individual S. flexneri in a large tissue volume. An extensive dataset of confocal images was obtained and processed to extract specific quantitative information regarding the progression of S. flexneri infection in an unbiased and exhaustive manner. Specific parameters included the analysis of S. flexneri positions relative to the epithelial surface, S. flexneri density within the tissue, and volume of tissue destruction. In particular, at early time points, there was a clear association of S. flexneri with crypts, key morphological features of the colonic mucosa. Numerical simulations based on random bacterial entry confirmed the bias of experimentally measured S. flexneri for early crypt targeting. The application of a correlative light and electron microscopy technique adapted for thick tissue samples further confirmed the location of S. flexneri within colonocytes at the mouth of crypts. This quantitative imaging approach is a novel means to examine host-pathogen systems in a tailored and robust manner, inclusive of the infectious agent. PMID:26056271
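    The random-entry null model mentioned above can be sketched as a small Monte Carlo simulation: if bacteria entered at uniformly random positions, the expected crypt-associated fraction would simply match the crypt area fraction. The crypt area fraction and observed fraction below are placeholders for illustration only, not measurements from the study.

```python
import random

def random_entry_fraction(crypt_area_fraction, n_bacteria=10_000, seed=1):
    """Null model: each bacterium lands at a uniformly random position, so it
    is crypt-associated with probability equal to the crypt area fraction."""
    rng = random.Random(seed)
    hits = sum(rng.random() < crypt_area_fraction for _ in range(n_bacteria))
    return hits / n_bacteria

# Hypothetical numbers for illustration (not from the paper):
crypt_area_fraction = 0.15   # crypts cover 15% of the epithelial surface
observed_fraction = 0.40     # measured fraction of crypt-associated bacteria

expected = random_entry_fraction(crypt_area_fraction)
# An observed fraction well above the null expectation indicates crypt targeting.
print(observed_fraction > expected)  # → True
```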

  8. Chapter 17: bioimage informatics for systems pharmacology.

    Directory of Open Access Journals (Sweden)

    Fuhai Li

    2013-04-01

    Full Text Available Recent advances in automated high-resolution fluorescence microscopy and robotic handling have made possible the systematic and cost-effective study of diverse morphological changes within a large population of cells under a variety of perturbations, e.g., drugs, compounds, metal catalysts, and RNA interference (RNAi). Cell-population-based studies deviate from conventional microscopy studies on a few cells and can provide stronger statistical power for drawing experimental observations and conclusions. However, it is challenging to manually extract and quantify phenotypic changes from the large amounts of complex image data generated. Thus, bioimage informatics approaches are needed to rapidly and objectively quantify and analyze the image data. This paper provides an overview of the bioimage informatics challenges and approaches in image-based studies for drug and target discovery. The concepts and capabilities of image-based screening are first illustrated by a few practical examples investigating different kinds of phenotypic changes caused by drugs, compounds, or RNAi. The bioimage analysis approaches, including object detection, segmentation, and tracking, are then described. Subsequently, the quantitative features, phenotype identification, and multidimensional profile analysis for profiling the effects of drugs and targets are summarized. Moreover, a number of publicly available software packages for bioimage informatics are listed for further reference. It is expected that this review will help readers, including those without bioimage informatics expertise, understand the capabilities, approaches, and tools of bioimage informatics and apply them to advance their own studies.

  9. Upconverting nanophosphors for bioimaging

    Energy Technology Data Exchange (ETDEWEB)

    Lim, Shuang Fang; Zhuo Rui [Department of MAE, Princeton University, Princeton, NJ 08544 (United States); Riehn, Robert [Department of Physics, North Carolina State University, Raleigh, NC 27695 (United States); Tung, Chih-kuan; Dalland, Joanna; Austin, Robert H [Department of Physics, Princeton University, Princeton, NJ 08544 (United States); Ryu, William S [Lewis-Sigler Institute for Integrative Genomics, Princeton University, Princeton, NJ 08544 (United States)

    2009-10-07

    Upconverting nanoparticles (UCNPs), when excited in the near-infrared (NIR) region, display anti-Stokes emission, whereby the emitted photon is higher in energy than the excitation photon. The material system achieves this by converting two or more infrared photons into visible photons. The use of infrared excitation benefits bioimaging because of its deeper penetration into biological tissues and the lack of autofluorescence. We demonstrate here sub-10 nm upconverting rare-earth-oxide UCNPs, synthesized by a combustion method, that can be stably suspended in water when amine-modified. The amine-modified UCNPs show specific surface immobilization onto patterned gold surfaces. Finally, the low toxicity of the UCNPs is verified by testing on the multi-cellular nematode C. elegans.
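    The anti-Stokes energy bookkeeping can be made concrete with a short calculation: combining the energies of two identical NIR photons (E = hc/λ) bounds the shortest visible wavelength the material can emit. The 980 nm excitation wavelength below is a common choice for UCNPs, assumed here for illustration rather than taken from the abstract.

```python
# Photon energy E = h * c / wavelength; combining n identical photons divides
# the reachable emission wavelength by n (ignoring non-radiative losses).
H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s

def photon_energy(wavelength_nm):
    """Energy in joules of a photon at the given wavelength."""
    return H * C / (wavelength_nm * 1e-9)

def upconverted_wavelength(wavelength_nm, n_photons=2):
    """Shortest emission wavelength reachable by combining n identical photons."""
    return wavelength_nm / n_photons

# Illustrative 980 nm NIR excitation:
print(round(upconverted_wavelength(980.0)))  # → 490 (blue-green visible light)
```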

  10. Quantitative analysis for radiation image measured by bio-image analyzer

    International Nuclear Information System (INIS)

    Bio-image analyzer is a system for detecting radiation images. In this system, the radiation image recorded on the imaging plate (coated with photostimulable phosphor on a polyester plate) is read out as light signals by laser-beam excitation, and the image data are processed by a computer. The system is mainly applied to the autoradiography of biological samples. In order to clarify the characteristics of the analyzer, the factors that affect the quantification of radiation images have been investigated. The photostimulable phosphor shows a fading phenomenon whose magnitude depends on the preservation temperature and period. After irradiation with 14C β-rays for a fixed time, the plates were stored for 1 hour to 14 days at 10°C to 40°C and then read out. The absolute output value, defined as the value unaffected by fading, was determined from the relation between irradiation time and output by extrapolating the time to zero. Calibration factors relative to this absolute value were calculated and expressed as a function of storage time and temperature. The fading effects after 204Tl β-ray and γ-ray irradiation were also examined, and the fading rates almost coincide with that of 14C β-rays. (author)
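    A minimal sketch of the fading correction described above, assuming a simple exponential fading model (the calibration factors in the paper are empirical functions of storage time and temperature; the fading rate used here is invented for illustration):

```python
import math

def fading_calibration_factor(storage_hours, rate_per_hour):
    """Fraction of the stored signal remaining after a given storage time,
    under an assumed exponential fading model (illustrative, not the
    analyzer's documented behaviour)."""
    return math.exp(-rate_per_hour * storage_hours)

def absolute_output(measured, storage_hours, rate_per_hour):
    """Correct a measured output back to its zero-storage-time value."""
    return measured / fading_calibration_factor(storage_hours, rate_per_hour)

# Hypothetical: a plate stored for 24 h at a temperature where the
# signal fades at about 1% per hour.
measured = 880.0
corrected = absolute_output(measured, 24, 0.01)
print(corrected > measured)  # → True: the correction restores the faded fraction
```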

  11. Semiquantitative fluorescence method for bioconjugation analysis.

    Science.gov (United States)

    Brasil, Aluízio G; Carvalho, Kilmara H G; Leite, Elisa S; Fontes, Adriana; Santos, Beate Saegesser

    2014-01-01

    Quantum dots (QDs) have been used as fluorescent probes in biological and medical fields such as bioimaging, bioanalytics, and immunofluorescence assays. For these applications, it is important to characterize the QD-protein bioconjugates. This chapter provides details of a versatile method to confirm quantum dot-protein conjugation, including the required materials and instrumentation, in order to perform a step-by-step semiquantitative analysis of the bioconjugation efficiency using fluorescence plate readings. Although the protocols to confirm QD-protein attachment shown here were developed for CdTe QDs coated with specific ligands and proteins, the principles are the same for other QD-protein bioconjugates. PMID:25103803

  12. Bio-imaging and visualization for patient-customized simulations

    CERN Document Server

    Luo, Xiongbiao; Li, Shuo

    2014-01-01

    This book contains the full papers presented at the MICCAI 2013 workshop on Bio-Imaging and Visualization for Patient-Customized Simulations (MWBIVPCS 2013). MWBIVPCS 2013 brought together researchers representing several fields, such as Biomechanics, Engineering, Medicine, Mathematics, Physics and Statistics. The contributions included in this book present and discuss new trends in those fields, using several methods and techniques, including the finite element method, similarity metrics, optimization processes, graphs, hidden Markov models, sensor calibration, fuzzy logic, data mining, cellular automata, active shape models, template matching and level sets. These serve as tools to address more efficiently different and timely applications involving signal and image acquisition, image processing and analysis, image segmentation, image registration and fusion, computer simulation, image-based modelling, simulation and surgical planning, image-guided robot-assisted surgery and image-based diagnosis. This boo...

  13. A computational framework for bioimaging simulation

    OpenAIRE

    Watabe, Masaki; Arjunan, Satya N. V.; Fukushima, Seiya; Iwamoto, Kazunari; Kozuka, Jun; Matsuoka, Satomi; Shindo, Yuki; Ueda, Masahiro; TAKAHASHI, Koichi

    2014-01-01

    Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in biological cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher level functions emerge from the combined action of biomolecules. However, there still remain formidable challenges in bridging the gap between bioimaging and mathematical ...

  14. Luminescent gold nanoparticles for bioimaging

    Science.gov (United States)

    Zhou, Chen

    Inorganic nanoparticles (NPs) with tunable and diverse material properties hold great potential as contrast agents for better disease management. Over the past decades, luminescent gold nanoparticles (AuNPs) with intrinsic emissions ranging from the visible to the near infrared have been synthesized and have emerged as a new class of fluorophores for bioimaging. This dissertation aims to fundamentally understand the structure-property relationships in luminescent AuNPs and to apply them as contrast agents to address critical challenges in bioimaging at both the in vitro and in vivo level. In Chapter 2, we describe the synthesis of ~20 nm polycrystalline AuNPs (pAuNPs), which integrate enhanced plasmonic and fluorescence properties into a single AuNP through the grain-size effect. The combination of these properties in one NP enabled AuNPs to serve as a multimodal contrast agent for in vitro optical microscopic imaging, making it possible to develop correlative microscopic imaging techniques. In Chapters 3-5, we propose a feasible approach to optimize the in vivo kinetics and clearance profile of nanoprobes for multimodal in vivo bioimaging applications using straightforward surface chemistry, with luminescent AuNPs as a model. Luminescent glutathione-coated AuNPs of ~2 nm were synthesized. Investigation of the biodistribution showed that these glutathione-coated AuNPs (GS-AuNPs) are stealthy to the reticuloendothelial system (RES) organs and are efficiently renally cleared, with only 3.7+/-1.9% and 0.3+/-0.1% accumulating in the liver and spleen, respectively, and over 65% of the injected dose cleared via the urine within the first 72 hours. In addition, ~2.5 nm NIR-emitting radioactive glutathione-coated [198Au]AuNPs (GS-[198Au]AuNPs) were synthesized for further evaluation of the pharmacokinetic profile of GS-AuNPs and potential multimodal imaging. The results showed that the GS-[198Au]AuNPs behave like small-molecule contrast agents in

  15. New horizons in biomagnetics and bioimaging

    International Nuclear Information System (INIS)

    This paper reviews recently developed techniques in biomagnetics and bioimaging such as transcranial magnetic stimulation (TMS), magnetic resonance imaging (MRI), and cancer therapy based on magnetic stimulation. The technique of localized and vectorial TMS has made it possible to obtain non-invasive functional mapping of the human brain, and the development of new bioimaging technologies such as current distribution MRI and conductivity MRI may make it possible to understand the dynamics of brain functions, which include millisecond-level changes in functional regions and dynamic relations between brain neuronal networks. These techniques are leading medicine and biology toward new horizons through novel applications of magnetism. (author)

  16. Quantum dots in bio-imaging: Revolution by the small

    International Nuclear Information System (INIS)

    Visual analysis of biomolecules is an integral avenue of basic and applied biological research. It has been widely carried out by tagging nucleotides and proteins with traditional fluorophores, which are limited in their application by features such as photobleaching, spectral overlap, and operational difficulties. Quantum dots (QDs) are emerging as a superior alternative and are poised to change the world of bio-imaging and further its applications in basic and applied biology. The interdisciplinary field of nanobiotechnology is experiencing a revolution, and QDs as an enabling technology have become a harbinger of this hybrid field. Within a decade, research on QDs has evolved from a pure science subject to one with high-end commercial applications.

  17. Advances in Bio-Imaging From Physics to Signal Understanding Issues State-of-the-Art and Challenges

    CERN Document Server

    Racoceanu, Daniel; Gouaillard, Alexandre

    2012-01-01

    Advances in imaging devices and image processing stem from cross-fertilization between many fields of research, such as Chemistry, Physics, Mathematics and Computer Science. The bioimaging community feels the urge to integrate its various results, discoveries and innovations more intensively into ready-to-use tools that can address the new and exciting challenges that life scientists (biologists, medical doctors, ...) keep providing, almost on a daily basis. Devising innovative chemical probes, for example, is an archetypal goal in which image quality improvement must be driven by the physics of acquisition, the image processing and analysis algorithms, and the chemical skills required to design an optimal bioprobe. This book offers an overview of the current advances in many research fields related to bioimaging and highlights the current limitations that need to be addressed in the next decade to design fully integrated bioimaging devices.

  18. A simple one-step synthesis of melanin-originated red shift emissive carbonaceous dots for bioimaging.

    Science.gov (United States)

    Hu, Chuan; Liu, Yongmei; Chen, Jiantao; He, Qin; Gao, Huile

    2016-10-15

    Carbonaceous dots (CDs) are superior nanomaterials owing to their promising luminescence properties and good biocompatibility. However, most CDs have relatively short excitation/emission wavelengths, which restricts their application in bioimaging. In this study, a simple one-step procedure was developed for the synthesis of melanin-originated CDs (MNPs). The MNPs showed two long red-shifted emissions at 570 nm and 645 nm with broad absorptions from 200 nm to 400 nm and 500 nm to 700 nm, suggesting the great potential of MNPs in bioimaging. In addition, several experiments indicated that MNPs possessed good serum stability and good blood compatibility. In vitro, MNPs could be taken up by C6 cells in a concentration- and time-dependent manner, with endosomes involved. In conclusion, MNPs were prepared using a simple one-step method, possess unique optical and good biological properties, and can be used for bioimaging. PMID:27416289

  19. Intelligent spectral signature bio-imaging in vivo for surgical applications

    Science.gov (United States)

    Jeong, Jihoon; Frykman, Philip K.; Gaon, Mark; Chung, Alice P.; Lindsley, Erik H.; Hwang, Jae Y.; Farkas, Daniel L.

    2007-02-01

    Multi-spectral imaging provides digital images of a scene or object at a large, usually sequential, number of wavelengths, generating precise optical spectra at every pixel. We use the term "spectral signature" for a quantitative plot of optical property variations as a function of wavelength. We present here the intelligent spectral signature bio-imaging methods we developed, including automatic signature selection based on machine learning algorithms, database-search-based automatic color allocation, and selected visualization schemes matching these approaches. Using this intelligent spectral signature bio-imaging method, we could discriminate normal from aganglionic colon tissue in the Hirschsprung's disease mouse model with over 95% sensitivity and specificity under various similarity measures, and could distinguish anatomic structures such as the parathyroid gland, thyroid gland and pre-tracheal fat in the dissected neck of the rat in vivo.
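    One plausible sketch of per-pixel spectral-signature matching is the spectral angle, a common similarity measure for hyperspectral data (the reference signatures and band count below are invented; the paper evaluates several similarity measures, not necessarily this one):

```python
import numpy as np

def spectral_angle(a, b):
    """Spectral angle (radians) between two signatures; smaller = more similar."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

def classify_pixel(spectrum, reference_signatures):
    """Assign a pixel to the tissue class with the nearest reference signature."""
    return min(reference_signatures,
               key=lambda name: spectral_angle(spectrum, reference_signatures[name]))

# Hypothetical 4-band reference signatures for two tissue classes:
refs = {"normal": [0.2, 0.5, 0.8, 0.6], "aganglionic": [0.7, 0.6, 0.3, 0.2]}
pixel = [0.25, 0.55, 0.75, 0.55]
print(classify_pixel(pixel, refs))  # → normal
```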

  20. Synthesis, Structure, Properties, and Bioimaging of a Fluorescent Nitrogen-Linked Bisbenzothiadiazole.

    Science.gov (United States)

    Mota, Alberto A R; Corrêa, José R; Carvalho, Pedro H P R; de Sousa, Núbia M P; de Oliveira, Heibbe C B; Gatto, Claudia C; da Silva Filho, Demétrio A; de Oliveira, Aline L; Neto, Brenno A D

    2016-04-01

    This paper describes the synthesis, structure, photophysical properties, and bioimaging application of a novel 2,1,3-benzothiadiazole (BTD)-based rationally designed fluorophore. The capability of undergoing efficient stabilizing processes from the excited state allowed the novel BTD derivative to be used as a stable probe for bioimaging applications. No notable photobleaching effect or degradation could be observed during the experimental time period. Before the synthesis, the molecular architecture of the novel BTD derivative was evaluated by means of DFT calculations to validate the chosen design. Single-crystal X-ray analysis revealed the nearly flat characteristics of the structure in a syn conformation. The fluorophore was successfully tested as a live-cell-imaging probe and efficiently stained MCF-7 breast cancer cell lineages. PMID:26930300

  1. Applications of graphene and its derivatives in intracellular biosensing and bioimaging.

    Science.gov (United States)

    Zhu, Xiaohua; Liu, Yang; Li, Pei; Nie, Zhou; Li, Jinghong

    2016-08-01

    Graphene has a unique planar structure, as well as excellent electronic properties, and has attracted a great deal of interest from scientists. Graphene and its derivatives display advantageous characteristics as a biosensing platform due to their high surface area, good biocompatibility and ease of functionalization. Moreover, graphene and its derivatives exhibit excellent optical properties; thus they are considered to be promising and attractive candidates for bioimaging, mainly of cells and tissues. Following an introduction and a discussion of the optical properties of graphene, this review assesses the methods for engineering the functions of graphene and its derivatives. Specific examples are given on the use of graphene and its derivatives in fluorescence bioimaging, surface-enhanced Raman scattering (SERS) imaging, and magnetic resonance imaging (MRI). Finally, the prospects and further developments in this exciting field of graphene-based materials are suggested. PMID:27373227

  2. Spatial-scanning hyperspectral imaging probe for bio-imaging applications

    Science.gov (United States)

    Lim, Hoong-Ta; Murukeshan, Vadakke Matham

    2016-03-01

    The three common methods of performing hyperspectral imaging are the spatial-scanning, spectral-scanning, and snapshot methods. However, only the spectral-scanning and snapshot methods have been configured as hyperspectral imaging probes to date. This paper presents a spatial-scanning (pushbroom) hyperspectral imaging probe, realized by integrating a pushbroom hyperspectral imager with an imaging probe. The proposed hyperspectral imaging probe can also function as an endoscopic probe when integrated with a custom-fabricated image fiber bundle unit. The imaging probe is configured by incorporating a gradient-index lens at the end face of an image fiber bundle consisting of about 50 000 individual fiberlets. The necessary simulations, methodology, and detailed instrumentation are explained, followed by an assessment of the developed probe's performance. Resolution test targets such as the United States Air Force chart, as well as bio-samples such as chicken breast tissue with a blood clot, are used as test samples for resolution analysis and performance validation. The system is built on a pushbroom hyperspectral imaging system with a video camera and has the advantage of acquiring information from a large number of spectral bands with a selectable region of interest. The advantages of this spatial-scanning hyperspectral imaging probe extend to samples or tissues in regions that are difficult to access, with potential diagnostic bio-imaging applications.
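    The pushbroom principle described above can be sketched in a few lines: each probe position yields one (spatial, spectral) line-scan frame, and scanning supplies the second spatial axis, producing a data cube from which any band image or per-pixel spectrum can be sliced. The dimensions below are arbitrary and for illustration only.

```python
import numpy as np

def assemble_cube(line_scans):
    """Stack pushbroom line scans into a hyperspectral cube.
    Each scan is a 2-D (spatial_x, wavelength) frame captured at one probe
    position; stacking along the scan direction gives shape (y, x, bands)."""
    return np.stack(line_scans, axis=0)

# Hypothetical probe: 5 scan positions, 64 spatial pixels, 100 spectral bands.
scans = [np.random.rand(64, 100) for _ in range(5)]
cube = assemble_cube(scans)
print(cube.shape)             # → (5, 64, 100)
band_image = cube[:, :, 42]   # spatial image at one selected band
spectrum = cube[2, 10, :]     # full spectrum of one spatial pixel
```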

  3. Bioimaging mass spectrometry of trace elements – recent advance and applications of LA-ICP-MS: A review

    International Nuclear Information System (INIS)

    Highlights: • Bioimaging LA-ICP-MS is established for trace metals within biomedical specimens. • Trace metal imaging allows the study of brain function and neurodegenerative diseases. • Laser microdissection ICP-MS was applied to mouse brain hippocampus and wheat root. - Abstract: Bioimaging using laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) offers the capability to quantify trace elements and isotopes within tissue sections with a spatial resolution ranging from about 10 to 100 μm. Distribution analysis helps clarify basic questions of biomedical research and enables bioaccumulation and bioavailability studies for ecological and toxicological risk assessment in humans, animals and plants. Major application fields of mass spectrometry imaging (MSI) and metallomics have been brain and cancer research, animal model validation, drug development and plant science. Here we give an overview of the latest achievements in methods and applications. Recent improvements in ablation systems, operation and cell design have enabled progressively better spatial resolutions, down to 1 μm. Meanwhile, a body of research has accumulated covering basic principles of the elemental architecture in animals and plants that could consistently be reproduced by several laboratories, such as the distribution of Fe, Cu and Zn in rodent brain. Several studies investigated the distribution and delivery of metallo-drugs in animals. Hyper-accumulating plants and pollution indicator organisms have been the key topics in environmental science. Increasingly, larger series of samples are analyzed, be it in the frame of comparisons between intervention and control groups, of time kinetics, or of three-dimensional atlas approaches

  4. Bioimaging mass spectrometry of trace elements – recent advance and applications of LA-ICP-MS: A review

    Energy Technology Data Exchange (ETDEWEB)

    Becker, J.Sabine, E-mail: s.becker@fz-juelich.de [Central Institute for Engineering, Electronics and Analytics (ZEA-3), Forschungszentrum Jülich, Jülich D-52425 (Germany); Matusch, Andreas, E-mail: a.matusch@fz-juelich.de [Institute for Neuroscience and Medicine (INM-2), Forschungszentrum Jülich, Jülich D-52425 (Germany); Wu, Bei, E-mail: b.wu@fz-juelich.de [Central Institute for Engineering, Electronics and Analytics (ZEA-3), Forschungszentrum Jülich, Jülich D-52425 (Germany)

    2014-07-04

    Highlights: • Bioimaging LA-ICP-MS is established for trace metals within biomedical specimens. • Trace metal imaging allows the study of brain function and neurodegenerative diseases. • Laser microdissection ICP-MS was applied to mouse brain hippocampus and wheat root. - Abstract: Bioimaging using laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) offers the capability to quantify trace elements and isotopes within tissue sections with a spatial resolution ranging from about 10 to 100 μm. Distribution analysis helps to clarify basic questions of biomedical research and enables bioaccumulation and bioavailability studies for ecological and toxicological risk assessment in humans, animals and plants. Major application fields of mass spectrometry imaging (MSI) and metallomics have been in brain and cancer research, animal model validation, drug development and plant science. Here we give an overview of the latest achievements in methods and applications. Recent improvements in ablation systems, operation and cell design have enabled progressively better spatial resolutions, down to 1 μm. Meanwhile, a body of research has accumulated covering basic principles of the element architecture in animals and plants that could consistently be reproduced by several laboratories, such as the distribution of Fe, Cu and Zn in rodent brain. Several studies investigated the distribution and delivery of metallo-drugs in animals. Hyper-accumulating plants and pollution indicator organisms have been the key topics in environmental science. Increasingly, larger series of samples are analyzed, be it in the frame of comparisons between intervention and control groups, of time kinetics or of three-dimensional atlas approaches.
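As an illustration of how LA-ICP-MS line-scan data become a quantitative element map, the sketch below stacks per-line count rates into a 2D array and applies a one-point calibration. The sensitivity factor and the numbers are hypothetical, not taken from the review:

```python
import numpy as np

def assemble_element_map(line_scans, sensitivity_cps_per_ppm):
    """Stack LA-ICP-MS line scans (counts per second) into a 2D map
    and convert to concentration units via a one-point calibration.

    line_scans: list of equal-length 1D arrays, one per ablation line.
    sensitivity_cps_per_ppm: hypothetical calibration factor from a
    matrix-matched standard (an assumption for this sketch).
    """
    cps = np.vstack(line_scans)           # rows = ablation lines, cols = positions
    return cps / sensitivity_cps_per_ppm  # concentration in ppm (µg/g)

# Three synthetic line scans across a localized hot spot
scans = [np.array([100., 400., 100.]),
         np.array([120., 800., 110.]),
         np.array([ 90., 350., 105.])]
conc_map = assemble_element_map(scans, sensitivity_cps_per_ppm=10.0)
print(conc_map.shape)  # (3, 3)
print(conc_map.max())  # 80.0  (ppm at the hot spot)
```

In a real workflow each row would come from one laser raster line, and the pixel size along a row follows from the scan speed and the mass spectrometer's acquisition rate.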

  5. Bioimaging: An Useful Tool to Monitor Differentiation of Human Embryonic Stem Cells into Chondrocytes.

    Science.gov (United States)

    Suchorska, Wiktoria M; Lach, Michał S; Richter, Magdalena; Kaczmarczyk, Jacek; Trzeciak, Tomasz

    2016-05-01

    To improve the recovery of damaged cartilage tissue, pluripotent stem cell-based therapies are being intensively explored. A number of techniques exist that enable the monitoring of stem cell differentiation, including immunofluorescence staining. This simple and fast method enables changes to be observed during the differentiation process. Here, two protocols for the differentiation of human embryonic stem cells into chondrocytes were used (monolayer cell culture and embryoid body formation). Cells were labeled for markers expressed during the differentiation process at different time points (pluripotent: NANOG, SOX2, OCT3/4, E-cadherin; prochondrogenic: SOX6, SOX9, collagen type II; extracellular matrix components: chondroitin sulfate, heparan sulfate; beta-catenin, CXCR4, and Brachyury). Comparison of the signal intensity of differentiated cells to control cell populations (articular cartilage chondrocytes and human embryonic stem cells) showed decreased signal intensities of pluripotency markers, E-cadherin and beta-catenin, and increased signal intensities of prochondrogenic markers and extracellular matrix components. The changes during chondrogenic differentiation, monitored by evaluating the signal intensity of pluripotency and chondrogenic markers, were similar to those reported in other studies of chondrogenesis and were confirmed by semi-quantitative analysis of the IF signals. This work indicates that bioimaging is a useful tool for monitoring and semi-quantifying IF images during the differentiation of hESCs into chondrocyte-like cells. PMID:26354117
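The semi-quantitative comparison of IF signal intensities described above can be sketched as a background-subtracted mean-intensity ratio between differentiated cells and a control population. This is an illustrative reconstruction, not the authors' exact pipeline:

```python
import numpy as np

def relative_marker_intensity(sample_img, control_img, background=0.0):
    """Semi-quantitative immunofluorescence comparison: ratio of the
    background-subtracted mean signal in the differentiated cells to that
    of a control population. Illustrative only; thresholds, segmentation
    and exposure normalization are omitted."""
    s = np.clip(np.asarray(sample_img, float) - background, 0, None).mean()
    c = np.clip(np.asarray(control_img, float) - background, 0, None).mean()
    return s / c

# Hypothetical pixel intensities: a prochondrogenic marker (e.g. SOX9)
# rising relative to an hESC control population
sox9_ratio = relative_marker_intensity([[50, 60], [70, 60]],
                                       [[20, 20], [25, 15]])
print(round(sox9_ratio, 2))  # 3.0
```

A ratio above 1 indicates increased marker signal relative to the control, matching the qualitative trends reported in the abstract.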

  6. Methods of Multivariate Analysis

    CERN Document Server

    Rencher, Alvin C

    2012-01-01

    Praise for the Second Edition "This book is a systematic, well-written, well-organized text on multivariate analysis packed with intuition and insight . . . There is much practical wisdom in this book that is hard to find elsewhere."-IIE Transactions Filled with new and timely content, Methods of Multivariate Analysis, Third Edition provides examples and exercises based on more than sixty real data sets from a wide variety of scientific fields. It takes a "methods" approach to the subject, placing an emphasis on how students and practitioners can employ multivariate analysis in real-life situations.

  7. Analysis of numerical methods

    CERN Document Server

    Isaacson, Eugene

    1994-01-01

    This excellent text for advanced undergraduates and graduate students covers norms, numerical solution of linear systems and matrix factoring, iterative solutions of nonlinear equations, eigenvalues and eigenvectors, polynomial approximation, and other topics. It offers a careful analysis and stresses techniques for developing new methods, plus many examples and problems. 1966 edition.

  8. Methods for Risk Analysis

    International Nuclear Information System (INIS)

    Many decision-making situations today affect humans and the environment. In practice, many such decisions are made without an overall view and prioritise one or other of the two areas. Now and then these two areas of regulation come into conflict, e.g. the best alternative as regards environmental considerations is not always the best from a human safety perspective and vice versa. This report was prepared within a major project with the aim of developing a framework in which both the environmental aspects and the human safety aspects are integrated, and decisions can be made taking both fields into consideration. The safety risks have to be analysed in order to be successfully avoided and one way of doing this is to use different kinds of risk analysis methods. There is an abundance of existing methods to choose from and new methods are constantly being developed. This report describes some of the risk analysis methods currently available for analysing safety and examines the relationships between them. The focus here is mainly on human safety aspects

  9. Nanostructures Derived from Starch and Chitosan for Fluorescence Bio-Imaging

    Directory of Open Access Journals (Sweden)

    Yinxue Zu

    2016-07-01

    Full Text Available Fluorescent nanostructures (NSs) derived from polysaccharides have drawn great attention as novel fluorescent probes for potential bio-imaging applications. Herein, we report a facile alkali-assisted hydrothermal method to fabricate polysaccharide NSs using starch and chitosan as raw materials. Transmission electron microscopy (TEM) demonstrated that the average particle sizes are 14 nm and 75 nm for starch and chitosan NSs, respectively. Fourier transform infrared (FT-IR) spectroscopy analysis showed that there are a large number of hydroxyl or amino groups on the surface of these polysaccharide-based NSs. Strong fluorescence with an excitation-dependent emission behaviour was observed under ultraviolet excitation. Interestingly, the photostability of the NSs was found to be superior to that of fluorescein and rhodamine B. The quantum yield of the starch NSs could reach 11.12% under excitation at 360 nm. The oxidative metal ions Cu(II), Hg(II) and Fe(III) exhibited a quenching effect on the fluorescence intensity of the prepared NSs. Both kinds of multicoloured NSs showed a maximum fluorescence intensity at pH 7, while the fluorescence intensity decreased dramatically in either an acidic or basic environment (pH 3 or 11). The cytotoxicity study of the starch NSs showed low cytotoxicity, with 80% viability after 24 h of incubation when their concentration was less than 10 mg/mL. The study also showed the possibility of using the multicoloured starch NSs for imaging mouse melanoma cells and guppy fish.
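A fluorescence quantum yield such as the 11.12% reported for the starch NSs is typically obtained by the comparative (relative) method against a reference fluorophore. The sketch below implements that standard formula with hypothetical reference values, not the paper's actual measurement data:

```python
def relative_quantum_yield(I, A, phi_ref, I_ref, A_ref, n=1.33, n_ref=1.33):
    """Relative fluorescence quantum yield via the comparative method:

        phi = phi_ref * (I / I_ref) * (A_ref / A) * (n / n_ref)**2

    I: integrated emission intensity, A: absorbance at the excitation
    wavelength, n: refractive index of the solvent. All numbers below are
    assumptions for illustration, chosen to reproduce the reported value."""
    return phi_ref * (I / I_ref) * (A_ref / A) * (n / n_ref) ** 2

# Hypothetical sample and reference readings (same solvent, matched absorbance)
phi = relative_quantum_yield(I=2224.0, A=0.05,
                             phi_ref=0.54, I_ref=10800.0, A_ref=0.05)
print(round(phi, 4))  # 0.1112
```

Matching the absorbances of sample and reference at the excitation wavelength, as done here, minimizes inner-filter errors in this method.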

  10. Methods for RNA Analysis

    DEFF Research Database (Denmark)

    Olivarius, Signe

    While increasing evidence appoints diverse types of RNA as key players in the regulatory networks underlying cellular differentiation and metabolism, the potential functions of thousands of conserved RNA structures encoded in mammalian genomes remain to be determined. Since the functions of most...... RNAs rely on interactions with proteins, the establishment of protein-binding profiles is essential for the characterization of RNAs. Aiming to facilitate RNA analysis, this thesis introduces proteomics- as well as transcriptomics-based methods for the functional characterization of RNA. First, RNA......-protein pulldown combined with mass spectrometry analysis is applied for in vivo as well as in vitro identification of RNA-binding proteins, the latter succeeding in verifying known RNA-protein interactions. Secondly, acknowledging the significance of flexible promoter usage for the diversification of the...

  11. A two-photon fluorescent probe for bio-imaging of formaldehyde in living cells and tissues.

    Science.gov (United States)

    Li, Jun-Bin; Wang, Qian-Qian; Yuan, Lin; Wu, Yong-Xiang; Hu, Xiao-Xiao; Zhang, Xiao-Bing; Tan, Weihong

    2016-05-23

    Formaldehyde (FA) plays an important role in living systems as a reactive carbonyl species (RCS). An abnormal level of FA is known to induce neurodegeneration, cognitive decline and memory loss owing to the formation of strong cross-links between DNA, proteins and other molecules. The development of efficient methods for biological FA detection is of great biomedical importance. Although a few one-photon FA fluorescent probes have been reported for imaging in living cells, probes excited by two photons are more suitable for bio-imaging due to their low background fluorescence, reduced photobleaching, and deep penetration depth. In this study, a two-photon fluorescent probe for FA detection and bio-imaging in living cells and tissues was reported. The detection is based on the 2-aza-Cope sigmatropic rearrangement followed by elimination to release the fluorophore, resulting in an increase of both one- and two-photon excited fluorescence. The probe showed a high sensitivity to FA with a detection limit of 0.2 μM. Moreover, it enabled the two-photon bio-imaging of FA in live HEK-293 cells and tissues with tissue-imaging depths of 40-170 μm. Furthermore, it could be applied for the monitoring of endogenous FA in live MCF-7 cells, presaging its practical applications in biological systems. PMID:27137921

  12. Distribution of phytoplasmas in infected plants as revealed by real-time PCR and bioimaging.

    Science.gov (United States)

    Christensen, Nynne Meyn; Nicolaisen, Mogens; Hansen, Michael; Schulz, Alexander

    2004-11-01

    Phytoplasmas are cell wall-less bacteria inhabiting the phloem and utilizing it for their spread. Infected plants often show changes in growth pattern and a reduced crop yield. A quantitative real-time polymerase chain reaction (Q-PCR) assay and a bioimaging method were developed to quantify and localize phytoplasmas in situ. According to the Q-PCR assay, phytoplasmas accumulated disproportionately in source leaves of Euphorbia pulcherrima and, to a lesser extent, in petioles of source leaves and in stems. However, phytoplasma accumulation was small or nondetectable in sink organs (roots and sink leaves). For bioimaging, infected plant tissue was stained with vital fluorescence dyes and examined using confocal laser scanning microscopy. With a DNA-sensitive dye, the pathogens were detected exclusively in the phloem, where they formed dense masses in sieve tubes of Catharanthus roseus. Sieve tubes were identified by counterstaining with aniline blue for callose and multiphoton excitation. With a potentiometric dye, not all DNA-positive material was stained, suggesting that the dye stained metabolically active phytoplasmas only. Some highly infected sieve tubes contained phytoplasmas that were either inactive or dead upon staining. PMID:15553243
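Quantification by a Q-PCR assay of the kind used here usually relies on a log-linear standard curve relating the threshold cycle (Ct) to template copy number. The following sketch assumes hypothetical curve parameters rather than the authors' calibration:

```python
def copies_from_ct(ct, slope=-3.32, intercept=38.0):
    """Estimate template copy number from a Q-PCR threshold cycle using a
    log-linear standard curve:  Ct = slope * log10(copies) + intercept.
    The slope and intercept here are hypothetical; a slope of -3.32
    corresponds to 100% amplification efficiency."""
    return 10 ** ((ct - intercept) / slope)

# A source leaf crossing threshold ~6.64 cycles earlier than a sink leaf
# corresponds to ~100-fold more phytoplasma DNA under these assumptions.
ratio = copies_from_ct(25.0) / copies_from_ct(31.64)
print(round(ratio))  # 100
```

In practice the curve would be fitted to serial dilutions of a phytoplasma DNA standard, and copy numbers would be normalized to plant tissue mass or a host reference gene.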

  13. Communication Network Analysis Methods.

    Science.gov (United States)

    Farace, Richard V.; Mabee, Timothy

    This paper reviews a variety of analytic procedures that can be applied to network data, discussing the assumptions and usefulness of each procedure when applied to the complexity of human communication. Special attention is paid to the network properties measured or implied by each procedure. Factor analysis and multidimensional scaling are among…

  14. Representation learning for histopathology image analysis

    OpenAIRE

    Arevalo Ovalle, John Edilson

    2013-01-01

    Abstract. Nowadays, automatic methods for image representation and analysis have been successfully applied in several medical imaging problems leading to the emergence of novel research areas like digital pathology and bioimage informatics. The main challenge of these methods is to deal with the high visual variability of biological structures present in the images, which increases the semantic gap between their visual appearance and their high level meaning. Particularly, the visual variabil...

  15. Theranostic liposomes loaded with quantum dots and apomorphine for brain targeting and bioimaging

    Directory of Open Access Journals (Sweden)

    Wen CJ

    2012-03-01

    Full Text Available Chih-Jen Wen1,*, Li-Wen Zhang2,*, Saleh A Al-Suwayeh3, Tzu-Chen Yen1, Jia-You Fang2,4 1Molecular Imaging Center, Chang Gung Memorial Hospital, Gueishan, Taoyuan, Taiwan; 2Pharmaceutics Laboratory, Graduate Institute of Natural Products, Chang Gung University, Gueishan, Taoyuan, Taiwan; 3Department of Pharmaceutics, College of Pharmacy, King Saud University, Riyadh, Saudi Arabia; 4Department of Cosmetic Science, Chang Gung University of Science and Technology, Gueishan, Taoyuan, Taiwan *These authors contributed equally to this work. Abstract: Quantum dots (QDs) and apomorphine were incorporated into liposomes to eliminate uptake by the liver and enhance brain targeting. We describe the preparation, physicochemical characterization, in vivo bioimaging, and brain endothelial cell uptake of the theranostic liposomes. QDs and the drug were mainly located in the bilayer membrane and inner core of the liposomes, respectively. Spherical vesicles with a mean diameter of ~140 nm were formed. QDs were completely encapsulated by the vesicles. An encapsulation percentage of nearly 80% was achieved for apomorphine. A greater fluorescence intensity was observed in mouse brains treated with liposomes compared to free QDs. This result was further confirmed by ex vivo imaging of the organs. QD uptake by the heart and liver was reduced by liposomal incorporation. Apomorphine accumulation in the brain increased 2.4-fold after this incorporation. According to a hyperspectral imaging analysis, the multifunctional liposomes, but not the aqueous solution, carried QDs into the brain. The liposomes were observed to be efficiently endocytosed into bEND3 cells. The mechanisms involved in the cellular uptake were clathrin- and caveola-mediated endocytosis, both of which were energy-dependent. To the best of our knowledge, our group is the first to develop liposomes with a QD-drug hybrid for the aim of imaging and treating brain disorders. Keywords: liposomes, quantum dots, apomorphine

  16. Nanomaterials for bio-imaging and therapeutics

    Science.gov (United States)

    Liu, Yu-San

    2007-12-01

    In this thesis we studied the applications of colloidal nanocrystal quantum dots (QDs) in bio-medical studies. We investigated the synthesis of QDs and report a relatively simple method for producing them. To produce QDs that are more stable and have higher fluorescent quantum efficiency than those produced by other methods (typically CdSe/ZnS core/shell structures), we developed a CdSe/ZnSe/ZnS (core/shell/shell) nanocrystal complex, capped with the small molecule mercaptoacetic acid (MAA) for aqueous solubilization and low toxicity. These MAA-capped QDs can be used as the visualization aid for a multi-functional probe combining the functions of viruses and carbon nanotubes (CNTs). A mild method of tagging viruses with a polycationic solution, Polybrene, at 4°C was developed. This method preserves most viral infectivity. The probes can be used to induce a higher death rate in cells under near-infrared laser irradiation than in cells without them, and thus, after additional improvements, may find applications in the study of cancer therapy. The optical properties of MAA-capped QDs are pH dependent. In particular, the fluorescence intensity increases with the pH (between 4 and 10) of the environment. The results lead to a new avenue for exploiting QDs as nano-scale sensors for localized physical and chemical properties in cells.

  17. Aqueous synthesis of high bright and tunable near-infrared AgInSe2-ZnSe quantum dots for bioimaging.

    Science.gov (United States)

    Che, Dongchen; Zhu, Xiaoxu; Wang, Hongzhi; Duan, Yourong; Zhang, Qinghong; Li, Yaogang

    2016-02-01

    Efficient synthetic methods for near-infrared quantum dots with good biophysical properties as bioimaging agents are urgently required. In this work, a simple and fast synthesis of highly luminescent, near-infrared AgInSe2-ZnSe quantum dots (QDs) with tunable emissions in aqueous media is reported. This method avoids high temperature, high pressure and organic solvents to directly generate water-dispersible AgInSe2-ZnSe QDs. The photoluminescence emission peak of the AgInSe2-ZnSe QDs ranged from 625 to 940 nm, with quantum yields up to 31%. The AgInSe2-ZnSe QDs, with their high quantum yield, near-infrared emission and low cytotoxicity, could be used as good cell labels, showing great potential for applications in bio-imaging. PMID:26513730

  18. Firm Analysis by Different Methods

    OpenAIRE

    Píbilová, Kateřina

    2012-01-01

    This Diploma Thesis deals with an analysis of the company using selected methods. The external environment of the company is analysed using PESTLE analysis and Porter's five-factor model. The internal environment is analysed by means of the Kralicek Quick test and fundamental analysis. A SWOT analysis relates the opportunities and threats of the external environment to the strengths and weaknesses of the company. A proposal for improving the company's economic management is designed on the basis...

  19. Analysis of Precision of Activation Analysis Method

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Nørgaard, K.

    1973-01-01

    The precision of an activation-analysis method prescribes the estimation of the precision of a single analytical result. The adequacy of these estimates to account for the observed variation between duplicate results from the analysis of different samples and materials, is tested by the statistic T...
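The test described, comparing the scatter between duplicate results against their stated analytical uncertainties, can be sketched as follows. This is a minimal reconstruction of the general form of such a duplicate statistic (the abstract is truncated and does not give the formula), with made-up numbers:

```python
def duplicate_t_statistic(pairs_with_sigma):
    """Test whether stated analytical uncertainties account for the
    observed variation between duplicate results: for each duplicate pair
    (x1, x2) with standard deviations (s1, s2), accumulate

        T = sum( (x1 - x2)**2 / (s1**2 + s2**2) )

    If the precision estimates are adequate, T should be distributed
    approximately as chi-square with one degree of freedom per pair.
    Formula is the standard weighted-difference form, assumed here."""
    return sum((x1 - x2) ** 2 / (s1 ** 2 + s2 ** 2)
               for x1, x2, s1, s2 in pairs_with_sigma)

# Two hypothetical duplicate determinations with their stated sigmas
pairs = [(10.2, 9.8, 0.2, 0.2), (5.1, 5.0, 0.1, 0.1)]
T = duplicate_t_statistic(pairs)
print(round(T, 2))  # 2.5
```

A value of T far above the chi-square expectation (here 2 degrees of freedom) would indicate that the stated uncertainties underestimate the true analytical variability.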

  20. Probabilistic methods for rotordynamics analysis

    Science.gov (United States)

    Wu, Y.-T.; Torng, T. Y.; Millwater, H. R.; Fossum, A. F.; Rheinfurth, M. H.

    1991-01-01

    This paper summarizes the development of the methods and a computer program to compute the probability of instability of dynamic systems that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the eigenvalues or Routh-Hurwitz test functions are investigated. Computational methods based on a fast probability integration concept and an efficient adaptive importance sampling method are proposed to perform efficient probabilistic analysis. A numerical example is provided to demonstrate the methods.
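A minimal sketch of the underlying idea: the probability of instability of a second-order system, estimated by plain Monte Carlo over an uncertain parameter, with stability judged from the eigenvalues of the state matrix. The distribution of the damping coefficient is an assumption for illustration; the paper's fast probability integration and adaptive importance sampling methods are not reproduced here:

```python
import numpy as np

def instability_probability(n_samples=20000, seed=0):
    """Monte Carlo estimate of P[system unstable] for
        m*x'' + c*x' + k*x = 0,
    declared unstable when some eigenvalue of the first-order state
    matrix has a positive real part. The damping c is drawn from a
    hypothetical Gaussian that can go negative."""
    rng = np.random.default_rng(seed)
    m, k = 1.0, 4.0
    c = rng.normal(loc=0.1, scale=0.1, size=n_samples)
    unstable = 0
    for ci in c:
        A = np.array([[0.0, 1.0],
                      [-k / m, -ci / m]])   # companion (state-space) form
        if np.real(np.linalg.eigvals(A)).max() > 0.0:
            unstable += 1
    return unstable / n_samples

p = instability_probability()
print(p)  # ≈ 0.16, i.e. roughly P[c < 0] for N(0.1, 0.1)
```

Importance sampling, as proposed in the paper, would concentrate the samples near the instability boundary (c ≈ 0 here) to reach the same accuracy with far fewer eigenvalue evaluations.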

  1. High resolution laser mass spectrometry bioimaging.

    Science.gov (United States)

    Murray, Kermit K; Seneviratne, Chinthaka A; Ghorai, Suman

    2016-07-15

    Mass spectrometry imaging (MSI) was introduced more than five decades ago with secondary ion mass spectrometry (SIMS) and a decade later with laser desorption/ionization (LDI) mass spectrometry (MS). Large biomolecule imaging by matrix-assisted laser desorption/ionization (MALDI) was developed in the 1990s and ambient laser MS a decade ago. Although SIMS has been capable of imaging with a moderate mass range at sub-micrometer lateral resolution from its inception, laser MS requires additional effort to achieve a lateral resolution of 10 μm or below, which is required to image at the size scale of single mammalian cells. This review covers untargeted large biomolecule MSI using lasers for desorption/ionization or laser desorption and post-ionization. These methods include laser microprobe (LDI) MSI, MALDI MSI, laser ambient and atmospheric pressure MSI, and near-field laser ablation MS. Novel approaches to improving lateral resolution are discussed, including oversampling, beam shaping, transmission geometry, reflective and through-hole objectives, microscope mode, and near-field optics. PMID:26972785

  2. High-efficiency upconversion luminescent sensing and bioimaging of Hg(II) by chromophoric ruthenium complex-assembled nanophosphors.

    Science.gov (United States)

    Liu, Qian; Peng, Juanjuan; Sun, Lining; Li, Fuyou

    2011-10-25

    A chromophoric ruthenium complex-assembled nanophosphor (N719-UCNPs) was achieved as a highly selective water-soluble probe for upconversion luminescence sensing and bioimaging of intracellular mercury ions. The prepared nanophosphors were characterized by X-ray powder diffraction (XRD), transmission electron microscopy (TEM), energy-dispersive X-ray analysis (EDXA), Fourier transform infrared spectroscopy (FTIR), and X-ray photoelectron spectroscopy (XPS). Further application of N719-UCNPs in sensing Hg(2+) was confirmed by optical titration experiment and upconversion luminescence live cell imaging. Using the ratiometric upconversion luminescence as a detection signal, the detection limit of Hg(2+) for this nanoprobe in water was down to 1.95 ppb, lower than the maximum level (2 ppb) of Hg(2+) in drinking water set by the United States EPA. Importantly, the nanoprobe N719-UCNPs has been shown to be capable of monitoring changes in the distribution of Hg(2+) in living cells by upconversion luminescence bioimaging. PMID:21899309
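A detection limit like the 1.95 ppb quoted above is commonly computed from the 3-sigma criterion on repeated blank measurements and the calibration slope. The sketch below uses hypothetical blank readings and slope, not the paper's titration data:

```python
import statistics

def detection_limit(blank_signals, slope):
    """Detection limit from the common 3-sigma criterion:

        LOD = 3 * (standard deviation of the blank signal) / (calibration slope)

    blank_signals: repeated measurements of the probe without analyte
    (here, ratiometric luminescence readings); slope: signal change per
    unit concentration. All numbers are illustrative assumptions."""
    return 3 * statistics.stdev(blank_signals) / slope

blanks = [0.100, 0.102, 0.098, 0.101, 0.099]   # hypothetical ratio readings
lod = detection_limit(blanks, slope=0.0024)    # ratio units per ppb
print(round(lod, 2))  # 1.98  (ppb)
```

The same criterion applies whether the signal is a single intensity or, as here, a ratio of two upconversion emission bands, which is less sensitive to excitation fluctuations.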

  3. SWOT ANALYSIS ON SAMPLING METHOD

    Directory of Open Access Journals (Sweden)

    CHIS ANCA OANA

    2014-07-01

    Full Text Available Audit sampling involves the application of audit procedures to less than 100% of the items within an account balance or class of transactions. Our article aims to study audit sampling in the audit of financial statements. As an audit technique largely used, in both its statistical and nonstatistical forms, the method is very important for auditors. It should be applied correctly for a fair view of financial statements and to satisfy the needs of all financial users. In order to be applied correctly, the method must be understood by all its users and mainly by auditors. Otherwise the risk of not applying it correctly would cause loss of reputation and discredit, litigation and even prison. Since there is not a unitary practice and methodology for applying the technique, the risk of applying it incorrectly is pretty high. SWOT analysis is a technique that shows the advantages, disadvantages, threats and opportunities of a subject. We applied SWOT analysis to the study of the sampling method from the perspective of three players: the audit company, the audited entity and the users of financial statements. The study shows that by applying the sampling method the audit company and the audited entity both save time, effort and money. The disadvantages of the method are the difficulty of applying and understanding it. Being largely used as an audit method and being a factor in a correct audit opinion, the sampling method's advantages, disadvantages, threats and opportunities must be understood by auditors.

  4. G-quadruplex enhanced fluorescence of DNA-silver nanoclusters and their application in bioimaging

    Science.gov (United States)

    Zhu, Jinbo; Zhang, Libing; Teng, Ye; Lou, Baohua; Jia, Xiaofang; Gu, Xiaoxiao; Wang, Erkang

    2015-07-01

    Guanine proximity based fluorescence enhanced DNA-templated silver nanoclusters (AgNCs) have been reported and applied for bioanalysis. Herein, we studied the G-quadruplex enhanced fluorescence of DNA-AgNCs and gained several significant conclusions, which will be helpful for the design of future probes. Our results demonstrate that a G-quadruplex can also effectively stimulate the fluorescence potential of AgNCs. The major contribution of the G-quadruplex is to provide guanine bases, and its special structure has no measurable impact. The DNA-templated AgNCs were further analysed by native polyacrylamide gel electrophoresis and the guanine proximity enhancement mechanism could be visually verified by this method. Moreover, the fluorescence emission of C3A (CCCA)4 stabilized AgNCs was found to be easily and effectively enhanced by G-quadruplexes, such as T30695, AS1411 and TBA, especially AS1411. Benefiting from the high brightness of AS1411 enhanced DNA-AgNCs and the specific binding affinity of AS1411 for nucleolin, the AS1411 enhanced AgNCs can stain cancer cells for bioimaging.

  5. Statistical methods for bioimpedance analysis

    Directory of Open Access Journals (Sweden)

    Christian Tronstad

    2014-04-01

    Full Text Available This paper gives a basic overview of relevant statistical methods for the analysis of bioimpedance measurements, with an aim to answer questions such as: How do I begin with planning an experiment? How many measurements do I need to take? How do I deal with large amounts of frequency sweep data? Which statistical test should I use, and how do I validate my results? Beginning with the hypothesis and the research design, the methodological framework for making inferences based on measurements and statistical analysis is explained. This is followed by a brief discussion on correlated measurements and data reduction before an overview is given of statistical methods for comparison of groups, factor analysis, association, regression and prediction, explained in the context of bioimpedance research. The last chapter is dedicated to the validation of a new method by different measures of performance. A flowchart is presented for selection of statistical method, and a table is given for an overview of the most important terms of performance when evaluating new measurement technology.

  6. Assessment of Thorium Analysis Methods

    International Nuclear Information System (INIS)

    The assessment of thorium analytical methods for mixture power fuel, consisting of titrimetry, X-ray fluorescence spectrometry, UV-VIS spectrometry, alpha spectrometry, emission spectrography, polarography, chromatography (HPLC) and neutron activation, was carried out. It can be concluded that the analytical methods with high accuracy (standard deviation < 3%) were titrimetry, neutron activation analysis and UV-VIS spectrometry, whereas the methods with low accuracy (standard deviation 3–10%) were alpha spectrometry and emission spectrography. Ore samples can be analyzed by X-ray fluorescence spectrometry, neutron activation analysis, UV-VIS spectrometry, emission spectrography, chromatography and alpha spectrometry. Concentrated samples can be analyzed by X-ray fluorescence spectrometry; simulation samples can be analyzed by titrimetry, polarography and UV-VIS spectrometry; and samples with thorium as a minor constituent can be analyzed by neutron activation analysis and alpha spectrometry. Thorium purity (impurity elements in thorium samples) can be analyzed by emission spectrography. Considering interference aspects, in general analytical methods without molecule reactions are better than those involving molecule reactions (author). 19 refs., 1 tab.

  7. Statistical methods for bioimpedance analysis

    OpenAIRE

    Christian Tronstad; Are Hugo Pripp

    2014-01-01

    This paper gives a basic overview of relevant statistical methods for the analysis of bioimpedance measurements, with an aim to answer questions such as: How do I begin with planning an experiment? How many measurements do I need to take? How do I deal with large amounts of frequency sweep data? Which statistical test should I use, and how do I validate my results? Beginning with the hypothesis and the research design, the methodological framework for making inferences based on measurements a...

  8. Nonlinear programming analysis and methods

    CERN Document Server

    Avriel, Mordecai

    2003-01-01

    Comprehensive and complete, this overview provides a single-volume treatment of key algorithms and theories. The author provides clear explanations of all theoretical aspects, with rigorous proof of most results. The two-part treatment begins with the derivation of optimality conditions and discussions of convex programming, duality, generalized convexity, and analysis of selected nonlinear programs. The second part concerns techniques for numerical solutions and unconstrained optimization methods, and it presents commonly used algorithms for constrained nonlinear optimization problems. This g

  9. Quick methods for radiochemical analysis

    International Nuclear Information System (INIS)

    Quick methods for radiochemical analysis, of adequate precision for the assay of a limited number of biologically important radionuclides, are important in the development of effective monitoring programs, particularly those that would be applied in emergency situations following an accidental release of radioactive substances. Methods of this type have been developed in a number of laboratories and are being altered and improved from time to time. The Agency invited several Member States to provide detailed information on any such procedures that might have been developed in their laboratories. From the information thus obtained a number of methods have been selected that appear to meet the criteria of speed and economy in the use of materials and equipment, and seem to be eminently suitable for those radionuclides that might be of major interest from the point of view of assessing the potential dose to persons following a serious dispersal of contamination. Refs, figs and tabs

  10. Development of functional gold nanorods for bioimaging and photothermal therapy

    Energy Technology Data Exchange (ETDEWEB)

    Niidome, T, E-mail: niidome.takuro.655@m.kyushu-u.ac.j [Faculty of Engineering, Kyushu University, Fukuoka 819-0395 (Japan) and Center for Future Chemistry, Kyushu University, Fukuoka 819-0395 (Japan) and PRESTO, Japan Science and Technology Agency, Kawaguchi 332-0012 (Japan)

    2010-06-01

    Gold nanorods have a strong surface plasmon band in the near-infrared region and are used as photothermal converters. Since near-infrared light penetrates deeply into tissue, gold nanorods have been considered as contrast agents for near-infrared bioimaging, photosensitizers for photothermal therapy, and functional devices for drug delivery systems responding to near-infrared light irradiation. In this study, the surface plasmon bands of intravenously injected gold nanorods were monitored in the mouse abdomen using a spectrophotometer equipped with an integrating sphere, and the pharmacokinetic parameters of the gold nanorods after intravenous injection were determined. Next, PEG-modified gold nanorods were directly injected into subcutaneous tumors in mice, and the tumors were irradiated with near-infrared pulsed laser light. Significant tumor damage and suppression of tumor growth were observed. We constructed a targeted delivery system for the gold nanorods by modifying them with a thermo-responsive polymer and a protease-responsive peptide. These modified gold nanorods are expected to serve as functional nanodevices for photothermal therapy and drug delivery systems.

  11. Development of functional gold nanorods for bioimaging and photothermal therapy

    International Nuclear Information System (INIS)

    Gold nanorods have a strong surface plasmon band in the near-infrared region and are used as photothermal converters. Since near-infrared light penetrates deeply into tissue, gold nanorods have attracted attention as contrast agents for near-infrared bioimaging, photosensitizers for photothermal therapy, and functional devices for drug delivery systems triggered by near-infrared light irradiation. In this study, the surface plasmon bands of intravenously injected gold nanorods were monitored in the mouse abdomen using a spectrophotometer equipped with an integrating sphere, and the pharmacokinetic parameters of the gold nanorods after intravenous injection were determined. Next, PEG-modified gold nanorods were injected directly into subcutaneous tumors in mice, and the tumors were irradiated with near-infrared pulsed laser light. Significant tumor damage and suppression of tumor growth were observed. We also constructed a targeted delivery system for the gold nanorods by modifying them with a thermo-responsive polymer and a peptide that responds to protease activity. These modified gold nanorods are expected to serve as functional nanodevices for photothermal therapy and drug delivery.

  12. Gait analysis methods in rehabilitation

    Directory of Open Access Journals (Sweden)

    Baker Richard

    2006-03-01

    Full Text Available Abstract Introduction Brand's four reasons for clinical tests, and his analysis of the characteristics of valid biomechanical tests for use in orthopaedics, are taken as a basis for determining what methodologies are required for gait analysis in a clinical rehabilitation context. Measurement methods in clinical gait analysis The state of the art of optical systems capable of measuring the positions of retro-reflective markers placed on the skin is sufficiently advanced that they are probably no longer a significant source of error in clinical gait analysis. Determining the anthropometry of the subject and compensating for soft tissue movement in relation to the underlying bones are now the principal problems. Techniques for using functional tests to determine joint centres and axes of rotation are starting to be used successfully. Probably the last great challenge for optical systems is in using computational techniques to compensate for soft tissue movement. In the long-term future it is possible that direct imaging of bones and joints in three dimensions (using MRI or fluoroscopy) may replace marker-based systems. Methods for interpreting gait analysis data There is still no accepted general theory of why we walk the way we do. In the absence of this, many explanations of walking address the mechanisms by which specific movements are achieved by particular muscles. A whole new methodology is developing to determine the functions of individual muscles. This needs further development and validation. A particular requirement is for subject-specific models incorporating 3-dimensional imaging data of the musculo-skeletal anatomy with kinematic and kinetic data. Methods for understanding the effects of intervention Clinical gait analysis is extremely limited if it does not allow clinicians to choose between alternative possible interventions or to predict outcomes. This can be achieved either by rigorously planned clinical trials or using

  13. Combination Clustering Analysis Method and its Application

    OpenAIRE

    Bang-Chun Wen; Li-Yuan Dong; Qin-Liang Li; Yang Liu

    2013-01-01

    The traditional clustering analysis method cannot automatically determine the optimal number of clusters. In this study, we provide a new approach, the combination clustering analysis method, to solve this problem. By analysing 25 automobile data samples with the combination clustering analysis method, the correctness of the analysis result was verified. The results showed that the combination clustering analysis method can objectively determine the number of clusters first...
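
    The idea of automatically choosing the number of clusters can be sketched generically. The fragment below is an illustrative elbow-rule sketch, not the paper's combination clustering method; the data, the helper name `kmeans_sse`, and all numbers are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical data: two clearly separated 1-D clusters (not the paper's
# 25 automobile samples)
data = np.concatenate([rng.normal(0.0, 0.3, 50), rng.normal(5.0, 0.3, 50)])

def kmeans_sse(x, k, iters=50):
    """Plain Lloyd k-means on 1-D data; returns within-cluster sum of squares."""
    centers = np.quantile(x, np.linspace(0.1, 0.9, k))
    labels = np.zeros(len(x), dtype=int)
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = x[labels == j].mean()
    return ((x - centers[labels]) ** 2).sum()

sse = {k: kmeans_sse(data, k) for k in range(1, 5)}
# Elbow rule: choose the k with the largest relative drop in SSE over k-1
best_k = max(range(2, 5), key=lambda k: sse[k - 1] / max(sse[k], 1e-12))
```

    For well-separated clusters the SSE collapses when k reaches the true cluster count and barely improves afterwards, so the ratio criterion picks that k.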

  14. Tunable Fabrication of Molybdenum Disulfide Quantum Dots for Intracellular MicroRNA Detection and Multiphoton Bioimaging.

    Science.gov (United States)

    Dai, Wenhao; Dong, Haifeng; Fugetsu, Bunshi; Cao, Yu; Lu, Huiting; Ma, Xinlei; Zhang, Xueji

    2015-09-01

    Molybdenum disulfide (MoS2) quantum dots (QDs) have not been investigated in great detail. Here, a facile and efficient approach for the synthesis of controllable-size MoS2 QDs with excellent photoluminescence (PL), using a sulfuric acid-assisted ultrasonic route, is developed for this investigation. Various MoS2 structures, including monolayer MoS2 flakes, nanoporous MoS2, and MoS2 QDs, can be obtained by simply controlling the ultrasonic duration. Comprehensive microscopic and spectroscopic tools demonstrate that the MoS2 QDs have uniform lateral size and possess excellent excitation-independent blue PL. The as-generated MoS2 QDs show a high quantum yield of 9.65%, a long fluorescence lifetime of 4.66 ns, and good fluorescence stability over a broad pH range from 4 to 10. Given the good intrinsic optical properties and large surface area, combined with excellent physiological stability and biocompatibility, a MoS2 QD-based intracellular microRNA imaging analysis system is successfully constructed. Importantly, the MoS2 QDs perform well as multiphoton bioimaging labels. The proposed synthesis strategy paves a new way for the facile and efficient preparation of size-tunable MoS2 QDs for biomedical imaging and optoelectronic device applications. PMID:26033986

  15. Laser-synthesized oxide-passivated bright Si quantum dots for bioimaging

    Science.gov (United States)

    Gongalsky, M. B.; Osminkina, L. A.; Pereira, A.; Manankov, A. A.; Fedorenko, A. A.; Vasiliev, A. N.; Solovyev, V. V.; Kudryavtsev, A. A.; Sentis, M.; Kabashin, A. V.; Timoshenko, V. Yu.

    2016-04-01

    Crystalline silicon (Si) nanoparticles present an extremely promising object for bioimaging based on photoluminescence (PL) in the visible and near-infrared spectral regions, but their efficient PL emission in aqueous suspension is typically observed after wet chemistry procedures leading to residual toxicity issues. Here, we introduce ultrapure laser-synthesized Si-based quantum dots (QDs), which are water-dispersible and exhibit bright exciton PL in the window of relative tissue transparency near 800 nm. Based on the laser ablation of crystalline Si targets in gaseous helium, followed by ultrasound-assisted dispersion of the deposited films in physiological saline, the proposed method avoids any toxic by-products during the synthesis. We demonstrate efficient contrast of the Si QDs in living cells by following the exciton PL. We also show that the prepared QDs do not provoke any cytotoxicity effects while penetrating into the cells and efficiently accumulating near the cell membrane and in the cytoplasm. Combined with the possibility of enabling parallel therapeutic channels, ultrapure laser-synthesized Si nanostructures present a unique object for cancer theranostic applications.

  16. Laser-synthesized oxide-passivated bright Si quantum dots for bioimaging.

    Science.gov (United States)

    Gongalsky, M B; Osminkina, L A; Pereira, A; Manankov, A A; Fedorenko, A A; Vasiliev, A N; Solovyev, V V; Kudryavtsev, A A; Sentis, M; Kabashin, A V; Timoshenko, V Yu

    2016-01-01

    Crystalline silicon (Si) nanoparticles present an extremely promising object for bioimaging based on photoluminescence (PL) in the visible and near-infrared spectral regions, but their efficient PL emission in aqueous suspension is typically observed after wet chemistry procedures leading to residual toxicity issues. Here, we introduce ultrapure laser-synthesized Si-based quantum dots (QDs), which are water-dispersible and exhibit bright exciton PL in the window of relative tissue transparency near 800 nm. Based on the laser ablation of crystalline Si targets in gaseous helium, followed by ultrasound-assisted dispersion of the deposited films in physiological saline, the proposed method avoids any toxic by-products during the synthesis. We demonstrate efficient contrast of the Si QDs in living cells by following the exciton PL. We also show that the prepared QDs do not provoke any cytotoxicity effects while penetrating into the cells and efficiently accumulating near the cell membrane and in the cytoplasm. Combined with the possibility of enabling parallel therapeutic channels, ultrapure laser-synthesized Si nanostructures present a unique object for cancer theranostic applications. PMID:27102695

  17. Oleyl-hyaluronan micelles loaded with upconverting nanoparticles for bio-imaging

    International Nuclear Information System (INIS)

    Hyaluronan (HA) represents an interesting polymer for nanoparticle coating due to its biocompatibility and enhanced cell interaction via the CD44 receptor. Here, we describe incorporation of oleate-capped β–NaYF4:Yb3+, Er3+ nanoparticles (UCNP-OA) into amphiphilic HA by a microemulsion method. The resulting structures have a spherical, micelle-like appearance with a hydrodynamic diameter of 180 nm. UCNP-OA-loaded HA micelles show good stability in PBS buffer and cell culture media. The intensity of green emission of UCNP-OA-loaded HA micelles in water is about five times higher than that of ligand-free UCNP, indicating that amphiphilic HA effectively protects UCNP luminescence from quenching by water molecules. We found that UCNP-OA-loaded HA micelles in concentrations up to 50 μg mL−1 increase cell viability of normal human dermal fibroblasts (NHDF), while viability of human breast adenocarcinoma cells MDA–MB–231 is reduced at these concentrations. The utility of UCNP-OA-loaded HA micelles as a bio-imaging probe was demonstrated in vitro by successful labelling of NHDF and MDA–MB–231 cells overexpressing the CD44 receptor.

  18. Oleyl-hyaluronan micelles loaded with upconverting nanoparticles for bio-imaging

    Energy Technology Data Exchange (ETDEWEB)

    Pospisilova, Martina, E-mail: martina.pospisilova@contipro.com; Mrazek, Jiri; Matuska, Vit; Kettou, Sofiane; Dusikova, Monika; Svozil, Vit; Nesporova, Kristina; Huerta-Angeles, Gloria; Vagnerova, Hana; Velebny, Vladimir [Contipro Biotech (Czech Republic)

    2015-09-15

    Hyaluronan (HA) represents an interesting polymer for nanoparticle coating due to its biocompatibility and enhanced cell interaction via CD44 receptor. Here, we describe incorporation of oleate-capped β–NaYF4:Yb3+, Er3+ nanoparticles (UCNP-OA) into amphiphilic HA by microemulsion method. Resulting structures have a spherical, micelle-like appearance with a hydrodynamic diameter of 180 nm. UCNP-OA-loaded HA micelles show a good stability in PBS buffer and cell culture media. The intensity of green emission of UCNP-OA-loaded HA micelles in water is about five times higher than that of ligand-free UCNP, indicating that amphiphilic HA effectively protects UCNP luminescence from quenching by water molecules. We found that UCNP-OA-loaded HA micelles in concentrations up to 50 μg mL−1 increase cell viability of normal human dermal fibroblasts (NHDF), while viability of human breast adenocarcinoma cells MDA–MB–231 is reduced at these concentrations. The utility of UCNP-OA-loaded HA micelles as a bio-imaging probe was demonstrated in vitro by successful labelling of NHDF and MDA–MB–231 cells overexpressing the CD44 receptor.

  19. Advanced bioimaging technologies in assessment of the quality of bone and scaffold materials. Techniques and applications

    Energy Technology Data Exchange (ETDEWEB)

    Qin Ling; Leung, Kwok Sui (eds.) [Chinese Univ. of Hong Kong (China). Dept. of Orthopaedics and Traumatology; Genant, H.K. [California Univ., San Francisco, CA (United States); Griffith, J.F. [Chinese Univ. of Hong Kong (China). Dept. of Radiology and Organ Imaging

    2007-07-01

    This book provides a perspective on the current status of bioimaging technologies developed to assess the quality of musculoskeletal tissue with an emphasis on bone and cartilage. It offers evaluations of scaffold biomaterials developed for enhancing the repair of musculoskeletal tissues. These bioimaging techniques include micro-CT, nano-CT, pQCT/QCT, MRI, and ultrasound, which provide not only 2-D and 3-D images of the related organs or tissues, but also quantifications of the relevant parameters. The advanced bioimaging technologies developed for the above applications are also extended by incorporating imaging contrast-enhancement materials. Thus, this book will provide a unique platform for multidisciplinary collaborations in education and joint R and D among various professions, including biomedical engineering, biomaterials, and basic and clinical medicine. (orig.)

  20. Advanced bioimaging technologies in assessment of the quality of bone and scaffold materials. Techniques and applications

    International Nuclear Information System (INIS)

    This book provides a perspective on the current status of bioimaging technologies developed to assess the quality of musculoskeletal tissue with an emphasis on bone and cartilage. It offers evaluations of scaffold biomaterials developed for enhancing the repair of musculoskeletal tissues. These bioimaging techniques include micro-CT, nano-CT, pQCT/QCT, MRI, and ultrasound, which provide not only 2-D and 3-D images of the related organs or tissues, but also quantifications of the relevant parameters. The advanced bioimaging technologies developed for the above applications are also extended by incorporating imaging contrast-enhancement materials. Thus, this book will provide a unique platform for multidisciplinary collaborations in education and joint R and D among various professions, including biomedical engineering, biomaterials, and basic and clinical medicine. (orig.)

  1. COMPETITIVE INTELLIGENCE ANALYSIS - SCENARIOS METHOD

    Directory of Open Access Journals (Sweden)

    Ivan Valeriu

    2014-07-01

    Full Text Available Keeping a company among the top-performing players in the relevant market depends not only on its ability to develop continually, sustainably and in a balanced way, to the standards set by customers and competition, but also on its ability to protect its strategic information and to know the strategic information of the competition in advance. In addition, given that economic markets, regardless of their profile, enable interconnection not only among domestic companies but also between domestic and foreign companies, the issue of economic competition moves from national economies to the field of interest of regional and international economic organizations. The stake for each economic player is to keep ahead of the competition and to be always prepared to face market challenges. Each player therefore needs to know, as early as possible, how to react to the others' strategies in terms of research, production and sales. If a competitor is planning to produce more, and more cheaply, then it must be prepared to counteract this move quickly. Competitive intelligence helps to evaluate the capabilities of competitors in the market, legally and ethically, and to develop response strategies. One of the main goals of competitive intelligence is early warning and the prevention of surprises that could have a major impact on a company's market share, reputation, turnover and profitability in the medium and long term. This paper presents some aspects of competitive intelligence, mainly in terms of information analysis and intelligence generation. The presentation is theoretical and addresses a structured method of information analysis - the scenarios method - in a version that combines several types of analysis in order to reveal some interconnecting aspects of the factors governing the activity of a company.

  2. Metal and Complementary Molecular Bioimaging in Alzheimer’s Disease

    Directory of Open Access Journals (Sweden)

    Nady eBraidy

    2014-07-01

    Full Text Available Alzheimer’s disease (AD) is the leading cause of dementia in the elderly. AD represents a complex neurological disorder which is best understood as the consequence of a number of interconnected genetic and lifestyle variables, which culminate in multiple changes to brain structure and function. At a molecular level, metal dyshomeostasis is frequently observed in AD due to anomalous binding of metals such as Iron (Fe), Copper (Cu) and Zinc (Zn), or impaired regulation of redox-active metals which can induce the formation of cytotoxic reactive oxygen species and neuronal damage. Neuroimaging of metals in a variety of intact brain cells and tissues is emerging as an important tool for increasing our understanding of the role of metal dysregulation in AD. Several imaging techniques have been used to study the cerebral metallo-architecture in biological specimens to obtain spatially resolved data on the chemical elements present in a sample. Hyperspectral techniques, such as particle-induced X-ray emission (PIXE), energy dispersive X-ray spectroscopy (EDS), X-ray fluorescence microscopy (XFM), synchrotron X-ray fluorescence (SXRF), secondary ion mass spectrometry (SIMS), and laser ablation inductively coupled plasma mass spectrometry (LA-ICPMS) can reveal relative intensities and even semi-quantitative concentrations of a large set of elements with differing spatial resolution and detection sensitivities. Other mass spectrometric and spectroscopic imaging techniques such as laser ablation electrospray ionisation mass spectrometry (LA ESI-MS), MALDI imaging mass spectrometry (MALDI-IMS), and Fourier transform infrared spectroscopy (FTIR) can be used to correlate changes in elemental distribution with the underlying pathology in AD brain specimens. The current review aims to discuss the advantages and challenges of using these emerging elemental and molecular imaging techniques, and to highlight clinical achievements in AD research using bioimaging techniques.

  3. Data Analysis Methods for Paleogenomics

    DEFF Research Database (Denmark)

    Avila Arcos, Maria del Carmen

    … of sequence data, generated using next-generation sequencing (NGS) technologies, from either forensic (Chapter 1) or ancient (Chapters 2-5) materials. These chapters present projects very different in nature, reflecting the diversity of questions that have become possible to address in the ancient DNA field, thanks to the introduction of NGS and the implementation of data analysis methods specific for each project. Chapters 1 to 3 have been published in peer-reviewed journals and Chapter 4 is currently in review. Chapter 5 consists of a manuscript describing initial results of an ongoing research project, for which more data is currently being generated; therefore it should be interpreted as a preliminary report. In addition to the five chapters, an introduction and five appendices are included. Appended articles are included for the reader's interest; these represent the collaborations I have been part...

  4. Computational methods for global/local analysis

    Science.gov (United States)

    Ransom, Jonathan B.; Mccleary, Susan L.; Aminpour, Mohammad A.; Knight, Norman F., Jr.

    1992-01-01

    Computational methods for global/local analysis of structures, including both uncoupled and coupled methods, are described. In addition, a global/local analysis methodology for automatic refinement of incompatible global and local finite element models is developed. Representative structural analysis problems are presented to demonstrate the global/local analysis methods.

  5. One-step fabrication of nitrogen-doped fluorescent nanoparticles from non-conjugated natural products and their temperature-sensing and bioimaging applications

    Directory of Open Access Journals (Sweden)

    Xiaoling Zeng

    2015-03-01

    Full Text Available A facile solvothermal method was used to prepare N-doped fluorescent nanoparticles (NFNPs) at gram scale from tartaric acid/citric acid/ethylenediamine using oleic acid as the reaction medium. The quantum yield of the obtained fluorescent nanoparticles could reach 48.7%. The NFNPs were characterized by multiple analytical techniques. By combining these results with circular dichroism (CD) spectra, the structure and the origin of the photoluminescence of the NFNPs were discussed. The fluorescence intensity of the obtained NFNPs showed remarkable stability and exhibited a reversible temperature-dependent enhancement/quenching. The products, with low cytotoxicity, could be introduced into target cells for in vitro bioimaging.

  6. From Science History and Applications Developments of Life System to Bio-Imaging Technology

    Institute of Scientific and Technical Information of China (English)

    YAN Li-min; LOU Wei; HE Guo-sen

    2004-01-01

    This paper presents a brief science history and the application developments of imaging technology, and discusses bio-imaging technology. A realistic example of real-time image measurement and parallel processing is given. Finally, the converging nano-bio-info-cogno (NBIC) technologies are discussed as a future trend.

  7. Fourier methods for biosequence analysis.

    OpenAIRE

    Benson, D C

    1990-01-01

    Novel methods are discussed for using fast Fourier transforms for DNA or protein sequence comparison. These methods are also intended as a contribution to the more general computer science problem of text search. They extend the capabilities of previous FFT methods and show that such methods are capable of considerable refinement. In particular, novel methods are given which (1) enable the detection of clusters of matching letters, and (2) facilitate the insertion of gaps to enhance seq...
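
    The core FFT trick for sequence comparison can be sketched briefly: encode each alphabet letter as a 0/1 indicator vector, and one inverse transform of the product of spectra yields the number of exact matches at every alignment offset at once. This is a generic illustration, not the paper's specific algorithm; the function name and the DNA alphabet are assumptions.

```python
import numpy as np

def match_counts(s, t):
    """Count exact-letter matches between DNA strings s and t at every
    alignment offset, via FFT cross-correlation of indicator vectors."""
    n = len(s) + len(t) - 1
    nfft = 1 << (n - 1).bit_length()          # pad to a power of two
    total = np.zeros(n)
    for base in "ACGT":
        a = np.array([c == base for c in s], dtype=float)
        b = np.array([c == base for c in t], dtype=float)
        fa = np.fft.rfft(a, nfft)
        fb = np.fft.rfft(b, nfft)
        # correlation theorem: IFFT(FFT(a) * conj(FFT(b)))
        corr = np.fft.irfft(fa * np.conj(fb), nfft)
        # reorder so entry i corresponds to sliding t by (i - len(t) + 1)
        tail = corr[-(len(t) - 1):] if len(t) > 1 else corr[:0]
        total += np.concatenate([tail, corr[:len(s)]])
    return np.rint(total).astype(int)

# match_counts("ACGT", "ACGT") peaks at the zero-offset alignment
```

    The appeal is the cost: all offsets are scored in O(n log n) rather than O(n^2), which is what makes FFT methods attractive for long sequences and for the general text-search problem mentioned above.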

  8. Multiscale 3D bioimaging: from cell, tissue to whole organism

    Science.gov (United States)

    Lau, S. H.; Wang, Ge; Chandrasekeran, Margam; Fan, Victor; Nazrul, Mohd; Chang, Hauyee; Fong, Tiffany; Gelb, Jeff; Feser, Michael; Yun, Wenbing

    2009-05-01

    While electron microscopes and AFMs are capable of high-resolution imaging down to molecular levels, there is an ongoing problem in integrating these results into the larger-scale structure and functions of tissues and organs within a complex organism. Imaging biological samples with optical microscopy is predominantly done with histology and immunohistochemistry, which can take up to several weeks to prepare, are artifact-prone and only available as individual 2D images. At the nano resolution scale, the higher-resolution electron microscopy and AFM are used, but again these require destructive sample preparation and yield data in 2D. To bridge this gap, we describe a rapid, non-invasive, hierarchical bioimaging technique using novel lab-based x-ray computed tomography to characterize complex biological organisms at multiple scales - from whole organs (mesoscale), to calcified and soft tissue (microscale), to subcellular structures, nanomaterials and cellular-scaffold interactions (nanoscale). While microCT (micro x-ray computed tomography) is gaining in popularity for non-invasive bone and tissue imaging, contrast and resolution are still vastly inadequate compared to histology. In this study we present multiscale results from a novel microCT and a nanoCT (nano x-ray tomography) system. The novel microCT can image large specimens and tissue samples at histology-like submicron voxel resolution, often without contrast agents, while the nanoCT, using x-ray optics similar to those used in synchrotron radiation facilities, has 20 nm voxel resolution, suitable for studying cellular and subcellular morphology and nanomaterials. Multiscale examples involving both calcified and soft tissue are illustrated, including imaging a rat tibia down to the individual channels of osteocyte canaliculi and lacunae, and an unstained whole murine lung down to its alveoli. The role of the novel CT is also discussed as a possible means for rapid virtual histology using a biopsy of a human

  9. Probabilistic methods for structural response analysis

    Science.gov (United States)

    Wu, Y.-T.; Burnside, O. H.; Cruse, T. A.

    1988-01-01

    This paper addresses current work to develop probabilistic structural analysis methods for integration with a specially developed probabilistic finite element code. The goal is to establish distribution functions for the structural responses of stochastic structures under uncertain loadings. Several probabilistic analysis methods are proposed covering efficient structural probabilistic analysis methods, correlated random variables, and response of linear system under stationary random loading.
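
    The general idea of estimating a response distribution under uncertain loading can be illustrated with a plain Monte Carlo sketch. This is not the paper's probabilistic finite element code; the cantilever geometry, load statistics and threshold below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Hypothetical cantilever: tip deflection d = P*L^3 / (3*E*I); all numbers
# below are assumptions chosen for illustration only.
L_beam = 2.0                       # length [m]
I = 8e-6                           # second moment of area [m^4]
P = rng.normal(10e3, 1.5e3, N)     # uncertain tip load [N]
E = rng.normal(200e9, 10e9, N)     # uncertain Young's modulus [Pa]

d = P * L_beam**3 / (3 * E * I)    # one response sample per realization

# Statistics of the structural response distribution
mean, std = d.mean(), d.std()
p_exceed = (d > 0.02).mean()       # probability deflection exceeds 20 mm
```

    Methods like those in the paper aim to obtain such distribution functions far more efficiently than brute-force sampling, but the sampled histogram is a useful reference solution.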

  10. Optimization of a dedicated bio-imaging beamline at the European X-ray FEL

    Energy Technology Data Exchange (ETDEWEB)

    Geloni, Gianluca [European XFEL GmbH, Hamburg (Germany); Kocharyan, Vitali; Saldin, Evgeni [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2012-09-15

    We recently proposed a basic concept for the design and layout of the undulator source for a dedicated bio-imaging beamline at the European XFEL. The goal of the optimized scheme proposed here is to enable experimental simplification and performance improvement. The core of the scheme is composed of soft and hard X-ray self-seeding setups. Based on the use of an improved design for both monochromators, it is possible to increase the design electron energy up to 17.5 GeV in the photon energy range between 2 keV and 13 keV, which is the most preferable for life science experiments. An advantage of operating at such high electron energy is the increase of the X-ray output peak power. Another advantage is that 17.5 GeV is the preferred operation energy for SASE1 and SASE2 beamline users. Since it will be necessary to run all the XFEL lines at the same electron energy, this choice will reduce the interference with other undulator lines and increase the total amount of scheduled beam time. In this work we also study the performance of the self-seeding scheme accounting for the spatiotemporal coupling caused by the use of a single-crystal monochromator. Our analysis indicates that this distortion is easily suppressed by the right choice of diamond crystal planes, and that the proposed undulator source yields about the same performance as in the case of an X-ray seed pulse with no coupling. Simulations show that the FEL power reaches 2 TW in the 3 keV-5 keV photon energy range, which is the most preferable for single biomolecule imaging.

  11. Constructing an Intelligent Patent Network Analysis Method

    OpenAIRE

    Wu, Chao-Chan; Yao, Ching-Bang

    2012-01-01

    Patent network analysis, an advanced method of patent analysis, is a useful tool for technology management. This method visually displays all the relationships among the patents and enables the analysts to intuitively comprehend the overview of a set of patents in the field of the technology being studied. Although patent network analysis possesses relative advantages different from traditional methods of patent analysis, it is subject to several crucial limitations. To overcome the drawbacks...

  12. Evaluation methods of SWOT analysis

    OpenAIRE

    VANĚK, Michal; Mikoláš, Milan; Žváková, Kateřina

    2012-01-01

    Strategic management is an integral part of top management. By formulating the right strategy and its subsequent implementation, a managed organization can attract and retain a comparative advantage. In order to fulfil this expectation, the strategy also has to be supported with relevant findings of performed strategic analyses. The best known and probably the most common of these is a SWOT analysis. In practice, however, the analysis is reduced to mere presentation of influence factors, whic...

  13. Convergence analysis of combinations of different methods

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Y. [Clarkson Univ., Potsdam, NY (United States)

    1994-12-31

    This paper provides a convergence analysis for combinations of different numerical methods for solving systems of differential equations. The author proves that combinations of two convergent linear multistep methods or Runge-Kutta methods produce a new convergent method of which the order is equal to the smaller order of the two original methods.
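
    The stated result, that combining two convergent methods yields a method whose order is the smaller of the two, can be checked numerically. The sketch below alternates forward Euler (order 1) with classical RK4 (order 4) on y' = y and estimates the observed order from successive grid refinements; it is an illustration, not the paper's proof.

```python
import math

def solve(f, y0, t1, n):
    """Integrate y' = f(y) on [0, t1] with n steps, alternating forward
    Euler (order 1) and classical RK4 (order 4) -- a combined method."""
    h = t1 / n
    y = y0
    for i in range(n):
        if i % 2 == 0:                        # Euler step
            y = y + h * f(y)
        else:                                 # RK4 step
            k1 = f(y)
            k2 = f(y + h / 2 * k1)
            k3 = f(y + h / 2 * k2)
            k4 = f(y + h * k3)
            y = y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
    return y

f = lambda y: y                               # y' = y, exact value e at t = 1
err = [abs(solve(f, 1.0, 1.0, n) - math.e) for n in (64, 128, 256)]
orders = [math.log2(err[i] / err[i + 1]) for i in range(2)]
# halving h roughly halves the error: observed order ~ 1, the smaller order
```

    The Euler steps contribute O(h^2) local error regardless of how accurate the RK4 steps are, so the combination inherits first-order global convergence.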

  14. Stable and Size-Tunable Aggregation-Induced Emission Nanoparticles Encapsulated with Nanographene Oxide and Applications in Three-Photon Fluorescence Bioimaging.

    Science.gov (United States)

    Zhu, Zhenfeng; Qian, Jun; Zhao, Xinyuan; Qin, Wei; Hu, Rongrong; Zhang, Hequn; Li, Dongyu; Xu, Zhengping; Tang, Ben Zhong; He, Sailing

    2016-01-26

    Organic fluorescent dyes with high quantum yield are widely applied in bioimaging and biosensing. However, most of them suffer from a severe effect called aggregation-caused quenching (ACQ), which means that their fluorescence is quenched at high molecular concentrations or in the aggregated state. Aggregation-induced emission (AIE) is a diametrically opposite phenomenon to ACQ, and luminogens with this feature can effectively solve this problem. Graphene oxide has been utilized as a quencher for many fluorescent dyes, based on which biosensing can be achieved. However, using graphene oxide as a surface modification agent for fluorescent nanoparticles is seldom reported. In this article, we used nanographene oxide (NGO) to encapsulate fluorescent nanoparticles consisting of a type of AIE dye named TPE-TPA-FN (TTF). NGO significantly improved the stability of the nanoparticles in aqueous dispersion. In addition, this method could flexibly control the size of the nanoparticles as well as increase their emission efficiency. We then used the NGO-modified TTF nanoparticles to achieve three-photon fluorescence bioimaging. The architecture of ear blood vessels in mice and the distribution of the nanoparticles in zebrafish could be observed clearly. Furthermore, we extended this method to other AIE luminogens and showed that it is broadly applicable. PMID:26641528

  15. Probabilistic methods in combinatorial analysis

    CERN Document Server

    Sachkov, Vladimir N

    2014-01-01

    This 1997 work explores the role of probabilistic methods for solving combinatorial problems. These methods not only provide the means of efficiently using such notions as characteristic and generating functions, the moment method and so on but also let us use the powerful technique of limit theorems. The basic objects under investigation are nonnegative matrices, partitions and mappings of finite sets, with special emphasis on permutations and graphs, and equivalence classes specified on sequences of finite length consisting of elements of partially ordered sets; these specify the probabilist

  16. Nonlinear structural analysis using integrated force method

    Indian Academy of Sciences (India)

    N R B Krishnam Raju; J Nagabhushanam

    2000-08-01

    Though the use of the integrated force method for linear investigations is well recognised, no efforts had been made to extend this method to nonlinear structural analysis. This paper presents attempts to use the method for analysing nonlinear structures. A general formulation of nonlinear structural analysis is given. Typical highly nonlinear benchmark problems are considered. The characteristic matrices of the elements used in these problems are developed, and the structures are then analysed. The results of the analysis are compared with those of the displacement method. It is demonstrated that the integrated force method is equally viable and efficient compared to the displacement method.

  17. Hybrid methods for cybersecurity analysis :

    Energy Technology Data Exchange (ETDEWEB)

    Davis, Warren Leon,; Dunlavy, Daniel M.

    2014-01-01

    Early 2010 saw a significant change in adversarial techniques aimed at network intrusion: a shift from malware delivered via email attachments toward the use of hidden, embedded hyperlinks to initiate sequences of downloads and interactions with web sites and network servers containing malicious software. Enterprise security groups were well poised and experienced in defending against the former attacks, but the new types of attacks were larger in number, more challenging to detect, dynamic in nature, and required the development of new technologies and analytic capabilities. The Hybrid LDRD project was aimed at delivering new capabilities in large-scale data modeling and analysis to enterprise security operators and analysts and at understanding the challenges of detection and prevention of emerging cybersecurity threats. Leveraging previous LDRD research efforts and capabilities in large-scale relational data analysis, large-scale discrete data analysis and visualization, and streaming data analysis, new modeling and analysis capabilities were quickly brought to bear on the problems in email phishing and spear phishing attacks in the Sandia enterprise security operational groups at the onset of the Hybrid project. As part of this project, a software development and deployment framework was created within the security analyst workflow tool sets to facilitate the delivery and testing of new capabilities as they became available, and machine learning algorithms were developed to address the challenge of dynamic threats. Furthermore, researchers from the Hybrid project were embedded in the security analyst groups for almost a full year, engaged in daily operational activities and routines, creating an atmosphere of trust and collaboration between the researchers and security personnel. The Hybrid project has altered the way that research ideas can be incorporated into the production environments of Sandia's enterprise security groups, reducing time to deployment from months and

  18. Sustainable, Rapid Synthesis of Bright-Luminescent CuInS2-ZnS Alloyed Nanocrystals: Multistage Nano-xenotoxicity Assessment and Intravital Fluorescence Bioimaging in Zebrafish-Embryos

    Science.gov (United States)

    Chetty, S. Shashank; Praneetha, S.; Basu, Sandeep; Sachidanandan, Chetana; Murugan, A. Vadivel

    2016-05-01

    Near-infrared (NIR) luminescent CuInS2-ZnS alloyed nanocrystals (CIZS-NCs) for fluorescence bioimaging have received considerable interest in recent years, as they have become a desirable alternative to heavy-metal-based NCs and organic dyes, offering unique optical properties and low toxicity for bioimaging and optoelectronic applications. In the present study, bright and robust CIZS-NCs have been synthesized within 5 min at temperatures as high as 230 °C, without requiring any inert-gas atmosphere, via a microwave-solvothermal (MW-ST) method. Subsequently, the in vitro and in vivo nano-xenotoxicity and cellular uptake of the MUA-functionalized CIZS-NCs were investigated in L929, Vero and MCF7 cell lines and in zebrafish embryos. We observed minimal toxicity and acute teratogenic consequences up to 62.5 μg/mL of the CIZS-NCs in zebrafish embryos. We also observed spontaneous uptake of the MUA-functionalized CIZS-NCs by 3 dpf zebrafish embryos, evident through bright red fluorescence emission at a concentration as low as 7.8 μg/mL. Hence, we propose the rapid, low-cost, large-scale “sustainable” MW-ST synthesis of CIZS-NCs as an ideal bio-nanoprobe with good temporal and spatial resolution for rapid labeling, long-term in vivo tracking and intravital fluorescence bioimaging (IVBI).

  19. Analysis methods for photovoltaic applications

    Energy Technology Data Exchange (ETDEWEB)

    None

    1980-01-01

    Because photovoltaic power systems are being considered for an ever-widening range of applications, it is appropriate for system designers to have knowledge of and access to photovoltaic power systems simulation models and design tools. This brochure gives brief descriptions of a variety of such aids and was compiled after surveying both manufacturers and researchers. Services available through photovoltaic module manufacturers are outlined, and computer codes for systems analysis are briefly described. (WHK)

  20. The Functional Methods of Discourse Analysis

    Institute of Scientific and Technical Information of China (English)

    覃卓敏

    2008-01-01

    From the macroscopic angle of function, the methods of discourse analysis are clarified in order to identify two important methods from pragmatics, which can then be better used in the understanding of discourse.

  1. Neodymium-doped nanoparticles for infrared fluorescence bioimaging: The role of the host

    International Nuclear Information System (INIS)

    The spectroscopic properties of different infrared-emitting neodymium-doped nanoparticles (LaF3:Nd3+, SrF2:Nd3+, NaGdF4: Nd3+, NaYF4: Nd3+, KYF4: Nd3+, GdVO4: Nd3+, and Nd:YAG) have been systematically analyzed. A comparison of the spectral shapes of both emission and absorption spectra is presented, from which the relevant role played by the host matrix is evidenced. The lack of a “universal” optimum system for infrared bioimaging is discussed, as the specific bioimaging application and the experimental setup for infrared imaging determine the neodymium-doped nanoparticle to be preferentially used in each case

  2. Upconverting and NIR emitting rare earth based nanostructures for NIR-bioimaging

    Science.gov (United States)

    Hemmer, Eva; Venkatachalam, Nallusamy; Hyodo, Hiroshi; Hattori, Akito; Ebina, Yoshie; Kishimoto, Hidehiro; Soga, Kohei

    2013-11-01

    In recent years, significant progress was achieved in the field of nanomedicine and bioimaging, but the development of new biomarkers for reliable detection of diseases at an early stage, molecular imaging, targeting and therapy remains crucial. The disadvantages of commonly used organic dyes include photobleaching, autofluorescence, phototoxicity and scattering when UV (ultraviolet) or visible light is used for excitation. The limited penetration depth of the excitation light and the visible emission into and from the biological tissue is a further drawback with regard to in vivo bioimaging. Lanthanide containing inorganic nanostructures emitting in the near-infrared (NIR) range under NIR excitation may overcome those problems. Due to the outstanding optical and magnetic properties of lanthanide ions (Ln3+), nanoscopic host materials doped with Ln3+, e.g. Y2O3:Er3+,Yb3+, are promising candidates for NIR-NIR bioimaging. Ln3+-doped gadolinium-based inorganic nanostructures, such as Gd2O3:Er3+,Yb3+, have a high potential as opto-magnetic markers allowing the combination of time-resolved optical imaging and magnetic resonance imaging (MRI) of high spatial resolution. Recent progress in our research on over-1000 nm NIR fluorescent nanoprobes for in vivo NIR-NIR bioimaging will be discussed in this review.

  3. Neodymium-doped nanoparticles for infrared fluorescence bioimaging: The role of the host

    Energy Technology Data Exchange (ETDEWEB)

    Rosal, Blanca del; Pérez-Delgado, Alberto; Rocha, Ueslen; Martín Rodríguez, Emma; Jaque, Daniel, E-mail: daniel.jaque@uam.es [Fluorescence Imaging Group, Dpto. de Física de Materiales, Facultad de Ciencias, Universidad Autónoma de Madrid, Campus de Cantoblanco, Madrid 28049 (Spain); Misiak, Małgorzata; Bednarkiewicz, Artur [Wroclaw Research Centre EIT+, ul. Stabłowicka 147, 54-066 Wrocław (Poland); Institute of Physics, University of Tartu, 14c Ravila Str., 50411 Tartu (Estonia); Vanetsev, Alexander S. [Institute of Low Temperature and Structure Research, PAS, ul. Okólna 2, 50-422 Wrocław (Poland); Orlovskii, Yurii [Institute of Low Temperature and Structure Research, PAS, ul. Okólna 2, 50-422 Wrocław (Poland); Prokhorov General Physics Institute RAS, 38 Vavilov Str., 119991 Moscow (Russian Federation); Jovanović, Dragana J.; Dramićanin, Miroslav D. [Vinča Institute of Nuclear Sciences, University of Belgrade, P.O. Box 522, Belgrade 11001 (Serbia); Upendra Kumar, K.; Jacinto, Carlos [Grupo de Fotônica e Fluidos Complexos, Instituto de Física, Universidade Federal de Alagoas, 57072-900 Maceió-AL (Brazil); Navarro, Elizabeth [Depto. de Química, Eco Catálisis, UAM-Iztapalapa, Sn. Rafael Atlixco 186, México 09340, D.F (Mexico); and others

    2015-10-14

    The spectroscopic properties of different infrared-emitting neodymium-doped nanoparticles (LaF3:Nd3+, SrF2:Nd3+, NaGdF4:Nd3+, NaYF4:Nd3+, KYF4:Nd3+, GdVO4:Nd3+, and Nd:YAG) have been systematically analyzed. A comparison of the spectral shapes of both emission and absorption spectra is presented, from which the relevant role played by the host matrix is evidenced. The lack of a “universal” optimum system for infrared bioimaging is discussed, as the specific bioimaging application and the experimental setup for infrared imaging determine the neodymium-doped nanoparticle to be preferentially used in each case.

  4. Statistical analysis and optimization methods

    Czech Academy of Sciences Publication Activity Database

    Halámek, Josef; Holík, M.; Jurák, Pavel; Kasal, Miroslav

    Liptovský Mikuláš : Vojenská Akadémia FZV, 2002 - (Puttera, J.), s. 286 - 289 ISBN 80-8040-180-2. [KTERP. Tatranské Zruby (SK), 24.04.2002-26.04.2002] R&D Projects: GA ČR GA102/02/1339 Institutional research plan: CEZ:AV0Z2065902 Keywords : scatter plot * confidence ellipses * graphical method Subject RIV: JA - Electronics ; Optoelectronics, Electrical Engineering

  5. Laser-synthesized oxide-passivated bright Si quantum dots for bioimaging

    OpenAIRE

    M. B. Gongalsky; Osminkina, L. A.; A. Pereira; A. A. Manankov; Fedorenko, A. A.; Vasiliev, A. N.; Solovyev, V. V.; Kudryavtsev, A. A.; Sentis, M.; Kabashin, A. V.; V. Yu. Timoshenko

    2016-01-01

    Crystalline silicon (Si) nanoparticles present an extremely promising object for bioimaging based on photoluminescence (PL) in the visible and near-infrared spectral regions, but their efficient PL emission in aqueous suspension is typically observed after wet chemistry procedures leading to residual toxicity issues. Here, we introduce ultrapure laser-synthesized Si-based quantum dots (QDs), which are water-dispersible and exhibit bright exciton PL in the window of relative tissue transparenc...

  6. Functional CdSe/CdS@SiO2 nanoparticles for bioimaging applications

    OpenAIRE

    Aubert, Tangi; Wassmuth, Daniel; Soenen, Stefaan; Van Deun, Rik; Braeckmans, Kevin; Hens, Zeger

    2014-01-01

    Semiconductor quantum dots (QDs) constitute very promising candidates as light emitters for numerous applications in the field of biotechnology, such as cell labeling or other bioimaging techniques. For such applications, semiconductor QDs represent an attractive alternative to classic organic fluorophores as they exhibit a far superior photostability by several orders of magnitude and a higher brightness thanks to large absorption cross-sections. Within this family of materials, core-shell h...

  7. Heterogeneous core/shell fluoride nanocrystals with enhanced upconversion photoluminescence for in vivo bioimaging

    Science.gov (United States)

    Hao, Shuwei; Yang, Liming; Qiu, Hailong; Fan, Rongwei; Yang, Chunhui; Chen, Guanying

    2015-06-01

    We report on heterogeneous core/shell CaF2:Yb3+/Ho3+@NaGdF4 nanocrystals of 17 nm with efficient upconversion (UC) photoluminescence (PL) for in vivo bioimaging. Monodisperse core/shell nanostructures were synthesized using a seed-mediated growth process involving two quite different approaches of liquid-solid-solution and thermal decomposition. They exhibit green emission with a sharp band around 540 nm when excited at ~980 nm, which is about 39 times brighter than the core CaF2:Yb3+/Ho3+ nanoparticles. PL decays at 540 nm revealed that such an enhancement arises from efficient suppression of surface-related deactivation from the core nanocrystals. In vivo bioimaging employing water-dispersed core/shell nanoparticles displayed high contrast against the background. Electronic supplementary information (ESI) available. See DOI: 10.1039/c5nr02287h

  8. Alexa Fluor-labeled Fluorescent Cellulose Nanocrystals for Bioimaging Solid Cellulose in Spatially Structured Microenvironments

    Energy Technology Data Exchange (ETDEWEB)

    Grate, Jay W.; Mo, Kai-For; Shin, Yongsoon; Vasdekis, Andreas; Warner, Marvin G.; Kelly, Ryan T.; Orr, Galya; Hu, Dehong; Dehoff, Karl J.; Brockman, Fred J.; Wilkins, Michael J.

    2015-03-18

    Cellulose nanocrystal materials have been labeled with modern Alexa Fluor dyes in a process that first links the dye to a cyanuric chloride molecule. Subsequent reaction with cellulose nanocrystals provides dyed solid microcrystalline cellulose material that can be used for bioimaging and is suitable for deposition in films and spatially structured microenvironments. It is demonstrated with single-molecule fluorescence microscopy that these films are subject to hydrolysis by cellulase enzymes.

  9. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    Science.gov (United States)

    West, Phillip B.; Novascone, Stephen R.; Wright, Jerry P.

    2011-09-27

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  10. Infinitesimal methods of mathematical analysis

    CERN Document Server

    Pinto, J S

    2004-01-01

    This modern introduction to infinitesimal methods is a translation of the book Métodos Infinitesimais de Análise Matemática by José Sousa Pinto of the University of Aveiro, Portugal and is aimed at final year or graduate level students with a background in calculus. Surveying modern reformulations of the infinitesimal concept with a thoroughly comprehensive exposition of important and influential hyperreal numbers, the book includes previously unpublished material on the development of hyperfinite theory of Schwartz distributions and its application to generalised Fourier transforms and harmon

  11. The Intersection of CMOS Microsystems and Upconversion Nanoparticles for Luminescence Bioimaging and Bioassays

    Directory of Open Access Journals (Sweden)

    Liping Wei

    2014-09-01

    Organic fluorophores and quantum dots are ubiquitous as contrast agents for bioimaging and as labels in bioassays to enable the detection of biological targets and processes. Upconversion nanoparticles (UCNPs) offer a different set of opportunities as labels in bioassays and for bioimaging. UCNPs are excited at near-infrared (NIR) wavelengths where biological molecules are optically transparent, and their luminescence in the visible and ultraviolet (UV) wavelength range is suitable for detection using complementary metal-oxide-semiconductor (CMOS) technology. These nanoparticles provide multiple sharp emission bands, long lifetimes, tunable emission, high photostability, and low cytotoxicity, which render them particularly useful for bioimaging applications and multiplexed bioassays. This paper surveys several key concepts surrounding upconversion nanoparticles and the systems that detect and process the corresponding luminescence signals. The principle of photon upconversion, tuning of emission wavelengths, UCNP bioassays, and UCNP time-resolved techniques are described. Electronic readout systems for signal detection and processing suitable for UCNP luminescence using CMOS technology are discussed, including recent progress in miniaturized detectors, integrated spectral sensing, and high-precision time-domain circuits. Emphasis is placed on the physical attributes of UCNPs that map strongly to the technical features that CMOS devices excel in delivering, exploring the interoperability between the two technologies.
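
    The long UCNP lifetimes mentioned above (microseconds to milliseconds) are what make time-resolved detection practical: short-lived autofluorescence can be gated out and the lifetime itself used as a label. As a generic illustration only (not code from the paper; the decay constant and sampling grid are invented), a single-exponential luminescence lifetime can be recovered from a decay trace with a log-linear least-squares fit:

```python
import math

def fit_lifetime(times, counts):
    """Estimate a single-exponential lifetime tau from I(t) = A*exp(-t/tau)
    by ordinary least squares on log(I) versus t; the slope is -1/tau."""
    n = len(times)
    ys = [math.log(c) for c in counts]
    mx = sum(times) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(times, ys))
             / sum((x - mx) ** 2 for x in times))
    return -1.0 / slope

# Synthetic, noiseless decay: tau = 0.5 ms, sampled every 0.05 ms
tau_true = 0.5
ts = [0.05 * i for i in range(1, 40)]
trace = [1000.0 * math.exp(-t / tau_true) for t in ts]
tau_est = fit_lifetime(ts, trace)   # recovers 0.5 ms
```

    Real time-gated measurements would bin photon counts per gate and weight the fit accordingly; the sketch shows only the core estimator.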

  12. Breathing laser as an inertia-free swept source for high-quality ultrafast optical bioimaging.

    Science.gov (United States)

    Wei, Xiaoming; Xu, Jingjiang; Xu, Yiqing; Yu, Luoqin; Xu, Jianbing; Li, Bowen; Lau, Andy K S; Wang, Xie; Zhang, Chi; Tsia, Kevin K; Wong, Kenneth K Y

    2014-12-01

    We demonstrate an all-fiber breathing laser as inertia-free swept source (BLISS), with an ultra-compact design, for the emerging ultrafast bioimaging modalities. The unique feature of BLISS is its broadband wavelength-swept operation (∼60  nm) with superior temporal stability in terms of both long term (0.08 dB over 27 h) and shot-to-shot power variations (2.1%). More importantly, it enables a wavelength sweep rate of >10  MHz (∼7×10⁸  nm/s)—orders-of-magnitude faster than the existing swept sources based on mechanical or electrical tuning techniques. BLISS thus represents a practical and new generation of swept source operating in the unmet megahertz swept-rate regime that aligns with the pressing need for scaling the optical bioimaging speed in ultrafast phenomena study or high-throughput screening applications. To showcase its utility in high-speed optical bioimaging, we here employ BLISS for ultrafast time-stretch microscopy and multi-MHz optical coherence tomography of the biological specimen at a single-shot line-scan rate or A-scan rate of 11.5 MHz. PMID:25490629
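
    The quoted sweep speed is consistent with the other figures in the abstract: a ~60 nm wavelength sweep repeated at the 11.5 MHz line-scan rate gives roughly 7x10^8 nm/s. A quick arithmetic check:

```python
sweep_range_nm = 60.0   # broadband wavelength-swept range (~60 nm)
line_rate_hz = 11.5e6   # single-shot line-scan / A-scan rate (11.5 MHz)

sweep_speed = sweep_range_nm * line_rate_hz   # nanometres swept per second
print(f"{sweep_speed:.2e} nm/s")              # prints "6.90e+08 nm/s"
```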

  13. Reconfigurability Analysis Method for Spacecraft Autonomous Control

    OpenAIRE

    Dayi Wang; Chengrui Liu

    2014-01-01

    As a critical requirement for spacecraft autonomous control, reconfigurability should be considered in design stage of spacecrafts by involving effective reconfigurability analysis method in guiding system designs. In this paper, a novel reconfigurability analysis method is proposed for spacecraft design. First, some basic definitions regarding spacecraft reconfigurability are given. Then, based on function tree theory, a reconfigurability modeling approach is established to properly describe...

  14. The Qualitative Method of Impact Analysis.

    Science.gov (United States)

    Mohr, Lawrence B.

    1999-01-01

    Discusses qualitative methods of impact analysis and provides an introductory treatment of one such approach. Combines an awareness of an alternative causal epistemology with current knowledge of qualitative methods of data collection and measurement to produce an approach to the analysis of impacts. (SLD)

  15. Probabilistic structural analysis by extremum methods

    Science.gov (United States)

    Nafday, Avinash M.

    1990-01-01

    The objective is to demonstrate discrete extremum methods of structural analysis as a tool for structural system reliability evaluation. Specifically, linear and multiobjective linear programming models for analysis of rigid plastic frames under proportional and multiparametric loadings, respectively, are considered. Kinematic and static approaches for analysis form a primal-dual pair in each of these models and have a polyhedral format. Duality relations link extreme points and hyperplanes of these polyhedra and lead naturally to dual methods for system reliability evaluation.
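
    To make the polyhedral format concrete, here is a hedged sketch (not from the paper; the fixed-ended beam and all numerical values are my own illustrative choices) of the static, lower-bound formulation for a rigid-plastic member: maximize the load factor lambda subject to linear equilibrium and the yield hyperplanes |M_i| <= Mp. Because the feasible set is a polyhedron, the optimum sits at an extreme point, so for this tiny problem a scan of the yield-box corners stands in for a full LP solver:

```python
from itertools import product

# Static (lower-bound) limit analysis of a fixed-ended beam, span L,
# central point load lam*P, plastic moment capacity Mp.
# Equilibrium:  M_C + (M_A + M_B)/2 = lam * P * L / 4
# Yield:        |M_A|, |M_B|, |M_C| <= Mp
Mp, P, L = 1.0, 1.0, 4.0

# The LP optimum lies at an extreme point of the yield box, so scanning
# the corners (M_A, M_B, M_C) in {-Mp, +Mp}^3 suffices here.
best = 0.0
for MA, MB, MC in product((-Mp, Mp), repeat=3):
    lam = (MC + 0.5 * (MA + MB)) / (P * L / 4.0)
    best = max(best, lam)

# Kinematic hand solution: lam = 8*Mp/(P*L) = 2.0; the corner scan agrees,
# illustrating the primal-dual (static/kinematic) pairing in the abstract.
```

    For realistic frames one would hand the same constraints to a linear programming solver rather than enumerate corners.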

  16. Nonlinear time series analysis methods and applications

    CERN Document Server

    Diks, Cees

    1999-01-01

    Methods of nonlinear time series analysis are discussed from a dynamical systems perspective on the one hand, and from a statistical perspective on the other. After giving an informal overview of the theory of dynamical systems relevant to the analysis of deterministic time series, time series generated by nonlinear stochastic systems and spatio-temporal dynamical systems are considered. Several statistical methods for the analysis of nonlinear time series are presented and illustrated with applications to physical and physiological time series.
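
    A first step in most of the dynamical-systems methods the book describes is time-delay embedding, which reconstructs state vectors from a scalar series. The technique is standard; the logistic-map test signal and parameters below are illustrative choices of mine:

```python
def delay_embed(series, dim, tau):
    """Return delay vectors (x_t, x_{t+tau}, ..., x_{t+(dim-1)*tau})."""
    span = (dim - 1) * tau
    return [tuple(series[t + i * tau] for i in range(dim))
            for t in range(len(series) - span)]

# Deterministic test signal: the chaotic logistic map x -> 4x(1-x)
xs = [0.2]
for _ in range(499):
    xs.append(4.0 * xs[-1] * (1.0 - xs[-1]))

vectors = delay_embed(xs, dim=3, tau=1)   # 498 three-dimensional vectors
```

    Statistics such as correlation dimensions or Lyapunov exponents are then computed on these reconstructed vectors rather than on the raw series.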

  17. Computational methods in power system analysis

    CERN Document Server

    Idema, Reijer

    2014-01-01

    This book treats state-of-the-art computational methods for power flow studies and contingency analysis. In the first part the authors present the relevant computational methods and mathematical concepts. In the second part, power flow and contingency analysis are treated. Furthermore, traditional methods to solve such problems are compared to modern solvers, developed using the knowledge of the first part of the book. Finally, these solvers are analyzed both theoretically and experimentally, clearly showing the benefits of the modern approach.
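
    As a hedged illustration of the kind of solver such books discuss (this is not code from the book; the line reactance and load values are invented), the power flow equations for a minimal two-bus system, a slack bus feeding a PQ load over a lossless line, can be solved in a few Newton-Raphson iterations:

```python
import math

# Slack bus 1: V1 = 1.0 p.u., theta1 = 0.  PQ bus 2 behind a lossless
# line of reactance X = 0.1 p.u., so line susceptance B = 1/X = 10.
B, V1 = 10.0, 1.0
P2_spec, Q2_spec = -0.5, -0.2   # specified injections at bus 2 (a load)

def P2(th2, V2):
    # Active-power injection at bus 2 for a purely reactive line
    return B * V2 * V1 * math.sin(th2)

def Q2(th2, V2):
    # Reactive-power injection at bus 2
    return B * V2 * (V2 - V1 * math.cos(th2))

th2, V2 = 0.0, 1.0              # flat start
for _ in range(20):
    f1, f2 = P2(th2, V2) - P2_spec, Q2(th2, V2) - Q2_spec
    if max(abs(f1), abs(f2)) < 1e-10:
        break
    h = 1e-7                    # finite-difference Jacobian of the mismatch
    j11 = (P2(th2 + h, V2) - P2(th2, V2)) / h
    j12 = (P2(th2, V2 + h) - P2(th2, V2)) / h
    j21 = (Q2(th2 + h, V2) - Q2(th2, V2)) / h
    j22 = (Q2(th2, V2 + h) - Q2(th2, V2)) / h
    det = j11 * j22 - j12 * j21
    # Newton update: solve J * dx = -f by Cramer's rule (2x2 system)
    th2 -= (j22 * f1 - j12 * f2) / det
    V2 -= (-j21 * f1 + j11 * f2) / det
```

    Production solvers use analytic sparse Jacobians and factorization reuse, which is exactly the territory the book covers; the sketch only shows the iteration structure.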

  18. Computational structural analysis and finite element methods

    CERN Document Server

    Kaveh, A

    2014-01-01

    Graph theory gained initial prominence in science and engineering through its strong links with matrix algebra and computer science. Moreover, the structure of the mathematics is well suited to that of engineering problems in analysis and design. The methods of analysis in this book employ matrix algebra, graph theory and meta-heuristic algorithms, which are ideally suited for modern computational mechanics. Efficient methods are presented that lead to highly sparse and banded structural matrices. The main features of the book include: application of graph theory for efficient analysis; extension of the force method to finite element analysis; application of meta-heuristic algorithms to ordering and decomposition (sparse matrix technology); efficient use of symmetry and regularity in the force method; and simultaneous analysis and design of structures.

  19. Study on seismic equipment fragility analysis method

    International Nuclear Information System (INIS)

    Vulnerable points of nuclear power plants can be found through seismic PSA, which is an effective method to evaluate the seismic effect on nuclear power plants, and fragility analysis is one important step of seismic PSA. In this paper, the concept of seismic equipment fragility is introduced, the mathematical model of seismic fragility is given, the determination of equipment failure modes is discussed, and fragility analysis variables and methods (i.e. analysis-based and dynamic-testing-based methods) are studied in detail; finally, the median fragility, the distributions of randomness and uncertainty, and the HCLPF capacity can be calculated from formulations. On the other hand, when developing seismic fragility, three types of information can be relied on: data from real earthquake experience, test data and analysis data; the data used for a specific nuclear plant need to be collected and completed. (authors)
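
    The quantities named in the abstract have standard closed forms in the lognormal fragility model widely used in seismic PSA. The sketch below (with invented parameter values, not taken from the paper) evaluates the mean composite fragility curve and the HCLPF capacity, HCLPF = Am * exp(-1.65 * (beta_R + beta_U)):

```python
import math

def norm_cdf(x):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Lognormal fragility parameters (illustrative values only):
Am = 2.0    # median ground-acceleration capacity, in g
bR = 0.3    # logarithmic standard deviation of randomness
bU = 0.4    # logarithmic standard deviation of uncertainty
bC = math.hypot(bR, bU)   # composite variability

def failure_prob(a):
    """Mean (composite) conditional probability of failure at PGA = a (g)."""
    return norm_cdf(math.log(a / Am) / bC)

# High Confidence (95%) of Low Probability (5%) of Failure capacity:
hclpf = Am * math.exp(-1.65 * (bR + bU))   # about 0.63 g for these values
```

    By construction, failure_prob(Am) = 0.5, and separating bR from bU is what lets the analyst report confidence bands rather than a single curve.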

  20. LANDSCAPE ANALYSIS METHOD OF RIVERINE TERRITORIES

    OpenAIRE

    Fedoseeva O. S.

    2013-01-01

    The article proposes a method for landscape area analysis, which consists of four stages. Technique is proposed as a tool for the practical application of pre-project research materials in the design solutions for landscape areas planning and organization

  1. An introduction to numerical methods and analysis

    CERN Document Server

    Epperson, James F

    2013-01-01

    Praise for the First Edition ". . . outstandingly appealing with regard to its style, contents, considerations of requirements of practice, choice of examples, and exercises." -Zentralblatt MATH ". . . carefully structured with many detailed worked examples." -The Mathematical Gazette The Second Edition of the highly regarded An Introduction to Numerical Methods and Analysis provides a fully revised guide to numerical approximation. The book continues to be accessible and expertly guides readers through the many available techniques of numerical methods and analysis. An Introduction to

  2. CMEIAS bioimage informatics that define the landscape ecology of immature microbial biofilms developed on plant rhizoplane surfaces

    Directory of Open Access Journals (Sweden)

    Frank B Dazzo

    2015-10-01

    Colonization of the rhizoplane habitat is an important activity that enables certain microorganisms to promote plant growth. Here we describe various types of computer-assisted microscopy that reveal important ecological insights into early microbial colonization behavior within biofilms on plant root surfaces grown in soil. Examples of the primary data are obtained by analysis of processed images of rhizoplane biofilm landscapes analyzed at single-cell resolution using the emerging technology of CMEIAS bioimage informatics software. Included are various quantitative analyses of the in situ biofilm landscape ecology of microbes during their pioneer colonization of white clover roots, and of a rhizobial biofertilizer strain colonized on rice roots, where it significantly enhances the productivity of this important crop plant. The results show that spatial patterns of immature biofilms developed on rhizoplanes that interface rhizosphere soil are highly structured (rather than distributed randomly) when analyzed at the appropriate spatial scale, indicating that regionalized microbial cell-cell interactions and the local environment can significantly affect their cooperative and competitive colonization behaviors.
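
    One common statistic for deciding whether a colonization pattern is spatially random at a given scale, in the spirit of the in situ analyses described above (the statistic choice and the toy data are illustrative, not taken from the paper), is the Clark-Evans nearest-neighbor index R: R near 1 indicates complete spatial randomness, R < 1 clustering, and R > 1 regular spacing:

```python
import math

def clark_evans(points, area):
    """Clark-Evans aggregation index: observed mean nearest-neighbor
    distance divided by its expectation 0.5/sqrt(density) under CSR.
    (No edge correction, as befits a sketch.)"""
    n = len(points)
    nn = []
    for i, (xi, yi) in enumerate(points):
        d = min(math.hypot(xi - xj, yi - yj)
                for j, (xj, yj) in enumerate(points) if j != i)
        nn.append(d)
    observed = sum(nn) / n
    expected = 0.5 / math.sqrt(n / area)
    return observed / expected

# A perfectly regular 5x5 grid of 'cells' with unit spacing on a 4x4 field:
grid = [(x, y) for x in range(5) for y in range(5)]
R = clark_evans(grid, area=16.0)   # R > 1, as expected for regular spacing
```

    Real rhizoplane analyses would add edge corrections and significance tests, and would compute the index at several spatial scales.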

  3. An Analysis Method of Business Application Framework

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    We discuss the evolution of the object-oriented software development process based on software patterns. For developing mature software frameworks and components, we advocate eliciting and incorporating software patterns to ensure the quality and reusability of software frameworks. On the basis of an analysis of the requirement specification for the business application domain, we present an analysis method and a basic role model for software frameworks. We also elicit an analysis pattern for framework architecture, and design basic role classes and their structure.

  4. ANALYSIS OF MODERN CAR BODY STRAIGHTENING METHODS

    Directory of Open Access Journals (Sweden)

    Arhun, Sch.

    2013-01-01

    The analysis of modern car body panel straightening methods is carried out. Both traditional and alternative methods of car body panel straightening are described. The relevance of magnetic pulse technology is grounded, and the main advantages of the magnetic pulse technology of car body straightening are determined.

  5. Probabilistic Analysis Methods for Hybrid Ventilation

    DEFF Research Database (Denmark)

    Brohus, Henrik; Frier, Christian; Heiselberg, Per

    This paper discusses a general approach for the application of probabilistic analysis methods in the design of ventilation systems. The aims and scope of probabilistic versus deterministic methods are addressed with special emphasis on hybrid ventilation systems. A preliminary application of...

  6. Relating Actor Analysis Methods to Policy Problems

    NARCIS (Netherlands)

    Van der Lei, T.E.

    2009-01-01

    For a policy analyst the policy problem is the starting point for the policy analysis process. During this process the policy analyst structures the policy problem and makes a choice for an appropriate set of methods or techniques to analyze the problem (Goeller 1984). The methods of the policy anal

  7. Empirical likelihood method in survival analysis

    CERN Document Server

    Zhou, Mai

    2015-01-01

    Add the Empirical Likelihood to Your Nonparametric ToolboxEmpirical Likelihood Method in Survival Analysis explains how to use the empirical likelihood method for right censored survival data. The author uses R for calculating empirical likelihood and includes many worked out examples with the associated R code. The datasets and code are available for download on his website and CRAN.The book focuses on all the standard survival analysis topics treated with empirical likelihood, including hazard functions, cumulative distribution functions, analysis of the Cox model, and computation of empiric
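
    For right-censored data, the empirical likelihood is maximized by the Kaplan-Meier product-limit estimator, so a minimal implementation (in Python rather than the book's R, and with made-up toy data) is a useful reference point:

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate for right-censored data.

    events[i] = 1 for an observed death, 0 for a censored observation.
    Returns [(t, S(t))] at each distinct death time t."""
    data = sorted(zip(times, events))
    surv, S, i = [], 1.0, 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(e for (u, e) in data if u == t)
        at_risk = sum(1 for (u, _) in data if u >= t)
        if deaths:
            S *= 1.0 - deaths / at_risk
            surv.append((t, S))
        i += sum(1 for (u, _) in data if u == t)   # skip ties at t
    return surv

# Toy data: deaths at t = 1, 2, 4; one censored observation at t = 3
est = kaplan_meier([1, 2, 3, 4], [1, 1, 0, 1])
# est is approximately [(1, 0.75), (2, 0.5), (4, 0.0)]
```

    The empirical likelihood machinery the book develops then tests hypotheses (e.g. about the hazard or the Cox model) by comparing this unconstrained maximizer against constrained ones.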

  8. Water-soluble photoluminescent fullerene capped mesoporous silica for pH-responsive drug delivery and bioimaging

    Science.gov (United States)

    Tan, Lei; Wu, Tao; Tang, Zhao-Wen; Xiao, Jian-Yun; Zhuo, Ren-Xi; Shi, Bin; Liu, Chuan-Jun

    2016-08-01

    In this paper, a biocompatible and water-soluble fluorescent fullerene (C60-TEG-COOH) coated mesoporous silica nanoparticle (MSN) was successfully fabricated for pH-sensitive drug release and fluorescent cell imaging. The MSN was first reacted with 3-aminopropyltriethoxysilane to obtain an amino-modified MSN, and then the water-soluble C60 with a carboxyl group was used to cover the surface of the MSN through electrostatic interaction with the amino group in PBS solution (pH = 7.4). The release of doxorubicin hydrochloride (DOX) could be triggered under a mild acidic environment (lysosome, pH = 5.0) due to the protonation of C60-TEG-COO−, which induced the dissociation of the C60-TEG-COOH modified MSN (MSN@C60). Furthermore, the uptake of nanoparticles by cells could be tracked because of the green fluorescent property of the C60-modified MSN. In an in vitro study, the prepared materials showed excellent biocompatibility and the DOX-loaded nanocarrier exhibited efficient anticancer ability. This work offered a simple method for designing a simultaneous pH-responsive drug delivery and bioimaging system.

  9. Method validation for phthalate analysis from water

    OpenAIRE

    Irina Dumitraşc

    2013-01-01

    The goal of method validation is to provide objective evidence that the evaluated method shows acceptable reproducibility and accuracy and is therefore applicable. The objective of this paper is to present a validation method for quantitative phthalate analysis from water by solid-phase extraction (SPE) and determination by gas chromatography coupled with a mass spectrometry detector (GC-MS) in electron ionization (EI) mode with selected-ion monitoring (SIM) acquisition...

  10. Possibilities and limits of surface analysis methods

    International Nuclear Information System (INIS)

    The possibilities and limits of the surface analysis methods are presented and illustrated by means of a selection. An attempt is made to show how a systematology of all the methods can be built up. Some capable methods are described in detail. The examples of analyses are chosen with a view to contributing to the questions currently under study at the Institute for Reactor Development. (orig.)

  11. Reconfigurability Analysis Method for Spacecraft Autonomous Control

    Directory of Open Access Journals (Sweden)

    Dayi Wang

    2014-01-01

Full Text Available As a critical requirement for spacecraft autonomous control, reconfigurability should be considered in the design stage of spacecraft by involving an effective reconfigurability analysis method in guiding system designs. In this paper, a novel reconfigurability analysis method is proposed for spacecraft design. First, some basic definitions regarding spacecraft reconfigurability are given. Then, based on function tree theory, a reconfigurability modeling approach is established to properly describe the system's reconfigurability characteristics, and a corresponding analysis procedure based on minimal cut sets and minimal path sets is further presented. In addition, indexes of fault reconfigurable degree and system reconfigurable rate for evaluating reconfigurability are defined, and a methodology for analyzing the system's weak links is also constructed. Finally, the method is verified on a spacecraft attitude measuring system, and the results show that the presented method can not only perform quantitative reconfigurability evaluations but also find the weak links, and it therefore provides significant improvements for spacecraft reconfigurability design.

  12. Excitation methods for energy dispersive analysis

    International Nuclear Information System (INIS)

    The rapid development in recent years of energy dispersive x-ray fluorescence analysis has been based primarily on improvements in semiconductor detector x-ray spectrometers. However, the whole analysis system performance is critically dependent on the availability of optimum methods of excitation for the characteristic x rays in specimens. A number of analysis facilities based on various methods of excitation have been developed over the past few years. A discussion is given of the features of various excitation methods including charged particles, monochromatic photons, and broad-energy band photons. The effects of the excitation method on background and sensitivity are discussed from both theoretical and experimental viewpoints. Recent developments such as pulsed excitation and polarized photons are also discussed

  13. Scope-Based Method Cache Analysis

    DEFF Research Database (Denmark)

    Huber, Benedikt; Hepp, Stefan; Schoeberl, Martin

The quest for time-predictable systems has led to the exploration of new hardware architectures that simplify analysis and reasoning in the temporal domain, while still providing competitive performance. For the instruction memory, the method cache is a conceptually attractive solution, as it requests memory transfers at well-defined instructions only. In this article, we present a new cache analysis framework that generalizes and improves work on cache persistence analysis. The analysis demonstrates that a global view on the cache behavior permits the precise analyses of caches which are hard...

  14. Recent Progress on the Preparation of Luminescent Silicon Nanoparticles for Bio-Imaging Applications

    Science.gov (United States)

    Maurice, V.; Sublemontier, O.; Herlin-Boime, N.; Doris, E.; Raccurt, O.; Sanson, A.

    2010-10-01

Luminescent silicon nanoparticles produced by laser pyrolysis are considered a possible alternative to toxic quantum dots in bioimaging applications. However, these nanoparticles are fully oxidized when kept in water; the luminescent silicon core must therefore be protected from oxidation. The Si nanoparticles were embedded in monodisperse silica beads (˜50 nm) produced in microemulsion. The silica beads provide protection of the silicon core and allow stability of the photoluminescence over time. They are well dispersed in water and biological medium with a colloidal stability of several days.

  15. Extravasation of Pt-based chemotherapeutics - bioimaging of their distribution in resectates using laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS).

    Science.gov (United States)

    Egger, Alexander E; Kornauth, Christoph; Haslik, Werner; Hann, Stephan; Theiner, Sarah; Bayer, Günther; Hartinger, Christian G; Keppler, Bernhard K; Pluschnig, Ursula; Mader, Robert M

    2015-03-01

Platinum-based drugs (cisplatin, carboplatin and oxaliplatin) are widely used in cancer treatment. They are administered intravenously, so accidental extravasation of infusions can occur. This may cause severe complications for the patient, as the toxic platinum compounds likely persist in subcutaneous tissue. At high concentrations, platinum toxicity in combination with local thrombosis may result in tissue necrosis, eventually requiring surgical intervention. To describe tissue distribution at the anatomic level, we quantified drug extravasation in cryosections of various tissues (muscle, nerve, connective and fat tissue) by means of quantitative laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS) and compared the resulting data with bulk analysis by microwave-assisted digestion of tissue samples followed by ICP-MS analysis. Samples of three patients receiving systemic chemotherapy either via peripheral venous access or central access via port-a-cath® were analyzed. Pt was enriched up to 50 times in connective tissue compared with muscle tissue or drain samples collected over five days. The large areas of subcutaneous fat tissue showed areactive necrosis, and average Pt concentrations (determined upon sample digestion) ranged from 0.2 μg g(-1) (therapy with 25 mg m(-2) cisplatin, four weeks after peripheral extravasation) to 10 μg g(-1) (therapy with 50 mg m(-2) oxaliplatin, four weeks after port-a-cath® extravasation). A peripheral nerve subjected to bioimaging by LA-ICP-MS showed a five times lower Pt concentration (0.2 μg g(-1)) than the surrounding connective tissue (1.0 μg g(-1)). This is in accordance with the patient showing no signs of neurotoxicity during recovery from extravasation side-effects. Thus, bioimaging of cutaneous nerve tissue may contribute to understanding the risk of peripheral neurotoxic events. PMID:25659827

  16. Communication Error Analysis Method based on CREAM

    International Nuclear Information System (INIS)

    Communication error has been considered as a primary reason of many incidents and accidents in nuclear industry. In order to prevent these accidents, an analysis method of communication errors is proposed. This study presents a qualitative method to analyze communication errors. The qualitative method focuses on finding a root cause of the communication error and predicting the type of communication error which could happen in nuclear power plants. We develop context conditions and antecedent-consequent links of influential factors related to communication error. A case study has been conducted to validate the applicability of the proposed methods

  17. Instrumental methods of analysis, 7th edition

    International Nuclear Information System (INIS)

    The authors have prepared an organized and generally polished product. The book is fashioned to be used as a textbook for an undergraduate instrumental analysis course, a supporting textbook for graduate-level courses, and a general reference work on analytical instrumentation and techniques for professional chemists. Four major areas are emphasized: data collection and processing, spectroscopic instrumentation and methods, liquid and gas chromatographic methods, and electrochemical methods. Analytical instrumentation and methods have been updated, and a thorough citation of pertinent recent literature is included

  18. Advanced analysis methods in particle physics

    Energy Technology Data Exchange (ETDEWEB)

    Bhat, Pushpalatha C.; /Fermilab

    2010-10-01

Each generation of high energy physics experiments is grander in scale than the previous - more powerful, more complex and more demanding in terms of data handling and analysis. The spectacular performance of the Tevatron and the beginning of operations of the Large Hadron Collider have placed us at the threshold of a new era in particle physics. The discovery of the Higgs boson or another agent of electroweak symmetry breaking and evidence of new physics may be just around the corner. The greatest challenge in these pursuits is to extract the extremely rare signals, if any, from huge backgrounds arising from known physics processes. The use of advanced analysis techniques is crucial in achieving this goal. In this review, I discuss the concepts of optimal analysis, some important advanced analysis methods and a few examples. The judicious use of these advanced methods should enable new discoveries and produce results with better precision, robustness and clarity.

  19. Image Analysis by Methods of Dimension Reduction

    Czech Academy of Sciences Publication Activity Database

    Moravec, P.; Snášel, V.; Frolov, A.; Húsek, Dušan; Řezanková, H.; Polyakov, P.Y.

    Los Alamitos: IEEE Computer Society, 2007, s. 272-277. ISBN 0-7695-2894-5. [CISIM'07. International Conference on Computer Information Systems and Industrial Management Applications /6./. Elk (PL), 28.06.2007-30.06.2007] R&D Projects: GA ČR GA201/05/0079 Institutional research plan: CEZ:AV0Z10300504 Keywords : image analysis * methods of dimension reduction * cluster analysis Subject RIV: BB - Applied Statistics, Operational Research

  20. Scaling Internet Search Engines - Methods and Analysis

    OpenAIRE

    Risvik, Knut Magne

    2004-01-01

    This thesis focuses on methods and analysis for building scalable Internet Search Engines. In this work, we have developed a search kernel, an architecture framework and applications that are being used in industrial and commercial products. Furthermore, we present both analysis and design of key elements. Essential to building a large-scale search engine is to understand the dynamics of the content in which we are searching. For the challenging case of searching the web, there are multiple d...

  1. Analysis of mixed data methods & applications

    CERN Document Server

    de Leon, Alexander R

    2013-01-01

A comprehensive source on mixed data analysis, Analysis of Mixed Data: Methods & Applications summarizes the fundamental developments in the field. Case studies are used extensively throughout the book to illustrate interesting applications from economics, medicine and health, marketing, and genetics. Carefully edited for smooth readability and seamless transitions between chapters. All chapters follow a common structure, with an introduction and a concluding summary, and include illustrative examples from real-life case studies in developmental toxicolog

  2. Chromatographic methods for analysis of triazine herbicides.

    Science.gov (United States)

    Abbas, Hana Hassan; Elbashir, Abdalla A; Aboul-Enein, Hassan Y

    2015-01-01

Gas chromatography (GC) and high-performance liquid chromatography (HPLC) coupled to different detectors, and in combination with different sample extraction methods, are most widely used for analysis of triazine herbicides in different environmental samples. Nowadays, many variations and modifications of extraction and sample preparation methods such as solid-phase microextraction (SPME), hollow fiber-liquid phase microextraction (HF-LPME), stir bar sorptive extraction (SBSE), headspace-solid phase microextraction (HS-SPME), dispersive liquid-liquid microextraction (DLLME), dispersive liquid-liquid microextraction based on solidification of floating organic droplet (DLLME-SFO), ultrasound-assisted emulsification microextraction (USAEME), and others have been introduced and developed to obtain sensitive and accurate methods for the analysis of these hazardous compounds. In this review, several analytical properties such as linearity, sensitivity, repeatability, and accuracy for each developed method are discussed, and excellent results were obtained for most of the developed methods combined with GC and HPLC techniques for the analysis of triazine herbicides. This review gives an overview of recent publications of the application of GC and HPLC for analysis of triazine herbicide residues in various samples. PMID:25849823

  3. Review of strain buckling: analysis methods

    International Nuclear Information System (INIS)

    This report represents an attempt to review the mechanical analysis methods reported in the literature to account for the specific behaviour that we call buckling under strain. In this report, this expression covers all buckling mechanisms in which the strains imposed play a role, whether they act alone (as in simple buckling under controlled strain), or whether they act with other loadings (primary loading, such as pressure, for example). Attention is focused on the practical problems relevant to LMFBR reactors. The components concerned are distinguished by their high slenderness ratios and by rather high thermal levels, both constant and variable with time. Conventional static buckling analysis methods are not always appropriate for the consideration of buckling under strain. New methods must therefore be developed in certain cases. It is also hoped that this review will facilitate the coding of these analytical methods to aid the constructor in his design task and to identify the areas which merit further investigation

  4. Probabilistic structural analysis methods development for SSME

    Science.gov (United States)

    Chamis, C. C.; Hopkins, D. A.

    1988-01-01

The development of probabilistic structural analysis methods is a major part of the SSME Structural Durability Program and consists of three program elements: composite load spectra, probabilistic finite element structural analysis, and probabilistic structural analysis applications. Recent progress includes: (1) the effects of the uncertainties of several factors on the HPFP blade temperature, pressure, and torque, (2) the evaluation of the cumulative distribution function of structural response variables based on assumed uncertainties on primitive structural variables, and (3) evaluation of the failure probability. Collectively, the results obtained demonstrate that the structural durability of critical SSME components can be probabilistically evaluated.
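The probabilistic evaluation this record describes (propagating assumed uncertainties on primitive structural variables into a distribution of a response variable, then evaluating a failure probability) can be sketched with a toy Monte Carlo. The load/strength model and every number below are invented for illustration; they are not the SSME models.

```python
import random

def simulate_response(n_samples=20000, seed=42):
    """Toy Monte Carlo: propagate uncertain 'primitive' variables
    (a load L and a strength S, both assumed Gaussian) through the
    response model M = S - L, and estimate the failure probability
    P(M < 0). All distributions are illustrative assumptions."""
    rng = random.Random(seed)
    margins = []
    for _ in range(n_samples):
        load = rng.gauss(100.0, 15.0)      # assumed load uncertainty
        strength = rng.gauss(150.0, 20.0)  # assumed strength uncertainty
        margins.append(strength - load)
    failures = sum(1 for m in margins if m < 0)
    return margins, failures / n_samples

margins, p_fail = simulate_response()
# For this toy model the margin is N(50, 25), so the exact failure
# probability is Phi(-2), roughly 0.023; the estimate should be close.
print(f"estimated failure probability: {p_fail:.4f}")
```

Sorting `margins` and reading off empirical quantiles gives the cumulative distribution function of the response variable mentioned in item (2).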

  5. Implicitly Weighted Methods in Robust Image Analysis

    OpenAIRE

    Kalina, J. (Jan)

    2012-01-01

    This paper is devoted to highly robust statistical methods with applications to image analysis. The methods of the paper exploit the idea of implicit weighting, which is inspired by the highly robust least weighted squares regression estimator. We use a correlation coefficient based on implicit weighting of individual pixels as a highly robust similarity measure between two images. The reweighted least weighted squares estimator is considered as an alternative regression estimator with a clea...

  6. An introduction to numerical methods and analysis

    CERN Document Server

    Epperson, J F

    2007-01-01

Praise for the First Edition: ". . . outstandingly appealing with regard to its style, contents, considerations of requirements of practice, choice of examples, and exercises." -Zentralblatt Math ". . . carefully structured with many detailed worked examples . . ." -The Mathematical Gazette ". . . an up-to-date and user-friendly account . . ." -Mathematika An Introduction to Numerical Methods and Analysis addresses the mathematics underlying approximation and scientific computing and successfully explains where approximation methods come from, why they sometimes work (or d

  7. Methods for genetic linkage analysis using trisomies

    Energy Technology Data Exchange (ETDEWEB)

    Feingold, E. [Emory Univ. School of Public Health, Atlanta, GA (United States); Lamb, N.E.; Sherman, S.L. [Emory Univ., Atlanta, GA (United States)

    1995-02-01

Certain genetic disorders are rare in the general population, but more common in individuals with specific trisomies. Examples of this include leukemia and duodenal atresia in trisomy 21. This paper presents a linkage analysis method for using trisomic individuals to map genes for such traits. It is based on a very general gene-specific dosage model that posits that the trait is caused by specific effects of different alleles at one or a few loci and that duplicate copies of "susceptibility" alleles inherited from the nondisjoining parent give increased likelihood of having the trait. Our mapping method is similar to identity-by-descent-based mapping methods using affected relative pairs and also to methods for mapping recessive traits using inbred individuals by looking for markers with greater than expected homozygosity by descent. In the trisomy case, one would take trisomic individuals and look for markers with greater than expected homozygosity in the chromosomes inherited from the nondisjoining parent. We present statistical methods for performing such a linkage analysis, including a test for linkage to a marker, a method for estimating the distance from the marker to the trait gene, a confidence interval for that distance, and methods for computing power and sample sizes. We also resolve some practical issues involved in implementing the methods, including how to use partially informative markers and how to test candidate genes. 20 refs., 5 figs., 1 tab.
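In its simplest form, the marker test described in this abstract (greater-than-expected homozygosity in the chromosomes inherited from the nondisjoining parent) reduces to a one-sided exact binomial tail test. A minimal sketch, where the null homozygosity rate `p_null` is a supplied assumption (in practice it would be derived from marker informativeness) and the counts are invented:

```python
from math import comb

def binom_sf(k, n, p):
    """Exact upper tail P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def excess_homozygosity_test(n_trisomic, n_homozygous, p_null):
    """One-sided test for linkage: is homozygosity in the chromosomes
    inherited from the nondisjoining parent higher than the rate
    p_null expected under no linkage?  Hypothetical helper, not the
    paper's full machinery (no distance estimation or power)."""
    return binom_sf(n_homozygous, n_trisomic, p_null)

# Invented data: 40 trisomic individuals, 28 homozygous at the marker,
# with a 50% null homozygosity rate.
p_value = excess_homozygosity_test(40, 28, 0.5)
print(f"one-sided p-value: {p_value:.4f}")
```

A small p-value would flag the marker as a candidate for linkage; the record's further steps (distance estimation, confidence intervals, sample size) build on the same counts.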

  8. Multifunctional NaYF4:Yb, Er@mSiO2@Fe3O4-PEG nanoparticles for UCL/MR bioimaging and magnetically targeted drug delivery

    Science.gov (United States)

    Liu, Bei; Li, Chunxia; Ma, Ping'an; Chen, Yinyin; Zhang, Yuanxin; Hou, Zhiyao; Huang, Shanshan; Lin, Jun

    2015-01-01

A low toxic multifunctional nanoplatform, integrating both multimodal diagnosis methods and antitumor therapy, is highly desirable to assure its antitumor efficiency. In this work, we show a convenient and adjustable synthesis of multifunctional nanoparticles NaYF4:Yb, Er@mSiO2@Fe3O4-PEG (MFNPs) based on different sizes of up-conversion nanoparticles (UCNPs). With strong up-conversion fluorescence offered by UCNPs, superparamagnetism properties attributed to Fe3O4 nanoparticles and porous structure coming from the mesoporous SiO2 shell, the as-obtained MFNPs can be utilized not only as a contrast agent for dual modal up-conversion luminescence (UCL)/magnetic resonance (MR) bio-imaging, but can also achieve an effective magnetically targeted antitumor chemotherapy both in vitro and in vivo. Furthermore, the UCL intensity of UCNPs and the magnetic properties of Fe3O4 in the MFNPs were carefully balanced. Silica coating and further PEG modifying can improve the hydrophilicity and biocompatibility of the as-synthesized MFNPs, which was confirmed by the in vitro/in vivo biocompatibility and in vivo long-time bio-distribution tests. These results revealed that the UCNPs-based magnetically targeted drug carrier system we synthesized has great promise for multimodal bio-imaging and targeted cancer therapy.

  9. Numerical methods in software and analysis

    CERN Document Server

    Rice, John R

    1992-01-01

    Numerical Methods, Software, and Analysis, Second Edition introduces science and engineering students to the methods, tools, and ideas of numerical computation. Introductory courses in numerical methods face a fundamental problem-there is too little time to learn too much. This text solves that problem by using high-quality mathematical software. In fact, the objective of the text is to present scientific problem solving using standard mathematical software. This book discusses numerous programs and software packages focusing on the IMSL library (including the PROTRAN system) and ACM Algorithm

  10. Model correction factor method for system analysis

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager; Johannesen, Johannes M.

    2000-01-01

The Model Correction Factor Method is an intelligent response surface method based on simplified modeling. MCFM is aimed at reliability analysis in case of a limit state defined by an elaborate model. Herein it is demonstrated that the method is applicable for elaborate limit state surfaces on which...... clearly defined failure modes, the MCFM can be started from each idealized single mode limit state in turn to identify a locally most central point on the elaborate limit state surface. Typically this procedure leads to a fewer number of locally most central failure points on the elaborate limit state...

  11. LANDSCAPE ANALYSIS METHOD OF RIVERINE TERRITORIES

    Directory of Open Access Journals (Sweden)

    Fedoseeva O. S.

    2013-10-01

Full Text Available The article proposes a method for landscape area analysis, which consists of four stages. The technique is proposed as a tool for the practical application of pre-project research materials in design solutions for the planning and organization of landscape areas

  12. Novel NMR Method for Organic Aerosol Analysis

    Czech Academy of Sciences Publication Activity Database

    Horník, Štěpán

Prague: Institute of Chemical Process Fundamentals of the CAS, v. v. i., 2015 - (Bendová, M.; Wagner, Z.), s. 20-21 ISBN 978-80-86186-70-2. [Bažant Postgraduate Conference 2015. Prague (CZ)] Institutional support: RVO:67985858 Keywords : nmr method * organic aerosol composition * analysis Subject RIV: CF - Physical ; Theoretical Chemistry

  14. Serum-stable quantum dot--protein hybrid nanocapsules for optical bio-imaging

    International Nuclear Information System (INIS)

    We introduce shell cross-linked protein/quantum dot (QD) hybrid nanocapsules as a serum-stable systemic delivery nanocarrier for tumor-targeted in vivo bio-imaging applications. Highly luminescent, heavy-metal-free Cu0.3InS2/ZnS (CIS/ZnS) core-shell QDs are synthesized and mixed with amine-reactive six-armed poly(ethylene glycol) (PEG) in dichloromethane. Emulsification in an aqueous solution containing human serum albumin (HSA) results in shell cross-linked nanocapsules incorporating CIS/ZnS QDs, exhibiting high luminescence and excellent dispersion stability in a serum-containing medium. Folic acid is introduced as a tumor-targeting ligand. The feasibility of tumor-targeted in vivo bio-imaging is demonstrated by measuring the fluorescence intensity of several major organs and tumor tissue after an intravenous tail vein injection of the nanocapsules into nude mice. The cytotoxicity of the QD-loaded HSA-PEG nanocapsules is also examined in several types of cells. Our results show that the cellular uptake of the QDs is critical for cytotoxicity. Moreover, a significantly lower level of cell death is observed in the CIS/ZnS QDs compared to nanocapsules loaded with cadmium-based QDs. This study suggests that the systemic tumor targeting of heavy-metal-free QDs using shell cross-linked HSA-PEG hybrid nanocapsules is a promising route for in vivo tumor diagnosis with reduced non-specific toxicity. (papers)

  15. Multiple predictor smoothing methods for sensitivity analysis

    International Nuclear Information System (INIS)

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (1) locally weighted regression (LOESS), (2) additive models, (3) projection pursuit regression, and (4) recursive partitioning regression. The indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present
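Of the four smoothing techniques listed, locally weighted regression is the simplest to illustrate. Below is a minimal pure-Python LOESS evaluation at a single point (tricube weights over the nearest fraction of points, then a closed-form weighted line fit); the bandwidth choice and the stepwise multi-predictor procedure of the record are omitted.

```python
def loess_point(xs, ys, x0, frac=0.5):
    """Locally weighted linear regression (LOESS) at a single point x0.
    Sketch only: tricube weights over the nearest frac*n points,
    then a closed-form weighted least-squares line fit."""
    n = len(xs)
    k = max(2, int(frac * n))
    # indices of the k points nearest to x0
    idx = sorted(range(n), key=lambda i: abs(xs[i] - x0))[:k]
    d_max = max(abs(xs[i] - x0) for i in idx) or 1e-12
    w = [(1 - (abs(xs[i] - x0) / d_max) ** 3) ** 3 for i in idx]
    sw = sum(w)
    mx = sum(wi * xs[i] for wi, i in zip(w, idx)) / sw
    my = sum(wi * ys[i] for wi, i in zip(w, idx)) / sw
    sxx = sum(wi * (xs[i] - mx) ** 2 for wi, i in zip(w, idx))
    sxy = sum(wi * (xs[i] - mx) * (ys[i] - my) for wi, i in zip(w, idx))
    slope = sxy / sxx if sxx else 0.0
    return my + slope * (x0 - mx)

# Smooth a noise-free quadratic; the local linear fit tracks the curve.
xs = [i / 10 for i in range(21)]        # 0.0 .. 2.0
ys = [x * x for x in xs]
print(round(loess_point(xs, ys, 1.0), 3))
```

In a sensitivity analysis one would compare, for each input variable, how much such a smooth reduces the residual variance of the model prediction.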

  16. Methods of quantitative fire hazard analysis

    International Nuclear Information System (INIS)

    Simplified fire hazard analysis methods have been developed as part of the FIVE risk-based fire induced vulnerability evaluation methodology for nuclear power plants. These fire hazard analyses are intended to permit plant fire protection personnel to conservatively evaluate the potential for credible exposure fires to cause critical damage to essential safe-shutdown equipment and thereby screen from further analysis spaces where a significant fire hazard clearly does not exist. This document addresses the technical bases for the fire hazard analysis methods. A separate user's guide addresses the implementation of the fire screening methodology, which has been implemented with three worksheets and a number of look-up tables. The worksheets address different locations of targets relative to exposure fire sources. The look-up tables address fire-induced conditions in enclosures in terms of three stages: a fire plume/ceiling jet period, an unventilated enclosure smoke filling period and a ventilated quasi-steady period

  17. Economic analysis of alternative LLW disposal methods

    International Nuclear Information System (INIS)

The Environmental Protection Agency (EPA) has evaluated the costs and benefits of alternative disposal technologies as part of its program to develop generally applicable environmental standards for the land disposal of low-level radioactive waste (LLW). Costs, population health effects and Critical Population Group (CPG) exposures resulting from alternative waste treatment and disposal methods were developed and input into the analysis. The cost-effectiveness analysis took into account a number of waste streams, hydrogeologic and climatic region settings, and waste treatment and disposal methods. Total costs of each level of a standard included costs for packaging, processing, transportation, and burial of waste. Benefits are defined in terms of reductions in the general population health risk (expected fatal cancers and genetic effects) evaluated over 10,000 years. A cost-effectiveness ratio was calculated for each alternative standard. This paper describes the alternatives considered and preliminary results of the cost-effectiveness analysis.
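The cost-effectiveness ratio described here is plain arithmetic over the alternatives; a toy sketch with invented names and numbers (a real analysis would, as the record notes, integrate health effects over 10,000 years):

```python
def incremental_ce_ratios(alternatives):
    """Toy incremental cost-effectiveness calculation.  Each alternative
    is a (name, total_cost, health_effects_averted) tuple.  Sort by
    benefit and compute the incremental cost per additional health
    effect averted.  Names and numbers are invented for illustration."""
    ordered = sorted(alternatives, key=lambda a: a[2])
    out = []
    for (n0, c0, b0), (n1, c1, b1) in zip(ordered, ordered[1:]):
        db = b1 - b0
        out.append((n1, (c1 - c0) / db if db else float("inf")))
    return out

alts = [("baseline", 10.0, 0.0), ("standard A", 25.0, 5.0), ("standard B", 60.0, 7.0)]
for name, icer in incremental_ce_ratios(alts):
    print(f"{name}: {icer:.1f} cost units per health effect averted")
```

A regulator would compare each ratio against a threshold to decide which level of standard is justified.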

  18. Reliability and risk analysis methods research plan

    International Nuclear Information System (INIS)

    This document presents a plan for reliability and risk analysis methods research to be performed mainly by the Reactor Risk Branch (RRB), Division of Risk Analysis and Operations (DRAO), Office of Nuclear Regulatory Research. It includes those activities of other DRAO branches which are very closely related to those of the RRB. Related or interfacing programs of other divisions, offices and organizations are merely indicated. The primary use of this document is envisioned as an NRC working document, covering about a 3-year period, to foster better coordination in reliability and risk analysis methods development between the offices of Nuclear Regulatory Research and Nuclear Reactor Regulation. It will also serve as an information source for contractors and others to more clearly understand the objectives, needs, programmatic activities and interfaces together with the overall logical structure of the program

  19. Multiple predictor smoothing methods for sensitivity analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Helton, Jon Craig; Storlie, Curtis B.

    2006-08-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (1) locally weighted regression (LOESS), (2) additive models, (3) projection pursuit regression, and (4) recursive partitioning regression. The indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present.

  20. Digital Forensics Analysis of Spectral Estimation Methods

    CERN Document Server

    Mataracioglu, Tolga

    2011-01-01

Steganography is the art and science of writing hidden messages in such a way that no one apart from the intended recipient knows of the existence of the message. In today's world, it is widely used in order to secure information. In this paper, the traditional spectral estimation methods are introduced. The performance of each method is examined by comparing all of the spectral estimation methods. Finally, drawing on those performance analyses, brief pros and cons of the spectral estimation methods are given. We also give a steganography demo by hiding information in a sound signal and manage to extract the information (i.e., the true frequency of the information signal) from the sound by means of the spectral estimation methods.
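The simplest of the classical spectral estimation methods, the periodogram, is enough to reproduce the flavor of such a demo: recovering the frequency of a tone hidden in a sound signal. A minimal direct-DFT sketch (O(n^2), illustrative only, not the paper's implementation):

```python
import cmath
import math

def periodogram_peak(signal, sample_rate):
    """Classical periodogram spectral estimate via a direct DFT,
    returning the frequency of the bin with the most power.
    O(n^2) sketch, fine for short signals."""
    n = len(signal)
    powers = []
    for k in range(1, n // 2):  # skip the DC bin
        s = sum(x * cmath.exp(-2j * math.pi * k * t / n)
                for t, x in enumerate(signal))
        powers.append((abs(s) ** 2 / n, k))
    _, k_peak = max(powers)
    return k_peak * sample_rate / n

# A 50 Hz 'hidden' tone sampled at 1 kHz: the peak lands on 50 Hz.
fs = 1000
sig = [math.sin(2 * math.pi * 50 * t / fs) for t in range(200)]
print(periodogram_peak(sig, fs))
```

The parametric estimators the paper compares (e.g. autoregressive methods) trade this simplicity for better resolution on short, noisy records.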

  1. Semiconductor Quantum Dots for Bioimaging and Biodiagnostic Applications

    OpenAIRE

    Kairdolf, Brad A.; Andrew M Smith; Stokes, Todd H.; Wang, May D.; Young, Andrew N.; Nie, Shuming

    2013-01-01

    Semiconductor quantum dots (QDs) are light-emitting particles on the nanometer scale that have emerged as a new class of fluorescent labels for chemical analysis, molecular imaging, and biomedical diagnostics. Compared with traditional fluorescent probes, QDs have unique optical and electronic properties such as size-tunable light emission, narrow and symmetric emission spectra, and broad absorption spectra that enable the simultaneous excitation of multiple fluorescence colors. QDs are also ...

  2. Computational methods for nuclear criticality safety analysis

    International Nuclear Information System (INIS)

Nuclear criticality safety analyses require the utilization of methods which have been tested and verified against benchmark results. In this work, criticality calculations based on the KENO-IV and MCNP codes are studied, aiming at the qualification of these methods at the IPEN-CNEN/SP and COPESP. The utilization of variance reduction techniques is important to reduce the computer execution time, and several of them are analysed. As a practical example of the above methods, a criticality safety analysis for the storage tubes for irradiated fuel elements from the IEA-R1 research reactor has been carried out. This analysis showed that the MCNP code is more adequate for problems with complex geometries, and the KENO-IV code shows conservative results when the generalized geometry option is not used. (author)

  3. Heteroscedastic regression analysis method for mixed data

    Institute of Scientific and Technical Information of China (English)

    FU Hui-min; YUE Xiao-rui

    2011-01-01

    The heteroscedastic regression model was established and a heteroscedastic regression analysis method was presented for mixed data composed of complete data, type-I censored data and type-II censored data from the location-scale distribution. The best unbiased estimations of the regression coefficients, as well as confidence limits for the location and scale parameters, were given. Furthermore, point estimations and confidence limits of percentiles were obtained. Thus the traditional multiple regression analysis method, which is suitable only for complete data from the normal distribution, is extended to heteroscedastic mixed data and the location-scale distribution, giving the presented method a broad range of promising applications.
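
    For the complete-data case, the core idea of down-weighting high-variance observations can be sketched with ordinary weighted least squares; the paper's estimators for censored location-scale data are more involved. All data below are simulated, and the noise model is an illustrative assumption.

```python
import numpy as np

# Simulated heteroscedastic data: noise scale grows linearly with x
rng = np.random.default_rng(1)
n = 200
x = np.linspace(1.0, 10.0, n)
sigma = 0.2 * x                      # known (assumed) per-point noise scale
y = 2.0 + 3.0 * x + sigma * rng.standard_normal(n)

# Weighted least squares: weight each point by 1/sigma_i^2, then solve the
# normal equations (X^T W X) beta = X^T W y
X = np.column_stack([np.ones(n), x])
W = 1.0 / sigma**2
beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * y))
# beta[0] estimates the intercept (2.0), beta[1] the slope (3.0)
```

    With the weights set to the inverse noise variances, WLS is the best linear unbiased estimator for this model, which is the property the abstract's "best unbiased estimations" generalizes to censored data.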

  4. Simple gas chromatographic method for furfural analysis.

    Science.gov (United States)

    Gaspar, Elvira M S M; Lopes, João F

    2009-04-01

    A new, simple gas chromatographic method was developed for the direct analysis of 5-hydroxymethylfurfural (5-HMF), 2-furfural (2-F) and 5-methylfurfural (5-MF) in liquid and water-soluble foods, using direct-immersion SPME coupled to GC-FID and/or GC-TOF-MS. The fiber (DVB/CAR/PDMS) conditions were optimized: pH effect, temperature, adsorption and desorption times. The method is simple and accurate (RSDHMF; GC-TOF-MS: 0.3, 1.2 and 0.9 ng mL(-1) for 2-F, 5-MF and 5-HMF, respectively). It was applied to different commercial food matrices: honey; white, demerara, brown and yellow table sugars; and white and red balsamic vinegars. This one-step, sensitive and direct method for the analysis of furfurals will contribute to characterising and quantifying their presence in the human diet. PMID:18976770

  5. Robust Image Analysis of BeadChip Microarrays

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan; Schlenker, A.

    Lisbon: Scitepress, 2015 - (Secca, M.; Schier, J.; Fred, A.; Gamboa, H.; Elias, D.), s. 89-94 ISBN 978-989-758-072-7. [BIOIMAGING 2015. International Conference on Bioimaging /2./. Lisbon (PT), 12.01.2015-15.01.2015] Grant ostatní: SVV(CZ) 260034 Institutional support: RVO:67985807 Keywords : microarray * robust image analysis * noise * outlying measurements * background effect Subject RIV: IN - Informatics, Computer Science

  6. Ultrastable green fluorescence carbon dots with a high quantum yield for bioimaging and use as theranostic carriers

    DEFF Research Database (Denmark)

    Yang, Chuanxu; Thomsen, Rasmus Peter; Ogaki, Ryosuke;

    2015-01-01

    further assembled the Cdots into nanocomplexes with hyaluronic acid for potential use as theranostic carriers. After confirming that the Cdot nanocomplexes exhibited negligible cytotoxicity with H1299 lung cancer cells, in vitro bioimaging of the Cdots and nanocomplexes was carried out. Doxorubicin (Dox...

  7. A special issue on reviews in biomedical applications of nanomaterials, tissue engineering, stem cells, bioimaging, and toxicity.

    Science.gov (United States)

    Nalwa, Hari Singh

    2014-10-01

    This second special issue of the Journal of Biomedical Nanotechnology in a series contains another 30 state-of-the-art reviews focused on the biomedical applications of nanomaterials, biosensors, bone tissue engineering, MRI and bioimaging, single-cell detection, stem cells, endothelial progenitor cells, the toxicity and biosafety of nanodrugs, and new nanoparticle-based therapeutic approaches for cancer and for hepatic and cardiovascular disease. PMID:25992404

  8. Power System Transient Stability Analysis through a Homotopy Analysis Method

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Shaobu; Du, Pengwei; Zhou, Ning

    2014-04-01

    As an important function of energy management systems (EMSs), online contingency analysis plays a key role in providing power system security warnings of instability. At present, N-1 contingency analysis still relies on time-consuming numerical integration. To save computational cost, this paper proposes a quasi-analytical method that evaluates transient stability through the sensitivity of time-domain periodic solutions' frequencies to initial values. First, dynamic systems described by classical models are modified into damping-free systems whose solutions are either periodic or expanded (non-convergent). Second, because these sensitivities change sharply when periodic solutions vanish and turn into expanded solutions, transient stability is assessed using the sensitivity. Third, homotopy analysis is introduced to extract the frequency information and evaluate the sensitivities from initial values alone, so that time-consuming numerical integration is avoided. Finally, a simple case demonstrates the application of the proposed method, and simulation results show that it is promising.

  9. Generalized analysis method for neutron resonance transmission analysis

    International Nuclear Information System (INIS)

    Neutron resonance densitometry (NRD) is a non-destructive analysis method, which can be applied to quantify special nuclear materials (SNM) in small particle-like debris of melted fuel that are formed in severe accidents of nuclear reactors such as the Fukushima Daiichi nuclear power plants. NRD uses neutron resonance transmission analysis (NRTA) to quantify SNM and neutron resonance capture analysis (NRCA) to identify matrix materials and impurities. To apply NRD for the characterization of arbitrary-shaped thick materials, a generalized method for the analysis of NRTA data has been developed. The method has been applied on data resulting from transmission through thick samples with an irregular shape and an areal density of SNM up to 0.253 at/b (≈100 g/cm2). The investigation shows that NRD can be used to quantify SNM with a high accuracy not only in inhomogeneous samples made of particle-like debris but also in samples made of large rocks with an irregular shape by applying the generalized analysis method for NRTA. (author)

  10. Methods for genetic linkage analysis using trisomies

    Energy Technology Data Exchange (ETDEWEB)

    Feingold, E.; Lamb, N.E.; Sherman, S.L. [Emory Univ., Atlanta, GA (United States)

    1994-09-01

    Certain genetic disorders (e.g. congenital cataracts, duodenal atresia) are rare in the general population but more common in people with Down's syndrome. We present a method for using individuals with trisomy 21 to map genes for such traits. Our methods are analogous to methods for mapping autosomal dominant traits using affected relative pairs, which look for markers with greater-than-expected identity-by-descent. In the trisomy case, one takes trisomic individuals and looks for markers with greater-than-expected reduction to homozygosity in the chromosomes inherited from the non-disjoining parent. We present statistical methods for performing such a linkage analysis, including a test for linkage to a marker, a method for estimating the distance from the marker to the gene, a confidence interval for that distance, and methods for computing power and sample sizes. The methods are described in the context of a gene-dosage model for the etiology of the disorder, but can be extended to other models. We also resolve some practical issues involved in implementing the methods, including how to use partially informative markers, how to test candidate genes, and how to handle the effect of reduced recombination associated with maternal meiosis I non-disjunction.
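
    A minimal sketch of the linkage test idea follows, assuming a purely illustrative null rate of 0.5 for reduction to homozygosity; the paper derives the proper null rate from marker informativeness and recombination, so both the rate and the counts here are hypothetical.

```python
from math import comb

def binom_sf(k, n, p):
    """Exact upper tail P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical data: 30 of 40 trisomic individuals show reduction to
# homozygosity at the marker. Under no linkage, each individual shows
# reduction with the (assumed, illustrative) null probability 0.5.
n_individuals, n_reduced, p_null = 40, 30, 0.5
p_value = binom_sf(n_reduced, n_individuals, p_null)
linked = p_value < 0.05   # one-sided test: excess reduction suggests linkage
```

    A one-sided test is appropriate because linkage can only inflate, never deflate, the reduction-to-homozygosity rate relative to the null.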

  11. Method of thermal derivative gradient analysis (TDGA

    Directory of Open Access Journals (Sweden)

    M. Cholewa

    2009-07-01

    In this work a concept of thermal analysis is presented that describes crystallization kinetics using the temperature derivatives with respect to time and direction. The method of thermal derivative gradient analysis (TDGA) is intended for the investigation of alloys and metals, as well as cast composites, in the solidification range. The construction and operating characteristics of the test stand are presented, including the processing modules and probes together with the thermocouple locations. The authors present examples of result interpretation for AlSi11 alloy castings with diversified wall thickness and at different pouring temperatures.

  12. Text analysis devices, articles of manufacture, and text analysis methods

    Science.gov (United States)

    Turner, Alan E; Hetzler, Elizabeth G; Nakamura, Grant C

    2013-05-28

    Text analysis devices, articles of manufacture, and text analysis methods are described according to some aspects. In one aspect, a text analysis device includes processing circuitry configured to analyze initial text to generate a measurement basis usable in analysis of subsequent text, wherein the measurement basis comprises a plurality of measurement features from the initial text, a plurality of dimension anchors from the initial text and a plurality of associations of the measurement features with the dimension anchors, and wherein the processing circuitry is configured to access a viewpoint indicative of a perspective of interest of a user with respect to the analysis of the subsequent text, and wherein the processing circuitry is configured to use the viewpoint to generate the measurement basis.

  13. Spectroscopic Chemical Analysis Methods and Apparatus

    Science.gov (United States)

    Hug, William F.; Reid, Ray D.

    2012-01-01

    This invention relates to non-contact spectroscopic methods and apparatus for performing chemical analysis and to the ideal wavelengths and sources needed for this analysis. It employs deep-ultraviolet (200- to 300-nm spectral range) electron-beam-pumped wide-bandgap semiconductor lasers, incoherent wide-bandgap semiconductor light-emitting devices, and hollow-cathode metal-ion lasers. Three goals achieved by this innovation are reduced size (under 20 L), reduced weight [under 100 lb (45 kg)], and reduced power consumption (under 100 W). The method can be used in a microscope or macroscope to measure Raman and/or native fluorescence emission spectra, either point by point or by global imaging of emissions within specific ultraviolet spectral bands. In other embodiments, the method can be used in analytical instruments such as capillary electrophoresis, capillary electro-chromatography, high-performance liquid chromatography, flow cytometry, and related instruments for the detection and identification of unknown analytes using a combination of native fluorescence and/or Raman spectroscopic methods. The design provides an electron-beam-pumped semiconductor radiation source that can emit at a wavelength (or wavelengths) below 300 nm, e.g. in the deep ultraviolet between about 200 and 300 nm, and more preferably less than 260 nm. In some variations the method produces incoherent radiation, while in other implementations it produces laser radiation; this may be achieved using an AlGaN emission medium or, in other implementations, a diamond emission medium. The instrument irradiates a sample with deep-UV radiation and then uses an improved filter to separate the wavelengths to be detected, providing a multi-stage analysis of the sample. To avoid the difficulties related to producing deep-UV semiconductor sources, a pumping approach has been developed that uses

  14. ANALYSIS METHOD OF AUTOMATIC PLANETARY TRANSMISSION KINEMATICS

    Directory of Open Access Journals (Sweden)

    Józef DREWNIAK

    2014-06-01

    In the present paper, a planetary automatic transmission is modeled by means of contour graphs. The goals of the modeling are versatile: ratio calculation via algorithmic equation generation, and analysis of velocities and accelerations. Exemplary gear runs are analyzed; several drives/gears are taken into account in turn, discussing the functional schemes, assigned contour graphs, and generated systems of equations and their solutions. The advantages of the method are its algorithmic approach and its generality, in which particular drives are cases of the general model. Moreover, the method allows for further analysis and synthesis tasks, e.g. checking the isomorphism of design solutions.

  15. Advances in the homotopy analysis method

    CERN Document Server

    Liao, Shijun

    2013-01-01

    Unlike other analytic techniques, the Homotopy Analysis Method (HAM) is independent of small/large physical parameters. Besides, it provides great freedom to choose the equation type and solution expression of the related linear high-order approximation equations. The HAM provides a simple way to guarantee the convergence of the solution series; this uniqueness differentiates the HAM from all other analytic approximation methods. In addition, the HAM can be applied to solve some challenging problems with high nonlinearity. This book, edited by the pioneer and founder of the HAM, describes the current advances

  16. Cloud Based Development Issues: A Methodical Analysis

    Directory of Open Access Journals (Sweden)

    Sukhpal Singh

    2012-11-01

    Cloud-based development is a challenging task for various software engineering projects, especially those which demand extraordinary quality, reusability and security along with a general architecture. In this paper we report a methodical analysis of cloud-based development problems published in major computer science and software engineering journals and conferences. Research papers were collected from different scholarly databases using search engines within a particular period of time. A total of 89 research papers were analyzed in this methodical study and categorized into four classes according to the problems they address. The majority of the papers focused on quality (24 papers) associated with cloud-based development, and 16 papers focused on analysis and design. By considering the areas covered by existing authors and the gaps between them, untouched areas of cloud-based development can be identified for future research.

  17. ANALYSIS METHOD OF AUTOMATIC PLANETARY TRANSMISSION KINEMATICS

    OpenAIRE

    Józef DREWNIAK; Stanisław ZAWIŚLAK; Wieczorek, Andrzej

    2014-01-01

    In the present paper, planetary automatic transmission is modeled by means of contour graphs. The goals of modeling could be versatile: ratio calculating via algorithmic equation generation, analysis of velocity and accelerations. The exemplary gears running are analyzed, several drives/gears are consecutively taken into account discussing functional schemes, assigned contour graphs and generated system of equations and their solutions. The advantages of the method are: algorithmic approach, ...

  18. Probabilistic structural analysis methods and applications

    Science.gov (United States)

    Cruse, T. A.; Wu, Y.-T.; Dias, B.; Rajagopal, K. R.

    1988-01-01

    An advanced algorithm for simulating the probabilistic distribution of structural responses due to statistical uncertainties in loads, geometry, material properties, and boundary conditions is reported. The method effectively combines an advanced algorithm for calculating probability levels for multivariate problems (fast probability integration) together with a general-purpose finite-element code for stress, vibration, and buckling analysis. Application is made to a space propulsion system turbine blade for which the geometry and material properties are treated as random variables.
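
    Fast probability integration itself is beyond a short sketch, but the underlying task, propagating random loads, geometry, and material scatter to a response distribution, can be illustrated with plain Monte Carlo on a hypothetical cantilever stress formula. Every distribution and number below is an illustrative assumption, not the turbine-blade model of the abstract.

```python
import numpy as np

# Monte Carlo propagation of input scatter to a structural response: the root
# bending stress sigma = 6*F*L / (b*h^2) of a rectangular cantilever under a
# tip load. All means and standard deviations are invented for illustration.
rng = np.random.default_rng(42)
n = 100_000
F = rng.normal(1000.0, 100.0, n)     # tip load [N]
L = rng.normal(0.50, 0.005, n)       # beam length [m]
b = rng.normal(0.040, 0.0004, n)     # section width [m]
h = rng.normal(0.020, 0.0002, n)     # section height [m]

sigma = 6 * F * L / (b * h**2)       # root bending stress [Pa], one per sample
p_exceed = np.mean(sigma > 250e6)    # estimated probability of exceeding 250 MPa
```

    Plain sampling like this needs very many samples to resolve small failure probabilities, which is exactly the cost that fast probability integration is designed to avoid.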

  19. Analysis of information risk management methods

    OpenAIRE

    Zudin, Rodion

    2014-01-01

    Zudin, Rodion Analysis of information risk management methods Jyväskylä: University of Jyväskylä, 2014, 33 p. Information Systems, Bachelor’s Thesis Supervisor: Siponen, Mikko A brief overview in the information risk management field is done in this study by introducing the shared terminology and methodology of the field using literature overview in the first chapter. Second chapter consists of examining and comparing two information risk management methodologies propo...

  20. Safety relief valve alternate analysis method

    International Nuclear Information System (INIS)

    An experimental test program was started in the United States in 1976 to define and quantify Safety Relief Valve (SRV) phenomena in General Electric Mark I suppression chambers. The testing considered several discharge devices and was used to correlate SRV load prediction models. The program was funded by utilities with Mark I containments and resulted in a detailed SRV load definition as a portion of the Mark I containment program Load Definition Report (LDR). The US Nuclear Regulatory Commission (USNRC) has reviewed and approved the LDR SRV load definition. In addition, the USNRC has permitted calibration of structural models used for predicting torus response to SRV loads; model calibration is subject to confirmatory in-plant testing. The SRV methodology given in the LDR requires that transient dynamic pressures be applied to a torus structural model that includes a fluid added-mass matrix. Preliminary evaluations of torus response have indicated order-of-magnitude conservatisms with respect to test results, which could result in unrealistic containment modifications. In addition, structural response trends observed in full-scale tests between cold-pipe, first-valve-actuation and hot-pipe, subsequent-valve-actuation conditions have not been duplicated using current analysis methods. It was suggested by others that an energy approach using current fluid models be utilized to define loads. An alternate SRV analysis method is defined to correct suppression chamber structural response to a level that permits an economical but conservative design. Simple analogs are developed for the purpose of correcting the analytical response obtained from LDR analysis methods; the analogs evaluated considered forced-vibration and free-vibration structural response. The corrected response correlated well with in-plant test response, and the correlation of the analytical model at test conditions permits application of the alternate analysis method at design conditions. (orig./HP)

  1. Finite Volume Methods: Foundation and Analysis

    Science.gov (United States)

    Barth, Timothy; Ohlberger, Mario

    2003-01-01

    Finite volume methods are a class of discretization schemes that have proven highly successful in approximating the solution of a wide variety of conservation law systems. They are extensively used in fluid mechanics, porous media flow, meteorology, electromagnetics, models of biological processes, semi-conductor device simulation and many other engineering areas governed by conservative systems that can be written in integral control volume form. This article reviews elements of the foundation and analysis of modern finite volume methods. The primary advantages of these methods are numerical robustness through the obtention of discrete maximum (minimum) principles, applicability on very general unstructured meshes, and the intrinsic local conservation properties of the resulting schemes. Throughout this article, specific attention is given to scalar nonlinear hyperbolic conservation laws and the development of high order accurate schemes for discretizing them. A key tool in the design and analysis of finite volume schemes suitable for non-oscillatory discontinuity capturing is discrete maximum principle analysis. A number of building blocks used in the development of numerical schemes possessing local discrete maximum principles are reviewed in one and several space dimensions, e.g. monotone fluxes, E-fluxes, TVD discretization, non-oscillatory reconstruction, slope limiters, positive coefficient schemes, etc. When available, theoretical results concerning a priori and a posteriori error estimates are given. Further advanced topics are then considered such as high order time integration, discretization of diffusion terms and the extension to systems of nonlinear conservation laws.
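
    The local conservation property emphasized above can be sketched with a first-order upwind finite volume scheme for the 1D linear advection law u_t + a u_x = 0 on a periodic domain. The grid size, CFL number, and initial profile are illustrative choices.

```python
import numpy as np

# First-order upwind finite volume scheme for u_t + a*u_x = 0, a > 0,
# on the periodic unit interval. Each cell update is a flux difference,
# so total mass is conserved by construction (fluxes telescope).
a, nx, cfl = 1.0, 100, 0.8
dx = 1.0 / nx
dt = cfl * dx / a
x = (np.arange(nx) + 0.5) * dx                 # cell centers
u = np.exp(-100 * (x - 0.5) ** 2)              # initial cell averages
mass0 = u.sum() * dx

for _ in range(125):                           # advance to t = 125*dt = 1.0
    flux = a * u                               # upwind flux: F_{i+1/2} = a*u_i
    u = u - dt / dx * (flux - np.roll(flux, 1))

mass1 = u.sum() * dx                           # equals mass0 up to roundoff
```

    Because each update is the convex combination (1-cfl)*u_i + cfl*u_{i-1}, the scheme also satisfies a discrete maximum principle: no new extrema are created, at the price of first-order numerical diffusion.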

  2. Spectroscopic chemical analysis methods and apparatus

    Science.gov (United States)

    Hug, William F. (Inventor); Reid, Ray D. (Inventor); Bhartia, Rohit (Inventor)

    2013-01-01

    Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted along with photoluminescence spectroscopy (i.e. fluorescence and/or phosphorescence spectroscopy) to provide high levels of sensitivity and specificity in the same instrument.

  3. Computer modeling for neutron activation analysis methods

    International Nuclear Information System (INIS)

    The INP AS RU develops databases for neutron activation analysis - ND INAA [1] and ELEMENT [2]. Based on these databases, an automated complex is under construction, aimed at modeling methods for the analysis of natural and technogenic materials. It is well known that analysis objects vary widely in the spectra, composition and concentration of their elements, which makes it impossible to develop a universal method applicable to every analytical problem. The modeling is based on an algorithm that computes the reactor irradiation time needed to obtain the sample's total absorption and analytical peak areas within given errors. The analytical complex was tested for low-element analysis (determination of Fe and Zn in vegetation samples, and of Cu, Ag and Au in technological objects). At present, the complex is applied to the multielement analysis of sediment samples. This work applies modern achievements in analytical chemistry (measurement facilities, high-resolution detectors, IAEA and IUPAC databases) and information technology (Java software, database management systems (DBMS), internet technologies). Reference: 1. Tillaev T., Umaraliev A., Gurvich L.G., Yuldasheva K., Kadirova J. Specialized database for instrumental neutron activation analysis - ND INAA 1.0, The 3-rd Eurasian Conference Nuclear Science and its applications, 2004, pp.270-271.; 2. Gurvich L.G., Tillaev T., Umaraliev A. The Information-analytical database on the element contents of natural objects. The 4-th International Conference Modern problems of Nuclear Physics, Samarkand, 2003, p.337. (authors)

  4. Statistical trend analysis methods for temporal phenomena

    International Nuclear Information System (INIS)

    We consider point events occurring randomly in time. In many applications the pattern of occurrence is of intrinsic interest, as indicating a trend or some other systematic feature in the rate of occurrence. The purpose of this report is to survey briefly different statistical trend analysis methods and illustrate their applicability to temporal phenomena in particular. Trend testing of point events is usually seen as testing hypotheses concerning the intensity of the occurrence of events. When the intensity function is parametrized, the testing of trend is a typical parametric testing problem. In industrial applications the operational experience generally does not suggest any specified model or method in advance; therefore, and particularly if the Poisson process assumption is very questionable, it is desirable to apply tests that are valid for a wide variety of possible processes. The alternative approach to trend testing is to use a non-parametric procedure. In this report we present four non-parametric tests: the Cox-Stuart test, the Wilcoxon signed ranks test, the Mann test, and the exponential ordered scores test. In addition to the classical parametric and non-parametric approaches, we also consider Bayesian trend analysis. We first discuss a Bayesian model based on a power-law intensity model; the Bayesian statistical inferences are based on the analysis of the posterior distribution of the trend parameters, and the probability of trend is immediately seen from these distributions. We apply some of the discussed methods in an example case. It should be noted that this report is a feasibility study rather than a scientific evaluation of statistical methods, and the examples can only be seen as demonstrations of the methods
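
    Of the four non-parametric tests mentioned, the Cox-Stuart test is simple enough to sketch directly: pair the first half of the series with the second half and test whether the later values exceed the earlier ones more often than chance allows. The event-count series below is invented for illustration.

```python
from math import comb

def cox_stuart(series):
    """Cox-Stuart trend test: one-sided exact p-value for an increasing trend."""
    n = len(series)
    c = n // 2
    # Pair the first half with the second half (middle value dropped if n is odd)
    pairs = zip(series[:c], series[n - c:])
    signs = [b - a for a, b in pairs if b != a]      # ties are discarded
    m, plus = len(signs), sum(s > 0 for s in signs)
    # Under no trend, the sign of each difference is a fair coin flip:
    # p-value = P(X >= plus) for X ~ Binomial(m, 0.5)
    return sum(comb(m, i) for i in range(plus, m + 1)) / 2**m

# Illustrative yearly event counts with a rising trend
counts = [2, 3, 2, 4, 5, 4, 6, 7, 6, 8]
p_inc = cox_stuart(counts)   # all 5 paired differences are positive: p = 1/32
```

    Because the test only uses signs of paired differences, it is valid without any distributional assumption on the counts, which is exactly the robustness argued for above when the Poisson assumption is questionable.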

  5. Progress of MEMS Scanning Micromirrors for Optical Bio-Imaging

    Directory of Open Access Journals (Sweden)

    Lih Y. Lin

    2015-11-01

    Microelectromechanical systems (MEMS) have an unmatched ability to incorporate numerous functionalities into ultra-compact devices, and thanks to their versatility and miniaturization, MEMS have become an important cornerstone of biomedical and endoscopic imaging research. To incorporate MEMS into such applications, it is critical to understand the underlying architectures, including the choice of actuation mechanism; the more common electrothermal, electrostatic, electromagnetic, and piezoelectric approaches are reviewed in this paper. Each has benefits and tradeoffs and is better suited to particular applications or imaging schemes because of achievable scan range, power requirements, speed, and size. Many of these characteristics depend on the fabrication process, and this paper discusses various fabrication flows developed to integrate additional optical functionality beyond simple lateral scanning, enabling dynamic control of the focus or mirror surface. This flexibility of MEMS also brings challenges for obtaining high-resolution images: scanning non-linearities can make calibration of MEMS scanners critical, and inherent image artifacts or distortions during scanning can degrade image quality. Several reviewed methods and algorithms have been proposed to address these complications of MEMS scanning. Given their impact and promise, great effort and progress have been made toward integrating MEMS and biomedical imaging.

  6. Space Debris Reentry Analysis Methods and Tools

    Institute of Scientific and Technical Information of China (English)

    WU Ziniu; HU Ruifeng; QU Xi; WANG Xiang; WU Zhe

    2011-01-01

    The reentry of an uncontrolled spacecraft may break it into many pieces of debris at altitudes in the range of 75-85 km. The surviving fragments can pose great hazard and risk to people and property on the ground. In recent years, methods and tools for predicting and analyzing debris reentry and assessing ground risk have been studied and developed at the National Aeronautics and Space Administration (NASA), the European Space Agency (ESA) and other organizations, including the present authors' group. This paper briefly reviews the current progress on debris reentry. We outline the Monte Carlo method for uncertainty analysis, breakup prediction, and the parameters affecting the survivability of debris. The existing analysis tools can be classified into two categories, the object-oriented and the spacecraft-oriented methods, the latter being more accurate than the former. Past object-oriented tools include objects of only simple shapes. For more realistic simulation, we present here an object-oriented tool, the debris reentry and ablation prediction system (DRAPS) developed by the present authors, which extends the object shapes to 15 types, with 51 predefined motions and the relevant aerodynamic and aerothermal models. The aerodynamic and aerothermal models in DRAPS are validated using the direct simulation Monte Carlo (DSMC) method.

  7. Review of Computational Stirling Analysis Methods

    Science.gov (United States)

    Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.

    2004-01-01

    Nuclear thermal-to-electric power conversion carries the promise of longer-duration missions and higher scientific data transmission rates back to Earth for both Mars rovers and deep space missions. A free-piston Stirling convertor is a candidate technology that is considered an efficient and reliable power conversion device for such purposes. While already very efficient, it is believed that better Stirling engines could be developed if the losses inherent in current designs were better understood. However, these engines are difficult to instrument, and so efforts are underway to simulate a complete Stirling engine numerically. This has only recently been attempted, and a review of the methods leading up to and including such computational analysis is presented. Finally, it is proposed that the quality and depth of the understanding of Stirling losses may be improved by utilizing the higher fidelity and efficiency of recently developed numerical methods. One such method, the Ultra HI-Fl technique, is presented in detail.

  8. Optical methods for the analysis of dermatopharmacokinetics

    Science.gov (United States)

    Lademann, Juergen; Weigmann, Hans-Juergen; von Pelchrzim, R.; Sterry, Wolfram

    2002-07-01

    The method of tape stripping in combination with spectroscopic measurements is a simple and noninvasive method for the analysis of the dermatopharmacokinetics of cosmetic products and topically applied drugs. The absorbance at 430 nm was used to characterize the amount of corneocytes on the tape strips and was compared with the increase in the weight of the tapes after removing them from the skin surface. The penetration profiles of two UV filter substances used in sunscreens were determined. The combined method of tape stripping and spectroscopic measurement can also be used to investigate the dermatopharmacokinetics of topically applied drugs passing through the skin. Differences in the penetration profiles of the steroid compound clobetasol, applied at the same concentration in different formulations on the skin, are presented.

  9. Analysis and estimation of risk management methods

    Directory of Open Access Journals (Sweden)

    Kankhva Vadim Sergeevich

    2016-05-01

    At the present time risk management is an integral part of state policy in all countries with a developed market economy. Companies dealing with consulting services and the implementation of risk management systems have carved out a niche. Unfortunately, conscious preventive risk management in Russia is still far from being a standardized part of a construction company's activity, which often leads to scandals and disapproval when projects are implemented unprofessionally. The authors present the results of an investigation of the modern understanding of existing methodology classifications and offer their own classification matrix of risk management methods. The developed matrix is based on an analysis of each method in terms of its incoming and outgoing transformed information, which may include different elements of the risk control stages; the offered approach thus allows the possibilities of each method to be analyzed.

  10. Data Analysis Methods for Library Marketing

    Science.gov (United States)

    Minami, Toshiro; Kim, Eunja

    Our society is rapidly changing into an information society, in which people's needs and requests for information access differ widely from person to person. A library's mission is to provide its users, or patrons, with the most appropriate information. Libraries have to know the profiles of their patrons in order to fulfill such a role. The aim of library marketing is to develop methods based on library data, such as circulation records, book catalogs, book-usage data, and others. In this paper we first discuss the methodology and importance of library marketing. Then we demonstrate its usefulness through some examples of analysis methods applied to the circulation records of Kyushu University and Guacheon Library, and some implications obtained as the results of these methods. Our research is a first step toward a future in which library marketing is an indispensable tool.

  11. Influence of gold nanoparticle architecture on in vitro bioimaging and cellular uptake

    International Nuclear Information System (INIS)

    Gold nanoparticles (GNPs) are favorable nanostructures for several biological applications due to their easy synthesis and biocompatible properties. Commonly studied GNP shapes are the nanosphere (AuNS), nanorod (AuNR), and nanocage (AuNC). In addition to distinct geometries and structural symmetries, these shapes have different photophysical properties detected by surface plasmon resonances. Therefore, choosing the best-shaped GNP for a specific purpose is crucial to the success of the application. In this study, all three shapes of GNP were investigated for their capacity to interact with cell surface receptors. Anti-HER2 antibody was conjugated to the surface of the nanoparticles. MCF-7 breast adenocarcinoma and hMSC human mesenchymal cell lines were treated with the GNPs and analyzed for cellular uptake and bioimaging efficiency using UV–vis spectroscopy and dark-field microscopy

  12. Multiple-photon spectrum of CdS semiconductor quantum dot for bioimaging

    International Nuclear Information System (INIS)

    We study the dynamic processes of multiple-photon absorption and emission in a semiconductor quantum dot. Using the non-perturbative time-dependent Schroedinger equation, it is shown that electrons in the quantum dot can be optically excited from the valence band to the conduction band via multiphoton processes, leaving holes in the valence band. The radiative recombination of the conduction-band electrons with the valence-band holes results in the emission of a single photon whose energy is larger than that of the input photons. This high-photon-energy luminescence allows a quantum dot activated by low-energy photons to emit radiation in the visible optical regime for bioimaging applications

  13. Recent conjugation strategies of small organic fluorophores and ligands for cancer-specific bioimaging.

    Science.gov (United States)

    Ha, Yonghwang; Choi, Hyun-Kyung

    2016-03-25

    Conjugation between various small fluorophores and specific ligands has become one of the main strategies for bioimaging in disease diagnosis, medicinal chemistry, immunology, and fluorescence-guided surgery, etc. Herein, we present our review of recent studies relating to molecular fluorescent imaging techniques for various cancers in cell-based and animal-based models. Various organic fluorophores, especially near-infrared (NIR) probes, have been employed with specific ligands. Types of ligands used were small molecules, peptides, antibodies, and aptamers; each has specific affinities for cellular receptor proteins, cancer-specific antigens, enzymes, and nucleic acids. This review can aid in the selection of cancer-specific ligands and fluorophores, and may inspire the further development of new conjugation strategies in various cellular and animal models. PMID:26892219

  14. Method and apparatus for simultaneous spectroelectrochemical analysis

    Science.gov (United States)

    Chatterjee, Sayandev; Bryan, Samuel A; Schroll, Cynthia A; Heineman, William R

    2013-11-19

    An apparatus and method of simultaneous spectroelectrochemical analysis is disclosed. A transparent surface is provided. An analyte solution on the transparent surface is contacted with a working electrode and at least one other electrode. Light from a light source is focused on either a surface of the working electrode or the analyte solution. The light reflected from either the surface of the working electrode or the analyte solution is detected. The potential of the working electrode is adjusted, and spectroscopic changes of the analyte solution that occur with changes in thermodynamic potentials are monitored.

  15. Methods development for criticality safety analysis

    International Nuclear Information System (INIS)

    A status review of the work at Oak Ridge to develop improved methods for performing multigroup, discrete-ordinates, and Monte Carlo criticality safety analyses is presented. In the area of multigroup cross-section preparation this work entails the testing of ENDF/B-IV based and other cross-section libraries in the SCALE system, the development of improved cross-section processing methods for the AMPX system, and the generation of an ENDF/B-V based library. In the area of systems analysis this work entails improvements to the one-dimensional discrete-ordinates code XSDRNPM-S, the testing of the combinatorial geometry version of KENO, KENO-IV/CG, and development of an advanced version of KENO, KENO-V. Also presented is a brief review of the existing criticality safety analytical sequences in the SCALE system, CSAS1 and CSAS2, and the development of the advanced analytical sequences CSAS3 and CSAS4

  16. Selective spectroscopic methods for water analysis

    Energy Technology Data Exchange (ETDEWEB)

    Vaidya, B.

    1997-06-24

    This dissertation explores in large part the development of a few types of spectroscopic methods in the analysis of water. Methods for the determination of some of the most important properties of water like pH, metal ion content, and chemical oxygen demand are investigated in detail. This report contains a general introduction to the subject and the conclusions. Four chapters and an appendix have been processed separately. They are: chromogenic and fluorogenic crown ether compounds for the selective extraction and determination of Hg(II); selective determination of cadmium in water using a chromogenic crown ether in a mixed micellar solution; reduction of chloride interference in chemical oxygen demand determination without using mercury salts; structural orientation patterns for a series of anthraquinone sulfonates adsorbed at an aminophenol thiolate monolayer chemisorbed at gold; and the role of chemically modified surfaces in the construction of miniaturized analytical instrumentation.

  17. Coloured Petri Nets: Basic Concepts, Analysis Methods and Practical Use. Vol. 2, Analysis Methods

    DEFF Research Database (Denmark)

    Jensen, Kurt

    This three-volume work presents a coherent description of the theoretical and practical aspects of coloured Petri nets (CP-nets). The second volume contains a detailed presentation of the analysis methods for CP-nets. They allow the modeller to investigate dynamic properties of CP-nets. The main ideas behind the analysis methods are described, as well as the mathematics on which they are based and how the methods are supported by computer tools. Some parts of the volume are theoretical while others are application oriented. The purpose of the volume is to teach the reader how to use the formal analysis methods, which does not require a deep understanding of the underlying mathematical theory.

  18. A comparison of analytical methods for detection of [14C]trichloroacetic acid-derived radioactivity in needles and branches of spruce (Picea sp.)

    International Nuclear Information System (INIS)

    The branches (wood and needles) of spruces of varying age treated with [14C]trichloroacetic acid (3.7 GBq/mmol) were studied, using the following methods: Qualitative: - Conventional macroautoradiography with X-ray film and histological classification. Quantitative: - 14C combustion analysis with the sample oxidizer A 307 (Canberra/Packard) followed by measurement of radioactivity using the LS counter 6000 (Beckman Instruments); - digital autoradiography with the Digital Autoradiograph LB 286 (Berthold GmbH); - digital autoradiography with the Bio-imaging Analyzer BAS 2000 (Fuji Film Co.). (orig.)

  19. Advances in quantitative electroencephalogram analysis methods.

    Science.gov (United States)

    Thakor, Nitish V; Tong, Shanbao

    2004-01-01

    Quantitative electroencephalogram (qEEG) plays a significant role in EEG-based clinical diagnosis and studies of brain function. In past decades, various qEEG methods have been extensively studied. This article provides a detailed review of the advances in this field. qEEG methods are generally classified into linear and nonlinear approaches. The traditional qEEG approach is based on spectrum analysis, which hypothesizes that the EEG is a stationary process. EEG signals are nonstationary and nonlinear, especially in some pathological conditions. Various time-frequency representations and time-dependent measures have been proposed to address those transient and irregular events in EEG. With regard to the nonlinearity of EEG, higher order statistics and chaotic measures have been put forward. In characterizing the interactions across the cerebral cortex, an information theory-based measure such as mutual information is applied. To improve the spatial resolution, qEEG analysis has also been combined with medical imaging technology (e.g., CT, MR, and PET). With these advances, qEEG plays a very important role in basic research and clinical studies of brain injury, neurological disorders, epilepsy, sleep studies and consciousness, and brain function. PMID:15255777
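    The traditional spectral approach mentioned in this abstract can be illustrated with a minimal sketch (hypothetical, not taken from the reviewed article): computing the band power of a synthetic EEG-like signal with NumPy's FFT. All function names and parameter values here are illustrative.

```python
import numpy as np

def band_power(signal, fs, f_lo, f_hi):
    """Power of `signal` within [f_lo, f_hi] Hz, from the periodogram."""
    n = len(signal)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / (fs * n)
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    return np.trapz(psd[mask], freqs[mask])

fs = 250.0                      # sampling rate in Hz
t = np.arange(0, 10, 1.0 / fs)  # 10 s of data
# synthetic "EEG": a 10 Hz alpha rhythm plus weak noise
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)

alpha = band_power(x, fs, 8, 13)    # alpha band
beta = band_power(x, fs, 13, 30)    # beta band
print(alpha > beta)                 # True: the 10 Hz component dominates
```

    Note that this stationary-spectrum view is exactly what the time-frequency and nonlinear measures surveyed above are designed to go beyond.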

  20. Economic analysis of alternative LLW disposal methods

    International Nuclear Information System (INIS)

    The Environmental Protection Agency (EPA) has evaluated the costs and benefits of alternative disposal technologies as part of its program to develop generally applicable environmental standards for the land disposal of low-level radioactive waste (LLW). Costs, population health effects and Critical Population Group (CPG) exposures resulting from alternative waste treatment and disposal methods were evaluated both in absolute terms and also relative to a base case (current practice). Incremental costs of the standard included costs for packaging, processing, transportation, and burial of waste. Benefits are defined in terms of reductions in the general population health risk (expected fatal cancers and genetic effects) evaluated over 10,000 years. A cost-effectiveness ratio, defined as the incremental cost per avoided health effect, was calculated for each alternative standard. The cost-effectiveness analysis took into account a number of waste streams, hydrogeologic and climatic region settings, and waste treatment and disposal methods. This paper describes the alternatives considered and preliminary results of the cost-effectiveness analysis. 15 references, 7 figures, 3 tables

  1. 3D analysis methods - Study and seminar

    International Nuclear Information System (INIS)

    The first part of the report results from a study that was performed as a Nordic co-operation activity with active participation from Studsvik Scandpower and Westinghouse Atom in Sweden, and VTT in Finland. The purpose of the study was to identify and investigate the effects arising from using the 3D transient computer codes in BWR safety analysis, and their influence on the transient analysis methodology. One of the main questions involves the critical power ratio (CPR) calculation methodology. The present way, where the CPR calculation is performed with a separate hot channel calculation, can be artificially conservative. In the investigated cases, no dramatic minimum CPR effect coming from the 3D calculation is apparent. Some cases show some decrease in the transient change of minimum CPR with the 3D calculation, which confirms the general thinking that the 1D calculation is conservative. On the other hand, the observed effect on neutron flux behaviour is quite large. In a slower transient the 3D effect might be stronger. The second part of the report is a summary of a related seminar that was held on the 3D analysis methods. The seminar was sponsored by the Reactor Safety part (NKS-R) of the Nordic Nuclear Safety Research Programme (NKS). (au)

  2. Comparison study of microarray meta-analysis methods

    OpenAIRE

    Yang Yee; Campain Anna

    2010-01-01

    Abstract Background Meta-analysis methods exist for combining multiple microarray datasets. However, there are a wide range of issues associated with microarray meta-analysis and a limited ability to compare the performance of different meta-analysis methods. Results We compare eight meta-analysis methods, five existing methods, two naive methods and a novel approach (mDEDS). Comparisons are performed using simulated data and two biological case studies with varying degrees of meta-analysis c...

  3. Chapter 11. Community analysis-based methods

    Energy Technology Data Exchange (ETDEWEB)

    Cao, Y.; Wu, C.H.; Andersen, G.L.; Holden, P.A.

    2010-05-01

    Microbial communities are each a composite of populations whose presence and relative abundance in water or other environmental samples are a direct manifestation of environmental conditions, including the introduction of microbe-rich fecal material and factors promoting persistence of the microbes therein. As shown by culture-independent methods, different animal-host fecal microbial communities appear distinctive, suggesting that their community profiles can be used to differentiate fecal samples and to potentially reveal the presence of host fecal material in environmental waters. Cross-comparisons of microbial communities from different hosts also reveal relative abundances of genetic groups that can be used to distinguish sources. In increasing order of their information richness, several community analysis methods hold promise for MST applications: phospholipid fatty acid (PLFA) analysis, denaturing gradient gel electrophoresis (DGGE), terminal restriction fragment length polymorphism (TRFLP), cloning/sequencing, and PhyloChip. Specific case studies involving TRFLP and PhyloChip approaches demonstrate the ability of community-based analyses of contaminated waters to confirm a diagnosis of water quality based on host-specific marker(s). The success of community-based MST for comprehensively confirming fecal sources relies extensively upon using appropriate multivariate statistical approaches. While community-based MST is still under evaluation and development as a primary diagnostic tool, results presented herein demonstrate its promise. Coupled with its inherently comprehensive ability to capture an unprecedented amount of microbiological data that is relevant to water quality, the tools for microbial community analysis are increasingly accessible, and community-based approaches have unparalleled potential for translation into rapid, perhaps real-time, monitoring platforms.

  4. Generalized Analysis of a Distribution Separation Method

    Directory of Open Access Journals (Sweden)

    Peng Zhang

    2016-04-01

    Full Text Available Separating two probability distributions from a mixture model that is made up of the combinations of the two is essential to a wide range of applications. For example, in information retrieval (IR, there often exists a mixture distribution consisting of a relevance distribution that we need to estimate and an irrelevance distribution that we hope to get rid of. Recently, a distribution separation method (DSM was proposed to approximate the relevance distribution, by separating a seed irrelevance distribution from the mixture distribution. It was successfully applied to an IR task, namely pseudo-relevance feedback (PRF, where the query expansion model is often a mixture term distribution. Although initially developed in the context of IR, DSM is indeed a general mathematical formulation for probability distribution separation. Thus, it is important to further generalize its basic analysis and to explore its connections to other related methods. In this article, we first extend DSM’s theoretical analysis, which was originally based on the Pearson correlation coefficient, to entropy-related measures, including the KL-divergence (Kullback–Leibler divergence, the symmetrized KL-divergence and the JS-divergence (Jensen–Shannon divergence. Second, we investigate the distribution separation idea in a well-known method, namely the mixture model feedback (MMF approach. We prove that MMF also complies with the linear combination assumption, and then, DSM’s linear separation algorithm can largely simplify the EM algorithm in MMF. These theoretical analyses, as well as further empirical evaluation results demonstrate the advantages of our DSM approach.
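    The linear combination assumption that underlies DSM can be sketched numerically. Assuming the mixture is m = lam * s + (1 - lam) * r with a known mixing weight lam (a simplification of the paper's setting; the function name, variable names, and toy data below are illustrative, not the authors' notation):

```python
import numpy as np

def separate(mixture, seed, lam):
    """Recover the 'relevance' distribution r from the linear mixture
    m = lam * s + (1 - lam) * r, given the seed irrelevance distribution s.
    Negative entries (from noise or a misestimated lam) are clipped to zero,
    then the result is renormalized to sum to 1."""
    r = (mixture - lam * seed) / (1.0 - lam)
    r = np.clip(r, 0.0, None)
    return r / r.sum()

# toy term distributions over a 4-word vocabulary
r_true = np.array([0.70, 0.20, 0.05, 0.05])  # relevance distribution
s = np.array([0.10, 0.10, 0.40, 0.40])       # seed irrelevance distribution
lam = 0.3
m = lam * s + (1 - lam) * r_true             # observed mixture

r_est = separate(m, s, lam)
print(np.allclose(r_est, r_true))            # True: exact recovery
```

    In practice lam is unknown and must be estimated, which is where the correlation- and divergence-based analyses discussed in the abstract come in.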

  5. An analysis of the practical DPG method

    CERN Document Server

    Gopalakrishnan, Jay

    2011-01-01

    In this work we give a complete error analysis of the Discontinuous Petrov Galerkin (DPG) method, accounting for all the approximations made in its practical implementation. Specifically, we consider the DPG method that uses a trial space consisting of polynomials of degree $p$ on each mesh element. Earlier works showed that there is a "trial-to-test" operator $T$, which when applied to the trial space, defines a test space that guarantees stability. In DPG formulations, this operator $T$ is local: it can be applied element-by-element. However, an infinite dimensional problem on each mesh element needed to be solved to apply $T$. In practical computations, $T$ is approximated using polynomials of some degree $r > p$ on each mesh element. We show that this approximation maintains optimal convergence rates, provided that $r \ge p+N$, where $N$ is the space dimension (two or more), for the Laplace equation. We also prove a similar result for the DPG method for linear elasticity. Remarks on the conditioning of the...

  6. Mesoporous silica nanoparticles with organo-bridged silsesquioxane framework as innovative platforms for bioimaging and therapeutic agent delivery.

    Science.gov (United States)

    Du, Xin; Li, Xiaoyu; Xiong, Lin; Zhang, Xueji; Kleitz, Freddy; Qiao, Shi Zhang

    2016-06-01

    Mesoporous silica with organo-bridged silsesquioxane frameworks synergistically combines inorganic silica, mesopores, and organic groups, resulting in novel or enhanced physicochemical and biocompatibility properties compared with conventional mesoporous silica materials of pure Si-O composition. With the rapid development of nanotechnology, monodispersed nanoscale periodic mesoporous organosilica nanoparticles (PMO NPs) and organo-bridged mesoporous silica nanoparticles (MSNs) with various organic groups and structures have recently been synthesized from 100% bridged, or partially bridged, organosilica precursors, respectively. Since then, these materials have been employed as carrier platforms to construct bioimaging and/or therapeutic agent delivery nanosystems for nano-biomedical applications, where they demonstrate unique and/or enhanced properties and performance. This review article provides a comprehensive overview of the controlled synthesis of PMO NPs and organo-bridged MSNs, their physicochemical and biocompatibility properties, and their nano-biomedical application as bioimaging agents and/or therapeutic agent delivery systems. PMID:27017579

  7. Optimizing the synthesis of CdS/ZnS core/shell semiconductor nanocrystals for bioimaging applications.

    Science.gov (United States)

    Liu, Li-Wei; Hu, Si-Yi; Pan, Ying; Zhang, Jia-Qi; Feng, Yue-Shu; Zhang, Xi-He

    2014-01-01

    In this study, we report on CdS/ZnS nanocrystals as a luminescence probe for bioimaging applications. CdS nanocrystals capped with a ZnS shell had enhanced luminescence intensity, stronger stability, and a longer lifetime compared to uncapped CdS. The CdS/ZnS nanocrystals were stabilized in Pluronic F127 block copolymer micelles, offering optically and colloidally stable contrast agents for in vitro and in vivo imaging. Photostability tests showed that the ZnS protective shell not only enhances the brightness of the QDs but also improves their stability in a biological environment. An in vivo imaging study showed that F127-CdS/ZnS micelles had strong luminescence. These results suggest that these nanoparticles have significant advantages for bioimaging applications and may offer a new direction for the early detection of cancer in humans. PMID:24991530

  8. Optimizing the synthesis of CdS/ZnS core/shell semiconductor nanocrystals for bioimaging applications

    Directory of Open Access Journals (Sweden)

    Li-wei Liu

    2014-06-01

    Full Text Available In this study, we report on CdS/ZnS nanocrystals as a luminescence probe for bioimaging applications. CdS nanocrystals capped with a ZnS shell had enhanced luminescence intensity, stronger stability, and a longer lifetime compared to uncapped CdS. The CdS/ZnS nanocrystals were stabilized in Pluronic F127 block copolymer micelles, offering optically and colloidally stable contrast agents for in vitro and in vivo imaging. Photostability tests showed that the ZnS protective shell not only enhances the brightness of the QDs but also improves their stability in a biological environment. An in vivo imaging study showed that F127-CdS/ZnS micelles had strong luminescence. These results suggest that these nanoparticles have significant advantages for bioimaging applications and may offer a new direction for the early detection of cancer in humans.

  9. Radioisotope method of compound flow analysis

    Directory of Open Access Journals (Sweden)

    Petryka Leszek

    2015-01-01

    Full Text Available The paper presents the application of gamma radiation to the analysis of a multicomponent or multiphase flow. Information such as the content of a selected component in a mixture transported through a pipe is crucial in many industrial and laboratory installations. A properly selected sealed radioactive source and collimators deliver a photon beam that penetrates the cross section of the flow. Detectors mounted on the side of the pipe opposite the source record digital signals representing the composition of the stream. Given the present development of electronics, detectors and computer software, significant progress in the know-how of this field may be observed. The paper describes the application of this method to the optimization and control of the hydrotransport of solid particles and proposes monitoring that helps prevent pipe clogging or dangerous oscillations.

  10. Digital methods for mediated discourse analysis

    DEFF Research Database (Denmark)

    Kjær, Malene; Larsen, Malene Charlotte

    2015-01-01

    In this paper we discuss methodological strategies for collecting multimodal data using digital resources. The aim is to show how digital resources can provide ethnographic insights into mediated actions (Scollon, 2002) that can otherwise be difficult to observe or engage in, due to, for instance, restrictions or privately mediated settings. Having used mediated discourse analysis (Scollon, 2002; Scollon & Scollon, 2004) as a framework in two different research projects, we show how the framework, in correlation with digital resources for data gathering, provides new understandings of 1) the daily practice of health care professionals (Author 1, 2014) and 2) young people’s identity construction on social media platforms (Author 2, 2010, 2015, in press). The paper’s contribution is a methodological discussion on digital data collection using methods such as online interviewing (via e-mail or chat)...

  11. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

    Full Text Available Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations, COSO, 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that although the introduction of risk quantification to enhance the degree of objectivity in finance, for instance, developed in parallel with its development in the manufacturing industry, the same is not true in Higher Education Institutions (HEIs). In this regard, the objective of the paper was to demonstrate the methods and process of Quantitative Risk Analysis (QRA) through the likelihood of occurrence of risk (phase I). This paper is the first of a two-phased study, which sampled one hundred (100) risk analysts at a university in the greater Eastern Cape Province of South Africa. The analysis of the likelihood of occurrence of risk by logistic regression and percentages was conducted to investigate whether or not there was a significant difference between groups (analysts) in respect of QRA. The Hosmer and Lemeshow test was non-significant with a chi-square (X² = 8.181; p = 0.300), which indicated a good model fit, since the data did not deviate significantly from the model. The study concluded that, to derive an overall likelihood rating indicating the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat source motivation and capability, (2) nature of the vulnerability, and (3) existence and effectiveness of current controls (methods and process).

  12. Gap analysis: Concepts, methods, and recent results

    Science.gov (United States)

    Jennings, M.D.

    2000-01-01

    Rapid progress is being made in the conceptual, technical, and organizational requirements for generating synoptic multi-scale views of the earth's surface and its biological content. Using the spatially comprehensive data that are now available, researchers, land managers, and land-use planners can, for the first time, quantitatively place landscape units (from general categories such as 'Forests' or 'Cold-Deciduous Shrubland Formation' to more specific categories such as 'Picea glauca-Abies balsamea-Populus spp. Forest Alliance') in their large-area contexts. The National Gap Analysis Program (GAP) has developed the technical and organizational capabilities necessary for the regular production and analysis of such information. This paper provides a brief overview of concepts and methods as well as some recent results from the GAP projects. Clearly, new frameworks for biogeographic information and organizational cooperation are needed if we are to have any hope of documenting the full range of species occurrences and ecological processes in ways meaningful to their management. The GAP experience provides one model for achieving these new frameworks.

  13. CARBON SEQUESTRATION: A METHODS COMPARATIVE ANALYSIS

    International Nuclear Information System (INIS)

    All human activities are related to energy consumption. Energy requirements will continue to rise, due to modern lifestyles and the growth of developing countries. Most of the energy demand is met from fossil fuels. Fossil fuel combustion has negative environmental impacts, with CO2 production being dominant. Fulfilling the Kyoto protocol criteria requires the minimization of CO2 emissions, so the management of CO2 emissions is an urgent matter. The use of appliances with low energy consumption and the adoption of an energy policy that prevents unnecessary energy use can lead to the reduction of carbon emissions. A different route is the introduction of 'clean' energy sources, such as renewable energy sources. Last but not least, the development of carbon sequestration methods can be a promising technique with great future potential. The objective of this work is the analysis and comparison of different carbon sequestration and deposit methods. Ocean deposit, land-ecosystem deposit, geological-formation deposit, and radical biological and chemical approaches will be analyzed

  14. Missing data treatment method on cluster analysis

    OpenAIRE

    Elsiddig Elsadig Mohamed Koko; Amin Ibrahim Adam Mohamed

    2015-01-01

    Missing data in a household health survey challenged the researcher because they led to incomplete analysis. Cluster analysis was applied to the data collected in Sudan's 2006 household health survey. The current research focuses specifically on the data analysis, as the objective is to deal with missing values in cluster analysis. Two-Step Cluster Analysis is applied, in which each participant is classified into one of the identified patterns and the opt...

  15. Comprehensive cosmographic analysis by Markov chain method

    International Nuclear Information System (INIS)

    We study the possibility of extracting model independent information about the dynamics of the Universe by using cosmography. We intend to explore it systematically, to learn about its limitations and its real possibilities. Here we are sticking to the series expansion approach on which cosmography is based. We apply it to different data sets: Supernovae type Ia (SNeIa), Hubble parameter extracted from differential galaxy ages, gamma ray bursts, and the baryon acoustic oscillations data. We go beyond past results in the literature extending the series expansion up to the fourth order in the scale factor, which implies the analysis of the deceleration q0, the jerk j0, and the snap s0. We use the Markov chain Monte Carlo method (MCMC) to analyze the data statistically. We also try to relate direct results from cosmography to dark energy (DE) dynamical models parametrized by the Chevallier-Polarski-Linder model, extracting clues about the matter content and the dark energy parameters. The main results are: (a) even if relying on a mathematical approximate assumption such as the scale factor series expansion in terms of time, cosmography can be extremely useful in assessing dynamical properties of the Universe; (b) the deceleration parameter clearly confirms the present acceleration phase; (c) the MCMC method can help giving narrower constraints in parameter estimation, in particular for higher order cosmographic parameters (the jerk and the snap), with respect to the literature; and (d) both the estimation of the jerk and the DE parameters reflect the possibility of a deviation from the ΛCDM cosmological model.
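    As a minimal illustration of the MCMC machinery mentioned above (not the authors' cosmographic pipeline), the sketch below runs a random-walk Metropolis sampler on a toy one-parameter Gaussian likelihood; the data values and names are invented for illustration.

```python
import math
import random

def metropolis(log_post, x0, step, n, seed=42):
    """Minimal random-walk Metropolis sampler: Gaussian proposals,
    accepted with probability min(1, post(x') / post(x))."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    chain = []
    for _ in range(n):
        xp = x + rng.gauss(0.0, step)
        lpp = log_post(xp)
        if math.log(rng.random()) < lpp - lp:   # accept/reject step
            x, lp = xp, lpp
        chain.append(x)
    return chain

# toy "data": noisy measurements of a single parameter q
data = [-0.6, -0.5, -0.55, -0.45, -0.52]
sigma = 0.05
log_post = lambda q: -sum((d - q) ** 2 for d in data) / (2 * sigma ** 2)

chain = metropolis(log_post, x0=0.0, step=0.05, n=20000)
burned = chain[5000:]                   # discard burn-in
q_hat = sum(burned) / len(burned)
print(q_hat)                            # close to the data mean, about -0.52
```

    The posterior width of the chain, not just its mean, is what yields the parameter constraints reported in analyses of this kind.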

  16. Instrumental neutron activation analysis - a routine method

    International Nuclear Information System (INIS)

    This thesis describes the way in which at IRI instrumental neutron activation analysis (INAA) has been developed into an automated system for routine analysis. The basis of this work are 20 publications describing the development of INAA since 1968. (Auth.)

  17. Nonlinear Factor Analysis as a Statistical Method

    OpenAIRE

    Yalcin, Ilker; Amemiya, Yasuo

    2001-01-01

    Factor analysis and its extensions are widely used in the social and behavioral sciences, and can be considered useful tools for exploration and model fitting in multivariate analysis. Despite its popularity in applications, factor analysis has attracted rather limited attention from statisticians. Three issues (identification ambiguity, heavy reliance on normality, and limitation to linearity) may have contributed to statisticians' lack of interest in factor analysis. In th...

  18. Critical Security Methods : New Frameworks for Analysis

    NARCIS (Netherlands)

    Voelkner, Nadine; Huysmans, Jef; Claudia, Aradau; Neal, Andrew

    2015-01-01

    Critical Security Methods offers a new approach to research methods in critical security studies. It argues that methods are not simply tools to bridge the gap between security theory and security practice. Rather, to practise methods critically means engaging in a more free and experimental interpl

  19. System and method for making quantum dots

    KAUST Repository

    Bakr, Osman M.

    2015-05-28

    Embodiments of the present disclosure provide for methods of making quantum dots (QDs) (passivated or unpassivated) using a continuous flow process, systems for making QDs using a continuous flow process, and the like. In one or more embodiments, the QDs produced using embodiments of the present disclosure can be used in solar photovoltaic cells, bio-imaging, IR emitters, or LEDs.

  20. K-method of cognitive mapping analysis

    OpenAIRE

    Snarskii, A. A.; Zorinets, D. I.; Lande, D. V.; Levchenko, A. V.

    2016-01-01

    We introduce a new calculation method (K-method) for cognitive maps. The K-method consists of two consecutive stages. In the first stage, a subgraph composed of all paths from one selected node (concept) to another node (concept) is extracted from the cognitive map (a directed weighted graph). In the second stage, after the transition to an undirected graph (symmetrization of the adjacency matrix), the influence of one node on another is calculated with the Kirchhoff method. In the proposed method, there is no problem...
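
    Reading the "Kirchhoff method" of the second stage as the effective electrical conductance between the two concepts (an assumed interpretation), the two stages can be sketched as follows; the toy graph, the averaging form of the symmetrization, and all function names are hypothetical:

```python
import numpy as np

def simple_paths(W, s, t, path=None):
    # Enumerate all simple (loop-free) paths from s to t in a directed graph
    # given by weighted adjacency matrix W (W[i, j] != 0 means edge i -> j).
    path = [s] if path is None else path
    if s == t:
        yield list(path)
        return
    for v in np.nonzero(W[s])[0]:
        if v not in path:
            yield from simple_paths(W, v, t, path + [v])

def k_method_influence(W, s, t):
    # Stage 1: keep only edges lying on some path from s to t.
    sub = np.zeros_like(W, dtype=float)
    for p in simple_paths(W, s, t):
        for a, b in zip(p, p[1:]):
            sub[a, b] = W[a, b]
    # Stage 2: symmetrize the adjacency matrix, treat weights as
    # conductances, and compute the effective (Kirchhoff) conductance
    # between s and t via the Laplacian pseudoinverse.
    A = (sub + sub.T) / 2.0
    L = np.diag(A.sum(axis=1)) - A
    Lp = np.linalg.pinv(L)
    r_eff = Lp[s, s] + Lp[t, t] - 2.0 * Lp[s, t]
    return 1.0 / r_eff

# Toy cognitive map: 0 -> 1 -> 3 and 0 -> 2 -> 3, plus one edge off-path.
W = np.zeros((5, 5))
W[0, 1] = W[1, 3] = 1.0
W[0, 2] = W[2, 3] = 1.0
W[1, 4] = 1.0  # not on any 0 -> 3 path; dropped in stage 1
print(k_method_influence(W, 0, 3))
```

The two parallel branches reinforce each other: the combined influence exceeds what either chain alone would give, which is the qualitative behavior one wants from a path-based influence measure.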

  1. Advanced Software Methods for Physics Analysis

    International Nuclear Information System (INIS)

    Unprecedented data analysis complexity is experienced in modern High Energy Physics experiments. The complexity arises from the growing size of recorded data samples, the large number of data analyses performed by different users in each single experiment, and the level of complexity of each single analysis. For this reason, the requirements on software for data analysis impose a very high level of reliability. We present two concrete examples: the former from BaBar experience with the migration to a new Analysis Model with the definition of a new model for the Event Data Store, the latter about a toolkit for multivariate statistical and parametric Monte Carlo analysis developed using generic programming

  2. Biosurfactant templated quantum sized fluorescent gold nanoclusters for in vivo bioimaging in zebrafish embryos.

    Science.gov (United States)

    S, Chandirasekar; C, Chandrasekaran; T, Muthukumarasamyvel; G, Sudhandiran; N, Rajendiran

    2016-07-01

    We report biosurfactant (sodium cholate) templated bright bluish-green emitting gold nanoclusters (AuNCs) prepared by a green chemical approach. Optical properties of the AuNCs were studied using UV-vis and luminescence spectroscopy. The lifetime of the fluorescent AuNCs was measured using the time-correlated single photon counting (TCSPC) technique. High-resolution transmission electron microscopy (HR-TEM) and dynamic light scattering (DLS) were used to measure the sizes of the clusters. In vivo toxicity and bioimaging studies of the sodium cholate (NaC) templated AuNCs were carried out at different developmental stages of zebrafish embryos. The survival rate, hatching rate, heart rate, malformation and apoptotic gene expression experiments show no significant toxicity in developing embryos up to an AuNC concentration of 100 μL/mL, and the AuNC-stained embryos exhibited intense green fluorescence over the period from 4 to 96 hpf (hours post fertilization), which shows that the AuNCs were stable in living organisms. PMID:27037785

  3. Cyanine-based 1-amino-1-deoxyglucose as fluorescent probes for glucose transporter mediated bioimaging.

    Science.gov (United States)

    Xu, Hu; Liu, Xinyu; Yang, Jinna; Liu, Ran; Li, Taoli; Shi, Yunli; Zhao, Hongxia; Gao, Qingzhi

    2016-05-27

    Two novel cyanine-based 1-amino-1-deoxy-β-glucose conjugates (Glu-1N-Cy3 and Glu-1N-Cy5) were designed, synthesized and their fluorescence characteristics were studied. Both Glu-1N-Cy3 and Glu-1N-Cy5 accumulate in living HT29 human colon cancer cells, which overexpress glucose transporters (GLUTs). The cellular uptake of the bioprobes was inhibited by natural GLUT substrate d-glucose and 2-deoxy-d-glucose. The GLUT specificity of the probes was validated with quercetin, which is both a permeant substrate via GLUTs and a high-affinity inhibitor of GLUT-mediated glucose transport. Competitive fluorometric assay for GLUT substrate cell uptake revealed that Glu-1N-Cy3 and Glu-1N-Cy5 are 5 and 10 times more sensitive than 2-NBDG, a leading fluorescent glucose bioprobe. This study provides fundamental data supporting the potential of these two conjugates as new powerful tools for GLUT-mediated theranostics, in vitro and in vivo molecular bioimaging and drug R&D. PMID:27033602

  4. A multifunctional mesoporous silica nanocomposite for targeted delivery, controlled release of doxorubicin and bioimaging.

    Science.gov (United States)

    Xie, Meng; Shi, Hui; Li, Zhen; Shen, Haijun; Ma, Kun; Li, Bo; Shen, Song; Jin, Yi

    2013-10-01

    In this study, a targeted drug delivery system based on mesoporous silica nanoparticles (MSN) was successfully developed for anti-cancer drug delivery and bioimaging. Carboxyl-functionalized MSN (MSN/COOH) was first prepared and then modified with folate as the cancer-targeting moiety and a near-infrared fluorescent dye as the labeling segment. Folate was conjugated to MSN/COOH via functional polyethylene glycol (PEG), constructing the vector MSN/COOH-PEG-FA. The carboxyl functionalization made the pore surface of the nanocarrier more negative than that of native MSN, providing attractive forces between the nanoparticles and positively charged doxorubicin hydrochloride (DOX). Meanwhile, the folate modification significantly enhanced the cellular uptake of the delivery system compared to unmodified counterparts. Furthermore, the introduction of PEG increased the water dispersibility, and the modification with the near-infrared fluorescent dye Cy5 made the system effective for live cell and in vivo imaging. Therefore, the Cy5-MSN/COOH-PEG-FA system could be a promising nanocarrier for simultaneous diagnosis and treatment of diseases. PMID:23711784

  5. Optimization of a dedicated bio-imaging beamline at the European X-ray FEL

    CERN Document Server

    Geloni, Gianluca; Saldin, Evgeni

    2012-01-01

    We recently proposed a basic concept for the design and layout of the undulator source for a dedicated bio-imaging beamline at the European XFEL. The goal of the optimized scheme proposed here is to enable experimental simplification and performance improvement. The core of the scheme is composed of soft and hard X-ray self-seeding setups. Based on an improved design for both monochromators, it is possible to increase the design electron energy up to 17.5 GeV in the photon energy range between 2 keV and 13 keV, which is the most preferable for life science experiments. An advantage of operating at such a high electron energy is the increase of the X-ray output peak power. Another advantage is that 17.5 GeV is the preferred operation energy for SASE1 and SASE2 beamline users. Since it will be necessary to run all the XFEL lines at the same electron energy, this choice will reduce the interference with other undulator lines and increase the total amount of scheduled beam time. In this work we also propose a stu...

  6. Gadolinia nanofibers as a multimodal bioimaging and potential radiation therapy agent

    Energy Technology Data Exchange (ETDEWEB)

    Grishin, A. M., E-mail: grishin@kth.se, E-mail: grishin@inmatech.com [KTH Royal Institute of Technology, SE-164 40 Stockholm-Kista (Sweden); INMATECH Intelligent Materials Technology, SE-127 45 Skärholmen (Sweden); Petrozavodsk State University, 185910 Petrozavodsk, Karelian Republic (Russian Federation); Jalalian, A. [KTH Royal Institute of Technology, SE-164 40 Stockholm-Kista (Sweden); INMATECH Intelligent Materials Technology, SE-127 45 Skärholmen (Sweden); Tsindlekht, M. I. [Racah Institute of Physics, Hebrew University of Jerusalem, 91904 Jerusalem (Israel)

    2015-05-15

    Continuous bead-free C-type cubic gadolinium oxide (Gd₂O₃) nanofibers 20-30 μm long and 40-100 nm in diameter were sintered by a sol-gel calcination assisted electrospinning technique. Dipole-dipole interaction of neighboring Gd³⁺ ions in nanofibers with a large length-to-diameter aspect ratio results in a kind of superparamagnetic behavior: the fibers are magnetized twice as strongly as Gd₂O₃ powder. Compared with commercial Gd-DTPA/Magnevist®, diethyleneglycol-coated Gd₂O₃ (Gd₂O₃-DEG) fibers show high 1/T₁ and 1/T₂ proton relaxivities. Intense room temperature photoluminescence, high NMR relaxivity and the high neutron scattering cross-section of the ¹⁵⁷Gd nucleus make Gd₂O₃ fibers promising for multimodal bioimaging and neutron capture therapy.

  7. Fluorescent probe based on heteroatom containing styrylcyanine: pH-sensitive properties and bioimaging in vivo

    International Nuclear Information System (INIS)

    A novel fluorescent probe based on a heteroatom-containing styrylcyanine is synthesized. The fluorescence of the probe is bright green in basic and neutral media but dark orange in strong acidic environments, and can be reversibly switched. Such behavior enables it to work as a fluorescent pH sensor in the solution state and a chemosensor for detecting acidic and basic volatile organic compounds. Analyses by NMR spectroscopy confirm that the protonation or deprotonation of the pyridinyl moiety is responsible for the sensing process. In addition, fluorescent microscopic images of the probe in live cells and zebrafish are achieved successfully, suggesting that the probe has good cell membrane permeability and low cytotoxicity. - Graphical abstract: A novel styrylcyanine-based fluorescent pH probe was designed and synthesized, the fluorescence of which is bright green in basic and neutral media but dark orange in strong acidic environments. Such behavior enables it to work as a fluorescent pH sensor in the solution state, and a chemosensor for detecting volatile organic compounds with high acidity and basicity in the solid state. In addition, it can be used for fluorescent imaging in living cells and living organisms. - Highlights: • Bright green fluorescence was observed in basic and neutral media. • Dark orange fluorescence was found in strong acidic environments. • Volatile organic compounds with high acidity and basicity could be detected. • Bioimaging in living cells and living organisms was achieved successfully.

  8. Fluorescent Polymer Nanoparticles Based on Dyes: Seeking Brighter Tools for Bioimaging.

    Science.gov (United States)

    Reisch, Andreas; Klymchenko, Andrey S

    2016-04-01

    Speed, resolution and sensitivity of today's fluorescence bioimaging can be drastically improved by fluorescent nanoparticles (NPs) that are many-fold brighter than organic dyes and fluorescent proteins. While the field is currently dominated by inorganic NPs, notably quantum dots (QDs), fluorescent polymer NPs encapsulating large quantities of dyes (dye-loaded NPs) have emerged recently as an attractive alternative. These new nanomaterials, inspired from the fields of polymeric drug delivery vehicles and advanced fluorophores, can combine superior brightness with biodegradability and low toxicity. Here, we describe the strategies for synthesis of dye-loaded polymer NPs by emulsion polymerization and assembly of pre-formed polymers. Superior brightness requires strong dye loading without aggregation-caused quenching (ACQ). Only recently several strategies of dye design were proposed to overcome ACQ in polymer NPs: aggregation induced emission (AIE), dye modification with bulky side groups and use of bulky hydrophobic counterions. The resulting NPs now surpass the brightness of QDs by ≈10-fold for a comparable size, and have started reaching the level of the brightest conjugated polymer NPs. Other properties, notably photostability, color, blinking, as well as particle size and surface chemistry are also systematically analyzed. Finally, major and emerging applications of dye-loaded NPs for in vitro and in vivo imaging are reviewed. PMID:26901678

  9. A four-dimensional snapshot hyperspectral video-endoscope for bio-imaging applications

    Science.gov (United States)

    Lim, Hoong-Ta; Murukeshan, Vadakke Matham

    2016-04-01

    Hyperspectral imaging has proven significance in bio-imaging applications, with the ability to capture up to several hundred images at different wavelengths offering relevant spectral signatures. To use hyperspectral imaging for in vivo monitoring and diagnosis of internal body cavities, a snapshot hyperspectral video-endoscope is required. However, previously reported systems provide only about 50 wavelengths. We have developed a four-dimensional snapshot hyperspectral video-endoscope with a spectral range of 400–1000 nm, which can detect 756 wavelengths for imaging, significantly more than existing systems. Capturing the three-dimensional datacube sequentially gives the fourth dimension. All this is achieved through a flexible two-dimensional to one-dimensional fiber bundle. The potential of this custom designed and fabricated compact biomedical probe is demonstrated by imaging phantom tissue samples in reflectance and fluorescence imaging modalities. It is envisaged that this novel concept and the developed probe will contribute significantly towards diagnostic in vivo biomedical imaging in the near future.

  10. Super-resolution fluorescent materials: an insight into design and bioimaging applications.

    Science.gov (United States)

    Yang, Zhigang; Sharma, Amit; Qi, Jing; Peng, Xiao; Lee, Dong Yeop; Hu, Rui; Lin, Danying; Qu, Junle; Kim, Jong Seung

    2016-08-22

    Living organisms are generally composed of complex cellular processes which persist only within their native environments. To enhance our understanding of the biological processes lying within complex milieus, various techniques have been developed. Specifically, the emergence of super-resolution microscopy has generated a renaissance in cell biology by redefining the existing dogma towards nanoscale cell dynamics, single synaptic vesicles, and other complex bioprocesses by overcoming the diffraction-imposed resolution barrier that is associated with conventional microscopy techniques. Besides the typical technical reliance on the optical framework and computational algorithm, super-resolution imaging microscopy resorts largely to fluorescent materials with special photophysical properties, including fluorescent proteins, organic fluorophores and nanomaterials. In this tutorial review article, with the emphasis on cell biology, we summarize the recent developments in fluorescent materials being utilized in various super-resolution techniques with successful integration into bio-imaging applications. Fluorescent proteins (FP) applied in super-resolution microscopy will not be covered herein as it has already been well summarized; additionally, we demonstrate the breadth of opportunities offered from a future perspective. PMID:27296269

  11. Aqueous synthesis and biostabilization of CdS@ZnS quantum dots for bioimaging applications

    Science.gov (United States)

    Chen, L.; Liu, Y.; Lai, C.; Berry, R. M.; Tam, K. C.

    2015-10-01

    Bionanohybrids, combining biocompatible natural polymers with inorganic materials, have aroused interest because of their structural, functional, and environmental advantages. In this work, we report on the stabilization of CdS@ZnS core-shell quantum dots (QDs) using carboxylated cellulose nanocrystals (CNCs) as nanocarriers in the aqueous phase. High colloidal stability was achieved with sufficient negative charge on the CNC surface and the coordination of Cd2+ to carboxylate groups. This coordination allows the in-situ nucleation and growth of QDs on the CNC surface. The influences of the QD to CNC ratio, pH and ZnS coating on the colloidal stability and photoluminescence properties of the CNC/QD nanohybrids were also studied. The results showed that products obtained at pH 8 with a CdS to CNC weight ratio of 0.19 and a ZnS/CdS molar ratio of 1.5 possessed excellent colloidal stability and the highest photoluminescence intensity. By anchoring QDs on rigid bionanotemplates, CNC/CdS@ZnS exhibited long-term colloidal and optical stability. Using biocompatible CNCs as nanocarriers, the products have been demonstrated to exhibit low cytotoxicity towards HeLa cells and can serve as promising red-emitting fluorescent bioimaging probes.

  12. Fluorescent probe based on heteroatom containing styrylcyanine: pH-sensitive properties and bioimaging in vivo

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Xiaodong [School of Chemistry and Chemical Engineering, South China University of Technology, Guangzhou 510640 (China); Gao, Ya; Huang, Zhibing; Chen, Xiaohui; Ke, Zhiyong [School of Basic Medical Science, Southern Medical University, Guangzhou 510515 (China); Zhao, Peiliang; Yan, Yichen [Department of Organic Pharmaceutical Chemistry, School of Pharmaceutical Sciences, Southern Medical University, Guangzhou 510515 (China); Liu, Ruiyuan, E-mail: ruiyliu@smu.edu.cn [Department of Organic Pharmaceutical Chemistry, School of Pharmaceutical Sciences, Southern Medical University, Guangzhou 510515 (China); Qu, Jinqing, E-mail: cejqqu@scut.edu.cn [School of Chemistry and Chemical Engineering, South China University of Technology, Guangzhou 510640 (China)

    2015-07-01

    A novel fluorescent probe based on a heteroatom-containing styrylcyanine is synthesized. The fluorescence of the probe is bright green in basic and neutral media but dark orange in strong acidic environments, and can be reversibly switched. Such behavior enables it to work as a fluorescent pH sensor in the solution state and a chemosensor for detecting acidic and basic volatile organic compounds. Analyses by NMR spectroscopy confirm that the protonation or deprotonation of the pyridinyl moiety is responsible for the sensing process. In addition, fluorescent microscopic images of the probe in live cells and zebrafish are achieved successfully, suggesting that the probe has good cell membrane permeability and low cytotoxicity. - Graphical abstract: A novel styrylcyanine-based fluorescent pH probe was designed and synthesized, the fluorescence of which is bright green in basic and neutral media but dark orange in strong acidic environments. Such behavior enables it to work as a fluorescent pH sensor in the solution state, and a chemosensor for detecting volatile organic compounds with high acidity and basicity in the solid state. In addition, it can be used for fluorescent imaging in living cells and living organisms. - Highlights: • Bright green fluorescence was observed in basic and neutral media. • Dark orange fluorescence was found in strong acidic environments. • Volatile organic compounds with high acidity and basicity could be detected. • Bioimaging in living cells and living organisms was achieved successfully.

  13. Surface chemistry manipulation of gold nanorods preserves optical properties for bio-imaging applications

    Energy Technology Data Exchange (ETDEWEB)

    Polito, Anthony B.; Maurer-Gardner, Elizabeth I.; Hussain, Saber M., E-mail: saber.hussain@us.af.mil [Air Force Research Laboratory, Molecular Bioeffects Branch, Bioeffects Division, Human Effectiveness Directorate (United States)

    2015-12-15

    Due to their anisotropic shape, gold nanorods (GNRs) possess a number of advantages for biosystem use, including enhanced surface area and tunable optical properties within the near-infrared (NIR) region. However, cetyl trimethylammonium bromide-related cytotoxicity, overall poor cellular uptake following surface chemistry modifications, and loss of NIR optical properties due to intracellular aggregation of the material together remain obstacles for nano-based biomedical GNR applications. In this article, we report that tannic acid-coated 11-mercaptoundecyl trimethylammonium bromide (MTAB) GNRs (MTAB-TA) show no significant decrease in either in vitro cell viability or stress activation after exposure to A549 human alveolar epithelial cells. In addition, MTAB-TA GNRs demonstrate a substantial level of cellular uptake while displaying a unique intracellular clustering pattern. This clustering pattern significantly reduces intracellular aggregation, preserving the NIR optical properties of the GNRs, vital for biomedical imaging applications. These results demonstrate how surface chemistry modifications enhance biocompatibility, allow for a higher rate of internalization with low intracellular aggregation of MTAB-TA GNRs, and identify them as prime candidates for use in nano-based bio-imaging applications. Graphical Abstract.

  14. Surface chemistry manipulation of gold nanorods preserves optical properties for bio-imaging applications

    International Nuclear Information System (INIS)

    Due to their anisotropic shape, gold nanorods (GNRs) possess a number of advantages for biosystem use, including enhanced surface area and tunable optical properties within the near-infrared (NIR) region. However, cetyl trimethylammonium bromide-related cytotoxicity, overall poor cellular uptake following surface chemistry modifications, and loss of NIR optical properties due to intracellular aggregation of the material together remain obstacles for nano-based biomedical GNR applications. In this article, we report that tannic acid-coated 11-mercaptoundecyl trimethylammonium bromide (MTAB) GNRs (MTAB-TA) show no significant decrease in either in vitro cell viability or stress activation after exposure to A549 human alveolar epithelial cells. In addition, MTAB-TA GNRs demonstrate a substantial level of cellular uptake while displaying a unique intracellular clustering pattern. This clustering pattern significantly reduces intracellular aggregation, preserving the NIR optical properties of the GNRs, vital for biomedical imaging applications. These results demonstrate how surface chemistry modifications enhance biocompatibility, allow for a higher rate of internalization with low intracellular aggregation of MTAB-TA GNRs, and identify them as prime candidates for use in nano-based bio-imaging applications. Graphical Abstract

  15. Current status of uncertainty analysis methods for computer models

    International Nuclear Information System (INIS)

    This report surveys several existing uncertainty analysis methods for estimating the uncertainty in computer model outputs caused by input uncertainties, illustrating the application of those methods to three computer models: MARCH/CORRAL II, TERFOC and SPARC. The merits and limitations of the methods are assessed in these applications, and recommendations for selecting uncertainty analysis methods are provided. (author)
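
    The basic idea shared by such methods, propagating input uncertainties through the model by sampling, can be sketched with a simple Monte Carlo loop. The model, input distributions and parameter names below are invented for illustration and have no connection to MARCH/CORRAL II, TERFOC or SPARC:

```python
import numpy as np

rng = np.random.default_rng(42)

def model(k, v):
    # Stand-in "computer model": a hypothetical release fraction after
    # time t = 2 h with rate constant k and retention factor v.
    return (1.0 - np.exp(-k * 2.0)) / (1.0 + v)

# Input uncertainties expressed as distributions (assumed for illustration).
k = rng.lognormal(mean=np.log(0.5), sigma=0.3, size=10000)
v = rng.uniform(0.1, 0.9, size=10000)

# Propagate each sampled input pair through the model and summarize
# the induced distribution of the output.
y = model(k, v)
lo, med, hi = np.percentile(y, [5, 50, 95])
print(f"output median {med:.3f}, 90% interval [{lo:.3f}, {hi:.3f}]")
```

Methods of the kind the report surveys differ mainly in how they choose the input samples (random, Latin hypercube, response-surface based) and in how they summarize the output spread.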

  16. Development of analysis methods for seismically isolated nuclear structures

    International Nuclear Information System (INIS)

    KAERI's contributions to the project entitled Development of Analysis Methods for Seismically Isolated Nuclear Structures, carried out under the IAEA CRP on intercomparison of analysis methods for predicting the behaviour of seismically isolated nuclear structures during 1996-1999, are briefly described. The work aimed to develop numerical analysis methods and to compare the analysis results with the benchmark test results for seismic isolation bearings and isolated nuclear structures provided by the participating countries. Certain progress in the analysis procedures for isolation bearings and isolated nuclear structures has been made throughout the IAEA CRPs, and the analysis methods developed can be improved for future nuclear facility applications. (author)

  17. Methods for analysis of fluoroquinolones in biological fluids

    Science.gov (United States)

    Methods for analysis of 10 selected fluoroquinolone antibiotics in biological fluids are reviewed. Approaches for sample preparation, detection methods, limits of detection and quantitation and recovery information are provided for both single analyte and multi-analyte fluoroquinolone methods....

  18. Applied analysis mathematical methods in natural science

    CERN Document Server

    Senba, Takasi

    2004-01-01

    This book provides a general introduction to applied analysis; vector analysis with physical motivation, calculus of variations, Fourier analysis, eigenfunction expansion, distribution, and so forth, including a catalogue of mathematical theories, such as basic analysis, topological spaces, complex function theory, real analysis, and abstract analysis. This book also gives fundamental ideas of applied mathematics to discuss recent developments in nonlinear science, such as mathematical modeling of reinforced random motion of particles, the semi-conductor device equation in applied physics, and chemotaxis in

  19. NOA: a novel Network Ontology Analysis method

    OpenAIRE

    Wang, Jiguang; Huang, Qiang; Liu, Zhi-Ping; Wang, Yong; Wu, Ling-Yun; Chen, Luonan; Zhang, Xiang-Sun

    2011-01-01

    Gene ontology analysis has become a popular and important tool in bioinformatics studies, and current ontology analyses are mainly conducted on an individual gene or a gene list. However, recent molecular network analysis reveals that the same list of genes with different interactions may perform different functions. Therefore, it is necessary to consider molecular interactions to correctly and specifically annotate biological networks. Here, we propose a novel Network Ontology Analysis (NOA) meth...

  20. Comparison study of microarray meta-analysis methods

    Directory of Open Access Journals (Sweden)

    Yang Yee

    2010-08-01

    Full Text Available Abstract Background Meta-analysis methods exist for combining multiple microarray datasets. However, there are a wide range of issues associated with microarray meta-analysis and a limited ability to compare the performance of different meta-analysis methods. Results We compare eight meta-analysis methods, five existing methods, two naive methods and a novel approach (mDEDS. Comparisons are performed using simulated data and two biological case studies with varying degrees of meta-analysis complexity. The performance of meta-analysis methods is assessed via ROC curves and prediction accuracy where applicable. Conclusions Existing meta-analysis methods vary in their ability to perform successful meta-analysis. This success is very dependent on the complexity of the data and type of analysis. Our proposed method, mDEDS, performs competitively as a meta-analysis tool even as complexity increases. Because of the varying abilities of compared meta-analysis methods, care should be taken when considering the meta-analysis method used for particular research.
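
    A minimal version of the ROC-based comparison described above can be sketched on simulated data; here a naive single-study ranking is compared with a simple Stouffer z-score combination (not the paper's mDEDS method), with all counts and effect sizes assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n_genes, n_de = 2000, 200
truth = np.zeros(n_genes, bool)
truth[:n_de] = True  # which genes are truly differentially expressed

def study_z(effect=1.5):
    # Per-gene z statistics for one simulated microarray study:
    # differentially expressed genes get a shifted mean.
    z = rng.normal(0.0, 1.0, n_genes)
    z[truth] += effect
    return z

def auc(score):
    # Area under the ROC curve via the rank-sum (Mann-Whitney) identity.
    r = score.argsort().argsort() + 1  # ranks, 1-based
    n1, n0 = truth.sum(), (~truth).sum()
    return (r[truth].sum() - n1 * (n1 + 1) / 2) / (n1 * n0)

z1, z2 = study_z(), study_z()
stouffer = (z1 + z2) / np.sqrt(2.0)  # simple meta-analysis combination
print(f"single study AUC {auc(z1):.3f}, meta-analysis AUC {auc(stouffer):.3f}")
```

Even this naive combination improves the ROC ranking over a single study, which is the kind of gain (and the kind of yardstick) the comparison study measures across its eight methods.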

  1. Methods for Mediation Analysis with Missing Data

    Science.gov (United States)

    Zhang, Zhiyong; Wang, Lijuan

    2013-01-01

    Despite wide applications of both mediation models and missing data techniques, formal discussion of mediation analysis with missing data is still rare. We introduce and compare four approaches to dealing with missing data in mediation analysis including list wise deletion, pairwise deletion, multiple imputation (MI), and a two-stage maximum…
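
    The simplest of the four approaches, listwise deletion, can be sketched on synthetic mediation data (x -> m -> y); under the MCAR missingness assumed here it loses efficiency but remains unbiased. All coefficients, sample sizes and names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 1000
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)            # mediator path: a = 0.5
y = 0.4 * m + 0.2 * x + rng.normal(size=n)  # outcome paths: b = 0.4

# Make 20% of mediator values missing completely at random (MCAR).
m_obs = m.copy()
m_obs[rng.uniform(size=n) < 0.2] = np.nan

def indirect_effect(x, m, y):
    # a from regressing m on x; b from regressing y on m and x (OLS).
    a = np.linalg.lstsq(np.c_[x, np.ones_like(x)], m, rcond=None)[0][0]
    b = np.linalg.lstsq(np.c_[m, x, np.ones_like(x)], y, rcond=None)[0][0]
    return a * b  # indirect (mediated) effect a*b

# Listwise deletion: drop every case with any missing value.
keep = ~np.isnan(m_obs)
ie = indirect_effect(x[keep], m_obs[keep], y[keep])
print(f"indirect effect (listwise deletion): {ie:.3f}")
```

The true indirect effect here is 0.5 * 0.4 = 0.2; multiple imputation would recover the lost precision by filling in the missing mediator values repeatedly instead of discarding the cases.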

  2. Statistical Analysis and Multivariate Methods in MS Excel

    OpenAIRE

    Postler, Štěpán

    2010-01-01

    The aim of this thesis is to create an application in Microsoft Office Excel 2003 that allows the user to solve problems using selected statistical analysis methods. This application is provided in the stat.xls file, which is the main part of the thesis. The XLS file enables Excel to apply methods of univariate and bivariate categorical analysis of frequencies, univariate, bivariate and even multivariate analysis of quantitative data, and index analysis of economic data. To apply the methods the fi...

  3. SYSTEMATIZATION AND ANALYSIS OF METHODS FOR MACHINING HOLES IN COMPOSITES

    OpenAIRE

    Пасічник, Віталій Анатолійович; Черказний, Віталій Юрійович

    2015-01-01

    Purpose. To analyze the literature on mechanical methods of machining holes in composite materials, and to systematize the methods. To create a system for evaluating them according to the needs of a given situation. Design/methodology/approach. Using estimates over sets of indicators of quality, productivity, efficiency and the technological capabilities of each method, the system determines which method is the most effective in a given situation. Findings. Conducted analysis of methods for machining holes in com...

  4. Present methods for mineralogical analysis of uranium ores

    International Nuclear Information System (INIS)

    The most promising methods of mineralogical analysis of uranium and uranium-containing minerals, ores and rocks are considered. They include X-ray diffraction, electron microscopy and fluorescence spectroscopy methods. The basic physical principles and capabilities of each method are described, and examples of its practical application are presented. A comparative characterization of the methods for mineralogical analysis of radioactive ores and their reprocessing products is given. Attention is paid to the equipment and various devices for analysis.

  5. Complex geophysical methods for apatite ore analysis

    International Nuclear Information System (INIS)

    A number of geophysical testing methods are considered, including X-ray radiometry, gamma-gamma techniques and measurements of magnetic susceptibility. This group of methods provides the accuracy needed for determining the phosphate content of all ore types in the Khibiny deposits and allows quantitative assay of a number of minerals, including apatite, sphene and titanomagnetite. It is shown that most testing work can be satisfactorily carried out by the highly efficient gamma-gamma method. (author)

  6. Cost benefit analysis methods in public sector

    OpenAIRE

    Kinnunen, T.

    2016-01-01

    Cost-benefit analysis is an economic analysis tool that can be used to support public decision making, when there are several mutually exclusive alternatives being considered. It compares the monetary value of the benefits resulting from a specific project or policy with the costs accrued by it. However, it would appear that it is currently used mainly for investment projects, and not for analyzing public services. This thesis is a literature study on the use of cost-benefit analysis in the p...
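
    The core comparison the thesis describes, discounting benefit and cost streams to a common present value, can be sketched in a few lines; the cash flows and discount rate below are purely hypothetical:

```python
def npv(flows, rate):
    # Net present value of yearly flows (year 0 first).
    return sum(f / (1.0 + rate) ** t for t, f in enumerate(flows))

# Hypothetical public project: up-front outlay, then yearly upkeep
# against a steady stream of benefits.
costs    = [100.0, 10.0, 10.0, 10.0, 10.0]
benefits = [0.0, 40.0, 40.0, 40.0, 40.0]
net = [b - c for b, c in zip(benefits, costs)]
rate = 0.05  # assumed social discount rate

print(f"NPV at {rate:.0%}: {npv(net, rate):.1f}")
print("benefit-cost ratio:", round(npv(benefits, rate) / npv(costs, rate), 2))
```

A positive NPV (equivalently, a benefit-cost ratio above 1) is the decision criterion; the choice of discount rate, which this sketch simply assumes, is one of the contested points in public-sector applications.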

  7. Nuclear analysis methods in monitoring occupational health

    International Nuclear Information System (INIS)

    With the increasing industrialisation of the world has come an increase in exposure to hazardous chemicals. Their effect on the body depends upon the concentration of the element in the work environment; its chemical form; the possible different routes of intake; and the individual's biological response to the chemical. Nuclear techniques of analysis, such as neutron activation analysis (NAA) and proton induced X-ray emission analysis (PIXE), have played an important role in understanding the effects hazardous chemicals can have on occupationally exposed workers. In this review, examples of their application, mainly in monitoring exposure to heavy metals, are discussed.

  8. Chemical Analysis Methods for Silicon Carbide

    Institute of Scientific and Technical Information of China (English)

    Shen Keyin

    2006-01-01

    1 General and Scope. This Standard specifies the methods for determination of silicon dioxide, free silicon, free carbon, total carbon, silicon carbide, and ferric sesquioxide in silicon carbide abrasive material.

  9. Numerical analysis in electromagnetics the TLM method

    CERN Document Server

    Saguet, Pierre

    2013-01-01

The aim of this book is to give a broad overview of the TLM (Transmission Line Matrix) method, which is one of the "time-domain numerical methods". These methods are known for their significant demands on computer resources, but they have the advantage of being highly general. The TLM method has acquired a reputation as a powerful and effective tool among numerous teams and still benefits today from significant theoretical developments. In particular, in recent years, its ability to simulate various situations with excellent precision, including complex materials, has been

  10. Steel mill products analysis using qualities methods

    Directory of Open Access Journals (Sweden)

    B. Gajdzik

    2016-10-01

The article presents the subject matter of steel mill product analysis using quality tools. The subjects of quality control were bolts and a ball bushing. The Pareto chart and failure mode and effect analysis (FMEA) were used to assess faultiness of the products. The faultiness analysis in the case of the bolt enabled us to detect the following defects: failure to keep the dimensional tolerance, dents and imprints, improper roughness, lack of pre-machining, non-compatibility of the electroplating and faults on the surface. Analysis of the ball bushing also revealed defects such as: failure to keep the dimensional tolerance, dents and imprints, improper surface roughness, lack of surface premachining as well as sharp edges and splitting of the material.
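The Pareto step described above, ranking defect causes and isolating the "vital few" that account for roughly 80% of faults, can be sketched as follows; the defect counts are hypothetical placeholders, not the paper's data.

```python
# Pareto analysis of bolt defects: sort causes by count, accumulate the
# percentage, and stop at the causes covering ~80% of all faults.
# Counts below are made up for illustration.
defects = {
    "dimensional tolerance": 120,
    "dents and imprints": 65,
    "improper roughness": 30,
    "no pre-machining": 15,
    "electroplating non-compatibility": 10,
    "surface faults": 5,
}
total = sum(defects.values())
cumulative = 0.0
vital_few = []
for cause, count in sorted(defects.items(), key=lambda kv: -kv[1]):
    cumulative += 100.0 * count / total
    vital_few.append((cause, round(cumulative, 1)))
    if cumulative >= 80.0:
        break  # the "vital few" causes covering ~80% of faults
print(vital_few)
```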

  11. A catalog of automated analysis methods for enterprise models.

    Science.gov (United States)

    Florez, Hector; Sánchez, Mario; Villalobos, Jorge

    2016-01-01

Enterprise models are created for documenting and communicating the structure and state of Business and Information Technologies elements of an enterprise. After models are completed, they are mainly used to support analysis. Model analysis is an activity typically based on human skills and, due to the size and complexity of the models, this process can be complicated and omissions or miscalculations are very likely. This situation has fostered research into automated analysis methods for supporting analysts in enterprise analysis processes. By reviewing the literature, we found several analysis methods; nevertheless, they are based on specific situations and different metamodels; thus, some analysis methods might not be applicable to all enterprise models. This paper presents the work of compilation (literature review), classification, structuring, and characterization of automated analysis methods for enterprise models, expressing them in a standardized modeling language. In addition, we have implemented the analysis methods in our modeling tool. PMID:27047732

  12. Human resources methods analysis in engineering company

    OpenAIRE

    Akšteinová, Michaela

    2009-01-01

The aim of this bachelor thesis is to analyse the human resources activities of the machine manufacturer SPG Czech, s.r.o. The thesis is divided into a theoretical and a practical part. The theoretical part describes human resources activities: work analysis, recruitment and selection of employees, induction and adjustment of employees, managing job performance and evaluating employees, education and training of employees, rewarding of employees and employee care. Based...

  13. Analysis of queues methods and applications

    CERN Document Server

    Gautam, Natarajan

    2012-01-01

Introduction Analysis of Queues: Where, What, and How? Systems Analysis: Key Results Queueing Fundamentals and Notations Psychology in Queueing Reference Notes Exercises Exponential Interarrival and Service Times: Closed-Form Expressions Solving Balance Equations via Arc Cuts Solving Balance Equations Using Generating Functions Solving Balance Equations Using Reversibility Reference Notes Exercises Exponential Interarrival and Service Times: Numerical Techniques and Approximations Multidimensional Birth and Death Chains Multidimensional Markov Chains Finite-State Markov Chains Reference Notes Exerci
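The balance-equation material in the contents above has a well-known closed form in the simplest (M/M/1) case, which makes for a compact sketch; the arrival and service rates are assumed values for illustration.

```python
# Stationary distribution of an M/M/1 queue from its balance equations:
# p_n = (1 - rho) * rho**n with utilization rho = lambda/mu < 1.
lam, mu = 2.0, 5.0                          # assumed arrival and service rates
rho = lam / mu
p = [(1 - rho) * rho**n for n in range(50)]  # truncated state probabilities
L = sum(n * pn for n, pn in enumerate(p))    # mean number in system
print(round(rho, 3), round(L, 3))            # L approaches rho/(1-rho)
```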

  14. PIC (PRODUCTS OF INCOMPLETE COMBUSTION) ANALYSIS METHODS

    Science.gov (United States)

    The report gives results of method evaluations for products of incomplete combustion (PICs): 36 proposed PICs were evaluated by previously developed gas chromatography/flame ionization detection (GC/FID) and gas chromatography/mass spectroscopy (GC/MS) methods. It also gives resu...

  15. Research on the Method of Big Data Analysis

    OpenAIRE

    Li, Z. H.; H.F. Qin

    2013-01-01

With the development of society, relational databases face great opportunities and challenges; how to store and how to analyze big data has become a hot issue. This article starts from traditional data analysis, surveys its current state and the trends in data analysis. Big data faces many issues, such as architecture, analysis techniques, storage, privacy and security. Regarding methods of analysis, the article mainly introduces the structu...

  16. Methods of Fourier analysis and approximation theory

    CERN Document Server

    Tikhonov, Sergey

    2016-01-01

Different facets of interplay between harmonic analysis and approximation theory are covered in this volume. The topics included are Fourier analysis, function spaces, optimization theory, partial differential equations, and their links to modern developments in the approximation theory. The articles of this collection originated from two events. The first event took place during the 9th ISAAC Congress in Krakow, Poland, 5th-9th August 2013, at the section “Approximation Theory and Fourier Analysis”. The second event was the conference on Fourier Analysis and Approximation Theory in the Centre de Recerca Matemàtica (CRM), Barcelona, during 4th-8th November 2013, organized by the editors of this volume. All articles selected to be part of this collection were carefully reviewed.

  17. Frequentist methods for failure data analysis

    International Nuclear Information System (INIS)

    This note presents some frequentist methods for calculation of the reliability law of a component, taking into account failure data and degradation data, extracted from operation feedback data banks. (author). 10 tabs., 22 figs., 14 refs

  18. Method for chromium analysis and speciation

    Energy Technology Data Exchange (ETDEWEB)

    Aiken, Abigail M.; Peyton, Brent M.; Apel, William A.; Petersen, James N.

    2004-11-02

A method of detecting a metal in a sample comprising a plurality of metals is disclosed. The method comprises providing the sample comprising a metal to be detected. The sample is added to a reagent solution comprising an enzyme and a substrate, where the enzyme is inhibited by the metal to be detected. An array of chelating agents is used to eliminate the inhibitory effects of additional metals in the sample. An enzymatic activity in the sample is determined and compared to an enzymatic activity in a control solution to detect the metal to be detected. A method of determining a concentration of the metal in the sample is also disclosed. A method of detecting a valence state of a metal is also disclosed.
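The comparison step in the abstract (sample activity versus a metal-free control) reduces to a percent-inhibition calculation; the activity readings and detection threshold below are hypothetical, not values from the patent.

```python
# Enzyme-inhibition detection sketch: lower activity relative to the
# metal-free control indicates the inhibitory metal is present.
# Numbers are made up for illustration.
def percent_inhibition(sample_activity, control_activity):
    return 100.0 * (1.0 - sample_activity / control_activity)

control = 0.85   # e.g. absorbance change per minute, no metal present
sample = 0.34    # same assay with the unknown sample added
inhibition = percent_inhibition(sample, control)
detected = inhibition > 20.0   # assumed detection threshold
print(round(inhibition, 1), detected)
```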

  19. DNA Methylation Analysis: Choosing the Right Method

    Directory of Open Access Journals (Sweden)

    Sergey Kurdyukov

    2016-01-01

    Full Text Available In the burgeoning field of epigenetics, there are several methods available to determine the methylation status of DNA samples. However, choosing the method that is best suited to answering a particular biological question still proves to be a difficult task. This review aims to provide biologists, particularly those new to the field of epigenetics, with a simple algorithm to help guide them in the selection of the most appropriate assay to meet their research needs. First of all, we have separated all methods into two categories: those that are used for: (1 the discovery of unknown epigenetic changes; and (2 the assessment of DNA methylation within particular regulatory regions/genes of interest. The techniques are then scrutinized and ranked according to their robustness, high throughput capabilities and cost. This review includes the majority of methods available to date, but with a particular focus on commercially available kits or other simple and straightforward solutions that have proven to be useful.

  20. Information Security Risk Analysis Methods and Research Trends: AHP and Fuzzy Comprehensive Method

    OpenAIRE

    Ming-Chang Lee

    2014-01-01

Information security risk analysis has become an increasingly essential component of organizations' operations. Traditional information security risk analysis uses quantitative and qualitative analysis methods, each of which has advantages for information risk analysis. However, the analytic hierarchy process (AHP) has been widely used in security assessment. A future research direction may be the development and application of soft computing such as rough sets, grey set...

  1. The BioImage Study: novel approaches to risk assessment in the primary prevention of atherosclerotic cardiovascular disease--study design and objectives

    DEFF Research Database (Denmark)

    Muntendam, Pieter; McCall, Carol; Sanz, Javier;

    2010-01-01

The identification of asymptomatic individuals at risk for near-term atherothrombotic events to ensure optimal preventive treatment remains a challenging goal. In the BioImage Study, novel approaches are tested in a typical health-plan population. Based on certain demographic and risk characteris...

  2. General method of quantitative spectrographic analysis

    International Nuclear Information System (INIS)

A spectrographic method was developed to determine 23 elements in a wide range of concentrations; the method can be applied to metallic or refractory samples. Prior fusion with lithium tetraborate and germanium oxide is performed in order to avoid the influence of matrix composition and crystalline structure. Germanium oxide is also employed as internal standard. The resulting beads are mixed with graphite powder (1:1) and excited in a 10 amperes direct current arc. (Author) 12 refs
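The internal-standard principle mentioned above (germanium oxide) can be sketched as a ratio calculation: the analyte line intensity is divided by the Ge line intensity, and concentration is read from a log-log calibration. The calibration coefficients below are hypothetical, not from the report.

```python
# Internal-standard quantification sketch: ratioing the analyte line to the
# germanium reference line cancels excitation fluctuations; concentration is
# then read off an assumed log-log calibration curve.
import math

def concentration(i_analyte, i_internal, slope=1.0, intercept=-1.2):
    """log10(C) = slope * log10(I_analyte / I_Ge) + intercept (placeholder fit)."""
    ratio = i_analyte / i_internal
    return 10 ** (slope * math.log10(ratio) + intercept)

print(round(concentration(500.0, 100.0), 4))
```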

  3. AVIS: analysis method for document coherence

    International Nuclear Information System (INIS)

The present document gives a short insight into AVIS, a method for verifying the quality of technical documents. The paper presents the applied approach, based on the K.O.D. method, defines the quality criteria of a technical document, and describes the means of evaluating these criteria. (authors). 9 refs., 2 figs

  4. Economic Analysis of Paddy Threshing Methods

    OpenAIRE

    Prasanna, P.H.S.N.; L H P Gunaratne; Withana, W.D.R.S.

    2004-01-01

    Post-harvest losses of paddy in Sri Lanka are as high as 15 percent of total production. Of this, about 24 percent of losses occur during the threshing and cleaning stage with tractor treading being the most common paddy threshing method. In order to overcome these deficiencies, recently small and combined threshers have been introduced. This study attempted to determine the efficiency of different paddy threshing methods, and to estimate the profitability of small and combined thresher owner...

  5. Game data analysis tools and methods

    CERN Document Server

    Coupart, Thibault

    2013-01-01

    This book features an introduction to the basic theoretical tenets of data analysis from a game developer's point of view, as well as a practical guide to performing gameplay analysis on a real-world game.This book is ideal for video game developers who want to try and experiment with the game analytics approach for their own productions. It will provide a good overview of the themes you need to pay attention to, and will pave the way for success. Furthermore, the book also provides a wide range of concrete examples that will be useful for any game data analysts or scientists who want to impro

  6. Radioisotope method of compound flow analysis

    Czech Academy of Sciences Publication Activity Database

    Petryka, L.; Zych, M.; Hanus, R.; Sobota, J.; Vlasák, Pavel; Malczewska, B.

    Liberec: Technical university of Liberec, 2014 - (Dančová, P.; Vít, T.), s. 459-462. (EPJ Web of Conferences). ISSN 2100-014X. [Experimental Fluid Mechanics 2014. Český Krumlov (CZ), 18.11.2014-21.11.2014] R&D Projects: GA ČR GAP105/10/1574 Grant ostatní: National Science Centre of Poland(PL) N N523 755340 Institutional support: RVO:67985874 Keywords : compound flow analysis * Cross Correlation Analysis * hydrotransport of solid particles Subject RIV: BK - Fluid Dynamics

  7. Optimizing the synthesis of CdS/ZnS core/shell semiconductor nanocrystals for bioimaging applications

    OpenAIRE

    Li-wei Liu; Si-yi Hu; Ying Pan; Jia-qi Zhang; Yue-shu Feng; Xi-he Zhang

    2014-01-01

In this study, we report on CdS/ZnS nanocrystals as a luminescence probe for bioimaging applications. CdS nanocrystals capped with a ZnS shell had enhanced luminescence intensity, stronger stability and exhibited a longer lifetime compared to uncapped CdS. The CdS/ZnS nanocrystals were stabilized in Pluronic F127 block copolymer micelles, offering optically and colloidally stable contrast agents for in vitro and in vivo imaging. Photostability tests showed that the ZnS protective shell n...

  8. Chemical aspects of nuclear methods of analysis

    International Nuclear Information System (INIS)

    This final report includes papers which fall into three general areas: development of practical pre-analysis separation techniques, uranium/thorium separation from other elements for analytical and processing operations, and theory and mechanism of separation techniques. A separate abstract was prepared for each of the 9 papers

  9. Mixed Methods Analysis of Enterprise Social Networks

    DEFF Research Database (Denmark)

    Behrendt, Sebastian; Richter, Alexander; Trier, Matthias

    2014-01-01

    The increasing use of enterprise social networks (ESN) generates vast amounts of data, giving researchers and managerial decision makers unprecedented opportunities for analysis. However, more transparency about the available data dimensions and how these can be combined is needed to yield accurate...

  10. An ESDIRK Method with Sensitivity Analysis Capabilities

    DEFF Research Database (Denmark)

    Kristensen, Morten Rode; Jørgensen, John Bagterp; Thomsen, Per Grove;

    2004-01-01

    A new algorithm for numerical sensitivity analysis of ordinary differential equations (ODEs) is presented. The underlying ODE solver belongs to the Runge-Kutta family. The algorithm calculates sensitivities with respect to problem parameters and initial conditions, exploiting the special structure...

  11. Implicitly Weighted Methods in Robust Image Analysis

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2012-01-01

    Roč. 44, č. 3 (2012), s. 449-462. ISSN 0924-9907 R&D Projects: GA MŠk(CZ) 1M06014 Institutional research plan: CEZ:AV0Z10300504 Keywords : robustness * high breakdown point * outlier detection * robust correlation analysis * template matching * face recognition Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.767, year: 2012

  12. Principles and Analysis of Krylov Subspace Methods

    Czech Academy of Sciences Publication Activity Database

    Strakoš, Zdeněk

    Ostrava : Institute of Geonics AS CR, 2005. ISBN 80-86407-04-7. [SNA '05. Seminar on Numerical Analysis. Ostrava (CZ), 07.02.2005-11.02.2005] Institutional research plan: CEZ:AV0Z10300504 Subject RIV: BA - General Mathematics

  13. Polydiacetylene-enclosed near-infrared fluorescent semiconducting polymer dots for bioimaging and sensing.

    Science.gov (United States)

    Wu, Pei-Jing; Kuo, Shih-Yu; Huang, Ya-Chi; Chen, Chuan-Pin; Chan, Yang-Hsiang

    2014-05-20

    Semiconducting polymer dots (P-dots) recently have emerged as a new type of ultrabright fluorescent probe with promising applications in biological imaging and detection. With the increasing desire for near-infrared (NIR) fluorescing probes for in vivo biological measurements, the currently available NIR-emitting P-dots are very limited and the leaching of the encapsulated dyes/polymers has usually been a concern. To address this challenge, we first embedded the NIR dyes into the matrix of poly[(9,9-dioctylfluorene)-co-2,1,3-benzothiadiazole-co-4,7-di(thiophen-2-yl)-2,1,3-benzothiadiazole] (PF-BT-DBT) polymer and then enclosed the doped P-dots with polydiacetylenes (PDAs) to avoid potential leakage of the entrapped NIR dyes from the P-dot matrix. These PDA-enclosed NIR-emitting P-dots not only emitted much stronger NIR fluorescence than conventional organic molecules but also exhibited enhanced photostability over CdTe quantum dots, free NIR dyes, and gold nanoclusters. We next conjugated biomolecules onto the surface of the resulting P-dots and demonstrated their capability for specific cellular labeling without any noticeable nonspecific binding. To employ this new class of material as a facile sensing platform, an easy-to-prepare test paper, obtained by soaking the paper into the PDA-enclosed NIR-emitting P-dot solution, was used to sense external stimuli such as ions, temperature, or pH, depending on the surface functionalization of PDAs. We believe these PDA-coated NIR-fluorescing P-dots will be very useful in a variety of bioimaging and analytical applications. PMID:24749695

  14. Low-temperature approach to highly emissive copper indium sulfide colloidal nanocrystals and their bioimaging applications.

    Science.gov (United States)

    Yu, Kui; Ng, Peter; Ouyang, Jianying; Zaman, Md Badruz; Abulrob, Abedelnasser; Baral, Toya Nath; Fatehi, Dorothy; Jakubek, Zygmunt J; Kingston, David; Wu, Xiaohua; Liu, Xiangyang; Hebert, Charlie; Leek, Donald M; Whitfield, Dennis M

    2013-04-24

We report our newly developed low-temperature synthesis of colloidal photoluminescent (PL) CuInS2 nanocrystals (NCs) and their in vitro and in vivo imaging applications. With diphenylphosphine sulphide (SDPP) as a S precursor made from elemental S and diphenylphosphine, this is a noninjection based approach in 1-dodecanethiol (DDT) with excellent synthetic reproducibility and large-scale capability. For a typical synthesis with copper iodide (CuI) as a Cu source and indium acetate (In(OAc)3) as an In source, the growth temperature was as low as 160 °C and the feed molar ratios were 1Cu-to-1In-to-4S. Amazingly, the resulting CuInS2 NCs in toluene exhibit quantum yield (QY) of ~23% with photoemission peaking at ~760 nm and full width at half maximum (FWHM) of ~140 nm. With a mean size of ~3.4 nm (measured from the vertices to the bases of the pyramids), they are pyramidal in shape with a crystal structure of tetragonal chalcopyrite. In situ ³¹P NMR (monitored from 30 °C to 100 °C) and in situ absorption at 80 °C suggested that the Cu precursor should be less reactive toward SDPP than the In precursor. For our in vitro and in vivo imaging applications, CuInS2/ZnS core-shell QDs were synthesized; afterwards, dihydrolipoic acid (DHLA) or 11-mercaptoundecanoic acid (MUA) were used for ligand exchange and then bio-conjugation was performed. Two single-domain antibodies (sdAbs) were used. One was 2A3 for in vitro imaging of BxPC3 pancreatic cancer cells. The other was EG2 for in vivo imaging of a Glioblastoma U87MG brain tumour model. The bioimaging data illustrate that the CuInS2 NCs from our SDPP-based low-temperature noninjection approach are of good quality. PMID:23486927

  15. Transonic wing analysis using advanced computational methods

    Science.gov (United States)

    Henne, P. A.; Hicks, R. M.

    1978-01-01

    This paper discusses the application of three-dimensional computational transonic flow methods to several different types of transport wing designs. The purpose of these applications is to evaluate the basic accuracy and limitations associated with such numerical methods. The use of such computational methods for practical engineering problems can only be justified after favorable evaluations are completed. The paper summarizes a study of both the small-disturbance and the full potential technique for computing three-dimensional transonic flows. Computed three-dimensional results are compared to both experimental measurements and theoretical results. Comparisons are made not only of pressure distributions but also of lift and drag forces. Transonic drag rise characteristics are compared. Three-dimensional pressure distributions and aerodynamic forces, computed from the full potential solution, compare reasonably well with experimental results for a wide range of configurations and flow conditions.

  16. Discrete element analysis methods of generic differential quadratures

    CERN Document Server

    Chen, Chang-New

    2008-01-01

Presents generic differential quadrature, the extended differential quadrature and the related discrete element analysis methods. This book demonstrates their ability to solve generic scientific and engineering problems.

  17. Adaptive computational methods for aerothermal heating analysis

    Science.gov (United States)

    Price, John M.; Oden, J. Tinsley

    1988-01-01

    The development of adaptive gridding techniques for finite-element analysis of fluid dynamics equations is described. The developmental work was done with the Euler equations with concentration on shock and inviscid flow field capturing. Ultimately this methodology is to be applied to a viscous analysis for the purpose of predicting accurate aerothermal loads on complex shapes subjected to high speed flow environments. The development of local error estimate strategies as a basis for refinement strategies is discussed, as well as the refinement strategies themselves. The application of the strategies to triangular elements and a finite-element flux-corrected-transport numerical scheme are presented. The implementation of these strategies in the GIM/PAGE code for 2-D and 3-D applications is documented and demonstrated.
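The refinement strategy described above (estimate the local error, refine where it exceeds a tolerance) can be shown in a toy 1-D analogue; the midpoint-versus-trapezoid difference used as the error indicator here is a stand-in assumption, not the aerothermal estimator of the paper.

```python
# Toy adaptive h-refinement: split any interval whose local error indicator
# (difference between two quadrature estimates of f over the cell) exceeds tol.
def adapt(f, a, b, tol=1e-4):
    cells = [(a, b)]
    refined = []
    while cells:
        x0, x1 = cells.pop()
        h = x1 - x0
        trap = 0.5 * h * (f(x0) + f(x1))       # trapezoid estimate
        mid = h * f(0.5 * (x0 + x1))           # midpoint estimate
        if abs(trap - mid) > tol:              # local error indicator too large
            m = 0.5 * (x0 + x1)
            cells += [(x0, m), (m, x1)]        # h-refinement: split the cell
        else:
            refined.append((x0, x1))
    return sorted(refined)

mesh = adapt(lambda x: x**4, 0.0, 1.0)
print(len(mesh))  # smaller cells cluster near x = 1 where curvature is largest
```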

  18. Statistical methods for astronomical data analysis

    CERN Document Server

    Chattopadhyay, Asis Kumar

    2014-01-01

    This book introduces “Astrostatistics” as a subject in its own right with rewarding examples, including work by the authors with galaxy and Gamma Ray Burst data to engage the reader. This includes a comprehensive blending of Astrophysics and Statistics. The first chapter’s coverage of preliminary concepts and terminologies for astronomical phenomenon will appeal to both Statistics and Astrophysics readers as helpful context. Statistics concepts covered in the book provide a methodological framework. A unique feature is the inclusion of different possible sources of astronomical data, as well as software packages for converting the raw data into appropriate forms for data analysis. Readers can then use the appropriate statistical packages for their particular data analysis needs. The ideas of statistical inference discussed in the book help readers determine how to apply statistical tests. The authors cover different applications of statistical techniques already developed or specifically introduced for ...

  19. Simplified Processing Method for Meter Data Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Fowler, Kimberly M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Colotelo, Alison H. A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Downs, Janelle L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ham, Kenneth D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Henderson, Jordan W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Montgomery, Sadie A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Vernon, Christopher R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Parker, Steven A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-11-01

A simple, quick metered-data processing method that can be used for Army Metered Data Management System (MDMS) and Logistics Innovation Agency data, but may also be useful for other large data sets. It is intended for large data sets when the analyst has little information about the buildings.

  20. Multibody Finite Element Method and Application in Hydraulic Structure Analysis

    OpenAIRE

    Chao Su; Yebin Zhao; Yusong Jiang

    2015-01-01

    Multibody finite element method is proposed for analysis of contact problems in hydraulic structure. This method is based on the block theory of discontinuous deformation analysis (DDA) method and combines advantages of finite element method (FEM) and the displacement compatibility equation in classical elastic mechanics. Each single block is analyzed using FEM in corresponding local coordinate system and all contacting blocks need to satisfy the displacement compatibility requirement between...

  1. Analysis of Vibration Diagnostics Methods for Induction Motors

    Directory of Open Access Journals (Sweden)

    A. Kalinov

    2014-09-01

The paper presents an analysis of existing vibration diagnostics methods. In order to evaluate the efficiency of method application the following criteria have been proposed: volume of input data required for establishing a diagnosis, data content, software and hardware level, and execution time for vibration diagnostics. According to the mentioned criteria, a classification of vibration diagnostics methods for determination of their advantages and disadvantages, and for the search for their development and improvement, is presented in the paper. The paper contains a comparative estimation of methods in accordance with the proposed criteria. According to this estimation the most efficient methods are spectral analysis and spectral analysis of the vibration signal envelope.
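The two methods ranked most efficient above, a spectrum of the raw vibration signal and a spectrum of its envelope, can be sketched with NumPy; the signal (a 50 Hz carrier amplitude-modulated at 7 Hz, mimicking a bearing defect) and sampling rate are made up for illustration.

```python
# Raw spectrum vs. envelope spectrum of a synthetic vibration signal.
# The envelope is obtained from the analytic signal (FFT-based Hilbert transform).
import numpy as np

fs = 2000.0
t = np.arange(0, 1.0, 1 / fs)
x = (1.0 + 0.5 * np.cos(2 * np.pi * 7 * t)) * np.sin(2 * np.pi * 50 * t)

spectrum = np.abs(np.fft.rfft(x)) / len(x)
freqs = np.fft.rfftfreq(len(x), 1 / fs)

# analytic signal: zero the negative frequencies, double the positive ones
X = np.fft.fft(x)
h = np.zeros(len(x))
h[0] = 1.0
h[1:len(x) // 2] = 2.0
h[len(x) // 2] = 1.0
envelope = np.abs(np.fft.ifft(X * h))
env_spec = np.abs(np.fft.rfft(envelope - envelope.mean())) / len(x)

print(freqs[np.argmax(spectrum)])   # the carrier dominates the raw spectrum
print(freqs[np.argmax(env_spec)])   # the modulation (defect) frequency dominates the envelope spectrum
```

The raw spectrum peaks at the 50 Hz carrier, while the envelope spectrum exposes the 7 Hz modulation, which is exactly why envelope analysis is favoured for bearing-defect diagnostics.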

  2. [Analysis of porphyrin photosensitizers using HPLC method].

    Science.gov (United States)

    Jia, Min-ge; Wu, Hai-yan; Sun, Li-li; Yao, Chun-suo; Zhang, Shao-liang; Li, Ya-wei; Fang, Qi-cheng

    2015-08-01

Photodynamic therapy (PDT), because of its good targeting, minimal invasion, and safety, is becoming a very active area in cancer prevention and treatment, in which the photosensitizers have proved to be the core element for PDT. We developed a new HPLC method for analyzing porphyrin photosensitizers using Shiseido Capcell PAK C18 (150 mm x 4.6 mm, 5 µm) as the column at 30 °C, methanol-1% aqueous solution of acetic acid as the mobile phase in a flow rate of 1.0 mL · min(-1) in a gradient elution mode, and the detection wavelength at 380 nm. This method, showing good specificity, precision, accuracy and robustness in methodological validations, can be applied to the purity test and assay of porphyrin photosensitizers, and has played a key guiding role in the R&D of the new porphyrin photosensitizer, sinoporphyrin sodium. PMID:26669003

  3. Updated Methods for Seed Shape Analysis

    Directory of Open Access Journals (Sweden)

    Emilio Cervantes

    2016-01-01

Morphological variation in seed characters includes differences in seed size and shape. Seed shape is an important trait in plant identification and classification. In addition it has agronomic importance because it reflects genetic, physiological, and ecological components and affects yield, quality, and market price. The use of digital technologies, together with the development of quantification and modeling methods, allows a better description of seed shape. Image processing systems are used in the automatic determination of seed size and shape, becoming a basic tool in the study of diversity. Seed shape is determined by a variety of indexes (circularity, roundness, and J index). The comparison of the seed images to a geometrical figure (circle, cardioid, ellipse, ellipsoid, etc.) provides a precise quantification of shape. The methods of shape quantification based on these models are useful for an accurate description, allowing comparison between genotypes or across developmental phases as well as establishing the level of variation in different sets of seeds.
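Two of the indexes named above, circularity and roundness, can be sketched on an idealized elliptical seed outline; the formulas are the standard shape-descriptor definitions, evaluated analytically here (Ramanujan's perimeter approximation) rather than from real seed images.

```python
# Shape indexes for an ellipse with semi-axes a >= b:
#   circularity = 4*pi*A / P**2   (1.0 for a perfect circle, lower if elongated)
#   roundness   = 4*A / (pi * major_axis**2)  (b/a for an ellipse)
import math

def circularity(a, b):
    area = math.pi * a * b
    h = ((a - b) / (a + b)) ** 2
    # Ramanujan's approximation for the ellipse perimeter
    perimeter = math.pi * (a + b) * (1 + 3 * h / (10 + math.sqrt(4 - 3 * h)))
    return 4 * math.pi * area / perimeter ** 2

def roundness(a, b):
    return (4 * math.pi * a * b) / (math.pi * (2 * a) ** 2)

print(round(circularity(10.0, 10.0), 3), round(circularity(28.0, 10.0), 3))
```

In an image-processing pipeline the area and perimeter would come from a segmented seed mask instead of these closed forms, but the indexes are computed the same way.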

  4. Femtosecond protein nanocrystallography—data analysis methods

    OpenAIRE

    Kirian, R. A.; X. Wang; Weierstall, U.; Schmidt, K. E.; Spence, J. C. H.; Hunter, M; Fromme, P.; White, Thomas; Chapman, H. N.; Holton, J

    2010-01-01

    X-ray diffraction patterns may be obtained from individual submicron protein nanocrystals using a femtosecond pulse from a free-electron X-ray laser. Many “single-shot” patterns are read out every second from a stream of nanocrystals lying in random orientations. The short pulse terminates before significant atomic (or electronic) motion commences, minimizing radiation damage. Simulated patterns for Photosystem I nanocrystals are used to develop a method for recovering structure factors from ...

  5. MLPA method for PMP22 gene analysis:

    OpenAIRE

    Kokalj-Vokač, Nadja; Stangler Herodež, Špela; Zagradišnik, Boris

    2005-01-01

    DNA copy number alterations are responsible for several categories of human diseases and syndromes. These changes can be detected by cytogenetic studies when there is involvement of several kilobases or megabases of DNA. Examination of sub-microscopic changes is possible by using short probes flanked by the same primer pairs. Multiplex ligation-dependent probe amplification (MLPA) is a simple, high resolution method by which not sample nucleic acids but probes added to the samples are amplifi...

  6. Geometrical Methods for Power Network Analysis

    CERN Document Server

    Bellucci, Stefano; Gupta, Neeraj

    2013-01-01

    This book is a short introduction to power system planning and operation using advanced geometrical methods. The approach is based on well-known insights and techniques developed in theoretical physics in the context of Riemannian manifolds. The proof of principle and robustness of this approach is examined in the context of the IEEE 5 bus system. This work addresses applied mathematicians, theoretical physicists and power engineers interested in novel mathematical approaches to power network theory.

  7. PERFORMANCE ANALYSIS OF HARDWARE TROJAN DETECTION METHODS

    OpenAIRE

    Ehsan, Sharifi; Kamal, Mohammadiasl; Mehrdad, Havasi; Amir, Yazdani

    2015-01-01

Due to the increasing use of information and communication technologies in most aspects of life, the security of information has drawn the attention of governments and industry as well as researchers. In this regard, structural attacks on the functions of a chip are called hardware Trojans, and are capable of rendering ineffective the security protecting our systems and data. These Trojans represent a big challenge for cyber-security, as they are nearly impossible to detect with any currently ...

  8. Computational stress analysis using finite volume methods

    OpenAIRE

    Fallah, Nosrat Allah

    2000-01-01

There is a growing interest in applying finite volume methods to model solid mechanics problems and multi-physics phenomena. During the last ten years an increasing amount of activity has taken place in this area. Unlike the finite element formulation, which generally involves volume integrals, the finite volume formulation transfers volume integrals to surface integrals using the divergence theorem. This transformation for convection and diffusion terms in the governing equations ensures...
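The volume-to-surface transformation described above is what makes the finite volume method a flux-balance method: integrating over each control volume and applying the divergence theorem leaves only face fluxes. A minimal 1-D sketch, assuming a diffusion equation with unit source and zero boundary values (not a solid-mechanics problem from the thesis):

```python
# 1-D finite volume solution of -d2u/dx2 = q on [0,1] with u = 0 at both ends.
# Each cell balances its east and west face fluxes F = du/dx against the source.
n, q = 50, 1.0
h = 1.0 / n
u = [0.0] * (n + 2)                # cell-centre values plus two ghost cells
for _ in range(20000):             # Gauss-Seidel sweeps on the flux balance
    for i in range(1, n + 1):
        # F_east - F_west + q*h = 0  =>  (u[i+1]-u[i])/h - (u[i]-u[i-1])/h = -q*h
        u[i] = 0.5 * (u[i - 1] + u[i + 1] + q * h * h)
    u[0] = -u[1]                   # ghost values enforce u = 0 at the walls
    u[n + 1] = -u[n]
print(round(max(u), 3))            # exact solution x(1-x)/2 peaks at 0.125
```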

  9. Extrudate Expansion Modelling through Dimensional Analysis Method

    DEFF Research Database (Denmark)

    A new model framework is proposed to correlate extrudate expansion and extrusion operation parameters for a food extrusion cooking process through the dimensional analysis principle, i.e. the Buckingham pi theorem. Three dimensionless groups, i.e. energy, water content and temperature, are suggested to describe the extrudate expansion. From the three dimensionless groups, an equation with three experimentally determined parameters is derived to express the extrudate expansion. The model is evaluated with whole wheat flour and aquatic feed extrusion experimental data. The average deviations of the...
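    The kind of correlation a Buckingham-pi analysis yields can be sketched as a power-law model in the three dimensionless groups, fitted in log space. The functional form, coefficients, and synthetic data below are illustrative assumptions, not the paper's fitted model.

```python
import numpy as np

# Hypothetical power-law correlation E = a * Pi1^b * Pi2^c * Pi3^d between
# the expansion ratio and the three dimensionless groups (energy, water
# content, temperature); fitted by linear least squares on logarithms.
def fit_expansion_model(pi1, pi2, pi3, expansion):
    """Least-squares fit of log E = log a + b log Pi1 + c log Pi2 + d log Pi3."""
    X = np.column_stack([np.ones_like(pi1), np.log(pi1), np.log(pi2), np.log(pi3)])
    coef, *_ = np.linalg.lstsq(X, np.log(expansion), rcond=None)
    return np.exp(coef[0]), coef[1], coef[2], coef[3]

rng = np.random.default_rng(0)
pi1, pi2, pi3 = rng.uniform(0.5, 2.0, size=(3, 200))
expansion = 1.8 * pi1**0.4 * pi2**-0.7 * pi3**0.2   # synthetic "measurements"
a, b, c, d = fit_expansion_model(pi1, pi2, pi3, expansion)
```

    With noise-free synthetic data the fit recovers the generating exponents exactly; with real extrusion data the residuals would quantify model adequacy.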

  10. Paired comparisons analysis: an axiomatic approach to ranking methods

    NARCIS (Netherlands)

    Gonzalez-Diaz, J.; Hendrickx, Ruud; Lohmann, E.R.M.A.

    2014-01-01

    In this paper we present an axiomatic analysis of several ranking methods for general tournaments. We find that the ranking method obtained by applying maximum likelihood to the (Zermelo-)Bradley-Terry model, the most common method in statistics and psychology, is one of the ranking methods that per
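    The maximum-likelihood ranking for the (Zermelo-)Bradley-Terry model named above can be computed with Zermelo's classic iterative algorithm; this is a generic sketch, and the 3-player win matrix is a made-up example, not data from the paper.

```python
# Maximum-likelihood strengths under the Bradley-Terry model via Zermelo's
# fixed-point iteration: p_i <- w_i / sum_j (n_ij / (p_i + p_j)).
def bradley_terry(wins, n_iter=500):
    """wins[i][j] = number of times i beat j; returns normalised strengths."""
    n = len(wins)
    p = [1.0] * n
    for _ in range(n_iter):
        new_p = []
        for i in range(n):
            w_i = sum(wins[i])                              # total wins of i
            denom = sum((wins[i][j] + wins[j][i]) / (p[i] + p[j])
                        for j in range(n) if j != i)
            new_p.append(w_i / denom if denom else p[i])
        total = sum(new_p)
        p = [x / total for x in new_p]                      # fix the scale
    return p

wins = [[0, 3, 4],
        [1, 0, 3],
        [0, 1, 0]]
ratings = bradley_terry(wins)
```

    Ranking by the fitted strengths then orders the players; only the ratios of the `p` values are identified, hence the normalisation.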

  11. Advanced CFD methods for wind turbine analysis

    Science.gov (United States)

    Lynch, C. Eric

    2011-12-01

    Horizontal-axis wind turbines operate in a complex, inherently unsteady aerodynamic environment. Even when the rotor is not stalled, the flow over the blades is dominated by three-dimensional (3-D) effects. Stall is accompanied by massive flow separation and vortex shedding over the suction surface of the blades. Under yawed conditions, dynamic stall may be present as well. In all operating conditions, there is bluff-body shedding from the turbine nacelle and support structure which interacts with the rotor wake. In addition, the high aspect ratios of wind turbine blades make them very flexible, leading to substantial aeroelastic deformation of the blades, altering the aerodynamics. Finally, when situated in a wind farm, turbines must operate in the unsteady wake of upstream neighbors. Though computational fluid dynamics (CFD) has made significant inroads as a research tool, simple, inexpensive methods, such as blade element momentum (BEM) theory, are still the workhorses in wind turbine design and aeroelasticity applications. These methods generally assume a quasi-steady flowfield and use two-dimensional aerodynamic approximations with very limited empirical 3-D corrections. As a result, they are unable to accurately predict rotor loads near the edges of the operating envelope. CFD methods make very few limiting assumptions about the flowfield, and thus have much greater potential for predicting these flows. In this work, a range of unstructured grid CFD techniques for predicting wind turbine loads and aeroelasticity has been developed and applied to a wind turbine configuration of interest. First, a nearest neighbor search algorithm based on a k-dimensional tree data structure was used to improve the computational efficiency of an approximate unsteady actuator blade method. This method was then shown to predict root and tip vortex locations and strengths similar to an overset method on the same background mesh, but without the computational expense of modeling
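    The k-dimensional-tree nearest-neighbour search mentioned above can be sketched with a small 3-D k-d tree; this toy version over (x, y, z) tuples is an illustration of the data structure, not the solver's actual implementation.

```python
import random

# Minimal 3-D k-d tree with exact nearest-neighbour query.
def build(points, depth=0):
    if not points:
        return None
    axis = depth % 3
    points = sorted(points, key=lambda p: p[axis])
    mid = len(points) // 2
    return (points[mid], build(points[:mid], depth + 1),
            build(points[mid + 1:], depth + 1))

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def nearest(node, query, depth=0, best=None):
    if node is None:
        return best
    point, left, right = node
    if best is None or dist2(query, point) < dist2(query, best):
        best = point
    axis = depth % 3
    diff = query[axis] - point[axis]
    near, far = (left, right) if diff < 0 else (right, left)
    best = nearest(near, query, depth + 1, best)
    if diff * diff < dist2(query, best):    # search ball crosses the split plane
        best = nearest(far, query, depth + 1, best)
    return best

random.seed(42)
pts = [tuple(random.uniform(0, 1) for _ in range(3)) for _ in range(300)]
tree = build(pts)
q = (0.5, 0.5, 0.5)
found = nearest(tree, q)
```

    The split-plane pruning test is what turns the linear scan into a logarithmic-time query on average, which is the efficiency gain the abstract attributes to the k-d tree.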

  12. Sensitivity analysis via reduced order adjoint method

    International Nuclear Information System (INIS)

    Notwithstanding the voluminous literature on adjoint sensitivity analysis, it has been generally dismissed by practitioners as cumbersome, with limited value in realistic engineering models. This perception reflects two limitations of adjoint sensitivity analysis: a) its most effective application is limited to the calculation of first-order variations; when higher-order derivatives are required, it quickly becomes computationally inefficient; and b) the number of adjoint model evaluations depends on the number of responses, which renders it ineffective for multi-physics models where entire distributions, such as flux and power distributions, are often transferred between the various physics models. To overcome these challenges, this manuscript employs recent advances in reduced order modeling to re-cast the adjoint model equations into a form that renders their application to real reactor models practical. Past work applied reduced order modeling techniques to achieve reduction for general nonlinear high-dimensional models by identifying mathematical subspaces, called active subspaces, that capture all dominant features of the model, including both linear and nonlinear variations. We demonstrate the application of these techniques to the calculation of first-order derivatives, commonly known as sensitivity coefficients, for a fuel assembly model with many responses. We show that the computational cost becomes dependent on the physics model itself, via the so-called rank of the active subspace, rather than the number of responses or parameters. (author)
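    The active-subspace idea invoked above can be illustrated in a few lines: the eigenvectors of the sampled gradient covariance reveal the few parameter directions that actually drive a response. The 5-parameter test function below is an assumption for demonstration only.

```python
import numpy as np

# Identify the active subspace from sampled response gradients: form the
# empirical gradient covariance C and keep its dominant eigenvectors.
def active_subspace(grads, k):
    """grads: (n_samples, n_params) response gradients; top-k directions."""
    C = grads.T @ grads / len(grads)        # empirical gradient covariance
    lam, vec = np.linalg.eigh(C)            # eigh returns ascending eigenvalues
    return vec[:, ::-1][:, :k], lam[::-1]

w = np.array([1.0, 2.0, 0.0, 0.0, 0.0])    # response f(x) = sin(w @ x)
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 5))
grads = np.cos(X @ w)[:, None] * w[None, :]  # exact gradients of f
W, lam = active_subspace(grads, 1)
```

    Here the gradient covariance has rank one, so a single eigenvector (aligned with `w`) carries all of the variation: the "rank of the active subspace" the abstract refers to is one, regardless of how many responses or parameters the full model has.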

  13. Spectral analysis method for detecting an element

    Science.gov (United States)

    Blackwood, Larry G [Idaho Falls, ID; Edwards, Andrew J [Idaho Falls, ID; Jewell, James K [Idaho Falls, ID; Reber, Edward L [Idaho Falls, ID; Seabury, Edward H [Idaho Falls, ID

    2008-02-12

    A method for detecting an element is described and which includes the steps of providing a gamma-ray spectrum which has a region of interest which corresponds with a small amount of an element to be detected; providing nonparametric assumptions about a shape of the gamma-ray spectrum in the region of interest, and which would indicate the presence of the element to be detected; and applying a statistical test to the shape of the gamma-ray spectrum based upon the nonparametric assumptions to detect the small amount of the element to be detected.
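    The patented procedure is not public code; as a generic stand-in, here is a nonparametric (rank-sum) check for elevated counts in a spectral region of interest versus flanking background channels. The window contents and the significance threshold are illustrative assumptions, not the patent's algorithm.

```python
# Wilcoxon-style rank-sum comparison of ROI channel counts against
# background channel counts (one-sided normal approximation).
def rank_sum(roi, background):
    combined = sorted(roi + background)
    rank_of = {}
    i = 0
    while i < len(combined):                     # average ranks over ties
        j = i
        while j < len(combined) and combined[j] == combined[i]:
            j += 1
        rank_of[combined[i]] = (i + j + 1) / 2   # 1-based average rank
        i = j
    return sum(rank_of[v] for v in roi)

def roi_is_elevated(roi, background):
    n, m = len(roi), len(background)
    w = rank_sum(roi, background)
    mean = n * (n + m + 1) / 2
    sd = (n * m * (n + m + 1) / 12) ** 0.5
    return (w - mean) / sd > 1.645               # one-sided 5% level
```

    Because the test uses only ranks, it needs no parametric assumption about the spectrum's shape in the region of interest, which is the spirit of the nonparametric approach described.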

  14. Tournament Methods for WLAN: Analysis and Efficiency

    Science.gov (United States)

    Galtier, Jérôme

    In the context of radio distributed networks, we present a generalized approach for Medium Access Control (MAC) with a fixed congestion window. Our protocol is quite simple to analyze and can be used in a lot of different situations. We give mathematical evidence showing that our performance is asymptotically tight. We also place ourselves in the WiFi and WiMAX frameworks, and discuss experimental results showing a collision reduction of 14% to 21% compared to the best-known methods. We discuss channel capacity improvement and fairness considerations.

  15. Vulnerability analysis of three remote voting methods

    CERN Document Server

    Enguehard, Chantal

    2009-01-01

    This article analyses three methods of remote voting in an uncontrolled environment: postal voting, internet voting and hybrid voting. It breaks down the voting process into different stages and compares their vulnerabilities considering criteria that must be respected in any democratic vote: confidentiality, anonymity, transparency, vote unicity and authenticity. Whether for safety or reliability, each vulnerability is quantified by three parameters: size, visibility and difficulty to achieve. The study concludes that the automatisation of treatments combined with the dematerialisation of the objects used during an election tends to replace visible vulnerabilities of lesser magnitude with invisible and widespread vulnerabilities.

  16. Experimental and analysis methods in radiochemical experiments

    Science.gov (United States)

    Cattadori, C. M.; Pandola, L.

    2016-04-01

    Radiochemical experiments made the history of neutrino physics by achieving the first observation of solar neutrinos (Cl experiment) and the first detection of the fundamental pp solar neutrino component (Ga experiments). Over decades they measured the integral νe charged-current interaction rate in the exposed target. The basic operating principle is the chemical separation of the few atoms of the new chemical species produced by the neutrino interactions from the rest of the target, and their individual counting in a low-background counter. The smallness of the expected interaction rate (~1 event per day in a ~100 ton target) poses severe experimental challenges on the chemical and counting procedures. The main aspects of the analysis techniques employed in solar neutrino experiments are reviewed and described, with a special focus on event selection and the statistical treatment of data.

  17. Surface analysis methods in materials science

    CERN Document Server

    Sexton, Brett; Smart, Roger

    1992-01-01

    The idea for this book stemmed from a remark by Philip Jennings of Murdoch University in a discussion session following a regular meeting of the Australian Surface Science group. He observed that a text on surface analysis and applications to materials suitable for final year undergraduate and postgraduate science students was not currently available. Furthermore, the members of the Australian Surface Science group had the research experience and range of coverage of surface analytical techniques and applications to provide a text for this purpose. A list of techniques and applications to be included was agreed at that meeting. The intended readership of the book has been broadened since the early discussions, particularly to encompass industrial users, but there has been no significant alteration in content. The editors, in consultation with the contributors, have agreed that the book should be prepared for four major groups of readers: - senior undergraduate students in chemistry, physics, metallur...

  18. Information Security Risk Analysis Methods and Research Trends: AHP and Fuzzy Comprehensive Method

    Directory of Open Access Journals (Sweden)

    Ming-Chang Lee

    2014-02-01

    Information security risk analysis is becoming an increasingly essential component of an organization's operations. Traditional information security risk analysis uses quantitative and qualitative analysis methods, each of which has advantages for information risk analysis. In addition, the analytic hierarchy process has been widely used in security assessment. A future research direction may be the development and application of soft computing techniques such as rough sets, grey sets, fuzzy systems, genetic algorithms, support vector machines, Bayesian networks and hybrid models. Hybrid models are developed by integrating two or more existing models. Practical advice for evaluating information security risk is discussed. The approach combines AHP and the fuzzy comprehensive method.
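    The AHP step that the paper combines with the fuzzy comprehensive method can be sketched as follows: derive criterion weights from a pairwise comparison matrix by power iteration and report a consistency index. The comparison matrix below is a made-up, perfectly consistent example, not one from the paper.

```python
# AHP priority weights from a pairwise comparison matrix A (A[i][j] is how
# much more important criterion i is than j), via power iteration on A.
def ahp_weights(A, n_iter=100):
    n = len(A)
    w = [1.0 / n] * n
    for _ in range(n_iter):
        w = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]                  # keep weights normalised
    lam_max = sum(sum(A[i][j] * w[j] for j in range(n)) / w[i]
                  for i in range(n)) / n
    ci = (lam_max - n) / (n - 1)                # consistency index
    return w, ci

true_w = [0.5, 0.3, 0.2]
A = [[wi / wj for wj in true_w] for wi in true_w]   # consistent matrix
weights, ci = ahp_weights(A)
```

    For a perfectly consistent matrix the recovered weights match the generating ratios and the consistency index is zero; real elicited matrices give ci > 0, which AHP compares against a random-consistency baseline.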

  19. Molten Salt Breeder Reactor Analysis Methods

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jinsu; Jeong, Yongjin; Lee, Deokjung [Ulsan National Institute of Science and Technology, Ulsan (Korea, Republic of)

    2015-05-15

    Utilizing the uranium-thorium fuel cycle shows considerable potential for the MSR. The concept of the MSBR should be revisited because of the molten salt reactor's advantages, such as outstanding neutron economy, the possibility of continuous online reprocessing and refueling, a high level of inherent safety, and the economic benefit of avoiding the fuel fabrication process. To support MSR research, this paper provides MSBR single-cell, two-cell and whole-core models for computer code input, and several calculation results, including depletion calculations for each model. The calculations are carried out using MCNP6, a Monte Carlo computer code, which includes CINDER90 for depletion calculation using ENDF-VII nuclear data. From the calculation results for various reactor design parameters, the temperature coefficients are all negative at the initial state, and the MTC becomes positive at the equilibrium state. From the results for control rod worth, the graphite control rods alone cannot make the core subcritical at the initial state; at the equilibrium state, however, the core can be made subcritical by graphite control rods alone. Comparison of the results of the models shows that the two-cell method can represent the MSBR core model more accurately with only slightly more computational resources than the single-cell method. Many thermal-spectrum MSRs have adopted a multi-region single-fluid strategy.

  20. Biodegradable, Elastomeric, and Intrinsically Photoluminescent Poly(Silicon-Citrates) with high Photostability and Biocompatibility for Tissue Regeneration and Bioimaging.

    Science.gov (United States)

    Du, Yuzhang; Xue, Yumeng; Ma, Peter X; Chen, Xiaofeng; Lei, Bo

    2016-02-01

    Biodegradable polymer biomaterials with intrinsic photoluminescent properties have attracted much interest due to their potential advantages for tissue regeneration and noninvasive bioimaging. However, few current biodegradable polymers possess tunable intrinsic fluorescent properties, such as high photostability, fluorescent lifetime, and quantum yield, together with the strong mechanical properties needed to meet the requirements of biomedical applications. Here, by a facile one-step thermal polymerization, elastomeric poly(silicone-citrate) (PSC) hybrid polymers are developed with controlled biodegradability and mechanical properties, tunable inherent fluorescent emission (up to 600 nm), high photostability (beyond 180 min for UV and six months for natural light), fluorescent lifetime (near 10 ns) and quantum yield (16%-35%), high cellular biocompatibility, and minimal inflammatory response in vivo, which provide advantages over conventional fluorescent dyes, quantum dots, and current fluorescent polymers. The promising applications of PSC hybrids for cell and implant imaging in vitro and in vivo are successfully demonstrated. The development of elastomeric PSC polymers may provide a new strategy for synthesizing new inorganic-organic hybrid photoluminescent materials for tissue regeneration and bioimaging applications. PMID:26687865

  1. CHAPTER 7. BERYLLIUM ANALYSIS BY NON-PLASMA BASED METHODS

    Energy Technology Data Exchange (ETDEWEB)

    Ekechukwu, A

    2009-04-20

    The most common method of analysis for beryllium is inductively coupled plasma atomic emission spectrometry (ICP-AES). This method, along with inductively coupled plasma mass spectrometry (ICP-MS), is discussed in Chapter 6. However, other methods exist and have been used for different applications. These methods include spectroscopic, chromatographic, colorimetric, and electrochemical. This chapter provides an overview of beryllium analysis methods other than plasma spectrometry (inductively coupled plasma atomic emission spectrometry or mass spectrometry). The basic methods, detection limits and interferences are described. Specific applications from the literature are also presented.

  2. Methods of remote surface chemical analysis for asteroid missions

    International Nuclear Information System (INIS)

    Different remote sensing methods are discussed which can be applied to investigate the chemical composition of minor bodies of the Solar System. The secondary-ion method, remote laser mass-analysis and electron beam induced X-ray emission analysis are treated in detail. Relative advantages of these techniques are analyzed. The physical limitation of the methods: effects of solar magnetic field and solar wind on the secondary-ion and laser methods and the effect of electrostatic potential of the space apparatus on the ion and electron beam methods are described. First laboratory results of remote laser method are given. (D.Gy.)

  3. Probabilistic structural analysis methods for space transportation propulsion systems

    Science.gov (United States)

    Chamis, C. C.; Moore, N.; Anis, C.; Newell, J.; Nagpal, V.; Singhal, S.

    1991-01-01

    Information on probabilistic structural analysis methods for space propulsion systems is given in viewgraph form. Information is given on deterministic certification methods, probability of failure, component response analysis, stress responses for 2nd stage turbine blades, Space Shuttle Main Engine (SSME) structural durability, and program plans.

  4. Modified Homotopy Analysis Method for Zakharov-Kuznetsov Equations

    Directory of Open Access Journals (Sweden)

    Muhammad USMAN

    2013-01-01

    In this paper, we apply the Modified Homotopy Analysis Method (MHAM) to find appropriate solutions of the Zakharov-Kuznetsov equations, which are of utmost importance in applied and engineering sciences. The proposed modification is an elegant coupling of the Homotopy Analysis Method (HAM) and Taylor's series. Numerical results coupled with graphical representation explicitly reveal the complete reliability of the proposed algorithm.

  5. 21 CFR 163.5 - Methods of analysis.

    Science.gov (United States)

    2010-04-01

    ... in accordance with 5 U.S.C. 552(a) and 1 CFR part 51. Copies may be obtained from the AOAC... CONSUMPTION CACAO PRODUCTS General Provisions § 163.5 Methods of analysis. Shell and cacao fat content in cacao products shall be determined by the following methods of analysis prescribed in “Official...

  6. An improved evaluation method for fault tree kinetic analysis

    International Nuclear Information System (INIS)

    By means of the exclusive sum of products of a fault tree, the improved method uses the basic event parameters directly in the synthetic evaluation and makes fault tree kinetic analysis simpler. This paper also provides a reasonable evaluation method for the kinetic analysis of basic events whose parameters follow a synthetic distribution.

  7. Comparative Study Among Least Square Method, Steepest Descent Method, and Conjugate Gradient Method for Atmospheric Sounder Data Analysis

    Directory of Open Access Journals (Sweden)

    Kohei Arai

    2013-09-01

    A comparative study among the Least Square Method (LSM), Steepest Descent Method (SDM), and Conjugate Gradient Method (CGM) for atmospheric sounder data analysis (estimation of vertical profiles of water vapor) is conducted. Through simulation studies, it is found that CGM shows the best estimation accuracy, followed by SDM and LSM. The dependency of each method on atmospheric models is also clarified.
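    The flavour of the SDM-versus-CGM comparison can be reproduced on a toy symmetric positive definite system (the normal equations of a least-squares problem); the actual water-vapour retrieval is not modelled here, and the matrix and iteration counts are illustrative assumptions.

```python
import numpy as np

# Steepest descent: move along the residual with an exact line search.
def steepest_descent(A, b, n_iter):
    x = np.zeros_like(b)
    for _ in range(n_iter):
        r = b - A @ x
        x = x + (r @ r) / (r @ A @ r) * r
    return x

# Conjugate gradient: A-orthogonal search directions; exact in n steps.
def conjugate_gradient(A, b, n_iter):
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    for _ in range(n_iter):
        if r @ r < 1e-28:                   # converged; avoid 0/0 in alpha
            break
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)
        x = x + alpha * p
        r_new = r - alpha * Ap
        p = r_new + (r_new @ r_new) / (r @ r) * p
        r = r_new
    return x

A = np.diag([1.0, 5.0, 25.0, 125.0, 625.0])   # deliberately ill-conditioned
b = np.ones(5)
x_sd = steepest_descent(A, b, 10)
x_cg = conjugate_gradient(A, b, 10)
x_true = np.linalg.solve(A, b)
```

    On an ill-conditioned system CGM converges in at most n iterations while SDM zig-zags, which mirrors the accuracy ordering (CGM, then SDM, then LSM) reported in the abstract.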

  8. Analysis of speech waveform quantization methods

    Directory of Open Access Journals (Sweden)

    Tadić Predrag R.

    2008-01-01

    Digitalization, consisting of sampling and quantization, is the first step in any digital signal processing algorithm. In most cases, the quantization is uniform. However, given knowledge of certain stochastic attributes of the signal (namely, the probability density function, or pdf), quantization can be made more efficient, in the sense of achieving a greater signal to quantization noise ratio. This means that narrower channel bandwidths are required for transmitting a signal of the same quality. Alternatively, if signal storage is of interest rather than transmission, considerable savings in memory space can be made. This paper presents several available methods for speech signal pdf estimation and quantizer optimization in the sense of minimizing the quantization error power.
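    A pdf-optimised scalar quantiser of the kind discussed above can be sketched with a Lloyd-Max-style iteration: alternate nearest-level assignment and centroid updates on samples drawn from the signal's estimated pdf. The Laplacian-like source below stands in for speech; it and the level count are assumptions for illustration.

```python
import random

# Lloyd-Max-style iteration: levels move to the centroids of the samples
# they quantise, which minimises mean squared quantisation error.
def lloyd_max(samples, n_levels, n_iter=50):
    lo, hi = min(samples), max(samples)
    levels = [lo + (hi - lo) * (i + 0.5) / n_levels for i in range(n_levels)]
    for _ in range(n_iter):
        buckets = [[] for _ in range(n_levels)]
        for x in samples:
            k = min(range(n_levels), key=lambda i: abs(x - levels[i]))
            buckets[k].append(x)
        levels = [sum(b) / len(b) if b else levels[k]
                  for k, b in enumerate(buckets)]
    return sorted(levels)

def mse(samples, levels):
    return sum(min((x - l) ** 2 for l in levels) for x in samples) / len(samples)

random.seed(7)
samples = [random.choice((-1, 1)) * random.expovariate(1.0) for _ in range(2000)]
opt_levels = lloyd_max(samples, 4)
lo, hi = min(samples), max(samples)
uni_levels = [lo + (hi - lo) * (i + 0.5) / 4 for i in range(4)]
```

    For a heavy-tailed (speech-like) source the optimised levels crowd toward the dense centre of the pdf, beating the uniform quantiser at the same bit budget, which is exactly the efficiency argument made in the abstract.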

  9. An advanced probabilistic structural analysis method for implicit performance functions

    Science.gov (United States)

    Wu, Y.-T.; Millwater, H. R.; Cruse, T. A.

    1989-01-01

    In probabilistic structural analysis, the performance or response functions usually are implicitly defined and must be solved by numerical analysis methods such as finite element methods. In such cases, the most commonly used probabilistic analysis tool is the mean-based, second-moment method which provides only the first two statistical moments. This paper presents a generalized advanced mean value (AMV) method which is capable of establishing the distributions to provide additional information for reliability design. The method requires slightly more computations than the second-moment method but is highly efficient relative to the other alternative methods. In particular, the examples show that the AMV method can be used to solve problems involving non-monotonic functions that result in truncated distributions.

  10. Schedulability Analysis Method of Timing Constraint Petri Nets

    Institute of Scientific and Technical Information of China (English)

    李慧芳; 范玉顺

    2002-01-01

    Timing constraint Petri nets (TCPNs) can be used to model a real-time system specification and to verify the timing behavior of the system. This paper describes the limitations of the reachability analysis method in analyzing complex systems for existing TCPNs. Based on further research on the schedulability analysis method with various topology structures, a more general state reachability analysis method is proposed. To meet various requirements of timely response for actual systems, this paper puts forward a heuristic method for selecting decision-spans of transitions and develops a heuristic algorithm for schedulability analysis of TCPNs. Examples are given showing the practicality of the method in the schedulability analysis for real-time systems with various structures.

  11. Structural Analysis Using Computer Based Methods

    Science.gov (United States)

    Dietz, Matthew R.

    2013-01-01

    The stiffness of a flex hose that will be used in the umbilical arms of the Space Launch System's mobile launcher needed to be determined in order to properly qualify ground umbilical plate behavior during vehicle separation post T-0. This data is also necessary to properly size and design the motors used to retract the umbilical arms. Therefore an experiment was created to determine the stiffness of the hose. Before the test apparatus for the experiment could be built, the structure had to be analyzed to ensure it would not fail under given loading conditions. The design model was imported into the analysis software and optimized to decrease runtime while still providing accurate results and allow for seamless meshing. Areas exceeding the allowable stresses in the structure were located and modified before submitting the design for fabrication. In addition, a mock-up of a deep space habitat and its support frame was designed and needed to be analyzed for structural integrity under different loading conditions. The load cases were provided by the customer and were applied to the structure after optimizing the geometry. Once again, weak points in the structure were located and recommended design changes were made to the customer, and the process was repeated until the load conditions were met without exceeding the allowable stresses. After the stresses met the required factors of safety, the designs were released for fabrication.

  12. Participant Interaction in Asynchronous Learning Environments: Evaluating Interaction Analysis Methods

    Science.gov (United States)

    Blanchette, Judith

    2012-01-01

    The purpose of this empirical study was to determine the extent to which three different objective analytical methods--sequence analysis, surface cohesion analysis, and lexical cohesion analysis--can most accurately identify specific characteristics of online interaction. Statistically significant differences were found in all points of…

  13. Transportation Mode Choice Analysis Based on Classification Methods

    OpenAIRE

    Zeņina, N; Borisovs, A

    2011-01-01

    Mode choice analysis has received the most attention among discrete choice problems in travel behavior literature. Most traditional mode choice models are based on the principle of random utility maximization derived from econometric theory. This paper investigates performance of mode choice analysis with classification methods - decision trees, discriminant analysis and multinomial logit. Experimental results have demonstrated satisfactory quality of classification.

  14. Method and procedure of fatigue analysis for nuclear equipment

    International Nuclear Information System (INIS)

    As an example, the fatigue analysis of the upper head of the pressurizer in one NPP was carried out using ANSYS, a finite element analysis software package. According to the RCC-M code, only two kinds of typical transients of temperature and pressure were considered in the fatigue analysis. Meanwhile, the influence of earthquake loading was taken into account. The method and procedure of fatigue analysis for nuclear safety equipment are described in detail. This paper provides a reference for the fatigue analysis and assessment of nuclear safety grade equipment and piping. (authors)

  15. Visualization Method for Finding Critical Care Factors in Variance Analysis

    OpenAIRE

    YUI, Shuntaro; BITO, Yoshitaka; OBARA, Kiyohiro; KAMIYAMA, Takuya; SETO, Kumiko; Ban, Hideyuki; HASHIZUME, Akihide; HAGA, Masashi; Oka, Yuji

    2006-01-01

    We present a novel visualization method for finding care factors in variance analysis. The analysis has two stages: the first stage enables users to extract a significant variance, and the second stage enables users to find critical care factors of the variance. The analysis has been validated by using synthetically created inpatient care processes. It was found that the method is efficient in improving clinical pathways.

  16. RESULTS OF THE QUESTIONNAIRE: ANALYSIS METHODS

    CERN Multimedia

    Staff Association

    2014-01-01

    Five-yearly review of employment conditions   Article S V 1.02 of our Staff Rules states that the CERN “Council shall periodically review and determine the financial and social conditions of the members of the personnel. These periodic reviews shall consist of a five-yearly general review of financial and social conditions;” […] “following methods […] specified in § I of Annex A 1”. Then, turning to the relevant part in Annex A 1, we read that “The purpose of the five-yearly review is to ensure that the financial and social conditions offered by the Organization allow it to recruit and retain the staff members required for the execution of its mission from all its Member States. […] these staff members must be of the highest competence and integrity.” And for the menu of such a review we have: “The five-yearly review must include basic salaries and may include any other financial or soc...

  17. Complementarity of Traffic Flow Intersecting Method with Intersection Capacity Analysis

    OpenAIRE

    Lanović, Zdenko

    2009-01-01

    The paper studies the complementarity of two methods from the field of traffic flow theory: the method of traffic flow intersecting intensity and the method for at-grade intersection capacity analysis. Apart from checking the mutual implications of these methods, the proportionality of their mutual influences is assessed. Harmonized application of these methods acts efficiently on the entire traffic network, and not only on the intersections that are usually incorrectly represented as the only network...

  18. Method for detecting software anomalies based on recurrence plot analysis

    Directory of Open Access Journals (Sweden)

    Michał Mosdorf

    2012-03-01

    The presented paper evaluates a method for detecting software anomalies based on recurrence plot analysis of the trace log generated by software execution. The described method is based on windowed recurrence quantification analysis for selected measures (e.g. recurrence rate - RR, or determinism - DET). Initial results show that the proposed method is useful in detecting silent software anomalies that do not result in typical crashes (e.g. exceptions).
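    The two measures named above (RR and DET) can be computed from a scalar series with a minimal recurrence-quantification sketch; the recurrence threshold and minimum diagonal-line length are illustrative choices, and no embedding is applied.

```python
# Build a recurrence matrix from a scalar series and compute recurrence
# rate (RR) and determinism (DET, fraction of off-diagonal recurrent
# points lying on diagonal lines of length >= l_min).
def recurrence_matrix(x, eps):
    n = len(x)
    return [[1 if abs(x[i] - x[j]) < eps else 0 for j in range(n)]
            for i in range(n)]

def rqa(x, eps, l_min=2):
    R = recurrence_matrix(x, eps)
    n = len(x)
    total = sum(sum(row) for row in R)
    rr = total / (n * n)
    det_points = 0
    for k in range(1, n):               # diagonals above the main one
        run = 0
        for i in range(n - k):
            if R[i][i + k]:
                run += 1
            else:
                if run >= l_min:
                    det_points += run
                run = 0
        if run >= l_min:
            det_points += run
    off_diag = total - n                # exclude the trivial main diagonal
    det = 2 * det_points / off_diag if off_diag else 0.0
    return rr, det

periodic = [i % 5 for i in range(60)]   # perfectly periodic test series
rr_p, det_p = rqa(periodic, eps=0.5)
```

    A perfectly periodic series produces full diagonal lines and hence DET = 1; a "silent anomaly" in a trace log would show up as a drop in DET inside the analysis window.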

  19. Comparison of factor analysis and phase analysis methods applied to cardiac scintigraphy

    International Nuclear Information System (INIS)

    Since 1982 factor analysis has been proposed as an alternative to phase analysis for the processing of heart scintigraphy at equilibrium. Three factor analysis algorithms have been described and the clinical evaluation of these methods has been carried out in 128 patients with coronary artery disease and compared with phase analysis. The study indicates that factor analysis methods are not more accurate than phase analysis for the detection of abnormalities of ventricular contraction

  20. Stability and Accuracy Analysis for Taylor Series Numerical Method

    Institute of Scientific and Technical Information of China (English)

    赵丽滨; 姚振汉; 王寿梅

    2004-01-01

    The Taylor series numerical method (TSNM) is a time integration method for solving problems in structural dynamics. In this paper, a detailed analysis of the stability behavior and accuracy characteristics of this method is given. It is proven by a spectral decomposition method that TSNM is conditionally stable and belongs to the category of explicit time integration methods. By a similar analysis, the characteristic indicators of time integration methods, the percentage period elongation and the amplitude decay of TSNM, are derived in a closed form. The analysis plays an important role in implementing a procedure for automatic searching and finding convergence radii of TSNM. Finally, a linear single degree of freedom undamped system is analyzed to test the properties of the method.
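    The single-degree-of-freedom undamped test problem mentioned above lends itself to a compact sketch of Taylor-series time integration: for x'' = -w²x all higher derivatives follow recursively, so each step sums a truncated series. The series order and step size below are illustrative choices, not the paper's.

```python
import math

# One explicit Taylor-series step for x'' = -w^2 x: derivatives satisfy
# d[n+2] = -w^2 * d[n], so the truncated series is built recursively.
def taylor_step(x, v, w, h, order=6):
    d = [x, v]
    for n in range(order - 1):
        d.append(-w * w * d[n])
    new_x = sum(d[n] * h**n / math.factorial(n) for n in range(order + 1))
    new_v = sum(d[n + 1] * h**n / math.factorial(n) for n in range(order))
    return new_x, new_v

def integrate(x0, v0, w, h, steps, order=6):
    x, v = x0, v0
    for _ in range(steps):
        x, v = taylor_step(x, v, w, h, order)
    return x, v

x1, v1 = integrate(1.0, 0.0, w=2.0, h=0.01, steps=100)   # to t = 1
```

    The exact solution is x(t) = cos(wt), v(t) = -w sin(wt); conditional stability shows up here as a step-size limit proportional to 1/w, consistent with the explicit-method classification in the abstract.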

  1. A review of analysis methods about thermal buckling

    International Nuclear Information System (INIS)

    This paper highlights the main items emerging from a large bibliographical survey carried out on strain-induced buckling analysis methods applicable in the building of fast neutron reactor structures. The work is centred on the practical analysis methods used in construction codes to account for the strain-buckling of thin and slender structures. Methods proposed in the literature concerning past and present studies are rapidly described. Experimental, theoretical and numerical methods are considered. Methods applicable to design and their degree of validation are indicated

  2. Probabilistic structural analysis methods for space propulsion system components

    Science.gov (United States)

    Chamis, Christos C.

    1987-01-01

    The development of a three-dimensional inelastic analysis methodology for the Space Shuttle main engine (SSME) structural components is described. The methodology is composed of: (1) composite load spectra, (2) probabilistic structural analysis methods, (3) the probabilistic finite element theory, and (4) probabilistic structural analysis. The methodology has led to significant technical progress in several important aspects of probabilistic structural analysis. The program and accomplishments to date are summarized.

  3. NONLINEAR DATA RECONCILIATION METHOD BASED ON KERNEL PRINCIPAL COMPONENT ANALYSIS

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    In industrial process settings, principal component analysis (PCA) is a common method for data reconciliation. However, PCA is sometimes unsuited to nonlinear feature analysis and is of limited use for nonlinear industrial processes. Kernel PCA (KPCA) is an extension of PCA that can be used for nonlinear feature analysis. A nonlinear data reconciliation method based on KPCA is proposed. The basic idea of this method is that the original data are first mapped to a high-dimensional feature space by a nonlinear function, and PCA is implemented in that feature space. Nonlinear feature analysis is then carried out and the data are reconstructed using the kernel. The data reconciliation method based on KPCA is applied to a ternary distillation column. Simulation results show that this method can filter the noise in measurements of a nonlinear process and that the reconciled data represent the true information of the nonlinear process.
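    The KPCA machinery described above (kernel evaluation, centering in feature space, eigendecomposition) can be sketched in a few lines; the distillation-column data are replaced by a small synthetic set, and the RBF kernel width and sizes are illustrative assumptions.

```python
import numpy as np

# Kernel PCA with an RBF kernel: double-centre the kernel matrix, take its
# eigendecomposition, and form nonlinear principal-component scores.
def kernel_pca(X, gamma, k):
    sq = np.sum(X**2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
    n = len(X)
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J                          # centre in feature space
    lam, V = np.linalg.eigh(Kc)             # ascending eigenvalues
    lam, V = lam[::-1][:k], V[:, ::-1][:, :k]
    scores = V * np.sqrt(np.clip(lam, 0.0, None))
    return scores, lam

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 2))
scores, lam = kernel_pca(X, gamma=0.5, k=30)
```

    Denoising for reconciliation then keeps only the leading components (large eigenvalues) and discards the rest; reconstructing reconciled measurements from feature space additionally requires a pre-image step, which is beyond this sketch.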

  4. Some selected quantitative methods of thermal image analysis in Matlab.

    Science.gov (United States)

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images. It shows the practical implementation of these image analysis methods in Matlab and enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods for the area of the skin of a human foot and face. The full source code of the developed application is also provided as an attachment. (Figure caption: The main window of the program during dynamic analysis of the foot thermal image.) PMID:26556680
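    The kind of fully automated, reproducible ROI measurement described above can be illustrated outside Matlab; this Python sketch segments a warm region from a synthetic "thermal image" by temperature threshold and reports simple parameters. The image, threshold, and parameter names are assumptions, not the paper's code.

```python
import numpy as np

# Threshold-based segmentation of a warm region and basic ROI statistics.
def analyze_thermal(image, t_threshold):
    mask = image >= t_threshold
    region = image[mask]
    return {
        "area_px": int(mask.sum()),
        "mean_temp": float(region.mean()) if region.size else float("nan"),
        "max_temp": float(image.max()),
    }

img = np.full((64, 64), 28.0)        # background skin temperature, deg C
img[20:30, 20:40] = 34.5             # synthetic warm patch
stats = analyze_thermal(img, 32.0)
```

    Because every step is deterministic, repeated runs on the same image give identical measurements, which is the reproducibility property the abstract emphasises.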

  5. Advanced symbolic analysis for VLSI systems methods and applications

    CERN Document Server

    Shi, Guoyong; Tlelo Cuautle, Esteban

    2014-01-01

This book provides comprehensive coverage of the recent advances in symbolic analysis techniques for design automation of nanometer VLSI systems. The presentation is organized in parts of fundamentals, basic implementation methods and applications for VLSI design. Topics emphasized include statistical timing and crosstalk analysis, statistical and parallel analysis, performance bound analysis and behavioral modeling for analog integrated circuits. Among the recent advances, the Binary Decision Diagram (BDD) based approaches are studied in depth. The BDD-based hierarchical symbolic analysis approaches have essentially broken the analog circuit size barrier. In particular, this book • Provides an overview of classical symbolic analysis methods and a comprehensive presentation on the modern BDD-based symbolic analysis techniques; • Describes detailed implementation strategies for BDD-based algorithms, including the principles of zero-suppression, variable ordering and canonical reduction; • Int...

  6. Digital spectral analysis parametric, non-parametric and advanced methods

    CERN Document Server

    Castanié, Francis

    2013-01-01

Digital Spectral Analysis provides a single source that offers complete coverage of the spectral analysis domain. This self-contained work includes details on advanced topics that are usually presented in scattered sources throughout the literature. The theoretical principles necessary for the understanding of spectral analysis are discussed in the first four chapters: fundamentals, digital signal processing, estimation in spectral analysis, and time-series models. An entire chapter is devoted to the non-parametric methods most widely used in industry. High resolution methods a

  7. Life cycle analysis of electricity systems: Methods and results

    International Nuclear Information System (INIS)

The two methods for full energy chain analysis, process analysis and input/output analysis, are discussed. A combination of these two methods provides the most accurate results. Such a hybrid analysis of the full energy chains of six different power plants is presented and discussed. The results of such analyses depend on the time, site and technique of each process step, and therefore have no general validity. For renewable energy systems, the emissions from the generation of a back-up system should be added. (author). 7 figs, 1 fig
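The input/output half of such a hybrid analysis rests on the Leontief model: total sector outputs x solve (I - A)x = y for a final demand y, and chain emissions then follow from per-unit sector emission intensities. A minimal numpy sketch with hypothetical numbers, not data from the six plants studied:

```python
import numpy as np

# Hypothetical 3-sector technology matrix A (inter-sector requirements per
# unit output) and direct CO2 emission intensities b (kg CO2 per unit output)
A = np.array([[0.1, 0.2, 0.0],
              [0.0, 0.1, 0.3],
              [0.2, 0.0, 0.1]])
b = np.array([0.5, 1.2, 0.8])

# Final demand: one unit of output from sector 2 (index 1)
y = np.array([0.0, 1.0, 0.0])

# Leontief inverse gives total (direct + indirect) sector outputs
x = np.linalg.solve(np.eye(3) - A, y)

# Full-chain emissions attributable to that final demand
total_emissions = b @ x
print(round(total_emissions, 3))  # → 1.526
```

The hybrid approach would replace the rows of A describing the plant itself with detailed process data, keeping input/output coefficients for upstream sectors.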

  8. Comparison of Integrated Analysis Methods for Two Model Scenarios

    Science.gov (United States)

    Amundsen, Ruth M.

    1999-01-01

    Integrated analysis methods have the potential to substantially decrease the time required for analysis modeling. Integration with computer aided design (CAD) software can also allow a model to be more accurate by facilitating import of exact design geometry. However, the integrated method utilized must sometimes be tailored to the specific modeling situation, in order to make the process most efficient. Two cases are presented here that illustrate different processes used for thermal analysis on two different models. These examples are used to illustrate how the requirements, available input, expected output, and tools available all affect the process selected by the analyst for the most efficient and effective analysis.

  9. Chemical bioimaging for the subcellular localization of trace elements by high contrast TEM, TEM/X-EDS, and NanoSIMS.

    Science.gov (United States)

    Penen, Florent; Malherbe, Julien; Isaure, Marie-Pierre; Dobritzsch, Dirk; Bertalan, Ivo; Gontier, Etienne; Le Coustumer, Philippe; Schaumlöffel, Dirk

    2016-09-01

Chemical bioimaging offers an important contribution to the investigation of biochemical functions, biosorption and bioaccumulation processes of trace elements via their localization at the cellular and even the subcellular level. This paper describes the combined use of high contrast transmission electron microscopy (HC-TEM), energy dispersive X-ray spectroscopy (X-EDS), and nano secondary ion mass spectrometry (NanoSIMS) applied to a model organism, the unicellular green alga Chlamydomonas reinhardtii. HC-TEM, providing a lateral resolution of 1 nm, was used for imaging the ultrastructure of algal cells, which have diameters of 5-10 μm. TEM coupled to X-EDS (TEM/X-EDS) combined textural (morphology and size) analysis with detection of Ca, P, K, Mg, Fe, and Zn in selected subcellular granules using an X-EDS probe size of approx. 1 μm. However, instrumental sensitivity was at the limit for trace element detection. NanoSIMS allowed chemical imaging of macro and trace elements with subcellular resolution (element mapping). Ca, Mg, and P as well as the trace elements Fe, Cu, and Zn present at basal levels were detected in pyrenoids, contractile vacuoles, and granules. Some metals were even localized in small vesicles of about 200 nm size. Sensitive subcellular localization of trace metals was made possible by the application of a recently developed RF plasma oxygen primary ion source on NanoSIMS, which has shown good improvements in terms of lateral resolution (below 50 nm), sensitivity, and stability. Furthermore, correlative single cell imaging was developed, combining the advantages of TEM and NanoSIMS. An advanced sample preparation protocol provided adjacent ultramicrotome sections for parallel TEM and NanoSIMS analyses of the same cell. Thus, the C. reinhardtii cellular ultrastructure could be directly related to the spatial distribution of metals in different cell organelles such as vacuoles and the chloroplast. PMID:27288221

  10. Comparison of extraction methods for analysis of flavonoids in onions

    OpenAIRE

    Soeltoft, Malene; Knuthsen, Pia; Nielsen, John

    2008-01-01

    Onions are known to contain high levels of flavonoids and a comparison of the efficiency, reproducibility and detection limits of various extraction methods has been made in order to develop fast and reliable analytical methods for analysis of flavonoids in onions. Conventional and classical methods are time- and solvent-consuming and the presence of light and oxygen during sample preparation facilitate degradation reactions. Thus, classical methods were compared with microwave (irradiatio...

  11. Simultaneous realization of Hg2+ sensing, magnetic resonance imaging and upconversion luminescence in vitro and in vivo bioimaging based on hollow mesoporous silica coated UCNPs and ruthenium complex

    Science.gov (United States)

    Ge, Xiaoqian; Sun, Lining; Ma, Binbin; Jin, Di; Dong, Liang; Shi, Liyi; Li, Nan; Chen, Haige; Huang, Wei

    2015-08-01

We have constructed a multifunctional nanoprobe with sensing and imaging properties by using hollow mesoporous silica coated upconversion nanoparticles (UCNPs) and a Hg2+-responsive ruthenium (Ru) complex. The Ru complex was loaded into the hollow mesoporous silica, and the UCNPs acted as an energy donor, transferring luminescence energy to the Ru complex. Furthermore, polyethylenimine (PEI) was assembled on the surface of the mesoporous silica to achieve better hydrophilicity and biocompatibility. Upon addition of Hg2+, a blue shift of the absorption peak of the Ru complex is observed and the energy transfer process between the UCNPs and the Ru complex is blocked, resulting in an increase of the green emission intensity of the UCNPs. The unchanged 801 nm emission of the nanoprobe was used as an internal standard reference, and the detection limit of Hg2+ was determined to be 0.16 μM for this nanoprobe in aqueous solution. In addition, based on the low cytotoxicity as studied by CCK-8 assay, the nanoprobe was successfully applied for cell imaging and small animal imaging. Furthermore, when doped with Gd3+ ions, the nanoprobe was successfully applied to in vivo magnetic resonance imaging (MRI) of Kunming mice, which demonstrates its potential as an MRI positive-contrast agent. Therefore, the method and results may provide exciting opportunities for nanoprobes with multimodal bioimaging and multifunctional applications.

  12. A METHOD FOR EXERGY ANALYSIS OF SUGARCANE BAGASSE BOILERS

    Directory of Open Access Journals (Sweden)

    CORTEZ L.A.B.

    1998-01-01

Full Text Available This work presents a method for conducting a thermodynamic analysis of sugarcane bagasse boilers. The method is based on the standard and actual reactions, which allow the calculation of the enthalpies of each process subequation and the exergies of each of the main flowrates participating in the combustion. The method is illustrated with real data from a sugarcane bagasse boiler. A summary of the results is also presented, based on the 1st Law of Thermodynamics analysis, the exergetic efficiencies, and the irreversibility rates. The method presented is very rigorous with respect to data consistency, particularly for the flue gas composition.
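The exergies of the main flowrates mentioned above are typically physical flow exergies, ex = (h - h0) - T0(s - s0) relative to a dead state. A minimal sketch with illustrative steam-table values, not the boiler data from the paper:

```python
# Physical flow exergy: ex = (h - h0) - T0 * (s - s0)
# All property values below are illustrative, not boiler measurements.
T0 = 298.15  # dead-state temperature, K

def flow_exergy(h, s, h0=104.9, s0=0.3674):
    # h in kJ/kg, s in kJ/(kg K); dead state ~ liquid water at 25 C, 1 atm
    return (h - h0) - T0 * (s - s0)

# Hypothetical boiler steam at ~2 MPa / 300 C and feedwater at ~105 C
ex_steam = flow_exergy(h=3023.5, s=6.7664)
ex_feed = flow_exergy(h=440.2, s=1.3630)

# Exergy gained by the water side per kg of steam raised, kJ/kg
exergy_gain_per_kg = ex_steam - ex_feed
print(round(exergy_gain_per_kg, 1))  # → 972.3
```

Dividing this gain by the exergy of the bagasse burned per kg of steam would give the boiler's exergetic efficiency, the quantity the paper's analysis targets.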

  13. Reliability analysis of reactor systems by applying probability method

    International Nuclear Information System (INIS)

The probability method chosen for analysing reactor system reliability is considered realistic since it is based on verified experimental data; in effect it is a statistical method. The probability method developed takes into account the probability distribution of permitted levels of relevant parameters and their particular influence on the reliability of the system as a whole. The proposed method is rather general, and was applied to the problem of thermal safety analysis of a reactor system. This analysis makes it possible to study the basic properties of the system under different operating conditions; expressed in the form of probabilities, the results show the reliability of the system as a whole as well as the reliability of each component.

  14. Drop impact analysis method of radioactive material container

    International Nuclear Information System (INIS)

Background: The safety of radioactive material containers during transportation is important. Purpose: In the procedure of reviewing radioactive material containers for transportation, a drop impact analysis of the container is a very important factor. Methods: This paper presents a drop impact analysis method for radioactive material containers. First, several drop cases of the container, such as horizontal drop, vertical drop and inclined drop, are calculated with the well-known transient dynamic analysis program LS-DYNA. Second, the stresses are evaluated according to the rules on fatigue analysis in the ASME Section III Division I Appendices. Results: With this method, one can judge whether the container's strength is adequate. Conclusions: The analysis shows that the radioactive material container's strength is adequate. (authors)

  15. Task analysis methods applicable to control room design review (CDR)

    International Nuclear Information System (INIS)

This report presents the results of a research study conducted in support of the human factors engineering program of the Atomic Energy Control Board in Canada. It contains five products which may be used by the Atomic Energy Control Board in relation to task analysis of jobs in CANDU nuclear power plants: 1. a detailed method for preparing for a task analysis; 2. a Task Data Form for recording task analysis data; 3. a detailed method for carrying out task analyses; 4. a guide to assessing alternative methods for performing task analyses, if such are proposed by utilities or consultants; and 5. an annotated bibliography on task analysis. In addition, a short explanation of the origins, nature and uses of task analysis is provided, with some examples of its cost effectiveness. 35 refs

  16. Comparative analysis of myocardial revascularization methods for ischemic heart disease

    Directory of Open Access Journals (Sweden)

    Sinkeev M.S.

    2012-09-01

Full Text Available This literature review is devoted to a comparative analysis of clinical studies of the efficiency and frequency of complications after application of surgical and medicamentous methods of treating coronary heart disease.

  17. Meshless methods in biomechanics bone tissue remodelling analysis

    CERN Document Server

    Belinha, Jorge

    2014-01-01

This book presents the complete formulation of a new advanced discretization meshless technique: the Natural Neighbour Radial Point Interpolation Method (NNRPIM). In addition, two of the most popular meshless methods, the EFGM and the RPIM, are fully presented. Being a truly meshless method, the major advantages of the NNRPIM over the FEM and other meshless methods are its remeshing flexibility and the higher accuracy of the obtained variable field. Using the natural neighbour concept, the NNRPIM determines the influence-domain organically, resembling the natural behaviour of cells. This innovation permits the analysis of convex boundaries and extremely irregular meshes, which is an advantage in biomechanical analysis, with no extra computational effort associated. This volume shows how to extend the NNRPIM to bone tissue remodelling analysis, aiming to contribute new numerical tools and strategies that permit a more efficient numerical biomechanical analysis.

  18. A Simple Buckling Analysis Method for Airframe Composite Stiffened Panel by Finite Strip Method

    Science.gov (United States)

    Tanoue, Yoshitsugu

Carbon fiber reinforced plastics (CFRP) have been used in structural components for newly developed aircraft and spacecraft. The main structures of an airframe, such as the fuselage and wings, are essentially composed of stiffened panels. Therefore, in the structural design of airframes, it is important to evaluate the buckling strength of composite stiffened panels. The widely used finite element method (FEM) can analyze any stiffened panel shape with various boundary conditions. However, in the early phase of airframe development, many studies are required in structural design prior to detail drawing, and performing structural analysis using only FEM may not be very efficient in this phase. This paper describes a simple buckling analysis method for composite stiffened panels based on the finite strip method. The method can deal with isotropic and anisotropic laminated plates and shells with several boundary conditions. Its accuracy was verified by comparison with theoretical analysis and FEM analysis (NASTRAN). The buckling coefficients calculated by the present method agree with results found by detailed analysis methods. Consequently, the method is an effective calculation tool for buckling analysis in the early phases of airframe design.
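As a point of reference for such buckling tools, the classical closed-form result for a simply supported plate under uniaxial compression, sigma_cr = k π² E (t/b)² / (12(1 - ν²)), is easy to script. This is the textbook isotropic check, not the finite strip method of the paper; the panel dimensions and material are hypothetical:

```python
import math

def plate_buckling_stress(E, nu, t, b, k=4.0):
    """Classical elastic buckling stress of a simply supported plate under
    uniaxial compression. k = 4.0 is the coefficient for a long simply
    supported plate; E in Pa, thickness t and loaded width b in metres."""
    return k * math.pi ** 2 * E / (12.0 * (1.0 - nu ** 2)) * (t / b) ** 2

# Hypothetical aluminium skin bay between stiffeners: 2 mm thick, 150 mm wide
sigma_cr = plate_buckling_stress(E=70e9, nu=0.33, t=2e-3, b=150e-3)
print(f"{sigma_cr / 1e6:.1f} MPa")  # → 45.9 MPa
```

A finite strip or FEM analysis of the full stiffened panel should reduce to values of this order for an isotropic skin bay, which makes the formula a useful early-phase sanity check.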

  19. Development of root observation method by image analysis system

    OpenAIRE

    Kim, Giyoung

    1995-01-01

    Knowledge of plant roots is important for determining plant-soil relationships, managing soil effectively, studying nutrient and water extraction, and creating a soil quality index. Plant root research is limited by the large amount of time and labor required to wash the roots from the soil and measure the viable roots. A root measurement method based on image analysis was proposed to reduce the time and labor requirement. A thinning algorithm-based image analysis method was us...

  20. Probabilistic structural analysis methods of hot engine structures

    Science.gov (United States)

    Chamis, C. C.; Hopkins, D. A.

    1989-01-01

    Development of probabilistic structural analysis methods for hot engine structures is a major activity at Lewis Research Center. Recent activities have focused on extending the methods to include the combined uncertainties in several factors on structural response. This paper briefly describes recent progress on composite load spectra models, probabilistic finite element structural analysis, and probabilistic strength degradation modeling. Progress is described in terms of fundamental concepts, computer code development, and representative numerical results.

  1. Linear steady heat transfer analysis by boundary element method

    International Nuclear Information System (INIS)

The boundary element method for linear steady heat transfer analysis has been developed. Two types of elements, namely constant elements and linear elements, are described. The method is illustrated by the analysis of a square plate with two constant-temperature boundaries and the other two insulated, a blunt fin with a convection boundary condition, and the steady state temperature distribution in a circular segment. (M.G.B.)

  2. Numerical methods design, analysis, and computer implementation of algorithms

    CERN Document Server

    Greenbaum, Anne

    2012-01-01

    Numerical Methods provides a clear and concise exploration of standard numerical analysis topics, as well as nontraditional ones, including mathematical modeling, Monte Carlo methods, Markov chains, and fractals. Filled with appealing examples that will motivate students, the textbook considers modern application areas, such as information retrieval and animation, and classical topics from physics and engineering. Exercises use MATLAB and promote understanding of computational results. The book gives instructors the flexibility to emphasize different aspects--design, analysis, or c

  3. Nodal Discontinuous Element Methods: Formulations, Analysis, and Applications

    DEFF Research Database (Denmark)

    Hesthaven, Jan

Part of concluding summary and outlook: "The focus of this thesis has been on the formulation, analysis, and application of high-order accurate computational techniques for solving rather general initial boundary value problems, emphasizing an analysis driven theoretical foundation. As such, the methods have applications throughout science and engineering. One can and should expect such high-order accurate and robust methods to play an increasingly important role in modeling in the applied sciences."

  4. Method development of gas analysis with mass spectrometer

    International Nuclear Information System (INIS)

Dissolved gas content in deep saline groundwater is an important factor which has to be known and taken into account when planning the deep repository for spent nuclear fuel. Posiva has investigated dissolved gases in deep groundwaters since the 1990s. In 2002 Posiva started a project that focused on developing a mass spectrometric method for measuring the dissolved gas content of deep saline groundwater. The main idea of the project was to analyse the dissolved gas content of both the gas phase and the water phase by mass spectrometer. The development of the method started in the autumn of 2003. One of the aims was to create a method for gas analysis parallel to the gas chromatographic method. The starting point of the project was to test whether gases could be analysed directly from water using a membrane inlet in the mass spectrometer. The main objective was to develop mass spectrometric methods for gas analysis with direct and membrane inlets. An analysis method for dissolved gases was developed for direct gas inlet mass spectrometry. The accuracy of the method was tested with parallel real PAVE samples analysed in the laboratory of Insinööritoimisto Paavo Ristola Oy, with good results. The development of the membrane inlet mass spectrometric method still continues. Two different membrane materials (silicone and teflon) were tested. Some basic tests (linearity, repeatability and detection limits for different gases) will be done with this method. (orig.)

  5. Analysis of spectral methods for the homogeneous Boltzmann equation

    KAUST Repository

    Filbet, Francis

    2011-04-01

    The development of accurate and fast algorithms for the Boltzmann collision integral and their analysis represent a challenging problem in scientific computing and numerical analysis. Recently, several works were devoted to the derivation of spectrally accurate schemes for the Boltzmann equation, but very few of them were concerned with the stability analysis of the method. In particular there was no result of stability except when the method was modified in order to enforce the positivity preservation, which destroys the spectral accuracy. In this paper we propose a new method to study the stability of homogeneous Boltzmann equations perturbed by smoothed balanced operators which do not preserve positivity of the distribution. This method takes advantage of the "spreading" property of the collision, together with estimates on regularity and entropy production. As an application we prove stability and convergence of spectral methods for the Boltzmann equation, when the discretization parameter is large enough (with explicit bound). © 2010 American Mathematical Society.

  6. Latent Class Analysis: A Method for Capturing Heterogeneity

    Science.gov (United States)

    Scotto Rosato, Nancy; Baer, Judith C.

    2012-01-01

    Social work researchers often use variable-centered approaches such as regression and factor analysis. However, these methods do not capture important aspects of relationships that are often imbedded in the heterogeneity of samples. Latent class analysis (LCA) is one of several person-centered approaches that can capture heterogeneity within and…
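A latent class model for binary items can be fitted with a short EM loop; the sketch below simulates two classes and recovers their item-endorsement probabilities. This is illustrative numpy code under simple assumptions (two classes, conditionally independent items), not a substitute for dedicated LCA software:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate two latent classes with different item-endorsement probabilities
n, items = 600, 4
true_p = np.array([[0.9, 0.8, 0.9, 0.8],   # class 0
                   [0.2, 0.1, 0.2, 0.3]])  # class 1
z = rng.integers(0, 2, n)
X = (rng.random((n, items)) < true_p[z]).astype(float)

# EM for a 2-class latent class model with conditionally independent items
K = 2
pi = np.full(K, 1.0 / K)                 # class weights
p = rng.uniform(0.25, 0.75, (K, items))  # item probabilities per class

for _ in range(200):
    # E-step: posterior class responsibilities for each respondent
    log_lik = X @ np.log(p).T + (1 - X) @ np.log(1 - p).T + np.log(pi)
    log_lik -= log_lik.max(axis=1, keepdims=True)
    resp = np.exp(log_lik)
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: update class weights and item probabilities
    pi = resp.mean(axis=0)
    p = (resp.T @ X) / resp.sum(axis=0)[:, None]
    p = p.clip(1e-6, 1 - 1e-6)

print(np.round(pi, 2), np.round(p, 2))
```

The fitted `p` rows approximate the two simulated profiles (up to label switching), which is exactly the heterogeneity a variable-centered regression would average away.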

  7. Method of morphological analysis of enterprise management organizational structure

    OpenAIRE

    Heorhiadi, N.; Iwaszczuk, N.; Vilhutska, R.

    2013-01-01

The essence of the method of morphological analysis of enterprise management organizational structure is described in the article. Setting the levels of morphological decomposition and specifying the sets of elements are necessary for morphological analysis. Based on empirical research, factors were identified that influence the formation and use of enterprise management organizational structures.

  8. Scalable Kernel Methods and Algorithms for General Sequence Analysis

    Science.gov (United States)

    Kuksa, Pavel

    2011-01-01

    Analysis of large-scale sequential data has become an important task in machine learning and pattern recognition, inspired in part by numerous scientific and technological applications such as the document and text classification or the analysis of biological sequences. However, current computational methods for sequence comparison still lack…
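A classic example of such sequence-comparison kernels is the k-mer spectrum kernel, which compares two sequences through their k-mer count vectors. A minimal sketch of the naive counting version (not the scalable algorithms such a thesis would develop):

```python
from collections import Counter

def kmer_counts(seq, k=3):
    # Sliding-window k-mer counts of a string sequence
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def spectrum_kernel(s1, s2, k=3):
    # K(x, y) = sum over shared k-mers of count_x(kmer) * count_y(kmer),
    # i.e. the dot product of the two k-mer count vectors
    c1, c2 = kmer_counts(s1, k), kmer_counts(s2, k)
    return sum(c1[m] * c2[m] for m in c1 if m in c2)

a = "ACGTACGTGA"
b = "TACGTTACGA"
print(spectrum_kernel(a, a), spectrum_kernel(a, b))  # → 12 8
```

The resulting kernel matrix can be fed directly to any kernel classifier (e.g. an SVM) for biological sequence or text classification.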

  9. Methods for Comprehensive Analysis of Heat Supply Reliability

    OpenAIRE

    V. A. Stennikov; I. V. Postnikov

    2013-01-01

    The paper deals with the problem of comprehensive analysis of heat supply reliability for consumers. It implies a quantitative assessment of the impact of all stages of heat energy production and distribution on heat supply reliability for each consumer of the heat supply system. A short review of existing methods for the analysis of fuel and heat supply reliability is presented that substantiates the key approaches to solving the problem of comprehensive analysis of heat supply reliability. ...

  10. Literature Fingerprinting : A New Method for Visual Literary Analysis

    OpenAIRE

    Keim, Daniel A.; Oelke, Daniela

    2007-01-01

    In computer-based literary analysis different types of features are used to characterize a text. Usually, only a single feature value or vector is calculated for the whole text. In this paper, we combine automatic literature analysis methods with an effective visualization technique to analyze the behavior of the feature values across the text. For an interactive visual analysis, we calculate a sequence of feature values per text and present them to the user as a characteristic fingerprint. T...
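The idea of computing a sequence of feature values across a text, rather than one value for the whole text, can be sketched in a few lines. Here the feature is average sentence length per block of sentences; the feature choice and the toy text are illustrative, not the authors' feature set:

```python
import re

def fingerprint(text, window=5):
    """Average sentence length (in words) per block of `window` sentences:
    one simple feature sequence of the kind a literary fingerprint uses."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    lengths = [len(s.split()) for s in sentences]
    return [round(sum(lengths[i:i + window]) / len(lengths[i:i + window]), 1)
            for i in range(0, len(lengths), window)]

text = ("Call me Ishmael. Some years ago I went to sea. It was cold. "
        "Very cold. The ship was old. We sailed anyway.")
print(fingerprint(text, window=3))  # → [4.3, 3.0]
```

Plotting such per-block values as colored pixels, one row per text, yields the visual fingerprints the paper describes.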

  11. USING GROUNDED THEORY AS A METHOD FOR SYSTEM REQUIREMENTS ANALYSIS

    OpenAIRE

    Mohanad Halaweh

    2012-01-01

    Requirements analysis (RA) is a key phase in information systems (IS) development. During this phase, system analysts use different techniques and methods to elicit and structure the system’s requirements. The current paper rationalises the use of grounded theory (GT) as an alternative socio-technical approach to requirement analysis. It will establish theoretically that applying grounded theory procedures and techniques will support and add value to the analysis phase as it solves some probl...

  12. Fluorescent MoS2 Quantum Dots: Ultrasonic Preparation, Up-Conversion and Down-Conversion Bioimaging, and Photodynamic Therapy.

    Science.gov (United States)

    Dong, Haifeng; Tang, Songsong; Hao, Yansong; Yu, Haizhu; Dai, Wenhao; Zhao, Guifeng; Cao, Yu; Lu, Huiting; Zhang, Xueji; Ju, Huangxian

    2016-02-10

Small size molybdenum disulfide (MoS2) quantum dots (QDs) with desired optical properties were controllably synthesized by tetrabutylammonium-assisted ultrasonication of multilayered MoS2 powder via an OH-mediated chain-like Mo-S bond cleavage mode. This tunable top-down approach to the precise fabrication of MoS2 QDs enables detailed experimental investigation of their optical properties. The synthesized MoS2 QDs present good down-conversion photoluminescence behaviors and exhibit remarkable up-conversion photoluminescence for bioimaging. The mechanism of the emerging photoluminescence was investigated. Furthermore, the (1)O2 production ability of the MoS2 QDs, superior to that of the commercial photosensitizer PpIX, was demonstrated, which has great potential application for photodynamic therapy. These early results on the tunable synthesis of MoS2 QDs with desired photo properties can lead to applications in the biomedical and optoelectronics fields. PMID:26761391

  13. The Methods of Sensitivity Analysis and Their Usage for Analysis of Multicriteria Decision

    Directory of Open Access Journals (Sweden)

    Rūta Simanavičienė

    2011-08-01

Full Text Available In this paper we describe the application fields of sensitivity analysis methods. We review the application of these methods in multiple criteria decision making, when the initial data are numbers. We formulate the problem of which of the sensitivity analysis methods is more effective for use in the decision making process. Article in Lithuanian.
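One common sensitivity-analysis scheme for multiple criteria decisions is one-at-a-time weight perturbation: perturb each criterion weight, renormalise, and check whether the top-ranked alternative changes. A sketch with an invented decision matrix and a simple weighted-sum model (assumptions, not the paper's case study):

```python
import numpy as np

# Decision matrix: 3 alternatives x 3 benefit criteria, already normalised
# to [0, 1]; weights sum to 1. All numbers are illustrative.
R = np.array([[0.8, 0.5, 0.6],
              [0.6, 0.9, 0.5],
              [0.7, 0.6, 0.9]])
w = np.array([0.5, 0.3, 0.2])

def best(weights):
    # Index of the top alternative under the weighted-sum model
    return int(np.argmax(R @ weights))

base = best(w)
# One-at-a-time sensitivity: perturb each weight by +/-10% (renormalised)
# and record any change in the top-ranked alternative
flips = []
for j in range(3):
    for d in (-0.1, 0.1):
        wp = w.copy()
        wp[j] *= 1 + d
        wp /= wp.sum()
        if best(wp) != base:
            flips.append((j, d))
print(base, flips)  # → 2 []
```

An empty `flips` list means the recommendation is robust to 10% weight changes; a non-empty list identifies which criterion weights the decision hinges on.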

  14. Robust methods for multivariate data analysis A1

    DEFF Research Database (Denmark)

    Frosch, Stina; Von Frese, J.; Bro, Rasmus

    2005-01-01

Outliers may hamper proper classical multivariate analysis and lead to incorrect conclusions. To remedy the problem of outliers, robust methods have been developed in statistics and chemometrics. Robust methods reduce or remove the effect of outlying data points and allow the "good" data to primarily...

  15. Investigating Convergence Patterns for Numerical Methods Using Data Analysis

    Science.gov (United States)

    Gordon, Sheldon P.

    2013-01-01

    The article investigates the patterns that arise in the convergence of numerical methods, particularly those in the errors involved in successive iterations, using data analysis and curve fitting methods. In particular, the results obtained are used to convey a deeper level of understanding of the concepts of linear, quadratic, and cubic…
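The approach can be illustrated by estimating a method's order of convergence from its own error sequence: if e_{n+1} ≈ C·e_n^p, the slope of a straight-line fit of log e_{n+1} against log e_n estimates the order p. A sketch for Newton's method on x² = 2, where the expected slope is about 2:

```python
import numpy as np

# Newton's method for f(x) = x^2 - 2; record the error at each iterate
root = np.sqrt(2.0)
x = 1.0
errors = []
for _ in range(6):
    x = x - (x * x - 2.0) / (2.0 * x)
    errors.append(abs(x - root))

# If e_{n+1} ~ C * e_n^p, then log e_{n+1} = p * log e_n + log C:
# the slope of a straight-line fit estimates the order p.
# Drop errors at machine precision before taking logs.
e = np.array([err for err in errors if err > 1e-14])
p = np.polyfit(np.log(e[:-1]), np.log(e[1:]), 1)[0]
print(round(p, 1))  # → 2.0
```

Repeating the experiment with bisection gives a slope near 1, making the linear/quadratic distinction concrete in exactly the data-analysis spirit of the article.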

  16. Analysis of the electrical conduction using an iterative method

    OpenAIRE

    Dziuba, Z.; Górska, M.

    1992-01-01

An iterative method for transforming the electrical conduction versus magnetic field $\hat{\sigma}\,(H)$ into the mobility spectrum of the electrical conduction density $s(\mu)$ is presented. The mobility spectrum is a new form of presentation of carrier parameters. The method is especially useful in the analysis of mixed conduction in semiconductors like HgCdTe or in quantum well systems.

  17. Application of numerical analysis methods to thermoluminescence dosimetry

    International Nuclear Information System (INIS)

    This report presents the application of numerical methods to thermoluminescence dosimetry (TLD), showing the advantages obtained over conventional evaluation systems. Different configurations of the analysis method are presented to operate in specific dosimetric applications of TLD, such as environmental monitoring and mailed dosimetry systems for quality assurance in radiotherapy facilities. (Author) 10 refs

  18. Practical Use of Computationally Frugal Model Analysis Methods.

    Science.gov (United States)

    Hill, Mary C; Kavetski, Dmitri; Clark, Martyn; Ye, Ming; Arabi, Mazdak; Lu, Dan; Foglia, Laura; Mehl, Steffen

    2016-03-01

Three challenges compromise the utility of mathematical models of groundwater and other environmental systems: (1) a dizzying array of model analysis methods and metrics make it difficult to compare evaluations of model adequacy, sensitivity, and uncertainty; (2) the high computational demands of many popular model analysis methods (requiring 1000s, 10,000s, or more model runs) make them difficult to apply to complex models; and (3) many models are plagued by unrealistic nonlinearities arising from the numerical model formulation and implementation. This study proposes a strategy to address these challenges through a careful combination of model analysis and implementation methods. In this strategy, computationally frugal model analysis methods (often requiring a few dozen parallelizable model runs) play a major role, and computationally demanding methods are used for problems where (relatively) inexpensive diagnostics suggest the frugal methods are unreliable. We also argue in favor of detecting and, where possible, eliminating unrealistic model nonlinearities; this increases the realism of the model itself and facilitates the application of frugal methods. Literature examples are used to demonstrate the use of frugal methods and associated diagnostics. We suggest that the strategy proposed in this paper would allow the environmental sciences community to achieve greater transparency and falsifiability of environmental models, and to obtain greater scientific insight from ongoing and future modeling efforts. PMID:25810333
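Frugal local methods of the kind advocated here include finite-difference scaled sensitivities, which need only one model run per parameter beyond the base run. A sketch with a stand-in model; the model, parameter values, and step size are illustrative assumptions, not the paper's examples:

```python
import numpy as np

def model(theta):
    # Stand-in for an expensive simulation: two parameters, five outputs
    k, s = theta
    t = np.linspace(1.0, 5.0, 5)
    return s * np.exp(-k * t)

def scaled_sensitivities(theta, rel_step=0.01):
    """Dimensionless scaled sensitivities (dy/dtheta_j) * theta_j estimated
    by forward differences: one base run plus one run per parameter."""
    base = model(theta)
    sens = []
    for j, tj in enumerate(theta):
        pert = theta.copy()
        pert[j] = tj * (1 + rel_step)
        sens.append((model(pert) - base) / (tj * rel_step) * tj)
    return np.array(sens)

theta = np.array([0.5, 10.0])
S = scaled_sensitivities(theta)
# Composite scaled sensitivity per parameter: RMS over the outputs
css = np.sqrt((S ** 2).mean(axis=1))
print(np.round(css, 2))
```

Comparing the composite scaled sensitivities across parameters shows, at the cost of three model runs here, which parameters the outputs can actually constrain.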

  19. Bayesian data analysis in population ecology: motivations, methods, and benefits

    Science.gov (United States)

    Dorazio, Robert

    2016-01-01

    During the 20th century ecologists largely relied on the frequentist system of inference for the analysis of their data. However, in the past few decades ecologists have become increasingly interested in the use of Bayesian methods of data analysis. In this article I provide guidance to ecologists who would like to decide whether Bayesian methods can be used to improve their conclusions and predictions. I begin by providing a concise summary of Bayesian methods of analysis, including a comparison of differences between Bayesian and frequentist approaches to inference when using hierarchical models. Next I provide a list of problems where Bayesian methods of analysis may arguably be preferred over frequentist methods. These problems are usually encountered in analyses based on hierarchical models of data. I describe the essentials required for applying modern methods of Bayesian computation, and I use real-world examples to illustrate these methods. I conclude by summarizing what I perceive to be the main strengths and weaknesses of using Bayesian methods to solve ecological inference problems.
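For a concrete flavour of the Bayesian workflow described, consider a conjugate beta-binomial analysis of a survival probability, a staple of population-ecology examples. The prior, counts, and numbers below are illustrative assumptions, not data from the article:

```python
import numpy as np

# Beta-binomial: conjugate Bayesian analysis of a survival probability.
# Prior Beta(a, b); data: y survivors out of n marked animals.
a, b = 1.0, 1.0  # uniform prior
n, y = 50, 32    # hypothetical mark-recapture outcome

# The posterior is Beta(a + y, b + n - y); summarise it directly
post_a, post_b = a + y, b + n - y
post_mean = post_a / (post_a + post_b)

# 95% credible interval via Monte Carlo draws from the posterior
rng = np.random.default_rng(42)
draws = rng.beta(post_a, post_b, 100_000)
lo, hi = np.quantile(draws, [0.025, 0.975])
print(round(post_mean, 3), round(lo, 2), round(hi, 2))
```

For hierarchical models without conjugate posteriors, the Monte Carlo step generalises to the MCMC computation the article describes, but the logic of summarising a posterior stays the same.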

  20. Reactor physics analysis method based on Monte Carlo homogenization

    International Nuclear Information System (INIS)

    Background: Many new concepts of nuclear energy systems with complicated geometric structures and diverse energy spectra have been put forward to meet the future demand of the nuclear energy market. The traditional deterministic neutronics analysis method has been challenged in two aspects: one is the ability to process generic geometry; the other is the multi-spectrum applicability of the multi-group cross section libraries. The Monte Carlo (MC) method excels in both geometric flexibility and spectral applicability, but faces the problems of long computation time and slow convergence. Purpose: This work aims to find a novel scheme that combines the advantages of the deterministic core analysis method and the MC method. Methods: A new two-step core analysis scheme is proposed to combine the geometry modeling capability and continuous energy cross section libraries of the MC method with the higher computational efficiency of the deterministic method. First, MC simulations are performed at the assembly level, and the assembly-homogenized multi-group cross sections are tallied at the same time. Then, the core diffusion calculations can be done with these multi-group cross sections. Results: The new scheme can achieve high efficiency while maintaining acceptable precision. Conclusion: The new scheme can be used as an effective tool for the design and analysis of innovative nuclear energy systems, which has been verified by numeric tests. (authors)
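    The homogenization step in the two-step scheme condenses fine-resolution cross sections into few-group constants by flux weighting, sigma_g = sum_i(sigma_i * phi_i) / sum_i(phi_i) over the fine bins in group g. The sketch below is a hedged toy version of that condensation; the bin values and group split are invented, and real MC tallies would supply them.

    ```python
    # Toy flux-weighted condensation of finely binned cross sections into
    # coarse groups. Fine-bin values are invented for illustration only.
    def condense(sigma_fine, flux_fine, group_edges):
        """Condense fine-bin cross sections into coarse-group constants.

        group_edges: fine-bin indices at which the coarse groups split.
        """
        groups = []
        start = 0
        for end in group_edges + [len(sigma_fine)]:
            num = sum(s * f for s, f in zip(sigma_fine[start:end],
                                            flux_fine[start:end]))
            den = sum(flux_fine[start:end])
            groups.append(num / den)  # flux-weighted average for this group
            start = end
        return groups

    sigma = [10.0, 8.0, 2.0, 1.0]   # fine-bin cross sections (invented)
    phi   = [1.0, 3.0, 2.0, 2.0]    # fine-bin scalar flux (invented)
    two_group = condense(sigma, phi, [2])   # split after bin 2 -> two groups
    ```

    The condensed constants preserve the group reaction rates by construction, which is why the subsequent core diffusion calculation can remain consistent with the assembly-level MC solution.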

  1. Method of electrodynamical analysis of complex antennas systems

    OpenAIRE

    Buzova, M. A.

    2011-01-01

    This article presents a method for the electrodynamic analysis of complex antenna systems containing different scatterers. The developed method is based on Fredholm integral equations of the first and second kind, 2-D integral equations of the first and second kind, and physical optics. The article lists the expressions used for calculation and some of the results.

  2. A Comparison of Card-sorting Analysis Methods

    DEFF Research Database (Denmark)

    Nawaz, Ather

    2012-01-01

    This study investigates how the choice of analysis method for card sorting studies affects the suggested information structure for websites. In the card sorting technique, a variety of methods are used to analyse the resulting data. The analysis of card sorting data helps user experience (UX...... the recurrent patterns found and thus has consequences for the resulting website design. This paper draws attention to the choice of card sorting analysis and techniques and shows how it impacts the results. The research focuses on how the same data for card sorting can lead to different website structures...

  3. Integrated numerical methods for hypersonic aircraft cooling systems analysis

    Science.gov (United States)

    Petley, Dennis H.; Jones, Stuart C.; Dziedzic, William M.

    1992-01-01

    Numerical methods have been developed for the analysis of hypersonic aircraft cooling systems. A general purpose finite difference thermal analysis code is used to determine areas which must be cooled. Complex cooling networks of series and parallel flow can be analyzed using a finite difference computer program. Both internal fluid flow and heat transfer are analyzed, because increased heat flow causes a decrease in the flow of the coolant. The steady state solution is a successive point iterative method. The transient analysis uses implicit forward-backward differencing. Several examples of the use of the program in studies of hypersonic aircraft and rockets are provided.
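    The transient analysis described above uses implicit differencing. As a hedged, self-contained illustration of one such step (not the NASA code itself), the sketch below advances a 1D conduction grid by one backward-Euler step, solving the resulting tridiagonal system with the Thomas algorithm; the grid values and the mesh ratio r = alpha*dt/dx^2 are invented.

    ```python
    # One implicit (backward-Euler) step of 1D transient heat conduction
    # with fixed-temperature ends. All numbers are invented for illustration.
    def thomas(a, b, c, d):
        """Solve a tridiagonal system (a: sub-, b: main, c: super-diagonal)."""
        n = len(d)
        cp, dp = [0.0] * n, [0.0] * n
        cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
        for i in range(1, n):
            m = b[i] - a[i] * cp[i - 1]
            cp[i] = c[i] / m
            dp[i] = (d[i] - a[i] * dp[i - 1]) / m
        x = [0.0] * n
        x[-1] = dp[-1]
        for i in range(n - 2, -1, -1):
            x[i] = dp[i] - cp[i] * x[i + 1]
        return x

    def implicit_step(T, r):
        """Backward-Euler update, r = alpha*dt/dx**2; end nodes held fixed."""
        n = len(T)
        a = [0.0] + [-r] * (n - 2) + [0.0]
        b = [1.0] + [1 + 2 * r] * (n - 2) + [1.0]
        c = [0.0] + [-r] * (n - 2) + [0.0]
        return thomas(a, b, c, list(T))

    T0 = [100.0, 0.0, 0.0, 0.0, 100.0]   # hot ends, cold interior (invented)
    T1 = implicit_step(T0, r=1.0)
    ```

    The implicit form stays stable even for large r, which is the usual reason transient thermal codes prefer it over explicit differencing.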

  4. Fault Diagnosis and Reliability Analysis Using Fuzzy Logic Method

    Institute of Scientific and Technical Information of China (English)

    Miao Zhinong; Xu Yang; Zhao Xiangyu

    2006-01-01

    A new fuzzy logic fault diagnosis method is proposed. In this method, fuzzy equations are employed to estimate the component state of a system based on the measured system performance and the relationship between component state and system performance, which is called the "performance-parameter" knowledge base and is constructed by experts. Compared with the traditional fault diagnosis method, this fuzzy logic method can use human intuitive knowledge and does not need a precise mapping between system performance and component state. Simulation proves its effectiveness in fault diagnosis. Then, the reliability analysis is performed based on the fuzzy logic method.
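    The core idea (graded component-state estimates instead of a crisp threshold) can be sketched with simple membership functions. This is a hedged toy illustration, not the paper's fuzzy equations: the membership shapes, state labels, and rule mapping below are invented.

    ```python
    # Toy fuzzy diagnosis: a measured performance value (0..1) maps to graded
    # degrees of component state via expert-style membership functions.
    # Membership parameters are invented for this sketch.
    def tri(x, a, b, c):
        """Triangular membership function rising on [a, b], falling on [b, c]."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def diagnose(performance):
        """Expert-style rules: low performance -> faulty, high -> healthy."""
        low    = tri(performance, -0.01, 0.0, 0.5)
        normal = tri(performance, 0.2, 0.5, 0.8)
        high   = tri(performance, 0.5, 1.0, 1.01)
        return {"faulty": low, "degraded": normal, "healthy": high}

    state = diagnose(0.3)   # a middling measurement (invented value)
    ```

    A measurement of 0.3 belongs partly to "faulty" and partly to "degraded", which mirrors the paper's point that an imprecise mapping can still support diagnosis.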

  5. Dynamic analysis of shallow spherical shells by the residual method

    International Nuclear Information System (INIS)

    A residual method in the least-squares sense is presented for the dynamic analysis of shallow spherical shells for all kinds of boundary conditions. The basic step in the method is the choice of a trial solution which, because of the presence of undetermined parameters, actually represents a whole family of possible approximations. The residual equations are obtained by substituting the trial solution into the governing differential equations, and the least-squares collocation method may be used to pick out the best approximation. Unlike the finite element method, the residual method does not require the formulation of an energy principle, and the approximate solutions are still in analytical form. (author)
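    The basic step (substitute a parameterized trial solution, then pick the parameter that minimizes the squared residual at collocation points) can be shown on a toy problem. This hedged sketch uses u' = u with u(0) = 1 and the one-parameter trial u(x) = 1 + a*x in place of the shell equations; everything here is invented for illustration.

    ```python
    # Least-squares residual method on a toy ODE: u' = u, u(0) = 1.
    # Trial solution u(x) = 1 + a*x gives residual R(x) = a*(1 - x) - 1;
    # minimizing sum R(x_i)^2 over collocation points yields a in closed form.
    def best_parameter(points):
        """Closed-form least-squares minimizer of sum (a*(1 - x) - 1)**2."""
        num = sum(1 - x for x in points)
        den = sum((1 - x) ** 2 for x in points)
        return num / den

    a = best_parameter([0.0, 0.5, 1.0])   # collocation points (invented)
    u_end = 1 + a * 1.0                   # approximate u(1); exact is e
    ```

    A one-term trial gives only a rough answer (2.2 versus e, about 2.718); richer trial families sharpen the approximation while keeping it in analytical form, which is the advantage the abstract emphasizes.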

  6. Bearing Capacity Analysis Using Meshless Local Petrov-Galerkin Method

    Directory of Open Access Journals (Sweden)

    Mužík Juraj

    2014-05-01

    Full Text Available The paper deals with the use of a meshless method for soil bearing capacity analysis. There are many formulations of meshless methods. The article presents the Meshless Local Petrov-Galerkin method (MLPG), a local weak formulation of the equilibrium equations. The main difference between meshless methods and the conventional finite element method (FEM) is that meshless shape functions are constructed using a randomly scattered set of points without any relation between points. The Heaviside step function is the test function used in the meshless implementation presented in the article. The Heaviside test function makes the weak-formulation integral very simple, because the only remaining domain integral in the governing equation is due to the body force.

  7. Time history analysis method for spent fuel racks

    International Nuclear Information System (INIS)

    Background: Spent fuel racks are important facilities for storing spent fuel; they are free-standing in the spent fuel pool. The response of racks to seismic load is highly nonlinear and involves a complex combination of motions: sliding, impact, twisting and turning. Purpose: An analysis method should be built to accurately replicate these nonlinear responses. Methods: A whole-pool multi-rack FEA model was developed and time history analysis was performed, taking into account the effects of sliding, impact and friction as well as the fluid-structure interaction effect. Results: The analysis results such as displacement and force under seismic loads were obtained. Conclusion: The method can be used for the seismic analysis of spent fuel racks. (authors)

  8. Microscale extraction method for HPLC carotenoid analysis in vegetable matrices

    OpenAIRE

    Sidney Pacheco; Fernanda Marques Peixoto; Renata Galhardo Borguini; Luzimar da Silva de Mattos do Nascimento; Claudio Roberto Ribeiro Bobeda; Manuela Cristina Pessanha de Araújo Santiago; Ronoel Luiz de Oliveira Godoy

    2014-01-01

    In order to generate simple, efficient analytical methods that are also fast, clean, and economical, and are capable of producing reliable results for a large number of samples, a microscale extraction method for analysis of carotenoids in vegetable matrices was developed. The efficiency of this adapted method was checked by comparing the results obtained from vegetable matrices, based on extraction equivalence, time required and reagents. Six matrices were used: tomato (Solanum lycopersicum...

  9. Slope Stability Analysis Using Limit Equilibrium Method in Nonlinear Criterion

    OpenAIRE

    Hang Lin; Wenwen Zhong; Wei Xiong; Wenyu Tang

    2014-01-01

    In slope stability analysis, the limit equilibrium method is usually used to calculate the safety factor of slope based on Mohr-Coulomb criterion. However, Mohr-Coulomb criterion is restricted to the description of rock mass. To overcome its shortcomings, this paper combined Hoek-Brown criterion and limit equilibrium method and proposed an equation for calculating the safety factor of slope with limit equilibrium method in Hoek-Brown criterion through equivalent cohesive strength and the fric...
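    Once equivalent cohesive strength c and friction angle phi are in hand, the limit-equilibrium safety factor for the simplest (planar) failure surface takes the classical form FS = (c*L + W*cos(beta)*tan(phi)) / (W*sin(beta)). The sketch below is a hedged illustration of that formula only, not the paper's Hoek-Brown derivation; all numbers are invented.

    ```python
    # Limit-equilibrium safety factor for a planar failure surface, using
    # equivalent Mohr-Coulomb parameters. All input values are invented.
    import math

    def safety_factor(c, phi_deg, weight, beta_deg, length):
        """FS = resisting forces / driving forces along a planar surface."""
        phi = math.radians(phi_deg)
        beta = math.radians(beta_deg)
        resisting = c * length + weight * math.cos(beta) * math.tan(phi)
        driving = weight * math.sin(beta)
        return resisting / driving

    # Hypothetical slope: c in kPa, weight in kN/m, surface length in m.
    fs = safety_factor(c=25.0, phi_deg=30.0, weight=500.0,
                       beta_deg=35.0, length=10.0)
    ```

    FS > 1 indicates the resisting forces exceed the driving forces; the paper's contribution is obtaining c and phi consistently from the nonlinear Hoek-Brown envelope rather than assuming them directly.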

  10. Numerical Analysis of Higher Order Discontinuous Galerkin Finite Element methods

    OpenAIRE

    Hartmann, Ralf

    2008-01-01

    After the introduction in Section 1 this lecture starts off with recalling well-known results from the numerical analysis of the continuous finite element methods. In particular, we recall a priori error estimates in the energy norm and the L2-norm including their proofs for higher order standard finite element methods of Poisson's equation in Section 2 and for the standard and the streamline diffusion finite element method of the linear advection equation in Section 3. ...

  11. Parametric quadratic programming method for elastic contact fracture analysis

    OpenAIRE

    Su, RKL; Zhu, Y; Leung, AYT

    2002-01-01

    A solution procedure for elastic contact fracture mechanics has been proposed in this paper. The procedure is based on the quadratic programming and finite element method (FEM). In this paper, parametric quadratic programming method for two-dimensional contact mechanics analysis is applied to the crack problems involving the crack surfaces in frictional contact. Based on a linear complementary contact condition, the parametric variational principle and FEM, a linear complementary method is ex...

  12. Multifunctional Collaborative Modeling and Analysis Methods in Engineering Science

    Science.gov (United States)

    Ransom, Jonathan B.; Broduer, Steve (Technical Monitor)

    2001-01-01

    Engineers are challenged to produce better designs in less time and for less cost. Hence, to investigate novel and revolutionary design concepts, accurate, high-fidelity results must be assimilated rapidly into the design, analysis, and simulation process. This assimilation should consider diverse mathematical modeling and multi-discipline interactions necessitated by concepts exploiting advanced materials and structures. Integrated high-fidelity methods with diverse engineering applications provide the enabling technologies to assimilate these high-fidelity, multi-disciplinary results rapidly at an early stage in the design. These integrated methods must be multifunctional, collaborative, and applicable to the general field of engineering science and mechanics. Multifunctional methodologies and analysis procedures are formulated for interfacing diverse subdomain idealizations including multi-fidelity modeling methods and multi-discipline analysis methods. These methods, based on the method of weighted residuals, ensure accurate compatibility of primary and secondary variables across the subdomain interfaces. Methods are developed using diverse mathematical modeling (i.e., finite difference and finite element methods) and multi-fidelity modeling among the subdomains. Several benchmark scalar-field and vector-field problems in engineering science are presented with extensions to multidisciplinary problems. Results for all problems presented are in overall good agreement with the exact analytical solution or the reference numerical solution. Based on the results, the integrated modeling approach using the finite element method for multi-fidelity discretization among the subdomains is identified as most robust. The multiple-method approach is advantageous when interfacing diverse disciplines in which each of the method's strengths are utilized. The multifunctional methodology presented provides an effective mechanism by which domains with diverse idealizations are

  13. Mode Analysis with Autocorrelation Method (Single Time Series) in Tokamak

    Science.gov (United States)

    Saadat, Shervin; Salem, Mohammad K.; Goranneviss, Mahmoud; Khorshid, Pejman

    2010-08-01

    In this paper, the plasma mode is analyzed with a statistical method based on the autocorrelation function. Because the autocorrelation function is computed from a single time series, only one Mirnov coil is needed. After autocorrelation analysis of the Mirnov coil data, the spectral density diagram is plotted; from its symmetries and trends the plasma mode can be analyzed. The effects of RHF fields are investigated with this method in the IR-T1 tokamak, and the results correspond with multichannel methods such as SVD and FFT.
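    The single-coil pipeline (autocorrelation of one time series, then its Fourier transform as the spectral density, per the Wiener-Khinchin relation) can be sketched with a synthetic signal. This is a hedged illustration, not the paper's analysis; the test signal stands in for Mirnov-coil data.

    ```python
    # Autocorrelation of a single time series, then a naive DFT of the ACF
    # as the spectral density (Wiener-Khinchin). Signal is synthetic.
    import math

    def autocorrelation(x, max_lag):
        """Normalized sample autocorrelation for lags 0..max_lag-1."""
        n = len(x)
        mean = sum(x) / n
        var = sum((v - mean) ** 2 for v in x)
        return [sum((x[i] - mean) * (x[i + k] - mean)
                    for i in range(n - k)) / var
                for k in range(max_lag)]

    def spectral_density(acf):
        """Magnitude of the DFT of the autocorrelation sequence."""
        n = len(acf)
        return [abs(sum(acf[k] * complex(math.cos(2 * math.pi * f * k / n),
                                         -math.sin(2 * math.pi * f * k / n))
                        for k in range(n)))
                for f in range(n // 2)]

    # Synthetic "mode": 8 oscillation cycles over 64 samples.
    sig = [math.sin(2 * math.pi * 8 * t / 64) for t in range(64)]
    acf = autocorrelation(sig, 64)
    psd = spectral_density(acf)
    peak = max(range(len(psd)), key=psd.__getitem__)
    ```

    The spectral peak lands at the mode's frequency bin, which is the feature the symmetry-and-trend reading of the spectral density diagram relies on.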

  14. Statistical methods of SNP data analysis with applications

    CERN Document Server

    Bulinski, Alexander; Shashkin, Alexey; Yaskov, Pavel

    2011-01-01

    Various statistical methods important for genetic analysis are considered and developed. Namely, we concentrate on the multifactor dimensionality reduction, logic regression, random forests and stochastic gradient boosting. These methods and their new modifications, e.g., the MDR method with "independent rule", are used to study the risk of complex diseases such as cardiovascular ones. The roles of certain combinations of single nucleotide polymorphisms and external risk factors are examined. To perform the data analysis concerning the ischemic heart disease and myocardial infarction the supercomputer SKIF "Chebyshev" of the Lomonosov Moscow State University was employed.

  15. Ratio of slopes method for quantitative analysis in ceramic bodies

    International Nuclear Information System (INIS)

    A quantitative X-ray diffraction analysis technique developed at the University of Sheffield was adopted, rather than the previously widely used internal standard method, to determine the amount of the phases present in a reformulated whiteware porcelain and a BaTiO3 electrochemical material. This method, although it still employs an internal standard, was found to be very easy and accurate. The required weight fraction of a phase in the mixture to be analysed is determined from the ratio of slopes of two linear plots, designated as the analysis and reference lines, passing through their origins, using the least squares method
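    The computation behind the method is small: fit each line through the origin by least squares and take the ratio of the two slopes. The sketch below is a hedged toy version with invented intensity data, not the Sheffield procedure itself.

    ```python
    # Ratio-of-slopes sketch: two lines constrained through the origin are
    # fitted by least squares; their slope ratio gives the weight fraction.
    # Intensity values are invented for illustration.
    def slope_through_origin(xs, ys):
        """Least-squares slope of y = m*x (no intercept term)."""
        return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

    x_ref = [0.1, 0.2, 0.3, 0.4]            # abscissa values (invented)
    y_analysis  = [0.05, 0.10, 0.15, 0.20]  # analysis line (invented)
    y_reference = [0.10, 0.20, 0.30, 0.40]  # reference line (invented)

    weight_fraction = (slope_through_origin(x_ref, y_analysis)
                       / slope_through_origin(x_ref, y_reference))
    ```

    Constraining both fits through the origin is what makes the ratio interpretable directly as a weight fraction, without the calibration offsets the classical internal standard method requires.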

  16. Exhaled breath analysis: physical methods, instruments, and medical diagnostics

    International Nuclear Information System (INIS)

    This paper reviews the analysis of exhaled breath, a rapidly growing field in noninvasive medical diagnostics that lies at the intersection of physics, chemistry, and medicine. Current data are presented on gas markers in human breath and their relation to human diseases. Various physical methods for breath analysis are described. It is shown how measurement precision and data volume requirements have stimulated technological developments and identified the problems that have to be solved to put this method into clinical practice. (instruments and methods of investigation)

  17. Intellectual Data Analysis Method for Evaluation of Virtual Teams

    Directory of Open Access Journals (Sweden)

    Dalia Krikščiūnienė

    2012-07-01

    Full Text Available The purpose of the article is to present a method for virtual team performance evaluation based on intelligent team member collaboration data analysis. The motivation for the research is based on the ability to create an evaluation method that is similar to ambiguous expert evaluations. The concept of the hierarchical fuzzy rule based method aims to evaluate the data from virtual team interaction instances related to implementation of project tasks. The suggested method is designed for project managers or virtual team leaders to help in virtual teamwork evaluation that is based on captured data analysis. The main point of the method is the ability to repeat human thinking and expert valuation process for data analysis by applying fuzzy logic: fuzzy sets, fuzzy signatures and fuzzy rules. The fuzzy set principle used in the method allows evaluation criteria numerical values to transform into linguistic terms and use it in constructing fuzzy rules. Using a fuzzy signature is possible in constructing a hierarchical criteria structure. This structure helps to solve the problem of exponential increase of fuzzy rules including more input variables. The suggested method is aimed to be applied in the virtual collaboration software as a real time teamwork evaluation tool. The research shows that by applying fuzzy logic for team collaboration data analysis it is possible to get evaluations equal to expert insights. The method includes virtual team, project task and team collaboration data analysis. The advantage of the suggested method is the possibility to use variables gained from virtual collaboration systems as fuzzy rules inputs. Information on fuzzy logic based virtual teamwork collaboration evaluation has evidence that can be investigated in the future. Also the method can be seen as the next virtual collaboration software development step.

  18. Intellectual Data Analysis Method for Evaluation of Virtual Teams

    Directory of Open Access Journals (Sweden)

    Sandra Strigūnaitė

    2013-01-01

    Full Text Available The purpose of the article is to present a method for virtual team performance evaluation based on intelligent team member collaboration data analysis. The motivation for the research is based on the ability to create an evaluation method that is similar to ambiguous expert evaluations. The concept of the hierarchical fuzzy rule based method aims to evaluate the data from virtual team interaction instances related to implementation of project tasks. The suggested method is designed for project managers or virtual team leaders to help in virtual teamwork evaluation that is based on captured data analysis. The main point of the method is the ability to repeat human thinking and expert valuation process for data analysis by applying fuzzy logic: fuzzy sets, fuzzy signatures and fuzzy rules. The fuzzy set principle used in the method allows evaluation criteria numerical values to transform into linguistic terms and use it in constructing fuzzy rules. Using a fuzzy signature is possible in constructing a hierarchical criteria structure. This structure helps to solve the problem of exponential increase of fuzzy rules including more input variables. The suggested method is aimed to be applied in the virtual collaboration software as a real time teamwork evaluation tool. The research shows that by applying fuzzy logic for team collaboration data analysis it is possible to get evaluations equal to expert insights. The method includes virtual team, project task and team collaboration data analysis. The advantage of the suggested method is the possibility to use variables gained from virtual collaboration systems as fuzzy rules inputs. Information on fuzzy logic based virtual teamwork collaboration evaluation has evidence that can be investigated in the future. Also the method can be seen as the next virtual collaboration software development step.

  19. One-step synthesis of amino-functionalized ultrasmall near infrared-emitting persistent luminescent nanoparticles for in vitro and in vivo bioimaging

    Science.gov (United States)

    Shi, Junpeng; Sun, Xia; Zhu, Jianfei; Li, Jinlei; Zhang, Hongwu

    2016-05-01

    Near infrared (NIR)-emitting persistent luminescent nanoparticles (NPLNPs) have attracted much attention in bioimaging because they can provide long-term in vivo imaging with a high signal-to-noise ratio (SNR). However, conventional NPLNPs with large particle sizes that lack modifiable surface groups suffer from many serious limitations in bioimaging. Herein, we report a one-step synthesis of amino-functionalized ZnGa2O4:Cr,Eu nanoparticles (ZGO) that have an ultrasmall size, where ethylenediamine served as the reactant to fabricate the ZGO as well as the surfactant ligand to control the nanocrystal size and form surface amino groups. The ZGO exhibited a narrow particle size distribution, a bright NIR emission and a long afterglow luminescence. In addition, due to the excellent conjugation ability of the surface amino groups, the ZGO can be easily conjugated with many bio-functional molecules, which has been successfully utilized to realize in vitro and in vivo imaging. More importantly, the ZGO achieved re-excitation imaging using 650 nm and 808 nm NIR light in situ, which is advantageous for long-term and higher SNR bioimaging.

  20. Complexity of software trustworthiness and its dynamical statistical analysis methods

    Institute of Scientific and Technical Information of China (English)

    ZHENG ZhiMing; MA ShiLong; LI Wei; JIANG Xin; WEI Wei; MA LiLi; TANG ShaoTing

    2009-01-01

    Developing trusted software has become an important trend and a natural choice in the development of software technology and applications. At present, methods for the measurement and assessment of software trustworthiness cannot guarantee safe and reliable operation of software systems completely and effectively. Based on the study of dynamical systems, this paper interprets the characteristics of the behaviors of software systems and the basic scientific problems of software trustworthiness complexity, analyzes the characteristics of complexity of software trustworthiness, and proposes to study software trustworthiness measurement in terms of the complexity of software trustworthiness. Using dynamical statistical analysis methods, the paper advances an invariant-measure based assessment method of software trustworthiness by statistical indices, and thereby provides a dynamical criterion for the untrustworthiness of software systems. By an example, the feasibility of the proposed dynamical statistical analysis method in software trustworthiness measurement is demonstrated using numerical simulations and theoretical analysis.

  1. Qualitative analysis of the CCEBC/EEAC method

    Institute of Scientific and Technical Information of China (English)

    LIAO; Haohui; TANG; Yun

    2004-01-01

    The CCEBC/EEAC method is an effective method in the quantitative analysis of power system transient stability. This paper provides a qualitative analysis of the CCEBC/EEAC method and shows that, from a geometrical point of view, the CCCOI-RM transformation used in the CCEBC/EEAC method can be regarded as a projection of the variables of the system model onto a weighted vector space, from which a generalized P̄-δ̄ trajectory is obtained. Since a transient process of power systems can be approximately regarded as a time-piecewise simple Hamiltonian system, in order to qualitatively analyse the CCEBC/EEAC method, this paper compares the potential energy of a two-machine infinite-bus system with its CCEBC/EEAC energy. Numerical results indicate their similarity. Clarifying the qualitative relation between these two kinds of energies is significant for mathematically verifying the CCEBC/EEAC criterion for power system transient stability. Moreover, the qualitative analysis of the CCEBC/EEAC method enables us to better understand some important phenomena revealed by quantitative analysis, such as multi-swing loss of stability and isolated stable domains.

  2. INDIRECT DETERMINATION METHOD OF DYNAMIC FORCEBY USING CEPSTRUM ANALYSIS

    Institute of Scientific and Technical Information of China (English)

    吴淼; 魏任之

    1996-01-01

    The dynamic load spectrum is one of the most important bases for the design and dynamic characteristics analysis of machines. But it is difficult to measure on many occasions, especially for mining machines, due to their harsh working conditions and the high cost of measurements. In such situations, the load spectrum has to be obtained by indirect determination methods. A new method to identify the load spectrum, the cepstrum analysis method, is presented in this paper. This method can be used to eliminate the filtering influence of the transfer function on the response signals so that the load spectrum can be determined indirectly. Experimental and engineering examples indicate that this method has the advantages that the calculation is simple and the measurement is easy.
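    The reason cepstrum analysis can separate the excitation from the transfer function is that the log of the magnitude spectrum turns the product "input spectrum times transfer function" into a sum, whose components can then be separated before inverting. The sketch below computes a real cepstrum with naive DFTs on a synthetic signal; it is a hedged illustration of the definition, not the paper's identification procedure.

    ```python
    # Real cepstrum sketch: inverse DFT of the log-magnitude spectrum.
    # Naive DFTs keep the sketch dependency-free; the signal is synthetic.
    import cmath
    import math

    def dft(x):
        """Naive discrete Fourier transform (O(n^2), fine for a sketch)."""
        n = len(x)
        return [sum(x[t] * cmath.exp(-2j * math.pi * f * t / n)
                    for t in range(n))
                for f in range(n)]

    def real_cepstrum(x):
        """Inverse DFT of log|X(f)|; real and symmetric for real input."""
        spectrum = dft(x)
        log_mag = [math.log(abs(s) + 1e-12) for s in spectrum]  # avoid log(0)
        n = len(x)
        return [sum(log_mag[f] * cmath.exp(2j * math.pi * f * t / n)
                    for f in range(n)).real / n
                for t in range(n)]

    sig = [math.sin(2 * math.pi * 4 * t / 32) for t in range(32)]
    ceps = real_cepstrum(sig)
    ```

    In the additive (quefrency) domain, contributions of the source and of the transfer function typically occupy different quefrency ranges, so "liftering" one away before inverting is what allows the load spectrum to be recovered indirectly.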

  3. Objective analysis of the ARM IOP data: method and sensitivity

    Energy Technology Data Exchange (ETDEWEB)

    Cedarwall, R; Lin, J L; Xie, S C; Yio, J J; Zhang, M H

    1999-04-01

    Motivated by the need to obtain accurate objective analysis of field experimental data to force physical parameterizations in numerical models, this paper first reviews the existing objective analysis methods and interpolation schemes that are used to derive atmospheric wind divergence, vertical velocity, and advective tendencies. Advantages and disadvantages of each method are discussed. It is shown that considerable uncertainties in the analyzed products can result from the use of different analysis schemes and even more from different implementations of a particular scheme. The paper then describes a hybrid approach to combine the strengths of the regular grid method and the line-integral method, together with a variational constraining procedure for the analysis of field experimental data. In addition to the use of upper air data, measurements at the surface and at the top of the atmosphere are used to constrain the upper air analysis to conserve column-integrated mass, water, energy, and momentum. Analyses are shown for measurements taken in the Atmospheric Radiation Measurement Program (ARM) July 1995 Intensive Observational Period (IOP). Sensitivity experiments are carried out to test the robustness of the analyzed data and to reveal the uncertainties in the analysis. It is shown that the variational constraining process significantly reduces the sensitivity of the final data products.

  4. Improved reliability analysis method based on the failure assessment diagram

    Science.gov (United States)

    Zhou, Yu; Zhang, Zheng; Zhong, Qunpeng

    2012-07-01

    With the uncertainties related to operating conditions, in-service non-destructive testing (NDT) measurements and material properties considered in the structural integrity assessment, probabilistic analysis based on the failure assessment diagram (FAD) approach has recently become an important concern. However, the point density revealing the probabilistic distribution characteristics of the assessment points is usually ignored. To obtain more detailed and direct knowledge from the reliability analysis, an improved probabilistic fracture mechanics (PFM) assessment method is proposed. By integrating 2D kernel density estimation (KDE) technology into the traditional probabilistic assessment, the probabilistic density of the randomly distributed assessment points is visualized in the assessment diagram. Moreover, a modified interval sensitivity analysis is implemented and compared with probabilistic sensitivity analysis. The improved reliability analysis method is applied to the assessment of a high pressure pipe containing an axial internal semi-elliptical surface crack. The results indicate that these two methods can give consistent sensitivities of input parameters, but the interval sensitivity analysis is computationally more efficient. Meanwhile, the point density distribution and its contour are plotted in the FAD, thereby better revealing the characteristics of PFM assessment. This study provides a powerful tool for the reliability analysis of critical structures.
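    The 2D kernel density estimation step can be illustrated with a minimal Gaussian product kernel summed over scattered points, which is what produces the density contours on the FAD plane. This is a hedged sketch with invented assessment points and bandwidth, not the paper's implementation.

    ```python
    # Minimal 2D Gaussian KDE: sum an isotropic Gaussian kernel over the
    # scattered points. Points and bandwidth are invented for illustration.
    import math

    def kde2d(points, x, y, bandwidth):
        """Gaussian kernel density estimate at (x, y)."""
        h2 = bandwidth ** 2
        norm = 2 * math.pi * h2 * len(points)
        return sum(math.exp(-((x - px) ** 2 + (y - py) ** 2) / (2 * h2))
                   for px, py in points) / norm

    # Hypothetical FAD assessment points (Lr, Kr) clustered in one region.
    pts = [(0.40, 0.50), (0.45, 0.55), (0.60, 0.70), (0.42, 0.52)]
    dense  = kde2d(pts, 0.43, 0.52, bandwidth=0.1)   # inside the cluster
    sparse = kde2d(pts, 0.90, 0.10, bandwidth=0.1)   # far from all points
    ```

    Evaluating the estimate on a grid and contouring it gives the point-density picture the paper overlays on the failure assessment diagram; bandwidth choice controls how smooth that picture is.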

  5. The analysis of image sequences in nuclear medicine: 2 - factor analysis methods

    International Nuclear Information System (INIS)

    This article is a review of the Factor Analysis methods used in Nuclear Medicine. These methods consist of processing temporal or energetic dynamic scintigraphic series with Descriptive Multivariate Analysis methods. The principle for using these methods is the following: the dynamic series is regarded as a table in which each row is a dixel and each column is an image. The three methods used in Nuclear Medicine are Principal Components Analysis and Factor Correspondence Analysis, which are Orthogonal Analysis methods, and Factor Analysis, which is an Oblique Analysis method. The results of Orthogonal Analysis consist of a limited number of non-correlated factor curves and images which can be used for generating parametric images, for nosologic classification, for smoothing or for data compression. The software for Oblique Analysis used in Nuclear Medicine is called Factor Analysis of Dynamic Structures (FADS). It generates a limited number of correlated oblique factor images and curves which are estimates of Dynamic Structures and of kinetics in these structures
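    The orthogonal-analysis idea (treat the series as a dixels-by-images table and extract uncorrelated factors) can be sketched on a tiny two-image example, where the principal components follow from the 2x2 covariance eigenvalues. This hedged toy uses invented counts and stands in for a real PCA of a scintigraphic series.

    ```python
    # Toy principal components of a dixels-by-images table with two images:
    # eigenvalues of the 2x2 covariance matrix. Counts are invented.
    import math

    def pca_2col(rows):
        """Return (largest, smallest) eigenvalue of the 2x2 covariance."""
        n = len(rows)
        mx = sum(r[0] for r in rows) / n
        my = sum(r[1] for r in rows) / n
        sxx = sum((r[0] - mx) ** 2 for r in rows) / n
        syy = sum((r[1] - my) ** 2 for r in rows) / n
        sxy = sum((r[0] - mx) * (r[1] - my) for r in rows) / n
        tr, det = sxx + syy, sxx * syy - sxy ** 2
        root = math.sqrt(max(tr * tr / 4 - det, 0.0))
        return tr / 2 + root, tr / 2 - root

    # Four dixels observed in two images; counts nearly proportional, so a
    # single factor should carry almost all the variance.
    series = [(10.0, 20.0), (20.0, 41.0), (30.0, 59.0), (40.0, 80.0)]
    lam1, lam2 = pca_2col(series)
    ```

    One dominant eigenvalue means one factor curve and image suffice, which is exactly the data-compression and smoothing use of Orthogonal Analysis the review describes.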

  6. AN ANALYSIS METHOD FOR HIGH-SPEED CIRCUIT SYSTEMS

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    A new method for analyzing high-speed circuit systems is presented. The method adds transmission line end currents to the circuit variables of the classical modified nodal approach. The matrix equation describing a high-speed circuit system can then be formulated directly and analyzed conveniently in its normative form. A time-domain analysis method for transmission lines is also introduced. The two methods are combined to efficiently analyze high-speed circuit systems containing general transmission lines. A numerical experiment is presented and the results are compared with those calculated by HSPICE.

  7. Extended Finite Element Method for Fracture Analysis of Structures

    CERN Document Server

    Mohammadi, Soheil

    2008-01-01

    This important textbook provides an introduction to the concepts of the newly developed extended finite element method (XFEM) for fracture analysis of structures, as well as for other related engineering applications.One of the main advantages of the method is that it avoids any need for remeshing or geometric crack modelling in numerical simulation, while generating discontinuous fields along a crack and around its tip. The second major advantage of the method is that by a small increase in number of degrees of freedom, far more accurate solutions can be obtained. The method has recently been

  8. Hydrothermal analysis in engineering using control volume finite element method

    CERN Document Server

    Sheikholeslami, Mohsen

    2015-01-01

    Control volume finite element methods (CVFEM) bridge the gap between finite difference and finite element methods, using the advantages of both methods for simulation of multi-physics problems in complex geometries. In Hydrothermal Analysis in Engineering Using Control Volume Finite Element Method, CVFEM is covered in detail and applied to key areas of thermal engineering. Examples, exercises, and extensive references are used to show the use of the technique to model key engineering problems such as heat transfer in nanofluids (to enhance performance and compactness of energy systems),

  9. A study of applicability of soil-structure interaction analysis method using boundary element method

    Energy Technology Data Exchange (ETDEWEB)

    Kim, M. K. [KAERI, Taejon (Korea, Republic of); Kim, M. K. [Yonsei University, Seoul (Korea, Republic of)

    2003-07-01

    In this study, a numerical method for Soil-Structure Interaction (SSI) analysis using an FE-BE coupling method is developed. The total system is divided into two parts, the so-called far field and near field. The far field is modeled by a boundary element formulation using the multi-layered dynamic fundamental solution and coupled with the near field modeled by finite elements. In order to verify the seismic response analysis, the results are compared with those of another commercial code. Finally, several SSI analyses under seismic loading are performed to examine the dynamic behavior of the system. As a result, it is shown that the developed method can be an efficient numerical method for solving the SSI analysis.

  10. The colour analysis method applied to homogeneous rocks

    Science.gov (United States)

    Halász, Amadé; Halmai, Ákos

    2015-12-01

    Computer-aided colour analysis can facilitate cyclostratigraphic studies. Here we report on a case study involving the development of a digital colour analysis method for examination of the Boda Claystone Formation which is the most suitable in Hungary for the disposal of high-level radioactive waste. Rock type colours are reddish brown or brownish red, or any shade between brown and red. The method presented here could be used to differentiate similar colours and to identify gradual transitions between these; the latter are of great importance in a cyclostratigraphic analysis of the succession. Geophysical well-logging has demonstrated the existence of characteristic cyclic units, as detected by colour and natural gamma. Based on our research, colour, natural gamma and lithology correlate well. For core Ib-4, these features reveal the presence of orderly cycles with thicknesses of roughly 0.64 to 13 metres. Once the core has been scanned, this is a time- and cost-effective method.

  11. Continuum damage growth analysis using element free Galerkin method

    Indian Academy of Sciences (India)

    C O Arun; B N Rao; S M Srinivasan

    2010-06-01

    This paper presents an elasto-plastic element free Galerkin formulation based on Newton–Raphson algorithm for damage growth analysis. Isotropic ductile damage evolution law is used. A study has been carried out in this paper using the proposed element free Galerkin method to understand the effect of initial damage and its growth on structural response of single and bi-material problems. A simple method is adopted for enforcing EBCs by scaling the function approximation using a scaling matrix, when non-singular weight functions are used over the entire domain of the problem definition. Numerical examples comprising of one-and two-dimensional problems are presented to illustrate the effectiveness of the proposed method in analysis of uniform and non-uniform damage evolution problems. Effect of material discontinuity on damage growth analysis is also presented.

  12. Identification of the isomers using principal component analysis (PCA) method

    Science.gov (United States)

    Kepceoǧlu, Abdullah; Gündoǧdu, Yasemin; Ledingham, Kenneth William David; Kilic, Hamdi Sukur

    2016-03-01

    In this work, we have carried out a detailed statistical analysis of experimental mass spectra from xylene isomers. Principal Component Analysis (PCA) was used to identify the isomers, which cannot be distinguished using conventional statistical methods for the interpretation of their mass spectra. Experiments were carried out using a linear TOF-MS coupled to a femtosecond laser system as an energy source for the ionisation processes. The collected data were analysed and interpreted using PCA as a multivariate analysis of these spectra. This demonstrates the strength of the method in distinguishing isomers that cannot be identified using conventional mass analysis obtained through dissociative ionisation processes on these molecules. The dependence of the PCA results on the laser pulse energy and the background pressure in the spectrometer is also presented in this work.
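    The core of the technique reduces to centering the spectra matrix and taking an SVD. The sketch below is illustrative only (the synthetic "spectra" and the hidden class offset are invented, not the authors' data): two classes that no single mass channel separates cleanly become separable on the first principal component.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic "spectra": two isomer classes differing along a hidden direction.
base = rng.normal(size=(40, 6))
labels = np.repeat([0, 1], 20)
X = base + np.outer(labels, np.array([2.0, 0.0, -2.0, 0.0, 1.0, 0.0]))

Xc = X - X.mean(axis=0)            # center each channel
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T                 # principal-component scores
explained = S**2 / np.sum(S**2)    # variance fraction per component
```

    The first-component scores of the two synthetic classes separate by roughly the magnitude of the hidden offset, which is the behaviour the abstract exploits for isomer identification.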

  13. Computer vision analysis of image motion by variational methods

    CERN Document Server

    Mitiche, Amar

    2014-01-01

    This book presents a unified view of image motion analysis under the variational framework. Variational methods, rooted in physics and mechanics, but appearing in many other domains, such as statistics, control, and computer vision, address a problem from an optimization standpoint, i.e., they formulate it as the optimization of an objective function or functional. The methods of image motion analysis described in this book use the calculus of variations to minimize (or maximize) an objective functional which transcribes all of the constraints that characterize the desired motion variables. The book addresses the four core subjects of motion analysis: Motion estimation, detection, tracking, and three-dimensional interpretation. Each topic is covered in a dedicated chapter. The presentation is prefaced by an introductory chapter which discusses the purpose of motion analysis. Further, a chapter is included which gives the basic tools and formulae related to curvature, Euler Lagrange equations, unconstrained de...

  14. Beyond perturbation introduction to the homotopy analysis method

    CERN Document Server

    Liao, Shijun

    2003-01-01

    Solving nonlinear problems is inherently difficult, and the stronger the nonlinearity, the more intractable solutions become. Analytic approximations often break down as nonlinearity becomes strong, and even perturbation approximations are valid only for problems with weak nonlinearity. This book introduces a powerful new analytic method for nonlinear problems, homotopy analysis, that remains valid even with strong nonlinearity. In Part I, the author starts with a very simple example, then presents the basic ideas, detailed procedures, and the advantages (and limitations) of homotopy analysis. Part II illustrates the application of homotopy analysis to many interesting nonlinear problems. These range from simple bifurcations of a nonlinear boundary-value problem to the Thomas-Fermi atom model, Volterra's population model, Von Kármán swirling viscous flow, and nonlinear progressive waves in deep water. Although the homotopy analysis method has been verified in a number of prestigious journals, it has yet to be ...

  15. Structural analysis with the finite element method linear statics

    CERN Document Server

    Oñate, Eugenio

    2013-01-01

    STRUCTURAL ANALYSIS WITH THE FINITE ELEMENT METHOD Linear Statics Volume 1: The Basis and Solids Eugenio Oñate The two volumes of this book cover most of the theoretical and computational aspects of the linear static analysis of structures with the Finite Element Method (FEM). The content of the book is based on the lecture notes of a basic course on Structural Analysis with the FEM taught by the author at the Technical University of Catalonia (UPC) in Barcelona, Spain for the last 30 years. Volume 1 presents the basis of the FEM for structural analysis and a detailed description of the finite element formulation for axially loaded bars, plane elasticity problems, axisymmetric solids and general three-dimensional solids. Each chapter describes the background theory for each structural model considered, details of the finite element formulation and guidelines for the application to structural engineering problems. The book includes a chapter on miscellaneous topics such as treatment of inclined supports, elas...

  16. Adaptive computational methods for SSME internal flow analysis

    Science.gov (United States)

    Oden, J. T.

    1986-01-01

    Adaptive finite element methods for the analysis of classes of problems in compressible and incompressible flow of interest in SSME (space shuttle main engine) analysis and design are described. The general objective of the adaptive methods is to improve and to quantify the quality of numerical solutions to the governing partial differential equations of fluid dynamics in two-dimensional cases. There are several different families of adaptive schemes that can be used to improve the quality of solutions in complex flow simulations. Among these are: (1) r-methods (node-redistribution or moving-mesh methods), in which a fixed number of nodal points is allowed to migrate to points in the mesh where high error is detected; (2) h-methods, in which the mesh size h is automatically refined to reduce local error; and (3) p-methods, in which the local degree p of the finite element approximation is increased to reduce local error. Two of the three basic techniques have been studied in this project: an r-method for the steady Euler equations in two dimensions and a p-method for transient, laminar, viscous incompressible flow. Numerical results are presented. A brief introduction to residual methods of a posteriori error estimation is also given and some pertinent conclusions of the study are listed.
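    The h-method idea can be sketched generically in one dimension (a hypothetical illustration, not the project's SSME code): bisect every interval whose local error indicator, here the deviation of the function from its linear interpolant at the midpoint, exceeds a tolerance.

```python
import numpy as np

def h_refine(nodes, f, tol):
    """One pass of h-refinement: bisect any interval where a simple local
    error indicator (midpoint interpolation error of f) exceeds tol."""
    new = [nodes[0]]
    for a, b in zip(nodes[:-1], nodes[1:]):
        mid = 0.5 * (a + b)
        err = abs(f(mid) - 0.5 * (f(a) + f(b)))  # deviation from linear interpolant
        if err > tol:
            new.append(mid)                      # insert midpoint -> halve mesh size h
        new.append(b)
    return np.array(new)

mesh = np.linspace(0.0, 1.0, 5)
refined = h_refine(mesh, lambda x: x**4, tol=1e-2)  # refines only where curvature is high
```

    On this example the three right-hand intervals, where x**4 curves strongly, are bisected while the flat leftmost interval is left alone; real h-methods drive the same loop with a residual-based error estimate instead of interpolation error.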

  17. Observation impact analysis methods for storm surge forecasting systems

    Science.gov (United States)

    Verlaan, Martin; Sumihar, Julius

    2016-02-01

    This paper presents a simple method for estimating the impact of assimilating an individual observation or a group of observations on forecast accuracy improvement. This method is derived from the ensemble-based observation impact analysis method of Liu and Kalnay (Q J R Meteorol Soc 134:1327-1335, 2008). The method described here differs from theirs in two ways. Firstly, it uses a quadratic function of model-minus-observation residuals as a measure of forecast accuracy, instead of model-minus-analysis. Secondly, it simply makes use of time series of observations and the corresponding model output generated without data assimilation. These time series are usually available in an operational database, so the method is simple to implement. It can be used before any data assimilation is implemented, and is therefore useful as a design tool for a data assimilation system, namely for selecting which observations to assimilate. The method can also be used as a diagnostic tool, for example, to assess whether all observations contribute positively to the accuracy improvement. The method is applicable to systems with a stationary error process and a fixed observing network. Using twin experiments with a simple one-dimensional advection model, the method is shown to work perfectly in an idealized situation. The method is used to evaluate the observation impact in the operational storm surge forecasting system based on the Dutch Continental Shelf Model version 5 (DCSMv5).
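    The quadratic accuracy measure described in the first point can be sketched as follows (illustrative numbers, not DCSMv5 output): the estimated impact is the reduction in the mean squared model-minus-observation residual between a free run and an assimilating run.

```python
import numpy as np

def forecast_cost(forecast, obs):
    """Quadratic accuracy measure: mean squared model-minus-observation residual."""
    r = np.asarray(forecast) - np.asarray(obs)
    return float(np.mean(r**2))

obs         = np.array([1.0, 1.2, 0.9, 1.1])   # observed water levels (made-up)
free_run    = np.array([1.5, 1.8, 0.2, 1.6])   # model output without assimilation
assimilated = np.array([1.1, 1.3, 0.8, 1.2])   # model output with assimilation

impact = forecast_cost(free_run, obs) - forecast_cost(assimilated, obs)
# impact > 0 -> assimilating the observations improved forecast accuracy
```

    Note that the free-run cost needs only the observation archive and the model output generated without assimilation, which is why the paper can evaluate candidate observations before any assimilation system exists.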

  18. One-step green synthetic approach for the preparation of multicolor emitting copper nanoclusters and their applications in chemical species sensing and bioimaging.

    Science.gov (United States)

    Bhamore, Jigna R; Jha, Sanjay; Mungara, Anil Kumar; Singhal, Rakesh Kumar; Sonkeshariya, Dhanshri; Kailasa, Suresh Kumar

    2016-06-15

    A one-step green microwave synthetic approach was developed for the synthesis of copper nanoclusters (Cu NCs), which were used as a fluorescent probe for the sensitive detection of thiram and paraquat in water and food samples. Unexpectedly, the prepared Cu NCs exhibited strong orange fluorescence with an emission peak at 600 nm. Under optimized conditions, the quenching of the Cu NCs emission peak at 600 nm was linearly proportional to thiram and paraquat concentrations in the ranges 0.5-1000 µM and 0.2-1000 µM, with detection limits of 70 nM and 49 nM, respectively. In addition, bioimaging studies against Bacillus subtilis by confocal fluorescence microscopy indicated that the Cu NCs showed strong blue and green fluorescence signals, good permeability and minimal toxicity against various bacterial species, which demonstrates their feasibility for chemical species sensing and bioimaging applications. PMID:26851582

  19. On spectral methods for variance based sensitivity analysis

    OpenAIRE

    Alexanderian, Alen

    2013-01-01

    Consider a mathematical model with a finite number of random parameters. Variance based sensitivity analysis provides a framework to characterize the contribution of the individual parameters to the total variance of the model response. We consider the spectral methods for variance based sensitivity analysis which utilize representations of square integrable random variables in a generalized polynomial chaos basis. Taking a measure theoretic point of view, we provide a rigorous and at the sam...

  20. New analysis method for Ccd X-ray data

    Energy Technology Data Exchange (ETDEWEB)

    Ishiwatari, T.; Cargnelli, M.; Fuhrmann, H.; Kienle, P.; Marton, J.; Zmeskal, J. [Stefan Meyer Institut fuer subatomare Physik, Boltzmanngasse 3, A-1090 Vienna (Austria); Beer, G. [Department of Physics and Astronomy, University of Victoria, P. O. Box 3055 Victoria B. C. V8W 3P6 (Canada); Bragadireanu, A.M.; Curceanu, C.; Iliescu, M.; Sirghi, D.L. [INFN, Laboratori Nazionali di Frascati, C. P. 13, Via E. Fermi 40, I-00044 Frascati (Italy)]|[Institute of Physics and Nuclear Engineering Horia Hulubei, IFIN-HH, Particle Physics Department, P.O. Box MG-6, R-76900 Magurele, Bucharest (Romania); Egger, J.-P. [Institute de Physique, Universite de Neuchatel, 1 rue A.-L. Breguet, CH-2000 Neuchatel (Switzerland); Guaraldo, C. [INFN, Laboratori Nazionali di Frascati, C. P. 13, Via E. Fermi 40, I-00044 Frascati (Italy)]. E-mail: carlo.guaraldo@lnf.infn.it; Itahashi, K.; Strasser, P. [RIKEN Wako Institute, RIKEN, Institute of Physical and Chemical Research, Wako-shi, Saitama, 351-0198 (Japan); Iwasaki, M. [RIKEN Wako Institute, RIKEN, Institute of Physical and Chemical Research, Wako-shi, Saitama, 351-0198 (Japan); Lauss, B. [Department of Physics, LeConte Hall 366, University of California, Berkeley, CA 94720 (United States); Lucherini, V. [INFN, Laboratori Nazionali di Frascati, C. P. 13, Via E. Fermi 40, I-00044 Frascati (Italy); Ludhova, L. [Physics Department, University of Fribourg, CH-1700 Fribourg (Switzerland); Mulhauser, F. [Physics Department, University of Fribourg, CH-1700 Fribourg (Switzerland); Ponta, T. [Institute of Physics and Nuclear Engineering Horia Hulubei, IFIN-HH, Particle Physics Department, P.O. Box MG-6, R-76900 Magurele, Bucharest (Romania); Schaller, L.A. [Physics Department, University of Fribourg, CH-1700 Fribourg (Switzerland); Sirghi, F. [INFN, Laboratori Nazionali di Frascati, C. P. 13, Via E. Fermi 40, I-00044 Frascati (Italy)

    2006-01-15

    The analysis method developed for kaonic nitrogen X-ray data obtained at the DAΦNE electron-positron collider of the Frascati National Laboratories using Charge-Coupled Devices (CCDs) in the DEAR experimental setup is described. Background events could be strongly rejected by this analysis procedure. Three sequential X-ray lines from kaonic nitrogen transitions, showing good energy resolution, could be clearly identified, and their yields measured for the first time.

  1. Baltic sea algae analysis using Bayesian spatial statistics methods

    OpenAIRE

    Eglė Baltmiškytė; Kęstutis Dučinskas

    2013-01-01

    Spatial statistics is the field of statistics dealing with the analysis of spatially distributed data. Recently, Bayesian methods have often been applied for statistical data analysis. A spatial data model for predicting algae quantity in the Baltic Sea is made and described in this article. Black Carrageen is the dependent variable and depth, sand, pebble and boulders are the independent variables in the described model. Two models with different covariance functions (Gaussian and exponential) are built to estima...

  2. FDTD method based electromagnetic solver for printed-circuit analysis

    OpenAIRE

    Gnilenko, Alexey B.; Paliy, Oleg V.

    2003-01-01

    An electromagnetic solver for printed-circuit analysis is presented. The electromagnetic simulator is based on the finite-difference time-domain method with first-order Mur's absorbing boundary conditions. The solver environment comprises a layout graphic editor for circuit topology preparation and a data postprocessor for presenting the calculation results. The solver has been applied to the analysis of printed-circuit components such as printed antenna, microstrip discontinuities, etc.

  3. Project risk simulation methods – a comparative analysis

    OpenAIRE

    Constanța-Nicoleta BODEA; Augustin PURNUȘ

    2012-01-01

    Effective risk management provides a solid basis for decision-making in projects, bringing important benefits. While the financial and economic crisis is present at the global level and competition in the market is more and more aggressive, interest in project risk management increases. The paper presents a comparative analysis of the effectiveness of two quantitative risk analysis methods, Monte Carlo simulation and the Three Scenario approach. Two experiments are designed based on ...

  4. MANNER OF STOCKS SORTING USING CLUSTER ANALYSIS METHODS

    Directory of Open Access Journals (Sweden)

    Jana Halčinová

    2014-06-01

    Full Text Available The aim of the present article is to show the possibility of using cluster analysis methods in the classification of stocks of finished products. Cluster analysis creates groups (clusters) of finished products according to similarity in demand, i.e. customer requirements for each product. The manner of sorting stocks of finished products into clusters is described with a practical example. The resulting clusters are incorporated into the draft layout of the distribution warehouse.
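    A minimal k-means sketch illustrates the grouping step (one common cluster-analysis choice; the article does not prescribe a specific algorithm, and the demand figures below are invented): products described by demand features fall into fast-moving and slow-moving clusters.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain Lloyd's algorithm: assign points to nearest center, then recenter."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(X[:, None] - centers[None], axis=2)  # point-center distances
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

# Products described by (mean weekly demand, demand variability) -- made-up data.
X = np.array([[100., 5.], [95., 6.], [102., 4.],   # fast movers
              [10., 2.],  [12., 3.],  [ 8., 2.]])  # slow movers
labels, centers = kmeans(X, k=2)
```

    The resulting labels then drive the warehouse layout: products in the same cluster share similar customer demand and can be slotted together.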

  5. Image analysis methods for gamma-hadron separation

    International Nuclear Information System (INIS)

    Gamma-hadron separation is essential in VHE gamma-ray astronomy. In order to separate gamma-ray- from proton-induced air shower images obtained with the H.E.S.S. imaging atmospheric Cherenkov telescopes, image analysis methods are applied to these camera images. Different classifiers are evaluated in a multivariate analysis framework to test the combined separation power and to check for correlations. The results are presented here.

  6. Neutrosophy, a possible method of process analysis uncertainties solving

    OpenAIRE

    Daniela Gîfu; Mirela Teodorescu

    2014-01-01

    This paper presents the importance of neutrosophy theory in finding a method that could resolve the uncertainties arising in process analysis. The aim of this pilot study is to find a procedure to diminish the uncertainties in the automotive industry induced by manufacturing, maintenance, logistics, design, and human resources. We consider Neutrosophy Theory a specific case of sentiment analysis regarding the processing of three states: positive, negative and neutral. The study is intended ...

  7. Improvement of human reliability analysis method for PRA

    International Nuclear Information System (INIS)

    It is required to refine the human reliability analysis (HRA) method by, for example, incorporating consideration of the operator's cognitive process into the evaluation of diagnosis errors and decision-making errors, as part of the development and improvement of methods used in probabilistic risk assessments (PRAs). JNES has developed an HRA method based on ATHENA which is suitable for handling the structured relationship among diagnosis errors, decision-making errors and the operator cognition process. This report summarizes outcomes obtained from the improvement of the HRA method, in which enhancements were made to evaluate how the degraded plant condition affects the operator cognitive process and to evaluate human error probabilities (HEPs) corresponding to the contents of operator tasks. In addition, this report describes the results of case studies on representative accident sequences to investigate the applicability of the HRA method developed. HEPs of the same accident sequences are also estimated using the THERP method, the most widely used HRA method, and the results obtained using these two methods are compared to depict their differences and the issues to be solved. Important conclusions obtained are as follows: (1) Improvement of the HRA method using an operator cognitive action model. Clarification of the factors to be considered in the evaluation of human errors, incorporation of the degraded plant safety condition into HRA, and investigation of HEPs affected by the contents of operator tasks were made to improve the HRA method, which integrates the operator cognitive action model into the ATHENA method. In addition, the detailed procedure of the improved method was delineated in the form of a flowchart. (2) Case studies and comparison with the results evaluated by the THERP method. Four operator actions modeled in the PRAs of representative BWR5 and 4-loop PWR plants were selected and evaluated as case studies.
These cases were also evaluated using

  8. Baltic sea algae analysis using Bayesian spatial statistics methods

    Directory of Open Access Journals (Sweden)

    Eglė Baltmiškytė

    2013-03-01

    Full Text Available Spatial statistics is the field of statistics dealing with the analysis of spatially distributed data. Recently, Bayesian methods have often been applied for statistical data analysis. A spatial data model for predicting algae quantity in the Baltic Sea is made and described in this article. Black Carrageen is the dependent variable and depth, sand, pebble and boulders are the independent variables in the described model. Two models with different covariance functions (Gaussian and exponential) are built to estimate the model best fitting for algae quantity prediction. Unknown model parameters are estimated and the Bayesian kriging prediction posterior distribution is computed in the OpenBUGS modeling environment using Bayesian spatial statistics methods.
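    The two covariance functions compared in the article can be written down directly (a sketch with generic sill `sigma2` and range `phi` parameters, which are assumptions here, not the article's fitted values):

```python
import numpy as np

def gaussian_cov(h, sigma2=1.0, phi=1.0):
    """Gaussian covariance: very smooth, decays like exp(-(h/phi)^2) with distance h."""
    return sigma2 * np.exp(-(h / phi) ** 2)

def exponential_cov(h, sigma2=1.0, phi=1.0):
    """Exponential covariance: rougher process, decays like exp(-h/phi)."""
    return sigma2 * np.exp(-h / phi)
```

    Both equal the sill sigma2 at zero lag; the Gaussian model drops off much faster at moderate distances, which is the practical difference the article's model comparison probes.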

  9. Open Rotor Computational Aeroacoustic Analysis with an Immersed Boundary Method

    Science.gov (United States)

    Brehm, Christoph; Barad, Michael F.; Kiris, Cetin C.

    2016-01-01

    Reliable noise prediction capabilities are essential to enable novel fuel-efficient open rotor designs that can meet community and cabin noise standards. Toward this end, immersed boundary methods have reached a level of maturity where more and more complex flow problems can be tackled with this approach. This paper demonstrates that our higher-order immersed boundary method provides the ability for aeroacoustic analysis of wake-dominated flow fields generated by a contra-rotating open rotor. This is a first-of-its-kind aeroacoustic simulation of an open rotor propulsion system employing an immersed boundary method. In addition to discussing the methodology of applying the immersed boundary method to this moving boundary problem, we provide a detailed validation of the aeroacoustic analysis approach employing the Launch Ascent and Vehicle Aerodynamics (LAVA) solver. Two free-stream Mach numbers, M=0.2 and M=0.78, corresponding to the nominal take-off and cruise flow conditions, are considered in this analysis. The simulation data are compared to available experimental data and other computational results employing more conventional CFD methods. Spectral analysis is used to determine the dominant wave propagation pattern in the acoustic near-field.

  10. Study on color difference estimation method of medicine biochemical analysis

    Science.gov (United States)

    Wang, Chunhong; Zhou, Yue; Zhao, Hongxia; Sun, Jiashi; Zhou, Fengkun

    2006-01-01

    Biochemical analysis is an important inspection and diagnosis method in hospital clinics, and the biochemical analysis of urine is one important item. Urine test paper shows a corresponding color for each detection project or illness degree. The color difference between the standard threshold and the test paper color can be used to judge the illness degree, so that further analysis and diagnosis of the urine are obtained. Color is a three-dimensional physical variable concerning psychology, while reflectance is a one-dimensional variable; therefore, the color difference estimation method in urine testing can have better precision and facility than the conventional test method based on one-dimensional reflectance, and it can yield an accurate diagnosis. A digital camera easily takes an image of urine test paper and is used to carry out the urine biochemical analysis conveniently. In the experiment, the color image of urine test paper is taken by a popular color digital camera and saved in a computer on which a simple color space conversion (RGB -> XYZ -> L*a*b*) and the calculation software are installed. The test sample is graded according to intelligent detection of quantitative color. The images taken every time are saved in the computer, so the whole illness process can be monitored. This method can also be used in other medical biochemical analyses that are related to color. Experimental results show that this test method is quick and accurate; it can be used in hospitals, calibration organizations and families, so its application prospects are extensive.
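    The RGB -> XYZ -> L*a*b* pipeline mentioned above can be sketched with the standard sRGB/D65 formulas (an illustration of the conversion chain, not the authors' software), with the CIE76 Euclidean distance as the colour difference:

```python
import math

def srgb_to_lab(r, g, b):
    """Convert 8-bit sRGB to CIE L*a*b* (D65 white) via RGB -> XYZ -> L*a*b*."""
    def lin(c):                      # undo the sRGB gamma encoding
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    rl, gl, bl = lin(r), lin(g), lin(b)
    x = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl   # sRGB -> XYZ (D65) matrix
    y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
    z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl
    def f(t):                        # CIE nonlinearity
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    fx, fy, fz = f(x / 0.95047), f(y / 1.0), f(z / 1.08883)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

def delta_e(lab1, lab2):
    """CIE76 colour difference between two L*a*b* triples."""
    return math.dist(lab1, lab2)

white = srgb_to_lab(255, 255, 255)   # approximately (100, 0, 0)
black = srgb_to_lab(0, 0, 0)         # (0, 0, 0)
```

    Grading a test strip then reduces to computing delta_e between the photographed patch and each standard threshold colour and picking the nearest one.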

  11. Comparative study of analysis methods in biospeckle phenomenon

    Science.gov (United States)

    da Silva, Emerson Rodrigo; Muramatsu, Mikiya

    2008-04-01

    In this work we present a review of the main statistical properties of speckle patterns and carry out a comparative study of the methods most used for analysis and extraction of information from optical speckle. The first- and second-order space-time statistics are discussed from an overview perspective. The biospeckle phenomenon receives detailed attention, especially in its application to the monitoring of activity in tissues. The main techniques used to obtain information from speckle patterns are presented, with special prominence given to the autocorrelation function, co-occurrence matrices, Fujii's method, Briers' contrast, and spatial and temporal contrast analysis (LASCA and LASTCA). An incipient method of analysis, based on the study of successive correlation contrasts, is introduced. Numerical simulations, using different probability density functions for the velocities of scatterers, were made with two objectives: to test the analysis methods and to give subsidies for the interpretation of in vivo results. Vegetable and animal tissues are investigated, achieving the monitoring of the senescence process and vascularization maps on leaves, the monitoring of fungus-contaminated fruits, the mapping of activity in flowers and the analysis of healing in rats subjected to abdominal surgery. Experiments using the biospeckle phenomenon in microscopy are carried out. Finally, the potential of biospeckle as a diagnostic tool for chronic vein ulcers treated with low-intensity laser therapy is evaluated, and the best analysis methods for each kind of tissue are pointed out.
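    Of the techniques listed, LASCA is the simplest to sketch: the spatial speckle contrast K = sigma/mean computed over small sliding windows (a naive illustration of the definition, not an optimized implementation):

```python
import numpy as np

def lasca_contrast(img, w=3):
    """Spatial speckle contrast K = sigma/mean over w-by-w windows (LASCA)."""
    H, W = img.shape
    K = np.zeros((H - w + 1, W - w + 1))
    for i in range(H - w + 1):
        for j in range(W - w + 1):
            win = img[i:i + w, j:j + w]
            K[i, j] = win.std() / win.mean()   # high activity blurs speckle -> low K
    return K

flat = np.ones((6, 6))
K = lasca_contrast(flat)   # a uniform image has zero contrast everywhere
```

    In biospeckle use, regions of high tissue activity blur the speckle during the exposure and show up as low-K areas in the contrast map.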

  12. Maximum-entropy-method analysis of neutron diffraction data

    International Nuclear Information System (INIS)

    The maximum-entropy method (MEM) is a very powerful method for deriving accurate electron-density distributions from X-ray diffraction data. The success of the method depends on the fact that the electron density is always positive. In order to analyse neutron diffraction data by the MEM, it is necessary to overcome the difficulty of negative scattering lengths for some atoms, such as Ti and Mn. In this work, three approaches to the MEM analysis of neutron powder diffraction data are examined. The data, from rutile (TiO2), have been collected previously and analysed by the Rietveld method. The first approach is to add an artificial large constant to the scattering-length density to maintain that density positive, then to subtract the same constant at the completion of the MEM analysis. (orig.)

  13. Inverse thermal analysis method to study solidification in cast iron

    DEFF Research Database (Denmark)

    Dioszegi, Atilla; Hattel, Jesper

    2004-01-01

    Solidification modelling of cast metals is widely used to predict final properties in cast components. Accurate models necessitate good knowledge of the solidification behaviour. The present study includes a re-examination of the Fourier thermal analysis method. This involves an inverse numerical solution of a 1-dimensional heat transfer problem connected to solidification of cast alloys. In the analysis, the relation between the thermal state and the fraction solid of the metal is evaluated by a numerical method. This method contains an iteration algorithm controlled by an under-relaxation term to achieve a stable convergence. The heat transfer problem is reduced to 1 dimension to promote the practical application of the method. Thermo-physical properties such as the volumetric heat capacity tabulated in the calculation are introduced as a function of solidifying phases. Experimental equipment was...
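    The under-relaxation idea used in the iteration algorithm can be illustrated generically (a toy fixed-point problem, not the casting model itself): each update moves only a fraction omega of the way toward the new estimate, damping oscillations so the iteration converges stably.

```python
import math

def relaxed_fixed_point(g, x0, omega=0.5, tol=1e-10, max_iter=1000):
    """Fixed-point iteration x_{n+1} = (1 - omega)*x_n + omega*g(x_n).
    The under-relaxation factor omega < 1 damps oscillations to stabilise convergence."""
    x = x0
    for _ in range(max_iter):
        x_new = (1 - omega) * x + omega * g(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

root = relaxed_fixed_point(math.cos, x0=1.0)   # solves x = cos(x)
```

    In the inverse thermal analysis, g would be the update mapping the current fraction-solid estimate to a new one from the measured cooling curve; the same relaxation term keeps that loop from diverging.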

  14. Reliability analysis using an enhanced response surface moment method

    International Nuclear Information System (INIS)

    Moment methods, which are powerful and simple techniques for analyzing the reliability of a system, evaluate the statistical moments of a system response function and use information from the probability distribution in the analysis. The full factorial moment method (FFMM) performs reliability analysis by using a 3^n full factorial design of experiments (DOE) and the Pearson system for random variables. To overcome the inefficiency of FFMM, the response surface moment method (RSMM) has been proposed, which is based on a response surface model (RSM) that is updated by adding cross-product terms to the simple quadratic model. In this paper, we propose the enhanced RSMM (RSMM+), which modifies the procedure for selecting a cross-product term in RSMM and adds a step that judges whether the response surface model can be established before performing an additional experiment. We apply the proposed method to several examples and show that it gives better results in efficiency

  15. Variable cluster analysis method for building neural network model

    Institute of Scientific and Technical Information of China (English)

    王海东; 刘元东

    2004-01-01

    To address the problems that input variables should be reduced as much as possible while still fully explaining the output variables when building a neural network model of a complicated system, a variable selection method based on cluster analysis was investigated. A similarity coefficient describing the mutual relation of variables was defined. The methods of highest contribution rate, part replacing whole, and variable replacement are put forward and deduced from information theory. Software for neural networks based on cluster analysis, which can provide many kinds of methods for defining the variable similarity coefficient, clustering system variables and evaluating variable clusters, was developed and applied to build a neural network forecast model of cement clinker quality. The results show that the network scale, training time and prediction accuracy are all satisfactory. The practical application demonstrates that this method of selecting variables for neural networks is feasible and effective.

  16. Highly Efficient Far Red/Near-Infrared Solid Fluorophores: Aggregation-Induced Emission, Intramolecular Charge Transfer, Twisted Molecular Conformation, and Bioimaging Applications.

    Science.gov (United States)

    Lu, Hongguang; Zheng, Yadan; Zhao, Xiaowei; Wang, Lijuan; Ma, Suqian; Han, Xiongqi; Xu, Bin; Tian, Wenjing; Gao, Hui

    2016-01-01

    The development of organic fluorophores with efficient solid-state emissions or aggregated-state emissions in the red to near-infrared region is still challenging. Reported herein are fluorophores having aggregation-induced emission ranging from the orange to far red/near-infrared (FR/NIR) region. The bioimaging performance of the designed fluorophore is shown to have potential as FR/NIR fluorescent probes for biological applications. PMID:26576818

  17. Properties and linkages of some index decomposition analysis methods

    Energy Technology Data Exchange (ETDEWEB)

    Ang, B.W. [Department of Industrial and Systems Engineering, National University of Singapore (Singapore); Energy Studies Institute, National University of Singapore (Singapore); Huang, H.C.; Mu, A.R. [Department of Industrial and Systems Engineering, National University of Singapore (Singapore)

    2009-11-15

    We study the properties and linkages of some popular index decomposition analysis (IDA) methods in energy and carbon emission analyses. Specifically, we introduce a simple relationship between the arithmetic mean Divisia index (AMDI) method and the logarithmic mean Divisia index method I (LMDI I), and show that such a relationship can be extended to cover most IDA methods linked to the Divisia index. We also formalize the relationship between the Laspeyres index method and the Shapley value in the IDA context. Similarly, such a relationship can be extended to cover other IDA methods linked to the Laspeyres index through defining the characteristic function in the Shapley value. It is found that these properties and linkages apply to decomposition of changes conducted additively. Similar properties and linkages cannot be established in the multiplicative case. The implications of the findings on IDA studies are discussed. (author)
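
    The additive LMDI I referred to above has a compact closed form: each factor's effect is a logarithmic-mean-weighted sum of log-changes, and the effects sum exactly to the total change (the "perfect decomposition" property established for the additive case). A minimal sketch with hypothetical two-sector data:

```python
import math

def logmean(a, b):
    """Logarithmic mean L(a, b) = (a - b) / (ln a - ln b), with L(a, a) = a."""
    return a if math.isclose(a, b) else (a - b) / (math.log(a) - math.log(b))

def lmdi_additive(v0, vT):
    """Additive LMDI I for an aggregate V = sum_i prod_k x_ik:
    effect of factor k = sum_i L(V_i^T, V_i^0) * ln(x_ik^T / x_ik^0)."""
    V0 = [math.prod(x) for x in v0]
    VT = [math.prod(x) for x in vT]
    return [sum(logmean(VT[i], V0[i]) * math.log(vT[i][k] / v0[i][k])
                for i in range(len(v0)))
            for k in range(len(v0[0]))]

# Two sectors, two factors (say activity and energy intensity); hypothetical data.
v0 = [(100.0, 0.5), (50.0, 0.8)]
vT = [(120.0, 0.4), (60.0, 0.9)]
effects = lmdi_additive(v0, vT)
total = sum(math.prod(x) for x in vT) - sum(math.prod(x) for x in v0)
print(effects, total)  # the two effects add up to the total change
```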

  18. Properties and linkages of some index decomposition analysis methods

    International Nuclear Information System (INIS)

    We study the properties and linkages of some popular index decomposition analysis (IDA) methods in energy and carbon emission analyses. Specifically, we introduce a simple relationship between the arithmetic mean Divisia index (AMDI) method and the logarithmic mean Divisia index method I (LMDI I), and show that such a relationship can be extended to cover most IDA methods linked to the Divisia index. We also formalize the relationship between the Laspeyres index method and the Shapley value in the IDA context. Similarly, such a relationship can be extended to cover other IDA methods linked to the Laspeyres index through defining the characteristic function in the Shapley value. It is found that these properties and linkages apply to decomposition of changes conducted additively. Similar properties and linkages cannot be established in the multiplicative case. The implications of the findings on IDA studies are discussed.

  19. Quaternion-based discriminant analysis method for color face recognition.

    Science.gov (United States)

    Xu, Yong

    2012-01-01

    Pattern recognition techniques have been used to automatically recognize objects and personal identities, predict the function of a protein or the category of a cancer, identify lesions, perform product inspection, and so on. In this paper we propose a novel quaternion-based discriminant method that represents and classifies color images in a simple and mathematically tractable way. The proposed method is suitable for a wide variety of real-world applications, such as color face recognition and classification of ground targets shown in multispectral remote-sensing images. The method first uses a quaternion number to denote each pixel in the color image and exploits a quaternion vector to represent the image as a whole. It then uses a linear discriminant analysis algorithm to transform the quaternion vector into a lower-dimensional quaternion vector and classifies it in this space. Experimental results show that the proposed method obtains very high accuracy for color face recognition. PMID:22937054
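
    A simplified sketch of the representation step: each pixel becomes the pure quaternion R*i + G*j + B*k, and the image a quaternion vector. For the discriminant step, the sketch falls back to an ordinary two-class Fisher discriminant on the real embedding of that vector (the paper's quaternion-algebra LDA is more involved); all data are synthetic.

```python
import numpy as np

def image_to_quaternion_vector(img):
    """Encode an RGB image as a quaternion vector: each pixel becomes the
    pure quaternion 0 + R*i + G*j + B*k (stored as 4 real components)."""
    h, w, _ = img.shape
    q = np.zeros((h * w, 4))
    q[:, 1:] = img.reshape(-1, 3)  # real part stays zero
    return q.ravel()

def fisher_direction(X1, X2):
    """Two-class Fisher discriminant on the real embedding of the
    quaternion vectors: w = Sw^-1 (m1 - m2), with a small ridge term."""
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    Sw = np.cov(X1, rowvar=False) + np.cov(X2, rowvar=False)
    return np.linalg.solve(Sw + 1e-6 * np.eye(Sw.shape[0]), m1 - m2)

rng = np.random.default_rng(1)
# Tiny 2x2 synthetic "images" from two hypothetical classes (reddish vs bluish).
X1 = np.array([image_to_quaternion_vector(rng.random((2, 2, 3)) * [1.0, 0.2, 0.2])
               for _ in range(20)])
X2 = np.array([image_to_quaternion_vector(rng.random((2, 2, 3)) * [0.2, 0.2, 1.0])
               for _ in range(20)])
w = fisher_direction(X1, X2)
p1, p2 = X1 @ w, X2 @ w  # projections of the two classes
print(p1.mean() - p2.mean() > 0)  # True: the class means separate along w
```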

  20. Probabilistic safety analysis : a new nuclear power plants licensing method

    International Nuclear Information System (INIS)

    After a brief retrospective of the application of Probabilistic Safety Analysis in the nuclear field, the basic differences between the deterministic licensing method currently in use and the probabilistic method are explained. Next, the two main proposals (by the AIF and the ACRS) concerning the establishment of so-called quantitative safety goals (or simply 'safety goals') are presented separately and then compared in their most fundamental aspects. Finally, some recent applications and future possibilities are discussed. (Author)

  1. Statistical methods for genetic association analysis involving complex longitudinal data

    OpenAIRE

    Salem, Rany Mansour

    2009-01-01

    Most, if not all, human phenotypes exhibit a temporal, dosage-dependent, or age effect. In this work, I explore and showcase the use of different analytical methods for assessing the genetic contribution to traits with temporal trends, or what I refer to as 'dynamic complex traits' (DCTs). The study of DCTs could offer insights into disease pathogenesis that are not achievable in other research settings. I describe the development and application of a method of DCT analysis termed 'Curve-Based ...

  2. Iteration Complexity Analysis of Block Coordinate Descent Methods

    OpenAIRE

    Hong, Mingyi; Wang, Xiangfeng; Razaviyayn, Meisam; Luo, Zhi-Quan

    2013-01-01

    In this paper, we provide a unified iteration complexity analysis for a family of general block coordinate descent (BCD) methods, covering popular methods such as the block coordinate gradient descent (BCGD) and the block coordinate proximal gradient (BCPG), under various different coordinate update rules. We unify these algorithms under the so-called Block Successive Upper-bound Minimization (BSUM) framework, and show that for a broad class of multi-block nonsmooth convex problems, all algor...
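
    The BCD family analyzed above can be illustrated on the simplest smooth convex case: a quadratic minimized by cyclic exact block updates (block Gauss-Seidel). This is a toy instance for intuition, not the BSUM framework itself.

```python
import numpy as np

def block_coordinate_descent(A, b, blocks, sweeps=100):
    """Cyclic exact BCD for the strongly convex quadratic
    f(x) = 0.5*x'Ax - b'x: each step minimizes f exactly over one block
    of coordinates with the other blocks held fixed."""
    x = np.zeros(len(b))
    for _ in range(sweeps):
        for blk in blocks:
            idx = np.asarray(blk)
            rest = np.setdiff1d(np.arange(len(b)), idx)
            # Exact block minimizer: A[blk,blk] x_blk = b[blk] - A[blk,rest] x_rest
            rhs = b[idx] - A[np.ix_(idx, rest)] @ x[rest]
            x[idx] = np.linalg.solve(A[np.ix_(idx, idx)], rhs)
    return x

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])  # symmetric positive definite
b = np.array([1.0, 2.0, 3.0])
x = block_coordinate_descent(A, b, blocks=[[0, 1], [2]])
print(np.allclose(A @ x, b))  # converges to the unconstrained minimizer
```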

  3. Methods and techniques for bio-system's materials behaviour analysis

    OpenAIRE

    MITU, LEONARD GABRIEL

    2014-01-01

    In the context of the rapid development of research on biosystem structural materials, a representative direction is the analysis of the behavior of these materials. This direction of research requires various means and methods of theoretical and experimental measurement and evaluation. The PhD thesis "Methods and means for analyzing the behavior of biosystems structure materials" addresses this area of research, aiming to study the behavior of poly...

  4. Modified method of analysis for surgical correction of facial asymmetry

    OpenAIRE

    Christou, Terpsithea; Kau, Chung How; Waite, Peter D.; Kheir, Nadia Abou; Mouritsen, David

    2013-01-01

    Introduction: The aim of this article is to present a new method of analysis, using a three-dimensional (3D) model of an actual patient with facial asymmetry, for the assessment of her facial changes and the quantification of the deformity. This patient underwent orthodontic and surgical treatment to correct a severe facial asymmetry. Materials and Methods: The surgical procedure was complex and the case was challenging. The treatment required an orthodontic approach followed by Le ...

  5. Probabilistic structural analysis methods for critical SSME propulsion components

    Science.gov (United States)

    Chamis, C. C.

    1986-01-01

    The development of a three-dimensional inelastic analysis methodology for the Space Shuttle main engine (SSME) structural components is described. The methodology is composed of: (1) composite load spectra, (2) probabilistic structural analysis methods, (3) the probabilistic finite element theory, and (4) probabilistic structural analysis. Progress in the development of generic probabilistic models for various individual loads, which consist of a steady-state load, a periodic load, a random load, and a spike, is discussed. The capabilities of the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) finite element code, designed for probabilistic structural analysis of the SSME, are examined. Variational principles for formulating probabilistic finite elements and a structural analysis for evaluating the effect of geometric and material property tolerances on the structural response of turbopump blades are being developed.

  6. On the methods and examples of aircraft impact analysis

    International Nuclear Information System (INIS)

    Conclusions: Aircraft impact analysis can be performed today within feasible run times using PCs and available advanced commercial finite element software tools. Adequate element and material model technologies exist. Explicit time integration enables analysis of very large deformation missile/target impacts. Meshless/particle-based methods may be beneficial for large-deformation concrete “punching shear” analysis, since they potentially solve the “element erosion” problem associated with finite elements, but they are not yet generally implemented in major commercial software. Verification of the complicated modeling technologies continues to be a challenge. Not much work has been done yet on ACI shock loading; redundant and physically separated safety trains are key to success. The analysis approach and level of detail should be “balanced”, commensurate with the significant uncertainties: do not “over-do” details of some parts of the model (e.g., the plane) or of the analysis.

  7. Validation study of core analysis methods for full MOX BWR

    International Nuclear Information System (INIS)

    JNES has been developing a technical database to be used in reviewing validation of core analysis methods of LWRs on the coming occasions: (1) confirming the core safety parameters from the initial core (one-third MOX core) through a full MOX core in Oma Nuclear Power Plant, which is under construction, (2) licensing high-burnup MOX cores in the future and (3) reviewing topical reports on core analysis codes for safety design and evaluation. Based on the technical database, JNES will issue a guide for reviewing the core analysis methods used for safety design and evaluation of LWRs. The database will also be used for validation and improvement of core analysis codes developed by JNES. JNES has progressed with the projects: (1) improving a Doppler reactivity analysis model in the Monte Carlo calculation code MVP, (2) a sensitivity study of nuclear cross-section data on reactivity calculations of experimental cores composed of UO2 and MOX fuel rods, (3) analysis of isotopic composition data for UO2 and MOX fuels and (4) the guide for reviewing the core analysis codes, among others. (author)

  8. Validation study of core analysis methods for full MOX BWR

    International Nuclear Information System (INIS)

    JNES has been developing a technical database to be used in reviewing validation of core analysis methods of LWRs on the coming occasions: (1) confirming the core safety parameters from the initial core (one-third MOX core) through a full MOX core in Oma Nuclear Power Plant, which is under construction, (2) licensing high-burnup MOX cores in the future and (3) reviewing topical reports on core analysis codes for safety design and evaluation. Based on the technical database, JNES will issue a guide for reviewing the core analysis methods used for safety design and evaluation of LWRs. The database will also be used for validation and improvement of core analysis codes developed by JNES. JNES has progressed with the projects: (1) analysis of the measurement data of Doppler reactivity in experimental MOX cores simulating LWR cores, (2) measurements of isotopic compositions of fission product nuclides in high-burnup BWR UO2 fuels and analysis of the measurement data, and (3) neutronics analysis of the experimental data obtained in international joint programs such as FUBILA and REBUS. (author)

  9. Comparative studies of upconversion luminescence characteristics and cell bioimaging based on one-step synthesized upconversion nanoparticles capped with different functional groups

    International Nuclear Information System (INIS)

    Herein, three types of upconverting NaGdF4:Yb/Er nanoparticles (UCNPs) were synthesized via one-step hydrothermal synthesis with polyethylene glycol (PEG), polyethylenimine (PEI) and 6-aminocaproic acid (6AA) functionalization. FTIR spectra and ζ-potentials were measured to confirm the successful capping of PEG, PEI and 6AA on the UCNPs. The regular morphology and cubic phase of the functionalized UCNPs were attributed to the capping effect of the surfactants. Tunable upconversion luminescence (UCL) from red to green was observed under 980 nm laser excitation, and the UCL tuning was attributed to the various surface ligands. Moreover, surface-group-dependent UCL bioimaging was performed in HeLa cells. The enhanced UCL bioimaging by the PEI-functionalized UCNPs evidenced high cell uptake, which is explained by the electrostatic attraction between the amino groups (–NH2) and the cell membrane. The functionalized UCNPs also demonstrated low cytotoxicity in the MTT assay. In addition, these UCNPs exhibited paramagnetic properties under a magnetic field. - Highlights: • Tunable upconversion emission by capped functional groups under fixed composition. • Surface-dependent upconversion luminescence bioimaging in HeLa cells. • Low cytotoxicity. • Additional paramagnetic property due to Gd3+ ions

  10. Further development of a static seismic analysis method for piping systems: The load coefficient method

    International Nuclear Information System (INIS)

    Currently the ASME Boiler and Pressure Vessel Code is considering a simplified static analysis method for seismic design of piping systems for incorporation into Appendix N of Section III, Division 1, of the Code. This proposed method, called the Load Coefficient Method, uses coefficients ranging from 0.4 to 1.0, applied to the peak value of the in-structure response spectra in a static analysis, to evaluate the response of piping systems to seismic events. The coefficient is a function of the pipe support spacing and hence of the frequency response of the system: in general, the greater the support spacing, the lower the frequency, the lower the spectral response, and hence the lower the coefficient. The results of Load Coefficient Method static analyses have been compared with analyses using the response spectrum modal analysis method. Reaction loads were also evaluated, with one important modification: a minimum support reaction load as a function of nominal pipe diameter has been established. This ensures that lightly loaded supports, regardless of the analytical method used, are designed to realistic loads, eliminating the potential for under-designed supports. With respect to the accelerations applicable to in-line components, a factor of 0.9 times the square root of the sum of squares of the horizontal floor spectra peaks was determined to envelop the horizontal accelerations, and a coefficient of 1.2 was shown to envelop the vertical accelerations. Presented in this paper are the current form of the Load Coefficient Method, a summary of the results of the over 2,700 benchmark analyses of piping system segments that form the basis for the acceptance of the method, and an explanation of the use of the method.
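
    The core calculation of such a load-coefficient approach reduces to a static load per support span. The sketch below is only the arithmetic skeleton; the coefficient values and the minimum-reaction floor are illustrative placeholders, not values from the Code's tables.

```python
def load_coefficient_static_load(weight_per_ft, span_ft, peak_accel_g,
                                 coefficient, min_reaction_lb=0.0):
    """Static seismic reaction at a pipe support in the load-coefficient
    style: coefficient x peak in-structure spectral acceleration x tributary
    weight, floored by a minimum reaction load."""
    tributary_weight = weight_per_ft * span_ft  # weight carried by this support
    reaction = coefficient * peak_accel_g * tributary_weight
    return max(reaction, min_reaction_lb)

# Illustrative: 20 lb/ft pipe on 10 ft spans, 1.5 g spectral peak,
# mid-range coefficient for wide spacing.
print(load_coefficient_static_load(20.0, 10.0, 1.5, coefficient=0.6,
                                   min_reaction_lb=100.0))
# A lightly loaded support gets floored at the minimum reaction load.
print(load_coefficient_static_load(5.0, 6.0, 1.5, coefficient=0.4,
                                   min_reaction_lb=100.0))
```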

  11. Application of the IPEBS method to dynamic contingency analysis

    Energy Technology Data Exchange (ETDEWEB)

    Martins, A.C.B. [FURNAS, Rio de Janeiro, RJ (Brazil); Pedroso, A.S. [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil)

    1994-12-31

    Dynamic contingency analysis is certainly a demanding task in the context of dynamic performance evaluation. This paper presents the results of a test checking the contingency screening capability of the IPEBS method. A Brazilian 1100-bus, 112-generator system was used in the test; the ranking of the contingencies based on critical clearing times obtained with IPEBS was compared with the ranking derived from detailed time-domain simulation. The results of this comparison encourage us to recommend the use of the method in industry applications, on a complementary basis to the current method of time-domain simulation. (author) 5 refs., 1 fig., 2 tabs.

  12. Differential algebraic method for aberration analysis of typical electrostatic lenses

    International Nuclear Information System (INIS)

    In this paper up to fifth-order geometric and third-order chromatic aberration coefficients of typical electrostatic lenses are calculated by means of the charged particle optics code, COSY INFINITY, based on the differential algebraic (DA) method. A two-tube immersion lens and a symmetric einzel lens have been chosen as two examples, whose axial potential distributions are numerically calculated by a FORTRAN program using the finite difference method. The DA results are in good agreement with those evaluated by the aberration integrals in electron optics. The DA method presented here can easily be extended to aberration analysis of other numerically computed electron lenses, including magnetic lenses

  13. Analysis of Stress Updates in the Material-point Method

    DEFF Research Database (Denmark)

    Andersen, Søren; Andersen, Lars

    2009-01-01

    The material-point method (MPM) is a new numerical method for the analysis of large-strain engineering problems. The MPM applies a dual formulation, where the state of the problem (mass, stress, strain, velocity, etc.) is tracked using a finite set of material points while the governing equations are solved on a background computational grid. Several references state that one of the main advantages of the material-point method is the easy application of complicated material behaviour, as the constitutive response is updated individually for each material point. However, as discussed here, the MPM way...

  14. Numerical analysis of jet breakup behavior using particle method

    International Nuclear Information System (INIS)

    A continuous jet changes to droplets where jet breakup occurs. In this study, two-dimensional numerical analysis of jet breakup is performed using the MPS method (Moving Particle Semi-implicit Method) which is a particle method for incompressible flows. The continuous fluid surrounding the jet is neglected. Dependencies of the jet breakup length on the Weber number and the Froude number agree with the experiment. The size distribution of droplets is in agreement with the Nukiyama-Tanasawa distribution which has been widely used as an experimental correlation. Effects of the Weber number and the Froude number on the size distribution are also obtained. (author)
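
    The Nukiyama-Tanasawa correlation mentioned above has the general form f(D) ∝ D^p exp(-b D^q). A small numerical check, with parameters chosen (p=2, q=1, b=1) so the normalization constant has the closed form Γ(3) = 2! = 2:

```python
import math

def nukiyama_tanasawa_pdf(d, p=2.0, q=1.0, b=1.0):
    """Unnormalised Nukiyama-Tanasawa drop-size density f(D) ~ D^p * exp(-b*D^q)."""
    return d ** p * math.exp(-b * d ** q)

def normalization(pdf, d_max=50.0, n=200_000):
    """Numerical normalisation constant over (0, d_max] by a Riemann sum."""
    h = d_max / n
    return sum(pdf(i * h) for i in range(1, n + 1)) * h

# For p=2, q=1, b=1 the density has a Gamma(3) shape whose integral is 2.
Z = normalization(nukiyama_tanasawa_pdf)
print(Z)  # close to 2.0
```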

  15. Numerical analysis of jet breakup behavior using particle method

    International Nuclear Information System (INIS)

    A continuous jet changes to droplets where jet breakup occurs. In this study, two-dimensional numerical analysis of jet breakup is performed using the MPS method (Moving Particle Semi-implicit Method) which is a particle method for incompressible flows. The continuous fluid surrounding the jet is neglected. Dependencies of the jet breakup length on the Weber number and the Froude number agree with the experiment. The size distribution of droplets is in agreement with the Nukiyama-Tanasawa distribution which has been widely used as an experimental correlation. Effects of the Weber number and the Froude number on the size distribution are also obtained. (author)

  16. Numerical analysis of jet breakup behavior using particle method

    International Nuclear Information System (INIS)

    A continuous jet changes to droplets where jet breakup occurs. In this study, two-dimensional numerical analysis of jet breakup is performed using the MPS method (Moving Particle Semi-implicit Method) which is a particle method for incompressible flows. The continuous fluid surrounding the jet is neglected. Dependencies of the jet breakup length on the Weber number and the Froude number agree with the experiment. The size distribution of droplets is in agreement with the Nukiyama-Tanasawa distribution that has been widely used as an experimental correlation. Effects of the Weber number and the Froude number on the size distribution are also obtained. (author)

  17. Wing analysis using a transonic potential flow computational method

    Science.gov (United States)

    Henne, P. A.; Hicks, R. M.

    1978-01-01

    The ability of the method to compute wing transonic performance was determined by comparing computed results with both experimental data and results computed by other theoretical procedures. Both pressure distributions and aerodynamic forces were evaluated. Comparisons indicated that the method is a significant improvement in transonic wing analysis capability. In particular, the computational method generally calculated the correct development of three-dimensional pressure distributions from subcritical to transonic conditions. Complicated, multiple shocked flows observed experimentally were reproduced computationally. The ability to identify the effects of design modifications was demonstrated both in terms of pressure distributions and shock drag characteristics.

  18. Performance Analysis of Unsupervised Clustering Methods for Brain Tumor Segmentation

    Directory of Open Access Journals (Sweden)

    Tushar H Jaware

    2013-10-01

    Full Text Available Medical image processing is one of the most challenging and emerging fields of neuroscience. The ultimate goal of medical image analysis in brain MRI is to extract important clinical features that would improve methods of diagnosis and treatment of disease. This paper focuses on methods to detect and extract brain tumours from brain MR images. MATLAB is used to design a software tool for locating brain tumours based on unsupervised clustering methods. The K-means clustering algorithm is implemented and tested on a database of 30 images. A performance evaluation of the unsupervised clustering methods is presented.
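
    A minimal sketch of the K-means intensity clustering described above, run on a synthetic slice rather than real MR data (the MATLAB tool itself is not reproduced here):

```python
import numpy as np

def kmeans_segment(image, k=3, iters=10):
    """Plain k-means on voxel intensities: unsupervised segmentation of a
    grayscale slice into k intensity classes. Centers are initialised
    evenly over the intensity range so the run is reproducible."""
    pixels = image.reshape(-1, 1).astype(float)
    centers = np.linspace(pixels.min(), pixels.max(), k)[:, None]
    for _ in range(iters):
        labels = np.argmin(np.abs(pixels - centers.T), axis=1)  # nearest center
        for j in range(k):
            if np.any(labels == j):
                centers[j] = pixels[labels == j].mean()  # recompute center
    return labels.reshape(image.shape), centers.ravel()

# Synthetic "slice": dark background, mid-gray tissue, bright tumour-like blob.
img = np.zeros((64, 64))
img[16:48, 16:48] = 0.5
img[28:36, 28:36] = 1.0
labels, centers = kmeans_segment(img, k=3)
print(np.sort(centers))  # recovers the three intensity classes
```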

  19. Transuranic waste characterization sampling and analysis methods manual

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-05-01

    The Transuranic Waste Characterization Sampling and Analysis Methods Manual (Methods Manual) provides a unified source of information on the sampling and analytical techniques that enable Department of Energy (DOE) facilities to comply with the requirements established in the current revision of the Transuranic Waste Characterization Quality Assurance Program Plan (QAPP) for the Waste Isolation Pilot Plant (WIPP) Transuranic (TRU) Waste Characterization Program (the Program). This Methods Manual includes all of the testing, sampling, and analytical methodologies accepted by DOE for use in implementing the Program requirements specified in the QAPP.

  20. Transuranic waste characterization sampling and analysis methods manual

    International Nuclear Information System (INIS)

    The Transuranic Waste Characterization Sampling and Analysis Methods Manual (Methods Manual) provides a unified source of information on the sampling and analytical techniques that enable Department of Energy (DOE) facilities to comply with the requirements established in the current revision of the Transuranic Waste Characterization Quality Assurance Program Plan (QAPP) for the Waste Isolation Pilot Plant (WIPP) Transuranic (TRU) Waste Characterization Program (the Program). This Methods Manual includes all of the testing, sampling, and analytical methodologies accepted by DOE for use in implementing the Program requirements specified in the QAPP

  1. Combination of the prompt neutron capture method with other neutron methods for substance elemental content analysis

    International Nuclear Information System (INIS)

    Neutron methods of determining elemental composition have found a wide range of applications in industry thanks to the different types of interaction of neutrons with substances /1/. With the aim of widening the range of problems that can be solved, the possibility of combining the method based on neutron capture gamma-ray spectrometry with other neutron methods, in particular neutron activation analysis and neutron absorption analysis, was studied on the basis of the device /2/ for determining the elemental content of substances. A radionuclide source (252Cf) with a yield of 1.5 x 10^7 neutrons/sec is used. Neutron capture gamma-ray spectrometry makes it possible to determine some elements (H, B, N, S, etc.) that cannot be determined by the very widely used activation analysis method. These elements can be determined with both semiconductor and scintillation detectors with parameters that fit manufacturing requirements. For a number of elements (B, Cl, Cd, Sm, Gd), very low determination limits (down to 10^-5 %) are possible using high-resolution semiconductor Ge(Li) detectors. The possibility of determining some readily activated elements (K, Al, Fe, Mn, Ti, Sc, etc.) in samples of ore and products of their processing using neutron activation analysis was also examined. After 1 hour of irradiation on the experimental device, quite accurate analytical peaks of these elements are obtained, allowing them to be determined qualitatively. However, with a decreasing neutron yield of the radionuclide source it becomes more difficult to achieve the necessary parameters in both neutron capture and activation analysis. Experimental work was carried out on the determination of some elements with large capture cross-sections (B, Cd, Sm) by absorption of neutrons in the investigated substance, i.e., using the neutron absorption analysis method, in the absence of other large-capture-cross-section elements in the samples studied. (author)

  2. Magnetopause orientation: Comparison between generic residue analysis and BV method

    Science.gov (United States)

    Dorville, Nicolas; Haaland, Stein; Anekallu, Chandrasekhar; Belmont, Gérard; Rezeau, Laurence

    2015-05-01

    Determining the direction normal to the magnetopause layer is a key step for any study of this boundary, and various techniques have been developed for this purpose. We focus here on generic residue analysis (GRA) methods, which are based on conservation laws, and on the new iterative BV method, where B represents the magnetic field and V the ion velocity. The BV method relies on a fit of the magnetic field hodogram against a modeled geometrical shape and on the way this hodogram is described in time. The two approaches have different underlying model assumptions and validity ranges. We compare magnetopause normals predicted by the BV and GRA methods to better understand the sensitivity of each method to small departures from its own physical hypotheses. This comparison is carried out first on artificial data with magnetopause-like noise. A statistical study is then carried out using a list of 149 flank and dayside magnetopause crossings from Cluster data where the BV method is applicable, i.e., where the magnetopause involves a single-layer current sheet with a crudely C-shaped magnetic hodogram. These two comparisons validate the quality of the BV method for all the cases where it is applicable. The method provides quite reliable normal directions in these cases, even when the boundary is moving with a varying velocity, which noticeably distorts the results of most of the other methods.
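
    One widely used member of the GRA family, minimum variance analysis of the magnetic field (MVAB), reduces to an eigenvalue problem: the normal estimate is the minimum-variance eigenvector of the field's covariance matrix. A sketch on a synthetic crossing (not Cluster data):

```python
import numpy as np

def mvab_normal(B):
    """Minimum variance analysis of the magnetic field (MVAB), one member
    of the generic residue analysis family: the boundary-normal estimate is
    the eigenvector of the field variance matrix with smallest eigenvalue."""
    M = np.cov(B, rowvar=False)        # 3x3 magnetic variance matrix
    evals, evecs = np.linalg.eigh(M)   # eigenvalues in ascending order
    n = evecs[:, 0]                    # minimum-variance direction
    return n / np.linalg.norm(n)

# Synthetic crossing: B rotates in the x-y plane with small noise along z,
# so the true normal is +/- z.
t = np.linspace(0.0, np.pi, 100)
rng = np.random.default_rng(2)
B = np.column_stack([np.cos(t), np.sin(t), 0.01 * rng.normal(size=t.size)])
n = mvab_normal(B)
print(np.abs(n))  # dominated by the z component
```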

  3. Classification of analysis methods for characterization of magnetic nanoparticle properties

    DEFF Research Database (Denmark)

    Posth, O.; Hansen, Mikkel Fougt; Steinhoff, U.;

    2015-01-01

    The aim of this paper is to provide a roadmap for the standardization of magnetic nanoparticle (MNP) characterization. We have assessed common MNP analysis techniques under various criteria in order to define the methods that can be used as either standard techniques for magnetic particle charact...

  4. CHROMATOGRAPHIC METHODS IN THE ANALYSIS OF CHOLESTEROL AND RELATED LIPIDS

    NARCIS (Netherlands)

    HOVING, EB

    1995-01-01

    Methods using thin-layer chromatography, solid-phase extraction, gas chromatography, high-performance liquid chromatography and supercritical fluid chromatography are described for the analysis of single cholesterol, esterified and sulfated cholesterol, and for cholesterol in the context of other li

  5. Probabilistic structural analysis methods for improving Space Shuttle engine reliability

    Science.gov (United States)

    Boyce, L.

    1989-01-01

    Probabilistic structural analysis methods are particularly useful in the design and analysis of critical structural components and systems that operate in very severe and uncertain environments. These methods have recently found application in space propulsion systems to improve the structural reliability of Space Shuttle Main Engine (SSME) components. A computer program, NESSUS, based on a deterministic finite-element program and a method of probabilistic analysis (fast probability integration) provides probabilistic structural analysis for selected SSME components. While computationally efficient, it considers both correlated and nonnormal random variables as well as an implicit functional relationship between independent and dependent variables. The program is used to determine the response of a nickel-based superalloy SSME turbopump blade. Results include blade tip displacement statistics due to the variability in blade thickness, modulus of elasticity, Poisson's ratio or density. Modulus of elasticity significantly contributed to blade tip variability while Poisson's ratio did not. Thus, a rational method for choosing parameters to be modeled as random is provided.
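
    The flavour of the sensitivity result quoted above can be mimicked with plain Monte Carlo on a stand-in response; the cantilever formula and all input statistics below are illustrative assumptions, not the NESSUS model or its fast probability integration algorithm.

```python
import random
import statistics

def tip_displacement(E, thickness, length=0.1, width=0.02, load=100.0):
    """Cantilever-beam stand-in for a blade (SI units):
    delta = P*L^3 / (3*E*I), with I = w*t^3/12. A toy response,
    not the NESSUS finite element model."""
    I = width * thickness ** 3 / 12.0
    return load * length ** 3 / (3.0 * E * I)

random.seed(3)
samples = [tip_displacement(E=random.gauss(2.0e11, 2.0e10),       # ~10% modulus scatter
                            thickness=random.gauss(5e-3, 1e-4))   # ~2% thickness scatter
           for _ in range(20000)]
mean = statistics.fmean(samples)
cv = statistics.pstdev(samples) / mean  # coefficient of variation of tip displacement
print(mean, cv)
```

    Note that Poisson's ratio does not enter this bending response at all, loosely echoing the paper's finding that it contributed little to blade tip variability.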

  6. Approaches and methods for econometric analysis of market power

    DEFF Research Database (Denmark)

    Perekhozhuk, Oleksandr; Glauben, Thomas; Grings, Michael;

    2016-01-01

    This study discusses two widely used approaches in the New Empirical Industrial Organization (NEIO) literature and examines the strengths and weaknesses of the Production-Theoretic Approach (PTA) and the General Identification Method (GIM) for the econometric analysis of market power in agricultu...

  7. METHODS ADVANCEMENT FOR MILK ANALYSIS: THE MAMA STUDY

    Science.gov (United States)

    The Methods Advancement for Milk Analysis (MAMA) study was designed by US EPA and CDC investigators to provide data to support the technological and study design needs of the proposed National Children's Study (NCS). The NCS is a multi-agency-sponsored study, authorized under the...

  8. Material-Point-Method Analysis of Collapsing Slopes

    DEFF Research Database (Denmark)

    Andersen, Søren; Andersen, Lars

    2009-01-01

    A deformed material description is introduced, based on time integration of the deformation gradient and utilising Gauss quadrature over the volume associated with each material point. The method has been implemented in a Fortran code and employed for the analysis of a landslide that took place during...

  9. Single corn kernel aflatoxin B1 extraction and analysis method

    Science.gov (United States)

    Aflatoxins are highly carcinogenic compounds produced by the fungus Aspergillus flavus. Aspergillus flavus is a phytopathogenic fungus that commonly infects crops such as cotton, peanuts, and maize. The goal was to design an effective sample preparation method and analysis for the extraction of afla...

  10. Solar spectra analysis based on the statistical moment method

    Czech Academy of Sciences Publication Activity Database

    Druckmüller, M.; Klvaňa, Miroslav; Druckmüllerová, Z.

    2007-01-01

    Roč. 31, č. 1 (2007), s. 297-307. ISSN 1845-8319. [Dynamical processes in the solar atmosphere. Hvar, 24.09.2006-29.09.2006] R&D Projects: GA ČR GA205/04/2129 Institutional research plan: CEZ:AV0Z10030501 Keywords : spectral analysis * method Subject RIV: BN - Astronomy, Celestial Mechanics, Astrophysics
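
    Statistical moments are a natural way to characterize a spectral line profile. The sketch below is a generic illustration of the idea, not the authors' method: it computes the centroid, width, and skewness of a synthetic absorption line, using the line depth as the weight.

```python
import numpy as np

def line_moments(wavelength, intensity):
    """Moment characterization of a spectral line: use the line depth as a
    weight and compute the centroid (a Doppler-shift proxy), the width
    (weighted standard deviation) and the skewness of the profile."""
    depth = intensity.max() - intensity
    w = depth / depth.sum()
    centroid = np.sum(w * wavelength)
    var = np.sum(w * (wavelength - centroid) ** 2)
    skew = np.sum(w * (wavelength - centroid) ** 3) / var ** 1.5
    return centroid, np.sqrt(var), skew

# Synthetic Gaussian absorption line centred at 6302.5 (arbitrary units).
lam = np.linspace(6302.0, 6303.0, 401)
I = 1.0 - 0.6 * np.exp(-0.5 * ((lam - 6302.5) / 0.05) ** 2)
centroid, width, skew = line_moments(lam, I)
print(centroid, width, skew)  # centroid at the line centre, near-zero skewness
```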

  11. Application of a New Method of Multi-Criteria Analysis

    OpenAIRE

    Jiri Dvorsky; Leos Jurica; Zdenek Hradilek; Petr Krejci

    2006-01-01

    The reliability of electrical networks can be increased, for example, by using new components. Such modernisation is considerably expensive, so close attention is paid to the selection of new components and their placement. Methods of multi-criteria analysis are well suited to this decision making.
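    The abstract does not specify the method, so as a minimal stand-in, the simplest multi-criteria scheme is a weighted sum over normalised criterion scores; the component names, criteria, weights, and scores below are all illustrative, not from the paper:

```python
# Weighted-sum multi-criteria scoring over hypothetical criteria for
# hypothetical candidate network components (all values illustrative).
weights = {"reliability": 0.5, "cost": 0.3, "ease_of_placement": 0.2}
candidates = {
    "recloser A":    {"reliability": 0.9, "cost": 0.4, "ease_of_placement": 0.7},
    "recloser B":    {"reliability": 0.7, "cost": 0.8, "ease_of_placement": 0.5},
    "sectionaliser": {"reliability": 0.6, "cost": 0.9, "ease_of_placement": 0.9},
}
# score(candidate) = sum over criteria of weight * normalised score
scores = {name: sum(weights[c] * s[c] for c in weights)
          for name, s in candidates.items()}
best = max(scores, key=scores.get)
print(best, round(scores[best], 3))
```

    Real multi-criteria methods (AHP, TOPSIS, and the like) refine this basic idea with pairwise weight elicitation or distance-to-ideal ranking.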

  12. Transuranic waste characterization sampling and analysis methods manual. Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Suermann, J.F.

    1996-04-01

    This Methods Manual provides a unified source of information on the sampling and analytical techniques that enable Department of Energy (DOE) facilities to comply with the requirements established in the current revision of the Transuranic Waste Characterization Quality Assurance Program Plan (QAPP) for the Waste Isolation Pilot Plant (WIPP) Transuranic (TRU) Waste Characterization Program (the Program) and the WIPP Waste Analysis Plan. This Methods Manual includes all of the testing, sampling, and analytical methodologies accepted by DOE for use in implementing the Program requirements specified in the QAPP and the WIPP Waste Analysis Plan. The procedures in this Methods Manual are comprehensive and detailed and are designed to provide the necessary guidance for the preparation of site-specific procedures. With some analytical methods, such as Gas Chromatography/Mass Spectrometry, the Methods Manual procedures may be used directly. With other methods, such as nondestructive characterization, the Methods Manual provides guidance rather than a step-by-step procedure. Sites must meet all of the specified quality control requirements of the applicable procedure. Each DOE site must document the details of the procedures it will use and demonstrate the efficacy of such procedures to the Manager, National TRU Program Waste Characterization, during Waste Characterization and Certification audits.

  13. Transuranic waste characterization sampling and analysis methods manual. Revision 1

    International Nuclear Information System (INIS)

    This Methods Manual provides a unified source of information on the sampling and analytical techniques that enable Department of Energy (DOE) facilities to comply with the requirements established in the current revision of the Transuranic Waste Characterization Quality Assurance Program Plan (QAPP) for the Waste Isolation Pilot Plant (WIPP) Transuranic (TRU) Waste Characterization Program (the Program) and the WIPP Waste Analysis Plan. This Methods Manual includes all of the testing, sampling, and analytical methodologies accepted by DOE for use in implementing the Program requirements specified in the QAPP and the WIPP Waste Analysis Plan. The procedures in this Methods Manual are comprehensive and detailed and are designed to provide the necessary guidance for the preparation of site-specific procedures. With some analytical methods, such as Gas Chromatography/Mass Spectrometry, the Methods Manual procedures may be used directly. With other methods, such as nondestructive characterization, the Methods Manual provides guidance rather than a step-by-step procedure. Sites must meet all of the specified quality control requirements of the applicable procedure. Each DOE site must document the details of the procedures it will use and demonstrate the efficacy of such procedures to the Manager, National TRU Program Waste Characterization, during Waste Characterization and Certification audits

  14. Intelligent classification methods of grain kernels using computer vision analysis

    Science.gov (United States)

    Lee, Choon Young; Yan, Lei; Wang, Tianfeng; Lee, Sang Ryong; Park, Cheol Woo

    2011-06-01

    In this paper, a digital image analysis method was developed to classify seven kinds of individual grain kernels (common rice, glutinous rice, rough rice, brown rice, buckwheat, common barley and glutinous barley) widely planted in Korea. A total of 2800 color images of individual grain kernels were acquired as a data set. Seven color and ten morphological features were extracted and processed by linear discriminant analysis to improve the efficiency of the identification process. The output features from linear discriminant analysis were used as input to the four-layer back-propagation network to classify different grain kernel varieties. The data set was divided into three groups: 70% for training, 20% for validation, and 10% for testing the network. The classification experimental results show that the proposed method is able to classify the grain kernel varieties efficiently.
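    A rough sketch of the front half of this pipeline: the paper feeds LDA outputs to a four-layer back-propagation network; the sketch below substitutes a nearest-centroid classifier after a Fisher LDA projection, with synthetic data standing in for the 7 colour and 10 morphological features:

```python
import numpy as np

def fit_lda(X, y, n_components=2):
    """Fisher LDA: find directions maximising between-class over
    within-class scatter, via the eigenvectors of Sw^-1 Sb."""
    classes = np.unique(y)
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))  # within-class scatter
    Sb = np.zeros((d, d))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean_all).reshape(-1, 1)
        Sb += len(Xc) * (diff @ diff.T)
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw + 1e-8 * np.eye(d), Sb))
    order = np.argsort(evals.real)[::-1]
    return evecs.real[:, order[:n_components]]

rng = np.random.default_rng(0)
# three synthetic "varieties", 17 features each (7 colour + 10 morphological)
X = np.vstack([rng.normal(loc=i * 2.0, scale=1.0, size=(50, 17)) for i in range(3)])
y = np.repeat([0, 1, 2], 50)
W = fit_lda(X, y)
Z = X @ W                      # reduced features fed to the classifier
centroids = np.array([Z[y == c].mean(axis=0) for c in range(3)])
pred = np.argmin(((Z[:, None, :] - centroids[None]) ** 2).sum(-1), axis=1)
print("training accuracy:", (pred == y).mean())
```

    The nearest-centroid step is only a placeholder for the paper's back-propagation network; the point is that LDA compresses the 17 raw features into a few discriminative directions before classification.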

  15. Multiple predictor smoothing methods for sensitivity analysis: Description of techniques

    International Nuclear Information System (INIS)

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (i) locally weighted regression (LOESS), (ii) additive models, (iii) projection pursuit regression, and (iv) recursive partitioning regression. Then, in the second and concluding part of this presentation, the indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present
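    Of the four techniques listed, LOESS is the simplest to sketch: at each point, fit a weighted straight line to the nearest neighbours under tricube weights. This is a minimal one-predictor illustration, not the stepwise multi-predictor procedure the paper describes:

```python
import numpy as np

def loess(x, y, frac=0.25):
    """First-order LOESS: at each x[i], fit a weighted line to the
    nearest frac*n neighbours using tricube weights."""
    n = len(x)
    k = max(3, int(np.ceil(frac * n)))
    fitted = np.empty(n)
    for i in range(n):
        dist = np.abs(x - x[i])
        idx = np.argsort(dist)[:k]           # k nearest neighbours
        h = dist[idx].max()
        h = h if h > 0 else 1.0
        w = (1 - (dist[idx] / h) ** 3) ** 3  # tricube weights
        W = np.diag(w)
        A = np.column_stack([np.ones(k), x[idx]])
        beta = np.linalg.solve(A.T @ W @ A, A.T @ W @ y[idx])
        fitted[i] = beta[0] + beta[1] * x[i]
    return fitted

rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 60)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.15, size=60)
smooth = loess(x, y)
print(np.mean((smooth - np.sin(2 * np.pi * x)) ** 2))
```

    In a sensitivity analysis, the fraction of variance explained by such a smoother (input by input) replaces the regression coefficients that a linear fit would provide, which is what lets the nonparametric version pick up nonlinear input-output relationships.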

  16. Regionally Smoothed Meta-Analysis Methods for GWAS Datasets.

    Science.gov (United States)

    Begum, Ferdouse; Sharker, Monir H; Sherman, Stephanie L; Tseng, George C; Feingold, Eleanor

    2016-02-01

    Genome-wide association studies are proven tools for finding disease genes, but it is often necessary to combine many cohorts into a meta-analysis to detect statistically significant genetic effects. Often the component studies are performed by different investigators on different populations, using different chips with minimal SNP overlap. In some cases, raw data are not available for imputation, so that only the genotyped single nucleotide polymorphism (SNP) results can be used in meta-analysis. Even when SNP sets are comparable, different cohorts may have peak association signals at different SNPs within the same gene due to population differences in linkage disequilibrium or environmental interactions. We hypothesize that the power to detect statistical signals in these situations will improve by using a method that simultaneously meta-analyzes and smooths the signal over nearby markers. In this study, we propose regionally smoothed meta-analysis methods and compare their performance on real and simulated data. PMID:26707090
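    The paper's estimators are its own; as a generic illustration of the idea, one can combine per-cohort z-scores with Stouffer's method and then smooth the combined signal over base-pair position with a Gaussian kernel (all data below are simulated):

```python
import math
import numpy as np

def stouffer(z_rows, weights):
    """Stouffer's weighted Z across cohorts: Z = sum(w_i z_i) / sqrt(sum w_i^2)."""
    w = np.asarray(weights, float)
    return (w @ z_rows) / math.sqrt((w ** 2).sum())

def regional_smooth(pos, z, bandwidth=5_000.0):
    """Gaussian-kernel smoothing of z-scores over position; the denominator
    keeps each smoothed score unit-variance under independence."""
    pos = np.asarray(pos, float)
    K = np.exp(-0.5 * ((pos[:, None] - pos[None, :]) / bandwidth) ** 2)
    return (K @ z) / np.sqrt((K ** 2).sum(axis=1))

rng = np.random.default_rng(2)
pos = np.sort(rng.integers(0, 100_000, size=60))
signal = 3.5 * np.exp(-0.5 * ((pos - 50_000) / 4_000.0) ** 2)  # shared causal region
z_meta = stouffer(np.vstack([rng.normal(size=60) + signal,
                             rng.normal(size=60) + signal]), weights=[1.0, 1.0])
z_reg = regional_smooth(pos, z_meta)
p_reg = np.array([0.5 * math.erfc(z / math.sqrt(2)) for z in z_reg])
print(int(pos[z_reg.argmax()]), float(p_reg.min()))
```

    Smoothing pools evidence from SNPs that tag the same causal region in different cohorts, which is exactly the situation described above where peak signals land on different markers within one gene.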

  17. Fourier analysis for discontinuous Galerkin and related methods

    Institute of Scientific and Technical Information of China (English)

    ZHANG MengPing; SHU Chi-Wang

    2009-01-01

    In this paper we review a series of recent work on using a Fourier analysis technique to study the stability and error estimates for the discontinuous Galerkin method and other related schemes. The advantage of this approach is that it can reveal instability of certain "bad" schemes; it can verify stability for certain good schemes which are not easily amenable to standard finite element stability analysis techniques; it can provide quantitative error comparisons among different schemes; and it can be used to study superconvergence and time evolution of errors for the discontinuous Galerkin method. We will briefly describe this Fourier analysis technique, summarize its usage in stability and error estimates for various schemes, and indicate the advantages and disadvantages of this technique in comparison with other finite element techniques.
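    The Fourier technique for DG schemes is more elaborate than can be shown here, but the underlying idea is the classical von Neumann analysis. As a minimal stand-in, the amplification factor of first-order upwind for u_t + u_x = 0 on a uniform mesh can be scanned numerically:

```python
import numpy as np

# Fourier (von Neumann) symbol of first-order upwind for u_t + u_x = 0:
# u_j^{n+1} = u_j^n - lam * (u_j^n - u_{j-1}^n),  lam = dt/dx,
# so g(theta) = 1 - lam * (1 - exp(-i*theta)).  Stable iff max |g| <= 1.
theta = np.linspace(0.0, 2.0 * np.pi, 2001)
for lam in (0.5, 1.0, 1.2):
    g = 1.0 - lam * (1.0 - np.exp(-1j * theta))
    print(lam, round(np.abs(g).max(), 4), np.abs(g).max() <= 1.0 + 1e-12)
```

    The scan recovers the CFL condition lam <= 1; for DG schemes the scalar symbol g(theta) becomes a small amplification matrix per cell, and stability is read off its spectral radius in the same way.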

  18. Research on Visual Analysis Methods of Terrorism Events

    Science.gov (United States)

    Guo, Wenyue; Liu, Haiyan; Yu, Anzhu; Li, Jing

    2016-06-01

    As terrorist events occur with increasing frequency throughout the world, improving the response capability to social security incidents has become an important test of governments' ability to govern. Visual analysis has become an important method of event analysis because it is intuitive and effective. To analyse events' spatio-temporal distribution characteristics, correlations among event items, and development trends, terrorism events' spatio-temporal characteristics are discussed. A suitable event data table structure based on the "5W" theory is designed. Six types of visual analysis are then proposed, and the use of thematic maps and statistical charts to realize visual analysis of terrorism events is studied. Finally, experiments were carried out using data from the Global Terrorism Database, and the results prove the validity of the methods.

  19. Transient Analysis of Hysteresis Queueing Model Using Matrix Geometric Method

    Directory of Open Access Journals (Sweden)

    Wajiha Shah

    2011-10-01

    Full Text Available Various analytical methods have been proposed for the transient analysis of a queueing system in the scalar domain. In this paper, a vector-domain-based transient analysis is proposed for the hysteresis queueing system with internal thresholds for efficient and numerically stable analysis. In this system the arrival rate of customers is controlled through the internal thresholds, and the system is analyzed as a quasi-birth and death process through the matrix geometric method in combination with a vector-form Runge-Kutta numerical procedure which utilizes special matrices. The arrival and service processes of the system follow a Markovian distribution. We analyze the mean number of customers in the system in the transient state against varying time for a Markovian distribution. The results show that the effect of oscillation/hysteresis depends on the difference between the two internal threshold values.
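    The paper's hysteresis QBD with internal thresholds is beyond a short sketch, but the vector-form numerical idea can be illustrated on a plain M/M/1/K birth-death chain: integrate the Kolmogorov forward equations dpi/dt = pi Q with classical RK4 and read off the transient mean queue length:

```python
import numpy as np

def mm1k_generator(lam, mu, K):
    """Generator matrix Q of an M/M/1/K birth-death queue (states 0..K)."""
    Q = np.zeros((K + 1, K + 1))
    for n in range(K + 1):
        if n < K:
            Q[n, n + 1] = lam   # arrival
        if n > 0:
            Q[n, n - 1] = mu    # service completion
        Q[n, n] = -Q[n].sum()
    return Q

def transient_mean(lam, mu, K, t_end, dt=1e-3):
    """Integrate dpi/dt = pi @ Q with classical RK4, starting empty."""
    Q = mm1k_generator(lam, mu, K)
    pi = np.zeros(K + 1)
    pi[0] = 1.0
    for _ in range(int(t_end / dt)):
        k1 = pi @ Q
        k2 = (pi + 0.5 * dt * k1) @ Q
        k3 = (pi + 0.5 * dt * k2) @ Q
        k4 = (pi + dt * k3) @ Q
        pi = pi + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
    return pi, pi @ np.arange(K + 1)

pi, mean_n = transient_mean(lam=0.8, mu=1.0, K=10, t_end=50.0)
print(round(mean_n, 3))  # approaches the stationary M/M/1/K mean
```

    Because the rows of Q sum to zero, the RK4 update conserves total probability exactly, which is one reason vector-form integration of the forward equations is numerically well behaved.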

  20. Development of sample preparation method for honey analysis using PIXE

    International Nuclear Information System (INIS)

    We developed an original preparation method for honey samples (samples in a paste-like state) specifically designed for PIXE analysis. The results of PIXE analysis of thin targets prepared by adding a standard containing nine elements to honey samples demonstrated that the preparation method gives sufficient accuracy in the quantitative values. PIXE analysis of 13 kinds of honey was performed, and eight mineral components (Si, P, S, K, Ca, Mn, Cu and Zn) were detected in all honey samples. The principal mineral components were K and Ca, and the quantitative value for K accounted for the majority of the total value for mineral components. K content in honey varies greatly depending on the plant source. Chestnut honey had the highest K content; in fact, it was 2-3 times that of Manuka, which is known as a high-quality honey. The K content of false-acacia honey, which is produced in the greatest abundance, was 1/20 that of chestnut. (author)

  1. Intelligent classification methods of grain kernels using computer vision analysis

    International Nuclear Information System (INIS)

    In this paper, a digital image analysis method was developed to classify seven kinds of individual grain kernels (common rice, glutinous rice, rough rice, brown rice, buckwheat, common barley and glutinous barley) widely planted in Korea. A total of 2800 color images of individual grain kernels were acquired as a data set. Seven color and ten morphological features were extracted and processed by linear discriminant analysis to improve the efficiency of the identification process. The output features from linear discriminant analysis were used as input to the four-layer back-propagation network to classify different grain kernel varieties. The data set was divided into three groups: 70% for training, 20% for validation, and 10% for testing the network. The classification experimental results show that the proposed method is able to classify the grain kernel varieties efficiently

  2. Template matching method for the analysis of interstellar cloud structure

    CERN Document Server

    Juvela, M

    2016-01-01

    The structure of interstellar medium can be characterised at large scales in terms of its global statistics (e.g. power spectra) and at small scales by the properties of individual cores. Interest has been increasing in structures at intermediate scales, resulting in a number of methods being developed for the analysis of filamentary structures. We describe the application of the generic template-matching (TM) method to the analysis of maps. Our aim is to show that it provides a fast and still relatively robust way to identify elongated structures or other image features. We present the implementation of a TM algorithm for map analysis. The results are compared against rolling Hough transform (RHT), one of the methods previously used to identify filamentary structures. We illustrate the method by applying it to Herschel surface brightness data. The performance of the TM method is found to be comparable to that of RHT but TM appears to be more robust regarding the input parameters, for example, those related t...
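    The TM method in the paper matches filament-shaped templates against surface-brightness maps at multiple orientations; as a minimal version of the core operation, normalised cross-correlation of one small template over a synthetic noisy map looks like this (pure numpy, illustrative data):

```python
import numpy as np

def match_template(image, template):
    """Normalised cross-correlation of a template at every valid offset."""
    th, tw = template.shape
    t = template - template.mean()
    tn = np.sqrt((t ** 2).sum())
    H, W = image.shape
    out = np.full((H - th + 1, W - tw + 1), -1.0)
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = image[i:i + th, j:j + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p ** 2).sum()) * tn
            if denom > 0:
                out[i, j] = (p * t).sum() / denom
    return out

rng = np.random.default_rng(3)
image = rng.normal(scale=0.1, size=(40, 40))
bar = np.zeros((5, 5))
bar[2, :] = 1.0                  # short horizontal "filament" template
image[20:25, 10:15] += bar       # hide one instance in the noise
score = match_template(image, bar)
i, j = np.unravel_index(score.argmax(), score.shape)
print(i, j, round(score.max(), 3))
```

    A full filament finder would repeat this for a bank of rotated templates and keep, per pixel, the best-matching orientation, which is roughly how the TM output is made comparable with the rolling Hough transform.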

  3. A method for rapid similarity analysis of RNA secondary structures

    Directory of Open Access Journals (Sweden)

    Liu Na

    2006-11-01

    Full Text Available Abstract Background Owing to the rapid expansion of RNA structure databases in recent years, efficient methods for structure comparison are in demand for function prediction and evolutionary analysis. Usually, the similarity of RNA secondary structures is evaluated based on tree models and dynamic programming algorithms. We present here a new method for the similarity analysis of RNA secondary structures. Results Three sets of real data have been used as input for the example applications. Set I includes the structures from 5S rRNAs. Set II includes the secondary structures from RNase P and RNase MRP. Set III includes the structures from 16S rRNAs. Reasonable phylogenetic trees are derived for these three sets of data by using our method. Moreover, our program runs faster as compared to some existing ones. Conclusion The famous Lempel-Ziv algorithm can efficiently extract the information on repeated patterns encoded in RNA secondary structures and makes our method an alternative to analyze the similarity of RNA secondary structures. This method will also be useful to researchers who are interested in evolutionary analysis.
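    The Lempel-Ziv idea in the conclusion can be sketched directly on dot-bracket strings. The parse convention and the distance below follow one common formulation (in the spirit of Otu and Sayood's LZ-based sequence distance); the paper's exact variant may differ, and the example structures are illustrative:

```python
def lz_complexity(s):
    """Phrase count of an LZ78-style incremental parse of s: extend the
    current phrase until it is new, record it, then start a fresh phrase."""
    phrases, w = set(), ""
    for ch in s:
        w += ch
        if w not in phrases:
            phrases.add(w)
            w = ""
    return len(phrases) + (1 if w else 0)   # count a trailing partial phrase

def lz_distance(s, q):
    """Normalised cross-complexity distance: how many new phrases does
    each string add when parsed after the other?"""
    cs, cq = lz_complexity(s), lz_complexity(q)
    return max(lz_complexity(s + q) - cs, lz_complexity(q + s) - cq) / max(cs, cq)

# dot-bracket secondary structures: a hairpin, a near-identical variant,
# and an unstructured stretch (illustrative strings, not from the paper)
hairpin  = "(((((....)))))...(((...)))"
variant  = "(((((...))))).....(((...)))"
unpaired = "..........................."
print(lz_distance(hairpin, variant), lz_distance(hairpin, unpaired))
```

    Similar structures share repeated patterns, so concatenating one after the other adds few new phrases and the distance stays small; a matrix of such pairwise distances can then feed a standard tree-building method, as done for the 5S rRNA, RNase, and 16S rRNA sets.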

  4. Nonlinear fault diagnosis method based on kernel principal component analysis

    Institute of Scientific and Technical Information of China (English)

    Yan Weiwu; Zhang Chunkai; Shao Huihe

    2005-01-01

    To ensure that systems run in working order, fault detection and diagnosis play an important role in industrial processes. This paper proposes a nonlinear fault diagnosis method based on kernel principal component analysis (KPCA). In the proposed method, a KPCA model of the nonlinear system under normal working conditions is constructed from the essential information extracted by KPCA. New data are then projected onto the KPCA model; when new data are incompatible with the model, it can be concluded that the nonlinear system is out of normal working condition. The proposed method was applied to fault diagnosis on rolling bearings. Simulation results show that the proposed method provides an effective means of fault detection and diagnosis for nonlinear systems.
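    A minimal sketch of this scheme, assuming the usual RBF-kernel formulation and a squared-prediction-error (SPE) statistic for the "incompatible with the model" test; the ring-shaped data and thresholds are illustrative, not the paper's bearing signals:

```python
import numpy as np

def rbf(A, B, gamma):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

class KPCA:
    """Minimal RBF kernel PCA monitor: fit on normal data, flag new samples
    whose squared prediction error (SPE) in feature space is large."""
    def __init__(self, gamma=1.0, n_comp=10):
        self.gamma, self.n_comp = gamma, n_comp

    def fit(self, X):
        self.X = X
        n = len(X)
        self.K = rbf(X, X, self.gamma)
        J = np.ones((n, n)) / n
        Kc = self.K - J @ self.K - self.K @ J + J @ self.K @ J  # double centring
        w, V = np.linalg.eigh(Kc)
        w, V = w[::-1][: self.n_comp], V[:, ::-1][:, : self.n_comp]
        self.alpha = V / np.sqrt(np.maximum(w, 1e-12))  # unit-norm feature axes
        return self

    def spe(self, Xnew):
        n = len(self.X)
        Kt = rbf(Xnew, self.X, self.gamma)
        M = np.ones((len(Xnew), n)) / n
        J = np.ones((n, n)) / n
        Ktc = Kt - M @ self.K - Kt @ J + M @ self.K @ J  # centre test kernel
        T = Ktc @ self.alpha                             # scores on kept axes
        kxx = 1.0 - 2 * Kt.mean(1) + self.K.mean()       # centred k(x, x), RBF
        return kxx - (T ** 2).sum(1)                     # residual = SPE

rng = np.random.default_rng(4)
theta = rng.uniform(0, 2 * np.pi, 200)
normal = np.c_[np.cos(theta), np.sin(theta)] + rng.normal(scale=0.05, size=(200, 2))
model = KPCA(gamma=1.0, n_comp=10).fit(normal)
limit = np.quantile(model.spe(normal), 0.99)   # empirical control limit
faulty = np.array([[0.0, 0.0], [2.0, 2.0]])    # off the normal "ring"
print(model.spe(faulty), limit)
```

    Linear PCA would fit this ring poorly; the kernel trick lets the normal-operation region be nonlinear while the monitoring statistic stays a simple residual, which is the essential point of KPCA-based fault detection.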

  5. Precise methods for conducted EMI modeling,analysis,and prediction

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Focusing on the state-of-the-art conducted EMI prediction, this paper presents a noise source lumped circuit modeling and identification method, an EMI modeling method based on multiple slope approximation of switching transitions, and double Fourier integral method modeling PWM conversion units to achieve an accurate modeling of EMI noise source. Meanwhile, a new sensitivity analysis method, a general coupling model for steel ground loops, and a partial element equivalent circuit method are proposed to identify and characterize conducted EMI coupling paths. The EMI noise and propagation modeling provide an accurate prediction of conducted EMI in the entire frequency range (0-10 MHz) with good practicability and generality. Finally a new measurement approach is presented to identify the surface current of large dimensional metal shell. The proposed analytical modeling methodology is verified by experimental results.

  6. Development of evaluation method for software safety analysis techniques

    International Nuclear Information System (INIS)

    Full text: Following the massive adoption of digital Instrumentation and Control (I and C) systems for nuclear power plants (NPP), various Software Safety Analysis (SSA) techniques are used to evaluate NPP safety when adopting an appropriate digital I and C system, and thereby reduce risk to an acceptable level. However, each technique has its specific advantages and disadvantages. If two or more techniques can be complementarily incorporated, the SSA combination would be more acceptable. As a result, if proper evaluation criteria are available, the analyst can choose an appropriate technique combination to perform the analysis on the basis of available resources. This research evaluated the software safety analysis techniques in use today, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flowgraph Methodology (DFM), and simulation-based model analysis, and then determined indexes in view of their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. These indexes may help decision makers and software safety analysts to choose the best SSA combination and arrange their own software safety plans. By the proposed method, analysts can evaluate various SSA combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (without transfer rate) combination is not competitive due to the dilemma of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for realizing the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio. However, their disadvantages are completeness and complexity.

  7. Integrated Data Collection Analysis (IDCA) Program - SSST Testing Methods

    Energy Technology Data Exchange (ETDEWEB)

    Sandstrom, Mary M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brown, Geoffrey W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Daniel N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Pollard, Colin J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Warner, Kirstin F. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Remmers, Daniel L. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Sorensen, Daniel N. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Whinnery, LeRoy L. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Phillips, Jason J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Shelley, Timothy J. [Bureau of Alcohol, Tobacco and Firearms (ATF), Huntsville, AL (United States); Reyes, Jose A. [Applied Research Associates, Tyndall AFB, FL (United States); Hsu, Peter C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Reynolds, John G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-03-25

    The Integrated Data Collection Analysis (IDCA) program is conducting a proficiency study for Small- Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are the methods used for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis during the IDCA program. These methods changed throughout the Proficiency Test and the reasons for these changes are documented in this report. The most significant modifications in standard testing methods are: 1) including one specified sandpaper in impact testing among all the participants, 2) diversifying liquid test methods for selected participants, and 3) including sealed sample holders for thermal testing by at least one participant. This effort, funded by the Department of Homeland Security (DHS), is putting the issues of safe handling of these materials in perspective with standard military explosives. The study is adding SSST testing results for a broad suite of different HMEs to the literature. Ultimately the study will suggest new guidelines and methods and possibly establish the SSST testing accuracies needed to develop safe handling practices for HMEs. Each participating testing laboratory uses identical test materials and preparation methods wherever possible. The testing performers involved are Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory (LANL), Indian Head Division, Naval Surface Warfare Center, (NSWC IHD), Sandia National Laboratories (SNL), and Air Force Research Laboratory (AFRL/RXQL). These tests are conducted as a proficiency study in order to establish some consistency in test protocols, procedures, and experiments and to compare results when these testing variables cannot be made consistent.

  8. Improvements in biamperometric method for remote analysis of uranium

    International Nuclear Information System (INIS)

    One of the titrimetric methods most suitable for remote operations with Master Slave Manipulators inside hot cells is the biamperometric method. The biamperometric method for the analysis of uranium reported in the literature is found to give rise to a significant bias, especially with low aliquots of uranium and the waste volume is also considerable which is not desirable from the point of view of radioactive waste disposal. In the present method, the bias as well as waste volume are reduced. Also addition of vanadyl sulphate is found necessary to provide a sharp end point in the titration curve. The role of vanadyl sulphate in improving the titration method has been investigated by spectrophotometry and electrometry. A new mechanism for the role of vanadyl sulphate which is in conformity with the observations made in coulometric titration of uranium, is proposed. Interference from deliberate additions of high concentrations of stable species of fission product elements is found negligible. Hence this method is considered highly suitable for remote analysis of uranium in intensely radioactive reprocessing solutions for control purposes, provided radioactivity does not pose new problems. (auth.)

  9. Surface-confined fluorescence enhancement of Au nanoclusters anchoring to a two-dimensional ultrathin nanosheet toward bioimaging

    Science.gov (United States)

    Tian, Rui; Yan, Dongpeng; Li, Chunyang; Xu, Simin; Liang, Ruizheng; Guo, Lingyan; Wei, Min; Evans, David G.; Duan, Xue

    2016-05-01

    Gold nanoclusters (Au NCs) as ultrasmall fluorescent nanomaterials possess discrete electronic energy and unique physicochemical properties, but suffer from relatively low quantum yield (QY) which severely affects their application in displays and imaging. To solve this conundrum and obtain highly-efficient fluorescent emission, 2D exfoliated layered double hydroxide (ELDH) nanosheets were employed to localize Au NCs with a density as high as 5.44 × 10^13 cm^-2, by virtue of the surface confinement effect of ELDH. Both experimental studies and computational simulations testify that the excited electrons of Au NCs are strongly confined by MgAl-ELDH nanosheets, which results in a largely promoted QY as well as prolonged fluorescence lifetime (both ~7 times enhancement). In addition, the as-fabricated Au NC/ELDH hybrid material exhibits excellent imaging properties with good stability and biocompatibility in the intracellular environment. Therefore, this work provides a facile strategy to achieve highly luminescent Au NCs via surface-confined emission enhancement imposed by ultrathin inorganic nanosheets, which can be potentially used in bio-imaging and cell labelling.

  10. Comparison of different surface quantitative analysis methods. Application to corium

    International Nuclear Information System (INIS)

    In case of a severe hypothetical accident in a pressurized water reactor, the reactor assembly melts partially or completely. The material formed, called corium, flows out and spreads at the bottom of the reactor. To limit and control the consequences of such an accident, the specifications of the O-U-Zr basic system must be known accurately. To achieve this goal, the corium mix was melted by electron bombardment at very high temperature (3000 K) followed by quenching of the ingot in the Isabel 1 evaporator. Metallographic analyses were then required to validate the thermodynamic databases set by the Thermo-Calc software. The study consists in defining an overall surface quantitative analysis method that is fast and reliable, in order to determine the overall corium composition. The analyzed ingot originated from a (U+Fe+Y+UO2+ZrO2) mix, with a total mass of 2253.7 grams. Several successive heatings at moderate power were performed before a very brief plateau at very high temperature, so that the ingot was formed progressively and without any evaporation liable to modify its initial composition. The central zone of the ingot was then analyzed by qualitative and quantitative global surface methods, to yield the volume composition of the analyzed zone. Corium sample analysis happens to be very complex because of the variety and number of elements present, and also because of the presence of oxygen in a heavy-element matrix such as the uranium-based one. Three different global quantitative surface analysis methods were used: global EDS analysis (Energy Dispersive Spectrometry) with SEM, global WDS analysis (Wavelength Dispersive Spectrometry) with EPMA, and coupling of image analysis with EDS or WDS point spectroscopic analyses. The difficulties encountered during the study arose from sample preparation (corium is very sensitive to oxidation), and the choice of acquisition parameters of the images and analyses.
The corium sample studied consisted of two zones displaying very

  11. Comparison studies of uranium analysis method using spectrophotometer and voltammeter

    International Nuclear Information System (INIS)

    Comparative studies of uranium analysis methods using a spectrophotometer and a voltammeter have been carried out. The objective of the experiment is to examine the reliability of the analysis methods and the instrument performance by evaluating the parameters linearity, accuracy, precision and detection limit. Uranyl nitrate hexahydrate was used as the standard, and the sample was a solvent mixture of tributyl phosphate and kerosene containing uranium (from the phosphoric acid purification unit, Petrokimia Gresik). Uranium (U) was stripped from the sample using 0.5 N HNO3 and then analyzed with both instruments. Analysis of the standard shows that both methods give good linearity, with correlation coefficients > 0.999. Spectrophotometry gives an accuracy of 99.34-101.05% with a relative standard deviation (RSD) of 1.03% and a detection limit (DL) of 0.05 ppm. Voltammetry gives an accuracy of 95.63-101.49% with an RSD of 3.91% and a DL of 0.509 ppm. Analysis of sludge samples gave significantly different results: spectrophotometry gives a U concentration of 4.445 ppm with an RSD of 6.74%, and voltammetry gives a U concentration of 7.693 ppm with an RSD of 19.53%. (author)

  12. Static Aeroelastic Analysis with an Inviscid Cartesian Method

    Science.gov (United States)

    Rodriguez, David L.; Aftosmis, Michael J.; Nemec, Marian; Smith, Stephen C.

    2014-01-01

    An embedded-boundary, Cartesian-mesh flow solver is coupled with a three degree-of-freedom structural model to perform static, aeroelastic analysis of complex aircraft geometries. The approach solves a nonlinear, aerostructural system of equations using a loosely-coupled strategy. An open-source, 3-D discrete-geometry engine is utilized to deform a triangulated surface geometry according to the shape predicted by the structural model under the computed aerodynamic loads. The deformation scheme is capable of modeling large deflections and is applicable to the design of modern, very-flexible transport wings. The coupling interface is modular so that aerodynamic or structural analysis methods can be easily swapped or enhanced. After verifying the structural model with comparisons to Euler beam theory, two applications of the analysis method are presented as validation. The first is a relatively stiff, transport wing model which was a subject of a recent workshop on aeroelasticity. The second is a very flexible model recently tested in a low speed wind tunnel. Both cases show that the aeroelastic analysis method produces results in excellent agreement with experimental data.
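    The loosely-coupled strategy can be caricatured in one degree of freedom: alternate an aerodynamic load evaluation with a structural deflection solve until the aerostructural fixed point converges. The wing-section parameters below are illustrative, not from the Cart3D-based solver the paper describes:

```python
# One-DOF caricature of loose aero-structural coupling: a wing section with
# torsional stiffness k twists by theta under a lift that itself grows with
# theta.  All parameter values are illustrative.
q, S, CL_alpha, e, k = 1.2, 2.0, 5.0, 0.1, 20.0  # dyn. pressure, area, lift
alpha0 = 0.05                                    # slope, offset, stiffness;
                                                 # rigid angle of attack (rad)
theta, history = 0.0, []
for _ in range(100):
    lift = q * S * CL_alpha * (alpha0 + theta)   # "aerodynamic solve"
    theta_new = lift * e / k                     # "structural solve"
    history.append(theta_new)
    if abs(theta_new - theta) < 1e-12:
        break
    theta = theta_new

r = q * S * CL_alpha * e / k                     # coupling ratio (< 1: converges)
exact = alpha0 * r / (1.0 - r)                   # closed-form aeroelastic twist
print(theta, exact, len(history))
```

    The fixed-point iteration converges geometrically with ratio r; when r approaches 1 the structure diverges, which is the one-DOF analogue of aeroelastic divergence of a flexible wing.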

  13. Comparison of Methods for the Simultaneous Analysis of Bioactive Compounds of Eurycoma longifolia Jack using different Analysis Methods

    International Nuclear Information System (INIS)

    Eurycoma longifolia Jack (Tongkat Ali; genus: Eurycoma; family: Simaroubaceae) is one of the most popular tropical herbal plants. The plant contains a series of quassinoids, which are mainly responsible for its bitter taste. The plant extract, especially from the roots, is traditionally used for enhancing testosterone levels in men. The roots have also been used in indigenous traditional medicines for their unique anti-malarial, anti-pyretic, antiulcer, cytotoxic and aphrodisiac properties. As part of on-going research on the bioactive compounds of Eurycoma longifolia, an evaluation of an optimized analysis method and of the parameters that influence LC-MS analysis was carried out. Identification of the bioactive compounds was based on comparison of calculated retention times and mass spectral data with literature values. Examination of the Eurycoma longifolia sample showed some variations and differences in terms of LC-MS parameters. However, the combined method using methanol as the solvent, an injection volume of 1.0 μL, analysis in ultra scan mode and acetic acid as the acidic modifier is the optimum method for LC-MS analysis of Eurycoma longifolia, because it successfully detected the optimum mass of compounds with good resolution and perfect separation within a short analysis time. (author)

  14. Variational Methods in Sensitivity Analysis and Optimization for Aerodynamic Applications

    Science.gov (United States)

    Ibrahim, A. H.; Hou, G. J.-W.; Tiwari, S. N. (Principal Investigator)

    1996-01-01

    Variational method (VM) sensitivity analysis, the continuous alternative to discrete sensitivity analysis, is employed to derive the costate (adjoint) equations, the transversality conditions, and the functional sensitivity derivatives. In the derivation of the sensitivity equations, the variational method uses the generalized calculus of variations, in which the variable boundary is considered as the design function. The converged solution of the state equations together with the converged solution of the costate equations are integrated along the domain boundary to uniquely determine the functional sensitivity derivatives with respect to the design function. The determination of the sensitivity derivatives of the performance index or functional thus entails the coupled solution of the state and costate equations. Since a stable and converged numerical solution of the costate equations with their boundary conditions is a priori unknown, numerical stability analysis is performed on both the state and costate equations. Thereafter, based on the amplification factors obtained by solving the generalized eigenvalue equations, the stability behavior of the costate equations is discussed and compared with that of the state (Euler) equations. The stability analysis suggests that a converged and stable solution of the costate equations is possible only if their computational domain is transformed to account for the reverse-flow nature of the costate equations. The application of the variational method to aerodynamic shape optimization is demonstrated for internal flow problems in the supersonic Mach number range. The study shows that, while maintaining the accuracy of the functional sensitivity derivatives within a range reasonable for engineering prediction purposes, the variational method offers a substantial gain in computational efficiency, i.e., computer time and memory, when compared with the finite-difference approach.

  15. Computer methods for geological analysis of radiometric data

    International Nuclear Information System (INIS)

    Whether an explorationist equates anomalies with potential uranium ore deposits or analyses radiometric data in terms of their relationships with other geochemical, geophysical, and geological data, the anomaly or anomalous zone is the most common starting point for subsequent study or field work. In its preliminary stages, the definition of meaningful anomalies from raw data is a statistical problem requiring the use of a computer. Because radiometric data, when properly collected and reduced, are truly geochemical, they can be expected to relate in part to changes in surface or near-surface geology. Data variations caused strictly by differences in gross chemistry of the lithologies sampled constitute a noise factor which must be removed for proper analysis. Texas Instruments Incorporated has developed an automated method of measuring the statistical significance of data by incorporating geological information in the process. This method of computerized geological analysis of radiometric data (CGARD) is similar to a basic method of the exploration geochemist and has been proved successful in its application to airborne radiometric data collected on four continents by Texas Instruments Incorporated. This beginning and its natural follow-on methods of automated or interpretive analysis are based simply on the perception of radiometric data as sets of statistically distributed data in both the frequency and spatial domains. (author)

  16. Methods for multielement analysis of high purity noble metals

    International Nuclear Information System (INIS)

    The current state of four main methods for the analysis of high-purity noble metals is reviewed: atomic absorption, atomic emission, neutron activation, and spark source mass spectrometry. Most impurities, 65 elements including Cs, Be, B, Sc, In, Zr, Hf, V, Nb, Ta, Mo, W, Tl, Te, Re, I, Ru, Th, U, the rare earths and others, are determined by spark source mass spectrometry. The detection limits for most impurities are at the 10⁻⁶-10⁻⁸% level. Neutron activation analysis possesses the lowest detection limits, which can be reduced further: by irradiating samples in neutron fluxes of (1-5)×10¹⁴ n/(cm²·s), 10-15 times; by increasing the mass of the analyzed samples up to 1 g, 10 times; and by using Ge(Li) detectors with a sensitive volume of 100-150 cm³, 3-7 times. Under simultaneous realization of these conditions, the detection limits are reduced by a factor of roughly 5×10³. The extraction atomic-absorption and chemical-spectral methods are inferior to spark source mass spectrometry and neutron activation analysis with regard to attainable detection limits, but they are simpler and more widely available

  17. Addition to the method of dimensional analysis in hydraulic problems

    Directory of Open Access Journals (Sweden)

    A.M. Kalyakin

    2013-03-01

    Full Text Available Modern engineering design of structures, and especially of machines implementing new technologies, sets engineers problems that require immediate solution. The importance of the method of dimensional analysis as an everyday engineering tool is therefore increasing, since it allows developers to obtain quick and quite simple solutions to even very complex tasks. The method of dimensional analysis is applied in almost every field of physics and engineering, but it is especially effective in solving problems of mechanics and applied mechanics: hydraulics, fluid mechanics, structural mechanics, etc. Until now, the main obstacle to applying the method of dimensional analysis in its classic form has been multifactorial problems (those with many arguments), whose solution was rather difficult and sometimes impossible. To overcome these difficulties, the authors of this study propose a simple technique: a combined approach that avoids them. The main result of the study is a simple algorithm whose application makes it possible to solve a large class of previously unsolvable problems.
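The mechanics of the classic method can be illustrated with a small sketch: the exponents of every dimensionless (pi) group are null-space vectors of the dimension matrix. The pipe-flow variables below are an illustrative choice, not taken from the paper.

```python
from sympy import Matrix

# Dimensional analysis (Buckingham pi) sketch for fully developed pipe flow.
# Variables (hypothetical illustrative choice):
#   dp_L  pressure drop per unit length  [M L^-2 T^-2]
#   rho   fluid density                  [M L^-3]
#   mu    dynamic viscosity              [M L^-1 T^-1]
#   v     mean velocity                  [L T^-1]
#   D     pipe diameter                  [L]
variables = ["dp_L", "rho", "mu", "v", "D"]
# Rows: exponents of M, L, T for each variable (columns in the same order).
dim = Matrix([[ 1,  1,  1,  0, 0],   # M
              [-2, -3, -1,  1, 1],   # L
              [-2,  0, -1, -1, 0]])  # T

# Each null-space vector of the dimension matrix is a set of exponents
# combining the variables into a dimensionless (pi) group.
pi_groups = dim.nullspace()
print(f"{len(variables)} variables, rank {dim.rank()} "
      f"-> {len(pi_groups)} dimensionless groups")
for vec in pi_groups:
    print(" * ".join(f"{name}^{e}" for name, e in zip(variables, vec) if e != 0))
```

For this system the two groups recovered are equivalent (up to rescaling) to the friction factor and the Reynolds number, which is the standard textbook result.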

  18. Radio occultation data analysis by the radioholographic method

    Science.gov (United States)

    Hocke, K.; Pavelyev, A. G.; Yakovlev, O. I.; Barthes, L.; Jakowski, N.

    1999-10-01

    The radioholographic method is briefly described and tested using data from 4 radio occultation events observed by the GPS/MET experiment on 9 February 1997. The central point of the radioholographic method (Pavelyev, 1998) is the generation of a radiohologram along the LEO satellite trajectory, which allows the calculation of angular spectra of the received GPS radio wave field at the LEO satellite. These spectra are promising for the detection, analysis and reduction of multipath/diffraction effects, the study of atmospheric irregularities, and the estimation of bending angle error. Initial analysis of angular spectra calculated by the multiple signal classification (MUSIC) method gives evidence that considerable multibeam propagation occurs at ray perigee heights below 20 km and at heights around 80-120 km for the 4 GPS/MET occultation events. Temperature profiles obtained by our analysis (radioholographic method, Abel inversion) are compared with those of the traditional retrieval by the UCAR GPS/MET team (bending angle from the slope of the phase front, Abel inversion). In 3 of 4 cases we found good agreement (standard deviation σ_T ≈ 1.5 K between both retrievals at heights of 0-30 km).
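The MUSIC step can be sketched generically. The following is a minimal narrowband MUSIC direction-of-arrival example for a uniform linear array, not the GPS/MET radiohologram processing itself; the array geometry, noise level, and angles are invented for illustration.

```python
import numpy as np

# Minimal narrowband MUSIC sketch for a uniform linear array (ULA).
# Geometry, angles and SNR are illustrative assumptions, not GPS/MET values.
rng = np.random.default_rng(0)
M, N = 8, 200                       # sensors, snapshots
d = 0.5                             # element spacing in wavelengths
true_deg = np.array([-20.0, 15.0])  # directions of arrival

def steering(theta_rad):
    theta_rad = np.atleast_1d(theta_rad)
    return np.exp(2j * np.pi * d *
                  np.arange(M)[:, None] * np.sin(theta_rad)[None, :])

# Simulate snapshots: two uncorrelated complex sources plus noise.
A = steering(np.deg2rad(true_deg))
S = rng.standard_normal((2, N)) + 1j * rng.standard_normal((2, N))
X = A @ S + 0.1 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N)))

# Noise subspace from the sample covariance (eigh sorts eigenvalues ascending).
R = X @ X.conj().T / N
_, eigvecs = np.linalg.eigh(R)
En = eigvecs[:, : M - 2]
Pn = En @ En.conj().T

# MUSIC pseudospectrum over a scan grid; peaks mark arrival angles.
scan = np.deg2rad(np.linspace(-90.0, 90.0, 721))
a = steering(scan)
P = 1.0 / np.einsum('mk,mn,nk->k', a.conj(), Pn, a).real

loc = np.where((P[1:-1] > P[:-2]) & (P[1:-1] > P[2:]))[0] + 1  # local maxima
est = np.sort(np.rad2deg(scan[loc[np.argsort(P[loc])[-2:]]]))
print("estimated DOAs (deg):", est)
```

In the occultation setting the "sensors" are successive LEO positions along the trajectory and the spectrum is over ray arrival angles, but the subspace machinery is the same.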

  19. Validation study of core analysis methods for full MOX BWR

    International Nuclear Information System (INIS)

    JNES has been developing a technical database to be used in reviewing the validation of LWR core analysis methods on the following occasions: (1) confirming core safety parameters from the initial core (one-third MOX core) through the full MOX core of the Oma Nuclear Power Plant, which is under construction; (2) licensing high-burnup MOX cores in the future; and (3) reviewing topical reports on core analysis codes for safety design and evaluation. Based on the technical database, JNES will issue a guide for reviewing the core analysis methods used for the safety design and evaluation of LWRs. The database will also be used for validation and improvement of core analysis codes developed by JNES. JNES has progressed with (1) measurements of Doppler reactivity in experimental MOX cores simulating LWR cores, (2) measurement of the isotopic compositions of fission product nuclides in high-burnup BWR UO2 fuels, and (3) neutronics analysis of the experimental data obtained in international joint programs such as FUBILA and REBUS. (author)

  20. Novel approach in quantitative analysis of shearography method

    International Nuclear Information System (INIS)

    The application of laser interferometry to industrial non-destructive testing and material characterization is becoming more prevalent, since the method provides non-contact, full-field inspection of the test object. However, its application has been limited to qualitative analysis; the current trend is to extend the method with quantitative analysis, which attempts to characterize the examined defect in detail. This is a design consideration for the range of object sizes to be examined. The growing commercial demand for quantitative analysis in NDT and material characterization is driving the quality of optical and analysis instruments. However, very little attention is currently being paid to understanding, quantifying and compensating for the numerous error sources inherent in interferometers. This paper presents a comparison of measurement analysis using the established theoretical approach and a new approach that takes into account the divergence of the illumination and other geometrical factors. The difference between the measurement systems can be associated with the error factor. (Author)

  1. Synthetic methods in phase equilibria: A new apparatus and error analysis of the method

    DEFF Research Database (Denmark)

    Fonseca, José; von Solms, Nicolas

    2014-01-01

    equipment was confirmed through several tests, including measurements along the three phase co-existence line for the system ethane + methanol, the study of the solubility of methane in water, and of carbon dioxide in water. An analysis regarding the application of the synthetic isothermal method in the...

  2. Harmonic analysis method for nonlinear evolution equations, I

    CERN Document Server

    Wang, Baoxiang; Hao, Chengchun

    2011-01-01

    This monograph provides a comprehensive overview of a class of nonlinear evolution equations, such as nonlinear Schrödinger equations, nonlinear Klein-Gordon equations, KdV equations, as well as Navier-Stokes equations and Boltzmann equations. The global well-posedness of the Cauchy problem for these equations is systematically studied using harmonic analysis methods. The book is self-contained and may also be used as an advanced textbook by graduate students in analysis and PDE, and even by ambitious undergraduate students.

  3. HARMONIC ANALYSIS OF SVPWM INVERTER USING MULTIPLE-PULSES METHOD

    Directory of Open Access Journals (Sweden)

    Mehmet YUMURTACI

    2009-01-01

    Full Text Available Space Vector Modulation (SVM) is a popular and important PWM technique for three-phase voltage source inverters in the control of induction motors. In this study, harmonic analysis of Space Vector PWM (SVPWM) is carried out using the multiple-pulses method. The multiple-pulses method calculates the Fourier coefficients of the individual positive and negative pulses of the output PWM waveform and adds them together, using the principle of superposition, to obtain the Fourier coefficients of the whole PWM output signal. Harmonic magnitudes can be calculated directly by this method without linearization, look-up tables or Bessel functions. In this study, the results obtained by applying SVPWM for various values of the variable parameters are compared with the results obtained with the multiple-pulses method.
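The superposition idea can be sketched as follows: each rectangular pulse from t1 to t2 with amplitude A contributes closed-form Fourier coefficients, and the coefficients of the whole waveform are their sums. This is a generic sketch, not the authors' implementation; the square-wave check at the end is only a sanity test.

```python
import math

def pwm_harmonics(pulses, period, n_max):
    """Harmonic magnitudes of a pulse train by superposition.

    pulses: list of (t_start, t_end, amplitude) tuples within one period.
    Returns a dict n -> harmonic magnitude for n = 1..n_max.
    """
    w = 2 * math.pi / period
    mags = {}
    for n in range(1, n_max + 1):
        a = b = 0.0
        for t1, t2, amp in pulses:
            # Closed-form Fourier coefficients of one rectangular pulse.
            a += amp / (n * math.pi) * (math.sin(n * w * t2) - math.sin(n * w * t1))
            b += amp / (n * math.pi) * (math.cos(n * w * t1) - math.cos(n * w * t2))
        mags[n] = math.hypot(a, b)
    return mags

# Sanity check on a plain square wave (+1 for half a period, -1 after):
# the n-th odd harmonic magnitude should be 4/(n*pi), even harmonics zero.
mags = pwm_harmonics([(0.0, 0.5, 1.0), (0.5, 1.0, -1.0)], 1.0, 5)
print(round(mags[1], 4), round(mags[3], 4))  # -> 1.2732 0.4244
```

An SVPWM waveform would simply supply a longer list of pulse edges per period; the superposition step is unchanged.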

  4. Issues in benchmarking human reliability analysis methods : a literature review.

    Energy Technology Data Exchange (ETDEWEB)

    Lois, Erasmia (US Nuclear Regulatory Commission); Forester, John Alan; Tran, Tuan Q. (Idaho National Laboratory, Idaho Falls, ID); Hendrickson, Stacey M. Langfitt; Boring, Ronald L. (Idaho National Laboratory, Idaho Falls, ID)

    2008-04-01

    There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessment (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study is currently underway that compares HRA methods with each other and against operator performance in simulator studies. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.

  5. Nanosilicon properties, synthesis, applications, methods of analysis and control

    CERN Document Server

    Ischenko, Anatoly A; Aslalnov, Leonid A

    2015-01-01

    Nanosilicon: Properties, Synthesis, Applications, Methods of Analysis and Control examines the latest developments on the physics and chemistry of nanosilicon. The book focuses on methods for producing nanosilicon, its electronic and optical properties, research methods to characterize its spectral and structural properties, and its possible applications. The first part of the book covers the basic properties of semiconductors, including causes of the size dependence of the properties, structural and electronic properties, and physical characteristics of the various forms of silicon. It presents theoretical and experimental research results as well as examples of porous silicon and quantum dots. The second part discusses the synthesis of nanosilicon, modification of the surface of nanoparticles, and properties of the resulting particles. The authors give special attention to the photoluminescence of silicon nanoparticles. The third part describes methods used for studying and controlling the structure and pro...

  6. Chemical form analysis method of particulate nickel compounds

    International Nuclear Information System (INIS)

    In PWR primary chemistry conditions, the chemical forms of nickel are metallic nickel, nickel oxide and nickel ferrite. The distribution among these forms depends on the Ni/Fe ratio and the chemistry conditions, especially the dissolved hydrogen concentration. Nickel is the parent element of Co-58, and its chemical form is important for Co-58 generation. A method for the chemical-form analysis of nickel has been developed. The method uses differences in the dissolution characteristics of nickel compounds: metallic nickel is separated from the others by nitric acid, and the remainder is divided into nickel oxide and nickel ferrite by oxalic acid. Crud samples from the primary coolant of a PWR were analyzed using this method. The method is not complex and is available to the chemical laboratory of a nuclear power plant. (author)

  7. Comparative Analysis of Hydrogen Production Methods with Nuclear Reactors

    International Nuclear Information System (INIS)

    Hydrogen is a highly effective and ecologically clean fuel. It can be produced by a variety of methods, of which the most common at present are electrolysis of water and steam reforming of natural gas. Nuclear energy is a leading candidate for the future production of hydrogen. Several types of reactors are being considered for hydrogen production, and several methods exist to produce hydrogen, including thermochemical cycles and high-temperature electrolysis. In this article a comparative analysis of various hydrogen production methods is presented. The possibility of hydrogen production with nuclear reactors is considered, and implementation of a research program in this field at the IPPE sodium-potassium-eutectic-cooled high-temperature experimental facility (VTS rig) is proposed. (authors)

  8. Issues in benchmarking human reliability analysis methods: A literature review

    International Nuclear Information System (INIS)

    There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessments (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study comparing and evaluating HRA methods in assessing operator performance in simulator experiments is currently underway. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.

  9. Comparative analysis of different methods for graphene nanoribbon synthesis

    Directory of Open Access Journals (Sweden)

    Tošić Dragana D.

    2013-01-01

    Full Text Available Graphene nanoribbons (GNRs) are thin strips of graphene that have captured the interest of scientists due to their unique structure and promising applications in electronics. This paper presents the results of a comparative analysis of the morphological properties of graphene nanoribbons synthesized by different methods. Various methods have been reported for graphene nanoribbon synthesis. Lithography methods usually include electron-beam (e-beam) lithography, atomic force microscopy (AFM) lithography, and scanning tunnelling microscopy (STM) lithography. Sonochemical and chemical methods exist as well, namely chemical vapour deposition (CVD) and anisotropic etching. Graphene nanoribbons can also be fabricated by unzipping carbon nanotubes (CNTs). We propose a new, highly efficient method for graphene nanoribbon production by gamma irradiation of graphene dispersed in cyclopentanone (CPO). The surface morphology of the graphene nanoribbons was visualized with atomic force and transmission electron microscopy. It was determined that the dimensions of the graphene nanoribbons are inversely proportional to the applied gamma irradiation dose. The narrowest nanoribbons were 10-20 nm wide and 1 nm high, with regular and smooth edges. In comparison with other synthesis methods, the dimensions of graphene nanoribbons synthesized by gamma irradiation are slightly larger, but the yield of nanoribbons is much higher. Fourier transform infrared spectroscopy was used for structural analysis of the graphene nanoribbons. Results of photoluminescence spectroscopy revealed for the first time that the synthesized nanoribbons show photoluminescence in the blue region of visible light, in contrast to graphene nanoribbons synthesized by other methods. Based on these facts, we believe that our synthesis method has good prospects for potential future mass production of graphene nanoribbons with uniform size, as well as for future investigations of carbon nanomaterials for

  10. The monostandard method in thermal neutron activation analysis

    International Nuclear Information System (INIS)

    A simple method is described for instrumental multielement thermal neutron activation analysis using a monostandard. For geological and air-dust samples, iron is used as the comparator, while sodium has advantages for biological materials. To test the capabilities of the method, the effective cross sections of the 23 elements determined were evaluated at a reactor site with an almost purely thermal neutron flux of about 9×10¹² n·cm⁻²·s⁻¹ and an epithermal neutron contribution of less than 0.03%. The obtained values were found to agree well, for the most part, with the best literature values of thermal neutron cross sections. The results of an activation analysis at the same site agree well with the relative method using a multielement standard and, for several standard reference materials, with the certified element contents. A comparison of the element contents obtained by the monostandard and relative methods, together with the corresponding precisions and accuracies, is given. A brief survey of the monostandard method is presented. (orig.)
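The core of the monostandard (comparator) method is a ratio calculation: the analyte concentration follows from the ratio of specific count rates and a k-factor built from nuclear and detection constants. The sketch below uses a simplified k-factor with illustrative field names, omits saturation and decay corrections, and is not the authors' exact formulation.

```python
from dataclasses import dataclass

# Monostandard (single-comparator) activation analysis sketch.
# Field names and the simplified k-factor are illustrative assumptions;
# saturation, decay and self-shielding corrections are assumed applied.
@dataclass
class NuclideData:
    M: float        # atomic mass of the target element [g/mol]
    theta: float    # isotopic abundance of the activated isotope
    sigma: float    # effective thermal (n,gamma) cross section [barn]
    I_gamma: float  # emission probability of the measured gamma line
    eff: float      # detector efficiency at that gamma energy

def monostandard_concentration(rate_x, rate_c, data_x, data_c, c_comparator):
    """Analyte concentration from the count-rate ratio to one comparator."""
    def sensitivity(nd):
        # Counts per unit mass are proportional to this product.
        return nd.theta * nd.sigma * nd.I_gamma * nd.eff / nd.M
    return c_comparator * (rate_x / rate_c) * sensitivity(data_c) / sensitivity(data_x)

# Self-consistency check: identical nuclear data and count rates must
# return the comparator concentration itself (numbers are invented).
demo = NuclideData(M=56.0, theta=0.9, sigma=1.3, I_gamma=0.4, eff=0.05)
print(monostandard_concentration(10.0, 10.0, demo, demo, 0.02))  # -> 0.02
```

In practice the effective cross sections evaluated at the irradiation site (as in the abstract) would feed the `sigma` fields, which is why their accuracy dominates the method's accuracy.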

  11. Express method of nuclear safety analysis for VVER

    International Nuclear Information System (INIS)

    An original criterion method of nuclear safety analysis for WWERs with Western nuclear fuel is presented. The method is based on an adequate interdependence of the safety criteria for the fuel matrix and the fuel cladding, and on conservative phenomenological criteria for nuclear fuel heat release power and heat transfer conditions in specific accident scenarios. The tolerability criteria for the temperatures of the nuclear fuel and of the zirconium fuel cladding are analysed. A method based on these conservative criteria for the analysis of nuclear safety is proposed. The heat balance equation and the boundary conditions of the external heat exchange are derived, and safety criteria for the temperatures of the fuel rod and cladding are obtained. The proposed method does not require modeling of all possible accident sequences using detailed codes; therefore, the scope of computational studies is substantially reduced. In addition, it enables fast adaptation of the criterion method for express evaluation of nuclear safety for different initiating events and conditions, and upon modification and/or change of the nuclear fuel design. Keywords: nuclear safety, safety criterion, nuclear fuel, water-moderated water-cooled reactor (WWER), heat exchange

  12. Analysis on Large Deformation Compensation Method for Grinding Machine

    Directory of Open Access Journals (Sweden)

    Wang Ya-jie

    2013-08-01

    Full Text Available The positioning accuracy of computer numerical control machine tools and manufacturing systems is affected by structural deformations, especially for large-sized systems. Structural deformations of the machine body are difficult to model and to predict, and research on direct measurement of the amount of deformation and its compensation is fairly limited both domestically and abroad, and rarely addresses how to calculate the compensation amount. A new method to compensate the large deformation caused by self-weight is presented in this paper. First, the compensation method is summarized. Then, static force analysis of the large grinding machine is carried out with APDL (ANSYS Parametric Design Language), which automatically extracts the results into data files, giving the displacements of N points along the working stroke of the mechanical arm. From these, the mathematical model and the corresponding flat rectangular function are established. Analysis of the displacements of the N points shows that the new compensation method is feasible. Finally, MATLAB is used to calculate the compensation amount, and the accuracy of the proposed method is proved. Practice shows that the error handled by the large-deformation compensation method can meet the requirements of grinding.

  13. Comparative analysis of redirection methods for asteroid resource exploitation

    Science.gov (United States)

    Bazzocchi, Michael C. F.; Emami, M. Reza

    2016-03-01

    An in-depth analysis and systematic comparison of asteroid redirection methods are performed within a resource exploitation framework using different assessment mechanisms. Through this framework, mission objectives and constraints are specified for the redirection of an asteroid from a near-Earth orbit to a stable orbit in the Earth-Moon system. The paper provides a detailed investigation of five redirection methods, i.e., ion beam, tugboat, gravity tractor, laser sublimation, and mass ejector, with respect to their capabilities for a redirection mission. A set of mission level criteria are utilized to assess the performance of each redirection method, and the means of assigning attributes to each criterion is discussed in detail. In addition, the uncertainty in physical characteristics of the asteroid population is quantified through the use of Monte Carlo analysis. The Monte Carlo simulation provides insight into the performance robustness of the redirection methods with respect to the targeted asteroid range. Lastly, the attributes for each redirection method are aggregated using three different multicriteria assessment approaches, i.e., the Analytical Hierarchy Process, a utility-based approach, and a fuzzy aggregation mechanism. The results of each assessment approach as well as recommendations for further studies are discussed in detail.

  14. Comparative analysis among sampling methods of underground water

    Directory of Open Access Journals (Sweden)

    Fábio Augusto Gomes Vieira Reis

    2005-06-01

    Full Text Available The monitoring of underground water quality is important because of its increasing use for different purposes in contemporary society. Different methods for sampling and monitoring underground water can lead to very distinct results. The present work therefore carries out comparative studies among three methods of sampling underground water: bailer, high-flow electric pump, and low-flow pump. Quantitative and qualitative water-quality sampling techniques were applied in monitoring wells. The research involved the identification and analysis of the differences in the results obtained by the sampling methods; the geologic and hydrologic description of the monitoring wells from which the samples were taken; and the comparative analysis among the three sampling methods, determining which presents the best efficiency. On-site measurements, sample collection and interpretation of the chemical analyses were performed. At the end of the work, integration of the geologic and hydrologic data with the analytical results made it possible to indicate that the most precise method of sampling underground water is the low-flow one.

  15. Methods for Analysis of Outdoor Performance Data (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Jordan, D.

    2011-02-01

    The ability to accurately predict power delivery over the course of time is of vital importance to the growth of the photovoltaic (PV) industry. Two key cost drivers are the efficiency with which sunlight is converted to power and how this conversion performance develops over time. Accurate knowledge of power decline over time, also known as the degradation rate, is essential and important to all stakeholders: utility companies, integrators, investors, and scientists alike. Different methods to determine degradation rates, and discrete versus continuous data, are presented, and some general best-practice methods are outlined. In addition, historical degradation rates and some preliminary analysis with respect to climate are given.
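One common way to extract a degradation rate from continuous outdoor data is a least-squares linear fit of a performance metric against time, with the slope reported in percent per year. The sketch below uses synthetic data with a known rate; it illustrates the idea only and is not the presentation's analysis pipeline.

```python
import numpy as np

# Synthetic 10-year monthly performance series with a known degradation
# rate (illustrative ground truth, not measured data).
rng = np.random.default_rng(1)
years = np.arange(120) / 12.0                     # time axis in years
true_rate = -0.8                                  # %/year
perf = 100.0 + true_rate * years + rng.normal(0, 0.5, years.size)

# Degradation rate as the slope of a least-squares line through the
# (time, performance-metric) series, expressed in %/year of the initial 100%.
slope, intercept = np.polyfit(years, perf, 1)
print(f"degradation rate: {slope:.2f} %/year")
```

Seasonal effects and outages are why practical analyses first normalize the metric (e.g. a performance ratio) before fitting; the regression step itself stays this simple.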

  16. Synthesis of aircraft structures using integrated design and analysis methods

    Science.gov (United States)

    Sobieszczanski-Sobieski, J.; Goetz, R. C.

    1978-01-01

    Systematic research is reported to develop and validate methods for structural sizing of an airframe designed with the use of composite materials and active controls. This research program includes procedures for computing aeroelastic loads, static and dynamic aeroelasticity, analysis and synthesis of active controls, and optimization techniques. Development of the methods is concerned with the most effective ways of integrating and sequencing the procedures in order to generate the structural sizing and the associated active control system which are optimal with respect to a given merit function, constrained by strength and aeroelasticity requirements.

  17. Recent Developments in Helioseismic Analysis Methods and Solar Data Assimilation

    CERN Document Server

    Schad, Ariane; Duvall, Tom L; Roth, Markus; Vorontsov, Sergei V

    2016-01-01

    We review recent advances and results in enhancing and developing helioseismic analysis methods and in solar data assimilation. In the first part of this paper we will focus on selected developments in time-distance and global helioseismology. In the second part, we review the application of data assimilation methods on solar data. Relating solar surface observations as well as helioseismic proxies with solar dynamo models by means of the techniques from data assimilation is a promising new approach to explore and to predict the magnetic activity cycle of the Sun.

  18. Review of analysis methods to prevent thermal buckling

    International Nuclear Information System (INIS)

    This report is a state-of-the-art review of practical methods for analyzing buckling risks, mainly due to thermal stresses, in slender shell structures. A critical review of the theoretical, numerical and experimental results available in the open literature up to 1986 is performed. These are examined particularly from the point of view of simplicity of formulation and experimental validation. The final aim of this study is to propose analysis methods of practical use for engineers. Most of the information used was obtained from the aeronautic and nuclear (fast breeder reactor) domains

  19. Staining Method for Protein Analysis by Capillary Gel Electrophoresis

    OpenAIRE

    Wu, Shuqing; Lu, Joann J; Wang, Shili; Peck, Kristy L.; Li, Guigen; Liu, Shaorong

    2007-01-01

    A novel staining method and the associated fluorescent dye were developed for protein analysis by capillary SDS-PAGE. The strategy of the method is to synthesize a pseudo-SDS dye and use it to replace some of the SDS in SDS-protein complexes so that the protein can be fluorescently detected. The pseudo-SDS dye consists of a long, straight alkyl chain connected to a negatively charged fluorescent head, and it binds to proteins just as SDS does. The number of dye molecules incorporated with a protein depends on ...

  20. Dynamic Characteristic Analysis and Experiment for Integral Impeller Based on Cyclic Symmetry Analysis Method

    Institute of Scientific and Technical Information of China (English)

    WU Qiong; ZHANG Yidu; ZHANG Hongwei

    2012-01-01

    A cyclic symmetry analysis method is proposed for analyzing the dynamic characteristics of a thin-walled integral impeller. The reliability and feasibility of the method are investigated by means of simulation and experiment. The fundamental cyclic symmetry equations and their solutions are derived for the cyclically symmetric structure, and the computational efficiency of the single-sector model versus the whole model is analyzed. Comparison of the results obtained by finite element analysis (FEA) and experiment shows that the local dynamic characteristics of the integral impeller are consistent with those of a single cyclic-symmetry blade. When the integral impeller is constrained and the thin-walled blade is the object of analysis, the dynamic characteristics of the integral impeller can be approximated by those of the cyclic-symmetry blade. Hence, the cyclic symmetry analysis method effectively improves efficiency and yields more parameter information for the dynamic characteristics of integral impellers.
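The efficiency gain of cyclic symmetry can be illustrated on a toy model: a ring of N identical sectors has a (block-)circulant system matrix, so each harmonic index can be solved independently from a single sector instead of assembling the full model. The scalar spring-mass ring below is a hedged illustration, not the impeller model of the paper.

```python
import numpy as np

# Cyclic-symmetry reduction sketch: a ring of N identical spring-coupled
# sectors (one scalar DOF per sector, unit mass) has a circulant stiffness
# matrix, so each harmonic index k can be solved independently.
N, kspring = 12, 2.0

# Full-model route: assemble the N x N circulant stiffness matrix.
K = np.zeros((N, N))
for i in range(N):
    K[i, i] = 2 * kspring
    K[i, (i + 1) % N] = K[i, (i - 1) % N] = -kspring
full_eigs = np.sort(np.linalg.eigvalsh(K))

# Cyclic-symmetry route: one sector plus the inter-sector phase factor
# exp(2*pi*1j*k/N) gives each eigenvalue in closed form, no assembly needed.
harmonics = np.arange(N)
reduced_eigs = np.sort(2 * kspring - 2 * kspring * np.cos(2 * np.pi * harmonics / N))

print(np.allclose(full_eigs, reduced_eigs))  # -> True
```

For a real impeller each scalar here becomes a sector-sized matrix block, so the reduction replaces one large eigenproblem with N small ones, which is the efficiency argument of the abstract.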

  1. Probabilistic structural analysis methods for select space propulsion system components

    Science.gov (United States)

    Millwater, H. R.; Cruse, T. A.

    1989-01-01

    The Probabilistic Structural Analysis Methods (PSAM) project developed at the Southwest Research Institute integrates state-of-the-art structural analysis techniques with probability theory for the design and analysis of complex large-scale engineering structures. An advanced, efficient software system (NESSUS) capable of performing complex probabilistic analysis has been developed. NESSUS contains a number of software components to perform probabilistic analysis of structures. These components include: an expert system, a probabilistic finite element code, a probabilistic boundary element code and a fast probability integrator. The NESSUS software system is shown. An expert system is included to capture and utilize PSAM knowledge and experience. NESSUS/EXPERT is an interactive menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator (FPI). The expert system menu structure is summarized. The NESSUS system contains a state-of-the-art nonlinear probabilistic finite element code, NESSUS/FEM, to determine the structural response and sensitivities. A broad range of analysis capabilities and an extensive element library are present.
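    As a minimal illustration of the kind of quantity a fast probability integrator delivers (a textbook first-order sketch under assumed inputs, not NESSUS code), consider a linear limit state g = R - S with independent normal resistance R and load S; the reliability index and failure probability then have closed forms. All numbers below are hypothetical.

```python
import math

def form_linear_normal(mu_r, sig_r, mu_s, sig_s):
    """First-order reliability for g = R - S, with R, S independent normal.

    beta is the reliability (safety) index; the failure probability is
    Pf = Phi(-beta), evaluated here via the complementary error function.
    """
    beta = (mu_r - mu_s) / math.hypot(sig_r, sig_s)
    pf = 0.5 * math.erfc(beta / math.sqrt(2.0))  # standard normal CDF at -beta
    return beta, pf

# Illustrative inputs: resistance N(500, 50), load N(300, 40).
beta, pf = form_linear_normal(500.0, 50.0, 300.0, 40.0)
```

    For nonlinear limit states the same idea applies after linearization at the most probable point, which is where codes like an FPI earn their keep.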

  2. SAMA: A Method for 3D Morphological Analysis.

    Science.gov (United States)

    Paulose, Tessie; Montévil, Maël; Speroni, Lucia; Cerruti, Florent; Sonnenschein, Carlos; Soto, Ana M

    2016-01-01

    Three-dimensional (3D) culture models are critical tools for understanding tissue morphogenesis. A key requirement for their analysis is the ability to reconstruct the tissue into computational models that allow quantitative evaluation of the formed structures. Here, we present Software for Automated Morphological Analysis (SAMA), a method by which epithelial structures grown in 3D cultures can be imaged, reconstructed and analyzed with minimum human intervention. SAMA allows quantitative analysis of key features of epithelial morphogenesis such as ductal elongation, branching and lumen formation that distinguish different hormonal treatments. SAMA is a user-friendly set of customized macros operated via FIJI (http://fiji.sc/Fiji), an open-source image analysis platform in combination with a set of functions in R (http://www.r-project.org/), an open-source program for statistical analysis. SAMA enables a rapid, exhaustive and quantitative 3D analysis of the shape of a population of structures in a 3D image. SAMA is cross-platform, licensed under the GPLv3 and available at http://montevil.theobio.org/content/sama. PMID:27035711

  3. Analysis and investigation to draw up design method by inelastic analysis

    International Nuclear Information System (INIS)

    To realize small, simple plant equipment, FBR design by inelastic analysis was studied. With the constitutive equation and analysis procedure proposed as the design method by inelastic analysis, the effects of loading history on the results of inelastic analysis were investigated using a simple model. It was confirmed that estimation by the classical inelastic constitutive equation lies on the safe side with respect to the loading history of the real reactor. The problems of applying the detailed constitutive equation to design were investigated. The creep-fatigue damage evaluation logic in the intermediate retaining state, which is a problem in estimating strength on the basis of inelastic analysis, is also studied. (S.Y.)

  4. Methods of Analysis of Electronic Money in Banks

    Directory of Open Access Journals (Sweden)

    Melnychenko Oleksandr V.

    2014-03-01

    Full Text Available The article identifies methods of analysis of electronic money, formalises its instruments and offers an integral indicator, which should be calculated by issuing banks and by those banks that carry out operations with electronic money issued by other banks. Calculation of the integral indicator would allow a comprehensive assessment of a bank's activity with electronic money and a comparison of different banks by an aggregate of indicators, for studying the electronic money market, its level of development, etc. The article presents methods that envisage economic analysis of electronic money in banks along the following directions: solvency and liquidity, efficiency of electronic money issue, business activity of the bank, and social responsibility. Moreover, the indicators proposed for each of the directions are to be taken into account when building the integral indicators with which banks are studied: business activity, profitability, solvency, liquidity and so on.
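    The article does not give the aggregation formula, so the sketch below illustrates one common way to build an integral indicator: a weighted geometric mean of normalized direction scores. The directions, scores, and weights here are hypothetical assumptions, not the article's formula.

```python
import math

def integral_indicator(scores, weights):
    """Weighted geometric mean of normalized sub-indicators in (0, 1].

    A generic way to fold direction-specific scores (e.g. liquidity,
    issue efficiency, business activity) into one comparable number.
    The aggregation rule and the weights are illustrative only.
    """
    assert len(scores) == len(weights)
    assert abs(sum(weights) - 1.0) < 1e-9
    return math.exp(sum(w * math.log(s) for s, w in zip(scores, weights)))

# Hypothetical normalized scores for one bank, by direction:
scores  = [0.80, 0.65, 0.90, 0.70]   # solvency/liquidity, issue efficiency,
weights = [0.30, 0.30, 0.20, 0.20]   # business activity, social responsibility
ii = integral_indicator(scores, weights)
```

    A geometric mean penalizes a bank that is weak in any one direction more strongly than an arithmetic mean would, which is often the desired behavior for composite bank indicators.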

  5. Current Human Reliability Analysis Methods Applied to Computerized Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring

    2012-06-01

    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no US nuclear power plant has implemented CPs in its main control room (Fink et al., 2009). Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of enhanced ease of use and easier records management by omitting the need for updating hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  6. The Implementation of a Vulnerability Topology Analysis Method for ICS

    Directory of Open Access Journals (Sweden)

    Yang Yi Lin

    2016-01-01

    Full Text Available Nowadays the Industrial Control System (ICS) is becoming more and more important in significant fields. However, the use of general-purpose IT facilities in these systems exposes many security issues, because there are a number of differences between original IT systems and ICSs. At the same time, traditional vulnerability-scanning technology lacks the ability to recognize the interactions between vulnerabilities in the network, so ICSs remain in a high-risk state. This paper therefore focuses on vertex aggregation in topological vulnerability analysis. We implemented a topological analysis method oriented to ICSs on the basis of existing domestic and international work. The result shows that this method can address the complex vulnerability correlations within industrial control networks.
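    Topological vulnerability analysis of the kind described can be sketched as reachability over an attack graph, where nodes are (host, privilege) states and edges are individual exploitable vulnerabilities; vulnerability chaining then falls out of a plain BFS. The network, hosts, and vulnerability labels below are entirely hypothetical.

```python
from collections import deque

# Hypothetical attack graph: each edge is one exploitable vulnerability
# taking the attacker from one (host, privilege) state to another.
edges = {
    "internet":       [("hmi:user", "VULN-A weak web auth")],
    "hmi:user":       [("hmi:root", "VULN-B local privesc"),
                       ("plc:user", "VULN-C exposed service")],
    "hmi:root":       [("historian:user", "VULN-D reused credentials")],
    "plc:user":       [("plc:root", "VULN-E default password")],
    "historian:user": [],
    "plc:root":       [],
}

def reachable(start):
    """BFS over the attack graph: which states can chained exploits reach?"""
    seen, queue = {start}, deque([start])
    while queue:
        state = queue.popleft()
        for nxt, _vuln in edges.get(state, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

states = reachable("internet")
```

    A per-host scanner would report VULN-E as low risk (it needs local access); the graph view shows that chaining VULN-A and VULN-C makes PLC root reachable from the internet.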

  7. Computational methods for efficient structural reliability and reliability sensitivity analysis

    Science.gov (United States)

    Wu, Y.-T.

    1993-01-01

    This paper presents recent developments in efficient structural reliability analysis methods. The paper proposes an efficient, adaptive importance sampling (AIS) method that can be used to compute reliability and reliability sensitivities. The AIS approach uses a sampling density that is proportional to the joint PDF of the random variables. Starting from an initial approximate failure domain, sampling proceeds adaptively and incrementally with the goal of reaching a sampling domain that is slightly greater than the failure domain to minimize over-sampling in the safe region. Several reliability sensitivity coefficients are proposed that can be computed directly and easily from the above AIS-based failure points. These probability sensitivities can be used for identifying key random variables and for adjusting design to achieve reliability-based objectives. The proposed AIS methodology is demonstrated using a turbine blade reliability analysis problem.
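    A stripped-down version of the importance-sampling estimator can be sketched for a linear limit state in standard normal space, where the exact answer is known. The single fixed recentring at the most probable point stands in for the paper's adaptive, incremental updating of the sampling domain, and all numbers are illustrative.

```python
import math
import random

random.seed(7)
BETA = 3.0  # target reliability index for this toy problem

def g(u):
    """Limit state in 2D standard normal space: failure when g(u) < 0.
    Linear, so the exact failure probability Phi(-BETA) is known."""
    return BETA - (u[0] + u[1]) / math.sqrt(2.0)

# Sampling density centred at the most probable point (MPP) on g = 0;
# a single fixed recentring, in place of the adaptive scheme.
mpp = (BETA / math.sqrt(2.0), BETA / math.sqrt(2.0))

def log_normal2(u, mean):
    """Log of a 2D independent unit-variance normal pdf, up to a constant."""
    return -0.5 * sum((x - m) ** 2 for x, m in zip(u, mean))

n = 20000
acc = 0.0
for _ in range(n):
    u = tuple(random.gauss(m, 1.0) for m in mpp)
    if g(u) < 0.0:
        # importance weight = target pdf / sampling pdf (constants cancel)
        acc += math.exp(log_normal2(u, (0.0, 0.0)) - log_normal2(u, mpp))
pf_est = acc / n

pf_exact = 0.5 * math.erfc(BETA / math.sqrt(2.0))  # Phi(-3), about 1.35e-3
```

    Crude Monte Carlo would need millions of samples for a probability this small; centring the sampler near the failure boundary makes roughly half the samples informative.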

  8. Development of Photogrammetric Methods of Stress Analysis and Quality Control

    CERN Document Server

    Kubik, D L; Kubik, Donna L.; Greenwood, John A.

    2003-01-01

    A photogrammetric method of stress analysis has been developed to test thin, nonstandard windows designed for hydrogen absorbers, major components of a muon cooling channel. The purpose of the absorber window tests is to demonstrate an understanding of the window behavior and strength as a function of applied pressure. This is done by comparing the deformation of the window, measured via photogrammetry, to the deformation predicted by finite element analysis (FEA). FEA analyses indicate a strong sensitivity of strain to the window thickness. Photogrammetric methods were chosen to measure the thickness of the window, thus providing data that are more accurate to the FEA. This, plus improvements made in hardware and testing procedures, resulted in a precision of 5 microns in all dimensions and substantial agreement with FEA predictions.

  9. Using non-parametric methods in econometric production analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    2012-01-01

    Econometric estimation of production functions is one of the most common methods in applied economic production analysis. These studies usually apply parametric estimation techniques, which obligate the researcher to specify the functional form of the production function, of which the Cobb-Douglas and Translog are the most common. A misspecified functional form results not only in biased parameter estimates, but also in biased measures derived from the parameters, such as elasticities. Therefore, we propose to use non-parametric econometric methods. First, these can be applied to verify the functional form used in parametric production analysis. Second, they can be used to analyse farms by investigating the relationship between the elasticity of scale and the farm size. We use a balanced panel data set of 371 specialised crop farms for the years 2004-2007. A non-parametric specification test shows that neither the Cobb-Douglas function nor the Translog function are consistent with the data.
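    A minimal example of the non-parametric approach is a Nadaraya-Watson kernel regression, which estimates the regression function as a locally weighted average and so needs no Cobb-Douglas or Translog assumption. The synthetic data and bandwidth below are hypothetical, not the authors' farm panel.

```python
import math
import random

random.seed(1)

def nw_regression(x0, xs, ys, h):
    """Nadaraya-Watson kernel regression at point x0.

    A Gaussian-kernel locally weighted average of the responses: no
    functional form is imposed, in contrast to parametric estimation.
    """
    weights = [math.exp(-0.5 * ((x0 - x) / h) ** 2) for x in xs]
    return sum(w * y for w, y in zip(weights, ys)) / sum(weights)

# Synthetic 'input scale -> log output' data around a known curve.
xs = [i / 10.0 for i in range(1, 101)]
ys = [math.log(1.0 + x) + random.gauss(0.0, 0.05) for x in xs]

fit = nw_regression(5.0, xs, ys, h=0.5)
truth = math.log(6.0)  # the noise-free curve at x = 5
```

    The bandwidth h plays the role the functional form plays in parametric work: too small and the fit chases noise, too large and it over-smooths, which is why bandwidth selection gets so much attention in this literature.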

  10. Method for fractional solid-waste sampling and chemical analysis

    DEFF Research Database (Denmark)

    Riber, Christian; Rodushkin, I.; Spliid, Henrik;

    2007-01-01

    Chemical characterization of solid waste is a demanding task due to the heterogeneity of the waste. This article describes how 45 material fractions hand-sorted from Danish household waste were subsampled and prepared for chemical analysis of 61 substances. All material fractions were subject to repeated particle-size reduction, mixing, and mass reduction until a sufficiently small but representative sample was obtained for digestion prior to chemical analysis. The waste-fraction samples were digested according to their properties for maximum recognition of all the studied substances. By combining four subsampling methods and five digestion methods, paying attention to the heterogeneity and the material characteristics of the waste fractions, it was possible to determine 61 substances with low detection limits, reasonable variance, and high accuracy. For most of the substances of environmental...

  11. Printing metal-spiked inks for LA-ICP-MS bioimaging internal standardization: comparison of the different nephrotoxic behavior of cisplatin, carboplatin, and oxaliplatin.

    Science.gov (United States)

    Moraleja, Irene; Esteban-Fernández, Diego; Lázaro, Alberto; Humanes, Blanca; Neumann, Boris; Tejedor, Alberto; Mena, M Luz; Jakubowski, Norbert; Gómez-Gómez, M Milagros

    2016-03-01

    The study of the distribution of the cytostatic drugs cisplatin, carboplatin, and oxaliplatin along the kidney may help to understand their different nephrotoxic behavior. Laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) allows the acquisition of trace-element images in biological tissues. However, the results obtained are affected by several variations concerning the sample matrix and by instrumental drift. In this work, an internal standardization method based on printing an Ir-spiked ink onto the surface of the sample has been developed to evaluate the different distributions and accumulation levels of the aforementioned drugs along the kidney of a rat model. A conventional ink-jet printer was used to print fresh sagittal kidney tissue slices of 4 μm. A reproducible and homogeneous deposition of the ink along the tissue was observed. The ink was partially absorbed on top of the tissue; thus, this approach provides a pseudo-internal standardization, because ablation of the sample and of the internal standard takes place subsequently rather than simultaneously. A satisfactory normalization of LA-ICP-MS bioimages, and therefore a reliable comparison of kidneys treated with the different Pt-based drugs, was achieved even for tissues analyzed on different days. Because the sample is ablated completely, the transport of the ablated internal standard and tissue to the inductively coupled plasma mass spectrometer (ICP-MS) takes place practically at the same time. Pt accumulation in the kidney was observed in accordance with the dosages administered for each drug. Although the accumulation rates of cisplatin and oxaliplatin are both high, their Pt distributions differ. The strong nephrotoxicity observed for cisplatin and the absence of such a side effect in the case of oxaliplatin could explain these distribution differences. The homogeneous distribution of oxaliplatin in the cortical and medullar areas could be related with its higher affinity for
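    The pixel-wise normalization step that such an internal standard enables can be sketched as follows (a schematic illustration with toy numbers, not the authors' processing code): dividing the analyte signal by the internal-standard signal cancels drift and ablation-yield variation that affect both channels alike.

```python
def normalize_image(pt_counts, ir_counts):
    """Pixel-wise internal-standard normalization for LA-ICP-MS maps.

    Dividing the analyte (Pt) signal by the printed internal standard
    (Ir) cancels shared drift/ablation variation; rescaling by the mean
    Ir signal keeps the result in count-like units.
    """
    flat_ir = [v for row in ir_counts for v in row]
    mean_ir = sum(flat_ir) / len(flat_ir)
    return [
        [pt / ir * mean_ir for pt, ir in zip(pt_row, ir_row)]
        for pt_row, ir_row in zip(pt_counts, ir_counts)
    ]

# Toy 2x3 maps: the second row suffers a 20% ablation-yield drop that
# affects Pt and Ir alike, so normalization removes it.
pt = [[100.0, 120.0, 110.0],
      [ 80.0,  96.0,  88.0]]
ir = [[ 50.0,  50.0,  50.0],
      [ 40.0,  40.0,  40.0]]
norm = normalize_image(pt, ir)
```

    After normalization the two rows carry identical values, as they should, since the underlying Pt distribution in this toy example is the same in both.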

  12. Genome analysis methods - PGDBj Registered plant list, Marker list, QTL list, Plant DB link & Genome analysis methods | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available PGDBj Registered plant list, Marker list, QTL list, Plant DB link & Genome analysis methods Genome analysis methods... Data detail Data name Genome analysis methods Description of data contents The current status and re...ion of genomic database are shown in this list. Data file File name: pgdbj_dna_marker_linkage_map_genome_analysis_methods...r-linkage-map/LATEST/pgdbj_dna_marker_linkage_map_genome_analysis_methods_en.zip File size: 5.8 KB Simple search URL http://togodb.biosciencedbc.jp/togodb/view/pgdbj_dna_marker_linkage_map_genome_analysis_methods_en

  13. Application of the maximum entropy method to profile analysis

    International Nuclear Information System (INIS)

    Full text: A maximum entropy (MaxEnt) method for analysing crystallite size- and strain-induced x-ray profile broadening is presented. This method treats the problems of determining the specimen profile, crystallite size distribution, and strain distribution in a general way by considering them as inverse problems. A common difficulty faced by many experimenters is their inability to determine a well-conditioned solution of the integral equation, which preserves the positivity of the profile or distribution. We show that the MaxEnt method overcomes this problem, while also enabling a priori information, in the form of a model, to be introduced into it. Additionally, we demonstrate that the method is fully quantitative, in that uncertainties in the solution profile or solution distribution can be determined and used in subsequent calculations, including mean particle sizes and rms strain. An outline of the MaxEnt method is presented for the specific problems of determining the specimen profile and crystallite or strain distributions for the correspondingly broadened profiles. This approach offers an alternative to standard methods such as those of Williamson-Hall and Warren-Averbach. An application of the MaxEnt method is demonstrated in the analysis of alumina size-broadened diffraction data (from NIST, Gaithersburg). It is used to determine the specimen profile and column-length distribution of the scattering domains. Finally, these results are compared with the corresponding Williamson-Hall and Warren-Averbach analyses. Copyright (1999) Australian X-ray Analytical Association Inc
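    The MaxEnt principle itself can be sketched on a small inverse problem: among all distributions on a discrete support with a prescribed mean, the entropy maximizer has the Gibbs form p_i ∝ exp(-λx_i), with λ fixed by the constraint. This is a generic illustration of the method, not the profile-analysis code; the support and target mean are arbitrary.

```python
import math

def maxent_given_mean(support, target_mean, tol=1e-10):
    """Maximum-entropy pmf on `support` subject to a fixed mean.

    The entropy maximizer has the Gibbs form p_i ~ exp(-lam * x_i); we
    find the Lagrange multiplier lam by bisection on the resulting
    mean, which is monotonically decreasing in lam.
    """
    def mean_for(lam):
        w = [math.exp(-lam * x) for x in support]
        return sum(x * wi for x, wi in zip(support, w)) / sum(w)

    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_for(mid) > target_mean:
            lo = mid   # mean too large -> need larger lam
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(-lam * x) for x in support]
    z = sum(w)
    return [wi / z for wi in w]

p = maxent_given_mean([0, 1, 2, 3, 4], 1.0)
```

    In the profile-analysis setting the constraints are integral-equation data rather than a single moment, but the structure is the same: entropy regularizes the ill-posed inversion while a priori information enters through the constraints and a model.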

  14. Method for Complex Analysis of Banking Group Member Efficiency Rate

    OpenAIRE

    Marharyta Ambarchyan

    2013-01-01

    A Banking Group is a body of legal entities with a common controller. Since the Banking Group's financial result depends on the financial result of each Group member, a methodology is needed that makes it possible to evaluate the efficiency of each Group member. Such evaluations are required to justify the appropriateness of including a member in the Banking Group. The article proposes methods for a comprehensive analysis of Banking Group member efficiency. ...

  15. Big Data and Specific Analysis Methods for Insurance Fraud Detection

    Directory of Open Access Journals (Sweden)

    Ramona BOLOGA

    2014-02-01

    Full Text Available Analytics is the future of big data, because only transforming data into information gives it value and can turn data into a competitive advantage for business. Large data volumes, their variety and the increasing speed of their growth stretch the boundaries of traditional data warehouses and ETL tools. This paper investigates the benefits of Big Data technology and the main methods of analysis that can be applied to the particular case of fraud detection in the public health insurance system in Romania.

  16. Traditions and Alcohol Use: A Mixed-Methods Analysis

    OpenAIRE

    Castro, Felipe González; Coe, Kathryn

    2007-01-01

    An integrative mixed-methods analysis examined traditional beliefs as associated with beliefs about self-care during pregnancy and with alcohol abstinence among young adult women from two rural U.S.–Mexico border communities. Quantitative (measured scale) variables and qualitative thematic variables generated from open-ended responses served as within-time predictors of these health-related outcomes. A weaker belief that life is better in big cities was associated with stronger self-care beli...

  17. Intelligence Intrusion Detection Prevention Systems using Object Oriented Analysis method

    OpenAIRE

    DR.K.KUPPUSAMY; S. Murugan

    2010-01-01

    This paper is intended to provide a model for “Intelligence Intrusion Detection Prevention Systems using the Object Oriented Analysis method”. It describes the state’s overall requirements regarding the acquisition and implementation of intrusion prevention and detection systems with intelligence (IIPS/IIDS). It is designed to provide a deeper understanding of intrusion prevention and detection principles with intelligence for those who may be responsible for acquiring, implementing or monitoring such sy...

  18. SYSTEMATIC REVIEW AND META-ANALYSIS: PITFALLS OF METHODS

    OpenAIRE

    Yu. V. Lukina; S. Yu. Martsevich; N. P. Kutishenko

    2016-01-01

    With the emergence of a huge number of studies and publications in various fields of medicine, a significant increase in information, and the inability of individual studies to identify weak effects, methods that allow the accumulated information on specifically formulated questions to be systematized and analyzed, and reliable conclusions to be drawn, are of great interest and popularity. The use of systematic reviews and high-quality meta-analysis is an analytical basis of evidence-based medicine; it is a very val...

  19. Pseudo-dynamic method for structural analysis of automobile seats

    OpenAIRE

    J. O. Carneiro; Melo, F. J. Q. de; Pereira, J. T.; Teixeira, V.

    2005-01-01

    This work describes the application of a pseudo-dynamic (PsD) method to the dynamic analysis of passenger seats for the automotive industry. The design of such components involves a structural test considering the action of dynamic forces arising from a crash scenario. The laboratory certification of these automotive components consists essentially of the inspection of the propagation and extension of plastic deformation zones in the metallic members of the seat structure as cons...
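    The essence of a PsD test, numerical time integration of the equation of motion with the restoring force measured on the physical specimen at each step, can be sketched with the restoring-force "measurement" replaced by a callback (here a linear spring; all parameters are illustrative).

```python
import math

def pseudo_dynamic(mass, dt, steps, measure_restoring_force,
                   load=lambda t: 0.0, d0=0.0, v0=0.0):
    """Central-difference pseudo-dynamic (PsD) integration.

    Inertia is handled numerically, while the restoring force at each
    imposed displacement would be measured on the physical specimen;
    here the measurement is simulated by a callback.
    """
    a0 = (load(0.0) - measure_restoring_force(d0)) / mass
    d_prev = d0 - dt * v0 + 0.5 * dt * dt * a0   # consistent starting value
    d = d0
    history = [d0]
    for i in range(steps):
        r = measure_restoring_force(d)           # the lab measurement in PsD
        d_next = 2.0 * d - d_prev + dt * dt * (load(i * dt) - r) / mass
        d_prev, d = d, d_next
        history.append(d)
    return history

# Stand-in 'specimen': linear spring k = 400 N/m, mass 1 kg (omega = 20
# rad/s); free vibration from 10 mm should follow d(t) = 0.01*cos(20 t).
k, m = 400.0, 1.0
hist = pseudo_dynamic(m, dt=0.001, steps=1000,
                      measure_restoring_force=lambda d: k * d, d0=0.01)
```

    Because only the restoring force comes from the specimen, a PsD test can run far slower than real time, which is what makes it attractive for certifying components whose plastic behavior is hard to model.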

  20. Computational Methods for Failure Analysis and Life Prediction

    Science.gov (United States)

    Noor, Ahmed K. (Compiler); Harris, Charles E. (Compiler); Housner, Jerrold M. (Compiler); Hopkins, Dale A. (Compiler)

    1993-01-01

    This conference publication contains the presentations and discussions from the joint UVA/NASA Workshop on Computational Methods for Failure Analysis and Life Prediction held at NASA Langley Research Center 14-15 Oct. 1992. The presentations focused on damage failure and life predictions of polymer-matrix composite structures. They covered some of the research activities at NASA Langley, NASA Lewis, Southwest Research Institute, industry, and universities. Both airframes and propulsion systems were considered.