WorldWideScience

Sample records for bioimage analysis methods

  1. A biosegmentation benchmark for evaluation of bioimage analysis methods

    Directory of Open Access Journals (Sweden)

    Kvilekval Kristian

    2009-11-01

Full Text Available Abstract Background We present a biosegmentation benchmark that includes infrastructure, datasets with associated ground truth, and validation methods for biological image analysis. The primary motivation for creating this resource is that it is very difficult, if not impossible, for an end-user to choose from the wide range of segmentation methods available in the literature for a particular bioimaging problem. No single algorithm is likely to be equally effective on a diverse set of images, and each method has its own strengths and limitations. We hope that our benchmark resource will be of considerable help both to bioimaging researchers looking for novel image processing methods and to image processing researchers exploring the application of their methods to biology. Results Our benchmark consists of different classes of images and ground truth data, ranging in scale from subcellular and cellular to tissue level, each of which poses its own set of challenges to image analysis. The associated ground truth data can be used to evaluate the effectiveness of different methods, to improve methods, and to compare results. Standard evaluation methods and some analysis tools are integrated into a database framework that is available online at http://bioimage.ucsb.edu/biosegmentation/. Conclusion This online benchmark will facilitate the integration and comparison of image analysis methods for bioimages. While the primary focus is on biological images, we believe that the dataset and infrastructure will be of interest to researchers and developers working with biological image analysis, image segmentation and object tracking in general.
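Evaluation of a segmentation against benchmark ground truth typically reduces to region-overlap scores. As a minimal illustration (not the benchmark's own code; the function name and toy masks are invented here), the common Jaccard and Dice coefficients can be computed as:

```python
import numpy as np

def jaccard_and_dice(pred: np.ndarray, truth: np.ndarray):
    """Compare a binary segmentation mask against ground truth."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    inter = np.logical_and(pred, truth).sum()
    union = np.logical_or(pred, truth).sum()
    total = pred.sum() + truth.sum()
    jaccard = inter / union if union else 1.0   # intersection over union
    dice = 2 * inter / total if total else 1.0  # harmonic-mean style overlap
    return jaccard, dice

# Toy 4x4 masks: predicted segmentation vs. ground truth
pred  = np.array([[1, 1, 0, 0], [1, 1, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]])
truth = np.array([[1, 1, 0, 0], [1, 0, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]])
j, d = jaccard_and_dice(pred, truth)  # j = 3/4, d = 6/7
```

Both scores range over [0, 1], with 1 indicating perfect agreement; Dice weighs the intersection more heavily than Jaccard.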

  2. Transforms and Operators for Directional Bioimage Analysis: A Survey.

    Science.gov (United States)

    Püspöki, Zsuzsanna; Storath, Martin; Sage, Daniel; Unser, Michael

    2016-01-01

We give a methodology-oriented perspective on directional image analysis and rotation-invariant processing. We review the state of the art in the field and make connections with recent mathematical developments in functional analysis and wavelet theory. We unify our perspective within a common framework using operators. The intent is to provide image-processing methods that can be deployed in algorithms that analyze biomedical images with improved rotation invariance and high directional sensitivity. We start our survey with classical methods such as directional gradients and the structure tensor. Then, we discuss how these methods can be improved with respect to robustness, invariance to geometric transformations (with a particular interest in scaling), and computational cost. To address robustness against noise, we move on to higher degrees of directional selectivity and discuss Hessian-based detection schemes. To present multiscale approaches, we explain the differences between Fourier filters, directional wavelets, curvelets, and shearlets. To reduce the computational cost, we address the problem of matching directional patterns by proposing steerable filters, where one can perform arbitrary rotations and optimizations without discretizing the orientation. We define the property of steerability and give an introduction to the design of steerable filters. We cover the spectrum from simple steerable filters through pyramid schemes up to steerable wavelets. We also present illustrations of the design of steerable wavelets and their application to pattern recognition.
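The classical structure-tensor method mentioned above estimates local orientation from averaged products of image gradients. A minimal NumPy sketch (whole-patch averaging stands in for the usual Gaussian window, and the function name is illustrative, not from the survey):

```python
import numpy as np

def structure_tensor_orientation(img):
    """Dominant local orientation (direction of maximum intensity
    variation) from the 2-D structure tensor of an image patch."""
    gy, gx = np.gradient(img.astype(float))  # np.gradient: axis 0 (rows) first
    # Tensor components, averaged over the whole patch for brevity
    Jxx = (gx * gx).mean()
    Jxy = (gx * gy).mean()
    Jyy = (gy * gy).mean()
    # Orientation of the dominant eigenvector of [[Jxx, Jxy], [Jxy, Jyy]]
    theta = 0.5 * np.arctan2(2 * Jxy, Jxx - Jyy)
    return theta

# A patch with purely horizontal stripes: intensity varies only along y,
# so the dominant orientation is vertical (pi/2 radians)
y = np.arange(16)
stripes = np.tile(np.sin(0.8 * y)[:, None], (1, 16))
theta = structure_tensor_orientation(stripes)  # ~ pi/2
```

The eigen-analysis of the smoothed tensor also yields a coherence measure (from the eigenvalue gap), which the survey's more advanced methods refine.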

  3. A Novel Approach for Information Content Retrieval and Analysis of Bio-Images using Datamining techniques

    Directory of Open Access Journals (Sweden)

    Ayyagari Sri Nagesh

    2012-11-01

Full Text Available In the bio-medical image processing domain, content-based analysis and information retrieval of bio-images is critical for disease diagnosis. Content-Based Image Analysis and Information Retrieval (CBIAIR) has become a significant part of information retrieval technology. One challenge in this area is that the ever-increasing number of bio-images acquired through the digital world makes brute-force searching almost impossible. The content of structural objects in medical images, and object identification, play a significant role in image content analysis and information retrieval. There are three fundamental components of content-based bio-image retrieval: visual-feature extraction, multi-dimensional indexing, and the retrieval process. Each image has three content features: colour, texture and shape. Colour and texture are both important visual features used in content-based image retrieval to improve results. In this paper, we present an effective image retrieval system, called CBIAIR (Content-Based Image Analysis and Information Retrieval), using texture, shape and colour features. Firstly, we developed a new pixel-based texture pattern feature for the CBIAIR system. Subsequently, we used a semantic colour feature for colour-based matching, while shape-based feature selection is done using an existing technique. For retrieval, these features are extracted from the query image and matched against the feature library using a feature-weighted distance. All feature vectors are stored in the database using an indexing procedure. Finally, the relevant images whose matching distance falls below a predefined threshold are retrieved from the image database after applying a K-NN classifier.
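The retrieval step described (feature-weighted distance, threshold, K-NN) can be sketched as follows. This is an illustrative reconstruction under stated assumptions, not the authors' CBIAIR implementation; the toy three-component descriptors (texture, colour, shape) and weights are invented:

```python
import numpy as np

def weighted_distance(query, library, weights):
    """Feature-weighted Euclidean distance from a query descriptor
    to every descriptor in the feature library."""
    return np.sqrt(np.sum(weights * (query - library) ** 2, axis=-1))

def retrieve(query, library, weights, threshold, k=3):
    """Return indices of the k nearest library images whose weighted
    distance to the query falls below the threshold."""
    d = weighted_distance(query, library, weights)
    nearest = np.argsort(d)[:k]              # k-NN step
    return [int(i) for i in nearest if d[i] < threshold]

# Toy descriptors: (texture, colour, shape), with texture weighted 2x
library = np.array([[0.90, 0.10, 0.50],
                    [0.20, 0.80, 0.40],
                    [0.85, 0.15, 0.55]])
query = np.array([0.90, 0.10, 0.50])
weights = np.array([2.0, 1.0, 1.0])
hits = retrieve(query, library, weights, threshold=0.5)  # -> [0, 2]
```

Items 0 and 2 are close to the query under the weighted metric and pass the threshold; item 1 is among the k nearest but is rejected by the distance cut-off.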

  4. KNIME for Open-Source Bioimage Analysis: A Tutorial.

    Science.gov (United States)

    Dietz, Christian; Berthold, Michael R

    2016-01-01

The open analytics platform KNIME is a modular environment that enables easy visual assembly and interactive execution of workflows. KNIME is already widely used in various areas of research, for instance in cheminformatics or classical data analysis. In this tutorial the KNIME Image Processing Extension is introduced, which adds the capability to process and analyse large numbers of images. In combination with other KNIME extensions, KNIME Image Processing opens up new possibilities for inter-domain analysis of image data in an understandable and reproducible way.

  5. Chapter 17: bioimage informatics for systems pharmacology.

    Directory of Open Access Journals (Sweden)

    Fuhai Li

    2013-04-01

Full Text Available Recent advances in automated high-resolution fluorescence microscopy and robotic handling have made possible the systematic and cost-effective study of diverse morphological changes within a large population of cells under a variety of perturbations, e.g., drugs, compounds, metal catalysts, RNA interference (RNAi). Cell population-based studies deviate from conventional microscopy studies on a few cells, and could provide stronger statistical power for drawing experimental observations and conclusions. However, it is challenging to manually extract and quantify phenotypic changes from the large amounts of complex image data generated. Thus, bioimage informatics approaches are needed to rapidly and objectively quantify and analyze the image data. This paper provides an overview of the bioimage informatics challenges and approaches in image-based studies for drug and target discovery. The concepts and capabilities of image-based screening are first illustrated by a few practical examples investigating different kinds of phenotypic changes caused by drugs, compounds, or RNAi. The bioimage analysis approaches, including object detection, segmentation, and tracking, are then described. Subsequently, the quantitative features, phenotype identification, and multidimensional profile analysis for profiling the effects of drugs and targets are summarized. Moreover, a number of publicly available software packages for bioimage informatics are listed for further reference. It is expected that this review will help readers, including those without bioimage informatics expertise, understand the capabilities, approaches, and tools of bioimage informatics and apply them to advance their own studies.
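The object-detection and segmentation steps named above commonly begin with intensity thresholding followed by connected-component labelling. A minimal pure-NumPy sketch of that first stage (illustrative only; real screening pipelines use dedicated libraries and far more robust segmentation):

```python
from collections import deque
import numpy as np

def label_objects(mask):
    """4-connected component labelling of a binary mask via
    breadth-first flood fill; returns a label image and the count."""
    labels = np.zeros(mask.shape, dtype=int)
    count = 0
    for sy, sx in zip(*np.nonzero(mask)):
        if labels[sy, sx]:
            continue  # pixel already assigned to an object
        count += 1
        labels[sy, sx] = count
        queue = deque([(sy, sx)])
        while queue:
            y, x = queue.popleft()
            for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                        and mask[ny, nx] and not labels[ny, nx]):
                    labels[ny, nx] = count
                    queue.append((ny, nx))
    return labels, count

# Two bright "cells" detected by a global intensity threshold
img = np.array([[9, 9, 0, 0],
                [9, 0, 0, 8],
                [0, 0, 8, 8]])
labels, n = label_objects(img > 5)  # n == 2 objects
```

Each labelled object can then feed the quantitative-feature and tracking stages the review goes on to describe.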

  6. Upconverting nanophosphors for bioimaging

    Energy Technology Data Exchange (ETDEWEB)

    Lim, Shuang Fang; Zhuo Rui [Department of MAE, Princeton University, Princeton, NJ 08544 (United States); Riehn, Robert [Department of Physics, North Carolina State University, Raleigh, NC 27695 (United States); Tung, Chih-kuan; Dalland, Joanna; Austin, Robert H [Department of Physics, Princeton University, Princeton, NJ 08544 (United States); Ryu, William S [Lewis-Sigler Institute for Integrative Genomics, Princeton University, Princeton, NJ 08544 (United States)

    2009-10-07

    Upconverting nanoparticles (UCNPs) when excited in the near-infrared (NIR) region display anti-Stokes emission whereby the emitted photon is higher in energy than the excitation energy. The material system achieves that by converting two or more infrared photons into visible photons. The use of the infrared confers benefits to bioimaging because of its deeper penetrating power in biological tissues and the lack of autofluorescence. We demonstrate here sub-10 nm, upconverting rare earth oxide UCNPs synthesized by a combustion method that can be stably suspended in water when amine modified. The amine modified UCNPs show specific surface immobilization onto patterned gold surfaces. Finally, the low toxicity of the UCNPs is verified by testing on the multi-cellular C. elegans nematode.

  7. Machine vision assisted analysis of structure-localization relationships in a combinatorial library of prospective bioimaging probes

    OpenAIRE

    Shedden, Kerby; Li, Qian; Liu, Fangyi; Chang, Young Tae; Rosania, Gus R.

    2009-01-01

With a combinatorial library of bioimaging probes, it is now possible to use machine vision to analyze the contribution of different building blocks of the molecules to their cell-associated visual signals. For this purpose, cell-permeant, fluorescent styryl molecules were synthesized by condensation of 168 aldehyde and 8 pyridinium/quinolinium building blocks. Images of cells incubated with fluorescent molecules were acquired with a high content screening instrument. Chemical and image fea...

  8. Chemical Address Tags of Fluorescent Bioimaging Probes

    OpenAIRE

    Shedden, Kerby; Rosania, Gus R.

    2010-01-01

    Chemical address tags can be defined as specific structural features shared by a set of bioimaging probes having a predictable influence on cell-associated visual signals obtained from these probes. Here, using a large image dataset acquired with a high content screening instrument, machine vision and cheminformatics analysis have been applied to reveal chemical address tags. With a combinatorial library of fluorescent molecules, fluorescence signal intensity, spectral, and spatial features c...

  9. Nanodiamonds for optical bioimaging

    Energy Technology Data Exchange (ETDEWEB)

    Hui, Yuen Yung; Chang, Huan-Cheng [Institute of Atomic and Molecular Sciences, Academia Sinica, Taipei 10617, Taiwan (China); Cheng, Chia-Liang, E-mail: yyhui@pub.iams.sinica.edu.t, E-mail: clcheng@mail.ndhu.edu.t, E-mail: hchang@gate.sinica.edu.t [Department of Physics, National Dong-Hwa University, Hualien 97401, Taiwan (China)

    2010-09-22

    Diamond has received increasing attention for its promising biomedical applications. The material is highly biocompatible and can be easily conjugated with bioactive molecules. Recently, nanoscale diamond has been applied as light scattering labels and luminescent optical markers. The luminescence, arising from photoexcitation of colour centres, can be substantially enhanced when type Ib diamond nanocrystals are bombarded by a high-energy particle beam and then annealed to form negatively charged nitrogen-vacancy centres. The centre absorbs strongly at 560 nm, fluoresces efficiently in the far-red region and is exceptionally photostable (without photoblinking and photobleaching). It is an ideal candidate for long-term imaging and tracking in complex cellular environments. This review summarizes recent advances in the development of fluorescent nanodiamonds for optical bioimaging with single particle sensitivity and nanometric resolution.

  10. Bio-imaging and visualization for patient-customized simulations

    CERN Document Server

    Luo, Xiongbiao; Li, Shuo

    2014-01-01

This book contains the full papers presented at the MICCAI 2013 workshop Bio-Imaging and Visualization for Patient-Customized Simulations (MWBIVPCS 2013). MWBIVPCS 2013 brought together researchers representing several fields, such as Biomechanics, Engineering, Medicine, Mathematics, Physics and Statistics. The contributions included in this book present and discuss new trends in those fields, using several methods and techniques, including the finite element method, similarity metrics, optimization processes, graphs, hidden Markov models, sensor calibration, fuzzy logic, data mining, cellular automata, active shape models, template matching and level sets. These serve as tools to address more efficiently different and timely applications involving signal and image acquisition, image processing and analysis, image segmentation, image registration and fusion, computer simulation, image-based modelling, simulation and surgical planning, image-guided robot-assisted surgery and image-based diagnosis.  This boo...

  11. Carbon dots: large-scale synthesis, sensing and bioimaging

    Directory of Open Access Journals (Sweden)

    Jia Zhang

    2016-09-01

Full Text Available Emerging as a potent alternative to classical metal-based semiconductor quantum dots (Qdots), carbon dots (Cdots) possess the distinctive advantages of convenient synthesis, prominent biocompatibility, colorful photoluminescence, and low cost. After almost a decade of extensive studies since their discovery, Cdots have been widely applied in bioimaging, sensing, catalysis, optoelectronics, energy conversion, etc. In this review, we first highlight large-scale synthetic methods for Cdots. Second, we briefly discuss the fundamental mechanisms underlying their photoluminescence (PL). Third, we focus on their applications in sensing and bioimaging (including imaging-guided therapy). Some thoughts on future developments of Cdots are offered as concluding remarks.

  12. Graphene Quantum Dots for Theranostics and Bioimaging.

    Science.gov (United States)

    Schroeder, Kathryn L; Goreham, Renee V; Nann, Thomas

    2016-10-01

Since their advent in the early 1990s, nanomaterials have held promise for improved technologies in the biomedical area. In particular, graphene quantum dots (GQDs) have been conjectured to produce new, or improve current, methods used for bioimaging, drug delivery, and biomarker sensing for early detection of diseases. This review article critically compares and discusses the current state-of-the-art use of GQDs in biology and the health sciences. It shows the ability of GQDs to be easily functionalised for use as a targeted multimodal treatment and imaging platform. The in vitro and in vivo toxicity of GQDs is explored, showing low toxicity for many types of GQDs.

  13. Open source bioimage informatics for cell biology.

    Science.gov (United States)

    Swedlow, Jason R; Eliceiri, Kevin W

    2009-11-01

Significant technical advances in imaging, molecular biology and genomics have fueled a revolution in cell biology, in that the molecular and structural processes of the cell are now visualized and measured routinely. Driving much of this recent development has been the advent of computational tools for the acquisition, visualization, analysis and dissemination of these datasets. These tools collectively make up a new subfield of computational biology called bioimage informatics, which is facilitated by open source approaches. We discuss why open source tools for image informatics in cell biology are needed, describe some of the key general attributes that make an open source imaging application successful, and point to opportunities for further interoperability that should greatly accelerate future cell biology discovery.

  14. Elemental bioimaging and speciation analysis for the investigation of Wilson's disease using μXRF and XANES.

    Science.gov (United States)

    Hachmöller, Oliver; Buzanich, Ana Guilherme; Aichler, Michaela; Radtke, Martin; Dietrich, Dörthe; Schwamborn, Kristina; Lutz, Lisa; Werner, Martin; Sperling, Michael; Walch, Axel; Karst, Uwe

    2016-07-13

A liver biopsy specimen from a Wilson's disease (WD) patient was analyzed by means of micro-X-ray fluorescence (μXRF) spectroscopy to determine the elemental distribution. First, bench-top μXRF was utilized for a coarse scan of the sample under laboratory conditions. The resulting distribution maps of copper and iron enabled the determination of a region of interest (ROI) for further analysis. In order to obtain more detailed elemental information, this ROI was analyzed by synchrotron radiation (SR)-based μXRF with a beam size of 4 μm, offering resolution at the cellular level. Distribution maps of elements beyond copper and iron, such as zinc and manganese, were obtained due to the higher sensitivity of SR-μXRF. In addition, X-ray absorption near edge structure spectroscopy (XANES) was performed to identify the oxidation states of copper in WD. This speciation analysis indicated a mixture of copper(I) and copper(II) within the WD liver tissue.

  15. Chemically engineered persistent luminescence nanoprobes for bioimaging

    Science.gov (United States)

    Lécuyer, Thomas; Teston, Eliott; Ramirez-Garcia, Gonzalo; Maldiney, Thomas; Viana, Bruno; Seguin, Johanne; Mignet, Nathalie; Scherman, Daniel; Richard, Cyrille

    2016-01-01

Imaging nanoprobes are a group of nanosized agents developed to provide improved contrast for bioimaging. Among various imaging probes, optical sensors capable of following biological events or processes at the cellular and molecular levels are being actively developed for early detection, accurate diagnosis, and monitoring of the treatment of diseases. The optical activities of nanoprobes can be tuned on demand by chemists by engineering their composition, size and surface nature. This review will focus on research devoted to the conception of nanoprobes with particular optical properties, called persistent luminescence, and their use as new powerful bioimaging agents in preclinical assays. PMID:27877248

  16. Biomagnetics and bioimaging for medical applications

    Energy Technology Data Exchange (ETDEWEB)

    Ueno, Shoogo [Department of Biomedical Engineering, Graduate School of Medicine, University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-0033 (Japan)]. E-mail: ueno@medes.m.u-tokyo.ac.jp; Sekino, Masaki [Department of Biomedical Engineering, Graduate School of Medicine, University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-0033 (Japan)

    2006-09-15

    This paper reviews medical applications of the recently developed techniques in biomagnetics and bioimaging such as transcranial magnetic stimulation, magnetoencephalography, magnetic resonance imaging, cancer therapy based on magnetic stimulation, and magnetic control of cell orientation and cell growth. These techniques are leading medicine and biology into a new horizon through the novel applications of magnetism.

  17. Silicon nanomaterials platform for bioimaging, biosensing, and cancer therapy.

    Science.gov (United States)

    Peng, Fei; Su, Yuanyuan; Zhong, Yiling; Fan, Chunhai; Lee, Shuit-Tong; He, Yao

    2014-02-18

    Silicon nanomaterials are an important class of nanomaterials with great potential for technologies including energy, catalysis, and biotechnology, because of their many unique properties, including biocompatibility, abundance, and unique electronic, optical, and mechanical properties, among others. Silicon nanomaterials are known to have little or no toxicity due to favorable biocompatibility of silicon, which is an important precondition for biological and biomedical applications. In addition, huge surface-to-volume ratios of silicon nanomaterials are responsible for their unique optical, mechanical, or electronic properties, which offer exciting opportunities for design of high-performance silicon-based functional nanoprobes, nanosensors, and nanoagents for biological analysis and detection and disease treatment. Moreover, silicon is the second most abundant element (after oxygen) on earth, providing plentiful and inexpensive resources for large-scale and low-cost preparation of silicon nanomaterials for practical applications. Because of these attractive traits, and in parallel with a growing interest in their design and synthesis, silicon nanomaterials are extensively investigated for wide-ranging applications, including energy, catalysis, optoelectronics, and biology. Among them, bioapplications of silicon nanomaterials are of particular interest. In the past decade, scientists have made an extensive effort to construct a silicon nanomaterials platform for various biological and biomedical applications, such as biosensors, bioimaging, and cancer treatment, as new and powerful tools for disease diagnosis and therapy. Nonetheless, there are few review articles covering these important and promising achievements to promote the awareness of development of silicon nanobiotechnology. In this Account, we summarize recent representative works to highlight the recent developments of silicon functional nanomaterials for a new, powerful platform for biological and

  18. Advances in Bio-Imaging From Physics to Signal Understanding Issues State-of-the-Art and Challenges

    CERN Document Server

    Racoceanu, Daniel; Gouaillard, Alexandre

    2012-01-01

Advances in imaging devices and image processing stem from cross-fertilization between many fields of research, such as Chemistry, Physics, Mathematics and Computer Science. The BioImaging community feels the urge to integrate its various results, discoveries and innovations more intensively into ready-to-use tools that can address all the new and exciting challenges that life scientists (biologists, medical doctors, ...) keep providing, almost on a daily basis. Devising innovative chemical probes, for example, is an archetypal goal in which image quality improvement must be driven by the physics of acquisition, the image processing and analysis algorithms, and the chemical skills required to design an optimal bioprobe. This book offers an overview of the current advances in many research fields related to bioimaging and highlights the current limitations that would need to be addressed in the next decade to design a fully integrated bioimaging device.

  19. Education in biomedical informatics: learning by doing bioimage archiving.

    Science.gov (United States)

    Marceglia, Sara; Bonacina, Stefano; Mazzola, Luca; Pinciroli, Francesco

    2007-01-01

For a common user, bioimages seem very easy to treat, to read, to understand and, therefore, to archive. Conversely, bioimage archiving requires a very complex design and implementation process that needs skilled and trained technicians. We proposed to a class of bioengineering students at the Politecnico University of Milan the implementation of a hand-image repository specifically designed to highlight the main features that should be taken into account when treating bioimage archives. Students were required to build the archive with software tools they had previously learned in other programming language courses and available in the university informatics classrooms.

  20. Spatial-scanning hyperspectral imaging probe for bio-imaging applications.

    Science.gov (United States)

    Lim, Hoong-Ta; Murukeshan, Vadakke Matham

    2016-03-01

The three common methods to perform hyperspectral imaging are the spatial-scanning, spectral-scanning, and snapshot methods. However, only the spectral-scanning and snapshot methods have been configured as hyperspectral imaging probes to date. This paper presents a spatial-scanning (pushbroom) hyperspectral imaging probe, which is realized by integrating a pushbroom hyperspectral imager with an imaging probe. The proposed hyperspectral imaging probe can also function as an endoscopic probe by integrating a custom fabricated image fiber bundle unit. The imaging probe is configured by incorporating a gradient-index lens at the end face of an image fiber bundle that consists of about 50,000 individual fiberlets. The necessary simulations, methodology, and detailed instrumentation aspects are explained, followed by an assessment of the developed probe's performance. Resolution test targets such as a United States Air Force chart, as well as bio-samples such as chicken breast tissue with a blood clot, are used as test samples for resolution analysis and performance validation. This system is built on a pushbroom hyperspectral imaging system with a video camera and has the advantage of acquiring information from a large number of spectral bands with a selectable region of interest. The advantages of this spatial-scanning hyperspectral imaging probe can be extended to test samples or tissues residing in regions that are difficult to access, with potential diagnostic bio-imaging applications.
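In a pushbroom system of this kind, each camera frame records one spatial line crossed with all spectral bands, and scanning the probe builds up the full datacube. A schematic sketch of the cube assembly and basic slicing (array shapes, sizes, and function names are assumptions for illustration, not the authors' software):

```python
import numpy as np

def assemble_pushbroom_cube(line_scans):
    """Stack successive line-scan frames into a hyperspectral cube.
    Each frame is a (spatial_x, bands) array recorded at one scan
    position; stacking along a new spatial_y axis yields the cube."""
    return np.stack(line_scans, axis=0)  # shape (y, x, bands)

def band_image(cube, band):
    """Extract a single spectral band as a 2-D greyscale image."""
    return cube[:, :, band]

def pixel_spectrum(cube, y, x):
    """Extract the full spectrum at one spatial pixel."""
    return cube[y, x, :]

# Toy example: 5 scan positions, 64 pixels across the slit, 10 bands
rng = np.random.default_rng(0)
frames = [rng.random((64, 10)) for _ in range(5)]
cube = assemble_pushbroom_cube(frames)  # shape (5, 64, 10)
```

The same slicing applies whether the line scan comes from a bench-top imager or, as here, through a fiber-bundle probe; only the optics in front of the slit change.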

  1. ANALYSIS OF MULTISCALE METHODS

    Institute of Scientific and Technical Information of China (English)

    Wei-nan E; Ping-bing Ming

    2004-01-01

    The heterogeneous multiscale method gives a general framework for the analysis of multiscale methods. In this paper, we demonstrate this by applying this framework to two canonical problems: The elliptic problem with multiscale coefficients and the quasicontinuum method.

  2. Nanodiamonds and silicon quantum dots: ultrastable and biocompatible luminescent nanoprobes for long-term bioimaging.

    Science.gov (United States)

    Montalti, M; Cantelli, A; Battistelli, G

    2015-07-21

Fluorescence bioimaging is a powerful, versatile method for investigating, both in vivo and in vitro, the complex structures and functions of living organisms in real time and space, including with super-resolution techniques. Being poorly invasive, fluorescence bioimaging is suitable for long-term observation of biological processes. Long-term detection is partially prevented by photobleaching of organic fluorescent probes. Semiconductor quantum dots, in contrast, are ultrastable fluorescent contrast agents detectable even at the single-nanoparticle level. The emission color of quantum dots is size dependent, and nanoprobes emitting in the near-infrared (NIR) region are ideal for low-background in vivo imaging. The biocompatibility of nanoparticles containing toxic elements is debated. Recent safety concerns have enforced the search for alternative ultrastable luminescent nanoprobes. The most recent results demonstrate that optimized silicon quantum dots (Si QDs) and fluorescent nanodiamonds (FNDs) show almost no photobleaching in a physiological environment. Moreover, in vitro and in vivo toxicity studies have demonstrated their unique biocompatibility. Si QDs and FNDs are hence ideal diagnostic tools and promising non-toxic vectors for the delivery of therapeutic cargos. The most relevant examples of applications of Si QDs and FNDs to long-term bioimaging are discussed in this review, comparing the toxicity and stability of different nanoprobes.

  3. Bioimaging mass spectrometry of trace elements – recent advance and applications of LA-ICP-MS: A review

    Energy Technology Data Exchange (ETDEWEB)

    Becker, J.Sabine, E-mail: s.becker@fz-juelich.de [Central Institute for Engineering, Electronics and Analytics (ZEA-3), Forschungszentrum Jülich, Jülich D-52425 (Germany); Matusch, Andreas, E-mail: a.matusch@fz-juelich.de [Institute for Neuroscience and Medicine (INM-2), Forschungszentrum Jülich, Jülich D-52425 (Germany); Wu, Bei, E-mail: b.wu@fz-juelich.de [Central Institute for Engineering, Electronics and Analytics (ZEA-3), Forschungszentrum Jülich, Jülich D-52425 (Germany)

    2014-07-04

Highlights: • Bioimaging LA-ICP-MS is established for trace metals within biomedical specimens. • Trace metal imaging allows the study of brain function and neurodegenerative diseases. • Laser microdissection ICP-MS was applied to mouse brain hippocampus and wheat root. - Abstract: Bioimaging using laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) offers the capability to quantify trace elements and isotopes within tissue sections with a spatial resolution ranging from about 10 to 100 μm. Distribution analysis helps to clarify basic questions of biomedical research and enables bioaccumulation and bioavailability studies for ecological and toxicological risk assessment in humans, animals and plants. Major application fields of mass spectrometry imaging (MSI) and metallomics have been brain and cancer research, animal model validation, drug development and plant science. Here we give an overview of the latest achievements in methods and applications. Recent improvements in ablation systems, operation and cell design have enabled progressively better spatial resolutions, down to 1 μm. Meanwhile, a body of research has accumulated covering basic principles of the element architecture in animals and plants that could consistently be reproduced by several laboratories, such as the distribution of Fe, Cu and Zn in rodent brain. Several studies have investigated the distribution and delivery of metallo-drugs in animals. Hyper-accumulating plants and pollution indicator organisms have been the key topics in environmental science. Increasingly, larger series of samples are analyzed, be it in the frame of comparisons between intervention and control groups, of time kinetics, or of three-dimensional atlas approaches.

  4. Applications of quantum dots with upconverting luminescence in bioimaging.

    Science.gov (United States)

    Chen, Yunyun; Liang, Hong

    2014-06-05

Quantum dots (QDs) have attracted great attention in recent years due to their promising applications in bioimaging. Compared with traditional ultraviolet excitation of QDs, near-infrared (NIR) laser excitation has many advantages, such as lower phototoxicity, reduced blinking, zero autofluorescence and deep penetration into tissue. Endowing QDs with upconverting properties is a promising route to NIR excitation. This article provides a review of QDs with upconverting luminescence and their applications in bioimaging. Based on the mechanisms of luminescence, the discussion is divided into four groups: nanoheterostructures/mixtures of QDs and upconverting nanoparticles, graphene quantum dots, lanthanide-doped QDs, and double QDs. The content includes synthetic routes, upconverting luminescence properties, and their applications in bioimaging.

  5. Gold nanoclusters with enhanced tunable fluorescence as bioimaging probes.

    Science.gov (United States)

    Palmal, Sharbari; Jana, Nikhil R

    2014-01-01

Development of unique bioimaging probes offering essential information about biological environments is an important step forward in biomedical science. Nanotechnology offers a variety of novel imaging nanoprobes with high photostability compared to conventional molecular probes, which often suffer from rapid photobleaching. Although great advances have been made in the development of semiconductor-nanocrystal-based fluorescent imaging probes, potential toxicity from their heavy-metal components limits their in vivo therapeutic and clinical application. Recent work shows that fluorescent gold clusters (FGCs) can be a promising nontoxic alternative to semiconductor nanocrystals. FGC-derived imaging nanoprobes offer stable and tunable visible emission, small hydrodynamic size and high biocompatibility, and have been exploited in a variety of in vitro and in vivo imaging applications. In this review, we focus on the synthetic advances and bioimaging potential of FGCs. In particular, we emphasize functional FGCs that are bright and stable enough to be useful as bioimaging probes.

  6. Methods of Multivariate Analysis

    CERN Document Server

    Rencher, Alvin C

    2012-01-01

    Praise for the Second Edition: "This book is a systematic, well-written, well-organized text on multivariate analysis packed with intuition and insight . . . There is much practical wisdom in this book that is hard to find elsewhere." -IIE Transactions. Filled with new and timely content, Methods of Multivariate Analysis, Third Edition provides examples and exercises based on more than sixty real data sets from a wide variety of scientific fields. It takes a "methods" approach to the subject, placing an emphasis on how students and practitioners can employ multivariate analysis in real-life situations.

  7. Bio-imaging of colorectal cancer models using near infrared labeled epidermal growth factor.

    Directory of Open Access Journals (Sweden)

    Gadi Cohen

    Full Text Available Novel strategies that target the epidermal growth factor receptor (EGFR) have led to the clinical development of monoclonal antibodies, which treat metastatic colorectal cancer (mCRC), but only subgroups of patients with increased wild-type KRAS and EGFR gene copy number respond to these agents. Furthermore, resistance to EGFR blockade inevitably occurs, making future therapy difficult. Novel bio-imaging (BOI) methods may assist in the quantitation of EGFR in mCRC tissue, complementing immunohistochemistry and guiding the future treatment of these patients. The aim of the present study was to explore the usefulness of near infrared-labeled EGF (EGF-NIR) for bio-imaging of CRC, using in vitro and in vivo orthotopic tumor CRC models and ex vivo human CRC tissues. We describe the preparation and characterization of EGF-NIR and investigate its binding, using BOI, in a panel of CRC cell culture models resembling the heterogeneity of human CRC tissues. EGF-NIR was specifically and selectively bound by EGFR-expressing CRC cells, and the intensity of the EGF-NIR signal-to-background ratio (SBR) reflected EGFR levels. Dose-response and time-course imaging experiments provided optimal conditions for quantitation of EGFR levels by BOI. EGF-NIR imaging of mice with HT-29 orthotopic CRC tumors indicated that EGF-NIR is slowly cleared from the tumor, and the highest SBR between tumor and normal adjacent tissue was achieved two days post-injection. Furthermore, images of dissected tissues demonstrated accumulation of EGF-NIR in the tumor and liver. EGF-NIR specifically and strongly labeled EGFR-positive human CRC tissues, while adjacent CRC tissue and EGFR-negative tissues showed weak NIR signals. This study emphasizes the use of EGF-NIR for preclinical studies. Combined with other methods, EGF-NIR could provide an additional specific bio-imaging tool for the standardization of measurements of EGFR expression in CRC tissues.

  8. New nanoplatforms based on UCNPs linking with polyhedral oligomeric silsesquioxane (POSS) for multimodal bioimaging

    Science.gov (United States)

    Ge, Xiaoqian; Dong, Liang; Sun, Lining; Song, Zhengmei; Wei, Ruoyan; Shi, Liyi; Chen, Haige

    2015-04-01

    A new and facile method was used to render upconversion luminescent nanoparticles hydrophilic rather than hydrophobic, by linking polyhedral oligomeric silsesquioxane (POSS) to the surface of the upconversion nanoparticles. In comparison with the unmodified upconversion nanoparticles, the POSS-modified upconversion nanoplatforms [POSS-UCNPs(Er), POSS-UCNPs(Tm)] displayed good monodispersion and water-solubility, while their particle size did not change substantially. Owing to the low cytotoxicity and good biocompatibility determined by methyl thiazolyl tetrazolium (MTT) assay and histology and hematology analysis, the POSS-modified upconversion nanoplatforms were successfully applied to upconversion luminescence imaging of living cells in vitro and a nude mouse in vivo (upon excitation at 980 nm). In addition, the doped Gd3+ ion endows the POSS-UCNPs with effective T1 signal enhancement, and they were successfully applied to in vivo magnetic resonance imaging (MRI) of a Kunming mouse, which makes them potential MRI positive-contrast agents. More importantly, the corner organic groups of POSS can be easily modified, yielding a variety of POSS-UCNPs with many potential applications. Therefore, the method and results may provide exciting opportunities for multimodal bioimaging and multifunctional applications.

  9. Analysis of numerical methods

    CERN Document Server

    Isaacson, Eugene

    1994-01-01

    This excellent text for advanced undergraduates and graduate students covers norms, numerical solution of linear systems and matrix factoring, iterative solutions of nonlinear equations, eigenvalues and eigenvectors, polynomial approximation, and other topics. It offers a careful analysis and stresses techniques for developing new methods, plus many examples and problems. 1966 edition.
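    One of the topics the text lists, iterative solution of nonlinear equations, can be illustrated with Newton's method. The sketch below is my own minimal example, not taken from the book:

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Newton's method for f(x) = 0: iterate x_{k+1} = x_k - f(x_k)/f'(x_k)
    until the step size falls below tol."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Example: the positive root of x^2 - 2 = 0, i.e. sqrt(2).
root = newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, x0=1.0)
print(root)  # ≈ 1.41421356...
```

    Quadratic convergence means only a handful of iterations are needed from a reasonable starting point; the analysis of when and how fast such iterations converge is exactly the kind of question the book addresses.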

  10. Visible-light-excited and europium-emissive nanoparticles for highly-luminescent bioimaging in vivo.

    Science.gov (United States)

    Wu, Yongquan; Shi, Mei; Zhao, Lingzhi; Feng, Wei; Li, Fuyou; Huang, Chunhui

    2014-07-01

    Europium(III)-based materials, with their characteristic millisecond photoluminescence lifetimes, are considered ideal time-gated luminescence probes for bioimaging, but their application to luminescent small-animal bioimaging in vivo remains limited. Here, a water-soluble, stable, highly luminescent nanosystem, Ir-Eu-MSN (MSN = mesoporous silica nanoparticles, Ir-Eu = [Ir(dfppy)2(pic-OH)]3Eu·2H2O, dfppy = 2-(2,4-difluorophenyl)pyridine, pic-OH = 3-hydroxy-2-carboxypyridine), was developed by an in situ coordination reaction that forms an insoluble dinuclear iridium(III)-complex-sensitized europium(III) emissive complex within mesoporous silica nanoparticles (MSNs) with high loading efficiency. Compared with the usual approach of physical adsorption, this in situ reaction strategy provided 20-fold the loading efficiency (43.2%) of the insoluble Ir-Eu complex in MSNs. In the solid state these nanoparticles showed bright red luminescence with a high quantum yield of 55.2%, and the excitation window extended up to 470 nm. The Ir-Eu-MSN nanoparticles were used for luminescence imaging in living cells under excitation at 458 nm with confocal microscopy, which was confirmed by flow cytometry. Furthermore, the Ir-Eu-MSN nanoparticles were successfully applied to high-contrast luminescent lymphatic imaging in vivo under a low excitation power density of 5 mW cm(-2). This synthetic method provides a universal strategy for combining hydrophobic complexes with hydrophilic MSNs for in vivo bioimaging.

  11. Nanostructures Derived from Starch and Chitosan for Fluorescence Bio-Imaging

    Directory of Open Access Journals (Sweden)

    Yinxue Zu

    2016-07-01

    Full Text Available Fluorescent nanostructures (NSs) derived from polysaccharides have drawn great attention as novel fluorescent probes for potential bio-imaging applications. Herein, we report a facile alkali-assisted hydrothermal method to fabricate polysaccharide NSs using starch and chitosan as raw materials. Transmission electron microscopy (TEM) demonstrated average particle sizes of 14 nm and 75 nm for the starch and chitosan NSs, respectively. Fourier transform infrared (FT-IR) spectroscopy analysis showed a large number of hydroxyl or amino groups on the surface of these polysaccharide-based NSs. Strong fluorescence with excitation-dependent emission behaviour was observed under ultraviolet excitation. Interestingly, the photostability of the NSs was found to be superior to that of fluorescein and rhodamine B. The quantum yield of the starch NSs reached 11.12% under excitation at 360 nm. Oxidative metal ions including Cu(II), Hg(II) and Fe(III) quenched the fluorescence of the prepared NSs. Both kinds of multicoloured NSs showed maximum fluorescence intensity at pH 7, while the intensity decreased dramatically in either an acidic or a basic environment (pH 3 or 11). A cytotoxicity study of the starch NSs showed low cytotoxicity, with 80% cell viability after 24 h of incubation at concentrations below 10 mg/mL. The study also demonstrated the possibility of using the multicoloured starch NSs for imaging mouse melanoma cells and guppy fish.

  12. Methods for RNA Analysis

    DEFF Research Database (Denmark)

    Olivarius, Signe

    While increasing evidence appoints diverse types of RNA as key players in the regulatory networks underlying cellular differentiation and metabolism, the potential functions of thousands of conserved RNA structures encoded in mammalian genomes remain to be determined. Since the functions of most RNAs rely on interactions with proteins, the establishment of protein-binding profiles is essential for the characterization of RNAs. Aiming to facilitate RNA analysis, this thesis introduces proteomics- as well as transcriptomics-based methods for the functional characterization of RNA. First, RNA-protein pulldown combined with mass spectrometry analysis is applied for in vivo as well as in vitro identification of RNA-binding proteins, the latter succeeding in verifying known RNA-protein interactions. Secondly, acknowledging the significance of flexible promoter usage for the diversification ...

  13. Color centers in diamond: versatile and powerful tools for bioimaging

    Science.gov (United States)

    Zhang, Huiliang; Glenn, David; Trifonov, Alexei; Pham, My; Le Sage, David; Kasthuri, Narayanan; Schalek, Richard; Lichtman, Jeff; Walsworth, Ronald; Walsworth Group Collaboration; Lichtman Lab Collaboration

    2011-05-01

    We present recent progress in the application of nitrogen-vacancy (NV) and other color centers in diamond to demanding bioimaging applications, including: (i) nanodiamond cathodoluminescence (CL) to provide molecular-function-correlated color to electron microscopy of the connections between neurons ("Connectomics"); (ii) super-resolution optical imaging of functionalized nanodiamonds in brain tissue using variants of STED, GSD or STORM techniques; and (iii) magnetic field sensing and imaging of neural activity using an NV-diamond magnetometer.

  14. Nanostructure materials for biosensing and bioimaging applications

    Science.gov (United States)

    Law, Wing Cheung

    ... not fully understood, three possible factors emerged from systematic studies: (i) an increase of the absolute mass in each binding event, (ii) an increase in the bulk refractive index of the analyte, and (iii) coupling between the localized surface plasmon resonance (LSPR) of metallic nanoparticles and the surface plasmon resonance (SPR) of the sensing film. Indeed, the role of plasmonic coupling in sensitivity enhancement is still an open question. To obtain a better understanding of this phenomenon, at the end of part I, extended studies were performed to investigate how the LSPR properties of metallic nanoparticle labels correlate with the enhancement factor. For this purpose, gold nanorods (Au-NRs) were chosen as the amplification labels because the LSPR peak of Au-NRs is easily tunable. After reading the "Results and Discussion" section, readers will have a better understanding of the "plasmonic coupling" between the sensing film and the metallic labels under a suitable operating laser source. In the second part of the thesis, on bioimaging, applications of nanostructured materials in live cancer cell imaging and small animal imaging are demonstrated. Different types of imaging techniques are available in laboratories and clinics: optical imaging, computed tomography (CT), magnetic resonance imaging (MRI), positron emission tomography (PET), thermography and ultrasound imaging. Although these imaging techniques have been well developed and used for over a decade, improving the sensitivity, enhancing the contrast, decreasing the acquisition time and reducing the toxicity of the contrast agent remain highly desirable. For optical imaging, near-infrared fluorescent materials can assist the surgeon in locating the tumor, nerves and lymph nodes more accurately. For CT scans, the use of Au-NRs as the contrast agent can improve the sensitivity. Iron oxide nanoparticle or gadolinium ion containing

  15. Carnegie Mellon University bioimaging day 2014: Challenges and opportunities in digital pathology

    Directory of Open Access Journals (Sweden)

    Gustavo K Rohde

    2014-01-01

    Full Text Available Recent advances in digital imaging are impacting the practice of pathology. One of the key enabling technologies leading the way towards this transformation is whole slide imaging (WSI), which allows glass slides to be converted into large image files that can be shared, stored, and analyzed rapidly. Many applications of this novel technology have evolved in the last decade, including education, research and clinical applications. This publication highlights a collection of abstracts, each corresponding to a talk given at Carnegie Mellon University's (CMU) Bioimaging Day 2014, co-sponsored by the Biomedical Engineering Department and the Lane Center for Computational Biology at CMU. Topics related specifically to digital pathology are presented in this collection of abstracts, including digital workflow implementation, imaging and artifacts, storage demands, and automated image analysis algorithms.

  16. Super-Resolution Optical Fluctuation Bio-Imaging with Dual-Color Carbon Nanodots.

    Science.gov (United States)

    Chizhik, Anna M; Stein, Simon; Dekaliuk, Mariia O; Battle, Christopher; Li, Weixing; Huss, Anja; Platen, Mitja; Schaap, Iwan A T; Gregor, Ingo; Demchenko, Alexander P; Schmidt, Christoph F; Enderlein, Jörg; Chizhik, Alexey I

    2016-01-13

    Success in super-resolution imaging relies on a proper choice of fluorescent probes. Here, we suggest novel, easily produced and biocompatible nanoparticles, carbon nanodots, for super-resolution optical fluctuation bioimaging (SOFI). The particles revealed intrinsic dual-color fluorescence, corresponding to two subpopulations of particles of different electric charge. The neutral nanoparticles localize to cellular nuclei, suggesting their potential use as an inexpensive, easily produced nucleus-specific label. A single-particle study revealed that the carbon nanodots possess a unique hybrid combination of fluorescence properties, exhibiting characteristics of both dye molecules and semiconductor nanocrystals. The results suggest that charge trapping and redistribution on the surface of the particles trigger their transitions between emissive and dark states. These findings open up new possibilities for the utilization of carbon nanodots in the various super-resolution microscopy methods based on stochastic optical switching.

  17. Methods in algorithmic analysis

    CERN Document Server

    Dobrushkin, Vladimir A

    2009-01-01

    …helpful to any mathematics student who wishes to acquire a background in classical probability and analysis … This is a remarkably beautiful book that would be a pleasure for a student to read, or for a teacher to make into a year's course.-Harvey Cohn, Computing Reviews, May 2010

  18. Aqueous synthesis of high bright and tunable near-infrared AgInSe2-ZnSe quantum dots for bioimaging.

    Science.gov (United States)

    Che, Dongchen; Zhu, Xiaoxu; Wang, Hongzhi; Duan, Yourong; Zhang, Qinghong; Li, Yaogang

    2016-02-01

    Efficient synthetic methods for near-infrared quantum dots with good biophysical properties as bioimaging agents are urgently required. In this work, a simple and fast synthesis of highly luminescent near-infrared AgInSe2-ZnSe quantum dots (QDs) with tunable emission in aqueous media is reported. The method avoids high temperatures and pressures and organic solvents, directly generating water-dispersible AgInSe2-ZnSe QDs. The photoluminescence emission peak of the AgInSe2-ZnSe QDs ranged from 625 to 940 nm, with quantum yields up to 31%. With their high quantum yield, near-infrared emission and low cytotoxicity, the AgInSe2-ZnSe QDs could serve as good cell labels, showing great potential for applications in bio-imaging.

  19. Probabilistic methods for rotordynamics analysis

    Science.gov (United States)

    Wu, Y.-T.; Torng, T. Y.; Millwater, H. R.; Fossum, A. F.; Rheinfurth, M. H.

    1991-01-01

    This paper summarizes the development of the methods and a computer program to compute the probability of instability of dynamic systems that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the eigenvalues or Routh-Hurwitz test functions are investigated. Computational methods based on a fast probability integration concept and an efficient adaptive importance sampling method are proposed to perform efficient probabilistic analysis. A numerical example is provided to demonstrate the methods.
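    The deterministic core of the eigenvalue-based instability criterion mentioned above can be sketched as follows. This is my own illustration of the standard check for a system of second-order linear ODEs; the paper additionally wraps such a check in fast probability integration over uncertain parameters:

```python
import numpy as np

def is_stable(M, C, K):
    """Check asymptotic stability of M x'' + C x' + K x = 0 by the
    eigenvalue criterion: rewrite as a first-order system z' = A z with
    the companion matrix A; all eigenvalues of A must have negative
    real parts."""
    n = M.shape[0]
    Minv = np.linalg.inv(M)
    A = np.block([[np.zeros((n, n)), np.eye(n)],
                  [-Minv @ K, -Minv @ C]])
    return bool(np.all(np.linalg.eigvals(A).real < 0))

# A damped oscillator is stable; negative damping makes it unstable.
M, K = np.eye(1), np.eye(1)
print(is_stable(M, 0.1 * np.eye(1), K))   # True
print(is_stable(M, -0.1 * np.eye(1), K))  # False
```

    In a probabilistic analysis, M, C and K become functions of random variables, and the probability that this check fails is what the proposed integration and importance-sampling methods estimate efficiently.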

  20. Engineering nanosilver as an antibacterial, biosensor and bioimaging material.

    Science.gov (United States)

    Sotiriou, Georgios A; Pratsinis, Sotiris E

    2011-10-01

    The capacity of nanosilver (Ag nanoparticles) to destroy infectious micro-organisms makes it one of the most powerful antimicrobial agents, an attractive feature against "super-bugs" resistant to antibiotics. Furthermore, its plasmonic properties facilitate its employment as a biosensor or bioimaging agent. Here, the interaction of nanosilver with biological systems including bacteria and mammalian cells is reviewed. The toxicity of nanosilver is discussed focusing on Ag(+) ion release in liquid solutions. Biomedical applications of nanosilver are also presented capitalizing on its antimicrobial and plasmonic properties and summarizing its advantages, limitations and challenges. Though a lot needs to be learned about the toxicity of nanosilver, enough is known to safely use it in a spectrum of applications with minimal impact to the environment and human health.

  1. An Organelle Correlation-Guided Feature Selection Approach for Classifying Multi-Label Subcellular Bio-images.

    Science.gov (United States)

    Shao, Wei; Liu, Mingxia; Xu, Ying-Ying; Shen, Hong-Bin; Zhang, Daoqiang

    2017-03-03

    Nowadays, with advances in microscopic imaging, accurate classification of bioimage-based protein subcellular location patterns has attracted increasing attention. One of the basic challenges is how to select useful feature components among the thousands of candidate features describing the images. This is not an easy task, especially considering the high proportion of multi-location proteins. Existing feature selection methods seldom take the correlation among different cellular compartments into consideration, and thus may miss features that are co-important for several subcellular locations. To deal with this problem, we make use of the important structural correlation among different cellular compartments and propose an organelle-structural-correlation-regularized feature selection method, CSF (Common-Sets of Features). We formulate the multi-label classification problem by adopting a group-sparsity regularizer to select common subsets of relevant features across different cellular compartments. In addition, we add a cell-structural-correlation-regularized Laplacian term, which utilizes prior biological structural information to capture the intrinsic dependency among different cellular compartments. CSF provides a new feature selection strategy for multi-label bio-image subcellular pattern classification, and experimental results show its superiority over several existing algorithms.

  2. A way toward analyzing high-content bioimage data by means of semantic annotation and visual data mining

    Science.gov (United States)

    Herold, Julia; Abouna, Sylvie; Zhou, Luxian; Pelengaris, Stella; Epstein, David B. A.; Khan, Michael; Nattkemper, Tim W.

    2009-02-01

    In recent years, bioimaging has turned from qualitative measurements towards a high-throughput and high-content modality, providing multiple variables for each biological sample analyzed. We present a system that combines machine-learning-based semantic image annotation and visual data mining to analyze such multivariate bioimage data. Machine learning is employed for automatic semantic annotation of regions of interest. The annotation is the prerequisite for a biological, object-oriented exploration of the feature space derived from the image variables. With the aid of visual data mining, the obtained data can be explored simultaneously in the image as well as in the feature domain. Especially when little is known about the underlying data, for example when exploring the effects of a drug treatment, visual data mining can greatly aid the process of data evaluation. We demonstrate how our system is used for image evaluation to obtain information relevant to diabetes research and the screening of new anti-diabetes treatments. Cells of the islets of Langerhans and the whole pancreas in pancreas tissue samples are annotated, and object-specific molecular features are extracted from aligned multichannel fluorescence images. These are interactively evaluated for cell type classification in order to determine cell number and mass. Only a few parameters need to be specified, which makes the system usable by non-experts and allows for high-throughput analysis.

  3. Bayesian Methods for Statistical Analysis

    OpenAIRE

    Puza, Borek

    2015-01-01

    Bayesian methods for statistical analysis is a book on statistical methods for analysing a wide variety of data. The book consists of 12 chapters, starting with basic concepts and covering numerous topics, including Bayesian estimation, decision theory, prediction, hypothesis testing, hierarchical models, Markov chain Monte Carlo methods, finite population inference, biased sampling and nonignorable nonresponse. The book contains many exercises, all with worked solutions, including complete c...

  4. Analysis of Precision of Activation Analysis Method

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Nørgaard, K.

    1973-01-01

    The precision of an activation-analysis method prescribes the estimation of the precision of a single analytical result. The adequacy of these estimates to account for the observed variation between duplicate results from the analysis of different samples and materials is tested by the statistic T, which is shown to be approximated by a χ2 distribution. Application of this test to the results of determinations of manganese in human serum by a method of established precision led to the detection of airborne pollution of the serum during the sampling process. The subsequent improvement in sampling conditions was shown to give not only increased precision, but also improved accuracy of the results.
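    A duplicate-consistency statistic of the kind described can be sketched as follows. This is a hedged reconstruction from the abstract, not the paper's exact definition: for each duplicate pair, the difference is scaled by its a priori variance, and the sum is compared with a χ2 distribution:

```python
def duplicate_T(pairs, sigmas):
    """For duplicate analyses (x1, x2) with stated standard deviations
    (s1, s2), the difference d = x1 - x2 has variance s1^2 + s2^2, so
        T = sum(d_i^2 / (s1_i^2 + s2_i^2))
    is approximately chi-square with n degrees of freedom when the
    stated precision accounts for the observed variation."""
    return sum((x1 - x2) ** 2 / (s1 ** 2 + s2 ** 2)
               for (x1, x2), (s1, s2) in zip(pairs, sigmas))

# Hypothetical duplicate determinations with their a priori SDs.
pairs = [(10.2, 10.5), (8.1, 7.9), (12.0, 12.6)]
sigmas = [(0.2, 0.2), (0.2, 0.2), (0.3, 0.3)]
T = duplicate_T(pairs, sigmas)   # compare with chi-square, 3 d.o.f.
```

    A T far above the upper χ2 quantile signals extra variation not covered by the stated precision, which is how the serum-contamination effect in the abstract would surface.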

  5. Absence of Fungal Spore Internalization by Bronchial Epithelium in Mouse Models Evidenced by a New Bioimaging Approach and Transmission Electronic Microscopy.

    Science.gov (United States)

    Rammaert, Blandine; Jouvion, Grégory; de Chaumont, Fabrice; Garcia-Hermoso, Dea; Szczepaniak, Claire; Renaudat, Charlotte; Olivo-Marin, Jean-Christophe; Chrétien, Fabrice; Dromer, Françoise; Bretagne, Stéphane

    2015-09-01

    Clinical data and experimental studies suggest that bronchial epithelium could serve as a portal of entry for invasive fungal infections. We therefore analyzed the interactions between molds and the bronchial/bronchiolar epithelium at the early steps after inhalation. We developed invasive aspergillosis (Aspergillus fumigatus) and mucormycosis (Lichtheimia corymbifera) murine models that mimic the main clinical risk factors for these infections. Histopathology studies were completed with a specific computer-assisted morphometric method to quantify bronchial and alveolar spores and with transmission electron microscopy. Morphometric analysis revealed a higher number of bronchial/bronchiolar spores for A. fumigatus than L. corymbifera. The bronchial/bronchiolar spores decreased between 1 and 18 hours after inoculation for both fungi, except in corticosteroid-treated mice infected with A. fumigatus, suggesting an effect of cortisone on bronchial spore clearance. No increase in the number of spores of any species was observed over time at the basal pole of the epithelium, suggesting the lack of transepithelial crossing. Transmission electron microscopy did not show spore internalization by bronchial epithelial cells. Instead, spores were phagocytized by mononuclear cells on the apical pole of epithelial cells. Early epithelial internalization of fungal spores in vivo cannot explain the bronchial/bronchiolar epithelium invasion observed in some invasive mold infections. The bioimaging approach provides a useful means to accurately enumerate and localize the fungal spores in the pulmonary tissues.

  6. G-quadruplex enhanced fluorescence of DNA-silver nanoclusters and their application in bioimaging

    Science.gov (United States)

    Zhu, Jinbo; Zhang, Libing; Teng, Ye; Lou, Baohua; Jia, Xiaofang; Gu, Xiaoxiao; Wang, Erkang

    2015-07-01

    Guanine-proximity-based fluorescence enhancement of DNA-templated silver nanoclusters (AgNCs) has been reported and applied for bioanalysis. Herein, we studied the G-quadruplex-enhanced fluorescence of DNA-AgNCs and reached several significant conclusions that will be helpful for the design of future probes. Our results demonstrate that a G-quadruplex can effectively stimulate the fluorescence potential of AgNCs. The major contribution of the G-quadruplex is to provide guanine bases; its special structure has no measurable impact. The DNA-templated AgNCs were further analysed by native polyacrylamide gel electrophoresis, and the guanine proximity enhancement mechanism could be visually verified by this method. Moreover, the fluorescence emission of C3A (CCCA)4-stabilized AgNCs was found to be easily and effectively enhanced by G-quadruplexes such as T30695, AS1411 and TBA, especially AS1411. Benefiting from the high brightness of the AS1411-enhanced DNA-AgNCs and the specific binding affinity of AS1411 for nucleolin, the AS1411-enhanced AgNCs can stain cancer cells for bioimaging.

  7. SWOT ANALYSIS ON SAMPLING METHOD

    Directory of Open Access Journals (Sweden)

    CHIS ANCA OANA

    2014-07-01

    Full Text Available Audit sampling involves the application of audit procedures to less than 100% of the items within an account balance or class of transactions. Our article studies audit sampling in the audit of financial statements. As an audit technique in wide use, in both its statistical and nonstatistical forms, the method is very important for auditors. It should be applied correctly to give a fair view of financial statements and to satisfy the needs of all financial users. To be applied correctly, the method must be understood by all its users, and mainly by auditors; otherwise, incorrect application risks loss of reputation and discredit, litigation and even imprisonment. Since there is no unitary practice and methodology for applying the technique, the risk of applying it incorrectly is rather high. SWOT analysis is a technique that identifies strengths, weaknesses, opportunities and threats. We applied SWOT analysis to the sampling method from the perspective of three players: the audit company, the audited entity and the users of financial statements. The study shows that by applying the sampling method the audit company and the audited entity both save time, effort and money. The disadvantages of the method are the difficulty of applying and understanding it. Being widely used as an audit method and a factor in a correct audit opinion, the sampling method's strengths, weaknesses, opportunities and threats must be understood by auditors.

  8. Chemically doped fluorescent carbon and graphene quantum dots for bioimaging, sensor, catalytic and photoelectronic applications

    Science.gov (United States)

    Du, Yan; Guo, Shaojun

    2016-01-01

    Doping fluorescent carbon dots (DFCDs) with heteroatoms has recently become of great interest compared with traditional fluorescent materials, because it provides a feasible new way to tune the intrinsic properties of carbon quantum dots (CQDs) and graphene quantum dots (GQDs) and thereby open new applications in different fields. Since the first report on nitrogen (N)-doped GQDs in 2012, increasing effort has focused on exploring procedures for making new types of DFCDs with different heteroatoms. This mini review summarizes recent research progress on DFCDs. It first reviews the doping categories achieved to date, looking back on the synthesis methods and comparing the differences in synthesis approaches between DFCDs and their undoped counterparts. It then focuses on advances in how doping affects the optical properties, especially for N-doped DFCDs, which have been investigated the most. Finally, applications of DFCDs in bio-imaging, sensing, catalysis and photoelectronic devices are discussed. This review gives new insights into how different synthetic methods can tune the structure of DFCDs, the correlation between doping and properties, and new applications.

  9. Organising multi-dimensional biological image information: the BioImage Database.

    Science.gov (United States)

    Carazo, J M; Stelzer, E H; Engel, A; Fita, I; Henn, C; Machtynger, J; McNeil, P; Shotton, D M; Chagoyen, M; de Alarcón, P A; Fritsch, R; Heymann, J B; Kalko, S; Pittet, J J; Rodriguez-Tomé, P; Boudier, T

    1999-01-01

    Nowadays it is possible to unravel complex information at all levels of cellular organization by obtaining multi-dimensional image information. At the macromolecular level, three-dimensional (3D) electron microscopy, together with other techniques, is able to reach resolutions at the nanometer or subnanometer level. The information is delivered in the form of 3D volumes containing samples of a given function, for example, the electron density distribution within a given macromolecule. The same situation happens at the cellular level with the new forms of light microscopy, particularly confocal microscopy, all of which produce biological 3D volume information. Furthermore, it is possible to record sequences of images over time (videos), as well as sequences of volumes, bringing key information on the dynamics of living biological systems. It is in this context that work on BioImage started two years ago, and that its first version is now presented here. In essence, BioImage is a database specifically designed to contain multi-dimensional images, perform queries and interactively work with the resulting multi-dimensional information on the World Wide Web, as well as accomplish the required cross-database links. Two sister home pages of BioImage can be accessed at http://www.bioimage.org and http://www-embl.bioimage.org

  10. Statistical methods for bioimpedance analysis

    Directory of Open Access Journals (Sweden)

    Christian Tronstad

    2014-04-01

    Full Text Available This paper gives a basic overview of relevant statistical methods for the analysis of bioimpedance measurements, with an aim to answer questions such as: How do I begin with planning an experiment? How many measurements do I need to take? How do I deal with large amounts of frequency sweep data? Which statistical test should I use, and how do I validate my results? Beginning with the hypothesis and the research design, the methodological framework for making inferences based on measurements and statistical analysis is explained. This is followed by a brief discussion on correlated measurements and data reduction before an overview is given of statistical methods for comparison of groups, factor analysis, association, regression and prediction, explained in the context of bioimpedance research. The last chapter is dedicated to the validation of a new method by different measures of performance. A flowchart is presented for selection of statistical method, and a table is given for an overview of the most important terms of performance when evaluating new measurement technology.
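
    As a hedged illustration of the workflow this abstract describes (data reduction of frequency-sweep data, followed by a group comparison), the sketch below applies PCA via SVD and an independent-samples t-test to synthetic impedance spectra. All data, group sizes and parameters are invented for demonstration and do not come from the paper:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)

    # Synthetic frequency-sweep data: 20 subjects x 50 frequencies,
    # two groups whose mean spectra differ slightly (illustrative only).
    freqs = np.logspace(2, 6, 50)          # 100 Hz .. 1 MHz
    base = 500 / (1 + (freqs / 1e4))       # toy dispersion curve (ohms)
    group_a = base + rng.normal(0, 5, (10, 50))
    group_b = base * 1.05 + rng.normal(0, 5, (10, 50))
    X = np.vstack([group_a, group_b])

    # Data reduction: PCA via SVD, keeping the first principal component
    # so each 50-point spectrum becomes a single score.
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[0]

    # Group comparison on the reduced data.
    t, p = stats.ttest_ind(scores[:10], scores[10:])
    print(f"t = {t:.2f}, p = {p:.4f}")
    ```

    Reducing the sweep to one or a few principal-component scores before testing avoids running 50 correlated univariate tests, which is one of the multiple-comparison pitfalls the paper's flowchart is meant to guard against.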

  11. Nonlinear programming analysis and methods

    CERN Document Server

    Avriel, Mordecai

    2003-01-01

    Comprehensive and complete, this overview provides a single-volume treatment of key algorithms and theories. The author provides clear explanations of all theoretical aspects, with rigorous proof of most results. The two-part treatment begins with the derivation of optimality conditions and discussions of convex programming, duality, generalized convexity, and analysis of selected nonlinear programs. The second part concerns techniques for numerical solutions and unconstrained optimization methods, and it presents commonly used algorithms for constrained nonlinear optimization problems. This g

  12. Analysis methods for airborne radioactivity

    OpenAIRE

    Ala-Heikkilä, Jarmo J

    2008-01-01

    High-resolution gamma-ray spectrometry is an analysis method well suitable for monitoring airborne radioactivity. Many of the natural radionuclides and a majority of anthropogenic nuclides are prominent gamma-ray emitters. With gamma-ray spectrometry different radionuclides are readily observed at minute concentrations that are far from health hazards. The gamma-ray spectrometric analyses applied in air monitoring programmes can be divided into particulate measurements and gas measurements. I...

  13. Bioimaging techniques for subcellular localization of plant hemoglobins and measurement of hemoglobin-dependent nitric oxide scavenging in planta.

    Science.gov (United States)

    Hebelstrup, Kim H; Østergaard-Jensen, Erik; Hill, Robert D

    2008-01-01

    Plant hemoglobins are ubiquitous in all plant families. They are expressed at low levels in specific tissues. Several studies have established that plant hemoglobins are scavengers of nitric oxide (NO) and that varying the endogenous level of hemoglobin in plant cells negatively modulates bioactivity of NO generated under hypoxic conditions or during cellular signaling. Earlier methods for determination of hemoglobin-dependent scavenging in planta were based on measuring activity in whole plants or organs. Plant hemoglobins do not contain specific organelle localization signals; however, earlier reports on plant hemoglobin have demonstrated either cytosolic or nuclear localization, depending on the method or cell type investigated. We have developed two bioimaging techniques: one for visualization of hemoglobin-catalyzed scavenging of NO in specific cells and another for visualization of subcellular localization of green fluorescent protein-tagged plant hemoglobins in transformed Arabidopsis thaliana plants.

  14. Laser-synthesized oxide-passivated bright Si quantum dots for bioimaging

    Science.gov (United States)

    Gongalsky, M. B.; Osminkina, L. A.; Pereira, A.; Manankov, A. A.; Fedorenko, A. A.; Vasiliev, A. N.; Solovyev, V. V.; Kudryavtsev, A. A.; Sentis, M.; Kabashin, A. V.; Timoshenko, V. Yu.

    2016-04-01

    Crystalline silicon (Si) nanoparticles present an extremely promising object for bioimaging based on photoluminescence (PL) in the visible and near-infrared spectral regions, but their efficient PL emission in aqueous suspension is typically observed after wet chemistry procedures leading to residual toxicity issues. Here, we introduce ultrapure laser-synthesized Si-based quantum dots (QDs), which are water-dispersible and exhibit bright exciton PL in the window of relative tissue transparency near 800 nm. Based on the laser ablation of crystalline Si targets in gaseous helium, followed by ultrasound-assisted dispersion of the deposited films in physiological saline, the proposed method avoids any toxic by-products during the synthesis. We demonstrate efficient contrast of the Si QDs in living cells by following the exciton PL. We also show that the prepared QDs do not provoke any cytotoxicity effects while penetrating into the cells and efficiently accumulating near the cell membrane and in the cytoplasm. Combined with the possibility of enabling parallel therapeutic channels, ultrapure laser-synthesized Si nanostructures present a unique object for cancer theranostic applications.

  15. Oleyl-hyaluronan micelles loaded with upconverting nanoparticles for bio-imaging

    Energy Technology Data Exchange (ETDEWEB)

    Pospisilova, Martina, E-mail: martina.pospisilova@contipro.com; Mrazek, Jiri; Matuska, Vit; Kettou, Sofiane; Dusikova, Monika; Svozil, Vit; Nesporova, Kristina; Huerta-Angeles, Gloria; Vagnerova, Hana; Velebny, Vladimir [Contipro Biotech (Czech Republic)

    2015-09-15

    Hyaluronan (HA) represents an interesting polymer for nanoparticle coating due to its biocompatibility and enhanced cell interaction via CD44 receptor. Here, we describe incorporation of oleate-capped β–NaYF{sub 4}:Yb{sup 3+}, Er{sup 3+} nanoparticles (UCNP-OA) into amphiphilic HA by microemulsion method. Resulting structures have a spherical, micelle-like appearance with a hydrodynamic diameter of 180 nm. UCNP-OA-loaded HA micelles show a good stability in PBS buffer and cell culture media. The intensity of green emission of UCNP-OA-loaded HA micelles in water is about five times higher than that of ligand-free UCNP, indicating that amphiphilic HA effectively protects UCNP luminescence from quenching by water molecules. We found that UCNP-OA-loaded HA micelles in concentrations up to 50 μg mL{sup −1} increase cell viability of normal human dermal fibroblasts (NHDF), while viability of human breast adenocarcinoma cells MDA–MB–231 is reduced at these concentrations. The utility of UCNP-OA-loaded HA micelles as a bio-imaging probe was demonstrated in vitro by successful labelling of NHDF and MDA–MB–231 cells overexpressing the CD44 receptor.

  16. Multifunctional nanocomposite based on halloysite nanotubes for efficient luminescent bioimaging and magnetic resonance imaging

    Science.gov (United States)

    Zhou, Tao; Jia, Lei; Luo, Yi-Feng; Xu, Jun; Chen, Ru-Hua; Ge, Zhi-Jun; Ma, Tie-Liang; Chen, Hong; Zhu, Tao-Feng

    2016-01-01

    A novel multifunctional halloysite nanotube (HNT)-based Fe3O4@HNT-polyethyleneimine-Tip-Eu(dibenzoylmethane)3 nanocomposite (Fe-HNT-Eu NC) with both photoluminescent and magnetic properties was fabricated by a simple one-step hydrothermal process combined with the coupling grafting method, which exhibited high suspension stability and excellent photophysical behavior. The as-prepared multifunctional Fe-HNT-Eu NC was characterized using various techniques. The results of cell viability assay, cell morphological observation, and in vivo toxicity assay indicated that the NC exhibited excellent biocompatibility over the studied concentration range, suggesting that the obtained Fe-HNT-Eu NC was a suitable material for bioimaging and biological applications in human hepatic adenocarcinoma cells. Furthermore, the biocompatible Fe-HNT-Eu NC displayed superparamagnetic behavior with high saturation magnetization and also functioned as a magnetic resonance imaging (MRI) contrast agent in vitro and in vivo. The results of the MRI tests indicated that the Fe-HNT-Eu NC can significantly decrease the T2 signal intensity values of normal liver tissue, making the boundary between the normal liver and the transplanted cancer more distinct and thus effectively improving the diagnostic accuracy for cancers. PMID:27698562
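
    The reported decrease in T2 signal intensity follows from the spin-echo relationship S = S0·exp(−TE/T2): a negative (T2-shortening) contrast agent such as an iron-oxide composite darkens the tissue that takes it up. A toy calculation with assumed TE and T2 values (none of these numbers come from the paper):

    ```python
    import math

    def t2_signal(s0, te_ms, t2_ms):
        """T2-weighted spin-echo signal for proton density s0 and echo time TE."""
        return s0 * math.exp(-te_ms / t2_ms)

    # Hypothetical values: liver T2 ~ 50 ms without agent; an iron-oxide
    # contrast agent shortens T2, lowering the signal at the same TE.
    without_agent = t2_signal(1.0, te_ms=80, t2_ms=50)
    with_agent = t2_signal(1.0, te_ms=80, t2_ms=20)
    print(f"{without_agent:.2f} vs {with_agent:.2f}")  # tissue with agent is darker
    ```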

  17. Bifunctional Luminomagnetic Rare-Earth Nanorods for High-Contrast Bioimaging Nanoprobes

    Science.gov (United States)

    Gupta, Bipin Kumar; Singh, Satbir; Kumar, Pawan; Lee, Yean; Kedawat, Garima; Narayanan, Tharangattu N.; Vithayathil, Sajna Antony; Ge, Liehui; Zhan, Xiaobo; Gupta, Sarika; Martí, Angel A.; Vajtai, Robert; Ajayan, Pulickel M.; Kaipparettu, Benny Abraham

    2016-09-01

    Nanoparticles exhibiting both magnetic and luminescent properties are in high demand for many biological applications. A single compound exhibiting this combination of properties is uncommon. Herein, we report a strategy to synthesize bifunctional luminomagnetic Gd2-xEuxO3 (x = 0.05 to 0.5) nanorods, with a diameter of ~20 nm and a length of ~0.6 μm, using a hydrothermal method. The Gd2O3:Eu3+ nanorods have been characterized by studying their structural, optical and magnetic properties. The advantage offered by photoluminescent imaging with Gd2O3:Eu3+ nanorods is that this ultrafine nanorod material exhibits hypersensitive, intense red emission (610 nm) with good brightness (quantum yield of more than 90%), an essential parameter for high-contrast bioimaging, especially for overcoming autofluorescence background. The utility of the luminomagnetic nanorods for high-contrast cell imaging and their cytotoxicity were evaluated in two human breast cancer cell lines, T47D and MDA-MB-231. Additionally, to understand the significance of the shape of the nanostructure, the photoluminescence and paramagnetic characteristics of the Gd2O3:Eu3+ nanorods were compared with those of spherical Gd2O3:Eu3+ nanoparticles.
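
    Quantum yields like the >90% figure quoted here are commonly estimated by the comparative (relative) method against a reference standard: QY = QY_ref · (Grad/Grad_ref) · (n/n_ref)². A minimal sketch of that standard formula with hypothetical numbers (none of the values below are from the paper):

    ```python
    def relative_quantum_yield(grad_sample, grad_ref, n_sample, n_ref, qy_ref):
        """Comparative quantum-yield estimate.

        grad_* : slope of integrated emission vs. absorbance (dilute solutions)
        n_*    : refractive indices of the respective solvents
        qy_ref : known quantum yield of the reference standard
        """
        return qy_ref * (grad_sample / grad_ref) * (n_sample / n_ref) ** 2

    # Hypothetical numbers: sample slope 1.8x that of the reference, both in
    # water, against a standard with QY = 0.54 (e.g. quinine sulfate).
    qy = relative_quantum_yield(1.8, 1.0, 1.33, 1.33, 0.54)
    print(f"estimated QY = {qy:.2f}")  # 0.97
    ```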

  18. Laser-synthesized oxide-passivated bright Si quantum dots for bioimaging.

    Science.gov (United States)

    Gongalsky, M B; Osminkina, L A; Pereira, A; Manankov, A A; Fedorenko, A A; Vasiliev, A N; Solovyev, V V; Kudryavtsev, A A; Sentis, M; Kabashin, A V; Timoshenko, V Yu

    2016-04-22

    Crystalline silicon (Si) nanoparticles present an extremely promising object for bioimaging based on photoluminescence (PL) in the visible and near-infrared spectral regions, but their efficient PL emission in aqueous suspension is typically observed after wet chemistry procedures leading to residual toxicity issues. Here, we introduce ultrapure laser-synthesized Si-based quantum dots (QDs), which are water-dispersible and exhibit bright exciton PL in the window of relative tissue transparency near 800 nm. Based on the laser ablation of crystalline Si targets in gaseous helium, followed by ultrasound-assisted dispersion of the deposited films in physiological saline, the proposed method avoids any toxic by-products during the synthesis. We demonstrate efficient contrast of the Si QDs in living cells by following the exciton PL. We also show that the prepared QDs do not provoke any cytotoxicity effects while penetrating into the cells and efficiently accumulating near the cell membrane and in the cytoplasm. Combined with the possibility of enabling parallel therapeutic channels, ultrapure laser-synthesized Si nanostructures present a unique object for cancer theranostic applications.

  19. Gait analysis methods in rehabilitation

    Directory of Open Access Journals (Sweden)

    Baker Richard

    2006-03-01

    Full Text Available Abstract Introduction Brand's four reasons for clinical tests and his analysis of the characteristics of valid biomechanical tests for use in orthopaedics are taken as a basis for determining what methodologies are required for gait analysis in a clinical rehabilitation context. Measurement methods in clinical gait analysis The state of the art of optical systems capable of measuring the positions of retro-reflective markers placed on the skin is sufficiently advanced that they are probably no longer a significant source of error in clinical gait analysis. Determining the anthropometry of the subject and compensating for soft tissue movement in relation to the underlying bones are now the principal problems. Techniques for using functional tests to determine joint centres and axes of rotation are starting to be used successfully. Probably the last great challenge for optical systems is in using computational techniques to compensate for soft tissue movement. In the long term it is possible that direct imaging of bones and joints in three dimensions (using MRI or fluoroscopy) may replace marker-based systems. Methods for interpreting gait analysis data There is still not an accepted general theory of why we walk the way we do. In the absence of this, many explanations of walking address the mechanisms by which specific movements are achieved by particular muscles. A whole new methodology is developing to determine the functions of individual muscles. This needs further development and validation. A particular requirement is for subject-specific models incorporating 3-dimensional imaging data of the musculo-skeletal anatomy with kinematic and kinetic data. Methods for understanding the effects of intervention Clinical gait analysis is extremely limited if it does not allow clinicians to choose between alternative possible interventions or to predict outcomes. This can be achieved either by rigorously planned clinical trials or using

  20. Advanced bioimaging technologies in assessment of the quality of bone and scaffold materials. Techniques and applications

    Energy Technology Data Exchange (ETDEWEB)

    Qin Ling; Leung, Kwok Sui (eds.) [Chinese Univ. of Hong Kong (China). Dept. of Orthopaedics and Traumatology; Genant, H.K. [California Univ., San Francisco, CA (United States); Griffith, J.F. [Chinese Univ. of Hong Kong (China). Dept. of Radiology and Organ Imaging

    2007-07-01

    This book provides a perspective on the current status of bioimaging technologies developed to assess the quality of musculoskeletal tissue with an emphasis on bone and cartilage. It offers evaluations of scaffold biomaterials developed for enhancing the repair of musculoskeletal tissues. These bioimaging techniques include micro-CT, nano-CT, pQCT/QCT, MRI, and ultrasound, which provide not only 2-D and 3-D images of the related organs or tissues, but also quantifications of the relevant parameters. The advance bioimaging technologies developed for the above applications are also extended by incorporating imaging contrast-enhancement materials. Thus, this book will provide a unique platform for multidisciplinary collaborations in education and joint R and D among various professions, including biomedical engineering, biomaterials, and basic and clinical medicine. (orig.)

  1. Metal and Complementary Molecular Bioimaging in Alzheimer’s Disease

    Directory of Open Access Journals (Sweden)

    Nady Braidy

    2014-07-01

    Full Text Available Alzheimer’s disease (AD) is the leading cause of dementia in the elderly. AD represents a complex neurological disorder which is best understood as the consequence of a number of interconnected genetic and lifestyle variables, which culminate in multiple changes to brain structure and function. At a molecular level, metal dyshomeostasis is frequently observed in AD due to anomalous binding of metals such as iron (Fe), copper (Cu) and zinc (Zn), or impaired regulation of redox-active metals which can induce the formation of cytotoxic reactive oxygen species and neuronal damage. Neuroimaging of metals in a variety of intact brain cells and tissues is emerging as an important tool for increasing our understanding of the role of metal dysregulation in AD. Several imaging techniques have been used to study the cerebral metallo-architecture in biological specimens to obtain spatially resolved data on the chemical elements present in a sample. Hyperspectral techniques, such as particle-induced X-ray emission (PIXE), energy dispersive X-ray spectroscopy (EDS), X-ray fluorescence microscopy (XFM), synchrotron X-ray fluorescence (SXRF), secondary ion mass spectrometry (SIMS), and laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS), can reveal relative intensities and even semi-quantitative concentrations of a large set of elements with differing spatial resolution and detection sensitivities. Other mass spectrometric and spectroscopic imaging techniques such as laser ablation electrospray ionisation mass spectrometry (LA-ESI-MS), MALDI imaging mass spectrometry (MALDI-IMS), and Fourier transform infrared spectroscopy (FTIR) can be used to correlate changes in elemental distribution with the underlying pathology in AD brain specimens. The current review aims to discuss the advantages and challenges of using these emerging elemental and molecular imaging techniques, and highlight clinical achievements in AD research using bioimaging techniques.

  2. Ultrastable green fluorescence carbon dots with a high quantum yield for bioimaging and use as theranostic carriers

    DEFF Research Database (Denmark)

    Yang, Chuanxu; Thomsen, Rasmus Peter; Ogaki, Ryosuke

    2015-01-01

    Carbon dots (Cdots) have recently emerged as a novel platform of fluorescent nanomaterials. These carbon nanoparticles have great potential in biomedical applications such as bioimaging as they exhibit excellent photoluminescence properties, chemical inertness and low cytotoxicity in comparison to widely used semiconductor quantum dots. However, it remains a great challenge to prepare highly stable, water-soluble green luminescent Cdots with a high quantum yield. Herein we report a new synthesis route for green luminescent Cdots imbuing these desirable properties and demonstrate their potential in biomedical applications. Oligoethylenimine (OEI)–β-cyclodextrin (βCD) Cdots were synthesised using a simple and fast heating method in phosphoric acid. The synthesised Cdots showed strong green fluorescence under UV excitation with a 30% quantum yield and exhibited superior stability over a wide pH range. We…

  3. COMPETITIVE INTELLIGENCE ANALYSIS - SCENARIOS METHOD

    Directory of Open Access Journals (Sweden)

    Ivan Valeriu

    2014-07-01

    Full Text Available Keeping a company among the top performing players in the relevant market depends not only on its ability to develop continually, sustainably and in a balanced manner, to the standards set by the customers and the competition, but also on its ability to protect its own strategic information and to know in advance the strategic information of the competition. In addition, given that economic markets, regardless of their profile, enable interconnection not only among domestic companies, but also between domestic and foreign companies, the issue of economic competition moves from national economies to the field of interest of regional and international economic organizations. The stake for each economic player is to keep ahead of the competition and to be always prepared to face market challenges. Therefore, it needs to know as early as possible how to react to others’ strategies in terms of research, production and sales. If a competitor is planning to produce more and cheaper, then it must be prepared to counteract this move quickly. Competitive intelligence helps to evaluate the capabilities of competitors in the market, legally and ethically, and to develop response strategies. One of the main goals of competitive intelligence is to acknowledge the role of early warning and prevention of surprises that could have a major impact on the market share, reputation, turnover and profitability of a company in the medium and long term. This paper presents some aspects of competitive intelligence, mainly in terms of information analysis and intelligence generation. The presentation is theoretical and addresses a structured method of information analysis - the scenarios method - in a version that combines several types of analysis in order to reveal the interconnections among the factors governing the activity of a company.

  4. Ultra-bright emission from hexagonal boron nitride defects as a new platform for bio-imaging and bio-labelling

    Science.gov (United States)

    Elbadawi, Christopher; Tran, Trong Toan; Shimoni, Olga; Totonjian, Daniel; Lobo, Charlene J.; Grosso, Gabriele; Moon, Hyowan; Englund, Dirk R.; Ford, Michael J.; Aharonovich, Igor; Toth, Milos

    2016-12-01

    Bio-imaging requires robust, ultra-bright probes that do not cause any toxicity to the cellular environment, maintain their stability and are chemically inert. In this work we present hexagonal boron nitride (hBN) nanoflakes which host narrowband, ultra-bright single-photon emitters [1]. The emitters are optically stable at room temperature and under ambient environment. hBN has also been noted to be noncytotoxic, and significant advances have been made in its functionalization with biomolecules [2,3]. We further demonstrate two methods of engineering this new range of extremely robust multicolour emitters across the visible and near-infrared spectral ranges for large-scale sensing and biolabelling applications.

  5. Recent developments in gold(I) coordination chemistry: luminescence properties and bioimaging opportunities.

    Science.gov (United States)

    Langdon-Jones, Emily E; Pope, Simon J A

    2014-09-18

    The fascinating biological activity of gold coordination compounds has led to the development of a wide range of complexes. The precise biological action of such species is often poorly understood and the ability to map gold distribution in cellular environments is key. This article discusses the recent progress in luminescent Au(I) complexes whilst considering their utility in bioimaging and therapeutics.

  6. From Science History and Applications Developments of Life System to Bio-Imaging Technology

    Institute of Scientific and Technical Information of China (English)

    YAN Li-min; LOU Wei; HE Guo-sen

    2004-01-01

    This paper presents a brief scientific history and the application developments of imaging technology, and discusses bio-imaging technology. A realistic example of real-time image measurement and parallel processing is given. Finally, the emerging converging technology of nano-bio-info-cogno (NBIC) is discussed as a future trend.

  7. Upconverting and NIR emitting rare earth based nanostructures for NIR-bioimaging.

    Science.gov (United States)

    Hemmer, Eva; Venkatachalam, Nallusamy; Hyodo, Hiroshi; Hattori, Akito; Ebina, Yoshie; Kishimoto, Hidehiro; Soga, Kohei

    2013-12-07

    In recent years, significant progress has been achieved in the field of nanomedicine and bioimaging, but the development of new biomarkers for reliable detection of diseases at an early stage, molecular imaging, targeting and therapy remains crucial. The disadvantages of commonly used organic dyes include photobleaching, autofluorescence, phototoxicity and scattering when UV (ultraviolet) or visible light is used for excitation. The limited penetration depth of the excitation light into, and of the visible emission from, biological tissue is a further drawback with regard to in vivo bioimaging. Lanthanide-containing inorganic nanostructures emitting in the near-infrared (NIR) range under NIR excitation may overcome these problems. Due to the outstanding optical and magnetic properties of lanthanide ions (Ln(3+)), nanoscopic host materials doped with Ln(3+), e.g. Y2O3:Er(3+),Yb(3+), are promising candidates for NIR-NIR bioimaging. Ln(3+)-doped gadolinium-based inorganic nanostructures, such as Gd2O3:Er(3+),Yb(3+), have a high potential as opto-magnetic markers allowing the combination of time-resolved optical imaging and magnetic resonance imaging (MRI) of high spatial resolution. Recent progress in our research on over-1000 nm NIR fluorescent nanoprobes for in vivo NIR-NIR bioimaging is discussed in this review.

  8. Alexa fluor-labeled fluorescent cellulose nanocrystals for bioimaging solid cellulose in spatially structured microenvironments.

    Science.gov (United States)

    Grate, Jay W; Mo, Kai-For; Shin, Yongsoon; Vasdekis, Andreas; Warner, Marvin G; Kelly, Ryan T; Orr, Galya; Hu, Dehong; Dehoff, Karl J; Brockman, Fred J; Wilkins, Michael J

    2015-03-18

    Methods to covalently conjugate Alexa Fluor dyes to cellulose nanocrystals, at limiting amounts that retain the overall structure of the nanocrystals as model cellulose materials, were developed using two approaches. In the first, aldehyde groups are created on the cellulose surfaces by reaction with limiting amounts of sodium periodate, a reaction well known for oxidizing vicinal diols to create dialdehyde structures. Reductive amination reactions were then applied to bind Alexa Fluor dyes with terminal amino groups on the linker section. In the absence of the reductive step, dye washes out of the nanocrystal suspension, whereas with the reductive step, a colored product is obtained with the characteristic spectral bands of the conjugated dye. In the second approach, Alexa Fluor dyes were modified to contain a chloro-substituted triazine ring at the end of the linker section. These modified dyes were then reacted with cellulose nanocrystals in acetonitrile at elevated temperature, again isolating material with the characteristic spectral bands of the Alexa Fluor dye. Reactions with Alexa Fluor 546 are given as detailed examples, labeling on the order of 1% of the total glucopyranose rings of the cellulose nanocrystals at dye loadings of ca. 5 μg/mg cellulose. Fluorescent cellulose nanocrystals were deposited in pore-network microfluidic structures (PDMS) and proof-of-principle bioimaging experiments showed that the spatial localization of the solid cellulose deposits could be determined, and their disappearance under the action of Celluclast enzymes or microbes could be observed over time. In addition, single-molecule fluorescence microscopy was demonstrated as a method to follow the disappearance of solid cellulose deposits over time, following the decrease in the number of single blinking dye molecules with time instead of fluorescence intensity.

  9. Multifunctional nanocomposite based on halloysite nanotubes for efficient luminescent bioimaging and magnetic resonance imaging

    Directory of Open Access Journals (Sweden)

    Zhou T

    2016-09-01

    Full Text Available Tao Zhou,1 Lei Jia,1 Yi-Feng Luo,2 Jun Xu,1 Ru-Hua Chen,2 Zhi-Jun Ge,2 Tie-Liang Ma,2 Hong Chen,2 Tao-Feng Zhu2 1Department of Physics and Chemistry, Henan Polytechnic University, Jiaozuo, Henan, 2The Affiliated Yixing Hospital of Jiangsu University, Yixing, Jiangsu, People’s Republic of China Abstract: A novel multifunctional halloysite nanotube (HNT)-based Fe3O4@HNT-polyethyleneimine-Tip-Eu(dibenzoylmethane)3 nanocomposite (Fe-HNT-Eu NC) with both photoluminescent and magnetic properties was fabricated by a simple one-step hydrothermal process combined with the coupling grafting method, which exhibited high suspension stability and excellent photophysical behavior. The as-prepared multifunctional Fe-HNT-Eu NC was characterized using various techniques. The results of cell viability assay, cell morphological observation, and in vivo toxicity assay indicated that the NC exhibited excellent biocompatibility over the studied concentration range, suggesting that the obtained Fe-HNT-Eu NC was a suitable material for bioimaging and biological applications in human hepatic adenocarcinoma cells. Furthermore, the biocompatible Fe-HNT-Eu NC displayed superparamagnetic behavior with high saturation magnetization and also functioned as a magnetic resonance imaging (MRI) contrast agent in vitro and in vivo. The results of the MRI tests indicated that the Fe-HNT-Eu NC can significantly decrease the T2 signal intensity values of normal liver tissue, making the boundary between the normal liver and the transplanted cancer more distinct and thus effectively improving the diagnostic accuracy for cancers. Keywords: halloysite nanotube, lanthanide complex, iron oxide, luminescence, contrast agent

  10. Rapid solid-phase microwave synthesis of highly photoluminescent nitrogen-doped carbon dots for Fe3+ detection and cellular bioimaging

    Science.gov (United States)

    He, Guili; Xu, Minghan; Shu, Mengjun; Li, Xiaolin; Yang, Zhi; Zhang, Liling; Su, Yanjie; Hu, Nantao; Zhang, Yafei

    2016-09-01

    Recently, carbon dots (CDs) have been playing an increasingly important role in industrial production and the biomedical field because of their excellent properties. As such, finding an efficient method to quickly synthesize relatively high-purity CDs at large scale is of great interest. Herein, a facile and novel microwave method has been applied to prepare nitrogen-doped CDs (N-doped CDs) within 8 min using L-glutamic acid as the sole reaction precursor under solid-phase conditions. The as-prepared N-doped CDs, with an average size of 1.64 nm, are well dispersed in aqueous solution. The photoluminescence of the N-doped CDs is pH-sensitive and excitation-dependent. The N-doped CDs show strong blue fluorescence with a relatively high fluorescence quantum yield of 41.2%, which remains stable even under high ionic strength. Since their surface is rich in oxygen-containing functional groups, N-doped CDs can be applied to selectively detect Fe3+ with a limit of detection of 10⁻⁵ M. In addition, they can also be used for cellular bioimaging because of their high fluorescence intensity and nearly zero cytotoxicity. The solid-phase microwave method appears to be an effective strategy for rapidly obtaining high-quality N-doped CDs and expands their applications in ion detection and cellular bioimaging.
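
    Detection limits like the 10⁻⁵ M figure for Fe3+ are conventionally estimated from a calibration curve as LOD = 3σ_blank/slope. A hedged sketch of that convention with an entirely hypothetical calibration data set (not the paper's measurements):

    ```python
    import numpy as np

    # Hypothetical calibration: fluorescence quenching (F0 - F) vs. [Fe3+].
    conc = np.array([0, 10, 20, 40, 60, 80]) * 1e-6      # mol/L
    signal = np.array([0.0, 0.9, 2.1, 4.0, 6.1, 7.9])    # arbitrary units

    # Linear fit: slope of quenching response per mol/L of Fe3+.
    slope, intercept = np.polyfit(conc, signal, 1)

    # Standard deviation of the blank from replicate measurements.
    blank = np.array([0.02, -0.01, 0.03, 0.00, -0.02, 0.01])
    lod = 3 * blank.std(ddof=1) / slope
    print(f"LOD ≈ {lod:.1e} M")
    ```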

  11. A cyclometalated iridium(III) complex with enhanced phosphorescence emission in the solid state (EPESS): synthesis, characterization and its application in bioimaging.

    Science.gov (United States)

    Wu, Huazhou; Yang, Tianshe; Zhao, Qiang; Zhou, Jing; Li, Chunyan; Li, Fuyou

    2011-03-07

    Iridium(III) complexes with intense phosphorescence in solution have been widely applied in organic light-emitting diodes, chemosensors and bioimaging. However, little attention has been paid to iridium(III) complexes showing weak phosphorescence in solution and enhanced phosphorescence emission in the solid state (EPESS). In the present study, two β-diketonate ligands with different degrees of conjugation, 1-phenyl-3-methyl-4-benzoyl-5-pyrazolone (HL1) and 1-phenyl-3-methyl-4-phenylacetyl-5-pyrazolone (HL2), have been synthesized to be used as ancillary ligands for two iridium(III) complexes, Ir(ppy)(2)(L1) and Ir(ppy)(2)(L2) (Hppy = 2-phenylpyridine). The two complexes have been characterized by single-crystal X-ray crystallography, (1)H NMR and elemental analysis. Interestingly, Ir(ppy)(2)(L1) is EPESS-active whereas Ir(ppy)(2)(L2) exhibits moderately intense emission both in solution and as a neat film, indicating that the degree of conjugation of the β-diketone ligands determines the EPESS-activity. The single-crystal X-ray analysis has indicated that there are π-π interactions between the adjacent ppy ligands in Ir(ppy)(2)(L1) but not in Ir(ppy)(2)(L2). Finally, EPESS-active Ir(ppy)(2)(L1) has been successfully embedded in polymer nanoparticles and used as a luminescent label in bioimaging.

  12. Tissue cartography: compressing bio-image data by dimensional reduction.

    Science.gov (United States)

    Heemskerk, Idse; Streichan, Sebastian J

    2015-12-01

    The high volumes of data produced by state-of-the-art optical microscopes encumber research. We developed a method that reduces data size and processing time by orders of magnitude while disentangling signal by taking advantage of the laminar structure of many biological specimens. Our Image Surface Analysis Environment automatically constructs an atlas of 2D images for arbitrarily shaped, dynamic and possibly multilayered surfaces of interest. Built-in correction for cartographic distortion ensures that no information on the surface is lost, making the method suitable for quantitative analysis. We applied our approach to 4D imaging of a range of samples, including a Drosophila melanogaster embryo and a Danio rerio beating heart.

  13. Stable and Size-Tunable Aggregation-Induced Emission Nanoparticles Encapsulated with Nanographene Oxide and Applications in Three-Photon Fluorescence Bioimaging.

    Science.gov (United States)

    Zhu, Zhenfeng; Qian, Jun; Zhao, Xinyuan; Qin, Wei; Hu, Rongrong; Zhang, Hequn; Li, Dongyu; Xu, Zhengping; Tang, Ben Zhong; He, Sailing

    2016-01-26

    Organic fluorescent dyes with high quantum yield are widely applied in bioimaging and biosensing. However, most of them suffer from a severe effect called aggregation-caused quenching (ACQ), which means that their fluorescence is quenched at high molecular concentrations or in the aggregated state. Aggregation-induced emission (AIE) is a diametrically opposite phenomenon to ACQ, and luminogens with this feature can effectively solve this problem. Graphene oxide has been utilized as a quencher for many fluorescent dyes, based on which biosensing can be achieved. However, using graphene oxide as a surface modification agent for fluorescent nanoparticles is seldom reported. In this article, we used nanographene oxide (NGO) to encapsulate fluorescent nanoparticles consisting of a type of AIE dye named TPE-TPA-FN (TTF). NGO significantly improved the stability of the nanoparticles in aqueous dispersion. In addition, this method could flexibly control the size of the nanoparticles as well as increase their emission efficiency. We then used the NGO-modified TTF nanoparticles to achieve three-photon fluorescence bioimaging. The architecture of ear blood vessels in mice and the distribution of nanoparticles in zebrafish could be observed clearly. Furthermore, we extended this method to other AIE luminogens and showed that it is widely applicable.

  14. Rapid coal proximate analysis by thermogravimetric method

    Energy Technology Data Exchange (ETDEWEB)

    Mao Jianxiong; Yang Dezhong; Zhao Baozhong

    1987-09-01

    Rapid coal proximate analysis by thermogravimetric analysis (TGA) can be used as an alternative to the standard proximate analysis. This paper presents a program set up to rapidly perform coal proximate analysis using a thermal analyzer and TGA module. A comparison between coal proximate analyses by the standard method (GB) and by TGA is also given. It shows that most data from TGA fall within the tolerance limits of the standard method.
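
    The arithmetic behind such a TGA program is simply a set of mass-difference ratios read at the stage boundaries of the temperature program. The sketch below shows that calculation under assumed stage boundaries and sample masses; real programs follow the applicable standard (e.g. GB or ASTM) for temperatures, atmospheres and hold times.

```python
# Proximate analysis from TGA mass readings (all masses in mg).
def proximate_from_tga(m0, m_dry, m_devol, m_ash):
    """Masses at: start; after drying (~110 C, inert gas); after
    devolatilization (~900 C, inert gas); and after burnout in air.
    Returns (moisture, volatile matter, fixed carbon, ash) in wt%."""
    moisture = (m0 - m_dry) / m0 * 100
    volatile = (m_dry - m_devol) / m0 * 100
    ash = m_ash / m0 * 100
    fixed_carbon = 100.0 - moisture - volatile - ash  # by difference
    return moisture, volatile, fixed_carbon, ash

# Example with invented readings for a 20 mg sample:
print(proximate_from_tga(20.0, 19.2, 13.6, 2.4))
```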

  15. Sustainable, Rapid Synthesis of Bright-Luminescent CuInS2-ZnS Alloyed Nanocrystals: Multistage Nano-xenotoxicity Assessment and Intravital Fluorescence Bioimaging in Zebrafish-Embryos

    Science.gov (United States)

    Chetty, S. Shashank; Praneetha, S.; Basu, Sandeep; Sachidanandan, Chetana; Murugan, A. Vadivel

    2016-05-01

    Near-infrared (NIR) luminescent CuInS2-ZnS alloyed nanocrystals (CIZS-NCs) for fluorescence bioimaging have received considerable interest in recent years. Owing to their unique optical properties and low toxicity, they have become a desirable alternative to heavy-metal-based NCs and organic dyes for bioimaging and optoelectronic applications. In the present study, bright and robust CIZS-NCs have been synthesized within 5 min at temperatures as high as 230 °C, without requiring any inert-gas atmosphere, via a microwave-solvothermal (MW-ST) method. Subsequently, the in vitro and in vivo nano-xenotoxicity and cellular uptake of the MUA-functionalized CIZS-NCs were investigated in L929, Vero and MCF7 cell lines and in zebrafish embryos. We observed minimal toxicity and acute teratogenic consequences up to 62.5 μg/mL of the CIZS-NCs in zebrafish embryos. We also observed spontaneous uptake of the MUA-functionalized CIZS-NCs by 3 dpf zebrafish embryos, evident through bright red fluorescence emission at a low concentration of 7.8 μg/mL. Hence, we propose that the CIZS-NCs obtained by this rapid, low-cost, large-scale “sustainable” MW-ST synthesis are an ideal bio-nanoprobe with good temporal and spatial resolution for rapid labeling, long-term in vivo tracking and intravital fluorescence bioimaging (IVBI).

  16. Prognostic Analysis System and Methods of Operation

    Science.gov (United States)

    MacKey, Ryan M. E. (Inventor); Sneddon, Robert (Inventor)

    2014-01-01

    A prognostic analysis system and methods of operating the system are provided. In particular, a prognostic analysis system for the analysis of physical system health applicable to mechanical, electrical, chemical and optical systems and methods of operating the system are described herein.

  17. Convergence analysis of combinations of different methods

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Y. [Clarkson Univ., Potsdam, NY (United States)

    1994-12-31

    This paper provides a convergence analysis for combinations of different numerical methods for solving systems of differential equations. The author proves that combinations of two convergent linear multistep methods or Runge-Kutta methods produce a new convergent method of which the order is equal to the smaller order of the two original methods.
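
    The result that a combination of two convergent methods inherits the smaller of the two orders can be checked numerically. The sketch below (an illustration of the general statement, not the author's proof) alternates forward Euler (order 1) and classical RK4 (order 4) steps on the test problem y' = y, y(0) = 1, and estimates the empirical convergence order from successive grid refinements.

```python
import math

def euler_step(f, t, y, h):
    return y + h * f(t, y)

def rk4_step(f, t, y, h):
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def combined_solve(f, y0, t_end, n):
    """Alternate Euler (order 1) and RK4 (order 4) steps over [0, t_end]."""
    h, t, y = t_end / n, 0.0, y0
    for i in range(n):
        y = euler_step(f, t, y, h) if i % 2 == 0 else rk4_step(f, t, y, h)
        t += h
    return y

f = lambda t, y: y  # y' = y, exact solution y(1) = e
errs = [abs(combined_solve(f, 1.0, 1.0, n) - math.e) for n in (64, 128, 256)]
orders = [math.log2(errs[i] / errs[i + 1]) for i in range(2)]
print(orders)  # empirical order close to 1, the smaller of the two
```

    The observed order is governed by the Euler half of the steps, consistent with the paper's conclusion.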

  18. Probabilistic methods in combinatorial analysis

    CERN Document Server

    Sachkov, Vladimir N

    2014-01-01

    This 1997 work explores the role of probabilistic methods for solving combinatorial problems. These methods not only provide the means of efficiently using such notions as characteristic and generating functions, the moment method and so on but also let us use the powerful technique of limit theorems. The basic objects under investigation are nonnegative matrices, partitions and mappings of finite sets, with special emphasis on permutations and graphs, and equivalence classes specified on sequences of finite length consisting of elements of partially ordered sets; these specify the probabilist

  19. Applying critical analysis - main methods

    Directory of Open Access Journals (Sweden)

    Miguel Araujo Alonso

    2012-02-01

    Full Text Available What is the usefulness of critical appraisal of the literature? Critical analysis is a fundamental condition for the correct interpretation of any study that is subject to review. In epidemiology, in order to learn how to read a publication, we must be able to analyze it critically. Critical analysis allows us to check whether a study fulfills certain previously established methodological inclusion and exclusion criteria. This is frequently done when conducting systematic reviews, although eligibility criteria are generally limited to the study design. Critical analysis of the literature can be done implicitly while reading an article, as in reading for personal interest, or can be conducted in a structured manner, using explicit and previously established criteria. The latter is done when formally reviewing a topic.

  20. Trial Sequential Methods for Meta-Analysis

    Science.gov (United States)

    Kulinskaya, Elena; Wood, John

    2014-01-01

    Statistical methods for sequential meta-analysis have applications also for the design of new trials. Existing methods are based on group sequential methods developed for single trials and start with the calculation of a required information size. This works satisfactorily within the framework of fixed effects meta-analysis, but conceptual…

  1. Hydrothermal synthesis of NaLuF4:153Sm,Yb,Tm nanoparticles and their application in dual-modality upconversion luminescence and SPECT bioimaging.

    Science.gov (United States)

    Yang, Yang; Sun, Yun; Cao, Tianye; Peng, Juanjuan; Liu, Ying; Wu, Yongquan; Feng, Wei; Zhang, Yingjian; Li, Fuyou

    2013-01-01

    Upconversion luminescence (UCL) properties and radioactivity have been integrated into NaLuF(4):(153)Sm,Yb,Tm nanoparticles by a facile one-step hydrothermal method, making these nanoparticles potential candidates for UCL and single-photon emission computed tomography (SPECT) dual-modal bioimaging in vivo. The introduction of a small amount of radioactive (153)Sm(3+) scarcely alters the upconversion luminescence properties of the nanoparticles. The as-designed nanoparticles showed very low cytotoxicity, no obvious tissue damage over 7 days, and excellent in vitro and in vivo performance in dual-modal bioimaging. By means of a combination of UCL and SPECT imaging in vivo, the distribution of the nanoparticles in living animals has been studied, and the results indicated that the particles mainly accumulated in the liver and spleen. Therefore, the concept of (153)Sm(3+)/Yb(3+)/Tm(3+) co-doped NaLuF(4) nanoparticles for UCL and SPECT dual-modality whole-body imaging in vivo may serve as a platform for next-generation probes for ultra-sensitive molecular imaging from the cellular scale to whole-body evaluation. It also introduces an easy methodology for quantifying the in vivo biodistribution of nanomaterials, which the community still needs to understand further.

  2. Sulfobetaine-Vinylimidazole Block Copolymers: A Robust Quantum Dot Surface Chemistry Expanding Bioimaging's Horizons.

    Science.gov (United States)

    Tasso, Mariana; Giovanelli, Emerson; Zala, Diana; Bouccara, Sophie; Fragola, Alexandra; Hanafi, Mohamed; Lenkei, Zsolt; Pons, Thomas; Lequeux, Nicolas

    2015-11-24

    Long-term inspection of biological phenomena requires probes of elevated intra- and extracellular stability and target biospecificity. The high fluorescence and photostability of quantum dot (QD) nanoparticles contributed to foster their promise as bioimaging tools that could overcome limitations associated with traditional fluorophores. However, QDs' potential as a bioimaging platform relies upon a precise control over the surface chemistry modifications of these nano-objects. Here, a zwitterion-vinylimidazole block copolymer ligand was synthesized, which regroups all anchoring groups in one compact terminal block, while the rest of the chain is endowed with antifouling and bioconjugation moieties. By further application of an oriented bioconjugation approach with whole IgG antibodies, QD nanobioconjugates were obtained that display outstanding intra- and extracellular stability as well as biorecognition capacity. Imaging the internalization and intracellular dynamics of a transmembrane cell receptor, the CB1 brain cannabinoid receptor, both in HEK293 cells and in neurons, illustrates the breadth of potential applications of these nanoprobes.

  3. Upconverting and NIR emitting rare earth based nanostructures for NIR-bioimaging

    Science.gov (United States)

    Hemmer, Eva; Venkatachalam, Nallusamy; Hyodo, Hiroshi; Hattori, Akito; Ebina, Yoshie; Kishimoto, Hidehiro; Soga, Kohei

    2013-11-01

    In recent years, significant progress was achieved in the field of nanomedicine and bioimaging, but the development of new biomarkers for reliable detection of diseases at an early stage, molecular imaging, targeting and therapy remains crucial. The disadvantages of commonly used organic dyes include photobleaching, autofluorescence, phototoxicity and scattering when UV (ultraviolet) or visible light is used for excitation. The limited penetration depth of the excitation light and the visible emission into and from the biological tissue is a further drawback with regard to in vivo bioimaging. Lanthanide containing inorganic nanostructures emitting in the near-infrared (NIR) range under NIR excitation may overcome those problems. Due to the outstanding optical and magnetic properties of lanthanide ions (Ln3+), nanoscopic host materials doped with Ln3+, e.g. Y2O3:Er3+,Yb3+, are promising candidates for NIR-NIR bioimaging. Ln3+-doped gadolinium-based inorganic nanostructures, such as Gd2O3:Er3+,Yb3+, have a high potential as opto-magnetic markers allowing the combination of time-resolved optical imaging and magnetic resonance imaging (MRI) of high spatial resolution. Recent progress in our research on over-1000 nm NIR fluorescent nanoprobes for in vivo NIR-NIR bioimaging will be discussed in this review.

  4. Nanoparticles prepared from porous silicon nanowires for bio-imaging and sonodynamic therapy.

    Science.gov (United States)

    Osminkina, Liubov A; Sivakov, Vladimir A; Mysov, Grigory A; Georgobiani, Veronika A; Natashina, Ulyana А; Talkenberg, Florian; Solovyev, Valery V; Kudryavtsev, Andrew A; Timoshenko, Victor Yu

    2014-01-01

    The cytotoxicity, photoluminescence, bio-imaging and sonosensitizing properties of silicon nanoparticles (SiNPs) prepared by ultrasound grinding of porous silicon nanowires (SiNWs) have been investigated. The SiNWs were formed by metal (silver)-assisted wet chemical etching of heavily boron-doped (100)-oriented single-crystalline silicon wafers. The prepared SiNWs and aqueous suspensions of SiNPs exhibit efficient room-temperature photoluminescence (PL) in the spectral region of 600 to 1,000 nm, which is explained by the radiative recombination of excitons confined in the small silicon nanocrystals of which SiNWs and SiNPs consist. On the one hand, in vitro studies have demonstrated the low cytotoxicity of SiNPs and the possibility of their bio-imaging applications. On the other hand, it has been found that SiNPs can act as efficient sensitizers of ultrasound-induced suppression of the viability of Hep-2 cancer cells.

  5. Neodymium-doped nanoparticles for infrared fluorescence bioimaging: The role of the host

    Energy Technology Data Exchange (ETDEWEB)

    Rosal, Blanca del; Pérez-Delgado, Alberto; Rocha, Ueslen; Martín Rodríguez, Emma; Jaque, Daniel, E-mail: daniel.jaque@uam.es [Fluorescence Imaging Group, Dpto. de Física de Materiales, Facultad de Ciencias, Universidad Autónoma de Madrid, Campus de Cantoblanco, Madrid 28049 (Spain); Misiak, Małgorzata; Bednarkiewicz, Artur [Wroclaw Research Centre EIT+, ul. Stabłowicka 147, 54-066 Wrocław (Poland); Institute of Physics, University of Tartu, 14c Ravila Str., 50411 Tartu (Estonia); Vanetsev, Alexander S. [Institute of Low Temperature and Structure Research, PAS, ul. Okólna 2, 50-422 Wrocław (Poland); Orlovskii, Yurii [Institute of Low Temperature and Structure Research, PAS, ul. Okólna 2, 50-422 Wrocław (Poland); Prokhorov General Physics Institute RAS, 38 Vavilov Str., 119991 Moscow (Russian Federation); Jovanović, Dragana J.; Dramićanin, Miroslav D. [Vinča Institute of Nuclear Sciences, University of Belgrade, P.O. Box 522, Belgrade 11001 (Serbia); Upendra Kumar, K.; Jacinto, Carlos [Grupo de Fotônica e Fluidos Complexos, Instituto de Física, Universidade Federal de Alagoas, 57072-900 Maceió-AL (Brazil); Navarro, Elizabeth [Depto. de Química, Eco Catálisis, UAM-Iztapalapa, Sn. Rafael Atlixco 186, México 09340, D.F (Mexico); and others

    2015-10-14

    The spectroscopic properties of different infrared-emitting neodymium-doped nanoparticles (LaF{sub 3}:Nd{sup 3+}, SrF{sub 2}:Nd{sup 3+}, NaGdF{sub 4}: Nd{sup 3+}, NaYF{sub 4}: Nd{sup 3+}, KYF{sub 4}: Nd{sup 3+}, GdVO{sub 4}: Nd{sup 3+}, and Nd:YAG) have been systematically analyzed. A comparison of the spectral shapes of both emission and absorption spectra is presented, from which the relevant role played by the host matrix is evidenced. The lack of a “universal” optimum system for infrared bioimaging is discussed, as the specific bioimaging application and the experimental setup for infrared imaging determine the neodymium-doped nanoparticle to be preferentially used in each case.

  6. Nonlinear structural analysis using integrated force method

    Indian Academy of Sciences (India)

    N R B Krishnam Raju; J Nagabhushanam

    2000-08-01

    Though the use of the integrated force method for linear investigations is well-recognised, no efforts have been made to extend this method to nonlinear structural analysis. This paper presents attempts to use this method for analysing nonlinear structures. A general formulation of nonlinear structural analysis is given. Typical highly nonlinear benchmark problems are considered. The characteristic matrices of the elements used in these problems are developed, and these structures are then analysed. The results of the analysis are compared with those of the displacement method. It is demonstrated that the integrated force method is equally viable and efficient compared to the displacement method.

  7. Cost Analysis: Methods and Realities.

    Science.gov (United States)

    Cummings, Martin M.

    1989-01-01

    Argues that librarians need to be concerned with cost analysis of library functions and services because, in the allocation of resources, decision makers will favor library managers who demonstrate understanding of the relationships between costs and productive outputs. Factors that should be included in a reliable scheme for cost accounting are…

  8. Carbon dots as a luminescence sensor for ultrasensitive detection of phosphate and their bioimaging properties.

    Science.gov (United States)

    Xu, Jingyi; Zhou, Ying; Cheng, Guifang; Dong, Meiting; Liu, Shuxian; Huang, Chaobiao

    2015-06-01

    Highly blue-fluorescent carbon dots were synthesized by one-step hydrothermal treatment of potatoes. The as-obtained C-dots have been applied to bioimaging of HeLa cells, which demonstrates their excellent biocompatibility and low cytotoxicity. The results reveal that the C-dots are promising for real cell imaging applications. In addition, the carbon dots can be utilized as a probe for sensing phosphate.

  9. Laser-synthesized oxide-passivated bright Si quantum dots for bioimaging

    OpenAIRE

    2016-01-01

    Crystalline silicon (Si) nanoparticles present an extremely promising object for bioimaging based on photoluminescence (PL) in the visible and near-infrared spectral regions, but their efficient PL emission in aqueous suspension is typically observed after wet chemistry procedures leading to residual toxicity issues. Here, we introduce ultrapure laser-synthesized Si-based quantum dots (QDs), which are water-dispersible and exhibit bright exciton PL in the window of relative tissue transparenc...

  10. Alexa Fluor-labeled Fluorescent Cellulose Nanocrystals for Bioimaging Solid Cellulose in Spatially Structured Microenvironments

    Energy Technology Data Exchange (ETDEWEB)

    Grate, Jay W.; Mo, Kai-For; Shin, Yongsoon; Vasdekis, Andreas; Warner, Marvin G.; Kelly, Ryan T.; Orr, Galya; Hu, Dehong; Dehoff, Karl J.; Brockman, Fred J.; Wilkins, Michael J.

    2015-03-18

    Cellulose nanocrystal materials have been labeled with modern Alexa Fluor dyes in a process that first links the dye to a cyanuric chloride molecule. Subsequent reaction with cellulose nanocrystals provides dyed solid microcrystalline cellulose material that can be used for bioimaging and suitable for deposition in films and spatially structured microenvironments. It is demonstrated with single molecular fluorescence microscopy that these films are subject to hydrolysis by cellulose enzymes.

  11. Hybrid methods for cybersecurity analysis :

    Energy Technology Data Exchange (ETDEWEB)

    Davis, Warren Leon,; Dunlavy, Daniel M.

    2014-01-01

    Early 2010 saw a significant change in adversarial techniques aimed at network intrusion: a shift from malware delivered via email attachments toward the use of hidden, embedded hyperlinks to initiate sequences of downloads and interactions with web sites and network servers containing malicious software. Enterprise security groups were well poised and experienced in defending against the former attacks, but the new types of attacks were larger in number, more challenging to detect, dynamic in nature, and required the development of new technologies and analytic capabilities. The Hybrid LDRD project was aimed at delivering new capabilities in large-scale data modeling and analysis to enterprise security operators and analysts, and at understanding the challenges of detection and prevention of emerging cybersecurity threats. Leveraging previous LDRD research efforts and capabilities in large-scale relational data analysis, large-scale discrete data analysis and visualization, and streaming data analysis, new modeling and analysis capabilities were quickly brought to bear on the problems of email phishing and spear-phishing attacks in the Sandia enterprise security operational groups at the onset of the Hybrid project. As part of this project, a software development and deployment framework was created within the security analyst workflow tool sets to facilitate the delivery and testing of new capabilities as they became available, and machine learning algorithms were developed to address the challenge of dynamic threats. Furthermore, researchers from the Hybrid project were embedded in the security analyst groups for almost a full year, engaged in daily operational activities and routines, creating an atmosphere of trust and collaboration between the researchers and security personnel. The Hybrid project has altered the way that research ideas can be incorporated into the production environments of Sandia's enterprise security groups, reducing time to deployment from months and

  12. Facile Synthesis of Amine-Functionalized Eu3+-Doped La(OH)3 Nanophosphors for Bioimaging

    Directory of Open Access Journals (Sweden)

    Sun Conroy

    2011-01-01

    Full Text Available Here, we report a straightforward synthesis process to produce colloidal Eu3+-activated nanophosphors (NPs) for use as bioimaging probes. In this procedure, poly(ethylene glycol) serves as a high-boiling-point solvent allowing for nanoscale particle formation as well as a convenient medium for solvent exchange and subsequent surface modification. The La(OH)3:Eu3+ NPs produced by this process were ~3.5 nm in diameter as determined by transmission electron microscopy. The NP surface was coated with aminopropyltriethoxysilane to provide chemical functionality for attachment of biological ligands, improve chemical stability and prevent surface quenching of luminescent centers. Photoluminescence spectroscopy of the NPs displayed emission peaks at 597 and 615 nm (λex = 280 nm). The red emission, due to 5D0 → 7F1 and 5D0 → 7F2 transitions, was linear with concentration as observed by imaging with a conventional bioimaging system. To demonstrate the feasibility of these NPs to serve as optical probes in biological applications, an in vitro experiment was performed with HeLa cells. NP emission was observed in the cells by fluorescence microscopy. In addition, the NPs displayed no cytotoxicity over the course of a 48-h MTT cell viability assay. These results suggest that La(OH)3:Eu3+ NPs possess the potential to serve as a luminescent bioimaging probe.

  13. Design and synthesis of polymer-functionalized NIR fluorescent dyes--magnetic nanoparticles for bioimaging.

    Science.gov (United States)

    Yen, Swee Kuan; Jańczewski, Dominik; Lakshmi, Jeeva Lavanya; Dolmanan, Surani Bin; Tripathy, Sudhiranjan; Ho, Vincent H B; Vijayaragavan, Vimalan; Hariharan, Anushya; Padmanabhan, Parasuraman; Bhakoo, Kishore K; Sudhaharan, Thankiah; Ahmed, Sohail; Zhang, Yong; Tamil Selvan, Subramanian

    2013-08-27

    Fluorescent probes with complete spectral separation between absorption and emission spectra (a large Stokes shift) are highly useful for solar concentrators and bioimaging. In bioimaging applications, NIR fluorescent dyes have a great advantage in tissue penetration depth compared to visible-emitting organic dyes or inorganic quantum dots. Here we report the design, synthesis, and characterization of an amphiphilic polymer, poly(isobutylene-alt-maleic anhydride)-functionalized near-infrared (NIR) IR-820 dye and its conjugates with iron oxide (Fe3O4) magnetic nanoparticles (MNPs) for optical and magnetic resonance (MR) imaging. Our results demonstrate that the Stokes shift of the unmodified dye can be tuned (from ~106 to 208 nm) by functionalizing the dye with the polymer and MNPs. The fabrication of the bimodal probes involves (i) the synthesis of the NIR fluorescent dye (IR-820 cyanine) functionalized with an ethylenediamine linker in high yield, >90%, (ii) polymer conjugation to the functionalized NIR fluorescent dye, and (iii) grafting the polymer-conjugated dyes onto iron oxide MNPs. The resulting uniform, small-sized (ca. 6 nm) NIR fluorescent dye-magnetic hybrid nanoparticles (NPs) exhibit a wide emissive range (800-1000 nm) and minimal cytotoxicity. Our preliminary studies demonstrate the potential utility of these NPs in bioimaging by means of direct labeling of cancerous HeLa cells via NIR fluorescence microscopy and good negative contrast enhancement in T2-weighted MR imaging of a murine model.

  14. The Intersection of CMOS Microsystems and Upconversion Nanoparticles for Luminescence Bioimaging and Bioassays

    Directory of Open Access Journals (Sweden)

    Liping Wei

    2014-09-01

    Full Text Available Organic fluorophores and quantum dots are ubiquitous as contrast agents for bio-imaging and as labels in bioassays to enable the detection of biological targets and processes. Upconversion nanoparticles (UCNPs) offer a different set of opportunities as labels in bioassays and for bioimaging. UCNPs are excited at near-infrared (NIR) wavelengths where biological molecules are optically transparent, and their luminescence in the visible and ultraviolet (UV) wavelength range is suitable for detection using complementary metal-oxide-semiconductor (CMOS) technology. These nanoparticles provide multiple sharp emission bands, long lifetimes, tunable emission, high photostability, and low cytotoxicity, which render them particularly useful for bio-imaging applications and multiplexed bioassays. This paper surveys several key concepts surrounding upconversion nanoparticles and the systems that detect and process the corresponding luminescence signals. The principle of photon upconversion, tuning of emission wavelengths, UCNP bioassays, and UCNP time-resolved techniques are described. Electronic readout systems for signal detection and processing suitable for UCNP luminescence using CMOS technology are discussed. This includes recent progress in miniaturized detectors, integrated spectral sensing, and high-precision time-domain circuits. Emphasis is placed on the physical attributes of UCNPs that map strongly to the technical features that CMOS devices excel in delivering, exploring the interoperability between the two technologies.

  15. Green synthesis of multifunctional carbon dots from coriander leaves and their potential application as antioxidants, sensors and bioimaging agents.

    Science.gov (United States)

    Sachdev, Abhay; Gopinath, P

    2015-06-21

    In the present study, a facile one-step hydrothermal treatment of coriander leaves for preparing carbon dots (CDs) is reported. The optical and structural properties of the CDs have been extensively studied by UV-visible and fluorescence spectroscopy, microscopy (transmission electron microscopy, scanning electron microscopy) and X-ray diffraction techniques. The surface functionality and composition of the CDs have been illustrated by elemental analysis and Fourier transform infrared spectroscopy (FTIR). Quenching of the fluorescence of the CDs in the presence of metal ions is of prime significance; hence the CDs have been used as a fluorescence probe for sensitive and selective detection of Fe(3+) ions. Finally, the biocompatibility and bioimaging aspects of the CDs have been evaluated on normal lung (L-132) and cancer (A549) cell lines. Qualitative analysis of cellular uptake of the CDs has been pursued through fluorescence microscopy, while quantitative analysis using a flow cytometer provided insight into the concentration- and cell-type-dependent uptake of the CDs. The article further investigates the antioxidant activity of the CDs. We have thereby validated the practicality of CDs obtained from a herbal carbon source for versatile applications.

  16. Probabilistic Analysis Methods for Hybrid Ventilation

    DEFF Research Database (Denmark)

    Brohus, Henrik; Frier, Christian; Heiselberg, Per

    This paper discusses a general approach for the application of probabilistic analysis methods in the design of ventilation systems. The aims and scope of probabilistic versus deterministic methods are addressed with special emphasis on hybrid ventilation systems. A preliminary application...

  17. The Functional Methods of Discourse Analysis

    Institute of Scientific and Technical Information of China (English)

    覃卓敏

    2008-01-01

    From a macroscopic, functional perspective, methods of discourse analysis are clarified in order to identify two important methods in pragmatics, which can then be better applied to the understanding of discourse.

  18. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    Science.gov (United States)

    West, Phillip B.; Novascone, Stephen R.; Wright, Jerry P.

    2011-09-27

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  19. Infinitesimal methods of mathematical analysis

    CERN Document Server

    Pinto, J S

    2004-01-01

    This modern introduction to infinitesimal methods is a translation of the book Métodos Infinitesimais de Análise Matemática by José Sousa Pinto of the University of Aveiro, Portugal and is aimed at final year or graduate level students with a background in calculus. Surveying modern reformulations of the infinitesimal concept with a thoroughly comprehensive exposition of important and influential hyperreal numbers, the book includes previously unpublished material on the development of hyperfinite theory of Schwartz distributions and its application to generalised Fourier transforms and harmon

  20. Probabilistic structural analysis by extremum methods

    Science.gov (United States)

    Nafday, Avinash M.

    1990-01-01

    The objective is to demonstrate discrete extremum methods of structural analysis as a tool for structural system reliability evaluation. Specifically, linear and multiobjective linear programming models for analysis of rigid plastic frames under proportional and multiparametric loadings, respectively, are considered. Kinematic and static approaches for analysis form a primal-dual pair in each of these models and have a polyhedral format. Duality relations link extreme points and hyperplanes of these polyhedra and lead naturally to dual methods for system reliability evaluation.

  1. Sensitivity Analysis Using Simple Additive Weighting Method

    Directory of Open Access Journals (Sweden)

    Wayne S. Goodridge

    2016-05-01

Full Text Available The output of a multiple criteria decision method often has to be analyzed using some sensitivity analysis technique. The SAW MCDM method is commonly used in the management sciences, and there is a critical need for a robust approach to sensitivity analysis given that uncertain data are often present in decision models. Most sensitivity analysis techniques for the SAW method involve Monte Carlo simulation on the initial data. These methods are computationally intensive and often require complex software. In this paper, the SAW method is extended to include an objective function that makes it straightforward to analyze the influence of specific changes in certain criteria values, and thus to perform sensitivity analysis.
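The weighted-sum scoring at the core of SAW, plus a simple one-at-a-time weight perturbation, can be sketched as follows. This is a minimal illustration with made-up numbers, not the objective-function extension described in the paper:

```python
import numpy as np

# Minimal SAW (Simple Additive Weighting) sketch with made-up numbers.
# Only the core scoring step plus a one-at-a-time weight perturbation.
def saw_scores(matrix, weights, benefit):
    """matrix: alternatives x criteria; benefit[j] True if larger is better."""
    m = np.asarray(matrix, dtype=float)
    norm = np.empty_like(m)
    for j in range(m.shape[1]):
        col = m[:, j]
        # linear scale normalization: benefit -> col/max, cost -> min/col
        norm[:, j] = col / col.max() if benefit[j] else col.min() / col
    return norm @ np.asarray(weights, dtype=float)

# Three alternatives scored on two benefit criteria and one cost criterion.
M = [[80, 7, 120],
     [65, 9, 100],
     [90, 6, 150]]
kinds = [True, True, False]

scores = saw_scores(M, [0.5, 0.3, 0.2], kinds)
best = int(np.argmax(scores))

# Perturb the weights and re-rank: here the ranking flips, i.e. the
# decision is sensitive to the first criterion's weight.
best2 = int(np.argmax(saw_scores(M, [0.7, 0.1, 0.2], kinds)))
print(best, best2)
```

A full sensitivity study would sweep each weight over a range rather than trying a single perturbation.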

  2. Text analysis methods, text analysis apparatuses, and articles of manufacture

    Science.gov (United States)

    Whitney, Paul D; Willse, Alan R; Lopresti, Charles A; White, Amanda M

    2014-10-28

    Text analysis methods, text analysis apparatuses, and articles of manufacture are described according to some aspects. In one aspect, a text analysis method includes accessing information indicative of data content of a collection of text comprising a plurality of different topics, using a computing device, analyzing the information indicative of the data content, and using results of the analysis, identifying a presence of a new topic in the collection of text.

  3. Model correction factor method for system analysis

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager; Johannesen, Johannes M.

    2000-01-01

The Model Correction Factor Method is an intelligent response surface method based on simplified modeling. MCFM is aimed at reliability analysis in the case of a limit state defined by an elaborate model. Herein it is demonstrated that the method is applicable for elaborate limit state surfaces on which...

  4. Matrix methods for bare resonator eigenvalue analysis.

    Science.gov (United States)

    Latham, W P; Dente, G C

    1980-05-15

    Bare resonator eigenvalues have traditionally been calculated using Fox and Li iterative techniques or the Prony method presented by Siegman and Miller. A theoretical framework for bare resonator eigenvalue analysis is presented. Several new methods are given and compared with the Prony method.
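The Fox and Li technique mentioned above is, in essence, a power iteration on the resonator's round-trip propagation operator. A hedged toy sketch, with a small random symmetric matrix standing in for the true diffraction integral operator (an assumption for illustration only):

```python
import numpy as np

# Toy illustration of the Fox-Li idea: repeatedly applying the round-trip
# operator to an arbitrary starting field converges to the dominant
# eigenmode (power iteration). A small random symmetric matrix stands in
# for the true diffraction integral operator.
rng = np.random.default_rng(0)
B = rng.normal(size=(6, 6))
A = B + B.T                           # stand-in "round-trip" operator

v = np.ones(6)                        # arbitrary starting "field"
for _ in range(1000):                 # one "round trip" per iteration
    w = A @ v
    v = w / np.linalg.norm(w)         # renormalize, as in Fox-Li

gamma = v @ A @ v                     # Rayleigh-quotient eigenvalue estimate
exact = max(np.linalg.eigvalsh(A), key=abs)
print(gamma, exact)
```

Prony-type methods accelerate exactly this convergence by extracting several eigenvalues from the iterate sequence instead of waiting for the dominant one to emerge.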

  5. Bioimaging of metals in brain tissue by laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) and metallomics.

    Science.gov (United States)

    Becker, J Sabine; Matusch, Andreas; Palm, Christoph; Salber, Dagmar; Morton, Kathryn A; Becker, J Susanne

    2010-02-01

Laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) has been developed and established as an emerging technique for generating quantitative images of metal distributions in thin tissue sections of brain samples (such as human, rat and mouse brain), with applications in research related to neurodegenerative disorders. A new analytical protocol is described which includes sample preparation by cryo-cutting of thin tissue sections and matrix-matched laboratory standards, mass spectrometric measurements, data acquisition, and quantitative analysis. Specific examples of the bioimaging of metal distributions in normal rodent brains are provided. Differences from normal were assessed in a Parkinson's disease model and a stroke brain model. Furthermore, changes during normal aging were studied. Powerful analytical techniques are also required for the determination and characterization of metal-containing proteins within a large pool of proteins, e.g., after denaturing or non-denaturing electrophoretic separation of proteins in one-dimensional and two-dimensional gels. LA-ICP-MS can be employed to detect metalloproteins in protein bands or spots separated after gel electrophoresis. MALDI-MS can then be used to identify specific metal-containing proteins in these bands or spots. The combination of these techniques is described in the second section.

  6. Antibiotic Conjugated Fluorescent Carbon Dots as a Theranostic Agent for Controlled Drug Release, Bioimaging, and Enhanced Antimicrobial Activity

    Directory of Open Access Journals (Sweden)

    Mukeshchand Thakur

    2014-01-01

Full Text Available Microwave-assisted synthesis of bright carbon dots (C-dots) using gum arabic (GA), and their use as a molecular vehicle to ferry ciprofloxacin hydrochloride, a broad-spectrum antibiotic, is reported in the present work. Density gradient centrifugation (DGC) was used to separate different types of C-dots. After careful analysis of the fractions obtained after centrifugation, ciprofloxacin was attached to give a ciprofloxacin-C-dots conjugate (Cipro@C-dots). Release of ciprofloxacin was found to be highly regulated under physiological conditions. Cipro@C-dots were found to be biocompatible with Vero cells, as compared to free ciprofloxacin (1.2 mM), even at very high concentrations. Bare C-dots (∼13 mg mL−1) were used for microbial imaging of the simplest eukaryotic model, Saccharomyces cerevisiae (yeast). Bright green fluorescence was observed when live yeast cells were imaged under a fluorescence microscope, suggesting incorporation of C-dots inside the cells. The Cipro@C-dots conjugate also showed enhanced antimicrobial activity against both model gram-positive and gram-negative microorganisms. Thus, the Cipro@C-dots conjugate paves the way not only for bioimaging but also for an efficient new nanocarrier for controlled drug release with high antimicrobial activity, thereby serving as a potential theranostic tool.

  7. Beam-propagation method - Analysis and assessment

    Science.gov (United States)

    van Roey, J.; van der Donk, J.; Lagasse, P. E.

    1981-07-01

    A method for the calculation of the propagation of a light beam through an inhomogeneous medium is presented. A theoretical analysis of this beam-propagation method is given, and a set of conditions necessary for the accurate application of the method is derived. The method is illustrated by the study of a number of integrated-optic structures, such as thin-film waveguides and gratings.

  8. Fractal methods in image analysis and coding

    OpenAIRE

    Neary, David

    2001-01-01

In this thesis we present an overview of image processing techniques which use fractal methods in some way. We show how these fields relate to each other, and examine various aspects of fractal methods in each area. The three principal fields of image processing and analysis that we examine are texture classification, image segmentation and image coding. In the area of texture classification, we examine fractal dimension estimators, comparing these methods to other methods in use, a...

  9. Water-soluble photoluminescent fullerene capped mesoporous silica for pH-responsive drug delivery and bioimaging

    Science.gov (United States)

    Tan, Lei; Wu, Tao; Tang, Zhao-Wen; Xiao, Jian-Yun; Zhuo, Ren-Xi; Shi, Bin; Liu, Chuan-Jun

    2016-08-01

    In this paper, a biocompatible and water-soluble fluorescent fullerene (C60-TEG-COOH) coated mesoporous silica nanoparticle (MSN) was successfully fabricated for pH-sensitive drug release and fluorescent cell imaging. The MSN was first reacted with 3-aminopropyltriethoxysilane to obtain an amino-modified MSN, and then the water-soluble C60 with a carboxyl group was used to cover the surface of the MSN through electrostatic interaction with the amino group in PBS solution (pH = 7.4). The release of doxorubicin hydrochloride (DOX) could be triggered under a mild acidic environment (lysosome, pH = 5.0) due to the protonation of C60-TEG-COO-, which induced the dissociation of the C60-TEG-COOH modified MSN (MSN@C60). Furthermore, the uptake of nanoparticles by cells could be tracked because of the green fluorescent property of the C60-modified MSN. In an in vitro study, the prepared materials showed excellent biocompatibility and the DOX-loaded nanocarrier exhibited efficient anticancer ability. This work offered a simple method for designing a simultaneous pH-responsive drug delivery and bioimaging system.

  10. LANDSCAPE ANALYSIS METHOD OF RIVERINE TERRITORIES

    OpenAIRE

    Fedoseeva O. S.

    2013-01-01

The article proposes a method for landscape area analysis that consists of four stages. The technique is proposed as a tool for the practical application of pre-project research materials in design solutions for the planning and organization of landscape areas.

  11. An introduction to numerical methods and analysis

    CERN Document Server

    Epperson, James F

    2013-01-01

    Praise for the First Edition "". . . outstandingly appealing with regard to its style, contents, considerations of requirements of practice, choice of examples, and exercises.""-Zentralblatt MATH "". . . carefully structured with many detailed worked examples.""-The Mathematical Gazette The Second Edition of the highly regarded An Introduction to Numerical Methods and Analysis provides a fully revised guide to numerical approximation. The book continues to be accessible and expertly guides readers through the many available techniques of numerical methods and analysis. An Introduction to

  12. [Framework analysis method in qualitative research].

    Science.gov (United States)

    Liao, Xing; Liu, Jian-ping; Robison, Nicola; Xie, Ya-ming

    2014-05-01

In recent years a number of qualitative research methods have gained popularity within the health care arena. Despite this popularity, different qualitative analysis methods pose many challenges to most researchers. The present paper responds to needs expressed in recent Chinese medicine research. It focuses mainly on the concepts, nature, and application of framework analysis, and especially on how to use it, so as to help newcomers among Chinese medicine researchers engage with the methodology.

  13. Two MIS Analysis Methods: An Experimental Comparison.

    Science.gov (United States)

    Wang, Shouhong

    1996-01-01

    In China, 24 undergraduate business students applied data flow diagrams (DFD) to a mini-case, and 20 used object-oriented analysis (OOA). DFD seemed easier to learn, but after training, those using the OOA method for systems analysis made fewer errors. (SK)

  14. An Analysis Method of Business Application Framework

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

We discuss the evolution of the object-oriented software development process based on software patterns. To develop mature software frameworks and components, we advocate eliciting and incorporating software patterns to ensure the quality and reusability of software frameworks. On the basis of the requirement specification for the business application domain, we present an analysis method and a basic role model for software frameworks. We also elicit the analysis pattern of the framework architecture, and design basic role classes and their structure.

  15. Two-photon bioimaging utilizing supercontinuum light generated by a high-peak-power picosecond semiconductor laser source.

    Science.gov (United States)

    Yokoyama, Hiroyuki; Tsubokawa, Hiroshi; Guo, Hengchang; Shikata, Jun-ichi; Sato, Ki-ichi; Takashima, Keijiro; Kashiwagi, Kaori; Saito, Naoaki; Taniguchi, Hirokazu; Ito, Hiromasa

    2007-01-01

    We developed a novel scheme for two-photon fluorescence bioimaging. We generated supercontinuum (SC) light at wavelengths of 600 to 1200 nm with 774-nm light pulses from a compact turn-key semiconductor laser picosecond light pulse source that we developed. The supercontinuum light was sliced at around 1030- and 920-nm wavelengths and was amplified to kW-peak-power level using laboratory-made low-nonlinear-effects optical fiber amplifiers. We successfully demonstrated two-photon fluorescence bioimaging of mouse brain neurons containing green fluorescent protein (GFP).

  16. Causal Moderation Analysis Using Propensity Score Methods

    Science.gov (United States)

    Dong, Nianbo

    2012-01-01

    This paper is based on previous studies in applying propensity score methods to study multiple treatment variables to examine the causal moderator effect. The propensity score methods will be demonstrated in a case study to examine the causal moderator effect, where the moderators are categorical and continuous variables. Moderation analysis is an…
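The core reweighting idea behind propensity score methods can be sketched as follows. This is a hedged toy in which the propensity model is assumed known rather than estimated, and no moderator is modeled; the paper's causal moderation analysis is more involved:

```python
import numpy as np

# Hedged sketch of inverse-probability weighting with propensity scores.
# The propensity model is assumed known (normally it would be estimated,
# e.g. by logistic regression), and no moderator variable is included.
rng = np.random.default_rng(7)
n = 200_000
x = rng.normal(size=n)                       # confounder
p = 1.0 / (1.0 + np.exp(-x))                 # true propensity P(T=1 | x)
t = rng.random(n) < p                        # confounded treatment assignment
y = 2.0 * t + 1.5 * x + rng.normal(size=n)   # true treatment effect = 2.0

naive = y[t].mean() - y[~t].mean()           # biased upward by the confounder

w = np.where(t, 1.0 / p, 1.0 / (1.0 - p))    # inverse-probability weights
ipw = (np.sum(w * y * t) / np.sum(w * t)
       - np.sum(w * y * ~t) / np.sum(w * ~t))
print(naive, ipw)                            # ipw recovers roughly 2.0
```

A moderation analysis would additionally interact the treatment with the moderator after this reweighting step.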

  17. Error Analysis of Band Matrix Method

    OpenAIRE

    Taniguchi, Takeo; Soga, Akira

    1984-01-01

Numerical error in the solution of the band matrix method based on the elimination method in single precision is investigated theoretically and experimentally, and the behaviour of the truncation error and the roundoff error is clarified. Some important suggestions for the useful application of the band solver are proposed using the results of the above error analysis.
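For a tridiagonal band, the elimination method analyzed above reduces to the Thomas algorithm. A minimal sketch on a well-conditioned test system (an assumption for illustration, not the paper's test problems) shows the roundoff-level error such an analysis studies:

```python
import numpy as np

# Thomas algorithm: Gaussian elimination specialized to a tridiagonal band,
# i.e. the kind of band solver whose error behaviour the paper analyzes.
def thomas(a, b, c, d):
    """Solve Ax = d for tridiagonal A (sub-diag a, diag b, super-diag c)."""
    n = len(b)
    cp, dp = np.empty(n), np.empty(n)
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):                    # forward elimination
        denom = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / denom
        dp[i] = (d[i] - a[i] * dp[i - 1]) / denom
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):           # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# 1D Poisson-type system with known exact solution of all ones.
n = 50
a = np.full(n, -1.0); a[0] = 0.0             # no sub-diagonal in row 0
c = np.full(n, -1.0); c[-1] = 0.0            # no super-diagonal in last row
b = np.full(n, 2.0)
x_true = np.ones(n)
d = b * x_true
d[1:] += a[1:] * x_true[:-1]
d[:-1] += c[:-1] * x_true[1:]

x = thomas(a, b, c, d)
err = np.max(np.abs(x - x_true))
print(err)                                   # roundoff-level, not truncation
```

Running the same solver in single precision (`np.float32`) inflates `err` by several orders of magnitude, which is exactly the effect the paper quantifies.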

  18. Relating Actor Analysis Methods to Policy Problems

    NARCIS (Netherlands)

    Van der Lei, T.E.

    2009-01-01

    For a policy analyst the policy problem is the starting point for the policy analysis process. During this process the policy analyst structures the policy problem and makes a choice for an appropriate set of methods or techniques to analyze the problem (Goeller 1984). The methods of the policy anal

  19. Empirical likelihood method in survival analysis

    CERN Document Server

    Zhou, Mai

    2015-01-01

    Add the Empirical Likelihood to Your Nonparametric ToolboxEmpirical Likelihood Method in Survival Analysis explains how to use the empirical likelihood method for right censored survival data. The author uses R for calculating empirical likelihood and includes many worked out examples with the associated R code. The datasets and code are available for download on his website and CRAN.The book focuses on all the standard survival analysis topics treated with empirical likelihood, including hazard functions, cumulative distribution functions, analysis of the Cox model, and computation of empiric

  20. Cyanines as new fluorescent probes for DNA detection and two-photon excited bioimaging.

    Science.gov (United States)

    Feng, Xin Jiang; Wu, Po Lam; Bolze, Frédéric; Leung, Heidi W C; Li, King Fai; Mak, Nai Ki; Kwong, Daniel W J; Nicoud, Jean-François; Cheah, Kok Wai; Wong, Man Shing

    2010-05-21

A series of cyanine fluorophores based on fused aromatics as electron donors was synthesized for DNA sensing and two-photon bioimaging. Among them, the carbazole-based biscyanine exhibits high sensitivity and efficiency as a fluorescent light-up probe for dsDNA, showing selective binding toward AT-rich regions. The synergetic effect of the bischromophoric skeleton gives a several-fold enhancement in the two-photon absorption cross-section, as well as a 25- to 100-fold enhancement in two-photon excited fluorescence upon dsDNA binding.

  1. Size effects in the quantum yield of CdTe quantum dots for optimum fluorescence bioimaging

    Energy Technology Data Exchange (ETDEWEB)

    Jacinto, C.; Rocha, U.S. [Universidade Federal de Alagoas (UFAL), Maceio, AL (Brazil). Inst. de Fisica. Grupo de Fotonica e Fluidos Complexos; Maestro, L.M.; Garcia-Sole, J.; Jaque, D. [Universidad Autonoma de Madrid (Spain). Dept. de Fisica de Materiales. Fluorescence Imaging Group

    2011-07-01

Full text: Semiconductor nano-crystals, usually referred to as quantum dots (QDs), are nowadays regarded as one of the building blocks of modern photonics. They constitute bright and photostable fluorescence sources whose emission and absorption properties can be tailored through their size. Recent advances in the controlled modification of their surface have made possible the development of water-soluble QDs without any deterioration in their fluorescence properties. This has made them excellent selective optical markers for fluorescence bio-imaging experiments. The suitability of colloidal QDs for bio-imaging is pushed forward by their large two-photon absorption cross-section, so that their visible luminescence (associated with the recombination of electron-hole pairs) can also be efficiently excited under infrared excitation (two-photon excitation). This, in turn, allows for large penetration depths in tissues, minimization of auto-fluorescence and superior spatial imaging resolution. In addition, recent works have demonstrated the ability of QDs to act as nano-thermometers based on the thermal sensitivity of their fluorescence bands. Based on all these outstanding properties, QDs have been successfully used to mark individual receptors in cell membranes, to measure intracellular temperatures, and to label living embryos at different stages. Most of the QD-based bio-images reported up to now were obtained using either CdSe or CdTe QDs, since both are commercially available with a high degree of quality. They show similar fluorescence properties and optical performance when used in bio-imaging. Nevertheless, CdTe QDs have very recently attracted much attention since the hyper-thermal sensitivity of their fluorescence bands was discovered. Based on this, it has been postulated that intracellular thermal sensing with resolutions as fine as 0.25 deg C can be achieved with CdTe QDs, three times better than

  2. Three-photon-excited luminescence from unsymmetrical cyanostilbene aggregates: morphology tuning and targeted bioimaging.

    Science.gov (United States)

    Mandal, Amal Kumar; Sreejith, Sivaramapanicker; He, Tingchao; Maji, Swarup Kumar; Wang, Xiao-Jun; Ong, Shi Li; Joseph, James; Sun, Handong; Zhao, Yanli

    2015-05-26

We report an experimental observation of aggregation-induced enhanced luminescence upon three-photon excitation in aggregates formed from a class of unsymmetrical cyanostilbene derivatives. Changing the side chains (-CH3, -C6H13, -C7H15O3, and folic acid) attached to the cyanostilbene core leads to the instantaneous formation of aggregates with sizes ranging from the micrometer to the nanometer scale under aqueous conditions. The crystal structure of a derivative with a methyl side chain reveals planarization of the unsymmetrical cyanostilbene core, which causes luminescence from the corresponding aggregates upon three-photon excitation. Furthermore, the folic acid-attached cyanostilbene forms well-dispersed spherical nanoaggregates that show a high three-photon cross-section of 6.0 × 10(-80) cm(6) s(2) photon(-2) and a high luminescence quantum yield in water. To demonstrate the targeted bioimaging capability of the nanoaggregates, three cell lines (the HEK293 healthy cell line, the MCF7 cancerous cell line, and the HeLa cancerous cell line) were employed on the basis of their different folate receptor expression levels. Two kinds of nanoaggregates, with and without the folic acid targeting ligand, were chosen for three-photon bioimaging studies. The viability of all three cell types incubated with a high concentration of nanoaggregates remained above 70% after 24 h. The nanoaggregates without the folic acid unit did not undergo endocytosis in either the healthy or the cancerous cell lines, and no obvious endocytosis of the folic acid-attached nanoaggregates was observed in the HEK293 and MCF7 cell lines, which have low expression of the folate receptor. Interestingly, significant endocytosis and internalization of the folic acid-attached nanoaggregates was observed in HeLa cells, which highly express the folate receptor, under three-photon excitation, indicating targeted bioimaging of folic acid attached nanoaggregates to the cancer cell line

  3. Scope-Based Method Cache Analysis

    DEFF Research Database (Denmark)

    Huber, Benedikt; Hepp, Stefan; Schoeberl, Martin

    2014-01-01

The quest for time-predictable systems has led to the exploration of new hardware architectures that simplify analysis and reasoning in the temporal domain, while still providing competitive performance. For the instruction memory, the method cache is a conceptually attractive solution, as it requests memory transfers at well-defined instructions only. In this article, we present a new cache analysis framework that generalizes and improves work on cache persistence analysis. The analysis demonstrates that a global view on the cache behavior permits the precise analysis of caches which are hard...

  4. Advanced analysis methods in particle physics

    Energy Technology Data Exchange (ETDEWEB)

    Bhat, Pushpalatha C.; /Fermilab

    2010-10-01

    Each generation of high energy physics experiments is grander in scale than the previous - more powerful, more complex and more demanding in terms of data handling and analysis. The spectacular performance of the Tevatron and the beginning of operations of the Large Hadron Collider, have placed us at the threshold of a new era in particle physics. The discovery of the Higgs boson or another agent of electroweak symmetry breaking and evidence of new physics may be just around the corner. The greatest challenge in these pursuits is to extract the extremely rare signals, if any, from huge backgrounds arising from known physics processes. The use of advanced analysis techniques is crucial in achieving this goal. In this review, I discuss the concepts of optimal analysis, some important advanced analysis methods and a few examples. The judicious use of these advanced methods should enable new discoveries and produce results with better precision, robustness and clarity.

  5. Advanced Analysis Methods in Particle Physics

    Energy Technology Data Exchange (ETDEWEB)

    Bhat, Pushpalatha C. [Fermilab

    1900-01-01

    Each generation of high energy physics experiments is grander in scale than the previous – more powerful, more complex and more demanding in terms of data handling and analysis. The spectacular performance of the Tevatron and the beginning of operations of the Large Hadron Collider, have placed us at the threshold of a new era in particle physics. The discovery of the Higgs boson or another agent of electroweak symmetry breaking and evidence of new physics may be just around the corner. The greatest challenge in these pursuits is to extract the extremely rare signals, if any, from huge backgrounds arising from known physics processes. The use of advanced analysis techniques is crucial in achieving this goal. In this review, I discuss the concepts of optimal analysis, some important advanced analysis methods and a few examples. The judicious use of these advanced methods should enable new discoveries and produce results with better precision, robustness and clarity.

  6. Application of Software Safety Analysis Methods

    Energy Technology Data Exchange (ETDEWEB)

    Park, G. Y.; Hur, S.; Cheon, S. W.; Kim, D. H.; Lee, D. Y.; Kwon, K. C. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Lee, S. J.; Koo, Y. H. [Doosan Heavy Industries and Construction Co., Daejeon (Korea, Republic of)

    2009-05-15

A fully digitalized reactor protection system, which is called the IDiPS-RPS, was developed through the KNICS project. The IDiPS-RPS has four redundant and separated channels. Each channel is mainly composed of a group of bistable processors which redundantly compare process variables with their corresponding setpoints and a group of coincidence processors that generate a final trip signal when a trip condition is satisfied. Each channel also contains a test processor called the ATIP and a display and command processor called the COM. All the functions were implemented in software. During the development of the safety software, various software safety analysis methods were applied, in parallel to the verification and validation (V and V) activities, along the software development life cycle. The software safety analysis methods employed were the software hazard and operability (Software HAZOP) study, the software fault tree analysis (Software FTA), and the software failure modes and effects analysis (Software FMEA).

  7. Analysis of mixed data methods & applications

    CERN Document Server

    de Leon, Alexander R

    2013-01-01

    A comprehensive source on mixed data analysis, Analysis of Mixed Data: Methods & Applications summarizes the fundamental developments in the field. Case studies are used extensively throughout the book to illustrate interesting applications from economics, medicine and health, marketing, and genetics. Carefully edited for smooth readability and seamless transitions between chaptersAll chapters follow a common structure, with an introduction and a concluding summary, and include illustrative examples from real-life case studies in developmental toxicolog

  8. Chromatographic methods for analysis of triazine herbicides.

    Science.gov (United States)

    Abbas, Hana Hassan; Elbashir, Abdalla A; Aboul-Enein, Hassan Y

    2015-01-01

Gas chromatography (GC) and high-performance liquid chromatography (HPLC), coupled to different detectors and combined with different sample extraction methods, are most widely used for the analysis of triazine herbicides in different environmental samples. Nowadays, many variations and modifications of extraction and sample preparation methods, such as solid-phase microextraction (SPME), hollow fiber-liquid phase microextraction (HF-LPME), stir bar sorptive extraction (SBSE), headspace solid-phase microextraction (HS-SPME), dispersive liquid-liquid microextraction (DLLME), dispersive liquid-liquid microextraction based on solidification of a floating organic droplet (DLLME-SFO), ultrasound-assisted emulsification microextraction (USAEME), and others, have been introduced and developed to obtain sensitive and accurate methods for the analysis of these hazardous compounds. In this review, analytical properties such as linearity, sensitivity, repeatability, and accuracy are discussed for each developed method; excellent results have been obtained for most of the developed methods combining GC and HPLC techniques for the analysis of triazine herbicides. This review gives an overview of recent publications on the application of GC and HPLC to the analysis of triazine herbicide residues in various samples.
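Figures of merit such as linearity and sensitivity for such methods are typically derived from a calibration curve. A hedged sketch with synthetic data, using the common 3.3σ/slope convention for the limit of detection (none of the numbers come from the review itself):

```python
import numpy as np

# Hedged sketch of calibration-curve figures of merit for a chromatographic
# method. Data are synthetic; the 3.3*sigma/slope LOD convention is assumed.
conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])      # standards, ug/L
area = np.array([1.1, 2.0, 4.2, 10.1, 19.8, 40.3])     # peak areas

slope, intercept = np.polyfit(conc, area, 1)           # calibration fit
fit = slope * conc + intercept
ss_res = np.sum((area - fit) ** 2)
r2 = 1.0 - ss_res / np.sum((area - area.mean()) ** 2)  # linearity
sigma = np.sqrt(ss_res / (len(conc) - 2))              # residual std dev
lod = 3.3 * sigma / slope                              # limit of detection
loq = 10.0 * sigma / slope                             # limit of quantification
print(round(r2, 5), round(lod, 3), round(loq, 3))
```

Repeatability and accuracy would be assessed separately from replicate injections and spiked-recovery experiments.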

  9. An introduction to numerical methods and analysis

    CERN Document Server

    Epperson, J F

    2007-01-01

Praise for the First Edition "". . . outstandingly appealing with regard to its style, contents, considerations of requirements of practice, choice of examples, and exercises.""-Zentralblatt MATH "". . . carefully structured with many detailed worked examples . . .""-The Mathematical Gazette "". . . an up-to-date and user-friendly account . . .""-Mathematika An Introduction to Numerical Methods and Analysis addresses the mathematics underlying approximation and scientific computing and successfully explains where approximation methods come from, why they sometimes work (or d

  10. A special issue on reviews in biomedical applications of nanomaterials, tissue engineering, stem cells, bioimaging, and toxicity.

    Science.gov (United States)

    Nalwa, Hari Singh

    2014-10-01

This second special issue of the Journal of Biomedical Nanotechnology in a series contains another 30 state-of-the-art reviews focused on the biomedical applications of nanomaterials, biosensors, bone tissue engineering, MRI and bioimaging, single-cell detection, stem cells, endothelial progenitor cells, toxicity and biosafety of nanodrugs, and new nanoparticle-based therapeutic approaches for cancer and for hepatic and cardiovascular disease.

  11. Cadmium-free quantum dots as time-gated bioimaging probes in highly-autofluorescent human breast cancer cells.

    Science.gov (United States)

    Mandal, Gopa; Darragh, Molly; Wang, Y Andrew; Heyes, Colin D

    2013-01-21

We report cadmium-free, biocompatible (Zn)CuInS(2) quantum dots with long fluorescence lifetimes as superior bioimaging probes, using time-gated detection to suppress cell autofluorescence and improve the signal-to-background ratio by an order of magnitude. These results will be important for developing non-toxic fluorescence imaging probes for ultrasensitive biomedical diagnostics.
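The benefit of time-gated detection with long-lifetime probes can be sketched with a toy two-exponential decay model. The lifetimes and amplitudes below are illustrative assumptions, not values from the paper:

```python
import numpy as np

# Toy model of time-gated detection: a dim, long-lifetime probe competing
# with bright, short-lived cell autofluorescence. All numbers are
# illustrative assumptions.
t = np.linspace(0.0, 1000.0, 100_001)        # time after pulse, ns
probe = 1.0 * np.exp(-t / 200.0)             # long-lived probe signal
autofl = 50.0 * np.exp(-t / 3.0)             # fast autofluorescence

def sb_ratio(gate_ns):
    """Signal-to-background ratio counting only photons after the gate."""
    keep = t >= gate_ns
    return probe[keep].sum() / autofl[keep].sum()

ungated = sb_ratio(0.0)
gated = sb_ratio(30.0)                       # open detector 30 ns after pulse
print(ungated, gated)                        # gating boosts S/B enormously
```

Because the autofluorescence decays within a few nanoseconds, nearly all background photons arrive before the gate opens while most probe photons arrive after it.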

  12. Methods for genetic linkage analysis using trisomies

    Energy Technology Data Exchange (ETDEWEB)

    Feingold, E. [Emory Univ. School of Public Health, Atlanta, GA (United States); Lamb, N.E.; Sherman, S.L. [Emory Univ., Atlanta, GA (United States)

    1995-02-01

Certain genetic disorders are rare in the general population, but more common in individuals with specific trisomies. Examples of this include leukemia and duodenal atresia in trisomy 21. This paper presents a linkage analysis method for using trisomic individuals to map genes for such traits. It is based on a very general gene-specific dosage model that posits that the trait is caused by specific effects of different alleles at one or a few loci and that duplicate copies of "susceptibility" alleles inherited from the nondisjoining parent give increased likelihood of having the trait. Our mapping method is similar to identity-by-descent-based mapping methods using affected relative pairs and also to methods for mapping recessive traits using inbred individuals by looking for markers with greater than expected homozygosity by descent. In the trisomy case, one would take trisomic individuals and look for markers with greater than expected homozygosity in the chromosomes inherited from the nondisjoining parent. We present statistical methods for performing such a linkage analysis, including a test for linkage to a marker, a method for estimating the distance from the marker to the trait gene, a confidence interval for that distance, and methods for computing power and sample sizes. We also resolve some practical issues involved in implementing the methods, including how to use partially informative markers and how to test candidate genes. 20 refs., 5 figs., 1 tab.
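The homozygosity-excess idea above can be sketched in its simplest form as an exact binomial tail test against an assumed null rate. All counts and the null probability here are illustrative assumptions; the real null rate depends on the meiotic stage of nondisjunction:

```python
from math import comb

# Hedged sketch of a homozygosity-excess test: under no linkage, the marker
# alleles duplicated from the nondisjoining parent are homozygous by descent
# with some null probability p0; a significant excess suggests linkage.
def binom_sf(k, n, p):
    """Exact upper tail P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

n_trisomic = 40       # informative trisomic individuals (assumption)
homozygous = 28       # homozygous by descent at the marker (assumption)
p0 = 0.5              # assumed null homozygosity rate

pval = binom_sf(homozygous, n_trisomic, p0)
print(round(pval, 4))  # small p-value -> marker worth following up
```

The paper's full method additionally estimates marker-to-gene distance and power, which this toy test does not attempt.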

  13. Numerical methods in software and analysis

    CERN Document Server

    Rice, John R

    1992-01-01

    Numerical Methods, Software, and Analysis, Second Edition introduces science and engineering students to the methods, tools, and ideas of numerical computation. Introductory courses in numerical methods face a fundamental problem-there is too little time to learn too much. This text solves that problem by using high-quality mathematical software. In fact, the objective of the text is to present scientific problem solving using standard mathematical software. This book discusses numerous programs and software packages focusing on the IMSL library (including the PROTRAN system) and ACM Algorithm

  14. Single-cell analysis - Methods and protocols

    OpenAIRE

    Carlo Alberto Redi

    2013-01-01

This is certainly a timely volume in the Methods in Molecular Biology series: we have already entered the synthetic biology era, and thus we need to be aware of the new methodological advances able to fulfill the new needs of biologists, biotechnologists and nano-biotechnologists. Notably, among these, the possibility of performing single-cell analysis allows researchers to capture single-cell responses....

  15. Modern methods of wine quality analysis

    Directory of Open Access Journals (Sweden)

    Галина Зуфарівна Гайда

    2015-06-01

Full Text Available In this paper, physico-chemical and enzymatic methods for the quantitative analysis of the basic wine components are reviewed. The results of the authors' own experiments on the development of enzyme- and cell-based amperometric sensors for ethanol, lactate, glucose and arginine are presented.

  16. Systems and methods for sample analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cooks, Robert Graham; Li, Guangtao; Li, Xin; Ouyang, Zheng

    2015-10-20

    The invention generally relates to systems and methods for sample analysis. In certain embodiments, the invention provides a system for analyzing a sample that includes a probe including a material connected to a high voltage source, a device for generating a heated gas, and a mass analyzer.

  17. Spectroscopic Chemical Analysis Methods and Apparatus

    Science.gov (United States)

    Hug, William F. (Inventor); Reid, Ray D. (Inventor); Bhartia, Rohit (Inventor); Lane, Arthur L. (Inventor)

    2017-01-01

    Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted along with photoluminescence spectroscopy (i.e. fluorescence and/or phosphorescence spectroscopy) to provide high levels of sensitivity and specificity in the same instrument.

  18. Multiple predictor smoothing methods for sensitivity analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Helton, Jon Craig; Storlie, Curtis B.

    2006-08-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (1) locally weighted regression (LOESS), (2) additive models, (3) projection pursuit regression, and (4) recursive partitioning regression. The indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present.
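
The first of the listed techniques, locally weighted regression, can be sketched in miniature: a tricube-weighted local linear smoother is fitted for each input separately, and variance explained (R²) ranks input importance. The smoother and the synthetic "model" data below are illustrative stand-ins, not the paper's implementation:

```python
import numpy as np

def loess_r2(x, y, frac=0.2):
    """Variance explained (R^2) by a tricube-weighted local linear
    smoother of y on a single input x -- a minimal LOESS stand-in
    for ranking input importance in a sensitivity analysis."""
    n = len(x)
    span = max(2, int(frac * n))
    yhat = np.empty(n)
    for i in range(n):
        d = np.abs(x - x[i])
        idx = np.argsort(d)[:span]                   # nearest neighbours
        w = (1.0 - (d[idx] / d[idx].max())**3)**3    # tricube weights
        A = np.vstack([np.ones(span), x[idx]]).T
        beta, *_ = np.linalg.lstsq(A * w[:, None], y[idx] * w, rcond=None)
        yhat[i] = beta[0] + beta[1] * x[i]
    return 1.0 - np.sum((y - yhat)**2) / np.sum((y - np.mean(y))**2)

# Synthetic "model": y depends nonlinearly on x1 and not at all on x2
rng = np.random.default_rng(0)
x1 = rng.uniform(-1, 1, 200)
x2 = rng.uniform(-1, 1, 200)
y = np.sin(3 * x1) + 0.1 * rng.normal(size=200)
```

Here `loess_r2(x1, y)` is large while `loess_r2(x2, y)` is near zero, reproducing the kind of nonlinear sensitivity ranking a linear-regression-based procedure would miss.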

  19. Practical Fourier analysis for multigrid methods

    CERN Document Server

    Wienands, Roman

    2004-01-01

    Before applying multigrid methods to a project, mathematicians, scientists, and engineers need to answer questions related to the quality of convergence, whether a development will pay out, whether multigrid will work for a particular application, and what the numerical properties are. Practical Fourier Analysis for Multigrid Methods uses a detailed and systematic description of local Fourier k-grid (k=1,2,3) analysis for general systems of partial differential equations to provide a framework that answers these questions.This volume contains software that confirms written statements about convergence and efficiency of algorithms and is easily adapted to new applications. Providing theoretical background and the linkage between theory and practice, the text and software quickly combine learning by reading and learning by doing. The book enables understanding of basic principles of multigrid and local Fourier analysis, and also describes the theory important to those who need to delve deeper into the detai...

  20. Digital Forensics Analysis of Spectral Estimation Methods

    CERN Document Server

    Mataracioglu, Tolga

    2011-01-01

    Steganography is the art and science of writing hidden messages in such a way that no one apart from the intended recipient knows of the existence of the message. In today's world, it is widely used to secure information. In this paper, the traditional spectral estimation methods are introduced. The performance of each method is examined by comparing all of the spectral estimation methods, and from those performance analyses a brief account of the pros and cons of each method is given. We also give a steganography demo by hiding information in a sound signal and managing to pull the information (i.e., the true frequency of the information signal) out of the sound by means of the spectral estimation methods.
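
The demo described at the end of the abstract can be reproduced in miniature with the classical periodogram. The sample rate, tone frequencies, and amplitudes below are made up for illustration:

```python
import numpy as np

fs = 8000                                        # sample rate, Hz (assumed)
t = np.arange(0, 1.0, 1.0 / fs)
carrier = 0.5 * np.sin(2 * np.pi * 440 * t)      # audible cover tone
hidden = 0.01 * np.sin(2 * np.pi * 1234 * t)     # low-amplitude hidden tone
noise = 0.005 * np.random.default_rng(1).normal(size=t.size)
stego = carrier + hidden + noise

# Classical periodogram: squared magnitude of the FFT, scaled by length
psd = np.abs(np.fft.rfft(stego))**2 / t.size
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)

# Mask out the known cover tone; the largest remaining peak reveals
# the true frequency of the hidden information signal
mask = np.abs(freqs - 440) > 50
recovered = freqs[mask][np.argmax(psd[mask])]
```

Even at 1/50th of the carrier amplitude, the hidden tone stands far above the noise floor of the periodogram, so `recovered` matches the embedded 1234 Hz.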

  1. Simple gas chromatographic method for furfural analysis.

    Science.gov (United States)

    Gaspar, Elvira M S M; Lopes, João F

    2009-04-03

    A new, simple gas chromatographic method was developed for the direct analysis of 5-hydroxymethylfurfural (5-HMF), 2-furfural (2-F) and 5-methylfurfural (5-MF) in liquid and water-soluble foods, using direct-immersion SPME coupled to GC-FID and/or GC-TOF-MS. The fiber (DVB/CAR/PDMS) conditions were optimized: pH effect, temperature, adsorption and desorption times. The method is simple and accurate (detection limits by GC-TOF-MS: 0.3, 1.2 and 0.9 ng mL(-1) for 2-F, 5-MF and 5-HMF, respectively). It was applied to different commercial food matrices: honey; white, demerara, brown and yellow table sugars; and white and red balsamic vinegars. This one-step, sensitive and direct method for the analysis of furfurals will contribute to characterising and quantifying their presence in the human diet.

  2. Heteroscedastic regression analysis method for mixed data

    Institute of Scientific and Technical Information of China (English)

    FU Hui-min; YUE Xiao-rui

    2011-01-01

    The heteroscedastic regression model was established and the heteroscedastic regression analysis method was presented for mixed data composed of complete data, type-I censored data and type-II censored data from the location-scale distribution. The best unbiased estimations of the regression coefficients, as well as the confidence limits of the location parameter and scale parameter, were given. Furthermore, the point estimations and confidence limits of percentiles were obtained. Thus, the traditional multiple regression analysis method, which is only suitable for complete data from the normal distribution, can be extended to the cases of heteroscedastic mixed data and the location-scale distribution. The presented method therefore has a broad range of promising applications.
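
Setting aside the censoring machinery (the paper's main contribution), the heteroscedastic core amounts to weighted least squares with weights inversely proportional to the local error variance. A minimal sketch on synthetic complete data whose error scale grows with x:

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(0.0, 10.0, 200)
sigma = 0.2 + 0.3 * x                        # error scale grows with x
y = 1.5 + 2.0 * x + sigma * rng.normal(size=x.size)

# Weighted least squares with weights 1/sigma^2 (complete observations only;
# the paper's method additionally handles type-I and type-II censored data)
W = 1.0 / sigma**2
A = np.vstack([np.ones_like(x), x]).T
beta = np.linalg.solve(A.T @ (A * W[:, None]), A.T @ (W * y))   # [intercept, slope]
```

The weights down-weight the noisy high-x observations, so `beta` recovers the true intercept 1.5 and slope 2.0 closely despite the strongly nonconstant variance.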

  3. Gold-Speckled Multimodal Nanoparticles for Noninvasive Bioimaging

    Science.gov (United States)

    2008-01-01

    In this report the synthesis, characterization, and functional evaluation of a multimodal nanoparticulate contrast agent for noninvasive imaging through both magnetic resonance imaging (MRI) and photoacoustic tomography (PAT) is presented. The nanoparticles described herein enable high-resolution and highly sensitive three-dimensional diagnostic imaging through the synergistic coupling of MRI and PAT capabilities. Gadolinium (Gd)-doped gold-speckled silica (GSS) nanoparticles, ranging from 50 to 200 nm, have been prepared in a simple one-pot synthesis using nonionic microemulsions. The photoacoustic signal is generated from nonuniform, discontinuous gold nanodomains speckled across the silica surface, whereas the MR contrast is provided through Gd incorporated in the silica matrix. The presence of a discontinuous speckled surface, as opposed to a continuous gold shell, allows sufficient bulk water exchange with the Gd ions to generate a strong MR contrast. The dual imaging capabilities of the particles have been demonstrated through in silico and in vitro methods. The described particles also have the capacity for therapeutic applications including the thermal ablation of tumors through the absorption of irradiated light. PMID:19466201

  4. Multifunctional chitin nanogels for simultaneous drug delivery, bioimaging, and biosensing.

    Science.gov (United States)

    Rejinold N, Sanoj; Chennazhi, Krishna Prasad; Tamura, Hiroshi; Nair, Shantikumar V; Rangasamy, Jayakumar

    2011-09-01

    In this work, we developed biodegradable chitin nanogels (CNGs) by a controlled regeneration method. For multifunctionalization, we conjugated the CNGs with MPA-capped CdTe QDs (QD-CNGs) for in vitro cellular localization studies. In addition, bovine serum albumin (BSA) was loaded onto the QD-CNGs (BSA-QD-CNGs). The CNGs, QD-CNGs, and BSA-QD-CNGs were characterized by SEM, AFM, and cyclic voltammetry. The cytocompatibility assay showed that the nanogels are nontoxic to L929, NIH-3T3, KB, MCF-7, PC3, and VERO cells. Cell-uptake studies of the QD-CNGs showed retention of these nanogels inside the cells (L929, PC3, and VERO). In addition, the protein-loading efficiency of the nanogels was analyzed. Our preliminary studies reveal that these multifunctionalized nanogels could be useful for drug delivery with simultaneous imaging and biosensing.

  5. Progress of MEMS Scanning Micromirrors for Optical Bio-Imaging

    Directory of Open Access Journals (Sweden)

    Lih Y. Lin

    2015-11-01

    Full Text Available Microelectromechanical systems (MEMS have an unmatched ability to incorporate numerous functionalities into ultra-compact devices, and due to their versatility and miniaturization, MEMS have become an important cornerstone in biomedical and endoscopic imaging research. To incorporate MEMS into such applications, it is critical to understand the underlying architectures involving choices in actuation mechanism, including the more common electrothermal, electrostatic, electromagnetic, and piezoelectric approaches, reviewed in this paper. Each has benefits and tradeoffs and is better suited for particular applications or imaging schemes due to achievable scan ranges, power requirements, speed, and size. Many of these characteristics are fabrication-process dependent, and this paper discusses various fabrication flows developed to integrate additional optical functionality beyond simple lateral scanning, enabling dynamic control of the focus or mirror surface. Out of this MEMS flexibility arise some challenges when obtaining high-resolution images: due to scanning non-linearities, calibration of MEMS scanners may become critical, and inherent image artifacts or distortions during scanning can degrade image quality. Several reviewed methods and algorithms have been proposed to address these complications from MEMS scanning. Given their impact and promise, great effort and progress have been made toward integrating MEMS and biomedical imaging.

  6. Methods for genetic linkage analysis using trisomies

    Energy Technology Data Exchange (ETDEWEB)

    Feingold, E.; Lamb, N.E.; Sherman, S.L. [Emory Univ., Atlanta, GA (United States)]

    1994-09-01

    Certain genetic disorders (e.g. congenital cataracts, duodenal atresia) are rare in the general population, but more common in people with Down's syndrome. We present a method for using individuals with trisomy 21 to map genes for such traits. Our methods are analogous to methods for mapping autosomal dominant traits using affected relative pairs by looking for markers with greater than expected identity-by-descent. In the trisomy case, one would take trisomic individuals and look for markers with greater than expected reduction to homozygosity in the chromosomes inherited from the non-disjoining parent. We present statistical methods for performing such a linkage analysis, including a test for linkage to a marker, a method for estimating the distance from the marker to the gene, a confidence interval for that distance, and methods for computing power and sample sizes. The methods are described in the context of a gene-dosage model for the etiology of the disorder, but can be extended to other models. We also resolve some practical issues involved in implementing the methods, including how to use partially informative markers, how to test candidate genes, and how to handle the effect of reduced recombination associated with maternal meiosis I non-disjunction.

  7. Method of thermal derivative gradient analysis (TDGA)

    Directory of Open Access Journals (Sweden)

    M. Cholewa

    2009-07-01

    Full Text Available In this work a concept of thermal analysis is presented that uses, for the description of crystallization kinetics, temperature derivatives with respect to time and direction. The method of thermal derivative gradient analysis (TDGA) is intended for the investigation of alloys and metals as well as cast composites in the range of solidification. The construction and operating characteristics of the test stand were presented, including processing modules and probes together with thermocouple locations. The authors present examples of results interpretation for AlSi11 alloy castings with diversified wall thickness and at different pouring temperatures.

  8. A DECOMPOSITION METHOD OF STRUCTURAL DECOMPOSITION ANALYSIS

    Institute of Scientific and Technical Information of China (English)

    LI Jinghua

    2005-01-01

    Over the past two decades, structural decomposition analysis (SDA) has developed into a major analytical tool in the field of input-output (IO) techniques, but the method has been found to suffer from one or more of the following problems. The decomposition forms, which are used to measure the contribution of a specific determinant, are not unique, due to the existence of a multitude of equivalent forms; irrational, due to the weights of different determinants not matching; or inexact, due to the existence of large interaction terms. In this paper, a decomposition method is derived to overcome these deficiencies, and we prove that the result of this approach is equal to the Shapley value in cooperative games, from which some properties of the method are obtained. Beyond that, the two approaches that have been used predominantly in the literature are proved to be approximate solutions of the method.
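
For two determinants, a Shapley-value decomposition of a change in x = L·f reduces to the average of the two equivalent polar forms, which makes the decomposition unique and exact (no interaction term left over). A minimal numeric sketch with made-up two-sector input-output data:

```python
import numpy as np

def shapley_two_factor(L0, f0, L1, f1):
    """Shapley decomposition of the change in x = L @ f into contributions
    of L (technology) and f (final demand); for two determinants this is
    the average of the two equivalent polar decomposition forms."""
    contrib_L = ((L1 - L0) @ f0 + (L1 - L0) @ f1) / 2.0
    contrib_f = (L0 @ (f1 - f0) + L1 @ (f1 - f0)) / 2.0
    return contrib_L, contrib_f

# Made-up 2-sector Leontief inverses and final-demand vectors
L0 = np.array([[1.2, 0.3], [0.1, 1.1]])
L1 = np.array([[1.3, 0.4], [0.2, 1.2]])
f0 = np.array([10.0, 5.0])
f1 = np.array([12.0, 6.0])

cL, cf = shapley_two_factor(L0, f0, L1, f1)
total = L1 @ f1 - L0 @ f0    # the two contributions sum exactly to this
```

The exactness property (contributions summing to the total change, with no residual interaction term) is precisely the deficiency of ad hoc decomposition forms that the Shapley construction removes.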

  9. Model-based methods for linkage analysis.

    Science.gov (United States)

    Rice, John P; Saccone, Nancy L; Corbett, Jonathan

    2008-01-01

    The logarithm of the odds (LOD) score method originated in a seminal article by Newton Morton in 1955. The method is broadly concerned with issues of power and the posterior probability of linkage, ensuring that a reported linkage has a high probability of being a true linkage. In addition, the method is sequential, so that pedigrees or LOD curves may be combined from published reports to pool data for analysis. This approach has been remarkably successful for 50 years in identifying disease genes for Mendelian disorders. After discussing these issues, we consider the situation for complex disorders, where the maximum LOD score statistic shares some of the advantages of the traditional LOD score approach but is limited by unknown power and the lack of sharing of the primary data needed to optimally combine analytic results. We may still learn from the LOD score method as we explore new methods in molecular biology and genetic analysis to utilize the complete human DNA sequence and the cataloging of all human genes.
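
A minimal two-point LOD computation for phase-known data illustrates the statistic; the meiosis counts below are hypothetical:

```python
from math import log10

def lod_score(n_meioses, n_recomb, theta):
    """Two-point LOD score for phase-known data: the log10 likelihood ratio
    of recombination fraction theta against free recombination (theta = 0.5)."""
    k, n = n_recomb, n_meioses
    return k * log10(theta) + (n - k) * log10(1.0 - theta) - n * log10(0.5)

# Hypothetical example: 2 recombinants among 20 informative meioses;
# the maximum-likelihood estimate of theta is 2/20 = 0.1
max_lod = lod_score(20, 2, 0.1)   # exceeds the conventional threshold of 3
```

A maximum LOD above 3 is the classical evidence threshold for declaring linkage, and LOD curves from independent pedigrees can simply be added, which is the sequential pooling property the abstract mentions.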

  10. Spatial Analysis Methods of Road Traffic Collisions

    DEFF Research Database (Denmark)

    Loo, Becky P. Y.; Anderson, Tessa Kate

    Spatial Analysis Methods of Road Traffic Collisions centers on the geographical nature of road crashes, and uses spatial methods to provide a greater understanding of the patterns and processes that cause them. Written by internationally known experts in the field of transport geography, the book outlines the key issues in identifying hazardous road locations (HRLs), considers current approaches used for reducing and preventing road traffic collisions, and outlines a strategy for improved road safety. The book covers spatial accuracy, validation, and other statistical issues, as well as link-attribute and event-based approaches, cluster identification, and risk exposure.

  11. ANALYSIS METHOD OF AUTOMATIC PLANETARY TRANSMISSION KINEMATICS

    Directory of Open Access Journals (Sweden)

    Józef DREWNIAK

    2014-06-01

    Full Text Available In the present paper, a planetary automatic transmission is modeled by means of contour graphs. The goals of modeling are versatile: ratio calculation via algorithmic equation generation, and analysis of velocities and accelerations. Exemplary gear runs are analyzed, and several drives/gears are consecutively taken into account, discussing functional schemes, assigned contour graphs, and the generated systems of equations and their solutions. The advantages of the method are its algorithmic approach and its generality, in which particular drives are cases of the generally created model. Moreover, the method allows for further analysis and synthesis tasks, e.g. checking the isomorphism of design solutions.

  12. Text analysis devices, articles of manufacture, and text analysis methods

    Science.gov (United States)

    Turner, Alan E; Hetzler, Elizabeth G; Nakamura, Grant C

    2013-05-28

    Text analysis devices, articles of manufacture, and text analysis methods are described according to some aspects. In one aspect, a text analysis device includes processing circuitry configured to analyze initial text to generate a measurement basis usable in analysis of subsequent text, wherein the measurement basis comprises a plurality of measurement features from the initial text, a plurality of dimension anchors from the initial text and a plurality of associations of the measurement features with the dimension anchors, and wherein the processing circuitry is configured to access a viewpoint indicative of a perspective of interest of a user with respect to the analysis of the subsequent text, and wherein the processing circuitry is configured to use the viewpoint to generate the measurement basis.

  13. Numerical methods and analysis of multiscale problems

    CERN Document Server

    Madureira, Alexandre L

    2017-01-01

    This book is about numerical modeling of multiscale problems, and introduces several asymptotic analysis and numerical techniques which are necessary for a proper approximation of equations that depend on different physical scales. Aimed at advanced undergraduate and graduate students in mathematics, engineering and physics, or researchers seeking a no-nonsense approach, it discusses examples in their simplest possible settings, removing mathematical hurdles that might hinder a clear understanding of the methods. The problems considered are given by singularly perturbed reaction-advection-diffusion equations in one- and two-dimensional domains, partial differential equations in domains with rough boundaries, and equations with oscillatory coefficients. This work shows how asymptotic analysis can be used to develop and analyze models and numerical methods that are robust and work well for a wide range of parameters.

  14. Cloud Based Development Issues: A Methodical Analysis

    Directory of Open Access Journals (Sweden)

    Sukhpal Singh

    2012-11-01

    Full Text Available Cloud based development is a challenging task for various software engineering projects, especially for those which demand extraordinary quality, reusability and security along with general architecture. In this paper we present a report on a methodical analysis of cloud based development problems published in major computer science and software engineering journals and conferences by various researchers. Research papers were collected from different scholarly databases using search engines within a particular period of time. A total of 89 research papers were analyzed in this methodical study and categorized into four classes according to the problems addressed by them. The majority of the research papers focused on quality (24 papers) associated with cloud based development, and 16 papers focused on analysis and design. By considering the areas focused on by existing authors and the gaps between them, untouched areas of cloud based development can be discovered for future research.

  15. Quantitative gold nanoparticle analysis methods: A review.

    Science.gov (United States)

    Yu, Lei; Andriola, Angelo

    2010-08-15

    Research and development in the area of gold nanoparticle (AuNP) preparation, characterization, and applications have burgeoned in recent years. Many of the techniques and protocols are very mature, but two major concerns accompany the mass production and consumption of AuNP-based products. First, how many AuNPs exist in a dispersion? Second, where are the AuNPs after digestion by the environment, and how many are there? To answer these two questions, reliable and reproducible methods are needed to analyze the existence and the population of AuNPs in samples. This review summarizes the most recent chemical and particle quantitative analysis methods that have been used to characterize the concentration (in moles of gold per liter) or population (in particles per mL) of AuNPs. The methods summarized in this review include mass spectrometry, electroanalytical methods, spectroscopic methods, and particle counting methods. These methods may count the number of AuNPs directly or analyze the total concentration of elemental gold in an AuNP dispersion.
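
The bridge between the two viewpoints (total elemental gold versus particle count) is a sphere-volume conversion. A minimal sketch, assuming monodisperse solid gold spheres and a hypothetical sample:

```python
from math import pi

def aunp_number_concentration(c_gold_g_per_L, diameter_nm, rho=19.3):
    """Particles per mL from a total-gold mass concentration (e.g. from
    digestion followed by elemental analysis), assuming monodisperse
    solid gold spheres; rho is the density of gold in g/cm^3."""
    r_cm = diameter_nm * 1e-7 / 2.0                        # nm -> cm
    mass_per_particle = rho * (4.0 / 3.0) * pi * r_cm**3   # g per particle
    return (c_gold_g_per_L / 1000.0) / mass_per_particle   # particles per mL

# Hypothetical sample: 50 ug/mL total gold (0.05 g/L) as 20 nm particles
n = aunp_number_concentration(0.05, 20)   # ~6e11 particles/mL
```

The strong cubic dependence on diameter is why the same gold mass yields eight times fewer particles when the diameter doubles, and why size characterization is inseparable from quantification.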

  16. Single-cell analysis - Methods and protocols

    Directory of Open Access Journals (Sweden)

    Carlo Alberto Redi

    2013-06-01

    Full Text Available This is certainly a timely volume in the Methods in molecular biology series: we have already entered the synthetic biology era and thus need to be aware of the new methodological advances able to fulfill the needs of biologists, biotechnologists and nano-biotechnologists. Notably, among these, the possibility to perform single-cell analysis allows researchers to capture single-cell responses....

  17. Competitive Performance of Carbon “Quantum” Dots in Optical Bioimaging

    Directory of Open Access Journals (Sweden)

    Li Cao, Sheng-Tao Yang, Xin Wang, Pengju G. Luo, Jia-Hui Liu, Sushant Sahu, Yamin Liu, Ya-Ping Sun

    2012-01-01

    Full Text Available Carbon-based “quantum” dots, or carbon dots, are surface-functionalized small carbon nanoparticles. For bright fluorescence emissions, the carbon nanoparticles may be surface-doped with an inorganic salt, followed by the same organic functionalization. In this study, carbon dots without and with ZnS doping were prepared, followed by gel-column fractionation to harvest dots of 40% and 60% fluorescence quantum yield, respectively. These highly fluorescent carbon dots were evaluated for optical imaging in mice, from which bright fluorescence images were obtained. Of particular interest was the observed in vivo performance of the carbon dots, competitive with that of the well-established CdSe/ZnS QDs. The results suggest that carbon dots may be further developed into a new class of high-performance yet nontoxic contrast agents for optical bioimaging.

  18. Graphene quantum dots: emergent nanolights for bioimaging, sensors, catalysis and photovoltaic devices.

    Science.gov (United States)

    Shen, Jianhua; Zhu, Yihua; Yang, Xiaoling; Li, Chunzhong

    2012-04-18

    Similar to their popular older cousins, luminescent carbon dots (C-dots), graphene quantum dots or graphene quantum discs (GQDs) have generated enormous excitement because of their superior chemical inertness, biocompatibility and low toxicity. Moreover, GQDs, consisting of a single atomic layer of nano-sized graphite, have the excellent properties of graphene, such as high surface area, large diameter and good surface grafting via π-π conjugation and surface groups. Because of the structure of graphene, GQDs also have some other special physical properties. Therefore, studies on GQDs in chemistry, physics, materials science, biology and interdisciplinary science have been in full flow in the past decade. In this Feature Article, recent developments in the preparation of GQDs are discussed, focusing on the two main approaches (top-down and bottom-up). Emphasis is given to their future and potential development in bioimaging, electrochemical biosensors and catalysis, and specifically in photovoltaic devices that can address increasingly serious energy problems.

  19. Space Debris Reentry Analysis Methods and Tools

    Institute of Scientific and Technical Information of China (English)

    WU Ziniu; HU Ruifeng; QU Xi; WANG Xiang; WU Zhe

    2011-01-01

    The reentry of an uncontrolled spacecraft may break it into many pieces of debris at altitudes in the range of 75-85 km. The surviving fragments can pose great hazard and risk to people and property on the ground. In recent years, methods and tools for predicting and analyzing debris reentry and assessing ground risk have been studied and developed at the National Aeronautics and Space Administration (NASA), the European Space Agency (ESA) and other organizations, including the group of the present authors. This paper briefly reviews current progress on debris reentry. We outline the Monte Carlo method for uncertainty analysis, breakup prediction, and the parameters affecting the survivability of debris. The existing analysis tools can be classified into two categories, i.e. object-oriented and spacecraft-oriented methods, the latter being more accurate than the former. Past object-oriented tools include objects of only simple shapes. For more realistic simulation, we present here an object-oriented tool, the debris reentry and ablation prediction system (DRAPS), developed by the present authors, which extends the object shapes to 15 types and includes 51 predefined motions and the relevant aerodynamic and aerothermal models. The aerodynamic and aerothermal models in DRAPS are validated using the direct simulation Monte Carlo (DSMC) method.
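
The Monte Carlo uncertainty analysis mentioned above can be caricatured as propagating uncertain inputs through a casualty-expectation product. The fragment counts, casualty areas, and population densities below are illustrative distributions, not real spacecraft or population data:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Illustrative uncertainty distributions (NOT real mission data):
n_fragments = rng.integers(5, 30, N)                   # surviving fragment count
casualty_area = rng.uniform(0.5, 8.0, N)               # casualty area per fragment, m^2
pop_density = rng.lognormal(np.log(20), 1.0, N) / 1e6  # people per m^2 under the track

# Casualty expectation for each sampled scenario
Ec = n_fragments * casualty_area * pop_density

mean_Ec = Ec.mean()               # Monte Carlo estimate of expected casualties
p_exceed = (Ec > 1e-4).mean()     # fraction of samples above the 1e-4 threshold
```

Real tools replace the toy distributions with trajectory propagation, breakup and ablation models per fragment, but the structure (sample inputs, propagate, summarize the output distribution) is the same.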

  20. Analysis and estimation of risk management methods

    Directory of Open Access Journals (Sweden)

    Kankhva Vadim Sergeevich

    2016-05-01

    Full Text Available At the present time risk management is an integral part of state policy in all countries with a developed market economy. Companies dealing with consulting services and the implementation of risk management systems have carved out a niche. Unfortunately, conscious preventive risk management in Russia is still far from being a standardized process in construction company activity, which often leads to scandals and disapproval in the case of unprofessional implementation of projects. The authors present the results of an investigation of the modern understanding of the existing methodology classifications and offer their own concept of a classification matrix of risk management methods. The developed matrix is based on an analysis of each method in the context of incoming and outgoing transformed information, which may include different elements of the risk control stages. The offered approach thus allows the possibilities of each method to be analyzed.

  1. Data Analysis Methods for Library Marketing

    Science.gov (United States)

    Minami, Toshiro; Kim, Eunja

    Our society is rapidly changing into an information society, in which people's needs and requests for information access differ widely from person to person. A library's mission is to provide its users, or patrons, with the most appropriate information, and libraries have to know the profiles of their patrons in order to achieve such a role. The aim of library marketing is to develop methods based on library data, such as circulation records, book catalogs, book-usage data, and others. In this paper we first discuss the methodology and importance of library marketing. Then we demonstrate its usefulness through some examples of analysis methods applied to the circulation records of Kyushu University and Guacheon Library, and some implications obtained as results of these methods. Our research is a first step towards a future in which library marketing is an indispensable tool.

  2. Optical methods for the analysis of dermatopharmacokinetics

    Science.gov (United States)

    Lademann, Juergen; Weigmann, Hans-Juergen; von Pelchrzim, R.; Sterry, Wolfram

    2002-07-01

    Tape stripping in combination with spectroscopic measurements is a simple and noninvasive method for the analysis of the dermatopharmacokinetics of cosmetic products and topically applied drugs. The absorbance at 430 nm was used to characterize the amount of corneocytes on the tape strips; it was compared to the increase in weight of the tapes after removing them from the skin surface. The penetration profiles of two UV filter substances used in sunscreens were determined. The combined method of tape stripping and spectroscopic measurements can also be used to investigate the dermatopharmacokinetics of topically applied drugs passing through the skin. Differences in the penetration profiles of the steroid compound clobetasol, applied at the same concentration in different formulations on the skin, are presented.

  3. A high-efficiency aerothermoelastic analysis method

    Science.gov (United States)

    Wan, ZhiQiang; Wang, YaoKun; Liu, YunZhen; Yang, Chao

    2014-06-01

    In this paper, a high-efficiency aerothermoelastic analysis method based on unified hypersonic lifting surface theory is established. The method adopts a two-way coupling form that couples the structure, the aerodynamic force, aerodynamic heating, and heat conduction. The aerodynamic force is first calculated based on unified hypersonic lifting surface theory, and then the Eckert reference temperature method is used to solve the temperature field, where the transient heat conduction is solved using Fourier's law, and the modal method is used for the aeroelastic correction. Finally, flutter is analyzed based on the p-k method. The aerothermoelastic behavior of a typical hypersonic low-aspect-ratio wing is then analyzed, and the results indicate the following: (1) the combined effects of the aerodynamic load and the thermal load both deform the wing, an effect that increases with the flexibility, size, and flight time of the hypersonic aircraft; (2) the effect of heat accumulation should be noted, and the trajectory parameters should therefore be considered in the design of hypersonic flight vehicles to avoid hazardous conditions such as flutter.
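
The Eckert reference temperature step can be sketched directly from its standard formula, T* = T_e + 0.50 (T_w − T_e) + 0.22 (T_aw − T_e); the recovery factor and the flow conditions below are assumed values, not taken from the paper:

```python
def eckert_reference_temperature(T_e, T_w, mach, gamma=1.4, recovery=0.9):
    """Eckert reference temperature for compressible boundary-layer
    heating estimates:
        T* = T_e + 0.50 (T_w - T_e) + 0.22 (T_aw - T_e),
    with the adiabatic-wall temperature built from a recovery factor."""
    T_aw = T_e * (1.0 + recovery * (gamma - 1.0) / 2.0 * mach**2)
    return T_e + 0.50 * (T_w - T_e) + 0.22 * (T_aw - T_e)

# Assumed conditions: Mach 6 edge flow at 250 K, wall at 800 K
T_star = eckert_reference_temperature(250.0, 800.0, 6.0)   # ~881 K
```

Gas properties evaluated at T* then feed incompressible-form skin-friction and heat-transfer correlations, which is what makes the method cheap enough for the repeated evaluations a coupled aerothermoelastic loop requires.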

  4. Thermal Analysis Methods for Aerobraking Heating

    Science.gov (United States)

    Amundsen, Ruth M.; Gasbarre, Joseph F.; Dec, John A.

    2005-01-01

    As NASA begins exploration of other planets, a method of non-propulsively slowing vehicles at the planet, aerobraking, may become a valuable technique for managing vehicle design mass and propellant. An example of this is Mars Reconnaissance Orbiter (MRO), which will launch in late 2005 and reach Mars in March of 2006. In order to save propellant, MRO will use aerobraking to modify the initial orbit at Mars. The spacecraft will dip into the atmosphere briefly on each orbit, and during the drag pass, the atmospheric drag on the spacecraft will slow it, thus lowering the orbit apoapsis. The largest area on the spacecraft, and that most affected by the heat generated during the aerobraking process, is the solar arrays. A thermal analysis of the solar arrays was conducted at NASA Langley, to simulate their performance throughout the entire roughly 6-month period of aerobraking. Several interesting methods were used to make this analysis more rapid and robust. Two separate models were built for this analysis, one in Thermal Desktop for radiation and orbital heating analysis, and one in MSC.Patran for thermal analysis. The results from the radiation model were mapped in an automated fashion to the Patran thermal model that was used to analyze the thermal behavior during the drag pass. A high degree of automation in file manipulation as well as other methods for reducing run time were employed, since toward the end of the aerobraking period the orbit period is short, and in order to support flight operations the runs must be computed rapidly. All heating within the Patran Thermal model was combined in one section of logic, such that data mapped from the radiation model and aeroheating model, as well as skin temperature effects on the aeroheating and surface radiation, could be incorporated easily. This approach calculates the aeroheating at any given node, based on its position and temperature as well as the density and velocity at that trajectory point. Run times on

  5. Digital methods for mediated discourse analysis

    DEFF Research Database (Denmark)

    Kjær, Malene; Larsen, Malene Charlotte

    2015-01-01

    …restrictions or privately mediated settings. Having used mediated discourse analysis (Scollon, 2002; Scollon & Scollon, 2004) as a framework in two different research projects, we show how the framework, in correlation with digital resources for data gathering, provides new understandings of 1) the daily practice of health care professionals (Author 1, 2014) and 2) young people’s identity construction on social media platforms (Author 2, 2010, 2015, in press). The paper’s contribution is a methodological discussion of digital data collection using methods such as online interviewing (via e-mail or chat) and online questionnaire data in order to capture mediated actions and discourses in practice.

  6. Numerical analysis method for linear induction machines.

    Science.gov (United States)

    Elliott, D. G.

    1972-01-01

    A numerical analysis method has been developed for linear induction machines such as liquid metal MHD pumps and generators and linear motors. Arbitrary phase currents or voltages can be specified and the moving conductor can have arbitrary velocity and conductivity variations from point to point. The moving conductor is divided into a mesh and coefficients are calculated for the voltage induced at each mesh point by unit current at every other mesh point. Combining the coefficients with the mesh resistances yields a set of simultaneous equations which are solved for the unknown currents.
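
    The procedure described above can be sketched numerically: build the matrix of induced-voltage coefficients, combine it with the mesh resistances, and solve the resulting simultaneous equations for the unknown currents. All values below (mesh size, resistances, coupling kernel, excitation) are illustrative placeholders, not taken from the paper.

```python
import numpy as np

# Hypothetical 1-D mesh of N points along the moving conductor.
N = 8
omega = 2 * np.pi * 60.0          # drive frequency (rad/s), assumed
R = np.diag(np.full(N, 1e-3))     # mesh resistances (ohms), assumed uniform

# Coefficient matrix: voltage induced at mesh point i by unit current
# at mesh point j (a stand-in kernel that decays with separation).
x = np.arange(N)
M = 1e-6 * np.exp(-np.abs(x[:, None] - x[None, :]))

# Applied phase excitation at each mesh point (arbitrary travelling wave).
V = np.exp(1j * 2 * np.pi * x / N)

# Combining the coefficients with the mesh resistances yields the
# simultaneous equations (R + j*omega*M) I = V, solved for the currents.
I = np.linalg.solve(R + 1j * omega * M, V)
print(np.round(np.abs(I), 3))
```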

  7. FUZZY METHOD FOR FAILURE CRITICALITY ANALYSIS

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    The greatest benefit is realized from failure mode, effects and criticality analysis (FMECA) when it is done early in the design phase and tracks product changes as they evolve; design changes can then be made more economically than if the problems are discovered after the design has been completed. However, when the discovered design flaws must be prioritized for corrective action, precise information on their probability of occurrence, the effect of the failure, and their detectability is often not available. To solve this problem, this paper describes a new method, based on fuzzy sets, for prioritizing failures for corrective action in an FMECA. Its successful application to a container crane shows that the proposed method is both reasonable and practical.
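
    A minimal sketch in the spirit of fuzzy-set prioritization (not the paper's exact formulation): each imprecise rating is represented as a triangular fuzzy number, the three ratings are combined with approximate fuzzy arithmetic, and the result is defuzzified for ranking. All failure modes and ratings below are invented for illustration.

```python
# Each failure mode's occurrence (O), severity (S) and detectability (D)
# is a triangular fuzzy number (low, modal, high) on a 1-10 scale.

def tri_mul(x, y):
    """Approximate product of two triangular fuzzy numbers (componentwise)."""
    return tuple(a * b for a, b in zip(x, y))

def centroid(t):
    """Defuzzify a triangular fuzzy number by its centroid."""
    return sum(t) / 3.0

failure_modes = {
    "hoist brake slip":   ((6, 7, 8), (8, 9, 10), (4, 5, 6)),
    "cable fraying":      ((3, 4, 5), (7, 8, 9),  (2, 3, 4)),
    "limit switch stuck": ((2, 3, 4), (5, 6, 7),  (6, 7, 8)),
}

# Fuzzy risk priority number: O x S x D, defuzzified for ranking.
ranking = sorted(
    ((centroid(tri_mul(tri_mul(o, s), d)), name)
     for name, (o, s, d) in failure_modes.items()),
    reverse=True,
)
for score, name in ranking:
    print(f"{name}: {score:.0f}")
```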

  8. Optimizing the synthesis of CdS/ZnS core/shell semiconductor nanocrystals for bioimaging applications

    Directory of Open Access Journals (Sweden)

    Li-wei Liu

    2014-06-01

    Full Text Available In this study, we report on CdS/ZnS nanocrystals as a luminescence probe for bioimaging applications. CdS nanocrystals capped with a ZnS shell showed enhanced luminescence intensity, stronger stability, and a longer lifetime compared with uncapped CdS. The CdS/ZnS nanocrystals were stabilized in Pluronic F127 block-copolymer micelles, offering an optically and colloidally stable contrast agent for in vitro and in vivo imaging. Photostability tests showed that the ZnS protective shell not only enhances the brightness of the QDs but also improves their stability in a biological environment. An in vivo imaging study showed that the F127-CdS/ZnS micelles had strong luminescence. These results suggest that these nanoparticles have significant advantages for bioimaging applications and may offer a new direction for the early detection of cancer in humans.

  9. Multiphoton excitation of disc shaped quantum dot in presence of laser (THz) and magnetic field for bioimaging

    Energy Technology Data Exchange (ETDEWEB)

    Lahon, Siddhartha; Gambhir, Monica; Jha, P.K.; Mohan, Man [Department of Physics and Astrophysics, University of Delhi, Delhi 110 007 (India)

    2010-04-15

    Recently, multiphoton processes in nanostructures have attracted much attention for their promising applications, especially in the growing field of bioimaging. Here we investigate the optical response of a quantum disc (QD) in the presence of a laser and a static magnetic field. Floquet theory is employed to solve the equation of motion for laser-driven intraband transitions between states of the conduction band. Several interesting features, namely the dynamic Stark shift, power broadening, and hole burning upon breaking of the excited-level degeneracy, are observed as the electric and magnetic field strengths are varied. The enhancement and power broadening observed for the excited-state probabilities with increasing external fields are directly linked to the emission spectra of the QD and will be useful for future bioimaging devices. (Abstract Copyright [2010], Wiley Periodicals, Inc.)

  10. Facile Peptides Functionalization of Lanthanide-Based Nanocrystals through Phosphorylation Tethering for Efficient in Vivo NIR-to-NIR Bioimaging.

    Science.gov (United States)

    Yao, Chi; Wang, Peiyuan; Wang, Rui; Zhou, Lei; El-Toni, Ahmed Mohamed; Lu, Yiqing; Li, Xiaomin; Zhang, Fan

    2016-02-02

    Peptide modification of nanoparticles is a challenging task for bioapplications. Here, we show that noncovalent surface engineering based on ligand exchange of peptides for lanthanide-based upconversion and downconversion near-infrared (NIR) luminescent nanoparticles can be realized efficiently by converting the hydroxyl group of a side-grafted serine of the peptide into a phosphate group (phosphorylation). Using a phosphorylated peptide bearing the arginine-glycine-aspartic acid (RGD) targeting motif as a typical example, the modification improves the selectivity, sensitivity, and signal-to-noise ratio for cancer targeting and bioimaging, and reduces the toxicity derived from nonspecific interactions of nanoparticles with cells. The in vivo NIR bioimaging signal could be detected at injection amounts as low as 20 μg per animal.

  11. Mesoporous silica nanoparticles with organo-bridged silsesquioxane framework as innovative platforms for bioimaging and therapeutic agent delivery.

    Science.gov (United States)

    Du, Xin; Li, Xiaoyu; Xiong, Lin; Zhang, Xueji; Kleitz, Freddy; Qiao, Shi Zhang

    2016-06-01

    Mesoporous silica material with organo-bridged silsesquioxane frameworks synergistically combines inorganic silica, mesopores, and organic groups, resulting in novel or enhanced physicochemical and biocompatibility properties compared with conventional mesoporous silica materials of pure Si-O composition. With the rapid development of nanotechnology, monodispersed nanoscale periodic mesoporous organosilica nanoparticles (PMO NPs) and organo-bridged mesoporous silica nanoparticles (MSNs) with various organic groups and structures have recently been synthesized from 100% bridged organosilica precursors and from partially bridged precursors, respectively. These materials have since been employed as carrier platforms for bioimaging and/or therapeutic agent delivery nanosystems in nano-biomedical applications, where they demonstrate unique and/or enhanced properties and performance. This review article provides a comprehensive overview of the controlled synthesis of PMO NPs and organo-bridged MSNs, their physicochemical and biocompatibility properties, and their nano-biomedical application as bioimaging agents and/or therapeutic agent delivery systems.

  12. Chapter 11. Community analysis-based methods

    Energy Technology Data Exchange (ETDEWEB)

    Cao, Y.; Wu, C.H.; Andersen, G.L.; Holden, P.A.

    2010-05-01

    Microbial communities are each a composite of populations whose presence and relative abundance in water or other environmental samples are a direct manifestation of environmental conditions, including the introduction of microbe-rich fecal material and factors promoting persistence of the microbes therein. As shown by culture-independent methods, different animal-host fecal microbial communities appear distinctive, suggesting that their community profiles can be used to differentiate fecal samples and to potentially reveal the presence of host fecal material in environmental waters. Cross-comparisons of microbial communities from different hosts also reveal relative abundances of genetic groups that can be used to distinguish sources. In increasing order of their information richness, several community analysis methods hold promise for MST applications: phospholipid fatty acid (PLFA) analysis, denaturing gradient gel electrophoresis (DGGE), terminal restriction fragment length polymorphism (TRFLP), cloning/sequencing, and PhyloChip. Specific case studies involving TRFLP and PhyloChip approaches demonstrate the ability of community-based analyses of contaminated waters to confirm a diagnosis of water quality based on host-specific marker(s). The success of community-based MST for comprehensively confirming fecal sources relies extensively upon using appropriate multivariate statistical approaches. While community-based MST is still under evaluation and development as a primary diagnostic tool, results presented herein demonstrate its promise. Coupled with its inherently comprehensive ability to capture an unprecedented amount of microbiological data that is relevant to water quality, the tools for microbial community analysis are increasingly accessible, and community-based approaches have unparalleled potential for translation into rapid, perhaps real-time, monitoring platforms.
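
    As a toy illustration of the community-comparison idea behind these methods (not any specific technique from the chapter), the profile of an environmental water sample can be compared against candidate host-source profiles with a community dissimilarity measure such as Bray-Curtis; all abundance values below are made up.

```python
import numpy as np

# Relative abundances of genetic groups (e.g. normalized TRFLP peak
# areas) for three hypothetical host sources and one water sample.
profiles = {
    "human":  np.array([0.40, 0.30, 0.10, 0.20]),
    "cattle": np.array([0.05, 0.10, 0.60, 0.25]),
    "gull":   np.array([0.25, 0.05, 0.10, 0.60]),
}
water = np.array([0.35, 0.25, 0.15, 0.25])

def bray_curtis(u, v):
    """Bray-Curtis dissimilarity, a common choice in community comparisons."""
    return np.abs(u - v).sum() / (u + v).sum()

# The most similar host profile suggests the dominant fecal source.
source = min(profiles, key=lambda h: bray_curtis(profiles[h], water))
print(source)
```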

  13. A concise method for mine soils analysis

    Energy Technology Data Exchange (ETDEWEB)

    Winkler, S.; Wildeman, T.; Robinson, R.; Herron, J.

    1999-07-01

    A large number of abandoned hard rock mines exist in Colorado and other mountain west states, many on public property. Public pressure and resulting policy changes have become a driving force in the reclamation of these sites. Two of the key reclamation issues for these sites are the occurrence of acid forming materials (AFMs) in mine soils, and acid mine drainage (AMD) issuing from mine adits. An AMD treatment system design project for the Forest Queen mine in Colorado's San Juan mountains raised the need for a simple, usable method for analysis of mine land soils, both for suitability as a construction material and to determine the AFM content and potential for acid release. The authors have developed a simple, stepwise, go/no-go test for the analysis of mine soils. Samples were collected from a variety of sites in the Silverton, CO area, and subjected to three tiers of tests including: paste pH, Eh, and 10% HCl fizz test; then total digestion in HNO{sub 3}/HCl, neutralization potential, exposure to meteoric water, and toxicity characteristic leaching procedure (TCLP). All elemental analyses were performed with an inductively-coupled plasma (ICP) spectrometer. Elimination of samples via the first two testing tiers left two remaining samples, which were subsequently subjected to column and sequential batch tests, with further elemental analysis by ICP. Based on these tests, one sample was chosen as suitable for constructing the Forest Queen treatment system basins. Further simplification, and testing on two pairs of independent soil samples, has resulted in a final analytical method suitable for general use.

  14. Generalized Analysis of a Distribution Separation Method

    Directory of Open Access Journals (Sweden)

    Peng Zhang

    2016-04-01

    Full Text Available Separating two probability distributions from a mixture model that is made up of the combinations of the two is essential to a wide range of applications. For example, in information retrieval (IR, there often exists a mixture distribution consisting of a relevance distribution that we need to estimate and an irrelevance distribution that we hope to get rid of. Recently, a distribution separation method (DSM was proposed to approximate the relevance distribution, by separating a seed irrelevance distribution from the mixture distribution. It was successfully applied to an IR task, namely pseudo-relevance feedback (PRF, where the query expansion model is often a mixture term distribution. Although initially developed in the context of IR, DSM is indeed a general mathematical formulation for probability distribution separation. Thus, it is important to further generalize its basic analysis and to explore its connections to other related methods. In this article, we first extend DSM’s theoretical analysis, which was originally based on the Pearson correlation coefficient, to entropy-related measures, including the KL-divergence (Kullback–Leibler divergence, the symmetrized KL-divergence and the JS-divergence (Jensen–Shannon divergence. Second, we investigate the distribution separation idea in a well-known method, namely the mixture model feedback (MMF approach. We prove that MMF also complies with the linear combination assumption, and then, DSM’s linear separation algorithm can largely simplify the EM algorithm in MMF. These theoretical analyses, as well as further empirical evaluation results demonstrate the advantages of our DSM approach.
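
    The linear combination assumption at the heart of DSM can be illustrated directly: if the observed mixture is m = a*i + (1 - a)*r, with the seed irrelevance distribution i known, then recovering the relevance distribution r is a linear operation. The distributions and mixing weight below are illustrative, not from the article.

```python
import numpy as np

r_true = np.array([0.5, 0.3, 0.1, 0.1])   # hidden relevance distribution
i_seed = np.array([0.1, 0.1, 0.4, 0.4])   # seed irrelevance distribution
a = 0.6                                    # mixing weight of the seed
m = a * i_seed + (1 - a) * r_true          # observed mixture distribution

def separate(m, i, a):
    """Linear separation: invert m = a*i + (1-a)*r for r."""
    return (m - a * i) / (1.0 - a)

r_est = separate(m, i_seed, a)

# When a is unknown, one simple estimate is the largest a that keeps the
# separated distribution non-negative (r >= 0 requires m - a*i >= 0).
a_max = np.min(m / i_seed)
print(r_est, a_max)
```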

  15. Analysis Method for Quantifying Vehicle Design Goals

    Science.gov (United States)

    Fimognari, Peter; Eskridge, Richard; Martin, Adam; Lee, Michael

    2007-01-01

    A document discusses a method that uses Design Structure Matrices (DSMs), coupled with high-level tools representing important life-cycle parameters, to conceptualize a flight/ground space transportation system design comprehensively, addressing such variables as performance, up-front costs, downstream operations costs, and reliability. The approach also weighs operational approaches by their effect on upstream design variables, so that linkages between operations and those variables can be established readily yet defensibly. To avoid the wide range of problems that have defeated previous methods of dealing with the complexity of transportation design, and to cut down inefficient use of resources, the method identifies the areas of sufficient promise, provides a higher grade of analysis for those issues, and makes explicit the linkages between operations and other factors. Ultimately, the method is designed to save resources and time, and allows for the evolution of operable space transportation system technology, design, and conceptual system approach targets.

  16. Spatial methods in areal administrative data analysis

    Directory of Open Access Journals (Sweden)

    Haijun Ma

    2006-12-01

    Full Text Available Administrative data often arise as electronic copies of paid bills generated by insurance companies, including the Medicare and Medicaid programs. Such data are widely seen and analyzed in the public health area, as in investigations of cancer control, health service accessibility, and spatial epidemiology. In areas like political science and education, administrative data are also important. Administrative data are sometimes more readily available as summaries over each administrative unit (county, zip code, etc.) in a particular set determined by geopolitical boundaries, or what statisticians refer to as areal data. However, the spatial dependence present in administrative data is often ignored by health services researchers. This can lead to problems in estimating the true underlying spatial surface, including inefficient use of data and biased conclusions. In this article, we review hierarchical statistical modeling and boundary analysis (wombling) methods for areal-level spatial data that can be easily carried out using freely available statistical computing packages. We also propose a new edge-domain method designed to detect geographical boundaries corresponding to abrupt changes in the areal-level surface. We illustrate our methods using county-level breast cancer late detection data from the state of Minnesota.

  17. Water Hammer Analysis by Characteristic Method

    Directory of Open Access Journals (Sweden)

    A. R. Lohrasbi

    2008-01-01

    Full Text Available Rapid changes in the velocity of fluid in closed conduits generate large pressures, which are transmitted through the system with the speed of sound. When the fluid medium is a liquid, the pressure surges and related phenomena are described as water hammer. Water hammer is caused by normal operation of the system, such as valve opening or closure and pump starts and stoppages, and by abnormal conditions, such as power failure. Problem statement: Water hammer causes additional pressure in water networks. This pressure may damage pipes and connections. The likely effects of water hammer must be taken into account in the structural design of pipelines and in the design of operating procedures for pumps, valves, etc. Approach: The physical phenomena of water hammer and the mathematical model which provides the basis for design computations are described. Most water hammer analysis involves computer solution by the method of characteristics. In this study, water hammer is modelled with this method, and the effects of valve opening and closure are surveyed with a program written for this purpose and with a numerical example. Results: The more rapid the closure of the valve, the more rapid the change in momentum and, hence, the greater the additional pressure developed. Conclusions/Recommendations: To prevent water hammer damage, it is recommended that valves be opened or closed slowly. Also, using the method of characteristics, all pipe networks can be modelled and the effects of water hammer observed.
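
    A minimal frictionless method-of-characteristics sketch for a single reservoir-pipe-valve system shows the effect the abstract describes: instantaneous valve closure produces the full Joukowsky head rise a*V0/g at the valve. All pipe parameters are illustrative, not from the paper.

```python
import numpy as np

L, a, g = 600.0, 1200.0, 9.81   # pipe length (m), wave speed (m/s), gravity
A, H0, V0 = 0.05, 50.0, 1.5     # pipe area (m^2), reservoir head (m), velocity
N = 20                          # number of reaches
dx = L / N
dt = dx / a                     # Courant condition: dt = dx / a
B = a / (g * A)                 # characteristic impedance term
Q0 = V0 * A

H = np.full(N + 1, H0)          # initial steady head (friction neglected)
Q = np.full(N + 1, Q0)

peak = H0
for step in range(4 * N):       # march through a few wave periods
    Hn, Qn = H.copy(), Q.copy()
    # Interior points: intersection of C+ and C- characteristics.
    Cp = H[:-2] + B * Q[:-2]    # C+ arriving from node i-1
    Cm = H[2:] - B * Q[2:]      # C- arriving from node i+1
    Hn[1:-1] = 0.5 * (Cp + Cm)
    Qn[1:-1] = (Cp - Cm) / (2 * B)
    # Upstream reservoir: head fixed, C- gives the flow.
    Hn[0] = H0
    Qn[0] = (H0 - (H[1] - B * Q[1])) / B
    # Downstream valve: instantaneous closure, Q = 0, C+ gives the head.
    Qn[-1] = 0.0
    Hn[-1] = H[-2] + B * Q[-2]
    H, Q = Hn, Qn
    peak = max(peak, H[-1])

joukowsky = a * V0 / g          # expected surge head rise
print(peak - H0, joukowsky)
```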

  18. Multi-Spacecraft Turbulence Analysis Methods

    Science.gov (United States)

    Horbury, Tim S.; Osman, Kareem T.

    Turbulence is ubiquitous in space plasmas, from the solar wind to supernova remnants, and on scales from the electron gyroradius to interstellar separations. Turbulence is responsible for transporting energy across space and between scales, and plays a key role in plasma heating, particle acceleration, and thermalisation downstream of shocks. Just as with other plasma processes such as shocks or reconnection, turbulence results in complex, structured and time-varying behaviour which is hard to measure with a single spacecraft. However, turbulence is a particularly hard phenomenon to study because it is usually broadband in nature: it covers many scales simultaneously. One must therefore use techniques that extract information on multiple scales in order to quantify plasma turbulence and its effects. The Cluster orbit takes the spacecraft through turbulent regions with a range of characteristics: the solar wind, magnetosheath, cusp and magnetosphere. In each, the nature of the turbulence (strongly driven or fully evolved; dominated by kinetic effects or largely on fluid scales), as well as characteristics of the medium (thermalised or not; high or low plasma beta; sub- or super-Alfvénic), mean that particular techniques are better suited to the analysis of Cluster data in different locations. In this chapter, we consider a range of methods and how they are best applied to these different regions. Perhaps the most studied turbulent space plasma environment is the solar wind; see Bruno and Carbone [2005] and Goldstein et al. [2005] for recent reviews. This is the case for a number of reasons: it is scientifically important for cosmic ray and solar energetic particle scattering and propagation, for example. However, perhaps the most significant motivations for studying solar wind turbulence are pragmatic: large volumes of high quality measurements are available; the stability of the solar wind on the scales of hours makes it possible to identify statistically stationary intervals to

  19. Method and tool for network vulnerability analysis

    Science.gov (United States)

    Swiler, Laura Painton; Phillips, Cynthia A.

    2006-03-14

    A computer system analysis tool and method that will allow for qualitative and quantitative assessment of security attributes and vulnerabilities in systems including computer networks. The invention is based on generation of attack graphs wherein each node represents a possible attack state and each edge represents a change in state caused by a single action taken by an attacker or unwitting assistant. Edges are weighted using metrics such as attacker effort, likelihood of attack success, or time to succeed. Generation of an attack graph is accomplished by matching information about attack requirements (specified in "attack templates") to information about computer system configuration (contained in a configuration file that can be updated to reflect system changes occurring during the course of an attack) and assumed attacker capabilities (reflected in "attacker profiles"). High risk attack paths, which correspond to those considered suited to application of attack countermeasures given limited resources for applying countermeasures, are identified by finding "epsilon optimal paths."
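
    In the simplest case, finding high-risk attack paths over the weighted attack graph reduces to a shortest-path computation, where low total weight corresponds to low attacker effort. The sketch below uses Dijkstra's algorithm on an invented graph; the node names and edge weights are hypothetical, not from the invention.

```python
import heapq

# Hypothetical attack graph: nodes are attack states, edge weights
# model attacker effort to move between states.
graph = {
    "start":        [("recon", 1), ("phish", 4)],
    "recon":        [("exploit-web", 3), ("phish", 2)],
    "phish":        [("user-creds", 2)],
    "exploit-web":  [("web-shell", 2)],
    "user-creds":   [("domain-admin", 5)],
    "web-shell":    [("domain-admin", 3)],
    "domain-admin": [],
}

def least_effort_path(graph, src, dst):
    """Dijkstra: minimum total attacker effort from src to dst."""
    pq, settled = [(0, src, [src])], {}
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node in settled:
            continue
        settled[node] = (cost, path)
        for nxt, w in graph[node]:
            if nxt not in settled:
                heapq.heappush(pq, (cost + w, nxt, path + [nxt]))
    return settled.get(dst)

cost, path = least_effort_path(graph, "start", "domain-admin")
print(cost, path)
```

    The lowest-cost path identifies the attack sequence most worth countering; near-optimal ("epsilon optimal") paths could be enumerated similarly by keeping paths within a tolerance of the optimum.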

  20. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

    Full Text Available Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations-COSO, 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that although the introduction of quantified risk to enhance the degree of objectivity in, for instance, finance developed quite in parallel with its development in the manufacturing industry, the same is not true of Higher Education Institutions (HEIs). In this regard, the objective of the paper was to demonstrate the method and process of Quantitative Risk Analysis (QRA) through the likelihood of occurrence of risk (phase I). This paper serves as the first of a two-phase study, which sampled one hundred (100) risk analysts at a university in the greater Eastern Cape Province of South Africa. The analysis of likelihood of occurrence of risk by logistic regression and percentages was conducted to investigate whether or not there was a significant difference between groups (analysts) in respect of QRA. The Hosmer and Lemeshow test was non-significant, with a chi-square (X2 = 8.181; p = 0.300), which indicated a good model fit, since the data did not deviate significantly from the model. The study concluded that to derive an overall likelihood rating indicating the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat-source motivation and capability, (2) nature of the vulnerability, and (3) existence and effectiveness of current controls (method and process).

  1. Buckling analysis of composite cylindrical shell using numerical analysis method

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Hae Young; Bae, Won Byung [Pusan Nat' l Univ., Busan (Korea, Republic of); Cho, Jong Rae [Korea Maritime Univ., Busan (Korea, Republic of); Lee, Woo Hyung [Underwater Vehicle Research Center, Busan (Korea, Republic of)

    2012-01-15

    The objective of this paper is to predict the buckling pressure of a composite cylindrical shell using buckling formulas (ASME 2007, NASA SP 8007) and finite element analysis. The model in this study uses a stacking angle of [0/90]12t and USN 125 composite material. All specimens were made using a prepreg method. First, finite element analysis was conducted, and the results were verified through comparison with the hydrostatic pressure buckling experiment results. Second, the values obtained from the buckling formulas and the buckling pressure values obtained from the finite element analysis were compared as the stacking angle was changed in 5° increments from 20° to 90°. The linear and nonlinear results of the finite element analysis were consistent with the results of the experiment, with a safety factor of 0.85-1. Based on the above results, the ASME 2007 formula, a simplified version of the NASA SP 8007 formula, is regarded as a buckling formula that provides a reliable safety factor.

  2. Coloured Petri Nets: Basic Concepts, Analysis Methods and Practical Use. Vol. 2, Analysis Methods

    DEFF Research Database (Denmark)

    Jensen, Kurt

    This three-volume work presents a coherent description of the theoretical and practical aspects of coloured Petri nets (CP-nets). The second volume contains a detailed presentation of the analysis methods for CP-nets. They allow the modeller to investigate dynamic properties of CP-nets. The main...

  3. System and method for making quantum dots

    KAUST Repository

    Bakr, Osman M.

    2015-05-28

    Embodiments of the present disclosure provide for methods of making quantum dots (QDs) (passivated or unpassivated) using a continuous flow process, systems for making QDs using a continuous flow process, and the like. In one or more embodiments, the QDs produced using embodiments of the present disclosure can be used in solar photovoltaic cells, bio-imaging, IR emitters, or LEDs.

  4. Fluorescence-Magnetism Functional EuS Nanocrystals with Controllable Morphologies for Dual Bioimaging.

    Science.gov (United States)

    Sun, Yuanqing; Wang, Dandan; Zhao, Tianxin; Jiang, Yingnan; Zhao, Yueqi; Wang, Chuanxi; Sun, Hongchen; Yang, Bai; Lin, Quan

    2016-12-14

    Multiple functions incorporated in one single-component material offer important applications in biosystems. Here we prepared divalent rare-earth EuS nanocrystals (NCs), which provide both luminescent and magnetic properties, using 1-dodecanethiol (DT) and oleylamine (OLA) as reducing agents. The resultant EuS NCs exhibit controllable shapes, uniform size, and bright luminescence with a quantum yield as high as 3.5%. OLA as a surface ligand plays an important role in the tunable morphologies, such as nanowires, nanorods, and nanospheres. Another attractive feature of the EuS NCs is their paramagnetism at room temperature. To expand their biological applications, the EuS NCs were modified with the amphiphilic block copolymer F127 and transferred from the oil to the water phase. The excellent biocompatibility of the EuS NCs is demonstrated, along with preservation of their luminescent and paramagnetic properties. The EuS NCs thus combine bright luminescence, paramagnetism, controllable morphologies, and good biocompatibility, promising applications in the field of simultaneous magnetic resonance and fluorescence bioimaging.

  5. Optimization of a dedicated bio-imaging beamline at the European X-ray FEL

    CERN Document Server

    Geloni, Gianluca; Saldin, Evgeni

    2012-01-01

    We recently proposed a basic concept for the design and layout of the undulator source for a dedicated bio-imaging beamline at the European XFEL. The goal of the optimized scheme proposed here is to enable experimental simplification and performance improvement. The core of the scheme is composed of soft and hard X-ray self-seeding setups. Based on an improved design for both monochromators, it is possible to increase the design electron energy to 17.5 GeV in the photon energy range between 2 keV and 13 keV, which is the most preferable for life science experiments. An advantage of operating at such a high electron energy is the increase in X-ray output peak power. Another advantage is that 17.5 GeV is the preferred operating energy for SASE1 and SASE2 beamline users. Since it will be necessary to run all the XFEL lines at the same electron energy, this choice will reduce interference with other undulator lines and increase the total amount of scheduled beam time. In this work we also propose a stu...

  6. Aqueous synthesis and biostabilization of CdS@ZnS quantum dots for bioimaging applications

    Science.gov (United States)

    Chen, L.; Liu, Y.; Lai, C.; Berry, R. M.; Tam, K. C.

    2015-10-01

    Bionanohybrids, combining biocompatible natural polymers with inorganic materials, have aroused interest because of their structural, functional, and environmental advantages. In this work, we report on the stabilization of CdS@ZnS core-shell quantum dots (QDs) using carboxylated cellulose nanocrystals (CNCs) as nanocarriers in the aqueous phase. High colloidal stability was achieved with sufficient negative charge on the CNC surface and the coordination of Cd2+ to carboxylate groups. This coordination allows the in situ nucleation and growth of QDs on the CNC surface. The influences of the QD-to-CNC ratio, pH, and ZnS coating on the colloidal stability and photoluminescence properties of the CNC/QD nanohybrids were also studied. The results showed that products obtained at pH 8 with a CdS-to-CNC weight ratio of 0.19 and a ZnS/CdS molar ratio of 1.5 possessed excellent colloidal stability and the highest photoluminescence intensity. By anchoring QDs on rigid bionanotemplates, CNC/CdS@ZnS exhibited long-term colloidal and optical stability. Using biocompatible CNCs as nanocarriers, the products have been demonstrated to exhibit low cytotoxicity towards HeLa cells and can serve as promising red-emitting fluorescent bioimaging probes.

  7. Surface chemistry manipulation of gold nanorods preserves optical properties for bio-imaging applications

    Energy Technology Data Exchange (ETDEWEB)

    Polito, Anthony B.; Maurer-Gardner, Elizabeth I.; Hussain, Saber M., E-mail: saber.hussain@us.af.mil [Air Force Research Laboratory, Molecular Bioeffects Branch, Bioeffects Division, Human Effectiveness Directorate (United States)

    2015-12-15

    Due to their anisotropic shape, gold nanorods (GNRs) possess a number of advantages for biosystem use, including enhanced surface area and tunable optical properties within the near-infrared (NIR) region. However, cetyl trimethylammonium bromide-related cytotoxicity, overall poor cellular uptake following surface chemistry modifications, and loss of NIR optical properties due to intracellular aggregation of the material together remain obstacles for nanobased biomedical GNR applications. In this article, we report that tannic acid-coated 11-mercaptoundecyl trimethylammonium bromide (MTAB) GNRs (MTAB-TA) show no significant decrease in either in vitro cell viability or stress activation after exposure to A549 human alveolar epithelial cells. In addition, MTAB-TA GNRs demonstrate a substantial level of cellular uptake while displaying a unique intracellular clustering pattern. This clustering pattern significantly reduces intracellular aggregation, preserving the GNRs' NIR optical properties, vital for biomedical imaging applications. These results demonstrate how surface chemistry modifications enhance biocompatibility, allow for a higher rate of internalization with low intracellular aggregation of MTAB-TA GNRs, and identify them as prime candidates for use in nanobased bio-imaging applications.

  8. Fluorescent probe based on heteroatom containing styrylcyanine: pH-sensitive properties and bioimaging in vivo

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Xiaodong [School of Chemistry and Chemical Engineering, South China University of Technology, Guangzhou 510640 (China); Gao, Ya; Huang, Zhibing; Chen, Xiaohui; Ke, Zhiyong [School of Basic Medical Science, Southern Medical University, Guangzhou 510515 (China); Zhao, Peiliang; Yan, Yichen [Department of Organic Pharmaceutical Chemistry, School of Pharmaceutical Sciences, Southern Medical University, Guangzhou 510515 (China); Liu, Ruiyuan, E-mail: ruiyliu@smu.edu.cn [Department of Organic Pharmaceutical Chemistry, School of Pharmaceutical Sciences, Southern Medical University, Guangzhou 510515 (China); Qu, Jinqing, E-mail: cejqqu@scut.edu.cn [School of Chemistry and Chemical Engineering, South China University of Technology, Guangzhou 510640 (China)

    2015-07-01

    A novel fluorescent probe based on a heteroatom-containing styrylcyanine is synthesized. The fluorescence of the probe is bright green in basic and neutral media but dark orange in strongly acidic environments, and the two states can be reversibly switched. This behavior enables it to work as a fluorescent pH sensor in the solution state and as a chemosensor for detecting acidic and basic volatile organic compounds. NMR spectroscopic analyses confirm that protonation or deprotonation of the pyridinyl moiety is responsible for the sensing process. In addition, fluorescence microscopy images of the probe in live cells and zebrafish were obtained successfully, suggesting that the probe has good cell membrane permeability and low cytotoxicity. - Graphical abstract: A novel styrylcyanine-based fluorescent pH probe was designed and synthesized; its fluorescence is bright green in basic and neutral media but dark orange in strongly acidic environments. This behavior enables it to work as a fluorescent pH sensor in the solution state and as a chemosensor for detecting highly acidic and basic volatile organic compounds in the solid state. In addition, it can be used for fluorescence imaging in living cells and living organisms. - Highlights: • Bright green fluorescence was observed in basic and neutral media. • Dark orange fluorescence was found in strongly acidic environments. • Volatile organic compounds with high acidity and basicity could be detected. • Bioimaging in living cells and living organisms was achieved successfully.

  9. Ultra-bright and stimuli-responsive fluorescent nanoparticles for bioimaging.

    Science.gov (United States)

    Battistelli, Giulia; Cantelli, Andrea; Guidetti, Gloria; Manzi, Jeannette; Montalti, Marco

    2016-01-01

    Fluorescent nanoparticles (NPs) are unique contrast agents for bioimaging. Examples of molecular-based fluorescent NPs with brightness similar or superior to semiconductor quantum dots have been reported. These ultra-bright NPs consist of a silica or polymeric matrix that incorporates the emitting dyes as individual moieties or aggregates, and they promise to be more biocompatible than semiconductor quantum dots. Ultra-bright materials result from heavy doping of the structural matrix, a condition that entails close mutual proximity of the doping dyes. Ground-state and excited-state interactions between the molecular emitters yield aggregation-caused quenching (ACQ) and proximity-caused quenching (PCQ). In combination with Förster resonance energy transfer (FRET), ACQ and PCQ give rise to collective phenomena that produce amplified quenching of the nanoprobes. In this focus article, we discuss strategies to achieve ultra-bright nanoprobes that avoid ACQ and PCQ, in part by exploiting aggregation-induced emission (AIE). Amplified quenching, on the other hand, is also proposed as a strategy to design stimuli-responsive fluorogenic probes through disaggregation-induced emission (DIE), as an alternative to AIE. As an advantage, DIE allows the design of stimuli-responsive materials starting from a large variety of precursors, whereas AIE is characteristic of a limited number of species. Examples of stimuli-responsive fluorogenic probes based on DIE are discussed.

  10. Methods for the proximate analysis of peat

    Energy Technology Data Exchange (ETDEWEB)

    Sheppard, J.D.; Tibbetts, T.E.; Forgeron, D.W.

    1986-01-01

    An investigation was conducted into methods for determining the percentages of volatile matter and ash in peat. Experiments were performed on two types of sphagnum peat, a decomposed fuel peat and a commercial horticultural-grade peat. The heating apparatus consisted of both a standard programmable furnace (Fisher Coal Analyser) and a thermogravimetric analyser (TGA) with a module for differential scanning calorimetry (Mettler TA 3000 system). The results indicate that the seven-minute test for volatile matter at either 900 C or 950 C does not fully differentiate volatiles from fixed carbon and that, depending on the degree of decomposition, up to sixty minutes at 900 C may be required. The TGA system is very useful in discriminating between different fractions of volatile matter; the relative fractions are more important in determining burning characteristics than the total percentage of volatiles. Ashing must be performed under conditions sufficiently severe to ensure complete combustion of organics; the severity required depends mainly on the degree of decomposition and the sample size. Use of TGA and DSC for studying the combustion of peat provides much more information than the standard proximate analysis. 14 refs.

  11. Echinacea purpurea: Pharmacology, phytochemistry and analysis methods

    Directory of Open Access Journals (Sweden)

    Azadeh Manayi

    2015-01-01

    Full Text Available Echinacea purpurea (Asteraceae) is a perennial medicinal herb with important immunostimulatory and anti-inflammatory properties, especially the alleviation of cold symptoms. The plant has also attracted scientists' attention to other aspects of its beneficial effects. For instance, antianxiety, antidepression, cytotoxicity, and antimutagenicity induced by the plant have been revealed in various studies. The findings of the clinical trials are controversial in terms of side effects. While some studies revealed beneficial effects of the plant on patients and no severe adverse effects, others have reported serious side effects including abdominal pain, angioedema, dyspnea, nausea, pruritus, rash, erythema, and urticaria. Other biological activities of the plant, such as antioxidant, antibacterial, antiviral, and larvicidal activities, have been reported in previous experimental studies. Different classes of secondary metabolites of the plant, such as alkamides, caffeic acid derivatives, polysaccharides, and glycoproteins, are believed to be biologically and pharmacologically active. Concurrent determination and single analysis of cichoric acid and alkamides have been successfully developed, mainly by using high-performance liquid chromatography (HPLC) coupled with different detectors, including UV spectrophotometric, coulometric electrochemical, and electrospray ionization mass spectrometric detectors. The sometimes controversial results of these studies reveal that, in spite of the major experiments successfully accomplished using E. purpurea, many questions remain unanswered, and future investigations may aim for a complete understanding of the plant's mechanism of action using new, complementary methods.

  12. An Analysis of the SURF Method

    Directory of Open Access Journals (Sweden)

    Edouard Oyallon

    2015-07-01

    Full Text Available The SURF method (Speeded Up Robust Features) is a fast and robust algorithm for local, similarity-invariant representation and comparison of images. As in many other local descriptor-based approaches, interest points of a given image are defined as salient features from a scale-invariant representation. Such a multiple-scale analysis is provided by the convolution of the initial image with discrete kernels at several scales (box filters). The second step consists in building orientation-invariant descriptors using local gradient statistics (intensity and orientation). The main interest of the SURF approach lies in its fast computation of operators using box filters, enabling real-time applications such as tracking and object recognition. The SURF framework described in this paper is based on the PhD thesis of H. Bay [ETH Zurich, 2009], and more specifically on the paper co-written by H. Bay, A. Ess, T. Tuytelaars and L. Van Gool [Computer Vision and Image Understanding, 110 (2008), pp. 346–359]. An implementation is proposed and used to illustrate the approach for image matching. A short comparison with a state-of-the-art approach is also presented: the SIFT algorithm of D. Lowe [International Journal of Computer Vision, 60 (2004), pp. 91–110], with which SURF has much in common.
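
    The constant-time box-filter evaluation that makes SURF fast rests on an integral image (summed-area table). Below is a minimal illustrative sketch in Python; it is not taken from the reviewed implementation, and the example image is made up:

```python
def integral_image(img):
    """Summed-area table with a zero top row and left column,
    so ii[y][x] holds the sum of img[0:y][0:x]."""
    h, w = len(img), len(img[0])
    ii = [[0] * (w + 1) for _ in range(h + 1)]
    for y in range(h):
        row_sum = 0
        for x in range(w):
            row_sum += img[y][x]
            # sum over rows 0..y, cols 0..x
            ii[y + 1][x + 1] = ii[y][x + 1] + row_sum
    return ii

def box_sum(ii, y1, x1, y2, x2):
    """Sum of img[y1:y2+1][x1:x2+1] in four lookups,
    independent of the box size."""
    return ii[y2 + 1][x2 + 1] - ii[y1][x2 + 1] - ii[y2 + 1][x1] + ii[y1][x1]

img = [[1, 2, 3, 4],
       [5, 6, 7, 8],
       [9, 10, 11, 12]]
ii = integral_image(img)
```

    Each box response costs four table lookups regardless of the filter size, which is what allows SURF to evaluate responses at several scales without building an image pyramid.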

  13. Mapping Cigarettes Similarities using Cluster Analysis Methods

    Directory of Open Access Journals (Sweden)

    Lorentz Jäntschi

    2007-09-01

    Full Text Available The aim of the research was to investigate the relationships and/or occurrences in and between chemical composition information (tar, nicotine, carbon monoxide), market information (brand, manufacturer, price), and public health information (class, health warning), as well as the clustering of a sample of cigarette data. Thirty cigarette brands were analyzed. Six categorical (cigarette brand, manufacturer, health warnings, class) and four continuous (tar, nicotine, and carbon monoxide concentrations, and package price) variables were collected for the investigation of chemical composition, market information, and public health information. Multiple linear regression and two clustering techniques were applied. The study revealed interesting findings. The carbon monoxide concentration proved to be linked with the tar and nicotine concentrations. The applied clustering methods identified groups of cigarette brands that showed similar characteristics. The tar and carbon monoxide concentrations were the main criteria used in clustering. An analysis of a larger sample could reveal more relevant and useful information regarding the similarities between cigarette brands.
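
    The record does not name the two clustering techniques used, so as a hedged illustration of the general idea only, the sketch below applies single-linkage agglomerative clustering to made-up (tar, nicotine, CO) triples and recovers a low-yield and a high-yield group:

```python
import math

def single_linkage(points, k):
    """Agglomerative clustering: repeatedly merge the two clusters
    whose closest members are nearest, until k clusters remain."""
    clusters = [{i} for i in range(len(points))]
    def dist(i, j):
        return math.dist(points[i], points[j])
    while len(clusters) > k:
        best = None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                d = min(dist(i, j) for i in clusters[a] for j in clusters[b])
                if best is None or d < best[0]:
                    best = (d, a, b)
        _, a, b = best
        clusters[a] |= clusters[b]   # merge cluster b into cluster a
        del clusters[b]
    return clusters

# Hypothetical (tar, nicotine, CO) values, mg per cigarette -- not the study's data
brands = [(1, 0.1, 1), (2, 0.2, 2), (3, 0.3, 3),
          (10, 0.8, 10), (11, 0.9, 11), (12, 1.0, 12)]
groups = single_linkage(brands, 2)
```

    On real data one would standardize the variables first, since tar and CO (in mg) would otherwise dominate the nicotine dimension.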

  14. New numerical analysis method in computational mechanics: composite element method

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    A new type of FEM, called CEM (composite element method), is proposed to solve static and dynamic problems of engineering structures with high accuracy and efficiency. The core of this method is to define two sets of coordinate systems for the description of the DOFs after discretizing the structure: the nodal coordinate system UFEM(ξ), employing the conventional FEM, and the field coordinate system UCT(ξ), utilizing classical theory. Coupling these two sets of functional expressions yields the composite displacement field U(ξ) of the CEM. The computation of the stiffness and mass matrices follows the conventional FEM procedure. Since the CEM inherits the good properties of both the conventional FEM and the classical analytical method, it is highly versatile for complex geometric shapes and offers excellent approximation. Many examples are presented to demonstrate the ability of the CEM.

  15. New numerical analysis method in computational mechanics: composite element method

    Institute of Scientific and Technical Information of China (English)

    Zeng Pan

    2000-01-01

    A new type of FEM, called CEM (composite element method), is proposed to solve static and dynamic problems of engineering structures with high accuracy and efficiency. The core of this method is to define two sets of coordinate systems for the description of the DOFs after discretizing the structure: the nodal coordinate system UFEM(ζ), employing the conventional FEM, and the field coordinate system UCT(ζ), utilizing classical theory. Coupling these two sets of functional expressions yields the composite displacement field U(ζ) of the CEM. The computation of the stiffness and mass matrices follows the conventional FEM procedure. Since the CEM inherits the good properties of both the conventional FEM and the classical analytical method, it is highly versatile for complex geometric shapes and offers excellent approximation. Many examples are presented to demonstrate the ability of the CEM.

  16. Bio-image warehouse system: concept and implementation of a diagnosis-based data warehouse for advanced imaging modalities in neuroradiology.

    Science.gov (United States)

    Minati, L; Ghielmetti, F; Ciobanu, V; D'Incerti, L; Maccagnano, C; Bizzi, A; Bruzzone, M G

    2007-03-01

    Advanced neuroimaging techniques, such as functional magnetic resonance imaging (fMRI), chemical shift spectroscopy imaging (CSI), diffusion tensor imaging (DTI), and perfusion-weighted imaging (PWI), create novel challenges in terms of data storage and management: huge amounts of raw data are generated; the results of analysis may depend on the software and settings that have been used; and, most often, intermediate files are inherently not compliant with the current DICOM (digital imaging and communication in medicine) standard, as they contain multidimensional complex and tensor arrays and various other types of data structures. A software architecture, referred to as the Bio-Image Warehouse System (BIWS), which can be used alongside a radiology information system/picture archiving and communication system (RIS/PACS) to store neuroimaging data for research purposes, is presented. The system architecture is conceived to enable queries by diagnosis according to a predefined two-layered classification taxonomy. The operational impact of the system and the time needed to become acquainted with the web-based interface and with the taxonomy are found to be limited. The development of modules enabling the automated creation of statistical templates is proposed.

  17. Critical Security Methods : New Frameworks for Analysis

    NARCIS (Netherlands)

    Voelkner, Nadine; Huysmans, Jef; Claudia, Aradau; Neal, Andrew

    2015-01-01

    Critical Security Methods offers a new approach to research methods in critical security studies. It argues that methods are not simply tools to bridge the gap between security theory and security practice. Rather, to practise methods critically means engaging in a more free and experimental interplay

  18. Advanced Software Methods for Physics Analysis

    Science.gov (United States)

    Lista, L.

    2006-01-01

    Unprecedented data analysis complexity is experienced in modern High Energy Physics experiments. The complexity arises from the growing size of recorded data samples, the large number of data analyses performed by different users in each experiment, and the complexity of each individual analysis. For this reason, the requirements on software for data analysis impose a very high level of reliability. We present two concrete examples: the first from the BaBar experience with the migration to a new Analysis Model, including the definition of a new model for the Event Data Store; the second a toolkit for multivariate statistical and parametric Monte Carlo analysis developed using generic programming.

  19. Solving Generalised Riccati Differential Equations by Homotopy Analysis Method

    Directory of Open Access Journals (Sweden)

    J. Vahidi

    2013-07-01

    Full Text Available In this paper, the quadratic Riccati differential equation is solved by means of an analytic technique, namely the homotopy analysis method (HAM). Comparisons are made between the homotopy analysis method, Adomian's decomposition method (ADM), and the exact solution. The results reveal that the proposed method is very effective and simple.
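
    HAM itself does not fit in a few lines, but the kind of comparison the abstract describes, an analytic series approximation checked against the exact solution, can be sketched for the quadratic Riccati equation y' = 1 + y², y(0) = 0, whose exact solution is tan(t). This stand-in uses the known tangent-series coefficients, not the HAM deformation equations:

```python
import math

def riccati_series(t, terms=3):
    """Truncated series solution of y' = 1 + y^2, y(0) = 0,
    i.e. the leading terms of tan(t): t + t^3/3 + 2t^5/15 + 17t^7/315."""
    coeffs = [1, 1/3, 2/15, 17/315]  # coefficients of t, t^3, t^5, t^7
    return sum(c * t ** (2 * i + 1) for i, c in enumerate(coeffs[:terms]))

t = 0.3
approx = riccati_series(t, terms=3)
exact = math.tan(t)        # exact solution of the IVP
error = abs(approx - exact)
```

    Adding terms shrinks the error rapidly for small t, which is the kind of convergence behavior such comparisons are meant to exhibit.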

  20. 21 CFR 163.5 - Methods of analysis.

    Science.gov (United States)

    2010-04-01

    ... CONSUMPTION CACAO PRODUCTS General Provisions § 163.5 Methods of analysis. Shell and cacao fat content in cacao products shall be determined by the following methods of analysis prescribed in “Official Methods..._locations.html. (a) Shell content—12th ed. (1975), methods 13.010-13.014, under the heading “Shell in...

  1. Methods for analysis of fluoroquinolones in biological fluids

    Science.gov (United States)

    Methods for analysis of 10 selected fluoroquinolone antibiotics in biological fluids are reviewed. Approaches for sample preparation, detection methods, limits of detection and quantitation and recovery information are provided for both single analyte and multi-analyte fluoroquinolone methods....

  2. Methods for Mediation Analysis with Missing Data

    Science.gov (United States)

    Zhang, Zhiyong; Wang, Lijuan

    2013-01-01

    Despite wide applications of both mediation models and missing data techniques, formal discussion of mediation analysis with missing data is still rare. We introduce and compare four approaches to dealing with missing data in mediation analysis, including listwise deletion, pairwise deletion, multiple imputation (MI), and a two-stage maximum…

  3. Reliability Analysis of Slope Stability by Central Point Method

    OpenAIRE

    Li, Chunge; WU Congliang

    2015-01-01

    Given the uncertainty and variability of slope stability analysis parameters, this paper proceeds from the perspective of probability theory and statistics, based on reliability theory. Through the central point method of reliability analysis, a performance function for the reliability of slope stability analysis is established. Furthermore, the central point method and conventional limit equilibrium methods are compared through a calculation example. The approach's numerical ...
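
    In its simplest form, the central point method (the mean-value first-order second-moment approach) evaluates the performance function Z = R - S at the mean point and forms a reliability index from its first two moments. A hedged sketch with made-up resistance and load statistics, not the paper's calculation example:

```python
import math

def central_point_beta(mu_r, sigma_r, mu_s, sigma_s):
    """Reliability index for Z = R - S with independent resistance R
    and load S: beta = mean(Z) / std(Z)."""
    return (mu_r - mu_s) / math.hypot(sigma_r, sigma_s)

def failure_probability(beta):
    """P(Z < 0) under a normal assumption: Phi(-beta)."""
    return 0.5 * math.erfc(beta / math.sqrt(2))

# Hypothetical sliding-resistance and driving-force moments (kN)
beta = central_point_beta(mu_r=100.0, sigma_r=10.0, mu_s=60.0, sigma_s=8.0)
pf = failure_probability(beta)
```

    The limitation the literature notes for this method is that beta depends on how the performance function is written, which is what motivates more refined (e.g. checking-point) reliability methods.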

  4. Multiscale Methods for Nuclear Reactor Analysis

    Science.gov (United States)

    Collins, Benjamin S.

    The ability to accurately predict local pin powers in nuclear reactors is necessary to understand the mechanisms that cause fuel pin failure during steady-state and transient operation. In the research presented here, methods are developed to improve the local solution using high-order methods with boundary conditions from a low-order global solution. Several different core configurations were tested to determine the improvement in the local pin powers compared to the standard techniques, which use diffusion theory and pin power reconstruction (PPR). Two different multiscale methods were developed and analyzed: the post-refinement multiscale method and the embedded multiscale method. The post-refinement multiscale method uses the global solution to determine boundary conditions for the local solution. The local solution is solved using either a fixed boundary source or an albedo boundary condition; this solution is "post-refinement" and thus has no impact on the global solution. The embedded multiscale method allows the local solver to change the global solution, providing an improved global and local solution. The post-refinement multiscale method is assessed using three core designs. When the local solution has more energy groups, the fixed source method has some difficulties near the interface; however, the albedo method works well for all cases. In order to remedy the issue with boundary condition errors for the fixed source method, a buffer region is used to act as a filter, which decreases the sensitivity of the solution to the boundary condition. Both the albedo and fixed source methods benefit from the use of a buffer region. Unlike the post-refinement method, the embedded multiscale method alters the global solution. The ability to change the global solution allows for refinement in areas where the errors in the few-group nodal diffusion are typically large. The embedded method is shown to improve the global solution when it is applied to a MOX/LEU assembly

  5. Chemical Analysis Methods for Silicon Carbide

    Institute of Scientific and Technical Information of China (English)

    Shen Keyin

    2006-01-01

    1 General and Scope. This standard specifies methods for the determination of silicon dioxide, free silicon, free carbon, total carbon, silicon carbide, and ferric sesquioxide in silicon carbide abrasive material.

  6. Numerical analysis in electromagnetics the TLM method

    CERN Document Server

    Saguet, Pierre

    2013-01-01

    The aim of this book is to give a broad overview of the TLM (Transmission Line Matrix) method, which is one of the time-domain numerical methods. These methods are known for their significant demands on computer resources, but they have the advantage of being highly general. The TLM method has acquired a reputation as a powerful and effective tool among numerous teams and still benefits today from significant theoretical developments. In particular, in recent years, its ability to simulate various situations with excellent precision, including complex materials, has been

  7. Steel mill products analysis using quality methods

    Directory of Open Access Journals (Sweden)

    B. Gajdzik

    2016-10-01

    Full Text Available The article presents steel mill product analysis using quality tools. The subjects of quality control were bolts and a ball bushing. The Pareto chart and failure mode and effect analysis (FMEA) were used to assess the faultiness of the products. In the case of the bolt, the faultiness analysis detected the following defects: failure to keep the dimensional tolerance, dents and imprints, improper roughness, lack of pre-machining, non-compatibility of the electroplating, and faults on the surface. Analysis of the ball bushing also revealed defects such as failure to keep the dimensional tolerance, dents and imprints, improper surface roughness, and lack of surface pre-machining, as well as sharp edges and splitting of the material.

  8. Analysis of queues methods and applications

    CERN Document Server

    Gautam, Natarajan

    2012-01-01

    Introduction; Analysis of Queues: Where, What, and How?; Systems Analysis: Key Results; Queueing Fundamentals and Notations; Psychology in Queueing; Reference Notes; Exercises. Exponential Interarrival and Service Times: Closed-Form Expressions; Solving Balance Equations via Arc Cuts; Solving Balance Equations Using Generating Functions; Solving Balance Equations Using Reversibility; Reference Notes; Exercises. Exponential Interarrival and Service Times: Numerical Techniques and Approximations; Multidimensional Birth and Death Chains; Multidimensional Markov Chains; Finite-State Markov Chains; Reference Notes; Exercises

  9. The simulation study of three typical time frequency analysis methods

    Directory of Open Access Journals (Sweden)

    Li Yifeng

    2017-01-01

    Full Text Available The principles and characteristics of three typical time-frequency analysis methods, the short-time Fourier transform, the wavelet transform, and the Hilbert-Huang transform, are introduced. Their mathematical definitions, characteristics, and ranges of application are pointed out, and their local time-frequency performance is analyzed and compared through computer programming and simulation.
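
    As a toy illustration of the short-time Fourier idea only (not the article's simulations), the sketch below slices a two-tone signal into frames, takes a naive DFT of each frame, and tracks the dominant frequency per frame; the frequencies and frame sizes are chosen so each frame holds whole cycles and the peaks are exact:

```python
import cmath
import math

def dft_mag(frame):
    """Naive O(N^2) DFT magnitudes of one frame, positive bins only
    (illustration; a real implementation would use an FFT and a window)."""
    n = len(frame)
    return [abs(sum(frame[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) for k in range(n // 2)]

fs, frame_len = 128, 64
# 8 Hz tone for the first second, 24 Hz tone for the second
signal = [math.sin(2 * math.pi * (8 if t < fs else 24) * t / fs)
          for t in range(2 * fs)]

dominant_hz = []
for start in range(0, len(signal), frame_len):
    mags = dft_mag(signal[start:start + frame_len])
    k = max(range(len(mags)), key=mags.__getitem__)  # strongest bin
    dominant_hz.append(k * fs / frame_len)           # bin -> Hz
```

    The frame-by-frame peak tracks the changing tone, which is the localization in time that a plain Fourier transform of the whole signal cannot provide.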

  10. Methods of Fourier analysis and approximation theory

    CERN Document Server

    Tikhonov, Sergey

    2016-01-01

    Different facets of the interplay between harmonic analysis and approximation theory are covered in this volume. The topics included are Fourier analysis, function spaces, optimization theory, partial differential equations, and their links to modern developments in approximation theory. The articles of this collection originated from two events. The first event took place during the 9th ISAAC Congress in Krakow, Poland, 5th-9th August 2013, in the section “Approximation Theory and Fourier Analysis”. The second event was the conference on Fourier Analysis and Approximation Theory at the Centre de Recerca Matemàtica (CRM), Barcelona, during 4th-8th November 2013, organized by the editors of this volume. All articles selected for this collection were carefully reviewed.

  11. SAGE Research Methods Datasets: A Data Analysis Educational Tool.

    Science.gov (United States)

    Vardell, Emily

    2016-01-01

    SAGE Research Methods Datasets (SRMD) is an educational tool designed to offer users the opportunity to obtain hands-on experience with data analysis. Users can search for and browse authentic datasets by method, discipline, and data type. Each dataset is supplemented with educational material on the research method and clear guidelines for how to approach data analysis.

  12. Wavelet methods in mathematical analysis and engineering

    CERN Document Server

    Damlamian, Alain

    2010-01-01

    This book gives a comprehensive overview of both the fundamentals of wavelet analysis and related tools and of the most active recent developments towards applications. It offers a state-of-the-art account of several active areas of research where wavelet ideas, or more generally multiresolution ideas, have proved particularly effective. The main applications covered are in the numerical analysis of PDEs and in signal and image processing. Recently introduced techniques, such as Empirical Mode Decomposition (EMD), and new trends in the recovery of missing data, such as compressed sensing, are also presented.

  13. Game data analysis tools and methods

    CERN Document Server

    Coupart, Thibault

    2013-01-01

    This book features an introduction to the basic theoretical tenets of data analysis from a game developer's point of view, as well as a practical guide to performing gameplay analysis on a real-world game. This book is ideal for video game developers who want to experiment with the game analytics approach for their own productions. It will provide a good overview of the themes you need to pay attention to and will pave the way for success. Furthermore, the book also provides a wide range of concrete examples that will be useful for any game data analysts or scientists who want to improve

  14. Statistical Smoothing Methods and Image Analysis

    Science.gov (United States)

    1988-12-01


  15. Analysis of Two Methods to Evaluate Antioxidants

    Science.gov (United States)

    Tomasina, Florencia; Carabio, Claudio; Celano, Laura; Thomson, Leonor

    2012-01-01

    This exercise is intended to introduce undergraduate biochemistry students to the analysis of antioxidants as a biotechnological tool. In addition, some statistical resources will also be used and discussed. Antioxidants play an important metabolic role, preventing oxidative stress-mediated cell and tissue injury. Knowing the antioxidant content…

  16. Mixed Methods Analysis of Enterprise Social Networks

    DEFF Research Database (Denmark)

    Behrendt, Sebastian; Richter, Alexander; Trier, Matthias

    2014-01-01

    The increasing use of enterprise social networks (ESN) generates vast amounts of data, giving researchers and managerial decision makers unprecedented opportunities for analysis. However, more transparency about the available data dimensions and how these can be combined is needed to yield accurate...

  17. Root selection methods in flood analysis

    Directory of Open Access Journals (Sweden)

    B. Parmentier

    2003-01-01

    Full Text Available In the 1970s, de Laine developed a root-matching procedure for estimating unit hydrograph ordinates from estimates of the fast component of the total runoff from multiple storms. Later, Turner produced a root selection method which required only data from one storm event and was based on recognising a pattern typical of unit hydrograph roots. Both methods required direct runoff data, i.e. prior separation of the slow response. This paper introduces a further refinement, called root separation, which allows the estimation of both the unit hydrograph ordinates and the effective precipitation from the full discharge hydrograph. It is based on recognising and separating the quicker component of the response from the much slower components due to interflow and/or baseflow. The method analyses the z-transform roots of carefully selected segments of the full hydrograph. The root patterns of these separate segments tend to be dominated by either the fast response or the slow response. This paper shows how their respective time-scales can be distinguished with an accuracy sufficient for practical purposes. As an illustration, theoretical equations are derived for a conceptual rainfall-runoff system with the input split between fast and slow reservoirs in parallel. These are solved analytically to identify the reservoir constants and the input splitting parameter. The proposed method, called 'root separation', avoids the subjective selection of rainfall roots in the Turner method as well as the subjective matching of roots in the original de Laine method. Keywords: unit hydrograph, identification methods, z-transform, polynomial roots, root separation, fast and slow response, Nash cascade
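
    The claim that z-transform roots carry the fast and slow time-scales can be checked on a constructed example: for a recession made of two parallel linear reservoirs, h[n] = A·a^n + B·b^n, the ordinates satisfy a second-order recurrence whose characteristic roots are exactly the two reservoir constants. A hedged Python sketch (illustrative constants, not the paper's data):

```python
def recession_roots(h):
    """Fit h[n] = c1*h[n-1] + c2*h[n-2] from four consecutive ordinates,
    then return the roots of z^2 - c1*z - c2 (the two reservoir constants)."""
    # Solve the 2x2 linear system for (c1, c2) by Cramer's rule:
    #   h[2] = c1*h[1] + c2*h[0]
    #   h[3] = c1*h[2] + c2*h[1]
    det = h[1] * h[1] - h[0] * h[2]
    c1 = (h[2] * h[1] - h[3] * h[0]) / det
    c2 = (h[1] * h[3] - h[2] * h[2]) / det
    # Roots of z^2 - c1*z - c2 via the quadratic formula
    disc = (c1 * c1 + 4 * c2) ** 0.5
    return sorted([(c1 - disc) / 2, (c1 + disc) / 2])

# Fast (a = 0.5) and slow (b = 0.9) reservoirs, unit amplitude each
a, b = 0.5, 0.9
h = [a ** n + b ** n for n in range(4)]
fast, slow = recession_roots(h)
```

    With noisy field data the roots scatter rather than land exactly on the reservoir constants, which is why the paper's root-separation step over carefully selected hydrograph segments is needed.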

  18. Review of Methods of Critical Discourse Analysis (second edition)

    Institute of Scientific and Technical Information of China (English)

    邵晓巍

    2014-01-01

    This article, through an overview of Methods of Critical Discourse Analysis (second edition), introduces six methodologies for doing Critical Discourse Analysis. It is hoped that this article will help those interested in CDA find their way into this field.

  19. Statistical methods for astronomical data analysis

    CERN Document Server

    Chattopadhyay, Asis Kumar

    2014-01-01

    This book introduces “Astrostatistics” as a subject in its own right with rewarding examples, including work by the authors with galaxy and Gamma Ray Burst data to engage the reader. This includes a comprehensive blending of Astrophysics and Statistics. The first chapter’s coverage of preliminary concepts and terminologies for astronomical phenomenon will appeal to both Statistics and Astrophysics readers as helpful context. Statistics concepts covered in the book provide a methodological framework. A unique feature is the inclusion of different possible sources of astronomical data, as well as software packages for converting the raw data into appropriate forms for data analysis. Readers can then use the appropriate statistical packages for their particular data analysis needs. The ideas of statistical inference discussed in the book help readers determine how to apply statistical tests. The authors cover different applications of statistical techniques already developed or specifically introduced for ...

  20. MULTIPHYSICAL ANALYSIS METHODS OF TRANSPORT MACHINES

    Directory of Open Access Journals (Sweden)

    L. Avtonomova

    2009-01-01

    Full Text Available A complex of theoretical, computational, and applied questions concerning transport machine elements is studied. Coupled-field analyses are useful for solving problems where the coupled interaction of phenomena from various disciplines of physical science is significant. There are basically three methods of coupling, distinguished by the finite element formulation techniques used to develop the matrix equations.

  1. Simplified Processing Method for Meter Data Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Fowler, Kimberly M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Colotelo, Alison H. A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Downs, Janelle L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ham, Kenneth D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Henderson, Jordan W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Montgomery, Sadie A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Vernon, Christopher R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Parker, Steven A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-11-01

    A simple, quick metered-data processing method that can be used for Army Metered Data Management System (MDMS) and Logistics Innovation Agency data, but may also be useful for other large data sets. It is intended for large data sets when the analyst has little information about the buildings.

  2. Phase analysis method for burst onset prediction

    Science.gov (United States)

    Stellino, Flavio; Mazzoni, Alberto; Storace, Marco

    2017-02-01

    The response of bursting neurons to fluctuating inputs is usually hard to predict, due to their strong nonlinearity. For the same reason, decoding the injected stimulus from the activity of a bursting neuron is generally difficult. In this paper we propose a method describing (for neuron models) a mechanism of phase coding relating the burst onsets to the phase profile of the input current. This relation suggests that burst onset may provide a way for postsynaptic neurons to track the input phase. Moreover, we define a method of phase decoding to solve the inverse problem and estimate the likelihood of burst onset given the input state. Both methods are presented here in a unified framework, describing a complete coding-decoding procedure. This procedure is tested by using different neuron models, stimulated with different inputs (stochastic, sinusoidal, up and down states). The results obtained show the efficacy and broad range of application of the proposed methods. Possible applications range from the study of sensory information processing, in which phase-of-firing codes are known to play a crucial role, to clinical applications such as deep brain stimulation, helping to design stimuli in order to trigger or prevent neural bursting.
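The phase-coding mechanism itself is specific to the neuron models studied in the paper, but the core intuition can be illustrated with a much simpler, hypothetical sketch: events triggered at an upward threshold crossing of a sinusoidal input always occur at (nearly) the same input phase, which is what makes onset times informative about the stimulus phase. All names and parameter values below are illustrative assumptions, not the authors' model:

```python
import math

def onset_phases(freq=2.0, fs=1000.0, theta=0.5, duration=10.0):
    """Return the input phase (radians) at each upward crossing of the
    threshold `theta` by a sinusoidal input current sin(2*pi*freq*t).
    Crossings play the role of burst onsets in this toy picture."""
    n = int(duration * fs)
    phases = []
    prev = 0.0                                   # sin(0)
    for i in range(1, n):
        cur = math.sin(2 * math.pi * freq * i / fs)
        if prev < theta <= cur:                  # upward crossing -> "onset"
            phases.append((2 * math.pi * freq * i / fs) % (2 * math.pi))
        prev = cur
    return phases

phases = onset_phases()
# Every "onset" occurs at (nearly) the same input phase, close to asin(theta),
# so onset times alone carry phase information about the stimulus.
spread = max(phases) - min(phases)
```

Reading the onset times back against the input therefore recovers the phase, which is the decoding direction of the paper in its simplest possible form.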

  3. An Autoregressive Method for Simulation Output Analysis.

    Science.gov (United States)

    1982-12-01

    … the precision of point estimates can be approximated arbitrarily closely by the spectral density function at zero of a finite-order autoregressive process … [the report] also develops some approximation theorems for continuous spectral density functions. It is then demonstrated that a continuous spectral density function …

  4. Analysis of the sweeped actuator line method

    Directory of Open Access Journals (Sweden)

    Nathan Jörn

    2015-01-01

    The actuator line method made it possible to describe the near wake of a wind turbine more accurately than with the actuator disk method. Whereas the actuator line generates the helicoidal vortex system shed from the blade tips, the actuator disk method sheds a vortex sheet from the edge of the rotor plane. But with the actuator line also come temporal and spatial constraints, such as the need for a much smaller time step than with the actuator disk. While the latter only has to obey the Courant-Friedrichs-Lewy condition, the former is also restricted by the grid resolution and the rotor tip speed. Additionally, the spatial resolution has to be finer for the actuator line than for the actuator disk, in order to resolve the tip vortices well. Therefore this work examines a method in between the actuator line and the actuator disk, which is able to model transient behaviour, such as the rotating blades, but which also relaxes the temporal constraint. A larger time step is used and the blade forces are swept over a certain area. The main focus of this article is on blade tip vortex generation in comparison with the standard actuator line and actuator disk.

  5. Evaluating Temporal Analysis Methods Using Residential Burglary Data

    Directory of Open Access Journals (Sweden)

    Martin Boldt

    2016-08-01

    Law enforcement agencies, as well as researchers, rely on temporal analysis methods in many crime analyses, e.g., spatio-temporal analyses. A number of temporal analysis methods are being used, but a structured comparison in different configurations is yet to be done. This study aims to fill this research gap by comparing the accuracy of five existing, and one novel, temporal analysis methods in approximating offense times for residential burglaries that often lack precise time information. The temporal analysis methods are evaluated in eight different configurations with varying temporal resolution, as well as the amount of data (number of crimes) available during analysis. A dataset of all Swedish residential burglaries reported between 2010 and 2014 is used (N = 103,029). From that dataset, a subset of burglaries with known precise offense times is used for evaluation. The accuracy of the temporal analysis methods in approximating the distribution of burglaries with known precise offense times is investigated. The aoristic and the novel aoristic ext methods perform significantly better than three of the traditional methods. Experiments show that the novel aoristic ext method was most suitable for estimating crime frequencies in the day-of-the-year temporal resolution when reduced numbers of crimes were available during analysis. In the other configurations investigated, the aoristic method showed the best results. The results also show the potential of temporal analysis methods in approximating the temporal distributions of residential burglaries in situations when limited data are available.
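As a sketch of the basic aoristic idea referenced above (not the authors' extended variant): an offense whose time is known only to an interval contributes fractional weight to every time unit the interval covers. The function and the example intervals below are illustrative assumptions:

```python
def aoristic_counts(intervals, n_bins=24):
    """Basic aoristic analysis: an offense known only to have occurred
    within [start, end) (whole hours, end > start) spreads a total weight
    of 1 uniformly over the hour-of-day bins it covers."""
    counts = [0.0] * n_bins
    for start, end in intervals:
        hours = list(range(int(start), int(end)))  # hour bins covered
        w = 1.0 / len(hours)                       # equal share per bin
        for h in hours:
            counts[h % n_bins] += w                # % wraps past midnight
    return counts

# One burglary sometime between 20:00 and 22:00 (weight 0.5 per hour),
# one known to the hour 21:00-22:00 (full weight in one bin).
counts = aoristic_counts([(20, 22), (21, 22)])
```

Summing the fractional weights per bin yields the estimated temporal distribution; the total weight always equals the number of crimes.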

  6. Analysis of Vibration Diagnostics Methods for Induction Motors

    Directory of Open Access Journals (Sweden)

    A. Kalinov

    2012-01-01

    The paper presents an analysis of existing vibration diagnostics methods. In order to evaluate the efficiency of method application, the following criteria have been proposed: volume of input data required for establishing a diagnosis, data content, software and hardware level, and execution time for vibration diagnostics. According to the mentioned criteria, a classification of vibration diagnostics methods is presented for determining their advantages and disadvantages and for guiding their development and improvement. The paper contains a comparative estimation of the methods in accordance with the proposed criteria. According to this estimation, the most efficient methods are spectral analysis and spectral analysis of the vibration signal envelope.
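Envelope spectral analysis is usually implemented with a Hilbert transform and an FFT; the sketch below deliberately substitutes simpler stand-ins (rectification plus a moving average for the envelope, a plain DFT for the spectrum) to show the principle: a low-frequency modulation of a higher-frequency carrier appears as a peak in the envelope spectrum. The signal parameters are hypothetical:

```python
import math

fs, n = 400, 400                     # 1 s of data at 400 Hz (assumed values)
# 50 Hz "mechanical" carrier, amplitude-modulated at a 5 Hz "fault" rate.
x = [(1 + 0.8 * math.sin(2 * math.pi * 5 * i / fs))
     * math.sin(2 * math.pi * 50 * i / fs) for i in range(n)]

# Envelope: rectify, then 8-sample moving average (at fs = 400 Hz this
# average has nulls at multiples of 50 Hz, removing carrier residue).
rect = [abs(v) for v in x]
env = [sum(rect[i:i + 8]) / 8 for i in range(n - 8)]
mean = sum(env) / len(env)
env = [v - mean for v in env]

def dft_mag(sig, k):
    """Magnitude of DFT bin k (plain DFT; an FFT would be used in practice)."""
    m = len(sig)
    re = sum(sig[i] * math.cos(2 * math.pi * k * i / m) for i in range(m))
    im = sum(sig[i] * math.sin(2 * math.pi * k * i / m) for i in range(m))
    return math.hypot(re, im)

m = len(env)
# Search the envelope spectrum below 20 Hz; bin k corresponds to k * fs / m Hz.
peak_bin = max(range(1, int(20 * m / fs)), key=lambda k: dft_mag(env, k))
peak_hz = peak_bin * fs / m          # lands near the 5 Hz modulation rate
```

In bearing diagnostics the same idea exposes fault repetition rates that are invisible in the raw spectrum because they only modulate the resonance carrier.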

  7. Updated Methods for Seed Shape Analysis

    Directory of Open Access Journals (Sweden)

    Emilio Cervantes

    2016-01-01

    Morphological variation in seed characters includes differences in seed size and shape. Seed shape is an important trait in plant identification and classification. In addition, it has agronomic importance because it reflects genetic, physiological, and ecological components and affects yield, quality, and market price. The use of digital technologies, together with the development of quantification and modeling methods, allows a better description of seed shape. Image processing systems are used in the automatic determination of seed size and shape, becoming a basic tool in the study of diversity. Seed shape is determined by a variety of indexes (circularity, roundness, and J index). The comparison of seed images to a geometrical figure (circle, cardioid, ellipse, ellipsoid, etc.) provides a precise quantification of shape. The methods of shape quantification based on these models are useful for an accurate description, allowing comparison between genotypes or along developmental phases, as well as establishing the level of variation in different sets of seeds.
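One of the indexes mentioned, circularity, is commonly computed as 4πA/P²; a minimal sketch for a polygonal contour (using the shoelace formula for area, an assumption about how the contour is represented) might look like:

```python
import math

def circularity(poly):
    """Circularity index 4*pi*A / P**2 for a closed polygon given as
    (x, y) vertices. Equals 1 for a perfect circle and decreases for
    elongated or irregular shapes."""
    n = len(poly)
    area = 0.0      # accumulated twice the signed area (shoelace formula)
    perim = 0.0
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        area += x1 * y2 - x2 * y1
        perim += math.hypot(x2 - x1, y2 - y1)
    return 4 * math.pi * (abs(area) / 2) / perim ** 2

square = [(0, 0), (1, 0), (1, 1), (0, 1)]
circle64 = [(math.cos(2 * math.pi * k / 64), math.sin(2 * math.pi * k / 64))
            for k in range(64)]
# circularity(square) -> pi/4 ~ 0.785; circularity(circle64) -> close to 1
```

By the isoperimetric inequality the index never exceeds 1, which makes it a convenient normalized descriptor for comparing seed lots.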

  8. [Anorexia nervosa: study method and sleep analysis].

    Science.gov (United States)

    Cervera, S; Zapata, R; Gual, P; Quintanilla, B

    1989-01-01

    Anorexia nervosa was studied with an Integrated Inventory, and the quality and quantity of sleep were assessed, applying Hauri's scale for the analysis of dream contents; the sleeping habits of 50 anorexic patients under treatment have been studied. The results show that sleep in these patients is similar to, and sometimes better in quantity and quality than, that of the control group. Their dreams are characterized by an almost total absence of sexual, aggressive and alimentary contents; reality, active participation, unpleasant feelings and sensory-perceptive elements are predominant.

  9. Extrudate Expansion Modelling through Dimensional Analysis Method

    DEFF Research Database (Denmark)

    A new model framework is proposed to correlate extrudate expansion and extrusion operation parameters for a food extrusion cooking process through the dimensional analysis principle, i.e. the Buckingham pi theorem. Three dimensionless groups, i.e. energy, water content and temperature, are suggested to describe the extrudate expansion. From the three dimensionless groups, an equation with three experimentally determined parameters is derived to express the extrudate expansion. The model is evaluated with whole wheat flour and aquatic feed extrusion experimental data. The average deviations...
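The abstract's three experimentally determined parameters are not given here, so the sketch below only illustrates the general recipe behind such a model: assume a power-law form E = k·Π1^a·Π2^b·Π3^c in the three dimensionless groups, take logarithms, and recover the exponents from a small designed set of runs. All numbers are synthetic, not data from the paper:

```python
import math

def fit_power_law(runs):
    """Recover (k, a, b, c) for E = k * p1**a * p2**b * p3**c from four
    designed runs: a baseline at (1, 1, 1) and one run doubling each
    dimensionless group in turn."""
    e0, e1, e2, e3 = runs            # E at (1,1,1), (2,1,1), (1,2,1), (1,1,2)
    k = e0                           # at (1,1,1) every power term equals 1
    a = math.log(e1 / e0) / math.log(2)
    b = math.log(e2 / e0) / math.log(2)
    c = math.log(e3 / e0) / math.log(2)
    return k, a, b, c

# Synthetic "measurements" generated from known, hypothetical exponents.
true_k, true_a, true_b, true_c = 1.7, 0.5, -0.3, 1.2
model = lambda p1, p2, p3: true_k * p1**true_a * p2**true_b * p3**true_c
runs = (model(1, 1, 1), model(2, 1, 1), model(1, 2, 1), model(1, 1, 2))
k, a, b, c = fit_power_law(runs)
```

With noisy real data one would instead fit the log-linear model by least squares over all runs, but the designed-experiment version shows why three dimensionless groups need only a handful of runs to pin down the exponents.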

  10. Casting defects analysis by the Pareto method

    Directory of Open Access Journals (Sweden)

    B. Borowiecki

    2011-07-01

    On the basis of the received results, a Pareto-Lorenz diagram was formed. The graph shows that about 70% of the total number of casting defects is accounted for by three defect types, arising from three types of causes: sand holes, porosity and slag inclusions. These defects show that it is necessary to take up the design of the gating system. The remaining 8 causes concerned only about 25% of the total number of casting defects. Analysis of the received results permits determining the direction of corrective actions in order to eliminate or limit the most frequent defects.
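The "vital few" computation behind a Pareto-Lorenz diagram can be sketched as follows; the defect tallies are hypothetical numbers chosen to mirror the abstract's finding, not data from the paper:

```python
def pareto(counts, threshold=0.70):
    """Sort causes by frequency and return the 'vital few' that together
    account for at least `threshold` of all occurrences, plus the
    cumulative share actually reached."""
    total = sum(counts.values())
    ranked = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)
    vital, cum = [], 0.0
    for cause, n in ranked:
        vital.append(cause)
        cum += n / total
        if cum >= threshold:
            break
    return vital, cum

# Hypothetical defect tallies; sand holes, porosity and slag inclusions
# dominate, as in the study's finding.
defects = {"sand holes": 30, "porosity": 25, "slag inclusions": 16,
           "cold shuts": 8, "misruns": 7, "shrinkage": 6,
           "hot tears": 4, "flash": 2, "swell": 1, "drops": 1}
vital, share = pareto(defects)
```

Plotting the ranked bars with the cumulative-share line yields the familiar Pareto chart; corrective action is then targeted at the few causes returned in `vital`.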

  11. Decision making based on data analysis methods

    OpenAIRE

    Sirola, Miki; Sulkava, Mika

    2016-01-01

    This technical report is based on four of our recent articles: "Data fusion of pre-election gallups and polls for improved support estimates", "Analyzing parliamentary elections based on voting advice application data", "The Finnish car rejection reasons shown in an interactive SOM visualization tool", and "Network visualization of car inspection data using graph layout". Neural methods are applied in political and technical decision making. We introduce decision support schemes based on Self-Org...

  12. Geometrical Methods for Power Network Analysis

    CERN Document Server

    Bellucci, Stefano; Gupta, Neeraj

    2013-01-01

    This book is a short introduction to power system planning and operation using advanced geometrical methods. The approach is based on well-known insights and techniques developed in theoretical physics in the context of Riemannian manifolds. The proof of principle and robustness of this approach is examined in the context of the IEEE 5 bus system. This work addresses applied mathematicians, theoretical physicists and power engineers interested in novel mathematical approaches to power network theory.

  13. Paired comparisons analysis: an axiomatic approach to ranking methods

    NARCIS (Netherlands)

    Gonzalez-Diaz, J.; Hendrickx, Ruud; Lohmann, E.R.M.A.

    2014-01-01

    In this paper we present an axiomatic analysis of several ranking methods for general tournaments. We find that the ranking method obtained by applying maximum likelihood to the (Zermelo-)Bradley-Terry model, the most common method in statistics and psychology, is one of the ranking methods that per

  14. Enhancing Quantum Dots for Bioimaging using Advanced Surface Chemistry and Advanced Optical Microscopy: Application to Silicon Quantum Dots (SiQDs).

    Science.gov (United States)

    Cheng, Xiaoyu; Hinde, Elizabeth; Owen, Dylan M; Lowe, Stuart B; Reece, Peter J; Gaus, Katharina; Gooding, J Justin

    2015-10-28

    Fluorescence lifetime imaging microscopy is successfully demonstrated in both one- and two-photon cases with surface modified, nanocrystalline silicon quantum dots in the context of bioimaging. The technique is further demonstrated in combination with Förster resonance energy transfer studies where the color of the nanoparticles is tuned by using organic dye acceptors directly conjugated onto the nanoparticle surface.

  15. The Constant Comparative Method of Qualitative Analysis

    Directory of Open Access Journals (Sweden)

    Barney G. Glaser, Ph.D.

    2008-11-01

    Currently, the general approaches to the analysis of qualitative data are these: 1. If the analyst wishes to convert qualitative data into crudely quantifiable form so that he can provisionally test a hypothesis, he codes the data first and then analyzes it. He makes an effort to code “all relevant data [that] can be brought to bear on a point,” and then systematically assembles, assesses and analyzes these data in a fashion that will “constitute proof for a given proposition.” 2. If the analyst wishes only to generate theoretical ideas (new categories and their properties, hypotheses and interrelated hypotheses), he cannot be confined to the practice of coding first and then analyzing the data since, in generating theory, he is constantly redesigning and reintegrating his theoretical notions as he reviews his material. Analysis is done with his purpose in mind, but the explicit coding itself often seems an unnecessary, burdensome task. As a result, the analyst merely inspects his data for new properties of his theoretical categories, and writes memos on these properties. We wish to suggest a third approach.

  16. Radiation analysis devices, radiation analysis methods, and articles of manufacture

    Science.gov (United States)

    Roybal, Lyle Gene

    2010-06-08

    Radiation analysis devices include circuitry configured to determine respective radiation count data for a plurality of sections of an area of interest and combine the radiation count data of individual of sections to determine whether a selected radioactive material is present in the area of interest. An amount of the radiation count data for an individual section is insufficient to determine whether the selected radioactive material is present in the individual section. An article of manufacture includes media comprising programming configured to cause processing circuitry to perform processing comprising determining one or more correction factors based on a calibration of a radiation analysis device, measuring radiation received by the radiation analysis device using the one or more correction factors, and presenting information relating to an amount of radiation measured by the radiation analysis device having one of a plurality of specified radiation energy levels of a range of interest.

  17. Vulnerability analysis of three remote voting methods

    CERN Document Server

    Enguehard, Chantal

    2009-01-01

    This article analyses three methods of remote voting in an uncontrolled environment: postal voting, internet voting and hybrid voting. It breaks down the voting process into different stages and compares their vulnerabilities considering criteria that must be respected in any democratic vote: confidentiality, anonymity, transparency, vote unicity and authenticity. Whether for safety or reliability, each vulnerability is quantified by three parameters: size, visibility and difficulty to achieve. The study concludes that the automation of processing, combined with the dematerialisation of the objects used during an election, tends to replace visible vulnerabilities of lesser magnitude with invisible and widespread ones.

  18. Tournament Methods for WLAN: Analysis and Efficiency

    Science.gov (United States)

    Galtier, Jérôme

    In the context of radio distributed networks, we present a generalized approach for Medium Access Control (MAC) with a fixed congestion window. Our protocol is quite simple to analyze and can be used in a lot of different situations. We give mathematical evidence showing that our performance is asymptotically tight. We also place ourselves in the WiFi and WiMAX frameworks, and discuss experimental results showing a collision reduction of 14% to 21% compared to the best-known methods. We discuss channel capacity improvement and fairness considerations.

  19. Surface analysis methods in materials science

    CERN Document Server

    Sexton, Brett; Smart, Roger

    1992-01-01

    The idea for this book stemmed from a remark by Philip Jennings of Murdoch University in a discussion session following a regular meeting of the Australian Surface Science group. He observed that a text on surface analysis and applications to materials suitable for final year undergraduate and postgraduate science students was not currently available. Furthermore, the members of the Australian Surface Science group had the research experience and range of coverage of surface analytical techniques and applications to provide a text for this purpose. A list of techniques and applications to be included was agreed at that meeting. The intended readership of the book has been broadened since the early discussions, particularly to encompass industrial users, but there has been no significant alteration in content. The editors, in consultation with the contributors, have agreed that the book should be prepared for four major groups of readers: - senior undergraduate students in chemistry, physics, metallur...

  20. Experimental and analysis methods in radiochemical experiments

    Science.gov (United States)

    Cattadori, C. M.; Pandola, L.

    2016-04-01

    Radiochemical experiments made the history of neutrino physics by achieving the first observation of solar neutrinos (Cl experiment) and the first detection of the fundamental pp solar neutrino component (Ga experiments). They measured over decades the integral νe charged current interaction rate in the exposed target. The basic operation principle is the chemical separation of the few atoms of the new chemical species produced by the neutrino interactions from the rest of the target, and their individual counting in a low-background counter. The smallness of the expected interaction rate (1 event per day in a ~100 ton target) poses severe experimental challenges on the chemical and on the counting procedures. The main aspects related to the analysis techniques employed in solar neutrino experiments are reviewed and described, with a special focus given to the event selection and the statistical data treatment.

  2. Stochastic back analysis of permeability coefficient using generalized Bayesian method

    Institute of Scientific and Technical Information of China (English)

    Zheng Guilan; Wang Yuan; Wang Fei; Yang Jian

    2008-01-01

    Owing to the fact that the conventional deterministic back analysis of the permeability coefficient cannot reflect the uncertainties of parameters, including the hydraulic head at the boundary, the permeability coefficient and the measured hydraulic head, a stochastic back analysis taking these parameter uncertainties into consideration was performed using the generalized Bayesian method. Based on the stochastic finite element method (SFEM) for a seepage field, the variable metric algorithm and the generalized Bayesian method, formulas for stochastic back analysis of the permeability coefficient were derived. A case study of seepage analysis of a sluice foundation is presented to illustrate the proposed method. The results indicate that, with the generalized Bayesian method, which considers the uncertainties of the measured hydraulic head, the permeability coefficient and the hydraulic head at the boundary, both the mean and the standard deviation of the permeability coefficient can be obtained, and the standard deviation is less than that obtained by the conventional Bayesian method. Therefore, the present method is valid and applicable.
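The paper's generalized Bayesian method is coupled to a stochastic finite element model of the seepage field; the minimal conjugate-normal sketch below (on a log-permeability parameter, with hypothetical numbers) only illustrates the qualitative conclusion: combining a prior with uncertain measurements yields both a posterior mean and a reduced standard deviation:

```python
def normal_posterior(mu0, s0, obs, s_noise):
    """Conjugate update of a normal prior N(mu0, s0^2) with independent
    observations y_i ~ N(theta, s_noise^2). Returns posterior mean and
    standard deviation; precisions (1/variance) simply add."""
    prec = 1 / s0**2 + len(obs) / s_noise**2
    mean = (mu0 / s0**2 + sum(obs) / s_noise**2) / prec
    return mean, prec ** -0.5

# Hypothetical prior: log10-permeability -5 with std 1; three noisy
# back-analysed values with observation std 0.5 (all illustrative).
mu, sd = normal_posterior(-5.0, 1.0, [-4.6, -4.8, -4.5], 0.5)
# The posterior std is smaller than the prior std: uncertainty shrinks
# as uncertain measurements are folded in.
```

The real method replaces this scalar likelihood with the SFEM seepage model and handles correlated uncertainties, but the precision-adding structure of the update is the same.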

  3. Comparative Study Among Least Square Method, Steepest Descent Method, and Conjugate Gradient Method for Atmospheric Sounder Data Analysis

    Directory of Open Access Journals (Sweden)

    Kohei Arai

    2013-09-01

    A comparative study among the Least Square Method (LSM), the Steepest Descent Method (SDM), and the Conjugate Gradient Method (CGM) for atmospheric sounder data analysis (estimation of vertical profiles of water vapor) is conducted. Through simulation studies, it is found that CGM shows the best estimation accuracy, followed by SDM and LSM. The dependency of each method on atmospheric models is also clarified.
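The retrieval problem itself is not reproduced here; the sketch below merely puts the three solvers side by side on a small linear system (direct least-squares solve, steepest descent, conjugate gradient) so their mechanics can be compared. The matrix and right-hand side are arbitrary illustrative values:

```python
def matvec(A, v):
    """Multiply matrix A by vector v."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def solve_direct(A, b):
    """Direct solve of a 2x2 system via Cramer's rule: the least-squares
    baseline, one linear solve and no iteration."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [(b[0] * A[1][1] - A[0][1] * b[1]) / det,
            (A[0][0] * b[1] - b[0] * A[1][0]) / det]

def steepest_descent(A, b, iters=200):
    """Minimise the quadratic form by stepping along the residual."""
    x = [0.0] * len(b)
    for _ in range(iters):
        r = [bi - yi for bi, yi in zip(b, matvec(A, x))]   # residual
        rs = sum(ri * ri for ri in r)
        if rs < 1e-30:                                     # converged
            break
        alpha = rs / sum(ri * ai for ri, ai in zip(r, matvec(A, r)))
        x = [xi + alpha * ri for xi, ri in zip(x, r)]
    return x

def conjugate_gradient(A, b, iters=2):
    """CG with A-conjugate search directions; exact in 2 steps for 2x2 SPD A."""
    x = [0.0] * len(b)
    r = [bi - yi for bi, yi in zip(b, matvec(A, x))]
    p = r[:]
    rs = sum(ri * ri for ri in r)
    for _ in range(iters):
        if rs < 1e-30:
            break
        Ap = matvec(A, p)
        alpha = rs / sum(pi * ai for pi, ai in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * ai for ri, ai in zip(r, Ap)]
        rs_new = sum(ri * ri for ri in r)
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x

A, b = [[3.0, 1.0], [1.0, 2.0]], [5.0, 5.0]   # SPD toy system, solution (1, 2)
x_direct = solve_direct(A, b)
x_sd = steepest_descent(A, b)
x_cg = conjugate_gradient(A, b)
```

All three reach the same solution on this well-conditioned toy problem; the accuracy differences the paper reports arise on the ill-posed, model-dependent sounder retrieval, which this sketch does not attempt to reproduce.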

  4. Comparison of a general series expansion method and the homotopy analysis method

    CERN Document Server

    Liu, Cheng-shi

    2011-01-01

    A simple analytic tool, namely the general series expansion method, is proposed to find the solutions of nonlinear differential equations. A set of suitable basis functions $\{e_n(t,t_0)\}_{n=0}^{+\infty}$ is chosen such that the solution to the equation can be expressed by $u(t)=\sum_{n=0}^{+\infty}c_ne_n(t,t_0)$. In general, $t_0$ can control and adjust the convergence region of the series solution, so that our method has the same effect as the homotopy analysis method proposed by Liao, but is simpler and clearer. As a result, we show that the secret parameter $h$ in the homotopy analysis method can be explained by using our parameter $t_0$. Therefore, our method reveals a key secret of the homotopy analysis method. For the purpose of comparison with the homotopy analysis method, a typical example is studied in detail.

  5. CHAPTER 7. BERYLLIUM ANALYSIS BY NON-PLASMA BASED METHODS

    Energy Technology Data Exchange (ETDEWEB)

    Ekechukwu, A

    2009-04-20

    The most common method of analysis for beryllium is inductively coupled plasma atomic emission spectrometry (ICP-AES). This method, along with inductively coupled plasma mass spectrometry (ICP-MS), is discussed in Chapter 6. However, other methods exist and have been used for different applications. These methods include spectroscopic, chromatographic, colorimetric, and electrochemical. This chapter provides an overview of beryllium analysis methods other than plasma spectrometry (inductively coupled plasma atomic emission spectrometry or mass spectrometry). The basic methods, detection limits and interferences are described. Specific applications from the literature are also presented.

  6. Modified Homotopy Analysis Method for Zakharov-Kuznetsov Equations

    Directory of Open Access Journals (Sweden)

    Muhammad USMAN

    2013-01-01

    In this paper, we apply the Modified Homotopy Analysis Method (MHAM) to find appropriate solutions of the Zakharov-Kuznetsov equations, which are of utmost importance in the applied and engineering sciences. The proposed modification is an elegant coupling of the Homotopy Analysis Method (HAM) and Taylor's series. Numerical results coupled with graphical representation explicitly reveal the reliability of the proposed algorithm.

  7. 7 CFR 58.812 - Methods of sample analysis.

    Science.gov (United States)

    2010-01-01

    7 CFR 58.812 (Agriculture Regulations of the Department of Agriculture (Continued), Agricultural Marketing Service; 2010-01-01): Methods of sample analysis. Analyses shall follow the methods of the ... Marketing Service, Dairy Programs, or the Official Methods of Analysis of the Association of Official...

  8. Validation of a hybrid life-cycle inventory analysis method.

    Science.gov (United States)

    Crawford, Robert H

    2008-08-01

    The life-cycle inventory analysis step of a life-cycle assessment (LCA) may currently suffer from several limitations, mainly concerned with the use of incomplete and unreliable data sources and methods of assessment. Many past LCA studies have used traditional inventory analysis methods, namely process analysis and input-output analysis. More recently, hybrid inventory analysis methods have been developed, combining these two traditional methods in an attempt to minimise their limitations. In light of recent improvements, these hybrid methods need to be compared and validated, as these too have been considered to have several limitations. This paper evaluates a recently developed hybrid inventory analysis method which aims to improve the limitations of previous methods. It was found that the truncation associated with process analysis can be up to 87%, reflecting the considerable shortcomings in the quantity of process data currently available. Capital inputs were found to account for up to 22% of the total inputs to a particular product. These findings suggest that current best-practice methods are sufficiently accurate for most typical applications, but this is heavily dependent upon data quality and availability. The use of input-output data assists in improving the system boundary completeness of life-cycle inventories. However, the use of input-output analysis alone does not always provide an accurate model for replacing process data. Further improvements in the quantity of process data currently available are needed to increase the reliability of life-cycle inventories.

  9. Analysis of speech waveform quantization methods

    Directory of Open Access Journals (Sweden)

    Tadić Predrag R.

    2008-01-01

    Digitalization, consisting of sampling and quantization, is the first step in any digital signal processing algorithm. In most cases, the quantization is uniform. However, having knowledge of certain stochastic attributes of the signal (namely, the probability density function, or pdf), quantization can be made more efficient, in the sense of achieving a greater signal to quantization noise ratio. This means that narrower channel bandwidths are required for transmitting a signal of the same quality. Alternatively, if signal storage is of interest rather than transmission, considerable savings in memory space can be made. This paper presents several available methods for speech signal pdf estimation, and quantizer optimization in the sense of minimizing the quantization error power.
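A classic pdf-aware design for speech-like (peaked, Laplacian) amplitudes is μ-law companding; the sketch below compares it against a uniform quantizer with the same number of levels by numerically integrating the distortion over an assumed Laplace pdf. All parameters are illustrative, not taken from the paper:

```python
import math

LEVELS, XMAX, B = 8, 10.0, 1.0       # levels, clip range, Laplace scale (assumed)

def quant_uniform(x):
    """Uniform mid-rise quantizer with LEVELS cells on [-XMAX, XMAX]."""
    step = 2 * XMAX / LEVELS
    i = min(LEVELS - 1, max(0, int((x + XMAX) / step)))
    return -XMAX + (i + 0.5) * step

def quant_mulaw(x, mu=255.0):
    """mu-law compand, quantize uniformly in [-1, 1], then expand back."""
    y = math.copysign(math.log(1 + mu * abs(x) / XMAX) / math.log(1 + mu), x)
    step = 2.0 / LEVELS
    i = min(LEVELS - 1, max(0, int((y + 1) / step)))
    yq = -1 + (i + 0.5) * step
    return math.copysign(XMAX * (math.exp(abs(yq) * math.log(1 + mu)) - 1) / mu, yq)

def snr_db(quant, n=20000):
    """SQNR under a Laplace(0, B) pdf, by deterministic midpoint integration."""
    sig = noise = 0.0
    for i in range(n):
        x = -XMAX + (i + 0.5) * (2 * XMAX / n)
        p = math.exp(-abs(x) / B) / (2 * B)        # Laplace density
        sig += p * x * x
        noise += p * (x - quant(x)) ** 2
    return 10 * math.log10(sig / noise)

# The mu-law grid is dense where the Laplacian pdf is peaked, so it wins.
uniform_snr, mulaw_snr = snr_db(quant_uniform), snr_db(quant_mulaw)
```

The same comparison done by drawing random samples would be noisy; integrating the distortion against the assumed pdf makes the ranking deterministic and repeatable.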

  10. New analysis method for passive microrheology

    Science.gov (United States)

    Nishi, Kengo; Schmidt, Christoph; Mackintosh, Fred

    Passive microrheology is an experimental technique used to measure the mechanical response of materials from the fluctuations of micron-sized beads embedded in the medium. Microrheology is well suited to studying the rheological properties of materials that are difficult to obtain in large amounts, and of materials inside single cells. In one common approach, one uses the fluctuation-dissipation theorem to obtain the imaginary part of the material response function from the power spectral density of bead displacement fluctuations, while the real part of the response function is calculated using a Kramers-Kronig integral. The high-frequency cut-off of this integral strongly affects the real part of the response function in the high-frequency region. Here, we discuss how to obtain more accurate values of the real part of the response function by an alternative method using autocorrelation functions.

  11. Analysis of mesoscale forecasts using ensemble methods

    CERN Document Server

    Gross, Markus

    2016-01-01

    Mesoscale forecasts are now routinely performed as elements of operational forecasts and their outputs do appear convincing. However, despite their realistic appearance, the comparison to observations is at times less favorable. At the grid scale these forecasts often do not compare well with observations. This is partly due to the chaotic system underlying the weather. Another key problem is that it is impossible to evaluate the risk of making decisions based on these forecasts, because they do not provide a measure of confidence. Ensembles provide this information in the ensemble spread and quartiles. However, running global ensembles at the meso or sub-mesoscale involves substantial computational resources. National centers do run such ensembles, but the subject of this publication is a method which requires significantly less computation. The ensemble-enhanced mesoscale system presented here does not aim at the creation of an improved mesoscale forecast model. Nor is it to create an improved ensemble syste...

  12. Analysis of the Wing Tsun Punching Methods

    Directory of Open Access Journals (Sweden)

    Jeff Webb

    2012-07-01

    The three punching techniques of Wing Tsun, while few in number, represent an effective approach to striking with the closed fist. At first glance, the rather short stroke of each punch would seem disproportionate to the amount of power it generates. Therefore, this article will discuss the structure and body mechanics of each punch, in addition to the various training methods employed for developing power. Two of the Wing Tsun punches, namely the lifting punch and the hooking punch, are often confused with similar punches found in Western boxing. The key differences between the Wing Tsun and boxing punches, both in form and function, will be discussed. Finally, the strategy for applying the Wing Tsun punches will serve as the greatest factor in differentiating them from the punches of other martial arts styles.

  13. Schedulability Analysis Method of Timing Constraint Petri Nets

    Institute of Scientific and Technical Information of China (English)

    李慧芳; 范玉顺

    2002-01-01

    Timing constraint Petri nets (TCPNs) can be used to model a real-time system specification and to verify the timing behavior of the system. This paper describes the limitations of the reachability analysis method in analyzing complex systems for existing TCPNs. Based on further research on the schedulability analysis method with various topology structures, a more general state reachability analysis method is proposed. To meet various requirements of timely response for actual systems, this paper puts forward a heuristic method for selecting decision-spans of transitions and develops a heuristic algorithm for schedulability analysis of TCPNs. Examples are given showing the practicality of the method in the schedulability analysis for real-time systems with various structures.

  14. Rotordynamic Analysis with Shell Elements for the Transfer Matrix Method

    Science.gov (United States)

    1989-08-01

    (UNCLASSIFIED) Rotordynamic Analysis with Shell Elements for the Transfer Matrix Method (August 1989). [Only report-documentation-page residue survives in this record; no abstract text is recoverable.]

  15. Complex system analysis using CI methods

    Science.gov (United States)

    Fathi, Madjid; Hildebrand, Lars

    1999-03-01

    Modern technical tasks often need the use of complex system models. In many cases the model parameters can be obtained using neural networks, but these systems allow only a one-way simulation from the input values to the learned output values. If evaluation in the other direction is needed, these models allow no direct evaluation. This task can be solved using evolutionary algorithms, which are part of computational intelligence. The term computational intelligence covers three special fields of artificial intelligence: fuzzy logic, artificial neural networks and evolutionary algorithms. We will focus only on evolutionary algorithms and fuzzy logic. Evolutionary algorithms cover the fields of genetic algorithms, evolution strategies and evolutionary programming. These methods can be used to optimize technical problems. Evolutionary algorithms have certain advantages if these problems lack mathematical properties such as continuity or the possibility to obtain derivatives. Fuzzy logic systems normally lack these properties. The use of a combination of evolutionary algorithms and fuzzy logic allows an evaluation of the learned simulation models in the direction from output to input values. An example can be given from the field of screw rotor design.
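The inverse evaluation described above can be sketched with a (1+1) evolution strategy searching for the input that drives a forward model to a desired output; the forward model here is a stand-in smooth function, not a trained network, and all parameters are illustrative:

```python
import random

def forward_model(x):
    """Stand-in for a learned forward model (e.g. a trained network);
    it can be evaluated input -> output but not inverted directly."""
    return x ** 3 + x

def invert(target, iters=5000, seed=1):
    """(1+1) evolution strategy: mutate the input with Gaussian noise and
    keep the mutant whenever it brings the model output closer to the
    target; the mutation step slowly shrinks to refine the answer."""
    rng = random.Random(seed)
    x = 0.0
    sigma = 1.0
    best_err = abs(forward_model(x) - target)
    for _ in range(iters):
        cand = x + rng.gauss(0, sigma)
        err = abs(forward_model(cand) - target)
        if err < best_err:                 # greedy (1+1) selection
            x, best_err = cand, err
        sigma *= 0.999                     # slowly narrow the search
    return x

x = invert(10.0)   # forward_model(2) == 10, so x should approach 2
```

This requires nothing from the model beyond forward evaluation, which is exactly why evolutionary search suits models (neural or fuzzy) that offer no derivatives.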

  16. RESULTS OF THE QUESTIONNAIRE: ANALYSIS METHODS

    CERN Multimedia

    Staff Association

    2014-01-01

    Five-yearly review of employment conditions   Article S V 1.02 of our Staff Rules states that the CERN “Council shall periodically review and determine the financial and social conditions of the members of the personnel. These periodic reviews shall consist of a five-yearly general review of financial and social conditions;” […] “following methods […] specified in § I of Annex A 1”. Then, turning to the relevant part in Annex A 1, we read that “The purpose of the five-yearly review is to ensure that the financial and social conditions offered by the Organization allow it to recruit and retain the staff members required for the execution of its mission from all its Member States. […] these staff members must be of the highest competence and integrity.” And for the menu of such a review we have: “The five-yearly review must include basic salaries and may include any other financial or soc...

  17. Method for detecting software anomalies based on recurrence plot analysis

    OpenAIRE

    Michał Mosdorf

    2012-01-01

The paper evaluates a method for detecting software anomalies based on recurrence plot analysis of trace logs generated during software execution. The described method is based on windowed recurrence quantification analysis for selected measures (e.g. recurrence rate, RR, or determinism, DET). Initial results show that the proposed method is useful in detecting silent software anomalies that do not result in typical crashes (e.g. exceptions).

  18. Method for detecting software anomalies based on recurrence plot analysis

    Directory of Open Access Journals (Sweden)

    Michał Mosdorf

    2012-03-01

Full Text Available The paper evaluates a method for detecting software anomalies based on recurrence plot analysis of trace logs generated during software execution. The described method is based on windowed recurrence quantification analysis for selected measures (e.g. recurrence rate, RR, or determinism, DET). Initial results show that the proposed method is useful in detecting silent software anomalies that do not result in typical crashes (e.g. exceptions).
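The two RQA measures named in this abstract, RR and DET, can be computed directly from a recurrence matrix. The sketch below assumes a plain scalar series and a simple threshold recurrence rule; the epsilon, minimum line length, and test signal are illustrative choices, not the paper's:

```python
import numpy as np

def rqa_measures(x, eps=0.1, lmin=2):
    """Recurrence rate (RR) and determinism (DET) of a scalar series."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # Recurrence matrix: pairs of samples closer than eps are recurrent.
    rec = (np.abs(x[:, None] - x[None, :]) < eps).astype(int)
    total = rec.sum()
    rr = total / n ** 2
    # DET: share of recurrence points lying on diagonal lines >= lmin.
    in_lines = 0
    for k in range(-(n - 1), n):
        run = 0
        for v in list(np.diagonal(rec, offset=k)) + [0]:  # 0 flushes last run
            if v:
                run += 1
            else:
                if run >= lmin:
                    in_lines += run
                run = 0
    det = in_lines / total if total else 0.0
    return rr, det

# A periodic signal is highly deterministic; its recurrence points
# organize into long diagonal lines.
t = np.linspace(0, 8 * np.pi, 200)
rr_sine, det_sine = rqa_measures(np.sin(t), eps=0.2)
```

In the anomaly-detection setting described above, `rqa_measures` would be applied to successive windows of the trace-log signal, and a drift in RR or DET across windows flagged as a potential silent anomaly.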

  19. WORK ANALYSIS IN ORGANIZATIONS – DEFINITION, USES AND METHODS

    Directory of Open Access Journals (Sweden)

    Andrea Valéria Steil

    2016-11-01

Full Text Available Work analysis is a process used to understand what the important tasks of a job are, how they are performed, and what human attributes are necessary to carry them out successfully. Work analysis is an attempt to develop a theory of human behavior about the job in question to support management decisions. This paper defines work analysis, discusses its main uses in organizations, and presents the objects of study and the methods of work analysis. The paper also discusses how work analysis is done, considering the following steps: types of data to be collected, data sources, data collection methods, summary of the information, and work analysis reports. It ends by differentiating work analysis from individual skills modeling and brings arguments to endorse work analysis as an intervention of work and organizational psychology.

  20. Reliability analysis method applied in slope stability: slope prediction and forecast on stability analysis

    Institute of Scientific and Technical Information of China (English)

    Wenjuan ZHANG; Li CHEN; Ning QU; Hai'an LIANG

    2006-01-01

Landslides are a kind of geologic hazard that happens all over the world, bringing huge losses of human life and property; it is therefore very important to research them. This study focused on the combination of single and regional landslide analysis, and of traditional slope stability analysis with reliability analysis methods. Methods for slope prediction and forecast, and for reliability analysis of stability, are also discussed.
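As a rough illustration of the reliability-analysis side (not the paper's model), the probability of slope failure can be estimated by Monte Carlo sampling of the factor of safety (FS), with a first-order reliability index derived from its moments. The FS distribution below is hypothetical:

```python
import random

def prob_of_failure(fs_mean, fs_std, n=100_000, seed=1):
    """Monte Carlo estimate of P(FS < 1), the probability that the factor
    of safety of the slope drops below unity. FS is drawn from a normal
    distribution here purely for illustration; real studies propagate the
    uncertain soil parameters (cohesion, friction angle, ...) instead."""
    rng = random.Random(seed)
    failures = sum(1 for _ in range(n) if rng.gauss(fs_mean, fs_std) < 1.0)
    return failures / n

pf = prob_of_failure(fs_mean=1.3, fs_std=0.15)
# First-order reliability index for a normally distributed FS:
beta = (1.3 - 1.0) / 0.15   # about 2, corresponding to pf near 2.3 %
```

This shows the key difference from deterministic slope analysis: instead of a single FS value, the result is a failure probability and a reliability index.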

  1. Stability and Accuracy Analysis for Taylor Series Numerical Method

    Institute of Scientific and Technical Information of China (English)

    赵丽滨; 姚振汉; 王寿梅

    2004-01-01

    The Taylor series numerical method (TSNM) is a time integration method for solving problems in structural dynamics. In this paper, a detailed analysis of the stability behavior and accuracy characteristics of this method is given. It is proven by a spectral decomposition method that TSNM is conditionally stable and belongs to the category of explicit time integration methods. By a similar analysis, the characteristic indicators of time integration methods, the percentage period elongation and the amplitude decay of TSNM, are derived in a closed form. The analysis plays an important role in implementing a procedure for automatic searching and finding convergence radii of TSNM. Finally, a linear single degree of freedom undamped system is analyzed to test the properties of the method.
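A minimal sketch of Taylor-series time integration for the single-degree-of-freedom undamped system mentioned above: higher time derivatives of u'' = -ω²u follow by repeated differentiation, so an explicit order-p update is straightforward. The order, step size, and test case are illustrative, not taken from the paper:

```python
import math

def taylor_sdof(u0, v0, omega, h, steps, order=4):
    """Explicit Taylor-series time integration of u'' = -omega^2 * u.
    Each extra derivative follows from the equation of motion:
    d[k] = -omega^2 * d[k-2]."""
    u, v = u0, v0
    for _ in range(steps):
        d = [u, v]
        for k in range(2, order + 1):
            d.append(-omega ** 2 * d[k - 2])
        fact = 1.0
        u_new, v_new = 0.0, 0.0
        for k in range(order + 1):
            if k > 0:
                fact *= k
            u_new += d[k] * h ** k / fact          # Taylor series for u
            if k + 1 <= order:
                v_new += d[k + 1] * h ** k / fact  # and for v = u'
        u, v = u_new, v_new
    return u, v

# Integrate one full period of a unit oscillator (T = 1 for omega = 2*pi):
omega = 2.0 * math.pi
u_end, v_end = taylor_sdof(1.0, 0.0, omega, h=0.001, steps=1000)
# u_end should return close to 1.0 and v_end close to 0.0
```

Consistent with the abstract's finding, such a scheme is explicit and only conditionally stable: the product ωh must stay below an order-dependent limit.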

  2. Digital methods of photopeak integration in activation analysis.

    Science.gov (United States)

    Baedecker, P. A.

    1971-01-01

    A study of the precision attainable by several methods of gamma-ray photopeak integration has been carried out. The 'total peak area' method, the methods proposed by Covell, Sterlinski, and Quittner, and some modifications of these methods have been considered. A modification by Wasson of the total peak area method is considered to be the most advantageous due to its simplicity and the relatively high precision obtainable with this technique. A computer routine for the analysis of spectral data from nondestructive activation analysis experiments employing a Ge(Li) detector-spectrometer system is described.
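The 'total peak area' approach the study favors (in Wasson's variant) amounts to summing counts over the peak region and subtracting a baseline estimated from the region's endpoints. The following is a generic textbook version of that idea with a synthetic spectrum, not the author's routine:

```python
import numpy as np

def total_peak_area(counts, lo, hi):
    """'Total peak area' photopeak integration: sum the counts over the
    peak region [lo, hi] and subtract a linear baseline interpolated
    between the first and last channels of the region."""
    region = np.asarray(counts, dtype=float)[lo : hi + 1]
    gross = region.sum()
    baseline = len(region) * (region[0] + region[-1]) / 2.0
    return gross - baseline

# Synthetic spectrum: flat background of 10 counts per channel plus a
# Gaussian photopeak of total area ~752 counts centred on channel 50.
ch = np.arange(100)
spectrum = 10 + 100 * np.exp(-0.5 * ((ch - 50) / 3.0) ** 2)
net = total_peak_area(spectrum, 35, 65)   # expect roughly 752
```

The simplicity is the point made in the abstract: only a window and two endpoint channels are needed, which is why the method is both fast and statistically well-behaved.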

  3. Stochastic Plane Stress Analysis with Elementary Stiffness Matrix Decomposition Method

    Science.gov (United States)

    Er, G. K.; Wang, M. C.; Iu, V. P.; Kou, K. P.

    2010-05-01

In this study, the efficient analytical method named the elementary stiffness matrix decomposition (ESMD) method is further investigated and utilized for the moment evaluation of stochastic plane stress problems, in comparison with the conventional perturbation method in stochastic finite element analysis. In order to evaluate the performance of this method, computer programs were written and numerical results for stochastic plane stress problems were obtained. The numerical analysis shows that the computational efficiency is much increased and the computer memory requirement can be much reduced by using the ESMD method.

  4. Simultaneous realization of Hg2+ sensing, magnetic resonance imaging and upconversion luminescence in vitro and in vivo bioimaging based on hollow mesoporous silica coated UCNPs and ruthenium complex

    Science.gov (United States)

    Ge, Xiaoqian; Sun, Lining; Ma, Binbin; Jin, Di; Dong, Liang; Shi, Liyi; Li, Nan; Chen, Haige; Huang, Wei

    2015-08-01

We have constructed a multifunctional nanoprobe with sensing and imaging properties by using hollow mesoporous silica coated upconversion nanoparticles (UCNPs) and a Hg2+ responsive ruthenium (Ru) complex. The Ru complex was loaded into the hollow mesoporous silica and the UCNPs acted as an energy donor, transferring luminescence energy to the Ru complex. Furthermore, polyethylenimine (PEI) was assembled on the surface of the mesoporous silica to achieve better hydrophilicity and biocompatibility. Upon addition of Hg2+, a blue shift of the absorption peak of the Ru complex is observed and the energy transfer process between the UCNPs and the Ru complex is blocked, resulting in an increase of the green emission intensity of the UCNPs. The unchanged 801 nm emission of the nanoprobe was used as an internal standard reference, and the detection limit of Hg2+ was determined to be 0.16 μM for this nanoprobe in aqueous solution. In addition, based on the low cytotoxicity as studied by CCK-8 assay, the nanoprobe was successfully applied for cell imaging and small animal imaging. Furthermore, when doped with Gd3+ ions, the nanoprobe was successfully applied to in vivo magnetic resonance imaging (MRI) of Kunming mice, which demonstrates its potential as an MRI positive-contrast agent. Therefore, the method and results may provide more exciting opportunities to afford nanoprobes with multimodal bioimaging and multifunctional applications.

  5. Research on the Analysis Method of Micro Concentration of Uranium

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

Spectrophotometric methods are used for the analysis of micro concentrations of uranium in the aqueous and organic phases, in order to test the feasibility of the TBP/OK-dimethylbenzene-TTA method for assaying the organic phase and the concentrated hydrochloric acid-arsenazo III method for assaying the aqueous phase. It is

  6. NONLINEAR DATA RECONCILIATION METHOD BASED ON KERNEL PRINCIPAL COMPONENT ANALYSIS

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

In industrial process settings, principal component analysis (PCA) is a common method for data reconciliation. However, PCA is sometimes unsuited to nonlinear feature analysis and is limited in application to nonlinear industrial processes. Kernel PCA (KPCA) is an extension of PCA that can be used for nonlinear feature analysis. A nonlinear data reconciliation method based on KPCA is proposed. The basic idea of this method is that the original data are first mapped to a high-dimensional feature space by a nonlinear function, and PCA is implemented in that feature space. Nonlinear feature analysis is then carried out and the data are reconstructed by using the kernel. The data reconciliation method based on KPCA is applied to a ternary distillation column. Simulation results show that this method can filter the noise in measurements of a nonlinear process and that the reconciled data represent the true information of the nonlinear process.
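The map/reconstruct loop described here can be sketched with scikit-learn's KernelPCA, which supports an approximate inverse mapping (pre-image) via `fit_inverse_transform`. The toy process, kernel choice, and parameters below are illustrative assumptions, not the paper's distillation-column setup:

```python
import numpy as np
from sklearn.decomposition import KernelPCA

rng = np.random.default_rng(0)
# Toy nonlinear 'process': two measured variables constrained by y = x^2,
# observed with additive measurement noise.
x = rng.uniform(-1.0, 1.0, (300, 1))
clean = np.hstack([x, x ** 2])
noisy = clean + rng.normal(0.0, 0.05, clean.shape)

# Map to feature space, keep the dominant nonlinear components, then map
# back to the measurement space (approximate pre-image) to reconcile.
kpca = KernelPCA(n_components=2, kernel="rbf", gamma=5.0,
                 fit_inverse_transform=True, alpha=1e-3)
scores = kpca.fit_transform(noisy)
reconciled = kpca.inverse_transform(scores)

err_noisy = float(np.mean((noisy - clean) ** 2))
err_rec = float(np.mean((reconciled - clean) ** 2))
```

Linear PCA would need a plane to describe this curved constraint; the kernel version captures it with a few components, which is the motivation stated in the abstract.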

  7. Printing metal-spiked inks for LA-ICP-MS bioimaging internal standardization: comparison of the different nephrotoxic behavior of cisplatin, carboplatin, and oxaliplatin.

    Science.gov (United States)

    Moraleja, Irene; Esteban-Fernández, Diego; Lázaro, Alberto; Humanes, Blanca; Neumann, Boris; Tejedor, Alberto; Luz Mena, M; Jakubowski, Norbert; Gómez-Gómez, M Milagros

    2016-03-01

The study of the distribution of the cytostatic drugs cisplatin, carboplatin, and oxaliplatin along the kidney may help to understand their different nephrotoxic behavior. Laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) allows the acquisition of trace element images in biological tissues. However, the results obtained are affected by several variations concerning the sample matrix and instrumental drifts. In this work, an internal standardization method based on printing an Ir-spiked ink onto the surface of the sample has been developed to evaluate the different distributions and accumulation levels of the aforementioned drugs along the kidney of a rat model. A conventional ink-jet printer was used to print the ink onto fresh sagittal kidney tissue slices of 4 μm. A reproducible and homogeneous deposition of the ink along the tissue was observed, with the ink partially absorbed on top of the tissue. This approach therefore provides a pseudo-internal standardization, since ablation of the sample and of the internal standard takes place sequentially rather than simultaneously. A satisfactory normalization of the LA-ICP-MS bioimages, and therefore a reliable comparison of kidneys treated with different Pt-based drugs, was achieved even for tissues analyzed on different days. Due to the complete ablation of the sample, the transport of the ablated internal standard and tissue to the inductively coupled plasma mass spectrometer (ICP-MS) takes place practically at the same time. Pt accumulation in the kidney was observed in accordance with the dosages administered for each drug. Although the accumulation rates of cisplatin and oxaliplatin are both high, their Pt distributions differ. The strong nephrotoxicity observed for cisplatin and the absence of such a side effect in the case of oxaliplatin could explain these distribution differences. The homogeneous distribution of oxaliplatin in the cortical and medullar areas could be related with its higher affinity for

  8. Geothermal water and gas: collected methods for sampling and analysis. Comment issue. [Compilation of methods

    Energy Technology Data Exchange (ETDEWEB)

    Douglas, J.G.; Serne, R.J.; Shannon, D.W.; Woodruff, E.M.

    1976-08-01

    A collection of methods for sampling and analysis of geothermal fluids and gases is presented. Compilations of analytic options for constituents in water and gases are given. Also, a survey of published methods of laboratory water analysis is included. It is stated that no recommendation of the applicability of the methods to geothermal brines should be assumed since the intent of the table is to encourage and solicit comments and discussion leading to recommended analytical procedures for geothermal waters and research. (WHK)

  9. Advanced symbolic analysis for VLSI systems methods and applications

    CERN Document Server

    Shi, Guoyong; Tlelo Cuautle, Esteban

    2014-01-01

This book provides comprehensive coverage of the recent advances in symbolic analysis techniques for design automation of nanometer VLSI systems. The presentation is organized in parts on fundamentals, basic implementation methods and applications for VLSI design. Topics emphasized include statistical timing and crosstalk analysis, statistical and parallel analysis, performance bound analysis and behavioral modeling for analog integrated circuits. Among the recent advances, the Binary Decision Diagram (BDD) based approaches are studied in depth. The BDD-based hierarchical symbolic analysis approaches have essentially broken the analog circuit size barrier. In particular, this book • Provides an overview of classical symbolic analysis methods and a comprehensive presentation on the modern BDD-based symbolic analysis techniques; • Describes detailed implementation strategies for BDD-based algorithms, including the principles of zero-suppression, variable ordering and canonical reduction; • Int...

  10. Some selected quantitative methods of thermal image analysis in Matlab.

    Science.gov (United States)

    Koprowski, Robert

    2016-05-01

The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images, and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for the area of the skin of a human foot and of a face. The full source code of the developed application is also provided as an attachment. The main window of the program during dynamic analysis of the foot thermal image.

  11. Digital spectral analysis parametric, non-parametric and advanced methods

    CERN Document Server

    Castanié, Francis

    2013-01-01

    Digital Spectral Analysis provides a single source that offers complete coverage of the spectral analysis domain. This self-contained work includes details on advanced topics that are usually presented in scattered sources throughout the literature.The theoretical principles necessary for the understanding of spectral analysis are discussed in the first four chapters: fundamentals, digital signal processing, estimation in spectral analysis, and time-series models.An entire chapter is devoted to the non-parametric methods most widely used in industry.High resolution methods a

  12. Fluorescent MoS2 Quantum Dots: Ultrasonic Preparation, Up-Conversion and Down-Conversion Bioimaging, and Photodynamic Therapy.

    Science.gov (United States)

    Dong, Haifeng; Tang, Songsong; Hao, Yansong; Yu, Haizhu; Dai, Wenhao; Zhao, Guifeng; Cao, Yu; Lu, Huiting; Zhang, Xueji; Ju, Huangxian

    2016-02-10

Small-size molybdenum disulfide (MoS2) quantum dots (QDs) with desired optical properties were controllably synthesized by tetrabutylammonium-assisted ultrasonication of multilayered MoS2 powder via an OH-mediated chain-like Mo-S bond cleavage mode. This tunable top-down approach to the precise fabrication of MoS2 QDs finally enables detailed experimental investigation of their optical properties. The synthesized MoS2 QDs present good down-conversion photoluminescence behavior and exhibit remarkable up-conversion photoluminescence for bioimaging. The mechanism of the emerging photoluminescence was investigated. Furthermore, a (1)O2 production ability of the MoS2 QDs superior to that of the commercial photosensitizer PpIX was demonstrated, which has great potential application in photodynamic therapy. These early results on the tunable synthesis of MoS2 QDs with desired optical properties can lead to applications in the biomedical and optoelectronics fields.

  13. Enhanced vibrational spectroscopy, intracellular refractive indexing for label-free biosensing and bioimaging by multiband plasmonic-antenna array.

    Science.gov (United States)

    Chen, Cheng-Kuang; Chang, Ming-Hsuan; Wu, Hsieh-Ting; Lee, Yao-Chang; Yen, Ta-Jen

    2014-10-15

In this study, we report a multiband plasmonic-antenna array that bridges optical biosensing and intracellular bioimaging without requiring a labeling process or coupler. First, a compact plasmonic-antenna array is designed exhibiting a bandwidth of several octaves for use in both multiband plasmonic resonance-enhanced vibrational spectroscopy and refractive index probing. Second, a single-element plasmonic antenna can be used as a multifunctional sensing pixel that enables mapping the distribution of targets in thin films and biological specimens by enhancing the signals of vibrational signatures and sensing the refractive index contrast. Finally, reliable intracellular observation using the fabricated plasmonic-antenna array was demonstrated from the vibrational signatures and the intracellular refractive index contrast, requiring neither labeling nor a coupler. These unique features enable the plasmonic-antenna array to function in a label-free manner, facilitating biosensing and imaging development.

  14. VALIDATION OF CYCLAMATE ANALYSIS METHOD WITH SPECTROPHOTOMETRY AND TURBIDIMETRY

    Directory of Open Access Journals (Sweden)

    Regina Tutik Padmaningrum

    2016-04-01

Full Text Available This research aims to validate methods for the analysis of cyclamate in a mango-flavored jelly drink sample by spectrophotometry with hypochlorite reagent, ultraviolet spectrophotometry (without reagent), and turbidimetry. The objects of research were the validity parameters of each method: linearity, linear range, limit of detection, limit of quantitation, precision, and accuracy. The calibration curves of standard solutions of sodium cyclamate are linear for all three methods. The linear ranges were (211.36-747.08), (16.000-146.434), and (1.8521-6.1717) ppm, respectively. The detection limits were 53.6028, 0.5833, and 0.2723 ppm, and the limits of quantitation were 66.9948, 1.9443, and 0.8068 ppm, respectively. The spectrophotometric analysis of cyclamate with hypochlorite reagent had good precision and accuracy. The ultraviolet spectrophotometric analysis of cyclamate had good precision but poor accuracy. The turbidimetric analysis of cyclamate had poor precision and accuracy. Keywords: method validation, spectrophotometry, turbidimetry, cyclamate

  15. Online Fault Diagnosis Method Based on Nonlinear Spectral Analysis

    Institute of Scientific and Technical Information of China (English)

    WEI Rui-xuan; WU Li-xun; WANG Yong-chang; HAN Chong-zhao

    2005-01-01

Fault diagnosis based on nonlinear spectral analysis is a new technique for nonlinear fault diagnosis, but its online application can be limited by the enormous computation required to estimate the generalized frequency response functions. Based on the fully decoupled Volterra identification algorithm, a new online fault diagnosis method based on nonlinear spectral analysis is presented, which considerably reduces the online computation of the generalized frequency response functions. The composition and working principle of the method are described, test experiments have been carried out on the damping spring of a vehicle suspension system using the new method, and the results indicate that the method is efficient.

  16. An operational modal analysis method in frequency and spatial domain

    Science.gov (United States)

    Wang, Tong; Zhang, Lingmi; Tamura, Yukio

    2005-12-01

    A frequency and spatial domain decomposition method (FSDD) for operational modal analysis (OMA) is presented in this paper, which is an extension of the complex mode indicator function (CMIF) method for experimental modal analysis (EMA). The theoretical background of the FSDD method is clarified. Singular value decomposition is adopted to separate the signal space from the noise space. Finally, an enhanced power spectrum density (PSD) is proposed to obtain more accurate modal parameters by curve fitting in the frequency domain. Moreover, a simulation case and an application case are used to validate this method.

  17. An operational modal analysis method in frequency and spatial domain

    Institute of Scientific and Technical Information of China (English)

    Wang Tong; Zhang Lingmi; Tamura Yukio

    2005-01-01

    A frequency and spatial domain decomposition method (FSDD) for operational modal analysis (OMA) is presented in this paper, which is an extension of the complex mode indicator function (CMIF) method for experimental modal analysis (EMA). The theoretical background of the FSDD method is clarified. Singular value decomposition is adopted to separate the signal space from the noise space. Finally, an enhanced power spectrum density (PSD) is proposed to obtain more accurate modal parameters by curve fitting in the frequency domain. Moreover, a simulation case and an application case are used to validate this method.
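The core FSDD/CMIF step, singular value decomposition of the output cross-PSD matrix at each frequency line, can be sketched as follows. The synthetic two-channel 'ambient response' and all parameters are hypothetical, and the actual FSDD adds enhanced-PSD curve fitting on top of this:

```python
import numpy as np
from scipy import signal

def first_singular_spectrum(y, fs, nperseg=1024):
    """FSDD/CMIF core step: SVD of the output cross-PSD matrix at every
    frequency line. Peaks of the first singular value indicate modes;
    the corresponding singular vectors approximate the mode shapes."""
    ch = y.shape[0]
    freqs = signal.csd(y[0], y[0], fs=fs, nperseg=nperseg)[0]
    G = np.zeros((ch, ch, freqs.size), dtype=complex)
    for i in range(ch):
        for j in range(ch):
            G[i, j] = signal.csd(y[i], y[j], fs=fs, nperseg=nperseg)[1]
    s1 = np.array([np.linalg.svd(G[:, :, k], compute_uv=False)[0]
                   for k in range(freqs.size)])
    return freqs, s1

# Synthetic two-channel 'ambient response' dominated by one mode near 5 Hz
# with mode shape roughly [1.0, 0.6], plus sensor noise.
fs = 100.0
n = 2 ** 14
rng = np.random.default_rng(0)
b, a = signal.butter(2, [4.5 / (fs / 2), 5.5 / (fs / 2)], "bandpass")
q = signal.lfilter(b, a, rng.standard_normal(n))
y = np.vstack([1.0 * q, 0.6 * q]) + 0.05 * rng.standard_normal((2, n))

freqs, s1 = first_singular_spectrum(y, fs)
f_peak = freqs[np.argmax(s1)]   # should sit inside the 4.5-5.5 Hz band
```

The SVD is what separates the signal space from the noise space, as the abstract notes: at a resonance the cross-PSD matrix is close to rank one, so the first singular value dominates.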

  18. Meshless methods in biomechanics bone tissue remodelling analysis

    CERN Document Server

    Belinha, Jorge

    2014-01-01

This book presents the complete formulation of a new advanced discretization meshless technique: the Natural Neighbour Radial Point Interpolation Method (NNRPIM). In addition, two of the most popular meshless methods, the EFGM and the RPIM, are fully presented. Being a truly meshless method, the major advantages of the NNRPIM over the FEM and other meshless methods are its remeshing flexibility and the higher accuracy of the obtained variable field. Using the natural neighbour concept, the NNRPIM determines the influence-domain organically, resembling the natural behaviour of cellulae. This innovation permits the analysis of convex boundaries and extremely irregular meshes, which is an advantage in biomechanical analysis, with no extra computational effort associated. This volume shows how to extend the NNRPIM to bone tissue remodelling analysis, expecting to contribute new numerical tools and strategies that permit a more efficient numerical biomechanical analysis.

  19. Comparison of methods for vibration analysis of electrostatic precipitators

    Institute of Scientific and Technical Information of China (English)

    Iwona Adamiec-Wójcik; Andrzej Nowak; Stanis(l)aw Wojciech

    2011-01-01

The paper presents two methods for the formulation of free vibration analysis of collecting electrodes of electrostatic precipitators. The first, called the hybrid finite element method, combines the finite element method, used for calculations of spring deformations, with the rigid finite element method, used to reflect mass and geometrical features. As a result, a model with a diagonal mass matrix is obtained. Due to the specific geometry of the electrodes, which are long plates of complicated shape, the second method proposed is the strip method, a semi-analytical method that allows the equations of motion to be formulated with a considerably smaller number of generalized coordinates. Results of numerical calculations obtained by both methods are compared with those obtained using commercial software such as ANSYS and ABAQUS. Good agreement of results is achieved.

  20. Rapid method to determine proximate analysis and pyritic sulfur

    Energy Technology Data Exchange (ETDEWEB)

    Rowe, M.W.; Hyman, M.

    1985-05-01

    The use of thermomagnetogravimetry has been proposed as an alternative to the ASTM methods for measuring the pyritic sulphur content of coal and for proximate analysis. This paper presents a comparison of the results of thermogravimetry for proximate analysis and thermomagnetometry for pyritic sulphur with ASTM values on the same samples. The thermomagnetogravimetric technique is quicker and easier than the ASTM methods, and of comparable accuracy.

  1. [Progress on detection and analysis method of endocrine disrupting compounds].

    Science.gov (United States)

    Du, Hui-Fang; Yan, Hui-Fang

    2005-07-01

EDCs are a new generation of environmental pollutants of global concern. They may cause adverse effects mainly in the endocrine system and the nervous system, etc. To assess the hazard of EDCs to health accurately, we should know the distribution and levels of EDCs in the environment. In this paper, pretreatment techniques for different matrices and the methods of detection and analysis of EDCs are reviewed, and future prospects for the study of detection and analysis methods are also discussed.

  2. Numerical methods design, analysis, and computer implementation of algorithms

    CERN Document Server

    Greenbaum, Anne

    2012-01-01

    Numerical Methods provides a clear and concise exploration of standard numerical analysis topics, as well as nontraditional ones, including mathematical modeling, Monte Carlo methods, Markov chains, and fractals. Filled with appealing examples that will motivate students, the textbook considers modern application areas, such as information retrieval and animation, and classical topics from physics and engineering. Exercises use MATLAB and promote understanding of computational results. The book gives instructors the flexibility to emphasize different aspects--design, analysis, or c

  3. One-step synthesis of amino-functionalized ultrasmall near infrared-emitting persistent luminescent nanoparticles for in vitro and in vivo bioimaging

    Science.gov (United States)

    Shi, Junpeng; Sun, Xia; Zhu, Jianfei; Li, Jinlei; Zhang, Hongwu

    2016-05-01

Near infrared (NIR)-emitting persistent luminescent nanoparticles (NPLNPs) have attracted much attention in bioimaging because they can provide long-term in vivo imaging with a high signal-to-noise ratio (SNR). However, conventional NPLNPs with large particle sizes that lack modifiable surface groups suffer from many serious limitations in bioimaging. Herein, we report a one-step synthesis of amino-functionalized ZnGa2O4:Cr,Eu nanoparticles (ZGO) that have an ultrasmall size, where ethylenediamine served as the reactant to fabricate the ZGO as well as the surfactant ligand to control the nanocrystal size and form surface amino groups. The ZGO exhibited a narrow particle size distribution, a bright NIR emission and a long afterglow luminescence. In addition, due to the excellent conjugation ability of the surface amino groups, the ZGO can be easily conjugated with many bio-functional molecules, which has been successfully utilized to realize in vitro and in vivo imaging. More importantly, the ZGO achieved re-excitation imaging using 650 nm and 808 nm NIR light in situ, which is advantageous for long-term and higher SNR bioimaging.

  4. Analysis of spectral methods for the homogeneous Boltzmann equation

    KAUST Repository

    Filbet, Francis

    2011-04-01

    The development of accurate and fast algorithms for the Boltzmann collision integral and their analysis represent a challenging problem in scientific computing and numerical analysis. Recently, several works were devoted to the derivation of spectrally accurate schemes for the Boltzmann equation, but very few of them were concerned with the stability analysis of the method. In particular there was no result of stability except when the method was modified in order to enforce the positivity preservation, which destroys the spectral accuracy. In this paper we propose a new method to study the stability of homogeneous Boltzmann equations perturbed by smoothed balanced operators which do not preserve positivity of the distribution. This method takes advantage of the "spreading" property of the collision, together with estimates on regularity and entropy production. As an application we prove stability and convergence of spectral methods for the Boltzmann equation, when the discretization parameter is large enough (with explicit bound). © 2010 American Mathematical Society.

  5. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    Science.gov (United States)

    Gunar, O V; Sakhno, N G

    2015-12-30

The total uncertainty of the quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability demonstrated no effect of culture media or plate-count technique on the estimated microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpretation errors) was established. The most appropriate method of statistical analysis of such data was ANOVA, which enabled not only the effects of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. It did not exceed 35%, which is acceptable for traditional plate-count methods.
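The final step described here — combining the uncertainty elements mathematically — is conventionally the root-sum-of-squares rule for independent relative uncertainties. The component values below are hypothetical, chosen only to land near the reported ≤35% level:

```python
import math

def combined_relative_uncertainty(components):
    """Root-sum-of-squares combination of independent relative
    uncertainty components (each expressed as a fraction)."""
    return math.sqrt(sum(u ** 2 for u in components))

# Hypothetical budget for a plate-count result (values are illustrative):
u_total = combined_relative_uncertainty([
    0.20,   # microorganism type / product matrix effects
    0.15,   # individual reading and interpretation errors
    0.18,   # counting (Poisson) variability
])
# u_total is about 0.31, i.e. ~31 %, of the order of the reported <= 35 %
```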

  6. 7 CFR 58.245 - Method of sample analysis.

    Science.gov (United States)

    2010-01-01

    ... laboratory analysis contained in either DA Instruction 918-RL as issued by the USDA, Agricultural Marketing... 7 Agriculture 3 2010-01-01 2010-01-01 false Method of sample analysis. 58.245 Section 58.245 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE...

  7. Stochastic Analysis Method of Sea Environment Simulated by Numerical Models

    Institute of Scientific and Technical Information of China (English)

    刘德辅; 焦桂英; 张明霞; 温书勤

    2003-01-01

    This paper proposes a stochastic analysis method for sea environments simulated by numerical models, such as wave height, current field, design sea levels and longshore sediment transport. Uncertainty and sensitivity analysis of the input and output factors of the numerical models, their long-term distributions and confidence intervals are described.

  8. Scalable Kernel Methods and Algorithms for General Sequence Analysis

    Science.gov (United States)

    Kuksa, Pavel

    2011-01-01

    Analysis of large-scale sequential data has become an important task in machine learning and pattern recognition, inspired in part by numerous scientific and technological applications such as document and text classification or the analysis of biological sequences. However, current computational methods for sequence comparison still lack…
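
    A standard building block behind kernel methods for sequence comparison is the spectrum kernel: the similarity of two strings is the dot product of their k-mer count vectors. The sketch below shows that idea only; it does not reproduce the thesis's specific scalable algorithms.

```python
# Spectrum-kernel sketch: similarity of two sequences as the dot product
# of their k-mer count vectors (a classic string-kernel construction).
from collections import Counter

def kmer_counts(seq, k):
    """Count all length-k substrings of seq."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def spectrum_kernel(a, b, k=3):
    ca, cb = kmer_counts(a, k), kmer_counts(b, k)
    return sum(ca[m] * cb[m] for m in ca)

print(spectrum_kernel("GATTACA", "GATTACA"))  # self-similarity: 5 shared 3-mers
print(spectrum_kernel("GATTACA", "CCCGGG"))   # no shared 3-mers -> 0
```

    Scalable variants of this family (e.g. with mismatches or gaps) avoid materializing the full k-mer space, which is the kind of problem the thesis addresses.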

  9. The Constant Comparative Analysis Method Outside of Grounded Theory

    Science.gov (United States)

    Fram, Sheila M.

    2013-01-01

    This commentary addresses the gap in the literature regarding discussion of the legitimate use of Constant Comparative Analysis Method (CCA) outside of Grounded Theory. The purpose is to show the strength of using CCA to maintain the emic perspective and how theoretical frameworks can maintain the etic perspective throughout the analysis. My…

  10. The Politics of Historical Discourse Analysis: A Qualitative Research Method?

    Science.gov (United States)

    Johannesson, Ingolfur Asgeir

    2010-01-01

    This article deals with the ways in which historical discourse analysis is at once different from and similar to research described as qualitative or quantitative. It discusses the consequences of applying the standards of such methods to historical discourse analysis. It is pointed out that although the merit of research using historical…

  11. Method of morphological analysis of enterprise management organizational structure

    OpenAIRE

    Heorhiadi, N.; Iwaszczuk, N.; Vilhutska, R.

    2013-01-01

    The essence of the method of morphological analysis of enterprise management organizational structure is described in the article. Setting the levels of morphological decomposition and specifying the sets of elements are necessary steps in morphological analysis. Based on empirical research, the factors that influence the formation and use of enterprise management organizational structures were identified.

  12. Review of analysis methods for rotating systems with periodic coefficients

    Science.gov (United States)

    Dugundji, J.; Wendell, J. H.

    1981-01-01

    Two of the more common procedures for analyzing the stability and forced response of equations with periodic coefficients are reviewed: the use of Floquet methods, and the use of multiblade coordinate and harmonic balance methods. The analysis procedures of these periodic coefficient systems are compared with those of the more familiar constant coefficient systems.

  13. Some analysis methods for rotating systems with periodic coefficients

    Science.gov (United States)

    Dugundji, J.; Wendell, J. H.

    1983-01-01

    Two of the more common procedures for analyzing the stability and forced response of equations with periodic coefficients are reviewed: the use of Floquet methods, and the use of multiblade coordinate and harmonic balance methods. The analysis procedures of these periodic coefficient systems are compared with those of the more familiar constant coefficient systems. Previously announced in STAR as N82-23702

  14. Robust methods for multivariate data analysis A1

    DEFF Research Database (Denmark)

    Frosch, Stina; Von Frese, J.; Bro, Rasmus

    2005-01-01

    Outliers may hamper proper classical multivariate analysis, and lead to incorrect conclusions. To remedy the problem of outliers, robust methods are developed in statistics and chemometrics. Robust methods reduce or remove the effect of outlying data points and allow the "good" data to primarily...

  15. Bayesian data analysis in population ecology: motivations, methods, and benefits

    Science.gov (United States)

    Dorazio, Robert

    2016-01-01

    During the 20th century ecologists largely relied on the frequentist system of inference for the analysis of their data. However, in the past few decades ecologists have become increasingly interested in the use of Bayesian methods of data analysis. In this article I provide guidance to ecologists who would like to decide whether Bayesian methods can be used to improve their conclusions and predictions. I begin by providing a concise summary of Bayesian methods of analysis, including a comparison of differences between Bayesian and frequentist approaches to inference when using hierarchical models. Next I provide a list of problems where Bayesian methods of analysis may arguably be preferred over frequentist methods. These problems are usually encountered in analyses based on hierarchical models of data. I describe the essentials required for applying modern methods of Bayesian computation, and I use real-world examples to illustrate these methods. I conclude by summarizing what I perceive to be the main strengths and weaknesses of using Bayesian methods to solve ecological inference problems.
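
    The workhorse of the Bayesian computations the article summarizes is the posterior update. As a minimal, self-contained sketch, here is the conjugate Beta-Binomial update for a survival probability, a common building block in population ecology; the prior and the counts are illustrative, not taken from the article.

```python
# Conjugate Beta-Binomial update for a survival probability
# (illustrative numbers; not data from the article).

def beta_binomial_update(alpha, beta, survived, died):
    """Posterior Beta parameters after observing survivals/deaths."""
    return alpha + survived, beta + died

# Uniform Beta(1, 1) prior; 18 of 25 marked animals survive the season
a, b = beta_binomial_update(1, 1, survived=18, died=7)
posterior_mean = a / (a + b)
print(a, b, round(posterior_mean, 3))
```

    Hierarchical models chain many such updates together, which is why modern Bayesian analyses rely on MCMC rather than closed forms; the conjugate case just makes the logic visible.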

  16. Investigating Convergence Patterns for Numerical Methods Using Data Analysis

    Science.gov (United States)

    Gordon, Sheldon P.

    2013-01-01

    The article investigates the patterns that arise in the convergence of numerical methods, particularly those in the errors involved in successive iterations, using data analysis and curve fitting methods. In particular, the results obtained are used to convey a deeper level of understanding of the concepts of linear, quadratic, and cubic…
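
    The data-analysis idea described here can be sketched concretely: if successive errors obey e_{n+1} ≈ C·e_n^p, then p ≈ log(e_{n+1}/e_n) / log(e_n/e_{n-1}). The example below (my own, not from the article) estimates p for Newton's method on sqrt(2).

```python
import math

# Estimating the order of convergence p from successive iteration errors,
# using e_{n+1} ~ C * e_n**p  =>  p ~ log(e_{n+1}/e_n) / log(e_n/e_{n-1}).

def newton_sqrt2_errors(x0, steps):
    errs, x = [], x0
    for _ in range(steps):
        x = x - (x * x - 2) / (2 * x)      # Newton step for f(x) = x^2 - 2
        errs.append(abs(x - math.sqrt(2)))
    return errs

e = newton_sqrt2_errors(2.0, 4)
p = math.log(e[2] / e[1]) / math.log(e[1] / e[0])
print(round(p, 1))  # close to 2: quadratic convergence
```

    Running the same estimator on bisection errors would give p near 1, which is the linear/quadratic contrast the article uses to build understanding of convergence patterns.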

  17. Integrated numerical methods for hypersonic aircraft cooling systems analysis

    Science.gov (United States)

    Petley, Dennis H.; Jones, Stuart C.; Dziedzic, William M.

    1992-01-01

    Numerical methods have been developed for the analysis of hypersonic aircraft cooling systems. A general purpose finite difference thermal analysis code is used to determine areas which must be cooled. Complex cooling networks of series and parallel flow can be analyzed using a finite difference computer program. Both internal fluid flow and heat transfer are analyzed, because increased heat flow causes a decrease in the flow of the coolant. The steady state solution is obtained by a successive point iterative method. The transient analysis uses implicit forward-backward differencing. Several examples of the use of the program in studies of hypersonic aircraft and rockets are provided.
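
    The "successive point iterative method" mentioned for the steady state is, in essence, Gauss-Seidel relaxation over the finite-difference nodes. The toy below applies it to a 1D conducting rod with fixed end temperatures; it is a stand-in for the paper's cooling-network solver, not its actual code.

```python
# Gauss-Seidel (successive point iteration) sketch for steady-state
# 1D conduction with fixed end temperatures (toy stand-in for the
# paper's cooling-network solver).

def steady_state_1d(t_left, t_right, n_interior, sweeps=500):
    t = [t_left] + [0.0] * n_interior + [t_right]
    for _ in range(sweeps):
        for i in range(1, n_interior + 1):
            # No internal heat source: each node relaxes to its neighbors' mean
            t[i] = 0.5 * (t[i - 1] + t[i + 1])
    return t

temps = steady_state_1d(500.0, 100.0, n_interior=3)
print([round(x, 1) for x in temps])  # linear profile: [500.0, 400.0, 300.0, 200.0, 100.0]
```

    In a real cooling network the update at each node also couples to the coolant flow solution, which is why the paper iterates flow and heat transfer together.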

  18. A New Venture Analysis Method and Its Application

    Institute of Scientific and Technical Information of China (English)

    WANG Wen-jie(王文杰); Ronald K. Mitchell; TANG Bing-yong(汤兵勇)

    2003-01-01

    New venture analysis is the foundation of venture development. In this paper, 14 venture prototypes are proposed based on the attributes of ventures. A new venture analysis method is then discussed in which the new venture is matched with the corresponding prototype. Considering the fuzziness of human subjective grading, L-R fuzzy numbers are used to express the variables and corresponding fuzzy algorithms are applied in the analysis. Finally, an application example is presented to demonstrate the effectiveness of the method.

  19. Homotopy Analysis Method to the Generalized Zakharov Equations

    Directory of Open Access Journals (Sweden)

    Hassan A. Zedan

    2012-01-01

    We introduce two powerful methods to solve the generalized Zakharov equations (GZE): the homotopy perturbation method (HPM) and the homotopy analysis method (HAM). The initial approximations can be freely chosen, with possible unknown constants determined by imposing the boundary and initial conditions. HAM is a strong and easy-to-use analytic tool for nonlinear problems. The absolute errors between the exact solutions of the GZE and the HPM and HAM approximate solutions are computed, and the HAM results are compared with those of Adomian's decomposition method.

  20. Fault Diagnosis and Reliability Analysis Using Fuzzy Logic Method

    Institute of Scientific and Technical Information of China (English)

    Miao Zhinong; Xu Yang; Zhao Xiangyu

    2006-01-01

    A new fuzzy logic fault diagnosis method is proposed. In this method, fuzzy equations are employed to estimate the component state of a system based on the measured system performance and the relationship between component state and system performance, which is called the "performance-parameter" knowledge base and is constructed by experts. Compared with traditional fault diagnosis methods, this fuzzy logic method can use human intuitive knowledge and does not need a precise mapping between system performance and component state. Simulation proves its effectiveness in fault diagnosis. The reliability analysis is then performed based on the fuzzy logic method.
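
    A minimal sketch of the fuzzy-logic step described above: map a measured performance value to component-state memberships via triangular membership functions and pick the most plausible state. The states, membership supports, and the 0-1 performance scale are invented for illustration; the paper's knowledge base is expert-built.

```python
# Fuzzy-logic diagnosis sketch with triangular membership functions
# (states and supports are hypothetical, not the paper's knowledge base).

def tri(x, a, b, c):
    """Triangular membership function peaking at b on support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def diagnose(performance):
    memberships = {
        "healthy":  tri(performance, 0.7, 1.0, 1.3),
        "degraded": tri(performance, 0.4, 0.6, 0.8),
        "faulty":   tri(performance, 0.0, 0.2, 0.5),
    }
    return max(memberships, key=memberships.get), memberships

state, m = diagnose(0.55)
print(state, round(m["degraded"], 2))  # degraded 0.75
```

    Because the mapping is expressed as overlapping memberships rather than a crisp function, imprecise expert knowledge ("performance a bit low usually means degradation") is usable directly, which is the point the abstract makes.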

  1. Image analysis benchmarking methods for high-content screen design.

    Science.gov (United States)

    Fuller, C J; Straight, A F

    2010-05-01

    The recent development of complex chemical and small interfering RNA (siRNA) collections has enabled large-scale cell-based phenotypic screening. High-content and high-throughput imaging are widely used methods to record phenotypic data after chemical and small interfering RNA treatment, and numerous image processing and analysis methods have been used to quantify these phenotypes. Currently, there are no standardized methods for evaluating the effectiveness of new and existing image processing and analysis tools for an arbitrary screening problem. We generated a series of benchmarking images that represent commonly encountered variation in high-throughput screening data and used these image standards to evaluate the robustness of five different image analysis methods to changes in signal-to-noise ratio, focal plane, cell density and phenotype strength. The analysis methods that were most reliable, in the presence of experimental variation, required few cells to accurately distinguish phenotypic changes between control and experimental data sets. We conclude that by applying these simple benchmarking principles an a priori estimate of the image acquisition requirements for phenotypic analysis can be made before initiating an image-based screen. Application of this benchmarking methodology provides a mechanism to significantly reduce data acquisition and analysis burdens and to improve data quality and information content.
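
    In the spirit of the benchmarking idea above, the sketch below scores a fixed-threshold segmentation against known ground truth as noise grows, using a 1D synthetic "image" and the Dice coefficient; the signal shape, noise levels, and scoring choice are mine, not the study's benchmark set.

```python
import random

# Benchmarking sketch: segment a synthetic signal with known ground
# truth at decreasing signal-to-noise ratio and score each result,
# so an acquisition requirement can be estimated (toy data).
random.seed(0)

def dice(a, b):
    """Dice overlap between two binary masks."""
    inter = sum(x and y for x, y in zip(a, b))
    return 2 * inter / (sum(a) + sum(b))

truth = [1 if 20 <= i < 40 else 0 for i in range(64)]  # one bright object

for noise_sd in (0.1, 0.5, 1.0):
    pixels = [t + random.gauss(0.0, noise_sd) for t in truth]
    segmented = [1 if p > 0.5 else 0 for p in pixels]   # fixed threshold
    print(noise_sd, round(dice(truth, segmented), 2))
```

    Plotting such scores against noise level (or cell density, focus, phenotype strength) gives exactly the a priori acquisition-requirement estimate the abstract argues for.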

  2. Statistical methods of SNP data analysis with applications

    CERN Document Server

    Bulinski, Alexander; Shashkin, Alexey; Yaskov, Pavel

    2011-01-01

    Various statistical methods important for genetic analysis are considered and developed. Namely, we concentrate on the multifactor dimensionality reduction, logic regression, random forests and stochastic gradient boosting. These methods and their new modifications, e.g., the MDR method with "independent rule", are used to study the risk of complex diseases such as cardiovascular ones. The roles of certain combinations of single nucleotide polymorphisms and external risk factors are examined. To perform the data analysis concerning the ischemic heart disease and myocardial infarction the supercomputer SKIF "Chebyshev" of the Lomonosov Moscow State University was employed.

  3. Analysis of proteins and peptides by electromigration methods in microchips.

    Science.gov (United States)

    Štěpánová, Sille; Kašička, Václav

    2017-01-01

    This review presents the developments and applications of microchip electromigration methods in the separation and analysis of peptides and proteins in the period 2011-mid-2016. The developments in sample preparation and preconcentration, microchannel material, and surface treatment are described. Separations by various microchip electromigration methods (zone electrophoresis in free and sieving media, affinity electrophoresis, isotachophoresis, isoelectric focusing, electrokinetic chromatography, and electrochromatography) are demonstrated. Advances in detection methods are reported and novel applications in the areas of proteomics and peptidomics, quality control of peptide and protein pharmaceuticals, analysis of proteins and peptides in biomatrices, and determination of physicochemical parameters are shown.

  4. Unascertained Factor Method of Dynamic Characteristic Analysis for Antenna Structures

    Institute of Scientific and Technical Information of China (English)

    ZHU Zeng-qing; LIANG Zhen-tao; CHEN Jian-jun

    2008-01-01

    A dynamic characteristic analysis model of antenna structures is built in which the structural physical parameters and geometrical dimensions are all considered as unascertained variables, and a structural dynamic characteristic analysis method based on the unascertained factor method is given. The computational expression of the structural characteristic is developed using the mathematical expression of the unascertained factor and the principles of arithmetic for unascertained rational numbers. An example is given in which the possible values and confidence degrees of the unascertained structural characteristics are obtained. The calculated results show that the method is feasible and effective.

  5. Statistical models and methods for reliability and survival analysis

    CERN Document Server

    Couallier, Vincent; Huber-Carol, Catherine; Mesbah, Mounir; Huber -Carol, Catherine; Limnios, Nikolaos; Gerville-Reache, Leo

    2013-01-01

    Statistical Models and Methods for Reliability and Survival Analysis brings together contributions by specialists in statistical theory as they discuss their applications providing up-to-date developments in methods used in survival analysis, statistical goodness of fit, stochastic processes for system reliability, amongst others. Many of these are related to the work of Professor M. Nikulin in statistics over the past 30 years. The authors gather together various contributions with a broad array of techniques and results, divided into three parts - Statistical Models and Methods, Statistical

  6. Application of homotopy analysis method for solving nonlinear Cauchy problem

    Directory of Open Access Journals (Sweden)

    V.G. Gupta

    2012-11-01

    In this paper, by means of the homotopy analysis method (HAM), the solutions of some nonlinear Cauchy problems of parabolic-hyperbolic type are obtained exactly in the form of convergent Taylor series. The HAM contains the auxiliary parameter ℏ that provides a convenient way of controlling the convergence region of the series solutions. The method is also employed to solve linear examples and obtain their exact solutions. The results reveal that the proposed method is very effective and simple.

  7. Linear Algebraic Method for Non-Linear Map Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Yu,L.; Nash, B.

    2009-05-04

    We present a newly developed method to analyze some non-linear dynamics problems such as the Henon map using a matrix analysis method from linear algebra. Choosing the Henon map as an example, we analyze the spectral structure, the tune-amplitude dependence, the variation of tune and amplitude during the particle motion, etc., using the method of Jordan decomposition which is widely used in conventional linear algebra.
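
    A concrete fragment of this kind of analysis: for an accelerator-style Henon map (kick plus rotation, my formulation for illustration), the linearization at the origin is a pure rotation, so the tune can be recovered from the average phase advance per turn. The full Jordan-decomposition machinery of the paper handles the general, possibly non-diagonalizable case; this sketch only shows the small-amplitude limit.

```python
import math, cmath

# Tune measurement for an accelerator-style Henon map: at tiny amplitude
# the map is a rotation by mu, and the measured phase advance per turn
# recovers the design tune mu / 2*pi (illustrative formulation).

def henon_step(x, p, mu):
    p2 = p + x * x              # nonlinear kick
    c, s = math.cos(mu), math.sin(mu)
    return x * c + p2 * s, -x * s + p2 * c

def measured_tune(x0, mu, turns=256):
    x, p = x0, 0.0
    total_phase = 0.0
    prev = cmath.phase(complex(x, -p))   # phase of z = x - i p
    for _ in range(turns):
        x, p = henon_step(x, p, mu)
        cur = cmath.phase(complex(x, -p))
        total_phase += (cur - prev) % (2 * math.pi)  # advance this turn
        prev = cur
    return total_phase / (2 * math.pi * turns)

mu = 2 * math.pi * 0.205                   # design tune 0.205
print(round(measured_tune(1e-4, mu), 4))   # ~0.205 at tiny amplitude
```

    Repeating the measurement at larger amplitudes exposes the tune-amplitude dependence the abstract refers to, since the quadratic kick then detunes the rotation.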

  8. The Additional Interpolators Method for Variational Analysis in Lattice QCD

    CERN Document Server

    Schiel, Rainer W

    2015-01-01

    In this paper, I describe the Additional Interpolators Method, a new technique for variational analysis in lattice QCD. It is shown to be an excellent method which uses additional interpolators to remove backward in time running states that would otherwise contaminate the signal. The proof of principle, which also makes use of the Time-Shift Trick (Generalized Pencil-of-Functions method), will be delivered at an example on a $64^4$ lattice close to the physical pion mass.

  9. New experimental and analysis methods in I-DLTS

    Energy Technology Data Exchange (ETDEWEB)

    Pandey, S.U.; Middelkamp, P.; Li, Z.; Eremin, V.

    1998-02-01

    A new experimental apparatus to perform I-DLTS measurements is presented. The method is shown to be faster and more sensitive than traditional double boxcar I-DLTS systems. A novel analysis technique utilizing multiple exponential fits to the I-DLTS signal from a highly neutron irradiated silicon sample is presented with a discussion of the results. It is shown that the new method has better resolution and can deconvolute overlapping peaks more accurately than previous methods.

  10. Intellectual Data Analysis Method for Evaluation of Virtual Teams

    Directory of Open Access Journals (Sweden)

    Dalia Krikščiūnienė

    2012-07-01

    The purpose of the article is to present a method for virtual team performance evaluation based on intelligent analysis of team member collaboration data. The motivation for the research is the aim of creating an evaluation method that approximates ambiguous expert evaluations. The concept of the hierarchical fuzzy rule based method is to evaluate data from virtual team interaction instances related to the implementation of project tasks. The suggested method is designed to help project managers or virtual team leaders evaluate virtual teamwork based on analysis of captured data. The main point of the method is its ability to reproduce human thinking and the expert valuation process in data analysis by applying fuzzy logic: fuzzy sets, fuzzy signatures and fuzzy rules. The fuzzy set principle used in the method allows numerical values of evaluation criteria to be transformed into linguistic terms used in constructing fuzzy rules. Using a fuzzy signature makes it possible to construct a hierarchical criteria structure, which helps to avoid the exponential increase of fuzzy rules as more input variables are included. The suggested method is intended to be applied in virtual collaboration software as a real-time teamwork evaluation tool. The research shows that by applying fuzzy logic to team collaboration data analysis it is possible to obtain evaluations equal to expert insights. The method covers virtual team, project task and team collaboration data analysis. Its advantage is the possibility of using variables obtained from virtual collaboration systems as fuzzy rule inputs. The findings on fuzzy-logic-based virtual teamwork evaluation provide evidence that can be investigated in future work, and the method can be seen as a next step in the development of virtual collaboration software.

  11. Intellectual Data Analysis Method for Evaluation of Virtual Teams

    Directory of Open Access Journals (Sweden)

    Sandra Strigūnaitė

    2013-01-01

    The purpose of the article is to present a method for virtual team performance evaluation based on intelligent analysis of team member collaboration data. The motivation for the research is the aim of creating an evaluation method that approximates ambiguous expert evaluations. The concept of the hierarchical fuzzy rule based method is to evaluate data from virtual team interaction instances related to the implementation of project tasks. The suggested method is designed to help project managers or virtual team leaders evaluate virtual teamwork based on analysis of captured data. The main point of the method is its ability to reproduce human thinking and the expert valuation process in data analysis by applying fuzzy logic: fuzzy sets, fuzzy signatures and fuzzy rules. The fuzzy set principle used in the method allows numerical values of evaluation criteria to be transformed into linguistic terms used in constructing fuzzy rules. Using a fuzzy signature makes it possible to construct a hierarchical criteria structure, which helps to avoid the exponential increase of fuzzy rules as more input variables are included. The suggested method is intended to be applied in virtual collaboration software as a real-time teamwork evaluation tool. The research shows that by applying fuzzy logic to team collaboration data analysis it is possible to obtain evaluations equal to expert insights. The method covers virtual team, project task and team collaboration data analysis. Its advantage is the possibility of using variables obtained from virtual collaboration systems as fuzzy rule inputs. The findings on fuzzy-logic-based virtual teamwork evaluation provide evidence that can be investigated in future work, and the method can be seen as a next step in the development of virtual collaboration software.

  12. Application of Stochastic Sensitivity Analysis to Integrated Force Method

    Directory of Open Access Journals (Sweden)

    X. F. Wei

    2012-01-01

    As a new formulation in structural analysis, the Integrated Force Method has been successfully applied to many structures in civil, mechanical, and aerospace engineering due to its accurate estimation of forces in computation. It is now being further extended to the probabilistic domain. For the assessment of uncertainty effects in system optimization and identification, the probabilistic sensitivity analysis of IFM was further investigated in this study. A set of stochastic sensitivity analysis formulations of the Integrated Force Method was developed using the perturbation method. Numerical examples are presented to illustrate its application. Its efficiency and accuracy were also substantiated with direct Monte Carlo simulations and the reliability-based sensitivity method. The numerical algorithm was shown to be readily adaptable to the existing program, since the models of stochastic finite element and stochastic design sensitivity are almost identical.
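
    The perturbation-versus-Monte-Carlo comparison in the abstract can be illustrated on the simplest possible response: the displacement u = F/k of a spring with random stiffness. First-order variance propagation is checked against direct sampling; all numbers are illustrative and unrelated to the paper's structures.

```python
import random, statistics

# First-order perturbation variance propagation for u = F / k,
# verified against direct Monte Carlo (illustrative numbers).
random.seed(1)

F = 100.0                     # deterministic load
k_mean, k_sd = 50.0, 2.0      # random stiffness, mean and std dev

# Perturbation method: Var[u] ~ (du/dk)^2 * Var[k], with du/dk = -F/k^2
du_dk = -F / k_mean ** 2
var_perturb = du_dk ** 2 * k_sd ** 2

# Direct Monte Carlo check
samples = [F / random.gauss(k_mean, k_sd) for _ in range(200_000)]
var_mc = statistics.variance(samples)

print(round(var_perturb, 5), round(var_mc, 5))
```

    For the small coefficient of variation used here the two estimates agree closely; the perturbation route costs one derivative evaluation instead of 200,000 solves, which is why it is attractive for IFM-based sensitivity analysis.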

  13. Complexity of software trustworthiness and its dynamical statistical analysis methods

    Institute of Scientific and Technical Information of China (English)

    ZHENG ZhiMing; MA ShiLong; LI Wei; JIANG Xin; WEI Wei; MA LiLi; TANG ShaoTing

    2009-01-01

    Developing trusted software has become an important trend and a natural choice in the development of software technology and applications. At present, methods for the measurement and assessment of software trustworthiness cannot guarantee safe and reliable operation of software systems completely and effectively. Based on the study of dynamical systems, this paper interprets the characteristics of the behavior of software systems and the basic scientific problems of software trustworthiness complexity, analyzes the characteristics of the complexity of software trustworthiness, and proposes to study software trustworthiness measurement in terms of this complexity. Using dynamical statistical analysis methods, the paper advances an invariant-measure based assessment of software trustworthiness by statistical indices, and thereby provides a dynamical criterion for the untrustworthiness of software systems. By an example, the feasibility of the proposed dynamical statistical analysis method in software trustworthiness measurement is demonstrated using numerical simulations and theoretical analysis.

  14. SOLUTION OF THE ENSO DELAYED OSCILLATOR WITH HOMOTOPY ANALYSIS METHOD

    Institute of Scientific and Technical Information of China (English)

    WU Zi-ku

    2009-01-01

    An ENSO delayed oscillator is considered. The El Niño atmospheric physics oscillation is an abnormal phenomenon involved in the tropical Pacific ocean-atmosphere interactions. The conceptual oscillator model should consider the variations of both the eastern and western Pacific anomaly patterns. Using the homotopy analysis method, approximate expansions of the solution of the corresponding problem are constructed. The method is based on a continuous variation from an initial trial to the exact solution. A Maclaurin series expansion provides a successive approximation of the solution through repeated application of a differential operator, with the initial trial as the first term. This approach does not require the use of perturbation parameters, and the solution series converges rapidly with the number of terms. Comparing the approximate analytical solution obtained by the homotopy analysis method with the exact solution shows that the homotopy analysis method is valid for solving the strongly nonlinear ENSO delayed oscillator model.

  15. Piezoelectric Analysis of Saw Sensor Using Finite Element Method

    Directory of Open Access Journals (Sweden)

    Vladimír KUTIŠ

    2013-06-01

    In this contribution, the modeling and simulation of a surface acoustic wave (SAW) sensor using the finite element method is presented. The SAW sensor is made from a piezoelectric GaN layer on a SiC substrate. Two different analysis types are investigated, modal and transient, both in 2D only. The goal of the modal analysis is to determine the eigenfrequency of the SAW, which is used in the subsequent transient analysis. In the transient analysis, wave propagation in the SAW sensor is investigated. Both analyses were performed using the FEM code ANSYS.
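
    The modal-analysis step described above is, at its core, a generalized eigenvalue problem det(K - ω²M) = 0. The sketch below solves it in closed form for a two-degree-of-freedom spring-mass chain, an illustrative stand-in for the FEM model, not the GaN/SiC geometry.

```python
import math

# Modal-analysis sketch: natural frequencies of a 2-DOF chain
# wall--k--m--k--m (free end), the same eigenvalue step a FEM
# package performs on a discretized SAW model.

def two_dof_frequencies(m, k):
    """Eigenfrequencies in Hz of the 2-DOF chain with equal m and k."""
    # K = [[2k, -k], [-k, k]], M = m*I  =>  det(K - w^2 M) = 0
    # expands to m^2 w^4 - 3 k m w^2 + k^2 = 0, a quadratic in w^2
    a, b, c = m * m, -3 * k * m, k * k
    disc = math.sqrt(b * b - 4 * a * c)
    w2 = sorted([(-b - disc) / (2 * a), (-b + disc) / (2 * a)])
    return [math.sqrt(w) / (2 * math.pi) for w in w2]

f1, f2 = two_dof_frequencies(m=0.5, k=2000.0)
print(round(f1, 2), round(f2, 2))
```

    A FEM modal analysis does exactly this with thousands of DOFs (and, for piezoelectric elements, electrically coupled stiffness terms), then feeds the relevant eigenfrequency into the transient excitation.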

  16. Qualitative analysis of the CCEBC/EEAC method

    Institute of Scientific and Technical Information of China (English)

    LIAO; Haohui; TANG; Yun

    2004-01-01

    The CCEBC/EEAC method is an effective method in the quantitative analysis of power system transient stability. This paper provides a qualitative analysis of the CCEBC/EEAC method and shows that, from a geometrical point of view, the CCCOI-RM transformation used in the CCEBC/EEAC method can be regarded as a projection of the variables of the system model onto a weighted vector space, from which a generalized P̄-δ̄ trajectory is obtained. Since a transient process of power systems can be approximately regarded as a time-piecewise simple Hamiltonian system, in order to qualitatively analyse the CCEBC/EEAC method, this paper compares the potential energy of a two-machine infinite bus system with its CCEBC/EEAC energy. Numerical results indicate their similarity. Clarifying the qualitative relation between these two kinds of energy is significant for mathematically verifying the CCEBC/EEAC criterion for power system transient stability. Moreover, the qualitative analysis of the CCEBC/EEAC method enables us to better understand some important phenomena revealed by quantitative analysis, such as multi-swing loss of stability and isolated stable domains.

  17. Transit Traffic Analysis Zone Delineating Method Based on Thiessen Polygon

    Directory of Open Access Journals (Sweden)

    Shuwei Wang

    2014-04-01

    A green transportation system composed of transit, buses and bicycles could be significant in alleviating traffic congestion. However, the inaccuracy of current transit ridership forecasting methods is imposing a negative impact on the development of urban transit systems. Traffic Analysis Zone (TAZ) delineation is a fundamental and essential step in ridership forecasting, but the existing delineation methods in four-step models have some problems in reflecting the travel characteristics of urban transit. This paper aims to propose a Transit Traffic Analysis Zone (TTAZ) delineation method as a supplement to traditional TAZs in transit service analysis. The deficiencies of current TAZ delineation methods were analyzed, and the requirements of a TTAZ were summarized. Considering these requirements, Thiessen polygons were introduced into TTAZ delineation. In order to validate its feasibility, Beijing was then taken as an example to delineate TTAZs, followed by a spatial analysis of office buildings within a TTAZ and transit station departure passengers. The analysis result shows that TTAZs based on Thiessen polygons can reflect transit travel characteristics and merit further research.
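
    The defining property of a Thiessen (Voronoi) polygon is that every point inside it is closer to its generating station than to any other, so TTAZ membership reduces to a nearest-seed query. A minimal sketch, with made-up station coordinates:

```python
# Thiessen-polygon sketch: a location belongs to the TTAZ of its
# nearest transit station (hypothetical station coordinates).

stations = {"A": (0.0, 0.0), "B": (4.0, 0.0), "C": (2.0, 3.0)}

def ttaz_of(point):
    """Assign a location to the TTAZ of the closest transit station."""
    px, py = point
    return min(stations, key=lambda s: (stations[s][0] - px) ** 2
                                       + (stations[s][1] - py) ** 2)

print(ttaz_of((0.5, 0.2)))   # -> A
print(ttaz_of((3.8, 1.0)))   # -> B
print(ttaz_of((2.0, 2.5)))   # -> C
```

    GIS packages compute the polygon boundaries explicitly, but for aggregating demand points (e.g. office buildings) to zones, this nearest-station rule is equivalent.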

  18. Objective analysis of the ARM IOP data: method and sensitivity

    Energy Technology Data Exchange (ETDEWEB)

    Cedarwall, R; Lin, J L; Xie, S C; Yio, J J; Zhang, M H

    1999-04-01

    Motivated by the need to obtain accurate objective analysis of field experimental data to force physical parameterizations in numerical models, this paper first reviews the existing objective analysis methods and interpolation schemes that are used to derive atmospheric wind divergence, vertical velocity, and advective tendencies. Advantages and disadvantages of each method are discussed. It is shown that considerable uncertainties in the analyzed products can result from the use of different analysis schemes and even more from different implementations of a particular scheme. The paper then describes a hybrid approach that combines the strengths of the regular grid method and the line-integral method, together with a variational constraining procedure for the analysis of field experimental data. In addition to the use of upper air data, measurements at the surface and at the top of the atmosphere are used to constrain the upper air analysis to conserve column-integrated mass, water, energy, and momentum. Analyses are shown for measurements taken in the Atmospheric Radiation Measurement Program (ARM) July 1995 Intensive Observational Period (IOP). Sensitivity experiments are carried out to test the robustness of the analyzed data and to reveal the uncertainties in the analysis. It is shown that the variational constraining process significantly reduces the sensitivity of the final data products.
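
    The variational constraining step has a simple least-squares core: adjust the analyzed values as little as possible while forcing an integral budget to match an independent measurement. With equal weights the Lagrange-multiplier solution is just a uniform shift, as the toy below shows (layer values and the budget are made up; the real procedure uses error-weighted, multi-variable constraints).

```python
# Variational-constraint sketch: smallest equally weighted L2 adjustment
# that makes a column sum match an independently measured budget.

def constrain_to_sum(values, target_sum):
    """Minimize sum((v' - v)^2) subject to sum(v') == target_sum."""
    residual = target_sum - sum(values)
    shift = residual / len(values)        # Lagrange-multiplier solution
    return [v + shift for v in values]

layers = [3.0, 5.0, 4.0]                  # raw column values, sum = 12
adjusted = constrain_to_sum(layers, target_sum=12.6)
print([round(v, 2) for v in adjusted])    # [3.2, 5.2, 4.2]
print(round(sum(adjusted), 2))            # 12.6
```

    Conserving mass, water, energy, and momentum simultaneously turns this into a small constrained optimization per analysis time, but the mechanism is the same.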

  19. Mathematical methods in time series analysis and digital image processing

    CERN Document Server

    Kurths, J; Maass, P; Timmer, J

    2008-01-01

    The aim of this volume is to bring together research directions in theoretical signal and imaging processing developed rather independently in electrical engineering, theoretical physics, mathematics and the computer sciences. In particular, mathematically justified algorithms and methods, the mathematical analysis of these algorithms, and methods as well as the investigation of connections between methods from time series analysis and image processing are reviewed. An interdisciplinary comparison of these methods, drawing upon common sets of test problems from medicine and geophysical/enviromental sciences, is also addressed. This volume coherently summarizes work carried out in the field of theoretical signal and image processing. It focuses on non-linear and non-parametric models for time series as well as on adaptive methods in image processing.

  20. INDIRECT DETERMINATION METHOD OF DYNAMIC FORCE BY USING CEPSTRUM ANALYSIS

    Institute of Scientific and Technical Information of China (English)

    吴淼; 魏任之

    1996-01-01

    The dynamic load spectrum is one of the most important bases for the design and dynamic characteristic analysis of machines. However, it is difficult to measure on many occasions, especially for mining machines, due to their harsh working conditions and the high cost of measurement. In such situations, the load spectrum has to be obtained by indirect determination methods. A new method to identify the load spectrum, the cepstrum analysis method, is presented in this paper. This method can be used to eliminate the filtering influence of the transfer function on the response signals so that the load spectrum can be determined indirectly. Experimental and actual engineering examples indicate that this method has the advantages of simple calculation and easy measurement.
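
    The reason the cepstrum can strip a transfer function's filtering effect is that it turns convolution into addition: the real cepstrum is the inverse DFT of the log magnitude spectrum, so source and filter contributions separate along the quefrency axis. The self-contained toy below (a unit impulse plus a half-strength echo, not mining-machine data) shows the echo appearing as a cepstral peak exactly at its delay.

```python
import cmath, math

# Real-cepstrum sketch: inverse DFT of the log magnitude spectrum.
# A source-plus-echo signal yields a cepstral peak at the echo delay.

def dft(x, inverse=False):
    n, sign = len(x), 1 if inverse else -1
    out = [sum(x[k] * cmath.exp(sign * 2j * math.pi * j * k / n)
               for k in range(n)) for j in range(n)]
    return [v / n for v in out] if inverse else out

def real_cepstrum(signal):
    spectrum = dft([complex(s) for s in signal])
    log_mag = [complex(math.log(abs(v))) for v in spectrum]
    return [v.real for v in dft(log_mag, inverse=True)]

n, delay = 64, 12
signal = [0.0] * n
signal[0], signal[delay] = 1.0, 0.5       # source + half-strength echo
cep = real_cepstrum(signal)
peak = max(range(1, n // 2), key=lambda q: cep[q])
print(peak, round(cep[peak], 3))  # -> 12 0.25
```

    In the indirect load-identification setting, subtracting the transfer path's cepstral contribution (measured or modeled) from the response cepstrum is what recovers the excitation, which is the deconvolution idea the abstract describes.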

  1. Research advance in safety analysis methods for high concrete dam

    Institute of Scientific and Technical Information of China (English)

    REN; QingWen; XU; LanYu; WAN; YunHui

    2007-01-01

    The high tensile stresses that occur in high concrete dams and their foundations make dam safety increasingly important as concrete dam heights grow. With no existing specification or successful experience with concrete dams up to 300 m, at home or abroad, for reference, experts are obliged to work out how to perform safety analysis of high concrete dams. This paper covers the main contents and mechanical features of safety analysis for high concrete dams and reviews the current state and progress of the analysis methods. Given the insufficiencies and problems of the normative methods, research on modern numerical methods such as the finite element method must be strengthened to find stress control criteria consistent with those methods. Two aspects of the safety analysis of high dams should be considered: local damage at the material level and integral failure at the structure level. For local damage, the non-homogeneity of the material should be considered and research on meso-damage mechanics strengthened, while for integral failure of the dam-foundation system, non-strength theory deserves attention. Further, attention should be paid to research on failure modes and criteria for high concrete dam failure analysis and safety evaluation, and the effects of uncertainty and the classification of safety should be considered as well.

  2. A Quantitative Method for Microtubule Analysis in Fluorescence Images.

    Science.gov (United States)

    Lan, Xiaodong; Li, Lingfei; Hu, Jiongyu; Zhang, Qiong; Dang, Yongming; Huang, Yuesheng

    2015-12-01

    Microtubule analysis is of significant value for a better understanding of normal and pathological cellular processes. Although immunofluorescence microscopic techniques have proven useful in the study of microtubules, comparative results commonly rely on a descriptive and subjective visual analysis. We developed an objective and quantitative method based on image processing and analysis of fluorescently labeled microtubular patterns in cultured cells. We used a multi-parameter approach, analyzing four quantifiable characteristics to compose our quantitative feature set. We then interpreted specific changes in the parameters and revealed the contribution of each feature using principal component analysis. In addition, we verified that different treatment groups could be clearly discriminated using principal components of the multi-parameter model. The high predictive accuracy of four commonly used multi-classification methods confirmed the validity of our method. These results demonstrate the effectiveness and efficiency of our method for analyzing microtubules in fluorescence images. The analytical methods presented here provide information concerning the organization and modification of microtubules, and could aid in the further understanding of structural and functional aspects of microtubules under normal and pathological conditions.
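    The multi-parameter workflow (quantified features, principal components, then a classifier check of group separation) can be sketched as follows; the feature values are synthetic stand-ins, not the paper's measurements:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Hypothetical stand-in for the paper's data: two treatment groups, four
# quantified microtubule features (e.g. density, orientation, length, intensity).
rng = np.random.default_rng(42)
group_a = rng.normal(loc=[1.0, 0.5, 2.0, 0.8], scale=0.2, size=(50, 4))
group_b = rng.normal(loc=[1.6, 0.9, 1.4, 1.2], scale=0.2, size=(50, 4))
X = np.vstack([group_a, group_b])
y = np.array([0] * 50 + [1] * 50)

# Principal components reveal which feature combinations carry the variance.
pca = PCA(n_components=2)
scores = pca.fit_transform(X)
print("explained variance ratio:", pca.explained_variance_ratio_)

# Groups should be separable in the reduced space if the features discriminate.
accuracy = cross_val_score(SVC(kernel="linear"), scores, y, cv=5).mean()
print("cross-validated accuracy:", accuracy)
```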

  3. Multiple methods integration for structural mechanics analysis and design

    Science.gov (United States)

    Housner, J. M.; Aminpour, M. A.

    1991-01-01

    A new research area of multiple methods integration is proposed for joining diverse methods of structural mechanics analysis that interact with one another. Three categories of multiple methods are defined: those in which a physical interface is well defined; those in which a physical interface is not well defined but selected; and those in which the interface is a mathematical transformation. Two fundamental integration procedures are presented that can be extended to integrate various methods (e.g., finite elements, Rayleigh-Ritz, Galerkin, and integral methods) with one another. Since the finite element method will likely be the major method to be integrated, its enhanced robustness under element distortion is also examined, and a new robust shell element is demonstrated.

  4. Ozone Determination: A Comparison of Quantitative Analysis Methods

    Directory of Open Access Journals (Sweden)

    Rachmat Triandi Tjahjanto

    2012-10-01

    A comparison of ozone quantitative analysis by spectrophotometric and volumetric methods has been studied. The aim of this research is to determine the better method by considering the effect of reagent concentration and volume on the measured ozone concentration. The ozone analyzed in this research was synthesized from air and then used to ozonize methyl orange and potassium iodide solutions at different concentrations and volumes. Ozonation was carried out for 20 minutes at an air flow rate of 363 mL/minute. The concentrations of the ozonized methyl orange and potassium iodide solutions were analyzed by the spectrophotometric and volumetric methods, respectively. The results show that the concentration and volume of the reagent affect the measured ozone concentration. Based on the results of both methods, it can be concluded that the volumetric method is better than the spectrophotometric method.
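    The volumetric (iodometric) determination rests on standard stoichiometry: ozone liberates iodine from potassium iodide, and the iodine is titrated with sodium thiosulfate. A minimal sketch of the calculation, assuming the conventional equivalent weight of 24 g/eq for ozone; the titrant volume, normality and sample volume below are hypothetical, not the paper's values:

```python
def ozone_mg_per_l(titrant_ml, titrant_normality, sample_ml):
    """Iodometric ozone determination: O3 liberates I2 from KI, and the I2
    is titrated with sodium thiosulfate.  With an equivalent weight of
    24 g/eq for O3 (48 g/mol, 2 electrons transferred):
        mg O3 / L = mL titrant x N x 24000 / mL sample
    """
    return titrant_ml * titrant_normality * 24000.0 / sample_ml

# Example: 2.5 mL of 0.01 N thiosulfate consumed for a 200 mL absorbing solution.
print(ozone_mg_per_l(2.5, 0.01, 200.0))  # 3.0 mg/L
```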

  5. Extended Finite Element Method for Fracture Analysis of Structures

    CERN Document Server

    Mohammadi, Soheil

    2008-01-01

    This important textbook provides an introduction to the concepts of the newly developed extended finite element method (XFEM) for fracture analysis of structures, as well as for other related engineering applications. One of the main advantages of the method is that it avoids any need for remeshing or geometric crack modelling in numerical simulation, while generating discontinuous fields along a crack and around its tip. The second major advantage of the method is that with a small increase in the number of degrees of freedom, far more accurate solutions can be obtained. The method has recently been ...

  6. Reproducing Kernel Particle Method for Non-Linear Fracture Analysis

    Institute of Scientific and Technical Information of China (English)

    Cao Zhongqing; Zhou Benkuan; Chen Dapeng

    2006-01-01

    To study non-linear fracture, a non-linear constitutive model for piezoelectric ceramics was proposed, in which polarization switching and saturation were taken into account. Based on the model, non-linear fracture analysis was implemented using the reproducing kernel particle method (RKPM). Using the local J-integral as a fracture criterion, a curve of fracture loads against electric fields was obtained. Qualitatively, the curve agrees with the experimental observations reported in the literature. The reproducing equation, the RKPM shape function, and the transformation method for imposing essential boundary conditions in meshless methods are also introduced. The computation was implemented using object-oriented programming.

  7. ANALYSIS OF MULTICONDUCTOR TRANSMISSION LINES USING THE MACCORMACK METHOD

    Institute of Scientific and Technical Information of China (English)

    Dou Lei; Wang Zhiquan

    2006-01-01

    The MacCormack method is applied to the analysis of multiconductor transmission lines by introducing a new technique that does not require decoupling. This method can be used to analyze a wide range of problems and does not have to consider the matrix forms of distributed parameters. We have developed software named MacCormack Transmission Line Analyzer based on the proposed method. Numerical examples are presented to demonstrate the accuracy and efficiency of the method and illustrate its application to analyzing multiconductor transmission lines.

  8. AN ANALYSIS METHOD FOR HIGH-SPEED CIRCUIT SYSTEMS

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    A new method for analyzing high-speed circuit systems is presented. The method adds transmission-line end currents to the circuit variables of the classical modified nodal approach. The matrix equation describing a high-speed circuit system can then be formulated directly and analyzed conveniently in its normative form. A time-domain analysis method for transmission lines is also introduced. The two methods are combined to efficiently analyze high-speed circuit systems containing general transmission lines. A numerical experiment is presented, and the results are compared with those calculated by HSPICE.

  9. Rapid method of proximate analysis of coals from Indian coalfields

    Energy Technology Data Exchange (ETDEWEB)

    Alam, N.; Paul, S.K. [CFRI, Dhanbad (India)

    2001-07-01

    Proximate analysis is useful for finding the percentage of ash in coal, the threshold requirement by which the commercial value of coal is adjudged. In the present study, a rapid method for determining the ash percentage in coal is discussed. A rapid method is always desirable for releasing more quality parameters within the scheduled time. The results obtained by this method were found to be satisfactory and comparable with those obtained by the British Standard specifications. It is concluded that this method can be effectively adopted where sulphur, carbonates and alkalis are present in low amounts. 4 refs., 3 tabs.

  10. Hydrothermal analysis in engineering using control volume finite element method

    CERN Document Server

    Sheikholeslami, Mohsen

    2015-01-01

    Control volume finite element methods (CVFEM) bridge the gap between finite difference and finite element methods, using the advantages of both methods for simulation of multi-physics problems in complex geometries. In Hydrothermal Analysis in Engineering Using Control Volume Finite Element Method, CVFEM is covered in detail and applied to key areas of thermal engineering. Examples, exercises, and extensive references are used to show the use of the technique to model key engineering problems such as heat transfer in nanofluids (to enhance performance and compactness of energy systems),

  11. Application of computer intensive data analysis methods to the analysis of digital images and spatial data

    DEFF Research Database (Denmark)

    Windfeld, Kristian

    1992-01-01

    Computer-intensive methods for data analysis in a traditional setting have developed rapidly in the last decade. The application and adaptation of some of these methods to the analysis of multivariate digital images and spatial data are explored, evaluated and compared to well-established classical... linear methods. Different strategies for selecting projections (linear combinations) of multivariate images are presented. An exploratory, iterative method for finding interesting projections, originating in data analysis, is compared to principal components. A method for introducing spatial context...

  12. Comparative studies of upconversion luminescence characteristics and cell bioimaging based on one-step synthesized upconversion nanoparticles capped with different functional groups

    Energy Technology Data Exchange (ETDEWEB)

    Tsang, Ming-Kiu [Department of Applied Physics, The Hong Kong Polytechnic University, Hong Kong (China); Chan, Chi-Fai; Wong, Ka-Leung [Department of Chemistry, Hong Kong Baptist University (Hong Kong); Hao, Jianhua, E-mail: jh.hao@polyu.edu.hk [Department of Applied Physics, The Hong Kong Polytechnic University, Hong Kong (China)

    2015-01-15

    Herein, three types of upconverting NaGdF{sub 4}:Yb/Er nanoparticles (UCNPs) were synthesized via one-step hydrothermal synthesis with polyethylene glycol (PEG), polyethylenimine (PEI) and 6-aminocaproic acid (6AA) functionalization. To confirm the presence of these groups, FTIR spectra and ζ-potentials were measured, supporting the successful capping of PEG, PEI and 6AA on the UCNPs. The regular morphology and cubic phase of the functionalized UCNPs were attributed to the capping effect of the surfactants. Tunable upconversion luminescence (UCL) from red to green was observed under 980 nm laser excitation, and the UCL tuning was attributed to the various surface ligands. Moreover, surface-group-dependent UCL bioimaging was performed in HeLa cells. The enhanced UCL bioimaging demonstrated by the PEI-functionalized UCNPs indicates high cellular uptake, which is explained by the electrostatic attraction between the amino groups (–NH{sub 2}) and the cell membrane. The functionalized UCNPs also demonstrated low cytotoxicity in the MTT assay and exhibited paramagnetic behavior under a magnetic field. - Highlights: • Tunable upconversion emission by capped functional groups under fixed composition. • Surface-dependent upconversion luminescence bioimaging in HeLa cells. • Low cytotoxicity. • Additional paramagnetic property due to Gd{sup 3+} ions.

  13. Photoluminescent and biodegradable polycitrate-polyethylene glycol-polyethyleneimine polymers as highly biocompatible and efficient vectors for bioimaging-guided siRNA and miRNA delivery.

    Science.gov (United States)

    Wang, Min; Guo, Yi; Yu, Meng; Ma, Peter X; Mao, Cong; Lei, Bo

    2017-02-20

    Developing biodegradable and biocompatible non-viral vectors with intrinsic multifunctional properties, such as bioimaging ability, for highly efficient nucleic acid delivery still remains a challenge. Here, biodegradable poly(1,8-octanediol-citric acid)-co-polyethylene glycol polymers grafted with polyethyleneimine (PEI) (POCG-PEI), possessing photoluminescent capacity, were synthesized for nucleic acid (siRNA and miRNA) delivery. POCG-PEI polymers can efficiently bind various nucleic acids, protect them against enzymatic degradation and release the genes in the presence of polyanionic heparin. POCG-PEI also showed significantly low cytotoxicity, enhanced cellular uptake and high transfection efficiency of nucleic acids compared to the commercial transfection agents lipofectamine 2000 (Lipo) and polyethylenimine (PEI 25K). POCG-PEI polymers demonstrate excellent photostability, which allows imaging the cells and tracking the nucleic acid delivery in real time. The photoluminescent properties, low cytotoxicity, biodegradability, good gene binding and protection ability and high gene delivery efficiency make POCG-PEI highly competitive as a non-viral vector for gene delivery and real-time bioimaging applications. Our results may also be an important step toward designing biodegradable biomaterials with multifunctional properties for bioimaging-guided gene therapy applications.

  14. Continuum damage growth analysis using element free Galerkin method

    Indian Academy of Sciences (India)

    C O Arun; B N Rao; S M Srinivasan

    2010-06-01

    This paper presents an elasto-plastic element-free Galerkin formulation based on the Newton-Raphson algorithm for damage growth analysis. An isotropic ductile damage evolution law is used. A study has been carried out using the proposed element-free Galerkin method to understand the effect of initial damage and its growth on the structural response of single- and bi-material problems. A simple method is adopted for enforcing essential boundary conditions (EBCs) by scaling the function approximation using a scaling matrix when non-singular weight functions are used over the entire problem domain. Numerical examples comprising one- and two-dimensional problems are presented to illustrate the effectiveness of the proposed method in the analysis of uniform and non-uniform damage evolution problems. The effect of material discontinuity on damage growth analysis is also presented.

  15. Beyond perturbation introduction to the homotopy analysis method

    CERN Document Server

    Liao, Shijun

    2003-01-01

    Solving nonlinear problems is inherently difficult, and the stronger the nonlinearity, the more intractable solutions become. Analytic approximations often break down as nonlinearity becomes strong, and even perturbation approximations are valid only for problems with weak nonlinearity. This book introduces a powerful new analytic method for nonlinear problems, homotopy analysis, that remains valid even with strong nonlinearity. In Part I, the author starts with a very simple example, then presents the basic ideas, detailed procedures, and the advantages (and limitations) of homotopy analysis. Part II illustrates the application of homotopy analysis to many interesting nonlinear problems. These range from simple bifurcations of a nonlinear boundary-value problem to the Thomas-Fermi atom model, Volterra's population model, Von Kármán swirling viscous flow, and nonlinear progressive waves in deep water. Although the homotopy analysis method has been verified in a number of prestigious journals, it has yet to be ...

  16. Computer vision analysis of image motion by variational methods

    CERN Document Server

    Mitiche, Amar

    2014-01-01

    This book presents a unified view of image motion analysis under the variational framework. Variational methods, rooted in physics and mechanics, but appearing in many other domains, such as statistics, control, and computer vision, address a problem from an optimization standpoint, i.e., they formulate it as the optimization of an objective function or functional. The methods of image motion analysis described in this book use the calculus of variations to minimize (or maximize) an objective functional which transcribes all of the constraints that characterize the desired motion variables. The book addresses the four core subjects of motion analysis: Motion estimation, detection, tracking, and three-dimensional interpretation. Each topic is covered in a dedicated chapter. The presentation is prefaced by an introductory chapter which discusses the purpose of motion analysis. Further, a chapter is included which gives the basic tools and formulae related to curvature, Euler Lagrange equations, unconstrained de...

  17. Structural analysis with the finite element method linear statics

    CERN Document Server

    Oñate, Eugenio

    2013-01-01

    STRUCTURAL ANALYSIS WITH THE FINITE ELEMENT METHOD: Linear Statics, Volume 1: The Basis and Solids. Eugenio Oñate. The two volumes of this book cover most of the theoretical and computational aspects of the linear static analysis of structures with the Finite Element Method (FEM). The content of the book is based on the lecture notes of a basic course on Structural Analysis with the FEM taught by the author at the Technical University of Catalonia (UPC) in Barcelona, Spain for the last 30 years. Volume 1 presents the basis of the FEM for structural analysis and a detailed description of the finite element formulation for axially loaded bars, plane elasticity problems, axisymmetric solids and general three-dimensional solids. Each chapter describes the background theory for each structural model considered, details of the finite element formulation and guidelines for the application to structural engineering problems. The book includes a chapter on miscellaneous topics such as treatment of inclined supports, elas...

  18. Nonlinear Dimensionality Reduction Methods in Climate Data Analysis

    CERN Document Server

    Ross, Ian

    2008-01-01

    Linear dimensionality reduction techniques, notably principal component analysis, are widely used in climate data analysis as a means to aid the interpretation of datasets of high dimensionality. These linear methods may not be appropriate for the analysis of data arising from nonlinear processes occurring in the climate system. Numerous techniques for nonlinear dimensionality reduction have been developed recently that may provide a potentially useful tool for the identification of low-dimensional manifolds in climate datasets arising from nonlinear dynamics. In this thesis I apply three such techniques to the study of El Niño/Southern Oscillation variability in tropical Pacific sea surface temperatures and thermocline depth, comparing observational data with simulations from coupled atmosphere-ocean general circulation models from the CMIP3 multi-model ensemble. The three methods used here are a nonlinear principal component analysis (NLPCA) approach based on neural networks, the Isomap isometric mappin...
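    The motivation for nonlinear methods can be illustrated on a synthetic manifold: a linear projection cannot unroll a curved sheet, while Isomap's geodesic embedding can. A sketch using scikit-learn, with a swiss-roll dataset standing in for the climate fields (not the thesis's data or parameters):

```python
import numpy as np
from sklearn.datasets import make_swiss_roll
from sklearn.decomposition import PCA
from sklearn.manifold import Isomap

# Synthetic nonlinear manifold: a 2-D sheet rolled up in 3-D, with intrinsic
# coordinate t that a linear projection cannot recover.
X, t = make_swiss_roll(n_samples=1500, random_state=0)

pca_coord = PCA(n_components=1).fit_transform(X).ravel()
iso_coord = Isomap(n_neighbors=12, n_components=1).fit_transform(X).ravel()

# Correlation with the true manifold parameter t: Isomap, built on geodesic
# rather than straight-line distances, should recover it far better.
print("PCA    |corr|:", abs(np.corrcoef(pca_coord, t)[0, 1]))
print("Isomap |corr|:", abs(np.corrcoef(iso_coord, t)[0, 1]))
```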

  19. Methods of biomedical optical imaging: from subcellular structures to tissues and organs

    Science.gov (United States)

    Turchin, I. V.

    2016-05-01

    Optical bioimaging methods have a wide range of applications in the life sciences, most notably including the molecular resolution study of subcellular structures, small animal molecular imaging, and structural and functional clinical diagnostics of tissue layers and organs. We review fluorescent microscopy, fluorescent macroscopy, optical coherence tomography, optoacoustic tomography, and optical diffuse spectroscopy and tomography from the standpoint of physical fundamentals, applications, and progress.

  20. Vibration Analysis of Plates by MLS-Element Method

    Science.gov (United States)

    Zhou, L.; Xiang, Y.

    2010-05-01

    This paper presents a novel numerical method, the moving least square element (MLS-element) method, for the free vibration analysis of plates based on the Mindlin shear deformable plate theory. In the MLS-element method, a plate is first divided into multiple elements connected through selected nodal points on the element interfaces. An element can be of any shape, and its size varies depending on the problem at hand. The shape functions of the element for the transverse displacement and the rotations are derived based on the MLS interpolation technique. The convergence and accuracy of the method can be controlled either by increasing the number of elements or by increasing the number of MLS interpolation points within elements. Two examples, the vibration of a simply supported square Mindlin plate and of a clamped L-shaped Mindlin plate, are studied to illustrate the versatility and accuracy of the proposed method. The results show that the proposed method is highly accurate and flexible for the vibration analysis of plates. The method can be further developed to bridge existing meshless methods and the powerful finite element method in dealing with various engineering computational problems, such as large deformation and crack propagation in solid mechanics.

  1. Material nonlinear analysis via mixed-iterative finite element method

    Science.gov (United States)

    Sutjahjo, Edhi; Chamis, Christos C.

    1992-01-01

    The performance of elastic-plastic mixed-iterative analysis is examined through a set of convergence studies. Membrane and bending behaviors are tested using 4-node quadrilateral finite elements. The membrane result is excellent, which indicates the implementation of elastic-plastic mixed-iterative analysis is appropriate. On the other hand, further research to improve bending performance of the method seems to be warranted.

  2. MANNER OF STOCKS SORTING USING CLUSTER ANALYSIS METHODS

    Directory of Open Access Journals (Sweden)

    Jana Halčinová

    2014-06-01

    The aim of the present article is to show the possibility of using cluster analysis methods in the classification of stocks of finished products. Cluster analysis creates groups (clusters) of finished products according to similarity in demand, i.e., customer requirements for each product. The manner of sorting stocks of finished products into clusters is described with a practical example. The resulting clusters are incorporated into the draft layout of the distribution warehouse.

  3. Engineering and Design: Geotechnical Analysis by the Finite Element Method

    Science.gov (United States)

    2007-11-02

    Early applications of the method determined stresses and movements in embankments, and Reyes and Deere described its application to the analysis of underground openings in rock. Among the examples discussed is a 3-D steady-state seepage analysis of Cerrillos Dam near Ponce, Puerto Rico, in which the permeability of the cutoff walls was varied.

  4. Reliability analysis method for slope stability based on sample weight

    Directory of Open Access Journals (Sweden)

    Zhi-gang YANG

    2009-09-01

    The single safety factor criterion for slope stability evaluation, derived from the rigid limit equilibrium method or the finite element method (FEM), may not capture some important information, especially for steep slopes with complex geological conditions. This paper presents a new reliability method that uses sample weight analysis. Based on the distribution characteristics of the random variables, a minimal sample size for every random variable is extracted according to a small-sample t-distribution under a certain expected value, and the weight coefficient of each extracted sample is taken as its contribution to the random variable. The weight coefficients of the random sample combinations are then determined using the Bayes formula, and the different sample combinations are taken as the input for slope stability analysis. From the one-to-one mapping between input sample combinations and output safety coefficients, the reliability index of slope stability can be obtained with the multiplication principle. Slope stability analysis of the left bank of the Baihetan Project is used as an example, and the results show that the present method is reasonable and practicable for the reliability analysis of steep slopes with complex geological conditions.
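    The weighted-combination idea can be sketched with a toy limit state. Here each random variable is reduced to a few weighted sample points, combination weights are formed by multiplication (a simple stand-in for the paper's t-distribution sampling and Bayes weighting), and the accumulated failure weight maps to a reliability index. All numbers are hypothetical:

```python
import itertools

import numpy as np
from scipy.stats import norm

# Toy stand-in for the slope problem: safety factor F = resistance / load,
# each variable represented by a small weighted sample.
resistance = [(80.0, 0.2), (100.0, 0.6), (120.0, 0.2)]   # (value, weight)
load       = [(70.0, 0.25), (85.0, 0.5), (100.0, 0.25)]

failure_prob = 0.0
for (r, wr), (s, ws) in itertools.product(resistance, load):
    weight = wr * ws                  # combination weight (independent variables)
    if r / s < 1.0:                   # safety factor below 1 -> failure
        failure_prob += weight

beta = -norm.ppf(failure_prob)        # reliability index from P_f
print(f"P_f = {failure_prob:.3f}, beta = {beta:.2f}")
```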

  5. A Coupling Model of the Discontinuous Deformation Analysis Method and the Finite Element Method

    Institute of Scientific and Technical Information of China (English)

    ZHANG Ming; YANG Heqing; LI Zhongkui

    2005-01-01

    Neither the finite element method nor the discontinuous deformation analysis method alone can solve problems in rock mechanics and engineering very well, owing to their extreme complexity. A coupling method combining both should have wider applicability. Such a model, coupling the discontinuous deformation analysis method and the finite element method, is proposed in this paper. In the model, so-called line blocks are introduced to handle the interaction across the common interfacial boundary between the discontinuous deformation analysis domain and the finite element domain. The interfacial conditions during the incremental iteration process are satisfied by means of the line blocks. The requirement of gradually small displacements in each incremental step of the coupling method is met through a displacement control procedure. The model is simple in concept and easy to implement numerically. A numerical example is given. The displacements obtained by the coupling method agree well with those obtained by the finite element method, which shows the rationality of the model and the validity of the implementation.

  6. Entropy Analysis as an Electroencephalogram Feature Extraction Method

    Directory of Open Access Journals (Sweden)

    P. I. Sotnikov

    2014-01-01

    The aim of this study was to evaluate the possibility of using entropy analysis as an electroencephalogram (EEG) feature extraction method in brain-computer interfaces (BCI). The first section of the article describes the proposed algorithm, based on calculating characteristic features with Shannon entropy analysis. The second section discusses the development of a classifier for the EEG records; we use a support vector machine (SVM). The third section describes the test data. We then estimate the efficiency of the considered feature extraction method and compare it with a number of other methods: evaluation of signal variance; estimation of power spectral density (PSD); estimation of autoregression model parameters; signal analysis using the continuous wavelet transform; and construction of a common spatial pattern (CSP) filter. As a measure of efficiency we use the probability of correctly recognized types of imagined movements. Finally, we evaluate the impact of EEG signal preprocessing methods on the final classification accuracy. It is concluded that entropy analysis has good prospects in BCI applications.
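    A common concrete form of such an entropy feature is the spectral entropy, sketched below on synthetic stand-ins for two EEG states (the abstract does not specify this exact variant, and the signals here are not real EEG data):

```python
import numpy as np

def spectral_entropy(signal):
    """Shannon entropy (bits) of the normalized power spectrum.  A pure
    rhythm concentrates its power in a few frequency bins (low entropy),
    while broadband activity spreads it out (high entropy)."""
    psd = np.abs(np.fft.rfft(signal)) ** 2
    p = psd / psd.sum()
    p = p[p > 0]                       # drop empty bins (0 log 0 := 0)
    return -np.sum(p * np.log2(p))

# Hypothetical stand-ins for two EEG states: a 10 Hz alpha-like rhythm
# versus broadband noise.
rng = np.random.default_rng(1)
n = 1000
t = np.arange(n) / n
rhythm = np.sin(2 * np.pi * 10 * t)
noise = rng.standard_normal(n)

print("rhythm entropy:", spectral_entropy(rhythm))
print("noise entropy:", spectral_entropy(noise))
```

    Such scalar features, computed per channel or per time window, form the feature vector fed to the SVM classifier.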

  7. Quantitative Phase Analysis by the Rietveld Method for Forensic Science.

    Science.gov (United States)

    Deng, Fei; Lin, Xiaodong; He, Yonghong; Li, Shu; Zi, Run; Lai, Shijun

    2015-07-01

    Quantitative phase analysis (QPA) is helpful for determining the type attribute of an object because it gives the content of each constituent. QPA by the Rietveld method requires neither the measurement of calibration data nor the use of an internal standard; however, the approximate crystal structure of each phase in the mixture is necessary. In this study, 8 synthetic mixtures composed of potassium nitrate and sulfur were analyzed by the Rietveld QPA method. The Rietveld refinement was accomplished with a material analysis using diffraction program and evaluated by three agreement indices. Results showed that Rietveld QPA yielded precise results, with errors generally less than 2.0% absolute. In addition, a criminal case that was solved with the help of the Rietveld QPA method is introduced. This method allows forensic investigators to acquire detailed information about material evidence, which can point the way for case detection and court proceedings.
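    Standard-free Rietveld phase fractions are commonly computed with the Hill-Howard relation from the refined scale factors. A minimal sketch; the KNO3/sulfur numbers below are purely illustrative stand-ins, not the paper's refined values:

```python
def rietveld_weight_fractions(phases):
    """Hill-Howard relation: W_i = S_i (Z M V)_i / sum_j S_j (Z M V)_j,
    where S is the refined Rietveld scale factor, Z the number of formula
    units per cell, M the formula mass and V the unit-cell volume."""
    zmv = {name: s * z * m * v for name, (s, z, m, v) in phases.items()}
    total = sum(zmv.values())
    return {name: val / total for name, val in zmv.items()}

# Illustrative values only (hypothetical scale factors and cell data).
phases = {
    "KNO3":   (1.2e-4, 4, 101.1, 346.0),      # S, Z, M (g/mol), V (A^3)
    "sulfur": (1.5e-6, 128, 32.07, 3291.0),
}
fractions = rietveld_weight_fractions(phases)
print(fractions)
```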

  8. A method for exergy analysis of sugar cane bagasse boilers

    Energy Technology Data Exchange (ETDEWEB)

    Cortez, L.A.B.; Gomez, E.O. [Universidade Estadual de Campinas, SP (Brazil). Faculdade de Engenharia Agricola

    1998-03-01

    This work presents a method for conducting a thermodynamic analysis of sugarcane bagasse boilers. The method is based on the standard and actual reactions, which allows the calculation of the enthalpies of each process sub-equation and the exergies of each of the main flow rates participating in the combustion. The method is illustrated with an example using real data from a sugarcane bagasse boiler. A summary of the results is also presented, based on a First Law of Thermodynamics analysis, the exergetic efficiencies, and the irreversibility rates. The method is very rigorous with respect to data consistency, particularly for the flue gas composition. (author) 11 refs., 1 fig., 6 tabs.; e-mail: cortez at agr.unicamp.br

  9. Variable cluster analysis method for building neural network model

    Institute of Scientific and Technical Information of China (English)

    王海东; 刘元东

    2004-01-01

    To address the problems that input variables should be reduced as much as possible while still fully explaining the output variables when building a neural network model of a complicated system, a variable selection method based on cluster analysis was investigated. A similarity coefficient describing the mutual relation of variables was defined. Methods based on the highest contribution rate, part replacing whole, and variable replacement are put forward and derived from information theory. Software for the neural network based on cluster analysis, which provides several ways of defining the variable similarity coefficient, clustering the system variables and evaluating the variable clusters, was developed and applied to build a neural network forecast model of cement clinker quality. The results show that the network scale, training time and prediction accuracy are all satisfactory. The practical application demonstrates that this method of selecting variables for neural networks is feasible and effective.
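    The variable-clustering step can be sketched with an absolute-correlation similarity coefficient and hierarchical clustering; the six process variables below are synthetic stand-ins, and the similarity definition is one of several the abstract mentions:

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform

rng = np.random.default_rng(0)
# Hypothetical process data: 6 candidate input variables, where v0-v2 and
# v3-v4 form two correlated families and v5 is independent.
base1, base2 = rng.standard_normal((2, 200))
X = np.column_stack([
    base1, base1 + 0.1 * rng.standard_normal(200), -base1,
    base2, base2 + 0.1 * rng.standard_normal(200),
    rng.standard_normal(200),
])

# Similarity coefficient = |correlation|; distance = 1 - similarity.
sim = np.abs(np.corrcoef(X, rowvar=False))
dist = squareform(1.0 - sim, checks=False)
labels = fcluster(linkage(dist, method="average"), t=0.5, criterion="distance")
print("cluster labels:", labels)
```

    One representative variable per cluster would then be kept as a network input, shrinking the input layer without losing explanatory power.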

  10. Method for improving accuracy in full evaporation headspace analysis.

    Science.gov (United States)

    Xie, Wei-Qi; Chai, Xin-Sheng

    2017-03-21

    We report a new headspace analytical method in which multiple headspace extraction is incorporated into the full evaporation technique. The pressure uncertainty caused by changes in the solid content of the samples has a great impact on measurement accuracy in conventional full evaporation headspace analysis. The results (using ethanol solution as the model sample) showed that the present technique effectively minimizes this problem. The proposed full evaporation multiple headspace extraction technique is also automated and practical, and could greatly broaden the applications of full-evaporation-based headspace analysis.
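    As background, conventional multiple headspace extraction (MHE) infers the total analyte signal from the geometric decay of successive extraction peak areas. The sketch below shows that textbook relation only — it is not the authors' combined full-evaporation MHE procedure, and the function name and numbers are hypothetical.

```python
import numpy as np

def mhe_total_area(areas):
    """Estimate the total analyte signal from successive headspace extractions.

    In MHE the peak areas decay geometrically, A_i = A_1 * k**(i-1) with
    0 < k < 1, so the summed (total) area is A_1 / (1 - k).  The ratio k
    is estimated by a log-linear fit over all extraction steps.
    """
    areas = np.asarray(areas, dtype=float)
    i = np.arange(len(areas))
    slope, intercept = np.polyfit(i, np.log(areas), 1)
    k = np.exp(slope)          # per-step retention ratio
    a1 = np.exp(intercept)     # back-fitted first-extraction area
    return a1 / (1.0 - k)

# Synthetic check: A_1 = 1000, k = 0.6  ->  total = 1000 / 0.4 = 2500
areas = [1000 * 0.6**n for n in range(4)]
total = mhe_total_area(areas)
print(round(total))  # → 2500
```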

  11. Statistical evaluation of texture analysis from the biocrystallization method

    OpenAIRE

    Meelursarn, Aumaporn

    2007-01-01

    The consumers are becoming more concerned about food quality, especially regarding how, when and where the foods are produced (Haglund et al., 1999; Kahl et al., 2004; Alföldi, et al., 2006). Therefore, during recent years there has been a growing interest in the methods for food quality assessment, especially in the picture-development methods as a complement to traditional chemical analysis of single compounds (Kahl et al., 2006). The biocrystallization as one of the picture-developin...

  12. Longitudinal data analysis a handbook of modern statistical methods

    CERN Document Server

    Fitzmaurice, Garrett; Verbeke, Geert; Molenberghs, Geert

    2008-01-01

    Although many books currently available describe statistical models and methods for analyzing longitudinal data, they do not highlight connections between various research threads in the statistical literature. Responding to this void, Longitudinal Data Analysis provides a clear, comprehensive, and unified overview of state-of-the-art theory and applications. It also focuses on the assorted challenges that arise in analyzing longitudinal data. After discussing historical aspects, leading researchers explore four broad themes: parametric modeling, nonparametric and semiparametric methods, joint

  13. Application of Raman spectroscopy method for analysis of biopolymer materials

    Science.gov (United States)

    Timchenko, Elena V.; Timchenko, Pavel E.; Volchkov, S. E.; Mahortova, Alexsandra O.; Asadova, Anna A.; Kornilin, Dmitriy V.

    2016-10-01

    This work presents the results of spectral analysis of biopolymer materials used in medicine. Polymer samples containing polycaprolactone and iron oxides of different valence were studied, with Raman spectroscopy as the main control method. The relative content of iron oxide and polycaprolactone in the materials was assessed using the ratio of Raman intensity values at the 604 cm-1 and 1726 cm-1 wavenumbers to the intensity of the 1440 cm-1 line.
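    The band-ratio evaluation described above — intensities at 604 cm-1 and 1726 cm-1 relative to the 1440 cm-1 reference line — can be illustrated on a synthetic spectrum. The helper `band_ratio`, the Gaussian peak shapes, and the amplitudes are all invented for illustration.

```python
import numpy as np

def band_ratio(wavenumbers, intensities, band, ref):
    """Ratio of Raman intensity at `band` to intensity at `ref` (cm^-1),
    taking the nearest sampled point to each requested wavenumber."""
    w = np.asarray(wavenumbers)
    y = np.asarray(intensities)
    pick = lambda target: y[np.argmin(np.abs(w - target))]
    return pick(band) / pick(ref)

# Toy spectrum with peaks near 604, 1440 and 1726 cm^-1 on a flat baseline
w = np.linspace(400, 1900, 1500)
y = (1.0
     + 2.0 * np.exp(-((w - 604) / 8) ** 2)     # iron-oxide-related band
     + 5.0 * np.exp(-((w - 1440) / 10) ** 2)   # reference CH2 band
     + 3.0 * np.exp(-((w - 1726) / 10) ** 2))  # carbonyl band (PCL)
r_604 = band_ratio(w, y, 604, 1440)
r_1726 = band_ratio(w, y, 1726, 1440)
print(round(r_604, 2), round(r_1726, 2))
```

    In a real analysis the baseline would first be subtracted and the peaks integrated over a window rather than sampled at a single point.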

  14. Analysis of Stress Updates in the Material-point Method

    DEFF Research Database (Denmark)

    Andersen, Søren; Andersen, Lars

    2009-01-01

    The material-point method (MPM) is a new numerical method for analysis of large strain engineering problems. The MPM applies a dual formulation, where the state of the problem (mass, stress, strain, velocity etc.) is tracked using a finite set of material points while the governing equations...... updating and integrating stresses in time is problematic. This is discussed using an example of the dynamical collapse of a soil column....

  15. Analysis of experts' perception of the effectiveness of teaching methods

    Science.gov (United States)

    Kindra, Gurprit S.

    1984-03-01

    The present study attempts to shed light on the perceptions of business educators regarding the effectiveness of six methodologies in achieving Gagné's five learning outcomes. Results of this study empirically confirm the oft-stated contention that no one method is globally effective for the attainment of all objectives. Specifically, business games, traditional lecture, and case study methods are perceived to be most effective for the learning of application, knowledge acquisition, and analysis and application, respectively.

  16. New experimental and analysis methods in I-DLTS

    Energy Technology Data Exchange (ETDEWEB)

    Pandey, S.U. E-mail: sanjeev@bnl.gov; Middelkamp, P.; Li, Z.; Eremin, V

    1999-04-21

    A simple modification to traditional experimental apparatus to perform I-DLTS measurements is presented. This setup is shown to be faster and more sensitive than traditional double boxcar I-DLTS systems. A novel analysis technique utilising multiple exponential fits to the I-DLTS signal from a highly neutron irradiated silicon sample is presented along with a discussion of the results. It is shown that the new method has better resolution and can deconvolute overlapping peaks more accurately than previous methods.

  17. ANALYSIS METHODS ON STABILITY OF TALL AND BEDDING CREEP SLOPE

    Institute of Scientific and Technical Information of China (English)

    RUI Yongqin; JIANG Zhiming; LIU Jinghui

    1995-01-01

    Based on the model of slope engineering geology, the creep and failure mechanisms of tall and bedding slopes are analyzed in depth in this paper. The creep laws of weak intercalations are also discussed. The analysis of the stability of creep slopes and the age forecasting of sliding slopes have been conducted through numerical simulations using the Finite Element Method (FEM) and the Distinct Element Method (DEM).

  18. Analysis of flexible aircraft longitudinal dynamics and handling qualities. Volume 1: Analysis methods

    Science.gov (United States)

    Waszak, M. R.; Schmidt, D. S.

    1985-01-01

    As aircraft become larger and lighter due to design requirements for increased payload and improved fuel efficiency, they will also become more flexible. For highly flexible vehicles, the handling qualities may not be accurately predicted by conventional methods. This study applies two analysis methods to a family of flexible aircraft in order to investigate how and when structural (especially dynamic aeroelastic) effects affect the dynamic characteristics of aircraft. The first type of analysis is an open loop model analysis technique. This method considers the effects of modal residue magnitudes on determining vehicle handling qualities. The second method is a pilot in the loop analysis procedure that considers several closed loop system characteristics. Volume 1 consists of the development and application of the two analysis methods described above.

  19. Systematic approaches to data analysis from the Critical Decision Method

    Directory of Open Access Journals (Sweden)

    Martin Sedlár

    2015-01-01

    Full Text Available The aim of the present paper is to introduce how to analyse qualitative data from the Critical Decision Method. First, a characterization of the method provides a meaningful introduction to the issue. This method, used in naturalistic decision making research, is one of the cognitive task analysis methods; it is based on a retrospective semi-structured interview about a critical incident from work, and it may be applied in various domains such as emergency services, the military, transport, sport or industry. Researchers can make two types of methodological adaptation. Within-method adaptations modify the way the interviews are conducted, and cross-method adaptations combine this method with other related methods. There are many descriptions of conducting the interview, but descriptions of how the data should be analysed are rare. Some researchers use conventional approaches like content analysis, grounded theory or individual procedures tailored to the objectives of the research project. Wong (2004) describes two approaches to data analysis proposed for this method of data collection, which are described and reviewed in detail. They enable systematic work with a large amount of data. The structured approach organizes the data according to an a priori analysis framework and is suitable for a clearly defined object of research. Each incident is studied separately. First the decision chart showing the main decision points is made, and then the incident summary. These decision points are used to identify the relevant statements from the transcript, which are analysed in terms of the Recognition-Primed Decision Model. Finally, the results from all the analysed incidents are integrated. The limitation of the structured approach is that it may not reveal some interesting concepts. The emergent themes approach helps to identify these concepts while maintaining a systematic framework for analysis, and it is used for exploratory research designs.

  20. Surface-confined fluorescence enhancement of Au nanoclusters anchoring to a two-dimensional ultrathin nanosheet toward bioimaging

    Science.gov (United States)

    Tian, Rui; Yan, Dongpeng; Li, Chunyang; Xu, Simin; Liang, Ruizheng; Guo, Lingyan; Wei, Min; Evans, David G.; Duan, Xue

    2016-05-01

    Gold nanoclusters (Au NCs) as ultrasmall fluorescent nanomaterials possess discrete electronic energy and unique physicochemical properties, but suffer from relatively low quantum yield (QY) which severely affects their application in displays and imaging. To solve this conundrum and obtain highly-efficient fluorescent emission, 2D exfoliated layered double hydroxide (ELDH) nanosheets were employed to localize Au NCs with a density as high as 5.44 × 1013 cm-2, by virtue of the surface confinement effect of ELDH. Both experimental studies and computational simulations testify that the excited electrons of Au NCs are strongly confined by MgAl-ELDH nanosheets, which results in a largely promoted QY as well as prolonged fluorescence lifetime (both ~7 times enhancement). In addition, the as-fabricated Au NC/ELDH hybrid material exhibits excellent imaging properties with good stability and biocompatibility in the intracellular environment. Therefore, this work provides a facile strategy to achieve highly luminescent Au NCs via surface-confined emission enhancement imposed by ultrathin inorganic nanosheets, which can be potentially used in bio-imaging and cell labelling.

  1. Advanced Life Analysis Methods. Volume 3. Experimental Evaluation of Crack Growth Analysis Methods for Attachment Lugs

    Science.gov (United States)

    1984-09-17

    [Scanned-report OCR residue; only fragments are recoverable. The report evaluates crack growth analysis methods for attachment lugs, including a simple compounding solution, a 2-D cracked finite element procedure, a Green's function method, and a 3-D cracked finite element procedure.]

  2. Finite strip method combined with other numerical methods for the analysis of plates

    Science.gov (United States)

    Cheung, M. S.; Li, Wenchang

    1992-09-01

    Finite plate strips are combined with finite elements or boundary elements in the analysis of rectangular plates with minor irregularities such as openings, skew edges, etc. The plate is divided into regular and irregular regions. The regular region is analyzed by the finite strip method, while the irregular one is analyzed by the finite element or boundary element method. A special transition element and strip are developed in order to connect the two regions. Numerical examples show the accuracy and efficiency of this combined analysis.

  3. NUMERICAL ANALYSIS ON BINOMIAL TREE METHODS FOR AMERICAN LOOKBACK OPTIONS

    Institute of Scientific and Technical Information of China (English)

    戴民

    2001-01-01

    Lookback options are path-dependent options. In general, binomial tree methods, the most popular approaches to pricing options, involve a path-dependent variable as well as the underlying asset price for lookback options. However, for floating strike lookback options, a single-state-variable binomial tree method can be constructed. This paper is devoted to the convergence analysis of the single-state binomial tree methods for both discretely and continuously monitored American floating strike lookback options. We also investigate some properties of such options, including the effects of expiration date, interest rate and dividend yield on option prices, properties of optimal exercise boundaries, and so on.
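    For intuition, a European floating-strike lookback call (payoff S_T − min S_t) can be priced on a CRR binomial tree by brute-force path enumeration. This exponential-time sketch only illustrates what the path dependence costs; it is not the single-state-variable method the paper constructs, and all parameter values are invented.

```python
from itertools import product
import math

def lookback_call_binomial(s0, r, sigma, T, n):
    """Price a European floating-strike lookback call (payoff S_T - min S)
    by enumerating all 2**n paths of a CRR binomial tree.  Exponential in
    n, so purely illustrative -- the point of a single-state method is to
    avoid exactly this blow-up."""
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt))
    d = 1.0 / u
    p = (math.exp(r * dt) - d) / (u - d)   # risk-neutral up probability
    disc = math.exp(-r * T)
    value = 0.0
    for path in product((u, d), repeat=n):
        s, s_min, prob = s0, s0, 1.0
        for step in path:
            s *= step
            s_min = min(s_min, s)
            prob *= p if step == u else (1.0 - p)
        value += prob * (s - s_min)
    return disc * value

price = lookback_call_binomial(s0=100.0, r=0.05, sigma=0.2, T=1.0, n=10)
print(round(price, 2))
```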

  4. Application of the IPEBS method to dynamic contingency analysis

    Energy Technology Data Exchange (ETDEWEB)

    Martins, A.C.B. [FURNAS, Rio de Janeiro, RJ (Brazil); Pedroso, A.S. [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil)

    1994-12-31

    Dynamic contingency analysis is certainly a demanding task in the context of dynamic performance evaluation. This paper presents the results of a test checking the contingency screening capability of the IPEBS method. A Brazilian 1100-bus, 112-generator system was used in the test; the ranking of contingencies based on critical clearing times obtained with IPEBS was compared with the ranking derived from detailed time-domain simulation. The results of this comparison encourage us to recommend the use of the method in industry applications, as a complement to the current method of time-domain simulation. (author) 5 refs., 1 fig., 2 tabs.

  5. Background and Mathematical Analysis of Diffusion MRI Methods.

    Science.gov (United States)

    Ozcan, Alpay; Wong, Kenneth H; Larson-Prior, Linda; Cho, Zang-Hee; Mun, Seong K

    2012-03-01

    The addition of a pair of magnetic field gradient pulses initially enabled the measurement of spin motion with nuclear magnetic resonance (NMR) techniques. In the adaptation of diffusion-weighted (DW) NMR techniques to magnetic resonance imaging (MRI), the taxonomy of mathematical models divides into two categories: model matching and spectral methods. In this review, the methods are summarized starting from the early DW NMR models, followed by their adaptation to DW MRI. Finally, a newly introduced Fourier-analysis-based unifying theory, so-called Complete Fourier Direct MRI, is included to explain the mechanisms of the existing methods.
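    The simplest instance of "model matching" is the mono-exponential DW signal model S(b) = S0·exp(−b·D), from which an apparent diffusion coefficient can be recovered from two diffusion weightings. The function name and the numbers below are illustrative assumptions.

```python
import math

def adc_two_point(s_low, s_high, b_low, b_high):
    """Apparent diffusion coefficient from two diffusion weightings,
    assuming the mono-exponential model S(b) = S0 * exp(-b * D)."""
    return math.log(s_low / s_high) / (b_high - b_low)

# Synthetic signals for D = 1e-3 mm^2/s (typical tissue order of magnitude)
D_true = 1.0e-3
s_b0 = 1000.0                                   # signal at b = 0
s_b1000 = s_b0 * math.exp(-1000 * D_true)       # signal at b = 1000 s/mm^2
adc = adc_two_point(s_b0, s_b1000, 0, 1000)
print(adc)  # recovers D_true
```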

  6. Transuranic waste characterization sampling and analysis methods manual

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-05-01

    The Transuranic Waste Characterization Sampling and Analysis Methods Manual (Methods Manual) provides a unified source of information on the sampling and analytical techniques that enable Department of Energy (DOE) facilities to comply with the requirements established in the current revision of the Transuranic Waste Characterization Quality Assurance Program Plan (QAPP) for the Waste Isolation Pilot Plant (WIPP) Transuranic (TRU) Waste Characterization Program (the Program). This Methods Manual includes all of the testing, sampling, and analytical methodologies accepted by DOE for use in implementing the Program requirements specified in the QAPP.

  7. Scientific and methodical approaches to analysis of enterprise development potential

    Directory of Open Access Journals (Sweden)

    Hrechina Iryna V.

    2014-01-01

    Full Text Available The modern state of the Ukrainian economy urges enterprises to search for new possibilities for their development, which makes the subject of this study topical. The article systemises existing approaches to the analysis of enterprise development potential and marks out two main scientific approaches: the first is directed at analysing the prospects of self-development of the economic system; the second at analysing the probability of growth opportunities. In order to improve the process of forming methods for the analysis of enterprise development potential, the article offers an organisational model of the methods and characterises its main elements. It develops analysis methods based on indicators of potentialogical sustainability. The scientific novelty of the results lies in the possibility of identifying the main directions of enterprise development using the enterprise development potential ratio: self-development or the probability of augmenting opportunities, which is traced through the interconnection of resources and profit.

  8. Approaches and methods for econometric analysis of market power

    DEFF Research Database (Denmark)

    Perekhozhuk, Oleksandr; Glauben, Thomas; Grings, Michael

    2017-01-01

    This study discusses two widely used approaches in the New Empirical Industrial Organization (NEIO) literature and examines the strengths and weaknesses of the Production-Theoretic Approach (PTA) and the General Identification Method (GIM) for the econometric analysis of market power in agricultu...

  9. A Syntactic Method for Time-Varying Pattern Analysis,

    Science.gov (United States)

    1981-05-01

    [Scanned-report OCR residue (Purdue Univ., School of Electrical Engineering); only fragments are recoverable, including one cited reference: K. S. Fu and P. H. Swain, On Syntactic Pattern Recognition, in Software Engineering (J. T. Tou, ed.), Vol. 2, Academic Press, New York, 1971.]

  10. Single corn kernel aflatoxin B1 extraction and analysis method

    Science.gov (United States)

    Aflatoxins are highly carcinogenic compounds produced by the fungus Aspergillus flavus. Aspergillus flavus is a phytopathogenic fungus that commonly infects crops such as cotton, peanuts, and maize. The goal was to design an effective sample preparation method and analysis for the extraction of afla...

  11. Smoothed analysis of the k-means method

    NARCIS (Netherlands)

    Arthur, David; Manthey, Bodo; Röglin, Heiko

    2011-01-01

    The k-means method is one of the most widely used clustering algorithms, drawing its popularity from its speed in practice. Recently, however, it was shown to have exponential worst-case running time. In order to close the gap between practical performance and theoretical analysis, the k-means method
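    For reference, the k-means method whose smoothed complexity is analyzed is plain Lloyd's iteration: alternate nearest-center assignment with centroid updates until the centers stop moving. A minimal sketch (the initialization scheme and the data are illustrative choices):

```python
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    """Plain Lloyd's k-means: alternate assignment and centroid update.
    This is the practically fast method whose worst-case behaviour is
    exponential but whose smoothed running time is polynomial."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # assign each point to its nearest center
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # move each center to the mean of its assigned points
        new = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers

# Two well-separated blobs: Lloyd's iteration recovers them quickly
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(5, 0.3, (50, 2))])
labels, centers = kmeans(X, k=2)
print(np.bincount(labels))  # each blob ends up as one cluster
```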

  12. CHROMATOGRAPHIC METHODS IN THE ANALYSIS OF CHOLESTEROL AND RELATED LIPIDS

    NARCIS (Netherlands)

    HOVING, EB

    1995-01-01

    Methods using thin-layer chromatography, solid-phase extraction, gas chromatography, high-performance liquid chromatography and supercritical fluid chromatography are described for the analysis of single cholesterol, esterified and sulfated cholesterol, and for cholesterol in the context of other li

  13. Homotopy analysis method for solving KdV equations

    Directory of Open Access Journals (Sweden)

    Hossein Jafari

    2010-06-01

    Full Text Available A scheme is developed for the numerical study of the Korteweg-de Vries (KdV) and Korteweg-de Vries-Burgers (KdVB) equations with initial conditions by a homotopy approach. Numerical solutions obtained by the homotopy analysis method are compared with the exact solution. The comparison shows that the obtained solutions are in excellent agreement.

  14. Fast nonlinear regression method for CT brain perfusion analysis.

    Science.gov (United States)

    Bennink, Edwin; Oosterbroek, Jaap; Kudo, Kohsuke; Viergever, Max A; Velthuis, Birgitta K; de Jong, Hugo W A M

    2016-04-01

    Although computed tomography (CT) perfusion (CTP) imaging enables rapid diagnosis and prognosis of ischemic stroke, current CTP analysis methods have several shortcomings. We propose a fast nonlinear regression method with a box-shaped model (boxNLR) that has important advantages over the current state-of-the-art method, block-circulant singular value decomposition (bSVD). These advantages include improved robustness to attenuation curve truncation, extensibility, and unified estimation of perfusion parameters. The method is compared with bSVD and with a commercial SVD-based method. The three methods were quantitatively evaluated by means of a digital perfusion phantom described by Kudo et al. and qualitatively with the aid of 50 clinical CTP scans. All three methods yielded high Pearson correlation coefficients with the ground truth in the phantom. The boxNLR perfusion maps of the clinical scans showed higher correlation with bSVD than the perfusion maps from the commercial method. Furthermore, it was shown that boxNLR estimates are robust to noise, truncation, and tracer delay. The proposed method provides a fast and reliable way of estimating perfusion parameters from CTP scans. This suggests it could be a viable alternative to current commercial and academic methods.

  15. Time series analysis methods and applications for flight data

    CERN Document Server

    Zhang, Jianye

    2017-01-01

    This book focuses on different facets of flight data analysis, including the basic goals, methods, and implementation techniques. As mass flight data possesses the typical characteristics of time series, the time series analysis methods and their application for flight data have been illustrated from several aspects, such as data filtering, data extension, feature optimization, similarity search, trend monitoring, fault diagnosis, and parameter prediction, etc. An intelligent information-processing platform for flight data has been established to assist in aircraft condition monitoring, training evaluation and scientific maintenance. The book will serve as a reference resource for people working in aviation management and maintenance, as well as researchers and engineers in the fields of data analysis and data mining.

  16. Transient Analysis of Hysteresis Queueing Model Using Matrix Geometric Method

    Directory of Open Access Journals (Sweden)

    Wajiha Shah

    2011-10-01

    Full Text Available Various analytical methods have been proposed for the transient analysis of queueing systems in the scalar domain. In this paper, a vector-domain-based transient analysis is proposed for the hysteresis queueing system with internal thresholds, for efficient and numerically stable analysis. In this system, the arrival rate of customers is controlled through the internal thresholds, and the system is analyzed as a quasi-birth-death process through the matrix geometric method combined with a vector-form Runge-Kutta numerical procedure that utilizes the special matrices. The arrival and service processes of the system follow a Markovian distribution. We analyze the mean number of customers in the system in the transient state against varying time for a Markovian distribution. The results show that the effect of oscillation/hysteresis depends on the difference between the two internal threshold values.
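    The construction sketched in the abstract — a finite Markov chain whose arrival rate switches at internal thresholds, integrated in time with a Runge-Kutta scheme — can be illustrated directly. The state space, rates, thresholds, and function names below are hypothetical choices, and a plain dense generator is used instead of the paper's matrix-geometric block structure.

```python
import numpy as np

def hysteresis_generator(K, lam_hi, lam_lo, mu, lower, upper):
    """Generator matrix of an M/M/1/K queue whose arrival rate switches to
    lam_lo when the queue reaches `upper` and back to lam_hi when it drains
    to `lower` (hysteresis control).  State = (queue length n, mode m)."""
    idx = lambda n, m: m * (K + 1) + n
    Q = np.zeros((2 * (K + 1), 2 * (K + 1)))
    for m, lam in ((0, lam_hi), (1, lam_lo)):
        for n in range(K + 1):
            i = idx(n, m)
            if n < K:  # arrival; crossing `upper` flips to the low-rate mode
                m2 = 1 if (m == 0 and n + 1 >= upper) else m
                Q[i, idx(n + 1, m2)] += lam
            if n > 0:  # departure; draining to `lower` restores the high rate
                m2 = 0 if (m == 1 and n - 1 <= lower) else m
                Q[i, idx(n - 1, m2)] += mu
            Q[i, i] = -Q[i].sum()
    return Q

def rk4_transient(Q, p0, t, steps=2000):
    """Integrate the forward equations dp/dt = p Q with classical RK4."""
    h = t / steps
    p = p0.copy()
    f = lambda p: p @ Q
    for _ in range(steps):
        k1 = f(p); k2 = f(p + h/2*k1); k3 = f(p + h/2*k2); k4 = f(p + h*k3)
        p = p + h/6*(k1 + 2*k2 + 2*k3 + k4)
    return p

K = 10
Q = hysteresis_generator(K, lam_hi=1.2, lam_lo=0.3, mu=1.0, lower=2, upper=8)
p0 = np.zeros(2 * (K + 1)); p0[0] = 1.0        # start empty, high-rate mode
p = rk4_transient(Q, p0, t=50.0)
n_mean = sum(p[m*(K+1)+n] * n for m in (0, 1) for n in range(K + 1))
print(round(float(n_mean), 3))                 # transient mean queue length
```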

  17. Collage Portraits as a Method of Analysis in Qualitative Research

    Directory of Open Access Journals (Sweden)

    Paula Gerstenblatt PhD

    2013-02-01

    Full Text Available This article explores the use of collage portraits in qualitative research and analysis. Collage portraiture, an area of arts-based research (ABR, is gaining stature as a method of analysis and documentation in many disciplines. This article presents a method of creating collage portraits to support a narrative thematic analysis that explored the impact of participation in an art installation construction. Collage portraits provide the opportunity to include marginalized voices and encourage a range of linguistic and non-linguistic representations to articulate authentic lived experiences. Other potential benefits to qualitative research are cross-disciplinary study and collaboration, innovative ways to engage and facilitate dialogue, and the building and dissemination of knowledge.

  18. Fourier analysis for discontinuous Galerkin and related methods

    Institute of Scientific and Technical Information of China (English)

    ZHANG MengPing; SHU Chi-Wang

    2009-01-01

    In this paper we review a series of recent work on using a Fourier analysis technique to study the stability and error estimates for the discontinuous Galerkin method and other related schemes. The advantage of this approach is that it can reveal instability of certain "bad" schemes; it can verify stability for certain good schemes which are not easily amenable to standard finite element stability analysis techniques; it can provide quantitative error comparisons among different schemes; and it can be used to study superconvergence and time evolution of errors for the discontinuous Galerkin method. We briefly describe this Fourier analysis technique, summarize its usage in stability and error estimates for various schemes, and indicate the advantages and disadvantages of this technique in comparison with other finite element techniques.
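    The core calculation in such a Fourier (von Neumann) analysis is the amplification factor of a scheme on a uniform grid. As a toy stand-in for the discontinuous Galerkin case, first-order upwind for the advection equation shows how stability is read off from |g(θ)| ≤ 1:

```python
import numpy as np

def upwind_amplification(cfl, thetas):
    """Fourier (von Neumann) amplification factor of first-order upwind
    for u_t + a u_x = 0:  g(theta) = 1 - c + c*exp(-i*theta), c = a*dt/dx."""
    return 1.0 - cfl + cfl * np.exp(-1j * thetas)

thetas = np.linspace(0, 2 * np.pi, 401)
stable = np.abs(upwind_amplification(0.8, thetas)).max() <= 1.0 + 1e-12
unstable = np.abs(upwind_amplification(1.2, thetas)).max() > 1.0
print(stable, unstable)  # → True True
```

    With CFL number c ≤ 1 the factor satisfies |g| ≤ 1 for every wavenumber (stability); with c > 1 some modes grow, which is exactly the kind of conclusion the reviewed technique draws for DG schemes.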

  19. Research on Visual Analysis Methods of Terrorism Events

    Science.gov (United States)

    Guo, Wenyue; Liu, Haiyan; Yu, Anzhu; Li, Jing

    2016-06-01

    Given that terrorism events occur more and more frequently throughout the world, improving the response capability to social security incidents has become an important test of a government's governing ability. Visual analysis has become an important method of event analysis because of its advantage of being intuitive and effective. To analyse events' spatio-temporal distribution characteristics, correlations among event items, and development trends, terrorism events' spatio-temporal characteristics are discussed. A suitable event data table structure based on the "5W" theory is designed. Then, six types of visual analysis are proposed, and how to use thematic maps and statistical charts to realize visual analysis of terrorism events is studied. Finally, experiments were carried out using data provided by the Global Terrorism Database, and the results prove the availability of the methods.

  20. Regionally Smoothed Meta-Analysis Methods for GWAS Datasets.

    Science.gov (United States)

    Begum, Ferdouse; Sharker, Monir H; Sherman, Stephanie L; Tseng, George C; Feingold, Eleanor

    2016-02-01

    Genome-wide association studies are proven tools for finding disease genes, but it is often necessary to combine many cohorts into a meta-analysis to detect statistically significant genetic effects. Often the component studies are performed by different investigators on different populations, using different chips with minimal SNP overlap. In some cases, raw data are not available for imputation, so that only results for the genotyped single nucleotide polymorphisms (SNPs) can be used in meta-analysis. Even when SNP sets are comparable, different cohorts may have peak association signals at different SNPs within the same gene due to population differences in linkage disequilibrium or environmental interactions. We hypothesize that the power to detect statistical signals in these situations will improve by using a method that simultaneously meta-analyzes and smooths the signal over nearby markers. In this study, we propose regionally smoothed meta-analysis methods and compare their performance on real and simulated data.
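    One way to realize "simultaneously meta-analyze and smooth" is to combine per-cohort z-scores with Stouffer's method wherever each SNP is genotyped, then kernel-smooth the combined scores over base-pair position. This is a hypothetical sketch, not the authors' proposed estimator; all names and numbers are invented.

```python
import numpy as np

def stouffer_meta(z, present):
    """Stouffer combination per SNP over the cohorts in which the SNP was
    genotyped (absent cohorts contribute nothing)."""
    z = np.where(present, z, 0.0)
    k = present.sum(axis=0)
    return z.sum(axis=0) / np.sqrt(np.maximum(k, 1))

def region_smooth(pos, z, bandwidth):
    """Gaussian-kernel smoothing of meta-analysis z-scores over base-pair
    positions, so nearby markers reinforce a shared signal."""
    w = np.exp(-0.5 * ((pos[:, None] - pos[None, :]) / bandwidth) ** 2)
    return (w * z[None, :]).sum(axis=1) / w.sum(axis=1)

# Two cohorts on partially overlapping panels; zeros mark absent genotypes
pos = np.array([100, 200, 300, 400, 1500], dtype=float)
z = np.array([[2.5, 0.0, 2.2, 2.4, 0.1],
              [2.3, 2.6, 0.0, 2.1, -0.2]])
present = np.array([[1, 0, 1, 1, 1],
                    [1, 1, 0, 1, 1]], dtype=bool)
meta = stouffer_meta(z, present)
smooth = region_smooth(pos, meta, bandwidth=150.0)
print(smooth.round(2))  # clustered signal reinforced, isolated SNP untouched
```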

  1. Transuranic waste characterization sampling and analysis methods manual. Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Suermann, J.F.

    1996-04-01

    This Methods Manual provides a unified source of information on the sampling and analytical techniques that enable Department of Energy (DOE) facilities to comply with the requirements established in the current revision of the Transuranic Waste Characterization Quality Assurance Program Plan (QAPP) for the Waste Isolation Pilot Plant (WIPP) Transuranic (TRU) Waste Characterization Program (the Program) and the WIPP Waste Analysis Plan. This Methods Manual includes all of the testing, sampling, and analytical methodologies accepted by DOE for use in implementing the Program requirements specified in the QAPP and the WIPP Waste Analysis Plan. The procedures in this Methods Manual are comprehensive and detailed and are designed to provide the necessary guidance for the preparation of site-specific procedures. With some analytical methods, such as Gas Chromatography/Mass Spectrometry, the Methods Manual procedures may be used directly. With other methods, such as nondestructive characterization, the Methods Manual provides guidance rather than a step-by-step procedure. Sites must meet all of the specified quality control requirements of the applicable procedure. Each DOE site must document the details of the procedures it will use and demonstrate the efficacy of such procedures to the Manager, National TRU Program Waste Characterization, during Waste Characterization and Certification audits.

  2. Template matching method for the analysis of interstellar cloud structure

    CERN Document Server

    Juvela, M

    2016-01-01

    The structure of the interstellar medium can be characterised at large scales in terms of its global statistics (e.g. power spectra) and at small scales by the properties of individual cores. Interest has been increasing in structures at intermediate scales, and a number of methods have been developed for the analysis of filamentary structures. We describe the application of the generic template-matching (TM) method to the analysis of maps. Our aim is to show that it provides a fast and still relatively robust way to identify elongated structures or other image features. We present the implementation of a TM algorithm for map analysis. The results are compared against the rolling Hough transform (RHT), one of the methods previously used to identify filamentary structures. We illustrate the method by applying it to Herschel surface brightness data. The performance of the TM method is found to be comparable to that of RHT, but TM appears to be more robust regarding the input parameters, for example, those related t...
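    A generic template-matching step of the kind described — sliding a small template over a map and scoring each position — is commonly implemented as normalized cross-correlation. The sketch below illustrates that generic operation, not the paper's implementation; the bar-shaped template and noise level are invented.

```python
import numpy as np

def match_template(image, template):
    """Normalized cross-correlation of a template against every valid
    position of a 2-D map; returns the correlation surface."""
    th, tw = template.shape
    t = template - template.mean()
    tn = np.sqrt((t * t).sum())
    H, W = image.shape
    out = np.full((H - th + 1, W - tw + 1), -1.0)
    for i in range(H - th + 1):
        for j in range(W - tw + 1):
            patch = image[i:i+th, j:j+tw]
            p = patch - patch.mean()
            pn = np.sqrt((p * p).sum())
            if pn > 0:
                out[i, j] = (p * t).sum() / (pn * tn)
    return out

# Hide an elongated (filament-like) feature in a noisy map
rng = np.random.default_rng(0)
image = 0.1 * rng.normal(size=(40, 40))
bar = np.zeros((3, 9)); bar[1, :] = 1.0     # horizontal bar template
image[20:23, 12:21] += bar
score = match_template(image, bar)
peak = np.unravel_index(score.argmax(), score.shape)
print(peak)  # correlation peaks where the bar was injected
```

    In practice the template would be rotated over a set of angles to recover filament orientation, which is the quantity RHT estimates directly.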

  3. A method for rapid similarity analysis of RNA secondary structures

    Directory of Open Access Journals (Sweden)

    Liu Na

    2006-11-01

    Full Text Available Abstract Background Owing to the rapid expansion of RNA structure databases in recent years, efficient methods for structure comparison are in demand for function prediction and evolutionary analysis. Usually, the similarity of RNA secondary structures is evaluated based on tree models and dynamic programming algorithms. We present here a new method for the similarity analysis of RNA secondary structures. Results Three sets of real data have been used as input for the example applications. Set I includes structures from 5S rRNAs. Set II includes secondary structures from RNase P and RNase MRP. Set III includes structures from 16S rRNAs. Reasonable phylogenetic trees are derived for these three sets of data using our method. Moreover, our program runs faster than some existing ones. Conclusion The famous Lempel-Ziv algorithm can efficiently extract the information on repeated patterns encoded in RNA secondary structures, which makes our method an alternative way to analyze the similarity of RNA secondary structures. This method will also be useful to researchers who are interested in evolutionary analysis.
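    The Lempel-Ziv idea can be made concrete on dot-bracket encodings of secondary structures: count the distinct phrases of an incremental LZ78-style parse and turn the cross-parse counts into a normalized distance. The distance formula below follows Otu and Sayood's LZ sequence distance, which is an assumption here since the abstract does not give the exact definition.

```python
def lz_phrases(s):
    """Number of distinct phrases in an LZ78-style incremental parse."""
    phrases, cur = set(), ""
    for ch in s:
        cur += ch
        if cur not in phrases:
            phrases.add(cur)
            cur = ""
    if cur:
        phrases.add(cur)
    return len(phrases)

def lz_distance(x, y):
    """Normalized LZ distance (smaller = more similar), in the spirit of
    Otu & Sayood: extra phrases needed to parse one string after the other."""
    cx, cy = lz_phrases(x), lz_phrases(y)
    cxy, cyx = lz_phrases(x + y), lz_phrases(y + x)
    return max(cxy - cx, cyx - cy) / max(cx, cy)

# Dot-bracket encodings of three toy secondary structures
hairpin1 = "(((((....)))))"
hairpin2 = "((((......))))"
twoloop  = "((..))..((..))"
d_close = lz_distance(hairpin1, hairpin2)
d_far = lz_distance(hairpin1, twoloop)
print(d_close, d_far)  # similar hairpins are closer than distinct folds
```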

  4. Precise methods for conducted EMI modeling, analysis, and prediction

    Institute of Scientific and Technical Information of China (English)

    MA WeiMing; ZHAO ZhiHua; MENG Jin; PAN QiJun; ZHANG Lei

    2008-01-01

    Focusing on the state-of-the-art conducted EMI prediction, this paper presents a noise source lumped circuit modeling and identification method, an EMI modeling method based on multiple slope approximation of switching transitions, and a double Fourier integral method modeling PWM conversion units to achieve an accurate modeling of EMI noise source. Meanwhile, a new sensitivity analysis method, a general coupling model for steel ground loops, and a partial element equivalent circuit method are proposed to identify and characterize conducted EMI coupling paths. The EMI noise and propagation modeling provide an accurate prediction of conducted EMI in the entire frequency range (0-10 MHz) with good practicability and generality. Finally a new measurement approach is presented to identify the surface current of large dimensional metal shell. The proposed analytical modeling methodology is verified by experimental results.

  6. Method of Real-Time Principal-Component Analysis

    Science.gov (United States)

    Duong, Tuan; Duong, Vu

    2005-01-01

    Dominant-element-based gradient descent and dynamic initial learning rate (DOGEDYN) is a method of sequential principal-component analysis (PCA) that is well suited for such applications as data compression and extraction of features from sets of data. In comparison with a prior method of gradient-descent-based sequential PCA, this method offers a greater rate of learning convergence. Like the prior method, DOGEDYN can be implemented in software. However, the main advantage of DOGEDYN over the prior method lies in the fact that it requires less computation and can be implemented in simpler hardware. It should be possible to implement DOGEDYN in compact, low-power, very-large-scale integrated (VLSI) circuitry that could process data in real time.
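
    DOGEDYN itself is not reproduced here, but the family of methods it improves on (gradient-descent sequential PCA) can be sketched with the classic Oja update, extracting the first principal component one sample at a time. A minimal numpy sketch, with learning rate and epoch count as illustrative assumptions:

```python
import numpy as np

def oja_first_component(X, lr=0.01, epochs=50, seed=0):
    """Estimate the first principal component by stochastic gradient ascent
    on the projected variance (Oja's rule), one sample at a time."""
    rng = np.random.default_rng(seed)
    Xc = X - X.mean(axis=0)                 # center the data
    w = rng.normal(size=X.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(epochs):
        for x in Xc:
            y = w @ x                       # projection onto current estimate
            w += lr * y * (x - y * w)       # Oja update keeps ||w|| near 1
        w /= np.linalg.norm(w)              # re-normalize for stability
    return w
```

    Sequential schemes like this avoid forming and diagonalizing the full covariance matrix, which is what makes them attractive for streaming data and simple hardware.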

  7. Time series analysis by the Maximum Entropy method

    Energy Technology Data Exchange (ETDEWEB)

    Kirk, B.L.; Rust, B.W.; Van Winkle, W.

    1979-01-01

    The principal subject of this report is the use of the Maximum Entropy method for spectral analysis of time series. The classical Fourier method is also discussed, mainly as a standard for comparison with the Maximum Entropy method. Examples are given which clearly demonstrate the superiority of the latter method over the former when the time series is short. The report also includes a chapter outlining the theory of the method, a discussion of the effects of noise in the data, a chapter on significance tests, a discussion of the problem of choosing the prediction filter length, and, most importantly, a description of a package of FORTRAN subroutines for making the various calculations. Cross-referenced program listings are given in the appendices. The report also includes a chapter demonstrating the use of the programs by means of an example. Real time series like the lynx data and sunspot numbers are also analyzed. 22 figures, 21 tables, 53 references.
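
    The report's FORTRAN package is not reproduced here, but the core idea can be sketched: the maximum-entropy spectrum of a stationary series is an all-pole (autoregressive) spectrum. The illustrative Python sketch below estimates AR coefficients via the Yule-Walker equations (a close cousin of the Burg estimator typically used in maximum-entropy work; order and data are assumptions):

```python
import numpy as np

def yule_walker(x, order):
    """Estimate AR(order) coefficients and innovation variance from the
    sample autocorrelation via the Yule-Walker equations."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    a = np.linalg.solve(R, r[1:])
    sigma2 = r[0] - a @ r[1:]
    return a, sigma2

def mem_spectrum(a, sigma2, freqs):
    """All-pole (maximum-entropy) power spectrum evaluated at the given
    frequencies in cycles/sample: sigma^2 / |1 - sum_k a_k e^{-i 2 pi f k}|^2."""
    k = np.arange(1, len(a) + 1)
    denom = np.abs(1 - np.exp(-2j * np.pi * np.outer(freqs, k)) @ a) ** 2
    return sigma2 / denom
```

    The sharp-peak behaviour on short records that the report highlights comes from the poles of this rational spectrum, which a short-record Fourier periodogram cannot match.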

  8. Nonlinear fault diagnosis method based on kernel principal component analysis

    Institute of Scientific and Technical Information of China (English)

    Yan Weiwu; Zhang Chunkai; Shao Huihe

    2005-01-01

    To ensure that a system runs in working order, fault detection and diagnosis play an important role in industrial processes. This paper proposes a nonlinear fault diagnosis method based on kernel principal component analysis (KPCA). In the proposed method, the essential information of the nonlinear system extracted by KPCA is used to construct a KPCA model of the nonlinear system under normal working conditions. New data are then projected onto the KPCA model; when new data are incompatible with the KPCA model, it can be concluded that the nonlinear system is out of normal working condition. The proposed method was applied to fault diagnosis on rolling bearings. Simulation results show that the proposed method provides an effective means of fault detection and diagnosis for nonlinear systems.
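
    One standard variant of KPCA-based fault detection can be sketched in numpy: fit KPCA on normal-condition data with an RBF kernel, then flag new samples whose feature-space residual (squared prediction error, SPE) is large. This is an illustrative sketch, not the authors' implementation; the kernel width, component count, and thresholding rule are assumptions:

```python
import numpy as np

def rbf(A, B, gamma):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kpca_fit(X, n_comp=5, gamma=1.0):
    """Fit kernel PCA on normal-condition data; keep what scoring needs."""
    n = len(X)
    K = rbf(X, X, gamma)
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one        # center in feature space
    vals, vecs = np.linalg.eigh(Kc)
    vals, vecs = vals[::-1][:n_comp], vecs[:, ::-1][:, :n_comp]
    alphas = vecs / np.sqrt(np.clip(vals, 1e-12, None))  # unit-norm components
    return dict(X=X, K=K, alphas=alphas, gamma=gamma)

def spe(model, x):
    """Squared prediction error in feature space: large values flag samples
    the normal-condition KPCA model cannot explain (candidate faults)."""
    X, K, alphas, gamma = model["X"], model["K"], model["alphas"], model["gamma"]
    k = rbf(x[None, :], X, gamma)[0]
    kc = k - k.mean() - K.mean(axis=0) + K.mean()     # centered kernel vector
    ktt = 1.0 - 2 * k.mean() + K.mean()               # centered k(x,x), RBF case
    t = alphas.T @ kc                                  # scores on kept components
    return float(ktt - t @ t)
```

    In practice the alarm threshold is set from the SPE distribution of the training (normal) data, e.g. a high percentile.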

  9. Multiaxial Cyclic Thermoplasticity Analysis with Besseling's Subvolume Method

    Science.gov (United States)

    Mcknight, R. L.

    1983-01-01

    A modification was formulated to Besseling's Subvolume Method to allow it to use multilinear stress-strain curves which are temperature dependent to perform cyclic thermoplasticity analyses. This method automatically reproduces certain aspects of real material behavior important in the analysis of Aircraft Gas Turbine Engine (AGTE) components. These include the Bauschinger effect, cross-hardening, and memory. This constitutive equation was implemented in a finite element computer program called CYANIDE. Subsequently, classical time dependent plasticity (creep) was added to the program. Since its inception, this program was assessed against laboratory and component testing and engine experience. The ability of this program to simulate AGTE material response characteristics was verified by this experience and its utility in providing data for life analyses was demonstrated. In this area of life analysis, the multiaxial thermoplasticity capabilities of the method have proved a match for the actual AGTE life experience.

  10. Vectorized Monte Carlo methods for reactor lattice analysis

    Science.gov (United States)

    Brown, F. B.

    1984-01-01

    Some of the new computational methods and equivalent mathematical representations of physics models used in the MCV code, a vectorized continuous-energy Monte Carlo code for use on the CYBER-205 computer, are discussed. While the principal application of MCV is the neutronics analysis of repeating reactor lattices, the new methods used in MCV should be generally useful for vectorizing Monte Carlo for other applications. For background, a brief overview of the vector processing features of the CYBER-205 is included, followed by a discussion of the fundamentals of Monte Carlo vectorization. The physics models used in the MCV vectorized Monte Carlo code are then summarized. The new methods used in scattering analysis are presented along with details of several key, highly specialized computational routines. Finally, speedups relative to CDC-7600 scalar Monte Carlo are discussed.
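
    The essence of Monte Carlo vectorization is to process a whole batch of particle histories with array operations instead of looping over one history at a time. A deliberately simple illustration (not the MCV code; a purely absorbing slab, so one sampled free flight decides each particle's fate) in numpy:

```python
import numpy as np

def transmission_vectorized(sigma_t, thickness, n_particles, rng):
    """Event-based (vectorized) Monte Carlo estimate of the uncollided
    transmission probability through a purely absorbing slab: sample the
    exponential free-flight depth for ALL particles at once."""
    depth = rng.exponential(scale=1.0 / sigma_t, size=n_particles)
    return np.count_nonzero(depth > thickness) / n_particles
```

    Production codes extend the same pattern to scattering: at each step, the surviving particles are regrouped by event type and each group is processed as one vector operation, which is what maps well onto vector (and today, SIMD/GPU) hardware. The analytic answer here is exp(-sigma_t * thickness), which the estimate should approach.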

  11. Inverse thermal analysis method to study solidification in cast iron

    DEFF Research Database (Denmark)

    Dioszegi, Atilla; Hattel, Jesper

    2004-01-01

    Solidification modelling of cast metals is widely used to predict final properties in cast components. Accurate models necessitate good knowledge of the solidification behaviour. The present study includes a re-examination of the Fourier thermal analysis method. This involves an inverse numerical solution of a 1-dimensional heat transfer problem connected to solidification of cast alloys. In the analysis, the relation between the thermal state and the fraction solid of the metal is evaluated by a numerical method. This method contains an iteration algorithm controlled by an under-relaxation term … was developed in order to investigate the thermal behaviour of the solidifying metal. Three cylindrically shaped cast samples surrounded by different cooling materials were introduced in the same mould, allowing a common metallurgical background for samples solidifying at different cooling rates. The proposed …

  12. Integrated Data Collection Analysis (IDCA) Program - SSST Testing Methods

    Energy Technology Data Exchange (ETDEWEB)

    Sandstrom, Mary M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brown, Geoffrey W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Daniel N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Pollard, Colin J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Warner, Kirstin F. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Remmers, Daniel L. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Sorensen, Daniel N. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Whinnery, LeRoy L. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Phillips, Jason J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Shelley, Timothy J. [Bureau of Alcohol, Tobacco and Firearms (ATF), Huntsville, AL (United States); Reyes, Jose A. [Applied Research Associates, Tyndall AFB, FL (United States); Hsu, Peter C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Reynolds, John G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-03-25

    The Integrated Data Collection Analysis (IDCA) program is conducting a proficiency study for Small- Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are the methods used for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis during the IDCA program. These methods changed throughout the Proficiency Test and the reasons for these changes are documented in this report. The most significant modifications in standard testing methods are: 1) including one specified sandpaper in impact testing among all the participants, 2) diversifying liquid test methods for selected participants, and 3) including sealed sample holders for thermal testing by at least one participant. This effort, funded by the Department of Homeland Security (DHS), is putting the issues of safe handling of these materials in perspective with standard military explosives. The study is adding SSST testing results for a broad suite of different HMEs to the literature. Ultimately the study will suggest new guidelines and methods and possibly establish the SSST testing accuracies needed to develop safe handling practices for HMEs. Each participating testing laboratory uses identical test materials and preparation methods wherever possible. The testing performers involved are Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory (LANL), Indian Head Division, Naval Surface Warfare Center, (NSWC IHD), Sandia National Laboratories (SNL), and Air Force Research Laboratory (AFRL/RXQL). These tests are conducted as a proficiency study in order to establish some consistency in test protocols, procedures, and experiments and to compare results when these testing variables cannot be made consistent.

  13. Self-adaptive method for high frequency multi-channel analysis of surface wave method

    Science.gov (United States)

    When the high frequency multi-channel analysis of surface waves (MASW) method is conducted to explore soil properties in the vadose zone, existing rules for selecting the near offset and spread lengths cannot satisfy the requirements of planar dominant Rayleigh waves for all frequencies of interest ...

  14. Addition to the method of dimensional analysis in hydraulic problems

    Directory of Open Access Journals (Sweden)

    A.M. Kalyakin

    2013-03-01

    Full Text Available Modern engineering design, structures, and especially machines implementing new technologies pose problems to engineers that require immediate solutions. The importance of the method of dimensional analysis as a tool for the ordinary engineer is therefore increasing, since it allows developers to obtain quick and quite simple solutions to even very complex tasks. The method of dimensional analysis can be applied to almost any field of physics and engineering, but it is especially effective at solving problems of mechanics and applied mechanics – hydraulics, fluid mechanics, structural mechanics, etc. Until now the main obstacle to the application of the method of dimensional analysis in its classic form was the multifactorial problem (one with many arguments), whose solution was rather difficult and sometimes impossible. In order to overcome these difficulties, the authors of this study propose a simple method – application of the combined option – that avoids these difficulties. The main result of the study is a simple algorithm whose application makes it possible to solve a large class of previously unsolvable problems.
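
    The mechanical core of dimensional analysis (the Buckingham pi theorem) is readily automated: the dimensionless groups are a basis of the null space of the dimensional matrix, and their number is the number of variables minus the matrix rank. A small numpy sketch of that bookkeeping (not the paper's "combined option" algorithm, which is not reproduced here):

```python
import numpy as np

def n_pi_groups(dim_matrix):
    """Buckingham pi: number of independent dimensionless groups equals
    number of variables (columns) minus rank of the dimensional matrix."""
    m = np.array(dim_matrix, dtype=float)
    return m.shape[1] - np.linalg.matrix_rank(m)

def pi_group_exponents(dim_matrix):
    """Exponent vectors of the dimensionless groups: a basis of the null
    space of the dimensional matrix, obtained from the SVD."""
    m = np.array(dim_matrix, dtype=float)
    _, s, vt = np.linalg.svd(m)
    rank = int(np.sum(s > 1e-10))
    return vt[rank:]        # remaining right-singular vectors span the null space
```

    For the simple pendulum with variables (period T, length L, gravity g, mass m) and dimension rows (M, L, T), the single pi group recovered is proportional to T^2 g / L.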

  15. Using non-parametric methods in econometric production analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    2012-01-01

    Econometric estimation of production functions is one of the most common methods in applied economic production analysis. These studies usually apply parametric estimation techniques, which obligate the researcher to specify a functional form of the production function, of which the Cobb-Douglas … parameter estimates, but also in biased measures which are derived from the parameters, such as elasticities. Therefore, we propose to use non-parametric econometric methods. First, these can be applied to verify the functional form used in parametric production analysis. Second, they can be directly used to estimate production functions without the specification of a functional form. Therefore, they avoid possible misspecification errors due to the use of an unsuitable functional form. In this paper, we use parametric and non-parametric methods to identify the optimal size of Polish crop farms …

  16. Meso-ester and carboxylic acid substituted BODIPYs with far-red and near-infrared emission for bioimaging applications

    KAUST Repository

    Ni, Yong

    2014-01-21

    A series of meso-ester-substituted BODIPY derivatives 1-6 are synthesized and characterized. In particular, dyes functionalized with oligo(ethylene glycol) ether styryl or naphthalene vinylene groups at the α positions of the BODIPY core (3-6) become partially soluble in water, and their absorptions and emissions are located in the far-red or near-infrared region. Three synthetic approaches are attempted to access the meso-carboxylic acid (COOH)-substituted BODIPYs 7 and 8 from the meso-ester-substituted BODIPYs. Two feasible synthetic routes are developed successfully, including one short route with only three steps. The meso-COOH-substituted BODIPY 7 is completely soluble in pure water, and its fluorescence maximum reaches around 650 nm with a fluorescence quantum yield of up to 15 %. Time-dependent density functional theory calculations are conducted to understand the structure-optical properties relationship, and it is revealed that the Stokes shift is dependent mainly on the geometric change from the ground state to the first excited singlet state. Furthermore, cell staining tests demonstrate that the meso-ester-substituted BODIPYs (1 and 3-6) and one of the meso-COOH-substituted BODIPYs (8) are very membrane-permeable. These features make these meso-ester- and meso-COOH-substituted BODIPY dyes attractive for bioimaging and biolabeling applications in living cells. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Core-shell designs of photoluminescent nanodiamonds with porous silica coatings for bioimaging and drug delivery II: application.

    Science.gov (United States)

    Prabhakar, Neeraj; Näreoja, Tuomas; von Haartman, Eva; Karaman, Didem Şen; Jiang, Hua; Koho, Sami; Dolenko, Tatiana A; Hänninen, Pekka E; Vlasov, Denis I; Ralchenko, Victor G; Hosomi, Satoru; Vlasov, Igor I; Sahlgren, Cecilia; Rosenholm, Jessica M

    2013-05-07

    Recent advances within materials science and its interdisciplinary applications in biomedicine have emphasized the potential of using a single multifunctional composite material for concurrent drug delivery and biomedical imaging. Here we present a novel composite material consisting of a photoluminescent nanodiamond (ND) core with a porous silica (SiO2) shell. This novel multifunctional probe serves as an alternative nanomaterial to address the existing problems with delivery and subsequent tracing of the particles. Whereas the unique optical properties of ND allow for long-term live cell imaging and tracking of cellular processes, mesoporous silica nanoparticles (MSNs) have proven to be efficient drug carriers. The advantages of both ND and MSNs were hereby integrated in the new composite material, ND@MSN. The optical properties provided by the ND core rendered the nanocomposite suitable for microscopy imaging in fluorescence and reflectance mode, as well as super-resolution microscopy as a STED label; whereas the porous silica coating provided efficient intracellular delivery capacity, especially in surface-functionalized form. This study serves as a demonstration of how this novel nanomaterial can be exploited for both bioimaging and drug delivery for future theranostic applications.

  18. Morphology Tuning of Self-Assembled Perylene Monoimide from Nanoparticles to Colloidosomes with Enhanced Excimeric NIR Emission for Bioimaging.

    Science.gov (United States)

    Jana, Avijit; Bai, Linyi; Li, Xin; Ågren, Hans; Zhao, Yanli

    2016-01-27

    Organic near-infrared (NIR) fluorescent probes have been recognized as an emerging class of materials exhibiting a great potential in advanced bioanalytical applications. However, synthesizing such organic probes that could simultaneously work in the NIR spectral range and have a large Stokes shift, high stability in biological systems, and high photostability has proven challenging. In this work, aggregation-induced excimeric NIR emission in aqueous media was observed from a suitably substituted perylene monoimide (PeIm) dye. Controlled entrapment of the dye into a pluronic F127 micellar system to preserve its monomeric green emission in aqueous media was also established. The aggregation process of the PeIm dye to form organic nanoparticles (NPs) was evaluated experimentally by means of transmission electron microscope imaging, as well as theoretically by molecular dynamics simulations. Tuning the morphology along with the formation of colloidosomes by the controlled self-aggregation of PeIm NPs in aqueous suspension was demonstrated successfully. Finally, both excimeric and monomeric emissive PeIm NPs as well as PeIm colloidosomes were employed for bioimaging in vitro.

  19. Si-doped carbon quantum dots: a facile and general preparation strategy, bioimaging application, and multifunctional sensor.

    Science.gov (United States)

    Qian, Zhaosheng; Shan, Xiaoyue; Chai, Lujing; Ma, Juanjuan; Chen, Jianrong; Feng, Hui

    2014-05-14

    Heteroatom doping of carbon quantum dots not only enables great improvement of fluorescence efficiency and tunability of fluorescence emission, but also provides active sites in carbon dots that broaden their applications in sensing. Silicon, as a biocompatible element, offers a promising direction for the doping of carbon quantum dots. Si-doped carbon quantum dots (SiCQDs) were synthesized through a facile and effective approach. The as-prepared Si-doped carbon quantum dots possess visible fluorescence with a high quantum yield of up to 19.2%, owing to the fluorescence-enhancing effect of the silicon atoms introduced into the carbon dots. A toxicity test on human HeLa cells showed that SiCQDs have lower cellular toxicity than common CQDs, and bioimaging experiments clearly demonstrated their excellent biolabelling ability and outstanding resistance to photobleaching. The strong fluorescence quenching effect of Fe(III) on SiCQDs can be used for its selective detection among common metal ions. Specific electron transfer between SiCQDs and hydrogen peroxide makes SiCQDs a sensitive fluorescence sensing platform for hydrogen peroxide. The subsequent fluorescence recovery induced by removal of hydrogen peroxide from SiCQDs, due to the formation of stable adducts between hydrogen peroxide and melamine, was exploited to construct an effective sensor for melamine.

  20. In site bioimaging of hydrogen sulfide uncovers its pivotal role in regulating nitric oxide-induced lateral root formation.

    Directory of Open Access Journals (Sweden)

    Yan-Jun Li

    Full Text Available Hydrogen sulfide (H2S) is an important gasotransmitter in mammals. Despite physiological changes induced in plants by the exogenous H2S donor NaHS, whether and how H2S works as a true cellular signal in plants needs to be examined. A self-developed specific fluorescent probe (WSP-1) was applied to track endogenous H2S in tomato (Solanum lycopersicum) roots in situ. Bioimaging combined with pharmacological and biochemical approaches was used to investigate the cross-talk among H2S, nitric oxide (NO), and Ca(2+) in regulating lateral root formation. Endogenous H2S accumulation was clearly associated with primordium initiation and lateral root emergence. The NO donor SNP stimulated the generation of endogenous H2S and the expression of the gene coding for the enzyme responsible for endogenous H2S synthesis. Scavenging H2S or inhibiting H2S synthesis partially blocked SNP-induced lateral root formation and the expression of lateral root-related genes. The stimulatory effect of SNP on Ca(2+) accumulation and CaM1 (calmodulin 1) expression could be abolished by inhibiting H2S synthesis. A Ca(2+) chelator or Ca(2+) channel blocker attenuated NaHS-induced lateral root formation. Our study confirmed the role of H2S as a cellular signal in plants, acting as a mediator between NO and Ca(2+) in regulating lateral root formation.

  1. Conceptual design of an undulator system for a dedicated bio-imaging beamline at the European X-ray FEL

    CERN Document Server

    Geloni, Gianluca; Saldin, Evgeni

    2012-01-01

    We describe a future possible upgrade of the European XFEL consisting in the construction of an undulator beamline dedicated to life science experiments. The availability of free undulator tunnels at the European XFEL facility offers a unique opportunity to build a beamline optimized for coherent diffraction imaging of complex molecules, like proteins and other biologically interesting structures. Crucial parameters for such bio-imaging beamline are photon energy range, peak power, and pulse duration. Key component of the setup is the undulator source. The peak power is maximized in the photon energy range between 3 keV and 13 keV by the use of a very efficient combination of self-seeding, fresh bunch and tapered undulator techniques. The unique combination of ultra-high peak power of 1 TW in the entire energy range, and ultrashort pulse duration tunable from 2 fs to 10 fs, would allow for single shot coherent imaging of protein molecules with size larger than 10 nm. Also, the new beamline would enable imagin...

  2. Multi-dye theranostic nanoparticle platform for bioimaging and cancer therapy

    Directory of Open Access Journals (Sweden)

    Singh AK

    2012-06-01

    Full Text Available Amit K Singh,1,2 Megan A Hahn,2 Luke G Gutwein,3 Michael C Rule,4 Jacquelyn A Knapik,5 Brij M Moudgil,1,2 Stephen R Grobmyer,3 Scott C Brown2,6; 1Department of Materials Science and Engineering, College of Engineering, 2Particle Engineering Research Center, College of Engineering, 3Division of Surgical Oncology, Department of Surgery, College of Medicine, 4Cell and Tissue Analysis Core, McKnight Brain Institute, 5Department of Pathology, College of Medicine, University of Florida, Gainesville, FL, USA; 6DuPont Central Research and Development, Corporate Center for Analytical Science, Wilmington, DE, USA. Background: Theranostic nanomaterials composed of fluorescent and photothermal agents can both image and provide a method of disease treatment in clinical oncology. For in vivo use, the near-infrared (NIR) window has been the focus of the majority of studies, because of greater light penetration due to lower absorption and scatter of biological components. Therefore, having both fluorescent and photothermal agents with optical properties in the NIR provides the best chance of improved theranostic capabilities utilizing nanotechnology. Methods: We developed nonplasmonic multi-dye theranostic silica nanoparticles (MDT-NPs), combining NIR fluorescence visualization and photothermal therapy within a single nanoconstruct comprised of molecular components. A modified NIR fluorescent heptamethine cyanine dye was covalently incorporated into a mesoporous silica matrix and a hydrophobic metallo-naphthalocyanine dye with large molar absorptivity was loaded into the pores of these fluorescent particles. The imaging and therapeutic capabilities of these nanoparticles were demonstrated in vivo using a direct tumor injection model. Results: The fluorescent nanoparticles are bright probes (300-fold enhancement in quantum yield versus free dye) that have a large Stokes shift (>110 nm). Incorporation of the naphthalocyanine dye and exposure to NIR laser excitation

  3. Thermal Analysis of Thin Plates Using the Finite Element Method

    Science.gov (United States)

    Er, G. K.; Iu, V. P.; Liu, X. L.

    2010-05-01

    The isotropic thermal plate is analyzed with the finite element method. The solution procedure is presented. The elementary stiffness matrix and loading vector are derived rigorously with the variational principle and the principle of minimum potential energy. Numerical results are obtained based on the derived equations and tested against available exact solutions. Difficulties in the finite element analysis are identified. It is found that the finite element solutions cannot converge as the number of elements increases around the corners of the plate. The derived equations presented in this paper are fundamental for our further study on more complicated thermal plate analysis.
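
    The assembly-and-solve workflow the record describes can be illustrated in one dimension, where it stays short. The following Python sketch (an illustrative reduction, not the paper's plate formulation) assembles linear two-node elements for steady heat conduction -k T'' = q with fixed-temperature ends:

```python
import numpy as np

def fem_1d_heat(n_elem, k, q, L, T_left, T_right):
    """1-D steady heat conduction -k T'' = q on [0, L], linear elements,
    Dirichlet (fixed temperature) boundary conditions at both ends."""
    n = n_elem + 1
    h = L / n_elem
    K = np.zeros((n, n))
    f = np.zeros(n)
    ke = (k / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])   # element stiffness
    fe = q * h / 2.0 * np.array([1.0, 1.0])               # consistent load
    for e in range(n_elem):                                # assembly loop
        K[e:e + 2, e:e + 2] += ke
        f[e:e + 2] += fe
    T = np.zeros(n)
    T[0], T[-1] = T_left, T_right
    free = np.arange(1, n - 1)
    # move known boundary temperatures to the right-hand side
    f_free = f[free] - K[np.ix_(free, [0, n - 1])] @ np.array([T_left, T_right])
    T[free] = np.linalg.solve(K[np.ix_(free, free)], f_free)
    return np.linspace(0.0, L, n), T
```

    For this 1-D problem with uniform load, the nodal FEM values coincide with the exact solution, a convenient sanity check; the corner-convergence difficulties the paper reports are a genuinely two-dimensional effect.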

  4. Quantitative methods for the analysis of electron microscope images

    DEFF Research Database (Denmark)

    Skands, Peter Ulrik Vallø

    1996-01-01

    The topic of this thesis is a general introduction to quantitative methods for the analysis of digital microscope images. The images presented have primarily been acquired from scanning electron microscopes (SEM) and interferometer microscopes (IFM). The topic is approached through several examples. … foundation of the thesis falls in the areas of: 1) Mathematical Morphology; 2) Distance transforms and applications; and 3) Fractal geometry. Image analysis opens in general the possibility of a quantitative and statistically well-founded measurement of digital microscope images. Herein lie also the conditions...

  5. HARMONIC ANALYSIS OF SVPWM INVERTER USING MULTIPLE-PULSES METHOD

    Directory of Open Access Journals (Sweden)

    Mehmet YUMURTACI

    2009-01-01

    Full Text Available The Space Vector Modulation (SVM) technique is a popular and important PWM technique for three-phase voltage source inverters in the control of induction motors. In this study, the harmonic analysis of Space Vector PWM (SVPWM) is investigated using the multiple-pulses method. The multiple-pulses method calculates the Fourier coefficients of the individual positive and negative pulses of the output PWM waveform and adds them together, using the principle of superposition, to calculate the Fourier coefficients of the whole PWM output signal. Harmonic magnitudes can be calculated directly by this method without linearization, look-up tables or Bessel functions. In this study, the results obtained in the application of SVPWM for values of the variable parameters are compared with the results obtained with the multiple-pulses method.
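
    The superposition step at the heart of the multiple-pulses method is easy to demonstrate: each rectangular pulse has a closed-form complex Fourier coefficient, and the coefficients of the whole waveform are just their sum. A minimal Python sketch (the pulse list below is an arbitrary illustration, not an SVPWM pattern):

```python
import numpy as np

def pulse_coeff(n, t1, t2, period, amp=1.0):
    """Complex Fourier coefficient c_n of one rectangular pulse of height
    `amp` on [t1, t2] within a waveform of the given period."""
    if n == 0:
        return amp * (t2 - t1) / period
    w = 2j * np.pi * n / period
    return amp * (np.exp(-w * t1) - np.exp(-w * t2)) / (w * period)

def waveform_coeff(n, pulses, period):
    """Superpose the per-pulse coefficients (the multiple-pulses principle):
    `pulses` is a list of (t1, t2, amplitude) triples."""
    return sum(pulse_coeff(n, t1, t2, period, amp) for (t1, t2, amp) in pulses)
```

    The same sum applies to any PWM pattern once the switching instants are known, which is why no linearization or Bessel-function expansion is needed.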

  6. Issues in benchmarking human reliability analysis methods : a literature review.

    Energy Technology Data Exchange (ETDEWEB)

    Lois, Erasmia (US Nuclear Regulatory Commission); Forester, John Alan; Tran, Tuan Q. (Idaho National Laboratory, Idaho Falls, ID); Hendrickson, Stacey M. Langfitt; Boring, Ronald L. (Idaho National Laboratory, Idaho Falls, ID)

    2008-04-01

    There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessment (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study is currently underway that compares HRA methods with each other and against operator performance in simulator studies. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.

  7. Nanosilicon properties, synthesis, applications, methods of analysis and control

    CERN Document Server

    Ischenko, Anatoly A; Aslalnov, Leonid A

    2015-01-01

    Nanosilicon: Properties, Synthesis, Applications, Methods of Analysis and Control examines the latest developments on the physics and chemistry of nanosilicon. The book focuses on methods for producing nanosilicon, its electronic and optical properties, research methods to characterize its spectral and structural properties, and its possible applications. The first part of the book covers the basic properties of semiconductors, including causes of the size dependence of the properties, structural and electronic properties, and physical characteristics of the various forms of silicon. It presents theoretical and experimental research results as well as examples of porous silicon and quantum dots. The second part discusses the synthesis of nanosilicon, modification of the surface of nanoparticles, and properties of the resulting particles. The authors give special attention to the photoluminescence of silicon nanoparticles. The third part describes methods used for studying and controlling the structure and pro...

  8. Vitamin B6: deficiency diseases and methods of analysis.

    Science.gov (United States)

    Ahmad, Iqbal; Mirza, Tania; Qadeer, Kiran; Nazim, Urooj; Vaid, Faiyaz Hm

    2013-09-01

    Vitamin B6 (pyridoxine) is closely associated with the functions of the nervous, immune and endocrine systems. It also participates in the metabolic processes of proteins, lipids and carbohydrates. Pyridoxine deficiency may result in neurological disorders including convulsions and epileptic encephalopathy and may lead to infant abnormalities. The intravenous administration of pyridoxine to patients results in a dramatic cessation of seizures. A number of analytical methods have been developed for the determination of pyridoxine in different dosage forms, food materials and biological fluids. These include UV spectrometric, spectrofluorimetric, mass spectrometric, thin-layer and high-performance liquid chromatographic, electrophoretic, electrochemical and enzymatic methods. Most of these methods are capable of determining pyridoxine in the presence of other vitamins and complex systems in µg quantities. The development and applications of these methods in pharmaceutical and clinical analysis, mostly during the last decade, have been reviewed.

  9. Analysis of new actuation methods for capacitive shunt micro switches

    Directory of Open Access Journals (Sweden)

    Ben Sassi S

    2016-01-01

    Full Text Available This work investigates the use of new actuation methods in capacitive shunt micro switches. We formulate the coupled electromechanical problem by taking into account the fringing effects and nonlinearities due to mid-plane stretching. Static analysis is undertaken using the Differential Quadrature Method (DQM) to obtain the pull-in voltage, which is verified by means of the Finite Element Method (FEM). Based on a Galerkin approximation, a single-degree-of-freedom dynamic model is developed and limit-cycle solutions are calculated using the Finite Difference Method (FDM). In addition to the harmonic waveform signal, we apply novel actuation waveform signals to simulate the frequency response. We show that, among biased signals, a square wave signal reduces the pull-in voltage significantly compared to the triangular and harmonic signals. Finally, these results are validated experimentally.
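    The pull-in instability discussed in the abstract can be illustrated with the classical lumped parallel-plate model. The closed-form voltage below ignores the fringing fields and mid-plane stretching that the paper's distributed DQM/FEM model accounts for, and the parameter values are illustrative only:

```python
import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def pull_in_voltage(k, g0, area):
    """Classical pull-in voltage of a lumped parallel-plate electrostatic
    actuator: V_pi = sqrt(8*k*g0**3 / (27*eps0*A)).  This 1-DOF sketch
    omits the fringing fields and mid-plane stretching included in the
    paper's distributed formulation."""
    return math.sqrt(8.0 * k * g0**3 / (27.0 * EPS0 * area))

# Illustrative (hypothetical) switch parameters: stiffness, gap, electrode area
print(pull_in_voltage(k=5.0, g0=2e-6, area=1e-8))
```

    A dynamic (e.g. square-wave) drive can actuate the switch below this static threshold, which is the behaviour the abstract reports.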

  10. Similarity theory based method for MEMS dynamics analysis

    Institute of Scientific and Technical Information of China (English)

    LI Gui-xian; PENG Yun-feng; ZHANG Xin

    2008-01-01

    A new method for MEMS dynamics analysis is presented, based on similarity theory. With this method, similarities between two systems can be captured in terms of physical quantities and governing equations across different energy fields, and the unknown dynamic characteristics of one system can then be analyzed according to the similar ones of the other. The possibility of establishing a pair of similar systems between MEMS and other energy systems is also discussed, based on the equivalence between mechanics and electricity, and the feasibility of applying this method is then proven by an example in which the squeeze-film damping force in MEMS and the current of its equivalent circuit established by this method are compared.

  11. Stability Analysis of a Variant of the Prony Method

    Directory of Open Access Journals (Sweden)

    Rodney Jaramillo

    2012-01-01

    Full Text Available Prony-type methods are used in many engineering applications to determine the exponential fit corresponding to a dataset. In this paper we study a variant of Prony's method that was used by Martín-Landrove et al. in a process of segmentation of T2-weighted MRI brain images. We show the equivalence between that method and the classical Prony method and study the stability of the computed solutions with respect to noise in the data set. In particular, we show that the relative error in the calculation of the exponential fit parameters is linear with respect to noise in the data. Our analysis is based on classical results from linear algebra, matrix computation theory, and the theory of stability for roots of polynomials.
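    For readers unfamiliar with the underlying technique, a textbook Prony fit (a generic sketch, not the specific variant analyzed in the paper) can be written in a few lines of NumPy:

```python
import numpy as np

def prony_fit(y, p):
    """Textbook Prony fit of p exponentials to uniform samples,
    y[n] ~ sum_k c_k * z_k**n."""
    N = len(y)
    # Step 1: linear prediction  y[n] = -(a1*y[n-1] + ... + ap*y[n-p])
    A = np.column_stack([y[p - k:N - k] for k in range(1, p + 1)])
    a, *_ = np.linalg.lstsq(A, -y[p:], rcond=None)
    # Step 2: poles z_k are the roots of the characteristic polynomial
    z = np.roots(np.concatenate(([1.0], a)))
    # Step 3: amplitudes c_k from a linear least-squares problem
    V = np.vander(z, N, increasing=True).T
    c, *_ = np.linalg.lstsq(V, y.astype(complex), rcond=None)
    return z, c

n = np.arange(20)
y = 2.0 * 0.9**n + 1.0 * 0.5**n   # noise-free test signal
z, c = prony_fit(y, 2)
print(np.sort(z.real).round(6))   # poles recovered near 0.5 and 0.9
```

    The paper's stability result concerns exactly this pipeline: noise in `y` perturbs the prediction coefficients, hence the polynomial roots, hence the recovered parameters.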

  12. Comparative analysis of different methods for graphene nanoribbon synthesis

    Directory of Open Access Journals (Sweden)

    Tošić Dragana D.

    2013-01-01

    Full Text Available Graphene nanoribbons (GNRs) are thin strips of graphene that have captured the interest of scientists due to their unique structure and promising applications in electronics. This paper presents the results of a comparative analysis of the morphological properties of graphene nanoribbons synthesized by different methods. Various methods have been reported for graphene nanoribbon synthesis. Lithography methods usually include electron-beam (e-beam) lithography, atomic force microscopy (AFM) lithography, and scanning tunnelling microscopy (STM) lithography. Sonochemical and chemical methods exist as well, namely chemical vapour deposition (CVD) and anisotropic etching. Graphene nanoribbons can also be fabricated by unzipping carbon nanotubes (CNTs). We propose a new, highly efficient method for graphene nanoribbon production by gamma irradiation of graphene dispersed in cyclopentanone (CPO). The surface morphology of the graphene nanoribbons was visualized with atomic force and transmission electron microscopy. It was determined that the dimensions of the graphene nanoribbons are inversely proportional to the applied gamma irradiation dose. The narrowest nanoribbons were 10-20 nm wide and 1 nm high, with regular and smooth edges. In comparison to other synthesis methods, the dimensions of graphene nanoribbons synthesized by gamma irradiation are slightly larger, but the yield of nanoribbons is much higher. Fourier transform infrared spectroscopy was used for structural analysis of the graphene nanoribbons. Results of photoluminescence spectroscopy revealed for the first time that the synthesized nanoribbons showed photoluminescence in the blue region of visible light, in contrast to graphene nanoribbons synthesized by other methods. Based on these facts, we believe that our synthesis method has good prospects for potential future mass production of graphene nanoribbons with uniform size, as well as for future investigations of carbon nanomaterials for

  13. Analysis on Large Deformation Compensation Method for Grinding Machine

    Directory of Open Access Journals (Sweden)

    Wang Ya-jie

    2013-08-01

    Full Text Available The positioning accuracy of computer numerical control machine tools and manufacturing systems is affected by structural deformations, especially for large-sized systems. Structural deformations of the machine body are difficult to model and to predict. Research on the direct measurement of the amount of deformation and its compensation is fairly limited, both domestically and overseas, and does not address calculating the compensation amount. A new method to compensate for large deformation caused by self-weight is presented in this paper. First, the compensation method is summarized. Then, static force analysis of the large grinding machine is carried out through APDL (ANSYS Parametric Design Language), which automatically extracts results and forms data files, yielding the displacements of N points along the working stroke of the mechanical arm. Next, the mathematical model and the corresponding flat rectangular function are established. Analysis of the displacements of the N points shows that the new compensation method is feasible. Finally, MATLAB is used to calculate the compensation amount, and the accuracy of the proposed method is proved. Practice shows that the error after compensation by the large-deformation compensation method can meet the requirements of grinding.
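    The fit-then-offset idea behind such compensation schemes can be sketched generically. The sag profile, units and polynomial degree below are hypothetical stand-ins for the displacement data extracted from the APDL runs, not the paper's actual model:

```python
import numpy as np

# Hypothetical self-weight sag sampled at N positions along the working
# stroke (stand-ins for the displacements extracted from the FEA runs).
x = np.linspace(0.0, 2000.0, 11)      # stroke position, mm
sag = -1e-8 * x * (2000.0 - x)        # parabolic sag profile, mm

# Fit a smooth compensation law to the sampled points.
coeffs = np.polyfit(x, sag, deg=2)

def compensation(pos):
    """Offset added to the commanded position so that the tool lands on
    target despite the predicted self-weight deflection."""
    return -np.polyval(coeffs, pos)

print(round(float(compensation(1000.0)), 6))  # largest correction at mid-stroke
```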

  14. Cytotoxicity and fluorescence studies of silica-coated CdSe quantum dots for bioimaging applications

    Energy Technology Data Exchange (ETDEWEB)

    Vibin, Muthunayagam [University of Kerala, Department of Biochemistry (India); Vinayakan, Ramachandran [National Institute for Interdisciplinary Science and Technology (CSIR), Photosciences and Photonics (India); John, Annie [Sree Chitra Tirunal Institute of Medical Sciences and Technology, Biomedical Technology Wing (India); Raji, Vijayamma; Rejiya, Chellappan S.; Vinesh, Naresh S.; Abraham, Annie, E-mail: annieab2@yahoo.co.in [University of Kerala, Department of Biochemistry (India)

    2011-06-15

    The toxicological effects of silica-coated CdSe quantum dots (QDs) were investigated systematically on human cervical cancer cell line. Trioctylphosphine oxide capped CdSe QDs were synthesized and rendered water soluble by overcoating with silica, using aminopropyl silane as silica precursor. The cytotoxicity studies were conducted by exposing cells to freshly synthesized QDs as a function of time (0-72 h) and concentration up to micromolar level by Lactate dehydrogenase assay, MTT [3-(4,5-Dimethylthiazol-2-yl)-2,5-Diphenyltetrazolium Bromide] assay, Neutral red cell viability assay, Trypan blue dye exclusion method and morphological examination of cells using phase contrast microscope. The in vitro analysis results showed that the silica-coated CdSe QDs were nontoxic even at higher loadings. Subsequently the in vivo fluorescence was also demonstrated by intravenous administration of the QDs in Swiss albino mice. The fluorescence images in the cryosections of tissues depicted strong luminescence property of silica-coated QDs under biological conditions. These results confirmed the role of these luminescent materials in biological labeling and imaging applications.

  15. Recent Developments in Helioseismic Analysis Methods and Solar Data Assimilation

    CERN Document Server

    Schad, Ariane; Duvall, Tom L; Roth, Markus; Vorontsov, Sergei V

    2016-01-01

    We review recent advances and results in enhancing and developing helioseismic analysis methods and in solar data assimilation. In the first part of this paper we focus on selected developments in time-distance and global helioseismology. In the second part, we review the application of data assimilation methods to solar data. Relating solar surface observations and helioseismic proxies to solar dynamo models by means of data assimilation techniques is a promising new approach for exploring and predicting the magnetic activity cycle of the Sun.

  16. Global metabolite analysis of yeast: evaluation of sample preparation methods

    DEFF Research Database (Denmark)

    Villas-Bôas, Silas Granato; Højer-Pedersen, Jesper; Åkesson, Mats Fredrik;

    2005-01-01

    , which is the analysis of a large number of metabolites with very diverse chemical and physical properties. This work reports the leakage of intracellular metabolites observed during quenching yeast cells with cold methanol solution, the efficacy of six different methods for the extraction...... of intracellular metabolites, and the losses noticed during sample concentration by lyophilization and solvent evaporation. A more reliable procedure is suggested for quenching yeast cells with cold methanol solution, followed by extraction of intracellular metabolites by pure methanol. The method can be combined...

  17. New pressure transient analysis methods for naturally fractured reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Serra, K.; Raghavan, R.; Reynolds, A.C.

    1983-10-01

    This paper presents new methods for analyzing pressure drawdown and buildup data obtained at wells producing naturally fractured reservoirs. The model used in this study assumes unsteady-state fluid transfer from the matrix system to the fracture system. A new flow regime is identified. The discovery of this flow regime explains field behavior that has been considered unusual. The probability of obtaining data reflecting this flow regime in a field test is higher than that of obtaining the classical responses given in the literature. The identification of this new flow regime provides methods for preparing a complete analysis of pressure data obtained from naturally fractured reservoirs. Applications to field data are discussed.

  18. New pressure transient analysis methods for naturally fractured reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Serra, K.; Raghavan, R.; Reynolds, A.C.

    1983-12-01

    This paper presents new methods for analyzing pressure drawdown and buildup data obtained at wells producing naturally fractured reservoirs. The model used in this study assumes unsteady-state fluid transfer from the matrix system to the fracture system. A new flow regime is identified. The discovery of this flow regime explains field behavior that has been considered unusual. The probability of obtaining data reflecting this flow regime in a field test is higher than that of obtaining the classical responses given in the literature. The identification of this new flow regime provides methods for preparing a complete analysis of pressure data obtained from naturally fractured reservoirs. Applications to field data are discussed.

  19. Method of guiding functions in problems of nonlinear analysis

    CERN Document Server

    Obukhovskii, Valeri; Van Loi, Nguyen; Kornev, Sergei

    2013-01-01

    This book offers a self-contained introduction to the theory of guiding functions methods, which can be used to study the existence of periodic solutions and their bifurcations in ordinary differential equations, differential inclusions and in control theory. It starts with the basic concepts of nonlinear and multivalued analysis, describes the classical aspects of the method of guiding functions, and then presents recent findings only available in the research literature. It describes essential applications in control theory, the theory of bifurcations, and physics, making it a valuable resource not only for “pure” mathematicians, but also for students and researchers working in applied mathematics, the engineering sciences and physics.

  20. SAMA: A Method for 3D Morphological Analysis.

    Science.gov (United States)

    Paulose, Tessie; Montévil, Maël; Speroni, Lucia; Cerruti, Florent; Sonnenschein, Carlos; Soto, Ana M

    2016-01-01

    Three-dimensional (3D) culture models are critical tools for understanding tissue morphogenesis. A key requirement for their analysis is the ability to reconstruct the tissue into computational models that allow quantitative evaluation of the formed structures. Here, we present Software for Automated Morphological Analysis (SAMA), a method by which epithelial structures grown in 3D cultures can be imaged, reconstructed and analyzed with minimum human intervention. SAMA allows quantitative analysis of key features of epithelial morphogenesis such as ductal elongation, branching and lumen formation that distinguish different hormonal treatments. SAMA is a user-friendly set of customized macros operated via FIJI (http://fiji.sc/Fiji), an open-source image analysis platform in combination with a set of functions in R (http://www.r-project.org/), an open-source program for statistical analysis. SAMA enables a rapid, exhaustive and quantitative 3D analysis of the shape of a population of structures in a 3D image. SAMA is cross-platform, licensed under the GPLv3 and available at http://montevil.theobio.org/content/sama.

  1. Methods of Analysis of Electronic Money in Banks

    Directory of Open Access Journals (Sweden)

    Melnychenko Oleksandr V.

    2014-03-01

    Full Text Available The article identifies methods of analysis of electronic money, formalises its instruments and offers an integral indicator, which should be calculated by issuing banks and those banks, which carry out operations with electronic money, issued by other banks. Calculation of the integral indicator would allow complex assessment of activity of the studied bank with electronic money and would allow comparison of parameters of different banks by the aggregate of indicators for the study of the electronic money market, its level of development, etc. The article presents methods which envisage economic analysis of electronic money in banks by the following directions: solvency and liquidity, efficiency of electronic money issue, business activity of the bank and social responsibility. Moreover, the proposed indicators by each of the directions are offered to be taken into account when building integral indicators, with the help of which banks are studied: business activity, profitability, solvency, liquidity and so on.

  2. Development of Photogrammetric Methods of Stress Analysis and Quality Control

    CERN Document Server

    Kubik, D L; Kubik, Donna L.; Greenwood, John A.

    2003-01-01

    A photogrammetric method of stress analysis has been developed to test thin, nonstandard windows designed for hydrogen absorbers, major components of a muon cooling channel. The purpose of the absorber window tests is to demonstrate an understanding of the window behavior and strength as a function of applied pressure. This is done by comparing the deformation of the window, measured via photogrammetry, to the deformation predicted by finite element analysis (FEA). The FEA indicates a strong sensitivity of strain to the window thickness. Photogrammetric methods were therefore chosen to measure the thickness of the window, providing more accurate input data for the FEA. This, plus improvements made in hardware and testing procedures, resulted in a precision of 5 microns in all dimensions and substantial agreement with FEA predictions.

  3. Extraction, chromatographic and mass spectrometric methods for lipid analysis.

    Science.gov (United States)

    Pati, Sumitra; Nie, Ben; Arnold, Robert D; Cummings, Brian S

    2016-05-01

    Lipids make up a diverse subset of biomolecules that are responsible for mediating a variety of structural and functional properties as well as modulating cellular functions such as trafficking, regulation of membrane proteins and subcellular compartmentalization. In particular, phospholipids are the main constituents of biological membranes and play major roles in cellular processes like transmembrane signaling and structural dynamics. The chemical and structural variety of lipids makes analysis using a single experimental approach quite challenging. Research in the field relies on the use of multiple techniques to detect and quantify components of cellular lipidomes as well as determine structural features and cellular organization. Understanding these features can allow researchers to elucidate the biochemical mechanisms by which lipid-lipid and/or lipid-protein interactions take place within the conditions of study. Herein, we provide an overview of essential methods for the examination of lipids, including extraction methods, chromatographic techniques and approaches for mass spectrometric analysis.

  4. Current Human Reliability Analysis Methods Applied to Computerized Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring

    2012-06-01

    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no US nuclear power plant has implemented CPs in its main control room (Fink et al., 2009). Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of enhanced ease of use and easier records management by omitting the need for updating hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  5. Application of the maximum entropy method to profile analysis

    Energy Technology Data Exchange (ETDEWEB)

    Armstrong, N.; Kalceff, W. [University of Technology, Department of Applied Physics, Sydney, NSW (Australia); Cline, J.P. [National Institute of Standards and Technology, Gaithersburg, (United States)

    1999-12-01

    Full text: A maximum entropy (MaxEnt) method for analysing crystallite size- and strain-induced x-ray profile broadening is presented. This method treats the problems of determining the specimen profile, crystallite size distribution, and strain distribution in a general way by considering them as inverse problems. A common difficulty faced by many experimenters is their inability to determine a well-conditioned solution of the integral equation, which preserves the positivity of the profile or distribution. We show that the MaxEnt method overcomes this problem, while also enabling a priori information, in the form of a model, to be introduced into it. Additionally, we demonstrate that the method is fully quantitative, in that uncertainties in the solution profile or solution distribution can be determined and used in subsequent calculations, including mean particle sizes and rms strain. An outline of the MaxEnt method is presented for the specific problems of determining the specimen profile and crystallite or strain distributions for the correspondingly broadened profiles. This approach offers an alternative to standard methods such as those of Williamson-Hall and Warren-Averbach. An application of the MaxEnt method is demonstrated in the analysis of alumina size-broadened diffraction data (from NIST, Gaithersburg). It is used to determine the specimen profile and column-length distribution of the scattering domains. Finally, these results are compared with the corresponding Williamson-Hall and Warren-Averbach analyses. Copyright (1999) Australian X-ray Analytical Association Inc.

  6. On the Analysis Method of the Triple Test Cross Design

    Institute of Scientific and Technical Information of China (English)

    JinYi

    1995-01-01

    The analysis method of the triple test cross design is discussed carefully from the two-factor experiment design and the genetic models of additive-dominance effect and of epistasis effect. Two points different from the previous reports are concluded: (1) both degrees of freedom of the orthogonal terms C2 and C3 are m; (2) the denominator in the F test of C2 and C3 is the error mean square between plots.

  7. Interval Analysis of the Finite Element Method for Stochastic Structures

    Institute of Scientific and Technical Information of China (English)

    刘长虹; 刘筱玲; 陈虬

    2004-01-01

    A random parameter can be transformed into an interval number in structural analysis using the concept of the confidence interval. Hence, analyses of uncertain structural systems can be carried out in traditional FEM software. In some cases, the number of solutions for stochastic structures is nearly the same as for traditional structural problems. In addition, a new method to evaluate the failure probability of structures is presented for the needs of modern engineering design.
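    The transformation of a random parameter into an interval number via a confidence interval, and its propagation through a structural relation, can be sketched as follows. This is a one-degree-of-freedom illustration of the idea (displacement u = F/k for an uncertain stiffness k), not the paper's FEM formulation, and the numbers are invented:

```python
class Interval:
    """Minimal interval number for propagating parameter uncertainty."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __truediv__(self, other):  # assumes 0 is not contained in `other`
        vals = (self.lo / other.lo, self.lo / other.hi,
                self.hi / other.lo, self.hi / other.hi)
        return Interval(min(vals), max(vals))
    def __repr__(self):
        return f"[{self.lo:.4g}, {self.hi:.4g}]"

# Random stiffness k ~ N(mu, sigma) mapped to its 95% confidence interval.
mu, sigma = 200.0, 5.0                        # N/mm (illustrative values)
k = Interval(mu - 1.96 * sigma, mu + 1.96 * sigma)
F = Interval(1000.0, 1000.0)                  # deterministic load, N
u = F / k                                     # guaranteed displacement bounds, mm
print(u)
```

    Any deterministic FEM solver can consume such bounds by running on the interval endpoints, which is what makes the approach compatible with traditional software.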

  8. A Method for the Analysis of High Power Battery Designs

    OpenAIRE

    1997-01-01

    Proceedings of the 32nd Intersociety Energy Conversion Engineering Conference, Honolulu, HI, July 27 - August 1, 1997 A spreadsheet model for the analysis of batteries of various types has been developed that permits the calculation of the size and performance characteristics of the battery based on its internal geometry and electrode/electrolyte material properties. The method accounts for most of the electrochemical mechanisms in both the anode and cathode without solving the gover...

  9. Classification of analysis methods for characterization of magnetic nanoparticle properties

    DEFF Research Database (Denmark)

    Posth, O.; Hansen, Mikkel Fougt; Steinhoff, U.

    2015-01-01

    The aim of this paper is to provide a roadmap for the standardization of magnetic nanoparticle (MNP) characterization. We have assessed common MNP analysis techniques under various criteria in order to define the methods that can be used as either standard techniques for magnetic particle...... characterization or those that can be used to obtain a comprehensive picture of a MNP system. This classification is the first step on the way to develop standards for nanoparticle characterization....

  10. Dynamic Characteristic Analysis and Experiment for Integral Impeller Based on Cyclic Symmetry Analysis Method

    Institute of Scientific and Technical Information of China (English)

    WU Qiong; ZHANG Yidu; ZHANG Hongwei

    2012-01-01

    A cyclic symmetry analysis method is proposed for analyzing the dynamic characteristics of a thin-walled integral impeller. The reliability and feasibility of the present method are investigated by means of simulation and experiment. The fundamental cyclic symmetry equations and their solutions are derived for the cyclic symmetry structure. A computational efficiency analysis comparing the whole structure with a single sector is performed. Comparison of results obtained by finite element analysis (FEA) and experiment shows that the local dynamic characteristics of the integral impeller are consistent with those of a single cyclic symmetry blade. When the integral impeller is constrained and the thin-walled blade is the object of concern in the analysis, the dynamic characteristics of the integral impeller can be approximated by those of the cyclic symmetry blade. Hence, the cyclic symmetry analysis method effectively improves efficiency and yields more parameter information for the dynamic characteristics of integral impellers.

  11. Simplified QCD fit method for BSM analysis of HERA data

    CERN Document Server

    Turkot, Oleksii; Zarnecki, Aleksander Filip

    2016-01-01

    The high-precision HERA data can be used as an input to a QCD analysis within the DGLAP formalism to obtain a detailed description of the proton structure in terms of the parton distribution functions (PDFs). However, when searching for Beyond Standard Model (BSM) contributions in the data one should take into account the possibility that the PDF set may already have been biased by partially or totally absorbing previously unrecognised new physics contributions. The ZEUS Collaboration has proposed a new approach to the BSM analysis of the inclusive $ep$ data based on simultaneous QCD fits of parton distribution functions together with contributions of new physics processes. Unfortunately, the limit-setting procedure in the frequentist approach is very time consuming in this method, as the full QCD analysis has to be repeated for numerous data replicas. We describe a simplified approach, based on a Taylor expansion of the cross-section predictions in terms of the PDF parameters, which allowed us to reduce the calc...

  12. An Overview of Bayesian Methods for Neural Spike Train Analysis

    Directory of Open Access Journals (Sweden)

    Zhe Chen

    2013-01-01

    Full Text Available Neural spike train analysis is an important task in computational neuroscience which aims to understand neural mechanisms and gain insights into neural circuits. With the advancement of multielectrode recording and imaging technologies, it has become increasingly demanding to develop statistical tools for analyzing large neuronal ensemble spike activity. Here we present a tutorial overview of Bayesian methods and their representative applications in neural spike train analysis, at both single neuron and population levels. On the theoretical side, we focus on various approximate Bayesian inference techniques as applied to latent state and parameter estimation. On the application side, the topics include spike sorting, tuning curve estimation, neural encoding and decoding, deconvolution of spike trains from calcium imaging signals, and inference of neuronal functional connectivity and synchrony. Some research challenges and opportunities for neural spike train analysis are discussed.

  13. Bayesian Methods for Analysis and Adaptive Scheduling of Exoplanet Observations

    CERN Document Server

    Loredo, Thomas J; Chernoff, David F; Clyde, Merlise A; Liu, Bin

    2011-01-01

    We describe work in progress by a collaboration of astronomers and statisticians developing a suite of Bayesian data analysis tools for extrasolar planet (exoplanet) detection, planetary orbit estimation, and adaptive scheduling of observations. Our work addresses analysis of stellar reflex motion data, where a planet is detected by observing the "wobble" of its host star as it responds to the gravitational tug of the orbiting planet. Newtonian mechanics specifies an analytical model for the resulting time series, but it is strongly nonlinear, yielding complex, multimodal likelihood functions; it is even more complex when multiple planets are present. The parameter spaces range in size from few-dimensional to dozens of dimensions, depending on the number of planets in the system, and the type of motion measured (line-of-sight velocity, or position on the sky). Since orbits are periodic, Bayesian generalizations of periodogram methods facilitate the analysis. This relies on the model being linearly separable, ...

  14. THE WAVELET ANALYSIS METHOD ON THE TRANSIENT SIGNAL

    Institute of Scientific and Technical Information of China (English)

    吴淼

    1996-01-01

    Many dynamic signals of mining machines are transient, such as the load signals produced when a roadheader's cutting head cuts in or out, and the response signals produced by these loads. For such transient signals, the traditional Fourier analysis method is quite inadequate. The limitations of analysis resolution when applying the Short-Time Fourier Transform (STFT) to them are discussed in this paper. Because the wavelet transform has the characteristics of a flexible window and multiresolution analysis, we apply it to analyse these transient signals. To give a practical example, using the D18 wavelet and Mallat's tree algorithm in MATLAB, the discrete wavelet transform was calculated for the simulated response signals of a three-degree-of-freedom vibration system under impulse and random excitations. The results of the wavelet transform demonstrate its effectiveness and superiority in analysing transient signals of mining machines.
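    The advantage of wavelets for transients, i.e. that a localized event shows up in a handful of detail coefficients rather than being smeared across all frequencies, can be illustrated with a single-level Haar transform. The Haar wavelet stands in here for the D18 wavelet used in the paper, purely for brevity:

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar discrete wavelet transform, returning the
    approximation (low-pass) and detail (high-pass) coefficients."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return approx, detail

# A transient: a flat signal with a sudden jump partway through.
sig = np.concatenate([np.zeros(7), np.ones(9)])
a, d = haar_dwt(sig)
print(np.flatnonzero(np.abs(d) > 1e-12))  # detail is nonzero only at the jump
```

    Mallat's tree algorithm simply repeats this split on the approximation coefficients at each level.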

  15. Extending methods: using Bourdieu's field analysis to further investigate taste

    Science.gov (United States)

    Schindel Dimick, Alexandra

    2015-06-01

    In this commentary on Per Anderhag, Per-Olof Wickman and Karim Hamza's article Signs of taste for science, I consider how their study is situated within the concern for the role of science education in the social and cultural production of inequality. Their article provides a finely detailed methodology for analyzing the constitution of taste within science education classrooms. Nevertheless, because the authors' socially situated methodology draws upon Bourdieu's theories, it seems equally important to extend these methods to consider how and why students make particular distinctions within a relational context—a key aspect of Bourdieu's theory of cultural production. By situating the constitution of taste within Bourdieu's field analysis, researchers can explore the ways in which students' tastes and social positionings are established and transformed through time, space, place, and their ability to navigate the field. I describe the process of field analysis in relation to the authors' paper and suggest that combining the authors' methods with a field analysis can provide a strong methodological and analytical framework in which theory and methods combine to create a detailed understanding of students' interest in relation to their context.

  16. An improved quantitative analysis method for plant cortical microtubules.

    Science.gov (United States)

    Lu, Yi; Huang, Chenyang; Wang, Jia; Shang, Peng

    2014-01-01

    The arrangement of plant cortical microtubules can reflect the physiological state of cells. However, little attention has been paid to the image quantitative analysis of plant cortical microtubules so far. In this paper, Bidimensional Empirical Mode Decomposition (BEMD) algorithm was applied in the image preprocessing of the original microtubule image. And then Intrinsic Mode Function 1 (IMF1) image obtained by decomposition was selected to do the texture analysis based on Grey-Level Cooccurrence Matrix (GLCM) algorithm. Meanwhile, in order to further verify its reliability, the proposed texture analysis method was utilized to distinguish different images of Arabidopsis microtubules. The results showed that the effect of BEMD algorithm on edge preserving accompanied with noise reduction was positive, and the geometrical characteristic of the texture was obvious. Four texture parameters extracted by GLCM perfectly reflected the different arrangements between the two images of cortical microtubules. In summary, the results indicate that this method is feasible and effective for the image quantitative analysis of plant cortical microtubules. It not only provides a new quantitative approach for the comprehensive study of the role played by microtubules in cell life activities but also supplies references for other similar studies.
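    The GLCM texture measure underlying the method can be sketched directly from its definition. This is a plain re-implementation of the standard co-occurrence matrix and the contrast feature for illustration, not the authors' code; the toy images are invented stand-ins for microtubule textures:

```python
import numpy as np

def glcm(img, levels, dx=1, dy=0):
    """Normalized grey-level co-occurrence matrix for one pixel offset."""
    img = np.asarray(img)
    P = np.zeros((levels, levels))
    h, w = img.shape
    for y in range(h - dy):
        for x in range(w - dx):
            P[img[y, x], img[y + dy, x + dx]] += 1
    return P / P.sum()

def contrast(P):
    """GLCM contrast: sum of P[i, j] * (i - j)**2."""
    i, j = np.indices(P.shape)
    return float(np.sum(P * (i - j) ** 2))

# Striped (strongly oriented) vs. uniform toy images: the oriented
# texture yields high horizontal contrast, the uniform one yields zero.
stripes = np.tile([0, 1], (4, 2))       # rows of alternating grey levels
flat = np.zeros((4, 4), dtype=int)
print(contrast(glcm(stripes, 2)), contrast(glcm(flat, 2)))
```

    Homogeneity, energy and correlation are computed from the same matrix with different weightings, which is how the four parameters in the paper distinguish microtubule arrangements.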

  18. Lattice Boltzmann methods for global linear instability analysis

    Science.gov (United States)

    Pérez, José Miguel; Aguilar, Alfonso; Theofilis, Vassilis

    2016-11-01

    Modal global linear instability analysis is performed using, for the first time ever, the lattice Boltzmann method (LBM) to analyze incompressible flows with two and three inhomogeneous spatial directions. Four linearization models have been implemented in order to recover the linearized Navier-Stokes equations in the incompressible limit. Two of those models employ a single relaxation time (SRT) and have been proposed previously in the literature as linearizations of the collision operator of the lattice Boltzmann equation. Two additional models are derived herein for the first time by linearizing the local equilibrium probability distribution function. Instability analysis results are obtained in three benchmark problems, two in closed geometries and one in open flow, namely the square and cubic lid-driven cavity flows and flow in the wake of a circular cylinder. Comparisons with results delivered by classic spectral element methods verify the accuracy of the proposed new methodologies and point out potential limitations particular to the LBM approach. The known appearance of numerical instabilities when the SRT model is used in direct numerical simulations employing the LBM is shown to be reflected in a spurious global eigenmode in the instability analysis. Although this mode is absent in the multiple-relaxation-times model, other spurious instabilities can also arise and are documented herein. Areas of potential improvement that would make the proposed methodology competitive with established approaches for global instability analysis are discussed.

  19. Synthesis of GdAlO3:Mn(4+),Ge(4+)@Au Core-Shell Nanoprobes with Plasmon-Enhanced Near-Infrared Persistent Luminescence for in Vivo Trimodality Bioimaging.

    Science.gov (United States)

    Liu, Jing-Min; Liu, Yao-Yao; Zhang, Dong-Dong; Fang, Guo-Zhen; Wang, Shuo

    2016-11-09

    The rise of multimodal nanoprobes has promoted the development of new methods to explore multiple molecular targets simultaneously, or to combine various bioimaging tools in one assay to more clearly delineate the localization and expression of biomarkers. Persistent luminescence nanophosphors (PLNPs) have qualified as a promising contrast agent for in vivo imaging. Easy surface modification and a proper nanostructure design strategy favor the fabrication of PLNP-based multifunctional nanoprobes for biological applications. In this paper, we propose novel multifunctional core-shell nanomaterials, applying Mn(4+) and Ge(4+) co-doped gadolinium aluminate (GdAlO3:Mn(4+),Ge(4+)) PLNPs as the near-infrared persistent luminescence emission center and introducing a gold nanoshell coating on the PLNPs to enhance the luminescence efficiency via plasmon resonance. The developed core-shell nanoprobes demonstrate ultrabrightness, superlong afterglow, good monodispersity, low toxicity, and excellent biocompatibility. The well-characterized nanoprobes have been utilized for trimodality in vivo imaging, with near-infrared persistent luminescence for optical imaging, the Gd element for magnetic resonance imaging, and the Au element for computed tomography imaging.

  20. Analysis of Social Cohesion in Health Data by Factor Analysis Method: The Ghanaian Perspective

    Science.gov (United States)

    Saeed, Bashiru I. I.; Xicang, Zhao; Musah, A. A. I.; Abdul-Aziz, A. R.; Yawson, Alfred; Karim, Azumah

    2013-01-01

    We investigated the overall social cohesion of Ghanaians. In this study, we considered the involvement of Ghanaians in their communities, their views of other people and institutions, and their level of interest in both local and national politics. The factor analysis method was employed for analysis using R…

  1. Summary oral reflective analysis: a method for interview data analysis in feminist qualitative research.

    Science.gov (United States)

    Thompson, S M; Barrett, P A

    1997-12-01

    This article explores an innovative approach to qualitative data analysis called Summary Oral Reflective Analysis (SORA). The method preserves the richness and contextuality of in-depth interview data within a broader feminist philosophical perspective. This multidisciplinary approach was developed in two individual research programs within a cooperative, collaborative arrangement. It represents a creative response to perceived deficiencies in the pragmatics of qualitative data analysis where the maintenance of data contextuality is critical.

  2. The SMART CLUSTER METHOD - adaptive earthquake cluster analysis and declustering

    Science.gov (United States)

    Schaefer, Andreas; Daniell, James; Wenzel, Friedemann

    2016-04-01

    Earthquake declustering is an essential part of almost any statistical analysis of the spatial and temporal properties of seismic activity, with typical applications in probabilistic seismic hazard assessments (PSHAs) and earthquake prediction methods. The nature of earthquake clusters and the subsequent declustering of earthquake catalogues play a crucial role in determining the magnitude-dependent earthquake return period and its spatial variation. Various methods have been developed by other researchers to address this issue, with complexity ranging from rather simple statistical window methods to complex epidemic models. This study introduces the smart cluster method (SCM), a new methodology to identify earthquake clusters, which uses an adaptive point process for spatio-temporal identification. An adaptive search algorithm for data point clusters is adopted: it uses the earthquake density in the spatio-temporal neighbourhood of each event to adjust the search properties. The identified clusters are subsequently analysed to determine directional anisotropy, focussing on a strong correlation along the rupture plane, and the search space is adjusted with respect to directional properties. In the case of rapid subsequent ruptures, like the 1992 Landers sequence or the 2010/2011 Darfield-Christchurch events, an adaptive classification procedure using near-field searches, support vector machines and temporal splitting is applied to disassemble subsequent ruptures that may have been grouped into a single cluster. The steering parameters of the search behaviour are linked to local earthquake properties such as magnitude of completeness, earthquake density and Gutenberg-Richter parameters. The method is capable of identifying and classifying earthquake clusters in space and time, and is tested and validated using earthquake data from California and New Zealand.
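
    The abstract contrasts simple statistical window methods with adaptive approaches such as the SCM. For orientation, here is a minimal sketch of the classic Gardner-Knopoff window declustering (a simple baseline, not the SCM itself; the toy catalogue is invented for illustration):

```python
import numpy as np

def gk_windows(mag):
    """Gardner-Knopoff (1974) aftershock windows: distance in km, time in days."""
    dist = 10 ** (0.1238 * mag + 0.983)
    time = 10 ** (0.032 * mag + 2.7389) if mag >= 6.5 else 10 ** (0.5409 * mag - 0.547)
    return dist, time

def decluster(events):
    """events: list of (t_days, x_km, y_km, mag); returns indices of mainshocks."""
    order = sorted(range(len(events)), key=lambda i: -events[i][3])  # largest first
    keep, removed = [], set()
    for i in order:
        if i in removed:
            continue
        keep.append(i)
        t0, x0, y0, m0 = events[i]
        d_win, t_win = gk_windows(m0)
        for j in range(len(events)):
            if j == i or j in removed or j in keep:
                continue
            t, x, y, _ = events[j]
            # events inside the mainshock's space-time window are aftershocks
            if abs(t - t0) <= t_win and np.hypot(x - x0, y - y0) <= d_win:
                removed.add(j)
    return sorted(keep)

# M6 mainshock, a nearby next-day M4 aftershock, and a distant later M5 event
cat = [(0.0, 0.0, 0.0, 6.0), (1.0, 5.0, 0.0, 4.0), (400.0, 300.0, 0.0, 5.0)]
mains = decluster(cat)
```

    The M4 event falls inside the M6 window and is removed; the distant M5 event survives as an independent mainshock.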

  3. Sampling and analysis methods for geothermal fluids and gases

    Energy Technology Data Exchange (ETDEWEB)

    Watson, J.C.

    1978-07-01

    The sampling procedures for geothermal fluids and gases include: sampling hot springs, fumaroles, etc.; sampling condensed brine and entrained gases; sampling steam-lines; low pressure separator systems; high pressure separator systems; two-phase sampling; downhole samplers; and miscellaneous methods. The recommended analytical methods compiled here cover physical properties, dissolved solids, and dissolved and entrained gases. The sequences of methods listed for each parameter are: wet chemical, gravimetric, colorimetric, electrode, atomic absorption, flame emission, x-ray fluorescence, inductively coupled plasma-atomic emission spectroscopy, ion exchange chromatography, spark source mass spectrometry, neutron activation analysis, and emission spectrometry. Material on correction of brine component concentrations for steam loss during flashing is presented. (MHR)

  4. Microscale extraction method for HPLC carotenoid analysis in vegetable matrices

    Directory of Open Access Journals (Sweden)

    Sidney Pacheco

    2014-10-01

    Full Text Available In order to generate simple, efficient analytical methods that are also fast, clean, and economical, and capable of producing reliable results for a large number of samples, a microscale extraction method for the analysis of carotenoids in vegetable matrices was developed. The efficiency of this adapted method was checked by comparing the results obtained from vegetable matrices in terms of extraction equivalence, time required and reagents. Six matrices were used: tomato (Solanum lycopersicum L.), carrot (Daucus carota L.), sweet potato with orange pulp (Ipomoea batatas (L.) Lam.), pumpkin (Cucurbita moschata Duch.), watermelon (Citrullus lanatus (Thunb.) Matsum. & Nakai) and sweet potato (Ipomoea batatas (L.) Lam.) flour. Quantification of the total carotenoids was performed by spectrophotometry. Quantification and determination of carotenoid profiles were carried out by High Performance Liquid Chromatography with photodiode array detection. The microscale extraction was faster, cheaper and cleaner than the commonly used method, and advantageous for analytical laboratories.

  5. An analytical method for ditching analysis of an airborne vehicle

    Science.gov (United States)

    Ghaffari, Farhad

    1988-01-01

    A simple analytical method has been introduced for aerohydrodynamic load analysis of an airborne configuration during water ditching. The method employs an aerodynamic panel code, based on linear potential flow theory, to simulate the flow of air and water around an aircraft configuration. The free surface separating the air and water region is represented by doublet sheet singularities. Although all the theoretical load distributions are computed for air, provisions are made to correct the pressure coefficients obtained on the configuration wetted surfaces to account for the water density. As an analytical tool, the Vortex Separation Aerodynamic (VSAERO) code is chosen to carry out the present investigation. After assessing the validity of the method, its first application is to analyze the water ditching of the Space Shuttle configuration at a 12 degree attitude.

  6. Dynamic multiplexed analysis method using ion mobility spectrometer

    Science.gov (United States)

    Belov, Mikhail E [Richland, WA

    2010-05-18

    A method for multiplexed analysis using an ion mobility spectrometer, in which the effectiveness and efficiency of the multiplexed method are optimized by automatically adjusting the rates of passage of analyte materials through an IMS drift tube during operation of the system. This automatic adjustment is performed by the IMS instrument itself after determining the appropriate levels of adjustment according to the method of the present invention. In one example, the adjustment of the rates of passage for these materials is determined by quantifying the total number of analyte molecules delivered to the ion trap in a preselected period of time, comparing this number to the charge capacity of the ion trap, selecting a gate opening sequence, and implementing the selected gate opening sequence to obtain a preselected rate of analytes within the IMS drift tube.

  7. Analysis of Photovoltaic System Energy Performance Evaluation Method

    Energy Technology Data Exchange (ETDEWEB)

    Kurtz, S.; Newmiller, J.; Kimber, A.; Flottemesch, R.; Riley, E.; Dierauf, T.; McKee, J.; Krishnani, P.

    2013-11-01

    Documentation of the energy yield of a large photovoltaic (PV) system over a substantial period can be useful to measure a performance guarantee, as an assessment of the health of the system, for verification of a performance model to then be applied to a new system, or for a variety of other purposes. Although the measurement of this performance metric might appear to be straightforward, there are a number of subtleties associated with variations in weather and imperfect data collection that complicate the determination and data analysis. A performance assessment is most valuable when it is completed with a very low uncertainty and when the subtleties are systematically addressed, yet currently no standard exists to guide this process. This report summarizes a draft methodology for an Energy Performance Evaluation Method, the philosophy behind the draft method, and the lessons that were learned by implementing the method.
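
    A common ingredient of such evaluations is the performance ratio: measured energy divided by what a loss-free plant at nameplate rating would produce for the same insolation. A minimal sketch (a generic IEC 61724-style calculation, not the report's draft method; the example numbers are invented):

```python
def performance_ratio(e_measured_kwh, p_stc_kw, h_poa_kwh_per_m2, g_stc_kw_per_m2=1.0):
    """Performance ratio: final yield over reference yield.

    e_measured_kwh   AC energy delivered over the evaluation period
    p_stc_kw         nameplate DC rating at standard test conditions
    h_poa_kwh_per_m2 plane-of-array insolation over the same period
    """
    reference_yield = h_poa_kwh_per_m2 / g_stc_kw_per_m2  # equivalent sun-hours
    final_yield = e_measured_kwh / p_stc_kw               # kWh per kW installed
    return final_yield / reference_yield

# 100 kW array, 1600 equivalent sun-hours in a year, 128 MWh delivered
pr = performance_ratio(128_000, 100.0, 1_600.0)
```

    The subtleties the report discusses (weather variation, data gaps) show up as corrections to the measured energy and insolation terms before this ratio is formed.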

  8. Urinary density measurement and analysis methods in neonatal unit care

    Directory of Open Access Journals (Sweden)

    Maria Vera Lúcia Moreira Leitão Cardoso

    2013-09-01

    Full Text Available The objective was to assess urine collection methods through cotton in contact with the genitalia and a urinary collector to measure urinary density in newborns. This is a quantitative intervention study carried out in a neonatal unit of Fortaleza-CE, Brazil, in 2010. The sample consisted of 61 newborns randomly chosen to compose the study group. Most neonates were full term (31/50.8%) and male (33/54%). Data on urinary density measured through the cotton and collector methods presented statistically significant differences (p<0.05). The analysis of interquartile ranges between subgroups resulted in statistical differences between urinary collector/reagent strip (1005) and cotton/reagent strip (1010); however, there was no difference between urinary collector/refractometer (1008) and cotton/refractometer. Therefore, further research should be conducted with larger samples using the methods investigated in this study, whenever possible comparing urine density values to laboratory tests.

  9. Reliability Analysis of Penetration Systems Using Nondeterministic Methods

    Energy Technology Data Exchange (ETDEWEB)

    FIELD JR.,RICHARD V.; PAEZ,THOMAS L.; RED-HORSE,JOHN R.

    1999-10-27

    Device penetration into media such as metal and soil is an application of some engineering interest. Often, these devices contain internal components and it is of paramount importance that all significant components survive the severe environment that accompanies the penetration event. In addition, the system must be robust to perturbations in its operating environment, some of which exhibit behavior which can only be quantified to within some level of uncertainty. In the analysis discussed herein, methods to address the reliability of internal components for a specific application system are discussed. The shock response spectrum (SRS) is utilized in conjunction with the Advanced Mean Value (AMV) and Response Surface methods to make probabilistic statements regarding the predicted reliability of internal components. Monte Carlo simulation methods are also explored.
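
    The Monte Carlo part of such an analysis can be sketched generically as a capacity-versus-demand simulation (a stand-in with invented normal distributions, not the report's SRS/AMV models; the 3000 g and 2400 g figures are illustrative only):

```python
import random

def mc_failure_probability(capacity_mean, capacity_sd, demand_mean, demand_sd,
                           n=200_000, seed=1):
    """Estimate P(demand > capacity) with independent normal capacity and demand."""
    rng = random.Random(seed)
    failures = sum(
        1 for _ in range(n)
        if rng.gauss(demand_mean, demand_sd) > rng.gauss(capacity_mean, capacity_sd)
    )
    return failures / n

# Component shock capacity 3000 g (sd 200) vs. predicted environment 2400 g (sd 200):
# the margin is 600 g with sd sqrt(200^2 + 200^2) ~ 283 g, so Pf is roughly 0.017
pf = mc_failure_probability(3000, 200, 2400, 200)
```

    Methods such as AMV and response surfaces aim to reach the same probability with far fewer evaluations of the expensive model than brute-force sampling requires.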

  10. Research on PGNAA adaptive analysis method with BP neural network

    Science.gov (United States)

    Peng, Ke-Xin; Yang, Jian-Bo; Tuo, Xian-Guo; Du, Hua; Zhang, Rui-Xue

    2016-11-01

    A new approach to the puzzle of spectral analysis in prompt gamma neutron activation analysis (PGNAA) is developed and demonstrated. It applies a BP neural network to PGNAA energy spectrum analysis based on Monte Carlo (MC) simulation. The main tasks are as follows: (1) completing the MC simulation of a PGNAA spectrum library: we set the mass fractions of the elements Si, Ca and Fe from 0.00 to 0.45 with a step of 0.05, and each sample is simulated using MCNP; (2) establishing the BP model for adaptive quantitative analysis of the PGNAA energy spectrum: we calculate the peak areas of eight characteristic gamma rays that correspond to eight elements in each of 1000 samples and in the standard sample; (3) verifying the viability of the adaptive quantitative analysis algorithm, for which 68 samples were used successively. Results show that the precision when using the neural network to calculate the content of each element is significantly higher than with the MCLLS method.
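
    A minimal stand-in for step (2) is a one-hidden-layer BP network trained by gradient descent on synthetic "peak area" inputs (the data, architecture, and learning rate are illustrative assumptions, not the paper's settings):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the PGNAA task: map 3 "peak areas" to one "mass fraction".
# The target relation is mostly linear with a mild nonlinearity, purely illustrative.
X = rng.random((200, 3))
y = (0.5 * X[:, 0] + 0.3 * X[:, 1] ** 2 + 0.2 * X[:, 2]).reshape(-1, 1)

# One hidden layer of 8 sigmoid units, linear output, plain full-batch gradient descent.
W1 = rng.normal(0, 0.5, (3, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(2000):
    h = sigmoid(X @ W1 + b1)                 # forward pass
    pred = h @ W2 + b2
    err = pred - y                           # backprop of mean squared error
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    dh = err @ W2.T * h * (1 - h)
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

mse = float(np.mean((sigmoid(X @ W1 + b1) @ W2 + b2 - y) ** 2))
```

    In the paper's setting, the inputs would be the eight characteristic peak areas from the MCNP-simulated spectra and the outputs the element mass fractions.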

  11. Spatial Analysis Methods for Health Promotion and Education.

    Science.gov (United States)

    Chaney, Robert A; Rojas-Guyler, Liliana

    2016-05-01

    This article provides a review of spatial analysis methods for use in health promotion and education research and practice. Spatial analysis seeks to describe or make inferences about variables with respect to the places where they occur. This includes geographic differences, proximity issues, and access to resources. This is important for understanding how health outcomes differ from place to place, and for understanding some of the environmental underpinnings of health outcomes data by placing them in the context of geographic location. This article seeks to promote spatial analysis as a viable tool for health promotion and education research and practice. Four commonly used spatial analysis techniques are described in-text, and an illustrative example of motor vehicle collisions in a large metropolitan city is presented using them. The techniques discussed are as follows: descriptive mapping, global spatial autocorrelation, cluster detection and identification, and spatial regression analysis. This article provides useful information for health promotion and education researchers and practitioners seeking to examine research questions from a spatial perspective.
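
    Global spatial autocorrelation, one of the four techniques, is commonly measured with Moran's I. A minimal sketch (the four-area chain and its adjacency weight matrix are invented for illustration, not the article's collision data):

```python
import numpy as np

def morans_i(values, weights):
    """Global Moran's I: spatial autocorrelation of `values` under a weight matrix."""
    x = np.asarray(values, dtype=float)
    w = np.asarray(weights, dtype=float)
    z = x - x.mean()                    # deviations from the overall mean
    n, s0 = len(x), w.sum()
    return (n / s0) * (z @ w @ z) / (z @ z)

# Four areas in a row with binary adjacency weights; high values cluster
# at one end, so neighbouring areas are similar and I comes out positive.
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])
mi = morans_i([10, 8, 2, 1], W)
```

    Values near +1 indicate clustering of similar values, near 0 spatial randomness, and negative values a checkerboard-like dispersion.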

  12. Application of the holistic methods in analysis of organic milk

    Directory of Open Access Journals (Sweden)

    Anka Popović-Vranješ

    2012-12-01

    Full Text Available Organic farming has advantages in terms of environmental protection, biodiversity, soil quality, animal welfare and pesticide residues. Unlike conventional production, the "organic chain" means that healthy soil leads to healthy animal feed, which leads to healthy cows with normal milk, and eventually to healthy consumers. Since this must be scientifically proven, there is an increasing need for scientific methods that reveal the benefits of organic food. For this purpose, holistic methods such as biocrystallization and the rising picture method are introduced. Biocrystallization shows that organic milk is systematically more "balanced", with more "ordered structure" and better "integration and coordination." Previous studies using the biocrystallization method were performed on raw milk produced under different conditions, on differently treated milk (heat treatment and homogenization) and on butter. Biocrystallization pictures are first assessed visually and then by computer analysis of the texture images, which is used to estimate the density of the images. The rising picture method, which normally works in parallel with biocrystallization, can differentiate samples of Demeter and organic milk from conventional production, as well as milk treated differently during processing. Organic milk shows better results than conventional milk in terms of impact on consumer health when assessed using both the conventional and the holistic methods.

  13. NOLB : Non-linear rigid block normal mode analysis method.

    Science.gov (United States)

    Hoffmann, Alexandre; Grudinin, Sergei

    2017-04-05

    We present a new, conceptually simple and computationally efficient method for non-linear normal mode analysis called NOLB. It relies on the rotations-translations of blocks (RTB) theoretical basis developed by Y.-H. Sanejouand and colleagues. We demonstrate how to physically interpret the eigenvalues computed in the RTB basis in terms of angular and linear velocities applied to the rigid blocks, and how to construct a non-linear extrapolation of motion out of these velocities. The key observation of our method is that the angular velocity of a rigid block can be interpreted as the result of an implicit force, such that the motion of the rigid block can be considered as a pure rotation about a certain center. We demonstrate the motions produced with the NOLB method on three different molecular systems and show that some of the lowest frequency normal modes correspond to biologically relevant motions. For example, NOLB detects the spiral sliding motion of the TALE protein, which is capable of rapid diffusion along its target DNA. Overall, our method produces better structures compared to the standard approach, especially at large deformation amplitudes, as we demonstrate by visual inspection, energy and topology analyses, and also by MolProbity validation. Finally, our method is scalable and can be applied to very large molecular systems, such as ribosomes. Standalone executables of the NOLB normal mode analysis method are available at https://team.inria.fr/nano-d/software/nolb-normal-modes. A graphical user interface created for the SAMSON software platform will be made available at https://www.samson-connect.net.
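
    For intuition about normal mode analysis in general (the linear baseline, not the NOLB algorithm or its RTB basis), the standard approach diagonalizes a Hessian; here is a toy spring-chain sketch with unit masses:

```python
import numpy as np

def hessian_chain(n, k=1.0):
    """Hessian of a 1-D chain of unit masses joined by springs of stiffness k."""
    h = np.zeros((n, n))
    for i in range(n - 1):
        h[i, i] += k; h[i + 1, i + 1] += k
        h[i, i + 1] -= k; h[i + 1, i] -= k
    return h

# Eigenvalues are squared mode frequencies; eigenvectors are the mode shapes.
freq2, modes = np.linalg.eigh(hessian_chain(5))
```

    The zero eigenvalue is the rigid-body translation, and the lowest non-zero mode is the softest collective motion; in proteins these soft modes are the candidates for biologically relevant motions.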

  14. Statistical Models and Methods for Network Meta-Analysis.

    Science.gov (United States)

    Madden, L V; Piepho, H-P; Paul, P A

    2016-08-01

    Meta-analysis, the methodology for analyzing the results from multiple independent studies, has grown tremendously in popularity over the last four decades. Although most meta-analyses involve a single effect size (summary result, such as a treatment difference) from each study, there are often multiple treatments of interest across the network of studies in the analysis. Multi-treatment (or network) meta-analysis can be used for simultaneously analyzing the results from all the treatments. However, the methodology is considerably more complicated than for the analysis of a single effect size, and there have not been adequate explanations of the approach for agricultural investigations. We review the methods and models for conducting a network meta-analysis based on frequentist statistical principles, and demonstrate the procedures using a published multi-treatment plant pathology data set. A major advantage of network meta-analysis is that correlations of estimated treatment effects are automatically taken into account when an appropriate model is used. Moreover, treatment comparisons may be possible in a network meta-analysis that are not possible in a single study because all treatments of interest may not be included in any given study. We review several models that consider the study effect as either fixed or random, and show how to interpret model-fitting output. We further show how to model the effect of moderator variables (study-level characteristics) on treatment effects, and present one approach to test for the consistency of treatment effects across the network. Online supplemental files give explanations on fitting the network meta-analytical models using SAS.
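
    A fixed-effect network meta-analysis can be sketched as weighted least squares on a treatment-contrast design matrix (a generic frequentist illustration with invented data, not the paper's models or its SAS code):

```python
import numpy as np

def network_ma(contrasts, ref=0):
    """Fixed-effect network meta-analysis by weighted least squares.

    contrasts: one (treat_a, treat_b, effect of b minus a, variance) per study.
    Returns {treatment: effect relative to `ref`}.
    """
    treats = sorted({t for a, b, _, _ in contrasts for t in (a, b)})
    cols = {t: i for i, t in enumerate(t for t in treats if t != ref)}
    X = np.zeros((len(contrasts), len(cols)))
    y = np.zeros(len(contrasts)); w = np.zeros(len(contrasts))
    for k, (a, b, eff, var) in enumerate(contrasts):
        if a != ref: X[k, cols[a]] = -1.0   # each row encodes one contrast
        if b != ref: X[k, cols[b]] = 1.0
        y[k], w[k] = eff, 1.0 / var          # inverse-variance weights
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    out = {ref: 0.0}
    out.update({t: beta[i] for t, i in cols.items()})
    return out

# Three two-arm studies on treatments 0, 1, 2 (effect = difference, variance known)
est = network_ma([(0, 1, 1.0, 0.1), (1, 2, 0.5, 0.1), (0, 2, 1.6, 0.2)])
```

    The pooled 0-vs-2 effect (1.55) lies between the direct evidence (1.6) and the indirect route through treatment 1 (1.0 + 0.5 = 1.5), illustrating how the network combines both.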

  15. Preparation of 99mTc-EDTA-MN and Its Bioimaging in Mouse

    Directory of Open Access Journals (Sweden)

    Yongshuai QI

    2015-07-01

    Full Text Available Background and objective Hypoxia is an important biological characteristics of solid tumor, it is not sensitive to radiotherapy and chemotherapy for which is the presence of hypoxic cell, thus increasing their resistance to conventional radiotherapy and chemotherapy, therefore, the detection of hypoxia degree of tumor tissue is of great significance. The hypoxia imaging of nuclear medicine can reflect the degree of tissue hypoxia, which can selectively retained on the hypoxic cells or tissues, including nitroimidazole and non nitroimidazole; the nitroimidazole is widely and deeply researched as hypoxic celles developer in China and abroad at present. The research about application of radionuclide labelled technique has clinical application value to develop the hypoxia imaging agent EDTA-MN complexes which was labeled. To study the feasibility of 99mTc by direct labeling method, the radiochemical properties evaluation of 99mTc-EDTA-MN, and observe the distribution characteristics of 99mTc radiolabeled EDTA-MN in the xenograft lung cancer nude mice bearing non-small cell lung cancer cell (A549, and provide experimental evidence for its further research and application. Methods The radiolabeling of EDTA-MN with 99mTc was performed with direct labeling method, respectively, on the reaction dosage (10 mg, 5 mg, 2 mg, stannous chloride dosage (8 mg/mL, 4 mg/mL, 2 mg/mL, mark system pH (2, 4, 5, 6 one by one test, using orthogonal design analysis, to find the optimal labeling conditions. Labelling rate, radiochemical purity, lipid-water partition coefficient and in vitro stability in normal saline (NS were determined by TLC and HPLC, and the preliminary study on the distribution of 99mTc-EDTA-MN in nude mice. Results The labeling rate of 99mTc-EDTA-MN with the best labeling conditions was (84.11±2.83%, and the radiochemical purity was higher than 90% by HPLC purification, without any notable decomposition at room temperature over a period of 12 h. 
The

  16. Biclustering methods: biological relevance and application in gene expression analysis.

    Directory of Open Access Journals (Sweden)

    Ali Oghabian

    Full Text Available DNA microarray technologies are used extensively to profile the expression levels of thousands of genes under various conditions, yielding extremely large data matrices. Thus, analyzing this information and extracting biologically relevant knowledge becomes a considerable challenge. A classical approach for tackling this challenge is to use clustering (also known as one-way clustering) methods, where genes (or, respectively, samples) are grouped together based on the similarity of their expression profiles across the set of all samples (or, respectively, genes). An alternative approach is to develop biclustering methods to identify local patterns in the data. These methods extract subgroups of genes that are co-expressed across only a subset of samples and may feature important biological or medical implications. In this study we evaluate 13 biclustering and 2 clustering (k-means and hierarchical) methods. We use several approaches to compare their performance on two real gene expression data sets. For this purpose we apply four evaluation measures in our analysis: (1) we examine how well the considered (bi)clustering methods differentiate various sample types; (2) we evaluate how well the groups of genes discovered by the (bi)clustering methods are annotated with similar Gene Ontology categories; (3) we evaluate the capability of the methods to differentiate genes that are known to be specific to the particular sample types we study; and (4) we compare the running time of the algorithms. In the end, we conclude that as long as the samples are well defined and annotated, the contamination of the samples is limited, and the samples are well replicated, biclustering methods such as Plaid and SAMBA are useful for discovering relevant subsets of genes and samples.
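
    A score behind biclustering methods in the Cheng-Church family is the mean squared residue of a submatrix: near zero when every row follows the same additive pattern. A minimal sketch (illustrative of the general idea, not one of the 13 evaluated methods; the matrices are invented):

```python
import numpy as np

def mean_squared_residue(sub):
    """Cheng-Church score: 0 for a perfectly additive bicluster, larger = noisier."""
    sub = np.asarray(sub, dtype=float)
    row_mean = sub.mean(axis=1, keepdims=True)
    col_mean = sub.mean(axis=0, keepdims=True)
    residue = sub - row_mean - col_mean + sub.mean()  # two-way additive model
    return float((residue ** 2).mean())

# An additive pattern (each row = base profile + row offset) scores ~0,
# while an unstructured matrix does not.
additive = np.array([[1, 2, 3], [2, 3, 4], [5, 6, 7]])
noisy = np.array([[1, 6, 2], [7, 1, 4], [2, 9, 0]])
msr_add = mean_squared_residue(additive)
msr_noisy = mean_squared_residue(noisy)
```

    Greedy biclustering algorithms of this family grow or shrink the gene and sample sets to keep this score below a threshold.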

  17. The Use of Object-Oriented Analysis Methods in Surety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Craft, Richard L.; Funkhouser, Donald R.; Wyss, Gregory D.

    1999-05-01

    Object-oriented analysis methods have been used in the computer science arena for a number of years to model the behavior of computer-based systems. This report documents how such methods can be applied to surety analysis. By embodying the causality and behavior of a system in a common object-oriented analysis model, surety analysts can make the assumptions that underlie their models explicit and thus better communicate with system designers. Furthermore, given minor extensions to traditional object-oriented analysis methods, it is possible to automatically derive a wide variety of traditional risk and reliability analysis methods from a single common object model. Automatic model extraction helps ensure consistency among analyses and enables the surety analyst to examine a system from a wider variety of viewpoints in a shorter period of time. Thus it provides a deeper understanding of a system's behaviors and surety requirements. This report documents the underlying philosophy behind the common object model representation, the methods by which such common object models can be constructed, and the rules required to interrogate the common object model for derivation of traditional risk and reliability analysis models. The methodology is demonstrated in an extensive example problem.

  18. The development of a 3D risk analysis method.

    Science.gov (United States)

    I, Yet-Pole; Cheng, Te-Lung

    2008-05-01

    Much attention has been paid to quantitative risk analysis (QRA) research in recent years due to the increasingly severe disasters that have happened in the process industries. Owing to its computational complexity, very few software packages, such as SAFETI, can really make the risk presentation meet practical requirements. However, the traditional risk presentation method, like the individual risk contour in SAFETI, is mainly based on the consequence analysis results of dispersion modeling, which usually assumes that the vapor cloud disperses over a constant ground roughness on a flat terrain with no obstructions and no concentration fluctuations, which is quite different from the real situation of a chemical process plant. These models usually over-predict the hazardous regions in order to remain conservative, which also increases the uncertainty of the simulation results. On the other hand, a more rigorous model such as a computational fluid dynamics (CFD) model can resolve the previous limitations, but not the complexity of the risk calculations. In this research, a conceptual three-dimensional (3D) risk calculation method was proposed, combining the results of a series of CFD simulations with post-processing procedures to obtain 3D individual-risk iso-surfaces. It is believed that such a technique will not only apply to risk analysis at ground level, but will also be extended to aerial, submarine, or space risk analyses in the near future.

  19. Primary component analysis method and reduction of seismicity parameters

    Institute of Scientific and Technical Information of China (English)

    WANG Wei; MA Qin-zhong; LIN Ming-zhou; WU Geng-feng; WU Shao-chun

    2005-01-01

    In the paper, a primary component analysis is performed using 8 seismicity parameters: earthquake frequency N (ML≥3.0), b-value, η-value, A(b)-value, Mf-value, Ac-value, C-value and D-value, which reflect the characteristics of the magnitude, time and space distribution of seismicity from different respects. By using the primary component analysis method, a synthesis parameter W reflecting the anomalous features of the magnitude, time and space distribution of earthquakes can be obtained. Generally, there is some correlation among the 8 parameters, but their variations differ in different periods, and earthquake prediction based on these parameters alone does not perform very well. However, the synthesis parameter W showed obvious anomalies before 13 earthquakes (MS>5.8) that occurred in North China, which indicates that W can better reflect the anomalous characteristics of the magnitude, time and space distribution of seismicity. Other problems related to the conclusions drawn by the primary component analysis method are also discussed.
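
    Constructing a synthesis parameter from correlated indicators can be sketched with standard principal component analysis: standardise the indicators and project onto the leading eigenvector of their covariance matrix (synthetic data stand in for the 8 seismicity parameters; this is not the authors' data or exact procedure):

```python
import numpy as np

def first_component_score(X):
    """Project standardised observations onto the first principal component."""
    Z = (X - X.mean(0)) / X.std(0)          # standardise each indicator
    cov = np.cov(Z, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)        # eigenvalues in ascending order
    pc1 = vecs[:, -1]                       # eigenvector of the largest eigenvalue
    explained = vals[-1] / vals.sum()       # fraction of variance captured
    return Z @ pc1, explained

# 8 correlated "seismicity parameters" observed over 50 time windows,
# all driven by one shared anomaly signal plus noise (sign of pc1 is arbitrary)
rng = np.random.default_rng(0)
common = rng.normal(size=(50, 1))
loadings = np.linspace(0.5, 1.5, 8)
X = common @ loadings[None, :] + 0.3 * rng.normal(size=(50, 8))
w, frac = first_component_score(X)
```

    When the indicators share a common driver, the first component captures most of the variance, and its score plays the role of a single synthesis parameter tracked through time.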

  20. Numerical method in biomechanical analysis of intramedullary osteosynthesis in children

    Directory of Open Access Journals (Sweden)

    A. Krauze

    2006-02-01

    Full Text Available Purpose: The paper presents the biomechanical analysis of intramedullary osteosynthesis in 5-7 year old children. Design/methodology/approach: The numerical analysis was performed for two different materials (stainless steel – 316L and titanium alloy – Ti-6Al-4V) and for two different fractures of the femur (1/2 of the bone shaft, and 25 mm above). Furthermore, the stresses between the bone fragments were calculated while loading the femur with forces derived from the trunk mass. In the research the Metaizeau method was applied; this method ensures appropriate fixation without complications. Findings: The numerical analysis shows that the stresses in both the steel and the titanium alloy nails did not exceed the yield point: for the stainless steel Rp0,2,min = 690 MPa and for the titanium alloy Rp0,2,min = 895 MPa. Research limitations/implications: The obtained results are the basis for the optimization of the mechanical properties of the metallic biomaterial. Practical implications: On the basis of the obtained results it can be stated that both stainless steel and titanium alloy nails can be applied in elastic osteosynthesis of femur fractures in children. Originality/value: The obtained results can be used by physicians to ensure elastic osteosynthesis that accelerates bone union.

  1. Advanced response surface method for mechanical reliability analysis

    Institute of Scientific and Technical Information of China (English)

    L(U) Zhen-zhou; ZHAO Jie; YUE Zhu-feng

    2007-01-01

    Based on the classical response surface method (RSM), a novel RSM using improved experimental points (EPs) is presented for reliability analysis. The presented method includes two novel points. One is the use of linear interpolation, by which the total EPs for determining the RS are selected to be closer to the actual failure surface; the other is the application of sequential linear interpolation to control the distance between the surrounding EPs and the center EP, by which the presented method can ensure that the RS fits the actual failure surface in the region of maximum likelihood as the center EPs converge to the actual most probable point (MPP). Since the presented method increases the fitting precision of the RS to the actual failure surface in the vicinity of the MPP, which contributes most to the probability of the failure surface being exceeded, the precision of the failure probability calculated from the RS is increased as well. Numerical examples illustrate the accuracy and efficiency of the presented method.

  2. Analysis of Fiber deposition using Automatic Image Processing Method

    Science.gov (United States)

    Belka, M.; Lizal, F.; Jedelsky, J.; Jicha, M.

    2013-04-01

    Fibers are a permanent threat to human health. They have the ability to penetrate deep into the human lung, deposit there, and cause health hazards, e.g. lung cancer. An experiment was carried out to gain more data about the deposition of fibers. Monodisperse glass fibers were delivered into a realistic model of human airways with an inspiratory flow rate of 30 l/min. The replica included the human airways from the oral cavity up to the seventh generation of branching. After the delivery, deposited fibers were rinsed from the model and placed on nitrocellulose filters. A novel method was established for deposition data acquisition, based on the principle of image analysis. The images were captured by a high-definition camera attached to a phase contrast microscope. Results of the new method were compared with the standard phase contrast microscopy (PCM) method, which follows methodology NIOSH 7400, and a good match was found. The new method was found applicable for the evaluation of fibers, and the deposition fraction and deposition efficiency were calculated afterwards.
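
    The counting step of such an image analysis can be sketched as a connected-component pass over a thresholded (binary) image; this is a generic illustration, not the authors' pipeline:

```python
from collections import deque

def count_objects(binary, min_pixels=1):
    """Count connected foreground regions (e.g. fibers) in a binary image,
    optionally ignoring regions smaller than min_pixels."""
    rows, cols = len(binary), len(binary[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if binary[r][c] and not seen[r][c]:
                # Flood-fill one region with BFS (4-connectivity).
                size, q = 0, deque([(r, c)])
                seen[r][c] = True
                while q:
                    y, x = q.popleft()
                    size += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                           and binary[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            q.append((ny, nx))
                if size >= min_pixels:
                    count += 1
    return count
```

    A size filter like min_pixels is a common way to reject single-pixel noise before computing deposition fractions.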

  3. Analysis of Fiber deposition using Automatic Image Processing Method

    Directory of Open Access Journals (Sweden)

    Jicha M.

    2013-04-01

    Full Text Available Fibers are a permanent threat to human health. They have the ability to penetrate deep into the human lung, deposit there, and cause health hazards, e.g. lung cancer. An experiment was carried out to gain more data about the deposition of fibers. Monodisperse glass fibers were delivered into a realistic model of human airways with an inspiratory flow rate of 30 l/min. The replica included the human airways from the oral cavity up to the seventh generation of branching. After the delivery, deposited fibers were rinsed from the model and placed on nitrocellulose filters. A novel method was established for deposition data acquisition, based on the principle of image analysis. The images were captured by a high-definition camera attached to a phase contrast microscope. Results of the new method were compared with the standard phase contrast microscopy (PCM) method, which follows methodology NIOSH 7400, and a good match was found. The new method was found applicable for the evaluation of fibers, and the deposition fraction and deposition efficiency were calculated afterwards.

  4. SIMS: a hybrid method for rapid conformational analysis.

    Directory of Open Access Journals (Sweden)

    Bryant Gipson

    Full Text Available Proteins are at the root of many biological functions, often performing complex tasks as the result of large changes in their structure. Describing the exact details of these conformational changes, however, remains a central challenge for computational biology, due to the enormous computational requirements of the problem. This has engendered the development of a rich variety of useful methods designed to answer specific questions at different levels of spatial, temporal, and energetic resolution. These methods fall largely into two classes: physically accurate but computationally demanding methods, and fast, approximate methods. We introduce here a new hybrid modeling tool, the Structured Intuitive Move Selector (SIMS), designed to bridge the divide between these two classes, while allowing the benefits of both to be seamlessly integrated into a single framework. This is achieved by applying a modern motion planning algorithm, borrowed from the field of robotics, in tandem with a well-established protein modeling library. SIMS can combine precise energy calculations with approximate or specialized conformational sampling routines to produce rapid, yet accurate, analysis of the large-scale conformational variability of protein systems. Several key advancements are shown, including the abstract use of generically defined moves (conformational sampling methods) and an expansive probabilistic conformational exploration. We present three example problems to which SIMS is applied and demonstrate a rapid solution for each. These include the automatic determination of "active" residues for the hinge-based system Cyanovirin-N, exploring conformational changes involving long-range coordinated motion between non-sequential residues in Ribose-Binding Protein, and the rapid discovery of a transient conformational state of Maltose-Binding Protein, previously only determined by Molecular Dynamics. For all cases we provide energetic validations using well

  5. Analysis on electric energy measuring method based on multi-resolution analysis

    Institute of Scientific and Technical Information of China (English)

    ZHANG Xiao-bing; CUI Jia-rui; LIANG Yuan-hua; WANG Mu-kun

    2006-01-01

    With the massive application of non-linear loads and impact loads, many non-stationary stochastic signals such as harmonics, inter-harmonics, and impulse signals are introduced into the electric network, and these non-stationary stochastic signals affect the accuracy of electric energy measurement. A traditional method like Fourier analysis can be applied efficiently to stationary stochastic signals, but it is of little use for non-stationary stochastic signals. In light of this, the form that the signals of the electric network take in the wavelet domain is discussed in this paper. A measurement method for active power based on multi-resolution analysis of the stochastic process is presented. This method has a wider application scope than traditional Fourier analysis, and it is of good referential and practical value for raising the level of existing electric energy measurement.
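
    Because an orthonormal wavelet transform preserves inner products, active power can be evaluated scale by scale from the wavelet coefficients of voltage and current. A minimal Haar-based sketch (illustrative only; the paper's multi-resolution scheme may use a different wavelet and decomposition depth):

```python
import math

def haar(x):
    """Full orthonormal Haar decomposition of a signal of length 2^k."""
    details, s = [], list(x)
    while len(s) > 1:
        a = [(s[i] + s[i + 1]) / math.sqrt(2) for i in range(0, len(s), 2)]
        d = [(s[i] - s[i + 1]) / math.sqrt(2) for i in range(0, len(s), 2)]
        details = d + details   # coarser-scale details go in front
        s = a
    return s + details          # [overall approximation] + all details

def active_power(u, i):
    """Active power from wavelet coefficients: the Haar transform is
    orthonormal, so the inner product <u, i> is preserved and
    P = (1/N) * sum_k U_k * I_k can be evaluated scale by scale."""
    U, I = haar(u), haar(i)
    return sum(a * b for a, b in zip(U, I)) / len(u)
```

    The per-scale partial sums also show how much power each frequency band carries, which is the practical appeal of the multi-resolution view.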

  6. Acoustic analysis of lightweight auto-body based on finite element method and boundary element method

    Institute of Scientific and Technical Information of China (English)

    LIANG Xinhua; ZHU Ping; LIN Zhongqin; ZHANG Yan

    2007-01-01

    A lightweight automotive prototype using alternative materials and gauge thicknesses is studied by a numerical method. The noise, vibration, and harshness (NVH) performance is the main target of this study. In the range of 1-150 Hz, the frequency response function (FRF) of the body structure is calculated by a finite element method (FEM) to obtain the dynamic behavior of the auto-body structure. The pressure response of the interior acoustic domain is solved by a boundary element method (BEM). To find the panel contributing most to the inner sound pressure, a panel acoustic contribution analysis (PACA) is performed. Finally, the most contributing panel is located and the resulting structural optimization is found to be more efficient.

  7. Robust Colloidal Nanoparticles of Pyrrolopyrrole Cyanine J-Aggregates with Bright Near-Infrared Fluorescence in Aqueous Media: From Spectral Tailoring to Bioimaging Applications.

    Science.gov (United States)

    Yang, Cangjie; Wang, Xiaochen; Wang, Mingfeng; Xu, Keming; Xu, Chenjie

    2017-03-28

    Colloidal nanoparticles (NPs) containing near-infrared-fluorescent J-aggregates (JAGGs) of pyrrolopyrrole cyanines (PPcys) stabilized by amphiphilic block co-polymers were prepared in aqueous medium. JAGG formation can be tuned by means of the chemical structure of PPcys, the concentration of chromophores inside the polymeric NPs, and ultrasonication. The JAGG NPs exhibit a narrow emission band at 773 nm, a fluorescence quantum yield comparable to that of indocyanine green, and significantly enhanced photostability, which is ideal for long-term bioimaging.

  8. Statistical methods for the detection and analysis of radioactive sources

    Science.gov (United States)

    Klumpp, John

    We consider four topics from areas of radioactive statistical analysis in the present study: Bayesian methods for the analysis of count rate data, analysis of energy data, a model for non-constant background count rate distributions, and a zero-inflated model of the sample count rate. The study begins with a review of Bayesian statistics and techniques for analyzing count rate data. Next, we consider a novel system for incorporating energy information into count rate measurements which searches for elevated count rates in multiple energy regions simultaneously. The system analyzes time-interval data in real time to sequentially update a probability distribution for the sample count rate. We then consider a "moving target" model of background radiation in which the instantaneous background count rate is a function of time, rather than being fixed. Unlike the sequential update system, this model assumes a large body of pre-existing data which can be analyzed retrospectively. Finally, we propose a novel Bayesian technique which allows for simultaneous source detection and count rate analysis. This technique is fully compatible with, but independent of, the sequential update system and moving target model.
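
    The sequential update for a Poisson count rate with a conjugate gamma prior can be sketched as follows (the prior and the counting intervals are illustrative numbers, not data from the study):

```python
def update_gamma(alpha, beta, counts, t):
    """Conjugate Bayesian update for a Poisson count rate: with a
    gamma(alpha, beta) prior (beta in inverse-time units) and `counts`
    events observed in time t, the posterior is gamma(alpha+counts, beta+t)."""
    return alpha + counts, beta + t

def posterior_mean(alpha, beta):
    return alpha / beta

# Sequential updating over three counting intervals.
alpha, beta = 1.0, 1.0                      # weakly informative prior
for counts, t in [(3, 1.0), (5, 1.0), (4, 1.0)]:
    alpha, beta = update_gamma(alpha, beta, counts, t)
# Posterior mean rate: (1 + 12) / (1 + 3) = 3.25 counts per unit time.
```

    Time-interval data fit the same scheme, since the sufficient statistics are just the total counts and total observation time.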

  9. Statistical analysis of the precision of the Match method

    Directory of Open Access Journals (Sweden)

    R. Lehmann

    2005-05-01

    Full Text Available The Match method quantifies chemical ozone loss in the polar stratosphere. The basic idea consists in calculating the forward trajectory of an air parcel that has been probed by an ozone measurement (e.g., by an ozone sonde or satellite) and finding a second ozone measurement close to this trajectory. Such an event is called a "match". A rate of chemical ozone destruction can be obtained by a statistical analysis of several tens of such match events. Information on the uncertainty of the calculated rate can be inferred from the scatter of the ozone mixing ratio difference (second measurement minus first measurement) associated with individual matches. A standard analysis would assume that the errors of these differences are statistically independent. However, this assumption may be violated because different matches can share a common ozone measurement, so that the errors associated with these match events become statistically dependent. Taking this effect into account, we present an analysis of the uncertainty of the final Match result. It has been applied to Match data from the Arctic winters 1995, 1996, 2000, and 2003. For these ozone-sonde Match studies the effect of the error correlation on the uncertainty estimates is rather small: compared to a standard error analysis, the uncertainty estimates increase by 15% on average. However, the effect is more pronounced for typical satellite Match analyses: for an Antarctic satellite Match study (2003), the uncertainty estimates increase by 60% on average.
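
    The effect of shared measurements can be reproduced with a small covariance bookkeeping sketch, assuming every measurement error is independent with the same variance (a simplification of the paper's analysis; the function and variable names are illustrative):

```python
def mean_variance(matches, sigma2):
    """Variance of the mean ozone difference when matches may share
    measurements. Each match is a pair (i, j) of measurement indices
    (first, second) and d = m_j - m_i; every measurement error has
    variance sigma2 and errors of distinct measurements are independent."""
    n = len(matches)
    total = 0.0
    for (i1, j1) in matches:
        for (i2, j2) in matches:
            # Cov(d_k, d_l) built from the indicator of shared indices.
            total += sigma2 * ((i1 == i2) + (j1 == j2)
                               - (i1 == j2) - (j1 == i2))
    return total / n ** 2
```

    With no shared measurements this reduces to the standard result, the variance of one difference divided by the number of matches; shared measurements add off-diagonal covariance terms and inflate the uncertainty.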

  10. Transient Analysis of Air-Core Coils by Moment Method

    Science.gov (United States)

    Fujita, Akira; Kato, Shohei; Hirai, Takao; Okabe, Shigemitu

    In electric power systems, the threat of lightning surges is reduced by using ground wires and arresters, but the risk of transformer failure is still high. The winding is the most familiar conductor configuration in electromagnetic components such as transformers, resistors, and reactance devices. It is therefore important to investigate how a lightning surge propagates into a winding, but the electromagnetic coupling in a winding makes lightning surge analysis difficult. In this paper we present a transient characteristics analysis of air-core coils by the moment method in the frequency domain. We calculate the inductance from the time response and the impedance at low frequency, and compare them with the analytical equation based on the Nagaoka factor.
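
    For reference, the low-frequency inductance against which such a result is compared has the classical form L = k·μ0·N²·A/l, with k the Nagaoka coefficient. A sketch using Wheeler's approximation of k (an assumed stand-in; the paper's exact expression may differ):

```python
import math

def solenoid_inductance(n_turns, radius_m, length_m):
    """Low-frequency inductance of a single-layer air-core coil.
    Uses Wheeler's approximation of the Nagaoka coefficient,
    k ~ 1 / (1 + 0.45 * d / l), good to about 1% for coils that are
    not too short relative to their diameter."""
    mu0 = 4e-7 * math.pi
    area = math.pi * radius_m ** 2
    k = 1.0 / (1.0 + 0.45 * (2.0 * radius_m) / length_m)
    return k * mu0 * n_turns ** 2 * area / length_m
```

    For a long thin coil k approaches 1 and the formula reduces to the ideal solenoid inductance μ0·N²·A/l.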

  11. ANALYSIS METHODS OF BANKRUPTCY RISK IN ROMANIAN ENERGY MINING INDUSTRY

    Directory of Open Access Journals (Sweden)

    CORICI MARIAN CATALIN

    2016-12-01

    Full Text Available The study is an analysis of bankruptcy risk and an assessment of the economic performance of the entity in charge of the energy mining industry in the southwest region. The risk of bankruptcy is assessed using the score method and indicators which reflect the results obtained and elements from the balance sheet of an organization involved in mining and energy that contributes to the stability of the national energy system. The analysis focuses on the application of business models that allow a comprehensive assessment of the risk of bankruptcy and can serve as an instrument for its forecast. In this study, the development of bankruptcy risk within the organization is highlighted through the Altman model and the Conan-Holder model, in order to give a comprehensive picture of the organization's ability to ensure business continuity.
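
    The Altman model mentioned above is, in its original 1968 form for public manufacturing firms, a fixed linear combination of five balance-sheet ratios (the input figures below are purely illustrative):

```python
def altman_z(working_capital, retained_earnings, ebit, equity_market_value,
             sales, total_assets, total_liabilities):
    """Altman's original (1968) Z-score. Classic interpretation:
    Z < 1.81 signals distress, Z > 2.99 the safe zone, with a grey
    area in between."""
    x1 = working_capital / total_assets
    x2 = retained_earnings / total_assets
    x3 = ebit / total_assets
    x4 = equity_market_value / total_liabilities
    x5 = sales / total_assets
    return 1.2 * x1 + 1.4 * x2 + 3.3 * x3 + 0.6 * x4 + 1.0 * x5
```

    The Conan-Holder model works the same way, as a different weighted combination of ratios with its own cut-off values.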

  12. Bayesian methods for the design and analysis of noninferiority trials.

    Science.gov (United States)

    Gamalo-Siebers, Margaret; Gao, Aijun; Lakshminarayanan, Mani; Liu, Guanghan; Natanegara, Fanni; Railkar, Radha; Schmidli, Heinz; Song, Guochen

    2016-01-01

    The gold standard for evaluating treatment efficacy of a medical product is a placebo-controlled trial. However, when the use of placebo is considered to be unethical or impractical, a viable alternative for evaluating treatment efficacy is through a noninferiority (NI) study where a test treatment is compared to an active control treatment. The minimal objective of such a study is to determine whether the test treatment is superior to placebo. An assumption is made that if the active control treatment remains efficacious, as was observed when it was compared against placebo, then a test treatment that has comparable efficacy with the active control, within a certain range, must also be superior to placebo. Because of this assumption, the design, implementation, and analysis of NI trials present challenges for sponsors and regulators. In designing and analyzing NI trials, substantial historical data are often required on the active control treatment and placebo. Bayesian approaches provide a natural framework for synthesizing the historical data in the form of prior distributions that can effectively be used in design and analysis of a NI clinical trial. Despite a flurry of recent research activities in the area of Bayesian approaches in medical product development, there are still substantial gaps in recognition and acceptance of Bayesian approaches in NI trial design and analysis. The Bayesian Scientific Working Group of the Drug Information Association provides a coordinated effort to target the education and implementation issues on Bayesian approaches for NI trials. In this article, we provide a review of both frequentist and Bayesian approaches in NI trials, and elaborate on the implementation for two common Bayesian methods including hierarchical prior method and meta-analytic-predictive approach. Simulations are conducted to investigate the properties of the Bayesian methods, and some real clinical trial examples are presented for illustration.

  13. Genome analysis methods - PGDBj Registered plant list, Marker list, QTL list, Plant DB link & Genome analysis methods | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Full Text Available Record fields include: year of genome analysis, sequencing method, read counts, covered genome region, annotation method, number of predicted genes, and genome database information.

  14. Applications of Automation Methods for Nonlinear Fracture Test Analysis

    Science.gov (United States)

    Allen, Phillip A.; Wells, Douglas N.

    2013-01-01

    Using automated and standardized computer tools to calculate the pertinent test result values has several advantages: 1. allowing high-fidelity solutions to complex nonlinear phenomena that would be impractical to express in written equation form; 2. eliminating errors associated with the interpretation and programming of analysis procedures from the text of test standards; 3. lessening the need for expertise in the areas of solid mechanics, fracture mechanics, numerical methods, and/or finite element modeling to achieve sound results; 4. providing one computer tool and/or one set of solutions for all users, for a more "standardized" answer. In summary, this approach allows a non-expert with rudimentary training to get the best practical solution based on the latest understanding with minimum difficulty. Other existing ASTM standards that cover complicated phenomena use standard computer programs: 1. ASTM C1340/C1340M-10, Standard Practice for Estimation of Heat Gain or Loss Through Ceilings Under Attics Containing Radiant Barriers by Use of a Computer Program; 2. ASTM F2815, Standard Practice for Chemical Permeation through Protective Clothing Materials: Testing Data Analysis by Use of a Computer Program; 3. ASTM E2807, Standard Specification for 3D Imaging Data Exchange, Version 1.0. The verification, validation, and round-robin processes required of a computer tool closely parallel the methods that are used to ensure the solution validity for equations included in test standards. The use of automated analysis tools allows the creation and practical implementation of advanced fracture mechanics test standards that capture the physics of a nonlinear fracture mechanics problem without adding undue burden or expense to the user. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.

  15. Conceptual design of an undulator system for a dedicated bio-imaging beamline at the European X-ray FEL

    Energy Technology Data Exchange (ETDEWEB)

    Geloni, Gianluca [European XFEL GmbH, Hamburg (Germany); Kocharyan, Vitali; Saldin, Evgeni [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2012-05-15

    We describe a possible future upgrade of the European XFEL consisting in the construction of an undulator beamline dedicated to life science experiments. The availability of free undulator tunnels at the European XFEL facility offers a unique opportunity to build a beamline optimized for coherent diffraction imaging of complex molecules, like proteins and other biologically interesting structures. Crucial parameters for such a bio-imaging beamline are photon energy range, peak power, and pulse duration. The key component of the setup is the undulator source. The peak power is maximized in the photon energy range between 3 keV and 13 keV by the use of a very efficient combination of self-seeding, fresh bunch and tapered undulator techniques. The unique combination of ultra-high peak power of 1 TW in the entire energy range, and ultrashort pulse duration tunable from 2 fs to 10 fs, would allow for single shot coherent imaging of protein molecules with sizes larger than 10 nm. Also, the new beamline would enable imaging of large biological structures in the water window, between 0.3 keV and 0.4 keV. In order to make use of standardized components, at present we favor the use of SASE3-type undulator segments. The number of segments, 40, is determined by the taper length required for the design output power of 1 TW. The present plan assumes the use of a nominal electron bunch with a charge of 0.1 nC. Experiments will be performed without interference with the other three undulator beamlines. Therefore, the total amount of scheduled beam time per year is expected to be up to 4000 hours.

  16. Pharmacokinetics of quercetin-loaded nanodroplets with ultrasound activation and their use for bioimaging

    Directory of Open Access Journals (Sweden)

    Chang LW

    2015-04-01

    Full Text Available Li-Wen Chang,1 Mei-Ling Hou,1 Shuo-Hui Hung,2 Lie-Chwen Lin,3 Tung-Hu Tsai1,4–6 1Institute of Traditional Medicine, School of Medicine, National Yang-Ming University, 2Department of Surgery, 3National Research Institute of Chinese Medicine, Ministry of Health and Welfare, Taipei, 4Department of Education and Research, Taipei City Hospital, 5School of Pharmacy, College of Pharmacy, Kaohsiung Medical University, Kaohsiung, 6Graduate Institute of Acupuncture Science, China Medical University, Taichung, Taiwan Abstract: Bubble formulations have both diagnostic and therapeutic applications. However, research on nanobubbles/nanodroplets remains in the initial stages. In this study, a nanodroplet formulation was prepared and loaded with a novel class of chemotherapeutic drug, ie, quercetin, to observe its pharmacokinetic properties and ultrasonic bioimaging of specific sites, namely the abdominal vein and bladder. Four parallel groups were designed to investigate the effects of ultrasound and nanodroplets on the pharmacokinetics of quercetin. These groups were quercetin alone, quercetin triggered with ultrasound, quercetin-encapsulated in nanodroplets, and quercetin encapsulated in nanodroplets triggered with ultrasound. Spherical vesicles with a mean diameter of 280 nm were formed, and quercetin was completely encapsulated within. In vivo ultrasonic imaging confirmed that the nanodroplets could be treated by ultrasound. The results indicate that the initial 5-minute serum concentration, area under the concentration–time curve, elimination half-life, and clearance of quercetin were significantly enhanced by nanodroplets with or without ultrasound. Keywords: nanodroplets, quercetin, ultrasonic pharmacokinetics, ultrasonic imaging, ultrasound

  17. Analysis of PCA Method in Image Recognition with MATLAB

    Institute of Scientific and Technical Information of China (English)

    ZHAO Ping

    2014-01-01

    The growing need for effective biometric identification is widely acknowledged. Human face recognition is an important area in the field of biometrics. It has been an active area of research for several decades, but still remains a challenging problem because of the complexity of the human face. The Principal Component Analysis (PCA), or the eigenface method, is a de facto standard in human face recognition. In this paper, the principle of PCA is introduced, and the compression and rebuilding of the image is accomplished with a MATLAB program.

  18. Eigenvalue analysis using a full-core Monte Carlo method

    Energy Technology Data Exchange (ETDEWEB)

    Okafor, K.C.; Zino, J.F. (Westinghouse Savannah River Co., Aiken, SC (United States))

    1992-01-01

    The reactor physics codes used at the Savannah River Site (SRS) to predict reactor behavior have been continually benchmarked against experimental and operational data. A particular benchmark variable is the observed initial critical control rod position. Historically, there has been some difficulty predicting this position because of the difficulties inherent in using computer codes to model experimental or operational data. The Monte Carlo method is applied in this paper to study the initial critical control rod positions for the SRS K Reactor. A three-dimensional, full-core MCNP model of the reactor was developed for this analysis.

  19. Primal and Dual Integrated Force Methods Used for Stochastic Analysis

    Science.gov (United States)

    Patnaik, Surya N.

    2005-01-01

    At the NASA Glenn Research Center, the primal and dual integrated force methods are being extended for the stochastic analysis of structures. The stochastic simulation can be used to quantify the consequence of scatter in stress and displacement response because of a specified variation in input parameters such as load (mechanical, thermal, and support settling loads), material properties (strength, modulus, density, etc.), and sizing design variables (depth, thickness, etc.). All the parameters are modeled as random variables with given probability distributions, means, and covariances. The stochastic response is formulated through a quadratic perturbation theory, and it is verified through a Monte Carlo simulation.
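
    The first-order (quadratic perturbation theory) variance propagation and its Monte Carlo check can be sketched as follows, for a generic response with a known gradient and input covariance (an illustrative sketch, not the NASA formulation itself):

```python
import random

def perturb_variance(grad, cov):
    """First-order variance of a response y = f(x): Var[y] ~ g^T C g,
    where g is the gradient of f at the mean inputs and C the covariance
    matrix of the random input parameters."""
    n = len(grad)
    return sum(grad[i] * cov[i][j] * grad[j]
               for i in range(n) for j in range(n))

# Verify against Monte Carlo for a linear response y = 2*x0 + 3*x1
# with independent zero-mean inputs of variance 0.04 and 0.09.
grad = [2.0, 3.0]
cov = [[0.04, 0.0], [0.0, 0.09]]
analytic = perturb_variance(grad, cov)        # 0.16 + 0.81 = 0.97

random.seed(0)
samples = [2.0 * random.gauss(0, 0.2) + 3.0 * random.gauss(0, 0.3)
           for _ in range(200000)]
mc = sum(s * s for s in samples) / len(samples)
```

    For a linear response the first-order result is exact, so the Monte Carlo estimate should agree to within sampling noise; for nonlinear responses the perturbation result is only a local approximation.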

  20. Numerical analysis of sound transmission loss using FDTD method

    OpenAIRE

    Murakami, Keiichi; Aoyama, Takashi; 村上, 桂一; 青山, 剛史

    2009-01-01

    This paper provides the results of a numerical analysis of the sound transmission loss of a thin aluminum plate. The finite difference time domain (FDTD) method is used in this study because it simultaneously solves both sound wave propagation in fluid and elastic wave propagation in solid. The calculated value of sound transmission loss is in good agreement with that of the mass law. Sound transmission of a saw-shaped wave approximated by the sum of sine waves is also calculated. As a result, it is co...
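
    For comparison, the normal-incidence mass law that such a result is checked against can be written TL = 20 log10(π f m / ρ0c), with m the surface density of the panel. A small sketch (the default ρ0c ≈ 415 rayl for air at room conditions is an assumed value):

```python
import math

def mass_law_tl(surface_density_kg_m2, freq_hz, rho_c=415.0):
    """Normal-incidence mass-law transmission loss of a thin panel, in dB,
    valid where pi*f*m/(rho0*c) >> 1 (well above coincidence effects
    are ignored in this simple form)."""
    return 20.0 * math.log10(
        math.pi * freq_hz * surface_density_kg_m2 / rho_c)
```

    The familiar rule of thumb follows directly: doubling either the frequency or the surface density raises TL by about 6 dB.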

  1. CAD-Oriented Noise Analysis Method of Nonlinear Microwave Circuits

    Institute of Scientific and Technical Information of China (English)

    WANG Jun; TANG Gaodi; CHEN Huilian

    2003-01-01

    A general method is introduced which is capable of making accurate, quantitative predictions about the noise of different types of nonlinear microwave circuits. This new approach also elucidates several design criteria for making it suitable for CAD-oriented analysis, via identifying the mechanisms by which intrinsic device noise and external noise sources contribute to the total equivalent noise. In particular, it explains the details of how the noise spectrum at the port of interest is obtained. The theory also naturally leads to additional important design insights. In the illustrative experiments, excellent agreement among theory, simulations, and measurements is observed.

  2. Roof collapse of shallow tunnels with limit analysis method

    Institute of Scientific and Technical Information of China (English)

    YANG Xiao-li; LONG Ze-xiang

    2015-01-01

    A new failure mechanism is proposed to analyze roof collapse based on a nonlinear failure criterion. A limit analysis approach and the variational principle are used to obtain analytical findings concerning the stability of the potential roof. Then, a parametric study is carried out to derive how the corresponding parameters influence the collapsing shape, which is of paramount engineering significance for guiding tunnel excavations. In comparison with existing results, the findings show the agreement and validity of the proposed method. The actual collapse in certain shallow tunnels is well in accordance with the proposed failure mechanism.

  3. Modelling application for cognitive reliability and error analysis method

    Directory of Open Access Journals (Sweden)

    Fabio De Felice

    2013-10-01

    Full Text Available The automation of production systems has delegated to machines the execution of highly repetitive and standardized tasks. In the last decade, however, the failure of the fully automatic factory model has led to partially automated configurations of production systems. In this scenario, the centrality and responsibility of the role entrusted to human operators are heightened, because it requires problem-solving and decision-making ability. Thus, the human operator is the core of a cognitive process that leads to decisions influencing the safety of the whole system as a function of his or her reliability. The aim of this paper is to propose a modelling application for the cognitive reliability and error analysis method.

  4. Analysis of electroperforated materials using the quadrat counts method

    Energy Technology Data Exchange (ETDEWEB)

    Miranda, E; Garzon, C; Garcia-Garcia, J [Departament d' Enginyeria Electronica, Universitat Autonoma de Barcelona, 08193 Bellaterra, Barcelona (Spain); MartInez-Cisneros, C; Alonso, J, E-mail: enrique.miranda@uab.cat [Departament de Quimica AnalItica, Universitat Autonoma de Barcelona, 08193 Bellaterra, Barcelona (Spain)

    2011-06-23

    The electroperforation distribution in thin porous materials is investigated using the quadrat counts method (QCM), a classical statistical technique aimed at evaluating the deviation from complete spatial randomness (CSR). Perforations are created by means of electrical discharges generated by needle-like tungsten electrodes. The objective of perforating a thin porous material is to enhance its air permeability, a critical issue in many industrial applications involving paper, plastics, textiles, etc. Using image analysis techniques and specialized statistical software, it is shown that the perforation locations follow, beyond a certain length scale, a homogeneous 2D Poisson distribution.
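
    The QCM statistic itself is simple to sketch: bin the perforation coordinates into quadrats and compute the index of dispersion, which under CSR approximately follows a chi-square distribution (a generic illustration of the technique, not the authors' software):

```python
def quadrat_dispersion(points, width, height, nx, ny):
    """Quadrat-counts test statistic for complete spatial randomness:
    divide a width x height region into nx*ny quadrats, count the points
    in each, and return chi2 = sum (n_i - nbar)^2 / nbar, which under CSR
    is approximately chi-square with nx*ny - 1 degrees of freedom."""
    counts = [0] * (nx * ny)
    for x, y in points:
        ix = min(int(x / width * nx), nx - 1)
        iy = min(int(y / height * ny), ny - 1)
        counts[iy * nx + ix] += 1
    nbar = len(points) / (nx * ny)
    return sum((c - nbar) ** 2 / nbar for c in counts)
```

    Values far above the chi-square expectation indicate clustering; values far below indicate regularity.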

  5. Generalized Method of Variational Analysis for 3-D Flow

    Institute of Scientific and Technical Information of China (English)

    兰伟仁; 黄思训; 项杰

    2004-01-01

    The generalized method of variational analysis (GMVA) suggested for 2-D wind observations by Huang et al. is extended to 3-D cases. Just as in 2-D cases, the regularization idea is applied. But due to the complexity of the 3-D cases, the vertical vorticity is taken as a stable functional. The results indicate that wind observations can be both variationally optimized and filtered. The efficiency of GMVA is also checked in a numerical test. Finally, 3-D wind observations with random disturbances are manipulated by GMVA after being filtered.

  6. Methods of analysis applied on the e-shop Arsta

    OpenAIRE

    Flégl, Jan

    2013-01-01

    This bachelor thesis summarizes methods for analysing an e-shop. The first chapter summarizes and describes the basics of e-commerce and e-shops in general. The second chapter deals with search engines, how they work, and in what ways it is possible to influence the order of search results; special attention is paid to search engine optimization and search engine marketing. The third chapter summarizes the basic tools of Google Analytics. The fourth chapter uses the findings of all the previous cha...

  7. Infrared medical image visualization and anomalies analysis method

    Science.gov (United States)

    Gong, Jing; Chen, Zhong; Fan, Jing; Yan, Liang

    2015-12-01

    Infrared medical examination finds diseases by scanning the overall human body temperature with infrared thermal equipment and obtaining the temperature anomalies of the corresponding body parts. In order to obtain the temperature anomalies and diseased parts, an infrared medical image visualization and anomaly analysis method is proposed in this paper. Firstly, the original data are visualized as a single-channel gray image; secondly, the normalized gray image is turned into a pseudo-color image; thirdly, background segmentation is applied to filter out background noise; fourthly, the remaining anomalous pixels are clustered with the breadth-first search algorithm; lastly, the regions of temperature anomalies or diseased parts are marked. Tests show that this is an efficient and accurate way to intuitively analyze and diagnose diseased body parts through temperature anomalies.
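
    The clustering step described above, grouping above-threshold pixels into connected regions by breadth-first search, can be sketched as follows; the toy temperature map and threshold are invented for illustration:

```python
from collections import deque

def cluster_anomalies(img, threshold):
    """Group above-threshold pixels into connected regions using
    breadth-first search with 4-connectivity."""
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    regions = []
    for r in range(h):
        for c in range(w):
            if img[r][c] >= threshold and not seen[r][c]:
                queue = deque([(r, c)])
                seen[r][c] = True
                region = []
                while queue:
                    y, x = queue.popleft()
                    region.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and img[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                regions.append(region)
    return regions

# Toy 5x5 "temperature" map (degrees C) with two separate hot spots.
temps = [
    [36, 36, 36, 36, 36],
    [36, 39, 39, 36, 36],
    [36, 39, 36, 36, 36],
    [36, 36, 36, 38, 38],
    [36, 36, 36, 38, 36],
]
regions = cluster_anomalies(temps, 38)
print(len(regions))                      # 2 hot regions
print(sorted(len(r) for r in regions))   # [3, 3]
```

    Each returned region is a list of pixel coordinates, from which a bounding box can be drawn to mark the suspected diseased part.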

  8. Method for fractional solid-waste sampling and chemical analysis

    DEFF Research Database (Denmark)

    Riber, Christian; Rodushkin, I.; Spliid, Henrik

    2007-01-01

    to repeated particle-size reduction, mixing, and mass reduction until a sufficiently small but representative sample was obtained for digestion prior to chemical analysis. The waste-fraction samples were digested according to their properties for maximum recognition of all the studied substances. By combining...... four subsampling methods and five digestion methods, paying attention to the heterogeneity and the material characteristics of the waste fractions, it was possible to determine 61 substances with low detection limits, reasonable variance, and high accuracy. For most of the substances of environmental...... concern, the waste-sample concentrations were above the detection limit (e.g. Cd > 0.001 mg kg-1, Cr > 0.01 mg kg-1, Hg > 0.002 mg kg-1, Pb > 0.005 mg kg-1). The variance was in the range of 5-100%, depending on material fraction and substance as documented by repeated sampling of two highly......

  9. Comparative analysis of minor histocompatibility antigens genotyping methods

    Directory of Open Access Journals (Sweden)

    A. S. Vdovin

    2016-01-01

    Full Text Available A wide range of techniques can be employed to find mismatches in minor histocompatibility antigens between transplant recipients and their donors. In the current study we compared three genotyping methods based on the polymerase chain reaction (PCR) for four minor antigens. The three tested methods (allele-specific PCR, restriction fragment length polymorphism, and real-time PCR with TaqMan probes) demonstrated 100% reliability when compared to Sanger sequencing for all of the studied polymorphisms. High-resolution melting analysis was unsuitable for genotyping one of the tested minor antigens (HA-1), as it has a linked synonymous polymorphism. The obtained data can be used to select the strategy for large-scale clinical genotyping.

  10. Using non-parametric methods in econometric production analysis

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    Econometric estimation of production functions is one of the most common methods in applied economic production analysis. These studies usually apply parametric estimation techniques, which obligate the researcher to specify the functional form of the production function. Most often, the Cobb...... results—including measures that are of interest of applied economists, such as elasticities. Therefore, we propose to use nonparametric econometric methods. First, they can be applied to verify the functional form used in parametric estimations of production functions. Second, they can be directly used......-Douglas function nor the Translog function are consistent with the “true” relationship between the inputs and the output in our data set. We solve this problem by using non-parametric regression. This approach delivers reasonable results, which are on average not too different from the results of the parametric...
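
    As a rough illustration of the nonparametric approach, a Nadaraya-Watson kernel regression estimates the input-output relationship as a locally weighted average, with no Cobb-Douglas or Translog form imposed. This is a generic sketch on synthetic data, not the authors' estimator:

```python
import math

def kernel_regression(xs, ys, x0, bandwidth):
    """Nadaraya-Watson estimator with a Gaussian kernel: a locally
    weighted average that assumes no functional form for f(x)."""
    weights = [math.exp(-0.5 * ((x - x0) / bandwidth) ** 2) for x in xs]
    return sum(w * y for w, y in zip(weights, ys)) / sum(weights)

# Synthetic production data: output rises with input but flattens,
# a shape the estimator recovers without being told.
xs = [i / 10 for i in range(1, 51)]
ys = [math.log(1 + x) for x in xs]

fitted = kernel_regression(xs, ys, 2.0, 0.3)
print(round(fitted, 2))   # close to log(3), about 1.10
```

    Comparing such nonparametric fits with a parametric estimate at many evaluation points is one way to check whether a chosen functional form is adequate.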

  12. Coarse Analysis of Microscopic Models using Equation-Free Methods

    DEFF Research Database (Denmark)

    Marschler, Christian

    -dimensional models. The goal of this thesis is to investigate such high-dimensional multiscale models and extract relevant low-dimensional information from them. Recently developed mathematical tools allow one to reach this goal: a combination of so-called equation-free methods with numerical bifurcation analysis.... Applications include the learning behavior in the barn owl's auditory system, traffic jam formation in an optimal velocity model for circular car traffic, and the oscillating behavior of pedestrian groups in a counter-flow through a corridor with a narrow door. The methods do not only quantify interesting properties...... factor for the complexity of models, e.g., in real-time applications. With the increasing amount of data generated by computer simulations, a challenge is to extract valuable information from the models in order to help scientists and managers in a decision-making process. Although the dynamics......

  13. Latent semantic analysis: a new method to measure prose recall.

    Science.gov (United States)

    Dunn, John C; Almeida, Osvaldo P; Barclay, Lee; Waterreus, Anna; Flicker, Leon

    2002-02-01

    The aim of this study was to compare traditional methods of scoring the Logical Memory test of the Wechsler Memory Scale-III with a new method based on Latent Semantic Analysis (LSA). LSA represents texts as vectors in a high-dimensional semantic space and the similarity of any two texts is measured by the cosine of the angle between their respective vectors. The Logical Memory test was administered to a sample of 72 elderly individuals, 14 of whom were classified as cognitively impaired by the Mini-Mental State Examination (MMSE). The results showed that LSA was at least as valid and sensitive as traditional measures. Partial correlations between prose recall measures and measures of cognitive function indicated that LSA explained all the relationship between Logical Memory and general cognitive function. This suggests that LSA may serve as an improved measure of prose recall.
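
    The core of an LSA-style similarity score is the cosine of the angle between two text vectors. The sketch below uses raw term counts rather than a full SVD-reduced semantic space, and the story and recall strings are invented for illustration:

```python
import math
from collections import Counter

def cosine_similarity(text_a, text_b):
    """Cosine of the angle between two term-count vectors. Full LSA would
    first project the vectors into a low-dimensional semantic space via
    singular value decomposition; raw counts keep the sketch short."""
    va = Counter(text_a.lower().split())
    vb = Counter(text_b.lower().split())
    dot = sum(va[t] * vb[t] for t in va)
    norm = (math.sqrt(sum(c * c for c in va.values()))
            * math.sqrt(sum(c * c for c in vb.values())))
    return dot / norm if norm else 0.0

story = "anna thompson of south boston was robbed of fifty six dollars"
recall = "anna thompson from boston was robbed of fifty dollars"
print(round(cosine_similarity(story, recall), 2))
print(round(cosine_similarity(story, story), 6))   # 1.0 for identical texts
```

    Because the score is continuous, a partially correct paraphrase earns partial credit, which is exactly the property that lets LSA replace itemized gist scoring.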

  14. Pressure transient analysis methods for bounded naturally fractured reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Chen, C-C; Raghavan, R.; Reynolds, A.C.; Serra, K.

    1985-06-01

    New methods for analyzing drawdown and buildup pressure data obtained at a well located in an infinite, naturally fractured reservoir were presented recently. In this work, the analysis of both drawdown and buildup data in a bounded, naturally fractured reservoir is considered. For the bounded case, the authors show that five possible flow regimes may be exhibited by drawdown data. They delineate the conditions under which each of these five flow regimes exists and the information that can be obtained from each possible combination of flow regimes. Conditions under which semilog methods can be used to analyze buildup data are discussed for the bounded fractured reservoir case. New Matthews-Brons-Hazebroek (MBH) functions for computing the average reservoir pressure from buildup data are presented.
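
    The semilog analysis mentioned above is conventionally performed on a Horner plot: during radial flow, shut-in pressure is linear in log10((tp + dt)/dt), and the slope yields the permeability-thickness product. This is a generic field-units sketch of the classic Horner method on synthetic data (not the paper's new MBH functions); the rate, viscosity, and volume factor are hypothetical:

```python
import math

def horner_slope(tp, dts, pws):
    """Least-squares slope of shut-in pressure versus log10 of the Horner
    time ratio (tp + dt)/dt; radial-flow buildup data plot as a line."""
    xs = [math.log10((tp + dt) / dt) for dt in dts]
    n = len(xs)
    mx, my = sum(xs) / n, sum(pws) / n
    return (sum((x - mx) * (p - my) for x, p in zip(xs, pws))
            / sum((x - mx) ** 2 for x in xs))

# Synthetic buildup: pws = 3000 - 80*log10((tp + dt)/dt) psi, tp = 100 h.
tp = 100.0
dts = [1, 2, 5, 10, 20, 50]
pws = [3000 - 80 * math.log10((tp + dt) / dt) for dt in dts]

m = horner_slope(tp, dts, pws)     # recovers the -80 psi/cycle slope
# Field-units permeability-thickness estimate from |m| (hypothetical rate
# q in STB/d, viscosity mu in cP, formation volume factor B in RB/STB):
q, mu, B = 500.0, 1.2, 1.1
kh = 162.6 * q * B * mu / abs(m)   # md-ft
print(round(m, 1), round(kh))
```

    In a bounded reservoir the late-time points bend away from this straight line, which is where MBH-type corrections for average reservoir pressure come in.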

  15. Method for measuring anterior chamber volume by image analysis

    Science.gov (United States)

    Zhai, Gaoshou; Zhang, Junhong; Wang, Ruichang; Wang, Bingsong; Wang, Ningli

    2007-12-01

    Anterior chamber volume (ACV) is very important for an oculist making a rational pathological diagnosis for patients with optic diseases such as glaucoma, yet it is always difficult to measure accurately. In this paper, a method is devised to measure anterior chamber volumes based on JPEG-formatted image files transformed from medical images acquired with the anterior-chamber optical coherence tomographer (AC-OCT) and corresponding image-processing software. The algorithms for image analysis and ACV calculation are implemented in VC++, a series of anterior chamber images of typical patients is analyzed, and the calculated anterior chamber volumes are verified to be in accord with clinical observation. The results show that the measurement method is effective and feasible and has the potential to improve the accuracy of ACV calculation. Meanwhile, some measures should be taken to simplify the manual preprocessing of the images.
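
    One common way to turn per-image measurements into a volume, similar in spirit to what such an ACV calculation requires, is trapezoidal integration of the segmented cross-sectional areas over the slice spacing; the areas below are invented for illustration:

```python
def volume_from_slices(areas_mm2, slice_spacing_mm):
    """Approximate a chamber volume by trapezoidal integration of the
    cross-sectional areas segmented from successive image slices."""
    v = 0.0
    for a0, a1 in zip(areas_mm2, areas_mm2[1:]):
        v += 0.5 * (a0 + a1) * slice_spacing_mm
    return v

# Hypothetical segmented areas (mm^2) from evenly spaced AC-OCT slices.
areas = [0.0, 4.2, 7.9, 9.6, 7.8, 4.1, 0.0]
print(round(volume_from_slices(areas, 0.5), 2))   # volume in mm^3
```

    The area per slice would itself come from counting segmented chamber pixels and multiplying by the calibrated pixel size.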

  16. NUMERICAL METHOD AND RANDOM ANALYSIS OF CEMENT CONCRETE EXPANSION

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    The numerical method and random analysis of cement concrete expansion are given. A mathematical procedure is presented which includes the nonlinear characteristics of the concrete. An expression is presented to predict the linear restrained expansion of expansive concrete bar restrained by a steel rod. The results indicate a rapid change in strains and stresses within initial days, after which the change gradually decreases. A reliable and accurate method of predicting the behavior of the concrete bulkheads in drifts is presented here. Extensive sensitivity and parametric studies have been performed. The random density distributions of expansive concrete are given based on the restricted or unrestricted condition. These studies show that the bulkhead stress fields are largely influenced by the early modulus of the concrete and the randomness of the ultimate unrestrained expansion of the concrete.

  17. Computational mathematics models, methods, and analysis with Matlab and MPI

    CERN Document Server

    White, Robert E

    2004-01-01

    Computational Mathematics: Models, Methods, and Analysis with MATLAB and MPI explores and illustrates this process. Each section of the first six chapters is motivated by a specific application. The author applies a model, selects a numerical method, implements computer simulations, and assesses the ensuing results. These chapters include an abundance of MATLAB code. By studying the code instead of using it as a "black box, " you take the first step toward more sophisticated numerical modeling. The last four chapters focus on multiprocessing algorithms implemented using message passing interface (MPI). These chapters include Fortran 9x codes that illustrate the basic MPI subroutines and revisit the applications of the previous chapters from a parallel implementation perspective. All of the codes are available for download from www4.ncsu.edu./~white.This book is not just about math, not just about computing, and not just about applications, but about all three--in other words, computational science. Whether us...

  18. Analysis of equivalent antenna based on FDTD method

    Institute of Scientific and Technical Information of China (English)

    Yun-xing YANG; Hui-chang ZHAO; Cui DI

    2014-01-01

    An equivalent microstrip antenna used in a radio proximity fuse is presented. The design of this antenna is based on a multilayer, multi-permittivity dielectric substrate, which is analyzed by the finite difference time domain (FDTD) method. The equivalent iterative formula is modified for the cylindrical coordinate system. A mixed substrate containing two kinds of media (one of them is air) takes the place of the original single substrate. The simulation results show that the resonant frequency of the equivalent antenna is similar to that of the original antenna, and the validity of the analysis is confirmed by means of the antenna resonant frequency formula. The two antennas have the same radiation pattern and similar gain. This method can be used to reduce the weight of the antenna, which is significant for the design of missile-borne antennas.
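
    For readers unfamiliar with FDTD, the method marches Maxwell's curl equations on a staggered (Yee) grid, leapfrogging the electric and magnetic fields in time. A minimal free-space 1-D sketch, not the paper's cylindrical-coordinate formulation, looks like this:

```python
import math

def fdtd_1d(steps, n_cells=200, src_cell=100):
    """Minimal 1-D FDTD update in free space with a soft sinusoidal
    source; ez and hy are staggered in space and leapfrogged in time
    (Courant number 0.5 for stability)."""
    ez = [0.0] * n_cells
    hy = [0.0] * n_cells
    for t in range(steps):
        for k in range(1, n_cells):            # update E from the curl of H
            ez[k] += 0.5 * (hy[k - 1] - hy[k])
        ez[src_cell] += math.sin(2 * math.pi * t / 30.0)   # soft source
        for k in range(n_cells - 1):           # update H from the curl of E
            hy[k] += 0.5 * (ez[k] - ez[k + 1])
    return ez

ez = fdtd_1d(80)
peak = max(abs(v) for v in ez)
print(peak > 0.1)   # the wave has propagated away from the source
```

    Extending this to dielectric substrates amounts to dividing the E-field update by the local relative permittivity, which is exactly where a multi-permittivity equivalent substrate enters the iteration.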

  19. Finite Element Method for Analysis of Material Properties

    DEFF Research Database (Denmark)

    Rauhe, Jens Christian

    description of the material microstructure the finite element models must contain a large number of elements and this problem is solved by using the preconditioned conjugated gradient solver with an Element-By-Element preconditioner. Finite element analysis provides the volume averaged stresses and strains...... and the finite element method. The material microstructure of the heterogeneous material is non-destructively determined using X-ray microtomography. A software program has been generated which uses the X-ray tomographic data as an input for the mesh generation of the material microstructure. To obtain a proper...... which are used for the determination of the effective properties of the heterogeneous material. Generally, the properties determined using the finite element method coupled with X-ray microtomography are in good agreement with both experimentally determined properties and properties determined using...

  20. Spectral analysis methods for vehicle interior vibro-acoustics identification

    Science.gov (United States)

    Hosseini Fouladi, Mohammad; Nor, Mohd. Jailani Mohd.; Ariffin, Ahmad Kamal

    2009-02-01

    Noise has various effects on human comfort, performance, and health. Sounds are analysed by the human brain based on their frequencies and amplitudes. In a dynamic system, the transmission of sound and vibration depends on the frequency and direction of the input motion and on the characteristics of the output. Automotive manufacturers therefore invest a lot of effort and money to improve and enhance the vibro-acoustic performance of their products. The enhancement effort may be very difficult and time-consuming if one relies only on a 'trial and error' method without prior knowledge about the sources themselves. Complex noise inside a vehicle cabin originates from various sources and travels through many pathways. The first stage of sound-quality refinement is to find the source, so it is vital for automotive engineers to identify the dominant noise sources, such as engine noise, exhaust noise, and noise due to vibration transmission into the vehicle. The purpose of this paper is to find the vibro-acoustic sources of noise in a passenger vehicle compartment. The implementation of the spectral analysis method is much faster than 'trial and error' methods, in which parts must be separated to measure the transfer functions. Moreover, with spectral analysis, signals can be recorded in real operational conditions, which leads to more consistent results. A multi-channel analyser is utilised to measure and record the vibro-acoustic signals, and computational algorithms are employed to identify the contribution of various sources to the measured interior signal. These results can be used to detect, control, and optimise the interior noise performance of road transport vehicles.
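
    The first step of such a spectral analysis, locating the dominant frequency lines in a recorded signal, can be sketched with a direct DFT power spectrum; the sampling rate and the "engine" and secondary source frequencies below are invented:

```python
import cmath
import math

def power_spectrum(signal):
    """Single-sided power spectrum from a direct DFT; the dominant bins
    point to the dominant vibro-acoustic sources (engine orders, exhaust,
    transmission paths, etc.)."""
    n = len(signal)
    spec = []
    for k in range(n // 2):
        s = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
        spec.append(abs(s) ** 2 / n)
    return spec

# Synthetic cabin signal: strong 50 Hz "engine" line plus a weaker 120 Hz
# line, sampled at 500 Hz for 500 samples (1 Hz bin spacing, no leakage).
fs, n = 500, 500
sig = [2.0 * math.sin(2 * math.pi * 50 * t / fs)
       + 0.5 * math.sin(2 * math.pi * 120 * t / fs) for t in range(n)]
spec = power_spectrum(sig)
dominant_hz = max(range(len(spec)), key=spec.__getitem__) * fs / n
print(dominant_hz)   # 50.0
```

    In practice an FFT with windowing and averaging (Welch's method) replaces this O(n^2) DFT, and cross-spectra between source and interior microphones quantify each source's contribution.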