WorldWideScience

Sample records for biosystem analyzing technology

  1. Achievement report for fiscal 1997 on the development of technologies for utilizing biological resources such as complex biosystems. Development of complex biosystem analyzing technology; 1997 nendo fukugo seibutsukei nado seibutsu shigen riyo gijutsu kaihatsu seika hokokusho. Fukugo seibutsukei kaiseki gijutsu no kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-03-01

    The aim is to utilize the sophisticated functions of complex biosystems. In the research and development of technologies for effectively utilizing unexploited resources and substances such as seaweeds and algae, seaweeds are added to seawater and left for two weeks to yield a microbial suspension; the suspension is then spread on a carrageenan culture medium, and carrageenan-decomposing microbes are obtained. In the research and development of technologies for utilizing microbe/fauna-flora complex systems, technologies for exploring and analyzing microbes are studied. For this purpose, 48 kinds of sponges and 300 kinds of bacteria symbiotic with the sponges are sampled in Malaysia; 15 of them exhibit enzyme inhibition and Artemia salina lethality activities. In the development of technologies for analyzing the functions of microbes engaged in the production of resources and substances useful to animals and plants, 150 kinds of micro-algae are subjected to screening using protease- and chitinase-inhibiting activities as the indexes, and an extract of Isochrysis galbana is found to display an intense inhibitory activity. The alga is cultured in quantity, the active component is isolated from 20 g of dried alga, and its constitution is determined. (NEDO)

  2. An updated validation of Promega's PowerPlex 16 System: high throughput databasing under reduced PCR volume conditions on Applied Biosystems' 96 capillary 3730xl DNA Analyzer.

    Science.gov (United States)

    Spathis, Rita; Lum, J Koji

    2008-11-01

    The PowerPlex 16 System from Promega Corporation allows single-tube multiplex amplification of sixteen short tandem repeat (STR) loci, including all 13 core Combined DNA Index System STRs. This report presents an updated validation of the PowerPlex 16 System on Applied Biosystems' 96 capillary 3730xl DNA Analyzer. The validation protocol developed in our laboratory allows for the analysis of 1536 loci (96 x 16) in c. 50 min. We have further optimized the assay by decreasing the reaction volume to one-quarter of that recommended by the manufacturer, thereby substantially reducing the total cost per sample without compromising reproducibility or specificity. This reduction in reaction volume has the ancillary benefit of dramatically increasing the sensitivity of the assay, allowing for accurate analysis of lower quantities of DNA. Given its substantially increased throughput, this extended validation of the PowerPlex 16 System should be useful in reducing the backlog of unanalyzed DNA samples currently facing public DNA forensic laboratories.

  3. Application of structural health monitoring technologies to bio-systems: current status and path forward

    Science.gov (United States)

    Bhalla, Suresh; Srivastava, Shashank; Suresh, Rupali; Moharana, Sumedha; Kaur, Naveet; Gupta, Ashok

    2015-03-01

    This paper presents a case for extending structural health monitoring (SHM) technologies to offer solutions for biomedical problems. SHM research has made remarkable progress during the last two to three decades, and these technologies are now being extended for possible applications in the biomedical field. In particular, smart materials such as piezoelectric ceramic (PZT) patches and fibre Bragg grating (FBG) sensors offer the biomedical community a new set of possibilities to augment its conventional set of sensors, tools and equipment. The paper presents some of the recent extensions of SHM, such as condition monitoring of bones, monitoring of dental implants post surgery, and foot pressure measurement. The latest developments, such as non-bonded configurations of PZT patches for monitoring bones and possible applications in osteoporosis detection, are also discussed. In essence, there is a whole new gamut of possibilities for SHM technologies making their foray into the biomedical sector.

  4. BioSystems

    Data.gov (United States)

    U.S. Department of Health & Human Services — The NCBI BioSystems Database provides integrated access to biological systems and their component genes, proteins, and small molecules, as well as literature...

  5. Industrial Biosystems Engineering and Biorefinery Systems

    Institute of Scientific and Technical Information of China (English)

    Shulin Chen

    2008-01-01

    The concept of Industrial Biosystems Engineering (IBsE) was suggested as a new engineering branch to be developed for meeting the needs of the upcoming bioeconomy for science, technology and professionals. With its emphasis on systems, IBsE builds upon the interfaces between systems biology, bioprocessing, and systems engineering. This paper discussed the background, the suggested definition, the theoretical framework and methodologies of this new discipline, as well as its challenges and future development.

  6. Industrial biosystems engineering and biorefinery systems.

    Science.gov (United States)

    Chen, Shulin

    2008-06-01

    The concept of Industrial Biosystems Engineering (IBsE) was suggested as a new engineering branch to be developed for meeting the needs of the upcoming bioeconomy for science, technology and professionals. With its emphasis on systems, IBsE builds upon the interfaces between systems biology, bioprocessing, and systems engineering. This paper discussed the background, the suggested definition, the theoretical framework and methodologies of this new discipline, as well as its challenges and future development.

  7. The NCBI BioSystems database.

    Science.gov (United States)

    Geer, Lewis Y; Marchler-Bauer, Aron; Geer, Renata C; Han, Lianyi; He, Jane; He, Siqian; Liu, Chunlei; Shi, Wenyao; Bryant, Stephen H

    2010-01-01

    The NCBI BioSystems database, found at http://www.ncbi.nlm.nih.gov/biosystems/, centralizes and cross-links existing biological systems databases, increasing their utility and target audience by integrating their pathways and systems into NCBI resources. This integration allows users of NCBI's Entrez databases to quickly categorize proteins, genes and small molecules by metabolic pathway, disease state or other BioSystem type, without requiring time-consuming inference of biological relationships from the literature or multiple experimental datasets.

  8. Technology for collecting and analyzing relational data

    Directory of Open Access Journals (Sweden)

    E. N. Fedorova

    2016-01-01

    To summarize the information, there is a mechanism of data grouping which provides, for different groups of records, the number of entries and the maximum, minimum and average values. Results. The technology has been tested in monitoring the requirements for additional professional education services and in defining the educational needs of teachers and executives of educational organizations in the Irkutsk region. The survey involved 2,780 respondents in 36 municipalities. Creating the data model took several hours; the survey was conducted over a month. Conclusion. The proposed technology makes it possible to collect information in relational form in a short time and then analyze it without programming, with flexible assignment of the form's operating logic.
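
    As an illustration of the grouping mechanism described above, here is a minimal Python sketch using pandas; the table layout and column names are hypothetical, not taken from the system described in the record.

```python
import pandas as pd

# Hypothetical survey responses in relational form: one row per
# respondent. Table layout and column names are invented here.
responses = pd.DataFrame({
    "municipality": ["Irkutsk", "Irkutsk", "Angarsk", "Angarsk", "Bratsk"],
    "role": ["teacher", "executive", "teacher", "teacher", "executive"],
    "courses_requested": [3, 1, 2, 5, 4],
})

# The grouping mechanism described above: number of entries plus
# maximum, minimum and average values for each group of records.
summary = responses.groupby("municipality")["courses_requested"].agg(
    entries="count", maximum="max", minimum="min", average="mean"
)
print(summary)
```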

  9. Effect of various normalization methods on Applied Biosystems expression array system data

    Directory of Open Access Journals (Sweden)

    Keys David N

    2006-12-01

    Background: DNA microarray technology provides a powerful tool for characterizing gene expression on a genome scale. While the technology has been widely used in discovery-based medical and basic biological research, its direct application in clinical practice and regulatory decision-making has been questioned. A few key issues, including the reproducibility, reliability, compatibility and standardization of microarray analysis and results, must be critically addressed before any routine usage of microarrays in clinical laboratories and regulated areas can occur. In this study we investigate some of these issues for the Applied Biosystems Human Genome Survey Microarrays. Results: We analyzed the gene expression profiles of two samples, brain and universal human reference (UHR, a mixture of RNAs from 10 cancer cell lines), using the Applied Biosystems Human Genome Survey Microarrays. Five technical replicates at three different sites were performed on the same total RNA samples according to the manufacturer's standard protocols. Five different methods (quantile, median, scale, VSN and cyclic loess) were used to normalize AB microarray data within each site. 1,000 genes spanning a wide dynamic range in gene expression levels were selected for real-time PCR validation. Using the TaqMan® assay data set as the reference set, the performance of the five normalization methods was evaluated with respect to the following criteria: (1) sensitivity and reproducibility in detection of expression; (2) fold-change correlation with real-time PCR data; (3) sensitivity and specificity in detection of differential expression; (4) reproducibility of differentially expressed gene lists. Conclusion: Our results showed a high level of concordance between these normalization methods, regardless of whether signal, detection, variation, fold-change measurements or reproducibility were interrogated. Furthermore, we used TaqMan® assays as a reference to generate…
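
    Of the five normalization methods compared in this record, quantile normalization is the simplest to state precisely: every array is forced onto a common reference distribution. A minimal numpy sketch of that idea (simple variant, ignoring tie handling; the toy matrix is illustrative):

```python
import numpy as np

def quantile_normalize(x: np.ndarray) -> np.ndarray:
    """Quantile-normalize a (genes x arrays) expression matrix:
    every array is mapped onto the same empirical distribution,
    namely the mean of the sorted columns (ties not averaged)."""
    order = np.argsort(x, axis=0)          # per-array ranks
    sorted_cols = np.sort(x, axis=0)
    reference = sorted_cols.mean(axis=1)   # common reference distribution
    out = np.empty_like(x, dtype=float)
    for j in range(x.shape[1]):
        out[order[:, j], j] = reference    # map ranks back to positions
    return out

# Toy 4-gene x 3-array matrix (illustrative values only).
expr = np.array([[5.0, 4.0, 3.0],
                 [2.0, 1.0, 4.0],
                 [3.0, 4.0, 6.0],
                 [4.0, 2.0, 8.0]])
print(quantile_normalize(expr))
```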

  10. The Kernel Estimation in Biosystems Engineering

    Directory of Open Access Journals (Sweden)

    Esperanza Ayuga Téllez

    2008-04-01

    In many fields of biosystems engineering it is common to find works in which the statistical information analysed violates the basic hypotheses necessary for conventional forecasting methods. For those situations it is necessary to find alternative methods that allow statistical analysis despite those infringements. Non-parametric function estimation includes methods that fit a target function locally, using data from a small neighbourhood of the point. Weak assumptions, such as continuity and differentiability of the target function, are used rather than an "a priori" assumption about the global shape of the target function (e.g., linear or quadratic). In this paper a few basic decision rules are enunciated for the application of the non-parametric estimation method. These statistical rules are the first step towards building a user-method interface for the consistent application of kernel estimation by non-expert users. To reach this aim, univariate and multivariate estimation methods and density functions were analysed, as well as regression estimators. In some cases the models to be applied in different situations, based on simulations, were defined. Different biosystems engineering applications of kernel estimation are also analysed in this review.
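
    As a concrete instance of the kernel estimation discussed in this record, here is a minimal univariate Gaussian kernel density estimate in Python; the sample, grid and Silverman bandwidth rule are illustrative choices, not the paper's.

```python
import numpy as np

def gaussian_kde(data: np.ndarray, grid: np.ndarray, h: float) -> np.ndarray:
    """Univariate kernel density estimate with a Gaussian kernel:
    f_hat(x) = 1/(n*h) * sum_i K((x - x_i) / h)."""
    u = (grid[:, None] - data[None, :]) / h
    k = np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)
    return k.sum(axis=1) / (len(data) * h)

# Toy sample; bandwidth from Silverman's rule of thumb.
rng = np.random.default_rng(0)
sample = rng.normal(loc=10.0, scale=2.0, size=200)
h = 1.06 * sample.std(ddof=1) * len(sample) ** (-1 / 5)
xs = np.linspace(sample.min(), sample.max(), 100)
print(gaussian_kde(sample, xs, h).max())
```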

  11. Analyzing the next-generation catalog: a library technology report

    CERN Document Server

    Nagy, Andrew

    2011-01-01

    This issue of "Library Technology Reports" analyzes five different academic libraries to better understand their investments, detailing the outcomes thus far and drawing conclusions about the next-generation catalog.

  12. Analyzing the Limitation of Technology in Teacher Preparation Courses.

    Science.gov (United States)

    Baldin, Yuriko Yamamoto

    2003-01-01

    Discusses whether mathematics teachers are being prepared to realize the limitations of technology in teaching activities and recognize conceptual problems in technology-based activities. Suggests a course to prepare teachers with skills to analyze existing materials as well as create their own activities. Illustrates this with examples from CAS,…

  13. Achievement report for fiscal 1997 on the development of technologies for utilizing biological resources such as complex biosystems. Development of technologies for producing substitute fuel for petroleum by utilizing organisms; 1997 nendo fukugo seibutsukei nado seibutsu shigen riyo gijutsu kaihatsu seika hokokusho. Seibutsu riyo sekiyu daitai nenryo seizo gijutsu no kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-03-01

    Technologies for producing useful substances using the substance-decomposing and substance-producing functions of complex biosystems, and methods for their handling, are developed. In the utilization of microbes in the digestive tracts of termites and longicorns, it is made clear that several kinds of termites cleave the {beta}-O-4 ether linkage. In relation to technologies for constructing wood-decomposing complex microbial systems and developing complex vector systems, a screening system is constructed in which strains that exhibit complex actions are combined. Concerning the advanced utilization of tropical oil plants, conditions are determined for inducing callus from oil palm tissues. From oil palm sarcocarp tissues, mRNA (messenger ribonucleic acid) is isolated for the construction of a cDNA (complementary deoxyribonucleic acid) library. For the purpose of isolating a powerful promoter, a partial base sequence is determined for ubiquitin, which is frequently expressed in cells. A pathogen afflicting the oil palm is sampled for identification and is inferred to be a kind of Ganoderma boninense. (NEDO)

  14. Abstracts of the 17th world congress of the International Commission of Agricultural and Biosystems Engineering (CIGR) : sustainable biosystems through engineering

    Energy Technology Data Exchange (ETDEWEB)

    Savoie, P.; Villeneuve, J.; Morisette, R. [Agriculture and Agri-Food Canada, Quebec City, PQ (Canada). Soils and Crops Research and Development Centre] (eds.)

    2010-07-01

    This international conference provided a forum to discuss methods to produce agricultural products more efficiently through improvements in engineering and technology. It was attended by engineers and scientists working from different perspectives on biosystems. Beyond food, farms and forests can provide fibre, bio-products and renewable energy. Seven sections of CIGR were organized in the following technical sessions: (1) land and water engineering, (2) farm buildings, equipment, structures and environment, (3) equipment engineering for plants, (4) energy in agriculture, (5) management, ergonomics and systems engineering, (6) post harvest technology and process engineering, and (7) information systems. The Canadian Society of Bioengineering (CSBE) merged its technical program within the 7 sections of CIGR. Four other groups also held their activities during the conference. The American Society of Agricultural and Biological Engineers (ASABE) organized its 9th international drainage symposium and the American Ecological Engineering Society (AEES) held its 10th annual meeting. The International Network for Information Technology in Agriculture (INFITA), and the 8th world congress on computers in agriculture also joined CIGR 2010.

  15. Expression of adhesion molecules and collagen on rat chondrocyte seeded into alginate and hyaluronate based 3D biosystems. Influence of mechanical stresses.

    Science.gov (United States)

    Gigant-Huselstein, C; Hubert, P; Dumas, D; Dellacherie, E; Netter, P; Payan, E; Stoltz, J F

    2004-01-01

    Chondrocytes use mechanical signals, via interactions with their environment, to synthesize an extracellular matrix capable of withstanding high loads. Most chondrocyte-matrix interactions are mediated via transmembrane receptors such as integrin or non-integrin receptors (i.e. annexin V and CD44). The aim of this study was to analyze, by flow cytometry, the adhesion molecules (alpha5/beta1 integrins and CD44) on rat chondrocytes seeded into 3D biosystems made of alginate and hyaluronate. These biosystems were submitted to mechanical stress by knocking the biosystems against one another for 48 hours. The expression of type I and type II collagen was also evaluated. The results of the current study showed that mechanical stress induced an increase in type II collagen production and weak variations in alpha5/beta1 receptor expression, whatever the biosystem. Moreover, our results indicated that hyaluronan receptor CD44 expression depends on extracellular matrix modifications; these receptors were thus activated by signals resulting from variations in the cell environment (HA addition and modifications owing to mechanical stress). This suggests that this kind of receptor plays a crucial role in extracellular matrix homeostasis. Finally, on day 24, no dedifferentiation of chondrocytes was noted either in the biosystems or under mechanical stress. For all biosystems, the neosynthesized matrix contained a substantial level of collagen, predominantly type II. In conclusion, it appeared that the cells maintained their phenotype under mechanical stress. In addition, it seems that, on rat chondrocytes, alpha5/beta1 integrins did not act as the main mechanoreceptor (as described for human chondrocytes); by contrast, the hyaluronan receptor CD44 seems to be related to matrix composition.

  16. Financial options methodology for analyzing investments in new technology

    Energy Technology Data Exchange (ETDEWEB)

    Wenning, B.D. [Texas Utilities Services, Inc., Dallas, TX (United States)]

    1994-12-31

    The evaluation of investments in longer-term research and development in emerging technologies must, by the nature of such subjects, address inherent uncertainties. Most notably, future cash flow forecasts include substantial uncertainties. Conventional present-value methodology, when applied to emerging technologies, severely penalizes uncertain cash flow forecasts, and strategic investment opportunities are at risk of being neglected. Options valuation methodology adapted from the financial arena has been introduced as applicable to such technology evaluations. Indeed, the characteristics of superconducting magnetic energy storage technology suggest that it is a candidate for options methodology when investment decisions are being contemplated.
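
    The record does not give formulas, but a common textbook way to apply financial options methodology to a technology investment is to value the opportunity as a European call via Black-Scholes. A hedged sketch in Python; the mapping and all numbers are purely illustrative, not from the report:

```python
from math import exp, log, sqrt
from statistics import NormalDist

def real_option_value(v, k, t, r, sigma):
    """Black-Scholes value of a European call, used here as a textbook
    proxy for the option to invest: v = present value of the project's
    cash flows, k = investment cost, t = years until the decision,
    r = risk-free rate, sigma = volatility of the project value."""
    d1 = (log(v / k) + (r + 0.5 * sigma**2) * t) / (sigma * sqrt(t))
    d2 = d1 - sigma * sqrt(t)
    n = NormalDist().cdf
    return v * n(d1) - k * exp(-r * t) * n(d2)

# A volatile emerging technology: conventional NPV is negative
# (90 - 100 = -10), yet the right, not the obligation, to invest
# later still carries substantial value.
print(real_option_value(v=90.0, k=100.0, t=5.0, r=0.05, sigma=0.4))
```

    Under these illustrative numbers the conventional net present value is negative while the option value is positive, which is exactly the report's point about present-value methods penalizing uncertain cash flows.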

  17. Study on Algae Removal by Immobilized Biosystem on Sponge

    Institute of Scientific and Technical Information of China (English)

    PEI Haiyan; HU Wenrong

    2006-01-01

    In this study, sponges were used to immobilize domesticated sludge microbes in a limited space, forming an immobilized biosystem capable of removing algae and microcystins. The removal of algae, microcystins and UV260 by this biosystem and the mechanism of algae removal were studied. The results showed that activated sludge from sewage treatment plants was able to remove algae from a eutrophic lake's water after 7 d of domestication. The removal efficiency for algae, organic matter and microcystins increased when the domesticated sludge was immobilized on sponges. When the hydraulic retention time (HRT) was 5 h, the removal rates of algae, microcystins and UV260 were 90%, 94.17% and 84%, respectively. The immobilized biosystem consisted mostly of bacteria, Ciliata and Sarcodina protozoans, and Rotifer metazoans. Algal decomposition by zoogloea bacteria and grazing by micro-organisms were the two main modes of algal removal, which occurred in two steps: first, absorption by the zoogloea; second, decomposition by the zoogloea bacteria and predation by the micro-organisms.

  18. Polarized 3He Gas Circulating Technologies for Neutron Analyzers

    Energy Technology Data Exchange (ETDEWEB)

    Watt, David; Hersman, Bill

    2014-12-10

    We describe the development of an integrated system for quasi-continuous operation of a large volume neutron analyzer. The system consists of a non-magnetic diaphragm compressor, a prototype large volume helium polarizer, a surrogate neutron analyzer, a non-depolarizing gas storage reservoir, a non-ferrous valve manifold for handling gas distribution, a custom rubidium-vapor gas return purifier, and wire-wound transfer lines, all of which are immersed in a two-meter external magnetic field. Over the Phase II period we focused on three major tasks required for the successful deployment of these types of systems: 1) design and implementation of gas handling hardware, 2) automation for long-term operation, and 3) improvements in polarizer performance, specifically fabrication of aluminosilicate optical pumping cells. In this report we describe the design, implementation, and testing of the gas handling hardware. We describe improved polarizer performance resulting from improved cell materials and fabrication methods. These improvements yielded valved 8.5 liter cells with relaxation times greater than 12 hours. Pumping this cell with 1500 W of laser power at 1.25 nm linewidth yielded peak polarizations of 60%, measured both inside and outside the polarizer. Fully narrowing this laser to 0.25 nm, demonstrated separately on one stack of the four, would have allowed 70% polarization with this cell. We demonstrated the removal of 5 liters of polarized helium from the polarizer with no measured loss of polarization. We circulated the gas through a titanium-clad compressor with polarization loss below 3% per pass. We also prepared for the next phase of development by refining the design of the polarizer so that it can be engineer-certified for pressurized operation. The performance of our system far exceeds comparable efforts elsewhere.

  19. Nano-Biotechnology: Structure and Dynamics of Nanoscale Biosystems

    CERN Document Server

    Manjasetty, Babu A; Ramaswamy, Y S

    2010-01-01

    Nanoscale biosystems are widely used in numerous medical applications. Approaches to the structure and function of the nanomachines available in the cell (natural nanomachines) are discussed. Molecular simulation studies have been used extensively to study the dynamics of many nanomachines, including the ribosome. Carbon nanotubes (CNTs) serve as prototypes for biological channels such as aquaporins (AQPs). Recently, extensive investigations have been performed on the transport of biological nanosystems through CNTs. The results serve as a guide in building nanomachinery such as a nanosyringe for needle-free drug delivery.

  20. An elongation method for large systems toward bio-systems.

    Science.gov (United States)

    Aoki, Yuriko; Gu, Feng Long

    2012-06-07

    The elongation method, proposed in the early 1990s, originally for theoretical synthesis of aperiodic polymers, has been reviewed. The details of derivation of the localization scheme adopted by the elongation method are described along with the elongation processes. The reliability and efficiency of the elongation method have been proven by applying it to various models of bio-systems, such as gramicidin A, collagen, DNA, etc. By means of orbital shift, the elongation method has been successfully applied to delocalized π-conjugated systems. The so-called orbital shift works in such a way that during the elongation process, some strongly delocalized frozen orbitals are assigned as active orbitals and joined with the interaction of the attacking monomer. By this treatment, it has been demonstrated that the total energies and non-linear optical properties determined by the elongation method are more accurate even for bio-systems and delocalized systems like fused porphyrin wires. The elongation method has been further developed for treating any three-dimensional (3D) systems and its applicability is confirmed by applying it to entangled insulin models whose terminal is capped by both neutral and zwitterionic sequences.

  1. Biosystems analysis and engineering of microbial consortia for industrial biotechnology

    Energy Technology Data Exchange (ETDEWEB)

    Sabra, Wael; Dietz, David; Tjahjasari, Donna; Zeng, An-Ping [Institute of Bioprocess and Biosystems Engineering, Hamburg University of Technology, Hamburg (Germany)]

    2010-10-15

    The development of industrial biotechnology for an economical and ecological conversion of renewable materials into chemicals and fuels requires new strategies and concepts for bioprocessing. Biorefinery has been proposed as one of the key concepts with the aim of completely utilizing the substrate(s) and producing multiple products in one process or at one production site. In this article, we argue that microbial consortia can play an essential role to this end. To illustrate this, we first briefly describe some examples of existing industrial bioprocesses involving microbial consortia. New bioprocesses under development which make use of the advantages of microbial consortia are then introduced. Finally, we address some of the key issues and challenges for the analysis and engineering of bioprocesses involving microbial consortia from a perspective of biosystems engineering. (Copyright 2010 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)

  2. Fiscal 1998 industrial science and technology R and D project. Research report on R and D of genome informatics technology (Development of stable oil supply measures using complex biosystem); 1998 nendo genome informatics gijutsu kenkyu kaihtsu seika hokokusho. Fukugo seibutsukei riyo sekiyu antei kyokyu taisaku kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    This report describes the fiscal 1998 results of the development of genome informatics technology. As a comparative gene-analysis technique, the combination of electrophoresis and PCR was used. To improve the throughput and reproducibility of the technique, module-shuffling primers were used and a multi(96)-arrayed capillary fragment analyzer was devised. A system for rapidly detecting SNPs was also developed successfully. As analysis technology for DNA sequences using triple-stranded DNA formation, studies were made of the construction of long cDNA libraries, the selective subtraction of specific sequences from libraries, and the basic technology of homologous cloning. Each reaction step of the IGCR technique for fast analysis was also studied, along with the specifications of a fluorescence transfer monitor. As a modeling technique for genetic sequence information, simulation models were developed for the gene expression regulatory networks governing muscle differentiation and for the feedback regulation of period genes. Support systems for transcription factor prediction and gene regulatory network inference were developed from existing data. (NEDO)

  3. INNOVATIVE TECHNOLOGY VERIFICATION REPORT "FIELD MEASUREMENT TECHNOLOGIES FOR TOTAL PETROLEUM HYDROCARBONS IN SOIL" HORIBA INSTRUMENTS INCORPORATED OCMA-350 CONTENT ANALYZER

    Science.gov (United States)

    The OCMA-350 Oil Content Analyzer, developed by Horiba Instruments Incorporated (Horiba), was demonstrated under the U.S. Environmental Protection Agency Superfund Innovative Technology Evaluation Program in June 2000 at the Navy Base Ventura County site in Port Huen...

  4. Develop and demonstrate a methodology using Janus(A) to analyze advanced technologies.

    OpenAIRE

    Wright, Jerry Vernon

    1991-01-01

    Approved for public release; distribution is unlimited. This thesis presents a study of a methodology for analyzing advanced technologies using the Janus(A) High Resolution Combat Model. The goal of this research was to verify that the methodology using Janus(A) gave expected or realistic results. The methodology used a case where the results were known: the addition of a long-range direct-fire weapon into a force-on-force battle. Both the weapon characteristics and force mixes were used as...

  5. A Hybrid Method of Analyzing Patents for Sustainable Technology Management in Humanoid Robot Industry

    Directory of Open Access Journals (Sweden)

    Jongchan Kim

    2016-05-01

    A humanoid, a robot that resembles a human body, imitates human intelligence, behavior, senses, and interaction in order to provide various types of services to human beings. Humanoids have been studied and developed constantly in order to improve their performance. Humanoids were previously developed for simple repetitive or hard work that required significant human power. However, intelligent service robots are now being developed actively to provide necessary information and enjoyment; these include robots manufactured for home, entertainment, and personal use. It has become generally known that artificial-intelligence humanoid technology will significantly benefit civilization. On the other hand, successful research and development (R&D) on humanoids is possible only if they are developed in a proper direction in accordance with changes in markets and society. Therefore, it is necessary to analyze changes in technology markets and society in order to develop sustainable Management of Technology (MOT) strategies. In this study, patent data related to humanoids are analyzed by various data mining techniques, including topic modeling, cross-impact analysis, association rule mining, and social network analysis, to suggest sustainable strategies and methodologies for MOT.
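
    One of the named techniques, social network analysis, can be sketched compactly: build a keyword co-occurrence network from patents and rank terms by degree centrality. A minimal Python example with networkx; the keyword sets are invented for illustration and are not from the study's patent data.

```python
import itertools
from collections import Counter

import networkx as nx

# Hypothetical keyword sets extracted from humanoid-robot patents
# (e.g., by topic modeling); all terms are invented for illustration.
patents = [
    {"actuator", "gait control", "balance"},
    {"gait control", "balance", "vision"},
    {"vision", "speech", "interaction"},
    {"balance", "actuator"},
]

# Count keyword co-occurrences, then build an undirected network in
# which an edge links two terms appearing in the same patent.
pairs = Counter(
    pair
    for terms in patents
    for pair in itertools.combinations(sorted(terms), 2)
)
g = nx.Graph()
for (a, b), w in pairs.items():
    g.add_edge(a, b, weight=w)

# Degree centrality highlights themes that connect many others;
# one of the social-network-analysis steps named in the abstract.
for term, c in sorted(nx.degree_centrality(g).items(), key=lambda kv: -kv[1]):
    print(f"{term}: {c:.2f}")
```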

  6. Micro-nano-biosystems: An overview of European research.

    Science.gov (United States)

    Lymberis, Andreas

    2010-06-01

    New developments in science, technologies and applications are blurring the boundaries between information and communications technology (ICT), micro-nano systems and the life sciences, e.g. through miniaturisation, the ability to manipulate matter at the atomic scale, and the ability to interface live and man-made systems. Interdisciplinary research towards integrated systems and their applications, based on the emerging convergence of information and communication technologies with micro-nano and bio technologies, is expected to have a direct influence on healthcare, the ageing population and well-being. Micro-Nano-Bio Systems (MNBS) research and development activities under the European Union's R&D Programmes (Information & Communication Technologies priority) address miniaturised, smart and integrated systems for in-vitro testing, e.g. lab-on-chips, and systems interacting with the human body, e.g. autonomous implants, endoscopic capsules and robotics for minimally invasive surgery. The MNBS group involves hundreds of key public and private international organisations working on system development and validation in diverse applications such as cancer detection and therapy follow-up, minimally invasive surgery, capsule endoscopy, wearable biochemical monitoring and the repair of vital functions with active implant devices. The paper presents the MNBS rationale and activities, discusses key research and innovation challenges and proposes R&D directions to achieve the expected impact on healthcare and quality of life.

  7. The possible origin of the first cell biosystem in the thermal subsurface environment of the earth.

    Science.gov (United States)

    Trevors, Jack T

    2004-01-01

    Bacteria are the simplest living biosystems or organisms that exhibit all the characteristics of life. As such, they are excellent models for examining the cell as the basic unit of life and the cell theory, which states that all organisms are composed of one or more similar cells. In this article I examine the hypothesis that the primordial soup so often referred to in science was possibly an oil/water interface and/or emulsion in the Earth's warm, anaerobic subsurface. This warm subsurface location, protected from surface radiation, could have been a favourable location for the assembly of the first bacterial cells on the Earth capable of growth and controlled division: the first biosystem.

  8. Present and future of the numerical methods in buildings and infrastructures areas of biosystems engineering

    Directory of Open Access Journals (Sweden)

    Francisco Ayuga

    2015-04-01

    Biosystems engineering is a discipline resulting from the evolution of traditional agricultural engineering to include new engineering challenges related to biological systems, from the cell to the environment. Modern buildings and infrastructures are needed to satisfy crop and animal production demands. In this paper a review of the status of numerical methods applied to engineering problems in the field of buildings and infrastructures in biosystems engineering is presented. The history and basic background of the finite element method, the first numerical method implemented and also the most developed one, are presented, along with the history and background of two more recent methods with practical applications: computational fluid dynamics and the discrete element method. A review of the scientific and professional applications in the field of buildings and infrastructures for biosystems engineering is also presented. Today we can simulate engineering problems with solids, with fluids and with particles, and reach practical solutions faster and more cheaply than in the past. The paper encourages young engineers and researchers to advance these tools and their engineering applications; the capacities of all numerical methods in their present state of development go beyond current practical applications, and there is a broad field to work on.

  9. A rapid automatic analyzer and its methodology for effective bentonite content based on image recognition technology

    Directory of Open Access Journals (Sweden)

    Wei Long

    2016-09-01

    Fast and accurate determination of the effective bentonite content in used clay-bonded sand is very important for selecting the correct mixing ratio and mixing process to obtain high-performance molding sand. Currently, the effective bentonite content is determined by testing the methylene blue absorbed in used clay-bonded sand, which is usually a manual operation with several disadvantages, including a complicated process, long testing time and low accuracy. A rapid automatic analyzer of the effective bentonite content in used clay-bonded sand was therefore developed based on image recognition technology. The instrument consists of auto-stirring, auto liquid removal, auto titration, step-rotation and image acquisition components, and a processor. The principle of the image recognition method is first to decompose the color images into three-channel gray images, exploiting the difference in sensitivity of the light blue and dark blue to the red, green and blue channels; then to perform gray-value subtraction and gray-level transformation on the gray images; and finally to extract the outer light-blue halo and the inner blue spot and calculate their area ratio. The titration process is judged to have reached its end-point when the area ratio exceeds a set value.
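
    A minimal numpy sketch of the image-recognition step as described: separate the halo and spot by channel differences and compute their area ratio. The channel rule, thresholds and synthetic test image are assumptions for illustration, not the instrument's actual algorithm.

```python
import numpy as np

def titration_area_ratio(rgb: np.ndarray, threshold: int = 70) -> float:
    """Separate the outer light-blue halo from the inner dark-blue spot
    using the blue-minus-red channel difference, then return the ratio
    halo_area / spot_area. Threshold values are illustrative only."""
    r = rgb[..., 0].astype(int)
    b = rgb[..., 2].astype(int)
    blueness = b - r
    spot = blueness > 2 * threshold        # inner dark-blue spot
    halo = (blueness > threshold) & ~spot  # outer light-blue halo
    return halo.sum() / max(spot.sum(), 1)

# Synthetic 100x100 test image: dark-blue disc inside a light-blue ring.
img = np.zeros((100, 100, 3), dtype=np.uint8)
yy, xx = np.ogrid[:100, :100]
rr = (yy - 50) ** 2 + (xx - 50) ** 2
img[rr < 900] = (60, 60, 160)   # light-blue halo region
img[rr < 400] = (10, 10, 200)   # dark-blue spot region
ratio = titration_area_ratio(img)
print("halo/spot area ratio:", round(ratio, 2))
# The analyzer would declare the titration end-point once this ratio
# rises above a preset value (the "setting value" in the abstract).
```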

  10. Characterization of the dynamic behavior of nonlinear biosystems in the presence of model uncertainty using singular invariance PDEs: application to immobilized enzyme and cell bioreactors.

    Science.gov (United States)

    Kazantzis, Nikolaos; Kazantzi, Vasiliki

    2010-04-01

    A new approach to the problem of characterizing the dynamic behavior of nonlinear biosystems in the presence of model uncertainty using the notion of slow invariant manifold is proposed. The problem of interest is addressed within the context of singular partial differential equations (PDE) theory, and in particular, through a system of singular quasi-linear invariance PDEs for which a general set of conditions for solvability is provided. Within the class of analytic solutions, this set of conditions guarantees the existence and uniqueness of a locally analytic solution which represents the system's slow invariant manifold exponentially attracting all dynamic trajectories in the absence of model uncertainty. An exact reduced-order model is then obtained through the restriction of the original biosystem dynamics on the slow manifold. The analyticity property of the solution to the invariance PDEs enables the development of a series solution method that can be easily implemented using MAPLE, leading to polynomial approximations up to the desired degree of accuracy. Furthermore, the aforementioned attractivity property and the transition towards the above manifold are analyzed and characterized in the presence of model uncertainty. Finally, examples of certain immobilized enzyme bioreactors are considered to elucidate aspects of the proposed context of analysis.
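
    In standard two-time-scale notation (our symbols, not necessarily the paper's), the slow invariant manifold arises as the solution of an invariance PDE of the kind described:

```latex
% Sketch of the invariance-PDE formulation in standard two-time-scale
% notation; these symbols are ours, not necessarily the paper's.
\begin{align*}
  \dot{x}_s &= f(x_s, x_f), &
  \varepsilon\,\dot{x}_f &= g(x_s, x_f), &
  0 < \varepsilon &\ll 1 .
\end{align*}
Seeking the slow invariant manifold as a graph $x_f = \pi(x_s)$ and
requiring that trajectories starting on it stay on it yields the
singular quasi-linear invariance PDE
\begin{equation*}
  \varepsilon\,\frac{\partial \pi}{\partial x_s}\,
    f\bigl(x_s, \pi(x_s)\bigr) = g\bigl(x_s, \pi(x_s)\bigr),
\end{equation*}
whose locally analytic solution can be computed term by term as a
series expansion $\pi(x_s) = \sum_{k \ge 0} \varepsilon^k \pi_k(x_s)$,
in the spirit of the MAPLE-implementable series solution method the
abstract mentions.
```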

  11. Analyzing interdependencies between policy mixes and technological innovation systems : The case of offshore wind in Germany

    NARCIS (Netherlands)

    Reichardt, Kristin; Negro, Simona O.; Rogge, Karoline S.; Hekkert, Marko P.

    2016-01-01

    One key approach for studying emerging technologies in the field of sustainability transitions is that of technological innovation systems (TIS). While most TIS studies aim at deriving policy recommendations - typically by identifying system barriers - the actual role of these proposed policies in t

  12. Stop looking up the ladder: analyzing the impact of participatory technology assessment from a network perspective

    NARCIS (Netherlands)

    Loeber, A.; Griessler, E.; Versteeg, W.

    2011-01-01

    Alongside the gradual increase in use of participatory technology assessment (PTA), a tool to democratize decision-making on controversial technologies, a growing body of literature on how to assess the impact of PTA has developed. A distinction can be made between two generations of impact assessme

  13. Technological change and salary variation in Mexican regions: Analyzing panel data for the service sector

    Directory of Open Access Journals (Sweden)

    Mario Camberos C.

    2013-07-01

    In this paper the biased technological change hypothesis is applied to Mexican service-sector workers belonging to several Mexican regions, using Economic Census microdata for 1998, 2003 and 2008. The hypothesis is tested against technological gaps, considering different indices and checking the consistency of the resulting statistics through panel analysis. Major wage differences were found between the Capital region and the South: about five hundred percent in 1998, falling to two hundred percent by 2008. This result is consistent with a diminishing technological gap, perhaps caused by the impact of the economic crisis.

  14. Production of biofuels and biochemicals by in vitro synthetic biosystems: Opportunities and challenges.

    Science.gov (United States)

    Zhang, Yi-Heng Percival

    2015-11-15

    The largest obstacle to the cost-competitive production of low-value, high-impact biofuels and biochemicals (called biocommodities) is the high production cost of microbial catalysis, due to microbes' inherent weaknesses, such as low product yield, slow reaction rate, high separation cost, intolerance to toxic products, and so on. This predominant whole-cell platform suffers from a mismatch between the primary goal of living microbes (cell proliferation) and the desired biomanufacturing goal (the desired products, which most of the time are not cell mass). In vitro synthetic biosystems consist of numerous enzymes as building bricks, enzyme complexes as building modules, and/or (biomimetic) coenzymes, assembled into synthetic enzymatic pathways for implementing complicated bioreactions. They emerge as an alternative solution for accomplishing a desired biotransformation without concerns about cell proliferation, complicated cellular regulation, or side-product formation. In addition to their most important advantage, high product yield, in vitro synthetic biosystems feature several other biomanufacturing advantages, such as fast reaction rate, easy product separation, open process control, broad reaction conditions, tolerance to toxic substrates or products, and so on. In this perspective review, the general design rules of in vitro synthetic pathways are presented with eight supporting examples: hydrogen, n-butanol, isobutanol, electricity, starch, lactate, 1,3-propanediol, and poly-3-hydroxybutyrate. Also, a detailed economic analysis for enzymatic hydrogen production from carbohydrates is presented to illustrate some advantages of this system and the remaining challenges. Great market potential will motivate worldwide efforts from multiple disciplines (i.e., chemistry, biology and engineering) to address the remaining obstacles pertaining to the cost and stability of enzymes and coenzymes, standardized building parts and modules, biomimetic coenzymes, biosystem optimization, and scale…

  15. Innovative CO2 Analyzer Technology for the Eddy Covariance Flux Monitor Project

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to build and evaluate NDIR Analyzers that can be used to observe Eddy Covariance Flux and Absolute Dry Mole Fraction of CO2 from stationary and airborne...

  16. Innovative CO2 Analyzer Technology for the Eddy Covariance Flux Monitor Project

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to build and evaluate NDIR Analyzers that can observe eddy covariance flux of CO2 from unmanned airborne platforms. For both phases, a total of four...

  17. Subsidizing the adoption of energy-saving technologies; Analyzing the impact of uncertainty, learning and maturation

    NARCIS (Netherlands)

    H.L.F. de Groot (Henri); D.P. van Soest (Daan); P. Mulder (Peter)

    2003-01-01

    textabstractAs part of the Kyoto Protocol, many countries have committed themselves to substantially reduce the emission of greenhouse gases within a politically imposed time constraint. Investment subsidies can be an important instrument to stimulate the adoption of energy-saving technologies to ac

  18. A thermodynamic perspective on technologies in the Anthropocene : analyzing environmental sustainability

    NARCIS (Netherlands)

    Liao, Wenjie

    2012-01-01

    Technologies and sustainable development are interrelated from a thermodynamic perspective, with industrial ecology (IE) as a major point of access for studying the relationship in the Anthropocene. To offer insights into the potential offered by thermodynamics in the environmental sustainability an

  19. Multimodal methodologies for analyzing preschool children’s engagements with digital technologies

    DEFF Research Database (Denmark)

    Chimirri, Niklas Alexander

    and between reality and virtuality. In its stead, it suggests understanding technologies as mediating devices or artifacts, which offer the possibility to communicate with others for the sake of negotiating and actualizing imaginations together, for the sake of transforming pedagogical practice...

  20. Subsidizing the Adoption of Energy-Saving Technologies: Analyzing the Impact of Uncertainty, Learning and Maturation

    OpenAIRE

    Groot, Henri de; van Soest, Daan; Mulder, Peter

    2004-01-01

    textabstractAs part of the Kyoto Protocol, many countries have committed themselves to substantially reduce the emission of greenhouse gases within a politically imposed time constraint. Investment subsidies can be an important instrument to stimulate the adoption of energy-saving technologies to achieve emission reduction targets. This paper addresses the impact of adoption subsidies on the amount of energy savings, taking into account both the endogenous and uncertain nature of technologica...

  1. Satellite Technology as a Source of Integration. A Comparative Analysis: Europe MERCOSUR

    Science.gov (United States)

    Castillo Argañarás, Luis F.

    2002-01-01

    Satellite technology has produced several changes in the field of international law, creating the need to build a new framework for integration and cooperation. A comparative analysis between the development of European integration for space activities and the first steps towards the same target by MERCOSUR will show, from a comparative point of view, the positive and negative side effects of this development up to the present time.

  2. Analyzing the Life Cycle Energy Savings of DOE Supported Buildings Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Cort, Katherine A.; Hostick, Donna J.; Dirks, James A.; Elliott, Douglas B.

    2009-08-31

    This report examines the factors that would potentially help determine an appropriate analytical timeframe for measuring the U.S. Department of Energy's Building Technology (BT) benefits and presents a summary-level analysis of the life cycle savings for BT’s Commercial Buildings Integration (CBI) R&D program. The energy savings for three hypothetical building designs are projected over a 100-year period using Building Energy Analysis and Modeling System (BEAMS) to illustrate the resulting energy and carbon savings associated with the hypothetical aging buildings. The report identifies the tasks required to develop a long-term analytical and modeling framework, and discusses the potential analytical gains and losses by extending an analysis into the “long-term.”

  3. Technology Monopoly Analysis Based on the Form of Existence of Technology

    Institute of Scientific and Technical Information of China (English)

    刘康

    2012-01-01

    This essay defines the concept of technology monopoly and distinguishes it from market monopoly and technology property rights. From the perspective of the form in which technology exists, it analyzes the formation, structure and outward manifestations of technology monopoly, and studies the factors influencing the degree of technology market monopoly under a market economy system. It proposes an evaluation system and calculation method for the degree of technology market monopoly and, treating monopolistic technology organizations in the market as a system, analyzes the relationship between technology innovation input and the degree of market monopoly. Finally, it analyzes the technology monopoly cycle, derives the structure of technology monopoly competitiveness, and draws out the connotation of the technology monopoly competitiveness of industrial technology alliances.

  4. Evolving technologies for growing, imaging and analyzing 3D root system architecture of crop plants

    Institute of Scientific and Technical Information of China (English)

    Miguel A Pineros; Pierre-Luc Pradier; Nathanael M Shaw; Ithipong Assaranurak; Susan R McCouch; Craig Sturrock; Malcolm Bennett; Leon V Kochian; Brandon G Larson; Jon E Shaff; David J Schneider; Alexandre Xavier Falcao; Lixing Yuan; Randy T Clark; Eric J Craft; Tyler W Davis

    2016-01-01

    A plant's ability to maintain or improve its yield under limiting conditions, such as nutrient deficiency or drought, can be strongly influenced by root system architecture (RSA), the three-dimensional distribution of the different root types in the soil. The ability to image, track and quantify these root system attributes in a dynamic fashion is a useful tool in assessing desirable genetic and physiological root traits. Recent advances in imaging technology and phenotyping software have resulted in substantive progress in describing and quantifying RSA. We have designed a hydroponic growth system which retains the three-dimensional RSA of the plant root system, while allowing for aeration, solution replenishment and the imposition of nutrient treatments, as well as high-quality imaging of the root system. The simplicity and flexibility of the system allow for modifications tailored to the RSA of different crop species and improved throughput. This paper details the recent improvements and innovations in our root growth and imaging system which allow for greater image sensitivity (detection of fine roots and other root details), higher efficiency, and a broad array of growing conditions for plants that more closely mimic those found under field conditions.

  5. Analyzing Accuracy and Accessibility in Information and Communication Technology Ethical Scenario Context

    Directory of Open Access Journals (Sweden)

    M. Masrom

    2011-01-01

    Problem statement: The development of Information and Communication Technology (ICT) has recently become indispensable to life. The utilization of ICT has provided advantages for people, organizations and society as a whole. Nevertheless, the widespread and rapid use of ICT in society has exacerbated existing ethical issues and dilemmas and has also led to the emergence of new ethical issues such as unauthorized access, software piracy, internet pornography, privacy protection, the information gap and many others. Approach: The aim of this study is therefore to discuss several issues in ICT ethics, focusing on two major issues: data accuracy and accessibility. Results: The results indicated that more than half of the respondents tended to act ethically in the data accuracy scenario and also in the accessibility scenario. Several computer ethics scenarios relating to data accuracy and accessibility are presented and the results of the analysis are then discussed. Conclusion: Based on the results of this study, computer ethics issues such as data accuracy and accessibility should receive more attention in the ICT field.

  6. Validation of the Applied Biosystems 7500 Fast Instrument for the Detection of Salmonellae with SureTect Salmonella Species PCR Kit.

    Science.gov (United States)

    Cloke, Jonathan; Evans, Katharine; Crabtree, David; Hughes, Annette; Simpson, Helen

    2016-07-01

    The Thermo Scientific SureTect™ Salmonella species real-time PCR assay is a rapid alternative method designed for the detection of salmonellae in a wide range of foods, animal feeds, and production-environment samples. The assay has previously been validated according to the AOAC Research Institute Performance Tested Methods(SM) program using the Thermo Scientific PikoReal™ PCR cycler and Thermo Scientific SureTect software (Performance Tested Method 051303). This report details the method-modification study performed to validate an updated assay format, utilizing a reduced target-probe concentration and an extension of the PCR cycler platform to enable use of the kit with an Applied Biosystems 7500 Fast PCR cycler and Applied Biosystems RapidFinder™ Express 2.0 software. During this validation study, a matrix study was conducted on a subset of the method's claimed matrixes, comparing the performance of the modified SureTect Salmonella species kit (reduced target-probe concentration on the 7500 Fast platform) with the reference method detailed in ISO 6579:2002. Probability-of-detection statistical analysis found no significant difference between the SureTect and International Organization for Standardization methods for any of the matrixes analyzed during the study. Inclusivity and exclusivity studies using the modified method demonstrated accurate results for the 117 Salmonella and 36 non-Salmonella strains tested. Multiple production lots of the newly formatted kit were evaluated and found to be consistent with the current assay. Robustness studies confirmed that the change to the kit had no impact on the assay's performance when alterations were made to the method parameters with the greatest potential impact on assay performance.

  7. Third cycle university studies in Europe in the field of agricultural engineering and in the emerging discipline of biosystems engineering.

    Science.gov (United States)

    Ayuga, F; Briassoulis, D; Aguado, P; Farkas, I; Griepentrog, H; Lorencowicz, E

    2010-01-01

    The main objective of the European Thematic Network 'Education and Research in Agricultural and Biosystems Engineering in Europe (ERABEE-TN)' is to initiate and contribute to the structural development, and the assurance of quality assessment, of the emerging discipline of Biosystems Engineering in Europe. ERABEE is co-financed by the European Community in the framework of the LLP Programme. The partnership consists of 35 participants from 27 Erasmus countries, of which 33 are Higher Education Area Institutions (EDU) and 2 are Student Associations (ASS). 13 Erasmus participants (e.g. Thematic Networks, Professional Associations, and Institutions from Brazil, Croatia, Russia and Serbia) are also involved in the Thematic Network through synergies. To date, very few Biosystems Engineering programmes exist in Europe and those that have been initiated are at a very primitive stage of development. The innovative and novel goal of the Thematic Network is to promote this critical transition, which requires major restructuring in Europe, exploiting along this direction the outcomes accomplished by its predecessor, the USAEE-TN (University Studies in Agricultural Engineering in Europe). It also aims at enhancing the compatibility among the new programmes of Biosystems Engineering, aiding their recognition and accreditation at the European and international level, and facilitating greater mobility of skilled personnel, researchers and students. One of the technical objectives of ERABEE is mapping and promoting third-cycle studies (including European PhDs) and supporting the integration of research at the 1st and 2nd cycles of European Biosystems Engineering university studies. During the winter 2008 - spring 2009 period, members of ERABEE conducted a survey on the contemporary status of doctoral studies in Europe, and on a possible scheme for promoting cooperation and synergies in the framework of the third cycle of studies and the European Doctorate.

  8. Geoinformation modeling system for analysis of atmosphere pollution impact on vegetable biosystems using space images

    Science.gov (United States)

    Polichtchouk, Yuri; Ryukhko, Viatcheslav; Tokareva, Olga; Alexeeva, Mary

    2002-02-01

    The structure of a geoinformation modeling system for assessing the environmental impact of atmospheric pollution on the forest-swamp ecosystems of West Siberia is considered. A complex approach to the assessment of man-made impact, based on the combination of sanitary-hygienic and landscape-geochemical approaches, is reported. Methodical problems in analyzing the impact of atmospheric pollution on vegetation biosystems using geoinformation systems and remote sensing data are developed. The landscape structure of oil production territories in the southern part of West Siberia is determined by processing space images from the Resurs-O spacecraft. Particularities of modeling the atmospheric pollution zones caused by gas flaring in the territories of oil fields are considered: pollution zones were revealed by modeling contaminant dispersal in the atmosphere with a standard model, and polluted landscape areas were calculated as a function of oil production volume. The calculated data are shown to be well approximated by polynomial models.

  9. Electro-Quasistatic Simulations in Bio-Systems Engineering and Medical Engineering

    Science.gov (United States)

    van Rienen, U.; Flehr, J.; Schreiber, U.; Schulze, S.; Gimsa, U.; Baumann, W.; Weiss, D. G.; Gimsa, J.; Benecke, R.; Pau, H.-W.

    2005-05-01

    Slowly varying electromagnetic fields play a key role in various applications in bio-systems and medical engineering. Examples are the electric activity of neurons on neurochips used as biosensors, the stimulating electric fields of implanted electrodes used for deep brain stimulation in patients with Morbus Parkinson, and the stimulation of the auditory nerves in deaf patients. In order to simulate the neuronal activity on a chip it is necessary to couple Maxwell's and Hodgkin-Huxley's equations. First numerical results for a neuron coupling to a single electrode are presented; they show a promising qualitative agreement with the experimentally recorded signals. Further, simulations are presented on electrodes for deep brain stimulation in animal experiments, where electrode ageing and energy deposition in the surrounding tissue are of major interest. As a last example, electric simulations for a simple cochlea model are presented, comparing the field in the skull bones for different electrode types and stimulation in different positions.
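
    The membrane side of the Maxwell/Hodgkin-Huxley coupling mentioned above can be sketched with the classical Hodgkin-Huxley equations alone. A minimal forward-Euler integration in Python with standard squid-axon parameters; the stimulus value is illustrative:

```python
import numpy as np

# Classical Hodgkin-Huxley point neuron (standard squid-axon
# parameters), integrated with forward Euler for brevity. This covers
# only the membrane dynamics; coupling to the extracellular field
# (the Maxwell side) is beyond this sketch.
C, gNa, gK, gL = 1.0, 120.0, 36.0, 0.3    # uF/cm^2 and mS/cm^2
ENa, EK, EL = 50.0, -77.0, -54.387        # reversal potentials, mV

def rates(v):
    """Voltage-dependent opening/closing rates for the n, m, h gates."""
    an = 0.01 * (v + 55.0) / (1.0 - np.exp(-(v + 55.0) / 10.0))
    bn = 0.125 * np.exp(-(v + 65.0) / 80.0)
    am = 0.1 * (v + 40.0) / (1.0 - np.exp(-(v + 40.0) / 10.0))
    bm = 4.0 * np.exp(-(v + 65.0) / 18.0)
    ah = 0.07 * np.exp(-(v + 65.0) / 20.0)
    bh = 1.0 / (1.0 + np.exp(-(v + 35.0) / 10.0))
    return an, bn, am, bm, ah, bh

dt, steps = 0.01, 5000                    # 0.01 ms step, 50 ms total
v, n, m, h = -65.0, 0.317, 0.053, 0.596   # approximate resting state
spikes = 0
for i in range(steps):
    i_ext = 10.0 if 5.0 <= i * dt <= 45.0 else 0.0  # uA/cm^2 stimulus
    an, bn, am, bm, ah, bh = rates(v)
    n += dt * (an * (1.0 - n) - bn * n)
    m += dt * (am * (1.0 - m) - bm * m)
    h += dt * (ah * (1.0 - h) - bh * h)
    i_ion = (gNa * m**3 * h * (v - ENa)
             + gK * n**4 * (v - EK)
             + gL * (v - EL))
    v_new = v + dt * (i_ext - i_ion) / C
    if v < 0.0 <= v_new:                  # upward crossing of 0 mV
        spikes += 1
    v = v_new
print(f"spikes in 50 ms: {spikes}")
```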

  10. The Rücker-Markov invariants of complex Bio-Systems: applications in Parasitology and Neuroinformatics.

    Science.gov (United States)

    González-Díaz, Humberto; Riera-Fernández, Pablo; Pazos, Alejandro; Munteanu, Cristian R

    2013-03-01

    Rücker's walk count (WC) indices are well-known topological indices (TIs) used in Chemoinformatics to quantify the molecular structure of drugs represented by a graph in Quantitative structure-activity/property relationship (QSAR/QSPR) studies. In this work, we introduce for the first time the higher-order (kth order) analogues (WCk) of these indices using Markov chains. In addition, we report new QSPR models for large complex networks of different Bio-Systems useful in Parasitology and Neuroinformatics. The new type of QSPR models can be used for model checking to calculate numerical scores S(Lij) for links Lij (checking or re-evaluation of network connectivity) in large networks of all these fields. The method may be summarized as follows: (i) first, the WCk(j) values are calculated for all jth nodes in a complex network already created; (ii) a linear discriminant analysis (LDA) is used to seek a linear equation that discriminates experimentally confirmed connected or linked (Lij=1) pairs of nodes from non-linked ones (Lij=0); (iii) the new model is validated with external series of pairs of nodes; (iv) the equation obtained is used to re-evaluate the connectivity quality of the network, connecting/disconnecting nodes based on the quality scores calculated with the new connectivity function. The linear QSPR models obtained yielded the following results in terms of overall test accuracy for the reconstruction of complex networks of different Bio-Systems: parasite-host networks (93.14%), NW Spain fasciolosis spreading networks (71.42/70.18%) and the CoCoMac Brain Cortex co-activation network (86.40%). Thus, this work can contribute to the computational re-evaluation or model checking of connectivity (collation) in complex systems of any science field.
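    As a rough sketch of steps (i)-(ii), the toy code below computes node-level walk counts from powers of the adjacency matrix and trains an LDA classifier on concatenated node descriptors of candidate links. The network, the feature construction and the use of scikit-learn are assumptions made for illustration; the authors' Markov-chain normalization of WCk is not reproduced here.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        def walk_counts(A, k_max):
            """Node-level walk counts WCk(j): row sums of A^k, k = 1..k_max."""
            counts, Ak = [], np.eye(len(A))
            for _ in range(k_max):
                Ak = Ak @ A
                counts.append(Ak.sum(axis=1))
            return np.column_stack(counts)          # shape (n_nodes, k_max)

        # Toy undirected network of 4 nodes (adjacency matrix)
        A = np.array([[0, 1, 1, 0],
                      [1, 0, 1, 0],
                      [1, 1, 0, 1],
                      [0, 0, 1, 0]], dtype=float)

        WC = walk_counts(A, k_max=2)

        # Candidate links: features are the concatenated node descriptors
        pairs = [(0, 1), (0, 2), (1, 2), (2, 3), (0, 3), (1, 3)]
        X = np.array([np.concatenate([WC[i], WC[j]]) for i, j in pairs])
        y = np.array([int(A[i, j]) for i, j in pairs])  # 1 = linked, 0 = not

        lda = LinearDiscriminantAnalysis().fit(X, y)
        print(np.round(lda.decision_function(X), 2))    # link scores S(Lij)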

  11. Assessing the validity of using serious game technology to analyze physician decision making.

    Directory of Open Access Journals (Sweden)

    Deepika Mohan

    Full Text Available BACKGROUND: Physician non-compliance with clinical practice guidelines remains a critical barrier to high quality care. Serious games (using gaming technology for serious purposes) have emerged as a method of studying physician decision making. However, little is known about their validity. METHODS: We created a serious game and evaluated its construct validity. We used the decision context of trauma triage in the Emergency Department of non-trauma centers, given widely accepted guidelines that recommend the transfer of severely injured patients to trauma centers. We designed cases with the premise that the representativeness heuristic influences triage (i.e., physicians make transfer decisions based on archetypes of severely injured patients rather than guidelines). We randomized a convenience sample of emergency medicine physicians to a control or cognitive load arm, and compared performance (disposition decisions, number of orders entered, time spent per case). We hypothesized that cognitive load would increase the use of heuristics, increasing the transfer of representative cases and decreasing the transfer of non-representative cases. FINDINGS: We recruited 209 physicians, of whom 168 (79%) began and 142 (68%) completed the task. Physicians transferred 31% of severely injured patients during the game, consistent with rates of transfer for severely injured patients in practice. They entered the same average number of orders in both arms (control (C): 10.9 [SD 4.8] vs. cognitive load (CL): 10.7 [SD 5.6], p = 0.74), despite spending less time per case in the control arm (C: 9.7 [SD 7.1] vs. CL: 11.7 [SD 6.7] minutes, p<0.01). Physicians were equally likely to transfer representative cases in the two arms (C: 45% vs. CL: 34%, p = 0.20), but were more likely to transfer non-representative cases in the control arm (C: 38% vs. CL: 26%, p = 0.03). CONCLUSIONS: We found that physicians made decisions consistent with actual practice, that we could

  12. Analyzing the effect of customer loyalty on viral marketing adoption based on the theory of the technology acceptance model

    Directory of Open Access Journals (Sweden)

    Peyman Ghafari Ashtiani

    2016-08-01

    Full Text Available One of the main advantages of the internet and its expansion is probably its easy, low-cost access to unlimited information and its easy, fast information exchange. The arrival of communication technology in the marketing arena and the emergence of the Internet have led to the creation and development of new marketing models such as viral marketing. In fact, unlike other marketing methods, the most powerful selling of products and ideas takes place not from marketer to customer but from customer to customer. The purpose of this research is to analyze the relationship between customers' loyalty and the acceptance of viral marketing based on the theory of the technology acceptance model (TAM) among the civil engineers and architects who are members of the Engineering Council in Isfahan (ECI). The research is applied in purpose and uses a descriptive survey method. The statistical population comprises the civil engineers and architects who are members of the Engineering Council in Isfahan, 14400 members in all. The sample size of 762 members was determined using Cochran's sampling formula, and the sample was selected by convenience sampling. The data were collected in the field from questionnaires and then analyzed with SPSS and LISREL software. According to the results, the loyalty of the civil engineer and architect members of ECI was associated with the acceptance and practical involvement of viral marketing.

  13. Statistical Techniques Complement UML When Developing Domain Models of Complex Dynamical Biosystems.

    Science.gov (United States)

    Williams, Richard A; Timmis, Jon; Qwarnstrom, Eva E

    2016-01-01

    Computational modelling and simulation is increasingly being used to complement traditional wet-lab techniques when investigating the mechanistic behaviours of complex biological systems. In order to ensure computational models are fit for purpose, it is essential that the abstracted view of biology captured in the computational model, is clearly and unambiguously defined within a conceptual model of the biological domain (a domain model), that acts to accurately represent the biological system and to document the functional requirements for the resultant computational model. We present a domain model of the IL-1 stimulated NF-κB signalling pathway, which unambiguously defines the spatial, temporal and stochastic requirements for our future computational model. Through the development of this model, we observe that, in isolation, UML is not sufficient for the purpose of creating a domain model, and that a number of descriptive and multivariate statistical techniques provide complementary perspectives, in particular when modelling the heterogeneity of dynamics at the single-cell level. We believe this approach of using UML to define the structure and interactions within a complex system, along with statistics to define the stochastic and dynamic nature of complex systems, is crucial for ensuring that conceptual models of complex dynamical biosystems, which are developed using UML, are fit for purpose, and unambiguously define the functional requirements for the resultant computational model.

  14. Statistical Techniques Complement UML When Developing Domain Models of Complex Dynamical Biosystems

    Science.gov (United States)

    Timmis, Jon; Qwarnstrom, Eva E.

    2016-01-01

    Computational modelling and simulation is increasingly being used to complement traditional wet-lab techniques when investigating the mechanistic behaviours of complex biological systems. In order to ensure computational models are fit for purpose, it is essential that the abstracted view of biology captured in the computational model, is clearly and unambiguously defined within a conceptual model of the biological domain (a domain model), that acts to accurately represent the biological system and to document the functional requirements for the resultant computational model. We present a domain model of the IL-1 stimulated NF-κB signalling pathway, which unambiguously defines the spatial, temporal and stochastic requirements for our future computational model. Through the development of this model, we observe that, in isolation, UML is not sufficient for the purpose of creating a domain model, and that a number of descriptive and multivariate statistical techniques provide complementary perspectives, in particular when modelling the heterogeneity of dynamics at the single-cell level. We believe this approach of using UML to define the structure and interactions within a complex system, along with statistics to define the stochastic and dynamic nature of complex systems, is crucial for ensuring that conceptual models of complex dynamical biosystems, which are developed using UML, are fit for purpose, and unambiguously define the functional requirements for the resultant computational model. PMID:27571414

  15. Analyzing the Technology of Using Ash and Slag Waste from Thermal Power Plants in the Production of Building Ceramics

    Science.gov (United States)

    Malchik, A. G.; Litovkin, S. V.; Rodionov, P. V.; Kozik, V. V.; Gaydamak, M. A.

    2016-04-01

    The work describes the problem of impounding and storing ash and slag waste at coal-fired thermal power plants in Russia. The recovery and recycling of ash and slag waste are analyzed. The activity of radionuclides, the chemical composition and the particle sizes of the ash and slag waste were determined; the acidity index, the basicity and the class of material were defined. A technology for making ceramic products with the addition of ash and slag waste is proposed. The dependencies on the percentage of ash and slag waste were established, along with the optimal baking parameters. The obtained materials were tested for physical and mechanical properties, namely water absorption, thermal conductivity and compressive strength. Based on the findings, future prospects for the use of ash and slag waste were identified.

  16. Analyzing the efficiency of small and medium-sized enterprises of a national technology innovation research and development program.

    Science.gov (United States)

    Park, Sungmin

    2014-01-01

    This study analyzes the efficiency of small and medium-sized enterprises (SMEs) of a national technology innovation research and development (R&D) program. In particular, an empirical analysis is presented that aims to answer the following question: "Is there a difference in the efficiency between R&D collaboration types and between government R&D subsidy sizes?" Methodologically, the efficiency of a government-sponsored R&D project (i.e., GSP) is measured by Data Envelopment Analysis (DEA), and a nonparametric analysis of variance method, the Kruskal-Wallis (KW) test, is adopted to see whether the efficiency differences between R&D collaboration types and between government R&D subsidy sizes are statistically significant. This study's major findings are as follows. First, contrary to our hypothesis, when we controlled for the influence of government R&D subsidy size, there was no statistically significant difference in efficiency between R&D collaboration types. However, the "SME-University-Laboratory" Joint-Venture collaboration type was superior to the others, achieving the largest median and the smallest interquartile range of DEA efficiency scores. Second, the differences in efficiency between government R&D subsidy sizes were statistically significant, and the phenomenon of diseconomies of scale was identified on the whole. As the government R&D subsidy size increased, the central measures of DEA efficiency scores decreased, while the dispersion measures tended to grow.
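    A minimal sketch of this pipeline, assuming an input-oriented CCR DEA model solved as a linear program (the abstract does not specify the DEA formulation) and invented input/output data split into three arbitrary subsidy-size groups:

        import numpy as np
        from scipy.optimize import linprog
        from scipy.stats import kruskal

        def dea_ccr_efficiency(X, Y):
            """Input-oriented CCR DEA. X: (n, n_inputs), Y: (n, n_outputs)."""
            n = len(X)
            thetas = []
            for o in range(n):
                c = np.r_[1.0, np.zeros(n)]               # minimize theta
                A_in = np.hstack([-X[[o]].T, X.T])        # sum(l*x) <= theta*x_o
                A_out = np.hstack([np.zeros((Y.shape[1], 1)), -Y.T])
                res = linprog(c,
                              A_ub=np.vstack([A_in, A_out]),
                              b_ub=np.r_[np.zeros(X.shape[1]), -Y[o]],
                              bounds=[(0, None)] * (n + 1),
                              method="highs")
                thetas.append(res.x[0])
            return np.array(thetas)

        rng = np.random.default_rng(0)
        X = rng.uniform(1, 10, size=(30, 2))   # e.g. subsidy, staff (inputs)
        Y = rng.uniform(1, 10, size=(30, 1))   # e.g. project output

        theta = dea_ccr_efficiency(X, Y)
        small, medium, large = theta[:10], theta[10:20], theta[20:]
        H, p = kruskal(small, medium, large)   # efficiency vs. subsidy size
        print(f"Kruskal-Wallis H = {H:.2f}, p = {p:.3f}")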

  17. Analyzing Department of Defense's use of other transactions as a method for accessing non-traditional technology

    OpenAIRE

    Gilliland, John E.

    2001-01-01

    As U.S. Defense budgets and military research and development spending experienced significant decline between 1988 and 1998, the Defense Technology and Industrial Base essentially merged with the national industrial base. DOD reform occurred more slowly than changes in the private sector fueled by advances in technology. U.S. national security relies upon the ability of the military to maintain technological superiority. To attract advanced technology companies that normally do not participa...

  18. Corneal biomechanical properties after femtosecond laser assisted LASIK with the corneal visualization Scheimpflug technology and ocular response analyzer

    Directory of Open Access Journals (Sweden)

    Jing Li

    2017-02-01

    Full Text Available AIM: To investigate the changes in corneal biomechanical properties before and after femtosecond laser assisted LASIK (FS-LASIK) using Corneal Visualisation Scheimpflug Technology (Corvis ST) and the Ocular Response Analyzer (ORA), and their correlation with other myopic parameters. METHODS: Sixty-three patients (63 eyes) who had myopic FS-LASIK were enrolled in the study. The right eye of each patient was analyzed. The corneal biomechanical parameters were measured pre-operatively and 1mo post-operatively with the Corvis ST (Oculus, Wetzlar, Germany) and ORA (Reichert, Buffalo, New York, USA). Comparison of the biomechanical property values before and after surgery was performed using the paired t-test or Mann-Whitney U test. Pearson or Spearman correlations were used to evaluate the relationships between parameters. RESULTS: The postoperative 1st A-time, Vin, 2nd A length, Vout, HC time and Radius demonstrated significant decreases compared with preoperative values (P=0.00 for each). The postoperative 2nd A-time, DA and PD significantly increased (P=0.00 for each); however, the 1st A length showed no significant difference after surgery. The CH and CRF were significantly lower after FS-LASIK (P=0.00, P=0.00). Statistically significant correlation coefficients were found between preoperative central corneal thickness (CCT) and the postoperative-preoperative changes of 1st A-time, 2nd A-time, DA and Radius, respectively (P=0.01, P=0.04, P=0.03, P=0.01). CONCLUSION: There were significant changes in corneal biomechanical properties after FS-LASIK surgery, which can be reflected by some parameters of the Corvis ST and ORA. The corneal biomechanical alterations were possibly correlated mainly with corneal thickness.

  19. Analyzing Sound Production Technology in the Phenomenology of Technology

    Institute of Scientific and Technical Information of China (English)

    李松林

    2011-01-01

    Sound is a property of the natural world and one of humanity's most ancient forms of communication. Sound production technology covers the recording and transmission of sound; its development is closely bound up with the history of human development and progress, stretching over a long historical period from primitive society through the modern era to contemporary society. The phenomenology of technology is the theory of the relationship between technology and human beings. Applying Ihde's phenomenology of technology, the history of sound production technology can be divided into an embryonic stage, an emergence stage, a development stage, a mature stage and a post-modern stage. The basic law running through the advance and development of sound production technology is that consciousness is extended through the production and transmission of sound. Its essence is a "phenomenology of sound", the production of a kind of "existence". Sound production technology also carries the deeper philosophical significance of changing the mode of human existence; the basic philosophical proposition of "existence" may itself become a new mode of existence.

  20. Interactions of zero-frequency and oscillating magnetic fields with biostructures and biosystems.

    Science.gov (United States)

    Volpe, Pietro

    2003-06-01

    In the framework of studies on the origin and adaptation of life on Earth or in the Universe, theoretical insights paving the way to elucidate the mechanisms of the MF interactions with biostructures and biosystems are considered.

  1. Climate technology transfer at the local, national and global levels: analyzing the relationships between multi-level structures

    NARCIS (Netherlands)

    Tessema Abissa, Fisseha

    2014-01-01

    This thesis examines the relationships between multi-leveled decision structures for climate technology transfer through an analysis of top-down macro-policy and bottom-up micro-implementation. It examines how international climate technology transfer policy under the UNFCCC filters down to the national level.

  2. Does Personality Matter? Applying Holland's Typology to Analyze Students' Self-Selection into Science, Technology, Engineering, and Mathematics Majors

    Science.gov (United States)

    Chen, P. Daniel; Simpson, Patricia A.

    2015-01-01

    This study utilized John Holland's personality typology and the Social Cognitive Career Theory (SCCT) to examine the factors that may affect students' self-selection into science, technology, engineering, and mathematics (STEM) majors. Results indicated that gender, race/ethnicity, high school achievement, and personality type were statistically…

  3. Validation of the Applied Biosystems 7500 Fast Instrument for Detection of Listeria Species with the SureTect Listeria Species PCR Assay.

    Science.gov (United States)

    Cloke, Jonathan; Arizanova, Julia; Crabtree, David; Simpson, Helen; Evans, Katharine; Vaahtoranta, Laura; Palomäki, Jukka-Pekka; Artimo, Paulus; Huang, Feng; Liikanen, Maria; Koskela, Suvi; Chen, Yi

    2016-01-01

    The Thermo Scientific™ SureTect™ Listeria species Real-Time PCR Assay was certified during 2013 by the AOAC Research Institute (RI) Performance Tested Methods(SM) program as a rapid method for the detection of Listeria species from a wide range of food matrixes and surface samples. A method modification study was conducted in 2015 to extend the matrix claims of the product to a wider range of food matrixes. This report details the method modification study undertaken to extend the use of this PCR kit to the Applied Biosystems™ 7500 Fast PCR Instrument and Applied Biosystems RapidFinder™ Express 2.0 software, allowing use of the assay on a 96-well format PCR cycler in addition to the current workflow, which uses the 24-well Thermo Scientific PikoReal™ PCR Instrument and Thermo Scientific SureTect software. The method modification study presented in this report was assessed by the AOAC-RI as a level 2 method modification study, necessitating a method developer study on a representative range of food matrixes covering raw ground turkey, 2% fat pasteurized milk, and bagged lettuce, as well as stainless steel surface samples. All testing was conducted in comparison to the reference method detailed in International Organization for Standardization (ISO) 6579:2002. No significant difference by probability of detection statistical analysis was found between the SureTect Listeria species PCR Assay and the ISO reference method for any of the three food matrixes or the surface samples analyzed during the study.

  4. Analyzing the U.S. Marine Corps Enterprise Information Technology Framework for IT Acquisition and Portfolio Governance

    Science.gov (United States)

    2012-09-01

    technology will enable the diffusion of destructive power to smaller and smaller groups" (Alberts, Garstka, & Stein, 2000). Thereby, through the NCW... (2000) conducted research into the potential power of the Internet age, under the direction of the Assistant Secretary of Defense (C3I). Alberts... effectiveness at the operational level" (Alberts et al., 2000). Therefore, the MCEN is a current manifestation of the Marine Corps' vision of NCW.

  5. Analyzing the Effect of Technology-Based Intervention in Language Laboratory to Improve Listening Skills of First Year Engineering Students

    Directory of Open Access Journals (Sweden)

    Pasupathi Madhumathi

    2013-04-01

    Full Text Available First year students pursuing engineering education face problems with their listening skills. Most Indian schools use a bilingual method for teaching subjects from primary school through high school. Nonetheless, students entering university education develop anxiety in listening to classroom lectures in English. This article reports an exploratory study that aimed to find out whether the listening competences of students improved when technology was deployed in the language laboratory. It also investigated the opinions of the students about using teacher-suggested websites for acquiring listening skills. The results of the study indicated that the use of technology in a language laboratory for training students in listening competences reduced the anxiety of the students when listening to English. Further, there was a significant improvement on the part of students in acquiring listening skills through the technology-based intervention.

  6. Network Stack Analysis and Protocol Addition Technology in Linux

    Institute of Scientific and Technical Information of China (English)

    唐续; 刘心松; 杨峰

    2003-01-01

    In order to improve the performance of Linux networking, new protocols must be implemented and added to the original protocol stack. To meet this demand, this paper analyzes the Linux network stack architecture and its implementation technology, then presents a method for adding new protocols to the Linux network stack. The method covers protocol registration, protocol operations, protocol header implementation, packet receiving and the user interface.

  7. Seismic Technology Adapted to Analyzing and Developing Geothermal Systems Below Surface-Exposed High-Velocity Rocks Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Hardage, Bob A. [Univ. of Texas, Austin, TX (United States). Bureau of Economic Geology; DeAngelo, Michael V. [Univ. of Texas, Austin, TX (United States). Bureau of Economic Geology; Ermolaeva, Elena [Univ. of Texas, Austin, TX (United States). Bureau of Economic Geology; Hardage, Bob A. [Univ. of Texas, Austin, TX (United States). Bureau of Economic Geology; Remington, Randy [Univ. of Texas, Austin, TX (United States). Bureau of Economic Geology; Sava, Diana [Univ. of Texas, Austin, TX (United States). Bureau of Economic Geology; Wagner, Donald [Univ. of Texas, Austin, TX (United States). Bureau of Economic Geology; Wei, Shuijion [Univ. of Texas, Austin, TX (United States). Bureau of Economic Geology

    2013-02-01

    The objective of our research was to develop and demonstrate seismic data-acquisition and data-processing technologies that allow geothermal prospects below high-velocity rock outcrops to be evaluated. To do this, we acquired a 3-component seismic test line across an area of exposed high-velocity rocks in Brewster County, Texas, where there is high heat flow and surface conditions mimic those found at numerous geothermal prospects. Seismic contractors have not succeeded in creating good-quality seismic data in this area for companies who have acquired data for oil and gas exploitation purposes. Our test profile traversed an area where high-velocity rocks and low-velocity sediment were exposed on the surface in alternating patterns that repeated along the test line. We verified that these surface conditions cause non-ending reverberations of Love waves, Rayleigh waves, and shallow critical refractions to travel across the earth surface between the boundaries of the fast-velocity and slow-velocity material exposed on the surface. These reverberating surface waves form the high level of noise in this area that does not allow reflections from deep interfaces to be seen and utilized. Our data-acquisition method of deploying a box array of closely spaced geophones allowed us to recognize and evaluate these surface-wave noise modes regardless of the azimuth direction to the surface anomaly that backscattered the waves and caused them to return to the test-line profile. With this knowledge of the surface-wave noise, we were able to process these test-line data to create P-P and SH-SH images that were superior to those produced by a skilled seismic data-processing contractor. Compared to the P-P data acquired along the test line, the SH-SH data provided a better detection of faults and could be used to trace these faults upward to the boundaries of exposed surface rocks. We expanded our comparison of the relative value of S-wave and P-wave seismic data for geothermal

  8. Ab initio O(N) elongation-counterpoise method for BSSE-corrected interaction energy analyses in biosystems

    Energy Technology Data Exchange (ETDEWEB)

    Orimoto, Yuuichi; Xie, Peng; Liu, Kai [Department of Material Sciences, Faculty of Engineering Sciences, Kyushu University, 6-1 Kasuga-Park, Fukuoka 816-8580 (Japan); Yamamoto, Ryohei [Department of Molecular and Material Sciences, Interdisciplinary Graduate School of Engineering Sciences, Kyushu University, 6-1 Kasuga-Park, Fukuoka 816-8580 (Japan); Imamura, Akira [Hiroshima Kokusai Gakuin University, 6-20-1 Nakano, Aki-ku, Hiroshima 739-0321 (Japan); Aoki, Yuriko, E-mail: aoki.yuriko.397@m.kyushu-u.ac.jp [Department of Material Sciences, Faculty of Engineering Sciences, Kyushu University, 6-1 Kasuga-Park, Fukuoka 816-8580 (Japan); Japan Science and Technology Agency, CREST, 4-1-8 Hon-chou, Kawaguchi, Saitama 332-0012 (Japan)

    2015-03-14

    An Elongation-counterpoise (ELG-CP) method was developed for performing accurate and efficient interaction energy analysis and correcting the basis set superposition error (BSSE) in biosystems. The method was achieved by combining our ab initio O(N) elongation method with the conventional counterpoise method proposed for solving the BSSE problem. As a test, the ELG-CP method was applied to the analysis of DNA inter-strand interaction energies with respect to the alkylation-induced base pair mismatch phenomenon that causes a transition from G⋯C to A⋯T. It was found that the ELG-CP method showed high efficiency (nearly linear scaling) and high accuracy, with a negligibly small energy error in the total energy calculations (on the order of 10^-7 to 10^-8 hartree/atom) as compared with the conventional method during the counterpoise treatment. Furthermore, the magnitude of the BSSE was found to be ca. -290 kcal/mol for the calculation of a DNA model with 21 base pairs. This emphasizes the importance of BSSE correction when a limited-size basis set is used to study DNA models and compare small energy differences between them. In this work, we quantitatively estimated the inter-strand interaction energy for each possible step in the transition process from G⋯C to A⋯T by the ELG-CP method. It was found that the base pair replacement in the process only affects the interaction energy in a limited area around the mismatch position, within a few adjacent base pairs. From the interaction energy point of view, our results showed that a base pair sliding mechanism possibly occurs after the alkylation of guanine to gain the maximum possible number of hydrogen bonds between the bases. In addition, the steps leading to the A⋯T replacement accompanied by replications were found to be unfavorable processes, corresponding to a loss of ca. 10 kcal/mol in stabilization energy. The present study indicated that the ELG-CP method is promising for
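    For reference, the conventional Boys-Bernardi counterpoise scheme that ELG-CP combines with the elongation method evaluates every fragment energy in the full dimer basis (ghost functions on the partner fragment); in LaTeX notation, with alpha and beta denoting the monomer basis sets (the elongation-specific working equations are not reproduced here):

        % Counterpoise-corrected interaction energy of fragments A and B
        E^{\mathrm{CP}}_{\mathrm{int}}
          = E^{\alpha\cup\beta}_{AB}(AB)
          - E^{\alpha\cup\beta}_{A}(A)
          - E^{\alpha\cup\beta}_{B}(B)

    The BSSE estimate is then the difference between this corrected interaction energy and the uncorrected one in which each monomer is evaluated in its own basis.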

  9. Discussion on Modern Agricultural Science and Technology Demonstration Gardens Guided by Agricultural Ecotourism: Analyzing the Concept Plan for the Ten-Thousand-Mu Coffee and Nuts Modern Agricultural Science and Technology Demonstration Field at Huaqiaoba

    Directory of Open Access Journals (Sweden)

    Yang Liu

    2013-03-01

    Full Text Available Modern agricultural science and technology demonstration fields have emerged and flourished with the ongoing adjustment of the agricultural industrial structure and the rise of characteristic ecotourism. This study discusses the modern agricultural science and technology demonstration garden guided by agricultural ecotourism, by analyzing the concept plan for the ten-thousand-mu coffee and nuts modern agricultural science and technology demonstration field in Huaqiaoba, Mangshi, Dehong, Yunnan. The plan respects the current state of the natural ecology, establishes a tourism theme image of "planting coffee trees and also drawing golden phoenixes" in view of a SWOT analysis of ecotourism, and puts forward a planning idea of "nature and ecology, culture and humanity, science and technology, and modernity". It expounds in particular the overall planning layout of "one axis, one circle, one nucleus, sixteen areas and twenty-four points" and the contents of the specific functional areas around three key functions: coffee planting demonstration, agricultural ecotourism, and industrial leisure vacation.

  10. Greenhouse gas (GHG) emission in organic farming. Approximate quantification of its generation at the organic garden of the School of Agricultural, Food and Biosystems Engineering (ETSIAAB) in the Technical University of Madrid (UPM)

    Science.gov (United States)

    Campos, Jorge; Barbado, Elena; Maldonado, Mariano; Andreu, Gemma; López de Fuentes, Pilar

    2016-04-01

    As is well known, agricultural soil fertilization increases the rate of production of greenhouse gas (GHG) emissions such as CO2, CH4 and N2O. The share of this activity in climate change is currently under study, as are the possibilities for mitigation. In this context, we considered it would be interesting to know what this share is in the case of organic farming. To this end, a field experiment was carried out at the organic garden of the School of Agricultural, Food and Biosystems Engineering (ETSIAAB) at the Technical University of Madrid (UPM). The orchard included different management growing areas, corresponding to different schools of organic farming. Soil and gas samples were taken from these different sites. Gas samples were collected throughout the growing season from an accumulated atmosphere inside static chambers inserted into the soil. The samples were then taken to the laboratory and analyzed. The results obtained give an approximate picture of how ecological fertilization contributes to air pollution from greenhouse gases.

  11. Analyzing Orientations

    Science.gov (United States)

    Ruggles, Clive L. N.

    Archaeoastronomical field survey typically involves the measurement of structural orientations (i.e., orientations along and between built structures) in relation to the visible landscape and particularly the surrounding horizon. This chapter focuses on the process of analyzing the astronomical potential of oriented structures, whether in the field or as a desktop appraisal, with the aim of establishing the archaeoastronomical "facts". It does not address questions of data selection (see instead Chap. 25, "Best Practice for Evaluating the Astronomical Significance of Archaeological Sites", 10.1007/978-1-4614-6141-8_25) or interpretation (see Chap. 24, "Nature and Analysis of Material Evidence Relevant to Archaeoastronomy", 10.1007/978-1-4614-6141-8_22). The main necessity is to determine the azimuth, horizon altitude, and declination in the direction "indicated" by any structural orientation. Normally, there are a range of possibilities, reflecting the various errors and uncertainties in estimating the intended (or, at least, the constructed) orientation, and in more formal approaches an attempt is made to assign a probability distribution extending over a spread of declinations. These probability distributions can then be cumulated in order to visualize and analyze the combined data from several orientations, so as to identify any consistent astronomical associations that can then be correlated with the declinations of particular astronomical objects or phenomena at any era in the past. The whole process raises various procedural and methodological issues and does not proceed in isolation from the consideration of corroborative data, which is essential in order to develop viable cultural interpretations.

  12. Bacterial diversity of a consortium degrading high-molecular-weight polycyclic aromatic hydrocarbons in a two-liquid phase biosystem.

    Science.gov (United States)

    Lafortune, Isabelle; Juteau, Pierre; Déziel, Eric; Lépine, François; Beaudet, Réjean; Villemur, Richard

    2009-04-01

    -five percent of clones and strains were affiliated to Alphaproteobacteria and Betaproteobacteria; among them, several were affiliated to bacterial species known for their PAH degradation activities, such as those belonging to the Sphingomonadaceae. Finally, three genes involved in the degradation of aromatic molecules were detected in the consortium and two in IAFILS9. This study provides information on the bacterial composition of a HMW PAH-degrading consortium and its dynamics in a TLP biosystem during PAH degradation.

  13. Human Performance and Biosystems

    Science.gov (United States)

    2013-03-08

    [Fragmentary report excerpts] ...to low-level exposures of photo-electromagnetic stimuli. Potential long-term benefits may include accelerated recovery from mental fatigue and... protocol in preparation. DoD benefit: effective and safe augmentation of warfighter cognitive capabilities under sleep-deprived (or other stressful) conditions. Microscopy of cells reveals lipid bodies (lipid, chlorophyll). Screening workflow: 1. Generate thousands of mutants. 2. Combine mutants into one mixed culture. 3. Measure...

  14. Laser influence to biosystems

    Directory of Open Access Journals (Sweden)

    Jevtić Sanja D.

    2015-01-01

    Full Text Available In this paper, continuous-wave (cw) lasers in the visible region were applied in order to study the influence of a quantum generator on certain plants. The aim of such projects is to analyse biostimulation processes in living organisms, which are linked to defined laser power density thresholds (exposure doses). The results of irradiating corn and wheat seeds using a He-Ne laser in the cw regime (632.8 nm, 50 mW) are presented and compared with results for other laser types. Dry and wet plant seeds were irradiated over defined time intervals, and germination was monitored day by day. Morphological data (stalk thickness, height, cob length) for the chosen plants were recorded. The recorded data for the whole vegetative period were subjected to appropriate statistical processing. One part of the experiments was the measurement of the reflection coefficient in the visible range. Correlation estimates were calculated and discussed for our results. The main conclusion was that there was a significant increase in plant height and an elongation of cob length for corn.

  15. Analysis of Internet of Things Technology and Its Applications for the Smart Grid

    Institute of Scientific and Technical Information of China (English)

    凌俊斌; 刘婷

    2014-01-01

    Applying Internet of Things (IoT) technology to the smart grid is an inevitable trend in the development of this technology. This paper briefly introduces the applications of IoT technology in the smart grid, as well as the construction of the basic application architecture of smart-grid-oriented IoT technology.

  16. (Environmental technology)

    Energy Technology Data Exchange (ETDEWEB)

    Boston, H.L.

    1990-10-12

    The traveler participated in a conference on environmental technology in Paris, sponsored by the US Embassy-Paris, US Environmental Protection Agency (EPA), the French Environmental Ministry, and others. The traveler sat on a panel for environmental aspects of energy technology and made a presentation on the potential contributions of Oak Ridge National Laboratory (ORNL) to a planned French-American Environmental Technologies Institute in Chattanooga, Tennessee, and Evry, France. This institute would provide opportunities for international cooperation on environmental issues and technology transfer related to environmental protection, monitoring, and restoration at US Department of Energy (DOE) facilities. The traveler also attended the Fourth International Conference on Environmental Contamination in Barcelona. Conference topics included environmental chemistry, land disposal of wastes, treatment of toxic wastes, micropollutants, trace organics, artificial radionuclides in the environment, and the use of biomonitoring and biosystems for environmental assessment. The traveler presented a paper on "The Fate of Radionuclides in Sewage Sludge Applied to Land." Those findings corresponded well with results from studies addressing the fate of fallout radionuclides from the Chernobyl nuclear accident. There was an exchange of new information on a number of topics of interest to DOE waste management and environmental restoration needs.

  17. A COGNITIVE AND MEMETIC SCIENCE APPROACH TO ANALYZE THE HUMAN FACTORS IN PREDICTING THE EVOLUTION OF PAPER TECHNOLOGY AND PRODUCTS IN THE 21ST CENTURY

    Institute of Scientific and Technical Information of China (English)

    Fumihiko ONABE

    2004-01-01

    Predicting the future of the paper industry is conventionally approached from technological and market-oriented aspects, as well as from the variety of constraints lying ahead of the industry, such as resource, energy, and environmental issues.

  18. Advances in hematology analyzers.

    Science.gov (United States)

    DeNicola, Dennis B

    2011-05-01

    The complete blood count is one of the basic building blocks of the minimum database in veterinary medicine. Over the past 20 years, there has been a tremendous advancement in the technology of hematology analyzers and their availability to the general practitioner. There are 4 basic methodologies that can be used to generate data for a complete blood count: manual methods, quantitative buffy coat analysis, automated impedance analysis, and flow cytometric analysis. This article will review the principles of these methodologies, discuss some of their advantages and disadvantages, and describe some of the hematology analyzers that are available for the in-house veterinary laboratory.

  19. Biochemical Technology Program progress report for the period January 1--June 30, 1976. [Centrifugal analyzers and advanced analytical systems for blood and body fluids

    Energy Technology Data Exchange (ETDEWEB)

    Mrochek, J.E.; Burtis, C.A.; Scott, C.D. (comps.)

    1976-09-01

    This document, which covers the period January 1-June 30, 1976, describes progress in the following areas: (1) advanced analytical techniques for the clinical laboratory, (2) fast clinical analyzers, (3) development of a miniaturized analytical clinical laboratory system, (4) centrifugal fast analyzers for animal toxicological studies, and (5) chemical profile of body fluids.

  20. Waste Not, Want Not: Analyzing the Economic and Environmental Viability of Waste-to-Energy (WTE) Technology for Site-Specific Optimization of Renewable Energy Options

    Energy Technology Data Exchange (ETDEWEB)

    Funk, K.; Milford, J.; Simpkins, T.

    2013-02-01

    Waste-to-energy (WTE) technology burns municipal solid waste (MSW) in an environmentally safe combustion system to generate electricity, provide district heat, and reduce the need for landfill disposal. While this technology has gained acceptance in Europe, it has yet to be commonly recognized as an option in the United States. Section 1 of this report provides an overview of WTE as a renewable energy technology and describes a high-level model developed to assess the feasibility of WTE at a site. Section 2 reviews results from previous life cycle assessment (LCA) studies of WTE, and then uses an LCA inventory tool to perform a screening-level analysis of cost, net energy production, greenhouse gas (GHG) emissions, and conventional air pollution impacts of WTE for residual MSW in Boulder, Colorado. Section 3 of this report describes the federal regulations that govern the permitting, monitoring, and operating practices of MSW combustors and provides emissions limits for WTE projects.

  1. Progress in Key Technologies of Automatic Biochemical Analyzers

    Institute of Scientific and Technical Information of China (English)

    王炜

    2010-01-01

    0 Introduction: The automatic biochemical analyzer (chemistry analyzer) is one of the most frequently used analytical instruments in clinical testing. It is mainly used to measure various biochemical indicators of serum, plasma or other body fluids, such as glucose, albumin, total protein, cholesterol, creatinine and transaminases.

  2. Analyzing the platelet proteome.

    Science.gov (United States)

    García, Angel; Zitzmann, Nicole; Watson, Steve P

    2004-08-01

    During the last 10 years, mass spectrometry (MS) has become a key tool for protein analysis and has underpinned the emerging field of proteomics. Using high-throughput tandem MS/MS following protein separation, it is potentially possible to analyze hundreds to thousands of proteins in a sample at a time. This technology can be used to analyze the protein content (i.e., the proteome) of any cell or tissue and complements the powerful field of genomics. The technology is particularly suitable for platelets because of the absence of a nucleus. Cellular proteins can be separated by either gel-based methods such as two-dimensional gel electrophoresis or one-dimensional sodium dodecyl sulfate polyacrylamide gel electrophoresis followed by liquid chromatography (LC)-MS/MS, or by multidimensional LC-MS/MS. Prefractionation techniques, such as subcellular fractionations or immunoprecipitations, can be used to improve the analysis. Each method has particular advantages and disadvantages. Proteomics can be used to compare the proteome of basal and diseased platelets, helping to reveal information on the molecular basis of the disease.

  3. Field Evaluation of MERCEM Mercury Emission Analyzer System at the Oak Ridge TSCA Incinerator East Tennessee Technology Park Oak Ridge, Tennessee

    Energy Technology Data Exchange (ETDEWEB)

    None

    2000-03-01

    The authors reached the following conclusions: (1) The two-month evaluation of the MERCEM total mercury monitor from Perkin Elmer provided a useful venue in determining the feasibility of using a CEM to measure total mercury in a saturated flue gas. (2) The MERCEM exhibited potential at a mixed waste incinerator to meet requirements proposed in PS12 under conditions of operation with liquid feeds only at stack mercury concentrations in the range of proposed MACT standards. (3) Performance of the MERCEM under conditions of incinerating solid and liquid wastes simultaneously was less reliable than while feeding liquid feeds only for the operating conditions and configuration of the host facility. (4) The permeation tube calibration method used in this test relied on the CEM internal volumetric and time constants to relate back to a concentration, whereas a compressed gas cylinder concentration is totally independent of the analyzer mass flowmeter and flowrates. (5) Mercury concentration in the compressed gas cylinders was fairly stable over a 5-month period. (6) The reliability of available reference materials was not fully demonstrated without further evaluation of their incorporation into routine operating procedures performed by facility personnel. (7) The degree of mercury control occurring in the TSCA Incinerator off-gas cleaning system could not be quantified from the data collected in this study. (8) It was possible to conduct the demonstration at a facility incinerating radioactively contaminated wastes and to release the equipment for later unrestricted use elsewhere. (9) Experience gained by this testing answered additional site-specific and general questions regarding the operation and maintenance of CEMs and their use in compliance monitoring of total mercury emissions from hazardous waste incinerators.

  4. Proteomic approaches to biomarker discovery in lung cancers by SELDI technology

    Institute of Scientific and Technical Information of China (English)

    肖雪媛; 卫秀平; 何大澄

    2003-01-01

    The purpose of the present work was to identify protein profiles that could be used to discover specific serum biomarkers and discriminate lung cancer. Thirty serum samples from patients with lung cancer (15 cases of primary bronchogenic carcinoma, 9 cases of metastatic lung cancer and 6 cases of lung cancer after chemotherapy) and twelve from healthy individuals were analyzed by SELDI (Surface-Enhanced Laser Desorption/Ionization) technology. Anion-exchange columns were used to fractionate the sera with 6 designated pH washing solutions. Two types of protein chip arrays, IMAC-Cu and WCX2, were employed. Protein chips were examined in a PBSII ProteinChip Reader (Ciphergen Biosystems Inc.), and the resulting profiles of cancer versus normal were analyzed with the Biomarker Wizard System. In total, 15 potential lung cancer biomarkers, of which 6 were up-regulated and 9 were down-regulated, were discovered in the serum samples from patients with lung cancer; 5 of these 15 biomarkers were detectable on both WCX2 and IMAC-Cu protein chips. The sensitivities provided by the individual markers ranged from 44.8% to 93.1%, and the specificities were 85.0%-94.4%. Our results suggest that serum is a capable resource for the detection of lung cancer with specific biomarkers. Moreover, the protein chip array system was shown to be a useful tool for the identification, as well as detection, of disease biomarkers in sera.

  5. COMPARING AND ANALYZING THE SIMILARITIES AND DIFFERENCES BETWEEN CPU HYPER-THREADING AND DUAL-CORE TECHNOLOGIES

    Institute of Scientific and Technical Information of China (English)

    林杰; 余建坤

    2011-01-01

    Hyper-threading and dual-core are two important technologies in the evolution of the CPU. Hyper-threading technology presents one physical processor as two "virtual" processors to reduce the idle time of the execution units and certain other resources, thus increasing CPU utilization. Dual-core technology encapsulates two physical processing cores in one CPU to improve the performance of programs. The paper describes the basic model of the CPU, analyzes the principles of hyper-threading and dual-core technology, and compares their similarities and differences from the three perspectives of system architecture, degree of parallelism, and efficiency gained.

  6. Portable Programmable Multifunction Body Fluids Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Advanced Liquid Logic proposes to develop a very capable analyzer based on its digital microfluidic technology. Such an analyzer would be:  Capable of both...

  7. Serum Protein Fingerprint of Patients with Pancreatic Cancer by SELDI Technology

    Institute of Scientific and Technical Information of China (English)

    MA Ning; GE Chun-lin; LUAN Feng-ming; YAO Dian-bo; HU Chao-jun; LI Ning; LIU Yong-feng

    2008-01-01

    Objective: To study the serum protein fingerprint of patients with pancreatic cancer and to screen for protein molecules closely related to pancreatic cancer during the onset and progression of the disease, using surface-enhanced laser desorption and ionization time-of-flight mass spectrometry (SELDI-TOF-MS). Methods: Serum samples were taken from 20 patients with pancreatic cancer, 20 healthy volunteers and 18 patients with other pancreatic diseases. WCX magnetic beads and a PBSII-C protein chip reader (Ciphergen Biosystems Inc.) were used. The protein fingerprint expression of all the serum samples and the resulting profiles of cancer versus normal were analyzed with the Biomarker Wizard system. Results: A group of proteomic peaks was detected. Four differentially expressed potential biomarkers were identified, with relative molecular weights of 5705 Da, 4935 Da, 5318 Da and 3243 Da. Among them, the two proteins with m/z 5705 and 5318 Da were down-regulated, and the two proteins with m/z 4935 and 3243 Da were up-regulated in pancreatic cancers. Conclusion: SELDI technology can be used to screen significant differentially expressed proteins in the serum of pancreatic cancer patients. These proteins could be specific serum biomarkers of patients with pancreatic cancer and have potential value for further investigation.

  8. Evolutionary Models for Simple Biosystems

    Science.gov (United States)

    Bagnoli, Franco

    The concept of evolutionary development of structures constituted a real revolution in biology: it made it possible to understand how the very complex structures of life can arise in an out-of-equilibrium system. The investigation of such systems has shown that, indeed, systems under a flux of energy or matter can self-organize into complex patterns; think, for instance, of Rayleigh-Bénard convection, Liesegang rings, or the patterns formed by granular systems under shear. Following this line, one could characterize life as a state of matter distinguished by the slow, continuous process that we call evolution. In this paper we try to identify the organizational levels of life, which span several orders of magnitude from the elementary constituents to whole ecosystems. Although similar structures can be found in other contexts, such as ideas (memes) in neural systems and self-replicating elements (computer viruses, worms, etc.) in computer systems, we shall concentrate on biological evolutionary structures and try to highlight the role and the emergence of network structure in such systems.

  9. Technology.

    Science.gov (United States)

    Online-Offline, 1998

    1998-01-01

    Focuses on technology, on advances in such areas as aeronautics, electronics, physics, the space sciences, as well as computers and the attendant progress in medicine, robotics, and artificial intelligence. Describes educational resources for elementary and middle school students, including Web sites, CD-ROMs and software, videotapes, books,…

  10. Remote Laser Diffraction PSD Analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Batcheller, Thomas Aquinas; Huestis, Gary Michael; Bolton, Steven Michael

    2000-06-01

    Particle size distribution (PSD) analyses of radioactive slurry samples were obtained using a modified "off-the-shelf" classical laser light scattering particle size analyzer. A Horiba Instruments Inc. Model LA-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a "hot cell" (gamma radiation) environment. The general details of the modifications to this analyzer are presented in this paper. This technology provides rapid and simple PSD analysis, especially in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries in this smaller range was not previously achievable, making this technology far superior to the traditional methods used before. Remote deployment and utilization of this technology is in an exploratory stage. The risk of malfunction in this radiation environment is countered by the gain of tremendously useful fundamental engineering data. Successful acquisition of these data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives.

  11. Performance analysis of OFDM-WDM optical transmission systems based on interleaved OFDM technology

    Institute of Scientific and Technical Information of China (English)

    梁有程; 陈海涛; 张豪杰

    2015-01-01

    Based on the principles of orthogonal frequency division multiplexing (OFDM) and wavelength division multiplexing (WDM), a model of an OFDM-WDM transmission system was designed, and its performance was analyzed theoretically. The common phase error and inter-carrier interference in the OFDM system caused by phase noise were analyzed using the Wiener phase noise model. On this basis, an inter-carrier interference suppression scheme based on interleaved OFDM technology was developed to reduce the impact of phase noise on system performance. Finally, the performance of the system model was simulated using the analytical results. The analytical and numerical results show that the interleaved OFDM technique reduces the effect of inter-carrier interference on OFDM system performance while improving the signal processing capability of the system.

  12. The Intermodulation Lockin Analyzer

    CERN Document Server

    Tholen, Erik A; Forchheimer, Daniel; Schuler, Vivien; Tholen, Mats O; Hutter, Carsten; Haviland, David B

    2011-01-01

    Nonlinear systems can be probed by driving them with two or more pure tones while measuring the intermodulation products of the drive tones in the response. We describe a digital lock-in analyzer which is designed explicitly for this purpose. The analyzer is implemented on a field-programmable gate array, providing speed in analysis, real-time feedback and stability in operation. The use of the analyzer is demonstrated for Intermodulation Atomic Force Microscopy. A generalization of the intermodulation spectral technique to arbitrary drive waveforms is discussed.
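    As a numerical toy of the two-tone idea (not of the FPGA implementation itself), one can drive a weak cubic nonlinearity with two pure tones and read out the response at the third-order intermodulation frequencies with a lock-in-style projection; all values below are invented for illustration:

        import numpy as np

        fs, T = 1_000_000, 0.01            # sample rate (Hz), record length (s)
        t = np.arange(0, T, 1 / fs)
        f1, f2 = 100_000, 102_000          # two pure drive tones (Hz)

        drive = np.cos(2 * np.pi * f1 * t) + np.cos(2 * np.pi * f2 * t)
        response = drive + 0.05 * drive**3  # weak cubic nonlinearity

        def amplitude(signal, f):
            """Lock-in style projection of the signal onto frequency f."""
            return 2 * np.abs(np.mean(signal * np.exp(-2j * np.pi * f * t)))

        # Drive tones and third-order intermodulation products
        for f in (f1, f2, 2 * f1 - f2, 2 * f2 - f1):
            print(f"{f / 1000:9.1f} kHz : {amplitude(response, f):.4f}")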

  13. Analog multivariate counting analyzers

    CERN Document Server

    Nikitin, A V; Armstrong, T P

    2003-01-01

    Characterizing rates of occurrence of various features of a signal is of great importance in numerous types of physical measurements. Such signal features can be defined as certain discrete coincidence events, e.g. crossings of a signal with a given threshold, or occurrence of extrema of a certain amplitude. We describe measuring rates of such events by means of analog multivariate counting analyzers. Given a continuous scalar or multicomponent (vector) input signal, an analog counting analyzer outputs a continuous signal with the instantaneous magnitude equal to the rate of occurrence of certain coincidence events. The analog nature of the proposed analyzers allows us to reformulate many problems of the traditional counting measurements, and cast them in a form which is readily addressed by methods of differential calculus rather than by algebraic or logical means of digital signal processing. Analog counting analyzers can be easily implemented in discrete or integrated electronic circuits, do not suffer fro...
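    A digital toy of the kind of coincidence event these analog analyzers count, here the rate of upward crossings of a threshold by a signal; the signal, threshold and sample rate are invented for illustration:

        import numpy as np

        rng = np.random.default_rng(1)
        fs = 1000.0                                       # samples per second
        x = np.cumsum(rng.normal(size=10_000)) * 0.01     # wandering signal

        threshold = 0.5
        above = x > threshold
        upward = np.count_nonzero(~above[:-1] & above[1:])  # upward crossings
        rate = upward / (len(x) / fs)                       # crossings per second
        print(f"{upward} upward crossings, {rate:.2f} per second")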

  14. Analyzing binding data.

    Science.gov (United States)

    Motulsky, Harvey J; Neubig, Richard R

    2010-07-01

    Measuring the rate and extent of radioligand binding provides information on the number of binding sites, and on the affinity and accessibility of these binding sites for various drugs. This unit explains how to design and analyze such experiments.
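    A minimal sketch of a typical saturation-binding analysis consistent with this description (the unit's own protocols are not reproduced): fit the one-site model Y = Bmax*X/(Kd + X) to invented radioligand data.

        import numpy as np
        from scipy.optimize import curve_fit

        def one_site(x, bmax, kd):
            """Specific binding at free ligand concentration x."""
            return bmax * x / (kd + x)

        conc  = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])     # nM free ligand
        bound = np.array([9.8, 24.0, 49.5, 73.9, 90.1, 96.0])  # fmol/mg bound

        (bmax, kd), _ = curve_fit(one_site, conc, bound, p0=(100.0, 1.0))
        print(f"Bmax = {bmax:.1f} fmol/mg, Kd = {kd:.2f} nM")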

  15. Analyzing in the Present

    DEFF Research Database (Denmark)

    Revsbæk, Line; Pedersen, Lene Tanggaard

    2015-01-01

    The article presents a notion of "analyzing in the present" as a source of inspiration in analyzing qualitative research materials. The term emerged from extensive listening to interview recordings during everyday commuting to university campus. Paying attention to the way different parts of various interviews conveyed diverse significance to the listening researcher at different times became a method of continuously opening up the empirical material in a reflexive, breakdown-oriented process of analysis. We argue that situating analysis in the present of analyzing emphasizes and acknowledges the interdependency between researcher and researched. On this basis, we advocate an explicit "open-state-of-mind" listening as a key aspect of analyzing qualitative material, often described only as a matter of reading transcribed empirical materials, reading theory, and writing. The article contributes

  16. Miniature mass analyzer

    CERN Document Server

    Cuna, C; Lupsa, N; Cuna, S; Tuzson, B

    2003-01-01

    The paper presents the concept of different mass analyzers that were specifically designed as small-dimension instruments able to detect the main environmental pollutants with great sensitivity and accuracy. Mass spectrometers are well-suited instruments for the chemical and isotopic analysis needed in environmental surveillance. Usually, this is done by sampling the soil, air or water followed by laboratory analysis. To avoid drawbacks caused by sample alteration during the sampling process and transport, 'in situ' analysis is preferred. Theoretically, any type of mass analyzer can be miniaturized, but some are more appropriate than others. Quadrupole mass filters and traps, magnetic sector, time-of-flight and ion cyclotron mass analyzers can be successfully shrunk; for each of them some performance is sacrificed, and we must know which parameters need to be kept unchanged. To satisfy the miniaturization criteria of the analyzer, it is necessary to use asymmetrical geometries, with ion beam obl...

  17. Analyzing Microarray Data.

    Science.gov (United States)

    Hung, Jui-Hung; Weng, Zhiping

    2017-03-01

    Because there is no widely used software for analyzing RNA-seq data that has a graphical user interface, this protocol provides an example of analyzing microarray data using Babelomics. This analysis entails performing quantile normalization and then detecting differentially expressed genes associated with the transgenesis of a human oncogene c-Myc in mice. Finally, hierarchical clustering is performed on the differentially expressed genes using the Cluster program, and the results are visualized using TreeView.
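
    Quantile normalization, the first step mentioned, forces every sample (column) to share the same empirical distribution. The sketch below shows the standard procedure in numpy; it is not Babelomics code, and ties are broken naively by sort order.

        import numpy as np

        def quantile_normalize(X):
            """Quantile-normalize the columns (samples) of a genes-x-samples matrix."""
            order = np.argsort(X, axis=0)           # per-sample sort order
            ranks = np.argsort(order, axis=0)       # rank of each value in its column
            ref = np.sort(X, axis=0).mean(axis=1)   # mean distribution across samples
            return ref[ranks]                       # give every column the reference

        X = np.array([[5.0, 4.0, 3.0],
                      [2.0, 1.0, 4.0],
                      [3.0, 4.0, 6.0],
                      [4.0, 2.0, 8.0]])
        print(quantile_normalize(X))                # every column now has equal quantiles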

  18. Analyzing the Alternatives

    Science.gov (United States)

    Grayson, Jennifer

    2010-01-01

    Technologies like solar, wind, and geothermal are exciting, but relatively new and untested in the context of universities, many of which are large enough to be cities unto themselves. And, like cities, universities count on a reliable, uninterruptible source of power. It is imperative, too, that precious dollars are wisely invested in the right…

  19. Fuel analyzer; Analisador de combustiveis

    Energy Technology Data Exchange (ETDEWEB)

    Cozzolino, Roberval [RS Motors, Indaiatuba, SP (Brazil)

    2008-07-01

    The current technology 'COMBUSTIMETRO' aims to examine fuel through the performance of an engine, since the role of the fuel is to produce energy for the combustion engine in an amount directly proportional to the quality and type of the fuel. The 'COMBUSTIMETRO' has an engine that always keeps the same air intake, fuel feed and fixed ignition point. Its operation is monitored by sensors (lambda probe, RPM and gas analyzer) connected to a processor that performs calculations, records the information, and generates reports and graphs. (author)

  20. Total organic carbon analyzer

    Science.gov (United States)

    Godec, Richard G.; Kosenka, Paul P.; Smith, Brian D.; Hutte, Richard S.; Webb, Johanna V.; Sauer, Richard L.

    The development and testing of a breadboard version of a highly sensitive total-organic-carbon (TOC) analyzer are reported. Attention is given to the system components including the CO2 sensor, oxidation reactor, acidification module, and the sample-inlet system. Research is reported for an experimental reagentless oxidation reactor, and good results are reported for linearity, sensitivity, and selectivity in the CO2 sensor. The TOC analyzer is developed with gravity-independent components and is designed for minimal additions of chemical reagents. The reagentless oxidation reactor is based on electrolysis and UV photolysis and is shown to be potentially useful. The stability of the breadboard instrument is shown to be good on a day-to-day basis, and the analyzer is capable of 5 sample analyses per day for a period of about 80 days. The instrument can provide accurate TOC and TIC measurements over a concentration range of 20 ppb to 50 ppm C.

  1. Analyzing radioligand binding data.

    Science.gov (United States)

    Motulsky, Harvey; Neubig, Richard

    2002-08-01

    Radioligand binding experiments are easy to perform, and provide useful data in many fields. They can be used to study receptor regulation, discover new drugs by screening for compounds that compete with high affinity for radioligand binding to a particular receptor, investigate receptor localization in different organs or regions using autoradiography, categorize receptor subtypes, and probe mechanisms of receptor signaling, via measurements of agonist binding and its regulation by ions, nucleotides, and other allosteric modulators. This unit reviews the theory of receptor binding and explains how to analyze experimental data. Since binding data are usually best analyzed using nonlinear regression, this unit also explains the principles of curve fitting with nonlinear regression.
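
    As a concrete instance of the nonlinear regression the unit describes, saturation binding to a single class of sites is commonly fit as B = Bmax*[L]/(Kd + [L]). The sketch below fits that model with scipy; the concentrations and binding values are invented for illustration.

        import numpy as np
        from scipy.optimize import curve_fit

        def one_site(L, Bmax, Kd):
            # Specific binding for a single class of sites
            return Bmax * L / (Kd + L)

        # Hypothetical radioligand concentrations (nM) and measured binding (fmol/mg)
        L = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])
        B = np.array([9.0, 24.0, 50.0, 80.0, 95.0, 105.0, 108.0])

        popt, pcov = curve_fit(one_site, L, B, p0=[100.0, 1.0])
        perr = np.sqrt(np.diag(pcov))       # asymptotic standard errors
        print(f"Bmax = {popt[0]:.1f} +/- {perr[0]:.1f} fmol/mg, "
              f"Kd = {popt[1]:.2f} +/- {perr[1]:.2f} nM")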

  2. Analyzing EUV mask costs

    Science.gov (United States)

    Lercel, Michael; Kasprowicz, Bryan

    2016-10-01

    The introduction of Extreme Ultraviolet Lithography (EUV) as a replacement for multiple patterning is based on improvements of cycle time, yield, and cost. Earlier cost studies have assumed a simple assumption that EUV masks (being more complex with the multilayer coated blank) are not more than three times as expensive as advanced ArFi (ArF immersion) masks. EUV masks are expected to be more expensive during the ramp of the technology because of the added cost of the complex mask blank, the use of EUV specific mask tools, and a ramp of yield learning relative to the more mature technologies. This study concludes that, within a range of scenarios, the hypothesis that EUV mask costs are not more than three times that of advanced ArFi masks is valid and conservative.

  3. Option-Game Approach to Analyze Technology Innovation Investment With Cost Asymmetry Under the Fuzzy Environment

    Institute of Scientific and Technical Information of China (English)

    谭英双; 衡爱民; 龙勇; 吴宏伟; 江礼梅

    2011-01-01

    Based on an asymmetric duopoly option-game model with investment cost asymmetry, this research treats the present value of profit flows and the sunk investment cost as trapezoidal fuzzy numbers and analyzes firms' technology innovation investment strategies. Fuzzy expressions for the followers' and leaders' investment values and investment thresholds under the fuzzy environment are constructed and analyzed numerically. The analysis shows that an optimal investment strategy still exists under the fuzzy environment: as the expected value of the trapezoidal fuzzy sunk investment cost increases, the investment value of the firm declines while the investment threshold rises. This offers one explanation of investment decisions under a fuzzy environment.
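
    The qualitative conclusion can be reproduced with a toy calculation. One common defuzzification in this literature is the Carlsson-Fuller possibilistic mean of a trapezoidal fuzzy number with support [a, d] and core [b, c]; whether the paper uses exactly this operator is an assumption, and the numbers below are invented.

        def possibilistic_mean(a, b, c, d):
            # Carlsson-Fuller possibilistic mean of the trapezoid (a, b, c, d)
            return (a + d) / 6.0 + (b + c) / 3.0

        profit_flow = (80.0, 100.0, 120.0, 140.0)   # fuzzy present value of profit flows
        sunk_cost = (40.0, 50.0, 60.0, 90.0)        # fuzzy sunk investment cost

        value = possibilistic_mean(*profit_flow) - possibilistic_mean(*sunk_cost)
        print(f"crisp investment value ~ {value:.1f}")
        # Raising or widening the sunk-cost trapezoid lowers this value, matching the
        # abstract's finding that the investment threshold rises with expected sunk cost.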

  4. Portable Fuel Quality Analyzer

    Science.gov (United States)

    2014-01-27

    other transportation industries, such as trucking. The PFQA could also be used in fuel blending operations performed at petroleum, ethanol and biodiesel plants.

  5. Analyzing Workforce Education. Monograph.

    Science.gov (United States)

    Texas Community & Technical Coll. Workforce Education Consortium.

    This monograph examines the issue of task analysis as used in workplace literacy programs, debating the need for it and how to perform it in a rapidly changing environment. Based on experiences of community colleges in Texas, the report analyzes ways that task analysis can be done and how to implement work force education programs more quickly.…

  6. Analyzing Stereotypes in Media.

    Science.gov (United States)

    Baker, Jackie

    1996-01-01

    A high school film teacher studied how students recognized messages in film, examining how film education could help students identify and analyze racial and gender stereotypes. Comparison of students' attitudes before and after the film course found that the course was successful in raising students' consciousness. (SM)

  7. Magnetoresistive emulsion analyzer.

    Science.gov (United States)

    Lin, Gungun; Baraban, Larysa; Han, Luyang; Karnaushenko, Daniil; Makarov, Denys; Cuniberti, Gianaurelio; Schmidt, Oliver G

    2013-01-01

    We realize a magnetoresistive emulsion analyzer capable of detection, multiparametric analysis and sorting of ferrofluid-containing nanoliter-droplets. The operation of the device in a cytometric mode provides high throughput and quantitative information about the dimensions and magnetic content of the emulsion. Our method offers important complementarity to conventional optical approaches involving ferrofluids, and paves the way to the development of novel compact tools for diagnostics and nanomedicine including drug design and screening.

  8. PhosphoSiteAnalyzer

    DEFF Research Database (Denmark)

    Bennetzen, Martin V; Cox, Jürgen; Mann, Matthias

    2012-01-01

    an algorithm to retrieve kinase predictions from the public NetworKIN webpage in a semiautomated way and applies hereafter advanced statistics to facilitate a user-tailored in-depth analysis of the phosphoproteomic data sets. The interface of the software provides a high degree of analytical flexibility and is designed to be intuitive for most users. PhosphoSiteAnalyzer is a freeware program available at http://phosphosite.sourceforge.net.

  9. IPv6 Protocol Analyzer

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    With the emergence of the next generation Internet protocol (IPv6), it is expected to replace the current version of the Internet protocol (IPv4), whose address space will be exhausted in the near future. Besides providing adequate address space, some other new features are included in the new 128-bit IP, such as IP auto configuration, quality of service, simple routing capability, security, mobility and multicasting. Current protocol analyzers are not able to handle IPv6 packets. This paper focuses on developing a protocol analyzer that decodes IPv6 packets. The IPv6 protocol analyzer is an application module which is able to decode an IPv6 packet and provide a detailed breakdown of the construction of the packet. It has to understand the detailed construction of IPv6 and provide a high-level abstraction of the bits and bytes of the IPv6 packet. It thus increases network administrators' understanding of a network protocol and helps them solve protocol-related problems in an IPv6 network environment.
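
    The core of such a decoder is a parse of the fixed 40-byte IPv6 header defined in RFC 8200. A minimal Python sketch (not the module described in the paper) is shown below, including a hand-built packet for testing.

        import socket
        import struct

        def decode_ipv6_header(packet: bytes) -> dict:
            """Decode the fixed 40-byte IPv6 header of a raw packet."""
            if len(packet) < 40:
                raise ValueError("truncated IPv6 packet")
            word0, payload_len, next_header, hop_limit = struct.unpack("!IHBB", packet[:8])
            return {
                "version": word0 >> 28,
                "traffic_class": (word0 >> 20) & 0xFF,
                "flow_label": word0 & 0xFFFFF,
                "payload_length": payload_len,
                "next_header": next_header,     # e.g. 6 = TCP, 17 = UDP, 58 = ICMPv6
                "hop_limit": hop_limit,
                "source": socket.inet_ntop(socket.AF_INET6, packet[8:24]),
                "destination": socket.inet_ntop(socket.AF_INET6, packet[24:40]),
            }

        # Hand-built test packet: version 6, empty UDP payload, hop limit 64, :: -> ::1
        hdr = struct.pack("!IHBB", 6 << 28, 0, 17, 64) + bytes(16) + bytes(15) + b"\x01"
        print(decode_ipv6_header(hdr))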

  10. Analyzing machine noise for real time maintenance

    Science.gov (United States)

    Yamato, Yoji; Fukumoto, Yoshifumi; Kumazaki, Hiroki

    2017-02-01

    Recently, IoT technologies have progressed and applications in the maintenance area are expected. However, IoT maintenance applications have not yet spread in Japan because sensing and analysis are one-off solutions for each case, the cost of collecting sensing data is high, and maintenance automation is insufficient. This paper proposes a maintenance platform which analyzes sound data at the edge, analyzes only anomaly data in the cloud, and orders maintenance automatically, to resolve these existing technology problems. We also implement a sample application and compare it with related work.

  11. Analyzing Chinese Financial Reporting

    Institute of Scientific and Technical Information of China (English)

    SABRINA; ZHANG

    2008-01-01

    If the world's capital markets could use a harmonized accounting framework, it would not be necessary to compare two or more sets of accounting standards. However, there is much to do before this becomes reality. This article aims to present a general overview of China's Generally Accepted Accounting Principles (GAAP), U.S. Generally Accepted Accounting Principles and International Financial Reporting Standards (IFRS), and to analyze the differences among IFRS, U.S. GAAP and China GAAP using fixed assets as an example.

  12. Analyzing business models

    DEFF Research Database (Denmark)

    Nielsen, Christian

    2014-01-01

    financial statement. Plumlee (2003) finds for instance that such information imposes significant costs on even expert users such as analysts and fund managers and reduces their use of it. Analysts' ability to incorporate complex information in their analyses is a decreasing function of its complexity, because the costs of processing and analyzing it exceed the benefits, indicating bounded rationality. Hutton (2002) concludes that the analyst community's inability to raise important questions on quality of management and the viability of its business model inevitably led to the Enron debacle. There seems...

  13. Mineral/Water Analyzer

    Science.gov (United States)

    1983-01-01

    An x-ray fluorescence spectrometer developed for the Viking Landers by Martin Marietta was modified for geological exploration, water quality monitoring, and aircraft engine maintenance. The aerospace system was highly miniaturized and used very little power. It irradiates the sample causing it to emit x-rays at various energies, then measures the energy levels for sample composition analysis. It was used in oceanographic applications and modified to identify element concentrations in ore samples, on site. The instrument can also analyze the chemical content of water, and detect the sudden development of excessive engine wear.

  14. Analyzing Aeroelasticity in Turbomachines

    Science.gov (United States)

    Reddy, T. S. R.; Srivastava, R.

    2003-01-01

    ASTROP2-LE is a computer program that predicts flutter and forced responses of blades, vanes, and other components of such turbomachines as fans, compressors, and turbines. ASTROP2-LE is based on the ASTROP2 program, developed previously for analysis of stability of turbomachinery components. In developing ASTROP2-LE, ASTROP2 was modified to include a capability for modeling forced responses. The program was also modified to add a capability for analysis of aeroelasticity with mistuning and unsteady aerodynamic solutions from another program, LINFLX2D, that solves the linearized Euler equations of unsteady two-dimensional flow. Using LINFLX2D to calculate unsteady aerodynamic loads, it is possible to analyze effects of transonic flow on flutter and forced response. ASTROP2-LE can be used to analyze subsonic, transonic, and supersonic aerodynamics and structural mistuning for rotors with blades of differing structural properties. It calculates the aerodynamic damping of a blade system operating in airflow so that stability can be assessed. The code also predicts the magnitudes and frequencies of the unsteady aerodynamic forces on the airfoils of a blade row from incoming wakes. This information can be used in high-cycle fatigue analysis to predict the fatigue lives of the blades.

  15. Field Deployable DNA analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Wheeler, E; Christian, A; Marion, J; Sorensen, K; Arroyo, E; Vrankovich, G; Hara, C; Nguyen, C

    2005-02-09

    This report details the feasibility of a field deployable DNA analyzer. Steps for swabbing cells from surfaces and extracting DNA in an automatable way are presented. Since enzymatic amplification reactions are highly sensitive to environmental contamination, sample preparation is a crucial step to make an autonomous deployable instrument. We perform sample clean up and concentration in a flow through packed bed. For small initial samples, whole genome amplification is performed in the packed bed resulting in enough product for subsequent PCR amplification. In addition to DNA, which can be used to identify a subject, protein is also left behind, the analysis of which can be used to determine exposure to certain substances, such as radionuclides. Our preparative step for DNA analysis left behind the protein complement as a waste stream; we determined to learn if the proteins themselves could be analyzed in a fieldable device. We successfully developed a two-step lateral flow assay for protein analysis and demonstrate a proof of principle assay.

  16. Analyzing architecture articles

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    In the present study, we describe the qualities, functions, and characteristics of architecture to help people comprehensively understand what architecture is. We also reveal the problems and conflicts found in population, land, water resources, pollution, energy, and the organizational systems of construction. China's economy is transforming. We should focus on cities, the architectural environment, energy conservation, emission reduction, and low-carbon output that will result in successful green development. We should macroscopically and microscopically analyze this development, from the natural environment to the artificial environment, and from the relationship between human beings and nature to the combination of social ecology in cities and farmlands. We must learn to develop and control them harmoniously and scientifically to provide a foundation for the methods used in architecture research.

  17. Analyzing geographic clustered response

    Energy Technology Data Exchange (ETDEWEB)

    Merrill, D.W.; Selvin, S.; Mohr, M.S.

    1991-08-01

    In the study of geographic disease clusters, an alternative to traditional methods based on rates is to analyze case locations on a transformed map in which population density is everywhere equal. Although the analyst's task is thereby simplified, the specification of the density equalizing map projection (DEMP) itself is not simple and continues to be the subject of considerable research. Here a new DEMP algorithm is described, which avoids some of the difficulties of earlier approaches. The new algorithm (a) avoids illegal overlapping of transformed polygons; (b) finds the unique solution that minimizes map distortion; (c) provides constant magnification over each map polygon; (d) defines a continuous transformation over the entire map domain; (e) defines an inverse transformation; (f) can accept optional constraints such as fixed boundaries; and (g) can use commercially supported minimization software. Work is continuing to improve computing efficiency and improve the algorithm. 21 refs., 15 figs., 2 tabs.

  18. PDA: Pooled DNA analyzer

    Directory of Open Access Journals (Sweden)

    Lin Chin-Yu

    2006-04-01

    Background: Association mapping using abundant single nucleotide polymorphisms is a powerful tool for identifying disease susceptibility genes for complex traits and exploring possible genetic diversity. Genotyping large numbers of SNPs individually is performed routinely but is cost prohibitive for large-scale genetic studies. DNA pooling is a reliable and cost-saving alternative genotyping method. However, no software has been developed for complete pooled-DNA analyses, including data standardization, allele frequency estimation, and single/multipoint DNA pooling association tests. This motivated the development of the software 'PDA' (Pooled DNA Analyzer) to analyze pooled DNA data. Results: We develop the software, PDA, for the analysis of pooled-DNA data. PDA is originally implemented with the MATLAB® language, but it can also be executed on a Windows system without installing MATLAB®. PDA provides estimates of the coefficient of preferential amplification and allele frequency. PDA considers an extended single-point association test, which can compare allele frequencies between two DNA pools constructed under different experimental conditions. Moreover, PDA also provides novel chromosome-wide multipoint association tests based on p-value combinations and a sliding-window concept. This new multipoint testing procedure overcomes a computational bottleneck of conventional haplotype-oriented multipoint methods in DNA pooling analyses and can handle data sets having a large pool size and/or large numbers of polymorphic markers. All of the PDA functions are illustrated in four bona fide examples. Conclusion: PDA is simple to operate and does not require that users have a strong statistical background. The software is available at http://www.ibms.sinica.edu.tw/%7Ecsjfann/first%20flow/pda.htm.
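
    The allele-frequency step can be sketched as follows: pooled peak heights are corrected by the coefficient of preferential amplification k, estimated from heterozygous individuals (whose height ratio should equal k). The formula below is the standard correction from the pooled-DNA literature; that PDA implements exactly this form is an assumption, and the peak heights are invented.

        def pooled_allele_freq(h_a: float, h_b: float, k: float) -> float:
            # Estimate of the allele-A frequency in the pool, corrected for the
            # coefficient of preferential amplification k
            return h_a / (h_a + k * h_b)

        print(pooled_allele_freq(h_a=1320.0, h_b=880.0, k=1.1))   # pool 1
        print(pooled_allele_freq(h_a=950.0, h_b=1400.0, k=1.1))   # pool 2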

  19. Analyzing Pseudophosphatase Function.

    Science.gov (United States)

    Hinton, Shantá D

    2016-01-01

    Pseudophosphatases regulate signal transduction cascades, but their mechanisms of action remain enigmatic. Reflecting this mystery, the prototypical pseudophosphatase STYX (phospho-serine-threonine/tyrosine-binding protein) was named with allusion to the river of the dead in Greek mythology to emphasize that these molecules are "dead" phosphatases. Although proteins with STYX domains do not catalyze dephosphorylation, this in no way precludes their having other functions as integral elements of signaling networks. Thus, understanding their roles in signaling pathways may mark them as potential novel drug targets. This chapter outlines common strategies used to characterize the functions of pseudophosphatases, using as an example MK-STYX [mitogen-activated protein kinase (MAPK) phospho-serine-threonine/tyrosine binding], which has been linked to tumorigenesis, apoptosis, and neuronal differentiation. We start with the importance of "restoring" (when possible) phosphatase activity in a pseudophosphatase so that the active mutant may be used as a comparison control throughout immunoprecipitation and mass spectrometry analyses. To this end, we provide protocols for site-directed mutagenesis, mammalian cell transfection, co-immunoprecipitation, phosphatase activity assays, and immunoblotting that we have used to investigate MK-STYX and the active mutant MK-STYXactive. We also highlight the importance of utilizing RNA interference (RNAi) "knockdown" technology to determine a cellular phenotype in various cell lines. Therefore, we outline our protocols for introducing short hairpin RNA (shRNA) expression plasmids into mammalian cells and quantifying knockdown of gene expression with real-time quantitative PCR (qPCR). A combination of cellular, molecular, biochemical, and proteomic techniques has served as a powerful set of tools for identifying novel functions of the pseudophosphatase MK-STYX. Likewise, the information provided here should be a helpful guide to elucidating the...

  20. TEAMS Model Analyzer

    Science.gov (United States)

    Tijidjian, Raffi P.

    2010-01-01

    The TEAMS model analyzer is a supporting tool developed to work with models created with TEAMS (Testability, Engineering, and Maintenance System), which was developed by QSI. In an effort to reduce the time spent in the manual process that each TEAMS modeler must perform in the preparation of reporting for model reviews, a new tool has been developed as an aid for models developed in TEAMS. The software allows for the viewing, reporting, and checking of TEAMS models that are checked into the TEAMS model database. The software allows the user to selectively view the model in a hierarchical tree outline that displays the components, failure modes, and ports. The reporting features allow the user to quickly gather statistics about the model and generate an input/output report for all of the components. Rules can be automatically validated against the model, with a report generated containing any resulting inconsistencies. In addition to reducing manual effort, this software also provides an automated process framework for the Verification and Validation (V&V) effort that will follow development of these models. The aid of such an automated tool would have a significant impact on the V&V process.

  1. Downhole Fluid Analyzer Development

    Energy Technology Data Exchange (ETDEWEB)

    Bill Turner

    2006-11-28

    A novel fiber optic downhole fluid analyzer has been developed for operation in production wells. This device will allow real-time determination of the oil, gas and water fractions of fluids from different zones in a multizone or multilateral completion environment. The device uses near infrared spectroscopy and induced fluorescence measurement to unambiguously determine the oil, water and gas concentrations at all but the highest water cuts. The only downhole components of the system are the fiber optic cable and windows. All of the active components--light sources, sensors, detection electronics and software--will be located at the surface, and will be able to operate multiple downhole probes. Laboratory testing has demonstrated that the sensor can accurately determine oil, water and gas fractions with a less than 5 percent standard error. Once installed in an intelligent completion, this sensor will give the operating company timely information about the fluids arising from various zones or multilaterals in a complex completion pattern, allowing informed decisions to be made on controlling production. The research and development tasks are discussed along with a market analysis.

  2. Analyzing Spacecraft Telecommunication Systems

    Science.gov (United States)

    Kordon, Mark; Hanks, David; Gladden, Roy; Wood, Eric

    2004-01-01

    Multi-Mission Telecom Analysis Tool (MMTAT) is a C-language computer program for analyzing proposed spacecraft telecommunication systems. MMTAT utilizes parameterized input and computational models that can be run on standard desktop computers to perform fast and accurate analyses of telecommunication links. MMTAT is easy to use and can easily be integrated with other software applications and run as part of almost any computational simulation. It is distributed as either a stand-alone application program with a graphical user interface or a linkable library with a well-defined set of application programming interface (API) calls. As a stand-alone program, MMTAT provides both textual and graphical output. The graphs make it possible to understand, quickly and easily, how telecommunication performance varies with variations in input parameters. A delimited text file that can be read by any spreadsheet program is generated at the end of each run. The API in the linkable-library form of MMTAT enables the user to control simulation software and to change parameters during a simulation run. Results can be retrieved either at the end of a run or by use of a function call at any time step.

  3. Bios data analyzer.

    Science.gov (United States)

    Sabelli, H; Sugerman, A; Kovacevic, L; Kauffman, L; Carlson-Sabelli, L; Patel, M; Konecki, J

    2005-10-01

    The Bios Data Analyzer (BDA) is a set of computer programs (CD-ROM, in Sabelli et al., Bios. A Study of Creation, 2005) for new time series analyses that detect and measure creative phenomena, namely diversification, novelty, complexes, and nonrandom complexity. We define a process as creative when its time series displays these properties. They are found in heartbeat interval series, the exemplar of bios, just as turbulence is the exemplar of chaos; in many other empirical series (galactic distributions, meteorological, economic and physiological series); in biotic series generated mathematically by bipolar feedback; and in stochastic noise, but not in chaotic attractors. Differencing, consecutive recurrence and partial autocorrelation indicate nonrandom causation, thereby distinguishing chaos and bios from random and random-walk processes. Embedding plots distinguish causal creative processes (e.g. bios) that include both simple and complex components of variation from stochastic processes (e.g. Brownian noise) that include only complex components, and from chaotic processes that decay from order to randomness as the number of dimensions is increased. Varying bin size and dimensionality shows that entropy measures symmetry and variety, and that complexity is associated with asymmetry. Trigonometric transformations measure coexisting opposites in time series and demonstrate bipolar, partial, and uncorrelated opposites in empirical processes and bios, supporting the hypothesis that bios is generated by bipolar feedback, a concept which is at variance with standard concepts of polar and complementary opposites.

  4. Transcriptome characteristics of Phyllotreta striolata (Fabricius) (Coleoptera: Chrysomelidae) analyzed by using Illumina's Solexa sequencing technology

    Institute of Scientific and Technical Information of China (English)

    贺华良; 宾淑英; 吴仲真; 林进添

    2012-01-01

    The striped flea beetle, Phyllotreta striolata (Fabricius), is an important pest damaging cruciferous vegetables. In order to investigate its gene expression profile and identify functional genes, we sequenced the transcriptome of P. striolata adults using Illumina's Solexa high-throughput sequencing technology and assembled and clustered the expressed sequence tag (EST) data with SOAPdenovo, mining a large number of functional genes. A total of 4,924 contigs were obtained, including 2,209 unigenes orthologous to Drosophila melanogaster protein genes and 610 unigenes specific to P. striolata. Annotation against the Gene Ontology (GO) database showed that most unigenes have binding capability or catalytic activity, and over a hundred unigenes clustered into important biological processes such as gametogenesis, gonad development and mating behavior. Annotation against the KEGG pathway database showed that 363 unigenes participate in 40 metabolic pathways; the discovery of genes related to circadian-clock regulation and plant secondary metabolite pathways will help elucidate the mechanisms underlying P. striolata behavior. As an important tool for insect functional genomics, Solexa high-throughput sequencing played a key role in mining functional genes of P. striolata and provides detailed genetic information for developing new control strategies at the molecular level.

  5. Analyzing Daqing's Urban Spatial Form Evolution: Based on the Technology of RS and GIS

    Institute of Scientific and Technical Information of China (English)

    王士君; 王若菊; 王永超; 刘成玉

    2012-01-01

    Taking Daqing, currently the largest oil industrial city of China, as the research object, this paper uses GIS technology to extract information such as the scale, shape, growth and location coordinates of urban construction land from Landsat MSS and TM satellite images from 1984, 1995 and 2007. Based on these data, the speed, intensity, compactness, fractal dimension and smart-growth level of urban sprawl are calculated so as to analyze the urban sprawl characteristics of Daqing since 1984. The paper also explores the causes of these characteristics using economic and social data and regional statistical analysis. Daqing's large amount of construction land, fast and spatially unbalanced sprawl, irregular shape, low compactness, and the directional consistency between urban sprawl and the movement of the urban center are characteristics common to the expansion and development of oil industrial cities in general, although these characteristics persist for only a short time. They are caused by the oil exploitation strategy, natural environmental constraints, the guidance of traffic infrastructure, the control of urban planning, and the transfer of central-place functions.

  6. Digital Microfluidics Sample Analyzer

    Science.gov (United States)

    Pollack, Michael G.; Srinivasan, Vijay; Eckhardt, Allen; Paik, Philip Y.; Sudarsan, Arjun; Shenderov, Alex; Hua, Zhishan; Pamula, Vamsee K.

    2010-01-01

    Three innovations address the needs of the medical world with regard to microfluidic manipulation and testing of physiological samples in ways that can benefit point-of-care needs for patients such as premature infants, for whom drawing blood for continuous tests can be life-threatening in its own right, and for expedited results. A chip with sample injection elements, reservoirs (and waste), droplet formation structures, fluidic pathways, mixing areas, and optical detection sites was fabricated to test the various components of the microfluidic platform, both individually and in integrated fashion. The droplet control system permits a user to control droplet microactuator system functions, such as droplet operations and detector operations. Also, the programming system allows a user to develop software routines for controlling droplet microactuator system functions, such as droplet operations and detector operations. A chip is incorporated into the system with a controller, a detector, input and output devices, and software. A novel filler fluid formulation is used for the transport of droplets with high protein concentrations. Novel assemblies for detection of photons from an on-chip droplet are present, as well as novel systems for conducting various assays, such as immunoassays and PCR (polymerase chain reaction). The lab-on-a-chip (a.k.a. lab-on-a-printed-circuit-board) processes physiological samples and comprises a system for automated, multi-analyte measurements using sub-microliter samples of human serum. The invention also relates to a diagnostic chip and system including the chip that performs many of the routine operations of a central lab-based chemistry analyzer, integrating, for example, colorimetric assays (e.g., for proteins), chemiluminescence/fluorescence assays (e.g., for enzymes, electrolytes, and gases), and/or conductometric assays (e.g., for hematocrit on plasma and whole blood) on a single chip platform.

  7. Soft Decision Analyzer

    Science.gov (United States)

    Steele, Glen; Lansdowne, Chatwin; Zucha, Joan; Schlensinger, Adam

    2013-01-01

    The Soft Decision Analyzer (SDA) is an instrument that combines hardware, firmware, and software to perform real-time, closed-loop, end-to-end statistical analysis of single- or dual-channel serial digital RF communications systems operating in very low signal-to-noise conditions. As an innovation, the unique SDA capabilities allow it to perform analysis of situations where the receiving communication system slips bits due to low signal-to-noise conditions or experiences constellation rotations resulting in channel polarity inversions or channel assignment swaps. The SDA's closed-loop detection allows it to instrument a live system and correlate observations with frame, codeword, and packet losses, as well as Quality of Service (QoS) and Quality of Experience (QoE) events. The SDA's abilities are not confined to performing analysis in low signal-to-noise conditions; its analysis provides in-depth insight into a communication system's receiver performance in a variety of operating conditions. The SDA incorporates two techniques for identifying slips. The first is an examination of the content of the received data stream relative to the transmitted data content, and the second is a direct examination of the receiver's recovered clock signals relative to a reference. Both techniques provide benefits in different ways and allow the communication engineer evaluating test results increased confidence in and understanding of receiver performance. Direct examination of data contents is performed by two different techniques, power correlation or a modified Massey correlation, and can be applied to soft decision data widths of 1 to 12 bits over a correlation depth ranging from 16 to 512 samples. The SDA detects receiver bit slips within a 4-bit window and can handle systems with up to four quadrants (QPSK, SQPSK, and BPSK systems). The SDA continuously monitors correlation results to characterize slips and quadrant changes and is capable of performing analysis even when the...
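
    The content-examination technique can be illustrated with hard decisions: slide the reference against the received stream over a small offset window and keep the offset with the highest correlation. This is a toy stand-in for the SDA's power and modified Massey correlators, which operate on soft-decision samples.

        import numpy as np

        def detect_slip(reference, received, max_slip=4):
            """Return the bit offset within +/- max_slip that best aligns the streams."""
            ref = 2.0 * np.asarray(reference, float) - 1.0   # bits {0,1} -> {-1,+1}
            rec = 2.0 * np.asarray(received, float) - 1.0
            window = slice(max_slip, len(rec) - max_slip)    # interior, safe for all shifts
            return max(
                range(-max_slip, max_slip + 1),
                key=lambda s: np.sum(ref[max_slip + s:len(ref) - max_slip + s] * rec[window]),
            )

        rng = np.random.default_rng(7)
        ref = rng.integers(0, 2, 256)
        rec = np.roll(ref, 2)            # simulate a 2-bit slip in the receiver
        print(detect_slip(ref, rec))     # prints -2: shift the reference back 2 bits to align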

  8. Systems Analyze Water Quality in Real Time

    Science.gov (United States)

    2010-01-01

    A water analyzer developed under Small Business Innovation Research (SBIR) contracts with Kennedy Space Center now monitors treatment processes at water and wastewater facilities around the world. Originally designed to provide real-time detection of nutrient levels in hydroponic solutions for growing plants in space, the ChemScan analyzer, produced by ASA Analytics Inc., of Waukesha, Wisconsin, utilizes spectrometry and chemometric algorithms to automatically analyze multiple parameters in the water treatment process with little need for maintenance, calibration, or operator intervention. The company has experienced a compound annual growth rate of 40 percent over its 15-year history as a direct result of the technology's success.

  9. Analyzing the Biology on the System Level

    OpenAIRE

    Tong, Wei

    2016-01-01

    Although various genome projects have provided us with enormous static sequence information, understanding of sophisticated biology continues to require integrating computational modeling, system analysis, technology development for experiments, and quantitative experiments, all together, to analyze the biological architecture on various levels, which is just the origin of the systems biology subject. This review discusses the object, its characteristics, and research attentions in systems biology,...

  10. Analyzing Languages for Specific Purposes Discourse

    Science.gov (United States)

    Bowles, Hugo

    2012-01-01

    In the last 20 years, technological advancement and increased multidisciplinarity has expanded the range of data regarded as within the scope of languages for specific purposes (LSP) research and the means by which they can be analyzed. As a result, the analytical work of LSP researchers has developed from a narrow focus on specialist terminology…

  11. Analyzing the Biology on the System Level

    Institute of Scientific and Technical Information of China (English)

    Wei Tong

    2004-01-01

    Although various genome projects have provided us with enormous static sequence information, understanding of sophisticated biology continues to require integrating computational modeling, system analysis, technology development for experiments, and quantitative experiments, all together, to analyze the biological architecture on various levels, which is just the origin of the systems biology subject. This review discusses the object, its characteristics, and research attentions in systems biology, and summarizes the analysis methods, experimental technologies, research developments, and so on in the four key fields of systems biology: systemic structures, dynamics, control methods, and design principles.

  12. Analyzing the Employment of Dental Technology Students from a Higher Vocational College: Taking the Dental Technology Program of Taizhou Polytechnic College as an Example

    Institute of Scientific and Technical Information of China (English)

    俞大力

    2014-01-01

    The denture processing industry needs a large number of technically skilled specialists; however, a high proportion of graduates of higher vocational dental technology programs take jobs outside the field or change careers. Based on a survey of previous graduates of the dental technology program at Taizhou Polytechnic College, this paper analyzes the employment situation and its causes, and suggests that the college increase professional publicity, strengthen entrance education, provide professional cognition practice, integrate professional ideological education throughout the teaching process, pay attention to career planning, promote students' employability, strengthen the construction of practice bases, and guide students to change their employment ideas, so as to raise the rate of employment within the field and reduce career changes.

  13. Soft Decision Analyzer and Method

    Science.gov (United States)

    Steele, Glen F. (Inventor); Lansdowne, Chatwin (Inventor); Zucha, Joan P. (Inventor); Schlesinger, Adam M. (Inventor)

    2016-01-01

    A soft decision analyzer system is operable to interconnect soft decision communication equipment and analyze its operation to detect symbol-wise alignment between a test data stream and a reference data stream in a variety of operating conditions.

  14. The information technology professionals and their personality analyzed by Rorschach technique

    Directory of Open Access Journals (Sweden)

    Seille Cristine Garcia Santos

    2005-12-01

    This article presents the outcome of a comparative study of the analysis capability, initiative, and human relationship skills of IT (information technology) managers versus IT systems analysts and programmers. Sixty-six IT professionals from nine companies and five IT departments with up to 150 employees, located in Porto Alegre and its metropolitan region, were surveyed. The Rorschach technique (Klopfer System) and a structured questionnaire were applied; the same questionnaire was answered by the immediate superior of each participant in the study. The results show that there is no significant difference (t-test and Pearson correlation) between IT managers and IT systems analysts and programmers regarding their analysis capability, initiative and human relationship skills. The operational professionals distinguish themselves from the managerial ones regarding the release of less controlled emotional reactions. The presence in both groups of Rorschach indicators of difficulties in interacting with other people is discussed.

  15. Development of pulse neutron coal analyzer

    Science.gov (United States)

    Jing, Shi-wei; Gu, De-shan; Qiao, Shuang; Liu, Yu-ren; Liu, Lin-mao

    2005-04-01

    This article introduces the development of a pulsed neutron coal analyzer based on pulsed fast-thermal neutron analysis technology at the Radiation Technology Institute of Northeast Normal University. A 14 MeV pulsed neutron generator, a bismuth germanate detector and a 4096-channel multichannel analyzer were applied in this system. The multiple linear regression method employed to process the data solved the problem of interference between multiple elements. The prototype (model MZ-MKFY) has been applied in the Changshan and Jilin power plants for about a year. The results of measuring the main parameters of coal, such as low calorific value, total moisture, ash content, volatile matter content, and sulfur content, with precision acceptable to the coal industry, are presented.
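
    The multiple-linear-regression step can be sketched directly: calibrate coefficients that map net gamma-peak counts to a lab-assayed coal parameter, letting the regression absorb inter-element interference. All numbers below are invented for illustration; the actual spectral processing of the MZ-MKFY is more involved.

        import numpy as np

        # Rows = calibration coal samples, columns = net gamma-peak counts
        counts = np.array([[1200.0, 430.0, 210.0],
                           [980.0, 510.0, 180.0],
                           [1100.0, 470.0, 260.0],
                           [1350.0, 390.0, 230.0],
                           [1050.0, 520.0, 200.0]])
        ash_pct = np.array([18.2, 23.5, 20.1, 15.9, 22.0])   # lab-assayed ash content

        # Least-squares fit of ash% = b0 + b1*c1 + b2*c2 + b3*c3
        A = np.column_stack([np.ones(len(ash_pct)), counts])
        coef, *_ = np.linalg.lstsq(A, ash_pct, rcond=None)
        print("coefficients:", np.round(coef, 4))
        print("predicted ash%:", np.round(A @ coef, 2))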

  16. Advancements in analyzing food quality

    Science.gov (United States)

    This editorial provides insight into investigations regarding advances in the application of technology to food quality. The discussion elaborates on the advantages of recent analytical technologies and techniques, along with their impact on food safety and the characterization of its...

  17. Transcriptome characteristics of Paspalum vaginatum analyzed with Illumina sequencing technology

    Institute of Scientific and Technical Information of China (English)

    贾新平; 叶晓青; 梁丽建; 邓衍明; 孙晓波; 佘建明

    2014-01-01

    The transcriptome of Paspalum vaginatum leaf was sequenced using an Illumina HiSeq 2000 platform, a new generation of high-throughput sequencing technology, to study expression profiles and to predict functional genes. In the target sample, a total of 47,520,544 reads containing 4,752,054,400 bp of sequence information were generated. A total of 81,220 unigenes containing 87,542,503 bp of sequence information were formed by initial sequence splicing, with an average read length of 1,077 bp. Unigene quality was assessed in several respects, such as length distribution, GC content and gene expression level; the sequencing data were of high quality and reliability. In total, 46,169 unigenes were annotated using BLAST searches against the Nr, Nt and SwissProt databases. All the assembled unigenes could be broadly divided by gene ontology into biological process, cellular component and molecular function categories with 48 branches, including metabolic process, binding and cellular process. The unigenes were further annotated based on COG categories, grouping them into 25 functional categories, and could be broadly divided into 112 classes according to their metabolic pathways, including the phenylalanine metabolism pathway, plant-pathogen interaction, plant hormone biosynthesis and signal transduction, flavonoid biosynthesis, terpenoid backbone biosynthesis, lipid metabolism, and RNA degradation. There were 22,721 SSRs in the 81,220 unigenes; among them, A/T was the most frequent repeat, followed by CCG/CGG and AGC/CTG. This study is the first comprehensive transcriptome analysis of Paspalum vaginatum, providing a valuable genomic data source for the molecular biology of this grass.

  18. Analyzing Valuation Practices through Contracts

    DEFF Research Database (Denmark)

    Tesnière, Germain; Labatut, Julie; Boxenbaum, Eva

    This paper seeks to analyze the most recent changes in how societies value animals. We analyze this topic through the prism of contracts between breeding companies and farmers. Focusing on new valuation practices and qualification of breeding animals, we question the evaluation of difficult...

  19. Analyzing data files in SWAN

    CERN Document Server

    Gajam, Niharika

    2016-01-01

    Traditionally, analyzing data happens via batch processing and interactive work on the terminal. The project aims to provide another way of analyzing data files: a cloud-based approach. It aims to make data analysis a productive and interactive process through the combination of FCC and SWAN software.

  20. ANALYZE Users' Guide

    Energy Technology Data Exchange (ETDEWEB)

    Azevedo, S.

    1982-10-01

    This report is a reproduction of the visuals that were used in the ANALYZE Users' Guide lectures of the videotaped LLNL Continuing Education Course CE2018-H, State Space Lectures. The course was given in Spring 1982 through the EE Department Education Office. Since ANALYZE is menu-driven, interactive, and has self-explanatory questions (sort of), these visuals and the two 50-minute videotapes are the only documentation which comes with the code. More information about the algorithms contained in ANALYZE can be obtained from the IEEE book on Programs for Digital Signal Processing.

  1. M-Learning and Technological Literacy: Analyzing Benefits for Apprenticeship

    Science.gov (United States)

    Cortés, Carlos Manuel Pacheco; Cortés, Adriana Margarita Pacheco

    2014-01-01

    The following study consists on comparative literature review conducted by several researchers and instructional designers; for a wide comprehension of Mobile-Learning (abbreviated "M-Learning") as an educational platform to provide "anytime-anywhere" access to interactions and resources on-line, and "Technological…

  2. Analyzing Technology Adoption - The Case of Kerala Home Gardens

    Directory of Open Access Journals (Sweden)

    Reeba Jacob

    2016-05-01

    Homegardens are a traditional agroforestry system with a unique structure and function, and the predominant farming system in Kerala. The study was undertaken in Thiruvananthapuram district covering a sample of 100 homegarden farmers from all five agro-ecological units, with the aim of assessing the level of adoption of selected Kerala Agricultural University (KAU) production practices in homegardens. Results of the study identified that the majority of farmers (63%) belonged to the medium level of adoption. The adoption quotient was worked out and compared with the standard Rogers curve. Correlation analysis of the independent variables with the dependent variable, viz., level of adoption, indicated that age, farming experience, knowledge, evaluative perception, mass media contribution, livestock possession and extension contribution had a direct significant effect on the level of adoption of KAU production practices by homegarden farmers.

  3. C2Analyzer: Co-target-Co-function Analyzer

    Institute of Scientific and Technical Information of China (English)

    Md Aftabuddin; Chittabrata Mal; Arindam Deb; Sudip Kundu

    2014-01-01

    MicroRNAs (miRNAs) interact with their target mRNAs and regulate biological processes at the post-transcriptional level. While one miRNA can target many mRNAs, a single mRNA can also be targeted by a set of miRNAs. The targeted mRNAs may be involved in different biological processes that are described by gene ontology (GO) terms. The major challenges involved in analyzing these multitude regulations include identification of the combinatorial regulation of miRNAs as well as determination of the co-functionally-enriched miRNA pairs. The C2Analyzer: Co-target-Co-function Analyzer is a Perl-based, versatile and user-friendly web tool with online instructions. Based on hypergeometric analysis, this novel tool can determine whether given pairs of miRNAs are co-functionally enriched. For a given set of GO term(s), it can also identify the set of miRNAs whose targets are enriched in the given GO term(s). Moreover, C2Analyzer can also identify the co-targeting miRNA pairs, their targets and the GO processes in which they are involved. The miRNA-miRNA co-functional relationship can also be saved as a .txt file, which can be used to further visualize the co-functional network using other software such as Cytoscape. C2Analyzer is freely available at www.bioinformatics.org/c2analyzer.

  4. Analyzing petabytes of data with Hadoop

    CERN Document Server

    CERN. Geneva

    2009-01-01

    The open source Apache Hadoop project provides a powerful suite of tools for storing and analyzing petabytes of data using commodity hardware. After several years of production use inside of web companies like Yahoo! and Facebook and nearly a year of commercial support and development by Cloudera, the technology is spreading rapidly through other disciplines, from financial services and government to life sciences and high energy physics. The talk will motivate the design of Hadoop and discuss some key implementation details in depth. It will also cover the major subprojects in the Hadoop ecosystem, go over some example applications, highlight best practices for deploying Hadoop in your environment, discuss plans for the future of the technology, and provide pointers to the many resources available for learning more. In addition to providing more information about the Hadoop platform, a major goal of this talk is to begin a dialogue with the ATLAS research team on how the tools commonly used in t...

  5. Remote Laser Diffraction Particle Size Distribution Analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Batcheller, Thomas Aquinas; Huestis, Gary Michael; Bolton, Steven Michael

    2001-03-01

    In support of a radioactive slurry sampling and physical characterization task, an "off-the-shelf" laser diffraction (classical light scattering) particle size analyzer was utilized for remote particle size distribution (PSD) analysis. Spent nuclear fuel was previously reprocessed at the Idaho Nuclear Technology and Engineering Center (INTEC, formerly recognized as the Idaho Chemical Processing Plant), which is on DOE's INEEL site. The acidic, radioactive aqueous raffinate streams from these processes were transferred to 300,000 gallon stainless steel storage vessels located in the INTEC Tank Farm area. Due to the transfer piping configuration in these vessels, complete removal of the liquid can not be achieved. Consequently, a "heel" slurry remains at the bottom of an "emptied" vessel. Particle size distribution characterization of the settled solids in this remaining heel slurry, as well as suspended solids in the tank liquid, is the goal of this remote PSD analyzer task. A Horiba Instruments Inc. Model LA-300 PSD analyzer, which has a 0.1 to 600 micron measurement range, was modified for remote application in a "hot cell" (gamma radiation) environment. This technology provides rapid and simple PSD analysis, especially in the fine and microscopic particle size regime. Particle size analysis of these radioactive slurries in this smaller range was not previously achievable, making this technology far superior to the traditional methods used. Successful acquisition of this data, in conjunction with other characterization analyses, provides important information that can be used in the myriad of potential radioactive waste management alternatives.

  6. The Photo-Pneumatic CO2 Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — We are proposing to build a new technology, the photo-pneumatic analyzer. It is small, solid-state, inexpensive, and appropriate for observations of atmospheric...

  7. Grid and Data Analyzing and Security

    Directory of Open Access Journals (Sweden)

    Fatemeh SHOKRI

    2012-12-01

    This paper examines the importance of secure structures in the process of analyzing and distributing information with the aid of Grid-based technologies. The advent of distributed networks has provided many practical opportunities for detecting and recording the time of events, and has prompted efforts to identify the events and to solve problems of storing information, such as keeping it up-to-date and documented. In this regard, the data distribution systems in a network environment should be accurate. As a consequence, a series of continuous and updated data must be at hand. In this case, Grid is the best answer for using the data and resources of organizations through common processing.

  8. ConvAn: a convergence analyzing tool for optimization of biochemical networks.

    Science.gov (United States)

    Kostromins, Andrejs; Mozga, Ivars; Stalidzans, Egils

    2012-01-01

    Dynamic models of biochemical networks are usually described as systems of nonlinear differential equations. When models are optimized for parameter estimation or for the design of new properties, mainly numerical methods are used. That causes problems of optimization predictability, as most numerical optimization methods have stochastic properties and the convergence of the objective function to the global optimum is hardly predictable. Determining a suitable optimization method and the necessary duration of optimization becomes critical when evaluating a high number of combinations of adjustable parameters or when working with large dynamic models. This task is complex due to the variety of optimization methods and software tools and the nonlinearity features of models in different parameter spaces. The software tool ConvAn is developed to analyze the statistical properties of convergence dynamics for optimization runs with a particular optimization method, model, software tool, set of optimization method parameters and number of adjustable parameters of the model. The convergence curves can be normalized automatically to enable comparison of different methods and models on the same scale. With the help of the biochemistry-adapted graphical user interface of ConvAn, it is possible to compare different optimization methods in terms of their ability to find the global optimum or values close to it, as well as the computational time necessary to reach them. It is possible to estimate optimization performance for different numbers of adjustable parameters. The functionality of ConvAn enables statistical assessment of the necessary optimization time depending on the required optimization accuracy. Optimization methods which are not suitable for a particular optimization task can be rejected if they have poor repeatability or convergence properties. The software ConvAn is freely available on www.biosystems.lv/convan.
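
    The normalization of convergence curves can be sketched as a simple rescaling of best-so-far objective traces to a common [0, 1] scale; the exact scheme ConvAn uses is an assumption, and the two synthetic runs below merely stand in for different optimization methods.

        import numpy as np

        def normalize_convergence(curves):
            """Rescale best-so-far objective traces (minimization) to a shared [0, 1]."""
            lo = min(c.min() for c in curves)
            hi = max(c.max() for c in curves)
            return [(c - lo) / (hi - lo) for c in curves]

        rng = np.random.default_rng(3)
        it = np.arange(200)
        run_a = np.minimum.accumulate(100 * np.exp(-0.05 * it) + rng.normal(0, 1, it.size))
        run_b = np.minimum.accumulate(100 * np.exp(-0.02 * it) + rng.normal(0, 1, it.size))
        for name, c in zip("AB", normalize_convergence([run_a, run_b])):
            print(f"method {name}: final normalized objective {c[-1]:.3f}")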

  9. On-Demand Urine Analyzer

    Science.gov (United States)

    Farquharson, Stuart; Inscore, Frank; Shende, Chetan

    2010-01-01

    A lab-on-a-chip was developed that is capable of extracting biochemical indicators from urine samples and generating their surface-enhanced Raman spectra (SERS) so that the indicators can be quantified and identified. The development was motivated by the need to monitor and assess the effects of extended weightlessness, which include space motion sickness and loss of bone and muscle mass. The results may lead to developments of effective exercise programs and drug regimens that would maintain astronaut health. The analyzer containing the lab-on-a-chip includes materials to extract 3-methylhistidine (a muscle-loss indicator) and Risedronate (a bone-loss indicator) from the urine sample and detect them at the required concentrations using a Raman analyzer. The lab-on-a-chip has both an extractive material and a SERS-active material. The analyzer could be used to monitor the onset of diseases, such as osteoporosis.

  10. Simulation of a Hyperbolic Field Energy Analyzer

    CERN Document Server

    Gonzalez-Lizardo, Angel

    2016-01-01

    Energy analyzers are important plasma diagnostic tools with applications in a broad range of disciplines, including molecular spectroscopy, electron microscopy, basic plasma physics, plasma etching, plasma processing, and ion sputtering technology. The Hyperbolic Field Energy Analyzer (HFEA) is a novel device able to determine ion and electron energy spectra and temperatures. The HFEA is well suited for ion temperature and density diagnostics in situations where ions are scarce. This work simulates the capacity of the HFEA to discriminate particles of a particular energy level, as well as to determine temperature and density. The electric field due to the combination of the conical elements, collimator lens, and Faraday cup applied voltage was computed on a suitable three-dimensional grid. The field is then used to compute the trajectories of a set of particles with a predetermined energy distribution. The results include the observation of the particle trajectories inside the sens...
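
    The trajectory step of such a simulation amounts to integrating the equation of motion of a charged particle through the precomputed field. A minimal Python sketch follows; the quadrupole-like analytic field is only a stand-in for the HFEA's actual gridded field, and all numbers are illustrative:

        import numpy as np

        Q_OVER_M = 9.58e7  # proton charge-to-mass ratio, C/kg

        def efield(pos, k=1e4):
            # Hyperbolic (quadrupole-like) stand-in field: E = k * (x, y, -2z).
            x, y, z = pos
            return k * np.array([x, y, -2.0 * z])

        def trace(pos, vel, dt=1e-9, steps=20000):
            """Leapfrog integration of dv/dt = (q/m) E(r)."""
            vel = vel + 0.5 * dt * Q_OVER_M * efield(pos)  # initial half kick
            path = [pos.copy()]
            for _ in range(steps):
                pos = pos + dt * vel                       # drift
                vel = vel + dt * Q_OVER_M * efield(pos)    # kick
                path.append(pos.copy())
            return np.array(path)

        # Launch a 10 eV proton slightly off axis and test whether it reaches a
        # hypothetical collector plane at z = 0.1 m (energy-discrimination test).
        v0 = np.sqrt(2 * 10 * 1.602e-19 / 1.673e-27)
        path = trace(np.array([1e-3, 0.0, 0.0]), np.array([0.0, 0.0, v0]))
        print("reached collector:", bool((path[:, 2] >= 0.1).any()))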

  11. Analyzing the Grammar of English

    CERN Document Server

    Teschner, Richard V

    2007-01-01

    Analyzing the Grammar of English offers a descriptive analysis of the indispensable elements of English grammar. Designed to be covered in one semester, this textbook starts from scratch and takes nothing for granted beyond a reading and speaking knowledge of English. Extensively revised to function better in skills-building classes, it includes more interspersed exercises that promptly test what is taught, simplified and clarified explanations, greatly expanded and more diverse activities, and a new glossary of over 200 technical terms. Analyzing the Grammar of English is the only English gram

  12. An update on chemistry analyzers.

    Science.gov (United States)

    Vap, L M; Mitzner, B

    1996-09-01

    This update of six chemistry analyzers available to the clinician discusses several points that should be considered prior to the purchase of equipment. General topics include how to best match an instrument to clinic needs and the indirect costs associated with instrument operation. Quality assurance recommendations are discussed and common terms are defined. Specific instrument features, principles of operation, performance, and costs are presented. The information provided offers potential purchasers an objective approach to the evaluation of a chemistry analyzer for the veterinary clinic.

  13. Climate Model Diagnostic Analyzer Web Service System

    Science.gov (United States)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Kubar, T. L.; Li, J.; Zhang, J.; Wang, W.

    2015-12-01

    Both the National Research Council Decadal Survey and the latest Intergovernmental Panel on Climate Change Assessment Report stressed the need for the comprehensive and innovative evaluation of climate models with the synergistic use of global satellite observations in order to improve our weather and climate simulation and prediction capabilities. The abundance of satellite observations for fundamental climate parameters and the availability of coordinated model outputs from CMIP5 for the same parameters offer a great opportunity to understand and diagnose model biases in climate models. In addition, the Obs4MIPs efforts have created several key global observational datasets that are readily usable for model evaluations. However, a model diagnostic evaluation process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally- and data-intensive. In response, we have developed a novel methodology to diagnose model biases in contemporary climate models and have implemented the methodology as a web-service based, cloud-enabled, provenance-supported climate-model evaluation system. The evaluation system is named Climate Model Diagnostic Analyzer (CMDA), which is the product of the research and technology development investments of several current and past NASA ROSES programs. The current technologies and infrastructure of CMDA are designed and selected to address several technical challenges that the Earth science modeling and model analysis community faces in evaluating and diagnosing climate models. In particular, we have three key technology components: (1) diagnostic analysis methodology; (2) web-service based, cloud-enabled technology; (3) provenance-supported technology. The diagnostic analysis methodology includes random forest feature importance ranking, conditional probability distribution function, conditional sampling, and time-lagged correlation map. We have implemented the
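
    Of the diagnostic methods listed, the random-forest feature-importance ranking is the easiest to illustrate. The Python sketch below assumes scikit-learn; the predictor names and synthetic data are invented for demonstration and are not CMDA's actual variables:

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(42)
        n = 2000
        # Hypothetical candidate predictors of a model bias, e.g. in cloud fraction:
        names = ["SST", "RH", "omega500", "wind"]
        X = rng.normal(size=(n, len(names)))
        y = 2.0 * X[:, 1] - 0.5 * X[:, 2] + rng.normal(scale=0.3, size=n)

        # Fit a forest and rank predictors by how much they explain the target.
        model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
        for name, imp in sorted(zip(names, model.feature_importances_),
                                key=lambda p: -p[1]):
            print(f"{name:10s} importance {imp:.3f}")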

  14. Strategies for Analyzing Tone Languages

    Science.gov (United States)

    Coupe, Alexander R.

    2014-01-01

    This paper outlines a method of auditory and acoustic analysis for determining the tonemes of a language starting from scratch, drawing on the author's experience of recording and analyzing tone languages of north-east India. The methodology is applied to a preliminary analysis of tone in the Thang dialect of Khiamniungan, a virtually undocumented…

  15. The Convertible Arbitrage Strategy Analyzed

    NARCIS (Netherlands)

    Loncarski, I.; Ter Horst, J.R.; Veld, C.H.

    2006-01-01

    This paper analyzes convertible bond arbitrage on the Canadian market for the period 1998 to 2004. Convertible bond arbitrage is the combination of a long position in convertible bonds and a short position in the underlying stocks. Convertible arbitrage has been one of the most successful strategies

  16. FORTRAN Static Source Code Analyzer

    Science.gov (United States)

    Merwarth, P.

    1984-01-01

    FORTRAN Static Source Code Analyzer program, SAP (DEC VAX version), automatically gathers statistics on occurrences of statements and structures within a FORTRAN program and provides reports of those statistics. Provisions are made for weighting each statistic and providing an overall figure of complexity.

  17. Analyzing Classroom Instruction in Reading.

    Science.gov (United States)

    Rutherford, William L.

    A method for analyzing instructional techniques employed during reading group instruction is reported, and the characteristics of the effective reading teacher are discussed. Teaching effectiveness is divided into two categories: (1) how the teacher acts and interacts with children on a personal level and (2) how the teacher performs his…

  18. Analyzing Software Piracy in Education.

    Science.gov (United States)

    Lesisko, Lee James

    This study analyzes the controversy of software piracy in education. It begins with a real world scenario that presents the setting and context of the problem. The legalities and background of software piracy are explained and true court cases are briefly examined. Discussion then focuses on explaining why individuals and organizations pirate…

  19. Introduction: why analyze single cells?

    Science.gov (United States)

    Di Carlo, Dino; Tse, Henry Tat Kwong; Gossett, Daniel R

    2012-01-01

    Powerful methods in molecular biology are abundant; however, in many fields including hematology, stem cell biology, tissue engineering, and cancer biology, data from tools and assays that analyze the average signals from many cells may not yield the desired result because the cells of interest may be in the minority (their behavior masked by the majority) or because the dynamics of the populations of interest are offset in time. Accurate characterization of samples with high cellular heterogeneity may only be achieved by analyzing single cells. In this chapter, we discuss the rationale for performing analyses on individual cells in more depth, cover the fields of study in which single-cell behavior is yielding new insights into biological and clinical questions, and speculate on how single-cell analysis will be critical in the future.

  20. Methods for Analyzing Social Media

    DEFF Research Database (Denmark)

    Jensen, Jakob Linaa; [Unknown], editors

    2013-01-01

    Social media is becoming increasingly attractive for users. It is a fast way to communicate ideas and a key source of information. It is therefore one of the most influential media of communication of our time and an important area for audience research. The growth of social media invites many new questions such as: How can we analyze social media? Can we use traditional audience research methods and apply them to online content? Which new research strategies have been developed? Which ethical research issues and controversies do we have to pay attention to? This book focuses on research strategies and methods for analyzing social media and will be of interest to researchers and practitioners using social media, as well as those wanting to keep up to date with the subject.

  1. Analyzing viewpoint diversity in twitter

    OpenAIRE

    2013-01-01

    Information diversity has a long tradition in human history. Recently there have been claims that the diversity of information available in social networks is diminishing. On the other hand, some studies suggest that diversity is actually quite high in social networks such as Twitter. However, these studies focus only on source diversity and only on American users. In this paper we analyze different dimensions of diversity. We also provide an experimental design in which ...

  2. Analyzing ion distributions around DNA.

    Science.gov (United States)

    Lavery, Richard; Maddocks, John H; Pasi, Marco; Zakrzewska, Krystyna

    2014-07-01

    We present a new method for analyzing ion, or molecule, distributions around helical nucleic acids and illustrate the approach by analyzing data derived from molecular dynamics simulations. The analysis is based on the use of curvilinear helicoidal coordinates and leads to highly localized ion densities compared to those obtained by simply superposing molecular dynamics snapshots in Cartesian space. The results identify highly populated and sequence-dependent regions where ions strongly interact with the nucleic acid and are coupled to its conformational fluctuations. The data from this approach are presented as ion populations or ion densities (in units of molarity) and can be analyzed in radial, angular and longitudinal coordinates using 1D or 2D graphics. It is also possible to regenerate 3D densities in Cartesian space. This approach makes it easy to understand and compare ion distributions and also allows the calculation of average ion populations in any desired zone surrounding a nucleic acid without requiring references to its constituent atoms. The method is illustrated using microsecond molecular dynamics simulations for two different DNA oligomers in the presence of 0.15 M potassium chloride. We discuss the results in terms of convergence, sequence-specific ion binding and coupling with DNA conformation.
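
    The core of the method is re-binning ion positions in coordinates tied to the helical axis. The Python sketch below is a simplification in which plain cylindrical shells around a straight axis stand in for the paper's curvilinear helicoidal coordinates; it shows how bin counts convert to molarity:

        import numpy as np

        AVOGADRO = 6.022e23

        def radial_ion_density(ion_xyz, z_min, z_max, r_max=15.0, n_bins=30,
                               n_frames=1):
            """Average ion molarity in cylindrical shells around the z axis;
            ion_xyz holds (N, 3) positions in Angstroms pooled over all frames."""
            r = np.hypot(ion_xyz[:, 0], ion_xyz[:, 1])
            in_slab = (ion_xyz[:, 2] >= z_min) & (ion_xyz[:, 2] <= z_max)
            counts, edges = np.histogram(r[in_slab], bins=n_bins, range=(0.0, r_max))
            shell_vol = np.pi * (edges[1:]**2 - edges[:-1]**2) * (z_max - z_min)
            vol_litres = shell_vol * 1e-27        # 1 cubic Angstrom = 1e-27 L
            molarity = counts / n_frames / (AVOGADRO * vol_litres)
            return 0.5 * (edges[:-1] + edges[1:]), molarity

        # Fake uniform positions as a stand-in for simulation snapshots; a real
        # 0.15 M trajectory should plateau near 0.15 far from the DNA surface.
        rng = np.random.default_rng(1)
        fake = rng.uniform([-15, -15, 0], [15, 15, 34], size=(5000, 3))
        centers, mol = radial_ion_density(fake, 0.0, 34.0, n_frames=100)
        print(np.round(mol, 3))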

  3. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach

    Science.gov (United States)

    Cheung, Mike W.-L.; Jak, Suzanne

    2016-01-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists—and probably the most crucial one—is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study. PMID:27242639

  4. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach.

    Science.gov (United States)

    Cheung, Mike W-L; Jak, Suzanne

    2016-01-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists (and probably the most crucial one) is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.

  5. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach

    Directory of Open Access Journals (Sweden)

    Mike W.-L. Cheung

    2016-05-01

    Full Text Available Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists – and probably the most crucial one – is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.
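
    The papers demonstrate the procedure in R; the following Python sketch renders the same split/analyze/meta-analyze idea for a simple correlation analysis, pooling per-split estimates with fixed-effect inverse-variance weights (all names and data here are illustrative):

        import numpy as np

        rng = np.random.default_rng(7)

        def meta_analyze_correlation(x, y, n_splits=10):
            """Split the data, estimate the correlation in each split, then pool
            the Fisher-z transformed estimates with inverse-variance weights."""
            chunks = np.array_split(rng.permutation(len(x)), n_splits)
            zs = np.array([np.arctanh(np.corrcoef(x[c], y[c])[0, 1]) for c in chunks])
            ws = np.array([len(c) - 3 for c in chunks])  # 1/Var(z) = n - 3
            pooled_z = np.sum(ws * zs) / np.sum(ws)
            se = 1.0 / np.sqrt(np.sum(ws))
            return np.tanh(pooled_z), np.tanh([pooled_z - 1.96 * se,
                                               pooled_z + 1.96 * se])

        # "Big" synthetic dataset with a true correlation of about 0.29.
        x = rng.normal(size=100_000)
        y = 0.3 * x + rng.normal(size=100_000)
        r, ci = meta_analyze_correlation(x, y)
        print(f"pooled r = {r:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")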

  6. Combining two technologies for full genome sequencing of human.

    Science.gov (United States)

    Skryabin, K G; Prokhortchouk, E B; Mazur, A M; Boulygina, E S; Tsygankova, S V; Nedoluzhko, A V; Rastorguev, S M; Matveev, V B; Chekanov, N N; D A, Goranskaya; Teslyuk, A B; Gruzdeva, N M; Velikhov, V E; Zaridze, D G; Kovalchuk, M V

    2009-10-01

    At present, new DNA sequencing technologies are developing rapidly, allowing quick and efficient characterisation of organisms at the level of genome structure. In this study, whole genome sequencing of a human (a Russian man) was performed using two technologies currently on the market: Sequencing by Oligonucleotide Ligation and Detection (SOLiD™) (Applied Biosystems) and sequencing of molecular clusters using fluorescently labeled precursors (Illumina). The generated data totaled 108.3 billion base pairs (60.2 billion from the Illumina technology and 48.1 billion from the SOLiD technology). Statistics performed on the reads generated by GAII and SOLiD showed that they covered 75% and 96% of the genome, respectively. Short polymorphic regions were detected with comparable accuracy; however, the absolute number revealed by SOLiD was several times smaller than that revealed by GAII. An optimal algorithm for using the latest sequencing methods was established for the analysis of individual human genomes. The study is the first Russian effort towards whole human genome sequencing.

  7. The Aqueduct Global Flood Analyzer

    Science.gov (United States)

    Iceland, Charles

    2015-04-01

    As population growth and economic growth take place, and as climate change accelerates, many regions across the globe are finding themselves increasingly vulnerable to flooding. A recent OECD study of the exposure of the world's large port cities to coastal flooding found that 40 million people were exposed to a 1 in 100 year coastal flood event in 2005, and the total value of exposed assets was about US$3,000 billion, or 5% of global GDP. By the 2070s, those numbers were estimated to increase to 150 million people and US$35,000 billion, or roughly 9% of projected global GDP. Impoverished people in developing countries are particularly at risk because they often live in flood-prone areas and lack the resources to respond. WRI and its Dutch partners - Deltares, IVM-VU University Amsterdam, Utrecht University, and PBL Netherlands Environmental Assessment Agency - are in the initial stages of developing a robust set of river flood and coastal storm surge risk measures that show the extent of flooding under a variety of scenarios (both current and future), together with the projected human and economic impacts of these flood scenarios. These flood risk data and information will be accessible via an online, easy-to-use Aqueduct Global Flood Analyzer. We will also investigate the viability, benefits, and costs of a wide array of flood risk reduction measures that could be implemented in a variety of geographic and socio-economic settings. Together, the activities we propose have the potential for saving hundreds of thousands of lives and strengthening the resiliency and security of many millions more, especially those who are most vulnerable. Mr. Iceland will present Version 1.0 of the Aqueduct Global Flood Analyzer and provide a preview of additional elements of the Analyzer to be released in the coming years.

  8. Method for analyzing microbial communities

    Science.gov (United States)

    Zhou, Jizhong [Oak Ridge, TN; Wu, Liyou [Oak Ridge, TN

    2010-07-20

    The present invention provides a method for quantitatively analyzing microbial genes, species, or strains in a sample that contains at least two species or strains of microorganisms. The method involves using an isothermal DNA polymerase to randomly and representatively amplify genomic DNA of the microorganisms in the sample, hybridizing the resultant polynucleotide amplification product to a polynucleotide microarray that can differentiate different genes, species, or strains of microorganisms of interest, and measuring hybridization signals on the microarray to quantify the genes, species, or strains of interest.

  9. Analyzing and mining image databases.

    Science.gov (United States)

    Berlage, Thomas

    2005-06-01

    Image mining is the application of computer-based techniques that extract and exploit information from large image sets to support human users in generating knowledge from these sources. This review focuses on biomedical applications, in particular automated imaging at the cellular level. An image database is an interactive software application that combines data management, image analysis and visual data mining. The main characteristic of such a system is a layer that represents objects within an image, and that represents a large spectrum of quantitative and semantic object features. The image analysis needs to be adapted to each particular experiment, so 'end-user programming' will be desirable to make the technology more widely applicable.

  10. Thermal and evolved gas analyzer

    Science.gov (United States)

    Williams, M. S.; Boynton, W. V.; James, R. L.; Verts, W. T.; Bailey, S. H.; Hamara, D. K.

    1998-01-01

    The Thermal and Evolved Gas Analyzer (TEGA) instrument will perform calorimetry and evolved gas analysis on soil samples collected from the Martian surface. TEGA is one of three instruments, along with a robotic arm, that form the Mars Volatile and Climate Survey (MVACS) payload. The other instruments are a stereo surface imager, built by Peter Smith of the University of Arizona, and a meteorological station, built by JPL. The MVACS lander will investigate a Martian landing site at approximately 70 deg south latitude. Launch will take place from Kennedy Space Center in January 1999. The TEGA project started in February 1996. In the intervening 24 months, a flight instrument concept has been designed, prototyped, built as an engineering model and a flight model, and tested. The instrument performs laboratory-quality differential-scanning calorimetry (DSC) over the temperature range of Mars ambient to 1400 K. Low-temperature volatiles (water and carbon dioxide ices) and the carbonates will be analyzed in this temperature range. Carbonates melt and evolve carbon dioxide at temperatures above 600 °C. Evolved oxygen (down to a concentration of 1 ppm) is detected, and CO2 and water vapor, together with their isotopic variations, are detected and their concentrations measured. The isotopic composition provides important tests of the theory of solar system formation.

  11. VOSA: A VO SED Analyzer

    Science.gov (United States)

    Rodrigo, C.; Bayo, A.; Solano, E.

    2017-03-01

    VOSA (VO Sed Analyzer, http://svo2.cab.inta-csic.es/theory/vosa) is a public web-tool developed by the Spanish Virtual Observatory (http://svo.cab.inta-csic.es/) and designed to help users to (1) build Spectral Energy Distributions (SEDs) combining private photometric measurements with data available in VO services, (2) obtain relevant properties of these objects (distance, extinction, etc.) from VO catalogs, (3) analyze them comparing observed photometry with synthetic photometry from different collections of theoretical models or observational templates, using different techniques (chi-square minimization, Bayesian analysis) to estimate physical parameters of the observed objects (teff, logg, metallicity, stellar radius/distance ratio, infrared excess, etc.), and use these results to (4) estimate masses and ages via interpolation of collections of isochrones and evolutionary tracks from the VO. In particular, VOSA offers the advantage of deriving physical parameters using all the available photometric information instead of a restricted subset of colors. The results can be downloaded in different formats or sent to other VO tools using SAMP. We have upgraded VOSA to provide access to Gaia photometry and give a homogeneous estimation of the physical parameters of thousands of objects at a time. This upgrade has required the implementation of a new computation paradigm, including a distributed environment, the capability of submitting and processing jobs in an asynchronous way, the use of parallelized computing to speed up processes (~ten times faster) and a new design of the web interface.
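
    The chi-square fitting step VOSA describes reduces to comparing observed photometry against each grid model with a free multiplicative scale factor (tied to the stellar radius/distance ratio). A minimal Python sketch follows, with an invented two-model grid standing in for the real collections of synthetic photometry:

        import numpy as np

        def best_fit(obs_flux, obs_err, model_grid):
            """model_grid maps parameter tuples, e.g. (teff, logg), to synthetic
            fluxes in the same bands as obs_flux; returns the chi^2-best entry."""
            best = None
            for params, model_flux in model_grid.items():
                w = 1.0 / obs_err**2
                # The scale minimizing chi^2 for this model has a closed form:
                scale = np.sum(w * obs_flux * model_flux) / np.sum(w * model_flux**2)
                chi2 = np.sum(w * (obs_flux - scale * model_flux) ** 2)
                if best is None or chi2 < best[0]:
                    best = (chi2, params, scale)
            return best

        # Toy two-model grid; a real grid holds thousands of theoretical spectra.
        obs = np.array([3.1, 2.2, 1.4])
        err = np.array([0.2, 0.15, 0.1])
        grid = {(5000, 4.5): np.array([3.0, 2.0, 1.2]),
                (6000, 4.0): np.array([4.0, 2.1, 0.9])}
        chi2, params, scale = best_fit(obs, err, grid)
        print(f"best (teff, logg) = {params}, scale = {scale:.3f}, chi2 = {chi2:.2f}")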

  12. Coaxial charged particle energy analyzer

    Science.gov (United States)

    Kelly, Michael A. (Inventor); Bryson, III, Charles E. (Inventor); Wu, Warren (Inventor)

    2011-01-01

    A non-dispersive electrostatic energy analyzer for electrons and other charged particles having a generally coaxial structure of a sequentially arranged sections of an electrostatic lens to focus the beam through an iris and preferably including an ellipsoidally shaped input grid for collimating a wide acceptance beam from a charged-particle source, an electrostatic high-pass filter including a planar exit grid, and an electrostatic low-pass filter. The low-pass filter is configured to reflect low-energy particles back towards a charged particle detector located within the low-pass filter. Each section comprises multiple tubular or conical electrodes arranged about the central axis. The voltages on the lens are scanned to place a selected energy band of the accepted beam at a selected energy at the iris. Voltages on the high-pass and low-pass filters remain substantially fixed during the scan.

  13. Compact Microwave Fourier Spectrum Analyzer

    Science.gov (United States)

    Savchenkov, Anatoliy; Matsko, Andrey; Strekalov, Dmitry

    2009-01-01

    A compact photonic microwave Fourier spectrum analyzer [a Fourier-transform microwave spectrometer (FTMWS)] with no moving parts has been proposed for use in remote sensing of weak, natural microwave emissions from the surfaces and atmospheres of planets, to enable remote analysis and determination of the chemical composition and abundances of critical molecular constituents in space. The instrument is based on Bessel-beam (light modes with non-zero angular momenta) fiber-optic elements. It features low power consumption, low mass, and high resolution, without a need for any cryogenics, beyond what is achievable by the current state of the art in space instruments. The instrument can also be used in a wide-band scatterometer mode in active radar systems.

  14. Miles Technicon H.2 automated hematology analyzer.

    Science.gov (United States)

    1992-11-01

    Automated hematology analyzers are used in all large hospitals and most commercial laboratories, as well as in most smaller hospitals and laboratories, to perform complete blood counts (including white blood cell, red blood cell, and platelet counts; hemoglobin concentration; and RBC indices) and white blood cell differential counts. Our objectives in this study are to provide user guidance for selecting, purchasing, and using an automated hematology analyzer, as well as to present an overview of the technology used in an automated five-part differential unit. Specifications for additional automated units are available in ECRI's Clinical Laboratory Product Comparison System. We evaluated the Miles Technicon H.2 unit and rated it Acceptable. The information in this Single Product Evaluation is also useful for purchasing other models; our criteria will guide users in assessing components, and our findings and discussions on some aspects of automated hematology testing are common to many available systems. We caution readers not to base purchasing decisions on our rating of the Miles unit alone, but on a thorough understanding of the issues surrounding automated hematology analyzers, which can be gained only by reading this report in its entirety. The willingness of manufacturers to cooperate in our studies and the knowledge they gain through participating lead to the development of better products. Readers should refer to the Guidance Section, "Selecting and Purchasing an Automated Hematology Analyzer," where we discuss factors such as standardization, training, human factors, manufacturer support, patient population, and special features that the laboratory must consider before obtaining any automated unit; we also provide an in-depth review of cost issues, including life-cycle cost analyses, acquisition methods and costs of hardware and supplies, and we describe the Hemacost and Hemexmpt cost worksheets for use with our PresValu and PSV Manager CAHDModel software

  15. Analyzing and mining automated imaging experiments.

    Science.gov (United States)

    Berlage, Thomas

    2007-04-01

    Image mining is the application of computer-based techniques that extract and exploit information from large image sets to support human users in generating knowledge from these sources. This review focuses on biomedical applications of this technique, in particular automated imaging at the cellular level. Due to increasing automation and the availability of integrated instruments, biomedical users are becoming increasingly confronted with the problem of analyzing such data. Image database applications need to combine data management, image analysis and visual data mining. The main point of such a system is a software layer that represents objects within an image and the ability to use a large spectrum of quantitative and symbolic object features. Image analysis needs to be adapted to each particular experiment; therefore, 'end user programming' will be desired to make the technology more widely applicable.

  16. Complex networks theory for analyzing metabolic networks

    Institute of Scientific and Technical Information of China (English)

    ZHAO Jing; YU Hong; LUO Jianhua; CAO Z.W.; LI Yixue

    2006-01-01

    One of the main tasks of post-genomic informatics is to systematically investigate all molecules and their interactions within a living cell so as to understand how these molecules, and the interactions between them, relate to the function of the organism, networks being an appropriate abstract description of all kinds of interactions. In the past few years, great progress has been made in developing the theory of complex networks and in revealing the organizing principles that govern the formation and evolution of various complex biological, technological and social networks. This paper reviews the accomplishments in constructing genome-based metabolic networks and describes how the theory of complex networks is applied to analyze them.
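
    The network statistics typically applied in this literature (degree distributions, path lengths, clustering) are simple to compute on a reaction graph. A toy Python sketch using networkx (an assumed dependency; the glycolysis-flavored edge list is invented for illustration):

        import networkx as nx
        from collections import Counter

        # Hypothetical metabolite graph: nodes are metabolites, an edge means
        # some reaction interconverts the two (toy data only).
        edges = [("glucose", "g6p"), ("g6p", "f6p"), ("f6p", "fbp"),
                 ("fbp", "g3p"), ("g3p", "pyruvate"), ("g6p", "6pg"),
                 ("6pg", "ru5p"), ("ru5p", "g3p")]
        G = nx.Graph(edges)

        # Degree distribution is the usual first check for scale-free structure.
        degree_counts = Counter(d for _, d in G.degree())
        print("degree distribution:", dict(sorted(degree_counts.items())))
        print("average shortest path:", round(nx.average_shortest_path_length(G), 2))
        print("average clustering:", nx.average_clustering(G))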

  17. The Tragedy of a Modern Prometheus: Analyzing the Sci-technological Alienation in Mary Shelley's Frankenstein

    Institute of Scientific and Technical Information of China (English)

    朱岩岩

    2015-01-01

    With the development of postmodern views on science and technology, sci-technological alienation is attracting increasing attention. In Mary Shelley's Frankenstein, a 19th-century British science fiction novel, the author depicts the unbelievable scientific experiment of a Prometheus-like scientist and the tragic fate brought on by his unfathomable endeavor. The novel reflects not only the fast pace of sci-technological development at the time, but also people's sincere reflections on the potential disasters that unbridled scientific development may incur. From the perspective of sci-technological alienation, this article analyzes Frankenstein's mania in his scientific pursuit, his blasphemy in imitating God by creating a man, his betrayal of humanity and morality in his experiments, and the tragic fate he finally suffers, interpreting the tragedy of a modern Prometheus and warning that unchecked scientific ambition may bring unexpected calamity to mankind.

  18. Analyzing and modeling heterogeneous behavior

    Science.gov (United States)

    Lin, Zhiting; Wu, Xiaoqing; He, Dongyue; Zhu, Qiang; Ni, Jixiang

    2016-05-01

    Recently, it was pointed out that non-Poisson statistics with heavy tails exist in many scenarios of human behavior, but most of these studies claimed that power laws characterized diverse aspects of human mobility patterns. In this paper, we suggest that human behavior may not be driven by identical mechanisms and can be modeled as a Semi-Markov Modulated Process. To verify our suggestion and model, we analyzed a total of 1,619,934 records of library visitations (including undergraduate and graduate students). It is found that the distribution of visitation intervals is well fitted by three sections of lines, instead of the traditional power-law distribution, in log-log scale. The results confirm that some human behaviors cannot be simply expressed as a power law or any other simple function. At the same time, we divided the data into groups and extracted periodic bursty events. Through careful analysis of the different groups, we drew the conclusion that aggregate behavior might be composed of heterogeneous behaviors, and that even behaviors of the same type tended to differ in different periods. The aggregate behavior is supposed to be formed by "heterogeneous groups". We performed a series of experiments. Simulation results showed that a two-state Semi-Markov Modulated Process suffices to construct a proper representation of heterogeneous behavior.
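
    A minimal generator of the kind of process the paper proposes is sketched below in Python; the two dwell-time distributions and the switching probability are invented for demonstration, not fitted to the library data:

        import numpy as np

        rng = np.random.default_rng(3)

        def simulate_intervals(n_events=10000, p_switch=0.05):
            """Two-state semi-Markov modulated event generator: state 0 ('busy')
            draws short exponential gaps, state 1 ('idle') long lognormal gaps,
            and the hidden state itself switches as a Markov chain."""
            state, intervals = 0, []
            for _ in range(n_events):
                if state == 0:
                    intervals.append(rng.exponential(1.0))   # invented time unit
                else:
                    intervals.append(rng.lognormal(2.0, 1.0))
                if rng.random() < p_switch:
                    state = 1 - state
            return np.array(intervals)

        # A mixture like this yields an interval distribution that a single
        # power law fits poorly, consistent with piecewise log-log behavior.
        gaps = simulate_intervals()
        for q in (50, 90, 99):
            print(f"{q}th percentile gap: {np.percentile(gaps, q):.2f}")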

  19. Thomson parabola ion energy analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Cobble, James A [Los Alamos National Laboratory; Flippo, Kirk A [Los Alamos National Laboratory; Letzring, Samuel A [Los Alamos National Laboratory; Lopez, Frank E [Los Alamos National Laboratory; Offermann, Dustin T [Los Alamos National Laboratory; Oertel, John A [Los Alamos National Laboratory; Mastrosimone, Dino [UNIV OF ROCHESTER

    2010-01-01

    A new, versatile Thomson parabola ion energy (TPIE) analyzer has been designed and constructed for use at the OMEGA-EP facility. Multi-MeV ions from EP targets are transmitted through a W pinhole into a (5- or 8-kG) magnetic field and subsequently through a parallel electric field of up to 30 kV/cm. The ion drift region may have a user-selected length of 10, 50, or 80 cm. With the highest fields, 500-MeV C6+ and C5+ may be resolved. TPIE is TIM-mounted at OMEGA-EP and is qualified in all existing TIMs. The instrument runs on pressure-interlocked 15-VDC power available in EP TIM carts. It may be inserted to within several inches of the target to attain sufficient flux for a measurement. For additional flux control, the user may select a square-aperture W pinhole of 0.004-inch or 0.010-inch. The detector consists of CR-39 backed by an image plate. The fully relativistic design code and design features are discussed. Ion spectral results from first use at OMEGA-EP are expected.

  20. Analyzing Agricultural Agglomeration in China

    Directory of Open Access Journals (Sweden)

    Erling Li

    2017-02-01

    Full Text Available There has been little scholarly research on Chinese agriculture’s geographic pattern of agglomeration and its evolutionary mechanisms, which are essential to sustainable development in China. By calculating the barycenter coordinates, the Gini coefficient, spatial autocorrelation and specialization indices for 11 crops during 1981–2012, we analyze the evolutionary pattern and mechanisms of agricultural agglomeration. We argue that the degree of spatial concentration of Chinese planting has been gradually increasing and that regional specialization and diversification have progressively been strengthened. Furthermore, Chinese crop production is moving from the eastern provinces to the central and western provinces. This is in contrast to Chinese manufacturing growth, which has continued to be concentrated in the coastal and southeastern regions. In Northeast China, the Sanjiang and Songnen plains have become agricultural clustering regions, and the earlier domination of aquaculture and rice production in Southeast China has gradually decreased. In summary, this paper provides a political economy framework for understanding the regionalization of Chinese agriculture, focusing on the interaction among objectives, decision-making behavior, path dependencies and spatial effects.
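
    Two of the concentration measures used, the Gini coefficient and the production barycenter, are simple to compute. A Python sketch with invented provincial data (coordinates and outputs are illustrative only):

        import numpy as np

        def gini(values):
            """Gini coefficient of non-negative outputs (0 = evenly spread
            across provinces, 1 = all production in one province)."""
            v = np.sort(np.asarray(values, dtype=float))
            n = v.size
            cum = np.cumsum(v)
            return (n + 1 - 2 * np.sum(cum) / cum[-1]) / n

        def barycenter(lons, lats, weights):
            """Production-weighted mean location; its drift across years traces
            the westward movement described above."""
            return (np.average(lons, weights=weights),
                    np.average(lats, weights=weights))

        # Toy data for four hypothetical provinces (lon, lat, output):
        lons = [116.4, 113.6, 104.1, 87.6]
        lats = [39.9, 34.8, 30.7, 43.8]
        output_1981 = [500.0, 400.0, 150.0, 50.0]
        output_2012 = [300.0, 350.0, 400.0, 250.0]
        print("Gini 1981:", round(gini(output_1981), 3),
              "Gini 2012:", round(gini(output_2012), 3))
        print("barycenter 1981:", barycenter(lons, lats, output_1981))
        print("barycenter 2012:", barycenter(lons, lats, output_2012))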

  1. Geospatial Technology

    Science.gov (United States)

    Reed, Philip A.; Ritz, John

    2004-01-01

    Geospatial technology refers to a system that is used to acquire, store, analyze, and output data in two or three dimensions. This data is referenced to the earth by some type of coordinate system, such as a map projection. Geospatial systems include thematic mapping, the Global Positioning System (GPS), remote sensing (RS), telemetry, and…

  2. Objects in Films: analyzing signs

    Directory of Open Access Journals (Sweden)

    GAMBARATO, Renira Rampazzo

    2009-12-01

    Full Text Available The focus of this essay is the analysis of everyday objects as signs in films. Objects from everyday life acquire several functions in films: they can be used solely as scene objects or to support a particular film style. Other objects are specially chosen to translate a character’s interior state of mind or the filmmaker’s aesthetic or ethical commitment to narrative concepts. In order to understand such functions and commitments, we developed a methodology for film analysis which focuses on the objects. Object interpretation as the starting point of film analysis is not a new approach. For instance, the French film critic André Bazin proposed the use of object interpretation in the 1950s. Similarly, the German film theorist Siegfried Kracauer advocated it in the 1960s. However, there is currently no analytical model to use when engaging in object interpretation in film. This methodology searches for the most representative objects in films through both quantitative and qualitative analysis: we consider the number of times each object appears in a film (quantitative analysis) as well as the context of its appearance, i.e. the type of shot used and how that creates a larger or smaller relevance and/or expressiveness (qualitative analysis). In addition to the criteria of relevance and expressiveness, we also analyze the functionality of an object by exploring details and specifying the roles various objects play in films. This research was developed at Concordia University, Montreal, Canada and was supported by Foreign Affairs and International Trade Canada (DFAIT).

  3. Analyzing endocrine system conservation and evolution.

    Science.gov (United States)

    Bonett, Ronald M

    2016-08-01

    Analyzing variation in rates of evolution can provide important insights into the factors that constrain trait evolution, as well as those that promote diversification. Metazoan endocrine systems exhibit apparent variation in evolutionary rates of their constituent components at multiple levels, yet relatively few studies have quantified these patterns and analyzed them in a phylogenetic context. This may be in part due to historical and current data limitations for many endocrine components and taxonomic groups. However, recent technological advancements such as high-throughput sequencing provide the opportunity to collect large-scale comparative data sets for even non-model species. Such ventures will produce a fertile data landscape for evolutionary analyses of nucleic acid and amino acid based endocrine components. Here I summarize evolutionary rate analyses that can be applied to categorical and continuous endocrine traits, and also those for nucleic acid and protein-based components. I emphasize analyses that could be used to test whether other variables (e.g., ecology, ontogenetic timing of expression, etc.) are related to patterns of rate variation and endocrine component diversification. The application of phylogenetic-based rate analyses to comparative endocrine data will greatly enhance our understanding of the factors that have shaped endocrine system evolution.

  4. Usage of data mining for analyzing customer mindset

    Directory of Open Access Journals (Sweden)

    Priti Sadaria

    2012-09-01

    Full Text Available As this is the era of information technology, no field remains untouched by computer science. Technology has become an integral part of the business process. By implementing different data mining techniques and algorithms on feedback collected from customers, the data can be analyzed. With the help of this analyzed information we have a clear idea of the customers' mindset and can take meaningful decisions about the production and marketing of a particular product. To study customer mindset, different models such as classification and association models are used in data mining.

  5. Analyzing 5 years of EC-TEL proceedings

    NARCIS (Netherlands)

    Reinhardt, Wolfgang; Meier, Christian; Drachsler, Hendrik; Sloep, Peter

    2011-01-01

    Reinhardt, W., Meier, C., Drachsler, H., & Sloep, P. B. (2011). Analyzing 5 years of EC-TEL proceedings. In C. D. Kloos, D. Gillet, R. M. Crespo García, F. Wild, & M. Wolpers (Eds.), Towards Ubiquitous Learning: 6th European Conference of Technology Enhanced Learning, EC-TEL 2011 (pp. 531-536). Sept

  6. Oxygen analyzers: failure rates and life spans of galvanic cells.

    Science.gov (United States)

    Meyer, R M

    1990-07-01

    Competing technologies exist for measuring oxygen concentrations in breathing circuits. Over a 4-year period, two types of oxygen analyzers were studied prospectively in routine clinical use to determine the incidence and nature of malfunctions. Newer AC-powered galvanic analyzers (North American Dräger O2med) were compared with older, battery-powered polarographic analyzers (Ohmeda 201) by recording all failures and necessary repairs. The AC-powered galvanic analyzer had a significantly lower incidence of failures (0.12 +/- 0.04 failures per machine-month) than the battery-powered polarographic analyzer (4.0 +/- 0.3 failures per machine-month). Disposable capsules containing the active galvanic cells lasted 12 +/- 7 months. Although the galvanic analyzers tended to remain out of service longer, awaiting the arrival of costly parts, the polarographic analyzers were more expensive to keep operating when calculations included the cost of time spent on repairs. Stocking galvanic capsules would have decreased the amount of time the galvanic analyzers were out of service, while increasing costs. In conclusion, galvanic oxygen analyzers appear capable of delivering more reliable service at a lower overall cost. By keeping the galvanic capsules exposed to room air during periods of storage, it should be possible to prolong their life span, further decreasing the cost of using them. In addition, recognizing the aberrations in their performance that warn of the exhaustion of the galvanic cells should permit timely recording and minimize downtime.

  7. [Health technology in Mexico].

    Science.gov (United States)

    Cruz, C; Faba, G; Martuscelli, J

    1992-01-01

    The features of the health technology cycle are presented, and the effects of the demographic, epidemiologic and economic transition on the health technology demand in Mexico are discussed. The main problems of science and technology in the context of a decreasing scientific and technological activity due to the economic crisis and the adjustment policies are also analyzed: administrative and planning problems, low impact of scientific production, limitations of the Mexican private sector, and the obstacles for technology assessment. Finally, this paper also discusses the main support strategies for science and technology implemented by the Mexican government during the 1980s and the challenges and opportunities that lie ahead.

  8. A method for analyzing the regulating performance of excitation systems based on synchronized phasor measurement technology and its system realization

    Institute of Scientific and Technical Information of China (English)

    王波; 陆进军

    2012-01-01

    A method for analyzing and evaluating the dynamic regulating performance of an excitation system, using dynamic electrical data of the generator unit and its excitation system acquired by phasor measurement units (PMU), is proposed based on synchronized phasor measurement technology. Combined with engineering processing of the relevant excitation system performance parameters, the method calculates and analyzes the main excitation system performance indexes by detecting and extracting the course of a generator unit disturbance and its excitation system response. The complete software system functions are designed and realized on the basis of a WAMS system structure. The description of the method and its practical project applications show that, compared with conventional analysis methods, it offers online analysis, offline research, simplicity, practicality and convenience, and that its computed results can be taken as an important reference for evaluating the dynamic regulating performance of an excitation system.

  9. Technology Empowered and Collaborative Learning: Analyzing Undergraduate Education Transformations in American Public Colleges and Universities Triggered by the Red Balloon Project

    Institute of Scientific and Technical Information of China (English)

    祝智庭; 陈丹

    2013-01-01

    The Red Balloon Project is a program for transforming undergraduate education in American public colleges and universities, launched by the American Association of State Colleges and Universities. Influenced by connectivism, social constructivism and related theories, and driven by applications of information technology, networks and social media, the project has created a nationwide collaboration model that provides ideas and solutions for the challenges and difficulties facing undergraduate education in public universities, thereby triggering profound changes in those institutions. This paper explains the changes initiated by the Red Balloon Project, analyzes practical cases of university transformation and the similarities and differences between this and other educational reforms, and summarizes the ideas, characteristics and implications of the undergraduate education transformation the project has brought about.

  10. Social Media: A Phenomenon to be Analyzed

    OpenAIRE

    danah boyd

    2015-01-01

    The phenomenon of “social media” has more to do with its cultural positioning than its technological affordances. Rooted in the broader “Web 2.0” landscape, social media helped engineers, entrepreneurs, and everyday people reimagine the role that technology could play in information dissemination, community development, and communication. While the technologies invoked by the phrase social media have a long history, what unfolded in the 2000s reconfigured socio-technical practices in signific...

  11. An implementation of Apertium based Assamese morphological analyzer

    OpenAIRE

    Rahman, Mirzanur; Sarma, Shikhar Kumar

    2015-01-01

    Morphological analysis is an important branch of linguistics for any natural language processing technology. Morphology studies the structure and formation of the words of a language. In the current NLP research scenario, morphological analysis techniques are becoming more popular day by day. Before any language can be processed, the morphology of its words must be analyzed. The Assamese language has a very complex morphological structure. In our work we have used an Apertium-based Finite-State-Transduce...

  12. Analyzing the attributes of Indiana's STEM schools

    Science.gov (United States)

    Eltz, Jeremy

    "Primary and secondary schools do not seem able to produce enough students with the interest, motivation, knowledge, and skills they will need to compete and prosper in the emerging world" (National Academy of Sciences [NAS], 2007a, p. 94). This quote indicated that there are changing expectations for today's students which have ultimately led to new models of education, such as charters, online and blended programs, career and technical centers, and for the purposes of this research, STEM schools. STEM education as defined in this study is a non-traditional model of teaching and learning intended to "equip them [students] with critical thinking, problem solving, creative and collaborative skills, and ultimately establishes connections between the school, work place, community and the global economy" (Science Foundation Arizona, 2014, p. 1). Focusing on science, technology, engineering, and math (STEM) education is believed by many educational stakeholders to be the solution for the deficits many students hold as they move on to college and careers. The National Governors Association (NGA; 2011) believes that building STEM skills in the nation's students will lead to the ability to compete globally with a new workforce that has the capacity to innovate and will in turn spur economic growth. In order to accomplish the STEM model of education, a group of educators and business leaders from Indiana developed a comprehensive plan for STEM education as an option for schools to use in order to close this gap. This plan has been promoted by the Indiana Department of Education (IDOE, 2014a) with the goal of increasing STEM schools throughout Indiana. To determine what Indiana's elementary STEM schools are doing, this study analyzed two of the elementary schools that were certified STEM by the IDOE. This qualitative case study described the findings and themes from two elementary STEM schools. Specifically, the research looked at the vital components to accomplish STEM

  13. Analyzing the User Behavior toward Electronic Commerce Stimuli.

    Science.gov (United States)

    Lorenzo-Romero, Carlota; Alarcón-Del-Amo, María-Del-Carmen; Gómez-Borja, Miguel-Ángel

    2016-01-01

    Based on the Stimulus-Organism-Response paradigm, this research analyzes the main differences between the effects of two types of web technologies: verbal web technology (i.e., navigational structure as utilitarian stimulus) versus non-verbal web technology (music and presentation of products as hedonic stimuli). Specific webmosphere stimuli have not yet been examined as separate variables, and their impact on internal and behavioral responses remains unknown. Therefore, the objective of this research is to analyze the impact of these web technologies, which constitute the web atmosphere or webmosphere of a website, on shopping behavior (i.e., users' internal states: affective, cognitive, and satisfaction; and behavioral responses: approach responses and real shopping outcomes) within a computer-generated retail online store, taking into account some mediator variables (i.e., involvement, atmospheric responsiveness, and perceived risk). A 2 ("free" versus "hierarchical" navigational structure) × 2 ("on" versus "off" music) × 2 ("moving" versus "static" images) between-subjects computer experimental design is used to test this research empirically. In addition, an integrated methodology was developed allowing the simulation, tracking and recording of virtual user behavior within an online shopping environment. As a main conclusion, this study suggests that the positive responses of online consumers might increase when they are allowed to freely navigate the online stores and their experience is enriched by animate gifts and background music. The effect caused by the mediator variables relatively modifies the final shopping behavior.

  14. Analyzing the user behavior towards Electronic Commerce stimuli

    Directory of Open Access Journals (Sweden)

    Carlota Lorenzo-Romero

    2016-11-01

    Full Text Available Based on the Stimulus-Organism-Response paradigm, this research analyzes the main differences between the effects of two types of web technologies: verbal web technology (i.e., navigational structure as utilitarian stimulus) versus non-verbal web technology (music and presentation of products as hedonic stimuli). Specific webmosphere stimuli have not yet been examined as separate variables, and their impact on internal and behavioral responses remains unknown. Therefore, the objective of this research is to analyze the impact of these web technologies, which constitute the web atmosphere or webmosphere of a website, on shopping behavior (i.e., users' internal states: affective, cognitive, and satisfaction; and behavioral responses: approach responses and real shopping outcomes) within the retail online store created by computer, taking into account some mediator variables (i.e., involvement, atmospheric responsiveness, and perceived risk). A 2 (free versus hierarchical navigational structure) × 2 (on versus off music) × 2 (moving versus static images) between-subjects computer experimental design is used to test this research empirically. In addition, an integrated methodology was developed allowing the simulation, tracking and recording of virtual user behavior within an online shopping environment. As a main conclusion, this study suggests that the positive responses of online consumers might increase when they are allowed to freely navigate the online stores and their experience is enriched by animate gifts and background music. The effect caused by the mediator variables relatively modifies the final shopping behavior.

  15. Macrofoundation for Strategic Technology Management

    DEFF Research Database (Denmark)

    Pedersen, Jørgen Lindgaard

    1995-01-01

    Neoclassical mainstream economics has no perspective on strategic technology management issues. Market-failure economics (externalities, etc.) can be of some use in analyzing problems relevant to strategic management where technology plays a part. Environment, inequality and democratic...

  16. Social Media: A Phenomenon to be Analyzed

    Directory of Open Access Journals (Sweden)

    danah boyd

    2015-04-01

    Full Text Available The phenomenon of “social media” has more to do with its cultural positioning than its technological affordances. Rooted in the broader “Web 2.0” landscape, social media helped engineers, entrepreneurs, and everyday people reimagine the role that technology could play in information dissemination, community development, and communication. While the technologies invoked by the phrase social media have a long history, what unfolded in the 2000s reconfigured socio-technical practices in significant ways. Reflecting on the brief history of social media, this essay argues for the need to better understand this phenomenon.

  17. Electrical spectrum & network analyzers a practical approach

    CERN Document Server

    Helfrick, Albert D

    1991-01-01

    This book presents the fundamentals and the latest techniques of electrical spectrum analysis. It focuses on the instruments and techniques used in spectrum and network analysis, rather than on theory. The book covers the use of spectrum analyzers, tracking generators, and network analyzers. Filled with practical examples, it presents techniques that are widely used in signal processing and communications applications, yet are difficult to find in most literature. Key features: numerous practical examples, including actual spectrum analyzer circuits; instruction on how to us

  18. Using Three-Dimensional Fluorescence Spectrum Technology to Analyze the Effects of Natural Dissolved Organic Matter on Pesticide Residues in the Soil

    Institute of Scientific and Technical Information of China (English)

    雷宏军; 潘红卫; 韩宇平; 刘鑫; 徐建新

    2015-01-01

    The behavior of pesticides in soil is influenced by dissolved organic matter (DOM) through competitive adsorption, adsorption, solubilization, accelerated degradation, and so on. Thus DOM and its components play an important role in the environmental risk to the soil ecosystem and the groundwater environment. Currently, most studies have focused on the short-term effects of high concentrations of DOM on pesticide residues. However, soil DOM is mainly present at low levels, so there is practical significance in probing the environmental behavior of soil pesticides under natural levels of DOM. A site investigation was therefore conducted in farmland with a long-term history of pesticide application. Using three-dimensional excitation-emission fluorescence matrix (3D-EEM) technology together with the fluorescence regional integration (FRI) quantitative method, the long-term effects of low concentrations of natural DOM on pesticide residues were analyzed. Results showed that: (1) the long-term effects of the natural DOM components on the environmental behavior of most soil organochlorine pesticides were not significant, except for a few pesticides such as γ-HCH and p,p'-DDE. (2) The influence of DOM components varied with the type of pesticide. The content of the tyrosine component showed a significantly negative correlation (p<0.05) with the concentrations of γ-HCH and p,p'-DDE; the by-products of microbial degradation in the DOM components correlated significantly and positively (p<0.05) with the concentration of heptachlor; and the content of the active humus component of humic acid in the DOM correlated significantly and positively (p<0.05) with the concentration of heptachlor epoxide. These results suggest that the distribution of different types of pesticide residues in the soil is influenced by different components at different levels of significance. (3

  19. Designing of Acousto-optic Spectrum Analyzer

    Institute of Scientific and Technical Information of China (English)

    WANG Dan-zhi; SHAO Ding-rong; LI Shu-jian

    2004-01-01

    The structure of the acousto-optic spectrum analyzer was investigated, including the RF amplifying circuit, the optical structures, and the post-processing circuit, and a modular design approach was applied to the spectrum analyzer. The modular spectrum analyzer offers stable performance and higher reliability, and different modules can be used according to different demands. The spectrum analyzer achieved a detection frequency error of 0.58 MHz, a detection responsivity of 90 dBm, and a bandwidth of 50 MHz.

  20. ADAM: Analyzer for Dialectal Arabic Morphology

    Directory of Open Access Journals (Sweden)

    Wael Salloum

    2014-12-01

    Full Text Available While Modern Standard Arabic (MSA has many resources, Arabic Dialects, the primarily spoken local varieties of Arabic, are quite impoverished in this regard. In this article, we present ADAM (Analyzer for Dialectal Arabic Morphology. ADAM is a poor man’s solution to quickly develop morphological analyzers for dialectal Arabic. ADAM has roughly half the out-of-vocabulary rate of a state-of-the-art MSA analyzer and is comparable in its recall performance to an Egyptian dialectal morphological analyzer that took years and expensive resources to build.

  1. Computational models for analyzing lipoprotein profiles

    NARCIS (Netherlands)

    Graaf, A.A. de; Schalkwijk, D.B. van

    2011-01-01

    At present, several measurement technologies are available for generating highly detailed concentration-size profiles of lipoproteins, offering increased diagnostic potential. Computational models are useful in aiding the interpretation of these complex datasets and making the data more accessible for…

  2. Statistical network analysis for analyzing policy networks

    DEFF Research Database (Denmark)

    Robins, Garry; Lewis, Jenny; Wang, Peng

    2012-01-01

    To analyze social network data using standard statistical approaches is to risk incorrect inference. The dependencies among observations implied in a network conceptualization undermine standard assumptions of the usual general linear models. One of the most quickly expanding areas of social...... has much to offer in analyzing the policy process....

  3. Performance evaluation of PL-11 platelet analyzer

    Institute of Scientific and Technical Information of China (English)

    张有涛

    2013-01-01

    Objective To evaluate and report the performance of the PL-11 platelet analyzer. Methods Intravenous blood samples anticoagulated with EDTA-K2 and sodium citrate were tested by the PL-11 platelet analyzer to evaluate the intra-assay and inter-assay coefficients of variation (CV)…

  4. Analyzing metabolomics-based challenge tests

    NARCIS (Netherlands)

    Vis, D.J.; Westerhuis, J.A.; Jacobs, D.M.; Duynhoven, van J.P.M.; Wopereis, S.; Ommen, van B.; Hendriks, M.M.W.B.; Smilde, A.K.

    2015-01-01

    Challenge tests are used to assess the resilience of human beings to perturbations by analyzing responses to detect functional abnormalities. Well known examples are allergy tests and glucose tolerance tests. Increasingly, metabolomics analysis of blood or serum samples is used to analyze the biological…

  5. An Alternative for Industrial Arts: Communication Technology.

    Science.gov (United States)

    Maughan, George R., Jr.; Ritz, John M.

    1978-01-01

    Presents a rationale for including the study of communication technology as a part of the general education process in industrial arts. Analyzes communication technology and suggests methods of implementing the technology in industrial arts. (CSS)

  6. On the Methods of Technology Translation

    Institute of Scientific and Technical Information of China (English)

    尹雁

    2007-01-01

    Technical articles are characterized by clarity, accuracy, conciseness, and rigor; their translation therefore demands particular care. This paper analyzes the methods commonly used in translating technical articles.

  7. ANALYZING OF MULTICOMPONENT UNDERSAMPLED SIGNALS BY HAF

    Institute of Scientific and Technical Information of China (English)

    Tao Ran; Shan Tao; Zhou Siyong; Wang Yue

    2001-01-01

    The phenomenon of frequency ambiguity may appear in radar or communication systems. S. Barbarossa (1991) unwrapped the frequency ambiguity of single-component undersampled signals with the Wigner-Ville distribution (WVD), but until now there has been no effective algorithm for analyzing multicomponent undersampled signals. A new algorithm for analyzing multicomponent undersampled signals with the high-order ambiguity function (HAF) is proposed here. HAF analyzes polynomial phase signals by the method of phase rank reduction; its advantage is that it has no boundary effect and is not sensitive to the cross-terms of multicomponent signals. The simulation results prove the effectiveness of the HAF algorithm.
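
    To make the phase-rank-reduction idea concrete, here is a minimal sketch (not the authors' algorithm) showing how one multiply-lag step of the high-order instantaneous moment turns a quadratic-phase chirp into a near-pure sinusoid whose frequency encodes the chirp rate; the signal parameters are invented for the demo.

        import numpy as np

        def him(s, lag):
            # one phase-rank-reduction step: s(n + lag) * conj(s(n))
            # lowers the polynomial phase order of s by one
            return s[lag:] * np.conj(s[:-lag])

        n = np.arange(4096)
        a1, a2 = 0.1, 1e-5                 # assumed phase coefficients
        s = np.exp(2j*np.pi*(a1*n + a2*n**2))

        lag = 256
        m = him(s, lag)                    # now (nearly) a single tone
        spec = np.abs(np.fft.fft(m, 1 << 16))
        f_hat = np.argmax(spec[:1 << 15]) / (1 << 16)

        # for a quadratic phase the tone frequency is 2*a2*lag
        print("estimated a2:", f_hat / (2*lag))    # close to 1e-5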

  8. A method for analyzing strategic product launch

    OpenAIRE

    XIAO Junji

    2007-01-01

    This paper proposes a method to analyze how the manufacturers make product launch decisions in a multi-product oligopoly market, and how the heterogeneity in their products affects the manufacturers' decisions on model launch and withdrawal.

  9. Low Gravity Drug Stability Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this proposed program through Phase III is to build a space-worthy Drug Stability Analyzer that can determine the extent of drug degradation. It will be...

  10. On-Demand Urine Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall goal of this program (through Phase III) is to develop an analyzer that can be integrated into International Space Station (ISS) toilets to measure key...

  11. Low Gravity Drug Stability Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall goal of this proposed program (through Phase III) is to build a space-worthy Drug Stability Analyzer that can determine the extent of drug degradation....

  12. Ultrasensitive Atmospheric Analyzer for Miniature UAVs Project

    Data.gov (United States)

    National Aeronautics and Space Administration — In this SBIR Phase I effort, Los Gatos Research (LGR) proposes to develop a highly-accurate, lightweight, low-power gas analyzer for quantification of water vapor...

  13. Network analysis using organizational risk analyzer

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The tool system of the organizational risk analyzer (ORA) is selected to study the network of East Turkistan terrorists. The model of the relationships among its personnel, knowledge, resources, and task entities is represented by the meta-matrix in ORA, with which the risks and vulnerabilities of the organizational structure are analyzed quantitatively to obtain the organization's vulnerabilities and risks. The case study in this system shows that it offers a shortcut to effectively destroying the network...

  14. Analyzing storage media of digital camera

    OpenAIRE

    Chow, KP; Tse, KWH; Law, FYW; Ieong, RSC; Kwan, MYK; Tse, H.; Lai, PKY

    2009-01-01

    Digital photography has become popular in recent years. Photographs have become common tools for people to record every tiny part of their daily lives. By analyzing the storage media of a digital camera, crime investigators may extract a lot of useful information to reconstruct the events. In this work, we discuss a few approaches to analyzing these kinds of storage media of digital cameras. A hypothetical crime case is used as a case study to demonstrate the concepts. © 2009 IEEE.

  15. The Information Flow Analyzing Based on CPC

    Institute of Scientific and Technical Information of China (English)

    ZHANG Zhang; LI Hui

    2005-01-01

    The information flow chart within the product life cycle is presented based on collaborative production commerce (CPC) concepts. In this chart, the separated information systems are integrated by means of enterprise knowledge assets that are promoted by CPC from production knowledge. The information flow in the R&D process is analyzed in the environment of a virtual R&D group and distributed PDM. In addition, the information flow throughout the manufacturing and marketing process is analyzed in the CPC environment.

  16. QUBIT DATA STRUCTURES FOR ANALYZING COMPUTING SYSTEMS

    Directory of Open Access Journals (Sweden)

    Vladimir Hahanov

    2014-11-01

    Full Text Available Qubit models and methods for improving the performance of software and hardware for analyzing digital devices through increasing the dimension of the data structures and memory are proposed. The basic concepts, terminology, and definitions necessary for the implementation of quantum computing when analyzing virtual computers are introduced. Investigation results concerning the design and modeling of computer systems in cyberspace based on the use of a two-component structure are presented.

  17. Analyte comparisons between 2 clinical chemistry analyzers.

    OpenAIRE

    Sutton, A; Dawson, H; Hoff, B; Grift, E; Shoukri, M

    1999-01-01

    The purpose of this study was to assess agreement between a wet-reagent and a dry-reagent analyzer. Thirteen analytes (albumin, globulin, alkaline phosphatase, alanine aminotransferase, amylase, urea nitrogen, calcium, cholesterol, creatinine, glucose, potassium, total bilirubin, and total protein) for both canine and feline serum were evaluated. Concordance correlations, linear regression, and plots of difference against mean were used to analyze the data. Concordance correlations were excellent…
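
    Since the comparison leans on concordance correlation, here is a minimal sketch of Lin's concordance correlation coefficient applied to synthetic paired readings; the data, bias, and noise levels are invented for illustration.

        import numpy as np

        def concordance_ccc(x, y):
            # Lin's CCC: penalizes both poor correlation and
            # systematic bias between the two analyzers
            mx, my = x.mean(), y.mean()
            cov = ((x - mx) * (y - my)).mean()
            return 2*cov / (x.var() + y.var() + (mx - my)**2)

        rng = np.random.default_rng(1)
        wet = rng.normal(100.0, 15.0, 60)        # e.g. glucose, mg/dL
        dry = wet + rng.normal(1.0, 3.0, 60)     # small bias + noise
        print(f"CCC = {concordance_ccc(wet, dry):.3f}")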

  18. Spreadsheets for Analyzing and Optimizing Space Missions

    Science.gov (United States)

    Some, Raphael R.; Agrawal, Anil K.; Czikmantory, Akos J.; Weisbin, Charles R.; Hua, Hook; Neff, Jon M.; Cowdin, Mark A.; Lewis, Brian S.; Iroz, Juana; Ross, Rick

    2009-01-01

    XCALIBR (XML Capability Analysis LIBRary) is a set of Extensible Markup Language (XML) database and spreadsheet-based analysis software tools designed to assist in technology-return-on-investment analysis and optimization of technology portfolios pertaining to outer-space missions. XCALIBR is also being examined for use in planning, tracking, and documentation of projects. An XCALIBR database contains information on mission requirements and technological capabilities, which are related by use of an XML taxonomy. XCALIBR incorporates a standardized interface for exporting data and analysis templates to an Excel spreadsheet. Unique features of XCALIBR include the following: It is inherently hierarchical by virtue of its XML basis. The XML taxonomy codifies a comprehensive data structure and data dictionary that includes performance metrics for spacecraft, sensors, and spacecraft systems other than sensors. The taxonomy contains >700 nodes representing all levels, from system through subsystem to individual parts. All entries are searchable and machine readable. There is an intuitive Web-based user interface. The software automatically matches technologies to mission requirements. The software automatically generates, and makes the required entries in, an Excel return-on-investment analysis software tool. The results of an analysis are presented in both tabular and graphical displays.

  19. Analyzing the mediated voice - a datasession

    DEFF Research Database (Denmark)

    Lawaetz, Anna

    Broadcast voices are technologically manipulated. In order to achieve a certain authenticity or sound of “reality”, the voices are paradoxically filtered and trained in order to reach the listeners. This “mise-en-scène” is important knowledge when it comes to the development of a consistent method of analysis of the mediated voice...

  20. Dashboard for Analyzing Ubiquitous Learning Log

    Science.gov (United States)

    Lkhagvasuren, Erdenesaikhan; Matsuura, Kenji; Mouri, Kousuke; Ogata, Hiroaki

    2016-01-01

    Mobile and ubiquitous technologies have been applied to a wide range of learning fields such as science, social science, history and language learning. Many researchers have been investigating the development of ubiquitous learning environments; nevertheless, to date, there have not been enough research works related to the reflection, analysis…

  1. Description of the prototype diagnostic residual gas analyzer for ITER.

    Science.gov (United States)

    Younkin, T R; Biewer, T M; Klepper, C C; Marcus, C

    2014-11-01

    The diagnostic residual gas analyzer (DRGA) system to be used during ITER tokamak operation is being designed at Oak Ridge National Laboratory to measure fuel ratios (deuterium and tritium), fusion ash (helium), and impurities in the plasma. The eventual purpose of this instrument is for machine protection, basic control, and physics on ITER. Prototyping is ongoing to optimize the hardware setup and measurement capabilities. The DRGA prototype is comprised of a vacuum system and measurement technologies that will overlap to meet ITER measurement requirements. Three technologies included in this diagnostic are a quadrupole mass spectrometer, an ion trap mass spectrometer, and an optical penning gauge that are designed to document relative and absolute gas concentrations.

  2. Technology-Use Mediation

    DEFF Research Database (Denmark)

    Bansler, Jørgen P.; Havn, Erling C.

    2003-01-01

    This study analyzes how a group of ‘mediators’ in a large, multinational company adapted a computer-mediated communication technology (a ‘virtual workspace’) to the organizational context (and vice versa) by modifying features of the technology, providing ongoing support for users, and promoting appropriate conventions of use. Our findings corroborate earlier research on technology-use mediation, which suggests that such mediators can exert considerable influence on how a particular technology will be established and used in an organization. However, this study also indicates that the process of technology-use mediation is more complex and indeterminate than earlier literature suggests. In particular, we want to draw attention to the fact that advanced computer-mediated communication technologies are equivocal and that technology-use mediation consequently requires ongoing sensemaking (Weick 1995).

  3. Analyzing the User Behavior toward Electronic Commerce Stimuli

    Science.gov (United States)

    Lorenzo-Romero, Carlota; Alarcón-del-Amo, María-del-Carmen; Gómez-Borja, Miguel-Ángel

    2016-01-01

    Based on the Stimulus-Organism-Response paradigm, this research analyzes the main differences between the effects of two types of web technologies: verbal web technology (i.e., navigational structure as a utilitarian stimulus) versus non-verbal web technology (music and presentation of products as hedonic stimuli). Specific webmosphere stimuli have not yet been examined as separate variables, and their impact on internal and behavioral responses remains unknown. The objective of this research is therefore to analyze the impact of these web technologies (which constitute the web atmosphere, or webmosphere, of a website) on human shopping behavior (i.e., users' internal states (affective, cognitive, and satisfaction) and behavioral responses (approach responses and real shopping outcomes)) within a retail online store created by computer, taking into account some mediator variables (i.e., involvement, atmospheric responsiveness, and perceived risk). A 2 ("free" versus "hierarchical" navigational structure) × 2 ("on" versus "off" music) × 2 ("moving" versus "static" images) between-subjects computer experimental design is used to test this research empirically. In addition, an integrated methodology was developed allowing the simulation, tracking, and recording of virtual user behavior within an online shopping environment. As a main conclusion, this study suggests that the positive responses of online consumers may increase when they are allowed to freely navigate the online stores and their experience is enriched by animated GIFs and background music. The effects of the mediator variables modify the final shopping behavior to a degree. PMID:27965549

  4. Analyzing visual signals as visual scenes.

    Science.gov (United States)

    Allen, William L; Higham, James P

    2013-07-01

    The study of visual signal design is gaining momentum as techniques for studying signals become more sophisticated and more freely available. In this paper we discuss methods for analyzing the color and form of visual signals, for integrating signal components into visual scenes, and for producing visual signal stimuli for use in psychophysical experiments. Our recommended methods aim to be rigorous, detailed, quantitative, objective, and where possible based on the perceptual representation of the intended signal receiver(s). As methods for analyzing signal color and luminance have been outlined in previous publications we focus on analyzing form information by discussing how statistical shape analysis (SSA) methods can be used to analyze signal shape, and spatial filtering to analyze repetitive patterns. We also suggest the use of vector-based approaches for integrating multiple signal components. In our opinion elliptical Fourier analysis (EFA) is the most promising technique for shape quantification but we await the results of empirical comparison of techniques and the development of new shape analysis methods based on the cognitive and perceptual representations of receivers. Our manuscript should serve as an introductory guide to those interested in measuring visual signals, and while our examples focus on primate signals, the methods are applicable to quantifying visual signals in most taxa.
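
    As a pointer toward the kind of shape quantification discussed, here is a minimal sketch of FFT-based Fourier shape descriptors for a closed contour, a simpler relative of the elliptical Fourier analysis the authors favor; the contour is a synthetic ellipse and all names are illustrative.

        import numpy as np

        def fourier_descriptors(x, y, k=8):
            # treat the contour as a complex signal and keep the
            # magnitudes of low-order harmonics; zeroing the DC term
            # removes translation, dividing by the first harmonic
            # removes scale (magnitudes are also insensitive to
            # rotation and starting point)
            z = x + 1j*y
            Z = np.fft.fft(z)
            Z[0] = 0.0
            mag = np.abs(Z)
            return mag[1:k+1] / mag[1]

        t = np.linspace(0, 2*np.pi, 256, endpoint=False)
        fd = fourier_descriptors(3*np.cos(t), 1.5*np.sin(t))
        print(np.round(fd, 3))    # first harmonic dominates for an ellipse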

  5. Mapping Technology Space by Normalizing Technology Relatedness Networks

    CERN Document Server

    Alstott, Jeff; Yan, Bowen; Luo, Jianxi

    2015-01-01

    Technology is a complex system, with technologies relating to each other in a space that can be mapped as a network. The technology relatedness network's structure can reveal properties of technologies and of human behavior, if it can be mapped accurately. Technology networks have been made from patent data, using several measures of relatedness. These measures, however, are influenced by factors of the patenting system that do not reflect technologies or their relatedness. We created technology networks that precisely controlled for these impinging factors and normalized them out, using data from 3.9 million patents. The normalized technology relatedness networks were sparse, with only ~20% of technology domain pairs more related than would be expected by chance. Different measures of technology relatedness became more correlated with each other after normalization, approaching a single dimension of technology relatedness. The normalized network corresponded with human behavior: we analyzed the patenting histories…

  6. Analyzing the flight of a quadcopter using a smartphone

    CERN Document Server

    Monteiro, Martín; Cabeza, Cecilia; Marti, Arturo C

    2015-01-01

    Remotely controlled helicopters and planes have been used as toys for decades. Only recently, however, have advances in sensor technologies made it possible to fly and control these devices easily and at an affordable price. Along with their increasing availability, the educational opportunities are also proliferating. Here, a simple experiment in which a smartphone is mounted on a quadcopter is proposed to investigate the basics of flight. Thanks to the smartphone's built-in accelerometer and gyroscope, take-off, landing, and yaw are analyzed.

  7. A resource-efficient adaptive Fourier analyzer

    Science.gov (United States)

    Hajdu, C. F.; Zamantzas, C.; Dabóczi, T.

    2016-10-01

    We present a resource-efficient frequency adaptation method to complement the Fourier analyzer proposed by Péceli. The novel frequency adaptation scheme is based on the adaptive Fourier analyzer suggested by Nagy. The frequency adaptation method was elaborated with a view to realizing a detector connectivity check on an FPGA in a new beam loss monitoring (BLM) system, currently being developed for beam setup and machine protection of the particle accelerators at the European Organisation for Nuclear Research (CERN). The paper summarizes the Fourier analyzer to the extent relevant to this work and the basic principle of the related frequency adaptation methods. It then outlines the suggested new scheme, presents practical considerations for implementing it and underpins it with an example and the corresponding operational experience.
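
    The recursive structure behind such analyzers can be sketched in a few lines. The following is a simplified LMS-style Fourier-coefficient tracker with a crude frequency-adaptation heuristic (rotation of the fundamental coefficient signals a frequency mismatch); it illustrates the general idea only, not the cited scheme, and all gains and signal parameters are invented.

        import numpy as np

        def adaptive_fourier(y, f0, K=3, alpha=0.05, g=1e-3):
            ks = np.arange(1, K + 1)
            x = np.zeros(K, complex)     # harmonic coefficient estimates
            phi, f = 0.0, f0             # running phase, tracked frequency
            for v in y:
                c = np.exp(2j*np.pi*ks*phi)
                e = v - np.sum(x*c).real           # output error
                x_new = x + alpha*e*np.conj(c)     # coefficient update
                if abs(x[0]) > 1e-9:
                    # heuristic: steady rotation of the fundamental
                    # coefficient indicates a frequency mismatch
                    d = np.angle(x_new[0]*np.conj(x[0]))
                    f += g*d/(2*np.pi)
                x, phi = x_new, phi + f
            return x, f

        fs = 1000.0
        t = np.arange(8000)/fs
        y = np.sin(2*np.pi*50.3*t) + 0.2*np.sin(2*np.pi*150.9*t)
        _, f = adaptive_fourier(y, f0=50.0/fs)
        print(f"tracked fundamental: {f*fs:.2f} Hz")   # should approach 50.3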

  8. Detecting influenza outbreaks by analyzing Twitter messages

    CERN Document Server

    Culotta, Aron

    2010-01-01

    We analyze over 500 million Twitter messages from an eight month period and find that tracking a small number of flu-related keywords allows us to forecast future influenza rates with high accuracy, obtaining a 95% correlation with national health statistics. We then analyze the robustness of this approach to spurious keyword matches, and we propose a document classification component to filter these misleading messages. We find that this document classifier can reduce error rates by over half in simulated false alarm experiments, though more research is needed to develop methods that are robust in cases of extremely high noise.
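
    The core of the approach is a correlation between a keyword time series and official statistics. Below is a minimal sketch with synthetic weekly data; the series and their relationship are invented for the demo, whereas the paper used real Twitter and national health data.

        import numpy as np

        rng = np.random.default_rng(0)
        weeks = 35
        # synthetic ILI (influenza-like illness) rate and a noisy
        # "fraction of tweets matching flu keywords" series
        ili = 1.5 + 3.0*np.sin(np.linspace(0, np.pi, weeks))
        tweet_frac = 0.002*ili + rng.normal(0, 5e-4, weeks)

        # Pearson correlation, the figure of merit quoted above
        r = np.corrcoef(tweet_frac, ili)[0, 1]
        print(f"keyword/ILI correlation: {r:.2f}")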

  9. Analyzing Broadband Divide in the Farming Sector

    DEFF Research Database (Denmark)

    Jensen, Michael; Gutierrez Lopez, Jose Manuel; Pedersen, Jens Myrup

    2013-01-01

    The agriculture industry has been evolving for centuries. Currently, the technological development of Internet-oriented farming tools makes it possible to increase the productivity and efficiency of this sector. Many of the already available tools and applications require high bandwidth in both directions, upstream and downstream. The main constraint is that farms are naturally located in rural areas where the required access broadband data rates are not available. This paper studies the broadband divide in relation to the Danish agricultural sector. Results show that there is an important difference between the broadband availability for farms and for the rest of the households/buildings in the country. This divide may be slowing down the potential technological development of the farming industry, which needs it in order to keep its competitiveness in the market. Therefore, broadband development in rural areas...

  10. Analyzing Distributed Processing For Electric Utilities

    Science.gov (United States)

    Klein, Stanley A.; Kirkham, Harold; Beardmore, Julie A.

    1990-01-01

    Distributed Processing Trade-Off Model for Electric Utility Operation computer program based upon study performed at California Institute of Technology for NASA's Jet Propulsion Laboratory. Study presented technique addressing question of tradeoffs between expanding communications network or expanding capacity of distributed computers in energy-management systems (EMS) of electric utility. Gives EMS planners macroscopic tool for evaluation of architectures of distributed-processing systems and major technical and economic tradeoffs as well as interactions within systems.

  11. Analyzing volatile compounds in dairy products

    Science.gov (United States)

    Volatile compounds give the first indication of the flavor in a dairy product. Volatiles are isolated from the sample matrix and then analyzed by chromatography, sensory methods, or an electronic nose. Isolation may be performed by solvent extraction or headspace analysis, and gas chromatography is…

  12. GSM Trace Quality Analyzer (TQA) software

    OpenAIRE

    Blanchart Forne, Marc

    2016-01-01

    Connectivity is now the must-have service for enhancing passenger experience. To prove, and also to show customers, the quality of the connectivity system, a user-friendly mock-up has to be designed. The work presents packet analyzer software designed to validate an existing SATCOM simulator and to improve future airline network architectures.

  13. Imaging thermal plasma mass and velocity analyzer

    Science.gov (United States)

    Yau, Andrew W.; Howarth, Andrew

    2016-07-01

    We present the design and principle of operation of the imaging ion mass and velocity analyzer on the Enhanced Polar Outflow Probe (e-POP), which measures low-energy (1-90 eV/e) ion mass composition (1-40 AMU/e) and velocity distributions using a hemispherical electrostatic analyzer (HEA), a time-of-flight (TOF) gate, and a pair of toroidal electrostatic deflectors (TED). The HEA and TOF gate measure the energy-per-charge and azimuth of each detected ion and the ion transit time inside the analyzer, respectively, providing the 2-D velocity distribution of each major ionospheric ion species and resolving the minor ion species under favorable conditions. The TED are in front of the TOF gate and optionally sample ions at different elevation angles up to ±60°, for measurement of 3-D velocity distribution. We present examples of observation data to illustrate the measurement capability of the analyzer, and show the occurrence of enhanced densities of heavy "minor" O++, N+, and molecular ions and intermittent, high-velocity (a few km/s) upward and downward flowing H+ ions in localized regions of the quiet time topside high-latitude ionosphere.
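
    Given the quantities the instrument measures (energy-per-charge from the HEA and transit time from the TOF gate), mass-per-charge follows from qU = mv²/2. A minimal sketch; the flight-path length and example values are invented for illustration, not e-POP's actual geometry.

        import numpy as np

        E_CHARGE = 1.602176634e-19    # C
        AMU = 1.66053906660e-27       # kg

        def mass_per_charge(U_volts, tof_s, path_m):
            v = path_m / tof_s                    # ion speed from TOF
            m = 2.0*E_CHARGE*U_volts / v**2       # from qU = m v^2 / 2
            return m / AMU                        # AMU per unit charge

        # 10 V/e ions over a hypothetical 0.1 m flight path
        for tof in (2.3e-6, 9.1e-6):
            mq = mass_per_charge(10.0, tof, 0.1)
            print(f"t = {tof:.1e} s  ->  m/q = {mq:.1f} AMU/e")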

  14. Strengthening 4-H by Analyzing Enrollment Data

    Science.gov (United States)

    Hamilton, Stephen F.; Northern, Angela; Neff, Robert

    2014-01-01

    The study reported here used data from the ACCESS 4-H Enrollment System to gain insight into strengthening New York State's 4-H programming. Member enrollment lists from 2009 to 2012 were analyzed using Microsoft Excel to determine trends and dropout rates. The descriptive data indicate declining 4-H enrollment in recent years and peak enrollment…

  15. Thermal and Evolved-Gas Analyzer Illustration

    Science.gov (United States)

    2008-01-01

    This is a computer-aided drawing of the Thermal and Evolved-Gas Analyzer, or TEGA, on NASA's Phoenix Mars Lander. The Phoenix Mission is led by the University of Arizona, Tucson, on behalf of NASA. Project management of the mission is by NASA's Jet Propulsion Laboratory, Pasadena, Calif. Spacecraft development is by Lockheed Martin Space Systems, Denver.

  16. How to Analyze Company Using Social Network?

    Science.gov (United States)

    Palus, Sebastian; Bródka, Piotr; Kazienko, Przemysław

    Every single company or institution wants to utilize its resources in the most efficient way. In order to do so, it has to have a good structure. A new way to analyze a company's structure by utilizing the natural social network existing within the company, and an example of its usage on the Enron company, are presented in this paper.

  17. Analyzing computer system performance with Perl

    CERN Document Server

    Gunther, Neil J

    2011-01-01

    This expanded second edition of Analyzing Computer System Performance with Perl::PDQ, builds on the success of the first edition. It contains new chapters on queues, tools and virtualization, and new Perl listing format to aid readability of PDQ models.

  18. Graphic method for analyzing common path interferometers

    DEFF Research Database (Denmark)

    Glückstad, J.

    1998-01-01

    Common path interferometers are widely used for visualizing phase disturbances and fluid flows. They are attractive because of the inherent simplicity and robustness in the setup. A graphic method will be presented for analyzing and optimizing filter parameters in common path interferometers....

  19. Analyzing the Control Structure of PEPA

    DEFF Research Database (Denmark)

    Yang, Fan; Nielson, Hanne Riis

    …to PEPA programs, the approximating result is very precise. Based on the analysis, we also develop algorithms for validating the deadlock property of PEPA programs. The techniques have been implemented in a tool which is able to analyze processes with a control structure of more than one thousand states.

  20. Analyzing the Information Economy: Tools and Techniques.

    Science.gov (United States)

    Robinson, Sherman

    1986-01-01

    Examines methodologies underlying studies which measure the information economy and considers their applicability and limitations for analyzing policy issues concerning libraries and library networks. Two studies provide major focus for discussion: Porat's "The Information Economy: Definition and Measurement" and Machlup's "Production and…

  1. Studying Reliability Using Identical Handheld Lactate Analyzers

    Science.gov (United States)

    Stewart, Mark T.; Stavrianeas, Stasinos

    2008-01-01

    Accusport analyzers were used to generate lactate performance curves in an investigative laboratory activity emphasizing the importance of reliable instrumentation. Both the calibration and testing phases of the exercise provided students with a hands-on opportunity to use laboratory-grade instrumentation while allowing for meaningful connections…

  2. Analyzing Vessel Behavior Using Process Mining

    NARCIS (Netherlands)

    Maggi, F.M.; Mooij, A.J.; Aalst, W.M.P. van der

    2013-01-01

    In the maritime domain, electronic sensors such as AIS receivers and radars collect large amounts of data about the vessels in a certain geographical area. We investigate the use of process mining techniques for analyzing the behavior of the vessels based on these data. In the context of maritime safety…

  3. 40 CFR 90.313 - Analyzers required.

    Science.gov (United States)

    2010-07-01

    ... ionization (HFID) type. For constant volume sampling, the hydrocarbon analyzer may be of the flame ionization (FID) type or of the heated flame ionization (HFID) type. (ii) For the HFID system, if the temperature... drying. Chemical dryers are not an acceptable method of removing water from the sample. Water removal...

  4. Fluidization quality analyzer for fluidized beds

    Science.gov (United States)

    Daw, C. Stuart; Hawk, James A.

    1995-01-01

    A control loop and fluidization quality analyzer for a fluidized bed utilizes time varying pressure drop measurements. A fast-response pressure transducer measures the overall bed pressure drop, or over some segment of the bed, and the pressure drop signal is processed to produce an output voltage which changes with the degree of fluidization turbulence.

  6. DEVELOPMENT OF AN ON-LINE COAL WASHABILITY ANALYZER

    Energy Technology Data Exchange (ETDEWEB)

    J.D. Miller; C.L. Lin; G.H. Luttrell; G.T. Adel; Barbara Marin

    2001-06-26

    Washability analysis is the basis for nearly all coal preparation plant separations. Unfortunately, there are no on-line techniques for determining this most fundamental of all coal cleaning information. In light of recent successes at the University of Utah, it now appears possible to determine coal washability on-line through the use of x-ray computed tomography (CT) analysis. The successful development of such a device is critical to the establishment of process control and automated coal blending systems. In this regard, Virginia Tech, Terra Tek Inc., and U.S. coal producers have joined with the University of Utah to undertake the development of an X-ray CT-based on-line coal washability analyzer with financial assistance from DOE. Each project participant brought special expertise to the project in order to create a new dimension in coal cleaning technology. The project involves development of appropriate software and extensive testing/evaluation of well-characterized coal samples from operating coal preparation plants. Data collected to date suggest that this new technology is capable of serving as a universal analyzer that can provide not only washability analysis, but also particle size distribution analysis, ash analysis, and perhaps pyritic sulfur analysis.

  7. On-line chemical composition analyzer development

    Energy Technology Data Exchange (ETDEWEB)

    Roberts, M.J.; Garrison, A.A.; Muly, E.C.; Moore, C.F.

    1992-02-01

    The energy consumed in distillation processes in the United States represents nearly three percent of the total national energy consumption. If effective control of distillation columns can be accomplished, it has been estimated that it would result in a reduction in the national energy consumption of 0.3%. Real-time control based on mixture composition could achieve these savings. However, the major distillation processes represent diverse applications, and at present there does not exist a proven on-line chemical composition sensor technology which can be used to control these diverse processes in real time. This report presents a summary of the findings of the second phase of a three-phase effort undertaken to develop an on-line real-time measurement and control system utilizing Raman spectroscopy. A prototype instrument system has been constructed utilizing a Perkin Elmer 1700 spectrometer, a diode-pumped YAG laser, two three-axis positioning systems, a process sample cell, and a personal computer. This system has been successfully tested using industrially supplied process samples to establish its performance. Continued application development was also undertaken during this phase of the program, using both the spontaneous Raman and surface-enhanced Raman modes of operation. The study was performed for the US Department of Energy, Office of Industrial Technologies, whose mission is to conduct cost-shared R&D for new high-risk, high-payoff industrial energy conservation technologies. Although this document contains references to individual manufacturers and their products, the opinions expressed on the products reported do not necessarily reflect the position of the Department of Energy.

  8. Operating System Performance Analyzer for Embedded Systems

    Directory of Open Access Journals (Sweden)

    Shahzada Khayyam Nisar

    2011-11-01

    Full Text Available An RTOS provides a number of services to embedded system designs, such as case management, memory management, and resource management. Choosing the best OS for an embedded system is usually based on the OSs available to the system designers and their previous knowledge and experience, which can cause a mismatch between the OS and the embedded system. RTOS performance analysis is critical in the design and integration of embedded software, to ensure that the application meets its limits at runtime. To select an appropriate operating system for an embedded system for a particular application, the OS services have to be analyzed. These OS services are characterized by parameters that establish performance metrics. The performance metrics selected include context-switching time, preemption time, and interrupt latency; they are analyzed to choose the right OS for an embedded system for a particular application.
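
    Of the metrics named, context-switching cost is the easiest to probe from user space. The sketch below ping-pongs between two threads and times the handoffs; on a general-purpose OS, and with Python's GIL, this yields only a rough upper bound, not an RTOS-grade measurement.

        import threading, time

        def ping_pong(n=20000):
            a, b = threading.Event(), threading.Event()

            def worker():
                for _ in range(n):
                    a.wait(); a.clear(); b.set()

            t = threading.Thread(target=worker)
            t.start()
            t0 = time.perf_counter()
            for _ in range(n):
                a.set(); b.wait(); b.clear()
            dt = time.perf_counter() - t0
            t.join()
            return dt / (2*n)     # two handoffs per iteration

        print(f"~{ping_pong()*1e6:.1f} us per thread handoff")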

  9. CRIE: An automated analyzer for Chinese texts.

    Science.gov (United States)

    Sung, Yao-Ting; Chang, Tao-Hsing; Lin, Wei-Chun; Hsieh, Kuan-Sheng; Chang, Kuo-En

    2016-12-01

    Textual analysis has been applied to various fields, such as discourse analysis, corpus studies, text leveling, and automated essay evaluation. Several tools have been developed for analyzing texts written in alphabetic languages such as English and Spanish. However, currently there is no tool available for analyzing Chinese-language texts. This article introduces a tool for the automated analysis of simplified and traditional Chinese texts, called the Chinese Readability Index Explorer (CRIE). Composed of four subsystems and incorporating 82 multilevel linguistic features, CRIE is able to conduct the major tasks of segmentation, syntactic parsing, and feature extraction. Furthermore, the integration of linguistic features with machine learning models enables CRIE to provide leveling and diagnostic information for texts in language arts, texts for learning Chinese as a foreign language, and texts with domain knowledge. The usage and validation of the functions provided by CRIE are also introduced.

  10. Raman Gas Analyzer (RGA): Natural Gas Measurements.

    Science.gov (United States)

    Petrov, Dmitry V; Matrosov, Ivan I

    2016-06-08

    In the present work, an improved model of our Raman gas analyzer (RGA) for natural gas (NG) is described, together with its operating principle. The sensitivity has been improved and the number of measurable gases has been expanded. Results of its testing on a real NG sample are presented for different measurement times. A comparison of the data obtained with the results of chromatographic analysis demonstrates their good agreement. The time stability of the results obtained using this model is analyzed. It is experimentally established that the given RGA can reliably determine, within 100 s, the content of all molecular NG components whose content exceeds 0.005%; moreover, the limiting sensitivity for some NG components reaches 0.002%.

  11. Methods of analyzing composition of aerosol particles

    Science.gov (United States)

    Reilly, Peter T.A.

    2013-02-12

    An aerosol particle analyzer includes a laser ablation chamber, a gas-filled conduit, and a mass spectrometer. The laser ablation chamber can be operated at a low pressure, which can be from 0.1 mTorr to 30 mTorr. The ablated ions are transferred into a gas-filled conduit. The gas-filled conduit reduces the electrical charge and the speed of the ablated ions as they collide and mix with buffer gases in the conduit. Preferably, the gas-filled conduit includes an electromagnetic multipole structure that collimates the nascent ions into a beam, which is guided into the mass spectrometer. Because the gas-filled conduit allows storage of vast quantities of the ions from the ablated particles, the ions from a single ablated particle can be analyzed multiple times and by a variety of techniques to supply statistically meaningful analysis of composition and isotope ratios.

  12. An improved prism energy analyzer for neutrons

    Energy Technology Data Exchange (ETDEWEB)

    Schulz, J., E-mail: jennifer.schulz@helmholtz-berlin.de [Helmholtz-Zentrum Berlin für Materialien und Energie, Hahn-Meitner-Platz 1, 14109 Berlin (Germany); Ott, F. [Laboratoire Leon Brillouin, Bât 563 CEA Saclay, 91191 Gif sur Yvette Cedex (France); Krist, Th. [Helmholtz-Zentrum Berlin für Materialien und Energie, Hahn-Meitner-Platz 1, 14109 Berlin (Germany)

    2014-04-21

    The effects of two improvements of an existing neutron energy analyzer consisting of stacked silicon prism rows are presented. First, we tested the effect of coating the back of the prism rows with an absorbing layer to suppress neutron scattering by total reflection and by refraction at small angles. Experiments at HZB showed that this works perfectly. Second, the prism rows were bent to shift the transmitted wavelength band to larger wavelengths. At HZB we showed that bending increased the transmission of neutrons with a wavelength of 4.9 Å. Experiments with a white beam at the EROS reflectometer at LLB showed that bending the energy analyzing device to a radius of 7.9 m shifts the transmitted wavelength band from 0–9 Å to 2–16 Å.

  13. The EPOS Automated Selective Chemistry Analyzer evaluated.

    Science.gov (United States)

    Moses, G C; Lightle, G O; Tuckerman, J F; Henderson, A R

    1986-01-01

    We evaluated the analytical performance of the EPOS (Eppendorf Patient Oriented System) Automated Selective Chemistry Analyzer, using the following tests for serum analytes: alanine and aspartate aminotransferases, lactate dehydrogenase, creatine kinase, gamma-glutamyltransferase, alkaline phosphatase, and glucose. Results from the EPOS correlated well with those from comparison instruments (r ≥ 0.990). Precision and linearity limits were excellent for all tests; linearity of the optical and pipetting systems was satisfactory. Reagent carryover was negligible. Sample-to-sample carryover was less than 1% for all tests, but only lactate dehydrogenase was less than the manufacturer's specified 0.5%. Volumes aspirated and dispensed by the sample and reagent II pipetting systems differed significantly from preset values, especially at lower settings; the reagent I system was satisfactory at all volumes tested. Minimal daily maintenance and an external data-reduction system make the EPOS a practical alternative to other bench-top chemistry analyzers.

  14. The analyzing of Dove marketing strategy

    Institute of Scientific and Technical Information of China (English)

    Guo; Yaohui

    2015-01-01

    1. Introduction In this report, I try to analyze the related information about DOVE chocolate. Firstly, I would like to introduce this product. Dove chocolate is one of a series of products launched by the world's largest pet food and snack food manufacturer, the U.S. multinational food company Mars. It entered China in 1989 and has become China's leading brand of chocolate in…

  15. LEGAL-EASE: Analyzing Chinese Financial Statements

    Institute of Scientific and Technical Information of China (English)

    EDWARD; MA

    2008-01-01

    In this article, we will focus on understanding and analyzing the typical accounts of Chinese financial statements, including the balance sheet and income statement. Accounts are generally incorrectly prepared. This can be due to several factors, from incompetence to more serious cases of deliberate attempts to deceive. Regardless, accounts can be understood and errors or specific acts of misrepresentation uncovered. We will conduct some simple analysis to demonstrate how these can be spotted.

  16. MORPHOLOGICAL ANALYZER MYSTEM 3.0

    Directory of Open Access Journals (Sweden)

    A. I. Zobnin

    2015-01-01

    Full Text Available A large part of the Russian National Corpus has automatic morphological markup. It is based on the morphological analyzer Mystem, developed at Yandex, with some postprocessing of the results (for example, all indeclinable nouns acquire the tag '0', verbs are divided into separate paradigms by aspect, etc.). Recently a new (third) version of Mystem has been released (see https://tech.yandex.ru/mystem/). In this article we give an overview of its capabilities.

  17. Coordinating, Scheduling, Processing and Analyzing IYA09

    Science.gov (United States)

    Gipson, John; Behrend, Dirk; Gordon, David; Himwich, Ed; MacMillan, Dan; Titus, Mike; Corey, Brian

    2010-01-01

    The IVS scheduled a special astrometric VLBI session for the International Year of Astronomy 2009 (IYA09) commemorating 400 years of optical astronomy and 40 years of VLBI. The IYA09 session is the most ambitious geodetic session to date in terms of network size, number of sources, and number of observations. We describe the process of designing, coordinating, scheduling, pre-session station checkout, correlating, and analyzing this session.

  18. Organization theory. Analyzing health care organizations.

    Science.gov (United States)

    Cors, W K

    1997-02-01

    Organization theory (OT) is a tool that can be applied to analyze and understand health care organizations. Transaction cost theory is used to explain, in a unifying fashion, the myriad changes being undertaken by different groups of constituencies in health care. Agency theory is applied to aligning economic incentives needed to ensure Integrated Delivery System (IDS) success. By using tools such as OT, a clearer understanding of organizational changes is possible.

  19. Analyzing negative ties in social networks

    Directory of Open Access Journals (Sweden)

    Mankirat Kaur

    2016-03-01

    Full Text Available Online social networks are a source of sharing information and maintaining personal contacts with other people through social interactions, thus forming virtual communities online. Social networks are crowded with positive and negative relations. Positive relations are formed by support, endorsement, and friendship, and thus create a network of well-connected users, whereas negative relations are a result of opposition, distrust, and avoidance, creating disconnected networks. Due to the increase in illegal activities such as masquerading, conspiring, and creating fake profiles on online social networks, exploring and analyzing these negative activities has become the need of the hour. Negative ties are usually treated in the same way as positive ties in many theories, such as balance theory and blockmodeling analysis, but the standard concepts of social network analysis do not yield the same results for each kind of tie. This paper presents a survey on analyzing negative ties in social networks through the various types of network analysis techniques that are used for examining ties, such as status, centrality, and power measures. Due to the difference in the characteristics of flow in positive- and negative-tie networks, some of these measures are not applicable to negative ties. This paper also discusses new methods that have been developed specifically for analyzing negative ties, such as negative degree and the h∗ measure, along with measures based on a mixture of positive and negative ties. The different types of social network analysis approaches have been reviewed and compared to determine the best approach for appropriately identifying negative ties in online networks. It was found that only a few measures, such as degree and PN centrality, are applicable for identifying outsiders in a network. For applicability in online networks, the performance of the PN measure needs to be verified, and further, new measures should be developed based upon the negative-clique concept.
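
    For the mixed positive/negative measures mentioned, the PN centrality of Everett and Borgatti has a closed form, PN = (I - A/(2n-2))^(-1) · 1 with A = P - 2N. A minimal sketch on a toy signed network (the network itself is invented for the demo):

        import numpy as np

        def pn_centrality(P, N):
            # P, N: adjacency matrices of positive and negative ties
            n = P.shape[0]
            A = P - 2.0*N
            return np.linalg.inv(np.eye(n) - A/(2.0*n - 2.0)) @ np.ones(n)

        # toy network: nodes 0-2 form a friendly triangle,
        # node 3 is disliked by (and dislikes) everyone
        P = np.array([[0,1,1,0],[1,0,1,0],[1,1,0,0],[0,0,0,0]], float)
        N = np.array([[0,0,0,1],[0,0,0,1],[0,0,0,1],[1,1,1,0]], float)
        print(pn_centrality(P, N))    # node 3 scores lowest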

  20. Thermo Scientific Sulfur Dioxide Analyzer Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Springston, S. R. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-03-01

    The Sulfur Dioxide Analyzer measures sulfur dioxide based on the absorbance of UV light at one wavelength by SO2 molecules, which then decay to a lower energy state by emitting UV light at a longer wavelength. Specifically, SO2 + hν1 → SO2* → SO2 + hν2. The emitted light is proportional to the concentration of SO2 in the optical cell. External communication with the analyzer is available through an Ethernet port configured through the instrument network of the AOS systems. The Model 43i-TLE is part of the i-series of Thermo Scientific instruments. The i-series instruments are designed to interface with external computers through the proprietary Thermo Scientific iPort software. However, this software is somewhat cumbersome and inflexible. BNL has written an interface program in National Instruments LabView that both controls the Model 43i-TLE analyzer and queries the unit for all measurement and housekeeping data. The LabView vi (the software program written by BNL) ingests all raw data from the instrument and outputs raw data files in a uniform data format similar to other instruments in the AOS and described more fully in Section 6.0 below.

  1. Analyzing Malware Based on Volatile Memory

    Directory of Open Access Journals (Sweden)

    Liang Hu

    2013-11-01

    Full Text Available To motivate the necessity of a comprehensive and automatic analysis process for volatile memory, this paper first summarizes the ordinary analysis methods and their common points, especially with respect to the data sources of concern. The memory analysis framework Volatility 2.2 is then recommended, along with statistics on output file sizes. In addition, to address the limitations of the plug-in classification used in the analysis procedure, a classification from the user's perspective is proposed. Furthermore, a relational method keyed to differences in target data sources, based on result data set volume, is introduced as a guideline for the comprehensive analysis procedure. Finally, a demonstration including analysis of DLL loading-order lists is presented, in which the DLL load list is regarded as a typical data source characterizing a process and is converted into a process behavior fingerprint. Clustering of the fingerprints employs a string similar-degree algorithm, which has a wide range of applications in traditional malware behavior analysis; it is proposed that these methods can also be applied to volatile memory.
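
    The fingerprint-and-cluster step described above can be sketched directly with a standard string similar-degree measure; the DLL lists and the similarity threshold are invented for the demo, and this is an illustration rather than the paper's exact algorithm.

        from difflib import SequenceMatcher

        def fingerprint(dlls):
            # ordered DLL load list -> one fingerprint string
            return "|".join(dlls)

        def similarity(fa, fb):
            # string similar-degree in [0, 1]
            return SequenceMatcher(None, fa, fb).ratio()

        proc_a = ["ntdll.dll", "kernel32.dll", "user32.dll", "ws2_32.dll"]
        proc_b = ["ntdll.dll", "kernel32.dll", "ws2_32.dll", "wininet.dll"]

        s = similarity(fingerprint(proc_a), fingerprint(proc_b))
        print(f"load-order similarity: {s:.2f}")   # cluster if above a threshold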

  2. Modular Construction of Shape-Numeric Analyzers

    Directory of Open Access Journals (Sweden)

    Bor-Yuh Evan Chang

    2013-09-01

    Full Text Available The aim of static analysis is to infer invariants about programs that are precise enough to establish semantic properties, such as the absence of run-time errors. Broadly speaking, there are two major branches of static analysis for imperative programs. Pointer and shape analyses focus on inferring properties of pointers, dynamically-allocated memory, and recursive data structures, while numeric analyses seek to derive invariants on numeric values. Although simultaneous inference of shape-numeric invariants is often needed, this case is especially challenging and is not particularly well explored. Notably, simultaneous shape-numeric inference raises complex issues in the design of the static analyzer itself. In this paper, we study the construction of such shape-numeric, static analyzers. We set up an abstract interpretation framework that allows us to reason about simultaneous shape-numeric properties by combining shape and numeric abstractions into a modular, expressive abstract domain. Such a modular structure is highly desirable to make its formalization and implementation easier to do and get correct. To achieve this, we choose a concrete semantics that can be abstracted step-by-step, while preserving a high level of expressiveness. The structure of abstract operations (i.e., transfer, join, and comparison) follows the structure of this semantics. The advantage of this construction is to divide the analyzer into modules and functors that implement abstractions of distinct features.

  3. Assistive Technology

    Science.gov (United States)

    Assistive technology (AT) is any service or tool that helps ... be difficult or impossible. For older adults, such technology may be a walker to improve mobility or ...

  4. A Biosystems Approach to Industrial Patient Monitoring and Diagnostic Devices

    CERN Document Server

    Baura, Gail

    2008-01-01

    A medical device is an apparatus that uses engineering and scientific principles to interface to physiology and diagnose or treat a disease. In this lecture, we specifically consider those medical devices that are computer based, and are therefore referred to as medical instruments. Further, the medical instruments we discuss are those that incorporate system theory into their designs. We divide these types of instruments into those that provide continuous observation and those that provide a single snapshot of health information. These instruments are termed patient monitoring devices and diagnostic devices.

  5. Preservation mechanisms of trehalose in food and biosystems.

    Science.gov (United States)

    Patist, Alex; Zoerb, Hans

    2005-02-10

    The stability or shelf-life of food and biomaterials has always been a critical issue in the food and pharmaceutical industry. Trehalose (alpha-D-glucopyranosyl-alpha-D-glucopyranoside), a non-reducing diglucose sugar found in nature, confers to certain plant and animal cells the ability to survive dehydration for decades and to restore activity soon after rehydration. The interaction between trehalose and cell membranes or proteins, however, remains a debated subject, and a significant amount of work has been done to elucidate the mechanisms resulting in this unique behavior of preservation. This study shows how an interfacial phenomena approach has led to the use of trehalose as an excipient during freeze drying of a variety of products in the pharmaceutical industry. It also suggests opportunities as an ingredient for dried and processed food, as well as a non-toxic cryoprotectant of vaccines and organs for surgical transplants.

  6. An Axiomatic, Unified Representation of Biosystems and Quantum Dynamics

    CERN Document Server

    Baianu, I

    2004-01-01

    An axiomatic representation of system dynamics is introduced in terms of categories, functors, organismal supercategories, limits and colimits of diagrams. Specific examples are considered in Complex Systems Biology, such as ribosome biogenesis and Hormonal Control in human subjects. "Fuzzy" Relational Structures are also proposed for flexible representations of biological system dynamics and organization.

  7. Carbon Nanomaterials: Applications in Physico-chemical Systemsand Biosystems

    Directory of Open Access Journals (Sweden)

    Maheshwar Sharon

    2008-07-01

    Full Text Available In the present article, various forms of carbon and carbon nanomaterials (CNMs), and a new approach to classifying them on the basis of sp2-sp3 configuration, are presented. Utilizing the concept of junction formation (like a p:n junction), a concept is developed to explain the special reactivity of nanosized carbon materials. Geometric consideration of the chiral and achiral symmetry of single-walled carbon nanotubes is presented, which is also responsible for manifesting the special properties of carbon nanotubes. A brief introduction to the various common synthesis techniques of CNMs is given. The increased chemical and biological activities have resulted in many engineered nanoparticles, which are being designed for specific purposes, including diagnostic or therapeutic medical uses and environmental remediation. Defence Science Journal, 2008, 58(4), pp. 460-485, DOI: http://dx.doi.org/10.14429/dsj.58.1668

  8. Flowering and sex expression in Acer L. : a biosystemic study

    NARCIS (Netherlands)

    Jong, de P.C.

    1976-01-01

    A review and an analysis are given of flowering and sex expression in Acer. The process of sex differentiation was studied in physiological experiments and could be influenced by accelerated flowering and by removal of female flower buds just after bud break. The paper further includes notes on the…

  9. The DOE Automated Radioxenon Sampler-Analyzer (ARSA) Beta-Gamma Coincidence Spectrometer Data Analyzer

    Science.gov (United States)

    2000-09-01

    The Automated Radioxenon Sampler/Analyzer (ARSA), developed at the Pacific Northwest National Laboratory for the Comprehensive Nuclear-Test-Ban Treaty, is evaluated in terms of how well radioxenon can be detected using the counting system given the daily fluctuations in radon gas interference, the background counts, and the memory effect of previous samples…

  10. A chemical analyzer for charged ultrafine particles

    Directory of Open Access Journals (Sweden)

    S. G. Gonser

    2013-04-01

    Full Text Available New particle formation is a frequent phenomenon in the atmosphere and of major significance for the Earth's climate and human health. To date the mechanisms leading to the nucleation of particles as well as to aerosol growth are not completely understood. A lack of appropriate measurement equipment for online analysis of the chemical composition of freshly nucleated particles is one major limitation. We have developed a Chemical Analyzer for Charged Ultrafine Particles (CAChUP) capable of analyzing particles with diameters below 30 nm. A bulk of size-separated particles is collected electrostatically on a metal filament, resistively desorbed and subsequently analyzed for its molecular composition in a time-of-flight mass spectrometer. We report on technical details as well as characterization experiments performed with the CAChUP. Our instrument was tested in the laboratory for its detection performance as well as for its collection and desorption capabilities. The manual application of defined masses of camphene (C10H16) to the desorption filament resulted in a detection limit between 0.5 and 5 ng, and showed a linear response of the mass spectrometer. Flow tube experiments with 25 nm diameter secondary organic aerosol from ozonolysis of alpha-pinene also showed a linear relation between collection time and the mass spectrometer's signal intensity. The resulting mass spectra from the collection experiments are in good agreement with published work on particles generated by the ozonolysis of alpha-pinene. A sensitivity study shows that the current setup of CAChUP is ready for laboratory measurements and for the observation of new particle formation events in the field.

  11. A chemical analyzer for charged ultrafine particles

    Directory of Open Access Journals (Sweden)

    S. G. Gonser

    2013-09-01

    Full Text Available New particle formation is a frequent phenomenon in the atmosphere and of major significance for the Earth's climate and human health. To date the mechanisms leading to the nucleation of particles as well as to aerosol growth are not completely understood. A lack of appropriate measurement equipment for online analysis of the chemical composition of freshly nucleated particles is one major limitation. We have developed a Chemical Analyzer for Charged Ultrafine Particles (CAChUP) capable of analyzing particles with diameters below 30 nm. A bulk of size-separated particles is collected electrostatically on a metal filament, resistively desorbed and subsequently analyzed for its molecular composition in a time of flight mass spectrometer. We report on technical details as well as characterization experiments performed with the CAChUP. Our instrument was tested in the laboratory for its detection performance as well as for its collection and desorption capabilities. The manual application of defined masses of camphene (C10H16) to the desorption filament resulted in a detection limit between 0.5 and 5 ng, and showed a linear response of the mass spectrometer. Flow tube experiments of 25 nm diameter secondary organic aerosol from ozonolysis of alpha-pinene also showed a linear relation between collection time and the mass spectrometer's signal intensity. The resulting mass spectra from the collection experiments are in good agreement with published work on particles generated by the ozonolysis of alpha-pinene. A sensitivity study shows that the current setup of CAChUP is ready for laboratory measurements and for the observation of new particle formation events in the field.

  12. Thermo Scientific Ozone Analyzer Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Springston, S. R. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-03-01

    The primary measurement output from the Thermo Scientific Ozone Analyzer is the concentration of the analyte (O3) reported at 1-s resolution in units of ppbv in ambient air. Note that because of internal pneumatic switching limitations the instrument only makes an independent measurement every 4 seconds; thus, the same concentration number is repeated roughly 4 times on the uniform, monotonic 1-s time base used in the AOS systems. Accompanying instrument outputs include sample temperatures, flows, chamber pressure, lamp intensities and a multiplicity of housekeeping information. There is also a field for operator comments made at any time while data are being collected.
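
    Because each independent reading is repeated roughly four times on the 1-s grid, naive averaging over-weights repeated values. A minimal pandas sketch (column name and timestamps are invented for illustration) collapses the repeats to one row per independent measurement:

```python
import pandas as pd

# Hypothetical 1-s record: the analyzer repeats each independent O3 reading ~4x.
df = pd.DataFrame(
    {"o3_ppbv": [31.2, 31.2, 31.2, 31.2, 29.8, 29.8, 29.8, 29.8]},
    index=pd.date_range("2016-03-01 00:00:00", periods=8, freq="s"),
)

# Keep one row per independent measurement by dropping consecutive repeats.
# Caveat: two genuinely identical consecutive independent readings would
# also be merged; acceptable for a rough de-duplication.
independent = df[df["o3_ppbv"].ne(df["o3_ppbv"].shift())]
print(independent)
```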

  13. Spectrum Analyzers Incorporating Tunable WGM Resonators

    Science.gov (United States)

    Savchenkov, Anatoliy; Matsko, Andrey; Strekalov, Dmitry; Maleki, Lute

    2009-01-01

    A photonic instrument is proposed to boost the resolution of ultraviolet/optical/infrared spectral analysis and spectral imaging, allowing the detection of narrow (0.00007-to-0.07-picometer wavelength resolution range) optical spectral signatures of chemical elements in space and planetary atmospheres. The idea underlying the proposal is to exploit the advantageous spectral characteristics of whispering-gallery-mode (WGM) resonators to obtain spectral resolutions at least three orders of magnitude greater than those of optical spectrum analyzers now in use. Such high resolutions would enable measurement of spectral features that could not be resolved by prior instruments.

  14. Analyzing Engineered Nanoparticles using Photothermal Infrared Spectroscopy

    DEFF Research Database (Denmark)

    Yamada, Shoko

    … using redox activity measurements. With a new setup adapted to miniaturization, stable pH was achieved; platinum was found to be more suitable than gold for open circuit potential-time measurements; miniaturized platinum working electrodes and quasi silver/silver chloride reference electrodes were … of design rules for the responsivity of the string-based photothermal spectrometer. Responsivity is maximized for a thin, narrow and long string irradiated by high-power radiation. Various types of nanoparticles and binary mixtures of them were successfully detected and analyzed. Detection of copper …

  15. CRISP90 - SOFTWARE DESIGN ANALYZER SYSTEM

    Science.gov (United States)

    Tausworthe, R. C.

    1994-01-01

    The CRISP90 Software Design Analyzer System, an update of CRISP-80, is a set of programs forming a software design and documentation tool which supports top-down, hierarchic, modular, structured design and programming methodologies. The quality of a computer program can often be significantly influenced by the design medium in which the program is developed. The medium must foster the expression of the programmer's ideas easily and quickly, and it must permit flexible and facile alterations, additions, and deletions to these ideas as the design evolves. The CRISP90 software design analyzer system was developed to provide the PDL (Programmer Design Language) programmer with such a design medium. A program design using CRISP90 consists of short, English-like textual descriptions of data, interfaces, and procedures that are embedded in a simple, structured, modular syntax. The display is formatted into two-dimensional, flowchart-like segments for a graphic presentation of the design. Together with a good interactive full-screen editor or word processor, the CRISP90 design analyzer becomes a powerful tool for the programmer. In addition to being a text formatter, the CRISP90 system prepares material that would be tedious and error prone to extract manually, such as a table of contents, module directory, structure (tier) chart, cross-references, and a statistics report on the characteristics of the design. Referenced modules are marked by schematic logic symbols to show conditional, iterative, and/or concurrent invocation in the program. A keyword usage profile can be generated automatically and glossary definitions inserted into the output documentation. Another feature is the capability to detect changes that were made between versions. Thus, "change-bars" can be placed in the output document along with a list of changed pages and a version history report. Also, items may be marked as "to be determined" and each will appear on a special table until the item is …

  16. Analyzing PICL trace data with MEDEA

    Energy Technology Data Exchange (ETDEWEB)

    Merlo, A.P. [Pavia Univ. (Italy). Dipt di Informatica e Sistemistica; Worley, P.H. [Oak Ridge National Lab., TN (United States)

    1993-11-01

    Execution traces and performance statistics can be collected for parallel applications on a variety of multiprocessor platforms by using the Portable Instrumented Communication Library (PICL). The static and dynamic characteristics of the collected performance data can be analyzed easily and effectively with the facilities provided within the MEasurements Description Evaluation and Analysis tool (MEDEA). This report describes the integration of the PICL trace file format into MEDEA. A case study is then outlined that uses PICL and MEDEA to characterize the performance of a parallel benchmark code executed on different hardware platforms and using different parallel algorithms and communication protocols.

  17. Using SCR methods to analyze requirements documentation

    Science.gov (United States)

    Callahan, John; Morrison, Jeffery

    1995-01-01

    Software Cost Reduction (SCR) methods are being utilized to analyze and verify selected parts of NASA's EOS-DIS Core System (ECS) requirements documentation, with SCR being used as a spot-inspection tool. This formal and systematic application of the SCR requirements methods yields insights as to whether the requirements are internally inconsistent or incomplete as the scenarios of intended usage evolve in the OC (Operations Concept) documentation. Thus, by modelling the scenarios and requirements as mode charts using the SCR methods, we have been able to identify problems within and between the documents.

  18. DEVELOPMENT OF AN ON-LINE COAL WASHABILITY ANALYZER

    Energy Technology Data Exchange (ETDEWEB)

    J.D. Miller

    1999-09-30

    Washability analysis is the basis for nearly all coal preparation plant separations. Unfortunately, there are no on-line techniques for determining this most fundamental of all coal cleaning information. In light of recent successes at the University of Utah, it now appears possible to determine coal washability on-line through the use of x-ray computed tomography (CT) analysis. The successful development of such a device is critical to the establishment of process control and automated coal blending systems. In this regard, Virginia Tech, Terra Tek Inc., and several eastern coal companies have joined with the University of Utah and agreed to undertake the development of an x-ray CT-based on-line coal washability analyzer with financial assistance from DOE. The three-year project will cost $594,571, of which 33% ($194,575) will be cost-shared by the participants. The project involves development of appropriate software and extensive testing/evaluation of well-characterized coal samples from operating coal preparation plants. Each project participant brings special expertise to the project, which is expected to create a new dimension in coal cleaning technology. Finally, it should be noted that the analyzer may prove to be a universal analyzer capable of providing not only washability analysis, but also particle size distribution analysis, ash analysis and perhaps pyritic sulfur analysis.

  19. CALIBRATION OF ONLINE ANALYZERS USING NEURAL NETWORKS

    Energy Technology Data Exchange (ETDEWEB)

    Rajive Ganguli; Daniel E. Walsh; Shaohai Yu

    2003-12-05

    Neural networks were used to calibrate an online ash analyzer at the Usibelli Coal Mine, Healy, Alaska, by relating the Americium and Cesium counts to the ash content. A total of 104 samples were collected from the mine, 47 from screened coal and the rest from unscreened coal. Each sample corresponded to 20 seconds of coal on the running conveyor belt. Neural network modeling used the quick stop training procedure; therefore, the samples were split into training, calibration and prediction subsets. Special techniques, using genetic algorithms, were developed to representatively split the sample into the three subsets. Two separate approaches were tried: in one approach, the screened and unscreened coal were modeled separately; in the other, a single model was developed for the entire dataset. No advantage was seen from modeling the two subsets separately. The neural network method performed very well on average but not individually, i.e., though each individual prediction was unreliable, the average of a few predictions was close to the true average. Thus, the method demonstrated that the analyzers were accurate at 2-3 minute intervals (averages of 6-9 samples), but not at 20 seconds (individual predictions).
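
    The calibration scheme is straightforward to sketch. The stand-in below uses synthetic Americium/Cesium counts, a plain random split in place of the paper's genetic-algorithm subset selection, and scikit-learn's built-in early stopping as the "quick stop" analogue:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in data: Americium and Cesium counts (scaled) for 104
# twenty-second belt samples, with ash percentage as the target.
X = rng.uniform(0.5, 1.5, size=(104, 2))
y = 5.0 + 8.0 * X[:, 0] - 3.0 * X[:, 1] + rng.normal(0, 1.0, 104)

# Random split stands in for the genetic-algorithm split; early_stopping
# plays the role of quick-stop training (a held-out set halts training).
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
net = MLPRegressor(hidden_layer_sizes=(8,), early_stopping=True,
                   validation_fraction=0.2, max_iter=5000, random_state=0)
net.fit(X_tr, y_tr)

pred = net.predict(X_te)
print("per-sample RMSE:", np.sqrt(np.mean((pred - y_te) ** 2)))

# Individual 20-s predictions are noisy, but 6-sample block averages
# (about 2 minutes of coal) track the true block averages much better.
k = 6
n = (len(pred) // k) * k
print("6-sample block-average errors:",
      np.abs(pred[:n].reshape(-1, k).mean(1) - y_te[:n].reshape(-1, k).mean(1)))
```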

  20. Analyzing Mode Confusion via Model Checking

    Science.gov (United States)

    Luettgen, Gerald; Carreno, Victor

    1999-01-01

    Mode confusion is one of the most serious problems in aviation safety. Today's complex digital flight decks make it difficult for pilots to maintain awareness of the actual states, or modes, of the flight deck automation. NASA Langley leads an initiative to explore how formal techniques can be used to discover possible sources of mode confusion. As part of this initiative, a flight guidance system was previously specified as a finite Mealy automaton, and the theorem prover PVS was used to reason about it. The objective of the present paper is to investigate whether state-exploration techniques, especially model checking, are better able to achieve this task than theorem proving and also to compare several verification tools for the specific application. The flight guidance system is modeled and analyzed in Murphi, SMV, and Spin. The tools are compared regarding their system description language, their practicality for analyzing mode confusion, and their capabilities for error tracing and for animating diagnostic information. It turns out that their strengths are complementary.
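
    The state-exploration idea can be illustrated in miniature: pair the automation's actual mode with the mode a crew would infer from the annunciations, then search the reachable state space for a disagreement. The automaton below is invented for illustration and is far simpler than the actual flight guidance system:

```python
from collections import deque

# Toy model: states pair (real_mode, displayed_mode); one transition
# silently reverts the real mode without updating the display.
transitions = {
    (("VS", "VS"), "capture_alt"): ("ALT_HOLD", "ALT_HOLD"),
    (("VS", "VS"), "pilot_dial"): ("VS", "VS"),
    (("ALT_HOLD", "ALT_HOLD"), "pilot_dial"): ("VS", "ALT_HOLD"),  # silent reversion
    (("VS", "ALT_HOLD"), "capture_alt"): ("ALT_HOLD", "ALT_HOLD"),
}

def find_confusion(start=("VS", "VS")):
    """BFS over reachable states; return an event trace leading to a state
    where the real mode and the displayed mode disagree (mode confusion)."""
    seen, queue = {start}, deque([(start, [])])
    while queue:
        state, trace = queue.popleft()
        if state[0] != state[1]:
            return trace, state           # counterexample, as a model checker would report
        for (src, event), nxt in transitions.items():
            if src == state and nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, trace + [event]))
    return None

print(find_confusion())  # -> (['capture_alt', 'pilot_dial'], ('VS', 'ALT_HOLD'))
```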

  1. Analyzing Network Coding Gossip Made Easy

    CERN Document Server

    Haeupler, Bernhard

    2010-01-01

    We give a new technique to analyze the stopping time of gossip protocols that are based on random linear network coding (RLNC). Our analysis drastically simplifies, extends and strengthens previous results. We analyze RLNC gossip in a general framework for network and communication models that encompasses and unifies the models used previously in this context. We show, in most settings for the first time, that it converges with high probability in the information-theoretically optimal time. Most stopping times are of the form O(k + T) where k is the number of messages to be distributed and T is the time it takes to disseminate one message. This means RLNC gossip achieves "perfect pipelining". Our analysis directly extends to highly dynamic networks in which the topology can change completely at any time. This remains true even if the network dynamics are controlled by a fully adaptive adversary that knows the complete network state. Virtually nothing besides simple O(kT) sequential flooding protocols was prev...
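
    The claimed behavior is easy to reproduce in simulation. A minimal sketch, assuming a complete graph, GF(2) coefficients, and uniform PUSH gossip (all simplifications of the paper's general model), counts rounds until every node can decode:

```python
import random
import numpy as np

def gf2_rank(vectors):
    """Rank over GF(2) of a list of 0/1 coefficient vectors (Gaussian elimination)."""
    if not vectors:
        return 0
    m = np.array(vectors, dtype=np.uint8)
    rank = 0
    for col in range(m.shape[1]):
        pivot = next((r for r in range(rank, m.shape[0]) if m[r, col]), None)
        if pivot is None:
            continue
        m[[rank, pivot]] = m[[pivot, rank]]
        for r in range(m.shape[0]):
            if r != rank and m[r, col]:
                m[r] ^= m[rank]
        rank += 1
    return rank

def rlnc_gossip_rounds(n=20, k=5, seed=1):
    """PUSH gossip on the complete graph: each round, every informed node sends
    a random GF(2) combination of its stored vectors to a random other node
    (skipping the rare all-zero draw). Returns rounds until all nodes have
    rank k, i.e., can decode all k messages."""
    random.seed(seed)
    store = [[np.eye(k, dtype=np.uint8)[i] for i in range(k)]]  # node 0 knows all
    store += [[] for _ in range(n - 1)]
    rounds = 0
    while any(gf2_rank(s) < k for s in store):
        rounds += 1
        deliveries = []
        for u in range(n):
            chosen = [v for v in store[u] if random.random() < 0.5]
            if chosen:
                packet = np.bitwise_xor.reduce(chosen, axis=0)
                deliveries.append((random.choice([w for w in range(n) if w != u]), packet))
        for dest, packet in deliveries:  # deliver simultaneously
            store[dest].append(packet)
    return rounds

print("rounds until all 20 nodes decode 5 messages:", rlnc_gossip_rounds())
```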

  2. Analyzing Interoperability of Protocols Using Model Checking

    Institute of Scientific and Technical Information of China (English)

    WUPeng

    2005-01-01

    In practical terms, protocol interoperability testing is still laborious and error-prone with little effect, even for those products that have passed conformance testing. Deadlock and unsymmetrical data communication are familiar in interoperability testing, and it is always very hard to trace their causes. The previous work has not provided a coherent way to analyze why the interoperability was broken among protocol implementations under test. In this paper, an alternative approach is presented to analyzing these problems from a viewpoint of implementation structures. Sequential and concurrent structures are both representative implementation structures, especially in event-driven development model. Our research mainly discusses the influence of sequential and concurrent structures on interoperability, with two instructive conclusions: (a) a sequential structure may lead to deadlock; (b) a concurrent structure may lead to unsymmetrical data communication. Therefore, implementation structures carry weight on interoperability, which may not gain much attention before. To some extent, they are decisive on the result of interoperability testing. Moreover, a concurrent structure with a sound task-scheduling strategy may contribute to the interoperability of a protocol implementation. Herein model checking technique is introduced into interoperability analysis for the first time. As the paper shows, it is an effective way to validate developers' selections on implementation structures or strategies.

  3. Sentiment Analyzer for Arabic Comments System

    Directory of Open Access Journals (Sweden)

    Alaa El-Dine Ali Hamouda

    2013-04-01

    Full Text Available Today, the number of users of social networks is increasing. Millions of users share opinions on different aspects of life every day, making social networks rich sources of data for opinion mining and sentiment analysis. Users have also become more interested in following news pages on Facebook. Several posts, political ones for example, have thousands of user comments that agree or disagree with the post content. Such comments can be a good indicator of the community's opinion about the post content. For politicians, marketers, decision makers and others, sentiment analysis is needed to know the percentage of users who agree, who disagree, and who are neutral with respect to a post. This raised the need to analyze users' comments on Facebook. We focused on Arabic Facebook news pages for the task of sentiment analysis. We developed a corpus for sentiment analysis and opinion mining purposes. Then, we used different machine learning algorithms, namely decision trees, support vector machines, and Naive Bayes, to develop sentiment analyzers. The performance of the system using each technique was evaluated and compared with the others.
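
    The pipeline described is standard supervised text classification. A minimal scikit-learn sketch, with an invented English stand-in corpus (the study built a purpose-made Arabic corpus), shows the Naive Bayes variant:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy stand-in corpus: labels are agree / disagree / neutral with
# respect to a post, mirroring the study's three classes.
comments = ["great decision, fully support it",
            "terrible policy, completely against it",
            "no opinion either way",
            "i support this move",
            "strongly against this"]
labels = ["agree", "disagree", "neutral", "agree", "disagree"]

# Bag-of-words counts feeding a multinomial Naive Bayes classifier.
clf = make_pipeline(CountVectorizer(), MultinomialNB())
clf.fit(comments, labels)
print(clf.predict(["completely against this policy"]))  # -> ['disagree']
```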

  4. Optoacoustic 13C-breath test analyzer

    Science.gov (United States)

    Harde, Hermann; Helmrich, Günther; Wolff, Marcus

    2010-02-01

    The composition and concentration of exhaled volatile gases reflect the physiological condition of a patient. A breath analysis therefore allows an infectious disease in an organ to be recognized, or even a tumor to be identified. One of the most prominent breath tests is the 13C-urea breath test, applied to ascertain the presence of the bacterium Helicobacter pylori in the stomach wall as an indication of a gastric ulcer. In this contribution we present a new optical analyzer that employs a compact and simple set-up based on photoacoustic spectroscopy. It consists of two identical photoacoustic cells containing two breath samples, one taken before and one after administering an isotope-marked substrate, in which the most common isotope 12C is replaced to a large extent by 13C. The analyzer measures simultaneously the relative CO2 isotopologue concentrations in both samples by exciting the molecules on specially selected absorption lines with a semiconductor laser operating at a wavelength of 2.744 μm. For a reliable diagnosis, changes of the 13CO2 concentration of 1% in the exhaled breath have to be detected at a concentration level of this isotope in the breath of about 500 ppm.
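
    The arithmetic behind such tests is the isotope-ratio delta notation: the 13C/12C ratio of each sample is expressed relative to a reference standard, and the diagnosis rests on the delta-over-baseline enrichment. A small sketch (sample ratios invented; the VPDB reference ratio is the standard value):

```python
R_VPDB = 0.0112372  # 13C/12C ratio of the VPDB reference standard

def delta13C(r_sample: float) -> float:
    """delta-13C in per mil relative to VPDB."""
    return (r_sample / R_VPDB - 1.0) * 1000.0

def delta_over_baseline(r_pre: float, r_post: float) -> float:
    """DOB: enrichment of the post-substrate sample over the baseline sample."""
    return delta13C(r_post) - delta13C(r_pre)

# Illustrative ratios only: a baseline near natural abundance and a
# post-urea sample enriched by ~1%, the change the record says must be
# resolved at a ~500 ppm 13CO2 level.
print(f"DOB = {delta_over_baseline(0.011200, 0.011320):.1f} per mil")
```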

  5. Atmospheric Aerosol Chemistry Analyzer: Demonstration of feasibility

    Energy Technology Data Exchange (ETDEWEB)

    Mroz, E.J.; Olivares, J.; Kok, G.

    1996-04-01

    This is the final report of a three-year, Laboratory-Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The project objective was to demonstrate the technical feasibility of an Atmospheric Aerosol Chemistry Analyzer (AACA) that will provide a continuous, real-time analysis of the elemental (major, minor and trace) composition of atmospheric aerosols. The AACA concept is based on sampling the atmospheric aerosol through a wet cyclone scrubber that produces an aqueous suspension of the particles. This suspension can then be analyzed for elemental composition by ICP/MS or collected for subsequent analysis by other methods. The key technical challenge was to develop a wet cyclone aerosol sampler suitable for respirable particles found in ambient aerosols. We adapted an ultrasonic nebulizer to a conventional, commercially available, cyclone aerosol sampler and completed collection efficiency tests for the unit, which was shown to efficiently collect particles as small as 0.2 microns. We have completed the necessary basic research and have demonstrated the feasibility of the AACA concept.

  6. Analyzing rare diseases terms in biomedical terminologies

    Directory of Open Access Journals (Sweden)

    Erika Pasceri

    2012-03-01

    Full Text Available Rare disease patients too often face common problems, including lack of access to a correct diagnosis, lack of quality information on the disease, lack of scientific knowledge of the disease, and inequities and difficulties in access to treatment and care. These things could be changed by implementing a comprehensive approach to rare diseases, increasing international cooperation in scientific research, gaining and sharing scientific knowledge about the diseases, and developing tools for extracting and sharing that knowledge. A significant aspect to analyze is the organization of knowledge in the biomedical field for the proper management and retrieval of health information. For these purposes, the sources needed were acquired from the Office of Rare Diseases Research, the National Organization for Rare Disorders and Orphanet, organizations that provide information to patients and physicians and facilitate the exchange of information among the different actors involved in this field. The present paper shows the representation of rare-disease terms in biomedical terminologies such as MeSH, ICD-10, SNOMED CT and OMIM, leveraging the fact that these terminologies are integrated in the UMLS. At the first level, the overlap among sources was analyzed; at a second level, the presence of rare-disease terms in target sources included in the UMLS was assessed, working at the term and concept level. We found that MeSH has the best representation of rare-disease terms.

  7. Fully Analyzing an Algebraic Polya Urn Model

    CERN Document Server

    Morcrette, Basile

    2012-01-01

    This paper introduces and analyzes a particular class of Polya urns: balls are of two colors, can only be added (the urns are said to be additive), and at every step the same constant number of balls is added, so that only the color composition varies (the urns are said to be balanced). These properties make this class of urns ideally suited for analysis from an "analytic combinatorics" point of view, following in the footsteps of Flajolet-Dumas-Puyhaubert (2006). Through an algebraic generating function to which we apply a multiple coalescing saddle-point method, we are able to give precise asymptotic results for the probability distribution of the composition of the urn, as well as a local limit law and large deviation bounds.
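
    Such urns are also easy to simulate directly, which is handy for sanity-checking asymptotic results. A Monte Carlo sketch with one invented balanced additive rule, where every step adds three balls:

```python
import random
from collections import Counter

def simulate_urn(steps=200, trials=20000, seed=42):
    """Balanced additive two-color Polya urn: draw a ball, return it, and add
    balls by a fixed rule (black drawn -> +2 black +1 white; white drawn ->
    +1 black +2 white). Every step adds 3 balls, so only composition varies."""
    random.seed(seed)
    counts = Counter()
    for _ in range(trials):
        black, white = 1, 1
        for _ in range(steps):
            if random.random() < black / (black + white):
                black, white = black + 2, white + 1
            else:
                black, white = black + 1, white + 2
        counts[black] += 1          # empirical distribution of the composition
    return counts

hist = simulate_urn()
mean_black = sum(b * n for b, n in hist.items()) / sum(hist.values())
print("mean number of black balls after 200 steps:", mean_black)  # ~301 by symmetry
```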

  8. Modeling and analyzing architectural change with alloy

    DEFF Research Database (Denmark)

    Hansen, Klaus Marius; Ingstrup, Mads

    2010-01-01

    Although adaptivity based on reconfiguration has the potential to improve the dependability of systems, the cost of a failed attempt at reconfiguration is prohibitive in precisely the applications where high dependability is required. Existing work on formal modeling and verification of architectural reconfigurations partly achieves the goal of ensuring correctness; however, the formalisms used often lack tool support, and the ensuing models have an uncertain relation to a concrete implementation. Thus a practical way to ensure with formal certainty that specific architectural changes are correct remains a barrier to the uptake of reconfiguration techniques in industry. Using the Alloy language and associated tool, we propose a practical way to formally model and analyze runtime architectural change expressed as architectural scripts. Our evaluation shows the performance to be acceptable; our experience …

  9. Analyzing and forecasting the European social climate

    Directory of Open Access Journals (Sweden)

    Liliana DUGULEANĂ

    2015-06-01

    Full Text Available The paper uses the results of the Eurobarometer sample survey, which is conducted at the request of the European Commission. The social climate index measures the level of perception of the population, taking into account their personal situation and their perspective at national level. The paper analyzes the evolution of the social climate indices for the countries of the European Union and offers information about the expectations of the populations of the analyzed countries. The obtained results can be compared with the Eurobarometer forecasts, on a short term of one year and a medium term of five years. Modelling the social climate index and its influencing factors offers useful information about the efficiency of social protection and inclusion policies.

  10. Drug stability analyzer for long duration spaceflights

    Science.gov (United States)

    Shende, Chetan; Smith, Wayne; Brouillette, Carl; Farquharson, Stuart

    2014-06-01

    Crewmembers of current and future long duration spaceflights require drugs to overcome the deleterious effects of weightlessness, sickness and injuries. Unfortunately, recent studies have shown that some of the drugs currently used may degrade more rapidly in space, losing their potency well before their expiration dates. To complicate matters, the degradation products of some drugs can be toxic. Consequently there is a need for an analyzer that can determine if a drug is safe at the time of use, as well as to monitor and understand space-induced degradation, so that drug types, formulations, and packaging can be improved. Towards this goal we have been investigating the ability of Raman spectroscopy to monitor and quantify drug degradation. Here we present preliminary data by measuring acetaminophen, and its degradation product, p-aminophenol, as pure samples, and during forced degradation reactions.

  11. Basis-neutral Hilbert-space analyzers

    CERN Document Server

    Martin, Lane; Kondakci, H Esat; Larson, Walker D; Shabahang, Soroush; Jahromi, Ali K; Malhotra, Tanya; Vamivakas, A Nick; Atia, George K; Abouraddy, Ayman F

    2016-01-01

    Interferometry is one of the central organizing principles of optics. Key to interferometry is the concept of optical delay, which facilitates spectral analysis in terms of time-harmonics. In contrast, when analyzing a beam in a Hilbert space spanned by spatial modes -- a critical task for spatial-mode multiplexing and quantum communication -- basis-specific principles are invoked that are altogether distinct from that of `delay.' Here, we extend the traditional concept of temporal delay to the spatial domain, thereby enabling the analysis of a beam in an arbitrary spatial-mode basis -- exemplified using Hermite-Gaussian and radial Laguerre-Gaussian modes. Such generalized delays correspond to optical implementations of fractional transforms; for example, the fractional Hankel transform is the generalized delay associated with the space of Laguerre-Gaussian modes, and an interferometer incorporating such a `delay' obtains modal weights in the associated Hilbert space. By implementing an inherently stable, rec...

  12. Analyzing Options for Airborne Emergency Wireless Communications

    Energy Technology Data Exchange (ETDEWEB)

    Michael Schmitt; Juan Deaton; Curt Papke; Shane Cherry

    2008-03-01

    In the event of large-scale natural or manmade catastrophic events, access to reliable and enduring commercial communication systems is critical. Hurricane Katrina provided a recent example of the need to ensure communications during a national emergency. To ensure that communication demands are met during these critical times, Idaho National Laboratory (INL), under the guidance of United States Strategic Command, has studied infrastructure issues, concerns, and vulnerabilities associated with an airborne wireless communications capability. Such a capability could provide emergency wireless communications until public/commercial nodes can be systematically restored. This report focuses on the airborne cellular restoration concept, analyzing basic infrastructure requirements, identifying related infrastructure issues, concerns, and vulnerabilities, and offering recommended solutions.

  13. Computer model for analyzing sodium cold traps

    Energy Technology Data Exchange (ETDEWEB)

    McPheeters, C C; Raue, D J

    1983-05-01

    A computer model was developed to simulate the processes that occur in sodium cold traps. The Model for Analyzing Sodium Cold Traps (MASCOT) simulates any desired configuration of mesh arrangements and dimensions and calculates pressure drops and flow distributions, temperature profiles, impurity concentration profiles, and impurity mass distributions. The calculated pressure drop as a function of impurity mass content determines the capacity of the cold trap. The accuracy of the model was checked by comparing calculated mass distributions with experimentally determined mass distributions from literature publications and with results from our own cold trap experiments. The comparisons were excellent in all cases. A parametric study was performed to determine which design variables are most important in maximizing cold trap capacity.

  14. Analyzing Hydrological Sustainability Through Water Balance

    Science.gov (United States)

    Menció, Anna; Folch, Albert; Mas-Pla, Josep

    2010-05-01

    The objective of the Water Framework Directive (2000/60/EC) is to assist in the development of management plans that will lead to the sustainable use of water resources in all EU member states. However, defining the degree of sustainability aimed at is not a straightforward task. It requires detailed knowledge of the hydrogeological characteristics of the basin in question, its environmental needs, the amount of human water demand, and the opportunity to construct a proper water balance that describes the behavior of the hydrological system and estimates available water resources. An analysis of the water balance in the Selva basin (Girona, NE Spain) points to the importance of regional groundwater fluxes in satisfying current exploitation rates, and shows that regional scale approaches are often necessary to evaluate water availability. In addition, we discuss the pressures on water resources, and analyze potential actions, based on the water balance results, directed towards achieving sustainable water management in the basin.

  15. Stackable differential mobility analyzer for aerosol measurement

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, Meng-Dawn (Oak Ridge, TN); Chen, Da-Ren (Creve Coeur, MO)

    2007-05-08

    A multi-stage differential mobility analyzer (MDMA) for aerosol measurements includes a first electrode or grid including at least one inlet or injection slit for receiving an aerosol including charged particles for analysis. A second electrode or grid is spaced apart from the first electrode. The second electrode has at least one sampling outlet disposed at each of a plurality of different distances along its length. A volume between the first and the second electrode or grid, between the inlet or injection slit and a distal one of the plurality of sampling outlets, forms a classifying region, the first and second electrodes being charged to suitable potentials to create an electric field within the classifying region. At least one inlet or injection slit in the second electrode receives a sheath gas flow into an upstream end of the classifying region, wherein each sampling outlet functions as an independent DMA stage and classifies different size ranges of charged particles based on electrical mobility simultaneously.
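
    Each sampling outlet selects a centroid mobility given by the classical cylindrical-DMA relation Z* = Q_sh ln(r2/r1) / (2 pi V L) (Knutson-Whitby), so outlets at different distances L yield different size cuts at a single voltage. A sketch with illustrative geometry and flow values, not the patented device's dimensions:

```python
import math

E = 1.602176634e-19   # elementary charge [C]
MU = 1.81e-5          # air viscosity [Pa s], ~20 C
MFP = 66.5e-9         # gas mean free path [m] at ~1 atm (assumed value)

def slip(d):
    """Cunningham slip correction (a common parameterization)."""
    kn = 2 * MFP / d
    return 1 + kn * (1.257 + 0.4 * math.exp(-1.1 / kn))

def mobility(d, n=1):
    """Electrical mobility [m^2/(V s)] of an n-charged particle of diameter d [m]."""
    return n * E * slip(d) / (3 * math.pi * MU * d)

def selected_diameter(z_star, lo=1e-9, hi=1e-6):
    """Invert Z(d) by bisection; Z decreases monotonically with diameter."""
    for _ in range(100):
        mid = math.sqrt(lo * hi)
        if mobility(mid) > z_star:
            lo = mid          # mobility too high -> particle must be larger
        else:
            hi = mid
    return mid

# Illustrative numbers: 5 L/min sheath flow, typical cylindrical radii, 3 kV.
q_sh, r1, r2, v = 5.0 / 60000.0, 9.37e-3, 19.61e-3, 3000.0
for L in (0.05, 0.15, 0.45):   # three hypothetical outlet distances
    z = q_sh * math.log(r2 / r1) / (2 * math.pi * v * L)
    print(f"L = {L:5.2f} m  ->  d* = {selected_diameter(z) * 1e9:6.1f} nm")
```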

  16. Analyzing BSE transmission to quantify regional risk.

    Science.gov (United States)

    de Koeijer, Aline A

    2007-10-01

    As a result of consumer fears and political concerns related to BSE as a risk to human health, a need has arisen recently for more sensitive methods to detect BSE and more accurate methods to determine BSE incidence. As a part of the development of such methods, it is important to be able to identify groups of animals with above-average BSE risk. One of the well-known risk factors for BSE is age, as very young animals do not develop the disease, and very old animals are less likely to develop the disease. Here, we analyze which factors have a strong influence on the age distribution of BSE in a population. Building on that, we develop a simple set of calculation rules for classifying the BSE risk in a given cattle population. Required inputs are data on imports and on the BSE control measures in place over the last 10 or 20 years.

  17. Buccal microbiology analyzed by infrared spectroscopy

    Science.gov (United States)

    de Abreu, Geraldo Magno Alves; da Silva, Gislene Rodrigues; Khouri, Sônia; Favero, Priscila Pereira; Raniero, Leandro; Martin, Airton Abrahão

    2012-01-01

    Rapid microbiological identification and characterization are very important in dentistry and medicine. In addition to dental diseases, pathogens are directly linked to cases of endocarditis, premature delivery, low birth weight, and loss of organ transplants. Fourier Transform Infrared Spectroscopy (FTIR) was used to analyze the oral pathogens Aggregatibacter actinomycetemcomitans ATCC 29523, Aggregatibacter actinomycetemcomitans JP2, and a clinical isolate of Aggregatibacter actinomycetemcomitans from human blood (CI). Significant spectral differences were found among the organisms, allowing the identification and characterization of each bacterial species. Vibrational modes in the regions of 3500-2800 cm-1, 1484-1420 cm-1, and 1000-750 cm-1 were used in this differentiation. The identification and classification of each strain were performed by cluster analysis, achieving 100% separation of strains. This study demonstrated that FTIR can be used to decrease the identification time, compared to traditional methods, of fastidious buccal microorganisms associated with the etiology of periodontitis.

  18. Analyzing, Modelling, and Designing Software Ecosystems

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    This work studies software ecosystems: the software development and distribution by a set of actors dependent on each other and the ecosystem. We commence on the hypothesis that the establishment of a software ecosystem on the telemedicine services of Denmark would address these issues and investigate how a software ecosystem can foster the development, implementation, and use of telemedicine services. We initially expand the theory of software ecosystems by contributing to the definition and understanding of software ecosystems, providing means of analyzing existing and designing new ecosystems, and defining and measuring the qualities of software ecosystems. We use these contributions to design a software ecosystem in the telemedicine services of Denmark with (i) a common platform that supports and promotes development from different actors, (ii) high software interaction, (iii) a strong social network of actors, and (iv) robust business …

  19. Three Practical Methods for Analyzing Slope Stability

    Institute of Scientific and Technical Information of China (English)

    XU Shiguang; ZHANG Shitao; ZHU Chuanbing; YIN Ying

    2008-01-01

    Since the environmental capacity and the arable as well as the inhabited lands have actually reached a full balance, slopes are becoming more and more important options for various engineering constructions. Because of the geological complexity of slopes, the design and decision-making of a slope-based engineering project still cannot rely solely on theoretical analysis and numerical calculation, but mainly on the experience of experts. Therefore, it has important practical significance to turn some successful experience into mathematical equations. Based upon the abundant typical slope engineering construction cases in Yunnan, Southwestern China, three methods for analyzing slope stability have been developed in this paper. First of all, a corresponding analogous mathematical equation for analyzing slope stability has been established through case studies. Then, artificial neural network and multivariate regression analysis models have also been set up, adopting 7 main influencing factors.

  20. PMD: A Resource for Archiving and Analyzing Protein Microarray data.

    Science.gov (United States)

    Xu, Zhaowei; Huang, Likun; Zhang, Hainan; Li, Yang; Guo, Shujuan; Wang, Nan; Wang, Shi-Hua; Chen, Ziqing; Wang, Jingfang; Tao, Sheng-Ce

    2016-01-27

    Protein microarray is a powerful technology for both basic research and clinical study. However, because there is no database specifically tailored for protein microarrays, the majority of the valuable original protein microarray data is still not publicly accessible. To address this issue, we constructed the Protein Microarray Database (PMD), which is specifically designed for archiving and analyzing protein microarray data. In PMD, users can easily browse and search the entire database by experiment name, protein microarray type, and sample information. Additionally, PMD integrates several data analysis tools and provides an automated data analysis pipeline for users. With just one click, users can obtain a comprehensive analysis report for their protein microarray data. The report includes preliminary data analysis, such as data normalization and candidate identification, and an in-depth bioinformatics analysis of the candidates, which includes functional annotation, pathway analysis, and protein-protein interaction network analysis. PMD is now freely available at www.proteinmicroarray.cn.

  1. Analyzing Tibetan Monastic Conceptions of the Universe Through Individual Drawings

    Science.gov (United States)

    Sonam, Tenzin; Impey, Chris David

    2017-01-01

    Every culture and tradition has its own representation of the universe that continues to evolve due to the influence of new technologies, discoveries, and cultural exchanges. With the recent introduction of Western science into the Tibetan Buddhist monasteries in India, this study explores monastic conceptions of the universe prior to formal instruction in astronomy. The drawings of 59 Buddhist monks and nuns were analyzed using Tversky’s three criteria for drawing analysis—segmentation, order, and hierarchical structure of knowledge. We found that 22 out of 59 monastics drew a geocentric model of the universe with the Solar System as the dominant physical system, reflecting little influence of modern astronomical knowledge. Only six monastics drew the traditional Buddhist model of the world, generally known as the Mount Meru Cosmology. The implication of the monastics' representation of the universe for their assimilation into modern science is discussed.

  2. PLC backplane analyzer for field forensics and intrusion detection

    Science.gov (United States)

    Mulder, John; Schwartz, Moses Daniel; Berg, Michael; Van Houten, Jonathan Roger; Urrea, Jorge Mario; King, Michael Aaron; Clements, Abraham Anthony; Trent, Jason; Depoy, Jennifer M; Jacob, Joshua

    2015-05-12

    The various technologies presented herein relate to the determination of unexpected and/or malicious activity occurring between components communicatively coupled across a backplane. Control data, etc., can be intercepted at a backplane where the backplane facilitates communication between a controller and at least one device in an automation process. During interception of the control data, etc., a copy of the control data can be made, e.g., the original control data can be replicated to generate a copy of the original control data. The original control data can continue on to its destination, while the control data copy can be forwarded to an analyzer system to determine whether the control data contains a data anomaly. The content of the copy of the control data can be compared with a previously captured baseline data content, where the baseline data can be captured for a same operational state as the subsequently captured control data.
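
    The baseline-comparison step can be sketched generically; the message format, per-state keying, and hashing below are assumptions for illustration, not the patented method:

```python
import hashlib

# Per operational state, remember the digests of control messages seen
# during a known-good capture; later traffic that deviates is flagged.
baseline: dict[str, set[str]] = {}

def digest(payload: bytes) -> str:
    return hashlib.sha256(payload).hexdigest()

def learn(state: str, payload: bytes) -> None:
    """Record a replicated control message observed during baselining."""
    baseline.setdefault(state, set()).add(digest(payload))

def check(state: str, payload: bytes) -> bool:
    """True if the intercepted message matches the baseline for this state."""
    return digest(payload) in baseline.get(state, set())

learn("running", b"\x01\x10\x00\x2a")         # known-good write on the backplane
print(check("running", b"\x01\x10\x00\x2a"))  # True  -> expected traffic
print(check("running", b"\x01\x10\x00\xff"))  # False -> anomaly for analyst review
```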

  3. Analyzing Tibetan Monastics Conception of Universe Through Their Drawings

    Science.gov (United States)

    Sonam, Tenzin; Chris Impey

    2016-06-01

    Every culture and tradition has their own representation of the universe that continues to evolve through new technologies and discoveries, and as a result of cultural exchange. With the recent introduction of Western science into the Tibetan Buddhist monasteries in India, this study explores the monastics’ conception of the universe prior to their formal instruction in science. Their drawings were analyzed using Tversky’s three criteria for drawing analysis namely—segmentation, order, and hierarchical structure of knowledge. Among the sixty Buddhist monastics included in this study, we find that most of them draw a geocentric model of the universe with the Solar System as the dominant physical system, reflecting little influence of modern astronomical knowledge. A few monastics draw the traditional Buddhist model of the world. The implications of the monastics' representation of the universe for their assimilation of modern science is discussed.

  4. Multi-Pass Quadrupole Mass Analyzer

    Science.gov (United States)

    Prestage, John D.

    2013-01-01

    Analysis of the composition of planetary atmospheres is one of the most important and fundamental measurements in planetary robotic exploration. Quadrupole mass analyzers (QMAs) are the primary tool used to execute these investigations, but reductions in the size of these instruments have sacrificed mass resolving power, so that the best present-day QMA devices are still large, expensive, and do not deliver the performance of laboratory instruments. An ultra-high-resolution QMA was developed to resolve N2+/CO+ by trapping ions in a linear trap quadrupole filter. Because N2 and CO are resolved, gas chromatography columns used to separate species before analysis are eliminated, greatly simplifying gas analysis instrumentation. For highest performance, the ion trap mode is used. High-resolution (or narrow-band) mass selection is carried out in the central region, but near the DC electrodes at each end, RF/DC field settings are adjusted to allow broadband ion passage. This is to prevent ion loss during ion reflection at each end. Ions are created inside the trap so that low-energy particles are selected by low-voltage settings on the end electrodes. This is beneficial to good mass resolution since low-energy particles traverse many cycles of the RF filtering fields. Through Monte Carlo simulations, it is shown that ions are reflected at each end many tens of times, each time being sent back through the central section of the quadrupole where ultrahigh mass filtering is carried out. An analyzer was produced with an electrical length orders of magnitude longer than its physical length. Since the selector fields are sized as in conventional devices, the loss of sensitivity inherent in miniaturizing quadrupole instruments is avoided. The no-loss, multi-pass QMA architecture will improve the mass resolution of planetary QMA instruments while reducing demands on the RF electronics for high-voltage/high-frequency production, since ion transit time is no longer limited to a single pass. The …

  5. Comparison of two dry chemistry analyzers and a wet chemistry analyzer using canine serum.

    Science.gov (United States)

    Lanevschi, Anne; Kramer, John W.

    1996-01-01

    Canine serum was used to compare seven chemistry analytes on two tabletop clinical dry chemistry analyzers, Boehringer's Reflotron and Kodak's Ektachem. Results were compared to those obtained on a wet chemistry reference analyzer, Roche Diagnostic's Cobas Mira. Analytes measured were urea nitrogen (BUN), creatinine, glucose, aspartate aminotransferase (AST), alanine aminotransferase (ALT), cholesterol and bilirubin. Nine to 12 canine sera with values in the low, normal, and high range were evaluated. The correlations were acceptable for all comparisons with correlation coefficients greater than 0.98 for all analytes. Regression analysis resulted in significant differences for both tabletop analyzers when compared to the reference analyzer for cholesterol and bilirubin, and for glucose and AST on the Kodak Ektachem. Differences appeared to result from proportional systematic error occurring at high analyte concentrations.

  6. The Chemnitz LogAnalyzer: a tool for analyzing data from hypertext navigation research.

    Science.gov (United States)

    Brunstein, Angela; Naumann, Anja; Krems, Josef F

    2005-05-01

    Computer-based studies usually produce log files as raw data. These data cannot be analyzed adequately with conventional statistical software. The Chemnitz LogAnalyzer provides tools for quick and comfortable visualization and analyses of hypertext navigation behavior by individual users and for aggregated data. In addition, it supports analogous analyses of questionnaire data and reanalysis with respect to several predefined orders of nodes of the same hypertext. As an illustration of how to use the Chemnitz LogAnalyzer, we give an account of one study on learning with hypertext. Participants either searched for specific details or read a hypertext document to familiarize themselves with its content. The tool helped identify navigation strategies affected by these two processing goals and provided comparisons, for example, of processing times and visited sites. Altogether, the Chemnitz LogAnalyzer fills the gap between log files as raw data of Web-based studies and conventional statistical software.

  7. First cloud-based service for analyzing storage tank data

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    2010-01-15

    Most commercial storage tanks are unmonitored and require manual processes to verify conditions, remediate issues or request servicing. New Boundary Technologies has developed an off-the-shelf solution that eliminates several manual processes. Its TankVista Internet service was launched as the first cloud-based service for continuously monitoring and analyzing the conditions and storage levels of commercial storage tanks, bins, silos and other containers. TankVista takes data from storage tank sensors and translates it into graphics and maps that industry can use to drive new efficiencies in storage tank management. A bulk oil distributor can leverage TankVista to remotely and continuously monitor its own storage tanks as well as those of its clients. TankVista monitors tank level, temperature, pressure, humidity and other storage criteria in order to know exactly when and where to replenish supplies. Rather than re-filling tanks at about 50 per cent capacity, a bulk oil distributor can wait until usage levels dictate more efficient re-filling. The monitoring takes place without manual intervention. TankVista complements the iDigi Tank, which has the unique ability to wirelessly connect dispersed and remote tank assets, and get this information through drop-in wireless mesh technology to the cloud without requiring onsite Internet access.

  8. Decomposition methods for analyzing changes of industrial water use

    Science.gov (United States)

    Shang, Yizi; Lu, Shibao; Shang, Ling; Li, Xiaofei; Wei, Yongping; Lei, Xiaohui; Wang, Chao; Wang, Hao

    2016-12-01

    Changes in industrial water use are of the utmost significance in rapidly developing countries. Such countries are experiencing rapid industrialization, which may stimulate substantial increases in their future industrial water use. Local governments face challenges in formulating industrial policies for sustainable development, particularly in areas that experience severe water shortages. This study addresses the factors driving increased industrial water use and the degrees to which these factors contribute, and determines whether the trend will change in the future. It explores options for quantitatively analyzing changes in industrial water use. We adopt both the refined Laspeyres and the Logarithmic Mean Divisia Index (LMDI) models to decompose the driving forces of industrial water use, and we validate the decomposition results through a comparative empirical study. Using Tianjin, a national water-saving city in China, as a case study, we compare the performance of the two models. In the study, the driving forces of changes in industrial water use are summarized as output, technological, and structural forces. The comparative results indicate that the refined Laspeyres model may be preferable for this case, and further reveal that output and technology have long-term, stable effects on industrial water use. However, structure may have an uncertain influence on industrial water use. The reduced water use may be a consequence of Tianjin's attempts to target water savings in other areas. We therefore advise the Tianjin local government to restructure local industries towards water-saving targets.
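
    The LMDI decomposition has a compact closed form: each driver's contribution is its log-ratio weighted by the logarithmic mean of sectoral water use in the two years, and the effects sum exactly to the total change. A sketch with invented two-sector data (not Tianjin's):

```python
import math

def lmean(a: float, b: float) -> float:
    """Logarithmic mean, L(a, b) = (a - b) / (ln a - ln b)."""
    return a if a == b else (a - b) / (math.log(a) - math.log(b))

# Illustrative data: industrial water use W = sum_i Q * s_i * w_i, with
# total output Q, sector share s_i, and water intensity w_i (use per output).
Q0, Q1 = 100.0, 150.0
s0, s1 = [0.6, 0.4], [0.5, 0.5]
w0, w1 = [2.0, 1.0], [1.6, 0.9]

W0 = [Q0 * s * w for s, w in zip(s0, w0)]
W1 = [Q1 * s * w for s, w in zip(s1, w1)]

# Additive LMDI-I: output, structure, and technology (intensity) effects.
d_output     = sum(lmean(a, b) * math.log(Q1 / Q0) for a, b in zip(W1, W0))
d_structure  = sum(lmean(a, b) * math.log(sa / sb)
                   for a, b, sa, sb in zip(W1, W0, s1, s0))
d_technology = sum(lmean(a, b) * math.log(wa / wb)
                   for a, b, wa, wb in zip(W1, W0, w1, w0))

print("total change :", sum(W1) - sum(W0))
print("decomposition:", d_output + d_structure + d_technology)  # equal by construction
```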

  9. Signal processing and analyzing works of art

    Science.gov (United States)

    Johnson, Don H.; Johnson, C. Richard, Jr.; Hendriks, Ella

    2010-08-01

    In examining paintings, art historians use a wide variety of physico-chemical methods to determine, for example, the paints, the ground (canvas primer) and any underdrawing the artist used. However, the art world has been little touched by signal processing algorithms. Our work develops algorithms to examine x-ray images of paintings, not to analyze the artist's brushstrokes but to characterize the weave of the canvas that supports the painting. The physics of radiography indicates that linear processing of the x-rays is most appropriate. Our spectral analysis algorithms have an accuracy superior to human spot-measurements and have the advantage that, through "short-space" Fourier analysis, they can be readily applied to entire x-rays. We have found that variations in the manufacturing process create a unique pattern of horizontal and vertical thread density variations in the bolts of canvas produced. In addition, we measure the thread angles, providing a way to determine the presence of cusping and to infer the location of the tacks used to stretch the canvas on a frame during the priming process. We have developed weave matching software that employs a new correlation measure to find paintings that share canvas weave characteristics. Using a corpus of over 290 paintings attributed to Vincent van Gogh, we have found several weave match cliques that we believe will refine the art historical record and provide more insight into the artist's creative processes.
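
    The short-space Fourier step can be sketched on a synthetic patch: thread densities appear as the dominant spatial frequencies of a windowed spectrum. Resolution and densities below are invented:

```python
import numpy as np

px_per_cm = 100                          # assumed scan resolution
n = 256                                  # one 2.56 cm "short-space" window
y, x = np.mgrid[0:n, 0:n] / px_per_cm

# Synthetic x-ray patch: a plain weave modeled as ~12 vertical and
# ~15 horizontal threads per cm, plus noise.
patch = np.sin(2 * np.pi * 12 * x) + np.sin(2 * np.pi * 15 * y)
patch += 0.3 * np.random.default_rng(0).normal(size=(n, n))

def dominant_freq(img, axis):
    """Average the 1-D amplitude spectrum across the other axis and return
    the peak spatial frequency in cycles (threads) per cm."""
    m = img.shape[axis]
    win = np.hanning(m) if axis == 1 else np.hanning(m)[:, None]
    amp = np.abs(np.fft.rfft(img * win, axis=axis)).mean(axis=1 - axis)
    amp[0] = 0                           # suppress the DC component
    return np.fft.rfftfreq(m, d=1 / px_per_cm)[np.argmax(amp)]

print("vertical thread density   ~", dominant_freq(patch, 1), "threads/cm")
print("horizontal thread density ~", dominant_freq(patch, 0), "threads/cm")
```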

  10. Analyzing modified unimodular gravity via Lagrange multipliers

    Science.gov (United States)

    Sáez-Gómez, Diego

    2016-06-01

    The so-called unimodular version of general relativity is revisited. Unimodular gravity is constructed by fixing the determinant of the metric, which leads to the trace-free part of the equations instead of the usual Einstein field equations. A cosmological constant then arises naturally as an integration constant. While unimodular gravity turns out to be equivalent to general relativity (GR) at the classical level, it provides important differences at the quantum level. Here we extend the unimodular constraint to some extensions of general relativity that have drawn a lot of attention over the last years: f(R) gravity (or its scalar-tensor picture) and Gauss-Bonnet gravity. The corresponding unimodular version of such theories is constructed, as well as the conformal transformation that relates the Einstein and Jordan frames for these nonminimally coupled theories. From the classical point of view, the unimodular versions of such extensions are completely equivalent to their originals, but an effective cosmological constant arises naturally, which may provide a richer description of the evolution of the Universe. Here we analyze the case of Starobinsky inflation and compare it with the original one.
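
    For reference, the standard unimodular bookkeeping (the general relativity case only, not this paper's f(R) or Gauss-Bonnet extensions) shows how the cosmological constant enters as an integration constant:

```latex
% Trace-free (unimodular) field equations, with \sqrt{-g} fixed:
R_{\mu\nu} - \tfrac{1}{4}\, g_{\mu\nu} R
  = 8\pi G \left( T_{\mu\nu} - \tfrac{1}{4}\, g_{\mu\nu} T \right).
% Taking the covariant divergence, using the Bianchi identity
% \nabla^{\mu}(R_{\mu\nu} - \tfrac{1}{2} g_{\mu\nu} R) = 0 and
% \nabla^{\mu} T_{\mu\nu} = 0, gives \partial_{\nu}(R + 8\pi G\, T) = 0, so
R + 8\pi G\, T = 4\Lambda \qquad (\text{an integration constant}).
% Substituting back recovers Einstein's equations with \Lambda:
R_{\mu\nu} - \tfrac{1}{2}\, g_{\mu\nu} R + \Lambda\, g_{\mu\nu} = 8\pi G\, T_{\mu\nu}.
```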

  11. A Method for Analyzing Volunteered Geographic Information ...

    Science.gov (United States)

    Volunteered geographic information (VGI) can be used to identify public valuation of ecosystem services in a defined geographic area using photos as a representation of lived experiences. This method can help researchers better survey and report on the values and preferences of stakeholders involved in rehabilitation and revitalization projects. Current research utilizes VGI in the form of geotagged social media photos from three platforms: Flickr, Instagram, and Panoramio. Social media photos have been obtained for the neighborhoods next to the St. Louis River in Duluth, Minnesota, and are being analyzed along several dimensions. These dimensions include the spatial distribution of each platform, the characteristics of the physical environment portrayed in the photos, and finally, the ecosystem service depicted. In this poster, we focus on the photos from the Irving and Fairmount neighborhoods of Duluth, MN to demonstrate the method at the neighborhood scale. This study demonstrates a method for translating the values expressed in social media photos into ecosystem services and spatially-explicit data to be used in multiple settings, including the City of Duluth's Comprehensive Planning and community revitalization efforts, habitat restoration in a Great Lakes Area of Concern, and the USEPA's Office of Research and Development.

  12. PSAIA – Protein Structure and Interaction Analyzer

    Directory of Open Access Journals (Sweden)

    Vlahoviček Kristian

    2008-04-01

    Full Text Available Background: PSAIA (Protein Structure and Interaction Analyzer) was developed to compute geometric parameters for large sets of protein structures in order to predict and investigate protein-protein interaction sites. Results: In addition to the most relevant established algorithms, PSAIA offers a new method, PIADA (Protein Interaction Atom Distance Algorithm), for the determination of residue interaction pairs. We found that PIADA produced more satisfactory results than comparable algorithms implemented in PSAIA. Particular advantages of PSAIA include its capacity to combine different methods to detect the locations and types of interactions between residues and its ability, without any further automation steps, to handle large numbers of protein structures and complexes. Generally, the integration of a variety of methods enables PSAIA to offer easier automation of analysis and greater reliability of results. PSAIA can be used either via a graphical user interface or from the command-line. Results are generated in either tabular or XML format. Conclusion: In a straightforward fashion and for large sets of protein structures, PSAIA enables the calculation of protein geometric parameters and the determination of location and type for protein-protein interaction sites. XML formatted output enables easy conversion of results to various formats suitable for statistical analysis. Results from smaller data sets demonstrated the influence of geometry on protein interaction sites. Comprehensive analysis of the properties of large data sets leads to new information useful in the prediction of protein-protein interaction sites.
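
    The idea behind distance-based interaction-pair detection is easy to sketch with numpy; the 4.5 Å cutoff and the coordinates below are assumptions for illustration, not PIADA's actual parameters:

```python
import numpy as np

# Toy residues: each maps to an (m_i, 3) array of heavy-atom coordinates (Å).
residues = {
    "A:ARG45": np.array([[0.0, 0.0, 0.0], [1.2, 0.3, 0.0]]),
    "B:GLU12": np.array([[1.5, 3.9, 0.2], [2.0, 0.9, 0.1]]),
    "B:LEU80": np.array([[9.0, 9.0, 9.0]]),
}
CUTOFF = 4.5  # Å, an assumed contact threshold

def interacting(a: np.ndarray, b: np.ndarray) -> bool:
    """Two residues interact if any inter-residue atom pair is within CUTOFF."""
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return bool((d < CUTOFF).any())

names = sorted(residues)
pairs = [(p, q) for i, p in enumerate(names) for q in names[i + 1:]
         if p.split(":")[0] != q.split(":")[0]      # different chains only
         and interacting(residues[p], residues[q])]
print(pairs)   # -> [('A:ARG45', 'B:GLU12')]
```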

  13. Alzheimer's disease: analyzing the missing heritability.

    Directory of Open Access Journals (Sweden)

    Perry G Ridge

    Full Text Available Alzheimer's disease (AD) is a complex disorder influenced by environmental and genetic factors. Recent work has identified 11 AD markers in 10 loci. We used Genome-wide Complex Trait Analysis to analyze >2 million SNPs for 10,922 individuals from the Alzheimer's Disease Genetics Consortium to assess the phenotypic variance explained first by known late-onset AD loci, and then by all SNPs in the Alzheimer's Disease Genetics Consortium dataset. In all, 33% of total phenotypic variance is explained by all common SNPs. APOE alone explained 6% and other known markers 2%, meaning more than 25% of phenotypic variance remains unexplained by known markers, but is tagged by common SNPs included on genotyping arrays or imputed with HapMap genotypes. Novel AD markers that explain large amounts of phenotypic variance are likely to be rare and unidentifiable using genome-wide association studies. Based on our findings and the current direction of human genetics research, we suggest specific study designs for future studies to identify the remaining heritability of Alzheimer's disease.

  14. Methodological considerations in analyzing Twitter data.

    Science.gov (United States)

    Kim, Annice E; Hansen, Heather M; Murphy, Joe; Richards, Ashley K; Duke, Jennifer; Allen, Jane A

    2013-12-01

    Twitter is an online microblogging tool that disseminates more than 400 million messages per day, including vast amounts of health information. Twitter represents an important data source for the cancer prevention and control community. This paper introduces investigators in cancer research to the logistics of Twitter analysis. It explores methodological challenges in extracting and analyzing Twitter data, including characteristics and representativeness of data; data sources, access, and cost; sampling approaches; data management and cleaning; standardizing metrics; and analysis. We briefly describe the key issues and provide examples from the literature and our studies using Twitter data to understand public health issues. For investigators considering Twitter-based cancer research, we recommend assessing whether research questions can be answered appropriately using Twitter, choosing search terms carefully to optimize precision and recall, using respected vendors that can provide access to the full Twitter data stream if possible, standardizing metrics to account for growth in the Twitter population over time, considering crowdsourcing for analysis of Twitter content, and documenting and publishing all methodological decisions to further the evidence base.

  15. Eastern Mediterranean Natural Gas: Analyzing Turkey's Stance

    Directory of Open Access Journals (Sweden)

    Abdullah Tanriverdi

    2016-02-01

    Full Text Available Recent large-scale natural gas discoveries in the East Mediterranean have drawn attention to the region, causing both hope and tension. The new resources may serve as a source of hope for all relevant parties, and for the region, if managed in a collaborative and conciliatory way: energy may be a remedy to Cyprus' financial predicament, initiate a process for resolving differences between Turkey and Cyprus, normalize Israel-Turkey relations, and so on. On the contrary, adopting a unilateral and uncooperative approach may aggravate the tension and undermine regional stability and security. In this sense, the role of energy in generating hope or tension depends on the approaches of the related parties. The article analyzes Turkey's attitude in the East Mediterranean case in terms of possible negative and positive implications for Turkey in the energy field, and examines Turkey's position and the reasons behind its stance. Considering Turkey's energy profile and energy policy goals, the article argues that the newly found hydrocarbons may offer Turkey greater stakes if it adopts a cooperative approach in this case.

  16. Analyzing Consumer Behavior Towards Contemporary Food Retailers

    Directory of Open Access Journals (Sweden)

    E.Dursun

    2008-01-01

    Full Text Available The objective of this research is to analyze consumer behavior towards contemporary food retailers. Food retailing has been changing in recent years in Turkey, and foreign investors have been attracted by the potential of this market. Retail formats have changed, and large-scale, full-service retailers featuring extended product variety are spreading rapidly nationwide. Consumers tend to shop for their household needs at contemporary retailers, due mainly to urbanization, the increasing number of women in the workforce, and income growth. In this research, original data were collected through face-to-face interviews with 385 respondents located in Istanbul, with the sampling distribution formed from the socio-economic status (SES) group ratios for Istanbul. Consumers prefer the closest food retailers, mainly for purchasing food products, and they purchase more than they plan; the C SES group ranks first in average spending on unplanned purchases. Chain stores and hypermarkets are the most preferred retailers for food purchasing. Moreover, consumer responses to judgments related to retailing are investigated with factor analysis.
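
    As a rough illustration of the factor analysis step mentioned above, the sketch below reduces Likert-scale retail judgments to two latent factors. The responses are randomly generated stand-ins for the survey data, and scikit-learn is assumed to be available.

    # Minimal sketch of factor analysis on survey judgments: 385 respondents,
    # 8 Likert items, reduced to 2 latent factors. Data are fabricated.
    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(0)
    n_respondents, n_items = 385, 8
    responses = rng.integers(1, 6, size=(n_respondents, n_items)).astype(float)  # 1-5 scores

    fa = FactorAnalysis(n_components=2, random_state=0)
    fa.fit(responses)

    # Loadings show how strongly each survey item is tied to each latent factor.
    for i, loadings in enumerate(fa.components_):
        print(f"factor {i + 1}:", np.round(loadings, 2))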

  17. Numerical methods for analyzing electromagnetic scattering

    Science.gov (United States)

    Lee, S. W.; Lo, Y. T.; Chuang, S. L.; Lee, C. S.

    1985-01-01

    Attenuation properties of the normal modes in an overmoded waveguide coated with a lossy material were analyzed. It is found that the low-order modes can be significantly attenuated even with a thin layer of coating if the coating material is not too lossy. A thinner layer of coating is required for large attenuation of the low-order modes if the coating material is magnetic rather than dielectric. The Radar Cross Section (RCS) from an uncoated circular guide terminated by a perfect electric conductor (PEC) was calculated and compared with available experimental data. It is confirmed that the interior irradiation contributes to the RCS. The equivalent-current method based on the geometrical theory of diffraction (GTD) was chosen for the calculation of the contribution from the rim diffraction. Calculations of the RCS reduction from a coated circular guide terminated by a PEC, together with planned schemes for the experiments, are included. The waveguide coated with a lossy magnetic material is suggested as a substitute for the corrugated waveguide.

  18. Qualitative Methodology in Analyzing Educational Phenomena

    Directory of Open Access Journals (Sweden)

    Antonio SANDU

    2010-12-01

    Full Text Available Semiological analysis of educational phenomena allows researchers access to a multidimensional universe of meanings represented by the school, seen not so much as an institution but as a vector of social action through educational strategies. We consider education a multidimensional phenomenon, since its analysis allows the researcher to explore a variety of research hypotheses from different paradigmatic perspectives that converge in an educational finality. According to the author Simona Branc, one of the most appropriate methods used in qualitative data analysis is Grounded Theory, which assumes a systematic process of generating concepts and theories based on the data collected. The specialised literature defines Grounded Theory as an inductive approach that starts with general observations and, during the analytical process, creates conceptual categories that explain the theme explored. Researchers insist on the role of sociological theory in managing the research data and in providing ways of conceptualizing the descriptions and explanations. Qualitative content analysis is based on the constructivist (constructionist) paradigm, in the restricted sense that we used previously; it aims to create an "understanding of the latent meanings of the analyzed messages". Quantitative content analysis involves a process of encoding and statistical analysis of data extracted from the content of the paper, in the form of extractions like frequencies, contingency analysis, etc.

  19. Analyzing planar cell polarity during zebrafish gastrulation.

    Science.gov (United States)

    Jessen, Jason R

    2012-01-01

    Planar cell polarity was first described in invertebrates over 20 years ago and is defined as the polarity of cells (and cell structures) within the plane of a tissue, such as an epithelium. Studies in the last 10 years have identified critical roles for vertebrate homologs of these planar cell polarity proteins during gastrulation cell movements. In zebrafish, the terms convergence and extension are used to describe the collection of morphogenetic movements and cell behaviors that contribute to narrowing and elongation of the embryonic body plan. Disruption of planar cell polarity gene function causes profound defects in convergence and extension creating an embryo that has a shortened anterior-posterior axis and is broadened mediolaterally. The zebrafish gastrula-stage embryo is transparent and amenable to live imaging using both Nomarski/differential interference contrast and fluorescence microscopy. This chapter describes methods to analyze convergence and extension movements at the cellular level and thereby connect embryonic phenotypes with underlying planar cell polarity defects in migrating cells.

  20. Analyzing Design Heating Loads in Superinsulated Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Arena, Lois [Consortium for Advanced Residential Buildings, Norwalk, CT (United States)

    2015-06-16

    The U.S. Department of Energy’s Building America research team Consortium for Advanced Residential Buildings (CARB) worked with the EcoVillage cohousing community in Ithaca, New York, on the Third Residential EcoVillage Experience neighborhood. This community-scale project consists of 40 housing units—15 apartments and 25 single-family residences. Units range in size from 450 ft² to 1,664 ft² and cost from $80,000 for a studio apartment to $235,000 for a three- or four-bedroom single-family home. For the research component of this project, CARB analyzed current heating system sizing methods for superinsulated homes in cold climates to determine whether changes in building load calculation methodology should be recommended. Actual heating energy use was monitored and compared to results from the Air Conditioning Contractors of America’s Manual J8 (MJ8) and the Passive House Planning Package software. Results from that research indicate that MJ8 significantly oversizes heating systems for superinsulated homes and that thermal inertia and internal gains should be considered for more accurate load calculations.
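
    For context, the quantity these sizing methods estimate is the steady-state design heating load. The sketch below uses the generic envelope-plus-infiltration formulation, not the MJ8 procedure itself, and all input values are illustrative assumptions rather than CARB's monitored data.

    # Minimal sketch of a steady-state design heating load: envelope UA losses
    # plus infiltration at the design temperature difference. Inputs are
    # illustrative, not MJ8 and not CARB's measurements.
    def design_heating_load_btuh(ua_btuh_per_f, ach, volume_ft3, t_in_f, t_out_f):
        dt = t_in_f - t_out_f
        envelope = ua_btuh_per_f * dt                 # conduction through the envelope
        infiltration = 0.018 * ach * volume_ft3 * dt  # 0.018 Btu/(ft3*F) for air
        return envelope + infiltration

    # A superinsulated ~1,600 ft2 home: low UA and very low air leakage.
    load = design_heating_load_btuh(ua_btuh_per_f=120, ach=0.05,
                                    volume_ft3=12800, t_in_f=70, t_out_f=-2)
    print(f"design load: {load:,.0f} Btu/h")  # ~9,500 Btu/h, far below a typical furnace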

  1. USING NLP APPROACH FOR ANALYZING CUSTOMER REVIEWS

    Directory of Open Access Journals (Sweden)

    Saleem Abuleil

    2017-01-01

    Full Text Available The Web is considered one of the main sources of customer opinions and reviews, which are represented in two formats: structured data (numeric ratings) and unstructured data (textual comments). Millions of textual comments about goods and services are posted on the web by customers, and thousands more are added every day, making it a big challenge to read and understand them and to turn them into useful structured data for customers and decision makers. Sentiment analysis, or opinion mining, is a popular technique for summarizing and analyzing such opinions and reviews. In this paper, we use natural language processing techniques to generate rules that help us understand customer opinions and reviews (textual comments) written in the Arabic language, with the purpose of understanding each one of them and then converting them to structured data. We use adjectives as a key point to highlight important information in the text, then work around them to tag the attributes that describe the subject of the review and associate them with their values (the adjectives).
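
    A toy sketch of the adjective-driven idea follows: spot adjectives, then pair each with the most recent noun as an (attribute, value) tuple. A real system would rely on Arabic morphological analysis and POS tagging; the tiny English lexicon here is purely illustrative.

    # Toy sketch of adjective-driven attribute tagging: pair each adjective
    # with the nearest preceding noun. Lexicons are illustrative stand-ins
    # for real POS tagging.
    ADJECTIVES = {"excellent", "slow", "friendly", "expensive"}
    NOUNS = {"room", "service", "staff", "price"}

    def attribute_value_pairs(tokens):
        pairs, last_noun = [], None
        for tok in tokens:
            word = tok.lower()
            if word in NOUNS:
                last_noun = word
            elif word in ADJECTIVES and last_noun is not None:
                pairs.append((last_noun, word))  # attribute tagged with its value
        return pairs

    review = "The room was excellent but the service was slow".split()
    print(attribute_value_pairs(review))  # [('room', 'excellent'), ('service', 'slow')]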

  2. Technology Maturity is Technology Superiority

    Science.gov (United States)

    2008-09-09

    2008 Technology Maturity Conference briefing, "Technology Maturity is Technology Superiority," Dr. Tom Christian, Aeronautical Systems Center (ASC/EN), Wright-Patterson AFB, OH. The slides carry the tagline "Dominant Air Power: Design For Tomorrow…Deliver Today" and offer one definition of maturity: good judgement comes from experience, and experience comes from bad judgement.

  3. NON-DESTRUCTIVE SOIL CARBON ANALYZER.

    Energy Technology Data Exchange (ETDEWEB)

    Wielopolski, Lucian; Hendrey, G.; Orion, I.; Prior, S.; Rogers, H.; Runion, B.; Torbert, A.

    2004-02-01

    This report describes the feasibility, calibration, and safety considerations of a non-destructive, in situ, quantitative, volumetric soil carbon analytical method based on inelastic neutron scattering (INS). The method can quantify values as low as 0.018 gC/cc, or about 1.2% carbon by weight with high precision under the instrument's configuration and operating conditions reported here. INS is safe and easy to use, residual soil activation declines to background values in under an hour, and no radiological requirements are needed for transporting the instrument. The labor required to obtain soil-carbon data is about 10-fold less than with other methods, and the instrument offers a nearly instantaneous rate of output of carbon-content values. Furthermore, it has the potential to quantify other elements, particularly nitrogen. New instrumentation was developed in response to a research solicitation from the U.S. Department of Energy (DOE LAB 00-09 Carbon Sequestration Research Program) supporting the Terrestrial Carbon Processes (TCP) program of the Office of Science, Biological and Environmental Research (BER). The solicitation called for developing and demonstrating novel techniques for quantitatively measuring changes in soil carbon. The report includes raw data and analyses of a set of proof-of-concept, double-blind studies to evaluate the INS approach in the first phase of developing the instrument. Managing soils so that they sequester massive amounts of carbon was suggested as a means to mitigate the atmospheric buildup of anthropogenic CO₂. Quantifying changes in the soils' carbon stocks will be essential to evaluating such schemes and documenting their performance. Current methods for quantifying carbon in soil by excavation and core sampling are invasive, slow, labor-intensive and locally destroy the system being observed. Newly emerging technologies, such as Laser Induced Breakdown Spectroscopy and Near-Infrared Spectroscopy, offer soil
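
    The two detection-limit figures quoted above are consistent only under an assumed soil bulk density of roughly 1.5 g/cc, a typical value that the abstract does not state explicitly; the quick check below makes the conversion.

    # Worked check of the detection limit: 0.018 g C per cc equals about 1.2%
    # carbon by weight only for an assumed bulk density of ~1.5 g/cc.
    carbon_density = 0.018   # g C per cc of soil
    bulk_density = 1.5       # g soil per cc, assumed typical value
    weight_fraction = carbon_density / bulk_density
    print(f"{weight_fraction:.1%} carbon by weight")  # 1.2%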

  4. Analyzing cancer samples with SNP arrays.

    Science.gov (United States)

    Van Loo, Peter; Nilsen, Gro; Nordgard, Silje H; Vollan, Hans Kristian Moen; Børresen-Dale, Anne-Lise; Kristensen, Vessela N; Lingjærde, Ole Christian

    2012-01-01

    Single nucleotide polymorphism (SNP) arrays are powerful tools to delineate genomic aberrations in cancer genomes. However, the analysis of these SNP array data of cancer samples is complicated by three phenomena: (a) aneuploidy: due to massive aberrations, the total DNA content of a cancer cell can differ significantly from its normal two copies; (b) nonaberrant cell admixture: samples from solid tumors do not exclusively contain aberrant tumor cells, but always contain some portion of nonaberrant cells; (c) intratumor heterogeneity: different cells in the tumor sample may have different aberrations. We describe here how these phenomena impact the SNP array profile, and how these can be accounted for in the analysis. In an extended practical example, we apply our recently developed and further improved ASCAT (allele-specific copy number analysis of tumors) suite of tools to analyze SNP array data using data from a series of breast carcinomas as an example. We first describe the structure of the data, how it can be plotted and interpreted, and how it can be segmented. The core ASCAT algorithm next determines the fraction of nonaberrant cells and the tumor ploidy (the average number of DNA copies), and calculates an ASCAT profile. We describe how these ASCAT profiles visualize both copy number aberrations as well as copy-number-neutral events. Finally, we touch upon regions showing intratumor heterogeneity, and how they can be detected in ASCAT profiles. All source code and data described here can be found at our ASCAT Web site ( http://www.ifi.uio.no/forskning/grupper/bioinf/Projects/ASCAT/).
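
    A minimal sketch of the admixture model underlying allele-specific callers such as ASCAT is shown below: given tumor purity and per-segment allele-specific copy numbers, it predicts the expected LogR and B-allele frequency. This is the simplified forward model (unit platform gain); ASCAT itself solves the inverse problem of recovering purity, ploidy, and copy numbers from the data.

    # Simplified forward model of a tumor-normal mixture: expected LogR and
    # BAF for a segment with allele-specific copy numbers (nA, nB). The
    # normalization by a diploid reference ploidy is an assumption here.
    import math

    def expected_logr_baf(n_a, n_b, purity, sample_ploidy=2.0):
        total = purity * (n_a + n_b) + 2.0 * (1.0 - purity)  # copies in the mixture
        logr = math.log2(total / sample_ploidy)
        baf = (purity * n_b + (1.0 - purity)) / total
        return logr, baf

    # A clonal single-copy deletion (nA=1, nB=0) in a 70%-pure tumor sample:
    logr, baf = expected_logr_baf(n_a=1, n_b=0, purity=0.7)
    print(f"LogR = {logr:+.2f}, BAF = {baf:.2f}")  # shifted less than in a pure tumor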

  5. Analyzing personalized policies for online biometric verification.

    Directory of Open Access Journals (Sweden)

    Apaar Sadhwani

    Full Text Available Motivated by India's nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident's biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India's program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) 12 biometrics for each resident, which represents a five (four, respectively) log reduction in FRR relative to fingerprint (iris, respectively) policies previously proposed for India's biometric program. The mean delay is [Formula: see text] sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32-41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident.

  6. Analyzing personalized policies for online biometric verification.

    Science.gov (United States)

    Sadhwani, Apaar; Yang, Yan; Wein, Lawrence M

    2014-01-01

    Motivated by India's nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident's biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India's program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) 12 biometrics for each resident, which represents a five (four, respectively) log reduction in FRR relative to fingerprint (iris, respectively) policies previously proposed for India's biometric program. The mean delay is [Formula: see text] sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32-41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident.
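
    The likelihood-ratio rule at the core of both records above can be sketched as follows. The independent Gaussian score models and every parameter value are illustrative assumptions, not the paper's fitted joint distribution.

    # Minimal sketch of a likelihood-ratio decision: model each similarity
    # score under the genuine and imposter hypotheses, multiply the
    # per-score ratios (independence assumed here), and compare to a threshold.
    import math

    def gaussian_pdf(x, mean, std):
        z = (x - mean) / std
        return math.exp(-0.5 * z * z) / (std * math.sqrt(2.0 * math.pi))

    def likelihood_ratio(scores, genuine_params, imposter_params):
        lr = 1.0
        for s, (mg, sg), (mi, si) in zip(scores, genuine_params, imposter_params):
            lr *= gaussian_pdf(s, mg, sg) / gaussian_pdf(s, mi, si)
        return lr

    scores = [0.82, 0.74]                  # similarity scores for two acquired images
    genuine = [(0.8, 0.1), (0.8, 0.1)]     # (mean, std) if the resident is genuine
    imposter = [(0.3, 0.15), (0.3, 0.15)]  # (mean, std) if the resident is an imposter
    lr = likelihood_ratio(scores, genuine, imposter)
    print("accept" if lr > 100.0 else "inconclusive: acquire more images")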

  7. Novel topological descriptors for analyzing biological networks

    Directory of Open Access Journals (Sweden)

    Varmuza Kurt K

    2010-06-01

    Full Text Available Abstract Background Topological descriptors, other graph measures, and in a broader sense, graph-theoretical methods, have been proven as powerful tools to perform biological network analysis. However, the majority of the developed descriptors and graph-theoretical methods do not have the ability to take vertex- and edge-labels into account, e.g., atom- and bond-types when considering molecular graphs. Indeed, this feature is important to characterize biological networks more meaningfully instead of only considering pure topological information. Results In this paper, we put the emphasis on analyzing a special type of biological networks, namely bio-chemical structures. First, we derive entropic measures to calculate the information content of vertex- and edge-labeled graphs and investigate some useful properties thereof. Second, we apply the mentioned measures combined with other well-known descriptors to supervised machine learning methods for predicting Ames mutagenicity. Moreover, we investigate the influence of our topological descriptors - measures for only unlabeled vs. measures for labeled graphs - on the prediction performance of the underlying graph classification problem. Conclusions Our study demonstrates that the application of entropic measures to molecules representing graphs is useful to characterize such structures meaningfully. For instance, we have found that if one extends the measures for determining the structural information content of unlabeled graphs to labeled graphs, the uniqueness of the resulting indices is higher. Because measures to structurally characterize labeled graphs are clearly underrepresented so far, the further development of such methods might be valuable and fruitful for solving problems within biological network analysis.
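
    A minimal example of an entropic measure for vertex-labeled graphs, in the spirit of (but far simpler than) the descriptors studied in the paper: the Shannon entropy of the vertex-label distribution, which is zero when all labels agree and grows as labels differentiate, illustrating the extra discriminatory power that labels add.

    # Shannon entropy of a graph's vertex-label distribution: a toy
    # label-aware descriptor; real measures in the paper are more elaborate.
    from collections import Counter
    import math

    def vertex_label_entropy(labels):
        counts = Counter(labels)
        n = len(labels)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    # Two molecules with identical topology but different atom labels:
    print(vertex_label_entropy(["C", "C", "C", "C", "C", "C"]))  # 0.0 bits
    print(vertex_label_entropy(["C", "C", "C", "C", "N", "O"]))  # ~1.25 bits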

  8. Capturing and analyzing wheelchair maneuvering patterns with mobile cloud computing.

    Science.gov (United States)

    Fu, Jicheng; Hao, Wei; White, Travis; Yan, Yuqing; Jones, Maria; Jan, Yih-Kuen

    2013-01-01

    Power wheelchairs have been widely used to provide independent mobility to people with disabilities. Despite great advancements in power wheelchair technology, research shows that wheelchair-related accidents occur frequently. To ensure safe maneuverability, capturing wheelchair maneuvering patterns is fundamental to enable other research, such as safe robotic assistance for wheelchair users. In this study, we propose to record, store, and analyze wheelchair maneuvering data by means of mobile cloud computing. Specifically, the accelerometer and gyroscope sensors in smart phones are used to record wheelchair maneuvering data in real time. Then, the recorded data are periodically transmitted to the cloud for storage and analysis. The analyzed results are then made available to various types of users, such as mobile phone users, traditional desktop users, etc. The combination of mobile computing and cloud computing leverages the advantages of both techniques and extends the smart phone's capabilities of computing and data storage via the Internet. We performed a case study to implement the mobile cloud computing framework using Android smart phones and Google App Engine, a popular cloud computing platform. Experimental results demonstrated the feasibility of the proposed mobile cloud computing framework.
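
    The record-and-upload loop described above might look like the sketch below: buffer sensor samples locally, then periodically push a batch to a cloud endpoint. The endpoint URL is a hypothetical placeholder, the requests package is assumed, and a phone app would read real accelerometer/gyroscope APIs instead of the stub.

    # Minimal sketch of periodic sensor batching and cloud upload. The URL
    # is a hypothetical placeholder, not the study's actual service.
    import json
    import time
    import requests

    CLOUD_URL = "https://example.com/wheelchair/upload"  # hypothetical endpoint

    def read_sensors():
        # Stub standing in for the phone's accelerometer and gyroscope APIs.
        return {"t": time.time(), "accel": [0.01, -0.02, 9.81], "gyro": [0.0, 0.1, 0.0]}

    def record_and_upload(n_samples=200, batch_size=50, interval_s=0.02):
        buffer = []
        for _ in range(n_samples):
            buffer.append(read_sensors())
            if len(buffer) >= batch_size:
                # Periodic push: storage and analysis then happen in the cloud.
                requests.post(CLOUD_URL, data=json.dumps(buffer),
                              headers={"Content-Type": "application/json"}, timeout=10)
                buffer.clear()
            time.sleep(interval_s)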

  9. Fuzzy Based Auto-coagulation Control Through Photometric Dispersion Analyzer

    Institute of Scientific and Technical Information of China (English)

    白桦; 李圭白

    2004-01-01

    The main role of water treatment plants is to supply high-quality, safe drinking water, and coagulation is one of the most important stages of surface water treatment. The photometric dispersion analyzer (PDA) is a new optical method for flocculation monitoring and makes coagulation feedback control feasible. The online modification of the coagulation control system's set point (the optimum coagulant dose) has long hindered the application of this technology in water treatment plants. A fuzzy control system incorporating the photometric dispersion analyzer was therefore utilized in this coagulation control system: a fuzzy logic inference control system using Takagi and Sugeno's fuzzy if-then rules is proposed for self-correction of the set point online, and the dosing-rate fuzzy control system was programmed in a SIEMENS small-scale programmable logic controller. A 400 L/min middle-scale water treatment plant was used to simulate the reaction. As raw water quality changed, the set point and the coagulant dosing rate were corrected in time, and residual turbidity before filtration remained acceptable and stable. Results show that this fuzzy inference and control system performs well in coagulation control through PDA.
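
    A minimal sketch of zero-order Takagi-Sugeno inference of the kind described, applied to set-point self-correction: each if-then rule fires with a membership weight, and the output is the weighted average of the rule consequents. The membership shapes, rule table, and units are illustrative assumptions, not the plant's tuned rules.

    # Zero-order Takagi-Sugeno inference: constant consequents blended by
    # triangular membership weights. All shapes and values are illustrative.
    def tri(x, a, b, c):
        """Triangular membership function peaking at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def setpoint_correction(turbidity_error):
        # IF error negative THEN lower set point; IF near zero THEN hold;
        # IF positive THEN raise set point (consequents are constants).
        rules = [
            (tri(turbidity_error, -2.0, -1.0, 0.0), -0.5),  # lower by 0.5
            (tri(turbidity_error, -1.0, 0.0, 1.0), 0.0),    # hold
            (tri(turbidity_error, 0.0, 1.0, 2.0), +0.5),    # raise by 0.5
        ]
        total_w = sum(w for w, _ in rules)
        return sum(w * out for w, out in rules) / total_w if total_w else 0.0

    print(setpoint_correction(0.4))  # mild positive error -> small upward correction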

  10. Performance of parametric spectro-temporal analyzer (PASTA).

    Science.gov (United States)

    Zhang, Chi; Wei, Xiaoming; Wong, Kenneth K Y

    2013-12-30

    Parametric spectro-temporal analyzer (PASTA) is an entirely new wavelength-resolving modality that focuses the spectral information on the temporal axis, enables an ultrafast frame rate, and provides resolution and sensitivity comparable to a state-of-the-art optical spectrum analyzer (OSA). Generally, spectroscopy relies on the allocation of the spectrum onto the spatial or temporal domain; the conventional OSA, based on a Czerny-Turner monochromator, realizes the spatial allocation with a dispersive grating, whose mechanical rotation limits its operation speed. The PASTA system, on the other hand, performs the spectroscopy function by a time-lens focusing mechanism, which all-optically maps the spectral information onto the temporal axis and realizes single-shot spectrum acquisition. Therefore, the PASTA system provides orders of magnitude improvement in frame rate, as high as megahertz or even gigahertz in principle. In addition to the implementation of the PASTA system, in this paper we primarily discuss its performance, including the tradeoff between the frame rate and the wavelength range, factors that affect the wavelength resolution, the conversion efficiency, power saturation, and polarization sensitivity. Limitations introduced by detection bandwidth and high-order dispersion are also under investigation. All these analyses not only provide an overall guideline for the PASTA design, but also help future research in improving and optimizing this new spectrum-resolving technology.
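
    The wavelength-to-time mapping at the heart of such spectro-temporal analyzers can be sketched very simply: after a large accumulated dispersion, each wavelength arrives at a delay proportional to its offset from the center, so a single photodiode trace reads out the spectrum. The dispersion value and wavelength span below are illustrative assumptions, not PASTA's actual parameters.

    # Linear wavelength-to-time mapping after large accumulated dispersion:
    # delay = D_total * (lambda - lambda_center). Values are illustrative.
    D_total = 2000.0   # accumulated dispersion, ps/nm (assumed)

    def arrival_delay_ps(wavelength_nm, center_nm=1550.0):
        return D_total * (wavelength_nm - center_nm)

    for wl in (1549.5, 1550.0, 1550.5):
        print(f"{wl} nm -> {arrival_delay_ps(wl):+7.1f} ps")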

  11. A Two-Tiered Model for Analyzing Library Web Site Usage Statistics, Part 1: Web Server Logs.

    Science.gov (United States)

    Cohen, Laura B.

    2003-01-01

    Proposes a two-tiered model for analyzing web site usage statistics for academic libraries: one tier for library administrators that analyzes measures indicating library use, and a second tier for web site managers that analyzes measures aiding in server maintenance and site design. Discusses the technology of web site usage statistics, and…
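
    First-tier processing of the kind discussed starts from raw server logs. The sketch below parses Common Log Format lines and counts successful page requests, one of the basic measures a usage report is built from; the sample line is fabricated.

    # Minimal sketch of web server log parsing: match Common Log Format
    # lines and tally successful (HTTP 200) page requests.
    import re
    from collections import Counter

    LOG_RE = re.compile(
        r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] "(?P<method>\S+) (?P<path>\S+) [^"]*" '
        r'(?P<status>\d{3}) (?P<size>\S+)'
    )

    def count_pages(lines):
        counts = Counter()
        for line in lines:
            m = LOG_RE.match(line)
            if m and m.group("status") == "200":
                counts[m.group("path")] += 1
        return counts

    sample = ['203.0.113.7 - - [05/Mar/2003:10:01:02 -0500] "GET /index.html HTTP/1.0" 200 4523']
    print(count_pages(sample))  # Counter({'/index.html': 1})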

  12. Climate Model Diagnostic Analyzer Web Service System

    Science.gov (United States)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Jiang, J. H.

    2013-12-01

    The latest Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report stressed the need for the comprehensive and innovative evaluation of climate models with newly available global observations. The traditional approach to climate model evaluation, which compares a single parameter at a time, identifies symptomatic model biases and errors but fails to diagnose the model problems. The model diagnosis process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally- and data-intensive. To address these challenges, we are developing a parallel, distributed web-service system that enables the physics-based multi-variable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. We have developed a methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks (i.e., Flask, Gunicorn, and Tornado). The web-service system, called Climate Model Diagnostic Analyzer (CMDA), currently supports (1) all the datasets from Obs4MIPs and a few ocean datasets from NOAA and Argo, which can serve as observation-based reference data for model evaluation and (2) many of CMIP5 model outputs covering a broad range of atmosphere, ocean, and land variables from the CMIP5 specific historical runs and AMIP runs. Analysis capabilities currently supported by CMDA are (1) the calculation of annual and seasonal means of physical variables, (2) the calculation of time evolution of the means in any specified geographical region, (3) the calculation of correlation between two variables, and (4) the calculation of difference between two variables. A web user interface is chosen for CMDA because it not only lowers the learning curve and removes the adoption barrier of the tool but also enables instantaneous use
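
    The wrapping approach described, exposing existing science code through a Python web framework, might look like the minimal Flask sketch below. The endpoint, parameter names, and the stubbed seasonal-mean computation are illustrative stand-ins, not the actual CMDA services.

    # Minimal sketch of wrapping an analysis function as a Flask web service.
    from flask import Flask, jsonify, request

    app = Flask(__name__)

    def seasonal_mean(variable, season):
        # Stand-in for the existing science application code being wrapped.
        fake_data = {("tas", "DJF"): 286.1, ("tas", "JJA"): 289.4}
        return fake_data.get((variable, season))

    @app.route("/mean")
    def mean_endpoint():
        variable = request.args.get("variable", "tas")
        season = request.args.get("season", "DJF")
        value = seasonal_mean(variable, season)
        if value is None:
            return jsonify(error="unknown variable/season"), 404
        return jsonify(variable=variable, season=season, mean_kelvin=value)

    if __name__ == "__main__":
        app.run()  # a production deployment would sit behind Gunicorn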

  13. Living Technology

    DEFF Research Database (Denmark)

    2010-01-01

    This book is aimed at anyone who is interested in learning more about living technology, whether coming from business, the government, policy centers, academia, or anywhere else. Its purpose is to help people to learn what living technology is, what it might develop into, and how it might impact our lives. The phrase 'living technology' was coined to refer to technology that is alive as well as technology that is useful because it shares the fundamental properties of living systems. In particular, the invention of this phrase was called for to describe the trend of our technology becoming increasingly life-like or literally alive. Still, the phrase has different interpretations depending on how one views what life is. This book presents nineteen perspectives on living technology. Taken together, the interviews convey the collective wisdom on living technology's power and promise, as well as its...

  14. Globalization & technology

    DEFF Research Database (Denmark)

    Narula, Rajneesh

    Technology and globalization are interdependent processes. Globalization has a fundamental influence on the creation and diffusion of technology, which, in turn, affects the interdependence of firms and locations. This volume examines the international aspect of this interdependence at two levels...

  15. Processing and Analyzing Assessment Test Logs provided by Digital Pen and Paper

    OpenAIRE

    Barhoumi, Zaara; Tort, Françoise

    2011-01-01

    In an educational context, the use of new technologies can influence and change teaching practices. Digital pen and paper, as a man-machine interface, appear familiar and do not require any specific training. Analyzing the logs recorded by this technology, especially their time stamps, seems interesting in order to provide new indicators for evaluation or observation. As far as we know, the time-stamped logs recorded by digital pens have barely been studied in an educational context by researchers. We explore the use of d...

  16. Emerging Technologies

    OpenAIRE

    Salgar, S. M.

    2004-01-01

    Phenomenal advancements have taken place in the field of information and communication technologies in the last decade, and spectacular and innovative changes are expected in these fields in the coming decade. Networking technologies are going through a sea change. This paper enumerates the likely emerging networking technologies, particularly WLANs. Most personal communication in the country will be through cellular/mobile technologies, which are also covered in the p...

  17. Evaluation and performance characteristics of the Q Hemostasis Analyzer, an automated coagulation analyzer.

    Science.gov (United States)

    Toulon, Pierre; Fischer, Florence; Appert-Flory, Anny; Jambou, Didier

    2014-05-01

    The Q Hemostasis Analyzer (Grifols, Barcelona, Spain) is a fully automated, random-access, multiparameter analyzer designed to perform coagulation, chromogenic and immunologic assays, and it is equipped with a cap-piercing system. The instrument was evaluated in the hemostasis laboratory of a University Hospital with respect to its technical features in the determination of coagulation assays, i.e., prothrombin time (PT), activated partial thromboplastin time (aPTT), thrombin time, fibrinogen and single coagulation factors V (FV) and VIII (FVIII); chromogenic assays [antithrombin (AT) and protein C activity]; and immunologic assays [von Willebrand factor antigen (vWF:Ag) concentration], using reagents from the analyzer manufacturer. Total precision (evaluated as the coefficient of variation) was below 6% for most parameters in both the normal and pathological ranges, except for FV, FVIII, AT and vWF:Ag in both the normal and pathological samples. No carryover was detected in alternating aPTT measurements in a pool of normal plasma samples and in the same pool spiked with unfractionated heparin (>1.5 IU/mL). The effective throughput was 154 PT, 66 PT/aPTT, 42 PT/aPTT/fibrinogen, and 38 PT/aPTT/AT panels per hour, corresponding to 114 to 154 tests performed per hour depending on the tested panel. Test results obtained on the Q Hemostasis Analyzer were well correlated with those obtained on the ACL TOP analyzer (Instrumentation Laboratory), with r between 0.862 and 0.989. In conclusion, routine coagulation testing can be performed on the Q Hemostasis Analyzer with satisfactory precision, and the same applies to more specialized and specific tests.
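
    The "total precision" criterion above is a coefficient of variation over replicate measurements, as in the minimal sketch below; the replicate PT times are fabricated for illustration.

    # Coefficient of variation (CV) of repeated control measurements.
    import statistics

    def cv_percent(values):
        return statistics.stdev(values) / statistics.mean(values) * 100.0

    pt_replicates = [12.1, 12.3, 11.9, 12.2, 12.0, 12.4]  # seconds, fabricated
    print(f"CV = {cv_percent(pt_replicates):.1f}%")        # well below the 6% criterion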

  18. Analyzers Measure Greenhouse Gases, Airborne Pollutants

    Science.gov (United States)

    2012-01-01

    In complete darkness, a NASA observatory waits. When an eruption of boiling water billows from a nearby crack in the ground, the observatory's sensors seek particles in the fluid, measure shifts in carbon isotopes, and analyze samples for biological signatures. NASA has landed the observatory in this remote location, far removed from air and sunlight, to find life unlike any that scientists have ever seen. It might sound like a scene from a distant planet, but this NASA mission is actually exploring an ocean floor right here on Earth. NASA established a formal exobiology program in 1960, which expanded into the present-day Astrobiology Program. The program, which celebrated its 50th anniversary in 2010, not only explores the possibility of life elsewhere in the universe, but also examines how life begins and evolves, and what the future may hold for life on Earth and other planets. Answers to these questions may be found not only by launching rockets skyward, but by sending probes in the opposite direction. Research here on Earth can revise prevailing concepts of life and biochemistry and point to the possibilities for life on other planets, as was demonstrated in December 2010, when NASA researchers discovered microbes in Mono Lake in California that subsist and reproduce using arsenic, a toxic chemical. The Mono Lake discovery may be the first of many that could reveal possible models for extraterrestrial life. One primary area of interest for NASA astrobiologists lies with the hydrothermal vents on the ocean floor. These vents expel jets of water heated and enriched with chemicals from off-gassing magma below the Earth's crust. Also potentially within the vents: microbes that, like the Mono Lake microorganisms, defy the common characteristics of life on Earth. Basically all organisms on our planet generate energy through the Krebs Cycle, explains Mike Flynn, research scientist at NASA's Ames Research Center. This metabolic process breaks down sugars for energy

  19. Using Simulation to Analyze Acoustic Environments

    Science.gov (United States)

    Wood, Eric J.

    2016-01-01

    One of the main projects that was worked on this semester was creating an acoustic model for the Advanced Space Suit in Comsol Multiphysics. The geometry tools built into the software were used to create an accurate model of the helmet and upper torso of the suit. After running the simulation, plots of the sound pressure level within the suit were produced, as seen below in Figure 1. These plots show significant nulls which should be avoided when placing microphones inside the suit. In the future, this model can be easily adapted to changes in the suit design to determine optimal microphone placements and other acoustic properties. Another major project was creating an acoustic diverter that will potentially be used to route audio into the Space Station's Node 1. The concept of the project was to create geometry to divert sound from a neighboring module, the US Lab, into Node 1. By doing this, no new audio equipment would need to be installed in Node 1. After creating an initial design for the diverter, analysis was performed in Comsol in order to determine how changes in geometry would affect acoustic performance, as shown in Figure 2. These results were used to produce a physical prototype diverter on a 3D printer. With the physical prototype, testing was conducted in an anechoic chamber to determine the true effectiveness of the design, as seen in Figure 3. The results from this testing have been compared to the Comsol simulation results to analyze how closely the Comsol results match real-world performance. While the Comsol results do not seem to closely resemble the real-world performance, this testing has provided valuable insight into how much trust can be placed in the results of Comsol simulations. A final project that was worked on during this tour was the Audio Interface Unit (AIU) design for the Orion program. The AIU is a small device that will be used as an audio communication device both during launch and on orbit. The unit will have functions

  20. Assistive Technologies

    Science.gov (United States)

    Auat Cheein, Fernando A., Ed.

    2012-01-01

    This book offers the reader new achievements within the Assistive Technology field made by worldwide experts, covering aspects such as assistive technology focused on teaching and education, mobility, communication and social interactivity, among others. Each chapter included in this book covers one particular aspect of Assistive Technology that…

  1. Technology Tiers

    DEFF Research Database (Denmark)

    Karlsson, Christer

    2015-01-01

    A technology tier is a level in a product system: final product, system, subsystem, component, or part. As a concept, it contrasts with traditional "vertical" special technologies (for example, mechanics and electronics) and focuses on "horizontal" feature technologies such as product characteristics...

  2. Soulful Technologies

    DEFF Research Database (Denmark)

    Fausing, Bent

    2010-01-01

    or anthropomorphism is important for the branding of new technology. Technology is seen as creating a techno-transcendence towards a more qualified humanity which is in contact with fundamental human values like intuition, vision, and sensing; all the qualities that technology, industrialization, and rationalization...

  3. Science and Technology Parks in the Context of Social Technologies

    Directory of Open Access Journals (Sweden)

    Edgaras Leichteris

    2013-08-01

    Full Text Available This article aims to present a new approach to the science and technology park concept and its development prospects in the context of social technologies. Globalization and the spread of social technologies are expanding the influence of science and technology parks on national innovation systems. This opens new directions for research in this area, as well as for the practical use of social technologies in the development of science and technology parks. The paper also examines the science and technology park as an institutionalized concept of social technology. In this article the interdisciplinary approach for analyzing the complex concept of science and technology parks is used to explore the theoretical relationships with the social technologies concept. The possible links are identified and illustrated by practical examples of Lithuanian science and technology parks, and suggestions for further research are made. Based on the analysis and synthesis of the scientific literature in both fields (science and technology parks; social technologies), three possible theoretical links are established: (a) the use of social technologies in science and technology parks; (b) the role of a science park as an intermediate body in the humanization and socialization of technologies; (c) science and technology parks as an institutionalized concept of social technology. The theoretical model is supported by empirical illustrations from the development of Lithuanian science and technology parks; therefore further research in all three directions is feasible and needed. As this research takes a merely theoretical approach to social systems investigation, it can be qualified only as a preparatory stage for further research. The practical examples used in the article are more illustrative than evidence-based and shall not be considered case studies. The research offers an initial framework for researching science and technology parks in the context of social technologies.

  4. Science and Technology Parks in the Context of Social Technologies

    Directory of Open Access Journals (Sweden)

    Edgaras Leichteris

    2011-08-01

    Full Text Available Summary. This article aims to present a new approach to the science and technology park concept and its development prospects in the context of social technologies. Globalization and the spread of social technologies are expanding the influence of science and technology parks on national innovation systems. This opens new directions for research in this area, as well as for the practical use of social technologies in the development of science and technology parks. The paper also examines the science and technology park as an institutionalized concept of social technology. In this article the interdisciplinary approach for analyzing the complex concept of science and technology parks is used to explore the theoretical relationships with the social technologies concept. The possible links are identified and illustrated by practical examples of Lithuanian science and technology parks, and suggestions for further research are made. Based on the analysis and synthesis of the scientific literature in both fields (science and technology parks; social technologies), three possible theoretical links are established: (a) the use of social technologies in science and technology parks; (b) the role of a science park as an intermediate body in the humanization and socialization of technologies; (c) science and technology parks as an institutionalized concept of social technology. The theoretical model is supported by empirical illustrations from the development of Lithuanian science and technology parks; therefore further research in all three directions is feasible and needed. As this research takes a merely theoretical approach to social systems investigation, it can be qualified only as a preparatory stage for further research. The practical examples used in the article are more illustrative than evidence-based and shall not be considered case studies. The research offers an initial framework for researching science and technology parks in the context of social

  5. Technological Strategies and National Purpose

    Science.gov (United States)

    Gilpin, Robert

    1970-01-01

    Discusses the international and domestic implications of technological growth. Defines three basic national strategies: a broad-front approach, scientific and technological specialization, and importation. Analyzes the strategies followed by four countries (France, the United States, Sweden, and Japan) to illustrate the alternatives and the…

  6. Development of Modulators Against Degenerative Aging Using Radiation Fusion Technology

    Energy Technology Data Exchange (ETDEWEB)

    Jo, Sung Kee; Jung, U.; Park, H. R.

    2010-04-15

    In this study, we selected 20 final biomarkers of degenerative aging to develop a radiation aging model, and validated a few of the selected markers for use in the screening of aging modulators. To select the biomarkers of degenerative aging, 4 categories of aging-related markers (immune/hematopoiesis, oxidative damage, signaling molecules, lipid metabolism) were comparatively analyzed in irradiated and normally aged biosystems (cell lines or mice). As a result, most of the biomarkers showed similar changes under irradiation and normal aging. Regarding immune/hematopoiesis, a decline of immune cell functions (lymphocytes, NK cells), Th1/Th2 imbalance, and decreased antigen presentation by dendritic cells were observed, and 10 biomarkers were selected in this category. mtDNA deletion was selected as the oxidative damage marker, 6 biomarkers including p21 and p-FOXO3a as signaling molecule biomarkers, and 3 biomarkers including adipose tissue weight were selected for lipid metabolism. In addition, various irradiation conditions, covering single/fractionated irradiation and different periods after irradiation, were investigated for the optimal induction of biomarker changes; a total of 5 Gy in 10 or more fractions followed by a period of 4 months or greater was observed to be optimal. To establish the basis for the screening of natural aging modulators, some selected aging biomarkers were validated through their inhibition by well-known natural agents (EGCG, HemoHIM, etc.) in aged cell or mouse models. Additionally, by evaluating the efficacy of 5 natural agents in reducing the degeneration of skin and reproductive organs induced by radiation and chemicals (cyclophosphamide, etc.), we established the basis for the screening of modulators against degeneration induced by various factors.

  7. Making Sense of Mobile Technology

    OpenAIRE

    David Pauleen; John Campbell; Brian Harmer; Ali Intezari

    2015-01-01

    Mobile technologies have facilitated a radical shift in work and private life. In this article, we seek to better understand how individual mobile technology users have made sense of these changes and adapted to them. We have used narrative enquiry and sensemaking to collect and analyze the data. The findings show that mobile technology use blurs the boundaries between work and private life, making traditional time and...

  8. Technology-Use Mediation

    DEFF Research Database (Denmark)

    Bansler, Jørgen P.; Havn, Erling C.

    2004-01-01

    Implementation of new computer-mediated communication (CMC) systems in organizations is a complex socio-technical endeavour, involving the mutual adaptation of technology and organization over time. Drawing on the analytic concept of sensemaking, this paper provides a theoretical perspective that deepens our understanding of how organizations appropriate new electronic communication media. The paper analyzes how a group of mediators in a large, multinational company adapted a new web-based CMC technology (a virtual workspace) to the local organizational context (and vice versa) by modifying features of the technology, providing ongoing support for users, and promoting appropriate conventions of use. We found that these mediators exerted considerable influence on how the technology was established and used in the organization. The mediators were not neutral facilitators of a well

  9. Appropriate Technology as Indian Technology.

    Science.gov (United States)

    Barry, Tom

    1979-01-01

    Describes the mounting enthusiasm of Indian communities for appropriate technology as an inexpensive means of providing much needed energy and job opportunities. Describes the development of several appropriate technology projects, and the goals and activities of groups involved in utilizing low scale solar technology for economic development on…

  10. Analysis of Impact of 3D Printing Technology on Traditional Manufacturing Technology

    Science.gov (United States)

    Wu, Niyan; Chen, Qi; Liao, Linzhi; Wang, Xin

    With the quiet rise of 3D printing technology in the automotive, aerospace, industrial, medical and other fields, many insiders hold differing opinions on its development. This paper objectively analyzes the impact of 3D printing technology on mold-making technology and, by comparing the advantages and disadvantages of 3D-printed molds and traditional mold-making technology, puts forward the idea of fusion and complementarity between 3D printing and mold making.

  11. Demonstration Technology Application and Analysis on the Scientific and Technological Progress

    OpenAIRE

    Qingzhu Qi; Zhixiao Jiang

    2013-01-01

    This paper takes Tianjin as an example and analyzes the development trend of scientific and technological progress there. Considering five aspects ('environment of scientific and technological progress', 'input of scientific and technological activities', 'output of scientific and technological activities', 'high-tech industrialization', and 'science and technology for economic and social development'), the paper analyzes the correlation between GDP and scientific and technological progress. Research...

  12. A Theory-Based Methodology for Analyzing Domain Suitability for Expert Systems Technology Applications

    Science.gov (United States)

    1989-06-01

    The report discusses classifications of complex auditory stimuli on ambiguous or esoteric factors, such as describing a musical piece as being composed by Chopin, or classifying a specific submarine's sonar return as an "Alfa" class ship.

  13. Analyzing the Effects of Technological Change: A Computable General Equilibrium Approach

    Science.gov (United States)

    1988-09-01

    …homogeneous of degree zero. The functional form of PILOT satisfies Walras' Law with equality, p'C(p,h) = h, since this is true for each… ignored if a and b could be taken to be independent. Unfortunately, this is impossible: given Walras' Law, for any p and h it must be true that p…

  14. TOGA: an automated parsing technology for analyzing expression of nearly all genes.

    Science.gov (United States)

    Sutcliffe, J G; Foye, P E; Erlander, M G; Hilbush, B S; Bodzin, L J; Durham, J T; Hasel, K W

    2000-02-29

    We have developed an automated, high-throughput, systematic cDNA display method called TOGA, an acronym for total gene expression analysis. TOGA utilizes 8-nt sequences, comprised of a 4-nt restriction endonuclease cleavage site and adjacent 4-nt parsing sequences, and their distances from the 3' ends of mRNA molecules to give each mRNA species in an organism a single identity. The parsing sequences are used as parts of primer-binding sites in 256 PCR-based assays performed robotically on tissue extracts to determine simultaneously the presence and relative concentration of nearly every mRNA in the extracts, regardless of whether the mRNA has been discovered previously. Visualization of the electrophoretically separated fluorescent assay products from different extracts displayed via a Netscape browser-based graphical user interface allows the status of each mRNA to be compared among samples and its identity to be matched with sequences of known mRNAs compiled in databases.
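
    The arithmetic behind the 256 assays is worth making explicit: with the restriction site fixed, the adjacent 4-nt parsing sequence ranges over 4^4 = 256 possibilities, one PCR assay per value, as the quick enumeration below shows.

    # Enumerate the 4^4 = 256 possible 4-nt parsing sequences behind the
    # 256 robotic PCR assays described above.
    from itertools import product

    PARSING_SEQUENCES = ["".join(p) for p in product("ACGT", repeat=4)]
    print(len(PARSING_SEQUENCES))   # 256 assays
    print(PARSING_SEQUENCES[:3])    # ['AAAA', 'AAAC', 'AAAG']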

  15. TOGA: An automated parsing technology for analyzing expression of nearly all genes

    OpenAIRE

    Sutcliffe, J. Gregor; Foye, Pamela E.; Erlander, Mark G.; HIlbush, Brian S.; Bodzin, Leon J.; Durham, Jayson T.; Hasel, Karl W.

    2000-01-01

    We have developed an automated, high-throughput, systematic cDNA display method called TOGA, an acronym for total gene expression analysis. TOGA utilizes 8-nt sequences, comprised of a 4-nt restriction endonuclease cleavage site and adjacent 4-nt parsing sequences, and their distances from the 3′ ends of mRNA molecules to give each mRNA species in an organism a single identity. The parsing sequences are used as parts of primer-binding sites in 256 PCR-based assays performed robotically on tis...

  16. The university-industry knowledge relationship: Analyzing patents and the science base of technologies

    CERN Document Server

    Leydesdorff, Loet

    2009-01-01

    Via the Internet, information scientists can obtain cost-free access to large databases in the hidden or deep web. These databases are often structured far more than the Internet domains themselves. The patent database of the U.S. Patent and Trade Office is used in this study to examine the science base of patents in terms of the literature references in these patents. University-based patents at the global level are compared with results when using the national economy of the Netherlands as a system of reference. Methods for accessing the on-line databases and for the visualization of the results are specified. The conclusion is that 'biotechnology' has historically generated a model for theorizing about university-industry relations that cannot easily be generalized to other sectors and disciplines.

  17. Advances and considerations in technologies for growing, imaging, and analyzing 3-D root system architecture

    Science.gov (United States)

    The ability of a plant to mine the soil for nutrients and water is determined by how, where, and when roots are arranged in the soil matrix. The capacity of plant to maintain or improve its yield under limiting conditions, such as nutrient deficiency or drought, is affected by root system architectu...

  18. The optics inside an automated single molecule array analyzer

    Science.gov (United States)

    McGuigan, William; Fournier, David R.; Watson, Gary W.; Walling, Les; Gigante, Bill; Duffy, David C.; Rissin, David M.; Kan, Cheuk W.; Meyer, Raymond E.; Piech, Tomasz; Fishburn, Matthew W.

    2014-02-01

    Quanterix and Stratec Biomedical have developed an instrument that enables the automated measurement of multiple proteins at concentrations ~1000 times lower than existing immunoassays. The instrument is based on Quanterix's proprietary Single Molecule Array technology (Simoa™), which facilitates the detection and quantification of biomarkers previously difficult to measure, thus opening up new applications in life science research and in-vitro diagnostics. Simoa is based on trapping individual beads in arrays of femtoliter-sized wells that, when imaged with sufficient resolution, allow for counting of single molecules associated with each bead. When used to capture and detect proteins, this approach is known as digital ELISA (enzyme-linked immunosorbent assay). The platform developed is a merger of many science and engineering disciplines. This paper concentrates on the optical technologies that have enabled the development of a fully automated single molecule analyzer. At the core of the system is a custom, wide field-of-view fluorescence microscope that images arrays of microwells containing single molecules bound to magnetic beads. A consumable disc containing 24 microstructure arrays was developed previously in collaboration with Sony DADC. The system cadence requirements, array dimensions, and requirement to detect single molecules presented significant optical challenges. Specifically, the wide field of view needed to image the entire array resulted in the need for a custom objective lens. Additionally, cost considerations for the system required a custom solution that leveraged the image processing capabilities. This paper will discuss the design considerations and resultant optical architecture that has enabled the development of an automated digital ELISA platform.
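
    The single-molecule counting that makes digital ELISA work rests on Poisson statistics: beads carry 0, 1, 2, ... enzyme labels, so the average number of enzymes per bead can be recovered from the fraction of wells that light up. The sketch below shows the standard back-calculation; the 2% 'on' fraction is an illustrative value, not Quanterix data.

    # Poisson back-calculation for digital ELISA: P(0 labels) = exp(-AEB),
    # so AEB = -ln(1 - fraction_on). The input fraction is illustrative.
    import math

    def average_enzymes_per_bead(fraction_on):
        return -math.log(1.0 - fraction_on)

    print(f"AEB = {average_enzymes_per_bead(0.02):.4f}")  # ~0.0202 at 2% active wells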

  19. Flexibility of MIP Technology

    Institute of Scientific and Technical Information of China (English)

    Tang Jinlian; Gong Jianhong; Xu Youhao

    2015-01-01

    The flexibility of MIP technology to meet market demand is mainly introduced in this study, and its commercial application and technical principle are analyzed as well. The MIP technology, with its wide feed adaptability, can form a good combination with other technologies, and it has been applied extensively in China. Based on this platform, the CGP, MIP-LTG and MIP-DCR technologies have been developed, which can further improve the flexibility of MIP technology. Thanks to its novel reaction control technique with a sole sequential two-zone riser, MIP users can easily switch between different operating modes, producing either more clean gasoline and propylene or more diesel, by changing the catalysts and varying the operating conditions. That offers MIP users enough production flexibility and a rational production arrangement to meet market demand. The MIP-DCR technology, with lower dry gas and coke yields, can provide an even more flexible operating mode since the catalyst-to-oil ratio has become an independent variable.

  20. Implementing High Performance Lexical Analyzer using CELL Broadband Engine Processor

    Directory of Open Access Journals (Sweden)

    P.J.SATHISH KUMAR

    2011-09-01

    Full Text Available The lexical analyzer is the first phase of the compiler and commonly the most time-consuming. The compilation of large programs is still far from optimized in today's compilers. With modern processors moving more towards improving parallelization and multithreading, it has become impossible for older compilers to keep gaining performance as technology advances: any multicore architecture relies more on improving parallelism than on improving single-core performance. A compiler that is completely parallel and optimized is yet to be developed and would require significant effort to create. On careful analysis we find that the performance of a compiler is majorly affected by the lexical analyzer's scanning and tokenizing phases. This effort is directed towards the creation of a completely parallelized lexical analyzer designed to run on the Cell/B.E. processor, utilizing its multicore functionality to achieve high performance gains in a compiler. Each SPE reads a block of data from the input and tokenizes it independently. To prevent dependence among SPEs, a scheme for dynamically extending static block limits is incorporated: each SPE is given a range which it initially scans, and it then finalizes its input buffer to a set of complete tokens from the range dynamically. This ensures that the SPEs parallelize independently and dynamically, with the PPE scheduling the load for each SPE. The initially static assignment of the code blocks is made dynamic as soon as one SPE commits, which aids SPE load distribution and balancing. The PPE maintains the output buffer until all SPEs of a single stage commit and move to the next stage before the buffer is written out to the file, to maintain order of execution. The approach can be extended easily to other multicore architectures as well. Tokenization is performed by high-speed string searching against the keyword dictionary of the language, using the Aho-Corasick algorithm.
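
    The keyword-matching step named at the end, Aho-Corasick search over the language's keyword dictionary, is sketched below as a compact goto trie with BFS-built failure links. The keyword list is an illustrative fragment; a real lexer would additionally apply longest-match rules (note that 'in' is reported inside 'int').

    # Compact Aho-Corasick: goto trie, failure links built by BFS, and a
    # single left-to-right matching pass that reports every keyword hit.
    from collections import deque

    def build_automaton(keywords):
        goto, fail, out = [{}], [0], [set()]
        for word in keywords:                      # 1) build the goto trie
            state = 0
            for ch in word:
                if ch not in goto[state]:
                    goto.append({}); fail.append(0); out.append(set())
                    goto[state][ch] = len(goto) - 1
                state = goto[state][ch]
            out[state].add(word)
        queue = deque(goto[0].values())            # 2) BFS to set failure links
        while queue:
            state = queue.popleft()
            for ch, nxt in goto[state].items():
                queue.append(nxt)
                f = fail[state]
                while f and ch not in goto[f]:
                    f = fail[f]
                fail[nxt] = goto[f].get(ch, 0)
                out[nxt] |= out[fail[nxt]]         # inherit shorter suffix matches
        return goto, fail, out

    def find_keywords(text, automaton):
        goto, fail, out = automaton
        state, hits = 0, []
        for i, ch in enumerate(text):
            while state and ch not in goto[state]:
                state = fail[state]                # follow failure links on mismatch
            state = goto[state].get(ch, 0)
            for word in out[state]:
                hits.append((i - len(word) + 1, word))
        return hits

    auto = build_automaton(["if", "in", "int", "return"])
    print(find_keywords("int x; if (x) return x;", auto))
    # [(0, 'in'), (0, 'int'), (7, 'if'), (14, 'return')]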

  1. Micro-Analyzer: automatic preprocessing of Affymetrix microarray data.

    Science.gov (United States)

    Guzzi, Pietro Hiram; Cannataro, Mario

    2013-08-01

    A current trend in genomics is the investigation of the cell mechanism using different technologies, in order to explain the relationship among genes, molecular processes and diseases. For instance, the combined use of gene-expression arrays and genomic arrays has been demonstrated as an effective instrument in clinical practice. Consequently, in a single experiment different kinds of microarrays may be used, resulting in the production of different types of binary data (images and textual raw data). The analysis of microarray data requires an initial preprocessing phase that makes raw data suitable for use on existing analysis platforms, such as the TIGR M4 (TM4) Suite. An additional challenge to be faced by emerging data analysis platforms is the ability to treat in a combined way those different microarray formats coupled with clinical data. In fact, resulting integrated data may include both numerical and symbolic data (e.g. gene expression and SNPs regarding molecular data), as well as temporal data (e.g. the response to a drug, time to progression and survival rate) regarding clinical data. Raw data preprocessing is a crucial step in analysis but is often performed in a manual and error-prone way using different software tools. Thus novel, platform-independent, and possibly open source tools enabling the semi-automatic preprocessing and annotation of different microarray data are needed. The paper presents Micro-Analyzer (Microarray Analyzer), a cross-platform tool for the automatic normalization, summarization and annotation of Affymetrix gene expression and SNP binary data. It represents the evolution of the μ-CS tool, extending the preprocessing to SNP arrays that were not allowed in μ-CS. Micro-Analyzer is provided as a Java standalone tool and enables users to read, preprocess and analyse binary microarray data (gene expression and SNPs) by invoking the TM4 platform. It avoids: (i) the manual invocation of external tools (e.g. the Affymetrix Power

  2. Chemistry Technology

    Data.gov (United States)

    Federal Laboratory Consortium — Chemistry technology experts at NCATS engage in a variety of innovative translational research activities, including:Design of bioactive small molecules.Development...

  3. Sensemaking technologies

    DEFF Research Database (Denmark)

    Madsen, Charlotte Øland

    & Brass, 1990; Kling 1991; Orlikowski 2000). It also demonstrates that technology is a flexible variable adapted to the organisation's needs, culture, climate and management philosophy, thus leading to different uses and outcomes of the same technology in different organisations (Barley 1986; 1990......, Orlikowski 2000). Viewing the use of technology as a process of enactment opens up for investigating the social processes of interpreting new technology into the organisation (Orlikowski 2000). The scope of the PhD project will therefore be to gain a deeper understanding of how the enactment of new...

  4. Co-Production of Knowledge in Multi-Stakeholder Processes: Analyzing Joint Experimentation as Social Learning

    Science.gov (United States)

    Akpo, Essegbemon; Crane, Todd A.; Vissoh, Pierre V.; Tossou, Rigobert C.

    2015-01-01

    Purpose: Changing research design and methodologies regarding how researchers articulate with end-users of technology is an important consideration in developing sustainable agricultural practices. This paper analyzes a joint experiment as a multi-stakeholder process and contributes to understanding how the way of organizing social learning affects…

  5. Effects of Professional Experience and Group Interaction on Information Requested in Analyzing IT Cases

    Science.gov (United States)

    Lehmann, Constance M.; Heagy, Cynthia D.

    2008-01-01

    The authors investigated the effects of professional experience and group interaction on the information that information technology professionals and graduate accounting information system (AIS) students request when analyzing business cases related to information systems design and implementation. Understanding these effects can contribute to…

  6. Analyzing the Cohesion of English Text and Discourse with Automated Computer Tools

    Science.gov (United States)

    Jeon, Moongee

    2014-01-01

    This article investigates the lexical and discourse features of English text and discourse with automated computer technologies. Specifically, this article examines the cohesion of English text and discourse with automated computer tools, Coh-Metrix and TEES. Coh-Metrix is a text analysis computer tool that can analyze English text and discourse…

  7. Using Networks to Visualize and Analyze Process Data for Educational Assessment

    Science.gov (United States)

    Zhu, Mengxiao; Shu, Zhan; von Davier, Alina A.

    2016-01-01

    New technology enables interactive and adaptive scenario-based tasks (SBTs) to be adopted in educational measurement. At the same time, it is a challenging problem to build appropriate psychometric models to analyze data collected from these tasks, due to the complexity of the data. This study focuses on process data collected from SBTs. We…

  8. Field-usable portable analyzer for chlorinated organic compounds

    Energy Technology Data Exchange (ETDEWEB)

    Buttner, W.J.; Penrose, W.R.; Stetter, J.R. [Transducer Research, Inc., Naperville, IL (United States)

    1995-10-01

    Transducer Research, Inc. (TRI) has been working with the DOE Morgantown Energy Technology Center to develop a new chemical monitor based on a unique sensor which responds selectively to vapors of chlorinated solvents. We are also developing field applications for the monitor in actual DOE cleanup operations. During the initial phase, prototype instruments were built and field tested. Because of the high degree of selectivity that is obtained, no response was observed with common hydrocarbon organic compounds such as BTX (benzene, toluene, xylene) or POLs (petroleum, oil, lubricants), and in fact, no non-halogen-containing chemical has been identified which induces a measurable response. By the end of the Phase I effort, a finished instrument system was developed and test marketed. This instrument, called the RCL MONITOR, was designed to analyze individual samples or monitor an area with automated repetitive analyses. Vapor levels between 0 and 500 ppm can be determined in 90 s with a lower detection limit of 0.2 ppm using the hand-portable instrument. In addition to the development of the RCL MONITOR, advanced sampler systems are being developed to: (1) extend the dynamic range of the instrument through autodilution of the vapor and (2) allow chemical analyses to be performed on aqueous samples. When interfaced to the samplers, the RCL MONITOR is capable of measuring chlorinated solvent contamination in the vapor phase up to 5000 ppm and in water and other condensed media from 10 to over 10,000 ppb(wt)--without hydrocarbon and other organic interferences.

  9. The minimal requirements to use calcium imaging to analyze ICRAC.

    Science.gov (United States)

    Alansary, Dalia; Kilch, Tatiana; Holzmann, Christian; Peinelt, Christine; Hoth, Markus; Lis, Annette

    2014-06-02

    Endogenous calcium release-activated channel (CRAC) currents are usually quite small and not always easy to measure using the patch-clamp technique. While we have, for instance, successfully recorded very small CRAC currents in primary human effector T cells, we have not yet managed to record CRAC in naïve primary human T cells. Many groups, including ours, therefore use Ca(2+) imaging technologies to analyze CRAC-dependent Ca(2+) influx. However, Ca(2+) signals are quite complex and depend on many different transporter activities; thus, it is not trivial to make quantitative statements about one single transporter, in this case CRAC channels. Therefore, a detailed patch-clamp analysis of ICRAC is always preferred. Since many laboratories use Ca(2+) imaging for ICRAC analysis, we detail here the minimal requirements for reliable measurements. Ca(2+) signals not only depend on the net Ca(2+) influx through CRAC channels but also depend on other Ca(2+) influx mechanisms, K(+) channels or Cl(-) channels (which determine the membrane potential), Ca(2+) export mechanisms like plasma membrane Ca(2+) ATPase (PMCA), sarco/endoplasmic reticulum Ca(2+) ATPase (SERCA) or Na(+)-Ca(2+) exchangers, and (local) Ca(2+) buffering often by mitochondria. In this protocol, we summarize a set of experiments that allow (quantitative) statements about CRAC channel activity using Ca(2+) imaging experiments, including the ability to rule out Ca(2+) signals from other sources.

  10. HMR Log Analyzer: Analyze Web Application Logs Over Hadoop MapReduce

    Directory of Open Access Journals (Sweden)

    Sayalee Narkhede

    2013-07-01

    Full Text Available In today's Internet world, log file analysis is becoming a necessary task for analyzing customer behavior in order to improve advertising and sales; for datasets in areas such as the environment, medicine and banking it is likewise important to analyze the log data to extract the required knowledge from it. Web mining is the process of discovering knowledge from web data. Log files are generated very fast, at rates of 1-10 Mb/s per machine, and a single data center can generate tens of terabytes of log data in a day. These datasets are huge. In order to analyze such large datasets we need a parallel processing system and a reliable data storage mechanism. A virtual database system is an effective solution for integrating the data but it becomes inefficient for large datasets. The Hadoop framework provides reliable data storage through the Hadoop Distributed File System and the MapReduce programming model, a parallel processing system for large datasets. The Hadoop Distributed File System breaks up input data and sends fractions of the original data to several machines in the Hadoop cluster to hold blocks of data. This mechanism helps to process log data in parallel using all the machines in the Hadoop cluster and computes the result efficiently. The dominant approach provided by Hadoop, "store first, query later", loads the data into the Hadoop Distributed File System and then executes queries written in Pig Latin. This approach reduces the response time as well as the load on the end system. This paper proposes a log analysis system using Hadoop MapReduce which will provide accurate results in minimum response time.
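
    The map/reduce flow the paper builds on (a mapper emits key-value pairs per log line; a reducer aggregates them after the shuffle) can be imitated in memory in a few lines; the log format here is invented, and this is not the paper's Hadoop/Pig code:

    ```python
    from collections import defaultdict
    from itertools import chain

    LOGS = [
        "10.0.0.1 GET /index.html 200",
        "10.0.0.2 GET /missing 404",
        "10.0.0.1 GET /index.html 200",
    ]

    def map_phase(line):
        # Emit (status_code, 1) for each request, like a Hadoop mapper.
        yield line.split()[-1], 1

    def reduce_phase(pairs):
        # Sum counts per key, like a Hadoop reducer after the shuffle.
        counts = defaultdict(int)
        for key, value in pairs:
            counts[key] += value
        return dict(counts)

    print(reduce_phase(chain.from_iterable(map_phase(l) for l in LOGS)))
    # {'200': 2, '404': 1}
    ```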

  11. Smile Analyzer: A Software Package for Analyzing the Characteristics of the Speech and Smile

    Directory of Open Access Journals (Sweden)

    Roozbeh Rashed

    2013-01-01

    Full Text Available Taking into account the factors related to lip-tooth relationships in orthodontic diagnosis and treatment planning is of prime importance. Manual quantitative analysis of facial parameters on photographs during smile and speech is a difficult and time-consuming job. Since there is no comprehensive and user-friendly software package, we developed a software program called "Smile Analyzer" in the Department of Orthodontics of Mashhad Faculty of Dentistry for measuring the parameters related to lip-tooth relationships and other facial landmarks on photographs taken during various facial expressions. The software was designed using Visual Basic .NET, and ADO.NET was used for developing its Microsoft Access database. The program runs on Microsoft Windows. It is capable of analyzing many parameters or variables in many patients' photographs, although 19 more common variables are previously defined as a default list of variables. When all variables are measured or calculated, a report can be generated and saved in either PDF or MS Excel format. Data are readily transferable to statistical software like SPSS for Windows.

  12. Smile Analyzer: A Software Package for Analyzing the Characteristics of the Speech and Smile

    Directory of Open Access Journals (Sweden)

    Farzin Heravi

    2012-09-01

    Full Text Available Taking into account the factors related to lip-tooth relationships in orthodontic diagnosis and treatment planning is of prime importance. Manual quantitative analysis of facial parameters on photographs during smile and speech is a difficult and time-consuming job. Since there is no comprehensive and user-friendly software package, we developed a software program called "Smile Analyzer" in the Department of Orthodontics of Mashhad Faculty of Dentistry for measuring the parameters related to lip-tooth relationships and other facial landmarks on photographs taken during various facial expressions. The software was designed using Visual Basic .NET, and ADO.NET was used for developing its Microsoft Access database. The program runs on Microsoft Windows. It is capable of analyzing many parameters or variables in many patients' photographs, although 19 more common variables are previously defined as a default list of variables. When all variables are measured or calculated, a report can be generated and saved in either PDF or MS Excel format. Data are readily transferable to statistical software like SPSS for Windows.

  13. Thermally activated technologies: Technology Roadmap

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2003-05-01

    The purpose of this Technology Roadmap is to outline a set of actions for government and industry to develop thermally activated technologies for converting America’s wasted heat resources into a reservoir of pollution-free energy for electric power, heating, cooling, refrigeration, and humidity control. Fuel flexibility is important. The actions also cover thermally activated technologies that use fossil fuels, biomass, and ultimately hydrogen, along with waste heat.

  14. Technology Push

    Science.gov (United States)

    Kennedy, Mike

    2008-01-01

    When students, teachers, administrators and others employed in education arrive at work every day on thousands of campuses across the nation, it should come as no surprise that at every step along the way, technology is there to greet them. Technological advancements in education, as well as in facilities operation and management, are not a…

  15. Maritime Technology

    DEFF Research Database (Denmark)

    Sørensen, Herman

    1997-01-01

    Elementary introduction to the subject "Maritime Technology". The contents include drawings, sketches and references in English without any supplementary text.

  16. Lasers technology

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2011-07-01

    The Lasers Technology Program of IPEN is committed to the development of new lasers based on the research of optical materials and new technologies, as well to laser applications in several areas: Nuclear, Medicine, Dentistry, Industry, Environment and Advanced Research. The Program is basically divided into two main areas: Material and Laser Development and Laser Applications.

  17. Study on brackish water treatment technology

    Institute of Scientific and Technical Information of China (English)

    HE Xu-wen(何绪文); Xu De-ping (许德平); WU Bing(吴兵); WANG Tong(王通)

    2003-01-01

    Based on the characteristics of deep well-water quality in the Fenxi Mining Group in Liulin, the feasibility of two treatment technologies, electrodialysis and reverse osmosis, is analyzed. Through analysis and comparison, reverse osmosis technology shows several advantages, such as good treatment performance, convenient operation and management, and low running cost.

  18. Push Technology on the Net: Threat or Opportunity for the Online Searcher?

    Science.gov (United States)

    Helfer, Joe

    1997-01-01

    Analyzes whether push technology is a threat or opportunity for online searchers. Outlines key functions of push technology products and defines key technology terms. Illustrates how the role of the online searcher changes with push technology. (AEF)

  19. Sensemaking technology

    DEFF Research Database (Denmark)

    Madsen, Charlotte Øland

    Research objective: The object of the LOK research project is to gain a better understanding of the technological strategic processes in organisations by using the concept/metaphor of sensemaking. The project will investigate the technological strategies in organisations in order to gain a deeper...... understanding of the cognitive competencies and barriers towards implementing new technology in organisations. The research will therefore concentrate on researching the development process in the organisation's perception of the external environmental elements of customers, suppliers, competitors, internal...... and external technology and legislation and the internal environmental elements of structure, power relations and political arenas. All of these variables have influence on which/how technologies are implemented thus creating different outcomes all depending on the social dynamics that are triggered by changes...

  20. Technology collaboration

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, Jacob [Halliburton (Brazil)

    2011-07-01

    The aim of this paper is to present Halliburton's Brazilian technology center. Halliburton has technology centers in the United States, Saudi Arabia, India, Singapore and Brazil, all of which aim at delivering accelerated innovation in the oil sector. The technology centers engage in research and development activities with the help of various universities and in collaboration with the customer or supplier. The Halliburton Brazil technology center provides its customers with timely research and development solutions for enhancing recovery and mitigating reservoir uncertainty; it specializes in finding solutions for pre- and post-salt carbonate drilling and in the enhancement of production from mature fields. This presentation showcased the work carried out by the Halliburton Brazil technology center to help customers develop their deepwater field activities.

  1. A Predictive Model of Technology Transfer Using Patent Analysis

    OpenAIRE

    Jaehyun Choi; Dongsik Jang; Sunghae Jun; Sangsung Park

    2015-01-01

    The rapid pace of technological advances creates many difficulties for R&D practitioners in analyzing emerging technologies. Patent information analysis is an effective tool in this situation. Conventional patent information analysis has focused on the extraction of vacant, promising, or core technologies and the monitoring of technological trends. From a technology management perspective, the ultimate purpose of R&D is technology commercialization. The core of technology commercializ...

  2. The moral relevance of technological artifacts

    NARCIS (Netherlands)

    Verbeek, P.P.C.C.; Sollie, P.; Düwell, M.

    2009-01-01

    This chapter explores the ethics of technology in a double sense: it lays bare points of application for ethical reflection about technology development, and it analyzes the ethical dimensions of technology itself. First, the chapter addresses the question of how to conceptualize and assess the mora

  3. Instructional Technology Must Contribute to Productivity

    Science.gov (United States)

    Molenda, Michael

    2009-01-01

    Those involved in instructional technology in higher education are urged to view instructional technology as a means of improving academic productivity. Instructional technology has been used for over forty years to analyze instructional problems and design solutions that reduce costs and improve learning outcomes. The Pew Program in Course…

  4. Development and Applications of Simulation Technology

    Institute of Scientific and Technical Information of China (English)

    Wang Zicai

    2004-01-01

    The development process of simulation technology is discussed in terms of its emergence, maturation and further development. The applications of simulation technology in the fields of the national economy are introduced. Finally, the level and status quo of simulation technology at home and overseas are analyzed, and its future trend in the new century is presented.

  5. 40 CFR 92.119 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Hydrocarbon analyzer calibration. 92... Hydrocarbon analyzer calibration. The HFID hydrocarbon analyzer shall receive the following initial and... into service and at least annually thereafter, the HFID hydrocarbon analyzer shall be adjusted...

  6. 40 CFR 86.1321-94 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 19 2010-07-01 2010-07-01 false Hydrocarbon analyzer calibration. 86... Procedures § 86.1321-94 Hydrocarbon analyzer calibration. The FID hydrocarbon analyzer shall receive the... into service and at least annually thereafter, the FID hydrocarbon analyzer shall be adjusted...

  7. 40 CFR 91.316 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Hydrocarbon analyzer calibration. 91....316 Hydrocarbon analyzer calibration. (a) Calibrate the FID and HFID hydrocarbon analyzer as described... thereafter, adjust the FID and HFID hydrocarbon analyzer for optimum hydrocarbon response as specified...

  8. 40 CFR 89.319 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Hydrocarbon analyzer calibration. 89... Equipment Provisions § 89.319 Hydrocarbon analyzer calibration. (a) The FID hydrocarbon analyzer shall... and at least annually thereafter, adjust the FID hydrocarbon analyzer for optimum hydrocarbon...

  9. Ergonomics technology

    Science.gov (United States)

    Jones, W. L.

    1977-01-01

    Major areas of research and development in ergonomics technology for space environments are discussed. Attention is given to possible applications of the technology developed by NASA in industrial settings. A group of mass spectrometers for gas analysis capable of fully automatic operation has been developed for atmosphere control on spacecraft; a version for industrial use has been constructed. Advances have been made in personal cooling technology, remote monitoring of medical information, and aerosol particle control. Experience gained by NASA during the design and development of portable life support units has recently been applied to improve breathing equipment used by fire fighters.

  10. Exploration technology

    Energy Technology Data Exchange (ETDEWEB)

    Roennevik, H.C. [Saga Petroleum A/S, Forus (Norway)

    1996-12-31

    The paper evaluates exploration technology. Topics discussed are: Visions; the subsurface challenge; the creative tension; the exploration process; seismic; geology; organic geochemistry; seismic resolution; integration; drilling; value creation. 4 refs., 22 figs.

  11. Technology Innovation

    Science.gov (United States)

    EPA produces innovative technologies and facilitates their creation in line with the Agency mission to create products such as the stormwater calculator, remote sensing, innovation clusters, and low-cost air sensors.

  12. Videodisc technology

    Energy Technology Data Exchange (ETDEWEB)

    Marsh, F.E. Jr.

    1981-03-01

    An overview of the technology of videodiscs is given. The emphasis is on systems that use reflection or transmission of laser light. Possible use of videodiscs for storage of bibliographic information is considered. 6 figures, 3 tables. (RWR)

  13. Banana technology

    Science.gov (United States)

    van Amstel, Willem D.; Schellekens, E. P. A.; Walravens, C.; Wijlaars, A. P. F.

    1999-09-01

    With 'Banana Technology', an unconventional hybrid fabrication technology is indicated for the production of very large parabolic and hyperbolic cylindrical mirror systems. The banana technology uses elastic bending of very large and thin glass substrates and fixation onto NC-milled metal moulds. This technology has matured during the last twenty years for the manufacturing of large telecentric flat-bed scanners. Two construction types, called 'internal banana' and 'external banana', are presented. Optical figure quality requirements in terms of slope and curvature deviations are discussed. Measurements of these optical specifications by means of a 'finishing rod' type of scanning deflectometer or slope tester are presented. Design constraints for bending glass and the advantages of a new process are also discussed.

  14. Fabrication Technology

    Energy Technology Data Exchange (ETDEWEB)

    Blaedel, K.L.

    1993-03-01

    The mission of the Fabrication Technology thrust area is to have an adequate base of manufacturing technology, not necessarily resident at Lawrence Livermore National Laboratory (LLNL), to conduct the future business of LLNL. The specific goals continue to be to (1) develop an understanding of fundamental fabrication processes; (2) construct general purpose process models that will have wide applicability; (3) document findings and models in journals; (4) transfer technology to LLNL programs, industry, and colleagues; and (5) develop continuing relationships with the industrial and academic communities to advance the collective understanding of fabrication processes. The strategy to ensure success is changing. For technologies in which they are expert and which will continue to be of future importance to LLNL, they can often attract outside resources both to maintain their expertise by applying it to a specific problem and to help fund further development. A popular vehicle to fund such work is the Cooperative Research and Development Agreement with industry. For technologies needing development because of their future critical importance and in which they are not expert, they use internal funding sources. These latter are the topics of the thrust area. Three FY-92 funded projects are discussed in this section. Each project clearly moves the Fabrication Technology thrust area towards the goals outlined above. They have also continued their membership in the North Carolina State University Precision Engineering Center, a multidisciplinary research and graduate program established to provide the new technologies needed by high-technology institutions in the US. As members, they have access to and use of the results of their research projects, many of which parallel the precision engineering efforts at LLNL.

  15. Lasers technology

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2014-07-01

    The Laser Technology Program of IPEN is developed by the Center for Lasers and Applications (CLA) and is committed to the development of new lasers based on the research of new optical materials and new resonator technologies. Laser applications and research occur within several areas such as Nuclear, Medicine, Dentistry, Industry, Environment and Advanced Research. Additional goals of the Program are human resource development and innovation, in association with Brazilian Universities and commercial partners.

  16. Technology and technology transfer: some basic issues

    OpenAIRE

    Shamsavari, Ali; Adikibi, Owen; Taha, Yasser

    2002-01-01

    This paper addresses various issues relating to technology and transfer of technology such as technology and society, technology and science, channels and models of technology transfer, the role of multinational companies in transfer of technology, etc. The ultimate objective is to pose the question of relevance of some existing models and ideas like technological independence in an increasingly globalised world economy.

  17. Technology cycles and technology revolutions

    Energy Technology Data Exchange (ETDEWEB)

    Paganetto, Luigi; Scandizzo, Pasquale Lucio

    2010-09-15

    Technological cycles have been characterized as the basis of long and continuous periods of economic growth through sustained changes in total factor productivity. While this hypothesis is in part consistent with several theories of growth, the sheer magnitude and length of the economic revolutions experienced by humankind suggests that more attention should be given to the origin of major technological and economic changes, with reference to one crucial question: the role of the production and use of energy in economic development.

  18. Lab-on-a-chip Astrobiology Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall goal of this program (through Phase III) is to develop an analyzer to measure chemical signatures of life in extraterrestrial settings. The analyzer will...

  19. Looking for a Framework for Analyzing Eco-innovation Dynamics

    DEFF Research Database (Denmark)

    Yang, Yan

    2011-01-01

    Looking for a Framework for Analyzing Eco-innovation Dynamics: A Triple Helix Model of Innovation Perspective.

  20. 40 CFR 90.320 - Carbon dioxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Carbon dioxide analyzer calibration... Emission Test Equipment Provisions § 90.320 Carbon dioxide analyzer calibration. (a) Prior to its initial... carbon dioxide analyzer as follows: (1) Follow good engineering practices for instrument start-up...

  1. 40 CFR 89.322 - Carbon dioxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Carbon dioxide analyzer calibration... Test Equipment Provisions § 89.322 Carbon dioxide analyzer calibration. (a) Prior to its introduction... carbon dioxide analyzer shall be calibrated on all normally used instrument ranges. New...

  2. 21 CFR 868.1400 - Carbon dioxide gas analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Carbon dioxide gas analyzer. 868.1400 Section 868...) MEDICAL DEVICES ANESTHESIOLOGY DEVICES Diagnostic Devices § 868.1400 Carbon dioxide gas analyzer. (a) Identification. A carbon dioxide gas analyzer is a device intended to measure the concentration of carbon...

  3. 21 CFR 882.1420 - Electroencephalogram (EEG) signal spectrum analyzer.

    Science.gov (United States)

    2010-04-01

    ....1420 Electroencephalogram (EEG) signal spectrum analyzer. (a) Identification. An electroencephalogram (EEG) signal spectrum analyzer is a device used to display the frequency content or power spectral... analyzer. 882.1420 Section 882.1420 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH...

  4. 21 CFR 1230.32 - Analyzing of samples.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Analyzing of samples. 1230.32 Section 1230.32 Food... FEDERAL CAUSTIC POISON ACT Administrative Procedures § 1230.32 Analyzing of samples. Samples collected by an authorized agent shall be analyzed at the laboratory designated by the Food and...

  5. 21 CFR 868.1670 - Neon gas analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Neon gas analyzer. 868.1670 Section 868.1670 Food... DEVICES ANESTHESIOLOGY DEVICES Diagnostic Devices § 868.1670 Neon gas analyzer. (a) Identification. A neon gas analyzer is a device intended to measure the concentration of neon in a gas mixture exhaled by...

  6. 21 CFR 864.5680 - Automated heparin analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated heparin analyzer. 864.5680 Section 864....5680 Automated heparin analyzer. (a) Identification. An automated heparin analyzer is a device used to determine the heparin level in a blood sample by mixing the sample with protamine (a...

  7. 40 CFR 86.1221-90 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 19 2010-07-01 2010-07-01 false Hydrocarbon analyzer calibration. 86...-Fueled and Methanol-Fueled Heavy-Duty Vehicles § 86.1221-90 Hydrocarbon analyzer calibration. The FID hydrocarbon analyzer shall receive the following initial and periodic calibrations. (a) Initial and...

  8. 40 CFR 90.316 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Hydrocarbon analyzer calibration. 90... Equipment Provisions § 90.316 Hydrocarbon analyzer calibration. (a) Calibrate the FID and HFID hydrocarbon... thereafter, adjust the FID and HFID hydrocarbon analyzer for optimum hydrocarbon response as specified...

  9. 40 CFR 86.331-79 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 18 2010-07-01 2010-07-01 false Hydrocarbon analyzer calibration. 86....331-79 Hydrocarbon analyzer calibration. The following steps are followed in sequence to calibrate the hydrocarbon analyzer. It is suggested, but not required, that efforts be made to minimize relative...

  10. 40 CFR 86.121-90 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 18 2010-07-01 2010-07-01 false Hydrocarbon analyzer calibration. 86... Complete Heavy-Duty Vehicles; Test Procedures § 86.121-90 Hydrocarbon analyzer calibration. The hydrocarbon... FID and HFID hydrocarbon analyzers shall be adjusted for optimum hydrocarbon response....

  11. 40 CFR 86.521-90 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 18 2010-07-01 2010-07-01 false Hydrocarbon analyzer calibration. 86... Regulations for 1978 and Later New Motorcycles; Test Procedures § 86.521-90 Hydrocarbon analyzer calibration. (a) The FID hydrocarbon analyzer shall receive the following initial and periodic calibration....

  12. 21 CFR 868.2380 - Nitric oxide analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Nitric oxide analyzer. 868.2380 Section 868.2380...) MEDICAL DEVICES ANESTHESIOLOGY DEVICES Monitoring Devices § 868.2380 Nitric oxide analyzer. (a) Identification. The nitric oxide analyzer is a device intended to measure the concentration of nitric oxide...

  13. Real-time analytics techniques to analyze and visualize streaming data

    CERN Document Server

    Ellis, Byron

    2014-01-01

    Construct a robust end-to-end solution for analyzing and visualizing streaming data. Real-time analytics is the hottest topic in data analytics today. In Real-Time Analytics: Techniques to Analyze and Visualize Streaming Data, expert Byron Ellis teaches data analysts the technologies needed to build an effective real-time analytics platform. This platform can then be used to make sense of the constantly changing data that is beginning to outpace traditional batch-based analysis platforms. The author is among a very few leading experts in the field. He has a prestigious background in research, development,

  14. Automatic proximate analyzer of coal based on isothermal thermogravimetric analysis (TGA) with twin-furnace

    Energy Technology Data Exchange (ETDEWEB)

    Xiong, Youhui; Jiang, Taiyi; Zou, Xianhong [National Laboratory of Coal Combustion, Huazhong University of Science and Technology, Wuhan, Hubei 430074 (China)

    2003-12-17

    A new type of rapid and automatic proximate analyzer for coal based on isothermal thermogravimetric analysis (TGA) with a twin furnace is introduced in this paper. This automatic proximate analyzer was developed by combining several novel technologies, such as an automatic weighing method for multiple samples under high-temperature, dynamic gas-flow conditions, a self-protection system for the electronic balance, and an optimized method and procedure for the coal analysis process. Additionally, a comparison between standard values and values measured by the new instrument for standard coals is presented.
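
    In an isothermal TGA proximate analysis, moisture, volatile matter and ash follow from successive mass plateaus, and fixed carbon is obtained by difference; a sketch of that arithmetic with illustrative masses, not data from the instrument described:

    ```python
    def proximate_analysis(m0, m_dry, m_devol, m_ash):
        """Proximate analysis (wt%, as-received) from TGA mass plateaus:
        m0      initial sample mass
        m_dry   mass after moisture release
        m_devol mass after volatile release (in inert gas)
        m_ash   residue after combustion
        Fixed carbon is computed by difference."""
        moisture = 100 * (m0 - m_dry) / m0
        volatile = 100 * (m_dry - m_devol) / m0
        ash = 100 * m_ash / m0
        fixed_carbon = 100 - moisture - volatile - ash
        return {"moisture": moisture, "volatile": volatile,
                "ash": ash, "fixed_carbon": fixed_carbon}

    print(proximate_analysis(1.000, 0.950, 0.660, 0.120))
    # {'moisture': 5.0, 'volatile': 29.0, 'ash': 12.0, 'fixed_carbon': 54.0}
    ```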

  15. Shared Consumption : A Technological Analysis

    OpenAIRE

    John A. Weymark

    2004-01-01

    James Buchanan (Economica, [1966]) has argued that Alfred Marshall's theory of jointly-supplied goods can be extended to analyze the allocation of impure public goods. This article introduces a way of modelling sharing technologies for jointly-supplied goods that captures the essential features of Buchanan's proposal. Public and private goods are special cases of shared goods obtained by appropriately specifying the sharing technology. Necessary conditions for an allocation in a shared goods ...

  16. Persuasive Technology

    DEFF Research Database (Denmark)

    This book constitutes the proceedings of the 5th International Conference on Persuasive Technology, PERSUASIVE 2010, held in Copenhagen, Denmark in June 2010. The 25 papers presented were carefully reviewed and selected from 80 submissions. In addition three keynote papers are included in this volume. The topics covered are emotions and user experience, ambient persuasive systems, persuasive design, persuasion profiles, designing for health, psychology of persuasion, embodied and conversational agents, economic incentives, and future directions for persuasive technology.

  17. Seafood Technology

    DEFF Research Database (Denmark)

    Børresen, Torger

    This presentation will fill the total picture of this conference between fisheries and aquaculture, blue biotech and bioconservation, by considering the optimal processing technology of marine resources from the raw material until the seafood reaches the plate of the consumer. The situation today...... must be performed such that total traceability and authenticity of the final products can be presented on demand. The most important aspects to be considered within seafood technology today are safety, healthy products and high eating quality. Safety can be divided into microbiological safety...... and not presenting any safety risk per se. Seafood is healthy due to the omega-3 fatty acids and the nutritional value of vitamins, peptides and proteins. The processing technology must however be performed such that these valuable features are not lost during production. The same applies to the eating quality. Any...

  18. Knowledge Technologies

    CERN Document Server

    Milton, Nick

    2008-01-01

    Several technologies are emerging that provide new ways to capture, store, present and use knowledge. This book is the first to provide a comprehensive introduction to five of the most important of these technologies: Knowledge Engineering, Knowledge Based Engineering, Knowledge Webs, Ontologies and Semantic Webs. For each of these, answers are given to a number of key questions (What is it? How does it operate? How is a system developed? What can it be used for? What tools are available? What are the main issues?). The book is aimed at students, researchers and practitioners interested in Knowledge Management, Artificial Intelligence, Design Engineering and Web Technologies. During the 1990s, Nick worked at the University of Nottingham on the application of AI techniques to knowledge management and on various knowledge acquisition projects to develop expert systems for military applications. In 1999, he joined Epistemics where he worked on numerous knowledge projects and helped establish knowledge management...

  19. Next-generation sequencing technologies and their application in microbiology: A review

    Institute of Scientific and Technical Information of China (English)

    秦楠; 栗东芳; 杨瑞馥

    2011-01-01

    Since its invention in the 1970s, nucleic acid sequencing technology has contributed tremendously to advances in genomics and related disciplines, and the high-throughput sequencing technologies developed at the beginning of this century have injected new vitality into genomics. These next-generation sequencing technologies, represented by the HiSeq 2000 from Illumina, SOLiD from Applied Biosystems and 454 from Roche, have re-energized the application of genomics. In this review, we first introduce the next-generation sequencing technologies and then describe their potential applications in the field of microbiology.

  20. Technology Transfer

    Science.gov (United States)

    Smith, Nanette R.

    1995-01-01

    The objective of this summer's work was to attempt to enhance Technology Application Group (TAG) ability to measure the outcomes of its efforts to transfer NASA technology. By reviewing existing literature, by explaining the economic principles involved in evaluating the economic impact of technology transfer, and by investigating the LaRC processes our William & Mary team has been able to lead this important discussion. In reviewing the existing literature, we identified many of the metrics that are currently being used in the area of technology transfer. Learning about the LaRC technology transfer processes and the metrics currently used to track the transfer process enabled us to compare other R&D facilities to LaRC. We discuss and diagram impacts of technology transfer in the short run and the long run. Significantly, it serves as the basis for analysis and provides guidance in thinking about what the measurement objectives ought to be. By focusing on the SBIR Program, valuable information regarding the strengths and weaknesses of this LaRC program are to be gained. A survey was developed to ask probing questions regarding SBIR contractors' experience with the program. Specifically we are interested in finding out whether the SBIR Program is accomplishing its mission, if the SBIR companies are providing the needed innovations specified by NASA and to what extent those innovations have led to commercial success. We also developed a survey to ask COTR's, who are NASA employees acting as technical advisors to the SBIR contractors, the same type of questions, evaluating the successes and problems with the SBIR Program as they see it. This survey was developed to be implemented interactively on computer. It is our hope that the statistical and econometric studies that can be done on the data collected from all of these sources will provide insight regarding the direction to take in developing systematic evaluations of programs like the SBIR Program so that they can

  1. Architectural technology

    DEFF Research Database (Denmark)

    2005-01-01

    The booklet offers an overall introduction to the Institute of Architectural Technology and its projects and activities, and an invitation to the reader to contact the institute or the individual researcher for further information. The research, which takes place at the Institute of Architectural...... Technology at the Roayl Danish Academy of Fine Arts, School of Architecture, reflects a spread between strategic, goal-oriented pilot projects, commissioned by a ministry, a fund or a private company, and on the other hand projects which originate from strong personal interests and enthusiasm of individual...

  2. Playful Technology

    DEFF Research Database (Denmark)

    Johansen, Stine Liv; Eriksson, Eva

    2013-01-01

    In this paper, the design of future services for children in Danish public libraries is discussed, in the light of new challenges and opportunities in relation to new media and technologies. The Danish government has over the last few years initiated and described a range of initiatives regarding...... in the library, the changing role of the librarians and the library space. We argue that intertwining traditional library services with new media forms and engaging play is the core challenge for future design in physical public libraries, but also that it is through new media and technology that new...

  3. Studying the Interaction Demands and Trends of Gene Technology and the U.S. Patent Regime: Analyzing the Origins of the Case 'Association for Molecular Pathology et al. v. Myriad Genetics, Inc., et al.'

    Institute of Scientific and Technical Information of China (English)

    吴秀文; 肖冬梅

    2015-01-01

    The conflict between the interests of high-value gene patentees and human life and health rights, the very stringent conditions of the research exemption under the U.S. patent regime, and the open, inclusive attitude of U.S. patent law toward patentable subject matter are the roots of the Myriad Genetics case, and they also reveal the interaction demands between gene technology and the U.S. patent regime. However, the U.S. Supreme Court's holding that the patent claims on isolated DNA were invalid can only temporarily quell the controversy that gene technology has brought to the patent regime, far from achieving a benign interaction between gene technology and the patent regime. By appropriately relaxing the conditions of the research exemption and narrowing the patentable subject matter of gene technology, the interests of gene patentees and human life and health rights can be balanced, promoting the optimization of the patent regime for gene technology.

  4. Relations between the technological standards and technological appropriation

    Directory of Open Access Journals (Sweden)

    Carlos Alberto PRADO GUERRERO

    2010-06-01

    Full Text Available The objective of this study is to analyze the educational practices of using Blackboard in blended learning environments with students of higher education to understand the relationship between technological appropriation and standards of educational technology. To achieve that goal, the following research question was raised: To what extent are the standards of educational technology related to the appropriation of technology in blended learning environments in higher education? The contextual framework of this work includes the following topics: the institution, teaching, teachers and students. A correlational design methodology was used. Correlations were carried out to determine the frequency and level of both the technological standards and the appropriation of technology. In comparing the results obtained from the students, the teachers and the platform, we found that students in the school under study showed a high degree of technological appropriation, and the same held for their performance on the technological standards. It was established that teachers play a key role in developing the technological appropriation of students and their performance on technology standards.
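
    The correlational design described above comes down to computing a correlation coefficient between, say, technology-standards scores and appropriation scores; a generic Pearson r sketch with invented data, not the study's dataset:

    ```python
    import statistics

    def pearson_r(xs, ys):
        """Pearson product-moment correlation between two score lists."""
        mx, my = statistics.fmean(xs), statistics.fmean(ys)
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy)

    standards_scores = [62, 75, 71, 80, 90, 55]      # hypothetical
    appropriation_scores = [58, 70, 75, 82, 88, 50]  # hypothetical
    print(f"r = {pearson_r(standards_scores, appropriation_scores):.3f}")
    ```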

  5. Study on Analyzing Monodisperse Uranium Oxide Particle by FT-TIMS

    Institute of Scientific and Technical Information of China (English)

    CHEN Yan; WANG Fan; ZHAO Yong-gang; LI Li-li; ZHANG Yan; SHEN Yan; CUI Jian-yong; LIU Yuang

    2012-01-01

    Environmental sampling is one of the important IAEA safeguards technologies, the aim of which is detecting undeclared nuclear activities. Analyzing the isotopic ratio of single uranium-bearing particles in swipe samples is an effective analytical technique by virtue of its ability to reveal present or past information about nuclear facilities. For this purpose, a new method combining the fission track (FT) technique with thermal ionization mass spectrometry (TIMS) was developed.

  6. Realization of inhomogeneous magnetic field for prism-type mass analyzer

    Directory of Open Access Journals (Sweden)

    P.O. Kuzema

    2012-06-01

    Full Text Available The configuration of magnet pole tips, which forms in the gap an inhomogeneous magnetic field with axial symmetry, has been determined, and the technology of their production is described. It is shown that, for a given value of the pole tip apex angle, the necessary heterogeneity of the magnetic field can be provided by a corresponding choice of the interpolar gap width of the mass analyzer magnet.

  7. Improved base calling for the Illumina Genome Analyzer using machine learning strategies

    OpenAIRE

    Kircher, Martin; Stenzel, Udo; Kelso, Janet

    2009-01-01

    The Illumina Genome Analyzer generates millions of short sequencing reads. We present Ibis (Improved base identification system), an accurate, fast and easy-to-use base caller that significantly reduces the error rate and increases the output of usable reads. Ibis is faster and more robust with respect to chemistry and technology than other publicly available packages. Ibis is freely available under the GPL from .

  8. Analyzing Real-World Light Duty Vehicle Efficiency Benefits

    Energy Technology Data Exchange (ETDEWEB)

    Gonder, Jeffrey; Wood, Eric; Chaney, Larry; Holden, Jacob; Jeffers, Matthew; Wang, Lijuan

    2016-06-08

    Off-cycle technologies represent an important pathway to achieve real-world fuel savings, through which OEMs can potentially receive credit toward CAFE compliance. DOE national labs such as NREL are well positioned to provide objective input on these technologies using large, national data sets in conjunction with OEM- and technology-specific testing. This project demonstrates an approach that combines vehicle testing (dynamometer and on-road) with powertrain modeling and simulation over large, representative datasets to quantify real-world fuel economy. The approach can be applied to specific off-cycle technologies (engine encapsulation, start/stop, connected vehicle, etc.) in A/B comparisons to support calculation of realistic real-world impacts. Future work will focus on testing-based A/B technology comparisons that demonstrate the significance of this approach.

  9. IPAT: a freely accessible software tool for analyzing multiple patent documents with inbuilt landscape visualizer.

    Science.gov (United States)

    Ajay, Dara; Gangwal, Rahul P; Sangamwar, Abhay T

    2015-01-01

    Intelligent Patent Analysis Tool (IPAT) is an online data retrieval tool, operated on a text mining algorithm, that extracts specific patent information in a predetermined pattern into an Excel sheet. The software is designed and developed to retrieve and analyze technology information from multiple patent documents and generate various patent landscape graphs and charts. The software is coded in C# in Visual Studio 2010; it extracts publicly available patent information from web pages such as Google Patents and simultaneously studies technology trends based on user-defined parameters. In other words, IPAT combined with manual categorization will act as an excellent technology assessment tool in competitive intelligence and due diligence for forecasting future R&D.
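
    The kind of pattern-based field extraction such a tool automates can be sketched with regular expressions; the field names and record format below are hypothetical, not IPAT's actual schema or code:

    ```python
    import csv
    import re
    import sys

    # Hypothetical front-page layout; real patent pages differ.
    RECORD = re.compile(
        r"Publication number:\s*(?P<number>\S+).*?"
        r"Assignee:\s*(?P<assignee>.+?)\n.*?"
        r"Publication date:\s*(?P<year>\d{4})",
        re.DOTALL)

    def extract(text, out=sys.stdout):
        """Write one CSV row per patent record found in the text."""
        writer = csv.writer(out)
        writer.writerow(["number", "assignee", "year"])
        for m in RECORD.finditer(text):
            writer.writerow([m["number"], m["assignee"].strip(), m["year"]])

    sample = """Publication number: US1234567B2
    Assignee: Example Corp.
    Publication date: 2014
    """
    extract(sample)
    ```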

  10. Scalable Parallel Methods for Analyzing Metagenomics Data at Extreme Scale

    Energy Technology Data Exchange (ETDEWEB)

    Daily, Jeffrey A. [Washington State Univ., Pullman, WA (United States)

    2015-05-01

    The field of bioinformatics and computational biology is currently experiencing a data revolution. The exciting prospect of making fundamental biological discoveries is fueling the rapid development and deployment of numerous cost-effective, high-throughput next-generation sequencing technologies. The result is that the DNA and protein sequence repositories are being bombarded with new sequence information. Databases are continuing to report a Moore's law-like growth trajectory in their database sizes, roughly doubling every 18 months. In what seems to be a paradigm shift, individual projects are now capable of generating billions of raw sequence data that need to be analyzed in the presence of already annotated sequence information. While it is clear that data-driven methods, such as sequence homology detection, are becoming the mainstay in the field of computational life sciences, the algorithmic advancements essential for implementing complex data analytics at scale have mostly lagged behind. Sequence homology detection is central to a number of bioinformatics applications including genome sequencing and protein family characterization. Given millions of sequences, the goal is to identify all pairs of sequences that are highly similar (or "homologous") on the basis of alignment criteria. While there are optimal alignment algorithms to compute pairwise homology, their deployment at large scale is currently not feasible; instead, heuristic methods are used at the expense of quality. In this dissertation, we present the design and evaluation of a parallel implementation for conducting optimal homology detection on distributed-memory supercomputers. Our approach uses a combination of techniques from asynchronous load balancing (viz. work stealing, dynamic task counters), data replication, and exact-matching filters to achieve homology detection at scale. Results for a collection of 2.56M sequences show parallel efficiencies of ~75-100% on up to 8K cores.
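
    The optimal alignment algorithms referred to here are typically Smith-Waterman-style dynamic programming; a compact, scoring-only Python sketch (the scoring parameters are illustrative, and this is not the dissertation's distributed implementation):

    ```python
    def smith_waterman(a, b, match=3, mismatch=-3, gap=-2):
        """Return the best local alignment score between sequences a and b."""
        rows, cols = len(a) + 1, len(b) + 1
        H = [[0] * cols for _ in range(rows)]  # DP matrix, zero-initialized
        best = 0
        for i in range(1, rows):
            for j in range(1, cols):
                diag = H[i-1][j-1] + (match if a[i-1] == b[j-1] else mismatch)
                # Local alignment: scores never drop below zero.
                H[i][j] = max(0, diag, H[i-1][j] + gap, H[i][j-1] + gap)
                best = max(best, H[i][j])
        return best

    print(smith_waterman("TGTTACGG", "GGTTGACTA"))  # 13 with these parameters
    ```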

  11. Technological Dynamics and Social Capability

    DEFF Research Database (Denmark)

    Fagerberg, Jan; Feldman, Maryann; Srholec, Martin

    2014-01-01

    This article analyzes factors shaping technological capabilities in USA and European countries, and shows that the differences between the two continents in this respect are much smaller than commonly assumed. The analysis demonstrates a tendency toward convergence in technological capabilities...... for the sample as a whole between 1998 and 2008. The results indicate that social capabilities, such as well-developed public knowledge infrastructure, an egalitarian distribution of income, a participatory democracy and prevalence of public safety condition the growth of technological capabilities. Possible...

  12. Manufacturing technologies

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-09-01

    The Manufacturing Technologies Center is an integral part of Sandia National Laboratories, a multiprogram engineering and science laboratory, operated for the Department of Energy (DOE) with major facilities at Albuquerque, New Mexico, and Livermore, California. Our Center is at the core of Sandia's Advanced Manufacturing effort which spans the entire product realization process.

  13. Energy Technology.

    Science.gov (United States)

    Eaton, William W.

    Reviewed are technological problems faced in energy production including locating, recovering, developing, storing, and distributing energy in clean, convenient, economical, and environmentally satisfactory manners. The energy resources of coal, oil, natural gas, hydroelectric power, nuclear energy, solar energy, geothermal energy, winds, tides,…

  14. Strategic Technology

    Science.gov (United States)

    2012-03-11

    the spectrum of future conflict and engagement. Technology Surprise: Francis Fukuyama, in his introduction to the book Blindside (Baltimore, MD: Brookings Institution Press, 2007), summarizes recent... atrocities or large-scale natural disasters abroad...

  15. GIG Technologies

    Science.gov (United States)

    2008-08-08

    Slide excerpt listing GIG technology topics: caching; the GIG as a sensor; cyber SA/defense; cross-domain information sharing; multi-level security solutions; Enterprise Service Bus (ESB); link layer technologies; an integrated link layer; an all-optical core for terrestrial and space networks replacing separate transmission networks in the mid term.

  16. Biomedical sensing analyzer (BSA) for mobile-health (mHealth)-LTE.

    Science.gov (United States)

    Adibi, Sasan

    2014-01-01

    The rapid expansion of mobile-based systems, the capabilities of smartphone devices, as well as the radio access and cellular network technologies are the wind beneath the wing of mobile health (mHealth). In this paper, the concept of biomedical sensing analyzer (BSA) is presented, which is a novel framework, devised for sensor-based mHealth applications. The BSA is capable of formulating the Quality of Service (QoS) measurements in an end-to-end sense, covering the entire communication path (wearable sensors, link-technology, smartphone, cell-towers, mobile-cloud, and the end-users). The characterization and formulation of BSA depend on a number of factors, including the deployment of application-specific biomedical sensors, generic link-technologies, collection, aggregation, and prioritization of mHealth data, cellular network based on the Long-Term Evolution (LTE) access technology, and extensive multidimensional delay analyses. The results are studied and analyzed in a LabView 8.5 programming environment.
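
    Since the BSA formulates QoS end-to-end, a first-order latency estimate is simply the sum of per-segment delays along the path the record describes (sensor, link technology, smartphone, LTE access, cloud, end-user); a toy sketch with hypothetical numbers, not the paper's measurement model:

    ```python
    # Hypothetical per-segment one-way delays in milliseconds.
    segment_delay_ms = {
        "wearable_sensor": 5.0,   # sampling + packetization
        "link_technology": 12.0,  # e.g. a short-range radio hop
        "smartphone": 8.0,        # aggregation + prioritization
        "lte_access": 25.0,       # uplink over LTE
        "mobile_cloud": 15.0,     # processing in the cloud
        "end_user": 20.0,         # delivery to the end-user application
    }

    end_to_end = sum(segment_delay_ms.values())
    print(f"estimated end-to-end delay: {end_to_end:.1f} ms")
    for segment, delay in segment_delay_ms.items():
        print(f"  {segment:16s} {delay:5.1f} ms ({100 * delay / end_to_end:4.1f}%)")
    ```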

  17. FAW Technology Strategies of Low-Carbon Passenger Car

    Institute of Scientific and Technical Information of China (English)

    Li Jun

    2012-01-01

The author analyzes the global background of low-carbon vehicle technology. A technology and economy analysis model called TOS was developed in the paper, and technology paths for low-carbon cars in China are analyzed based on the technologies currently available and those to be developed in China; three possible paths are presented based on the analysis. The author also explains the FAW BlueWay technology strategies for low-carbon cars over short-, mid- and long-term horizons, and concludes with an illustration of the powertrain lineup for FAW BlueWay Technologies.

  18. Miniature-MCA technology developments

    Energy Technology Data Exchange (ETDEWEB)

    Halbig, J.K.; Klosterbuer, S.F.; Stephens, M.M.; Biddle, R.S.

    1991-12-31

    We have recently reduced the size of multichannel analyzers (MCAs) and have implemented more features in hardware to relieve software requirements. We built and tested a spectroscopy grade, 4096-channel MCA. Exclusive of amplifier and power supply, it fits on two boards each approximately 7 by 15 cm. This paper discusses the features and performance of the analyzer and some reasonable applications of these technologies.
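At its core, an MCA bins detector pulse heights into channels to accumulate a spectrum; the 4096-channel analyzer described here does this in hardware. A minimal software sketch on synthetic pulses (the full-scale voltage and simulated peak are assumptions):

```python
# Minimal sketch of multichannel-analyzer histogramming: pulse amplitudes
# are mapped to one of 4096 channels and accumulated into a spectrum.
# Synthetic data; the instrument described above does this in hardware.
import random

N_CHANNELS = 4096
V_MAX = 10.0  # assumed full-scale pulse amplitude, volts

def amplitude_to_channel(v, v_max=V_MAX, n=N_CHANNELS):
    """Map a pulse amplitude to a channel index, clamped to full scale."""
    ch = int(v / v_max * n)
    return min(max(ch, 0), n - 1)

def acquire(pulses):
    spectrum = [0] * N_CHANNELS
    for v in pulses:
        spectrum[amplitude_to_channel(v)] += 1
    return spectrum

if __name__ == "__main__":
    random.seed(0)
    # Simulate a line at ~6.62 V on a flat background.
    pulses = [random.gauss(6.62, 0.05) for _ in range(50_000)]
    pulses += [random.uniform(0.0, V_MAX) for _ in range(10_000)]
    spectrum = acquire(pulses)
    peak = max(range(N_CHANNELS), key=spectrum.__getitem__)
    print(f"peak channel: {peak} ({peak * V_MAX / N_CHANNELS:.2f} V)")
```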

19. 21世纪制纸技术、纸制品进化之予测的有关人类要素分析的认识科学和文化遗传基因科学的方法 || A COGNITIVE AND MEMETIC SCIENCE APPROACH TO ANALYZE THE HUMAN FACTORS IN PREDICTING THE EVOLUTION OF PAPER TECHNOLOGY AND PRODUCTS IN THE 21ST CENTURY

    Institute of Scientific and Technical Information of China (English)

    尾锅史彦

    2004-01-01

INTRODUCTION: Predicting the future of the paper industry is conventionally conducted from technological and market-oriented aspects, as well as from the variety of constraints lying ahead of the industry, such as resource, energy, and environmental issues. Since paper products, particularly paper media, have a higher affinity to human beings compared with other sheet-like materials such as plastics, metals, glasses and so on, not only the above factors but also human factors such as 'the affinity of paper to human beings' and 'the cognitive characteristics of paper' have to be taken into consideration in constructing a precise prediction model for the future of the paper industry.

  20. Technology in L1

    DEFF Research Database (Denmark)

    Elf, Nikolaj Frydensbjerg; Hanghøj, Thorkild; Skaar, Håvard

    2015-01-01

In recent decades, several Scandinavian research projects have had an explicit focus on how technology intervenes in L1 (or so-called Mother Tongue Education) practices in Swedish, Norwegian and Danish educational contexts, and how this may impact on understanding of the subject. There is currently...... of empirical studies, what characterizes the research field?; and 3) for discussion, which broader implications does the review suggest for a rethinking of L1 in terms of practice and research? Introducing the notion of educational boundary objects, a theoretical framework is developed, which suggests four...... metaphors for understanding technology within L1: as a tool, as media, as socialization, and as literacy practices. These are found useful for analyzing and comparing both theoretical perspectives and empirical research on L1. A key finding of the study is that, although the included research...

  1. Evaluation of the Olympus AU 400 clinical chemistry analyzer.

    Science.gov (United States)

    Bilić, A; Alpeza, I; Rukavina, A S

    2000-01-01

The performance of the Olympus AU 400 clinical chemistry analyzer was evaluated according to the guidelines of the European Committee for Clinical Laboratory Standards. The following analytes were tested: glucose, urea, creatinine, calcium, AST, ALT, CK, LDH, ALP and amylase. The Olympus AU 400 was compared with the Olympus AU 800. Correlation coefficients showed high agreement between the compared analyzers. Other performance characteristics (intra- and inter-assay variation, carry-over and interferences) of the analyzer were satisfactory.

  2. Technology Programme

    Energy Technology Data Exchange (ETDEWEB)

    Batistoni, Paola; De Marco, Francesco; Pieroni, Leonardo (ed.)

    2005-07-01

The technology activities carried out by the Euratom-ENEA Association in the framework of the European Fusion Development Agreement concern the Next Step (International Thermonuclear Experimental Reactor - ITER), the Long-Term Programme (breeder blanket, materials, International Fusion Materials Irradiation Facility - IFMIF), Power Plant Conceptual Studies and Socio-Economic Studies. The Underlying Technology Programme was set up to complement the fusion activities as well as to develop technologies with a wider range of interest. The Technology Programme mainly involves staff from the Frascati laboratories of the Fusion Technical and Scientific Unit and from the Brasimone laboratories of the Advanced Physics Technologies Unit. Other ENEA units also provide valuable contributions to the programme. ENEA is heavily engaged in component development/testing and in design and safety activities for the European Fusion Technology Programme. Although the work documented in the following covers a large range of topics that differ considerably because they concern the development of extremely complex systems, the high level of integration and coordination ensures the capability to cover the fusion system as a whole. In 2004 the most significant testing activities concerned the ITER primary beryllium-coated first wall. In the field of high-heat-flux components, an important achievement was the qualification of the process for depositing a copper liner on carbon fibre composite (CFC) hollow tiles. This new process, pre-brazed casting (PBC), allows the hot radial pressing (HRP) joining procedure to be used also for CFC-based armour monoblock divertor components. The PBC and HRP processes are candidates for the construction of the ITER divertor. In the materials field an important milestone was the commissioning of a new facility for chemical vapour infiltration/deposition, used for optimising silicon carbide composite (SiCf/SiC) components. Eight patents were deposited during 2004.

  3. Hearing Assistive Technology

    Science.gov (United States)

Hearing Assistive Technology: FM Systems | Infrared Systems | Induction... What are hearing assistive technology systems (HATS)? Hearing assistive technology systems (HATS) are...

  4. The design of wavelength selector for full-automatic ELISA analyzer

    Science.gov (United States)

    Bao, Yan; Dong, Mingli; Zhu, Lianqing; Chang, Haitao; Li, Hong

    2011-05-01

In recent years, ELISA technology has developed rapidly, and the full-automatic ELISA analyzer, which is of significant practical value, is widely used in the diagnosis of many diseases caused by bacteria and viruses. The optical detection system is the core of the fully automated ELISA analyzer, and its key part is a high-precision wavelength selector. The authors present the design of a wavelength selector composed of a light source, an optical circuit plan, color filters and a signal acquisition module. A control system, based on the 8051 microcontroller, for the stepper motor that selects the appropriate color filter is introduced. The results of experimental tests show that the wavelength selector meets the requirements of the full-automatic ELISA analyzer.
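As a rough illustration of the selection logic such a wavelength selector implements, the sketch below maps a requested wavelength to the nearest filter and computes the shortest stepper rotation to reach it. The filter set, steps per revolution and motor interface are hypothetical, not the 8051 firmware described in the paper:

```python
# Sketch of stepper-driven filter-wheel positioning for a wavelength selector.
# Filter wavelengths and steps-per-revolution are illustrative assumptions.
FILTERS_NM = [405, 450, 492, 630]    # assumed filter set
STEPS_PER_REV = 200                  # assumed stepper resolution
SLOTS = len(FILTERS_NM)
STEPS_PER_SLOT = STEPS_PER_REV // SLOTS

def nearest_filter(target_nm):
    """Index of the filter closest to the requested wavelength."""
    return min(range(SLOTS), key=lambda i: abs(FILTERS_NM[i] - target_nm))

def steps_to_move(current_slot, target_slot):
    """Shortest signed rotation (in steps) between two filter slots."""
    delta = (target_slot - current_slot) % SLOTS
    if delta > SLOTS // 2:
        delta -= SLOTS  # rotate the short way round
    return delta * STEPS_PER_SLOT

if __name__ == "__main__":
    current = 0
    target = nearest_filter(495)  # request a reading near 495 nm
    print(f"selected {FILTERS_NM[target]} nm filter, "
          f"move {steps_to_move(current, target)} steps")
```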

  5. Communications technology

    Science.gov (United States)

    Cuccia, C. Louis; Sivo, Joseph

    1986-01-01

    The technologies for optimized, i.e., state of the art, operation of satellite-based communications systems are surveyed. Features of spaceborne active repeater systems, low-noise signal amplifiers, power amplifiers, and high frequency switches are described. Design features and capabilities of various satellite antenna systems are discussed, including multiple beam, shaped reflector shaped beam, offset reflector multiple beam, and mm-wave and laser antenna systems. Attitude control systems used with the antenna systems are explored, along with multiplexers, filters, and power generation, conditioning and amplification systems. The operational significance and techniques for exploiting channel bandwidth, baseband and modulation technologies are described. Finally, interconnectivity among communications satellites by means of RF and laser links is examined, as are the roles to be played by the Space Station and future large space antenna systems.

  6. Army Technology

    Science.gov (United States)

    2015-02-01

capabilities that result in rapid and efficient biosurveillance. The program uses an information portal similar to a health surveillance web... and quickly." The D3 program is part of the broader Joint U.S. Forces Korea Portal and Integrated Threat Reduction Advanced Technology Demonstration... biosurveillance capabilities. Army researchers traveled to Korea with a suite of equipment, including nine commercial detector systems. Some of the systems are...

  7. Blast Technologies

    Science.gov (United States)

    2011-06-27

    rollover  VAT: Vertical forces and floor deformation  HIP : Head protection systems Payoff: MABS  State-of-the-art unique piece of test equipment...13 14 15 16 17 Energy Absorbing Seats w/ Restraints Blast Mats and other Interior Treatments Data Recorders and Sensors Methods and Standards... treatments .  Airbag or comparable technologies such as bolsters.  Sensors that can detect and deploy/trigger interior treatments within the timeframe of a

  8. Manufacturing technology

    Energy Technology Data Exchange (ETDEWEB)

    Leonard, J.A.; Floyd, H.L.; Goetsch, B.; Doran, L. [eds.

    1993-08-01

    This bulletin depicts current research on manufacturing technology at Sandia laboratories. An automated, adaptive process removes grit overspray from jet engine turbine blades. Advanced electronic ceramics are chemically prepared from solution for use in high- voltage varistors. Selective laser sintering automates wax casting pattern fabrication. Numerical modeling improves performance of photoresist stripper (simulation on Cray supercomputer reveals path to uniform plasma). And mathematical models help make dream of low- cost ceramic composites come true.

  9. Poretools: a toolkit for analyzing nanopore sequence data

    OpenAIRE

    Loman, Nicholas J.; Quinlan, Aaron R.

    2014-01-01

    Motivation: Nanopore sequencing may be the next disruptive technology in genomics, owing to its ability to detect single DNA molecules without prior amplification, lack of reliance on expensive optical components, and the ability to sequence long fragments. The MinION™ from Oxford Nanopore Technologies (ONT) is the first nanopore sequencer to be commercialized and is now available to early-access users. The MinION™ is a USB-connected, portable nanopore sequencer that permits real-time analysi...

  10. Aplicación de un modelo de ecuaciones estructurales para analizar los sistemas de gestión en la integración de la RSC y su influencia en la estrategia y el performance de las empresas tecnológicas || Applying a structural equation model to analyze management systems in the integration of CSR and its influence on the strategy and performance of technology companies

    Directory of Open Access Journals (Sweden)

    Bernal Conesa, Juan Andrés

    2016-06-01

Full Text Available The importance of management systems for the integration of CSR into company strategy is a vital resource that has been little studied in technology companies. In this paper a structural equation model is proposed to explain the influence of CSR and its integration into the management system of the company, an integration facilitated by the existence of previous standardized management systems, and to examine how this integration affects the strategy of the company and whether this is reflected in the economic performance of the technology company. The study was conducted in companies located in Spanish Science and Technology Parks. The model results show that there is a positive, direct and statistically significant relationship between the integration of CSR and strategy, on the one hand, and between integration and performance, on the other. Indirect relationships are also evident between the standardized management systems existing prior to the implementation of CSR and performance, and therefore the results have practical implications for the management of CSR in technology companies.

  11. Technology Management

    DEFF Research Database (Denmark)

    Pilkington, Alan

    2014-01-01

This paper reports a bibliometric analysis (co-citation network analysis) of 10 journals in the management of technology (MOT) field. As well as introducing various bibliometric ideas, network analysis tools identify and explore the concepts covered by the field and their inter-relationships. Specific results from different levels of analysis show the different dimensions of technology management: • Co-word terms identify themes • Journal co-citation network: linking to other disciplines • Co-citation networks show concentrations of themes. The analysis shows that MOT has a bridging role in integrating ideas from several distinct disciplines. This suggests that management and strategy are central to MOT, which essentially relates to the firm rather than policy. Similarly we have a dual focus on capabilities, but can see subtle differences in how we view these ideas, either through an inwards...

  12. A Novel Analyzer Control System for Diffraction Enhanced Imaging

    Science.gov (United States)

    Rhoades, Glendon; Belev, George; Rosenberg, Alan; Chapman, Dean

    2013-03-01

Diffraction Enhanced Imaging is an imaging modality that derives contrast from x-ray refraction, from an extreme form of scatter rejection (extinction), and from the absorption that is common to conventional radiography. A critical part of the imaging system is the "analyzer crystal", which is used to re-diffract the beam after it passes through the object being imaged. The analyzer and monochromator crystals form a matched parallel crystal set. This analyzer needs to be accurately aligned, and that alignment must be maintained over the course of an imaging session. Typically, the analyzer needs to remain at a specific angle to within a few tens of nanoradians to prevent problems with image interpretation. Ideally, the analyzer would be set to a specific angle and would remain at that angle over the course of an imaging session, which might last from a fraction of a second to several minutes or longer. In many instances, this requirement is well beyond what is possible by relying on mechanical stability alone, and some form of feedback control of the analyzer setting is required. We describe a novel analyzer control system that allows the analyzer to be set at any location on the analyzer rocking curve, including the peak. The method is extensible to a range of several Darwin widths away from the peaked location. Such a system is necessary for accurate implementation of the method and is intended to make its use simpler, without relying on repeated alignment during the imaging session.
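A minimal sketch of one way such feedback can work: choose the intensity corresponding to the desired rocking-curve position and apply a proportional correction to the crystal angle whenever drift changes the measured intensity. The Gaussian rocking curve, drift model and gain below are assumptions for illustration, not the published control system:

```python
# Minimal sketch of intensity-feedback stabilization of an analyzer crystal at
# a chosen point on its rocking curve. The Gaussian rocking curve, drift model
# and gain are illustrative assumptions, not the control system in the paper.
import math
import random

WIDTH_URAD = 20.0  # assumed rocking-curve width parameter (microradians)

def rocking_curve(theta_urad):
    """Normalized diffracted intensity vs. angular offset from the peak
    (an idealized Gaussian stand-in for the real rocking curve)."""
    return math.exp(-0.5 * (theta_urad / WIDTH_URAD) ** 2)

def hold_on_flank(setpoint_urad=-10.0, gain=10.0, steps=300):
    """Proportional feedback on measured intensity, low-angle flank only."""
    target_i = rocking_curve(setpoint_urad)
    theta = -30.0                          # start misaligned
    for _ in range(steps):
        theta += random.gauss(0.0, 0.2)    # thermal/mechanical drift
        error = rocking_curve(theta) - target_i
        # On the low-angle flank intensity rises with angle, so a positive
        # error means the crystal crept toward the peak: step back.
        theta -= gain * error
    return theta

if __name__ == "__main__":
    random.seed(1)
    print(f"settled at {hold_on_flank():+.2f} urad (setpoint -10.00 urad)")
```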

  13. [Health Technology Dependency: A Concept Analysis].

    Science.gov (United States)

    Chen, Miao-Yi; Chen, Ting-Yu; Kao, Chi-Wen

    2016-02-01

Health technology dependence is a widely recognized concept that refers to the utilization of technology, including drugs, equipment, instruments, and related devices, to compensate for a physical disability or to prevent the progression of a disability. Although technology may significantly prolong the life of a patient, it may also increase the psychological pressure on these patients and the burdens of their caregivers. There is a current dearth of research and discussion related to the concept of "health technology dependency". Therefore, the present paper uses the strategies of concept analysis described by Walker & Avant (2010) to analyze this concept. The characteristic definition of health technology dependence addresses individuals who: (1) currently live with health technology, (2) may perceive physical or psychological burdens due to health technology, and (3) feel physical and psychological well-being when coping positively with their health technology dependency and, further, regard health technology as a part of their body. Further, the present paper uses case examples to help analyze the general concept. It is hoped that nurses may better understand the concept of "health technology dependency", consider the concerns of health-technology-dependent patients and their families, and develop relevant interventions to promote the well-being of these patients and their families.

  14. Aviation environmental technology and science

    Institute of Scientific and Technical Information of China (English)

    Zhang Yanzhong

    2008-01-01

The impact of aviation on the environment and aviation environmental protection projects are expounded, and the atmospheric pollution and noise effects of aircraft emissions are analyzed. Approaches to controlling aircraft exhaust pollution and noise pollution are researched, and technology and management measures to reduce air pollution are proposed.

  15. Agent-based Modeling Methodology for Analyzing Weapons Systems

    Science.gov (United States)

    2015-03-26

Figure 14: Simulation study methodology for the weapon system analysis; metrics definition and data collection; the analysis plan calls for... AGENT-BASED MODELING METHODOLOGY FOR ANALYZING WEAPONS SYSTEMS, thesis by Casey D. Connors, Major, USA, presented to the Faculty, Department of Operational Sciences...

  16. 40 CFR 86.1524 - Carbon dioxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

... Title 40, Protection of Environment, Environmental Protection Agency (continued), Air... Test Procedures, § 86.1524 Carbon dioxide analyzer calibration. (a) The calibration requirements for...

  17. 40 CFR 91.320 - Carbon dioxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

... Title 40, Protection of Environment, § 91.320 Carbon dioxide analyzer calibration. (a) Prior to its introduction into service, and monthly thereafter, or within one month prior to the certification test, calibrate the NDIR carbon...

  18. 40 CFR 86.317-79 - Hydrocarbon analyzer specifications.

    Science.gov (United States)

    2010-07-01

... Title 40, Protection of Environment, § 86.317-79 Hydrocarbon analyzer specifications. (a) Hydrocarbon measurements are to be made with a heated... measures hydrocarbon emissions on a dry basis is permitted for gasoline-fueled testing; Provided,...

  19. 40 CFR 90.318 - Oxides of nitrogen analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... at any point, use the best-fit non-linear equation which represents the data to within two percent of... been set to the most common operating range. (4) Introduce into the NOX generator analyzer-system an NO... off the NOX generator but maintain gas flow through the system. The oxides of nitrogen analyzer...

  20. DUAL-CHANNEL PARTICLE SIZE AND SHAPE ANALYZER

    Institute of Scientific and Technical Information of China (English)

    Arjen van der Schoot

    2004-01-01

Fig. 1 shows a newly developed analyzer (Ankersmid CIS-100) that brings together two different measurement channels for accurate size and shape measurement of spherical and non-spherical particles. The size of spherical particles is measured by a HeNe laser beam; the size of non-spherical particles is analyzed by dynamic video analysis of the particles' shape.

  1. Introduction of new technology in vascular surgery.

    Science.gov (United States)

    Bergqvist, D

    2008-01-01

In this review paper, the introduction of new technologies in vascular surgery is discussed. The difficulties, compared with the introduction of pharmacological treatments, are analyzed. The pros and cons of randomized controlled trials and observational studies are listed.

  2. Sensemaking in Technology-Use Mediation

    DEFF Research Database (Denmark)

    Bansler, Jørgen P.; Havn, Erling C.

    2006-01-01

    of advanced CSCW technologies is basically a problem of sensemaking. We analyze how a group of “technology-use mediators” (Orlikowski et al. Org. Sci. (1995) 6(4), 423) in a large, multinational company adapted a groupware technology (a “virtual workspace”) to the local organizational context (and vice versa......) by modifying features of the technology, providing ongoing support for users, and promoting appropriate conventions of use. Our findings corroborate earlier research on technology-use mediation, which suggests that such mediators can exert considerable influence on how a particular technology...... will be established and used in an organization. However, we also find that the process of technology-use mediation is much more complex and indeterminate than prior research suggests. The reason being, we argue, that new, advanced CSCW technologies, such as “virtual workspaces” and other groupware applications...

  3. Agreement technologies

    CERN Document Server

    Ossowski, Sascha

    2013-01-01

    More and more transactions, whether in business or related to leisure activities, are mediated automatically by computers and computer networks, and this trend is having a significant impact on the conception and design of new computer applications. The next generation of these applications will be based on software agents to which increasingly complex tasks can be delegated, and which interact with each other in sophisticated ways so as to forge agreements in the interest of their human users. The wide variety of technologies supporting this vision is the subject of this volume. It summarises

  4. International Space Station Major Constituent Analyzer On-orbit Performance

    Science.gov (United States)

    Gardner, Ben D.; Erwin, Phillip M.; Wiedemann, Rachel; Matty, Chris

    2016-01-01

    The Major Constituent Analyzer (MCA) is a mass spectrometer based system that measures the major atmospheric constituents on the International Space Station. A number of limited-life components require periodic change-out, including the ORU 02 analyzer and the ORU 08 Verification Gas Assembly. The most recent ORU 02 and ORU 08 assemblies are operating nominally. For ORU 02, the ion source filaments and ion pump lifetime continue to be key determinants of MCA performance. Additionally, testing is underway to evaluate the capacity of the MCA to analyze ammonia. Finally, plans are being made to bring the second MCA on ISS to an operational configuration.

  5. Note: Portable rare-earth element analyzer using pyroelectric crystal

    Energy Technology Data Exchange (ETDEWEB)

    Imashuku, Susumu, E-mail: imashuku.susumu.2m@kyoto-u.ac.jp; Fuyuno, Naoto; Hanasaki, Kohei; Kawai, Jun [Department of Materials Science and Engineering, Kyoto University, Sakyo, Kyoto 606-8501 (Japan)

    2013-12-15

We report a portable rare-earth element analyzer with a palm-top-sized chamber containing a pyroelectric-crystal electron source and a sample stage, utilizing the cathodoluminescence (CL) phenomenon. It is the smallest CL-based portable rare-earth element analyzer reported so far. The analyzer detected the rare-earth elements Dy, Tb, Er, and Sm at ppm levels in zircon, which were not detected by scanning electron microscopy-energy dispersive X-ray spectroscopy analysis. We also performed elemental mapping of rare-earth elements by capturing a CL image with a CCD camera.

  6. COSTEP: A comprehensive suprathermal and energetic particle analyzer for SOHO

    Science.gov (United States)

    Kunow, Horst; Fischer, Harald; Green, Guenter; Mueller-Mellin, Reinhold; Wibberenz, Gerd; Holweger, Hartmut; Evenson, Paul; Meyer, Jean-Paul; Hasebe, Nabuyuki; Vonrosenvinge, Tycho

    1988-01-01

The group of instruments involved in the COSTEP (comprehensive suprathermal and energetic particle analyzer) project is described. Three sensors are covered: the LION (low energy ion and electron) instrument, the MEICA (medium energy ion composition analyzer), and the EPHIN (electron proton helium instrument). They are designed to analyze particle emissions from the sun over a wide range of species (electrons through iron) and energies (60 keV/particle to 500 MeV/nucleon). The data collected are used in studying solar and space plasma physics.

  7. Validation of ESR analyzer using Westergren ESR method.

    Science.gov (United States)

    Sikka, Meera; Tandon, Rajesh; Rusia, Usha; Madan, Nishi

    2007-07-01

Erythrocyte sedimentation rate (ESR) is one of the most frequently ordered laboratory tests. ESR analyzers were developed to provide a quick and efficient measure of ESR. We compared the results of ESR obtained by an ESR analyzer with those obtained by the Westergren method in a group of 75 patients. Linear regression analysis showed a good correlation between the two results (r = 0.818, p < 0.01). The intraclass correlation was 0.82. The analyzer method had the advantages of safety, decreased technician time, and improved patient care by providing quick results.
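For orientation, the comparison statistics reported above (least-squares regression and correlation on paired results) take only a few lines to compute. The paired readings in this sketch are synthetic, not the study's data:

```python
# Sketch of method-comparison statistics for two ESR methods: Pearson
# correlation and a least-squares fit on paired results. Synthetic data.
import statistics

analyzer   = [12, 25, 40, 8, 55, 30, 18, 70, 22, 45]   # ESR analyzer, mm/h
westergren = [10, 28, 38, 9, 60, 27, 20, 75, 25, 42]   # Westergren, mm/h

def pearson_r(x, y):
    mx, my = statistics.fmean(x), statistics.fmean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def least_squares(x, y):
    mx, my = statistics.fmean(x), statistics.fmean(y)
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx  # slope, intercept

if __name__ == "__main__":
    r = pearson_r(analyzer, westergren)
    slope, intercept = least_squares(analyzer, westergren)
    print(f"r = {r:.3f}; Westergren = {slope:.2f} * analyzer + {intercept:.2f}")
```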

  8. Photon technology. Hard photon technology; Photon technology. Hard photon gijutsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-03-01

Research results on hard photon technology have been summarized as part of a novel technology development effort highly utilizing the quantum nature of the photon. Hard photon technology refers to photon beam technologies which use photons in the 0.1 to 200 nm wavelength region. Hard photons have not been used in industry due to the lack of suitable photon sources and optical devices. However, hard photons in this wavelength region are expected to bring about innovations in such areas as ultrafine processing and material synthesis due to their atom-selective reactions, inner-shell excitation reactions, and high spatial resolution. Technological themes and possibilities have therefore been surveyed. Although principle proposals and their verification exist for the individual technologies of hard photon generation, control and utilization, they are still far from practical application. For photon source technology, the laser-diode-pumped driver laser technology, laser plasma photon source technology, synchrotron radiation photon source technology, and vacuum ultraviolet photon source technology are presented. For optical device technology, the multi-layer film technology for beam mirrors and the non-spherical lens processing technology are introduced. The reduction lithography technology, hard photon excitation processes, and methods of analysis and measurement are also described. 430 refs., 165 figs., 23 tabs.

  9. Leaders, laggards and technology seeking strategies

    NARCIS (Netherlands)

    Smeets, Roger; Bosker, E. M.

    2011-01-01

    We analyze the conditions determining optimal technology seeking strategies for leader and laggard firms. We extend existing theories by differentiating leaders and laggards in terms of absorptive capacity and intra-firm technology transfer skills, next to productivity levels. In addition, both Fore

  10. [Role of Radionuclide Technologies in Medicine].

    Science.gov (United States)

    Chernyaev, A P; Belousov, A V; Varzar, S M; Borchegovskaya, P Y; Nikolaeva, A A; Krusanov, G A

    2016-01-01

The paper describes the role of radionuclide technologies among the nuclear-physical methods used in medicine. The state and prospects of the development of nuclear technologies using radionuclides in medicine, in particular the method of brachytherapy, are analyzed. An analysis of the current state of the application of radionuclide facilities in medicine is provided.

  11. Wearable Technology

    Science.gov (United States)

    Watson, Amanda

    2013-01-01

Wearable technology projects, to be useful in the future, must be seamlessly integrated with the Flight Deck of the Future (F.F). The lab contains mockups of space vehicle cockpits, habitat living quarters, and workstations equipped with novel user interfaces. The Flight Deck of the Future is one element of the Integrated Power, Avionics, and Software (IPAS) facility, which, to a large extent, manages the F.F network and data systems. To date, integration with the Flight Deck of the Future has been limited by a lack of tools and understanding of its data handling systems. To remedy this problem it will be necessary to learn how data are managed in the Flight Deck of the Future and to develop tools or interfaces that enable easy integration of WEAR Lab and EV3 products into its mockups. This capability is critical to future prototype integration, evaluation, and demonstration, and it will allow WEAR Lab products, EV3 human interface prototypes, and technologies from other JSC organizations to be evaluated and tested in the Flight Deck of the Future. All WEAR Lab products must be integrated with the interface that will connect them to the Flight Deck of the Future. The WEAR Lab products will primarily be programmed in Arduino, which will be used for the development of wearable controls, a tactile communication garment, and a wearable methane detection and warning system.

  12. A Delphi forecast of technology in education

    Science.gov (United States)

    Robinson, B. E.

    1973-01-01

The results of a Delphi forecast of the utilization and social impacts of large-scale educational telecommunications technology are reported. The focus is on both forecasting methodology and educational technology. The various methods of forecasting used by futurists are analyzed from the perspective of the most appropriate method for a prognosticator of educational technology, and previous forecasts and studies are reviewed and critically analyzed. Graphic responses, summarized comments, and a scenario of education in 1990 are presented.

  13. Josephson junction spectrum analyzer for millimeter and submillimeter wavelengths

    Energy Technology Data Exchange (ETDEWEB)

    Larkin, S.Y.; Anischenko, S.E.; Khabayev, P.V. [State Research Center, Kiev (Ukraine)

    1994-12-31

    A prototype of the Josephson-effect spectrum analyzer developed for the millimeter-wave band is described. The measurement results for spectra obtained in the frequency band from 50 to 250 GHz are presented.

  14. Testing Evaluation of the Electrochemical Organic Content Analyzer

    Science.gov (United States)

    Davenport, R. J.

    1979-01-01

The breadboard electrochemical organic content analyzer was evaluated for aerospace applications. An awareness of the disadvantages of expendables in some systems prompted an effort to investigate ways of reducing the analyzer's electrolyte consumption from the rate of 5.17 kg/30 days. It was found that the electrochemical organic content analyzer can serve as the organic monitor of a water quality monitor, with a range of 0.1 to 100 mg/l total organic carbon for a large number of common organic solutes. In a flight version, it is anticipated the analyzer would occupy 0.0002 cu m, weigh 1.4 kg, and require 10 W or less of power. With the optimum method of injecting electrolyte into the sample (saturation of the sample with a salt), it would expend only 0.04 kg of electrolyte during 30 days of continuous operation.

  15. Mini Total Organic Carbon Analyzer (miniTOCA) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Total Organic Carbon (TOC) analyzers function by converting (oxidizing) all organic compounds (contaminants) in the water sample to carbon dioxide gas (CO2), then...

  16. Multisensor Analyzed Sea Ice Extent - Northern Hemisphere (MASIE-NH)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Multisensor Analyzed Sea Ice Extent Northern Hemisphere (MASIE-NH) products provide measurements of daily sea ice extent and sea ice edge boundary for the...

  17. Mars & Multi-Planetary Electrical Environment Spectrum Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Our objective is to develop MENSA as a highly integrated planetary radio and digital spectrum analyzer cubesat payload that can be deployed as a satellite instrument...

  18. Airspace Analyzer for Assessing Airspace Directional Permeability Project

    Data.gov (United States)

    National Aeronautics and Space Administration — We build a software tool which enables the user (airline or Air Traffic Service Provider (ATSP)) the ability to analyze the flight-level-by-flight-level permeability...

  19. Mobile Greenhouse Gas Flux Analyzer for Unmanned Aerial Vehicles Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Los Gatos Research (LGR) proposes to develop highly-accurate, lightweight, low-power gas analyzers for measurements of carbon dioxide (CO2) and water vapor (H2O)...

  20. 40 CFR 1065.250 - Nondispersive infra-red analyzer.

    Science.gov (United States)

    2010-07-01

    ... analyzer that has compensation algorithms that are functions of other gaseous measurements and the engine's known or assumed fuel properties. The target value for any compensation algorithm is 0.0% (that is,...

  1. 40 CFR 1065.272 - Nondispersive ultraviolet analyzer.

    Science.gov (United States)

    2010-07-01

    ... in § 1065.307. You may use a NDUV analyzer that has compensation algorithms that are functions of... compensation algorithm is 0.0% (that is, no bias high and no bias low), regardless of the uncompensated...

  2. Lab-on-a-chip astrobiology analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall goal of this program (through Phase III) is to develop an astrobiology analyzer to measure chemical signatures of life in extraterrestrial settings. The...

  3. Analyzing Software Errors in Safety-Critical Embedded Systems

    Science.gov (United States)

    Lutz, Robyn R.

    1994-01-01

This paper analyzes the root causes of safety-related software faults. Software faults identified as potentially hazardous to the system are distributed somewhat differently over the set of possible error causes than non-safety-related software faults.

  4. Mobile Greenhouse Gas Flux Analyzer for Unmanned Aerial Vehicles Project

    Data.gov (United States)

    National Aeronautics and Space Administration — In this SBIR Phase I effort, Los Gatos Research (LGR) proposes to develop a highly-accurate, lightweight, low-power gas analyzer for eddy flux covariance...

  5. Triple Isotope Water Analyzer for Extraplanetary Studies Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Los Gatos Research (LGR) proposes to employ Off-Axis ICOS to develop triple-isotope water analyzers for lunar and other extraplanetary exploration. This instrument...

  6. APPROACHES TO ANALYZE THE QUALITY OF ROMANIAN TOURISM WEB SITES

    Directory of Open Access Journals (Sweden)

    Lacurezeanu Ramona

    2013-07-01

The purpose of our work is to analyze travel web-sites, more exactly, whether the criteria used to analyze virtual stores are also adequate for the Romanian tourism product. Following the study, we concluded that Romanian online tourism web-sites aimed at the Romanian market have the same features that we found listed on similar web-sites from France, England, Germany, etc. In conclusion, online Romanian tourism can be considered one of the factors of economic growth.

  7. Real-Time, Polyphase-FFT, 640-MHz Spectrum Analyzer

    Science.gov (United States)

    Zimmerman, George A.; Garyantes, Michael F.; Grimm, Michael J.; Charny, Bentsian; Brown, Randy D.; Wilck, Helmut C.

    1994-01-01

Real-time polyphase-fast-Fourier-transform (polyphase-FFT) spectrum analyzer designed to aid in detection of multigigahertz radio signals in two 320-MHz-wide polarization channels. Spectrum analyzer divides total spectrum of 640 MHz into 33,554,432 frequency channels of about 20 Hz each. Size and cost of polyphase-coefficient memory substantially reduced, and much of processing loss of windowed FFTs eliminated.
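For readers unfamiliar with the technique: a polyphase-FFT channelizer applies a prototype low-pass filter, split into per-channel polyphase branches, ahead of the FFT, which is what removes most of the processing loss of plain windowed FFTs. A toy-scale numpy sketch (64 channels rather than 33,554,432; the filter length and test tone are arbitrary choices):

```python
# Minimal polyphase-FFT channelizer sketch (numpy): a prototype low-pass
# filter is split into M polyphase branches whose partial sums feed an
# M-point FFT. Channel count and filter length are toy values, not the
# 33,554,432-channel instrument described above.
import numpy as np

M = 64                # channels in this sketch
TAPS_PER_BRANCH = 8   # prototype filter taps per polyphase branch

def channelize(x, h):
    """Produce one M-channel frame per M input samples."""
    L = len(h)
    n_frames = (len(x) - L) // M
    frames = np.empty((n_frames, M), dtype=complex)
    for k in range(n_frames):
        seg = x[k * M : k * M + L]
        # Weight the segment by the prototype filter, then fold it into M
        # polyphase partial sums (branch m accumulates samples m, m+M, ...).
        frames[k] = (seg * h).reshape(TAPS_PER_BRANCH, M).sum(axis=0)
    return np.fft.fft(frames, axis=1)

if __name__ == "__main__":
    h = np.sinc(np.arange(M * TAPS_PER_BRANCH) / M - TAPS_PER_BRANCH / 2)
    h *= np.hamming(len(h))                       # windowed-sinc prototype
    t = np.arange(50_000)
    x = np.exp(2j * np.pi * (10.3 / M) * t)       # tone between bins 10 and 11
    spectrum = np.abs(channelize(x, h)).mean(axis=0)
    print("strongest channel:", int(spectrum.argmax()))  # expect 10
```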

  8. A New Theoretical Framework for Analyzing Stochastic Global Optimization Algorithms

    Institute of Scientific and Technical Information of China (English)

    1999-01-01

In this paper, we develop a new theoretical framework, by means of absorbing Markov process theory, for analyzing some stochastic global optimization algorithms. Applying the framework to pure random search, we prove that pure random search converges to the global minimum in probability and that its hitting time follows a geometric distribution. We also analyze pure adaptive search within this framework and show that pure adaptive search converges to the global minimum in probability and that its hitting time follows a Poisson distribution.
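The pure-random-search result is easy to illustrate: if each independent sample lands in a target set of probability p, the first-hit iteration is geometric with mean 1/p. A minimal sketch (the objective, sampler and target set are arbitrary choices for illustration):

```python
# Sketch of pure random search: sample uniformly, keep the best. If the
# target set has probability p under the sampler, the first-hit iteration
# is geometric with mean 1/p, matching the convergence result above.
import random

def pure_random_search(f, sample, hit, max_iter=10_000):
    """Return (best_x, best_f, first-hit iteration or None)."""
    best_x, best_f, first_hit = None, float("inf"), None
    for n in range(1, max_iter + 1):
        x = sample()
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
        if first_hit is None and hit(x):
            first_hit = n
    return best_x, best_f, first_hit

if __name__ == "__main__":
    random.seed(42)
    f = lambda x: (x - 0.3) ** 2              # global minimum at x = 0.3
    sample = lambda: random.uniform(0.0, 1.0)
    hit = lambda x: abs(x - 0.3) < 0.005      # target set with p = 0.01
    hits = [pure_random_search(f, sample, hit)[2] for _ in range(500)]
    valid = [h for h in hits if h is not None]
    print(f"mean first-hit iteration: {sum(valid) / len(valid):.0f}"
          f" (theory: 1/p = 100)")
```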

  9. Conceptual Framework for Analyzing the MTS within the Intermodal System

    Science.gov (United States)

    2012-06-01

US Army Corps of Engineers: Conceptual Framework for Analyzing the MTS within the Intermodal System, Dr. Mike..., 2012. [Slide and report-form fragment; recoverable framework elements: current MTS, current intermodal system, MTS investments, intermodal investments, future freight transportation demand.]

  10. Seasonal cycle of cloud cover analyzed using Meteosat images

    OpenAIRE

    Massons, J.; Domingo, D.; Lorente, J.

    1998-01-01

    A cloud-detection method was used to retrieve cloudy pixels from Meteosat images. High spatial resolution (one pixel), monthly averaged cloud-cover distribution was obtained for a 1-year period. The seasonal cycle of cloud amount was analyzed. Cloud parameters obtained include the total cloud amount and the percentage of occurrence of clouds at three altitudes. Hourly variations of cloud cover are also analyzed. Cloud properties determined are coherent with those obtained in previous studies....

  11. Analyzing the spectrum of general, non-hermitian Dirac operators

    CERN Document Server

    Gattringer, C R; Gattringer, Christof; Hip, Ivan

    1999-01-01

    We discuss the computational problems when analyzing general, non-hermitian matrices and in particular the un-modified Wilson lattice Dirac operator. We report on our experiences with the Implicitly Restarted Arnoldi Method. The eigenstates of the Wilson-Dirac operator which have real eigenvalues and correspond to zero modes in the continuum are analyzed by correlating the size of the eigenvalues with the chirality of the eigenstates.

  12. BK/TD models for analyzing in vitro impedance data on cytotoxicity.

    Science.gov (United States)

    Teng, S; Barcellini-Couget, S; Beaudouin, R; Brochot, C; Desousa, G; Rahmani, R; Pery, A R R

    2015-06-01

The ban on animal testing has spurred the development of new in vitro technologies for cosmetics safety assessment. Impedance measurement is one such technology, enabling monitoring of cell viability in real time. However, analyzing real-time data requires moving from static to dynamic toxicity assessment. In the present study, we built mechanistic biokinetic/toxicodynamic (BK/TD) models to analyze the time course of cell viability in impedance-based cytotoxicity assays. These models account for the fate of the tested compounds during the assay. BK/TD models were applied to analyze HepaRG cell viability after single (48 h) and repeated (4-week) exposures to three hepatotoxic compounds (coumarin, isoeugenol and benzophenone-2). The BK/TD models properly fit the data used for their calibration, obtained for single or repeated exposure. For only one of the three compounds were the models calibrated with a single exposure able to predict repeated-exposure data. We therefore recommend the use of long-term in vitro exposure data in order to adequately account for chronic hepatotoxic effects. The models we propose here can be coupled with human biokinetic models in order to relate dose exposure and human hepatotoxicity.
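A generic BK/TD structure of this kind couples a biokinetic equation for compound loss in the assay medium with a toxicodynamic equation for cell survival. The sketch below shows one such minimal form; the equations and parameter values are illustrative assumptions, not the models fitted in the paper:

```python
# Generic biokinetic/toxicodynamic (BK/TD) sketch: compound concentration
# C(t) decays in the assay medium while cell survival S(t) declines at a
# rate driven by the concentration above a no-effect threshold. Structure
# and parameters are illustrative assumptions, not the published models.

def simulate(c0, k_deg=0.02, k_kill=0.001, threshold=5.0,
             t_end_h=48.0, dt=0.01):
    """Euler integration of:
       dC/dt = -k_deg * C                      (biokinetics)
       dS/dt = -k_kill * max(C - threshold, 0) * S   (toxicodynamics)"""
    c, s, t = c0, 1.0, 0.0
    while t < t_end_h:
        dc = -k_deg * c                              # compound loss
        ds = -k_kill * max(c - threshold, 0.0) * s   # viability decline
        c += dc * dt
        s += ds * dt
        t += dt
    return s

if __name__ == "__main__":
    for c0 in (1.0, 10.0, 50.0, 200.0):
        print(f"C0 = {c0:6.1f} uM -> viability after 48 h: {simulate(c0):.2f}")
```

Run over a range of initial concentrations, the model traces the expected dose-response: low doses never exceed the threshold and viability stays near 1, while high doses drive it toward 0.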

  13. Evaluation of the Sysmex Xe-2100 hematology analyzer in hospital use.

    Science.gov (United States)

    Nakul-Aquaronne, Danièle; Sudaka-Sammarcelli, Isabelle; Ferrero-Vacher, Corinne; Starck, Benoit; Bayle, Jacques

    2003-01-01

The Sysmex XE-2100 (Sysmex Corp., Kobe, Japan) is a latest-generation hematology analyzer. Its optical and electrical measuring technology is improved by the addition of flow cytometry, fluorescence, and differential lysis. Its analytical performance in terms of precision, reproducibility, linearity, carryover, and time stability was found to be entirely satisfactory. In addition, the results of 500 complete blood counts and differentials correlated perfectly with those obtained by the Coulter STKS (Beckman Coulter, Villepinte, France). The comparison of 500 leukocyte differential count results analyzed in parallel by optical microscopy and the XE-2100 was surprising, and favorable to the XE-2100. This analyzer provides the user with an undeniable feeling of security concerning its reliability in detecting and identifying anomalies in the automated leukocyte differential count. With a sensitivity of 96%, a negative predictive value (NPV) of 98%, and a false-negative (FN) rate of 4%, the XE-2100 has perhaps reached the technological limits for a machine performing morphological recognition of normal and pathological blood cells.

  14. Magnetic systems for wide-aperture neutron polarizers and analyzers

    Science.gov (United States)

    Gilev, A. G.; Pleshanov, N. K.; Bazarov, B. A.; Bulkin, A. P.; Schebetov, A. F.; Syromyatnikov, V. G.; Tarnavich, V. V.; Ulyanov, V. A.

    2016-10-01

Requirements on the field uniformity in neutron polarizers are analyzed in view of the fact that neutron polarizing coatings have been improved during the past decade. The design of magnetic systems that meet the new requirements is optimized by numerical simulations. Magnetic systems for wide-aperture multichannel polarizers and analyzers are presented, including (a) the polarizer to be built at channel 4-4' of the reactor PIK (Gatchina, Russia) for high-flux experiments with a 100×150 mm2 beam of polarized cold neutrons; (b) the fan analyzer covering a 150×100 mm2 window of the detector at the Magnetism Reflectometer (SNS, ORNL, USA); and (c) the polarizer and (d) the fan analyzer covering a 220×110 mm2 window of the detector at the reflectometer NERO, which is being transferred to PNPI (Russia) from HZG (Germany). Deviations of the field from the vertical did not exceed 2°. The polarizing efficiency of the analyzer at the Magnetism Reflectometer reached 99%, a record level for wide-aperture supermirror analyzers.
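For reference, the polarizing efficiency quoted above is conventionally defined from the transmitted intensities of the two neutron spin states (this is the usual convention, not necessarily the exact estimator used in the paper):

```latex
% Standard definition of neutron polarizing efficiency from the transmitted
% intensities of the two spin states.
P = \frac{I_{\uparrow} - I_{\downarrow}}{I_{\uparrow} + I_{\downarrow}},
\qquad
P = 0.99 \;\Rightarrow\;
\frac{I_{\uparrow}}{I_{\downarrow}} = \frac{1+P}{1-P} = 199 .
```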

  15. Theoretical and methodological foundations of technological management

    Directory of Open Access Journals (Sweden)

    L.O. Ligonenko

    2016-09-01

Full Text Available The aim of the article. The article critically analyzes existing work on the content, objectives and functions of technological management, identifying target, process, functional, philosophical, resource and competitive approaches. The results of the analysis. By integrating these approaches, the author forms an interpretation of the theoretical and methodological foundations of this functional type of management, relatively new for Ukraine: a system of principles and methods for making and implementing complex management decisions aimed at the efficient use of available technological resources and the technological development of the company. The purposes of technological management are also grounded, namely the technological development of enterprises, that is, the purposeful, continuous (constantly organized) process of irreversible changes in the production processes (technologies) of an enterprise's economic activity, changes which cause the corresponding development of the fixed assets (which support them), the staff (who implement and use them) and the intangible assets (which identify their creation or use), and which together make it possible to ensure the technological competitiveness of the enterprise and the development of the market of technologies in general. The objects of such management are defined: technological processes, their preconditions (technological potential) and the consequences of their implementation (the technological competitiveness of the enterprise). The key subjects of technological management, their interests and spheres of responsibility are identified. The methodological basis of technological management is considered to be H. Chesbrough's concept of open innovation, which means, on the one hand, the deliberate involvement of external ideas and technologies, and on the other, active cooperation of all stakeholders in the company in the formation of new ideas aimed at systematically improving the product

  16. Development of a Thermal/Optical Carbon Analyzer with Multi-Wavelength Capabilities

    Science.gov (United States)

    Sumlin, B.; Chow, J. C.; Watson, J. G.; Wang, X.; Gronstal, S.; Chen, L. W. A. A.; Trimble, D.

    2014-12-01

A thermal/optical carbon analyzer (DRI Model 2015) equipped with a novel seven-wavelength light source (405, 445, 532, 635, 780, 808, and 980 nm) was developed to analyze the chemical and optical properties of particles collected on quartz-fiber filters. Based on the DRI Model 2001 carbon analyzer at 633 nm, major modifications were made to the mechanical and electrical components, flow control, and the carbon detector to adopt modern technologies, increase instrument reliability, and reduce costs and maintenance. The correlation between wavelength-dependent light attenuation and organic and elemental carbon (OC and EC, respectively) content allows estimation of the amount of brown and black carbon (BrC and BC, respectively) on filters. Continuous monitoring of the light reflected from and transmitted through the filter, along with the carbon evolved from the filter when heated to different temperatures under either inert or oxidizing gas environments, provides insights into the optical properties of the carbon released from the filter; it also allows examination of the charring process, as pyrolyzed char has been one of the major uncertainties in quantifying OC and EC. The objectives of this study are to: 1) establish performance equivalency between the Model 2015 and Model 2001 DRI carbon analyzers when comparing similar laser wavelengths, to maintain consistency for long-term network sample analysis; and 2) analyze the multi-wavelength signal to quantify BrC and BC and to optimize char correction. A selection of samples, including standard chemicals, rural and urban ambient filters, and emission sources from biomass burning, diesel and gasoline engine exhaust, and resuspended dust, was measured on both the Model 2015 and Model 2001 analyzers. The instrument design, calibration, comparison with the legacy analyzer, and interpretation of the multi-wavelength measurements will be presented.
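A common way to interpret such multi-wavelength attenuation data is to assume BC absorption follows an absorption Angstrom exponent (AAE) near 1, extrapolate the BC contribution from the longest wavelength to the shortest, and attribute the excess short-wavelength attenuation to BrC. The sketch below uses the instrument's extreme wavelengths, but the AAE assumption and example intensities are illustrative, not the DRI Model 2015 algorithm:

```python
# Sketch of a common two-component apportionment of spectral absorption:
# black carbon is assumed to follow an absorption Angstrom exponent (AAE)
# of ~1; its short-wavelength contribution is extrapolated from a
# long-wavelength measurement, and the remainder is attributed to brown
# carbon. Illustrative values, not the DRI Model 2015 algorithm.
from math import log

def attenuation(i_sample, i_blank):
    """Filter light attenuation, ATN = 100 * ln(I_blank / I_sample)."""
    return 100.0 * log(i_blank / i_sample)

def brc_fraction(atn_short, atn_long, lam_short=405.0, lam_long=980.0,
                 aae_bc=1.0):
    """Fraction of short-wavelength attenuation attributable to BrC."""
    bc_at_short = atn_long * (lam_long / lam_short) ** aae_bc
    return max(0.0, (atn_short - bc_at_short) / atn_short)

if __name__ == "__main__":
    atn_405 = attenuation(i_sample=30.0, i_blank=100.0)  # strong at 405 nm
    atn_980 = attenuation(i_sample=75.0, i_blank=100.0)  # weaker at 980 nm
    print(f"ATN(405) = {atn_405:.1f}, ATN(980) = {atn_980:.1f}")
    print(f"estimated BrC share at 405 nm: "
          f"{brc_fraction(atn_405, atn_980):.0%}")
```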

  17. Clean coal technologies market potential

    Energy Technology Data Exchange (ETDEWEB)

    Drazga, B. (ed.)

    2007-01-30

Looking at the growing popularity of these technologies and of this industry, the report presents an in-depth analysis of the various technologies involved in cleaning coal and protecting the environment. It analyzes upcoming and present-day technologies such as gasification, combustion, and others, and looks at the various technological aspects, economic aspects, and the programs involved in promoting these emerging green technologies. Contents: Industry background; What is coal?; Historical background of coal; Composition of coal; Types of coal; Environmental effects of coal; Managing wastes from coal; Introduction to clean coal; What is clean coal?; Byproducts of clean coal; Uses of clean coal; Support and opposition; Price of clean coal; Examining clean coal technologies; Coal washing; Advanced pollution control systems; Advanced power generating systems; Pulverized coal combustion (PCC); Carbon capture and storage; Capture and separation of carbon dioxide; Storage and sequestration of carbon dioxide; Economics and research and development; Industry initiatives; Clean Coal Power Initiative; Clean Coal Technology Program; Coal21; Outlook; Case Studies.

  18. Analyzing complex patients' temporal histories: new frontiers in temporal data mining.

    Science.gov (United States)

    Sacchi, Lucia; Dagliati, Arianna; Bellazzi, Riccardo

    2015-01-01

In recent years, data coming from hospital information systems (HIS) and local healthcare organizations have started to be used intensively for research purposes. This rising amount of available data allows reconstructing the complete histories of patients, which have a strong temporal component. This chapter introduces the major challenges faced by temporal data mining researchers in an era when huge quantities of complex clinical temporal data are becoming available. The analysis is focused on the peculiar features of this kind of data and describes the methodological and technological aspects that allow managing such a complex framework. The chapter shows how heterogeneous data can be processed to derive a homogeneous representation. Starting from this representation, it illustrates different techniques for jointly analyzing such data. Finally, the technological strategies that allow creating a common data warehouse to gather data coming from different sources and in different formats are presented.

  19. Analyzing Distributed Generation Impact on the Reliability of Electric Distribution Network

    Directory of Open Access Journals (Sweden)

    Sanaullah Ahmad

    2016-10-01

Full Text Available With the proliferation of Distributed Generation (DG) and renewable energy technologies, the power system is becoming more complex, and with the passage of time the development of distributed generation technologies is becoming more diverse and broad. Power system reliability is one of the most vital areas in the electric power system, dealing with the continuous supply of power and customer satisfaction. The distribution network contributes up to 80% of power system reliability problems. This paper analyzes the impact of a Wind Turbine Generator (WTG) as a distributed generation source on the reliability of the distribution system. Injecting a single WTG close to the load point has a positive impact on reliability, while injecting multiple WTGs at a single place has an adverse impact on distribution system reliability. These analyses are performed on bus 2 of the Roy Billinton Test System (RBTS).
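Distribution-system reliability in studies of this kind is typically quantified with the standard customer-weighted indices SAIFI and SAIDI. A minimal sketch of how they are computed from per-load-point failure data (toy feeder values, not the RBTS bus 2 model):

```python
# Sketch of the standard distribution-reliability indices: SAIFI
# (interruptions/customer/yr) and SAIDI (hours/customer/yr), computed from
# per-load-point failure data. Toy feeder values, not the RBTS bus 2 model.

LOAD_POINTS = [
    # (customers, failure rate per yr, avg outage duration in h)
    (200, 0.25, 4.0),
    (150, 0.10, 2.0),   # e.g., a point backed by a nearby WTG/DG source
    (300, 0.40, 6.0),
]

def saifi(points):
    """Customer-weighted average interruption frequency."""
    total = sum(n for n, _, _ in points)
    return sum(n * lam for n, lam, _ in points) / total

def saidi(points):
    """Customer-weighted average interruption duration."""
    total = sum(n for n, _, _ in points)
    return sum(n * lam * r for n, lam, r in points) / total

if __name__ == "__main__":
    print(f"SAIFI = {saifi(LOAD_POINTS):.3f} interruptions/customer/yr")
    print(f"SAIDI = {saidi(LOAD_POINTS):.3f} h/customer/yr")
```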

  20. Incineration technologies

    CERN Document Server

    Buekens, Alfons

    2013-01-01

Waste incineration is the art of completely combusting waste, while maintaining or reducing emission levels below current emission standards. Where possible, objectives include the recovery of energy as well as of the combustion residues. Successful waste incineration makes it possible to achieve a deep reduction in waste volume, obtain a compact and sterile residue, and eliminate a wide array of pollutants. This book places waste incineration within the wider context of waste management, and demonstrates that, in contrast to landfills and composting, waste incineration can eliminate objectionable and hazardous properties such as flammability and toxicity, result in a significant reduction in volume, and destroy gaseous and liquid waste streams, leaving little or no residues beyond those linked to flue gas neutralization and treatment. Moreover, waste incineration sterilizes and destroys putrescible matter, and produces usable heat. Incineration Technologies first appeared as a peer-reviewed contribution ...