WorldWideScience

Sample records for biosystem analyzing technology

  1. Achievement report for fiscal 1997 on the development of technologies for utilizing biological resources such as complex biosystems. Development of complex biosystem analyzing technology; 1997 nendo fukugo seibutsukei nado seibutsu shigen riyo gijutsu kaihatsu seika hokokusho. Fukugo seibutsukei kaiseki gijutsu no kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-03-01

    The aim is to utilize the sophisticated functions of complex biosystems. In the research and development of technologies for effectively utilizing unexploited resources and substances such as seaweeds and algae, seaweeds are added to seawater and left for two weeks to form a microbial suspension; the suspension is then spread on a carrageenan culture medium, and carrageenan-decomposing microbes are obtained. In the research and development of technologies for utilizing microbe/fauna-flora complex systems, technologies for exploring and analyzing microbes are studied. For this purpose, 48 kinds of sponges and 300 kinds of bacteria symbiotic with the sponges are sampled in Malaysia; of these, 15 exhibit enzyme inhibition and Artemia salina lethality activities. In the development of technologies for analyzing the functions of microbes engaged in the production of resources and substances useful to animals and plants, 150 kinds of micro-algae are subjected to screening using protease- and chitinase-inhibiting activities as indexes, and an extract of Isochrysis galbana is found to display intense inhibitory activity. The alga is cultured in quantity, the active component is isolated from 20 g of dried alga, and its constitution is determined. (NEDO)

  2. Biosystems engineering

    OpenAIRE

    P. Ulger; E. Gonulol

    2015-01-01

    Higher agricultural education has recently become multidisciplinary as a result of the level of technology. Biosystems Engineering has become popular in developed countries, particularly after electronic and information technologies became part of agriculture alongside biology. In this study, the definition of the Biosystems Engineering discipline, its working areas, research, publication and job opportunities were discussed meticulously. Academic organization of Biosyste...

  3. An updated validation of Promega's PowerPlex 16 System: high throughput databasing under reduced PCR volume conditions on Applied Biosystems' 96-capillary 3730xl DNA Analyzer.

    Science.gov (United States)

    Spathis, Rita; Lum, J Koji

    2008-11-01

    The PowerPlex 16 System from Promega Corporation allows single-tube multiplex amplification of sixteen short tandem repeat (STR) loci, including all 13 core Combined DNA Index System (CODIS) STRs. This report presents an updated validation of the PowerPlex 16 System on Applied Biosystems' 96-capillary 3730xl DNA Analyzer. The validation protocol developed in our laboratory allows for the analysis of 1536 loci (96 x 16) in about 50 min. We have further optimized the assay by decreasing the reaction volume to one-quarter of that recommended by the manufacturer, thereby substantially reducing the total cost per sample without compromising reproducibility or specificity. This reduction in reaction volume has the ancillary benefit of dramatically increasing the sensitivity of the assay, allowing for accurate analysis of lower quantities of DNA. Due to its substantially increased throughput capability, this extended validation of the PowerPlex 16 System should be useful in reducing the backlog of unanalyzed DNA samples currently facing public DNA forensic laboratories.
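    The throughput and cost figures quoted above can be sanity-checked with simple arithmetic; the run time and volume fraction come from the abstract, while the linear cost scaling is a simplifying assumption:

```python
# Throughput and cost arithmetic for the 96-capillary validation; the run
# time and volume fraction are from the abstract, the linear reagent-cost
# scaling is a simplifying assumption.
capillaries = 96
loci_per_sample = 16            # PowerPlex 16 multiplex
run_minutes = 50                # approximate time per 96-sample run

loci_per_run = capillaries * loci_per_sample
print(loci_per_run)             # 1536 loci per run, as quoted

loci_per_hour = loci_per_run * 60 / run_minutes
print(round(loci_per_hour, 1))  # roughly 1843 loci per hour

reaction_volume_fraction = 0.25  # one quarter of the manufacturer's volume
print(reaction_volume_fraction)  # relative reagent cost per sample
```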

  4. Application of structural health monitoring technologies to bio-systems: current status and path forward

    Science.gov (United States)

    Bhalla, Suresh; Srivastava, Shashank; Suresh, Rupali; Moharana, Sumedha; Kaur, Naveet; Gupta, Ashok

    2015-03-01

    This paper presents a case for extending structural health monitoring (SHM) technologies to offer solutions for biomedical problems. SHM research has made remarkable progress during the last two to three decades, and these technologies are now being extended for possible applications in the bio-medical field. In particular, smart materials, such as piezoelectric ceramic (PZT) patches and fibre Bragg grating (FBG) sensors, offer a new set of possibilities to the bio-medical community to augment their conventional set of sensors, tools and equipment. The paper presents some of the recent extensions of SHM, such as condition monitoring of bones, monitoring of dental implants post-surgery and foot pressure measurement. The latest developments, such as non-bonded configurations of PZT patches for monitoring bones and possible applications in osteoporosis detection, are also discussed. In essence, there is a whole new gamut of possibilities for SHM technologies making their foray into the bio-medical sector.

  5. BioSystems

    Data.gov (United States)

    U.S. Department of Health & Human Services — The NCBI BioSystems Database provides integrated access to biological systems and their component genes, proteins, and small molecules, as well as literature...

  6. ANTICIPATING AND REGULATING BIOSYSTEM

    Directory of Open Access Journals (Sweden)

    Ion Iorga Siman

    2010-06-01

    Full Text Available Regulating biosystems closely related to human beings are structures that remain difficult to understand. Numerous intimate processes taking place in these systems, and even their actual constitution, are insufficiently decoded; the fact that they populated the world long before man invented the first regulator appears not to have contributed much to their knowledge. This work is intended to highlight what regulating biosystems are. It is no secret that somatic muscles perform control operations without which no act of moving would be possible. All actions are the result of dynamic controlled processes adjusted to strict control laws. Treating them seriously may lead to knowledge of processes occurring in complex systems

  7. Industrial Biosystems Engineering and Biorefinery Systems

    Institute of Scientific and Technical Information of China (English)

    Shulin Chen

    2008-01-01

    The concept of Industrial Biosystems Engineering (IBsE) was suggested as a new engineering branch to be developed to meet the needs for science, technology and professionals of the upcoming bioeconomy. With its emphasis on systems, IBsE builds upon the interfaces between systems biology, bioprocessing, and systems engineering. This paper discusses the background, the suggested definition, and the theoretical framework and methodologies of this new discipline, as well as its challenges and future development.

  8. Industrial biosystems engineering and biorefinery systems.

    Science.gov (United States)

    Chen, Shulin

    2008-06-01

    The concept of Industrial Biosystems Engineering (IBsE) was suggested as a new engineering branch to be developed to meet the needs for science, technology and professionals of the upcoming bioeconomy. With its emphasis on systems, IBsE builds upon the interfaces between systems biology, bioprocessing, and systems engineering. This paper discusses the background, the suggested definition, and the theoretical framework and methodologies of this new discipline, as well as its challenges and future development.

  9. Methods and techniques for bio-system's materials behaviour analysis

    OpenAIRE

    MITU, LEONARD GABRIEL

    2014-01-01

    In the context of the rapid development of research on biosystem structural materials, a representative direction is the analysis of the behavior of these materials. This direction of research requires the use of various means and methods for theoretical and experimental measurement and evaluation. The PhD thesis "Methods and means for analyzing the behavior of biosystems structure materials" is given precedence in this area of research, aimed at studying the behavior of poly...

  10. Biosystems and Food Engineering Research Review 21

    OpenAIRE

    Cummins, Enda; Curran, Thomas P.

    2016-01-01

    The Twenty-First Annual Research Review describes the ongoing research programme in the School of Biosystems and Food Engineering at University College Dublin, drawing on 83 researchers (11 academic staff, 1 technician, 4 postdoctoral researchers and 67 postgraduates). The research programme covers three focal areas: Food and Process Engineering; Bioresource Systems; and Bioenvironmental Engineering. Each area is divided into sub-areas as outlined in the Table of Contents, which also includes th...

  11. Achievement report for fiscal 1997 on the development of technologies for utilizing biological resources such as complex biosystems. Development of technologies for producing substitute fuel for petroleum by utilizing organisms; 1997 nendo fukugo seibutsukei nado seibutsu shigen riyo gijutsu kaihatsu seika hokokusho. Seibutsu riyo sekiyu daitai nenryo seizo gijutsu no kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-03-01

    Technologies for producing useful substances using the substance-decomposing/producing functions of complex biosystems, and methods for their handling, are developed. In the utilization of microbes in the digestive tracts of termites and longicorns, it is made clear that several kinds of termites cleave the {beta}-O-4 ether linkage. In relation to technologies for constructing wood-decomposing complex microbial systems and developing complex vector systems, a screening system is constructed in which strains that exhibit complex actions are combined. Concerning the advanced utilization of tropical oil plants, conditions are determined for inducing callus from oil palm tissues. From oil palm sarcocarp tissues, mRNA (messenger ribonucleic acid) is isolated for the construction of a cDNA (complementary deoxyribonucleic acid) library. For the purpose of isolating a powerful promoter, a partial base sequence is determined for ubiquitin, which is strongly expressed in cells. A pathogenic bacterium afflicting the oil palm is sampled for identification, and it is inferred that the bacterium is a kind of Ganoderma boninense. (NEDO)

  12. International Student Collaboration in Biosystems Engineering using Video Podcasts in Design Classes

    OpenAIRE

    Curran, Thomas P.; Gates, Richard S.; Gentile, Francesco; et al

    2014-01-01

    A working group within the Trans-Atlantic Biosystems Engineering Network (TABE.NET) analyzed the idea of developing an international collaborative design project for undergraduate students in the participating institutions. The aims of this action were to broaden the students' outlook for their remaining years within the university and to give them a more internationalized resume before they finish. Further outcomes desired by the team were second- or third-language acquisition and ...

  13. Analyzing the next-generation catalog a library technology report

    CERN Document Server

    Nagy, Andrew

    2011-01-01

    This issue of "Library Technology Reports" analyzes five different academic libraries to better understand their investments, detailing the outcomes thus far and drawing conclusions about the next-generation catalog.

  14. Abstracts of the 17. world congress of the International Commission of Agriculture and Biosystems Engineering (CIGR) : sustainable biosystems through engineering

    Energy Technology Data Exchange (ETDEWEB)

    Savoie, P.; Villeneuve, J.; Morisette, R. [Agriculture and Agri-Food Canada, Quebec City, PQ (Canada). Soils and Crops Research and Development Centre] (eds.)

    2010-07-01

    This international conference provided a forum to discuss methods to produce agricultural products more efficiently through improvements in engineering and technology. It was attended by engineers and scientists working from different perspectives on biosystems. Beyond food, farms and forests can provide fibre, bio-products and renewable energy. Seven sections of CIGR were organized in the following technical sessions: (1) land and water engineering, (2) farm buildings, equipment, structures and environment, (3) equipment engineering for plants, (4) energy in agriculture, (5) management, ergonomics and systems engineering, (6) post harvest technology and process engineering, and (7) information systems. The Canadian Society of Bioengineering (CSBE) merged its technical program within the 7 sections of CIGR. Four other groups also held their activities during the conference. The American Society of Agricultural and Biological Engineers (ASABE) organized its 9th international drainage symposium and the American Ecological Engineering Society (AEES) held its 10th annual meeting. The International Network for Information Technology in Agriculture (INFITA), and the 8th world congress on computers in agriculture also joined CIGR 2010.

  15. Study on Algae Removal by Immobilized Biosystem on Sponge

    Institute of Scientific and Technical Information of China (English)

    PEI Haiyan; HU Wenrong

    2006-01-01

    In this study, sponges were used to immobilize domesticated sludge microbes in a limited space, forming an immobilized biosystem capable of removing algae and microcystins. The removal effects of this biosystem on algae, microcystins and UV260, and the mechanism of algae removal, were studied. The results showed that active sludge from sewage treatment plants was able to remove algae from a eutrophic lake's water after 7 d of domestication. The removal efficiency for algae, organic matter and microcystins increased when the domesticated sludge was immobilized on sponges. When the hydraulic retention time (HRT) was 5 h, the removal rates of algae, microcystins and UV260 were 90%, 94.17% and 84%, respectively. The immobilized biosystem consisted mostly of bacteria, Ciliata and Sarcodina protozoans, and Rotifer metazoans. Algal decomposition by zoogloea bacteria and predation by micro-organisms were the two main modes of algal removal, which occurred in two steps: first, absorption by the zoogloea; second, decomposition by the zoogloea bacteria and predation by the micro-organisms.

  16. Nontrivial quantum and quantum-like effects in biosystems: Unsolved questions and paradoxes.

    Science.gov (United States)

    Melkikh, Alexey V; Khrennikov, Andrei

    2015-11-01

    Non-trivial quantum effects in biological systems are analyzed. Some unresolved issues and paradoxes related to quantum effects (Levinthal's paradox, the paradox of speed, and mechanisms of evolution) are addressed. It is concluded that the existence of non-trivial quantum effects is necessary for the functioning of living systems. In particular, it is demonstrated that classical mechanics cannot explain the stable functioning of the cell or of supracellular structures. The need for quantum effects also arises from combinatorial problems of evolution: their solution requires a priori information about the states of the evolving system, but within the framework of classical theory it is not possible to explain the mechanisms of its storage consistently. We also present the essentials of the so-called quantum-like paradigm: sufficiently complex bio-systems process information by violating the laws of classical probability and information theory. Therefore the mathematical apparatus of quantum theory may have fruitful applications in describing the behavior of bio-systems, from cells to brains, ecosystems and social systems. In quantum-like information biology it is not presumed that quantum information bio-processing results from quantum physical processes in living organisms. Special experiments to test the role of quantum mechanics in living systems are suggested. This requires a detailed study of living systems at the level of individual atoms and molecules. Such monitoring of living systems in vivo can allow the identification of the real potentials of interaction between biologically important molecules. PMID:26160644
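    The quantum-like claim above is usually illustrated with the interference-modified law of total probability; a minimal numerical sketch, with purely illustrative probabilities and phase:

```python
import math

# Interference-modified law of total probability, the standard quantum-like
# formalism; all numbers below are illustrative, not data from the paper.
p_a1, p_a2 = 0.5, 0.5          # probabilities of the two contexts A1, A2
p_b_a1, p_b_a2 = 0.4, 0.6      # P(B|A1), P(B|A2)

# Classical law of total probability:
classical = p_a1 * p_b_a1 + p_a2 * p_b_a2

# The quantum-like version adds an interference term with phase theta;
# theta = pi/2 recovers the classical value.
theta = math.pi / 3
interference = 2 * math.cos(theta) * math.sqrt(p_a1 * p_b_a1 * p_a2 * p_b_a2)
quantum_like = classical + interference

print(round(classical, 3))     # 0.5
print(round(quantum_like, 3))  # deviates from 0.5: classical law violated
```

A nonzero interference term is exactly the kind of deviation from classical probability that the quantum-like paradigm uses to model context-dependent behavior.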

  17. Financial options methodology for analyzing investments in new technology

    International Nuclear Information System (INIS)

    The evaluation of investments in longer-term research and development in emerging technologies must, because of the nature of such subjects, address inherent uncertainties. Most notably, future cash flow forecasts include substantial uncertainties. Conventional present value methodology, when applied to emerging technologies, severely penalizes cash flow forecasts, and strategic investment opportunities are at risk of being neglected. Use of options valuation methodology adapted from the financial arena has been introduced as having applicability in such technology evaluations. Indeed, characteristics of superconducting magnetic energy storage technology suggest that it is a candidate for the use of options methodology when investment decisions are being contemplated.
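    A minimal sketch of the options approach described here, reading the Black-Scholes call formula as a real-option valuation of an R&D investment; every parameter value below is hypothetical:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def real_option_value(S, K, r, sigma, T):
    """Black-Scholes call value read as a real option: S = present value of
    expected project cash flows, K = investment cost, sigma = cash-flow
    volatility, T = years until the investment decision must be made."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

# Hypothetical emerging-technology project: static NPV is negative (S < K),
# yet high uncertainty plus a long decision horizon give the option value.
S, K, r, sigma, T = 80.0, 100.0, 0.05, 0.6, 5.0
value = real_option_value(S, K, r, sigma, T)
print(round(S - K, 2))   # conventional NPV: -20.0
print(round(value, 2))   # positive option value
```

This is the asymmetry the abstract points to: conventional present value penalizes uncertain cash flows, whereas the option view treats volatility as upside when the investment can be deferred.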

  18. Financial options methodology for analyzing investments in new technology

    Science.gov (United States)

    Wenning, B. D.

    1995-01-01

    The evaluation of investments in longer-term research and development in emerging technologies must, because of the nature of such subjects, address inherent uncertainties. Most notably, future cash flow forecasts include substantial uncertainties. Conventional present value methodology, when applied to emerging technologies, severely penalizes cash flow forecasts, and strategic investment opportunities are at risk of being neglected. Use of options valuation methodology adapted from the financial arena has been introduced as having applicability in such technology evaluations. Indeed, characteristics of superconducting magnetic energy storage technology suggest that it is a candidate for the use of options methodology when investment decisions are being contemplated.

  19. Financial options methodology for analyzing investments in new technology

    Energy Technology Data Exchange (ETDEWEB)

    Wenning, B.D. [Texas Utilities Services, Inc., Dallas, TX (United States)

    1994-12-31

    The evaluation of investments in longer-term research and development in emerging technologies must, because of the nature of such subjects, address inherent uncertainties. Most notably, future cash flow forecasts include substantial uncertainties. Conventional present value methodology, when applied to emerging technologies, severely penalizes cash flow forecasts, and strategic investment opportunities are at risk of being neglected. Use of options valuation methodology adapted from the financial arena has been introduced as having applicability in such technology evaluations. Indeed, characteristics of superconducting magnetic energy storage technology suggest that it is a candidate for the use of options methodology when investment decisions are being contemplated.

  20. Nano-Biotechnology: Structure and Dynamics of Nanoscale Biosystems

    CERN Document Server

    Manjasetty, Babu A; Ramaswamy, Y S

    2010-01-01

    Nanoscale biosystems are widely used in numerous medical applications. Approaches to the structure and function of the nanomachines available in the cell (natural nanomachines) are discussed. Molecular simulation studies have been used extensively to study the dynamics of many nanomachines, including the ribosome. Carbon nanotubes (CNTs) serve as prototypes for biological channels such as aquaporins (AQPs). Recently, extensive investigations have been performed on the transport of biological nanosystems through CNTs. The results are used as a guide in building nanomachinery such as a nanosyringe for needle-free drug delivery.

  1. Polarized 3He Gas Circulating Technologies for Neutron Analyzers

    Energy Technology Data Exchange (ETDEWEB)

    Watt, David; Hersman, Bill

    2014-12-10

    We describe the development of an integrated system for quasi-continuous operation of a large volume neutron analyzer. The system consists of a non-magnetic diaphragm compressor, a prototype large volume helium polarizer, a surrogate neutron analyzer, a non-depolarizing gas storage reservoir, a non-ferrous valve manifold for handling gas distribution, a custom rubidium-vapor gas return purifier, and wire-wound transfer lines, all of which are immersed in a two-meter external magnetic field. Over the Phase II period we focused on three major tasks required for the successful deployment of these types of systems: 1) design and implementation of gas handling hardware, 2) automation for long-term operation, and 3) improvements in polarizer performance, specifically fabrication of aluminosilicate optical pumping cells. In this report we describe the design, implementation, and testing of the gas handling hardware. We describe improved polarizer performance resulting from improved cell materials and fabrication methods. These improvements yielded valved 8.5 liter cells with relaxation times greater than 12 hours. Pumping this cell with 1500 W of laser power at 1.25 nm linewidth yielded peak polarizations of 60%, measured both inside and outside the polarizer. Fully narrowing this laser to 0.25 nm, demonstrated separately on one stack of the four, would have allowed 70% polarization with this cell. We demonstrated the removal of 5 liters of polarized helium from the polarizer with no measured loss of polarization. We circulated the gas through a titanium-clad compressor with polarization loss below 3% per pass. We also prepared for the next phase of development by refining the design of the polarizer so that it can be engineer-certified for pressurized operation. The performance of our system far exceeds comparable efforts elsewhere.
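    The quoted figures (T1 > 12 h, under 3% loss per compressor pass) can be combined in a quick polarization-bookkeeping sketch; the pure exponential decay model here is an assumption for illustration, not the authors' analysis:

```python
import math

# Polarization bookkeeping for the quoted figures; the pure exponential
# T1 model is an illustrative assumption, not the authors' analysis.
T1_hours = 12.0      # relaxation time reported for the valved 8.5 L cells
P0 = 0.60            # peak polarization at 1.25 nm laser linewidth

def polarization_after(hours):
    """Polarization after free T1 decay, starting from P0."""
    return P0 * math.exp(-hours / T1_hours)

# Circulation through the compressor: below 3% relative loss per pass.
loss_per_pass = 0.03
P = P0
for _ in range(5):
    P *= 1 - loss_per_pass

print(round(polarization_after(1.0), 3))  # after one hour of T1 decay
print(round(P, 3))                        # after five compressor passes
```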

  2. Biosystems analysis and engineering of microbial consortia for industrial biotechnology

    Energy Technology Data Exchange (ETDEWEB)

    Sabra, Wael; Dietz, David; Tjahjasari, Donna; Zeng, An-Ping [Institute of Bioprocess and Biosystems Engineering, Hamburg University of Technology, Hamburg (Germany)

    2010-10-15

    The development of industrial biotechnology for an economical and ecological conversion of renewable materials into chemicals and fuels requires new strategies and concepts for bioprocessing. Biorefinery has been proposed as one of the key concepts with the aim of completely utilizing the substrate(s) and producing multiple products in one process or at one production site. In this article, we argue that microbial consortia can play an essential role to this end. To illustrate this, we first briefly describe some examples of existing industrial bioprocesses involving microbial consortia. New bioprocesses under development which make use of the advantages of microbial consortia are then introduced. Finally, we address some of the key issues and challenges for the analysis and engineering of bioprocesses involving microbial consortia from a perspective of biosystems engineering. (Copyright 2010 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  3. Fiscal 1998 industrial science and technology R and D project. Research report on R and D of genome informatics technology (Development of stable oil supply measures using complex biosystem); 1998 nendo genome informatics gijutsu kenkyu kaihatsu seika hokokusho. Fukugo seibutsukei riyo sekiyu antei kyokyu taisaku kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    This report describes the fiscal 1998 results on the development of genome informatics technology. As a comparative gene analysis technique, the combination of electrophoresis and PCR was used. To improve the throughput and reproducibility of the technique, module-shuffling primers were used, and a multi(96)-arrayed capillary fragment analyzer was devised. A system for rapidly detecting SNPs was also developed successfully. As DNA sequence analysis technology using triple-stranded DNA formation, studies were made on the construction of long cDNA libraries, the selective subtraction of specific sequences from libraries, and the basic technology of homologous cloning. Study was also made on each reaction step of the IGCR technique for fast analysis, and on the specifications of a fluorescence transfer monitor. As a modeling technique for genetic sequence information, a simulation model was developed for gene expression regulatory networks during muscle differentiation and for the feedback regulation of period genes. Support systems such as transcription factor prediction and gene regulatory network inference were developed from existing data. (NEDO)

  4. Analyzing of MOS and Codec Selection for Voice over IP Technology

    OpenAIRE

    Mohd Nazri Ismail

    2009-01-01

    In this research, we propose an architectural solution for implementing the voice over IP (VoIP) service in a campus environment network. Voice over IP (VoIP) technology has become a topic of discussion in recent times. Today, the deployment of this technology in an organization can truly provide a great financial benefit over traditional telephony. Therefore, this study analyzes VoIP codec selection and investigates the Mean Opinion Score (MOS) performance areas involved with the quality of s...
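    For context on the MOS metric investigated here, the ITU-T G.107 E-model maps a transmission rating factor R to an estimated MOS; the per-codec R values below are illustrative defaults, not measurements from this study:

```python
def r_to_mos(R):
    """ITU-T G.107 E-model mapping from rating factor R to estimated MOS."""
    if R <= 0:
        return 1.0
    if R >= 100:
        return 4.5
    return 1 + 0.035 * R + R * (R - 60) * (100 - R) * 7e-6

# Illustrative default R budgets by codec (assumed values, not this study's
# measurements): higher-compression codecs start from a lower R.
for codec, R in [("G.711", 93), ("G.729", 82), ("G.723.1", 78)]:
    print(codec, round(r_to_mos(R), 2))
```

The mapping makes the codec trade-off concrete: bandwidth-saving codecs pay an intrinsic MOS penalty before any network impairments are added.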

  5. THE DELPHI METHOD AS A TOOL FOR ANALYZING TECHNOLOGY EVOLUTION: CASE OPEN SOURCE THIN COMPUTING

    OpenAIRE

    MATTI KARVONEN; VILLE RYYNÄNEN; TUOMO KÄSSI

    2009-01-01

    The main goal of this paper is to show how the Delphi method works as a management tool for analyzing technology evolution. The paper also provides insights on how thin computing and open source can affect future IT infrastructure development. The primary data were collected in a three-round Delphi study consisting of the following interest groups: (1) developers of open source thin computing, (2) industrial experts, and (3) representatives of academic institutes. The Delphi method represents...
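    The multi-round Delphi mechanics described here can be sketched in a few lines: experts revise their estimates after seeing the group's feedback, and convergence is tracked by the shrinking interquartile range (the panel data below are invented for illustration):

```python
import statistics

# Delphi-style aggregation over three rounds (hypothetical panel estimates,
# e.g. "years until mainstream adoption of thin computing").
def iqr(values):
    """Interquartile range using the exclusive quantile method."""
    q = statistics.quantiles(values, n=4)
    return q[2] - q[0]

round1 = [2, 5, 7, 8, 10, 12, 15]
round2 = [5, 6, 7, 8, 8, 9, 11]
round3 = [6, 7, 7, 8, 8, 8, 9]

for i, r in enumerate([round1, round2, round3], start=1):
    # The median is the group estimate; a shrinking IQR signals consensus.
    print(f"round {i}: median={statistics.median(r)}, IQR={iqr(r)}")
```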

  6. A Hybrid Method of Analyzing Patents for Sustainable Technology Management in Humanoid Robot Industry

    Directory of Open Access Journals (Sweden)

    Jongchan Kim

    2016-05-01

    Full Text Available A humanoid, which refers to a robot that resembles a human body, imitates a human's intelligence, behavior, sense, and interaction in order to provide various types of services to human beings. Humanoids have been studied and developed constantly in order to improve their performance. Humanoids were previously developed for simple repetitive or hard work that required significant human power. However, intelligent service robots have been developed actively these days to provide necessary information and enjoyment; these include robots manufactured for home, entertainment, and personal use. It has become generally known that artificial intelligence humanoid technology will significantly benefit civilization. On the other hand, successful research and development (R&D) on humanoids is possible only if they are developed in a proper direction in accordance with changes in markets and society. Therefore, it is necessary to analyze changes in technology markets and society in order to develop sustainable Management of Technology (MOT) strategies. In this study, patent data related to humanoids are analyzed by various data mining techniques, including topic modeling, cross-impact analysis, association rule mining, and social network analysis, to suggest sustainable strategies and methodologies for MOT.
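    Association rule mining, one of the techniques listed above, reduces to computing support and confidence over itemsets; a self-contained sketch over invented patent keyword sets (the keywords and rules are hypothetical, not from the study):

```python
# Tiny association-rule sketch over hypothetical patent keyword sets;
# the paper combines this with topic modeling and network analysis.
patents = [
    {"actuator", "vision", "balance"},
    {"vision", "speech", "interaction"},
    {"actuator", "balance", "gait"},
    {"vision", "balance", "gait"},
    {"speech", "interaction"},
]

def support(itemset):
    """Fraction of patents containing every keyword in the itemset."""
    return sum(itemset <= p for p in patents) / len(patents)

def confidence(antecedent, consequent):
    """Confidence of the rule antecedent -> consequent."""
    return support(antecedent | consequent) / support(antecedent)

print(support({"balance", "gait"}))       # 0.4
print(confidence({"gait"}, {"balance"}))  # 1.0: every gait patent has balance
```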

  7. Present and future of the numerical methods in buildings and infrastructures areas of biosystems engineering

    Directory of Open Access Journals (Sweden)

    Francisco Ayuga

    2015-04-01

    Full Text Available Biosystems engineering is a discipline resulting from the evolution of traditional agricultural engineering to include new engineering challenges related to biological systems, from the cell to the environment. Modern buildings and infrastructures are needed to satisfy crop and animal production demands. In this paper, a review of the status of numerical methods applied to solve engineering problems in the field of buildings and infrastructures in biosystems engineering is presented. The history and basic background of the finite element method are presented; this was the first numerical method implemented and is also the most developed one. The history and background of two more recent methods with practical applications, computational fluid dynamics and the discrete element method, are also presented. In addition, a review of the scientific and professional applications in the field of buildings and infrastructures for biosystems engineering is given. Today we can simulate engineering problems with solids, with fluids and with particles, and reach practical solutions faster and more cheaply than in the past. The paper encourages young engineers and researchers to advance these tools and their engineering applications. The capacities of all numerical methods in their present state of development go beyond today's practical applications; there is a broad field to work in.
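    As a concrete taste of the first of these methods, a one-dimensional finite-element bar model can be assembled and solved in a few lines; the parameters are illustrative, and for a linear bar the FE tip displacement matches the analytical value exactly:

```python
# Minimal 1D finite-element sketch (axial bar, linear elements): the
# simplest instance of the FEM discussed above. Parameter values are
# illustrative, not taken from the paper.

def solve(M, b):
    """Naive Gaussian elimination; adequate for this tiny demo system."""
    n = len(b)
    for i in range(n):
        for j in range(i + 1, n):
            factor = M[j][i] / M[i][i]
            for c in range(i, n):
                M[j][c] -= factor * M[i][c]
            b[j] -= factor * b[i]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (b[i] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

E, A, L, F, n_el = 210e9, 1e-4, 2.0, 1000.0, 4  # steel bar, axial end load
k = E * A / (L / n_el)                          # stiffness of one element
K = [[0.0] * n_el for _ in range(n_el)]         # free DOFs: nodes 1..n_el
for e in range(n_el):                           # assemble element matrices
    for i, j, v in [(e - 1, e - 1, k), (e - 1, e, -k),
                    (e, e - 1, -k), (e, e, k)]:
        if i >= 0 and j >= 0:                   # node 0 is clamped
            K[i][j] += v
f = [0.0] * n_el
f[-1] = F                                       # point load at the free tip
u = solve(K, f)
exact = F * L / (E * A)                         # analytical tip displacement
print(u[-1], exact)                             # linear FEM is exact here
```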

  8. Modeling Dendrimers Charge Interaction in Solution: Relevance in Biosystems

    Directory of Open Access Journals (Sweden)

    Domenico Lombardo

    2014-01-01

    Full Text Available Dendrimers are highly branched macromolecules obtained by stepwise, controlled reaction sequences. The ability to be designed for specific applications makes dendrimers unprecedented components for controlling the structural organization of matter during the bottom-up synthesis of functional nanostructures. For their applications in the field of biotechnology, the determination of dendrimer structural properties as well as the investigation of their specific interactions with guest components are needed. We show how analysis of the scattering structure factor S(q, in the framework of current models for charged systems in solution, allows important information on the inter-dendrimer electrostatic interaction potential to be obtained. The presented results outline the important role of the dendrimer charge and the solvent conditions in regulating, through modulation of the electrostatic interaction potential, a great part of the main structural properties. This charge interaction has been indicated by many studies as a crucial factor for a wide range of structural processes involving their biomedical application. Due to their easily controllable properties, dendrimers can be considered at the crossroads between traditional colloids, associating polymers, and biological systems, and thus represent an interesting new technological approach and a suitable model system of molecular organization in biochemistry and related fields.
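    The electrostatic interaction potential discussed here is commonly modeled for charged colloids as a screened Coulomb (Yukawa) potential; a sketch with assumed, illustrative parameters (not fitted dendrimer data):

```python
import math

# Screened Coulomb (Yukawa) pair potential, the standard model behind
# charged-colloid structure factors. All parameter values are illustrative
# assumptions, not fitted dendrimer data from the paper.
def yukawa(r, Z=20, lam_B=0.71, kappa=1.0, sigma=5.0):
    """U(r)/kT for two charged spheres: effective charge Z, Bjerrum length
    lam_B (nm, water at 25 C), inverse Debye screening length kappa (1/nm),
    diameter sigma (nm), center-to-center distance r (nm)."""
    contact = math.exp(kappa * sigma / 2) / (1 + kappa * sigma / 2)
    return Z**2 * lam_B * contact**2 * math.exp(-kappa * r) / r

# Adding salt raises kappa and screens the inter-dendrimer repulsion:
print(round(yukawa(7.0, kappa=0.5), 3))   # weakly screened
print(round(yukawa(7.0, kappa=1.5), 3))   # strongly screened: much smaller
```

Modulating this repulsion (via charge or solvent ionic strength) is exactly the mechanism the abstract credits for controlling the structural organization seen in S(q).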

  9. Production of biofuels and biochemicals by in vitro synthetic biosystems: Opportunities and challenges.

    Science.gov (United States)

    Zhang, Yi-Heng Percival

    2015-11-15

    The largest obstacle to the cost-competitive production of low-value and high-impact biofuels and biochemicals (called biocommodities) is the high cost of production catalyzed by microbes, due to their inherent weaknesses such as low product yield, slow reaction rate, high separation cost, and intolerance to toxic products. This predominant whole-cell platform suffers from a mismatch between the primary goal of living microbes (cell proliferation) and the desired biomanufacturing goal (the desired products, which most times are not cell mass). In vitro synthetic biosystems consist of numerous enzymes as building bricks, enzyme complexes as building modules, and/or (biomimetic) coenzymes, which are assembled into synthetic enzymatic pathways for implementing complicated bioreactions. They emerge as an alternative solution for accomplishing a desired biotransformation without concerns about cell proliferation, complicated cellular regulation, or side-product formation. In addition to the most important advantage, high product yield, in vitro synthetic biosystems feature several other biomanufacturing advantages, such as fast reaction rate, easy product separation, open process control, broad reaction conditions, and tolerance to toxic substrates or products. In this perspective review, the general design rules of in vitro synthetic pathways are presented with eight supporting examples: hydrogen, n-butanol, isobutanol, electricity, starch, lactate, 1,3-propanediol, and poly-3-hydroxybutyrate. Also, a detailed economic analysis of enzymatic hydrogen production from carbohydrates is presented to illustrate some advantages of this system and the remaining challenges.
Great market potentials will motivate worldwide efforts from multiple disciplines (i.e., chemistry, biology and engineering) to address the remaining obstacles pertaining to the cost and stability of enzymes and coenzymes, standardized building parts and modules, biomimetic coenzymes, biosystem optimization, and scale-up.

  10. Blow and go: the Breath-Analyzed Ignition Interlock Device as a technological response to DWI.

    Science.gov (United States)

    Fulkerson, Andrew

    2003-01-01

    Driving while intoxicated (DWI) rates have declined substantially in the last 20 years, a result of shifting public opinion combined with increased law enforcement efforts. A recent tool has been the Breath-Analyzed Ignition Interlock Device, a technology designed to prevent persons with excessive blood alcohol levels from operating the interlocked vehicle. This 3-year recidivism study of the ignition interlock revealed a 17.5% recidivism rate for the interlock group compared to 25.3% for the non-interlock group, a 31% decrease. Multiple offenders and younger (under 30) offenders had significantly lower rates of subsequent arrests. The multiple offenders in the comparison group were more than twice as likely as those in the interlock group to have a subsequent conviction within 3 years; the difference was nearly the same for the under-30 age group, while there was almost no difference for first offenders. Accordingly, the ignition interlock appears to significantly reduce recidivism for repeat and younger DWI offenders but offers almost no improvement for first offenders. One driver of 315 (0.32%) was charged with DWI with an interlock in place; this offender had a child provide the breath sample while she drove the vehicle. PMID:12731690
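The reported 31% figure is the relative (not absolute) reduction in recidivism, which can be checked directly from the two rates quoted above (a quick arithmetic sketch, nothing beyond the abstract's numbers):

```python
# Relative reduction in recidivism: (control - interlock) / control
interlock_rate = 0.175   # 17.5% recidivism with the interlock installed
control_rate = 0.253     # 25.3% recidivism in the non-interlock group

relative_reduction = (control_rate - interlock_rate) / control_rate
print(f"{relative_reduction:.0%}")  # prints 31%
```

The absolute difference is only 7.8 percentage points; quoting the relative reduction is what yields the headline "31% decrease."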

  11. Analyzing interdependencies between policy mixes and technological innovation systems : The case of offshore wind in Germany

    NARCIS (Netherlands)

    Reichardt, Kristin; Negro, Simona O.; Rogge, Karoline S.; Hekkert, Marko P.

    2016-01-01

    One key approach for studying emerging technologies in the field of sustainability transitions is that of technological innovation systems (TIS). While most TIS studies aim at deriving policy recommendations - typically by identifying system barriers - the actual role of these proposed policies in t

  12. Recurrent Routines: Analyzing and Supporting Orchestration in Technology-Enhanced Primary Classrooms

    Science.gov (United States)

    Prieto, Luis P.; Villagra-Sobrino, Sara; Jorrin-Abellan, Ivan M.; Martinez-Mones, Alejandra; Dimitriadis, Yannis

    2011-01-01

    The increasing presence of multiple Information and Communication Technologies (ICT) in the classroom does not guarantee an improvement of the learning experiences of students, unless it is also accompanied by pedagogically effective orchestration of those technologies. In order to help teachers in this endeavour, it can be useful to understand…

  13. Technological change and salary variation in Mexican regions: Analyzing panel data for the service sector

    Directory of Open Access Journals (Sweden)

    Mario Camberos C.

    2013-07-01

    In this paper, the biased technological change hypothesis is applied to Mexican service-sector workers across several Mexican regions. Economic Census microdata for 1998, 2003 and 2008 are used. The hypothesis is tested against technological gaps, using different indices and checking the statistical consistency of the results through panel analysis. The largest wage differences were found between the Capital region and the South: about five hundred percent in 1998, falling to about two hundred percent by 2008. This result corresponds with a diminishing technological gap, perhaps caused by the impact of the economic crisis.

  14. A methodology for capturing and analyzing data from technology base seminar wargames.

    OpenAIRE

    Miles, Jeffrey T.

    1991-01-01

    Approved for public release; distribution is unlimited This thesis provides a structured methodology for obtaining, evaluating, and portraying to a decision maker, the opinions of players of Technology Base Seminar Wargames (TBSW). The thesis then demonstrates the methodology by applying the events of the Fire Support Technology Base Seminar Wargame held in May 1991. Specifically, the evaluation team developed six surveys, each survey capturing opinions using the categorical...

  15. Technological change and salary variation in Mexican regions: Analyzing panel data for the service sector

    OpenAIRE

    Mario Camberos C.; Luis Huesca Reynoso; David Castro Lugo

    2013-01-01

    In this paper, the biased technological change hypothesis is applied to Mexican service-sector workers across several Mexican regions. Economic Census microdata for 1998, 2003 and 2008 are used. The hypothesis is tested against technological gaps, using different indices and checking the statistical consistency of the results through panel analysis. The largest wage differences were found between the Capital region and the South: about five hundred percent in 1998, but lower in ...

  16. Geoinformation modeling system for analysis of atmosphere pollution impact on vegetable biosystems using space images

    Science.gov (United States)

    Polichtchouk, Yuri; Ryukhko, Viatcheslav; Tokareva, Olga; Alexeeva, Mary

    2002-02-01

    The structure of a geoinformation modeling system for assessing the environmental impact of atmospheric pollution on the forest-swamp ecosystems of West Siberia is considered. A complex approach to the assessment of man-caused impact, based on the combination of sanitary-hygienic and landscape-geochemical approaches, is reported. Methodical problems in analyzing the impact of atmospheric pollution on vegetable biosystems using geoinformation systems and remote sensing data are developed. The landscape structure of oil production territories in the southern part of West Siberia is determined by processing space images from the spaceborne Resource-O satellite. Particularities of modeling the atmospheric pollution zones caused by gas burning in torches on the territories of oil fields are considered. For instance, pollution zones were revealed by modeling contaminant dispersal in the atmosphere with a standard model. Polluted landscape areas are calculated as a function of oil production volume, and it is shown that the calculated data are well approximated by polynomial models.
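The final step described above, approximating polluted-landscape area as a function of oil production volume with a polynomial model, amounts to a standard least-squares fit. A minimal sketch (the production volumes and areas below are purely illustrative; the abstract does not report the actual values or the polynomial degree):

```python
import numpy as np

# Hypothetical oil production volumes (thousand tonnes/year) and the
# corresponding modeled polluted-landscape areas (km^2); illustrative only.
production = np.array([100.0, 200.0, 400.0, 800.0, 1600.0])
polluted_area = np.array([2.1, 4.6, 10.2, 22.5, 49.8])

# Fit a second-degree polynomial, one plausible form of the reported
# "polynomial models" relating area to production volume.
coeffs = np.polyfit(production, polluted_area, deg=2)
model = np.poly1d(coeffs)

# Predicted polluted area at an intermediate production volume.
print(model(1000.0))
```

With real dispersal-model output in place of the toy arrays, the residuals of such a fit would quantify how well the polynomial approximates the calculated data.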

  17. The Rücker-Markov invariants of complex Bio-Systems: applications in Parasitology and Neuroinformatics.

    Science.gov (United States)

    González-Díaz, Humberto; Riera-Fernández, Pablo; Pazos, Alejandro; Munteanu, Cristian R

    2013-03-01

    Rücker's walk count (WC) indices are well-known topological indices (TIs) used in Chemoinformatics to quantify the molecular structure of drugs represented by a graph in Quantitative structure-activity/property relationship (QSAR/QSPR) studies. In this work, we introduce for the first time the higher-order (kth order) analogues (WCk) of these indices using Markov chains. In addition, we report new QSPR models for large complex networks of different Bio-Systems useful in Parasitology and Neuroinformatics. The new type of QSPR models can be used for model checking to calculate numerical scores S(Lij) for links Lij (checking or re-evaluation of network connectivity) in large networks of all these fields. The method may be summarized as follows: (i) first, the WCk(j) values are calculated for all jth nodes in a complex network already created; (ii) A linear discriminant analysis (LDA) is used to seek a linear equation that discriminates connected or linked (Lij=1) pairs of nodes experimentally confirmed from non-linked ones (Lij=0); (iii) The new model is validated with external series of pairs of nodes; (iv) The equation obtained is used to re-evaluate the connectivity quality of the network, connecting/disconnecting nodes based on the quality scores calculated with the new connectivity function. The linear QSPR models obtained yielded the following results in terms of overall test accuracy for re-construction of complex networks of different Bio-Systems: parasite-host networks (93.14%), NW Spain fasciolosis spreading networks (71.42/70.18%) and CoCoMac Brain Cortex co-activation network (86.40%). Thus, this work can contribute to the computational re-evaluation or model checking of connectivity (collation) in complex systems of any science field.
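The four-step procedure above can be sketched with standard tools: walk counts WCk(j) obtained from powers of the adjacency matrix, and a linear discriminant trained to score node pairs as linked or non-linked. A minimal illustration on a toy graph, assuming scikit-learn's LDA as a stand-in for the authors' implementation (the pair-feature construction here, summing the two nodes' walk-count vectors, is an assumption for symmetry, not the paper's exact formulation):

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Toy undirected network as an adjacency matrix.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

def walk_counts(A, k_max=3):
    """(i) kth-order walk counts WCk(j) for each node j: row sums of A^k."""
    feats, Ak = [], np.eye(len(A))
    for _ in range(k_max):
        Ak = Ak @ A
        feats.append(Ak.sum(axis=1))
    return np.array(feats).T  # shape (n_nodes, k_max)

WC = walk_counts(A)

# (ii) Describe each node pair (i, j) by a symmetric combination of the
# nodes' walk-count features and train an LDA to separate linked pairs
# (Lij = 1) from non-linked pairs (Lij = 0).
pairs = [(i, j) for i in range(len(A)) for j in range(i + 1, len(A))]
X = np.array([WC[i] + WC[j] for i, j in pairs])
y = np.array([A[i, j] for i, j in pairs])

lda = LinearDiscriminantAnalysis().fit(X, y)

# (iii)/(iv) Score every pair; the scores S(Lij) can then be used to
# re-evaluate network connectivity against held-out pairs.
scores = lda.decision_function(X)
print(dict(zip(pairs, np.round(scores, 2))))
```

In the paper's setting, step (iii) validates the discriminant on an external series of pairs before step (iv) uses the scores to connect or disconnect nodes; the toy graph here is far too small for such a split.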

  18. Innovative CO2 Analyzer Technology for the Eddy Covariance Flux Monitor Project

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to build and evaluate NDIR Analyzers that can be used to observe Eddy Covariance Flux and Absolute Dry Mole Fraction of CO2 from stationary and airborne...

  19. Innovative CO2 Analyzer Technology for the Eddy Covariance Flux Monitor Project

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to build and evaluate NDIR Analyzers that can observe eddy covariance flux of CO2 from unmanned airborne platforms. For both phases, a total of four...

  20. Multimodal methodologies for analyzing preschool children’s engagements with digital technologies

    DEFF Research Database (Denmark)

    Chimirri, Niklas Alexander

    and between reality and virtuality. In its stead, it suggests understanding technologies as mediating devices or artifacts, which offer the possibility to communicate with others for the sake of negotiating and actualizing imaginations together, for the sake of transforming pedagogical practice...... and problems of children can be accounted for throughout the technologically mediated collective development of a pedagogical practice – how it can be avoided that the adults’ perspectives take the sole lead throughout this collective engagement. What lies at the heart of these challenges is the question...... of age-transgressing collective transformations of technologically mediated pedagogical practice with empirical experiences made while participating in a preschool over the course of four months....

  1. Novel Indirect Calorimetry Technology to Analyze Metabolism in Individual Neonatal Rodent Pups

    NARCIS (Netherlands)

    Dominguez, Jesus F.; Guo, Lixin; Carrasco Molnar, Marco A.; Ballester Escobedo, Antonio; Dunphy, Taylor; Lund, Trent D.; Turman, Jack E.

    2009-01-01

    Background: The ability to characterize the development of metabolic function in neonatal rodents has been limited due to technological constraints. Low respiratory volumes and flows at rest pose unique problems, making it difficult to reliably measure O2 consumption, CO2 production, respiratory

  2. Application of Printed Circuit Board Technology to FT-ICR MS Analyzer Cell Construction and Prototyping

    Science.gov (United States)

    Leach, Franklin E.; Norheim, Randolph; Anderson, Gordon; Pasa-Tolic, Ljiljana

    2014-12-01

    Although Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR MS) remains the mass spectrometry platform that provides the highest levels of performance for mass accuracy and resolving power, there is room for improvement in analyzer cell design as the ideal quadrupolar trapping potential has yet to be generated for a broadband MS experiment. To this end, analyzer cell designs have improved since the field's inception, yet few research groups participate in this area because of the high cost of instrumentation efforts. As a step towards reducing this barrier to participation and allowing for more designs to be physically tested, we introduce a method of FT-ICR analyzer cell prototyping utilizing printed circuit boards at modest vacuum conditions. This method allows for inexpensive devices to be readily fabricated and tested over short intervals and should open the field to laboratories lacking or unable to access high performance machine shop facilities because of the required financial investment.

  3. Fusion of Nuclear and Emerging Technology

    International Nuclear Information System (INIS)

    The presentation discussed the following subjects: emerging technology; nuclear technology; fusion of emerging and nuclear technology; the progressive nature of knowledge; optically stimulated luminescence - the application of luminescence technology to sediments; and biosystemics technology - the convergence of nanotechnology, ecological science, biotechnology, cognitive science and IT - with its prospective impact on materials science, the management of public systems for bio-health, eco- and food-system integrity, and disease mitigation.

  4. Analyzing the Impact of Software-Defined Video Networking to Broadcast Technology Business

    OpenAIRE

    Niiranen, Heikki

    2015-01-01

    Broadcast video systems have traditionally been the domain of purpose-built video interfaces and routing products. Advances in technology have reduced the price of Ethernet network bandwidth and processing to levels where uncompressed video can feasibly be transported within packet-switched IP networks. This trend is accompanied by the paradigm of software-defined networking, which allows easy control of network parameters and features from high-level applications. This has made it poss...

  5. Analyzing the Direct Methanol Fuel Cell technology in portable applications by a historical and bibliometric analysis

    OpenAIRE

    Suominen, A.; Tuominen, A. (Aulis)

    2010-01-01

    The development of direct methanol fuel cell (DMFC) technology is studied in this paper through an analysis of research, patenting and commercial adoption. The analysis uses a dataset gathered from both publication and patent databases, complemented with a review of commercial efforts on portable fuel cells. Bibliometric methods are used to identify research networks and research trends, and the Fisher-Pry growth model is used to estimate future research activity. The patent landscap...

  6. Satellite Technology as a Source of Integration. A Comparative Analyze: Europe MERCOSUR

    Science.gov (United States)

    Castillo Argañarás, Luis F.

    2002-01-01

    Satellite technology as a source of integration has driven several changes in the field of international law, creating the need to build a new framework for integration and cooperation. A comparative analysis between the development of European integration for space activities and the first steps towards the same target by MERCOSUR will show the positive and negative side effects of this development up to the present time.

  7. Analyzing Public Library Service Interactions to Improve Public Library Customer Service and Technology Systems

    Directory of Open Access Journals (Sweden)

    Holly Arnason

    2012-03-01

    Objective – To explore the types and nature of assistance library customers are asking library staff for in a large Canadian urban public library system. Methods – A qualitative study employing transaction logging combined with embedded observation occurred during three-day sample periods at a selection of nine branches over the course of eight months. Staff recorded questions and interactions at service desks (in person, by phone, and electronically), as well as questions received during scheduled and non-scheduled provision of mobile reference service. In addition to recording interaction details and interaction medium, staff members were also asked to indicate briefly the process or resources used to resolve the interaction. Survey data were entered and coded through thematic analysis. Results – The survey collected 6,099 interactions between staff and library customers. Of those, 1,920 (31.48%) were coded as pertaining to technology help. Further analysis revealed significant library customer need for help with Internet workstations and printing. Conclusions – Technology help is a core customer need for Edmonton Public Library, with requests varying in complexity and sometimes resolved with instruction. The library’s Internet workstations and printing system presented critical usability challenges that drove technology help requests.

  8. Analyzing the Life Cycle Energy Savings of DOE Supported Buildings Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Cort, Katherine A.; Hostick, Donna J.; Dirks, James A.; Elliott, Douglas B.

    2009-08-31

    This report examines the factors that would potentially help determine an appropriate analytical timeframe for measuring the U.S. Department of Energy's Building Technology (BT) benefits and presents a summary-level analysis of the life cycle savings for BT’s Commercial Buildings Integration (CBI) R&D program. The energy savings for three hypothetical building designs are projected over a 100-year period using Building Energy Analysis and Modeling System (BEAMS) to illustrate the resulting energy and carbon savings associated with the hypothetical aging buildings. The report identifies the tasks required to develop a long-term analytical and modeling framework, and discusses the potential analytical gains and losses by extending an analysis into the “long-term.”

  9. Analyzing Accuracy and Accessibility in Information and Communication Technology Ethical Scenario Context

    Directory of Open Access Journals (Sweden)

    M. Masrom

    2011-01-01

    Problem statement: The development of Information and Communication Technology (ICT) has become indispensable to life. The utilization of ICT has provided advantages for people, organizations and society as a whole. Nevertheless, the widespread and rapid use of ICT in society has exacerbated existing ethical issues and dilemmas and also led to the emergence of new ethical issues such as unauthorized access, software piracy, internet pornography, privacy protection, the information gap and many others. Approach: The aim of this study is therefore to discuss several issues in ICT ethics, focusing on two major issues: data accuracy and accessibility. Results: The results indicated that more than half of the respondents tended to act ethically in the data accuracy scenario as well as in the accessibility scenario. Several computer ethics scenarios relating to data accuracy and accessibility are presented and the results of the analysis are then discussed. Conclusion: Based on the results of this study, computer ethics issues such as data accuracy and accessibility should receive more attention in the ICT field.

  10. Evolving technologies for growing, imaging and analyzing 3D root system architecture of crop plants

    Institute of Scientific and Technical Information of China (English)

    Miguel A Pineros; Pierre-Luc Pradier; Nathanael M Shaw; Ithipong Assaranurak; Susan R McCouch; Craig Sturrock; Malcolm Bennett; Leon V Kochian; Brandon G Larson; Jon E Shaff; David J Schneider; Alexandre Xavier Falcao; Lixing Yuan; Randy T Clark; Eric J Craft; Tyler W Davis

    2016-01-01

    A plant’s ability to maintain or improve its yield under limiting conditions, such as nutrient deficiency or drought, can be strongly influenced by root system architecture (RSA), the three-dimensional distribution of the different root types in the soil. The ability to image, track and quantify these root system attributes in a dynamic fashion is a useful tool in assessing desirable genetic and physiological root traits. Recent advances in imaging technology and phenotyping software have resulted in substantive progress in describing and quantifying RSA. We have designed a hydroponic growth system which retains the three-dimensional RSA of the plant root system, while allowing for aeration, solution replenishment and the imposition of nutrient treatments, as well as high-quality imaging of the root system. The simplicity and flexibility of the system allows for modifications tailored to the RSA of different crop species and improved throughput. This paper details the recent improvements and innovations in our root growth and imaging system which allow for greater image sensitivity (detection of fine roots and other root details), higher efficiency, and a broad array of growing conditions for plants that more closely mimic those found under field conditions.

  11. Temperature variation in metal ceramic technology analyzed using time domain optical coherence tomography

    Science.gov (United States)

    Sinescu, Cosmin; Topala, Florin I.; Negrutiu, Meda Lavinia; Duma, Virgil-Florin; Podoleanu, Adrian G.

    2014-01-01

    The quality of dental prostheses is essential in providing good quality medical services. The metal ceramic technology applied in dentistry implies ceramic sintering inside a dental oven. Every ceramic material requires a specific sintering chart recommended by the producer. For a regular dental technician it is very difficult to evaluate whether the temperature inside the oven remains the same as programmed on the sintering chart; maintaining the calibration over time is also an issue for practitioners. Metal ceramic crowns develop a very characteristic pattern in the ceramic layers depending on the temperature variation inside the oven where they are processed. In the present study, different patterns were identified for samples processed with temperature variations of +30 °C to +50 °C and -30 °C to -50 °C, respectively. The OCT imaging evaluations performed on the normal samples show a uniform spread of the ceramic granulation inside the ceramic materials. In the samples sintered at a higher temperature, an alternation of white and darker areas appears between the enamel and opaque layers. In the samples sintered at a lower temperature, a decrease in ceramic granulation from the enamel towards the opaque layer is observed. The TD-OCT method can therefore be used efficiently for detecting temperature variation during ceramic sintering inside the ceramic oven.

  12. Evolving technologies for growing, imaging and analyzing 3D root system architecture of crop plants.

    Science.gov (United States)

    Piñeros, Miguel A; Larson, Brandon G; Shaff, Jon E; Schneider, David J; Falcão, Alexandre Xavier; Yuan, Lixing; Clark, Randy T; Craft, Eric J; Davis, Tyler W; Pradier, Pierre-Luc; Shaw, Nathanael M; Assaranurak, Ithipong; McCouch, Susan R; Sturrock, Craig; Bennett, Malcolm; Kochian, Leon V

    2016-03-01

    A plant's ability to maintain or improve its yield under limiting conditions, such as nutrient deficiency or drought, can be strongly influenced by root system architecture (RSA), the three-dimensional distribution of the different root types in the soil. The ability to image, track and quantify these root system attributes in a dynamic fashion is a useful tool in assessing desirable genetic and physiological root traits. Recent advances in imaging technology and phenotyping software have resulted in substantive progress in describing and quantifying RSA. We have designed a hydroponic growth system which retains the three-dimensional RSA of the plant root system, while allowing for aeration, solution replenishment and the imposition of nutrient treatments, as well as high-quality imaging of the root system. The simplicity and flexibility of the system allows for modifications tailored to the RSA of different crop species and improved throughput. This paper details the recent improvements and innovations in our root growth and imaging system which allows for greater image sensitivity (detection of fine roots and other root details), higher efficiency, and a broad array of growing conditions for plants that more closely mimic those found under field conditions. PMID:26683583

  13. Assessing the validity of using serious game technology to analyze physician decision making.

    Directory of Open Access Journals (Sweden)

    Deepika Mohan

    BACKGROUND: Physician non-compliance with clinical practice guidelines remains a critical barrier to high quality care. Serious games (using gaming technology for serious purposes) have emerged as a method of studying physician decision making. However, little is known about their validity. METHODS: We created a serious game and evaluated its construct validity. We used the decision context of trauma triage in the Emergency Department of non-trauma centers, given widely accepted guidelines that recommend the transfer of severely injured patients to trauma centers. We designed cases with the premise that the representativeness heuristic influences triage (i.e., physicians make transfer decisions based on archetypes of severely injured patients rather than guidelines). We randomized a convenience sample of emergency medicine physicians to a control or cognitive load arm, and compared performance (disposition decisions, number of orders entered, time spent per case). We hypothesized that cognitive load would increase the use of heuristics, increasing the transfer of representative cases and decreasing the transfer of non-representative cases. FINDINGS: We recruited 209 physicians, of whom 168 (79%) began and 142 (68%) completed the task. Physicians transferred 31% of severely injured patients during the game, consistent with rates of transfer for severely injured patients in practice. They entered the same average number of orders in both arms (control (C): 10.9 [SD 4.8] vs. cognitive load (CL): 10.7 [SD 5.6], p = 0.74), despite spending less time per case in the control arm (C: 9.7 [SD 7.1] vs. CL: 11.7 [SD 6.7] minutes, p<0.01). Physicians were equally likely to transfer representative cases in the two arms (C: 45% vs. CL: 34%, p = 0.20), but were more likely to transfer non-representative cases in the control arm (C: 38% vs. CL: 26%, p = 0.03). CONCLUSIONS: We found that physicians made decisions consistent with actual practice, that we could

  14. Statistical Techniques Complement UML When Developing Domain Models of Complex Dynamical Biosystems.

    Science.gov (United States)

    Williams, Richard A; Timmis, Jon; Qwarnstrom, Eva E

    2016-01-01

    Computational modelling and simulation is increasingly being used to complement traditional wet-lab techniques when investigating the mechanistic behaviours of complex biological systems. In order to ensure computational models are fit for purpose, it is essential that the abstracted view of biology captured in the computational model, is clearly and unambiguously defined within a conceptual model of the biological domain (a domain model), that acts to accurately represent the biological system and to document the functional requirements for the resultant computational model. We present a domain model of the IL-1 stimulated NF-κB signalling pathway, which unambiguously defines the spatial, temporal and stochastic requirements for our future computational model. Through the development of this model, we observe that, in isolation, UML is not sufficient for the purpose of creating a domain model, and that a number of descriptive and multivariate statistical techniques provide complementary perspectives, in particular when modelling the heterogeneity of dynamics at the single-cell level. We believe this approach of using UML to define the structure and interactions within a complex system, along with statistics to define the stochastic and dynamic nature of complex systems, is crucial for ensuring that conceptual models of complex dynamical biosystems, which are developed using UML, are fit for purpose, and unambiguously define the functional requirements for the resultant computational model.

  15. Statistical Techniques Complement UML When Developing Domain Models of Complex Dynamical Biosystems

    Science.gov (United States)

    Timmis, Jon; Qwarnstrom, Eva E.

    2016-01-01

    Computational modelling and simulation is increasingly being used to complement traditional wet-lab techniques when investigating the mechanistic behaviours of complex biological systems. In order to ensure computational models are fit for purpose, it is essential that the abstracted view of biology captured in the computational model, is clearly and unambiguously defined within a conceptual model of the biological domain (a domain model), that acts to accurately represent the biological system and to document the functional requirements for the resultant computational model. We present a domain model of the IL-1 stimulated NF-κB signalling pathway, which unambiguously defines the spatial, temporal and stochastic requirements for our future computational model. Through the development of this model, we observe that, in isolation, UML is not sufficient for the purpose of creating a domain model, and that a number of descriptive and multivariate statistical techniques provide complementary perspectives, in particular when modelling the heterogeneity of dynamics at the single-cell level. We believe this approach of using UML to define the structure and interactions within a complex system, along with statistics to define the stochastic and dynamic nature of complex systems, is crucial for ensuring that conceptual models of complex dynamical biosystems, which are developed using UML, are fit for purpose, and unambiguously define the functional requirements for the resultant computational model. PMID:27571414

  16. Analyzing the effect of customer loyalty on virtual marketing adoption based on theory of technology acceptance model

    Directory of Open Access Journals (Sweden)

    Peyman Ghafari Ashtiani

    2016-08-01

    One of the greatest advantages of the internet and its expansion is probably its easy, low-cost access to unlimited information and its easy, fast information exchange. The arrival of communication technology in the marketing field and the emergence of the Internet have led to the creation and development of new marketing models such as viral marketing. In fact, unlike other marketing methods, the most powerful promotion of products and ideas takes place not from a marketer to a customer but from one customer to another. The purpose of this research is to analyze the relationship between customers' loyalty and the acceptance of viral marketing based on the technology acceptance model (TAM) among the civil engineers and architects who are members of the Engineering Council in Isfahan (ECI). The research method is descriptive-survey and applied in its aim. The statistical population includes the 14,400 civil engineers and architects who are members of the Engineering Council in Isfahan. The sample size was determined as 762 members based on the Cochran sampling formula, with convenience sampling. The data were collected by field methods through questionnaires and analyzed with SPSS and LISREL software. According to the results, the loyalty of the civil engineer and architect members of ECI was associated with the acceptance and practical adoption of viral marketing.

  17. Analyzing the efficiency of small and medium-sized enterprises of a national technology innovation research and development program

    OpenAIRE

    Park, Sungmin

    2014-01-01

    This study analyzes the efficiency of small and medium-sized enterprises (SMEs) of a national technology innovation research and development (R&D) program. In particular, an empirical analysis is presented that aims to answer the following question: “Is there a difference in the efficiency between R&D collaboration types and between government R&D subsidy sizes?” Methodologically, the efficiency of a government-sponsored R&D project (i.e., GSP) is measured by Data Envelopment Analysis (DEA), ...

  18. Fluorescent In Situ Hybridization as a Genetic Technology to Analyzing Chromosomal Organization of Alien Wheat Recombinant Lines

    International Nuclear Information System (INIS)

    Fluorescent in situ hybridization is a valuable method for the physical mapping of DNA sequences to chromosomes and genomes and for analyzing their organization, diversity, evolution and function. Using genomic DNA, the origin of chromatin in hybrids and alien introgression lines can be identified and followed through breeding programmes. We have applied this technology to study the chromosome composition of new recombinants and genomes derived from spontaneous and induced translocations, in particular involving rye and the goat grass Thinopyrum intermedium, that transfer disease and stress resistance to wheat. We have established flow diagrams for easy identification of the alien chromosome material. (author)

  19. Analyzing the Technology of Using Ash and Slag Waste from Thermal Power Plants in the Production of Building Ceramics

    Science.gov (United States)

    Malchik, A. G.; Litovkin, S. V.; Rodionov, P. V.; Kozik, V. V.; Gaydamak, M. A.

    2016-04-01

    The work describes the problem of impounding and storing ash and slag waste at coal-fired thermal power plants in Russia. Recovery and recycling of ash and slag waste are analyzed. The activity of radionuclides, the chemical composition, and the particle sizes of the ash and slag waste were determined; the acidity index, the basicity, and the class of the material were defined. A technology for making ceramic products with the addition of ash and slag waste is proposed. The dependencies on the percentage of ash and slag waste and the optimal firing parameters were established. The obtained materials were tested for physical and mechanical properties, namely water absorption, thermal conductivity, and compressive strength. Based on the findings, future prospects for the use of ash and slag waste were identified.
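One screening step the record mentions is classifying the ash by its basicity. A common definition (one of several in use; the record does not give its formula) is the basicity modulus Mo = (CaO + MgO) / (SiO2 + Al2O3), with Mo < 1 indicating acidic ash. A minimal sketch with hypothetical oxide mass fractions:

```python
# One common basicity (hydraulic) modulus for ash and slag:
#   Mo = (CaO + MgO) / (SiO2 + Al2O3)
# Mo < 1 -> acidic ash; Mo > 1 -> basic ash.
# The oxide mass fractions (%) below are hypothetical, not from the study.
composition = {"SiO2": 55.0, "Al2O3": 22.0, "CaO": 6.5, "MgO": 2.1}

Mo = (composition["CaO"] + composition["MgO"]) / (composition["SiO2"] + composition["Al2O3"])
kind = "basic" if Mo > 1 else "acidic"
print(round(Mo, 3), kind)
```

Typical coal fly ash, as in this sketch, comes out acidic, which matters for how it behaves as a ceramic additive during firing.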

  20. Analyzing the efficiency of small and medium-sized enterprises of a national technology innovation research and development program.

    Science.gov (United States)

    Park, Sungmin

    2014-01-01

    This study analyzes the efficiency of small and medium-sized enterprises (SMEs) of a national technology innovation research and development (R&D) program. In particular, an empirical analysis is presented that aims to answer the following question: "Is there a difference in the efficiency between R&D collaboration types and between government R&D subsidy sizes?" Methodologically, the efficiency of a government-sponsored R&D project (i.e., GSP) is measured by Data Envelopment Analysis (DEA), and a nonparametric analysis of variance method, the Kruskal-Wallis (KW) test is adopted to see if the efficiency differences between R&D collaboration types and between government R&D subsidy sizes are statistically significant. This study's major findings are as follows. First, contrary to our hypothesis, when we controlled the influence of government R&D subsidy size, there was no statistically significant difference in the efficiency between R&D collaboration types. However, the R&D collaboration type, "SME-University-Laboratory" Joint-Venture was superior to the others, achieving the largest median and the smallest interquartile range of DEA efficiency scores. Second, the differences in the efficiency were statistically significant between government R&D subsidy sizes, and the phenomenon of diseconomies of scale was identified on the whole. As the government R&D subsidy size increases, the central measures of DEA efficiency scores were reduced, but the dispersion measures rather tended to get larger. PMID:25120949
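The two methods the abstract names can be sketched together: an input-oriented CCR DEA model solved as a linear program per decision-making unit, followed by a Kruskal-Wallis test on the resulting efficiency scores. This is a generic illustration on made-up project data, not the paper's code, data, or model variant:

```python
import numpy as np
from scipy.optimize import linprog
from scipy.stats import kruskal

def dea_ccr_input(X, Y):
    """Input-oriented CCR efficiency for each DMU.
    X: (n, m) inputs, Y: (n, s) outputs. Returns scores theta in (0, 1]."""
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for k in range(n):
        # Decision variables: [theta, lambda_1 .. lambda_n]; minimize theta
        c = np.r_[1.0, np.zeros(n)]
        # sum_j lambda_j x_j <= theta x_k   ->  -theta x_k + X^T lambda <= 0
        A_in = np.c_[-X[k].reshape(m, 1), X.T]
        # sum_j lambda_j y_j >= y_k         ->  -Y^T lambda <= -y_k
        A_out = np.c_[np.zeros((s, 1)), -Y.T]
        res = linprog(c, A_ub=np.r_[A_in, A_out], b_ub=np.r_[np.zeros(m), -Y[k]],
                      bounds=[(None, None)] + [(0, None)] * n)
        scores.append(res.x[0])
    return np.array(scores)

# Hypothetical GSPs: 9 projects, 2 inputs (subsidy, staff), 1 output (sales)
rng = np.random.default_rng(0)
X = rng.uniform(1, 10, size=(9, 2))
Y = rng.uniform(1, 10, size=(9, 1))
theta = dea_ccr_input(X, Y)

# Kruskal-Wallis test across three hypothetical collaboration types
groups = [theta[0:3], theta[3:6], theta[6:9]]
H, p = kruskal(*groups)
print(theta.round(3), round(H, 3), round(p, 3))
```

Grouping by subsidy size instead of collaboration type follows the same pattern: partition `theta` by the grouping variable and pass the partitions to `kruskal`.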

  2. Application of the Linux-based share memory technology in a visual and remote analyzing software for the high-speed nuclear information

    International Nuclear Information System (INIS)

    Taking a remote analysis package for high-speed nuclear information acquisition as an example, this paper presents an application of shared-memory technology based on the Linux operating system. Its universality and reliability make it a useful reference for similar analysis software in other nuclear information acquisition systems.
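The pattern the record describes, one process publishing acquisition data into a shared segment and an analysis process attaching to it without copying, can be sketched with Python's `multiprocessing.shared_memory` module, which wraps POSIX shared memory on Linux. The channel counts and segment layout here are hypothetical:

```python
import numpy as np
from multiprocessing import shared_memory

# Acquisition side: publish a block of channel counts into shared memory
counts = np.array([120, 340, 560, 230], dtype=np.int64)  # hypothetical counts
shm = shared_memory.SharedMemory(create=True, size=counts.nbytes)
seg_name = shm.name  # auto-generated unique segment name
buf = np.ndarray(counts.shape, dtype=counts.dtype, buffer=shm.buf)
buf[:] = counts

# Analysis side: attach to the same segment by name and read without copying
shm2 = shared_memory.SharedMemory(name=seg_name)
view = np.ndarray(counts.shape, dtype=counts.dtype, buffer=shm2.buf)
total = int(view.sum())
print(total)

# Release the numpy views before closing, then remove the segment
del view
shm2.close()
del buf
shm.close()
shm.unlink()
```

In a real acquisition system the two sides would be separate processes agreeing on the segment name out of band; synchronization (e.g. a semaphore) would also be needed, which this sketch omits.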

  3. Analyzing Sound Production Technology in the Phenomenology of Technology

    Institute of Scientific and Technical Information of China (English)

    李松林

    2011-01-01

    Sound is a property of the natural world and one of humanity's oldest forms of communication. Sound production technology covers the recording and transmission of sound, and its development is closely bound to the history of human progress, stretching across a long historical span from primitive society through the modern era to contemporary society. The phenomenology of technology is the study of the relationship between technology and human beings. Applying Ihde's phenomenology of technology, the history of sound production technology can be divided into embryonic, emergence, development, mature, and post-modern stages. The basic pattern in the advance of sound production technology is that, through the production and transmission of sound, consciousness is extended; its essence is a "phenomenology of sound", the production of a kind of "existence". Sound production technology also carries the deeper philosophical significance of changing the mode of human existence: "existence", the basic proposition of philosophy, may become a new mode of being.

  4. Does Personality Matter? Applying Holland's Typology to Analyze Students' Self-Selection into Science, Technology, Engineering, and Mathematics Majors

    Science.gov (United States)

    Chen, P. Daniel; Simpson, Patricia A.

    2015-01-01

    This study utilized John Holland's personality typology and the Social Cognitive Career Theory (SCCT) to examine the factors that may affect students' self-selection into science, technology, engineering, and mathematics (STEM) majors. Results indicated that gender, race/ethnicity, high school achievement, and personality type were statistically…

  5. Ab initio O(N) elongation-counterpoise method for BSSE-corrected interaction energy analyses in biosystems

    Energy Technology Data Exchange (ETDEWEB)

    Orimoto, Yuuichi; Xie, Peng; Liu, Kai [Department of Material Sciences, Faculty of Engineering Sciences, Kyushu University, 6-1 Kasuga-Park, Fukuoka 816-8580 (Japan); Yamamoto, Ryohei [Department of Molecular and Material Sciences, Interdisciplinary Graduate School of Engineering Sciences, Kyushu University, 6-1 Kasuga-Park, Fukuoka 816-8580 (Japan); Imamura, Akira [Hiroshima Kokusai Gakuin University, 6-20-1 Nakano, Aki-ku, Hiroshima 739-0321 (Japan); Aoki, Yuriko, E-mail: aoki.yuriko.397@m.kyushu-u.ac.jp [Department of Material Sciences, Faculty of Engineering Sciences, Kyushu University, 6-1 Kasuga-Park, Fukuoka 816-8580 (Japan); Japan Science and Technology Agency, CREST, 4-1-8 Hon-chou, Kawaguchi, Saitama 332-0012 (Japan)

    2015-03-14

    An elongation-counterpoise (ELG-CP) method was developed for performing accurate and efficient interaction energy analysis and correcting the basis set superposition error (BSSE) in biosystems. The method combines our ab initio O(N) elongation method with the conventional counterpoise method proposed for solving the BSSE problem. As a test, the ELG-CP method was applied to the analysis of the inter-strand interaction energies of DNA with respect to the alkylation-induced base-pair mismatch phenomenon that causes a transition from G⋯C to A⋯T. It was found that the ELG-CP method showed high efficiency (nearly linear scaling) and high accuracy, with a negligibly small error in the total energy calculations (on the order of 10^-7 to 10^-8 hartree/atom) compared with the conventional method during the counterpoise treatment. Furthermore, the magnitude of the BSSE was found to be ca. -290 kcal/mol for the calculation of a DNA model with 21 base pairs. This emphasizes the importance of BSSE correction when a limited-size basis set is used to study DNA models and compare small energy differences between them. In this work, we quantitatively estimated the inter-strand interaction energy for each possible step in the transition process from G⋯C to A⋯T by the ELG-CP method. It was found that the base-pair replacement in the process only affects the interaction energy in a limited area around the mismatch position, within a few adjacent base pairs. From the interaction energy point of view, our results showed that a base-pair sliding mechanism possibly occurs after the alkylation of guanine to gain the maximum possible number of hydrogen bonds between the bases. In addition, the steps leading to the A⋯T replacement accompanied by replication were found to be unfavorable processes, corresponding to a loss of ca. 10 kcal/mol in stabilization energy. The present study indicated that the ELG-CP method is promising for
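The counterpoise recipe the record builds on (Boys-Bernardi) is simple arithmetic once the component energies are in hand: the interaction energy is re-evaluated with each monomer computed in the full dimer ("ghost") basis. A minimal sketch with illustrative energies, not values from the paper:

```python
# Counterpoise-corrected interaction energy (Boys-Bernardi):
#   E_int^CP = E_AB(AB basis) - E_A(AB basis) - E_B(AB basis)
# The BSSE is the gap between the uncorrected and corrected estimates.
# All energies in hartree; the numbers below are illustrative only.
E_AB_ab = -150.4210   # dimer in the dimer basis
E_A_a   =  -75.1900   # monomer A in its own basis
E_B_b   =  -75.2000   # monomer B in its own basis
E_A_ab  =  -75.1950   # monomer A in the full dimer ("ghost") basis
E_B_ab  =  -75.2040   # monomer B in the full dimer basis

E_int_raw = E_AB_ab - E_A_a - E_B_b      # uncorrected interaction energy
E_int_cp  = E_AB_ab - E_A_ab - E_B_ab    # counterpoise-corrected
bsse      = E_int_cp - E_int_raw         # positive: raw estimate was overbound

HARTREE_TO_KCAL = 627.5095
print(round(E_int_raw * HARTREE_TO_KCAL, 2),
      round(E_int_cp * HARTREE_TO_KCAL, 2),
      round(bsse * HARTREE_TO_KCAL, 2))
```

The point of the ELG-CP combination in the record is that the monomer-in-dimer-basis calculations, normally the expensive part for large biosystems, are made to scale nearly linearly.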

  6. Collect, analyze and data base for building up the investment reports of Center for Nuclear Science and Technology construction project

    International Nuclear Information System (INIS)

    Following Contract No. 19/HD/NVCB dated July 10, 2013, signed by the President of the Vietnam Atomic Energy Institute (VINATOM), an additional ministerial project was approved by Decision No. 526/QD-VNLNT dated July 8, 2013 of the VINATOM President in order to implement an important task for VINATOM. The project was carried out by the Institute for Nuclear Science and Technology (INST) in Hanoi as the managing organization, with VINATOM as the owner of the project results. Its main objectives are to support, from the national budget, the collection of general reports from previous projects relevant to CNEST and the new research reactor, IAEA guidance documents, documents provided by ROSATOM in seminars in 2010, 2012 and 2013, and reports from expert visits of the Ministry of Science and Technology, and to complete the general report on the CNEST construction project. (author)

  7. Analyzing the effect of customer loyalty on virtual marketing adoption based on theory of technology acceptance model

    OpenAIRE

    Peyman Ghafari Ashtiani; Atefeh Parsayan; Moein Mohajerani

    2016-01-01

    One of the greatest advantages of the internet and its expansion is its easy, low-cost access to unlimited information and its fast, easy information exchange. The arrival of communication technology in the marketing arena and the emergence of the Internet have led to the creation and development of new marketing models such as viral marketing. In fact, unlike other marketing methods, this most powerful tool for selling products and ideas works not from a marketer to a customer but from a cu...

  8. Analyzing the Effect of Technology-Based Intervention in Language Laboratory to Improve Listening Skills of First Year Engineering Students

    Directory of Open Access Journals (Sweden)

    Pasupathi Madhumathi

    2013-04-01

    Full Text Available First-year students pursuing engineering education face problems with their listening skills. Most Indian schools use a bilingual method for teaching subjects from primary school through high school. Nonetheless, students entering university education develop anxiety in listening to classroom lectures in English. This article reports an exploratory study that aimed to find out whether the listening competences of students improved when technology was deployed in the language laboratory. It also investigated the opinions of the students about using teacher-suggested websites for acquiring listening skills. The results of the study indicated that the use of technology in a language laboratory for training students in listening competences reduced the anxiety of the students when listening to English. Further, there was a significant improvement on the part of students in acquiring listening skills through the technology-based intervention. Resumen (translated from Spanish): Many first-year engineering students in India have problems with their English listening skills; they experience anxiety when listening to lectures in English, even though they come from schools that follow a bilingual model for teaching subjects from primary through secondary school. To find out whether students' listening competences improve when technology is introduced into the language laboratory, an exploratory study was conducted that took into account students' opinions about the use of teacher-suggested websites for acquiring listening skills. The results indicate that the use of technology in the language laboratory reduces students' anxiety when listening to lectures in English and that they progress significantly in their listening skills.

  9. Research on Embedded Linux System Porting Technology

    Institute of Scientific and Technical Information of China (English)

    张伟杰; 李明理

    2012-01-01

    This paper describes the embedded Linux development workflow and the establishment of a cross-development environment, analyzes the internal organization of Linux and its effect on system porting, and introduces the target hardware platform and the existing software base. From the perspective of realizing the target system, taking theoretical analysis as the basis and a working port as the goal, it examines and implements the main elements of system porting.

  10. Seismic Technology Adapted to Analyzing and Developing Geothermal Systems Below Surface-Exposed High-Velocity Rocks Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Hardage, Bob A. [Univ. of Texas, Austin, TX (United States). Bureau of Economic Geology; DeAngelo, Michael V. [Univ. of Texas, Austin, TX (United States). Bureau of Economic Geology; Ermolaeva, Elena [Univ. of Texas, Austin, TX (United States). Bureau of Economic Geology; Hardage, Bob A. [Univ. of Texas, Austin, TX (United States). Bureau of Economic Geology; Remington, Randy [Univ. of Texas, Austin, TX (United States). Bureau of Economic Geology; Sava, Diana [Univ. of Texas, Austin, TX (United States). Bureau of Economic Geology; Wagner, Donald [Univ. of Texas, Austin, TX (United States). Bureau of Economic Geology; Wei, Shuijion [Univ. of Texas, Austin, TX (United States). Bureau of Economic Geology

    2013-02-01

    The objective of our research was to develop and demonstrate seismic data-acquisition and data-processing technologies that allow geothermal prospects below high-velocity rock outcrops to be evaluated. To do this, we acquired a 3-component seismic test line across an area of exposed high-velocity rocks in Brewster County, Texas, where there is high heat flow and surface conditions mimic those found at numerous geothermal prospects. Seismic contractors have not succeeded in creating good-quality seismic data in this area for companies who have acquired data for oil and gas exploitation purposes. Our test profile traversed an area where high-velocity rocks and low-velocity sediment were exposed on the surface in alternating patterns that repeated along the test line. We verified that these surface conditions cause non-ending reverberations of Love waves, Rayleigh waves, and shallow critical refractions to travel across the earth surface between the boundaries of the fast-velocity and slow-velocity material exposed on the surface. These reverberating surface waves form the high level of noise in this area that does not allow reflections from deep interfaces to be seen and utilized. Our data-acquisition method of deploying a box array of closely spaced geophones allowed us to recognize and evaluate these surface-wave noise modes regardless of the azimuth direction to the surface anomaly that backscattered the waves and caused them to return to the test-line profile. With this knowledge of the surface-wave noise, we were able to process these test-line data to create P-P and SH-SH images that were superior to those produced by a skilled seismic data-processing contractor. Compared to the P-P data acquired along the test line, the SH-SH data provided a better detection of faults and could be used to trace these faults upward to the boundaries of exposed surface rocks. We expanded our comparison of the relative value of S-wave and P-wave seismic data for geothermal

  11. Analysis of the Linux Network Protocol Stack and Implementation of Protocol Addition

    Institute of Scientific and Technical Information of China (English)

    唐续; 刘心松; 杨峰

    2003-01-01

    In order to improve the performance of Linux networking, new protocols must be implemented and added to the original protocol stack. To this end, this paper analyzes the architecture and implementation of the Linux network stack and presents a method for appending new protocols to it. The method covers protocol registration, protocol operations, protocol header implementation, packet reception, and the user interface.
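Of the steps the record lists, actual protocol registration is kernel C code, but the header-implementation step can be illustrated in user space: a fixed binary header laid out in network byte order, packed on send and unpacked on receive. The field layout below is invented for illustration, not a real protocol:

```python
import struct

# Hypothetical header for a new protocol: 1-byte version, 1-byte type,
# 2-byte payload length, 4-byte sequence number, all in network byte order.
HDR_FMT = "!BBHI"
HDR_LEN = struct.calcsize(HDR_FMT)   # 8 bytes

def build_packet(version: int, ptype: int, seq: int, payload: bytes) -> bytes:
    """Prepend the protocol header to a payload (the 'send' side)."""
    return struct.pack(HDR_FMT, version, ptype, len(payload), seq) + payload

def parse_packet(frame: bytes) -> dict:
    """Split a received frame back into header fields and payload."""
    version, ptype, length, seq = struct.unpack(HDR_FMT, frame[:HDR_LEN])
    return {"version": version, "type": ptype, "seq": seq,
            "payload": frame[HDR_LEN:HDR_LEN + length]}

pkt = build_packet(1, 0x02, 42, b"sensor data")
info = parse_packet(pkt)
print(info["seq"], info["payload"])
```

In the kernel, the same layout would be a packed C struct, and reception would be wired in through the stack's protocol registration tables rather than a function call.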

  12. Greenhouse gas (GHG) emission in organic farming. Approximate quantification of its generation at the organic garden of the School of Agricultural, Food and Biosystems Engineering (ETSIAAB) in the Technical University of Madrid (UPM)

    Science.gov (United States)

    Campos, Jorge; Barbado, Elena; Maldonado, Mariano; Andreu, Gemma; López de Fuentes, Pilar

    2016-04-01

    As is well known, fertilization of agricultural soil increases the production of greenhouse gas (GHG) emissions such as CO2, CH4 and N2O. The share of this activity in climate change is currently under study, as are the possibilities for mitigation. In this context, we considered it interesting to know what this share is in the case of organic farming. A field experiment was therefore carried out at the organic garden of the School of Agricultural, Food and Biosystems Engineering (ETSIAAB) at the Technical University of Madrid (UPM). The orchard included different management areas corresponding to different schools of organic farming. Soil and gas samples were taken from these sites. Gas samples were collected throughout the growing season from the atmosphere accumulating inside static chambers inserted into the soil; these samples were then taken to the laboratory and analyzed. The results give an approximate measure of how organic fertilization contributes to air pollution from greenhouse gases.
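With static chambers like those in the record, the flux is usually derived from the rate of concentration rise inside the closed chamber, converted to a mass basis with the ideal gas law. A minimal sketch (the record gives no numbers; chamber geometry and the N2O slope below are hypothetical):

```python
# Greenhouse-gas flux from a closed static chamber:
#   F = (dC/dt) * (V / A) * (P * M) / (R * T)
#   dC/dt : concentration rise inside the chamber [ppm / hour]
#   V, A  : chamber volume [m^3] and covered soil area [m^2]
#   P, T  : air pressure [Pa] and temperature [K]; M: molar mass [g/mol]
R = 8.314  # J/(mol K)

def chamber_flux(dc_dt_ppm_h, volume_m3, area_m2, molar_mass,
                 temp_k=298.15, pressure_pa=101325.0):
    """Return flux in mg of gas per m^2 per hour."""
    mol_air_per_m3 = pressure_pa / (R * temp_k)
    dc_mol = dc_dt_ppm_h * 1e-6 * mol_air_per_m3     # mol gas m^-3 h^-1
    return dc_mol * (volume_m3 / area_m2) * molar_mass * 1000.0

# Hypothetical N2O rise of 0.05 ppm/h in a 0.03 m^3 chamber over 0.12 m^2 of soil
f = chamber_flux(0.05, 0.03, 0.12, molar_mass=44.0)
print(round(f, 4))
```

In practice the slope dC/dt comes from a linear fit over several timed samples drawn from the chamber headspace, which is why samples are collected repeatedly through the season.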

  13. An application of multiplier analysis in analyzing the role of information and communication technology sectors on Indonesian national economy: 1990-2005

    Science.gov (United States)

    Zuhdi, Ubaidillah

    2015-01-01

    The purpose of this study is to continue previous studies focused on the Indonesian Information and Communication Technology (ICT) sectors. More specifically, this study aims to analyze the role of the ICT sectors in the Indonesian national economy using the simple household income multiplier, one of the analysis tools in Input-Output (IO) analysis. The analysis period is 1990-2005. The results show that the sectors did not play an important role in the national economy of Indonesia over this period. The results also show that, from the point of view of the multiplier, the Indonesian national economy tended to be stable during the period.
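The multiplier the record names falls out of the Leontief inverse of an input-output table: the income multiplier of sector j is the household income generated across all sectors per unit of final demand for j. A generic sketch with a hypothetical 3-sector table, not Indonesian IO data:

```python
import numpy as np

# Hypothetical 3-sector technical-coefficient matrix A (column j = inputs
# per unit of output of sector j) and household income coefficients h
# (income paid per unit of output of each sector). Invented numbers.
A = np.array([[0.10, 0.20, 0.05],
              [0.15, 0.05, 0.10],
              [0.05, 0.10, 0.15]])
h = np.array([0.30, 0.25, 0.40])

# Leontief inverse: total output required per unit of final demand
L = np.linalg.inv(np.eye(3) - A)

# Simple household income multiplier of sector j = sum_i h_i * L[i, j]
income_multipliers = h @ L
print(income_multipliers.round(3))
```

Each multiplier exceeds the sector's direct income coefficient because the Leontief inverse adds the income generated through all indirect input purchases.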

  15. Laser influence to biosystems

    OpenAIRE

    Jevtić Sanja D.; Srećković Milesa Ž.; Pelemiš Svetlana S.; Konstantinović Ljubica M.; Jovanić Predrag B.; Petrović Lazar D.; Dukić Milan M.

    2015-01-01

    In this paper continuous-wave (cw) lasers in the visible region were applied in order to study the influence of quantum generators on certain plants. The aim of such projects is to analyse biostimulation processes in living organisms, which are linked to defined laser power density thresholds (exposure doses). The results of irradiation of corn and wheat seeds using a He-Ne laser in the cw regime (632.8 nm, 50 mW) are presented and compared to results for other laser ...

  16. Laser influence to biosystems

    Directory of Open Access Journals (Sweden)

    Jevtić Sanja D.

    2015-01-01

    Full Text Available In this paper continuous-wave (cw) lasers in the visible region were applied in order to study the influence of quantum generators on certain plants. The aim of such projects is to analyse biostimulation processes in living organisms, which are linked to defined laser power density thresholds (exposure doses). The results of irradiation of corn and wheat seeds using a He-Ne laser in the cw regime (632.8 nm, 50 mW) are presented and compared to results for other laser types. Dry and wet plant seeds were irradiated for defined time intervals, and germination was monitored daily. Morphological data (stalk thickness, height, cob length) for the chosen plants were recorded. From the data recorded over the whole vegetative period we performed appropriate statistical processing. One part of the experiments was the measurement of the reflection coefficient in the visible range. Correlation estimates were calculated and discussed for our results. The main conclusion was that there was a significant increase in plant height and an elongation of cob length for corn.
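The statistical processing the record alludes to, comparing a morphological measure between irradiated and control plants, can be sketched with a two-sample t-test; the record does not publish its raw data, so the heights below are invented for illustration:

```python
from scipy import stats

# Hypothetical final corn plant heights (cm): He-Ne-irradiated seeds vs controls
irradiated = [182, 190, 188, 195, 186, 191, 189, 193]
control    = [175, 180, 178, 183, 176, 181, 179, 182]

# Two-sample t-test on the group means
t, p = stats.ttest_ind(irradiated, control)
print(round(t, 2), round(p, 4))
```

A small p-value with a positive t would support the record's conclusion of a significant height increase; correlation estimates (e.g. `scipy.stats.pearsonr` between dose and height) follow the same pattern.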

  17. Oxygen analyzer

    Science.gov (United States)

    Benner, William H.

    1986-01-01

    An oxygen analyzer which identifies and classifies microgram quantities of oxygen in ambient particulate matter and quantitates organic oxygen in solvent extracts of ambient particulate matter. A sample is pyrolyzed in oxygen-free nitrogen gas (N2), and the resulting oxygen is quantitatively converted to carbon monoxide (CO) by contact with hot granular carbon (C). Two analysis modes are possible: (1) rapid determination of total pyrolyzable oxygen, obtained by decomposing the sample at 1135 °C, or (2) temperature-programmed oxygen thermal analysis, obtained by heating the sample from room temperature to 1135 °C as a function of time. The analyzer basically comprises a pyrolysis tube containing a bed of granular carbon under N2, ovens used to heat the carbon and/or decompose the sample, and a non-dispersive infrared CO detector coupled to a mini-computer to quantitate oxygen in the decomposition products and control oven heating.
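Because each sample oxygen atom leaves the carbon bed as exactly one CO molecule, the integrated CO signal maps one-to-one onto oxygen atoms, so quantitation is a one-line conversion. A sketch with a hypothetical detector reading:

```python
# Each O atom exits the pyrolysis bed as one CO molecule, so integrated CO
# gives the moles of oxygen atoms in the sample directly.
# Hypothetical detector reading: total CO collected, in micromoles.
co_micromol = 0.85

O_ATOMIC_MASS = 15.999  # g/mol
oxygen_micrograms = co_micromol * O_ATOMIC_MASS  # umol * (g/mol) = ug
print(round(oxygen_micrograms, 2))
```

In the temperature-programmed mode, the same conversion is applied to the CO signal as a function of temperature, yielding an oxygen "thermogram" rather than a single total.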

  18. An art report to analyze research status for the establishment of the space food development and future food system using the advanced food technology

    International Nuclear Information System (INIS)

    The quality of food for astronauts accomplishing missions in space is one of the most important matters, and it is time to study and develop Korean space food for Korean astronauts. Therefore, at the beginning of the space exploration era, it is necessary to establish a national long-term plan to research and develop Korean space food in order to provide better-quality food for astronauts accomplishing space missions. Using current food processing, preservation, and packaging technology, it is necessary to develop Korean space food, provide it to Korean astronauts working at the International Space Station, and study the future space food systems to be used for long-duration space voyages and planetary habitat bases. Space food is analyzed through nutritional analysis, sensory evaluation, storage studies, packaging evaluations, and many other methods before its final shipment on the space shuttle. Each technology developed for the advanced food system must provide the required attributes to the food system, including safety, nutrition, and acceptability. It is anticipated that exploration-class missions may last at least 2-3 years, and one of the biggest challenges for these missions will be to provide acceptable food with a shelf life of 3-5 years. The development of space food processing/preservation technology and its ripple effects will contribute to raising the nation's international standing, and the space food developed can potentially also be used for combat rations and emergency/special foods, as in the U.S.A. In the 21st-century era of space exploration, the development of the advanced food system and of life support systems at a Mars base as well as on the space shuttle will strengthen the capability to lead the future space exploration era.

  19. AN ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) TESTING OF TWO HYDROGEN SULFIDE ANALYZERS: HORIBA INSTRUMENTS, INC., APSA-360 AND TELEDYNE-API MODEL 101E

    Science.gov (United States)

    The Environmental Technology Verification (ETV) Program, beginning as an initiative of the U.S. Environmental Protection Agency (EPA) in 1995, verifies the performance of commercially available, innovative technologies that can be used to measure environmental quality. The ETV p...

  20. Biochemical Technology Program progress report for the period January 1--June 30, 1976. [Centrifugal analyzers and advanced analytical systems for blood and body fluids

    Energy Technology Data Exchange (ETDEWEB)

    Mrochek, J.E.; Burtis, C.A.; Scott, C.D. (comps.)

    1976-09-01

    This document, which covers the period January 1-June 30, 1976, describes progress in the following areas: (1) advanced analytical techniques for the clinical laboratory, (2) fast clinical analyzers, (3) development of a miniaturized analytical clinical laboratory system, (4) centrifugal fast analyzers for animal toxicological studies, and (5) chemical profile of body fluids.

  1. A COGNITIVE AND MEMETIC SCIENCE APPROACH TO ANALYZE THE HUMAN FACTORS IN PREDICTING THE EVOLUTION OF PAPER TECHNOLOGY AND PRODUCTS IN THE 21sT CENTURY

    Institute of Scientific and Technical Information of China (English)

    Fumihiko Onabe

    2004-01-01

    Predicting the future of the paper industry is conventionally done from technological and market-oriented aspects, as well as from the variety of constraints lying ahead of the industry, such as resource, energy, and environmental issues.

  2. (Environmental technology)

    Energy Technology Data Exchange (ETDEWEB)

    Boston, H.L.

    1990-10-12

    The traveler participated in a conference on environmental technology in Paris, sponsored by the US Embassy-Paris, the US Environmental Protection Agency (EPA), the French Environmental Ministry, and others. The traveler sat on a panel for environmental aspects of energy technology and made a presentation on the potential contributions of Oak Ridge National Laboratory (ORNL) to a planned French-American Environmental Technologies Institute in Chattanooga, Tennessee, and Evry, France. This institute would provide opportunities for international cooperation on environmental issues and technology transfer related to environmental protection, monitoring, and restoration at US Department of Energy (DOE) facilities. The traveler also attended the Fourth International Conference on Environmental Contamination in Barcelona. Conference topics included environmental chemistry, land disposal of wastes, treatment of toxic wastes, micropollutants, trace organics, artificial radionuclides in the environment, and the use of biomonitoring and biosystems for environmental assessment. The traveler presented a paper on "The Fate of Radionuclides in Sewage Sludge Applied to Land." Those findings corresponded well with results from studies addressing the fate of fallout radionuclides from the Chernobyl nuclear accident. There was an exchange of new information on a number of topics of interest to DOE waste management and environmental restoration needs.

  3. Systemic structural modular generalization of the crystallography of bound water applied to study the mechanisms of processes in biosystems at the atomic and molecular level

    International Nuclear Information System (INIS)

    The main reasons of the modern scientific revolution, one of the consequences of which are nanotechnologies and the development of interdisciplinary overall natural science (which can build potentially possible atomic structures and study the mechanisms of the processes occurring in them), are considered. The unifying role of crystallography in the accumulation of interdisciplinary knowledge is demonstrated. This generalization of crystallography requires the introduction of a new concept: a module which reflects the universal condition for stability of all real and potential and equilibrium and nonequilibrium structures of matter (their connectivity). A modular generalization of crystallography covers all forms of solids, including the structure of bound water (a system-forming matrix for the self-organization and morphogenesis of hierarchical biosystems which determines the metric selection of all other structural components of these systems). A dynamic model of the water surface layer, which serves as a matrix in the formation of Langmuir monolayers and plays a key role in the occurrence of life on the Earth, is developed.

  4. Waste Not, Want Not: Analyzing the Economic and Environmental Viability of Waste-to-Energy (WTE) Technology for Site-Specific Optimization of Renewable Energy Options

    Energy Technology Data Exchange (ETDEWEB)

    Funk, K.; Milford, J.; Simpkins, T.

    2013-02-01

    Waste-to-energy (WTE) technology burns municipal solid waste (MSW) in an environmentally safe combustion system to generate electricity, provide district heat, and reduce the need for landfill disposal. While this technology has gained acceptance in Europe, it has yet to be commonly recognized as an option in the United States. Section 1 of this report provides an overview of WTE as a renewable energy technology and describes a high-level model developed to assess the feasibility of WTE at a site. Section 2 reviews results from previous life cycle assessment (LCA) studies of WTE, and then uses an LCA inventory tool to perform a screening-level analysis of cost, net energy production, greenhouse gas (GHG) emissions, and conventional air pollution impacts of WTE for residual MSW in Boulder, Colorado. Section 3 of this report describes the federal regulations that govern the permitting, monitoring, and operating practices of MSW combustors and provides emissions limits for WTE projects.
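
As a toy illustration of the screening-level model described in Section 1, the sketch below converts an annual MSW tonnage into an electricity output and a net greenhouse gas change. Both per-ton factors are assumed round numbers for illustration, not values taken from the NREL report.

```python
def wte_screening(tons_per_year, kwh_per_ton=550.0, ghg_kg_per_ton=-200.0):
    """Screening-level estimate for a WTE plant: annual electricity (MWh)
    and net GHG change (metric tons CO2e).

    Both per-ton factors are illustrative placeholders (positive kWh/ton of
    generated electricity; negative kg CO2e/ton of net avoided emissions),
    not values from the report.
    """
    mwh = tons_per_year * kwh_per_ton / 1000.0
    ghg_tons = tons_per_year * ghg_kg_per_ton / 1000.0
    return mwh, ghg_tons
```

In a real screening study the factors would come from the plant design and an LCA inventory, as Section 2 does for Boulder, Colorado.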

  5. Key Technologies Progress of Automatic Biochemical Analyzer

    Institute of Scientific and Technical Information of China (English)

    王炜

    2010-01-01

    Introduction: The automatic biochemical analyzer (chemistry analyzer) is one of the most frequently used analytical instruments in clinical testing. It is mainly used to measure various biochemical indicators in serum, plasma, or other body fluids, such as glucose, albumin, total protein, cholesterol, creatinine, and transaminases.

  6. Field Evaluation of MERCEM Mercury Emission Analyzer System at the Oak Ridge TSCA Incinerator East Tennessee Technology Park Oak Ridge, Tennessee

    Energy Technology Data Exchange (ETDEWEB)

    None

    2000-03-01

    The authors reached the following conclusions: (1) The two-month evaluation of the MERCEM total mercury monitor from Perkin Elmer provided a useful venue in determining the feasibility of using a CEM to measure total mercury in a saturated flue gas. (2) The MERCEM exhibited potential at a mixed waste incinerator to meet requirements proposed in PS12 under conditions of operation with liquid feeds only at stack mercury concentrations in the range of proposed MACT standards. (3) Performance of the MERCEM under conditions of incinerating solid and liquid wastes simultaneously was less reliable than while feeding liquid feeds only for the operating conditions and configuration of the host facility. (4) The permeation tube calibration method used in this test relied on the CEM internal volumetric and time constants to relate back to a concentration, whereas a compressed gas cylinder concentration is totally independent of the analyzer mass flowmeter and flowrates. (5) Mercury concentration in the compressed gas cylinders was fairly stable over a 5-month period. (6) The reliability of available reference materials was not fully demonstrated without further evaluation of their incorporation into routine operating procedures performed by facility personnel. (7) The degree of mercury control occurring in the TSCA Incinerator off-gas cleaning system could not be quantified from the data collected in this study. (8) It was possible to conduct the demonstration at a facility incinerating radioactively contaminated wastes and to release the equipment for later unrestricted use elsewhere. (9) Experience gained by this testing answered additional site-specific and general questions regarding the operation and maintenance of CEMs and their use in compliance monitoring of total mercury emissions from hazardous waste incinerators.

  7. Ring Image Analyzer

    Science.gov (United States)

    Strekalov, Dmitry V.

    2012-01-01

    Ring Image Analyzer software analyzes images to recognize elliptical patterns. It determines the ellipse parameters (axes ratio, centroid coordinate, tilt angle). The program attempts to recognize elliptical fringes (e.g., Newton Rings) on a photograph and determine their centroid position, the short-to-long-axis ratio, and the angle of rotation of the long axis relative to the horizontal direction on the photograph. These capabilities are important in interferometric imaging and control of surfaces. In particular, this program has been developed and applied for determining the rim shape of precision-machined optical whispering gallery mode resonators. The program relies on a unique image recognition algorithm aimed at recognizing elliptical shapes, but can be easily adapted to other geometric shapes. It is robust against non-elliptical details of the image and against noise. Interferometric analysis of precision-machined surfaces remains an important technological instrument in hardware development and quality analysis. This software automates and increases the accuracy of this technique. The software has been developed for the needs of an R&TD-funded project and has become an important asset for the future research proposal to NASA as well as other agencies.
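
The ellipse parameters the program reports (centroid, short-to-long axis ratio, tilt angle) can be recovered from a sampled contour with a generic second-moment method; this is a brief sketch of that standard technique, not the program's actual image-recognition algorithm.

```python
import numpy as np

def ellipse_params(points):
    """Estimate centroid, short-to-long axis ratio, and tilt angle (degrees)
    of a well-sampled elliptical contour from its second moments."""
    c = points.mean(axis=0)
    cov = np.cov((points - c).T)          # 2x2 covariance of the contour
    vals, vecs = np.linalg.eigh(cov)      # eigenvalues in ascending order
    ratio = np.sqrt(vals[0] / vals[1])    # short-to-long axis ratio
    major = vecs[:, 1]                    # direction of the long axis
    angle = np.degrees(np.arctan2(major[1], major[0])) % 180.0
    return c, ratio, angle
```

For points uniformly sampled in the parametric angle, the covariance eigenvalues are proportional to the squared semi-axes, so the square-rooted eigenvalue ratio gives the axis ratio directly.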

  8. COMPARING AND ANALYZING THE SIMILARITIES AND DIFFERENCES BETWEEN CPU HYPER-THREADING AND DUAL-CORE TECHNOLOGIES

    Institute of Scientific and Technical Information of China (English)

    林杰; 余建坤

    2011-01-01

    Hyper-threading and dual-core are two important technologies in the evolution of the CPU. Hyper-threading technology presents one physical processor as two "virtual" processors to reduce the idle time of the execution units and other resources, thus increasing CPU utilization. Dual-core technology encapsulates two physical processing cores in one CPU package to improve program performance. The paper describes the basic model of the CPU, analyzes the principles of hyper-threading and dual-core technology, and compares their similarities and differences from the three perspectives of system architecture, degree of parallelism, and efficiency gained.

  9. Portable Programmable Multifunction Body Fluids Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Advanced Liquid Logic proposes to develop a very capable analyzer based on its digital microfluidic technology. Such an analyzer would be:  Capable of both...

  10. Analyzing the Science and Technology Novelty Search System of Chinese Medicine and Establishing Countermeasures

    Institute of Scientific and Technical Information of China (English)

    黄薇

    2015-01-01

    This study analyzes the current science and technology novelty search system of Chinese medicine, identifies its problems and their causes, answers how to improve the quality and efficiency of novelty search work, and puts forward constructive countermeasures, such as enriching the professional lives of novelty search staff and standardizing the novelty search criteria for Chinese medicine. By analyzing and solving these problems, the novelty search system of Chinese medicine is improved. The study concludes that, before clinical research begins, a thorough and detailed novelty search must be performed by the novelty search management department to determine whether the proposed project has research value; after the research is completed, its novelty should be evaluated again through the science and technology novelty search system of Chinese medicine.

  11. Serum Protein Fingerprint of Patients with Pancreatic Cancer by SELDI Technology

    Institute of Scientific and Technical Information of China (English)

    MA Ning; GE Chun-lin; LUAN Feng-ming; YAO Dian-bo; HU Chao-jun; LI Ning; LIU Yong-feng

    2008-01-01

    Objective: To study the serum protein fingerprint of patients with pancreatic cancer and to screen for protein molecules closely related to pancreatic cancer during the onset and progression of the disease using surface-enhanced laser desorption and ionization time-of-flight mass spectrometry (SELDI-TOF-MS). Methods: Serum samples were collected from 20 pancreatic cancer patients, 20 healthy volunteers, and 18 patients with other pancreatic diseases. WCX magnetic beads and a PBSII-C protein chip reader (Ciphergen Biosystems Inc.) were used. The protein fingerprint expression of all the serum samples and the resulting profiles between cancer and normal were analyzed with the Biomarker Wizard system. Results: A group of proteomic peaks was detected. Four differentially expressed potential biomarkers were identified, with relative molecular weights of 5705 Da, 4935 Da, 5318 Da and 3243 Da. Among them, the two proteins with m/z 5705 and 5318 Da were down-regulated, and the two proteins with m/z 4935 and 3243 Da were up-regulated in pancreatic cancer. Conclusion: SELDI technology can be used to screen significantly differentially expressed proteins in the serum of pancreatic cancer patients. These proteins could be specific serum biomarkers of pancreatic cancer and have potential value for further investigation.

  12. Analyzing University-Enterprise Technology Transfer Efficiency and Its Influencing Factors in China

    Institute of Scientific and Technical Information of China (English)

    廖述梅; 徐升华

    2009-01-01

    Technology transfer from universities to enterprises is one of the key tasks in building a national innovation system. Based on panel data for 27 Chinese provinces from 2000 to 2006, this paper measures the efficiency of technology transfer from universities to enterprises and analyzes the sources of inefficiency using stochastic frontier estimation (SFE). The results suggest that overall efficiency is low, with large differences among regions, and that internal and environmental factors such as patents and per-capita R&D investment have a significant impact on technology transfer. Finally, policy suggestions for improving transfer efficiency are put forward from the perspectives of government, universities, and enterprises.

  13. Analyzing crime scene videos

    Science.gov (United States)

    Cunningham, Cindy C.; Peloquin, Tracy D.

    1999-02-01

    Since late 1996 the Forensic Identification Services Section of the Ontario Provincial Police has been actively involved in state-of-the-art image capture and the processing of video images extracted from crime scene videos. The benefits and problems of this technology for video analysis are discussed. All analysis is being conducted on SUN Microsystems UNIX computers, networked to a digital disk recorder that is used for video capture. The primary advantage of this system over traditional frame grabber technology is reviewed. Examples from actual cases are presented and the successes and limitations of this approach are explored. Suggestions to companies implementing security technology plans for various organizations (banks, stores, restaurants, etc.) will be made. Future directions for this work and new technologies are also discussed.

  14. The Intermodulation Lockin Analyzer

    CERN Document Server

    Tholen, Erik A; Forchheimer, Daniel; Schuler, Vivien; Tholen, Mats O; Hutter, Carsten; Haviland, David B

    2011-01-01

    Nonlinear systems can be probed by driving them with two or more pure tones while measuring the intermodulation products of the drive tones in the response. We describe a digital lock-in analyzer which is designed explicitly for this purpose. The analyzer is implemented on a field-programmable gate array, providing speed in analysis, real-time feedback and stability in operation. The use of the analyzer is demonstrated for Intermodulation Atomic Force Microscopy. A generalization of the intermodulation spectral technique to arbitrary drive waveforms is discussed.
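
The two-tone measurement principle can be illustrated numerically: a weak cubic nonlinearity driven at f1 and f2 produces third-order intermodulation products at 2f1 - f2 and 2f2 - f1. This numpy sketch of the spectral idea assumes on-bin tone frequencies; it is not the FPGA implementation described in the paper.

```python
import numpy as np

fs = 1000.0                         # sample rate (Hz)
t = np.arange(0, 10, 1 / fs)        # 10 s record -> 0.1 Hz bins
f1, f2 = 100.0, 110.0               # the two drive tones

x = np.cos(2 * np.pi * f1 * t) + np.cos(2 * np.pi * f2 * t)
y = x + 0.1 * x**3                  # weak cubic nonlinearity

spec = np.abs(np.fft.rfft(y)) / len(t)
freqs = np.fft.rfftfreq(len(t), 1 / fs)

def amp(f):
    """One-sided amplitude of the spectrum at the bin nearest f."""
    return spec[np.argmin(np.abs(freqs - f))]
```

The response contains energy at 90 Hz (2f1 - f2) and 120 Hz (2f2 - f1) even though neither frequency is driven, which is exactly what the lock-in analyzer measures.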

  15. Miniature mass analyzer

    CERN Document Server

    Cuna, C; Lupsa, N; Cuna, S; Tuzson, B

    2003-01-01

    The paper presents the concept of different mass analyzers that were specifically designed as small-dimension instruments able to detect the main environmental pollutants with great sensitivity and accuracy. Mass spectrometers are very suitable instruments for the chemical and isotopic analysis needed in environmental surveillance. Usually, this is done by sampling the soil, air or water followed by laboratory analysis. To avoid drawbacks caused by sample alteration during the sampling process and transport, 'in situ' analysis is preferred. Theoretically, any type of mass analyzer can be miniaturized, but some are more appropriate than others. Quadrupole mass filters and traps, magnetic sector, time-of-flight and ion cyclotron mass analyzers can be successfully shrunk; for each of them some performance is sacrificed, but we must know which parameters must be kept unchanged. To satisfy the miniaturization criteria of the analyzer, it is necessary to use asymmetrical geometries, with ion beam obl...

  16. Analyzing in the present

    DEFF Research Database (Denmark)

    Revsbæk, Line; Tanggaard, Lene

    2015-01-01

    The article presents a notion of "analyzing in the present" as a source of inspiration in analyzing qualitative research materials. The term emerged from extensive listening to interview recordings during everyday commuting to university campus. Paying attention to the way different parts of various interviews conveyed diverse significance to the listening researcher at different times became a method of continuously opening up the empirical material in a reflexive, breakdown-oriented process of analysis. We argue that situating analysis in the present of analyzing emphasizes and acknowledges the interdependency between researcher and researched. On this basis, we advocate an explicit "open-state-of-mind" listening as a key aspect of analyzing qualitative material, often described only as a matter of reading transcribed empirical materials, reading theory, and writing. The article contributes...

  17. Analyzing binding data.

    Science.gov (United States)

    Motulsky, Harvey J; Neubig, Richard R

    2010-07-01

    Measuring the rate and extent of radioligand binding provides information on the number of binding sites and on the affinity and accessibility of these binding sites for various drugs. This unit explains how to design and analyze such experiments.

  18. Software Design Analyzer System

    Science.gov (United States)

    Tausworthe, R. C.

    1985-01-01

    CRISP80 software design analyzer system a set of programs that supports top-down, hierarchic, modular structured design, and programing methodologies. CRISP80 allows for expression of design as picture of program.

  19. The technology of morphological analysis using blood cell analyzers and microscopic screening examination

    Institute of Scientific and Technical Information of China (English)

    丛玉隆

    2014-01-01

    This paper reviews the development of morphological analysis technology in blood cell analyzers over the past 40 years and clarifies the clinical value, advantages, and limitations of white blood cell differential counting on analyzers of different generations. It makes clear that manual microscopic examination remains necessary for clinical diagnosis and cannot be replaced by automated analyzers, however precise or intelligent, although instruments of different grades differ in their "screening" ability. The paper recommends choosing appropriate examination technologies from an evidence-based-medicine perspective and purchasing equipment that both meets clinical diagnostic needs and conforms to national benefit-the-people policies and the requirements of healthcare reform. It also calls on the profession to value morphological examination and strengthen basic skills training to continuously improve the quality and academic level of clinical laboratory technicians.

  20. Analyzing the Alternatives

    Science.gov (United States)

    Grayson, Jennifer

    2010-01-01

    Technologies like solar, wind, and geothermal are exciting, but relatively new and untested in the context of universities, many of which are large enough to be cities unto themselves. And, like cities, universities count on a reliable, uninterruptible source of power. It is imperative, too, that precious dollars are wisely invested in the right…

  1. Total organic carbon analyzer

    Science.gov (United States)

    Godec, Richard G.; Kosenka, Paul P.; Smith, Brian D.; Hutte, Richard S.; Webb, Johanna V.; Sauer, Richard L.

    1991-01-01

    The development and testing of a breadboard version of a highly sensitive total-organic-carbon (TOC) analyzer are reported. Attention is given to the system components including the CO2 sensor, oxidation reactor, acidification module, and the sample-inlet system. Research is reported for an experimental reagentless oxidation reactor, and good results are reported for linearity, sensitivity, and selectivity in the CO2 sensor. The TOC analyzer is developed with gravity-independent components and is designed for minimal additions of chemical reagents. The reagentless oxidation reactor is based on electrolysis and UV photolysis and is shown to be potentially useful. The stability of the breadboard instrument is shown to be good on a day-to-day basis, and the analyzer is capable of 5 sample analyses per day for a period of about 80 days. The instrument can provide accurate TOC and TIC measurements over a concentration range of 20 ppb to 50 ppm C.

  2. Analyzing radioligand binding data.

    Science.gov (United States)

    Motulsky, Harvey; Neubig, Richard

    2002-08-01

    Radioligand binding experiments are easy to perform, and provide useful data in many fields. They can be used to study receptor regulation, discover new drugs by screening for compounds that compete with high affinity for radioligand binding to a particular receptor, investigate receptor localization in different organs or regions using autoradiography, categorize receptor subtypes, and probe mechanisms of receptor signaling, via measurements of agonist binding and its regulation by ions, nucleotides, and other allosteric modulators. This unit reviews the theory of receptor binding and explains how to analyze experimental data. Since binding data are usually best analyzed using nonlinear regression, this unit also explains the principles of curve fitting with nonlinear regression.
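
The nonlinear regression the unit recommends can be sketched for the simplest case, a one-site saturation binding model B = Bmax * L / (Kd + L). The data and parameter values below are synthetic, noise-free numbers chosen for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def one_site(L, Bmax, Kd):
    """Specific binding at free radioligand concentration L (one-site model)."""
    return Bmax * L / (Kd + L)

# Synthetic saturation data: assumed Bmax = 1000, Kd = 2 (arbitrary units).
L = np.array([0.25, 0.5, 1.0, 2.0, 4.0, 8.0, 16.0])
B = one_site(L, 1000.0, 2.0)

# Nonlinear least-squares fit; p0 is a rough starting guess.
(params, _) = curve_fit(one_site, L, B, p0=[500.0, 1.0])
Bmax_fit, Kd_fit = params
```

With real, noisy counts the fit would return standard errors as well; the principle of fitting the untransformed model directly (rather than a Scatchard linearization) is the point of the unit.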

  3. Technology

    Directory of Open Access Journals (Sweden)

    Xu Jing

    2016-01-01

    Full Text Available The traditional answer card reading method uses OMR (Optical Mark Reader) hardware, which typically requires special-purpose cards, is not very versatile, and is costly. To address these problems, an answer card identification method based on pattern recognition is proposed. The Line Segment Detector method is used to detect the tilt of the image; images with tilt are corrected by rotation, and the answer regions on the answer sheet are finally located and detected. Pattern recognition technology enables automatic reading with high accuracy and fast detection.
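
The tilt-then-rotate step can be sketched without any image library: estimate the page tilt from the endpoints of detected near-horizontal segments (as a Line Segment Detector would return them) and rotate coordinates back. This is a geometric sketch of the correction step, not the paper's implementation.

```python
import numpy as np

def tilt_angle(segments):
    """Median tilt in degrees of near-horizontal segments (x1, y1, x2, y2)."""
    angles = [np.degrees(np.arctan2(y2 - y1, x2 - x1))
              for x1, y1, x2, y2 in segments]
    return float(np.median(angles))

def rotate_points(pts, angle_deg, center):
    """Rotate points back by the detected tilt about a given center."""
    phi = np.radians(-angle_deg)
    R = np.array([[np.cos(phi), -np.sin(phi)],
                  [np.sin(phi),  np.cos(phi)]])
    return (np.asarray(pts, dtype=float) - center) @ R.T + center
```

In the full pipeline the same rotation would be applied to the image raster before locating the answer regions.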

  4. Fuel analyzer; Analisador de combustiveis

    Energy Technology Data Exchange (ETDEWEB)

    Cozzolino, Roberval [RS Motors, Indaiatuba, SP (Brazil)

    2008-07-01

    The current technology 'COMBUSTIMETRO' aims to examine the fuel through the performance of the engine, since the role of the fuel is to produce energy for the combustion engine in an amount directly proportional to the quality and type of fuel. The 'COMBUSTIMETRO' has an engine that always keeps the same air intake, fuel intake, and ignition timing. Its operation is monitored by sensors (Lambda probe, RPM, and gas analyzer) connected to a processor that performs calculations, records the information, and generates reports and graphs. (author)

  5. Analyzing Workforce Education. Monograph.

    Science.gov (United States)

    Texas Community & Technical Coll. Workforce Education Consortium.

    This monograph examines the issue of task analysis as used in workplace literacy programs, debating the need for it and how to perform it in a rapidly changing environment. Based on experiences of community colleges in Texas, the report analyzes ways that task analysis can be done and how to implement work force education programs more quickly.…

  6. List mode multichannel analyzer

    Science.gov (United States)

    Archer, Daniel E.; Luke, S. John; Mauger, G. Joseph; Riot, Vincent J.; Knapp, David A.

    2007-08-07

    A digital list mode multichannel analyzer (MCA) built around a programmable FPGA device for onboard data analysis and on-the-fly modification of system detection/operating parameters, and capable of collecting and processing data in very small time bins (<1 millisecond) when used in histogramming mode, or in list mode as a list mode MCA.
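
The histogramming-mode behavior described above can be illustrated offline: list-mode data is one timestamped event per row, and binning it into small time bins and energy channels yields the MCA histogram. A numpy sketch with illustrative bin sizes, not the FPGA firmware:

```python
import numpy as np

def histogram_list_mode(timestamps, channels, t_bin=1e-3, n_channels=1024):
    """Bin list-mode events (timestamp in seconds, ADC channel) into a 2-D
    array counts[time_bin, channel], using <1 ms time bins by default."""
    t_edges = np.arange(0.0, timestamps.max() + t_bin, t_bin)
    c_edges = np.arange(n_channels + 1)
    counts, _, _ = np.histogram2d(timestamps, channels,
                                  bins=[t_edges, c_edges])
    return counts.astype(int)
```

Keeping the raw event list, as a list-mode MCA does, means the same data can later be re-binned with any time resolution.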

  7. Centrifugal analyzer development

    International Nuclear Information System (INIS)

    The development of the centrifugal fast analyzer (CFA) is reviewed. The development of a miniature CFA with computer data analysis is reported and applications for automated diagnostic chemical and hematological assays are discussed. A portable CFA system with microprocessor was adapted for field assays of air and water samples for environmental pollutants, including ammonia, nitrates, nitrites, phosphates, sulfates, and silica. 83 references

  8. Analyzing business models

    DEFF Research Database (Denmark)

    Nielsen, Christian

    2014-01-01

    ... because the costs of processing and analyzing it exceed the benefits, indicating bounded rationality. Hutton (2002) concludes that the analyst community’s inability to raise important questions on quality of management and the viability of its business model inevitably led to the Enron debacle. There seems...

  9. Lear CAN analyzer

    OpenAIRE

    Peiró Ibañez, Felipe

    2013-01-01

    Since it was introduced in the automotive industry, the protocol CAN (Controller Area Network) has been widely used for its benefits. This has led many companies to offer several hardware and software solutions in order to monitor the communications that gives this protocol. The current master thesis presents the Lear CAN Analyzer as a software tool developed within the company LEAR Corporation. It is designed to be used in the automobile industry as a complement or substitute for other co...

  10. Analyzing business process management

    OpenAIRE

    Skjæveland, Børge

    2013-01-01

    Within the Oil & Gas Industry, the market is constantly growing more competitive, forcing companies to continually adapt to changes. Companies need to cut costs and improve the business efficiency. One way of successfully managing these challenges is to implement business process management in the organization. This thesis will analyze how Oceaneering Asset Integrity AS handled the implementation of a Business Process Management System and the effects it had on the employees. The main goal...

  11. Radioisotope analyzer of barium

    International Nuclear Information System (INIS)

    Principle of operation and construction of radioisotope barium sulphate analyzer type MZB-2 for fast determination of barium sulphate content in barite ores and enrichment products are described. The gauge equipped with Am-241 and a scintillation detector enables measurement of barium sulphate content in prepared samples of barite ores in the range 60% - 100% with the accuracy of 1%. The gauge is used in laboratories of barite mine and ore processing plant. 2 refs., 2 figs., 1 tab. (author)

  12. IPv6 Protocol Analyzer

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    With the emergence of the next generation Internet protocol (IPv6), it is expected to replace the current version of the Internet protocol (IPv4), whose address space will be exhausted in the near future. Besides providing adequate address space, the new 128-bit IP includes other new features such as address autoconfiguration, quality of service, simpler routing, security, mobility and multicasting. Current protocol analyzers are not able to handle IPv6 packets. This paper focuses on developing a protocol analyzer that decodes IPv6 packets. The IPv6 protocol analyzer is an application module that decodes an IPv6 packet and provides a detailed breakdown of its construction. It must understand the detailed construction of IPv6 and provide a high-level abstraction of the bits and bytes of the IPv6 packet. It thus increases a network administrator's understanding of the protocol and helps him or her solve protocol-related problems in an IPv6 network environment.
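
The fixed 40-byte IPv6 header decode that such an analyzer performs can be sketched with the standard library alone (field layout per RFC 2460); this minimal version assumes no extension headers and is not the module described in the paper.

```python
import struct
import ipaddress

def decode_ipv6_header(pkt: bytes):
    """Decode the fixed 40-byte IPv6 header into its fields."""
    vtf, plen, nxt, hlim = struct.unpack("!IHBB", pkt[:8])
    return {
        "version": vtf >> 28,                  # should be 6
        "traffic_class": (vtf >> 20) & 0xFF,
        "flow_label": vtf & 0xFFFFF,
        "payload_length": plen,
        "next_header": nxt,                    # e.g. 58 = ICMPv6
        "hop_limit": hlim,
        "src": ipaddress.IPv6Address(pkt[8:24]),
        "dst": ipaddress.IPv6Address(pkt[24:40]),
    }
```

A full analyzer would then dispatch on `next_header` to walk any extension-header chain before decoding the payload.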

  13. Option-Game Approach to Analyze Technology Innovation Investment With Cost Asymmetry Under the Fuzzy Environment

    Institute of Scientific and Technical Information of China (English)

    谭英双; 衡爱民; 龙勇; 吴宏伟; 江礼梅

    2011-01-01

    Based on an asymmetric duopoly option-game model with investment cost asymmetry, this research treats the present value of profit flows and the sunk investment cost as trapezoidal fuzzy numbers and analyzes firms' technology innovation investment strategies. Fuzzy expressions for the investment value and investment threshold of followers and leaders under the fuzzy environment are constructed, and numerical analysis is conducted. It is concluded that an optimal investment strategy still exists in the fuzzy environment: as the expected value of the trapezoidal-fuzzy sunk investment cost increases, the investment value of the firm declines while the investment threshold rises. This offers an explanation of investment decisions under fuzzy environments.

  14. Method of analyzing impact features of media in vibration mills based on virtual prototyping technology

    Institute of Scientific and Technical Information of China (English)

    张晓钟; 张西仲; 王晓明

    2011-01-01

    The performance control of the vibration mill is analyzed with respect to the quality control of the ground materials. Regarding the impact features of the grinding media in the tube, which are the core of this control, the characteristics of physical prototype experiments, theoretical analysis, and the virtual prototyping method are compared and contrasted. By applying a virtual vibration mill and parameterization technology, the relationships among the impact features of the media in the grinding tube, their kinematic behavior, the dynamics of the mill, and the technical parameters are obtained. Based on the physicochemical and crushing properties of the ground material, the impact features of the media are determined, and from these the dynamic parameters, structural parameters, technical parameters, performance control methods, and quality control of the vibration mill are determined. A virtual prototyping method for studying vibration and impact problems is introduced.

  15. Fluorescence analyzer for lignin

    Science.gov (United States)

    Berthold, John W.; Malito, Michael L.; Jeffers, Larry

    1993-01-01

    A method and apparatus for measuring lignin concentration in a sample of wood pulp or black liquor comprises a light emitting arrangement for emitting an excitation light through optical fiber bundles into a probe which has an undiluted sensing end facing the sample. The excitation light causes the lignin concentration to produce fluorescent emission light which is then conveyed through the probe to analyzing equipment which measures the intensity of the emission light. This invention was made with Government support under Contract Number DOE: DE-FC05-90CE40905 awarded by the Department of Energy (DOE). The Government has certain rights in this invention.

  16. Analyzing Chinese Financial Reporting

    Institute of Scientific and Technical Information of China (English)

    SABRINA; ZHANG

    2008-01-01

    If the world’s capital markets could use a harmonized accounting framework, it would not be necessary to compare two or more sets of accounting standards. However, there is much to do before this becomes reality. This article aims to present a general overview of China’s Generally Accepted Accounting Principles (GAAP), U.S. Generally Accepted Accounting Principles and International Financial Reporting Standards (IFRS), and to analyze the differences among IFRS, U.S. GAAP and China GAAP using fixed assets as an example.

  17. Field Deployable DNA analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Wheeler, E; Christian, A; Marion, J; Sorensen, K; Arroyo, E; Vrankovich, G; Hara, C; Nguyen, C

    2005-02-09

    This report details the feasibility of a field deployable DNA analyzer. Steps for swabbing cells from surfaces and extracting DNA in an automatable way are presented. Since enzymatic amplification reactions are highly sensitive to environmental contamination, sample preparation is a crucial step to make an autonomous deployable instrument. We perform sample clean up and concentration in a flow through packed bed. For small initial samples, whole genome amplification is performed in the packed bed resulting in enough product for subsequent PCR amplification. In addition to DNA, which can be used to identify a subject, protein is also left behind, the analysis of which can be used to determine exposure to certain substances, such as radionuclides. Our preparative step for DNA analysis left behind the protein complement as a waste stream; we determined to learn if the proteins themselves could be analyzed in a fieldable device. We successfully developed a two-step lateral flow assay for protein analysis and demonstrate a proof of principle assay.

  18. Residual gas analyzer calibration

    Science.gov (United States)

    Lilienkamp, R. H.

    1972-01-01

    A technique which employs known gas mixtures to calibrate the residual gas analyzer (RGA) is described. The mass spectra from the RGA are recorded for each gas mixture. This mass spectra data and the mixture composition data each form a matrix. From the two matrices the calibration matrix may be computed. The matrix mathematics requires the number of calibration gas mixtures be equal to or greater than the number of gases included in the calibration. This technique was evaluated using a mathematical model of an RGA to generate the mass spectra. This model included shot noise errors in the mass spectra. Errors in the gas concentrations were also included in the valuation. The effects of these errors was studied by varying their magnitudes and comparing the resulting calibrations. Several methods of evaluating an actual calibration are presented. The effects of the number of gases in then, the composition of the calibration mixture, and the number of mixtures used are discussed.
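
The matrix step can be sketched in numpy: with a composition matrix X (mixtures x gases) and a spectra matrix M (mixtures x mass peaks), least squares gives the calibration matrix K, which then inverts an unknown spectrum back to gas concentrations. The cracking-pattern numbers below are illustrative, not real RGA data.

```python
import numpy as np

# Assumed cracking-pattern matrix K_true (gases x mass peaks); illustrative only.
K_true = np.array([[1.00, 0.10, 0.00],
                   [0.20, 1.00, 0.05],
                   [0.00, 0.30, 1.00]])

# Calibration gas mixtures (rows) with known compositions; as the abstract
# notes, there must be at least as many mixtures as gases.
X = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.5, 0.3, 0.2]])
M = X @ K_true                          # spectra recorded for each mixture

# Least-squares calibration: solve X @ K = M for K.
K, *_ = np.linalg.lstsq(X, M, rcond=None)

# Invert an unknown spectrum back to concentrations: solve c @ K = spectrum.
unknown = np.array([0.2, 0.5, 0.3]) @ K_true
conc, *_ = np.linalg.lstsq(K.T, unknown, rcond=None)
```

With more mixtures than gases the extra rows average out measurement noise, which is why the technique allows (and benefits from) an over-determined calibration set.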

  19. Analyzing architecture articles

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    In the present study, we express the quality, function, and characteristics of architecture to help people comprehensively understand what architecture is. We also reveal the problems and conflicts found in population, land, water resources, pollution, energy, and the organizational systems of construction. China's economy is transforming, and we should focus on cities, the architectural environment, energy conservation, emission reduction, and low-carbon output that will result in successful green development. We should analyze development macroscopically and microscopically: from the natural environment to the artificial environment, and from the relationship between human beings and nature to the combination of social ecology in cities and farmlands. We must learn to develop and control them harmoniously and scientifically to provide a foundation for the methods used in architecture research.

  20. PDA: Pooled DNA analyzer

    Directory of Open Access Journals (Sweden)

    Lin Chin-Yu

    2006-04-01

    Background: Association mapping using abundant single nucleotide polymorphisms (SNPs) is a powerful tool for identifying disease susceptibility genes for complex traits and exploring possible genetic diversity. Genotyping large numbers of SNPs individually is performed routinely but is cost-prohibitive for large-scale genetic studies. DNA pooling is a reliable and cost-saving alternative genotyping method. However, no software had been developed for complete pooled-DNA analyses, including data standardization, allele frequency estimation, and single/multipoint DNA pooling association tests. This motivated the development of the software PDA (Pooled DNA Analyzer) to analyze pooled-DNA data. Results: PDA is implemented in the MATLAB® language but can also be executed on a Windows system without installing MATLAB®. PDA provides estimates of the coefficient of preferential amplification and allele frequency. It offers an extended single-point association test, which can compare allele frequencies between two DNA pools constructed under different experimental conditions. Moreover, PDA provides novel chromosome-wide multipoint association tests based on p-value combinations and a sliding-window concept. This new multipoint testing procedure overcomes a computational bottleneck of conventional haplotype-oriented multipoint methods in DNA pooling analyses and can handle data sets having a large pool size and/or large numbers of polymorphic markers. All of the PDA functions are illustrated in four bona fide examples. Conclusion: PDA is simple to operate and does not require that users have a strong statistical background. The software is available at http://www.ibms.sinica.edu.tw/%7Ecsjfann/first%20flow/pda.htm.
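
    A sliding-window p-value combination of the kind PDA describes can be sketched with Fisher's method. This is an illustrative sketch, not PDA's code: the marker p-values and window size are made up, and the chi-square survival function is written in closed form, which holds for even degrees of freedom.

```python
import math

# Fisher's statistic X = -2 * sum(ln p_i) is chi-square with 2k df;
# for even df the survival function is exp(-x/2) * sum_{i<k} (x/2)^i / i!.

def fisher_combined_p(pvals):
    k = len(pvals)
    x = -2.0 * sum(math.log(p) for p in pvals)
    half = x / 2.0
    return math.exp(-half) * sum(half ** i / math.factorial(i) for i in range(k))

def sliding_window_scan(pvals, window):
    """Combine single-marker p-values over every window of adjacent markers."""
    return [fisher_combined_p(pvals[i:i + window])
            for i in range(len(pvals) - window + 1)]

# hypothetical single-point association p-values along a chromosome
marker_p = [0.40, 0.03, 0.01, 0.02, 0.55, 0.60]
print([round(p, 4) for p in sliding_window_scan(marker_p, 3)])
```

    The cluster of small p-values produces the smallest combined p-value in the window covering it, which is the signal a chromosome-wide scan looks for.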

  1. Diversity of endophytic bacteria in walnut analyzed by Illumina MiSeq high-throughput sequencing technology

    Institute of Scientific and Technical Information of China (English)

    陈泽斌; 李冰; 王定康; 余磊; 徐胜光; 任禛; 靳松; 张永福; 彭声静

    2015-01-01

    In this study, the species abundance and alpha diversity of walnut endophytic bacteria were analyzed by Illumina MiSeq high-throughput sequencing of the 16S rDNA-V4 region. Software such as Uparse, Flash, and Qiime was used to sort the sequences and count operational taxonomic units (OTUs). The sample yielded 63,183 effective sequences and 103 OTUs. The rarefaction curves showed that sequencing depth was adequate and the number of OTUs was close to saturation; the Chao1 index was 105.143 and the Shannon diversity index was 1.823. The majority of the endophytic bacteria belonged to Sphingomonas (27.27%), Halomonas (27.27%), and Agrobacterium (45.45%), which were therefore the dominant genera in walnut. Illumina MiSeq high-throughput sequencing provides more accurate and scientific data resources for the study of endophytic bacteria.
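
    The alpha-diversity indices reported above (Shannon, Chao1) are straightforward to compute from an OTU count table. A minimal sketch follows; the counts are hypothetical, not the walnut data.

```python
import math

def shannon(counts):
    """Shannon diversity H = -sum(p_i * ln p_i) over nonzero OTUs."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

def chao1(counts):
    """Chao1 richness: observed OTUs plus a singleton/doubleton correction."""
    s_obs = sum(1 for c in counts if c > 0)
    f1 = sum(1 for c in counts if c == 1)   # singletons
    f2 = sum(1 for c in counts if c == 2)   # doubletons
    if f2 == 0:
        return s_obs + f1 * (f1 - 1) / 2.0  # bias-corrected form
    return s_obs + f1 * f1 / (2.0 * f2)

# hypothetical OTU read counts for one sample
otu_counts = [500, 300, 120, 40, 20, 10, 5, 2, 2, 1, 1, 1]
print(round(shannon(otu_counts), 3), round(chao1(otu_counts), 3))
```

    A Chao1 estimate close to the observed OTU count, as in the walnut sample (105.143 vs. 103), indicates the rarefaction curve is near saturation.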

  2. Optimizing the Extraction of Volatile Oil from the Flower of Gomphrena globosa and Analyzing its Chemical Composition

    Institute of Scientific and Technical Information of China (English)

    黄良勤; 王刚

    2014-01-01

    The optimal process for extracting volatile oil from the flower of Gomphrena globosa L. was determined and its chemical composition analyzed. The essential oil was extracted by microwave-assisted extraction (MAE); with oil yield as the index, extraction time, liquid-to-solid ratio, and microwave power were investigated, and the process was optimized by orthogonal test. The oil was analyzed by gas chromatography-mass spectrometry (GC-MS), the percentages of its components were determined by peak-area normalization, and their structures were identified against the Wiley and NIST05 mass spectral libraries. The optimal conditions were a solid-to-liquid ratio of 1:14, an extraction time of 5 h, and a microwave power of 700 W; under these conditions the oil yield reached 1.34%. Thirty-three components were identified, accounting for 98.19% of the total oil. The study provides a scientific basis for the further development and utilization of Gomphrena globosa.
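
    The peak-area normalization used in the GC-MS analysis above is simply each component's peak area over the summed area of all identified peaks. A minimal sketch with hypothetical component names and areas:

```python
def normalize(areas):
    """Percent composition by peak-area normalization."""
    total = sum(areas.values())
    return {name: 100.0 * a / total for name, a in areas.items()}

# hypothetical integrated peak areas from a chromatogram
peak_areas = {"component_a": 5.2e6, "component_b": 3.1e6, "component_c": 1.7e6}
percentages = normalize(peak_areas)
print({k: round(v, 2) for k, v in percentages.items()})
# → {'component_a': 52.0, 'component_b': 31.0, 'component_c': 17.0}
```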

  3. Pseudostupidity and analyzability.

    Science.gov (United States)

    Cohn, L S

    1989-01-01

    This paper seeks to heighten awareness of pseudostupidity and the potential analyzability of patients who manifest it by defining and explicating it, reviewing the literature, and presenting in detail the psychoanalytic treatment of a pseudostupid patient. Pseudostupidity is caused by an inhibition of the integration and synthesis of thoughts resulting in a discrepancy between intellectual capacity and apparent intellect. The patient's pseudostupidity was determined in part by his need to prevent his being more successful than father, i.e., defeating his oedipal rival. Knowing and learning were instinctualized. The patient libidinally and defensively identified with father's passive, masochistic position. He needed to frustrate the analyst as he had felt excited and frustrated by his parents' nudity and thwarted by his inhibitions. He wanted to cause the analyst to feel as helpless as he, the patient, felt. Countertransference frustration was relevant and clinically useful in the analysis. Interpretation of evolving relevant issues led to more anxiety and guilt, less pseudostupidity, a heightened alliance, and eventual working through. Negative therapeutic reactions followed the resolution of pseudostupidity. PMID:2708771

  4. Downhole Fluid Analyzer Development

    Energy Technology Data Exchange (ETDEWEB)

    Bill Turner

    2006-11-28

    A novel fiber optic downhole fluid analyzer has been developed for operation in production wells. This device will allow real-time determination of the oil, gas and water fractions of fluids from different zones in a multizone or multilateral completion environment. The device uses near infrared spectroscopy and induced fluorescence measurement to unambiguously determine the oil, water and gas concentrations at all but the highest water cuts. The only downhole components of the system are the fiber optic cable and windows. All of the active components--light sources, sensors, detection electronics and software--will be located at the surface, and will be able to operate multiple downhole probes. Laboratory testing has demonstrated that the sensor can accurately determine oil, water and gas fractions with a less than 5 percent standard error. Once installed in an intelligent completion, this sensor will give the operating company timely information about the fluids arising from various zones or multilaterals in a complex completion pattern, allowing informed decisions to be made on controlling production. The research and development tasks are discussed along with a market analysis.

  5. Analyzing Spacecraft Telecommunication Systems

    Science.gov (United States)

    Kordon, Mark; Hanks, David; Gladden, Roy; Wood, Eric

    2004-01-01

    Multi-Mission Telecom Analysis Tool (MMTAT) is a C-language computer program for analyzing proposed spacecraft telecommunication systems. MMTAT utilizes parameterized input and computational models that can be run on standard desktop computers to perform fast and accurate analyses of telecommunication links. MMTAT is easy to use and can easily be integrated with other software applications and run as part of almost any computational simulation. It is distributed as either a stand-alone application program with a graphical user interface or a linkable library with a well-defined set of application programming interface (API) calls. As a stand-alone program, MMTAT provides both textual and graphical output. The graphs make it possible to understand, quickly and easily, how telecommunication performance varies with variations in input parameters. A delimited text file that can be read by any spreadsheet program is generated at the end of each run. The API in the linkable-library form of MMTAT enables the user to control simulation software and to change parameters during a simulation run. Results can be retrieved either at the end of a run or by use of a function call at any time step.
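
    The core of any telecom link analysis of the kind MMTAT automates is a link budget. The sketch below is illustrative only, with made-up parameter values, and is not MMTAT's model: received power is EIRP plus receive gain minus free-space path loss and miscellaneous losses.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def free_space_path_loss_db(distance_m, freq_hz):
    """FSPL(dB) = 20 * log10(4 * pi * d * f / c)."""
    return 20.0 * math.log10(4.0 * math.pi * distance_m * freq_hz / C)

def received_power_dbm(eirp_dbm, rx_gain_dbi, distance_m, freq_hz, misc_loss_db=0.0):
    return eirp_dbm + rx_gain_dbi - free_space_path_loss_db(distance_m, freq_hz) - misc_loss_db

# hypothetical X-band downlink: 2,000 km range, 45 dBm EIRP, 60 dBi ground antenna
p_rx = received_power_dbm(eirp_dbm=45.0, rx_gain_dbi=60.0,
                          distance_m=2.0e6, freq_hz=8.4e9)
print(round(p_rx, 1))
```

    Sweeping the distance or frequency and plotting `p_rx` reproduces, in miniature, the parameter-variation graphs that MMTAT generates.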

  6. Analyzing Pseudophosphatase Function.

    Science.gov (United States)

    Hinton, Shantá D

    2016-01-01

    Pseudophosphatases regulate signal transduction cascades, but their mechanisms of action remain enigmatic. Reflecting this mystery, the prototypical pseudophosphatase STYX (phospho-serine-threonine/tyrosine-binding protein) was named with allusion to the river of the dead in Greek mythology to emphasize that these molecules are "dead" phosphatases. Although proteins with STYX domains do not catalyze dephosphorylation, this in no way precludes their having other functions as integral elements of signaling networks. Thus, understanding their roles in signaling pathways may mark them as potential novel drug targets. This chapter outlines common strategies used to characterize the functions of pseudophosphatases, using as an example MK-STYX [mitogen-activated protein kinase (MAPK) phospho-serine-threonine/tyrosine binding], which has been linked to tumorigenesis, apoptosis, and neuronal differentiation. We start with the importance of "restoring" (when possible) phosphatase activity in a pseudophosphatase so that the active mutant may be used as a comparison control throughout immunoprecipitation and mass spectrometry analyses. To this end, we provide protocols for site-directed mutagenesis, mammalian cell transfection, co-immunoprecipitation, phosphatase activity assays, and immunoblotting that we have used to investigate MK-STYX and the active mutant MK-STYXactive. We also highlight the importance of utilizing RNA interference (RNAi) "knockdown" technology to determine a cellular phenotype in various cell lines. Therefore, we outline our protocols for introducing short hairpin RNA (shRNA) expression plasmids into mammalian cells and quantifying knockdown of gene expression with real-time quantitative PCR (qPCR). A combination of cellular, molecular, biochemical, and proteomic techniques has served as powerful tools in identifying novel functions of the pseudophosphatase MK-STYX. Likewise, the information provided here should be a helpful guide to elucidating the functions of other pseudophosphatases.

  8. Digital Microfluidics Sample Analyzer

    Science.gov (United States)

    Pollack, Michael G.; Srinivasan, Vijay; Eckhardt, Allen; Paik, Philip Y.; Sudarsan, Arjun; Shenderov, Alex; Hua, Zhishan; Pamula, Vamsee K.

    2010-01-01

    Three innovations address the needs of the medical world with regard to microfluidic manipulation and testing of physiological samples, in ways that can benefit point-of-care needs for patients such as premature infants, for whom drawing blood for continuous tests can itself be life-threatening, and the need for expedited results. A chip with sample injection elements, reservoirs (and waste), droplet-formation structures, fluidic pathways, mixing areas, and optical detection sites was fabricated to test the various components of the microfluidic platform, both individually and in integrated fashion. The droplet control system permits a user to control droplet microactuator system functions, such as droplet operations and detector operations, and the programming system allows a user to develop software routines for controlling those functions. A chip is incorporated into the system with a controller, a detector, input and output devices, and software. A novel filler-fluid formulation is used for the transport of droplets with high protein concentrations. Novel assemblies for detection of photons from an on-chip droplet are present, as well as novel systems for conducting various assays, such as immunoassays and PCR (polymerase chain reaction). The lab-on-a-chip (a.k.a. lab-on-a-printed-circuit-board) processes physiological samples and comprises a system for automated, multi-analyte measurements using sub-microliter samples of human serum. The invention also relates to a diagnostic chip and system including the chip that performs many of the routine operations of a central lab-based chemistry analyzer, integrating, for example, colorimetric assays (e.g., for proteins), chemiluminescence/fluorescence assays (e.g., for enzymes, electrolytes, and gases), and/or conductometric assays (e.g., for hematocrit on plasma and whole blood) on a single chip platform.

  9. Managing healthcare information: analyzing trust.

    Science.gov (United States)

    Söderström, Eva; Eriksson, Nomie; Åhlfeldt, Rose-Mharie

    2016-08-01

    Purpose - The purpose of this paper is to analyze two case studies with a trust matrix tool, to identify trust issues related to electronic health records. Design/methodology/approach - A qualitative research approach is applied using two case studies. The data analysis of these studies generated a problem list, which was mapped to a trust matrix. Findings - Results demonstrate flaws in current practices and point to achieving balance among organizational, person, and technology trust perspectives. The analysis revealed three challenge areas: to achieve higher trust in patient-focused healthcare; to improve communication between patients and healthcare professionals; and to establish clear terminology. By taking trust into account, a more holistic perspective on healthcare can be achieved, where trust can be obtained and optimized. Research limitations/implications - A trust matrix is tested and shown to identify trust problems on different levels and relating to trusting beliefs. Future research should elaborate and more fully address issues within the three identified challenge areas. Practical implications - The trust matrix's usefulness as a tool for organizations to analyze trust problems and issues is demonstrated. Originality/value - Healthcare trust issues are captured to a greater extent and from previously uncharted perspectives. PMID:27477934

  10. Crew Activity Analyzer

    Science.gov (United States)

    Murray, James; Kirillov, Alexander

    2008-01-01

    The crew activity analyzer (CAA) is a system of electronic hardware and software for automatically identifying patterns of group activity among crew members working together in an office, cockpit, workshop, laboratory, or other enclosed space. The CAA synchronously records multiple streams of data from digital video cameras, wireless microphones, and position sensors, then plays back and processes the data to identify activity patterns specified by human analysts. The processing greatly reduces the amount of time that the analysts must spend in examining large amounts of data, enabling the analysts to concentrate on subsets of data that represent activities of interest. The CAA has potential for use in a variety of governmental and commercial applications, including planning for crews for future long space flights, designing facilities wherein humans must work in proximity for long times, improving crew training and measuring crew performance in military settings, human-factors and safety assessment, development of team procedures, and behavioral and ethnographic research. The data-acquisition hardware of the CAA (see figure) includes two video cameras: an overhead one aimed upward at a paraboloidal mirror on the ceiling and one mounted on a wall aimed in a downward slant toward the crew area. As many as four wireless microphones can be worn by crew members. The audio signals received from the microphones are digitized, then compressed in preparation for storage. Approximate locations of as many as four crew members are measured by use of a Cricket indoor location system. [The Cricket indoor location system includes ultrasonic/radio beacon and listener units. A Cricket beacon (in this case, worn by a crew member) simultaneously transmits a pulse of ultrasound and a radio signal that contains identifying information. Each Cricket listener unit measures the difference between the times of reception of the ultrasound and radio signals from an identified beacon.]
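
    The Cricket ranging principle described above reduces to a one-line calculation: the radio pulse arrives effectively instantly, so the ultrasound's extra flight time gives the beacon-to-listener range. A minimal sketch with illustrative timing values:

```python
SPEED_OF_SOUND = 343.0  # m/s at room temperature

def beacon_range_m(t_radio_s, t_ultrasound_s):
    """Range from the time difference between radio and ultrasound arrival."""
    return SPEED_OF_SOUND * (t_ultrasound_s - t_radio_s)

# ultrasound heard 8.75 ms after the radio packet (hypothetical)
print(round(beacon_range_m(0.0, 0.00875), 3))  # → 3.001
```

    Ranges from several fixed listeners can then be combined by trilateration to estimate the crew member's position.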

  11. Soft Decision Analyzer

    Science.gov (United States)

    Steele, Glen; Lansdowne, Chatwin; Zucha, Joan; Schlensinger, Adam

    2013-01-01

    The Soft Decision Analyzer (SDA) is an instrument that combines hardware, firmware, and software to perform realtime closed-loop end-to-end statistical analysis of single- or dual-channel serial digital RF communications systems operating in very low signal-to-noise conditions. As an innovation, the unique SDA capabilities allow it to perform analysis of situations where the receiving communication system slips bits due to low signal-to-noise conditions or experiences constellation rotations resulting in channel polarity inversions or channel assignment swaps. The SDA's closed-loop detection allows it to instrument a live system and correlate observations with frame, codeword, and packet losses, as well as Quality of Service (QoS) and Quality of Experience (QoE) events. The SDA's abilities are not confined to performing analysis in low signal-to-noise conditions; its analysis provides in-depth insight into a communication system's receiver performance in a variety of operating conditions. The SDA incorporates two techniques for identifying slips. The first is an examination of the received data stream's content relative to the transmitted data content, and the second is a direct examination of the receiver's recovered clock signals relative to a reference. Both techniques provide benefits in different ways and allow the communication engineer evaluating test results increased confidence in and understanding of receiver performance. Direct examination of data contents is performed by two different data techniques, power correlation or a modified Massey correlation, and can be applied to soft decision data widths 1 to 12 bits wide over a correlation depth ranging from 16 to 512 samples. The SDA detects receiver bit slips within a 4-bit window and can handle systems with up to four quadrants (QPSK, SQPSK, and BPSK systems). The SDA continuously monitors correlation results to characterize slips and quadrant change and is capable of performing analysis even when the …
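
    The first slip-detection technique above, correlating received bits against the transmitted reference over a small offset window, can be sketched as follows. This is an illustrative hard-decision toy, not the SDA's power or Massey correlators; the bit pattern and the ±4-bit window are assumptions.

```python
def agreement(tx, rx, offset):
    """Fraction of bits that match when rx is read at the given offset."""
    pairs = [(t, rx[i + offset]) for i, t in enumerate(tx)
             if 0 <= i + offset < len(rx)]
    return sum(t == r for t, r in pairs) / len(pairs)

def detect_slip(tx, rx, window=4):
    """Report the offset in [-window, window] with the best agreement."""
    return max(range(-window, window + 1), key=lambda k: agreement(tx, rx, k))

tx = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1]
rx = [0, 0] + tx[:-2]          # receiver slipped two bits late
print(detect_slip(tx, rx))      # → 2
```

    A nonzero best offset flags a slip and its direction; the SDA performs the same comparison continuously on soft-decision data.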

  12. Analyzing Daqing's Urban Spatial Form Evolution Based on RS and GIS Technology

    Institute of Scientific and Technical Information of China (English)

    王士君; 王若菊; 王永超; 刘成玉

    2012-01-01

    Taking Daqing, currently China's largest oil industrial city, as the research object, this paper uses GIS technology to extract information on urban construction land, such as scale, shape, growth, and location coordinates, from Landsat MSS and TM satellite imagery for 1984, 1995, and 2007. From these data, the speed and intensity of urban sprawl, the compactness and fractal dimension of urban form, and the degree of smart growth are calculated to analyze the characteristics of Daqing's urban expansion since 1984. The paper also explores the causes of these characteristics using economic and social statistics and regional analysis methods. Daqing's large stock of construction land, rapid and intense sprawl, uneven spatial distribution of growth, irregular form, low compactness, and the directional consistency of urban sprawl and movement of the urban center are characteristic of expanding oil industrial cities in general, although these characteristics persist for only a short time. They arise from the oil exploitation strategy, natural environmental constraints, the guidance of traffic infrastructure, the control of urban planning, and the transfer of central-place functions.
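
    A compactness index of the kind computed in the study above can be sketched directly. This is a generic form, C = 2·sqrt(π·A)/P, which equals 1 for a circle and falls toward 0 as the built-up shape grows more irregular; the areas and perimeters below are hypothetical, not Daqing's.

```python
import math

def compactness(area_km2, perimeter_km):
    """Circularity-based compactness: 1.0 for a circle, lower for sprawl."""
    return 2.0 * math.sqrt(math.pi * area_km2) / perimeter_km

# a circular built-up area of radius 3 km: area 9*pi, perimeter 6*pi
print(round(compactness(9.0 * math.pi, 6.0 * math.pi), 3))  # → 1.0
# an elongated built-up patch: 20 km^2 of area behind 42 km of perimeter
print(round(compactness(20.0, 42.0), 3))  # → 0.377
```

    Tracking this index across the 1984, 1995, and 2007 land-use extractions is one way to quantify how the urban form became more or less compact over time.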

  13. Investigation of techniques for analyzing and evaluating the effects of newly developed new-energy and energy-saving technologies; Shinsho energy gijutsu kaihatsu koka no bunseki hyoka shuho ni kansuru chosa

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-03-01

    For the comprehensive and rational evaluation of technologies related to new energy and energy saving, investigations have been conducted into the history and trend of related policies and technological development. Japan saw a great change in its energy demand-supply structure in the 10 to 15 years following the first oil crisis, as oil was partially replaced with atomic energy and LNG and energy-saving efforts came into practice. However, the diversification of energy sources through new-energy development and the improvement of the energy self-sufficiency rate remain far from achieved. Related policies both in Japan and overseas are shifting from conventional efforts toward stable supply and economic growth to the handling of deregulation and environmental problems. Energy strategy has been studied, not regarding energy technology as a merely economic task, but working out what energy demand should be given the constraints of resources, the social economy, and the environment. Models and scenarios are proposed for the quantitative analysis and evaluation of an energy strategy in which the development of technologies for new energy and energy saving is clearly positioned. Especially emphasized are the technique of power source planning and the development of renewable energy. 24 refs., 32 figs., 15 tabs.

  14. Systems Analyze Water Quality in Real Time

    Science.gov (United States)

    2010-01-01

    A water analyzer developed under Small Business Innovation Research (SBIR) contracts with Kennedy Space Center now monitors treatment processes at water and wastewater facilities around the world. Originally designed to provide real-time detection of nutrient levels in hydroponic solutions for growing plants in space, the ChemScan analyzer, produced by ASA Analytics Inc., of Waukesha, Wisconsin, utilizes spectrometry and chemometric algorithms to automatically analyze multiple parameters in the water treatment process with little need for maintenance, calibration, or operator intervention. The company has experienced a compound annual growth rate of 40 percent over its 15-year history as a direct result of the technology's success.
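
    The spectrometric principle underlying analyzers of this kind is the Beer-Lambert law, A = ε·l·c, relating absorbance to concentration. The single-analyte sketch below is a generic illustration, not ChemScan's multi-parameter chemometric model; the absorptivity and path length are hypothetical.

```python
def concentration(absorbance, epsilon_l_per_mol_cm, path_cm):
    """Invert Beer-Lambert: c = A / (epsilon * l), in mol/L."""
    return absorbance / (epsilon_l_per_mol_cm * path_cm)

# hypothetical absorber: molar absorptivity 5000 L/(mol*cm), 1 cm cell
print(concentration(0.25, 5000.0, 1.0))  # → 5e-05
```

    Multi-analyte instruments extend this idea by fitting many wavelengths at once with chemometric regression rather than a single coefficient.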

  15. Analyzing Languages for Specific Purposes Discourse

    Science.gov (United States)

    Bowles, Hugo

    2012-01-01

    In the last 20 years, technological advancement and increased multidisciplinarity has expanded the range of data regarded as within the scope of languages for specific purposes (LSP) research and the means by which they can be analyzed. As a result, the analytical work of LSP researchers has developed from a narrow focus on specialist terminology…

  16. A Brief Analysis of a Technical Scheme for Fluidized-Bed Co-gasification of Biomass and Coal

    Institute of Scientific and Technical Information of China (English)

    毕可军; 毛少祥; 孔北方; 柏林红

    2012-01-01

    Addressing the problem that biomass is difficult to gasify on its own, the author discusses a complementary technical scheme for co-gasification of biomass with coal; introduces the physical properties and gasification characteristics of biomass; describes the technical features and process flow of ash-agglomerating fluidized-bed pulverized-coal gasification; presents a technical scheme for co-gasifying biomass with coal on the basis of that technology; and proposes measures to address the remaining problems.

  17. NMR of porous Bio-systems

    NARCIS (Netherlands)

    Snaar, E.J.M.

    2002-01-01

    The structure and dynamics of water diffusion and transport at a microscale in heterogeneous porous media have been investigated using various 1H NMR techniques. In particular, in biological porous media the dynamics are usually very complex, since they are intimately related to the …

  18. The CD11a binding site of efalizumab in psoriatic skin tissue as analyzed by Multi-Epitope Ligand Cartography robot technology. Introduction of a novel biological drug-binding biochip assay.

    Science.gov (United States)

    Bonnekoh, B; Böckelmann, R; Pommer, A J; Malykh, Y; Philipsen, L; Gollnick, H

    2007-01-01

    Efalizumab (Raptiva) is an immunomodulating recombinant humanized IgG1 monoclonal antibody that binds to CD11a, the alpha-subunit of leukocyte function antigen-1 (LFA-1). By blocking the binding of LFA-1 to ICAM-1, efalizumab inhibits the adhesion of leukocytes to other cell types and interferes with the migration of T lymphocytes to sites of inflammation (including psoriatic skin plaques). Analysis of the response in patients treated with efalizumab to date shows that distinct groups of responders and nonresponders to the drug exist. It would therefore be of great practical value to be able to predict which patients are most likely to respond to treatment, by identifying key parameters in the mechanism of action of efalizumab. Detailed investigation and detection of multiple epitopes in microcompartments of skin tissue has until recently been restricted by the available technology. However, the newly developed technique of Multi-Epitope Ligand Cartography (MELC) robot technology combines proteomics and biomathematical tools to visualize protein networks at the cellular and subcellular levels in situ, and to decipher cell functions. The MELC technique, which is outlined in this paper, was used to help characterize the binding of efalizumab to affected and unaffected psoriatic skin as compared to normal control skin under ex vivo model conditions. Efalizumab was labeled with fluorescein isothiocyanate and integrated into a MELC library of more than 40 antibodies. These antibodies were selected for their potential to detect epitopes indicative of (a) various cell types, (b) structural components of the extracellular matrix, or (c) the processes of cell proliferation, activation, and adhesion. Efalizumab bound to CD11a in affected psoriatic skin at levels 15 and 32 times higher than in unaffected psoriatic skin and normal control skin, respectively. CD11a and the efalizumab binding site were primarily expressed in the extravascular dermis, whereas CD54 (ICAM-1) …

  19. Analyzing the Biology on the System Level

    Institute of Scientific and Technical Information of China (English)

    Wei Tong

    2004-01-01

    Although various genome projects have provided us with enormous amounts of static sequence information, understanding sophisticated biology continues to require integrating computational modeling, system analysis, technology development for experiments, and quantitative experiments to analyze biological architecture at various levels, which is the origin of the systems biology discipline. This review discusses the object of study, its characteristics, and research priorities in systems biology, and summarizes the analysis methods, experimental technologies, and research developments in the four key fields of systems biology: system structures, dynamics, control methods, and design principles.

  20. The information technology professionals and their personality analyzed by the Rorschach technique

    Directory of Open Access Journals (Sweden)

    Seille Cristine Garcia Santos

    2005-12-01

    This article presents the outcome of a comparative study of the analysis capabilities, initiative, and human relationships of IT (Information Technology) managers and IT systems analysts and programmers. Sixty-six IT professionals from nine different companies and five IT divisions with up to 150 employees, working in Porto Alegre and its metropolitan region, were surveyed. The Rorschach technique (Klopfer System) and a structured questionnaire were applied; the same questionnaire was answered by the immediate superior of each participant. The results show no significant difference (t-test and Pearson correlation) between IT managers and IT systems analysts and programmers regarding analysis capabilities, initiative, and human relationships. The operational individuals distinguish themselves from the managerial ones by releasing emotional reactions with less control. Difficulties to interact with others based on Rorschach's indicatives
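
The group comparison above rests on two standard statistics, the two-sample t-test and the Pearson correlation. As a rough illustration of that kind of comparison (the scores below are made up for illustration, not the study's data), both can be computed from scratch:

```python
import math

def welch_t(a, b):
    """Welch's two-sample t statistic (unequal variances allowed)."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical "analysis capability" scores for two groups of IT staff.
managers = [7.1, 6.8, 7.4, 6.9, 7.2]
analysts = [6.9, 7.0, 7.3, 6.7, 7.1]
t = welch_t(managers, analysts)  # near zero: no evident group difference
```

A |t| well below the critical value for the combined degrees of freedom is what "no significant difference" means operationally.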

  1. Research on environmental bioecosensing technology using ecological information; Seitaikei joho ni yoru kankyo bio eco sensing gijutsu ni kansuru chosa

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-03-01

    This survey examined bioecosensing technology, which detects and identifies the weak signals generated by biosystem communication across wide biological environments. The following were reported as notable current environmental biosensing technologies: rapid measurement of environmental contaminants using immunological assays, analysis of the ecological state of microorganisms using DNA probes, observation of ecosystems via bioluminescent systems, measurement of environmental changes and contaminants using higher animals and plants, and detection of chemical contaminants using the chemotaxis of microorganisms. On this basis, a new molecular-level bioecosensing/monitoring technology was proposed for identifying comprehensive environmental changes that previous physical and chemical methods could not measure, namely changes in ecosystems corresponding to environmental changes. As wide-area remote sensing of environmental ecological information, sensing from the ground, aircraft and satellites was also discussed. 247 refs., 55 figs., 17 tabs.

  2. Transcriptome characteristics of Paspalum vaginatum analyzed with Illumina sequencing technology

    Institute of Scientific and Technical Information of China (English)

    贾新平; 叶晓青; 梁丽建; 邓衍明; 孙晓波; 佘建明

    2014-01-01

    The transcriptome of Paspalum vaginatum leaf was sequenced on an Illumina HiSeq 2000 platform, a new-generation high-throughput sequencing technology, to study expression profiles and to predict functional genes. In the target sample, a total of 47,520,544 reads containing 4,752,054,400 bp of sequence information were generated. Initial sequence splicing produced 81,220 unigenes containing 87,542,503 bp of sequence information, with an average length of 1,077 bp. Unigene quality was assessed in several respects, such as length distribution, GC content and gene expression level; the sequencing data were of high quality and reliability. 46,169 unigenes were annotated using BLAST searches against the Nr, Nt and SwissProt databases. By gene ontology, the assembled unigenes could be broadly divided into biological process, cellular component and 48 branches of molecular function categories, including metabolic process, binding and cellular processes. The unigenes were further annotated by COG category into 25 functional categories, and by metabolic pathway into 112 classes, including the phenylalanine metabolism pathway, plant-pathogen interaction, plant hormone biosynthesis and signal transduction, flavonoid biosynthesis, terpenoid backbone biosynthesis, lipid metabolism, and RNA degradation. There were 22,721 SSRs in the 81,220 unigenes; among them, A/T was the most frequent repeat, followed by CCG/CGG and AGC/CTG. This study is the first comprehensive transcriptome analysis of Paspalum vaginatum, providing a valuable genomic data source for the molecular biology of this grass.
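
Two of the unigene quality checks mentioned above, GC content and SSR (simple sequence repeat) detection, are easy to sketch. The sequence, motif length, and repeat threshold below are illustrative assumptions, not values from the study:

```python
import re

def gc_content(seq):
    """Fraction of G and C bases in a nucleotide sequence."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def find_ssrs(seq, motif_len=3, min_repeats=4):
    """Find simple sequence repeats: a motif of `motif_len` bases
    repeated at least `min_repeats` times in tandem.
    Returns (motif, repeat_count) pairs."""
    pattern = re.compile(r"((\w{%d})\2{%d,})" % (motif_len, min_repeats - 1))
    return [(m.group(2), len(m.group(1)) // motif_len)
            for m in pattern.finditer(seq)]
```

For example, `find_ssrs("AAACCGCCGCCGCCGTTT")` reports a CCG motif repeated four times, the kind of trinucleotide SSR (CCG/CGG class) the abstract counts.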

  3. Analyzing Valuation Practices through Contracts

    DEFF Research Database (Denmark)

    Tesnière, Germain; Labatut, Julie; Boxenbaum, Eva

    This paper seeks to analyze the most recent changes in how societies value animals. We analyze this topic through the prism of contracts between breeding companies and farmers. Focusing on new valuation practices and the qualification of breeding animals, we question the evaluation of entities that are difficult to commensurate (animal, embryo, mating) and the impact of these valuations and qualifications on the government of living entities.

  4. Analyzing data files in SWAN

    CERN Document Server

    Gajam, Niharika

    2016-01-01

    Traditionally analyzing data happens via batch-processing and interactive work on the terminal. The project aims to provide another way of analyzing data files: A cloud-based approach. It aims to make it a productive and interactive environment through the combination of FCC and SWAN software.

  5. VoIP Quality Analyzer

    OpenAIRE

    Havelka, Ondřej

    2011-01-01

    This thesis deals with the quality of IP telephony and its measurement using NetFlow technology. It describes the individual factors influencing quality, from sampling and quantization, through the impairment caused by codecs, to degradation during network transfer. The next part focuses on models for assessing the quality of IP telephony, with emphasis on the E-model and R-factor. It briefly describes NetFlow technology and the quality measurement connected with it. The practical part describ...
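
The E-model mentioned above condenses all impairments into a single R-factor; ITU-T G.107 gives a standard mapping from R to an estimated mean opinion score (MOS). A minimal sketch of just that mapping (simplified: the full model also computes R itself from delay, codec and loss impairments, which are omitted here):

```python
def r_to_mos(r):
    """Map an E-model R-factor to an estimated MOS (ITU-T G.107 mapping)."""
    if r <= 0:
        return 1.0
    if r >= 100:
        return 4.5
    return 1.0 + 0.035 * r + r * (r - 60.0) * (100.0 - r) * 7e-6

# The default narrowband R of 93.2 (no impairments) corresponds to the
# best achievable MOS; codec and packet-loss impairments lower R and MOS.
best_mos = r_to_mos(93.2)
```

The mapping is deliberately non-linear: quality falls slowly at first and then steeply as R drops through the 50-80 range.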

  6. Nuclear fuel microsphere gamma analyzer

    Science.gov (United States)

    Valentine, Kenneth H.; Long, Jr., Ernest L.; Willey, Melvin G.

    1977-01-01

    A gamma analyzer system is provided for the analysis of nuclear fuel microspheres and other radioactive particles. The system consists of an analysis turntable with means for loading, in sequence, a plurality of stations within the turntable; a gamma ray detector for determining the spectrum of a sample in one section; means for analyzing the spectrum; and a receiver turntable to collect the analyzed material in stations according to the spectrum analysis. Accordingly, particles may be sorted according to their quality; e.g., fuel particles with fractured coatings may be separated from those that are not fractured, or according to other properties.

  7. Complete denture analyzed by optical coherence tomography

    Science.gov (United States)

    Negrutiu, Meda L.; Sinescu, Cosmin; Todea, Carmen; Podoleanu, Adrian G.

    2008-02-01

    Complete dentures are currently made using different technologies. To avoid deficiencies of prostheses made using the classical technique, several alternative systems and procedures have been devised, directly related both to the material used and to the manufacturing technology. Thus, several injection systems and technologies are presently on the market that use chemoplastic materials, which are heat cured (90-100°C) in a dry or wet environment, or cold cured (below 60°C). There are also technologies that plasticize a hard cured material by thermoplastic processing (without any chemical changes) and then inject it into a mold. The purpose of this study was to analyze possible defects in several dental prostheses using a non-invasive method before their insertion in the mouth. Dental prostheses fabricated from various materials were investigated using en-face optical coherence tomography. To detect defects, scanning was performed in three planes, obtaining images at depths from 0.01 μm to 2 mm. In several of the investigated prostheses we found defects which may cause their fracture. These defects are entirely embedded in the prosthesis material and cannot be visualized with other imaging methods. In conclusion, en-face OCT is an important investigative tool for dental practice.

  8. Analyzing petabytes of data with Hadoop

    CERN Document Server

    CERN. Geneva

    2009-01-01

    Abstract The open source Apache Hadoop project provides a powerful suite of tools for storing and analyzing petabytes of data using commodity hardware. After several years of production use inside of web companies like Yahoo! and Facebook and nearly a year of commercial support and development by Cloudera, the technology is spreading rapidly through other disciplines, from financial services and government to life sciences and high energy physics. The talk will motivate the design of Hadoop and discuss some key implementation details in depth. It will also cover the major subprojects in the Hadoop ecosystem, go over some example applications, highlight best practices for deploying Hadoop in your environment, discuss plans for the future of the technology, and provide pointers to the many resources available for learning more. In addition to providing more information about the Hadoop platform, a major goal of this talk is to begin a dialogue with the ATLAS research team on how the tools commonly used in t...
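
The programming model behind Hadoop reduces to two user-supplied functions: a mapper that emits key/value pairs and a reducer that folds all values sharing a key. A toy word count in the Hadoop Streaming style, where the framework's shuffle phase is simulated in-process for illustration:

```python
from collections import defaultdict

def mapper(line):
    """Map phase: emit (word, 1) for every word in an input line."""
    for word in line.split():
        yield word.lower(), 1

def reducer(word, counts):
    """Reduce phase: sum all counts emitted for one word."""
    return word, sum(counts)

def run_job(lines):
    """Simulate the shuffle: group mapper output by key, then reduce."""
    groups = defaultdict(list)
    for line in lines:
        for key, value in mapper(line):
            groups[key].append(value)
    return dict(reducer(k, v) for k, v in sorted(groups.items()))
```

In a real cluster the mapper and reducer run as separate processes over distributed file blocks; only the two functions above are the user's code.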

  9. Loviisa nuclear power plant analyzer

    International Nuclear Information System (INIS)

    The APROS Simulation Environment has been developed since 1986 by Imatran Voima Oy (IVO) and the Technical Research Centre of Finland (VTT). It provides tools, solution algorithms and process components for use in different simulation systems for design, analysis and training purposes. One of its main nuclear applications is the Loviisa Nuclear Power Plant Analyzer (LPA). The Loviisa Plant Analyzer includes all the important plant components both in the primary and in the secondary circuits. In addition, all the main control systems, the protection system and the high voltage electrical systems are included. (orig.)

  10. Analyzing the Grammar of English

    CERN Document Server

    Teschner, Richard V

    2007-01-01

    Analyzing the Grammar of English offers a descriptive analysis of the indispensable elements of English grammar. Designed to be covered in one semester, this textbook starts from scratch and takes nothing for granted beyond a reading and speaking knowledge of English. Extensively revised to function better in skills-building classes, it includes more interspersed exercises that promptly test what is taught, simplified and clarified explanations, greatly expanded and more diverse activities, and a new glossary of over 200 technical terms.Analyzing the Grammar of English is the only English gram

  11. Simulation of a Hyperbolic Field Energy Analyzer

    CERN Document Server

    Gonzalez-Lizardo, Angel

    2016-01-01

    Energy analyzers are important plasma diagnostic tools with applications in a broad range of disciplines including molecular spectroscopy, electron microscopy, basic plasma physics, plasma etching, plasma processing, and ion sputtering technology. The Hyperbolic Field Energy Analyzer (HFEA) is a novel device able to determine ion and electron energy spectra and temperatures. The HFEA is well suited for ion temperature and density diagnostics at those situations where ions are scarce. A simulation of the capacities of the HFEA to discriminate particles of a particular energy level, as well as to determine temperature and density is performed in this work. The electric field due the combination of the conical elements, collimator lens, and Faraday cup applied voltage was computed in a well suited three-dimensional grid. The field is later used to compute the trajectory of a set of particles with a predetermined energy distribution. The results include the observation of the particle trajectories inside the sens...

  12. The Convertible Arbitrage Strategy Analyzed

    NARCIS (Netherlands)

    Loncarski, I.; Ter Horst, J.R.; Veld, C.H.

    2006-01-01

    This paper analyzes convertible bond arbitrage on the Canadian market for the period 1998 to 2004.Convertible bond arbitrage is the combination of a long position in convertible bonds and a short position in the underlying stocks. Convertible arbitrage has been one of the most successful strategies

  13. Software-Design-Analyzer System

    Science.gov (United States)

    Tausworthe, Robert C.

    1991-01-01

    CRISP-90 software-design-analyzer system, update of CRISP-80, is set of computer programs constituting software tool for design and documentation of other software and supporting top-down, hierarchical, modular, structured methodologies for design and programming. Written in Microsoft QuickBasic.

  14. FORTRAN Static Source Code Analyzer

    Science.gov (United States)

    Merwarth, P.

    1984-01-01

    FORTRAN Static Source Code Analyzer program, SAP (DEC VAX version), automatically gathers statistics on occurrences of statements and structures within FORTRAN program and provides reports of those statistics. Provisions made for weighting each statistic and provide an overall figure of complexity.
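
The weighting scheme described above, counting occurrences of each statement type, multiplying by a per-type weight, and summing into one complexity figure, can be sketched in a few lines. The statement types and weights here are illustrative assumptions, not SAP's actual tables:

```python
import re

# Hypothetical per-statement-type weights; SAP's real weighting differs.
WEIGHTS = {"GOTO": 3.0, "IF": 2.0, "DO": 2.0, "ASSIGN": 1.0}

def complexity(fortran_source):
    """Weighted sum of statement-type occurrence counts."""
    counts = {
        "GOTO": len(re.findall(r"\bGO\s*TO\b", fortran_source)),
        "IF": len(re.findall(r"\bIF\b", fortran_source)),
        "DO": len(re.findall(r"\bDO\b", fortran_source)),
        "ASSIGN": fortran_source.count("="),
    }
    return sum(WEIGHTS[k] * n for k, n in counts.items()), counts

src = "      DO 10 I=1,N\n      IF (A.GT.0) GO TO 20\n10    X = X + 1\n"
score, counts = complexity(src)
```

A real static analyzer would tokenize rather than pattern-match, but the weighted-sum step is exactly this.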

  15. The Photo-Pneumatic CO2 Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — We are proposing to build a new technology, the photo-pneumatic analyzer. It is small, solid-state, inexpensive, and appropriate for observations of atmospheric...

  16. Proteomics: an efficient tool to analyze nematode proteins

    Science.gov (United States)

    Proteomic technologies have been successfully used to analyze proteins structure and characterization in plants, animals, microbes and humans. We used proteomics methodologies to separate and characterize soybean cyst nematode (SCN) proteins. Optimizing the quantity of proteins required to separat...

  17. Climate Model Diagnostic Analyzer Web Service System

    Science.gov (United States)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Kubar, T. L.; Li, J.; Zhang, J.; Wang, W.

    2015-12-01

    Both the National Research Council Decadal Survey and the latest Intergovernmental Panel on Climate Change Assessment Report stressed the need for comprehensive and innovative evaluation of climate models through the synergistic use of global satellite observations, in order to improve our weather and climate simulation and prediction capabilities. The abundance of satellite observations for fundamental climate parameters and the availability of coordinated model outputs from CMIP5 for the same parameters offer a great opportunity to understand and diagnose model biases in climate models. In addition, the Obs4MIPs efforts have created several key global observational datasets that are readily usable for model evaluations. However, a model diagnostic evaluation process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally- and data-intensive. In response, we have developed a novel methodology to diagnose model biases in contemporary climate models and have implemented the methodology as a web-service based, cloud-enabled, provenance-supported climate-model evaluation system. The evaluation system is named Climate Model Diagnostic Analyzer (CMDA), which is the product of the research and technology development investments of several current and past NASA ROSES programs. The current technologies and infrastructure of CMDA are designed and selected to address several technical challenges that the Earth science modeling and model analysis community faces in evaluating and diagnosing climate models. In particular, we have three key technology components: (1) diagnostic analysis methodology; (2) web-service based, cloud-enabled technology; (3) provenance-supported technology. The diagnostic analysis methodology includes random forest feature importance ranking, conditional probability distribution function, conditional sampling, and time-lagged correlation map. We have implemented the
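
Of the diagnostic methods listed, the time-lagged correlation map is the simplest to illustrate: correlate one series against a second series shifted by each candidate lag. A minimal pure-Python sketch with made-up series (not CMDA code):

```python
import math

def pearson(x, y):
    """Pearson correlation of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def lagged_correlations(x, y, max_lag):
    """Correlate x[t] with y[t + lag] for lag = 0 .. max_lag."""
    return {lag: pearson(x[: len(x) - lag], y[lag:])
            for lag in range(max_lag + 1)}

# y leads x by one step of a period-4 oscillation, so the correlation
# peaks where the lag realigns the two series.
x = [0.0, 1.0, 0.0, -1.0] * 3
y = x[1:] + [x[0]]
lags = lagged_correlations(x, y, 3)
```

In a climate-model diagnosis, the lag at which the correlation peaks indicates which variable leads the other, and by how long.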

  18. Methods for Analyzing Social Media

    DEFF Research Database (Denmark)

    Jensen, Jakob Linaa

    2013-01-01

    Social media is becoming increasingly attractive for users. It is a fast way to communicate ideas and a key source of information. It is therefore one of the most influential mediums of communication of our time and an important area for audience research. The growth of social media invites many new questions such as: How can we analyze social media? Can we use traditional audience research methods and apply them to online content? Which new research strategies have been developed? Which ethical research issues and controversies do we have to pay attention to? This book focuses on research strategies and methods for analyzing social media and will be of interest to researchers and practitioners using social media, as well as those wanting to keep up to date with the subject.

  19. Introduction: why analyze single cells?

    Science.gov (United States)

    Di Carlo, Dino; Tse, Henry Tat Kwong; Gossett, Daniel R

    2012-01-01

    Powerful methods in molecular biology are abundant; however, in many fields including hematology, stem cell biology, tissue engineering, and cancer biology, data from tools and assays that analyze the average signals from many cells may not yield the desired result because the cells of interest may be in the minority (their behavior masked by the majority) or because the dynamics of the populations of interest are offset in time. Accurate characterization of samples with high cellular heterogeneity may only be achieved by analyzing single cells. In this chapter, we discuss the rationale for performing analyses on individual cells in more depth, cover the fields of study in which single-cell behavior is yielding new insights into biological and clinical questions, and speculate on how single-cell analysis will be critical in the future.

  20. The Statistical Loop Analyzer (SLA)

    Science.gov (United States)

    Lindsey, W. C.

    1985-01-01

    The statistical loop analyzer (SLA) is designed to automatically measure the acquisition, tracking and frequency stability performance characteristics of symbol synchronizers, code synchronizers, carrier tracking loops, and coherent transponders. Automated phase lock and system level tests can also be made using the SLA. Standard baseband, carrier and spread spectrum modulation techniques can be accommodated. Through the SLA's phase error jitter and cycle slip measurements the acquisition and tracking thresholds of the unit under test are determined; any false phase and frequency lock events are statistically analyzed and reported in the SLA output in probabilistic terms. Automated signal drop-out tests can be performed in order to troubleshoot algorithms and evaluate the reacquisition statistics of the unit under test. Cycle slip rates and cycle slip probabilities can be measured using the SLA. These measurements, combined with bit error probability measurements, are all that are needed to fully characterize the acquisition and tracking performance of a digital communication system.

  1. Satellite-based interference analyzer

    Science.gov (United States)

    Varice, H.; Johannsen, K.; Sabaroff, S.

    1977-01-01

    System identifies terrestrial sources of radio-frequency interference and measures their frequency spectra and amplitudes. Designed to protect satellite communication networks, system measures entire noise spectrum over selected frequency band and can raster-scan geographical region to locate noise sources. Once interference is analyzed, realistic interference protection ratios are determined and mathematical models for predicting radio-frequency noise spectra are established. This enhances signal detection and locates optimum geographical positions and frequency bands for communication equipment.

  2. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach.

    Science.gov (United States)

    Cheung, Mike W-L; Jak, Suzanne

    2016-01-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists-and probably the most crucial one-is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.
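
The split/analyze/meta-analyze idea can be sketched outside R as well: split the data into shards, estimate the quantity of interest on each shard, then pool the per-shard estimates with inverse-variance (fixed-effect) weights. A toy Python version with simulated data (the paper's own demonstrations use R and real datasets):

```python
import random

def split(data, k):
    """Split the dataset into k roughly equal shards."""
    return [data[i::k] for i in range(k)]

def analyze(shard):
    """Per-shard analysis: the mean and its squared standard error."""
    n = len(shard)
    m = sum(shard) / n
    var = sum((x - m) ** 2 for x in shard) / (n - 1)
    return m, var / n

def meta_analyze(results):
    """Fixed-effect pooling with inverse-variance weights."""
    weights = [1.0 / se2 for _, se2 in results]
    pooled = sum(w * m for w, (m, _) in zip(weights, results)) / sum(weights)
    return pooled

random.seed(0)
data = [random.gauss(10.0, 2.0) for _ in range(10000)]  # true mean 10
estimate = meta_analyze([analyze(s) for s in split(data, 20)])
```

Each shard fits in memory, the per-shard analyses are embarrassingly parallel, and the pooled estimate recovers the full-sample answer; the same three-step pattern applies to regression coefficients or correlations.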

  3. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach

    Science.gov (United States)

    Cheung, Mike W.-L.; Jak, Suzanne

    2016-01-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists—and probably the most crucial one—is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study. PMID:27242639

  4. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach

    Directory of Open Access Journals (Sweden)

    Mike W.-L. Cheung

    2016-05-01

    Full Text Available Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists – and probably the most crucial one – is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.

  5. Research on the Technological Innovation Capability of Normal Universities Based on Effective Patents: A Comparative Analysis with Comprehensive Universities

    Institute of Scientific and Technical Information of China (English)

    黄荣晓

    2011-01-01

    Taking effective patent holdings as the indicator of technological innovation capability, this paper analyzes the technological innovation capability of South China Normal University in terms of patent types, patent composition, and the maintenance period of effective patents. The innovation advantage of its patents lies mostly in physics, chemistry, biology and related fields. To improve technological innovation capability and standards, the university should refine its policy orientation and emphasize the commercialization of patents.

  6. The Aqueduct Global Flood Analyzer

    Science.gov (United States)

    Iceland, Charles

    2015-04-01

    As population growth and economic growth take place, and as climate change accelerates, many regions across the globe are finding themselves increasingly vulnerable to flooding. A recent OECD study of the exposure of the world's large port cities to coastal flooding found that 40 million people were exposed to a 1 in 100 year coastal flood event in 2005, and the total value of exposed assets was about US$3,000 billion, or 5% of global GDP. By the 2070s, those numbers were estimated to increase to 150 million people and US$35,000 billion, or roughly 9% of projected global GDP. Impoverished people in developing countries are particularly at risk because they often live in flood-prone areas and lack the resources to respond. WRI and its Dutch partners - Deltares, IVM-VU University Amsterdam, Utrecht University, and PBL Netherlands Environmental Assessment Agency - are in the initial stages of developing a robust set of river flood and coastal storm surge risk measures that show the extent of flooding under a variety of scenarios (both current and future), together with the projected human and economic impacts of these flood scenarios. These flood risk data and information will be accessible via an online, easy-to-use Aqueduct Global Flood Analyzer. We will also investigate the viability, benefits, and costs of a wide array of flood risk reduction measures that could be implemented in a variety of geographic and socio-economic settings. Together, the activities we propose have the potential for saving hundreds of thousands of lives and strengthening the resiliency and security of many millions more, especially those who are most vulnerable. Mr. Iceland will present Version 1.0 of the Aqueduct Global Flood Analyzer and provide a preview of additional elements of the Analyzer to be released in the coming years.

  7. [Examination of the olfactory analyzer].

    Science.gov (United States)

    Domrachev, A A; Afon'kin, V Iu

    2002-01-01

    A method of threshold olfactometry is proposed consisting in the use of three olfactive substances (tincture of valerian, acetic acid, liquid ammonia) in selected concentrations. This allows to investigate the thresholds of certain modality. Each concentration of the olfactive substance is placed into a glass bottle (100 ml) and stored at the temperature 18-20 degrees C. The examination of the state of the olfactory analyzer within a 24-h working day showed stability of threshold olfactometry when the organism is tired. Utilization of threshold olfactometry in some diagnostic areas is shown. PMID:12056163

  8. COBSTRAN - COMPOSITE BLADE STRUCTURAL ANALYZER

    Science.gov (United States)

    Aiello, R. A.

    1994-01-01

    The COBSTRAN (COmposite Blade STRuctural ANalyzer) program is a pre- and post-processor that facilitates the design and analysis of composite turbofan and turboprop blades, as well as composite wind turbine blades. COBSTRAN combines composite mechanics and laminate theory with a data base of fiber and matrix properties. As a preprocessor for NASTRAN or another Finite Element Method (FEM) program, COBSTRAN generates an FEM model with anisotropic homogeneous material properties. Stress output from the FEM program is provided as input to the COBSTRAN postprocessor. The postprocessor then uses the composite mechanics and laminate theory routines to calculate individual ply stresses, strains, interply stresses, thru-the-thickness stresses and failure margins. COBSTRAN is designed to carry out the many linear analyses required to efficiently model and analyze blade-like structural components made of multilayered angle-plied fiber composites. Components made from isotropic or anisotropic homogeneous materials can also be modeled as a special case of COBSTRAN. NASTRAN MAT1 or MAT2 material cards are generated according to user supplied properties. COBSTRAN is written in FORTRAN 77 and was implemented on a CRAY X-MP with a UNICOS 5.0.12 operating system. The program requires either COSMIC NASTRAN or MSC NASTRAN as a structural analysis package. COBSTRAN was developed in 1989, and has a memory requirement of 262,066 64 bit words.
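
The laminate-theory step that COBSTRAN wraps around the FEM model can be illustrated with classical lamination theory: build each ply's reduced stiffness Q from fiber-direction moduli, account for ply orientation in laminate axes, and sum through the thickness to get the extensional stiffness A. A stripped-down sketch for a [0/90] cross-ply pair (the material numbers are illustrative, not from a COBSTRAN database):

```python
def reduced_stiffness(e1, e2, g12, nu12):
    """In-plane reduced stiffness [Q11, Q22, Q12, Q66] of a unidirectional ply."""
    nu21 = nu12 * e2 / e1              # reciprocity relation
    d = 1.0 - nu12 * nu21
    return [e1 / d, e2 / d, nu12 * e2 / d, g12]

def a_matrix_cross_ply(q, t_ply):
    """Extensional stiffness terms of a [0/90] pair: rotating a ply by 90
    degrees swaps Q11 and Q22 in laminate axes; Q12 and Q66 are unchanged."""
    q11, q22, q12, q66 = q
    a11 = (q11 + q22) * t_ply          # 0-deg ply + 90-deg ply
    a22 = (q22 + q11) * t_ply
    a12 = 2.0 * q12 * t_ply
    a66 = 2.0 * q66 * t_ply
    return a11, a22, a12, a66

# Illustrative carbon/epoxy-like ply moduli (GPa) and 0.125 mm plies.
q = reduced_stiffness(e1=140.0, e2=10.0, g12=5.0, nu12=0.3)
a11, a22, a12, a66 = a_matrix_cross_ply(q, 0.125)
```

COBSTRAN runs the same algebra in reverse after the FEM solve, recovering individual ply stresses from the laminate-level response; arbitrary ply angles need the full Q-rotation, which the 0/90 special case avoids.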

  9. Combining two technologies for full genome sequencing of human.

    Science.gov (United States)

    Skryabin, K G; Prokhortchouk, E B; Mazur, A M; Boulygina, E S; Tsygankova, S V; Nedoluzhko, A V; Rastorguev, S M; Matveev, V B; Chekanov, N N; Goranskaya, D A; Teslyuk, A B; Gruzdeva, N M; Velikhov, V E; Zaridze, D G; Kovalchuk, M V

    2009-10-01

    At present, new DNA sequencing technologies are rapidly developing, allowing quick and efficient characterisation of organisms at the level of genome structure. In this study, whole-genome sequencing of a human (a Russian man) was performed using two technologies currently on the market: Sequencing by Oligonucleotide Ligation and Detection (SOLiD™, Applied Biosystems) and sequencing of molecular clusters using fluorescently labeled precursors (Illumina). The generated data totaled 108.3 billion base pairs (60.2 billion from the Illumina technology and 48.1 billion from the SOLiD technology). Statistics on the reads generated by GAII and SOLiD showed that they covered 75% and 96% of the genome, respectively. Short polymorphic regions were detected with comparable accuracy; however, the absolute number revealed by SOLiD was several times lower than by GAII. An optimal algorithm for using these latest sequencing methods was established for the analysis of individual human genomes. The study is the first Russian effort towards whole human genome sequencing.
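
The coverage figures quoted above can be put in context with the Lander-Waterman model: at an average depth of c, randomly placed reads are expected to cover a fraction 1 - e^(-c) of the genome. A quick sketch using the base counts reported in the abstract and an assumed ~3.1 Gb human genome size:

```python
import math

GENOME_BP = 3.1e9  # assumed haploid human genome size (bp)

def mean_depth(total_bases, genome_bp=GENOME_BP):
    """Average sequencing depth c = total sequenced bases / genome size."""
    return total_bases / genome_bp

def expected_covered_fraction(depth):
    """Lander-Waterman estimate of the fraction covered at least once."""
    return 1.0 - math.exp(-depth)

illumina_depth = mean_depth(60.2e9)  # from the 60.2 Gbp Illumina dataset
solid_depth = mean_depth(48.1e9)     # from the 48.1 Gbp SOLiD dataset
```

Both datasets imply depths well above 15x, so the naive model predicts near-total coverage; the observed 75% and 96% reflect reads lost to alignment filtering and unmappable regions, which the model ignores.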

  10. Thomson parabola ion energy analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Cobble, James A [Los Alamos National Laboratory; Flippo, Kirk A [Los Alamos National Laboratory; Letzring, Samuel A [Los Alamos National Laboratory; Lopez, Frank E [Los Alamos National Laboratory; Offermann, Dustin T [Los Alamos National Laboratory; Oertel, John A [Los Alamos National Laboratory; Mastrosimone, Dino [UNIV OF ROCHESTER

    2010-01-01

    A new, versatile Thomson parabola ion energy (TPIE) analyzer has been designed and constructed for use at the OMEGA-EP facility. Multi-MeV ions from EP targets are transmitted through a W pinhole into a (5- or 8-kG) magnetic field and subsequently through a parallel electric field of up to 30 kV/cm. The ion drift region may have a user-selected length of 10, 50, or 80 cm. With the highest fields, 500-MeV C{sup 6+} and C{sup 5+} may be resolved. TPIE is TIM-mounted at OMEGA-EP and is qualified in all existing TIMs. The instrument runs on pressure-interlocked 15-VDC power available in EP TIM carts. It may be inserted to within several inches of the target to attain sufficient flux for a measurement. For additional flux control, the user may select a square-aperture W pinhole of 0.004 inch or 0.010 inch. The detector consists of CR-39 backed by an image plate. The fully relativistic design code and design features are discussed. Ion spectral results from first use at OMEGA-EP are expected.
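The separation of C{sup 6+} and C{sup 5+} follows from the Thomson-parabola geometry: magnetic deflection scales as 1/v and electric deflection as 1/v², so each charge-to-mass ratio traces its own parabola on the detector. A non-relativistic sketch using the quoted field strengths (5 kG, 30 kV/cm) but assumed, not actual, TPIE field and drift lengths:

```python
import math

# Non-relativistic sketch of Thomson-parabola traces: an ion of charge q,
# mass m and kinetic energy T is deflected x by the magnetic field and y
# by the electric field; species with different m/q land on different
# parabolas y = k*(m/q)*x^2.
QE = 1.602e-19   # elementary charge (C)
AMU = 1.661e-27  # atomic mass unit (kg)

def deflections(T_MeV, q, m_amu, B=0.5, E=3.0e6, L=0.1, D=0.5):
    """Small-angle deflections (m) after a field region L and drift D.

    B in tesla (5 kG = 0.5 T), E in V/m (30 kV/cm); L and D are assumed
    geometry values, not the TPIE design numbers.
    """
    m = m_amu * AMU
    T = T_MeV * 1e6 * QE
    v = math.sqrt(2.0 * T / m)
    x = q * QE * B * L * D / (m * v)       # magnetic dispersion ~ 1/v
    y = q * QE * E * L * D / (m * v ** 2)  # electric dispersion ~ 1/v^2
    return x, y

# C6+ and C5+ share m but differ in q, so their parabola constants differ.
for q in (6, 5):
    x, y = deflections(T_MeV=100.0, q=q, m_amu=12.0)
    print(f"C{q}+: x={x*1e3:.2f} mm, y={y*1e3:.2f} mm, y/x^2={y/x**2:.3f} 1/m")
```

Eliminating v gives y/x² = E·m/(q·e·B²·L·D), so the parabola constant scales with m/q, which is what lets the analyzer resolve charge states of the same ion.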

  11. Analyzing and modeling heterogeneous behavior

    Science.gov (United States)

    Lin, Zhiting; Wu, Xiaoqing; He, Dongyue; Zhu, Qiang; Ni, Jixiang

    2016-05-01

    Recently, it was pointed out that non-Poisson statistics with heavy tails exist in many scenarios of human behavior. But most of these studies claimed that a power law characterized diverse aspects of human mobility patterns. In this paper, we suggest that human behavior may not be driven by identical mechanisms and can be modeled as a Semi-Markov Modulated Process. To verify our suggestion and model, we analyzed a total of 1,619,934 records of library visitations (including undergraduate and graduate students). We found that the distribution of visitation intervals is well fitted by three sections of lines, instead of the traditional power-law distribution, in log-log scale. The results confirm that some human behaviors cannot be simply expressed as a power law or any other simple function. At the same time, we divided the data into groups and extracted periodic bursty events. Through careful analysis of the different groups, we conclude that aggregate behavior may be composed of heterogeneous behaviors, and that even behaviors of the same type tend to differ across periods. The aggregate behavior is thus formed by "heterogeneous groups". We performed a series of experiments; simulation results showed that a Semi-Markov Modulated Process with just two states suffices to construct a proper representation of heterogeneous behavior.
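The two-state Semi-Markov Modulated Process the authors propose can be sketched as follows; the emission rates and switching probability are illustrative, not fitted to the library-visitation data.

```python
import random

def simulate_smmp(n_events, rates=(2.0, 0.1), p_switch=0.3, seed=42):
    """Two-state Semi-Markov Modulated Process: each state emits
    exponential inter-event times at its own rate; after every event the
    chain switches state with probability p_switch. Parameters here are
    illustrative, not fitted to the paper's data."""
    rng = random.Random(seed)
    state, intervals = 0, []
    for _ in range(n_events):
        intervals.append(rng.expovariate(rates[state]))
        if rng.random() < p_switch:
            state = 1 - state
    return intervals

intervals = simulate_smmp(10000)
mean = sum(intervals) / len(intervals)
print(f"{len(intervals)} intervals, mean {mean:.2f} "
      "(mixture of a fast and a slow regime, heavier tail than one exponential)")
```

Mixing a fast and a slow regime in this way produces an interval distribution that bends in log-log scale, consistent with the piecewise-linear fit reported above.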

  12. Complex networks theory for analyzing metabolic networks

    Institute of Scientific and Technical Information of China (English)

    ZHAO Jing; YU Hong; LUO Jianhua; CAO Z.W.; LI Yixue

    2006-01-01

    One of the main tasks of post-genomic informatics is to systematically investigate all molecules and their interactions within a living cell so as to understand how these molecules and the interactions between them relate to the function of the organism, while networks are an appropriate abstract description of all kinds of interactions. In the past few years, great progress has been made in developing the theory of complex networks to reveal the organizing principles that govern the formation and evolution of various complex biological, technological and social networks. This paper reviews the accomplishments in constructing genome-based metabolic networks and describes how the theory of complex networks is applied to analyze them.
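A basic example of the kind of analysis the review surveys is the degree distribution of a metabolite graph, in which highly connected "currency" metabolites emerge as hubs. The reaction list below is a toy illustration, not a genome-based reconstruction.

```python
from collections import Counter

# Toy metabolite graph: edges link substrates to products of hypothetical
# reactions (illustrative only, not a real metabolic reconstruction).
reactions = [
    ("glucose", "g6p"), ("g6p", "f6p"), ("f6p", "fbp"),
    ("fbp", "g3p"), ("g3p", "pyruvate"), ("atp", "g6p"),
    ("atp", "fbp"), ("atp", "adp"), ("pyruvate", "acetyl-coa"),
    ("nad", "nadh"), ("g3p", "nadh"),
]

def degree_distribution(edges):
    """Node degree counts, the basic quantity behind scale-free tests."""
    deg = Counter()
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    return deg

deg = degree_distribution(reactions)
hubs = sorted(deg.items(), key=lambda kv: -kv[1])[:3]
print("top-degree metabolites (currency metabolites emerge as hubs):", hubs)
```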

  13. Analyzing endocrine system conservation and evolution.

    Science.gov (United States)

    Bonett, Ronald M

    2016-08-01

    Analyzing variation in rates of evolution can provide important insights into the factors that constrain trait evolution, as well as those that promote diversification. Metazoan endocrine systems exhibit apparent variation in evolutionary rates of their constituent components at multiple levels, yet relatively few studies have quantified these patterns and analyzed them in a phylogenetic context. This may be in part due to historical and current data limitations for many endocrine components and taxonomic groups. However, recent technological advancements such as high-throughput sequencing provide the opportunity to collect large-scale comparative data sets for even non-model species. Such ventures will produce a fertile data landscape for evolutionary analyses of nucleic acid and amino acid based endocrine components. Here I summarize evolutionary rate analyses that can be applied to categorical and continuous endocrine traits, and also those for nucleic acid and protein-based components. I emphasize analyses that could be used to test whether other variables (e.g., ecology, ontogenetic timing of expression, etc.) are related to patterns of rate variation and endocrine component diversification. The application of phylogenetic-based rate analyses to comparative endocrine data will greatly enhance our understanding of the factors that have shaped endocrine system evolution.

  14. The Tragedy of a Modern Prometheus: Analyzing the Sci-technological Alienation in Mary Shelley's Frankenstein

    Institute of Scientific and Technical Information of China (English)

    朱岩岩

    2015-01-01

    With the development of postmodern views on science and technology, sci-technological alienation is attracting increasing attention. In Mary Shelley's Frankenstein, a British science fiction novel of the 19th century, the author depicts the unbelievable scientific experiment of a Prometheus-like scientist and the tragic fate brought on by his unfathomable endeavor. The novel shows not only the fast pace of sci-technological development at the time, but also people's sincere reflections on the potential disasters that unbridled scientific development may incur. From the perspective of sci-technological alienation, this article analyzes Frankenstein's mania in scientific pursuit, his blasphemy in imitating God by creating a man, his betrayal of humanity and morality in his experiments, and finally his suffering of a tragic fate, thereby interpreting the tragedy of a modern Prometheus and warning that indulging scientific ambition may bring unexpected disaster to humankind.

  15. Analyzing E-Learning Adoption via Recursive Partitioning

    OpenAIRE

    Köllinger, Philipp; Schade, Christian

    2003-01-01

    The paper analyzes factors that influence the adoption of e-learning and gives an example of how to forecast technology adoption based on a post-hoc predictive segmentation using a classification and regression tree (CART). We find strong evidence for the existence of technological interdependencies and organizational learning effects. Furthermore, we find different paths to e-learning adoption. The results of the analysis suggest a growing "digital divide" among firms. We use cross-sectional...
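The CART segmentation idea can be sketched as a single Gini-impurity split. The firm attributes, labels, and data below are invented, and a real analysis would grow the tree recursively over many covariates.

```python
# Minimal sketch of the CART idea behind post-hoc predictive segmentation:
# choose the binary attribute whose split most reduces the Gini impurity
# of the adoption label. Firms, attributes and labels are invented.
def gini(labels):
    """Gini impurity of a binary label list."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2.0 * p * (1.0 - p)

def best_split(rows, labels, attrs):
    """Attribute giving the largest Gini impurity reduction."""
    base, n = gini(labels), len(labels)
    scored = []
    for a in attrs:
        left = [y for r, y in zip(rows, labels) if r[a]]
        right = [y for r, y in zip(rows, labels) if not r[a]]
        child = (len(left) * gini(left) + len(right) * gini(right)) / n
        scored.append((base - child, a))
    return max(scored)

rows = [  # hypothetical firm attributes
    {"uses_email": 1, "has_website": 1}, {"uses_email": 1, "has_website": 1},
    {"uses_email": 1, "has_website": 0}, {"uses_email": 0, "has_website": 1},
    {"uses_email": 0, "has_website": 0}, {"uses_email": 0, "has_website": 0},
]
labels = [1, 1, 1, 0, 0, 0]  # adoption follows prior technology use here
gain, attr = best_split(rows, labels, ["uses_email", "has_website"])
print(f"first CART split on {attr!r} (Gini reduction {gain:.3f})")
```

In this toy data the split lands on prior technology use, which mirrors the technological-interdependence finding: firms already using related technologies segment out as likely adopters.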

  16. Usage of data mining for analyzing customer mindset

    Directory of Open Access Journals (Sweden)

    Priti Sadaria

    2012-09-01

    Full Text Available As this is the era of Information Technology, no field remains untouched by computer science. Technology has become an integral part of the business process. By implementing different data mining techniques and algorithms on the feedback collected from customers, the data can be analyzed. With the help of this analysis we gain a clear idea of the customer's mindset and can make meaningful decisions about the production and marketing of a particular product. To study the customer mindset, different models such as classification and association models are used in data mining.
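The association-model step mentioned above can be sketched as pairwise support/confidence rule mining; the feedback "transactions", item names, and thresholds below are invented for illustration.

```python
from itertools import combinations

# Sketch of the association-model step: mine pairwise support/confidence
# rules from customer-feedback "transactions". Items and thresholds are
# invented for illustration.
transactions = [
    {"price_ok", "quality_good", "would_rebuy"},
    {"price_ok", "would_rebuy"},
    {"quality_good", "would_rebuy"},
    {"price_ok", "quality_good", "would_rebuy"},
    {"price_high", "quality_good"},
]

def rules(transactions, min_support=0.4, min_confidence=0.7):
    """Pairwise rules lhs -> rhs passing support/confidence thresholds."""
    n = len(transactions)
    items = sorted(set().union(*transactions))
    out = []
    for a, b in combinations(items, 2):
        for lhs, rhs in ((a, b), (b, a)):
            both = sum(1 for t in transactions if lhs in t and rhs in t)
            lhs_n = sum(1 for t in transactions if lhs in t)
            support = both / n
            confidence = both / lhs_n if lhs_n else 0.0
            if support >= min_support and confidence >= min_confidence:
                out.append((lhs, rhs, support, confidence))
    return out

for lhs, rhs, s, c in rules(transactions):
    print(f"{lhs} -> {rhs}  support={s:.2f} confidence={c:.2f}")
```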

  17. Geospatial Technology

    Science.gov (United States)

    Reed, Philip A.; Ritz, John

    2004-01-01

    Geospatial technology refers to a system that is used to acquire, store, analyze, and output data in two or three dimensions. This data is referenced to the earth by some type of coordinate system, such as a map projection. Geospatial systems include thematic mapping, the Global Positioning System (GPS), remote sensing (RS), telemetry, and…

  18. Toward Integrated μNetwork Analyzer

    Science.gov (United States)

    Kmec, M.; Helbig, M.; Herrmann, R.; Rauschenbach, P.; Sachs, J.; Schilling, K.

    The article deals with recent development steps toward a monolithically integrated micro-Network Analyzer (μNA). The device will deploy M-Sequence-based single-chip transceivers with a built-in ultra-wideband wave separation unit in the receiver chains. The introduced on-chip wideband wave separation is realized using an optimized resistive directional coupler combined with a customized differential LNA as detector. The wave separation works almost down to DC, and its upper frequency limit is determined by the performance of the implemented technology (i.e., bridge resistors, transistors, etc.), the selected circuit topology, and the wiring of particular coupler components, but also by the IC packaging itself. Even though the upper limit is designed to be compatible with the analog input bandwidth of the receiver circuit [about 18 GHz for the naked die (Kmec et al., M-Sequence based single chip UWB-radar sensor. ANTEM/AMEREM 2010 Conference, Ottawa, 2010)], the packaged IC is intended for use up to 8 GHz. Finally, the discussed transceiver is a further development of the mother SiGe System-on-Chip (SoC) presented in the work cited above.

  19. Analyzing Music Services Positioning Through Qualitative Research

    Directory of Open Access Journals (Sweden)

    Manuel Cuadrado

    2015-12-01

    Full Text Available Information technologies have produced new ways of distributing and consuming music, mainly by youth, in relation to both goods and services. In the case of goods, there has been a dramatic shift from traditional ways of buying and listening to music to new digital platforms. There has also been an evolution in music services. In this sense, live music concerts have been losing their audiences over the past few years, as have music radio stations, in favor of streaming platforms. Curious about this phenomenon, we conducted exploratory research in order to analyze how all these services, both traditional and new, are perceived. Specifically, we aimed to study youth's assessment of the three most relevant music service categories: music radio stations, digital streaming platforms, and pop-rock music festivals. To do so, we used the projective technique of image association to gather information. The population of the study consisted of individuals between 18 and 25 years of age. Our results, after content analysis, were poor due to spontaneous recall. Therefore, we duplicated the study, but in a more focus-oriented way. The information gathered this time allowed us not only to better understand how all these organizations are positioned but also to obtain a list of descriptors to be used in a subsequent descriptive research study.

  20. Analyzing 5 years of EC-TEL proceedings

    NARCIS (Netherlands)

    Reinhardt, Wolfgang; Meier, Christian; Drachsler, Hendrik; Sloep, Peter

    2011-01-01

    Reinhardt, W., Meier, C., Drachsler, H., & Sloep, P. B. (2011). Analyzing 5 years of EC-TEL proceedings. In C. D. Kloos, D. Gillet, R. M. Crespo García, F. Wild, & M. Wolpers (Eds.), Towards Ubiquitous Learning: 6th European Conference of Technology Enhanced Learning, EC-TEL 2011 (pp. 531-536). Sept

  1. Analyzing Trace Aroma Substances in Chinese Liquor by Combining Large Volume Injection and Mass Spectrometry Technology

    Institute of Scientific and Technical Information of China (English)

    王双; 徐占成; 徐姿静

    2014-01-01

    Chinese liquor contains a large number of flavor and taste components, many of which are present in extremely small amounts, which makes their analysis very difficult. Normally, in order to analyze these trace substances, liquor samples must be concentrated in a pretreatment step and then analyzed by gas chromatography. Sample concentration is very time- and labor-consuming, and the concentrated sample contains a large amount of solvent, which may cause serious analytical errors and damage the chromatographic column or detector. Large volume injection (LVI) technology effectively avoids these shortcomings: it concentrates the sample for analysis in a short time and removes the vast majority of the solvent, thereby greatly improving analytical sensitivity. Combining large volume injection technology with mass spectrometry can greatly improve the accuracy of analyzing the trace components in Chinese liquor.

  2. Oxygen analyzers: failure rates and life spans of galvanic cells.

    Science.gov (United States)

    Meyer, R M

    1990-07-01

    Competing technologies exist for measuring oxygen concentrations in breathing circuits. Over a 4-year period, two types of oxygen analyzers were studied prospectively in routine clinical use to determine the incidence and nature of malfunctions. Newer AC-powered galvanic analyzers (North American Dräger O2med) were compared with older, battery-powered polarographic analyzers (Ohmeda 201) by recording all failures and necessary repairs. The AC-powered galvanic analyzer had a significantly lower incidence of failures (0.12 +/- 0.04 failures per machine-month) than the battery-powered polarographic analyzer (4.0 +/- 0.3 failures per machine-month). Disposable capsules containing the active galvanic cells lasted 12 +/- 7 months. Although the galvanic analyzers tended to remain out of service longer, awaiting the arrival of costly parts, the polarographic analyzers were more expensive to keep operating when calculations included the cost of time spent on repairs. Stocking galvanic capsules would have decreased the amount of time the galvanic analyzers were out of service, while increasing costs. In conclusion, galvanic oxygen analyzers appear capable of delivering more reliable service at a lower overall cost. By keeping the galvanic capsules exposed to room air during periods of storage, it should be possible to prolong their life span, further decreasing the cost of using them. In addition, recognizing the aberrations in their performance that warn of the exhaustion of the galvanic cells should permit timely recording and minimize downtime.

  3. 46 CFR 154.1360 - Oxygen analyzer.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 5 2010-10-01 2010-10-01 false Oxygen analyzer. 154.1360 Section 154.1360 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) CERTAIN BULK DANGEROUS CARGOES SAFETY STANDARDS... Instrumentation § 154.1360 Oxygen analyzer. The vessel must have a portable analyzer that measures oxygen...

  4. Analyzing the Change-Proneness of APIs and web APIs

    OpenAIRE

    Romano, D.

    2015-01-01

    APIs and web APIs are used to expose existing business logic and, hence, to ease the reuse of functionalities across multiple software systems. Software systems can use the business logic of legacy systems by binding their APIs and web APIs. With the emergence of a new programming paradigm called service-oriented, APIs are exposed as web APIs hiding the technologies used to implement legacy systems. As a consequence, web APIs establish contr...

  5. Analyzing Real-Time Behavior of Flash Memories

    OpenAIRE

    Parthey, Daniel

    2007-01-01

    Flash memories are used as the main storage in many portable consumer electronic devices because they are more robust than hard drives. This document gives an overview of existing consumer flash memory technologies which are mostly removable flash memory cards. It discusses to which degree consumer flash devices are suitable for real-time systems and provides a detailed timing analysis of some consumer flash devices. Further, it describes methods to analyze mount times, access per...

  6. Electrical spectrum & network analyzers a practical approach

    CERN Document Server

    Helfrick, Albert D

    1991-01-01

    This book presents fundamentals and the latest techniques of electrical spectrum analysis. It focuses on instruments and techniques used in spectrum and network analysis, rather than theory. The book covers the use of spectrum analyzers, tracking generators, and network analyzers. Filled with practical examples, the book presents techniques that are widely used in signal processing and communications applications, yet are difficult to find in most literature. Key Features: presents numerous practical examples, including actual spectrum analyzer circuits; instruction on how to us

  7. ADAM: Analyzer for Dialectal Arabic Morphology

    Directory of Open Access Journals (Sweden)

    Wael Salloum

    2014-12-01

    Full Text Available While Modern Standard Arabic (MSA has many resources, Arabic Dialects, the primarily spoken local varieties of Arabic, are quite impoverished in this regard. In this article, we present ADAM (Analyzer for Dialectal Arabic Morphology. ADAM is a poor man’s solution to quickly develop morphological analyzers for dialectal Arabic. ADAM has roughly half the out-of-vocabulary rate of a state-of-the-art MSA analyzer and is comparable in its recall performance to an Egyptian dialectal morphological analyzer that took years and expensive resources to build.

  8. Designing of Acousto-optic Spectrum Analyzer

    Institute of Scientific and Technical Information of China (English)

    WANG Dan-zhi; SHAO Ding-rong; LI Shu-jian

    2004-01-01

    The structure of the acousto-optic spectrum analyzer was investigated, including the RF amplifying circuit, the optical structures and the postprocessing circuit, and a modular design approach was applied to the spectrum analyzer. The modular spectrum analyzer offers stable performance and higher reliability, and different modules can be used according to different demands. The spectrum analyzer achieved a detection frequency error of 0.58 MHz, a detection responsivity of 90 dBm and a bandwidth of 50 MHz.

  9. Entrepreneur and Technological Change

    OpenAIRE

    Myung-Joong Kwon; Jong-gul Lee

    1999-01-01

    The literature on technological change has grown in the last two decades and has made a number of significant theoretical advances, but the role of the entrepreneur in technological change has been relatively ignored. In this paper we attempted to fill this gap and explored the role of the entrepreneur in generating technological change. We constructed an empirical model to analyze the role of the entrepreneur in technological change in the context of deciding the undertaking of the innovatio...

  10. Social Media: A Phenomenon to be Analyzed

    Directory of Open Access Journals (Sweden)

    danah boyd

    2015-04-01

    Full Text Available The phenomenon of “social media” has more to do with its cultural positioning than its technological affordances. Rooted in the broader “Web 2.0” landscape, social media helped engineers, entrepreneurs, and everyday people reimagine the role that technology could play in information dissemination, community development, and communication. While the technologies invoked by the phrase social media have a long history, what unfolded in the 2000s reconfigured socio-technical practices in significant ways. Reflecting on the brief history of social media, this essay argues for the need to better understand this phenomenon.

  11. Performance evaluation of PL-11 platelet analyzer

    Institute of Scientific and Technical Information of China (English)

    张有涛

    2013-01-01

    Objective To evaluate and report the performance of the PL-11 platelet analyzer. Methods Intravenous blood samples anticoagulated with EDTA-K2 and sodium citrate were tested by the PL-11 platelet analyzer to evaluate the intra-assay and inter-assay coefficients of variation (CV),

  12. [Health technology in Mexico].

    Science.gov (United States)

    Cruz, C; Faba, G; Martuscelli, J

    1992-01-01

    The features of the health technology cycle are presented, and the effects of the demographic, epidemiologic and economic transition on the health technology demand in Mexico are discussed. The main problems of science and technology in the context of a decreasing scientific and technological activity due to the economic crisis and the adjustment policies are also analyzed: administrative and planning problems, low impact of scientific production, limitations of the Mexican private sector, and the obstacles for technology assessment. Finally, this paper also discusses the main support strategies for science and technology implemented by the Mexican government during the 1980s and the challenges and opportunities that lie ahead.

  13. Computational models for analyzing lipoprotein profiles

    NARCIS (Netherlands)

    Graaf, A.A. de; Schalkwijk, D.B. van

    2011-01-01

    At present, several measurement technologies are available for generating highly detailed concentration-size profiles of lipoproteins, offering increased diagnostic potential. Computational models are useful in aiding the interpretation of these complex datasets and making the data more accessible f

  14. Using Three-Dimensional Fluorescence Spectrum Technology to Analyze the Effects of Natural Dissolved Organic Matter on the Pesticide Residues in the Soil%溶解性有机物对土壤中农药残留与分布影响的光谱学研究

    Institute of Scientific and Technical Information of China (English)

    雷宏军; 潘红卫; 韩宇平; 刘鑫; 徐建新

    2015-01-01

    The behavior of pesticide in soil is influenced by dissolved organic matter (DOM) through competition adsorption, adsorption, solubilization, accelerated degradation, and so on. Thus DOM and its components play an important role in the environmental risk to the soil ecosystem and the groundwater environment. Currently, most studies have focused on the short-term effect of high concentrations of DOM on pesticide residues. However, soil DOM is mainly present at low levels. Therefore, it is of practical significance to probe into the environmental behavior of soil pesticides under natural levels of DOM. Thus a site investigation was conducted in farmland with a long-term history of pesticide application. Using three-dimensional excitation-emission fluorescence matrix (3D-EEM) technology, together with the fluorescence regional integration (FRI) quantitative method, the long-term effects on pesticide residues under low concentrations of natural DOM were analyzed. Results showed that: (1) The long-term effects of the natural DOM components on the environmental behavior of most soil organochlorine pesticides were not significant, except for a few pesticides such as γ-HCH and p,p'-DDE. (2) The influencing effects of DOM components on different types of pesticides varied. Among them, the content of the tyrosine component showed a significantly negative correlation (p<0.05) with the concentrations of γ-HCH and p,p'-DDE. There were significant positive correlations (p<0.05) between the by-products of microbial degradation in the DOM components and the concentration of heptachlor, and between the content of the active humus component of humic acid in the DOM and the concentration of heptachlor epoxide. These results suggested that the distribution of different types of pesticide residues in the soil was influenced by different components at different levels of significance. (3
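The fluorescence regional integration (FRI) step can be sketched as summing EEM intensity over operationally defined excitation/emission regions. The region boundaries below only approximate the common five-region convention, and the tiny EEM is synthetic.

```python
# Sketch of fluorescence regional integration (FRI): sum EEM intensity
# over operationally defined excitation/emission regions. Boundaries
# approximate the common five-region convention; the EEM is synthetic.
REGIONS = {  # (ex_lo, ex_hi, em_lo, em_hi) in nm
    "aromatic protein I":  (200, 250, 280, 330),
    "aromatic protein II": (200, 250, 330, 380),
    "fulvic-like":         (200, 250, 380, 550),
    "microbial byproduct": (250, 400, 280, 380),
    "humic-like":          (250, 400, 380, 550),
}

def fri(eem):
    """Integrate intensity per region; eem maps (ex, em) -> intensity."""
    totals = dict.fromkeys(REGIONS, 0.0)
    for (ex, em), inten in eem.items():
        for name, (xlo, xhi, mlo, mhi) in REGIONS.items():
            if xlo <= ex < xhi and mlo <= em < mhi:
                totals[name] += inten
    return totals

# Synthetic EEM: a tyrosine-like peak and a humic-like peak.
eem = {(225, 305): 8.0, (230, 300): 6.0, (320, 420): 4.0, (340, 450): 3.0}
totals = fri(eem)
grand = sum(totals.values())
for name, v in totals.items():
    print(f"{name:20s} {100 * v / grand:5.1f}%")
```

The per-region percentages are the component quantities (tyrosine-like, microbial byproduct, humic-like, etc.) that the study correlates with individual pesticide concentrations.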

  15. Low Gravity Drug Stability Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall goal of this proposed program (through Phase III) is to build a space-worthy Drug Stability Analyzer that can determine the extent of drug degradation....

  16. On-Demand Urine Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall goal of this program (through Phase III) is to develop an analyzer that can be integrated into International Space Station (ISS) toilets to measure key...

  17. Ultrasensitive Atmospheric Analyzer for Miniature UAVs Project

    Data.gov (United States)

    National Aeronautics and Space Administration — In this SBIR Phase I effort, Los Gatos Research (LGR) proposes to develop a highly-accurate, lightweight, low-power gas analyzer for quantification of water vapor...

  18. Low Gravity Drug Stability Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this proposed program through Phase III is to build a space-worthy Drug Stability Analyzer that can determine the extent of drug degradation. It will be...

  19. Analyzing IS User Requirements using Organizational Semiotics

    Directory of Open Access Journals (Sweden)

    Kamyar Raissifar

    2014-09-01

    Full Text Available In recent years, the lack of an appropriate understanding of IS user requirements has been one of the most important causes of IS development failure. Therefore, many methods have been introduced for better analyzing user requirements, some of them philosophically different. Organizational semiotics (OS) is one of these methods, which, with a phenomenological and action-oriented view, tries to achieve better system requirements analysis. In this research, organizational semiotics and SSADM were first compared, with a focus on their ability to elicit and analyze IS user requirements; then, OS was applied to an IS requirements analysis case. Research findings show that OS is superior to SSADM in many dimensions, although SSADM has superiority in a few dimensions too. Therefore, using OS can help analyze IS user requirements more appropriately.

  20. Analyzing Log Files using Data-Mining

    OpenAIRE

    Marius Mihut

    2008-01-01

    Information systems (i.e. servers, applications and communication devices) create a large amount of monitoring data that is saved as log files. For analyzing them, a data-mining approach is helpful. This article presents the steps necessary for creating an 'analyzing instrument' based on the open source software Waikato Environment for Knowledge Analysis (Weka) [1]. For exemplification, a system log file created by a Windows-based operating system is used as the input file.

  1. Network analysis using organizational risk analyzer

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The tool system of the Organizational Risk Analyzer (ORA) is selected to study the network of East Turkistan terrorists. The model of the relationships among its personnel, knowledge, resources and task entities is represented by the meta-matrix in ORA, with which the risks and vulnerabilities of the organizational structure are analyzed quantitatively to obtain the organization's vulnerabilities and risks. A case study in this system shows that it should be a shortcut to effectively destroy the network...
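One meta-matrix measure of the kind ORA computes can be sketched as agent-knowledge exclusivity: members who alone hold some knowledge item are the points whose removal hurts the organization most. The matrix below is invented, and the measure is a simplified stand-in for ORA's actual metrics.

```python
# Sketch of a meta-matrix vulnerability measure: agent-knowledge
# "exclusivity" flags agents who alone hold some knowledge item.
# The 4x4 agent-by-knowledge matrix is invented for illustration.
agents = ["a1", "a2", "a3", "a4"]
AK = [  # AK[i][j] = 1 if agent i holds knowledge item j
    [1, 0, 1, 0],
    [0, 1, 0, 0],
    [1, 0, 0, 1],
    [0, 0, 0, 1],
]

def exclusivity(AK):
    """Per-agent count of knowledge items held by that agent alone."""
    holders = [sum(row[j] for row in AK) for j in range(len(AK[0]))]
    return [sum(1 for j, v in enumerate(row) if v and holders[j] == 1)
            for row in AK]

scores = exclusivity(AK)
print("exclusivity per agent:", dict(zip(agents, scores)))
print("most critical agents:",
      [a for a, s in zip(agents, scores) if s == max(scores)])
```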

  2. The Information Flow Analyzing Based on CPC

    Institute of Scientific and Technical Information of China (English)

    ZHANG Zhang; LI Hui

    2005-01-01

    The information flow chart within the product life cycle is given based on collaborative production commerce (CPC) thoughts. In this chart, the separated information systems are integrated by means of enterprise knowledge assets, which are promoted by CPC from production knowledge. The information flow in the R&D process is analyzed in the environment of a virtual R&D group and distributed PDM. In addition, the information flow throughout the manufacturing and marketing process is analyzed in the CPC environment.

  3. QUBIT DATA STRUCTURES FOR ANALYZING COMPUTING SYSTEMS

    Directory of Open Access Journals (Sweden)

    Vladimir Hahanov

    2014-11-01

    Full Text Available Qubit models and methods for improving the performance of software and hardware for analyzing digital devices by increasing the dimension of the data structures and memory are proposed. The basic concepts, terminology and definitions necessary for implementing quantum computing in the analysis of virtual computers are introduced. Investigation results concerning the design and modeling of computer systems in cyberspace based on the use of a two-component structure are presented.

  4. Health technology

    International Nuclear Information System (INIS)

    The CEA is an organization with a primarily technological focus, and one of the key areas in which it carries out research is Health Technology. This field of research was recognized and approved by the French Atomic Energy Committee on July 20, 2004. The expectations of both the public and health care professionals relate to demands for the highest standards of health care, at minimum risk. This implies a need to diagnose illness and disease as accurately and at as early a stage as possible, to target surgery precisely to deal only with damaged organs or tissues, to minimize the risk of side effects, allergies and hospital-acquired infections, and to follow up and, as far as possible, tailor the health delivery system to each individual's needs and lifestyle. The health care sector is subject to rapid changes and embraces a vast range of scientific fields. It now requires technological developments that will serve to gather increasing quantities of useful information, analyze and integrate it to obtain a full understanding of highly complex processes, and to be able to treat the human body as non-invasively as possible. All the technologies developed require assessment, especially in the hospital environment. (authors)

  5. Analyzing Broadband Divide in the Farming Sector

    DEFF Research Database (Denmark)

    Jensen, Michael; Gutierrez Lopez, Jose Manuel; Pedersen, Jens Myrup

    2013-01-01

    The agriculture industry has been evolving for centuries. Currently, the technological development of Internet-oriented farming tools makes it possible to increase the productivity and efficiency of this sector. Many of the already available tools and applications require high bandwidth in both directions...... difference between the broadband availability for farms and the rest of the households/buildings in the country. This divide may be slowing down the potential technological development of the farming industry, which it needs to keep its competitiveness in the market. Therefore, broadband development in rural areas...... could be one of the points to focus on in near-future broadband access plans....

  6. Macrofoundation for Strategic Technology Management

    DEFF Research Database (Denmark)

    Pedersen, Jørgen Lindgaard

    1995-01-01

    Neoclassical mainstream economics has no perspective on strategic technology management issues. Market failure economics (externalities etc.) can be of some use to analyze problems of relevance to strategic management in which technology plays a part. Environment, inequality and democratic...

  7. An Axiomatic, Unified Representation of Biosystems and Quantum Dynamics

    CERN Document Server

    Baianu, I

    2004-01-01

    An axiomatic representation of system dynamics is introduced in terms of categories, functors, organismal supercategories, limits and colimits of diagrams. Specific examples are considered in Complex Systems Biology, such as ribosome biogenesis and Hormonal Control in human subjects. "Fuzzy" Relational Structures are also proposed for flexible representations of biological system dynamics and organization.

  8. Carbon Nanomaterials: Applications in Physico-chemical Systems and Biosystems

    Directory of Open Access Journals (Sweden)

    Maheshwar Sharon

    2008-07-01

    Full Text Available In the present article, various forms of carbon and carbon nanomaterials (CNMs), and a new approach to classify them on the basis of sp2-sp3 configuration, are presented. Utilising the concept of junction formation (like a p:n junction), a concept is developed to explain the special reactivity of nanosized carbon materials. Geometric consideration of the chiral and achiral symmetry of single-walled carbon nanotubes is presented, which is also responsible for manifesting the special properties of carbon nanotubes. A brief introduction to various common synthesis techniques of CNMs is given. These increased chemical and biological activities have resulted in many engineered nanoparticles, which are being designed for specific purposes, including diagnostic or therapeutic medical uses and environmental remediation. Defence Science Journal, 2008, 58(4), pp. 460-485, DOI: http://dx.doi.org/10.14429/dsj.58.1668

  9. A Biosystems Approach to Industrial Patient Monitoring and Diagnostic Devices

    CERN Document Server

    Baura, Gail

    2008-01-01

    A medical device is an apparatus that uses engineering and scientific principles to interface to physiology and diagnose or treat a disease. In this Lecture, we specifically consider those medical devices that are computer based, and are therefore referred to as medical instruments. Further, the medical instruments we discuss are those that incorporate system theory into their designs. We divide these types of instruments into those that provide continuous observation and those that provide a single snapshot of health information. These instruments are termed patient monitoring devices and diagnostic devices.

  10. Optical Detection of core-gold nanoshells inside biosystems

    Science.gov (United States)

    D'Acunto, Mario; Dinarelli, Simone; Cricenti, Antonio; Luce, Marco

    2016-02-01

    Metal nanoshells having a dielectric core with a thin gold layer are generating new interest due to the unique optical, electric and magnetic properties exhibited by the local field enhancement near the metal - dielectric core interface. These nanoshells possess strong, highly tunable local plasmon resonances with frequencies dependent upon the nanoshell shape and core material. These unique characteristics have applications in biosensing, optical communication and medicine. In this paper, we developed a theoretical, numerical and experimental approach based on a scanning near optical microscope to identify nanoshells inside mouse cells. Taking advantage of the characteristic near-infrared transparency window of many biological systems, i.e. the low light absorption coefficient of biological systems between 750-1100 nm, we were able to identify a 100-150 nm diameter barium titanate-gold nanoshell inside the h9c2 mouse cells.

  11. Dashboard for Analyzing Ubiquitous Learning Log

    Science.gov (United States)

    Lkhagvasuren, Erdenesaikhan; Matsuura, Kenji; Mouri, Kousuke; Ogata, Hiroaki

    2016-01-01

    Mobile and ubiquitous technologies have been applied to a wide range of learning fields such as science, social science, history and language learning. Many researchers have been investigating the development of ubiquitous learning environments; nevertheless, to date, there have not been enough research works related to the reflection, analysis…

  12. Energy and technology review

    Energy Technology Data Exchange (ETDEWEB)

    1981-10-01

    Research is described in three areas: high-technology design of unconventional, nonnuclear weapons; a model for analyzing special nuclear materials safeguards decisions; and a nuclear weapons accident exercise (NUWAX-81). (GHT)

  13. Energy and technology review

    International Nuclear Information System (INIS)

    Research is described in three areas: high-technology design of unconventional, nonnuclear weapons; a model for analyzing special nuclear materials safeguards decisions; and a nuclear weapons accident exercise (NUWAX-81)

  14. Description of the prototype diagnostic residual gas analyzer for ITER.

    Science.gov (United States)

    Younkin, T R; Biewer, T M; Klepper, C C; Marcus, C

    2014-11-01

    The diagnostic residual gas analyzer (DRGA) system to be used during ITER tokamak operation is being designed at Oak Ridge National Laboratory to measure fuel ratios (deuterium and tritium), fusion ash (helium), and impurities in the plasma. The eventual purpose of this instrument is machine protection, basic control, and physics on ITER. Prototyping is ongoing to optimize the hardware setup and measurement capabilities. The DRGA prototype comprises a vacuum system and measurement technologies that will overlap to meet ITER measurement requirements. Three technologies included in this diagnostic are a quadrupole mass spectrometer, an ion trap mass spectrometer, and an optical Penning gauge, which are designed to document relative and absolute gas concentrations.

  15. Detecting influenza outbreaks by analyzing Twitter messages

    CERN Document Server

    Culotta, Aron

    2010-01-01

    We analyze over 500 million Twitter messages from an eight month period and find that tracking a small number of flu-related keywords allows us to forecast future influenza rates with high accuracy, obtaining a 95% correlation with national health statistics. We then analyze the robustness of this approach to spurious keyword matches, and we propose a document classification component to filter these misleading messages. We find that this document classifier can reduce error rates by over half in simulated false alarm experiments, though more research is needed to develop methods that are robust in cases of extremely high noise.
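    The core of this approach, correlating a keyword-hit rate against official statistics, can be sketched in a few lines. The keyword list, weekly messages, and official rates below are hypothetical illustrations, not the paper's data or code.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

FLU_KEYWORDS = {"flu", "influenza", "fever", "cough"}  # illustrative list

def weekly_keyword_rate(messages):
    """Fraction of messages containing at least one flu-related keyword."""
    hits = sum(1 for m in messages if FLU_KEYWORDS & set(m.lower().split()))
    return hits / len(messages)

# Hypothetical data: one batch of messages per week, plus official rates.
weeks = [
    ["got the flu today", "nice weather", "bad cough and fever"],
    ["lovely day", "coffee time", "reading a book"],
    ["influenza outbreak at school", "flu shot queue", "fever again", "ok"],
]
official_rates = [0.08, 0.01, 0.12]  # hypothetical weekly influenza rates

predicted = [weekly_keyword_rate(w) for w in weeks]
r = pearson(predicted, official_rates)
```

    In practice a document classifier would first filter out spurious keyword matches, as the abstract notes, before the correlation step.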

  16. Progresses in analyzing 26Al with SMCAMS

    International Nuclear Information System (INIS)

    Shanghai Mini-Cyclotron based Accelerator Mass Spectrometer (SMCAMS) was especially designed for analyzing 14C. In order to accelerate and analyze 26Al, the acceleration orbit and the beam optics of the injection system were calculated, and the harmonic number and the number of acceleration turns were optimized. A preliminary experiment was carried out, in which a beam current of 1.15 × 10^-9 A for 27Al- and a background of 0.038 CPS for 26Al were measured. The sensitivity limit for 26Al/27Al is 5.25 × 10^-12. (authors)
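    The quoted sensitivity limit follows directly from the measured beam current and background count rate: convert the 27Al- current to an ion rate via the elementary charge, then divide the 26Al background rate by it. A quick arithmetic check:

```python
# Back-of-the-envelope check of the reported 26Al/27Al sensitivity limit.
E_CHARGE = 1.602e-19          # elementary charge, C

beam_current_27al = 1.15e-9   # A, measured 27Al- beam current
background_26al = 0.038       # counts per second attributed to 26Al

ions_per_second_27al = beam_current_27al / E_CHARGE   # ~7.2e9 ions/s
sensitivity_limit = background_26al / ions_per_second_27al
# ~5.3e-12, consistent with the reported 5.25e-12
```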

  17. Analyzing Log Files using Data-Mining

    Directory of Open Access Journals (Sweden)

    Marius Mihut

    2008-01-01

    Full Text Available Information systems (i.e. servers, applications and communication devices) create a large amount of monitoring data that is saved as log files. A data-mining approach is helpful for analyzing them. This article presents the steps necessary for creating an ‘analyzing instrument’ based on the open source software Waikato Environment for Knowledge Analysis (Weka) [1]. For exemplification, a system log file created by a Windows-based operating system is used as the input file.
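    A minimal sketch of the pre-processing step such an 'analyzing instrument' needs: parsing raw log lines into structured records and exporting them as a table that a data-mining tool like Weka can load (CSV, convertible to ARFF). The log format and field layout here are hypothetical simplifications, not Weka's own API.

```python
import csv
import io
from collections import Counter

# Hypothetical log lines in a simplified "date time source LEVEL message" format.
RAW_LOG = """\
2008-01-10 09:12:01 Service Control Manager INFO service started
2008-01-10 09:12:05 Disk WARN retry on device
2008-01-10 09:13:40 Disk ERROR read failure
2008-01-10 09:14:02 Service Control Manager INFO service stopped
"""

def parse(line):
    """Split a line into structured fields; the free-text message is dropped."""
    date, time, rest = line.split(" ", 2)
    tokens = rest.split()
    # The first all-caps token marks the severity level; the source precedes it.
    level_idx = next(i for i, t in enumerate(tokens) if t.isupper())
    return {
        "date": date,
        "time": time,
        "source": " ".join(tokens[:level_idx]),
        "level": tokens[level_idx],
    }

records = [parse(l) for l in RAW_LOG.strip().splitlines()]
level_counts = Counter(r["level"] for r in records)

# Export as CSV, which Weka can load directly (or convert to ARFF).
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["date", "time", "source", "level"])
writer.writeheader()
writer.writerows(records)
csv_text = buf.getvalue()
```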

  18. Analyzing the flight of a quadcopter using a smartphone

    CERN Document Server

    Monteiro, Martín; Cabeza, Cecilia; Marti, Arturo C

    2015-01-01

    Remotely-controlled helicopters and planes have been used as toys for decades. However, only recently have advances in sensor technologies made it possible to easily fly and control these devices at an affordable price. Along with their increasing availability, the educational opportunities are also proliferating. Here, a simple experiment in which a smartphone is mounted on a quadcopter is proposed to investigate the basics of flight. Thanks to the smartphone's built-in accelerometer and gyroscope, takeoff, landing and yaw are analyzed.

  19. Analyzing the Information Economy: Tools and Techniques.

    Science.gov (United States)

    Robinson, Sherman

    1986-01-01

    Examines methodologies underlying studies which measure the information economy and considers their applicability and limitations for analyzing policy issues concerning libraries and library networks. Two studies provide major focus for discussion: Porat's "The Information Economy: Definition and Measurement" and Machlup's "Production and…

  20. 40 CFR 92.109 - Analyzer specifications.

    Science.gov (United States)

    2010-07-01

    ... comparable results to an HFID not using this procedure. These data must be submitted to the Administrator for... consistent with the general requirements of 40 CFR part 1065, subpart I, for sampling and analysis of... NO2 to NO converter. (ii) For high vacuum CL analyzers with heated capillary modules, supplying...

  1. Portable peltier-cooled X RF analyzer

    International Nuclear Information System (INIS)

    Full text: Recent developments in semiconductor detectors have made it possible to design portable, battery-operated XRF analyzers. Energy resolution and peak-to-background ratio are close to the values of liquid-nitrogen-cooled detectors. Application examples are given and a comparison between the new device and older ones is made. (author)

  2. 40 CFR 90.313 - Analyzers required.

    Science.gov (United States)

    2010-07-01

    ... ionization (HFID) type. For constant volume sampling, the hydrocarbon analyzer may be of the flame ionization (FID) type or of the heated flame ionization (HFID) type. (ii) For the HFID system, if the temperature... drying. Chemical dryers are not an acceptable method of removing water from the sample. Water removal...

  3. Analyzing Vessel Behavior Using Process Mining

    NARCIS (Netherlands)

    Maggi, F.M.; Mooij, A.J.; Aalst, W.M.P. van der

    2013-01-01

    In the maritime domain, electronic sensors such as AIS receivers and radars collect large amounts of data about the vessels in a certain geographical area. We investigate the use of process mining techniques for analyzing the behavior of the vessels based on these data. In the context of maritime sa

  4. Miniature retarding grid ion energy analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Gibson, G.W.; Sawin, H.H.

    1992-12-01

    A retarding grid analyzer intended for use as a high-density (~10^12/cc) plasma diagnostic has been designed, built and tested. The analyzer's external dimensions are 0.125 inch x 0.125 inch x 0.050 inch, which are smaller than macroscopic plasma scale lengths, thus allowing it to be stalk mounted and moved throughout the plasma. The grids are 2000 line/inch nickel mesh so that the linear dimension of the grid open area is less than the Debye length for plasmas with 10 eV electrons and 10^12/cc densities. Successive grids are separated by 0.01 inch in order to avoid space charge effects between grids and thus allow unprecedented energy resolution. Also, because the linear dimension normal to the grid is small compared to the ion mean free path in high pressure (>100 mTorr) discharges, it can be used without the differential pumping required of larger GEAs in such discharges. The analyzer has been tested on a plasma beam source (a modified ASTeX Compact ECR source) and on an ASTeX S1500 ECR source, and has been used as an edge diagnostic on the VERSATOR tokamak at M.I.T. Ion energy distribution functions as narrow as 5 eV have been measured.
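    The design criterion that the grid opening be smaller than the electron Debye length can be checked numerically from the parameters quoted in the abstract; a sketch (the mesh pitch is used as an upper bound on the open dimension):

```python
import math

# Physical constants (SI)
EPS0 = 8.854e-12      # vacuum permittivity, F/m
E_CHARGE = 1.602e-19  # elementary charge, C

def debye_length(te_ev, n_per_cc):
    """Electron Debye length in meters, for temperature in eV and density in cm^-3."""
    n = n_per_cc * 1e6  # convert cm^-3 to m^-3
    return math.sqrt(EPS0 * te_ev / (n * E_CHARGE))

# Conditions quoted in the abstract: 10 eV electrons at 1e12/cc.
ld = debye_length(10.0, 1e12)   # ~2.4e-5 m, i.e. roughly 24 micrometers

# A 2000 line/inch mesh has a pitch of 25.4 mm / 2000, an upper bound
# on the linear dimension of the open area.
pitch = 25.4e-3 / 2000          # ~1.27e-5 m, roughly 13 micrometers

# The open dimension is below the pitch, hence below the Debye length,
# so the screening criterion stated in the abstract is satisfied.
```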

  5. Consideration Regarding Diagnosis Analyze of Corporate Management

    Directory of Open Access Journals (Sweden)

    Mihaela Ciopi OPREA

    2009-01-01

    Full Text Available Diagnosis management aims to identify critical situations and positive aspects of corporate management. An effective diagnosis, made by a team that is independent of the organization's management, provides managers with the useful feedback necessary to improve performance. The work presented focuses on the methodology for achieving an effective diagnosis, considering the multitude of criteria and variables to be analyzed.

  6. Analyzing computer system performance with Perl

    CERN Document Server

    Gunther, Neil J

    2011-01-01

    This expanded second edition of Analyzing Computer System Performance with Perl::PDQ builds on the success of the first edition. It contains new chapters on queues, tools and virtualization, and a new Perl listing format to aid the readability of PDQ models.

  7. Analyzing volatile compounds in dairy products

    Science.gov (United States)

    Volatile compounds give the first indication of the flavor in a dairy product. Volatiles are isolated from the sample matrix and then analyzed by chromatography, sensory methods, or an electronic nose. Isolation may be performed by solvent extraction or headspace analysis, and gas chromatography i...

  8. Strengthening 4-H by Analyzing Enrollment Data

    Science.gov (United States)

    Hamilton, Stephen F.; Northern, Angela; Neff, Robert

    2014-01-01

    The study reported here used data from the ACCESS 4-H Enrollment System to gain insight into strengthening New York State's 4-H programming. Member enrollment lists from 2009 to 2012 were analyzed using Microsoft Excel to determine trends and dropout rates. The descriptive data indicate declining 4-H enrollment in recent years and peak…

  9. Imaging thermal plasma mass and velocity analyzer

    Science.gov (United States)

    Yau, Andrew W.; Howarth, Andrew

    2016-07-01

    We present the design and principle of operation of the imaging ion mass and velocity analyzer on the Enhanced Polar Outflow Probe (e-POP), which measures low-energy (1-90 eV/e) ion mass composition (1-40 AMU/e) and velocity distributions using a hemispherical electrostatic analyzer (HEA), a time-of-flight (TOF) gate, and a pair of toroidal electrostatic deflectors (TED). The HEA and TOF gate measure the energy-per-charge and azimuth of each detected ion and the ion transit time inside the analyzer, respectively, providing the 2-D velocity distribution of each major ionospheric ion species and resolving the minor ion species under favorable conditions. The TED are in front of the TOF gate and optionally sample ions at different elevation angles up to ±60°, for measurement of 3-D velocity distribution. We present examples of observation data to illustrate the measurement capability of the analyzer, and show the occurrence of enhanced densities of heavy "minor" O++, N+, and molecular ions and intermittent, high-velocity (a few km/s) upward and downward flowing H+ ions in localized regions of the quiet time topside high-latitude ionosphere.

  10. Quantum Key Distribution with Screening and Analyzing

    CERN Document Server

    Kye, W H

    2006-01-01

    We propose a quantum key distribution scheme using screening angles and analyzing detectors, which make it possible to notice the presence of Eve, who eavesdrops on the quantum channel. We show the security of the proposed quantum key distribution against impersonation, photon number splitting, Trojan Horse, and composite attacks.

  11. Quantum Key Distribution with Screening and Analyzing

    OpenAIRE

    Kye, Won-Ho

    2006-01-01

    We propose a quantum key distribution scheme using screening angles and analyzing detectors, which make it possible to notice the presence of Eve, who eavesdrops on the quantum channel, as a revised protocol of the recent quantum key distribution [Phys. Rev. Lett. 95, 040501 (2005)]. We discuss the security of the proposed quantum key distribution against various attacks, including the impersonation attack and the Trojan Horse attack.

  12. How to Analyze Company Using Social Network?

    Science.gov (United States)

    Palus, Sebastian; Bródka, Piotr; Kazienko, Przemysław

    Every single company or institution wants to utilize its resources in the most efficient way. In order to do so, it has to have a good structure. This paper presents a new way to analyze company structure by utilizing the natural social network existing within the company, together with an example of its usage on the Enron company.

  13. Graphic method for analyzing common path interferometers

    DEFF Research Database (Denmark)

    Glückstad, J.

    1998-01-01

    Common path interferometers are widely used for visualizing phase disturbances and fluid flows. They are attractive because of the inherent simplicity and robustness in the setup. A graphic method will be presented for analyzing and optimizing filter parameters in common path interferometers....

  14. APPLICATION OF NEW TECHNOLOGIES TO THE REGIONAL MANAGEMENT AGRIBUSINESS

    OpenAIRE

    S. V. Kolyadenko; Makolkina, Ye. V.

    2014-01-01

    The authors analyzed the feasibility of new technologies in the management of the Regional O...

  15. Analyzing Music Services Positioning Through Qualitative Research

    OpenAIRE

    Manuel Cuadrado; María José Miquel; Juan D. MONTORO

    2015-01-01

    Information technologies have produced new ways of distributing and consuming music, mainly by youth, in relation to both goods and services. In the case of goods, there has been a dramatic shift from traditional ways of buying and listening to music to new digital platforms. There has also been an evolution in relation to music services. In this sense, live music concerts have been losing their audiences over the past few years, as have music radio stations, in favor of streaming platforms. ...

  16. Technology Policy and Employment.

    Science.gov (United States)

    Williams, Bruce

    1983-01-01

    Current social and economic problems in the United Kingdom are placed in the context of long-term trends in labor economics and the impact of new technology. The relationship of technological change and economic recovery is analyzed. Policy implications and the university's role are discussed. (MSE)

  17. Technology-Use Mediation

    DEFF Research Database (Denmark)

    Bansler, Jørgen P.; Havn, Erling C.

    2003-01-01

    This study analyzes how a group of ‘mediators’ in a large, multinational company adapted a computer-mediated communication technology (a ‘virtual workspace’) to the organizational context (and vice versa) by modifying features of the technology, providing ongoing support for users, and promoting...

  18. BK/TD models for analyzing in vitro impedance data on cytotoxicity

    OpenAIRE

    Teng, Sophie; Barcellini-Couget, Sylvie; Beaudoin, Rémy; Desousa, Georges; Rahmani, Roger; Pery, Alexandre

    2015-01-01

    The ban of animal testing has enhanced the development of new in vitro technologies for cosmetics safety assessment. Impedance metrics is one such technology which enables monitoring of cell viability in real time. However, analyzing real time data requires moving from static to dynamic toxicity assessment. In the present study, we built mechanistic biokinetic/toxicodynamic (BK/TD) models to analyze the time course of cell viability in cytotoxicity assay using impedance. These models accou...

  19. Methodology for analyzing risk at nuclear facilities

    International Nuclear Information System (INIS)

    Highlights: • A new methodology for evaluating the risk at nuclear facilities was developed. • Five measures reflecting all factors that should be considered in assessing risk were developed. • Attributes on NMAC and nuclear security culture are included among the analysis attributes. • The newly developed methodology can be used to evaluate the risk of both existing facilities and future nuclear systems. - Abstract: A methodology for evaluating risks at nuclear facilities is developed in this work. A series of measures is drawn from the analysis of factors that determine risks. Five measures are created to evaluate risks at nuclear facilities: the legal and institutional framework, material control, physical protection system effectiveness, human resources, and consequences. Evaluation attributes are developed for each measure, and specific values are given in order to calculate the risk value quantitatively. Questionnaires are drawn up on whether or not a state has properly established a legal and regulatory framework (based on international standards). These questionnaires can be a useful measure for comparing the status of the physical protection regime between two countries. Analyzing an insider threat is not an easy task, and no methodology had previously been developed for this purpose. In this study, attributes that can quantitatively evaluate an insider threat, in the case of an unauthorized removal of nuclear materials, are developed by adopting the Nuclear Material Accounting & Control (NMAC) system. The effectiveness of a physical protection system, P(E), can be analyzed by calculating the probability of interruption, P(I), and the probability of neutralization, P(N). In this study, the Tool for Evaluating Security System (TESS) code developed by KINAC is used to calculate P(I) and P(N). Consequence is an important measure used to analyze risks at nuclear facilities. This measure comprises radiological, economic, and social damage. Social and

  20. External technology sources: Embodied or disembodied technology acquisition

    OpenAIRE

    Cassiman, Bruno; Veugelers, Reinhilde

    2000-01-01

    This paper analyzes the choice between different innovation activities of a firm. In particular, we study the technology acquisition decision of the firm, i.e. its technology BUY decision as part of the firm's innovation strategy. We take a closer look at the different types of external technology acquisition where we distinguish two broad types of technology buy decisions. On the one hand, the firm can acquire new technology which is embodied in an asset that is acquir...

  1. An improved prism energy analyzer for neutrons

    Energy Technology Data Exchange (ETDEWEB)

    Schulz, J., E-mail: jennifer.schulz@helmholtz-berlin.de [Helmholtz-Zentrum Berlin für Materialien und Energie, Hahn-Meitner-Platz 1, 14109 Berlin (Germany); Ott, F. [Laboratoire Leon Brillouin, Bât 563 CEA Saclay, 91191 Gif sur Yvette Cedex (France); Krist, Th. [Helmholtz-Zentrum Berlin für Materialien und Energie, Hahn-Meitner-Platz 1, 14109 Berlin (Germany)

    2014-04-21

    The effects of two improvements of an existing neutron energy analyzer consisting of stacked silicon prism rows are presented. First we tested the effect of coating the back of the prism rows with an absorbing layer to suppress neutron scattering by total reflection and by refraction at small angles. Experiments at HZB showed that this works perfectly. Second the prism rows were bent to shift the transmitted wavelength band to larger wavelengths. At HZB we showed that bending increased the transmission of neutrons with a wavelength of 4.9 Å. Experiments with a white beam at the EROS reflectometer at LLB showed that bending of the energy analyzing device to a radius of 7.9 m allows to shift the transmitted wavelength band from 0 to 9 Å to 2 to 16 Å.

  2. The EPOS Automated Selective Chemistry Analyzer evaluated.

    Science.gov (United States)

    Moses, G C; Lightle, G O; Tuckerman, J F; Henderson, A R

    1986-01-01

    We evaluated the analytical performance of the EPOS (Eppendorf Patient Oriented System) Automated Selective Chemistry Analyzer, using the following tests for serum analytes: alanine and aspartate aminotransferases, lactate dehydrogenase, creatine kinase, gamma-glutamyltransferase, alkaline phosphatase, and glucose. Results from the EPOS correlated well with those from comparison instruments (r greater than or equal to 0.990). Precision and linearity limits were excellent for all tests; linearity of the optical and pipetting systems was satisfactory. Reagent carryover was negligible. Sample-to-sample carryover was less than 1% for all tests, but only lactate dehydrogenase was less than the manufacturer's specified 0.5%. Volumes aspirated and dispensed by the sample and reagent II pipetting systems differed significantly from preset values, especially at lower settings; the reagent I system was satisfactory at all volumes tested. Minimal daily maintenance and an external data-reduction system make the EPOS a practical alternative to other bench-top chemistry analyzers.

  3. An improved prism energy analyzer for neutrons

    International Nuclear Information System (INIS)

    The effects of two improvements of an existing neutron energy analyzer consisting of stacked silicon prism rows are presented. First we tested the effect of coating the back of the prism rows with an absorbing layer to suppress neutron scattering by total reflection and by refraction at small angles. Experiments at HZB showed that this works perfectly. Second the prism rows were bent to shift the transmitted wavelength band to larger wavelengths. At HZB we showed that bending increased the transmission of neutrons with a wavelength of 4.9 Å. Experiments with a white beam at the EROS reflectometer at LLB showed that bending of the energy analyzing device to a radius of 7.9 m allows to shift the transmitted wavelength band from 0 to 9 Å to 2 to 16 Å

  4. Neutral Particle Analyzer Diagnostic on NSTX

    Energy Technology Data Exchange (ETDEWEB)

    S.S. Medley; A.L. Roquemore

    2004-03-16

    The Neutral Particle Analyzer (NPA) diagnostic on the National Spherical Torus Experiment (NSTX) utilizes a PPPL-designed E||B spectrometer that measures the energy spectra of minority hydrogen and bulk deuterium species simultaneously, with 39 energy channels per mass species and a time resolution of 1 ms. The calibrated energy range is E = 0.5-150 keV, and the energy resolution ΔE/E varies from 3% to 7% over the surface of the microchannel plate detector.

  5. SACO: Static analyzer for concurrent objects

    OpenAIRE

    Albert Albiol, Elvira; Arenas Sánchez, Purificación; Flores Montoya, A.; Genaim, Samir; Gómez-Zamalloa Gil, Miguel; Martín Martín, Enrique; Puebla, G.; Román Díez, Guillermo

    2014-01-01

    We present the main concepts, usage and implementation of SACO, a static analyzer for concurrent objects. Interestingly, SACO is able to infer both liveness (namely termination and resource boundedness) and safety properties (namely deadlock freedom) of programs based on concurrent objects. The system integrates auxiliary analyses such as points-to and may-happen-in-parallel, which are essential for increasing the accuracy of the aforementioned more complex properties. SACO provides accurate ...

  6. LEGAL-EASE: Analyzing Chinese Financial Statements

    Institute of Scientific and Technical Information of China (English)

    EDWARD; MA

    2008-01-01

    In this article, we will focus on understanding and analyzing the typical accounts of Chinese financial statements, including the balance sheet and income statement. Accounts are generally incorrectly prepared. This can be due to several factors: incompetence, as well as more serious cases of deliberate attempts to deceive. Regardless, accounts can be understood and errors or specific acts of misrepresentation uncovered. We will conduct some simple analysis to demonstrate how these can be spotted.

  7. Organization theory. Analyzing health care organizations.

    Science.gov (United States)

    Cors, W K

    1997-02-01

    Organization theory (OT) is a tool that can be applied to analyze and understand health care organizations. Transaction cost theory is used to explain, in a unifying fashion, the myriad changes being undertaken by different groups of constituencies in health care. Agency theory is applied to aligning economic incentives needed to ensure Integrated Delivery System (IDS) success. By using tools such as OT, a clearer understanding of organizational changes is possible. PMID:10164970

  8. A gas filter correlation analyzer for methane

    Science.gov (United States)

    Sebacher, D. I.

    1978-01-01

    A fast-response instrument for monitoring CH4 was designed and tested using a modified nondispersive infrared technique. An analysis of the single-beam rotating-cell system is presented along with the signal processing circuit. A calibration of the instrument shows that the technique can be used to measure CH4 concentrations as small as 5 ppm-m and the effects of interfering gases are analyzed.

  9. Information Theory for Analyzing Neural Networks

    OpenAIRE

    Sørngård, Bård

    2014-01-01

    The goal of this thesis was to investigate how information theory can be used to analyze artificial neural networks. For this purpose, two problems, a classification problem and a controller problem, were considered. The classification problem was solved with a feedforward neural network trained with backpropagation; the controller problem was solved with a continuous-time recurrent neural network optimized with evolution. Results from the classification problem show that mutual information ...

  10. Firms’ Innovation Strategies Analyzed and Explained

    OpenAIRE

    Tavassoli, Sam; Karlsson, Charlie

    2015-01-01

    This paper analyzes various innovation strategies of firms. Using five waves of the Community Innovation Survey in Sweden, we have traced the innovative behavior of firms over a ten-year period, i.e. between 2002 and 2012. We distinguish between sixteen innovation strategies, which are composed of Schumpeter's four types of innovation (process, product, marketing, and organizational) plus various combinations of these four types. First, we find that firms are not homogeneous in choosing innovatio...

  11. The analyzing of Dove marketing strategy

    Institute of Scientific and Technical Information of China (English)

    Guo; Yaohui

    2015-01-01

    1. Introduction. In this report, I try to analyze the related information about DOVE chocolate. Firstly, I would like to introduce this product. Dove chocolate is one of a series of products launched by the world's largest pet food and snack food manufacturer, the U.S. multinational food company Mars. Having entered China in 1989, it has become China's leading brand of chocolate in

  12. Coordinating, Scheduling, Processing and Analyzing IYA09

    Science.gov (United States)

    Gipson, John; Behrend, Dirk; Gordon, David; Himwich, Ed; MacMillan, Dan; Titus, Mike; Corey, Brian

    2010-01-01

    The IVS scheduled a special astrometric VLBI session for the International Year of Astronomy 2009 (IYA09) commemorating 400 years of optical astronomy and 40 years of VLBI. The IYA09 session is the most ambitious geodetic session to date in terms of network size, number of sources, and number of observations. We describe the process of designing, coordinating, scheduling, pre-session station checkout, correlating, and analyzing this session.

  13. Modular Construction of Shape-Numeric Analyzers

    Directory of Open Access Journals (Sweden)

    Bor-Yuh Evan Chang

    2013-09-01

    Full Text Available The aim of static analysis is to infer invariants about programs that are precise enough to establish semantic properties, such as the absence of run-time errors. Broadly speaking, there are two major branches of static analysis for imperative programs. Pointer and shape analyses focus on inferring properties of pointers, dynamically-allocated memory, and recursive data structures, while numeric analyses seek to derive invariants on numeric values. Although simultaneous inference of shape-numeric invariants is often needed, this case is especially challenging and is not particularly well explored. Notably, simultaneous shape-numeric inference raises complex issues in the design of the static analyzer itself. In this paper, we study the construction of such shape-numeric, static analyzers. We set up an abstract interpretation framework that allows us to reason about simultaneous shape-numeric properties by combining shape and numeric abstractions into a modular, expressive abstract domain. Such a modular structure is highly desirable to make its formalization and implementation easier to do and get correct. To achieve this, we choose a concrete semantics that can be abstracted step-by-step, while preserving a high level of expressiveness. The structure of abstract operations (i.e., transfer, join, and comparison) follows the structure of this semantics. The advantage of this construction is to divide the analyzer into modules and functors that implement abstractions of distinct features.
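    The component-wise combination of a numeric and a shape abstraction can be illustrated with a toy product domain. This is a drastically simplified sketch of the general idea, not the paper's formal, functor-based construction; the domains and values are invented for illustration.

```python
# A toy product of two abstract domains: an interval domain for numeric
# values, and a minimal "shape" domain for pointers with elements
# "null", "nonnull", and "top". Joins are computed component-wise.
class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def join(self, other):
        """Least upper bound: the smallest interval covering both."""
        return Interval(min(self.lo, other.lo), max(self.hi, other.hi))

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

def join_shape(a, b):
    """Join in the flat shape lattice: equal elements stay, else go to top."""
    return a if a == b else "top"

class Product:
    """Pairs a numeric abstraction with a shape abstraction."""
    def __init__(self, interval, shape):
        self.interval, self.shape = interval, shape

    def join(self, other):
        return Product(self.interval.join(other.interval),
                       join_shape(self.shape, other.shape))

# Two branches of a conditional: length in [0, 3] with a null pointer vs.
# length in [1, 8] with a non-null pointer; the join over-approximates both.
merged = Product(Interval(0, 3), "null").join(Product(Interval(1, 8), "nonnull"))
```

    A reduced product would additionally let each component refine the other; the plain pairwise join above is only the starting point.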

  14. Thermo Scientific Sulfur Dioxide Analyzer Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Springston, S. R. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-03-01

    The Sulfur Dioxide Analyzer measures sulfur dioxide based on absorbance of UV light at one wavelength by SO2 molecules, which then decay to a lower energy state by emitting UV light at a longer wavelength. Specifically: SO2 + hν1 → SO2* → SO2 + hν2. The emitted light is proportional to the concentration of SO2 in the optical cell. External communication with the analyzer is available through an Ethernet port configured through the instrument network of the AOS systems. The Model 43i-TLE is part of the i-series of Thermo Scientific instruments. The i-series instruments are designed to interface with external computers through the proprietary Thermo Scientific iPort Software. However, this software is somewhat cumbersome and inflexible. BNL has written an interface program in National Instruments LabView that both controls the Model 43i-TLE Analyzer AND queries the unit for all measurement and housekeeping data. The LabView vi (the software program written by BNL) ingests all raw data from the instrument and outputs raw data files in a uniform data format similar to other instruments in the AOS and described more fully in Section 6.0 below.

  15. Analyzing Malware Based on Volatile Memory

    Directory of Open Access Journals (Sweden)

    Liang Hu

    2013-11-01

    Full Text Available To motivate the need for a comprehensive, automated analysis process for volatile memory, this paper first summarizes common analysis methods and what they share, particularly the data sources they draw on. A memory analysis framework, Volatility 2.2, is then recommended, together with statistics on output file sizes. In addition, to address the limitations of the framework's plug-in classification, a classification from the user's perspective is proposed. Furthermore, a guideline procedure for comprehensive analysis is introduced, organized according to differences in target data sources, the volume of the result data sets, and the relational methods employed. Finally, a demonstration of analyzing DLL load-order lists is presented: the DLL load list is treated as a characteristic data source, associated with its process, and converted into a process-behavior fingerprint. The fingerprints are clustered using a string-similarity algorithm, a model with a wide range of applications in traditional malware behavior analysis, and it is argued that these methods can also be applied to volatile memory.
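    The final step of the demo can be sketched in a few lines. The abstract does not specify the similarity model, so difflib's sequence-matching ratio stands in for it here, and the greedy clustering pass is likewise only illustrative:

```python
from difflib import SequenceMatcher

def fingerprint(dll_list):
    """Flatten a process's DLL load-order list into a string fingerprint."""
    return "|".join(dll_list)

def similarity(fp_a, fp_b):
    """String similarity degree in [0, 1] between two fingerprints."""
    return SequenceMatcher(None, fp_a, fp_b).ratio()

def cluster(fingerprints, threshold=0.7):
    """Greedy single-pass clustering: join the first cluster whose
    representative is similar enough, otherwise start a new cluster."""
    clusters = []
    for fp in fingerprints:
        for c in clusters:
            if similarity(c[0], fp) >= threshold:
                c.append(fp)
                break
        else:
            clusters.append([fp])
    return clusters
```

    Two processes that load nearly the same DLLs in the same order then fall into one cluster, while a process with a disjoint load list starts its own.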

  16. Aliasing Errors in Parallel Signature Analyzers

    Institute of Scientific and Technical Information of China (English)

    闵应骅; Yashwant K. Malaiya

    1990-01-01

    A Linear Feedback Shift Register (LFSR) can be used to compress test response data as a Signature Analyzer (SA). Parallel Signature Analyzers (PSAs), implemented as multiple-input LFSRs, are faster and require less hardware overhead than Serial Signature Analyzers (SSAs) for compacting test response data for Built-In Self-Test (BIST) in IC or board-testing environments. However, SAs are prone to aliasing errors caused by specific types of error patterns. An alias is a faulty output signature that is identical to the fault-free signature. A penetrating analysis of the detection capability of SAs must rest on mathematical treatment rather than on a few special-case examples. In addition, the analysis should not be restricted to a particular structure of LFSR, but should be appropriate for various structures of LFSRs. This paper presents necessary and sufficient conditions for aliasing errors based on a complete mathematical description of various types of SAs. An LFSR reconfiguration scheme is suggested which will prevent any aliasing double errors. Such prevention cannot be obtained by any extension of an LFSR.
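    The aliasing condition has a compact illustration for a serial, single-input SA: the signature is the remainder of the response stream, read as a GF(2) polynomial, modulo the feedback polynomial, so any error pattern divisible by that polynomial aliases. A minimal sketch (the 4-bit polynomial x^4 + x + 1 is chosen purely for illustration):

```python
def lfsr_signature(bits, poly=0b10011, width=4):
    """Compress a response bit stream into a `width`-bit signature: the
    remainder of the stream, read as a GF(2) polynomial, modulo `poly`
    (an internal-XOR LFSR)."""
    sig = 0
    for b in bits:
        sig = (sig << 1) | b    # shift in the next response bit
        if sig >> width:        # degree reached `width`: reduce mod poly
            sig ^= poly
    return sig

fault_free = [1, 0, 1, 1, 0, 0, 1, 0]
# The error polynomial x^3 * (x^4 + x + 1) is divisible by the feedback
# polynomial, so the faulty stream compresses to the SAME signature: an alias.
error      = [1, 0, 0, 1, 1, 0, 0, 0]
faulty     = [a ^ e for a, e in zip(fault_free, error)]
```

    Both streams differ in three bit positions yet yield identical signatures, which is exactly the aliasing case the paper's necessary-and-sufficient conditions characterize.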

  17. Method of stabilizing single channel analyzers

    International Nuclear Information System (INIS)

    A method and apparatus to reduce the drift of single channel analyzers are described. Essentially, this invention employs a time-sharing or multiplexing technique to ensure that the outputs from two single channel analyzers (SCAs) maintain the same count ratio regardless of variations in the threshold voltage source or other voltage changes. The multiplexing is accomplished when a flip-flop, actuated by a clock, changes state to switch the outputs from the individual SCAs before these outputs are sent to a ratio counting scaler. In the particular system embodiment disclosed that illustrates this invention, the sulfur content of coal is determined by subjecting the coal to radiation from a neutron-producing source. A photomultiplier and detector system converts the transmitted gamma radiation to an analog voltage signal and sends this signal, after amplification, to an SCA system that contains the invention. Therein, at least two single channel analyzers scan the analog signal over different parts of a spectral region. The two outputs may then be sent to a digital multiplexer so that the output from the multiplexer contains counts falling within two distinct segments of the region. By dividing the counts from the multiplexer by each other, the percentage of sulfur within the coal sample under observation may be determined. (U.S.)

  18. On-line chemical composition analyzer development

    Energy Technology Data Exchange (ETDEWEB)

    Roberts, M.J.; Garrison, A.A.; Muly, E.C.; Moore, C.F.

    1992-02-01

    The energy consumed in distillation processes in the United States represents nearly three percent of the total national energy consumption. It has been estimated that effective control of distillation columns would result in a reduction in the national energy consumption of 0.3%. Real-time control based on mixture composition could achieve these savings. However, the major distillation processes represent diverse applications, and at present there does not exist a proven on-line chemical composition sensor technology which can be used to control these diverse processes in real-time. This report presents a summary of the findings of the second phase of a three-phase effort undertaken to develop an on-line real-time measurement and control system utilizing Raman spectroscopy. A prototype instrument system has been constructed utilizing a Perkin Elmer 1700 Spectrometer, a diode-pumped YAG laser, two three-axis positioning systems, a process sample cell, and a personal computer. This system has been successfully tested using industrially supplied process samples to establish its performance. Also, continued application development was undertaken during this phase of the program using both the spontaneous Raman and surface-enhanced Raman modes of operation. The study was performed for the US Department of Energy, Office of Industrial Technologies, whose mission is to conduct cost-shared R&D for new high-risk, high-payoff industrial energy conservation technologies. Although this document contains references to individual manufacturers and their products, the opinions expressed on the products reported do not necessarily reflect the position of the Department of Energy.

  19. A incorporação de novas tecnologias nos serviços de saúde: o desafio da análise dos fatores em jogo Adoption of new technologies by health services: the challenge of analyzing relevant factors

    Directory of Open Access Journals (Sweden)

    Evelinda Trindade

    2008-05-01

    Full Text Available A dinâmica exponencial de incorporação tecnológica na saúde tem sido considerada como uma das razões para o crescimento dos gastos do setor. Estas decisões envolvem múltiplos níveis e stakeholders. A descentralização multiplicou os níveis de decisão, com difíceis escolhas múltiplas e recursos restritos. A inter-relação entre os atores é complexa, em sistemas criativos com múltiplos determinantes e fatores de confusão. Esta revisão discute a interação entre os fatores que influenciam as decisões de incorporação de tecnologias nos serviços de saúde e propõe uma estrutura para sua análise. A aplicação e intensidade desses fatores nos processos de decisão de incorporação de produtos e programas nos serviços de saúde conformam a capacidade instalada nas redes locais e regionais e modifica o sistema de saúde. A observação empírica dos processos de decisão de incorporação tecnológica nos serviços de saúde do Brasil constitui um desafio importante. O reconhecimento estruturado e dimensionamento destas variáveis podem auxiliar a melhorar o planejamento pró-ativo dos serviços de saúde. The exponential increase in the incorporation of health technologies has been considered a key factor in increased expenditures by the health sector. Such decisions involve multiple levels and stakeholders. Decentralization has multiplied the decision-making levels, with numerous difficult choices and limited resources. The interrelationship between stakeholders is complex, in creative systems with multiple determinants and confounders. The current review discusses the interaction between the factors influencing the decisions to incorporate technologies by health services, and proposes a structure for their analysis. The application and intensity of these factors in decision-making and the incorporation of products and programs by health services shapes the installed capacity of local and regional networks and modifies the health system. Empirical observation of the decision-making processes for technology adoption in Brazilian health services constitutes an important challenge. Structured identification and measurement of these variables can help improve proactive planning by health services.

  20. A chemical analyzer for charged ultrafine particles

    Directory of Open Access Journals (Sweden)

    S. G. Gonser

    2013-04-01

    Full Text Available New particle formation is a frequent phenomenon in the atmosphere and of major significance for the earth's climate and human health. To date the mechanisms leading to the nucleation of particles as well as to aerosol growth are not completely understood. A lack of appropriate measurement equipment for online analysis of the chemical composition of freshly nucleated particles is one major limitation. We have developed a Chemical Analyzer for Charged Ultrafine Particles (CAChUP) capable of analyzing particles with diameters below 30 nm. A bulk of size-separated particles is collected electrostatically on a metal filament, resistively desorbed and subsequently analyzed for its molecular composition in a time-of-flight mass spectrometer. We report technical details as well as characterization experiments performed with the CAChUP. Our instrument was tested in the laboratory for its detection performance as well as for its collection and desorption capabilities. The manual application of known masses of camphene (C10H16) to the desorption filament resulted in a detection limit between 0.5 and 5 ng, and showed a linear response of the mass spectrometer. Flow tube experiments with 25 nm diameter secondary organic aerosol from ozonolysis of alpha-pinene also showed a linear relation between collection time and the mass spectrometer's signal intensity. The resulting mass spectra from the collection experiments are in good agreement with published work on particles generated by the ozonolysis of alpha-pinene. A sensitivity study shows that the current setup of CAChUP is ready for laboratory measurements and for the observation of new particle formation events in the field.

  1. A computer program for analyzing channel geometry

    Science.gov (United States)

    Regan, R.S.; Schaffranek, R.W.

    1985-01-01

    The Channel Geometry Analysis Program (CGAP) provides the capability to process, analyze, and format cross-sectional data for input to flow/transport simulation models or other computational programs. CGAP allows for a variety of cross-sectional data input formats through use of variable format specification. The program accepts data from various computer media and provides for modification of machine-stored parameter values. CGAP has been devised to provide a rapid and efficient means of computing and analyzing the physical properties of an open-channel reach defined by a sequence of cross sections. CGAP's 16 options provide a wide range of methods by which to analyze and depict a channel reach and its individual cross-sectional properties. The primary function of the program is to compute the area, width, wetted perimeter, and hydraulic radius of cross sections at successive increments of water surface elevation (stage) from data that consist of coordinate pairs of cross-channel distances and land surface or channel bottom elevations. Longitudinal rates-of-change of cross-sectional properties are also computed, as are the mean properties of a channel reach. Output products include tabular lists of cross-sectional area, channel width, wetted perimeter, hydraulic radius, average depth, and cross-sectional symmetry computed as functions of stage; plots of cross sections; plots of cross-sectional area and (or) channel width as functions of stage; tabular lists of cross-sectional area and channel width computed as functions of stage for subdivisions of a cross section; plots of cross sections in isometric projection; and plots of cross-sectional area at a fixed stage as a function of longitudinal distance along an open-channel reach. A Command Procedure Language program and Job Control Language procedure exist to facilitate program execution on the U.S. Geological Survey Prime and Amdahl computer systems, respectively. (Lantz-PTT)
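    The core per-stage computation the program performs can be sketched as follows. This is a simplified re-implementation for illustration, not CGAP's own code; stations are assumed ordered across the channel, and segments crossing the waterline are split by linear interpolation:

```python
import math

def section_properties(xz, stage):
    """Area, top width, wetted perimeter, and hydraulic radius of a cross
    section at a given water-surface elevation (stage). `xz` is an ordered
    list of (cross-channel distance, elevation) pairs."""
    area = width = perim = 0.0
    for (x1, z1), (x2, z2) in zip(xz, xz[1:]):
        d1, d2 = stage - z1, stage - z2       # water depths at segment ends
        if d1 <= 0 and d2 <= 0:
            continue                          # segment entirely dry
        if d1 < 0 or d2 < 0:                  # segment crosses the waterline
            t = d1 / (d1 - d2)                # fraction where depth == 0
            xc = x1 + t * (x2 - x1)
            if d1 < 0:
                x1, z1, d1 = xc, stage, 0.0
            else:
                x2, z2, d2 = xc, stage, 0.0
        area += 0.5 * (d1 + d2) * (x2 - x1)   # trapezoidal rule
        width += x2 - x1
        perim += math.hypot(x2 - x1, z2 - z1)
    return area, width, perim, (area / perim if perim else 0.0)
```

    Evaluating this at successive stage increments reproduces the stage tables described above; for a rectangular section 10 units wide at stage 2, it returns area 20, width 10, and wetted perimeter 14.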

  2. Measuring and analyzing on natural radioactive nuclide uranium concentration in mineral water from market

    International Nuclear Information System (INIS)

    Mineral water from the market was measured and analyzed for the natural radioactive nuclide uranium using laser fluorescence analysis with the standard mix method. Seventeen samples of mineral water were chosen, and the LMA-3 laser trace analysis instrument was employed. The results show that the uranium content of the mineral water falls within the normal radioactive background level.

  3. Thermo Scientific Ozone Analyzer Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Springston, S. R. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-03-01

    The primary measurement output from the Thermo Scientific Ozone Analyzer is the concentration of the analyte (O3) reported at 1-s resolution in units of ppbv in ambient air. Note that because of internal pneumatic switching limitations the instrument only makes an independent measurement every 4 seconds. Thus, the same concentration number is repeated roughly 4 times at the uniform, monotonic 1-s time base used in the AOS systems. Accompanying instrument outputs include sample temperatures, flows, chamber pressure, lamp intensities and a multiplicity of housekeeping information. There is also a field for operator comments made at any time while data is being collected.

  4. FOMA: A Fast Optical Multichannel Analyzer

    Science.gov (United States)

    Haskovec, J. S.; Bramson, G.; Brooks, N. H.; Perry, M.

    1989-12-01

    A Fast Optical Multichannel Analyzer (FOMA) was built for spectroscopic measurements with fast time resolution on the DIII-D tokamak. The FOMA utilizes a linear photodiode array (RETICON RL 1024 SA) as the detector sensor. An external recharge switch and ultrafast operational amplifiers permit a readout time per pixel of 300 ns. In conjunction with standard CAMAC digitizer and timing modules, a readout time of 500 μs is achieved for the full 1024-element array. Data acquired in bench tests and in actual spectroscopic measurements on the DIII-D tokamak are presented to illustrate the camera's capability.

  5. Spectrum Analyzers Incorporating Tunable WGM Resonators

    Science.gov (United States)

    Savchenkov, Anatoliy; Matsko, Andrey; Strekalov, Dmitry; Maleki, Lute

    2009-01-01

    A photonic instrument is proposed to boost the resolution for ultraviolet/ optical/infrared spectral analysis and spectral imaging allowing the detection of narrow (0.00007-to-0.07-picometer wavelength resolution range) optical spectral signatures of chemical elements in space and planetary atmospheres. The idea underlying the proposal is to exploit the advantageous spectral characteristics of whispering-gallery-mode (WGM) resonators to obtain spectral resolutions at least three orders of magnitude greater than those of optical spectrum analyzers now in use. Such high resolutions would enable measurement of spectral features that could not be resolved by prior instruments.

  6. APPLICATION OF IMAGE MANIPULATION FOR CAVITATION ANALYZING

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    A new method, called image manipulation, is introduced for the first time to analyze cavitation in a flow field. Because the complexity of cavitation development must be considered, only image manipulation can calculate the strength of the cavitation with sufficient accuracy. The method uses a wavelet transform to eliminate noise, and the area of the cavitation region is taken as the measure of cavitation strength. The method is applied to an example of an inducer's rotating cavitation. The results show that image manipulation yields accurate cavitation data with ease, and the cause of the inducer shaft's vibration is clearly revealed.

  7. Analyzing Ever Growing Datasets in PHENIX

    Energy Technology Data Exchange (ETDEWEB)

    Pinkenburg, C.; PHENIX Collaboration

    2010-10-18

    After 10 years of running, the PHENIX experiment has by now accumulated more than 700 TB of reconstructed data which are directly used for analysis. Analyzing these amounts of data efficiently requires a coordinated approach. Beginning in 2005 we started to develop a system for the RHIC Atlas Computing Facility (RACF) which allows the efficient analysis of these large data sets. The Analysis Taxi is now the tool which allows any collaborator to process any data set taken since 2003 in weekly passes with turnaround times of typically three to four days.

  8. CRISP90 - SOFTWARE DESIGN ANALYZER SYSTEM

    Science.gov (United States)

    Tausworthe, R. C.

    1994-01-01

    The CRISP90 Software Design Analyzer System, an update of CRISP-80, is a set of programs forming a software design and documentation tool which supports top-down, hierarchic, modular, structured design and programming methodologies. The quality of a computer program can often be significantly influenced by the design medium in which the program is developed. The medium must foster the expression of the programmer's ideas easily and quickly, and it must permit flexible and facile alterations, additions, and deletions to these ideas as the design evolves. The CRISP90 software design analyzer system was developed to provide the PDL (Programmer Design Language) programmer with such a design medium. A program design using CRISP90 consists of short, English-like textual descriptions of data, interfaces, and procedures that are imbedded in a simple, structured, modular syntax. The display is formatted into two-dimensional, flowchart-like segments for a graphic presentation of the design. Together with a good interactive full-screen editor or word processor, the CRISP90 design analyzer becomes a powerful tool for the programmer. In addition to being a text formatter, the CRISP90 system prepares material that would be tedious and error prone to extract manually, such as a table of contents, module directory, structure (tier) chart, cross-references, and a statistics report on the characteristics of the design. Referenced modules are marked by schematic logic symbols to show conditional, iterative, and/or concurrent invocation in the program. A keyword usage profile can be generated automatically and glossary definitions inserted into the output documentation. Another feature is the capability to detect changes that were made between versions. Thus, "change-bars" can be placed in the output document along with a list of changed pages and a version history report. Also, items may be marked as "to be determined" and each will appear on a special table until the item is resolved.

  9. Analyzing PICL trace data with MEDEA

    Energy Technology Data Exchange (ETDEWEB)

    Merlo, A.P. [Pavia Univ. (Italy). Dipt di Informatica e Sistemistica; Worley, P.H. [Oak Ridge National Lab., TN (United States)

    1993-11-01

    Execution traces and performance statistics can be collected for parallel applications on a variety of multiprocessor platforms by using the Portable Instrumented Communication Library (PICL). The static and dynamic performance characteristics of performance data can be analyzed easily and effectively with the facilities provided within the MEasurements Description Evaluation and Analysis tool (MEDEA). This report describes the integration of the PICL trace file format into MEDEA. A case study is then outlined that uses PICL and MEDEA to characterize the performance of a parallel benchmark code executed on different hardware platforms and using different parallel algorithms and communication protocols.

  10. Using SCR methods to analyze requirements documentation

    Science.gov (United States)

    Callahan, John; Morrison, Jeffery

    1995-01-01

    Software Cost Reduction (SCR) methods are being utilized to analyze and verify selected parts of NASA's EOS-DIS Core System (ECS) requirements documentation. SCR is being used as a spot-inspection tool. Through this formal and systematic application of the SCR requirements methods, insights are gained as to whether the requirements are internally inconsistent or incomplete as the scenarios of intended usage evolve in the Operations Concept (OC) documentation. Thus, by modelling the scenarios and requirements as mode charts using the SCR methods, we have been able to identify problems within and between the documents.

  11. Analyzing and Mining Ordered Information Tables

    Institute of Scientific and Technical Information of China (English)

    SAI Ying (赛英); Y. Y. Yao

    2003-01-01

    Work in inductive learning has mostly concentrated on classifying. However, there are many applications in which it is desirable to order rather than to classify instances. For modelling ordering problems, we generalize the notion of information tables to ordered information tables by adding order relations on attribute values. We then propose a data analysis model that analyzes the dependency of attributes to describe the properties of ordered information tables. The problem of mining ordering rules is formulated as finding associations between orderings of attribute values and the overall ordering of objects. An ordering rule may state that "if the value of an object x on an attribute a is ordered ahead of the value of another object y on the same attribute, then x is ordered ahead of y". For mining ordering rules, we first transform an ordered information table into a binary information table, and then apply any standard machine learning and data mining algorithm. As an illustration, we analyze in detail Maclean's universities ranking for the year 2000.
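    The transformation into a binary information table can be sketched directly. For illustration it is assumed here that a smaller attribute value means "ordered ahead"; each row of the result describes one ordered pair of objects:

```python
from itertools import permutations

def to_binary_table(table, order_attrs, target):
    """Transform an ordered information table (a list of attribute dicts)
    into a binary table: one row per ordered pair of objects (x, y), where
    each feature records whether x is ordered ahead of y on that attribute,
    and "overall" records the overall ordering of the pair."""
    rows = []
    for x, y in permutations(table, 2):
        feats = {a: x[a] < y[a] for a in order_attrs}
        feats["overall"] = x[target] < y[target]
        rows.append(feats)
    return rows
```

    Any standard classifier can then learn to predict "overall" from the per-attribute orderings, which is precisely the ordering-rule mining step described above.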

  12. Optoacoustic 13C-breath test analyzer

    Science.gov (United States)

    Harde, Hermann; Helmrich, Günther; Wolff, Marcus

    2010-02-01

    The composition and concentration of exhaled volatile gases reflect the physical condition of a patient. Breath analysis therefore makes it possible to recognize an infectious disease in an organ or even to identify a tumor. One of the most prominent breath tests is the 13C-urea breath test, applied to ascertain the presence of the bacterium Helicobacter pylori in the stomach wall as an indication of a gastric ulcer. In this contribution we present a new optical analyzer that employs a compact and simple set-up based on photoacoustic spectroscopy. It consists of two identical photoacoustic cells containing two breath samples, one taken before and one after ingesting an isotope-marked substrate, in which the most common isotope 12C is replaced to a large extent by 13C. The analyzer measures simultaneously the relative CO2 isotopologue concentrations in both samples by exciting the molecules on specially selected absorption lines with a semiconductor laser operating at a wavelength of 2.744 μm. For a reliable diagnosis, changes of the 13CO2 concentration of 1% in the exhaled breath have to be detected at a concentration level of about 500 ppm for this isotope.
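    The quantity of interest in such a pre/post comparison is the relative change of the 13CO2/12CO2 ratio between the two samples. The delta-over-baseline convention used below is the customary way of reporting it in per mil; the convention, and the natural-abundance ratio in the test, are assumptions, not values from the abstract:

```python
def delta_over_baseline(r_pre, r_post):
    """Relative change of the 13CO2/12CO2 isotopologue ratio between the
    post- and pre-substrate breath samples, expressed in per mil
    (the delta-over-baseline, DOB, convention)."""
    return (r_post / r_pre - 1.0) * 1000.0
```

    A 1% relative increase of the 13CO2 fraction, the level the analyzer must resolve, corresponds to a DOB of 10 per mil.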

  13. Analyzing Network Coding Gossip Made Easy

    CERN Document Server

    Haeupler, Bernhard

    2010-01-01

    We give a new technique to analyze the stopping time of gossip protocols that are based on random linear network coding (RLNC). Our analysis drastically simplifies, extends and strengthens previous results. We analyze RLNC gossip in a general framework for network and communication models that encompasses and unifies the models used previously in this context. We show, in most settings for the first time, that it converges with high probability in the information-theoretically optimal time. Most stopping times are of the form O(k + T), where k is the number of messages to be distributed and T is the time it takes to disseminate one message. This means RLNC gossip achieves "perfect pipelining". Our analysis directly extends to highly dynamic networks in which the topology can change completely at any time. This remains true even if the network dynamics are controlled by a fully adaptive adversary that knows the complete network state. Virtually nothing besides simple O(kT) sequential flooding protocols was previously known.
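    The quantity these stopping-time bounds control, the time until a receiver of random coded packets reaches full rank and can decode, can be simulated in a few lines over GF(2). This is a toy single-link sketch, not the paper's network model:

```python
import random

def rank_gf2(vectors, width):
    """Rank over GF(2) of coefficient vectors given as bitmasks
    (incremental Gaussian elimination keyed by the leading bit)."""
    basis = [0] * width
    rank = 0
    for v in vectors:
        for bit in reversed(range(width)):
            if not (v >> bit) & 1:
                continue
            if basis[bit] == 0:
                basis[bit] = v
                rank += 1
                break
            v ^= basis[bit]
    return rank

def rlnc_collect(k, seed=0):
    """Count random nonzero GF(2) combinations of k messages a receiver
    must collect before its coefficient vectors reach full rank."""
    rng = random.Random(seed)
    received, count = [], 0
    while rank_gf2(received, k) < k:
        received.append(rng.randrange(1, 1 << k))  # random nonzero combination
        count += 1
    return count
```

    In expectation only a couple of packets beyond the minimum k are needed, which is the "perfect pipelining" intuition in miniature: coded packets are almost always innovative.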

  14. Atmospheric Aerosol Chemistry Analyzer: Demonstration of feasibility

    Energy Technology Data Exchange (ETDEWEB)

    Mroz, E.J.; Olivares, J.; Kok, G.

    1996-04-01

    This is the final report of a three-year, Laboratory-Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The project objective was to demonstrate the technical feasibility of an Atmospheric Aerosol Chemistry Analyzer (AACA) that will provide a continuous, real-time analysis of the elemental (major, minor and trace) composition of atmospheric aerosols. The AACA concept is based on sampling the atmospheric aerosol through a wet cyclone scrubber that produces an aqueous suspension of the particles. This suspension can then be analyzed for elemental composition by ICP/MS or collected for subsequent analysis by other methods. The key technical challenge was to develop a wet cyclone aerosol sampler suitable for respirable particles found in ambient aerosols. We adapted an ultrasonic nebulizer to a conventional, commercially available, cyclone aerosol sampler and completed collection efficiency tests for the unit, which was shown to efficiently collect particles as small as 0.2 microns. We have completed the necessary basic research and have demonstrated the feasibility of the AACA concept.

  15. Analyzing Interoperability of Protocols Using Model Checking

    Institute of Scientific and Technical Information of China (English)

    WU Peng

    2005-01-01

    In practical terms, protocol interoperability testing is still laborious and error-prone with little effect, even for those products that have passed conformance testing. Deadlock and unsymmetrical data communication are familiar in interoperability testing, and it is always very hard to trace their causes. Previous work has not provided a coherent way to analyze why interoperability was broken among the protocol implementations under test. In this paper, an alternative approach is presented that analyzes these problems from the viewpoint of implementation structures. Sequential and concurrent structures are both representative implementation structures, especially in the event-driven development model. Our research mainly discusses the influence of sequential and concurrent structures on interoperability, with two instructive conclusions: (a) a sequential structure may lead to deadlock; (b) a concurrent structure may lead to unsymmetrical data communication. Therefore, implementation structures have a significant bearing on interoperability, which may not have received much attention before. To some extent, they are decisive for the result of interoperability testing. Moreover, a concurrent structure with a sound task-scheduling strategy may contribute to the interoperability of a protocol implementation. Herein, model checking is introduced into interoperability analysis for the first time. As the paper shows, it is an effective way to validate developers' selections of implementation structures or strategies.

  16. Analyzing delay causes in Egyptian construction projects

    Directory of Open Access Journals (Sweden)

    Mohamed M. Marzouk

    2014-01-01

    Full Text Available Construction delays are common problems in civil engineering projects in Egypt. These problems occur frequently during the project lifetime, leading to disputes and litigation. Therefore, it is essential to study and analyze the causes of construction delays. This research presents a list of construction delay causes retrieved from the literature. The feedback of construction experts was obtained through interviews. Subsequently, a questionnaire survey was prepared and distributed to thirty-three construction experts representing owners, consultants, and contractors' organizations. The Frequency Index, Severity Index, and Importance Index are calculated, and according to their highest values the top ten delay causes of construction projects in Egypt are determined. A case study is analyzed and compared against the most important delay causes identified in the research. Statistical analysis is carried out using the analysis of variance (ANOVA) method to test the delay causes obtained from the survey. The test results reveal good correlation between groups, while there are significant differences between them for some delay causes. Finally, a roadmap for prioritizing delay-cause groups is presented.
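    The three indices can be computed from Likert-type survey scores. The normalization below (index = mean score / maximum score × 100, and Importance = Frequency × Severity / 100) is one common convention in the delay-analysis literature; it is assumed here rather than taken from the paper:

```python
def indices(responses, max_score=4):
    """Frequency Index (FI), Severity Index (SI), and Importance Index (II)
    per delay cause. `responses` maps a cause name to a pair of score lists
    (frequency scores, severity scores) on a 1..max_score scale."""
    out = {}
    for cause, (freq_scores, sev_scores) in responses.items():
        fi = sum(freq_scores) / (max_score * len(freq_scores)) * 100
        si = sum(sev_scores) / (max_score * len(sev_scores)) * 100
        out[cause] = (fi, si, fi * si / 100)
    return out
```

    Ranking causes by the third component of each tuple then yields the "top ten" list the survey analysis produces.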

  17. The Solar Wind Ion Analyzer for MAVEN

    Science.gov (United States)

    Halekas, J. S.; Taylor, E. R.; Dalton, G.; Johnson, G.; Curtis, D. W.; McFadden, J. P.; Mitchell, D. L.; Lin, R. P.; Jakosky, B. M.

    2015-12-01

    The Solar Wind Ion Analyzer (SWIA) on the MAVEN mission will measure the solar wind ion flows around Mars, both in the upstream solar wind and in the magneto-sheath and tail regions inside the bow shock. The solar wind flux provides one of the key energy inputs that can drive atmospheric escape from the Martian system, as well as in part controlling the structure of the magnetosphere through which non-thermal ion escape must take place. SWIA measurements contribute to the top level MAVEN goals of characterizing the upper atmosphere and the processes that operate there, and parameterizing the escape of atmospheric gases to extrapolate the total loss to space throughout Mars' history. To accomplish these goals, SWIA utilizes a toroidal energy analyzer with electrostatic deflectors to provide a broad 360°×90° field of view on a 3-axis spacecraft, with a mechanical attenuator to enable a very high dynamic range. SWIA provides high cadence measurements of ion velocity distributions with high energy resolution (14.5 %) and angular resolution (3.75°×4.5° in the sunward direction, 22.5°×22.5° elsewhere), and a broad energy range of 5 eV to 25 keV. Onboard computation of bulk moments and energy spectra enables measurements of the basic properties of the solar wind at 0.25 Hz.

  18. Sentiment Analyzer for Arabic Comments System

    Directory of Open Access Journals (Sweden)

    Alaa El-Dine Ali Hamouda

    2013-04-01

    Full Text Available Today, the number of users of social networks is increasing. Millions of users share opinions on different aspects of life every day, so social networks are rich sources of data for opinion mining and sentiment analysis. Users have also become more interested in following news pages on Facebook. Several posts, political ones for example, have thousands of user comments that agree or disagree with the post content. Such comments can be a good indicator of community opinion about the post content. For politicians, marketers, and decision makers, sentiment analysis is needed to determine the percentages of users who agree, disagree, or are neutral with respect to a post. This raised the need to analyze users' comments on Facebook. We focused on Arabic Facebook news pages for the task of sentiment analysis. We developed a corpus for sentiment analysis and opinion mining purposes. Then, we used different machine learning algorithms, decision trees, support vector machines, and Naive Bayes, to develop a sentiment analyzer. The performance of the system using each technique was evaluated and compared with the others.
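    One of the compared techniques can be sketched in a few lines: a minimal multinomial Naive Bayes with Laplace smoothing over whitespace tokens. This is an illustration of the classifier family, not the authors' implementation, and English stand-in tokens are used in place of Arabic:

```python
import math
from collections import Counter

class NaiveBayes:
    """Minimal multinomial Naive Bayes text classifier with Laplace smoothing."""
    def fit(self, docs, labels):
        self.prior = Counter(labels)                       # class frequencies
        self.counts = {c: Counter() for c in self.prior}   # per-class token counts
        for doc, y in zip(docs, labels):
            self.counts[y].update(doc.split())
        self.vocab = {w for c in self.counts for w in self.counts[c]}
        return self

    def predict(self, doc):
        def log_posterior(c):
            total = sum(self.counts[c].values()) + len(self.vocab)
            lp = math.log(self.prior[c] / sum(self.prior.values()))
            for w in doc.split():
                lp += math.log((self.counts[c][w] + 1) / total)  # Laplace smoothing
            return lp
        return max(self.prior, key=log_posterior)
```

    Trained on labeled agree/disagree comments, the predicted labels over a page's comment set give exactly the agree/disagree/neutral percentages described above.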

  19. Approaches for Managing and Analyzing Unstructured Data

    Directory of Open Access Journals (Sweden)

    N. Veeranjaneyulu

    2014-01-01

    Full Text Available Large volumes of the data that will be stored and accessed in the future are unstructured. Unstructured data is generated at a very fast pace and occupies large storage areas, which increases the storage budget. Extracting value from this unstructured data while balancing the budget is the most challenging task. Archives of interactive media, satellite and medical images, information from social network sites, legal documents, presentations, and web pages from various data sources affect a data center's ability to maintain control over unstructured data. It is therefore essential to design systems that provide efficient storage of, and access to, these vast and continuously growing repositories of unstructured data. This can be achieved by retrieving structured information from the unstructured data. In this paper, we discuss approaches to process and manage such data. We also elaborate on the architecture, technologies, and applications that facilitate system design and evaluation.

  20. Mapping Technology Space by Normalizing Technology Relatedness Networks

    CERN Document Server

    Alstott, Jeff; Yan, Bowen; Luo, Jianxi

    2015-01-01

    Technology is a complex system, with technologies relating to each other in a space that can be mapped as a network. The technology relatedness network's structure can reveal properties of technologies and of human behavior, if it can be mapped accurately. Technology networks have been made from patent data, using several measures of relatedness. These measures, however, are influenced by factors of the patenting system that do not reflect technologies or their relatedness. We created technology networks that precisely controlled for these impinging factors and normalized them out, using data from 3.9 million patents. The normalized technology relatedness networks were sparse, with only ~20% of technology domain pairs more related than would be expected by chance. Different measures of technology relatedness became more correlated with each other after normalization, approaching a single dimension of technology relatedness. The normalized network corresponded with human behavior: we analyzed the patenting his...
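
    The normalization step the abstract describes, comparing observed relatedness against what chance alone would produce, can be sketched on toy data. Everything below is a hypothetical illustration: the patents, classes, and the observed-over-expected co-classification ratio are stand-ins, not the paper's actual relatedness measures or data.

```python
from itertools import combinations
from collections import Counter

# Hypothetical patents, each tagged with a set of technology classes.
patents = [
    {"A", "B"}, {"A", "B"}, {"A", "C"},
    {"B", "C"}, {"A", "B", "C"}, {"C", "D"},
]

n = len(patents)
class_freq = Counter(c for p in patents for c in p)
pair_freq = Counter(frozenset(pair) for p in patents
                    for pair in combinations(sorted(p), 2))

def normalized_relatedness(c1, c2):
    """Observed co-classification vs. what random co-assignment would give."""
    observed = pair_freq[frozenset((c1, c2))]
    expected = class_freq[c1] * class_freq[c2] / n
    return observed / expected

# Ratios > 1 mean the pair co-occurs more often than expected by chance,
# which is the criterion behind the "sparse" normalized network.
for c1, c2 in [("A", "B"), ("A", "D")]:
    print(c1, c2, round(normalized_relatedness(c1, c2), 3))
```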

  1. Analyzing the problems and countermeasures in the education and teaching reform of the medical beauty technology specialty from the perspective of the current employment situation

    Institute of Scientific and Technical Information of China (English)

    郝超

    2016-01-01

    As an important branch of cosmetic medical education, medical beauty technology is an emerging specialty. The job market needs large numbers of graduates and employment prospects are good; faced with demand exceeding supply, the medical beauty industry is developing in full swing. However, the specialty currently faces many problems: graduates change jobs frequently, and the five-year attrition rate of trained talent is high. Neither the enrollment nor the employment situation is optimistic, and unless this dilemma is resolved, further education and teaching reform will only waste resources. Drawing on the employment record of the medical beauty technology specialty at the author's college over the years, and on its school-enterprise cooperation, this article discusses ideas, measures, and countermeasures for education and teaching reform.

  3. Toward Sustainable Anticipatory Governance: Analyzing and Assessing Nanotechnology Innovation Processes

    Science.gov (United States)

    Foley, Rider Williams

    Cities around the globe struggle with socio-economic disparities, resource inefficiency, environmental contamination, and quality-of-life challenges. Technological innovation, as one prominent approach to problem solving, promises to address these challenges; yet introducing new technologies, such as nanotechnology, into society and cities has often resulted in negative consequences. Recent research has conceptually linked anticipatory governance and sustainability science: to understand the role of technology in the complex problems our societies face; to anticipate negative consequences of technological innovation; and to promote long-term oriented and responsible governance of technologies. This dissertation advances this link conceptually and empirically, focusing on nanotechnology and urban sustainability challenges. The guiding question for this dissertation research is: How can nanotechnology be innovated and governed in responsible ways and with sustainable outcomes? The dissertation: analyzes the nanotechnology innovation process from an actor- and activities-oriented perspective (Chapter 2); assesses this innovation process from a comprehensive perspective on sustainable governance (Chapter 3); constructs a small set of future scenarios to consider future implications of different nanotechnology governance models (Chapter 4); and appraises the amenability of sustainability problems to nanotechnological interventions (Chapter 5). The four studies are based on data collected through literature review, document analysis, participant observation, interviews, workshops, and walking audits, as part of process analysis, scenario construction, and technology assessment. Research was conducted in collaboration with representatives from industry, government agencies, and civic organizations. The empirical parts of the four studies focus on Metropolitan Phoenix. Findings suggest that predefined mandates and economic goals dominate the nanotechnology innovation process ...

  4. Stackable differential mobility analyzer for aerosol measurement

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, Meng-Dawn (Oak Ridge, TN); Chen, Da-Ren (Creve Coeur, MO)

    2007-05-08

    A multi-stage differential mobility analyzer (MDMA) for aerosol measurements includes a first electrode or grid with at least one inlet or injection slit for receiving an aerosol containing charged particles for analysis. A second electrode or grid is spaced apart from the first electrode and has sampling outlets disposed at a plurality of different distances along its length. The volume between the first and second electrodes, from the inlet or injection slit to the most distal sampling outlet, forms a classifying region; the first and second electrodes are charged to suitable potentials to create an electric field within this region. At least one inlet or injection slit in the second electrode receives a sheath gas flow into the upstream end of the classifying region. Each sampling outlet functions as an independent DMA stage, so different size ranges of charged particles are classified simultaneously based on electrical mobility.
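
    Classification by electrical mobility in a cylindrical DMA stage follows a standard textbook relation: the centroid mobility selected at a given voltage is Z* = Q_sh · ln(r2/r1) / (2πVL). The sketch below uses that standard relation with illustrative geometry and flow values, not the patented MDMA's actual dimensions.

```python
import math

def centroid_mobility(q_sheath_lpm, r_inner_m, r_outer_m, length_m, voltage_v):
    """Standard cylindrical-DMA centroid electrical mobility Z* (m^2 V^-1 s^-1).
    Particles with mobility near Z* exit at the sampling slit."""
    q = q_sheath_lpm / 1000 / 60  # L/min -> m^3/s
    return q * math.log(r_outer_m / r_inner_m) / (2 * math.pi * voltage_v * length_m)

# Illustrative numbers only (a common long-column DMA geometry, not the MDMA's):
z = centroid_mobility(q_sheath_lpm=10, r_inner_m=0.00937,
                      r_outer_m=0.01961, length_m=0.44369, voltage_v=1000)
print(f"Z* = {z:.3e} m^2/(V s)")
```

    In the multi-stage device above, each sampling outlet has its own effective length L, so a single applied voltage selects a different mobility band at each stage.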

  5. Statistical network analysis for analyzing policy networks

    DEFF Research Database (Denmark)

    Robins, Garry; Lewis, Jenny; Wang, Peng

    2012-01-01

    To analyze social network data using standard statistical approaches is to risk incorrect inference. The dependencies among observations implied in a network conceptualization undermine standard assumptions of the usual general linear models. One of the most quickly expanding areas of social and policy network methodology is the development of statistical modeling approaches that can accommodate such dependent data. In this article, we review three network statistical methods commonly used in the current literature: quadratic assignment procedures, exponential random graph models (ERGMs), and stochastic actor-oriented models. We focus most attention on ERGMs by providing an illustrative example of a model for a strategic information network within a local government. We draw inferences about the structural role played by individuals recognized as key innovators and conclude that such an approach ...
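
    The first of the three reviewed methods, the quadratic assignment procedure (QAP), can be sketched in a few lines: correlate two network matrices, then build a null distribution by relabeling nodes, which preserves each network's internal structure while breaking the alignment between the two. The two 5-node matrices below are invented toy data, not the article's local-government network, and the p-value is one-sided.

```python
import random

random.seed(0)

# Two toy symmetric 5-node adjacency matrices (e.g., advice ties vs. friendship ties).
X = [[0,1,1,0,0],
     [1,0,1,0,0],
     [1,1,0,0,0],
     [0,0,0,0,1],
     [0,0,0,1,0]]
Y = [[0,1,1,0,0],
     [1,0,0,0,0],
     [1,0,0,0,1],
     [0,0,0,0,1],
     [0,0,1,1,0]]
n = len(X)

def matrix_corr(A, B):
    """Pearson correlation over off-diagonal cells."""
    a = [A[i][j] for i in range(n) for j in range(n) if i != j]
    b = [B[i][j] for i in range(n) for j in range(n) if i != j]
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / (va * vb) ** 0.5

observed = matrix_corr(X, Y)

# QAP null: permute rows and columns of Y with the same node relabeling.
def permuted(B, p):
    return [[B[p[i]][p[j]] for j in range(n)] for i in range(n)]

perms = 999
ge = sum(matrix_corr(X, permuted(Y, random.sample(range(n), n))) >= observed
         for _ in range(perms))
p_value = (ge + 1) / (perms + 1)
print(f"r = {observed:.3f}, QAP p = {p_value:.3f}")
```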

  6. Analyzing, Modelling, and Designing Software Ecosystems

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    ... the development, implementation, and use of telemedicine services. We initially expand the theory of software ecosystems by contributing to the definition and understanding of software ecosystems, providing means of analyzing existing and designing new ecosystems, and defining and measuring the qualities ... structures, supporting actor involvement in the ecosystem, and (v) proper orchestration and governance of the ecosystem to promote and support the changes and the health of the ecosystem. Our work contributes to Net4Care, a platform to serve as the common platform in the software ecosystem under establishment. In addition, it contributes by providing input and guidelines on the role and activity of the 4S organization, an organization to serve as an orchestrator in the ecosystem with the aim of managing the platform, supporting actor and software interactions, and promoting the ecosystem health ...

  7. Fully Analyzing an Algebraic Polya Urn Model

    CERN Document Server

    Morcrette, Basile

    2012-01-01

    This paper introduces and analyzes a particular class of Polya urns: balls are of two colors and can only be added (the urns are said to be additive), and at every step the same constant number of balls is added, so only the color composition varies (the urns are said to be balanced). These properties make this class of urns ideally suited for analysis from an "analytic combinatorics" point of view, following in the footsteps of Flajolet-Dumas-Puyhaubert, 2006. Through an algebraic generating function to which we apply a multiple coalescing saddle-point method, we are able to give precise asymptotic results for the probability distribution of the composition of the urn, as well as a local limit law and large deviation bounds.
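
    A balanced additive urn is easy to simulate, which makes the "balanced" property concrete: the total number of balls grows deterministically even though the color composition is random. The replacement rule below (add 2 of the drawn color and 1 of the other, so sigma = 3 balls per step) is a hypothetical example, not a rule analyzed in the paper.

```python
import random
from collections import Counter

random.seed(42)

def simulate_urn(steps, black=1, white=1, add=(2, 1)):
    """Balanced additive urn: draw a ball, return it, and add add[0] balls of
    the drawn color plus add[1] of the other. sigma = add[0] + add[1] balls
    are added at every step, which is the 'balanced' property."""
    for _ in range(steps):
        if random.random() < black / (black + white):
            black += add[0]; white += add[1]
        else:
            white += add[0]; black += add[1]
    return black, white

# Balance check: total growth is deterministic even though composition is random.
b, w = simulate_urn(100)
assert b + w == 2 + 100 * 3

# Empirical distribution of the black-ball share after 50 steps (total is 152).
shares = Counter(round(simulate_urn(50)[0] / 152, 1) for _ in range(2000))
print(sorted(shares.items()))
```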

  8. Analyzing Options for Airborne Emergency Wireless Communications

    Energy Technology Data Exchange (ETDEWEB)

    Michael Schmitt; Juan Deaton; Curt Papke; Shane Cherry

    2008-03-01

    In the event of large-scale natural or manmade catastrophic events, access to reliable and enduring commercial communication systems is critical. Hurricane Katrina provided a recent example of the need to ensure communications during a national emergency. To ensure that communication demands are met during these critical times, Idaho National Laboratory (INL), under the guidance of United States Strategic Command, has studied infrastructure issues, concerns, and vulnerabilities associated with an airborne wireless communications capability. Such a capability could provide emergency wireless communications until public/commercial nodes can be systematically restored. This report focuses on the airborne cellular restoration concept, analyzing basic infrastructure requirements, identifying related infrastructure issues, concerns, and vulnerabilities, and offering recommended solutions.

  9. Analyzing Strategic Business Rules through Simulation Modeling

    Science.gov (United States)

    Orta, Elena; Ruiz, Mercedes; Toro, Miguel

    Service Oriented Architecture (SOA) holds promise for business agility since it allows business processes to change to meet new customer demands or market needs without causing a cascade of changes in the underlying IT systems. Business rules are the instrument chosen to help business and IT collaborate. In this paper, we propose the use of simulation models to model and simulate strategic business rules that are then disaggregated at different levels of an SOA architecture. Our proposal aims to help find a good configuration for strategic business objectives and IT parameters. The paper includes a case study where a simulation model is built to support business decision-making in a context where finding a good configuration for different business parameters and performance levels is too complex to analyze by trial and error.

  10. Analyzing and forecasting the European social climate

    Directory of Open Access Journals (Sweden)

    Liliana DUGULEANĂ

    2015-06-01

    Full Text Available The paper uses the results of the Eurobarometer sample survey, which is commissioned by the European Commission. The social climate index measures the level of perceptions of the population, taking into account their personal situation and their perspective at the national level. The paper analyzes the evolution of the social climate indices for the countries of the European Union and offers information about the expectations of the populations of the analyzed countries. The obtained results can be compared with the Eurobarometer forecasts, over a short term of one year and a medium term of five years. Modelling the social climate index and its influence factors offers useful information about the efficiency of social protection and inclusion policies.

  11. Composite blade structural analyzer (COBSTRAN) user's manual

    Science.gov (United States)

    Aiello, Robert A.

    1989-01-01

    This manual describes the installation and use of COBSTRAN (COmposite Blade STRuctural ANalyzer), a computer code developed for the design and analysis of composite turbofan and turboprop blades and also for composite wind turbine blades. The code combines composite mechanics and laminate theory with an internal database of fiber and matrix properties. Inputs to the code are constituent fiber and matrix material properties, factors reflecting the fabrication process, composite geometry, and blade geometry. COBSTRAN performs the micromechanics, macromechanics, and laminate analyses of these fiber composites and generates a NASTRAN model with equivalent anisotropic homogeneous material properties. Stress output from NASTRAN is used to calculate individual ply stresses, strains, interply stresses, thru-the-thickness stresses, and failure margins. Curved panel structures may be modeled provided the curvature of a cross-section is defined by a single-value function. COBSTRAN is written in FORTRAN 77.

  12. Composite Blade Structural Analyzer (COBSTRAN) demonstration manual

    Science.gov (United States)

    Aiello, Robert A.

    1989-01-01

    The input deck setup is described for COBSTRAN (COmposite Blade STRuctural ANalyzer), a computer code developed for the design and analysis of composite turbofan and turboprop blades and also for composite wind turbine blades. This manual is intended for use in conjunction with the COBSTRAN user's manual. Seven demonstration problems are described with pre- and post-processing input decks. Modeling of blades which are solid thru-the-thickness, and also of aircraft wing airfoils with internal spars, is shown. Corresponding NASTRAN and databank input decks are also shown. Detailed descriptions of each line of the pre- and post-processing decks are provided with reference to the Card Groups defined in the user's manual. A dictionary of all program variables and terms used in this manual may be found in Section 6 of the user's manual.

  13. Three Practical Methods for Analyzing Slope Stability

    Institute of Scientific and Technical Information of China (English)

    XU Shiguang; ZHANG Shitao; ZHU Chuanbing; YIN Ying

    2008-01-01

    Since the environmental capacity and the arable and inhabited lands have actually reached a full balance, slopes are becoming more and more important options for various engineering constructions. Because of the geological complexity of slopes, the design and decision-making of slope-based engineering still cannot rely solely on theoretical analysis and numerical calculation, but mainly on the experience of experts. Therefore, it has important practical significance to turn some successful experience into mathematical equations. Based upon the abundant typical slope engineering construction cases in Yunnan, Southwestern China, three methods for analyzing slope stability have been developed in this paper. First of all, a corresponding analogous mathematical equation for analyzing slope stability has been established through case studies. Then, an artificial neural network and a multivariate regression analysis have also been set up, with 7 main influencing factors adopted.

  14. A calibration free vector network analyzer

    Science.gov (United States)

    Kothari, Arpit

    Recently, two novel single-port, phase-shifter based vector network analyzer (VNA) systems were developed and tested at X-band (8.2--12.4 GHz) and Ka-band (26.4--40 GHz), respectively. These systems operate by electronically moving the standing wave pattern, set up in a waveguide, over a Schottky detector and sampling the standing wave voltage for several phase shift values. Once such a system is fully characterized, all of its parameters become known, so theoretically no other correction (or calibration) should be required to obtain the reflection coefficient, Gamma, of an unknown load. This makes this type of VNA "calibration free", which is a significant advantage over other types of VNAs. To this end, a VNA system based on this design methodology was developed at X-band using several design improvements (compared to the previous designs) with the aim of demonstrating this "calibration-free" feature. It was found that when a commercial VNA (HP8510C) is used as the source and the detector, the system works as expected. However, when a simple detector is used (Schottky diode, log detector, etc.), obtaining the correct Gamma still requires the customary three-load calibration. With the aim of exploring the cause, a detailed sensitivity analysis of prominent error sources was performed. Extensive measurements were made with different detection techniques, including the use of a spectrum analyzer as a power detector. The system was even tested for electromagnetic compatibility (EMC), which may have contributed to the issue. Although the desired results could not be obtained using the proposed standing-wave-power measuring devices, such as the Schottky diode, the principle of the "calibration-free" VNA was shown to be valid.
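
    In the idealized case of a perfectly characterized system with a linear power detector, the "calibration-free" extraction reduces to fitting the sampled standing-wave power P(theta) = k(1 + |Gamma|^2 + 2|Gamma|cos(theta - phi)) and inverting for Gamma. The sketch below assumes exactly that ideal model, which is what the real Schottky-diode measurements departed from; the sample count and values are illustrative.

```python
import cmath
import math

def standing_wave_power(gamma, theta, k=1.0):
    """Ideal detector power when the standing-wave pattern is shifted by theta."""
    return k * abs(1 + gamma * cmath.exp(-1j * theta)) ** 2

def extract_gamma(thetas, powers):
    """Fit P = a + b*cos(theta) + c*sin(theta) via exact DFT-style projections
    (valid for uniform phase samples), then invert for |Gamma| and its phase."""
    n = len(thetas)
    a = sum(powers) / n
    b = 2 * sum(p * math.cos(t) for p, t in zip(powers, thetas)) / n
    c = 2 * sum(p * math.sin(t) for p, t in zip(powers, thetas)) / n
    r = math.hypot(b, c)                     # r = 2*k*|Gamma|
    # a = k*(1+|Gamma|^2) and r = 2k|Gamma| => k solves k^2 - a*k + (r/2)^2 = 0
    k = (a + math.sqrt(a * a - r * r)) / 2   # root giving |Gamma| <= 1
    mag = r / (2 * k)
    phase = math.atan2(c, b)
    return mag * cmath.exp(1j * phase)

true_gamma = 0.5 * cmath.exp(1j * 0.8)
thetas = [2 * math.pi * i / 16 for i in range(16)]
powers = [standing_wave_power(true_gamma, t, k=3.7) for t in thetas]
est = extract_gamma(thetas, powers)
print(abs(est), cmath.phase(est))
```

    The unknown detector scale k drops out of the inversion, which is why, in principle, no external calibration standards are needed.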

  15. Irradiation Accidents in Radiotherapy Analyze, Manage, Prevent

    International Nuclear Information System (INIS)

    Why do errors occur? How can they be minimized? In a context of widely publicized major incidents, of accelerated technological advances in radiotherapy planning and delivery, and of global communication and information resources, this critical issue had to be addressed by the professionals of the field, as it has been by most national and international organizations. The ISMP, aware of its responsibility, decided to put an emphasis on the topic at its annual meeting. In this framework, potential errors, in terms of scenarios, pathways of occurrence, and dosimetry, will first be examined, the goal being to prioritize error prevention according to the likelihood of events and their dosimetric impact. Then, case studies of three incidents will be detailed: Epinal, Glasgow and Detroit. For each one, a description of the incident and the way it was reported, its investigation, and the lessons that can be learnt will be presented. Finally, the implementation of practical measures at different levels, intra- and inter-institutional, such as teaching, enforcement of QA procedures, or voluntary incident reporting, will be discussed.

  16. Comparison of two dry chemistry analyzers and a wet chemistry analyzer using canine serum.

    Science.gov (United States)

    Lanevschi, Anne; Kramer, John W.

    1996-01-01

    Canine serum was used to compare seven chemistry analytes on two tabletop clinical dry chemistry analyzers, Boehringer's Reflotron and Kodak's Ektachem. Results were compared to those obtained on a wet chemistry reference analyzer, Roche Diagnostics' Cobas Mira. Analytes measured were urea nitrogen (BUN), creatinine, glucose, aspartate aminotransferase (AST), alanine aminotransferase (ALT), cholesterol, and bilirubin. Nine to 12 canine sera with values in the low, normal, and high ranges were evaluated. The correlations were acceptable for all comparisons, with correlation coefficients greater than 0.98 for all analytes. Regression analysis revealed significant differences for both tabletop analyzers when compared to the reference analyzer for cholesterol and bilirubin, and for glucose and AST on the Kodak Ektachem. The differences appeared to result from proportional systematic error occurring at high analyte concentrations.
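
    The method-comparison statistics reported here, correlation plus least-squares regression of the test analyzer against the reference, can be sketched with ordinary least squares. The paired glucose values below are invented for illustration, not the study's data; note that for method comparison with error in both instruments, Deming regression is often preferred over OLS.

```python
# Hypothetical paired glucose results (mg/dL): reference vs. tabletop analyzer.
ref = [62, 75, 88, 95, 110, 150, 220, 310, 420]
test = [60, 74, 90, 97, 108, 155, 231, 330, 455]

n = len(ref)
mx, my = sum(ref) / n, sum(test) / n
sxx = sum((x - mx) ** 2 for x in ref)
sxy = sum((x - mx) * (y - my) for x, y in zip(ref, test))
syy = sum((y - my) ** 2 for y in test)

slope = sxy / sxx            # OLS regression of test on reference
intercept = my - slope * mx
r = sxy / (sxx * syy) ** 0.5  # Pearson correlation

print(f"y = {slope:.3f}x + {intercept:.2f}, r = {r:.4f}")
# A slope drifting away from 1 at high concentrations is the kind of
# 'proportional systematic error' the study reports.
```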

  17. PSAIA – Protein Structure and Interaction Analyzer

    Directory of Open Access Journals (Sweden)

    Vlahoviček Kristian

    2008-04-01

    Full Text Available Abstract Background PSAIA (Protein Structure and Interaction Analyzer) was developed to compute geometric parameters for large sets of protein structures in order to predict and investigate protein-protein interaction sites. Results In addition to the most relevant established algorithms, PSAIA offers a new method, PIADA (Protein Interaction Atom Distance Algorithm), for the determination of residue interaction pairs. We found that PIADA produced more satisfactory results than comparable algorithms implemented in PSAIA. Particular advantages of PSAIA include its capacity to combine different methods to detect the locations and types of interactions between residues, and its ability, without any further automation steps, to handle large numbers of protein structures and complexes. Generally, the integration of a variety of methods enables PSAIA to offer easier automation of analysis and greater reliability of results. PSAIA can be used either via a graphical user interface or from the command line. Results are generated in either tabular or XML format. Conclusion In a straightforward fashion and for large sets of protein structures, PSAIA enables the calculation of protein geometric parameters and the determination of the location and type of protein-protein interaction sites. XML-formatted output enables easy conversion of results to various formats suitable for statistical analysis. Results from smaller data sets demonstrated the influence of geometry on protein interaction sites. Comprehensive analysis of the properties of large data sets led to new information useful in the prediction of protein-protein interaction sites.
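
    The core of a distance-based residue-interaction criterion like PIADA can be sketched directly: two residues on different chains are taken to interact if any pair of their atoms comes closer than a cutoff. The toy coordinates and the 4 Å cutoff below are illustrative assumptions, not PIADA's exact rule or real PDB data.

```python
import math

# Toy atom records: (chain, residue_id, x, y, z) -- stand-ins for parsed PDB atoms.
atoms = [
    ("A", 1, 0.0, 0.0, 0.0), ("A", 1, 1.2, 0.0, 0.0),
    ("A", 2, 8.0, 0.0, 0.0),
    ("B", 1, 2.5, 0.0, 0.0), ("B", 1, 3.9, 0.0, 0.0),
    ("B", 2, 20.0, 0.0, 0.0),
]
CUTOFF = 4.0  # Angstroms; an illustrative threshold.

def interacting_pairs(atoms, cutoff):
    """Residue pairs on different chains with any inter-atomic distance < cutoff."""
    pairs = set()
    for c1, r1, *p1 in atoms:
        for c2, r2, *p2 in atoms:
            # c1 < c2 keeps only cross-chain pairs and avoids double counting.
            if c1 < c2 and math.dist(p1, p2) < cutoff:
                pairs.add(((c1, r1), (c2, r2)))
    return sorted(pairs)

print(interacting_pairs(atoms, CUTOFF))
```

    A production tool would add spatial indexing instead of the all-pairs loop, but the interaction criterion itself is this simple.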

  18. The model JSR-12 neutron coincidence analyzer

    International Nuclear Information System (INIS)

    One of the ways in which non-destructive assays of nuclear materials are made involves counting the neutron signatures that result from spontaneous or induced fissions in fissile materials. A major problem in determining the number of fission neutrons is separating them from the background of neutrons arising from alpha-particle interactions with lighter nuclei in the matrix materials of the samples being assayed. The JSR-12 neutron coincidence analyzer operates on the principle that fission neutrons occur in multiples of two or more, whereas background neutrons occur randomly as single events. By exploiting this time-correlation difference, the JSR-12 can determine the fission neutron signal. This instrument represents a considerable upgrade from the industry-standard JSR-11, doubling the response speed and adding complete computer control of all functions, as well as employing non-volatile memory for data storage. Operation has been simplified for field use by using an LCD display to guide the operator in setting up assay parameters, and by time-date tagging all assays for later retrieval.
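
    The time-correlation principle can be demonstrated with a small Monte Carlo: correlated fission pairs raise the count in a prompt coincidence gate, while a delayed gate sees only accidental coincidences, so subtracting the two recovers the fission signal. The rates, gate width, and delay below are arbitrary illustrative values, and this sketch is far simpler than a real shift-register instrument like the JSR-12.

```python
import random

random.seed(1)

def neutron_times(duration, single_rate, pair_rate, pair_spread=1e-6):
    """Random (alpha,n) background singles plus time-correlated fission pairs."""
    times = [random.uniform(0, duration) for _ in range(int(single_rate * duration))]
    for _ in range(int(pair_rate * duration)):
        t = random.uniform(0, duration)
        times += [t, t + random.uniform(0, pair_spread)]
    return sorted(times)

def gate_counts(times, gate=2e-6, delay=1e-3):
    """Prompt-gate pairs (real + accidental) vs. delayed-gate pairs (accidental only)."""
    prompt = sum(1 for i, t in enumerate(times)
                 for u in times[i + 1:] if u - t <= gate)
    delayed = sum(1 for t in times
                  for u in times if delay < u - t <= delay + gate)
    return prompt, delayed

times = neutron_times(duration=10.0, single_rate=50, pair_rate=20)
prompt, delayed = gate_counts(times)
reals = prompt - delayed   # net correlated (fission) pair signal
print(prompt, delayed, reals)
```

    With 200 injected fission pairs, the net `reals` count lands near 200 even though the stream also contains 500 uncorrelated background neutrons.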

  19. Analyzing Design Heating Loads in Superinsulated Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Arena, Lois [Consortium for Advanced Residential Buildings, Norwalk, CT (United States)

    2015-06-16

    The U.S. Department of Energy’s Building America research team Consortium for Advanced Residential Buildings (CARB) worked with the EcoVillage cohousing community in Ithaca, New York, on the Third Residential EcoVillage Experience neighborhood. This community-scale project consists of 40 housing units—15 apartments and 25 single-family residences. Units range in size from 450 ft² to 1,664 ft² and cost from $80,000 for a studio apartment to $235,000 for a three- or four-bedroom single-family home. For the research component of this project, CARB analyzed current heating system sizing methods for superinsulated homes in cold climates to determine if changes in building load calculation methodology should be recommended. Actual heating energy use was monitored and compared to results from the Air Conditioning Contractors of America’s Manual J8 (MJ8) and the Passive House Planning Package software. Results from that research indicate that MJ8 significantly oversizes heating systems for superinsulated homes and that thermal inertia and internal gains should be considered for more accurate load calculations.
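
    Whatever the sizing method, the steady-state core of a design heating load is the envelope UA times the design temperature difference, plus infiltration, optionally less internal gains, the term conventional sizing ignores and which matters most in superinsulated homes. A sketch with invented values, not the EcoVillage units' actual assemblies or climate data:

```python
# Simple steady-state design heat load for a hypothetical superinsulated unit.
# All numbers are illustrative assumptions.

components = {          # name: (area_ft2, U_value_Btu_per_hr_ft2_F)
    "walls":   (1200, 1 / 40),   # R-40 assembly
    "roof":    (800,  1 / 60),   # R-60
    "windows": (150,  0.20),
    "floor":   (800,  1 / 30),
}
ach50, volume_ft3 = 1.0, 12000          # very tight envelope
t_indoor, t_design = 70.0, -4.0         # deg F; cold-climate design temperature
internal_gains_btuh = 1200              # occupants, appliances

dt = t_indoor - t_design
envelope = sum(a * u for a, u in components.values()) * dt

ach_natural = ach50 / 20                # common rule-of-thumb conversion
infiltration = 0.018 * ach_natural * volume_ft3 * dt  # ~0.018 Btu/(ft3*F) for air

load = envelope + infiltration - internal_gains_btuh
print(f"design load ~ {load:.0f} Btu/h")
```

    At loads this small, subtracting or ignoring a ~1,200 Btu/h internal-gain term changes the answer by a large fraction, which is why the gains term matters for superinsulated homes in a way it does not for conventional ones.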

  20. NRC plant-analyzer development at BNL

    International Nuclear Information System (INIS)

    The objective of this program is to develop an LWR engineering plant analyzer capable of performing realistic and accurate simulations of plant transients and Small-Break Loss of Coolant Accidents at real-time and faster than real-time computing speeds and at low costs for preparing, executing and evaluating such simulations. The program is directed toward facilitating reactor safety analyses, on-line plant monitoring, on-line accident diagnosis and mitigation and toward improving reactor operator training. The AD10 of Applied Dynamics International, Ann Arbor, MI, a special-purpose peripheral processor for high-speed systems simulation, is programmed through a PDP-11/34 minicomputer and carries out digital simulations with analog hardware in the input/output loop (up to 256 channels). Analog signals from a control panel are being used now to activate or to disable valves and to trip pump drive motors or regulators without interrupting the simulation. An IBM personal computer with multicolor graphics capabilities and a CRT monitor are used to produce on-line labelled diagrams of selected plant parameters as functions of time

  1. Qualitative Methodology in Analyzing Educational Phenomena

    Directory of Open Access Journals (Sweden)

    Antonio SANDU

    2010-12-01

    Full Text Available Semiological analysis of educational phenomena allows researchers access to a multidimensional universe of meanings represented by the school, seen not so much as an institution but as a vector of social action through educational strategies. We consider education a multidimensional phenomenon, since its analysis allows the researcher to explore a variety of research hypotheses from different paradigmatic perspectives that converge on an educational finality. According to the author Simona Branc, one of the most appropriate methods used in qualitative data analysis is Grounded Theory; it assumes a systematic process of generating concepts and theories based on the collected data. Specialised literature defines Grounded Theory as an inductive approach that starts with general observations and, during the analytical process, creates conceptual categories that explain the theme explored. Researchers insist on the role of sociological theory in managing the research data and in providing ways of conceptualizing the descriptions and explanations. Qualitative content analysis is based on the constructivist paradigm (constructionist, in the restricted sense we used previously). It aims to create an "understanding of the latent meanings of the analyzed messages". Quantitative content analysis involves a process of encoding and statistical analysis of data extracted from the content of the paper, in the form of extractions like frequencies, contingency analysis, etc.
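
    The quantitative side described at the end, encoding followed by frequencies and contingency analysis, reduces computationally to counting coded categories and their co-occurrences. A toy sketch with hypothetical interview codes (the themes below are invented, not from the article):

```python
from collections import Counter
from itertools import combinations

# Hypothetical coded interview fragments: each fragment tagged with themes.
coded_fragments = [
    {"motivation", "assessment"},
    {"motivation"},
    {"curriculum", "assessment"},
    {"motivation", "curriculum"},
    {"assessment"},
]

# Frequencies: how often each code appears across fragments.
freq = Counter(code for frag in coded_fragments for code in frag)

# Contingency: how often pairs of codes co-occur in the same fragment.
cooccur = Counter(frozenset(p) for frag in coded_fragments
                  for p in combinations(sorted(frag), 2))

print(freq.most_common())
print({tuple(sorted(k)): v for k, v in cooccur.items()})
```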

  2. Analyzing Spatiotemporal Anomalies through Interactive Visualization

    Directory of Open Access Journals (Sweden)

    Tao Zhang

    2014-06-01

    Full Text Available As we move into the big data era, data grows not just in size but also in complexity, containing a rich set of attributes, including location and time information, such as data from mobile devices (e.g., smart phones), natural disasters (e.g., earthquakes and hurricanes), epidemic spread, etc. We are motivated by this rising challenge and build a visualization tool for exploring generic spatiotemporal data, i.e., records containing time and location information and numeric attribute values. Since the values often evolve over time and across geographic regions, we are particularly interested in detecting and analyzing anomalous changes over time and space. Our analytic tool is based on a geographic information system and is combined with spatiotemporal data mining algorithms, as well as various data visualization techniques, such as anomaly grids and anomaly bars superimposed on the map. We study how effectively the tool may guide users to find potential anomalies by demonstrating and evaluating it over publicly available spatiotemporal datasets. The tool for spatiotemporal anomaly analysis and visualization is useful in many domains, such as security investigation and monitoring, situation awareness, etc.
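
    One simple way to produce the anomaly grid described above is to give each grid cell its own history and flag cells whose latest value deviates from that history by more than k standard deviations. This is a generic sketch on invented per-cell weekly counts, not the paper's mining algorithms:

```python
import statistics

# Hypothetical weekly counts per grid cell (e.g., incident reports).
history = {
    (0, 0): [4, 5, 3, 4, 5, 4],
    (0, 1): [10, 11, 9, 10, 12, 30],   # last week spikes
    (1, 0): [0, 1, 0, 0, 1, 1],
}

def anomalous_cells(history, k=3.0):
    """Flag cells whose latest value is more than k std devs from its past mean."""
    flagged = []
    for cell, series in history.items():
        past, latest = series[:-1], series[-1]
        mu = statistics.mean(past)
        sigma = statistics.pstdev(past) or 1.0   # avoid div-by-zero on flat series
        if abs(latest - mu) / sigma > k:
            flagged.append(cell)
    return flagged

print(anomalous_cells(history))
```

    On a map, each flagged cell would be rendered as a highlighted grid square or anomaly bar at its coordinates.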

  3. Signal processing and analyzing works of art

    Science.gov (United States)

    Johnson, Don H.; Johnson, C. Richard, Jr.; Hendriks, Ella

    2010-08-01

    In examining paintings, art historians use a wide variety of physico-chemical methods to determine, for example, the paints, the ground (canvas primer) and any underdrawing the artist used. However, the art world has been little touched by signal processing algorithms. Our work develops algorithms to examine x-ray images of paintings, not to analyze the artist's brushstrokes but to characterize the weave of the canvas that supports the painting. The physics of radiography indicates that linear processing of the x-rays is most appropriate. Our spectral analysis algorithms have an accuracy superior to human spot-measurements and have the advantage that, through "short-space" Fourier analysis, they can be readily applied to entire x-rays. We have found that variations in the manufacturing process create a unique pattern of horizontal and vertical thread density variations in the bolts of canvas produced. In addition, we measure the thread angles, providing a way to determine the presence of cusping and to infer the location of the tacks used to stretch the canvas on a frame during the priming process. We have developed weave matching software that employs a new correlation measure to find paintings that share canvas weave characteristics. Using a corpus of over 290 paintings attributed to Vincent van Gogh, we have found several weave match cliques that we believe will refine the art historical record and provide more insight into the artist's creative processes.
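
    In one dimension, the thread-density measurement reduces to finding the dominant spatial frequency in a windowed strip of the x-ray. A sketch on a synthetic intensity profile (a hypothetical 14 threads/cm weave plus a slow background term), using a plain DFT restricted to a plausible thread-count band:

```python
import math

# Synthetic x-ray intensity profile across exactly 1 cm of canvas:
# a 14 threads/cm weave plus a slow background variation.
samples_per_cm, n = 256, 256
threads_per_cm = 14
signal = [math.cos(2 * math.pi * threads_per_cm * i / samples_per_cm)
          + 0.3 * math.cos(2 * math.pi * 1 * i / samples_per_cm)
          for i in range(n)]

def dominant_frequency(x, lo=5, hi=40):
    """DFT magnitude peak restricted to a plausible thread-count band.
    Since the window spans exactly 1 cm, bin k is directly cycles/cm."""
    best_k, best_mag = lo, 0.0
    for k in range(lo, hi + 1):
        re = sum(v * math.cos(2 * math.pi * k * i / len(x)) for i, v in enumerate(x))
        im = sum(v * math.sin(2 * math.pi * k * i / len(x)) for i, v in enumerate(x))
        mag = math.hypot(re, im)
        if mag > best_mag:
            best_k, best_mag = k, mag
    return best_k

print(dominant_frequency(signal), "threads/cm")
```

    The "short-space" analysis in the paper repeats this on many small overlapping windows, turning per-window peaks into a map of thread-density variation across the whole canvas.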

  4. Analyzing modified unimodular gravity via Lagrange multipliers

    Science.gov (United States)

    Sáez-Gómez, Diego

    2016-06-01

    The so-called unimodular version of general relativity is revisited. Unimodular gravity is constructed by fixing the determinant of the metric, which leads to the trace-free part of the equations instead of the usual Einstein field equations. Then a cosmological constant naturally arises as an integration constant. While unimodular gravity turns out to be equivalent to general relativity (GR) at the classical level, it provides important differences at the quantum level. Here we extend the unimodular constraint to some extensions of general relativity that have drawn a lot of attention over the last years—f(R) gravity (or its scalar-tensor picture) and Gauss-Bonnet gravity. The corresponding unimodular version of such theories is constructed, as well as the conformal transformation that relates the Einstein and Jordan frames for these nonminimally coupled theories. From the classical point of view, the unimodular versions of such extensions are completely equivalent to their originals, but an effective cosmological constant arises naturally, which may provide a richer description of the evolution of the Universe. Here we analyze the case of Starobinsky inflation and compare it with the original one.

  5. Analyzing Contents of a Computer Cache

    Science.gov (United States)

    Beahan, John; Khanoyan, Garen; Some, Raphael; Callum, Leslie

    2004-01-01

    The Cache Contents Estimator (CCE) is a computer program that provides information on the contents of level-1 cache of a PowerPC computer. The CCE is configurable to enable simulation of any processor in the PowerPC family. The need for CCE arises because the contents of level-1 caches are not available to either hardware or software readout mechanisms, yet information on the contents is crucial in the development of fault-tolerant or highly available computing systems and for realistic modeling and prediction of computing- system performance. The CCE comprises two independent subprograms: (1) the Dynamic Application Address eXtractor (DAAX), which extracts the stream of address references from an application program undergoing execution and (2) the Cache Simulator (CacheSim), which models the level-1 cache of the processor to be analyzed, by mimicking what the cache controller would do, in response to the address stream from DAAX. CacheSim generates a running estimate of the contents of the data and the instruction subcaches of the level-1 cache, hit/miss ratios, the percentage of cache that contains valid or active data, and time-stamped histograms of the cache content.
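
    A toy version of what CacheSim does can be sketched as a direct-mapped cache model that classifies each address reference as a hit or a miss. This is only an illustrative reduction; the real CacheSim mimics the PowerPC level-1 cache controller, with separate data and instruction subcaches.

```python
class DirectMappedCache:
    """Minimal direct-mapped cache model tracking one tag per line
    and running hit/miss counts."""
    def __init__(self, num_lines=8, line_size=32):
        self.num_lines = num_lines
        self.line_size = line_size
        self.tags = [None] * num_lines
        self.hits = self.misses = 0

    def access(self, addr):
        block = addr // self.line_size   # which memory block
        index = block % self.num_lines   # which cache line it maps to
        tag = block // self.num_lines    # which block currently owns the line
        if self.tags[index] == tag:
            self.hits += 1
        else:
            self.misses += 1
            self.tags[index] = tag       # fill the line on a miss

cache = DirectMappedCache()
for addr in [0, 4, 64, 0, 1024, 0]:   # 0 and 1024 conflict on line 0
    cache.access(addr)
print(cache.hits, cache.misses)  # → 2 4
```

    Fed the address stream that DAAX extracts, such a model yields the running content estimate and hit/miss ratios the CCE reports.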

  6. Analyzing Consumer Behavior Towards Contemporary Food Retailers

    Directory of Open Access Journals (Sweden)

    E.Dursun

    2008-01-01

    Full Text Available The objective of this research is to analyze consumer behavior towards contemporary food retailers. Food retailing has been changing in recent years in Turkey, and foreign investors have been captivated by the market potential of food retailing. The retail format has changed, and large-scale, full-service retailers featuring extended product variety are spreading rapidly nationwide. Consumers tend to shop for their household needs at contemporary retailers, due mainly to urbanization, the increasing female workforce and income growth. In this research, original data were collected through face-to-face interviews with 385 respondents located in Istanbul. The sampling distribution was formed from the ratios of the different socio-economic status (SES) groups in Istanbul. Consumers prefer the closest food retailers when purchasing food products, and they purchase more than they planned; the C SES group in particular spends the most on unplanned shopping. Chain stores and hypermarkets are the most preferred retailers for food purchasing. Moreover, consumer responses to judgments related to retailing are investigated with factor analysis.

  7. PLC backplane analyzer for field forensics and intrusion detection

    Science.gov (United States)

    Mulder, John; Schwartz, Moses Daniel; Berg, Michael; Van Houten, Jonathan Roger; Urrea, Jorge Mario; King, Michael Aaron; Clements, Abraham Anthony; Trent, Jason; Depoy, Jennifer M; Jacob, Joshua

    2015-05-12

    The various technologies presented herein relate to the determination of unexpected and/or malicious activity occurring between components communicatively coupled across a backplane. Control data, etc., can be intercepted at a backplane where the backplane facilitates communication between a controller and at least one device in an automation process. During interception of the control data, etc., a copy of the control data can be made, e.g., the original control data can be replicated to generate a copy of the original control data. The original control data can continue on to its destination, while the control data copy can be forwarded to an analyzer system to determine whether the control data contains a data anomaly. The content of the copy of the control data can be compared with a previously captured baseline data content, where the baseline data can be captured for a same operational state as the subsequently captured control data.
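
    The baseline comparison described above can be illustrated with a small sketch (hypothetical frame layout, not the patented implementation): a copied control-data frame is diffed byte-by-byte against a baseline captured in the same operational state, with a mask for offsets that are legitimately allowed to vary.

```python
def check_against_baseline(control_frame: bytes, baseline: bytes, mask=None):
    """Return byte offsets where the copied control-data frame deviates
    from the baseline. `mask` lists offsets allowed to vary
    (e.g., counters or timestamps). Illustrative sketch only."""
    mask = set(mask or ())
    return [i for i, (a, b) in enumerate(zip(control_frame, baseline))
            if a != b and i not in mask]

baseline  = bytes([0x01, 0x10, 0x00, 0xAA])
frame_ok  = bytes([0x01, 0x10, 0x07, 0xAA])  # offset 2 is a counter
frame_bad = bytes([0x01, 0x3F, 0x07, 0xAA])  # unexpected change at offset 1
print(check_against_baseline(frame_ok, baseline, mask=[2]))   # → []
print(check_against_baseline(frame_bad, baseline, mask=[2]))  # → [1]
```

    A non-empty result for a masked comparison is the kind of data anomaly the analyzer system would flag.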

  8. Analyzing Tibetan Monastics Conception of Universe Through Their Drawings

    Science.gov (United States)

    Sonam, Tenzin; Chris Impey

    2016-06-01

    Every culture and tradition has its own representation of the universe that continues to evolve through new technologies and discoveries, and as a result of cultural exchange. With the recent introduction of Western science into the Tibetan Buddhist monasteries in India, this study explores the monastics' conception of the universe prior to their formal instruction in science. Their drawings were analyzed using Tversky's three criteria for drawing analysis, namely segmentation, order, and hierarchical structure of knowledge. Among the sixty Buddhist monastics included in this study, we find that most of them draw a geocentric model of the universe with the Solar System as the dominant physical system, reflecting little influence of modern astronomical knowledge. A few monastics draw the traditional Buddhist model of the world. The implications of the monastics' representation of the universe for their assimilation of modern science are discussed.

  9. Field-usable portable analyzer for chlorinated organic compounds

    International Nuclear Information System (INIS)

    In 1992, a chemical sensor was developed which showed almost perfect selectivity to vapors of chlorinated solvents. When interfaced to an instrument, it yields a chemical analyzer with near-absolute selectivity to vapors of volatile chlorinated organic compounds. TRI has just completed the second of a 2-phase program to develop this new instrument system, which is called the RCL MONITOR. In Phase II, this instrument was deployed in 5 EM40 operations. Phase II applications covered clean-up process monitoring, environmental modeling, routine monitoring, health and safety, and technology validation. Vapor levels between 0 and 100 ppM can be determined in 90 s with a lower detection limit of 0.5 ppM using the hand-portable instrument. Based on the favorable performance of the RCL MONITOR, the instrument was released for commercial sales on Sept. 20, 1996

  10. Analyzing personalized policies for online biometric verification.

    Directory of Open Access Journals (Sweden)

    Apaar Sadhwani

    Full Text Available Motivated by India's nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident's biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India's program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) 12 biometrics for each resident, which represents a five (four, respectively) log reduction in FRR relative to fingerprint (iris, respectively) policies previously proposed for India's biometric program. The mean delay is [Formula: see text] sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32-41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident.
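
    The likelihood-ratio decision rule can be sketched as follows, assuming (purely for illustration) independent Gaussian score models per biometric; the paper's actual joint probabilistic model of all 12 scores is richer than this.

```python
import math

def gaussian_pdf(x, mu, sigma):
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def likelihood_ratio(scores, genuine_params, impostor_params):
    """Product over acquired biometrics of P(score|genuine)/P(score|impostor),
    under a hypothetical independent-Gaussian score model per biometric."""
    lr = 1.0
    for s, (mu_g, sd_g), (mu_i, sd_i) in zip(scores, genuine_params, impostor_params):
        lr *= gaussian_pdf(s, mu_g, sd_g) / gaussian_pdf(s, mu_i, sd_i)
    return lr

# two fingerprints: genuine scores cluster near 0.8, impostor scores near 0.2
genuine  = [(0.8, 0.1)] * 2
impostor = [(0.2, 0.1)] * 2
accept = likelihood_ratio([0.75, 0.82], genuine, impostor) > 100  # threshold sets the FAR
print(accept)  # → True
```

    Raising the acceptance threshold trades a lower FAR for a higher FRR; acquiring more biometrics in a second stage sharpens the ratio when it falls near the threshold.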

  11. Novel topological descriptors for analyzing biological networks

    Directory of Open Access Journals (Sweden)

    Varmuza Kurt K

    2010-06-01

    Full Text Available Abstract Background Topological descriptors, other graph measures, and, in a broader sense, graph-theoretical methods have been proven powerful tools for performing biological network analysis. However, the majority of the developed descriptors and graph-theoretical methods do not have the ability to take vertex- and edge-labels into account, e.g., atom- and bond-types when considering molecular graphs. Indeed, this feature is important for characterizing biological networks more meaningfully instead of only considering pure topological information. Results In this paper, we put the emphasis on analyzing a special type of biological networks, namely bio-chemical structures. First, we derive entropic measures to calculate the information content of vertex- and edge-labeled graphs and investigate some useful properties thereof. Second, we apply the mentioned measures combined with other well-known descriptors to supervised machine learning methods for predicting Ames mutagenicity. Moreover, we investigate the influence of our topological descriptors - measures for only unlabeled vs. measures for labeled graphs - on the prediction performance of the underlying graph classification problem. Conclusions Our study demonstrates that the application of entropic measures to molecules representing graphs is useful to characterize such structures meaningfully. For instance, we have found that if one extends the measures for determining the structural information content of unlabeled graphs to labeled graphs, the uniqueness of the resulting indices is higher. Because measures to structurally characterize labeled graphs are clearly underrepresented so far, the further development of such methods might be valuable and fruitful for solving problems within biological network analysis.
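
    One simple entropic measure in this spirit (an illustrative sketch, not the paper's exact descriptors) is the Shannon entropy of the vertex-label distribution, which is zero for an effectively unlabeled graph and grows as labels diversify.

```python
import math
from collections import Counter

def label_entropy(labels):
    """Shannon entropy (bits) of a vertex-label distribution, one simple
    way to score the information content that labels add to a graph."""
    counts = Counter(labels)
    n = len(labels)
    return sum((c / n) * math.log2(n / c) for c in counts.values())

# toy molecular graph: vertices labeled by atom type
print(round(label_entropy(["C", "C", "C", "O"]), 3))  # → 0.811
print(round(label_entropy(["C", "C", "C", "C"]), 3))  # → 0.0
```

    Combining such label-aware terms with purely topological indices is what lets the resulting descriptors distinguish molecules that share a skeleton but differ in atom types.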

  12. First cloud-based service for analyzing storage tank data

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    2010-01-15

    Most commercial storage tanks are unmonitored and require manual processes to verify conditions, remediate issues or request servicing. New Boundary Technologies has developed an off-the-shelf solution that eliminates several manual processes. Its TankVista Internet service was launched as the first cloud-based service for continuously monitoring and analyzing the conditions and storage levels of commercial storage tanks, bins, silos and other containers. TankVista takes data from storage tank sensors and translates it into graphics and maps that industry can use to drive new efficiencies in storage tank management. A bulk oil distributor can leverage TankVista to remotely and continuously monitor its own storage tanks as well as those of its clients. TankVista monitors tank level, temperature, pressure, humidity and other storage criteria in order to know exactly when and where to replenish supplies. Rather than re-filling tanks at about 50 per cent capacity, a bulk oil distributor can wait until usage levels dictate more efficient re-filling. The monitoring takes place without manual intervention. TankVista complements the iDigi Tank, which has the unique ability to wirelessly connect dispersed and remote tank assets, and get this information through drop-in wireless mesh technology to the cloud without requiring onsite Internet access. 1 fig.

  13. NON-DESTRUCTIVE SOIL CARBON ANALYZER.

    Energy Technology Data Exchange (ETDEWEB)

    Wielopolski, Lucian; Hendrey, G.; Orion, I.; Prior, S.; Rogers, H.; Runion, B.; Torbert, A.

    2004-02-01

    This report describes the feasibility, calibration, and safety considerations of a non-destructive, in situ, quantitative, volumetric soil carbon analytical method based on inelastic neutron scattering (INS). The method can quantify values as low as 0.018 gC/cc, or about 1.2% carbon by weight with high precision under the instrument's configuration and operating conditions reported here. INS is safe and easy to use, residual soil activation declines to background values in under an hour, and no radiological requirements are needed for transporting the instrument. The labor required to obtain soil-carbon data is about 10-fold less than with other methods, and the instrument offers a nearly instantaneous rate of output of carbon-content values. Furthermore, it has the potential to quantify other elements, particularly nitrogen. New instrumentation was developed in response to a research solicitation from the U.S. Department of Energy (DOE LAB 00-09 Carbon Sequestration Research Program) supporting the Terrestrial Carbon Processes (TCP) program of the Office of Science, Biological and Environmental Research (BER). The solicitation called for developing and demonstrating novel techniques for quantitatively measuring changes in soil carbon. The report includes raw data and analyses of a set of proof-of-concept, double-blind studies to evaluate the INS approach in the first phase of developing the instrument. Managing soils so that they sequester massive amounts of carbon was suggested as a means to mitigate the atmospheric buildup of anthropogenic CO{sub 2}. Quantifying changes in the soils' carbon stocks will be essential to evaluating such schemes and documenting their performance. Current methods for quantifying carbon in soil by excavation and core sampling are invasive, slow, labor-intensive and locally destroy the system being observed. Newly emerging technologies, such as Laser Induced Breakdown Spectroscopy and Near-Infrared Spectroscopy, offer soil

  14. Assistive Technology

    Science.gov (United States)

    Assistive technology (AT) is any service or tool that helps ... be difficult or impossible. For older adults, such technology may be a walker to improve mobility or ...

  15. Capturing and analyzing wheelchair maneuvering patterns with mobile cloud computing.

    Science.gov (United States)

    Fu, Jicheng; Hao, Wei; White, Travis; Yan, Yuqing; Jones, Maria; Jan, Yih-Kuen

    2013-01-01

    Power wheelchairs have been widely used to provide independent mobility to people with disabilities. Despite great advancements in power wheelchair technology, research shows that wheelchair related accidents occur frequently. To ensure safe maneuverability, capturing wheelchair maneuvering patterns is fundamental to enable other research, such as safe robotic assistance for wheelchair users. In this study, we propose to record, store, and analyze wheelchair maneuvering data by means of mobile cloud computing. Specifically, the accelerometer and gyroscope sensors in smart phones are used to record wheelchair maneuvering data in real-time. Then, the recorded data are periodically transmitted to the cloud for storage and analysis. The analyzed results are then made available to various types of users, such as mobile phone users, traditional desktop users, etc. The combination of mobile computing and cloud computing leverages the advantages of both techniques and extends the smart phone's capabilities of computing and data storage via the Internet. We performed a case study to implement the mobile cloud computing framework using Android smart phones and Google App Engine, a popular cloud computing platform. Experimental results demonstrated the feasibility of the proposed mobile cloud computing framework.

  16. Fuzzy Based Auto-coagulation Control Through Photometric Dispersion Analyzer

    Institute of Scientific and Technical Information of China (English)

    白桦; 李圭白

    2004-01-01

    The main role of water treatment plants is to supply high-quality safe drinking water. Coagulation is one of the most important stages of surface water treatment. The photometric dispersion analyzer (PDA) is a new optical method for flocculation monitoring, and is feasible for realizing coagulation feedback control. Online modification of the coagulation control system's set point (or optimum coagulant dose) has long limited the application of this technology in water treatment plants. A fuzzy control system incorporating the photometric dispersion analyzer was utilized in this coagulation control system. Proposed is a fuzzy logic inference control system using Takagi and Sugeno's fuzzy if-then rules for self-correction of the set point online. The dosing-rate fuzzy control system was programmed in a SIEMENS small-scale programmable logic controller. A 400 L/min middle-scale water treatment plant was utilized to simulate the reaction. As raw water quality changed, the set point was modified correctly and in time, as was the coagulant dosing rate, and residual turbidity before filtration was acceptable and stable. Results show that this fuzzy inference and control system performs well for coagulation control through PDA.
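
    A minimal Takagi-Sugeno sketch of such a rule base (hypothetical membership functions and rule outputs, not the plant's tuned controller) maps a PDA flocculation index to a dosing-rate correction as the membership-weighted average of crisp rule outputs.

```python
def triangular(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def ts_dose_correction(flocculation_index):
    """Two Takagi-Sugeno rules on a hypothetical 0-1 flocculation index,
    returning a coagulant dosing-rate correction (arbitrary units)."""
    low = triangular(flocculation_index, -0.5, 0.0, 0.6)   # "flocculation LOW"
    high = triangular(flocculation_index, 0.4, 1.0, 1.5)   # "flocculation HIGH"
    rules = [(low, +2.0),   # LOW  -> increase coagulant dose
             (high, -2.0)]  # HIGH -> decrease coagulant dose
    total = sum(w for w, _ in rules)
    return sum(w * out for w, out in rules) / total if total else 0.0

print(round(ts_dose_correction(0.5), 6))  # memberships balance → 0.0
```

    In the reported system, rules of this kind adjust the set point online as raw water quality drifts, with the PLC evaluating the rule base each control cycle.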

  17. Performance of parametric spectro-temporal analyzer (PASTA).

    Science.gov (United States)

    Zhang, Chi; Wei, Xiaoming; Wong, Kenneth K Y

    2013-12-30

    Parametric spectro-temporal analyzer (PASTA) is an entirely new wavelength-resolving modality that focuses the spectral information on the temporal axis, enables an ultrafast frame rate, and provides resolution and sensitivity comparable to the state-of-the-art optical spectrum analyzer (OSA). Generally, spectroscopy relies on the allocation of the spectrum onto the spatial or temporal domain; the conventional OSA, based on a Czerny-Turner monochromator, realizes the spatial allocation with a dispersive grating, while mechanical rotation limits its operation speed. The PASTA system, on the other hand, performs the spectroscopy function by a time-lens focusing mechanism, which all-optically maps the spectral information onto the temporal axis and realizes single-shot spectrum acquisition. Therefore, the PASTA system provides orders-of-magnitude improvement in the frame rate, as high as megahertz or even gigahertz in principle. In addition to the implementation of the PASTA system, in this paper we primarily discuss its performance, including the tradeoff between the frame rate and the wavelength range, factors that affect the wavelength resolution, the conversion efficiency, power saturation, and polarization sensitivity. Limitations introduced by the detection bandwidth and high-order dispersion are also investigated. All these analyses not only provide an overall guideline for PASTA design but also help future research in improving and optimizing this new spectrum-resolving technology.

  19. Climate Model Diagnostic Analyzer Web Service System

    Science.gov (United States)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Jiang, J. H.

    2013-12-01

    The latest Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report stressed the need for the comprehensive and innovative evaluation of climate models with newly available global observations. The traditional approach to climate model evaluation, which compares a single parameter at a time, identifies symptomatic model biases and errors but fails to diagnose the model problems. The model diagnosis process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally- and data-intensive. To address these challenges, we are developing a parallel, distributed web-service system that enables the physics-based multi-variable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. We have developed a methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks (i.e., Flask, Gunicorn, and Tornado). The web-service system, called Climate Model Diagnostic Analyzer (CMDA), currently supports (1) all the datasets from Obs4MIPs and a few ocean datasets from NOAA and Argo, which can serve as observation-based reference data for model evaluation and (2) many of CMIP5 model outputs covering a broad range of atmosphere, ocean, and land variables from the CMIP5 specific historical runs and AMIP runs. Analysis capabilities currently supported by CMDA are (1) the calculation of annual and seasonal means of physical variables, (2) the calculation of time evolution of the means in any specified geographical region, (3) the calculation of correlation between two variables, and (4) the calculation of difference between two variables. A web user interface is chosen for CMDA because it not only lowers the learning curve and removes the adoption barrier of the tool but also enables instantaneous use

  20. Climate Model Diagnostic Analyzer Web Service System

    Science.gov (United States)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Jiang, J. H.

    2014-12-01

    We have developed a cloud-enabled web-service system that empowers physics-based, multi-variable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. We have developed a methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks. The web-service system, called Climate Model Diagnostic Analyzer (CMDA), currently supports (1) all the observational datasets from Obs4MIPs and a few ocean datasets from NOAA and Argo, which can serve as observation-based reference data for model evaluation, (2) many of CMIP5 model outputs covering a broad range of atmosphere, ocean, and land variables from the CMIP5 specific historical runs and AMIP runs, and (3) ECMWF reanalysis outputs for several environmental variables in order to supplement observational datasets. Analysis capabilities currently supported by CMDA are (1) the calculation of annual and seasonal means of physical variables, (2) the calculation of time evolution of the means in any specified geographical region, (3) the calculation of correlation between two variables, (4) the calculation of difference between two variables, and (5) the conditional sampling of one physical variable with respect to another variable. A web user interface is chosen for CMDA because it not only lowers the learning curve and removes the adoption barrier of the tool but also enables instantaneous use, avoiding the hassle of local software installation and environment incompatibility. CMDA will be used as an educational tool for the summer school organized by JPL's Center for Climate Science in 2014. In order to support 30+ simultaneous users during the school, we have deployed CMDA to the Amazon cloud environment. The cloud-enabled CMDA will provide each student with a virtual machine while the user interaction with the system will remain the same
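
    For instance, the correlation capability (item 3 above) boils down to a Pearson correlation over matching grid points; a self-contained sketch (not CMDA's service code) on two flattened fields:

```python
import math

def pearson(xs, ys):
    """Pearson correlation between two variables sampled at matching points."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# flattened model field vs. observation-based reference field (toy values)
model = [1.0, 2.0, 3.0, 4.0]
obs   = [1.1, 1.9, 3.2, 3.8]
r = pearson(model, obs)
print(round(r, 3))  # → 0.991
```

    Wrapping such a function behind a web endpoint, with the heavy datasets staying server-side, is exactly the service pattern CMDA uses to avoid local installation.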

  1. Analyzing the outcomes of health promotion practices.

    Science.gov (United States)

    Pereira Lima, Vera Lucia Góes; Arruda, José Maria; Barroso, Maria Auxiliadora Bessa; Lobato Tavares, Maria de Fátima; Ribeiro Campos, Nora Zamith; Zandonadil, Regina Celi Moreira Basílio; da Rocha, Rosa Maria; Parreira, Clélia Maria de Souza Ferreira; Cohen, Simone Cynamon; Kligerman, Débora Cynamon; Sperandio, Ana Maria Girotti; Correa, Carlos Roberto Silveira; Serrano, Miguel Malo

    2007-01-01

    of social actors in environmental management and housing, supported by the Public Health Technology Development Project of the Oswaldo Cruz Foundation, was employed as a tool of environmental education and healthy housing. The purpose of this study was to construct an integrated and participatory model of environment management. The methodology included training, research and evaluation of participants, from 21 to 50 years of age, who participated in building Thematic Learning Books and Community Guides about water quality monitoring. Participants' evaluations emphasized the training process, encouraging them to become multiplier agents of environmental education in their communities and to continue learning how to bring together sectors for problem solving. The Potentially Healthy Districts' Network (RMPS) aimed at increasing knowledge and building capacity to develop actions which originate from each of the local units, based on their characteristics and practices. Developed by the Preventive and Social Department of Campinas State University with PAHO/WHO and the Society Special Research Institute (IPES), RMPS's mission was to cooperate in the construction of healthy public policies in a participatory and articulated way through different municipal representatives. The network offered tools to municipal administrations to develop integrated projects that brought together government, managers, technicians, academy and organizations for the construction of public policies aimed at health promotion and quality of life. The methodology is based on the construction of knowledge and action networks by social actors, stimulating trans-sectorial and inter-district actions. The outcome evaluation is based on case studies, focus groups, oral histories, documents and image analyses. PMID:17596094

  2. A Portable, Field-Deployable Analyzer for Isotopic Water Measurements

    Science.gov (United States)

    Berman, E. S.; Gupta, M.; Huang, Y. W.; Lacelle, D.; McKay, C. P.; Fortson, S.

    2015-12-01

    Water stable isotopes have for many years been used to study the hydrological cycle, catchment hydrology, and polar climate among other applications. Typically, discrete water samples are collected and transported to a laboratory for isotope analysis. Due to the expense and labor associated with such sampling, isotope studies have generally been limited in scope and time-resolution. Field sampling of water isotopes has been shown in recent years to provide dense data sets with the increased time resolution illuminating substantially greater short term variability than is generally observed during discrete sampling. A truly portable instrument also opens the possibility to utilize the instrument as a tool for identifying which water samples would be particularly interesting for further laboratory investigation. To make possible such field measurements of liquid water isotopes, Los Gatos Research has developed a miniaturized, field-deployable liquid water isotope analyzer. The prototype miniature liquid water isotope analyzer (mini-LWIA) uses LGR's patented Off-Axis ICOS (Integrated Cavity Output Spectroscopy) technology in a rugged, Pelican case housing for easy transport and field operations. The analyzer simultaneously measures both δ2H and δ18O from liquid water, with both manual and automatic water introduction options. The laboratory precision for δ2H is 0.6 ‰, and for δ18O is 0.3 ‰. The mini-LWIA was deployed in the high Arctic during the summer of 2015 at Inuvik in the Canadian Northwest Territories. Samples were collected from Sachs Harbor, on the southwest coast of Banks Island, including buried basal ice from the Laurentide Ice Sheet, some ice wedges, and other types of ground ice. Methodology and water analysis results from this extreme field deployment will be presented.
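
    The quantities reported (δ2H and δ18O) are delta values: per-mil deviations of a sample's isotope ratio from a reference standard such as VSMOW. A small sketch with hypothetical ratios:

```python
def delta_permil(r_sample, r_standard):
    """Isotope delta value in per mil: the fractional deviation of a sample's
    isotope ratio from a reference standard, times 1000."""
    return (r_sample / r_standard - 1.0) * 1000.0

# hypothetical 18O/16O ratios (illustrative numbers only)
VSMOW_18O = 2005.2e-6
d18O = delta_permil(2004.0e-6, VSMOW_18O)
print(round(d18O, 2))  # → -0.6, a slightly depleted sample
```

    Quoted precisions of 0.6 ‰ (δ2H) and 0.3 ‰ (δ18O) are thus deviations in this per-mil scale, not in the raw ratios.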

  3. HMR Log Analyzer: Analyze Web Application Logs Over Hadoop MapReduce

    OpenAIRE

    Sayalee Narkhede; Tripti Baraskar

    2013-01-01

    In today's Internet world, log file analysis is becoming a necessary task for analyzing customer behavior in order to improve advertising and sales; for datasets in domains like environment, medicine and banking it is likewise important to analyze the log data to extract the required knowledge from it. Web mining is the process of discovering knowledge from web data. Log files are generated very fast, at the rate of 1-10 Mb/s per machine; a single data center can generate tens of terabytes ...
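
    The MapReduce pattern such a log analyzer builds on can be sketched in plain Python (a single-process stand-in for Hadoop, with a hypothetical simplified log format):

```python
from collections import defaultdict

def map_phase(log_lines):
    """Map: emit (status_code, 1) for each access-log line.
    Assumes a simplified 'METHOD path status' line format."""
    for line in log_lines:
        parts = line.split()
        if len(parts) == 3:
            yield parts[2], 1

def reduce_phase(pairs):
    """Reduce: sum the counts per key, as a MapReduce reducer would."""
    totals = defaultdict(int)
    for key, count in pairs:
        totals[key] += count
    return dict(totals)

logs = ["GET /index 200", "GET /missing 404", "POST /login 200"]
print(reduce_phase(map_phase(logs)))  # → {'200': 2, '404': 1}
```

    Hadoop runs the same two phases distributed across the cluster, with the framework shuffling the mapper output to reducers by key.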

  4. Analyzers Measure Greenhouse Gases, Airborne Pollutants

    Science.gov (United States)

    2012-01-01

    In complete darkness, a NASA observatory waits. When an eruption of boiling water billows from a nearby crack in the ground, the observatory's sensors seek particles in the fluid, measure shifts in carbon isotopes, and analyze samples for biological signatures. NASA has landed the observatory in this remote location, far removed from air and sunlight, to find life unlike any that scientists have ever seen. It might sound like a scene from a distant planet, but this NASA mission is actually exploring an ocean floor right here on Earth. NASA established a formal exobiology program in 1960, which expanded into the present-day Astrobiology Program. The program, which celebrated its 50th anniversary in 2010, not only explores the possibility of life elsewhere in the universe, but also examines how life begins and evolves, and what the future may hold for life on Earth and other planets. Answers to these questions may be found not only by launching rockets skyward, but by sending probes in the opposite direction. Research here on Earth can revise prevailing concepts of life and biochemistry and point to the possibilities for life on other planets, as was demonstrated in December 2010, when NASA researchers discovered microbes in Mono Lake in California that subsist and reproduce using arsenic, a toxic chemical. The Mono Lake discovery may be the first of many that could reveal possible models for extraterrestrial life. One primary area of interest for NASA astrobiologists lies with the hydrothermal vents on the ocean floor. These vents expel jets of water heated and enriched with chemicals from off-gassing magma below the Earth's crust. Also potentially within the vents: microbes that, like the Mono Lake microorganisms, defy the common characteristics of life on Earth. Basically all organisms on our planet generate energy through the Krebs Cycle, explains Mike Flynn, research scientist at NASA's Ames Research Center. This metabolic process breaks down sugars for energy

  5. Using Simulation to Analyze Acoustic Environments

    Science.gov (United States)

    Wood, Eric J.

    2016-01-01

    One of the main projects that was worked on this semester was creating an acoustic model for the Advanced Space Suit in Comsol Multiphysics. The geometry tools built into the software were used to create an accurate model of the helmet and upper torso of the suit. After running the simulation, plots of the sound pressure level within the suit were produced, as seen below in Figure 1. These plots show significant nulls which should be avoided when placing microphones inside the suit. In the future, this model can be easily adapted to changes in the suit design to determine optimal microphone placements and other acoustic properties. Another major project was creating an acoustic diverter that will potentially be used to route audio into the Space Station's Node 1. The concept of the project was to create geometry to divert sound from a neighboring module, the US Lab, into Node 1. By doing this, no new audio equipment would need to be installed in Node 1. After creating an initial design for the diverter, analysis was performed in Comsol in order to determine how changes in geometry would affect acoustic performance, as shown in Figure 2. These results were used to produce a physical prototype diverter on a 3D printer. With the physical prototype, testing was conducted in an anechoic chamber to determine the true effectiveness of the design, as seen in Figure 3. The results from this testing have been compared to the Comsol simulation results to analyze how close the Comsol results are to real-world performance. While the Comsol results do not seem to closely resemble the real-world performance, this testing has provided valuable insight into how much trust can be placed in the results of Comsol simulations. A final project that was worked on during this tour was the Audio Interface Unit (AIU) design for the Orion program. The AIU is a small device that will be used as an audio communication device both during launch and on-orbit. 
The unit will have functions

  6. ICAN - INTEGRATED COMPOSITE ANALYZER (IBM PC VERSION)

    Science.gov (United States)

    Murthy, P. L.

    1994-01-01

    The Integrated Composite Analyzer (ICAN) is a computer program designed to carry out a comprehensive linear analysis of multilayered fiber composites. The analysis contains the essential features required to effectively design structural components made from fiber composites. ICAN includes the micromechanical design features of the Intraply Hybrid Composite Design (INHYD) program to predict ply level hygral, thermal, and mechanical properties. The laminate analysis features of the Multilayered Filamentary Composite Analysis (MFCA) program are included to account for interply layer effects. ICAN integrates these and additional features to provide a comprehensive analysis capability for composite structures. Additional features unique to ICAN include the following: 1) ply stress-strain influence coefficients, 2) microstresses and microstrain influence coefficients, 3) concentration factors around a circular hole, 4) calculation of probable delamination locations around a circular hole, 5) Poisson's ratio mismatch details near a straight edge, 6) free-edge stresses, 7) material card input for finite element analysis using NASTRAN (available separately from COSMIC) or MARC, 8) failure loads based on maximum stress criterion, and laminate failure stresses based on first-ply failures and fiber breakage criteria, 9) transverse shear stresses, normal and interlaminar stresses, and 10) durability/fatigue type analyses for thermal as well as mechanical cyclic loads. The code can currently assess degradation due to mechanical and thermal cyclic loads with or without a defect. ICAN includes a dedicated data bank of constituent material properties, and allows the user to build a database of material properties of commonly used fibers and matrices so the user need only specify code names for constituents. Input to ICAN includes constituent material properties (or code names), factors reflecting the fabrication process, and composite geometry. ICAN performs micromechanics

  7. ICAN - INTEGRATED COMPOSITE ANALYZER (IBM 370 VERSION)

    Science.gov (United States)

    Murthy, P. L.

    1994-01-01

    The Integrated Composite Analyzer (ICAN) is a computer program designed to carry out a comprehensive linear analysis of multilayered fiber composites. The analysis contains the essential features required to effectively design structural components made from fiber composites. ICAN includes the micromechanical design features of the Intraply Hybrid Composite Design (INHYD) program to predict ply level hygral, thermal, and mechanical properties. The laminate analysis features of the Multilayered Filamentary Composite Analysis (MFCA) program are included to account for interply layer effects. ICAN integrates these and additional features to provide a comprehensive analysis capability for composite structures. Additional features unique to ICAN include the following: 1) ply stress-strain influence coefficients, 2) microstresses and microstrain influence coefficients, 3) concentration factors around a circular hole, 4) calculation of probable delamination locations around a circular hole, 5) Poisson's ratio mismatch details near a straight edge, 6) free-edge stresses, 7) material card input for finite element analysis using NASTRAN (available separately from COSMIC) or MARC, 8) failure loads based on maximum stress criterion, and laminate failure stresses based on first-ply failures and fiber breakage criteria, 9) transverse shear stresses, normal and interlaminar stresses, and 10) durability/fatigue type analyses for thermal as well as mechanical cyclic loads. The code can currently assess degradation due to mechanical and thermal cyclic loads with or without a defect. ICAN includes a dedicated data bank of constituent material properties, and allows the user to build a database of material properties of commonly used fibers and matrices so the user need only specify code names for constituents. Input to ICAN includes constituent material properties (or code names), factors reflecting the fabrication process, and composite geometry. ICAN performs micromechanics

  8. CONTEMPORARY SOCIAL MANAGEMENT TECHNOLOGIES

    OpenAIRE

    Plotnikov Mikhail Vyacheslavovich

    2012-01-01

    Analyzing the practices of development, application and research in managerial social technologies, the author reveals a number of essential problems in their further development. The revealed problems fall into three groups: problems of theory and methodology, problems of development, and problems of practical application. Based on the analysis of modern managerial social technologies, the author suggests a comprehensive and universal classification that ...

  9. Diagnostic value of serum proteome characters analyzed by proteomic fingerprint technology in patients with inflammatory bowel disease

    Institute of Scientific and Technical Information of China (English)

    杨铭; 章粉明; 单国栋; 陈洪潭; 胡凤玲; 陈文果; 陈李华; 余捷凯; 许国强

    2015-01-01

    Objective: To explore a diagnostic model based on serum differential protein fingerprints in inflammatory bowel disease (IBD) and its clinical application value. Methods: Serum proteome profiles of 72 IBD patients (54 with Crohn's disease (CD) and 18 with ulcerative colitis (UC)) and 44 healthy controls were analyzed by weak cation exchange (WCX) magnetic beads combined with matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS). The three groups were compared pairwise; the Wilcoxon rank sum test was used to screen differential protein peaks with P<0.05, a genetic algorithm combined with a support vector machine (SVM) model was used to select the best diagnostic model, and leave-one-out cross-validation was used to evaluate the predictive performance of the model. Results: For each pairwise comparison (CD vs. healthy controls, UC vs. healthy controls, CD vs. UC), the 10 most differential protein peaks were screened out. A diagnostic model built on the combination of 4 peaks at mass-to-charge ratios (M/Z) 3275.29, 4963.91, 4980.53 and 5336.90 distinguished the CD group from healthy controls well, with a specificity of 97.7% and a sensitivity of 92.6% for diagnosing CD. A model built on 4 peaks at M/Z 2272.41, 2660.42, 3029.77 and 5002.78 distinguished the UC group from healthy controls well, with a specificity of 100.0% and a sensitivity of 94.4% for diagnosing UC. A model combining 6 peaks at M/Z 2082.63, 2210.64, 4039.02, 4298.30, 4978.03 and 5002.22 diagnosed CD with a specificity of 50.0% and a sensitivity of 88.9%. Conclusion: The diagnostic models of serum differential proteins for CD and UC, built with MALDI-TOF-MS combined with a genetic algorithm and support vector machine, have high diagnostic value for IBD.
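
    The evaluation scheme described above (leave-one-out cross-validation over per-subject peak intensities) can be sketched as follows. This is an illustrative reconstruction only: the data are synthetic, and a simple nearest-centroid rule stands in for the paper's genetic-algorithm-plus-SVM model.

```python
# Illustrative sketch: leave-one-out (LOO) evaluation of a two-class model
# built from a handful of peak-intensity features per subject.
# Synthetic data; a nearest-centroid rule stands in for the GA + SVM model.
import numpy as np

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(1.0, 0.3, (10, 4)),   # 10 "patients", 4 peaks each
               rng.normal(0.0, 0.3, (10, 4))])  # 10 "controls"
y = np.array([1] * 10 + [0] * 10)

correct = 0
for i in range(len(y)):                          # hold out subject i
    mask = np.arange(len(y)) != i
    Xtr, ytr = X[mask], y[mask]
    centroids = {c: Xtr[ytr == c].mean(axis=0) for c in (0, 1)}
    pred = min(centroids, key=lambda c: np.linalg.norm(X[i] - centroids[c]))
    correct += int(pred == y[i])

print("LOO accuracy:", correct / len(y))
```

    Each subject is classified by a model trained on all the others, so the accuracy estimate is close to unbiased even with few samples, which is why the study could use it on 116 subjects.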

  10. Academic Spin-Off's Transfer Speed - Analyzing the Time from Leaving University to Venture

    OpenAIRE

    Kathrin Müller

    2008-01-01

    For academic spin-offs I analyze the length of time between the founder's leaving of academia and the establishment of his firm. Technology transfer can take place even years after leaving the mother institution. A duration analysis reveals that a longer time-lag is caused by the necessity of assembling complementary skills, either by acquisition by a single founder or by searching for suitable team members. Furthermore, new ventures are established earlier if the intensity of technology tran...

  11. University Spin-Off's Transfer Speed: Analyzing the Time from Leaving University to Venture

    OpenAIRE

    Müller, Kathrin

    2008-01-01

    For academic spin-offs I analyze the length of time between the founder's leaving of academia and the establishment of his firm. Technology transfer can take place even years after leaving the mother institution. A duration analysis reveals that a longer time-lag is caused by the necessity of assembling complementary skills, either by acquisition by a single founder or by searching for suitable team members. Furthermore, new ventures are established earlier if the intensity of technology tran...

  12. DMET-Analyzer: automatic analysis of Affymetrix DMET Data

    Directory of Open Access Journals (Sweden)

    Guzzi Pietro

    2012-10-01

    Full Text Available Abstract Background Clinical Bioinformatics is currently growing and is based on the integration of clinical and omics data, aiming at the development of personalized medicine. Thus the introduction of novel technologies able to investigate the relationship among clinical states and biological machineries may help the development of this field. For instance, the Affymetrix DMET platform (drug metabolism enzymes and transporters) is able to study the relationship between variation in the genome of patients and drug metabolism, detecting SNPs (Single Nucleotide Polymorphisms) on genes related to drug metabolism. This may allow, for instance, finding genetic variants in patients who present different drug responses, in pharmacogenomics and clinical studies. Despite this, there is currently a lack of open-source algorithms and tools for the analysis of DMET data. Existing software tools for DMET data generally allow only the preprocessing of binary data (e.g. the DMET-Console provided by Affymetrix) and simple data analysis operations, but do not allow testing the association of the presence of SNPs with the response to drugs. Results We developed DMET-Analyzer, a tool for the automatic association analysis between the variation of the patient genomes and the clinical conditions of patients, i.e. the different responses to drugs. The proposed system allows: (i) automating the workflow of analysis of DMET-SNP data, avoiding the use of multiple tools; (ii) the automatic annotation of DMET-SNP data and the search in existing databases of SNPs (e.g. dbSNP); (iii) the association of SNPs with pathways through a search in PharmGKB, a major knowledge base for pharmacogenomic studies. DMET-Analyzer has a simple graphical user interface that allows users (doctors/biologists) to upload and analyse DMET files produced by the Affymetrix DMET-Console in an interactive way. The effectiveness and easy use of DMET Analyzer is demonstrated through different
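
    The association step such tools automate, testing whether carrying a SNP variant is associated with drug response, can be sketched with a two-sided Fisher's exact test on a 2x2 contingency table. The implementation below is written from scratch purely for illustration; the genotype/response counts are invented, not taken from any DMET study.

```python
# Illustrative from-scratch Fisher's exact test (two-sided) for a 2x2 table
# of SNP carrier status vs. drug response. All counts below are invented.
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided p-value for the table [[a, b], [c, d]]."""
    n, row1, col1 = a + b + c + d, a + b, a + c

    def p_table(x):  # hypergeometric probability of the table with cell a = x
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)

    p_obs = p_table(a)
    lo, hi = max(0, row1 + col1 - n), min(row1, col1)
    # sum the probabilities of all tables at most as likely as the observed one
    return sum(p for p in (p_table(x) for x in range(lo, hi + 1))
               if p <= p_obs * (1 + 1e-9))

# 15 variant carriers (12 responders, 3 non-) vs. 15 non-carriers (4, 11)
p = fisher_exact_2x2(12, 3, 4, 11)
print("p =", round(p, 4))
```

    A small p-value here would suggest the variant and the response are not independent, which is exactly the per-SNP question the tool answers in bulk.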

  13. A Serum Biomarker Model to Diagnose Pancreatic Cancer Using Proteomic Fingerprint Technology

    Institute of Scientific and Technical Information of China (English)

    Chunlin Ge; Ning Ma; Dianbo Yao; Fengming Luan; Chaojun Hu; Yongzhe Li; Yongfeng Liu

    2008-01-01

    OBJECTIVE To establish a serum protein pattern model for screening pancreatic cancer. METHODS Twenty-nine serum samples from patients with pancreatic cancer were collected before surgery, and an additional 57 serum samples from age- and sex-matched individuals without cancer were used as controls. WCX magnetic beads and a PBS II-C protein chip reader (Ciphergen Biosystems Inc.) were employed to detect the protein fingerprint expression of all serum samples. The resulting profiles comparing serum from cancer patients and normal individuals were analyzed with the Biomarker Wizard system to establish a model using the Biomarker Pattern system software. A double-blind test was used to determine the sensitivity and specificity of the model. RESULTS A group of 4 biomarkers (relative molecular weights of 5,705 Da, 4,935 Da, 5,318 Da and 3,243 Da) was selected to set up a decision tree to produce a classification model that effectively screens pancreatic cancer patients. The results yielded a sensitivity of 100% (20/20) and a specificity of 97.4% (37/38). The area under the ROC curve was 99.7%. A double-blind test used to challenge the model resulted in a sensitivity of 88.9% and a specificity of 89.5%. CONCLUSION New serum biomarkers of pancreatic cancer have been identified. The pattern of combined markers provides a powerful and reliable diagnostic method for pancreatic cancer with high sensitivity and specificity.
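
    As a worked check of the reported figures, sensitivity and specificity follow directly from the confusion-matrix counts quoted in the abstract (20/20 cancers detected, 37/38 controls correctly rejected); the decision-tree model itself is not reproduced here.

```python
# Worked check of the reported figures from confusion-matrix counts:
# 20/20 cancers detected (no false negatives), 37/38 controls rejected.
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

sens, spec = sensitivity_specificity(tp=20, fn=0, tn=37, fp=1)
print(f"sensitivity {sens:.1%}, specificity {spec:.1%}")
```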

  14. Development of Modulators Against Degenerative Aging Using Radiation Fusion Technology

    International Nuclear Information System (INIS)

    In this study, we selected 20 final biomarkers of degenerative aging to develop a radiation aging model, and validated a few of the selected markers so that they can be utilized in the screening of aging modulators. To select the biomarkers of degenerative aging, 4 categories of aging-related markers (immune/hematopoiesis, oxidative damage, signaling molecules, lipid metabolism) were comparatively analyzed in irradiated and normally aged biosystems (cell lines or mice). As a result, most of the biomarkers showed similar changes under irradiation and normal aging. Regarding immune/hematopoiesis, a decline of immune cell functions (lymphocytes, NK cells), Th1/Th2 imbalance, and decreased antigen-presenting capacity of dendritic cells were observed, and 10 biomarkers were selected in this category. mtDNA deletion was selected as the oxidative damage marker, 6 biomarkers including p21 and p-FOXO3a as signaling molecule biomarkers, and 3 biomarkers including adipose tissue weight were selected for lipid metabolism. In addition, various radiation application conditions of single/fractionated irradiation and different periods after irradiation were investigated for the optimal induction of biomarker changes, which revealed that a total of 5 Gy in 10 or more fractionated irradiations, with a period of 4 months or greater, was optimal. To establish a basis for the screening of natural aging modulators, some selected aging biomarkers were validated through their inhibition by well-known natural agents (EGCG, HemoHIM, etc.) in aged cell or mouse models. Additionally, by evaluating the reductive efficacy of 5 natural agents on the degeneration of skin and reproductive organs induced by radiation and chemicals (cyclophosphamide, etc.), we established a basis for the screening of degenerative diseases caused by various factors

  15. Development of Modulators Against Degenerative Aging Using Radiation Fusion Technology

    Energy Technology Data Exchange (ETDEWEB)

    Jo, Sung Kee; Jung, U.; Park, H. R.

    2010-04-15

    In this study, we selected 20 final biomarkers of degenerative aging to develop a radiation aging model, and validated a few of the selected markers so that they can be utilized in the screening of aging modulators. To select the biomarkers of degenerative aging, 4 categories of aging-related markers (immune/hematopoiesis, oxidative damage, signaling molecules, lipid metabolism) were comparatively analyzed in irradiated and normally aged biosystems (cell lines or mice). As a result, most of the biomarkers showed similar changes under irradiation and normal aging. Regarding immune/hematopoiesis, a decline of immune cell functions (lymphocytes, NK cells), Th1/Th2 imbalance, and decreased antigen-presenting capacity of dendritic cells were observed, and 10 biomarkers were selected in this category. mtDNA deletion was selected as the oxidative damage marker, 6 biomarkers including p21 and p-FOXO3a as signaling molecule biomarkers, and 3 biomarkers including adipose tissue weight were selected for lipid metabolism. In addition, various radiation application conditions of single/fractionated irradiation and different periods after irradiation were investigated for the optimal induction of biomarker changes, which revealed that a total of 5 Gy in 10 or more fractionated irradiations, with a period of 4 months or greater, was optimal. To establish a basis for the screening of natural aging modulators, some selected aging biomarkers were validated through their inhibition by well-known natural agents (EGCG, HemoHIM, etc.) in aged cell or mouse models. Additionally, by evaluating the reductive efficacy of 5 natural agents on the degeneration of skin and reproductive organs induced by radiation and chemicals (cyclophosphamide, etc.), we established a basis for the screening of degenerative diseases caused by various factors

  16. Noise and Analyzer-Crystal Angular Position Analysis for Analyzer-Based Phase-Contrast Imaging

    OpenAIRE

    Majidi, Keivan; Li, Jun; Muehleman, Carol; Brankov, Jovan G.

    2014-01-01

    The analyzer-based phase-contrast X-ray imaging (ABI) method is emerging as a potential alternative to conventional radiography. Like many of the modern imaging techniques, ABI is a computed imaging method (meaning that images are calculated from raw data). ABI can simultaneously generate a number of planar parametric images containing information about absorption, refraction, and scattering properties of an object. These images are estimated from raw data acquired by measuring (sampling) the...

  17. Design of multi-channel amplitude analyzer base on LonWorks

    International Nuclear Information System (INIS)

    The paper introduces a multi-channel amplitude analyzer based on LonWorks technology. The system detects pulse peaks with hardware circuits and controls data acquisition and network communication through a microcontroller unit (MCU) and a Neuron chip. The MCU is programmed in Keil C51; the communication between the MCU and the Neuron chip is realized in the Neuron C language, and the host computer program is written in VB. Test results show that the analyzer achieves fast conversion speed and low power consumption. (authors)
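
    The core software task of such an analyzer, binning each detected pulse peak into one of N amplitude channels to build a pulse-height spectrum, can be sketched as follows. The channel count, full-scale voltage, and pulse amplitudes are all invented for illustration and are not taken from the paper.

```python
# Illustrative sketch of multi-channel amplitude analysis in software:
# bin each pulse peak amplitude into one of N_CHANNELS channels.
# Channel count, full-scale voltage and pulse amplitudes are invented.
import random

N_CHANNELS = 256
FULL_SCALE = 5.0                       # assumed ADC full-scale, volts

def channel(amplitude):
    """Map a pulse amplitude in [0, FULL_SCALE) volts to a channel index."""
    ch = int(amplitude / FULL_SCALE * N_CHANNELS)
    return min(max(ch, 0), N_CHANNELS - 1)   # clamp out-of-range pulses

random.seed(1)
spectrum = [0] * N_CHANNELS
for _ in range(10_000):                # synthetic photopeak near 2.5 V
    spectrum[channel(random.gauss(2.5, 0.1))] += 1

peak_ch = max(range(N_CHANNELS), key=spectrum.__getitem__)
print("peak at channel", peak_ch)      # expected near 2.5/5.0 * 256 = 128
```

    In the real instrument the peak detection happens in hardware and only the channel counts are accumulated and shipped over the LonWorks network.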

  18. Micro-Analyzer: automatic preprocessing of Affymetrix microarray data.

    Science.gov (United States)

    Guzzi, Pietro Hiram; Cannataro, Mario

    2013-08-01

    A current trend in genomics is the investigation of the cell mechanism using different technologies, in order to explain the relationship among genes, molecular processes and diseases. For instance, the combined use of gene-expression arrays and genomic arrays has been demonstrated as an effective instrument in clinical practice. Consequently, in a single experiment different kinds of microarrays may be used, resulting in the production of different types of binary data (images and textual raw data). The analysis of microarray data requires an initial preprocessing phase that makes raw data suitable for use on existing analysis platforms, such as the TIGR M4 (TM4) Suite. An additional challenge to be faced by emerging data analysis platforms is the ability to treat in a combined way those different microarray formats coupled with clinical data. In fact, resulting integrated data may include both numerical and symbolic data (e.g. gene expression and SNPs regarding molecular data), as well as temporal data (e.g. the response to a drug, time to progression and survival rate) regarding clinical data. Raw data preprocessing is a crucial step in analysis but is often performed in a manual and error-prone way using different software tools. Thus novel, platform-independent, and possibly open-source tools enabling the semi-automatic preprocessing and annotation of different microarray data are needed. The paper presents Micro-Analyzer (Microarray Analyzer), a cross-platform tool for the automatic normalization, summarization and annotation of Affymetrix gene expression and SNP binary data. It represents the evolution of the μ-CS tool, extending the preprocessing to SNP arrays that were not supported in μ-CS. Micro-Analyzer is provided as a Java standalone tool and enables users to read, preprocess and analyse binary microarray data (gene expression and SNPs) by invoking the TM4 platform. It avoids: (i) the manual invocation of external tools (e.g. the Affymetrix Power
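
    One representative preprocessing step a tool of this kind performs is normalization across arrays. The sketch below shows quantile normalization, which forces every array to share the same intensity distribution; it is a minimal pure-Python illustration and is not claimed to be the specific method used by Micro-Analyzer or TM4.

```python
# Minimal pure-Python quantile normalization: every array is forced to share
# the same intensity distribution (the mean of the k-th smallest values).
# Shown for illustration only; not the specific Micro-Analyzer/TM4 method.
def quantile_normalize(arrays):
    """arrays: list of equal-length intensity lists, one per microarray."""
    n = len(arrays[0])
    ranked = [sorted(a) for a in arrays]
    means = [sum(col) / len(arrays) for col in zip(*ranked)]
    out = []
    for a in arrays:
        order = sorted(range(n), key=a.__getitem__)  # probe indices by rank
        norm = [0.0] * n
        for rank, idx in enumerate(order):
            norm[idx] = means[rank]                  # rank k gets k-th mean
        out.append(norm)
    return out

normed = quantile_normalize([[5.0, 2.0, 3.0], [4.0, 1.0, 2.0]])
print(normed)   # both arrays now share the distribution {1.5, 2.5, 4.5}
```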

  19. Micro-Analyzer: automatic preprocessing of Affymetrix microarray data.

    Science.gov (United States)

    Guzzi, Pietro Hiram; Cannataro, Mario

    2013-08-01

    A current trend in genomics is the investigation of the cell mechanism using different technologies, in order to explain the relationship among genes, molecular processes and diseases. For instance, the combined use of gene-expression arrays and genomic arrays has been demonstrated as an effective instrument in clinical practice. Consequently, in a single experiment different kinds of microarrays may be used, resulting in the production of different types of binary data (images and textual raw data). The analysis of microarray data requires an initial preprocessing phase that makes raw data suitable for use on existing analysis platforms, such as the TIGR M4 (TM4) Suite. An additional challenge to be faced by emerging data analysis platforms is the ability to treat in a combined way those different microarray formats coupled with clinical data. In fact, resulting integrated data may include both numerical and symbolic data (e.g. gene expression and SNPs regarding molecular data), as well as temporal data (e.g. the response to a drug, time to progression and survival rate) regarding clinical data. Raw data preprocessing is a crucial step in analysis but is often performed in a manual and error-prone way using different software tools. Thus novel, platform-independent, and possibly open-source tools enabling the semi-automatic preprocessing and annotation of different microarray data are needed. The paper presents Micro-Analyzer (Microarray Analyzer), a cross-platform tool for the automatic normalization, summarization and annotation of Affymetrix gene expression and SNP binary data. It represents the evolution of the μ-CS tool, extending the preprocessing to SNP arrays that were not supported in μ-CS. Micro-Analyzer is provided as a Java standalone tool and enables users to read, preprocess and analyse binary microarray data (gene expression and SNPs) by invoking the TM4 platform. It avoids: (i) the manual invocation of external tools (e.g. the Affymetrix Power

  20. Implementing High Performance Lexical Analyzer using CELL Broadband Engine Processor

    Directory of Open Access Journals (Sweden)

    P.J.SATHISH KUMAR

    2011-09-01

    Full Text Available The lexical analyzer is the first phase of the compiler and commonly the most time consuming. The compilation of large programs is still far from optimized in today's compilers. With modern processors moving more towards improving parallelization and multithreading, it has become impossible to obtain performance gains in older compilers as technology advances. Any multicore architecture relies on improving parallelism rather than on improving single-core performance. A compiler that is completely parallel and optimized is yet to be developed and would require significant effort to create. On careful analysis we find that the performance of a compiler is majorly affected by the lexical analyzer's scanning and tokenizing phases. This effort is directed towards the creation of a completely parallelized lexical analyzer designed to run on the Cell/B.E. processor that utilizes its multicore functionalities to achieve high performance gains in a compiler. Each SPE reads a block of data from the input and tokenizes it independently. To prevent dependence among SPEs, a scheme for dynamically extending static block limits is incorporated. Each SPE is given a range which it initially scans and then finalizes its input buffer to a set of complete tokens from the range dynamically. This ensures parallelization of the SPEs independently and dynamically, with the PPE scheduling the load for each SPE. The initially static assignment of the code blocks is made dynamic as soon as one SPE commits. This aids SPE load distribution and balancing. The PPE maintains the output buffer until all SPEs of a single stage commit and move to the next stage before it is written out to the file, to maintain order of execution. The approach can be extended easily to other multicore architectures as well. Tokenization is performed by high-speed string searching, with the keyword dictionary of the language, using the Aho-Corasick algorithm.
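
    The keyword-matching core named in the abstract can be illustrated with a minimal Aho-Corasick automaton. This sketch only finds dictionary keywords in a text; a real lexer would also handle identifiers, literals, and longest-match rules, and the parallel SPE scheduling is not modeled here.

```python
# Minimal Aho-Corasick keyword matcher: builds a trie with failure links,
# then scans the text once, reporting (start_index, keyword) pairs.
# Dictionary and input are toy examples; parallel scanning is not modeled.
from collections import deque

def build_automaton(keywords):
    goto, fail, out = [{}], [0], [set()]
    for word in keywords:                       # 1) trie construction
        state = 0
        for ch in word:
            if ch not in goto[state]:
                goto.append({}); fail.append(0); out.append(set())
                goto[state][ch] = len(goto) - 1
            state = goto[state][ch]
        out[state].add(word)
    queue = deque(goto[0].values())             # 2) BFS to set failure links
    while queue:
        s = queue.popleft()
        for ch, t in goto[s].items():
            queue.append(t)
            f = fail[s]
            while f and ch not in goto[f]:
                f = fail[f]
            fail[t] = goto[f].get(ch, 0)
            out[t] |= out[fail[t]]              # inherit suffix matches
    return goto, fail, out

def find_keywords(text, keywords):
    goto, fail, out = build_automaton(keywords)
    state, hits = 0, []
    for i, ch in enumerate(text):
        while state and ch not in goto[state]:
            state = fail[state]
        state = goto[state].get(ch, 0)
        hits.extend((i - len(w) + 1, w) for w in out[state])
    return hits

matches = find_keywords("if (x) return x; else return 0;",
                        ["if", "else", "return"])
print(matches)
```

    Because the automaton advances one character at a time with no backtracking, the scan is linear in the input length regardless of the dictionary size, which is what makes it attractive for a high-throughput lexer.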

  1. Digital Media in Primary Schools: Literacy or Technology? Analyzing Government and Media Discourses

    Science.gov (United States)

    Pereira, Sara; Pereira, Luís

    2015-01-01

    This article examines the political and the media discourses concerning the Portuguese governmental program responsible for delivering a laptop named "Magalhães" to all primary school children. The analysis is based on the official documents related to the launch and development of the initiative as well as the press coverage of this…

  2. Advances and considerations in technologies for growing, imaging, and analyzing 3-D root system architecture

    Science.gov (United States)

    The ability of a plant to mine the soil for nutrients and water is determined by how, where, and when roots are arranged in the soil matrix. The capacity of plant to maintain or improve its yield under limiting conditions, such as nutrient deficiency or drought, is affected by root system architectu...

  3. The university-industry knowledge relationship: Analyzing patents and the science base of technologies

    CERN Document Server

    Leydesdorff, Loet

    2009-01-01

    Via the Internet, information scientists can obtain cost-free access to large databases in the hidden or deep web. These databases are often structured far more than the Internet domains themselves. The patent database of the U.S. Patent and Trade Office is used in this study to examine the science base of patents in terms of the literature references in these patents. University-based patents at the global level are compared with results when using the national economy of the Netherlands as a system of reference. Methods for accessing the on-line databases and for the visualization of the results are specified. The conclusion is that 'biotechnology' has historically generated a model for theorizing about university-industry relations that cannot easily be generalized to other sectors and disciplines.

  4. Living Technology

    DEFF Research Database (Denmark)

    2010-01-01

    our lives. The phrase 'living technology' was coined to refer to technology that is alive as well as technology that is useful because it shares the fundamental properties of living systems. In particular, the invention of this phrase was called for to describe the trend of our technology becoming increasingly life-like or literally alive. Still, the phrase has different interpretations depending on how one views what life is. This book presents nineteen perspectives on living technology. Taken together, the interviews convey the collective wisdom on living technology's power and promise, as well as its...

  5. HMR Log Analyzer: Analyze Web Application Logs Over Hadoop MapReduce

    Directory of Open Access Journals (Sweden)

    Sayalee Narkhede

    2013-07-01

    Full Text Available In today's Internet world, log file analysis is becoming a necessary task for analyzing customer behavior in order to improve advertising and sales; for datasets in domains such as environment, medicine and banking it is likewise important to analyze the log data to extract the required knowledge from it. Web mining is the process of discovering knowledge from web data. Log files are generated very fast, at the rate of 1-10 MB/s per machine; a single data center can generate tens of terabytes of log data in a day. These datasets are huge. In order to analyze such large datasets we need a parallel processing system and a reliable data storage mechanism. A virtual database system is an effective solution for integrating the data, but it becomes inefficient for large datasets. The Hadoop framework provides reliable data storage through the Hadoop Distributed File System and the MapReduce programming model, a parallel processing system for large datasets. The Hadoop Distributed File System breaks up input data and sends fractions of the original data to several machines in the Hadoop cluster to hold blocks of data. This mechanism helps to process log data in parallel using all the machines in the Hadoop cluster and computes the result efficiently. The dominant approach provided by Hadoop, "store first, query later", loads the data into the Hadoop Distributed File System and then executes queries written in Pig Latin. This approach reduces the response time as well as the load on the end system. This paper proposes a log analysis system using Hadoop MapReduce which will provide accurate results in minimum response time.
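
    The MapReduce pattern applied to web logs can be sketched in miniature: a mapper emits a (key, 1) pair per log line and a reducer sums the counts per key. Hadoop's distributed shuffle/sort is simulated in-process below, and the log lines and their format are invented for illustration.

```python
# Miniature MapReduce over invented web-server log lines: the mapper emits
# (url, 1) per line, Hadoop's shuffle/sort is simulated by sorting, and the
# reducer sums hits per URL. Assumed log format: "ip method url status".
from itertools import groupby

LOG = [
    "10.0.0.1 GET /index.html 200",
    "10.0.0.2 GET /cart 500",
    "10.0.0.1 GET /index.html 200",
]

def mapper(line):
    _ip, _method, url, _status = line.split()
    yield (url, 1)                      # one hit per request line

def reducer(key, values):
    return key, sum(values)             # total hits for this URL

mapped = sorted(kv for line in LOG for kv in mapper(line))   # "shuffle"
counts = dict(reducer(k, (v for _, v in g))
              for k, g in groupby(mapped, key=lambda kv: kv[0]))
print(counts)
```

    In a real Hadoop job the same mapper and reducer logic would run on many machines, with HDFS holding the blocks of log data and the framework performing the shuffle between the two phases.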

  6. Massively Parallel Sequencing Approaches for Characterization of Structural Variation

    OpenAIRE

    Koboldt, Daniel C.; Larson, David E.; Chen, Ken; Ding, Li; Wilson, Richard K.

    2012-01-01

    The emergence of next-generation sequencing (NGS) technologies offers an incredible opportunity to comprehensively study DNA sequence variation in human genomes. Commercially available platforms from Roche (454), Illumina (Genome Analyzer and Hiseq 2000), and Applied Biosystems (SOLiD) have the capability to completely sequence individual genomes to high levels of coverage. NGS data is particularly advantageous for the study of structural variation (SV) because it offers the sensitivity to de...

  7. Smile Analyzer: A Software Package for Analyzing the Characteristics of the Speech and Smile

    Directory of Open Access Journals (Sweden)

    Roozbeh Rashed

    2013-01-01

    Full Text Available Taking into account the factors related to lip-tooth relationships in orthodontic diagnosis and treatment planning is of prime importance. Manual quantitative analysis of facial parameters on photographs during smile and speech is a difficult and time-consuming job. Since there was no comprehensive and user-friendly software package, we developed a software program called "Smile Analyzer" in the Department of Orthodontics of Mashhad Faculty of Dentistry for measuring the parameters related to lip-tooth relationships and other facial landmarks on photographs taken during various facial expressions. The software was designed using Visual Basic .NET, and ADO.NET was used for developing its Microsoft Access database. The program runs on Microsoft Windows. It is capable of analyzing many parameters or variables in many patients' photographs, although 19 more common variables are predefined as a default list of variables. When all variables are measured or calculated, a report can be generated and saved in either PDF or MS Excel format. Data are readily transferable to statistical software like SPSS for Windows.

  8. Smile Analyzer: A Software Package for Analyzing the Characteristics of the Speech and Smile

    Directory of Open Access Journals (Sweden)

    Farzin Heravi

    2012-09-01

    Full Text Available Taking into account the factors related to lip-tooth relationships is of prime importance in orthodontic diagnosis and treatment planning. Manual quantitative analysis of facial parameters on photographs during smile and speech is a difficult and time-consuming task. Since no comprehensive and user-friendly software package was available, we developed a program called "Smile Analyzer" in the Department of Orthodontics of Mashhad Faculty of Dentistry for measuring the parameters related to lip-tooth relationships and other facial landmarks on photographs taken during various facial expressions. The software was designed using Visual Basic .NET, and ADO.NET was used for its Microsoft Access database. The program runs on Microsoft Windows. It can analyze many parameters or variables in many patients' photographs, although 19 of the more common variables are predefined as a default list. When all variables are measured or calculated, a report can be generated and saved in either PDF or MS Excel format. Data are readily transferable to statistical software such as SPSS for Windows.
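As an illustration of the kind of lip-tooth measurement such a package automates, the smile index (intercommissural width divided by interlabial gap) can be computed directly from digitized landmark coordinates. The sketch below is a generic example; the landmark values and function name are hypothetical, not taken from Smile Analyzer.

```python
from math import dist  # Euclidean distance, Python 3.8+

def smile_index(commissure_left, commissure_right, upper_lip, lower_lip):
    """Smile index = intercommissural width / interlabial gap, computed
    from 2D landmark coordinates (pixels) digitized on a smile photograph."""
    return dist(commissure_left, commissure_right) / dist(upper_lip, lower_lip)

# Hypothetical landmarks: commissures 200 px apart, lips 40 px apart.
index = smile_index((120, 300), (320, 300), (220, 280), (220, 320))
# index == 5.0
```

A pixel-to-millimeter scale factor cancels in such ratios, which is one reason ratio-based parameters are convenient for photographic analysis.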

  9. CSNI specialist meeting on simulators and plant analyzers

    International Nuclear Information System (INIS)

    The Specialist Meeting on Simulators and Plant Analyzers, held June 9-12, 1992, in Lappeenranta, Finland, was sponsored by the Committee on the Safety of Nuclear Installations (CSNI) of the OECD Nuclear Energy Agency (NEA). It was organized in collaboration with the Technical Research Centre of Finland (VTT) and Lappeenranta University of Technology (LTKK). All the presented papers were invited and divided into four sessions. In the first session, the objectives, requirements and concepts of simulators were discussed against present standards and guidelines. The second session focused on the capabilities of current analytical models. The third session focused on the experience gained so far from applications. The fourth and final session concentrated on simulators currently under development and on future plans for both development and utilization. At the end of the meeting, its topics were taken up in a panel discussion. Summaries of the sessions and a shortened version of the panel discussion are included in the proceedings. (orig.)

  10. Software Developed for Analyzing High- Speed Rolling-Element Bearings

    Science.gov (United States)

    Fleming, David P.

    2005-01-01

    COBRA-AHS (Computer Optimized Ball & Roller Bearing Analysis--Advanced High Speed, J.V. Poplawski & Associates, Bethlehem, PA) is used for the design and analysis of rolling element bearings operating at high speeds under complex mechanical and thermal loading. The code estimates bearing fatigue life by calculating three-dimensional subsurface stress fields developed within the bearing raceways. It provides a state-of-the-art interactive design environment for bearing engineers within a single easy-to-use design-analysis package. The code analyzes flexible or rigid shaft systems containing up to five bearings acted upon by radial, thrust, and moment loads in 5 degrees of freedom. Bearing types include high-speed ball, cylindrical roller, and tapered roller bearings. COBRA-AHS is the first major upgrade in 30 years of such commercially available bearing software. The upgrade was developed under a Small Business Innovation Research contract from the NASA Glenn Research Center, and incorporates the results of 30 years of NASA and industry bearing research and technology.
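For orientation, the classical L10 rating life conveys the basic load-life relationship that codes like COBRA-AHS refine with full three-dimensional subsurface stress analysis. The sketch below is the textbook Lundberg-Palmgren formula with illustrative numbers, not COBRA-AHS's internal method.

```python
def l10_life_hours(C, P, rpm, ball_bearing=True):
    """Basic L10 rating life: the life that 90% of a bearing population
    is expected to exceed. C = basic dynamic load rating, P = equivalent
    dynamic load (same units); the life exponent is 3 for ball bearings
    and 10/3 for roller bearings."""
    p = 3.0 if ball_bearing else 10.0 / 3.0
    revolutions = (C / P) ** p * 1e6      # rated life in revolutions
    return revolutions / (rpm * 60.0)     # hours at constant speed

# A ball bearing loaded at 1/6 of its dynamic rating, running at 10,000 rpm:
hours = l10_life_hours(C=30000.0, P=5000.0, rpm=10000.0)
# hours == 360.0
```

High-speed codes add centrifugal and gyroscopic ball loads, thermal effects, and stress-based life corrections on top of this baseline relationship.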

  11. Field-usable portable analyzer for chlorinated organic compounds

    Energy Technology Data Exchange (ETDEWEB)

    Buttner, W.J.; Penrose, W.R.; Stetter, J.R. [Transducer Research, Inc., Naperville, IL (United States)

    1995-10-01

    Transducer Research, Inc. (TRI) has been working with the DOE Morgantown Energy Technology Center to develop a new chemical monitor based on a unique sensor which responds selectively to vapors of chlorinated solvents. We are also developing field applications for the monitor in actual DOE cleanup operations. During the initial phase, prototype instruments were built and field tested. Because of the high degree of selectivity that is obtained, no response was observed with common hydrocarbon organic compounds such as BTX (benzene, toluene, xylene) or POLs (petroleum, oil, lubricants), and in fact, no non-halogen-containing chemical has been identified which induces a measurable response. By the end of the Phase I effort, a finished instrument system was developed and test marketed. This instrument, called the RCL MONITOR, was designed to analyze individual samples or monitor an area with automated repetitive analyses. Vapor levels between 0 and 500 ppm can be determined in 90 s with a lower detection limit of 0.2 ppm using the handportable instrument. In addition to the development of the RCL MONITOR, advanced sampler systems are being developed to: (1) extend the dynamic range of the instrument through autodilution of the vapor and (2) allow chemical analyses to be performed on aqueous samples. When interfaced to the samplers, the RCL MONITOR is capable of measuring chlorinated solvent contamination in the vapor phase up to 5000 ppm and in water and other condensed media from 10 to over 10,000 ppb(wt)--without hydrocarbon and other organic interferences.
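The quoted range extension follows from simple dilution arithmetic: if the sampler mixes one part sample vapor with n-1 parts clean air, the instrument reading is multiplied back by n to recover the sample concentration. A minimal sketch (function name hypothetical):

```python
def undiluted_concentration(reading_ppm, dilution_factor):
    """Recover the sample concentration from a diluted reading.
    A 10:1 autodilution extends a 0-500 ppm instrument range to roughly
    0-5000 ppm, consistent with the figures quoted for the RCL MONITOR."""
    return reading_ppm * dilution_factor

# A 450 ppm reading taken through a 10:1 diluter:
true_ppm = undiluted_concentration(reading_ppm=450.0, dilution_factor=10)
# true_ppm == 4500.0
```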

  12. Technology and Pedagogical Renewal: Conceptualizing Technology Integration into Teacher Preparation

    Science.gov (United States)

    Duran, Mesut; Fossum, Paul R.; Luera, Gail R.

    2007-01-01

    Research indicates that, if future teachers are to effectively use technology, their pre-service preparation should employ multiple components. These components include core course work in educational technology, faculty modeling, and clinical experiences. This paper describes and analyzes one model for drawing these three components coherently…

  13. Analyzing Science Activities in Force and Motion Concepts: A Design of an Immersion Unit

    Science.gov (United States)

    Ayar, Mehmet C.; Aydeniz, Mehmet; Yalvac, Bugrahan

    2015-01-01

    In this paper, we analyze the science activities offered at 7th grade in the Turkish science and technology curriculum along with addressing the curriculum's original intent. We refer to several science education researchers' ideas, including Chinn & Malhotra's (Science Education, 86:175--218, 2002) theoretical framework and…

  14. Co-production of Knowledge in Multi-stakeholder Processes: Analyzing Joint Experimentation as Social Learning

    NARCIS (Netherlands)

    Akpo, E.; Crane, T.A.; Vissoh, P.; Tossou, C.R.

    2015-01-01

    Purpose: Changing research design and methodologies regarding how researchers articulate with end-users of technology is an important consideration in developing sustainable agricultural practices. This paper analyzes a joint experiment as a multi-stakeholder process and contributes to understanding how…

  15. Study and realization of a multichannel analyzer with a gamma chain acquisition

    International Nuclear Information System (INIS)

    Electronics is an important field; it underpins multiple lines of research at several facilities aimed at effective diagnostics. Our study at the National Center for Science and Nuclear Technologies was a good opportunity to improve our knowledge of the various acquisition systems; it was based on the study and realization of a multichannel analyzer with a gamma acquisition chain.

  16. Effects of Professional Experience and Group Interaction on Information Requested in Analyzing IT Cases

    Science.gov (United States)

    Lehmann, Constance M.; Heagy, Cynthia D.

    2008-01-01

    The authors investigated the effects of professional experience and group interaction on the information that information technology professionals and graduate accounting information system (AIS) students request when analyzing business cases related to information systems design and implementation. Understanding these effects can contribute to…

  17. Using Networks to Visualize and Analyze Process Data for Educational Assessment

    Science.gov (United States)

    Zhu, Mengxiao; Shu, Zhan; von Davier, Alina A.

    2016-01-01

    New technology enables interactive and adaptive scenario-based tasks (SBTs) to be adopted in educational measurement. At the same time, it is a challenging problem to build appropriate psychometric models to analyze data collected from these tasks, due to the complexity of the data. This study focuses on process data collected from SBTs. We…

  18. Technological Learning for Carbon Capture and Sequestration Technologies

    OpenAIRE

    K. Riahi; Rubin, E.S.; Taylor, M. R.; L. Schrattenholzer; Hounshell, D.

    2004-01-01

    This paper analyzes potentials of carbon capture and sequestration technologies (CCT) in a set of long-term energy-economic-environmental scenarios based on alternative assumptions for technological progress of CCT. In order to get a reasonable guide to future technological progress in managing CO2 emissions, we review past experience in controlling sulfur dioxide (SO2) emissions from power plants. By doing so, we quantify a "learning curve" for CCT, which describes the relationship between ...
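The learning curve the authors quantify has a standard single-factor closed form: unit cost falls by a fixed fraction (the learning rate) with each doubling of cumulative installed capacity. A sketch with illustrative numbers (the 20% rate below is an assumption for the example, not the paper's estimate for CCT):

```python
from math import log2

def learning_curve_cost(c0, cumulative, initial, learning_rate):
    """Single-factor learning curve: c = c0 * (x / x0) ** (-b), with
    b = -log2(1 - learning_rate), so each doubling of cumulative
    capacity multiplies unit cost by (1 - learning_rate)."""
    b = -log2(1.0 - learning_rate)
    return c0 * (cumulative / initial) ** (-b)

# Two doublings at a 20% learning rate: cost falls to 0.8**2 = 64% of c0.
cost = learning_curve_cost(c0=100.0, cumulative=4.0, initial=1.0,
                           learning_rate=0.2)
# cost == 64.0
```

Historical SO2-control experience of the kind the paper reviews is typically used to fit the learning rate, which then drives cost projections in the energy-economic scenarios.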

  19. Technology Tiers

    DEFF Research Database (Denmark)

    Karlsson, Christer

    2015-01-01

    A technology tier is a level in a product system: final product, system, subsystem, component, or part. As a concept, it contrasts with traditional “vertical” special technologies (for example, mechanics and electronics) and focuses on “horizontal” feature technologies such as product characteristics...

  20. Application of Digital Mockup Technology

    OpenAIRE

    Gaoming Ding

    2011-01-01

    Digital simulation design is one of the modern design methods for mechanical products. We introduce the concept and meaning of digital mockup technology in mechanical design, together with its architecture and design cycle, which involve multi-domain UML, multi-body dynamics and multidisciplinary design. We also analyze the automobile digital simulation design method and digital mockup technology.

  1. University Teaching with Digital Technologies

    OpenAIRE

    Marcelo-García, Carlos; Yot-Domínguez, Carmen; Mayor-Ruiz, Cristina

    2015-01-01

    This research aims to analyze the level of use of technology by university teachers. We are interested in how frequently they use it when designing the teaching-learning process. The research questions were: What types of learning activities do university teachers design? What types of technologies do teachers use in the design of their instruction? What is the level of use of digital technologies in the learning designs? To respond to these issues, we designed an inventory ...

  2. Technology-Use Mediation

    DEFF Research Database (Denmark)

    Bansler, Jørgen P.; Havn, Erling C.

    2004-01-01

    Implementation of new computer-mediated communication (CMC) systems in organizations is a complex socio-technical endeavour, involving the mutual adaptation of technology and organization over time. Drawing on the analytic concept of sensemaking, this paper provides a theoretical perspective that deepens our understanding of how organizations appropriate new electronic communication media. The paper analyzes how a group of mediators in a large, multinational company adapted a new web-based CMC technology (a virtual workspace) to the local organizational context (and vice versa) by modifying features of the technology, providing ongoing support for users, and promoting appropriate conventions of use. We found that these mediators exerted considerable influence on how the technology was established and used in the organization. The mediators were not neutral facilitators of a well…

  3. Healthcare technology and technology assessment

    OpenAIRE

    Herndon, James H.; Hwang, Raymond; Bozic, K. H.

    2007-01-01

    New technology is one of the primary drivers of increased healthcare costs in the United States. Both physicians and industry play important roles in the development, adoption, utilization and choice of new technologies. The Food and Drug Administration regulates new drugs and new medical devices, but healthcare technology assessment remains limited. Healthcare technology assessment originated in federal agencies; today it is decentralized, with increasing private sector efforts. Innovation is ...

  4. Technology Lecturer Turned Technology Teacher

    Science.gov (United States)

    Lee, Kerry

    2009-01-01

    This case study outlines a program developed by a group of 6 teachers' college lecturers who volunteered to provide a technology program to year 7 & 8 children (11- and 12-year-olds) for a year. This involved teaching technology once a week. As technology education was a new curriculum area when first introduced to the college, few lecturers had…

  5. Science and Technology Parks in the Context of Social Technologies

    Directory of Open Access Journals (Sweden)

    Edgaras Leichteris

    2013-08-01

    Full Text Available This article aims to present a new approach to the science and technology park concept and its development prospects in the context of social technologies. Globalization and the spread of social technologies are expanding the influence of science and technology parks on national innovation systems. This opens new directions for research in the area, as well as for the practical use of social technologies in the development of science and technology parks. The paper also examines the science and technology park as an institutionalized concept of social technology. An interdisciplinary approach to the complex concept of science and technology parks is used to explore its theoretical relationships with the social technologies concept. Possible links are identified and illustrated by practical examples of Lithuanian science and technology parks, and suggestions for further research are made. Based on the analysis and synthesis of the scientific literature in both fields (science and technology parks; social technologies), three possible theoretical links are established: (a) the use of social technologies in science and technology parks; (b) the role of a science park as an intermediate body in the humanization and socialization of technologies; (c) science and technology parks as an institutionalized concept of social technology. The theoretical model is supported by empirical illustrations from the development of Lithuanian science and technology parks, so further research in all three directions is feasible and needed. As this research takes a merely theoretical approach to the investigation of social systems, it can be qualified only as a preparatory stage for further research. The practical examples used in the article are more illustrative than evidence-based and shall not be considered case studies. The research offers an initial framework for researching science and technology parks in the context of social technologies.

  6. Science and Technology Parks in the Context of Social Technologies

    Directory of Open Access Journals (Sweden)

    Edgaras Leichteris

    2011-08-01

    Full Text Available Summary. This article aims to present a new approach to the science and technology park concept and its development prospects in the context of social technologies. Globalization and the spread of social technologies are expanding the influence of science and technology parks on national innovation systems. This opens new directions for research in the area, as well as for the practical use of social technologies in the development of science and technology parks. The paper also examines the science and technology park as an institutionalized concept of social technology. An interdisciplinary approach to the complex concept of science and technology parks is used to explore its theoretical relationships with the social technologies concept. Possible links are identified and illustrated by practical examples of Lithuanian science and technology parks, and suggestions for further research are made. Based on the analysis and synthesis of the scientific literature in both fields (science and technology parks; social technologies), three possible theoretical links are established: (a) the use of social technologies in science and technology parks; (b) the role of a science park as an intermediate body in the humanization and socialization of technologies; (c) science and technology parks as an institutionalized concept of social technology. The theoretical model is supported by empirical illustrations from the development of Lithuanian science and technology parks, so further research in all three directions is feasible and needed. As this research takes a merely theoretical approach to the investigation of social systems, it can be qualified only as a preparatory stage for further research. The practical examples used in the article are more illustrative than evidence-based and shall not be considered case studies. The research offers an initial framework for researching science and technology parks in the context of social…

  7. Analysis of Impact of 3D Printing Technology on Traditional Manufacturing Technology

    Science.gov (United States)

    Wu, Niyan; Chen, Qi; Liao, Linzhi; Wang, Xin

    With the quiet rise of 3D printing technology in the automobile, aerospace, industrial, medical and other fields, many insiders hold differing opinions on its development. This paper objectively analyzes the impact of 3D printing technology on mold making technology and, by comparing the advantages and disadvantages of 3D-printed molds and traditional mold making, puts forward the idea of fusing and complementing the two technologies.

  8. Flexibility of MIP Technology

    Institute of Scientific and Technical Information of China (English)

    Tang Jinlian; Gong Jianhong; Xu Youhao

    2015-01-01

    The flexibility of MIP technology to meet market demand is mainly introduced in this study. Its commercial application and technical principle are analyzed too. The MIP technology, with its wide feed adaptability, can form a good combination with other technologies, and it has been applied extensively in China. Based on this platform, the CGP, MIP-LTG and MIP-DCR technologies have been developed, which can further improve the flexibility of MIP technology. Based on its novel reaction control technique with a sole sequential two-zone riser, MIP users can easily switch between different operating modes, producing either more clean gasoline and propylene or more diesel, by changing the catalysts and varying the operating conditions. That offers MIP users enough production flexibility and a rational production arrangement to meet market demand. The MIP-DCR technology, with lower dry gas and coke yields, can provide a more flexible operating mode since the catalyst-to-oil ratio has become an independent variable.

  9. Technology '90

    International Nuclear Information System (INIS)

    The US Department of Energy (DOE) laboratories have a long history of excellence in performing research and development in a number of areas, including the basic sciences, applied-energy technology, and weapons-related technology. Although technology transfer has always been an element of DOE and laboratory activities, it has received increasing emphasis in recent years as US industrial competitiveness has eroded and efforts have increased to better utilize the research and development resources the laboratories provide. This document, Technology '90, is the latest in a series that is intended to communicate some of the many opportunities available for US industry and universities to work with the DOE and its laboratories in the vital activity of improving technology transfer to meet national needs. Technology '90 is divided into three sections: Overview, Technologies, and Laboratories. The Overview section describes the activities and accomplishments of the DOE research and development program offices. The Technologies section provides descriptions of new technologies developed at the DOE laboratories. The Laboratories section presents information on the missions, programs, and facilities of each laboratory, along with a name and telephone number of a technology transfer contact for additional information. Separate papers were prepared for appropriate sections of this report

  10. Sensemaking technologies

    DEFF Research Database (Denmark)

    Madsen, Charlotte Øland

    Research scope: The scope of the project is to study technological implementation processes by using Weick's sensemaking concept (Weick, 1995). The purpose of using a social constructivist approach to investigate technological implementation processes is to find out how new technologies transform patterns of social action and interaction in organisations (Barley 1986; 1990, Orlikowski 2000). Current research in the field shows that new technologies affect organisational routines/structures/social relationships/power relations/dependencies and alter organisational roles (Barley 1986; 1990, Burkhardt…, Orlikowski 2000). Viewing the use of technology as a process of enactment opens up for investigating the social processes of interpreting new technology into the organisation (Orlikowski 2000). The scope of the PhD project will therefore be to gain a deeper understanding of how the enactment of new…

  11. Technological Tyranny

    Science.gov (United States)

    Greenwood, Dick

    1984-08-01

    It is implicitly assumed by those who create, develop, control and deploy new technology, as well as by society at-large, that technological innovation always represents progress. Such an unchallenged assumption precludes an examination and evaluation of the interrelationships and impact the development and use of technology have on larger public policy matters, such as preservation of democratic values, national security and military policies, employment, income and tax policies, foreign policy and the accountability of private corporate entities to society. This brief challenges those assumptions and calls for social control of technology.

  12. Lab-on-a-chip Astrobiology Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall goal of this program (through Phase III) is to develop an analyzer to measure chemical signatures of life in extraterrestrial settings. The analyzer...

  13. ARC Code TI: Inference Kernel for Open Static Analyzers (IKOS)

    Data.gov (United States)

    National Aeronautics and Space Administration — IKOS is a C++ library designed to facilitate the development of sound static analyzers based on Abstract Interpretation. Specialization of a static analyzer for an...

  14. Study on brackish water treatment technology

    Institute of Scientific and Technical Information of China (English)

    HE Xu-wen(何绪文); Xu De-ping (许德平); WU Bing(吴兵); WANG Tong(王通)

    2003-01-01

    Based on the characteristics of the deep well-water quality of the Fenxi Mining Group in Liulin, the feasibility of two treatment technologies, electrodialysis and reverse osmosis, is analyzed. The analysis and comparison show that reverse osmosis technology has several advantages, such as good treatment effect, convenient operation and management, and low running cost.

  15. Technology Catalogue

    International Nuclear Information System (INIS)

    The Department of Energy's Office of Environmental Restoration and Waste Management (EM) is responsible for remediating its contaminated sites and managing its waste inventory in a safe and efficient manner. EM's Office of Technology Development (OTD) supports applied research and demonstration efforts to develop and transfer innovative, cost-effective technologies to the site clean-up and waste management programs within EM's Office of Environmental Restoration and Office of Waste Management. The purpose of the Technology Catalogue is to provide performance data on OTD-developed technologies to scientists and engineers assessing and recommending technical solutions within the Department's clean-up and waste management programs, as well as to industry, other federal and state agencies, and the academic community. OTD's applied research and demonstration activities are conducted in programs referred to as Integrated Demonstrations (IDs) and Integrated Programs (IPs). The IDs test and evaluate systems, consisting of coupled technologies, at specific sites to address generic problems, such as the sensing, treatment, and disposal of buried waste containers. The IPs support applied research activities in specific application areas, such as in situ remediation, efficient separations processes, and site characterization. The Technology Catalogue is a means for communicating the status of the development of these innovative technologies. The FY93 Technology Catalogue features technologies successfully demonstrated in the field through IDs and sufficiently mature to be used in the near term. Technologies from the following IDs are featured in the FY93 Technology Catalogue: Buried Waste ID (Idaho National Engineering Laboratory, Idaho); Mixed Waste Landfill ID (Sandia National Laboratories, New Mexico); Underground Storage Tank ID (Hanford, Washington); Volatile organic compound (VOC) Arid ID (Richland, Washington); and VOC Non-Arid ID (Savannah River Site, South Carolina)

  16. 21 CFR 864.5680 - Automated heparin analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated heparin analyzer. 864.5680 Section 864....5680 Automated heparin analyzer. (a) Identification. An automated heparin analyzer is a device used to determine the heparin level in a blood sample by mixing the sample with protamine (a...

  17. 21 CFR 868.1670 - Neon gas analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Neon gas analyzer. 868.1670 Section 868.1670 Food... DEVICES ANESTHESIOLOGY DEVICES Diagnostic Devices § 868.1670 Neon gas analyzer. (a) Identification. A neon gas analyzer is a device intended to measure the concentration of neon in a gas mixture exhaled by...

  18. 40 CFR 86.327-79 - Quench checks; NOX analyzer.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 18 2010-07-01 2010-07-01 false Quench checks; NOX analyzer. 86.327-79... Quench checks; NOX analyzer. (a) Perform the reaction chamber quench check for each model of high vacuum... capillary, and if used, dilution capillary. (c) Quench check as follows: (1) Calibrate the NOX analyzer...

  19. 40 CFR 92.120 - NDIR analyzer calibration and checks.

    Science.gov (United States)

    2010-07-01

    ...) Record the CO2 calibration gas concentration in ppm. (4) Record the analyzers' response (AR) in ppm to...) of this section. (iv) Record the response of calibration gases having nominal concentrations of 15... room temperature directly to the analyzer. (3) Determine and record the analyzer operating pressure...

  20. 40 CFR 86.1325-94 - Methane analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 19 2010-07-01 2010-07-01 false Methane analyzer calibration. 86.1325... Procedures § 86.1325-94 Methane analyzer calibration. Prior to introduction into service and monthly thereafter, the methane analyzer shall be calibrated: (a) Follow the manufacturer's instructions...
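Calibration procedures like those in these CFR records reduce, in practice, to fitting and checking the analyzer's response against named calibration-gas concentrations. A generic least-squares sketch follows; the concentrations and responses are hypothetical examples, not values from the regulation.

```python
def linear_fit(x, y):
    """Ordinary least-squares fit y = a + b*x through calibration points;
    returns (intercept a, slope b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return my - slope * mx, slope

# Hypothetical span points: nominal CH4 concentration (ppm) vs. response.
conc = [0.0, 15.0, 30.0, 45.0, 60.0]
resp = [0.1, 15.2, 29.9, 45.1, 60.0]
intercept, slope = linear_fit(conc, resp)
# A well-calibrated analyzer gives a slope near 1 and intercept near 0;
# large residuals at individual points indicate a nonlinearity problem.
```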

  1. 40 CFR 86.125-94 - Methane analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 18 2010-07-01 2010-07-01 false Methane analyzer calibration. 86.125... Complete Heavy-Duty Vehicles; Test Procedures § 86.125-94 Methane analyzer calibration. Prior to introduction into service and monthly thereafter, the methane analyzer shall be calibrated: (a) Follow...

  2. 21 CFR 868.1640 - Helium gas analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Helium gas analyzer. 868.1640 Section 868.1640...) MEDICAL DEVICES ANESTHESIOLOGY DEVICES Diagnostic Devices § 868.1640 Helium gas analyzer. (a) Identification. A helium gas analyzer is a device intended to measure the concentration of helium in a...

  3. 21 CFR 868.2380 - Nitric oxide analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Nitric oxide analyzer. 868.2380 Section 868.2380...) MEDICAL DEVICES ANESTHESIOLOGY DEVICES Monitoring Devices § 868.2380 Nitric oxide analyzer. (a) Identification. The nitric oxide analyzer is a device intended to measure the concentration of nitric oxide...

  4. 21 CFR 862.2500 - Enzyme analyzer for clinical use.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Enzyme analyzer for clinical use. 862.2500 Section... Instruments § 862.2500 Enzyme analyzer for clinical use. (a) Identification. An enzyme analyzer for clinical use is a device intended to measure enzymes in plasma or serum by nonkinetic or kinetic measurement...

  5. Thermally activated technologies: Technology Roadmap

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2003-05-01

    The purpose of this Technology Roadmap is to outline a set of actions for government and industry to develop thermally activated technologies for converting America’s wasted heat resources into a reservoir of pollution-free energy for electric power, heating, cooling, refrigeration, and humidity control. Fuel flexibility is important. The actions also cover thermally activated technologies that use fossil fuels, biomass, and ultimately hydrogen, along with waste heat.

  6. Technological Advancements

    Science.gov (United States)

    Kennedy, Mike

    2010-01-01

    The influx of technology has brought significant improvements to school facilities. Many of those advancements can be found in classrooms, but when students head down the hall to use the washrooms, they are likely to find a host of technological innovations that have improved conditions in that part of the building. This article describes modern…

  7. Maritime Technology

    DEFF Research Database (Denmark)

    Sørensen, Herman

    1997-01-01

    Elementary introduction to the subject "Maritime Technology". The contents include drawings, sketches and references in English without any supplementary text.

  8. Radiation Technology

    International Nuclear Information System (INIS)

    The conference was organized to evaluate the directions for applying radiation technology in Vietnam and to utilize the Irradiation Centre in Hanoi, with its 110 kCi Co-60 source. Investigations and studies of the technico-economic feasibility of developing the technology for various food and non-food items were reported. (N.H.A)

  9. Lasers technology

    International Nuclear Information System (INIS)

    The Lasers Technology Program of IPEN is committed to the development of new lasers based on research into optical materials and new technologies, as well as to laser applications in several areas: Nuclear, Medicine, Dentistry, Industry, Environment and Advanced Research. The Program is basically divided into two main areas: Material and Laser Development, and Laser Applications

  10. Technology collaboration

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, Jacob [Halliburton (Brazil)

    2011-07-01

    The aim of this paper is to present Halliburton's Brazilian technology center. Halliburton has technology centers in the United States, Saudi Arabia, India, Singapore and Brazil, all of which aim at delivering accelerated innovation in the oil sector. The technology centers engage in research and development activities with the help of various universities and in collaboration with the customer or supplier. The Halliburton Brazil technology center provides its customers with timely research and development solutions for enhancing recovery and mitigating reservoir uncertainty; they are specialized in finding solutions for pre- and post-salt carbonate drilling and in the enhancement of production from mature fields. This presentation showcased the work carried out by the Halliburton Brazil technology center to help customers develop their deepwater field activities.

  11. Sensemaking technology

    DEFF Research Database (Denmark)

    Madsen, Charlotte Øland

    Research objective: The object of the LOK research project is to gain a better understanding of the technological strategic processes in organisations by using the concept/metaphor of sensemaking. The project will investigate the technological strategies in organisations in order to gain a deeper understanding of the cognitive competencies and barriers towards implementing new technology in organisations. The research will therefore concentrate on researching the development process in the organisation's perception of the external environmental elements of customers, suppliers, competitors, internal and external technology and legislation, and the internal environmental elements of structure, power relations and political arenas. All of these variables have influence on which/how technologies are implemented, thus creating different outcomes all depending on the social dynamics that are triggered by changes.

  12. A Study on the Revitalizing of technology commercialization in KAERI

    International Nuclear Information System (INIS)

    The TEC training program should be implemented for researchers who want to commercialize their own technologies. Building a creative organizational culture is essential for technology commercialization. The collaboration strategy involves analyzing how KAERI has been building up its technological capabilities in nuclear technology, and what the success factors of KAERI in technology commercialization are.

  13. Automatic proximate analyzer of coal based on isothermal thermogravimetric analysis (TGA) with twin-furnace

    Energy Technology Data Exchange (ETDEWEB)

    Xiong, Youhui; Jiang, Taiyi; Zou, Xianhong [National Laboratory of Coal Combustion, Huazhong University of Science and Technology, Wuhan, Hubei 430074 (China)

    2003-12-17

    A new type of rapid, automatic proximate analyzer for coal, based on isothermal thermogravimetric analysis (TGA) with a twin furnace, is introduced in this paper. The analyzer was developed by combining several novel technologies, such as an automatic weighing method for multiple samples in a high-temperature, dynamic gas-flow environment, a self-protection system for the electronic balance, and an optimized method and procedure for the coal analysis process. Additionally, a comparison between the standard values for standard coals and the values measured with the new instrument is presented.

  14. Real-time analytics techniques to analyze and visualize streaming data

    CERN Document Server

    Ellis, Byron

    2014-01-01

    Construct a robust end-to-end solution for analyzing and visualizing streaming data Real-time analytics is the hottest topic in data analytics today. In Real-Time Analytics: Techniques to Analyze and Visualize Streaming Data, expert Byron Ellis teaches data analysts technologies to build an effective real-time analytics platform. This platform can then be used to make sense of the constantly changing data that is beginning to outpace traditional batch-based analysis platforms. The author is among a very few leading experts in the field. He has a prestigious background in research, development,

  15. A Framework for Analyzing Digital Payment as a Multi-sided Platform

    DEFF Research Database (Denmark)

    Kazan, Erol; Damsgaard, Jan

    2013-01-01

    Near Field Communication (NFC) is a promising digital payment technology that is expected to substitute cash. However, despite its potential, NFC-based payment has not reached mass adoption on either the customer or the merchant side. This paper constructs a preliminary framework for studying digital payment systems and analyzing strategies of current market actors, such as banks, mobile network operators, and merchants. These market actors are identified as incumbents or contenders, and they are currently jockeying for digital payment platform leadership. We analyze three different contactless...

  16. The moral relevance of technological artifacts

    NARCIS (Netherlands)

    Verbeek, P.P.C.C.; Sollie, P.; Düwell, M.

    2009-01-01

    This chapter explores the ethics of technology in a double sense: it lays bare points of application for ethical reflection about technology development, and it analyzes the ethical dimensions of technology itself. First, the chapter addresses the question of how to conceptualize and assess the moral...

  17. Development and Applications of Simulation Technology

    Institute of Scientific and Technical Information of China (English)

    WangZicai

    2004-01-01

    The development process of simulation technology is discussed in view of its emergence, maturation and further development. The applications of simulation technology in the fields of the national economy are introduced. Finally, the level and status quo of simulation technology at home and overseas are analyzed, and its future trend in the new century is presented.

  18. Next-generation sequencing technologies and their application in microbiology: A review

    Institute of Scientific and Technical Information of China (English)

    秦楠; 栗东芳; 杨瑞馥

    2011-01-01

    Since its invention in the 1970s, nucleic acid sequencing technology has contributed tremendously to advances in genomics and related disciplines. The next-generation sequencing technologies developed early this century, represented by HiSeq 2000 from Illumina, SOLiD from Applied Biosystems and 454 from Roche, have re-energized the application of genomics. In this review, we first introduce the next-generation sequencing technologies and then describe their potential applications in the field of microbiology.

  19. Scalable Parallel Methods for Analyzing Metagenomics Data at Extreme Scale

    International Nuclear Information System (INIS)

    The field of bioinformatics and computational biology is currently experiencing a data revolution. The exciting prospect of making fundamental biological discoveries is fueling the rapid development and deployment of numerous cost-effective, high-throughput next-generation sequencing technologies. The result is that the DNA and protein sequence repositories are being bombarded with new sequence information. Databases are continuing to report a Moore's law-like growth trajectory in their database sizes, roughly doubling every 18 months. In what seems to be a paradigm-shift, individual projects are now capable of generating billions of raw sequence data that need to be analyzed in the presence of already annotated sequence information. While it is clear that data-driven methods, such as sequencing homology detection, are becoming the mainstay in the field of computational life sciences, the algorithmic advancements essential for implementing complex data analytics at scale have mostly lagged behind. Sequence homology detection is central to a number of bioinformatics applications including genome sequencing and protein family characterization. Given millions of sequences, the goal is to identify all pairs of sequences that are highly similar (or 'homologous') on the basis of alignment criteria. While there are optimal alignment algorithms to compute pairwise homology, their deployment for large-scale is currently not feasible; instead, heuristic methods are used at the expense of quality. In this dissertation, we present the design and evaluation of a parallel implementation for conducting optimal homology detection on distributed memory supercomputers. Our approach uses a combination of techniques from asynchronous load balancing (viz. work stealing, dynamic task counters), data replication, and exact-matching filters to achieve homology detection at scale. Results for a collection of 2.56M sequences show parallel efficiencies of ~75-100% on up to 8K

  20. Scalable Parallel Methods for Analyzing Metagenomics Data at Extreme Scale

    Energy Technology Data Exchange (ETDEWEB)

    Daily, Jeffrey A. [Washington State Univ., Pullman, WA (United States)

    2015-05-01

    The field of bioinformatics and computational biology is currently experiencing a data revolution. The exciting prospect of making fundamental biological discoveries is fueling the rapid development and deployment of numerous cost-effective, high-throughput next-generation sequencing technologies. The result is that the DNA and protein sequence repositories are being bombarded with new sequence information. Databases are continuing to report a Moore’s law-like growth trajectory in their database sizes, roughly doubling every 18 months. In what seems to be a paradigm-shift, individual projects are now capable of generating billions of raw sequence data that need to be analyzed in the presence of already annotated sequence information. While it is clear that data-driven methods, such as sequencing homology detection, are becoming the mainstay in the field of computational life sciences, the algorithmic advancements essential for implementing complex data analytics at scale have mostly lagged behind. Sequence homology detection is central to a number of bioinformatics applications including genome sequencing and protein family characterization. Given millions of sequences, the goal is to identify all pairs of sequences that are highly similar (or “homologous”) on the basis of alignment criteria. While there are optimal alignment algorithms to compute pairwise homology, their deployment for large-scale is currently not feasible; instead, heuristic methods are used at the expense of quality. In this dissertation, we present the design and evaluation of a parallel implementation for conducting optimal homology detection on distributed memory supercomputers. Our approach uses a combination of techniques from asynchronous load balancing (viz. work stealing, dynamic task counters), data replication, and exact-matching filters to achieve homology detection at scale. Results for a collection of 2.56M sequences show parallel efficiencies of ~75-100% on up to 8K cores
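    The exact-matching filter mentioned above can be illustrated with a minimal sketch: only pairs of sequences sharing at least one k-mer are forwarded to the expensive optimal alignment step, so most dissimilar pairs are discarded cheaply. This is an illustrative simplification under assumed names and a small k, not the dissertation's distributed implementation.

```python
from collections import defaultdict
from itertools import combinations

def candidate_pairs(seqs, k=3):
    """Exact-matching filter: return index pairs of sequences that share
    at least one length-k substring; only these would go on to optimal
    (e.g. Smith-Waterman) alignment."""
    index = defaultdict(set)          # k-mer -> set of sequence indices
    for i, s in enumerate(seqs):
        for j in range(len(s) - k + 1):
            index[s[j:j + k]].add(i)
    pairs = set()
    for ids in index.values():        # any two sequences sharing a k-mer
        pairs.update(combinations(sorted(ids), 2))
    return pairs

# Only the first two sequences share k-mers (e.g. "TTA", "ACA").
print(candidate_pairs(["GATTACA", "TTACAGG", "CCCCCCC"]))  # {(0, 1)}
```

    In a distributed setting the k-mer index would be partitioned across nodes and the surviving pairs balanced with work stealing, as the record describes.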

  1. Ergonomics technology

    Science.gov (United States)

    Jones, W. L.

    1977-01-01

    Major areas of research and development in ergonomics technology for space environments are discussed. Attention is given to possible applications of the technology developed by NASA in industrial settings. A group of mass spectrometers for gas analysis capable of fully automatic operation has been developed for atmosphere control on spacecraft; a version for industrial use has been constructed. Advances have been made in personal cooling technology, remote monitoring of medical information, and aerosol particle control. Experience gained by NASA during the design and development of portable life support units has recently been applied to improve breathing equipment used by fire fighters.

  2. Analyzing Real-World Light Duty Vehicle Efficiency Benefits

    Energy Technology Data Exchange (ETDEWEB)

    Gonder, Jeffrey; Wood, Eric; Chaney, Larry; Holden, Jacob; Jeffers, Matthew; Wang, Lijuan

    2016-06-08

    Off-cycle technologies represent an important pathway to achieve real-world fuel savings, through which OEMs can potentially receive credit toward CAFE compliance. DOE national labs such as NREL are well positioned to provide objective input on these technologies using large, national data sets in conjunction with OEM- and technology-specific testing. This project demonstrates an approach that combines vehicle testing (dynamometer and on-road) with powertrain modeling and simulation over large, representative datasets to quantify real-world fuel economy. The approach can be applied to specific off-cycle technologies (engine encapsulation, start/stop, connected vehicle, etc.) in A/B comparisons to support calculation of realistic real-world impacts. Future work will focus on testing-based A/B technology comparisons that demonstrate the significance of this approach.

  3. A quantitative assessment of the Hadoop framework for analyzing massively parallel DNA sequencing data

    OpenAIRE

    Siretskiy, Alexey; Sundqvist, Tore; Voznesenskiy, Mikhail; Spjuth, Ola

    2015-01-01

    Background New high-throughput technologies, such as massively parallel sequencing, have transformed the life sciences into a data-intensive field. The most common e-infrastructure for analyzing this data consists of batch systems that are based on high-performance computing resources; however, the bioinformatics software that is built on this platform does not scale well in the general case. Recently, the Hadoop platform has emerged as an interesting option to address the challenges of incre...
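    The map/shuffle/reduce pattern that Hadoop applies to sequencing data can be sketched in plain Python. This sketch has no Hadoop dependency; the k-mer-counting task and the function names are illustrative assumptions, not taken from the paper.

```python
from collections import Counter
from itertools import chain

def map_kmers(read, k=4):
    # Map step: emit (k-mer, 1) pairs for a single sequencing read.
    return [(read[i:i + k], 1) for i in range(len(read) - k + 1)]

def reduce_counts(pairs):
    # Reduce step: sum the counts per k-mer. Hadoop's shuffle, which
    # groups pairs by key across nodes, is implicit in this one process.
    totals = Counter()
    for kmer, n in pairs:
        totals[kmer] += n
    return dict(totals)

reads = ["ACGTACGT", "CGTACG"]
counts = reduce_counts(chain.from_iterable(map_kmers(r) for r in reads))
print(counts["ACGT"])  # 2
```

    Because each map call touches only one read, the work parallelizes across nodes with no shared state, which is the scaling property the record contrasts with batch HPC pipelines.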

  4. Analyzing the structure of computer expert training in higher education system

    OpenAIRE

    Sergej Rusakov; Igor' Semakin; Henner, E

    2010-01-01

    We consider a variety of programs for training computer and information technology experts, and analyze the structure of knowledge of IT experts of various types. With the help of cluster analysis, we have established three clusters of programs, which we chose to call "mathematician/programmer", "engineer/programmer", and "system administrator". We determine didactic units related to each module of the training and obtain expert evaluation of the coverage of the modules by the didactic units....

  5. Analyzing the forces binding a restriction endonuclease to DNA using a synthetic nanopore

    OpenAIRE

    Dorvel, B.; Sigalov, G.; Zhao, Q.; Comer, J.; Dimitrov, V; Mirsaidov, U.; Aksimentiev, A.; Timp, G.

    2009-01-01

    Restriction endonucleases are used prevalently in recombinant DNA technology because they bind so stably to a specific target sequence and, in the presence of cofactors, cleave double-helical DNA specifically at a target sequence at a high rate. Using synthetic nanopores along with molecular dynamics (MD), we have analyzed with atomic resolution how a prototypical restriction endonuclease, EcoRI, binds to the DNA target sequence—GAATTC—in the absence of a Mg2+ ion cofactor. We have previously...

  6. Study on Analyzing Monodisperse Uranium Oxide Particle by FT-TIMS

    Institute of Scientific and Technical Information of China (English)

    CHEN Yan; WANG Fan; ZHAO Yong-gang; LI Li-li; ZHANG Yan; SHEN Yan; CUI Jian-yong; LIU Yuang

    2012-01-01

    Environmental sampling is one of the important IAEA safeguards technologies, the aim of which is detecting undeclared nuclear activities. Analyzing the isotopic ratio of single uranium-bearing particles in swipe samples is an effective analytical technique, by virtue of its ability to reveal present or past information about nuclear facilities. For this purpose, a new method combining the fission track (FT) technique with thermal ionization mass spectrometry (TIMS) was developed.

  7. Realization of inhomogeneous magnetic field for prism-type mass analyzer

    Directory of Open Access Journals (Sweden)

    P.O. Kuzema

    2012-06-01

    The configuration of magnet polar tips that forms in the gap an inhomogeneous magnetic field with axial symmetry has been determined, and the technology for producing them is described. It is shown that, for a given value of the polar tip apex angle, the necessary inhomogeneity of the magnetic field can be provided by a suitable choice of the interpolar gap width of the mass analyzer magnet.

  8. STATISTICAL ANALYZE OF APEC COUNTRIES AND TURKEY IN TERMS OF KNOWLEDGE SOCIETY INDICATORS AND SOME FINDINGS

    OpenAIRE

    ERKEKOĞLU, Hatice; ARIÇ, K. Halil

    2013-01-01

    In this study, six variables related to the knowledge society and calculated by the World Bank were used. These variables are the Knowledge Economy Index (KEI), Knowledge Index (KI), Economic Incentive and Institutional Regime, Innovation System, Education and Human Resources, and Information and Communication Technologies. The study covers twenty APEC countries and Turkey. As a method, hierarchical cluster analysis was used. The probability of correct classification was evaluated by discriminant analysis...

  9. Analyzing the impact of course structure on electronic textbook use in blended introductory physics courses

    OpenAIRE

    Seaton, Daniel T.; Kortemeyer, Gerd; Bergner, Yoav; Rayyan, Saif; David E. Pritchard

    2013-01-01

    We investigate how elements of course structure (i.e., the frequency of assessments as well as the sequencing and weight of course resources) influence the usage patterns of electronic textbooks (e-texts) in introductory physics courses. Specifically, we analyze the access logs of courses at Michigan State University and the Massachusetts Institute of Technology, each of which deploy e-texts as primary or secondary texts in combination with different formative assessments (e.g., embedded read...

  10. Analyzer of energy spectra of a magnetized relativistic electron beam

    International Nuclear Information System (INIS)

    An analyzer of the instantaneous energy spectrum of a magnetized REB is described. The analyzer's operating principle is based on applying a sharp change in the direction of the magnetic field lines that is non-adiabatic for the beam electrons. The analyzer design is described, and the main factors affecting the energy resolution are considered. The analyzer's serviceability was examined in experiments on plasma heating using a high-current microsecond REB at the GOL-3 device. The analyzer's energy resolution does not exceed 10% at an energy of 0.8 MeV and 20% at 0.3 MeV. Beam energy spectra were obtained in one of the regimes of beam interaction with plasma. The efficiency of beam-plasma interaction, determined using the analyzer, reaches 30%. 10 refs.; 7 figs

  11. Exploration technology

    Energy Technology Data Exchange (ETDEWEB)

    Roennevik, H.C. [Saga Petroleum A/S, Forus (Norway)

    1996-12-31

    The paper evaluates exploration technology. Topics discussed are: Visions; the subsurface challenge; the creative tension; the exploration process; seismic; geology; organic geochemistry; seismic resolution; integration; drilling; value creation. 4 refs., 22 figs.

  12. Videodisc technology

    Energy Technology Data Exchange (ETDEWEB)

    Marsh, F.E. Jr.

    1981-03-01

    An overview of the technology of videodiscs is given. The emphasis is on systems that use reflection or transmission of laser light. Possible use of videodiscs for storage of bibliographic information is considered. 6 figures, 3 tables. (RWR)

  13. Plasma technology

    International Nuclear Information System (INIS)

    IREQ was contracted by the Canadian Electrical Association to review plasma technology and assess the potential for application of this technology in Canada. A team of experts in the various aspects of this technology was assembled and each team member was asked to contribute to this report on the applications of plasma pertinent to his or her particular field of expertise. The following areas were examined in detail: iron, steel and strategic-metals production; surface treatment by spraying; welding and cutting; chemical processing; drying; and low-temperature treatment. A large market for the penetration of electricity has been identified. To build up confidence in the technology, support should be provided for selected R and D projects, plasma torch demonstrations at full power, and large-scale plasma process testing

  14. Banana technology

    Science.gov (United States)

    van Amstel, Willem D.; Schellekens, E. P. A.; Walravens, C.; Wijlaars, A. P. F.

    1999-09-01

    With 'Banana Technology', an unconventional hybrid fabrication technology is indicated for the production of very large parabolic and hyperbolic cylindrical mirror systems. The banana technology uses elastic bending of very large, thin glass substrates and fixation onto NC-milled metal moulds. This technology has matured over the last twenty years in the manufacturing of large telecentric flat-bed scanners. Two construction types, called 'internal banana' and 'external banana', are presented. Optical figure quality requirements in terms of slope and curvature deviations are discussed. Measurements of these optical specifications by means of a 'finishing rod' type of scanning deflectometer, or slope tester, are presented. Design constraints for bending glass and the advantages of a new process are discussed.

  15. Globalization & technology

    DEFF Research Database (Denmark)

    Narula, Rajneesh

    Technology and globalization are interdependent processes. Globalization has a fundamental influence on the creation and diffusion of technology, which, in turn, affects the interdependence of firms and locations. This volume examines the international aspect of this interdependence at two levels, drawing on a "systems of innovation" understanding of learning. Narula and Smith reconcile an important paradox. On the one hand, locations and firms are increasingly interdependent through supranational organisations, regional integration, strategic alliances, and the flow of investments, technologies, ideas and people. The boundaries of firms and countries are increasingly porous and imprecise, because firms use alliances and outsourcing, and countries are rarely technologically self-sufficient. On the other hand, locations remain distinct and idiosyncratic, with innovation systems remaining largely nationally bound. Knowledge...

  16. Study on the Interaction Demands and Trends of Gene Technology and the U.S. Patent Regime: Analyzing the Origins of the Case 'Association for Molecular Pathology et al. v. Myriad Genetics, Inc., et al.'

    Institute of Scientific and Technical Information of China (English)

    吴秀文; 肖冬梅

    2015-01-01

    The conflict between the interests of holders of high-value gene patents and the rights to human life and health, the very stringent conditions of the research exemption under the US patent regime, and the open, inclusive attitude of US patent law toward patentable subject matter are the roots of the Myriad Genetics case; they also reveal the interactive demands of gene technology and the US patent regime. However, the US Supreme Court's holding that the patent claims on isolated DNA sequences were invalid can only temporarily quell the controversy that gene technology has brought to the patent regime, and falls far short of achieving a benign interaction between gene technology and the patent regime. Appropriately relaxing the conditions of the research exemption and narrowing the patentable subject matter of gene technology would balance the interests of gene patentees with the rights to human life and health, and promote the optimization of the patent regime with respect to gene technology.

  17. Digital signal processing in the radio science stability analyzer

    Science.gov (United States)

    Greenhall, C. A.

    1995-01-01

    The Telecommunications Division has built a stability analyzer for testing Deep Space Network installations during flight radio science experiments. The low-frequency part of the analyzer operates by digitizing wave signals with bandwidths between 80 Hz and 45 kHz. Processed outputs include spectra of signal, phase, amplitude, and differential phase; time series of the same quantities; and Allan deviation of phase and differential phase. This article documents the digital signal-processing methods programmed into the analyzer.
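    The Allan deviation of phase that the analyzer reports can be sketched as follows. This is a generic overlapping estimator for an evenly sampled phase record, written as an assumption about the standard definition, not the analyzer's actual DSP code.

```python
import numpy as np

def allan_deviation(phase, tau0, m):
    """Overlapping Allan deviation of an evenly sampled phase record.

    phase : phase samples in seconds; tau0 : sampling interval in seconds;
    m     : averaging factor, so the observation interval is tau = m * tau0.
    """
    x = np.asarray(phase, dtype=float)
    # Second differences of phase at stride m estimate frequency instability.
    d2 = x[2 * m:] - 2.0 * x[m:-m] + x[:-2 * m]
    tau = m * tau0
    avar = np.sum(d2 ** 2) / (2.0 * (len(x) - 2 * m) * tau ** 2)
    return float(np.sqrt(avar))

# A pure frequency offset (linear phase ramp) has zero Allan deviation.
print(allan_deviation(2.0 * np.arange(100), tau0=1.0, m=4))  # 0.0
```

    The same second-difference estimator applies to the differential phase between two channels, which is how the analyzer characterizes link stability.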

  18. Evaluation of the Olympus AU 400 clinical chemistry analyzer.

    Science.gov (United States)

    Bilić, A; Alpeza, I; Rukavina, A S

    2000-01-01

    The performance of the Olympus AU 400 clinical chemistry analyzer was evaluated according to the guidelines of the European Committee for Clinical Laboratory Standards. The following analytes were tested: glucose, urea, creatinine, calcium, AST, ALT, CK, LDH, ALP and amylase. The Olympus AU 400 was compared with the Olympus AU 800. Coefficients of correlation showed high correlation between the compared analyzers. Other performances (intra- and inter-assay variation, carry-over and interferences) of the analyzer were satisfactory.

  19. Technology Trade

    OpenAIRE

    José L. Groizard

    2008-01-01

    This study addresses the question of why some countries import more R&D- intensive goods than others. Using a panel data set of 80 countries for the period 1970 to 1995, results indicate that domestic investment, FDI and the quality of intellectual property rights (IPR) systems positively affect technology imports. However, the higher the percentage of the workforce with primary studies, the lower technology imports are. Moreover, IPRs tend to reinforce the positive role played by FDI in impo...

  20. Fabrication Technology

    Energy Technology Data Exchange (ETDEWEB)

    Blaedel, K.L.

    1993-03-01

    The mission of the Fabrication Technology thrust area is to have an adequate base of manufacturing technology, not necessarily resident at Lawrence Livermore National Laboratory (LLNL), to conduct the future business of LLNL. The specific goals continue to be to (1) develop an understanding of fundamental fabrication processes; (2) construct general purpose process models that will have wide applicability; (3) document findings and models in journals; (4) transfer technology to LLNL programs, industry, and colleagues; and (5) develop continuing relationships with the industrial and academic communities to advance the collective understanding of fabrication processes. The strategy to ensure success is changing. For technologies in which they are expert and which will continue to be of future importance to LLNL, they can often attract outside resources both to maintain their expertise by applying it to a specific problem and to help fund further development. A popular vehicle to fund such work is the Cooperative Research and Development Agreement with industry. For technologies needing development because of their future critical importance and in which they are not expert, they use internal funding sources. These latter are the topics of the thrust area. Three FY-92 funded projects are discussed in this section. Each project clearly moves the Fabrication Technology thrust area towards the goals outlined above. They have also continued their membership in the North Carolina State University Precision Engineering Center, a multidisciplinary research and graduate program established to provide the new technologies needed by high-technology institutions in the US. As members, they have access to and use of the results of their research projects, many of which parallel the precision engineering efforts at LLNL.

  1. Lasers technology

    International Nuclear Information System (INIS)

    The Laser Technology Program of IPEN is developed by the Center for Lasers and Applications (CLA) and is committed to the development of new lasers based on the research of new optical materials and new resonator technologies. Laser applications and research occur within several areas such as Nuclear, Medicine, Dentistry, Industry, Environment and Advanced Research. Additional goals of the Program are human resource development and innovation, in association with Brazilian Universities and commercial partners

  2. Lasers technology

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2014-07-01

    The Laser Technology Program of IPEN is developed by the Center for Lasers and Applications (CLA) and is committed to the development of new lasers based on the research of new optical materials and new resonator technologies. Laser applications and research occur within several areas such as Nuclear, Medicine, Dentistry, Industry, Environment and Advanced Research. Additional goals of the Program are human resource development and innovation, in association with Brazilian Universities and commercial partners.

  3. Knowledge Technologies

    OpenAIRE

    Milton, Nick

    2008-01-01

    Several technologies are emerging that provide new ways to capture, store, present and use knowledge. This book is the first to provide a comprehensive introduction to five of the most important of these technologies: Knowledge Engineering, Knowledge Based Engineering, Knowledge Webs, Ontologies and Semantic Webs. For each of these, answers are given to a number of key questions (What is it? How does it operate? How is a system developed? What can it be used for? What tools are available? Wha...

  4. Technological Innovation

    OpenAIRE

    Alexandra Bostan

    2009-01-01

    The spectacular development of technology in the fields of informatics and telecommunications over the last decade, associated with a postindustrial revolution, has contributed solidly to the globalization of contemporary international economic life. A very important factor in promoting the globalization of production and financial globalization is the recent progress in information and communication technology, which has a strong impact on economic, social and cultural life...

  5. IPAT: a freely accessible software tool for analyzing multiple patent documents with inbuilt landscape visualizer.

    Science.gov (United States)

    Ajay, Dara; Gangwal, Rahul P; Sangamwar, Abhay T

    2015-01-01

    Intelligent Patent Analysis Tool (IPAT) is an online data-retrieval tool that operates on a text-mining algorithm to extract specific patent information in a predetermined pattern into an Excel sheet. The software is designed and developed to retrieve and analyze technology information from multiple patent documents and to generate various patent landscape graphs and charts. The software is coded in C# in Visual Studio 2010; it extracts publicly available patent information from web pages such as Google Patents and simultaneously studies technology trends based on user-defined parameters. In other words, IPAT, combined with manual categorization, acts as an excellent technology-assessment tool in competitive intelligence and due diligence for forecasting future R&D. PMID:26452016

  6. Technology overview

    International Nuclear Information System (INIS)

    The Integrated Assessment Program, funded by the ERDA Division of Technology Overview, is the mechanism by which health, environmental, social, economic and institutional factors are combined into a form useful for energy planning and decision making. This program selectively combines information about effects of alternative energy technologies (such as waste releases, land and water use, and social effects) to produce broad-based assessments of the advantages and disadvantages of energy and conservation options. As a corollary, needs for further research, development, and technology transfer are identified. The program is focused on four interrelated activities: supporting systems analysis to develop and improve methods for use in assessing and comparing impacts of energy and conservation options; integrated technological impact assessment, applying these methods to help select technologies for development that are safe, clean, and environmentally acceptable; regional comparative assessments, applying the results of the technological impact assessments to identification of regional energy strategies; and a regional outreach effort to assist regional and state agencies in their energy planning programs

  7. Biomedical sensing analyzer (BSA) for mobile-health (mHealth)-LTE.

    Science.gov (United States)

    Adibi, Sasan

    2014-01-01

    The rapid expansion of mobile-based systems, the capabilities of smartphone devices, and radio access and cellular network technologies are the wind beneath the wings of mobile health (mHealth). In this paper, the concept of a biomedical sensing analyzer (BSA) is presented, a novel framework devised for sensor-based mHealth applications. The BSA is capable of formulating Quality of Service (QoS) measurements in an end-to-end sense, covering the entire communication path (wearable sensors, link technology, smartphone, cell towers, mobile cloud, and the end users). The characterization and formulation of the BSA depend on a number of factors, including the deployment of application-specific biomedical sensors, generic link technologies, the collection, aggregation, and prioritization of mHealth data, a cellular network based on the Long-Term Evolution (LTE) access technology, and extensive multidimensional delay analyses. The results are studied and analyzed in a LabVIEW 8.5 programming environment.
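
The end-to-end QoS view described above — covering sensors, link technology, smartphone, cell towers and cloud — can be sketched as a per-hop delay budget. The hop names and millisecond figures below are invented for illustration; they are not the paper's measured values.

```python
# Per-hop latency contributions along the mHealth path the BSA covers
# end to end. All figures are assumed, not measurements from the paper.
HOPS_MS = {
    "wearable_sensor": 5.0,    # sampling and packetization
    "link_technology": 12.0,   # e.g. a short-range wireless link
    "smartphone": 8.0,         # aggregation and prioritization
    "lte_access": 25.0,        # cellular uplink
    "mobile_cloud": 30.0,      # processing and delivery to the end user
}

def end_to_end_delay(hops):
    """Total latency over the whole communication path, in milliseconds."""
    return sum(hops.values())

def meets_budget(hops, budget_ms):
    """Check the path against an application-level QoS delay budget."""
    return end_to_end_delay(hops) <= budget_ms
```

A multidimensional analysis like the paper's would vary each hop's contribution (sensor type, link technology, cell load) and recheck the budget.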

  8. A Novel Analyzer Control System for Diffraction Enhanced Imaging

    Science.gov (United States)

    Rhoades, Glendon; Belev, George; Rosenberg, Alan; Chapman, Dean

    2013-03-01

    Diffraction Enhanced Imaging is an imaging modality that derives contrast from x-ray refraction, an extreme form of scatter rejection (extinction), and absorption, which is common to conventional radiography. A critical part of the imaging system is the "analyzer crystal", which is used to re-diffract the beam after it passes through the object being imaged. The analyzer and monochromator crystals form a matched parallel crystal set. The analyzer needs to be accurately aligned, and that alignment must be maintained over the course of an imaging session. Typically, the analyzer needs to remain at a specific angle to within a few tens of nanoradians to prevent problems with image interpretation. Ideally, the analyzer would be set to a specific angle and would remain at that angle over the course of an imaging session, which might last from a fraction of a second to several minutes or longer. In many instances, this requirement is well beyond what is possible by relying on mechanical stability alone, and some form of feedback to control the analyzer setting is required. We describe a novel analyzer control system that allows the analyzer to be set at any location in the analyzer rocking curve, including the peak location. The method is extensible to control the analyzer several Darwin widths away from the peaked location. Such a system is necessary for the accurate implementation of the method and is intended to make its use simpler without relying on repeated alignment during the imaging session.
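
A minimal sketch of the kind of feedback loop described, holding the analyzer at a chosen point on its rocking curve. The Gaussian curve model, Darwin width and loop gain are assumptions for illustration, not the actual system's parameters.

```python
import math

# Toy feedback loop holding the analyzer at a chosen reflectivity point.
# Gaussian rocking-curve model, width and gain are all assumed values.
DARWIN_WIDTH = 5e-6  # radians (assumed)

def rocking_curve(theta):
    """Analyzer reflectivity vs. angle, modeled here as a Gaussian."""
    return math.exp(-0.5 * (theta / DARWIN_WIDTH) ** 2)

def hold_setpoint(theta0, target_reflectivity, gain=2e-6, steps=200):
    """Nudge the analyzer angle until measured reflectivity matches the
    target, working on the low-angle flank where the curve is monotonic."""
    theta = theta0
    for _ in range(steps):
        error = target_reflectivity - rocking_curve(theta)
        theta += gain * error  # positive error -> move up the flank
    return theta

# Settle at the half-reflectivity point, starting 1.5 Darwin widths away.
theta = hold_setpoint(-1.5 * DARWIN_WIDTH, 0.5)
```

The real system would replace `rocking_curve` with a detector reading and would need the extensions the paper mentions to reach points several Darwin widths from the peak, where the flank signal vanishes.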

  9. THE LINK BETWEEN NON TECHNOLOGICAL INNOVATIONS AND TECHNOLOGICAL INNOVATION

    OpenAIRE

    Nguyen-Thi, Thuc Uyen; Mothe, Caroline

    2010-01-01

    Purpose: This paper aims to provide evidence of the major role of non-technological activities in the innovation process. It highlights the effects of marketing and organizational innovation strategies on technological innovation performance. Design/methodology/approach: The article tests theoretical hypotheses on a sample of 555 firms from the 4th Community Innovation Survey (CIS 4) of 2006 in Luxembourg. Data are analyzed through a generalized Tobit model. Findings: In the present study, evidence...

  10. Technology and technology transfer: some basic issues

    OpenAIRE

    Shamsavari, Ali; Adikibi, Owen; Taha, Yasser

    2002-01-01

    This paper addresses various issues relating to technology and transfer of technology such as technology and society, technology and science, channels and models of technology transfer, the role of multinational companies in transfer of technology, etc. The ultimate objective is to pose the question of relevance of some existing models and ideas like technological independence in an increasingly globalised world economy.

  11. Technological Unemployment and an Attainable Way Out

    OpenAIRE

    Pavlova, Adelina

    2015-01-01

    The purpose of the thesis is to analyze the available information on the technological unemployment issue. The hypothesis of the thesis is that the displacement of workers by technological development has a reasonable chance of happening in the future. Technological unemployment is a hotly debated issue. Some economists argue that technological unemployment is a short-term problem; others see it as a risk for society. Thus, at this stage it is important to identify controversy in studie...

  12. Information Technology Teachers' Practices Regarding Mainstreaming Students

    OpenAIRE

    ERDOĞDU, Funda; ÖZBEY, Fidan

    2014-01-01

    This study is conducted to investigate information technology teachers’ practices related to instructional adaptations for mainstreaming students at information technology courses. In the research, the data were collected with the qualitatively patterned and semi-structured interview form. The data obtained from the interviews were analyzed through content analysis. The research findings indicate that information technology teachers at information technology courses carry out such practices a...

  13. A COGNITIVE AND MEMETIC SCIENCE APPROACH TO ANALYZE THE HUMAN FACTORS IN PREDICTING THE EVOLUTION OF PAPER TECHNOLOGY AND PRODUCTS IN THE 21ST CENTURY

    Institute of Scientific and Technical Information of China (English)

    尾锅史彦

    2004-01-01

    INTRODUCTION: Predicting the future of the paper industry is conventionally conducted from technological and market-oriented aspects, as well as from the variety of constraints lying ahead of the industry such as resource, energy, and environmental issues. Since paper products, particularly paper media, have a higher affinity to human beings compared with other sheet-like materials such as plastics, metals and glasses, not only the above factors but also human factors such as 'the affinity of paper to human beings' and 'the cognitive characteristics of paper' have to be taken into consideration in constructing a precise prediction model for the future of the paper industry.

  14. 21 CFR 868.1720 - Oxygen gas analyzer.

    Science.gov (United States)

    2010-04-01

    ... gases by techniques such as mass spectrometry, polarography, thermal conductivity, or gas chromatography... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Oxygen gas analyzer. 868.1720 Section 868.1720...) MEDICAL DEVICES ANESTHESIOLOGY DEVICES Diagnostic Devices § 868.1720 Oxygen gas analyzer....

  15. Control of a pulse height analyzer using an RDX workstation

    International Nuclear Information System (INIS)

    The Nuclear Chemistry Division of Lawrence Livermore National Laboratory is in the midst of upgrading its radiation counting facilities to automate data acquisition and quality control. This upgrade requires control of a pulse height analyzer (PHA) from an interactive LSI-11/23 workstation running RSX-11M. The PHA is a microcomputer-based multichannel analyzer system providing data acquisition, storage, display, manipulation and input/output from up to four independent acquisition interfaces. Control of the analyzer includes reading and writing energy spectra, issuing commands, and servicing device interrupts. The analyzer communicates with the host system over a 9600-baud serial line using the Digital Data Communications link level Protocol (DDCMP). We relieved the RSX workstation CPU of the DDCMP overhead by implementing a DEC-compatible, in-house designed DMA serial line board (the ISL-11) to communicate with the analyzer. An RSX I/O device driver was written to complete the path between the analyzer and the RSX system by providing the link between the communication board and an application task. The I/O driver is written to handle several ISL-11 cards operating in parallel, thus providing support for control of multiple analyzers from a single workstation. The RSX device driver, its design and use by application code controlling the analyzer, and its operating environment are discussed.

  16. 40 CFR 89.318 - Analyzer interference checks.

    Science.gov (United States)

    2010-07-01

    ...) Bubble a mixture of 3 percent CO2 in N2 through water at room temperature and record analyzer response... check. A CO2 span gas having a concentration of 80 percent to 100 percent of full scale of the maximum operating range used during testing shall be passed through the CO2 NDIR analyzer and the value recorded...

  17. 40 CFR 86.122-78 - Carbon monoxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... nitrogen. (3) Bubble a mixture of 3 percent CO2 in N2 through water at room temperature and record analyzer... monoxide analyzer shall be checked for response to water vapor and CO2: (1) Follow the manufacturer's... carbon monoxide in N2 calibration gases having nominal concentrations of 15, 30, 45, 60, 75, and...

  18. 40 CFR 91.325 - Analyzer interference checks.

    Science.gov (United States)

    2010-07-01

    ... room temperature a CO2 span gas having a concentration of between 80 percent and 100 percent inclusive... expected concentrations experienced during testing. (1) NOX analyzer CO2 quench check. (i) Pass a CO2 span... used during testing through the CO2 NDIR analyzer and record the value as “a.” (ii) Dilute the CO2...

  19. 40 CFR 90.317 - Carbon monoxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... CO2 in N2 through water at room temperature and record analyzer response. (4) An analyzer response of... vapor and CO2. (1) Follow good engineering practices for instrument start-up and operation. Adjust the... with carbon monoxide-in-N2 calibration gases having nominal concentrations between 10 and 90 percent...

  20. DUAL-CHANNEL PARTICLE SIZE AND SHAPE ANALYZER

    Institute of Scientific and Technical Information of China (English)

    Arjen van der Schoot

    2004-01-01

    Fig. 1 shows a newly developed analyzer (Ankersmid CIS-100) that brings together two different measurement channels for accurate size and shape measurement of spherical and non-spherical particles. The size of spherical particles is measured by a HeNe laser beam; the size of non-spherical particles is analyzed by Dynamic Video Analysis of the particles' shape.

  1. Development of parallel/serial program analyzing tool

    International Nuclear Information System (INIS)

    Japan Atomic Energy Research Institute has been developing 'KMtool', a parallel/serial program analyzing tool, in order to promote the parallelization of the science and engineering computation program. KMtool analyzes the performance of program written by FORTRAN77 and MPI, and it reduces the effort for parallelization. This paper describes development purpose, design, utilization and evaluation of KMtool. (author)

  2. Empirical Validity of Ertl's Brain-Wave Analyzer (BWA02).

    Science.gov (United States)

    Fischer, Donald G.; And Others

    1978-01-01

    The empirical validity of Ertl's brain wave analyzer was investigated by the known contrasted groups method. Thirty-two academically talented and 16 academically handicapped were compared on four Primary Mental Abilities tests, two Sequential Tests of Educational Progress measures, and seven Brain Wave Analyzer measures. (Author/JKS)

  3. MESAFace, a graphical interface to analyze the MESA output

    CERN Document Server

    Giannotti, Maurizio; Mohammed, Aaron

    2012-01-01

    MESA (Modules for Experiments in Stellar Astrophysics) has become very popular among astrophysicists as a powerful and reliable code to simulate stellar evolution. Analyzing the output data thoroughly may, however, present some challenges and be rather time-consuming. Here we describe MESAFace, a graphical and dynamical interface which provides an intuitive, efficient and quick way to analyze the MESA output.

  4. 21 CFR 868.1120 - Indwelling blood oxyhemoglobin concentration analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Indwelling blood oxyhemoglobin concentration... Indwelling blood oxyhemoglobin concentration analyzer. (a) Identification. An indwelling blood oxyhemoglobin concentration analyzer is a photoelectric device used to measure, in vivo, the oxygen-carrying capacity...

  5. 40 CFR 90.318 - Oxides of nitrogen analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... at any point, use the best-fit non-linear equation which represents the data to within two percent of... been set to the most common operating range. (4) Introduce into the NOX generator analyzer-system an NO... off the NOX generator but maintain gas flow through the system. The oxides of nitrogen analyzer...
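
The "best-fit non-linear equation which represents the data to within two percent" step can be illustrated with a least-squares quadratic fit through span-gas points. The calibration concentrations and responses below are made up for the sketch; the regulation does not prescribe this particular functional form or fitting method.

```python
# Fit a quadratic through analyzer calibration points and verify it
# reproduces each point to within 2%. Data values are invented.

def det3(m):
    """Determinant of a 3x3 matrix given as nested lists."""
    a, b, c = m[0]
    d, e, f = m[1]
    g, h, i = m[2]
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def fit_quadratic(xs, ys):
    """Least-squares y = p0 + p1*x + p2*x^2 via normal equations (Cramer)."""
    s = [sum(x ** k for x in xs) for k in range(5)]        # power sums of x
    t = [sum(y * x ** k for x, y in zip(xs, ys)) for k in range(3)]
    A = [[s[0], s[1], s[2]],
         [s[1], s[2], s[3]],
         [s[2], s[3], s[4]]]
    D = det3(A)
    coeffs = []
    for col in range(3):
        Ac = [row[:] for row in A]
        for r in range(3):
            Ac[r][col] = t[r]          # replace one column with the RHS
        coeffs.append(det3(Ac) / D)
    return coeffs

def max_relative_error(coeffs, xs, ys):
    """Worst deviation of the fitted curve from the calibration points."""
    p0, p1, p2 = coeffs
    return max(abs(p0 + p1 * x + p2 * x * x - y) / y for x, y in zip(xs, ys))

conc = [10, 25, 40, 55, 70, 85]              # span-gas points, % of range
resp = [c - 0.0005 * c * c for c in conc]    # mildly nonlinear response
coeffs = fit_quadratic(conc, resp)
```

If `max_relative_error` exceeded 0.02, the regulation's two-percent criterion would not be met and a different curve (or rechecked data) would be needed.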

  6. 40 CFR 86.1221-90 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ...-Fueled and Methanol-Fueled Heavy-Duty Vehicles § 86.1221-90 Hydrocarbon analyzer calibration. The FID... methanol-fueled vehicles shall be operated at 235° ±15 °F (113° ±8 °C)). Analyzers used with gasoline-fuel...-fuel may be optimized using methane, or if calibrated using propane the FID response to methane...

  7. The design of wavelength selector for full-automatic ELISA analyzer

    Science.gov (United States)

    Bao, Yan; Dong, Mingli; Zhu, Lianqing; Chang, Haitao; Li, Hong

    2011-05-01

    In recent years, ELISA technology has developed rapidly, and fully automatic ELISA analyzers, which are of significant practical value, are widely used in the diagnosis of many diseases caused by bacteria and viruses. The optical detection system is the core of a fully automated ELISA analyzer, and the key part of that system is a high-precision wavelength selector. In this paper the authors present the design of a wavelength selector composed of a light source, an optical circuit, color filters and a signal acquisition module. A control system for the stepper motor that selects the appropriate color filter, based on an 8051 microcontroller, is introduced. Experimental test results show that the wavelength selector meets the requirements of the fully automatic ELISA analyzer.
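
The filter-selection logic of such a wavelength selector can be sketched as mapping a requested wavelength to the nearest filter slot and a stepper-motor move. The wavelengths, slot layout and steps-per-slot value below are invented for illustration; the actual instrument drives the stepper from an 8051 microcontroller.

```python
# Hypothetical filter wheel: measurement wavelength (nm) -> slot index.
# Values are illustrative, not the instrument's real filter set.
FILTER_WHEEL = {405: 0, 450: 1, 492: 2, 630: 3}
STEPS_PER_SLOT = 400  # assumed stepper resolution between adjacent slots

def select_filter(requested_nm, current_slot):
    """Pick the closest available filter and return (slot, steps to move),
    always rotating the wheel in one direction to avoid backlash."""
    nearest = min(FILTER_WHEEL, key=lambda nm: abs(nm - requested_nm))
    slot = FILTER_WHEEL[nearest]
    delta = (slot - current_slot) % len(FILTER_WHEEL)
    return slot, delta * STEPS_PER_SLOT
```

The single-direction rotation is a common design choice for repeatability; the real controller would also debounce a home-position sensor before trusting `current_slot`.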

  8. Technology cycles and technology revolutions

    Energy Technology Data Exchange (ETDEWEB)

    Paganetto, Luigi; Scandizzo, Pasquale Lucio

    2010-09-15

    Technological cycles have been characterized as the basis of long and continuous periods of economic growth through sustained changes in total factor productivity. While this hypothesis is in part consistent with several theories of growth, the sheer magnitude and length of the economic revolutions experienced by humankind suggest that more attention should be given to the origin of major technological and economic changes, with reference to one crucial question: the role of the production and use of energy in economic development.

  9. Analyzing Service Oriented Architecture (SOA) in Open Source Products

    OpenAIRE

    Gohar, Adnan

    2010-01-01

    Service Oriented Architecture (SOA) is an architectural paradigm that allows building of infrastructures for diverse application interaction and integration via services across different platforms, domains of technology and locations. SOA differs from traditional architectures, as it focuses on integrating capabilities that are distributed and implemented using a mixture of technologies. SOA provides a set of methodologies and strategies to accomplish interoperability and integration among di...

  10. Persuasive Technology

    DEFF Research Database (Denmark)

    This book constitutes the proceedings of the 5th International Conference on Persuasive Technology, PERSUASIVE 2010, held in Copenhagen, Denmark, in June 2010. The 25 papers presented were carefully reviewed and selected from 80 submissions. In addition, three keynote papers are included in this volume. The topics covered are emotions and user experience, ambient persuasive systems, persuasive design, persuasion profiles, designing for health, psychology of persuasion, embodied and conversational agents, economic incentives, and future directions for persuasive technology.

  11. Experimental analysis of a new retarding field energy analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Wei, Yu-Xiang [Shanghai Institute of Mechanical and Electrical Engineering, No. 3888, Yuanjiang Road, Minhang District, Shanghai 201109 (China); Institute of Electronics, Chinese Academy of Sciences, No. 19, North 4th Ring Road West, Haidian District, Beijing 100190 (China); Liu, Shu-Qing; Li, Xian-Xia; Shen, Hong-Li; Huang, Ming-Guang [Institute of Electronics, Chinese Academy of Sciences, No. 19, North 4th Ring Road West, Haidian District, Beijing 100190 (China); Liu, Pu-Kun, E-mail: pkliu@pku.edu.cn [School of Electronics Engineering and Computer Science, Peking University, No. 5, Yiheyuan Road, Haidian District, Beijing 100871 (China)

    2015-06-11

    In this paper, a new compact retarding field energy analyzer (RFEA) is designed for diagnosing the electron beams of a K-band space travelling-wave tube (TWT). The analyzer has an aperture plate to sample electron beams and a cylindrical electrode to overcome defocusing effects. The front end of the analyzer, constructed as a multistage depression collector (MDC) structure, is intended to shape the field to prevent electrons from being accelerated and escaping. The direct-current (DC) beams of the K-band space TWTs, with the MDC removed, can be investigated on the beam measurement system. The current density distribution of the DC beams is determined by the analyzer while the anode voltage and helix voltage of the TWTs are 7000 V and 6850 V, respectively. The slope of the current curve caused by the reflection of secondary electrons on the copper collector of the analyzer is discussed. The experimental analysis shows this RFEA has a good energy resolution that satisfies the requirements of beam measurement. - Highlights: • A new retarding field energy analyzer (RFEA) is designed to diagnose the electron beam of a K-band space TWT. • The current density distribution of the direct-current beam is determined. • The reflection effect of secondary electrons on the copper collector of the analyzer is discussed.

  12. Soulful Technologies

    DEFF Research Database (Denmark)

    Fausing, Bent

    2010-01-01

    Samsung introduced in 2008 a mobile phone called "Soul", made with a human touch and including a "magic touch". Through the analysis of Nokia mobile phone TV commercials, I want to examine the function and form of digital technology in everyday images. The mobile phone and its digital camera ... commercials and internet commercials for mobile phones from Nokia, or handheld computers, as Sony-Ericsson prefers to call them. Digital technology points towards a forgotten pre-human, and not only post-human, condition.

  13. Mirror Technology

    Science.gov (United States)

    1992-01-01

    Under a NASA contract, MI-CVD developed a process for producing bulk silicon carbide by means of a chemical vapor deposition process. The technology allows growth of a high purity material with superior mechanical/thermal properties and high polishability - ideal for mirror applications. The company employed the technology to develop three research mirrors for NASA Langley and is now marketing it as CVD SILICON CARBIDE. Its advantages include light weight, thermal stability and high reflectivity. The material has nuclear research facility applications and is of interest to industrial users of high power lasers.

  14. Playful Technology

    DEFF Research Database (Denmark)

    Johansen, Stine Liv; Eriksson, Eva

    2013-01-01

    In this paper, the design of future services for children in Danish public libraries is discussed in the light of new challenges and opportunities in relation to new media and technologies. The Danish government has over the last few years initiated and described a range of initiatives regarding ... in the library, the changing role of the librarians and the library space. We argue that intertwining traditional library services with new media forms and engaging play is the core challenge for future design in physical public libraries, but also that it is through new media and technology that new...

  15. Architectural technology

    DEFF Research Database (Denmark)

    2005-01-01

    The booklet offers an overall introduction to the Institute of Architectural Technology and its projects and activities, and an invitation to the reader to contact the institute or the individual researcher for further information. The research, which takes place at the Institute of Architectural Technology at the Royal Danish Academy of Fine Arts, School of Architecture, reflects a spread between strategic, goal-oriented pilot projects, commissioned by a ministry, a fund or a private company, and, on the other hand, projects which originate from strong personal interests and the enthusiasm of individual...

  16. Applying a structural equation model to analyze management systems in the integration of CSR and its influence on the strategy and performance of technology companies

    Directory of Open Access Journals (Sweden)

    Bernal Conesa, Juan Andrés

    2016-06-01

    Full Text Available The importance of management systems for the integration of CSR into company strategy is a vital resource that has been little studied in technology companies. In this paper a structural equation model is proposed to explain the influence of CSR and its integration into the management system of the company, an integration facilitated by the existence of previous standardized management systems, and how this integration affects the strategy of the company and whether this is reflected in the economic performance of the technology company. The study was conducted in companies located in Spanish science and technology parks. The results of the model reveal a positive, direct and statistically significant relationship between the integration of CSR and strategy, on the one hand, and between integration and performance, on the other. The results also show indirect relationships between the management systems standardized prior to the implementation of CSR and performance, and therefore carry practical implications for the management of CSR in technology companies.

  17. International Space Station Major Constituent Analyzer On-orbit Performance

    Science.gov (United States)

    Gardner, Ben D.; Erwin, Phillip M.; Wiedemann, Rachel; Matty, Chris

    2016-01-01

    The Major Constituent Analyzer (MCA) is a mass spectrometer based system that measures the major atmospheric constituents on the International Space Station. A number of limited-life components require periodic change-out, including the ORU 02 analyzer and the ORU 08 Verification Gas Assembly. The most recent ORU 02 and ORU 08 assemblies are operating nominally. For ORU 02, the ion source filaments and ion pump lifetime continue to be key determinants of MCA performance. Additionally, testing is underway to evaluate the capacity of the MCA to analyze ammonia. Finally, plans are being made to bring the second MCA on ISS to an operational configuration.

  18. Emergency response training with the BNL plant analyzer

    International Nuclear Information System (INIS)

    Presented is the experience in the use of the BNL Plant Analyzer for NRC emergency response training in simulated accidents in a BWR. The unique features of the BNL Plant Analyzer that are important for emergency response training are summarized. A closed-loop simulation of all the key systems of the power plant in question was found essential to the realism of the emergency drills conducted at the NRC. The faster-than-real-time simulation speeds afforded by the BNL Plant Analyzer have demonstrated its usefulness for the timely conduct of emergency response training.

  19. Relations between the technological standards and technological appropriation

    Directory of Open Access Journals (Sweden)

    Carlos Alberto PRADO GUERRERO

    2010-06-01

    Full Text Available The objective of this study is to analyze the educational practices of using Blackboard in blended learning environments with students of higher education, in order to understand the relationship between technological appropriation and standards of educational technology. To achieve that goal, the following research question was raised: To what extent are the standards of educational technology related to the appropriation of technology in blended learning environments in higher education? The contextual framework of this work includes the following topics: the institution, teaching, teachers and students. A correlational design methodology was used. Correlations were carried out to determine the frequency and level of both the technological standards and the appropriation of technology. Comparing the results obtained by the students, the teachers and the platform, we found that students in the school under study showed a high degree of technological appropriation, matched by their performance on the technological standards. It was established that teachers play a key role in developing students' technological appropriation and their performance on technology standards.

  20. Development of a Virtual Technology Coach to Support Technology Integration for K-12 Educators

    Science.gov (United States)

    Sugar, William; van Tryon, Patricia J. Slagter

    2014-01-01

    In an effort to develop a virtual technology coach for K-12 educators, this article analyzed survey results from sixty teachers with regards to specific resources that a technology coach could provide within a virtual environment. A virtual technology coach was proposed as a possible solution to provide continual professional development for…

  1. Technological Dynamics and Social Capability

    DEFF Research Database (Denmark)

    Fagerberg, Jan; Feldman, Maryann; Srholec, Martin

    2014-01-01

    This article analyzes factors shaping technological capabilities in the USA and European countries, and shows that the differences between the two continents in this respect are much smaller than commonly assumed. The analysis demonstrates a tendency toward convergence in technological capabilities for the sample as a whole between 1998 and 2008. The results indicate that social capabilities, such as a well-developed public knowledge infrastructure, an egalitarian distribution of income, a participatory democracy and a prevalence of public safety, condition the growth of technological capabilities. Possible...

  2. Mini Total Organic Carbon Analyzer (miniTOCA) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Total Organic Carbon (TOC) analyzers function by converting (oxidizing) all organic compounds (contaminants) in the water sample to carbon dioxide gas (CO2), then...

  3. Triple Isotope Water Analyzer for Extraplanetary Studies Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Los Gatos Research (LGR) proposes to employ Off-Axis ICOS to develop triple-isotope water analyzers for lunar and other extraplanetary exploration. This instrument...

  4. Mars & Multi-Planetary Electrical Environment Spectrum Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Our objective is to develop MENSA as a highly integrated planetary radio and digital spectrum analyzer cubesat payload that can be deployed as a satellite...

  5. Lab-on-a-chip astrobiology analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall goal of this program (through Phase III) is to develop an astrobiology analyzer to measure chemical signatures of life in extraterrestrial settings. The...

  6. Analyzing Software Errors in Safety-Critical Embedded Systems

    Science.gov (United States)

    Lutz, Robyn R.

    1994-01-01

    This paper analyzes the root causes of safety-related software faults and finds that faults identified as potentially hazardous to the system are distributed somewhat differently over the set of possible error causes than non-safety-related software faults.

  7. Mobile Greenhouse Gas Flux Analyzer for Unmanned Aerial Vehicles Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Los Gatos Research (LGR) proposes to develop highly-accurate, lightweight, low-power gas analyzers for measurements of carbon dioxide (CO2) and water vapor (H2O)...

  8. Mobile Greenhouse Gas Flux Analyzer for Unmanned Aerial Vehicles Project

    Data.gov (United States)

    National Aeronautics and Space Administration — In this SBIR Phase I effort, Los Gatos Research (LGR) proposes to develop a highly-accurate, lightweight, low-power gas analyzer for eddy flux covariance...

  9. 40 CFR 1065.250 - Nondispersive infra-red analyzer.

    Science.gov (United States)

    2010-07-01

    ... analyzer that has compensation algorithms that are functions of other gaseous measurements and the engine's known or assumed fuel properties. The target value for any compensation algorithm is 0.0% (that is,...

  10. 40 CFR 1065.272 - Nondispersive ultraviolet analyzer.

    Science.gov (United States)

    2010-07-01

    ... in § 1065.307. You may use a NDUV analyzer that has compensation algorithms that are functions of... compensation algorithm is 0.0% (that is, no bias high and no bias low), regardless of the uncompensated...

  11. PICARD - A PIpeline for Combining and Analyzing Reduced Data

    Science.gov (United States)

    Gibb, Andrew G.; Jenness, Tim; Economou, Frossie

    PICARD is a facility for combining and analyzing reduced data, normally the output of the ORAC-DR data reduction pipeline. This document provides an introduction to using PICARD for processing instrument-independent data.

  12. Online Spectral Fit Tool for Analyzing Reflectance Spectra

    Science.gov (United States)

    Penttilä, A.; Kohout, T.

    2015-11-01

    The Online Spectral Fit Tool is developed for analyzing Vis-NIR spectral behavior of asteroids and meteorites. Implementation is done using JavaScript/HTML. Fitted spectra consist of spline continuum and gamma distributions for absorption bands.
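
The fitted model described — a smooth continuum with gamma-distribution-shaped absorption bands — can be sketched as below. The linear continuum and the band parameters are illustrative assumptions, not fits to any real asteroid spectrum (and the actual tool uses a spline, not a line, for the continuum; it is also implemented in JavaScript, not Python).

```python
import math

# Toy reflectance model: continuum minus gamma-shaped absorption bands.
# All parameter values are invented for illustration.
def gamma_band(wl, center, shape, scale, depth):
    """Absorption profile shaped like a gamma pdf with onset at `center`
    (wavelengths in microns); zero shortward of the onset."""
    x = wl - center
    if x <= 0:
        return 0.0
    pdf = x ** (shape - 1) * math.exp(-x / scale) / (math.gamma(shape) * scale ** shape)
    return depth * pdf

def model_reflectance(wl, continuum, bands):
    """Linear continuum (slope, intercept) minus the sum of the bands."""
    slope, intercept = continuum
    r = slope * wl + intercept
    for band in bands:
        r -= gamma_band(wl, *band)
    return r

bands = [(0.85, 2.0, 0.08, 0.02)]   # one band near the 1-micron region
spectrum = [model_reflectance(w / 100, (0.1, 0.9), bands) for w in range(50, 250)]
```

Fitting would then adjust the continuum and band parameters to minimize the residual against the measured Vis-NIR spectrum.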

  13. Airspace Analyzer for Assessing Airspace Directional Permeability Project

    Data.gov (United States)

    National Aeronautics and Space Administration — We build a software tool which enables the user (airline or Air Traffic Service Provider (ATSP)) the ability to analyze the flight-level-by-flight-level...

  14. Josephson junction spectrum analyzer for millimeter and submillimeter wavelengths

    Energy Technology Data Exchange (ETDEWEB)

    Larkin, S.Y.; Anischenko, S.E.; Khabayev, P.V. [State Research Center, Kiev (Ukraine)

    1994-12-31

    A prototype of the Josephson-effect spectrum analyzer developed for the millimeter-wave band is described. The measurement results for spectra obtained in the frequency band from 50 to 250 GHz are presented.

  15. Multisensor Analyzed Sea Ice Extent - Northern Hemisphere (MASIE-NH)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Multisensor Analyzed Sea Ice Extent Northern Hemisphere (MASIE-NH) products provide measurements of daily sea ice extent and sea ice edge boundary for the...

  16. Technology Transfer

    Science.gov (United States)

    Bullock, Kimberly R.

    1995-01-01

The development and application of new technologies in the United States has always been important to the economic well-being of the country. The National Aeronautics and Space Administration (NASA) has been an important source of these new technologies for almost four decades. Recently, increasing global competition has emphasized the importance of fully utilizing federally funded technologies. Today NASA must meet its mission goals while, at the same time, conducting research and development that contributes to securing US economic growth. NASA technologies must be quickly and effectively transferred into commercial products. In order to accomplish this task, NASA has formulated a new way of doing business with the private sector. Emphasis is placed on forming mutually beneficial partnerships between NASA and US industry. New standards have been set for this process that increase effectiveness, efficiency, and the timeliness of customer response. This summer I have identified potential markets for two NASA inventions: the Radially Focused Eddy Current Sensor for Characterization of Flaws in Metallic Tubing and the Radiographic Moire. I have also worked to establish a cooperative program with TAG, private industry, and a university, known as the TAG/Industry/Academia Program.

  17. Manufacturing technologies

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-09-01

The Manufacturing Technologies Center is an integral part of Sandia National Laboratories, a multiprogram engineering and science laboratory, operated for the Department of Energy (DOE) with major facilities at Albuquerque, New Mexico, and Livermore, California. Our Center is at the core of Sandia's Advanced Manufacturing effort which spans the entire product realization process.

  18. Energy Technology.

    Science.gov (United States)

    Eaton, William W.

    Reviewed are technological problems faced in energy production including locating, recovering, developing, storing, and distributing energy in clean, convenient, economical, and environmentally satisfactory manners. The energy resources of coal, oil, natural gas, hydroelectric power, nuclear energy, solar energy, geothermal energy, winds, tides,…

  19. Lasers technology

    International Nuclear Information System (INIS)

The Lasers Technology Program of IPEN is strongly committed to the study of laser applications in several areas: nuclear, medicine and dentistry, industry, environment and advanced research, aiming not only at research but also at diffusion and innovation in association with Brazilian universities and commercial partners

  20. Technology transfer

    International Nuclear Information System (INIS)

This paper focuses on the specific areas of design, engineering and component production. It presents what Framatome has to offer in these areas and its export-oriented philosophy. A typical example of this technology transfer philosophy is the collaboration with the South Korean firm Korea Heavy Industries Corporation (KHIC) for the supply of the KNU 9 and KNU 10 power stations

  1. Seafood Technology

    DEFF Research Database (Denmark)

    Børresen, Torger

... must be performed such that total traceability and authenticity of the final products can be presented on demand. The most important aspects to be considered within seafood technology today are safety, healthy products and high eating quality. Safety can be divided into microbiological safety...

  2. Biomethane technology

    OpenAIRE

    Ogejo, Jactone Arogo; Wen, Zhiyou; Ignosh, John; Bendfeldt, Eric S.; Collins, Eldridge, 1940-

    2009-01-01

    This publication provides a general overview of anaerobic digestion and the current status of biomethane technology on livestock farms in the United States. It is part of the Bioenergy Engineering Education Program (BEEP) of the Biological Systems Engineering Department at Virginia Tech. Most of the discussion uses dairy manure as an example of feedstock for an anaerobic digester.

  3. Development of hydrazine analyzer for high concentration of ethanolamine

    International Nuclear Information System (INIS)

A polarographic electrode method has conventionally been used for monitoring hydrazine concentrations in the feedwater of pressurized water reactors (PWRs) to control secondary water chemistry. Measurements by this existing hydrazine analyzer of hydrazine concentrations in feedwater controlled at high pH with ethanolamine (ETA) were not stable on some occasions when the plant was in power operation. To solve this problem we developed a new hydrazine analyzer using p-dimethylaminobenzaldehyde (DABA) colorimetry and flow injection analysis (FIA) methods. (author)

  4. Momentum analyzers DCBA for neutrinoless double beta decay experiments

    International Nuclear Information System (INIS)

    Momentum analyzers called Drift Chamber Beta-ray Analyzer (DCBA) are being developed at KEK in order to search for neutrinoless double beta decays of nuclei. A test prototype, DCBA-T2, has been constructed to confirm the principle detecting electron tracks in a uniform magnetic field. Another prototype, DCBA-T3, is now under construction to improve the energy resolution. The test results and the present statuses of these prototypes are presented.

  5. Momentum analyzers DCBA for neutrinoless double beta decay experiments

    Energy Technology Data Exchange (ETDEWEB)

    Ishihara, Nobuhiro, E-mail: nobuhiro.ishihara@kek.j [High Energy Accelerator Research Organization (KEK), 1-1 Oho, Tsukuba, Ibaraki 305-0801 (Japan)

    2010-11-01

    Momentum analyzers called Drift Chamber Beta-ray Analyzer (DCBA) are being developed at KEK in order to search for neutrinoless double beta decays of nuclei. A test prototype, DCBA-T2, has been constructed to confirm the principle detecting electron tracks in a uniform magnetic field. Another prototype, DCBA-T3, is now under construction to improve the energy resolution. The test results and the present statuses of these prototypes are presented.

  6. DendrometeR: Analyzing the pulse of trees in R

    OpenAIRE

    Ernst van der Maaten; Marieke van der Maaten-Theunissen; Marko Smiljanić; Sergio Rossi; Sonia Simard; Martin Wilmking; Annie Deslauriers; Patrick Fonti; Georg von Arx; Olivier Bouriaud

    2016-01-01

    Dendrometers are measurement devices proven to be useful to analyze tree water relations and growth responses in relation to environmental variability. To analyze dendrometer data, two analytical methods prevail: (1) daily approaches that calculate or extract single values per day, and (2) stem-cycle approaches that separate high-resolution dendrometer records into distinct phases of contraction, expansion and stem-radius increment. Especially the stem-cycle approach requires complex algorith...
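A drastically simplified version of the stem-cycle idea (not the DendrometeR algorithm itself) can be sketched by classifying phases from the sign of successive radius changes and flagging growth beyond the previous maximum as stem-radius increment. The series below is synthetic:

```python
import numpy as np

# Toy stem-radius series (micrometers): slow growth plus a diurnal cycle
t = np.arange(96)                                        # hourly steps, four days
radius = 0.5 * t / 24.0 - 3.0 * np.sin(2 * np.pi * t / 24.0)

# Phase classification from the sign of the first difference:
# falling radius -> contraction, rising radius -> expansion
diff = np.diff(radius)
phase = np.where(diff < 0, "contraction", "expansion")

# Stem-radius increment: expansion beyond the previous running maximum
prev_max = np.maximum.accumulate(np.concatenate(([radius[0]], radius[:-1])))
is_increment = radius > prev_max
```

The published stem-cycle approach additionally merges sub-daily noise into coherent phases and handles gaps, which is where the "complex algorithms" mentioned above come in; this sketch only shows the core partition.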

  7. Analyzing competitiveness of automotive industry through cumulative belief degrees

    OpenAIRE

    Kabak, Özgür; Ülengin, Füsun; Önsel, Şule; Özaydin, Özay; Aktaş, Emel

    2012-01-01

    Copyright @ 2012 The European Mathematical Society This study aims to analyze the automotive industry from competitiveness perspective using a novel cumulative belief degrees (CBD) approach. For this purpose, a mathematical model based on CBD is proposed to quantify the relations among the variables in a system. This model is used to analyze the Turkish Automotive Industry through scenario analysis. This research is supported by SEDEFED (Federation of Industrial Associations), REF (TÜSİ...

  8. APPROACHES TO ANALYZE THE QUALITY OF ROMANIAN TOURISM WEB SITES

    Directory of Open Access Journals (Sweden)

    Lacurezeanu Ramona

    2013-07-01

The purpose of our work is to analyze travel web-sites, more specifically, whether the criteria used to analyze virtual stores are also adequate for the Romanian tourism product. Following the study, we concluded that Romanian online tourism web-sites for the Romanian market have the features we found listed on similar web-sites in France, England, Germany, etc. In conclusion, online Romanian tourism can be considered one of the factors of economic growth.

  9. Real-Time, Polyphase-FFT, 640-MHz Spectrum Analyzer

    Science.gov (United States)

    Zimmerman, George A.; Garyantes, Michael F.; Grimm, Michael J.; Charny, Bentsian; Brown, Randy D.; Wilck, Helmut C.

    1994-01-01

Real-time polyphase fast-Fourier-transform (polyphase-FFT) spectrum analyzer designed to aid in detection of multigigahertz radio signals in two 320-MHz-wide polarization channels. Spectrum analyzer divides total spectrum of 640 MHz into 33,554,432 frequency channels of about 20 Hz each. Size and cost of polyphase-coefficient memory substantially reduced and much of processing loss of windowed FFTs eliminated.
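The channelization scheme named above (a polyphase filterbank followed by an FFT) can be sketched at toy scale. The sketch below uses 16 channels rather than 2^25 and a simple windowed-sinc prototype filter; all sizes and signals are illustrative, not parameters of the described instrument:

```python
import numpy as np

M = 16                # number of frequency channels (the real analyzer uses 2**25)
taps_per_branch = 8
L = M * taps_per_branch

# Prototype lowpass filter: windowed sinc with cutoff at half a channel width
n = np.arange(L) - (L - 1) / 2
h = np.sinc(n / M) * np.hanning(L)
h /= h.sum()

def polyphase_channelize(x):
    """Split x into M frequency channels via a polyphase filterbank + FFT."""
    out = []
    for start in range(0, len(x) - L + 1, M):   # hop of M samples: critically sampled
        seg = x[start:start + L]
        # Polyphase partition: weight by the filter, fold into M branches, sum taps
        branches = (seg * h).reshape(taps_per_branch, M).sum(axis=0)
        out.append(np.fft.fft(branches))
    return np.array(out)                        # shape: (blocks, M)

# A complex tone at 3/M cycles/sample should concentrate power in channel 3
tone = np.exp(2j * np.pi * (3 / M) * np.arange(4096))
spec = polyphase_channelize(tone)
power = np.abs(spec).mean(axis=0)
peak_channel = int(np.argmax(power))            # -> 3
```

The polyphase structure is what lets the windowed filtering and the FFT share work; the fold-and-sum step is why the coefficient memory stays small relative to a direct windowed FFT of the same resolution.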

  10. A New Theoretical Framework for Analyzing Stochastic Global Optimization Algorithms

    Institute of Scientific and Technical Information of China (English)

    1999-01-01

In this paper, we develop a new theoretical framework, based on absorbing Markov process theory, for analyzing some stochastic global optimization algorithms. Applying the framework to pure random search, we prove that pure random search converges to the global minimum in probability and that its convergence time has a geometric distribution. We also analyze pure adaptive search within this framework and show that it converges to the global minimum in probability, with a convergence time that has a Poisson distribution.
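Pure random search, the simpler of the two algorithms analyzed, is short enough to state in full: sample uniformly from the domain and keep the best point seen. A minimal sketch on a toy objective (the sphere function; the bounds and iteration count are arbitrary):

```python
import random

def pure_random_search(f, bounds, n_iters=20000, seed=1):
    """Pure random search: sample uniformly over the box, keep the best point."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(n_iters):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        fx = f(x)
        if fx < best_f:                 # an improvement event; waiting times
            best_x, best_f = x, fx      # between improvements are geometric
    return best_x, best_f

# Sphere function: global minimum 0 at the origin
sphere = lambda x: sum(v * v for v in x)
x_best, f_best = pure_random_search(sphere, [(-5.0, 5.0)] * 2)
```

Because each sample is independent, the number of draws until a point lands in any fixed target set is geometrically distributed, which is the intuition behind the convergence-time result quoted above.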

  11. Seasonal cycle of cloud cover analyzed using Meteosat images

    OpenAIRE

    Massons, J.; Domingo, D.; Lorente, J.

    1998-01-01

    A cloud-detection method was used to retrieve cloudy pixels from Meteosat images. High spatial resolution (one pixel), monthly averaged cloud-cover distribution was obtained for a 1-year period. The seasonal cycle of cloud amount was analyzed. Cloud parameters obtained include the total cloud amount and the percentage of occurrence of clouds at three altitudes. Hourly variations of cloud cover are also analyzed. Cloud properties determined are coherent with those obtained in previous studies....

  12. FAW Technology Strategies of Low-Carbon Passenger Car

    Institute of Scientific and Technical Information of China (English)

    Li Jun

    2012-01-01

The author analyzes the global background of low-carbon technology. A technology and economy analysis model called TOS was developed in the paper, and technology paths for low-carbon cars in China are analyzed based on the technologies currently available and those to be developed in China; three possible paths are presented based on the analysis. The author also explains the FAW BlueWay technology strategies for low-carbon cars for short-, mid- and long-term objectives, and concludes the paper with an illustration of the powertrain lineup for FAW BlueWay technologies.

  13. Magnetic systems for wide-aperture neutron polarizers and analyzers

    Science.gov (United States)

    Gilev, A. G.; Pleshanov, N. K.; Bazarov, B. A.; Bulkin, A. P.; Schebetov, A. F.; Syromyatnikov, V. G.; Tarnavich, V. V.; Ulyanov, V. A.

    2016-10-01

    Requirements on the field uniformity in neutron polarizers are analyzed in view of the fact that neutron polarizing coatings have been improved during the past decade. The design of magnetic systems that meet new requirements is optimized by numerical simulations. Magnetic systems for wide-aperture multichannel polarizers and analyzers are represented, including (a) the polarizer to be built at channel 4-4‧ of the reactor PIK (Gatchina, Russia) for high-flux experiments with a 100×150 mm2 beam of polarized cold neutrons; (b) the fan analyzer covering a 150×100 mm2 window of the detector at the Magnetism Reflectometer (SNS, ORNL, USA); (c) the polarizer and (d) the fan analyzer covering a 220×110 mm2 window of the detector at the reflectometer NERO, which is transferred to PNPI (Russia) from HZG (Germany). Deviations of the field from the vertical did not exceed 2°. The polarizing efficiency of the analyzer at the Magnetism Reflectometer reached 99%, a record level for wide-aperture supermirror analyzers.

  14. Vacuum Technology

    Energy Technology Data Exchange (ETDEWEB)

    Biltoft, P J

    2004-10-15

The environmental condition called vacuum is created any time the pressure of a gas is reduced compared to atmospheric pressure. On earth we typically create a vacuum by connecting a pump capable of moving gas to a relatively leak free vessel. Through operation of the gas pump the number of gas molecules per unit volume is decreased within the vessel. As soon as one creates a vacuum, natural forces (in this case entropy) work to restore equilibrium pressure; the practical effect of this is that gas molecules attempt to enter the evacuated space by any means possible. It is useful to think of vacuum in terms of a gas at a pressure below atmospheric pressure. In even the best vacuum vessels ever created there are approximately 3,500,000 molecules of gas per cubic meter of volume remaining inside the vessel. The lowest pressure environment known is in interstellar space where there are approximately four molecules of gas per cubic meter. Researchers are currently developing vacuum technology components (pumps, gauges, valves, etc.) using micro-electro-mechanical systems (MEMS) technology. Miniature vacuum components and systems will open the possibility for significant savings in energy cost and will open the doors to advances in electronics, manufacturing and semiconductor fabrication. In conclusion, an understanding of the basic principles of vacuum technology as presented in this summary is essential for the successful execution of all projects that involve vacuum technology. Using the principles described above, a practitioner of vacuum technology can design a vacuum system that will achieve the project requirements.
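The molecules-per-cubic-meter figures above follow directly from the ideal gas law, n = P / (k_B T). A quick numerical check, assuming room temperature (293 K):

```python
K_B = 1.380649e-23   # Boltzmann constant, J/K (exact, SI 2019)

def number_density(pressure_pa, temperature_k=293.0):
    """Molecules per cubic meter of an ideal gas: n = P / (k_B * T)."""
    return pressure_pa / (K_B * temperature_k)

# At atmospheric pressure (101325 Pa): roughly 2.5e25 molecules per cubic meter
n_atm = number_density(101325.0)

# Inverting: the ~3.5e6 molecules/m^3 quoted above corresponds to a pressure
# of roughly 1.4e-14 Pa at room temperature
p_xhv = 3.5e6 * K_B * 293.0
```

The ratio between the two densities, about nineteen orders of magnitude, is a useful way to picture how far extreme-high-vacuum systems are from ambient conditions.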

  15. Technology Programme

    Energy Technology Data Exchange (ETDEWEB)

    Batistoni, Paola; De Marco, Francesco; Pieroni, Leonardo (ed.)

    2005-07-01

The technology activities carried out by the Euratom-ENEA Association in the framework of the European Fusion Development Agreement concern the Next Step (International Thermonuclear Experimental Reactor - ITER), the Long-Term Programme (breeder blanket, materials, International Fusion Materials Irradiation Facility - IFMIF), Power Plant Conceptual Studies and Socio-Economic Studies. The Underlying Technology Programme was set up to complement the fusion activities as well as to develop technologies with a wider range of interest. The Technology Programme mainly involves staff from the Frascati laboratories of the Fusion Technical and Scientific Unit and from the Brasimone laboratories of the Advanced Physics Technologies Unit. Other ENEA units also provide valuable contributions to the programme. ENEA is heavily engaged in component development/testing and in design and safety activities for the European Fusion Technology Programme. Although the work documented in the following covers a large range of topics that differ considerably because they concern the development of extremely complex systems, the high level of integration and coordination ensures the capability to cover the fusion system as a whole. In 2004 the most significant testing activities concerned the ITER primary beryllium-coated first wall. In the field of high-heat-flux components, an important achievement was the qualification of the process for depositing a copper liner on carbon fibre composite (CFC) hollow tiles. This new process, pre-brazed casting (PBC), allows the hot radial pressing (HRP) joining procedure to be used also for CFC-based armour monoblock divertor components. The PBC and HRP processes are candidates for the construction of the ITER divertor. In the materials field an important milestone was the commissioning of a new facility for chemical vapour infiltration/deposition, used for optimising silicon carbide composite (SiCf/SiC) components. Eight patents were deposited during 2004.

  16. Study of Exotic Weakly Bound Nuclei Using Magnetic Analyzer Mavr

    Science.gov (United States)

    Maslov, V. A.; Kazacha, V. I.; Kolesov, I. V.; Lukyanov, S. M.; Melnikov, V. N.; Osipov, N. F.; Penionzhkevich, Yu. E.; Skobelev, N. K.; Sobolev, Yu. G.; Voskoboinik, E. I.

    2016-06-01

    A project of the high-resolution magnetic analyzer MAVR is proposed. The analyzer will comprise new magnetic optical and detecting systems for separation and identification of reaction products in a wide range of masses (5-150) and charges (1-60). The magnetic optical system consists of the MSP-144 magnet and a doublet of quadrupole lenses. This will allow the solid angle of the spectrometer to be increased by an order of magnitude up to 30 msr. The magnetic analyzer will have a high momentum resolution (10-4) and high focal-plane dispersion (1.9 m). It will allow products of nuclear reactions at energies up to 30 MeV/nucleon to be detected with the charge resolution ∼1/60. Implementation of the project is divided into two stages: conversion of the magnetic analyzer proper and construction of the nuclear reaction products identification system. The MULTI detecting system is being developed for the MAVR magnetic analyzer to allow detection of nuclear reaction products and their identification by charge Q, atomic number Z, and mass A with a high absolute accuracy. The identification will be performed by measuring the energy loss (ΔE), time of flight (TOF), and total kinetic energy (TKE) of reaction products. The particle trajectories in the analyzer will also be determined using the drift chamber developed jointly with GANIL. The MAVR analyzer will operate in both primary beams of heavy ions and beams of radioactive nuclei produced by the U400 - U400M acceleration complex. It will also be used for measuring energy spectra of nuclear reaction products and as an energy monochromator.
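The ΔE-TOF-TKE identification described above rests, to first order, on TKE = ½mv² with v = L/TOF, so the mass follows from the measured energy and flight time. A nonrelativistic sketch with an illustrative 2 m flight path; these numbers are invented for the example and are not MAVR parameters:

```python
import math

AMU_MEV = 931.494     # atomic mass unit in MeV/c^2
C_MM_NS = 299.792458  # speed of light in mm/ns

def mass_amu(tke_mev, tof_ns, path_mm):
    """Mass (u) from total kinetic energy and time of flight, nonrelativistic:
    TKE = 1/2 m v^2 with v = L / TOF  =>  m = 2 * TKE / (beta^2 * c^2)."""
    beta = (path_mm / tof_ns) / C_MM_NS
    return 2.0 * tke_mev / (beta ** 2 * AMU_MEV)

# Worked example: a 12C ion at 10 MeV/nucleon (TKE = 120 MeV) over a 2 m path
beta = math.sqrt(2 * 120.0 / (12 * AMU_MEV))   # ~0.147, nonrelativistic regime
tof = 2000.0 / (beta * C_MM_NS)                # ~45.5 ns
m = mass_amu(120.0, tof, 2000.0)               # recovers ~12 u
```

ΔE then separates atomic number Z at fixed mass (via the Bethe dependence of energy loss on Z²/v²), which is why the three measurements together fix A, Z, and Q.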

  17. BK/TD models for analyzing in vitro impedance data on cytotoxicity.

    Science.gov (United States)

    Teng, S; Barcellini-Couget, S; Beaudouin, R; Brochot, C; Desousa, G; Rahmani, R; Pery, A R R

    2015-06-01

    The ban of animal testing has enhanced the development of new in vitro technologies for cosmetics safety assessment. Impedance metrics is one such technology which enables monitoring of cell viability in real time. However, analyzing real time data requires moving from static to dynamic toxicity assessment. In the present study, we built mechanistic biokinetic/toxicodynamic (BK/TD) models to analyze the time course of cell viability in cytotoxicity assay using impedance. These models account for the fate of the tested compounds during the assay. BK/TD models were applied to analyze HepaRG cell viability, after single (48 h) and repeated (4 weeks) exposures to three hepatotoxic compounds (coumarin, isoeugenol and benzophenone-2). The BK/TD models properly fit the data used for their calibration that was obtained for single or repeated exposure. Only for one out of the three compounds, the models calibrated with a single exposure were able to predict repeated exposure data. We therefore recommend the use of long-term exposure in vitro data in order to adequately account for chronic hepatotoxic effects. The models we propose here are capable of being coupled with human biokinetic models in order to relate dose exposure and human hepatotoxicity. PMID:25827406
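A heavily simplified toxicodynamic sketch in the spirit of (but not identical to) the BK/TD models described: viability declines at a rate proportional to the concentration of a compound that is itself cleared exponentially during the assay. All rate constants below are invented for illustration:

```python
import numpy as np

# Illustrative single-exposure model:
#   dC/dt = -k_e * C          (compound cleared/bound during the assay)
#   dV/dt = -k_tox * C * V    (viability loss proportional to exposure)
k_e, k_tox = 0.05, 0.002      # hypothetical rate constants (1/h, 1/(h*uM))
c0, v0 = 100.0, 1.0           # initial concentration (uM) and relative viability

dt = 0.1
t = np.arange(0.0, 48.0 + dt, dt)   # a 48 h single-exposure assay
C = c0 * np.exp(-k_e * t)           # analytic solution for the concentration

# Forward-Euler integration of the viability equation
V = np.empty_like(t)
V[0] = v0
for i in range(1, len(t)):
    V[i] = V[i - 1] * (1.0 - k_tox * C[i - 1] * dt)
```

Fitting the rate constants to the real-time impedance curve, rather than to a single endpoint, is what turns the assay into a dynamic rather than static toxicity readout.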

  18. Has Information Technology Competence Ever Increased? Evidences from the Annual User Satisfaction Survey of Information Technology Services

    OpenAIRE

    Hun Myoung Park

    2015-01-01

This study challenges a common myth about information technology competence and examines whether information technology competence has increased as the technology develops. An analytic framework considers whether an information technology service is technologically sophisticated and whether it is developed for general-purpose use by all users. The annual user satisfaction survey data from 1998 through 2014 are analyzed to present historical patterns of use of information technology services, satisfaction, and ...

  19. Has Information Technology Competence Ever Increased? Evidences from the Annual User Satisfaction Survey of Information Technology Services

    OpenAIRE

    Park, Hun Myoung

    2015-01-01

This study challenges a common myth about information technology competence and examines whether information technology competence has increased as the technology develops. An analytic framework considers whether an information technology service is technologically sophisticated and whether it is developed for general-purpose use by all users. The annual user satisfaction survey data from 1998 through 2014 are analyzed to present historical patterns of use of information technology services, satisfaction, and u...

  20. Technology: Technology and Common Sense

    Science.gov (United States)

    Van Horn, Royal

    2004-01-01

The absence of common sense in the world of technology continues to amaze the author. Things that seem so logical just aren't for many people. The installation of Voice-over IP (VoIP, with IP standing for Internet Protocol) in many school districts is a good example. Schools have always had trouble with telephones. Many districts don't even…