WorldWideScience

Sample records for biosystem analyzing technology

  1. Achievement report for fiscal 1997 on the development of technologies for utilizing biological resources such as complex biosystems. Development of complex biosystem analyzing technology; 1997 nendo fukugo seibutsukei nado seibutsu shigen riyo gijutsu kaihatsu seika hokokusho. Fukugo seibutsukei kaiseki gijutsu no kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-03-01

    The aim is to utilize the sophisticated functions of complex biosystems. In the research and development of technologies for effectively utilizing unexploited resources and substances such as seaweeds and algae, seaweeds are added to seawater and left for two weeks to yield a microbial suspension; the suspension is then spread on a carrageenan culture medium, and carrageenan-decomposing microbes are obtained. In the research and development of technologies for utilizing microbe/fauna-flora complex systems, technologies for exploring and analyzing microbes are studied. For this purpose, 48 kinds of sponges and 300 kinds of bacteria symbiotic with the sponges are sampled in Malaysia; of these, 15 exhibit enzyme inhibition and Artemia salina lethality activities. In the development of technologies for analyzing the functions of microbes engaged in the production of useful resources and substances for animals and plants, 150 kinds of micro-algae are subjected to screening using protease and chitinase inhibitory activities as the indexes, and an extract of Isochrysis galbana is found to display an intense inhibitory activity. The alga is cultured in quantity, the active component is isolated from 20 g of dried alga, and its constitution is determined. (NEDO)

  2. Biosystems engineering

    OpenAIRE

    P. Ulger; E. Gonulol

    2015-01-01

    Higher agricultural education has recently become multidisciplinary as a result of the level of technology. Biosystems Engineering has become popular in developed countries, particularly after electronic and information technologies became part of agriculture in combination with biology. In this study, the definition of the Biosystems Engineering discipline, its working areas, research, publication and job opportunities are discussed in detail. Academic organization of Biosyste...

  3. Application of structural health monitoring technologies to bio-systems: current status and path forward

    Science.gov (United States)

    Bhalla, Suresh; Srivastava, Shashank; Suresh, Rupali; Moharana, Sumedha; Kaur, Naveet; Gupta, Ashok

    2015-03-01

    This paper presents a case for extending structural health monitoring (SHM) technologies to offer solutions for biomedical problems. SHM research has made remarkable progress during the last two to three decades, and these technologies are now being extended for possible applications in the bio-medical field. In particular, smart materials, such as piezoelectric ceramic (PZT) patches and fibre-Bragg grating (FBG) sensors, offer a new set of possibilities to the bio-medical community to augment its conventional set of sensors, tools and equipment. The paper presents some of the recent extensions of SHM, such as condition monitoring of bones, monitoring of dental implants post-surgery and foot pressure measurement. The latest developments, such as non-bonded configurations of PZT patches for monitoring bones and possible applications in osteoporosis detection, are also discussed. In essence, there is a whole new gamut of possibilities for SHM technologies making their foray into the bio-medical sector.

  4. BioSystems

    Data.gov (United States)

    U.S. Department of Health & Human Services — The NCBI BioSystems Database provides integrated access to biological systems and their component genes, proteins, and small molecules, as well as literature...

  5. ANTICIPATING AND REGULATING BIOSYSTEM

    Directory of Open Access Journals (Sweden)

    Ion Iorga Siman

    2010-06-01

    Full Text Available Regulating biosystems closely related to human beings are structures that are still difficult to understand. Numerous intimate processes taking place in these systems, and even their actual constitution, are insufficiently decoded; the fact that they populated the world long before man invented the first regulator appears not to have contributed much to their understanding. This work is intended to highlight what regulating biosystems are. It is no secret that somatic muscles perform control operations without which no act of movement would be possible. All actions are the result of dynamic controlled processes governed by strict control laws. Treating them very seriously may lead to knowledge of the processes occurring in complex systems

  6. The Threads of Biosystems Engineering

    OpenAIRE

    Briassoulis, Demetres; Gallego Vázquez, Eutiquio; Pantaleo, Antonio Marco; Holden, Nicholas; Owende, Philip; Ting, K.C.; Mallikarjunan, Kumar

    2012-01-01

    The core concepts, or threads, of Biosystems Engineering (BSEN) are variously understood by those within the discipline, but have never been unequivocally defined due to its early stage of development. This makes communication and teaching difficult compared to other well-established engineering subjects. Biosystems Engineering is a field of Engineering which integrates engineering science and design with applied biological, environmental and agricultural sciences....

  7. Industrial Biosystems Engineering and Biorefinery Systems

    Institute of Scientific and Technical Information of China (English)

    Shulin Chen

    2008-01-01

    The concept of Industrial Biosystems Engineering (IBsE) was suggested as a new engineering branch to be developed to meet the needs for science, technology and professionals of the upcoming bioeconomy. With its emphasis on systems, IBsE builds upon the interfaces between systems biology, bioprocessing, and systems engineering. This paper discusses the background, the suggested definition, the theoretical framework and methodologies of this new discipline, as well as its challenges and future development

  8. Methods and techniques for bio-system's materials behaviour analysis

    OpenAIRE

    MITU, LEONARD GABRIEL

    2014-01-01

    In the context of the rapid development of research on materials for biosystem structures, a representative direction is the analysis of the behavior of these materials. This direction of research requires various means and methods for theoretical and experimental measurement and evaluation. The PhD thesis "Methods and means for analyzing the behavior of biosystems structure materials" is positioned within this area of research, aimed at studying the behavior of poly...

  9. Controlled structure and properties of silicate nanoparticle networks for incorporation of biosystem components

    International Nuclear Information System (INIS)

    Inorganic nanoparticles are of technological interest in many fields. We created silicate nanoparticle hydrogels that effectively incorporated biomolecules that are unstable and involved in complicated reactions. The size of the silicate nanoparticles strongly affected both the physical characteristics of the resulting hydrogel and the activity of biomolecules incorporated within the hydrogel. We used high-resolution transmission electron microscopy (TEM) to analyze in detail the hydrogel network patterns formed by the silicate nanoparticles. We obtained clear nanostructured images of biomolecule-nanoparticle composite hydrogels. The TEM images also showed that larger silicate nanoparticles (22 nm) formed more loosely associated silicate networks than did smaller silicate nanoparticles (7 nm). The loosely associated networks formed from larger silicate nanoparticles might facilitate substrate diffusion through the network, thus promoting the observed increased activity of the entrapped biomolecules. This doubled the activity of the incorporated biosystems compared with that of biosystems prepared by our own previously reported method. We propose a reaction scheme to explain the formation of the silicate nanoparticle networks. The successful incorporation of biomolecules into the nanoparticle hydrogels, along with the high level of activity exhibited by the biomolecules required for complicated reactions within the gels, demonstrates the nanocomposites' potential for use in medical applications.

  10. Registering plant dysfunction in artificial biosystems through fluorescence imaging technique

    Science.gov (United States)

    Nikolova, Alexandra; Krumov, Alexandar; Vassilev, Vesselin

    Humanity's ambitions in space exploration and long-term manned space missions evoke increasing interest in artificial ecosystem research. Advanced studies of plant biosystems drive the development of innovative technologies for plant cultivation in man-made environments. Closed ecosystems of different types and structures are now used for space horticulture, cultivation of genetically modified species, bio-products for pharmacy and industry, etc. New technologies are required to monitor and control the basic parameters of future bioregenerative life support systems, especially the photosynthetic activity of plants as the most fundamental biological process. The authors propose a concept for non-invasive control of plant physiological status in a closed biosystem through spatial registration of chlorophyll fluorescence. This approach allows early detection of stress impact on plants and reveals the dynamics and direction of the negative influence and the level of plant stress. Technical requirements for obtaining plant fluorescence images are examined in close relation to plant illumination conditions, and problems related to optimised plant illumination are discussed. Examples of fluorescence images of healthy and stressed plants demonstrate the sensitivity and rapidity of the signal changes caused by plant dysfunction. The proposed concept could be used to develop new technical solutions in auto-controlled bio-support systems based on real-time analysis of fluorescence images.

  11. Biosystems and Food Engineering Research Review 21

    OpenAIRE

    Cummins, Enda; Curran, Thomas P.

    2016-01-01

    The Twenty-First Annual Research Review describes the ongoing research programme in the School of Biosystems and Food Engineering at University College Dublin from 83 researchers (11 academic staff, 1 technician, 4 postdoctoral researchers and 67 postgraduates). The research programme covers three focal areas: Food and Process Engineering; Bioresource Systems; and Bioenvironmental Engineering. Each area is divided into sub-areas as outlined in the Table of Contents, which also includes th...

  12. Radiofrequency mass analyzer. Theory, design, technology and utilization

    International Nuclear Information System (INIS)

    In this work the results of the research and design of the sensor-analyzer part of a mass spectrometer are presented. It is intended for use as a portable gas analyzer for the control and supervision of technological processes, environmental monitoring, etc. Starting from data and physical principles, a generalized mathematical model of the radiofrequency mass analyzer, corresponding to the shape and amplitude of the high-frequency modulation potential, is analyzed. On this basis, the peculiarities of the non-linear regime of RF mass analyzers are examined, together with optimal and simple calculation schemes. A number of technological, fabrication and assembly issues of RF analysers under production conditions are analyzed. Among the wide range of uses of these devices, vacuum technique, thin-layer technology, studies of the atomic and ionic composition of the atmospheres of Earth and other planets, and the detection, identification and analysis of exhaled gases are mentioned

  13. Achievement report for fiscal 1997 on the development of technologies for utilizing biological resources such as complex biosystems. Development of technologies for producing substitute fuel for petroleum by utilizing organisms; 1997 nendo fukugo seibutsukei nado seibutsu shigen riyo gijutsu kaihatsu seika hokokusho. Seibutsu riyo sekiyu daitai nenryo seizo gijutsu no kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-03-01

    Technologies for producing useful substances using the substance-decomposing/producing functions of complex biosystems, and methods for their handling, are developed. In the utilization of microbes in the digestive tracts of termites and longicorns, it is made clear that several kinds of termites cleave the {beta}-O-4 ether linkage. In relation to technologies for constructing wood-decomposing complex microbial systems and developing complex vector systems, a screening system is constructed in which strains that exhibit complex actions are combined. Concerning the advanced utilization of tropical oil plants, conditions are determined for inducing callus from oil palm tissues. From oil palm sarcocarp tissues, mRNA (messenger ribonucleic acid) is isolated for the construction of a cDNA (complementary deoxyribonucleic acid) library. For the purpose of isolating a powerful promoter, a partial base sequence is determined for ubiquitin, which is frequently expressed in cells. A pathogenic organism afflicting the oil palm is sampled for identification, and it is inferred to be a kind of Ganoderma boninense. (NEDO)

  14. Analyzing the next-generation catalog a library technology report

    CERN Document Server

    Nagy, Andrew

    2011-01-01

    This issue of "Library Technology Reports" analyzes five different academic libraries to better understand their investments, detailing the outcomes thus far and drawing conclusions about the next-generation catalog.

  15. International Student Collaboration in Biosystems Engineering using Video Podcasts in Design Classes

    OpenAIRE

    Curran, Thomas P.; Gates, Richard S.; Gentile, Francesco; et al

    2014-01-01

    A working group within the Trans-Atlantic Biosystems Engineering Network (TABE.NET) analyzed the idea of developing an international collaborative design project for undergraduate students in the participating institutions. The aims of this action were to broaden the students' outlook for their remaining years within the university and to give them a more international resume before they finish. Further outcomes desired by the team were acquisition of a second or third language and ...

  16. The biosystems engineering design challenge at University College Dublin

    OpenAIRE

    Curran, Thomas P.; Cummins, Enda; Holden, Nicholas M.; McDonnell, Kevin; Blaney, Colleen

    2007-01-01

    The Biosystems Engineering Design Challenge has recently become an academic module open to all undergraduate students at University College Dublin. The focus of the module is on designing and building a working, bench-scale device that solves a practical problem relevant to Biosystems Engineering. The module provides an opportunity for students to learn about engineering design, project management and teamwork. Enrolled students are split into teams of up to seven and meet an assi...

  17. Abstracts of the 17. world congress of the International Commission of Agriculture and Biosystems Engineering (CIGR) : sustainable biosystems through engineering

    Energy Technology Data Exchange (ETDEWEB)

    Savoie, P.; Villeneuve, J.; Morisette, R. [Agriculture and Agri-Food Canada, Quebec City, PQ (Canada). Soils and Crops Research and Development Centre] (eds.)

    2010-07-01

    This international conference provided a forum to discuss methods to produce agricultural products more efficiently through improvements in engineering and technology. It was attended by engineers and scientists working from different perspectives on biosystems. Beyond food, farms and forests can provide fibre, bio-products and renewable energy. Seven sections of CIGR were organized in the following technical sessions: (1) land and water engineering, (2) farm buildings, equipment, structures and environment, (3) equipment engineering for plants, (4) energy in agriculture, (5) management, ergonomics and systems engineering, (6) post harvest technology and process engineering, and (7) information systems. The Canadian Society of Bioengineering (CSBE) merged its technical program within the 7 sections of CIGR. Four other groups also held their activities during the conference. The American Society of Agricultural and Biological Engineers (ASABE) organized its 9th international drainage symposium and the American Ecological Engineering Society (AEES) held its 10th annual meeting. The International Network for Information Technology in Agriculture (INFITA), and the 8th world congress on computers in agriculture also joined CIGR 2010.

  18. Study on Algae Removal by Immobilized Biosystem on Sponge

    Institute of Scientific and Technical Information of China (English)

    PEI Haiyan; HU Wenrong

    2006-01-01

    In this study, sponges were used to immobilize domesticated sludge microbes in a limited space, forming an immobilized biosystem capable of removing algae and microcystins. The removal effects of this biosystem on algae, microcystins and UV260, and the mechanism of algae removal, were studied. The results showed that active sludge from sewage treatment plants was able to remove algae from a eutrophic lake's water after 7 d of domestication. The removal efficiency for algae, organic matter and microcystins increased when the domesticated sludge was immobilized on sponges. When the hydraulic retention time (HRT) was 5 h, the removal rates of algae, microcystins and UV260 were 90%, 94.17% and 84%, respectively. The immobilized biosystem consisted mostly of bacteria, the Ciliata and Sarcodina protozoans and the Rotifer metazoans. Algal decomposition by zoogloea bacteria and predation by micro-organisms were the two main modes of algal removal, which occurred in two steps: first, absorption by the zoogloea; second, decomposition by the zoogloea bacteria and predation by the micro-organisms.

  19. Nontrivial quantum and quantum-like effects in biosystems: Unsolved questions and paradoxes.

    Science.gov (United States)

    Melkikh, Alexey V; Khrennikov, Andrei

    2015-11-01

    Non-trivial quantum effects in biological systems are analyzed. Some unresolved issues and paradoxes related to quantum effects (Levinthal's paradox, the paradox of speed, and mechanisms of evolution) are addressed. It is concluded that the existence of non-trivial quantum effects is necessary for the functioning of living systems. In particular, it is demonstrated that classical mechanics cannot explain the stable operation of the cell and supra-cellular structures. The need for quantum effects is also generated by combinatorial problems of evolution: their solution requires a priori information about the states of the evolving system, but within the framework of classical theory it is not possible to explain the mechanisms of its storage consistently. We also present the essentials of the so-called quantum-like paradigm: sufficiently complex bio-systems process information by violating the laws of classical probability and information theory. Therefore the mathematical apparatus of quantum theory may have fruitful applications in describing the behavior of bio-systems: from cells to brains, ecosystems and social systems. In quantum-like information biology it is not presumed that quantum information bio-processing results from quantum physical processes in living organisms. Special experiments to test the role of quantum mechanics in living systems are suggested. This requires a detailed study of living systems at the level of individual atoms and molecules; such monitoring of living systems in vivo can allow the identification of the real potentials of interaction between biologically important molecules. PMID:26160644

  20. Financial options methodology for analyzing investments in new technology

    International Nuclear Information System (INIS)

    The evaluation of investments in longer-term research and development in emerging technologies must, because of the nature of such subjects, address inherent uncertainties. Most notably, future cash flow forecasts include substantial uncertainties. Conventional present value methodology, when applied to emerging technologies, severely penalizes cash flow forecasts, and strategic investment opportunities are at risk of being neglected. Use of options valuation methodology adapted from the financial arena has been introduced as having applicability in such technology evaluations. Indeed, characteristics of superconducting magnetic energy storage technology suggest that it is a candidate for the use of options methodology when investment decisions are being contemplated
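    The abstract does not give the valuation details, but the real-options approach it describes is commonly illustrated with the Black-Scholes formula, treating the R&D investment as a call option on the technology's future cash flows. A minimal sketch with hypothetical parameter values (the numbers below are assumptions for illustration, not from the source):

```python
from math import log, sqrt, exp, erf

def norm_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S: float, K: float, r: float, sigma: float, T: float) -> float:
    """Black-Scholes value of a European call: here, the 'option' to deploy a
    technology (present value of cash flows S, deployment cost K, cash-flow
    volatility sigma, risk-free rate r, time to the decision T in years)."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# Hypothetical project: PV of cash flows 100, cost 100, 20% volatility,
# 5% risk-free rate, decision in one year.
value = bs_call(S=100.0, K=100.0, r=0.05, sigma=0.2, T=1.0)
```

    A conventional NPV view of this project (S = K) would call it marginal, while the option value is positive because uncertainty has upside; this is the sense in which present-value methodology "penalizes" uncertain cash flows.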

  1. Financial options methodology for analyzing investments in new technology

    Science.gov (United States)

    Wenning, B. D.

    1995-01-01

    The evaluation of investments in longer-term research and development in emerging technologies must, because of the nature of such subjects, address inherent uncertainties. Most notably, future cash flow forecasts include substantial uncertainties. Conventional present value methodology, when applied to emerging technologies, severely penalizes cash flow forecasts, and strategic investment opportunities are at risk of being neglected. Use of options valuation methodology adapted from the financial arena has been introduced as having applicability in such technology evaluations. Indeed, characteristics of superconducting magnetic energy storage technology suggest that it is a candidate for the use of options methodology when investment decisions are being contemplated.

  2. Financial options methodology for analyzing investments in new technology

    Energy Technology Data Exchange (ETDEWEB)

    Wenning, B.D. [Texas Utilities Services, Inc., Dallas, TX (United States)

    1994-12-31

    The evaluation of investments in longer-term research and development in emerging technologies must, because of the nature of such subjects, address inherent uncertainties. Most notably, future cash flow forecasts include substantial uncertainties. Conventional present value methodology, when applied to emerging technologies, severely penalizes cash flow forecasts, and strategic investment opportunities are at risk of being neglected. Use of options valuation methodology adapted from the financial arena has been introduced as having applicability in such technology evaluations. Indeed, characteristics of superconducting magnetic energy storage technology suggest that it is a candidate for the use of options methodology when investment decisions are being contemplated.

  3. Nano-Biotechnology: Structure and Dynamics of Nanoscale Biosystems

    CERN Document Server

    Manjasetty, Babu A; Ramaswamy, Y S

    2010-01-01

    Nanoscale biosystems are widely used in numerous medical applications. Approaches to the structure and function of the nanomachines available in the cell (natural nanomachines) are discussed. Molecular simulation studies have been used extensively to study the dynamics of many nanomachines, including the ribosome. Carbon nanotubes (CNTs) serve as prototypes for biological channels such as aquaporins (AQPs). Recently, extensive investigations have been performed on the transport of biological nanosystems through CNTs. The results are used as a guide in building nanomachinery such as a nanosyringe for needle-free drug delivery.

  4. Polarized 3He Gas Circulating Technologies for Neutron Analyzers

    Energy Technology Data Exchange (ETDEWEB)

    Watt, David; Hersman, Bill

    2014-12-10

    We describe the development of an integrated system for quasi-continuous operation of a large volume neutron analyzer. The system consists of a non-magnetic diaphragm compressor, a prototype large volume helium polarizer, a surrogate neutron analyzer, a non-depolarizing gas storage reservoir, a non-ferrous valve manifold for handling gas distribution, a custom rubidium-vapor gas return purifier, and wire-wound transfer lines, all of which are immersed in a two-meter external magnetic field. Over the Phase II period we focused on three major tasks required for the successful deployment of these types of systems: 1) design and implementation of gas handling hardware, 2) automation for long-term operation, and 3) improvements in polarizer performance, specifically fabrication of aluminosilicate optical pumping cells. In this report we describe the design, implementation, and testing of the gas handling hardware. We describe improved polarizer performance resulting from improved cell materials and fabrication methods. These improvements yielded valved 8.5 liter cells with relaxation times greater than 12 hours. Pumping this cell with 1500 W laser power with 1.25 nm linewidth yielded peak polarizations of 60%, measured both inside and outside the polarizer. Fully narrowing this laser to 0.25 nm, demonstrated separately on one stack of the four, would have allowed 70% polarization with this cell. We demonstrated the removal of 5 liters of polarized helium from the polarizer with no measured loss of polarization. We circulated the gas through a titanium-clad compressor with polarization loss below 3% per pass. We also prepared for the next phase of development by refining the design of the polarizer so that it can be engineer-certified for pressurized operation. The performance of our system far exceeds comparable efforts elsewhere.

  5. Enhancing the first year learning experience for Biosystems Engineering students at University College Dublin

    OpenAIRE

    Curran, Thomas P.; Doyle, Colleen; Cummins, Enda; McDonnell, Kevin; Holden, Nicholas M.

    2010-01-01

    This paper outlines the development of a problem-based learning module called the Biosystems Engineering Design Challenge. The focus of the module is on designing and building a working, bench-scale device that solves a practical problem relevant to Biosystems Engineering. It provides an early opportunity for students to learn about engineering design, project management and teamwork. The module aligns well with the academic policy of University College Dublin to introduce alte...

  6. Fiscal 1998 industrial science and technology R and D project. Research report on R and D of genome informatics technology (Development of stable oil supply measures using complex biosystem); 1998 nendo genome informatics gijutsu kenkyu kaihatsu seika hokokusho. Fukugo seibutsukei riyo sekiyu antei kyokyu taisaku kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    This report describes the fiscal 1998 results of the development of genome informatics technology. As a comparative analysis technique for genes, the combination of electrophoresis and PCR was used. To improve the throughput and reproducibility of the technique, module-shuffling primers were used, and a multi(96)-arrayed capillary fragment analyzer was devised. A system for detecting SNPs rapidly was also developed successfully. As analysis technology for DNA sequences using triple-stranded DNA formation, studies were made on the construction of long cDNA libraries, the selective subtraction of specific sequences from libraries, and the basic technology of homologous cloning. Study was also made of each reaction step of the IGCR technique for fast analysis, and of the specifications of a fluorescence transfer monitor. As a modeling technique for genetic sequence information, a simulation model was developed for gene expression regulatory networks during muscle differentiation and for the feedback regulation of period genes. Support systems such as transcription factor prediction and gene regulatory network inference were developed from existing data. (NEDO)

  7. [Vitauct of superorganism biosystems (by the example of communities of social insects and biocenoses)].

    Science.gov (United States)

    Makrushin, A V

    2010-01-01

    Unidirectional irreversible changes occur continuously in complex biosystems consisting of large numbers of different elements. At first they increase the stability of the systems, but then they decrease it, raising the probability of the systems' disappearance from the Earth. Using the example of communities of social insects and biocenoses, reactions at the superorganismic level that prolong their life are considered. PMID:20586249

  8. Analyzing of MOS and Codec Selection for Voice over IP Technology

    OpenAIRE

    Mohd Nazri Ismail

    2009-01-01

    In this research, we propose an architectural solution for implementing the voice over IP (VoIP) service in a campus environment network. Voice over IP (VoIP) technology has become a topic of discussion at this time. Today, the deployment of this technology in an organization can truly give a great financial benefit over traditional telephony. Therefore, this study analyzes VoIP codec selection and investigates the Mean Opinion Score (MOS) performance areas involved with the quality of s...
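    The truncated abstract does not say how MOS is obtained; one common non-subjective route (an assumption here, not necessarily the authors' method) is the ITU-T G.107 E-model, which maps a transmission rating factor R, built from codec, delay and loss impairments, to an estimated MOS:

```python
def r_to_mos(r: float) -> float:
    """Map an E-model R-factor (ITU-T G.107) to an estimated MOS.
    R <= 0 maps to 1.0 (worst); R >= 100 maps to 4.5 (best)."""
    if r <= 0:
        return 1.0
    if r >= 100:
        return 4.5
    return 1.0 + 0.035 * r + r * (r - 60.0) * (100.0 - r) * 7e-6

# Default narrowband conditions give R close to 93.2, i.e. MOS around 4.4
# ("toll quality"); low-bitrate codecs such as G.729 start from a lower
# R budget because of their higher equipment impairment factor.
print(round(r_to_mos(93.2), 2))   # → 4.41
```

    Codec selection then becomes a trade-off: each candidate codec's impairment lowers R (and hence MOS), while reducing the bandwidth consumed per call.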

  9. THE DELPHI METHOD AS A TOOL FOR ANALYZING TECHNOLOGY EVOLUTION: CASE OPEN SOURCE THIN COMPUTING

    OpenAIRE

    MATTI KARVONEN; VILLE RYYNÄNEN; TUOMO KÄSSI

    2009-01-01

    The main goal of this paper is to show how the Delphi method works as a management tool when analyzing technology evolution. The paper also provides insights on how thin computing and open source can affect the future IT infrastructure development. The primary data was collected in a three round Delphi study consisting of the following interest groups: (1) Developers of open source thin computing, (2) Industrial experts, (3) Representatives of academic institutes. The Delphi method represents...

  10. Present and future of the numerical methods in buildings and infrastructures areas of biosystems engineering

    Directory of Open Access Journals (Sweden)

    Francisco Ayuga

    2015-04-01

    Full Text Available Biosystem engineering is a discipline resulting from the evolution of traditional agricultural engineering to include new engineering challenges related to biological systems, from the cell to the environment. Modern buildings and infrastructures are needed to satisfy crop and animal production demands. This paper reviews the status of numerical methods applied to solve engineering problems in the field of buildings and infrastructures in biosystem engineering. The history and basic background of the finite element method, the first numerical method implemented and also the most developed one, are presented, followed by the history and background of two more recent methods with practical applications: computational fluid dynamics and the discrete element method. A review of the scientific and professional applications in the field of buildings and infrastructures for biosystem engineering is also presented. Today we can simulate engineering problems with solids, with fluids and with particles, and arrive at practical solutions faster and more cheaply than in the past. The paper encourages young engineers and researchers to advance these tools and their engineering applications: the capacities of all numerical methods in their present state of development go beyond current practical applications, and there is a broad field to work on.
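    As a minimal illustration of the finite element method the review discusses (a textbook 1D sketch, not tied to any application in the paper), linear elements on a uniform mesh reduce the Poisson problem -u'' = 1 with u(0) = u(1) = 0 to a tridiagonal linear system:

```python
def fem_1d_poisson(n: int) -> list[float]:
    """Solve -u'' = 1 on [0, 1] with u(0) = u(1) = 0 using n linear
    finite elements; returns the n-1 interior nodal values."""
    h = 1.0 / n
    m = n - 1                      # number of interior nodes
    # Assembled stiffness matrix is tridiagonal: 2/h on the diagonal,
    # -1/h off it; the consistent load vector for f = 1 is h per node.
    diag = [2.0 / h] * m
    off = -1.0 / h
    rhs = [h] * m
    # Thomas algorithm (tridiagonal Gaussian elimination): forward sweep...
    for i in range(1, m):
        w = off / diag[i - 1]
        diag[i] -= w * off
        rhs[i] -= w * rhs[i - 1]
    # ...then back substitution.
    u = [0.0] * m
    u[-1] = rhs[-1] / diag[-1]
    for i in range(m - 2, -1, -1):
        u[i] = (rhs[i] - off * u[i + 1]) / diag[i]
    return u

# The exact solution is u(x) = x(1 - x)/2, so u(0.5) = 0.125; for this 1D
# problem, linear elements reproduce the exact solution at the nodes.
u = fem_1d_poisson(10)
print(abs(u[4] - 0.125) < 1e-12)   # node at x = 0.5
```

    The same assemble-and-solve pattern, with much larger sparse systems, underlies the 2D and 3D structural and CFD simulations the review surveys.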

  11. Modeling Dendrimers Charge Interaction in Solution: Relevance in Biosystems

    Directory of Open Access Journals (Sweden)

    Domenico Lombardo

    2014-01-01

    Full Text Available Dendrimers are highly branched macromolecules obtained by stepwise controlled reaction sequences. The ability to be designed for specific applications makes dendrimers unprecedented components to control the structural organization of matter during the bottom-up synthesis of functional nanostructures. For their applications in the field of biotechnology, the determination of dendrimer structural properties as well as the investigation of their specific interaction with guest components are needed. We show how the analysis of the scattering structure factor S(q), in the framework of current models for charged systems in solution, allows important information about the inter-dendrimer electrostatic interaction potential to be obtained. The presented results outline the important role of the dendrimer charge and the solvent conditions in regulating, through the modulation of the electrostatic interaction potential, a great part of the main structural properties. This charge interaction has been indicated by many studies as a crucial factor for a wide range of structural processes involving their biomedical application. Due to their easily controllable properties, dendrimers can be considered at the crossroads between traditional colloids, associating polymers, and biological systems, and thus represent an interesting new technological approach and a suitable model system of molecular organization in biochemistry and related fields.

  12. Data integration, systems approach and multilevel description of complex biosystems

    International Nuclear Information System (INIS)

    Recent years have witnessed the development of new quantitative approaches and theoretical tenets in the biological sciences. The advent of high-throughput experiments in genomics, proteomics and electrophysiology (to cite just a few examples) has provided researchers with unprecedented amounts of data to be analyzed. Large datasets, however, cannot provide the means to achieve a complete understanding of the underlying biological phenomena unless they are supplied with a solid theoretical framework and with proper analytical tools. It is now widely accepted that by using and extending some of the paradigmatic principles of what has been called complex systems theory, some degree of advance in this direction can be attained. We present ways in which data integration techniques (linear, non-linear, combinatorial, graphical), multidimensional-multilevel descriptions (multifractal modeling, dimensionality reduction, computational learning), as well as an approach based on systems theory (interaction maps, probabilistic graphical models, non-equilibrium physics) have allowed us to better understand some problems at the interface of Statistical Physics and Computational Biology.

  13. Blow and go: the Breath-Analyzed Ignition Interlock Device as a technological response to DWI.

    Science.gov (United States)

    Fulkerson, Andrew

    2003-01-01

    Driving while intoxicated (DWI) rates have declined substantially in the last 20 years, a result of shifting public opinion combined with increased law enforcement efforts. A recent tool has been the Breath-Analyzed Ignition Interlock Device, a technology designed to prevent persons with excessive blood alcohol levels from operating the interlocked vehicle. This 3-year recidivism study of the ignition interlock revealed a 17.5% recidivism rate for the interlock group compared to 25.3% for the non-interlock group, a 31% decrease. Multiple offenders and younger (under 30) offenders had significantly lower rates of subsequent arrests. The multiple offenders in the comparison group were more than twice as likely as those in the interlock group to have a subsequent conviction within 3 years; the difference was nearly the same for the under-30 age group, while there was almost no difference for first offenders. Accordingly, the ignition interlock appears to significantly reduce recidivism for repeat and younger DWI offenders but offers almost no improvement for first offenders. One driver of 315 (0.32%) was charged with DWI with an interlock in place; this offender had a child provide the breath sample while she drove the vehicle. PMID:12731690
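The 31% decrease reported in this abstract is the relative reduction of the interlock group's rate against the comparison group's rate; a quick arithmetic check using only the two quoted percentages:

```python
interlock = 0.175    # 3-year recidivism rate, interlock group
comparison = 0.253   # 3-year recidivism rate, non-interlock group

# Relative reduction = (comparison - interlock) / comparison
relative_reduction = (comparison - interlock) / comparison
print(f"{relative_reduction:.0%}")  # matches the reported 31% decrease
```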

  14. Analyzing Preservice Teachers' Technological Pedagogical Content Knowledge Development in the Context of a Multidimensional Teacher Preparation Program

    Science.gov (United States)

    Shinas, Valerie Harlow; Karchmer-Klein, Rachel; Mouza, Chrystalla; Yilmaz-Ozden, Sule; Glutting, Joseph J.

    2015-01-01

    In this quantitative study, correlational and multiple regression analyses were conducted to examine the technological pedagogical content knowledge (TPACK) development of 299 preservice teachers in response to the technology preparation they received during their initial teacher licensure program. Survey data were analyzed to determine the…

  15. Multimodal methodologies for analyzing preschool children’s engagements with digital technologies

    DEFF Research Database (Denmark)

    Chimirri, Niklas Alexander

    Recent research on the role of technologies and specifically digital technologies in pedagogy has engaged in re-reading Vygotsky’s opus with a focus on the technological mediatedness of personal and collective development. Among other issues, it questions the differentiation between tools and signs...

  16. Education of indoor environmental engineering technology

    Czech Academy of Sciences Publication Activity Database

    Kic, P.; Zajíček, Milan

    2011-01-01

    Roč. 9, Spec. 1 (2011), s. 83-90. ISSN 1406-894X. [Biosystems Engineering 2011. Tartu, 12.05.2011-13.05.2011] Institutional research plan: CEZ:AV0Z10750506 Keywords : Biosystems engineering * indoor environment * study * programs Subject RIV: AM - Education http://library.utia.cas.cz/separaty/2011/VS/zajicek- education of indoor enviromental engineering technology .pdf

  17. Analyzing interdependencies between policy mixes and technological innovation systems : The case of offshore wind in Germany

    NARCIS (Netherlands)

    Reichardt, Kristin; Negro, Simona O.; Rogge, Karoline S.; Hekkert, Marko P.

    2016-01-01

    One key approach for studying emerging technologies in the field of sustainability transitions is that of technological innovation systems (TIS). While most TIS studies aim at deriving policy recommendations - typically by identifying system barriers - the actual role of these proposed policies in t

  18. Recurrent Routines: Analyzing and Supporting Orchestration in Technology-Enhanced Primary Classrooms

    Science.gov (United States)

    Prieto, Luis P.; Villagra-Sobrino, Sara; Jorrin-Abellan, Ivan M.; Martinez-Mones, Alejandra; Dimitriadis, Yannis

    2011-01-01

    The increasing presence of multiple Information and Communication Technologies (ICT) in the classroom does not guarantee an improvement of the learning experiences of students, unless it is also accompanied by pedagogically effective orchestration of those technologies. In order to help teachers in this endeavour, it can be useful to understand…

  19. Technological change and salary variation in Mexican regions: Analyzing panel data for the service sector

    Directory of Open Access Journals (Sweden)

    Mario Camberos C.

    2013-07-01

    Full Text Available In this paper the biased technological change hypothesis is tested for Mexican service-sector workers across several Mexican regions. Economic Census microdata for 1998, 2003 and 2008 are used. The hypothesis is tested against technological gaps, considering different indexes, with the statistical consistency of the results checked through panel analysis. The largest wage differences were found between the Capital region and the South: about five hundred percent in 1998, falling to two hundred percent in 2008. This result is consistent with a diminishing technological gap, perhaps caused by the impact of the economic crisis.

  20. A methodology for capturing and analyzing data from technology base seminar wargames.

    OpenAIRE

    Miles, Jeffrey T.

    1991-01-01

    Approved for public release; distribution is unlimited This thesis provides a structured methodology for obtaining, evaluating, and portraying to a decision maker, the opinions of players of Technology Base Seminar Wargames (TBSW). The thesis then demonstrates the methodology by applying the events of the Fire Support Technology Base Seminar Wargame held in May 1991. Specifically, the evaluation team developed six surveys, each survey capturing opinions using the categorical...

  1. Technological change and salary variation in Mexican regions: Analyzing panel data for the service sector

    OpenAIRE

    Mario Camberos C.; Luis Huesca Reynoso; David Castro Lugo

    2013-01-01

    In this paper the biased technological change hypothesis is tested for Mexican service-sector workers across several Mexican regions. Economic Census microdata for 1998, 2003 and 2008 are used. The hypothesis is tested against technological gaps, considering different indexes, with the statistical consistency of the results checked through panel analysis. The largest wage differences were found between the Capital region and the South, about five hundred percent in 1998, but it was lower on ...

  2. Innovative CO2 Analyzer Technology for the Eddy Covariance Flux Monitor Project

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to build and evaluate NDIR Analyzers that can observe eddy covariance flux of CO2 from unmanned airborne platforms. For both phases, a total of four...

  3. Innovative CO2 Analyzer Technology for the Eddy Covariance Flux Monitor Project

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to build and evaluate NDIR Analyzers that can be used to observe Eddy Covariance Flux and Absolute Dry Mole Fraction of CO2 from stationary and airborne...

  4. Application of Printed Circuit Board Technology to FT-ICR MS Analyzer Cell Construction and Prototyping

    Science.gov (United States)

    Leach, Franklin E.; Norheim, Randolph; Anderson, Gordon; Pasa-Tolic, Ljiljana

    2014-12-01

    Although Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR MS) remains the mass spectrometry platform that provides the highest levels of performance for mass accuracy and resolving power, there is room for improvement in analyzer cell design as the ideal quadrupolar trapping potential has yet to be generated for a broadband MS experiment. To this end, analyzer cell designs have improved since the field's inception, yet few research groups participate in this area because of the high cost of instrumentation efforts. As a step towards reducing this barrier to participation and allowing for more designs to be physically tested, we introduce a method of FT-ICR analyzer cell prototyping utilizing printed circuit boards at modest vacuum conditions. This method allows for inexpensive devices to be readily fabricated and tested over short intervals and should open the field to laboratories lacking or unable to access high performance machine shop facilities because of the required financial investment.

  5. Novel Indirect Calorimetry Technology to Analyze Metabolism in Individual Neonatal Rodent Pups

    NARCIS (Netherlands)

    Dominguez, Jesus F.; Guo, Lixin; Carrasco Molnar, Marco A.; Ballester Escobedo, Antonio; Dunphy, Taylor; Lund, Trent D.; Turman, Jack E.

    2009-01-01

    Background: The ability to characterize the development of metabolic function in neonatal rodents has been limited due to technological constraints. Low respiratory volumes and flows at rest pose unique problems, making it difficult to reliably measure O2 consumption, CO2 production, respiratory…

  6. Analyzing the Effect of Web-Based Instruction Applications to School Culture within Technology Integration

    Science.gov (United States)

    Cakiroglu, Unal; Akkan, Yasar; Guven, Bulent

    2012-01-01

    Determining the reflections of technology integration applications that are to be performed in our schools is important to light the way of first steps of integration. In this research, the effect of a web-based instruction environment used by 31 different teachers in a high school to school culture is set forth. The school culture is analyzed…

  7. Applying the CHAID Algorithm to Analyze How Achievement Is Influenced by University Students' Demographics, Study Habits, and Technology Familiarity

    Science.gov (United States)

    Baran, Bahar; Kiliç, Eylem

    2015-01-01

    The purpose of this study is to analyze three separate constructs (demographics, study habits, and technology familiarity) that can be used to identify university students' characteristics and the relationship between each of these constructs with student achievement. A survey method was used for the current study, and the participants included…
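CHAID builds decision trees by splitting, at each node, on the categorical predictor whose chi-square test of independence with the outcome is most significant. A minimal sketch of that split-selection step; the variable names, effect sizes, and synthetic data below are invented for illustration and are not the study's:

```python
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(0)
n = 400
# Hypothetical categorical predictors and a binary achievement outcome.
study_habits = rng.choice(["low", "high"], size=n)
tech_familiar = rng.choice(["low", "high"], size=n)
# By construction, achievement depends on study habits but not on technology familiarity.
achieved = np.where(study_habits == "high",
                    rng.random(n) < 0.7,
                    rng.random(n) < 0.4)

def split_p_value(predictor, outcome):
    """p-value of the chi-square test of independence for one candidate split."""
    table = np.array([[np.sum((predictor == lvl) & (outcome == o))
                       for o in (False, True)]
                      for lvl in np.unique(predictor)])
    return chi2_contingency(table)[1]

p_values = {"study_habits": split_p_value(study_habits, achieved),
            "tech_familiar": split_p_value(tech_familiar, achieved)}
best = min(p_values, key=p_values.get)  # CHAID splits on the most significant predictor
print(best, {k: round(v, 4) for k, v in p_values.items()})
```

The full algorithm also merges predictor categories whose distributions do not differ significantly and applies Bonferroni-adjusted p-values; this sketch shows only the core split criterion.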

  8. Analyzing the Impact of Software-Defined Video Networking to Broadcast Technology Business

    OpenAIRE

    Niiranen, Heikki

    2015-01-01

    Broadcast video systems have traditionally been the domain of purpose-built video interfaces and routing products. The advances in technology have reduced the price of Ethernet network bandwidth and processing to levels where uncompressed video can be feasibly transported within packet-switched IP networks. This trend is accompanied by the paradigm of software-defined networking, which allows easy control of network parameters and features from high-level applications. This has made it poss...

  9. Analyzing the Direct Methanol Fuel Cell technology in portable applications by a historical and bibliometric analysis

    OpenAIRE

    Suominen, A.; Tuominen, A. (Aulis)

    2010-01-01

    The development of direct methanol fuel cell (DMFC) technology through an analysis of research, patenting and commercial adoption is studied in this paper. The analysis uses a dataset gathered from both publication and patent databases. This data is complemented with a review on commercial efforts on portable fuel cells. Bibliometric methods are used to identify research networks and research trends. The Fisher-Pry growth model is used to estimate future research activity. The patent landscap...

  10. Fusion of Nuclear and Emerging Technology

    International Nuclear Information System (INIS)

    The presentation discussed the following subjects: emerging technology; nuclear technology; fusion of emerging and nuclear technology; the progressive nature of knowledge; optically stimulated luminescence - application of luminescence technology to sediments; and biosystemics technology - the convergence of nanotechnology, ecological science, biotechnology, cognitive science and IT - and its prospective impact on materials science, the management of public systems for bio-health, eco- and food-system integrity, and disease mitigation

  11. Analyzing the Life Cycle Energy Savings of DOE Supported Buildings Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Cort, Katherine A.; Hostick, Donna J.; Dirks, James A.; Elliott, Douglas B.

    2009-08-31

    This report examines the factors that would potentially help determine an appropriate analytical timeframe for measuring the U.S. Department of Energy's Building Technology (BT) benefits and presents a summary-level analysis of the life cycle savings for BT’s Commercial Buildings Integration (CBI) R&D program. The energy savings for three hypothetical building designs are projected over a 100-year period using Building Energy Analysis and Modeling System (BEAMS) to illustrate the resulting energy and carbon savings associated with the hypothetical aging buildings. The report identifies the tasks required to develop a long-term analytical and modeling framework, and discusses the potential analytical gains and losses by extending an analysis into the “long-term.”

  12. Temperature variation in metal ceramic technology analyzed using time domain optical coherence tomography

    Science.gov (United States)

    Sinescu, Cosmin; Topala, Florin I.; Negrutiu, Meda Lavinia; Duma, Virgil-Florin; Podoleanu, Adrian G.

    2014-01-01

    The quality of dental prostheses is essential in providing good quality medical services. The metal ceramic technology applied in dentistry implies ceramic sintering inside a dental oven. Every ceramic material requires a specific sintering chart recommended by the producer. For a regular dental technician it is very difficult to evaluate whether the temperature inside the oven remains as programmed on the sintering chart; maintaining the calibration over time is also an issue for practitioners. Metal ceramic crowns develop a very accurate pattern in the ceramic layers depending on the temperature variation inside the oven where they are processed. Different patterns were identified in the present study for samples processed with a temperature variation of +30 °C to +50 °C and of -30 °C to -50 °C, respectively. The OCT evaluations performed on the normal samples show a uniform spread of the ceramic granulation inside the ceramic materials. For the samples sintered at a higher temperature, an alternation of white and darker areas between the enamel and opaque layers appears. For the samples sintered at a lower temperature, a decrease in ceramic granulation from the enamel towards the opaque layer was observed. TD-OCT methods can therefore be used efficiently for detecting temperature variation during ceramic sintering inside the oven.

  13. Analyzing Accuracy and Accessibility in Information and Communication Technology Ethical Scenario Context

    Directory of Open Access Journals (Sweden)

    M. Masrom

    2011-01-01

    Full Text Available Problem statement: Recently, the development of Information and Communication Technology (ICT) has become indispensable to life. The utilization of ICT has provided advantages for people, organizations and society as a whole. Nevertheless, the widespread and rapid use of ICT in society has exacerbated existing ethical issues and dilemmas and led to the emergence of new ethical issues such as unauthorized access, software piracy, internet pornography, privacy protection, the information gap and many others. Approach: The aim of this study is therefore to discuss several issues in ICT ethics, focusing on two major issues: data accuracy and accessibility. Results: The results indicated that more than half of respondents tended to be ethical in the data accuracy scenario as well as in the accessibility scenario. Several computer ethics scenarios relating to data accuracy and accessibility are presented and the results of the analysis discussed. Conclusion: Based on the results of this study, computer ethics issues such as data accuracy and accessibility should receive more attention in the ICT field.

  14. Evolving technologies for growing, imaging and analyzing 3D root system architecture of crop plants

    Institute of Scientific and Technical Information of China (English)

    Miguel A Pineros; Pierre-Luc Pradier; Nathanael M Shaw; Ithipong Assaranurak; Susan R McCouch; Craig Sturrock; Malcolm Bennett; Leon V Kochian; Brandon G Larson; Jon E Shaff; David J Schneider; Alexandre Xavier Falcao; Lixing Yuan; Randy T Clark; Eric J Craft; Tyler W Davis

    2016-01-01

    A plant’s ability to maintain or improve its yield under limiting conditions, such as nutrient deficiency or drought, can be strongly influenced by root system architecture (RSA), the three-dimensional distribution of the different root types in the soil. The ability to image, track and quantify these root system attributes in a dynamic fashion is a useful tool in assessing desirable genetic and physiological root traits. Recent advances in imaging technology and phenotyping software have resulted in substantive progress in describing and quantifying RSA. We have designed a hydroponic growth system which retains the three-dimensional RSA of the plant root system, while allowing for aeration, solution replenishment and the imposition of nutrient treatments, as well as high-quality imaging of the root system. The simplicity and flexibility of the system allows for modifications tailored to the RSA of different crop species and improved throughput. This paper details the recent improvements and innovations in our root growth and imaging system which allows for greater image sensitivity (detection of fine roots and other root details), higher efficiency, and a broad array of growing conditions for plants that more closely mimic those found under field conditions.

  15. Evolving technologies for growing, imaging and analyzing 3D root system architecture of crop plants.

    Science.gov (United States)

    Piñeros, Miguel A; Larson, Brandon G; Shaff, Jon E; Schneider, David J; Falcão, Alexandre Xavier; Yuan, Lixing; Clark, Randy T; Craft, Eric J; Davis, Tyler W; Pradier, Pierre-Luc; Shaw, Nathanael M; Assaranurak, Ithipong; McCouch, Susan R; Sturrock, Craig; Bennett, Malcolm; Kochian, Leon V

    2016-03-01

    A plant's ability to maintain or improve its yield under limiting conditions, such as nutrient deficiency or drought, can be strongly influenced by root system architecture (RSA), the three-dimensional distribution of the different root types in the soil. The ability to image, track and quantify these root system attributes in a dynamic fashion is a useful tool in assessing desirable genetic and physiological root traits. Recent advances in imaging technology and phenotyping software have resulted in substantive progress in describing and quantifying RSA. We have designed a hydroponic growth system which retains the three-dimensional RSA of the plant root system, while allowing for aeration, solution replenishment and the imposition of nutrient treatments, as well as high-quality imaging of the root system. The simplicity and flexibility of the system allows for modifications tailored to the RSA of different crop species and improved throughput. This paper details the recent improvements and innovations in our root growth and imaging system which allows for greater image sensitivity (detection of fine roots and other root details), higher efficiency, and a broad array of growing conditions for plants that more closely mimic those found under field conditions. PMID:26683583

  16. Assessing the validity of using serious game technology to analyze physician decision making.

    Directory of Open Access Journals (Sweden)

    Deepika Mohan

    Full Text Available BACKGROUND: Physician non-compliance with clinical practice guidelines remains a critical barrier to high quality care. Serious games (using gaming technology for serious purposes) have emerged as a method of studying physician decision making. However, little is known about their validity. METHODS: We created a serious game and evaluated its construct validity. We used the decision context of trauma triage in the Emergency Department of non-trauma centers, given widely accepted guidelines that recommend the transfer of severely injured patients to trauma centers. We designed cases with the premise that the representativeness heuristic influences triage (i.e., physicians make transfer decisions based on archetypes of severely injured patients rather than guidelines). We randomized a convenience sample of emergency medicine physicians to a control or cognitive load arm, and compared performance (disposition decisions, number of orders entered, time spent per case). We hypothesized that cognitive load would increase the use of heuristics, increasing the transfer of representative cases and decreasing the transfer of non-representative cases. FINDINGS: We recruited 209 physicians, of whom 168 (79%) began and 142 (68%) completed the task. Physicians transferred 31% of severely injured patients during the game, consistent with rates of transfer for severely injured patients in practice. They entered the same average number of orders in both arms (control (C): 10.9 [SD 4.8] vs. cognitive load (CL): 10.7 [SD 5.6], p = 0.74), despite spending less time per case in the control arm (C: 9.7 [SD 7.1] vs. CL: 11.7 [SD 6.7] minutes, p<0.01). Physicians were equally likely to transfer representative cases in the two arms (C: 45% vs. CL: 34%, p = 0.20), but were more likely to transfer non-representative cases in the control arm (C: 38% vs. CL: 26%, p = 0.03). CONCLUSIONS: We found that physicians made decisions consistent with actual practice, that we could
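Between-arm comparisons of transfer rates like those in this abstract are typically made with a test on a 2x2 contingency table. A minimal sketch with scipy; the cell counts below are illustrative only, loosely reconstructed from the quoted percentages, since the abstract does not report raw counts:

```python
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table: transfers of non-representative cases by study arm
# (rows: control, cognitive load; columns: transferred, not transferred).
# Counts are illustrative, not the study's raw data.
table = [[27, 44],
         [18, 53]]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p:.3f}, dof={dof}")
```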

  17. Electro-Quasistatic Simulations in Bio-Systems Engineering and Medical Engineering

    Directory of Open Access Journals (Sweden)

    U. van Rienen

    2005-01-01

    Full Text Available Slowly varying electromagnetic fields play a key role in various applications in bio-systems and medical engineering. Examples are the electric activity of neurons on neurochips used as biosensors, the stimulating electric fields of implanted electrodes used for deep brain stimulation in patients with Morbus Parkinson and the stimulation of the auditory nerves in deaf patients, respectively. In order to simulate the neuronal activity on a chip it is necessary to couple Maxwell's and Hodgkin-Huxley's equations. First numerical results for a neuron coupling to a single electrode are presented. They show a promising qualitative agreement with the experimentally recorded signals. Further, simulations are presented on electrodes for deep brain stimulation in animal experiments where the question of electrode ageing and energy deposition in the surrounding tissue are of major interest. As a last example, electric simulations for a simple cochlea model are presented comparing the field in the skull bones for different electrode types and stimulations in different positions.
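The Hodgkin-Huxley side of the coupled model described above can be sketched on its own: the classic four-variable membrane equations under a constant current-clamp stimulus, using standard squid-axon parameters. The coupling to the extracellular field solver described in the abstract is omitted here, so this is only an illustrative fragment:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Classic Hodgkin-Huxley membrane model (units: mV, ms, uA/cm^2, mS/cm^2).
C_m = 1.0
g_Na, g_K, g_L = 120.0, 36.0, 0.3
E_Na, E_K, E_L = 50.0, -77.0, -54.387

def alpha_n(V): return 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
def beta_n(V):  return 0.125 * np.exp(-(V + 65.0) / 80.0)
def alpha_m(V): return 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
def beta_m(V):  return 4.0 * np.exp(-(V + 65.0) / 18.0)
def alpha_h(V): return 0.07 * np.exp(-(V + 65.0) / 20.0)
def beta_h(V):  return 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))

def hh(t, y, I_stim):
    """Membrane potential V and gating variables n, m, h under current clamp."""
    V, n, m, h = y
    I_Na = g_Na * m**3 * h * (V - E_Na)
    I_K = g_K * n**4 * (V - E_K)
    I_L = g_L * (V - E_L)
    dV = (I_stim - I_Na - I_K - I_L) / C_m
    dn = alpha_n(V) * (1 - n) - beta_n(V) * n
    dm = alpha_m(V) * (1 - m) - beta_m(V) * m
    dh = alpha_h(V) * (1 - h) - beta_h(V) * h
    return [dV, dn, dm, dh]

y0 = [-65.0, 0.317, 0.053, 0.596]  # resting membrane potential and gating values
sol = solve_ivp(hh, (0.0, 50.0), y0, args=(10.0,), max_step=0.05)
print(f"peak membrane potential: {sol.y[0].max():.1f} mV")
```

A suprathreshold stimulus of 10 uA/cm^2 produces repetitive spiking, with peaks well above 0 mV; in the coupled simulations of the abstract, the membrane current additionally acts as a source term for the electro-quasistatic field problem.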

  18. Statistical Techniques Complement UML When Developing Domain Models of Complex Dynamical Biosystems

    Science.gov (United States)

    Timmis, Jon; Qwarnstrom, Eva E.

    2016-01-01

    Computational modelling and simulation is increasingly being used to complement traditional wet-lab techniques when investigating the mechanistic behaviours of complex biological systems. In order to ensure computational models are fit for purpose, it is essential that the abstracted view of biology captured in the computational model, is clearly and unambiguously defined within a conceptual model of the biological domain (a domain model), that acts to accurately represent the biological system and to document the functional requirements for the resultant computational model. We present a domain model of the IL-1 stimulated NF-κB signalling pathway, which unambiguously defines the spatial, temporal and stochastic requirements for our future computational model. Through the development of this model, we observe that, in isolation, UML is not sufficient for the purpose of creating a domain model, and that a number of descriptive and multivariate statistical techniques provide complementary perspectives, in particular when modelling the heterogeneity of dynamics at the single-cell level. We believe this approach of using UML to define the structure and interactions within a complex system, along with statistics to define the stochastic and dynamic nature of complex systems, is crucial for ensuring that conceptual models of complex dynamical biosystems, which are developed using UML, are fit for purpose, and unambiguously define the functional requirements for the resultant computational model. PMID:27571414

  19. Analyzing the effect of customer loyalty on viral marketing adoption based on theory of technology acceptance model

    Directory of Open Access Journals (Sweden)

    Peyman Ghafari Ashtiani

    2016-08-01

    Full Text Available One of the greatest advantages of the internet and its expansion is its easy, low-cost access to unlimited information and easy, fast information exchange. The arrival of communication technology in the marketing area and the emergence of the Internet have led to the creation and development of new marketing models such as viral marketing. In fact, unlike other marketing methods, the most powerful tool for selling products and ideas is not a marketer persuading a customer but one customer persuading another. The purpose of this research is to analyze the relationship between customers' loyalty and the acceptance of viral marketing, based on the technology acceptance model (TAM), among the civil engineers and architects who are members of the Engineering Council in Isfahan (ECI). The research method is descriptive-survey and applied in purpose. The statistical population includes the 14,400 civil engineers and architects who are members of the Engineering Council in Isfahan. A sample size of 762 members was determined by the Cochran sampling formula, with the sample selected by convenience sampling. Data were collected by the field method, extracted from the questionnaires, and analyzed with SPSS and LISREL software. According to the results, the loyalty of the civil engineer and architect members of ECI was associated with the acceptance of and practical involvement in viral marketing.

  20. Analyzing the efficiency of small and medium-sized enterprises of a national technology innovation research and development program

    OpenAIRE

    Park, Sungmin

    2014-01-01

    This study analyzes the efficiency of small and medium-sized enterprises (SMEs) of a national technology innovation research and development (R&D) program. In particular, an empirical analysis is presented that aims to answer the following question: “Is there a difference in the efficiency between R&D collaboration types and between government R&D subsidy sizes?” Methodologically, the efficiency of a government-sponsored R&D project (i.e., GSP) is measured by Data Envelopment Analysis (DEA), ...

  1. Fluorescent In Situ Hybridization as a Genetic Technology to Analyzing Chromosomal Organization of Alien Wheat Recombinant Lines

    International Nuclear Information System (INIS)

    Fluorescent in situ hybridization is a valuable method for physical mapping of DNA sequences to chromosomes and genomes and for analyzing their organization, diversity, evolution and function. Using genomic DNA, the origin of chromatin in hybrids and alien introgression lines can be identified and followed through breeding programmes. We have applied this technology to study the chromosome composition of new recombinants and genomes derived from spontaneous and induced translocations, in particular those involving rye and the goat grass Thinopyrum intermedium, which transfer disease and stress resistance to wheat. We have established flow diagrams for easy identification of the alien chromosome material. (author)

  2. Analysis of molecular responses in plant biosystem under excess aluminum stress

    International Nuclear Information System (INIS)

    The effects of aluminum (Al) ions on a plant biosystem were investigated using tandem accelerator mass spectrometry (AMS). The distribution of Al incorporated into cell organelles was investigated in water-cultured Brachiaria ruziziensis, a subtropical pasture grass, aiming to elucidate the behavior of Al in Al-resistant plants. The intracellular amount of Al was determined using radioactive Al (26Al) as a tracer. The concentration of Al was higher in the nucleus than in other organelles. The intracellular distribution of Al was then investigated in cell cultures of acid-resistant and Al-resistant B. ruziziensis. The optimum temperature for the growth of B. ruziziensis under Al stress at pH 3.5-4.0 was significantly lower than that under ordinary culture conditions at pH 5.8. Electrophoresis results suggested that Al incorporated under acid conditions may exist as a complex bound to several different proteins in the cell. The localization of Al to the nucleus was more distinct in the Al-resistant cells than in the sensitive ones. The relationship between Al sensitivity and viral sensitivity was then investigated in barley. Formation of necrotic streaks was induced by BSMV virus under stress conditions of a low concentration of Al at pH 6, and such streaks were more easily induced as the Al concentration increased. The ultra-sensitive AMS method was able to detect 10^7-10^8 Al atoms. In the Al-resistant plant, accumulation in the nucleus or mitochondria was greater than in a sensitive one. It was therefore suggested that Al-resistant mutants have larger nuclear and mitochondrial capacities for Al accumulation than a sensitive strain. (M.N.)

  3. Analyzing the efficiency of small and medium-sized enterprises of a national technology innovation research and development program.

    Science.gov (United States)

    Park, Sungmin

    2014-01-01

    This study analyzes the efficiency of small and medium-sized enterprises (SMEs) of a national technology innovation research and development (R&D) program. In particular, an empirical analysis is presented that aims to answer the following question: "Is there a difference in the efficiency between R&D collaboration types and between government R&D subsidy sizes?" Methodologically, the efficiency of a government-sponsored R&D project (i.e., GSP) is measured by Data Envelopment Analysis (DEA), and a nonparametric analysis of variance method, the Kruskal-Wallis (KW) test, is adopted to see if the efficiency differences between R&D collaboration types and between government R&D subsidy sizes are statistically significant. This study's major findings are as follows. First, contrary to our hypothesis, when we controlled for the influence of government R&D subsidy size, there was no statistically significant difference in the efficiency between R&D collaboration types. However, the R&D collaboration type "SME-University-Laboratory" Joint-Venture was superior to the others, achieving the largest median and the smallest interquartile range of DEA efficiency scores. Second, the differences in the efficiency were statistically significant between government R&D subsidy sizes, and on the whole the phenomenon of diseconomies of scale was identified. As the government R&D subsidy size increased, the central measures of DEA efficiency scores were reduced, but the dispersion measures tended to get larger. PMID:25120949
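The Kruskal-Wallis step used in this study to compare efficiency between subsidy-size groups can be sketched with scipy; the efficiency scores below are invented for illustration and are not the study's data:

```python
from scipy.stats import kruskal

# Hypothetical DEA efficiency scores grouped by government R&D subsidy size.
# This illustrates only the nonparametric comparison step, not the DEA measurement.
small  = [0.92, 0.85, 0.88, 0.95, 0.81, 0.90]
medium = [0.74, 0.80, 0.69, 0.77, 0.72, 0.83]
large  = [0.55, 0.62, 0.58, 0.70, 0.49, 0.66]

# The KW test ranks all scores jointly and asks whether the group rank sums
# differ more than chance would allow (no normality assumption needed).
stat, p = kruskal(small, medium, large)
print(f"H={stat:.2f}, p={p:.4f}")
```

With the clearly decreasing scores above, the test rejects the null of equal distributions, mirroring the "diseconomies of scale" pattern the abstract reports.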

  4. Application of Linux-based shared-memory technology in visual and remote analyzing software for high-speed nuclear information

    International Nuclear Information System (INIS)

    Taking remote analyzing software for high-speed nuclear information acquisition as an example, the paper presents an application of shared-memory technology based on the Linux operating system. Its universality and reliability make it a useful reference for similar analysis software in other nuclear information acquisition systems
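The original software is a Linux/C system; a minimal Python analogue of the same idea, publishing acquisition data in a POSIX shared-memory segment that a separate analyzer process attaches to by name, looks like this (segment size and the stored value are illustrative):

```python
# Sketch: share acquisition data between a writer (acquisition) and a
# reader (remote analyzer) via POSIX shared memory.
from multiprocessing import shared_memory

# Writer: the acquisition side publishes a block of counts.
shm = shared_memory.SharedMemory(create=True, size=16)
shm.buf[:4] = (1234).to_bytes(4, "little")   # e.g. a channel count

# Reader: the analyzer attaches to the same segment by name.
view = shared_memory.SharedMemory(name=shm.name)
count = int.from_bytes(view.buf[:4], "little")
print(count)  # → 1234

view.close()
shm.close()
shm.unlink()   # free the segment when both sides are done
```

In a real acquisition system the reader would poll or be signalled when new data arrive; synchronization (e.g. a semaphore) is omitted here for brevity.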

  5. Validation of the Applied Biosystems 7500 Fast Instrument for Detection of Listeria Species with the SureTect Listeria Species PCR Assay.

    Science.gov (United States)

    Cloke, Jonathan; Arizanova, Julia; Crabtree, David; Simpson, Helen; Evans, Katharine; Vaahtoranta, Laura; Palomäki, Jukka-Pekka; Artimo, Paulus; Huang, Feng; Liikanen, Maria; Koskela, Suvi; Chen, Yi

    2016-03-01

    The Thermo Scientific™ SureTect™ Listeria species Real-Time PCR Assay was certified during 2013 by the AOAC Research Institute (RI) Performance Tested Methods(SM) program as a rapid method for the detection of Listeria species from a wide range of food matrixes and surface samples. A method modification study was conducted in 2015 to extend the matrix claims of the product to a wider range of food matrixes. This report details the method modification study undertaken to extend the use of this PCR kit to the Applied Biosystems™ 7500 Fast PCR Instrument and Applied Biosystems RapidFinder™ Express 2.0 software, allowing use of the assay on a 96-well format PCR cycler in addition to the current workflow, which uses the 24-well Thermo Scientific PikoReal™ PCR Instrument and Thermo Scientific SureTect software. The method modification study presented in this report was assessed by the AOAC-RI as a level 2 method modification study, necessitating a method developer study on a representative range of food matrixes covering raw ground turkey, 2% fat pasteurized milk, and bagged lettuce, as well as stainless steel surface samples. All testing was conducted in comparison to the reference method detailed in International Organization for Standardization (ISO) 6579:2002. No significant difference by probability of detection statistical analysis was found between the SureTect Listeria species PCR Assay and the ISO reference method for any of the three food matrixes or the surface samples analyzed during the study. PMID:26923177

  6. Does Personality Matter? Applying Holland's Typology to Analyze Students' Self-Selection into Science, Technology, Engineering, and Mathematics Majors

    Science.gov (United States)

    Chen, P. Daniel; Simpson, Patricia A.

    2015-01-01

    This study utilized John Holland's personality typology and the Social Cognitive Career Theory (SCCT) to examine the factors that may affect students' self-selection into science, technology, engineering, and mathematics (STEM) majors. Results indicated that gender, race/ethnicity, high school achievement, and personality type were statistically…

  7. Collect, analyze and data base for building up the investment reports of Center for Nuclear Science and Technology construction project

    International Nuclear Information System (INIS)

    Following Contract No. 19/HD/NVCB dated July 10, 2013, signed by the President of the Vietnam Atomic Energy Institute (VINATOM), an additional ministerial project was approved by Decision No. 526/QD-VNLNT dated July 8, 2013 of the VINATOM President in order to implement an important task for VINATOM. The project was implemented by the Institute for Nuclear Science and Technology (INST) in Hanoi as the managing organization, with VINATOM as the owner of the project results. The main objectives of this project were, with national budget support, to collect and synthesize a general report from previous projects relevant to the CNEST and the new research reactor, IAEA guidance documents, documents provided by ROSATOM in seminars in 2010, 2012 and 2013, and reports from expert visits of the Ministry of Science and Technology, and to complete the general report on the CNEST construction project. (author)

  8. Ab initio O(N) elongation-counterpoise method for BSSE-corrected interaction energy analyses in biosystems

    Energy Technology Data Exchange (ETDEWEB)

    Orimoto, Yuuichi; Xie, Peng; Liu, Kai [Department of Material Sciences, Faculty of Engineering Sciences, Kyushu University, 6-1 Kasuga-Park, Fukuoka 816-8580 (Japan); Yamamoto, Ryohei [Department of Molecular and Material Sciences, Interdisciplinary Graduate School of Engineering Sciences, Kyushu University, 6-1 Kasuga-Park, Fukuoka 816-8580 (Japan); Imamura, Akira [Hiroshima Kokusai Gakuin University, 6-20-1 Nakano, Aki-ku, Hiroshima 739-0321 (Japan); Aoki, Yuriko, E-mail: aoki.yuriko.397@m.kyushu-u.ac.jp [Department of Material Sciences, Faculty of Engineering Sciences, Kyushu University, 6-1 Kasuga-Park, Fukuoka 816-8580 (Japan); Japan Science and Technology Agency, CREST, 4-1-8 Hon-chou, Kawaguchi, Saitama 332-0012 (Japan)

    2015-03-14

    An Elongation-counterpoise (ELG-CP) method was developed for performing accurate and efficient interaction energy analysis and correcting the basis set superposition error (BSSE) in biosystems. The method combines our ab initio O(N) elongation method with the conventional counterpoise method proposed for solving the BSSE problem. As a test, the ELG-CP method was applied to the analysis of DNA inter-strand interaction energies with respect to the alkylation-induced base pair mismatch phenomenon that causes a transition from G⋯C to A⋯T. The ELG-CP method showed high efficiency (nearly linear scaling) and high accuracy, with a negligibly small energy error in the total energy calculations (on the order of 10^-7-10^-8 hartree/atom) compared with the conventional method during the counterpoise treatment. Furthermore, the magnitude of the BSSE was found to be ca. −290 kcal/mol for the calculation of a DNA model with 21 base pairs. This emphasizes the importance of BSSE correction when a limited-size basis set is used to study DNA models and compare small energy differences between them. In this work, we quantitatively estimated the inter-strand interaction energy for each possible step in the transition process from G⋯C to A⋯T by the ELG-CP method. It was found that the base pair replacement in the process only affects the interaction energy in a limited area around the mismatch position, within a few adjacent base pairs. From the interaction energy point of view, our results showed that a base pair sliding mechanism possibly occurs after the alkylation of guanine to gain the maximum possible number of hydrogen bonds between the bases. In addition, the steps leading to the A⋯T replacement accompanied by replication were found to be unfavorable processes, corresponding to a ca. 10 kcal/mol loss in stabilization energy. The present study indicated that the ELG-CP method is promising for
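The counterpoise treatment the ELG-CP method builds on is the standard Boys-Bernardi scheme: each fragment is recomputed in the full dimer basis, so the BSSE-corrected interaction energy of strands A and B (subscripts denote the system, superscripts the basis set) is

```latex
E_{\text{int}}^{\text{CP}} = E_{AB}^{ab} - E_{A}^{ab} - E_{B}^{ab},
\qquad
\Delta E_{\text{BSSE}} = \bigl(E_{A}^{a} - E_{A}^{ab}\bigr) + \bigl(E_{B}^{b} - E_{B}^{ab}\bigr)
```

where a, b, and ab are the basis sets of fragment A, fragment B, and the whole complex. This is the generic counterpoise formula; the elongation-specific fragment partitioning described in the paper is more involved.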

  9. Analyzing the effect of customer loyalty on virtual marketing adoption based on theory of technology acceptance model

    OpenAIRE

    Peyman Ghafari Ashtiani; Atefeh Parsayan; Moein Mohajerani

    2016-01-01

    One of the main advantages of the internet and its expansion is its easy, low-cost access to unlimited information and its easy, fast information exchange. The arrival of communication technology in the marketing area and the emergence of the Internet have led to the creation and development of new marketing models such as viral marketing. In fact, unlike other marketing methods, the most powerful selling of products and ideas is done not by a marketer to a customer but from a cu...

  10. Analyzing the Development of Linux Technology (Research on Embedded Linux System Porting Technology)

    Institute of Scientific and Technical Information of China (English)

    张伟杰; 李明理

    2012-01-01

    This paper describes the embedded Linux development workflow and the establishment of a cross-development environment, analyzes the internal structure of the Linux kernel and its influence on system porting, and introduces the target hardware platform and the existing software base. From the perspective of realizing the target system, taking theoretical analysis as the basis and a working port as the goal, it analyzes and implements the main elements of system porting.

  11. Global Impact of Energy Use in Middle East Oil Economies: A Modeling Framework for Analyzing Technology-Energy-Environment-Economy Chain

    OpenAIRE

    Hodjat Ghadimi

    2007-01-01

    To explore choices of improving energy efficiency in energy-rich countries of the Middle East, this study lays out an integrated modeling framework for analyzing the technology-energy-environment-economy chain for the case of an energy exporting country. This framework consists of an input output process-flow model (IOPM) and a computable general equilibrium (CGE) model. The former investigates the micro-level production processes and sectoral interdependencies to show how alternative technol...

  12. Network Stack Analysis and Protocol Addition Technology in Linux

    Institute of Scientific and Technical Information of China (English)

    唐续; 刘心松; 杨峰

    2003-01-01

    In order to improve the performance of Linux networking, new protocols should be implemented and added to the original protocol stack. To meet this demand, this paper analyzes the Linux network stack architecture and its implementation, then presents a method for adding new protocols to the Linux network stack. The method covers protocol registration, protocol operations, protocol header construction, packet reception, and the user interface.

  13. Seismic Technology Adapted to Analyzing and Developing Geothermal Systems Below Surface-Exposed High-Velocity Rocks Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Hardage, Bob A. [Univ. of Texas, Austin, TX (United States). Bureau of Economic Geology; DeAngelo, Michael V. [Univ. of Texas, Austin, TX (United States). Bureau of Economic Geology; Ermolaeva, Elena [Univ. of Texas, Austin, TX (United States). Bureau of Economic Geology; Remington, Randy [Univ. of Texas, Austin, TX (United States). Bureau of Economic Geology; Sava, Diana [Univ. of Texas, Austin, TX (United States). Bureau of Economic Geology; Wagner, Donald [Univ. of Texas, Austin, TX (United States). Bureau of Economic Geology; Wei, Shuijion [Univ. of Texas, Austin, TX (United States). Bureau of Economic Geology

    2013-02-01

    The objective of our research was to develop and demonstrate seismic data-acquisition and data-processing technologies that allow geothermal prospects below high-velocity rock outcrops to be evaluated. To do this, we acquired a 3-component seismic test line across an area of exposed high-velocity rocks in Brewster County, Texas, where there is high heat flow and surface conditions mimic those found at numerous geothermal prospects. Seismic contractors have not succeeded in creating good-quality seismic data in this area for companies who have acquired data for oil and gas exploitation purposes. Our test profile traversed an area where high-velocity rocks and low-velocity sediment were exposed on the surface in alternating patterns that repeated along the test line. We verified that these surface conditions cause non-ending reverberations of Love waves, Rayleigh waves, and shallow critical refractions to travel across the earth surface between the boundaries of the fast-velocity and slow-velocity material exposed on the surface. These reverberating surface waves form the high level of noise in this area that does not allow reflections from deep interfaces to be seen and utilized. Our data-acquisition method of deploying a box array of closely spaced geophones allowed us to recognize and evaluate these surface-wave noise modes regardless of the azimuth direction to the surface anomaly that backscattered the waves and caused them to return to the test-line profile. With this knowledge of the surface-wave noise, we were able to process these test-line data to create P-P and SH-SH images that were superior to those produced by a skilled seismic data-processing contractor. Compared to the P-P data acquired along the test line, the SH-SH data provided a better detection of faults and could be used to trace these faults upward to the boundaries of exposed surface rocks. We expanded our comparison of the relative value of S-wave and P-wave seismic data for geothermal

  14. Greenhouse gas (GHG) emission in organic farming. Approximate quantification of its generation at the organic garden of the School of Agricultural, Food and Biosystems Engineering (ETSIAAB) in the Technical University of Madrid (UPM)

    Science.gov (United States)

    Campos, Jorge; Barbado, Elena; Maldonado, Mariano; Andreu, Gemma; López de Fuentes, Pilar

    2016-04-01

    As is well known, agricultural soil fertilization increases the rate of greenhouse gas (GHG) emission production, including CO2, CH4 and N2O. The share of this activity in climate change is currently under study, as are the possibilities for mitigation. In this context, we considered it would be interesting to know what this share is in the case of organic farming. A field experiment was therefore carried out at the organic garden of the School of Agricultural, Food and Biosystems Engineering (ETSIAAB) at the Technical University of Madrid (UPM). The orchard included different management growing areas, corresponding to different schools of organic farming. Soil and gas samples were taken from these different sites. Gas samples were collected throughout the growing season from the atmosphere accumulated inside static chambers inserted into the soil, then taken to the laboratory and analyzed there. The results give an approximate picture of how ecological fertilization contributes to air pollution from greenhouse gases.
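The static-chamber sampling described above implies the standard flux computation: regress the headspace concentration on time and scale the slope by the chamber's volume-to-area ratio. A minimal sketch follows; all numbers are hypothetical, and the conversion from ppm to mass units (via the ideal gas law) is omitted.

```python
# Static-chamber flux: slope of concentration vs. time, scaled by V/A.
def chamber_flux(times_h, conc_ppm, volume_m3, area_m2):
    """Least-squares slope (ppm/h) scaled to ppm * m / h."""
    n = len(times_h)
    mt = sum(times_h) / n
    mc = sum(conc_ppm) / n
    slope = (sum((t - mt) * (c - mc) for t, c in zip(times_h, conc_ppm))
             / sum((t - mt) ** 2 for t in times_h))
    return slope * volume_m3 / area_m2

# Hypothetical CO2 build-up in a 0.02 m3 chamber over 0.1 m2 of soil.
print(round(chamber_flux([0.0, 0.5, 1.0], [400.0, 430.0, 460.0], 0.02, 0.1), 1))
# → 12.0  (ppm * m / h; multiply by gas density to get mg m^-2 h^-1)
```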

  15. An application of multiplier analysis in analyzing the role of information and communication technology sectors on Indonesian national economy: 1990-2005

    Science.gov (United States)

    Zuhdi, Ubaidillah

    2015-01-01

    The purpose of this study is to continue previous studies focused on Indonesian Information and Communication Technology (ICT) sectors. More specifically, this study aims to analyze the role of ICT sectors in the Indonesian national economy using the simple household income multiplier, one of the analysis tools in Input-Output (IO) analysis. The analysis period of this study is 1990-2005. The results show that the sectors did not have an important role in the national economy of Indonesia during that period. The results also show that, from the point of view of the multiplier, the Indonesian national economy tended to be stable during the period.

  16. An application of multiplier analysis in analyzing the role of information and communication technology sectors on Indonesian national economy: 1990-2005

    International Nuclear Information System (INIS)

    The purpose of this study is to continue previous studies focused on Indonesian Information and Communication Technology (ICT) sectors. More specifically, this study aims to analyze the role of ICT sectors in the Indonesian national economy using the simple household income multiplier, one of the analysis tools in Input-Output (IO) analysis. The analysis period of this study is 1990-2005. The results show that the sectors did not have an important role in the national economy of Indonesia during that period. The results also show that, from the point of view of the multiplier, the Indonesian national economy tended to be stable during the period
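The multiplier computation underlying such IO analysis starts from the Leontief inverse of the technical-coefficient matrix. A toy two-sector sketch (hypothetical coefficients; the study uses Indonesia's national IO tables, and the household income multiplier additionally closes the model with a household row and column):

```python
# Leontief inverse and simple output multipliers for a toy 2-sector table.
import numpy as np

# Technical coefficients A[i][j]: input from sector i per unit output
# of sector j (hypothetical values).
A = np.array([[0.2, 0.3],
              [0.1, 0.4]])

# Total (direct + indirect) requirements matrix.
L = np.linalg.inv(np.eye(2) - A)

# Simple multipliers are the column sums of the Leontief inverse.
multipliers = L.sum(axis=0)
print(multipliers)
```

Here a multiplier of, say, 1.56 for sector 1 means one extra unit of final demand for that sector generates 1.56 units of output economy-wide.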

  17. Oxygen analyzer

    Science.gov (United States)

    Benner, William H.

    1986-01-01

    An oxygen analyzer that identifies and classifies microgram quantities of oxygen in ambient particulate matter and quantitates organic oxygen in solvent extracts of ambient particulate matter. A sample is pyrolyzed in oxygen-free nitrogen gas (N2), and the resulting oxygen is quantitatively converted to carbon monoxide (CO) by contact with hot granular carbon (C). Two analysis modes are possible: (1) rapid determination of total pyrolyzable oxygen, obtained by decomposing the sample at 1135 °C, or (2) temperature-programmed oxygen thermal analysis, obtained by heating the sample from room temperature to 1135 °C as a function of time. The analyzer basically comprises a pyrolysis tube containing a bed of granular carbon under N2, ovens used to heat the carbon and/or decompose the sample, and a non-dispersive infrared CO detector coupled to a mini-computer to quantitate oxygen in the decomposition products and control oven heating.

  18. Laser influence to biosystems

    Directory of Open Access Journals (Sweden)

    Jevtić Sanja D.

    2015-01-01

    In this paper continuous-wave (cw) lasers in the visible region were applied in order to study the influence of quantum generators on certain plants. The aim of such projects is to analyse biostimulation processes in living organisms, which are linked to defined laser power density thresholds (exposition doses). The results of irradiating corn and wheat seeds using a He-Ne laser in the cw regime (632.8 nm, 50 mW) are presented and compared to results for other laser types. Dry and wet plant seeds were irradiated for defined time intervals, and the germination of the plants was monitored day by day. Morphological data (stalk thickness, height, cob length) for the chosen plants were recorded. From the data recorded over the whole vegetative period, we performed appropriate statistical processing. One part of the experiments was the measurement of the coefficient of reflection in the visible range. Correlation estimates were calculated and discussed for our results. The main conclusion was that there was a significant increment in plant height and an elongation of cob length for corn.

  19. Laser influence to biosystems

    OpenAIRE

    Jevtić Sanja D.; Srećković Milesa Ž.; Pelemiš Svetlana S.; Konstantinović Ljubica M.; Jovanić Predrag B.; Petrović Lazar D.; Dukić Milan M.

    2015-01-01

    In this paper continuous-wave (cw) lasers in the visible region were applied in order to study the influence of quantum generators on certain plants. The aim of such projects is to analyse biostimulation processes in living organisms, which are linked to defined laser power density thresholds (exposition doses). The results of irradiating corn and wheat seeds using a He-Ne laser in the cw regime (632.8 nm, 50 mW) are presented and compared to results for other laser ...

  20. An art report to analyze research status for the establishment of the space food development and future food system using the advanced food technology

    International Nuclear Information System (INIS)

    The quality of food for astronauts accomplishing missions in space is one of the most important matters, and it is time to study and develop Korean space food for Korean astronauts. Therefore, at the beginning of the space exploration era, it is necessary to establish a national long-term plan for the study and development of Korean space food, in order to provide better-quality food for astronauts on space missions. Using current food processing, preservation, and packaging technology, it is necessary to develop Korean space food, provide it to Korean astronauts working at the international space station, and study future space food systems for long-term space voyages and planetary habitat bases. Space food is analyzed through nutritional analysis, sensory evaluation, storage studies, packaging evaluations, and many other methods before its final shipment on the space shuttle. Each technology developed for the advanced food system must provide the required attributes, including safety, nutrition, and acceptability. The duration of exploration-class missions is anticipated to be at least 2-3 years, and one of the biggest challenges for these missions will be to provide acceptable food with a shelf life of 3-5 years. The development of space food processing and preservation technology, and its ripple effects, will contribute to improving the nation's international standing, and the developed space food can potentially be used as combat rations and emergency/special food, as in the U.S.A. In the 21st-century era of space exploration, the development of the advanced food system and life support systems at a Mars base as well as on the space shuttle will strengthen the capability to lead the future space exploration era

  1. AN ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) TESTING OF TWO HYDROGEN SULFIDE ANALYZERS: HORIBA INSTRUMENTS, INC., APSA-360 AND TELEDYNE-API MODEL 101E

    Science.gov (United States)

    The Environmental Technology Verification (ETV) Program, beginning as an initiative of the U.S. Environmental Protection Agency (EPA) in 1995, verifies the performance of commercially available, innovative technologies that can be used to measure environmental quality. The ETV p...

  2. Biochemical Technology Program progress report for the period January 1--June 30, 1976. [Centrifugal analyzers and advanced analytical systems for blood and body fluids

    Energy Technology Data Exchange (ETDEWEB)

    Mrochek, J.E.; Burtis, C.A.; Scott, C.D. (comps.)

    1976-09-01

    This document, which covers the period January 1-June 30, 1976, describes progress in the following areas: (1) advanced analytical techniques for the clinical laboratory, (2) fast clinical analyzers, (3) development of a miniaturized analytical clinical laboratory system, (4) centrifugal fast analyzers for animal toxicological studies, and (5) chemical profile of body fluids.

  3. (Environmental technology)

    Energy Technology Data Exchange (ETDEWEB)

    Boston, H.L.

    1990-10-12

    The traveler participated in a conference on environmental technology in Paris, sponsored by the US Embassy-Paris, the US Environmental Protection Agency (EPA), the French Environmental Ministry, and others. The traveler sat on a panel on environmental aspects of energy technology and made a presentation on the potential contributions of Oak Ridge National Laboratory (ORNL) to a planned French-American Environmental Technologies Institute in Chattanooga, Tennessee, and Evry, France. This institute would provide opportunities for international cooperation on environmental issues and technology transfer related to environmental protection, monitoring, and restoration at US Department of Energy (DOE) facilities. The traveler also attended the Fourth International Conference on Environmental Contamination in Barcelona. Conference topics included environmental chemistry, land disposal of wastes, treatment of toxic wastes, micropollutants, trace organics, artificial radionuclides in the environment, and the use of biomonitoring and biosystems for environmental assessment. The traveler presented a paper on "The Fate of Radionuclides in Sewage Sludge Applied to Land." Those findings corresponded well with results from studies addressing the fate of fallout radionuclides from the Chernobyl nuclear accident. There was an exchange of new information on a number of topics of interest to DOE waste management and environmental restoration needs.

  4. Waste Not, Want Not: Analyzing the Economic and Environmental Viability of Waste-to-Energy (WTE) Technology for Site-Specific Optimization of Renewable Energy Options

    Energy Technology Data Exchange (ETDEWEB)

    Funk, K.; Milford, J.; Simpkins, T.

    2013-02-01

    Waste-to-energy (WTE) technology burns municipal solid waste (MSW) in an environmentally safe combustion system to generate electricity, provide district heat, and reduce the need for landfill disposal. While this technology has gained acceptance in Europe, it has yet to be commonly recognized as an option in the United States. Section 1 of this report provides an overview of WTE as a renewable energy technology and describes a high-level model developed to assess the feasibility of WTE at a site. Section 2 reviews results from previous life cycle assessment (LCA) studies of WTE, and then uses an LCA inventory tool to perform a screening-level analysis of cost, net energy production, greenhouse gas (GHG) emissions, and conventional air pollution impacts of WTE for residual MSW in Boulder, Colorado. Section 3 of this report describes the federal regulations that govern the permitting, monitoring, and operating practices of MSW combustors and provides emissions limits for WTE projects.

  5. Systemic structural modular generalization of the crystallography of bound water applied to study the mechanisms of processes in biosystems at the atomic and molecular level

    International Nuclear Information System (INIS)

    The main reasons of the modern scientific revolution, one of the consequences of which are nanotechnologies and the development of interdisciplinary overall natural science (which can build potentially possible atomic structures and study the mechanisms of the processes occurring in them), are considered. The unifying role of crystallography in the accumulation of interdisciplinary knowledge is demonstrated. This generalization of crystallography requires the introduction of a new concept: a module which reflects the universal condition for stability of all real and potential and equilibrium and nonequilibrium structures of matter (their connectivity). A modular generalization of crystallography covers all forms of solids, including the structure of bound water (a system-forming matrix for the self-organization and morphogenesis of hierarchical biosystems which determines the metric selection of all other structural components of these systems). A dynamic model of the water surface layer, which serves as a matrix in the formation of Langmuir monolayers and plays a key role in the occurrence of life on the Earth, is developed.

  6. Policy issues on technology development for high-level radioactive waste disposal from the viewpoint of intergenerational equity. Through analyzing the arguments on retrievability

    International Nuclear Information System (INIS)

    It is recognized that social and political perspectives are crucial for decision-making of the high-level radioactive waste (HLW) disposal. Understanding and addressing value judgments involved in decision-making becomes the essential part of the efforts to gain public acceptance of HLW disposal. This study discusses policy issues raised by the change of the concept of intergenerational equity, which is one of the most important principles for the value judgments concerning HLW disposal. This principle is widening its scope: in addition to the obligation of the current generation to minimize risk, cost and burden for the future generation, the obligation to maintain equal opportunities for the future generation is recently emphasized. Serious consideration of retrievability is regarded as a symbol of this change. This study focuses on the implications of the above change in the strategy of disposal technology development. It points out that comprehensive consideration including the viewpoints of ensuring retrievability and gaining confidence in robustness of the disposal system is required to optimize the methodology of HLW disposal and that the optimized methodology should be evaluated from the social perspective to ensure high level public consensus. Moreover, this study makes suggestions on development of technology elements to provide a firm basis for optimization. (author)

  7. Persistent GnRH receptor activation in pituitary αT3-1 cells analyzed with a label-free technology.

    Science.gov (United States)

    Nederpelt, I; Vergroesen, R D; IJzerman, A P; Heitman, L H

    2016-05-15

    The gonadotropin-releasing hormone (GnRH) receptor is a drug target for certain hormone-dependent diseases such as prostate cancer. In this study, we examined the activation profiles of the endogenous ligand, GnRH and a well-known marketed analog, buserelin using a label-free assay in pituitary αT3-1 cells with endogenous GnRH receptor expression. This whole cell impedance-based technology allows for the real-time measurement of morphological cellular changes. Both agonists dose-dependently decreased the impedance as a result of GnRH receptor activation with potencies of 9.3±0.1 (pEC50 value, buserelin) and 7.8±0.06 (pEC50 value, GnRH). Subsequently, GnRH receptor activation was completely abolished with a selective Gαq inhibitor, thereby confirming the Gαq-coupling of the GnRH receptor in pituitary αT3-1 cells. Additionally, we observed continued responses after agonist stimulation of αT3-1 cells indicating long-lasting cellular effects. Wash-out experiments demonstrated that the long-lasting effects induced by GnRH were most likely caused by rebinding since over 70% of the original response was abolished after wash-out. In contrast, a long receptor residence time was responsible for the prolonged effects caused by buserelin, with over 70% of the original response remaining after wash-out. In summary, we validated that impedance-based label-free technology is suited for studying receptor-mediated activation in cell lines endogenously expressing the target of interest. Moreover, this real-time monitoring allows the examination of binding kinetics and its influence on receptor activation at a cellular level. PMID:26774084
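The reported pEC50 values translate into concentration-response curves through the standard sigmoidal (Hill) model; the impedance readout is simply the response variable. A sketch using the abstract's potencies (buserelin pEC50 9.3, GnRH 7.8) with an assumed Hill slope of 1:

```python
# Sigmoidal concentration-response: response as a function of agonist
# concentration, parameterized by pEC50 (Hill slope assumed = 1).
def hill_response(conc_m, pec50, emax=1.0, n_h=1.0):
    """Fractional response for agonist concentration conc_m (molar)."""
    ec50 = 10.0 ** (-pec50)
    return emax * conc_m ** n_h / (ec50 ** n_h + conc_m ** n_h)

# At its EC50, an agonist produces exactly half the maximal response.
print(hill_response(10 ** -9.3, pec50=9.3))  # → 0.5
# The 1.5 log-unit potency difference (9.3 vs 7.8) means buserelin is
# roughly 30-fold more potent than GnRH in this readout.
```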

  8. Field Evaluation of MERCEM Mercury Emission Analyzer System at the Oak Ridge TSCA Incinerator East Tennessee Technology Park Oak Ridge, Tennessee

    Energy Technology Data Exchange (ETDEWEB)

    None

    2000-03-01

    The authors reached the following conclusions: (1) The two-month evaluation of the MERCEM total mercury monitor from Perkin Elmer provided a useful venue in determining the feasibility of using a CEM to measure total mercury in a saturated flue gas. (2) The MERCEM exhibited potential at a mixed waste incinerator to meet requirements proposed in PS12 under conditions of operation with liquid feeds only at stack mercury concentrations in the range of proposed MACT standards. (3) Performance of the MERCEM under conditions of incinerating solid and liquid wastes simultaneously was less reliable than while feeding liquid feeds only for the operating conditions and configuration of the host facility. (4) The permeation tube calibration method used in this test relied on the CEM internal volumetric and time constants to relate back to a concentration, whereas a compressed gas cylinder concentration is totally independent of the analyzer mass flowmeter and flowrates. (5) Mercury concentration in the compressed gas cylinders was fairly stable over a 5-month period. (6) The reliability of available reference materials was not fully demonstrated without further evaluation of their incorporation into routine operating procedures performed by facility personnel. (7) The degree of mercury control occurring in the TSCA Incinerator off-gas cleaning system could not be quantified from the data collected in this study. (8) It was possible to conduct the demonstration at a facility incinerating radioactively contaminated wastes and to release the equipment for later unrestricted use elsewhere. (9) Experience gained by this testing answered additional site-specific and general questions regarding the operation and maintenance of CEMs and their use in compliance monitoring of total mercury emissions from hazardous waste incinerators.

  9. Ring Image Analyzer

    Science.gov (United States)

    Strekalov, Dmitry V.

    2012-01-01

    Ring Image Analyzer software analyzes images to recognize elliptical patterns. It determines the ellipse parameters (axes ratio, centroid coordinate, tilt angle). The program attempts to recognize elliptical fringes (e.g., Newton Rings) on a photograph and determine their centroid position, the short-to-long-axis ratio, and the angle of rotation of the long axis relative to the horizontal direction on the photograph. These capabilities are important in interferometric imaging and control of surfaces. In particular, this program has been developed and applied for determining the rim shape of precision-machined optical whispering gallery mode resonators. The program relies on a unique image recognition algorithm aimed at recognizing elliptical shapes, but can be easily adapted to other geometric shapes. It is robust against non-elliptical details of the image and against noise. Interferometric analysis of precision-machined surfaces remains an important technological instrument in hardware development and quality analysis. This software automates and increases the accuracy of this technique. The software was developed for the needs of an R&TD-funded project and has become an important asset for future research proposals to NASA as well as other agencies.

  10. Portable Programmable Multifunction Body Fluids Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Advanced Liquid Logic proposes to develop a very capable analyzer based on its digital microfluidic technology. Such an analyzer would be:  Capable of both...

  11. To Analyze the Science and Technology Novelty Searching System of Chinese Medicine and Establish Countermeasures%中医药科技查新系统分析及对策研究

    Institute of Scientific and Technical Information of China (English)

    黄薇

    2015-01-01

    This study analyzes the current science and technology novelty searching system of Chinese medicine, identifies its problems and their causes, and addresses how to improve the quality and efficiency of novelty searching work, putting forward constructive countermeasures such as enriching the spare-time life of novelty search staff and standardizing the novelty searching criteria for Chinese medicine. By analyzing and solving these problems, the novelty searching system of Chinese medicine is improved. The study concludes that before clinical research begins, a thorough and detailed novelty search must be carried out by the novelty search management department to determine whether the proposed project's innovations have research value; after the research is completed, its novelty should be confirmed again through an evaluation by the science and technology novelty searching system of Chinese medicine.

  12. Serum Protein Fingerprint of Patients with Pancreatic Cancer by SELDI Technology

    Institute of Scientific and Technical Information of China (English)

    MA Ning; GE Chun-lin; LUAN Feng-ming; YAO Dian-bo; HU Chao-jun; LI Ning; LIU Yong-feng

    2008-01-01

    Objective: To study the serum protein fingerprint of patients with pancreatic cancer and to screen for protein molecules closely related to pancreatic cancer during the onset and progression of the disease using surface-enhanced laser desorption and ionization time-of-flight mass spectrometry (SELDI-TOF-MS). Methods: Serum samples were collected from 20 patients with pancreatic cancer, 20 healthy volunteers, and 18 patients with other pancreatic diseases. WCX magnetic beads and a PBSII-C protein chip reader (Ciphergen Biosystems Inc.) were used. The protein fingerprint expression of all serum samples, and the resulting profiles of cancer versus normal, were analyzed with the Biomarker Wizard system. Results: A group of proteomic peaks was detected. Four differentially expressed potential biomarkers were identified, with relative molecular weights of 5705 Da, 4935 Da, 5318 Da and 3243 Da. Among them, the two proteins at m/z 5705 and 5318 Da were down-regulated, and the two at m/z 4935 and 3243 Da were up-regulated, in pancreatic cancers. Conclusion: SELDI technology can be used to screen proteins that are significantly differentially expressed in the serum of pancreatic cancer patients. These proteins could serve as specific serum biomarkers for pancreatic cancer and merit further investigation.

  13. 我国校企技术转移效率及影响因素分析%Analyzing on University-Enterprise Technology Transfer Efficiency and Its Influencing Factors in China

    Institute of Scientific and Technical Information of China (English)

    廖述梅; 徐升华

    2009-01-01

    Technology transfer from universities to enterprises is one of the key tasks in building a national innovation system. Based on panel data for 27 Chinese provinces and municipalities from 2000 to 2006, this paper measures the efficiency of university-to-enterprise technology transfer using the SFE method and analyzes the factors behind inefficiency. The results show that overall transfer efficiency is low, with large differences across regions; internal and environmental factors, such as patents and per-capita regional R&D investment, have a significant impact on technology transfer. Finally, policy suggestions for improving university-enterprise technology transfer efficiency are put forward from the perspectives of government, universities, and enterprises.

  14. Analyzing crime scene videos

    Science.gov (United States)

    Cunningham, Cindy C.; Peloquin, Tracy D.

    1999-02-01

    Since late 1996 the Forensic Identification Services Section of the Ontario Provincial Police has been actively involved in state-of-the-art image capture and the processing of video images extracted from crime scene videos. The benefits and problems of this technology for video analysis are discussed. All analysis is being conducted on SUN Microsystems UNIX computers, networked to a digital disk recorder that is used for video capture. The primary advantage of this system over traditional frame grabber technology is reviewed. Examples from actual cases are presented and the successes and limitations of this approach are explored. Suggestions to companies implementing security technology plans for various organizations (banks, stores, restaurants, etc.) will be made. Future directions for this work and new technologies are also discussed.

  15. Analyzing in the Present

    DEFF Research Database (Denmark)

    Revsbæk, Line; Pedersen, Lene Tanggaard

    2015-01-01

    The article presents a notion of “analyzing in the present” as a source of inspiration in analyzing qualitative research materials. The term emerged from extensive listening to interview recordings during everyday commuting to university campus. Paying attention to the way different parts of...... various interviews conveyed diverse significance to the listening researcher at different times became a method of continuously opening up the empirical material in a reflexive, breakdown-oriented process of analysis. We argue that situating analysis in the present of analyzing emphasizes and acknowledges...... contributes to an ongoing methodological conversation problematizing the notion of “data” and the use of “data-reliant” methods of analysis....

  16. Miniature mass analyzer

    CERN Document Server

    Cuna, C; Lupsa, N; Cuna, S; Tuzson, B

    2003-01-01

    The paper presents the concept of different mass analyzers specifically designed as small-dimension instruments able to detect the main environmental pollutants with great sensitivity and accuracy. Mass spectrometers are well-suited instruments for the chemical and isotopic analysis needed in environmental surveillance. Usually, this is done by sampling the soil, air or water followed by laboratory analysis. To avoid drawbacks caused by sample alteration during the sampling process and transport, 'in situ' analysis is preferred. Theoretically, any type of mass analyzer can be miniaturized, but some are more appropriate than others. Quadrupole mass filters and traps, magnetic sector, time-of-flight and ion cyclotron mass analyzers can all be successfully shrunk; for each of them some performance is sacrificed, and one must know which parameters must be kept unchanged. To satisfy the miniaturization criteria of the analyzer, it is necessary to use asymmetrical geometries, with ion beam obl...

  17. Software Design Analyzer System

    Science.gov (United States)

    Tausworthe, R. C.

    1985-01-01

    The CRISP80 software design analyzer system is a set of programs that supports top-down, hierarchic, modular, structured design and programming methodologies. CRISP80 allows for the expression of a design as a picture of the program.

  18. Total organic carbon analyzer

    Science.gov (United States)

    Godec, Richard G.; Kosenka, Paul P.; Smith, Brian D.; Hutte, Richard S.; Webb, Johanna V.; Sauer, Richard L.

    1991-01-01

    The development and testing of a breadboard version of a highly sensitive total-organic-carbon (TOC) analyzer are reported. Attention is given to the system components including the CO2 sensor, oxidation reactor, acidification module, and the sample-inlet system. Research is reported for an experimental reagentless oxidation reactor, and good results are reported for linearity, sensitivity, and selectivity in the CO2 sensor. The TOC analyzer is developed with gravity-independent components and is designed for minimal additions of chemical reagents. The reagentless oxidation reactor is based on electrolysis and UV photolysis and is shown to be potentially useful. The stability of the breadboard instrument is shown to be good on a day-to-day basis, and the analyzer is capable of 5 sample analyses per day for a period of about 80 days. The instrument can provide accurate TOC and TIC measurements over a concentration range of 20 ppb to 50 ppm C.

  19. Technology

    Directory of Open Access Journals (Sweden)

    Xu Jing

    2016-01-01

    Full Text Available The traditional method of reading answer cards uses OMR (Optical Mark Reader) hardware, which requires special-purpose cards, offers little versatility, and is costly. To address these problems, an answer card identification method based on pattern recognition is proposed. A Line Segment Detector is used to measure the tilt of the scanned image; tilted images are corrected by rotation, and the answer regions of the sheet are then located and detected. Automatic reading based on pattern recognition achieves high accuracy and fast detection.
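The tilt-correction step described above can be illustrated in a few lines, assuming the edge points of a nominally horizontal reference line (for instance the sheet's top border) have already been extracted; the function names are hypothetical and this is a sketch of the geometry, not the paper's implementation:

```python
import numpy as np

def estimate_tilt_deg(xs, ys):
    """Fit a straight line y = m*x + c to edge points of a reference line
    and return its tilt angle in degrees."""
    slope, _ = np.polyfit(xs, ys, 1)
    return np.degrees(np.arctan(slope))

def deskew_points(pts, tilt_deg, origin):
    """Rotate mark coordinates back by the estimated tilt about `origin`,
    mimicking the rotation correction applied to a tilted scan."""
    r = np.radians(-tilt_deg)
    R = np.array([[np.cos(r), -np.sin(r)],
                  [np.sin(r),  np.cos(r)]])
    return (pts - origin) @ R.T + origin
```

In a real pipeline the rotation would be applied to the image itself (with interpolation); rotating the detected mark coordinates, as here, shows the same correction on the cheap.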

  20. Fuel analyzer; Analisador de combustiveis

    Energy Technology Data Exchange (ETDEWEB)

    Cozzolino, Roberval [RS Motors, Indaiatuba, SP (Brazil)

    2008-07-01

    The current technology 'COMBUSTIMETRO' aims to examine fuel through the performance of an engine, since the role of the fuel is to produce energy for the combustion engine in a form that is directly proportional to the quality and type of fuel. The 'COMBUSTIMETRO' has an engine that always keeps the same air intake, fuel supply and fixed ignition point. Its operation is monitored by sensors (lambda probe, RPM sensor and gas analyzer) connected to a processor that performs calculations, records the information, and generates reports and graphs. (author)

  1. List mode multichannel analyzer

    Science.gov (United States)

    Archer, Daniel E.; Luke, S. John; Mauger, G. Joseph; Riot, Vincent J.; Knapp, David A.

    2007-08-07

    A digital list mode multichannel analyzer (MCA) built around a programmable FPGA device for onboard data analysis and on-the-fly modification of system detection/operating parameters, and capable of collecting and processing data in very small time bins (<1 millisecond) when used in histogramming mode, or in list mode as a list mode MCA.
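As a software sketch of what offline histogramming of list-mode data might look like, assuming events arrive as (timestamp, pulse height) pairs; the binning parameters here are illustrative and not part of the patented design:

```python
import numpy as np

def histogram_list_mode(timestamps_s, pulse_heights, n_channels=1024,
                        adc_max=1023, time_bin_s=1e-4):
    """Rebin list-mode events (time, amplitude) into per-time-bin spectra.
    time_bin_s=1e-4 gives 0.1 ms bins, finer than the <1 ms quoted above."""
    t_edges = np.arange(0.0, timestamps_s.max() + time_bin_s, time_bin_s)
    ch_edges = np.linspace(0, adc_max + 1, n_channels + 1)
    hist, _, _ = np.histogram2d(timestamps_s, pulse_heights,
                                bins=[t_edges, ch_edges])
    return hist  # shape: (n_time_bins, n_channels)
```

The point of list mode is visible here: because the raw (time, amplitude) pairs are retained, the same data can be rebinned after the fact with any time resolution, which a conventional histogramming MCA cannot do.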

  2. Centrifugal analyzer development

    International Nuclear Information System (INIS)

    The development of the centrifuge fast analyzer (CFA) is reviewed. The development of a miniature CFA with computer data analysis is reported and applications for automated diagnostic chemical and hematological assays are discussed. A portable CFA system with microprocessor was adapted for field assays of air and water samples for environmental pollutants, including ammonia, nitrates, nitrites, phosphates, sulfates, and silica. 83 references

  3. Portable pulse height analyzing system

    International Nuclear Information System (INIS)

    A low-power, battery-operated, compact Portable Pulse Height Analyzing System/Multi Channel Analyzer (PMCA) has been designed and developed for monitoring various low-activity radioisotopes in situ. The PMCA can also be used in mobile radiation monitoring vans, wherein gamma spectrum data collected at different locations can be stored in the battery-backed RAM disk and downloaded to a PC via a serial link. Designed primarily for measurement and analysis of isotope activity and for field experiments, it can be used with most radiation detectors used for pulse height spectrum analysis. The PMCA is built around an embedded-PC hardware architecture in which all cards are made with state-of-the-art technology, with extensive use of SMT and ASICs. The PMCA provides features comparable with standard laboratory models and enables computation of integral area, background area, net peak area, FWHM, peak centroid and energy calibration in the field. This paper describes the Portable Pulse Height Analyzing System with focus on the following features: (a) hardware implementation of the well-known multichannel analyzer technique using an embedded-PC hardware architecture; (b) software implementation of spectrum acquisition and analysis using a high-level language, namely C. (author)
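The in-field peak computations mentioned (net peak area, peak centroid, FWHM) can be sketched as follows, assuming a simple straight-line background drawn between the region's endpoints; this is an illustration of the standard calculations, not the PMCA's actual code:

```python
import numpy as np

def analyze_peak(counts, lo, hi):
    """Net area, centroid and FWHM of a peak spanning channels [lo, hi].
    Background is modeled as a straight line between the two region
    endpoints, a common simple scheme (an assumption here)."""
    ch = np.arange(lo, hi + 1)
    y = counts[lo:hi + 1].astype(float)
    bg = np.linspace(y[0], y[-1], len(y))   # linear background estimate
    net = y - bg
    net_area = net.sum()
    centroid = (ch * net).sum() / net_area
    # Gaussian-equivalent FWHM from the second moment: FWHM = 2.355 * sigma
    sigma = np.sqrt(((ch - centroid) ** 2 * net).sum() / net_area)
    return net_area, centroid, 2.355 * sigma
```

For an energy-calibrated spectrum the centroid in channels would then be mapped to keV through the calibration line, which is the remaining step the abstract lists.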

  4. Analyzed Using Statistical Moments

    International Nuclear Information System (INIS)

    Diffraction enhanced imaging (DEI) is a new x-ray imaging method derived from radiography. The method uses a monochromatic x-ray beam and introduces an analyzer crystal between the object and the detector. The narrow angular acceptance of the analyzer crystal generates improved contrast over conventional radiography. While standard radiography can produce an 'absorption image', DEI produces 'apparent absorption' and 'apparent refraction' images of superior quality. Objects with similar absorption properties may not be distinguished with conventional techniques due to close absorption coefficients. This problem becomes more dominant when an object has scattering properties. A simple approach is introduced to utilize scattered radiation to obtain 'pure absorption' and 'pure refraction' images.

  5. PhosphoSiteAnalyzer

    DEFF Research Database (Denmark)

    Bennetzen, Martin V; Cox, Jürgen; Mann, Matthias; Andersen, Jens S.

    2012-01-01

    Phosphoproteomic experiments are routinely conducted in laboratories worldwide, and because of the fast development of mass spectrometric techniques and efficient phosphopeptide enrichment methods, researchers frequently end up having lists with tens of thousands of phosphorylation sites for...... sets that have been subjected to kinase prediction using the previously published NetworKIN algorithm. NetworKIN applies sophisticated linear motif analysis and contextual network modeling to obtain kinase-substrate associations with high accuracy and sensitivity. PhosphoSiteAnalyzer provides an...

  6. Lear CAN analyzer

    OpenAIRE

    Peiró Ibañez, Felipe

    2013-01-01

    Since it was introduced in the automotive industry, the protocol CAN (Controller Area Network) has been widely used for its benefits. This has led many companies to offer several hardware and software solutions in order to monitor the communications that gives this protocol. The current master thesis presents the Lear CAN Analyzer as a software tool developed within the company LEAR Corporation. It is designed to be used in the automobile industry as a complement or substitute for other co...

  7. Analyzing business process management

    OpenAIRE

    Skjæveland, Børge

    2013-01-01

    Within the Oil & Gas Industry, the market is constantly growing more competitive, forcing companies to continually adapt to changes. Companies need to cut costs and improve the business efficiency. One way of successfully managing these challenges is to implement business process management in the organization. This thesis will analyze how Oceaneering Asset Integrity AS handled the implementation of a Business Process Management System and the effects it had on the employees. The main goal...

  8. Magnetoresistive Emulsion Analyzer

    OpenAIRE

    Lin, Gungun; Baraban, Larysa; Han, Luyang; Karnaushenko, Daniil; Makarov, Denys; Cuniberti, Gianaurelio; Schmidt, Oliver G.

    2013-01-01

    We realize a magnetoresistive emulsion analyzer capable of detection, multiparametric analysis and sorting of ferrofluid-containing nanoliter-droplets. The operation of the device in a cytometric mode provides high throughput and quantitative information about the dimensions and magnetic content of the emulsion. Our method offers important complementarity to conventional optical approaches involving ferrofluids, and paves the way to the development of novel compact tools for diagnostics and n...

  9. Radioisotope analyzer of barium

    International Nuclear Information System (INIS)

    The principle of operation and construction of the radioisotope barium sulphate analyzer type MZB-2, for fast determination of barium sulphate content in barite ores and enrichment products, are described. The gauge, equipped with an Am-241 source and a scintillation detector, enables measurement of barium sulphate content in prepared samples of barite ores in the range 60%-100% with an accuracy of 1%. The gauge is used in the laboratories of a barite mine and ore processing plant. 2 refs., 2 figs., 1 tab. (author)

  10. IPv6 Protocol Analyzer

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    With the emergence of the next-generation Internet protocol (IPv6), it is expected to replace the current version of the Internet protocol (IPv4), whose address space will be exhausted in the near future. Besides providing adequate address space, the new 128-bit IP includes other new features such as address autoconfiguration, quality of service, simpler routing, security, mobility and multicasting. Current protocol analyzers are not able to handle IPv6 packets. This paper focuses on developing a protocol analyzer that decodes IPv6 packets. The IPv6 protocol analyzer is an application module which decodes an IPv6 packet and provides a detailed breakdown of its construction. It has to understand the detailed construction of IPv6 and provide a high-level abstraction of the bits and bytes of the IPv6 packet. It thus increases a network administrator's understanding of the protocol and helps in solving protocol-related problems in an IPv6 network environment.
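The fixed 40-byte IPv6 base header that such an analyzer must decode has a well-defined layout (RFC 8200). A generic sketch of the decoding step, not the paper's implementation:

```python
import socket
import struct

def decode_ipv6_header(packet: bytes) -> dict:
    """Decode the fixed 40-byte IPv6 base header (RFC 8200 layout)."""
    if len(packet) < 40:
        raise ValueError("an IPv6 base header is 40 bytes")
    # First 8 bytes: 32-bit word (version/traffic class/flow label),
    # then payload length (16 bits), next header (8), hop limit (8).
    first_word, payload_len, next_header, hop_limit = struct.unpack(
        "!IHBB", packet[:8])
    return {
        "version": first_word >> 28,                 # should be 6
        "traffic_class": (first_word >> 20) & 0xFF,
        "flow_label": first_word & 0xFFFFF,
        "payload_length": payload_len,
        "next_header": next_header,                  # 6=TCP, 17=UDP, 58=ICMPv6
        "hop_limit": hop_limit,
        "src": socket.inet_ntop(socket.AF_INET6, packet[8:24]),
        "dst": socket.inet_ntop(socket.AF_INET6, packet[24:40]),
    }
```

A full analyzer would then follow the `next_header` chain through any extension headers before handing the payload to a TCP/UDP/ICMPv6 decoder.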

  11. Methods for Analyzing Genomes

    OpenAIRE

    Ståhl, Patrik L.

    2010-01-01

    The human genome reference sequence has given us a two‐dimensional blueprint of our inherited code of life, but we need to employ modern‐day technology to expand our knowledge into a third dimension. Inter‐individual and intra‐individual variation has been shown to be larger than anticipated, and the mode of genetic regulation more complex. Therefore, the methods that were once used to explain our fundamental constitution are now used to decipher our differences. Over the past four years, thr...

  12. Fluorescence analyzer for lignin

    Science.gov (United States)

    Berthold, John W.; Malito, Michael L.; Jeffers, Larry

    1993-01-01

    A method and apparatus for measuring lignin concentration in a sample of wood pulp or black liquor comprises a light-emitting arrangement for emitting excitation light through optical fiber bundles into a probe which has an undiluted sensing end facing the sample. The excitation light causes the lignin to produce fluorescent emission light, which is then conveyed through the probe to analyzing equipment that measures the intensity of the emission light. This invention was made with Government support under Contract Number DOE: DE-FC05-90CE40905 awarded by the Department of Energy (DOE). The Government has certain rights in this invention.

  13. Analyzing Chinese Financial Reporting

    Institute of Scientific and Technical Information of China (English)

    SABRINA; ZHANG

    2008-01-01

    If the world's capital markets could use a harmonized accounting framework, it would not be necessary to compare two or more sets of accounting standards. However, there is much to do before this becomes reality. This article aims to present a general overview of China's Generally Accepted Accounting Principles (GAAP), U.S. Generally Accepted Accounting Principles and International Financial Reporting Standards (IFRS), and to analyze the differences among IFRS, U.S. GAAP and China GAAP using fixed assets as an example.

  14. Field Deployable DNA analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Wheeler, E; Christian, A; Marion, J; Sorensen, K; Arroyo, E; Vrankovich, G; Hara, C; Nguyen, C

    2005-02-09

    This report details the feasibility of a field deployable DNA analyzer. Steps for swabbing cells from surfaces and extracting DNA in an automatable way are presented. Since enzymatic amplification reactions are highly sensitive to environmental contamination, sample preparation is a crucial step in making an autonomous deployable instrument. We perform sample clean-up and concentration in a flow-through packed bed. For small initial samples, whole genome amplification is performed in the packed bed, resulting in enough product for subsequent PCR amplification. In addition to DNA, which can be used to identify a subject, protein is also left behind, the analysis of which can be used to determine exposure to certain substances, such as radionuclides. Our preparative step for DNA analysis left behind the protein complement as a waste stream; we set out to learn whether the proteins themselves could be analyzed in a fieldable device. We successfully developed a two-step lateral flow assay for protein analysis and demonstrate a proof-of-principle assay.

  15. Residual gas analyzer calibration

    Science.gov (United States)

    Lilienkamp, R. H.

    1972-01-01

    A technique which employs known gas mixtures to calibrate the residual gas analyzer (RGA) is described. The mass spectra from the RGA are recorded for each gas mixture. These mass spectra data and the mixture composition data each form a matrix. From the two matrices the calibration matrix may be computed. The matrix mathematics requires that the number of calibration gas mixtures be equal to or greater than the number of gases included in the calibration. This technique was evaluated using a mathematical model of an RGA to generate the mass spectra. This model included shot noise errors in the mass spectra. Errors in the gas concentrations were also included in the evaluation. The effects of these errors were studied by varying their magnitudes and comparing the resulting calibrations. Several methods of evaluating an actual calibration are presented. The effects of the number of gases included, the composition of the calibration mixture, and the number of mixtures used are discussed.
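The matrix mathematics described can be sketched with ordinary least squares: stack the recorded spectra and the known mixture compositions as matrices and solve for the calibration matrix, which then inverts future spectra into concentrations. A minimal illustration under those assumptions, not the paper's exact procedure:

```python
import numpy as np

def calibration_matrix(spectra, compositions):
    """Solve spectra ≈ compositions @ K for K in the least-squares sense.

    spectra:      (n_mixtures, n_masses) recorded RGA spectra
    compositions: (n_mixtures, n_gases)  known mole fractions
    As the abstract notes, n_mixtures must be >= n_gases."""
    K, *_ = np.linalg.lstsq(compositions, spectra, rcond=None)
    return K  # (n_gases, n_masses): per-gas sensitivity/cracking pattern

def composition_from_spectrum(spectrum, K):
    """Invert a measured spectrum into gas concentrations using K."""
    c, *_ = np.linalg.lstsq(K.T, spectrum, rcond=None)
    return c
```

Using more calibration mixtures than gases overdetermines the system, which is exactly what lets the least-squares solution average out shot noise in the recorded spectra.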

  16. Analyzing Cosmic Bubble Collisions

    CERN Document Server

    Gobbetti, Roberto

    2012-01-01

    We develop a set of controlled, analytic approximations to study the effects of bubble collisions on cosmology. We expand the initial perturbation to the inflaton field caused by the collision in a general power series, and determine its time evolution during inflation in terms of the coefficients in the expansion. In models where the observer's bubble undergoes sufficient slow-roll inflation to solve the flatness problem, in the thin wall limit only one coefficient in the expansion is relevant to observational cosmology, allowing nearly model-independent predictions. We discuss two approaches to determining the initial perturbation to the inflaton and the implications for the sign of the effect (a hot or cold spot on the Cosmic Microwave Background temperature map). Lastly, we analyze the effects of collisions with thick-wall bubbles, i.e. away from the thin-wall limit.

  17. Analyzing business models

    DEFF Research Database (Denmark)

    Nielsen, Christian

    2014-01-01

    New types of disclosure and reporting are argued to be vital in order to convey a transparent picture of the true state of the company. However, they are unfortunately not without problems as these types of information are somewhat more complex than the information provided in the traditional......, because the costs of processing and analyzing it exceed the benefits indicating bounded rationality. Hutton (2002) concludes that the analyst community’s inability to raise important questions on quality of management and the viability of its business model inevitably led to the Enron debacle. There seems...... to be evidence of the fact that all types of corporate stakeholders from management to employees, owners, the media and politicians have grave difficulties in interpreting new forms of reporting. One hypothesis could be that if managements’ own understanding of value creation is disclosed to the...

  18. Analyzing architecture articles

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    In the present study, we express the quality, function, and characteristics of architecture to help people comprehensively understand what architecture is. We also reveal the problems and conflict found in population, land, water resources, pollution, energy, and the organization systems in construction. China’s economy is transforming. We should focus on the cities, architectural environment, energy conservation, emission-reduction, and low-carbon output that will result in successful green development. We should macroscopically and microscopically analyze the development, from the natural environment to the artificial environment; from the relationship between human beings and nature to the combination of social ecology in cities, and farmlands. We must learn to develop and control them harmoniously and scientifically to provide a foundation for the methods used in architecture research.

  19. PDA: Pooled DNA analyzer

    Directory of Open Access Journals (Sweden)

    Lin Chin-Yu

    2006-04-01

    Full Text Available Abstract Background Association mapping using abundant single nucleotide polymorphisms is a powerful tool for identifying disease susceptibility genes for complex traits and exploring possible genetic diversity. Genotyping large numbers of SNPs individually is performed routinely but is cost prohibitive for large-scale genetic studies. DNA pooling is a reliable and cost-saving alternative genotyping method. However, no software has been developed for complete pooled-DNA analyses, including data standardization, allele frequency estimation, and single/multipoint DNA pooling association tests. This motivated the development of the software 'PDA' (Pooled DNA Analyzer) to analyze pooled DNA data. Results We developed the software PDA for the analysis of pooled-DNA data. PDA is originally implemented in the MATLAB® language, but it can also be executed on a Windows system without installing MATLAB®. PDA provides estimates of the coefficient of preferential amplification and allele frequency. PDA considers an extended single-point association test, which can compare allele frequencies between two DNA pools constructed under different experimental conditions. Moreover, PDA also provides novel chromosome-wide multipoint association tests based on p-value combinations and a sliding-window concept. This new multipoint testing procedure overcomes a computational bottleneck of conventional haplotype-oriented multipoint methods in DNA pooling analyses and can handle data sets having a large pool size and/or large numbers of polymorphic markers. All of the PDA functions are illustrated in four bona fide examples. Conclusion PDA is simple to operate and does not require that users have a strong statistical background. The software is available at http://www.ibms.sinica.edu.tw/%7Ecsjfann/first%20flow/pda.htm.
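As an illustration of allele frequency estimation with a coefficient of preferential amplification, here is the standard pooled-DNA correction formula; this is a generic sketch of the technique, not necessarily PDA's exact implementation:

```python
def pooled_allele_freq(peak_a, peak_b, k):
    """Estimate the pool frequency of allele A from the two allele signal
    intensities, corrected by the coefficient of preferential
    amplification k (commonly estimated from heterozygous individuals,
    where the true ratio is 1:1, as the mean of A/B).
    Standard correction: p = A / (A + k * B)."""
    return peak_a / (peak_a + k * peak_b)
```

Without the correction (k = 1), an allele that amplifies 20% more strongly would bias every pooled frequency estimate; dividing out k removes that systematic error.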

  20. Optimizing Technology of Volatile Oil from Flower of Gomphrena globosa and Analyzing Extracting its Chemical Compositions%千日红挥发油提取工艺优化及其化学成分分析

    Institute of Scientific and Technical Information of China (English)

    黄良勤; 王刚

    2014-01-01

    To determine the optimal extraction process for the volatile oil of Gomphrena globosa L. and to analyze its chemical composition, the volatile oil was extracted by microwave-assisted extraction (MAE). Extraction time, solid-liquid ratio and microwave power were investigated using the extraction yield of volatile oil as the index, and the extraction process was optimized by orthogonal test. The chemical composition of the volatile oil was analyzed by gas chromatography-mass spectrometry (GC-MS); the percentage content of each component was determined by the normalization method, and component structures were identified against the Wiley and NIST05 standard mass spectral libraries. The optimal extraction conditions were a solid-liquid ratio of 1:14, an extraction time of 5 h and a microwave power of 700 W; under these conditions, the extraction yield of volatile oil reached 1.34%. In total, 33 components were identified, accounting for 98.19% of the total volatile oil. The study provides a scientific basis for the further development and utilization of Gomphrena globosa.

  1. Diversity of endophytic bacteria in walnut analyzed by Illumina MiSeq high-throughput sequencing technology%Illumina MiSeq高通量测序分析核桃内生细菌多样性

    Institute of Scientific and Technical Information of China (English)

    陈泽斌; 李冰; 王定康; 余磊; 徐胜光; 任禛; 靳松; 张永福; 彭声静

    2015-01-01

    In this study, the species abundance and alpha diversity of walnut endophytic bacteria were analyzed by Illumina MiSeq high-throughput sequencing of the 16S rDNA V4 region. Software tools such as Uparse, Flash, and Qiime were employed to sort sequences and compute operational taxonomic units (OTUs); 63,183 effective sequences and 103 OTUs were obtained. The rarefaction curves showed that sequencing depth was adequate and the number of OTUs was close to saturation. The Chao1 index of the walnut samples was 105.143 and the Shannon diversity index was 1.823. The majority of the endophytic bacteria belonged to Sphingomonas (27.27%), Halomonas (27.27%) and Agrobacterium (45.45%), making these the dominant genera of endophytic bacteria in walnut. Illumina MiSeq high-throughput sequencing technology provides more accurate and scientific data resources for the study of endophytic bacteria.

  2. Pseudostupidity and analyzability.

    Science.gov (United States)

    Cohn, L S

    1989-01-01

    This paper seeks to heighten awareness of pseudostupidity and the potential analyzability of patients who manifest it by defining and explicating it, reviewing the literature, and presenting in detail the psychoanalytic treatment of a pseudostupid patient. Pseudostupidity is caused by an inhibition of the integration and synthesis of thoughts resulting in a discrepancy between intellectual capacity and apparent intellect. The patient's pseudostupidity was determined in part by his need to prevent his being more successful than father, i.e., defeating his oedipal rival. Knowing and learning were instinctualized. The patient libidinally and defensively identified with father's passive, masochistic position. He needed to frustrate the analyst as he had felt excited and frustrated by his parents' nudity and thwarted by his inhibitions. He wanted to cause the analyst to feel as helpless as he, the patient, felt. Countertransference frustration was relevant and clinically useful in the analysis. Interpretation of evolving relevant issues led to more anxiety and guilt, less pseudostupidity, a heightened alliance, and eventual working through. Negative therapeutic reactions followed the resolution of pseudostupidity. PMID:2708771

  3. Downhole Fluid Analyzer Development

    Energy Technology Data Exchange (ETDEWEB)

    Bill Turner

    2006-11-28

    A novel fiber optic downhole fluid analyzer has been developed for operation in production wells. This device will allow real-time determination of the oil, gas and water fractions of fluids from different zones in a multizone or multilateral completion environment. The device uses near infrared spectroscopy and induced fluorescence measurement to unambiguously determine the oil, water and gas concentrations at all but the highest water cuts. The only downhole components of the system are the fiber optic cable and windows. All of the active components--light sources, sensors, detection electronics and software--will be located at the surface, and will be able to operate multiple downhole probes. Laboratory testing has demonstrated that the sensor can accurately determine oil, water and gas fractions with a less than 5 percent standard error. Once installed in an intelligent completion, this sensor will give the operating company timely information about the fluids arising from various zones or multilaterals in a complex completion pattern, allowing informed decisions to be made on controlling production. The research and development tasks are discussed along with a market analysis.
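Determining oil, water, and gas fractions from multi-wavelength absorbance can be framed as a linear mixture model: the measured signal at each wavelength is the fraction-weighted sum of pure-component responses. The sketch below solves a square three-wavelength case by Gaussian elimination; the absorptivity matrix is invented for illustration and is not the instrument's actual calibration.

```python
def solve3(A, b):
    """Solve a 3x3 linear system Ax = b by Gaussian elimination with partial pivoting."""
    n = 3
    M = [row[:] + [bi] for row, bi in zip(A, b)]   # augmented matrix
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# Hypothetical pure-component responses at three NIR wavelengths
# (rows: wavelengths; columns: oil, water, gas). Values are illustrative only.
A = [[0.90, 0.10, 0.02],
     [0.20, 0.80, 0.05],
     [0.05, 0.15, 0.60]]
true_fractions = [0.6, 0.3, 0.1]
measured = [sum(a * f for a, f in zip(row, true_fractions)) for row in A]
fractions = solve3(A, measured)   # recovers ~[0.6, 0.3, 0.1]
```

With more wavelengths than components, the same model would be solved by least squares instead of a square solve.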

  4. Analyzing Pseudophosphatase Function.

    Science.gov (United States)

    Hinton, Shantá D

    2016-01-01

    Pseudophosphatases regulate signal transduction cascades, but their mechanisms of action remain enigmatic. Reflecting this mystery, the prototypical pseudophosphatase STYX (phospho-serine-threonine/tyrosine-binding protein) was named with allusion to the river of the dead in Greek mythology to emphasize that these molecules are "dead" phosphatases. Although proteins with STYX domains do not catalyze dephosphorylation, this in no way precludes their having other functions as integral elements of signaling networks. Thus, understanding their roles in signaling pathways may mark them as potential novel drug targets. This chapter outlines common strategies used to characterize the functions of pseudophosphatases, using as an example MK-STYX [mitogen-activated protein kinase (MAPK) phospho-serine-threonine/tyrosine binding], which has been linked to tumorigenesis, apoptosis, and neuronal differentiation. We start with the importance of "restoring" (when possible) phosphatase activity in a pseudophosphatase so that the active mutant may be used as a comparison control throughout immunoprecipitation and mass spectrometry analyses. To this end, we provide protocols for site-directed mutagenesis, mammalian cell transfection, co-immunoprecipitation, phosphatase activity assays, and immunoblotting that we have used to investigate MK-STYX and the active mutant MK-STYXactive. We also highlight the importance of utilizing RNA interference (RNAi) "knockdown" technology to determine a cellular phenotype in various cell lines. Therefore, we outline our protocols for introducing short hairpin RNA (shRNA) expression plasmids into mammalian cells and quantifying knockdown of gene expression with real-time quantitative PCR (qPCR). A combination of cellular, molecular, biochemical, and proteomic techniques has served as powerful tools in identifying novel functions of the pseudophosphatase MK-STYX. Likewise, the information provided here should be a helpful guide to elucidating the

  5. Digital Microfluidics Sample Analyzer

    Science.gov (United States)

    Pollack, Michael G.; Srinivasan, Vijay; Eckhardt, Allen; Paik, Philip Y.; Sudarsan, Arjun; Shenderov, Alex; Hua, Zhishan; Pamula, Vamsee K.

    2010-01-01

    Three innovations address the needs of the medical world with regard to microfluidic manipulation and testing of physiological samples in ways that can benefit point-of-care needs for patients such as premature infants, for which drawing of blood for continuous tests can be life-threatening in their own right, and for expedited results. A chip with sample injection elements, reservoirs (and waste), droplet formation structures, fluidic pathways, mixing areas, and optical detection sites, was fabricated to test the various components of the microfluidic platform, both individually and in integrated fashion. The droplet control system permits a user to control droplet microactuator system functions, such as droplet operations and detector operations. Also, the programming system allows a user to develop software routines for controlling droplet microactuator system functions, such as droplet operations and detector operations. A chip is incorporated into the system with a controller, a detector, input and output devices, and software. A novel filler fluid formulation is used for the transport of droplets with high protein concentrations. Novel assemblies for detection of photons from an on-chip droplet are present, as well as novel systems for conducting various assays, such as immunoassays and PCR (polymerase chain reaction). The lab-on-a-chip (a.k.a., lab-on-a-printed-circuit board) processes physiological samples and comprises a system for automated, multi-analyte measurements using sub-microliter samples of human serum. The invention also relates to a diagnostic chip and system including the chip that performs many of the routine operations of a central labbased chemistry analyzer, integrating, for example, colorimetric assays (e.g., for proteins), chemiluminescence/fluorescence assays (e.g., for enzymes, electrolytes, and gases), and/or conductometric assays (e.g., for hematocrit on plasma and whole blood) on a single chip platform.

  6. Managing healthcare information: analyzing trust.

    Science.gov (United States)

    Söderström, Eva; Eriksson, Nomie; Åhlfeldt, Rose-Mharie

    2016-08-01

    Purpose - The purpose of this paper is to analyze two case studies with a trust matrix tool, to identify trust issues related to electronic health records. Design/methodology/approach - A qualitative research approach is applied using two case studies. The data analysis of these studies generated a problem list, which was mapped to a trust matrix. Findings - Results demonstrate flaws in current practices and point to achieving balance between organizational, person and technology trust perspectives. The analysis revealed three challenge areas, to: achieve higher trust in patient-focussed healthcare; improve communication between patients and healthcare professionals; and establish clear terminology. By taking trust into account, a more holistic perspective on healthcare can be achieved, where trust can be obtained and optimized. Research limitations/implications - A trust matrix is tested and shown to identify trust problems on different levels and relating to trusting beliefs. Future research should elaborate and more fully address issues within three identified challenge areas. Practical implications - The trust matrix's usefulness as a tool for organizations to analyze trust problems and issues is demonstrated. Originality/value - Healthcare trust issues are captured to a greater extent and from previously uncharted perspectives. PMID:27477934

  7. Soft Decision Analyzer

    Science.gov (United States)

    Steele, Glen; Lansdowne, Chatwin; Zucha, Joan; Schlensinger, Adam

    2013-01-01

    The Soft Decision Analyzer (SDA) is an instrument that combines hardware, firmware, and software to perform real-time closed-loop end-to-end statistical analysis of single- or dual-channel serial digital RF communications systems operating in very low signal-to-noise conditions. As an innovation, the unique SDA capabilities allow it to perform analysis of situations where the receiving communication system slips bits due to low signal-to-noise conditions or experiences constellation rotations resulting in channel polarity inversions or channel assignment swaps. The SDA's closed-loop detection allows it to instrument a live system and correlate observations with frame, codeword, and packet losses, as well as Quality of Service (QoS) and Quality of Experience (QoE) events. The SDA's abilities are not confined to performing analysis in low signal-to-noise conditions. Its analysis provides in-depth insight into a communication system's receiver performance in a variety of operating conditions. The SDA incorporates two techniques for identifying slips. The first is an examination of the received data stream's content relative to the transmitted data content, and the second is a direct examination of the receiver's recovered clock signals relative to a reference. Both techniques provide benefits in different ways and allow the communication engineer evaluating test results increased confidence and understanding of receiver performance. Direct examination of data contents is performed by two different techniques, power correlation or a modified Massey correlation, and can be applied to soft decision data widths 1 to 12 bits wide over a correlation depth ranging from 16 to 512 samples. The SDA detects receiver bit slips within a 4-bit window and can handle systems with up to four quadrants (QPSK, SQPSK, and BPSK systems). The SDA continuously monitors correlation results to characterize slips and quadrant changes and is capable of performing analysis even when the
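The slip-detection idea, correlating the received stream against the transmitted reference over a small window of candidate offsets, can be sketched as follows. This is a toy hard-decision version, not the SDA's soft-decision power or Massey correlators:

```python
def detect_slip(tx, rx, max_slip=4):
    """Return (offset, agreement) for the bit offset within +/-max_slip
    that best aligns the received stream rx with the reference tx."""
    best_off, best_score = 0, -1.0
    for off in range(-max_slip, max_slip + 1):
        # Positive offset: receiver dropped bits; negative: receiver inserted bits.
        a, b = (tx[off:], rx) if off >= 0 else (tx, rx[-off:])
        n = min(len(a), len(b))
        score = sum(x == y for x, y in zip(a[:n], b[:n])) / n
        if score > best_score:
            best_off, best_score = off, score
    return best_off, best_score

# Demo: the receiver dropped two bits from a period-8 reference pattern.
tx = [1, 0, 1, 1, 0, 0, 1, 0] * 8
rx = tx[2:]
print(detect_slip(tx, rx))   # -> (2, 1.0)
```

A real analyzer would run this continuously over a sliding window so that slips can be time-stamped and correlated with frame and packet losses.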

  8. Investigation of techniques for analyzing and evaluating the effects of newly developed new-energy and energy-saving technologies; Shinsho energy gijutsu kaihatsu koka no bunseki hyoka shuho ni kansuru chosa

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-03-01

    For a comprehensive and rational evaluation of technologies related to new energy and energy saving, the history and trends of related policies and technological development have been investigated. Japan saw a great change in its energy demand-supply structure in the 10 to 15 years following the first oil crisis, as oil was partially replaced with nuclear power and LNG and energy-saving efforts came into practice. However, diversification of energy sources through the development of new energies and improvement of the energy self-sufficiency rate remain far from complete. Related policies both in Japan and overseas are shifting from the conventional pursuit of stable supply and economic growth toward deregulation and environmental problems. Energy strategy has been studied not as a mere economic task but by asking what energy demand should look like given the constraints of resources, the social economy, and the environment. Models and scenarios are proposed for the quantitative analysis and evaluation of an energy strategy in which the development of new-energy and energy-saving technologies is clearly positioned. Particular emphasis is placed on techniques for power source planning and on the development of renewable energy. 24 refs., 32 figs., 15 tabs.

  9. Systems Analyze Water Quality in Real Time

    Science.gov (United States)

    2010-01-01

    A water analyzer developed under Small Business Innovation Research (SBIR) contracts with Kennedy Space Center now monitors treatment processes at water and wastewater facilities around the world. Originally designed to provide real-time detection of nutrient levels in hydroponic solutions for growing plants in space, the ChemScan analyzer, produced by ASA Analytics Inc., of Waukesha, Wisconsin, utilizes spectrometry and chemometric algorithms to automatically analyze multiple parameters in the water treatment process with little need for maintenance, calibration, or operator intervention. The company has experienced a compound annual growth rate of 40 percent over its 15-year history as a direct result of the technology's success.

  10. Briefly Analyzing a Scheme for Biomass and Coal Co-gasification Technology in a Fluidized Bed

    Institute of Scientific and Technical Information of China (English)

    毕可军; 毛少祥; 孔北方; 柏林红

    2012-01-01

    Addressing the problem that biomass is difficult to gasify on its own, the author discusses a complementary technical scheme for the co-gasification of biomass with coal: the physical properties and gasification characteristics of biomass are introduced; the technical features and process flow of ash-agglomerating fluidized-bed pulverized-coal gasification technology are described; a technical scheme for co-gasifying biomass with coal on the basis of this gasification technology is presented; and solution measures for the remaining problems are proposed.

  11. A tandem parallel plate analyzer

    International Nuclear Information System (INIS)

    By a new modification of a parallel plate analyzer, the second-order focus is obtained at an arbitrary injection angle. An analyzer of this kind with a small injection angle has the advantage of a small operating voltage, compared to the Proca and Green analyzer, where the injection angle is 30 degrees. Thus, the newly proposed analyzer will be very useful for the precise energy measurement of high-energy particles in the MeV range. (author)

  12. Analyzing the Biology on the System Level

    Institute of Scientific and Technical Information of China (English)

    Wei Tong

    2004-01-01

    Although various genome projects have provided us with enormous amounts of static sequence information, understanding sophisticated biology still requires integrating computational modeling, system analysis, technology development for experiments, and quantitative experiments to analyze biological architecture at various levels; this is the origin of the systems biology discipline. This review discusses the subject, characteristics, and research focus of systems biology, and summarizes the analysis methods, experimental technologies, and research developments in the four key fields of systems biology: systemic structures, dynamics, control methods, and design principles.

  13. Integrating Biosystem Models Using Waveform Relaxation

    Directory of Open Access Journals (Sweden)

    Seymour, Robert M.

    2008-01-01

    Modelling in systems biology often involves the integration of component models into larger composite models. How to do this systematically and efficiently is a significant challenge: coupling of components can be unidirectional or bidirectional, and of variable strengths. We adapt the waveform relaxation (WR) method for parallel computation of ODEs as a general methodology for computing systems of linked submodels. Four test cases are presented: (i) a cascade of unidirectionally and bidirectionally coupled harmonic oscillators, (ii) deterministic and stochastic simulations of calcium oscillations, (iii) single cell calcium oscillations showing complex behaviour such as periodic and chaotic bursting, and (iv) a multicellular calcium model for a cell plate of hepatocytes. We conclude that WR provides a flexible means to deal with multi-time-scale computation and model heterogeneity. Global solutions over time can be captured independently of the solution techniques for the individual components, which may be distributed in different computing environments.
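Waveform relaxation iterates over whole time-courses: each subsystem is integrated against the other subsystem's waveform from the previous sweep until the waveforms stop changing. A minimal Gauss-Jacobi sketch for two bidirectionally coupled linear ODEs, using forward Euler and an invented coupling strength:

```python
def waveform_relaxation(c=0.5, T=1.0, n=200, sweeps=40, tol=1e-12):
    """Gauss-Jacobi waveform relaxation for the coupled pair
       x' = -x + c*y,  y' = -y + c*x,  x(0) = 1, y(0) = 0.
    Each sweep re-integrates one equation using the OTHER variable's
    waveform from the previous sweep (forward Euler on a fixed grid)."""
    h = T / n
    x = [1.0] * (n + 1)   # initial guess for the whole x waveform
    y = [0.0] * (n + 1)   # initial guess for the whole y waveform
    for sweep in range(sweeps):
        x_new, y_new = [1.0], [0.0]
        for i in range(n):
            # Each subsystem sees only the other's PREVIOUS waveform.
            x_new.append(x_new[i] + h * (-x_new[i] + c * y[i]))
            y_new.append(y_new[i] + h * (-y_new[i] + c * x[i]))
        delta = max(max(abs(a - b) for a, b in zip(x, x_new)),
                    max(abs(a - b) for a, b in zip(y, y_new)))
        x, y = x_new, y_new
        if delta < tol:
            break
    return x, y, sweep + 1
```

At the fixed point the sweeps reproduce the forward-Euler solution of the fully coupled system, yet each subsystem could be integrated by its own solver on its own machine, which is the point the paper makes about model heterogeneity. On a finite interval the iteration converges superlinearly, so a few dozen sweeps suffice for weak coupling.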

  14. Integrating Biosystem Models Using Waveform Relaxation

    Directory of Open Access Journals (Sweden)

    Stephen Baigent

    2008-12-01

    Modelling in systems biology often involves the integration of component models into larger composite models. How to do this systematically and efficiently is a significant challenge: coupling of components can be unidirectional or bidirectional, and of variable strengths. We adapt the waveform relaxation (WR) method for parallel computation of ODEs as a general methodology for computing systems of linked submodels. Four test cases are presented: (i) a cascade of unidirectionally and bidirectionally coupled harmonic oscillators, (ii) deterministic and stochastic simulations of calcium oscillations, (iii) single cell calcium oscillations showing complex behaviour such as periodic and chaotic bursting, and (iv) a multicellular calcium model for a cell plate of hepatocytes. We conclude that WR provides a flexible means to deal with multi-time-scale computation and model heterogeneity. Global solutions over time can be captured independently of the solution techniques for the individual components, which may be distributed in different computing environments.

  15. Fiber optic chemical sensor (FOCS®) technology for the detection of hydrocarbons in air and dissolved in water: PetroSense® Portable Hydrocarbon Analyzer (PHA 100) and the PetroSense® CMS 5000

    International Nuclear Information System (INIS)

    The PetroSense® Portable Hydrocarbon Analyzer (PHA 100) and the PetroSense® CMS 5000 for continuous monitoring are instruments which can detect hydrocarbons dissolved in water as well as hydrocarbons in the air. They allow the user to obtain real-time, in situ data on the hydrocarbon levels present. They can also measure the progress of any remediation that is occurring. The technology used by these instruments is described in this paper.

  16. The information technology professionals and their personality analyzed by the Rorschach technique

    Directory of Open Access Journals (Sweden)

    Seille Cristine Garcia Santos

    2005-12-01

    This article presents the results of a comparative study of analytical ability, initiative, and human relations among IT (Information Technology) managers and IT systems analysts and programmers. Sixty-six IT professionals from nine companies and five IT departments with up to 150 employees, in Porto Alegre and its metropolitan region, were surveyed. The Rorschach technique (Klopfer System) and a structured questionnaire were applied; the same questionnaire was answered by the immediate superior of each participant. The results show no significant difference (t-test and Pearson correlation) between IT managers and IT systems analysts and programmers regarding analytical ability, initiative, and human relations. The operational group differed from the managerial one in releasing emotional reactions with less control. The presence in both groups of Rorschach indicators of difficulty interacting with other people is discussed.

  17. Research on environmental bioecosensing technology using ecological information; Seitaikei joho ni yoru kankyo bio eco sensing gijutsu ni kansuru chosa

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-03-01

    Bioecosensing technology, which detects and identifies weak signals generated by biosystem communication in the wider biological environment, was studied. The following were reported as notable current environmental biosensing technologies: quick measurement of environmental contaminants by immunological methods, analysis of the ecological state of microorganisms using DNA probes, observation of ecosystems by bioluminescent systems, measurement of environmental changes and contaminants using higher animals and plants, and detection of chemical contaminants using the chemotaxis of microorganisms. As a result, a new molecular-level bioecosensing/monitoring technology was suggested for identifying, as changes in the ecosystem, comprehensive environmental changes that could not be measured by previous physical and chemical methods. For wide-area remote sensing of environmental ecological information, sensing from the ground, aircraft, and satellites was also discussed. 247 refs., 55 figs., 17 tabs.

  18. Analyzing Valuation Practices through Contracts

    DEFF Research Database (Denmark)

    Tesnière, Germain; Labatut, Julie; Boxenbaum, Eva

    This paper seeks to analyze the most recent changes in how societies value animals. We analyze this topic through the prism of contracts between breeding companies and farmers. Focusing on new valuation practices and qualification of breeding animals, we question the evaluation of difficult...

  19. Analyzing data files in SWAN

    CERN Document Server

    Gajam, Niharika

    2016-01-01

    Traditionally, data analysis happens via batch processing and interactive work on the terminal. The project aims to provide another way of analyzing data files: a cloud-based approach. It aims to be a productive and interactive environment through the combination of FCC and SWAN software.

  20. Transcriptome characteristics of Paspalum vaginatum analyzed with Illumina sequencing technology

    Institute of Scientific and Technical Information of China (English)

    贾新平; 叶晓青; 梁丽建; 邓衍明; 孙晓波; 佘建明

    2014-01-01

    The transcriptome of Paspalum vaginatum leaf was sequenced on an Illumina HiSeq 2000 platform, a new-generation high-throughput sequencing technology, to study expression profiles and predict functional genes. A total of 47520544 reads containing 4752054400 bp of sequence information were generated. Initial sequence assembly produced 81220 unigenes containing 87542503 bp, with an average length of 1077 bp. Unigene quality was assessed in several respects, such as length distribution, GC content, and gene expression level; the sequencing data were of high quality and reliability. In total, 46169 unigenes were annotated by BLAST searches against the Nr, Nt, and SwissProt databases. By gene ontology, the assembled unigenes could be broadly divided into biological process, cellular component, and molecular function categories comprising 48 branches, including metabolic process, binding, and cellular process. The unigenes were further annotated by COG category into 25 functional categories, and by metabolic pathway into 112 classes, including the phenylalanine metabolism pathway, plant-pathogen interaction, plant hormone biosynthesis and signal transduction, flavonoid biosynthesis, terpenoid backbone biosynthesis, lipid metabolism, and RNA degradation. There were 22721 SSRs in the 81220 unigenes; among them, A/T was the most frequent repeat, followed by CCG/CGG and AGC/CTG. This study is the first comprehensive transcriptome analysis of Paspalum vaginatum, providing a valuable genomic data source for the molecular biology of this grass.

  1. VoIP Quality Analyzer

    OpenAIRE

    Havelka, Ondřej

    2011-01-01

    This thesis deals with the quality of IP telephony and its measurement using the netflow technology. It describes the individual factors influencing quality, from sampling and quantization, through the impairment caused by codecs, to degradation during network transfers. The next part focuses on models for assessing the quality of IP telephony, with emphasis on the E-model and R-factor, and briefly describes the netflow technology and the quality measurement based on it. The practical part describ...

  2. Nuclear fuel microsphere gamma analyzer

    Science.gov (United States)

    Valentine, Kenneth H.; Long, Jr., Ernest L.; Willey, Melvin G.

    1977-01-01

    A gamma analyzer system is provided for the analysis of nuclear fuel microspheres and other radioactive particles. The system consists of an analysis turntable with means for loading, in sequence, a plurality of stations within the turntable; a gamma ray detector for determining the spectrum of a sample at one station; means for analyzing the spectrum; and a receiver turntable to collect the analyzed material in stations according to the spectrum analysis. Accordingly, particles may be sorted according to their quality; e.g., fuel particles with fractured coatings may be separated from those that are not fractured, or according to other properties.

  3. M-Learning and Technological Literacy: Analyzing Benefits for Apprenticeship

    Science.gov (United States)

    Cortés, Carlos Manuel Pacheco; Cortés, Adriana Margarita Pacheco

    2014-01-01

    The following study consists on comparative literature review conducted by several researchers and instructional designers; for a wide comprehension of Mobile-Learning (abbreviated "M-Learning") as an educational platform to provide "anytime-anywhere" access to interactions and resources on-line, and "Technological…

  4. Achievement report for fiscal 1997 on developing a silicon manufacturing process with reduced energy consumption. Investigation and research on analyzing practical application of a technology to manufacture solar cell silicon raw materials; 1997 nendo energy shiyo gorika silicon seizo process kaihatsu. Taiyo denchi silicon genryo seizo gijutsu no jitsuyoka kaiseki ni kansuru chosa kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-03-01

    This paper describes the fiscal 1997 achievements in analyzing the practical application of a technology for manufacturing solar cell silicon raw materials. Silicon consumption for solar cells in fiscal 1997 increased to the 2000-ton level, and supply has been very tight. For a drastic improvement in the supply-demand situation, development of SOG-Si manufacturing technology and its early practical application are desired. The development of NEDO's mass-production technology using melting and refining will complete construction of the process facilities in fiscal 1998 and enter the stage of operational research; however, insufficient basic data on the behavior of impurities is inhibiting the development. In substrate manufacturing technology, progress has been made in discussions on using diverse off-standard silicons via the electromagnetic casting process. For slicing and processing the substrates, development of high-performance slicing equipment and an automatic rough-rinsing machine is under way. The properties required of silicon raw materials vary considerably because of differences in cell-making systems and conditions, which is attributable to unknown impurity behavior. Assuming 1 GW production, the cell module manufacturing cost is calculated as 137 yen/W, whose realization requires low-cost mass production, enhanced slicing productivity, and cost reduction. The paper also describes site surveys in overseas countries. (NEDO)

  5. SCAN: a Fortran syntax analyzer

    International Nuclear Information System (INIS)

    SCAN is a computer program which analyzes the syntax of a Fortran program. It reads the statements of a Fortran program, checks their grammatical validity, and produces tables of the analyzed results and intermediate codes for further use. SCAN recognizes the Fortran syntax of the Japan Industrial Standards-7000, plus some Fortran-H statements. This report presents the structure of SCAN, the methods SCAN uses to analyze statements, and the tables and intermediate Backus-form texts produced by SCAN. SCAN itself is also written in Fortran and consists of about 5000 statements. With slight modifications, SCAN may be useful for any application that needs Fortran syntax analysis. (author)

  6. Loviisa nuclear power plant analyzer

    International Nuclear Information System (INIS)

    The APROS Simulation Environment has been developed since 1986 by Imatran Voima Oy (IVO) and the Technical Research Centre of Finland (VTT). It provides tools, solution algorithms and process components for use in different simulation systems for design, analysis and training purposes. One of its main nuclear applications is the Loviisa Nuclear Power Plant Analyzer (LPA). The Loviisa Plant Analyzer includes all the important plant components both in the primary and in the secondary circuits. In addition, all the main control systems, the protection system and the high voltage electrical systems are included. (orig.)

  7. Analyzing the Grammar of English

    CERN Document Server

    Teschner, Richard V

    2007-01-01

    Analyzing the Grammar of English offers a descriptive analysis of the indispensable elements of English grammar. Designed to be covered in one semester, this textbook starts from scratch and takes nothing for granted beyond a reading and speaking knowledge of English. Extensively revised to function better in skills-building classes, it includes more interspersed exercises that promptly test what is taught, simplified and clarified explanations, greatly expanded and more diverse activities, and a new glossary of over 200 technical terms.Analyzing the Grammar of English is the only English gram

  8. Analyzing petabytes of data with Hadoop

    CERN Document Server

    CERN. Geneva

    2009-01-01

    The open source Apache Hadoop project provides a powerful suite of tools for storing and analyzing petabytes of data using commodity hardware. After several years of production use inside of web companies like Yahoo! and Facebook and nearly a year of commercial support and development by Cloudera, the technology is spreading rapidly through other disciplines, from financial services and government to life sciences and high energy physics. The talk will motivate the design of Hadoop and discuss some key implementation details in depth. It will also cover the major subprojects in the Hadoop ecosystem, go over some example applications, highlight best practices for deploying Hadoop in your environment, discuss plans for the future of the technology, and provide pointers to the many resources available for learning more. In addition to providing more information about the Hadoop platform, a major goal of this talk is to begin a dialogue with the ATLAS research team on how the tools commonly used in t...

  9. Complete denture analyzed by optical coherence tomography

    Science.gov (United States)

    Negrutiu, Meda L.; Sinescu, Cosmin; Todea, Carmen; Podoleanu, Adrian G.

    2008-02-01

    The complete dentures are currently made using different technologies. In order to avoid deficiencies of the prostheses made using the classical technique, several alternative systems and procedures were devised, directly related to the material used and also to the manufacturing technology. Thus, at the present time, there are several injecting systems and technologies on the market that use chemoplastic materials, which are heat cured (90-100°C), in a dry or wet environment, or cold cured (below 60°C). There are also technologies that plasticize a hard cured material by thermoplastic processing (without any chemical changes) and then inject it into a mold. The purpose of this study was to analyze the existence of possible defects in several dental prostheses using a non-invasive method, before their insertion in the mouth. Different dental prostheses, fabricated from various materials, were investigated using en-face optical coherence tomography. In order to discover the defects, the scanning was made in three planes, obtaining images at different depths, from 0.01 μm to 2 mm. In several of the investigated prostheses we found defects which may cause their fracture. These defects are totally included in the prosthesis material and cannot be visualized with other imaging methods. In conclusion, en-face OCT is an important investigative tool for dental practice.

  10. The kpx, a program analyzer for parallelization

    International Nuclear Information System (INIS)

    The kpx is a program analyzer, developed as a common technological basis for promoting parallel processing. The kpx consists of three tools. The first is ktool, that shows how much execution time is spent in program segments. The second is ptool, that shows parallelization overhead on the Paragon system. The last is xtool, that shows parallelization overhead on the VPP system. The kpx, designed to work for any FORTRAN code on any UNIX computer, is confirmed to work well after testing on Paragon, SP2, SR2201, VPP500, VPP300, Monte-4, SX-4 and T90. (author)

  11. Proteomics: an efficient tool to analyze nematode proteins

    Science.gov (United States)

    Proteomic technologies have been successfully used to analyze proteins structure and characterization in plants, animals, microbes and humans. We used proteomics methodologies to separate and characterize soybean cyst nematode (SCN) proteins. Optimizing the quantity of proteins required to separat...

  12. The Photo-Pneumatic CO2 Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — We are proposing to build a new technology, the photo-pneumatic analyzer. It is small, solid-state, inexpensive, and appropriate for observations of atmospheric...

  13. Analyzing Software Piracy in Education.

    Science.gov (United States)

    Lesisko, Lee James

    This study analyzes the controversy of software piracy in education. It begins with a real world scenario that presents the setting and context of the problem. The legalities and background of software piracy are explained and true court cases are briefly examined. Discussion then focuses on explaining why individuals and organizations pirate…

  14. Helping Students Analyze Business Documents.

    Science.gov (United States)

    Devet, Bonnie

    2001-01-01

    Notes that student writers gain greater insight into the importance of audience by analyzing business documents. Discusses how business writing teachers can help students understand the rhetorical refinements of writing to an audience. Presents an assignment designed to lead writers systematically through an analysis of two advertisements. (SG)

  15. Software-Design-Analyzer System

    Science.gov (United States)

    Tausworthe, Robert C.

    1991-01-01

    CRISP-90 software-design-analyzer system, update of CRISP-80, is set of computer programs constituting software tool for design and documentation of other software and supporting top-down, hierarchical, modular, structured methodologies for design and programming. Written in Microsoft QuickBasic.

  16. FORTRAN Static Source Code Analyzer

    Science.gov (United States)

    Merwarth, P.

    1984-01-01

    FORTRAN Static Source Code Analyzer program, SAP (DEC VAX version), automatically gathers statistics on occurrences of statements and structures within a FORTRAN program and provides reports of those statistics. Provisions are made for weighting each statistic and providing an overall figure of complexity.
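The kind of weighted complexity figure SAP reports can be sketched as follows. The keyword weights here are hypothetical, since the abstract does not give SAP's actual weighting scheme:

```python
# Hypothetical weights; SAP's actual weighting scheme is not described here.
WEIGHTS = {"IF": 2.0, "DO": 2.0, "GOTO": 3.0, "CALL": 1.5}

def complexity(source):
    # Count occurrences of weighted statement keywords, then sum
    # weight * count to produce an overall figure of complexity.
    stats = {kw: 0 for kw in WEIGHTS}
    for line in source.splitlines():
        tokens = line.upper().split()
        for kw in WEIGHTS:
            stats[kw] += tokens.count(kw)
    figure = sum(WEIGHTS[kw] * n for kw, n in stats.items())
    return stats, figure

src = "      DO 10 I=1,N\n      IF (A(I) .GT. 0) CALL SUB(A(I))\n10    CONTINUE"
stats, figure = complexity(src)
```

SAP additionally accumulates these statistics module by module and then sums them for the complete input source file.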

  17. The Statistical Loop Analyzer (SLA)

    Science.gov (United States)

    Lindsey, W. C.

    1985-01-01

    The statistical loop analyzer (SLA) is designed to automatically measure the acquisition, tracking and frequency stability performance characteristics of symbol synchronizers, code synchronizers, carrier tracking loops, and coherent transponders. Automated phase lock and system level tests can also be made using the SLA. Standard baseband, carrier and spread spectrum modulation techniques can be accommodated. Through the SLA's phase error jitter and cycle slip measurements the acquisition and tracking thresholds of the unit under test are determined; any false phase and frequency lock events are statistically analyzed and reported in the SLA output in probabilistic terms. Automated signal drop-out tests can be performed in order to troubleshoot algorithms and evaluate the reacquisition statistics of the unit under test. Cycle slip rates and cycle slip probabilities can be measured using the SLA. These measurements, combined with bit error probability measurements, are all that are needed to fully characterize the acquisition and tracking performance of a digital communication system.
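A cycle-slip rate measurement of the sort the SLA performs can be sketched from phase-error samples. The threshold-crossing detection rule below is an assumption for illustration; the SLA's internal criterion is not described in the abstract:

```python
import math

def count_cycle_slips(phase_error, threshold=math.pi):
    # Flag a slip each time the tracking error leaves the ±threshold band
    # (hypothetical detection rule, for illustration only).
    slips = 0
    prev_inside = abs(phase_error[0]) < threshold
    for e in phase_error[1:]:
        inside = abs(e) < threshold
        if prev_inside and not inside:
            slips += 1
        prev_inside = inside
    return slips

# Toy phase-error record (radians) with two excursions past ±pi
samples = [0.1, 0.5, 3.3, 0.2, -0.4, -3.5, 0.0]
slips = count_cycle_slips(samples)
slip_rate = slips / len(samples)
```

Dividing the slip count by the observation length gives the cycle-slip rate; repeating the measurement over many trials yields the slip probability the SLA reports.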

  18. Methods for Analyzing Social Media

    DEFF Research Database (Denmark)

    Jensen, Jakob Linaa

    2013-01-01

    Social media is becoming increasingly attractive for users. It is a fast way to communicate ideas and a key source of information. It is therefore one of the most influential mediums of communication of our time and an important area for audience research. The growth of social media invites many...... new questions such as: How can we analyze social media? Can we use traditional audience research methods and apply them to online content? Which new research strategies have been developed? Which ethical research issues and controversies do we have to pay attention to? This book focuses on research...... strategies and methods for analyzing social media and will be of interest to researchers and practitioners using social media, as well as those wanting to keep up to date with the subject....

  19. Climate Model Diagnostic Analyzer Web Service System

    Science.gov (United States)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Kubar, T. L.; Li, J.; Zhang, J.; Wang, W.

    2015-12-01

    Both the National Research Council Decadal Survey and the latest Intergovernmental Panel on Climate Change Assessment Report stressed the need for the comprehensive and innovative evaluation of climate models with the synergistic use of global satellite observations in order to improve our weather and climate simulation and prediction capabilities. The abundance of satellite observations for fundamental climate parameters and the availability of coordinated model outputs from CMIP5 for the same parameters offer a great opportunity to understand and diagnose model biases in climate models. In addition, the Obs4MIPs efforts have created several key global observational datasets that are readily usable for model evaluations. However, a model diagnostic evaluation process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally- and data-intensive. In response, we have developed a novel methodology to diagnose model biases in contemporary climate models and implemented the methodology as a web-service based, cloud-enabled, provenance-supported climate-model evaluation system. The evaluation system is named Climate Model Diagnostic Analyzer (CMDA), which is the product of the research and technology development investments of several current and past NASA ROSES programs. The current technologies and infrastructure of CMDA are designed and selected to address several technical challenges that the Earth science modeling and model analysis community faces in evaluating and diagnosing climate models. In particular, we have three key technology components: (1) diagnostic analysis methodology; (2) web-service based, cloud-enabled technology; (3) provenance-supported technology. The diagnostic analysis methodology includes random forest feature importance ranking, conditional probability distribution function, conditional sampling, and time-lagged correlation map. We have implemented the...
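One of the listed diagnostics, the time-lagged correlation map, can be sketched in plain Python on toy series (CMDA's actual implementation is a web service; names and data below are invented for illustration):

```python
def pearson(x, y):
    # Standard Pearson correlation coefficient
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def lagged_correlation(x, y, max_lag):
    # Correlate x against y shifted by each lag; the lag of the peak
    # correlation hints at which variable leads the other.
    return {lag: pearson(x[:len(x) - lag], y[lag:]) if lag else pearson(x, y)
            for lag in range(max_lag + 1)}

x = [1, 2, 3, 4, 5, 4, 3, 2]
y = [0, 1, 2, 3, 4, 5, 4, 3]   # x delayed by one step
corr = lagged_correlation(x, y, 2)
best = max(corr, key=corr.get)  # lag with the strongest correlation
```

Computing such maps for every grid cell of two climate variables yields the kind of lead/lag diagnostic CMDA produces for model evaluation.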

  20. Source-Code-Analyzing Program

    Science.gov (United States)

    Manteufel, Thomas; Jun, Linda

    1991-01-01

    FORTRAN Static Source Code Analyzer program, SAP, developed to gather statistics automatically on occurrences of statements and structures within FORTRAN program and provide for reporting of those statistics. Provisions made to weight each statistic and provide overall figure of complexity. Statistics, as well as figures of complexity, gathered on module-by-module basis. Overall summed statistics also accumulated for complete input source file. Written in FORTRAN IV.

  1. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach

    Science.gov (United States)

    Cheung, Mike W.-L.; Jak, Suzanne

    2016-01-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists—and probably the most crucial one—is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study. PMID:27242639
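The proposed approach can be sketched in plain Python (the paper demonstrates it in R). In this minimal illustration the per-split analysis is a simple mean and the meta-analysis step is a fixed-effect, inverse-variance-weighted pooling; data and function names are invented:

```python
import random
import statistics

def split_analyze_meta(data, k):
    # Split: partition the data into k manageable chunks.
    chunks = [data[i::k] for i in range(k)]
    # Analyze: estimate the mean and its sampling variance in each chunk.
    estimates, variances = [], []
    for c in chunks:
        estimates.append(statistics.fmean(c))
        variances.append(statistics.variance(c) / len(c))  # var of the mean
    # Meta-analyze: pool the estimates with inverse-variance weights
    # (a fixed-effect meta-analysis).
    weights = [1.0 / v for v in variances]
    return sum(w * m for w, m in zip(weights, estimates)) / sum(weights)

random.seed(0)
data = [random.gauss(10, 2) for _ in range(1000)]
pooled = split_analyze_meta(data, 10)  # close to the true mean of 10
```

The point of the approach is that each chunk can be analyzed with familiar psychometric or multivariate methods, and standard meta-analytic machinery then combines the per-chunk results.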

  2. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach

    Directory of Open Access Journals (Sweden)

    Mike W.-L. Cheung

    2016-05-01

    Full Text Available Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists – and probably the most crucial one – is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.

  3. Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach.

    Science.gov (United States)

    Cheung, Mike W-L; Jak, Suzanne

    2016-01-01

    Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists-and probably the most crucial one-is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study. PMID:27242639

  4. Research on the Technological Innovation Capability of Normal Universities Based on Effective Patents: A Comparative Analysis with Comprehensive Universities

    Institute of Scientific and Technical Information of China (English)

    黄荣晓

    2011-01-01

    Taking effective patent holdings as an indicator of technological innovation capability, this paper analyzes the technological innovation capability of South China Normal University in terms of patent types, patent composition, and the maintenance period of effective patents. The university's patent strengths are concentrated mainly in physics, chemistry, biology, and related fields. To raise the level of technological innovation, the university should improve its policy orientation and place greater emphasis on patent commercialization.

  5. Computer-based multi-channel analyzer based on internet

    International Nuclear Information System (INIS)

    Combining Internet technology with a computer-based multi-channel analyzer, a new kind of browser-based multi-channel analyzer system is presented. Its framework and principle, as well as its implementation, are discussed.

  6. The security analyzer, a security analyzer program written in Prolog

    International Nuclear Information System (INIS)

    A technique has been developed to characterize a nuclear facility and measure the strengths and weaknesses of the physical protection system. It utilizes the artificial intelligence capabilities available in the Prolog programming language to probe a facility's defenses and find potential attack paths that meet designated search criteria. As sensors or barriers become inactive due to maintenance, failure, or inclement weather conditions, the protection system can rapidly be reanalyzed to discover weaknesses that would need to be strengthened by alternative means. Conversely, proposed upgrades and enhancements can be easily entered into the database and their effect measured against a variety of potential adversary attacks. Thus the security analyzer is a tool that aids the protection planner as well as the protection operations staff.
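The attack-path search can be sketched outside Prolog as a depth-first enumeration over a facility graph. The model below is hypothetical: each edge is guarded by a named defense and is passable only when that defense is listed as inactive (down); the real analyzer encodes facilities as Prolog facts and rules:

```python
def attack_paths(edges, start, goal, inactive):
    # edges: {(a, b): defense_name or None}. An edge is passable when its
    # defense is down (in `inactive`) or it has no defense at all.
    adj = {}
    for (a, b), d in edges.items():
        adj.setdefault(a, []).append((b, d))
    paths = []

    def dfs(node, path):
        if node == goal:
            paths.append(path)
            return
        for nxt, d in adj.get(node, []):
            if nxt not in path and (d is None or d in inactive):
                dfs(nxt, path + [nxt])

    dfs(start, [start])
    return paths

# Hypothetical facility: two routes to the vault, each guarded differently
edges = {("outside", "yard"): "fence_sensor",
         ("yard", "vault"): "door_alarm",
         ("yard", "roof"): None,
         ("roof", "vault"): "motion_sensor"}
paths = attack_paths(edges, "outside", "vault",
                     inactive={"fence_sensor", "motion_sensor"})
```

Re-running the search with a different `inactive` set mirrors the rapid reanalysis described above when sensors or barriers go down for maintenance or weather.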

  7. Trace Gas Analyzer (TGA) program

    Science.gov (United States)

    1977-01-01

    The design, fabrication, and test of a breadboard trace gas analyzer (TGA) is documented. The TGA is a gas chromatograph/mass spectrometer system. The gas chromatograph subsystem employs a recirculating hydrogen carrier gas. The recirculation feature minimizes the requirement for transport and storage of large volumes of carrier gas during a mission. The silver-palladium hydrogen separator which permits the removal of the carrier gas and its reuse also decreases vacuum requirements for the mass spectrometer since the mass spectrometer vacuum system need handle only the very low sample pressure, not sample plus carrier. System performance was evaluated with a representative group of compounds.

  8. Truck acoustic data analyzer system

    Science.gov (United States)

    Haynes, Howard D.; Akerman, Alfred; Ayers, Curtis W.

    2006-07-04

    A passive vehicle acoustic data analyzer system having at least one microphone disposed in the acoustic field of a moving vehicle and a computer in electronic communication with the microphone(s). The computer detects and measures the frequency shift in the acoustic signature emitted by the vehicle as it approaches and passes the microphone(s). The acoustic signature of a truck driving by a microphone can provide enough information to estimate the truck speed in miles-per-hour (mph), engine speed in rotations-per-minute (RPM), turbocharger speed in RPM, and vehicle weight.
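The speed estimate follows from the classical Doppler relations for a stationary observer: an approaching source is heard at f0·c/(c−v) and a receding one at f0·c/(c+v), so the ratio of the two measured tones eliminates the unknown source frequency f0. A sketch with hypothetical measured frequencies:

```python
def speed_from_doppler(f_approach, f_recede, c=343.0):
    # For a stationary microphone: f_approach = f0*c/(c-v),
    # f_recede = f0*c/(c+v). Their ratio r = (c+v)/(c-v)
    # gives v = c*(r-1)/(r+1), independent of f0.
    r = f_approach / f_recede
    return c * (r - 1) / (r + 1)

# Hypothetical measurement: an engine tone heard at ~106.2 Hz approaching
# and ~94.5 Hz receding (consistent with a ~20 m/s truck)
v = speed_from_doppler(106.2, 94.5)   # m/s
mph = v * 2.23694                     # convert to miles per hour
```

The same tone-tracking idea, applied to firing-rate and turbocharger harmonics in the spectrum, yields the engine and turbocharger RPM estimates mentioned in the abstract.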

  9. COBSTRAN - COMPOSITE BLADE STRUCTURAL ANALYZER

    Science.gov (United States)

    Aiello, R. A.

    1994-01-01

    The COBSTRAN (COmposite Blade STRuctural ANalyzer) program is a pre- and post-processor that facilitates the design and analysis of composite turbofan and turboprop blades, as well as composite wind turbine blades. COBSTRAN combines composite mechanics and laminate theory with a data base of fiber and matrix properties. As a preprocessor for NASTRAN or another Finite Element Method (FEM) program, COBSTRAN generates an FEM model with anisotropic homogeneous material properties. Stress output from the FEM program is provided as input to the COBSTRAN postprocessor. The postprocessor then uses the composite mechanics and laminate theory routines to calculate individual ply stresses, strains, interply stresses, thru-the-thickness stresses and failure margins. COBSTRAN is designed to carry out the many linear analyses required to efficiently model and analyze blade-like structural components made of multilayered angle-plied fiber composites. Components made from isotropic or anisotropic homogeneous materials can also be modeled as a special case of COBSTRAN. NASTRAN MAT1 or MAT2 material cards are generated according to user supplied properties. COBSTRAN is written in FORTRAN 77 and was implemented on a CRAY X-MP with a UNICOS 5.0.12 operating system. The program requires either COSMIC NASTRAN or MSC NASTRAN as a structural analysis package. COBSTRAN was developed in 1989, and has a memory requirement of 262,066 64 bit words.

  10. Compact Microwave Fourier Spectrum Analyzer

    Science.gov (United States)

    Savchenkov, Anatoliy; Matsko, Andrey; Strekalov, Dmitry

    2009-01-01

    A compact photonic microwave Fourier spectrum analyzer [a Fourier-transform microwave spectrometer (FTMWS)] with no moving parts has been proposed for use in remote sensing of weak, natural microwave emissions from the surfaces and atmospheres of planets to enable remote analysis and determination of chemical composition and abundances of critical molecular constituents in space. The instrument is based on Bessel-beam (light modes with non-zero angular momenta) fiber-optic elements. It features low power consumption, low mass, and high resolution, without a need for any cryogenics, beyond what is achievable by the current state of the art in space instruments. The instrument can also be used in a wide-band scatterometer mode in active radar systems.

  11. Analyzing, Modelling, and Designing Software Ecosystems

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    the development, implementation, and use of telemedicine services. We initially expand the theory of software ecosystems by contributing to the definition and understanding of software ecosystems, providing means of analyzing existing and designing new ecosystems, and defining and measuring the...... qualities of software ecosystems. We use these contributions to design a software ecosystem in the telemedicine services of Denmark with (i) a common platform that supports and promotes development from different actors, (ii) high software interaction, (iii) strong social network of actors, (iv) robust...... thesis documents the groundwork towards addressing the challenges faced by telemedical technologies today and establishing telemedicine as a means of patient diagnosis and treatment. Furthermore, it serves as an empirical example of designing a software ecosystem....

  12. Complex networks theory for analyzing metabolic networks

    Institute of Scientific and Technical Information of China (English)

    ZHAO Jing; YU Hong; LUO Jianhua; CAO Z.W.; LI Yixue

    2006-01-01

    One of the main tasks of post-genomic informatics is to systematically investigate all molecules and their interactions within a living cell so as to understand how these molecules and the interactions between them relate to the function of the organism,while networks are appropriate abstract description of all kinds of interactions. In the past few years, great achievement has been made in developing theory of complex networks for revealing the organizing principles that govern the formation and evolution of various complex biological, technological and social networks. This paper reviews the accomplishments in constructing genome-based metabolic networks and describes how the theory of complex networks is applied to analyze metabolic networks.
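A basic complex-network diagnostic, the degree distribution, can be computed directly from an edge list. The toy metabolite graph below is invented for illustration (it loosely echoes glycolysis intermediates, with ATP acting as a highly connected currency metabolite):

```python
from collections import Counter

# Hypothetical toy edge list: metabolite pairs linked by shared reactions
edges = [("glc", "g6p"), ("g6p", "f6p"), ("f6p", "fbp"),
         ("g6p", "6pg"), ("atp", "g6p"), ("atp", "fbp"), ("atp", "6pg")]

degree = Counter()
for a, b in edges:
    degree[a] += 1
    degree[b] += 1

# Degree distribution P(k): fraction of nodes having each degree k
n = len(degree)
pk = {k: c / n for k, c in Counter(degree.values()).items()}

# Hubs: the most highly connected metabolites
hubs = [node for node, k in degree.items() if k == max(degree.values())]
```

On genome-scale metabolic networks, plotting P(k) on log-log axes and identifying hubs of this kind is exactly the sort of analysis the complex-networks theory reviewed here applies.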

  13. Analyzing and modeling heterogeneous behavior

    Science.gov (United States)

    Lin, Zhiting; Wu, Xiaoqing; He, Dongyue; Zhu, Qiang; Ni, Jixiang

    2016-05-01

    Recently, it was pointed out that non-Poisson statistics with heavy tails exist in many scenarios of human behavior. But most of these studies claimed that power laws characterized diverse aspects of human mobility patterns. In this paper, we suggest that human behavior may not be driven by identical mechanisms and can be modeled as a Semi-Markov Modulated Process. To verify our suggestion and model, we analyzed a total of 1,619,934 records of library visitations (including undergraduate and graduate students). It is found that the distribution of visitation intervals is well fitted with three sections of lines instead of the traditional power law distribution in log-log scale. The results confirm that some human behaviors cannot be simply expressed as power laws or any other simple functions. At the same time, we divided the data into groups and extracted periods of bursty events. Through careful analysis of the different groups, we drew the conclusion that aggregate behavior might be composed of heterogeneous behaviors, and even behaviors of the same type tended to differ in different periods. The aggregate behavior is supposed to be formed by "heterogeneous groups". We performed a series of experiments. Simulation results showed that we just needed to set up a two-state Semi-Markov Modulated Process to construct a proper representation of heterogeneous behavior.
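The proposed model can be illustrated by simulating inter-event intervals from a two-state semi-Markov modulated process: a "busy" state with short exponential gaps and a "quiet" state with long ones, with random holding times in each state. All parameters below are illustrative, not fitted to the library data:

```python
import random

def simulate_intervals(n, seed=1):
    # Two hidden states with different interval statistics; random holding
    # times per state make the process semi-Markov rather than Markov.
    # (Illustrative parameters only.)
    rng = random.Random(seed)
    intervals, state = [], "busy"
    while len(intervals) < n:
        holding = rng.randint(5, 15)            # events spent in this state
        mean = 1.0 if state == "busy" else 20.0 # mean gap per state
        for _ in range(holding):
            intervals.append(rng.expovariate(1.0 / mean))
        state = "quiet" if state == "busy" else "busy"
    return intervals[:n]

intervals = simulate_intervals(2000)
mean_gap = sum(intervals) / len(intervals)
```

A mixture like this produces an interval distribution with a heavy tail that is not a single power law, consistent with the multi-section fit reported above.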

  14. Los Alamos Nuclear plant analyzer

    International Nuclear Information System (INIS)

    The Relational Database software obtained from Idaho National Engineering Laboratory is implemented on the Los Alamos Cray computer system. For the Nuclear Plant Analyzer (NPA), Los Alamos retained a graphics display terminal and a separate forms terminal for mutual compatibility, but integrated both terminals into a single-line full-duplex mode of communications, using a single keyboard for input. Work on the program-selection phase of an NPA session is well underway. The final phase of implementation will be the Worker or graphics-driver phase. The Los Alamos in-house NPA has been in use for some time, and has given good results in analyses of four transients. The NPA hydrocode has been developed into a fast-running code. The authors have observed an average of a factor-of-3 speed increase for typical slow reactor-safety transients when employing the stability-enhancing two-step (SETS) method in the one-dimensional components using PF1/MOD1. The SETS method allows violation of the material Courant time-step stability limit and is thus stable at large time steps. The SETS method has also been adapted to the three-dimensional VESSEL component in the NPA hydrocode. In addition to the speed increase from this new software, significant additional speed is expected as a result of new hardware that provides for vectorization or parallelization
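The material Courant limit that SETS is allowed to violate is simply dt ≤ dx/|u| for an explicit advection scheme; a sketch with hypothetical cell size and coolant velocity:

```python
def courant_limit(dx, u):
    # Explicit schemes require dt <= dx/|u| (material Courant limit);
    # the SETS method remains stable even when dt exceeds this bound.
    return dx / abs(u)

dx, u = 0.5, 2.0                      # cell size (m), coolant velocity (m/s)
dt_explicit = courant_limit(dx, u)    # largest stable explicit step
dt_sets = 3 * dt_explicit             # a step SETS can take, explicit cannot
```

Taking steps several times the Courant limit during slow transients is where the reported factor-of-3 speedup comes from.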

  15. Analyzing E-Learning Adoption via Recursive Partitioning

    OpenAIRE

    Köllinger, Philipp; Schade, Christian

    2003-01-01

    The paper analyzes factors that influence the adoption of e-learning and gives an example of how to forecast technology adoption based on a post-hoc predictive segmentation using a classification and regression tree (CART). We find strong evidence for the existence of technological interdependencies and organizational learning effects. Furthermore, we find different paths to e-learning adoption. The results of the analysis suggest a growing "digital divide" among firms. We use cross-sectional...

  16. Usage of data mining for analyzing customer mindset

    Directory of Open Access Journals (Sweden)

    Priti Sadaria

    2012-09-01

    Full Text Available As this is the era of Information Technology, no field remains untouched by computer science. Technology has become an integral part of the business process. By applying different data mining techniques and algorithms to the feedback collected from customers, we can analyze the data. With the help of this analyzed information, we have a clear idea of the customer's mindset and can make meaningful decisions about the production and marketing of a particular product. To study customer mindset, different models such as classification and association models are used in data mining.
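The association models mentioned rest on two metrics, support and confidence. A sketch over hypothetical customer-feedback transactions (the feature names are invented for illustration):

```python
# Hypothetical feedback transactions: topics each customer mentioned together
transactions = [
    {"price", "quality"}, {"price", "delivery"},
    {"price", "quality", "delivery"}, {"quality", "delivery"},
    {"price", "quality"},
]

def support(itemset):
    # Fraction of transactions containing every item in the itemset
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent):
    # P(consequent | antecedent): the core association-rule metric
    return support(antecedent | consequent) / support(antecedent)

conf = confidence({"price"}, {"quality"})  # how often price-mentions
                                           # co-occur with quality-mentions
```

Rules whose support and confidence exceed chosen thresholds become the actionable patterns used for production and marketing decisions.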

  17. Analyzing Big Data with the Hybrid Interval Regression Methods

    OpenAIRE

    Chia-Hui Huang; Keng-Chieh Yang; Han-Ying Kao

    2014-01-01

    Big data is a new trend at present, with significant impacts on information technologies. In big data applications, one of the most pressing issues is dealing with large-scale data sets that often require computation resources provided by public cloud services. How to analyze big data efficiently is a major challenge. In this paper, we combine interval regression with the smooth support vector machine (SSVM) to analyze big data. Recently, the smooth support vector machine (SSVM...

  18. Design of virtual multi-channel pulse analyzer

    International Nuclear Information System (INIS)

    The author introduces a design for a virtual multi-channel pulse analyzer based on a PC DAQ board and LabVIEW, the graphical development platform. It can analyze the output of an amplifier and produce the pulse-height or time spectrum. The system can work in the normal modes of an MCA and can be easily expanded as needed, which demonstrates the flexibility of virtual instrument technology and extends its application to nuclear signal measurement

  19. Geospatial Technology

    Science.gov (United States)

    Reed, Philip A.; Ritz, John

    2004-01-01

    Geospatial technology refers to a system that is used to acquire, store, analyze, and output data in two or three dimensions. This data is referenced to the earth by some type of coordinate system, such as a map projection. Geospatial systems include thematic mapping, the Global Positioning System (GPS), remote sensing (RS), telemetry, and…

  20. Toward Integrated μNetwork Analyzer

    Science.gov (United States)

    Kmec, M.; Helbig, M.; Herrmann, R.; Rauschenbach, P.; Sachs, J.; Schilling, K.

    The article deals with recent development steps toward a monolithically integrated micro-Network Analyzer (μNA). The device will deploy M-Sequence-based single-chip transceivers with a built-in ultra-wideband wave separation unit in the receiver chains. The introduced on-chip wideband wave separation is realized using an optimized resistive directional coupler combined with a customized differential LNA as detector. The wave separation works almost down to DC, and its upper frequency limit is determined by the performance of the implemented technology (i.e., bridge resistors, transistors, etc.), the selected circuit topology, and the wirings of particular coupler components, but also by the IC packaging itself. Even though the upper limit is designed to be compatible with the analog input bandwidth of the receiver circuit [which is about 18 GHz for a naked die (Kmec et al., M-Sequence based single chip UWB-radar sensor. ANTEM/AMEREM 2010 Conference, Ottawa, 2010)], the packaged IC is intended for use up to 8 GHz. Finally, the discussed transceiver is a further development of the mother SiGe System-on-Chip (SoC) presented in the work cited above.

  1. Analyzing Music Services Positioning Through Qualitative Research

    Directory of Open Access Journals (Sweden)

    Manuel Cuadrado

    2015-12-01

    Full Text Available Information technologies have produced new ways of distributing and consuming music, mainly by youth, in relation to both goods and services. In the case of goods, there has been a dramatic shift from traditional ways of buying and listening to music to new digital platforms. There has also been an evolution in relation to music services. In this sense, live music concerts have been losing their audiences over the past few years, as have music radio stations, in favor of streaming platforms. Curious about this phenomenon, we conducted exploratory research in order to analyze how all these services, both traditional and new, were perceived. Specifically, we aimed to study young people's assessment of the three most relevant music service categories: music radio stations, digital streaming platforms, and pop-rock music festivals. To do so, we used the projective technique of image association to gather information. The population of the study consisted of individuals between 18 and 25 years of age. Our results, after using content analysis, were poor due to spontaneous recall. Therefore, we duplicated the study, but in a more focus-oriented way. Information gathered this time allowed us not only to better know how all these organizations are positioned but also to obtain a list of descriptors to be used in a subsequent descriptive research study.

  2. Analyzing Trace Aroma Substances in Chinese Liquor by Large Volume Injection and Mass Spectrometry

    Institute of Scientific and Technical Information of China (English)

    王双; 徐占成; 徐姿静

    2014-01-01

    Chinese liquor contains a large number of aroma and taste components, many of which are present in extremely small amounts, which makes their analysis very difficult. Normally, to analyze these trace substances, liquor samples must be concentrated in a pretreatment step and then analyzed by gas chromatography. Sample concentration is very time-consuming and labor-intensive, and the concentrated sample still contains a large amount of solvent, which can cause serious analytical errors and damage the chromatographic column or detector. Large volume injection (LVI) technology effectively avoids these shortcomings: it concentrates the sample in a short time and removes most of the solvent, thereby greatly improving analytical sensitivity. Combining large volume injection with mass spectrometry greatly improves the accuracy of trace-component analysis in Chinese liquor.

  3. Update on the USNRC's Nuclear Plant Analyzer

    International Nuclear Information System (INIS)

    The Nuclear Plant Analyzer (NPA) is the US Nuclear Regulatory Commission's (NRC's) state-of-the-art nuclear reactor simulation capability. This computer software package integrates high-fidelity nuclear reactor simulation codes such as the TRAC and RELAP5 series of codes with color graphics display techniques and advanced workstation hardware. An overview of this program was given at the 1984 Summer Computer Simulation Conference (SCSC), with selected topics discussed at the 1985 and 1986 SCSCs. Since the 1984 presentation, major redirections of this NRC program have been taken. The original NPA system was developed for operation on a Control Data Corporation CYBER 176 computer, technology that is some 10 to 15 years old. The NPA system has recently been implemented on Class VI computers to gain increased computational capabilities, and is now being implemented on super-minicomputers for use by the scientific community and possibly by the commercial nuclear power plant simulator community. This paper addresses these activities and related experiences. First, the Class VI computer implementation is discussed, showing the trade-off between gaining significantly greater computational speed and central memory and the loss of performance caused by many more simultaneous users. Second, the goal of the super-minicomputer implementation is to produce a very cost-effective system that utilizes advanced (multi-dimensional, two-phase coolant) simulation capabilities at real wall-clock simulation times. Benchmarking of the initial super-minicomputer implementation is discussed. Finally, the technical and economic feasibility is addressed for implementing the super-minicomputer version of the NPA with the RELAP5 simulation code onto the Black Fox full-scope nuclear power plant simulator.

  4. 46 CFR 154.1360 - Oxygen analyzer.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 5 2010-10-01 2010-10-01 false Oxygen analyzer. 154.1360 Section 154.1360 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) CERTAIN BULK DANGEROUS CARGOES SAFETY STANDARDS... Instrumentation § 154.1360 Oxygen analyzer. The vessel must have a portable analyzer that measures oxygen...

  5. Analyzing the Change-Proneness of APIs and web APIs

    OpenAIRE

    Romano, D.

    2015-01-01

    Analyzing the Change-Proneness of APIs and web APIs APIs and web APIs are used to expose existing business logic and, hence, to ease the reuse of functionalities across multiple software systems. Software systems can use the business logic of legacy systems by binding their APIs and web APIs. With the emergence of a new programming paradigm called service-oriented, APIs are exposed as web APIs hiding the technologies used to implement legacy systems. As a consequence, web APIs establish contr...

  6. Analyzing La Cuna : New Approaches for Mentoring in Professional Associations

    OpenAIRE

    Hicks, Alison

    2012-01-01

    This case study explores the implementation of La Cuna, an online mentoring forum in a small, subject-based professional association, the Seminar for the Acquisition of Latin American Library Materials (SALALM). Designed using the social network software Ning, the forum functioned as an informal learning community for 38 members and was an innovative response to geographical challenges and changing technological skills. Using participation data and a questionnaire to analyze the implementa...

  7. Electrostatic energy analyzers for high energy charged particle beams

    International Nuclear Information System (INIS)

    The electrostatic energy analyzers for high energy charged particle beams emitted from extended large-size objects as well as from remote point sources are proposed. Results of the analytical trajectory solutions in an ideal cylindrical field provide focusing characteristics for both configurations. The instruments have a simple, compact design based on an ideal cylindrical field with the entrance window arranged in the end-boundary between electrodes, and can be used for measurements in space technologies, plasma and nuclear physics

  8. PMD: A Resource for Archiving and Analyzing Protein Microarray data

    OpenAIRE

    Zhaowei Xu; Likun Huang; Hainan Zhang; Yang Li; Shujuan Guo; Nan Wang; Shi-hua Wang; Ziqing Chen; Jingfang Wang; Sheng-ce Tao

    2016-01-01

    Protein microarray is a powerful technology for both basic research and clinical study. However, because there is no database specifically tailored for protein microarrays, the majority of the valuable original protein microarray data is still not publicly accessible. To address this issue, we constructed the Protein Microarray Database (PMD), which is specifically designed for archiving and analyzing protein microarray data. In PMD, users can easily browse and search the entire database by expe...

  9. Analyzing Real-Time Behavior of Flash Memories

    OpenAIRE

    Parthey, Daniel

    2007-01-01

    Flash memories are used as the main storage in many portable consumer electronic devices because they are more robust than hard drives. This document gives an overview of existing consumer flash memory technologies, which are mostly removable flash memory cards. It discusses to what degree consumer flash devices are suitable for real-time systems and provides a detailed timing analysis of some consumer flash devices. Further, it describes methods to analyze mount times, access per...

  10. Electrical spectrum & network analyzers a practical approach

    CERN Document Server

    Helfrick, Albert D

    1991-01-01

    This book presents the fundamentals and the latest techniques of electrical spectrum analysis. It focuses on instruments and techniques used in spectrum and network analysis, rather than theory. The book covers the use of spectrum analyzers, tracking generators, and network analyzers. Filled with practical examples, the book presents techniques that are widely used in signal processing and communications applications, yet are difficult to find in most literature. Key Features: * Presents numerous practical examples, including actual spectrum analyzer circuits * Instruction on how to us

  11. Entrepreneur and Technological Change

    OpenAIRE

    Myung-Joong Kwon; Jong-gul Lee

    1999-01-01

    The literature on technological change has grown in the last two decades and has made a number of significant theoretical advances, but the role of the entrepreneur in technological change has been relatively ignored. In this paper we attempted to fill this gap and explored the role of the entrepreneur in generating technological change. We constructed an empirical model to analyze the role of the entrepreneur in technological change in the context of deciding the undertaking of the innovatio...

  12. Performance evaluation of PL-11 platelet analyzer

    Institute of Scientific and Technical Information of China (English)

    张有涛

    2013-01-01

    Objective To evaluate and report the performance of the PL-11 platelet analyzer. Methods Intravenous blood samples anticoagulated with EDTA-K2 and sodium citrate were tested by the PL-11 platelet analyzer to evaluate the intra-assay and inter-assay coefficients of variation (CV),

  13. [Health technology in Mexico].

    Science.gov (United States)

    Cruz, C; Faba, G; Martuscelli, J

    1992-01-01

    The features of the health technology cycle are presented, and the effects of the demographic, epidemiologic and economic transition on the health technology demand in Mexico are discussed. The main problems of science and technology in the context of a decreasing scientific and technological activity due to the economic crisis and the adjustment policies are also analyzed: administrative and planning problems, low impact of scientific production, limitations of the Mexican private sector, and the obstacles for technology assessment. Finally, this paper also discusses the main support strategies for science and technology implemented by the Mexican government during the 1980s and the challenges and opportunities that lie ahead. PMID:1411774

  14. Analyzing Broadband Divide in the Farming Sector

    DEFF Research Database (Denmark)

    Jensen, Michael; Gutierrez Lopez, Jose Manuel; Pedersen, Jens Myrup

    2013-01-01

    Agriculture industry has been evolving for centuries. Currently, the technological development of Internet oriented farming tools allows to increase the productivity and efficiency of this sector. Many of the already available tools and applications require high bandwidth in both directions...

  15. Using Three-Dimensional Fluorescence Spectrum Technology to Analyze the Effects of Natural Dissolved Organic Matter on the Pesticide Residues in the Soil%溶解性有机物对土壤中农药残留与分布影响的光谱学研究

    Institute of Scientific and Technical Information of China (English)

    雷宏军; 潘红卫; 韩宇平; 刘鑫; 徐建新

    2015-01-01

    The behavior of pesticide in soil is influenced by dissolved organic matter (DOM) through competition adsorption, adsorption, solubilization, accelerated degradation, and so on. Thus DOM and its components play an important role in the environmental risk in the soil ecosystem and groundwater environment. Currently, most studies have focused on the short-term effect of high concentrations of DOM on pesticide residues. However, soil DOM is mainly present at low levels. Therefore, it is of some practical significance to probe into the environmental behavior of soil pesticides under natural levels of DOM. Thus a site investigation was conducted in farmland with a long-term history of pesticide application. By using three-dimensional excitation-emission fluorescence matrix (3D-EEM) technology, together with the fluorescence regional integration (FRI) quantitative method, the long-term effects of pesticide residues under low concentrations of natural DOM were analyzed. Results showed that: (1) The long-term effects of the natural DOM components on the environmental behavior of most soil organochlorine pesticides were not significant, except for a few pesticides such as γ-HCH, p,p'-DDE, etc. (2) The influencing effects of DOM components on different types of pesticides varied. Among them, the content of the tyrosine component showed a significantly negative correlation (p<0.05) with the concentrations of γ-HCH and p,p'-DDE. There were significant positive correlations (p<0.05) between the by-products of microbial degradation in the DOM components and the concentration of heptachlor, and between the content of the active humus component of humic acid in the DOM and the concentration of heptachlor epoxide. These results suggested that the distribution of different types of pesticide residues in the soil was influenced by different components at different levels of significance. (3
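
The fluorescence regional integration (FRI) step described above can be sketched numerically: integrate the EEM intensity over excitation/emission regions and report each region's share of the total. The toy 3x3 matrix and the two regions below are hypothetical placeholders, not the study's data or the standard five FRI regions.

```python
# Sketch of fluorescence regional integration (FRI) on a toy EEM.
# Regions are checked in order; the first match claims the point.
ex = [220, 250, 280]   # excitation wavelengths, nm (hypothetical)
em = [300, 350, 400]   # emission wavelengths, nm (hypothetical)
eem = [                # intensity[i][j] at (ex[i], em[j])
    [5.0, 2.0, 1.0],
    [3.0, 8.0, 2.0],
    [1.0, 4.0, 6.0],
]

REGIONS = {  # name -> (ex upper bound, em upper bound), hypothetical
    "tyrosine-like": (260, 360),
    "humic-like": (float("inf"), float("inf")),  # catch-all remainder
}

def regional_integration(eem, ex, em, regions):
    """Integrate intensity per region and return each region's share."""
    totals = {name: 0.0 for name in regions}
    for i, x in enumerate(ex):
        for j, m in enumerate(em):
            for name, (xmax, mmax) in regions.items():
                if x < xmax and m < mmax:
                    totals[name] += eem[i][j]
                    break
    grand = sum(totals.values())
    return {name: v / grand for name, v in totals.items()}

shares = regional_integration(eem, ex, em, REGIONS)
print(shares)
```

In real FRI the shares are further weighted by the fractional area of each region; that refinement is omitted here for brevity.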

  16. Multidimensional analyzer based on the MACAMAC system

    International Nuclear Information System (INIS)

    General information about the multidimensional analyzer based on the MACAMAC system is given. The analyzer includes the 1521 microprocessor controller and the 1522 memory modules of the ''Borer'' firm (Switzerland), and CAMAC modules: ADCs, a coincidence organization module, a display driver, and paper tape input and output modules. The Videoton-340 serves as a terminal, and for data visualization the RG-96 display device is used. The analyzer software includes a monitor, data acquisition programs, a display program and users' programs. The monitor organizes the user-system dialogue by means of the terminal keyboard

  17. Low Gravity Drug Stability Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall goal of this proposed program (through Phase III) is to build a space-worthy Drug Stability Analyzer that can determine the extent of drug degradation....

  18. Low Gravity Drug Stability Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this proposed program through Phase III is to build a space-worthy Drug Stability Analyzer that can determine the extent of drug degradation. It will be...

  19. Ultrasensitive Atmospheric Analyzer for Miniature UAVs Project

    Data.gov (United States)

    National Aeronautics and Space Administration — In this SBIR Phase I effort, Los Gatos Research (LGR) proposes to develop a highly-accurate, lightweight, low-power gas analyzer for quantification of water vapor...

  20. On-Demand Urine Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall goal of this program (through Phase III) is to develop an analyzer that can be integrated into International Space Station (ISS) toilets to measure key...

  1. QUBIT DATA STRUCTURES FOR ANALYZING COMPUTING SYSTEMS

    Directory of Open Access Journals (Sweden)

    Vladimir Hahanov

    2014-11-01

    Full Text Available Qubit models and methods for improving the performance of software and hardware for analyzing digital devices through increasing the dimension of the data structures and memory are proposed. The basic concepts, terminology and definitions necessary for the implementation of quantum computing when analyzing virtual computers are introduced. The investigation results concerning the design and modeling of computer systems in cyberspace based on the use of a two-component structure are presented.

  2. Analyzing Log Files using Data-Mining

    OpenAIRE

    Marius Mihut

    2008-01-01

    Information systems (i.e. servers, applications and communication devices) create a large amount of monitoring data that are saved as log files. For analyzing them, a data-mining approach is helpful. This article presents the steps which are necessary for creating an ‘analyzing instrument’, based on an open source software called Waikato Environment for Knowledge Analysis (Weka) [1]. For exemplification, a system log file created by a Windows-based operating system, is used as input file.
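
As a language-agnostic illustration of the workflow described above (the record itself uses the Java-based Weka toolkit), a minimal sketch in Python: parse Windows-style log lines into structured rows, then mine a simple frequency pattern. The log lines and the regular expression are hypothetical.

```python
import re
from collections import Counter

# Hypothetical Windows-style log lines (the record's actual log file is not shown).
LOG_LINES = [
    "2008-01-12 10:00:03 INFO  Service started",
    "2008-01-12 10:00:07 WARN  Low disk space",
    "2008-01-12 10:00:09 ERROR Service crashed",
    "2008-01-12 10:01:02 INFO  Service started",
]

LINE_RE = re.compile(r"^(\S+) (\S+) (\w+)\s+(.*)$")

def parse(lines):
    """Turn raw log lines into (date, time, level, message) tuples."""
    rows = []
    for line in lines:
        m = LINE_RE.match(line)
        if m:
            rows.append(m.groups())
    return rows

rows = parse(LOG_LINES)
level_counts = Counter(level for _, _, level, _ in rows)
print(level_counts)  # frequency of each severity level
```

A tool such as Weka would take the structured rows as its input instances; the parsing step above is the part that turns a raw log into mineable data.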

  3. The Information Flow Analyzing Based on CPC

    Institute of Scientific and Technical Information of China (English)

    ZHANG Zhang; LI Hui

    2005-01-01

    The information flow chart within the product life cycle is given based on collaborative production commerce (CPC) thinking. In this chart, the separated information systems are integrated by means of enterprise knowledge assets that are promoted by CPC from production knowledge. The information flow in the R&D process is analyzed in the environment of a virtual R&D group and distributed PDM. In addition, the information flow throughout the manufacturing and marketing process is analyzed in the CPC environment.

  4. Network analysis using organizational risk analyzer

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The tool system of the Organizational Risk Analyzer (ORA) is selected to study the network of East Turkistan terrorists. The model of the relationships among its personnel, knowledge, resources and task entities is represented by the meta-matrix in ORA, with which the risks and vulnerabilities of the organizational structure are analyzed quantitatively and the last vulnerabilities and risks of the organization are obtained. The case study in this system shows that it should be a shortcut to effectively destroy the network...

  5. Shielded electron microprobe analyzer for plutonium fuel

    International Nuclear Information System (INIS)

    Design, construction and performance tests of a shielded electron microprobe analyzer for plutonium fuel are described. In the analyzer, the following modifications were made to the Shimadzu ASM-SX (analyzer): (1) a shield of tungsten alloy is incorporated between the sample and the X-ray detector to examine highly radioactive fuel, (2) a magnetic shield against β-rays from the fuel is fitted to the electron detector, (3) a small sample-loading glove box is installed to transfer plutonium fuel safely to the analyzer, (4) a glove box containing a sample-surface treatment apparatus and a balance is connected to the sample-loading glove box, (5) for maintenance and repair of the analyzer by a closed method, about thirty modifications are made. The performance test with nonradioactive materials showed that, despite the above modifications, the abilities of the original analyzer are all retained. Furthermore, the simulation test for irradiated fuel with 226Ra at a dose rate of 40 mR/hr at 30 cm showed that the X-ray peak-to-noise ratios are unchanged when using the pulse height selector of the X-ray detector. (author)

  6. Health technology

    International Nuclear Information System (INIS)

    The CEA is an organization with a primarily technological focus, and one of the key areas in which it carries out research is Health Technology. This field of research was recognized and approved by the French Atomic Energy Committee on July 20, 2004. The expectations of both the public and health care professionals relate to demands for the highest standards of health care, at minimum risk. This implies a need to diagnose illness and disease as accurately and at as early a stage as possible, to target surgery precisely to deal only with damaged organs or tissues, to minimize the risk of side effects, allergies and hospital-acquired infections, and to follow up and, as far as possible, tailor the health delivery system to each individual's needs and lifestyle. The health care sector is subject to rapid change and embraces a vast range of scientific fields. It now requires technological developments that will serve to gather increasing quantities of useful information, analyze and integrate it to obtain a full understanding of highly complex processes, and to be able to treat the human body as non-invasively as possible. All the technologies developed require assessment, especially in the hospital environment. (authors)

  7. Macrofoundation for Strategic Technology Management

    DEFF Research Database (Denmark)

    Pedersen, Jørgen Lindgaard

    1995-01-01

    Neoclassical mainstream economics has no perspective on strategic technology management issues. Market failure economics (externalities etc.) can be of some use in analyzing problems of relevance to strategic management in which technology plays a part. Environment, inequality and democratic...

  8. Energy and technology review

    International Nuclear Information System (INIS)

    Research is described in three areas, high-technology design of unconventional, nonnuclear weapons, a model for analyzing special nuclear materials safeguards decisions, and a nuclear weapons accident exercise (NUWAX-81)

  9. Energy and technology review

    Energy Technology Data Exchange (ETDEWEB)

    1981-10-01

    Research is described in three areas, high-technology design of unconventional, nonnuclear weapons, a model for analyzing special nuclear materials safeguards decisions, and a nuclear weapons accident exercise (NUWAX-81). (GHT)

  10. FST Based Morphological Analyzer for Hindi Language

    Directory of Open Access Journals (Sweden)

    Deepak Kumar

    2012-07-01

    Full Text Available Hindi being a highly inflectional language, an FST (Finite State Transducer) based approach is most efficient for developing a morphological analyzer for this language. The work presented in this paper uses the SFST (Stuttgart Finite State Transducer) tool for generating the FST. A lexicon of root words is created. Rules are then added for generating inflectional and derivational words from these root words. The morph analyzer developed was used in a Part Of Speech (POS) tagger based on the Stanford POS Tagger. The system was first trained using a manually tagged corpus, and the MAXENT (Maximum Entropy) approach of the Stanford POS tagger was then used for tagging input sentences. The morphological analyzer gives approximately 97% correct results. The POS tagger gives an accuracy of approximately 87% for sentences whose words are known to the trained model file, and 80% accuracy for sentences containing words unknown to the trained model file.
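
A toy illustration of the analyzer's idea, not the SFST grammar itself: a root lexicon plus suffix rules that map an inflected surface form back to its root. The transliterated Hindi entries and rules below are hypothetical simplifications.

```python
# Toy morphological analyzer: root lexicon + suffix-replacement rules.
# Entries are hypothetical transliterations, not the paper's SFST lexicon.
LEXICON = {"ladka": "boy (noun, masc)", "kitab": "book (noun, fem)"}

# (surface suffix, replacement restoring the root, grammatical feature)
SUFFIX_RULES = [
    ("e", "a", "direct plural / oblique singular"),
    ("on", "a", "oblique plural"),
]

def analyze(word):
    """Return (root, feature) analyses whose root is found in the lexicon."""
    analyses = []
    if word in LEXICON:
        analyses.append((word, "direct singular"))
    for suffix, replacement, feature in SUFFIX_RULES:
        if word.endswith(suffix):
            root = word[: -len(suffix)] + replacement
            if root in LEXICON:
                analyses.append((root, feature))
    return analyses

print(analyze("ladke"))   # maps the inflected form back to root "ladka"
print(analyze("kitab"))   # known root, analyzed as direct singular
```

An FST compiles exactly this kind of lexicon-plus-rules specification into a single transducer, so analysis runs in time linear in the word length rather than by trying rules one by one.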

  11. Dashboard for Analyzing Ubiquitous Learning Log

    Science.gov (United States)

    Lkhagvasuren, Erdenesaikhan; Matsuura, Kenji; Mouri, Kousuke; Ogata, Hiroaki

    2016-01-01

    Mobile and ubiquitous technologies have been applied to a wide range of learning fields such as science, social science, history and language learning. Many researchers have been investigating the development of ubiquitous learning environments; nevertheless, to date, there have not been enough research works related to the reflection, analysis…

  12. Analyzing the mediated voice - a datasession

    DEFF Research Database (Denmark)

    Lawaetz, Anna

    Broadcast voices are technologically manipulated. In order to achieve a certain authenticity or sound of “reality”, the voices are, paradoxically, filtered and trained in order to reach the listeners. This “mise-en-scène” is important knowledge when it comes to the development of a consistent method of...... analysis of the mediated voice...

  13. Detecting influenza outbreaks by analyzing Twitter messages

    CERN Document Server

    Culotta, Aron

    2010-01-01

    We analyze over 500 million Twitter messages from an eight month period and find that tracking a small number of flu-related keywords allows us to forecast future influenza rates with high accuracy, obtaining a 95% correlation with national health statistics. We then analyze the robustness of this approach to spurious keyword matches, and we propose a document classification component to filter these misleading messages. We find that this document classifier can reduce error rates by over half in simulated false alarm experiments, though more research is needed to develop methods that are robust in cases of extremely high noise.
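
The core measurement, correlating keyword frequency with official influenza rates, can be sketched as follows. The weekly series below are synthetic stand-ins for the paper's 500-million-tweet corpus and national health statistics.

```python
import math

# Synthetic weekly data (illustrative only): normalized counts of
# flu-related keywords and official influenza-like-illness rates.
keyword_freq = [0.8, 1.1, 1.9, 3.2, 4.0, 3.1, 2.0, 1.2]
ili_rate     = [0.9, 1.0, 2.1, 3.0, 4.2, 3.3, 1.8, 1.1]

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson(keyword_freq, ili_rate)
print(f"correlation: {r:.3f}")  # a high r would mirror the paper's ~0.95 finding
```

The paper's additional document-classification step filters misleading keyword matches before counting; only the correlation step is shown here.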

  14. Analyzing Log Files using Data-Mining

    Directory of Open Access Journals (Sweden)

    Marius Mihut

    2008-01-01

    Full Text Available Information systems (i.e. servers, applications and communication devices) create a large amount of monitoring data that are saved as log files. For analyzing them, a data-mining approach is helpful. This article presents the steps which are necessary for creating an ‘analyzing instrument’, based on an open source software called Waikato Environment for Knowledge Analysis (Weka) [1]. For exemplification, a system log file created by a Windows-based operating system, is used as input file.

  15. Progresses in analyzing 26Al with SMCAMS

    International Nuclear Information System (INIS)

    The Shanghai Mini-Cyclotron based Accelerator Mass Spectrometer (SMCAMS) was especially designed for analyzing 14C. In order to accelerate and analyze 26Al, the acceleration orbit and the beam optics of the injection system were calculated, and the harmonic number and the number of acceleration turns were optimized. A preliminary experiment was carried out, in which a beam current of 1.15 x 10-9 A for 27Al- and a background of 0.038 CPS for 26Al were measured. The limiting sensitivity of 26Al/27Al is 5.25 x 10-12. (authors)
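
The quoted limiting sensitivity can be checked by a back-of-envelope calculation from the figures in the abstract, assuming singly charged 27Al- ions so that the measured current converts directly to ions per second:

```python
# Back-of-envelope check of the quoted limiting sensitivity, using the
# figures in the abstract (assumes singly charged 27Al- ions).
E = 1.602e-19           # elementary charge, C
beam_current = 1.15e-9  # measured 27Al- current, A
background = 0.038      # measured 26Al background, counts per second

al27_rate = beam_current / E          # 27Al- ions per second (~7.2e9)
sensitivity = background / al27_rate  # 26Al/27Al ratio at background level
print(f"{sensitivity:.2e}")           # ~5.3e-12, consistent with the quoted 5.25e-12
```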

  16. A Novel Architecture For Multichannel Analyzer

    International Nuclear Information System (INIS)

    A novel digital approach to a real-time, high-throughput, low-cost Multichannel Analyzer (MCA) for radiation spectroscopy is presented. The MCA input is a shaped nuclear pulse sampled at a high rate using an Analog-to-Digital Converter (ADC) chip. The digital samples are analyzed by a state-of-the-art Field Programmable Gate Array (FPGA). A customized algorithm is utilized to estimate the peak of the pulse, to reject pile-up and to eliminate processing dead time. The estimated peaks of valid pulses are transferred to a microcontroller system that creates the histogram and controls the Human Machine Interface (HMI)
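
The processing chain described, peak estimation per pulse followed by histogram accumulation, can be sketched as follows; the pulse samples, the max-sample peak estimator and the channel count are illustrative assumptions, not the paper's FPGA algorithm.

```python
# Illustrative sketch of an MCA pipeline: estimate each shaped pulse's
# peak from its ADC samples and accumulate a pulse-height histogram.
N_CHANNELS = 1024  # assumed histogram size

def pulse_peak(samples):
    """Estimate pulse height as the maximum ADC sample (simplest estimator)."""
    return max(samples)

def accumulate(pulses, histogram=None):
    """Bin each pulse's peak into a channel and count occurrences."""
    hist = histogram or [0] * N_CHANNELS
    for samples in pulses:
        channel = min(pulse_peak(samples), N_CHANNELS - 1)
        hist[channel] += 1
    return hist

# Two synthetic digitized pulses peaking at ADC codes 200 and 512.
pulses = [[10, 80, 200, 120, 30], [15, 300, 512, 400, 90]]
hist = accumulate(pulses)
print(hist[200], hist[512])  # -> 1 1
```

A real implementation would fit the pulse shape around the maximum and reject pile-up before binning; the max-sample estimator stands in for that step here.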

  17. APPLICATION OF NEW TECHNOLOGIES TO THE REGIONAL MANAGEMENT AGRIBUSINESS

    OpenAIRE

    S. V. Kolyadenko; Makolkina, Ye. V.

    2014-01-01

    The authors analyzed the feasibility of new technologies in the management of the Regional O...

  18. Radiation Monitoring Systems With Built-in Multichannel Analyzer

    International Nuclear Information System (INIS)

    The MCA hardware, along with two other discriminator detector channels, is integrated on a single plug-in board for a VMS-style card cage, allowing up to four MCAs per monitor. Advancements in surface-mount technology and high-speed integrated circuits have made it possible to produce a compact high-performance MCA which rivals the best stand-alone commercial units. Sanatoria Electronics has recently introduced the next generation of its Radiation Monitoring System (RMS) products, the Ramo. This new fully-qualified design, which is a direct replacement for its previous monitor, the RM-80, makes use of the newest technologies to improve performance, maintainability and cost. One of its many new features is its fully integrated Multichannel Analyzer (MCA) for use in on-line spectral analysis, isotopic identification, and detector calibration.

  19. Presenting and analyzing movie stimuli for psychocinematic research

    Directory of Open Access Journals (Sweden)

    Arthur P. Shimamura

    2013-02-01

    Full Text Available Movies have an extraordinary way of capturing our perceptual, conceptual and emotional processes. As such, they offer a useful means of exploring psychological phenomena in the laboratory. Until recently, it has been rather difficult to present animated stimuli and collect behavioral responses online. However, with advances in digital technology and commercially available software to construct, present, and analyze movies for behavioral investigations, there is a growing interest in psychocinematic research. A rather simple, yet useful procedure is described that presents movie clips and collects multiple behavioral responses during their presentation. It uses E-prime 2.0 Professional software to run the experiment and Microsoft Excel to sort and analyze the data.

  20. Analyzing the flight of a quadcopter using a smartphone

    CERN Document Server

    Monteiro, Martín; Cabeza, Cecilia; Marti, Arturo C

    2015-01-01

    Remotely-controlled helicopters and planes have been used as toys for decades. However, only recently have advances in sensor technologies made it possible to easily fly and control these devices at an affordable price. Along with their increasing availability, the educational opportunities are also proliferating. Here, a simple experiment in which a smartphone is mounted on a quadcopter is proposed to investigate the basics of flight. Thanks to the smartphone's built-in accelerometer and gyroscope, take-off, landing and yaw are analyzed.
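
A minimal sketch of the kind of analysis described, detecting take-off from the smartphone's vertical acceleration trace; the sampling rate, threshold and sample series below are hypothetical, not the paper's data.

```python
# Detect take-off as the first sample exceeding a thrust threshold.
SAMPLE_DT = 0.01   # 100 Hz sampling interval, assumed
THRESHOLD = 2.0    # vertical acceleration threshold in m/s^2, assumed

# Synthetic vertical acceleration trace with gravity removed (m/s^2).
az = [0.0, 0.1, 0.0, 2.5, 3.0, 2.8, 1.0, 0.2]

def takeoff_index(samples, thr=THRESHOLD):
    """Index of the first sample above the threshold, or None if absent."""
    for i, a in enumerate(samples):
        if a > thr:
            return i
    return None

idx = takeoff_index(az)
print(idx, idx * SAMPLE_DT)  # sample index and the corresponding time in seconds
```

Landing could be detected symmetrically as a deceleration spike, and yaw from the gyroscope's angular-rate channel integrated over time.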

  1. Biosystems Engineering of Prokaryotes with Tumor-Killing Capacities.

    Science.gov (United States)

    Kalyoncu, Ebuzer; Olmez, Tolga T; Ozkan, Alper D; Sarioglu, Omer F

    2016-01-01

    Certain bacteria selectively attack tumor tissues and trigger tumor shrinkage by producing toxins and modulating the local immune system, but their clinical utility is limited because of the dangers posed by systemic infection. Genetic engineering can be used to minimize the risks associated with tumor-targeting pathogens, as well as to increase their efficiency in killing tumor cells. Advances in genetic circuit design have led to the development of bacterial strains with enhanced tumor-targeting capacities and the ability to secrete therapeutics, cytotoxic proteins and prodrug-cleaving enzymes, which allows their safe and effective use for cancer treatment. The present review details the recent advances in the design and application of these modified bacterial strains. PMID:26654438

  2. Optical Detection of core-gold nanoshells inside biosystems

    Science.gov (United States)

    D'Acunto, Mario; Dinarelli, Simone; Cricenti, Antonio; Luce, Marco

    2016-02-01

    Metal nanoshells having a dielectric core with a thin gold layer are generating new interest due to the unique optical, electric and magnetic properties exhibited by the local field enhancement near the metal-dielectric core interface. These nanoshells possess strong, highly tunable local plasmon resonances with frequencies dependent upon the nanoshell shape and core material. These unique characteristics have applications in biosensing, optical communication and medicine. In this paper, we developed a theoretical, numerical and experimental approach based on a scanning near-field optical microscope to identify nanoshells inside mouse cells. Taking advantage of the characteristic near-infrared transparency window of many biological systems, i.e. the low light absorption coefficient of biological systems between 750 and 1100 nm, we were able to identify a 100-150 nm diameter barium titanate-gold nanoshell inside h9c2 mouse cells.

  3. An Axiomatic, Unified Representation of Biosystems and Quantum Dynamics

    CERN Document Server

    Baianu, I

    2004-01-01

    An axiomatic representation of system dynamics is introduced in terms of categories, functors, organismal supercategories, limits and colimits of diagrams. Specific examples are considered in Complex Systems Biology, such as ribosome biogenesis and Hormonal Control in human subjects. "Fuzzy" Relational Structures are also proposed for flexible representations of biological system dynamics and organization.

  4. Evolving Parameters for a Noisy Bio-System

    OpenAIRE

    Chu, Dominique

    2013-01-01

    A simplified stochastic model of bacterial nutrient uptake and metabolism is presented. An evolutionary algorithm is used to explore the parameter space of this model under three different conditions: Unlimited nutrient supply, a limited nutrient supply that is replenished and a limited nutrient supply that is replenished after a period of no nutrient. A comparison of evolved parameters shows that the solutions are specific to the particular parameters for which they have been evolved. Given ...

  5. Carbon Nanomaterials: Applications in Physico-chemical Systemsand Biosystems

    Directory of Open Access Journals (Sweden)

    Maheshwar Sharon

    2008-07-01

    Full Text Available In the present article, various forms of carbon and carbon nanomaterials (CNMs) and a new approach to classify them on the basis of sp2-sp3 configuration are presented. Utilizing the concept of junction formation (like a p:n junction), a concept is developed to explain the special reactivity of nanosized carbon materials. Geometric consideration of the chiral and achiral symmetry of single-walled carbon nanotubes is presented, which is also responsible for manifesting the special properties of carbon nanotubes. A brief introduction to various common synthesis techniques of CNMs is given. These increased chemical and biological activities have resulted in many engineered nanoparticles, which are being designed for specific purposes, including diagnostic or therapeutic medical uses and environmental remediation. Defence Science Journal, 2008, 58(4), pp. 460-485, DOI: http://dx.doi.org/10.14429/dsj.58.1668

  6. A Biosystems Approach to Industrial Patient Monitoring and Diagnostic Devices

    CERN Document Server

    Baura, Gail

    2008-01-01

    A medical device is an apparatus that uses engineering and scientific principles to interface to physiology and diagnose or treat a disease. In this Lecture, we specifically consider those medical devices that are computer based, and are therefore referred to as medical instruments. Further, the medical instruments we discuss are those that incorporate system theory into their designs. We divide these types of instruments into those that provide continuous observation and those that provide a single snapshot of health information. These instruments are termed patient monitoring devices and diag...

  7. Consideration Regarding Diagnosis Analyze of Corporate Management

    Directory of Open Access Journals (Sweden)

    Mihaela Ciopi OPREA

    2009-01-01

    Full Text Available Diagnosis management aims to identify critical situations and positive aspects of corporate management. An effective diagnosis, made by a team with the status of independence from the organization's management, is for managers a useful feedback necessary to improve performance. The work presented focuses on the methodology to achieve an effective diagnosis, considering the multitude of criteria and variables to be analyzed.

  8. Portable Peltier-cooled XRF analyzer

    International Nuclear Information System (INIS)

    Full text: Recent developments in semiconductor detectors have made it possible to design portable, battery-operated XRF analyzers. Energy resolution and peak-to-background ratio are close to liquid-nitrogen-cooled detector values. Application examples are given and a comparison between the new device and older ones is made. (author)

  9. Graphic method for analyzing common path interferometers

    DEFF Research Database (Denmark)

    Glückstad, J.

    1998-01-01

    Common path interferometers are widely used for visualizing phase disturbances and fluid flows. They are attractive because of the inherent simplicity and robustness in the setup. A graphic method will be presented for analyzing and optimizing filter parameters in common path interferometers....

  10. How to Analyze Company Using Social Network?

    Science.gov (United States)

    Palus, Sebastian; Bródka, Piotr; Kazienko, Przemysław

    Every single company or institution wants to utilize its resources in the most efficient way. In order to do so, it has to have a good structure. A new way to analyze company structure by utilizing the natural social network existing within the company, and an example of its usage on the Enron company, are presented in this paper.
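
    A minimal version of the idea: mine a communication log for the informal network and rank people by degree. The e-mail log and names below are invented for illustration; the paper's actual method (applied to the Enron corpus) is more elaborate.

```python
from collections import Counter

# Hypothetical internal e-mail log: (sender, recipient) pairs.
emails = [
    ("alice", "bob"), ("alice", "carol"), ("bob", "carol"),
    ("dave", "alice"), ("erin", "alice"), ("erin", "bob"),
]

# Degree centrality: people involved in the most communication links are
# candidates for informal hubs, regardless of the formal org chart.
degree = Counter()
for sender, recipient in emails:
    degree[sender] += 1
    degree[recipient] += 1

hub, links = degree.most_common(1)[0]
print(hub, links)  # alice is involved in the most links (4)
```

    Comparing such a ranking with the official hierarchy is one way to spot mismatches between formal and informal structure.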

  11. Analyzing Vessel Behavior Using Process Mining

    NARCIS (Netherlands)

    Maggi, F.M.; Mooij, A.J.; Aalst, W.M.P. van der

    2013-01-01

    In the maritime domain, electronic sensors such as AIS receivers and radars collect large amounts of data about the vessels in a certain geographical area. We investigate the use of process mining techniques for analyzing the behavior of the vessels based on these data. In the context of maritime sa

  12. Miniature retarding grid ion energy analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Gibson, G.W.; Sawin, H.H.

    1992-12-01

    A retarding grid analyzer intended for use as a high-density ({approximately}10{sup 12}/cc) plasma diagnostic has been designed, built and tested. The analyzer's external dimensions are 0.125 inch x 0.125 inch x 0.050 inch, which are smaller than macroscopic plasma scale lengths, thus allowing it to be stalk mounted and moved throughout the plasma. The grids are 2000 line/inch nickel mesh so that the linear dimension of grid open area is less than the Debye length for plasmas with 10 eV electrons and 10{sup 12}/cc densities. Successive grids are separated by 0.01 inch in order to avoid space charge effects between grids and thus allow unprecedented energy resolution. Also, because the linear dimension normal to the grid is small compared to the ion mean free path in high pressure (>100 mTorr) discharges, it can be used without the differential pumping required of larger GEAs in such discharges. The analyzer has been tested on a plasma beam source (a modified ASTeX Compact ECR source) and on an ASTeX S1500 ECR source, and has been used as an edge diagnostic on the VERSATOR tokamak at M.I.T. Ion energy distribution functions as narrow as 5 eV have been measured.
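
    The grid-sizing argument in this abstract is easy to check numerically. The sketch below computes the Debye length for the quoted 10 eV, 10^12/cc plasma and compares it with the 2000 line/inch mesh pitch (the open dimension of the mesh is smaller still, so the abstract's claim is conservative).

```python
import math

EPS0 = 8.854e-12   # vacuum permittivity, F/m
E = 1.602e-19      # elementary charge, C

def debye_length_m(te_ev, n_per_m3):
    # lambda_D = sqrt(eps0 * kTe / (n * e^2)); with Te in eV, kTe = e * Te,
    # so one factor of e cancels.
    return math.sqrt(EPS0 * te_ev / (n_per_m3 * E))

lam = debye_length_m(10.0, 1e12 * 1e6)   # 10 eV, 10^12 cm^-3 converted to m^-3
pitch = 0.0254 / 2000                    # 2000 line/inch mesh pitch, in meters

print(lam * 1e6, pitch * 1e6)  # ~23.5 um Debye length vs 12.7 um pitch
```

    The pitch (12.7 µm) is indeed below the Debye length (about 23.5 µm), consistent with the design claim.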

  13. 40 CFR 92.109 - Analyzer specifications.

    Science.gov (United States)

    2010-07-01

    ... consistent with the general requirements of 40 CFR part 1065, subpart I, for sampling and analysis of... (CONTINUED) CONTROL OF AIR POLLUTION FROM LOCOMOTIVES AND LOCOMOTIVE ENGINES Test Procedures § 92.109... shall be at least 60 percent of full-scale chart deflection. For NOX analyzers using a water trap,...

  14. Modeling and analyzing architectural change with Alloy

    DEFF Research Database (Denmark)

    Hansen, Klaus Marius; Ingstrup, Mads

    2010-01-01

    to the uptake of reconfiguration techniques in industry. Using the Alloy language and associated tool, we propose a practical way to formally model and analyze runtime architectural change expressed as architectural scripts. Our evaluation shows the performance to be acceptable; our experience is that the modelling language is convenient and expressive, and that our model accurately represents the implementation it is used to reason about....

  15. Quantum Key Distribution with Screening and Analyzing

    CERN Document Server

    Kye, W H

    2006-01-01

    We propose a quantum key distribution scheme using screening angles and analyzing detectors, which make it possible to notice the presence of Eve, who eavesdrops on the quantum channel. We show the security of the proposed quantum key distribution against impersonation, photon number splitting, Trojan Horse, and composite attacks.

  16. Quantum Key Distribution with Screening and Analyzing

    OpenAIRE

    Kye, Won-Ho

    2006-01-01

    We propose a quantum key distribution scheme using screening angles and analyzing detectors, which make it possible to notice the presence of Eve, who eavesdrops on the quantum channel, as a revised protocol of the recent quantum key distribution [Phys. Rev. Lett. 95, 040501 (2005)]. We discuss the security of the proposed quantum key distribution against various attacks, including the impersonation attack and the Trojan Horse attack.

  17. Analyzing computer system performance with Perl

    CERN Document Server

    Gunther, Neil J

    2011-01-01

    This expanded second edition of Analyzing Computer System Performance with Perl::PDQ builds on the success of the first edition. It contains new chapters on queues, tools, and virtualization, and a new Perl listing format to aid readability of PDQ models.

  18. LINUX kernel analyzing and device driver design

    International Nuclear Information System (INIS)

    With the development of GNU software, more and more people try to apply Linux in Data Acquisition and Control Systems. The core techniques of the kernel are analyzed, and the key methods for designing a Linux device driver for a PCI character device are introduced, based on the authors' Data Acquisition and Control System.

  19. BK/TD models for analyzing in vitro impedance data on cytotoxicity

    OpenAIRE

    Teng, Sophie; Barcellini-Couget, Sylvie; Beaudoin, Rémy; Desousa, Georges; Rahmani, Roger; Pery, Alexandre

    2015-01-01

    The ban of animal testing has enhanced the development of new in vitro technologies for cosmetics safety assessment. Impedance metrics is one such technology which enables monitoring of cell viability in real time. However, analyzing real time data requires moving from static to dynamic toxicity assessment. In the present study, we built mechanistic biokinetic/toxicodynamic (BK/TD) models to analyze the time course of cell viability in cytotoxicity assay using impedance. These models accou...

  20. Mobilitics: Analyzing Privacy Leaks in Smartphones

    OpenAIRE

    Achara, Jagdish,; Castelluccia, Claude; Lefruit, James-Douglass; Roca, Vincent; Baudot, Franck; Delcroix, Geoffrey

    2013-01-01

    Who, do you think, is aware of almost everything you do? Well, it's probably right there in your pocket, if you own a smartphone and carry it with you. In order to evaluate the actual privacy risks of smartphones and to raise public awareness of these risks, the CNIL (French data protection authority) and the Inria (French public science and technology institution dedicated to computational sciences) Privatics team started working together in 2012 as part of the Mobilitics project.

  1. Analyzing Music Services Positioning Through Qualitative Research

    OpenAIRE

    Manuel Cuadrado; María José Miquel; Juan D. MONTORO

    2015-01-01

    Information technologies have produced new ways of distributing and consuming music, mainly by youth, in relation to both goods and services. In the case of goods, there has been a dramatic shift from traditional ways of buying and listening to music to new digital platforms. There has also been an evolution in relation to music services. In this sense, live music concerts have been losing their audiences over the past few years, as have music radio stations, in favor of streaming platforms. ...

  2. Methodology for analyzing risk at nuclear facilities

    International Nuclear Information System (INIS)

    Highlights: • A new methodology for evaluating the risk at nuclear facilities was developed. • Five measures reflecting all factors that should be concerned to assess risk were developed. • The attributes on NMAC and nuclear security culture are included as attributes for analyzing. • The newly developed methodology can be used to evaluate risk of both existing facility and future nuclear system. - Abstract: A methodology for evaluating risks at nuclear facilities is developed in this work. A series of measures is drawn from the analysis of factors that determine risks. Five measures are created to evaluate risks at nuclear facilities. These include the legal and institutional framework, material control, physical protection system effectiveness, human resources, and consequences. Evaluation attributes are developed for each measure and specific values are given in order to calculate the risk value quantitatively. Questionnaires are drawn up on whether or not a state has properly established a legal and regulatory framework (based on international standards). These questionnaires can be a useful measure for comparing the status of the physical protection regime between two countries. Analyzing an insider threat is not an easy task and no methodology has been developed for this purpose. In this study, attributes that could quantitatively evaluate an insider threat, in the case of an unauthorized removal of nuclear materials, are developed by adopting the Nuclear Material Accounting & Control (NMAC) system. The effectiveness of a physical protection system, P(E), could be analyzed by calculating the probability of interruption, P(I), and the probability of neutralization, P(N). In this study, the Tool for Evaluating Security System (TESS) code developed by KINAC is used to calculate P(I) and P(N). Consequence is an important measure used to analyze risks at nuclear facilities. This measure comprises radiological, economic, and social damage. Social and
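
    The abstract defines P(E) in terms of P(I) and P(N) without stating the combination rule. A common convention in physical-protection analysis, used here as an assumption rather than taken from the abstract, multiplies the two probabilities:

```python
def system_effectiveness(p_interrupt, p_neutralize):
    # Assumed combination rule P(E) = P(I) * P(N): the protection system
    # succeeds only if the adversary is both interrupted AND neutralized.
    return p_interrupt * p_neutralize

# Illustrative values only; the abstract reports no numbers.
p_i, p_n = 0.9, 0.8
p_e = system_effectiveness(p_i, p_n)
print(p_e)  # a value within floating-point rounding of 0.72
```

    Under this rule even a high interruption probability yields a modest overall effectiveness when neutralization is uncertain, which is why both terms are computed (here, by the TESS code) rather than either alone.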

  3. External technology sources: Embodied or disembodied technology acquisition

    OpenAIRE

    Cassiman, Bruno; Veugelers, Reinhilde

    2000-01-01

    This paper analyzes the choice between different innovation activities of a firm. In particular, we study the technology acquisition decision of the firm, i.e. its technology BUY decision as part of the firm's innovation strategy. We take a closer look at the different types of external technology acquisition where we distinguish two broad types of technology buy decisions. On the one hand, the firm can acquire new technology which is embodied in an asset that is acquir...

  4. Application and promotion of wireless charging technology

    OpenAIRE

    Yan, Kaijun

    2014-01-01

    The aim of this thesis is to study wireless charging technology and analyze the application and promotion of each technology. The technology is based on Faraday's electromagnetic induction, discovered in the 1830s. It is not a new technology, but it is developing at high speed nowadays. This thesis introduces four mainstream types of wireless charging technology and three mainstream standards, and analyzes their features and development status. Wireless charging technology has been applied to some products, suc...

  5. An improved prism energy analyzer for neutrons

    International Nuclear Information System (INIS)

    The effects of two improvements of an existing neutron energy analyzer consisting of stacked silicon prism rows are presented. First, we tested the effect of coating the back of the prism rows with an absorbing layer to suppress neutron scattering by total reflection and by refraction at small angles. Experiments at HZB showed that this works perfectly. Second, the prism rows were bent to shift the transmitted wavelength band to larger wavelengths. At HZB we showed that bending increased the transmission of neutrons with a wavelength of 4.9 Å. Experiments with a white beam at the EROS reflectometer at LLB showed that bending the energy analyzing device to a radius of 7.9 m makes it possible to shift the transmitted wavelength band from 0-9 Å to 2-16 Å.

  6. An improved prism energy analyzer for neutrons

    Energy Technology Data Exchange (ETDEWEB)

    Schulz, J., E-mail: jennifer.schulz@helmholtz-berlin.de [Helmholtz-Zentrum Berlin für Materialien und Energie, Hahn-Meitner-Platz 1, 14109 Berlin (Germany); Ott, F. [Laboratoire Leon Brillouin, Bât 563 CEA Saclay, 91191 Gif sur Yvette Cedex (France); Krist, Th. [Helmholtz-Zentrum Berlin für Materialien und Energie, Hahn-Meitner-Platz 1, 14109 Berlin (Germany)

    2014-04-21

    The effects of two improvements of an existing neutron energy analyzer consisting of stacked silicon prism rows are presented. First, we tested the effect of coating the back of the prism rows with an absorbing layer to suppress neutron scattering by total reflection and by refraction at small angles. Experiments at HZB showed that this works perfectly. Second, the prism rows were bent to shift the transmitted wavelength band to larger wavelengths. At HZB we showed that bending increased the transmission of neutrons with a wavelength of 4.9 Å. Experiments with a white beam at the EROS reflectometer at LLB showed that bending the energy analyzing device to a radius of 7.9 m makes it possible to shift the transmitted wavelength band from 0-9 Å to 2-16 Å.

  7. The Krsko NPP Analyzer - phase II

    International Nuclear Information System (INIS)

    During the assessment of the nuclear safety level in Slovenia the SNSA (Slovenian Nuclear Safety Administration) found a need to gain software support for analyzing the transients of the Krsko Nuclear Power Plant. By combining the RELAP5 code with the graphical interface NPA (Nuclear Plant Analyzer), the management staff saw an opportunity to have a powerful instrument for analyses and calculations on a user-friendly basis and to get familiar with the RELAP5 input deck without extra costs. The project started with the first phase, in which the Jozef Stefan Institute - OR4 formed the basic scope of the work and prepared a demo version of the SBLOCA on the basis of the RELAP5/Mod3.1 code input deck. Tractebel was chosen as the supplier of the project's second phase on the basis of a public bid. In 1996 the work started with the translation of the input model from version Mod2.5 to Mod3.2 with standard routines and small final corrections. After this phase of the project, a user of the Krsko NPP Analyzer can run accidents such as SBLOCA, Main Steam Line Break, Feedwater Line Break, SGTR, and many other transients by activating and combining interactive commands, starting from full power operation. A third phase is planned. The SNSA plans to improve the model of the Krsko NPP for a better simulation of core phenomena and to have detailed models of safety and auxiliary systems for increasing the number of possible transients and failures. The Critical Safety Function window will be created as a special mask. The analyzer will be used for the education of employees and external experts, who are engaged in case of an emergency, to get familiar with the NPP systems and their operation.

  8. Computer-based radionuclide analyzer system

    International Nuclear Information System (INIS)

    The radionuclide analysis in nuclear power plants, practiced for the purpose of monitoring the quality of the primary loop water, confirming the performance of the reactor cleanup system and monitoring the radioactive waste effluent, is an important job. Important as it is, it requires considerable labor of experts, because the samples to be analyzed are multifarious and very large in number, and in addition, this job depends much on manual work. With a view to saving labor, simplifying and standardizing the work, reducing radiation exposure, and automating the work of analysis, the computerized analyzer system has been worked out. The results of its performance test at an operating power plant have proved that the development has fairly accomplished its objectives and that the system is quite useful. The developmental work was carried out through cooperation between The Tokyo Electric Power Co. and Toshiba in about 4 years from 1974 to this year. (auth.)

  9. Information Theory for Analyzing Neural Networks

    OpenAIRE

    Sørngård, Bård

    2014-01-01

    The goal of this thesis was to investigate how information theory could be used to analyze artificial neural networks. For this purpose, two problems, a classification problem and a controller problem, were considered. The classification problem was solved with a feedforward neural network trained with backpropagation; the controller problem was solved with a continuous-time recurrent neural network optimized with evolution. Results from the classification problem show that mutual information ...

  10. Firms’ Innovation Strategies Analyzed and Explained

    OpenAIRE

    Tavassoli, Sam; Karlsson, Charlie

    2015-01-01

    This paper analyzes various innovation strategies of firms. Using five waves of the Community Innovation Survey in Sweden, we have traced the innovative behavior of firms over a ten-year period, i.e. between 2002 and 2012. We distinguish between sixteen innovation strategies, which are composed of the four Schumpeterian types of innovation (process, product, marketing, and organizational) plus various combinations of these four types. First, we find that firms are not homogeneous in choosing innovatio...

  11. The analyzing of Dove marketing strategy

    Institute of Scientific and Technical Information of China (English)

    Guo; Yaohui

    2015-01-01

    1. Introduction In this report, I try to analyze the related information about DOVE chocolate. Firstly, I would like to introduce this product. Dove chocolate is one of a series of products launched by the world's largest pet food and snack food manufacturer, the U.S. multinational food company Mars. Entered China in 1989, it has become China's leading brand of chocolate in

  12. LEGAL-EASE:Analyzing Chinese Financial Statements

    Institute of Scientific and Technical Information of China (English)

    EDWARD; MA

    2008-01-01

    In this article, we will focus on understanding and analyzing the typical accounts of Chinese financial statements, including the balance sheet and income statement. Accounts are generally incorrectly prepared. This can be due to several factors: incompetence, as well as more serious cases of deliberate attempts to deceive. Regardless, accounts can be understood and errors or specific acts of misrepresentation uncovered. We will conduct some simple analysis to demonstrate how these can be spotted.

  13. Organization theory. Analyzing health care organizations.

    Science.gov (United States)

    Cors, W K

    1997-02-01

    Organization theory (OT) is a tool that can be applied to analyze and understand health care organizations. Transaction cost theory is used to explain, in a unifying fashion, the myriad changes being undertaken by different groups of constituencies in health care. Agency theory is applied to aligning economic incentives needed to ensure Integrated Delivery System (IDS) success. By using tools such as OT, a clearer understanding of organizational changes is possible. PMID:10164970

  14. SPAM: Signal Processing to Analyze Malware

    OpenAIRE

    Nataraj, Lakshmanan; Manjunath, B.S.

    2016-01-01

    In this article, we explored orthogonal methods to analyze malware motivated by signal and image processing. Malware samples are represented as images or signals. Image and signal-based features are extracted to characterize malware. Our extensive experiments demonstrate the efficacy of our methods on malware classification and retrieval. We believe that our techniques will open the scope of signal-and image-based methods to broader fields in computer security.

  15. Analyzing the robustness of telecommunication networks

    OpenAIRE

    Eller, Karol Schaeffer

    1992-01-01

    This project report defines network robustness and discusses capability indicators that could be used to analyze network robustness. Growing dependence on telecommunication networks and recent network outages have focused attention on network robustness. The National Communications System (NCS), a confederation of 23 Federal departments and agencies, has been concerned with network robustness since its formation in 1963. The NCS is developing and implementing systems a...

  16. Coordinating, Scheduling, Processing and Analyzing IYA09

    Science.gov (United States)

    Gipson, John; Behrend, Dirk; Gordon, David; Himwich, Ed; MacMillan, Dan; Titus, Mike; Corey, Brian

    2010-01-01

    The IVS scheduled a special astrometric VLBI session for the International Year of Astronomy 2009 (IYA09) commemorating 400 years of optical astronomy and 40 years of VLBI. The IYA09 session is the most ambitious geodetic session to date in terms of network size, number of sources, and number of observations. We describe the process of designing, coordinating, scheduling, pre-session station checkout, correlating, and analyzing this session.

  17. SACO: Static analyzer for concurrent objects

    OpenAIRE

    Albert Albiol, Elvira; Arenas Sánchez, Purificación; Flores Montoya, A.; Genaim, Samir; Gómez-Zamalloa Gil, Miguel; Martín Martín, Enrique; Puebla, G.; Román Díez, Guillermo

    2014-01-01

    We present the main concepts, usage and implementation of SACO, a static analyzer for concurrent objects. Interestingly, SACO is able to infer both liveness (namely termination and resource boundedness) and safety properties (namely deadlock freedom) of programs based on concurrent objects. The system integrates auxiliary analyses such as points-to and may-happen-in-parallel, which are essential for increasing the accuracy of the aforementioned more complex properties. SACO provides accurate ...

  18. ANANAS - A Framework For Analyzing Android Applications

    OpenAIRE

    Eder, Thomas; Rodler, Michael; Vymazal, Dieter; Zeilinger, Markus

    2013-01-01

    Android is an open software platform for mobile devices with a large market share in the smartphone sector. The openness of the system as well as its wide adoption leads to an increasing amount of malware developed for this platform. ANANAS is an expandable and modular framework for analyzing Android applications. It takes care of common needs for dynamic malware analysis and provides an interface for the development of plugins. Adaptability and expandability have been main design goals during...

  19. Measuring and analyzing child labor: methodological issues

    OpenAIRE

    Grimsrud, Bjorne

    2002-01-01

    Current statistics on child labor are generally based on economically active children. This paper will argue that these figures are not a workable proxy for data on child labor, generating numbers of child laborers, and their gender composition, that do not represent the group described by the international definition of child labor. This raises the question of reliable alternative ways of measuring children's activities, with the aim of analyzing the incidence of child labor. The paper addre...

  20. Measuring and analyzing child labor : methodological issues

    OpenAIRE

    Grimsrud, Bjorne

    2001-01-01

    The paper argues that the current statistics on child labor (generally based on economically active children), are not a workable proxy for data on child labor, generating numbers of child laborers, and their gender composition, that do not represent the groups described by the international definition of child labor. This raises the question of reliable alternative ways of measuring children's activities, with the aim of analyzing the incidence of child labor. The paper addresses this, and p...

  1. Thermo Scientific Sulfur Dioxide Analyzer Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Springston, S. R. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-03-01

    The Sulfur Dioxide Analyzer measures sulfur dioxide based on absorbance of UV light at one wavelength by SO2 molecules which then decay to a lower energy state by emitting UV light at a longer wavelength. Specifically, SO2 + hν1 → SO2* → SO2 + hν2. The emitted light is proportional to the concentration of SO2 in the optical cell. External communication with the analyzer is available through an Ethernet port configured through the instrument network of the AOS systems. The Model 43i-TLE is part of the i-series of Thermo Scientific instruments. The i-series instruments are designed to interface with external computers through the proprietary Thermo Scientific iPort Software. However, this software is somewhat cumbersome and inflexible. BNL has written an interface program in National Instruments LabView that both controls the Model 43i-TLE Analyzer AND queries the unit for all measurement and housekeeping data. The LabView vi (the software program written by BNL) ingests all raw data from the instrument and outputs raw data files in a uniform data format similar to other instruments in the AOS and described more fully in Section 6.0 below.

  2. Aliasing Errors in Parallel Signature Analyzers

    Institute of Scientific and Technical Information of China (English)

    闵应骅; YashwantK.Malaiya

    1990-01-01

    A Linear Feedback Shift Register (LFSR) can be used to compress test response data as a Signature Analyzer (SA). Parallel Signature Analyzers (PSAs) implemented as multiple-input LFSRs are faster and require less hardware overhead than Serial Signature Analyzers (SSAs) for compacting test response data for Built-In Self-Test (BIST) in IC or board-testing environments. However, SAs are prone to aliasing errors because of some specific types of error patterns. An alias is a faulty output signature that is identical to the fault-free signature. A penetrating analysis of the detecting capability of SAs depends strongly on mathematical manipulation, rather than on awareness of a few special example cases. In addition, the analysis should not be restricted to a particular structure of LFSR, but should be appropriate for various structures of LFSRs. This paper presents necessary and sufficient conditions for aliasing errors based on a complete mathematical description of various types of SAs. An LFSR reconfiguration scheme is suggested which will prevent any aliasing double errors. Such prevention cannot be obtained by any extension of an LFSR.
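
    The aliasing phenomenon the paper analyzes can be demonstrated with a toy serial signature analyzer. The 4-bit register, the polynomial x^4 + x + 1, and the 8-bit streams below are illustrative choices, not the paper's general setting: with only 2^4 signatures for 2^8 possible response streams, the pigeonhole principle guarantees that some faulty response compresses to the fault-free signature.

```python
from itertools import product

def signature(bits, poly=0b10011, width=4):
    # Serial signature analyzer modeled as polynomial division over GF(2):
    # the signature is the remainder of the bit stream divided by the LFSR's
    # feedback polynomial (here x^4 + x + 1).
    reg = 0
    for b in bits:
        reg = (reg << 1) | b
        if reg >> width:       # overflow bit set: XOR out the polynomial
            reg ^= poly
    return reg

# Brute-force search for an alias: two different 8-bit response streams
# that compress to the same 4-bit signature.
seen = {}
alias = None
for bits in product((0, 1), repeat=8):
    s = signature(bits)
    if s in seen:
        alias = (seen[s], bits)
        break
    seen[s] = bits

good, faulty = alias
print(good, faulty, signature(good))
```

    In LFSR theory the aliasing error patterns are exactly those whose error polynomial is divisible by the feedback polynomial, which is the kind of condition the paper formalizes for various SA structures.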

  3. Method of stabilizing single channel analyzers

    International Nuclear Information System (INIS)

    A method and apparatus to reduce the drift of single channel analyzers are described. Essentially, this invention employs a time-sharing or multiplexing technique to ensure that the outputs from two single channel analyzers (SCAs) maintain the same count ratio regardless of variations in the threshold voltage source or voltage changes. The multiplexing technique is accomplished when a flip-flop, actuated by a clock, changes state to switch the outputs from the individual SCAs before these outputs are sent to a ratio counting scaler. In the particular system embodiment disclosed that illustrates this invention, the sulfur content of coal is determined by subjecting the coal to radiation from a neutron-producing source. A photomultiplier and detector system converts the transmitted gamma radiation to an analog voltage signal and sends the same signal, after amplification, to an SCA system that contains the invention. Therein, at least two single channel analyzers scan the analog signal over different parts of a spectral region. The two outputs may then be sent to a digital multiplexer so that the output from the multiplexer contains counts falling within two distinct segments of the region. By dividing the counts from the multiplexer by each other, the percentage of sulfur within the coal sample under observation may be determined. (U.S.)
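
    The drift-cancellation idea in this patent abstract can be illustrated with a toy model (the count rates and drift factors below are invented): because both SCAs share the same threshold electronics, a common gain drift scales both counts equally and cancels in the ratio.

```python
def sca_counts(rate_a, rate_b, drift, t=1.0):
    # Both single channel analyzers share the threshold voltage source, so a
    # drift there is modeled as a common multiplicative error on both rates.
    return rate_a * drift * t, rate_b * drift * t

ratios = []
for drift in (0.9, 1.0, 1.1):      # electronics drifting over time
    a, b = sca_counts(200.0, 50.0, drift)
    ratios.append(a / b)

print(ratios)  # each ratio is 4.0 up to floating-point rounding
```

    An absolute count in either window would shift with the drift, but the ratio, which is what the sulfur determination uses, does not.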

  4. On-line chemical composition analyzer development

    Energy Technology Data Exchange (ETDEWEB)

    Roberts, M.J.; Garrison, A.A.; Muly, E.C.; Moore, C.F.

    1992-02-01

    The energy consumed in distillation processes in the United States represents nearly three percent of the total national energy consumption. If effective control of distillation columns can be accomplished, it has been estimated that it would result in a reduction in the national energy consumption of 0.3%. Real-time control based on mixture composition could achieve these savings. However, the major distillation processes represent diverse applications, and at present there does not exist a proven on-line chemical composition sensor technology which can be used to control these diverse processes in real-time. This report presents a summary of the findings of the second phase of a three-phase effort undertaken to develop an on-line real-time measurement and control system utilizing Raman spectroscopy. A prototype instrument system has been constructed utilizing a Perkin Elmer 1700 Spectrometer, a diode-pumped YAG laser, two three-axis positioning systems, a process sample cell and a personal computer. This system has been successfully tested using industrially supplied process samples to establish its performance. Also, continued application development was undertaken during this phase of the program using both the spontaneous Raman and surface-enhanced Raman modes of operation. The study was performed for the US Department of Energy, Office of Industrial Technologies, whose mission is to conduct cost-shared R&D for new high-risk, high-payoff industrial energy conservation technologies. Although this document contains references to individual manufacturers and their products, the opinions expressed on the products reported do not necessarily reflect the position of the Department of Energy.

  5. Portable Device Analyzes Rocks and Minerals

    Science.gov (United States)

    2008-01-01

    inXitu Inc., of Mountain View, California, entered into a Phase II SBIR contract with Ames Research Center to develop technologies for the next generation of scientific instruments for materials analysis. The work resulted in a sample handling system that could find a wide range of applications in research and industrial laboratories as a means to load powdered samples for analysis or process control. Potential industries include chemical, cement, inks, pharmaceutical, ceramics, and forensics. Additional applications include characterizing materials that cannot be ground to a fine size, such as explosives and research pharmaceuticals.

  6. Measuring and analyzing on natural radioactive nuclide uranium concentration in mineral water from market

    International Nuclear Information System (INIS)

    Using laser-fluorescence analyzing technology and adopting the standard mix method, the measurement and analysis of mineral water were carried out. Seventeen samples of mineral water were chosen. The LMA-3 type laser trace analysis instrument was employed. The measuring result showed that the uranium content of the mineral water is within the normal radioactive background level

  7. Technological Networks

    Science.gov (United States)

    Mitra, Bivas

    The study of networks in the form of mathematical graph theory is one of the fundamental pillars of discrete mathematics. However, recent years have witnessed a substantial new movement in network research. The focus of the research is shifting away from the analysis of small graphs and the properties of individual vertices or edges toward the statistical properties of large-scale networks. This new approach has been driven largely by the availability of technological networks like the Internet [12], the World Wide Web [2], etc. that allow us to gather and analyze data on a scale far larger than previously possible. At the same time, technological networks have evolved into socio-technological systems, as concepts from social systems based on self-organization theory have become embedded in technological networks [13]. In today’s society, we have simple and universal access to great amounts of information and services. These information services are based upon the infrastructure of the Internet and the World Wide Web. The Internet is the system composed of ‘computers’ connected by cables or some other form of physical connection. Over this physical network, it is possible to exchange e-mails, transfer files, etc. The World Wide Web (commonly shortened to the Web), on the other hand, is a system of interlinked hypertext documents accessed via the Internet, where nodes represent web pages and links represent hyperlinks between the pages. Peer-to-peer (P2P) networks [26] have also recently become a popular medium through which huge amounts of data can be shared. P2P file sharing systems, where files are searched and downloaded among peers without the help of central servers, have emerged as a major component of Internet traffic. An important advantage of P2P networks is that all clients provide resources, including bandwidth, storage space, and computing power. In this chapter, we discuss these technological networks in detail. The review

  8. A chemical analyzer for charged ultrafine particles

    Directory of Open Access Journals (Sweden)

    S. G. Gonser

    2013-04-01

    Full Text Available New particle formation is a frequent phenomenon in the atmosphere and of major significance for the earth's climate and human health. To date the mechanisms leading to the nucleation of particles as well as to aerosol growth are not completely understood. A lack of appropriate measurement equipment for online analysis of the chemical composition of freshly nucleated particles is one major limitation. We have developed a Chemical Analyzer for Charged Ultrafine Particles (CAChUP) capable of analyzing particles with diameters below 30 nm. A bulk of size-separated particles is collected electrostatically on a metal filament, resistively desorbed, and subsequently analyzed for its molecular composition in a time-of-flight mass spectrometer. We report technical details as well as characterization experiments performed with the CAChUP. Our instrument was tested in the laboratory for its detection performance as well as for its collection and desorption capabilities. The manual application of known masses of camphene (C10H16) to the desorption filament resulted in a detection limit between 0.5 and 5 ng, and showed a linear response of the mass spectrometer. Flow tube experiments with 25 nm diameter secondary organic aerosol from ozonolysis of alpha-pinene also showed a linear relation between collection time and the mass spectrometer's signal intensity. The resulting mass spectra from the collection experiments are in good agreement with published work on particles generated by the ozonolysis of alpha-pinene. A sensitivity study shows that the current setup of CAChUP is ready for laboratory measurements and for the observation of new particle formation events in the field.

  9. A computer program for analyzing channel geometry

    Science.gov (United States)

    Regan, R.S.; Schaffranek, R.W.

    1985-01-01

    The Channel Geometry Analysis Program (CGAP) provides the capability to process, analyze, and format cross-sectional data for input to flow/transport simulation models or other computational programs. CGAP allows for a variety of cross-sectional data input formats through use of variable format specification. The program accepts data from various computer media and provides for modification of machine-stored parameter values. CGAP has been devised to provide a rapid and efficient means of computing and analyzing the physical properties of an open-channel reach defined by a sequence of cross sections. CGAP's 16 options provide a wide range of methods by which to analyze and depict a channel reach and its individual cross-sectional properties. The primary function of the program is to compute the area, width, wetted perimeter, and hydraulic radius of cross sections at successive increments of water surface elevation (stage) from data that consist of coordinate pairs of cross-channel distances and land surface or channel bottom elevations. Longitudinal rates-of-change of cross-sectional properties are also computed, as are the mean properties of a channel reach. Output products include tabular lists of cross-sectional area, channel width, wetted perimeter, hydraulic radius, average depth, and cross-sectional symmetry computed as functions of stage; plots of cross sections; plots of cross-sectional area and (or) channel width as functions of stage; tabular lists of cross-sectional area and channel width computed as functions of stage for subdivisions of a cross section; plots of cross sections in isometric projection; and plots of cross-sectional area at a fixed stage as a function of longitudinal distance along an open-channel reach. A Command Procedure Language program and a Job Control Language procedure exist to facilitate program execution on the U.S. Geological Survey Prime and Amdahl computer systems, respectively. (Lantz-PTT)
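    The core computation CGAP performs can be illustrated with a short sketch (an illustration of the method, not CGAP's actual code): given coordinate pairs of cross-channel distance and bed elevation, the submerged area, top width, wetted perimeter, and hydraulic radius at a given stage follow from segment-wise trapezoidal integration, clipping each segment at the water surface.

    ```python
    import math

    def section_properties(stations, elevations, stage):
        """Area, top width, wetted perimeter, and hydraulic radius
        of a cross section at a given water surface elevation (stage)."""
        area = width = perimeter = 0.0
        for i in range(len(stations) - 1):
            x1, z1 = stations[i], elevations[i]
            x2, z2 = stations[i + 1], elevations[i + 1]
            d1, d2 = stage - z1, stage - z2   # water depths at segment ends
            if d1 <= 0 and d2 <= 0:
                continue                       # segment entirely above water
            if d1 < 0 or d2 < 0:               # segment crosses the surface
                t = d1 / (d1 - d2)             # interpolate crossing point
                xc = x1 + t * (x2 - x1)
                if d1 < 0:
                    x1, d1 = xc, 0.0
                else:
                    x2, d2 = xc, 0.0
            w = x2 - x1
            area += 0.5 * (d1 + d2) * w        # trapezoid rule
            width += w
            perimeter += math.hypot(w, d1 - d2)  # wetted boundary length
        radius = area / perimeter if perimeter else 0.0
        return area, width, perimeter, radius
    ```

    For a symmetric triangular channel (bed at elevation 0, banks rising to 10 over 5 units each side) at stage 5, this yields area 12.5, top width 5, and the expected hydraulic radius, matching hand calculation.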

  10. A incorporação de novas tecnologias nos serviços de saúde: o desafio da análise dos fatores em jogo Adoption of new technologies by health services: the challenge of analyzing relevant factors

    Directory of Open Access Journals (Sweden)

    Evelinda Trindade

    2008-05-01

    Full Text Available The exponential pace of technology adoption in health care has been considered one of the reasons for the sector's growing expenditures. Such decisions involve multiple levels and stakeholders. Decentralization has multiplied the decision-making levels, with numerous difficult choices and limited resources. The interrelationship between stakeholders is complex, in creative systems with multiple determinants and confounders. This review discusses the interaction between the factors influencing the decisions to incorporate technologies in health services, and proposes a structure for their analysis. The application and intensity of these factors in decision-making and the incorporation of products and programs by health services shape the installed capacity of local and regional networks and modify the health system. Empirical observation of technology adoption decision processes in Brazilian health services is an important challenge. Structured recognition and measurement of these variables can help improve the proactive planning of health services.

  11. Analyzing WMAP Observation by Quantum Gravity

    CERN Document Server

    Hamada, Ken-ji; Sugiyama, Naoshi; Yukawa, Tetsuyuki

    2007-01-01

    The angular power spectra of the cosmic microwave background are analyzed in light of an evolutionary scenario of the universe based on the renormalizable quantum theory of gravity in four dimensions. The equation of evolution is solved numerically, fixing the initial condition to the power-law spectrum predicted by conformal gravity. The equation requires the introduction of a dynamical energy scale of about 10^{17} GeV, where the inflationary space-time evolution makes a transition to the big bang of the conventional Friedmann universe. The quality of fit to the three-year WMAP data suggests that the observations can be understood in terms of quantum gravity.

  12. Nonlinear single-spin spectrum analyzer.

    Science.gov (United States)

    Kotler, Shlomi; Akerman, Nitzan; Glickman, Yinnon; Ozeri, Roee

    2013-03-15

    Qubits have been used as linear spectrum analyzers of their environments. Here we solve the problem of nonlinear spectral analysis, required for discrete noise induced by a strongly coupled environment. Our nonperturbative analytical model shows a nonlinear signal dependence on noise power, resulting in a spectral resolution beyond the Fourier limit as well as frequency mixing. We develop a noise characterization scheme adapted to this nonlinearity. We then apply it using a single trapped ion as a sensitive probe of strong, non-Gaussian, discrete magnetic field noise. Finally, we experimentally compare the performance of equidistant and Uhrig modulation schemes for spectral analysis. PMID:25166519

  13. Analyzing the Control Structure of PEPA

    DEFF Research Database (Denmark)

    Yang, Fan; Nielson, Hanne Riis

    The Performance Evaluation Process Algebra, PEPA, is introduced by Jane Hillston as a stochastic process algebra for modelling distributed systems and especially suitable for performance evaluation. We present a static analysis that very precisely approximates the control structure of processes...... to PEPA programs, the approximating result is very precise. Based on the analysis, we also develop algorithms for validating the deadlock property of PEPA programs. The techniques have been implemented in a tool which is able to analyze processes with a control structure that more than one thousand...

  14. Using SCR methods to analyze requirements documentation

    Science.gov (United States)

    Callahan, John; Morrison, Jeffery

    1995-01-01

    Software Cost Reduction (SCR) methods are being utilized to analyze and verify selected parts of NASA's EOS-DIS Core System (ECS) requirements documentation. SCR is being used as a spot-inspection tool. Through this formal and systematic application of the SCR requirements methods, insights are gained as to whether the requirements are internally inconsistent or incomplete as the scenarios of intended usage evolve in the OC (Operations Concept) documentation. Thus, by modelling the scenarios and requirements as mode charts using the SCR methods, we have been able to identify problems within and between the documents.

  15. Thermo Scientific Ozone Analyzer Instrument Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Springston, S. R. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2016-03-01

    The primary measurement output from the Thermo Scientific Ozone Analyzer is the concentration of the analyte (O3) reported at 1-s resolution in units of ppbv in ambient air. Note that because of internal pneumatic switching limitations the instrument only makes an independent measurement every 4 seconds. Thus, the same concentration number is repeated roughly 4 times at the uniform, monotonic 1-s time base used in the AOS systems. Accompanying instrument outputs include sample temperatures, flows, chamber pressure, lamp intensities and a multiplicity of housekeeping information. There is also a field for operator comments made at any time while data is being collected.

  16. Light-weight analyzer for odor recognition

    Energy Technology Data Exchange (ETDEWEB)

    Vass, Arpad A; Wise, Marcus B

    2014-05-20

    The invention provides a lightweight analyzer, e.g., detector, capable of locating clandestine graves. The detector utilizes the very specific and unique chemicals identified in the database of human decompositional odor. This detector, based on specific chemical compounds found relevant to human decomposition, is the next step forward in clandestine grave detection and will take the guesswork out of current methods using canines and ground-penetrating radar, which have historically been unreliable. The detector is self-contained, portable, and built for field use. Both visual and auditory cues are provided to the operator.

  17. CRISP90 - SOFTWARE DESIGN ANALYZER SYSTEM

    Science.gov (United States)

    Tausworthe, R. C.

    1994-01-01

    The CRISP90 Software Design Analyzer System, an update of CRISP-80, is a set of programs forming a software design and documentation tool which supports top-down, hierarchic, modular, structured design and programming methodologies. The quality of a computer program can often be significantly influenced by the design medium in which the program is developed. The medium must foster the expression of the programmer's ideas easily and quickly, and it must permit flexible and facile alterations, additions, and deletions to these ideas as the design evolves. The CRISP90 software design analyzer system was developed to provide the PDL (Programmer Design Language) programmer with such a design medium. A program design using CRISP90 consists of short, English-like textual descriptions of data, interfaces, and procedures that are embedded in a simple, structured, modular syntax. The display is formatted into two-dimensional, flowchart-like segments for a graphic presentation of the design. Together with a good interactive full-screen editor or word processor, the CRISP90 design analyzer becomes a powerful tool for the programmer. In addition to being a text formatter, the CRISP90 system prepares material that would be tedious and error-prone to extract manually, such as a table of contents, module directory, structure (tier) chart, cross-references, and a statistics report on the characteristics of the design. Referenced modules are marked by schematic logic symbols to show conditional, iterative, and/or concurrent invocation in the program. A keyword usage profile can be generated automatically and glossary definitions inserted into the output documentation. Another feature is the capability to detect changes that were made between versions. Thus, "change-bars" can be placed in the output document along with a list of changed pages and a version history report. Also, items may be marked as "to be determined" and each will appear on a special table until the item is

  18. Analyzing Ever Growing Datasets in PHENIX

    International Nuclear Information System (INIS)

    After 10 years of running, the PHENIX experiment has by now accumulated more than 700 TB of reconstructed data which are directly used for analysis. Analyzing these amounts of data efficiently requires a coordinated approach. Beginning in 2005 we started to develop a system for the RHIC and ATLAS Computing Facility (RACF) which allows the efficient analysis of these large data sets. The Analysis Taxi is now the tool which allows any collaborator to process any data set taken since 2003 in weekly passes with turnaround times of typically three to four days.

  19. Technology-Use Mediation

    DEFF Research Database (Denmark)

    Bansler, Jørgen P.; Havn, Erling C.

    This study analyzes how a group of ‘mediators’ in a large, multinational company adapted a computer-mediated communication technology (a ‘virtual workspace’) to the organizational context (and vice versa) by modifying features of the technology, providing ongoing support for users, and promoting appropriate conventions of use. Our findings corroborate earlier research on technology-use mediation, which suggests that such mediators can exert considerable influence on how a particular technology will be established and used in an organization. However, this study also indicates that the process of technology-use mediation is more complex and indeterminate than earlier literature suggests. In particular, we want to draw attention to the fact that advanced computer-mediated communication technologies are equivocal and that technology-use mediation consequently requires ongoing sensemaking (Weick 1995).


  1. BWR plant analyzer development at BNL

    International Nuclear Information System (INIS)

    An engineering plant analyzer has been developed at BNL for realistically and accurately simulating transients and severe abnormal events in BWR power plants. Simulations are being carried out routinely with high fidelity, high simulation speed, at low cost, and with unsurpassed user convenience. The BNL Plant Analyzer is the only operating facility which (a) simulates more than two orders of magnitude faster than the CDC-7600 mainframe computer, (b) is accessible and fully operational in on-line interactive mode, remotely from anywhere in the US, from Europe or the Far East (Korea), via widely available IBM-PC compatible personal computers, standard modems and telephone lines, (c) simulates both slow and rapid transients seven times faster than real-time speed in direct access, and four times faster in remote access modes, (d) achieves high simulation speed without compromising fidelity, and (e) is available to remote access users at the low cost of $160 per hour. The accomplishment of detailed and accurate simulations of complex power plants at high speed and low cost is due chiefly to two reasons. The first is the application of five distinct modeling principles [2] which are not employed in any other simulation code. The second, and even more important, is the utilization of a special-purpose peripheral computer with its 13 task-specific parallel processors.

  2. The Solar Wind Ion Analyzer for MAVEN

    Science.gov (United States)

    Halekas, J. S.; Taylor, E. R.; Dalton, G.; Johnson, G.; Curtis, D. W.; McFadden, J. P.; Mitchell, D. L.; Lin, R. P.; Jakosky, B. M.

    2015-12-01

    The Solar Wind Ion Analyzer (SWIA) on the MAVEN mission will measure the solar wind ion flows around Mars, both in the upstream solar wind and in the magneto-sheath and tail regions inside the bow shock. The solar wind flux provides one of the key energy inputs that can drive atmospheric escape from the Martian system, as well as in part controlling the structure of the magnetosphere through which non-thermal ion escape must take place. SWIA measurements contribute to the top level MAVEN goals of characterizing the upper atmosphere and the processes that operate there, and parameterizing the escape of atmospheric gases to extrapolate the total loss to space throughout Mars' history. To accomplish these goals, SWIA utilizes a toroidal energy analyzer with electrostatic deflectors to provide a broad 360∘×90∘ field of view on a 3-axis spacecraft, with a mechanical attenuator to enable a very high dynamic range. SWIA provides high cadence measurements of ion velocity distributions with high energy resolution (14.5 %) and angular resolution (3.75∘×4.5∘ in the sunward direction, 22.5∘×22.5∘ elsewhere), and a broad energy range of 5 eV to 25 keV. Onboard computation of bulk moments and energy spectra enable measurements of the basic properties of the solar wind at 0.25 Hz.

  3. Analyzing and Mining Ordered Information Tables

    Institute of Scientific and Technical Information of China (English)

    SAI Ying (赛英); Y. Y. Yao

    2003-01-01

    Work in inductive learning has mostly concentrated on classification. However, there are many applications in which it is desirable to order rather than to classify instances. For modelling ordering problems, we generalize the notion of information tables to ordered information tables by adding order relations on attribute values. We then propose a data analysis model that analyzes the dependency of attributes to describe the properties of ordered information tables. The problem of mining ordering rules is formulated as finding associations between orderings of attribute values and the overall ordering of objects. An ordering rule may state that "if the value of an object x on an attribute a is ordered ahead of the value of another object y on the same attribute, then x is ordered ahead of y". For mining ordering rules, we first transform an ordered information table into a binary information table, and then apply any standard machine learning and data mining algorithms. As an illustration, we analyze in detail Maclean's universities ranking for the year 2000.
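    The transformation described above, from an ordered information table to a binary table over object pairs, can be sketched as follows (a toy illustration with hypothetical attribute names, not the authors' code; "ordered ahead" is taken here to mean a smaller value):

    ```python
    from itertools import permutations

    def to_binary_table(objects, attributes, overall):
        """One row per ordered object pair (x, y): feature a = 1 iff x's
        value on attribute a is ordered ahead of y's, and the label = 1
        iff x is ordered ahead of y in the overall ordering."""
        rows = []
        for x, y in permutations(objects, 2):
            features = {a: int(x[a] < y[a]) for a in attributes}
            rows.append((features, int(x[overall] < y[overall])))
        return rows

    # Hypothetical mini-ranking: a lower number means ordered ahead
    unis = [
        {"reputation": 1, "resources": 2, "rank": 1},
        {"reputation": 2, "resources": 1, "rank": 2},
        {"reputation": 3, "resources": 3, "rank": 3},
    ]
    binary = to_binary_table(unis, ["reputation", "resources"], "rank")
    ```

    Any standard classifier can then be trained on the binary rows to find associations between pairwise attribute orderings and the overall ordering.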

  4. The MAVEN Solar Wind Electron Analyzer

    Science.gov (United States)

    Mitchell, D. L.; Mazelle, C.; Sauvaud, J.-A.; Thocaven, J.-J.; Rouzaud, J.; Fedorov, A.; Rouger, P.; Toublanc, D.; Taylor, E.; Gordon, D.; Robinson, M.; Heavner, S.; Turin, P.; Diaz-Aguado, M.; Curtis, D. W.; Lin, R. P.; Jakosky, B. M.

    2016-04-01

    The MAVEN Solar Wind Electron Analyzer (SWEA) is a symmetric hemispheric electrostatic analyzer with deflectors that is designed to measure the energy and angular distributions of 3-4600-eV electrons in the Mars environment. This energy range is important for impact ionization of planetary atmospheric species, and encompasses the solar wind core and halo populations, shock-energized electrons, auroral electrons, and ionospheric primary photoelectrons. The instrument is mounted at the end of a 1.5-meter boom to provide a clear field of view that spans nearly 80 % of the sky with ˜20° resolution. With an energy resolution of 17 % (Δ E/E), SWEA readily distinguishes electrons of solar wind and ionospheric origin. Combined with a 2-second measurement cadence and on-board real-time pitch angle mapping, SWEA determines magnetic topology with high (˜8-km) spatial resolution, so that local measurements of the plasma and magnetic field can be placed into global context.

  5. Analyzing delay causes in Egyptian construction projects

    Directory of Open Access Journals (Sweden)

    Mohamed M. Marzouk

    2014-01-01

    Full Text Available Construction delays are common problems in civil engineering projects in Egypt. These problems occur frequently during the project lifetime, leading to disputes and litigation. Therefore, it is essential to study and analyze the causes of construction delays. This research presents a list of construction delay causes retrieved from the literature. The feedback of construction experts was obtained through interviews. Subsequently, a questionnaire survey was prepared and distributed to thirty-three construction experts representing owners, consultants, and contractors' organizations. A Frequency Index, Severity Index, and Importance Index are calculated, and according to their highest values the top ten delay causes of construction projects in Egypt are determined. A case study is analyzed and compared to the most important delay causes in the research. Statistical analysis is carried out using the analysis of variance (ANOVA) method to test the delay causes obtained from the survey. The test results reveal good correlation between groups, while there are significant differences between them for some delay causes, and finally a roadmap for prioritizing delay cause groups is presented.
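    The three indices named above are commonly computed from Likert-scale survey ratings; one widely used formulation (an illustration with hypothetical ratings, not necessarily the exact formulas of this study) is:

    ```python
    def relative_index(ratings, scale_max=4):
        """Relative index (%) of one delay cause:
        100 * sum(ratings) / (scale_max * number_of_respondents)."""
        return 100.0 * sum(ratings) / (scale_max * len(ratings))

    # Hypothetical ratings from four respondents for a single cause
    freq_idx = relative_index([4, 3, 4, 2])   # Frequency Index
    sev_idx = relative_index([3, 4, 4, 4])    # Severity Index
    imp_idx = freq_idx * sev_idx / 100.0      # Importance Index = FI * SI / 100
    ```

    Ranking causes by the Importance Index combines how often a cause occurs with how severe its impact is when it does.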

  6. Sentiment Analyzer for Arabic Comments System

    Directory of Open Access Journals (Sweden)

    Alaa El-Dine Ali Hamouda

    2013-04-01

    Full Text Available Today, the number of users of social networks is increasing. Millions of users share opinions on different aspects of life every day. Social networks are therefore rich sources of data for opinion mining and sentiment analysis. Users have also become more interested in following news pages on Facebook. Several posts, political ones for example, have thousands of user comments that agree or disagree with the post content. Such comments can be a good indicator of the community's opinion about the post content. For politicians, marketers, decision makers …, it is required to perform sentiment analysis to know the percentages of users who agree, disagree, or are neutral with respect to a post. This raised the need to analyze users' comments on Facebook. We focused on Arabic Facebook news pages for the task of sentiment analysis. We developed a corpus for sentiment analysis and opinion mining purposes. Then, we used different machine learning algorithms – decision trees, support vector machines, and naive Bayes – to develop a sentiment analyzer. The performance of the system using each technique was evaluated and compared with the others.
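    The naive Bayes variant mentioned above can be sketched compactly. The toy below uses English stand-ins for Arabic tokens and hypothetical labels; it is a minimal multinomial naive Bayes with add-one smoothing, not the authors' implementation:

    ```python
    import math
    from collections import Counter

    def train_nb(docs):
        """docs: list of (token_list, label) pairs.
        Returns class counts, per-class word counts, and the vocabulary."""
        labels = Counter(label for _, label in docs)
        words = {label: Counter() for label in labels}
        vocab = set()
        for tokens, label in docs:
            words[label].update(tokens)
            vocab.update(tokens)
        return labels, words, vocab

    def classify(tokens, labels, words, vocab):
        """Pick the label maximizing log P(label) + sum log P(token|label),
        with add-one (Laplace) smoothing; unseen words are ignored."""
        n = sum(labels.values())
        best, best_score = None, float("-inf")
        for label, count in labels.items():
            score = math.log(count / n)
            total = sum(words[label].values())
            for t in tokens:
                if t in vocab:
                    score += math.log((words[label][t] + 1) / (total + len(vocab)))
            if score > best_score:
                best, best_score = label, score
        return best

    # Hypothetical translated comments (a real system works on Arabic tokens)
    train = [
        (["great", "decision", "support"], "agree"),
        (["totally", "support", "this"], "agree"),
        (["bad", "decision", "reject"], "disagree"),
        (["reject", "this", "completely"], "disagree"),
    ]
    model = train_nb(train)
    prediction = classify(["support", "decision"], *model)
    ```

    The same bag-of-words features can be fed to a decision tree or an SVM for the comparison the paper describes.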

  7. Analyzing Network Coding Gossip Made Easy

    CERN Document Server

    Haeupler, Bernhard

    2010-01-01

    We give a new technique to analyze the stopping time of gossip protocols that are based on random linear network coding (RLNC). Our analysis drastically simplifies, extends and strengthens previous results. We analyze RLNC gossip in a general framework for network and communication models that encompasses and unifies the models used previously in this context. We show, in most settings for the first time, that it converges with high probability in the information-theoretically optimal time. Most stopping times are of the form O(k + T) where k is the number of messages to be distributed and T is the time it takes to disseminate one message. This means RLNC gossip achieves "perfect pipelining". Our analysis directly extends to highly dynamic networks in which the topology can change completely at any time. This remains true even if the network dynamics are controlled by a fully adaptive adversary that knows the complete network state. Virtually nothing besides simple O(kT) sequential flooding protocols was prev...
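    The core intuition behind RLNC gossip, that a receiver can decode as soon as it has collected k linearly independent coded packets, can be checked with a toy simulation over GF(2), with coefficient vectors stored as bitmasks (a sketch under simplified assumptions; the paper's model allows larger fields and adversarial network dynamics):

    ```python
    import random

    def packets_until_decodable(k, rng):
        """Count random GF(2) coded packets until their coefficient
        vectors span all k messages (i.e., the receiver can decode)."""
        pivots, rank, sent = [0] * k, 0, 0
        while rank < k:
            v = rng.randrange(1, 1 << k)  # random nonzero coefficient vector
            sent += 1
            for bit in reversed(range(k)):  # online Gaussian elimination
                if (v >> bit) & 1:
                    if pivots[bit]:
                        v ^= pivots[bit]    # reduce by existing pivot row
                    else:
                        pivots[bit] = v     # linearly independent: keep it
                        rank += 1
                        break
        return sent

    rng = random.Random(0)
    trials = [packets_until_decodable(16, rng) for _ in range(200)]
    ```

    Over GF(2) the expected overhead beyond the minimum k packets is a small constant (about 1.6), which is the "perfect pipelining" effect in miniature.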

  8. Calibration of the portable wear metal analyzer

    Science.gov (United States)

    Quinn, Michael J.

    1987-12-01

    The Portable Wear Metal Analyzer (PWMA), a graphite furnace atomic absorption (AA) spectrometer developed under contract for this laboratory, was evaluated using powdered metal particles suspended in oil. The PWMA is a microprocessor-controlled, automatic, sequential multielement AA spectrometer designed to support the deployed aircraft requirement for spectrometric oil analysis. The PWMA will analyze for nine elements (Ni, Fe, Cu, Cr, Ag, Mg, Si, Ti, Al) at a rate of 4 min per sample. The graphite tube and modified sample introduction system increase the detection of particles in oil when compared to the currently used techniques of flame AA or spark atomic emission (AE) spectroscopy. The PWMA shows good-to-excellent response for particles in the 0 to 5 and 5 to 10 micrometer size ranges and fair response for particles of 10 to 20 and 20 to 30 micrometers. All trends in statistical variation are easily explained by system considerations. Correction factors to the calibration curves are necessary to correlate the analytical capability of the PWMA to the performance of existing spectrometric oil analysis (SOA) instruments.

  9. Analyzing Interoperability of Protocols Using Model Checking

    Institute of Scientific and Technical Information of China (English)

    WUPeng

    2005-01-01

    In practical terms, protocol interoperability testing is still laborious and error-prone with limited effectiveness, even for those products that have passed conformance testing. Deadlock and unsymmetrical data communication are familiar in interoperability testing, and it is always very hard to trace their causes. Previous work has not provided a coherent way to analyze why interoperability was broken among protocol implementations under test. In this paper, an alternative approach is presented for analyzing these problems from the viewpoint of implementation structures. Sequential and concurrent structures are both representative implementation structures, especially in the event-driven development model. Our research mainly discusses the influence of sequential and concurrent structures on interoperability, with two instructive conclusions: (a) a sequential structure may lead to deadlock; (b) a concurrent structure may lead to unsymmetrical data communication. Therefore, implementation structures have a significant bearing on interoperability, a factor that may not have gained much attention before. To some extent, they are decisive for the result of interoperability testing. Moreover, a concurrent structure with a sound task-scheduling strategy may contribute to the interoperability of a protocol implementation. Here, the model checking technique is introduced into interoperability analysis for the first time. As the paper shows, it is an effective way to validate developers' selections of implementation structures or strategies.

  10. Approaches for Managing and Analyzing Unstructured Data

    Directory of Open Access Journals (Sweden)

    N. Veeranjaneyulu

    2014-01-01

    Full Text Available Large volumes of the data that will be stored and accessed in the future are unstructured. Unstructured data is generated at a very fast pace and uses large storage areas, which increases the storage budget. Extracting value from this unstructured data while balancing the budget is the most challenging task. Archives of interactive media, satellite and medical images, information from social network sites, legal documents, presentations, and web pages from various data sources affect the data center's ability to maintain control over unstructured data. Therefore, it is very essential to design systems that provide efficient storage of, and access to, these vast and continuously growing repositories of unstructured data. This can be achieved by retrieving structured information from the unstructured data. In this paper, we discuss approaches to process and manage such data. We also elaborate on the architecture, technologies, and applications that facilitate system design and evaluation.

  11. Investigating and analyzing parameters of coal gasification

    Energy Technology Data Exchange (ETDEWEB)

    Postrzednik, S.

    1983-07-01

    Investigations into coal gasification carried out by the Institute for Heat Technology of the Silesian Technical University in Gliwice within the MR-I-10 research program ('Optimization of thermodynamics and flow problems') are evaluated. The Institute is developing a mathematical model of coal gasification on a commercial scale. Laboratory investigations into the reaction kinetics of coal gasification are aimed at determining the relations used by this model. The test stand used for dry coal gasification, the gasification procedure, and the calculation methods are discussed. The test stand consists of a heating system, an analytical balance, a temperature control system, a system recording temperature fluctuations, and a flow rate control system. The results of the investigations are shown in the form of curves describing isothermal coal gasification. 6 references.

  12. Automated Root Tracking with "Root System Analyzer"

    Science.gov (United States)

    Schnepf, Andrea; Jin, Meina; Ockert, Charlotte; Bol, Roland; Leitner, Daniel

    2015-04-01

    Crucial factors for plant development are water and nutrient availability in soils. Thus, root architecture is a main aspect of plant productivity and needs to be accurately considered when describing root processes. Images of root architecture contain a huge amount of information, and image analysis helps to recover parameters describing certain root architectural and morphological traits. The majority of imaging systems for root systems are designed for two-dimensional images, such as RootReader2, GiA Roots, SmartRoot, EZ-Rhizo, and Growscreen, but most of them are semi-automated and require mouse-clicks on each root by the user. "Root System Analyzer" is a new, fully automated approach for recovering root architectural parameters from two-dimensional images of root systems. Individual roots can still be corrected manually in a user interface if required. The algorithm starts with a sequence of segmented two-dimensional images showing the dynamic development of a root system. For each image, morphological operators are used for skeletonization. Based on this, a graph representation of the root system is created. A dynamic root architecture model helps to determine which edges of the graph belong to an individual root. The algorithm elongates each root at the root tip and simulates growth confined within the already existing graph representation. The increment of root elongation is calculated assuming constant growth. For each root, the algorithm finds all possible paths and elongates the root in the direction of the optimal path. In this way, each edge of the graph is assigned to one or more coherent roots. Image sequences of root systems are handled in such a way that the previous image is used as a starting point for the current image. The algorithm is implemented in a set of Matlab m-files. Output of Root System Analyzer is a data structure that includes for each root an identification number, the branching order, the time of emergence, the parent
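
    The elongation step described above can be sketched in miniature: given a graph representation of the skeleton, grow a root from its tip along the candidate path that best continues the current heading, within a constant growth increment. The graph layout, the angular scoring rule, and all names below are illustrative assumptions, not the paper's actual dynamic root architecture model.

    ```python
    import math

    def best_elongation_path(graph, pos, tip, heading, increment):
        """Among simple paths from the root tip whose length stays within the
        growth increment, pick the one whose end direction deviates least
        from the current heading (hypothetical scoring)."""
        best = None

        def walk(node, path, length):
            nonlocal best
            if length > 0:
                dx = pos[node][0] - pos[tip][0]
                dy = pos[node][1] - pos[tip][1]
                angle = math.atan2(dy, dx)
                # Smallest absolute angular deviation from the heading.
                dev = abs((angle - heading + math.pi) % (2 * math.pi) - math.pi)
                if best is None or dev < best[0]:
                    best = (dev, list(path))
            for nxt in graph[node]:
                if nxt in path:
                    continue
                step = math.dist(pos[node], pos[nxt])
                if length + step <= increment:
                    walk(nxt, path + [nxt], length + step)

        walk(tip, [tip], 0.0)
        return best[1] if best else [tip]
    ```

    For a tip at the origin heading along the x-axis, the sketch prefers the edge that continues straight over the one that turns upward.
    
    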

  13. Mapping Technology Space by Normalizing Technology Relatedness Networks

    CERN Document Server

    Alstott, Jeff; Yan, Bowen; Luo, Jianxi

    2015-01-01

    Technology is a complex system, with technologies relating to each other in a space that can be mapped as a network. The technology relatedness network's structure can reveal properties of technologies and of human behavior, if it can be mapped accurately. Technology networks have been made from patent data, using several measures of relatedness. These measures, however, are influenced by factors of the patenting system that do not reflect technologies or their relatedness. We created technology networks that precisely controlled for these impinging factors and normalized them out, using data from 3.9 million patents. The normalized technology relatedness networks were sparse, with only ~20% of technology domain pairs more related than would be expected by chance. Different measures of technology relatedness became more correlated with each other after normalization, approaching a single dimension of technology relatedness. The normalized network corresponded with human behavior: we analyzed the patenting his...
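
    A chance-normalized relatedness measure of the general kind described above can be sketched by comparing observed domain co-occurrence counts on patents with the count expected if domains were assigned independently. The independence baseline and the function names are illustrative assumptions, not the authors' exact normalization.

    ```python
    from collections import Counter
    from itertools import combinations

    def normalized_relatedness(patents):
        """patents: list of sets of technology domain codes on each patent.
        Returns, per domain pair, the ratio of observed co-occurrences to the
        count expected under a simple independence baseline."""
        n = len(patents)
        domain_counts = Counter(d for p in patents for d in p)
        pair_counts = Counter(frozenset(pair) for p in patents
                              for pair in combinations(sorted(p), 2))
        ratios = {}
        for pair, observed in pair_counts.items():
            a, b = tuple(pair)
            expected = domain_counts[a] * domain_counts[b] / n
            ratios[tuple(sorted(pair))] = observed / expected
        return ratios
    ```

    Pairs with a ratio above 1 are more related than chance; in the paper's normalized networks only a minority (~20%) of pairs cleared that bar.
    
    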

  14. 从就业现状分析医疗美容技术专业教育教学改革中的问题与对策%Analyzing the problems and countermeasures needing attention in educating and teaching medical beauty technology from the view of employment present situation

    Institute of Scientific and Technical Information of China (English)

    郝超

    2016-01-01

    As an important branch of cosmetic medical education, medical beauty technology is an emerging specialty. The job market needs large numbers of graduates, and the specialty has good employment prospects. With demand exceeding supply in the job market, the medical beauty industry is developing in full swing. However, the specialty currently faces many problems: graduates change jobs frequently, and the five-year attrition rate of trained talent is high. Neither the enrollment nor the employment situation is optimistic, and if this dilemma is not resolved, further education and teaching reform will only waste resources. Based on the employment situation of medical beauty technology graduates from our college over the years and on school-enterprise cooperation, this article discusses ideas, measures, and countermeasures for education and teaching reform.

  16. Toward Sustainable Anticipatory Governance: Analyzing and Assessing Nanotechnology Innovation Processes

    Science.gov (United States)

    Foley, Rider Williams

    Cities around the globe struggle with socio-economic disparities, resource inefficiency, environmental contamination, and quality-of-life challenges. Technological innovation, as one prominent approach to problem solving, promises to address these challenges; yet, introducing new technologies, such as nanotechnology, into society and cities has often resulted in negative consequences. Recent research has conceptually linked anticipatory governance and sustainability science: to understand the role of technology in complex problems our societies face; to anticipate negative consequences of technological innovation; and to promote long-term oriented and responsible governance of technologies. This dissertation advances this link conceptually and empirically, focusing on nanotechnology and urban sustainability challenges. The guiding question for this dissertation research is: How can nanotechnology be innovated and governed in responsible ways and with sustainable outcomes? The dissertation: analyzes the nanotechnology innovation process from an actor- and activities-oriented perspective (Chapter 2); assesses this innovation process from a comprehensive perspective on sustainable governance (Chapter 3); constructs a small set of future scenarios to consider future implications of different nanotechnology governance models (Chapter 4); and appraises the amenability of sustainability problems to nanotechnological interventions (Chapter 5). The four studies are based on data collected through literature review, document analysis, participant observation, interviews, workshops, and walking audits, as part of process analysis, scenario construction, and technology assessment. Research was conducted in collaboration with representatives from industry, government agencies, and civic organizations. The empirical parts of the four studies focus on Metropolitan Phoenix. Findings suggest that: predefined mandates and economic goals dominate the nanotechnology innovation process

  17. Stackable differential mobility analyzer for aerosol measurement

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, Meng-Dawn (Oak Ridge, TN); Chen, Da-Ren (Creve Coeur, MO)

    2007-05-08

    A multi-stage differential mobility analyzer (MDMA) for aerosol measurements includes a first electrode or grid including at least one inlet or injection slit for receiving an aerosol including charged particles for analysis. A second electrode or grid is spaced apart from the first electrode. The second electrode has at least one sampling outlet disposed at a plurality of different distances along its length. A volume between the first and the second electrode or grid between the inlet or injection slit and a distal one of the plurality of sampling outlets forms a classifying region, the first and second electrodes for charging to suitable potentials to create an electric field within the classifying region. At least one inlet or injection slit in the second electrode receives a sheath gas flow into an upstream end of the classifying region, wherein each sampling outlet functions as an independent DMA stage and classifies different size ranges of charged particles based on electric mobility simultaneously.
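
    The classification principle each DMA stage relies on is the particle's electrical mobility. A minimal sketch of the standard mobility relation, using the common Cunningham slip-correction coefficients and approximate constants for air near room temperature:

    ```python
    import math

    E = 1.602e-19   # elementary charge, C
    MU = 1.81e-5    # dynamic viscosity of air at ~20 C, Pa*s
    MFP = 66e-9     # mean free path of air molecules, m (approximate)

    def slip_correction(dp):
        """Cunningham slip correction for a particle of diameter dp (m),
        with commonly used empirical coefficients."""
        kn = 2 * MFP / dp  # Knudsen number
        return 1 + kn * (1.257 + 0.4 * math.exp(-1.1 / kn))

    def electrical_mobility(dp, charges=1):
        """Electrical mobility Z_p = n e C_c / (3 pi mu d_p) of a particle
        of diameter dp (m) carrying `charges` elementary charges."""
        return charges * E * slip_correction(dp) / (3 * math.pi * MU * dp)
    ```

    Smaller particles have higher mobility, which is why each outlet along the classifying region samples a different size range for a fixed field.
    
    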

  18. Analyzing Options for Airborne Emergency Wireless Communications

    Energy Technology Data Exchange (ETDEWEB)

    Michael Schmitt; Juan Deaton; Curt Papke; Shane Cherry

    2008-03-01

    In the event of large-scale natural or manmade catastrophic events, access to reliable and enduring commercial communication systems is critical. Hurricane Katrina provided a recent example of the need to ensure communications during a national emergency. To ensure that communication demands are met during these critical times, Idaho National Laboratory (INL), under the guidance of United States Strategic Command, has studied infrastructure issues, concerns, and vulnerabilities associated with an airborne wireless communications capability. Such a capability could provide emergency wireless communications until public/commercial nodes can be systematically restored. This report focuses on the airborne cellular restoration concept, analyzing basic infrastructure requirements; identifying related infrastructure issues, concerns, and vulnerabilities; and offering recommended solutions.

  19. Fully Analyzing an Algebraic Polya Urn Model

    CERN Document Server

    Morcrette, Basile

    2012-01-01

    This paper introduces and analyzes a particular class of Polya urns: balls are of two colors, can only be added (the urns are said to be additive), and at every step the same constant number of balls is added, thus only the color composition varies (the urns are said to be balanced). These properties make this class of urns ideally suited for analysis from an "analytic combinatorics" point of view, following in the footsteps of Flajolet-Dumas-Puyhaubert, 2006. Through an algebraic generating function to which we apply a multiple coalescing saddle-point method, we are able to give precise asymptotic results for the probability distribution of the composition of the urn, as well as a local limit law and large deviation bounds.
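
    A balanced additive two-color urn of the kind analyzed above is easy to simulate directly; the sketch below is illustrative and stands apart from the paper's analytic machinery.

    ```python
    import random

    def simulate_urn(initial, replacement, steps, rng):
        """Simulate an additive, balanced two-color Polya urn.
        initial: [white, black] counts; replacement[c] gives the balls of
        each color added when a ball of color c is drawn. 'Balanced' means
        both rows of the replacement matrix have the same row sum, so the
        total grows by a constant each step."""
        urn = list(initial)
        for _ in range(steps):
            # Draw a ball uniformly; color 0 with probability urn[0]/total.
            color = 0 if rng.random() < urn[0] / sum(urn) else 1
            urn[0] += replacement[color][0]
            urn[1] += replacement[color][1]
        return urn
    ```

    With the classic Polya replacement matrix [[1, 0], [0, 1]], each step adds exactly one ball, so after s steps the urn holds s more balls than it started with.
    
    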

  20. Statistical network analysis for analyzing policy networks

    DEFF Research Database (Denmark)

    Robins, Garry; Lewis, Jenny; Wang, Peng

    2012-01-01

    To analyze social network data using standard statistical approaches is to risk incorrect inference. The dependencies among observations implied in a network conceptualization undermine standard assumptions of the usual general linear models. One of the most quickly expanding areas of social and policy network methodology is the development of statistical modeling approaches that can accommodate such dependent data. In this article, we review three network statistical methods commonly used in the current literature: quadratic assignment procedures, exponential random graph models (ERGMs), and stochastic actor-oriented models. We focus most attention on ERGMs by providing an illustrative example of a model for a strategic information network within a local government. We draw inferences about the structural role played by individuals recognized as key innovators and conclude that such an approach
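
    Of the three methods reviewed, the quadratic assignment procedure is the simplest to sketch: correlate two network matrices, then build a null distribution by jointly permuting the rows and columns of one of them, which preserves the dependence structure within each network. The minimal off-diagonal Pearson version below is an illustrative sketch, not a full QAP implementation.

    ```python
    import random

    def qap_correlation_test(x, y, n_perm, rng):
        """Return (observed correlation, permutation p-value) for two
        square adjacency matrices x and y, correlating off-diagonal cells."""
        n = len(x)

        def corr(a, b):
            pairs = [(a[i][j], b[i][j])
                     for i in range(n) for j in range(n) if i != j]
            ma = sum(p[0] for p in pairs) / len(pairs)
            mb = sum(p[1] for p in pairs) / len(pairs)
            cov = sum((p[0] - ma) * (p[1] - mb) for p in pairs)
            sa = sum((p[0] - ma) ** 2 for p in pairs) ** 0.5
            sb = sum((p[1] - mb) ** 2 for p in pairs) ** 0.5
            return cov / (sa * sb)

        observed = corr(x, y)
        count = 0
        for _ in range(n_perm):
            perm = list(range(n))
            rng.shuffle(perm)  # jointly permute rows and columns of x
            xp = [[x[perm[i]][perm[j]] for j in range(n)] for i in range(n)]
            if corr(xp, y) >= observed:
                count += 1
        return observed, (count + 1) / (n_perm + 1)
    ```

    Identical networks correlate perfectly, and the permutation p-value quantifies how often relabeled networks match that by chance.
    
    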

  1. Detect Adjacent Well by Analyzing Geomagnetic Anomalies

    Directory of Open Access Journals (Sweden)

    Su Zhang

    2014-03-01

    Full Text Available This study describes a method of determining the position of an adjacent well by analyzing geomagnetic anomalies during drilling. In the experiment, a casing was placed in the geomagnetic field to simulate three conditions: a vertical well, a deviated well, and a horizontal well. The interference of the regional geomagnetic field caused by the casing was studied, and the regularities of the regional geomagnetic anomalies caused by an adjacent casing were summarized. Experimental results show that the magnetic intensity distortion caused by a deviated well is similar to that caused by a horizontal well, but differs from that of a vertical well. The scope and amplitude of the N and E component magnetic intensity distortion increase with casing inclination, while the scope and amplitude of the V component distortion decrease, and the distortion value changes from negative to positive to the southwest of the adjacent well. Through the analysis of geomagnetic anomalies, the position of adjacent wells can be determined.

  2. Coke from small-diameter tubes analyzed

    International Nuclear Information System (INIS)

    The mechanism for coke deposit formation and the nature of the coke itself can vary with the design of the ethylene furnace tube bank. In this article, coke deposits from furnaces with small-diameter pyrolysis tubes are examined. The samples were taken from four furnaces of identical design (Plant B). As in both the first and second installments of the series, the coke deposits were examined using a scanning electron microscope (SEM) and an energy dispersive X-ray analyzer (EDAX). The deposits from the small-diameter tubes are compared with the coke deposits from the furnace discussed in earlier articles. Analysis of the coke in both sets of samples is then used to offer recommendations for improved decoking procedures, operating procedures, better feed selection, and better selection of the metallurgy used in furnace tubes, to extend the operating time of the furnace tubes by reducing the amount and type of coke buildup

  3. Composite blade structural analyzer (COBSTRAN) user's manual

    Science.gov (United States)

    Aiello, Robert A.

    1989-01-01

    The installation and use of a computer code, COBSTRAN (COmposite Blade STRuctural ANalyzer), developed for the design and analysis of composite turbofan and turboprop blades and also for composite wind turbine blades, are described. This code combines composite mechanics and laminate theory with an internal data base of fiber and matrix properties. Inputs to the code are constituent fiber and matrix material properties, factors reflecting the fabrication process, composite geometry, and blade geometry. COBSTRAN performs the micromechanics, macromechanics, and laminate analyses of these fiber composites. COBSTRAN generates a NASTRAN model with equivalent anisotropic homogeneous material properties. Stress output from NASTRAN is used to calculate individual ply stresses, strains, interply stresses, thru-the-thickness stresses, and failure margins. Curved panel structures may be modeled provided the curvature of a cross-section is defined by a single value function. COBSTRAN is written in FORTRAN 77.
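
    At the micromechanics level, codes of this kind start from relations such as the rule of mixtures, which estimates the longitudinal modulus of a unidirectional ply from the constituent fiber and matrix properties. A minimal sketch of that textbook relation (not COBSTRAN's actual routine):

    ```python
    def rule_of_mixtures(e_fiber, e_matrix, v_fiber):
        """Longitudinal modulus of a unidirectional ply: a volume-weighted
        average of the fiber and matrix moduli (same units for both)."""
        return v_fiber * e_fiber + (1 - v_fiber) * e_matrix
    ```

    For a carbon/epoxy ply with illustrative moduli of 230 GPa (fiber) and 3 GPa (matrix) at 60% fiber volume fraction, the estimate is 139.2 GPa.
    
    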

  4. Composite Blade Structural Analyzer (COBSTRAN) demonstration manual

    Science.gov (United States)

    Aiello, Robert A.

    1989-01-01

    The input deck setup is described for a computer code, composite blade structural analyzer (COBSTRAN), which was developed for the design and analysis of composite turbofan and turboprop blades and also for composite wind turbine blades. This manual is intended for use in conjunction with the COBSTRAN user's manual. Seven demonstration problems are described with pre- and postprocessing input decks. Modeling of blades which are solid thru-the-thickness and also aircraft wing airfoils with internal spars is shown. Corresponding NASTRAN and databank input decks are also shown. Detailed descriptions of each line of the pre- and post-processing decks are provided with reference to the Card Groups defined in the user's manual. A dictionary of all program variables and terms used in this manual may be found in Section 6 of the user's manual.

  5. Analyzing hydrological sustainability through water balance.

    Science.gov (United States)

    Menció, Anna; Folch, Albert; Mas-Pla, Josep

    2010-05-01

    The objective of the Water Framework Directive (2000/60/EC) is to assist in the development of management plans that will lead to the sustainable use of water resources in all EU member states. However, defining the degree of sustainability aimed at is not a straightforward task. It requires detailed knowledge of the hydrogeological characteristics of the basin in question, its environmental needs, the amount of human water demand, and the opportunity to construct a proper water balance that describes the behavior of the hydrological system and estimates available water resources. An analysis of the water balance in the Selva basin (Girona, NE Spain) points to the importance of regional groundwater fluxes in satisfying current exploitation rates, and shows that regional scale approaches are often necessary to evaluate water availability. In addition, we discuss the pressures on water resources, and analyze potential actions, based on the water balance results, directed towards achieving sustainable water management in the basin. PMID:20229067
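
    A lumped annual balance of the kind used above to compare recharge with exploitation can be sketched in a few lines; the terms, units, and numbers below are illustrative assumptions, not the Selva basin study's actual budget.

    ```python
    def annual_water_balance(precip, evapotrans, runoff, lateral_inflow, extraction):
        """Simple lumped basin balance (all terms in the same volume units,
        e.g. hm3/yr). Recharge is precipitation minus evapotranspiration and
        surface runoff; regional groundwater fluxes enter as lateral inflow.
        A positive result indicates storage gain; negative, overdraft."""
        recharge = precip - evapotrans - runoff
        return recharge + lateral_inflow - extraction
    ```

    The lateral-inflow term is the point the abstract stresses: ignoring regional groundwater fluxes can make a sustainable extraction rate look like overdraft, or vice versa.
    
    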

  6. Method and apparatus for analyzing ionizable materials

    International Nuclear Information System (INIS)

    An apparatus and method are described for analyzing a solution of ionizable compounds in a liquid. The solution is irradiated with electromagnetic radiation to ionize the compounds and the electrical conductivity of the solution is measured. The radiation may be X-rays, ultra-violet, infra-red or microwaves. The solution may be split into two streams, only one of which is irradiated, the other being used as a reference by comparing conductivities of the two streams. The liquid must be nonionizable and is preferably a polar solvent. The invention provides an analysis technique useful in liquid chromatography and in gas chromatography after dissolving the eluted gases in a suitable solvent. Electrical conductivity measurements performed on the irradiated eluent provide a quantitative indication of the ionizable materials existing within the eluent stream and a qualitative indication of the purity of the eluent stream. (author)

  7. Analyzing and forecasting the European social climate

    Directory of Open Access Journals (Sweden)

    Liliana DUGULEANĂ

    2015-06-01

    Full Text Available The paper uses the results of the Eurobarometer sample survey, which was commissioned by the European Commission. The social climate index measures the perceptions of the population, taking into account their personal situation and their perspective at the national level. The paper analyzes the evolution of social climate indices for the countries of the European Union and offers information about the expectations of the populations of the analyzed countries. The obtained results can be compared with the Eurobarometer forecasts, on a short term of one year and a medium term of five years. Modelling the social climate index and its influencing factors offers useful information about the efficiency of social protection and inclusion policies.

  8. Nuclear Plant Analyzer: Installation manual. Volume 1

    International Nuclear Information System (INIS)

    This report contains the installation instructions for the Nuclear Plant Analyzer (NPA) System. The NPA System consists of the Computer Visual System (CVS) program, the NPA libraries, and the associated utility programs. The NPA was developed at the Idaho National Engineering Laboratory under the sponsorship of the US Nuclear Regulatory Commission to provide a highly flexible graphical user interface for displaying the results of analysis codes. The NPA also provides the user with a convenient means of interactively controlling the host program through user-defined pop-up menus. The NPA was designed to serve primarily as an analysis tool. After a brief introduction to the Computer Visual System and the NPA, an analyst can quickly create a simple picture or set of pictures to aid in the study of a particular phenomenon. These pictures can range from simple collections of square boxes and straight lines to complex representations of emergency response information displays

  9. Color recognition system for urine analyzer

    Science.gov (United States)

    Zhu, Lianqing; Wang, Zicai; Lin, Qian; Dong, Mingli

    2010-08-01

    In order to increase the speed of photoelectric conversion, a linear CCD is applied as the photoelectric converter instead of the traditional photodiode. A white LED is used as the light source of the system. The color information of the urine test strip is transferred into the CCD through a reflecting optical system. It is then converted to digital signals by an A/D converter. The test results of urine analysis are obtained by a data processing system. An ARM microprocessor is selected as the CPU of the system and a CPLD is employed to provide a driving timing for the CCD drive and the A/D converter. Active HDL7.2 and Verilog HDL are used to simulate the driving timing of the CPLD. Experimental results show that the correctness rate of the test results is better than 90%. The system satisfies the requirements of the color information collection of urine analyzer.
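
    The color-recognition step, matching a measured pad color against stored reference colors, can be sketched as a nearest-color lookup; the reference table and the squared-Euclidean distance metric below are illustrative assumptions, not the analyzer's actual calibration data.

    ```python
    def classify_pad(rgb, reference_colors):
        """Match a measured pad RGB triple to the nearest reference color
        by squared Euclidean distance in RGB space (hypothetical table)."""
        return min(reference_colors,
                   key=lambda name: sum((a - b) ** 2
                                        for a, b in zip(rgb, reference_colors[name])))
    ```

    In practice the references would come from calibrating the CCD against strips with known analyte concentrations under the system's white LED.
    
    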

  10. A miniaturized stepwise injection spectrophotometric analyzer

    International Nuclear Information System (INIS)

    A novel micro-stepwise injection analyzer (μSWIA) has been developed for the automation and miniaturization of spectrophotometric analysis. The main unit of this device is a mixing chamber (MC) connected to the atmosphere. This part of the μSWIA provides rapid and effective homogenization of the reaction mixture components and completion of the reaction by means of gas bubbling. The μSWIA contained a rectangular labyrinth channel designed in way allowing one to eliminate bubbles by moving a solution from the MC to an optical channel. The light-emitting diode (LED) was used as a light emitter and the analytical signal was measured by a portable spectrophotometer. Fluid movement was attained via the use of a computer-controlled syringe pump. The μSWIA was successfully used for the spectrophotometric determination of cysteine in biologically active supplements and fodder by using 18-molybdo-2-phosphate heteropoly anion (18-MPA) as the reagent. (author)

  11. Analyzer of neutron flux in real time

    International Nuclear Information System (INIS)

    Based on the study of real neutron flux signals from instability events that occurred at the Laguna Verde nuclear power plant, where the core oscillation phenomena of the reactor lie in the 0 to 2.5 Hz range, the possibility has been identified of developing surveillance and diagnostic equipment capable of analyzing the behavior of the core in this frequency range in real time. An important method for monitoring the stability of the reactor core is the use of the power spectral density, which allows the frequencies and amplitudes contained in the signals to be determined. The instrument is implemented in LabVIEW graphical programming with a 16-channel data acquisition card and runs under Windows 95/98. (Author)
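
    The spectral estimate at the heart of such a monitor can be sketched with a direct DFT of the mean-removed samples; a LabVIEW implementation would use built-in PSD routines, so the Python version below is only an illustrative stand-in for locating the dominant oscillation frequency.

    ```python
    import math

    def dominant_frequency(signal, fs):
        """Return the frequency (Hz) of the strongest spectral component of
        a sampled signal, via a direct O(n^2) DFT over positive bins."""
        n = len(signal)
        mean = sum(signal) / n
        x = [s - mean for s in signal]
        best_k, best_power = 1, -1.0
        for k in range(1, n // 2):
            re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            power = re * re + im * im
            if power > best_power:
                best_k, best_power = k, power
        return best_k * fs / n
    ```

    Sampling a 1 Hz oscillation at 20 Hz for 10 s gives a 0.1 Hz bin resolution, comfortably resolving the 0 to 2.5 Hz band of interest.
    
    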

  12. Electrode contamination effects of retarding potential analyzer.

    Science.gov (United States)

    Fang, H K; Oyama, K-I; Cheng, C Z

    2014-01-01

    The electrode contamination in electrostatic analyzers such as Langmuir probes and retarding potential analyzers (RPA) is a serious problem for space measurements. The contamination layer acts as extra capacitance and resistance and leads to distortion in the measured I-V curve, which leads to erroneous measurement results. There are two main effects of the contamination layer: one is the impedance effect and the other is the charge attachment and accumulation due to the capacitance. The impedance effect can be reduced or eliminated by choosing the proper sweeping frequency. However, for RPA the charge accumulation effect becomes serious because the capacitance of the contamination layer is much larger than that of the Langmuir probe of similar dimension. The charge accumulation on the retarding potential grid causes the effective potential, that ions experience, to be changed from the applied voltage. Then, the number of ions that can pass through the retarding potential grid to reach the collector and, thus, the measured ion current are changed. This effect causes the measured ion drift velocity and ion temperature to be changed from the actual values. The error caused by the RPA electrode contamination is expected to be significant for sounding rocket measurements with low rocket velocity (1-2 km/s) and low ion temperature of 200-300 K in the height range of 100-300 km. In this paper we discuss the effects associated with the RPA contaminated electrodes based on theoretical analysis and experiments performed in a space plasma operation chamber. Finally, the development of a contamination-free RPA for sounding rocket missions is presented. PMID:24517809

  13. Analyzing large biological datasets with association networks

    Energy Technology Data Exchange (ETDEWEB)

    Karpinets, Tatiana V [ORNL; Park, Byung H [ORNL; Uberbacher, Edward C [ORNL

    2012-01-01

    Due to advances in high-throughput biotechnologies, biological information is being collected in databases at an amazing rate, requiring novel computational approaches for timely processing of the collected data into new knowledge. In this study we address this problem by developing a new approach for discovering modular structure, relationships, and regularities in complex data. These goals are achieved by converting records of biological annotations of an object, such as an organism, gene, chemical, or sequence, into networks (Anets) and rules (Arules) of the associated annotations. Anets are based on the similarity of annotation profiles of objects and can be further analyzed and visualized, providing a compact bird's-eye view of the most significant relationships in the collected data and a way of clustering and classifying them. Arules are generated by Apriori, considering each record of annotations as a transaction and augmenting each annotation item by its type. Arules provide a way to validate relationships discovered by Anets, producing comprehensive statistics on frequently associated annotations and specific confident relationships among them. A combination of Anets and Arules represents condensed information on associations among the collected data, helping to discover new knowledge and generate hypotheses. As an example, we have applied the approach to analyze bacterial metadata from the Genomes OnLine Database. The analysis allowed us to produce a map of sequenced bacterial and archaeal organisms based on their genomic, metabolic, and physiological characteristics, with three major clusters of metadata representing bacterial pathogens, environmental isolates, and plant symbionts. A signature profile of clustered annotations of environmental bacteria, when compared with pathogens, linked aerobic respiration, high GC content, and large genome size to the diversity of metabolic activities and physiological features of these organisms.
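
    The Arules step, Apriori over annotation transactions, can be sketched for the pairwise case: count items and item pairs, then emit rules whose support and confidence clear chosen thresholds. The thresholds and annotation names below are illustrative.

    ```python
    from collections import Counter
    from itertools import combinations

    def association_rules(transactions, min_support, min_confidence):
        """One-pass Apriori sketch restricted to item pairs. Each transaction
        is a set of (typed) annotations; returns rules as tuples
        (antecedent, consequent, support, confidence)."""
        n = len(transactions)
        item_counts = Counter(i for t in transactions for i in t)
        pair_counts = Counter(frozenset(p) for t in transactions
                              for p in combinations(sorted(t), 2))
        rules = []
        for pair, count in pair_counts.items():
            if count / n < min_support:
                continue
            a, b = tuple(pair)
            for antecedent, consequent in ((a, b), (b, a)):
                confidence = count / item_counts[antecedent]
                if confidence >= min_confidence:
                    rules.append((antecedent, consequent, count / n, confidence))
        return rules
    ```

    On annotation records like those in the study, a rule such as "aerobic respiration implies high GC content" would surface only if the pair is both frequent and confident in both directions or one.
    
    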

  14. A calibration free vector network analyzer

    Science.gov (United States)

    Kothari, Arpit

    Recently, two novel single-port, phase-shifter based vector network analyzer (VNA) systems were developed and tested at X-band (8.2--12.4 GHz) and Ka-band (26.4--40 GHz), respectively. These systems operate by electronically moving the standing wave pattern, set up in a waveguide, over a Schottky detector and sampling the standing wave voltage for several phase shift values. Once this system is fully characterized, all parameters in the system become known and hence, theoretically, no other correction (or calibration) should be required to obtain the reflection coefficient, Gamma, of an unknown load. This makes this type of VNA "calibration free," which is a significant advantage over other types of VNAs. To this end, a VNA system based on this design methodology was developed at X-band using several design improvements (compared to the previous designs) with the aim of demonstrating this "calibration-free" feature. It was found that when a commercial VNA (HP8510C) is used as the source and the detector, the system works as expected. However, when a detector is used (Schottky diode, log detector, etc.), obtaining the correct Gamma still requires the customary three-load calibration. With the aim of exploring the cause, a detailed sensitivity analysis of prominent error sources was performed. Extensive measurements were made with different detection techniques, including the use of a spectrum analyzer as a power detector. The system was even tested for electromagnetic compatibility (EMC), which may have contributed to this issue. Although the desired results could not be obtained using the proposed standing-wave-power measuring devices such as the Schottky diode, the principle of the "calibration-free" VNA was nonetheless shown to hold.
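
    The quantity such a standing-wave VNA ultimately recovers is the reflection coefficient of the load. Its magnitude follows from the sampled voltage extremes via the voltage standing wave ratio, a textbook relation sketched below (phase recovery, which the full system also performs from the phase-shift samples, is omitted):

    ```python
    def reflection_magnitude(v_samples):
        """Estimate |Gamma| from sampled standing-wave voltages: as the
        pattern is shifted over the detector, the samples trace out maxima
        and minima; VSWR = Vmax / Vmin and |Gamma| = (VSWR - 1) / (VSWR + 1)."""
        vmax, vmin = max(v_samples), min(v_samples)
        vswr = vmax / vmin
        return (vswr - 1) / (vswr + 1)
    ```

    A matched load produces a flat pattern (|Gamma| = 0), while a 3:1 voltage ratio corresponds to |Gamma| = 0.5.
    
    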

  15. Analyzing and Visualizing Whole Program Architectures

    Energy Technology Data Exchange (ETDEWEB)

    Panas, T; Quinlan, D; Vuduc, R

    2007-05-10

    This paper describes our work to develop new tool support for analyzing and visualizing the architecture of complete large-scale (millions or more lines of code) programs. Our approach consists of (i) creating a compact, accurate representation of a whole C or C++ program, (ii) analyzing the program in this representation, and (iii) visualizing the analysis results with respect to the program's architecture. We have implemented our approach by extending and combining a compiler infrastructure and a program visualization tool, and we believe our work will be of broad interest to those engaged in a variety of program understanding and transformation tasks. We have added new whole-program analysis support to ROSE [15, 14], a source-to-source C/C++ compiler infrastructure for creating customized analysis and transformation tools. Our whole-program work does not rely on procedure summaries; rather, we preserve all of the information present in the source while keeping our representation compact. In our representation, a million-line application fits in well less than 1 GB of memory. Because whole-program analyses can generate large amounts of data, we believe that abstracting and visualizing analysis results at the architecture level is critical to reducing the cognitive burden on the consumer of the analysis results. Therefore, we have extended Vizz3D [19], an interactive program visualization tool, with an appropriate metaphor and layout algorithm for representing a program's architecture. Our implementation provides developers with an intuitive, interactive way to view analysis results, such as those produced by ROSE, in the context of the program's architecture. The remainder of this paper summarizes our approach to whole-program analysis (Section 2) and provides an example of how we visualize the analysis results (Section 3).

  16. Irradiation Accidents in Radiotherapy Analyze, Manage, Prevent

    International Nuclear Information System (INIS)

    Why do errors occur? How can they be minimized? In a context of widely publicized major incidents, of accelerated technological advances in radiotherapy planning and delivery, and of global communication and information resources, this critical issue had to be addressed by the professionals of the field, as it has been by most national and international organizations. The ISMP, aware of its responsibility, likewise decided to put an emphasis on the topic at its annual meeting. In this framework, potential errors will first be examined in terms of scenarios, pathways of occurrence, and dosimetry, the goal being to prioritize error prevention according to the likelihood of events and their dosimetric impact. Then, case studies of three incidents will be detailed: Epinal, Glasgow and Detroit. For each one, a description of the incident and the way it was reported, its investigation, and the lessons that can be learnt will be presented. Finally, the implementation of practical measures at different levels, intra- and inter-institutional, such as teaching, enforcement of QA procedures, or voluntary incident reporting, will be discussed

  17. Upgrade of the uranium automatic analyzer 'Rapiduran': First part

    International Nuclear Information System (INIS)

    Main aspects of the modernization of the RAPIDURAN system, an automated analyzer for batches of uranium samples built in Finland in the 1980s, are presented. This is a complex system comprising many pneumatic, mechanical and electronic components. At present, the electronic system is outdated and suffers frequent failures, making its operation unreliable. For this reason, actions were taken to modernize the system using high-technology electronic components. To carry out this work, and in order to understand the operation of each component, it was first necessary to perform a reverse-engineering process. Thereafter, integral maintenance of the pneumatic and mechanical systems was performed, followed by the design and assembly of a new electronic control system. (orig.)

  18. PLC backplane analyzer for field forensics and intrusion detection

    Science.gov (United States)

    Mulder, John; Schwartz, Moses Daniel; Berg, Michael; Van Houten, Jonathan Roger; Urrea, Jorge Mario; King, Michael Aaron; Clements, Abraham Anthony; Trent, Jason; Depoy, Jennifer M; Jacob, Joshua

    2015-05-12

    The various technologies presented herein relate to the determination of unexpected and/or malicious activity occurring between components communicatively coupled across a backplane. Control data, etc., can be intercepted at a backplane where the backplane facilitates communication between a controller and at least one device in an automation process. During interception of the control data, etc., a copy of the control data can be made, e.g., the original control data can be replicated to generate a copy of the original control data. The original control data can continue on to its destination, while the control data copy can be forwarded to an analyzer system to determine whether the control data contains a data anomaly. The content of the copy of the control data can be compared with a previously captured baseline data content, where the baseline data can be captured for a same operational state as the subsequently captured control data.
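
The baseline-comparison step described above can be sketched very simply. This is an illustrative simplification, not the patented design: field names, the frame structure, and the example values are invented assumptions.

```python
# Hypothetical sketch: a copy of intercepted control data is checked
# against a previously captured baseline for the same operational state.
# Frame layout and field names are illustrative assumptions only.

def find_anomalies(control_frame: dict, baseline_frame: dict) -> list:
    """Return (field, baseline, observed) tuples that deviate from the baseline."""
    anomalies = []
    for field, baseline_value in baseline_frame.items():
        observed = control_frame.get(field)
        if observed != baseline_value:
            anomalies.append((field, baseline_value, observed))
    # Fields present in the capture but absent from the baseline are
    # also suspicious in a fixed-format control protocol.
    for field in control_frame.keys() - baseline_frame.keys():
        anomalies.append((field, None, control_frame[field]))
    return anomalies

baseline = {"opcode": 0x10, "target": 3, "setpoint": 72}
captured = {"opcode": 0x10, "target": 3, "setpoint": 99, "extra": 1}
print(find_anomalies(captured, baseline))
```

A real analyzer would of course work on raw backplane frames and tolerate legitimate value drift; the point here is only the capture-copy-compare structure of the patent's data path.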

  19. Analyzing Tibetan Monastics Conception of Universe Through Their Drawings

    Science.gov (United States)

    Sonam, Tenzin; Chris Impey

    2016-06-01

    Every culture and tradition has its own representation of the universe, one that continues to evolve through new technologies and discoveries and as a result of cultural exchange. With the recent introduction of Western science into the Tibetan Buddhist monasteries in India, this study explores the monastics' conception of the universe prior to their formal instruction in science. Their drawings were analyzed using Tversky's three criteria for drawing analysis, namely segmentation, order, and hierarchical structure of knowledge. Among the sixty Buddhist monastics included in this study, we find that most draw a geocentric model of the universe with the Solar System as the dominant physical system, reflecting little influence of modern astronomical knowledge. A few monastics draw the traditional Buddhist model of the world. The implications of the monastics' representation of the universe for their assimilation of modern science are discussed.

  20. Field-usable portable analyzer for chlorinated organic compounds

    International Nuclear Information System (INIS)

    In 1992, a chemical sensor was developed which showed almost perfect selectivity to vapors of chlorinated solvents. When interfaced to an instrument, it yields a chemical analyzer with near-absolute selectivity to vapors of volatile chlorinated organic compounds. TRI has just completed the second of a 2-phase program to develop this new instrument system, called the RCL MONITOR. In Phase II, the instrument was deployed in 5 EM40 operations, covering clean-up process monitoring, environmental modeling, routine monitoring, health and safety, and technology validation. Vapor levels between 0 and 100 ppm can be determined in 90 s, with a lower detection limit of 0.5 ppm, using the hand-portable instrument. Based on the favorable performance of the RCL MONITOR, the instrument was released for commercial sales on Sept. 20, 1996

  1. Qualitative Methodology in Analyzing Educational Phenomena

    Directory of Open Access Journals (Sweden)

    Antonio SANDU

    2010-12-01

    Full Text Available Semiological analysis of educational phenomena allows researchers access to a multidimensional universe of meanings represented by the school, seen not so much as an institution but as a vector of social action through educational strategies. We consider education a multidimensional phenomenon, since its analysis allows the researcher to explore a variety of research hypotheses from different paradigmatic perspectives that converge on an educational finality. According to the author Simona Branc, one of the most appropriate methods used in qualitative data analysis is Grounded Theory; it assumes a systematic process of generating concepts and theories based on the data collected. The specialised literature defines Grounded Theory as an inductive approach that starts with general observations and, during the analytical process, creates conceptual categories that explain the theme explored. Researchers insist on the role of sociological theory in managing the research data and in providing ways of conceptualizing descriptions and explanations. Qualitative content analysis is based on the constructivist (constructionist) paradigm in the restricted sense that we used previously. It aims to create an “understanding of the latent meanings of the analyzed messages”. Quantitative content analysis involves a process of encoding and statistical analysis of data extracted from the content of the paper, in the form of frequencies, contingency analysis, etc.

  2. Analyzing Consumer Behavior Towards Contemporary Food Retailers

    Directory of Open Access Journals (Sweden)

    E.Dursun

    2008-01-01

    Full Text Available The objective of this research is to analyze consumer behavior towards contemporary food retailers. Food retailing has been changing in recent years in Turkey, and foreign investors have been attracted by the market potential of food retailing. Retail formats have changed, and large-scale, full-service retailers featuring extended product variety are spreading rapidly nationwide. Consumers tend to shop for their household needs at contemporary retailers, due mainly to urbanization, the growing female workforce and income growth. In this research, original data were collected through face-to-face interviews with 385 respondents located in Istanbul, with the sampling distribution formed from the ratios of the different socio-economic status (SES) groups for Istanbul. Consumers prefer the closest food retailers for most food purchases, and they purchase more than they planned; the C SES group ranks first in average spending on unplanned shopping. Chain stores and hypermarkets are the retailers most preferred for food purchasing. Moreover, consumer responses to judgments related to retailing were investigated with factor analysis.

  3. Analyzing waterflood responses for Pekisko B

    Energy Technology Data Exchange (ETDEWEB)

    Fedenczuk, L. [Gambit Consulting Ltd., (Canada); Pedersen, P.; Marshall, M. [Ocelot Energy Inc., Calgary, AB (Canada)

    1999-11-01

    A study was conducted between January 1990 and September 1997 to find fluid communication between injectors and producers in the Pekisko B pool, the largest oil pool in the Sylvan Lake Field, and the third largest oil pool in the Gilby, Medicine River and Sylvan Lake Field areas in Alberta. A total of eight injectors (patterns) and all wells within a radius of 1700 m from any specific injector were analyzed. The analysis was based on responses of producers to change in the injection rates. The strength of the oil response was measured as the correlation between the oil rate and the injection rate. Three sets of primary parameters characterized the oil, gas and water response. The waterflood responses were presented in the form of special XY diagrams. All injectors showed medium or strong communication with producers as defined by the oil responses. This method of finding fluid responses proved to be more cost and time effective than conventional engineering methods. 4 refs., 8 figs.
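
The response measure described above reduces to a correlation between two rate series. A minimal sketch, with invented data values (the study's actual rates are not given in the abstract):

```python
# Strength of an oil response, taken as the Pearson correlation between
# a producer's oil rate and an injector's injection rate over time.
import math

def response_strength(injection_rates, oil_rates):
    """Pearson correlation between an injection-rate and an oil-rate series."""
    n = len(injection_rates)
    mi = sum(injection_rates) / n
    mo = sum(oil_rates) / n
    cov = sum((a - mi) * (b - mo) for a, b in zip(injection_rates, oil_rates))
    var_i = sum((a - mi) ** 2 for a in injection_rates)
    var_o = sum((b - mo) ** 2 for b in oil_rates)
    return cov / math.sqrt(var_i * var_o)

injection = [100, 120, 150, 170, 200, 220]   # injector rate per period
oil       = [55, 60, 70, 78, 90, 97]         # producer tracking the injector
print(round(response_strength(injection, oil), 3))  # strong response, near 1
```

A value near 1 would correspond to the "strong communication" category in the study; values near 0 would indicate no fluid communication between the injector and that producer.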

  4. Eastern Mediterranean Natural Gas: Analyzing Turkey's Stance

    Directory of Open Access Journals (Sweden)

    Abdullah Tanriverdi

    2016-02-01

    Full Text Available Recent large-scale natural gas discoveries in the Eastern Mediterranean have drawn attention to the region. The discoveries have caused both hope and tension. The new resources may serve as a new hope for all relevant parties, and for the region, if managed in a collaborative and conciliatory way: energy may be a remedy to Cyprus' financial predicament, initiate a process for resolving differences between Turkey and Cyprus, normalize Israel-Turkey relations, and so on. On the contrary, adopting a unilateral and uncooperative approach may aggravate the tension and undermine regional stability and security. In this sense, the role of energy in generating hope or tension depends on the approaches of the related parties. The article analyzes Turkey's attitude in the East Mediterranean case in terms of possible negative and positive implications for Turkey in the energy field, and examines Turkey's position and the reasons behind its stance. Considering Turkey's energy profile and energy policy goals, the article argues that the newly found hydrocarbons may bring greater benefits for Turkey if it adopts a cooperative approach.

  5. Analyzing Spatiotemporal Anomalies through Interactive Visualization

    Directory of Open Access Journals (Sweden)

    Tao Zhang

    2014-06-01

    Full Text Available As we move into the big data era, data grows not just in size but also in complexity, containing a rich set of attributes including location and time information, such as data from mobile devices (e.g., smart phones), natural disasters (e.g., earthquakes and hurricanes), epidemic spread, etc. Motivated by this rising challenge, we build a visualization tool for exploring generic spatiotemporal data, i.e., records containing time and location information and numeric attribute values. Since the values often evolve over time and across geographic regions, we are particularly interested in detecting and analyzing anomalous changes over time and space. Our analytic tool is based on a geographic information system combined with spatiotemporal data mining algorithms, as well as various data visualization techniques, such as anomaly grids and anomaly bars superimposed on the map. We study how effectively the tool guides users to potential anomalies by demonstrating and evaluating it on publicly available spatiotemporal datasets. The tool for spatiotemporal anomaly analysis and visualization is useful in many domains, such as security investigation and monitoring, situation awareness, etc.
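
The kind of per-cell anomaly such an "anomaly grid" would display can be sketched with a simple z-score rule. This is our own illustrative simplification, not the authors' mining algorithm; the grid cells and values are invented.

```python
# Flag grid cells whose history contains a value deviating strongly
# (z-score above a threshold) from that cell's own mean. Records are
# (cell_id, value) pairs; real data would carry timestamps as well.
from collections import defaultdict
from statistics import mean, stdev

def anomalous_cells(records, threshold=2.0):
    """Return the set of cell ids containing at least one outlier value."""
    by_cell = defaultdict(list)
    for cell, value in records:
        by_cell[cell].append(value)
    flagged = set()
    for cell, values in by_cell.items():
        if len(values) < 3:
            continue  # too little history to judge
        mu, sigma = mean(values), stdev(values)
        if sigma > 0 and any(abs(v - mu) / sigma > threshold for v in values):
            flagged.add(cell)
    return flagged

records = [("A", 10)] * 9 + [("A", 100)] \
        + [("B", v) for v in [10, 11, 10, 9, 10, 11, 9, 10, 10, 10]]
print(anomalous_cells(records))  # cell A has a spike -> {'A'}
```

The flagged cells are exactly what a map overlay would color as anomaly grids, leaving the user to drill into when and why the spike occurred.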

  6. Signal processing and analyzing works of art

    Science.gov (United States)

    Johnson, Don H.; Johnson, C. Richard, Jr.; Hendriks, Ella

    2010-08-01

    In examining paintings, art historians use a wide variety of physico-chemical methods to determine, for example, the paints, the ground (canvas primer) and any underdrawing the artist used. However, the art world has been little touched by signal processing algorithms. Our work develops algorithms to examine x-ray images of paintings, not to analyze the artist's brushstrokes but to characterize the weave of the canvas that supports the painting. The physics of radiography indicates that linear processing of the x-rays is most appropriate. Our spectral analysis algorithms have an accuracy superior to human spot-measurements and have the advantage that, through "short-space" Fourier analysis, they can be readily applied to entire x-rays. We have found that variations in the manufacturing process create a unique pattern of horizontal and vertical thread density variations in the bolts of canvas produced. In addition, we measure the thread angles, providing a way to determine the presence of cusping and to infer the location of the tacks used to stretch the canvas on a frame during the priming process. We have developed weave matching software that employs a new correlation measure to find paintings that share canvas weave characteristics. Using a corpus of over 290 paintings attributed to Vincent van Gogh, we have found several weave match cliques that we believe will refine the art historical record and provide more insight into the artist's creative processes.
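
The core spectral idea, that thread count appears as a dominant frequency in an intensity profile taken across the x-ray, can be illustrated on a synthetic signal. This toy uses a naive DFT on one profile; the authors' method applies windowed "short-space" Fourier analysis over entire x-rays.

```python
# Recover threads-per-unit-length as the strongest frequency in a
# 1-D intensity profile. The profile here is synthetic (a pure sine);
# a real canvas x-ray profile is noisy and locally varying.
import math

def dominant_frequency(signal, sample_spacing):
    """Return the strongest nonzero frequency via a naive DFT (cycles/unit)."""
    n = len(signal)
    best_k, best_power = 1, 0.0
    for k in range(1, n // 2):
        re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        power = re * re + im * im
        if power > best_power:
            best_k, best_power = k, power
    return best_k / (n * sample_spacing)

# Simulated weave: 12 threads/cm, sampled every 0.01 cm over 2 cm:
profile = [math.sin(2 * math.pi * 12 * (0.01 * t)) for t in range(200)]
print(dominant_frequency(profile, 0.01))  # -> 12.0 threads/cm
```

Repeating this estimate in overlapping windows across the x-ray yields the thread-density maps whose manufacturing-induced variations make weave matching between paintings possible.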

  7. PSAIA – Protein Structure and Interaction Analyzer

    Directory of Open Access Journals (Sweden)

    Vlahoviček Kristian

    2008-04-01

    Full Text Available Abstract Background PSAIA (Protein Structure and Interaction Analyzer) was developed to compute geometric parameters for large sets of protein structures in order to predict and investigate protein-protein interaction sites. Results In addition to the most relevant established algorithms, PSAIA offers a new method, PIADA (Protein Interaction Atom Distance Algorithm), for the determination of residue interaction pairs. We found that PIADA produced more satisfactory results than comparable algorithms implemented in PSAIA. Particular advantages of PSAIA include its capacity to combine different methods to detect the locations and types of interactions between residues, and its ability, without any further automation steps, to handle large numbers of protein structures and complexes. Generally, the integration of a variety of methods enables PSAIA to offer easier automation of analysis and greater reliability of results. PSAIA can be used either via a graphical user interface or from the command line. Results are generated in either tabular or XML format. Conclusion In a straightforward fashion and for large sets of protein structures, PSAIA enables the calculation of protein geometric parameters and the determination of the location and type of protein-protein interaction sites. XML-formatted output enables easy conversion of results to various formats suitable for statistical analysis. Results from smaller data sets demonstrated the influence of geometry on protein interaction sites. Comprehensive analysis of the properties of large data sets led to new information useful in the prediction of protein-protein interaction sites.

  8. The model JSR-12 neutron coincidence analyzer

    International Nuclear Information System (INIS)

    This paper reports that one of the ways in which non-destructive assays of nuclear materials are made involves counting the neutron signatures which result from spontaneous or induced fissions in fissile materials. A major problem in determining the number of fission neutrons is separating them from the background of neutrons arising from alpha-particle interactions with lighter nuclei in the matrix materials of the samples being assayed. The JSR-12 neutron coincidence analyzer operates on the principle that fission neutrons occur in multiples of two or more, whereas background neutrons occur randomly as single events. By exploiting this time-correlation difference, the JSR-12 can determine the fission neutron signal. This instrument represents a considerable upgrade over the industry-standard JSR-11, doubling the response speed and adding complete computer control of all functions, as well as employing non-volatile memory for data storage. Operation has been simplified for field use by using an LCD display to guide the operator in setting up assay parameters, and by time-date tagging all assays for later retrieval
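
The time-correlation principle can be sketched in a few lines: fission neutrons arrive in bursts, background neutrons singly, so counting event pairs that fall within a short coincidence gate separates the two. The timestamps and gate width below are invented, and real instruments use shift-register electronics rather than software.

```python
# Count event pairs closer together than the coincidence gate (seconds).
# Correlated fission bursts produce pairs; random background events,
# being isolated in time, contribute essentially none.
def coincidence_pairs(timestamps, gate=1e-6):
    events = sorted(timestamps)
    pairs = 0
    for i, t in enumerate(events):
        j = i + 1
        while j < len(events) and events[j] - t <= gate:
            pairs += 1
            j += 1
    return pairs

# Two fission bursts (events ~0.1 microsecond apart) amid lone background counts:
times = [0.0000000, 0.0000001,             # burst 1 (pair)
         0.0123,                           # lone background neutron
         0.0450000, 0.0450001, 0.0450002,  # burst 2 (triple)
         0.0911]                           # lone background neutron
print(coincidence_pairs(times))  # 1 + 3 pairs -> 4
```

In a real assay the accidental-coincidence rate from background is estimated with a delayed gate and subtracted, leaving the fission pair rate.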

  9. NRC plant-analyzer development at BNL

    International Nuclear Information System (INIS)

    The objective of this program is to develop an LWR engineering plant analyzer capable of performing realistic and accurate simulations of plant transients and Small-Break Loss of Coolant Accidents at real-time and faster than real-time computing speeds and at low costs for preparing, executing and evaluating such simulations. The program is directed toward facilitating reactor safety analyses, on-line plant monitoring, on-line accident diagnosis and mitigation and toward improving reactor operator training. The AD10 of Applied Dynamics International, Ann Arbor, MI, a special-purpose peripheral processor for high-speed systems simulation, is programmed through a PDP-11/34 minicomputer and carries out digital simulations with analog hardware in the input/output loop (up to 256 channels). Analog signals from a control panel are being used now to activate or to disable valves and to trip pump drive motors or regulators without interrupting the simulation. An IBM personal computer with multicolor graphics capabilities and a CRT monitor are used to produce on-line labelled diagrams of selected plant parameters as functions of time

  10. Update on the USNRC's nuclear plant analyzer

    International Nuclear Information System (INIS)

    The Nuclear Plant Analyzer (NPA) is the U.S. Nuclear Regulatory Commission's (NRC's) state-of-the-art nuclear reactor simulation capability. This computer software package integrates high fidelity nuclear reactor simulation codes such as the TRAC and RELAP5 series of codes with color graphics display techniques and advanced workstation hardware. An overview of this program was given at the 1984 Summer Computer Simulation Conference (SCSC), with selected topics discussed at the 1985 and 1986 SCSCs. This paper addresses these activities and related experiences. First, the Class VI computer implementation is discussed. The trade-offs between gaining significantly greater computational speed and central memory, and the loss of performance due to many more simultaneous users, are shown. Second, the goal of the super-minicomputer implementation is to produce a very cost-effective system that utilizes advanced (multi-dimensional, two-phase coolant) simulation capabilities at real wall-clock simulation times. Benchmarking of the initial super-minicomputer implementation is discussed. Finally, the technical and economic feasibility is addressed for implementing the super-minicomputer version of the NPA with the RELAP5 simulation code onto the Black Fox full scope nuclear power plant simulator

  11. Analyzing Design Heating Loads in Superinsulated Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Arena, Lois [Consortium for Advanced Residential Buildings, Norwalk, CT (United States)

    2015-06-16

    The U.S. Department of Energy’s Building America research team Consortium for Advanced Residential Buildings (CARB) worked with the EcoVillage cohousing community in Ithaca, New York, on the Third Residential EcoVillage Experience neighborhood. This community-scale project consists of 40 housing units: 15 apartments and 25 single-family residences. Units range in size from 450 ft² to 1,664 ft² and cost from $80,000 for a studio apartment to $235,000 for a three- or four-bedroom single-family home. For the research component of this project, CARB analyzed current heating system sizing methods for superinsulated homes in cold climates to determine whether changes in building load calculation methodology should be recommended. Actual heating energy use was monitored and compared to results from the Air Conditioning Contractors of America’s Manual J8 (MJ8) and the Passive House Planning Package software. Results from that research indicate that MJ8 significantly oversizes heating systems for superinsulated homes and that thermal inertia and internal gains should be considered for more accurate load calculations.

  12. Analyzing modified unimodular gravity via Lagrange multipliers

    Science.gov (United States)

    Sáez-Gómez, Diego

    2016-06-01

    The so-called unimodular version of general relativity is revisited. Unimodular gravity is constructed by fixing the determinant of the metric, which leads to the trace-free part of the equations instead of the usual Einstein field equations. A cosmological constant then arises naturally as an integration constant. While unimodular gravity turns out to be equivalent to general relativity (GR) at the classical level, it provides important differences at the quantum level. Here we extend the unimodular constraint to some extensions of general relativity that have drawn a lot of attention over the last years: f(R) gravity (or its scalar-tensor picture) and Gauss-Bonnet gravity. The corresponding unimodular version of such theories is constructed, as well as the conformal transformation that relates the Einstein and Jordan frames for these nonminimally coupled theories. From the classical point of view, the unimodular versions of such extensions are completely equivalent to their originals, but an effective cosmological constant arises naturally, which may provide a richer description of the evolution of the Universe. Here we analyze the case of Starobinsky inflation and compare it with the original one.
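
The way the cosmological constant "arises as an integration constant" can be made concrete with the standard textbook derivation (quoted for orientation; this is the well-known GR result, not the paper's extension to f(R) or Gauss-Bonnet gravity):

```latex
% Fixing \det g_{\mu\nu} leaves only the trace-free part of the field equations:
R_{\mu\nu} - \tfrac{1}{4} g_{\mu\nu} R
  = \kappa \left( T_{\mu\nu} - \tfrac{1}{4} g_{\mu\nu} T \right).
% Taking the divergence, the Bianchi identity \nabla^{\mu}R_{\mu\nu} = \tfrac{1}{2}\nabla_{\nu}R
% and conservation \nabla^{\mu} T_{\mu\nu} = 0 give
\nabla_{\nu}\left( R + \kappa T \right) = 0
  \quad\Longrightarrow\quad R + \kappa T = 4\Lambda,
% with \Lambda an integration constant. Substituting back recovers the full
% Einstein equations with a cosmological constant:
R_{\mu\nu} - \tfrac{1}{2} g_{\mu\nu} R + \Lambda g_{\mu\nu} = \kappa T_{\mu\nu}.
```

Since Λ enters only through this integration constant rather than through the action, its value is an initial condition of the solution, which is the feature the abstract refers to.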

  13. Analyzing sociodemographic factors amongst blood donors

    Directory of Open Access Journals (Sweden)

    Shenga Namgay

    2010-01-01

    Full Text Available Introduction: Blood transfusion is a fundamental and requisite part of any national health service, required for the optimum management of emergency conditions such as severe trauma, shock and resuscitation, with an optimum stock of its different components. The objective of the present study was to analyze the factors of knowledge of prospective blood donors that may influence their perception and awareness about blood donation. Materials and Methods: This population-based cross-sectional study was conducted at Gangtok in the state of Sikkim, India, on 300 subjects of the adult population selected by two-stage cluster sampling. The main outcome variables were the socioeconomic and demographic variables of knowledge of blood donation. The principal investigator collected the data by interview technique, using a pre-tested, structured, close-ended questionnaire. Results: 46% of the study population was found to have a high knowledge score. Knowledge about blood donation was found to be statistically significantly associated with occupational status and education level, both in the bivariate and in the multivariate analyses, but was not significantly related to age, sex, marital status, religion, community status or per capita monthly family income. Conclusion: The study suggested that perceptions toward voluntary blood donation could be influenced to a large extent by sociodemographic variables of knowledge among the general population.

  14. Artificial patinas analyzed with PIXE method

    International Nuclear Information System (INIS)

    Aiming at the restoration and conservation of archaeological metallic objects, artificial patinas can be used to simulate natural patinas (corrosion products on metals and their alloys), permitting characterization and studies of corrosion mechanisms. The formation of natural patinas is difficult to study because of the long corrosion process, which takes years. Artificial patinas, on the other hand, can easily be produced in a shorter time; moreover, they can be used to simulate the corrosion process and as substitutes for monuments and old art objects damaged for some reason. In our study, artificial patinas were produced on pellets of copper and bronze with sulfate, chloride and nitrate solutions and were analyzed with the PIXE (Proton Induced X-Ray Emission) technique to supply qualitative and quantitative information on the corrosion elements. The quantitative PIXE analysis takes into account the absorption of the incident ion beam and of the emergent X-rays in the sample, as well as the patina layer and the backing. The PIXE results showed the presence of S, Cl and Fe and some other elements already known from the backings, such as Cu, Sn, etc. PIXE measurements were also performed on reference metallic materials. (author)

  15. Novel topological descriptors for analyzing biological networks

    Directory of Open Access Journals (Sweden)

    Varmuza Kurt K

    2010-06-01

    Full Text Available Abstract Background Topological descriptors, other graph measures, and, in a broader sense, graph-theoretical methods have been proven powerful tools for performing biological network analysis. However, the majority of the developed descriptors and graph-theoretical methods do not have the ability to take vertex- and edge-labels into account, e.g., atom- and bond-types when considering molecular graphs. Indeed, this feature is important to characterize biological networks more meaningfully instead of only considering pure topological information. Results In this paper, we put the emphasis on analyzing a special type of biological networks, namely bio-chemical structures. First, we derive entropic measures to calculate the information content of vertex- and edge-labeled graphs and investigate some useful properties thereof. Second, we apply the mentioned measures combined with other well-known descriptors to supervised machine learning methods for predicting Ames mutagenicity. Moreover, we investigate the influence of our topological descriptors - measures for only unlabeled vs. measures for labeled graphs - on the prediction performance of the underlying graph classification problem. Conclusions Our study demonstrates that the application of entropic measures to molecules represented as graphs is useful to characterize such structures meaningfully. For instance, we have found that if one extends the measures for determining the structural information content of unlabeled graphs to labeled graphs, the uniqueness of the resulting indices is higher. Because measures to structurally characterize labeled graphs are clearly underrepresented so far, the further development of such methods might be valuable and fruitful for solving problems within biological network analysis.
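
Why labeled-graph measures discriminate more structure than unlabeled ones can be shown with a toy entropy calculation. This is our own simplification, not the paper's exact measures: it takes the Shannon entropy of a vertex partition, first by degree alone, then by (degree, label) pairs.

```python
# Shannon entropy of the equivalence classes of a graph's vertices.
# Partitioning by degree alone ignores labels; partitioning by
# (degree, label) pairs, as a labeled-graph measure would, can
# distinguish graphs the unlabeled measure cannot.
import math
from collections import Counter

def partition_entropy(items):
    """Shannon entropy (bits) of the equivalence classes in `items`."""
    counts = Counter(items)
    n = len(items)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# A 4-cycle: every vertex has degree 2, but labels alternate C, O, C, O.
degrees = [2, 2, 2, 2]
labels = ["C", "O", "C", "O"]
unlabeled = partition_entropy(degrees)                   # one class -> 0 bits
labeled = partition_entropy(list(zip(degrees, labels)))  # two classes -> 1 bit
print(unlabeled, labeled)
```

The unlabeled measure assigns this molecule zero information content, while the labeled one does not, which mirrors the paper's finding that label-aware indices have higher uniqueness.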

  16. Analyzing personalized policies for online biometric verification.

    Directory of Open Access Journals (Sweden)

    Apaar Sadhwani

    Full Text Available Motivated by India's nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident's biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India's program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) all 12 biometrics for each resident, which represents a five (respectively, four) log reduction in FRR relative to fingerprint (respectively, iris) policies previously proposed for India's biometric program. The mean delay is [Formula: see text] sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32-41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident.
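
The likelihood-ratio decision at the heart of the policy can be sketched with invented Gaussian score models. The paper fits a far richer joint distribution over all 12 scores; the per-score independence, means, variances, and threshold below are purely illustrative assumptions.

```python
# Accept a resident as genuine when the genuine/imposter likelihood
# ratio of the observed similarity scores exceeds a threshold.
import math

def gaussian_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def likelihood_ratio(scores, genuine=(0.8, 0.1), imposter=(0.3, 0.1)):
    """Product of per-score LRs, assuming (unrealistically) independent scores."""
    lr = 1.0
    for s in scores:
        lr *= gaussian_pdf(s, *genuine) / gaussian_pdf(s, *imposter)
    return lr

def verify(scores, threshold=100.0):
    return likelihood_ratio(scores) > threshold

print(verify([0.82, 0.78, 0.85]))  # high similarity -> genuine (True)
print(verify([0.31, 0.35, 0.28]))  # low similarity  -> imposter (False)
```

Raising the threshold lowers the FAR at the cost of a higher FRR; the paper's contribution is choosing, per resident, which scores to acquire so this trade-off is near-optimal under a delay constraint.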

  17. Analyzing wildfire exposure on Sardinia, Italy

    Science.gov (United States)

    Salis, Michele; Ager, Alan A.; Arca, Bachisio; Finney, Mark A.; Alcasena, Fermin; Bacciu, Valentina; Duce, Pierpaolo; Munoz Lozano, Olga; Spano, Donatella

    2014-05-01

    We used simulation modeling based on the minimum travel time (MTT) algorithm to analyze the wildfire exposure of key ecological, social and economic features on Sardinia, Italy. Sardinia is the second largest island of the Mediterranean Basin and over the last fifty years has experienced large and dramatic wildfires, which caused losses and threatened urban interfaces, forests, natural areas, and agricultural production. Historical fire and environmental data for the period 1995-2009 were used as input to estimate fine-scale burn probability, conditional flame length, and potential fire size in the study area. To this end, we simulated 100,000 wildfire events within the study area, randomly drawing from the observed frequency distribution of burn periods and wind directions for each fire. Estimates of burn probability, excluding non-burnable fuels, ranged from 0 to 1.92 x 10^-3, with a mean value of 6.48 x 10^-5. Overall, the outputs provided a quantitative assessment of wildfire exposure at the landscape scale and captured landscape properties of wildfire exposure. We then examined how the exposure profiles varied among and within selected features and assets located on the island. Spatial variation in the modeled outputs revealed a strong effect of fuel models, coupled with slope and weather. In particular, the combined effect of Mediterranean maquis, woodland areas and complex topography on flame length was relevant, mainly in north-east Sardinia, whereas areas with herbaceous fuels and flat terrain were in general characterized by lower fire intensity but higher burn probability. The simulation modeling proposed in this work provides a quantitative approach to informing wildfire risk management activities, and represents one of the first applications of burn probability modeling to capture fire risk and exposure profiles in the Mediterranean basin.
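
The burn-probability idea, ignite many random fires and record, per cell, the fraction of simulations in which it burned, can be illustrated with a drastically simplified spread model. The 4-neighbour probabilistic spread below is an invented stand-in for the minimum-travel-time algorithm, which uses real fuels, weather and topography.

```python
# Estimate per-cell burn probability by Monte Carlo: many random
# ignitions, each spread with a fixed per-edge probability.
import random

def simulate_fire(size, spread_prob, rng):
    """One fire: random ignition, 4-neighbour spread with probability spread_prob."""
    burned = {(rng.randrange(size), rng.randrange(size))}
    frontier = list(burned)
    while frontier:
        x, y = frontier.pop()
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < size and 0 <= ny < size and (nx, ny) not in burned:
                if rng.random() < spread_prob:
                    burned.add((nx, ny))
                    frontier.append((nx, ny))
    return burned

def burn_probability(size=20, spread_prob=0.4, n_fires=500, seed=42):
    """Fraction of simulated fires in which each grid cell burned."""
    rng = random.Random(seed)
    counts = [[0] * size for _ in range(size)]
    for _ in range(n_fires):
        for x, y in simulate_fire(size, spread_prob, rng):
            counts[x][y] += 1
    return [[c / n_fires for c in row] for row in counts]

bp = burn_probability()
print(max(max(row) for row in bp))  # highest burn probability on the grid
```

The resulting grid is the direct analogue of the study's fine-scale burn probability map; conditional flame length and fire size would require tracking intensity and extent per simulated fire as well.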

  18. NON-DESTRUCTIVE SOIL CARBON ANALYZER.

    Energy Technology Data Exchange (ETDEWEB)

    Wielopolski, Lucian; Hendrey, G.; Orion, I.; Prior, S.; Rogers, H.; Runion, B.; Torbert, A.

    2004-02-01

    This report describes the feasibility, calibration, and safety considerations of a non-destructive, in situ, quantitative, volumetric soil carbon analytical method based on inelastic neutron scattering (INS). The method can quantify values as low as 0.018 gC/cc, or about 1.2% carbon by weight with high precision under the instrument's configuration and operating conditions reported here. INS is safe and easy to use, residual soil activation declines to background values in under an hour, and no radiological controls are required for transporting the instrument. The labor required to obtain soil-carbon data is about 10-fold less than with other methods, and the instrument outputs carbon-content values nearly instantaneously. Furthermore, it has the potential to quantify other elements, particularly nitrogen. New instrumentation was developed in response to a research solicitation from the U.S. Department of Energy (DOE LAB 00-09 Carbon Sequestration Research Program) supporting the Terrestrial Carbon Processes (TCP) program of the Office of Science, Biological and Environmental Research (BER). The solicitation called for developing and demonstrating novel techniques for quantitatively measuring changes in soil carbon. The report includes raw data and analyses of a set of proof-of-concept, double-blind studies to evaluate the INS approach in the first phase of developing the instrument. Managing soils so that they sequester massive amounts of carbon has been suggested as a means to mitigate the atmospheric buildup of anthropogenic CO{sub 2}. Quantifying changes in the soils' carbon stocks will be essential to evaluating such schemes and documenting their performance. Current methods for quantifying carbon in soil by excavation and core sampling are invasive, slow, labor-intensive and locally destroy the system being observed. Newly emerging technologies, such as Laser Induced Breakdown Spectroscopy and Near-Infrared Spectroscopy, offer soil
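
    The relation between the two detection-limit figures quoted above is a simple unit conversion; a sketch, assuming a typical soil bulk density of 1.5 g/cc (the report does not state the density underlying its 1.2% figure, so that value is an assumption here):

```python
def carbon_weight_percent(carbon_g_per_cc, bulk_density_g_per_cc):
    """Convert a volumetric carbon concentration (gC/cc) to percent
    carbon by weight, given the soil bulk density."""
    return 100.0 * carbon_g_per_cc / bulk_density_g_per_cc

# The reported detection limit of 0.018 gC/cc corresponds to about
# 1.2% C by weight if the bulk density is 1.5 g/cc (assumed value).
pct = carbon_weight_percent(0.018, 1.5)  # -> 1.2
```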

  19. ANALYZING APPROACHES TO EVALUATING AIRLINES PERFORMANCE

    Directory of Open Access Journals (Sweden)

    O. Marusych

    2014-09-01

    Full Text Available Currently, understanding of what constitutes a business process and which internal factors affect a company's performance is prompting airlines to review their methods of performance assessment. Several methods have been developed to assess the profitability of processes, and their application has confirmed both their popularity and the reliability of their results: the concept of economic value added (EVA), the Balanced Scorecard (BSC), and process-oriented cost-benefit analysis (BPA). The concept of economic value added now claims a leading role in shaping the forms and structural elements of corporate governance methodology worldwide. The system of parameters that characterizes an airline's activities within the cost-management concept is constantly being updated: with the introduction of modern information technologies and the emergence of new ideas, the parameters become more objective and comprehensive. Although this system of parameters indicates the general state of an enterprise, it would be wrong to limit the analysis to it alone. To prevent or promptly resolve problems arising at an enterprise, a system of timely and reliable performance indicators is required to fully evaluate the efficiency of the airline as a whole. Such a system is the balanced scorecard. It can significantly improve the quality of business management, especially if a company has several lines of activity. This technique makes it possible to assess the effectiveness not only of individual investments or projects but of entire companies or commercial entities. The limitation of a balanced scorecard is that it neither solves the fundamental problems of performance measurement nor provides guidelines for combining diverse indicators into an overall assessment of effectiveness. This problem of combining different types of indicators is resolved through a process-oriented analysis of

  20. Capturing and analyzing wheelchair maneuvering patterns with mobile cloud computing.

    Science.gov (United States)

    Fu, Jicheng; Hao, Wei; White, Travis; Yan, Yuqing; Jones, Maria; Jan, Yih-Kuen

    2013-01-01

    Power wheelchairs have been widely used to provide independent mobility to people with disabilities. Despite great advancements in power wheelchair technology, research shows that wheelchair related accidents occur frequently. To ensure safe maneuverability, capturing wheelchair maneuvering patterns is fundamental to enable other research, such as safe robotic assistance for wheelchair users. In this study, we propose to record, store, and analyze wheelchair maneuvering data by means of mobile cloud computing. Specifically, the accelerometer and gyroscope sensors in smart phones are used to record wheelchair maneuvering data in real-time. Then, the recorded data are periodically transmitted to the cloud for storage and analysis. The analyzed results are then made available to various types of users, such as mobile phone users, traditional desktop users, etc. The combination of mobile computing and cloud computing leverages the advantages of both techniques and extends the smart phone's capabilities of computing and data storage via the Internet. We performed a case study to implement the mobile cloud computing framework using Android smart phones and Google App Engine, a popular cloud computing platform. Experimental results demonstrated the feasibility of the proposed mobile cloud computing framework. PMID:24110214
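
    The record-and-upload loop described in the abstract can be sketched as a small buffering recorder; class and field names are illustrative, and the upload callable stands in for the actual cloud transmission (the study used Android phones and Google App Engine):

```python
import json
import time
from collections import deque

class ManeuverRecorder:
    """Buffer accelerometer/gyroscope samples and flush them in batches,
    mimicking the record-then-upload pattern described in the abstract.
    Names and the batch size are illustrative, not from the paper."""

    def __init__(self, upload, batch_size=50):
        self.upload = upload          # callable that ships one JSON batch
        self.batch_size = batch_size
        self.buffer = deque()

    def record(self, accel, gyro, timestamp=None):
        self.buffer.append({
            "t": timestamp if timestamp is not None else time.time(),
            "accel": accel,   # (ax, ay, az)
            "gyro": gyro,     # (gx, gy, gz)
        })
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        if self.buffer:
            self.upload(json.dumps(list(self.buffer)))
            self.buffer.clear()

sent = []
rec = ManeuverRecorder(sent.append, batch_size=2)
rec.record((0.1, 0.0, 9.8), (0.0, 0.0, 0.01), timestamp=1.0)
rec.record((0.2, 0.1, 9.7), (0.0, 0.1, 0.00), timestamp=2.0)  # triggers a flush
```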

  1. Fuzzy Based Auto-coagulation Control Through Photometric Dispersion Analyzer

    Institute of Scientific and Technical Information of China (English)

    白桦; 李圭白

    2004-01-01

    The main role of water treatment plants is to supply high-quality safe drinking water. Coagulation is one of the most important stages of surface water treatment. The photometric dispersion analyzer (PDA) is a new optical method for flocculation monitoring, and is feasible for realizing coagulation feedback control. Online modification of the coagulation control system's set point (the optimum coagulant dose) has long hindered the application of this technology in water treatment plants. A fuzzy control system incorporating the photometric dispersion analyzer was utilized in this coagulation control system. A fuzzy logic inference control system using Takagi and Sugeno's fuzzy if-then rules is proposed for online self-correction of the set point. The dosing-rate fuzzy control system was programmed in a Siemens small-scale programmable logic controller. A 400 L/min pilot-scale water treatment plant was used to simulate the process. As raw water quality changed, the set point was corrected promptly, as was the coagulant dosing rate, and residual turbidity before filtration remained acceptable and stable. Results show that this fuzzy inference and control system performs well in coagulation control through PDA.
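
    The Takagi-Sugeno inference used for set-point correction can be sketched as a weighted average of rule consequents; the membership breakpoints and dose adjustments below are illustrative, not the paper's tuned rule base:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def ts_dose_adjustment(error):
    """Zero-order Takagi-Sugeno inference for coagulant dosing.

    'error' is (measured flocculation index - set point); the rule
    consequents (dose changes in mg/L) are invented for illustration.
    """
    rules = [
        (lambda e: tri(e, -2.0, -1.0, 0.0), +0.5),   # index low  -> dose up
        (lambda e: tri(e, -1.0,  0.0, 1.0),  0.0),   # on target  -> hold
        (lambda e: tri(e,  0.0,  1.0, 2.0), -0.5),   # index high -> dose down
    ]
    num = den = 0.0
    for membership, consequent in rules:
        w = membership(error)   # firing strength of this rule
        num += w * consequent
        den += w
    return num / den if den else 0.0

adj = ts_dose_adjustment(-1.0)  # index well below set point -> +0.5
```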

  2. Assistive Technology

    Science.gov (United States)

    Assistive technology (AT) is any service or tool that helps ... be difficult or impossible. For older adults, such technology may be a walker to improve mobility or ...

  3. Climate Model Diagnostic Analyzer Web Service System

    Science.gov (United States)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Jiang, J. H.

    2013-12-01

    The latest Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report stressed the need for the comprehensive and innovative evaluation of climate models with newly available global observations. The traditional approach to climate model evaluation, which compares a single parameter at a time, identifies symptomatic model biases and errors but fails to diagnose the model problems. The model diagnosis process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally- and data-intensive. To address these challenges, we are developing a parallel, distributed web-service system that enables the physics-based multi-variable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. We have developed a methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks (i.e., Flask, Gunicorn, and Tornado). The web-service system, called Climate Model Diagnostic Analyzer (CMDA), currently supports (1) all the datasets from Obs4MIPs and a few ocean datasets from NOAA and Argo, which can serve as observation-based reference data for model evaluation and (2) many of CMIP5 model outputs covering a broad range of atmosphere, ocean, and land variables from the CMIP5 specific historical runs and AMIP runs. Analysis capabilities currently supported by CMDA are (1) the calculation of annual and seasonal means of physical variables, (2) the calculation of time evolution of the means in any specified geographical region, (3) the calculation of correlation between two variables, and (4) the calculation of difference between two variables. A web user interface is chosen for CMDA because it not only lowers the learning curve and removes the adoption barrier of the tool but also enables instantaneous use
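
    The wrapping step described above — exposing an existing science function as a web service — can be sketched with a plain WSGI application (the paper uses Flask, Gunicorn, and Tornado; the route and parameter names here are illustrative, and WSGI is used so the sketch stays self-contained):

```python
import json
from urllib.parse import parse_qs

# A stand-in "science application" function to be exposed as a service.
def annual_mean(values):
    return sum(values) / len(values)

def cmda_style_app(environ, start_response):
    """Minimal WSGI wrapper exposing a science function over HTTP,
    the same pattern CMDA applies with Flask/Gunicorn/Tornado."""
    qs = parse_qs(environ.get("QUERY_STRING", ""))
    values = [float(v) for v in qs.get("v", [])]
    if environ.get("PATH_INFO") == "/annual_mean" and values:
        body = json.dumps({"mean": annual_mean(values)})
        status = "200 OK"
    else:
        body, status = json.dumps({"error": "bad request"}), "400 Bad Request"
    start_response(status, [("Content-Type", "application/json")])
    return [body.encode()]

# Exercise the app directly (no server needed), as a WSGI gateway would.
captured = {}
def fake_start(status, headers):
    captured["status"] = status

resp = cmda_style_app({"PATH_INFO": "/annual_mean",
                       "QUERY_STRING": "v=1&v=2&v=3"}, fake_start)
```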

  4. Climate Model Diagnostic Analyzer Web Service System

    Science.gov (United States)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Jiang, J. H.

    2014-12-01

    We have developed a cloud-enabled web-service system that empowers physics-based, multi-variable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. We have developed a methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks. The web-service system, called Climate Model Diagnostic Analyzer (CMDA), currently supports (1) all the observational datasets from Obs4MIPs and a few ocean datasets from NOAA and Argo, which can serve as observation-based reference data for model evaluation, (2) many of CMIP5 model outputs covering a broad range of atmosphere, ocean, and land variables from the CMIP5 specific historical runs and AMIP runs, and (3) ECMWF reanalysis outputs for several environmental variables in order to supplement observational datasets. Analysis capabilities currently supported by CMDA are (1) the calculation of annual and seasonal means of physical variables, (2) the calculation of time evolution of the means in any specified geographical region, (3) the calculation of correlation between two variables, (4) the calculation of difference between two variables, and (5) the conditional sampling of one physical variable with respect to another variable. A web user interface is chosen for CMDA because it not only lowers the learning curve and removes the adoption barrier of the tool but also enables instantaneous use, avoiding the hassle of local software installation and environment incompatibility. CMDA will be used as an educational tool for the summer school organized by JPL's Center for Climate Science in 2014. In order to support 30+ simultaneous users during the school, we have deployed CMDA to the Amazon cloud environment. The cloud-enabled CMDA will provide each student with a virtual machine while the user interaction with the system will remain the same
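
    Capability (5), conditional sampling, amounts to selecting values of one variable where a second variable falls in a given range; a minimal sketch with toy data (variable names invented for illustration):

```python
def conditional_sample(x, y, y_low, y_high):
    """Sample variable x conditioned on y falling in [y_low, y_high),
    a simplified version of CMDA's conditional-sampling capability
    (e.g. precipitation sampled where SST sits in a given range)."""
    return [xi for xi, yi in zip(x, y) if y_low <= yi < y_high]

# Toy data: sample "precip" where "sst" lies between 300 and 302 K.
sst    = [298.0, 300.5, 301.0, 303.0, 299.5]
precip = [1.0,   4.0,   5.0,   2.0,   0.5]
sampled = conditional_sample(precip, sst, 300.0, 302.0)  # -> [4.0, 5.0]
```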

  5. CONTEMPORARY SOCIAL MANAGEMENT TECHNOLOGIES

    OpenAIRE

    Plotnikov Mikhail Vyacheslavovich

    2012-01-01

    Analyzing the practices of development, application and research in managerial social technologies, the author reveals a number of essential problems in their further development. The revealed problems fall into three groups: problems of theory and methodology, problems of development, and problems of practical application. Based on the analysis of modern managerial social technologies, the author suggests a comprehensive and universal classification that ...

  6. HMR Log Analyzer: Analyze Web Application Logs Over Hadoop MapReduce

    OpenAIRE

    Sayalee Narkhede; Tripti Baraskar

    2013-01-01

    In today’s Internet world, log file analysis is becoming a necessary task for analyzing customers’ behavior in order to improve advertising and sales, as well as for domains such as environmental, medical, and banking systems, where analyzing log data is important for extracting the required knowledge. Web mining is the process of discovering knowledge from web data. Log files are generated very fast, at a rate of 1-10 MB/s per machine; a single data center can generate tens of terabytes ...
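
    The Hadoop MapReduce pattern the tool builds on can be sketched in plain Python: a mapper emits key-value pairs from each log line, and a reducer sums counts per key after the shuffle. The log format and field positions below are simplified stand-ins for a real web-server log:

```python
from collections import defaultdict

def map_phase(log_line):
    """Mapper: emit (status_code, 1) for each simplified log line."""
    parts = log_line.split()
    if len(parts) >= 2:
        yield parts[1], 1   # parts[1] holds the HTTP status code

def reduce_phase(pairs):
    """Reducer: sum the counts per status code, as Hadoop would after
    shuffling and grouping the mapper output."""
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

logs = [
    "/index.html 200",
    "/missing 404",
    "/index.html 200",
]
counts = reduce_phase(pair for line in logs for pair in map_phase(line))
# -> {'200': 2, '404': 1}
```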

  7. Nano technology

    International Nuclear Information System (INIS)

    This book is an introduction to nano technology: what nano technology is, the alpha and omega of nano technology, the future of Korean nano technology, and nano technology and humanity's future. The contents of this book cover: the coming nano era; an engine of creation; what molecular engineering is; huge nano technology; techniques for making small things; nano materials with extraordinary possibilities; the key to the nano world; the most promising nano technology in the bio industry; the government's nano development plan; the direction of development for nano technology; and children of heart.

  8. ICAN - INTEGRATED COMPOSITE ANALYZER (IBM PC VERSION)

    Science.gov (United States)

    Murthy, P. L.

    1994-01-01

    The Integrated Composite Analyzer (ICAN) is a computer program designed to carry out a comprehensive linear analysis of multilayered fiber composites. The analysis contains the essential features required to effectively design structural components made from fiber composites. ICAN includes the micromechanical design features of the Intraply Hybrid Composite Design (INHYD) program to predict ply level hygral, thermal, and mechanical properties. The laminate analysis features of the Multilayered Filamentary Composite Analysis (MFCA) program are included to account for interply layer effects. ICAN integrates these and additional features to provide a comprehensive analysis capability for composite structures. Additional features unique to ICAN include the following: 1) ply stress-strain influence coefficients, 2) microstresses and microstrain influence coefficients, 3) concentration factors around a circular hole, 4) calculation of probable delamination locations around a circular hole, 5) Poisson's ratio mismatch details near a straight edge, 6) free-edge stresses, 7) material card input for finite element analysis using NASTRAN (available separately from COSMIC) or MARC, 8) failure loads based on maximum stress criterion, and laminate failure stresses based on first-ply failures and fiber breakage criteria, 9) transverse shear stresses, normal and interlaminar stresses, and 10) durability/fatigue type analyses for thermal as well as mechanical cyclic loads. The code can currently assess degradation due to mechanical and thermal cyclic loads with or without a defect. ICAN includes a dedicated data bank of constituent material properties, and allows the user to build a database of material properties of commonly used fibers and matrices so the user need only specify code names for constituents. Input to ICAN includes constituent material properties (or code names), factors reflecting the fabrication process, and composite geometry. ICAN performs micromechanics
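
    ICAN's micromechanics step predicts ply-level properties from constituent fiber and matrix properties. As a first approximation, this can be sketched with the textbook rule-of-mixtures relations (ICAN's actual relations are more elaborate); the carbon/epoxy values below are illustrative:

```python
def ply_moduli(E_f, E_m, V_f):
    """Rule-of-mixtures estimates of unidirectional ply moduli (GPa)
    from fiber/matrix moduli and fiber volume fraction -- the kind of
    micromechanics ICAN performs, in its simplest textbook form."""
    V_m = 1.0 - V_f
    E1 = V_f * E_f + V_m * E_m              # longitudinal (Voigt bound)
    E2 = 1.0 / (V_f / E_f + V_m / E_m)      # transverse (Reuss bound)
    return E1, E2

# Carbon fiber (230 GPa) in epoxy (3.5 GPa) at 60% fiber volume.
E1, E2 = ply_moduli(230.0, 3.5, 0.6)
```

    As expected, the longitudinal modulus is fiber-dominated while the transverse modulus stays close to the matrix value.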

  9. ICAN - INTEGRATED COMPOSITE ANALYZER (IBM 370 VERSION)

    Science.gov (United States)

    Murthy, P. L.

    1994-01-01

    The Integrated Composite Analyzer (ICAN) is a computer program designed to carry out a comprehensive linear analysis of multilayered fiber composites. The analysis contains the essential features required to effectively design structural components made from fiber composites. ICAN includes the micromechanical design features of the Intraply Hybrid Composite Design (INHYD) program to predict ply level hygral, thermal, and mechanical properties. The laminate analysis features of the Multilayered Filamentary Composite Analysis (MFCA) program are included to account for interply layer effects. ICAN integrates these and additional features to provide a comprehensive analysis capability for composite structures. Additional features unique to ICAN include the following: 1) ply stress-strain influence coefficients, 2) microstresses and microstrain influence coefficients, 3) concentration factors around a circular hole, 4) calculation of probable delamination locations around a circular hole, 5) Poisson's ratio mismatch details near a straight edge, 6) free-edge stresses, 7) material card input for finite element analysis using NASTRAN (available separately from COSMIC) or MARC, 8) failure loads based on maximum stress criterion, and laminate failure stresses based on first-ply failures and fiber breakage criteria, 9) transverse shear stresses, normal and interlaminar stresses, and 10) durability/fatigue type analyses for thermal as well as mechanical cyclic loads. The code can currently assess degradation due to mechanical and thermal cyclic loads with or without a defect. ICAN includes a dedicated data bank of constituent material properties, and allows the user to build a database of material properties of commonly used fibers and matrices so the user need only specify code names for constituents. Input to ICAN includes constituent material properties (or code names), factors reflecting the fabrication process, and composite geometry. ICAN performs micromechanics

  10. Analyzers Measure Greenhouse Gases, Airborne Pollutants

    Science.gov (United States)

    2012-01-01

    In complete darkness, a NASA observatory waits. When an eruption of boiling water billows from a nearby crack in the ground, the observatory's sensors seek particles in the fluid, measure shifts in carbon isotopes, and analyze samples for biological signatures. NASA has landed the observatory in this remote location, far removed from air and sunlight, to find life unlike any that scientists have ever seen. It might sound like a scene from a distant planet, but this NASA mission is actually exploring an ocean floor right here on Earth. NASA established a formal exobiology program in 1960, which expanded into the present-day Astrobiology Program. The program, which celebrated its 50th anniversary in 2010, not only explores the possibility of life elsewhere in the universe, but also examines how life begins and evolves, and what the future may hold for life on Earth and other planets. Answers to these questions may be found not only by launching rockets skyward, but by sending probes in the opposite direction. Research here on Earth can revise prevailing concepts of life and biochemistry and point to the possibilities for life on other planets, as was demonstrated in December 2010, when NASA researchers discovered microbes in Mono Lake in California that subsist and reproduce using arsenic, a toxic chemical. The Mono Lake discovery may be the first of many that could reveal possible models for extraterrestrial life. One primary area of interest for NASA astrobiologists lies with the hydrothermal vents on the ocean floor. These vents expel jets of water heated and enriched with chemicals from off-gassing magma below the Earth's crust. Also potentially within the vents: microbes that, like the Mono Lake microorganisms, defy the common characteristics of life on Earth. "Basically all organisms on our planet generate energy through the Krebs Cycle," explains Mike Flynn, research scientist at NASA's Ames Research Center.
This metabolic process breaks down sugars for energy

  11. Using Simulation to Analyze Acoustic Environments

    Science.gov (United States)

    Wood, Eric J.

    2016-01-01

    One of the main projects that was worked on this semester was creating an acoustic model for the Advanced Space Suit in Comsol Multiphysics. The geometry tools built into the software were used to create an accurate model of the helmet and upper torso of the suit. After running the simulation, plots of the sound pressure level within the suit were produced, as seen below in Figure 1. These plots show significant nulls which should be avoided when placing microphones inside the suit. In the future, this model can be easily adapted to changes in the suit design to determine optimal microphone placements and other acoustic properties. Another major project was creating an acoustic diverter that will potentially be used to route audio into the Space Station's Node 1. The concept of the project was to create geometry to divert sound from a neighboring module, the US Lab, into Node 1. By doing this, no new audio equipment would need to be installed in Node 1. After creating an initial design for the diverter, analysis was performed in Comsol in order to determine how changes in geometry would affect acoustic performance, as shown in Figure 2. These results were used to produce a physical prototype diverter on a 3D printer. With the physical prototype, testing was conducted in an anechoic chamber to determine the true effectiveness of the design, as seen in Figure 3. The results from this testing have been compared to the Comsol simulation results to analyze how closely the Comsol results match real-world performance. While the Comsol results do not seem to closely resemble the real-world performance, this testing has provided valuable insight into how much trust can be placed in the results of Comsol simulations. A final project that was worked on during this tour was the Audio Interface Unit (AIU) design for the Orion program. The AIU is a small device that will be used as an audio communication device both during launch and on-orbit.
The unit will have functions

  12. Reviews Book: The 4% Universe: Dark Matter, Dark Energy and the Race to Discover the Rest of Reality Book: Quantitative Understanding of Biosystems: An Introduction to Biophysics Book: Edison's Electric Light: The Art of Invention Book: The Edge of Physics: Dispatches from the Frontiers of Cosmology Equipment: Voicebox Equipment: Tracker 4 Books: Hands-On Introduction to NI LabVIEW with Vernier, and Engineering Projects with NI LabVIEW and Vernier Places to Visit: Discovery Museum Book: Philosophy of Science: A Very Short Introduction Web Watch

    Science.gov (United States)

    2011-11-01

    WE RECOMMEND Quantitative Understanding of Biosystems: An Introduction to Biophysics Text applies physics to biology concepts Edison's Electric Light: The Art of Invention Edison's light still shines brightly The Edge of Physics: Dispatches from the Frontiers of Cosmology Anecdotes explore cosmology Voicebox Voicebox kit discovers the physics and evolution of speech Tracker 4 Free software tracks motion analysis Hands-On Introduction to NI LabVIEW with Vernier, and Engineering Projects with NI LabVIEW and Vernier Books support the LabVIEW software Discovery Museum Newcastle museum offers science enjoyment for all Philosophy of Science: A Very Short Introduction Philosophy opens up science questions WORTH A LOOK The 4% Universe: Dark Matter, Dark Energy and the Race to Discover the Rest of Reality Book researches the universe WEB WATCH Superconductivity websites are popular

  13. Lead Paint Analyzer. Deactivation and Decommissioning Focus Area. OST Reference #2317

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    1999-09-01

    The U.S. Department of Energy (DOE) continually seeks safer and more cost-effective technologies for use in decontamination and decommissioning (D&D) of nuclear facilities. To this end, the Deactivation and Decommissioning Focus Area (DDFA) of the DOE’s Office of Science and Technology (OST) sponsors Large-Scale Demonstration and Deployment Projects (LSDDP). At these LSDDPs, developers and vendors of improved or innovative technologies showcase products that are potentially beneficial to DOE’s projects, and to others in the D&D community. Benefits sought include decreased health and safety risks to personnel and the environment, increased productivity, and decreased cost of operation. The Idaho National Engineering and Environmental Laboratory (INEEL) LSDDP generated a list of statements defining specific needs or problems where improved technology could be incorporated into ongoing D&D tasks. One of the stated needs was for a Lead Paint Analyzer that would reduce costs and shorten schedules in DOE’s Decommissioning Project. The Niton 700 Series Multi-element Analyzer is a hand-held, battery-operated unit that uses x-ray fluorescence spectroscopy (XRF) to analyze 25 elements, including the presence of lead in paint. The baseline technologies consist of collecting field samples and sending the samples to a laboratory for analysis. This demonstration investigated the associated costs and the required time to take an analysis with the multi-element analyzer with respect to the baseline technology. The Niton 700 Series Multi-element Analyzer performs in situ real-time analyses to identify and quantify lead, chromium, cadmium, and other metals in lead-based paint. Benefits expected from using the multi-element spectrum analyzer include: Reduced cost; Easier use; Reduced schedules in DOE’s decommissioning projects.

  14. DMET-Analyzer: automatic analysis of Affymetrix DMET Data

    Directory of Open Access Journals (Sweden)

    Guzzi Pietro

    2012-10-01

    Full Text Available Abstract Background Clinical Bioinformatics is currently growing and is based on the integration of clinical and omics data aiming at the development of personalized medicine. Thus the introduction of novel technologies able to investigate the relationship between clinical states and biological machinery may help the development of this field. For instance the Affymetrix DMET platform (drug metabolism enzymes and transporters) is able to study the relationship between variations in patient genomes and drug metabolism, detecting SNPs (Single Nucleotide Polymorphisms) on genes related to drug metabolism. This may make it possible, for instance, to find genetic variants in patients who present different drug responses, in pharmacogenomics and clinical studies. Despite this, there is currently a lack of open-source algorithms and tools for the analysis of DMET data. Existing software tools for DMET data generally allow only the preprocessing of binary data (e.g. the DMET-Console provided by Affymetrix) and simple data analysis operations, but do not allow testing the association of the presence of SNPs with the response to drugs. Results We developed DMET-Analyzer, a tool for the automatic association analysis between variations in patient genomes and the clinical conditions of patients, i.e. different responses to drugs. The proposed system allows: (i) automation of the workflow of analysis of DMET-SNP data, avoiding the use of multiple tools; (ii) automatic annotation of DMET-SNP data and search in existing databases of SNPs (e.g. dbSNP); (iii) association of SNPs with pathways through search in PharmGKB, a major knowledge base for pharmacogenomic studies. DMET-Analyzer has a simple graphical user interface that allows users (doctors/biologists) to upload and analyse DMET files produced by the Affymetrix DMET-Console in an interactive way. The effectiveness and ease of use of DMET-Analyzer are demonstrated through different
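
    The core association question such a tool answers — is a SNP over-represented among drug responders? — can be sketched as a one-sided Fisher exact test on a 2x2 contingency table. The implementation below computes the p-value from the hypergeometric distribution using only the standard library; the data and the choice of a one-sided test are illustrative, and DMET-Analyzer's actual statistics may differ:

```python
from math import comb

def fisher_one_sided(a, b, c, d):
    """One-sided Fisher exact test on the 2x2 table [[a, b], [c, d]]
    (e.g. rows = SNP present/absent, cols = responder/non-responder),
    testing for a positive association: P(X >= a) under the
    hypergeometric null with the margins fixed."""
    n = a + b + c + d
    row1, col1 = a + b, a + c
    p = 0.0
    for k in range(a, min(row1, col1) + 1):
        p += comb(row1, k) * comb(n - row1, col1 - k) / comb(n, col1)
    return p

# Toy data: 8/10 SNP carriers respond vs 2/10 non-carriers.
p = fisher_one_sided(8, 2, 2, 8)  # ~0.0115
```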

  15. Technology 2020

    Science.gov (United States)

    Newby, Mike

    2005-01-01

    This brief article discusses the new technologies that may be available in 2020 that will impact the field of education. The author believes that the new educational themes will be "flexibility" and "autonomy", and the new technological theme will be "transparency". Topics discussed include genetic technology, pharmacological technology, digital…

  16. A Serum Biomarker Model to Diagnose Pancreatic Cancer Using Proteomic Fingerprint Technology

    Institute of Scientific and Technical Information of China (English)

    Chunlin Ge; Ning Ma; Dianbo Yao; Fengming Luan; Chaojun Hu; Yongzhe Li; Yongfeng Liu

    2008-01-01

    OBJECTIVE To establish a serum protein pattern model for screening pancreatic cancer. METHODS Twenty-nine serum samples from patients with pancreatic cancer were collected before surgery, and an additional 57 serum samples from age- and sex-matched individuals without cancer were used as controls. WCX magnetic beads and a PBS II-C protein chip reader (Ciphergen Biosystems Inc.) were employed to detect the protein fingerprint expression of all serum samples. The resulting profiles comparing serum from cancer patients and normal controls were analyzed with the Biomarker Wizard system to establish a model using the Biomarker Pattern system software. A double-blind test was used to determine the sensitivity and specificity of the model. RESULTS A group of 4 biomarkers (relative molecular weights 5,705 Da, 4,935 Da, 5,318 Da, and 3,243 Da) was selected to build a decision tree producing a classification model that effectively screens pancreatic cancer patients. The results yielded a sensitivity of 100% (20/20) and a specificity of 97.4% (37/38). The area under the ROC curve was 99.7%. A double-blind test used to challenge the model resulted in a sensitivity of 88.9% and a specificity of 89.5%. CONCLUSION New serum biomarkers of pancreatic cancer have been identified. The pattern of combined markers provides a powerful and reliable diagnostic method for pancreatic cancer with high sensitivity and specificity.
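
    A decision-tree classifier of the kind Biomarker Pattern builds is a cascade of intensity thresholds over the selected m/z peaks. The sketch below uses the reported biomarker masses as keys, but the tree shape, thresholds, and sample intensities are invented for illustration, not the trained model from the study:

```python
def classify(peaks, thresholds):
    """Walk a tiny decision tree over mass-spec peak intensities,
    mimicking a Biomarker Pattern-style classifier. The m/z keys match
    the reported biomarkers; everything else is hypothetical."""
    if peaks["5705"] > thresholds["5705"]:
        return "cancer"
    if peaks["4935"] > thresholds["4935"] and peaks["3243"] <= thresholds["3243"]:
        return "cancer"
    return "normal"

# Invented thresholds and one invented serum profile.
thresholds = {"5705": 10.0, "4935": 7.5, "3243": 3.0}
sample = {"5705": 12.3, "4935": 5.0, "5318": 6.1, "3243": 2.2}
label = classify(sample, thresholds)  # -> "cancer"
```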

  17. Development of Modulators Against Degenerative Aging Using Radiation Fusion Technology

    Energy Technology Data Exchange (ETDEWEB)

    Jo, Sung Kee; Jung, U.; Park, H. R.

    2010-04-15

    In this study, we selected 20 final biomarkers of degenerative aging to develop a radiation aging model, and validated several of the selected markers for use in screening aging modulators. To select the biomarkers of degenerative aging, 4 categories of aging-related markers (immune/hematopoiesis, oxidative damage, signaling molecules, lipid metabolism) were comparatively analyzed in irradiated and normally aged biosystems (cell lines or mice). As a result, most of the biomarkers showed similar changes under irradiation and normal aging. Regarding immune/hematopoiesis, declines in immune cell functions (lymphocytes, NK cells), Th1/Th2 imbalance, and decreased antigen presentation by dendritic cells were observed, and 10 biomarkers were selected in this category. mtDNA deletion was selected as the oxidative damage marker, 6 biomarkers including p21 and p-FOXO3a as signaling molecule biomarkers, and 3 biomarkers including adipose tissue weight were selected for lipid metabolism. In addition, various radiation application conditions with single/fractionated irradiation and different periods after irradiation were investigated for optimal induction of biomarker changes, which revealed that a total of 5 Gy delivered in 10 or more fractions, followed by a period of 4 months or more, was optimal. To establish the basis for screening natural aging modulators, some selected aging biomarkers were validated by their inhibition by well-known natural agents (EGCG, HemoHIM, etc.) in aged cell or mouse models. Additionally, by evaluating the efficacy of 5 natural agents in reducing the degeneration of skin and reproductive organs induced by radiation and chemicals (cyclophosphamide, etc.), we established the basis for screening degenerative diseases caused by various factors

  18. Development of Modulators Against Degenerative Aging Using Radiation Fusion Technology

    International Nuclear Information System (INIS)

In this study, we selected a final set of 20 biomarkers of degenerative aging to develop a radiation aging model, and validated several of the selected markers for use in the screening of aging modulators. To select the biomarkers of degenerative aging, 4 categories of aging-related markers (immune/hematopoiesis, oxidative damage, signaling molecules, lipid metabolism) were comparatively analyzed in irradiated and normally aged biosystems (cell lines or mice). As a result, most of the biomarkers showed similar changes under irradiation and normal aging. Regarding immune/hematopoiesis, a decline of immune cell functions (lymphocytes, NK cells), a Th1/Th2 imbalance, and decreased antigen presentation by dendritic cells were observed, and 10 biomarkers were selected in this category. mtDNA deletion was selected as the oxidative damage marker, 6 biomarkers including p21 and p-FOXO3a were selected as signaling molecule biomarkers, and 3 biomarkers including adipose tissue weight were selected for lipid metabolism. In addition, various radiation application conditions, single versus fractionated irradiation and the period after irradiation, were investigated for optimal induction of biomarker changes; a total dose of 5 Gy delivered in 10 or more fractions and a period of 4 months or greater were found to be optimal. To establish the basis for the screening of natural aging modulators, some selected aging biomarkers were validated by their inhibition by well-known natural agents (EGCG, HemoHIM, etc.) in aged cell or mouse models. Additionally, by evaluating the reductive efficacy of 5 natural agents on the degeneration of skin and reproductive organs induced by radiation and chemicals (cyclophosphamide, etc.), we established the basis for the screening of modulators of degenerative diseases caused by various factors

  19. Design of multi-channel amplitude analyzer base on LonWorks

    International Nuclear Information System (INIS)

The paper introduces a multi-channel analyzer based on LonWorks technology. The system detects pulse peaks with hardware circuits and controls data acquisition and network communication with a Micro Controller Unit (MCU) and a Neuron chip. The SCM is programmed in Keil C51, the communication between the SCM and the Neuron chip is realized in the Neuron C language, and the computer program is written in VB. Test results show that the analyzer offers fast conversion speed and low power consumption. (authors)
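
For readers unfamiliar with pulse-height analysis, here is a minimal sketch of the channel-sorting step such an analyzer performs; the channel count and full-scale voltage are illustrative assumptions, not values from the paper:

```python
# Minimal sketch of what a multi-channel (pulse-height) analyzer does:
# each detected pulse peak is sorted into one of N amplitude channels,
# accumulating an amplitude spectrum. Channel count and full-scale
# voltage below are illustrative assumptions only.
N_CHANNELS = 1024
FULL_SCALE = 5.0   # volts, assumed full-scale input

def channel_of(peak_volts):
    """Map a pulse peak amplitude to a channel index (clamped to range)."""
    ch = int(peak_volts / FULL_SCALE * N_CHANNELS)
    return min(max(ch, 0), N_CHANNELS - 1)

spectrum = [0] * N_CHANNELS
for peak in [0.7, 0.71, 2.5, 4.99, 6.2]:   # example peak voltages
    spectrum[channel_of(peak)] += 1

print(channel_of(2.5))            # → 512
print(spectrum[N_CHANNELS - 1])   # → 1 (over-range pulse clamped to top channel)
```

A hardware implementation does the same binning with a peak detector and an ADC; the software side then only reads out and transmits the accumulated spectrum.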

  20. A design of portable energy dispersive X-ray fluorescence analyzer

    International Nuclear Information System (INIS)

A portable energy dispersive X-ray fluorescence analyzer (EDXRF) is designed. The excitation source is a small-caliber X-ray tube with a Mo target. The high-voltage power supply and filament supply are designed based on inverter technology. The semiconductor detector is thermoelectrically cooled. The qualities of the analyzer and the energy calibration curve are studied through energy spectrum measurements for Cu, Zn, Ni and Pb, and the best range of elemental analysis is given by theoretical analysis. (authors)

  1. High-resolving electrostatic charged particles energy analyzer with fine tuning for space investigations

    International Nuclear Information System (INIS)

The paper presents results of numerical calculations of a high-resolving electrostatic energy analyzer, based on a bounded cylindrical field, for investigations of flows of charged particles in space. The analyzer allows fine tuning of its focusing characteristics by means of an additional tuning potential applied to one of the electrodes. The combination of high energy resolution, high transmission, simple design and compactness makes this instrument very promising for space technologies

  2. Technological Standardization, Endogenous Productivity and Transitory Dynamics

    OpenAIRE

    Baron, J; Schmidt, J.

    2014-01-01

    We uncover technological standardization as a microeconomic mechanism which is vital for the implementation of new technologies, in particular general purpose technologies. The interdependencies of these technologies require common rules (“standardization”) to ensure compatibility. Using data on standardization, we are therefore able to identify technology shocks and analyze their impact on macroeconomic variables. First, our results show that technology shocks diffuse slowly and generate a p...

  3. Noise and analyzer-crystal angular position analysis for analyzer-based phase-contrast imaging

    International Nuclear Information System (INIS)

The analyzer-based phase-contrast x-ray imaging (ABI) method is emerging as a potential alternative to conventional radiography. Like many of the modern imaging techniques, ABI is a computed imaging method (meaning that images are calculated from raw data). ABI can simultaneously generate a number of planar parametric images containing information about absorption, refraction, and scattering properties of an object. These images are estimated from raw data acquired by measuring (sampling) the angular intensity profile of the x-ray beam passed through the object at different angular positions of the analyzer crystal. The noise in the estimated ABI parametric images depends upon imaging conditions like the source intensity (flux), the measurement angular positions, object properties, and the estimation method. In this paper, we use the Cramér–Rao lower bound (CRLB) to quantify the noise properties in parametric images and to investigate the effect of source intensity, different analyzer-crystal angular positions and object properties on this bound, assuming a fixed radiation dose delivered to an object. The CRLB is the minimum bound for the variance of an unbiased estimator and defines the best noise performance that one can obtain regardless of which estimation method is used to estimate ABI parametric images. The main result of this paper is that the variance (hence the noise) in parametric images is directly proportional to the source intensity and that only a limited number of analyzer-crystal angular measurements (eleven for uniform and three for optimal non-uniform sampling) are required to obtain the best parametric images. Additional angular measurements only spread the total dose across the measurements without improving or worsening the CRLB, though the added measurements may improve parametric images by reducing estimation bias. Finally, using the CRLB we evaluate the multiple-image radiography, diffraction enhanced imaging and scatter diffraction enhanced imaging estimation techniques
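
As a reminder of the bound this abstract relies on, here is the standard estimation-theoretic statement of the CRLB (textbook form, not reproduced from the paper):

```latex
% Cramér–Rao lower bound: for any unbiased estimator \hat{\theta} of the
% parameter vector \theta estimated from data y with likelihood p(y;\theta),
\operatorname{Cov}\bigl(\hat{\theta}\bigr) \;\succeq\; I(\theta)^{-1},
\qquad
[I(\theta)]_{ij}
  = \mathbb{E}\!\left[
      \frac{\partial \ln p(y;\theta)}{\partial \theta_i}\,
      \frac{\partial \ln p(y;\theta)}{\partial \theta_j}
    \right]
```

where \(I(\theta)\) is the Fisher information matrix and \(\succeq\) denotes the ordering of positive semidefinite matrices; no unbiased estimator of the ABI parametric images can have lower variance than this bound.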

  4. Thromboelastography platelet mapping in healthy dogs using 1 analyzer versus 2 analyzers

    OpenAIRE

    Blois, Shauna L.; Banerjee, Amrita; Wood, R. Darren; Park, Fiona M.

    2013-01-01

    The objective of this study was to describe the results of thromboelastography platelet mapping (TEG-PM) carried out using 2 techniques in 20 healthy dogs. Maximum amplitudes (MA) generated by thrombin (MAthrombin), fibrin (MAfibrin), adenosine diphosphate (ADP) receptor activity (MAADP), and thromboxane A2 (TxA2) receptor activity (stimulated by arachidonic acid, MAAA) were recorded. Thromboelastography platelet mapping was carried out according to the manufacturer’s guidelines (2-analyzer t...

  5. Noise and Analyzer-Crystal Angular Position Analysis for Analyzer-Based Phase-Contrast Imaging

    OpenAIRE

    Majidi, Keivan; Li, Jun; Muehleman, Carol; Brankov, Jovan G.

    2014-01-01

    The analyzer-based phase-contrast X-ray imaging (ABI) method is emerging as a potential alternative to conventional radiography. Like many of the modern imaging techniques, ABI is a computed imaging method (meaning that images are calculated from raw data). ABI can simultaneously generate a number of planar parametric images containing information about absorption, refraction, and scattering properties of an object. These images are estimated from raw data acquired by measuring (sampling) the...

  6. Smile Analyzer: A Software Package for Analyzing the Characteristics of the Speech and Smile

    OpenAIRE

    Farzin Heravi; Roozbeh Rashed; Leila Raziee

    2013-01-01

    Taking into account the factors related to lip-tooth relationships in orthodontic diagnosis and treatment planning is of prime importance. Manual quantitative analysis of facial parameters on photographs during smile and speech is a difficult and time-consuming job. Since there is no comprehensive and user-friendly software package, we developed a software program called "Smile Analyzer" in the Department of Orthodontics of Mashhad Faculty of Dentistry for measuring the parameters related to ...

  7. Analyzing Public Library Service Interactions to Improve Public Library Customer Service and Technology Systems

    OpenAIRE

    Holly Arnason; Louise Reimer

    2012-01-01

Objective – To explore the types and nature of assistance library customers are asking library staff for in a large Canadian urban public library system. Methods – A qualitative study employing transaction logging combined with embedded observation occurred for three-day sample periods at a selection of nine branches over the course of eight months. Staff recorded questions and interactions at service desks (in person, by phone, and electronically), as well as questions received during schedul...

  8. A Hybrid Method of Analyzing Patents for Sustainable Technology Management in Humanoid Robot Industry

    OpenAIRE

    Jongchan Kim; Joonhyuck Lee; Gabjo Kim; Sangsung Park; Dongsik Jang

    2016-01-01

    A humanoid, which refers to a robot that resembles a human body, imitates a human’s intelligence, behavior, sense, and interaction in order to provide various types of services to human beings. Humanoids have been studied and developed constantly in order to improve their performance. Humanoids were previously developed for simple repetitive or hard work that required significant human power. However, intelligent service robots have been developed actively these days to provide necessary info...

  9. Advances and considerations in technologies for growing, imaging, and analyzing 3-D root system architecture

    Science.gov (United States)

    The ability of a plant to mine the soil for nutrients and water is determined by how, where, and when roots are arranged in the soil matrix. The capacity of plant to maintain or improve its yield under limiting conditions, such as nutrient deficiency or drought, is affected by root system architectu...

  10. The university-industry knowledge relationship: Analyzing patents and the science base of technologies

    CERN Document Server

    Leydesdorff, Loet

    2009-01-01

    Via the Internet, information scientists can obtain cost-free access to large databases in the hidden or deep web. These databases are often structured far more than the Internet domains themselves. The patent database of the U.S. Patent and Trade Office is used in this study to examine the science base of patents in terms of the literature references in these patents. University-based patents at the global level are compared with results when using the national economy of the Netherlands as a system of reference. Methods for accessing the on-line databases and for the visualization of the results are specified. The conclusion is that 'biotechnology' has historically generated a model for theorizing about university-industry relations that cannot easily be generalized to other sectors and disciplines.

  11. The university-industry knowledge relationship: Analyzing patents and the science base of technologies

    OpenAIRE

    Leydesdorff, Loet

    2009-01-01

    Via the Internet, information scientists can obtain cost-free access to large databases in the hidden or deep web. These databases are often structured far more than the Internet domains themselves. The patent database of the U.S. Patent and Trade Office is used in this study to examine the science base of patents in terms of the literature references in these patents. University-based patents at the global level are compared with results when using the national economy of the Netherlands as ...

  12. Technology For Information Engineering (TIE): A New Way of Storing, Retrieving and Analyzing Information

    OpenAIRE

    Lewak, Jerzy

    2002-01-01

    The theoretical foundations of a new model and paradigm (called TIE) for data storage and access are introduced. Associations between data elements are stored in a single Matrix table, which is usually kept entirely in RAM for quick access. The model ties together a very intuitive "guided" GUI to the Matrix structure, allowing extremely easy complex searches through the data. Although it is an "Associative Model" in that it stores the data associations separately from the data itself, in cont...

  13. An integrated empirical and modeling methodology for analyzing solar reflective roof technologies on commercial buildings

    International Nuclear Information System (INIS)

Buildings impact the environment in many ways as a result of both their energy use and material consumption. In urban areas, the emission of greenhouse gases and the creation of microclimates are among their most prominent impacts, so the adoption of building design strategies and materials that address both these issues will lead to significant reductions in a building's overall environmental impact. This report documents the energy savings and surface temperature reduction achieved by replacing an existing commercial building's flat roof with a more reflective 'cool roof' surface material. The research methodology gathered data on-site (surface temperatures and reflectivity) and used this in conjunction with the as-built drawings to construct a building energy simulation model. A 20-year cost benefit analysis (CBA) was conducted to determine the return on investment (ROI) for the new cool roof construction based on the energy simulation results. The results of the EnergyPlus simulation modeling revealed that reductions of 1.3-1.9% and 2.6-3.8% of the total monthly electricity consumption can be achieved from the 50% cool roof replacement already implemented and a future 100% roof replacement, respectively. This corresponds to a saving of approximately $22,000 per year in energy costs at current prices and a consequent 9-year payback period for the added cost of installing the 100% cool roof. The environmental benefits associated with these electricity savings, particularly the reductions in environmental damage and peak-time electricity demand, represent the indirect benefits of the cool roof system. (author)
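
The payback arithmetic in the abstract can be checked directly. In this minimal sketch the added roof cost is back-computed from the quoted 9-year payback and $22,000/year savings; it is an assumption for illustration, not a figure given in the abstract:

```python
# Simple payback check for the cool-roof retrofit described above.
# annual_savings is quoted in the abstract; added_cost is back-computed
# from the quoted 9-year payback and is an assumption, not a source figure.
annual_savings = 22_000    # USD per year, from the abstract
added_cost = 198_000       # USD, assumed: 9 yr x $22,000/yr

payback_years = added_cost / annual_savings
print(round(payback_years))   # → 9, consistent with the quoted payback

# Undiscounted net benefit over the 20-year CBA horizon used in the report:
net_20yr = 20 * annual_savings - added_cost
print(net_20yr)               # → 242000
```

A full CBA would discount future savings, so the report's ROI figure need not equal this undiscounted estimate.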

  14. Digital Media in Primary Schools: Literacy or Technology? Analyzing Government and Media Discourses

    Science.gov (United States)

    Pereira, Sara; Pereira, Luís

    2015-01-01

    This article examines the political and the media discourses concerning the Portuguese governmental program responsible for delivering a laptop named "Magalhães" to all primary school children. The analysis is based on the official documents related to the launch and development of the initiative as well as the press coverage of this…

  15. Massively Parallel Sequencing Approaches for Characterization of Structural Variation

    OpenAIRE

    Koboldt, Daniel C.; Larson, David E.; Chen, Ken; Ding, Li; Wilson, Richard K.

    2012-01-01

    The emergence of next-generation sequencing (NGS) technologies offers an incredible opportunity to comprehensively study DNA sequence variation in human genomes. Commercially available platforms from Roche (454), Illumina (Genome Analyzer and Hiseq 2000), and Applied Biosystems (SOLiD) have the capability to completely sequence individual genomes to high levels of coverage. NGS data is particularly advantageous for the study of structural variation (SV) because it offers the sensitivity to de...

  16. Smile Analyzer: A Software Package for Analyzing the Characteristics of the Speech and Smile

    Directory of Open Access Journals (Sweden)

    Roozbeh Rashed

    2013-01-01

Taking into account the factors related to lip-tooth relationships in orthodontic diagnosis and treatment planning is of prime importance. Manual quantitative analysis of facial parameters on photographs during smile and speech is a difficult and time-consuming job. Since there is no comprehensive and user-friendly software package, we developed a software program called "Smile Analyzer" in the Department of Orthodontics of Mashhad Faculty of Dentistry for measuring the parameters related to lip-tooth relationships and other facial landmarks on the photographs taken during various facial expressions. The software was designed using Visual Basic .NET, and ADO .NET was used for developing its Microsoft Access database. The program runs on Microsoft Windows. It is capable of analyzing many parameters or variables in many patients' photographs, although 19 more common variables are previously defined as a default list of variables. When all variables are measured or calculated, a report can be generated and saved in either PDF or MS Excel format. Data are readily transferable to statistical software like SPSS for Windows.

  17. CSNI specialist meeting on simulators and plant analyzers

    International Nuclear Information System (INIS)

The Specialist Meeting on Simulators and Plant Analyzers, held on June 9-12, 1992, in Lappeenranta, Finland, was sponsored by the Committee on the Safety of Nuclear Installations (CSNI) of the OECD Nuclear Energy Agency (NEA). It was organized in collaboration with the Technical Research Centre of Finland (VTT) and the Lappeenranta University of Technology (LTKK). All the presented papers were invited and divided into four sessions. In the first session the objectives, requirements and concepts of simulators were discussed against present standards and guidelines. The second session focused on the capabilities of current analytical models. The third session focused on the experiences gained so far from the applications. The final fourth session concentrated on simulators which are currently under development, and on future plans regarding both development and utilization. At the end of the meeting the topics of the meeting were discussed in a panel discussion. Summaries of the sessions and a shortened version of the panel discussion are included in the proceedings. (orig.)

  18. Field-usable portable analyzer for chlorinated organic compounds

    Energy Technology Data Exchange (ETDEWEB)

    Buttner, W.J.; Penrose, W.R.; Stetter, J.R. [Transducer Research, Inc., Naperville, IL (United States)

    1995-10-01

Transducer Research, Inc. (TRI) has been working with the DOE Morgantown Energy Technology Center to develop a new chemical monitor based on a unique sensor which responds selectively to vapors of chlorinated solvents. We are also developing field applications for the monitor in actual DOE cleanup operations. During the initial phase, prototype instruments were built and field tested. Because of the high degree of selectivity that is obtained, no response was observed with common hydrocarbon organic compounds such as BTX (benzene, toluene, xylene) or POLs (petroleum, oil, lubricants), and in fact, no non-halogen-containing chemical has been identified which induces a measurable response. By the end of the Phase I effort, a finished instrument system was developed and test marketed. This instrument, called the RCL MONITOR, was designed to analyze individual samples or monitor an area with automated repetitive analyses. Vapor levels between 0 and 500 ppm can be determined in 90 s with a lower detection limit of 0.2 ppm using the hand-portable instrument. In addition to the development of the RCL MONITOR, advanced sampler systems are being developed to: (1) extend the dynamic range of the instrument through autodilution of the vapor and (2) allow chemical analyses to be performed on aqueous samples. When interfaced to the samplers, the RCL MONITOR is capable of measuring chlorinated solvent contamination in the vapor phase up to 5000 ppm and in water and other condensed media from 10 to over 10,000 ppb(wt)--without hydrocarbon and other organic interferences.

  19. Software Developed for Analyzing High-Speed Rolling-Element Bearings

    Science.gov (United States)

    Fleming, David P.

    2005-01-01

    COBRA-AHS (Computer Optimized Ball & Roller Bearing Analysis--Advanced High Speed, J.V. Poplawski & Associates, Bethlehem, PA) is used for the design and analysis of rolling element bearings operating at high speeds under complex mechanical and thermal loading. The code estimates bearing fatigue life by calculating three-dimensional subsurface stress fields developed within the bearing raceways. It provides a state-of-the-art interactive design environment for bearing engineers within a single easy-to-use design-analysis package. The code analyzes flexible or rigid shaft systems containing up to five bearings acted upon by radial, thrust, and moment loads in 5 degrees of freedom. Bearing types include high-speed ball, cylindrical roller, and tapered roller bearings. COBRA-AHS is the first major upgrade in 30 years of such commercially available bearing software. The upgrade was developed under a Small Business Innovation Research contract from the NASA Glenn Research Center, and incorporates the results of 30 years of NASA and industry bearing research and technology.

  20. A Ca/Fe X-ray fluorescence analyzer suitable for the purpose of teaching

    International Nuclear Information System (INIS)

    This paper introduces a Ca/Fe XRF analyzer specially designed for the purpose of teaching the related courses on nuclear engineering and nuclear technology in the university. Both working principle and constitution of the instrument are presented. A comparison between XRF analysis and chemical analysis showed that the two results were in agreement with an error of ±0.7%

  1. Study and realization of a multichannel analyzer with a gamma chain acquisition

    International Nuclear Information System (INIS)

Electronics is an important field, involving multiple lines of research across several facilities in order to provide effective diagnostics. Our study at the National Center for Science and Nuclear Technologies was a good opportunity to improve our knowledge of the various acquisition systems; it was based on the study and realization of a multichannel analyzer with a gamma acquisition chain.

  2. Co-production of Knowledge in Multi-stakeholder Processes: Analyzing Joint Experimentation as Social Learning

    NARCIS (Netherlands)

    Akpo, E.; Crane, T.A.; Vissoh, P.; Tossou, C.R.

    2015-01-01

    Purpose: Changing research design and methodologies regarding how researchers articulate with end-users of technology is an important consideration in developing sustainable agricultural practices. This paper analyzes a joint experiment as a multi-stakeholder process and contributes to understand ho

  3. Analyzing Science Activities in Force and Motion Concepts: A Design of an Immersion Unit

    Science.gov (United States)

    Ayar, Mehmet C.; Aydeniz, Mehmet; Yalvac, Bugrahan

    2015-01-01

    In this paper, we analyze the science activities offered at 7th grade in the Turkish science and technology curriculum along with addressing the curriculum's original intent. We refer to several science education researchers' ideas, including Chinn & Malhotra's (Science Education, 86:175--218, 2002) theoretical framework and…

  4. Using Networks to Visualize and Analyze Process Data for Educational Assessment

    Science.gov (United States)

    Zhu, Mengxiao; Shu, Zhan; von Davier, Alina A.

    2016-01-01

    New technology enables interactive and adaptive scenario-based tasks (SBTs) to be adopted in educational measurement. At the same time, it is a challenging problem to build appropriate psychometric models to analyze data collected from these tasks, due to the complexity of the data. This study focuses on process data collected from SBTs. We…

  5. Technological Learning for Carbon Capture and Sequestration Technologies

    OpenAIRE

    K. Riahi; Rubin, E.S.; Taylor, M. R.; L. Schrattenholzer; Hounshell, D.

    2004-01-01

This paper analyzes the potential of carbon capture and sequestration technologies (CCT) in a set of long-term energy-economic-environmental scenarios based on alternative assumptions for the technological progress of CCT. In order to get a reasonable guide to future technological progress in managing CO2 emissions, we review past experience in controlling sulfur dioxide (SO2) emissions from power plants. By doing so, we quantify a "learning curve" for CCT, which describes the relationship between ...
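
The "learning curve" the abstract quantifies is conventionally a one-factor power law in cumulative capacity. A minimal sketch follows; the 20% learning rate and baseline cost are illustrative assumptions, not the paper's fitted values:

```python
import math

# One-factor learning curve: unit cost falls by a fixed fraction
# (the learning rate) each time cumulative capacity doubles.
# c0 and learning_rate below are illustrative assumptions only.
def unit_cost(cumulative_capacity, c0=100.0, learning_rate=0.20):
    """Cost per unit after a given cumulative installed capacity."""
    b = -math.log2(1.0 - learning_rate)   # experience exponent
    return c0 * cumulative_capacity ** (-b)

# Each doubling of capacity cuts cost by the learning rate:
print(round(unit_cost(1), 4))   # → 100.0
print(round(unit_cost(2), 4))   # → 80.0
print(round(unit_cost(4), 4))   # → 64.0
```

Fitting the exponent `b` to historical SO2-control cost data is the kind of exercise the paper uses to anchor its CCT progress assumptions.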

  6. Living Technology

    DEFF Research Database (Denmark)

    2010-01-01

This book is aimed at anyone who is interested in learning more about living technology, whether coming from business, the government, policy centers, academia, or anywhere else. Its purpose is to help people to learn what living technology is, what it might develop into, and how it might impact our lives. The phrase 'living technology' was coined to refer to technology that is alive as well as technology that is useful because it shares the fundamental properties of living systems. In particular, the invention of this phrase was called for to describe the trend of our technology becoming increasingly life-like or literally alive. Still, the phrase has different interpretations depending on how one views what life is. This book presents nineteen perspectives on living technology. Taken together, the interviews convey the collective wisdom on living technology's power and promise, as well as its...

  7. 40 CFR 86.1321-94 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

... 40 Protection of Environment 19 2010-07-01 false Hydrocarbon analyzer calibration. 86... Procedures § 86.1321-94 Hydrocarbon analyzer calibration. The FID hydrocarbon analyzer shall receive the... into service and at least annually thereafter, the FID hydrocarbon analyzer shall be adjusted...

  8. 40 CFR 92.119 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

... 40 Protection of Environment 20 2010-07-01 false Hydrocarbon analyzer calibration. 92... Hydrocarbon analyzer calibration. The HFID hydrocarbon analyzer shall receive the following initial and... into service and at least annually thereafter, the HFID hydrocarbon analyzer shall be adjusted...

  9. Application of Digital Mockup Technology

    OpenAIRE

    Gaoming Ding

    2011-01-01

Digital simulation design is one of the modern design methods for mechanical products. We introduce the concept and meaning of digital mockup technology in mechanical design, along with its architecture and design cycle, which involve multi-domain UML, multi-body dynamics and multidisciplinary design. We also analyze the automobile digital simulation design method and digital mockup technology.

  10. Technology Tiers

    DEFF Research Database (Denmark)

    Karlsson, Christer

    2015-01-01

A technology tier is a level in a product system: final product, system, subsystem, component, or part. As a concept, it contrasts with traditional "vertical" special technologies (for example, mechanics and electronics) and focuses on "horizontal" feature technologies such as product characteristics...

  11. University Teaching with Digital Technologies

    OpenAIRE

    Marcelo-García, Carlos; Yot-Domínguez, Carmen; Mayor-Ruiz, Cristina

    2015-01-01

This research aims to analyze the level of use of technology by university teachers. We are interested in the frequency of its use in designing the teaching-learning process. The research questions were: What types of learning activities are designed by university teachers? What types of technologies do teachers use in the design of their instruction? What is the level of use of digital technologies in the learning designs? To respond to these issues, we designed an inventory ...

  12. Technological Dynamics and Social Capability

    DEFF Research Database (Denmark)

    Fagerberg, Jan; Feldman, Maryann; Srholec, Martin

    2014-01-01

This article analyzes factors shaping technological capabilities in the USA and European countries, and shows that the differences between the two continents in this respect are much smaller than commonly assumed. The analysis demonstrates a tendency toward convergence in technological capabilities for the sample as a whole between 1998 and 2008. The results indicate that social capabilities, such as a well-developed public knowledge infrastructure, an egalitarian distribution of income, a participatory democracy and prevalence of public safety, condition the growth of technological capabilities...

  13. Environmental regulation and technology transfers

    OpenAIRE

    Asano, Takao; Matsushima, Noriaki

    2012-01-01

    This paper analyzes the situation in which a national government introduces environmental regulations. Within the framework of an international duopoly with environmental regulations, this paper shows that an environmental tax imposed by the government in the home country can induce a foreign firm with advanced abatement technology to license it to a domestic firm without this technology. Furthermore, when the domestic firm's production technology is less efficient than that of the foreign fi...

  14. Technology-Use Mediation

    DEFF Research Database (Denmark)

    Bansler, Jørgen P.; Havn, Erling C.

    2004-01-01

Implementation of new computer-mediated communication (CMC) systems in organizations is a complex socio-technical endeavour, involving the mutual adaptation of technology and organization over time. Drawing on the analytic concept of sensemaking, this paper provides a theoretical perspective that deepens our understanding of how organizations appropriate new electronic communication media. The paper analyzes how a group of mediators in a large, multinational company adapted a new web-based CMC technology (a virtual workspace) to the local organizational context (and vice versa) by modifying features of the technology, providing ongoing support for users, and promoting appropriate conventions of use. We found that these mediators exerted considerable influence on how the technology was established and used in the organization. The mediators were not neutral facilitators of a well...

  15. Technology-Use Mediation

    DEFF Research Database (Denmark)

    Bansler, Jørgen P.; Havn, Erling C.

    2004-01-01

Implementation of new computer-mediated communication (CMC) systems in organizations is a complex socio-technical endeavour, involving the mutual adaptation of technology and organization over time. Drawing on the analytic concept of sensemaking, this paper provides a theoretical perspective that deepens our understanding of how organizations appropriate new electronic communication media. The paper analyzes how a group of mediators in a large, multinational company adapted a new web-based CMC technology (a virtual workspace) to the local organizational context (and vice versa) by modifying features of the technology, providing ongoing support for users, and promoting appropriate conventions of use. We found that these mediators exerted considerable influence on how the technology was established and used in the organization. The mediators were not neutral facilitators of a well...

  16. Healthcare technology and technology assessment

    OpenAIRE

    Herndon, James H.; Hwang, Raymond; Bozic, K. H.

    2007-01-01

    New technology is one of the primary drivers of increased healthcare costs in the United States. Both physicians and industry play important roles in the development, adoption, utilization and choice of new technologies. The Food and Drug Administration regulates new drugs and new medical devices, but healthcare technology assessment remains limited. Healthcare technology assessment originated in federal agencies; today it is decentralized, with increasing private-sector efforts. Innovation is ...

  17. Technology Lecturer Turned Technology Teacher

    Science.gov (United States)

    Lee, Kerry

    2009-01-01

    This case study outlines a program developed by a group of 6 teachers' college lecturers who volunteered to provide a technology program to year 7 & 8 children (11- and 12-year-olds) for a year. This involved teaching technology once a week. As technology education was a new curriculum area when first introduced to the college, few lecturers had…

  18. Analysis of Impact of 3D Printing Technology on Traditional Manufacturing Technology

    Science.gov (United States)

    Wu, Niyan; Chen, Qi; Liao, Linzhi; Wang, Xin

    With the quiet rise of 3D printing technology in the automobile, aerospace, industrial, medical and other fields, many insiders hold differing opinions on its development. This paper objectively analyzes the impact of 3D printing technology on mold-making technology and, by comparing the advantages and disadvantages of 3D-printed molds and traditional mold-making technology, puts forward the idea of fusion and complementation of the two.

  19. Demonstration Technology Application and Analysis on the Scientific and Technological Progress

    OpenAIRE

    Qingzhu Qi; Zhixiao Jiang

    2013-01-01

    This paper takes Tianjin as an example and analyzes the development trend of scientific and technological progress in Tianjin. From the five aspects of ‘environment of scientific and technological progress’, ‘input of scientific and technological activities’, ‘output of scientific and technological activities’, ‘high-tech industrialization’, and ‘science and technology for economic and social development’, the paper analyzes the correlation between GDP and scientific and technological progress. Research...

  20. Technology '90

    International Nuclear Information System (INIS)

    The US Department of Energy (DOE) laboratories have a long history of excellence in performing research and development in a number of areas, including the basic sciences, applied-energy technology, and weapons-related technology. Although technology transfer has always been an element of DOE and laboratory activities, it has received increasing emphasis in recent years as US industrial competitiveness has eroded and efforts have increased to better utilize the research and development resources the laboratories provide. This document, Technology '90, is the latest in a series that is intended to communicate some of the many opportunities available for US industry and universities to work with the DOE and its laboratories in the vital activity of improving technology transfer to meet national needs. Technology '90 is divided into three sections: Overview, Technologies, and Laboratories. The Overview section describes the activities and accomplishments of the DOE research and development program offices. The Technologies section provides descriptions of new technologies developed at the DOE laboratories. The Laboratories section presents information on the missions, programs, and facilities of each laboratory, along with a name and telephone number of a technology transfer contact for additional information. Separate papers were prepared for appropriate sections of this report

  1. Sensemaking technologies

    DEFF Research Database (Denmark)

    Madsen, Charlotte Øland

    Research scope: The scope of the project is to study technological implementation processes by using Weick's sensemaking concept (Weick, 1995). The purpose of using a social constructivist approach to investigate technological implementation processes is to find out how new technologies transform patterns of social action and interaction in organisations (Barley 1986; 1990, Orlikowski 2000). Current research in the field shows that new technologies affect organisational routines/structures/social relationships/power relations/dependencies and alter organisational roles (Barley 1986; 1990, Burkhardt & Brass, 1990; Kling 1991; Orlikowski 2000). It also demonstrates that technology is a flexible variable adapted to the organisation's needs, culture, climate and management philosophy, thus leading to different uses and outcomes of the same technology in different organisations (Barley 1986; 1990...

  2. Soulful Technologies

    DEFF Research Database (Denmark)

    Fausing, Bent

    2010-01-01

    In 2008, Samsung introduced a mobile phone called "Soul", made with a human touch and itself including a "magic touch". Through the analysis of Nokia mobile phone TV commercials, I want to examine the function and form of digital technology in everyday images. The mobile phone and its digital camera and other devices are depicted by everyday aesthetics as capable of producing a unique human presence and interaction. The medium, the technology, is a necessary helper of this very special and lost humanity. Without the technology, no special humanity, no soul - such is the prophecy. This personification or anthropomorphism is important for the branding of new technology. Technology is seen as creating a techno-transcendence towards a more qualified humanity which is in contact with fundamental human values like intuition, vision, and sensing; all the qualities that technology, industrialization...

  3. Study on brackish water treatment technology

    Institute of Scientific and Technical Information of China (English)

    HE Xu-wen(何绪文); Xu De-ping (许德平); WU Bing(吴兵); WANG Tong(王通)

    2003-01-01

    Based on the characteristics of deep well-water quality at the Fenxi Mining Group in Liulin, the feasibility of two treatment technologies, electrodialysis and reverse osmosis, is analyzed. The analysis and comparison show that reverse osmosis technology has several advantages, such as good treatment effect, convenient operation and management, and low running cost.

  4. Technology alliances

    International Nuclear Information System (INIS)

    In the field of nuclear technology, Canada and Korea developed a highly successful relationship that could serve as a model for other high-technology industries. This is particularly significant when one considers the complexity and technical depth required to design, build and operate a nuclear reactor. This paper will outline the overall framework for technology transfer and cooperation between Canada and Korea, and will focus on cooperation in nuclear R and D between the two countries

  5. Second-order focusing property of a 210° cylindrical energy analyzer

    International Nuclear Information System (INIS)

    It was confirmed experimentally that a 210° cylindrical energy analyzer with drift spaces has second-order focusing. The properties of the analyzer depend strongly on the fringing field around the entrance and exit of the cylinders. (author)

  6. Lab-on-a-chip Astrobiology Analyzer Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall goal of this program (through Phase III) is to develop an analyzer to measure chemical signatures of life in extraterrestrial settings. The analyzer...

  7. ARC Code TI: Inference Kernel for Open Static Analyzers (IKOS)

    Data.gov (United States)

    National Aeronautics and Space Administration — IKOS is a C++ library designed to facilitate the development of sound static analyzers based on Abstract Interpretation. Specialization of a static analyzer for an...

  8. Technological Tyranny

    Science.gov (United States)

    Greenwood, Dick

    1984-08-01

    It is implicitly assumed by those who create, develop, control and deploy new technology, as well as by society at-large, that technological innovation always represents progress. Such an unchallenged assumption precludes an examination and evaluation of the interrelationships and impact the development and use of technology have on larger public policy matters, such as preservation of democratic values, national security and military policies, employment, income and tax policies, foreign policy and the accountability of private corporate entities to society. This brief challenges those assumptions and calls for social control of technology.

  9. On Some Improvements of Ion Microprobe Mass Analyzer

    Directory of Open Access Journals (Sweden)

    O.S. Kuzema

    2013-10-01

    Full Text Available. The ion-optical properties and characteristics of an ion microprobe analyzer are considered, in which mass separation of the primary ion beam is realized by a magnetic prism, and a spherical capacitor is used in the secondary-ion analyzing system as an energy analyzer that forms a parallel ion beam at the mass analyzer inlet. This has allowed the instrument parameters to be improved and its overall dimensions to be scaled down.

  10. 21 CFR 862.2500 - Enzyme analyzer for clinical use.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Enzyme analyzer for clinical use. 862.2500 Section... Instruments § 862.2500 Enzyme analyzer for clinical use. (a) Identification. An enzyme analyzer for clinical use is a device intended to measure enzymes in plasma or serum by nonkinetic or kinetic measurement...

  11. 40 CFR 89.320 - Carbon monoxide analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Carbon monoxide analyzer calibration... Test Equipment Provisions § 89.320 Carbon monoxide analyzer calibration. (a) Calibrate the NDIR carbon... introduction into service and annually thereafter, the NDIR carbon monoxide analyzer shall be checked...

  12. 40 CFR 86.1325-94 - Methane analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 19 2010-07-01 2010-07-01 false Methane analyzer calibration. 86.1325... Procedures § 86.1325-94 Methane analyzer calibration. Prior to introduction into service and monthly thereafter, the methane analyzer shall be calibrated: (a) Follow the manufacturer's instructions...

  13. 40 CFR 86.125-94 - Methane analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 18 2010-07-01 2010-07-01 false Methane analyzer calibration. 86.125... Complete Heavy-Duty Vehicles; Test Procedures § 86.125-94 Methane analyzer calibration. Prior to introduction into service and monthly thereafter, the methane analyzer shall be calibrated: (a) Follow...

  14. 40 CFR 86.331-79 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 18 2010-07-01 2010-07-01 false Hydrocarbon analyzer calibration. 86....331-79 Hydrocarbon analyzer calibration. The following steps are followed in sequence to calibrate the hydrocarbon analyzer. It is suggested, but not required, that efforts be made to minimize relative...

  15. 21 CFR 868.1720 - Oxygen gas analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Oxygen gas analyzer. 868.1720 Section 868.1720...) MEDICAL DEVICES ANESTHESIOLOGY DEVICES Diagnostic Devices § 868.1720 Oxygen gas analyzer. (a) Identification. An oxygen gas analyzer is a device intended to measure the concentration of oxygen in...

  16. 21 CFR 868.1640 - Helium gas analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Helium gas analyzer. 868.1640 Section 868.1640...) MEDICAL DEVICES ANESTHESIOLOGY DEVICES Diagnostic Devices § 868.1640 Helium gas analyzer. (a) Identification. A helium gas analyzer is a device intended to measure the concentration of helium in a...

  17. 21 CFR 868.1075 - Argon gas analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Argon gas analyzer. 868.1075 Section 868.1075 Food... DEVICES ANESTHESIOLOGY DEVICES Diagnostic Devices § 868.1075 Argon gas analyzer. (a) Identification. An argon gas analyzer is a device intended to measure the concentration of argon in a gas mixture to aid...

  18. 40 CFR 86.327-79 - Quench checks; NOX analyzer.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 18 2010-07-01 2010-07-01 false Quench checks; NOX analyzer. 86.327-79... Quench checks; NOX analyzer. (a) Perform the reaction chamber quench check for each model of high vacuum... capillary, and if used, dilution capillary. (c) Quench check as follows: (1) Calibrate the NOX analyzer...

  19. A Study on the Nuclear Technology Policy

    International Nuclear Information System (INIS)

    The objective of the study was to make policy proposals for enhancing the effectiveness and efficiency of national nuclear technology R and D programs. To this end, changes in the international nuclear energy policy environment and trends in nuclear technology development were surveyed and analyzed. The study examined trends in nuclear technology policies and developed a nuclear energy R and D innovation strategy, from the viewpoint of analyzing changes in the global policy environment associated with nuclear technology development and of formulating a national nuclear R and D strategy.

  20. Technology Catalogue

    International Nuclear Information System (INIS)

    The Department of Energy's Office of Environmental Restoration and Waste Management (EM) is responsible for remediating its contaminated sites and managing its waste inventory in a safe and efficient manner. EM's Office of Technology Development (OTD) supports applied research and demonstration efforts to develop and transfer innovative, cost-effective technologies to its site clean-up and waste management programs within EM's Office of Environmental Restoration and Office of Waste Management. The purpose of the Technology Catalogue is to provide performance data on OTD-developed technologies to scientists and engineers assessing and recommending technical solutions within the Department's clean-up and waste management programs, as well as to industry, other federal and state agencies, and the academic community. OTD's applied research and demonstration activities are conducted in programs referred to as Integrated Demonstrations (IDs) and Integrated Programs (IPs). The IDs test and evaluate systems, consisting of coupled technologies, at specific sites to address generic problems, such as the sensing, treatment, and disposal of buried waste containers. The IPs support applied research activities in specific applications areas, such as in situ remediation, efficient separations processes, and site characterization. The Technology Catalogue is a means for communicating the status of the development of these innovative technologies. The FY93 Technology Catalogue features technologies successfully demonstrated in the field through IDs and sufficiently mature to be used in the near-term.
Technologies from the following IDs are featured in the FY93 Technology Catalogue: Buried Waste ID (Idaho National Engineering Laboratory, Idaho); Mixed Waste Landfill ID (Sandia National Laboratories, New Mexico); Underground Storage Tank ID (Hanford, Washington); Volatile organic compound (VOC) Arid ID (Richland, Washington); and VOC Non-Arid ID (Savannah River Site, South Carolina)

  1. Thermally activated technologies: Technology Roadmap

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2003-05-01

    The purpose of this Technology Roadmap is to outline a set of actions for government and industry to develop thermally activated technologies for converting America’s wasted heat resources into a reservoir of pollution-free energy for electric power, heating, cooling, refrigeration, and humidity control. Fuel flexibility is important. The actions also cover thermally activated technologies that use fossil fuels, biomass, and ultimately hydrogen, along with waste heat.

  2. Technology Selection and Appropriate Technology

    OpenAIRE

    Englander, A. Steven

    1981-01-01

    This paper provides a formal model of technology choice by a single region. Case studies have indicated that the technologies acquired by LDCs often seem unsuitable, although the criteria for suitability are often unclear. The reasons presented for the inappropriateness of the selection often rely more on political arguments than economic ones, or treat the recipient country as a passive actor in the whole process. Can a technology actively selected by a recipient country ever be inapprop...

  3. Lasers technology

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2011-07-01

    The Lasers Technology Program of IPEN is committed to the development of new lasers based on research into optical materials and new technologies, as well as to laser applications in several areas: Nuclear, Medicine, Dentistry, Industry, Environment and Advanced Research. The Program is basically divided into two main areas: Material and Laser Development and Laser Applications.

  4. Lasers technology

    International Nuclear Information System (INIS)

    The Lasers Technology Program of IPEN is committed to the development of new lasers based on research into optical materials and new technologies, as well as to laser applications in several areas: Nuclear, Medicine, Dentistry, Industry, Environment and Advanced Research. The Program is basically divided into two main areas: Material and Laser Development and Laser Applications

  5. Radiation Technology

    International Nuclear Information System (INIS)

    The conference was organized to evaluate the application directions of radiation technology in Vietnam and to utilize the Irradiation Centre in Hanoi with the Co-60 source of 110 kCi. The investigation and study of technico-economic feasibility for technology development to various items of food and non-food objects was reported. (N.H.A)

  6. Maritime Technology

    DEFF Research Database (Denmark)

    Sørensen, Herman

    1997-01-01

    Elementary introduction to the subject "Maritime Technology". The contents include drawings, sketches and references in English without any supplementary text.

  7. Technology collaboration

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, Jacob [Halliburton (Brazil)

    2011-07-01

    The aim of this paper is to present Halliburton's Brazilian technology center. Halliburton has technology centers in the United States, Saudi Arabia, India, Singapore and Brazil, all of which aim at delivering accelerated innovation in the oil sector. The technology centers engage in research and development activities with the help of various universities and in collaboration with the customer or supplier. The Halliburton Brazil technology center provides its customers with timely research and development solutions for enhancing recovery and mitigating reservoir uncertainty; they are specialized in finding solutions for pre- and post-salt carbonate drilling and in the enhancement of production from mature fields. This presentation showcased the work carried out by the Halliburton Brazil technology center to help customers develop their deepwater field activities.

  8. Sensemaking technology

    DEFF Research Database (Denmark)

    Madsen, Charlotte Øland

    Research objective: The object of the LOK research project is to gain a better understanding of the technological strategic processes in organisations by using the concept/metaphor of sensemaking. The project will investigate the technological strategies in organisations in order to gain a deeper understanding of the cognitive competencies and barriers towards implementing new technology in organisations. The research will therefore concentrate on researching the development process in the organisation's perception of the external environmental elements of customers, suppliers, competitors, internal and external technology and legislation, and the internal environmental elements of structure, power relations and political arenas. All of these variables have influence on which/how technologies are implemented, thus creating different outcomes all depending on the social dynamics that are triggered by changes...

  9. A Study on the Revitalizing of technology commercialization in KAERI

    International Nuclear Information System (INIS)

    The TEC training program should be implemented for researchers who want to commercialize their own technologies. Building a creative organizational culture is essential for technology commercialization. The collaboration strategy concerns analyzing how KAERI has been catching up in its technological capabilities in nuclear technology, and what KAERI's success factors in technology commercialization are.

  10. Real-time analytics techniques to analyze and visualize streaming data

    CERN Document Server

    Ellis, Byron

    2014-01-01

    Construct a robust end-to-end solution for analyzing and visualizing streaming data. Real-time analytics is the hottest topic in data analytics today. In Real-Time Analytics: Techniques to Analyze and Visualize Streaming Data, expert Byron Ellis teaches data analysts technologies to build an effective real-time analytics platform. This platform can then be used to make sense of the constantly changing data that is beginning to outpace traditional batch-based analysis platforms. The author is among a very few leading experts in the field. He has a prestigious background in research, development,
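    The windowed aggregation at the core of such streaming platforms can be illustrated with a minimal sketch (the function name `tumbling_window_counts` and the toy event stream are illustrative assumptions, not code from the book):

    ```python
    from collections import defaultdict

    def tumbling_window_counts(events, window_size):
        """Group (timestamp, key) events into fixed-size tumbling windows.

        Returns {window_start: {key: count}} - a toy stand-in for the kind
        of aggregation a real-time analytics platform performs continuously.
        """
        windows = defaultdict(lambda: defaultdict(int))
        for ts, key in events:
            window_start = (ts // window_size) * window_size  # align to window
            windows[window_start][key] += 1
        return {w: dict(counts) for w, counts in windows.items()}

    # Toy event stream: (timestamp in seconds, event type)
    events = [(1, "click"), (2, "view"), (8, "click"), (12, "click"), (14, "view")]
    print(tumbling_window_counts(events, 10))
    # -> {0: {'click': 2, 'view': 1}, 10: {'click': 1, 'view': 1}}
    ```

    A production system would compute the same grouping incrementally over an unbounded stream rather than over a finished list, but the window-alignment logic is identical.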

  11. Development and Applications of Simulation Technology

    Institute of Scientific and Technical Information of China (English)

    WangZicai

    2004-01-01

    The development process of simulation technology is discussed in terms of its emergence, maturation and further development. The applications of simulation technology in the fields of the national economy are introduced. Finally, the level and status quo of simulation technology at home and overseas are analyzed, and its future trend in the new century is presented.

  12. Formulating transgenerational technology critique as conflictual collaboration

    DEFF Research Database (Denmark)

    Chimirri, Niklas Alexander

    The presentation illustrates that the pristine intention of engaging in intergenerational technology design is potentially helpful for collectively formulating a productive and sustainable technology critique. On the downside, the applied methodologies lack viable concepts for meaningfully analyzing its possibilities and limitations. Conflictual collaboration will be presented as a processual-relational concept which sketches out opportunities for formulating a purposeful transgenerational technology critique.

  13. Small Artifacts - Big Technologies

    DEFF Research Database (Denmark)

    Kreiner, Kristian

    2005-01-01

    The computer IC is the heart of information and telecommunication technology. It is a tiny artifact, but with incredible organizing powers. We use this physical artifact as the location for studying central problems of the knowledge economy. First, the paper describes the history of chip design and the emergence of the technological community involved in designing and manufacturing computer chips. The community is structured in a way that reflects the underlying physical nature of silicon and the numerous other materials and chemicals involved. But it also reflects the human agency of defining new projects, of visioning the liberation from atoms, of committing to travel many detours in the labyrinths of development, and of perceiving and exploring the affordance that new technologies hide. Some of these characteristics are analyzed empirically in a case study of designing a chip for a...

  14. Scalable Parallel Methods for Analyzing Metagenomics Data at Extreme Scale

    International Nuclear Information System (INIS)

    The field of bioinformatics and computational biology is currently experiencing a data revolution. The exciting prospect of making fundamental biological discoveries is fueling the rapid development and deployment of numerous cost-effective, high-throughput next-generation sequencing technologies. The result is that the DNA and protein sequence repositories are being bombarded with new sequence information. Databases are continuing to report a Moore's law-like growth trajectory in their database sizes, roughly doubling every 18 months. In what seems to be a paradigm-shift, individual projects are now capable of generating billions of raw sequence data that need to be analyzed in the presence of already annotated sequence information. While it is clear that data-driven methods, such as sequencing homology detection, are becoming the mainstay in the field of computational life sciences, the algorithmic advancements essential for implementing complex data analytics at scale have mostly lagged behind. Sequence homology detection is central to a number of bioinformatics applications including genome sequencing and protein family characterization. Given millions of sequences, the goal is to identify all pairs of sequences that are highly similar (or 'homologous') on the basis of alignment criteria. While there are optimal alignment algorithms to compute pairwise homology, their deployment for large-scale is currently not feasible; instead, heuristic methods are used at the expense of quality. In this dissertation, we present the design and evaluation of a parallel implementation for conducting optimal homology detection on distributed memory supercomputers. Our approach uses a combination of techniques from asynchronous load balancing (viz. work stealing, dynamic task counters), data replication, and exact-matching filters to achieve homology detection at scale. Results for a collection of 2.56M sequences show parallel efficiencies of ~75-100% on up to 8K

  15. Scalable Parallel Methods for Analyzing Metagenomics Data at Extreme Scale

    Energy Technology Data Exchange (ETDEWEB)

    Daily, Jeffrey A. [Washington State Univ., Pullman, WA (United States)

    2015-05-01

    The field of bioinformatics and computational biology is currently experiencing a data revolution. The exciting prospect of making fundamental biological discoveries is fueling the rapid development and deployment of numerous cost-effective, high-throughput next-generation sequencing technologies. The result is that the DNA and protein sequence repositories are being bombarded with new sequence information. Databases are continuing to report a Moore’s law-like growth trajectory in their database sizes, roughly doubling every 18 months. In what seems to be a paradigm-shift, individual projects are now capable of generating billions of raw sequence data that need to be analyzed in the presence of already annotated sequence information. While it is clear that data-driven methods, such as sequencing homology detection, are becoming the mainstay in the field of computational life sciences, the algorithmic advancements essential for implementing complex data analytics at scale have mostly lagged behind. Sequence homology detection is central to a number of bioinformatics applications including genome sequencing and protein family characterization. Given millions of sequences, the goal is to identify all pairs of sequences that are highly similar (or “homologous”) on the basis of alignment criteria. While there are optimal alignment algorithms to compute pairwise homology, their deployment for large-scale is currently not feasible; instead, heuristic methods are used at the expense of quality. In this dissertation, we present the design and evaluation of a parallel implementation for conducting optimal homology detection on distributed memory supercomputers. Our approach uses a combination of techniques from asynchronous load balancing (viz. work stealing, dynamic task counters), data replication, and exact-matching filters to achieve homology detection at scale. Results for a collection of 2.56M sequences show parallel efficiencies of ~75-100% on up to 8K cores
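    The exact-matching filters mentioned above can be illustrated with a minimal sketch: before running an expensive optimal alignment on every pair of sequences, index the sequences by their k-mers and keep only pairs that share at least one k-mer as candidates. This is a hedged toy version; the name `candidate_pairs` and the parameter choices are illustrative, not the dissertation's actual implementation:

    ```python
    from collections import defaultdict
    from itertools import combinations

    def candidate_pairs(sequences, k=4):
        """Exact-matching filter: index every k-mer, then return the set of
        sequence-index pairs that share at least one k-mer. Only these
        candidates would be passed on to optimal pairwise alignment."""
        index = defaultdict(set)
        for i, seq in enumerate(sequences):
            for j in range(len(seq) - k + 1):
                index[seq[j:j + k]].add(i)
        pairs = set()
        for ids in index.values():
            for a, b in combinations(sorted(ids), 2):
                pairs.add((a, b))
        return pairs

    seqs = ["GATTACAGG", "TTACAGGCA", "CCCCCCCCC"]
    print(candidate_pairs(seqs))  # -> {(0, 1)}: only the first two share k-mers
    ```

    At scale, such a filter prunes the quadratic pair space dramatically, which is what makes combining it with load balancing across thousands of cores worthwhile.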

  16. Next-generation sequencing technologies and the application in microbiology-A review%高通量测序技术及其在微生物学研究中的应用

    Institute of Scientific and Technical Information of China (English)

    秦楠; 栗东芳; 杨瑞馥

    2011-01-01

    Since its invention in the 1970s, nucleic acid sequencing technology has contributed tremendously to advances in genomics and related disciplines. The next-generation sequencing technologies developed at the beginning of this century, represented by HiSeq 2000 from Illumina, SOLiD from Applied Biosystems and 454 from Roche, have re-energized the application of genomics. In this review, we first introduce these next-generation sequencing technologies and then describe their potential applications in the field of microbiology.

  17. Globalization & technology

    DEFF Research Database (Denmark)

    Narula, Rajneesh

    Technology and globalization are interdependent processes. Globalization has a fundamental influence on the creation and diffusion of technology, which, in turn, affects the interdependence of firms and locations. This volume examines the international aspect of this interdependence at two levels. The boundaries of firms and countries are increasingly porous and imprecise, because firms use alliances and outsourcing, and countries are rarely technologically self-sufficient. On the other hand, locations remain distinct and idiosyncratic, with innovation systems remaining largely nationally bound ... makers and managers on industrial policy as well as the organisation of research and development by firms.

  18. Coal technology

    International Nuclear Information System (INIS)

    Coal- and gas-fueled cogeneration plants are developing rapidly and, according to all the scenarios, will continue to grow, with ever-improving power generation performance in counterpressure mode. As there is no 'cooling water waste', a greater percentage of houses should be heated electrically. The coal combustion technologies mentioned here will probably converge around a 53-55% coefficient of performance. Emission requirements can be fulfilled by use of modern coal technologies. Coal will remain a competitive fuel for cogeneration, as other more advanced technologies are often still at the demonstration stage. (EG)

  19. Microprocessor technology

    CERN Document Server

    Anderson, J S

    2012-01-01

    'Microprocessor Technology' provides a complete introduction to the subject of microprocessor technology using the Z80 and 6502 processors. An emphasis on fault-finding and repair makes this an ideal text for servicing courses including City & Guilds 2240 in the UK, microelectronics units on BTEC National/Advanced GNVQ and City & Guilds 7261 Microprocessor Technology. It will also provide a refresher course for those on 'bridging' and micro appreciation courses where a measure of comparative studies is required. Clear and concise explanations are supported by work

  20. Ergonomics technology

    Science.gov (United States)

    Jones, W. L.

    1977-01-01

    Major areas of research and development in ergonomics technology for space environments are discussed. Attention is given to possible applications of the technology developed by NASA in industrial settings. A group of mass spectrometers for gas analysis capable of fully automatic operation has been developed for atmosphere control on spacecraft; a version for industrial use has been constructed. Advances have been made in personal cooling technology, remote monitoring of medical information, and aerosol particle control. Experience gained by NASA during the design and development of portable life support units has recently been applied to improve breathing equipment used by fire fighters.

  1. Superconducting technology

    International Nuclear Information System (INIS)

    Superconductivity has a long history of about 100 years. Over the past 50 years, progress in superconducting materials has been mainly in metallic superconductors, such as Nb, Nb-Ti and Nb3Sn, resulting in the creation of various application fields based on superconducting technologies. High-Tc superconductors, the first of which was discovered in 1986, have been changing the future vision of superconducting technology through the development of new application fields such as power cables. On the basis of these trends, this article discusses the future prospects of superconducting technology up to 2040 from the viewpoints of material development and the applications of superconducting wires and electronic devices. (author)

  2. A quantitative assessment of the Hadoop framework for analyzing massively parallel DNA sequencing data

    OpenAIRE

    Siretskiy, Alexey; Sundqvist, Tore; Voznesenskiy, Mikhail; Spjuth, Ola

    2015-01-01

    Background New high-throughput technologies, such as massively parallel sequencing, have transformed the life sciences into a data-intensive field. The most common e-infrastructure for analyzing this data consists of batch systems that are based on high-performance computing resources; however, the bioinformatics software that is built on this platform does not scale well in the general case. Recently, the Hadoop platform has emerged as an interesting option to address the challenges of incre...

  3. Analyzing the forces binding a restriction endonuclease to DNA using a synthetic nanopore

    OpenAIRE

    Dorvel, B.; Sigalov, G.; Zhao, Q.; Comer, J.; Dimitrov, V; Mirsaidov, U.; Aksimentiev, A.; Timp, G.

    2009-01-01

    Restriction endonucleases are used prevalently in recombinant DNA technology because they bind so stably to a specific target sequence and, in the presence of cofactors, cleave double-helical DNA specifically at a target sequence at a high rate. Using synthetic nanopores along with molecular dynamics (MD), we have analyzed with atomic resolution how a prototypical restriction endonuclease, EcoRI, binds to the DNA target sequence—GAATTC—in the absence of a Mg2+ ion cofactor. We have previously...

  4. Analyzing the structure of computer expert training in higher education system

    OpenAIRE

    Sergej Rusakov; Igor' Semakin; Henner, E

    2010-01-01

    We consider a variety of programs for training computer and information technology experts, and analyze the structure of knowledge of IT experts of various types. With the help of cluster analysis, we have established three clusters of programs, which we chose to call "mathematician/programmer", "engineer/programmer", and "system administrator". We determine didactic units related to each module of the training and obtain expert evaluation of the coverage of the modules by the didactic units....

  5. STATISTICAL ANALYSIS OF APEC COUNTRIES AND TURKEY IN TERMS OF KNOWLEDGE SOCIETY INDICATORS AND SOME FINDINGS

    OpenAIRE

    ERKEKOĞLU, Hatice; ARIÇ, K. Halil

    2013-01-01

    In this study, six variables were used which are related to the knowledge society and calculated by the World Bank. These variables are the Knowledge Economy Index (KEI), the Knowledge Index (KI), Economic Incentive and Institutional Regime, Innovation System, Education and Human Resources, and Information and Communication Technologies. The study includes twenty APEC countries and Turkey. As a method, hierarchical cluster analysis was used. The probability of correct classification was evaluated by discriminant analysis...

  6. Analyzing the impact of course structure on electronic textbook use in blended introductory physics courses

    OpenAIRE

    Seaton, Daniel T.; Kortemeyer, Gerd; Bergner, Yoav; Rayyan, Saif; David E. Pritchard

    2013-01-01

    We investigate how elements of course structure (i.e., the frequency of assessments as well as the sequencing and weight of course resources) influence the usage patterns of electronic textbooks (e-texts) in introductory physics courses. Specifically, we analyze the access logs of courses at Michigan State University and the Massachusetts Institute of Technology, each of which deploy e-texts as primary or secondary texts in combination with different formative assessments (e.g., embedded read...

  7. Analyzer of energy spectra of a magnetized relativistic electron beam

    International Nuclear Information System (INIS)

    An analyzer of the instantaneous energy spectrum of a magnetized relativistic electron beam (REB) is described. The operating principle of the analyzer is based on a sharp change in the direction of the magnetic field lines, which is non-adiabatic for the beam electrons. The analyzer design is described, and the main factors affecting the energy resolution are considered. The serviceability of the analyzer was examined in experiments on plasma heating with a high-current microsecond REB at the GOL-3 device. The energy resolution of the analyzer does not exceed 10% at 0.8 MeV and 20% at 0.3 MeV. Beam energy spectra were obtained in one of the regimes of beam interaction with plasma. The efficiency of beam-plasma interaction determined using the analyzer reaches 30%. 10 refs.; 7 figs

  8. Digital signal processing in the radio science stability analyzer

    Science.gov (United States)

    Greenhall, C. A.

    1995-01-01

    The Telecommunications Division has built a stability analyzer for testing Deep Space Network installations during flight radio science experiments. The low-frequency part of the analyzer operates by digitizing wave signals with bandwidths between 80 Hz and 45 kHz. Processed outputs include spectra of signal, phase, amplitude, and differential phase; time series of the same quantities; and Allan deviation of phase and differential phase. This article documents the digital signal-processing methods programmed into the analyzer.
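
    The Allan deviations reported by the analyzer can be computed directly from equally spaced phase samples. The following is a sketch of the standard overlapping estimator, not the analyzer's actual signal-processing code:

```python
import numpy as np

def allan_deviation(phase, tau0, m):
    """Overlapping Allan deviation of a phase (time-error) series sampled
    every tau0 seconds, evaluated at averaging time tau = m * tau0."""
    x = np.asarray(phase, dtype=float)
    d2 = x[2 * m:] - 2.0 * x[m:-m] + x[:-2 * m]   # overlapping 2nd differences
    avar = np.sum(d2 ** 2) / (2.0 * (m * tau0) ** 2 * d2.size)
    return np.sqrt(avar)

# Sanity check: a linear frequency drift D gives ADEV(tau) = D * tau / sqrt(2).
t = np.arange(1000) * 1.0            # tau0 = 1 s sample times
x = 0.5 * 2.0 * t ** 2               # phase for drift rate D = 2
print(allan_deviation(x, 1.0, 1))    # -> 1.4142135623730951 (= sqrt(2))
```

For phase x(t) = 0.5*D*t^2 the estimator returns D*tau/sqrt(2) exactly, which the printed value confirms for D = 2 and tau = 1 s.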

  9. A case for static analyzers in the cloud

    OpenAIRE

    Barnett, Michael; Bouaziz, Mehdi; Logozzo, Francesco; Fähndrich, Manuel

    2013-01-01

    We describe our ongoing effort to move a desktop static analyzer, Clousot, into a cloud-based one, Cloudot. A cloud-based static analyzer runs as a service. Clients issue analysis requests through the local network or over the internet. The analysis takes advantage of the large computation resources offered by the cloud: the underlying infrastructure ensures scaling and virtually unlimited storage. Cloud-based analyzers may relax the performance-precision trade-offs usually associated with desk...

  10. LG-2 Scintrex manual. Fluorescence analyzer

    International Nuclear Information System (INIS)

    The Scintrex Fluorescence Analyzer LG-2 selectively detects the presence of certain fluorescent minerals through induced UV photoluminescence and provides quantitative information on their distribution.

  11. Analyzing Real-World Light Duty Vehicle Efficiency Benefits

    Energy Technology Data Exchange (ETDEWEB)

    Gonder, Jeffrey; Wood, Eric; Chaney, Larry; Holden, Jacob; Jeffers, Matthew; Wang, Lijuan

    2016-06-08

    Off-cycle technologies represent an important pathway to achieve real-world fuel savings, through which OEMs can potentially receive credit toward CAFE compliance. DOE national labs such as NREL are well positioned to provide objective input on these technologies using large, national data sets in conjunction with OEM- and technology-specific testing. This project demonstrates an approach that combines vehicle testing (dynamometer and on-road) with powertrain modeling and simulation over large, representative datasets to quantify real-world fuel economy. The approach can be applied to specific off-cycle technologies (engine encapsulation, start/stop, connected vehicle, etc.) in A/B comparisons to support calculation of realistic real-world impacts. Future work will focus on testing-based A/B technology comparisons that demonstrate the significance of this approach.

  12. Exploration technology

    Energy Technology Data Exchange (ETDEWEB)

    Roennevik, H.C. [Saga Petroleum A/S, Forus (Norway)

    1996-12-31

    The paper evaluates exploration technology. Topics discussed are: Visions; the subsurface challenge; the creative tension; the exploration process; seismic; geology; organic geochemistry; seismic resolution; integration; drilling; value creation. 4 refs., 22 figs.

  13. Plasma technology

    International Nuclear Information System (INIS)

    IREQ was contracted by the Canadian Electrical Association to review plasma technology and assess the potential for application of this technology in Canada. A team of experts in the various aspects of this technology was assembled and each team member was asked to contribute to this report on the applications of plasma pertinent to his or her particular field of expertise. The following areas were examined in detail: iron, steel and strategic-metals production; surface treatment by spraying; welding and cutting; chemical processing; drying; and low-temperature treatment. A large market for the penetration of electricity has been identified. To build up confidence in the technology, support should be provided for selected R and D projects, plasma torch demonstrations at full power, and large-scale plasma process testing

  14. IPAT: a freely accessible software tool for analyzing multiple patent documents with inbuilt landscape visualizer.

    Science.gov (United States)

    Ajay, Dara; Gangwal, Rahul P; Sangamwar, Abhay T

    2015-01-01

    Intelligent Patent Analysis Tool (IPAT) is an online data retrieval tool based on a text mining algorithm that extracts specific patent information in a predetermined pattern into an Excel sheet. The software is designed and developed to retrieve and analyze technology information from multiple patent documents and generate various patent landscape graphs and charts. The software is coded in C# in Visual Studio 2010; it extracts publicly available patent information from web pages such as Google Patents and simultaneously studies various technology trends based on user-defined parameters. In other words, IPAT combined with manual categorization will act as an excellent technology assessment tool in competitive intelligence and due diligence for forecasting future R&D. PMID:26452016
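
    The extraction step described above can be pictured as pattern matching over patent text followed by tabular export. The sketch below is purely illustrative: the field names, regular expressions, and CSV stand-in for the Excel output are assumptions, not IPAT's actual patterns.

```python
import csv
import io
import re

# Hypothetical field patterns -- illustrative only, not IPAT's real ones.
FIELDS = {
    "publication_number": re.compile(r"Publication number[:\s]+(\S+)"),
    "title": re.compile(r"Title[:\s]+(.+)"),
    "assignee": re.compile(r"Assignee[:\s]+(.+)"),
}

def extract_record(text):
    """Pull the named fields out of one patent document's text."""
    return {name: (m.group(1).strip() if (m := rx.search(text)) else "")
            for name, rx in FIELDS.items()}

def to_csv(records):
    """Write extracted records as CSV rows (a stand-in for the Excel export)."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(FIELDS))
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

sample = "Publication number: US1234567B2\nTitle: Widget\nAssignee: Acme Corp"
print(to_csv([extract_record(sample)]))
```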

  15. Lasers technology

    International Nuclear Information System (INIS)

    The Laser Technology Program of IPEN is developed by the Center for Lasers and Applications (CLA) and is committed to the development of new lasers based on the research of new optical materials and new resonator technologies. Laser applications and research occur within several areas such as Nuclear, Medicine, Dentistry, Industry, Environment and Advanced Research. Additional goals of the Program are human resource development and innovation, in association with Brazilian Universities and commercial partners

  16. Lasers technology

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2014-07-01

    The Laser Technology Program of IPEN is developed by the Center for Lasers and Applications (CLA) and is committed to the development of new lasers based on the research of new optical materials and new resonator technologies. Laser applications and research occur within several areas such as Nuclear, Medicine, Dentistry, Industry, Environment and Advanced Research. Additional goals of the Program are human resource development and innovation, in association with Brazilian Universities and commercial partners.

  17. Technology Trade

    OpenAIRE

    José L. Groizard

    2008-01-01

    This study addresses the question of why some countries import more R&D-intensive goods than others. Using a panel data set of 80 countries for the period 1970 to 1995, results indicate that domestic investment, FDI and the quality of intellectual property rights (IPR) systems positively affect technology imports. However, the higher the percentage of the workforce with primary studies, the lower technology imports are. Moreover, IPRs tend to reinforce the positive role played by FDI in impo...

  18. Knowledge Technologies

    OpenAIRE

    Milton, Nick

    2008-01-01

    Several technologies are emerging that provide new ways to capture, store, present and use knowledge. This book is the first to provide a comprehensive introduction to five of the most important of these technologies: Knowledge Engineering, Knowledge Based Engineering, Knowledge Webs, Ontologies and Semantic Webs. For each of these, answers are given to a number of key questions (What is it? How does it operate? How is a system developed? What can it be used for? What tools are available? Wha...

  19. Technological Innovation

    OpenAIRE

    Alexandra Bostan

    2009-01-01

    The spectacular development of technology in the field of informatics and telecommunications over the last decade, associated with a postindustrial revolution, has contributed solidly to the globalization of contemporary international economic life. A very important factor in promoting the globalization of production and financial globalization is the recent progress in information and communication technology, which has a strong impact on economic, social and cultural life....

  20. Technology overview

    International Nuclear Information System (INIS)

    The Integrated Assessment Program, funded by the ERDA Division of Technology Overview, is the mechanism by which health, environmental, social, economic and institutional factors are combined into a form useful for energy planning and decision making. This program selectively combines information about effects of alternative energy technologies (such as waste releases, land and water use, and social effects) to produce broad-based assessments of the advantages and disadvantages of energy and conservation options. As a corollary, needs for further research, development, and technology transfer are identified. The program is focused on four interrelated activities: supporting systems analysis to develop and improve methods for use in assessing and comparing impacts of energy and conservation options; integrated technological impact assessment, applying these methods to help select technologies for development that are safe, clean, and environmentally acceptable; regional comparative assessments, applying the results of the technological impact assessments to identification of regional energy strategies; and a regional outreach effort to assist regional and state agencies in their energy planning programs

  1. THE LINK BETWEEN NON TECHNOLOGICAL INNOVATIONS AND TECHNOLOGICAL INNOVATION

    OpenAIRE

    Nguyen-Thi, Thuc Uyen; Mothe, Caroline

    2010-01-01

    Purpose This paper aims to provide evidence of the major role of non-technological activities in the innovation process. It highlights the effects of marketing and organizational innovation strategies on technological innovation performance. Design/methodology/approach The article tests theoretical hypotheses on a sample of 555 firms of the 4th Community Innovation Survey (CIS 4) in 2006 in Luxembourg. Data are analyzed through a generalized Tobit model. Findings In the present study, evidence...

  2. Control of a pulse height analyzer using an RDX workstation

    International Nuclear Information System (INIS)

    The Nuclear Chemistry Division of Lawrence Livermore National Laboratory is in the midst of upgrading its radiation counting facilities to automate data acquisition and quality control. This upgrade requires control of a pulse height analyzer (PHA) from an interactive LSI-11/23 workstation running RSX-11M. The PHA is a microcomputer-based multichannel analyzer system providing data acquisition, storage, display, manipulation and input/output from up to four independent acquisition interfaces. Control of the analyzer includes reading and writing energy spectra, issuing commands, and servicing device interrupts. The analyzer communicates with the host system over a 9600-baud serial line using the link-level Digital Data Communications Message Protocol (DDCMP). We relieved the RSX workstation CPU of the DDCMP overhead by implementing a DEC-compatible, in-house designed DMA serial line board (the ISL-11) to communicate with the analyzer. An RSX I/O device driver was written to complete the path between the analyzer and the RSX system by providing the link between the communication board and an application task. The I/O driver is written to handle several ISL-11 cards all operating in parallel, thus providing support for control of multiple analyzers from a single workstation. The RSX device driver, its design and use by application code controlling the analyzer, and its operating environment will be discussed
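
    DDCMP protects its header and data fields with a CRC-16 checksum (polynomial x^16 + x^15 + x^2 + 1). As a sketch of that checksum alone — the RSX-11M driver described above is not reproduced here — it can be computed bit-serially as:

```python
def crc16_ddcmp(data: bytes) -> int:
    """CRC-16 with polynomial x^16 + x^15 + x^2 + 1, processed LSB-first
    (the checksum family DDCMP uses for its header and data fields)."""
    crc = 0x0000
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ 0xA001 if crc & 1 else crc >> 1
    return crc

print(hex(crc16_ddcmp(b"123456789")))   # -> 0xbb3d (standard check value)
```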

  3. A Morphological Analyzer for Vocalized or Not Vocalized Arabic Language

    Science.gov (United States)

    El Amine Abderrahim, Med; Breksi Reguig, Fethi

    This research shows the realization of a morphological analyzer for the Arabic language (vocalized or not vocalized). The analyzer is based upon our object model for Arabic Natural Language Processing (NLP) and can be exploited by NLP applications such as machine translation, orthographical correction and information retrieval.

  4. 40 CFR 89.319 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... incorporated by reference at § 89.6. (ii) The HFID optimization procedures outlined in 40 CFR part 1065... shall be introduced directly at the analyzer, unless the “overflow” calibration option of 40 CFR part... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Hydrocarbon analyzer calibration....

  5. 40 CFR 90.316 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... procedures outlined in 40 CFR part 1065, subpart D. (iii) Alternative procedures may be used if approved in... method for dilute sampling described in 40 CFR part 1065, subpart F, may be used. (1) Adjust analyzer to... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Hydrocarbon analyzer calibration....

  6. 40 CFR 86.317-79 - Hydrocarbon analyzer specifications.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 18 2010-07-01 2010-07-01 false Hydrocarbon analyzer specifications....317-79 Hydrocarbon analyzer specifications. (a) Hydrocarbon measurements are to be made with a heated... measures hydrocarbon emissions on a dry basis is permitted for gasoline-fueled testing; Provided,...

  7. 40 CFR 91.316 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... 40 CFR part 1065, subpart D. (iii) Alternative procedures may be used if approved in advance by the... sampling described in 40 CFR part 1065, subpart F, may be used. (1) Adjust analyzer to optimize performance... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Hydrocarbon analyzer calibration....

  8. 40 CFR 89.321 - Oxides of nitrogen analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... periodic interference, system check, and calibration test procedures specified in 40 CFR part 1065 may be... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Oxides of nitrogen analyzer... Test Equipment Provisions § 89.321 Oxides of nitrogen analyzer calibration. (a) The...

  9. 40 CFR 90.318 - Oxides of nitrogen analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ..., and calibration test procedures specified in 40 CFR part 1065, subpart D, may be used in lieu of the... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Oxides of nitrogen analyzer... Emission Test Equipment Provisions § 90.318 Oxides of nitrogen analyzer calibration. (a) Calibrate...

  10. 40 CFR 91.318 - Oxides of nitrogen analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ... check, and calibration test procedures specified in 40 CFR part 1065, subparts C and D, may be used in... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Oxides of nitrogen analyzer... Provisions § 91.318 Oxides of nitrogen analyzer calibration. (a) Calibrate the chemiluminescent oxides...

  11. Empirical Validity of Ertl's Brain-Wave Analyzer (BWA02).

    Science.gov (United States)

    Fischer, Donald G.; And Others

    1978-01-01

    The empirical validity of Ertl's brain-wave analyzer was investigated by the known contrasted-groups method. Thirty-two academically talented and 16 academically handicapped students were compared on four Primary Mental Abilities tests, two Sequential Tests of Educational Progress measures, and seven Brain Wave Analyzer measures. (Author/JKS)

  12. Development of parallel/serial program analyzing tool

    International Nuclear Information System (INIS)

    Japan Atomic Energy Research Institute has been developing 'KMtool', a parallel/serial program analyzing tool, in order to promote the parallelization of science and engineering computation programs. KMtool analyzes the performance of programs written in FORTRAN77 with MPI, and it reduces the effort required for parallelization. This paper describes the development purpose, design, utilization and evaluation of KMtool. (author)

  13. DUAL-CHANNEL PARTICLE SIZE AND SHAPE ANALYZER

    Institute of Scientific and Technical Information of China (English)

    Arjen van der Schoot

    2004-01-01

    Fig. 1 shows a newly developed analyzer (Ankersmid CIS-100) that brings together two different measurement channels for accurate size and shape measurement of spherical and non-spherical particles. The size of spherical particles is measured by a HeNe laser beam; the size of non-spherical particles is analyzed by Dynamic Video Analysis of the particles' shape.

  14. 40 CFR 86.1221-90 - Hydrocarbon analyzer calibration.

    Science.gov (United States)

    2010-07-01

    ...-Fueled and Methanol-Fueled Heavy-Duty Vehicles § 86.1221-90 Hydrocarbon analyzer calibration. The FID... methanol-fueled vehicles shall be operated at 235° ±15 °F (113° ±8 °C)). Analyzers used with gasoline-fuel...-fuel may be optimized using methane, or if calibrated using propane the FID response to methane...

  15. Information Technology Teachers' Practices Regarding Mainstreaming Students

    OpenAIRE

    ERDOĞDU, Funda; ÖZBEY, Fidan

    2014-01-01

    This study was conducted to investigate information technology teachers' practices related to instructional adaptations for mainstreaming students in information technology courses. The data were collected with a qualitatively patterned, semi-structured interview form. The data obtained from the interviews were analyzed through content analysis. The research findings indicate that information technology teachers in information technology courses carry out such practices a...

  16. Technological Unemployment and an Attainable Way Out

    OpenAIRE

    Pavlova, Adelina

    2015-01-01

    The purpose of the thesis is to analyze the available information on the issue of technological unemployment. The hypothesis of the thesis is that the displacement of workers by technological development has a reasonable chance of happening in the future. Technological unemployment is a hotly debated issue: some economists argue that it is a short-term problem, while others see it as a risk to society. Thus, at this stage it is important to identify controversy in studie...

  17. Technology and technology transfer: some basic issues

    OpenAIRE

    Shamsavari, Ali; Adikibi, Owen; Taha, Yasser

    2002-01-01

    This paper addresses various issues relating to technology and transfer of technology such as technology and society, technology and science, channels and models of technology transfer, the role of multinational companies in transfer of technology, etc. The ultimate objective is to pose the question of relevance of some existing models and ideas like technological independence in an increasingly globalised world economy.

  18. Technological breakthroughs and asset replacement

    OpenAIRE

    YATSENKO, Yuri; Hritonenko, Natali

    2008-01-01

    The authors analyze the optimal replacement of assets under continuous and discontinuous technological change. They investigate the variable lifetime of assets in an infinite-horizon replacement problem. Due to deterioration, the maintenance cost increases as the asset ages. Because of technological change, both maintenance and new capital costs decrease for a fixed asset age. The dynamics of the optimal lifetime is investigated analytically and numerically under tec...

  19. A COGNITIVE AND MEMETIC SCIENCE APPROACH TO ANALYZE THE HUMAN FACTORS IN PREDICTING THE EVOLUTION OF PAPER TECHNOLOGY AND PRODUCTS IN THE 21ST CENTURY

    Institute of Scientific and Technical Information of China (English)

    尾锅史彦

    2004-01-01

    INTRODUCTION Predicting the future of the paper industry is conventionally conducted from technological and market-oriented aspects, as well as from the variety of constraints lying ahead of the industry such as resource, energy, and environmental issues. Since paper products, particularly paper media, have a higher affinity to human beings compared with other sheet-like materials such as plastics, metals, glasses and so on, not only the above factors but also human factors such as 'the affinity of paper to human beings' and 'the cognitive characteristics of paper' have to be taken into consideration in constructing a precise prediction model for the future of the paper industry.

  20. Technology cycles and technology revolutions

    Energy Technology Data Exchange (ETDEWEB)

    Paganetto, Luigi; Scandizzo, Pasquale Lucio

    2010-09-15

    Technological cycles have been characterized as the basis of long and continuous periods of economic growth through sustained changes in total factor productivity. While this hypothesis is in part consistent with several theories of growth, the sheer magnitude and length of the economic revolutions experienced by humankind seem to suggest that more attention should be given to the origin of major technological and economic changes, with reference to one crucial question: the role of the production and use of energy in economic development.

  1. Experimental analysis of a new retarding field energy analyzer

    International Nuclear Information System (INIS)

    In this paper, a new compact retarding field energy analyzer (RFEA) is designed for diagnosing the electron beams of a K-band space travelling-wave tube (TWT). The analyzer has an aperture plate to sample electron beams and a cylindrical electrode to overcome defocusing effects. The front end of the analyzer, constructed as a multistage depression collector (MDC) structure, is intended to shape the field to prevent electrons from being accelerated to escape. The direct-current (DC) beams of the K-band space TWTs with the MDC removed can be investigated on the beam measurement system. The current density distribution of the DC beams is determined by the analyzer while the anode voltage and helix voltage of the TWTs are 7000 V and 6850 V, respectively. The effect of the current curve's slope due to the reflection of secondary electrons on the copper collector of the analyzer is discussed. The experimental analysis shows this RFEA has an energy resolution good enough to satisfy the requirements of beam measurement. - Highlights: • A new retarding field energy analyzer (RFEA) is designed to diagnose the electron beam of a K-band space TWT. • The current density distribution of the direct-current beam is determined. • The reflection effect of secondary electrons on the copper collector of the analyzer is discussed

  2. Experimental analysis of a new retarding field energy analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Wei, Yu-Xiang [Shanghai Institute of Mechanical and Electrical Engineering, No. 3888, Yuanjiang Road, Minhang District, Shanghai 201109 (China); Institute of Electronics, Chinese Academy of Sciences, No. 19, North 4th Ring Road West, Haidian District, Beijing 100190 (China); Liu, Shu-Qing; Li, Xian-Xia; Shen, Hong-Li; Huang, Ming-Guang [Institute of Electronics, Chinese Academy of Sciences, No. 19, North 4th Ring Road West, Haidian District, Beijing 100190 (China); Liu, Pu-Kun, E-mail: pkliu@pku.edu.cn [School of Electronics Engineering and Computer Science, Peking University, No. 5, Yiheyuan Road, Haidian District, Beijing 100871 (China)

    2015-06-11

    In this paper, a new compact retarding field energy analyzer (RFEA) is designed for diagnosing the electron beams of a K-band space travelling-wave tube (TWT). The analyzer has an aperture plate to sample electron beams and a cylindrical electrode to overcome defocusing effects. The front end of the analyzer, constructed as a multistage depression collector (MDC) structure, is intended to shape the field to prevent electrons from being accelerated to escape. The direct-current (DC) beams of the K-band space TWTs with the MDC removed can be investigated on the beam measurement system. The current density distribution of the DC beams is determined by the analyzer while the anode voltage and helix voltage of the TWTs are 7000 V and 6850 V, respectively. The effect of the current curve's slope due to the reflection of secondary electrons on the copper collector of the analyzer is discussed. The experimental analysis shows this RFEA has an energy resolution good enough to satisfy the requirements of beam measurement. - Highlights: • A new retarding field energy analyzer (RFEA) is designed to diagnose the electron beam of a K-band space TWT. • The current density distribution of the direct-current beam is determined. • The reflection effect of secondary electrons on the copper collector of the analyzer is discussed.
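
    The abstract does not spell out the data reduction, but the standard way to read out an RFEA is worth sketching: the collector current I(V) versus retarding voltage V is a cumulative measurement, so the beam energy distribution is recovered, up to a constant, as -dI/dV. A minimal illustration on synthetic data (the Gaussian beam parameters below are assumed, not taken from the paper):

```python
import numpy as np
from math import erfc, sqrt

def energy_distribution(voltage, current):
    """Recover the (unnormalized) beam energy distribution from an RFEA
    retarding curve: f(E) is proportional to -dI/dV at eV = E."""
    return -np.gradient(current, voltage)

# Synthetic retarding curve: a beam centered at E0 = 50 eV with 3 eV spread.
# I(V) is the fraction of electrons energetic enough to pass the barrier.
V = np.linspace(0.0, 100.0, 1001)
E0, sigma = 50.0, 3.0
I = np.array([0.5 * erfc((v - E0) / (sqrt(2.0) * sigma)) for v in V])

f = energy_distribution(V, I)
print(V[np.argmax(f)])   # -> 50.0, recovering the beam energy E0
```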

  3. Analyzing Service Oriented Architecture (SOA) in Open Source Products

    OpenAIRE

    Gohar, Adnan

    2010-01-01

    Service Oriented Architecture (SOA) is an architectural paradigm that allows building of infrastructures for diverse application interaction and integration via services across different platforms, domains of technology and locations. SOA differs from traditional architectures, as it focuses on integrating capabilities that are distributed and implemented using a mixture of technologies. SOA provides a set of methodologies and strategies to accomplish interoperability and integration among di...

  4. 40 CFR 1065.309 - Continuous gas analyzer system-response and updating-recording verification-for gas analyzers...

    Science.gov (United States)

    2010-07-01

    ... perform this verification for batch gas analyzers or for continuous gas analyzers that are used only for... you must record data at a frequency greater than or equal to that of the updating-recording frequency... data from this test to determine t 50 for time alignment, record this time as t 0. (v) Allow...

  5. Knowledge Technologies

    CERN Document Server

    Milton, Nick

    2008-01-01

    Several technologies are emerging that provide new ways to capture, store, present and use knowledge. This book is the first to provide a comprehensive introduction to five of the most important of these technologies: Knowledge Engineering, Knowledge Based Engineering, Knowledge Webs, Ontologies and Semantic Webs. For each of these, answers are given to a number of key questions (What is it? How does it operate? How is a system developed? What can it be used for? What tools are available? What are the main issues?). The book is aimed at students, researchers and practitioners interested in Knowledge Management, Artificial Intelligence, Design Engineering and Web Technologies. During the 1990s, Nick worked at the University of Nottingham on the application of AI techniques to knowledge management and on various knowledge acquisition projects to develop expert systems for military applications. In 1999, he joined Epistemics where he worked on numerous knowledge projects and helped establish knowledge management...

  6. Seafood Technology

    DEFF Research Database (Denmark)

    Børresen, Torger

    This presentation will fill the total picture of this conference between fisheries and aquaculture, blue biotech and bioconservation, by considering the optimal processing technology of marine resources from the raw material until the seafood reaches the plate of the consumer. The situation today … must be performed such that total traceability and authenticity of the final products can be presented on demand. The most important aspects to be considered within seafood technology today are safety, healthy products and high eating quality. Safety can be divided into microbiological safety … and not presenting any safety risk per se. Seafood is healthy due to the omega-3 fatty acids and the nutritional value of vitamins, peptides and proteins. The processing technology must however be performed such that these valuable features are not lost during production. The same applies to the eating quality. Any...

  7. Persuasive Technology

    DEFF Research Database (Denmark)

    This book constitutes the proceedings of the 5th International Conference on Persuasive Technology, PERSUASIVE 2010, held in Copenhagen, Denmark, in June 2010. The 25 papers presented were carefully reviewed and selected from 80 submissions. In addition, three keynote papers are included in this volume. The topics covered are emotions and user experience, ambient persuasive systems, persuasive design, persuasion profiles, designing for health, psychology of persuasion, embodied and conversational agents, economic incentives, and future directions for persuasive technology.

  8. Technology Management

    DEFF Research Database (Denmark)

    Pilkington, Alan

    2014-01-01

    This paper reports a bibliometric analysis (co-citation network analysis) of 10 journals in the management of technology (MOT) field. As well as introducing various bibliometric ideas, network analysis tools identify and explore the concepts covered by the field and their inter-relationships. Specific results from different levels of analysis show the different dimensions of technology management: • Co-word terms identify themes • The journal co-citation network links to other disciplines • Co-citation networks show concentrations of themes. The analysis shows that MOT has a bridging role...

  9. International Space Station Major Constituent Analyzer On-orbit Performance

    Science.gov (United States)

    Gardner, Ben D.; Erwin, Phillip M.; Wiedemann, Rachel; Matty, Chris

    2016-01-01

    The Major Constituent Analyzer (MCA) is a mass spectrometer based system that measures the major atmospheric constituents on the International Space Station. A number of limited-life components require periodic change-out, including the ORU 02 analyzer and the ORU 08 Verification Gas Assembly. The most recent ORU 02 and ORU 08 assemblies are operating nominally. For ORU 02, the ion source filaments and ion pump lifetime continue to be key determinants of MCA performance. Additionally, testing is underway to evaluate the capacity of the MCA to analyze ammonia. Finally, plans are being made to bring the second MCA on ISS to an operational configuration.

  10. Emergency response training with the BNL plant analyzer

    International Nuclear Information System (INIS)

    Presented is the experience in the use of the BNL Plant Analyzer for NRC emergency response training to simulated accidents in a BWR. The unique features of the BNL Plant Analyzer that are important for the emergency response training are summarized. A closed-loop simulation of all the key systems of the power plant in question was found essential to the realism of the emergency drills conducted at NRC. The faster-than-real-time simulation speeds afforded by the BNL Plant Analyzer have demonstrated its usefulness for the timely conduct of the emergency response training.

  11. Note: Portable rare-earth element analyzer using pyroelectric crystal

    International Nuclear Information System (INIS)

    We report a portable rare-earth element analyzer with a palm-top size chamber housing a pyroelectric-crystal electron source and the sample stage, utilizing the cathodoluminescence (CL) phenomenon. The portable rare-earth element analyzer utilizing the CL phenomenon is the smallest reported so far. The portable rare-earth element analyzer detected the rare-earth elements Dy, Tb, Er, and Sm at ppm levels in zircon, which were not detected by scanning electron microscopy-energy dispersive X-ray spectroscopy analysis. We also performed elemental mapping of rare-earth elements by capturing a CL image with a CCD camera.

  12. Analyzing Risk and Performance Using the Multi-Factor Concept

    OpenAIRE

    Vermeulen, Erik; Spronk, Jaap; Wijst, Nico

    1996-01-01

    textabstractIn this paper, we present a new model to analyze the risk and the expected level of firm performance. This model is based on the multi-factor approach to risk, in which unexpected performance is explained through sensitivities to unexpected changes of risk factors. Instead of using the multi-factor approach for the analysis of security portfolios, it is used to analyze performance measures of firms. In this paper the multi-factor approach is not only used to analyze risk, but also...
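In the multi-factor approach the record describes, unexpected performance is modeled as the sum of the firm's sensitivities to risk factors multiplied by the unexpected changes (surprises) in those factors. A minimal numeric sketch, with entirely hypothetical factor names and values (not taken from the paper):

```python
# Hypothetical sensitivities of a firm's performance measure to three risk factors.
sensitivities = {"interest_rate": -0.8, "oil_price": 0.3, "gdp_growth": 1.2}

# Hypothetical unexpected changes (surprises) in those factors over the period.
factor_surprises = {"interest_rate": 0.5, "oil_price": -1.0, "gdp_growth": 0.2}

# Unexpected performance = sum of sensitivity * factor surprise over all factors.
unexpected_performance = sum(
    sensitivities[f] * factor_surprises[f] for f in sensitivities
)
print(round(unexpected_performance, 2))  # -0.46
```

The same decomposition that is usually applied to security portfolios is here applied to a firm-level performance measure, which is the shift in perspective the abstract highlights.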

  13. Narrow structures observed in the p-p analyzing power

    International Nuclear Information System (INIS)

    The momentum dependence of the p-p elastic analyzing power has been measured in small steps using an internal target during polarized beam acceleration from 1 to 3 GeV/c. The momentum bin size ranges from 5 to 18 MeV/c. The relative uncertainty of the analyzing power is typically less than 0.01 for each momentum bin. Narrow structures have been observed in the two-proton invariant mass distribution of the analyzing power. A brief discussion on the interpretation of the present results is also given. 11 refs., 7 figs., 1 tab

  14. Note: Portable rare-earth element analyzer using pyroelectric crystal

    Energy Technology Data Exchange (ETDEWEB)

    Imashuku, Susumu, E-mail: imashuku.susumu.2m@kyoto-u.ac.jp; Fuyuno, Naoto; Hanasaki, Kohei; Kawai, Jun [Department of Materials Science and Engineering, Kyoto University, Sakyo, Kyoto 606-8501 (Japan)

    2013-12-15

    We report a portable rare-earth element analyzer with a palm-top size chamber housing a pyroelectric-crystal electron source and the sample stage, utilizing the cathodoluminescence (CL) phenomenon. The portable rare-earth element analyzer utilizing the CL phenomenon is the smallest reported so far. The portable rare-earth element analyzer detected the rare-earth elements Dy, Tb, Er, and Sm at ppm levels in zircon, which were not detected by scanning electron microscopy-energy dispersive X-ray spectroscopy analysis. We also performed elemental mapping of rare-earth elements by capturing a CL image with a CCD camera.

  15. Mirror Technology

    Science.gov (United States)

    1992-01-01

    Under a NASA contract, MI-CVD developed a process for producing bulk silicon carbide by means of a chemical vapor deposition process. The technology allows growth of a high purity material with superior mechanical/thermal properties and high polishability - ideal for mirror applications. The company employed the technology to develop three research mirrors for NASA Langley and is now marketing it as CVD SILICON CARBIDE. Its advantages include light weight, thermal stability and high reflectivity. The material has nuclear research facility applications and is of interest to industrial users of high power lasers.

  16. Playful Technology

    DEFF Research Database (Denmark)

    Johansen, Stine Liv; Eriksson, Eva

    2013-01-01

    In this paper, the design of future services for children in Danish public libraries is discussed, in the light of new challenges and opportunities in relation to new media and technologies. The Danish government has over the last few years initiated and described a range of initiatives regarding … in the library, the changing role of the librarians and the library space. We argue that intertwining traditional library services with new media forms and engaging play is the core challenge for future design in physical public libraries, but also that it is through new media and technology that new...

  17. Architectural technology

    DEFF Research Database (Denmark)

    2005-01-01

    The booklet offers an overall introduction to the Institute of Architectural Technology and its projects and activities, and an invitation to the reader to contact the institute or the individual researcher for further information. The research, which takes place at the Institute of Architectural Technology at the Royal Danish Academy of Fine Arts, School of Architecture, reflects a spread between strategic, goal-oriented pilot projects, commissioned by a ministry, a fund or a private company, and on the other hand projects which originate from strong personal interests and enthusiasm of individual...

  18. Aplicación de un modelo de ecuaciones estructurales para analizar los sistemas de gestión en la integración de la RSC y su influencia en la estrategia y el performance de las empresas tecnológicas || Applying a structural equation model to analyze management systems in the integration of CSR and its influence on the strategy and performance of technology companies

    Directory of Open Access Journals (Sweden)

    Bernal Conesa, Juan Andrés

    2016-06-01

    Full Text Available The importance of management systems for the integration of CSR into a company's strategy is a vital resource that has been little studied in technology companies. In this paper a structural equation model is proposed to explain the influence of CSR and its integration into the management system of the company, an integration facilitated by the existence of previous standardized management systems, and to examine how this integration affects the strategy of the company and whether this is reflected in the economic performance of the technology company. The study was conducted in companies located in Spanish Science and Technology Parks. The model results show that there is a positive, direct, and statistically significant relationship between the integration of CSR and strategy, on the one hand, and between integration and performance, on the other. Likewise, indirect relationships are evident between the standardized management systems in place before the implementation of CSR and performance, and therefore the results have practical implications for the management of CSR in technology companies.

  19. Development of a Virtual Technology Coach to Support Technology Integration for K-12 Educators

    Science.gov (United States)

    Sugar, William; van Tryon, Patricia J. Slagter

    2014-01-01

    In an effort to develop a virtual technology coach for K-12 educators, this article analyzed survey results from sixty teachers with regards to specific resources that a technology coach could provide within a virtual environment. A virtual technology coach was proposed as a possible solution to provide continual professional development for…

  20. Relations between the technological standards and technological appropriation

    Directory of Open Access Journals (Sweden)

    Carlos Alberto PRADO GUERRERO

    2010-06-01

    Full Text Available The objective of this study is to analyze the educational practices of using Blackboard in blended learning environments with higher education students, in order to understand the relationship between technological appropriation and educational technology standards. To achieve that goal, the following research question was raised: To what extent are educational technology standards related to the appropriation of technology in blended learning environments in higher education? The contextual framework of this work includes the following topics: the institution, teaching, teachers, and students. The study used a correlational design. Correlations were carried out to determine the frequency and level of both the technological standards and the appropriation of technology. In comparing the results obtained by the students, the teachers, and the platform, we found that students in the school under study showed a high degree of technological appropriation, matched by their performance on the technological standards. It was established that teachers play a key role in developing students' technological appropriation and their performance on technology standards.
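The correlational design described in the record above amounts to computing a correlation coefficient between two sets of scores, e.g. students' technology-standards scores and their appropriation scores. A minimal sketch of the Pearson correlation with entirely hypothetical scores (the paper's actual data and scales are not given in the record):

```python
import math


def pearson(x: list[float], y: list[float]) -> float:
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)


# Hypothetical per-student scores on a 1-5 scale, for illustration only.
standards_scores = [3.2, 4.1, 2.8, 4.5, 3.9]
appropriation_scores = [3.0, 4.3, 2.5, 4.6, 4.0]

r = pearson(standards_scores, appropriation_scores)
print(round(r, 3))
```

A coefficient near +1 would indicate the kind of positive relationship between standards performance and technological appropriation that the study reports; the actual strength found by the authors is not stated in the record.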